Science.gov

Sample records for accident analysis methodology

  1. Linguistic methodology for the analysis of aviation accidents

    NASA Technical Reports Server (NTRS)

    Goguen, J. A.; Linde, C.

    1983-01-01

    A linguistic method for the analysis of small-group discourse was developed, and its use on transcripts of commercial air transport accidents is demonstrated. The method identifies the discourse types that occur and determines their linguistic structure; it identifies significant linguistic variables based upon these structures or upon other linguistic concepts such as speech act and topic; it tests hypotheses that establish the significance and reliability of these variables; and it indicates the implications of the validated hypotheses. These implications fall into three categories: (1) to train crews to use more nearly optimal communication patterns; (2) to use linguistic variables as indices for aspects of crew performance such as attention; and (3) to provide guidelines for the design of aviation procedures and equipment, especially those that involve speech.

  2. Reactor Accident Analysis Methodology for the Advanced Test Reactor Critical Facility Documented Safety Analysis Upgrade

    SciTech Connect

    Gregg L. Sharp; R. T. McCracken

    2003-06-01

    The regulatory requirement to develop an upgraded safety basis for a DOE nuclear facility was realized in January 2001 by issuance of a revision to Title 10 of the Code of Federal Regulations Section 830 (10 CFR 830). Subpart B of 10 CFR 830, “Safety Basis Requirements,” requires a contractor responsible for a DOE Hazard Category 1, 2, or 3 nuclear facility to either submit by April 9, 2001 the existing safety basis which already meets the requirements of Subpart B, or to submit by April 10, 2003 an upgraded facility safety basis that meets the revised requirements. 10 CFR 830 identifies Nuclear Regulatory Commission (NRC) Regulatory Guide 1.70, “Standard Format and Content of Safety Analysis Reports for Nuclear Power Plants,” as a safe harbor methodology for preparation of a DOE reactor documented safety analysis (DSA). The regulation also allows for use of a graded approach. This report presents the methodology that was developed for preparing the reactor accident analysis portion of the Advanced Test Reactor Critical Facility (ATRC) upgraded DSA. The methodology was approved by DOE for developing the ATRC safety basis as an appropriate application of a graded approach to the requirements of 10 CFR 830.

  3. Reactor Accident Analysis Methodology for the Advanced Test Reactor Critical Facility Documented Safety Analysis Upgrade

    SciTech Connect

    Sharp, G.L.; McCracken, R.T.

    2003-05-13

    The regulatory requirement to develop an upgraded safety basis for a DOE Nuclear Facility was realized in January 2001 by issuance of a revision to Title 10 of the Code of Federal Regulations Section 830 (10 CFR 830). Subpart B of 10 CFR 830, ''Safety Basis Requirements,'' requires a contractor responsible for a DOE Hazard Category 1, 2, or 3 nuclear facility to either submit by April 9, 2001 the existing safety basis which already meets the requirements of Subpart B, or to submit by April 10, 2003 an upgraded facility safety basis that meets the revised requirements. 10 CFR 830 identifies Nuclear Regulatory Commission (NRC) Regulatory Guide 1.70, ''Standard Format and Content of Safety Analysis Reports for Nuclear Power Plants'' as a safe harbor methodology for preparation of a DOE reactor documented safety analysis (DSA). The regulation also allows for use of a graded approach. This report presents the methodology that was developed for preparing the reactor accident analysis portion of the Advanced Test Reactor Critical Facility (ATRC) upgraded DSA. The methodology was approved by DOE for developing the ATRC safety basis as an appropriate application of a graded approach to the requirements of 10 CFR 830.

  4. A DOE-STD-3009 hazard and accident analysis methodology for non-reactor nuclear facilities

    SciTech Connect

    MAHN,JEFFREY A.; WALKER,SHARON ANN

    2000-03-23

    This paper demonstrates the use of appropriate consequence evaluation criteria in conjunction with generic likelihood of occurrence data to produce consistent hazard analysis results for nonreactor nuclear facility Safety Analysis Reports (SARs). An additional objective is to demonstrate the use of generic likelihood of occurrence data as a means for deriving defendable accident sequence frequencies, thereby enabling the screening of potentially incredible events (<10^-6 per year) from the design basis accident envelope. Generic likelihood of occurrence data has been used successfully in performing SAR hazard and accident analyses for two nonreactor nuclear facilities at Sandia National Laboratories. DOE-STD-3009-94 addresses and even encourages use of a qualitative binning technique for deriving and ranking nonreactor nuclear facility risks. However, qualitative techniques invariably lead to reviewer requests for more details associated with consequence or likelihood of occurrence bin assignments in the text of the SAR. Hazard analysis data displayed in simple worksheet format generally elicit questions about not only the assumptions behind the data, but also the quantitative bases for the assumptions themselves (engineering judgment may not be considered sufficient by some reviewers). This is especially true where the criteria for qualitative binning of likelihood of occurrence involve numerical ranges. Oftentimes reviewers want to see calculations, or at least a discussion of event frequencies or failure probabilities, to support likelihood of occurrence bin assignments. This may become a significant point of contention for events that have been binned as incredible. This paper shows how the use of readily available generic data can avoid many of the reviewer questions that inevitably arise from strictly qualitative analyses, while not significantly increasing the overall burden on the analyst.
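    The screening logic described above can be illustrated with a short calculation. The sketch below is hypothetical: only the 10^-6 per year cut-off comes from the abstract, and the initiator frequency and barrier failure probabilities are invented stand-ins for generic likelihood-of-occurrence data.

      SCREENING_THRESHOLD = 1.0e-6  # events per year, per the screening criterion above

      def sequence_frequency(initiator_per_yr, conditional_failure_probs):
          """Annual sequence frequency = initiator frequency times the product
          of the conditional failure probabilities of the mitigating barriers."""
          freq = initiator_per_yr
          for p in conditional_failure_probs:
              freq *= p
          return freq

      # Hypothetical sequence: small fire (1e-2/yr) AND suppression fails (1e-2)
      # AND confinement barrier fails (5e-3) -- all values illustrative only.
      freq = sequence_frequency(1.0e-2, [1.0e-2, 5.0e-3])
      print(f"sequence frequency = {freq:.2e}/yr")
      print("screen out as incredible" if freq < SCREENING_THRESHOLD
            else "retain in design basis accident envelope")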

  5. Risk Estimation Methodology for Launch Accidents.

    SciTech Connect

    Clayton, Daniel James; Lipinski, Ronald J.; Bechtel, Ryan D.

    2014-02-01

    As compact and lightweight power sources with reliable, long lives, Radioisotope Power Systems (RPSs) have made space missions to explore the solar system possible. Because hazardous material can be released during a launch accident, the potential health risk of an accident must be quantified so that appropriate launch approval decisions can be made. One part of the risk estimation involves modeling the response of the RPS to potential accident environments. Because of the complexity of modeling the full RPS response deterministically across the range of dynamic variables, the evaluation is performed stochastically with a Monte Carlo simulation. The potential consequences can be determined by modeling the transport of the hazardous material in the environment and through human biological pathways. The consequence analysis results are summed and weighted by appropriate likelihood values to give a collection of probabilistic results for the estimation of the potential health risk. This information is used to guide RPS designs, spacecraft designs, mission architecture, or launch procedures to potentially reduce the risk, as well as to inform decision makers of the potential health risks resulting from the use of RPSs for space missions.
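    As a rough illustration of the probability-weighted Monte Carlo approach described above (not the authors' model), the following sketch samples an accident environment, applies a crude release model, and accumulates likelihood-weighted consequences. All scenario probabilities, distributions, and the consequence factor are invented placeholders.

      import random

      random.seed(0)

      ACCIDENT_SCENARIOS = [
          # (scenario probability per mission, mean blast overpressure in kPa)
          (1.0e-3, 300.0),   # hypothetical early launch failure
          (5.0e-4, 150.0),   # hypothetical late ascent failure
      ]

      def release_fraction(overpressure_kpa):
          """Crude response model: release grows once a sampled capsule strength
          (a stand-in for the detailed RPS response model) is exceeded."""
          strength = random.gauss(250.0, 50.0)
          if overpressure_kpa < strength:
              return 0.0
          return min(1.0, (overpressure_kpa - strength) / 500.0)

      N = 50_000
      expected_consequence = 0.0
      for p_scenario, mean_op in ACCIDENT_SCENARIOS:
          total = 0.0
          for _ in range(N):
              op = random.gauss(mean_op, 0.2 * mean_op)   # sampled accident environment
              total += release_fraction(op) * 1.0e3       # placeholder consequence factor
          expected_consequence += p_scenario * (total / N)

      print(f"probability-weighted consequence ~ {expected_consequence:.3e} (arbitrary units)")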

  6. Applying STAMP in Accident Analysis

    NASA Technical Reports Server (NTRS)

    Leveson, Nancy; Daouk, Mirna; Dulac, Nicolas; Marais, Karen

    2003-01-01

    Accident models play a critical role in accident investigation and analysis. Most traditional models are based on an underlying chain of events. These models, however, have serious limitations when used for complex, socio-technical systems. Previously, Leveson proposed a new accident model (STAMP) based on system theory. In STAMP, the basic concept is not an event but a constraint. This paper shows how STAMP can be applied to accident analysis using three different views or models of the accident process and proposes a notation for describing this process.

  7. Reactivity Insertion Accident Analysis with Coupled RETRAN

    SciTech Connect

    Kim, Yo-Han; Yang, Chang-Keun; Sung, Chang-Kyung; Lee, Chang-Sup

    2004-07-01

    As the required scope of safety analysis becomes wider and more complicated, the narrow analysis scope and limited functions of current vendor code systems present difficulties. To overcome this, KEPRI has developed an in-house safety analysis methodology based on available, well-established codes. For the development, the RETRAN code was modified and coupled to compensate for the lack of capabilities. To assess the feasibility of the methodology and code system, selected reactivity insertion accidents were analyzed and compared with the results presented in the final safety analysis report of the plant. (authors)

  8. Accident Tolerant Fuel Analysis

    SciTech Connect

    Curtis Smith; Heather Chichester; Jesse Johns; Melissa Teague; Michael Tonks; Robert Youngblood

    2014-09-01

    Safety is central to the design, licensing, operation, and economics of Nuclear Power Plants (NPPs). Consequently, the ability to better characterize and quantify safety margin holds the key to improved decision making about light water reactor design, operation, and plant life extension. A systematic approach to characterization of safety margins and the subsequent margins management options represents a vital input to the licensee and regulatory analysis and decision making that will be involved. The purpose of the Risk Informed Safety Margin Characterization (RISMC) Pathway research and development (R&D) is to support plant decisions for risk-informed margins management by improving economics and reliability, and sustaining safety, of current NPPs. Goals of the RISMC Pathway are twofold: (1) Develop and demonstrate a risk-assessment method coupled to safety margin quantification that can be used by NPP decision makers as part of their margin recovery strategies. (2) Create an advanced “RISMC toolkit” that enables more accurate representation of NPP safety margin. In order to carry out the R&D needed for the Pathway, the Idaho National Laboratory is performing a series of case studies that will explore methods- and tools-development issues, in addition to being of current interest in their own right. One such study is a comparative analysis of safety margins of plants using different fuel cladding types: specifically, a comparison between current-technology Zircaloy cladding and a notional “accident-tolerant” (e.g., SiC-based) cladding. The present report begins the process of applying capabilities that are still under development to the problem of assessing new fuel designs. The approach and lessons learned from this case study will be included in future Technical Basis Guides produced by the RISMC Pathway. These guides will be the mechanism for developing the specifications for RISMC tools and for defining how plant decision makers should propose and
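    One way to picture the safety-margin quantification described above is as the probability that a sampled load exceeds a sampled capacity. The sketch below assumes notional normal distributions for peak cladding temperature and cladding failure limit; the numbers are illustrative, not Zircaloy or SiC data.

      import random

      random.seed(1)

      def exceedance_probability(n_samples=200_000):
          """Fraction of samples in which the load exceeds the capacity."""
          exceed = 0
          for _ in range(n_samples):
              load = random.gauss(1800.0, 150.0)       # sampled peak cladding temperature, F
              capacity = random.gauss(2200.0, 100.0)   # sampled cladding failure limit, F
              if load > capacity:
                  exceed += 1
          return exceed / n_samples

      print(f"P(load > capacity) ~ {exceedance_probability():.2e}")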

  9. Accident tolerant fuel analysis

    SciTech Connect

    Smith, Curtis; Chichester, Heather; Johns, Jesse; Teague, Melissa; Tonks, Michael (Idaho National Laboratory); Youngblood, Robert

    2014-09-01

    Safety is central to the design, licensing, operation, and economics of Nuclear Power Plants (NPPs). Consequently, the ability to better characterize and quantify safety margin holds the key to improved decision making about light water reactor design, operation, and plant life extension. A systematic approach to characterization of safety margins and the subsequent margins management options represents a vital input to the licensee and regulatory analysis and decision making that will be involved. The purpose of the Risk Informed Safety Margin Characterization (RISMC) Pathway research and development (R&D) is to support plant decisions for risk-informed margins management by improving economics and reliability, and sustaining safety, of current NPPs. Goals of the RISMC Pathway are twofold: (1) Develop and demonstrate a risk-assessment method coupled to safety margin quantification that can be used by NPP decision makers as part of their margin recovery strategies. (2) Create an advanced ''RISMC toolkit'' that enables more accurate representation of NPP safety margin. In order to carry out the R&D needed for the Pathway, the Idaho National Laboratory is performing a series of case studies that will explore methods- and tools-development issues, in addition to being of current interest in their own right. One such study is a comparative analysis of safety margins of plants using different fuel cladding types: specifically, a comparison between current-technology Zircaloy cladding and a notional ''accident-tolerant'' (e.g., SiC-based) cladding. The present report begins the process of applying capabilities that are still under development to the problem of assessing new fuel designs. The approach and lessons learned from this case study will be included in future Technical Basis Guides produced by the RISMC Pathway. These guides will be the mechanism for developing the specifications for RISMC tools and for defining how plant decision makers should propose and

  10. Aircraft accidents: method of analysis

    NASA Technical Reports Server (NTRS)

    1929-01-01

    This report on a method of analysis of aircraft accidents has been prepared by a special committee on the nomenclature, subdivision, and classification of aircraft accidents organized by the National Advisory Committee for Aeronautics in response to a request dated February 18, 1928, from the Air Coordination Committee consisting of the Assistant Secretaries for Aeronautics in the Departments of War, Navy, and Commerce. The work was undertaken in recognition of the difficulty of drawing correct conclusions from efforts to analyze and compare reports of aircraft accidents prepared by different organizations using different classifications and definitions. The Air Coordination Committee's request was made "in order that practices used may henceforth conform to a standard and be universally comparable." The purpose of the special committee therefore was to prepare a basis for the classification and comparison of aircraft accidents, both civil and military. (author)

  11. Final report of the accident phenomenology and consequence (APAC) methodology evaluation. Spills Working Group

    SciTech Connect

    Brereton, S.; Shinn, J.; Hesse, D; Kaninich, D.; Lazaro, M.; Mubayi, V.

    1997-08-01

    The Spills Working Group was one of six working groups established under the Accident Phenomenology and Consequence (APAC) methodology evaluation program. The objectives of APAC were to assess methodologies available in the accident phenomenology and consequence analysis area and to evaluate their adequacy for use in preparing DOE facility safety basis documentation, such as Basis for Interim Operation (BIO), Justification for Continued Operation (JCO), Hazard Analysis Documents, and Safety Analysis Reports (SARs). Additional objectives of APAC were to identify development needs and to define standard practices to be followed in the analyses supporting facility safety basis documentation. The Spills Working Group focused on methodologies for estimating four types of spill source terms: liquid chemical spills and evaporation, pressurized liquid/gas releases, solid spills and resuspension/sublimation, and resuspension of particulate matter from liquid spills.

  12. Nuclear fuel cycle facility accident analysis handbook

    SciTech Connect

    Ayer, J E; Clark, A T; Loysen, P; Ballinger, M Y; Mishima, J; Owczarski, P C; Gregory, W S; Nichols, B D

    1988-05-01

    The Accident Analysis Handbook (AAH) covers four generic facilities: fuel manufacturing, fuel reprocessing, waste storage/solidification, and spent fuel storage; and six accident types: fire, explosion, tornado, criticality, spill, and equipment failure. These are the accident types considered to make major contributions to the radiological risk from accidents in nuclear fuel cycle facility operations. The AAH will enable the user to calculate source term releases from accident scenarios manually or by computer. A major feature of the AAH is development of accident sample problems to provide input to source term analysis methods and transport computer codes. Sample problems and illustrative examples for different accident types are included in the AAH.
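    Hand calculations of this kind are often organized around a multiplicative source-term expression such as the five-factor formula of DOE-HDBK-3010 (whether the AAH uses this exact factorization is not stated in the abstract). A minimal sketch, with all factor values invented:

      def source_term(mar_g, damage_ratio, airborne_release_fraction,
                      respirable_fraction, leak_path_factor):
          """Respirable source term (g) = MAR * DR * ARF * RF * LPF."""
          return (mar_g * damage_ratio * airborne_release_fraction
                  * respirable_fraction * leak_path_factor)

      # Hypothetical spill of 500 g of dispersible powder; all factors illustrative.
      st = source_term(mar_g=500.0, damage_ratio=0.1,
                       airborne_release_fraction=1.0e-3,
                       respirable_fraction=0.5, leak_path_factor=0.1)
      print(f"respirable release to the environment ~ {st:.3e} g")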

  13. Reactor Safety Gap Evaluation of Accident Tolerant Components and Severe Accident Analysis

    SciTech Connect

    Farmer, Mitchell T.; Bunt, R.; Corradini, M.; Ellison, Paul B.; Francis, M.; Gabor, John D.; Gauntt, R.; Henry, C.; Linthicum, R.; Luangdilok, W.; Lutz, R.; Paik, C.; Plys, M.; Rabiti, Cristian; Rempe, J.; Robb, K.; Wachowiak, R.

    2015-01-31

    The overall objective of this study was to conduct a technology gap evaluation on accident tolerant components and severe accident analysis methodologies with the goal of identifying any data and/or knowledge gaps that may exist, given the current state of light water reactor (LWR) severe accident research, and additionally augmented by insights obtained from the Fukushima accident. The ultimate benefit of this activity is that the results can be used to refine the Department of Energy’s (DOE) Reactor Safety Technology (RST) research and development (R&D) program plan to address key knowledge gaps in severe accident phenomena and analyses that affect reactor safety and that are not currently being addressed by the industry or the Nuclear Regulatory Commission (NRC).

  14. Accident progression event tree analysis for postulated severe accidents at N Reactor

    SciTech Connect

    Wyss, G.D.; Camp, A.L.; Miller, L.A.; Dingman, S.E.; Kunsman, D.M.; Medford, G.T.

    1990-06-01

    A Level II/III probabilistic risk assessment (PRA) has been performed for N Reactor, a Department of Energy (DOE) production reactor located on the Hanford reservation in Washington. The accident progression analysis documented in this report determines how core damage accidents identified in the Level I PRA progress from fuel damage to confinement response and potential releases to the environment. The objectives of the study are to generate accident progression data for the Level II/III PRA source term model and to identify changes that could improve plant response under accident conditions. The scope of the analysis is comprehensive, excluding only sabotage and operator errors of commission. State-of-the-art methodology is employed, based largely on the methods developed by Sandia for the US Nuclear Regulatory Commission in support of the NUREG-1150 study. The accident progression model allows complex interactions and dependencies between systems to be explicitly considered. Latin Hypercube sampling was used to assess the phenomenological and systemic uncertainties associated with the primary and confinement system responses to the core damage accident. The results of the analysis show that the N Reactor confinement concept provides significant radiological protection for most of the accident progression pathways studied.
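    Latin Hypercube sampling itself is straightforward to sketch: each uncertain variable is stratified into n equal-probability bins, one value is drawn per bin, and the bins are randomly paired across variables. The toy below produces uniform samples on [0, 1]; mapping them to the actual phenomenological distributions used in a PRA is problem-specific.

      import random

      def latin_hypercube(n_samples, n_vars, rng=random.Random(42)):
          """Uniform LHS on [0, 1]: one draw from each of n_samples strata per
          variable, with the strata randomly paired across variables."""
          columns = []
          for _ in range(n_vars):
              column = [(i + rng.random()) / n_samples for i in range(n_samples)]
              rng.shuffle(column)
              columns.append(column)
          return list(zip(*columns))   # one row per sample, one column per variable

      for row in latin_hypercube(5, 3):
          print([round(v, 3) for v in row])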

  15. Decision-problem state analysis methodology

    NASA Technical Reports Server (NTRS)

    Dieterly, D. L.

    1980-01-01

    A methodology for analyzing a decision-problem state is presented. The methodology is based on the analysis of an incident in terms of the set of decision-problem conditions encountered. By decomposing the events that preceded an unwanted outcome, such as an accident, into the set of decision-problem conditions that were resolved, a more comprehensive understanding is possible. Not all human-error accidents are caused by faulty decision-problem resolutions, but this appears to be one of the major areas of accidents cited in the literature. A three-phase methodology is presented which accommodates a wide spectrum of events. It allows for a systems content analysis of the available data to establish: (1) the resolutions made, (2) alternatives not considered, (3) resolutions missed, and (4) possible conditions not considered. The product is a map of the decision-problem conditions that were encountered as well as a projected, assumed set of conditions that should have been considered. The application of this methodology introduces a systematic approach to decomposing the events that transpired prior to the accident. The initial emphasis is on decision and problem resolution. The technique allows for a standardized method of decomposing an accident into a scenario which may be used for review or for the development of a training simulation.

  16. Regional Shelter Analysis Methodology

    SciTech Connect

    Dillon, Michael B.; Dennison, Deborah; Kane, Jave; Walker, Hoyt; Miller, Paul

    2015-08-01

    The fallout from a nuclear explosion has the potential to injure or kill 100,000 or more people through exposure to external gamma (fallout) radiation. Existing buildings can reduce radiation exposure by placing material between fallout particles and exposed people. Lawrence Livermore National Laboratory was tasked with developing an operationally feasible methodology that could improve fallout casualty estimates. The methodology, called a Regional Shelter Analysis, combines the fallout protection that existing buildings provide civilian populations with the distribution of people in various locations. The Regional Shelter Analysis method allows the consideration of (a) multiple building types and locations within buildings, (b) country specific estimates, (c) population posture (e.g., unwarned vs. minimally warned), and (d) the time of day (e.g., night vs. day). The protection estimates can be combined with fallout predictions (or measurements) to (a) provide a more accurate assessment of exposure and injury and (b) evaluate the effectiveness of various casualty mitigation strategies. This report describes the Regional Shelter Analysis methodology, highlights key operational aspects (including demonstrating that the methodology is compatible with current tools), illustrates how to implement the methodology, and provides suggestions for future work.
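    The core bookkeeping step of such an analysis can be sketched as a population-weighted combination of building protection factors. The shelter categories, occupancy fractions, and protection factors below are invented, not values from the LLNL methodology.

      SHELTER_MIX = [
          # (description, fraction of population, fallout protection factor PF)
          ("outdoors / unwarned",                0.10,  1.0),
          ("single-story wood frame",            0.50,  3.0),
          ("multi-story masonry, upper floors",  0.30, 10.0),
          ("basement",                           0.10, 40.0),
      ]

      def mean_dose_factor(shelter_mix):
          """Population-averaged ratio of sheltered to unsheltered dose (sum of f/PF)."""
          return sum(fraction / pf for _, fraction, pf in shelter_mix)

      unsheltered_dose = 100.0   # hypothetical outdoor fallout dose, arbitrary units
      print(f"population-mean dose ~ {unsheltered_dose * mean_dose_factor(SHELTER_MIX):.1f}")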

  17. Aircraft Loss-of-Control Accident Analysis

    NASA Technical Reports Server (NTRS)

    Belcastro, Christine M.; Foster, John V.

    2010-01-01

    Loss of control remains one of the largest contributors to fatal aircraft accidents worldwide. Aircraft loss-of-control accidents are complex in that they can result from numerous causal and contributing factors acting alone or (more often) in combination. Hence, there is no single intervention strategy to prevent these accidents. To gain a better understanding into aircraft loss-of-control events and possible intervention strategies, this paper presents a detailed analysis of loss-of-control accident data (predominantly from Part 121), including worst case combinations of causal and contributing factors and their sequencing. Future potential risks are also considered.

  18. Corporate cost of occupational accidents: an activity-based analysis.

    PubMed

    Rikhardsson, Pall M; Impgaard, Martin

    2004-03-01

    The systematic accident cost analysis (SACA) project was carried out during 2001 by The Aarhus School of Business and PricewaterhouseCoopers Denmark with financial support from The Danish National Working Environment Authority. It focused on developing and testing a method for evaluating the costs of occupational accidents to companies, for use by occupational health and safety professionals. The method was tested in nine Danish companies within three different industry sectors and the costs of 27 selected occupational accidents in these companies were calculated. One of the main conclusions is that the SACA method could be used in all of the companies without revisions. The evaluation of accident costs showed that 2/3 of the costs of occupational accidents are visible in the Danish corporate accounting systems reviewed while 1/3 is hidden from management view. The highest cost of occupational accidents, for a company with 3,600 employees, was estimated at approximately US$682,000. The paper includes an introduction regarding accident cost analysis in companies, a presentation of the SACA project methodology and the SACA method itself, a short overview of some of the results of the SACA project and a conclusion. Further information about the project is available at http://www.asb.dk/saca. PMID:14642872
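    A minimal illustration of the activity-based idea, assuming a simple split between cost items visible in the accounting system and hidden ones; the items and amounts are invented, chosen only so the split roughly echoes the 2/3-to-1/3 finding.

      COST_ITEMS = [
          # (item, amount in US$, visible in the corporate accounting system?)
          ("sick pay to injured employee",        4_000, True),
          ("repair of damaged equipment",         3_500, True),
          ("insurance claim administration",        800, True),
          ("overtime for replacement worker",     1_800, False),
          ("supervisor time for investigation",     900, False),
          ("production delay and re-planning",    1_400, False),
      ]

      visible = sum(amount for _, amount, vis in COST_ITEMS if vis)
      hidden = sum(amount for _, amount, vis in COST_ITEMS if not vis)
      total = visible + hidden
      print(f"visible US${visible}, hidden US${hidden}, hidden share {hidden / total:.0%}")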

  19. Corporate cost of occupational accidents: an activity-based analysis.

    PubMed

    Rikhardsson, Pall M; Impgaard, Martin

    2004-03-01

    The systematic accident cost analysis (SACA) project was carried out during 2001 by The Aarhus School of Business and PricewaterhouseCoopers Denmark with financial support from The Danish National Working Environment Authority. It focused on developing and testing a method for evaluating the costs of occupational accidents to companies, for use by occupational health and safety professionals. The method was tested in nine Danish companies within three different industry sectors and the costs of 27 selected occupational accidents in these companies were calculated. One of the main conclusions is that the SACA method could be used in all of the companies without revisions. The evaluation of accident costs showed that 2/3 of the costs of occupational accidents are visible in the Danish corporate accounting systems reviewed while 1/3 is hidden from management view. The highest cost of occupational accidents, for a company with 3,600 employees, was estimated at approximately US$682,000. The paper includes an introduction regarding accident cost analysis in companies, a presentation of the SACA project methodology and the SACA method itself, a short overview of some of the results of the SACA project and a conclusion. Further information about the project is available at http://www.asb.dk/saca.

  20. A methodology for analyzing precursors to earthquake-initiated and fire-initiated accident sequences

    SciTech Connect

    Budnitz, R.J.; Lambert, H.E.; Apostolakis, G.

    1998-04-01

    This report covers work to develop a methodology for analyzing precursors to both earthquake-initiated and fire-initiated accidents at commercial nuclear power plants. Currently, the U.S. Nuclear Regulatory Commission sponsors a large ongoing project, the Accident Sequence Precursor project, to analyze the safety significance of other types of accident precursors, such as those arising from internally-initiated transients and pipe breaks, but earthquakes and fires are not within the current scope. The results of this project are that: (1) an overall step-by-step methodology has been developed for precursors to both fire-initiated and seismic-initiated potential accidents; (2) some stylized case-study examples are provided to demonstrate how the fully-developed methodology works in practice; and (3) a generic seismic-fragility data base for equipment is provided for use in seismic-precursor analyses. 44 refs., 23 figs., 16 tabs.
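    One building block of seismic precursor analysis is the convolution of a discretized hazard curve with a lognormal component fragility to obtain an annual failure frequency. The sketch below assumes that standard lognormal fragility form; the hazard bins and fragility parameters (median capacity, log-standard deviation) are placeholders, not values from the report's generic fragility data base.

      import math

      def lognormal_fragility(a, median_capacity, beta):
          """P(component failure | peak ground acceleration a), lognormal model."""
          z = math.log(a / median_capacity) / beta
          return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

      # (bin mean PGA in g, annual frequency of ground motion falling in that bin)
      HAZARD_BINS = [(0.1, 1.0e-2), (0.2, 2.0e-3), (0.4, 3.0e-4), (0.8, 2.0e-5)]

      failure_freq = sum(freq * lognormal_fragility(a, median_capacity=0.6, beta=0.4)
                         for a, freq in HAZARD_BINS)
      print(f"annual seismic-induced failure frequency ~ {failure_freq:.2e}/yr")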

  1. An analysis of aircraft accidents involving fires

    NASA Technical Reports Server (NTRS)

    Lucha, G. V.; Robertson, M. A.; Schooley, F. A.

    1975-01-01

    All U. S. Air Carrier accidents between 1963 and 1974 were studied to assess the extent of total personnel and aircraft damage which occurred in accidents and in accidents involving fire. Published accident reports and NTSB investigators' factual backup files were the primary sources of data. Although it was frequently not possible to assess the relative extent of fire-caused damage versus impact damage using the available data, the study established upper and lower bounds for deaths and damage due specifically to fire. In 12 years there were 122 accidents which involved airframe fires. Eighty-seven percent of the fires occurred after impact, and fuel leakage from ruptured tanks or severed lines was the most frequently cited cause. A cost analysis was performed for 300 serious accidents, including 92 serious accidents which involved fire. Personal injury costs were outside the scope of the cost analysis, but data on personnel injury judgements as well as settlements received from the CAB are included for reference.

  2. Calculation of relative tube/tube support plate displacements in steam generators under accident condition loads using non-linear dynamic analysis methodologies

    SciTech Connect

    Smith, R.E.; Waisman, R.; Hu, M.H.; Frick, T.M.

    1995-12-01

    A non-linear analysis has been performed to determine relative motions between tubes and tube support plates (TSPs) during a steam line break (SLB) event for steam generators. The SLB event results in blowdown of steam and water out of the steam generator. The fluid blowdown generates pressure drops across the TSPs, resulting in out-of-plane motion. The SLB-induced pressure loads are calculated with a computer program that uses a drift-flux modeling of the two-phase flow. In order to determine the relative tube/TSP motions, a non-linear dynamic time-history analysis is performed using a structural model that considers all of the significant component members relative to the tube support system. The dynamic response of the structure to the pressure loads is calculated using a special-purpose computer program. This program links the various substructures at common degrees of freedom into a combined mass and stiffness matrix. The program accounts for structural non-linearities, including potential tube and TSP interaction at any given tube position. The program also accounts for structural damping as part of the dynamic response. Incorporating all of the above effects, the equations of motion are solved to give TSP displacements at the reduced set of DOF. Using the displacement results from the dynamic analysis, plate stresses are then calculated using the detailed component models. Displacements from the dynamic analysis are imposed as boundary conditions at the DOF locations, and the finite element program then solves for the overall distorted geometry. Calculations are also performed to assure that assumptions regarding elastic response of the various structural members and support points are valid.
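    The gap/contact non-linearity at the heart of such an analysis can be illustrated with a toy single-degree-of-freedom model: a tube mass driven by a pressure pulse closes a clearance gap and then loads a much stiffer support-plate spring. This is a simplified semi-implicit Euler integration, not the special-purpose program referenced above, and every structural parameter is invented.

      M, C, K = 1.0, 0.5, 400.0        # tube mass (kg), damping (N*s/m), tube stiffness (N/m)
      GAP, K_CONTACT = 0.002, 2.0e5    # radial clearance (m), TSP contact stiffness (N/m)
      DT, T_END = 1.0e-4, 0.5          # time step and duration (s)

      def pulse(t):
          """Short SLB-like pressure load on the tube (N), decaying after 50 ms."""
          return 50.0 if t < 0.05 else 0.0

      x, v, t = 0.0, 0.0, 0.0
      peak_contact_force = 0.0
      while t < T_END:
          if abs(x) > GAP:
              # support-plate spring engages only once the clearance is closed
              f_contact = -K_CONTACT * (abs(x) - GAP) * (1.0 if x > 0 else -1.0)
          else:
              f_contact = 0.0
          a = (pulse(t) - C * v - K * x + f_contact) / M
          v += a * DT                   # semi-implicit Euler update
          x += v * DT
          peak_contact_force = max(peak_contact_force, abs(f_contact))
          t += DT

      print(f"peak tube/TSP contact force ~ {peak_contact_force:.1f} N")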

  3. Safety analysis of surface haulage accidents

    SciTech Connect

    Randolph, R.F.; Boldt, C.M.K.

    1996-12-31

    Research on improving haulage truck safety, started by the U.S. Bureau of Mines, is being continued by its successors. This paper reports the orientation of the renewed research efforts, beginning with an update on accident data analysis, the role of multiple causes in these accidents, and the search for practical methods for addressing the most important causes. Fatal haulage accidents most often involve loss of control or collisions caused by a variety of factors. Lost-time injuries most often involve sprains or strains to the back or multiple body areas, which can often be attributed to rough roads and the shocks of loading and unloading. Research to reduce these accidents includes improved warning systems, shock isolation for drivers, encouraging seatbelt usage, and general improvements to system and task design.

  4. HTGR severe accident sequence analysis

    SciTech Connect

    Harrington, R.M.; Ball, S.J.; Kornegay, F.C.

    1982-01-01

    Thermal-hydraulic, fission product transport, and atmospheric dispersion calculations are presented for hypothetical severe accident release paths at the Fort St. Vrain (FSV) high temperature gas cooled reactor (HTGR). Off-site radiation exposures are calculated for assumed release of 100% of the 24 hour post-shutdown core xenon and krypton inventory and 5.5% of the iodine inventory. The results show conditions under which dose avoidance measures would be desirable and demonstrate the importance of specific release characteristics such as effective release height. 7 tables.

  5. Anthropotechnological analysis of industrial accidents in Brazil.

    PubMed Central

    Binder, M. C.; de Almeida, I. M.; Monteau, M.

    1999-01-01

    The Brazilian Ministry of Labour has been attempting to modify the norms used to analyse industrial accidents in the country. For this purpose, in 1994 it tried to make compulsory use of the causal tree approach to accident analysis, an approach developed in France during the 1970s, without having previously determined whether it is suitable for use under the industrial safety conditions that prevail in most Brazilian firms. In addition, opposition from Brazilian employers has blocked the proposed changes to the norms. The present study employed anthropotechnology to analyse experimental application of the causal tree method to work-related accidents in industrial firms in the region of Botucatu, São Paulo. Three work-related accidents were examined in three industrial firms representative of local, national and multinational companies. On the basis of the accidents analysed in this study, the rationale for the use of the causal tree method in Brazil can be summarized for each type of firm as follows: the method is redundant if there is a predominance of the type of risk whose elimination or neutralization requires adoption of conventional industrial safety measures (firm representative of local enterprises); the method is worthwhile if the company's specific technical risks have already largely been eliminated (firm representative of national enterprises); and the method is particularly appropriate if the firm has a good safety record and the causes of accidents are primarily related to industrial organization and management (multinational enterprise). PMID:10680249

  6. Single pilot IFR accident data analysis

    NASA Technical Reports Server (NTRS)

    Harris, D. F.; Morrisete, J. A.

    1982-01-01

    The aircraft accident data recorded and maintained by the National Transportation Safety Board for 1964 to 1979 were analyzed to determine what problems exist in the general aviation single pilot instrument flight rules environment. A previous study conducted in 1978 for the years 1964 to 1975 provided a basis for comparison. The purpose was to determine what changes, if any, have occurred in trends and cause-effect relationships reported in the earlier study. The increasing numbers have been tied to measures of activity to produce accident rates which in turn were analyzed in terms of change. Where anomalies or unusually high accident rates were encountered, further analysis was conducted to isolate pertinent patterns of cause factors and/or experience levels of involved pilots. The bulk of the effort addresses accidents in the landing phase of operations. A detailed analysis was performed on controlled/uncontrolled collisions and their unique attributes delineated. Estimates of day vs. night general aviation activity and accident rates were obtained.

  7. Accident patterns for construction-related workers: a cluster analysis

    NASA Astrophysics Data System (ADS)

    Liao, Chia-Wen; Tyan, Yaw-Yauan

    2012-01-01

    The construction industry has been identified as one of the most hazardous industries. The risk to construction-related workers is far greater than that in a manufacturing-based industry. However, some steps can be taken to reduce worker risk through effective injury prevention strategies. In this article, k-means clustering methodology is employed in specifying the factors related to different worker types and in identifying the patterns of industrial occupational accidents. Accident reports during the period 1998 to 2008 are extracted from case reports of the Northern Region Inspection Office of the Council of Labor Affairs of Taiwan. The results show that the cluster analysis can indicate some patterns of occupational injuries in the construction industry. Inspection plans should be proposed according to the type of construction-related workers. The findings provide a direction for more effective inspection strategies and injury prevention programs.
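    A minimal sketch of the clustering step, assuming one-hot encoding of a few categorical accident attributes and scikit-learn's k-means; the feature coding, the choice of k, and the tiny synthetic data set are illustrative, not the paper's data.

      from sklearn.cluster import KMeans
      from sklearn.preprocessing import OneHotEncoder

      # (worker type, accident type, injured body part) for a few synthetic reports
      records = [
          ("scaffolder", "fall", "head"),
          ("scaffolder", "fall", "back"),
          ("electrician", "shock", "hand"),
          ("electrician", "shock", "hand"),
          ("labourer", "struck-by", "leg"),
          ("labourer", "struck-by", "back"),
      ]

      X = OneHotEncoder().fit_transform(records).toarray()   # one-hot numeric features
      labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(X)
      for record, label in zip(records, labels):
          print(label, record)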

  8. Accident patterns for construction-related workers: a cluster analysis

    NASA Astrophysics Data System (ADS)

    Liao, Chia-Wen; Tyan, Yaw-Yauan

    2011-12-01

    The construction industry has been identified as one of the most hazardous industries. The risk to construction-related workers is far greater than that in a manufacturing-based industry. However, some steps can be taken to reduce worker risk through effective injury prevention strategies. In this article, k-means clustering methodology is employed in specifying the factors related to different worker types and in identifying the patterns of industrial occupational accidents. Accident reports during the period 1998 to 2008 are extracted from case reports of the Northern Region Inspection Office of the Council of Labor Affairs of Taiwan. The results show that the cluster analysis can indicate some patterns of occupational injuries in the construction industry. Inspection plans should be proposed according to the type of construction-related workers. The findings provide a direction for more effective inspection strategies and injury prevention programs.

  9. Risk analysis methodology survey

    NASA Technical Reports Server (NTRS)

    Batson, Robert G.

    1987-01-01

    NASA regulations require that formal risk analysis be performed on a program at each of several milestones as it moves toward full-scale development. Program risk analysis is discussed as a systems analysis approach, an iterative process (identification, assessment, management), and a collection of techniques. These techniques, which range from simple methods to complex network-based simulations, were surveyed. A Program Risk Analysis Handbook was prepared in order to provide both analyst and manager with a guide for selection of the most appropriate technique.

  10. Comparing the Identification of Recommendations by Different Accident Investigators Using a Common Methodology

    NASA Technical Reports Server (NTRS)

    Johnson, Chris W.; Oltedal, H. A.; Holloway, C. M.

    2012-01-01

    Accident reports play a key role in the safety of complex systems. These reports present the recommendations that are intended to help avoid any recurrence of past failures. However, the value of these findings depends upon the causal analysis that helps to identify the reasons why an accident occurred. Various techniques have been developed to help investigators distinguish root causes from contributory factors and contextual information. This paper presents the results from a study into the individual differences that can arise when a group of investigators independently apply the same technique to identify the causes of an accident. This work is important if we are to increase the consistency and coherence of investigations following major accidents.

  11. Developing techniques for cause-responsibility analysis of occupational accidents.

    PubMed

    Jabbari, Mousa; Ghorbani, Roghayeh

    2016-11-01

    The aim of this study was to specify the causes of occupational accidents and to determine the social responsibility and the role of the groups involved in work-related accidents. The study develops an occupational accident causes tree, an occupational accident responsibility tree, and an occupational accident component-responsibility analysis worksheet; based on these methods, it develops cause-responsibility analysis (CRA) techniques and, to test them, analyzes 100 fatal/disabling occupational accidents in the construction setting that were randomly selected from all the work-related accidents in Tehran, Iran, over a 5-year period (2010-2014). The main result of this study is two techniques for CRA: occupational accidents tree analysis (OATA) and occupational accidents components analysis (OACA), used in parallel for the determination of responsible groups and responsibility rates. From the results, we find that the management group of construction projects bears 74.65% of the responsibility for work-related accidents. The developed techniques are useful for occupational accident investigation/analysis, especially for the determination of a detailed list of tasks, responsibilities, and their rates. They are therefore useful for preventing work-related accidents by focusing on the responsible groups' duties.

  12. Cross-analysis of hazmat road accidents using multiple databases.

    PubMed

    Trépanier, Martin; Leroux, Marie-Hélène; de Marcellis-Warin, Nathalie

    2009-11-01

    Road selection for hazardous materials transportation relies heavily on risk analysis. With risk being generally expressed as a product of the probability of occurrence and the expected consequence, one will understand that risk analysis is data intensive. However, various authors have noticed the lack of statistical reliability of hazmat accident databases due to the systematic underreporting of such events. Also, official accident databases alone do not always provide all the information required (economic impact, road conditions, etc.). In this paper, we attempt to integrate many data sources to analyze hazmat accidents in the province of Quebec, Canada. Databases on dangerous goods accidents, road accidents and work accidents were cross-analyzed. Results show that accidents can hardly be matched and that these databases suffer from underreporting. Police records seem to have better coverage than official records maintained by hazmat authorities. Serious accidents are missing from the government's official databases (some involving deaths or major spills) even though their declaration is mandatory.
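    The cross-analysis step can be sketched as record linkage between two databases, pairing records that fall on the same date and within a small distance of each other. The field names, tolerance, and sample records below are assumptions, not the structure of the Quebec databases studied.

      from datetime import date
      from math import hypot

      hazmat_db = [
          {"id": "H1", "date": date(2005, 6, 3), "x_km": 101.2, "y_km": 45.0},
      ]
      police_db = [
          {"id": "P7", "date": date(2005, 6, 3), "x_km": 101.9, "y_km": 44.6},
          {"id": "P9", "date": date(2005, 8, 1), "x_km": 300.0, "y_km": 10.0},
      ]

      def match(a, b, max_km=2.0):
          """Same date and within max_km of each other."""
          return (a["date"] == b["date"]
                  and hypot(a["x_km"] - b["x_km"], a["y_km"] - b["y_km"]) <= max_km)

      pairs = [(a["id"], b["id"]) for a in hazmat_db for b in police_db if match(a, b)]
      police_only = [b["id"] for b in police_db
                     if not any(match(a, b) for a in hazmat_db)]
      print("matched:", pairs, "| police-only (possible underreporting):", police_only)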

  13. Cross-analysis of hazmat road accidents using multiple databases.

    PubMed

    Trépanier, Martin; Leroux, Marie-Hélène; de Marcellis-Warin, Nathalie

    2009-11-01

    Road selection for hazardous materials transportation relies heavily on risk analysis. With risk being generally expressed as a product of the probability of occurrence and the expected consequence, one will understand that risk analysis is data intensive. However, various authors have noticed the lack of statistical reliability of hazmat accident databases due to the systematic underreporting of such events. Also, official accident databases alone do not always provide all the information required (economic impact, road conditions, etc.). In this paper, we attempt to integrate many data sources to analyze hazmat accidents in the province of Quebec, Canada. Databases on dangerous goods accidents, road accidents and work accidents were cross-analyzed. Results show that accidents can hardly be matched and that these databases suffer from underreporting. Police records seem to have better coverage than official records maintained by hazmat authorities. Serious accidents are missing from the government's official databases (some involving deaths or major spills) even though their declaration is mandatory. PMID:19819367

  14. Simplified Plant Analysis Risk (SPAR) Human Reliability Analysis (HRA) Methodology: Comparisons with other HRA Methods

    SciTech Connect

    Byers, James Clifford; Gertman, David Ira; Hill, Susan Gardiner; Blackman, Harold Stabler; Gentillon, Cynthia Ann; Hallbert, Bruce Perry; Haney, Lon Nolan

    2000-08-01

    The 1994 Accident Sequence Precursor (ASP) human reliability analysis (HRA) methodology was developed for the U.S. Nuclear Regulatory Commission (USNRC) in 1994 by the Idaho National Engineering and Environmental Laboratory (INEEL). It was decided to revise that methodology for use by the Simplified Plant Analysis Risk (SPAR) program. The 1994 ASP HRA methodology was compared, by a team of analysts, on a point-by-point basis to a variety of other HRA methods and sources. This paper briefly discusses how the comparisons were made and how the 1994 ASP HRA methodology was revised to incorporate desirable aspects of other methods. The revised methodology was renamed the SPAR HRA methodology.

  15. Canister storage building design basis accident analysis documentation

    SciTech Connect

    KOPELIC, S.D.

    1999-02-25

    This document provides the detailed accident analysis to support HNF-3553, Spent Nuclear Fuel Project Final Safety Analysis Report, Annex A, ''Canister Storage Building Final Safety Analysis Report.'' All assumptions, parameters, and models used to provide the analysis of the design basis accidents are documented to support the conclusions in the Canister Storage Building Final Safety Analysis Report.

  16. Canister Storage Building (CSB) Design Basis Accident Analysis Documentation

    SciTech Connect

    CROWE, R.D.; PIEPHO, M.G.

    2000-03-23

    This document provided the detailed accident analysis to support HNF-3553, Spent Nuclear Fuel Project Final Safety Analysis Report, Annex A, ''Canister Storage Building Final Safety Analysis Report''. All assumptions, parameters, and models used to provide the analysis of the design basis accidents are documented to support the conclusions in the Canister Storage Building Final Safety Analysis Report.

  17. Canister Storage Building (CSB) Design Basis Accident Analysis Documentation

    SciTech Connect

    CROWE, R.D.

    1999-09-09

    This document provides the detailed accident analysis to support HNF-3553, Spent Nuclear Fuel Project Final Safety Analysis Report, Annex A, ''Canister Storage Building Final Safety Analysis Report.'' All assumptions, parameters, and models used to provide the analysis of the design basis accidents are documented to support the conclusions in the Canister Storage Building Final Safety Analysis Report.

  18. Cold Vacuum Drying (CVD) Facility Design Basis Accident Analysis Documentation

    SciTech Connect

    PIEPHO, M.G.

    1999-10-20

    This document provides the detailed accident analysis to support HNF-3553, Annex B, Spent Nuclear Fuel Project Final Safety Analysis Report, ''Cold Vacuum Drying Facility Final Safety Analysis Report (FSAR).'' All assumptions, parameters and models used to provide the analysis of the design basis accidents are documented to support the conclusions in the FSAR.

  19. Exploratory analysis of Spanish energetic mining accidents.

    PubMed

    Sanmiquel, Lluís; Freijo, Modesto; Rossell, Josep M

    2012-01-01

    Using data on work accidents and annual mining statistics, the paper studies work-related accidents in the Spanish energetic mining sector in 1999-2008. The following 3 parameters are considered: age, experience and size of the mine (in number of workers) where the accident took place. The main objective of this paper is to show the relationship between different accident indicators: risk index (as an expression of the incidence), average duration index for the age and size of the mine variables (as a measure of the seriousness of an accident), and the gravity index for the various sizes of mines (which measures the seriousness of an accident, too). The conclusions of this study could be useful to develop suitable prevention policies that would contribute towards a decrease in work-related accidents in the Spanish energetic mining industry. PMID:22721539
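    For concreteness, one common set of definitions for such indicators is sketched below (the exact formulas used in the paper may differ): a risk index per thousand workers, an average duration in days lost per accident, and a gravity index in days lost per thousand hours worked. All figures are invented.

      def risk_index(n_accidents, n_workers):
          """Accidents per 1000 workers (an incidence-type index)."""
          return 1000.0 * n_accidents / n_workers

      def average_duration_index(days_lost, n_accidents):
          """Mean days lost per accident (a seriousness measure)."""
          return days_lost / n_accidents

      def gravity_index(days_lost, hours_worked):
          """Days lost per 1000 hours worked (another seriousness measure)."""
          return 1000.0 * days_lost / hours_worked

      # Invented figures for a mine with 240 workers over one year
      print(f"risk index: {risk_index(18, 240):.1f} accidents per 1000 workers")
      print(f"average duration: {average_duration_index(410, 18):.1f} days lost per accident")
      print(f"gravity index: {gravity_index(410, 240 * 1700):.2f} days lost per 1000 hours")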

  20. [An analysis of industrial accidents in the working field with a particular emphasis on repeated accidents].

    PubMed

    Wakisaka, I; Yanagihashi, T; Tomari, T; Sato, M

    1990-03-01

    The present study is based on an analysis of routinely submitted reports of occupational accidents experienced by the workers of industrial enterprises under the jurisdiction of the Kagoshima Labor Standard Office during the 5-year period 1983 to 1987. Officially notified injuries serious enough to keep employees away from work for at least 4 days were utilized in this study. Data were classified so as to give an observed frequency distribution for workers having any specified number of accidents. Also, the accident rate, which is an indicator of the risk of accident, was compared among different occupations, between age groups and between the sexes. Results obtained are as follows: 1) For the combined total of 6,324 accident cases for 8 types of occupation (Construction, Transportation, Mining & Quarrying, Forestry, Food manufacture, Lumber & Woodcraft, Manufacturing industry and Other business), the number of those who had at least one accident was 6,098, of which 5,837 were injured only once, 208 twice, 21 three times and 2 four times. When occupation type was fixed, however, the numbers of workers having one, two, three and four accidents were 5,895, 182, 19 and 2, respectively. This suggests that some workers are likely to have experienced repeated accidents in more than one type of occupation.(ABSTRACT TRUNCATED AT 250 WORDS) PMID:2131982

  1. [An analysis of industrial accidents in the working field with a particular emphasis on repeated accidents].

    PubMed

    Wakisaka, I; Yanagihashi, T; Tomari, T; Sato, M

    1990-03-01

    The present study is based on an analysis of routinely submitted reports of occupational accidents experienced by the workers of industrial enterprises under the jurisdiction of the Kagoshima Labor Standard Office during the 5-year period 1983 to 1987. Officially notified injuries serious enough to keep employees away from work for at least 4 days were utilized in this study. Data were classified so as to give an observed frequency distribution for workers having any specified number of accidents. Also, the accident rate, which is an indicator of the risk of accident, was compared among different occupations, between age groups and between the sexes. Results obtained are as follows: 1) For the combined total of 6,324 accident cases for 8 types of occupation (Construction, Transportation, Mining & Quarrying, Forestry, Food manufacture, Lumber & Woodcraft, Manufacturing industry and Other business), the number of those who had at least one accident was 6,098, of which 5,837 were injured only once, 208 twice, 21 three times and 2 four times. When occupation type was fixed, however, the numbers of workers having one, two, three and four accidents were 5,895, 182, 19 and 2, respectively. This suggests that some workers are likely to have experienced repeated accidents in more than one type of occupation.(ABSTRACT TRUNCATED AT 250 WORDS)

  2. Development of economic consequence methodology for process risk analysis.

    PubMed

    Zadakbar, Omid; Khan, Faisal; Imtiaz, Syed

    2015-04-01

    A comprehensive methodology for economic consequence analysis with appropriate models for risk analysis of process systems is proposed. This methodology uses loss functions to relate process deviations in a given scenario to economic losses. It consists of four steps: definition of a scenario, identification of losses, quantification of losses, and integration of losses. In this methodology, the process deviations that contribute to a given accident scenario are identified and mapped to assess potential consequences. Losses are assessed with an appropriate loss function (revised Taguchi, modified inverted normal) for each type of loss. The total loss is quantified by integrating different loss functions. The proposed methodology has been examined on two industrial case studies. Implementation of this new economic consequence methodology in quantitative risk assessment will provide better understanding and quantification of risk. This will improve design, decision making, and risk management strategies.
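    The two loss-function shapes named above can be illustrated side by side: the quadratic Taguchi loss grows without bound away from the target, while an inverted-normal form saturates at a maximum loss. The parameterization below is a generic sketch, not the authors' revised Taguchi or modified inverted normal variants.

      import math

      def taguchi_loss(y, target, k):
          """Quadratic loss: L = k * (y - target)**2, unbounded away from the target."""
          return k * (y - target) ** 2

      def inverted_normal_loss(y, target, max_loss, gamma):
          """Saturating loss: L = max_loss * (1 - exp(-(y - target)**2 / (2*gamma**2)))."""
          return max_loss * (1.0 - math.exp(-((y - target) ** 2) / (2.0 * gamma ** 2)))

      for deviation in (0.0, 1.0, 3.0, 10.0):
          y = 50.0 + deviation   # a process variable drifting away from its target of 50
          print(deviation,
                round(taguchi_loss(y, 50.0, k=2.0), 1),
                round(inverted_normal_loss(y, 50.0, max_loss=100.0, gamma=2.0), 1))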

  3. Coupled thermal analysis applied to the study of the rod ejection accident

    SciTech Connect

    Gonnet, M.

    2012-07-01

    An advanced methodology for the assessment of fuel-rod thermal margins under RIA conditions has been developed by AREVA NP SAS. With the emergence of RIA analytical criteria, the study of the Rod Ejection Accident (REA) would normally require the analysis of each fuel rod, slice by slice, over the whole core. Up to now the strategy used to overcome this difficulty has been to perform separate analyses of sampled fuel pins with conservative hypotheses for thermal properties and boundary conditions. In the advanced methodology, the evaluation model for the Rod Ejection Accident (REA) integrates the node-average fuel and coolant properties calculation for neutron feedback purposes as well as the peak fuel and coolant time-dependent properties for criteria checking. The calculation grid for peak fuel and coolant properties can be specified from the assembly pitch down to the cell pitch. The comparative analysis of methodologies shows that the coupled methodology reduces the excessive conservatism of the uncoupled approach. (authors)

  4. Recent Methodology in Ginseng Analysis

    PubMed Central

    Baek, Seung-Hoon; Bae, Ok-Nam; Park, Jeong Hill

    2012-01-01

    In keeping with its popularity in herbal prescriptions and remedies, ginseng has become the focus of research in many scientific fields. Analytical methodologies for ginseng, referred to as ginseng analysis hereafter, have been developed for bioactive component discovery, phytochemical profiling, quality control, and pharmacokinetic studies. This review summarizes the most recent advances in ginseng analysis in the past half-decade, including emerging techniques and analytical trends. Ginseng analysis includes all of the leading analytical tools and serves as a representative model for the analytical research of herbal medicines. PMID:23717112

  5. Methodological development for selection of significant predictors explaining fatal road accidents.

    PubMed

    Dadashova, Bahar; Arenas-Ramírez, Blanca; Mira-McWilliams, José; Aparicio-Izquierdo, Francisco

    2016-05-01

    Identification of the most relevant factors for explaining road accident occurrence is an important issue in road safety research, particularly for future decision-making processes in transport policy. However, model selection for this particular purpose is still an area of ongoing research. In this paper we propose a methodological development for model selection which addresses both explanatory variable and adequate model selection issues. A variable selection procedure, the TIM (two-input model) method, is carried out by combining neural network design and statistical approaches. The error structure of the fitted model is assumed to follow an autoregressive process. All models are estimated using the Markov Chain Monte Carlo method, where the model parameters are assigned non-informative prior distributions. The final model is built using the results of the variable selection. For the application of the proposed methodology, the number of fatal accidents in Spain during 2000-2011 was used. This indicator has experienced the maximum reduction internationally during the indicated years, thus making it an interesting time series from a road safety policy perspective. Hence the identification of the variables that have affected this reduction is of particular interest for future decision making. The results of the variable selection process show that the selected variables are main subjects of road safety policy measures. PMID:26928290

  6. TMI-2 accident: core heat-up analysis

    SciTech Connect

    Ardron, K.H.; Cain, D.G.

    1981-01-01

    This report summarizes NSAC study of reactor core thermal conditions during the accident at Three Mile Island, Unit 2. The study focuses primarily on the time period from core uncovery (approximately 113 minutes after turbine trip) through the initiation of sustained high pressure injection (after 202 minutes). The transient analysis is based upon established sequences of events; plant data; post-accident measurements; interpretation or indirect use of instrument responses to accident conditions.

  7. Aircraft accidents: method of analysis

    NASA Technical Reports Server (NTRS)

    1937-01-01

    This report is a revision of NACA-TR-357. It was prepared by the Committee on Aircraft Accidents. The purpose of this report is to provide a basis for the classification and comparison of aircraft accidents, both civil and military.

  8. Development of Database for Accident Analysis in Indian Mines

    NASA Astrophysics Data System (ADS)

    Tripathy, Debi Prasad; Guru Raghavendra Reddy, K.

    2015-08-01

    Mining is a hazardous industry and the high accident rates associated with underground mining are a cause of deep concern. Technological developments notwithstanding, rates of fatal accidents and reportable incidents have not shown corresponding levels of decline. This paper argues that adoption of appropriate safety standards by both mine management and the government may result in appreciable reduction in accident frequency. This can be achieved by using technology to improve working conditions and by sensitising workers and managers to the causes and prevention of accidents. Inputs required for a detailed analysis of an accident include information on location, time, type, cost of accident, victim, nature of injury, personal and environmental factors, etc. Such information can be generated from data available in the standard coded accident report form. This paper presents a web-based application for accident analysis in Indian mines during 2001-2013. An accident database prototype (SafeStat), based on an intranet using the TCP/IP protocol and developed by the authors, is also discussed.
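    The abstract does not describe the SafeStat schema, so the sketch below only illustrates the kind of record structure such a database needs; the table and field names are assumptions.

      import sqlite3

      conn = sqlite3.connect(":memory:")
      conn.execute("""
          CREATE TABLE accident (
              id               INTEGER PRIMARY KEY,
              mine_name        TEXT,
              occurred_on      TEXT,    -- ISO date
              location         TEXT,    -- e.g. face, haulage road, surface
              accident_type    TEXT,    -- e.g. roof fall, machinery, explosives
              severity         TEXT,    -- fatal / serious / reportable
              nature_of_injury TEXT,
              estimated_cost   REAL
          )""")
      conn.execute("INSERT INTO accident VALUES (1, 'Mine A', '2009-04-12', "
                   "'underground face', 'roof fall', 'fatal', 'crush injury', 250000.0)")
      for row in conn.execute("SELECT accident_type, COUNT(*) FROM accident "
                              "GROUP BY accident_type"):
          print(row)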

  9. NASA's Accident Precursor Analysis Process and the International Space Station

    NASA Technical Reports Server (NTRS)

    Groen, Frank; Lutomski, Michael

    2010-01-01

    This viewgraph presentation reviews the implementation of Accident Precursor Analysis (APA), as well as the evaluation of In-Flight Investigations (IFI) and Problem Reporting and Corrective Action (PRACA) data for the identification of unrecognized accident potentials on the International Space Station.

  10. Chemical considerations in severe accident analysis

    SciTech Connect

    Malinauskas, A.P.; Kress, T.S.

    1988-01-01

    The Reactor Safety Study presented the first systematic attempt to include fission product physicochemical effects in the determination of expected consequences of hypothetical nuclear reactor power plant accidents. At the time, however, the data base was sparse, and the treatment of fission product behavior was not entirely consistent or accurate. Considerable research has since been performed to identify and understand chemical phenomena that can occur in the course of a nuclear reactor accident, and how these phenomena affect fission product behavior. In this report, the current status of our understanding of the chemistry of fission products in severe core damage accidents is summarized and contrasted with that of the Reactor Safety Study.

  11. Study of the possibility of using LANL PSA methodology for RBMK accident probability research

    SciTech Connect

    Petrin, S.V.; Yuferev, V.Y.; Zlobin, A.M.

    1995-12-31

    The reactor facility probabilistic safety analysis (PSA) methodologies used at LANL (U.S.) and NIKIET (Russian Federation) are considered. The methodologies are compared in order to reveal their similarities and differences and to determine the possibility of using the LANL technique for RBMK-type reactor safety analysis. It is found that at the PSA-1 level the methodologies practically do not differ. At LANL, the PHA and HAZOP hazard analysis methods are used to specify more completely the list of initiating events considered, which can also be useful when performing PSA for RBMK reactors. An exchange of information on the methodology for detecting dependent faults and for treating the impact of human factors on reactor safety is considered worthwhile. A comparative analysis of results for test problems or PSA fragments, using the various computer programs employed at NIKIET and LANL, is also regarded as useful.

  12. Hazmat transport: a methodological framework for the risk analysis of marshalling yards.

    PubMed

    Cozzani, Valerio; Bonvicini, Sarah; Spadoni, Gigliola; Zanelli, Severino

    2007-08-17

    A methodological framework was outlined for the comprehensive risk assessment of marshalling yards in the context of quantified area risk analysis. Three accident typologies were considered for yards: (i) "in-transit-accident-induced" releases; (ii) "shunting-accident-induced" spills; and (iii) "non-accident-induced" leaks. A specific methodology was developed for the assessment of expected release frequencies and equivalent release diameters, based on the application of HazOp and Fault Tree techniques to reference schemes defined for the more common types of railcar vessels used for "hazmat" transportation. The approach was applied to the assessment of an extended case study. The results showed that "non-accident-induced" leaks in marshalling yards represent an important contribution to the overall risk associated with these zones. Furthermore, the results confirmed the considerable role of these fixed installations in the overall risk associated with "hazmat" transportation. PMID:17418942

  14. An analysis of pilot error-related aircraft accidents

    NASA Technical Reports Server (NTRS)

    Kowalsky, N. B.; Masters, R. L.; Stone, R. B.; Babcock, G. L.; Rypka, E. W.

    1974-01-01

    A multidisciplinary team approach to pilot error-related U.S. air carrier jet aircraft accident investigation records successfully reclaimed hidden human error information not shown in statistical studies. New analytic techniques were developed and applied to the data to discover and identify multiple elements of commonality and shared characteristics within this group of accidents. Three techniques of analysis were used: (1) critical element analysis, which demonstrated the importance of a subjective qualitative approach to raw accident data and surfaced information heretofore unavailable; (2) cluster analysis, an exploratory research tool that will lead to increased understanding and improved organization of facts, the discovery of new meaning in large data sets, and the generation of explanatory hypotheses; and (3) pattern recognition, by which accidents can be categorized by pattern conformity after critical element identification by cluster analysis.

  15. A methodology for the quantitative risk assessment of major accidents triggered by seismic events.

    PubMed

    Antonioni, Giacomo; Spadoni, Gigliola; Cozzani, Valerio

    2007-08-17

    A procedure for the quantitative risk assessment of accidents triggered by seismic events in industrial facilities was developed. The starting point of the procedure was the use of available historical data to assess the expected frequencies and the severity of seismic events. Available equipment-dependent failure probability models (vulnerability or fragility curves) were used to assess the damage probability of equipment items due to a seismic event. An analytic procedure was subsequently developed to identify, evaluate the credibility of, and finally assess the expected consequences of all the possible scenarios that may follow the seismic events. The procedure was implemented in a GIS-based software tool in order to manage the high number of event sequences that are likely to be generated in large industrial facilities. The developed methodology requires a limited amount of additional data with respect to those used in a conventional QRA, and yields, with limited effort, a preliminary quantitative assessment of the contribution of the scenarios triggered by earthquakes to the individual and societal risk indexes. The application of the methodology to several case studies showed that the scenarios initiated by seismic events may have a relevant influence on industrial risk, both raising the overall expected frequency of single scenarios and causing specific severe scenarios simultaneously involving several plant units.

  16. Bus accident analysis of routes with/without bus priority.

    PubMed

    Goh, Kelvin Chun Keong; Currie, Graham; Sarvi, Majid; Logan, David

    2014-04-01

    This paper summarises findings on road safety performance and bus-involved accidents in Melbourne along roads where bus priority measures had been applied. Results from an empirical analysis of the accident types revealed a significant reduction in the proportion of accidents involving buses hitting stationary objects and vehicles, which suggests that bus priority helps address manoeuvrability issues for buses. Mixed-effects negative binomial (MENB) regression and back-propagation neural network (BPNN) modelling of bus accidents, considering wider influences on accident rates at the route-section level, also revealed significant safety benefits when bus priority is provided. Sensitivity analyses of the BPNN model showed general agreement in the predicted accident frequency between the two models. The slightly better performance of the MENB model suggests merit in adopting a mixed-effects modelling approach for accident count prediction in practice, given its capability to account for unobserved location- and time-specific factors. A major implication of this research is that bus priority in Melbourne's context acts to improve road safety and should be a major consideration for road management agencies when implementing bus priority and road schemes.
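
    As a hedged illustration of the count-modelling approach mentioned above, the sketch below fits a plain (fixed-effects) negative binomial model with an exposure offset. It is a simplified stand-in for the paper's mixed-effects negative binomial model, and every variable name and value is invented.

```python
# Hypothetical sketch of a negative binomial accident-count model; a simplified
# fixed-effects stand-in for the mixed-effects model used in the paper.
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n = 200
df = pd.DataFrame({
    "bus_priority": rng.integers(0, 2, n),      # 1 = priority measures on the section
    "bus_km": rng.uniform(1e4, 1e5, n),         # exposure per route section
    "traffic_volume": rng.uniform(5e3, 5e4, n),
})
# Synthetic counts with a modest protective effect of bus priority.
mu = np.exp(-12.0 - 0.3 * df["bus_priority"]
            + np.log(df["bus_km"]) + 0.2 * np.log(df["traffic_volume"]))
df["accidents"] = rng.poisson(mu)

# Fixed-effects negative binomial with bus-kilometres as an exposure offset.
model = smf.glm(
    "accidents ~ bus_priority + np.log(traffic_volume)",
    data=df,
    family=sm.families.NegativeBinomial(alpha=1.0),
    offset=np.log(df["bus_km"]),
)
print(model.fit().summary())
```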

  17. OFFSITE RADIOLOGICAL CONSEQUENCE ANALYSIS FOR THE BOUNDING FLAMMABLE GAS ACCIDENT

    SciTech Connect

    KRIPPS, L.J.

    2005-02-18

    This document quantifies the offsite radiological consequences of the bounding flammable gas accident for comparison with the 25 rem Evaluation Guideline established in DOE-STD-3009, Appendix A. DOE-STD-3009-94, ''Preparation Guide for U.S. Department of Energy Nonreactor Nuclear Facility Documented Safety Analyses'', requires the formal quantification of a limited subset of accidents representing a complete set of bounding conditions. The results of these analyses are then evaluated to determine whether they challenge the DOE-STD-3009-94, Appendix A, ''Evaluation Guideline,'' of 25 rem total effective dose equivalent, in order to identify and evaluate safety-class structures, systems, and components. The calculation applies reasonably conservative input parameters in accordance with guidance in DOE-STD-3009, Appendix A. The bounding flammable gas accident is a detonation in a single-shell tank (SST). A detonation rather than a deflagration was selected for analysis because the faster flame speed of a detonation can potentially result in a larger release of respirable material. A detonation in an SST rather than in a double-shell tank (DST) was selected as the bounding accident because the estimated respirable release masses are the same and because the doses per unit quantity of waste inhaled are greater for SSTs than for DSTs. Appendix A contains a DST analysis for comparison purposes.
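
    For orientation, the sketch below shows the generic source-term and inhalation-dose bookkeeping typically used in DOE-STD-3009-style consequence calculations: material at risk multiplied by release and respirable fractions, atmospheric dilution, breathing rate, and a dose factor. It is not the analysis documented in this record, and every numerical value is an invented placeholder.

```python
# Generic consequence-analysis bookkeeping sketch; all values are placeholders.
MAR = 1.0e6           # material at risk, g of waste (placeholder)
DR = 0.1              # damage ratio: fraction of MAR affected
ARF = 1.0e-3          # airborne release fraction
RF = 0.5              # respirable fraction
LPF = 1.0             # leak path factor (no filtration credited)
CHI_Q = 2.0e-5        # atmospheric dispersion factor at the receptor, s/m^3
BR = 3.3e-4           # receptor breathing rate, m^3/s
DOSE_PER_GRAM = 1e-2  # committed dose per gram of waste inhaled, rem/g (placeholder)

respirable_release = MAR * DR * ARF * RF * LPF   # grams made airborne and respirable
inhaled_grams = respirable_release * CHI_Q * BR  # grams inhaled by the offsite receptor
dose_rem = inhaled_grams * DOSE_PER_GRAM
print(f"offsite dose ~ {dose_rem:.1e} rem (compared with the 25 rem guideline)")
```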

  18. Analysis of tritium mission FMEF/FAA fuel handling accidents

    SciTech Connect

    Van Keuren, J.C.

    1997-11-18

    The Fuels Material Examination Facility/Fuel Assembly Area is proposed to be used for fabrication of mixed oxide fuel to support the Fast Flux Test Facility (FFTF) tritium/medical isotope mission. The plutonium isotope mix for the new mission is different from that analyzed in the FMEF safety analysis report. A reanalysis of three representative accidents was performed for the revised plutonium mix to determine the impact on the safety analysis. Current versions of the computer codes and meteorology data files were used for the analysis. The analyzed accidents were a criticality, an explosion in a glovebox, and a tornado. The analysis concluded that risk guidelines were met with the revised plutonium mix.

  19. Historical analysis of US pipeline accidents triggered by natural hazards

    NASA Astrophysics Data System (ADS)

    Girgin, Serkan; Krausmann, Elisabeth

    2015-04-01

    Natural hazards, such as earthquakes, floods, landslides, or lightning, can initiate accidents in oil and gas pipelines with potentially major consequences for the population or the environment due to toxic releases, fires and explosions. Accidents of this type are also referred to as Natech events. Many major accidents highlight the risk associated with natural-hazard impact on pipelines transporting dangerous substances. For instance, in the USA in 1994, flooding of the San Jacinto River caused the rupture of 8 pipelines and the undermining of 29 others by the floodwaters. About 5.5 million litres of petroleum and related products were spilled into the river and ignited. As a result, 547 people were injured and significant environmental damage occurred. Post-incident analysis is a valuable tool for better understanding the causes, dynamics and impacts of pipeline Natech accidents in support of future accident prevention and mitigation. Therefore, data on onshore hazardous-liquid pipeline accidents collected by the US Pipeline and Hazardous Materials Safety Administration (PHMSA) was analysed. For this purpose, a database-driven incident data analysis system was developed to aid the rapid review and categorization of PHMSA incident reports. Using an automated data-mining process followed by a peer review of the incident records and supported by natural hazard databases and external information sources, the pipeline Natechs were identified. As a by-product of the data-collection process, the database now includes over 800,000 incidents from all causes in industrial and transportation activities, which are automatically classified in the same way as the PHMSA record. This presentation describes the data collection and reviewing steps conducted during the study, provides information on the developed database and data analysis tools, and reports the findings of a statistical analysis of the identified hazardous liquid pipeline incidents in terms of accident dynamics and

  20. Hanford Waste Tank Bump Accident and Consequence Analysis

    SciTech Connect

    BRATZEL, D.R.

    2000-06-20

    This report provides a new evaluation of the Hanford tank bump accident analysis and consequences for incorporation into the Authorization Basis. The analysis scope is for the safe storage of waste in its current configuration in single-shell and double-shell tanks.

  1. Accident Sequence Evaluation Program: Human reliability analysis procedure

    SciTech Connect

    Swain, A.D.

    1987-02-01

    This document presents a shortened version of the procedure, models, and data for human reliability analysis (HRA) which are presented in the Handbook of Human Reliability Analysis With emphasis on Nuclear Power Plant Applications (NUREG/CR-1278, August 1983). This shortened version was prepared and tried out as part of the Accident Sequence Evaluation Program (ASEP) funded by the US Nuclear Regulatory Commission and managed by Sandia National Laboratories. The intent of this new HRA procedure, called the ''ASEP HRA Procedure,'' is to enable systems analysts, with minimal support from experts in human reliability analysis, to make estimates of human error probabilities and other human performance characteristics which are sufficiently accurate for many probabilistic risk assessments. The ASEP HRA Procedure consists of a Pre-Accident Screening HRA, a Pre-Accident Nominal HRA, a Post-Accident Screening HRA, and a Post-Accident Nominal HRA. The procedure in this document includes changes made after tryout and evaluation of the procedure in four nuclear power plants by four different systems analysts and related personnel, including human reliability specialists. The changes consist of some additional explanatory material (including examples), and more detailed definitions of some of the terms. 42 refs.

  2. Evaluation of severe accident risks: Methodology for the containment, source term, consequence, and risk integration analyses; Volume 1, Revision 1

    SciTech Connect

    Gorham, E.D.; Breeding, R.J.; Brown, T.D.; Harper, F.T.; Helton, J.C.; Murfin, W.B.; Hora, S.C.

    1993-12-01

    NUREG-1150 examines the risk to the public from five nuclear power plants. The NUREG-1150 plant studies are Level III probabilistic risk assessments (PRAs) and, as such, they consist of four analysis components: accident frequency analysis, accident progression analysis, source term analysis, and consequence analysis. This volume summarizes the methods utilized in performing the last three components and the assembly of these analyses into an overall risk assessment. The NUREG-1150 analysis approach is based on the following ideas: (1) general and relatively fast-running models for the individual analysis components, (2) well-defined interfaces between the individual analysis components, (3) use of Monte Carlo techniques together with an efficient sampling procedure to propagate uncertainties, (4) use of expert panels to develop distributions for important phenomenological issues, and (5) automation of the overall analysis. Many features of the new analysis procedures were adopted to facilitate a comprehensive treatment of uncertainty in the complete risk analysis. Uncertainties in the accident frequency, accident progression and source term analyses were included in the overall uncertainty assessment. The uncertainties in the consequence analysis were not included in this assessment. A large effort was devoted to the development of procedures for obtaining expert opinion and the execution of these procedures to quantify parameters and phenomena for which there is large uncertainty and divergent opinions in the reactor safety community.
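
    Idea (3), propagating uncertainties with an efficient sampling scheme, can be illustrated with a small Latin hypercube example. The toy consequence model and parameter ranges below are invented and bear no relation to the NUREG-1150 models.

```python
# Illustrative sketch only: propagating parameter uncertainty through a simple
# consequence-style chain with Latin hypercube sampling.
import numpy as np
from scipy.stats import qmc

def toy_consequence(release_fraction, dispersion_factor, dose_conversion):
    """Hypothetical chain: release * atmospheric dilution * dose conversion."""
    return release_fraction * dispersion_factor * dose_conversion

sampler = qmc.LatinHypercube(d=3, seed=42)
unit_samples = sampler.random(n=1000)            # 1000 LHS samples in [0, 1)^3

# Scale to assumed parameter ranges (placeholders).
lower = np.array([1e-4, 1e-6, 1e-2])
upper = np.array([1e-1, 1e-4, 1e+0])
params = qmc.scale(unit_samples, lower, upper)

doses = toy_consequence(params[:, 0], params[:, 1], params[:, 2])
print("mean:", doses.mean(), "95th percentile:", np.quantile(doses, 0.95))
```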

  3. Human factors review for Severe Accident Sequence Analysis (SASA)

    SciTech Connect

    Krois, P.A.; Haas, P.M.; Manning, J.J.; Bovell, C.R.

    1984-01-01

    The paper discusses work conducted during this human factors review, including: (1) support of the Severe Accident Sequence Analysis (SASA) Program based on an assessment of operator actions, and (2) development of a descriptive model of operator severe accident management. Research by SASA analysts on the Browns Ferry Unit One (BF1) anticipated transient without scram (ATWS) was supported through a concurrent assessment of operator performance to demonstrate the contributions of human factors data and methods to SASA analyses. A descriptive model, the Function Oriented Accident Management (FOAM) model, was developed; it serves as a structure for bridging human factors, operations, and engineering expertise and is useful for identifying needs and deficiencies in the area of accident management. The assessment of human factors issues related to ATWS required extensive coordination with SASA analysts. The analysis was consolidated primarily around six operator actions identified in the Emergency Procedure Guidelines (EPGs) as being the most critical to the accident sequence. These actions were assessed through simulator exercises, qualitative reviews, and quantitative human reliability analyses. The FOAM descriptive model assumes as a starting point that multiple operator/system failures exceed the scope of procedures and necessitate a knowledge-based emergency response by the operators. The FOAM model provides a functionally oriented structure for assembling human factors, operations, and engineering data and expertise into operator guidance for unconventional emergency responses to mitigate severe accident progression and avoid or minimize core degradation. Operators must also respond to potential radiological releases beyond plant protective barriers. Research needs in accident management and potential uses of the FOAM model are described. 11 references, 1 figure.

  4. MELCOR accident analysis for ARIES-ACT

    SciTech Connect

    Paul W. Humrickhouse; Brad J. Merrill

    2012-08-01

    We model a loss of flow accident (LOFA) in the ARIES-ACT1 tokamak design. ARIES-ACT1 features an advanced SiC blanket with LiPb as coolant and breeder, a helium-cooled steel structural ring and tungsten divertors, a thin-walled, helium-cooled vacuum vessel, and a room-temperature water-cooled shield outside the vacuum vessel. The water heat transfer system is designed to remove heat by natural circulation during a LOFA. The MELCOR model uses time-dependent decay heats for each component, determined by 1-D modeling. The MELCOR model shows that, despite periodic boiling of the water coolant, the structures are kept adequately cool by the passive safety system.

  5. Criticality accident dosimetry by chromosomal analysis.

    PubMed

    Voisin, P; Roy, L; Hone, P A; Edwards, A A; Lloyd, D C; Stephan, G; Romm, H; Groer, P G; Brame, R

    2004-01-01

    The technique of measuring the frequency of dicentric chromosomal aberrations in blood lymphocytes was used to estimate doses in a simulated criticality accident. The simulation consisted of three exposures: approximately 5 Gy with a bare source and 1 and 2 Gy with a lead-shielded source. Three laboratories made separate estimates of the doses. These were made by the iterative method of apportioning the observed dicentric frequencies between the gamma and neutron components, taking account of a given gamma/neutron dose ratio, and referring the separated dicentric frequencies to dose-response calibration curves. An alternative method, based on Bayesian ideas, was also employed; this was developed for interpreting dicentric frequencies in situations where the gamma/neutron ratio is uncertain. Both methods gave very similar results. One laboratory produced dose estimates close to the eventual exercise reference doses, and the other laboratories estimated slightly higher values. The main reason for the higher values was the calibration relationships for fission neutrons.
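
    The iterative apportionment described above can be sketched as follows; the calibration coefficients used here are hypothetical placeholders rather than any laboratory's actual dose-response curves.

```python
# Minimal sketch of the iterative method for splitting an observed dicentric
# yield between neutron and gamma doses, given an assumed gamma/neutron ratio.
def apportion_doses(y_obs, gamma_to_neutron_ratio,
                    alpha_n=0.85,               # dicentrics/cell/Gy, neutrons (hypothetical)
                    alpha_g=0.03, beta_g=0.06,  # gamma linear-quadratic terms (hypothetical)
                    background=0.001, tol=1e-6, max_iter=100):
    y = max(y_obs - background, 0.0)
    d_n = y / alpha_n                           # first guess: all yield from neutrons
    for _ in range(max_iter):
        d_g = gamma_to_neutron_ratio * d_n      # gamma dose implied by the assumed ratio
        y_g = alpha_g * d_g + beta_g * d_g ** 2 # gamma contribution to the yield
        y_n = max(y - y_g, 0.0)                 # remainder attributed to neutrons
        d_n_new = y_n / alpha_n
        if abs(d_n_new - d_n) < tol:
            d_n = d_n_new
            break
        d_n = d_n_new
    return d_n, gamma_to_neutron_ratio * d_n

neutron_dose, gamma_dose = apportion_doses(y_obs=0.9, gamma_to_neutron_ratio=1.0)
print(f"neutron ~ {neutron_dose:.2f} Gy, gamma ~ {gamma_dose:.2f} Gy")
```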

  6. Accident analysis of the windowless target system

    SciTech Connect

    Bianchi, F.; Ferri, R.

    2006-07-01

    Transmutation systems are able to reduce the radio-toxicity and amount of High-Level Wastes (HLW), which are the main concerns related to the peaceful use of nuclear energy, and therefore they should make nuclear energy more readily acceptable to the population. A transmutation system consists of a sub-critical fast reactor, an accelerator and a Target System, where the spallation reactions needed to sustain the chain reaction take place. Three options were proposed for the Target System within the European project PDS-XADS (Preliminary Design Studies on an Experimental Accelerator Driven System): window, windowless and solid. This paper describes the constraints taken into account in the design of the windowless Target System for the large Lead-Bismuth-Eutectic cooled XADS and presents the results of the calculations performed to assess the behaviour of the target during some accident sequences related to pump trips. (authors)

  7. Methodology for Validating Building Energy Analysis Simulations

    SciTech Connect

    Judkoff, R.; Wortman, D.; O'Doherty, B.; Burch, J.

    2008-04-01

    The objective of this report was to develop a validation methodology for building energy analysis simulations, collect high-quality, unambiguous empirical data for validation, and apply the validation methodology to the DOE-2.1, BLAST-2MRT, BLAST-3.0, DEROB-3, DEROB-4, and SUNCAT 2.4 computer programs. This report covers background information, literature survey, validation methodology, comparative studies, analytical verification, empirical validation, comparative evaluation of codes, and conclusions.

  8. Code System for Toxic Gas Accident Analysis.

    2001-09-24

    Version 00 TOXRISK is an interactive program developed to aid in the evaluation of nuclear power plant control room habitability in the event of a nearby toxic material release. The program uses a model which is consistent with the approach described in the NRC Regulatory Guide 1.78. Release of the gas is treated as an initial puff followed by a continuous plume. The relative proportions of these as well as the plume release rate are supplied by the user. Transport of the gas is modeled as a Gaussian distribution and occurs through the action of a constant velocity, constant direction wind. Dispersion or diffusion of the gas during transport is described by modified Pasquill-Gifford dispersion coefficients. Great flexibility is afforded the user in specifying the release description, meteorological conditions, relative geometry of the accident and plant, and the plant ventilation system characteristics. Two types of simulation can be performed: multiple case (parametric) studies and probabilistic analyses.
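
    The transport model described above is the classic ground-reflected Gaussian plume. The sketch below shows that formula with generic power-law dispersion coefficients; it is illustrative only and does not reproduce the TOXRISK implementation or the Regulatory Guide 1.78 parameters.

```python
# Gaussian plume sketch with rough, generic sigma power laws (placeholders).
import numpy as np

def plume_concentration(q, u, x, y, z, h, a_y=0.08, b_y=0.9, a_z=0.06, b_z=0.85):
    """Ground-reflected Gaussian plume concentration (mass per m^3).

    q : continuous release rate (mass/s), u : wind speed (m/s),
    x, y, z : downwind, crosswind, vertical receptor coordinates (m),
    h : release height (m). sigma_y and sigma_z use generic power-law fits.
    """
    sigma_y = a_y * x ** b_y
    sigma_z = a_z * x ** b_z
    lateral = np.exp(-y ** 2 / (2 * sigma_y ** 2))
    vertical = (np.exp(-(z - h) ** 2 / (2 * sigma_z ** 2))
                + np.exp(-(z + h) ** 2 / (2 * sigma_z ** 2)))
    return q / (2 * np.pi * u * sigma_y * sigma_z) * lateral * vertical

# Example: 1 kg/s release at 10 m, receptor 500 m downwind on the centerline.
print(plume_concentration(q=1.0, u=3.0, x=500.0, y=0.0, z=1.5, h=10.0))
```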

  9. Analysis of the temporal properties in car accident time series

    NASA Astrophysics Data System (ADS)

    Telesca, Luciano; Lovallo, Michele

    2008-05-01

    In this paper we study the time-clustering behavior of sequences of car accidents, using data from a freely available database on the internet. The Allan Factor analysis, which is a well-suited method to investigate time-dynamical behaviors in point processes, has revealed that the car accident sequences are characterized by a general time-scaling behavior, with the presence of cyclic components. These results indicate that the time dynamics of the events are not Poissonian but long-range correlated, with periodicities ranging from 12 h to 1 year.
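
    A minimal sketch of the Allan Factor statistic used in this kind of analysis is given below: events are counted in contiguous windows of width T, and the variance of successive count differences is compared with the mean count. The event times here are synthetic; for a homogeneous Poisson process the Allan Factor stays close to 1 (up to sampling noise).

```python
# Allan Factor of a point process, computed on synthetic event times.
import numpy as np

def allan_factor(event_times, window):
    """AF(T) = <(N_{k+1} - N_k)^2> / (2 <N_k>) for window width T."""
    t0, t1 = np.min(event_times), np.max(event_times)
    edges = np.arange(t0, t1, window)
    counts, _ = np.histogram(event_times, bins=edges)
    diffs = np.diff(counts)
    return np.mean(diffs ** 2) / (2.0 * np.mean(counts))

rng = np.random.default_rng(7)
poisson_times = np.cumsum(rng.exponential(1.0, 10_000))   # homogeneous Poisson process
for T in (10, 100, 1000):
    print(T, allan_factor(poisson_times, T))
```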

  10. Severe Accident Analysis Code SAMPSON Improvement for IMPACT Project

    NASA Astrophysics Data System (ADS)

    Ujita, Hiroshi; Ikeda, Takashi; Naitoh, Masanori

    SAMPSON is an integral code for detailed severe accident analysis with a modular structure, developed in the IMPACT project. Each module can run independently, and communication among multiple analysis modules supervised by the analysis control module makes an integral analysis possible. By the end of Phase 1 (1994-1997), demonstration simulations with combinations of up to 11 analysis modules had been performed, and the physical models in the code had been verified by separate-effect tests and validated by integral tests. Multi-dimensional mechanistic models and theoretically based conservation equations were applied during Phase 2 (1998-2000). New models for accident management evaluation have also been developed. Verification and validation have been performed by analysing separate-effect tests and integral tests, while actual plant analyses are also in progress.

  11. Rat sperm motility analysis: methodologic considerations

    EPA Science Inventory

    The objective of these studies was to optimize conditions for computer-assisted sperm analysis (CASA) of rat epididymal spermatozoa. Methodologic issues addressed include sample collection technique, sampling region within the epididymis, type of diluent medium used, and sample c...

  12. Analysis of Crew Fatigue in AIA Guantanamo Bay Aviation Accident

    NASA Technical Reports Server (NTRS)

    Rosekind, Mark R.; Gregory, Kevin B.; Miller, Donna L.; Co, Elizabeth L.; Lebacqz, J. Victor; Statler, Irving C. (Technical Monitor)

    1994-01-01

    Flight operations can engender fatigue, which can affect flight crew performance, vigilance, and mood. The National Transportation Safety Board (NTSB) requested the NASA Fatigue Countermeasures Program to analyze crew fatigue factors in an aviation accident that occurred at Guantanamo Bay, Cuba. There are specific fatigue factors that can be considered in such investigations: cumulative sleep loss, continuous hours of wakefulness prior to the incident or accident, and the time of day at which the accident occurred. Data from the NTSB Human Performance Investigator's Factual Report, the Operations Group Chairman's Factual Report, and the Flight 808 Crew Statements were analyzed, using conservative estimates and averages to reconcile discrepancies among the sources. Analysis of these data determined that the entire crew displayed cumulative sleep loss, operated during an extended period of continuous wakefulness, and obtained sleep at times in opposition to the circadian disposition for sleep, and that the accident occurred in the afternoon window of physiological sleepiness. In addition to these findings, evidence that fatigue affected performance was suggested by the cockpit voice recorder (CVR) transcript as well as by the captain's testimony. Examples from the CVR showed degraded decision-making skills, fixation, and slowed responses, all of which can be affected by fatigue; also, the captain testified to feeling "lethargic and indifferent" just prior to the accident. Therefore, the sleep/wake history data support the hypothesis that fatigue was a factor that affected the crewmembers' performance. Furthermore, the examples from the CVR and the captain's testimony support the hypothesis that fatigue had an impact on specific actions involved in the occurrence of the accident.

  13. INDUSTRIAL/MILITARY ACTIVITY-INITIATED ACCIDENT SCREENING ANALYSIS

    SciTech Connect

    D.A. Kalinich

    1999-09-27

    Impacts due to nearby installations and operations were determined in the Preliminary MGDS Hazards Analysis (CRWMS M&O 1996) to be potentially applicable to the proposed repository at Yucca Mountain. This determination was conservatively based on limited knowledge of the potential activities ongoing on or off the Nevada Test Site (NTS). It is intended that the Industrial/Military Activity-Initiated Accident Screening Analysis provided herein will meet the requirements of the ''Standard Review Plan for the Review of Safety Analysis Reports for Nuclear Power Plants'' (NRC 1987) in establishing whether this external event can be screened from further consideration or must be included as a design basis event (DBE) in the development of accident scenarios for the Monitored Geologic Repository (MGR). This analysis only considers issues related to preclosure radiological safety. Issues important to waste isolation as related to impact from nearby installations will be covered in the MGR performance assessment.

  14. Cold Vacuum Drying facility design basis accident analysis documentation

    SciTech Connect

    CROWE, R.D.

    2000-08-08

    This document provides the detailed accident analysis to support HNF-3553, Annex B, Spent Nuclear Fuel Project Final Safety Analysis Report (FSAR), ''Cold Vacuum Drying Facility Final Safety Analysis Report.'' All assumptions, parameters, and models used to provide the analysis of the design basis accidents are documented to support the conclusions in the FSAR. The calculations in this document address the design basis accidents (DBAs) selected for analysis in HNF-3553, ''Spent Nuclear Fuel Project Final Safety Analysis Report'', Annex B, ''Cold Vacuum Drying Facility Final Safety Analysis Report.'' The objective is to determine the quantity of radioactive particulate available for release at any point during processing at the Cold Vacuum Drying Facility (CVDF) and to use that quantity to determine the amount of radioactive material released during the DBAs. The radioactive material released is used to determine dose consequences to receptors at four locations, and the dose consequences are compared with the appropriate evaluation guidelines and release limits to ascertain the need for preventive and mitigative controls.

  15. APR1400 Reactivity Insertion Accident Analysis Using KNAP

    SciTech Connect

    Chang-Keun, Yang; Yo-Han, Kim; Chang-Kyung, Sung

    2006-07-01

    The Korea Electric Power Research Institute decided to develop a new safety analysis code system for the Optimized Power Reactor 1000 (OPR1000) in Korea, funded by the Ministry of Commerce, Industry and Energy. In this paper, some results for the Advanced Power Reactor 1400 (APR1400), obtained with the RETRAN code for several reactivity insertion accidents, are introduced in order to extend the application of the safety analysis experience gained with the OPR1000. (authors)

  16. Similar methodological analysis involving the user experience.

    PubMed

    Almeida e Silva, Caio Márcio; Okimoto, Maria Lúcia R L; Tanure, Raffaela Leane Zenni

    2012-01-01

    This article deals with the use of a protocol for the analysis of similar methodologies related to user experience. Articles describing experiments in the area were selected, analyzed on the basis of the similar-analysis protocol, and finally synthesized and associated.

  17. Accident analysis of heavy water cooled thorium breeder reactor

    NASA Astrophysics Data System (ADS)

    Yulianti, Yanti; Su'ud, Zaki; Takaki, Naoyuki

    2015-04-01

    The reactor power reaches a peak value before the reactor settles into a new equilibrium condition. The analysis showed that fuel and cladding temperatures during the accident remain below their limits, i.e., the reactor stays in a safe condition.

  18. Accident analysis of heavy water cooled thorium breeder reactor

    SciTech Connect

    Yulianti, Yanti; Su’ud, Zaki; Takaki, Naoyuki

    2015-04-16

    The reactor power reaches a peak value before the reactor settles into a new equilibrium condition. The analysis showed that fuel and cladding temperatures during the accident remain below their limits, i.e., the reactor stays in a safe condition.

  19. Calculation notes for surface leak resulting in pool, TWRS FSAR accident analysis

    SciTech Connect

    Hall, B.W.

    1996-09-25

    This document includes the calculations performed to quantify the risk associated with the unmitigated and mitigated accident scenarios described in the TWRS FSAR for the accident analysis titled: Surface Leaks Resulting in Pool.

  20. Calculation Notes for Subsurface Leak Resulting in Pool, TWRS FSAR Accident Analysis

    SciTech Connect

    Hall, B.W.

    1996-09-25

    This document includes the calculations performed to quantify the risk associated with the unmitigated and mitigated accident scenarios described in the TWRS FSAR for the accident analysis titled: Subsurface Leaks Resulting in Pool.

  1. Offsite radiological consequence analysis for the bounding flammable gas accident

    SciTech Connect

    CARRO, C.A.

    2003-03-19

    The purpose of this analysis is to calculate the offsite radiological consequence of the bounding flammable gas accident. DOE-STD-3009-94, ''Preparation Guide for U.S. Department of Energy Nonreactor Nuclear Facility Documented Safety Analyses'', requires the formal quantification of a limited subset of accidents representing a complete set of bounding conditions. The results of these analyses are then evaluated to determine if they challenge the DOE-STD-3009-94, Appendix A, ''Evaluation Guideline,'' of 25 rem total effective dose equivalent in order to identify and evaluate safety class structures, systems, and components. The bounding flammable gas accident is a detonation in a single-shell tank (SST). A detonation versus a deflagration was selected for analysis because the faster flame speed of a detonation can potentially result in a larger release of respirable material. As will be shown, the consequences of a detonation in either an SST or a double-shell tank (DST) are approximately equal. A detonation in an SST was selected as the bounding condition because the estimated respirable release masses are the same and because the doses per unit quantity of waste inhaled are generally greater for SSTs than for DSTs. Appendix A contains a DST analysis for comparison purposes.

  2. Road Traffic Accident Analysis of Ajmer City Using Remote Sensing and GIS Technology

    NASA Astrophysics Data System (ADS)

    Bhalla, P.; Tripathi, S.; Palria, S.

    2014-12-01

    With advancement in technology, new and sophisticated vehicle models are available and their numbers are increasing day by day. A traffic accident has multi-faceted characteristics associated with it. In India, 93% of crashes occur wholly or partly due to human-induced factors. For proper traffic accident analysis, GIS technology has become an indispensable tool. The traditional accident database is a summary spreadsheet format using codes and mileposts to denote the location, type and severity of accidents. A geo-referenced accident database is location-referenced: it incorporates a GIS graphical interface with the accident information to allow query searches on various accident attributes. Ajmer city, the headquarters of Ajmer district, Rajasthan, has been selected as the study area. According to police records, 1531 accidents occurred during 2009-2013. The most accidents occurred in 2009 and the most deaths in 2013. Cars, jeeps, autos, pickups and tempos are mostly responsible for accidents, and the occurrence of accidents is mostly concentrated between 4 PM and 10 PM. GIS has proved to be a good tool for analyzing the multifaceted nature of accidents. While road safety is a critical issue, it is still handled in an ad hoc manner. This study demonstrates the application of GIS for developing an efficient database on road accidents, taking Ajmer City as a case study. If such a database is developed for other cities, a proper analysis of accidents can be undertaken and suitable management strategies for traffic regulation can be successfully proposed.

  3. [Epidemiological aspects and methodological difficulties in establishing the causal relationship of Chernobyl nuclear accident with cancer].

    PubMed

    Ivan, A; Azoicăi, Doina

    2002-01-01

    The epidemiological, etiologic and clinical polymorphism of morbid states, including those attributable to radioactivity, creates major obstacles to the standardization of methods for assessing the incidence of some diseases and their lethality. The many risk factors that can be associated with radiation from various sources make the epidemiological data difficult to evaluate. The Chernobyl nuclear accident has revived concerns and research in the field of prevention, early diagnosis and intervention regarding the effects of nuclear radiation, through the development of comparative spatial and temporal studies based on standardized methods, thus providing comparable results. In Romania, multidisciplinary epidemiological research on the health effects of the Chernobyl accident is still limited, and consequently insufficient for drawing global conclusions.

  4. Civil helicopter wire strike assessment study. Volume 2: Accident analysis briefs

    NASA Technical Reports Server (NTRS)

    Tuomela, C. H.; Brennan, M. F.

    1980-01-01

    A description and analysis of each of the 208 civil helicopter wire strike accidents reported to the National Transportation Safety Board (NTSB) for the ten year period 1970-1979 is given. The accident analysis briefs were based on pilot reports, FAA investigation reports, and such accident photographs as were made available. Briefs were grouped by year and, within year, by NTSB accident report number.

  5. NASA Accident Precursor Analysis Handbook, Version 1.0

    NASA Technical Reports Server (NTRS)

    Groen, Frank; Everett, Chris; Hall, Anthony; Insley, Scott

    2011-01-01

    Catastrophic accidents are usually preceded by precursory events that, although observable, are not recognized as harbingers of a tragedy until after the fact. In the nuclear industry, the Three Mile Island accident was preceded by at least two events portending the potential for severe consequences from an underappreciated causal mechanism. Anomalies whose failure mechanisms were integral to the losses of Space Transportation Systems (STS) Challenger and Columbia had been occurring within the STS fleet prior to those accidents. Both the Rogers Commission Report and the Columbia Accident Investigation Board report found that processes in place at the time did not respond to the prior anomalies in a way that shed light on their true risk implications. This includes the concern that, in the words of the NASA Aerospace Safety Advisory Panel (ASAP), "no process addresses the need to update a hazard analysis when anomalies occur." At a broader level, the ASAP noted in 2007 that NASA "could better gauge the likelihood of losses by developing leading indicators, rather than continue to depend on lagging indicators." These observations suggest a need to revalidate prior assumptions and conclusions of existing safety (and reliability) analyses, as well as to consider the potential for previously unrecognized accident scenarios, when unexpected or otherwise undesired behaviors of the system are observed. This need is also discussed in NASA's system safety handbook, which advocates a view of safety assurance as driving a program to take the steps necessary to establish and maintain a valid and credible argument for the safety of its missions. It is the premise of this handbook that making cases for safety more experience-based allows NASA to be better informed about the safety performance of its systems, and will ultimately help it to manage safety in a more effective manner. The APA process described in this handbook provides a systematic means of analyzing candidate

  6. Enhanced Accident Tolerant Fuels for LWRS - A Preliminary Systems Analysis

    SciTech Connect

    Gilles Youinou; R. Sonat Sen

    2013-09-01

    The severe accident at the Fukushima Daiichi nuclear plant illustrates the need for continuous improvement through developing and implementing technologies that contribute to safe, reliable and cost-effective operation of the nuclear fleet. Development of enhanced accident tolerant fuel contributes to this effort. These fuels, in comparison with the standard zircaloy-UO2 system currently used by the LWR industry, should be designed such that they tolerate loss of active cooling in the core for a longer time period (depending on the LWR system and accident scenario) while maintaining or improving fuel performance during normal operations, operational transients, and design-basis events. This report presents a preliminary systems analysis related to most of these concepts. The potential impacts of these innovative LWR fuels on the front end of the fuel cycle, on reactor operation and on the back end of the fuel cycle are succinctly described, without claiming to be exhaustive. Since the design of these various concepts is still a work in progress, this analysis can only be preliminary and could be updated as the designs converge on their respective final versions.

  7. Accident Analysis for the Plutonium Finishing Plant Polycube Stabilization Process

    SciTech Connect

    NELSON-MAKI, B.B.

    2001-05-14

    The Polycube Stabilization Project involves low temperature oxidation, without combustion, of polystyrene cubes using the production muffle furnaces in Glovebox HC-21C located in the Remote Mechanical ''C'' (RMC) Line in Room 230A in the 234-52 Facility. Polycubes are polystyrene cubes containing various concentrations of plutonium and uranium oxides. Hundreds of these cubes were manufactured for criticality experiments, and currently exist as unstabilized storage forms at the Plutonium Finishing Plant (PFP). This project is designed to stabilize and prepare the polycube material for stable storage using a process very similar to the earlier processing of sludges in these furnaces. The significant difference is the quantity of hydrogenous material present, and the need to place additional controls on the heating rate of the material. This calculation note documents the analyses of the Representative Accidents identified in Section 2.4.4 of Hazards Analysis for the Plutonium Finishing Plant Polycube Stabilization Process, HNF-7278 (HNF 2000). These two accidents, ''Deflagration in Glovebox HC-21C due to Loss of Power'' and ''Seismic Failure of Glovebox HC-21C'', will be further assessed in this accident analysis.

  8. Comprehensive Analysis of Two Downburst-Related Aircraft Accidents

    NASA Technical Reports Server (NTRS)

    Shen, J.; Parks, E. K.; Bach, R. E.

    1996-01-01

    Although downbursts have been identified as the major cause of a number of aircraft takeoff and landing accidents, only the 1985 Dallas/Fort Worth (DFW) and the more recent (July 1994) Charlotte, North Carolina, landing accidents provided sufficient onboard recorded data to perform a comprehensive analysis of the downburst phenomenon. The first step in the present analysis was the determination of the downburst wind components. Once the wind components and their gradients were determined, the degrading effect of the wind environment on the airplane's performance was calculated. This wind-shear-induced aircraft performance degradation, sometimes called the F-factor, was broken down into two components, F1 and F2, representing the effect of the horizontal wind gradient and the vertical wind velocity, respectively. In both the DFW and Charlotte cases, F1 was found to be the dominant causal factor of the accident. Next, the aircraft in the two cases were mathematically modeled using the longitudinal equations of motion and the appropriate aerodynamic parameters. Based on the aircraft model and the determined winds, the aircraft response to the recorded pilot inputs showed good agreement with the onboard recordings. Finally, various landing abort strategies were studied. It was concluded that the most acceptable landing abort strategy, from both an analytical and a pilot's standpoint, was to hold a constant nose-up pitch attitude while operating at maximum engine thrust.
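
    A hedged sketch of the F-factor decomposition mentioned above is given below, with F1 taken from the along-track horizontal wind gradient and F2 from the vertical wind. The sign conventions and the synthetic wind profile are illustrative assumptions, not data or formulas from the two accident analyses.

```python
# Wind-shear F-factor sketch with an invented microburst-like encounter.
import numpy as np

G = 9.81  # m/s^2

def f_factor(time_s, tailwind_ms, vertical_wind_ms, airspeed_ms):
    """Return (F1, F2, F) time histories.

    F1 = (dWx/dt) / g   (an increasing tailwind degrades performance)
    F2 = -Wh / V        (a downdraft, Wh < 0, degrades performance)
    """
    f1 = np.gradient(tailwind_ms, time_s) / G
    f2 = -vertical_wind_ms / airspeed_ms
    return f1, f2, f1 + f2

# Synthetic encounter: tailwind ramps up while a downdraft core acts.
t = np.linspace(0.0, 40.0, 401)
tailwind = 15.0 * np.tanh((t - 20.0) / 5.0)           # headwind-to-tailwind change
w_vertical = -6.0 * np.exp(-((t - 20.0) / 8.0) ** 2)  # downdraft core
f1, f2, f = f_factor(t, tailwind, w_vertical, airspeed_ms=75.0)
print("peak F ~", round(f.max(), 3))  # values above roughly 0.1 are generally considered hazardous
```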

  9. Risk assessment of maintenance operations: the analysis of performing task and accident mechanism.

    PubMed

    Carrillo-Castrillo, Jesús A; Rubio-Romero, Juan Carlos; Guadix, Jose; Onieva, Luis

    2015-01-01

    Maintenance operations cover a great number of occupations. Most small and medium-sized enterprises lack the appropriate information to conduct risk assessments of maintenance operations. The objective of this research is to provide a method, based on the concepts of task and accident mechanism, for an initial risk assessment that takes into consideration the prevalence and severity of the maintenance accidents reported. Data were gathered from 11,190 reported accidents in maintenance operations in the manufacturing sector of Andalusia from 2003 to 2012. Using a semi-quantitative methodology, likelihood and severity were evaluated based on the actual distribution of accident mechanisms in each of the tasks. Accident mechanisms and tasks were identified using the variables included in the European Statistics on Accidents at Work methodology. As the main results, the estimated risk of the most frequent accident mechanisms identified for each of the analysed tasks is low; the only accident mechanisms with medium risk are physical stress on the musculoskeletal system when lifting or pushing in tasks involving carrying, and impacts against objects after slipping or stumbling in tasks involving movement. The prioritisation of public preventive actions for the accident mechanisms with a higher estimated risk is highly recommended.
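
    The semi-quantitative scheme described above can be sketched as a simple likelihood-severity matrix: likelihood taken from how often an accident mechanism occurs within a task, severity from how often it leads to a serious outcome. The category thresholds and sample records below are invented for illustration and do not come from the 11,190 reported accidents.

```python
# Hypothetical likelihood x severity risk-matrix sketch.
from collections import Counter

reports = [  # (task, accident_mechanism, serious_outcome) - invented records
    ("carrying", "physical stress lifting/pushing", True),
    ("carrying", "physical stress lifting/pushing", False),
    ("movement", "impact after slip or stumble", False),
    ("movement", "impact after slip or stumble", True),
    ("tool use", "contact with sharp agent", False),
]

def category(value, thresholds=(0.2, 0.5)):
    """Map a fraction to 1 (low), 2 (medium) or 3 (high)."""
    return 1 + sum(value > t for t in thresholds)

by_task = Counter(task for task, _, _ in reports)
by_pair = Counter((task, mech) for task, mech, _ in reports)
serious = Counter((task, mech) for task, mech, bad in reports if bad)

for (task, mech), n in by_pair.items():
    likelihood = category(n / by_task[task])           # prevalence within the task
    severity = category(serious[(task, mech)] / n)     # share of serious outcomes
    risk = likelihood * severity                       # simple risk-matrix product
    print(f"{task:9s} | {mech:35s} | L={likelihood} S={severity} risk={risk}")
```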

  10. Analysis of Three Mile Island-Unit 2 accident

    SciTech Connect

    Not Available

    1980-03-01

    The Nuclear Safety Analysis Center (NSAC) of the Electric Power Research Institute has analyzed the Three Mile Island-2 accident. Early results of this analysis were a brief narrative summary, issued in mid-May 1979 and an initial version of this report issued later in 1979 as noted in the Foreword. The present report is a revised version of the 1979 report, containing summaries, a highly detailed sequence of events, a comparison of that sequence of events with those from other sources, 25 appendices, references and a list of abbreviations and acronyms. A matrix of equipment and system actions is included as a folded insert.

  11. Loss-of-coolant accident analysis of the Savannah River new production reactor design

    SciTech Connect

    Maloney, K.J.; Pryor, R.J.

    1990-11-01

    This document contains the loss-of-coolant accident analysis of the representative design for the Savannah River heavy water new production reactor. Included in this document are descriptions of the primary system, reactor vessel, and loss-of-coolant accident computer input models, the results of the cold leg and hot leg loss-of-coolant accident analyses, and the results of sensitivity calculations for the cold leg loss-of-coolant accident. 5 refs., 50 figs., 4 tabs.

  12. Analysis of offsite Emergency Planning Zones (EPZs) for the Rocky Flats Plant. Phase 3, Sitewide spectrum-of-accidents and bounding EPZ analysis

    SciTech Connect

    Petrocchi, A.J.; Zimmerman, G.A.

    1994-03-14

    During Phase 3 of the EPZ project, a sitewide analysis will be performed applying a spectrum-of-accidents approach to both radiological and nonradiological hazardous materials release scenarios. This analysis will include the MCA but will be wider in scope and will produce options for the State of Colorado for establishing a bounding EPZ that is intended to more comprehensively update the interim, preliminary EPZ developed in Phase 2. EG&G will propose use of a hazards assessment methodology that is consistent with the DOE Emergency Management Guide for Hazards Assessments and other methods required by DOE orders. This will include hazards, accident, safety, and risk analyses. Using this methodology, EG&G will develop technical analyses for a spectrum of accidents. The analyses will show the potential effects from the spectrum of accidents on the offsite population together with identification of offsite vulnerable zones and areas of concern. These analyses will incorporate state-of-the-art technology for accident analysis, atmospheric plume dispersion modeling, consequence analysis, and the application of these evaluations to the general public population at risk. The analyses will treat both radiological and nonradiological hazardous materials and mixtures of both released accidentally to the atmosphere. DOE/RFO will submit these results to the State of Colorado for the State's use in determining offsite emergency planning zones for the Rocky Flats Plant. In addition, the results will be used for internal Rocky Flats Plant emergency planning.

  13. Some methodological and practical perspectives on severe-accident issue resolution

    SciTech Connect

    Theofanous, T.G.

    1995-12-31

    Severe accidents involve intense multiphase interactions in highly complex and grossly evolving geometries. These processes can impact containment integrity in a variety of ways and to varying degrees. When there is disagreement among experts about the magnitude of this impact and the associated consequences, we have an "issue." The appearance of such issues is a natural consequence of the complexity of the underlying long sequences, and their potential impact has been dramatized by NUREG-1150. Perhaps the nuclear field, especially through the severe-accident problem, has led the way, but the situation is similar in other aspects of human endeavor when we are asked to quantify risks in the absence of direct empirical evidence. Basically, one needs to make predictions on the basis of incomplete information, and the related use, and misuse, of expert opinion are well known. Issues that remain stubbornly unresolved for extended time spans are highly detrimental to public confidence regarding the acceptability of the technology (or natural hazard) that gives rise to them. On the other hand, when potential problems are not recognized, or when issues are artificially closed, this can lead to public mistrust of the "keepers" of the technology as well, with a similar outcome. Worse, it can lead to damaging and unexpected consequences. Clearly, it is of major importance that potential problems are recognized and that issues are identified and resolved in a timely manner. But it is also extremely important that resolutions are robust.

  14. Three Dimensional Analysis of 3-Loop PWR RCCA Ejection Accident for High Burnup

    SciTech Connect

    Marciulescu, Cristian; Sung, Yixing; Beard, Charles L.

    2006-07-01

    The Rod Control Cluster Assembly (RCCA) ejection accident is a Condition IV design basis reactivity insertion event for Pressurized Water Reactors (PWR). The event is historically analyzed using a one-dimensional (1D) neutron kinetic code to meet the current licensing criteria for fuel rod burnup to 62,000 MWD/MTU. The Westinghouse USNRC-approved three-dimensional (3D) analysis methodology is based on the neutron kinetics version of the ANC code (SPNOVA) coupled with Westinghouse's version of the EPRI core thermal-hydraulic code VIPRE-01. The 3D methodology provides a more realistic yet conservative analysis approach to meet anticipated reduction in the licensing fuel enthalpy rise limit for high burnup fuel. A rod ejection analysis using the 3D methodology was recently performed for a Westinghouse 3-loop PWR at an up-rated core power of 3151 MWt with reload cores that allow large flexibility in assembly shuffling and a fuel hot rod burnup to 75,000 MWD/MTU. The analysis considered high enrichment fuel assemblies at the control rod locations as well as bounding rodded depletions in the end of life, zero power and full power conditions. The analysis results demonstrated that the peak fuel enthalpy rise is less than 100 cal/g for the transient initiated at the hot zero power condition. The maximum fuel enthalpy is less than 200 cal/g for the transient initiated from the full power condition. (authors)

  15. Predicting System Accidents with Model Analysis During Hybrid Simulation

    NASA Technical Reports Server (NTRS)

    Malin, Jane T.; Fleming, Land D.; Throop, David R.

    2002-01-01

    Standard discrete event simulation is commonly used to identify system bottlenecks and starving and blocking conditions in resources and services. The CONFIG hybrid discrete/continuous simulation tool can simulate such conditions in combination with inputs external to the simulation. This provides a means for evaluating the vulnerability to system accidents of a system's design, operating procedures, and control software. System accidents are brought about by complex unexpected interactions among multiple system failures, faulty or misleading sensor data, and inappropriate responses of human operators or software. The flows of resource and product materials play a central role in the hazardous situations that may arise in fluid transport and processing systems. We describe the capabilities of CONFIG for simulation-time linear circuit analysis of fluid flows in the context of model-based hazard analysis. We focus on how CONFIG simulates the static stresses in systems of flow. Unlike other flow-related properties, static stresses (or static potentials) cannot be represented by a set of state equations. The distribution of static stresses is dependent on the specific history of operations performed on a system. We discuss the use of this type of information in hazard analysis of system designs.

  16. Extension of ship accident analysis to multiple-package shipments

    SciTech Connect

    Mills, G.S.; Neuhauser, K.S.

    1997-11-01

    Severe ship accidents and the probability of radioactive material release from spent reactor fuel casks were investigated previously. Other forms of RAM, e.g., plutonium oxide powder, may be shipped in large numbers of packagings rather than in one to a few casks. These smaller, more numerous packagings are typically placed in ISO containers for ease of handling, and several ISO containers may be placed in one of several holds of a cargo ship. In such cases, the size of a radioactive release resulting from a severe collision with another ship is determined not by the likelihood of compromising a single, robust package but by the probability that a certain fraction of 10s or 100s of individual packagings is compromised. The previous analysis involved a statistical estimation of the frequency of accidents which would result in damage to a cask located in one of seven cargo holds in a collision with another ship. The results were obtained in the form of probabilities (frequencies) of accidents of increasing severity and of release fractions for each level of severity. This paper describes an extension of the same general method in which the multiple packages are assumed to be compacted by an intruding ship's bow until there is no free space in the hold. At such a point, the remaining energy of the colliding ship is assumed to be dissipated by progressively crushing the RAM packagings, and the probability of a particular fraction of package failures is estimated by adaptation of the statistical method used previously. The parameters of a common, well characterized packaging, the 6M with 2R inner containment vessel, were employed as an illustrative example of this analysis method. However, the method is readily applicable to other packagings for which crush strengths have been measured or can be estimated with satisfactory confidence.

  17. A general approach to critical infrastructure accident consequences analysis

    NASA Astrophysics Data System (ADS)

    Bogalecka, Magda; Kołowrocki, Krzysztof; Soszyńska-Budny, Joanna

    2016-06-01

    A general probabilistic model of critical infrastructure accident consequences is presented. It comprises three linked processes: the initiating events generated by the infrastructure accident, the resulting environment threats, and the ensuing environment degradation.

  18. Statistical analysis of sudden chemical leakage accidents reported in China between 2006 and 2011.

    PubMed

    Li, Yang; Ping, Hua; Ma, Zhi-Hong; Pan, Li-Gang

    2014-04-01

    Based on data from authoritative sources, 1,400 sudden leakage accidents that occurred in China between 2006 and 2011 were investigated, of which 666 accidents with no or little damage were used for abstraction of statistical characteristics. The results were as follows: (1) Time fluctuation: the yearly number of sudden leakage accidents decreased from 2006 to 2010 and increased slightly in 2011. Sudden leakage accidents occur mainly in summer, and more than half of the accidents occur from May to September. (2) Regional distribution: the accidents are highly concentrated in coastal areas, and small and medium-sized enterprises give rise to accidents more readily than larger ones. (3) Pollutants: hazardous chemicals are involved in up to 95% of sudden leakage accidents. (4) Steps: transportation accounts for almost half of the accidents, followed by production, usage, storage, and disposal. (5) Pollution and casualties: such accidents readily cause environmental pollution and casualties. (6) Causes: more than half of the cases were caused by human factors, followed by management deficiencies and equipment failure; sudden chemical leakage may also be caused by high temperature, rain, wet roads, and terrain. (7) Principal component analysis: five factors are extracted, covering pollution, casualties, regional distribution, steps, and month. From this analysis, the characteristics, causes, and damage of sudden leakage accidents are characterized, and advice for prevention and rescue can be derived. PMID:24407779
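
    As a rough illustration of the principal component analysis step in item (7), the sketch below extracts components from a small, made-up matrix of coded accident attributes (pollution, casualties, region, step, month). It only shows the mechanics of the technique; the data and attribute coding are assumptions, not the study's data.

    ```python
    # Illustrative PCA sketch for coded accident attributes (made-up data; scikit-learn assumed).
    import numpy as np
    from sklearn.decomposition import PCA
    from sklearn.preprocessing import StandardScaler

    # Rows: accidents; columns: coded attributes (pollution, casualties, region, step, month).
    X = np.array([
        [3, 2, 1, 4, 7],
        [1, 0, 2, 1, 5],
        [4, 3, 1, 4, 8],
        [0, 0, 3, 2, 1],
        [2, 1, 1, 3, 6],
        [5, 4, 1, 4, 9],
    ])

    X_std = StandardScaler().fit_transform(X)   # PCA is scale-sensitive, so standardize first
    pca = PCA(n_components=5).fit(X_std)
    print("explained variance ratios:", np.round(pca.explained_variance_ratio_, 3))
    ```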

  19. Offsite radiological consequence analysis for the bounding aircraft crash accident

    SciTech Connect

    OBERG, B.D.

    2003-03-22

    The purpose of this calculation note is to quantitatively analyze a bounding aircraft crash accident for comparison to the 25 rem Evaluation Guideline of DOE-STD-3009-94, ''Preparation Guide for U.S. Department of Energy Nonreactor Nuclear Facility Documented Safety Analyses'', Appendix A. The potential for an aircraft impacting a facility was evaluated using the approach given in DOE-STD-3014-96, ''Accident Analysis for Aircraft Crash into Hazardous Facilities''. The following aircraft crash frequencies were determined for the Tank Farms in RPP-11736, ''Assessment of Aircraft Crash Frequency for the Hanford Site 200 Area Tank Farms'': (1) the total aircraft crash frequency is ''extremely unlikely''; (2) the general aviation crash frequency is ''extremely unlikely''; (3) the helicopter crash frequency is ''beyond extremely unlikely''; and (4) for the Hanford Site 200 Areas, the crash frequency of other aircraft types, commercial or military, is ''beyond extremely unlikely'' for each aboveground facility and for any type of underground facility. Because the potential for an aircraft crash into the 200 Area tank farms is more frequent than ''beyond extremely unlikely,'' a consequence analysis of the aircraft crash is required.
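
    The categories quoted above follow the likelihood bins commonly used in DOE safety analyses. A minimal sketch of that binning logic is shown below; the numeric per-year cut-offs are stated from general practice as assumptions and should be checked against the governing standard for any real application.

    ```python
    # Sketch of DOE-style likelihood binning; cut-off values are assumptions to verify
    # against the governing standard, not quoted from the calculation note above.
    def likelihood_category(freq_per_year: float) -> str:
        if freq_per_year >= 1e-2:
            return "anticipated"
        if freq_per_year >= 1e-4:
            return "unlikely"
        if freq_per_year >= 1e-6:
            return "extremely unlikely"
        return "beyond extremely unlikely"

    # Example with an assumed total crash frequency of 3e-6 per year:
    print(likelihood_category(3e-6))   # -> "extremely unlikely"
    ```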

  20. Analysis of surface powered haulage accidents, January 1990--July 1996

    SciTech Connect

    Fesak, G.M.; Breland, R.M.; Spadaro, J.

    1996-12-31

    This report addresses surface haulage accidents that occurred between January 1990 and July 1996 involving haulage trucks (including over-the-road trucks), front-end-loaders, scrapers, utility trucks, water trucks, and other mobile haulage equipment. The study includes quarries, open pits and surface coal mines utilizing self-propelled mobile equipment to transport personnel, supplies, rock, overburden material, ore, mine waste, or coal for processing. A total of 4,397 accidents were considered. This report summarizes the major factors that led to the accidents and recommends accident prevention methods to reduce the frequency of these accidents.

  1. Decontamination analysis of the NUWAX-83 accident site using DECON

    SciTech Connect

    Tawil, J.J.

    1983-11-01

    This report presents an analysis of the site restoration options for the NUWAX-83 site, at which an exercise was conducted involving a simulated nuclear weapons accident. The analysis was performed using a computer program developed by Pacific Northwest Laboratory. The computer program, called DECON, was designed to assist personnel engaged in the planning of decontamination activities. The many features of DECON that are used in this report demonstrate its potential usefulness as a site restoration planning tool. Strategies analyzed with DECON include: (1) employing a Quick-Vac option, under which selected surfaces are vacuumed before they can be rained on; (2) protecting surfaces against precipitation; (3) prohibiting specific operations on selected surfaces; (4) requiring specific methods to be used on selected surfaces; (5) evaluating the trade-off between cleanup standards and decontamination costs; and (6) varying the cleanup standards according to expected exposure to the surface.

  2. An Accident Precursor Analysis Process Tailored for NASA Space Systems

    NASA Technical Reports Server (NTRS)

    Groen, Frank; Stamatelatos, Michael; Dezfuli, Homayoon; Maggio, Gaspare

    2010-01-01

    Accident Precursor Analysis (APA) serves as the bridge between existing risk modeling activities, which are often based on historical or generic failure statistics, and system anomalies, which provide crucial information about the failure mechanisms that are actually operative in the system and which may differ in frequency or type from those in the various models. These discrepancies between the models (perceived risk) and the system (actual risk) provide the leading indication of an underappreciated risk. This paper presents an APA process developed specifically for NASA Earth-to-Orbit space systems. The purpose of the process is to identify and characterize potential sources of system risk as evidenced by anomalous events which, although not necessarily presenting an immediate safety impact, may indicate that an unknown or insufficiently understood risk-significant condition exists in the system. Such anomalous events are considered accident precursors because they signal the potential for severe consequences that may occur in the future, due to causes that are discernible from their occurrence today. Their early identification allows them to be integrated into the overall system risk model used to inform decisions relating to safety.

  3. An analysis of evacuation options for nuclear accidents

    SciTech Connect

    Tawil, J.J.; Strenge, D.L.; Schultz, R.W.

    1987-11-01

    In this report we consider the threat posed by the accidental release of radionuclides from a nuclear power plant. The objective is to establish relationships between radiation dose and the cost of evacuation under a wide variety of conditions. The dose can almost always be reduced by evacuating the population from a larger area. However, extending the evacuation zone outward will cause evacuation costs to increase. The purpose of this analysis was to provide the Environmental Protection Agency (EPA) a data base for evaluating whether implementation costs and risks averted could be used to justify evacuation at lower doses. The procedures used and results of these analyses are being made available as background information for use by others. We develop cost/dose relationships for 54 scenarios that are based upon the severity of the reactor accident, meteorological conditions during the release of radionuclides into the environment, and the angular width of the evacuation zone. The 54 scenarios are derived from combinations of three accident severity levels, six meteorological conditions and evacuation zone widths of 70°, 90°, and 180°.
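
    The 54 scenarios are simply the Cartesian product of the three study dimensions (3 severity levels × 6 meteorological conditions × 3 zone widths = 54). A tiny sketch of that enumeration, with hypothetical labels:

    ```python
    # Enumerating the 3 x 6 x 3 = 54 evacuation scenarios; labels are illustrative placeholders.
    from itertools import product

    severities = ["low", "medium", "high"]                  # 3 accident severity levels
    weather = [f"met_{i}" for i in range(1, 7)]             # 6 meteorological conditions
    zone_widths_deg = [70, 90, 180]                         # evacuation zone widths (degrees)

    scenarios = list(product(severities, weather, zone_widths_deg))
    print(len(scenarios))   # 54
    ```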

  4. Summary of the SRS Severe Accident Analysis Program, 1987--1992

    SciTech Connect

    Long, T.A.; Hyder, M.L.; Britt, T.E.; Allison, D.K.; Chow, S.; Graves, R.D.; DeWald, A.B. Jr.; Monson, P.R. Jr.; Wooten, L.A.

    1992-11-01

    The Severe Accident Analysis Program (SAAP) is a program of experimental and analytical studies aimed at characterizing severe accidents that might occur in the Savannah River Site production reactors. The goals of the Severe Accident Analysis Program are: to develop an understanding of severe accidents in SRS reactors that is adequate to support safety documentation for these reactors, including the Safety Analysis Report (SAR), the Probabilistic Risk Assessment (PRA), and other studies evaluating the safety of reactor operation; to provide tools and bases for the evaluation of existing or proposed safety-related equipment in the SRS reactors; to provide bases for the development of accident management procedures for the SRS reactors; and to develop and maintain on the site a sufficient body of knowledge, including documents, computer codes, and cognizant engineers and scientists, that can be used to authoritatively resolve questions or issues related to reactor accidents. The Severe Accident Analysis Program was instituted in 1987 and has already produced a substantial amount of information and specialized calculational tools. Products of the Severe Accident Analysis Program (listed in Section 9 of this report) have been used in the development of the Safety Analysis Report (SAR) and the Probabilistic Risk Assessment (PRA), and in the development of technical specifications for the SRS reactors. A staff of about seven people is currently involved directly in the program and in providing input on severe accidents to other SRS activities.

  5. An Analysis of U.S. Civil Rotorcraft Accidents by Cost and Injury (1990-1996)

    NASA Technical Reports Server (NTRS)

    Iseler, Laura; DeMaio, Joe; Rutkowski, Michael (Technical Monitor)

    2002-01-01

    A study of rotorcraft accidents was conducted to identify safety issues and research areas that might lead to a reduction in rotorcraft accidents and fatalities. The primary source of data was summaries of National Transportation Safety Board (NTSB) accident reports. From 1990 to 1996, the NTSB documented 1396 civil rotorcraft accidents in the United States in which 491 people were killed. The rotorcraft data were compared to airline and general aviation data to determine the relative safety of rotorcraft compared to other segments of the aviation industry. In-depth analysis of the rotorcraft data addressed demographics, mission, and operational factors. Rotorcraft were found to have an accident rate about ten times that of commercial airliners and about the same as that of general aviation. The likelihood that an accident would be fatal was about equal for all three classes of operation. The most dramatic division in rotorcraft accidents is between flights flown by private pilots and those flown by professional pilots. Private pilots, flying low-cost aircraft in benign environments, have accidents that are due, in large part, to their own errors. Professional pilots, in contrast, are more likely to have accidents that result from exacting missions or use of specialized equipment. For both groups, judgment error is more likely to lead to a fatal accident than other types of causes. Several approaches to improving the rotorcraft accident rate are recommended; these mostly address improving the training of new pilots and the safety awareness of private pilots.

  6. Requirements Analysis in the Value Methodology

    SciTech Connect

    Conner, Alison Marie

    2001-05-01

    The Value Methodology (VM) study brings together a multidisciplinary team of people who own the problem and have the expertise to identify and solve it. With the varied backgrounds and experiences the team brings to the study come different perspectives on the problem and the requirements of the project. A requirements analysis step can be added to the Information and Function Analysis Phases of a VM study to validate whether the functions being performed are required, whether regulatory or customer prescribed. This paper provides insight into the level of rigor applied to a requirements analysis step and gives some examples of tools and techniques used to ease the management of the requirements, and of the functions those requirements support, for highly complex problems.

  7. RAMS (Risk Analysis - Modular System) methodology

    SciTech Connect

    Stenner, R.D.; Strenge, D.L.; Buck, J.W.

    1996-10-01

    The Risk Analysis - Modular System (RAMS) was developed to serve as a broad scope risk analysis tool for the Risk Assessment of the Hanford Mission (RAHM) studies. The RAHM element provides risk analysis support for Hanford Strategic Analysis and Mission Planning activities. The RAHM also provides risk analysis support for the Hanford 10-Year Plan development activities. The RAMS tool draws from a collection of specifically designed databases and modular risk analysis methodologies and models. RAMS is a flexible modular system that can be focused on targeted risk analysis needs. It is specifically designed to address risks associated with overall strategy, technical alternatives, and 'what if' questions regarding the Hanford cleanup mission. RAMS is set up to address both near-term and long-term risk issues. Consistency is very important for any comparative risk analysis, and RAMS is designed to efficiently and consistently compare risks and produce risk reduction estimates. There is a wide range of output information that can be generated by RAMS. These outputs can be detailed by individual contaminants, waste forms, transport pathways, exposure scenarios, individuals, populations, etc. However, they can also be in rolled-up form to support high-level strategy decisions.

  8. Accident sequence analysis for sites producing and storing explosives.

    PubMed

    Papazoglou, Ioannis A; Aneziris, Olga; Konstandinidou, Myrto; Giakoumatos, Ieronymos

    2009-11-01

    This paper presents a QRA-based approach for assessing and evaluating the safety of installations handling explosive substances. Comprehensive generic lists of immediate causes and initiating events of detonation and deflagration of explosive substances, as well as safety measures preventing these explosions, are developed. Initiating events and corresponding measures are grouped under the more general categories of explosion due to shock wave, mechanical energy, thermal energy, electrical energy, chemical energy, and electromagnetic radiation. Generic accident sequences are developed using event trees. This analysis is adapted to plant-specific conditions, and potential additional protective measures are rank-ordered in terms of the induced reduction in the frequency of explosion, with uncertainty also taken into account. This approach has been applied to 14 plants in Greece with very satisfactory results. PMID:19819362
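
    Event-tree quantification of the kind described above reduces, for each accident sequence, to multiplying the initiating-event frequency by the branch probabilities along the sequence path. A generic sketch with assumed, illustrative numbers (not values from the paper):

    ```python
    # Generic event-tree sequence quantification (assumed, illustrative numbers).
    initiating_event_freq = 1e-3        # e.g., mechanical impact events per year
    branch_probabilities = {
        "barrier_fails": 0.1,           # protective measure does not stop the event
        "ignition_or_detonation": 0.05, # energy transfer leads to detonation/deflagration
    }

    sequence_freq = initiating_event_freq
    for prob in branch_probabilities.values():
        sequence_freq *= prob

    print(f"explosion sequence frequency: {sequence_freq:.2e} per year")  # 5.00e-06
    ```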

  10. Radionuclide Analysis on Bamboos following the Fukushima Nuclear Accident

    PubMed Central

    Higaki, Takumi; Higaki, Shogo; Hirota, Masahiro; Akita, Kae; Hasezawa, Seiichiro

    2012-01-01

    In response to contamination from the recent Fukushima nuclear accident, we conducted radionuclide analysis on bamboos sampled from six sites within a 25 to 980 km radius of the Fukushima Daiichi nuclear power plant. Maximum activity concentrations of the radiocesium isotopes 134Cs and 137Cs in samples from Fukushima city, 65 km away from the Fukushima Daiichi plant, were in excess of 71 and 79 kBq/kg dry weight (DW), respectively. In Kashiwa city, 195 km away from the Fukushima Daiichi plant, the sample concentrations were in excess of 3.4 and 4.3 kBq/kg DW, respectively. In Toyohashi city, 440 km away from the Fukushima Daiichi plant, the concentrations were below the detection limits, which were as high as 4.5 Bq/kg DW. In the radiocesium-contaminated samples, the radiocesium activity was higher in mature and fallen leaves than in young leaves, branches, and culms. PMID:22496858

  11. Aircraft Accident Prevention: Loss-of-Control Analysis

    NASA Technical Reports Server (NTRS)

    Kwatny, Harry G.; Dongmo, Jean-Etienne T.; Chang, Bor-Chin; Bajpai, Guarav; Yasar, Murat; Belcastro, Christine M.

    2009-01-01

    The majority of fatal aircraft accidents are associated with loss-of-control. Yet the notion of loss-of-control is not well-defined in terms suitable for rigorous control systems analysis. Loss-of-control is generally associated with flight outside of the normal flight envelope, with nonlinear influences, and with an inability of the pilot to control the aircraft. The two primary sources of nonlinearity are the intrinsic nonlinear dynamics of the aircraft and the state and control constraints within which the aircraft must operate. In this paper we examine how these nonlinearities affect the ability to control the aircraft and how they may contribute to loss-of-control. Examples are provided using NASA's Generic Transport Model.

  12. Preliminary analysis of loss-of-coolant accident in Fukushima nuclear accident

    SciTech Connect

    Su'ud, Zaki; Anshari, Rio

    2012-06-06

    This paper discusses the loss-of-coolant accident (LOCA) in boiling water reactors (BWRs), with emphasis on the Fukushima nuclear accident. The Tohoku earthquake triggered the shutdown of the nuclear power reactors at the Fukushima Nuclear Power Station. Although the shutdown was completed, cooling at a much lower level than in normal operation is still needed to remove decay heat from the reactor core until the reactor reaches the cold-shutdown condition. If a LOCA occurs in this condition, fuel and other core temperatures rise, which can lead to core meltdown and release of radioactive material to the environment, as in the Fukushima Dai-ichi accident. In this study, numerical simulations were performed to calculate the pressure, water level, and temperature distribution in the reactors during the accident. Two coolant regulating systems were operational on reactor unit 1 during the accident: the Isolation Condenser (IC) system and the Safety Relief Valves (SRV). The average steam mass flow to the IC system in this event was 10 kg/s, which kept the core covered for about 3.2 hours; the core was fully uncovered 4.7 hours later. Two coolant regulating systems were operational on reactor unit 2: the Reactor Core Isolation Cooling (RCIC) system and the SRVs. The average coolant mass flow in this event was 20 kg/s, which kept the core covered for about 73 hours; the core was fully uncovered 75 hours later. Three coolant regulating systems were operational on reactor unit 3: the RCIC system, the High Pressure Coolant Injection (HPCI) system, and the SRVs. The average water mass flow in this event was 15 kg/s, which kept the core covered for about 37 hours; the core was fully uncovered 40 hours later.
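
    The core-uncovery times quoted above can be bounded with a very simple energy balance: the boil-off rate is the decay heat divided by the latent heat of vaporization, and the time to uncover the core is the coolant inventory above the core divided by the net loss rate (boil-off minus injection). The sketch below uses assumed inventory and decay-heat values purely to show the form of the estimate; it is not the simulation performed in the paper.

    ```python
    # Crude boil-off estimate of time to core uncovery (assumed, illustrative values).
    H_FG = 1.5e6          # J/kg, rough latent heat of vaporization at elevated pressure

    def hours_to_uncovery(decay_heat_w: float, water_mass_above_core_kg: float,
                          injection_kg_s: float) -> float:
        """Time until the water above the core boils away, ignoring many real effects."""
        boil_off_kg_s = decay_heat_w / H_FG
        net_loss_kg_s = boil_off_kg_s - injection_kg_s
        if net_loss_kg_s <= 0:
            return float("inf")   # injection keeps up with boil-off
        return water_mass_above_core_kg / net_loss_kg_s / 3600.0

    # Example with assumed numbers: 20 MW decay heat, 150 t of water above the core, no injection.
    print(round(hours_to_uncovery(20e6, 150e3, 0.0), 1))
    ```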

  13. Preliminary analysis of loss-of-coolant accident in Fukushima nuclear accident

    NASA Astrophysics Data System (ADS)

    Su'ud, Zaki; Anshari, Rio

    2012-06-01

    This paper discusses the loss-of-coolant accident (LOCA) in boiling water reactors (BWRs), with emphasis on the Fukushima nuclear accident. The Tohoku earthquake triggered the shutdown of the nuclear power reactors at the Fukushima Nuclear Power Station. Although the shutdown was completed, cooling at a much lower level than in normal operation is still needed to remove decay heat from the reactor core until the reactor reaches the cold-shutdown condition. If a LOCA occurs in this condition, fuel and other core temperatures rise, which can lead to core meltdown and release of radioactive material to the environment, as in the Fukushima Dai-ichi accident. In this study, numerical simulations were performed to calculate the pressure, water level, and temperature distribution in the reactors during the accident. Two coolant regulating systems were operational on reactor unit 1 during the accident: the Isolation Condenser (IC) system and the Safety Relief Valves (SRV). The average steam mass flow to the IC system in this event was 10 kg/s, which kept the core covered for about 3.2 hours; the core was fully uncovered 4.7 hours later. Two coolant regulating systems were operational on reactor unit 2: the Reactor Core Isolation Cooling (RCIC) system and the SRVs. The average coolant mass flow in this event was 20 kg/s, which kept the core covered for about 73 hours; the core was fully uncovered 75 hours later. Three coolant regulating systems were operational on reactor unit 3: the RCIC system, the High Pressure Coolant Injection (HPCI) system, and the SRVs. The average water mass flow in this event was 15 kg/s, which kept the core covered for about 37 hours; the core was fully uncovered 40 hours later.

  14. Advanced accident sequence precursor analysis level 1 models

    SciTech Connect

    Sattison, M.B.; Thatcher, T.A.; Knudsen, J.K.; Schroeder, J.A.; Siu, N.O.

    1996-03-01

    INEL has been involved in the development of plant-specific Accident Sequence Precursor (ASP) models for the past two years. These models were developed for use with the SAPHIRE suite of PRA computer codes. They contained event tree/linked fault tree Level 1 risk models for the following initiating events: general transient, loss of offsite power, steam generator tube rupture, small loss-of-coolant accident, and anticipated transient without scram. Early in 1995 the ASP models were revised based on review comments from the NRC and an independent peer review. These models were released as Revision 1. The Office of Nuclear Regulatory Research has sponsored several projects at the INEL this fiscal year to further enhance the capabilities of the ASP models. The Revision 2 models incorporate more detailed plant information concerning plant response to station blackout conditions, information on battery life, and other unique features gleaned from an Office of Nuclear Reactor Regulation quick review of the Individual Plant Examination submittals. These models are currently being delivered to the NRC as they are completed. A related project is a feasibility study and model development of low power/shutdown (LP/SD) and external event extensions to the ASP models. This project will establish criteria for the selection of LP/SD and external initiator operational events for analysis within the ASP program. Prototype models for each pertinent initiating event (loss of shutdown cooling, loss of inventory control, fire, flood, seismic, etc.) will be developed. A third project concerns development of enhancements to SAPHIRE. In relation to the ASP program, a new SAPHIRE module, GEM, was developed as a specific user interface for performing ASP evaluations. This module greatly simplifies the analysis process for determining the conditional core damage probability for a given combination of initiating events and equipment failures or degradations.
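
    The conditional core damage probability computed for an operational event can be illustrated in minimal form: the observed condition is mapped onto the risk model by setting the affected basic events to failed (or degraded) values and requantifying the core damage probability. The sketch below does this for a toy two-train model; it is illustrative only and is not the SAPHIRE/GEM algorithm.

    ```python
    # Toy illustration of a conditional core damage probability (CCDP) calculation.
    # Model: core damage requires failure of both mitigation trains A and B (independence assumed).
    def core_damage_prob(p_train_a: float, p_train_b: float) -> float:
        return p_train_a * p_train_b

    baseline = core_damage_prob(p_train_a=1e-2, p_train_b=1e-2)

    # Observed condition: train A found failed during the event -> set its probability to 1.0.
    conditional = core_damage_prob(p_train_a=1.0, p_train_b=1e-2)

    print(f"baseline CDP = {baseline:.1e}, CCDP given train A failed = {conditional:.1e}")
    ```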

  16. Accident involvement among learner drivers--an analysis of the consequences of supervised practice.

    PubMed

    Gregersen, Nils Petter; Nyberg, Anders; Berg, Hans-Yngve

    2003-09-01

    It is a well-known fact that experience is important for safe driving. Previously, this presented a problem since experience was mostly gained during the most dangerous period of driving: the first years with a licence. In many countries, this "experience paradox" has been addressed by providing increased opportunities to gain experience through supervised practice. One question, however, which still needs to be answered is what has been lost and what has been gained through supervised practice. Does this method lead to fewer accidents after licensing, and/or has the number of accidents during driving practice increased? There were three aims in the study. The first was to calculate the size of the accident problem in terms of the number of accidents, health risk and accident risk during practising. The second aim was to evaluate the solution to the "experience paradox" that supervised practice offers by calculating the costs in terms of accidents during driving practice and the benefits in terms of reduced accident involvement after obtaining a licence. The third aim was to analyse the conflict types that occur during driving practice. National register data on licence holders and police-reported injury accidents, together with self-reported exposure, were used. The results show that during the period 1994-2000, 444 driving practice injury accidents were registered, compared to 13,657 accidents during the first two years with a licence. The health risk during the period after licensing was 33 times higher and the accident risk 10 times higher than the corresponding risk during practice. The cost-benefit analysis showed that the benefits in terms of accident reduction after licensing were 30 times higher than the costs in terms of driving practice accidents. It is recommended that measures to reduce such accidents focus on better education of the lay instructor, not on reducing the amount of lay-instructed practice. PMID:12850073

  17. An analysis of accident data for franchised public buses in Hong Kong.

    PubMed

    Evans, W A; Courtney, A J

    1985-10-01

    This paper analyses data on accidents involving franchised public buses operating in Hong Kong. The data were obtained from the Royal Hong Kong Police, the Hong Kong Government Transport Department, the two major franchised bus operators and international sources. The analysis includes an international comparison of accidents with emphasis on the situation in Hong Kong compared to urban areas in the United Kingdom. An attempt has been made to identify the characteristics of bus accidents; accident incidence has been related to time of day, day of the week, time of year, weather conditions, driver's age and experience, hours on duty and police-reported cause. The results indicate that Hong Kong has a high accident rate compared to Japan, the U.K. and the U.S.A., with particularly high pedestrian involvement rates. Bus accidents peak at around 9:00 AM and 4:00 PM but the accident rate is high throughout the day. Monday and Saturday appear to have a higher than average accident rate. The variability of accident rate throughout the year does not seem to be significant and the accident rate does not appear to be influenced by weather conditions. Older, more experienced drivers generally have a safer driving record than their younger, less experienced colleagues. Accident occurrence is related to the time the driver has been on duty. The paper questions the reliability of police-reported accident causation data and suggests improvements in the design of the accident report form and in the training of police investigators. The relevance of the Hong Kong study for accident research in general is also discussed. PMID:4096796

  18. MACCS usage at Rocky Flats Plant for consequence analysis of postulated accidents

    SciTech Connect

    Foppe, T.L.; Peterson, V.L.

    1993-10-01

    The MELCOR Accident Consequence Code System (MACCS) has been applied to the radiological consequence assessment of potential accidents from a non-reactor nuclear facility. MACCS has been used in a variety of applications to evaluate radiological dose and health effects to the public from postulated plutonium releases and from postulated criticalities. These applications were conducted to support deterministic and probabilistic accident analyses for safety analysis reports, radiological sabotage studies, and other regulatory requests.

  19. 300-Area accident analysis for Emergency Planning Zones

    SciTech Connect

    Pillinger, W.L.

    1983-06-27

    The Department of Energy has requested SRL assistance in developing offsite Emergency Planning Zones (EPZs) for the Savannah River Plant, based on projected dose consequences of atmospheric releases of radioactivity from potential credible accidents in the SRP operating areas. This memorandum presents the assessment of the offsite doses via the plume exposure pathway from the 300-Area potential accidents. 8 refs., 3 tabs.

  20. GPHS-RTG launch accident analysis for Galileo and Ulysses

    SciTech Connect

    Bradshaw, C.T.

    1991-01-01

    This paper presents the safety program conducted to determine the response of the General Purpose Heat Source (GPHS) Radioisotope Thermoelectric Generator (RTG) to potential launch accidents of the Space Shuttle for the Galileo and Ulysses missions. The National Aeronautics and Space Administration (NASA) provided definition of the Shuttle potential accidents and characterized the environments. The Launch Accident Scenario Evaluation Program (LASEP) was developed by GE to analyze the RTG response to these accidents. RTG detailed response to Solid Rocket Booster (SRB) fragment impacts, as well as to other types of impact, was obtained from an extensive series of hydrocode analyses. A comprehensive test program was conducted also to determine RTG response to the accident environments. The hydrocode response analyses coupled with the test data base provided the broad range response capability which was implemented in LASEP.

  1. Swimming pool immersion accidents: an analysis from the Brisbane Drowning Study

    PubMed Central

    Pearn, John H; Nixon, James

    1997-01-01

    An analysis of a consecutive series of 66 swimming pool immersion accidents is presented; 74% of these occurred in in-ground swimming pools. Where pools are inadequately fenced, the estimated accident rate per pool is five times greater for in-ground pools than for above-ground pools. Backyard swimming pools account for 74% of pool accidents. Motel and caravan park pools account for 9% of childhood immersion accidents, but the survival rate (17%) is very low. Fifty per cent of pool accidents occur in the family's own backyard pool, and 13.6% in a neighbour's pool; in the latter the survival rate is still low at only 33%. In only one of the 66 cases was there an adequate safety fence; in 76% of cases there was no fence or barrier whatsoever. Tables of swimming pool accidents by age, season, site, and outcome are presented. PMID:9493630

  2. Progress in accident analysis of the HYLIFE-II inertial fusion energy power plant design

    SciTech Connect

    Reyes, S; Latkowski, J F; Gomez del Rio, J; Sanz, J

    2000-10-11

    The present work continues our effort to perform an integrated safety analysis for the HYLIFE-II inertial fusion energy (IFE) power plant design. Recently we developed a base case for a severe accident scenario in order to calculate accident doses for HYLIFE-II. It consisted of a total loss-of-coolant accident (LOCA) in which all the liquid flibe (Li2BeF4) was lost at the beginning of the accident. Results showed that the off-site dose was below the limit given by the DOE Fusion Safety Standards for public protection in case of accident, and that this dose was dominated by the tritium released during the accident.

  3. Analysis of Convair 990 rejected-takeoff accident with emphasis on decision making, training and procedures

    NASA Technical Reports Server (NTRS)

    Batthauer, Byron E.

    1987-01-01

    This paper analyzes a NASA Convair 990 (CV-990) accident with emphasis on rejected-takeoff (RTO) decision making, training, procedures, and accident statistics. The NASA Aircraft Accident Investigation Board was somewhat perplexed that an aircraft could be destroyed as a result of blown tires during the takeoff roll. To provide a better understanding of tire-failure RTOs, the Board obtained accident reports, Federal Aviation Administration (FAA) studies, and other pertinent information related to the elements of this accident. This material enhanced the analysis process and convinced the Accident Board that high-speed RTOs in transport aircraft should be given more emphasis during pilot training. Pilots should be made aware of various RTO situations and statistics, with emphasis on failed-tire RTOs. This background information could enhance the split-second decision making that is required prior to initiating an RTO.

  4. Analysis of 139 spinal cord injuries due to accidents in water sports.

    PubMed

    Steinbrück, K; Paeslack, V

    1980-04-01

    Between 1967 and 1978, a total of 2587 patients received primary treatment in the Spinal Cord Injury Centre at the University of Heidelberg. In 212 cases the paralysis was caused by sports or diving accidents. Injuries resulting from accidents in water sports totalled 139, of which 131 (61.7 per cent) could be classified as actual diving accidents. These 131 cases consisted of 129 tetraplegias and only 2 paraplegias. In 5 cases, the tetraplegia resulted from high diving and in 3 cases from scuba-diving. The subjects of the analysis are the causes of the accidents, the segmental diagnosis of neurological deficiency symptoms, and the prognosis.

  5. Analysis of construction accidents in Turkey and responsible parties.

    PubMed

    Gürcanli, G Emre; Müngen, Uğur

    2013-01-01

    Construction is one of the world's biggest industries, including jobs as diverse as building, civil engineering, demolition, renovation, repair and maintenance. Construction workers are exposed to a wide variety of hazards. This study analyzes 1,117 expert witness reports that were submitted to criminal and labour courts. These reports are from all regions of the country and cover the period 1972-2008. Accidents were classified by the consequence of the incident, time and main causes of the accident, construction type, occupation of the victim, activity at the time of the accident, and party responsible for the accident. Falls (54.1%), struck by thrown/falling object (12.9%), structural collapses (9.9%) and electrocutions (7.5%) occupy the first four places. The accidents were most likely to occur between 15:00 and 17:00 (22.6%), between 10:00 and 12:00 (18.7%), and just after lunchtime (9.9%). Additionally, the most common accidents were further divided into sub-types. Expert-witness assessments were used to identify the parties at fault and the acts of negligence that typically lead to accidents. Nearly two thirds of the faulty and negligent acts were carried out by employers, and employees were responsible for almost one third of all cases. PMID:24077446

  6. Analysis of Construction Accidents in Turkey and Responsible Parties

    PubMed Central

    GÜRCANLI, G. Emre; MÜNGEN, Uğur

    2013-01-01

    Construction is one of the world’s biggest industries, including jobs as diverse as building, civil engineering, demolition, renovation, repair and maintenance. Construction workers are exposed to a wide variety of hazards. This study analyzes 1,117 expert witness reports that were submitted to criminal and labour courts. These reports are from all regions of the country and cover the period 1972–2008. Accidents were classified by the consequence of the incident, time and main causes of the accident, construction type, occupation of the victim, activity at the time of the accident, and party responsible for the accident. Falls (54.1%), struck by thrown/falling object (12.9%), structural collapses (9.9%) and electrocutions (7.5%) occupy the first four places. The accidents were most likely to occur between 15:00 and 17:00 (22.6%), between 10:00 and 12:00 (18.7%), and just after lunchtime (9.9%). Additionally, the most common accidents were further divided into sub-types. Expert-witness assessments were used to identify the parties at fault and the acts of negligence that typically lead to accidents. Nearly two thirds of the faulty and negligent acts were carried out by employers, and employees were responsible for almost one third of all cases. PMID:24077446

  7. Sodium fast reactor gaps analysis of computer codes and models for accident analysis and reactor safety.

    SciTech Connect

    Carbajo, Juan; Jeong, Hae-Yong; Wigeland, Roald; Corradini, Michael; Schmidt, Rodney Cannon; Thomas, Justin; Wei, Tom; Sofu, Tanju; Ludewig, Hans; Tobita, Yoshiharu; Ohshima, Hiroyuki; Serre, Frederic

    2011-06-01

    This report summarizes the results of an expert-opinion elicitation activity designed to qualitatively assess the status and capabilities of currently available computer codes and models for accident analysis and reactor safety calculations of advanced sodium fast reactors, and identify important gaps. The twelve-member panel consisted of representatives from five U.S. National Laboratories (SNL, ANL, INL, ORNL, and BNL), the University of Wisconsin, the KAERI, the JAEA, and the CEA. The major portion of this elicitation activity occurred during a two-day meeting held on Aug. 10-11, 2010 at Argonne National Laboratory. There were two primary objectives of this work: (1) Identify computer codes currently available for SFR accident analysis and reactor safety calculations; and (2) Assess the status and capability of current US computer codes to adequately model the required accident scenarios and associated phenomena, and identify important gaps. During the review, panel members identified over 60 computer codes that are currently available in the international community to perform different aspects of SFR safety analysis for various event scenarios and accident categories. A brief description of each of these codes together with references (when available) is provided. An adaptation of the Predictive Capability Maturity Model (PCMM) for computational modeling and simulation is described for use in this work. The panel's assessment of the available US codes is presented in the form of nine tables, organized into groups of three for each of three risk categories considered: anticipated operational occurrences (AOOs), design basis accidents (DBA), and beyond design basis accidents (BDBA). A set of summary conclusions are drawn from the results obtained. At the highest level, the panel judged that current US code capabilities are adequate for licensing given reasonable margins, but expressed concern that US code development activities had stagnated and that the

  8. Accidents at Work and Costs Analysis: A Field Study in a Large Italian Company

    PubMed Central

    BATTAGLIA, Massimo; FREY, Marco; PASSETTI, Emilio

    2014-01-01

    Accidents at work are still a heavy burden in social and economic terms, and action to improve health and safety standards at work offers great potential gains not only to employers, but also to individuals and society as a whole. However, companies are often not interested in measuring the costs of accidents, even though cost information may facilitate preventive occupational health and safety management initiatives. The field study, carried out in a large Italian company, illustrates technical and organisational aspects associated with the implementation of an accident costs analysis tool. The results indicate that the implementation (and the use) of the tool requires a considerable commitment by the company, that accident costs analysis should serve to reinforce the importance of health and safety prevention, and that the economic dimension of accidents is substantial. The study also suggests practical ways to facilitate the implementation and the moral acceptance of the accounting technology. PMID:24869894

  9. Systems biology data analysis methodology in pharmacogenomics

    PubMed Central

    Rodin, Andrei S; Gogoshin, Grigoriy; Boerwinkle, Eric

    2012-01-01

    Pharmacogenetics aims to elucidate the genetic factors underlying the individual’s response to pharmacotherapy. Coupled with the recent (and ongoing) progress in high-throughput genotyping, sequencing and other genomic technologies, pharmacogenetics is rapidly transforming into pharmacogenomics, while pursuing the primary goals of identifying and studying the genetic contribution to drug therapy response and adverse effects, and existing drug characterization and new drug discovery. Accomplishment of both of these goals hinges on gaining a better understanding of the underlying biological systems; however, reverse-engineering biological system models from the massive datasets generated by the large-scale genetic epidemiology studies presents a formidable data analysis challenge. In this article, we review the recent progress made in developing such data analysis methodology within the paradigm of systems biology research that broadly aims to gain a ‘holistic’, or ‘mechanistic’ understanding of biological systems by attempting to capture the entirety of interactions between the components (genetic and otherwise) of the system. PMID:21919609

  10. Analysis of Loss-of-Coolant Accidents in the NBSR

    SciTech Connect

    Baek J. S.; Cheng L.; Diamond, D.

    2014-05-23

    This report documents calculations of the fuel cladding temperature during loss-of-coolant accidents in the NBSR. The probability of a pipe failure is small and procedures exist to minimize the loss of water and assure emergency cooling water flows into the reactor core during such an event. Analysis in the past has shown that the emergency cooling water would provide adequate cooling if the water filled the flow channels within the fuel elements. The present analysis is to determine if there is adequate cooling if the water drains from the flow channels. Based on photographs of how the emergency water flows into the fuel elements from the distribution pan, it can be assumed that this water does not distribute uniformly across the flow channels but rather results in a liquid film flowing downward on the inside of one of the side plates in each fuel element and only wets the edges of the fuel plates. An analysis of guillotine breaks shows the cladding temperature remains below the blister temperature in fuel plates in the upper section of the fuel element. In the lower section, the fuel plates are also cooled by water outside the element that is present due to the hold-up pan and temperatures are lower than in the upper section. For small breaks, the simulation results show that the fuel elements are always cooled on the outside even in the upper section and the cladding temperature cannot be higher than the blister temperature. The above results are predicated on assumptions that are examined in the study to see their influence on fuel temperature.

  11. Work-related accidents among the Iranian population: a time series analysis, 2000–2011

    PubMed Central

    Karimlou, Masoud; Imani, Mehdi; Hosseini, Agha-Fatemeh; Dehnad, Afsaneh; Vahabi, Nasim; Bakhtiyari, Mahmood

    2015-01-01

    Background Work-related accidents result in human suffering and economic losses and are considered a major health problem worldwide, especially in the economically developing world. Objectives To introduce seasonal autoregressive integrated moving average (ARIMA) models for time series analysis of work-related accident data for workers insured by the Iranian Social Security Organization (ISSO) between 2000 and 2011. Methods In this retrospective study, all insured people experiencing at least one work-related accident during a 10-year period were included in the analyses. We used Box–Jenkins modeling to develop a time series model of the total number of accidents. Results There was an average of 1476 accidents per month (1476.05 ± 458.77, mean ± SD). The final seasonal ARIMA(p,d,q)(P,D,Q)s model fitted to the data was ARIMA(1,1,1)×(0,1,1)12, consisting of first-order autoregressive, moving average, and seasonal moving average terms, with a mean absolute percentage error (MAPE) of 20.942. Conclusions The final model showed that time series analysis with ARIMA models is useful for forecasting the number of work-related accidents in Iran. In addition, the forecast of work-related accidents for 2011 reflected the stability of the occurrence of these accidents in recent years, indicating a need for preventive occupational health and safety policies such as safety inspection. PMID:26119774
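
    The seasonal model reported above, ARIMA(1,1,1)×(0,1,1)12, can be fitted with standard time-series tooling. The sketch below shows the general pattern with statsmodels on synthetic monthly counts; the data are random placeholders, not the ISSO accident series.

    ```python
    # Fitting a seasonal ARIMA(1,1,1)x(0,1,1)12 model to synthetic monthly accident counts.
    # statsmodels is assumed available; the series here is a random placeholder.
    import numpy as np
    import pandas as pd
    from statsmodels.tsa.statespace.sarimax import SARIMAX

    rng = np.random.default_rng(0)
    months = pd.date_range("2000-01-01", periods=144, freq="MS")
    seasonal = 200 * np.sin(2 * np.pi * months.month / 12)
    y = pd.Series(1476 + seasonal + rng.normal(0, 100, len(months)), index=months)

    model = SARIMAX(y, order=(1, 1, 1), seasonal_order=(0, 1, 1, 12))
    result = model.fit(disp=False)

    forecast = result.forecast(steps=12)          # forecast the next 12 months
    mape = np.mean(np.abs((y - result.fittedvalues) / y)) * 100
    print(f"in-sample MAPE: {mape:.1f}%")
    print(forecast.round(0))
    ```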

  12. Risk analysis of emergent water pollution accidents based on a Bayesian Network.

    PubMed

    Tang, Caihong; Yi, Yujun; Yang, Zhifeng; Sun, Jie

    2016-01-01

    To guarantee the security of water quality in water transfer channels, especially open channels, analysis of potential emergent pollution sources in the water transfer process is critical. It is also indispensable for forewarning and protection against emergent pollution accidents. Bridges above open channels with large amounts of truck traffic are the main locations where emergent accidents could occur. A Bayesian Network model, which consists of six root nodes and three middle-layer nodes, was developed in this paper and employed to identify the possibility of potential pollution risk. Dianbei Bridge is taken as a typical bridge on an open channel of the Middle Route of the South to North Water Transfer Project where emergent traffic accidents could occur. This study focuses on the risk of water pollution caused by leakage of pollutants into the water: the risk of potential traffic accidents at the Dianbei Bridge implies a risk of water pollution in the canal. Based on survey data, statistical analysis, and domain specialist knowledge, a Bayesian Network model was established. The human factor in emergent accidents has been considered in this model. Additionally, the model has been employed to describe the probability of accidents and the risk level, and the factors to which pollution accidents are most sensitive have been deduced. The case in which these sensitive factors are in the states most likely to lead to accidents has also been simulated. PMID:26433361
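
    As a minimal illustration of the kind of Bayesian Network reasoning described above, the sketch below chains a few discrete nodes (traffic volume, weather, driver behaviour → traffic accident → pollutant leakage) and computes the marginal probability of a pollution event by enumeration. The structure and numbers are assumptions for illustration, not the model in the paper.

    ```python
    # Minimal hand-rolled Bayesian Network sketch (structure and probabilities are assumed).
    from itertools import product

    # Root-node priors: P(node is in its "adverse" state).
    P_heavy_traffic = 0.3
    P_bad_weather = 0.2
    P_risky_driving = 0.1

    # P(traffic accident on the bridge | heavy traffic, bad weather, risky driving)
    P_accident = {
        (True, True, True): 0.20, (True, True, False): 0.08,
        (True, False, True): 0.10, (True, False, False): 0.03,
        (False, True, True): 0.07, (False, True, False): 0.02,
        (False, False, True): 0.04, (False, False, False): 0.005,
    }
    P_leak_given_accident = 0.25   # P(pollutant leaks into the canal | accident)

    # Marginalize over the root nodes to get P(pollution event).
    p_pollution = 0.0
    for traffic, weather, driving in product([True, False], repeat=3):
        p_state = ((P_heavy_traffic if traffic else 1 - P_heavy_traffic)
                   * (P_bad_weather if weather else 1 - P_bad_weather)
                   * (P_risky_driving if driving else 1 - P_risky_driving))
        p_pollution += p_state * P_accident[(traffic, weather, driving)] * P_leak_given_accident

    print(f"P(pollution accident) ~ {p_pollution:.4f}")
    ```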

  14. A Review of Citation Analysis Methodologies for Collection Management

    ERIC Educational Resources Information Center

    Hoffmann, Kristin; Doucette, Lise

    2012-01-01

    While there is a considerable body of literature that presents the results of citation analysis studies, most researchers do not provide enough detail in their methodology to reproduce the study, nor do they provide rationale for methodological decisions. In this paper, we review the methodologies used in 34 recent articles that present a…

  15. Safety analysis results for cryostat ingress accidents in ITER

    SciTech Connect

    Merrill, B.J.; Cadwallader, L.C.; Petti, D.A.

    1996-12-31

    Accidents involving the ingress of air or water into the cryostat of the International Thermonuclear Experimental Reactor (ITER) tokamak design have been analyzed with a modified version of the MELCOR code for the ITER Non-site Specific Safety Report (NSSR-1). The air ingress accident is the result of a postulated breach of the cryostat boundary into an adjoining room. MELCOR results for this accident demonstrate that the condensed air mass and increased heat loads are not a magnet safety concern, but that the partial vacuum in the adjoining room must be accommodated in the building design. The water ingress accident is the result of a postulated magnet arc that results in melting of a Primary Heat Transport System (PHTS) coolant pipe, discharging PHTS water and PHTS water activated corrosion products and HTO into the cryostat. MELCOR results for this accident demonstrate that the condensed water mass and increased heat loads are not a magnet safety concern, that the cryostat pressure remains below design limits, and that the corrosion product and HTO releases are well within the ITER release limits.

  16. Waste management facility accident analysis (WASTE ACC) system: software for analysis of waste management alternatives

    SciTech Connect

    Kohout, E.F.; Folga, S.; Mueller, C.; Nabelssi, B.

    1996-03-01

    This paper describes the Waste Management Facility Accident Analysis (WASTE_ACC) software, which was developed at Argonne National Laboratory (ANL) to support the US Department of Energy's (DOE's) Waste Management (WM) Programmatic Environmental Impact Statement (PEIS). WASTE_ACC is a decision support and database system that is compatible with Microsoft® Windows™. It assesses potential atmospheric releases from accidents at waste management facilities. The software provides the user with an easy-to-use tool to determine the risk-dominant accident sequences for the many possible combinations of process technologies, waste and facility types, and alternative cases described in the WM PEIS. In addition, its structure will allow additional alternative cases and assumptions to be tested as part of the future DOE programmatic decision-making process. The WASTE_ACC system demonstrates one approach to performing a generic, systemwide evaluation of accident risks at waste management facilities. The advantages of WASTE_ACC are threefold. First, the software gets waste volume and radiological profile data that were used to perform other WM PEIS-related analyses directly from the WASTE_MGMT system. Second, the system allows for a consistent analysis across all sites and waste streams, which enables decision makers to understand more fully the trade-offs among various policy options and scenarios. Third, the system is easy to operate; even complex scenario runs are completed within minutes.

  17. Fukushima Daiichi unit 1 uncertainty analysis--Preliminary selection of uncertain parameters and analysis methodology

    SciTech Connect

    Cardoni, Jeffrey N.; Kalinich, Donald A.

    2014-02-01

    Sandia National Laboratories (SNL) plans to conduct uncertainty analyses (UA) on the Fukushima Daiichi unit (1F1) plant with the MELCOR code. The model to be used was developed for a previous accident reconstruction investigation jointly sponsored by the US Department of Energy (DOE) and Nuclear Regulatory Commission (NRC). However, that study only examined a handful of various model inputs and boundary conditions, and the predictions yielded only fair agreement with plant data and current release estimates. The goal of this uncertainty study is to perform a focused evaluation of uncertainty in core melt progression behavior and its effect on key figures-of-merit (e.g., hydrogen production, vessel lower head failure, etc.). In preparation for the SNL Fukushima UA work, a scoping study has been completed to identify important core melt progression parameters for the uncertainty analysis. The study also lays out a preliminary UA methodology.
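
    A typical first step in the kind of code uncertainty analysis described above is to sample the selected uncertain parameters, often with Latin hypercube sampling, and run the code once per sample. The sketch below shows only that sampling step with SciPy for three hypothetical parameters; the parameter names and ranges are placeholders, not the ones selected in the scoping study.

    ```python
    # Latin hypercube sampling of hypothetical uncertain severe-accident parameters
    # (names and ranges are placeholders; SciPy >= 1.7 assumed for scipy.stats.qmc).
    from scipy.stats import qmc

    param_names = ["cladding_failure_temp_K", "debris_porosity", "candling_heat_transfer_coeff"]
    lower_bounds = [2100.0, 0.3, 100.0]
    upper_bounds = [2500.0, 0.5, 2000.0]

    sampler = qmc.LatinHypercube(d=len(param_names), seed=42)
    unit_samples = sampler.random(n=100)                      # 100 code runs planned
    samples = qmc.scale(unit_samples, lower_bounds, upper_bounds)

    # Each row is one parameter set that would be written into a code input deck.
    for run_id, row in enumerate(samples[:3], start=1):
        print(run_id, dict(zip(param_names, row.round(3))))
    ```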

  18. School sports accidents: analysis of causes, modes, and frequencies.

    PubMed

    Kelm, J; Ahlhelm, F; Pape, D; Pitsch, W; Engel, C

    2001-01-01

    About 5% of all school children are seriously injured during physical education every year. Because of its influence on children's attitude toward sports and the economic aspects, an evaluation of causes and medical consequences is necessary. In this study, 213 school sports accidents were investigated. Besides diagnosis, the localization of injuries, as well as the duration of the sick leave were documented. Average age of injured students was 13 years. Most of the injured students blamed themselves for the accident. The most common injuries were sprains, contusions, and fractures. Main reasons for the accidents were faults in basic motion training. Playing soccer and basketball were the most frequent reasons for injuries. The upper extremity was more frequently involved than the lower extremity. Sports physicians and teachers should work out a program outlining the individual needs and capabilities of the injured students to reintegrate them into physical education.

  19. Accident Precursor Analysis and Management: Reducing Technological Risk Through Diligence

    NASA Technical Reports Server (NTRS)

    Phimister, James R. (Editor); Bier, Vicki M. (Editor); Kunreuther, Howard C. (Editor)

    2004-01-01

    Almost every year there is at least one technological disaster that highlights the challenge of managing technological risk. On February 1, 2003, the space shuttle Columbia and her crew were lost during reentry into the atmosphere. In the summer of 2003, there was a blackout that left millions of people in the northeast United States without electricity. Forensic analyses, congressional hearings, investigations by scientific boards and panels, and journalistic and academic research have yielded a wealth of information about the events that led up to each disaster, and questions have arisen. Why were the events that led to the accident not recognized as harbingers? Why were risk-reducing steps not taken? This line of questioning is based on the assumption that signals before an accident can and should be recognized. To examine the validity of this assumption, the National Academy of Engineering (NAE) undertook the Accident Precursors Project in February 2003. The project was overseen by a committee of experts from the safety and risk-sciences communities. Rather than examining a single accident or incident, the committee decided to investigate how different organizations anticipate and assess the likelihood of accidents from accident precursors. The project culminated in a workshop held in Washington, D.C., in July 2003. This report includes the papers presented at the workshop, as well as findings and recommendations based on the workshop results and committee discussions. The papers describe precursor strategies in aviation, the chemical industry, health care, nuclear power and security operations. In addition to current practices, they also address some areas for future research.

  20. System response of a DOE Defense Program package in a transportation accident environment

    SciTech Connect

    Chen, T.F.; Hovingh, J.; Kimura, C.Y.

    1992-10-15

    The system response in a transportation accident environment is an element to be considered in an overall Transportation System Risk Assessment (TSRA) framework. The system response analysis uses the accident conditions and the subsequent accident progression analysis to develop the accident source term, which in turn, is used in the consequence analysis. This paper proposes a methodology for the preparation of the system response aspect of the TSRA.

  1. ACCIDENT ANALYSES & CONTROL OPTIONS IN SUPPORT OF THE SLUDGE WATER SYSTEM SAFETY ANALYSIS

    SciTech Connect

    WILLIAMS, J.C.

    2003-11-15

    This report documents the accident analyses and nuclear safety control options for use in Revision 7 of HNF-SD-WM-SAR-062, ''K Basins Safety Analysis Report'' and Revision 4 of HNF-SD-SNF-TSR-001, ''Technical Safety Requirements - 100 KE and 100 KW Fuel Storage Basins''. These documents will define the authorization basis for Sludge Water System (SWS) operations. This report follows the guidance of DOE-STD-3009-94, ''Preparation Guide for U.S. Department of Energy Nonreactor Nuclear Facility Safety Analysis Reports'', for calculating onsite and offsite consequences. The accident analysis summary is shown in Table ES-1 below. While this document describes and discusses potential control options to either mitigate or prevent the accidents discussed herein, it should be made clear that the final control selection for any accident is determined and presented in HNF-SD-WM-SAR-062.

  2. Development of Non-LOCA Safety Analysis Methodology with RETRAN-3D and VIPRE-01/K

    SciTech Connect

    Kim, Yo-Han; Cheong, Ae-Ju; Yang, Chang-Keun

    2004-10-15

    Korea Electric Power Research Institute has launched a project to develop an in-house non-loss-of-coolant-accident analysis methodology to overcome the hardships caused by the narrow analytical scopes of existing methodologies. Prior to the development, some safety analysis codes were reviewed, and RETRAN-3D and VIPRE-01 were chosen as the base codes. The codes have been modified to improve the analytical capabilities required to analyze the nuclear power plants in Korea. The methodologies of the vendors and the Electric Power Research Institute have been reviewed, and some documents of foreign utilities have been used to compensate for the insufficiencies. For the next step, a draft methodology for pressurized water reactors has been developed and modified to apply to Westinghouse-type plants in Korea. To verify the feasibility of the methodology, some events of Yonggwang Units 1 and 2 have been analyzed from the standpoints of reactor coolant system pressure and the departure from nucleate boiling ratio. The results of the analyses show trends similar to those of the Final Safety Analysis Report.

  3. Structural Analysis for the American Airlines Flight 587 Accident Investigation: Global Analysis

    NASA Technical Reports Server (NTRS)

    Young, Richard D.; Lovejoy, Andrew E.; Hilburger, Mark W.; Moore, David F.

    2005-01-01

    NASA Langley Research Center (LaRC) supported the National Transportation Safety Board (NTSB) in the American Airlines Flight 587 accident investigation due to LaRC's expertise in high-fidelity structural analysis and testing of composite structures and materials. A Global Analysis Team from LaRC reviewed the manufacturer's design and certification procedures, developed finite element models and conducted structural analyses, and participated jointly with the NTSB and Airbus in subcomponent tests conducted at Airbus in Hamburg, Germany. The Global Analysis Team identified no significant or obvious deficiencies in the Airbus certification and design methods. Analysis results from the LaRC team indicated that the most likely failure scenario was failure initiation at the right rear main attachment fitting (lug), followed by an unstable progression of failure of all fin-to-fuselage attachments and separation of the vertical tail plane (VTP) from the aircraft. Additionally, analysis results indicated that failure initiated at the final observed maximum fin loading condition in the accident, when the VTP was subjected to loads that were at minimum 1.92 times the design limit load condition for certification. For certification, the VTP is only required to support loads of 1.5 times design limit load without catastrophic failure. The maximum loading during the accident was shown to significantly exceed the certification requirement. Thus, the structure appeared to perform in a manner consistent with its design and certification, and failure is attributed to VTP loads greater than expected.

  4. RADIS - a regional nuclear accident consequence analysis model for Hong Kong

    SciTech Connect

    Yeung, Mankit Ray; Ching, E.M.K.

    1993-02-01

    An atmospheric dispersion and consequence model called RADIS has been developed by the University of Hong Kong for nuclear accident consequence analysis. The model uses a two-dimensional plume trajectory derived from wind data for Hong Kong. Dose, health effects, and demographic models are also developed and implemented in RADIS so that accident consequences in 15 major population centers of Greater Hong Kong can be determined individually. In addition, benchmark testing results are given, and comparisons with the analytical solution and CRAC2 results are consistent and satisfactory. Sample calculational results for severe accident consequences are also presented to demonstrate the applicability of RADIS for dry and wet weather conditions.

  5. Experimental methodologies to support aircraft icing analysis

    NASA Technical Reports Server (NTRS)

    Hansman, R. John, Jr.; Kirby, Mark S.

    1987-01-01

    The experimental methodologies are illustrated by graphs, charts and line drawings. Typical ultrasonic echo signals for dry and wet ice growth, ice accretion rates for various tunnel configurations, the experimental configuration for flight tests of the ultrasonic measuring system and heat balance models used to predict ice growth are among the topics that are illustrated and briefly discussed.

  6. Accident Analysis for the NIST Research Reactor Before and After Fuel Conversion

    SciTech Connect

    Baek J.; Diamond D.; Cuadra, A.; Hanson, A.L.; Cheng, L-Y.; Brown, N.R.

    2012-09-30

    Postulated accidents have been analyzed for the 20 MW D2O-moderated research reactor (NBSR) at the National Institute of Standards and Technology (NIST). The analysis has been carried out for the present core, which contains high enriched uranium (HEU) fuel and for a proposed equilibrium core with low enriched uranium (LEU) fuel. The analyses employ state-of-the-art calculational methods. Three-dimensional Monte Carlo neutron transport calculations were performed with the MCNPX code to determine homogenized fuel compositions in the lower and upper halves of each fuel element and to determine the resulting neutronic properties of the core. The accident analysis employed a model of the primary loop with the RELAP5 code. The model includes the primary pumps, shutdown pumps outlet valves, heat exchanger, fuel elements, and flow channels for both the six inner and twenty-four outer fuel elements. Evaluations were performed for the following accidents: (1) control rod withdrawal startup accident, (2) maximum reactivity insertion accident, (3) loss-of-flow accident resulting from loss of electrical power with an assumption of failure of shutdown cooling pumps, (4) loss-of-flow accident resulting from a primary pump seizure, and (5) loss-of-flow accident resulting from inadvertent throttling of a flow control valve. In addition, natural circulation cooling at low power operation was analyzed. The analysis shows that the conversion will not lead to significant changes in the safety analysis and the calculated minimum critical heat flux ratio and maximum clad temperature assure that there is adequate margin to fuel failure.

  7. Advanced accident sequence precursor analysis level 2 models

    SciTech Connect

    Galyean, W.J.; Brownson, D.A.; Rempe, J.L.

    1996-03-01

    The U.S. Nuclear Regulatory Commission Accident Sequence Precursor program pursues the ultimate objective of performing risk-significant evaluations on operational events (precursors) occurring in commercial nuclear power plants. To achieve this objective, the Office of Nuclear Regulatory Research is supporting the development of simple probabilistic risk assessment models for all commercial nuclear power plants (NPP) in the U.S. Presently, only simple Level 1 plant models have been developed which estimate core damage frequencies. In order to provide a true risk perspective, the consequences associated with postulated core damage accidents also need to be considered. With the objective of performing risk evaluations in an integrated and consistent manner, a linked event tree approach that propagates the front-end results to the back end was developed. This approach utilizes simple plant models that analyze the response of the NPP containment structure in the context of a core damage accident, estimate the magnitude and timing of a radioactive release to the environment, and calculate the consequences for a given release. Detailed models and results from previous studies, such as the NUREG-1150 study, are used to quantify these simple models. These simple models are then linked to the existing Level 1 models, and are evaluated using the SAPHIRE code. To demonstrate the approach, prototypic models have been developed for a boiling water reactor, Peach Bottom, and a pressurized water reactor, Zion.
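    The linked event tree idea above can be pictured with a minimal sketch (not the SAPHIRE implementation): a Level 1 core damage frequency is propagated through a simplified containment event tree to release-category frequencies. The core damage frequency, branch probabilities, and category names below are assumed for illustration only.

```python
# Minimal sketch of linking Level 1 results to a simple Level 2 containment
# event tree. All numbers and category names are illustrative assumptions.

core_damage_frequency = 2.0e-5  # per reactor-year, hypothetical Level 1 result

# Conditional containment failure-mode probabilities given core damage
containment_branches = {
    "early_failure": 0.02,
    "late_failure": 0.10,
    "bypass": 0.01,
    "intact": 0.87,
}

# Map each containment end state to a release category
release_category = {
    "early_failure": "large_early_release",
    "late_failure": "late_release",
    "bypass": "large_early_release",
    "intact": "negligible_release",
}

release_frequencies = {}
for branch, prob in containment_branches.items():
    cat = release_category[branch]
    release_frequencies[cat] = release_frequencies.get(cat, 0.0) + core_damage_frequency * prob

for cat, freq in sorted(release_frequencies.items()):
    print(f"{cat}: {freq:.2e} per reactor-year")
```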

  8. Accident sequence precursor analysis level 2/3 model development

    SciTech Connect

    Lui, C.H.; Galyean, W.J.; Brownson, D.A.

    1997-02-01

    The US Nuclear Regulatory Commission's Accident Sequence Precursor (ASP) program currently uses simple Level 1 models to assess the conditional core damage probability for operational events occurring in commercial nuclear power plants (NPP). Since not all accident sequences leading to core damage will result in the same radiological consequences, it is necessary to develop simple Level 2/3 models that can be used to analyze the response of the NPP containment structure in the context of a core damage accident, estimate the magnitude of the resulting radioactive releases to the environment, and calculate the consequences associated with these releases. The simple Level 2/3 model development work was initiated in 1995, and several prototype models have been completed. Once developed, these simple Level 2/3 models are linked to the simple Level 1 models to provide risk perspectives for operational events. This paper describes the methods implemented for the development of these simple Level 2/3 ASP models, and the linkage process to the existing Level 1 models.

  9. Risk analysis using a hybrid Bayesian-approximate reasoning methodology.

    SciTech Connect

    Bott, T. F.; Eisenhawer, S. W.

    2001-01-01

    Analysts are sometimes asked to make frequency estimates for specific accidents in which the accident frequency is determined primarily by safety controls. Under these conditions, frequency estimates use considerable expert belief in determining how the controls affect the accident frequency. To evaluate and document beliefs about control effectiveness, we have modified a traditional Bayesian approach by using approximate reasoning (AR) to develop prior distributions. Our method produces accident frequency estimates that separately express the probabilistic results produced in Bayesian analysis and possibilistic results that reflect uncertainty about the prior estimates. Based on our experience using traditional methods, we feel that the AR approach better documents beliefs about the effectiveness of controls than if the beliefs are buried in Bayesian prior distributions. We have performed numerous expert elicitations in which probabilistic information was sought from subject matter experts not trained in probability. We find it much easier to elicit the linguistic variables and fuzzy set membership values used in AR than to obtain the probability distributions used in prior distributions directly from these experts because it better captures their beliefs and better expresses their uncertainties.
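    As a hedged illustration of the hybrid idea (not the authors' elicitation procedure), the sketch below maps fuzzy membership values for control effectiveness onto a gamma prior for accident frequency and then performs a conventional Bayesian update with operating experience; the linguistic terms, mapping, and data are hypothetical.

```python
# Illustrative sketch only: approximate-reasoning (fuzzy) judgments about
# control effectiveness are mapped onto a gamma prior for accident frequency,
# then updated with operating experience (gamma-Poisson conjugacy).
# The membership values, mapping, and data are hypothetical.

# Expert linguistic judgments of control effectiveness (fuzzy memberships)
memberships = {"highly_effective": 0.7, "moderately_effective": 0.3}

# Hypothetical mapping from linguistic terms to prior mean frequency (per year)
term_to_mean = {"highly_effective": 1e-4, "moderately_effective": 1e-3}

# Defuzzify: membership-weighted average of the term means
prior_mean = sum(m * term_to_mean[t] for t, m in memberships.items()) / sum(memberships.values())

# Weak gamma prior with that mean: shape alpha, rate beta (alpha / beta = mean)
alpha, beta = 0.5, 0.5 / prior_mean

# Bayesian update with observed data: n events in T years
n_events, T_years = 0, 20.0
alpha_post, beta_post = alpha + n_events, beta + T_years

posterior_mean = alpha_post / beta_post
print(f"prior mean     = {prior_mean:.2e} /yr")
print(f"posterior mean = {posterior_mean:.2e} /yr")
```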

  10. Methodology for risk analysis based on atmospheric dispersion modelling from nuclear risk sites

    NASA Astrophysics Data System (ADS)

    Baklanov, A.; Mahura, A.; Sørensen, J. H.; Rigina, O.

    2003-04-01

    The main purpose of this multidisciplinary study is to develop a methodology for complex nuclear risk and vulnerability assessment, and to test it on the example of estimating the nuclear risk to the population in the Nordic countries in the case of a severe accident at a nuclear risk site (NRS). The main focus of the paper is the methodology for evaluating the atmospheric transport and deposition of radioactive pollutants from NRSs. The method developed for this evaluation is derived from a probabilistic point of view. The main question we are trying to answer is: What is the probability of radionuclide atmospheric transport and impact on different neighbouring regions and countries in the case of an accident at an NPP? To answer this question we applied a number of different tools: (i) Trajectory Modelling - to calculate multiyear forward trajectories originating over the locations of selected risk sites; (ii) Dispersion Modelling - for long-term simulation and case studies of radionuclide transport from hypothetical accidental releases at NRSs; (iii) Cluster Analysis - to identify atmospheric transport pathways from NRSs; (iv) Probability Fields Analysis - to construct annual, monthly, and seasonal NRS impact indicators identifying the most impacted geographical regions; (v) Specific Case Studies - to estimate consequences for the environment and the populations after a hypothetical accident; (vi) Vulnerability Evaluation to Radioactive Deposition - to describe its persistence in the ecosystems, with a focus on the transfer of certain radionuclides into the food chains of key importance for the intake and exposure of the whole population and of certain population groups; (vii) Risk Evaluation and Mapping - to analyse socio-economic consequences for different geographical areas and various population groups, taking into account social-geophysical factors and probabilities, and using demographic databases based on GIS analysis.
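    A minimal sketch of the probability-field step in the tool chain above: count how often forward trajectories from a risk site terminate over each receptor region and normalize to an impact-probability indicator. The trajectory endpoints and region boxes are fabricated for the example.

```python
# Illustrative sketch of a trajectory-based impact-probability field:
# the fraction of forward trajectories from a nuclear risk site that end
# over each receptor region. Regions and endpoints are made-up examples.

from collections import Counter

# Hypothetical trajectory endpoints (lat, lon) from a multiyear run
trajectory_endpoints = [
    (60.2, 24.9), (59.3, 18.1), (55.7, 12.6), (60.2, 24.9),
    (63.4, 10.4), (59.9, 10.7), (60.2, 24.9), (55.7, 12.6),
]

def region_of(lat, lon):
    """Crude lookup: assign an endpoint to a named receptor region (assumed boxes)."""
    if 59.0 <= lat <= 61.0 and 23.0 <= lon <= 26.0:
        return "Finland (south)"
    if 58.0 <= lat <= 60.5 and 9.0 <= lon <= 19.0:
        return "Sweden/Norway (south)"
    if 54.0 <= lat <= 58.0 and 8.0 <= lon <= 15.0:
        return "Denmark"
    return "other"

counts = Counter(region_of(lat, lon) for lat, lon in trajectory_endpoints)
total = sum(counts.values())
for region, n in counts.most_common():
    print(f"{region}: impact probability ~ {n / total:.2f}")
```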

  11. Approaches to accident analysis in recent US Department of Energy environmental impact statements

    SciTech Connect

    Mueller, C.; Folga, S.; Nabelssi, B.

    1996-12-31

    A review of accident analyses in recent US Department of Energy (DOE) Environmental Impact Statements (EISs) was conducted to evaluate the consistency among approaches and to compare these approaches with existing DOE guidance. The review considered several components of an accident analysis: the overall scope, which in turn should reflect the scope of the EIS; the spectrum of accidents considered; the methods and assumptions used to determine frequencies or frequency ranges for the accident sequences; and the assumption and technical bases for developing radiological and chemical atmospheric source terms and for calculating the consequences of airborne releases. The review also considered the range of results generated with respect to impacts on various worker and general populations. In this paper, the findings of these reviews are presented and methods recommended for improving consistency among EISs and bringing them more into line with existing DOE guidance.

  12. Core melt progression and consequence analysis methodology development in support of the Savannah River Reactor PSA

    SciTech Connect

    O'Kula, K.R.; Sharp, D.A.; Amos, C.N.; Wagner, K.C.; Bradley, D.R.

    1992-01-01

    A three-level Probabilistic Safety Assessment (PSA) of production reactor operation has been underway since 1985 at the US Department of Energy's Savannah River Site (SRS). The goals of this analysis are to: Analyze existing margins of safety provided by the heavy-water reactor (HWR) design challenged by postulated severe accidents; Compare measures of risk to the general public and onsite workers to guideline values, as well as to those posed by commercial reactor operation; and Develop the methodology and database necessary to prioritize improvements to engineering safety systems and components, operator training, and engineering projects that contribute significantly to improving plant safety. PSA technical staff from the Westinghouse Savannah River Company (WSRC) and Science Applications International Corporation (SAIC) have performed the assessment despite two obstacles: A variable baseline plant configuration and power level; and a lack of technically applicable code methodology to model the SRS reactor conditions. This paper discusses the detailed effort necessary to modify the requisite codes before accident analysis insights for the risk assessment were obtained.

  13. Analysis of containment performance and radiological consequences under severe accident conditions for the Advanced Neutron Source Reactor at the Oak Ridge National Laboratory

    SciTech Connect

    Kim, S.H.; Taleyarkhan, R.P.

    1994-01-01

    A severe accident study was conducted to evaluate conservatively scoped source terms and radiological consequences to support the Advanced Neutron Source (ANS) Conceptual Safety Analysis Report (CSAR). Three different types of severe accident scenarios were postulated with a view of evaluating conservatively scoped source terms. The first scenario evaluates maximum possible steaming loads and associated radionuclide transport, whereas the next scenario is geared towards evaluating conservative containment loads from releases of radionuclide vapors and aerosols with associated generation of combustible gases. The third scenario follows the prescriptions given by the 10 CFR 100 guidelines. It was included in the CSAR for demonstrating site-suitability characteristics of the ANS. Various containment configurations are considered for the study of thermal-hydraulic and radiological behaviors of the ANS containment. Severe accident mitigative design features such as the use of rupture disks were accounted for. This report describes the postulated severe accident scenarios, methodology for analysis, modeling assumptions, modeling of several severe accident phenomena, and evaluation of the resulting source term and radiological consequences.

  14. Update of Part 61 Impacts Analysis Methodology. Methodology report. Volume 1

    SciTech Connect

    Oztunali, O.I.; Roles, G.W.

    1986-01-01

    Under contract to the US Nuclear Regulatory Commission, the Envirosphere Company has expanded and updated the impacts analysis methodology used during the development of the 10 CFR Part 61 rule to allow improved consideration of the costs and impacts of treatment and disposal of low-level waste that is close to or exceeds Class C concentrations. The modifications described in this report principally include: (1) an update of the low-level radioactive waste source term, (2) consideration of additional alternative disposal technologies, (3) expansion of the methodology used to calculate disposal costs, (4) consideration of an additional exposure pathway involving direct human contact with disposed waste due to a hypothetical drilling scenario, and (5) use of updated health physics analysis procedures (ICRP-30). Volume 1 of this report describes the calculational algorithms of the updated analysis methodology.

  15. THERMAL ANALYSIS OF A 9975 PACKAGE IN A FACILITY FIRE ACCIDENT

    SciTech Connect

    Gupta, N.

    2011-02-14

    Surplus plutonium bearing materials in the U.S. Department of Energy (DOE) complex are stored in 3013 containers that are designed to meet the requirements of the DOE standard DOE-STD-3013. The 3013 containers are in turn packaged inside 9975 packages that are designed to meet the NRC 10 CFR Part 71 regulatory requirements for transporting Type B fissile materials across the DOE complex. The design requirements for the hypothetical accident conditions (HAC) involving a fire are given in 10 CFR 71.73. The 9975 packages are stored at the DOE Savannah River Site in the K-Area Material Storage (KAMS) facility for long-term storage of up to 50 years. The design requirements for safe storage in the KAMS facility, which contains multiple sources of combustible materials, are far more challenging than the HAC requirements in 10 CFR 71.73. While 10 CFR 71.73 postulates an HAC fire of 1475 F and 30 minutes duration, the facility fire calls for a fire of 1500 F and 86 minutes duration. This paper describes a methodology and the analysis results that meet the design limits of the 9975 components and demonstrate the robustness of the 9975 package.

  16. Methodological Aspects Regarding The Organizational Stress Analysis

    NASA Astrophysics Data System (ADS)

    Irimie, Sabina; Pricope (Muntean), Luminiţa Doina; Pricope, Sorin; Irimie, Sabin Ioan

    2015-07-01

    This work presents research on the methodology of occupational stress analysis in the educational field, as part of a larger study. The objectives of the work are to identify significant relations between stressors and effects, that is, differences in the indicators of occupational stress among teaching staff in primary and gymnasium schools, taking account of each specific condition: the institution as an entity, the working community, the discipline being taught, the geographic and administrative district (urban/rural), and the quantification of the stress level.

  17. Tularosa Basin Play Fairway Analysis: Methodology Flow Charts

    SciTech Connect

    Adam Brandt

    2015-11-15

    These images show the comprehensive methodology used for creation of a Play Fairway Analysis to explore the geothermal resource potential of the Tularosa Basin, New Mexico. The deterministic methodology was originated by the petroleum industry, but was custom-modified to function as a knowledge-based geothermal exploration tool. The stochastic PFA flow chart uses weights of evidence, and is data-driven.

  18. Risk-based Analysis of Construction Accidents in Iran During 2007-2011-Meta Analyze Study

    PubMed Central

    AMIRI, Mehran; ARDESHIR, Abdollah; FAZEL ZARANDI, Mohammad Hossein

    2014-01-01

    Abstract Background The present study aimed to investigate the characteristics of occupational accidents and the frequency and severity of work-related accidents in the construction industry among Iranian insured workers during the years 2007-2011. Methods The Iranian Social Security Organization (ISSO) accident database containing 21,864 cases between the years 2007-2011 was applied in this study. In the next step, the Total Accident Rate (TAR), Total Severity Index (TSI), and Risk Factor (RF) were defined. The core of this work is devoted to analyzing the data from different perspectives such as age of workers, occupation and construction phase, day of the week, time of the day, seasonal analysis, regional considerations, type of accident, and body parts affected. Results Workers between 15-19 years old (TAR=13.4%) are almost six times more exposed to risk of accident than the average of all ages (TAR=2.51%). Laborers and structural workers (TAR=66.6%) and those working at heights (TAR=47.2%) experience more accidents than other groups of workers. Moreover, older workers over 65 years old (TSI=1.97% > average TSI=1.60%), work supervisors (TSI=12.20% > average TSI=9.09%), and night shift workers (TSI=1.89% > average TSI=1.47%) are more prone to severe accidents. Conclusion It is recommended that laborers, young workers, and weekend and night shift workers be supervised more carefully in the workplace. Use of Personal Protective Equipment (PPE) should be compulsory in working environments, and special attention should be paid to people working outdoors and at heights. It is also suggested that policymakers pay more attention to the improvement of safety conditions in deprived and cold western regions. PMID:26005662
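    The relative-exposure statements in the abstract follow from a simple ratio of group TAR to the all-ages TAR; the short sketch below reproduces that comparison using the TAR values quoted above (the ratio formula itself is the only assumption).

```python
# Hedged sketch of the relative-exposure comparison quoted in the abstract:
# a group's Total Accident Rate (TAR) divided by the all-ages TAR gives the
# "times more exposed" factor. TAR values are taken from the abstract.

tar_all_ages = 2.51      # % (from the abstract)
tar_by_group = {
    "15-19 yr": 13.4,            # %
    "laborers/structural": 66.6, # %
    "working at heights": 47.2,  # %
}

for group, tar in tar_by_group.items():
    print(f"{group}: {tar / tar_all_ages:.1f} x the all-ages accident rate")
```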

  19. [Comparative analysis of the radionuclide composition in fallout after the Chernobyl and the Fukushima accidents].

    PubMed

    Kotenko, K V; Shinkarev, S M; Abramov, Iu V; Granovskaia, E O; Iatsenko, V N; Gavrilin, Iu I; Margulis, U Ia; Garetskaia, O S; Imanaka, T; Khoshi, M

    2012-01-01

    The nuclear accident that occurred at the Fukushima Dai-ichi Nuclear Power Plant (NPP) on March 11, 2011, like the accident at the Chernobyl NPP on April 26, 1986, is rated at level 7 of the INES. It is therefore of interest to analyze the radionuclide composition of the fallout following both accidents. The results of spectrometric measurements were used in this comparative analysis. Two areas were considered following the Chernobyl accident: (1) the near zone of the fallout - the Belarusian part of the central spot extending up to 60 km around the Chernobyl NPP, and (2) the far zone of the fallout - the "Gomel-Mogilev" spot centered 200 km to the north-northeast of the damaged reactor. In the case of the Fukushima accident, the near zone up to about 60 km was considered. The comparative analysis was performed with respect to the refractory radionuclides (95Zr, 95Nb, 141Ce, 144Ce), as well as to the intermediate and volatile radionuclides 103Ru, 106Ru, 131I, 134Cs, 137Cs, 140La, and 140Ba, and the results of the comparison are discussed. With respect to exposure of the public, the most important radionuclides are 131I and 137Cs. For both accidents the ratios of 131I/137Cs in the soil samples considered fall in similar ranges: 3-50 for the Chernobyl samples and 5-70 for the Fukushima samples. As for the Chernobyl accident, a clear tendency has been identified for the Fukushima accident: the ratio of 131I/137Cs in the fallout decreases as the 137Cs ground deposition density increases along the trace of the radioactive cloud. This appears to be a universal tendency for the ratio of 131I/137Cs versus the 137Cs ground deposition density in fallout along the trace of a radioactive cloud after a severe accident at an NPP with radionuclide releases into the environment. This tendency is important for an objective reconstruction of 131I fallout based on the results of 137Cs measurements of soil samples carried out at
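    The reported tendency can be illustrated with a small sketch that forms the 131I/137Cs ratio for a set of soil samples and shows it falling as the 137Cs deposition density grows; the sample values are hypothetical, chosen only to reproduce the qualitative behaviour described above.

```python
# Illustrative check of the tendency described above: the 131I/137Cs ratio
# in fallout tends to decrease as the 137Cs ground deposition density
# increases. The soil-sample values are hypothetical.

cs137_deposition = [50.0, 120.0, 300.0, 800.0, 1500.0]          # kBq/m2, assumed
i131_deposition  = [2500.0, 4800.0, 9000.0, 16000.0, 22500.0]   # kBq/m2, assumed

ratios = [i / cs for i, cs in zip(i131_deposition, cs137_deposition)]
for cs, r in zip(cs137_deposition, ratios):
    print(f"137Cs = {cs:7.1f} kBq/m2  ->  131I/137Cs = {r:.1f}")
# The ratio falls from ~50 toward ~15 as deposition density grows, i.e. the
# qualitative tendency reported for both accidents.
```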

  20. Application of Latin hypercube sampling to RADTRAN 4 truck accident risk sensitivity analysis

    SciTech Connect

    Mills, G.S.; Neuhauser, K.S.; Kanipe, F.L.

    1994-12-31

    The sensitivity of calculated dose estimates to various RADTRAN 4 inputs is an available output for incident-free analysis because the defining equations are linear and sensitivity to each variable can be calculated in closed mathematical form. However, the necessary linearity is not characteristic of the equations used in calculation of accident dose risk, making a similar tabulation of sensitivity for RADTRAN 4 accident analysis impossible. Therefore, a study of sensitivity of accident risk results to variation of input parameters was performed using representative routes, isotopic inventories, and packagings. It was determined that, of the approximately two dozen RADTRAN 4 input parameters pertinent to accident analysis, only a subset of five or six has significant influence on typical analyses or is subject to random uncertainties. These five or six variables were selected as candidates for Latin Hypercube Sampling applications. To make the effect of input uncertainties on calculated accident risk more explicit, distributions and limits were determined for two variables which had approximately proportional effects on calculated doses: Pasquill Category probability (PSPROB) and link population density (LPOPD). These distributions and limits were used as input parameters to Sandia's Latin Hypercube Sampling code to generate 50 sets of RADTRAN 4 input parameters used together with point estimates of other necessary inputs to calculate 50 observations of estimated accident dose risk. Tabulations of the RADTRAN 4 accident risk input variables and their influence on output plus illustrative examples of the LHS calculations, for truck transport situations that are typical of past experience, will be presented.
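    A hedged sketch of the sampling step described above, using SciPy's quasi-Monte Carlo module to draw 50 Latin hypercube sets for the two uncertain inputs (PSPROB and LPOPD); the uniform ranges below are placeholders, not the distributions derived in the study.

```python
# Sketch of generating 50 Latin hypercube input sets for two uncertain
# RADTRAN-type inputs. The uniform bounds below are placeholders, not the
# distributions determined in the study.

import numpy as np
from scipy.stats import qmc

sampler = qmc.LatinHypercube(d=2, seed=42)
unit_samples = sampler.random(n=50)            # 50 points in the unit square

# Assumed illustrative ranges: PSPROB (dimensionless fraction), LPOPD (persons/km^2)
lower = [0.05, 10.0]
upper = [0.40, 300.0]
inputs = qmc.scale(unit_samples, lower, upper)

for psprob, lpopd in inputs[:5]:
    print(f"PSPROB = {psprob:.3f}   LPOPD = {lpopd:6.1f} persons/km^2")
# Each row would be combined with point estimates of the remaining inputs
# to produce one of the 50 accident dose-risk observations.
```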

  1. Shuttle TPS thermal performance and analysis methodology

    NASA Technical Reports Server (NTRS)

    Neuenschwander, W. E.; Mcbride, D. U.; Armour, G. A.

    1983-01-01

    Thermal performance of the thermal protection system was approximately as predicted. The only extensive anomalies were filler bar scorching and over-predictions in the high Delta p gap heating regions of the orbiter. A technique to predict filler bar scorching has been developed that can aid in defining a solution. Improvement in high Delta p gap heating methodology is still under study. Minor anomalies were also examined for improvements in modeling techniques and prediction capabilities. These include improved definition of low Delta p gap heating, an analytical model for inner mode line convection heat transfer, better modeling of structure, and inclusion of sneak heating. The limited number of problems related to penetration items that presented themselves during orbital flight tests were resolved expeditiously, and designs were changed and proved successful within the time frame of that program.

  2. Analysis of occupational accidents: prevention through the use of additional technical safety measures for machinery

    PubMed Central

    Dźwiarek, Marek; Latała, Agata

    2016-01-01

    This article presents an analysis of results of 1035 serious and 341 minor accidents recorded by Poland's National Labour Inspectorate (PIP) in 2005–2011, in view of their prevention by means of additional safety measures applied by machinery users. Since the analysis aimed at formulating principles for the application of technical safety measures, the analysed accidents should bear additional attributes: the type of machine operation, technical safety measures and the type of events causing injuries. The analysis proved that the executed tasks and injury-causing events were closely connected and there was a relation between casualty events and technical safety measures. In the case of tasks consisting of manual feeding and collecting materials, the injuries usually occur because of the rotating motion of tools or crushing due to a closing motion. Numerous accidents also happened in the course of supporting actions, like removing pollutants, correcting material position, cleaning, etc. PMID:26652689

  3. DOE 2009 Geothermal Risk Analysis: Methodology and Results (Presentation)

    SciTech Connect

    Young, K. R.; Augustine, C.; Anderson, A.

    2010-02-01

    This presentation summarizes the methodology and results for a probabilistic risk analysis of research, development, and demonstration work-primarily for enhanced geothermal systems (EGS)-sponsored by the U.S. Department of Energy Geothermal Technologies Program.

  4. Analysis of accident sequences and source terms at treatment and storage facilities for waste generated by US Department of Energy waste management operations

    SciTech Connect

    Mueller, C.; Nabelssi, B.; Roglans-Ribas, J.; Folga, S.; Policastro, A.; Freeman, W.; Jackson, R.; Mishima, J.; Turner, S.

    1996-12-01

    This report documents the methodology, computational framework, and results of facility accident analyses performed for the US Department of Energy (DOE) Waste Management Programmatic Environmental Impact Statement (WM PEIS). The accident sequences potentially important to human health risk are specified, their frequencies assessed, and the resultant radiological and chemical source terms evaluated. A personal-computer-based computational framework and database have been developed that provide these results as input to the WM PEIS for the calculation of human health risk impacts. The WM PEIS addresses management of five waste streams in the DOE complex: low-level waste (LLW), hazardous waste (HW), high-level waste (HLW), low-level mixed waste (LLMW), and transuranic waste (TRUW). Currently projected waste generation rates, storage inventories, and treatment process throughputs have been calculated for each of the waste streams. This report summarizes the accident analyses and aggregates the key results for each of the waste streams. Source terms are estimated, and results are presented for each of the major DOE sites and facilities by WM PEIS alternative for each waste stream. Key assumptions in the development of the source terms are identified. The appendices identify the potential atmospheric release of each toxic chemical or radionuclide for each accident scenario studied. They also discuss specific accident analysis data and guidance used or consulted in this report.

  5. MELCOR code analysis of a severe accident LOCA at Peach Bottom Plant

    SciTech Connect

    Carbajo, J.J.

    1993-01-01

    A design-basis loss-of-coolant accident (LOCA) concurrent with complete loss of the emergency core cooling systems (ECCSs) has been analyzed for the Peach Bottom atomic station unit 2 using the MELCOR code, version 1.8.1. The purpose of this analysis is to calculate best-estimate times for the important events of this accident sequence and best-estimate source terms. Calculated pressures and temperatures at the beginning of the transient have been compared to results from the Peach Bottom final safety analysis report (FSAR). MELCOR-calculated source terms have been compared to source terms reported in the NUREG-1465 draft.

  6. RELAP5 Application to Accident Analysis of the NIST Research Reactor

    SciTech Connect

    Baek, J.; Cuadra Gascon, A.; Cheng, L.Y.; Diamond, D.

    2012-03-18

    Detailed safety analyses have been performed for the 20 MW D2O-moderated research reactor (NBSR) at the National Institute of Standards and Technology (NIST). The time-dependent analysis of the primary system is determined with a RELAP5 transient analysis model that includes the reactor vessel, the pump, heat exchanger, fuel element geometry, and flow channels for both the six inner and twenty-four outer fuel elements. A post-processing of the simulation results has been conducted to evaluate the minimum critical heat flux ratio (CHFR) using the Sudo-Kaminaga correlation. Evaluations are performed for the following accidents: (1) the control rod withdrawal startup accident and (2) the maximum reactivity insertion accident. In both cases the RELAP5 results indicate that there is adequate margin to CHF and no damage to the fuel will occur because of sufficient coolant flow through the fuel channels and the negative scram reactivity insertion.
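    The CHFR post-processing step can be sketched as follows: at each axial node the predicted critical heat flux is divided by the local heat flux, and the channel minimum is reported. The placeholder CHF function below merely stands in for the Sudo-Kaminaga correlation actually used; all numbers are assumptions.

```python
# Post-processing sketch: minimum critical heat flux ratio (CHFR) along a
# channel. The critical heat flux function here is a flat placeholder, NOT
# the Sudo-Kaminaga correlation used in the actual analysis.

local_heat_flux = [0.4e6, 0.9e6, 1.3e6, 1.1e6, 0.6e6]   # W/m^2, assumed axial profile

def critical_heat_flux(node_index):
    """Placeholder CHF value per node (W/m^2); a real analysis would evaluate
    the Sudo-Kaminaga correlation from local flow conditions."""
    return 3.0e6

chfr_profile = [critical_heat_flux(i) / q for i, q in enumerate(local_heat_flux)]
min_chfr = min(chfr_profile)
print(f"minimum CHFR = {min_chfr:.2f}")   # adequate margin if above the required limit
```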

  7. Accident analysis for transuranic waste management alternatives in the U.S. Department of Energy waste management program

    SciTech Connect

    Nabelssi, B.; Mueller, C.; Roglans-Ribas, J.; Folga, S.; Tompkins, M.; Jackson, R.

    1995-03-01

    Preliminary accident analyses and radiological source term evaluations have been conducted for transuranic waste (TRUW) as part of the US Department of Energy (DOE) effort to manage storage, treatment, and disposal of radioactive wastes at its various sites. The approach to assessing radiological releases from facility accidents was developed in support of the Office of Environmental Management Programmatic Environmental Impact Statement (EM PEIS). The methodology developed in this work is in accordance with the latest DOE guidelines, which consider the spectrum of possible accident scenarios in the implementation of various actions evaluated in an EIS. The radiological releases from potential risk-dominant accidents in storage and treatment facilities considered in the EM PEIS TRUW alternatives are described in this paper. The results show that significant releases can be predicted for only the most severe and extremely improbable accident sequences.

  8. Radiochemical Analysis Methodology for uranium Depletion Measurements

    SciTech Connect

    Scatena-Wachel DE

    2007-01-09

    This report provides sufficient material for a test sponsor with little or no radiochemistry background to understand and follow physics irradiation test program execution. Most irradiation test programs employ similar techniques and the general details provided here can be applied to the analysis of other irradiated sample types. Aspects of program management directly affecting analysis quality are also provided. This report is not an in-depth treatise on the vast field of radiochemical analysis techniques and related topics such as quality control. Instrumental technology is a very fast growing field and dramatic improvements are made each year, thus the instrumentation described in this report is no longer cutting edge technology. Much of the background material is still applicable and useful for the analysis of older experiments and also for subcontractors who still retain the older instrumentation.

  9. Accident analysis of railway transportation of low-level radioactive and hazardous chemical wastes: Application of the ''Maximum Credible Accident'' concept

    SciTech Connect

    Ricci, E.; McLean, R.B.

    1988-09-01

    The maximum credible accident (MCA) approach to accident analysis places an upper bound on the potential adverse effects of a proposed action by using conservative but simplifying assumptions. It is often used when data are lacking to support a more realistic scenario or when MCA calculations result in acceptable consequences. The MCA approach can also be combined with realistic scenarios to assess potential adverse effects. This report presents a guide for the preparation of transportation accident analyses based on the use of the MCA concept. Rail transportation of contaminated wastes is used as an example. The example is the analysis of the environmental impact of the potential derailment of a train transporting a large shipment of wastes. The shipment is assumed to be contaminated with polychlorinated biphenyls and low-level radioactivities of uranium and technetium. The train is assumed to plunge into a river used as a source of drinking water. The conclusions from the example accident analysis are based on the calculation of the number of foreseeable premature cancer deaths that might result as a consequence of this accident. These calculations are presented, and the reference material forming the basis for all assumptions and calculations is also provided.
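    The kind of bounding consequence estimate implied by the MCA concept can be sketched as a chain from a conservative drinking-water concentration to a collective dose and then to expected premature cancer deaths via a linear risk coefficient; every number in the sketch below is a placeholder, not a value from the report.

```python
# Hedged sketch of a bounding (MCA-style) consequence calculation: a
# conservatively estimated drinking-water concentration is converted to a
# collective dose and then to expected premature cancer deaths with a
# linear risk coefficient. Every number below is a placeholder.

concentration_bq_per_l = 5.0          # radionuclide concentration in river water (assumed)
intake_l_per_day = 2.0                # drinking-water intake per person
exposure_days = 30.0                  # time until the contamination disperses (assumed)
dose_coeff_sv_per_bq = 7.0e-10        # ingestion dose coefficient (placeholder value)
population = 50000                    # people served by the water supply (assumed)
risk_per_sv = 5.0e-2                  # fatal cancer risk per sievert (ICRP-style nominal value)

per_person_dose_sv = concentration_bq_per_l * intake_l_per_day * exposure_days * dose_coeff_sv_per_bq
collective_dose_person_sv = per_person_dose_sv * population
expected_cancer_deaths = collective_dose_person_sv * risk_per_sv

print(f"per-person dose        ~ {per_person_dose_sv:.2e} Sv")
print(f"collective dose        ~ {collective_dose_person_sv:.2e} person-Sv")
print(f"expected cancer deaths ~ {expected_cancer_deaths:.2e}")
```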

  10. Finite Element Analysis Applied to Dentoalveolar Trauma: Methodology Description

    PubMed Central

    da Silva, B. R.; Moreira Neto, J. J. S.; da Silva, F. I.; de Aguiar, A. S. W.

    2011-01-01

    Dentoalveolar traumatic injuries are among the clinical conditions most frequently treated in dental practice. However, few studies so far have addressed the biomechanical aspects of these events, probably as a result of difficulties in carrying out satisfactory experimental and clinical studies as well as the unavailability of truly scientific methodologies. The aim of this paper was to describe the use of finite element analysis applied to the biomechanical evaluation of dentoalveolar trauma. For didactic purposes, the methodological process was divided into steps that go from the creation of a geometric model to the evaluation of final results, always with a focus on methodological characteristics, advantages, and disadvantages, so as to allow the reader to customize the methodology according to specific needs. Our description shows that the finite element method can faithfully reproduce dentoalveolar trauma, provided the methodology is closely followed and thoroughly evaluated. PMID:21991463

  11. CONTAINMENT ANALYSIS METHODOLOGY FOR TRANSPORT OF BREACHED CLAD ALUMINUM SPENT FUEL

    SciTech Connect

    Vinson, D.

    2010-07-11

    Aluminum-clad, aluminum-based spent nuclear fuel (Al-SNF) from foreign and domestic research reactors (FRR/DRR) is being shipped to the Savannah River Site and placed in interim storage in a water basin. To enter the United States, a cask with loaded fuel must be certified to comply with the requirements in the Title 10 of the U.S. Code of Federal Regulations, Part 71. The requirements include demonstration of containment of the cask with its contents under normal and accident conditions. Many Al-SNF assemblies have suffered corrosion degradation in storage in poor quality water, and many of the fuel assemblies are 'failed' or have through-clad damage. A methodology was developed to evaluate containment of Al-SNF even with severe cladding breaches for transport in standard casks. The containment analysis methodology for Al-SNF is in accordance with the methodology provided in ANSI N14.5 and adopted by the U. S. Nuclear Regulatory Commission in NUREG/CR-6487 to meet the requirements of 10CFR71. The technical bases for the inputs and assumptions are specific to the attributes and characteristics of Al-SNF received from basin and dry storage systems and its subsequent performance under normal and postulated accident shipping conditions. The results of the calculations for a specific case of a cask loaded with breached fuel show that the fuel can be transported in standard shipping casks and maintained within the allowable release rates under normal and accident conditions. A sensitivity analysis has been conducted to evaluate the effects of modifying assumptions and to assess options for fuel at conditions that are not bounded by the present analysis. These options would include one or more of the following: reduce the fuel loading; increase fuel cooling time; reduce the degree of conservatism in the bounding assumptions; or measure the actual leak rate of the cask system. That is, containment analysis for alternative inputs at fuel-specific conditions and at cask
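    A hedged sketch of the ANSI N14.5-style containment criterion referred to above: the allowable volumetric leak rate is the regulatory activity release-rate limit divided by the activity concentration of the releasable medium. The limits are written in the commonly cited 10 CFR 71.51 form (A2 x 1e-6 per hour for normal transport, A2 per week for accident conditions); the A2 value and concentration are placeholders, not results from this analysis.

```python
# Hedged sketch of an ANSI N14.5-style allowable leak rate: regulatory
# release-rate limit divided by the activity concentration of the releasable
# medium in the cask cavity. A2 and the concentration are placeholders.

A2_TBq = 1.0                      # A2 value of the governing nuclide mixture (placeholder)

# Release-rate limits in the commonly cited 10 CFR 71.51 form, per second:
R_normal_TBq_per_s = A2_TBq * 1.0e-6 / 3600.0       # A2 x 1e-6 per hour
R_accident_TBq_per_s = A2_TBq / (7 * 24 * 3600.0)   # A2 per week

releasable_conc_TBq_per_cm3 = 1.0e-7   # activity per unit volume of cavity gas (assumed)

L_normal = R_normal_TBq_per_s / releasable_conc_TBq_per_cm3       # cm^3/s
L_accident = R_accident_TBq_per_s / releasable_conc_TBq_per_cm3   # cm^3/s

print(f"allowable leak rate, normal conditions:   {L_normal:.2e} cm^3/s")
print(f"allowable leak rate, accident conditions: {L_accident:.2e} cm^3/s")
```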

  12. Analysis of Alternatives for Risk Assessment Methodologies and Tools

    SciTech Connect

    Nachtigal, Noel M.; Fruetel, Julia A.; Gleason, Nathaniel J.; Helms, Jovana; Imbro, Dennis Raymond; Sumner, Matthew C.

    2013-10-01

    The purpose of this document is to provide a basic overview and understanding of risk assessment methodologies and tools from the literature and to assess the suitability of these methodologies and tools for cyber risk assessment. Sandia National Laboratories (SNL) performed this review in support of risk modeling activities performed for the Stakeholder Engagement and Cyber Infrastructure Resilience (SECIR) division of the Department of Homeland Security (DHS) Office of Cybersecurity and Communications (CS&C). The set of methodologies and tools covered in this document is not intended to be exhaustive; instead, it focuses on those that are commonly used in the risk assessment community. The classification of methodologies and tools was performed by a group of analysts with experience in risk analysis and cybersecurity, and the resulting analysis of alternatives has been tailored to address the needs of a cyber risk assessment.

  13. Analysis and methodology for aeronautical systems technology program planning

    NASA Technical Reports Server (NTRS)

    White, M. J.; Gershkoff, I.; Lamkin, S.

    1983-01-01

    A structured methodology was developed that allows the generation, analysis, and rank-ordering of system concepts by their benefits and costs, indicating the preferred order of implementation. The methodology is supported by a base of data on civil transport aircraft fleet growth projections and data on aircraft performance relating the contribution of each element of the aircraft to overall performance. The performance data are used to assess the benefits of proposed concepts. The methodology includes a computer program for performing the calculations needed to rank-order the concepts and compute their cumulative benefit-to-cost ratio. The use of the methodology and supporting data is illustrated through the analysis of actual system concepts from various sources.

  14. Impact of spatial kinetics in severe accident analysis for a large HWR

    SciTech Connect

    Morris, E.E.

    1994-03-01

    The impact of spatial kinetics on the analysis of severe accidents initiated by the unprotected withdrawal of one or more control rods is investigated for a large heavy water reactor. Large inter- and intra-assembly power shifts are observed, and the importance of detailed geometrical modeling of fuel assemblies is demonstrated. Neglect of space-time effects is shown to lead to erroneous estimates of safety margins, and of accident consequences in the event safety margins are exceeded. The results and conclusions are typical of what would be expected for any large, loosely coupled core.

  15. Analysis of fission product release behavior during the TMI-2 accident

    SciTech Connect

    Petti, D. A.; Adams, J. P.; Anderson, J. L.; Hobbins, R. R.

    1987-01-01

    An analysis of fission product release during the Three Mile Island Unit 2 (TMI-2) accident has been initiated to provide an understanding of fission product behavior that is consistent with both the best estimate accident scenario and fission product results from the ongoing sample acquisition and examination efforts. ''First principles'' fission product release models are used to describe release from intact, disrupted, and molten fuel. Conclusions relating to fission product release, transport, and chemical form are drawn. 35 refs., 12 figs., 7 tabs.

  16. Global-local methodologies and their application to nonlinear analysis

    NASA Technical Reports Server (NTRS)

    Noor, Ahmed K.

    1989-01-01

    An assessment is made of the potential of different global-local analysis strategies for predicting the nonlinear and postbuckling responses of structures. Two postbuckling problems of composite panels are used as benchmarks and the application of different global-local methodologies to these benchmarks is outlined. The key elements of each of the global-local strategies are discussed and future research areas needed to realize the full potential of global-local methodologies are identified.

  17. On image analysis in fractography (Methodological Notes)

    NASA Astrophysics Data System (ADS)

    Shtremel', M. A.

    2015-10-01

    As in other spheres of image analysis, fractography has no universal method for information convolution. An effective characteristic of an image is found by analyzing the essence and origin of every class of objects. As follows from the geometric definition of a fractal curve, its projection onto any straight line covers a certain segment many times; therefore, neither a time series (a one-valued function of time) nor an image (a one-valued function of the plane) can be a fractal. For applications, multidimensional multiscale characteristics of an image are necessary. "Full" wavelet series break the law of conservation of information.

  18. Final safety analysis report for the Galileo Mission: Volume 2, Book 2: Accident model document: Appendices

    SciTech Connect

    Not Available

    1988-12-15

    This section of the Accident Model Document (AMD) presents the appendices which describe the various analyses that have been conducted for use in the Galileo Final Safety Analysis Report II, Volume II. Included in these appendices are the approaches, techniques, conditions and assumptions used in the development of the analytical models plus the detailed results of the analyses. Also included in these appendices are summaries of the accidents and their associated probabilities and environment models taken from the Shuttle Data Book (NSTS-08116), plus summaries of the several segments of the recent GPHS safety test program. The information presented in these appendices is used in Section 3.0 of the AMD to develop the Failure/Abort Sequence Trees (FASTs) and to determine the fuel releases (source terms) resulting from the potential Space Shuttle/IUS accidents throughout the missions.

  19. The accident evolution and barrier function (AEB) model applied to incident analysis in the processing industries.

    PubMed

    Svenson, O

    1991-09-01

    This study develops a theoretical model for accident evolutions and how they can be arrested. The model describes the interaction between technical and human-organizational systems which may lead to an accident. The analytic tool provided by the model gives equal weight to both these types of systems and necessitates simultaneous and interactive accident analysis by engineers and human factors specialists. It can be used in predictive safety analyses as well as in post hoc incident analyses. To illustrate this, the AEB model is applied to an incident reported by the nuclear industry in Sweden. In general, application of the model will indicate where and how safety can be improved, and it also raises questions about issues such as the cost, feasibility, and effectiveness of different ways of increasing safety.

  20. A methodology for probabilistic fault displacement hazard analysis (PFDHA)

    USGS Publications Warehouse

    Youngs, R.R.; Arabasz, W.J.; Anderson, R.E.; Ramelli, A.R.; Ake, J.P.; Slemmons, D.B.; McCalpin, J.P.; Doser, D.I.; Fridrich, C.J.; Swan, F. H.; Rogers, A.M.; Yount, J.C.; Anderson, L.W.; Smith, K.D.; Bruhn, R.L.; Knuepfer, P.L.K.; Smith, R.B.; DePolo, C.M.; O'Leary, D. W.; Coppersmith, K.J.; Pezzopane, S.K.; Schwartz, D.P.; Whitney, J.W.; Olig, S.S.; Toro, G.R.

    2003-01-01

    We present a methodology for conducting a site-specific probabilistic analysis of fault displacement hazard. Two approaches are outlined. The first relates the occurrence of fault displacement at or near the ground surface to the occurrence of earthquakes in the same manner as is done in a standard probabilistic seismic hazard analysis (PSHA) for ground shaking. The methodology for this approach is taken directly from PSHA methodology with the ground-motion attenuation function replaced by a fault displacement attenuation function. In the second approach, the rate of displacement events and the distribution for fault displacement are derived directly from the characteristics of the faults or geologic features at the site of interest. The methodology for probabilistic fault displacement hazard analysis (PFDHA) was developed for a normal faulting environment and the probability distributions we present may have general application in similar tectonic regions. In addition, the general methodology is applicable to any region and we indicate the type of data needed to apply the methodology elsewhere.
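    For the earthquake approach, the hazard takes the same form as the standard PSHA integral with a displacement term in place of the ground-motion attenuation function. The schematic expression below uses generic notation and is not quoted from the paper:

```latex
% Schematic earthquake-approach PFDHA hazard integral (generic notation)
\nu(d > d_0) = \sum_{i} \lambda_i \int_{m} \int_{r}
  f_{M_i}(m)\, f_{R_i}(r \mid m)\,
  P\!\left(D > d_0 \mid m, r\right)\, dr\, dm
```

    Here lambda_i is the rate of earthquakes on source i, f_M and f_R are the magnitude and distance densities, and P(D > d_0 | m, r) combines the probability that surface displacement occurs at the site with the conditional probability that the displacement exceeds d_0.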

  1. Preliminary Analysis of Aircraft Loss of Control Accidents: Worst Case Precursor Combinations and Temporal Sequencing

    NASA Technical Reports Server (NTRS)

    Belcastro, Christine M.; Groff, Loren; Newman, Richard L.; Foster, John V.; Crider, Dennis H.; Klyde, David H.; Huston, A. McCall

    2014-01-01

    Aircraft loss of control (LOC) is a leading cause of fatal accidents across all transport airplane and operational classes, and can result from a wide spectrum of hazards, often occurring in combination. Technologies developed for LOC prevention and recovery must therefore be effective under a wide variety of conditions and uncertainties, including multiple hazards, and their validation must provide a means of assessing system effectiveness and coverage of these hazards. This requires the definition of a comprehensive set of LOC test scenarios based on accident and incident data as well as future risks. This paper defines a comprehensive set of accidents and incidents over a recent 15 year period, and presents preliminary analysis results to identify worst-case combinations of causal and contributing factors (i.e., accident precursors) and how they sequence in time. Such analyses can provide insight in developing effective solutions for LOC, and form the basis for developing test scenarios that can be used in evaluating them. Preliminary findings based on the results of this paper indicate that system failures or malfunctions, crew actions or inactions, vehicle impairment conditions, and vehicle upsets contributed the most to accidents and fatalities, followed by inclement weather or atmospheric disturbances and poor visibility. Follow-on research will include finalizing the analysis through a team consensus process, defining future risks, and developing a comprehensive set of test scenarios with correlation to the accidents, incidents, and future risks. Since enhanced engineering simulations are required for batch and piloted evaluations under realistic LOC precursor conditions, these test scenarios can also serve as a high-level requirement for defining the engineering simulation enhancements needed for generating them.
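    One way to make the worst-case precursor combination idea concrete is a simple tally of factor pairs over coded accident records, as in the sketch below; the records and factor labels are fabricated and do not come from the study data.

```python
# Sketch of tallying worst-case combinations of causal/contributing factors
# from coded accident records. The records below are fabricated examples,
# not data from the study.

from collections import Counter
from itertools import combinations

accidents = [
    {"system_failure", "crew_action", "vehicle_upset"},
    {"icing", "crew_action", "vehicle_upset"},
    {"system_failure", "poor_visibility"},
    {"crew_action", "vehicle_upset"},
]

pair_counts = Counter()
for factors in accidents:
    for pair in combinations(sorted(factors), 2):
        pair_counts[pair] += 1

for pair, count in pair_counts.most_common(3):
    print(f"{pair[0]} + {pair[1]}: {count} accidents")
```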

  2. NMR methodologies in the analysis of blueberries.

    PubMed

    Capitani, Donatella; Sobolev, Anatoly P; Delfini, Maurizio; Vista, Silvia; Antiochia, Riccarda; Proietti, Noemi; Bubici, Salvatore; Ferrante, Gianni; Carradori, Simone; De Salvador, Flavio Roberto; Mannina, Luisa

    2014-06-01

    An NMR analytical protocol based on complementary high and low field measurements is proposed for blueberry characterization. Untargeted NMR metabolite profiling of blueberry aqueous and organic extracts as well as targeted NMR analysis focused on anthocyanins and other phenols are reported. Bligh-Dyer and microwave-assisted extractions were carried out and compared, showing a better recovery of the lipidic fraction in the case of the microwave procedure. Water-soluble metabolites belonging to different classes such as sugars, amino acids, organic acids, and phenolic compounds, as well as metabolites soluble in organic solvent such as triglycerides, sterols, and fatty acids, were identified. Five anthocyanins (malvidin-3-glucoside, malvidin-3-galactoside, delphinidin-3-glucoside, delphinidin-3-galactoside, and petunidin-3-glucoside) and 3-O-α-l-rhamnopyranosyl quercetin were identified in the solid phase extract. The water status of fresh and withered blueberries was monitored by portable NMR and fast-field cycling NMR. 1H depth profiles, T2 transverse relaxation times, and dispersion profiles were found to be sensitive to withering.

  3. Advanced Power Plant Development and Analysis Methodologies

    SciTech Connect

    A.D. Rao; G.S. Samuelsen; F.L. Robson; B. Washom; S.G. Berenyi

    2006-06-30

    Under the sponsorship of the U.S. Department of Energy/National Energy Technology Laboratory, a multi-disciplinary team led by the Advanced Power and Energy Program of the University of California at Irvine is defining the system engineering issues associated with the integration of key components and subsystems into advanced power plant systems with goals of achieving high efficiency and minimized environmental impact while using fossil fuels. These power plant concepts include 'Zero Emission' power plants and the 'FutureGen' H2 co-production facilities. The study is broken down into three phases. Phase 1 of this study consisted of utilizing advanced technologies that are expected to be available in the 'Vision 21' time frame such as mega scale fuel cell based hybrids. Phase 2 includes current state-of-the-art technologies and those expected to be deployed in the nearer term such as advanced gas turbines and high temperature membranes for separating gas species and advanced gasifier concepts. Phase 3 includes identification of gas turbine based cycles and engine configurations suitable to coal-based gasification applications and the conceptualization of the balance of plant technology, heat integration, and the bottoming cycle for analysis in a future study. Also included in Phase 3 is the task of acquiring/providing turbo-machinery in order to gather turbo-charger performance data that may be used to verify simulation models as well as establishing system design constraints. The results of these various investigations will serve as a guide for the U. S. Department of Energy in identifying the research areas and technologies that warrant further support.

  4. Utilization of accident databases and fuzzy sets to estimate frequency of HazMat transport accidents.

    PubMed

    Qiao, Yuanhua; Keren, Nir; Mannan, M Sam

    2009-08-15

    Risk assessment and management of transportation of hazardous materials (HazMat) require the estimation of accident frequency. This paper presents a methodology to estimate hazardous materials transportation accident frequency by utilizing publicly available databases and expert knowledge. The estimation process addresses route-dependent and route-independent variables. Negative binomial regression is applied to an analysis of the Department of Public Safety (DPS) accident database to derive basic accident frequency as a function of route-dependent variables, while the effects of route-independent variables are modeled by fuzzy logic. The integrated methodology provides the basis for an overall transportation risk analysis, which can be used later to develop a decision support system.
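
    As an illustration of the two modeling steps named above, the minimal Python sketch below fits a negative binomial regression to synthetic segment-level accident counts and then applies a crisp per-segment multiplier standing in for the paper's fuzzy-logic adjustment. The covariates, coefficients, and adjustment are illustrative assumptions, not the DPS data or the authors' rule base.

    ```python
    import numpy as np
    import statsmodels.api as sm

    rng = np.random.default_rng(0)
    n_segments = 500

    # Hypothetical route-dependent covariates (stand-ins for the DPS variables)
    length_mi = rng.uniform(1.0, 20.0, n_segments)   # segment length, miles
    aadt_k = rng.uniform(5.0, 80.0, n_segments)      # traffic volume, thousands of vehicles/day

    X = sm.add_constant(np.column_stack([np.log(length_mi), np.log(aadt_k)]))
    true_mu = np.exp(-4.0 + 1.0 * np.log(length_mi) + 0.6 * np.log(aadt_k))
    counts = rng.negative_binomial(2.0, 2.0 / (2.0 + true_mu))  # synthetic accident counts

    # Basic (route-dependent) accident frequency from negative binomial regression
    nb_fit = sm.GLM(counts, X, family=sm.families.NegativeBinomial(alpha=0.5)).fit()
    base_freq = nb_fit.predict(X)

    # Route-independent effects: the paper's fuzzy rule base is collapsed here to a
    # single crisp multiplier per segment, purely for illustration.
    adjustment = np.clip(rng.normal(1.0, 0.1, n_segments), 0.7, 1.3)
    adjusted_freq = base_freq * adjustment

    print(nb_fit.params)      # fitted regression coefficients
    print(adjusted_freq[:5])  # accidents per segment per analysis period
    ```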

  6. SAMPSON Parallel Computation for Sensitivity Analysis of TEPCO's Fukushima Daiichi Nuclear Power Plant Accident

    NASA Astrophysics Data System (ADS)

    Pellegrini, M.; Bautista Gomez, L.; Maruyama, N.; Naitoh, M.; Matsuoka, S.; Cappello, F.

    2014-06-01

    On March 11th, 2011, a high-magnitude earthquake and consequent tsunami struck the east coast of Japan, resulting in a nuclear accident unprecedented in duration and extent. After scram at all power stations affected by the earthquake, diesel generators began operating as designed until tsunami waves reached the power plants located on the east coast. This had a catastrophic impact on the availability of plant safety systems at TEPCO's Fukushima Daiichi, leading to station blackout conditions in units 1 to 3. In this article the accident scenario is studied with the SAMPSON code. SAMPSON is a severe accident computer code composed of hierarchical modules that account for the diverse physics involved in the various phases of the accident evolution. A preliminary parallelization analysis of the code was performed using state-of-the-art tools, and we demonstrate how this work can benefit nuclear safety analysis. This paper shows that inter-module parallelization can reduce the time to solution by more than 20%. Furthermore, the parallel code was applied to a sensitivity study of alternative water injection into TEPCO's Fukushima Daiichi unit 3. Results show that the core melting progression is extremely sensitive to the amount and timing of water injection, resulting in a high probability of partial core melting for unit 3.
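
    To put the reported speedup in context, a back-of-the-envelope Amdahl's-law estimate (Python) is shown below; the 30% parallelizable fraction and the four workers are illustrative assumptions, not figures from the SAMPSON study.

    ```python
    def amdahl_speedup(parallel_fraction, workers):
        """Ideal speedup when only `parallel_fraction` of the runtime parallelizes."""
        return 1.0 / ((1.0 - parallel_fraction) + parallel_fraction / workers)

    # If roughly 30% of the module work can run concurrently on 4 workers, the ideal
    # runtime reduction is about 22%, the same order as the >20% reported above.
    s = amdahl_speedup(0.30, 4)
    print(f"speedup = {s:.2f}, runtime reduction = {1 - 1 / s:.1%}")
    ```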

  7. Formal Analysis of an Airplane Accident in NΣ-Labeled Calculus

    NASA Astrophysics Data System (ADS)

    Mizutani, Tetsuya; Igarashi, Shigeru; Ikeda, Yasuwo; Shio, Masayuki

    NΣ-labeled calculus is a formal system for the representation, verification, and analysis of time-concerned recognition, knowledge, belief, and decision of humans or computer programs, together with related external physical or logical phenomena. In this paper, a formal verification and analysis of the JAL near-miss accident is presented as an example of cooperating systems controlling continuously changing objects, including human factors such as misunderstanding or incorrect recognition.

  8. Uncertainty analysis of preclosure accident doses for the Yucca Mountain repository

    SciTech Connect

    Ma, C.W.; Miller, D.D.; Zavoshy, S.J.; Jardine, L.J.

    1990-12-31

    This study presents a generic methodology that can be used to evaluate the uncertainty in the calculated accidental offsite doses at the Yucca Mountain repository during the preclosure period. For demonstration purposes, this methodology is applied to two specific accident scenarios: the first involves a crane dropping an open container with consolidated fuel rods; the second involves container failure during emplacement or removal operations. The uncertainties of thirteen parameters are quantified by various types of probability distributions. The Latin Hypercube Sampling method is used to evaluate the uncertainty of the offsite dose. For the crane-drop scenario with concurrent filter failure, the doses due to the release of airborne fuel particles are calculated to be 0.019, 0.32, and 2.8 rem at confidence levels of 10%, 50%, and 90%, respectively. For the container failure scenario with concurrent filter failure, the 90% confidence-level dose is 0.21 rem. 20 refs., 4 figs., 3 tabs.
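
    As an illustration of how Latin Hypercube Sampling turns parameter uncertainty into confidence-level doses, the Python sketch below propagates a hypothetical three-parameter dose model; the distributions, inventory, and intake fraction are assumed for illustration and are not the thirteen parameters of the Yucca Mountain analysis.

    ```python
    import numpy as np
    from scipy.stats import qmc, lognorm

    # Hypothetical model: dose = inventory * release fraction * respirable fraction
    #                            * intake fraction * dose conversion factor
    sampler = qmc.LatinHypercube(d=3, seed=1)
    u = sampler.random(n=1000)                            # stratified uniforms on [0, 1)

    release = 1e-4 + u[:, 0] * (1e-2 - 1e-4)              # airborne release fraction
    respirable = 0.01 + u[:, 1] * (0.10 - 0.01)           # respirable fraction
    dose_factor = lognorm(s=0.8, scale=0.5).ppf(u[:, 2])  # rem per unit intake (illustrative)

    inventory = 1.0e3    # arbitrary source term (activity units)
    intake = 1.0e-4      # arbitrary intake fraction at the site boundary

    dose = inventory * release * respirable * intake * dose_factor
    print(np.percentile(dose, [10, 50, 90]))              # 10%, 50%, 90% confidence-level doses
    ```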

  9. Analysis of accident sequences and source terms at waste treatment and storage facilities for waste generated by U.S. Department of Energy Waste Management Operations, Volume 1: Sections 1-9

    SciTech Connect

    Mueller, C.; Nabelssi, B.; Roglans-Ribas, J.

    1995-04-01

    This report documents the methodology, computational framework, and results of facility accident analyses performed for the U.S. Department of Energy (DOE) Waste Management Programmatic Environmental Impact Statement (WM PEIS). The accident sequences potentially important to human health risk are specified, their frequencies are assessed, and the resultant radiological and chemical source terms are evaluated. A personal computer-based computational framework and database have been developed that provide these results as input to the WM PEIS for calculation of human health risk impacts. The methodology is in compliance with the most recent guidance from DOE. It considers the spectrum of accident sequences that could occur in activities covered by the WM PEIS and uses a graded approach emphasizing the risk-dominant scenarios to facilitate discrimination among the various WM PEIS alternatives. Although it allows reasonable estimates of the risk impacts associated with each alternative, the main goal of the accident analysis methodology is to allow reliable estimates of the relative risks among the alternatives. The WM PEIS addresses management of five waste streams in the DOE complex: low-level waste (LLW), hazardous waste (HW), high-level waste (HLW), low-level mixed waste (LLMW), and transuranic waste (TRUW). Currently projected waste generation rates, storage inventories, and treatment process throughputs have been calculated for each of the waste streams. This report summarizes the accident analyses and aggregates the key results for each of the waste streams. Source terms are estimated and results are presented for each of the major DOE sites and facilities by WM PEIS alternative for each waste stream. The appendices identify the potential atmospheric release of each toxic chemical or radionuclide for each accident scenario studied. They also provide discussion of specific accident analysis data and guidance used or consulted in this report.

  10. Uncertainty and sensitivity analysis of food pathway results with the MACCS Reactor Accident Consequence Model

    SciTech Connect

    Helton, J.C.; Johnson, J.D.; Rollstin, J.A.; Shiver, A.W.; Sprung, J.L.

    1995-01-01

    Uncertainty and sensitivity analysis techniques based on Latin hypercube sampling, partial correlation analysis and stepwise regression analysis are used in an investigation with the MACCS model of the food pathways associated with a severe accident at a nuclear power station. The primary purpose of this study is to provide guidance on the variables to be considered in future review work to reduce the uncertainty in the important variables used in the calculation of reactor accident consequences. The effects of 87 imprecisely-known input variables on the following reactor accident consequences are studied: crop growing season dose, crop long-term dose, milk growing season dose, total food pathways dose, total ingestion pathways dose, total long-term pathways dose, area dependent cost, crop disposal cost, milk disposal cost, condemnation area, crop disposal area and milk disposal area. When the predicted variables are considered collectively, the following input variables were found to be the dominant contributors to uncertainty: fraction of cesium deposition on grain fields that is retained on plant surfaces and transferred directly to grain, maximum allowable ground concentrations of Cs-137 and Sr-90 for production of crops, ground concentrations of Cs-134, Cs-137 and I-131 at which the disposal of milk will be initiated due to accidents that occur during the growing season, ground concentrations of Cs-134, I-131 and Sr-90 at which the disposal of crops will be initiated due to accidents that occur during the growing season, rate of depletion of Cs-137 and Sr-90 from the root zone, transfer of Sr-90 from soil to legumes, transfer of Cs-137 from soil to pasture, transfer of cesium from animal feed to meat, and the transfer of cesium, iodine and strontium from animal feed to milk.
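
    The variable-screening step described above can be mimicked with a small Python sketch that ranks sampled inputs by the strength of their rank correlation with a computed consequence. The three inputs and the response function below are illustrative stand-ins, not the 87 MACCS variables, and rank correlation is used as a simple surrogate for the study's partial correlation and stepwise regression.

    ```python
    import numpy as np
    from scipy.stats import spearmanr

    rng = np.random.default_rng(2)
    n = 200

    # Hypothetical sampled inputs (stand-ins for the imprecisely known MACCS variables)
    inputs = {
        "cs_retention_fraction": rng.uniform(0.02, 0.4, n),
        "deposition_velocity": rng.lognormal(-6.0, 0.5, n),
        "soil_to_crop_transfer": rng.lognormal(-4.0, 1.0, n),
    }
    X = np.column_stack(list(inputs.values()))

    # Illustrative response: a food-pathway dose driven mainly by the first two inputs
    dose = 50.0 * X[:, 0] + 2.0e3 * X[:, 1] + 5.0 * X[:, 2] + rng.normal(0.0, 0.5, n)

    # Rank inputs by |Spearman rho| with the output
    ranking = sorted(
        ((name, abs(spearmanr(X[:, i], dose).correlation)) for i, name in enumerate(inputs)),
        key=lambda item: item[1],
        reverse=True,
    )
    for name, rho in ranking:
        print(f"{name:24s} |rho| = {rho:.2f}")
    ```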

  11. Implementation of numerical simulation techniques in analysis of the accidents in complex technological systems

    SciTech Connect

    Klishin, G.S.; Seleznev, V.E.; Aleoshin, V.V.

    1997-12-31

    Gas industry enterprises such as main pipelines, compressor gas-transfer stations, and gas-extraction complexes belong to the energy-intensive industry. Accidents there can result in catastrophes and large social, environmental, and economic losses. According to official data, several dozen large pipeline accidents take place annually in the USA and Russia. Prevention of such accidents, analysis of the mechanisms of their development, and prediction of their possible consequences are therefore pressing tasks. The causes of accidents are usually complicated and can be represented as a complex combination of natural, technical, and human factors. Mathematical and computer simulation is a safe, rather effective, and comparatively inexpensive method of accident analysis. It makes it possible to analyze different mechanisms of failure occurrence and development, to assess the consequences, and to give recommendations for prevention. Besides the investigation of failure cases, numerical simulation techniques play an important role in the treatment of diagnostic results for the objects and in the further construction of mathematical prognostic simulations of object behavior in the period between two inspections. In solving diagnostic tasks and in the analysis of failure cases, techniques of theoretical mechanics, the qualitative theory of differential equations, continuum mechanics, chemical macro-kinetics, and optimization are implemented at the Conversion Design Bureau #5 (DB#5). Both universal and special numerical techniques and software (SW) are being developed at DB#5 for the solution of such tasks. Almost all of them are calibrated against calculations of simulated and full-scale experiments performed at the VNIIEF and MINATOM testing sites. It is worth noting that over the long years of this work there has been established a fruitful and effective

  12. Spatiotemporal Analysis for Wildlife-Vehicle Based on Accident Statistics of the County Straubing-Bogen in Lower Bavaria

    NASA Astrophysics Data System (ADS)

    Pagany, R.; Dorner, W.

    2016-06-01

    In recent years the number of wildlife-vehicle collisions (WVC) in Bavaria has increased considerably. Despite the statistical registration of WVC and preventive measures at areas of risk along the roads, the number of such accidents could not be contained. Using geospatial analysis of WVC data from the last five years for the county of Straubing-Bogen, Bavaria, a small-scale methodology was developed to analyse the risk of WVC along the roads in the investigated area. Various indicators that may be related to WVC were examined. The risk depends on the time of day and year, which in turn correlates with traffic density and wildlife population. Additionally, the location of a collision depends on the species and on different environmental parameters. Accidents seem to correlate with the land use to the left and right of the road. Land use data and current vegetation were derived from remote sensing data, providing information on the general land use while also considering the vegetation period. A number of hot spots was selected to identify potential dependencies between land use, vegetation, and season. First results from these hot spots show that WVCs do not only depend on land use but may also correlate with the vegetation period. With regard to agriculture and seasonal as well as annual changes, this indicates that warnings will fail because of their static character, in contrast to the dynamic situation of land use and the resulting risk of WVCs. This shows a demand for remote sensing data with high spatial and temporal resolution as well as a methodology to derive WVC warnings that considers land use and vegetation. With remote sensing data, it could become possible to classify land use and calculate risk levels for WVC. Additional parameters derived from remotely sensed data that could be considered are relief and crops, as well as other parameters such as ponds and natural and infrastructural barriers that could be related to animal behaviour and

  13. Limitations of risk analysis in the determination of medical factors in road vehicle accidents.

    PubMed

    Spencer, Michael B; Carter, Tim; Nicholson, Anthony N

    2004-01-01

    The purpose of risk analysis in the determination of medical factors in road vehicle accidents is to evaluate the risks that are associated with different strategies for accident reduction, so that the subsequent decision making process can be based on a best assessment of the likely benefits. However, it is vital to appreciate the limitations of such an approach, especially where the conclusions depend heavily on the accuracy of the assumptions made. In this paper the assumptions used in some recent analyses concerned with incapacitation, epilepsy, hypoglycaemia and psycho-active medication are explored, and the additional information required to reduce the uncertainty in the estimation of risk indicated. The conclusions from this analysis do not invalidate the use of risk assessment, but draw attention to its limitations and show how a sensitivity analysis can help to identify those areas where more precise information is needed before such an approach can be used confidently in a policy setting. PMID:14998267

  14. Overview of Sandia National Laboratories and Khlopin Radium Institute collaborative radiological accident consequence analysis efforts

    SciTech Connect

    Young, M.L.; Carlson, D.D.; Lazarev, L.N.; Petrov, B.F.; Romanovskiy, V.N.

    1997-05-01

    In January, 1995 a collaborative effort to improve radiological consequence analysis methods and tools was initiated between the V.G. Khlopin Institute (KRI) and Sandia National Laboratories (SNL). The purpose of the collaborative effort was to transfer SNL's consequence analysis methods to KRI and identify opportunities for collaborative efforts to solve mutual problems relating to the safety of radiochemical facilities. A second purpose was to improve SNL's consequence analysis methods by incorporating the radiological accident field experience of KRI scientists (e.g. the Chernobyl and Kyshtym accidents). The initial collaborative effort focused on the identification of: safety criteria that radiochemical facilities in Russia must meet; analyses/measures required to demonstrate that safety criteria have been met; and data required to complete the analyses/measures identified to demonstrate the safety basis of a facility.

  15. Interpersonal Dynamics in a Simulated Prison: A Methodological Analysis

    ERIC Educational Resources Information Center

    Banuazizi, Ali; Movahedi, Siamak

    1975-01-01

    A critical overview is presented of the Stanford Prison Experiment, conducted by Zimbardo and his coinvestigators, in which they attempted a structural analysis of the problems of imprisonment. Key assumptions are questioned, primarily on methodological grounds, which casts doubt on the plausibility of the experimenters' final causal inferences.…

  16. Methodology for Analysis of IAI District Level Data Bases.

    ERIC Educational Resources Information Center

    Milazzo, Patricia; And Others

    Instructional Accomplishment Information (IAI) Systems data bases provide the opportunity for new and powerful studies relevant to educational policy issues at a local and/or national level. This report discusses the methodology for "schooling policy studies." The procedures are illustrated using a yet-to-be-completed analysis of the Los Angeles…

  17. Microgenetic Learning Analysis: A Methodology for Studying Knowledge in Transition

    ERIC Educational Resources Information Center

    Parnafes, O.; diSessa, A. A.

    2013-01-01

    This paper introduces and exemplifies a qualitative method for studying learning, "microgenetic learning analysis" (MLA), which is aimed jointly at developing theory and at establishing useful empirical results. Among modern methodologies, the focus on theory is somewhat distinctive. We use two strategies to describe MLA. First, we develop a…

  18. [Analysis of radiation-hygienic and medical consequences of the Chernobyl accident].

    PubMed

    Onishchenko, G G

    2013-01-01

    More than 25 years have passed since the Chernobyl accident in 1986. Fourteen subjects of the Russian Federation, with a total area of more than 50 thousand km2, where 1.5 million people now reside, were exposed to radioactive contamination. Currently, a system for the comprehensive evaluation of radiation doses to the population affected by the Chernobyl accident, including 11 guidance documents, has been created. Methodological support is provided for work on the assessment of average annual, accumulated, and predicted radiation doses to the population and its critical groups, as well as doses to the thyroid gland. The relevance of the analysis of the consequences of the Chernobyl accident is demonstrated by the events in Japan at the Fukushima-1 nuclear power plant. In 2011-2012, comprehensive maritime expeditions were carried out under the auspices of the Russian Geographical Society with the participation of relevant ministries and agencies and leading academic institutions of Russia. In 2012, work was carried out on radiation protection of the population from the potential transboundary impact of the accident at the Japanese nuclear power plant Fukushima-1. The results provide a basis for a favorable outlook for the radiation environment in the Russian Far East and on the Pacific coast of Russia.

  19. Analysis of the FeCrAl Accident Tolerant Fuel Concept Benefits during BWR Station Blackout Accidents

    SciTech Connect

    Robb, Kevin R

    2015-01-01

    Iron-chromium-aluminum (FeCrAl) alloys are being considered for fuel concepts with enhanced accident tolerance. FeCrAl alloys have very slow oxidation kinetics and good strength at high temperatures. FeCrAl could be used for fuel cladding in light water reactors and/or as channel box material in boiling water reactors (BWRs). To estimate the potential safety gains afforded by the FeCrAl concept, the MELCOR code was used to analyze a range of postulated station blackout severe accident scenarios in a BWR/4 reactor employing FeCrAl. The simulations utilize the most recently known thermophysical properties and oxidation kinetics for FeCrAl. Overall, when compared to the traditional Zircaloy-based cladding and channel box, the FeCrAl concept provides a few extra hours of time for operators to take mitigating actions and/or for evacuations to take place. A coolable core geometry is retained longer, enhancing the ability to stabilize an accident. Finally, due to the slower oxidation kinetics, substantially less hydrogen is generated, and the generation is delayed in time. This decreases the amount of non-condensable gases in containment and the potential for deflagrations to inhibit the accident response.

  20. PROBLEMS AND METHODOLOGY OF THE PETROLOGIC ANALYSIS OF COAL FACIES.

    USGS Publications Warehouse

    Chao, Edward C.T.

    1983-01-01

    This condensed synthesis gives a broad outline of the methodology of coal facies analysis, procedures for constructing sedimentation and geochemical formation curves, and micro- and macrostratigraphic analysis. The hypothetical coal bed profile has a 3-fold cycle of material characteristics. Based on studies of other similar profiles of the same coal bed, and on field studies of the sedimentary rock types and their facies interpretation, one can assume that the 3-fold subdivision is of regional significance.

  1. Pressure vessels and piping design, analysis, and severe accidents. PVP-Volume 331

    SciTech Connect

    Dermenjian, A.A.

    1996-12-31

    The primary objective of the Design and Analysis Committee of the ASME Pressure Vessels and Piping Division is to provide a forum for the dissemination of information and the advancement of current theories and practices in the design and analysis of pressure vessels, piping systems, and components. This volume is divided into the following six sections: power plant piping and supports 1-3; applied dynamic response analysis; severe accident analysis; and student papers. Separate abstracts were prepared for 22 papers in this volume.

  2. A flammability and combustion model for integrated accident analysis. [Advanced light water reactors

    SciTech Connect

    Plys, M.G.; Astleford, R.D.; Epstein, M.

    1988-01-01

    A model for flammability characteristics and combustion of hydrogen and carbon monoxide mixtures is presented for application to severe accident analysis of Advanced Light Water Reactors (ALWR's). Flammability of general mixtures for thermodynamic conditions anticipated during a severe accident is quantified with a new correlation technique applied to data for several fuel and inertant mixtures and using accepted methods for combining these data. Combustion behavior is quantified by a mechanistic model consisting of a continuity and momentum balance for the burned gases, and considering an uncertainty parameter to match the idealized process to experiment. Benchmarks against experiment demonstrate the validity of this approach for a single recommended value of the flame flux multiplier parameter. The models presented here are equally applicable to analysis of current LWR's. 21 refs., 16 figs., 6 tabs.
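
    The abstract does not spell out which accepted method is used to combine the flammability data for different fuels; one widely used rule for fuel mixtures is Le Chatelier's mixing rule, sketched below in Python with hypothetical hydrogen/carbon monoxide fractions (the limits and composition are illustrative, not values from the paper).

    ```python
    # Le Chatelier's rule: LFL_mix = (sum of fuel fractions) / sum(y_i / LFL_i)
    def mixture_lfl(fuel_pct, lfl_pct):
        total_fuel = sum(fuel_pct.values())
        return total_fuel / sum(y / lfl_pct[name] for name, y in fuel_pct.items())

    fuel = {"H2": 8.0, "CO": 4.0}    # hypothetical containment atmosphere, vol% of each fuel
    lfl = {"H2": 4.0, "CO": 12.5}    # lower flammability limits in air, vol%

    print(f"effective LFL of the H2/CO mixture: {mixture_lfl(fuel, lfl):.2f} vol%")  # ~5.2 vol%
    ```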

  3. Research methodologies in palliative care: a bibliometric analysis.

    PubMed

    Payne, S A; Turner, J M

    2008-06-01

    The aspiration to design and conduct high-quality research in palliative care has been an important but elusive goal. The article evaluates the nature of research methodologies presented in published research within the broad remit of palliative care. A systematic search of the Medline database between 1997 and 2006, using the keywords 'palliative care' or 'end-of-life care' and 'research methodology', identified over 318 publications. A bibliometric analysis indicates an incremental increase in published outputs per year, from 27 countries, with articles widely distributed across 108 journals. The heterogeneity of the research methodologies and the journals publishing them, present challenges in defining what constitutes 'high quality'. We argue that although this diversity leads to a lack of coherence for a single disciplinary paradigm for palliative care, there is a greater acknowledgement of the differing epistemological and theoretical frameworks used by researchers. This could be regarded as enriching our understanding of what it means to be dying in contemporary society.

  4. Methodological Variability Using Electronic Nose Technology For Headspace Analysis

    SciTech Connect

    Knobloch, Henri; Turner, Claire; Spooner, Andrew; Chambers, Mark

    2009-05-23

    Since the idea of electronic noses was published, numerous electronic nose (e-nose) developments and applications have been used in analyzing solid, liquid and gaseous samples in the food and automotive industry or for medical purposes. However, little is known about methodological pitfalls that might be associated with e-nose technology. Some of the methodological variation caused by changes in ambient temperature, the use of different filters, and changes in mass flow rates is described. Reasons for a lack of stability and reproducibility are given, explaining why methodological variation influences sensor responses and why e-nose technology may not always be sufficiently robust for headspace analysis. However, the potential of e-nose technology is also discussed.

  5. Behavior analysis and training-a methodology for behavior engineering.

    PubMed

    Colombetti, M; Dorigo, M; Borghi, G

    1996-01-01

    We propose Behavior Engineering as a new technological area whose aim is to provide methodologies and tools for developing autonomous robots. Building robots is a very complex engineering enterprise that requires the exact definition and scheduling of the activities which a designer, or a team of designers, should follow. Behavior Engineering is, within the autonomous robotics realm, the equivalent of more established disciplines like Software Engineering and Knowledge Engineering. In this article we first give a detailed presentation of a Behavior Engineering methodology, which we call Behavior Analysis and Training (BAT), where we stress the role of learning and training. Then we illustrate the application of the BAT methodology to three cases involving different robots: two mobile robots and a manipulator. Results show the feasibility of the proposed approach.

  6. Two methodologies for optical analysis of contaminated engine lubricants

    NASA Astrophysics Data System (ADS)

    Aghayan, Hamid; Bordatchev, Evgueni; Yang, Jun

    2012-01-01

    The performance, efficiency and lifetime of modern combustion engines significantly depend on the quality of the engine lubricants. However, contaminants, such as gasoline, moisture, coolant and wear particles, reduce the life of engine mechanical components and lubricant quality. Therefore, direct and indirect measurements of engine lubricant properties, such as physical-mechanical, electro-magnetic, chemical and optical properties, are intensively utilized in engine condition monitoring systems and sensors developed within the last decade. Such sensors for the measurement of engine lubricant properties can be used to detect a functional limit of the in-use lubricant, increase drain interval and reduce the environmental impact. This paper proposes two new methodologies for the quantitative and qualitative analysis of the presence of contaminants in the engine lubricants. The methodologies are based on optical analysis of the distortion effect when an object image is obtained through a thin random optical medium (e.g. engine lubricant). The novelty of the proposed methodologies is in the introduction of an object with a known periodic shape behind a thin film of the contaminated lubricant. In this case, an acquired image represents a combined lubricant-object optical appearance, where an a priori known periodic structure of the object is distorted by a contaminated lubricant. In the object shape-based optical analysis, several parameters of an acquired optical image, such as the gray scale intensity of lubricant and object, shape width at object and lubricant levels, object relative intensity and width non-uniformity coefficient are newly proposed. Variations in the contaminant concentration and use of different contaminants lead to the changes of these parameters measured on-line. In the statistical optical analysis methodology, statistical auto- and cross-characteristics (e.g. auto- and cross-correlation functions, auto- and cross-spectrums, transfer function

  7. Human and organisational factors in maritime accidents: analysis of collisions at sea using the HFACS.

    PubMed

    Chauvin, Christine; Lardjane, Salim; Morel, Gaël; Clostermann, Jean-Pierre; Langard, Benoît

    2013-10-01

    Over the last decade, the shipping industry has implemented a number of measures aimed at improving its safety level (such as new regulations or new forms of team training). Despite this evolution, shipping accidents, and particularly collisions, remain a major concern. This paper presents a modified version of the Human Factors Analysis and Classification System, which has been adapted to the maritime context and used to analyse human and organisational factors in collisions reported by the Marine Accident and Investigation Branch (UK) and the Transportation Safety Board (Canada). The analysis shows that most collisions are due to decision errors. At the precondition level, it highlights the importance of the following factors: poor visibility and misuse of instruments (environmental factors), loss of situation awareness or deficit of attention (conditions of operators), deficits in inter-ship communications or Bridge Resource Management (personnel factors). At the leadership level, the analysis reveals the frequent planning of inappropriate operations and non-compliance with the Safety Management System (SMS). The Multiple Accident Analysis provides an important finding concerning three classes of accidents. Inter-ship communications problems and Bridge Resource Management deficiencies are closely linked to collisions occurring in restricted waters and involving pilot-carrying vessels. Another class of collisions is associated with situations of poor visibility, in open sea, and shows deficiencies at every level of the socio-technical system (technical environment, condition of operators, leadership level, and organisational level). The third class is characterised by non-compliance with the SMS. This study shows the importance of Bridge Resource Management for situations of navigation with a pilot on board in restricted waters. It also points out the necessity to investigate, for situations of navigation in open sea, the masters' decisions in critical conditions

  8. An association between dietary habits and traffic accidents in patients with chronic liver disease: A data-mining analysis

    PubMed Central

    KAWAGUCHI, TAKUMI; SUETSUGU, TAKURO; OGATA, SHYOU; IMANAGA, MINAMI; ISHII, KUMIKO; ESAKI, NAO; SUGIMOTO, MASAKO; OTSUYAMA, JYURI; NAGAMATSU, AYU; TANIGUCHI, EITARO; ITOU, MINORU; ORIISHI, TETSUHARU; IWASAKI, SHOKO; MIURA, HIROKO; TORIMURA, TAKUJI

    2016-01-01

    The incidence of traffic accidents in patients with chronic liver disease (CLD) is high in the USA. However, the characteristics of patients, including dietary habits, differ between Japan and the USA. The present study investigated the incidence of traffic accidents in CLD patients and the clinical profiles associated with traffic accidents in Japan using a data-mining analysis. A cross-sectional study was performed and 256 subjects [148 CLD patients (CLD group) and 106 patients with other digestive diseases (disease control group)] were enrolled; 2 patients were excluded. The incidence of traffic accidents was compared between the two groups. Independent factors for traffic accidents were analyzed using logistic regression and decision-tree analyses. The incidence of traffic accidents did not differ between the CLD and disease control groups (8.8 vs. 11.3%). The results of the logistic regression analysis showed that yoghurt consumption was the only independent risk factor for traffic accidents (odds ratio, 0.37; 95% confidence interval, 0.16–0.85; P=0.0197). Similarly, the results of the decision-tree analysis showed that yoghurt consumption was the initial divergence variable. In patients who consumed yoghurt habitually, the incidence of traffic accidents was 6.6%, while that in patients who did not consume yoghurt was 16.0%. CLD was not identified as an independent factor in the logistic regression and decision-tree analyses. In conclusion, the difference in the incidence of traffic accidents in Japan between the CLD and disease control groups was insignificant. Furthermore, yoghurt consumption was an independent negative risk factor for traffic accidents in patients with digestive diseases, including CLD. PMID:27123257
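
    As a sketch of the two analyses named above, the Python example below fits a logistic regression (reporting odds ratios) and a shallow decision tree to synthetic questionnaire-style data; the variables, effect sizes, and sample are hypothetical stand-ins, not the study's dataset.

    ```python
    import numpy as np
    from sklearn.linear_model import LogisticRegression
    from sklearn.tree import DecisionTreeClassifier

    rng = np.random.default_rng(3)
    n = 254

    # Hypothetical binary predictors mimicking the questionnaire
    yoghurt = rng.integers(0, 2, n)      # habitual yoghurt consumption (1 = yes)
    cld = rng.integers(0, 2, n)          # chronic liver disease (1 = yes)
    p_accident = 0.16 - 0.09 * yoghurt   # lower accident probability with yoghurt
    accident = rng.binomial(1, p_accident)

    X = np.column_stack([yoghurt, cld])

    logit = LogisticRegression().fit(X, accident)
    odds_ratios = np.exp(logit.coef_[0])           # OR < 1 would indicate a protective factor
    print(dict(zip(["yoghurt", "cld"], odds_ratios)))

    tree = DecisionTreeClassifier(max_depth=2, random_state=0).fit(X, accident)
    print("root split on feature index:", tree.tree_.feature[0])  # 0 = yoghurt, 1 = cld here
    ```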

  9. Uncertainty and sensitivity analysis of early exposure results with the MACCS Reactor Accident Consequence Model

    SciTech Connect

    Helton, J.C.; Johnson, J.D.; McKay, M.D.; Shiver, A.W.; Sprung, J.L.

    1995-01-01

    Uncertainty and sensitivity analysis techniques based on Latin hypercube sampling, partial correlation analysis and stepwise regression analysis are used in an investigation with the MACCS model of the early health effects associated with a severe accident at a nuclear power station. The primary purpose of this study is to provide guidance on the variables to be considered in future review work to reduce the uncertainty in the important variables used in the calculation of reactor accident consequences. The effects of 34 imprecisely known input variables on the following reactor accident consequences are studied: number of early fatalities, number of cases of prodromal vomiting, population dose within 10 mi of the reactor, population dose within 1000 mi of the reactor, individual early fatality probability within 1 mi of the reactor, and maximum early fatality distance. When the predicted variables are considered collectively, the following input variables were found to be the dominant contributors to uncertainty: scaling factor for horizontal dispersion, dry deposition velocity, inhalation protection factor for nonevacuees, groundshine shielding factor for nonevacuees, early fatality hazard function alpha value for bone marrow exposure, and scaling factor for vertical dispersion.

  10. Analysis of accident sequences and source terms at waste treatment and storage facilities for waste generated by U.S. Department of Energy Waste Management Operations, Volume 3: Appendixes C-H

    SciTech Connect

    Mueller, C.; Nabelssi, B.; Roglans-Ribas, J.

    1995-04-01

    This report contains the Appendices for the Analysis of Accident Sequences and Source Terms at Waste Treatment and Storage Facilities for Waste Generated by the U.S. Department of Energy Waste Management Operations. The main report documents the methodology, computational framework, and results of facility accident analyses performed as a part of the U.S. Department of Energy (DOE) Waste Management Programmatic Environmental Impact Statement (WM PEIS). The accident sequences potentially important to human health risk are specified, their frequencies are assessed, and the resultant radiological and chemical source terms are evaluated. A personal computer-based computational framework and database have been developed that provide these results as input to the WM PEIS for calculation of human health risk impacts. This report summarizes the accident analyses and aggregates the key results for each of the waste streams. Source terms are estimated and results are presented for each of the major DOE sites and facilities by WM PEIS alternative for each waste stream. The appendices identify the potential atmospheric release of each toxic chemical or radionuclide for each accident scenario studied. They also provide discussion of specific accident analysis data and guidance used or consulted in this report.

  11. Methods for Detector Placement and Analysis of Criticality Accident Alarm Systems

    SciTech Connect

    Peplow, Douglas E.; Wetzel, Larry

    2012-01-01

    -to-detect accident sites, which may not have included every possible accident location. Analog calculations (no biasing) simply follow particles naturally. For sparse buildings and line-of-sight calculations, analog Monte Carlo (MC) may be adequate. For buildings with internal walls or large amounts of heavy equipment (dense geometry), variance reduction may be required. Calculations employing the CADIS method use a deterministic calculation to create an importance map and a matching biased source distribution that optimize the final MC to quickly calculate one specific tally. Calculations employing the FW-CADIS method use two deterministic calculations (one forward and one adjoint) to create an importance map and a matching biased source distribution that are designed to make the MC calculate a mesh tally with more uniform uncertainties in both high-dose and low-dose areas. Depending on the geometry of the problem, the number of detectors, and the number of accident sites, different approaches to CAAS placement studies can be taken. These are summarized in Table I. SCALE 6.1 contains the MAVRIC sequence, which can be used to perform any of the forward-based approaches outlined in Table I. For analog calculations, MAVRIC simply calls the Monaco MC code. For CADIS and FW-CADIS, MAVRIC uses the Denovo discrete ordinates (SN) deterministic code to generate the importance map and biased source used by Monaco. An adjoint capability is currently being added to Monaco and should be available in the next release of SCALE. An adjoint-based approach could be performed with Denovo alone - although fine meshes, large amounts of memory, and long computation times may be required to obtain accurate solutions. Coarse-mesh SN simulations could be employed for adjoint-based scoping studies until the adjoint capability in Monaco is complete. CAAS placement studies, especially those dealing with mesh tallies, require some extra utilities to aid in the analysis. Detectors must receive a minimum dose
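
    For the analog, line-of-sight case mentioned above, a first-pass placement check can be sketched as a simple point-kernel screening in Python: every candidate accident site must deliver at least the minimum alarm dose to at least one detector. The geometry, attenuation coefficient, source term, and threshold below are illustrative assumptions and are not SCALE/MAVRIC quantities or criteria from an actual CAAS analysis.

    ```python
    import numpy as np

    # Point-kernel screening: dose ~ S * exp(-mu * t) / (4 * pi * r**2)
    MIN_ALARM_DOSE = 1.0e-3      # arbitrary minimum detector dose (dose units)
    SOURCE_STRENGTH = 5.0e2      # arbitrary unshielded source term
    MU = 0.2                     # effective attenuation coefficient, 1/cm

    accident_sites = np.array([[0.0, 0.0], [12.0, 3.0], [25.0, 18.0]])   # site x, y in m
    detectors = np.array([[5.0, 5.0], [20.0, 10.0]])                     # detector x, y in m
    wall_cm = np.array([[0.0, 10.0], [20.0, 0.0], [30.0, 10.0]])         # shielding thickness,
                                                                         # wall_cm[i, j]: site i -> detector j

    r = np.linalg.norm(accident_sites[:, None, :] - detectors[None, :, :], axis=2)
    dose = SOURCE_STRENGTH * np.exp(-MU * wall_cm) / (4.0 * np.pi * r**2)
    covered = (dose >= MIN_ALARM_DOSE).any(axis=1)
    print(covered)    # True where at least one detector would alarm for that site
    ```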

  12. Comparative analysis of EPA cost-benefit methodologies

    SciTech Connect

    Poch, L.; Gillette, J.; Veil, J.

    1998-05-01

    In recent years, reforming the regulatory process has received much attention from diverse groups such as environmentalists, the government, and industry. A cost-benefit analysis can be a useful way to organize and compare the favorable and unfavorable impacts a proposed action might have on society. Since 1981, two Executive Orders have required the U.S. Environmental Protection Agency (EPA) and other regulatory agencies to perform cost-benefit analyses in support of regulatory decision making. At the EPA, a cost-benefit analysis is published as a document called a regulatory impact analysis (RIA). This report reviews cost-benefit methodologies used by three EPA program offices: Office of Air and Radiation, Office of Solid Waste, and Office of Water. These offices were chosen because they promulgate regulations that affect the policies of this study's sponsor (U.S. Department of Energy, Office of Fossil Energy) and the technologies it uses. The study was conducted by reviewing 11 RIAs recently published by the three offices and by interviewing staff members in the offices. To draw conclusions about the EPA cost-benefit methodologies, their components were compared with those of a standard methodology (i.e., those that should be included in a comprehensive cost-benefit methodology). This study focused on the consistency of the approaches as well as their strengths and weaknesses, since differences in the cost-benefit methodologies themselves or in their application can cause confusion and preclude consistent comparison of regulations both within and among program offices.

  13. Uncertainty and sensitivity analysis of chronic exposure results with the MACCS reactor accident consequence model

    SciTech Connect

    Helton, J.C.; Johnson, J.D.; Rollstin, J.A.; Shiver, A.W.; Sprung, J.L.

    1995-01-01

    Uncertainty and sensitivity analysis techniques based on Latin hypercube sampling, partial correlation analysis and stepwise regression analysis are used in an investigation with the MACCS model of the chronic exposure pathways associated with a severe accident at a nuclear power station. The primary purpose of this study is to provide guidance on the variables to be considered in future review work to reduce the uncertainty in the important variables used in the calculation of reactor accident consequences. The effects of 75 imprecisely known input variables on the following reactor accident consequences are studied: crop growing season dose, crop long-term dose, water ingestion dose, milk growing season dose, long-term groundshine dose, long-term inhalation dose, total food pathways dose, total ingestion pathways dose, total long-term pathways dose, total latent cancer fatalities, area-dependent cost, crop disposal cost, milk disposal cost, population-dependent cost, total economic cost, condemnation area, condemnation population, crop disposal area and milk disposal area. When the predicted variables are considered collectively, the following input variables were found to be the dominant contributors to uncertainty: dry deposition velocity, transfer of cesium from animal feed to milk, transfer of cesium from animal feed to meat, ground concentration of Cs-134 at which the disposal of milk products will be initiated, transfer of Sr-90 from soil to legumes, maximum allowable ground concentration of Sr-90 for production of crops, fraction of cesium entering surface water that is consumed in drinking water, groundshine shielding factor, scale factor defining resuspension, dose reduction associated with decontamination, and ground concentration of I-131 at which disposal of crops will be initiated due to accidents that occur during the growing season.

  14. Impact of traffic congestion on road accidents: a spatial analysis of the M25 motorway in England.

    PubMed

    Wang, Chao; Quddus, Mohammed A; Ison, Stephen G

    2009-07-01

    Traffic congestion and road accidents are two external costs of transport and the reduction of their impacts is often one of the primary objectives for transport policy makers. The relationship between traffic congestion and road accidents however is not apparent and less studied. It is speculated that there may be an inverse relationship between traffic congestion and road accidents, and as such this poses a potential dilemma for transport policy makers. This study aims to explore the impact of traffic congestion on the frequency of road accidents using a spatial analysis approach, while controlling for other relevant factors that may affect road accidents. The M25 London orbital motorway, divided into 70 segments, was chosen to conduct this study and relevant data on road accidents, traffic and road characteristics were collected. A robust technique has been developed to map M25 accidents onto its segments. Since existing studies have often used a proxy to measure the level of congestion, this study has employed a precise congestion measurement. A series of Poisson based non-spatial (such as Poisson-lognormal and Poisson-gamma) and spatial (Poisson-lognormal with conditional autoregressive priors) models have been used to account for the effects of both heterogeneity and spatial correlation. The results suggest that traffic congestion has little or no impact on the frequency of road accidents on the M25 motorway. All other relevant factors have provided results consistent with existing studies.

  15. Review of accident analysis calculations, 232-Z seismic scenario

    SciTech Connect

    Ballinger, M.Y.

    1993-05-01

    The 232-Z Building houses what was previously the incinerator facility, which is no longer in service. It is constructed out of concrete blocks and is approximately 37 ft wide by 57 ft long. The building has a single story over the process areas and two stories over the service areas at the north end of the building. The respective roofs are 15 ft and 19 ft above grade and consist of concrete over a metal decking, with insulation and a built-up asphalt gravel covering. This facility is assumed to collapse in the seismic event evaluated in the safety analyses, resulting in the release of a portion of the residual plutonium inventory remaining in the building. The seismic scenario for 232-Z assumes that the concrete block walls collapse, allowing the roof to fall, crushing the contaminated duct and gloveboxes within. This paper is a review of the scenario and methods used to calculate the source term from the seismic event as presented in the Plutonium Finishing Plant Final Safety Analysis Report (WHC 1991), also referred to as the PFP FSAR. Alternate methods of estimating the source term are presented. The calculation of source terms based on the mechanisms of release expected in the worst-case scenario is recommended.

  16. Safety culture and accident analysis--a socio-management approach based on organizational safety social capital.

    PubMed

    Rao, Suman

    2007-04-11

    One of the biggest challenges for organizations in today's competitive business environment is to create and preserve a self-sustaining safety culture. Typically, the key drivers of safety culture in many organizations are regulation, audits, safety training, various types of employee exhortations to comply with safety norms, etc. However, less evident factors like networking relationships and social trust amongst employees, as also extended networking relationships and social trust of organizations with external stakeholders like government, suppliers, regulators, etc., which constitute the safety social capital in the Organization--seem to also influence the sustenance of organizational safety culture. Can erosion in safety social capital cause deterioration in safety culture and contribute to accidents? If so, how does it contribute? As existing accident analysis models do not provide answers to these questions, CAMSoC (Curtailing Accidents by Managing Social Capital), an accident analysis model, is proposed. As an illustration, five accidents: Bhopal (India), Hyatt Regency (USA), Tenerife (Canary Islands), Westray (Canada) and Exxon Valdez (USA) have been analyzed using CAMSoC. This limited cross-industry analysis provides two key socio-management insights: the biggest source of motivation that causes deviant behavior leading to accidents is 'Faulty Value Systems'. The second biggest source is 'Enforceable Trust'. From a management control perspective, deterioration in safety culture and resultant accidents is more due to the 'action controls' rather than explicit 'cultural controls'. Future research directions to enhance the model's utility through layering are addressed briefly.

  18. Probabilistic accident consequence uncertainty analysis -- Late health effects uncertainty assessment. Volume 1: Main report

    SciTech Connect

    Little, M.P.; Muirhead, C.R.; Goossens, L.H.J.; Kraan, B.C.P.; Cooke, R.M.; Harper, F.T.; Hora, S.C.

    1997-12-01

    The development of two new probabilistic accident consequence codes, MACCS and COSYMA, was completed in 1990. These codes estimate the consequences of accidental releases of radiological material from hypothesized accidents at nuclear installations. In 1991, the US Nuclear Regulatory Commission and the Commission of the European Communities began cosponsoring a joint uncertainty analysis of the two codes. The ultimate objective of this joint effort was to systematically develop credible and traceable uncertainty distributions for the respective code input variables. A formal expert judgment elicitation and evaluation process was identified as the best technology available for developing a library of uncertainty distributions for these consequence parameters. This report focuses on the results of the study to develop distributions for variables related to the MACCS and COSYMA late health effects models.

  19. Preliminary analysis of graphite dust releasing behavior in accident for HTR

    SciTech Connect

    Peng, W.; Yang, X. Y.; Yu, S. Y.; Wang, J.

    2012-07-01

    The behavior of graphite dust is important to the safety of High Temperature Gas-cooled Reactors. This study investigated the flow of graphite dust in the helium mainstream. Analysis of the forces acting on the graphite dust indicated that gas drag plays the dominant role. Based on this understanding of the importance of gas drag, an experimental system was set up to study dust release behavior in accidents. Air driven by a centrifugal fan is used as the working fluid instead of helium, because helium is expensive and prone to leakage, which makes the loop difficult to seal. Graphite particles with the same size distribution as in the HTR are added to the experimental loop. The graphite dust release behavior during a loss-of-coolant accident will be investigated using a sonic nozzle. (authors)
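
    As a rough illustration of why gas drag dominates for micron-scale dust, the short Python estimate below compares Stokes drag with the particle's weight; the particle size, helium viscosity, and relative velocity are assumed order-of-magnitude values, not data from the experiment.

    ```python
    import math

    d_p = 1.0e-6       # particle diameter, m (micron-scale dust assumed)
    rho_p = 1.7e3      # graphite density, kg/m^3
    mu_he = 4.0e-5     # helium dynamic viscosity at high temperature, Pa*s (rough value)
    u_rel = 1.0        # relative gas velocity seen by the particle, m/s (assumed)
    g = 9.81           # gravitational acceleration, m/s^2

    mass = rho_p * math.pi * d_p**3 / 6.0
    drag = 3.0 * math.pi * mu_he * d_p * u_rel   # Stokes drag, valid at low Reynolds number
    weight = mass * g

    print(f"drag / weight ~ {drag / weight:.1e}")  # >> 1: drag dominates, as stated above
    ```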

  20. Thermodynamic analysis of cesium and iodine behavior in severe light water reactor accidents

    NASA Astrophysics Data System (ADS)

    Minato, Kazuo

    1991-11-01

    In order to understand the release and transport behavior of cesium (Cs) and iodine (I) in severe light water reactor accidents, chemical forms of Cs and I in steam-hydrogen mixtures were analyzed thermodynamically. In the calculations reactions of boron (B) with Cs were taken into consideration. The analysis showed that B plays an important role in determining chemical forms of Cs. The main Cs-containing species are CsBO 2(g) and CsBO 2(l), depending on temperature. The contribution of CsOH(g) is minor. The main I-containing species are HI(g) and CsI(g) over the wide ranges of the parameters considered. Calculations were also carried out under the conditions of the Three Mile Island Unit 2 accident.

  1. Speech analysis as an index of alcohol intoxication--the Exxon Valdez accident.

    PubMed

    Brenner, M; Cash, J R

    1991-09-01

    As part of its investigation of the EXXON VALDEZ tankship accident and oil spill, the National Transportation Safety Board (NTSB) examined the master's speech for alcohol-related effects. Recorded speech samples were obtained from marine radio communications tapes. The samples were tested for four effects associated with alcohol consumption in the available scientific literature: slowed speech, speech errors, misarticulation of difficult sounds ("slurring"), and audible changes in speech quality. It was found that speech immediately before and after the accident displayed large changes of the sort associated with alcohol consumption. These changes were not readily explained by fatigue, psychological stress, drug effects, or medical problems. Speech analysis appears to be a useful technique to provide secondary evidence of alcohol impairment. PMID:1930083

  3. Probabilistic accident consequence uncertainty analysis -- Early health effects uncertainty assessment. Volume 1: Main report

    SciTech Connect

    Haskin, F.E.; Harper, F.T.; Goossens, L.H.J.; Kraan, B.C.P.; Grupa, J.B.

    1997-12-01

    The development of two new probabilistic accident consequence codes, MACCS and COSYMA, was completed in 1990. These codes estimate the consequences of accidental releases of radiological material from hypothesized accidents at nuclear installations. In 1991, the US Nuclear Regulatory Commission and the Commission of the European Communities began cosponsoring a joint uncertainty analysis of the two codes. The ultimate objective of this joint effort was to systematically develop credible and traceable uncertainty distributions for the respective code input variables. A formal expert judgment elicitation and evaluation process was identified as the best technology available for developing a library of uncertainty distributions for these consequence parameters. This report focuses on the results of the study to develop distributions for variables related to the MACCS and COSYMA early health effects models.

  4. How Root Cause Analysis Can Improve the Value Methodology

    SciTech Connect

    Wixson, J. R.

    2002-02-05

    Root cause analysis (RCA) is an important methodology that can be integrated with the VE Job Plan to generate superior results from the VE Methodology. The point at which RCA is most appropriate is after the function analysis and FAST Model have been built and functions for improvement have been chosen. These functions are then subjected to a simple but rigorous RCA to get to the root cause of their deficiencies, whether it is high cost/poor value, poor quality, or poor reliability. Once the most probable causes for these problems have been arrived at, better solutions for improvement can be developed in the creativity phase because the team better understands the problems associated with these functions.

  5. How Root Cause Analysis Can Improve the Value Methodology

    SciTech Connect

    Wixson, James Robert

    2002-05-01

    Root cause analysis (RCA) is an important methodology that can be integrated with the VE Job Plan to generate superior results from the VE Methodology. The point at which RCA is most appropriate is after the function analysis and FAST Model have been built and functions for improvement have been chosen. These functions are then subjected to a simple but rigorous RCA to get to the root cause of their deficiencies, whether it is high cost/poor value, poor quality, or poor reliability. Once the most probable causes for these problems have been arrived at, better solutions for improvement can be developed in the creativity phase because the team better understands the problems associated with these functions.

  6. [Accidents between motorcycles: analysis of cases that occurred in the state of Paraná between July 2010 and June 2011].

    PubMed

    Golias, Andrey Rogério Campos; Caetano, Rosângela

    2013-05-01

    Statistics for accidents between two motorcycles have been overlooked in the vast number of traffic accidents in Brazil, though they deserve closer analysis. This study sought to conduct an epidemiological analysis of accidents between two motorcycles compared with other accidents, based on data from the state of Paraná. Information from the Fire Department site was collected for a period of one year (July 2010 to June 2011), reporting the number and type of accident, day of the week, time, number of victims, gender, age and severity of injuries. Accidents involving two motorcycles represented 3.4% of traffic accidents and 6.2% of accidents involving motorcycles; and the victims of these accidents accounted respectively for 4.4% of victims of traffic accidents and 8.5% of victims of motorcycle accidents. Accidents occurring on Saturdays involving males aged between 20 and 29 were more common. Among the ten most populated cities in the state, some revealed high accident rates between two motorcycles, which appears to be related to the total number of motorcycles in the cities concerned. Thus, constant analysis of these indices is essential, together with the implementation of measures to ensure safer highway traffic. PMID:23670451

  7. Analysis of pedestrian accident costs in Sudan using the willingness-to-pay method.

    PubMed

    Mofadal, Adam I A; Kanitpong, Kunnawee; Jiwattanakulpaisarn, Piyapong

    2015-05-01

    The willingness-to-pay (WTP) with contingent valuation (CV) method has been proven to be a valid tool for the valuation of non-market goods or socio-economic costs of road traffic accidents among communities in developed and developing countries. Research on accident costing tends to estimate the value of statistical life (VOSL) for all road users, providing a principle for the evaluation of road safety interventions in cost-benefit analysis. As in many other developing countries, the economic loss from traffic accidents in Sudan is noticeable; however, analytical research to estimate the magnitude and impact of that loss is lacking. Reports have shown that pedestrians account for more than 40% of the total number of fatalities. In this study, the WTP-CV approach was used to determine the amount of money that pedestrians in Sudan are willing to pay to reduce the risk of their own death. The impact of the socioeconomic factors, risk levels, and walking behaviors of pedestrians on their WTP for fatality risk reduction was also evaluated. Data were collected from two cities, Khartoum and Nyala, using a survey questionnaire that included 1400 respondents. The WTP-CV Payment Card Questionnaire was designed to ensure that Sudanese pedestrians could easily determine the amount of money that would be required to reduce the fatality risk from a pedestrian-related accident. The analysis results show that the estimated VOSL for Sudanese pedestrians ranges from US$0.019 to US$0.101 million. In addition, the willingness-to-pay by Sudanese pedestrians to reduce their fatality risk tends to increase with age, household income, educational level, safety perception, and average time spent on social activities with family and community. PMID:25794921
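
    The VOSL figure in a WTP-CV study is, at its core, the mean stated willingness-to-pay divided by the fatality-risk reduction respondents were asked to value. The short sketch below illustrates that arithmetic; the payment-card responses and risk figures are hypothetical stand-ins, not the Khartoum and Nyala survey data.

        # Illustrative VOSL calculation for a WTP-CV payment-card survey.
        # All numbers are hypothetical; they are not the Sudanese data.

        def value_of_statistical_life(wtp_responses, risk_reduction):
            """Mean stated WTP (US$ per person per year) divided by the
            annual fatality-risk reduction that respondents were asked to value."""
            mean_wtp = sum(wtp_responses) / len(wtp_responses)
            return mean_wtp / risk_reduction

        # Hypothetical annual WTP amounts (US$) stated by respondents to halve a
        # baseline pedestrian fatality risk of 4 in 100,000 per year.
        responses = [5.0, 12.0, 8.0, 20.0, 3.0, 15.0]
        risk_reduction = 2.0e-5   # absolute reduction: 4e-5 -> 2e-5
        print(f"Implied VOSL: US${value_of_statistical_life(responses, risk_reduction):,.0f}")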

  8. Analysis 320 coal mine accidents using structural equation modeling with unsafe conditions of the rules and regulations as exogenous variables.

    PubMed

    Zhang, Yingyu; Shao, Wei; Zhang, Mengjia; Li, Hejun; Yin, Shijiu; Xu, Yingjun

    2016-07-01

    Mining has historically been considered a high-risk industry worldwide. Deaths caused by coal mine accidents exceed the sum of deaths from all other types of accidents in China. Statistics on 320 coal mine accidents in Shandong province show that all accidents contain indicators of "unsafe conditions of the rules and regulations" with a frequency of 1590, accounting for 74.3% of the total frequency of 2140. "Unsafe behaviors of the operator" is another important contributory factor, which mainly includes "operator error" and "venturing into dangerous places." A systems analysis approach was applied by using structural equation modeling (SEM) to examine the interactions between the contributory factors of coal mine accidents. The analysis of results leads to three conclusions. (i) "Unsafe conditions of the rules and regulations" affect the "unsafe behaviors of the operator," "unsafe conditions of the equipment," and "unsafe conditions of the environment." (ii) The three influencing factors of coal mine accidents (with the frequency of effect relation in descending order) are "lack of safety education and training," "rules and regulations of safety production responsibility," and "rules and regulations of supervision and inspection." (iii) The three influenced factors (with the frequency in descending order) of coal mine accidents are "venturing into dangerous places," "poor workplace environment," and "operator error."
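
    The abstract does not give enough detail to reproduce the fitted SEM, but its central idea, one exogenous factor driving several endogenous factors, can be sketched as a simple path analysis estimated by ordinary least squares. The variable names follow the abstract; the data below are randomly generated stand-ins, not the 320 Shandong accident records, and the single-equation OLS paths are only a rough proxy for a full SEM.

        import numpy as np

        # Hypothetical indicator scores for 320 accident reports (random stand-in data).
        rng = np.random.default_rng(0)
        n = 320
        rules = rng.normal(size=n)                                 # "unsafe conditions of the rules and regulations" (exogenous)
        behavior = 0.6 * rules + rng.normal(scale=0.8, size=n)     # "unsafe behaviors of the operator"
        equipment = 0.4 * rules + rng.normal(scale=0.9, size=n)    # "unsafe conditions of the equipment"
        environment = 0.3 * rules + rng.normal(scale=0.9, size=n)  # "unsafe conditions of the environment"

        def path_coefficient(x, y):
            """OLS slope of y on x (with intercept): a one-path stand-in for an SEM path."""
            design = np.column_stack([np.ones_like(x), x])
            coef, *_ = np.linalg.lstsq(design, y, rcond=None)
            return coef[1]

        for name, endogenous in [("behavior", behavior), ("equipment", equipment), ("environment", environment)]:
            print(f"rules -> {name}: path coefficient ~ {path_coefficient(rules, endogenous):.2f}")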

  9. Analysis 320 coal mine accidents using structural equation modeling with unsafe conditions of the rules and regulations as exogenous variables.

    PubMed

    Zhang, Yingyu; Shao, Wei; Zhang, Mengjia; Li, Hejun; Yin, Shijiu; Xu, Yingjun

    2016-07-01

    Mining has historically been considered a high-risk industry worldwide. Deaths caused by coal mine accidents exceed the sum of deaths from all other types of accidents in China. Statistics on 320 coal mine accidents in Shandong province show that all accidents contain indicators of "unsafe conditions of the rules and regulations" with a frequency of 1590, accounting for 74.3% of the total frequency of 2140. "Unsafe behaviors of the operator" is another important contributory factor, which mainly includes "operator error" and "venturing into dangerous places." A systems analysis approach was applied by using structural equation modeling (SEM) to examine the interactions between the contributory factors of coal mine accidents. The analysis of results leads to three conclusions. (i) "Unsafe conditions of the rules and regulations" affect the "unsafe behaviors of the operator," "unsafe conditions of the equipment," and "unsafe conditions of the environment." (ii) The three influencing factors of coal mine accidents (with the frequency of effect relation in descending order) are "lack of safety education and training," "rules and regulations of safety production responsibility," and "rules and regulations of supervision and inspection." (iii) The three influenced factors (with the frequency in descending order) of coal mine accidents are "venturing into dangerous places," "poor workplace environment," and "operator error." PMID:27085591

  10. An Efficient Analysis Methodology for Fluted-Core Composite Structures

    NASA Technical Reports Server (NTRS)

    Oremont, Leonard; Schultz, Marc R.

    2012-01-01

    The primary loading condition in launch-vehicle barrel sections is axial compression, and it is therefore important to understand the compression behavior of any structures, structural concepts, and materials considered in launch-vehicle designs. This understanding will necessarily come from a combination of test and analysis. However, certain potentially beneficial structures and structural concepts do not lend themselves to commonly used simplified analysis methods, and therefore innovative analysis methodologies must be developed if these structures and structural concepts are to be considered. This paper discusses such an analysis technique for the fluted-core sandwich composite structural concept. The presented technique is based on commercially available finite-element codes and uses shell elements to capture behavior that would normally require solid elements to model the detailed mechanical response of the structure. The shell thicknesses and offsets in this analysis technique are parameterized, and the parameters are adjusted through a heuristic procedure until the model matches the mechanical behavior of a more detailed shell-and-solid model. Additionally, the detailed shell-and-solid model can be strategically placed in a larger, global shell-only model to capture important local behavior. Comparisons between shell-only models, experiments, and more detailed shell-and-solid models show excellent agreement. The discussed analysis methodology, though only discussed in the context of fluted-core composites, is widely applicable to other concepts.

  11. Simplifying multivariate survival analysis using global score test methodology

    NASA Astrophysics Data System (ADS)

    Zain, Zakiyah; Aziz, Nazrina; Ahmad, Yuhaniz

    2015-12-01

    In clinical trials, the main purpose is often to compare efficacy between experimental and control treatments. Treatment comparisons often involve multiple endpoints, and this situation further complicates the analysis of survival data. In the case of tumor patients, endpoints concerning survival times include the times from tumor removal until the first, second and third tumor recurrences, and the time to death. For each patient, these endpoints are correlated, and the estimation of the correlation between two score statistics is fundamental to the derivation of the overall treatment advantage. In this paper, the bivariate survival analysis method using the global score test methodology is extended to the multivariate setting.
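
    A standard way to build such a global test is to sum the correlated endpoint-specific score statistics and standardize the sum by its estimated variance. The sketch below shows only that combination step; the score vector and covariance matrix are hypothetical inputs of the kind the endpoint-specific survival analyses would supply, not results from the paper.

        import numpy as np
        from scipy.stats import norm

        def global_score_test(scores, cov):
            """Combine correlated per-endpoint score statistics into one global
            statistic, Z = 1'U / sqrt(1'V1), referred to a standard normal."""
            scores = np.asarray(scores, dtype=float)
            cov = np.asarray(cov, dtype=float)
            ones = np.ones_like(scores)
            z = scores.sum() / np.sqrt(ones @ cov @ ones)
            return z, 2 * norm.sf(abs(z))

        # Hypothetical standardized scores for four correlated endpoints (times to the
        # first, second and third recurrence, and time to death) and their covariance.
        u = [1.8, 1.2, 0.9, 1.5]
        v = [[1.0, 0.4, 0.3, 0.2],
             [0.4, 1.0, 0.5, 0.3],
             [0.3, 0.5, 1.0, 0.4],
             [0.2, 0.3, 0.4, 1.0]]
        z, p = global_score_test(u, v)
        print(f"global Z = {z:.2f}, two-sided p = {p:.3f}")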

  12. Analysis on the Density Driven Air-Ingress Accident in VHTRs

    SciTech Connect

    Eung Soo Kim; Chang Oh; Richard Schultz; David Petti

    2008-11-01

    Air ingress following a pipe rupture is considered the most serious accident in VHTRs because of potential consequences such as core heat-up, loss of structural integrity, and toxic gas release. Previously, it had been believed that the main air-ingress mechanism in this accident is the molecular diffusion process between the reactor core and the cavity. However, according to some recent studies, there is another, faster air-ingress process that had not been considered before, called density-driven stratified flow. The potential for density-driven stratified air ingress into the VHTR following a large-break LOCA was first described in the NGNP Methods Technical Program, based on stratified flow studies performed with liquids. Density-gradient-driven stratified flow in advanced reactor systems has been the subject of active research for well over a decade, since density-gradient-dominated stratified flow is an inherent characteristic of the passive systems used in advanced reactors. Recently, Oh et al. performed a CFD analysis of the stratified flow in the VHTR and showed that this effect can significantly accelerate the air-ingress process. They also proposed to replace the original air-ingress scenario based on molecular diffusion with one based on stratified flow. This paper focuses on the effect of stratified flow on the results of the air-ingress accident in VHTRs.
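
    The speed of a density-driven, lock-exchange-type counter-current flow through a horizontal break scales roughly as the square root of the reduced gravity times the break height. The sketch below evaluates that scaling for representative air and helium densities; it is an order-of-magnitude illustration under assumed conditions, not the CFD analysis cited above.

        import math

        def reduced_gravity(rho_heavy, rho_light, g=9.81):
            """g' = g * (rho_heavy - rho_light) / rho_heavy, in m/s^2."""
            return g * (rho_heavy - rho_light) / rho_heavy

        def exchange_velocity(g_prime, height, froude=0.5):
            """Lock-exchange front speed estimate: u ~ Fr * sqrt(g' * H)."""
            return froude * math.sqrt(g_prime * height)

        # Assumed, illustrative post-depressurization conditions.
        rho_air = 1.2        # kg/m^3, cavity air
        rho_helium = 0.17    # kg/m^3, hot helium remaining in the vessel
        break_height = 1.0   # m, assumed height of the broken duct

        g_p = reduced_gravity(rho_air, rho_helium)
        print(f"reduced gravity g' = {g_p:.2f} m/s^2, "
              f"exchange-flow speed ~ {exchange_velocity(g_p, break_height):.2f} m/s")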

  13. Independent assessment of MELCOR as a severe accident thermal-hydraulic/source term analysis tool

    SciTech Connect

    Madni, I.K.; Eltawila, F.

    1994-01-01

    MELCOR is a fully integrated computer code that models all phases of the progression of severe accidents in light water reactor nuclear power plants, and is being developed for the US Nuclear Regulatory Commission (NRC) by Sandia National Laboratories (SNL). Brookhaven National Laboratory (BNL) has a program with the NRC called "MELCOR Verification, Benchmarking, and Applications," whose aim is to provide independent assessment of MELCOR as a severe accident thermal-hydraulic/source term analysis tool. The scope of this program is to perform quality control verification on all released versions of MELCOR, to benchmark MELCOR against more mechanistic codes and experimental data from severe fuel damage tests, and to evaluate the ability of MELCOR to simulate long-term severe accident transients in commercial LWRs, by applying the code to model both BWRs and PWRs. Under this program, BNL provided input to the NRC-sponsored MELCOR Peer Review, and is currently contributing to the MELCOR Cooperative Assessment Program (MCAP). This paper presents a summary of MELCOR assessment efforts at BNL and their contribution to NRC goals with respect to MELCOR.

  14. Daya Bay power station accident-consequence analysis for Hong Kong

    SciTech Connect

    Yeung, Mankit Ray; Ching, Ming Kam

    1989-01-01

    In the past decade, nations in Southeast Asia have been turning to nuclear power to meet their electricity demand. The Daya Bay nuclear power station, a joint venture of the People's Republic of China and Hong Kong, was initiated in the late 1970s and formalized in 1986. The site of the station is approximately 50 km from downtown Hong Kong and less than 30 km from some parts of the New Territories. Due to its proximity, the project has raised considerable concern about the consequences of potential accidents. The objective of this paper is to present some of the results of the accident-consequence analysis for Hong Kong. In general, the risk to the public near a nuclear power plant from a particular accident is controlled by three factors: (1) source terms, (2) dilution due to atmospheric dispersion, and (3) population distribution surrounding the plant. The present study uses a Gaussian-type puff dispersion/consequence model, RADIS, with the appropriate parameters for the transport of important radioactive isotopes. RADIS uses a two-dimensional trajectory that models plume movement in a way that is close to the real meteorological data. Modified Pasquill-Gifford diffusion parameters are used to account for vertical and transverse diffusion, and the model takes dry deposition, wet deposition, and groundshine effects into consideration. To demonstrate the applicability of the present model, some sample calculational results are shown.
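
    The workhorse of this type of dispersion/consequence model is the Gaussian concentration formula with stability-dependent dispersion parameters. The sketch below evaluates the steady-state (plume) form of that formula for a single ground-level receptor; the sigma correlations and release figures are assumed round numbers for illustration, not the RADIS model or its Daya Bay inputs.

        import math

        def sigma_y(x_m, a=0.08, b=1.0e-4):
            """Illustrative horizontal dispersion parameter (m), of the Briggs form
            a*x/sqrt(1 + b*x); the coefficients are assumed, not a specific stability class."""
            return a * x_m / math.sqrt(1.0 + b * x_m)

        def sigma_z(x_m, c=0.06, d=1.5e-3):
            """Illustrative vertical dispersion parameter (m); coefficients assumed."""
            return c * x_m / math.sqrt(1.0 + d * x_m)

        def plume_concentration(q_bq_s, u_m_s, x, y, z, stack_h):
            """Ground-reflecting Gaussian plume concentration (Bq/m^3)."""
            sy, sz = sigma_y(x), sigma_z(x)
            lateral = math.exp(-y**2 / (2.0 * sy**2))
            vertical = (math.exp(-(z - stack_h)**2 / (2.0 * sz**2)) +
                        math.exp(-(z + stack_h)**2 / (2.0 * sz**2)))
            return q_bq_s / (2.0 * math.pi * u_m_s * sy * sz) * lateral * vertical

        # Illustrative release: 1e12 Bq/s, 3 m/s wind, receptor 50 km directly downwind.
        c = plume_concentration(q_bq_s=1.0e12, u_m_s=3.0, x=5.0e4, y=0.0, z=0.0, stack_h=30.0)
        print(f"ground-level concentration ~ {c:.3e} Bq/m^3")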

  15. Light-Weight Radioisotope Heater Unit final safety analysis report (LWRHU-FSAR): Volume 2: Accident Model Document (AMD)

    SciTech Connect

    Johnson, E.W.

    1988-10-01

    The purposes of this volume of the LWRHU SAR, the Accident Model Document (AMD), are to: Identify all malfunctions, both singular and multiple, which can occur during the complete mission profile that could lead to release outside the clad of the radioisotopic material contained therein; Provide estimates of occurrence probabilities associated with these various accidents; Evaluate the response of the LWRHU (or its components) to the resultant accident environments; and Associate the potential event history with test data or analysis to determine the potential interaction of the released radionuclides with the biosphere.

  16. Accident consequences analysis of the HYLIFE-II inertial fusion energy power plant design

    SciTech Connect

    Reyes, S; Gomez del Rio, J; Sanz, J

    2000-02-23

    Previous studies of the safety and environmental (S and E) aspects of the HYLIFE-II inertial fusion energy (IFE) power plant design have used simplistic assumptions in order to estimate radioactivity releases under accident conditions. Conservatisms associated with these traditional analyses can mask the actual behavior of the plant and have revealed the need for more accurate modeling and analysis of accident conditions and radioactivity mobilization mechanisms. In the present work a set of computer codes traditionally used for magnetic fusion safety analyses (CHEMCON, MELCOR) has been applied for simulating accident conditions in a simple model of the HYLIFE-II IFE design. Here the authors consider a severe loss-of-coolant accident (LOCA) producing simultaneous failures of the beam tubes (providing a pathway for radioactivity release from the vacuum vessel towards the containment) and of the two barriers surrounding the chamber (inner shielding and the containment building itself). Even though containment failure would be a very unlikely event, it would be needed in order to produce significant off-site doses. The CHEMCON code allows calculation of long-term temperature transients in fusion reactor first wall, blanket, and shield structures resulting from decay heating. MELCOR is used to simulate a wide range of physical phenomena including thermal-hydraulics, heat transfer, aerosol physics and fusion product release and transport. The results of these calculations show that the estimated off-site dose is less than 6 mSv (0.6 rem), which is well below the value of 10 mSv (1 rem) given by the DOE Fusion Safety Standards for protection of the public from exposure to radiation during off-normal conditions.
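
    The long-term temperature transients referred to above are driven by decay heat competing with whatever heat removal remains after the accident. The essence can be illustrated with a lumped-capacitance energy balance, dT/dt = (P_decay(t) - h*A*(T - T_amb)) / (m*cp), integrated explicitly. The decay-heat curve and structure properties below are assumed placeholders; this toy balance is not the CHEMCON or MELCOR calculation.

        def decay_power(t_s, p0_w=5.0e6, frac=0.06, exponent=-0.2):
            """Illustrative decay-heat curve: P(t) = frac * P0 * t^exponent for t >= 1 s."""
            return frac * p0_w * max(t_s, 1.0) ** exponent

        def heatup_transient(mass_kg, cp_j_kg_k, h_w_m2_k, area_m2, t_amb_k,
                             t0_k, t_end_s, dt_s=10.0):
            """Explicit Euler integration of a lumped-capacitance energy balance."""
            t, temp, history = 0.0, t0_k, []
            while t <= t_end_s:
                history.append((t, temp))
                q_net = decay_power(t) - h_w_m2_k * area_m2 * (temp - t_amb_k)
                temp += q_net * dt_s / (mass_kg * cp_j_kg_k)
                t += dt_s
            return history

        # Assumed first-wall/shield parameters (illustrative round numbers only).
        hist = heatup_transient(mass_kg=2.0e5, cp_j_kg_k=500.0, h_w_m2_k=2.0,
                                area_m2=200.0, t_amb_k=300.0, t0_k=800.0,
                                t_end_s=24 * 3600)
        peak = max(temperature for _, temperature in hist)
        print(f"peak structure temperature over 24 h ~ {peak:.0f} K")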

  17. Accident consequences analysis of the HYLIFE-II inertial fusion energy power plant design

    NASA Astrophysics Data System (ADS)

    Reyes, S.; Latkowski, J. F.; Gomez del Rio, J.; Sanz, J.

    2001-05-01

    Previous studies of the safety and environmental aspects of the HYLIFE-II inertial fusion energy power plant design have used simplistic assumptions in order to estimate radioactivity releases under accident conditions. Conservatisms associated with these traditional analyses can mask the actual behavior of the plant and have revealed the need for more accurate modeling and analysis of accident conditions and radioactivity mobilization mechanisms. In the present work, computer codes traditionally used for magnetic fusion safety analyses (CHEMCON, MELCOR) have been applied for simulating accident conditions in a simple model of the HYLIFE-II IFE design. Here we consider a severe loss of coolant accident (LOCA) in conjunction with simultaneous failures of the beam tubes (providing a pathway for radioactivity release from the vacuum vessel towards the confinement) and of the two barriers surrounding the chamber (inner shielding and the confinement building itself). Even though confinement failure would be a very unlikely event, it would be needed in order to produce significant off-site doses. The CHEMCON code allows calculation of long-term temperature transients in fusion reactor first wall, blanket, and shield structures resulting from decay heating. MELCOR is used to simulate a wide range of physical phenomena including thermal-hydraulics, heat transfer, aerosol physics and fusion product transport and release. The results of these calculations show that the estimated off-site dose is less than 5 mSv (0.5 rem), which is well below the value of 10 mSv (1 rem) given by the DOE Fusion Safety Standards for protection of the public from exposure to radiation during off-normal conditions.

  18. Residual Strength Analysis Methodology: Laboratory Coupons to Structural Components

    NASA Technical Reports Server (NTRS)

    Dawicke, D. S.; Newman, J. C., Jr.; Starnes, J. H., Jr.; Rose, C. A.; Young, R. D.; Seshadri, B. R.

    2000-01-01

    The NASA Aircraft Structural Integrity Program (NASIP) and Airframe Airworthiness Assurance/Aging Aircraft (AAA/AA) Programs have developed a residual strength prediction methodology for aircraft fuselage structures. This methodology has been experimentally verified for structures ranging from laboratory coupons up to full-scale structural components. The methodology uses the critical crack tip opening angle (CTOA) fracture criterion to characterize the fracture behavior and a material and geometric nonlinear finite element shell analysis code to perform the structural analyses. The present paper presents the results of a study to evaluate the fracture behavior of 2024-T3 aluminum alloys with thicknesses of 0.04 inches to 0.09 inches. The critical CTOA and the corresponding plane strain core height necessary to simulate through-the-thickness effects at the crack tip in an otherwise plane stress analysis were determined from small laboratory specimens. Using these parameters, the CTOA fracture criterion was used to predict the behavior of middle crack tension specimens that were up to 40 inches wide, flat panels with riveted stiffeners and multiple-site damage cracks, 18-inch diameter pressurized cylinders, and full scale curved stiffened panels subjected to internal pressure and mechanical loads.
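
    In a CTOA-based simulation, the crack is advanced whenever the opening angle measured a fixed distance behind the current crack tip reaches the critical value calibrated from the laboratory coupons. The one-line geometric relation is sketched below; the 5.25 degree critical angle and the 1 mm measurement distance are assumed placeholders, not the calibrated values from this study.

        import math

        def ctoa_deg(opening_displacement_m, distance_behind_tip_m):
            """Crack-tip opening angle from the crack-face opening measured a fixed
            distance behind the current tip: CTOA = 2*atan(delta / (2*d))."""
            return math.degrees(2.0 * math.atan(opening_displacement_m /
                                                (2.0 * distance_behind_tip_m)))

        CRITICAL_CTOA_DEG = 5.25   # assumed placeholder critical angle for a thin sheet
        d = 0.001                  # m, assumed fixed measurement distance behind the tip

        for delta in (0.00005, 0.00009, 0.00012):   # crack opening displacements, m
            angle = ctoa_deg(delta, d)
            status = "advances" if angle >= CRITICAL_CTOA_DEG else "stays stationary"
            print(f"delta = {delta * 1000:.2f} mm -> CTOA = {angle:.2f} deg, crack {status}")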

  19. Analysis and design methodology for VLSI computing networks. Final report

    SciTech Connect

    Lev-Ari, H.

    1984-08-01

    Several methods for modeling and analysis of parallel algorithms and architectures have been proposed in recent years. These include recursion-type methods, like recursion equations, z-transform descriptions and do-loops in high-level programming languages, and precedence-graph-type methods like data-flow graphs (marked graphs) and related Petri-net derived models. Most efforts have recently been directed towards developing methodologies for structured parallel algorithms and architectures and, in particular, for systolic-array-like systems. Some important properties of parallel algorithms have been identified in the process of this research effort. These include executability (the absence of deadlocks), pipelinability, regularity of structure, locality of interconnections, and dimensionality. The research has also demonstrated the feasibility of multirate systolic arrays with different rates of data propagation along different directions in the array. This final report presents a new methodology for modeling and analysis of parallel algorithms and architectures. This methodology provides a unified conceptual framework, called the modular computing network, that clearly displays the key properties of parallel systems.

  20. Analysis of National Major Work Safety Accidents in China, 2003–2012

    PubMed Central

    YE, Yunfeng; ZHANG, Siheng; RAO, Jiaming; WANG, Haiqing; LI, Yang; WANG, Shengyong; DONG, Xiaomei

    2016-01-01

    Background: This study provides a national profile of major work safety accidents in China, which cause more than 10 fatalities per accident, and is intended to provide a scientific basis for prevention measures and strategies to reduce major work safety accidents and deaths. Methods: Data from the 2003–2012 census of major work safety accidents were collected from the State Administration of Work Safety System (SAWS). Published literature and statistical yearbooks were also used to supplement the information. We analyzed the frequency of accidents and deaths, trends, geographic distribution and injury types. Additionally, we discussed the severity and urgency of emergency rescue by type of accident. Results: A total of 877 major work safety accidents were reported, resulting in 16,795 deaths and 9,183 injuries. The number of accidents and deaths, the mortality rate and the incidence of major accidents have declined in recent years. The mortality rate and incidence were 0.71 and 1.20 per million population in 2012, respectively. Transportation and mining contributed to the highest numbers of major accidents and deaths. Major aviation and railway accidents caused more casualties per incident, while collapse, machinery, electrical shock and tailings dam accidents were the most severe situations, resulting in a larger proportion of deaths. Conclusion: Ten years’ major work safety accident data indicate that the frequency of accidents and the number of deaths have declined, although several safety concerns persist in some segments. PMID:27057515

  1. Towards a Methodology for Identifying Program Constraints During Requirements Analysis

    NASA Technical Reports Server (NTRS)

    Romo, Lilly; Gates, Ann Q.; Della-Piana, Connie Kubo

    1997-01-01

    Requirements analysis is the activity that involves determining the needs of the customer, identifying the services that the software system should provide and understanding the constraints on the solution. The result of this activity is a natural language document, typically referred to as the requirements definition document. Some of the problems that exist in defining requirements in large scale software projects include synthesizing knowledge from various domain experts and communicating this information across multiple levels of personnel. One approach that addresses part of this problem is called context monitoring and involves identifying the properties of and relationships between objects that the system will manipulate. This paper examines several software development methodologies, discusses the support that each provides for eliciting such information from experts and specifying the information, and suggests refinements to these methodologies.

  2. Development of test methodology for dynamic mechanical analysis instrumentation

    NASA Technical Reports Server (NTRS)

    Allen, V. R.

    1982-01-01

    Dynamic mechanical analysis instrumentation was used for the development of specific test methodology in the determination of engineering parameters of selected materials, especially plastics and elastomers, over a broad range of temperatures and selected environments. The methodology for routine procedures was established with specific attention given to sample geometry, sample size, and mounting techniques. The basic software of the duPont 1090 thermal analyzer was used for data reduction, which simplified the theoretical interpretation. Clamps were developed which allowed 'relative' damping during the cure cycle to be measured for the fiber-glass supported resin. The attempted correlation of fracture energy 'toughness' (or impact strength) with the low-temperature (glassy) relaxation responses for a 'rubber-modified' epoxy system gave a negative result because the low-temperature dispersion mode (-80 C) of the modifier coincided with that of the epoxy matrix, making quantitative comparison unrealistic.

  3. Radiation protection: an analysis of thyroid blocking. [Effectiveness of KI in reducing radioactive uptake following potential reactor accident]

    SciTech Connect

    Aldrich, D C; Blond, R M

    1980-01-01

    An analysis was performed to provide guidance to policymakers concerning the effectiveness of potassium iodide (KI) as a thyroid blocking agent in potential reactor accident situations, the distance to which (or area within which) it should be distributed, and its relative effectiveness compared to other available protective measures. The analysis was performed using the Reactor Safety Study (WASH-1400) consequence model. Four categories of accidents were addressed: gap activity release accident (GAP), GAP without containment isolation, core melt with a melt-through release, and core melt with an atmospheric release. Cost-benefit ratios (US $/thyroid nodule prevented) are given assuming that no other protective measures are taken. Uncertainties due to health effects parameters, accident probabilities, and costs are assessed. The effects of other potential protective measures, such as evacuation and sheltering, and the impact on children (critical population) are evaluated. Finally, risk-benefit considerations are briefly discussed.
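
    A cost-benefit ratio of the kind reported above divides the cost of stockpiling and distributing KI by the expected number of thyroid nodules prevented, which depends on the probability-weighted thyroid dose, the blocking effectiveness, and a nodule risk coefficient. The sketch below shows that arithmetic with entirely hypothetical inputs; it does not reproduce the WASH-1400-based consequence calculations.

        def cost_per_nodule_prevented(population, cost_per_person_usd,
                                      expected_thyroid_dose_sv,
                                      blocking_effectiveness, nodule_risk_per_sv):
            """US$ spent on KI distribution per thyroid nodule prevented; the dose is
            the accident-probability-weighted mean thyroid dose per person."""
            total_cost = population * cost_per_person_usd
            nodules_prevented = (population * expected_thyroid_dose_sv *
                                 blocking_effectiveness * nodule_risk_per_sv)
            return total_cost / nodules_prevented

        # Hypothetical inputs: 100,000 people in the distribution area, US$1 per person,
        # 0.01 Sv probability-weighted thyroid dose, 90% uptake blocking, and an assumed
        # nodule risk coefficient per sievert.
        ratio = cost_per_nodule_prevented(population=1.0e5, cost_per_person_usd=1.0,
                                          expected_thyroid_dose_sv=0.01,
                                          blocking_effectiveness=0.9,
                                          nodule_risk_per_sv=1.0e-2)
        print(f"~US${ratio:,.0f} per thyroid nodule prevented")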

  4. Progress in Addressing DNFSB Recommendation 2002-1 Issues: Improving Accident Analysis Software Applications

    SciTech Connect

    VINCENT, ANDREW

    2005-04-25

    Defense Nuclear Facilities Safety Board (DNFSB) Recommendation 2002-1 ("Quality Assurance for Safety-Related Software") identified a number of quality assurance issues on the use of software in Department of Energy (DOE) facilities for analyzing hazards, and designing and operating controls to prevent or mitigate potential accidents. Over the last year, DOE has begun several processes and programs as part of the Implementation Plan commitments, and in particular, has made significant progress in addressing several sets of issues particularly important in the application of software for performing hazard and accident analysis. The work discussed here demonstrates that through these actions, Software Quality Assurance (SQA) guidance and software tools are available that can be used to improve the resulting safety analysis. Specifically, five of the primary actions corresponding to the commitments made in the Implementation Plan to Recommendation 2002-1 are identified and discussed in this paper. Included are the web-based DOE SQA Knowledge Portal and the Central Registry, guidance and gap analysis reports, an electronic bulletin board and discussion forum, and a DOE safety software guide. These SQA products can benefit DOE safety contractors in the development of hazard and accident analyses by precluding inappropriate software applications and utilizing best practices when incorporating software results into safety basis documentation. The improvement actions discussed here mark a beginning in establishing stronger, standard-compliant programs, practices, and processes in SQA among safety software users, managers, and reviewers throughout the DOE Complex. Additional effort is needed, however, particularly in: (1) processes to add new software applications to the DOE Safety Software Toolbox; (2) improving the effectiveness of software issue communication; and (3) promoting a safety software quality assurance culture.

  5. A Look at Aircraft Accident Analysis in the Early Days: Do Early 20th Century Accident Investigation Techniques Have Any Lessons for Today?

    NASA Technical Reports Server (NTRS)

    Holloway, C. M.; Johnson, C. W.

    2007-01-01

    In the early years of powered flight, the National Advisory Committee for Aeronautics in the United States produced three reports describing a method of analysis of aircraft accidents. The first report was published in 1928; the second, which was a revision of the first, was published in 1930; and the third, which was a revision and update of the second, was published in 1936. This paper describes the contents of these reports, and compares the method of analysis proposed therein to the methods used today.

  6. Biomass Thermogravimetric Analysis: Uncertainty Determination Methodology and Sampling Maps Generation

    PubMed Central

    Pazó, Jose A.; Granada, Enrique; Saavedra, Ángeles; Eguía, Pablo; Collazo, Joaquín

    2010-01-01

    The objective of this study was to develop a methodology for the determination of the maximum sampling error and confidence intervals of thermal properties obtained from thermogravimetric analysis (TG), including moisture, volatile matter, fixed carbon and ash content. The sampling procedure of the TG analysis was of particular interest and was conducted with care. The results of the present study were compared to those of a prompt analysis, and a correlation between the mean values and maximum sampling errors of the methods was not observed. In general, low and acceptable levels of uncertainty and error were obtained, demonstrating that the properties evaluated by TG analysis were representative of the overall fuel composition. The accurate determination of the thermal properties of biomass with precise confidence intervals is of particular interest in energetic biomass applications. PMID:20717532
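
    For replicate TG determinations of a single property, a Student-t interval on the mean gives both the confidence interval and, via its half-width, one common definition of the maximum sampling error. The sketch below illustrates that calculation on hypothetical replicate ash contents; the values are not the paper's data and its full sampling-map methodology is not reproduced.

        import numpy as np
        from scipy import stats

        def confidence_interval(values, confidence=0.95):
            """Student-t confidence interval for the mean of replicate measurements;
            the half-width serves as a simple maximum-sampling-error estimate."""
            values = np.asarray(values, dtype=float)
            sem = stats.sem(values)   # standard error of the mean (ddof = 1)
            half_width = sem * stats.t.ppf(0.5 + confidence / 2.0, df=len(values) - 1)
            return values.mean(), half_width

        # Hypothetical replicate ash contents (wt%) from repeated TG runs of one sample.
        ash = [2.31, 2.45, 2.38, 2.52, 2.29, 2.41]
        mean, err = confidence_interval(ash)
        print(f"ash content = {mean:.2f} +/- {err:.2f} wt% (95% CI)")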

  7. Light-Weight Radioisotope Heater Unit Safety Analysis Report (LWRHU-SAR). Volume II. Accident model document

    SciTech Connect

    Johnson, E.W.

    1985-10-01

    The purposes of this volume (AMD) are to: Identify all malfunctions, both singular and multiple, which can occur during the complete mission profile that could lead to release outside the clad of the radioisotopic material contained therein; provide estimates of occurrence probabilities associated with these various accidents; evaluate the response of the LWRHU (or its components) to the resultant accident environments; and associate the potential event history with test data or analysis to determine the potential interaction of the released radionuclides with the biosphere.

  8. Research methodologies in palliative care: a bibliometric analysis.

    PubMed

    Payne, S A; Turner, J M

    2008-06-01

    The aspiration to design and conduct high-quality research in palliative care has been an important but elusive goal. The article evaluates the nature of research methodologies presented in published research within the broad remit of palliative care. A systematic search of the Medline database between 1997 and 2006, using the keywords 'palliative care' or 'end-of-life care' and 'research methodology', identified over 318 publications. A bibliometric analysis indicates an incremental increase in published outputs per year, from 27 countries, with articles widely distributed across 108 journals. The heterogeneity of the research methodologies and the journals publishing them present challenges in defining what constitutes 'high quality'. We argue that although this diversity leads to a lack of coherence for a single disciplinary paradigm for palliative care, there is a greater acknowledgement of the differing epistemological and theoretical frameworks used by researchers. This could be regarded as enriching our understanding of what it means to be dying in contemporary society. PMID:18541637

  9. Hypothetical accident condition thermal analysis and testing of a Type B drum package

    SciTech Connect

    Hensel, S.J.; Alstine, M.N. Van; Gromada, R.J.

    1995-07-01

    A thermophysical property model developed to analytically determine the thermal response of cane fiberboard when exposed to temperatures and heat fluxes associated with the 10 CFR 71 hypothetical accident condition (HAC) has been benchmarked against two Type B drum package fire test results. The model 9973 package was fire tested after a 30 ft. top down drop and puncture, and an undamaged model 9975 package containing a heater (21W) was fire tested to determine content heat source effects. Analysis results using a refined version of a previously developed HAC fiberboard model compared well against the test data from both the 9973 and 9975 packages.
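
    A fiberboard thermal response model of this kind ultimately solves transient conduction through the overpack during the 30-minute regulatory fire. Its core can be sketched as an explicit finite-difference solution of the one-dimensional heat equation with the fire-exposed surface held at the fire temperature; the property values and boundary treatment below are assumed round numbers for illustration, not the benchmarked cane-fiberboard model.

        import numpy as np

        # Explicit 1D conduction through a fiberboard slab during a 30-minute,
        # 800 C (1073 K) fire. Property values are illustrative placeholders.
        k, rho, cp = 0.06, 240.0, 1700.0        # W/m-K, kg/m^3, J/kg-K (assumed)
        alpha = k / (rho * cp)                  # thermal diffusivity, m^2/s
        thickness, n = 0.10, 51                 # 10 cm slab, 51 nodes
        dx = thickness / (n - 1)
        dt = 0.4 * dx**2 / alpha                # satisfies the explicit stability limit
        t_fire, t_init, duration = 1073.0, 300.0, 1800.0

        temp = np.full(n, t_init)
        t = 0.0
        while t < duration:
            temp[0] = t_fire                    # fire-exposed surface held at fire temperature
            temp[1:-1] = temp[1:-1] + alpha * dt / dx**2 * (temp[2:] - 2.0 * temp[1:-1] + temp[:-2])
            temp[-1] = temp[-2]                 # adiabatic inner surface (conservative)
            t += dt

        print(f"inner-surface temperature after the 30-minute fire ~ {temp[-1]:.0f} K")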

  10. Methodological challenges of genome-wide association analysis in Africa

    PubMed Central

    Teo, Yik-Ying; Small, Kerrin S.; Kwiatkowski, Dominic P.

    2013-01-01

    Medical research in Africa has yet to benefit from the advent of genome-wide association (GWA) analysis, partly because the genotyping tools and statistical methods that have been developed for European and Asian populations struggle to deal with the high levels of genome diversity and population structure in Africa. However, the haplotypic diversity of African populations might help to overcome one of the major roadblocks in GWA research, the fine mapping of causal variants. We review the methodological challenges and consider how GWA studies in Africa will be transformed by new approaches in statistical imputation and large-scale genome sequencing. PMID:20084087

  11. A Posteriori Analysis for Hydrodynamic Simulations Using Adjoint Methodologies

    SciTech Connect

    Woodward, C S; Estep, D; Sandelin, J; Wang, H

    2009-02-26

    This report contains results of analysis done during an FY08 feasibility study investigating the use of adjoint methodologies for a posteriori error estimation for hydrodynamics simulations. We developed an approach to adjoint analysis for these systems through use of modified equations and viscosity solutions. Targeting first the 1D Burgers equation, we include a verification of the adjoint operator for the modified equation for the Lax-Friedrichs scheme, then derivations of an a posteriori error analysis for a finite difference scheme and a discontinuous Galerkin scheme applied to this problem. We include some numerical results showing the use of the error estimate. Lastly, we develop a computable a posteriori error estimate for the MAC scheme applied to stationary Navier-Stokes.

  12. SYNTHESIS OF SAFETY ANALYSIS AND FIRE HAZARD ANALYSIS METHODOLOGIES

    SciTech Connect

    Coutts, D

    2007-04-17

    Successful implementation of both the nuclear safety program and fire protection program is best accomplished using a coordinated process that relies on sound technical approaches. When systematically prepared, the documented safety analysis (DSA) and fire hazard analysis (FHA) can present a consistent technical basis that streamlines implementation. If not coordinated, the DSA and FHA can present inconsistent conclusions, which can create unnecessary confusion and can promulgate a negative safety perception. This paper will compare the scope, purpose, and analysis techniques for DSAs and FHAs. It will also consolidate several lessons-learned papers on this topic, which were prepared in the 1990s.

  13. Methodology assessment and recommendations for the Mars science laboratory launch safety analysis.

    SciTech Connect

    Sturgis, Beverly Rainwater; Metzinger, Kurt Evan; Powers, Dana Auburn; Atcitty, Christopher B.; Robinson, David B; Hewson, John C.; Bixler, Nathan E.; Dodson, Brian W.; Potter, Donald L.; Kelly, John E.; MacLean, Heather J.; Bergeron, Kenneth Donald (Sala & Associates); Bessette, Gregory Carl; Lipinski, Ronald J.

    2006-09-01

    thousands of possible event sequences and to build up a statistical representation of the releases for each accident case. A code to carry out this process will have to be developed or adapted from previous MMRTG missions. Finally, Level C translates the release (or "source term") information from Level B into public risk by applying models for atmospheric transport and the health consequences of exposure to the released plutonium dioxide. A number of candidate codes for this level of analysis are available. This report surveys the range of available codes and tools for each of these levels and makes recommendations for which choices are best for the MSL mission. It also identifies areas where improvements to the codes are needed. In some cases a second tier of codes may be identified to provide supporting or clarifying insight about particular issues. The main focus of the methodology assessment is to identify a suite of computational tools that can produce a high-quality SAR that can be successfully reviewed by external bodies (such as the Interagency Nuclear Safety Review Panel) on the schedule established by NASA and DOE.

  14. Contemporary Impact Analysis Methodology for Planetary Sample Return Missions

    NASA Technical Reports Server (NTRS)

    Perino, Scott V.; Bayandor, Javid; Samareh, Jamshid A.; Armand, Sasan C.

    2015-01-01

    The development of an Earth entry vehicle and the methodology created to evaluate the vehicle's impact landing response when returning to Earth are reported. NASA's future Mars Sample Return Mission requires a robust vehicle to return Martian samples back to Earth for analysis. The Earth entry vehicle is a proposed solution to this Mars mission requirement. During Earth reentry, the vehicle slows within the atmosphere and then impacts the ground at its terminal velocity. To protect the Martian samples, a spherical energy absorber called an impact sphere is under development. The impact sphere is composed of hybrid composite and crushable foam elements that endure large plastic deformations during impact and cause a highly nonlinear vehicle response. The developed analysis methodology captures a range of complex structural interactions and much of the failure physics that occurs during impact. Numerical models were created and benchmarked against experimental tests conducted at NASA Langley Research Center. The postimpact structural damage assessment showed close correlation between simulation predictions and experimental results. Acceleration, velocity, displacement, damage modes, and failure mechanisms were all effectively captured. These investigations demonstrate that the Earth entry vehicle has great potential in facilitating future sample return missions.

  15. Evaluation of potential severe accidents during low power and shutdown operations at Grand Gulf, Unit 1. Volume 2, Part 1C: Analysis of core damage frequency from internal events for plant operational State 5 during a refueling outage, Main report (Sections 11-14)

    SciTech Connect

    Whitehead, D.; Darby, J.; Yakle, J.

    1994-06-01

    This document contains the accident sequence analysis of internally initiated events for Grand Gulf, Unit 1 as it operates in the Low Power and Shutdown Plant Operational State 5 during a refueling outage. The report documents the methodology used during the analysis, describes the results from the application of the methodology, and compares the results with the results from two full power analyses performed on Grand Gulf.

  16. NASA Structural Analysis Report on the American Airlines Flight 587 Accident - Local Analysis of the Right Rear Lug

    NASA Technical Reports Server (NTRS)

    Raju, Ivatury S; Glaessgen, Edward H.; Mason, Brian H; Krishnamurthy, Thiagarajan; Davila, Carlos G

    2005-01-01

    A detailed finite element analysis of the right rear lug of the American Airlines Flight 587 - Airbus A300-600R was performed as part of the National Transportation Safety Board's failure investigation of the accident that occurred on November 12, 2001. The loads experienced by the right rear lug are evaluated using global models of the vertical tail, local models near the right rear lug, and a global-local analysis procedure. The right rear lug was analyzed using two modeling approaches. In the first approach, solid-shell type modeling is used, and in the second approach, layered-shell type modeling is used. The solid-shell and the layered-shell modeling approaches were used in progressive failure analyses (PFA) to determine the load, mode, and location of failure in the right rear lug under loading representative of an Airbus certification test conducted in 1985 (the 1985-certification test). Both analyses were in excellent agreement with each other on the predicted failure loads, failure mode, and location of failure. The solid-shell type modeling was then used to analyze both a subcomponent test conducted by Airbus in 2003 (the 2003-subcomponent test) and the accident condition. Excellent agreement was observed between the analyses and the observed failures in both cases. From the analyses conducted and presented in this paper, the following conclusions were drawn. The moment, Mx (moment about the fuselage longitudinal axis), has significant effect on the failure load of the lugs. Higher absolute values of Mx give lower failure loads. The predicted load, mode, and location of the failure of the 1985-certification test, 2003-subcomponent test, and the accident condition are in very good agreement. This agreement suggests that the 1985-certification and 2003-subcomponent tests represent the accident condition accurately. The failure mode of the right rear lug for the 1985-certification test, 2003-subcomponent test, and the accident load case is identified as a

  17. Analysis of Occupational Accident Fatalities and Injuries Among Male Group in Iran Between 2008 and 2012

    PubMed Central

    Alizadeh, Seyed Shamseddin; Mortazavi, Seyed Bagher; Sepehri, Mohammad Mehdi

    2015-01-01

    Background: Because of occupational accidents, permanent disabilities and deaths occur, and economic and workday losses emerge. Objectives: The purpose of the present study was to investigate the factors responsible for occupational accidents that occurred in Iran. Patients and Methods: The current study analyzed 1464 occupational accidents recorded by the Ministry of Labor and Social Affairs’ offices in Iran during 2008 - 2012. At first, a general understanding of the accidents was obtained using descriptive statistics. Afterwards, the chi-square test and Cramer’s V statistic (Vc) were used to determine the association between contributing factors and the type of injury as the occupational accident outcome. Results: There was no significant association between marital status and time of day with the type of injury. However, activity sector, cause of accident, victim’s education, age of victim and victim’s experience were significantly associated with the type of injury. Conclusions: Successful accident prevention relies largely on knowledge about the causes of accidents. In any accident control activity, particularly for occupational accidents, correctly identifying high-risk groups and factors influencing accidents is the key to successful interventions. The results of this study can increase accident awareness and enable workplace management to select and prioritize problem areas and safety system weaknesses in workplaces. PMID:26568848
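
    The association measures used above are straightforward to compute for any contingency table of an accident factor against injury type: a chi-square test of independence plus Cramer's V as the effect size. The sketch below uses a small hypothetical table, not the 1464 recorded accidents.

        import numpy as np
        from scipy.stats import chi2_contingency

        def cramers_v(table):
            """Chi-square test and Cramer's V effect size for an r x c contingency table."""
            table = np.asarray(table, dtype=float)
            chi2, p, dof, _ = chi2_contingency(table)
            v = np.sqrt(chi2 / (table.sum() * (min(table.shape) - 1)))
            return v, chi2, p

        # Hypothetical counts: rows = activity sector, columns = injury type.
        table = [[120, 45, 15],
                 [80, 60, 10],
                 [40, 30, 25]]
        v, chi2, p = cramers_v(table)
        print(f"chi-square = {chi2:.1f}, p = {p:.4f}, Cramer's V = {v:.2f}")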

  18. A Content Analysis of News Media Coverage of the Accident at Three Mile Island.

    ERIC Educational Resources Information Center

    Stephens, Mitchell; Edison, Nadyne G.

    A study was conducted for the President's Commission on the Accident at Three Mile Island to analyze coverage of the accident by ten news organizations: two wire services, three commercial television networks, and five daily newspapers. Copies of all stories and transcripts of news programs during the first week of the accident were examined from…

  19. Development of integrated core disruptive accident analysis code for FBR - ASTERIA-FBR

    SciTech Connect

    Ishizu, T.; Endo, H.; Tatewaki, I.; Yamamoto, T.; Shirakawa, N.

    2012-07-01

    The evaluation of consequences of severe accidents is the most important safety licensing issue for the core of a liquid metal cooled fast breeder reactor (LMFBR), since the LMFBR core is not in an optimum condition from the viewpoint of reactivity. This characteristic might induce a super-prompt criticality due to core geometry changes during a core disruptive accident (CDA). Previous CDA analysis codes have been modeled in separate phases depending on the mechanism driving a super-prompt criticality, and the subsequent events are then calculated by connecting different codes. This scheme, however, can introduce uncertainty and/or arbitrariness into the calculation results. To resolve these issues and obtain consistent calculation results without arbitrariness, JNES is developing the ASTERIA-FBR code for the purpose of providing the cross-check analysis code, which is another required scheme to confirm the validity of the evaluation results prepared by applicants, in the safety licensing procedure of the planned high performance core of Monju. ASTERIA-FBR consists of three major calculation modules: CONCORD, dynamic-GMVP, and FEMAXI-FBR. CONCORD is a three-dimensional thermal-hydraulics calculation module with a multi-phase, multi-component, multi-velocity field model. Dynamic-GMVP is a space-time neutronics calculation module. FEMAXI-FBR calculates fuel pellet deformation behavior and fuel pin failure behavior. This paper describes the needs behind the ASTERIA-FBR development, outlines the major modules, and summarizes the model validation status. (authors)

  20. Quantification method analysis of the relationship between occupant injury and environmental factors in traffic accidents.

    PubMed

    Ju, Yong Han; Sohn, So Young

    2011-01-01

    Injury analysis following a vehicle crash is one of the most important research areas. However, most injury analyses have focused on a single injury variable at a time, such as the AIS (Abbreviated Injury Scale) or the IIS (Injury Impairment Scale), in relation to various traffic accident factors. Such studies cannot reflect the various injury phenomena that appear simultaneously. In this paper, we apply quantification method II to the NASS (National Automotive Sampling System) CDS (Crashworthiness Data System) to find the relationship between categorical injury phenomena, such as the injury scale, injury position, and injury type, and various traffic accident condition factors, such as speed, collision direction, vehicle type, and seat position. Our empirical analysis indicated the importance of safety devices, such as restraint equipment and airbags. In addition, we found that narrow impact, ejection, air bag deployment, and higher speed are associated with severe rather than minor injury to the thigh, ankle, and leg, in the form of dislocation, abrasion, or laceration. PMID:21094332
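
    Quantification method II assigns numerical scores to the categories of qualitative predictors so that their weighted sum best discriminates an outcome category, which is closely related to running a linear discriminant analysis on dummy-coded predictors. The sketch below shows that approximation on a handful of invented crash records; it is not the NASS CDS analysis, and the variables and data are hypothetical.

        import pandas as pd
        from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

        # Hypothetical crash records: categorical accident conditions and an injury outcome.
        df = pd.DataFrame({
            "collision": ["frontal", "side", "frontal", "rollover", "side",
                          "rollover", "frontal", "side", "rollover", "frontal"],
            "airbag": ["deployed", "none", "deployed", "none", "deployed",
                       "none", "none", "deployed", "deployed", "none"],
            "seat": ["driver", "passenger", "driver", "driver", "passenger",
                     "driver", "passenger", "driver", "driver", "passenger"],
            "injury": ["minor", "severe", "minor", "severe", "minor",
                       "severe", "severe", "minor", "severe", "minor"],
        })

        # Dummy-code the qualitative predictors (the items/categories of quantification II).
        X = pd.get_dummies(df[["collision", "airbag", "seat"]], drop_first=True).astype(float)
        lda = LinearDiscriminantAnalysis().fit(X, df["injury"])

        # The coefficients play the role of category scores: large magnitudes mark
        # categories that separate severe from minor injuries in this toy data set.
        for name, coef in zip(X.columns, lda.coef_[0]):
            print(f"{name:22s} {coef:+.3f}")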

  1. Bayesian data analysis of severe fatal accident risk in the oil chain.

    PubMed

    Eckle, Petrissa; Burgherr, Peter

    2013-01-01

    We analyze the risk of severe fatal accidents causing five or more fatalities for nine different activities covering the entire oil chain. Included are exploration and extraction, transport by different modes, refining and final end use in power plants, heating or gas stations. The risks are quantified separately for OECD and non-OECD countries and trends are calculated. Risk is analyzed by employing a Bayesian hierarchical model yielding analytical functions for both frequency (Poisson) and severity distributions (Generalized Pareto) as well as frequency trends. This approach addresses a key problem in risk estimation, namely the scarcity of data, which results in high uncertainties, in particular for the risk of extreme events, where the risk is extrapolated beyond the historically most severe accidents. Bayesian data analysis allows the pooling of information from different data sets covering, for example, the different stages of the energy chains or different modes of transportation. In addition, it also inherently delivers a measure of uncertainty. This approach provides a framework which comprehensively covers risk throughout the oil chain, allowing the allocation of risk in sustainability assessments. It also permits the progressive addition of new data to refine the risk estimates. Frequency, severity, and trends show substantial differences between the activities, emphasizing the need for detailed risk analysis. PMID:22642363
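
    The two building blocks named above, a Poisson model for how often severe accidents occur and a Generalized Pareto model for how many fatalities they cause, can each be fitted in a few lines. The sketch below shows a conjugate Gamma-Poisson update for the accident rate and a maximum-likelihood Generalized Pareto fit to fatalities above the five-fatality threshold; the counts and fatality figures are hypothetical, and the paper's full hierarchical pooling across chain stages and country groups is not reproduced.

        import numpy as np
        from scipy import stats

        # Frequency: Gamma-Poisson conjugate update for the severe-accident rate.
        accidents_per_year = [2, 0, 1, 3, 1, 0, 2, 1]        # hypothetical annual counts
        prior_shape, prior_rate = 0.5, 1.0                   # weakly informative Gamma prior
        post_shape = prior_shape + sum(accidents_per_year)
        post_rate = prior_rate + len(accidents_per_year)
        rate_ci = stats.gamma.ppf([0.05, 0.95], a=post_shape, scale=1.0 / post_rate)
        print(f"posterior accident rate: {post_shape / post_rate:.2f}/yr "
              f"(90% interval {rate_ci[0]:.2f}-{rate_ci[1]:.2f})")

        # Severity: Generalized Pareto fit to fatalities in excess of the 5-fatality threshold.
        fatalities = np.array([5, 6, 9, 14, 7, 5, 23, 8, 11, 6, 40, 5])   # hypothetical
        excess = fatalities - 5.0
        shape, _, scale = stats.genpareto.fit(excess, floc=0.0)
        p_50_plus = stats.genpareto.sf(50.0 - 5.0, shape, loc=0.0, scale=scale)
        print(f"GPD shape = {shape:.2f}, scale = {scale:.1f}; "
              f"P(fatalities > 50 | severe accident) ~ {p_50_plus:.3f}")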

  2. Safety analysis report for the Galileo Mission. Volume 2, book 1: Accident model document

    NASA Astrophysics Data System (ADS)

    1988-12-01

    The Accident Model Document (AMD) is the second volume of the three volume Final Safety Analysis Report (FSAR) for the Galileo outer planetary space science mission. This mission employs Radioisotope Thermoelectric Generators (RTGs) as the prime electrical power sources for the spacecraft. Galileo will be launched into Earth orbit using the Space Shuttle and will use the Inertial Upper Stage (IUS) booster to place the spacecraft into an Earth escape trajectory. The RTG's employ silicon-germanium thermoelectric couples to produce electricity from the heat energy that results from the decay of the radioisotope fuel, Plutonium-238, used in the RTG heat source. The heat source configuration used in the RTG's is termed General Purpose Heat Source (GPHS), and the RTG's are designated GPHS-RTGs. The use of radioactive material in these missions necessitates evaluations of the radiological risks that may be encountered by launch complex personnel as well as by the Earth's general population resulting from postulated malfunctions or failures occurring in the mission operations. The FSAR presents the results of a rigorous safety assessment, including substantial analyses and testing, of the launch and deployment of the RTGs for the Galileo mission. This AMD is a summary of the potential accident and failure sequences which might result in fuel release, the analysis and testing methods employed, and the predicted source terms. Each source term consists of a quantity of fuel released, the location of release and the physical characteristics of the fuel released. Each source term has an associated probability of occurrence.

  3. Bayesian data analysis of severe fatal accident risk in the oil chain.

    PubMed

    Eckle, Petrissa; Burgherr, Peter

    2013-01-01

    We analyze the risk of severe fatal accidents causing five or more fatalities for nine different activities covering the entire oil chain. Included are exploration and extraction, transport by different modes, refining and final end use in power plants, heating or gas stations. The risks are quantified separately for OECD and non-OECD countries and trends are calculated. Risk is analyzed by employing a Bayesian hierarchical model yielding analytical functions for both frequency (Poisson) and severity distributions (Generalized Pareto) as well as frequency trends. This approach addresses a key problem in risk estimation, namely the scarcity of data, which results in high uncertainties, in particular for the risk of extreme events, where the risk is extrapolated beyond the historically most severe accidents. Bayesian data analysis allows the pooling of information from different data sets covering, for example, the different stages of the energy chains or different modes of transportation. In addition, it also inherently delivers a measure of uncertainty. This approach provides a framework which comprehensively covers risk throughout the oil chain, allowing the allocation of risk in sustainability assessments. It also permits the progressive addition of new data to refine the risk estimates. Frequency, severity, and trends show substantial differences between the activities, emphasizing the need for detailed risk analysis.

  4. Segment clustering methodology for unsupervised Holter recordings analysis

    NASA Astrophysics Data System (ADS)

    Rodríguez-Sotelo, Jose Luis; Peluffo-Ordoñez, Diego; Castellanos Dominguez, German

    2015-01-01

    Cardiac arrhythmia analysis on Holter recordings is an important issue in clinical settings; however, it implicitly involves dealing with a large amount of unlabelled data, which means a high computational cost. In this work an unsupervised methodology based on a segment framework is presented, which consists of dividing the raw data into a balanced number of segments in order to identify fiducial points and to characterize and cluster the heartbeats in each segment separately. The resulting clusters are merged or split according to an assumed criterion of homogeneity. This framework compensates for the high computational cost of Holter analysis, making its implementation for future real-time applications possible. The performance of the method is measured on the records from the MIT/BIH arrhythmia database and achieves high values of sensitivity and specificity, taking advantage of the database labels, for a broad range of heartbeat types recommended by the AAMI.
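
    The segment framework described above amounts to splitting the recording into manageable blocks, clustering the beats within each block, and then merging clusters whose centroids are sufficiently close. The sketch below shows that control flow on synthetic beat feature vectors; it is a structural illustration only, not the authors' algorithm or the MIT/BIH evaluation.

        import numpy as np
        from sklearn.cluster import KMeans

        rng = np.random.default_rng(1)

        # Synthetic "heartbeat feature vectors" for a long recording (three underlying
        # beat classes, three features per beat), stand-ins for real Holter features.
        beats = np.vstack([rng.normal(center, 0.3, size=(4000, 3))
                           for center in ([0, 0, 0], [2, 1, 0], [0, 2, 2])])
        rng.shuffle(beats)

        def cluster_by_segment(features, n_segments=10, k_per_segment=4, merge_tol=0.7):
            """Cluster each segment independently, then merge centroids closer than
            merge_tol, so the whole recording is never clustered at once."""
            centroids = []
            for segment in np.array_split(features, n_segments):
                km = KMeans(n_clusters=k_per_segment, n_init=5, random_state=0).fit(segment)
                centroids.extend(km.cluster_centers_)
            merged = []
            for c in centroids:
                if all(np.linalg.norm(c - m) > merge_tol for m in merged):
                    merged.append(c)
            return np.array(merged)

        print(f"{len(cluster_by_segment(beats))} merged beat classes found across segments")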

  5. Process hazards analysis (PrHA) program, bridging accident analyses and operational safety

    SciTech Connect

    Richardson, J. A.; McKernan, S. A.; Vigil, M. J.

    2003-01-01

    Recently the Final Safety Analysis Report (FSAR) for the Plutonium Facility at Los Alamos National Laboratory, Technical Area 55 (TA-55), was revised and submitted to the U.S. Department of Energy (DOE). As a part of this effort, over seventy Process Hazards Analyses (PrHAs) were written and/or revised over the six years prior to the FSAR revision. TA-55 is a research, development, and production nuclear facility that primarily supports U.S. defense and space programs. Nuclear fuels and material research; material recovery, refining and analyses; and the casting, machining and fabrication of plutonium components are some of the activities conducted at TA-55. These operations involve a wide variety of industrial, chemical and nuclear hazards. Operational personnel along with safety analysts work as a team to prepare the PrHA. PrHAs describe the process; identify the hazards; and analyze hazards including determining hazard scenarios, their likelihood, and consequences. In addition, the interaction of the process with facility systems, structures and operation-specific protective features is part of the PrHA. This information is rolled up to determine bounding accidents and mitigating systems and structures. Further detailed accident analysis is performed for the bounding accidents and included in the FSAR. The FSAR is part of the Documented Safety Analysis (DSA) that defines the safety envelope for all facility operations in order to protect the worker, the public, and the environment. The DSA is in compliance with the U.S. Code of Federal Regulations, 10 CFR 830, Nuclear Safety Management, and is approved by DOE. The DSA sets forth the bounding conditions necessary for the safe operation of the facility and is essentially a 'license to operate.' Safety of day-to-day operations is based on Hazard Control Plans (HCPs). Hazards are initially identified in the PrHA for the specific operation and act as input to the HCP. Specific protective features important to worker

  6. Narrative text analysis of accident reports with tractors, self-propelled harvesting machinery and materials handling machinery in Austrian agriculture from 2008 to 2010 - a comparison.

    PubMed

    Mayrhofer, Hannes; Quendler, Elisabeth; Boxberger, Josef

    2014-01-01

    The aim of this study was the identification of accident scenarios and causes by analysing existing accident reports of recognized agricultural occupational accidents with tractors, self-propelled harvesting machinery and materials handling machinery from 2008 to 2010. As a result of a literature-based evaluation of past accident analyses, narrative text analysis was chosen as an appropriate method. A narrative analysis of the text fields of accident reports that farmers used to report accidents to insurers was conducted to obtain detailed information about the scenarios and causes of accidents. This narrative analysis of reports was conducted for the first time and yielded initial insights into the antecedents of accidents and potential opportunities for technology-based intervention. A literature and internet search was carried out to discuss and confirm the findings. The narrative text analysis showed that in more than one third of the accidents with tractors and materials handling machinery the vehicle rolled or tipped over. The most relevant accident scenarios with harvesting machinery were being trapped and falling down. The direct comparison of the analysed machinery categories showed that more than 10% of the accidents in each category were caused by technical faults, slippery or muddy terrain and incorrect or inappropriate operation of the vehicle. Accidents with tractors, harvesting machinery and materials handling machinery showed similarities in terms of causes, circumstances and consequences. Certain technical and communicative measures for accident prevention could be used for all three machinery categories. Nevertheless, some individual solutions for accident prevention, which suit each specific machine type, would be necessary.

  7. Accident management information needs

    SciTech Connect

    Hanson, D.J.; Ward, L.W.; Nelson, W.R.; Meyer, O.R.

    1990-04-01

    In support of the US Nuclear Regulatory Commission (NRC) Accident Management Research Program, a methodology has been developed for identifying the plant information needs necessary for personnel involved in the management of an accident to diagnose that an accident is in progress, select and implement strategies to prevent or mitigate the accident, and monitor the effectiveness of these strategies. This report describes the methodology and presents an application of this methodology to a Pressurized Water Reactor (PWR) with a large dry containment. A risk-important severe accident sequence for a PWR is used to examine the capability of the existing measurements to supply the necessary information. The method includes an assessment of the effects of the sequence on the measurement availability including the effects of environmental conditions. The information needs and capabilities identified using this approach are also intended to form the basis for more comprehensive information needs assessment performed during the analyses and development of specific strategies for use in accident management prevention and mitigation. 3 refs., 16 figs., 7 tabs.

  8. Visualization of Traffic Accidents

    NASA Technical Reports Server (NTRS)

    Wang, Jie; Shen, Yuzhong; Khattak, Asad

    2010-01-01

    Traffic accidents have tremendous impact on society. Annually approximately 6.4 million vehicle accidents are reported by police in the US and nearly half of them result in catastrophic injuries. Visualizations of traffic accidents using geographic information systems (GIS) greatly facilitate handling and analysis of traffic accidents in many aspects. Environmental Systems Research Institute (ESRI), Inc. is the world leader in GIS research and development. ArcGIS, a software package developed by ESRI, has the capabilities to display events associated with a road network, such as accident locations, and pavement quality. But when event locations related to a road network are processed, the existing algorithm used by ArcGIS does not utilize all the information related to the routes of the road network and produces erroneous visualization results of event locations. This software bug causes serious problems for applications in which accurate location information is critical for emergency responses, such as traffic accidents. This paper aims to address this problem and proposes an improved method that utilizes all relevant information of traffic accidents, namely, route number, direction, and mile post, and extracts correct event locations for accurate traffic accident visualization and analysis. The proposed method generates a new shape file for traffic accidents and displays them on top of the existing road network in ArcGIS. Visualization of traffic accidents along Hampton Roads Bridge Tunnel is included to demonstrate the effectiveness of the proposed method.
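
    The linear-referencing step that this paper corrects, mapping a (route, milepost, direction) triple to map coordinates, can be sketched outside ArcGIS with simple interpolation. The route geometry, milepost calibration, and direction convention below are invented placeholders; a real workflow would read them from the GIS road network.

```python
# Linear-referencing sketch: place an accident on a route polyline from its milepost.
# Route geometry, mileposts and the direction handling here are invented; a real
# workflow would take them from the road-network data.
import numpy as np

def locate_event(route_xy, route_mileposts, event_milepost, reverse_direction=False):
    """Interpolate x/y coordinates of an event given its milepost along the route."""
    route_xy = np.asarray(route_xy, dtype=float)
    mp = np.asarray(route_mileposts, dtype=float)
    if reverse_direction:               # opposite travel direction: measure from the far end
        event_milepost = mp[-1] - event_milepost
    x = np.interp(event_milepost, mp, route_xy[:, 0])
    y = np.interp(event_milepost, mp, route_xy[:, 1])
    return x, y

# Toy route: 3 vertices with cumulative mileposts 0, 2.5 and 6.0
route_xy = [(0.0, 0.0), (2.0, 1.5), (5.0, 3.0)]
mileposts = [0.0, 2.5, 6.0]
print(locate_event(route_xy, mileposts, event_milepost=4.0))
```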

  9. Electrical equipment performance under severe accident conditions (BWR/Mark 1 plant analysis): Summary report

    SciTech Connect

    Bennett, P.R.; Kolaczkowski, A.M.; Medford, G.T.

    1986-09-01

    The purpose of the Performance Evaluation of Electrical Equipment during Severe Accident States Program is to determine the performance of electrical equipment, important to safety, under severe accident conditions. In FY85, a method was devised to identify important electrical equipment and the severe accident environments in which the equipment was likely to fail. This method was used to evaluate the equipment and severe accident environments for Browns Ferry Unit 1, a BWR/Mark I. Following this work, a test plan was written in FY86 to experimentally determine the performance of one selected component when exposed to two severe accident environments.

  10. Investigating accident causation through information network modelling.

    PubMed

    Griffin, T G C; Young, M S; Stanton, N A

    2010-02-01

    Management of risk in complex domains such as aviation relies heavily on post-event investigations, requiring complex approaches to fully understand the integration of multi-causal, multi-agent and multi-linear accident sequences. The Event Analysis of Systemic Teamwork methodology (EAST; Stanton et al. 2008) offers such an approach based on network models. In this paper, we apply EAST to a well-known aviation accident case study, highlighting communication between agents as a central theme and investigating the potential for finding agents who were key to the accident. Ultimately, this work aims to develop a new model based on distributed situation awareness (DSA) to demonstrate that the risk inherent in a complex system is dependent on the information flowing within it. By identifying key agents and information elements, we can propose proactive design strategies to optimize the flow of information and help work towards avoiding aviation accidents. Statement of Relevance: This paper introduces a novel application of a holistic methodology for understanding aviation accidents. Furthermore, it introduces an ongoing project developing a nonlinear and prospective method that centralises distributed situation awareness and communication as themes. The relevance of the findings is discussed in the context of current ergonomic and aviation issues of design, training and human-system interaction. PMID:20099174

  11. Analysis of station blackout accidents for the Bellefonte pressurized water reactor

    SciTech Connect

    Gasser, R D; Bieniarz, P P; Tills, J L

    1986-09-01

    An analysis has been performed for the Bellefonte PWR Unit 1 to determine the containment loading and the radiological releases into the environment from a station blackout accident. A number of issues have been addressed in this analysis, including the effects of direct heating on containment loading, and the effects of fission product heating and natural convection on releases from the primary system. The results indicate that direct heating involving more than about 50% of the core can fail the Bellefonte containment, but natural convection in the RCS may lead to overheating and failure of the primary system piping before core slump, thus eliminating or mitigating direct heating. Releases from the primary system are significantly increased before vessel breach due to natural circulation and after vessel breach due to re-evolution of retained fission products by fission product heating of RCS structures. 6 refs., 8 figs.

  12. Probabilistic accident consequence uncertainty analysis -- Uncertainty assessment for internal dosimetry. Volume 2: Appendices

    SciTech Connect

    Goossens, L.H.J.; Kraan, B.C.P.; Cooke, R.M.; Harrison, J.D.; Harper, F.T.; Hora, S.C.

    1998-04-01

    The development of two new probabilistic accident consequence codes, MACCS and COSYMA, was completed in 1990. These codes estimate the consequences from the accidental releases of radiological material from hypothesized accidents at nuclear installations. In 1991, the US Nuclear Regulatory Commission and the Commission of the European Communities began cosponsoring a joint uncertainty analysis of the two codes. The ultimate objective of this joint effort was to systematically develop credible and traceable uncertainty distributions for the respective code input variables. A formal expert judgment elicitation and evaluation process was identified as the best technology available for developing a library of uncertainty distributions for these consequence parameters. This report focuses on the results of the study to develop distributions for variables related to the MACCS and COSYMA internal dosimetry models. This volume contains appendices that include (1) a summary of the MACCS and COSYMA consequence codes, (2) the elicitation questionnaires and case structures, (3) the rationales and results for the panel on internal dosimetry, (4) short biographies of the experts, and (5) the aggregated results of their responses.

  13. DYNAMIC ANALYSIS OF HANFORD UNIRRADIATED FUEL PACKAGE SUBJECTED TO SEQUENTIAL LATERAL LOADS IN HYPOTHETICAL ACCIDENT CONDITIONS

    SciTech Connect

    Wu, T

    2008-04-30

    Large fuel casks present challenges when evaluating their performance in the Hypothetical Accident Conditions (HAC) specified in the Code of Federal Regulations Title 10 part 71 (10CFR71). Testing is often limited by cost, difficulty in preparing test units and the limited availability of facilities which can carry out such tests. In the past, many casks were evaluated without testing by using simplified analytical methods. This paper presents a numerical technique for evaluating the dynamic responses of large fuel casks subjected to sequential HAC loading. A nonlinear dynamic analysis was performed for a Hanford Unirradiated Fuel Package (HUFP) [1] to evaluate the cumulative damage after the hypothetical accident Conditions of a 30-foot lateral drop followed by a 40-inch lateral puncture as specified in 10CFR71. The structural integrity of the containment vessel is justified based on the analytical results in comparison with the stress criteria, specified in the ASME Code, Section III, Appendix F [2], for Level D service loads. The analyzed cumulative damages caused by the sequential loading of a 30-foot lateral drop and a 40-inch lateral puncture are compared with the package test data. The analytical results are in good agreement with the test results.

  14. Learning from the Piper Alpha accident: A postmortem analysis of technical and organizational factors

    SciTech Connect

    Pate-Cornell, M.E.

    1993-04-01

    The accident that occurred on board the offshore platform Piper Alpha in July 1988 killed 167 people and cost billions of dollars in property damage. It was caused by a massive fire, which was not the result of an unpredictable 'act of God' but of an accumulation of errors and questionable decisions. Most of them were rooted in the organization, its structure, procedures, and culture. This paper analyzes the accident scenario using the risk analysis framework, determines which human decisions and actions influenced the occurrence of the basic events, and then identifies the organizational roots of these decisions and actions. These organizational factors are generalizable to other industries and engineering systems. They include flaws in the design guidelines and design practices (e.g., tight physical couplings or insufficient redundancies), misguided priorities in the management of the tradeoff between productivity and safety, mistakes in the management of the personnel on board, and errors of judgement in the process by which financial pressures are applied on the production sector (i.e., the oil companies' definition of profit centers) resulting in deficiencies in inspection and maintenance operations. This analytical approach allows identification of risk management measures that go beyond the purely technical (e.g., add redundancies to a safety system) and also include improvements of management practices. 18 refs., 4 figs.

  15. Probabilistic accident consequence uncertainty analysis -- Uncertainty assessment for deposited material and external doses. Volume 2: Appendices

    SciTech Connect

    Goossens, L.H.J.; Kraan, B.C.P.; Cooke, R.M.; Boardman, J.; Jones, J.A.; Harper, F.T.; Young, M.L.; Hora, S.C.

    1997-12-01

    The development of two new probabilistic accident consequence codes, MACCS and COSYMA, was completed in 1990. These codes estimate the consequences from the accidental releases of radiological material from hypothesized accidents at nuclear installations. In 1991, the US Nuclear Regulatory Commission and the Commission of the European Communities began cosponsoring a joint uncertainty analysis of the two codes. The ultimate objective of this joint effort was to systematically develop credible and traceable uncertainty distributions for the respective code input variables. A formal expert judgment elicitation and evaluation process was identified as the best technology available for developing a library of uncertainty distributions for these consequence parameters. This report focuses on the results of the study to develop distributions for variables related to the MACCS and COSYMA deposited material and external dose models. This volume contains appendices that include (1) a summary of the MACCS and COSYMA consequence codes, (2) the elicitation questionnaires and case structures, (3) the rationales and results for the panel on deposited material and external doses, (4) short biographies of the experts, and (5) the aggregated results of their responses.

  16. Probabilistic accident consequence uncertainty analysis -- Late health effects uncertainty assessment. Volume 2: Appendices

    SciTech Connect

    Little, M.P.; Muirhead, C.R.; Goossens, L.H.J.; Kraan, B.C.P.; Cooke, R.M.; Harper, F.T.; Hora, S.C.

    1997-12-01

    The development of two new probabilistic accident consequence codes, MACCS and COSYMA, was completed in 1990. These codes estimate the consequences from the accidental releases of radiological material from hypothesized accidents at nuclear installations. In 1991, the US Nuclear Regulatory Commission and the Commission of the European Communities began cosponsoring a joint uncertainty analysis of the two codes. The ultimate objective of this joint effort was to systematically develop credible and traceable uncertainty distributions for the respective code input variables. A formal expert judgment elicitation and evaluation process was identified as the best technology available for developing a library of uncertainty distributions for these consequence parameters. This report focuses on the results of the study to develop distributions for variables related to the MACCS and COSYMA late health effects models. This volume contains appendices that include (1) a summary of the MACCS and COSYMA consequence codes, (2) the elicitation questionnaires and case structures, (3) the rationales and results for the expert panel on late health effects, (4) short biographies of the experts, and (5) the aggregated results of their responses.

  17. Probabilistic accident consequence uncertainty analysis -- Early health effects uncertainty assessment. Volume 2: Appendices

    SciTech Connect

    Haskin, F.E.; Harper, F.T.; Goossens, L.H.J.; Kraan, B.C.P.

    1997-12-01

    The development of two new probabilistic accident consequence codes, MACCS and COSYMA, was completed in 1990. These codes estimate the consequences from the accidental releases of radiological material from hypothesized accidents at nuclear installations. In 1991, the US Nuclear Regulatory Commission and the Commission of the European Communities began cosponsoring a joint uncertainty analysis of the two codes. The ultimate objective of this joint effort was to systematically develop credible and traceable uncertainty distributions for the respective code input variables. A formal expert judgment elicitation and evaluation process was identified as the best technology available for developing a library of uncertainty distributions for these consequence parameters. This report focuses on the results of the study to develop distributions for variables related to the MACCS and COSYMA early health effects models. This volume contains appendices that include (1) a summary of the MACCS and COSYMA consequence codes, (2) the elicitation questionnaires and case structures, (3) the rationales and results for the panel on early health effects, (4) short biographies of the experts, and (5) the aggregated results of their responses.

  18. A New Methodology of Spatial Cross-Correlation Analysis

    PubMed Central

    Chen, Yanguang

    2015-01-01

    Spatial correlation modeling comprises both spatial autocorrelation and spatial cross-correlation processes. The spatial autocorrelation theory has been well-developed. It is necessary to advance the method of spatial cross-correlation analysis to supplement the autocorrelation analysis. This paper presents a set of models and analytical procedures for spatial cross-correlation analysis. By analogy with Moran’s index newly expressed in a spatial quadratic form, a theoretical framework is derived for geographical cross-correlation modeling. First, two sets of spatial cross-correlation coefficients are defined, including a global spatial cross-correlation coefficient and local spatial cross-correlation coefficients. Second, a pair of scatterplots of spatial cross-correlation is proposed, and the plots can be used to visually reveal the causality behind spatial systems. Based on the global cross-correlation coefficient, Pearson’s correlation coefficient can be decomposed into two parts: direct correlation (partial correlation) and indirect correlation (spatial cross-correlation). As an example, the methodology is applied to the relationships between China’s urbanization and economic development to illustrate how to model spatial cross-correlation phenomena. This study is an introduction to developing the theory of spatial cross-correlation, and future geographical spatial analysis might benefit from these models and indexes. PMID:25993120
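
    One way to make the global coefficient concrete is the sketch below, which builds a cross-variable analogue of Moran's I from mean-centered variables and a spatial weights matrix. This is an illustrative form assumed by analogy with the abstract's description; the exact coefficient defined in the paper may differ, and the weights and variable values are invented.

```python
# Illustrative global spatial cross-correlation coefficient, built by analogy with
# Moran's I on mean-centered variables and a spatial weights matrix. The exact
# coefficient defined in the paper may differ; data and weights are invented.
import numpy as np

def global_spatial_cross_corr(x, y, W):
    """x, y: values of two variables over n spatial units; W: n x n spatial weights."""
    x = np.asarray(x, float)
    y = np.asarray(y, float)
    W = np.asarray(W, float)
    n = len(x)
    zx, zy = x - x.mean(), y - y.mean()
    s0 = W.sum()                                   # total weight, as in Moran's I
    return (n / s0) * (zx @ W @ zy) / np.sqrt((zx @ zx) * (zy @ zy))

# Four spatial units on a line with simple adjacency weights
W = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)
urbanization = [0.3, 0.45, 0.6, 0.8]
gdp_per_capita = [1.1, 1.8, 2.6, 3.9]
print(global_spatial_cross_corr(urbanization, gdp_per_capita, W))
```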

  19. Bond energy analysis revisited and designed toward a rigorous methodology

    NASA Astrophysics Data System (ADS)

    Nakai, Hiromi; Ohashi, Hideaki; Imamura, Yutaka; Kikuchi, Yasuaki

    2011-09-01

    The present study theoretically revisits and numerically assesses two-body energy decomposition schemes including a newly proposed one. The new decomposition scheme is designed to make the equilibrium bond distance equivalent with the minimum point of bond energies. Although the other decomposition schemes generally predict the wrong order of the C-C bond strengths of C2H2, C2H4, and C2H6, the new decomposition scheme is capable of reproducing the C-C bond strengths. Numerical assessment on a training set of molecules demonstrates that the present scheme exhibits a stronger correlation with bond dissociation energies than the other decomposition schemes do, which suggests that the new decomposition scheme is a reliable and powerful analysis methodology.

  20. Computational methodology for ChIP-seq analysis

    PubMed Central

    Shin, Hyunjin; Liu, Tao; Duan, Xikun; Zhang, Yong; Liu, X. Shirley

    2015-01-01

    Chromatin immunoprecipitation coupled with massive parallel sequencing (ChIP-seq) is a powerful technology to identify the genome-wide locations of DNA binding proteins such as transcription factors or modified histones. As more and more experimental laboratories are adopting ChIP-seq to unravel the transcriptional and epigenetic regulatory mechanisms, computational analyses of ChIP-seq also become increasingly comprehensive and sophisticated. In this article, we review current computational methodology for ChIP-seq analysis, recommend useful algorithms and workflows, and introduce quality control measures at different analytical steps. We also discuss how ChIP-seq could be integrated with other types of genomic assays, such as gene expression profiling and genome-wide association studies, to provide a more comprehensive view of gene regulatory mechanisms in important physiological and pathological processes. PMID:25741452
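
    As a toy illustration of one core step in ChIP-seq peak calling, the sketch below applies the kind of local Poisson background test that peak callers commonly rely on conceptually: the read count in a candidate window is compared with the rate expected from a larger surrounding region. This is not any specific tool's implementation, and the counts are invented.

```python
# Toy version of a local-Poisson enrichment test for ChIP-seq: compare the read
# count in a candidate window against a background rate estimated from a larger
# surrounding region. Not an actual peak caller; all counts are invented.
import numpy as np
from scipy import stats

def window_enrichment_pvalue(reads_in_window, window_bp, reads_in_background, background_bp):
    """One-sided Poisson p-value for observing >= reads_in_window given the local background."""
    lam = reads_in_background * (window_bp / background_bp)   # expected reads in the window
    # P(X >= k) under Poisson(lam); sf(k-1) gives the inclusive upper tail
    return stats.poisson.sf(reads_in_window - 1, lam)

# Hypothetical counts: 85 reads in a 500 bp window vs 2000 reads in a 50 kb background
print(window_enrichment_pvalue(85, 500, 2000, 50_000))
```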

  1. Development and application of proton NMR methodology to lipoprotein analysis

    NASA Astrophysics Data System (ADS)

    Korhonen, Ari Juhani

    1998-11-01

    The present thesis describes the development of 1H NMR spectroscopy and its applications to lipoprotein analysis in vitro, utilizing biochemical prior knowledge and advanced lineshape fitting analysis in the frequency domain. A method for absolute quantification of lipoprotein lipids and proteins directly from the terminal methyl-CH3 resonance region of 1H NMR spectra of human blood plasma is described. Then the use of NMR methodology in time course studies of the oxidation process of LDL particles is presented. The function of the cholesteryl ester transfer protein (CETP) in lipoprotein mixtures was also assessed by 1H NMR, which allows for dynamic follow-up of the lipid transfer reactions between VLDL, LDL, and HDL particles. The results corroborated the suggestion that neutral lipid mass transfer among lipoproteins is not an equimolar heteroexchange. A novel method for studying lipoprotein particle fusion is also demonstrated. It is shown that the progression of proteolytically (α- chymotrypsin) induced fusion of LDL particles can be followed by 1H NMR spectroscopy and, moreover, that fusion can be distinguished from aggregation. In addition, NMR methodology was used to study the changes in HDL3 particles induced by phospholipid transfer protein (PLTP) in HDL3 + PLTP mixtures. The 1H NMR study revealed a gradual production of enlarged HDL particles, which demonstrated that PLTP-mediated remodeling of HDL involves fusion of the HDL particles. These applications demonstrated that the 1H NMR approach offers several advantages both in quantification and in time course studies of lipoprotein-lipoprotein interactions and of enzyme/lipid transfer protein function.

  2. The Analysis of the Contribution of Human Factors to the In-Flight Loss of Control Accidents

    NASA Technical Reports Server (NTRS)

    Ancel, Ersin; Shih, Ann T.

    2012-01-01

    In-flight loss of control (LOC) is currently the leading cause of fatal accidents based on various commercial aircraft accident statistics. As the Next Generation Air Transportation System (NextGen) emerges, new contributing factors leading to LOC are anticipated. The NASA Aviation Safety Program (AvSP), along with other aviation agencies and communities are actively developing safety products to mitigate the LOC risk. This paper discusses the approach used to construct a generic integrated LOC accident framework (LOCAF) model based on a detailed review of LOC accidents over the past two decades. The LOCAF model is comprised of causal factors from the domain of human factors, aircraft system component failures, and atmospheric environment. The multiple interdependent causal factors are expressed in an Object-Oriented Bayesian belief network. In addition to predicting the likelihood of LOC accident occurrence, the system-level integrated LOCAF model is able to evaluate the impact of new safety technology products developed in AvSP. This provides valuable information to decision makers in strategizing NASA's aviation safety technology portfolio. The focus of this paper is on the analysis of human causal factors in the model, including the contributions from flight crew and maintenance workers. The Human Factors Analysis and Classification System (HFACS) taxonomy was used to develop human related causal factors. The preliminary results from the baseline LOCAF model are also presented.
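
    The flavor of the Bayesian belief network calculation described above can be conveyed with a hand-rolled toy example: three binary causal factors feed a conditional probability table for loss of control, and the marginal probability is obtained by summing over parent states. The structure and all probabilities below are invented placeholders, not the LOCAF model's actual variables or CPTs.

```python
# Hand-rolled toy Bayesian-network calculation in the spirit of the LOCAF model:
# three independent binary causal factors (human error, system failure, adverse
# environment) feed a conditional probability table for loss of control (LOC).
# All probabilities are invented for illustration; the real LOCAF CPTs differ.
from itertools import product

p_parent = {"human_error": 0.05, "system_failure": 0.02, "adverse_env": 0.10}

# P(LOC = True | human_error, system_failure, adverse_env), keyed by parent truth values
p_loc_given = {
    (False, False, False): 1e-6,
    (True,  False, False): 1e-4,
    (False, True,  False): 5e-5,
    (False, False, True):  2e-5,
    (True,  True,  False): 5e-3,
    (True,  False, True):  1e-3,
    (False, True,  True):  8e-4,
    (True,  True,  True):  5e-2,
}

def p_loc(evidence=None):
    """Marginal P(LOC) with optional hard evidence, e.g. {'human_error': True}."""
    evidence = evidence or {}
    total = 0.0
    for h, s, e in product([False, True], repeat=3):
        states = {"human_error": h, "system_failure": s, "adverse_env": e}
        if any(states[k] != v for k, v in evidence.items()):
            continue
        weight = 1.0
        for name, val in states.items():
            if name in evidence:
                continue  # evidence variables are fixed, not weighted by their prior
            weight *= p_parent[name] if val else 1.0 - p_parent[name]
        total += weight * p_loc_given[(h, s, e)]
    return total

print(f"baseline P(LOC) = {p_loc():.2e}")
print(f"P(LOC | human_error) = {p_loc({'human_error': True}):.2e}")
```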

  3. Accident safety analysis for 300 Area N Reactor Fuel Fabrication and Storage Facility

    SciTech Connect

    Johnson, D.J.; Brehm, J.R.

    1994-01-01

    The purpose of the accident safety analysis is to identify and analyze a range of credible events, their causes and consequences, and to provide technical justification for the conclusion that, without undue risk to the public, employees, or the environment: (1) uranium billets, fuel assemblies, uranium scrap, and chips and fines drums can be safely stored in the 300 Area N Reactor Fuel Fabrication and Storage Facility; (2) the contaminated equipment, High-Efficiency Particulate Air filters, ductwork, stacks, sewers and sumps can be cleaned (decontaminated) and/or removed; (3) the new concretion process in the 304 Building will be able to operate; and (4) limited fuel handling and packaging associated with removal of stored uranium is acceptable.

  4. Chiropractic treatment of patients in motor vehicle accidents: a statistical analysis

    PubMed Central

    Dies, Stephen; Strapp, J Walter

    1992-01-01

    Motor vehicle accidents (MVA) are a major cause of spinal injuries treated by chiropractors. In this study the files of one chiropractor were reviewed retrospectively to generate a data base on the MVA cases (n = 149). The effect of age, sex, vehicle damage, symptoms and concurrent physiotherapy on the dependent variables of number of treatments, improvement and requirement for ongoing treatment was computed using an analysis of variance. Overall the average number of treatments given was 14.2. Patients who complained of headache or low back pain required more treatments than average. Improvement level was lowered by delay in seeking treatment, the presence of uncomplicated nausea and advancing age. Ongoing treatment to relieve persistent pain was required in 40.2 percent of the cases. None of the factors studied had a significant effect on this variable. The results of this study are comparable to those reported in the medical literature.

  5. Surgical videos for accident analysis, performance improvement, and complication prevention: time for a surgical black box?

    PubMed

    Gambadauro, Pietro; Magos, Adam

    2012-03-01

    Conventional audit of surgical records through review of surgical results provides useful knowledge but hardly helps identify the technical reasons lying behind specific outcomes or complications. Surgical teams not only need to know that a complication might happen but also how and when it is most likely to happen. Functional awareness is therefore needed to prevent complications, know how to deal with them, and improve overall surgical performance. The authors wish to argue that the systematic recording and reviewing of surgical videos, a "surgical black box," might improve surgical care, help prevent complications, and allow accident analysis. A possible strategy to test this hypothesis is presented and discussed. Recording and reviewing surgical interventions, apart from helping us achieve functional awareness and increasing the safety profile of our performance, allows us also to effectively share our experience with colleagues. The authors believe that those potential implications make this hypothesis worth testing.

  6. PTSD symptom severity and psychiatric comorbidity in recent motor vehicle accident victims: a latent class analysis.

    PubMed

    Hruska, Bryce; Irish, Leah A; Pacella, Maria L; Sledjeski, Eve M; Delahanty, Douglas L

    2014-10-01

    We conducted a latent class analysis (LCA) on 249 recent motor vehicle accident (MVA) victims to examine subgroups that differed in posttraumatic stress disorder (PTSD) symptom severity, current major depressive disorder and alcohol/other drug use disorders (MDD/AoDs), gender, and interpersonal trauma history 6-weeks post-MVA. A 4-class model best fit the data with a resilient class displaying asymptomatic PTSD symptom levels/low levels of comorbid disorders; a mild psychopathology class displaying mild PTSD symptom severity and current MDD; a moderate psychopathology class displaying severe PTSD symptom severity and current MDD/AoDs; and a severe psychopathology class displaying extreme PTSD symptom severity and current MDD. Classes also differed with respect to gender composition and history of interpersonal trauma experience. These findings may aid in the development of targeted interventions for recent MVA victims through the identification of subgroups distinguished by different patterns of psychiatric problems experienced 6-weeks post-MVA. PMID:25124501

  7. Probabilistic accident consequence uncertainty analysis: Dispersion and deposition uncertainty assessment, appendices A and B

    SciTech Connect

    Harper, F.T.; Young, M.L.; Miller, L.A.; Hora, S.C.; Lui, C.H.; Goossens, L.H.J.; Cooke, R.M.; Paesler-Sauer, J.; Helton, J.C.

    1995-01-01

    The development of two new probabilistic accident consequence codes, MACCS and COSYMA, was completed in 1990. These codes estimate the risks presented by nuclear installations based on postulated frequencies and magnitudes of potential accidents. In 1991, the US Nuclear Regulatory Commission (NRC) and the Commission of the European Communities (CEC) began a joint uncertainty analysis of the two codes. The objective was to develop credible and traceable uncertainty distributions for the input variables of the codes. Expert elicitation was identified as the best technology available for developing a library of uncertainty distributions for the selected consequence parameters; the experts developed their distributions independently. The study was formulated jointly and was limited to the current code models and to physical quantities that could be measured in experiments. To validate the distributions generated for the wet deposition input variables, samples were taken from these distributions and propagated through the wet deposition code model along with the Gaussian plume model (GPM) implemented in the MACCS and COSYMA codes. Resulting distributions closely replicated the aggregated elicited wet deposition distributions. Project teams from the NRC and CEC cooperated successfully to develop and implement a unified process for the elaboration of uncertainty distributions on consequence code input parameters. Formal expert judgment elicitation proved valuable for synthesizing the best available information. Distributions on measurable atmospheric dispersion and deposition parameters were successfully elicited from experts involved in the many phenomenological areas of consequence analysis. This volume is the second of a three-volume document describing the project and contains two appendices describing the rationales for the dispersion and deposition data along with short biographies of the 16 experts who participated in the project.
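
    The propagation step described above, sampling from elicited parameter distributions and pushing the samples through a dispersion model, can be illustrated with a textbook Gaussian plume formula for the ground-level, plume-centerline concentration. The power-law dispersion coefficients and lognormal uncertainty factors below are invented placeholders; the GPM implemented in MACCS and COSYMA and the elicited distributions are considerably more elaborate.

```python
# Monte Carlo propagation of dispersion-parameter uncertainty through a textbook
# Gaussian plume (ground-level, plume-centerline concentration with ground
# reflection). Parameter values and distributions are invented placeholders.
import numpy as np

rng = np.random.default_rng(0)
n = 10_000

Q = 1.0            # source strength (Bq/s), nominal
u = 4.0            # wind speed (m/s)
H = 50.0           # effective release height (m)
x = 2_000.0        # downwind distance (m)

# Nominal power-law dispersion coefficients sigma = a * x**b (placeholder values),
# with lognormal uncertainty factors standing in for elicited distributions.
sigma_y = 0.08 * x**0.9 * rng.lognormal(mean=0.0, sigma=0.3, size=n)
sigma_z = 0.06 * x**0.85 * rng.lognormal(mean=0.0, sigma=0.4, size=n)

# Ground-level, centerline air concentration
conc = Q / (np.pi * u * sigma_y * sigma_z) * np.exp(-H**2 / (2.0 * sigma_z**2))

print("concentration quantiles (5%, 50%, 95%):",
      np.quantile(conc, [0.05, 0.5, 0.95]))
```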

  8. Final safety analysis report for the Galileo Mission: Volume 2: Book 1, Accident model document

    SciTech Connect

    Not Available

    1988-12-15

    The Accident Model Document (AMD) is the second volume of the three volume Final Safety Analysis Report (FSAR) for the Galileo outer planetary space science mission. This mission employs Radioisotope Thermoelectric Generators (RTGs) as the prime electrical power sources for the spacecraft. Galileo will be launched into Earth orbit using the Space Shuttle and will use the Inertial Upper Stage (IUS) booster to place the spacecraft into an Earth escape trajectory. The RTG's employ silicon-germanium thermoelectric couples to produce electricity from the heat energy that results from the decay of the radioisotope fuel, Plutonium-238, used in the RTG heat source. The heat source configuration used in the RTG's is termed General Purpose Heat Source (GPHS), and the RTG's are designated GPHS-RTGs. The use of radioactive material in these missions necessitates evaluations of the radiological risks that may be encountered by launch complex personnel as well as by the Earth's general population resulting from postulated malfunctions or failures occurring in the mission operations. The FSAR presents the results of a rigorous safety assessment, including substantial analyses and testing, of the launch and deployment of the RTGs for the Galileo mission. This AMD is a summary of the potential accident and failure sequences which might result in fuel release, the analysis and testing methods employed, and the predicted source terms. Each source term consists of a quantity of fuel released, the location of release and the physical characteristics of the fuel released. Each source term has an associated probability of occurrence. 27 figs., 11 tabs.

  9. Probabilistic accident consequence uncertainty analysis: Dispersion and deposition uncertainty assessment, main report

    SciTech Connect

    Harper, F.T.; Young, M.L.; Miller, L.A.; Hora, S.C.; Lui, C.H.; Goossens, L.H.J.; Cooke, R.M.; Paesler-Sauer, J.; Helton, J.C.

    1995-01-01

    The development of two new probabilistic accident consequence codes, MACCS and COSYMA, was completed in 1990. These codes estimate the risks presented by nuclear installations based on postulated frequencies and magnitudes of potential accidents. In 1991, the US Nuclear Regulatory Commission (NRC) and the Commission of the European Communities (CEC) began a joint uncertainty analysis of the two codes. The ultimate objective of the joint effort was to develop credible and traceable uncertainty distributions for the input variables of the codes. Expert elicitation was identified as the best technology available for developing a library of uncertainty distributions for the selected consequence parameters. The study was formulated jointly and was limited to the current code models and to physical quantities that could be measured in experiments. Experts developed their distributions independently. To validate the distributions generated for the wet deposition input variables, samples were taken from these distributions and propagated through the wet deposition code model. Resulting distributions closely replicated the aggregated elicited wet deposition distributions. To validate the distributions generated for the dispersion code input variables, samples were taken from the distributions and propagated through the Gaussian plume model (GPM) implemented in the MACCS and COSYMA codes. Project teams from the NRC and CEC cooperated successfully to develop and implement a unified process for the elaboration of uncertainty distributions on consequence code input parameters. Formal expert judgment elicitation proved valuable for synthesizing the best available information. Distributions on measurable atmospheric dispersion and deposition parameters were successfully elicited from experts involved in the many phenomenological areas of consequence analysis. This volume is the first of a three-volume document describing the project.

  10. System analysis with improved thermo-mechanical fuel rod models for modeling current and advanced LWR materials in accident scenarios

    NASA Astrophysics Data System (ADS)

    Porter, Ian Edward

    A nuclear reactor systems code has the ability to model the system response in an accident scenario based on known initial conditions at the onset of the transient. However, there has been a tendency for these codes to lack the detailed thermo-mechanical fuel rod response models needed for accurate prediction of fuel rod failure. This proposed work will couple today's most widely used steady-state (FRAPCON) and transient (FRAPTRAN) fuel rod models with the systems code TRACE for best-estimate modeling of system response in accident scenarios such as a loss of coolant accident (LOCA). In doing so, code modifications will be made to model gamma heating in LWRs during steady-state and accident conditions and to improve fuel rod thermal/mechanical analysis by allowing axial nodalization of burnup-dependent phenomena such as swelling, cladding creep and oxidation. With the ability to model both burnup-dependent parameters and transient fuel rod response, a fuel dispersal study will be conducted using a hypothetical accident scenario under both PWR and BWR conditions to determine the amount of fuel dispersed under varying conditions. Because fuel fragmentation size and internal rod pressure are both dependent on burnup, this analysis will be conducted at beginning, middle and end of cycle to examine the effect that cycle time can have on fuel rod failure and dispersal. Current fuel rod and system codes used by the Nuclear Regulatory Commission (NRC) are compilations of legacy codes with only commonly used light water reactor materials, Uranium Dioxide (UO2), Mixed Oxide (U/PuO2) and zirconium alloys. However, the accidents at Fukushima Daiichi and Three Mile Island have shown the need for exploration into advanced materials possessing improved accident tolerance. This work looks to further modify the NRC codes to include silicon carbide (SiC), an advanced cladding material proposed by current DOE funded research on accident tolerant fuels (ATF). Several

  11. A comparative analysis of accident risks in fossil, hydro, and nuclear energy chains

    SciTech Connect

    Burgherr, P.; Hirschberg, S.

    2008-07-01

    This study presents a comparative assessment of severe accident risks in the energy sector, based on the historical experience of fossil (coal, oil, natural gas, and LPG (Liquefied Petroleum Gas)) and hydro chains contained in the comprehensive Energy-related Severe Accident Database (ENSAD), as well as Probabilistic Safety Assessment (PSA) for the nuclear chain. Full energy chains were considered because accidents can take place at every stage of the chain. Comparative analyses for the years 1969-2000 included a total of 1870 severe (≥ 5 fatalities) accidents, amounting to 81,258 fatalities. Although 79.1% of all accidents and 88.9% of associated fatalities occurred in less developed, non-OECD countries, industrialized OECD countries dominated insured losses (78.0%), reflecting their substantially higher insurance density and stricter safety regulations. Aggregated indicators and frequency-consequence (F-N) curves showed that energy-related accident risks in non-OECD countries are distinctly higher than in OECD countries. Hydropower in non-OECD countries and upstream stages within fossil energy chains are most accident-prone. Expected fatality rates are lowest for Western hydropower and nuclear power plants; however, the maximum credible consequences can be very large. Total economic damages due to severe accidents are substantial, but small when compared with natural disasters. Similarly, external costs associated with severe accidents are generally much smaller than monetized damages caused by air pollution.
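
    The frequency-consequence (F-N) comparison described above can be sketched from raw accident records: for each consequence level N, compute the annual frequency of accidents causing N or more fatalities. The fatality counts and observation period below are invented for illustration and are not ENSAD data.

```python
# Sketch of an F-N (frequency-consequence) curve: for each consequence level N,
# the annual frequency of accidents causing N or more fatalities.
# Accident records and the observation period below are invented.
import numpy as np

fatalities = np.array([5, 5, 7, 9, 12, 18, 25, 60, 110, 400])   # per-accident fatalities
years_observed = 32.0                                            # e.g., 1969-2000

def fn_curve(fatalities, years):
    n_levels = np.sort(np.unique(fatalities))
    freq = np.array([(fatalities >= n).sum() / years for n in n_levels])
    return n_levels, freq

levels, freq = fn_curve(fatalities, years_observed)
for n, f in zip(levels, freq):
    print(f"N >= {n:4d}: {f:.3f} events/year")
```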

  12. An approach to accidents modeling based on compounds road environments.

    PubMed

    Fernandes, Ana; Neves, Jose

    2013-04-01

    The most common approach to study the influence of certain road features on accidents has been the consideration of uniform road segments characterized by a unique feature. However, when an accident is related to the road infrastructure, its cause is usually not a single characteristic but rather a complex combination of several characteristics. The main objective of this paper is to describe a methodology developed in order to consider the road as a complete environment by using compound road environments, overcoming the limitations inherent in considering only uniform road segments. The methodology consists of: dividing a sample of roads into segments; grouping them into quite homogeneous road environments using cluster analysis; and identifying the influence of skid resistance and texture depth on road accidents in each environment by using generalized linear models. The application of this methodology is demonstrated for eight roads. Based on real data from accidents and road characteristics, three compound road environments were established where the pavement surface properties significantly influence the occurrence of accidents. Results clearly showed that road environments where braking maneuvers are more common, or those with small radii of curvature and high speeds, require higher skid resistance and texture depth as an important contribution to accident prevention. PMID:23376544
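
    A minimal sketch of the two-step methodology, assuming synthetic data: road segments are first grouped into compound environments with k-means on standardized descriptors, and a Poisson GLM then relates accident counts to skid resistance and texture depth within one environment, with segment length as exposure. The variable names, cluster count, and data are placeholders rather than the authors' actual model.

```python
# Two-step sketch: (1) cluster road segments into compound environments,
# (2) within a cluster, fit a Poisson GLM relating accident counts to skid
# resistance and texture depth with segment length as exposure.
# The synthetic data and variable names are placeholders.
import numpy as np
import statsmodels.api as sm
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(1)
n = 300
segments = np.column_stack([
    rng.uniform(40, 80, n),      # skid resistance
    rng.uniform(0.3, 1.2, n),    # texture depth (mm)
    rng.uniform(50, 1500, n),    # curve radius (m)
    rng.uniform(50, 110, n),     # operating speed (km/h)
])
length_km = rng.uniform(0.2, 2.0, n)
accidents = rng.poisson(0.3 * length_km)          # fake accident counts

# Step 1: compound road environments via k-means on standardized segment descriptors
env = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(
    StandardScaler().fit_transform(segments))

# Step 2: Poisson GLM inside one environment, with log(segment length) as offset
mask = env == 0
X = sm.add_constant(segments[mask][:, :2])        # skid resistance, texture depth
model = sm.GLM(accidents[mask], X,
               family=sm.families.Poisson(),
               offset=np.log(length_km[mask])).fit()
print(model.summary())
```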

  13. Analysis of Computer-Mediated Communication: Using Formal Concept Analysis as a Visualizing Methodology.

    ERIC Educational Resources Information Center

    Hara, Noriko

    2002-01-01

    Introduces the use of Formal Concept Analysis (FCA) as a methodology to visualize the data in computer-mediated communication. Bases FCA on a mathematical lattice theory and offers visual maps (graphs) with conceptual hierarchies, and proposes use of FCA combined with content analysis to analyze computer-mediated communication. (Author/LRW)

  14. Defense In-Depth Accident Analysis Evaluation of Tritium Facility Bldgs. 232-H, 233-H, and 234-H

    SciTech Connect

    Blanchard, A.

    1999-05-10

    The primary purpose of this report is to document a Defense-in-Depth (DID) accident analysis evaluation for Department of Energy (DOE) Savannah River Site (SRS) Tritium Facility Buildings 232-H, 233-H, and 234-H. The purpose of a DID evaluation is to provide a more realistic view of facility radiological risks to the offsite public than the bounding deterministic analysis documented in the Safety Analysis Report, which credits only Safety Class items in the offsite dose evaluation.

  15. A Risk Analysis Methodology to Address Human and Organizational Factors in Offshore Drilling Safety: With an Emphasis on Negative Pressure Test

    NASA Astrophysics Data System (ADS)

    Tabibzadeh, Maryam

    According to the final Presidential National Commission report on the BP Deepwater Horizon (DWH) blowout, there is a need to "integrate more sophisticated risk assessment and risk management practices" in the oil industry. A review of the offshore drilling literature indicates that most of the developed risk analysis methodologies do not fully and, more importantly, systematically address the contribution of Human and Organizational Factors (HOFs) in accident causation. This is despite the fact that results of a comprehensive study, from 1988 to 2005, of more than 600 well-documented major failures in offshore structures show that approximately 80% of those failures were due to HOFs. In addition, lack of safety culture, as an issue related to HOFs, has been identified as a common contributing cause of many accidents in this industry. This dissertation introduces an integrated risk analysis methodology to systematically assess the critical role of human and organizational factors in offshore drilling safety. The proposed methodology focuses on a specific procedure called the Negative Pressure Test (NPT), the primary method used to ascertain well integrity during offshore drilling, and analyzes the contributing causes of misinterpreting such a critical test. In addition, the case study of the BP Deepwater Horizon accident and the NPT conducted by its crew is discussed. The risk analysis methodology in this dissertation consists of three different approaches whose integration constitutes the overall methodology. The first approach is a comparative analysis of a "standard" NPT, proposed by the author, with the test conducted by the DWH crew. This analysis helps identify the discrepancies between the two test procedures. The second approach is a conceptual risk assessment framework to analyze the causal factors of the identified mismatches in the previous step, as the main contributors of negative pressure test

  17. [Analysis of accidents for magnetically induced displacement of the large ferromagnetic material in magnetic resonance systems].

    PubMed

    Yamatani, Yuya; Doi, Tsukasa; Ueyama, Tsuyoshi; Nishiki, Shigeo; Ogura, Akio; Kawamitsu, Hideaki; Tsuchihashi, Toshio; Okuaki, Tomoyuki; Matsuda, Tsuyoshi

    2013-01-01

    To improve magnetic resonance (MR) safety, we surveyed accidents caused by large ferromagnetic materials accidentally brought into MR systems. We sent a questionnaire to 700 Japanese medical institutions and received 405 valid responses (58%). A total of 97 accidents in 77 institutions were observed, and we analyzed them with regard to incidence rate, the detailed circumstances, and environmental factors. The mean accident rate of each institute was 0.7/100,000 examinations, which was widely distributed (0-25.6/100,000) depending on the institute. In this survey, relatively small institutes with fewer than 500 beds tended to have these accidents more frequently (p<0.01). The institutes in which daily MR examination counts are more than 10 patients have fewer accidents than those with fewer than 10 daily examinations. The institutes with 6-10 MR examinations daily have significantly more accidents than those with more than 10 daily MR examinations (p<0.01). The main mental factors behind the accidents were considered to be "prejudice" and "carelessness", although some respondents cited "ignorance." Though we could not find a significant reduction in accidents at institutes that hold lectures and training for MR safety, such lectures and training should be continued to reduce accidents due to "ignorance."

  18. Modeling methodology for supply chain synthesis and disruption analysis

    NASA Astrophysics Data System (ADS)

    Wu, Teresa; Blackhurst, Jennifer

    2004-11-01

    The concept of an integrated or synthesized supply chain is a strategy for managing today's globalized and customer-driven supply chains in order to better meet customer demands. Synthesizing individual entities into an integrated supply chain can be a challenging task due to a variety of factors including conflicting objectives, mismatched incentives and constraints of the individual entities. Furthermore, understanding the effects of disruptions occurring at any point in the system is difficult when working toward synthesizing supply chain operations. Therefore, the goal of this research is to present a modeling methodology to manage the synthesis of a supply chain by linking hierarchical levels of the system and to model and analyze disruptions in the integrated supply chain. The contribution of this research is threefold: (1) supply chain systems can be modeled hierarchically; (2) the performance of a synthesized supply chain system can be evaluated quantitatively; and (3) reachability analysis is used to evaluate the system performance and verify whether a specific state is reachable, allowing the user to understand the extent of effects of a disruption.
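
    Reachability analysis of the kind described can be illustrated on a toy discrete state-transition model of a disrupted supply chain: a breadth-first search checks whether an undesirable state can be reached from the initial state and returns a path if so. The states and transitions below are invented for illustration, not the paper's hierarchical model.

```python
# Sketch of reachability analysis on a discrete state-transition model of a supply
# chain: check whether a given (possibly undesirable) system state can be reached
# from the initial state after a disruption. States and transitions are invented.
from collections import deque

# Each state is a label; transitions[state] lists states reachable in one step.
transitions = {
    "normal":          ["supplier_down", "normal"],
    "supplier_down":   ["buffer_draining", "normal"],       # recovery is possible
    "buffer_draining": ["production_halt", "supplier_down"],
    "production_halt": ["backorder", "buffer_draining"],
    "backorder":       ["backorder"],
}

def reachable(start, target, transitions):
    """Breadth-first search over the state graph; returns a path if target is reachable."""
    queue, seen = deque([[start]]), {start}
    while queue:
        path = queue.popleft()
        if path[-1] == target:
            return path
        for nxt in transitions.get(path[-1], []):
            if nxt not in seen:
                seen.add(nxt)
                queue.append(path + [nxt])
    return None

print(reachable("normal", "production_halt", transitions))
```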

  19. Emergency drinking water treatment during source water pollution accidents in China: origin analysis, framework and technologies.

    PubMed

    Zhang, Xiao-Jian; Chen, Chao; Lin, Peng-Fei; Hou, Ai-Xin; Niu, Zhang-Bin; Wang, Jun

    2011-01-01

    China has suffered frequent source water contamination accidents in the past decade, which have resulted in severe consequences for the water supply of millions of residents. The origins of typical cases of contamination are discussed in this paper as well as the emergency response to these accidents. In general, excessive pursuit of rapid industrialization and the unreasonable location of factories are responsible for the increasing frequency of accidental pollution events. Moreover, insufficient attention to environmental protection and rudimentary emergency response capability have exacerbated the consequences of such accidents. These environmental accidents triggered or accelerated the promulgation of stricter environmental protection policy and the shift of the economic development mode in a more sustainable direction, which should be regarded as a turning point for environmental protection in China. To guarantee water security, China is trying to establish a rapid and effective emergency response framework, build up the capability of early accident detection, and develop efficient technologies to remove contaminants from water.

  20. Vehicle-mounted mine detection: test methodology, application, and analysis

    NASA Astrophysics Data System (ADS)

    Hanshaw, Terilee

    1998-09-01

    The Mine/Minefield detection community's maturing technology base has become a developmental resource for world wide military and humanitarian applications. During the last decade, this community has developed a variety of single and multi-sensor applications incorporating a diversity of sensor and processor technologies. These diverse developments from the Mine/Minefield detection community require appropriate metrics to objectively bound technology and to define applicability to expected military and humanitarian applications. This paper presents a survey of the test methodology, application and analysis activities conducted by the U.S. Army Communications and Electronics Command's, Night Vision and Electronic Sensors Directorate (NVESD) on behalf of the Mine/Minefield detection community. As needs of world wide military and humanitarian mine detection activities are being responded to by notable technology base advances, a diverse pool of knowledge has been developed. The maturity of these technology base advances must be evaluated in a more systematic method. As these technologies mature, metrics have been developed to support the development process and to define the applicability of these technology base advances. The author will review the diversity of the mine detection technology and their related testing strategies. Consideration is given to the impact of history and global realism on the U.S. Army's present mine detection testing program. Further, definitions of testing metrics and analysis will be reviewed. Finally the paper will outline future U.S. Army testing plans with a special consideration given to the Vehicular Mounted Mine Detection/Ground Standoff Mine Detection System (VMMD/GSTAMIDS) Advanced Technology Demonstration and related issues.

  1. TRACE/PARCS Core Modeling of a BWR/5 for Accident Analysis of ATWS Events

    SciTech Connect

    Cuadra A.; Baek J.; Cheng, L.; Aronson, A.; Diamond, D.; Yarsky, P.

    2013-11-10

    The TRACE/PARCS computational package [1, 2] is designed to be applicable to the analysis of light water reactor operational transients and accidents where the coupling between the neutron kinetics (PARCS) and the thermal-hydraulics and thermal-mechanics (TRACE) is important. TRACE/PARCS has been assessed for its applicability to anticipated transients without scram (ATWS) [3]. The challenge, addressed in this study, is to develop a sufficiently rigorous input model that would be acceptable for use in ATWS analysis. Two types of ATWS events were of interest: a turbine trip and a closure of main steam isolation valves (MSIVs). In the first type, initiated by turbine trip, the concern is that the core will become unstable and large power oscillations will occur. In the second type, initiated by MSIV closure, the concern is the amount of energy being placed into containment and the resulting emergency depressurization. Two separate TRACE/PARCS models of a BWR/5 were developed to analyze these ATWS events at MELLLA+ (maximum extended load line limit plus) operating conditions. One model [4] was used for analysis of ATWS events leading to instability (ATWS-I); the other [5] for ATWS events leading to emergency depressurization (ATWS-ED). Both models included a large portion of the nuclear steam supply system and controls, and a detailed core model, presented henceforth.

  2. Analysis of fission product revaporization in a BWR Reactor Coolant System during a station blackout accident

    SciTech Connect

    Yang, J.W.; Schmidt, E.; Cazzoli, E.; Khatib-Rahbar, M.

    1988-01-01

    This paper presents an analysis of fission product revaporization from the Reactor Coolant System (RCS) following the Reactor Pressure Vessel (RPV) failure. The station blackout accident in a BWR Mark I Power Plant was considered. The TRAPMELT3 models for vaporization, chemisorption, and the decay heating of RCS structures and gases were used and extended beyond the RPV failure in the analysis. The RCS flow models based on the density-difference or pressure-difference between the RCS and containment pedestal region were developed to estimate the RCS outflow which carries the revaporized fission product to the containment. A computer code called REVAP was developed for the analysis. The REVAP code was incorporated with the MARCH, TRAPMELT3 and NAUA codes from the Source Term Code Package (STCP) to estimate the impact of revaporization on environmental release. The results show that the thermal-hydraulic conditions between the RCS and the pedestal region are important factors in determining the magnitude of revaporization and subsequent release of the volatile fission product into the environment. 6 refs., 8 figs.

  3. Consequence analysis of a hypothetical contained criticality accident in the Hanford Critical Mass Laboratory

    SciTech Connect

    Gore, B.F.; Strenge, D.L.; Mishima, J.

    1984-12-01

    The original hazards summary report (i.e., SAR) for the CML addressed the consequences of a hypothetical accidental critical excursion occurring with the experimental assembly room open. That report indicated that the public would receive insignificant radiation exposure regardless of the type of atmospheric condition, while plant personnel could possibly receive exposures greater than the annual exposure limits for radiation workers, when a strong inversion existed. This analysis investigates the consequences of a hypothetical criticality accident occurring with the experimental assembly room sealed. Due to the containment capabilities designed and built into the critical assembly room, the consequences are greatly reduced below those presented in HW-66266. Despite the incorporation of many extremely conservative assumptions to simplify the analysis, the radiation doses predicted for personnel 100 meters or more distant from the CML are found to be smaller than the annual radiation dose limit for members of the public in uncontrolled areas during routine, nonaccident operations. Therefore, the results of this analysis demonstrate that the occurrence of a hypothetical critical excursion within the sealed experimental assembly room at the Hanford Critical Mass Laboratory presents only a small, acceptable risk to personnel and facilities in the area and no additional safety systems or controls are needed for the continued safe operation of the CML. 11 references, 4 tables. (ACR)

  4. Differences in rural and urban driver-injury severities in accidents involving large-trucks: an exploratory analysis.

    PubMed

    Khorashadi, Ahmad; Niemeier, Debbie; Shankar, Venky; Mannering, Fred

    2005-09-01

    This study explores the differences between urban and rural driver injuries (both passenger-vehicle and large-truck driver injuries) in accidents that involve large trucks (in excess of 10,000 pounds). Using 4 years of California accident data, and considering four driver-injury severity categories (no injury, complaint of pain, visible injury, and severe/fatal injury), a multinomial logit analysis of the data was conducted. Significant differences with respect to various risk factors including driver, vehicle, environmental, road geometry and traffic characteristics were found to exist between urban and rural models. For example, in rural accidents involving tractor-trailer combinations, the probability of drivers' injuries being severe/fatal increased about 26% relative to accidents involving single-unit trucks. In urban areas, this same probability increased nearly 700%. In accidents where alcohol or drug use was identified as being the primary cause of the accident, the probability of severe/fatal injury increased roughly 250% in rural areas and nearly 800% in urban areas. While many of the same variables were found to be significant in both rural and urban models (although often with quite different impact), there were 13 variables that significantly influenced driver-injury severity in rural but not urban areas, and 17 variables that significantly influenced driver-injury severity in urban but not rural areas. We speculate that the significant differences between rural and urban injury severities may be at least partially attributable to the different perceptual, cognitive and response demands placed on drivers in rural versus urban areas.
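
    For readers unfamiliar with the modeling approach, the sketch below fits a multinomial logit model of a four-level injury-severity outcome on synthetic data. The predictor names, coefficients, and data are illustrative assumptions, not the study's variables or estimates.

        import numpy as np
        import pandas as pd
        import statsmodels.api as sm

        rng = np.random.default_rng(0)
        n = 500
        # Hypothetical binary predictors (names are illustrative)
        rural = rng.integers(0, 2, n)
        tractor_trailer = rng.integers(0, 2, n)
        alcohol_drug = rng.integers(0, 2, n)

        # Synthetic 4-level severity outcome: 0 no injury .. 3 severe/fatal
        lin = 0.4 * rural + 0.6 * tractor_trailer + 1.2 * alcohol_drug
        p_severe = 1.0 / (1.0 + np.exp(-(lin - 2.0)))
        severity = np.where(rng.random(n) < p_severe, 3, rng.integers(0, 3, n))

        X = sm.add_constant(pd.DataFrame({"rural": rural,
                                          "tractor_trailer": tractor_trailer,
                                          "alcohol_drug": alcohol_drug}))
        fit = sm.MNLogit(severity, X).fit(disp=False)
        print(fit.summary())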

  5. Analysis of Surface Water Pollution Accidents in China: Characteristics and Lessons for Risk Management

    NASA Astrophysics Data System (ADS)

    Yao, Hong; Zhang, Tongzhu; Liu, Bo; Lu, Feng; Fang, Shurong; You, Zhen

    2016-04-01

    Understanding historical accidents is important for accident prevention and risk mitigation; however, there are no public databases of pollution accidents in China, and no detailed information regarding such incidents is readily available. Thus, 653 representative cases of surface water pollution accidents in China were identified and described as a function of time, location, materials involved, origin, and causes. The severity and other features of the accidents, frequency and quantities of chemicals involved, frequency and number of people poisoned, frequency and number of people affected, frequency and time for which pollution lasted, and frequency and length of pollution zone were effectively used to value and estimate the accumulated probabilities. The probabilities of occurrences of various types based on origin and causes were also summarized based on these observations. The following conclusions can be drawn from these analyses: (1) There was a high proportion of accidents involving multi-district boundary regions and drinking water crises, indicating that more attention should be paid to environmental risk prevention and the mitigation of such incidents. (2) A high proportion of accidents originated from small-sized chemical plants, indicating that these types of enterprises should be considered during policy making. (3) The most common cause (49.8 % of the total) was intentional acts (illegal discharge); accordingly, efforts to increase environmental consciousness in China should be enhanced.
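
    The accumulated probabilities mentioned above are, in effect, empirical cumulative distributions over the case set. A minimal sketch, using hypothetical pollution-zone lengths rather than the paper's data:

        # Hypothetical pollution-zone lengths (km) from a set of accident cases
        lengths_km = [2, 5, 8, 12, 20, 35, 50, 80, 120, 200]

        # Empirical cumulative probability: fraction of cases at or below each length
        order = sorted(lengths_km)
        n = len(order)
        for i, x in enumerate(order, start=1):
            print(f"P(length <= {x:>3d} km) ~ {i / n:.2f}")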

  6. Analysis of Surface Water Pollution Accidents in China: Characteristics and Lessons for Risk Management.

    PubMed

    Yao, Hong; Zhang, Tongzhu; Liu, Bo; Lu, Feng; Fang, Shurong; You, Zhen

    2016-04-01

    Understanding historical accidents is important for accident prevention and risk mitigation; however, there are no public databases of pollution accidents in China, and no detailed information regarding such incidents is readily available. Thus, 653 representative cases of surface water pollution accidents in China were identified and described as a function of time, location, materials involved, origin, and causes. The severity and other features of the accidents, frequency and quantities of chemicals involved, frequency and number of people poisoned, frequency and number of people affected, frequency and time for which pollution lasted, and frequency and length of pollution zone were effectively used to value and estimate the accumulated probabilities. The probabilities of occurrences of various types based on origin and causes were also summarized based on these observations. The following conclusions can be drawn from these analyses: (1) There was a high proportion of accidents involving multi-district boundary regions and drinking water crises, indicating that more attention should be paid to environmental risk prevention and the mitigation of such incidents. (2) A high proportion of accidents originated from small-sized chemical plants, indicating that these types of enterprises should be considered during policy making. (3) The most common cause (49.8% of the total) was intentional acts (illegal discharge); accordingly, efforts to increase environmental consciousness in China should be enhanced.

  7. The role of mitochondrial proteomic analysis in radiological accidents and terrorism.

    PubMed

    Maguire, David; Zhang, Bingrong; Zhang, Amy; Zhang, Lurong; Okunieff, Paul

    2013-01-01

    In the wake of the 9/11 terrorist attacks and the recent Level 7 nuclear event at the Fukushima Daiichi plant, there has been heightened awareness of the possibility of radiological terrorism and accidents and the need for techniques to estimate radiation levels after such events. A number of approaches to monitoring radiation using biological markers have been published, including physical techniques, cytogenetic approaches, and direct, DNA-analysis approaches. Each approach has the potential to provide information that may be applied to the triage of an exposed population, but problems with development and application of devices or lengthy analyses limit their potential for widespread application. We present a post-irradiation observation with the potential for development into a rapid point-of-care device. Using simple mitochondrial proteomic analysis, we investigated irradiated and nonirradiated murine mitochondria and identified a protein mobility shift occurring at 2-3 Gy. We discuss the implications of this finding both in terms of possible mechanisms and potential applications in bio-radiation monitoring. PMID:22879026

  8. Overview of the Aerothermodynamics Analysis Conducted in Support of the STS-107 Accident Investigation

    NASA Technical Reports Server (NTRS)

    Campbell, Charles H.

    2004-01-01

    A graphic presentation of the aerothermodynamics analysis conducted in support of the STS-107 accident investigation. Investigation efforts were conducted as part of an integrated AATS team (Aero, Aerothermal, Thermal, Stress) directed by OVEWG. Graphics presented are: STS-107 Entry trajectory and timeline (1st off-nominal event to Post-LOS); Indications from OI telemetry data; Aero/aerothermo/thermal analysis process; Selected STS-107 side fuselage/OMS pod off-nominal temperatures; Leading edge structural subsystem; Relevant forensics evidence; External aerothermal environments; STS-107 Pre-entry EOM3 heating profile; Surface heating and temperatures; Orbiter wing leading edge damage survey; Internal aerothermal environments; Orbiter wing CAD model; Aerodynamic flight reconstruction; Chronology of aerodynamic/aerothermodynamic contributions; Acreage TPS tile damage; Larger OML perturbations; Missing RCC panel(s); Localized damage to RCC panel/missing T-seal; RCC breach with flow ingestion; and Aero-aerothermal closure. NAIT served as the interface between the CAIB and NASA investigation teams; and CAIB requests for study were addressed.

  9. Homicide or accident off the coast of Florida: trauma analysis of mutilated human remains.

    PubMed

    Stubblefield, P R

    1999-07-01

    In the many years Dr. William R. Maples served as a forensic anthropologist, he saw diverse sources of trauma presented in the victims of violent crime, accident and suicide in the state of Florida. In 1996 the District 18 Medical Examiner's Office of Florida requested the assistance of Dr. Maples in the analysis of human remains recovered by the U.S. Coast Guard. The deceased was in an advanced state of decomposition characterized by skin slippage and discoloration. The torso bore multiple lacerations, including nearly parallel lacerations in the skin of the back. Specimens were carefully macerated and the fractures reconstructed. The skeletal trauma was caused by a device capable of delivering robust cuts and blunt trauma in linear paths, as is consistent with propeller trauma. Unusual in this case were blows to the ventral and dorsal surfaces of the body. Based on the anthropological analysis and interviews with the family of the deceased, the F.B.I. proceeded with the case as a homicide investigation.

  10. Development of NASA's Accident Precursor Analysis Process Through Application on the Space Shuttle Orbiter

    NASA Technical Reports Server (NTRS)

    Maggio, Gaspare; Groen, Frank; Hamlin, Teri; Youngblood, Robert

    2010-01-01

    Accident Precursor Analysis (APA) serves as the bridge between existing risk modeling activities, which are often based on historical or generic failure statistics, and system anomalies, which provide crucial information about the failure mechanisms that are actually operative in the system. APA does more than simply track experience: it systematically evaluates experience, looking for under-appreciated risks that may warrant changes to design or operational practice. This paper presents the pilot application of the NASA APA process to Space Shuttle Orbiter systems. In this effort, the working sessions conducted at Johnson Space Center (JSC) piloted the APA process developed by Information Systems Laboratories (ISL) over the last two years under the auspices of NASA's Office of Safety & Mission Assurance, with the assistance of the Safety & Mission Assurance (S&MA) Shuttle & Exploration Analysis Branch. This process is built around facilitated working sessions involving diverse system experts. One important aspect of this particular APA process is its focus on understanding the physical mechanism responsible for an operational anomaly, followed by evaluation of the risk significance of the observed anomaly as well as consideration of generalizations of the underlying mechanism to other contexts. Model completeness will probably always be an issue, but this process tries to leverage operating experience to the extent possible in order to address completeness issues before a catastrophe occurs.

  11. The role of mitochondrial proteomic analysis in radiological accidents and terrorism.

    PubMed

    Maguire, David; Zhang, Bingrong; Zhang, Amy; Zhang, Lurong; Okunieff, Paul

    2013-01-01

    In the wake of the 9/11 terrorist attacks and the recent Level 7 nuclear event at the Fukushima Daiichi plant, there has been heightened awareness of the possibility of radiological terrorism and accidents and the need for techniques to estimate radiation levels after such events. A number of approaches to monitoring radiation using biological markers have been published, including physical techniques, cytogenetic approaches, and direct, DNA-analysis approaches. Each approach has the potential to provide information that may be applied to the triage of an exposed population, but problems with development and application of devices or lengthy analyses limit their potential for widespread application. We present a post-irradiation observation with the potential for development into a rapid point-of-care device. Using simple mitochondrial proteomic analysis, we investigated irradiated and nonirradiated murine mitochondria and identified a protein mobility shift occurring at 2-3 Gy. We discuss the implications of this finding both in terms of possible mechanisms and potential applications in bio-radiation monitoring.

  12. The effect of gamma-ray transport on afterheat calculations for accident analysis

    SciTech Connect

    Reyes, S.; Latkowski, J.F.; Sanz, J.

    2000-05-01

    Radioactive afterheat is an important source term for the release of radionuclides in fusion systems under accident conditions. Heat transfer calculations are used to determine time-temperature histories in regions of interest, but the true source term needs to be the effective afterheat, which considers the transport of penetrating gamma rays. Without consideration of photon transport, accident temperatures may be overestimated in some regions and underestimated in others. The importance of this effect is demonstrated for a simple, one-dimensional problem. The significance of this effect depends strongly on the accident scenario being analyzed.

  13. Transient analysis for thermal margin with COASISO during a severe accident

    SciTech Connect

    Kim, Chan S.; Chu, Ho S.; Suh, Kune Y.; Park, Goon C.; Lee, Un C.; Yoon, Ho J.

    2002-07-01

    As an IVR-EVC (in-vessel retention through external vessel cooling) design concept, external cooling of the reactor vessel was suggested to protect the lower head from being overheated due to relocated material from the core during a severe accident. The COASISO (Corium Attack Syndrome Immunization Structure Outside the vessel) adopts an external vessel cooling strategy of flooding the reactor vessel inside the thermal insulator. Its advantage is the quick response time so that the initial heat removal mechanism of the EVC is nucleate boiling from the downward-facing lower head. The efficiency of the COASISO may be estimated by the thermal margin defined as the ratio of the actual heat flux from the reactor vessel to the critical heat flux (CHF). In this study the thermal margin for a large power reactor such as the APR1400 (Advanced Power Reactor 1400 MWe) was determined by means of transient analysis for the local condition of the coolant and temperature distributions within the reactor vessel. The heat split fraction in the oxide pool and the metal layer focusing effect were considered during calculation of the angular thermal load at the inner wall of the lower head. The temperature distributions in the reactor vessel resulted in the actual heat flux on the outer wall. The local quality was obtained by solving the simplified transient energy equation. The unheated section of the reactor vessel decreases the thermal margin by means of two-dimensional conduction heat transfer. The peak temperature of the reactor vessel was estimated in the film boiling region as the thermal margin was equal to unity. Sensitivity analyses were performed for the time of corium relocation after the reactor trip, the coolant flow rate, and the initial subcooled condition of the coolant. There is no vessel failure predicted at the worst EVC condition when stratification between the metal layer and the oxide pool is not taken into account. The present predictive tool may be
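
    The thermal-margin check described above reduces to comparing the local heat flux on the outer lower-head wall against the critical heat flux. A minimal sketch with hypothetical values (not the COASISO correlations or APR1400 data):

        def thermal_margin(q_local_wm2, q_chf_wm2):
            """Thermal margin as defined above: local heat flux on the outer
            lower-head wall divided by the critical heat flux (CHF). Values at
            or above 1.0 indicate transition toward film boiling."""
            return q_local_wm2 / q_chf_wm2

        # Hypothetical angular distribution on the lower head (illustrative numbers)
        angles_deg  = [0, 30, 60, 90]            # 0 = bottom center, 90 = equator
        q_local_wm2 = [2.0e5, 4.5e5, 9.0e5, 1.3e6]
        q_chf_wm2   = [5.0e5, 8.0e5, 1.2e6, 1.5e6]

        for a, ql, qc in zip(angles_deg, q_local_wm2, q_chf_wm2):
            m = thermal_margin(ql, qc)
            flag = "film boiling risk" if m >= 1.0 else "nucleate boiling"
            print(f"{a:3d} deg: margin = {m:.2f} ({flag})")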

  14. The Role of Materials Degradation and Analysis in the Space Shuttle Columbia Accident Investigation

    NASA Technical Reports Server (NTRS)

    McDanels, Steven J.

    2006-01-01

    The efforts following the loss of the Space Shuttle Columbia included debris recovery, reconstruction, and analysis. The debris was subjected to myriad quantitative and semiquantitative chemical analysis techniques, ranging from examination via the scanning electron microscope (SEM) with energy dispersive spectrometer (EDS) to X-Ray diffraction (XRD) and electron probe micro-analysis (EPMA). The results from the work with the debris helped the investigators determine the location where a breach likely occurred in the leading edge of the left wing during lift off of the Orbiter from the Kennedy Space Center. Likewise, the information evidenced by the debris was also crucial in ascertaining the path of impinging plasma flow once it had breached the wing. After the Columbia Accident Investigation Board (CAIB) issued its findings, the major portion of the investigation was concluded. However, additional work remained to be done on many pieces of debris from portions of the Orbiter which were not directly related to the initial impact during ascent. This subsequent work was not only performed in the laboratory, but was also performed with portable equipment, including examination via portable X-Ray fluorescence (XRF) and Fourier transform infrared spectroscopy (FTIR). Likewise, acetate and silicon-rubber replicas of various fracture surfaces were obtained for later macroscopic and fractographic examination. This paper will detail the efforts and findings from the initial investigation, as well as present results obtained by the later examination and analysis of debris from the Orbiter including its windows, bulkhead structures, and other components which had not been examined during the primary investigation.

  15. Lower head creep rupture failure analysis associated with alternative accident sequences of the Three Mile Island Unit 2

    SciTech Connect

    Sang Lung, Chan

    2004-07-01

    The objective of this lower head creep rupture analysis is to assess the current version of MELCOR 1.8.5-RG against SCDAP/RELAP5 MOD 3.3kz. The purpose of this assessment is to investigate the current MELCOR in-vessel core damage progression phenomena including the model for the formation of a molten pool. The model for stratified molten pool natural heat transfer will be included in the next MELCOR release. Presently, MELCOR excludes the gap heat-transfer model for the cooling associated with the narrow gap between the debris and the lower head vessel wall. All these phenomenological models are already treated in SCDAP/RELAP5 using the COUPLE code to model the heat transfer of the relocated debris with the lower head based on a two-dimensional finite-element-method. The assessment should determine if current MELCOR capabilities adequately cover core degradation phenomena appropriate for the consolidated MELCOR code. Inclusion of these features should bring MELCOR much closer to a state of parity with SCDAP/RELAP5 and is a currently underway element in the MELCOR code consolidation effort. This assessment deals with the following analysis of the Three Mile Island Unit 2 (TMI-2) alternative accident sequences. The TMI-2 alternative accident sequence-1 includes the continuation of the base case of the TMI-2 accident with the Reactor Coolant Pumps (RCP) tripped, and the High Pressure Injection System (HPIS) throttled after approximately 6000 s accident time, while in the TMI-2 alternative accident sequence-2, the reactor coolant pumps are tripped after 6000 s and the HPIS is activated after 12,012 s. The lower head temperature distributions calculated with SCDAP/RELAP5 are visualized and animated with open source visualization freeware 'OpenDX'. (author)

  16. Traffic Analysis and Road Accidents: A Case Study of Hyderabad using GIS

    NASA Astrophysics Data System (ADS)

    Bhagyaiah, M.; Shrinagesh, B.

    2014-06-01

    Globalization has impacted many developing countries across the world. India is one such country, which benefited the most. Increased economic activity raised the consumption levels of the people across the country. This created scope for an increase in travel and transportation. The increase in vehicles over the last 10 years has put a lot of pressure on the existing roads, ultimately resulting in road accidents. It is estimated that since 2001 there has been an increase of 202 percent in two-wheeler and 286 percent in four-wheeler vehicles with no road expansion. Motor vehicle crashes are a common cause of death, disability and demand for emergency medical care. Globally, more than 1 million people die each year from traffic crashes and about 20-50 million are injured or permanently disabled. There has been an increasing trend in road accidents in Hyderabad over the past few years. GIS helps in locating the accident hotspots and also in analyzing the trend of road accidents in Hyderabad.

  17. Methodologies for analysis of patterning in the mouse RPE sheet

    PubMed Central

    Boatright, Jeffrey H.; Dalal, Nupur; Chrenek, Micah A.; Gardner, Christopher; Ziesel, Alison; Jiang, Yi; Grossniklaus, Hans E.

    2015-01-01

    Purpose Our goal was to optimize procedures for assessing shapes, sizes, and other quantitative metrics of retinal pigment epithelium (RPE) cells and contact- and noncontact-mediated cell-to-cell interactions across a large series of flatmount RPE images. Methods The two principal methodological advances of this study were optimization of a mouse RPE flatmount preparation and refinement of open-access software to rapidly analyze large numbers of flatmount images. Mouse eyes were harvested, and extra-orbital fat and muscles were removed. Eyes were fixed for 10 min, and dissected by puncturing the cornea with a sharp needle or a stab knife. Four radial cuts were made with iridectomy scissors from the puncture to near the optic nerve head. The lens, iris, and the neural retina were removed, leaving the RPE sheet exposed. The dissection and outcomes were monitored and evaluated by video recording. The RPE sheet was imaged under fluorescence confocal microscopy after staining for ZO-1 to identify RPE cell boundaries. Photoshop, Java, Perl, and Matlab scripts, as well as CellProfiler, were used to quantify selected parameters. Data were exported into Excel spreadsheets for further analysis. Results A simplified dissection procedure afforded a consistent source of images that could be processed by computer. The dissection and flatmounting techniques were illustrated in a video recording. Almost all of the sheet could be routinely imaged, and substantial fractions of the RPE sheet (usually 20–50% of the sheet) could be analyzed. Several common technical problems were noted and workarounds developed. The software-based analysis merged 25 to 36 images into one and adjusted settings to record an image suitable for large-scale identification of cell-to-cell boundaries, and then obtained quantitative descriptors of the shape of each cell, its neighbors, and interactions beyond direct cell–cell contact in the sheet. To validate the software, human- and computer
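
    As an illustration of the boundary-segmentation and shape-descriptor step, the sketch below labels cell interiors on a synthetic boundary image with scikit-image and tabulates per-cell metrics. It is a stand-in under stated assumptions, not a reproduction of the authors' CellProfiler/Matlab pipeline.

        import numpy as np
        from skimage import filters, measure, morphology

        # Synthetic stand-in for a ZO-1-stained flatmount: bright borders on a grid
        img = np.zeros((200, 200), dtype=float)
        img[::20, :] = 1.0   # horizontal cell borders
        img[:, ::20] = 1.0   # vertical cell borders
        img += 0.05 * np.random.default_rng(0).random(img.shape)  # mild noise

        # Cell interiors are the darker regions enclosed by the bright borders
        mask = img < filters.threshold_otsu(img)
        mask = morphology.remove_small_objects(mask, min_size=20)
        labels = measure.label(mask, connectivity=1)

        # Per-cell shape descriptors, analogous to the metrics exported in the study
        props = measure.regionprops_table(
            labels, properties=("label", "area", "perimeter", "eccentricity"))
        print({k: v[:5] for k, v in props.items()})  # first few cells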

  18. Offsite radiological consequence analysis for the bounding tank failure due to excessive loads accident

    SciTech Connect

    OBERG, B.D.

    2003-03-20

    This document quantifies the offsite radiological consequence of the bounding tank failure due to excessive loads accident for comparison with the 25 rem Evaluation Guideline established in DOE-STD-3009, Appendix A. The bounding tank failure due to excessive loads accident is a single-shell tank failure due to excessive concentrated load. The calculated offsite dose of 0.045 rem, based on reasonably conservative input, does not challenge the Evaluation Guideline.

  19. Attack Methodology Analysis: Emerging Trends in Computer-Based Attack Methodologies and Their Applicability to Control System Networks

    SciTech Connect

    Bri Rolston

    2005-06-01

    Threat characterization is a key component in evaluating the threat faced by control systems. Without a thorough understanding of the threat faced by critical infrastructure networks, adequate resources cannot be allocated or directed effectively to the defense of these systems. Traditional methods of threat analysis focus on identifying the capabilities and motivations of a specific attacker, assessing the value the adversary would place on targeted systems, and deploying defenses according to the threat posed by the potential adversary. Too many effective exploits and tools exist and are easily accessible to anyone with access to an Internet connection, minimal technical skills, and a significantly reduced motivational threshold to be able to narrow the field of potential adversaries effectively. Understanding how hackers evaluate new IT security research and incorporate significant new ideas into their own tools provides a means of anticipating how IT systems are most likely to be attacked in the future. This research, Attack Methodology Analysis (AMA), could supply pertinent information on how to detect and stop new types of attacks. Since the exploit methodologies and attack vectors developed in the general Information Technology (IT) arena can be converted for use against control system environments, assessing areas in which cutting edge exploit development and remediation techniques are occurring can provide significant intelligence for control system network exploitation, defense, and a means of assessing threat without identifying specific capabilities of individual opponents. Attack Methodology Analysis begins with the study of what exploit technology and attack methodologies are being developed in the Information Technology (IT) security research community within the black and white hat community. Once a solid understanding of the cutting edge security research is established, emerging trends in attack methodology can be identified and the gap between

  20. Biomechanical analysis of occupant kinematics in rollover motor vehicle accidents: dynamic spit test.

    PubMed

    Sances, Anthony; Kumaresan, Srirangam; Clarke, Richard; Herbst, Brian; Meyer, Steve

    2005-01-01

    A better understanding of occupant kinematics in rollover accidents helps to advance biomechanical knowledge and to enhance the safety features of motor vehicles. While many rollover accident simulation studies have adopted the static approach to delineate the occupant kinematics in rollover accidents, very few studies have attempted the dynamic approach. The present work was designed to study the biomechanics of restrained occupants during rollover accidents using the steady-state dynamic spit test and to address the importance of keeping the lap belt fastened. Experimental tests were conducted using an anthropometric 50% Hybrid III dummy in a vehicle. The vehicle was rotated at 180 degrees/second and the dummy was restrained using a standard three-point restraint system. The lap belt of the dummy was fastened either by using the cinching latch plate or by locking the retractor. Three configurations of shoulder belt harness were simulated: shoulder belt loose on chest with cinch plate, shoulder belt under the left arm and shoulder belt behind the chest. In all tests, the dummy stayed within the confinement of the vehicle indicating that the securely fastened lap belt holds the dummy with dynamic movement of 3 1/2" to 4". The results show that occupant movement in rollover accidents is least affected by various shoulder harness positions with a securely fastened lap belt. The present study forms a first step in delineating the biomechanics of occupants in rollover accidents.

  1. Identification of Behavior Based Safety by Using Traffic Light Analysis to Reduce Accidents

    NASA Astrophysics Data System (ADS)

    Mansur, A.; Nasution, M. I.

    2016-01-01

    This work presents the safety assessment of a case study and describes an important area within field production in the oil and gas industry, namely behavior based safety (BBS). The company set up a rigorous BBS intervention program that was implemented and deployed continually. In this case, observers were asked to hold discussions with workers during observation and to put a set of predetermined questions to them about work behavior. Traffic Light Analysis (TLA), one tool of risk assessment, was used to determine the estimated score of the BBS questionnaire. Standardization of the TLA appraisal in this study is based on the Regulation of the Minister of Labor and Occupational Safety and Health No:PER.05/MEN/1996. The results show that some points score under 84%, falling in the yellow category, and should be corrected immediately by the company to prevent existing unsafe behavior of workers. The application of BBS is expected to increase safety performance at work over time and to be effective in reducing accidents.
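
    A minimal sketch of TLA-style banding of questionnaire scores is shown below. The cut-offs (<60% red, 60-84% yellow, >=85% green) are commonly cited for SMK3-type audits and are used here as an assumption; only the 84% yellow boundary is stated in the abstract.

        def tla_category(score_pct):
            """Traffic Light Analysis banding. The cut-offs (<60 red, 60-84
            yellow, >=85 green) are an assumption, not quoted from the paper."""
            if score_pct < 60:
                return "red"
            elif score_pct < 85:
                return "yellow"
            return "green"

        # Hypothetical per-item BBS achievement scores (percent)
        scores = {"PPE use": 92, "housekeeping": 78, "tool handling": 83, "permits": 95}
        for item, s in scores.items():
            print(f"{item:>14s}: {s:3d}%  -> {tla_category(s)}")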

  2. Sensitivity analysis of a ship accident at a deep-ocean site in the northwest Atlantic

    SciTech Connect

    Kaplan, M.F.

    1985-04-01

    This report presents the results of a sensitivity analysis for an HLW ship accident occurring in the Nares Abyssal Plain in the northwestern Atlantic. Waste form release rate, canister lifetime and sorption in the water column (partition coefficients) were varied. Also investigated were the relative importance of the dose from the food chain and from seaweed in the diet. Peak individual doses and integrated collective doses for populations were the units of comparison. In accordance with international guidelines on radiological protection, the comparisons of different options were carried out over "all time"; the study uses a million-year time frame. Partition coefficients have the most pronounced effect on collective dose of the parameters studied. Variations in partition coefficients affect the shape of the collective dose curve over the entire time frame. Peak individual doses decrease markedly when the value for the sorption of americium is increased, but show no increase when less sorption is assumed. Waste form release rates and canister lifetimes affect collective doses only in periods prior to 20,000 years. Hence, comparisons of these options need not be carried out beyond 20,000 years. Waste form release rates below 10^-3/yr (nominal value) affect individual doses in a linear manner, i.e., an order-of-magnitude reduction in release rate leads to an order-of-magnitude reduction in peak individual dose. Little reduction in peak individual doses is seen with canister lifetimes extended beyond the nominal 100 years. 32 refs., 14 figs., 16 tabs.

  3. Hospital Multifactor Productivity: A Presentation and Analysis of Two Methodologies

    PubMed Central

    Cylus, Jonathan D.; Dickensheets, Bridget A.

    2007-01-01

    In response to recent discussions regarding the ability of hospitals to achieve gains in productivity, we present two methodologies that attempt to measure multifactor productivity (MFP) in the hospital sector. We analyze each method and conclude that the inconsistencies in their outcomes make it difficult to estimate a precise level of MFP that hospitals have historically achieved. Our goal in developing two methodologies is to inform the debate surrounding the ability of hospitals to achieve gains in MFP, as well as to highlight some of the challenges that exist in measuring hospital MFP. PMID:18435223

  4. Routes to failure: analysis of 41 civil aviation accidents from the Republic of China using the human factors analysis and classification system.

    PubMed

    Li, Wen-Chin; Harris, Don; Yu, Chung-San

    2008-03-01

    The human factors analysis and classification system (HFACS) is based upon Reason's organizational model of human error. HFACS was developed as an analytical framework for the investigation of the role of human error in aviation accidents; however, there is little empirical work formally describing the relationship between the components in the model. This research analyses 41 civil aviation accidents occurring to aircraft registered in the Republic of China (ROC) between 1999 and 2006 using the HFACS framework. The results show statistically significant relationships between errors at the operational level and organizational inadequacies at both the immediately adjacent level (preconditions for unsafe acts) and higher levels in the organization (unsafe supervision and organizational influences). The pattern of the 'routes to failure' observed in the data from this analysis of civil aircraft accidents show great similarities to that observed in the analysis of military accidents. This research lends further support to Reason's model that suggests that active failures are promoted by latent conditions in the organization. Statistical relationships linking fallible decisions in upper management levels were found to directly affect supervisory practices, thereby creating the psychological preconditions for unsafe acts and hence indirectly impairing the performance of pilots, ultimately leading to accidents.

  5. Behavior of an heterogeneous annular FBR core during an unprotected loss of flow accident: Analysis of the primary phase with SAS-SFR

    SciTech Connect

    Massara, S.; Schmitt, D.; Bretault, A.; Lemasson, D.; Darmet, G.; Verwaerde, D.; Struwe, D.; Pfrang, W.; Ponomarev, A.

    2012-07-01

    In the framework of a substantial improvement on FBR core safety connected to the development of a new Gen IV reactor type, heterogeneous cores with innovative features have been carefully analyzed in France since 2009. At EDF R&D, the main goal is to understand whether a strong reduction of the Na-void worth - possibly attempting a negative value - allows a significant improvement of the core behavior during an unprotected loss of flow accident. Also, the physical behavior of such a core is of interest, before and beyond the (possible) onset of Na boiling. Hence, a cutting-edge heterogeneous design, featuring an annular shape, a Na-plena with a B4C plate and a stepwise modulation of fissile core heights, was developed at EDF by means of the SDDS methodology, with a total Na-void worth of -1 $. The behavior of such a core during the primary phase of a severe accident, initiated by an unprotected loss of flow, is analyzed by means of the SAS-SFR code. This study is carried out at KIT and EDF, in the framework of a scientific collaboration on innovative FBR severe accident analyses. The results show that the reduction of the Na-void worth is very effective, but is not sufficient alone to avoid Na boiling and, hence, to prevent the core from entering into the primary phase of a severe accident. Nevertheless, the grace time up to boiling onset is greatly enhanced in comparison to a more traditional homogeneous core design, and only an extremely low fraction of the fuel (<0.1%) enters into melting at the end of this phase. A sensitivity analysis shows that, due to the inherent neutronic characteristics of such a core, the gagging scheme plays a major role in the core behavior: indeed, an improved 4-zones gagging scheme, associated with an enhanced control rod drive line expansion feed-back effect, finally prevents the core from entering into sodium boiling. This major conclusion highlights both the progress already accomplished and the need for more detailed

  6. TRAC-BF1/MOD1: An advanced best-estimate computer program for BWR accident analysis: User's guide

    SciTech Connect

    Rettig, W.H.; Wade, N.L. )

    1992-06-01

    The TRAC-BWR code development program at the Idaho National Engineering Laboratory has developed versions of the Transient Reactor Analysis Code (TRAC) for the US Nuclear Regulatory Commission and the public. The TRAC-BF1/MOD1 version of the computer code provides a best-estimate analysis capability for analyzing the full range of postulated accidents in boiling water reactor (BWR) systems and related facilities. This version provides a consistent and unified analysis capability for analyzing all areas of a large- or small-break loss-of-coolant accident (LOCA), beginning with the blowdown phase and continuing through heatup, reflood with quenching, and, finally, the refill phase of the accident. Also provided is a basic capability for the analysis of operational transients up to and including anticipated transients without scram (ATWS). The TRAC-BF1/MOD1 version produces results consistent with previous versions. Assessment calculations using the two TRAC-BF1 versions show overall improvements in agreement with data and computation times as compared to earlier versions of the TRAC-BWR series of computer codes.

  7. TRAC-BF1/MOD1: An advanced best-estimate computer program for BWR accident analysis, Model description

    SciTech Connect

    Borkowski, J.A.; Wade, N.L.; Giles, M.M.; Rouhani, S.Z.; Shumway, R.W.; Singer, G.L.; Taylor, D.D.; Weaver, W.L. )

    1992-08-01

    The TRAC-BWR code development program at the Idaho National Engineering Laboratory has developed versions of the Transient Reactor Analysis Code (TRAC) for the US Nuclear Regulatory Commission and the public. The TRAC-BF1/MOD1 version of the computer code provides a best-estimate analysis capability for analyzing the full range of postulated accidents in boiling water reactor (BWR) systems and related facilities. This version provides a consistent and unified analysis capability for analyzing all areas of a large- or small-break loss-of-coolant accident (LOCA), beginning with the blowdown phase and continuing through heatup, reflood with quenching, and, finally, the refill phase of the accident. Also provided is a basic capability for the analysis of operational transients up to and including anticipated transients without scram (ATWS). The TRAC-BF1/MOD1 version produces results consistent with previous versions. Assessment calculations using the two TRAC-BF1 versions show overall improvements in agreement with data and computation times as compared to earlier versions of the TRAC-BWR series of computer codes.

  8. Advanced neutron source reactor conceptual safety analysis report, three-element-core design: Chapter 15, accident analysis

    SciTech Connect

    Chen, N.C.J.; Wendel, M.W.; Yoder, G.L.; Harrington, R.M.

    1996-02-01

    In order to utilize reduced enrichment fuel, the three-element-core design for the Advanced Neutron Source has been proposed. The proposed core configuration consists of inner, middle, and outer elements, with the middle element offset axially beneath the inner and outer elements, which are axially aligned. The three-element-core RELAP5 model assumes that the reactor hardware is changed only within the core region, so that the loop piping, heat exchangers, and pumps remain as assumed for the two-element-core configuration. To assess the impact of changes in the core region configuration and the thermal-hydraulic steady-state conditions, the safety analysis has been updated. This report gives the safety margins for the loss-of-off-site power and pressure-boundary fault accidents based on the RELAP5 results. All margins are greater for the three-element-core simulations than those calculated for the two-element core.

  9. Towards a Methodological Improvement of Narrative Inquiry: A Qualitative Analysis

    ERIC Educational Resources Information Center

    Abdallah, Mahmoud Mohammad Sayed

    2009-01-01

    The article suggests that though narrative inquiry as a research methodology entails free conversations and personal stories, it should not be totally free and fictional, as it has to conform to some recognized standards used for conducting educational research. Hence, a qualitative study conducted by Russ (1999) was explored as an exemplar…

  10. An Analysis of the Research Methodology of the Ramirez Study.

    ERIC Educational Resources Information Center

    Thomas, Wayne P.

    1992-01-01

    Analyzes the political, educational, and technical factors that strongly influenced the Ramirez study of bilingual programs. Enumerates strengths and weaknesses of the study's research methodology, along with implications for decision making in language-minority education. Summarizes defensible conclusions of the study that have not yet been…

  11. Modeling and analysis of the unprotected loss-of-flow accident in the Clinch River Breeder Reactor

    SciTech Connect

    Morris, E.E.; Dunn, F.E.; Simms, R.; Gruber, E.E.

    1985-01-01

    The influence of fission-gas-driven fuel compaction on the energetics resulting from a loss-of-flow accident was estimated with the aid of the SAS3D accident analysis code. The analysis was carried out as part of the Clinch River Breeder Reactor licensing process. The TREAT tests L6, L7, and R8 were analyzed to assist in the modeling of fuel motion and the effects of plenum fission-gas release on coolant and clad dynamics. Special, conservative modeling was introduced to evaluate the effect of fission-gas pressure on the motion of the upper fuel pin segment following disruption. For the nominal sodium-void worth, fission-gas-driven fuel compaction did not adversely affect the outcome of the transient. When uncertainties in the sodium-void worth were considered, however, it was found that if fuel compaction occurs, loss-of-flow driven transient overpower phenomenology could not be precluded.

  12. The epidemiology and cost analysis of patients presented to Emergency Department following traffic accidents

    PubMed Central

    Karadana, Gökçe Akgül; Aksu, Nalan Metin; Akkaş, Meltem; Akman, Canan; Üzümcügil, Akın; Özmen, M. Mahir

    2013-01-01

    Background Traffic accidents are ranked first as the cause of personal injury throughout the world. The high number of traffic accidents yielding injuries and fatalities makes them of great importance to Emergency Departments. Material/Methods Patients admitted to Hacettepe University Faculty of Medicine Adult Emergency Department due to traffic accidents were investigated epidemiologically. Differences between groups were evaluated by Kruskal-Wallis, Mann-Whitney, and Wilcoxon tests. A value of p<0.05 was accepted as statistically significant. Results We included 2003 patients over 16 years of age. The mean age was 39.6±16.1 and 55% were males. Admissions by ambulance and due to motor vehicle accidents were the most common. In 2004 the rate of traffic accidents (15.3%) was higher than in the other years, the most common month was May (10.8%), and the most common time period was 6 pm to 12 am (midnight). About half of the patients (51.5%) were admitted in the first 30 minutes. Life-threatening condition was present in 9.6% of the patients. Head trauma was the most common type of trauma, with the rate of 18.3%. Mortality rate was 81.8%. The average length of hospital stay was 403 minutes (6.7 hours) and the average cost per patient was 983±4364 TL. Conclusions Further studies are needed to compare the cost found in this study with the mean cost for Turkey. However, the most important step to reduce the direct and indirect costs due to traffic accidents is the prevention of these accidents. PMID:24316815
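
    The group comparisons reported above rely on standard nonparametric tests. A minimal sketch with synthetic data (the group names and values are illustrative, not the study's records):

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(1)
        # Hypothetical lengths of stay (minutes) for three admission groups
        ambulance = rng.exponential(400, 60)
        walk_in = rng.exponential(300, 60)
        transfer = rng.exponential(500, 60)

        # Kruskal-Wallis across the three groups
        h, p_kw = stats.kruskal(ambulance, walk_in, transfer)
        # Mann-Whitney U for a pairwise comparison
        u, p_mw = stats.mannwhitneyu(ambulance, walk_in)
        # Wilcoxon signed-rank for paired measurements (e.g., repeated pain scores)
        before, after = rng.normal(6, 1, 60), rng.normal(4, 1, 60)
        w, p_wx = stats.wilcoxon(before, after)

        print(f"Kruskal-Wallis p={p_kw:.3f}, Mann-Whitney p={p_mw:.3f}, Wilcoxon p={p_wx:.3f}")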

  13. Occupational accidents aboard merchant ships

    PubMed Central

    Hansen, H; Nielsen, D; Frydenberg, M

    2002-01-01

    Objectives: To investigate the frequency, circumstances, and causes of occupational accidents aboard merchant ships in international trade, and to identify risk factors for the occurrence of occupational accidents as well as dangerous working situations where possible preventive measures may be initiated. Methods: The study is a historical follow up on occupational accidents among crew aboard Danish merchant ships in the period 1993–7. Data were extracted from the Danish Maritime Authority and insurance data. Exact data on time at risk were available. Results: A total of 1993 accidents were identified during a total of 31 140 years at sea. Among these, 209 accidents resulted in permanent disability of 5% or more, and 27 were fatal. The mean risk of having an occupational accident was 6.4/100 years at sea and the risk of an accident causing a permanent disability of 5% or more was 0.67/100 years aboard. Relative risks for notified accidents and accidents causing permanent disability of 5% or more were calculated in a multivariate analysis including ship type, occupation, age, time on board, change of ship since last employment period, and nationality. Foreigners had a considerably lower recorded rate of accidents than Danish citizens. Age was a major risk factor for accidents causing permanent disability. Change of ship and the first period aboard a particular ship were identified as risk factors. Walking from one place to another aboard the ship caused serious accidents. The most serious accidents happened on deck. Conclusions: It was possible to clearly identify work situations and specific risk factors for accidents aboard merchant ships. Most accidents happened while performing daily routine duties. Preventive measures should focus on workplace instructions for all important functions aboard and also on the prevention of accidents caused by walking around aboard the ship. PMID:11850550
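
    The accident rates quoted above follow directly from event counts divided by exposure time. A minimal sketch reproducing that arithmetic with the figures given in the abstract:

        def rate_per_100_years(events, years_at_sea):
            """Incidence rate expressed per 100 years of exposure at sea."""
            return 100.0 * events / years_at_sea

        years = 31140.0
        print(rate_per_100_years(1993, years))  # ~6.4 notified accidents per 100 years
        print(rate_per_100_years(209, years))   # ~0.67 accidents with >=5% disability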

  14. Assessment of ISLOCA risk-methodology and application to a combustion engineering plant

    SciTech Connect

    Kelly, D.L.; Auflick, J.L.; Haney, L.N.

    1992-04-01

    Inter-system loss-of-coolant accidents (ISLOCAs) have been identified as important contributors to offsite risk for some nuclear power plants. A methodology has been developed for identifying and evaluating plant-specific hardware designs, human factors issues, and accident consequence factors relevant to the estimation of ISLOCA core damage frequency and risk. This report presents a detailed description of the application of this analysis methodology to a Combustion Engineering plant.

  15. Retrospection of Chernobyl nuclear accident for decision analysis concerning remedial actions in Ukraine

    SciTech Connect

    Georgievskiy, Vladimir

    2007-07-01

    This report considers the efficacy of decisions concerning remedial actions when off-site radiological monitoring in the early and (or) intermediate phases was absent or uninformative. There are examples of such situations in the former Soviet Union where many people have been exposed: releases of radioactive materials from 'Krasnoyarsk-26' into the Enisey River, releases of radioactive materials from 'Chelabinsk-65' (the Kishtim accident), nuclear tests at the Semipalatinsk Test Site, the Chernobyl nuclear accident, etc. If monitoring in the early and (or) intermediate phases is absent, decisions concerning remedial actions are usually developed on the basis of permanent monitoring. However, decisions of this kind may be essentially erroneous. For these cases it is proposed to reconstruct retrospectively the radiological data of the early and intermediate phases of a nuclear accident and to design decisions concerning remedial actions on the basis of both the retrospective data and the permanent monitoring data. In this report the problem is considered using the example of the Chernobyl accident in Ukraine, where off-site radiological monitoring in the early and intermediate phases was unsatisfactory. In particular, pasture-cow-milk monitoring had not been performed. All official decisions concerning dose estimations had been made on the basis of measurements of 137Cs in the body (40 measurements within 135 days and 55 measurements within 229 days after the Chernobyl accident). For the retrospection of radiological data of the Chernobyl accident a dynamic model has been developed. This model has a structure similar to that of the Pathway and Farmland models. Parameters of the developed model have been identified for the agricultural conditions of Russia and Ukraine. By means of this model, the dynamics of 20 radionuclides in pathways and the dynamics of doses have been estimated for the early, intermediate, and late phases of the Chernobyl accident. The main results are following
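
    Dynamic pathway models of the kind described are typically built from first-order compartment transfers combined with radioactive decay. The sketch below integrates a generic pasture-cow-milk chain with illustrative rate constants; it is an assumption-based toy, not the authors' model or parameter set.

        import numpy as np

        # Generic first-order pasture -> cow -> milk chain with radioactive decay.
        # All rate constants (per day) are illustrative assumptions.
        lam_decay = np.log(2) / 8.02      # physical decay, I-131-like half-life
        k_weather = np.log(2) / 14.0      # removal from pasture by weathering
        k_intake = 0.05                   # transfer pasture -> cow body burden
        k_milk = 0.30                     # transfer cow -> milk secretion

        dt, days = 0.1, 60
        pasture, cow, milk = 1.0, 0.0, 0.0    # normalized initial deposition
        for _ in range(int(days / dt)):
            d_pasture = -(lam_decay + k_weather + k_intake) * pasture
            d_cow = k_intake * pasture - (lam_decay + k_milk) * cow
            d_milk = k_milk * cow - lam_decay * milk
            pasture += d_pasture * dt
            cow += d_cow * dt
            milk += d_milk * dt
        print(f"Relative milk activity after {days} d: {milk:.4f}")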

  16. Testing and analysis of structural integrity of electrosleeved tubes under severe accident transients

    SciTech Connect

    Majumdar, S.

    1999-12-10

    The structural integrity of flawed steam generator tubing with Electrosleeves™ under simulated severe accident transients was analyzed by analytical models that used available material properties data and results from high-temperature tests conducted on Electrosleeved tubes. The Electrosleeve material is almost pure Ni and derives its strength and other useful properties from its nanocrystalline microstructure, which is stable at reactor operating temperatures. However, it undergoes rapid grain growth at the high temperatures expected during severe accidents, resulting in a loss of strength and a corresponding decrease in flow stress. The magnitude of this decrease depends on the time-temperature history during the accident. Failure tests were conducted at ANL and FTI on internally pressurized Electrosleeved tubes with 80% and 100% throughwall machined axial notches in the parent tubes that were subjected to simulated severe accident temperature transients. The test results, together with the analytical model, were used to estimate the unaged flow stress curve of the Electrosleeved material at high temperatures. Failure temperatures for Electrosleeved tubes with throughwall and part-throughwall axial cracks of various lengths in the parent tubes were calculated for a postulated severe accident transient.

  17. Accident analysis of large-scale technological disasters applied to an anaesthetic complication.

    PubMed

    Eagle, C J; Davies, J M; Reason, J

    1992-02-01

    The occurrence of serious accidents in complex industrial systems such as at Three Mile Island and Bhopal has prompted development of new models of causation and investigation of disasters. These analytical models have potential relevance in anaesthesia. We therefore applied one of the previously described systems to the investigation of an anaesthetic accident. The model chosen describes two kinds of failures, both of which must be sought. The first group, active failures, consists of mistakes made by practitioners in the provision of care. The second group, latent failures, represents flaws in the administrative and productive system. The model emphasizes the search for latent failures and shows that prevention of active failures alone is insufficient to avoid further accidents if latent failures persist unchanged. These key features and the utility of this model are illustrated by application to a case of aspiration of gastric contents. While four active failures were recognized, an equal number of latent failures also became apparent. The identification of both types of failures permitted the formulation of recommendations to avoid further occurrences. Thus this model of accident causation can provide a useful mechanism to investigate and possibly prevent anaesthetic accidents. PMID:1544192

  18. Analysis of 121 fatal passenger car-adult pedestrian accidents in China.

    PubMed

    Zhao, Hui; Yin, Zhiyong; Yang, Guangyu; Che, Xingping; Xie, Jingru; Huang, Wei; Wang, Zhengguo

    2014-10-01

    To study the characteristics of fatal vehicle-pedestrian accidents in China, a team was established and passenger car-pedestrian crash cases occurring between 2006 and 2011 in Beijing and Chongqing, China were collected. A total of 121 fatal passenger car-adult pedestrian collisions were sampled and analyzed. The pedestrian injuries were scored according to Abbreviated Injury Scale (AIS) and Injury Severity Score (ISS). The demographical distributions of fatal pedestrian accidents differed from other pedestrian accidents. Among the victims, no significant discrepancy in the distribution of ISS and AIS in head, thorax, abdomen, and extremities by pedestrian age was found, while pedestrian behaviors prior to the crashes may affect the ISS. The distributions of AIS in head, thorax, and abdomen among the fatalities did not show any association with impact speeds or vehicle types, whereas there was a strong relationship between the ISS and impact speeds. Whether pedestrians died in the accident field or not was not associated with the ISS or AIS. The present results may be useful for not only forensic experts but also vehicle safety researchers. More investigations regarding fatal pedestrian accidents need be conducted in great detail.
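
    For context, the ISS used above is derived from the AIS scores as the sum of squares of the highest AIS in each of the three most severely injured body regions (an AIS of 6 in any region sets ISS to 75 by convention). A minimal sketch with a hypothetical casualty:

        def injury_severity_score(region_ais):
            """ISS: sum of squares of the highest AIS in each of the three most
            severely injured of the six ISS body regions; AIS 6 anywhere -> 75."""
            worst = sorted(region_ais.values(), reverse=True)
            if worst and worst[0] == 6:
                return 75
            return sum(a * a for a in worst[:3])

        # Hypothetical pedestrian casualty: AIS by body region
        ais = {"head": 4, "thorax": 3, "abdomen": 2, "extremities": 3,
               "face": 1, "external": 1}
        print(injury_severity_score(ais))  # 4^2 + 3^2 + 3^2 = 34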

  19. WASTE-ACC: A computer model for analysis of waste management accidents

    SciTech Connect

    Nabelssi, B.K.; Folga, S.; Kohout, E.J.; Mueller, C.J.; Roglans-Ribas, J.

    1996-12-01

    In support of the U.S. Department of Energy's (DOE's) Waste Management Programmatic Environmental Impact Statement, Argonne National Laboratory has developed WASTE-ACC, a computational framework and integrated PC-based database system, to assess atmospheric releases from facility accidents. WASTE-ACC facilitates the many calculations for the accident analyses necessitated by the numerous combinations of waste types, waste management process technologies, facility locations, and site consolidation strategies in the waste management alternatives across the DOE complex. WASTE-ACC is a comprehensive tool that can effectively test future DOE waste management alternatives and assumptions. The computational framework can access several relational databases to calculate atmospheric releases. The databases contain throughput volumes, waste profiles, treatment process parameters, and accident data such as frequencies of initiators, conditional probabilities of subsequent events, and source term release parameters of the various waste forms under accident stresses. This report describes the computational framework and supporting databases used to conduct accident analyses and to develop source terms to assess potential health impacts that may affect on-site workers and off-site members of the public under various DOE waste management alternatives.
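
    A minimal sketch of the kind of calculation such a framework chains together is given below: a sequence frequency (initiating-event frequency times conditional probabilities) paired with a generic five-factor source term. The factor names follow common DOE practice and every value is hypothetical; this is an assumption-labeled illustration, not a description of WASTE-ACC internals.

        def sequence_frequency(initiator_freq, conditional_probs):
            """Sequence frequency (per year): initiating-event frequency times
            the product of conditional probabilities of subsequent events."""
            f = initiator_freq
            for p in conditional_probs:
                f *= p
            return f

        def respirable_source_term(mar, dr, arf, rf, lpf):
            """Generic five-factor source term (MAR x DR x ARF x RF x LPF);
            an assumed convention here, not WASTE-ACC's internal formulation."""
            return mar * dr * arf * rf * lpf

        # Hypothetical drum-fire sequence at a treatment facility (illustrative values)
        freq = sequence_frequency(1.0e-2, [0.1, 0.5])                    # per year
        st = respirable_source_term(mar=100.0, dr=0.1, arf=5e-4, rf=1.0, lpf=0.1)
        print(f"Sequence frequency: {freq:.1e}/yr, respirable release: {st:.2e}")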

  20. Full-Envelope Launch Abort System Performance Analysis Methodology

    NASA Technical Reports Server (NTRS)

    Aubuchon, Vanessa V.

    2014-01-01

    The implementation of a new dispersion methodology is described, which disperses abort initiation altitude or time along with all other Launch Abort System (LAS) parameters during Monte Carlo simulations. In contrast, the standard methodology assumes that an abort initiation condition is held constant (e.g., aborts initiated at altitude for Mach 1, altitude for maximum dynamic pressure, etc.) while dispersing other LAS parameters. The standard method results in large gaps in performance information due to the discrete nature of initiation conditions, while the full-envelope dispersion method provides a significantly more comprehensive assessment of LAS abort performance for the full launch vehicle ascent flight envelope and identifies performance "pinch-points" that may occur at flight conditions outside of those contained in the discrete set. The new method has significantly increased the fidelity of LAS abort simulations and confidence in the results.
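
    The methodological contrast can be illustrated with a toy Monte Carlo: the standard approach fixes the abort initiation time at a discrete condition, while the full-envelope approach disperses it with the other inputs and can expose pinch-points the discrete set misses. Everything below (the metric, distributions, and numbers) is an illustrative assumption, not an LAS model.

        import numpy as np

        rng = np.random.default_rng(2)

        def abort_metric(t_abort, thrust_factor, mass_factor):
            # Toy figure of merit (e.g., a separation-distance surrogate) with a
            # narrow "pinch point" near t = 80 s; purely illustrative.
            pinch = np.exp(-((t_abort - 80.0) / 5.0) ** 2)
            return thrust_factor / mass_factor - 0.7 - 0.6 * pinch

        n = 10_000
        thrust = rng.normal(1.0, 0.05, n)
        mass = rng.normal(1.0, 0.03, n)

        # Standard approach: abort time held at a discrete condition (say, 60 s)
        fixed = abort_metric(60.0, thrust, mass)
        # Full-envelope approach: abort initiation time dispersed with other inputs
        full = abort_metric(rng.uniform(0.0, 120.0, n), thrust, mass)

        print("P(metric < 0), abort fixed at 60 s  :", np.mean(fixed < 0))
        print("P(metric < 0), abort time dispersed :", np.mean(full < 0))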

  1. The effects of aircraft certification rules on general aviation accidents

    NASA Astrophysics Data System (ADS)

    Anderson, Carolina Lenz

    The purpose of this study was to analyze the frequency of general aviation airplane accidents and accident rates on the basis of aircraft certification to determine whether or not differences in aircraft certification rules had an influence on accidents. In addition, the narrative cause descriptions contained within the accident reports were analyzed to determine whether there were differences in the qualitative data for the different certification categories. The certification categories examined were: Federal Aviation Regulations Part 23, Civil Air Regulations 3, Light Sport Aircraft, and Experimental-Amateur Built. The accident causes examined were those classified as: Loss of Control, Controlled Flight into Terrain, Engine Failure, and Structural Failure. Airworthiness certification categories represent a wide diversity of government oversight. Part 23 rules have evolved from the initial set of simpler design standards and have progressed into a comprehensive and strict set of rules to address the safety issues of the more complex airplanes within the category. Experimental-Amateur Built airplanes have the least amount of government oversight and are the fastest growing segment. The Light Sport Aircraft category is a more recent certification category that utilizes consensus standards in the approval process. Civil Air Regulations 3 airplanes were designed and manufactured under simpler rules but modifying these airplanes has become lengthy and expensive. The study was conducted using a mixed methods methodology which involves both quantitative and qualitative elements. A Chi-Square test was used for a quantitative analysis of the accident frequency among aircraft certification categories. Accident rate analysis of the accidents among aircraft certification categories involved an ANCOVA test. The qualitative component involved the use of text mining techniques for the analysis of the narrative cause descriptions contained within the accident reports. The Chi
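
    A minimal sketch of a Chi-Square test of accident frequency across certification categories is given below; the contingency-table counts are hypothetical, not the study's NTSB data.

        import numpy as np
        from scipy.stats import chi2_contingency

        # Hypothetical contingency table (illustrative counts only):
        #   rows    = Part 23, CAR 3, LSA, E-AB
        #   columns = Loss of Control, CFIT, Engine Failure, Structural Failure
        counts = np.array([[120, 30,  60, 10],
                           [200, 45,  90, 15],
                           [ 40,  8,  25,  6],
                           [180, 20, 110, 25]])

        chi2, p, dof, expected = chi2_contingency(counts)
        print(f"chi2 = {chi2:.1f}, dof = {dof}, p = {p:.4f}")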

  2. A Common Methodology for Safety and Reliability Analysis for Space Reactor Missions

    SciTech Connect

    Frank, Michael V.

    2006-01-20

    The thesis of this paper is that the methodology of probabilistic risk management (PRM) has the capability to integrate both safety and reliability analyses for space nuclear missions. Practiced within a decision analysis framework, the concept of risk and the overall methodology of PRM are not dependent on whether the outcome affects mission success or mission safety. This paper presents the methodology by means of simplified examples.

  3. APT Blanket System Loss-of-Coolant Accident (LOCA) Analysis Based on Initial Conceptual Design - Case 3: External HR Break at Pump Outlet without Pump Trip

    SciTech Connect

    Hamm, L.L.

    1998-10-07

    This report is one of a series of reports that document normal operation and accident simulations for the Accelerator Production of Tritium (APT) blanket heat removal (HR) system. These simulations were performed for the Preliminary Safety Analysis Report.

  4. APT Blanket System Loss-of-Flow Accident (LOFA) Analysis Based on Initial Conceptual Design - Case 1: with Beam Shutdown and Active RHR

    SciTech Connect

    Hamm, L.L.

    1998-10-07

    This report is one of a series of reports that document normal operation and accident simulations for the Accelerator Production of Tritium (APT) blanket heat removal system. These simulations were performed for the Preliminary Safety Analysis Report.

  5. A Comprehensive Analysis of the X-15 Flight 3-65 Accident

    NASA Technical Reports Server (NTRS)

    Dennehy, Cornelius J.; Orr, Jeb S.; Barshi, Immanuel; Statler, Irving C.

    2014-01-01

    The November 15, 1967, loss of X-15 Flight 3-65-97 (hereafter referred to as Flight 3-65) was a unique incident in that it was the first and only aerospace flight accident involving loss of crew on a vehicle with an adaptive flight control system (AFCS). In addition, Flight 3-65 remains the only instance of a single-pilot departure from controlled flight of a manned entry vehicle in a hypersonic flight regime. To mitigate risk to emerging aerospace systems, the NASA Engineering and Safety Center (NESC) proposed a comprehensive review of this accident. The goal of the assessment was to resolve lingering questions regarding the failure modes of the aircraft systems (including the AFCS) and thoroughly analyze the interactions among the human agents and autonomous systems that contributed to the loss of the pilot and aircraft. This document contains the outcome of the accident review.

  6. Analysis of hospitalization occurred due to motorcycles accidents in São Paulo city

    PubMed Central

    Gorios, Carlos; Armond, Jane de Eston; Rodrigues, Cintia Leci; Pernambuco, Henrique; Iporre, Ramiro Ortiz; Colombo-Souza, Patrícia

    2015-01-01

    OBJECTIVE: To characterize the motorcycle accidents that occurred in the city of São Paulo, SP, Brazil in 2013, with emphasis on information about hospital admissions from SIH/SUS. METHODS: This is a retrospective cross-sectional study covering 5,597 motorcyclists injured in traffic accidents in the city of São Paulo during 2013. A survey was conducted using secondary data from the Hospital Information System of the Unified Health System (SIH/SUS). RESULTS: In 2013 there were 5,597 hospital admissions of motorcyclists injured in traffic accidents in the city of São Paulo, of which 89.8% were male. The admission diagnoses were leg fracture, femur fracture, and intracranial injury. CONCLUSION: This study confirms other preliminary studies on several points, most notably the higher prevalence of young adult males. Level of Evidence II, Retrospective Study. PMID:26327804

  7. Comparative analysis of social, demographic, and flight-related attributes between accident and nonaccident general aviation pilots.

    PubMed

    Urban, R F

    1984-04-01

    This investigation represents an exploratory examination of several differentiating social and demographic characteristics for a sample of calendar year 1978 Colorado-resident nonfatal accident-involved pilots and a random sample of nonaccident general aviation (i.e., nonairline) pilots. During 1979-1980, 80 currently active pilots were interviewed by the author, and information concerning the standard demographic variables, in addition to several social, psychological, and flying-related items, was obtained. The sample was generated from commercially available data files derived from U.S. Government records and consisted of 46 accident and 34 nonaccident pilots who resided within a 100-mi radius of Denver, east of the Rocky Mountains. Descriptively, the respondents represented a broad spectrum of general aviation, including corporate pilots, "crop dusters," builders of amateur experimental aircraft, and recreational fliers. Application of stepwise discriminant analysis revealed that the pilots' education, political orientation, birth order, percent of flying for business purposes, participation in nonflying aviation activities, number of years of flying experience, and an index of aviation procedural noncompliance yielded statistically significant results. Furthermore, utilization of the classification capability of discriminant analysis produced a mathematical function which correctly allocated 78.5% of the cases into the appropriate groups, thus contributing to a 56.5% proportionate reduction in error over a random effects model. No relationship was found between accident involvement and several indicators of social attachments, socioeconomic status, and a number of measures of flying exposure. PMID:6732683
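
    The following sketch illustrates the discriminant-analysis pattern described above (fit a linear discriminant, classify, and report the correct-allocation rate) on synthetic data; the predictors and values are hypothetical.

      # Sketch of a discriminant-analysis classification of accident vs. nonaccident
      # pilots from a handful of predictors. The data are synthetic; only the
      # analysis pattern is shown.

      import numpy as np
      from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

      rng = np.random.default_rng(0)
      n = 80
      # Hypothetical predictors: years of experience, % business flying, noncompliance index
      X = np.column_stack([
          rng.normal(12, 6, n),
          rng.uniform(0, 100, n),
          rng.normal(0, 1, n),
      ])
      y = rng.integers(0, 2, n)   # 1 = accident-involved, 0 = nonaccident

      lda = LinearDiscriminantAnalysis()
      lda.fit(X, y)
      correct = (lda.predict(X) == y).mean()
      print(f"correctly allocated: {correct:.1%}")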

  8. Probabilistic accident consequence uncertainty analysis -- Uncertainty assessment for deposited material and external doses. Volume 1: Main report

    SciTech Connect

    Goossens, L.H.J.; Kraan, B.C.P.; Cooke, R.M.; Boardman, J.; Jones, J.A.; Harper, F.T.; Young, M.L.; Hora, S.C.

    1997-12-01

    The development of two new probabilistic accident consequence codes, MACCS and COSYMA, was completed in 1990. These codes estimate the consequences of accidental releases of radiological material from hypothesized accidents at nuclear installations. In 1991, the US Nuclear Regulatory Commission and the Commission of the European Communities began cosponsoring a joint uncertainty analysis of the two codes. The ultimate objective of this joint effort was to systematically develop credible and traceable uncertainty distributions for the respective code input variables. A formal expert judgment elicitation and evaluation process was identified as the best technology available for developing a library of uncertainty distributions for these consequence parameters. This report focuses on the results of the study to develop distributions for variables related to the MACCS and COSYMA deposited material and external dose models.

  9. Environmental risk management for radiological accidents: integrating risk assessment and decision analysis for remediation at different spatial scales.

    PubMed

    Yatsalo, Boris; Sullivan, Terrence; Didenko, Vladimir; Linkov, Igor

    2011-07-01

    The Tohoku earthquake and subsequent tsunami in March 2011 caused a loss of power at the Fukushima Daiichi nuclear power plant in Japan and led to the release of radioactive materials into the environment. Although the full extent of the contamination is not currently known, the highly complex nature of the environmental contamination (radionuclides in water, soil, and agricultural produce) typical of nuclear accidents requires a detailed geospatial analysis of information with the ability to extrapolate across different scales, with applications to risk assessment models and decision making support. This article briefly summarizes the approach used to inform risk-based land management and remediation decision making after the 1986 Chernobyl accident in Soviet Ukraine. PMID:21608109

  10. Development of the simulation system "IMPACT" for analysis of nuclear power plant severe accidents

    SciTech Connect

    Naitoh, Masanori; Ujita, Hiroshi; Nagumo, Hiroichi

    1997-07-01

    The Nuclear Power Engineering Corporation (NUPEC) has initiated a long-term program to develop the simulation system "IMPACT" for analysis of hypothetical severe accidents in nuclear power plants. IMPACT employs advanced methods of physical modeling and numerical computation, and can simulate a wide spectrum of scenarios ranging from normal operation to hypothetical, beyond-design-basis-accident events. Designed as a large-scale system of interconnected, hierarchical modules, IMPACT's distinguishing features include mechanistic models based on first principles and high-speed simulation on parallel processing computers. The present plan is a ten-year program starting from 1993, consisting of an initial year of preparatory work followed by three technical phases: Phase-1 for development of a prototype system; Phase-2 for completion of the simulation system, incorporating new achievements from basic studies; and Phase-3 for refinement through extensive verification and validation against test results and available real plant data.

  11. Analysis of main steam isolation valve leakage in design basis accidents using MELCOR 1.8.6 and RADTRAD.

    SciTech Connect

    Salay, Michael; Kalinich, Donald A.; Gauntt, Randall O.; Radel, Tracy E.

    2008-10-01

    Analyses were performed using MELCOR and RADTRAD to investigate main steam isolation valve (MSIV) leakage behavior under design basis accident (DBA) loss-of-coolant accident (LOCA) conditions that are presumed to have led to a significant core melt accident. Doses to the control room, site boundary, and low population zone (LPZ) are examined using both approaches described in current regulatory guidelines as well as analyses based on best estimate source term and system response. At issue is the current practice of using containment airborne aerosol concentrations as a surrogate for the in-vessel aerosol concentration that exists in the near vicinity of the MSIVs. This study finds that the current practice of using the AST-based containment aerosol concentrations for assessing MSIV leakage is non-conservative and conceptually in error. A methodology is proposed that scales the containment aerosol concentration to the expected vessel concentration in order to preserve the simplified use of the AST in assessing containment performance under assumed DBA conditions. This correction is required during the first two hours of the accident while the gap and early in-vessel source terms are present. It is general practice to assume that at approximately 2 hours, recovery actions to reflood the core will have been successful and that further core damage can be avoided. The analyses performed in this study determine that, after two hours, assuming vessel reflooding has taken place, the containment aerosol concentration can then conservatively be used as the effective source to the leaking MSIVs. Recommendations are provided concerning typical aerosol removal coefficients that can be used in the RADTRAD code to predict source attenuation in the steam lines, and on robust methods of predicting MSIV leakage flows based on measured MSIV leakage performance.
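
    A minimal sketch of the proposed scaling idea follows, under the assumption of a simple constant vessel-to-containment concentration ratio during the first two hours; the ratio and concentrations are placeholders, not values from the report.

      # Sketch of the scaling idea described above: during the first ~2 hours
      # (gap and early in-vessel release), the aerosol concentration seen by the
      # leaking MSIVs is taken as the containment concentration multiplied by a
      # vessel-to-containment scale factor; afterwards the containment
      # concentration is used directly. All values here are placeholders.

      def msiv_source_concentration(t_hr: float,
                                    containment_conc: float,
                                    vessel_to_containment_ratio: float = 10.0,
                                    cutover_hr: float = 2.0) -> float:
          """Effective aerosol concentration (arbitrary units) at the MSIV leak path."""
          if t_hr < cutover_hr:
              return containment_conc * vessel_to_containment_ratio
          return containment_conc

      print(msiv_source_concentration(1.0, 5.0e-3))   # early phase: scaled up
      print(msiv_source_concentration(3.0, 5.0e-3))   # after assumed reflood: unscaled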

  12. What can the drivers' own description from combined sources provide in an analysis of driver distraction and low vigilance in accident situations?

    PubMed

    Tivesten, Emma; Wiberg, Henrik

    2013-03-01

    Accident data play an important role in vehicle safety development. Accident data sources are generally limited in terms of how much information is provided on driver states and behaviour prior to an accident. However, the precise limitations vary between databases, due to differences in analysis focus and data collection procedures between organisations. If information about a specific accident can be retrieved from more than one data source, it should be possible to combine the available information sets so that data from one source compensate for limitations in the other(s). To investigate the viability of such compensation, this study identified a set of accidents recorded in two different data sources. The first data source investigated was an accident mail survey, and the second was insurance claims documents, consisting predominantly of claims completed by the involved road users. An analysis of survey variables was compared to a case analysis including word data derived from the same survey and filed insurance claims documents. For each accident, the added value of having access to more than one source of information was assessed. To limit the scope of this study, three particular topics were investigated: available information on low vigilance (e.g., being drowsy, ill); secondary task distraction (e.g., talking with passengers, mobile phone use); and distraction related to the driving task (e.g., looking for approaching vehicles). Results suggest that for low vigilance and secondary task distraction, a combination of the mail survey and insurance claims documents provides more reliable and detailed pre-crash information than survey variables alone. However, driving-related distraction appears to be more difficult to capture. In order to gain a better understanding of the above issues and how frequently they occur in accidents, the data sources and analysis methods suggested here may be combined with other investigation methods such

  14. A rational design change methodology based on experimental and analytical modal analysis

    SciTech Connect

    Weinacht, D.J.; Bennett, J.G.

    1993-08-01

    A design methodology that integrates analytical modeling and experimental characterization is presented. This methodology represents a powerful tool for making rational design decisions and changes. An example of its implementation in the design, analysis, and testing of a precision machine tool support structure is given.

  15. Establishing Equivalence: Methodological Progress in Group-Matching Design and Analysis

    ERIC Educational Resources Information Center

    Kover, Sara T.; Atwood, Amy K.

    2013-01-01

    This methodological review draws attention to the challenges faced by intellectual and developmental disabilities researchers in the appropriate design and analysis of group comparison studies. We provide a brief overview of matching methodologies in the field, emphasizing group-matching designs used in behavioral research on cognition and…

  16. Radiotherapy Accidents

    NASA Astrophysics Data System (ADS)

    Mckenzie, Alan

    A major benefit of a Quality Assurance system in a radiotherapy centre is that it reduces the likelihood of an accident. For over 20 years I have been the interface in the UK between the Institute of Physics and Engineering in Medicine and the media — newspapers, radio and TV — and so I have learned about radiotherapy accidents from personal experience. In some cases, these accidents did not become public and so the hospital cannot be identified. Nevertheless, lessons are still being learned.

  17. Stream habitat analysis using the instream flow incremental methodology

    USGS Publications Warehouse

    Bovee, Ken D.; Lamb, Berton L.; Bartholow, John M.; Stalnaker, Clair B.; Taylor, Jonathan; Henriksen, Jim

    1998-01-01

    This document describes the Instream Flow Incremental Methodology (IFIM) in its entirety and serves as a comprehensive introductory textbook on IFIM for training courses, containing the most complete description of IFIM in existence today. It is also intended as an official published guide to IFIM, counteracting the misconceptions about the methodology that have pervaded the professional literature since the mid-1980s by describing IFIM as envisioned by its developers. The document is aimed at the decision makers responsible for the management and allocation of natural resources, providing them with an overview, and at those who design and implement studies to inform those decision makers. It provides enough background on model concepts, data requirements, calibration techniques, and quality assurance to help the technical user design and implement a cost-effective application of IFIM that will provide policy-relevant information. Individual chapters cover the basic organization of IFIM and the procedural sequence of applying it, starting with problem identification and continuing through study planning and implementation to problem resolution.

  18. Drinking locations of drink-drivers: a comparative analysis of accident and nonaccident cases.

    PubMed

    Lang, E; Stockwell, T

    1991-12-01

    This study utilizes data collected by the Perth (Australia) Traffic Police on the last drinking location of persons arrested for drink-driving either as a consequence of their being involved in a road traffic accident or as a result of failing a roadside breath test. A comparison of these data found that significantly more persons involved in traffic accidents had been drinking at unlicensed locations, that is, at private residences or in public places such as parks, than at licensed premises. It was also found that accident cases were more likely to involve males under 25 years, to have, on average, significantly higher blood alcohol levels than nonaccident drink-driving cases, and to occur late at night or in the early morning. The significance of these findings was confirmed by logistic regression. A surprising incidental finding was that considerably more women had been arrested for drink-driving than had been previously reported in other studies, both in Australia and overseas.
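
    The sketch below shows the logistic-regression check pattern on synthetic data: accident versus nonaccident drink-driving cases regressed on drinking location, driver group, and blood alcohol level. Variable names and effect sizes are invented for illustration.

      # Sketch of a logistic-regression check of the kind described above.
      # The data are synthetic and the coefficients are arbitrary.

      import numpy as np
      import statsmodels.api as sm

      rng = np.random.default_rng(42)
      n = 500
      unlicensed_location = rng.integers(0, 2, n)        # 1 = private residence/public place
      male_under_25 = rng.integers(0, 2, n)
      bac = rng.normal(0.12, 0.05, n).clip(0.0)          # blood alcohol concentration

      logit = -2.0 + 0.8 * unlicensed_location + 0.6 * male_under_25 + 8.0 * bac
      accident = rng.binomial(1, 1 / (1 + np.exp(-logit)))

      X = sm.add_constant(np.column_stack([unlicensed_location, male_under_25, bac]))
      model = sm.Logit(accident, X).fit(disp=False)
      print(model.summary(xname=["const", "unlicensed_location", "male_under_25", "bac"]))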

  19. Risk Analysis for Public Consumption: Media Coverage of the Ginna Nuclear Reactor Accident.

    ERIC Educational Resources Information Center

    Dunwoody, Sharon; And Others

    Researchers have determined that the lay public makes risk judgments in ways that are very different from those advocated by scientists. Noting that these differences have caused considerable concern among those who promote and regulate health and safety, a study examined media coverage of the accident at the Robert E. Ginna nuclear power plant…

  20. Spontaneous abortions after the Three Mile Island nuclear accident: a life table analysis.

    PubMed

    Goldhaber, M K; Staub, S L; Tokuhata, G K

    1983-07-01

    A study was conducted to determine whether the incidence of spontaneous abortion was greater than expected near the Three Mile Island (TMI) nuclear power plant during the months following the March 28, 1979 accident. All persons living within five miles of TMI were registered shortly after the accident, and information on pregnancy at the time of the accident was collected. After one year, all pregnancy cases were followed up and outcomes ascertained. Using the life table method, it was found that, given pregnancies after four completed weeks of gestation counting from the first day of the last menstrual period, the estimated incidence of spontaneous abortion (miscarriage before completion of 16 weeks of gestation) was 15.1 per cent for women pregnant at the time of the TMI accident. Combining spontaneous abortions and stillbirths (delivery of a dead fetus after 16 weeks of gestation), the estimated incidence was 16.1 per cent for pregnancies after four completed weeks of gestation. Both incidences are comparable to baseline studies of fetal loss. PMID:6859357

  1. Spontaneous abortions after the Three Mile Island nuclear accident: a life table analysis

    SciTech Connect

    Goldhaber, M.K.; Staub, S.L.; Tokuhata, G.K.

    1983-07-01

    A study was conducted to determine whether the incidence of spontaneous abortion was greater than expected near the Three Mile Island (TMI) nuclear power plant during the months following the March 28, 1979 accident. All persons living within five miles of TMI were registered shortly after the accident, and information on pregnancy at the time of the accident was collected. After one year, all pregnancy cases were followed up and outcomes ascertained. Using the life table method, it was found that, given pregnancies after four completed weeks of gestation counting from the first day of the last menstrual period, the estimated incidence of spontaneous abortion (miscarriage before completion of 16 weeks of gestation) was 15.1 per cent for women pregnant at the time of the TMI accident. Combining spontaneous abortions and stillbirths (delivery of a dead fetus after 16 weeks of gestation), the estimated incidence was 16.1 per cent for pregnancies after four completed weeks of gestation. Both incidences are comparable to baseline studies of fetal loss.

  2. Spontaneous abortions after the Three Mile Island nuclear accident: a life table analysis.

    PubMed Central

    Goldhaber, M K; Staub, S L; Tokuhata, G K

    1983-01-01

    A study was conducted to determine whether the incidence of spontaneous abortion was greater than expected near the Three Mile Island (TMI) nuclear power plant during the months following the March 28, 1979 accident. All persons living within five miles of TMI were registered shortly after the accident, and information on pregnancy at the time of the accident was collected. After one year, all pregnancy cases were followed up and outcomes ascertained. Using the life table method, it was found that, given pregnancies after four completed weeks of gestation counting from the first day of the last menstrual period, the estimated incidence of spontaneous abortion (miscarriage before completion of 16 weeks of gestation) was 15.1 per cent for women pregnant at the time of the TMI accident. Combining spontaneous abortions and stillbirths (delivery of a dead fetus after 16 weeks of gestation), the estimated incidence was 16.1 per cent for pregnancies after four completed weeks of gestation. Both incidences are comparable to baseline studies of fetal loss. PMID:6859357
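
    A minimal sketch of the actuarial life-table calculation used in studies of this kind follows, with invented interval counts; pregnancies enter at four completed weeks of gestation and withdrawals are treated as censored.

      # Sketch of a life-table (actuarial) estimate of cumulative spontaneous
      # abortion incidence. The interval counts are invented for illustration.

      # Each row: (gestational-week interval, n entering, events, censored/withdrawn)
      intervals = [
          ("4-7",   1000, 60, 10),
          ("8-11",   930, 50, 15),
          ("12-15",  865, 25, 20),
      ]

      surviving = 1.0
      for label, n_enter, events, withdrawn in intervals:
          at_risk = n_enter - withdrawn / 2.0       # actuarial adjustment for censoring
          surviving *= 1.0 - events / at_risk
          print(f"weeks {label}: cumulative loss = {1.0 - surviving:.3f}")

      print(f"estimated incidence through 16 weeks = {1.0 - surviving:.1%}")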

  3. Bullet identification: a case of a fatal hunting accident resolved by comparison of lead shot using instrumental neutron activation analysis.

    PubMed

    Capannesi, G; Sedda, A F

    1992-03-01

    Bullet identification by chemical analysis often provides a powerful clue in forensic science. A case is reported in which a hunting accident was resolved by using instrumental neutron activation analysis (INAA) for direct comparison of the trace element content in lead shot. Different preparation batches of lead shot appear to have a high within-group composition homogeneity, and good differentiation is achieved between different batches. Determination of the nickel and antimony content on a bush branch demonstrated that the branch had been perforated by one of the shot pellets, and this helped the detectives in reconstruction of the crime scene. PMID:1500906

  4. Launch Vehicle Fire Accident Preliminary Analysis of a Liquid-Metal Cooled Thermionic Nuclear Reactor: TOPAZ-II

    NASA Astrophysics Data System (ADS)

    Hu, G.; Zhao, S.; Ruan, K.

    2012-01-01

    In this paper, a launch vehicle propellant fire accident analysis of the TOPAZ-II reactor is performed with TATRHG(A), a thermionic reactor core analysis code developed by the author. When a rocket explodes on a launch pad, its payload, TOPAZ-II, can be subjected to a severe thermal environment from the resulting fireball. The extreme temperatures associated with propellant fires can create a destructive environment in or near the fireball. Different kinds of propellants, liquid and solid, which lead to different fire temperatures, are considered. Preliminary analysis shows that solid propellant fires can melt the entire beryllium radial reflector, which is toxic.

  5. A Longitudinal Analysis of the Causal Factors in Major Maritime Accidents in the USA and Canada (1996-2006)

    NASA Technical Reports Server (NTRS)

    Johnson, C. W.; Holloway, C. M.

    2007-01-01

    Accident reports provide important insights into the causes and contributory factors leading to particular adverse events. In contrast, this paper provides an analysis that extends across the findings presented over ten years of investigations into maritime accidents by both the US National Transportation Safety Board (NTSB) and the Canadian Transportation Safety Board (TSB). The purpose of the study was to assess the comparative frequency of a range of causal factors in the reporting of adverse events. In order to communicate our findings, we introduce J-H graphs as a means of representing the proportion of causes and contributory factors associated with human error, equipment failure and other high level classifications in longitudinal studies of accident reports. Our results suggest that the proportion of causal and contributory factors attributable to direct human error may be very much smaller than has been suggested elsewhere in the human factors literature. In contrast, more attention should be paid to wider systemic issues, including the managerial and regulatory context of maritime operations.

  6. Fukushima Daiichi Unit 1 Accident Progression Uncertainty Analysis and Implications for Decommissioning of Fukushima Reactors - Volume I.

    SciTech Connect

    Gauntt, Randall O.; Mattie, Patrick D.

    2016-01-01

    Sandia National Laboratories (SNL) has conducted an uncertainty analysis (UA) on the Fukushima Daiichi Unit 1 (1F1) accident progression with the MELCOR code. The model used was developed for a previous accident reconstruction investigation jointly sponsored by the US Department of Energy (DOE) and Nuclear Regulatory Commission (NRC). That study focused on reconstructing the accident progressions, as postulated by the limited plant data. This work focused on evaluation of uncertainty in core damage progression behavior and its effect on key figures-of-merit (e.g., hydrogen production, reactor damage state, fraction of intact fuel, vessel lower head failure). The primary intent of this study was to characterize the range of predicted damage states in the 1F1 reactor considering state-of-knowledge uncertainties associated with MELCOR modeling of core damage progression, and to generate information that may be useful in informing the decommissioning activities that will be employed to defuel the damaged reactors at the Fukushima Daiichi Nuclear Power Plant. Additionally, core damage progression variability inherent in MELCOR modeling numerics is investigated.
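
    The sketch below illustrates how such an uncertainty analysis can be organized: uncertain model parameters are sampled with Latin hypercube sampling, and each sample would drive one code run. The parameter names, ranges, and the surrogate figure of merit are hypothetical placeholders, not MELCOR inputs from the study.

      # Sketch of state-of-knowledge uncertainty propagation: Latin hypercube
      # samples of uncertain parameters, one code run per sample. The surrogate
      # below stands in for a MELCOR calculation.

      import numpy as np
      from scipy.stats import qmc

      param_names = ["clad_failure_T_K", "debris_porosity", "candling_heat_transfer"]
      lower = np.array([2100.0, 0.3, 0.5])
      upper = np.array([2540.0, 0.5, 2.0])

      sampler = qmc.LatinHypercube(d=len(param_names), seed=7)
      samples = qmc.scale(sampler.random(n=300), lower, upper)

      def surrogate_hydrogen_kg(p):
          # Placeholder for a code run; returns a toy figure of merit.
          return 250.0 + 0.1 * (p[0] - 2100.0) + 400.0 * p[1] - 30.0 * p[2]

      results = np.array([surrogate_hydrogen_kg(p) for p in samples])
      print(f"hydrogen produced: mean = {results.mean():.0f} kg, "
            f"5th-95th percentile = {np.percentile(results, [5, 95])}")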

  7. Validation and verification of RELAP5 for Advanced Neutron Source accident analysis: Part I, comparisons to ANSDM and PRSDYN codes

    SciTech Connect

    Chen, N.C.J.; Ibn-Khayat, M.; March-Leuba, J.A.; Wendel, M.W.

    1993-12-01

    As part of verification and validation, the Advanced Neutron Source reactor RELAP5 system model was benchmarked against the Advanced Neutron Source dynamic model (ANSDM) and the PRSDYN model. RELAP5 is a one-dimensional, two-phase transient code, developed by the Idaho National Engineering Laboratory for reactor safety analysis. Both the ANSDM and PRSDYN models use a simplified single-phase equation set to predict transient thermal-hydraulic performance. Brief descriptions of each of the codes, models, and model limitations are included. Even though comparisons were limited to single-phase conditions, a broad spectrum of accidents was benchmarked: a small loss-of-coolant accident (LOCA), a large LOCA, a station blackout, and a reactivity insertion accident. The overall conclusion is that the three models yield similar results if the input parameters are the same. However, ANSDM does not capture pressure wave propagation through the coolant system. This difference is significant in very rapid pipe break events. Recommendations are provided for further model improvements.

  8. Interpretation methodology and analysis of in-flight lightning data

    NASA Technical Reports Server (NTRS)

    Rudolph, T.; Perala, R. A.

    1982-01-01

    A methodology is presented whereby electromagnetic measurements of inflight lightning stroke data can be understood and extended to other aircraft. Recent measurements made on the NASA F106B aircraft indicate that sophisticated numerical techniques and new developments in corona modeling are required to fully understand the data. Thus the problem is nontrivial and successful interpretation can lead to a significant understanding of the lightning/aircraft interaction event. This is of particular importance because of the problem of lightning induced transient upset of new technology low level microcircuitry which is being used in increasing quantities in modern and future avionics. Inflight lightning data is analyzed and lightning environments incident upon the F106B are determined.

  9. Quantifying Drosophila food intake: comparative analysis of current methodology

    PubMed Central

    Deshpande, Sonali A.; Carvalho, Gil B.; Amador, Ariadna; Phillips, Angela M.; Hoxha, Sany; Lizotte, Keith J.; Ja, William W.

    2014-01-01

    Food intake is a fundamental parameter in animal studies. Despite the prevalent use of Drosophila in laboratory research, precise measurements of food intake remain challenging in this model organism. Here, we compare several common Drosophila feeding assays: the Capillary Feeder (CAFE), food-labeling with a radioactive tracer or a colorimetric dye, and observations of proboscis extension (PE). We show that the CAFE and radioisotope-labeling provide the most consistent results, have the highest sensitivity, and can resolve differences in feeding that dye-labeling and PE fail to distinguish. We conclude that performing the radiolabeling and CAFE assays in parallel is currently the best approach for quantifying Drosophila food intake. Understanding the strengths and limitations of food intake methodology will greatly advance Drosophila studies of nutrition, behavior, and disease. PMID:24681694

  10. Summary Report of Laboratory Critical Experiment Analyses Performed for the Disposal Criticality Analysis Methodology

    SciTech Connect

    J. Scaglione

    1999-09-09

    This report, "Summary Report of Laboratory Critical Experiment Analyses Performed for the Disposal Criticality Analysis Methodology", summarizes the laboratory critical experiment (LCE) analyses used to support the validation of the disposal criticality analysis methodology. The results demonstrate the ability of MCNP to accurately predict the critical multiplication factor (keff) for fuel in different configurations. Results from the LCE evaluations will support the development and validation of the criticality models used in the disposal criticality analysis methodology. These models and their validation are discussed in the "Disposal Criticality Analysis Methodology Topical Report" (CRWMS M&O 1998a).
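
    A minimal sketch of the benchmark statistics typically extracted from such LCE comparisons follows, assuming invented keff results: each experiment is critical by construction, so the calculated values yield a bias and an uncertainty for the method.

      # Sketch of validation statistics for a set of laboratory critical
      # experiments: each LCE is critical (keff = 1), so the calculated keff
      # values give a bias and an uncertainty. The values below are invented.

      import statistics

      calculated_keff = [0.9978, 1.0012, 0.9991, 1.0005, 0.9969, 0.9996, 1.0021]

      bias = statistics.mean(calculated_keff) - 1.0
      sigma = statistics.stdev(calculated_keff)
      print(f"mean keff = {statistics.mean(calculated_keff):.4f}")
      print(f"bias = {bias:+.4f}, standard deviation = {sigma:.4f}")
      # An upper subcritical limit would then be set from this bias and
      # uncertainty together with an administrative margin.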

  11. Risk of road accident associated with the use of drugs: a systematic review and meta-analysis of evidence from epidemiological studies.

    PubMed

    Elvik, Rune

    2013-11-01

    This paper is a corrigendum to a previously published paper where errors were detected. The errors have been corrected in this paper. The paper is otherwise identical to the previously published paper. A systematic review and meta-analysis of studies that have assessed the risk of accident associated with the use of drugs when driving is presented. The meta-analysis included 66 studies containing a total of 264 estimates of the effects on accident risk of using illicit or prescribed drugs when driving. Summary estimates of the odds ratio of accident involvement are presented for amphetamines, analgesics, anti-asthmatics, anti-depressives, anti-histamines, benzodiazepines, cannabis, cocaine, opiates, penicillin and zopiclone (a sleeping pill). For most of the drugs, small or moderate increases in accident risk associated with the use of the drugs were found. Information about whether the drugs were actually used while driving and about the doses used was often imprecise. Most studies that have evaluated the presence of a dose-response relationship between the dose of drugs taken and the effects on accident risk confirm the existence of a dose-response relationship. Use of drugs while driving tends to have a larger effect on the risk of fatal and serious injury accidents than on the risk of less serious accidents (usually property-damage-only accidents). The quality of the studies that have assessed risk varied greatly. There was a tendency for the estimated effects of drug use on accident risk to be smaller in well-controlled studies than in poorly controlled studies. Evidence of publication bias was found for some drugs. The associations found cannot be interpreted as causal relationships, principally because most studies do not control very well for potentially confounding factors. PMID:22785089
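
    The following sketch shows a standard random-effects (DerSimonian-Laird) summary of log odds ratios of the kind reported in such meta-analyses; the per-study odds ratios and standard errors are invented for illustration.

      # Sketch of a random-effects meta-analysis of accident odds ratios.
      # Per-study odds ratios and standard errors are invented.

      import math

      # (odds ratio, standard error of log odds ratio) for a handful of studies
      studies = [(1.30, 0.15), (1.10, 0.20), (1.45, 0.25), (0.95, 0.30), (1.25, 0.18)]

      y = [math.log(or_) for or_, _ in studies]          # log odds ratios
      v = [se ** 2 for _, se in studies]                 # within-study variances

      # Fixed-effect weights and heterogeneity statistic Q
      w = [1.0 / vi for vi in v]
      y_fixed = sum(wi * yi for wi, yi in zip(w, y)) / sum(w)
      q = sum(wi * (yi - y_fixed) ** 2 for wi, yi in zip(w, y))

      # DerSimonian-Laird estimate of between-study variance tau^2
      c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
      tau2 = max(0.0, (q - (len(studies) - 1)) / c)

      # Random-effects summary estimate
      w_re = [1.0 / (vi + tau2) for vi in v]
      y_re = sum(wi * yi for wi, yi in zip(w_re, y)) / sum(w_re)
      se_re = math.sqrt(1.0 / sum(w_re))
      print(f"summary OR = {math.exp(y_re):.2f} "
            f"(95% CI {math.exp(y_re - 1.96 * se_re):.2f}-{math.exp(y_re + 1.96 * se_re):.2f})")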

  13. Retrospective reconstruction of Iodine-131 distribution at the Fukushima Daiichi Nuclear Power Plant accident by analysis of Iodine-129

    NASA Astrophysics Data System (ADS)

    Matsuzaki, Hiroyuki; Muramatsu, Yasuyuki; Toyama, Chiaki; Ohno, Takeshi; Kusuno, Haruka; Miyake, Yasuto; Honda, Maki

    2014-05-01

    Science and Education on June, 2011. So far, more than 500 samples have been measured and the I-129 deposition determined by AMS at MALT (Micro Analysis Laboratory, Tandem accelerator), The University of Tokyo. The measurement error from AMS is less than 5%, typically 3%. The overall uncertainty is estimated to be less than 30%, including the uncertainty of the nominal value of the standard reference material used, of the I-129/I-131 ratio estimation, and of the "representativeness" of the analyzed sample for its region. The isotopic ratio I-129/I-131 from the reactor was estimated to be 22.3 ± 6.3 as of March 11, 2011 [3], from a series of samples collected by a group from The University of Tokyo on April 20, 2011, for which the I-131 was determined by gamma-ray spectrometry with good precision. Complementarily, we investigated the depth profile of the accident-derived I-129 in soil and its migration after deposition, and found that more than 90% of the I-129 was concentrated within the top 5 cm layer and that the downward migration speed was less than 1 cm/yr [4]. From the set of I-129 data, the corresponding I-131 depositions were calculated, and the distribution map is being constructed. Various fine structures of the distribution have become apparent. [1] Y. Nikiforov and D. R. Gnepp, 1994, Cancer, Vol. 47, pp. 748-766. [2] T. Straume, et al., 1996, Health Physics, Vol. 71, pp. 733-740. [3] Y. Miyake, H. Matsuzaki et al., 2012, Geochem. J., Vol. 46, pp. 327-333. [4] M. Honda, H. Matsuzaki et al., under submission.
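
    A minimal sketch of the reconstruction step follows, assuming the published I-129/I-131 ratio is an atom ratio at the accident date and using an invented I-129 deposition value; it converts a measured I-129 activity to the corresponding I-131 deposition on March 11, 2011.

      # Sketch of reconstructing I-131 deposition from measured I-129 using an
      # I-129/I-131 atom ratio at the accident date. The measured value is
      # invented, and the atom-ratio interpretation is an assumption.

      import math

      HALF_LIFE_I129_S = 15.7e6 * 365.25 * 86400   # 15.7 million years
      HALF_LIFE_I131_S = 8.02 * 86400              # 8.02 days

      def atoms_from_activity(activity_bq: float, half_life_s: float) -> float:
          lam = math.log(2) / half_life_s
          return activity_bq / lam

      def i131_activity_at_accident(i129_bq_m2: float, atom_ratio_129_to_131: float) -> float:
          """I-131 deposition (Bq/m^2) at the accident date from measured I-129."""
          n129 = atoms_from_activity(i129_bq_m2, HALF_LIFE_I129_S)
          n131 = n129 / atom_ratio_129_to_131
          lam131 = math.log(2) / HALF_LIFE_I131_S
          return n131 * lam131

      # Example with a placeholder I-129 deposition of 0.05 Bq/m^2 and ratio 22.3
      print(f"I-131 on 2011-03-11 ~ {i131_activity_at_accident(0.05, 22.3):.3e} Bq/m^2")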

  14. Local Analysis of Shock Capturing Using Discontinuous Galerkin Methodology

    NASA Technical Reports Server (NTRS)

    Atkins, H. L.

    1997-01-01

    The compact form of the discontinuous Galerkin method allows for a detailed local analysis of the method in the neighborhood of the shock for a non-linear model problem. Insight gained from the analysis leads to new flux formulas that are stable and that preserve the compactness of the method. Although developed for a model equation, the flux formulas are applicable to systems such as the Euler equations. This article presents the analysis for methods with a degree up to 5. The analysis is accompanied by supporting numerical experiments using Burgers' equation and the Euler equations.
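
    For context, the semi-discrete discontinuous Galerkin formulation for Burgers' equation on one element is sketched below in its standard textbook form, with a local Lax-Friedrichs numerical flux; this is background for the flux discussion, not the specific flux formulas proposed in the article.

      % Semi-discrete DG weak form on element K = [x_l, x_r] for u_t + f(u)_x = 0
      % with f(u) = u^2/2, test functions v in the local polynomial space, and a
      % local Lax-Friedrichs numerical flux \hat{f}.
      \begin{align*}
        &\int_K \frac{\partial u_h}{\partial t}\, v \, dx
          - \int_K f(u_h)\, \frac{\partial v}{\partial x}\, dx
          + \hat{f}\big(u_h(x_r^-), u_h(x_r^+)\big)\, v(x_r^-)
          - \hat{f}\big(u_h(x_l^-), u_h(x_l^+)\big)\, v(x_l^+) = 0, \\
        &\hat{f}(a, b) = \tfrac{1}{2}\big[f(a) + f(b)\big] - \tfrac{\alpha}{2}\,(b - a),
          \qquad \alpha = \max_u |f'(u)|.
      \end{align*}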

  15. Analysis of potential for jet-impingement erosion from leaking steam generator tubes during severe accidents.

    SciTech Connect

    Majumdar, S.; Diercks, D. R.; Shack, W. J.; Energy Technology

    2002-05-01

    This report summarizes analytical evaluation of crack-opening areas and leak rates of superheated steam through flaws in steam generator tubes and erosion of neighboring tubes due to jet impingement of superheated steam with entrained particles from core debris created during severe accidents. An analytical model for calculating crack-opening area as a function of time and temperature was validated with tests on tubes with machined flaws. A three-dimensional computational fluid dynamics code was used to calculate the jet velocity impinging on neighboring tubes as a function of tube spacing and crack-opening area. Erosion tests were conducted in a high-temperature, high-velocity erosion rig at the University of Cincinnati, using micrometer-sized nickel particles mixed in with high-temperature gas from a burner. The erosion results, together with analytical models, were used to estimate the erosive effects of superheated steam with entrained aerosols from the core during severe accidents.

  16. Full cost accounting in the analysis of separated waste collection efficiency: A methodological proposal.

    PubMed

    D'Onza, Giuseppe; Greco, Giulio; Allegrini, Marco

    2016-02-01

    Recycling implies additional costs for separated municipal solid waste (MSW) collection. The aim of the present study is to propose and implement a management tool - the full cost accounting (FCA) method - to calculate the full collection costs of different types of waste. Our analysis aims for a better understanding of the difficulties of putting FCA into practice in the MSW sector. We propose an FCA methodology that uses standard costs and actual quantities to calculate the collection costs of separated and undifferentiated waste. The methodology allows cost-efficiency analysis and benchmarking, overcoming problems related to firm-specific accounting choices, earnings management policies and purchase policies. It also supports variance analysis that can be used to identify the causes of off-standard performance and guide managers to deploy resources more efficiently. The methodology can be implemented by companies lacking a sophisticated management accounting system. PMID:26613351
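
    A minimal sketch of the standard-cost-times-actual-quantity idea and the associated variance check follows; all unit costs, quantities, and spending figures are invented for illustration.

      # Sketch of a standard-cost x actual-quantity collection-cost calculation
      # with a variance check against actual spending. All figures are invented.

      standard_cost_per_tonne = {"separated": 95.0, "undifferentiated": 60.0}  # EUR/t
      actual_tonnes = {"separated": 12500, "undifferentiated": 30200}
      actual_spending = {"separated": 1_300_000.0, "undifferentiated": 1_750_000.0}  # EUR

      for stream in standard_cost_per_tonne:
          standard_cost = standard_cost_per_tonne[stream] * actual_tonnes[stream]
          variance = actual_spending[stream] - standard_cost
          print(f"{stream:>16}: standard = {standard_cost:>12,.0f} EUR, "
                f"actual = {actual_spending[stream]:>12,.0f} EUR, "
                f"variance = {variance:>+10,.0f} EUR")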

  18. Development of an Expert Judgement Elicitation and Calibration Methodology for Risk Analysis in Conceptual Vehicle Design

    NASA Technical Reports Server (NTRS)

    Unal, Resit; Keating, Charles; Conway, Bruce; Chytka, Trina

    2004-01-01

    A comprehensive expert-judgment elicitation methodology to quantify input parameter uncertainty and analysis tool uncertainty in a conceptual launch vehicle design analysis has been developed. The ten-phase methodology seeks to obtain expert judgment opinion for quantifying uncertainties as a probability distribution so that multidisciplinary risk analysis studies can be performed. The calibration and aggregation techniques presented as part of the methodology are aimed at improving individual expert estimates, and provide an approach to aggregate multiple expert judgments into a single probability distribution. The purpose of this report is to document the methodology development and its validation through application to a reference aerospace vehicle. A detailed summary of the application exercise, including calibration and aggregation results is presented. A discussion of possible future steps in this research area is given.
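
    The sketch below shows one common way to aggregate several experts' elicited quantiles into a single distribution, with and without calibration-based weights; the elicited values and weights are invented, and the report's own calibration and aggregation scheme may differ in detail.

      # Sketch of aggregating experts' elicited quantiles for one uncertain input.
      # All elicited values and weights are invented.

      import numpy as np

      # Each row: one expert's (5th, 50th, 95th) percentile estimate of a parameter
      expert_quantiles = np.array([
          [0.8, 1.2, 2.0],
          [0.5, 1.0, 1.8],
          [0.9, 1.5, 2.6],
          [0.7, 1.1, 1.9],
      ])

      equal_weight = expert_quantiles.mean(axis=0)
      print("aggregated 5th/50th/95th percentiles:", np.round(equal_weight, 2))

      # With calibration, weights could instead reflect each expert's performance
      # on seed questions:
      weights = np.array([0.4, 0.1, 0.2, 0.3])
      weighted = (weights[:, None] * expert_quantiles).sum(axis=0)
      print("calibration-weighted percentiles:  ", np.round(weighted, 2))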

  19. Accident simulation and consequence analysis in support of MHTGR safety evaluations

    SciTech Connect

    Ball, S.J.; Wichner, R.P.; Smith, O.L.; Conklin, J.C.; Barthold, W.P.

    1991-01-01

    This paper summarizes research performed at Oak Ridge National Laboratory (ORNL) to assist the Nuclear Regulatory Commission (NRC) in preliminary determinations of licensability of the US Department of Energy (DOE) reference design of a standard modular high-temperature gas-cooled reactor (MHTGR). The work described includes independent analyses of core heatup and steam ingress accidents, and the reviews and analyses of fuel performance and fission product transport technology.

  20. Review of Cytogenetic analysis of restoration workers for Fukushima Daiichi nuclear power station accident.

    PubMed

    Suto, Yumiko

    2016-09-01

    Japan was faced with the nuclear accident at the Fukushima Daiichi Nuclear Power Station (NPS), caused by the combined disaster of the Great East Japan Earthquake and the subsequent tsunamis on 11 March 2011. The National Institute of Radiological Sciences received all nuclear workers who were engaged in emergency response tasks at the NPS and were suspected of having been acutely overexposed to radiation. Biological dosimetry by the dicentric chromosome assay was helpful for medical triage and management of the workers. PMID:27473701

  2. Progressive Failure Analysis Methodology for Laminated Composite Structures

    NASA Technical Reports Server (NTRS)

    Sleight, David W.

    1999-01-01

    A progressive failure analysis method has been developed for predicting the failure of laminated composite structures under geometrically nonlinear deformations. The progressive failure analysis uses C^1 shell elements based on classical lamination theory to calculate the in-plane stresses. Several failure criteria, including the maximum strain criterion, Hashin's criterion, and Christensen's criterion, are used to predict the failure mechanisms, and several options are available to degrade the material properties after failure. The progressive failure analysis method is implemented in the COMET finite element analysis code and can predict the damage and response of laminated composite structures from initial loading to final failure. The different failure criteria and material degradation methods are compared and assessed by performing analyses of several laminated composite structures. Results from the progressive failure method indicate good correlation with the existing test data except in structural applications where interlaminar stresses are important, which may cause failure mechanisms such as debonding or delamination.

  3. Methodologies and techniques for analysis of network flow data

    SciTech Connect

    Bobyshev, A.; Grigoriev, M. (Fermilab)

    2004-12-01

    Network flow data gathered at the border routers and core switches is used at Fermilab for statistical analysis of traffic patterns, passive network monitoring, and estimation of network performance characteristics. Flow data is also a critical tool in the investigation of computer security incidents. Development and enhancement of flow based tools is an on-going effort. This paper describes the most recent developments in flow analysis at Fermilab.

  4. Analysis of Radionuclide Releases from the Fukushima Dai-ichi Nuclear Power Plant Accident Part II

    NASA Astrophysics Data System (ADS)

    Achim, Pascal; Monfort, Marguerite; Le Petit, Gilbert; Gross, Philippe; Douysset, Guilhem; Taffary, Thomas; Blanchard, Xavier; Moulin, Christophe

    2014-03-01

    The present part of the publication (Part II) deals with the long-range dispersion of radionuclides emitted into the atmosphere during the Fukushima Dai-ichi accident that occurred after the March 11, 2011 tsunami. The first part (Part I) is dedicated to the accident features, relying on radionuclide detections performed by monitoring stations of the Comprehensive Nuclear-Test-Ban Treaty Organization network. In this study, the emissions of the three fission products Cs-137, I-131 and Xe-133 are investigated. Regarding Xe-133, the total release is estimated to be of the order of 6 × 10^18 Bq emitted during the explosions of units 1, 2 and 3. The corresponding core inventory at the time of reactor shutdown is estimated at about 8 × 10^18 Bq. This result suggests that at least 80 % of the core inventory has been released into the atmosphere and indicates a broad meltdown of the reactor cores. Total atmospheric releases of Cs-137 and I-131 aerosols are estimated to be 10^16 and 10^17 Bq, respectively. By neglecting gas/particulate conversion phenomena, the total release of I-131 (gas + aerosol) could be estimated to be 4 × 10^17 Bq. Atmospheric transport simulations suggest that the main air emissions occurred during the events of March 14, 2011 (UTC) and that no major release occurred after March 23. The radioactivity emitted into the atmosphere could represent 10 % of the Chernobyl accident releases for I-131 and Cs-137.

  5. SiC MODIFICATIONS TO MELCOR FOR SEVERE ACCIDENT ANALYSIS APPLICATIONS

    SciTech Connect

    Brad J. Merrill; Shannon M. Bragg-Sitton

    2013-09-01

    The Department of Energy (DOE) Office of Nuclear Energy (NE) Light Water Reactor (LWR) Sustainability Program encompasses strategic research focused on improving reactor core economics and safety margins through the development of an advanced fuel cladding system. The Fuels Pathway within this program focuses on fuel system components outside of the fuel pellet, allowing for alteration of the existing zirconium-based clad system through coatings, addition of ceramic sleeves, or complete replacement (e.g., fully ceramic cladding). The DOE-NE Fuel Cycle Research & Development (FCRD) Advanced Fuels Campaign (AFC) is also conducting research on materials for advanced, accident tolerant fuels and cladding for application in operating LWRs. To aid in this assessment, a silicon carbide (SiC) version of the MELCOR code was developed by substituting SiC in place of Zircaloy in MELCOR’s reactor core oxidation and material property routines. The purpose of this development effort is to provide a numerical capability for estimating the safety advantages of replacing Zr-alloy components in LWRs with SiC components. This modified version of the MELCOR code was applied to the Three Mile Island (TMI-2) plant accident. While the results are considered preliminary, SiC cladding showed a dramatic safety advantage over Zircaloy cladding during this accident.

  6. Probabilistic accident consequence uncertainty analysis: Food chain uncertainty assessment. Volume 1: Main report

    SciTech Connect

    Brown, J.; Goossens, L.H.J.; Kraan, B.C.P.

    1997-06-01

    This volume is the first of a two-volume document that summarizes a joint project conducted by the US Nuclear Regulatory Commission and the European Commission to assess uncertainties in the MACCS and COSYMA probabilistic accident consequence codes. These codes were developed primarily for estimating the risks presented by nuclear reactors based on postulated frequencies and magnitudes of potential accidents. This document reports on an ongoing project to assess uncertainty in the MACCS and COSYMA calculations for the offsite consequences of radionuclide releases by hypothetical nuclear power plant accidents. A panel of sixteen experts was formed to compile credible and traceable uncertainty distributions for food chain variables that affect calculations of offsite consequences. The expert judgment elicitation procedure and its outcomes are described in these volumes. Other panels were formed to consider uncertainty in other aspects of the codes. Their results are described in companion reports. Volume 1 contains background information and a complete description of the joint consequence uncertainty study. Volume 2 contains appendices that include (1) a summary of the MACCS and COSYMA consequence codes, (2) the elicitation questionnaires and case structures for both panels, (3) the rationales and results for the panels on soil and plant transfer and animal transfer, (4) short biographies of the experts, and (5) the aggregated results of their responses.

  7. Breath Analysis in Disease Diagnosis: Methodological Considerations and Applications

    PubMed Central

    Lourenço, Célia; Turner, Claire

    2014-01-01

    Breath analysis is a promising field with great potential for non-invasive diagnosis of a number of disease states. The concentrations of volatile organic compounds (VOCs) in breath are assessed with acceptable accuracy by means of analytical techniques offering high sensitivity, accuracy, and precision, low response time, and low detection limit, characteristics that are desirable for the detection of VOCs in human breath. "Breath fingerprinting", indicative of a specific clinical status, relies on the use of multivariate statistical methods with powerful in-built algorithms. The need for standardisation of sample collection and analysis is the main issue concerning breath analysis, blocking the introduction of breath tests into clinical practice. This review describes recent scientific developments in basic research and clinical applications, namely issues concerning sampling and biochemistry, highlighting the diagnostic potential of breath analysis for disease diagnosis. Several considerations that need to be taken into account in breath analysis are documented here, including the growing need for metabolomics to deal with breath profiles. PMID:24957037

  8. An analysis of thermionic space nuclear reactor power system: I. Effect of disassembling radial reflector, following a reactivity initiated accident

    SciTech Connect

    El-Genk, M.S.; Paramonov, D.

    1993-01-10

    An analysis is performed to determine the effect of disassembling the radial reflector of the TOPAZ-II reactor following a hypothetical severe Reactivity Initiated Accident (RIA). Such an RIA is assumed to occur during system start-up in orbit due to a malfunction of the drive mechanism of the control drums, causing the drums to rotate the full 180° outward at their maximum speed of 1.4°/s. Results indicate that disassembling only three of the twelve radial reflector panels would successfully shut down the reactor, with little overheating of the fuel and the moderator.

  9. Principal component analysis based methodology to distinguish protein SERS spectra

    NASA Astrophysics Data System (ADS)

    Das, G.; Gentile, F.; Coluccio, M. L.; Perri, A. M.; Nicastri, A.; Mecarini, F.; Cojoc, G.; Candeloro, P.; Liberale, C.; De Angelis, F.; Di Fabrizio, E.

    2011-05-01

    Surface-enhanced Raman scattering (SERS) substrates were fabricated using electro-plating and e-beam lithography techniques. Nano-structures were obtained comprising regular arrays of gold nanoaggregates with a diameter of 80 nm and a mutual distance between the aggregates (gap) ranging from 10 to 30 nm. The nanopatterned SERS substrate enabled better control and reproducibility in the generation of plasmon polaritons (PPs). SERS measurements were performed for various proteins, namely bovine serum albumin (BSA), myoglobin, ferritin, lysozyme, RNase-B, α-casein, α-lactalbumin and trypsin. Principal component analysis (PCA) was used to organize and classify the proteins on the basis of their secondary structure. Cluster analysis showed that the classification error was about 14%. It is clearly shown that the combined use of SERS measurements and PCA is effective in categorizing the proteins on the basis of secondary structure.
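
    The following sketch reproduces the analysis pattern (project spectra onto leading principal components, then cluster) on synthetic spectra; the data are stand-ins for baseline-corrected SERS spectra, not measurements.

      # Sketch of PCA-based classification of spectra: project onto leading
      # principal components, cluster, and report the classification error.
      # The spectra are synthetic.

      import numpy as np
      from sklearn.decomposition import PCA
      from sklearn.cluster import KMeans

      rng = np.random.default_rng(3)
      n_spectra, n_wavenumbers = 40, 600
      # Two synthetic spectral classes with slightly different band positions
      base_a = np.exp(-((np.arange(n_wavenumbers) - 200) / 25.0) ** 2)
      base_b = np.exp(-((np.arange(n_wavenumbers) - 350) / 25.0) ** 2)
      spectra = np.vstack([
          base_a + 0.05 * rng.standard_normal((n_spectra // 2, n_wavenumbers)),
          base_b + 0.05 * rng.standard_normal((n_spectra // 2, n_wavenumbers)),
      ])

      scores = PCA(n_components=3).fit_transform(spectra)
      labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(scores)

      true = np.array([0] * (n_spectra // 2) + [1] * (n_spectra // 2))
      error = min(np.mean(labels != true), np.mean(labels == true))  # label-permutation safe
      print(f"classification error from clustering on PCA scores: {error:.1%}")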

  10. The Energy Interaction Model: A promising new methodology for projecting GPHS-RTG cladding failures, release amounts & respirable release fractions for postulated pre-launch, launch, and post-reentry earth impact accidents

    NASA Astrophysics Data System (ADS)

    Coleman, James R.; Sholtis, Joseph A.; McCulloch, William H.

    1998-01-01

    Safety analyses and evaluations must be scrutable, defensible, and credible. This is particularly true when nuclear systems are involved, with their attendant potential for releases of radioactive materials (source terms) to the unrestricted environment. Analytical projections of General Purpose Heat Source Radioisotope Thermoelectric Generator (GPHS-RTG) source terms, for safety analyses conducted to date, have relied upon generic data correlations using a single parameter of cladding damage, termed "distortion." However, distortion is not an unequivocal measure of cladding insult, failure, or release. Furthermore, the analytical foundation, applicability, and broad use of distortion are debatable and, thus, somewhat problematic. In an attempt to avoid the complications associated with the use of distortion, a new methodology, referred to as the Energy Interaction Model (EIM), has been preliminarily developed. This new methodology is based upon the physical principles of energy and energy exchange during mechanical interactions. Specifically, the EIM considers the energy imparted to GPHS-RTG components (bare fueled clads, GPHS modules, and full GPHS-RTGs) when exposed to mechanical threats (blast/overpressure, shrapnel and fragment impacts, and Earth surface impacts) posed by the full range of potential accidents. Expected forms are developed for equations intended to project cladding failure probabilities, the number of cladding failures expected, release amounts, and the fraction released as respirable particles. The coefficients of the equations developed are then set to fit the GPHS-RTG test data, ensuring good agreement with the experimental database. This assured, fitted agreement with the test database, along with the foundation of the EIM in first principles, provides confidence in the model's projections beyond the available database. In summary, the newly developed EIM methodology is described and discussed. The conclusions reached are that the EIM

  11. Important Literature in Endocrinology: Citation Analysis and Historical Methodology.

    ERIC Educational Resources Information Center

    Hurt, C. D.

    1982-01-01

    Results of a study comparing two approaches to the identification of important literature in endocrinology reveal that the association between the rankings of cited items produced by the two methods is not statistically significant and that the use of citation or historical analysis alone will not yield the same set of literature. Forty-two sources are appended. (EJS)

  12. The first steps towards a standardized methodology for CSP electricity yield analysis.

    SciTech Connect

    Wagner, Michael; Hirsch, Tobias (Institute of Technical Thermodynamics, Stuttgart, Germany); Benitez, Daniel; Eck, Markus (Institute of Technical Thermodynamics, Stuttgart, Germany); Ho, Clifford Kuofei

    2010-08-01

    The authors have formed a temporary international core team to prepare a SolarPACES activity aimed at the standardization of a methodology for electricity yield analysis of CSP plants. This core team has drafted a structural framework for a standardized methodology and for the standardization process itself. The structural framework must ensure that the standardized methodology is applicable to all conceivable CSP systems, can be used at all levels of the project development process, and covers all aspects affecting the electricity yield of CSP plants. Since the development of the standardized methodology is a complex task, the standardization process has been structured into work packages, and numerous international experts covering all aspects of CSP yield analysis have been asked to contribute to this process. These experts have teamed up in an international working group with the objective of developing, documenting and publishing standardized methodologies for CSP yield analysis. This paper summarizes the intended standardization process and presents the structural framework of the methodology for CSP yield analysis.

  13. Workshop on the use of PRA methodology for the analysis of reactor events and operational data: Proceedings

    SciTech Connect

    Rasmuson, D.M.; Dingman, S.

    1992-06-01

    A workshop entitled "The Use of PRA Methodology for the Analysis of Reactor Events and Operational Data" was held on January 29--30, 1992 in Annapolis, Maryland. Over 50 participants from the NRC, its contractors, and others participated in the meetings. During the first day, presentations were made by invited speakers to discuss issues in relevant topics. On the second day, discussion groups were held to focus on three areas: risk significance of operational events, industry risk profile and generic concerns, and risk monitoring and risk-based performance indicators. Important considerations identified from the workshop are the following: Improve the Accident Sequence Precursor models and data. Improve the SCSS and NPRDS (e.g., by adding detailed performance information on selected components, by improving narratives on failure causes). Develop risk-based performance indicators. Use risk insights to help focus trending and performance analyses of components, systems, initiators, and sequences. Improve the statistical quality of trending and performance analyses. Flag implications of special conditions (e.g., external events, containment performance) during data studies. Trend common cause and human performance using appropriate models to obtain a better understanding of the impact and causes of failure. Develop a method for producing an industry risk profile.

  14. Workshop on the use of PRA methodology for the analysis of reactor events and operational data: Proceedings

    SciTech Connect

    Rasmuson, D.M. (Div. of Safety Programs); Dingman, S.

    1992-06-01

    A workshop entitled "The Use of PRA Methodology for the Analysis of Reactor Events and Operational Data" was held on January 29--30, 1992 in Annapolis, Maryland. Over 50 participants from the NRC, its contractors, and others participated in the meetings. During the first day, presentations were made by invited speakers to discuss issues in relevant topics. On the second day, discussion groups were held to focus on three areas: risk significance of operational events, industry risk profile and generic concerns, and risk monitoring and risk-based performance indicators. Important considerations identified from the workshop are the following: Improve the Accident Sequence Precursor models and data. Improve the SCSS and NPRDS (e.g., by adding detailed performance information on selected components, by improving narratives on failure causes). Develop risk-based performance indicators. Use risk insights to help focus trending and performance analyses of components, systems, initiators, and sequences. Improve the statistical quality of trending and performance analyses. Flag implications of special conditions (e.g., external events, containment performance) during data studies. Trend common cause and human performance using appropriate models to obtain a better understanding of the impact and causes of failure. Develop a method for producing an industry risk profile.

  15. Improved finite element methodology for integrated thermal structural analysis

    NASA Technical Reports Server (NTRS)

    Dechaumphai, P.; Thornton, E. A.

    1982-01-01

    An integrated thermal-structural finite element approach for efficient coupling of thermal and structural analysis is presented. New thermal finite elements which yield exact nodal and element temperatures for one dimensional linear steady state heat transfer problems are developed. A nodeless variable formulation is used to establish improved thermal finite elements for one dimensional nonlinear transient and two dimensional linear transient heat transfer problems. The thermal finite elements provide detailed temperature distributions without using additional element nodes and permit a common discretization with lower order congruent structural finite elements. The accuracy of the integrated approach is evaluated by comparisons with analytical solutions and conventional finite element thermal structural analyses for a number of academic and more realistic problems. Results indicate that the approach provides a significant improvement in the accuracy and efficiency of thermal stress analysis for structures with complex temperature distributions.
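
    As a point of reference, the sketch below assembles the conventional two-node linear finite elements for one-dimensional steady-state heat conduction, the baseline case for which the paper's improved elements are exact at the nodes; the conductivity, geometry, and boundary temperatures are arbitrary example values, not taken from the paper.

```python
# Minimal sketch: conventional two-node linear finite elements for one-dimensional
# steady-state heat conduction with prescribed end temperatures. This is the
# textbook baseline formulation, not the exact/nodeless-variable elements of the
# paper, and the material and boundary values are arbitrary.
import numpy as np

k, L, n_el = 50.0, 1.0, 8            # conductivity [W/(m K)], length [m], elements
T_left, T_right = 300.0, 400.0       # prescribed end temperatures [K]
n_nodes = n_el + 1
h = L / n_el

K = np.zeros((n_nodes, n_nodes))
ke = (k / h) * np.array([[1.0, -1.0], [-1.0, 1.0]])   # element conductance matrix
for e in range(n_el):
    K[e:e + 2, e:e + 2] += ke        # assemble into the global matrix

# Impose the Dirichlet conditions by solving only for the interior nodes and
# moving the known boundary temperatures to the right-hand side.
free = np.arange(1, n_nodes - 1)
F = np.zeros(n_nodes)                # no internal heat generation
F[free] -= K[free, 0] * T_left + K[free, -1] * T_right
T = np.empty(n_nodes)
T[0], T[-1] = T_left, T_right
T[free] = np.linalg.solve(K[np.ix_(free, free)], F[free])
print(T)   # linear profile between 300 K and 400 K; nodal values are exact here
```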

  16. Statistical theory and methodology for remote sensing data analysis

    NASA Technical Reports Server (NTRS)

    Odell, P. L.

    1974-01-01

    A model is developed for the evaluation of acreages (proportions) of different crop types over a geographical area using a classification approach, and methods for estimating the crop acreages are given. In estimating the acreage of a specific crop type such as wheat, it is suggested that the problem be treated as a two-crop problem, wheat vs. non-wheat, since this simplifies the estimation problem considerably. The error analysis and the sample-size problem are investigated for the two-crop approach. Certain numerical results for sample sizes are given for a JSC-ERTS-1 data example on wheat identification performance in Hill County, Montana and Burke County, North Dakota. Lastly, for a large-area crop acreage inventory, a sampling scheme is suggested for acquiring sample data, and the problems of crop acreage estimation and error analysis are discussed.
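
    A minimal sketch of one common two-crop estimator follows: the observed fraction classified as wheat is corrected using known classification error rates. The sample size, classified fraction, sensitivity, and false-positive rate are hypothetical, and the variance expression ignores uncertainty in the error rates themselves; it illustrates the general idea, not the report's specific estimator.

```python
# Sketch of a two-crop (wheat vs. non-wheat) proportion estimate corrected for
# classification error. The sample size, classified fraction, and error rates are
# hypothetical, and the variance ignores uncertainty in the error rates themselves;
# this shows the general idea only, not the report's estimator.
n = 5000          # classified sample units (e.g., pixels or segments)
q_hat = 0.38      # fraction classified as wheat
sens = 0.90       # P(classified wheat | truly wheat), assumed known from training fields
fpr = 0.08        # P(classified wheat | truly non-wheat)

# E[q] = theta * sens + (1 - theta) * fpr   =>   theta = (q - fpr) / (sens - fpr)
theta_hat = (q_hat - fpr) / (sens - fpr)

var_q = q_hat * (1.0 - q_hat) / n
se_theta = var_q ** 0.5 / (sens - fpr)
print(f"estimated wheat proportion: {theta_hat:.3f} +/- {1.96 * se_theta:.3f}")
```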

  17. Finite element methodology for transient conduction/forced-convection thermal analysis

    NASA Technical Reports Server (NTRS)

    Thornton, E. A.; Wieting, A. R.

    1979-01-01

    Finite element methodology for steady state thermal analysis of convectively cooled structures has been extended for transient analysis. The finite elements are based on representing the fluid passages by fluid bulk-temperature nodes and fluid-solid interface nodes. The formulation of the finite element equations for a typical flow passage is based on the weighted residual method with upwind weighting functions. Computer implementation of the convective finite element methodology using explicit and implicit time integration algorithms is described. Accuracy and efficiency of the methodology are evaluated by comparisons with analytical solutions and finite-difference lumped-parameter analyses. The comparative analyses demonstrate that the finite element conduction/convection methodology may be used to predict transient temperatures with an accuracy equal or superior to that of the lumped-parameter finite-difference method.

  18. A Property-Driven Methodology for Formal Analysis of Synthetic Biology Systems.

    PubMed

    Konur, Savas; Gheorghe, Marian

    2015-01-01

    This paper proposes a formal methodology to analyse bio-systems, in particular synthetic biology systems. An integrative analysis perspective combining different model checking approaches based on different property categories is provided. The methodology is applied to the synthetic pulse generator system and several verification experiments are carried out to demonstrate the use of our approach to formally analyse various aspects of synthetic biology systems.

  19. Analysis of Japanese Radionuclide Monitoring Data of Food Before and After the Fukushima Nuclear Accident

    PubMed Central

    2015-01-01

    In an unprecedented food monitoring campaign for radionuclides, the Japanese government took action to secure food safety after the Fukushima nuclear accident (Mar. 11, 2011). In this work we analyze a part of the immense data set, in particular radiocesium contaminations in food from the first year after the accident. Activity concentrations in vegetables peaked immediately after the campaign had commenced, but they decreased quickly, so that by early summer 2011 only a few samples exceeded the regulatory limits. Later, mushrooms (which accumulate radiocesium) and dried produce led to several exceedances of the limits again. Monitoring of meat started with significant delay, especially outside Fukushima prefecture. After a buildup period, contamination levels of meat peaked by July 2011 (beef). Levels then decreased quickly, but peaked again in September 2011, which was primarily due to boar meat (a known accumulator of radiocesium). Tap water was less contaminated; any restrictions for tap water were canceled by April 1, 2011. Pre-Fukushima 137Cs and 90Sr levels (resulting from atmospheric nuclear explosions) in food were typically lower than 0.5 Bq/kg, with meat typically higher in 137Cs and vegetarian produce usually higher in 90Sr. The correlation of background radiostrontium and radiocesium indicated that the regulatory assumption after the Fukushima accident of a maximum activity of 90Sr being 10% of the respective 137Cs concentrations may soon be at risk, as the 90Sr/137Cs ratio increases with time. This should be taken into account in current Japanese food policy, as the existing regulation will soon underestimate the 90Sr content of Japanese foods. PMID:25621976
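
    The growth of the 90Sr/137Cs ratio can be illustrated with effective half-lives that combine physical decay with removal from the food chain; the sketch below uses standard physical half-lives but purely hypothetical ecological half-lives and an assumed initial ratio, so the numbers are illustrative only.

```python
# Sketch of why the 90Sr/137Cs ratio in food can grow with time: what matters is the
# *effective* half-life (physical decay combined with ecological removal from the
# food chain), and radiocesium is typically removed from foodstuffs faster than
# radiostrontium. Physical half-lives are standard values; the ecological half-lives
# and the initial ratio below are hypothetical, illustrative assumptions.
import numpy as np

LN2 = np.log(2.0)
t_phys = {"Cs137": 30.1, "Sr90": 28.8}   # years, physical half-lives
t_eco = {"Cs137": 4.0, "Sr90": 15.0}     # years, HYPOTHETICAL ecological half-lives in food

def effective_lambda(nuclide):
    """Effective removal constant: physical decay plus ecological loss from food."""
    return LN2 / t_phys[nuclide] + LN2 / t_eco[nuclide]

ratio0 = 0.10                            # assumed initial 90Sr/137Cs activity ratio
dlam = effective_lambda("Sr90") - effective_lambda("Cs137")   # negative here
for t in range(0, 21, 5):                # years after deposition
    print(f"t = {t:2d} y   90Sr/137Cs = {ratio0 * np.exp(-dlam * t):.3f}")
```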

  20. Recursive modeling of loss of control in human and organizational processes: a systemic model for accident analysis.

    PubMed

    Kontogiannis, Tom; Malakis, Stathis

    2012-09-01

    A recursive model of accident investigation is proposed by exploiting earlier work in systems thinking. Safety analysts can understand better the underlying causes of decision or action flaws by probing into the patterns of breakdown in the organization of safety. For this deeper analysis, a cybernetic model of organizational factors and a control model of human processes have been integrated in this article (i.e., the viable system model and the extended control model). The joint VSM-ECOM framework has been applied to a case study to help safety practitioners with the analysis of patterns of breakdown with regard to how operators and organizations manage goal conflicts, monitor work progress, recognize weak signals, align goals across teams, and adapt plans on the fly. The recursive accident representation brings together several organizational issues (e.g., the dilemma of autonomy versus compliance, or the interaction between structure and strategy) and addresses how operators adapt to challenges in their environment by adjusting their modes of functioning and recovery. Finally, it facilitates the transfer of knowledge from diverse incidents and near misses within similar domains of practice.

  1. SACO-1: a fast-running LMFBR accident-analysis code

    SciTech Connect

    Mueller, C.J.; Cahalan, J.E.; Vaurio, J.K.

    1980-01-01

    SACO is a fast-running computer code that simulates hypothetical accidents in liquid-metal fast breeder reactors to the point of permanent subcriticality or to the initiation of a prompt-critical excursion. In the tradition of the SAS codes, each subassembly is modeled by a representative fuel pin with three distinct axial regions to simulate the blanket and core regions. However, analytic and integral models are used wherever possible to reduce computing time and storage requirements. The physical models and basic equations are described in detail. Comparisons of SACO results with analogous SAS3D results serve to qualify the code and are illustrated and discussed.

  2. Resolve! Version 2.5: Flammable Gas Accident Analysis Tool Acceptance Test Plan and Test Results

    SciTech Connect

    LAVENDER, J.C.

    2000-10-17

    RESOLVE! Version 2.5 is designed to quantify the risk and uncertainty of combustion accidents in double-shell tanks (DSTs) and single-shell tanks (SSTs). The purpose of the acceptance testing is to ensure that all of the options and features of the computer code run; to verify that the calculated results are consistent with each other; and to evaluate the effects of the changes to the parameter values on the frequency and consequence trends associated with flammable gas deflagrations or detonations.

  3. Analysis of a complex recreational scuba diving accident: French Pass, New Zealand, 2000.

    PubMed

    McGeoch, Graham; Davis, F Michael

    2009-03-01

    In March 2000, six students and an instructor dived using open-circuit scuba in a narrow pass and were swept by a strong current to a depth of 90 metres' sea water. Three died and four were injured, which makes the incident the worst diving accident in New Zealand history. The group was on an officially-sanctioned course with many factors contributing to the final tragic events. The dive is described and the medical response examined. The legal consequences are reported and their implications for diver training and employment are discussed.

  4. Analysis of a small break loss-of-coolant accident of pressurized water reactor by APROS

    SciTech Connect

    Al-Falahi, A.; Haennine, M.; Porkholm, K.

    1995-09-01

    The purpose of this paper is to study the capability of the APROS (Advanced PROcess Simulator) code to simulate the real-plant thermal-hydraulic transient of a Small Break Loss-Of-Coolant Accident (SBLOCA) in the Loss-Of-Fluid Test (LOFT) facility. The LOFT is a scaled model of a Pressurized Water Reactor (PWR). This work is a part of a larger validation of the APROS thermal-hydraulic models. The results of the SBLOCA transient calculated by APROS showed reasonable agreement with the measured data.

  5. Analysis and Design of Fuselage Structures Including Residual Strength Prediction Methodology

    NASA Technical Reports Server (NTRS)

    Knight, Norman F.

    1998-01-01

    The goal of this research project is to develop and assess methodologies for the design and analysis of fuselage structures accounting for residual strength. Two primary objectives are included in this research activity: development of structural analysis methodology for predicting residual strength of fuselage shell-type structures; and the development of accurate, efficient analysis, design and optimization tools for fuselage shell structures. Assessment of these tools for robustness, efficiency, and usability in a fuselage shell design environment will be integrated with these two primary research objectives.

  6. Setting events in applied behavior analysis: Toward a conceptual and methodological expansion

    PubMed Central

    Wahler, Robert G.; Fox, James J.

    1981-01-01

    The contributions of applied behavior analysis as a natural science approach to the study of human behavior are acknowledged. However, it is also argued that applied behavior analysis has provided limited access to the full range of environmental events that influence socially significant behavior. Recent changes in applied behavior analysis to include analysis of side effects and social validation represent ways in which the traditional applied behavior analysis conceptual and methodological model has been profitably expanded. A third area of expansion, the analysis of setting events, is proposed by the authors. The historical development of setting events as a behavior influence concept is traced. Modifications of the basic applied behavior analysis methodology and conceptual systems that seem necessary to setting event analysis are discussed and examples of descriptive and experimental setting event analyses are presented. PMID:16795646

  7. Distinguishing neglect from abuse and accident: analysis of the case files of a hospital child protection team in Israel.

    PubMed

    Davidson-Arad, Bilha; Benbenishty, Rami; Chen, Wendy; Glasser, Saralee; Zur, Shmuel; Lerner-Geva, Liat

    2010-11-01

    The study compares the characteristics of children assessed as neglected, physically abused, or accident victims by a hospital child protection team (CPT) and identifies the information on which the CPT based its assessments. The comparison is based on content analysis of records of 414 children examined by the CPT in a major hospital in Israel between 1991 and 2006, of whom 130 (31.4%) were neglected, 54 (13.0%) were physically abused, and 230 (55.6%) were accident victims. Findings of three hierarchical logistic regressions show that the children classified as neglected had the most early development problems, but were the least likely to have received psychological treatment, and that their families were the most likely to be receiving state financial support and to have had prior contact with the social services. They also show that the CPT had received the least information indicative of maltreatment about these children from the community and that their medical and physical examinations aroused the least suspicion. Finally, they show that the impressions the hospital staff and CPT had of the parents during the hospital visit had greater power to distinguish between the groups than the children's characteristics or the parents' socio-demographic background. The findings attest to the ability of the CPT to differentiate between neglect victims and physical abuse or accident victims. At the same time, they also point to ambiguities in the classification process that should be addressed by further research and training and to the need for detailed and thorough documentation of the information and observations on which the CPT's assessments are based.

  8. UNDERSTANDING FLOW OF ENERGY IN BUILDINGS USING MODAL ANALYSIS METHODOLOGY

    SciTech Connect

    John Gardner; Kevin Heglund; Kevin Van Den Wymelenberg; Craig Rieger

    2013-07-01

    It is widely understood that energy storage is the key to integrating variable generators into the grid. It has been proposed that the thermal mass of buildings could be used as a distributed energy storage solution and several researchers are making headway in this problem. However, the inability to easily determine the magnitude of the building’s effective thermal mass, and how the heating ventilation and air conditioning (HVAC) system exchanges thermal energy with it, is a significant challenge to designing systems which utilize this storage mechanism. In this paper we adapt modal analysis methods used in mechanical structures to identify the primary modes of energy transfer among thermal masses in a building. The paper describes the technique using data from an idealized building model. The approach is successfully applied to actual temperature data from a commercial building in downtown Boise, Idaho.
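
    A minimal sketch of the underlying idea follows: build the state matrix of a small lumped-capacitance (RC) thermal network and interpret its eigenvalues and eigenvectors as thermal modes with characteristic time constants. The three-node network and all parameter values are invented and are not taken from the Boise building study.

```python
# Minimal sketch of modal analysis applied to a lumped thermal model: assemble the
# state matrix A of a small RC network (C dT/dt = G T, i.e. dT/dt = A T) and read
# its eigenvalues/eigenvectors as thermal "modes" with characteristic time constants.
# The three-node network and every parameter value are invented for illustration and
# are not taken from the building study described above.
import numpy as np

# Thermal capacitances [J/K]: room air, light mass (furnishings), heavy mass (structure).
C = np.array([5.0e5, 2.0e6, 2.0e7])
# Conductances [W/K] between node pairs: air-light, air-heavy, light-heavy.
U_al, U_ah, U_lh = 500.0, 800.0, 50.0

G = np.array([
    [-(U_al + U_ah),  U_al,            U_ah          ],
    [  U_al,         -(U_al + U_lh),   U_lh          ],
    [  U_ah,           U_lh,          -(U_ah + U_lh) ],
])
A = G / C[:, None]                       # dT/dt = A T (no external driving term shown)

eigvals, eigvecs = np.linalg.eig(A)
eigvals, eigvecs = eigvals.real, eigvecs.real   # this coupled RC system has real modes
for i in np.argsort(eigvals):                   # fastest (most negative) mode first
    if abs(eigvals[i]) < 1e-12:
        print("zero mode (uniform temperature offset), shape:", np.round(eigvecs[:, i], 3))
    else:
        tau_h = -1.0 / eigvals[i] / 3600.0
        print(f"mode time constant: {tau_h:8.2f} h, shape: {np.round(eigvecs[:, i], 3)}")
```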

  9. Improved finite element methodology for integrated thermal structural analysis

    NASA Technical Reports Server (NTRS)

    Dechaumphai, P.; Thornton, E. A.

    1982-01-01

    An integrated thermal-structural finite element approach for efficient coupling of thermal and structural analyses is presented. New thermal finite elements which yield exact nodal and element temperature for one dimensional linear steady state heat transfer problems are developed. A nodeless variable formulation is used to establish improved thermal finite elements for one dimensional nonlinear transient and two dimensional linear transient heat transfer problems. The thermal finite elements provide detailed temperature distributions without using additional element nodes and permit a common discretization with lower order congruent structural finite elements. The accuracy of the integrated approach is evaluated by comparisons with analytical solutions and conventional finite element thermal-structural analyses for a number of academic and more realistic problems. Results indicate that the approach provides a significant improvement in the accuracy and efficiency of thermal stress analysis for structures with complex temperature distributions.

  10. A faster reactor transient analysis methodology for PCs

    SciTech Connect

    Ott, K.O. (School of Nuclear Engineering)

    1991-10-01

    The simplified ANL model for LMR transient analysis, in which point kinetics as well as lumped descriptions of the heat transfer equations in all components are applied, is converted from a differential into an integral formulation. All differential balance equations are implicitly solved in terms of convolution integrals. The prompt jump approximation is applied, as the strong negative feedback effectively keeps the net reactivity well below prompt critical. After implicit finite differencing of the convolution integrals, the kinetics equation assumes the form of a quadratic equation, the "quadratic dynamics equation." This model forms the basis of LTC, a GW-BASIC program for LMR Transient Calculation that can effectively be run on a PC. The GW-BASIC version of the LTC program is described in detail in Volume 2 of this report.
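
    The sketch below illustrates only the prompt jump approximation itself, applied to the standard point-kinetics equations with one effective delayed-neutron group and a simple linear power feedback; it does not reproduce the integral (convolution) formulation or the quadratic dynamics equation of the LTC program, and every parameter value is hypothetical.

```python
# Sketch of the prompt jump approximation for point kinetics with one effective
# delayed-neutron group and a simple linear power feedback. It is a generic
# textbook illustration only: it does not reproduce LTC's integral (convolution)
# formulation or its quadratic dynamics equation, and every number is hypothetical.
beta, lam = 0.0035, 0.08       # delayed-neutron fraction; effective precursor decay [1/s]
alpha = -2.0e-3                # linear reactivity feedback per unit (normalized) power
P0 = 1.0                       # initial normalized power
dt, t_end = 0.01, 60.0

c = beta * P0 / lam            # precursor term in power units, equilibrium start
P = P0
for step in range(int(t_end / dt)):
    t = step * dt
    rho_ext = min(0.5 * beta, 0.02 * beta * t)   # slow external ramp, held below prompt critical
    rho = rho_ext + alpha * (P - P0)             # net reactivity including feedback
    P = lam * c / (beta - rho)                   # prompt jump: power follows the precursors algebraically
    c += dt * (beta * P - lam * c)               # explicit Euler step for the precursor balance
    if step % int(10.0 / dt) == 0:
        print(f"t = {t:5.1f} s   P/P0 = {P:.3f}   rho/beta = {rho / beta:.3f}")
```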

  11. The Murchison Widefield Array 21 cm Power Spectrum Analysis Methodology

    NASA Astrophysics Data System (ADS)

    Jacobs, Daniel C.; Hazelton, B. J.; Trott, C. M.; Dillon, Joshua S.; Pindor, B.; Sullivan, I. S.; Pober, J. C.; Barry, N.; Beardsley, A. P.; Bernardi, G.; Bowman, Judd D.; Briggs, F.; Cappallo, R. J.; Carroll, P.; Corey, B. E.; de Oliveira-Costa, A.; Emrich, D.; Ewall-Wice, A.; Feng, L.; Gaensler, B. M.; Goeke, R.; Greenhill, L. J.; Hewitt, J. N.; Hurley-Walker, N.; Johnston-Hollitt, M.; Kaplan, D. L.; Kasper, J. C.; Kim, HS; Kratzenberg, E.; Lenc, E.; Line, J.; Loeb, A.; Lonsdale, C. J.; Lynch, M. J.; McKinley, B.; McWhirter, S. R.; Mitchell, D. A.; Morales, M. F.; Morgan, E.; Neben, A. R.; Thyagarajan, N.; Oberoi, D.; Offringa, A. R.; Ord, S. M.; Paul, S.; Prabu, T.; Procopio, P.; Riding, J.; Rogers, A. E. E.; Roshi, A.; Udaya Shankar, N.; Sethi, Shiv K.; Srivani, K. S.; Subrahmanyan, R.; Tegmark, M.; Tingay, S. J.; Waterson, M.; Wayth, R. B.; Webster, R. L.; Whitney, A. R.; Williams, A.; Williams, C. L.; Wu, C.; Wyithe, J. S. B.

    2016-07-01

    We present the 21 cm power spectrum analysis approach of the Murchison Widefield Array Epoch of Reionization project. In this paper, we compare the outputs of multiple pipelines for the purpose of validating statistical limits on cosmological hydrogen at redshifts between 6 and 12. Multiple independent data calibration and reduction pipelines are used to make power spectrum limits on a fiducial night of data. Comparing the outputs of imaging and power spectrum stages highlights differences in calibration, foreground subtraction, and power spectrum calculation. The power spectra found using these different methods span a space defined by the various tradeoffs between speed, accuracy, and systematic control. Lessons learned from comparing the pipelines range from the algorithmic to the prosaically mundane; all demonstrate the many pitfalls of neglecting reproducibility. We briefly discuss the way these different methods attempt to handle the question of evaluating a significant detection in the presence of foregrounds.

  12. Analysis methodology and recent results of the IGS network combination

    NASA Astrophysics Data System (ADS)

    Ferland, R.; Kouba, J.; Hutchison, D.

    2000-11-01

    A working group of the International GPS Service (IGS) was created to look after Reference Frame (RF) issues and contribute to the densification and improvement of the International Terrestrial Reference Frame (ITRF). One important objective of the Reference Frame Working Group is to generate consistent IGS station coordinates and velocities, Earth Rotation Parameters (ERP) and geocenter estimates along with the appropriate covariance information. These parameters have a direct impact on other IGS products such as the estimation of GPS satellite ephemerides, as well as satellite and station clocks. The information required is available weekly from the Analysis Centers (AC) (cod, emr, esa, gfz, jpl, ngs, sio) and from the Global Network Associate Analysis Centers (GNAAC) (JPL, mit, ncl) using a "Software Independent Exchange Format" (SINEX). The AC are also contributing daily ERPs as part of their weekly submission. The procedure in place simultaneously combines the weekly station coordinates, geocenter and daily ERP estimates. A cumulative solution containing station coordinates and velocity is also updated with each weekly combination. This provides a convenient way to closely monitor the quality of the estimated station coordinates and to have an up to date cumulative solution available at all times. To provide some necessary redundancy, the weekly station coordinates solution is compared against the GNAAC solutions. Each of the 3 GNAAC uses its own software, allowing independent verification of the combination process. The RMS of the coordinate differences in the north, east and up components between the AC/GNAAC and the ITRF97 Reference Frame Stations are 4-10 mm, 5-20 mm and 6-25 mm. The station velocities within continental plates are compared to the NNR-NUVEL1A plate motion model (DeMets et al., 1994). The north, east and up velocity RMS are 2 mm/y, 3 mm/y and 8 mm/y. Note that NNR-NUVEL1A assumes a zero vertical velocity.

  13. Probabilistic methodology for estimation of undiscovered petroleum resources in play analysis of the United States

    USGS Publications Warehouse

    Crovelli, R.A.

    1992-01-01

    A geostochastic system called FASPF was developed by the U.S. Geological Survey for their 1989 assessment of undiscovered petroleum resources in the United States. FASPF is a fast appraisal system for petroleum play analysis using a field-size geological model and an analytic probabilistic methodology. The geological model is a particular type of probability model whereby the volumes of oil and gas accumulations are modeled as statistical distributions in the form of probability histograms, and the risk structure is bilevel (play and accumulation) in terms of conditional probability. The probabilistic methodology is an analytic method derived from probability theory rather than Monte Carlo simulation. The resource estimates of crude oil and natural gas are calculated and expressed in terms of probability distributions. The probabilistic methodology developed by the author is explained. The analytic system resulted in a probabilistic methodology for play analysis, subplay analysis, economic analysis, and aggregation analysis. Subplay analysis included the estimation of petroleum resources on non-Federal offshore areas. Economic analysis involved the truncation of the field size with a minimum economic cutoff value. Aggregation analysis was needed to aggregate individual play and subplay estimates of oil and gas, respectively, at the provincial, regional, and national levels. ?? 1992 Oxford University Press.

  14. A Systematic Review of Brief Functional Analysis Methodology with Typically Developing Children

    ERIC Educational Resources Information Center

    Gardner, Andrew W.; Spencer, Trina D.; Boelter, Eric W.; DuBard, Melanie; Jennett, Heather K.

    2012-01-01

    Brief functional analysis (BFA) is an abbreviated assessment methodology derived from traditional extended functional analysis methods. BFAs are often conducted when time constraints in clinics, schools or homes are of concern. While BFAs have been used extensively to identify the function of problem behavior for children with disabilities, their…

  15. Source term and radiological consequences of the Chernobyl accident

    SciTech Connect

    Mourad, R.; Snell, V.

    1987-01-01

    The objective of this work is to assess the source term and to evaluate the maximum hypothetical individual doses in European countries (including the Soviet Union) from the Chernobyl accident through the analyses of measurements of meteorological data, radiation fields, and airborne and deposited activity in these countries. Applying this information to deduce the source term involves a reversal of the techniques of nuclear accident analysis, which estimate the off-site consequences of postulated accidents. In this study the authors predict the quantities of radionuclides that, if released at Chernobyl and following the calculated trajectories, would explain and unify the observed radiation levels and radionuclide concentrations as measured by European countries and the Soviet Union. The simulation uses the PEAR microcomputer program following the methodology described in Canadian Standards Association standard N288.2. The study was performed before the Soviets published their estimate of the source term and the two results are compared.

  16. A Discrepancy-Based Methodology for Nuclear Training Program Evaluation.

    ERIC Educational Resources Information Center

    Cantor, Jeffrey A.

    1991-01-01

    A three-phase comprehensive process for commercial nuclear power training program evaluation is presented. The discrepancy-based methodology was developed after the Three Mile Island nuclear reactor accident. It facilitates analysis of program components to identify discrepancies among program specifications, actual outcomes, and industry…

  17. Methodology for Sensitivity Analysis, Approximate Analysis, and Design Optimization in CFD for Multidisciplinary Applications

    NASA Technical Reports Server (NTRS)

    Taylor, Arthur C., III; Hou, Gene W.

    1996-01-01

    An incremental iterative formulation, together with the well-known spatially split approximate-factorization algorithm, is presented for solving the large, sparse systems of linear equations that are associated with aerodynamic sensitivity analysis. This formulation is also known as the 'delta' or 'correction' form. For the smaller two-dimensional problems, a direct method can be applied to solve these linear equations in either the standard or the incremental form, in which case the two are equivalent. However, iterative methods are needed for larger two-dimensional and three-dimensional applications because direct methods require more computer memory than is currently available. Iterative methods for solving these equations in the standard form are generally unsatisfactory due to an ill-conditioned coefficient matrix; this problem is overcome when these equations are cast in the incremental form. The methodology is successfully implemented and tested using an upwind cell-centered finite-volume formulation applied in two dimensions to the thin-layer Navier-Stokes equations for external flow over an airfoil. In three dimensions this methodology is demonstrated with a marching-solution algorithm for the Euler equations to calculate supersonic flow over the High-Speed Civil Transport configuration (HSCT 24E). The sensitivity derivatives obtained with the incremental iterative method from a marching Euler code are used in a design-improvement study of the HSCT configuration that involves thickness, camber, and planform design variables.
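
    A minimal sketch of the incremental ("delta" or "correction") form for a generic linear system follows; a lower-triangular operator stands in for the spatially split approximate factorization, and the test matrix is synthetic, so it shows the iteration structure only, not the aerodynamic sensitivity equations themselves.

```python
# Minimal sketch of the incremental ("delta" or "correction") form for a generic
# linear system A x = b: each sweep solves an approximate operator M for a
# correction driven by the current residual. Here M is just the lower-triangular
# part of a synthetic diagonally dominant matrix, standing in for the spatially
# split approximate factorization; nothing below is the aerodynamic system itself.
import numpy as np

rng = np.random.default_rng(1)
n = 200
A = 0.1 * rng.standard_normal((n, n))
A += np.diag(np.abs(A).sum(axis=1) + 1.0)   # make the matrix strictly diagonally dominant
b = rng.standard_normal(n)

M = np.tril(A)                              # approximate operator (factorization surrogate)
x = np.zeros(n)
for it in range(200):
    r = b - A @ x                           # residual of the standard form
    if np.linalg.norm(r) < 1e-10 * np.linalg.norm(b):
        break
    dx = np.linalg.solve(M, r)              # incremental form: M dx = r
    x += dx                                 # apply the correction
print(f"stopped after {it + 1} sweeps, residual norm {np.linalg.norm(b - A @ x):.2e}")
```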

  18. Anthropological analysis of taekwondo--new methodological approach.

    PubMed

    Cular, Drazen; Munivrana, Goran; Katić, Ratko

    2013-05-01

    The aim of this research is to determine the order and importance of impacts of particular anthropological characteristics and technical and tactical competence on success in taekwondo according to opinions of top taekwondo instructors (experts). Partial objectives include analysis of metric characteristics of the measuring instrument, and determining differences between two disciplines (sparring and technical discipline of patterns) and two competition systems (WTF and ITF). In accordance with the aims, the research was conducted on a sample of respondents which consisted of 730 taekwondo instructors from 6 continents and from 69 countries (from which we selected 242 instructors), who are at different success levels in both taekwondo competition systems (styles) and two taekwondo disciplines. The respondents were divided into 3 qualitative subsamples (OST-USP-VRH) using the dependent variable of accomplished results of the instructor. In 6 languages, they electronically evaluated the impact in percentage value (%) of motor and functional skills (MOTFS), morphological characteristics (MORF), psychological profile of an athlete (PSIH), athletic intelligence (INTE) and technical and tactical competence (TE-TA) on success in taekwondo. The analysis of metric characteristics of the constructed instrument showed a satisfactory degree of agreement (IHr) which is proportional to the level of respondent quality, i.e. it grows along with the increase in instructor quality in all analysed disciplines of both systems. Top instructors assigned the highest portion of impact on success to the motor and functional skills (MOTFS) variable (WTF-SPB=29.1, ITF-SPB=29.2, WTF-THN=35.0, ITF-THN=32.0). Statistically significant differences in opinions of instructors of different styles and disciplines were not recorded in any of the analysed variables. The only exception is the psychological profile of an athlete variable, which WTF instructors of sparring (AM=23.7%), on a significance

  19. Posttraumatic Stress Disorder: Diagnostic Data Analysis by Data Mining Methodology

    PubMed Central

    Marinić, Igor; Supek, Fran; Kovačić, Zrnka; Rukavina, Lea; Jendričko, Tihana; Kozarić-Kovačić, Dragica

    2007-01-01

    Aim: To use data mining methods in assessing diagnostic symptoms in posttraumatic stress disorder (PTSD). Methods: The study included 102 inpatients: 51 with a diagnosis of PTSD and 51 with psychiatric diagnoses other than PTSD. Several models for predicting diagnosis were built using the random forest classifier, one of the intelligent data analysis methods. The first prediction model was based on a structured psychiatric interview, the second on psychiatric scales (Clinician-administered PTSD Scale – CAPS, Positive and Negative Syndrome Scale – PANSS, Hamilton Anxiety Scale – HAMA, and Hamilton Depression Scale – HAMD), and the third on combined data from both sources. Additional models placing more weight on one of the classes (PTSD or non-PTSD) were trained, and prototypes representing subgroups in the classes constructed. Results: The first model was the most relevant for distinguishing PTSD diagnosis from comorbid diagnoses such as neurotic, stress-related, and somatoform disorders. The second model pointed out the scores obtained on the Clinician-administered PTSD Scale (CAPS) and additional Positive and Negative Syndrome Scale (PANSS) scales, together with comorbid diagnoses of neurotic, stress-related, and somatoform disorders as most relevant. In the third model, psychiatric scales and the same group of comorbid diagnoses were found to be most relevant. Specialized models placing more weight on either the PTSD or non-PTSD class were able to better predict their targeted diagnoses at some expense of overall accuracy. Class subgroup prototypes mainly differed in values achieved on psychiatric scales and frequency of comorbid diagnoses. Conclusion: Our work demonstrated the applicability of data mining methods for the analysis of structured psychiatric data for PTSD. In all models, the group of comorbid diagnoses, including neurotic, stress-related, and somatoform disorders, surfaced as important. The important attributes of the data, based on the
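
    For readers unfamiliar with the classifier, the sketch below runs a random forest workflow on synthetic tabular data standing in for structured-interview items and psychiatric scale scores, including a class-weighted variant analogous to the specialized models described above; the data, feature counts, and weights are invented and no clinical values are reproduced.

```python
# Sketch of a random forest diagnosis model on synthetic tabular data standing in
# for structured-interview items and psychiatric scale scores. No clinical data are
# reproduced; the feature counts and class weights are invented, and the weighted
# variant only mirrors the idea of models that favour one class.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

# 102 synthetic "patients", 20 features, two classes (e.g. PTSD vs. non-PTSD).
X, y = make_classification(n_samples=102, n_features=20, n_informative=6,
                           weights=[0.5, 0.5], random_state=0)

rf = RandomForestClassifier(n_estimators=500, random_state=0)
acc = cross_val_score(rf, X, y, cv=5).mean()
print(f"balanced model cross-validated accuracy: {acc:.3f}")

# A specialized model placing more weight on class 1 (the hypothetical PTSD class).
rf_weighted = RandomForestClassifier(n_estimators=500, class_weight={0: 1, 1: 3},
                                     random_state=0)
rf_weighted.fit(X, y)
top = np.argsort(rf_weighted.feature_importances_)[::-1][:5]
print("most important (synthetic) features:", top)
```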

  20. Anthropological analysis of taekwondo--new methodological approach.

    PubMed

    Cular, Drazen; Munivrana, Goran; Katić, Ratko

    2013-05-01

    The aim of this research is to determine the order and importance of impacts of particular anthropological characteristics and technical and tactical competence on success in taekwondo according to opinions of top taekwondo instructors (experts). Partial objectives include analysis of metric characteristics of the measuring instrument, and determining differences between two disciplines (sparring and technical discipline of patterns) and two competition systems (WTF and ITF). In accordance with the aims, the research was conducted on a sample of respondents which consisted of 730 taekwondo instructors from 6 continents and from 69 countries (from which we selected 242 instructors), who are at different success levels in both taekwondo competition systems (styles) and two taekwondo disciplines. The respondents were divided into 3 qualitative subsamples (OST-USP-VRH) using the dependent variable of accomplished results of the instructor. In 6 languages, they electronically evaluated the impact in percentage value (%) of motor and functional skills (MOTFS), morphological characteristics (MORF), psychological profile of an athlete (PSIH), athletic intelligence (INTE) and technical and tactical competence (TE-TA) on success in taekwondo. The analysis of metric characteristics of the constructed instrument showed a satisfactory degree of agreement (IHr) which is proportional to the level of respondent quality, i.e. it grows along with the increase in instructor quality in all analysed disciplines of both systems. Top instructors assigned the highest portion of impact on success to the motor and functional skills (MOTFS) variable (WTF-SPB=29.1, ITF-SPB=29.2, WTF-THN=35.0, ITF-THN=32.0). Statistically significant differences in opinions of instructors of different styles and disciplines were not recorded in any of the analysed variables. The only exception is the psychological profile of an athlete variable, which WTF instructors of sparring (AM=23.7%), on a significance

  1. Analysis of injuries among pilots involved in fatal general aviation airplane accidents.

    PubMed

    Wiegmann, Douglas A; Taneja, Narinder

    2003-07-01

    The purpose of this study was to analyze patterns of injuries sustained by pilots involved in fatal general aviation (GA) airplane accidents. Detailed information on the pattern and nature of injuries was retrieved from the Federal Aviation Administration's autopsy database for pilots involved in fatal GA airplane accidents from 1996 to 1999. A review of 559 autopsies revealed that blunt trauma was the primary cause of death in 86.0% (N=481) of the autopsies. The most commonly occurring bony injuries were fracture of the ribs (72.3%), skull (55.1%), facial bones (49.4%), tibia (37.9%) and pelvis (36.0%). Common organ injuries included laceration of the liver (48.1%), lung (37.6%), heart (35.6%), and spleen (30.1%), and hemorrhage of the brain (33.3%) and lung (32.9%). A fractured larynx was observed in 14.7% of the cases, a finding that has not been reported in the literature until now. It was observed that individuals who sustained brain hemorrhage were also more likely to have fractures of the facial bones rather than skull fractures. PMID:12729820

  2. Analysis of an AP600 intermediate-size loss-of-coolant accident

    SciTech Connect

    Boyack, B.E.; Lime, J.F.

    1995-09-01

    A postulated double-ended guillotine break of an AP600 direct-vessel-injection line has been analyzed. This event is characterized as an intermediate-break loss-of-coolant accident. Most of the insights regarding the response of the AP600 safety systems to the postulated accident are derived from calculations performed with the TRAC-PF1/MOD2 code. However, complementary insights derived from a scaled experiment conducted in the ROSA facility, as well as insights based upon calculations by other codes, are also presented. Based upon the calculated and experimental results, the AP600 will not experience a core heat up and will reach a safe shutdown state using only safety-class equipment. Only the early part of the long-term cooling period initiated by In-containment Refueling Water Storage Tank injection was evaluated. Thus, the observation that the core is continuously cooled should be verified for the later phase of the long-term cooling period when sump injection and containment cooling processes are important.

  3. LeRoy Meisinger, Part II: Analysis of the Scientific Ballooning Accident of 2 June 1924.

    NASA Astrophysics Data System (ADS)

    Lewis, John M.; Moore, Charles B.

    1995-02-01

    During the spring of 1924, U.S. Weather Bureau meteorologist LeRoy Meisinger conducted a series of experiments with a free balloon to determine the trajectories of air around extratropical cyclones. The 10th flight in the series ended with a crash of the balloon over central Illinois. Both Meisinger and the pilot, Army Air Services Lt. James Neely, were killed. An effort has been made to reconstruct this accident using information from a review article by early twentieth-century meteorologist Vincent Jakl and newspaper accounts of the accident. The principal results of the study follow: 1) Meisinger's balloon was caught in the downdraft of a newly developed thunderstorm over the Bement, Illinois, area on the evening of 2 June; 2) a hard landing took place in a cornfield just north of Bement, and loss of ballast at the hard-landing site was sufficient to cause the balloon to rise again; and 3) after rebounding from the ground, the balloon with the two aeronauts aboard was struck by lightning. A fire resulted that burned through the netting and led to a crash four miles northeast of the hard-landing site.

  4. Qualitative urinary organic acid analysis: methodological approaches and performance.

    PubMed

    Peters, V; Garbade, S F; Langhans, C D; Hoffmann, G F; Pollitt, R J; Downing, M; Bonham, J R

    2008-12-01

    A programme for proficiency testing of biochemical genetics laboratories undertaking qualitative urinary organic acid analysis is described, and its results for 50 samples are examined for factors contributing to poor performance. Urine samples from patients in whom inherited metabolic disorders have been confirmed as well as control urines were circulated to participants and the results from 94 laboratories were evaluated. Laboratories showed variability both in terms of their individual performance and on a disease-specific basis. In general, conditions including methylmalonic aciduria, propionic aciduria, isovaleric aciduria, mevalonic aciduria, Canavan disease and 3-methylcrotonyl-CoA carboxylase deficiency were readily identified. Detection was poorer for other diseases such as glutaric aciduria type II, glyceric aciduria and, in one sample, 3-methylcrotonyl-CoA carboxylase deficiency. To identify the factors that allow some laboratories to perform well on a consistent basis while others perform badly, we devised a questionnaire and compared the responses with the results for performance in the scheme. A trend towards better performance could be demonstrated for those laboratories that regularly use internal quality control (QC) samples in their sample preparation (p = 0.079) and those that participate in further external quality assurance (EQA) schemes (p = 0.040). Clinicians who depend upon these diagnostic services to identify patients with these defects and the laboratories that provide them should be aware of the potential for missed diagnoses and the factors that may lead to improved performance.

  5. Acid rain research: a review and analysis of methodology

    SciTech Connect

    Irving, P.M.

    1983-01-01

    The acidic deposition phenomenon, when implicated as a factor potentially responsible for crop and forest yield losses and destruction of aquatic life, has gained increasing attention. The widespread fear that acid rain is having or may have devastating effects has prompted international debates and legislative proposals. An analysis of research on the effects of acid rain, however, reveals serious questions concerning the applicability and validity of the conclusions of much of the work; thus, conclusive estimates of impacts are lacking. In order to establish cause-effect relationships between rain acidity and the response of a receptor, controlled studies are necessary to verify observations in the field since there are many natural processes that produce and consume acidity and because numerous other environmental variables affect ecosystem response. Only when the response of an entire system is understood (i.e., interactions between plant, soil, soil microbes, and groundwater) can economic impacts be assessed and tolerance thresholds established for the wet deposition of acids. 14 references, 5 figures, 1 table.

  6. Methodological and computational considerations for multiple correlation analysis.

    PubMed

    Shieh, Gwowen; Kung, Chen-Feng

    2007-11-01

    The squared multiple correlation coefficient has been widely employed to assess the goodness-of-fit of linear regression models in many applications. Although there are numerous published sources that present inferential issues and computing algorithms for multinormal correlation models, the statistical procedure for testing substantive significance by specifying the nonzero-effect null hypothesis has received little attention. This article emphasizes the importance of determining whether the squared multiple correlation coefficient is small or large in comparison with some prescribed standard and develops corresponding Excel worksheets that facilitate the implementation of various aspects of the suggested significance tests. In view of the extensive accessibility of Microsoft Excel software and the ultimate convenience of general-purpose statistical packages, the associated computer routines for interval estimation, power calculation, and sample-size determination are also provided for completeness. The statistical methods and available programs of multiple correlation analysis described in this article purport to enhance pedagogical presentation in academic curricula and practical application in psychological research.
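
    As a companion illustration, the sketch below computes the conventional F test for a squared multiple correlation together with a Cohen-style power approximation based on the noncentral F distribution; the nonzero-effect null hypothesis emphasized in the article requires the exact distribution of R-squared and is not reproduced here, and the sample size, predictor count, and effect sizes are examples only.

```python
# Sketch of the conventional F test for a squared multiple correlation and a
# Cohen-style power approximation via the noncentral F distribution. The article's
# nonzero-effect null hypothesis requires the exact distribution of R-squared and
# is not reproduced here; sample size, predictor count, and effect sizes are examples.
from scipy import stats

n, k = 60, 4              # sample size, number of predictors
R2 = 0.20                 # observed squared multiple correlation

dfn, dfd = k, n - k - 1
F = (R2 / dfn) / ((1.0 - R2) / dfd)
p = stats.f.sf(F, dfn, dfd)
print(f"F({dfn},{dfd}) = {F:.2f}, p = {p:.4f}")

# Approximate power to detect a population rho^2 of 0.15 at alpha = 0.05.
rho2, alpha = 0.15, 0.05
nc = n * rho2 / (1.0 - rho2)              # noncentrality: lambda = n * f^2
f_crit = stats.f.ppf(1.0 - alpha, dfn, dfd)
power = stats.ncf.sf(f_crit, dfn, dfd, nc)
print(f"approximate power: {power:.3f}")
```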

  7. Landscape equivalency analysis: methodology for estimating spatially explicit biodiversity credits.

    PubMed

    Bruggeman, Douglas J; Jones, Michael L; Lupi, Frank; Scribner, Kim T

    2005-10-01

    We propose a biodiversity credit system for trading endangered species habitat designed to minimize and reverse the negative effects of habitat loss and fragmentation, the leading cause of species endangerment in the United States. Given the increasing demand for land, approaches that explicitly balance economic goals against conservation goals are required. The Endangered Species Act balances these conflicts based on the cost to replace habitat. Conservation banking is a means to manage this balance, and we argue for its use to mitigate the effects of habitat fragmentation. Mitigating the effects of land development on biodiversity requires decisions that recognize regional ecological effects resulting from local economic decisions. We propose Landscape Equivalency Analysis (LEA), a landscape-scale approach similar to habitat equivalency analysis (HEA), as an accounting system to calculate conservation banking credits so that habitat trades do not exacerbate regional ecological effects of local decisions. Credits purchased by public agencies or NGOs for purposes other than mitigating a take create a net investment in natural capital leading to habitat defragmentation. Credits calculated by LEA use metapopulation genetic theory to estimate sustainability criteria against which all trades are judged. The approach is rooted in well-accepted ecological, evolutionary, and economic theory, which helps compensate for the degree of uncertainty regarding the effects of habitat loss and fragmentation on endangered species. LEA requires application of greater scientific rigor than typically applied to endangered species management on private lands but provides an objective, conceptually sound basis for achieving the often conflicting goals of economic efficiency and long-term ecological sustainability. PMID:16132443

  8. Mathematical modelling methodologies in predictive food microbiology: a SWOT analysis.

    PubMed

    Ferrer, Jordi; Prats, Clara; López, Daniel; Vives-Rego, Josep

    2009-08-31

    Predictive microbiology is the area of food microbiology that attempts to forecast the quantitative evolution of microbial populations over time. This is achieved to a great extent through models that include the mechanisms governing population dynamics. Traditionally, the models used in predictive microbiology are whole-system continuous models that describe population dynamics by means of equations applied to extensive or averaged variables of the whole system. Many existing models can be classified by specific criteria. We can distinguish between survival and growth models by seeing whether they tackle mortality or cell duplication. We can distinguish between empirical (phenomenological) models, which mathematically describe specific behaviour, and theoretical (mechanistic) models with a biological basis, which search for the underlying mechanisms driving already observed phenomena. We can also distinguish between primary, secondary and tertiary models, by examining their treatment of the effects of external factors and constraints on the microbial community. Recently, the use of spatially explicit Individual-based Models (IbMs) has spread through predictive microbiology, due to the current technological capacity of performing measurements on single individual cells and thanks to the consolidation of computational modelling. Spatially explicit IbMs are bottom-up approaches to microbial communities that build bridges between the description of micro-organisms at the cell level and macroscopic observations at the population level. They provide greater insight into the mesoscale phenomena that link unicellular and population levels. Every model is built in response to a particular question and with different aims. Even so, in this research we conducted a SWOT (Strength, Weaknesses, Opportunities and Threats) analysis of the different approaches (population continuous modelling and Individual-based Modelling), which we hope will be helpful for current and future

  9. Mathematical modelling methodologies in predictive food microbiology: a SWOT analysis.

    PubMed

    Ferrer, Jordi; Prats, Clara; López, Daniel; Vives-Rego, Josep

    2009-08-31

    Predictive microbiology is the area of food microbiology that attempts to forecast the quantitative evolution of microbial populations over time. This is achieved to a great extent through models that include the mechanisms governing population dynamics. Traditionally, the models used in predictive microbiology are whole-system continuous models that describe population dynamics by means of equations applied to extensive or averaged variables of the whole system. Many existing models can be classified by specific criteria. We can distinguish between survival and growth models by seeing whether they tackle mortality or cell duplication. We can distinguish between empirical (phenomenological) models, which mathematically describe specific behaviour, and theoretical (mechanistic) models with a biological basis, which search for the underlying mechanisms driving already observed phenomena. We can also distinguish between primary, secondary and tertiary models, by examining their treatment of the effects of external factors and constraints on the microbial community. Recently, the use of spatially explicit Individual-based Models (IbMs) has spread through predictive microbiology, due to the current technological capacity of performing measurements on single individual cells and thanks to the consolidation of computational modelling. Spatially explicit IbMs are bottom-up approaches to microbial communities that build bridges between the description of micro-organisms at the cell level and macroscopic observations at the population level. They provide greater insight into the mesoscale phenomena that link unicellular and population levels. Every model is built in response to a particular question and with different aims. Even so, in this research we conducted a SWOT (Strength, Weaknesses, Opportunities and Threats) analysis of the different approaches (population continuous modelling and Individual-based Modelling), which we hope will be helpful for current and future

  10. An object-oriented approach to risk and reliability analysis : methodology and aviation safety applications.

    SciTech Connect

    Dandini, Vincent John; Duran, Felicia Angelica; Wyss, Gregory Dane

    2003-09-01

    This article describes how features of event tree analysis and Monte Carlo-based discrete event simulation can be combined with concepts from object-oriented analysis to develop a new risk assessment methodology, with some of the best features of each. The resultant object-based event scenario tree (OBEST) methodology enables an analyst to rapidly construct realistic models for scenarios for which an a priori discovery of event ordering is either cumbersome or impossible. Each scenario produced by OBEST is automatically associated with a likelihood estimate because probabilistic branching is integral to the object model definition. The OBEST methodology is then applied to an aviation safety problem that considers mechanisms by which an aircraft might become involved in a runway incursion incident. The resulting OBEST model demonstrates how a close link between human reliability analysis and probabilistic risk assessment methods can provide important insights into aviation safety phenomenology.
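
    A toy illustration of probabilistic branching over an aviation-style scenario is sketched below; it is not the OBEST software, and the runway-incursion events and branch probabilities are invented solely to show how each generated scenario carries its own likelihood estimate.

```python
# Toy Monte Carlo over probabilistic branch points, in the spirit of combining
# event-tree branching with discrete event simulation. This is not the OBEST code;
# the runway-incursion-style events and all branch probabilities are invented purely
# to show how each generated scenario carries its own likelihood estimate.
import random
from collections import Counter

def one_scenario(rng):
    events = []
    if rng.random() < 0.02:                  # aircraft enters the runway without clearance
        events.append("incursion")
        if rng.random() < 0.7:               # controller detects the conflict in time
            events.append("controller_alert")
            outcome = "go_around" if rng.random() < 0.9 else "near_miss"
        else:
            outcome = "near_miss" if rng.random() < 0.8 else "collision"
    else:
        outcome = "normal_operation"
    events.append(outcome)
    return tuple(events)

rng = random.Random(42)
runs = 200_000
counts = Counter(one_scenario(rng) for _ in range(runs))
for scenario, n in counts.most_common():
    print(f"{n / runs:10.6f}  {' -> '.join(scenario)}")
```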

  11. Thermal analysis of the 10-gallon and the 55-gallon DOT-6M containers with thermal boundary conditions corresponding to 10CFR71 normal transport and accident conditions

    SciTech Connect

    Sanchez, L.C.; Longenbaugh, R.S.; Moss, M.; Haseman, G.M.; Fowler, W.E.; Roth, E.P.

    1988-03-01

    This report describes the heat transfer analysis of the 10-gallon and 55-gallon 6M containers. The analysis was performed with boundary conditions corresponding to a normal transport condition and a hypothetical accident condition. Computational results indicated that the insulation material in the 6M containers will adequately protect the payload region of the 6M containers. 26 refs., 26 figs., 8 tabs.

  12. Health effects models for nuclear power plant accident consequence analysis. Part 1, Introduction, integration, and summary: Revision 2

    SciTech Connect

    Evans, J.S.; Abrahmson, S.; Bender, M.A.; Boecker, B.B.; Scott, B.R.; Gilbert, E.S.

    1993-10-01

    This report is a revision of NUREG/CR-4214, Rev. 1, Part 1 (1990), Health Effects Models for Nuclear Power Plant Accident Consequence Analysis. This revision has been made to incorporate changes to the Health Effects Models recommended in two addenda to the NUREG/CR-4214, Rev. 1, Part II, 1989 report. The first of these addenda provided recommended changes to the health effects models for low-LET radiations based on recent reports from UNSCEAR, ICRP and NAS/NRC (BEIR V). The second addendum presented changes needed to incorporate alpha-emitting radionuclides into the accident exposure source term. As in the earlier version of this report, models are provided for early and continuing effects, cancers and thyroid nodules, and genetic effects. Weibull dose-response functions are recommended for evaluating the risks of early and continuing health effects. Three potentially lethal early effects -- the hematopoietic, pulmonary, and gastrointestinal syndromes -- are considered. Linear and linear-quadratic models are recommended for estimating the risks of seven types of cancer in adults -- leukemia, bone, lung, breast, gastrointestinal, thyroid, and "other". For most cancers, both incidence and mortality are addressed. Five classes of genetic diseases -- dominant, x-linked, aneuploidy, unbalanced translocations, and multifactorial diseases -- are also considered. Data are provided that should enable analysts to consider the timing and severity of each type of health risk.
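
    As an illustration of the Weibull dose-response form recommended above for early and continuing effects, a minimal Python sketch follows. The function name and the D50 and shape values are placeholders chosen for illustration; they are not parameters taken from NUREG/CR-4214.

      # Hedged sketch of a Weibull dose-response ("hazard") function for an early
      # health effect: risk = 1 - exp(-ln2 * (D / D50)^shape). D50 is the dose at
      # which half of those exposed show the effect; values below are placeholders.
      import math

      def weibull_risk(dose_gy, d50_gy, shape):
          if dose_gy <= 0.0:
              return 0.0
          return 1.0 - math.exp(-math.log(2.0) * (dose_gy / d50_gy) ** shape)

      if __name__ == "__main__":
          for dose in (1.0, 3.0, 5.0):
              print(dose, round(weibull_risk(dose, d50_gy=3.5, shape=5.0), 3))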

  13. Probabilistic accident consequence uncertainty analysis: Food chain uncertainty assessment. Volume 2: Appendices

    SciTech Connect

    Brown, J.; Goossens, L.H.J.; Kraan, B.C.P.

    1997-06-01

    This volume is the second of a two-volume document that summarizes a joint project by the US Nuclear Regulatory Commission and the Commission of European Communities to assess uncertainties in the MACCS and COSYMA probabilistic accident consequence codes. These codes were developed primarily for estimating the risks presented by nuclear reactors based on postulated frequencies and magnitudes of potential accidents. This two-volume report, which examines mechanisms and uncertainties of transfer through the food chain, is the first in a series of five such reports. A panel of sixteen experts was formed to compile credible and traceable uncertainty distributions for food chain transfer that affect calculations of offsite radiological consequences. Seven of the experts reported on transfer into the food chain through soil and plants, nine reported on transfer via food products from animals, and two reported on both. The expert judgment elicitation procedure and its outcomes are described in these volumes. This volume contains seven appendices. Appendix A presents a brief discussion of the MACCS and COSYMA model codes. Appendix B is the structure document and elicitation questionnaire for the expert panel on soils and plants. Appendix C presents the rationales and responses of each of the members of the soils and plants expert panel. Appendix D is the structure document and elicitation questionnaire for the expert panel on animal transfer. The rationales and responses of each of the experts on animal transfer are given in Appendix E. Brief biographies of the food chain expert panel members are provided in Appendix F. Aggregated results of expert responses are presented in graph format in Appendix G.

  14. TRANSIENT ACCIDENT ANALYSIS OF THE GLOVEBOX SYSTEM IN A LARGE PROCESS ROOM

    SciTech Connect

    Lee, S

    2008-01-11

    Local transient hydrogen concentrations were evaluated inside a large process room for three postulated accident scenarios, associated with process tank leakage and fire, leading to a loss of gas confinement. The three cases considered in this work were fire in the room, loss of confinement from a process tank, and loss of confinement coupled with a fire event. Based on these accident scenarios in a large, unventilated process room, modeling calculations of hydrogen migration were performed to estimate local transient hydrogen concentrations due to sudden leakage and release from a glovebox system associated with the process tank. The modeling domain represented the major features of the process room, including the principal release or leakage source of the gas storage system. The model was benchmarked against literature results for key phenomena such as natural convection, turbulent behavior, gas mixing due to jet entrainment, and radiation cooling, because these phenomena are closely related to the gas driving mechanisms within the large air space of the process room. The modeling results showed that, at the corner of the process room, the gas concentrations produced by the Case 2 and Case 3 scenarios reached the set-point value of the high-activity alarm in about 13 seconds, while the Case 1 scenario took about 90 seconds to reach that concentration. The modeling results were used to estimate transient radioactive gas migration in an enclosed process room equipped with a high-activity alarm monitor when the postulated leakage scenarios are initiated without room ventilation.

  15. Semantic analysis according to Peep Koort--a substance-oriented research methodology.

    PubMed

    Sivonen, Kerstin; Kasén, Anne; Eriksson, Katie

    2010-12-01

    The aim of this article is to describe the hermeneutic semantic analysis created by Professor Peep Koort (1920-1977) and to discuss it as a methodology for research within caring science. The methodology is developed with a hermeneutic approach that differs from the traditions of semantic analysis in philosophy or linguistics. The research objects are core concepts and theoretical constructs (originally within the academic discipline of education science, later within the academic discipline of caring science), with a focus on deeper understanding of essential meaning content when developing a discipline. The qualitative methodology of hermeneutic semantic analysis is described step by step as created by Koort and as interpreted and developed by the authors. An etymological investigation and an analysis of synonymy between related concepts within a conceptual family guide the researcher in understanding and discriminating the conceptual dimensions of meaning content connected to the word studied, thus providing opportunities to summarise it in a theoretical definition, a discovery that can be tested in varying contexts. From a caring science perspective, we find the hermeneutic methodology of semantic analysis fruitful and suitable for researchers developing their understanding of core concepts and theoretical constructs connected to the development of the academic discipline.

  16. Methodology for object-oriented real-time systems analysis and design: Software engineering

    NASA Technical Reports Server (NTRS)

    Schoeffler, James D.

    1991-01-01

    Successful application of software engineering methodologies requires an integrated analysis and design life-cycle in which the various phases flow smoothly 'seamlessly' from analysis through design to implementation. Furthermore, different analysis methodologies often lead to different structuring of the system so that the transition from analysis to design may be awkward depending on the design methodology to be used. This is especially important when object-oriented programming is to be used for implementation when the original specification and perhaps high-level design is non-object oriented. Two approaches to real-time systems analysis which can lead to an object-oriented design are contrasted: (1) modeling the system using structured analysis with real-time extensions which emphasizes data and control flows followed by the abstraction of objects where the operations or methods of the objects correspond to processes in the data flow diagrams and then design in terms of these objects; and (2) modeling the system from the beginning as a set of naturally occurring concurrent entities (objects) each having its own time-behavior defined by a set of states and state-transition rules and seamlessly transforming the analysis models into high-level design models. A new concept of a 'real-time systems-analysis object' is introduced and becomes the basic building block of a series of seamlessly-connected models which progress from the object-oriented real-time systems analysis and design system analysis logical models through the physical architectural models and the high-level design stages. The methodology is appropriate to the overall specification including hardware and software modules. In software modules, the systems analysis objects are transformed into software objects.
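
    The notion of a "real-time systems-analysis object" whose time behavior is defined by states and state-transition rules can be sketched minimally as below. The valve object, its states, and its events are invented for illustration only; they are not taken from the report.

      # Hedged sketch: an analysis object as a small state machine driven by
      # state-transition rules. Names and transitions are illustrative placeholders.
      class AnalysisObject:
          def __init__(self, name, initial_state, transitions):
              # transitions: {(state, event): next_state}
              self.name, self.state, self.transitions = name, initial_state, transitions

          def handle(self, event):
              """Apply a state-transition rule; unknown events leave the state unchanged."""
              self.state = self.transitions.get((self.state, event), self.state)
              return self.state

      if __name__ == "__main__":
          valve = AnalysisObject("relief_valve", "closed",
                                 {("closed", "overpressure"): "open",
                                  ("open", "pressure_normal"): "closed",
                                  ("open", "mechanical_fault"): "stuck_open"})
          for ev in ("overpressure", "pressure_normal", "overpressure", "mechanical_fault"):
              print(ev, "->", valve.handle(ev))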

  17. An aftermath analysis of the 2014 coal mine accident in Soma, Turkey: Use of risk performance indicators based on historical experience.

    PubMed

    Spada, Matteo; Burgherr, Peter

    2016-02-01

    On the 13th of May 2014 a fire related incident in the Soma coal mine in Turkey caused 301 fatalities and more than 80 injuries. This has been the largest coal mine accident in Turkey, and in the OECD country group, so far. This study investigated if such a disastrous event should be expected, in a statistical sense, based on historical observations. For this purpose, PSI's ENSAD database is used to extract accident data for the period 1970-2014. Four different cases are analyzed, i.e., OECD, OECD w/o Turkey, Turkey and USA. Analysis of temporal trends for annual numbers of accidents and fatalities indicated a non-significant decreasing tendency for OECD and OECD w/o Turkey and a significant one for USA, whereas for Turkey both measures showed an increase over time. The expectation analysis revealed clearly that an event with the consequences of the Soma accident is rather unlikely for OECD, OECD w/o Turkey and USA. In contrast, such a severe accident has a substantially higher expectation for Turkey, i.e. it cannot be considered an extremely rare event, based on historical experience. This indicates a need for improved safety measures and stricter regulations in the Turkish coal mining sector in order to get closer to the rest of OECD. PMID:26687539
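
    The "expectation analysis" described above can be illustrated with a minimal sketch: estimate the empirical rate of accidents at or above a fatality threshold from a historical record, and convert it to a mean return period under a Poisson assumption. The fatality counts and observation period below are invented placeholders, not data from PSI's ENSAD database.

      # Hedged sketch of an exceedance-rate / return-period estimate from a
      # historical accident record. All numbers are placeholders.
      def exceedance_rate(fatality_counts, years_observed, threshold):
          """Events per year with fatalities >= threshold (simple empirical rate)."""
          n_exceed = sum(1 for f in fatality_counts if f >= threshold)
          return n_exceed / years_observed

      def mean_return_period(rate_per_year):
          """Mean waiting time in years, assuming a Poisson process."""
          return float("inf") if rate_per_year == 0 else 1.0 / rate_per_year

      if __name__ == "__main__":
          fatalities = [5, 12, 30, 8, 45, 263, 17, 9]   # hypothetical accident record
          for threshold in (10, 300):
              rate = exceedance_rate(fatalities, years_observed=45, threshold=threshold)
              print(threshold, rate, mean_return_period(rate))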

  18. An aftermath analysis of the 2014 coal mine accident in Soma, Turkey: Use of risk performance indicators based on historical experience.

    PubMed

    Spada, Matteo; Burgherr, Peter

    2016-02-01

    On the 13th of May 2014 a fire related incident in the Soma coal mine in Turkey caused 301 fatalities and more than 80 injuries. This has been the largest coal mine accident in Turkey, and in the OECD country group, so far. This study investigated if such a disastrous event should be expected, in a statistical sense, based on historical observations. For this purpose, PSI's ENSAD database is used to extract accident data for the period 1970-2014. Four different cases are analyzed, i.e., OECD, OECD w/o Turkey, Turkey and USA. Analysis of temporal trends for annual numbers of accidents and fatalities indicated a non-significant decreasing tendency for OECD and OECD w/o Turkey and a significant one for USA, whereas for Turkey both measures showed an increase over time. The expectation analysis revealed clearly that an event with the consequences of the Soma accident is rather unlikely for OECD, OECD w/o Turkey and USA. In contrast, such a severe accident has a substantially higher expectation for Turkey, i.e. it cannot be considered an extremely rare event, based on historical experience. This indicates a need for improved safety measures and stricter regulations in the Turkish coal mining sector in order to get closer to the rest of OECD.

  19. Tools for improving safety management in the Norwegian Fishing Fleet occupational accidents analysis period of 1998-2006.

    PubMed

    Aasjord, Halvard L

    2006-01-01

    Reporting of human accidents in the Norwegian Fishing Fleet has always been very difficult because there has been no tradition of reporting all types of working accidents among fishermen if the accident does not seem to be very serious or there is no economic incentive to report. Therefore, reports are only written when the accidents are serious or if the fisherman is reported sick. Reports about an accident are sent to the insurance company, but another report should also be sent to the Norwegian Maritime Directorate (NMD). Comparison of data from one former insurance company and the NMD shows that the real number of injuries or serious accidents among Norwegian fishermen could be up to two times higher than the number reported to the NMD. Special analyses of 1690 accidents from the so-called PUS database (NMD) for the period 1998-2002 show that the calculated risk was 23.6 accidents per 1000 man-years. This is quite a high risk level, and most of the accidents in the fishing fleet were rather serious. The calculated risks are highest for fishermen on board the deep-sea fleet of trawlers (28.6 accidents per 1000 man-years) and the deep-sea fleet of purse seiners (28.9 accidents per 1000 man-years). Fatal accidents over a longer period of 51.5 years, from 1955 to 2006, are also roughly analysed. These data from SINTEF's own database show that the number of fatal accidents has been decreasing over this long period, except for the two periods 1980-84 and 1990-94, when there were some casualties involving total losses of larger vessels with the loss of most of the crew, as well as many other typical work accidents on smaller vessels. The total number of registered Norwegian fishermen and the number of man-years have been drastically reduced over the 51.5 years from 1955 to 2006. The risk of fatal accidents has been very steady over time at a high level, although there has been a marked risk reduction since 1990-94. For the last 8.5-year period of January 1998
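
    The exposure-normalised rate quoted above (23.6 accidents per 1000 man-years) is simply the accident count divided by the exposure. A one-line sketch follows; the man-year total is back-calculated only to reproduce the quoted order of magnitude and is not a figure from the PUS database.

      # Hedged sketch: accidents per 1000 man-years. The exposure value is a
      # placeholder chosen to reproduce the rate quoted in the abstract.
      def rate_per_1000_man_years(n_accidents, man_years):
          return 1000.0 * n_accidents / man_years

      print(round(rate_per_1000_man_years(1690, 71_600), 1))  # ~23.6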

  20. Tools for improving safety management in the Norwegian Fishing Fleet occupational accidents analysis period of 1998-2006.

    PubMed

    Aasjord, Halvard L

    2006-01-01

    Reporting of human accidents in the Norwegian Fishing Fleet has always been very difficult because there has been no tradition of reporting all types of working accidents among fishermen if the accident does not seem to be very serious or there is no economic incentive to report. Therefore, reports are only written when the accidents are serious or if the fisherman is reported sick. Reports about an accident are sent to the insurance company, but another report should also be sent to the Norwegian Maritime Directorate (NMD). Comparison of data from one former insurance company and the NMD shows that the real number of injuries or serious accidents among Norwegian fishermen could be up to two times higher than the number reported to the NMD. Special analyses of 1690 accidents from the so-called PUS database (NMD) for the period 1998-2002 show that the calculated risk was 23.6 accidents per 1000 man-years. This is quite a high risk level, and most of the accidents in the fishing fleet were rather serious. The calculated risks are highest for fishermen on board the deep-sea fleet of trawlers (28.6 accidents per 1000 man-years) and the deep-sea fleet of purse seiners (28.9 accidents per 1000 man-years). Fatal accidents over a longer period of 51.5 years, from 1955 to 2006, are also roughly analysed. These data from SINTEF's own database show that the number of fatal accidents has been decreasing over this long period, except for the two periods 1980-84 and 1990-94, when there were some casualties involving total losses of larger vessels with the loss of most of the crew, as well as many other typical work accidents on smaller vessels. The total number of registered Norwegian fishermen and the number of man-years have been drastically reduced over the 51.5 years from 1955 to 2006. The risk of fatal accidents has been very steady over time at a high level, although there has been a marked risk reduction since 1990-94. For the last 8.5-year period of January 1998

  1. The Energy Interaction Model: A promising new methodology for projecting GPHS-RTG cladding failures, release amounts & respirable release fractions for postulated pre-launch, launch, and post-reentry earth impact accidents

    SciTech Connect

    Coleman, J.R.; Sholtis, J.A. Jr.; McCulloch, W.H.

    1998-01-01

    Safety analyses and evaluations must be scrutable, defensible, and credible. This is particularly true when nuclear systems are involved, with their attendant potential for releases of radioactive materials (source terms) to the unrestricted environment. Analytical projections of General Purpose Heat Source Radioisotope Thermoelectric Generator (GPHS-RTG) source terms, for safety analyses conducted to date, have relied upon generic data correlations using a single parameter of cladding damage, termed "distortion." However, distortion is not an unequivocal measure of cladding insult, failure, or release. Furthermore, the analytical foundation, applicability, and broad use of distortion are argumentative and, thus, somewhat troublesome. In an attempt to avoid the complications associated with the use of distortion, a new methodology, referred to as the Energy Interaction Model (EIM), has been preliminarily developed. This new methodology is based upon the physical principles of energy and energy exchange during mechanical interactions. Specifically, the EIM considers the energy imparted to GPHS-RTG components (bare fueled clads, GPHS modules, and full GPHS-RTGs) when exposed to mechanical threats (blast/overpressure, shrapnel and fragment impacts, and Earth surface impacts) posed by the full range of potential accidents. Expected forms are developed for equations intended to project cladding failure probabilities, the number of cladding failures expected, release amounts, and the fraction released as respirable particles. The coefficients of the equations developed are then set to fit the GPHS-RTG test data, ensuring good agreement with the experimental database. This assured, fitted agreement with the test database, along with the foundation of the EIM in first principles, provides confidence in the model's projections beyond the available database. In summary, the newly developed EIM methodology is

  2. Fault tree analysis of fire and explosion accidents for dual fuel (diesel/natural gas) ship engine rooms

    NASA Astrophysics Data System (ADS)

    Guan, Yifeng; Zhao, Jie; Shi, Tengfei; Zhu, Peipei

    2016-07-01

    In recent years, China's increased interest in environmental protection has led to the promotion of energy-efficient dual fuel (diesel/natural gas) ships on Chinese inland rivers. Natural gas as a ship fuel may pose dangers of fire and explosion if a gas leak occurs. If explosions or fires occur in the engine rooms of a ship, heavy damage and losses will be incurred. In this paper, a fault tree model is presented that considers both fires and explosions in a dual fuel ship; in this model, fire and explosion in the dual fuel engine rooms are the top events. All the basic events along with the minimum cut sets are obtained through the analysis. The primary factors that affect accidents involving fires and explosions are determined by calculating the degree of structural importance of the basic events. According to these results, corresponding measures are proposed to ensure and improve the safety and reliability of Chinese inland dual fuel ships.
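
    A toy sketch of the two calculations named above, minimum cut sets and the structural importance of basic events, is given below for a three-event tree. The events and gate logic are invented placeholders, not the fault tree of the paper.

      # Hedged sketch: enumerate a toy fault tree, list its minimal cut sets, and
      # rank basic events by a structural (Birnbaum-type) importance measure.
      from itertools import product

      EVENTS = ["gas_leak", "ignition_source", "ventilation_failure"]

      def top_event(state):
          """Toy logic: fire/explosion needs a leak plus either an ignition source
          or a ventilation failure that lets gas accumulate."""
          return state["gas_leak"] and (state["ignition_source"] or state["ventilation_failure"])

      def minimal_cut_sets():
          cuts = []
          for bits in product([False, True], repeat=len(EVENTS)):
              state = dict(zip(EVENTS, bits))
              if top_event(state):
                  cuts.append(frozenset(e for e in EVENTS if state[e]))
          return sorted({c for c in cuts if not any(o < c for o in cuts)}, key=len)

      def structural_importance(event):
          """Fraction of states of the other events for which `event` is critical."""
          others = [e for e in EVENTS if e != event]
          critical = 0
          for bits in product([False, True], repeat=len(others)):
              state = dict(zip(others, bits))
              up = top_event({**state, event: True})
              down = top_event({**state, event: False})
              critical += up and not down
          return critical / 2 ** len(others)

      if __name__ == "__main__":
          print([sorted(c) for c in minimal_cut_sets()])
          for e in EVENTS:
              print(e, structural_importance(e))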

  3. Fault tree analysis of fire and explosion accidents for dual fuel (diesel/natural gas) ship engine rooms

    NASA Astrophysics Data System (ADS)

    Guan, Yifeng; Zhao, Jie; Shi, Tengfei; Zhu, Peipei

    2016-09-01

    In recent years, China's increased interest in environmental protection has led to the promotion of energy-efficient dual fuel (diesel/natural gas) ships on Chinese inland rivers. Natural gas as a ship fuel may pose dangers of fire and explosion if a gas leak occurs. If explosions or fires occur in the engine rooms of a ship, heavy damage and losses will be incurred. In this paper, a fault tree model is presented that considers both fires and explosions in a dual fuel ship; in this model, fire and explosion in the dual fuel engine rooms are the top events. All the basic events along with the minimum cut sets are obtained through the analysis. The primary factors that affect accidents involving fires and explosions are determined by calculating the degree of structural importance of the basic events. According to these results, corresponding measures are proposed to ensure and improve the safety and reliability of Chinese inland dual fuel ships.

  4. Severe Accident Scoping Simulations of Accident Tolerant Fuel Concepts for BWRs

    SciTech Connect

    Robb, Kevin R.

    2015-08-01

    Accident-tolerant fuels (ATFs) are fuels and/or cladding that, in comparison with the standard uranium dioxide Zircaloy system, can tolerate loss of active cooling in the core for a considerably longer time period while maintaining or improving the fuel performance during normal operations [1]. It is important to note that the currently used uranium dioxide Zircaloy fuel system tolerates design basis accidents (and anticipated operational occurrences and normal operation) as prescribed by the US Nuclear Regulatory Commission. Previously, preliminary simulations of the plant response have been performed under a range of accident scenarios using various ATF cladding concepts and fully ceramic microencapsulated fuel. Design basis loss of coolant accidents (LOCAs) and station blackout (SBO) severe accidents were analyzed at Oak Ridge National Laboratory (ORNL) for boiling water reactors (BWRs) [2]. Researchers have investigated the effects of thermal conductivity on design basis accidents [3], investigated silicon carbide (SiC) cladding [4], as well as the effects of ATF concepts on the late stage accident progression [5]. These preliminary analyses were performed to provide initial insight into the possible improvements that ATF concepts could provide and to identify issues with respect to modeling ATF concepts. More recently, preliminary analyses for a range of ATF concepts have been evaluated internationally for LOCA and severe accident scenarios for the Chinese CPR1000 [6] and the South Korean OPR-1000 [7] pressurized water reactors (PWRs). In addition to these scoping studies, a common methodology and set of performance metrics were developed to compare and support prioritizing ATF concepts [8]. A proposed ATF concept is based on iron-chromium-aluminum alloys (FeCrAl) [9]. With respect to enhancing accident tolerance, FeCrAl alloys have substantially slower oxidation kinetics compared to the zirconium alloys typically employed. During a severe accident, Fe

  5. 76 FR 35431 - Federal Need Analysis Methodology for the 2012-2013 Award Year

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-06-17

    From the Federal Register Online via the Government Publishing Office. DEPARTMENT OF EDUCATION. Federal Need Analysis Methodology for the 2012-2013 Award Year; Correction. In notice document 2010-12812, appearing on pages 30139 through 30142 in the issue of Tuesday, May 24, 2011, make the following...

  6. Knowledge Consolidation Analysis: Toward a Methodology for Studying the Role of Argument in Technology Development

    ERIC Educational Resources Information Center

    Dyehouse, Jeremiah

    2007-01-01

    Researchers studying technology development often examine how rhetorical activity contributes to technologies' design, implementation, and stabilization. This article offers a possible methodology for studying one role of rhetorical activity in technology development: knowledge consolidation analysis. Applying this method to an exemplar case, the…

  7. Methodological Advances in the Analysis of Individual Growth with Relevance to Education Policy.

    ERIC Educational Resources Information Center

    Kaplan, David

    2002-01-01

    Demonstrates how recent methodological developments in the analysis of individual growth can inform important problems in education policy, focusing on growth mixture modeling and applying growth mixture modeling to data from the Early Childhood Longitudinal Study-Kindergarten class of 1998-99 to investigate the effects of full- and part-day…

  8. Success story in software engineering using NIAM (Natural language Information Analysis Methodology)

    SciTech Connect

    Eaton, S.M.; Eaton, D.S.

    1995-10-01

    To create an information system, we employ NIAM (Natural language Information Analysis Methodology). NIAM supports the goals of both the customer and the analyst completely understanding the information. We use the customer's own unique vocabulary, collect real examples, and validate the information in natural language sentences. Examples are discussed from a successfully implemented information system.

  9. Measuring Team Shared Understanding Using the Analysis-Constructed Shared Mental Model Methodology

    ERIC Educational Resources Information Center

    Johnson, Tristan E.; O'Connor, Debra L.

    2008-01-01

    Teams are an essential part of successful performance in learning and work environments. Analysis-constructed shared mental model (ACSMM) methodology is a set of techniques where individual mental models are elicited and sharedness is determined not by the individuals who provided their mental models but by an analytical procedure. This method…

  10. Calculation notes that support accident scenario and consequence development for the subsurface leak remaining subsurface accident

    SciTech Connect

    Ryan, G.W., Westinghouse Hanford

    1996-07-12

    This document supports the development and presentation of the following accident scenario in the TWRS Final Safety Analysis Report: Subsurface Leak Remaining Subsurface. The calculations needed to quantify the risk associated with this accident scenario are included within.

  11. Probabilistic accident consequence uncertainty analysis: Dispersion and deposition uncertainty assessment. Volume 3, Appendices C, D, E, F, and G

    SciTech Connect

    Harper, F.T.; Young, M.L.; Miller, L.A.

    1995-01-01

    Two new probabilistic accident consequence codes, MACCS and COSYMA, whose development was completed in 1990, estimate the risks presented by nuclear installations based on postulated frequencies and magnitudes of potential accidents. In 1991, the US Nuclear Regulatory Commission (NRC) and the Commission of the European Communities (CEC) began a joint uncertainty analysis of the two codes. The objective was to develop credible and traceable uncertainty distributions for the input variables of the codes. Expert elicitation, developed independently, was identified as the best technology available for developing a library of uncertainty distributions for the selected consequence parameters. The study was formulated jointly and was limited to the current code models and to physical quantities that could be measured in experiments. To validate the distributions generated for the wet deposition input variables, samples were taken from these distributions and propagated through the wet deposition code model along with the Gaussian plume model (GPM) implemented in the MACCS and COSYMA codes. Resulting distributions closely replicated the aggregated elicited wet deposition distributions. Project teams from the NRC and CEC cooperated successfully to develop and implement a unified process for the elaboration of uncertainty distributions on consequence code input parameters. Formal expert judgment elicitation proved valuable for synthesizing the best available information. Distributions on measurable atmospheric dispersion and deposition parameters were successfully elicited from experts involved in the many phenomenological areas of consequence analysis. This volume is the third of a three-volume document describing the project and contains descriptions of the probability assessment principles; the expert identification and selection process; the weighting methods used; the inverse modeling methods; case structures; and summaries of the consequence codes.
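
    The propagation step described above can be sketched generically: sample dispersion parameters from assumed distributions and push them through a textbook ground-level, centreline Gaussian plume expression, then report percentiles of the result. The distributions and numbers below are placeholders; this is not the MACCS or COSYMA implementation.

      # Hedged sketch of uncertainty propagation through a simple Gaussian plume
      # formula (ground-level, centreline, with ground reflection). All parameter
      # distributions and values are illustrative placeholders.
      import math
      import random

      def plume_concentration(q_bq_s, u_m_s, sigma_y, sigma_z, stack_height_m):
          return (q_bq_s / (math.pi * u_m_s * sigma_y * sigma_z)
                  * math.exp(-stack_height_m ** 2 / (2.0 * sigma_z ** 2)))

      def propagate(n_samples=10_000, seed=1):
          rng = random.Random(seed)
          results = []
          for _ in range(n_samples):
              sigma_y = rng.lognormvariate(math.log(200.0), 0.4)   # lateral spread, m
              sigma_z = rng.lognormvariate(math.log(80.0), 0.5)    # vertical spread, m
              u = rng.uniform(2.0, 8.0)                            # wind speed, m/s
              results.append(plume_concentration(1.0e10, u, sigma_y, sigma_z, 50.0))
          results.sort()
          return results[len(results) // 20], results[len(results) // 2], results[-len(results) // 20]

      if __name__ == "__main__":
          p5, p50, p95 = propagate()
          print(f"5th/50th/95th percentile concentration: {p5:.2e} {p50:.2e} {p95:.2e} Bq/m^3")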

  12. Thermal-hydraulic analysis for changing feedwater check valve leakage rate testing methodology

    SciTech Connect

    Fuller, R.; Harrell, J.

    1996-12-01

    The current design and testing requirements for the feedwater check valves (FWCVs) at the Grand Gulf Nuclear Station are established from original licensing requirements that necessitate extremely restrictive air testing with tight allowable leakage limits. As a direct result of these requirements, the original high endurance hard seats in the FWCVs were modified with elastomeric seals to provide a sealing surface capable of meeting the stringent air leakage limits. However, due to the relatively short functional life of the elastomeric seals compared to the hard seats, the overall reliability of the sealing function actually decreased. This degraded performance was exhibited by frequent seal failures and subsequent valve repairs. The original requirements were based on limited analysis and the belief that all of the high energy feedwater vaporized during the LOCA blowdown. These phenomena would have resulted in completely voided feedwater lines and thus a steam environment within the feedwater leak pathway. To challenge these criteria, a comprehensive design basis accident analysis was developed using the RELAP5/MOD3.1 thermal-hydraulic code. Realistic assumptions were used to more accurately model the post-accident fluid conditions within the feedwater system. The results of this analysis demonstrated that no leak path exists through the feedwater lines during the reactor blowdown phase and that sufficient subcooled water remains in various portions of the feedwater piping to form liquid water loop seals that effectively isolate this leak path. These results provided the bases for changing the leak testing requirements of the FWCVs from air to water. The analysis results also established more accurate allowable leakage limits, determined the real effective margins associated with the FWCV safety functions, and led to design changes that improved the overall functional performance of the valves.

  13. Methodology for diagnosing of skin cancer on images of dermatologic spots by spectral analysis.

    PubMed

    Guerra-Rosas, Esperanza; Álvarez-Borrego, Josué

    2015-10-01

    In this paper, a new methodology for diagnosing skin cancer on images of dermatologic spots using image processing is presented. Currently, skin cancer is one of the most frequent diseases in humans. The methodology is based on Fourier spectral analysis using filters such as the classic, inverse and k-law nonlinear filters. The sample images were obtained by a medical specialist, and a new spectral technique is developed to obtain a quantitative measurement of the complex pattern found in cancerous skin spots. Finally, a spectral index is calculated to obtain a range of spectral indices defined for skin cancer. Our results show a confidence level of 95.4%.
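
    The k-law nonlinear filtering step can be sketched as below: raise the magnitude of the 2-D spectrum to a power k while keeping the phase, then reduce the filtered spectrum to a scalar descriptor. The exponent and the particular "spectral index" definition here are illustrative assumptions, not the calibrated quantities of the paper.

      # Hedged sketch of a k-law nonlinear filter in the Fourier domain and a
      # placeholder scalar "spectral index" computed from the filtered spectrum.
      import numpy as np

      def k_law_spectrum(image, k=0.3):
          """Apply |F|^k while keeping the phase of the 2-D spectrum."""
          spectrum = np.fft.fft2(image.astype(float))
          magnitude, phase = np.abs(spectrum), np.angle(spectrum)
          return (magnitude ** k) * np.exp(1j * phase)

      def spectral_index(image, k=0.3):
          """Placeholder descriptor: energy fraction outside the lowest frequencies."""
          filtered = np.fft.fftshift(k_law_spectrum(image, k))
          power = np.abs(filtered) ** 2
          cy, cx = np.array(power.shape) // 2
          low = power[cy - 2:cy + 3, cx - 2:cx + 3].sum()   # 5x5 block around DC
          return 1.0 - low / power.sum()

      if __name__ == "__main__":
          rng = np.random.default_rng(0)
          fake_spot = rng.random((64, 64))          # stand-in for a dermatologic image
          print(round(spectral_index(fake_spot), 3))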

  14. "Murder-suicide" or "murder-accident"? Difficulties with the analysis of cases.

    PubMed

    Byard, Roger W; Veldhoen, David; Kobus, Hilton; Heath, Karen

    2010-09-01

    Homicide where a perpetrator is found dead adjacent to the victim usually represents murder-suicide. Two incidents are reported to demonstrate characteristic features in one, and alternative features in the other, that indicate differences in the manner of death. (i) A 37-year-old mother was found dead in a burnt out house with her two young sons in an adjacent bedroom. Deaths were due to incineration and inhalation of products of combustion. (ii) A 39-year-old woman was found stabbed to death in a burnt out house with her 39-year-old de facto partner deceased from the combined effects of incineration and inhalation of products of combustion. The first incident represented a typical murder-suicide, however, in the second incident, the perpetrator had tried to escape through a window and had then sought refuge in a bathroom under a running shower. Murder-accident rather than murder-suicide may therefore be a more accurate designation for such cases.

  15. Health effects models for nuclear power plant accident consequence analysis: Low LET radiation

    SciTech Connect

    Evans, J.S. (School of Public Health)

    1990-01-01

    This report describes dose-response models intended to be used in estimating the radiological health effects of nuclear power plant accidents. Models of early and continuing effects, cancers and thyroid nodules, and genetic effects are provided. Weibull dose-response functions are recommended for evaluating the risks of early and continuing health effects. Three potentially lethal early effects -- the hematopoietic, pulmonary, and gastrointestinal syndromes -- are considered. In addition, models are included for assessing the risks of several nonlethal early and continuing effects -- including prodromal vomiting and diarrhea, hypothyroidism and radiation thyroiditis, skin burns, reproductive effects, and pregnancy losses. Linear and linear-quadratic models are recommended for estimating cancer risks. Parameters are given for analyzing the risks of seven types of cancer in adults -- leukemia, bone, lung, breast, gastrointestinal, thyroid, and "other". The category, "other" cancers, is intended to reflect the combined risks of multiple myeloma, lymphoma, and cancers of the bladder, kidney, brain, ovary, uterus and cervix. Models of childhood cancers due to in utero exposure are also developed. For most cancers, both incidence and mortality are addressed. The models of cancer risk are derived largely from information summarized in BEIR III -- with some adjustment to reflect more recent studies. 64 refs., 18 figs., 46 tabs.
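
    The linear and linear-quadratic excess-risk forms referred to above can be written out in a short sketch. The coefficients are arbitrary placeholders, not the parameters tabulated in the report.

      # Hedged sketch of linear and linear-quadratic excess cancer risk models;
      # alpha and beta values below are illustrative placeholders only.
      def linear_excess_risk(dose_gy, alpha):
          return alpha * dose_gy

      def linear_quadratic_excess_risk(dose_gy, alpha, beta):
          return alpha * dose_gy + beta * dose_gy ** 2

      if __name__ == "__main__":
          for d in (0.1, 0.5, 1.0):
              print(d, linear_excess_risk(d, alpha=5e-3),
                    linear_quadratic_excess_risk(d, alpha=2.5e-3, beta=2.5e-3))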

  16. Episode analysis of deposition of radiocesium from the Fukushima Daiichi nuclear power plant accident.

    PubMed

    Morino, Yu; Ohara, Toshimasa; Watanabe, Mirai; Hayashi, Seiji; Nishizawa, Masato

    2013-03-01

    Chemical transport models played key roles in understanding the atmospheric behaviors and deposition patterns of radioactive materials emitted from the Fukushima Daiichi nuclear power plant after the nuclear accident that accompanied the great Tohoku earthquake and tsunami on 11 March 2011. However, model results could not be sufficiently evaluated because of limited observational data. We assess the model performance to simulate the deposition patterns of radiocesium ((137)Cs) by making use of airborne monitoring survey data for the first time. We conducted ten sensitivity simulations to evaluate the atmospheric model uncertainties associated with key model settings including emission data and wet deposition modules. We found that simulation using emissions estimated with a regional-scale (∼ 500 km) model better reproduced the observed (137)Cs deposition pattern in eastern Japan than simulation using emissions estimated with local-scale (∼ 50 km) or global-scale models. In addition, simulation using a process-based wet deposition module reproduced the observations well, whereas simulation using scavenging coefficients showed large uncertainties associated with empirical parameters. The best-available simulation reproduced the observed (137)Cs deposition rates in high-deposition areas (≥ 10 kBq m(-2)) within 1 order of magnitude and showed that deposition of radiocesium over land occurred predominantly during 15-16, 20-23, and 30-31 March 2011. PMID:23391028
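
    The scavenging-coefficient treatment of wet deposition that the study compares against a process-based module can be sketched as a simple exponential depletion of airborne activity during rain. The coefficient and rain duration below are placeholders, not values used in the paper.

      # Hedged sketch: fraction of airborne (137)Cs removed during a rain event,
      # assuming a constant scavenging coefficient lambda_w (placeholder value).
      import math

      def washout_fraction(scavenging_coeff_per_s, duration_s):
          return 1.0 - math.exp(-scavenging_coeff_per_s * duration_s)

      if __name__ == "__main__":
          lam = 1.0e-4          # s^-1, illustrative scavenging coefficient
          print(round(washout_fraction(lam, duration_s=3 * 3600), 3))  # 3-hour rain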

  17. [Diagnostic analysis of the rehabilitation services that attend victims of accidents and violence in Recife].

    PubMed

    de Lima, Maria Luiza Carvalho; Deslandes, Suely Ferreira; de Souza, Edinilsa Ramos; de Lima, Maria Luiza Lopes Timóteo; Barreira, Alice Kelly

    2009-01-01

    This paper aims to analyze the rehabilitation services in Recife, Brazil, comparing them with what the National Policy for Decreasing Morbimortality from Accidents and Violence (NPDMAV) prescribes. Six rehabilitation service units were analyzed; five were run by the municipal administration and the other was an NGO under an agreement with SUS. A semi-structured interview was conducted with the person in charge of Attention to the Disabled Person's Health in order to map the network, and a questionnaire was applied to the managers of the six services to characterize each unit and its activities, structure and organization. The rehabilitation assistance network in Recife does not meet the NPDMAV guidelines: it presents a reduced number of services/programs, lack of a multidisciplinary team, lack of technological support, deficient intra- and intersectoral coordination, little involvement of the victim and his or her family in reinsertion into family and social life, and still incipient prevention and promotion actions. In conclusion, there is an effort to adapt the actions to this population group; however, there are significant coverage deficits in personnel, equipment, information records and articulation among the various levels of this health network. PMID:19851594

  18. Shipping container response to severe highway and railway accident conditions: Appendices

    SciTech Connect

    Fischer, L.E.; Chou, C.K.; Gerhard, M.A.; Kimura, C.Y.; Martin, R.W.; Mensing, R.W.; Mount, M.E.; Witte, M.C.

    1987-02-01

    Volume 2 contains the following appendices: Severe accident data; truck accident data; railroad accident data; highway survey data and bridge column properties; structural analysis; thermal analysis; probability estimation techniques; and benchmarking for computer codes used in impact analysis. (LN)

  19. Methodology for sensitivity analysis, approximate analysis, and design optimization in CFD for multidisciplinary applications

    NASA Technical Reports Server (NTRS)

    Taylor, Arthur C., III; Hou, Gene W.

    1993-01-01

    In this study involving advanced fluid flow codes, an incremental iterative formulation (also known as the delta or correction form) together with the well-known spatially-split approximate factorization algorithm, is presented for solving the very large sparse systems of linear equations which are associated with aerodynamic sensitivity analysis. For smaller 2D problems, a direct method can be applied to solve these linear equations in either the standard or the incremental form, in which case the two are equivalent. Iterative methods are needed for larger 2D and future 3D applications, however, because direct methods require much more computer memory than is currently available. Iterative methods for solving these equations in the standard form are generally unsatisfactory due to an ill-conditioning of the coefficient matrix; this problem can be overcome when these equations are cast in the incremental form. These and other benefits are discussed. The methodology is successfully implemented and tested in 2D using an upwind, cell-centered, finite volume formulation applied to the thin-layer Navier-Stokes equations. Results are presented for two sample airfoil problems: (1) subsonic low Reynolds number laminar flow; and (2) transonic high Reynolds number turbulent flow.
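
    The incremental (delta, or correction) form referred to above can be sketched generically: rather than solving A x = b in standard form, each pass solves an approximate system for the correction dx driven by the residual r = b - A x and updates x. The Jacobi-style approximate operator used below is only a stand-in for the spatially split approximate factorization of the paper.

      # Hedged sketch of the delta / correction form: solve Approx * dx = b - A x,
      # then update x <- x + dx, repeating until the residual is driven down.
      import numpy as np

      def incremental_solve(a, b, n_iter=200):
          x = np.zeros_like(b)
          approx = np.diag(np.diag(a))           # crude stand-in for the split operator
          for _ in range(n_iter):
              residual = b - a @ x               # driver of the correction
              dx = np.linalg.solve(approx, residual)
              x = x + dx
          return x

      if __name__ == "__main__":
          a = np.array([[4.0, 1.0, 0.0], [1.0, 4.0, 1.0], [0.0, 1.0, 4.0]])
          b = np.array([1.0, 2.0, 3.0])
          x = incremental_solve(a, b)
          print(x, np.allclose(a @ x, b))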

  20. Temporal Statistic of Traffic Accidents in Turkey

    NASA Astrophysics Data System (ADS)

    Erdogan, S.; Yalcin, M.; Yilmaz, M.; Korkmaz Takim, A.

    2015-10-01

    Traffic accidents form clusters in geographic space and over time, and these clusters themselves exhibit distinct spatial and temporal patterns. There is an imperative need to understand how, where and when traffic accidents occur in order to develop appropriate accident reduction strategies. An improved understanding of the location, time and reasons for traffic accidents makes a significant contribution to preventing them. Traffic accident occurrences have been extensively studied from different spatial and temporal points of view using a variety of methodological approaches. In the literature, less research has been dedicated to the temporal patterns of traffic accidents. In this paper, the numbers of traffic accidents are normalized according to the traffic volume, and the distribution and fluctuation of these accidents are examined in terms of Islamic time intervals. The daily activities and worship of Muslims are arranged according to these time intervals, which are spaced fairly evenly throughout the day according to the position of the sun. Islamic time intervals have never before been used to identify the critical hour for traffic accidents. The results show that sunrise is the critical time that acts as a threshold in the rate of traffic accidents throughout Turkey in terms of Islamic time intervals.
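
    The normalization step mentioned above amounts to dividing the accident count in each interval by the traffic volume in that interval, so intervals with little traffic are not over- or under-weighted. The interval labels below follow common usage; all counts and volumes are invented placeholders.

      # Hedged sketch: accidents per unit of traffic volume in each time interval.
      def normalised_rates(counts, volumes):
          return {interval: counts[interval] / volumes[interval] for interval in counts}

      if __name__ == "__main__":
          counts = {"fajr": 120, "sunrise": 310, "dhuhr": 240, "asr": 260, "maghrib": 290, "isha": 200}
          volumes = {"fajr": 0.6e6, "sunrise": 1.1e6, "dhuhr": 1.9e6, "asr": 2.0e6, "maghrib": 1.8e6, "isha": 1.2e6}
          for interval, rate in sorted(normalised_rates(counts, volumes).items(), key=lambda kv: -kv[1]):
              print(f"{interval:8s} {rate * 1e6:.0f} accidents per million vehicles")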

  1. Improved methodology for integral analysis of advanced reactors employing passive safety

    NASA Astrophysics Data System (ADS)

    Muftuoglu, A. Kursad

    After four decades of experience with pressurized water reactors, a new generation of nuclear plants is emerging. These advanced designs employ passive safety, which relies on natural forces such as gravity and natural circulation. The new concept of passive safety also necessitates improvements in the computational tools available for best-estimate analyses. System codes originally designed for high-pressure conditions in the presence of strong momentum sources, such as pumps, are challenged in many ways. Increased interaction of the primary system with the containment necessitates a tool for integral analysis. This study addresses some of these concerns. An improved tool for integral analysis coupling the primary system with the containment calculation is also presented. The code package is based on the RELAP5 and CONTAIN programs, a best-estimate thermal-hydraulics code for primary system analysis and a containment code for containment analysis, respectively. Its suitability is demonstrated with a postulated small break loss of coolant accident analysis of the Westinghouse AP600 plant. The thesis explains the details of the analysis, including the coupling model.

  2. A new analysis methodology for the motion of self-propelled particles and its application

    NASA Astrophysics Data System (ADS)

    Byun, Young-Moo; Lammert, Paul; Crespi, Vincent

    2011-03-01

    Microscale self-propelled particles (SPPs) in solution are a growing field of study, with potential applications in nanomedicine and nanorobotics. However, little detailed quantitative analysis of SPP motion has been performed so far, because self-propelled motion is strongly coupled to Brownian motion, which makes the extraction of the intrinsic propulsion mechanism problematic and has led to inconsistent conclusions. Here, we present a novel way to decompose the motion of an SPP into self-propelled and Brownian components; accurate values for the self-propulsion speed and diffusion coefficients of the SPP are obtained for the first time. We then apply our analysis methodology to the ostensible chemotaxis of SPPs and reveal the actual (non-chemotactic) mechanism of the phenomenon, demonstrating that our analysis methodology is a powerful and reliable tool.
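
    The abstract does not spell out the decomposition, but one commonly used short-time relation for a 2-D active particle is MSD(t) = 4*D*t + v^2*t^2, so fitting the mean-squared displacement against lag time separates the diffusive and propulsive contributions. The sketch below uses that generic textbook relation with invented parameter values; it is not necessarily the authors' method.

      # Hedged sketch: estimate propulsion speed v and diffusion coefficient D from
      # a simulated 2-D trajectory by fitting MSD = v^2 t^2 + 4 D t (short-time form).
      import numpy as np

      def msd(trajectory, max_lag):
          """Time-averaged mean-squared displacement for lags 1..max_lag (N x 2 array)."""
          return np.array([np.mean(np.sum((trajectory[lag:] - trajectory[:-lag]) ** 2, axis=1))
                           for lag in range(1, max_lag + 1)])

      def fit_v_and_d(lag_times_s, msd_m2):
          """Least-squares fit of MSD = v^2 t^2 + 4 D t; returns (v, D)."""
          quad, lin, _ = np.polyfit(lag_times_s, msd_m2, 2)
          return np.sqrt(max(quad, 0.0)), lin / 4.0

      if __name__ == "__main__":
          rng = np.random.default_rng(7)
          dt, n = 0.1, 20000                      # s, number of recorded positions
          v_true, d_true = 2.0e-6, 4.0e-12        # m/s and m^2/s, placeholder values
          drift = v_true * dt * np.column_stack([np.ones(n), np.zeros(n)])
          noise = np.sqrt(2.0 * d_true * dt) * rng.standard_normal((n, 2))
          traj = np.cumsum(drift + noise, axis=0)
          lags = dt * np.arange(1, 51)
          v_fit, d_fit = fit_v_and_d(lags, msd(traj, 50))
          print(f"recovered v ~ {v_fit:.2e} m/s, D ~ {d_fit:.2e} m^2/s")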

  3. ADAPT (Analysis of Dynamic Accident Progression Trees) Beta Version 0.9

    2010-01-07

    The purpose of the ADAPT code is to generate Dynamic Event Trees (DET) using a user-specified simulator. ADAPT can utilize any simulation tool which meets a minimal set of requirements. ADAPT is based on the concept of DET, which uses explicit modeling of the deterministic dynamic processes that take place during a nuclear reactor plant system evolution along with stochastic modeling. When DETs are used to model different aspects of Probabilistic Risk Assessment (PRA), all accident progression scenarios starting from an initiating event are considered simultaneously. The DET branching occurs at user-specified times and/or when an action is required by the system and/or the operator. These outcomes then decide how the dynamic system variables will evolve in time for each DET branch. Since two different outcomes at a DET branching may lead to completely different paths for system evolution, the next branching for these paths may occur not only at different times, but can be based on different branching criteria. The computational infrastructure allows for flexibility in ADAPT to link with different system simulation codes, parallel processing of the scenarios under consideration, on-line scenario management (initiation as well as termination) and user-friendly graphical capabilities. The ADAPT system is designed for a distributed computing environment; the scheduler can track multiple concurrent branches simultaneously. The scheduler is modularized so that the DET branching strategy can be modified (e.g. biasing towards the worst-case scenario/event). Independent database systems store data from the simulation tasks and the DET structure so that the event tree can be constructed and analyzed later. ADAPT is provided with a user-friendly client which can easily sort through and display the results of an experiment, precluding the need for the user to manually inspect individual simulator runs.
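
    The dynamic event tree idea can be illustrated with a toy sketch: each branch carries a simulated state and a cumulative probability, and each branching point forks the branch into its possible outcomes. This is only a schematic stand-in; ADAPT's actual scheduler and simulator interface are not documented here, and all event names and probabilities are invented.

      # Toy sketch of dynamic event tree branching with probabilistic outcomes.
      from dataclasses import dataclass, field

      @dataclass
      class Branch:
          time_s: float
          probability: float
          history: list = field(default_factory=list)

      def advance(branch, until_s):
          """Placeholder for running the plant simulator up to the next branching time."""
          branch.time_s = until_s
          return branch

      def dynamic_event_tree(branching_points):
          branches = [Branch(time_s=0.0, probability=1.0)]
          for time_s, event, outcomes in branching_points:     # outcomes: {name: prob}
              new_branches = []
              for br in branches:
                  advance(br, time_s)
                  for name, p in outcomes.items():
                      new_branches.append(Branch(time_s, br.probability * p,
                                                 br.history + [f"{event}={name}"]))
              branches = new_branches
          return branches

      if __name__ == "__main__":
          points = [(600.0, "relief_valve", {"recloses": 0.9, "stuck_open": 0.1}),
                    (1800.0, "operator_action", {"success": 0.7, "failure": 0.3})]
          for br in dynamic_event_tree(points):
              print(f"p={br.probability:.2f}  " + " -> ".join(br.history))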

  4. Radiation accidents

    SciTech Connect

    Saenger, E.L.

    1986-09-01

    It is essential that emergency physicians understand ways to manage patients contaminated by radioactive materials and/or exposed to external radiation sources. Contamination accidents require careful surveys to identify the metabolic pathway of the radionuclides to guide prognosis and treatment. The level of treatment required will depend on careful surveys and meticulous decontamination. There is no specific therapy for the acute radiation syndrome. Prophylactic antibiotics are desirable. For severely exposed patients treatment is similar to the supportive care given to patients undergoing organ transplantation. For high-dose extremity injury, no methods have been developed to reverse the fibrosing endarteritis that eventually leads to tissue death so frequently found with this type of injury. Although the Three Mile Island episode of March 1979 created tremendous public concern, there were no radiation injuries. The contamination outside the reactor building and the release of radioiodine were negligible. The accidental fuel element meltdown at Chernobyl, USSR, resulted in many cases of acute radiation syndrome. More than 100,000 people were exposed to high levels of radioactive fallout. The general principles outlined here are applicable to accidents of that degree of severity.

  5. Radiation accidents.

    PubMed

    Saenger, E L

    1986-09-01

    It is essential that emergency physicians understand ways to manage patients contaminated by radioactive materials and/or exposed to external radiation sources. Contamination accidents require careful surveys to identify the metabolic pathway of the radionuclides to guide prognosis and treatment. The level of treatment required will depend on careful surveys and meticulous decontamination. There is no specific therapy for the acute radiation syndrome. Prophylactic antibiotics are desirable. For severely exposed patients treatment is similar to the supportive care given to patients undergoing organ transplantation. For high-dose extremity injury, no methods have been developed to reverse the fibrosing endarteritis that eventually leads to tissue death so frequently found with this type of injury. Although the Three Mile Island episode of March 1979 created tremendous public concern, there were no radiation injuries. The contamination outside the reactor building and the release of radioiodine were negligible. The accidental fuel element meltdown at Chernobyl, USSR, resulted in many cases of acute radiation syndrome. More than 100,000 people were exposed to high levels of radioactive fallout. The general principles outlined here are applicable to accidents of that degree of severity. PMID:3526994

  6. A Methodology for the Development of RESTful Semantic Web Services for Gene Expression Analysis

    PubMed Central

    Guardia, Gabriela D. A.; Pires, Luís Ferreira; Vêncio, Ricardo Z. N.; Malmegrim, Kelen C. R.; de Farias, Cléver R. G.

    2015-01-01

    Gene expression studies are generally performed through multi-step analysis processes, which require the integrated use of a number of analysis tools. In order to facilitate tool/data integration, an increasing number of analysis tools have been developed as or adapted to semantic web services. In recent years, some approaches have been defined for the development and semantic annotation of web services created from legacy software tools, but these approaches still present many limitations. In addition, to the best of our knowledge, no suitable approach has been defined for the functional genomics domain. Therefore, this paper aims at defining an integrated methodology for the implementation of RESTful semantic web services created from gene expression analysis tools and the semantic annotation of such services. We have applied our methodology to the development of a number of services to support the analysis of different types of gene expression data, including microarray and RNASeq. All developed services are publicly available in the Gene Expression Analysis Services (GEAS) Repository at http://dcm.ffclrp.usp.br/lssb/geas. Additionally, we have used a number of the developed services to create different integrated analysis scenarios to reproduce parts of two gene expression studies documented in the literature. The first study involves the analysis of one-color microarray data obtained from multiple sclerosis patients and healthy donors. The second study comprises the analysis of RNA-Seq data obtained from melanoma cells to investigate the role of the remodeller BRG1 in the proliferation and morphology of these cells. Our methodology provides concrete guidelines and technical details in order to facilitate the systematic development of semantic web services. Moreover, it encourages the development and reuse of these services for the creation of semantically integrated solutions for gene expression analysis. PMID:26207740

  7. A Methodology for the Development of RESTful Semantic Web Services for Gene Expression Analysis.

    PubMed

    Guardia, Gabriela D A; Pires, Luís Ferreira; Vêncio, Ricardo Z N; Malmegrim, Kelen C R; de Farias, Cléver R G

    2015-01-01

    Gene expression studies are generally performed through multi-step analysis processes, which require the integrated use of a number of analysis tools. In order to facilitate tool/data integration, an increasing number of analysis tools have been developed as or adapted to semantic web services. In recent years, some approaches have been defined for the development and semantic annotation of web services created from legacy software tools, but these approaches still present many limitations. In addition, to the best of our knowledge, no suitable approach has been defined for the functional genomics domain. Therefore, this paper aims at defining an integrated methodology for the implementation of RESTful semantic web services created from gene expression analysis tools and the semantic annotation of such services. We have applied our methodology to the development of a number of services to support the analysis of different types of gene expression data, including microarray and RNASeq. All developed services are publicly available in the Gene Expression Analysis Services (GEAS) Repository at http://dcm.ffclrp.usp.br/lssb/geas. Additionally, we have used a number of the developed services to create different integrated analysis scenarios to reproduce parts of two gene expression studies documented in the literature. The first study involves the analysis of one-color microarray data obtained from multiple sclerosis patients and healthy donors. The second study comprises the analysis of RNA-Seq data obtained from melanoma cells to investigate the role of the remodeller BRG1 in the proliferation and morphology of these cells. Our methodology provides concrete guidelines and technical details in order to facilitate the systematic development of semantic web services. Moreover, it encourages the development and reuse of these services for the creation of semantically integrated solutions for gene expression analysis.

  8. A Methodology for the Development of RESTful Semantic Web Services for Gene Expression Analysis.

    PubMed

    Guardia, Gabriela D A; Pires, Luís Ferreira; Vêncio, Ricardo Z N; Malmegrim, Kelen C R; de Farias, Cléver R G

    2015-01-01

    Gene expression studies are generally performed through multi-step analysis processes, which require the integrated use of a number of analysis tools. In order to facilitate tool/data integration, an increasing number of analysis tools have been developed as or adapted to semantic web services. In recent years, some approaches have been defined for the development and semantic annotation of web services created from legacy software tools, but these approaches still present many limitations. In addition, to the best of our knowledge, no suitable approach has been defined for the functional genomics domain. Therefore, this paper aims at defining an integrated methodology for the implementation of RESTful semantic web services created from gene expression analysis tools and the semantic annotation of such services. We have applied our methodology to the development of a number of services to support the analysis of different types of gene expression data, including microarray and RNASeq. All developed services are publicly available in the Gene Expression Analysis Services (GEAS) Repository at http://dcm.ffclrp.usp.br/lssb/geas. Additionally, we have used a number of the developed services to create different integrated analysis scenarios to reproduce parts of two gene expression studies documented in the literature. The first study involves the analysis of one-color microarray data obtained from multiple sclerosis patients and healthy donors. The second study comprises the analysis of RNA-Seq data obtained from melanoma cells to investigate the role of the remodeller BRG1 in the proliferation and morphology of these cells. Our methodology provides concrete guidelines and technical details in order to facilitate the systematic development of semantic web services. Moreover, it encourages the development and reuse of these services for the creation of semantically integrated solutions for gene expression analysis. PMID:26207740

  9. Supplemental analysis of accident sequences and source terms for waste treatment and storage operations and related facilities for the US Department of Energy waste management programmatic environmental impact statement

    SciTech Connect

    Folga, S.; Mueller, C.; Nabelssi, B.; Kohout, E.; Mishima, J.

    1996-12-01

    This report presents supplemental information for the document Analysis of Accident Sequences and Source Terms at Waste Treatment, Storage, and Disposal Facilities for Waste Generated by US Department of Energy Waste Management Operations. Additional technical support information is supplied concerning treatment of transuranic waste by incineration and considering the Alternative Organic Treatment option for low-level mixed waste. The latest respirable airborne release fraction values published by the US Department of Energy for use in accident analysis have been used and are included as Appendix D, where respirable airborne release fraction is defined as the fraction of material exposed to accident stresses that could become airborne as a result of the accident. A set of dominant waste treatment processes and accident scenarios was selected for a screening-process analysis. A subset of results (release source terms) from this analysis is presented.
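
    The respirable airborne release fraction defined above is commonly used in the multiplicative source-term expression ST = MAR x DR x ARF x RF x LPF (material at risk, damage ratio, airborne release fraction, respirable fraction, leak-path factor). The sketch below only illustrates that arithmetic; all input values are placeholders, not figures from the supplemental analysis.

      # Hedged sketch of the five-factor respirable source-term product.
      def respirable_source_term(mar_g, damage_ratio, arf, respirable_fraction, leak_path_factor):
          return mar_g * damage_ratio * arf * respirable_fraction * leak_path_factor

      if __name__ == "__main__":
          st = respirable_source_term(mar_g=1000.0, damage_ratio=0.25, arf=1.0e-3,
                                      respirable_fraction=0.5, leak_path_factor=0.1)
          print(f"respirable release: {st:.3e} g")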

  10. Data development technical support document for the aircraft crash risk analysis methodology (ACRAM) standard

    SciTech Connect

    Kimura, C.Y.; Glaser, R.E.; Mensing, R.W.; Lin, T.; Haley, T.A.; Barto, A.B.; Stutzke, M.A.

    1996-08-01

    The Aircraft Crash Risk Analysis Methodology (ACRAM) Panel has been formed by the US Department of Energy Office of Defense Programs (DOE/DP) for the purpose of developing a standard methodology for determining the risk from aircraft crashes onto DOE ground facilities. In order to accomplish this goal, the ACRAM Panel has been divided into four teams: the data development team, the model evaluation team, the structural analysis team, and the consequence team. Each team, consisting of at least one member of the ACRAM Panel plus additional DOE and DOE contractor personnel, specializes in the development of the methodology assigned to that team. This report documents the work performed by the data development team and provides the technical basis for the data used by the ACRAM Standard for determining the aircraft crash frequency. This report should be used to provide the generic data needed to calculate the aircraft crash frequency into the facility under consideration as part of the process for determining the aircraft crash risk to ground facilities as given by the DOE Standard Aircraft Crash Risk Analysis Methodology (ACRAM). Some broad guidance is presented on how to obtain the needed site-specific and facility-specific data, but these data are not provided by this document.
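
    To show how such data enter a crash-frequency estimate, the sketch below implements the generic four-factor form of aircraft crash-frequency models: a sum over aircraft categories of annual operations, crash rate per operation, crash-location probability per unit area at the site, and facility effective area. The traffic counts, rates, and areas are invented for illustration and are not the standard's data.

        # Generic four-factor crash-frequency model: F = sum_i N_i * P_i * f_i * A_i.
        # Every number below is an invented placeholder.
        def crash_frequency(categories):
            """categories: dicts with N (ops/yr), P (crashes/op),
            f (crash-location probability per sq. mile at the site),
            A (facility effective area, sq. miles)."""
            return sum(c["N"] * c["P"] * c["f"] * c["A"] for c in categories)

        example = [
            {"N": 50_000, "P": 4e-7, "f": 1e-3, "A": 0.02},  # hypothetical general aviation traffic
            {"N": 5_000,  "P": 1e-7, "f": 5e-4, "A": 0.02},  # hypothetical commercial traffic
        ]
        print(f"{crash_frequency(example):.2e} crashes per year")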

  11. Oranges and Peaches: Understanding Communication Accidents in the Reference Interview.

    ERIC Educational Resources Information Center

    Dewdney, Patricia; Michell, Gillian

    1996-01-01

    Librarians often have communication "accidents" with reference questions as initially presented. This article presents linguistic analysis of query categories, including: simple failures of hearing, accidents involving pronunciation or homophones, accidents where users repeat earlier misinterpretations to librarians, and accidents where users…

  12. The Fukushima accident was preventable.

    PubMed

    Synolakis, Costas; Kânoğlu, Utku

    2015-10-28

    The 11 March 2011 tsunami was probably the fourth largest in the past 100 years and killed over 15 000 people. The magnitude of the design tsunami triggering earthquake affecting this region of Japan had been grossly underestimated, and the tsunami hit the Fukushima Dai-ichi nuclear power plant (NPP), causing the third most severe accident in an NPP ever. Interestingly, while the Onagawa NPP was also hit by a tsunami of approximately the same height as Dai-ichi, it survived the event 'remarkably undamaged'. We explain what has been referred to as the cascade of engineering and regulatory failures that led to the Fukushima disaster. One, insufficient attention had been given to evidence of large tsunamis inundating the region earlier, to Japanese research suggesting that large earthquakes could occur anywhere along a subduction zone, and to new research on mega-thrusts since Boxing Day 2004. Two, there were unexplainably different design conditions for NPPs at close distances from each other. Three, the hazard analysis to calculate the maximum probable tsunami at Dai-ichi appeared to have had methodological mistakes, which almost nobody experienced in tsunami engineering would have made. Four, there were substantial inadequacies in the Japan nuclear regulatory structure. The Fukushima accident was preventable, if international best practices and standards had been followed, if there had been international reviews, and had common sense prevailed in the interpretation of pre-existing geological and hydrodynamic findings. Formal standards are needed for evaluating the tsunami vulnerability of NPPs, for specific training of engineers and scientists who perform tsunami computations for emergency preparedness or critical facilities, as well as for regulators who review safety studies.

  14. Evaluation of a Progressive Failure Analysis Methodology for Laminated Composite Structures

    NASA Technical Reports Server (NTRS)

    Sleight, David W.; Knight, Norman F., Jr.; Wang, John T.

    1997-01-01

    A progressive failure analysis methodology has been developed for predicting the nonlinear response and failure of laminated composite structures. The progressive failure analysis uses C1 plate and shell elements based on classical lamination theory to calculate the in-plane stresses. Several failure criteria, including the maximum strain criterion, Hashin's criterion, and Christensen's criterion, are used to predict the failure mechanisms. The progressive failure analysis model is implemented in a general-purpose finite element code and can predict the damage and response of laminated composite structures from initial loading to final failure.
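
    As a small illustration of the ply-level checks such failure criteria perform, the sketch below applies a maximum-strain criterion to a single ply. The strain allowables and applied strains are illustrative assumptions, not values from the study, and the other criteria (Hashin, Christensen) would require stress-based expressions.

        # Maximum-strain failure check for one ply (all allowables are assumed values).
        def max_strain_failure(eps, allow):
            """eps: (eps1, eps2, gamma12) ply strains; allow: dict of strain limits."""
            e1, e2, g12 = eps
            ratios = {
                "fiber":  e1 / (allow["e1t"] if e1 >= 0 else -allow["e1c"]),
                "matrix": e2 / (allow["e2t"] if e2 >= 0 else -allow["e2c"]),
                "shear":  abs(g12) / allow["g12"],
            }
            return {mode: r for mode, r in ratios.items() if r >= 1.0}  # empty dict = no failure

        print(max_strain_failure((0.012, -0.002, 0.010),
                                 {"e1t": 0.010, "e1c": 0.008, "e2t": 0.004, "e2c": 0.010, "g12": 0.015}))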

  15. Health effects model for nuclear power plant accident consequence analysis. Part I. Introduction, integration, and summary. Part II. Scientific basis for health effects models

    SciTech Connect

    Evans, J.S.; Moeller, D.W.; Cooper, D.W.

    1985-07-01

    Analysis of the radiological health effects of nuclear power plant accidents requires models for predicting early health effects, cancers and benign thyroid nodules, and genetic effects. Since the publication of the Reactor Safety Study, additional information on radiological health effects has become available. This report summarizes the efforts of a program designed to provide revised health effects models for nuclear power plant accident consequence modeling. The new models for early effects address four causes of mortality and nine categories of morbidity. The models for early effects are based upon two-parameter Weibull functions. They permit evaluation of the influence of dose protraction and address the issue of variation in radiosensitivity among the population. The piecewise-linear dose-response models used in the Reactor Safety Study to predict cancers and thyroid nodules have been replaced by linear and linear-quadratic models. The new models reflect the most recently reported results of the follow-up of the survivors of the bombings of Hiroshima and Nagasaki and permit analysis of both morbidity and mortality. The new models for genetic effects allow prediction of genetic risks in each of the first five generations after an accident and include information on the relative severity of various classes of genetic effects. The uncertainty in modeling radiological health risks is addressed by providing central, upper, and lower estimates of risks. An approach is outlined for summarizing the health consequences of nuclear power plant accidents. 298 refs., 9 figs., 49 tabs.
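
    To make the model forms concrete, the sketch below evaluates a generic two-parameter Weibull dose-response function for an early effect and a linear-quadratic excess-risk term for a stochastic effect. The D50, shape, and risk-coefficient values are placeholders, not the fitted parameters of the report.

        import math

        # Generic two-parameter Weibull risk, R = 1 - exp(-ln2 * (D/D50)^shape),
        # and a linear-quadratic excess risk, R = alpha*D + beta*D^2 (small-risk form).
        def weibull_early_risk(dose_gy, d50_gy, shape):
            if dose_gy <= 0.0:
                return 0.0
            return 1.0 - math.exp(-math.log(2.0) * (dose_gy / d50_gy) ** shape)

        def linear_quadratic_excess_risk(dose_gy, alpha, beta):
            return alpha * dose_gy + beta * dose_gy ** 2

        print(weibull_early_risk(3.0, d50_gy=4.5, shape=6.0))            # assumed parameters
        print(linear_quadratic_excess_risk(0.1, alpha=5e-2, beta=5e-3))  # assumed coefficients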

  16. Putting phylogeny into the analysis of biological traits: a methodological approach.

    PubMed

    Jombart, Thibaut; Pavoine, Sandrine; Devillard, Sébastien; Pontier, Dominique

    2010-06-01

    Phylogenetic comparative methods have long considered phylogenetic signal as a source of statistical bias in the correlative analysis of biological traits. However, the main life-history strategies existing in a set of taxa are often combinations of life history traits that are inherently phylogenetically structured. In this paper, we present a method for identifying evolutionary strategies from large sets of biological traits, using phylogeny as a source of meaningful historical and ecological information. Our methodology extends a multivariate method developed for the analysis of spatial patterns, and relies on finding combinations of traits that are phylogenetically autocorrelated. Using extensive simulations, we show that our method efficiently uncovers phylogenetic structures with respect to various tree topologies, and remains powerful in cases where a large majority of traits are not phylogenetically structured. Our methodology is illustrated using empirical data, and implemented in the adephylo package for the free software R.

  17. Social representations, correspondence factor analysis and characterization questionnaire: a methodological contribution.

    PubMed

    Lo Monaco, Grégory; Piermattéo, Anthony; Guimelli, Christian; Abric, Jean-Claude

    2012-11-01

    The characterization questionnaire is inspired by Q-sort methodologies (i.e. qualitative sorting). It consists in asking participants to give their opinion on a list of items by sorting them into categories depending on their level of characterization of the object. This technique allows us to obtain distributions for each item and each response modality (i.e. characteristic vs. not chosen vs. not characteristic). This contribution intends to analyze these frequencies by means of correspondence factor analysis. The originality of this contribution lies in the fact that this kind of analysis has never been used to process data collected by means of this questionnaire. The procedure will be detailed and exemplified by means of two empirical studies on social representations of the good wine and the good supermarket. The interest of such a contribution is discussed from both a methodological point of view and an applications perspective. PMID:23156928
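
    For readers unfamiliar with correspondence analysis of such frequency tables, the sketch below runs a minimal correspondence analysis on an invented item-by-response-category count table, using a singular value decomposition of the standardized residuals. It is a generic implementation, not the authors' analysis or data.

        import numpy as np

        # Minimal correspondence analysis of a count table whose columns are the
        # response modalities (characteristic / not chosen / not characteristic).
        def correspondence_analysis(counts):
            P = counts / counts.sum()
            r, c = P.sum(axis=1), P.sum(axis=0)
            S = np.diag(r ** -0.5) @ (P - np.outer(r, c)) @ np.diag(c ** -0.5)
            U, sig, Vt = np.linalg.svd(S, full_matrices=False)
            row_coords = (U * sig) / np.sqrt(r)[:, None]
            col_coords = (Vt.T * sig) / np.sqrt(c)[:, None]
            return row_coords, col_coords, sig ** 2 / (sig ** 2).sum()

        counts = np.array([[40.0, 10.0,  5.0],   # invented item 1
                           [12.0, 20.0, 23.0],   # invented item 2
                           [ 5.0, 15.0, 35.0]])  # invented item 3
        rows, cols, explained = correspondence_analysis(counts)
        print(explained)  # share of total inertia carried by each factorial axis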

  18. Heterogenic Solid Biofuel Sampling Methodology and Uncertainty Associated with Prompt Analysis

    PubMed Central

    Pazó, Jose A.; Granada, Enrique; Saavedra, Ángeles; Patiño, David; Collazo, Joaquín

    2010-01-01

    Accurate determination of the properties of biomass is of particular interest in studies on biomass combustion or cofiring. The aim of this paper is to develop a methodology for prompt analysis of heterogeneous solid fuels with an acceptable degree of accuracy. Special care must be taken with the sampling procedure to achieve an acceptable degree of error and low statistical uncertainty. A sampling and error determination methodology for prompt analysis is presented and validated. Two approaches for the propagation of errors are also given and some comparisons are made in order to determine which may be better in this context. Results show in general low, acceptable levels of uncertainty, demonstrating that the samples obtained in the process are representative of the overall fuel composition. PMID:20559506
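
    As one common way to propagate measurement errors through a derived fuel property, the sketch below applies first-order (Gaussian) propagation to a dry-basis heating value computed from an as-received heating value and a moisture fraction. The measured values and standard uncertainties are illustrative assumptions, and this textbook approach is not necessarily either of the paper's two approaches.

        import math

        # First-order error propagation through f(x) using numerical partial derivatives.
        def propagate(f, x, sigma, h=1e-6):
            fx = f(*x)
            var = 0.0
            for i, (xi, si) in enumerate(zip(x, sigma)):
                xp = list(x)
                xp[i] = xi + h
                var += (((f(*xp) - fx) / h) * si) ** 2
            return fx, math.sqrt(var)

        # Dry-basis heating value HV_dry = HV_ar / (1 - M), with assumed uncertainties.
        hv_dry, u = propagate(lambda hv_ar, m: hv_ar / (1.0 - m),
                              x=(17.5, 0.12), sigma=(0.3, 0.01))
        print(f"{hv_dry:.2f} +/- {u:.2f} MJ/kg (dry basis)")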

  19. Novel Data Mining Methodologies for Adverse Drug Event Discovery and Analysis

    PubMed Central

    Harpaz, Rave; DuMouchel, William; Shah, Nigam H.; Madigan, David; Ryan, Patrick; Friedman, Carol

    2013-01-01

    Introduction: Discovery of new adverse drug events (ADEs) in the post-approval period is an important goal of the health system. Data mining methods that can transform data into meaningful knowledge to inform patient safety have proven to be essential. New opportunities have emerged to harness data sources that have not been used within the traditional framework. This article provides an overview of recent methodological innovations and data sources used in support of ADE discovery and analysis. PMID:22549283

  20. Methodology for social accountability: multiple methods and feminist, poststructural, psychoanalytic discourse analysis.

    PubMed

    Phillips, D A

    2001-06-01

    Bridging the gap between the individual and social context, methodology that aims to surface and explore the regulatory function of discourse on subjectivity production moves nursing research beyond the individual level in order to theorize social context and its influence on health and well-being. This article describes the feminist, poststructural, psychoanalytic discourse analysis and multiple methods used in a recent study exploring links between cultural discourses of masculinity, performativity of masculinity, and practices of male violence.

  1. Comparison of Fracture Methodologies for Flaw Stability Analysis of Storage Tanks

    SciTech Connect

    LAM, POH-SANG

    2004-04-05

    Fracture mechanics methodologies for flaw stability analysis of a storage tank were compared in terms of the maximum stable through-wall flaw sizes, or "instability lengths." The comparison was made over a full range of stress loading for a specific set of mechanical properties of A285 carbon steel and with the actual tank configuration. The two general methodologies, the J-integral-tearing modulus (J-T) and the failure assessment diagram (FAD), and their specific estimation schemes were evaluated. A finite element analysis of a flawed tank was also performed for validating the J estimation scheme with curvature correction and for constructing the finite element-based FAD. The calculated instability crack lengths show that the J-T methodology that uses an estimation scheme and the material-specific FAD most closely approximate the result calculated with finite element analysis for the stress range that bounds those expected at the highest fill levels in the storage tanks. The results from the other FAD methods show instability lengths less than the J-T results over this range.
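
    For orientation, the sketch below performs the kind of pass/fail check a FAD makes: an assessment point (Lr, Kr) is compared against a generic Option 1-type curve of the form published in common fitness-for-service procedures. The curve, the cut-off, and the assessment point are illustrative; they are not the material-specific FAD or the A285 results of the report.

        import math

        # Generic Option 1-type FAD curve and a simple acceptability check.
        # The assessment point and Lr cut-off below are assumed values.
        def fad_curve(lr):
            return (1.0 - 0.14 * lr ** 2) * (0.3 + 0.7 * math.exp(-0.65 * lr ** 6))

        def acceptable(kr_applied, lr_applied, lr_max=1.25):
            return lr_applied <= lr_max and kr_applied <= fad_curve(lr_applied)

        print(acceptable(kr_applied=0.55, lr_applied=0.80))  # True -> flaw assessed as stable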

  2. A systematic review and analysis of factors associated with methodological quality in laparoscopic randomized controlled trials.

    PubMed

    Antoniou, Stavros Athanasios; Andreou, Alexandros; Antoniou, George Athanasios; Bertsias, Antonios; Köhler, Gernot; Koch, Oliver Owen; Pointner, Rudolph; Granderath, Frank-Alexander

    2015-01-01

    Several methods for assessment of methodological quality in randomized controlled trials (RCTs) have been developed during the past few years. Factors associated with quality in laparoscopic surgery have not been defined to date. The aim of this study was to investigate the relationship between bibliometric factors and the methodological quality of laparoscopic RCTs. The PubMed search engine was queried to identify RCTs on minimally invasive surgery published in 2012 in the 10 highest impact factor surgery journals and the 5 highest impact factor laparoscopic journals. Eligible studies were blindly assessed by two independent investigators using the Scottish Intercollegiate Guidelines Network (SIGN) tool for RCTs. Univariate and multivariate analyses were performed to identify potential associations with methodological quality. A total of 114 relevant RCTs were identified. More than half of the trials were of high or acceptable quality. Half of the reports provided information on comparative demographic data and only 21% performed intention-to-treat analysis. RCTs with a sample size of at least 60 patients presented higher methodological quality (p = 0.025). Upon multiple regression, reporting on preoperative care and the experience level of surgeons were independent factors of quality. PMID:25896540

  3. [Development of New Mathematical Methodology in Air Traffic Control for the Analysis of Hybrid Systems]

    NASA Technical Reports Server (NTRS)

    Hermann, Robert

    1997-01-01

    The aim of this research is to develop new mathematical methodology for the analysis of hybrid systems of the type involved in Air Traffic Control (ATC) problems. Two directions of investigation were initiated. The first used the methodology of nonlinear generalized functions, whose mathematical foundations were initiated by Colombeau and developed further by Oberguggenberger; it has been extended to apply to ordinary differential systems of the type encountered in control, in joint work with the PI and M. Oberguggenberger. This involved a 'mixture' of 'continuous' and 'discrete' methodology. ATC clearly involves mixtures of two sorts of mathematical problems: (1) the 'continuous' dynamics of a standard control type described by ordinary differential equations (ODE) of the form dx/dt = f(x, u), and (2) the discrete lattice dynamics of cellular automata. Most of the CA literature involves a discretization of a partial differential equation system of the type encountered in physics problems (e.g. fluid and gas problems). Both of these directions require much thinking and new development of mathematical fundamentals before they may be utilized in the ATC work. Rather than consider CA as a 'discretization' of PDE systems, I believe that the ATC applications will require a completely different and new mathematical methodology, a sort of discrete analogue of jet bundles and/or the sheaf-theoretic techniques known to topologists. Here too, I have begun work on virtually 'virgin' mathematical ground (at least from an 'applied' point of view) which will require considerable preliminary work.

  4. Safety and Response-Time Analysis of an Automotive Accident Assistance Service

    NASA Astrophysics Data System (ADS)

    Argent-Katwala, Ashok; Clark, Allan; Foster, Howard; Gilmore, Stephen; Mayer, Philip; Tribastone, Mirco

    In the present paper we assess both the safety properties and the response-time profile of a subscription service which provides medical assistance to drivers who are injured in vehicular collisions. We use both timed and untimed process calculi cooperatively to perform the required analysis. The formal analysis tools used are hosted on a high-level modelling platform with support for scripting and orchestration which enables users to build custom analysis processes from the general-purpose analysers which are hosted as services on the platform.

  5. Methodology for the analysis of fenbendazole and its metabolites in plasma, urine, feces, and tissue homogenates.

    PubMed

    Barker, S A; Hsieh, L C; Short, C R

    1986-05-15

    New methodology for the extraction and analysis of the anthelmintic fenbendazole and its metabolites from plasma, urine, liver homogenates, and feces from several animal species is presented. Quantitation of fenbendazole and its metabolites was conducted by high-pressure liquid chromatography using ultraviolet detection at 290 nm. The combined extraction and analysis procedures give excellent recoveries in all of the different biological matrices examined. High specificity, low limits of detection, and excellent linearity, accuracy, and inter- and intrasample variability were also obtained. The study of fenbendazole pharmacokinetics in vitro and in vivo should be greatly enhanced through the utilization of these methods.

  6. Case-Crossover Analysis of Air Pollution Health Effects: A Systematic Review of Methodology and Application

    PubMed Central

    Carracedo-Martínez, Eduardo; Taracido, Margarita; Tobias, Aurelio; Saez, Marc; Figueiras, Adolfo

    2010-01-01

    Background: Case-crossover is one of the most used designs for analyzing the health-related effects of air pollution. Nevertheless, no one has reviewed its application and methodology in this context. Objective: We conducted a systematic review of case-crossover (CCO) designs used to study the relationship between air pollution and morbidity and mortality, from the standpoint of methodology and application. Data sources and extraction: A search was made of the MEDLINE and EMBASE databases. Reports were classified as methodologic or applied. From the latter, the following information was extracted: author, study location, year, type of population (general or patients), dependent variable(s), independent variable(s), type of CCO design, and whether effect modification was analyzed for variables at the individual level. Data synthesis: The review covered 105 reports that fulfilled the inclusion criteria. Of these, 24 addressed methodological aspects, and the remainder involved the design’s application. In the methodological reports, the designs that yielded the best results in simulation were symmetric bidirectional CCO and time-stratified CCO. Furthermore, we observed an increase across time in the use of certain CCO designs, mainly symmetric bidirectional and time-stratified CCO. The dependent variables most frequently analyzed were those relating to hospital morbidity; the pollutants most often studied were those linked to particulate matter. Among the CCO-application reports, 13.6% studied effect modification for variables at the individual level. Conclusions: The use of CCO designs has undergone considerable growth; the most widely used designs were those that yielded better results in simulation studies: symmetric bidirectional and time-stratified CCO. However, the advantages of CCO as a method of analysis of variables at the individual level are put to little use. PMID:20356818
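
    As a concrete piece of the time-stratified design mentioned above, the sketch below selects referent (control) days for an event day: all other days in the same calendar month that share the event's day of week. The date is arbitrary, and the sketch omits the exposure lookup and conditional logistic regression that a full analysis would add.

        from datetime import date, timedelta

        # Time-stratified referent selection: same year, month and weekday as the event day.
        def time_stratified_referents(event_day):
            refs = []
            d = date(event_day.year, event_day.month, 1)
            while d.month == event_day.month:
                if d.weekday() == event_day.weekday() and d != event_day:
                    refs.append(d)
                d += timedelta(days=1)
            return refs

        print(time_stratified_referents(date(2010, 7, 14)))  # the other Wednesdays of July 2010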

  7. Sensitivity and uncertainty analysis within a methodology for evaluating environmental restoration technologies

    NASA Astrophysics Data System (ADS)

    Zio, Enrico; Apostolakis, George E.

    1999-03-01

    This paper illustrates an application of sensitivity and uncertainty analysis techniques within a methodology for evaluating environmental restoration technologies. The methodology consists of two main parts: the first part ("analysis") integrates a wide range of decision criteria and impact evaluation techniques in a framework that emphasizes and incorporates input from stakeholders in all aspects of the process. Its products are the rankings of the alternative options for each stakeholder using, essentially, expected utility theory. The second part ("deliberation") utilizes the analytical results of the "analysis" and attempts to develop consensus among the stakeholders in a session in which the stakeholders discuss and evaluate the analytical results. This paper deals with the analytical part of the approach and the uncertainty and sensitivity analyses that were carried out in preparation for the deliberative process. The objective of these investigations was that of testing the robustness of the assessments and of pointing out possible existing sources of disagreements among the participating stakeholders, thus providing insights for the successive deliberative process. Standard techniques, such as differential analysis, Monte Carlo sampling and a two-dimensional policy region analysis proved sufficient for the task.
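
    As a toy illustration of the Monte Carlo side of such an analysis, the sketch below resamples the criterion weights of a simple weighted-utility ranking and counts how often each alternative ranks first. The alternatives, scores, and weight ranges are invented and do not correspond to the study's options or stakeholder inputs.

        import random

        # Monte Carlo robustness check of a weighted-utility ranking under weight uncertainty.
        alternatives = {"cap-in-place": [0.8, 0.4, 0.9],   # scores: [cost, risk reduction, acceptance]
                        "excavate":     [0.3, 0.9, 0.6],
                        "bioremediate": [0.6, 0.7, 0.7]}

        def sample_weights(rng):
            w = [rng.uniform(0.2, 0.5), rng.uniform(0.3, 0.6), rng.uniform(0.1, 0.4)]
            s = sum(w)
            return [x / s for x in w]

        rng = random.Random(42)
        wins = {name: 0 for name in alternatives}
        for _ in range(10_000):
            w = sample_weights(rng)
            best = max(alternatives, key=lambda a: sum(wi * si for wi, si in zip(w, alternatives[a])))
            wins[best] += 1
        print(wins)  # how often each alternative ranked first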

  8. Characteristic variation and original analysis of emergent water source pollution accidents in China between 1985 and 2013.

    PubMed

    Qu, Jianhua; Meng, Xianlin; Ye, Xiuqing; You, Hong

    2016-10-01

    China has suffered various water source pollution incidents in the past decades, which have resulted in severe threats to the safety of the water supply for millions of residents. From the aspects of quantity fluctuation, temporal volatility, regional inequality, pollutant category variation, and accident type differences, this study first characterizes the current status of water source contaminations in China by analyzing 340 pollution events for the period spanning from 1985 to 2013. The results show a general increase in the number of accidents during the period 1985-2006 and then a rapid decline starting in 2007. Spring and summer are high-incidence seasons for pollution, and the accident rate in developed southeastern coastal areas is far higher than that in the northwestern regions. Hazardous chemicals and petroleum are the most frequently occurring pollutants, whereas heavy metals and tailings are becoming emerging contaminants in occasional pollution incidents. Most of the accidents that occurred before 2005 were blamed on illegal emissions or traffic accidents; however, leakage in production has gradually become a major accident type in the past decade. Then, in combination with government actions and policy constraints, this paper explores the underlying causes and offers valuable insight into measures that should be taken to ensure future prevention and mitigation of emergent source water pollution.

  9. A Gap Analysis Methodology for Collecting Crop Genepools: A Case Study with Phaseolus Beans

    PubMed Central

    Ramírez-Villegas, Julián; Khoury, Colin; Jarvis, Andy; Debouck, Daniel Gabriel; Guarino, Luigi

    2010-01-01

    Background: The wild relatives of crops represent a major source of valuable traits for crop improvement. These resources are threatened by habitat destruction, land use changes, and other factors, requiring their urgent collection and long-term availability for research and breeding from ex situ collections. We propose a method to identify gaps in ex situ collections (i.e. gap analysis) of crop wild relatives as a means to guide efficient and effective collecting activities. Methodology/Principal Findings: The methodology prioritizes among taxa based on a combination of sampling, geographic, and environmental gaps. We apply the gap analysis methodology to wild taxa of the Phaseolus genepool. Of 85 taxa, 48 (56.5%) are assigned high priority for collecting due to lack of, or under-representation, in genebanks, 17 taxa are given medium priority for collecting, 15 low priority, and 5 species are assessed as adequately represented in ex situ collections. Gap “hotspots”, representing priority target areas for collecting, are concentrated in central Mexico, although the narrow endemic nature of a suite of priority species adds a number of specific additional regions to spatial collecting priorities. Conclusions/Significance: Results of the gap analysis method mostly align very well with expert opinion of gaps in ex situ collections, with only a few exceptions. A more detailed prioritization of taxa and geographic areas for collection can be achieved by including in the analysis predictive threat factors, such as climate change or habitat destruction, or by adding additional prioritization filters, such as the degree of relatedness to cultivated species (i.e. ease of use in crop breeding). Furthermore, results for multiple crop genepools may be overlaid, which would allow a global analysis of gaps in ex situ collections of the world's plant genetic resources. PMID:20976009
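
    As a schematic of how sampling, geographic, and environmental gaps can be folded into a single collecting priority, the sketch below averages three gap sub-scores (each on a 0-10 scale, 10 = largest gap) and bins the result into priority classes. The taxon names, scores, averaging rule, and cut-offs are illustrative assumptions rather than the paper's exact scoring scheme.

        # Composite collecting-priority score from three assumed gap sub-scores.
        def priority(sampling, geographic, environmental):
            score = (sampling + geographic + environmental) / 3.0
            if score >= 7.5:
                return score, "high priority"
            if score >= 5.0:
                return score, "medium priority"
            if score >= 2.5:
                return score, "low priority"
            return score, "adequately represented"

        for taxon, s in {"Phaseolus sp. A (hypothetical)": (9, 8, 7),
                         "Phaseolus sp. B (hypothetical)": (4, 5, 3)}.items():
            print(taxon, priority(*s))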

  10. Analysis of Sodium Fire in the Containment Building of Prototype Fast Breeder Reactor Under the Scenario of Core Disruptive Accident

    SciTech Connect

    Rao, P.M.; Kasinathan, N.; Kannan, S.E.

    2006-07-01

    The potential for sodium release to the reactor containment building from the reactor assembly during a Core Disruptive Accident (CDA) in Fast Breeder Reactors (FBR) is an important safety issue with reference to the structural integrity of the Reactor Containment Building (RCB). For the Prototype Fast Breeder Reactor (PFBR), the estimated sodium release under a CDA of 100 MJ energy release is 350 kg. The ejected sodium reacts easily with air in the RCB and causes temperature and pressure rise in the RCB. For estimating the severe thermal consequences in the RCB, different modes of sodium fire, such as pool and spray fires, were analyzed using the SOFIRE-II and NACOM sodium fire computer codes. Effects of important parameters like amount of sodium, area of pool, containment air volume and oxygen concentration have been investigated. A peak pressure rise of 7.32 kPa is predicted by the SOFIRE-II code for a 350 kg sodium pool fire in an 86,000 m³ RCB volume. For sodium released as a spray followed by a pool fire of the unburnt sodium, the estimated pressure rise in the RCB is 5.85 kPa. In the mode of instantaneous combustion of sodium, the estimated peak pressure rise is 13 kPa.
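
    A quick adiabatic bound helps put such containment pressure rises in perspective: for an ideal gas heated at constant volume, dP = (gamma - 1) * Q / V. The sketch below evaluates this bound with an assumed sodium heat of combustion of about 9 MJ/kg; because it ignores heat sinks and aerosol behaviour, it should sit above code predictions such as the 13 kPa instantaneous-combustion case cited above.

        # Adiabatic, constant-volume upper bound on pressure rise from a sodium fire.
        # The heat of combustion is an assumed value (Na -> Na2O, roughly 9 MJ/kg).
        def adiabatic_pressure_rise_kpa(sodium_kg, volume_m3, gamma=1.4, heat_mj_per_kg=9.0):
            q_joules = sodium_kg * heat_mj_per_kg * 1.0e6
            return (gamma - 1.0) * q_joules / volume_m3 / 1.0e3

        print(adiabatic_pressure_rise_kpa(350.0, 86_000.0))  # roughly 14.7 kPa upper bound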

  11. Temporal uncertainty analysis of human errors based on interrelationships among multiple factors: a case of Minuteman III missile accident.

    PubMed

    Rong, Hao; Tian, Jin; Zhao, Tingdi

    2016-01-01

    In traditional approaches of human reliability assessment (HRA), the definition of the error producing conditions (EPCs) and the supporting guidance are such that some of the conditions (especially organizational or managerial conditions) can hardly be included, and thus the analysis lacks comprehensiveness and does not reflect the temporal trend of human reliability. A method based on system dynamics (SD), which highlights interrelationships among technical and organizational aspects that may contribute to human errors, is presented to facilitate quantitative estimation of the human error probability (HEP) and its related variables as they change over a long period. Taking the Minuteman III missile accident in 2008 as a case, the proposed HRA method is applied to assess HEP during missile operations over 50 years by analyzing the interactions among the variables involved in human-related risks; the critical factors are also determined in terms of the impact that the variables have on risks in different time periods. It is indicated that both technical and organizational aspects should be focused on to minimize human errors in the long run. PMID:26360211

  13. TRUMP-BD: A computer code for the analysis of nuclear fuel assemblies under severe accident conditions

    SciTech Connect

    Lombardo, N.J.; Marseille, T.J.; White, M.D.; Lowery, P.S.

    1990-06-01

    TRUMP-BD (Boil Down) is an extension of the TRUMP (Edwards 1972) computer program for the analysis of nuclear fuel assemblies under severe accident conditions. This extension allows prediction of the heat transfer rates, metal-water oxidation rates, fission product release rates, steam generation and consumption rates, and temperature distributions for nuclear fuel assemblies under core uncovery conditions. The heat transfer processes include conduction in solid structures, convection across fluid-solid boundaries, and radiation between interacting surfaces. Metal-water reaction kinetics are modeled with empirical relationships to predict the oxidation rates of steam-exposed Zircaloy and uranium metal. The metal-water oxidation models are parabolic in form with an Arrhenius temperature dependence. Uranium oxidation begins when fuel cladding failure occurs; Zircaloy oxidation occurs continuously at temperatures above 1300°F when metal and steam are available. From the metal-water reactions, the hydrogen generation rate, total hydrogen release, and temporal and spatial distribution of oxide formations are computed. Consumption of steam from the oxidation reactions and the effect of hydrogen on the coolant properties are modeled for independent coolant flow channels. Fission product release from exposed uranium metal, Zircaloy-clad fuel is modeled using empirical time and temperature relationships that consider the release to be subject to oxidation and volatilization/diffusion ("bake-out") release mechanisms. Release of the volatile species of iodine (I), tellurium (Te), cesium (Cs), ruthenium (Ru), strontium (Sr), zirconium (Zr), cerium (Ce), and barium (Ba) from uranium metal fuel may be modeled.
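
    To illustrate the stated form of the oxidation model, the sketch below evaluates a parabolic rate law with an Arrhenius rate constant, w^2 = Kp * t with Kp = A * exp(-Q / (R * T)). The pre-exponential factor and activation energy are placeholders, not the correlations coded in TRUMP-BD.

        import math

        R_GAS = 8.314  # J/(mol*K)

        # Parabolic oxidation with Arrhenius temperature dependence (assumed A and Q).
        def oxide_growth(temp_k, time_s, A=3.0e2, Q=1.7e5):
            kp = A * math.exp(-Q / (R_GAS * temp_k))
            return math.sqrt(kp * time_s)  # weight gain or thickness, in the units implied by A

        for T in (1200.0, 1500.0):
            print(T, oxide_growth(T, time_s=600.0))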

  14. [All traffic related deaths are not "fatalities"--analysis of the official Swedish statistics of traffic accident fatalities in 1999].

    PubMed

    Ahlm, K; Eriksson, A; Lekander, T; Björnstig, U

    2001-04-25

    In 1997 the Swedish Parliament decided, in accordance with the so-called Vision Zero, that one official goal for the national traffic safety effort is that the number of traffic fatalities in the year 2007 must not exceed 270. In order to monitor efforts toward this hard-won goal, it is of course of utmost importance that official statistics on traffic deaths are reliable. In a meticulous analysis of all 580 officially registered traffic deaths in Sweden in 1999, we found that 490 were true accidental deaths, while 18 were suicides, 12 were deaths due to indeterminate causes, 59 were natural deaths and 1 case was not possible to evaluate due to missing data. Thus, only 84% of the officially registered "accidental traffic deaths" were bona fide accidents. In order to enhance the reliability of the official statistics, we suggest that: (1) regulations concerning police investigation and medicolegal autopsy of all unnatural deaths be adhered to; (2) all deaths reported to the Swedish National Road Administration should be checked in the database of autopsied cases in the National Board of Forensic Medicine in order to exclude natural deaths; (3) the time delay (1.5 years) to complete the official Cause-of-Death Register be shortened; and (4) criteria for the classification of manner of death in "borderline" cases be suggested for international acceptance.

  15. Nonpoint source pollution of urban stormwater runoff: a methodology for source analysis.

    PubMed

    Petrucci, Guido; Gromaire, Marie-Christine; Shorshani, Masoud Fallah; Chebbo, Ghassan

    2014-09-01

    The characterization and control of runoff pollution from nonpoint sources in urban areas are a major issue for the protection of aquatic environments. We propose a methodology to quantify the sources of pollutants in an urban catchment and to analyze the associated uncertainties. After describing the methodology, we illustrate it through an application to the sources of Cu, Pb, Zn, and polycyclic aromatic hydrocarbons (PAH) from a residential catchment (228 ha) in the Paris region. In this application, we suggest several procedures that can be applied for the analysis of other pollutants in different catchments, including an estimation of the total extent of roof accessories (gutters and downspouts, watertight joints and valleys) in a catchment. These accessories emerge as the major source of Pb and an important source of Zn in the example catchment, while activity-related sources (traffic, heating) are dominant for Cu (brake pad wear) and PAH (tire wear, atmospheric deposition).

  16. Methodology and application of surrogate plant PRA analysis to the Rancho Seco Power Plant: Final report

    SciTech Connect

    Gore, B.F.; Huenefeld, J.C.

    1987-07-01

    This report presents the development and the first application of generic probabilistic risk assessment (PRA) information for identifying systems and components important to public risk at nuclear power plants lacking plant-specific PRAs. A methodology is presented for using the results of PRAs for similar (surrogate) plants, along with plant-specific information about the plant of interest and the surrogate plants, to infer important failure modes for systems of the plant of interest. This methodology and the rationale on which it is based are presented in the context of its application to the Rancho Seco plant. The Rancho Seco plant has been analyzed using PRA information from two surrogate plants. This analysis has been used to guide development of considerable plant-specific information about Rancho Seco systems and components important to minimizing public risk, which is also presented herein.

  17. Methodology for diagnosing of skin cancer on images of dermatologic spots by spectral analysis

    PubMed Central

    Guerra-Rosas, Esperanza; Álvarez-Borrego, Josué

    2015-01-01

    In this paper a new methodology for the diagnosis of skin cancer on images of dermatologic spots using image processing is presented. Currently skin cancer is one of the most frequent diseases in humans. This methodology is based on Fourier spectral analysis using classic, inverse, and nonlinear k-law filters. The sample images were obtained by a medical specialist and a new spectral technique is developed to obtain a quantitative measurement of the complex pattern found in cancerous skin spots. Finally a spectral index is calculated to obtain a range of spectral indices defined for skin cancer. Our results show a confidence level of 95.4%. PMID:26504638
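
    For readers curious about the filtering step, the sketch below applies a k-law nonlinear filter in the Fourier domain (keep the phase, raise the magnitude to the power k) and reduces the result to a single number. The scalar index computed here is only an illustrative stand-in, not the diagnostic spectral index defined by the authors, and the random array stands in for a grayscale lesion image.

        import numpy as np

        # k-law nonlinear filtering of a 2-D image spectrum, plus a toy scalar descriptor.
        def k_law_index(image, k=0.3):
            F = np.fft.fft2(image)
            filtered = (np.abs(F) ** k) * np.exp(1j * np.angle(F))
            power = np.abs(filtered) ** 2
            return (power.sum() - power[0, 0]) / power.sum()  # energy fraction off the DC term

        rng = np.random.default_rng(0)
        print(k_law_index(rng.random((64, 64))))  # placeholder input, not a real lesion image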

  18. A Feasibility Analysis Methodology for Decentralized Wastewater Systems - Energy-Efficiency and Cost.

    PubMed

    Naik, Kartiki S; Stenstrom, Michael K

    2016-03-01

    Centralized wastewater treatment, widely practiced in developed areas, involves transporting wastewater from large urban areas to a large-capacity plant using a single network of sewers, whereas decentralization is the concept of wastewater collection, treatment and reuse at or near its point of generation. Smaller decentralized plants can achieve extensive reclamation and wastewater management with energy-efficient reclaimed water pumping, modularized expansion and lower capital investment. We devised a methodology to preliminarily assess these alternatives using local constraints and conducted a feasibility analysis for each option. It addressed various scenarios in terms of pump-back energy consumption, sewer and treatment plant construction cost, and capacity expansion cost. We demonstrated this methodology by applying it to the Hollywood vicinity (California). In this study, the decentralized configuration was more economical and energy-efficient than the centralized system. The pump-back energy consumption was about 50% of the aeration energy consumption for the centralized option.
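
    Since the comparison hinges on pump-back energy, a minimal sketch of that term may be useful: E = rho * g * V * H / eta for lifting a daily volume V through a head H at pump efficiency eta. The flows, heads, and efficiency below are illustrative assumptions, not the Hollywood case-study values.

        # Daily pumping energy for reclaimed-water pump-back (all inputs are assumed).
        RHO, G = 1000.0, 9.81  # kg/m3, m/s2

        def pumping_energy_kwh_per_day(flow_m3_per_day, head_m, pump_efficiency=0.7):
            joules = RHO * G * flow_m3_per_day * head_m / pump_efficiency
            return joules / 3.6e6

        print(pumping_energy_kwh_per_day(flow_m3_per_day=10_000, head_m=80))  # long centralized pump-back
        print(pumping_energy_kwh_per_day(flow_m3_per_day=10_000, head_m=15))  # nearby decentralized reuse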

  19. Kinetics Parameters of VVER-1000 Core with 3 MOX Lead Test Assemblies To Be Used for Accident Analysis Codes

    SciTech Connect

    Pavlovitchev, A.M.

    2000-03-08

    The present work is part of the Joint U.S./Russian Project on Weapons-Grade Plutonium Disposition in VVER Reactors and presents the neutronics calculations of kinetics parameters of a VVER-1000 core with 3 introduced MOX LTAs. The MOX LTA design has been studied in [1] for two options of MOX LTA: 100% plutonium and the "island" type. As a result, the zoning, i.e., the fissile plutonium enrichments in the different plutonium zones, has been defined. The VVER-1000 core with 3 introduced MOX LTAs of the chosen design has been calculated in [2]. In the present work, the neutronics data for transient analysis codes (RELAP [3]) have been obtained using the code chain of RRC "Kurchatov Institute" [5] that is used for operational neutronics calculations of VVER reactors. Nowadays the 3D assembly-by-assembly code BIPR-7A and the 2D pin-by-pin code PERMAK-A, both with the neutronics constants prepared by the cell code TVS-M, are the base elements of this chain. It should be noted that in [6] TVS-M was used only for the constant calculations of MOX FAs. In the current calculations, the code TVS-M has been used for both UOX and MOX fuel constants. In addition, the volume of presented information has been increased and additional explanations have been included. The results for the reference uranium core [4] are presented in Chapter 2. The results for the core with 3 MOX LTAs are presented in Chapter 3. The conservatism associated with the neutronics parameters, which must be taken into account during transient analysis calculations, is discussed in Chapter 4. The conservative parameter values are intended for use in 1-point core kinetics models of accident analysis codes.
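
    Because these parameters ultimately feed a 1-point kinetics model in the accident analysis code, a minimal sketch of such a model is shown below. It integrates the point-kinetics equations with a single delayed-neutron group (production codes typically use six); the beta, Lambda, and precursor decay constant are placeholder values, not the VVER-1000/MOX parameters computed in the report.

        # One-delayed-group point kinetics, explicit Euler integration.
        # dn/dt = ((rho - beta)/Lambda) * n + lambda_d * C
        # dC/dt = (beta/Lambda) * n - lambda_d * C
        def point_kinetics(rho, beta=0.0065, lambda_d=0.08, Lambda=1.0e-4,
                           t_end=1.0, dt=1.0e-4):
            n = 1.0
            c = beta / (Lambda * lambda_d)  # equilibrium precursor concentration at n = 1
            for _ in range(int(t_end / dt)):
                dn = ((rho - beta) / Lambda) * n + lambda_d * c
                dc = (beta / Lambda) * n - lambda_d * c
                n, c = n + dn * dt, c + dc * dt
            return n

        print(point_kinetics(rho=0.001))  # relative power 1 s after a +100 pcm reactivity step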

  20. Preliminary Accident Analysis for Construction and Operation of the Chornobyl New Safety Confinement

    SciTech Connect

    Batiy, Valeriy; Rubezhansky, Yruiy; Rudko, Vladimir; shcherbin, vladimir; Yegorov, V; Schmieman, Eric A.; Timmins, Douglas C.

    2005-08-08

    An analysis of the potential exposure of personnel and the population during construction and operation of the New Safe Confinement (NSC) was performed. Scenarios of hazardous event development were ranked. It is shown that, as a whole, construction and operation of the NSC are in accordance with the current radiation safety norms of Ukraine.