Sample records for source reactor probabilistic

  1. Advanced Small Modular Reactor (SMR) Probabilistic Risk Assessment (PRA) Technical Exchange Meeting

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Smith, Curtis

    2013-09-01

    During FY13, the INL developed an advanced SMR PRA framework, which is described in the report Small Modular Reactor (SMR) Probabilistic Risk Assessment (PRA) Detailed Technical Framework Specification, INL/EXT-13-28974 (April 2013). The framework considers several areas: probabilistic models to provide information specific to advanced SMRs; representation of specific SMR design issues, such as co-located modules and passive safety features; use of modern, open-source, and readily available analysis methods; internal and external events resulting in impacts to safety; all-hazards considerations; methods to support the identification of design vulnerabilities; and mechanistic and probabilistic data needs to support modeling and tools. To describe this framework more fully and obtain feedback on the proposed approaches, the INL hosted a technical exchange meeting during August 2013. This report describes the outcomes of that meeting.

  2. 75 FR 13610 - Office of New Reactors; Interim Staff Guidance on Implementation of a Seismic Margin Analysis for...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-03-22

    ... Staff Guidance on Implementation of a Seismic Margin Analysis for New Reactors Based on Probabilistic... Seismic Margin Analysis for New Reactors Based on Probabilistic Risk Assessment,'' (Agencywide Documents.../COL-ISG-020 ``Implementation of a Seismic Margin Analysis for New Reactors Based on Probabilistic Risk...

  3. Advanced Reactor PSA Methodologies for System Reliability Analysis and Source Term Assessment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Grabaskas, D.; Brunett, A.; Passerini, S.

    Beginning in 2015, a project was initiated to update and modernize the probabilistic safety assessment (PSA) of the GE-Hitachi PRISM sodium fast reactor. This project is a collaboration between GE-Hitachi and Argonne National Laboratory (Argonne), and funded in part by the U.S. Department of Energy. Specifically, the role of Argonne is to assess the reliability of passive safety systems, complete a mechanistic source term calculation, and provide component reliability estimates. The assessment of passive system reliability focused on the performance of the Reactor Vessel Auxiliary Cooling System (RVACS) and the inherent reactivity feedback mechanisms of the metal fuel core. The mechanistic source term assessment attempted to provide a sequence-specific source term evaluation to quantify offsite consequences. Lastly, the reliability assessment focused on components specific to the sodium fast reactor, including electromagnetic pumps, intermediate heat exchangers, the steam generator, and sodium valves and piping.

  4. Proceedings of the international meeting on thermal nuclear reactor safety. Vol. 1

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    None

    Separate abstracts are included for each of the papers presented concerning current issues in nuclear power plant safety; national programs in nuclear power plant safety; radiological source terms; probabilistic risk assessment methods and techniques; non-LOCA and small-break LOCA transients; safety goals; pressurized thermal shock; applications of reliability and risk methods to probabilistic risk assessment; human factors and man-machine interface; and databases and special applications.

  5. A Methodology for the Integration of a Mechanistic Source Term Analysis in a Probabilistic Framework for Advanced Reactors

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Grabaskas, Dave; Brunett, Acacia J.; Bucknor, Matthew

    GE Hitachi Nuclear Energy (GEH) and Argonne National Laboratory are currently engaged in a joint effort to modernize and develop probabilistic risk assessment (PRA) techniques for advanced non-light water reactors. At a high level, the primary outcome of this project will be the development of next-generation PRA methodologies that will enable risk-informed prioritization of safety- and reliability-focused research and development, while also identifying gaps that may be resolved through additional research. A subset of this effort is the development of PRA methodologies to conduct a mechanistic source term (MST) analysis for event sequences that could result in the release of radionuclides. The MST analysis seeks to realistically model and assess the transport, retention, and release of radionuclides from the reactor to the environment. The MST methods developed during this project seek to satisfy the requirements of the Mechanistic Source Term element of the ASME/ANS Non-LWR PRA standard. The MST methodology consists of separate analysis approaches for risk-significant and non-risk-significant event sequences that may result in the release of radionuclides from the reactor. For risk-significant event sequences, the methodology focuses on a detailed assessment, using mechanistic models, of radionuclide release from the fuel, transport through and release from the primary system, transport in the containment, and finally release to the environment. The analysis approach for non-risk-significant event sequences examines the possibility of large radionuclide releases due to events such as re-criticality or the complete loss of radionuclide barriers. This paper provides details on the MST methodology, including the interface between the MST analysis and other elements of the PRA, and provides a simplified example MST calculation for a sodium fast reactor.

  6. The application of probabilistic fracture analysis to residual life evaluation of embrittled reactor vessels

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dickson, T.L.; Simonen, F.A.

    1992-05-01

    Probabilistic fracture mechanics analysis is a major element of comprehensive probabilistic methodology on which current NRC regulatory requirements for pressurized water reactor vessel integrity evaluation are based. Computer codes such as OCA-P and VISA-II perform probabilistic fracture analyses to estimate the increase in vessel failure probability that occurs as the vessel material accumulates radiation damage over the operating life of the vessel. The results of such analyses, when compared with limits of acceptable failure probabilities, provide an estimation of the residual life of a vessel. Such codes can be applied to evaluate the potential benefits of plant-specific mitigating actions designed to reduce the probability of failure of a reactor vessel. 10 refs.
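    In practice, the residual-life estimate described above amounts to finding the operating time at which the computed vessel failure probability crosses an acceptable limit. The short sketch below illustrates that comparison with a purely hypothetical failure-probability curve and limit; it is not output of OCA-P or VISA-II.

      import numpy as np

      # Hypothetical conditional failure probability as a function of effective
      # full-power years (EFPY); a real curve would come from a probabilistic
      # fracture mechanics code such as OCA-P or VISA-II.
      efpy = np.linspace(0, 60, 601)
      p_fail = 1e-7 * np.exp(0.12 * efpy)   # assumed embrittlement trend
      limit = 5e-6                          # assumed acceptable failure probability

      # Residual life = first time the curve crosses the acceptance limit.
      above = np.where(p_fail >= limit)[0]
      end_of_life = efpy[above[0]] if above.size else float("inf")
      print(f"Estimated end of life: {end_of_life:.1f} EFPY")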

  7. The application of probabilistic fracture analysis to residual life evaluation of embrittled reactor vessels

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dickson, T.L.; Simonen, F.A.

    1992-01-01

    Probabilistic fracture mechanics analysis is a major element of comprehensive probabilistic methodology on which current NRC regulatory requirements for pressurized water reactor vessel integrity evaluation are based. Computer codes such as OCA-P and VISA-II perform probabilistic fracture analyses to estimate the increase in vessel failure probability that occurs as the vessel material accumulates radiation damage over the operating life of the vessel. The results of such analyses, when compared with limits of acceptable failure probabilities, provide an estimation of the residual life of a vessel. Such codes can be applied to evaluate the potential benefits of plant-specific mitigating actions designed to reduce the probability of failure of a reactor vessel. 10 refs.

  8. 76 FR 11525 - Advisory Committee on Reactor Safeguards (ACRS) Meeting of the ACRS Subcommittee on Reliability...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-03-02

    ... NUCLEAR REGULATORY COMMISSION Advisory Committee on Reactor Safeguards (ACRS) Meeting of the ACRS Subcommittee on Reliability and Probabilistic Risk Assessment (PRA); Notice of Meeting The ACRS Subcommittee on Reliability and Probabilistic Risk Assessment (PRA), Room T-2B1, 11545 Rockville Pike, Rockville, Maryland...

  9. 76 FR 22934 - Advisory Committee on Reactor Safeguards (ACRS), Meeting of the ACRS Subcommittee on Reliability...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-04-25

    ... NUCLEAR REGULATORY COMMISSION Advisory Committee on Reactor Safeguards (ACRS), Meeting of the ACRS Subcommittee on Reliability and Probabilistic Risk Assessment; Notice of Meeting The ACRS Subcommittee on Reliability and Probabilistic Risk Assessment (PRA) will hold a meeting on May 11, 2011, Room T-2B3, 11545...

  10. 76 FR 71609 - Advisory Committee on Reactor Safeguards (ACRS), Meeting of the ACRS Subcommittee on Reliability...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-11-18

    ... NUCLEAR REGULATORY COMMISSION Advisory Committee on Reactor Safeguards (ACRS), Meeting of the ACRS Subcommittee on Reliability and Probabilistic Risk Assessment; Notice of Meeting The ACRS Subcommittee on Reliability and Probabilistic Risk Assessment (PRA) will hold a meeting on December 14, 2011, Room T-2B3...

  11. 76 FR 18586 - Advisory Committee on Reactor Safeguards (ACRS); Meeting of the ACRS Subcommittee on Reliability...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-04-04

    ... NUCLEAR REGULATORY COMMISSION Advisory Committee on Reactor Safeguards (ACRS); Meeting of the ACRS Subcommittee on Reliability and Probabilistic Risk Assessment (PRA); Notice of Meeting The ACRS Subcommittee on Reliability and Probabilistic Risk Assessment (PRA) will hold a meeting on April 20, 2011, Room T-2B1, 11545...

  12. 76 FR 55717 - Advisory Committee on Reactor Safeguards (ACRS); Meeting of the ACRS Subcommittee on Reliability...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-09-08

    ... Subcommittee on Reliability and Probabilistic Risk Assessment The ACRS Subcommittee on Reliability and Probabilistic Risk Assessment (PRA) will hold a meeting on September 20, 2011, Room T-2B1, 11545 Rockville Pike... Memorandum on Modifying the Risk-Informed Regulatory Guidance for New Reactors. The Subcommittee will hear...

  13. Preliminary risks associated with postulated tritium release from production reactor operation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    O'Kula, K.R.; Horton, W.H.

    1988-01-01

    The Probabilistic Risk Assessment (PRA) of Savannah River Plant (SRP) reactor operation is assessing the off-site risk due to tritium releases during postulated full or partial loss of heavy water moderator accidents. Other sources of tritium in the reactor are less likely to contribute to off-site risk in non-fuel-melting accident scenarios. Preliminary determination of the frequency of average partial moderator loss (including incidents with leaks as small as 0.5 kg) yields an estimate of approximately 1 per reactor-year. The full moderator loss frequency is conservatively chosen as 5 × 10⁻³ per reactor-year. Conditional consequences, determined with a version of the MACCS code modified to handle tritium, are found to be insignificant. The 95th percentile individual cancer risk is 4 × 10⁻⁸ per reactor-year within 16 km of the release point. The full moderator loss accident contributes about 75% of the evaluated risks. 13 refs., 4 figs., 5 tabs.
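    For orientation, the quoted risk is an aggregate over accident scenarios, each contributing frequency times conditional consequence. The minimal sketch below reproduces that bookkeeping; the frequencies are taken from the abstract, while the conditional risks per event are back-calculated placeholders chosen only to be consistent with the reported total and the roughly 75%/25% split, not MACCS results.

      # Frequencies per reactor-year are from the abstract; conditional individual
      # cancer risks per event are hypothetical placeholders.
      scenarios = {
          "partial moderator loss": (1.0,    1.0e-8),
          "full moderator loss":    (5.0e-3, 6.0e-6),
      }

      total = sum(f * c for f, c in scenarios.values())
      for name, (f, c) in scenarios.items():
          contribution = f * c
          print(f"{name:>22s}: {contribution:.1e} per reactor-year "
                f"({100 * contribution / total:.0f}% of total)")
      print(f"{'total':>22s}: {total:.1e} per reactor-year")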

  14. A Methodology for the Development of a Reliability Database for an Advanced Reactor Probabilistic Risk Assessment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Grabaskas, Dave; Brunett, Acacia J.; Bucknor, Matthew

    GE Hitachi Nuclear Energy (GEH) and Argonne National Laboratory are currently engaged in a joint effort to modernize and develop probabilistic risk assessment (PRA) techniques for advanced non-light water reactors. At a high level, the primary outcome of this project will be the development of next-generation PRA methodologies that will enable risk-informed prioritization of safety- and reliability-focused research and development, while also identifying gaps that may be resolved through additional research. A subset of this effort is the development of a reliability database (RDB) methodology to determine applicable reliability data for inclusion in the quantification of the PRA. The RDB method developed during this project seeks to satisfy the requirements of the Data Analysis element of the ASME/ANS Non-LWR PRA standard. The RDB methodology utilizes a relevancy test to examine reliability data and determine whether it is appropriate to include as part of the reliability database for the PRA. The relevancy test compares three component properties to establish the level of similarity to components examined as part of the PRA. These properties include the component function, the component failure modes, and the environment/boundary conditions of the component. The relevancy test is used to gauge the quality of data found in a variety of sources, such as advanced reactor-specific databases, non-advanced reactor nuclear databases, and non-nuclear databases. The RDB also establishes the integration of expert judgment or separate reliability analysis with past reliability data. This paper provides details on the RDB methodology, and includes an example application of the RDB methodology for determining the reliability of the intermediate heat exchanger of a sodium fast reactor. The example explores a variety of reliability data sources, and assesses their applicability for the PRA of interest through the use of the relevancy test.
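    A minimal sketch of the screening logic behind such a relevancy test is shown below. The three compared properties follow the abstract; the scoring scheme, class names, and example records are illustrative assumptions, not the RDB implementation.

      from dataclasses import dataclass

      @dataclass
      class ComponentProfile:
          function: str        # what the component does in the plant
          failure_modes: set   # failure modes covered by the data source
          environment: str     # operating environment / boundary conditions

      def relevancy(target: ComponentProfile, candidate: ComponentProfile) -> int:
          """Count how many of the three properties match (0-3)."""
          score = 0
          score += target.function == candidate.function
          score += bool(target.failure_modes & candidate.failure_modes)
          score += target.environment == candidate.environment
          return score

      # Hypothetical screening of a non-nuclear heat-exchanger record against a
      # sodium-fast-reactor intermediate heat exchanger.
      ihx = ComponentProfile("sodium-to-sodium heat transfer",
                             {"tube leak", "tube rupture"}, "sodium, 350-500 C")
      entry = ComponentProfile("water-to-water heat transfer",
                               {"tube leak", "fouling"}, "water, 20-90 C")
      print("relevancy score:", relevancy(ihx, entry))   # 1 of 3 properties match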

  15. Calculations of the thermal and fast neutron fluxes in the Syrian miniature neutron source reactor using the MCNP-4C code.

    PubMed

    Khattab, K; Sulieman, I

    2009-04-01

    The MCNP-4C code, based on the probabilistic approach, was used to model the 3D configuration of the core of the Syrian miniature neutron source reactor (MNSR). The continuous energy neutron cross sections from the ENDF/B-VI library were used to calculate the thermal and fast neutron fluxes in the inner and outer irradiation sites of MNSR. The thermal fluxes in the MNSR inner irradiation sites were also measured experimentally by the multiple foil activation method ((197)Au (n, gamma) (198)Au and (59)Co (n, gamma) (60)Co). The foils were irradiated simultaneously in each of the five MNSR inner irradiation sites to measure the thermal neutron flux and the epithermal index in each site. The calculated and measured results agree well.

  16. Development of probabilistic design method for annular fuels

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ozawa, Takayuki

    2007-07-01

    The increase of linear power and burn-up during reactor operation is considered one measure to ensure the utility of fast reactors in the future; to this end, the application of annular oxide fuels is under consideration. The annular fuel design code CEPTAR was developed at the Japan Atomic Energy Agency (JAEA) and verified against extensive irradiation experience with oxide fuels. In addition, the probabilistic fuel design code BORNFREE was also developed to provide a safe and reasonable fuel design and to evaluate the design margins quantitatively. This study aimed at the development of a probabilistic design method for annular oxide fuels; it was implemented in the BORNFREE-CEPTAR code, and the code was used to make a probabilistic evaluation of the permissive linear power. (author)
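    A probabilistic evaluation of permissive linear power of this kind generally propagates uncertain fabrication and property data into a design limit and reads off the power level that satisfies a target confidence. The toy sketch below shows that idea; the thermal model, distributions, limit, and 95% criterion are all assumptions for illustration, not the BORNFREE-CEPTAR models.

      import numpy as np

      rng = np.random.default_rng(0)
      N = 100_000

      def peak_fuel_temperature(linear_power_kw_m):
          """Toy centerline temperature model with uncertain (assumed) gap
          conductance and thermal conductivity factors."""
          conductivity = rng.normal(1.0, 0.05, N)   # relative, assumed
          gap_factor = rng.normal(1.0, 0.08, N)     # relative, assumed
          return 900.0 + 45.0 * linear_power_kw_m * gap_factor / conductivity  # deg C

      melt_limit = 2700.0  # assumed design limit, deg C

      # Permissive linear power: highest power meeting the limit with 95% probability.
      for lp in range(30, 60):
          prob_ok = np.mean(peak_fuel_temperature(lp) < melt_limit)
          if prob_ok < 0.95:
              print(f"permissive linear power ~ {lp - 1} kW/m (95% criterion)")
              break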

  17. Seismic, high wind, tornado, and probabilistic risk assessments of the High Flux Isotope Reactor

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Harris, S.P.; Stover, R.L.; Hashimoto, P.S.

    1989-01-01

    Natural phenomena analyses were performed on the High Flux Isotope Reactor (HFIR). Deterministic and probabilistic evaluations were made to determine the risks resulting from earthquakes, high winds, and tornadoes. Analytic methods, in conjunction with field evaluations and earthquake experience database evaluation methods, were used to provide more realistic results in a shorter amount of time. Plant modifications completed in preparation for HFIR restart and potential future enhancements are discussed. 5 figs.

  18. ORNL Pre-test Analyses of A Large-scale Experiment in STYLE

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Williams, Paul T; Yin, Shengjun; Klasky, Hilda B

    Oak Ridge National Laboratory (ORNL) is conducting a series of numerical analyses to simulate a large-scale mock-up experiment planned within the European Network for Structural Integrity for Lifetime Management of non-RPV Components (STYLE). STYLE is a European cooperative effort to assess the structural integrity of (non-reactor pressure vessel) reactor coolant pressure boundary components relevant to ageing and lifetime management and to integrate the knowledge created in the project into mainstream nuclear industry assessment codes. ORNL contributes work-in-kind support to STYLE Work Package 2 (Numerical Analysis/Advanced Tools) and Work Package 3 (Engineering Assessment Methods/LBB Analyses). This paper summarizes the current status of ORNL analyses of the STYLE Mock-Up3 large-scale experiment to simulate and evaluate crack growth in a cladded ferritic pipe. The analyses are being performed in two parts. In the first part, advanced fracture mechanics models are being developed and applied to evaluate several experiment designs, taking into account the capabilities of the test facility while satisfying the test objectives. These advanced fracture mechanics models will then be utilized to simulate crack growth in the large-scale mock-up test. For the second part, the recently developed ORNL SIAM-PFM open-source, cross-platform, probabilistic computational tool will be used to generate an alternative assessment for comparison with the advanced fracture mechanics model results. The SIAM-PFM probabilistic analysis of the Mock-Up3 experiment will utilize fracture modules that are installed into a general probabilistic framework. The probabilistic results of the Mock-Up3 experiment obtained from SIAM-PFM will be compared to those generated using the deterministic 3D nonlinear finite-element modeling approach. The objective of the probabilistic analysis is to provide uncertainty bounds that will assist in assessing the more detailed 3D finite-element solutions and to assess the level of confidence that can be placed in the best-estimate finite-element solutions.

  19. Documentation of probabilistic fracture mechanics codes used for reactor pressure vessels subjected to pressurized thermal shock loading: Parts 1 and 2. Final report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Balkey, K.; Witt, F.J.; Bishop, B.A.

    1995-06-01

    Significant attention has been focused on the issue of reactor vessel pressurized thermal shock (PTS) for many years. Pressurized thermal shock transient events are characterized by a rapid cooldown at potentially high pressure levels that could lead to a reactor vessel integrity concern for some pressurized water reactors. As a result of regulatory and industry efforts in the early 1980s, a probabilistic risk assessment methodology has been established to address this concern. Probabilistic fracture mechanics analyses are performed as part of this methodology to determine the conditional probability of significant flaw extension for given pressurized thermal shock events. While recent industry efforts are underway to benchmark probabilistic fracture mechanics computer codes that are currently used by the nuclear industry, Part I of this report describes the comparison of two independent computer codes used at the time of the development of the original U.S. Nuclear Regulatory Commission (NRC) pressurized thermal shock rule. The work that was originally performed in 1982 and 1983 to compare the U.S. NRC - VISA and Westinghouse (W) - PFM computer codes has been documented and is provided in Part I of this report. Part II of this report describes the results of more recent industry efforts to benchmark PFM computer codes used by the nuclear industry. This study was conducted as part of the USNRC-EPRI Coordinated Research Program for reviewing the technical basis for pressurized thermal shock (PTS) analyses of the reactor pressure vessel. The work focused on the probabilistic fracture mechanics (PFM) analysis codes and methods used to perform the PTS calculations. An in-depth review of the methodologies was performed to verify the accuracy and adequacy of the various codes. The review was structured around a series of benchmark sample problems to provide a specific context for discussion and examination of the fracture mechanics methodology.

  20. Probabilistic Fracture Mechanics of Reactor Pressure Vessels with Populations of Flaws

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Spencer, Benjamin; Backman, Marie; Williams, Paul

    This report documents recent progress in developing a tool that uses the Grizzly and RAVEN codes to perform probabilistic fracture mechanics analyses of reactor pressure vessels in light water reactor nuclear power plants. The Grizzly code is being developed with the goal of creating a general tool that can be applied to study a variety of degradation mechanisms in nuclear power plant components. Because of the central role of the reactor pressure vessel (RPV) in a nuclear power plant, particular emphasis is being placed on developing capabilities to model fracture in embrittled RPVs to aid in the process surrounding decision making relating to life extension of existing plants. A typical RPV contains a large population of pre-existing flaws introduced during the manufacturing process. The use of probabilistic techniques is necessary to assess the likelihood of crack initiation at one or more of these flaws during a transient event. This report documents development and initial testing of a capability to perform probabilistic fracture mechanics of large populations of flaws in RPVs using reduced order models to compute fracture parameters. The work documented here builds on prior efforts to perform probabilistic analyses of a single flaw with uncertain parameters, as well as earlier work to develop deterministic capabilities to model the thermo-mechanical response of the RPV under transient events, and compute fracture mechanics parameters at locations of pre-defined flaws. The capabilities developed as part of this work provide a foundation for future work, which will develop a platform that provides the flexibility needed to consider scenarios that cannot be addressed with the tools used in current practice.
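    The structure of such a calculation is a Monte Carlo sweep over sampled flaw populations, with a fast surrogate supplying the fracture parameters for each flaw. The sketch below shows only that structure; the flaw distributions, the toy stress-intensity and toughness models, and all numbers are assumptions, not Grizzly/RAVEN models.

      import numpy as np

      rng = np.random.default_rng(42)
      n_vessels = 20_000   # Monte Carlo samples of the vessel flaw population

      def vessel_initiates():
          """Return True if any sampled flaw initiates under the (toy) transient."""
          n_flaws = rng.poisson(200)                  # flaws per vessel (assumed)
          depth_mm = rng.exponential(1.0, n_flaws)    # flaw depths (assumed)
          k_applied = 5.0 * np.sqrt(depth_mm)         # toy reduced-order K_I
          k_ic = 60.0 * rng.weibull(4.0, n_flaws)     # toy irradiated toughness
          return np.any(k_applied > k_ic)

      p_init = np.mean([vessel_initiates() for _ in range(n_vessels)])
      print(f"Estimated conditional probability of crack initiation: {p_init:.2e}")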

  1. 75 FR 78777 - Advisory Committee On Reactor Safeguards; Renewal

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-12-16

    ... Committee includes individuals experienced in reactor operations, management; probabilistic risk assessment...: December 10, 2010. Andrew L. Bates, Advisory Committee Management Officer. [FR Doc. 2010-31590 Filed 12-15...

  2. Advanced Reactor Passive System Reliability Demonstration Analysis for an External Event

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bucknor, Matthew D.; Grabaskas, David; Brunett, Acacia J.

    2016-01-01

    Many advanced reactor designs rely on passive systems to fulfill safety functions during accident sequences. These systems depend heavily on boundary conditions to induce a motive force, meaning the system can fail to operate as intended due to deviations in boundary conditions, rather than as the result of physical failures. Furthermore, passive systems may operate in intermediate or degraded modes. These factors make passive system operation difficult to characterize within a traditional probabilistic framework that only recognizes discrete operating modes and does not allow for the explicit consideration of time-dependent boundary conditions. Argonne National Laboratory has been examining various methodologies for assessing passive system reliability within a probabilistic risk assessment for a station blackout event at an advanced small modular reactor. This paper provides an overview of a passive system reliability demonstration analysis for an external event. Centering on an earthquake with the possibility of site flooding, the analysis focuses on the behavior of the passive reactor cavity cooling system following potential physical damage and system flooding. The assessment approach seeks to combine mechanistic and simulation-based methods to leverage the benefits of the simulation-based approach without the need to substantially deviate from conventional probabilistic risk assessment techniques. While this study is presented as only an example analysis, the results appear to demonstrate a high level of reliability for the reactor cavity cooling system (and the reactor system in general) to the postulated transient event.

  3. Advanced Reactor Passive System Reliability Demonstration Analysis for an External Event

    DOE PAGES

    Bucknor, Matthew; Grabaskas, David; Brunett, Acacia J.; ...

    2017-01-24

    We report that many advanced reactor designs rely on passive systems to fulfill safety functions during accident sequences. These systems depend heavily on boundary conditions to induce a motive force, meaning the system can fail to operate as intended because of deviations in boundary conditions, rather than as the result of physical failures. Furthermore, passive systems may operate in intermediate or degraded modes. These factors make passive system operation difficult to characterize within a traditional probabilistic framework that only recognizes discrete operating modes and does not allow for the explicit consideration of time-dependent boundary conditions. Argonne National Laboratory has been examining various methodologies for assessing passive system reliability within a probabilistic risk assessment for a station blackout event at an advanced small modular reactor. This paper provides an overview of a passive system reliability demonstration analysis for an external event. Considering an earthquake with the possibility of site flooding, the analysis focuses on the behavior of the passive Reactor Cavity Cooling System following potential physical damage and system flooding. The assessment approach seeks to combine mechanistic and simulation-based methods to leverage the benefits of the simulation-based approach without the need to substantially deviate from conventional probabilistic risk assessment techniques. Lastly, although this study is presented as only an example analysis, the results appear to demonstrate a high level of reliability of the Reactor Cavity Cooling System (and the reactor system in general) for the postulated transient event.

  4. Safety Issues at the DOE Test and Research Reactors. A Report to the U.S. Department of Energy.

    ERIC Educational Resources Information Center

    National Academy of Sciences - National Research Council, Washington, DC. Commission on Physical Sciences, Mathematics, and Resources.

    This report provides an assessment of safety issues at the Department of Energy (DOE) test and research reactors. Part A identifies six safety issues of the reactors. These issues include the safety design philosophy, the conduct of safety reviews, the performance of probabilistic risk assessments, the reliance on reactor operators, the fragmented…

  5. Advanced Reactor Passive System Reliability Demonstration Analysis for an External Event

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bucknor, Matthew; Grabaskas, David; Brunett, Acacia J.

    We report that many advanced reactor designs rely on passive systems to fulfill safety functions during accident sequences. These systems depend heavily on boundary conditions to induce a motive force, meaning the system can fail to operate as intended because of deviations in boundary conditions, rather than as the result of physical failures. Furthermore, passive systems may operate in intermediate or degraded modes. These factors make passive system operation difficult to characterize within a traditional probabilistic framework that only recognizes discrete operating modes and does not allow for the explicit consideration of time-dependent boundary conditions. Argonne National Laboratory has been examining various methodologies for assessing passive system reliability within a probabilistic risk assessment for a station blackout event at an advanced small modular reactor. This paper provides an overview of a passive system reliability demonstration analysis for an external event. Considering an earthquake with the possibility of site flooding, the analysis focuses on the behavior of the passive Reactor Cavity Cooling System following potential physical damage and system flooding. The assessment approach seeks to combine mechanistic and simulation-based methods to leverage the benefits of the simulation-based approach without the need to substantially deviate from conventional probabilistic risk assessment techniques. Lastly, although this study is presented as only an example analysis, the results appear to demonstrate a high level of reliability of the Reactor Cavity Cooling System (and the reactor system in general) for the postulated transient event.

  6. Developing Probabilistic Safety Performance Margins for Unknown and Underappreciated Risks

    NASA Technical Reports Server (NTRS)

    Benjamin, Allan; Dezfuli, Homayoon; Everett, Chris

    2015-01-01

    Probabilistic safety requirements currently formulated or proposed for space systems, nuclear reactor systems, nuclear weapon systems, and other types of systems that have a low-probability potential for high-consequence accidents depend on showing that the probability of such accidents is below a specified safety threshold or goal. Verification of compliance depends heavily upon synthetic modeling techniques such as PRA. To determine whether or not a system meets its probabilistic requirements, it is necessary to consider whether there are significant risks that are not fully considered in the PRA either because they are not known at the time or because their importance is not fully understood. The ultimate objective is to establish a reasonable margin to account for the difference between known risks and actual risks in attempting to validate compliance with a probabilistic safety threshold or goal. In this paper, we examine data accumulated over the past 60 years from the space program, from nuclear reactor experience, from aircraft systems, and from human reliability experience to formulate guidelines for estimating probabilistic margins to account for risks that are initially unknown or underappreciated. The formulation includes a review of the safety literature to identify the principal causes of such risks.
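    The margin idea in this abstract can be stated compactly: if historical experience suggests actual risk has tended to exceed the initially modeled risk by some factor, then compliance with a threshold should be claimed only when the PRA-modeled probability is below the threshold divided by that factor. A minimal sketch with assumed numbers (not the paper's derived guidelines):

      # Assumed requirement and assumed unknown/underappreciated-risk factor;
      # the paper derives such factors from decades of space, nuclear, aircraft,
      # and human-reliability experience, which is not reproduced here.
      threshold = 1e-4   # probabilistic safety requirement per mission (assumed)
      uu_factor = 4.0    # actual-to-modeled risk ratio (assumed)

      allowable_modeled_p = threshold / uu_factor
      print(f"PRA-modeled probability must fall below {allowable_modeled_p:.1e} "
            f"to claim compliance with {threshold:.0e} after a {uu_factor:g}x margin")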

  7. 76 FR 55717 - Advisory Committee on Reactor Safeguards (ACRS); Meeting of the ACRS Subcommittee on Reliability...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-09-08

    ... NUCLEAR REGULATORY COMMISSION Advisory Committee on Reactor Safeguards (ACRS); Meeting of the ACRS Subcommittee on Reliability and Probabilistic Risk Assessment The ACRS Subcommittee on Reliability and PRA will hold a meeting...

  8. Integrated Risk-Informed Decision-Making for an ALMR PRISM

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Muhlheim, Michael David; Belles, Randy; Denning, Richard S.

    Decision-making is the process of identifying decision alternatives, assessing those alternatives based on predefined metrics, selecting an alternative (i.e., making a decision), and then implementing that alternative. The generation of decisions requires a structured, coherent process, or a decision-making process. The overall objective for this work is that the generalized framework is adopted into an autonomous decision-making framework and tailored to specific requirements for various applications. In this context, automation is the use of computing resources to make decisions and implement a structured decision-making process with limited or no human intervention. The overriding goal of automation is to replace or supplement human decision makers with reconfigurable decision-making modules that can perform a given set of tasks rationally, consistently, and reliably. Risk-informed decision-making requires a probabilistic assessment of the likelihood of success given the status of the plant/systems and component health, and a deterministic assessment between plant operating parameters and reactor protection parameters to prevent unnecessary trips and challenges to plant safety systems. The probabilistic portion of the decision-making engine of the supervisory control system is based on the control actions associated with an ALMR PRISM. Newly incorporated into the probabilistic models are the prognostic/diagnostic models developed by Pacific Northwest National Laboratory. These allow decisions to incorporate the health of components into the decision-making process. Once the control options are identified and ranked based on the likelihood of success, the supervisory control system transmits the options to the deterministic portion of the platform. The deterministic portion of the decision-making engine uses thermal-hydraulic modeling and components for an advanced liquid-metal reactor Power Reactor Inherently Safe Module. The deterministic multi-attribute decision-making framework uses various sensor data (e.g., reactor outlet temperature, steam generator drum level) and calculates its position within the challenge state, its trajectory, and its margin within the controllable domain using utility functions to evaluate current and projected plant state space for different control decisions. The metrics that are evaluated are based on reactor trip set points. The integration of the deterministic calculations using multi-physics analyses and probabilistic safety calculations allows for the examination and quantification of margin recovery strategies. This also provides validation of the control options identified from the probabilistic assessment. Thus, the thermal-hydraulics analyses are used to validate the control options identified from the probabilistic assessment. Future work includes evaluating other possible metrics and computational efficiencies, and developing a user interface to mimic display panels at a modern nuclear power plant.
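    The ranking step described above can be illustrated with a small multi-attribute utility calculation that weighs the probabilistic likelihood of success against the deterministic margin to trip setpoints. The option names, weights, and values below are hypothetical, chosen only to show the pattern, and are not taken from the ALMR PRISM models.

      # Each option carries a likelihood of success (probabilistic engine) and a
      # normalized margin to the nearest reactor-trip setpoint (deterministic engine).
      options = {
          "reduce pump speed":      {"p_success": 0.97, "margin": 0.40},
          "trip one module":        {"p_success": 0.99, "margin": 0.85},
          "continue at full power": {"p_success": 0.80, "margin": 0.10},
      }

      def utility(opt, w_prob=0.6, w_margin=0.4):
          return w_prob * opt["p_success"] + w_margin * opt["margin"]

      for name, opt in sorted(options.items(), key=lambda kv: utility(kv[1]), reverse=True):
          print(f"{name:>24s}: utility = {utility(opt):.3f}")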

  9. A probabilistic safety analysis of incidents in nuclear research reactors.

    PubMed

    Lopes, Valdir Maciel; Agostinho Angelo Sordi, Gian Maria; Moralles, Mauricio; Filho, Tufic Madi

    2012-06-01

    This work aims to evaluate the potential risks of incidents in nuclear research reactors. For its development, two databases of the International Atomic Energy Agency (IAEA) were used: the Research Reactor Data Base (RRDB) and the Incident Report System for Research Reactors (IRSRR). For this study, probabilistic safety analysis (PSA) was used. To obtain the results of the probability calculations for the PSA, the theory and equations in IAEA-TECDOC-636 were used. A specific program to analyse the probabilities was developed within the main program, Scilab 5.1.1, for two distributions, Fisher and chi-square, both with a confidence level of 90%. Using Sordi equations, the maximum admissible doses to compare with the risk limits established by the International Commission on Radiological Protection (ICRP) were obtained. All results achieved with this probability analysis led to the conclusion that the incidents which occurred had radiation doses within the stochastic effects reference interval established by ICRP-64.
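    The confidence bounds referred to above are, in essence, the standard chi-square limits on a Poisson occurrence rate. A minimal sketch follows, using hypothetical incident counts and operating experience rather than the IAEA database figures.

      from scipy.stats import chi2

      def rate_confidence_interval(n_events, exposure_years, confidence=0.90):
          """Two-sided chi-square confidence bounds on a Poisson occurrence rate."""
          alpha = 1.0 - confidence
          lower = (chi2.ppf(alpha / 2, 2 * n_events) / (2 * exposure_years)
                   if n_events > 0 else 0.0)
          upper = chi2.ppf(1 - alpha / 2, 2 * (n_events + 1)) / (2 * exposure_years)
          return lower, upper

      # Hypothetical example: 3 reported incidents over 5,000 research-reactor-years.
      lo, hi = rate_confidence_interval(3, 5000)
      print(f"90% confidence interval: {lo:.2e} to {hi:.2e} incidents per reactor-year")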

  10. Expert assessments of the cost of light water small modular reactors

    PubMed Central

    Abdulla, Ahmed; Azevedo, Inês Lima; Morgan, M. Granger

    2013-01-01

    Analysts and decision makers frequently want estimates of the cost of technologies that have yet to be developed or deployed. Small modular reactors (SMRs), which could become part of a portfolio of carbon-free energy sources, are one such technology. Existing estimates of likely SMR costs rely on problematic top-down approaches or bottom-up assessments that are proprietary. When done properly, expert elicitations can complement these approaches. We developed detailed technical descriptions of two SMR designs and then conducted elicitation interviews in which we obtained probabilistic judgments from 16 experts who are involved in, or have access to, engineering-economic assessments of SMR projects. Here, we report estimates of the overnight cost and construction duration for five reactor-deployment scenarios that involve a large reactor and two light water SMRs. Consistent with the uncertainty introduced by past cost overruns and construction delays, median estimates of the cost of new large plants vary by more than a factor of 2.5. Expert judgments about likely SMR costs display an even wider range. Median estimates for a 45 megawatts-electric (MWe) SMR range from $4,000 to $16,300/kWe and from $3,200 to $7,100/kWe for a 225-MWe SMR. Sources of disagreement are highlighted, exposing the thought processes of experts involved with SMR design. There was consensus that SMRs could be built and brought online about 2 y faster than large reactors. Experts identify more affordable unit cost, factory fabrication, and shorter construction schedules as factors that may make light water SMRs economically viable. PMID:23716682

  11. Expert assessments of the cost of light water small modular reactors.

    PubMed

    Abdulla, Ahmed; Azevedo, Inês Lima; Morgan, M Granger

    2013-06-11

    Analysts and decision makers frequently want estimates of the cost of technologies that have yet to be developed or deployed. Small modular reactors (SMRs), which could become part of a portfolio of carbon-free energy sources, are one such technology. Existing estimates of likely SMR costs rely on problematic top-down approaches or bottom-up assessments that are proprietary. When done properly, expert elicitations can complement these approaches. We developed detailed technical descriptions of two SMR designs and then conducted elicitation interviews in which we obtained probabilistic judgments from 16 experts who are involved in, or have access to, engineering-economic assessments of SMR projects. Here, we report estimates of the overnight cost and construction duration for five reactor-deployment scenarios that involve a large reactor and two light water SMRs. Consistent with the uncertainty introduced by past cost overruns and construction delays, median estimates of the cost of new large plants vary by more than a factor of 2.5. Expert judgments about likely SMR costs display an even wider range. Median estimates for a 45 megawatts-electric (MWe) SMR range from $4,000 to $16,300/kWe and from $3,200 to $7,100/kWe for a 225-MWe SMR. Sources of disagreement are highlighted, exposing the thought processes of experts involved with SMR design. There was consensus that SMRs could be built and brought online about 2 y faster than large reactors. Experts identify more affordable unit cost, factory fabrication, and shorter construction schedules as factors that may make light water SMRs economically viable.

  12. Virtues and Limitations of Risk Analysis

    ERIC Educational Resources Information Center

    Weatherwax, Robert K.

    1975-01-01

    After summarizing the Rasmussen Report, the author reviews the probabilistic portion of the report from the perspectives of engineering utility and risk assessment uncertainty. The author shows that the report may represent a significant step forward in the assurance of reactor safety and an imperfect measure of actual reactor risk. (BT)

  13. Evaluation of severe accident risks: Quantification of major input parameters: MAACS (MELCOR Accident Consequence Code System) input

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sprung, J.L.; Jow, H-N; Rollstin, J.A.

    1990-12-01

    Estimation of offsite accident consequences is the customary final step in a probabilistic assessment of the risks of severe nuclear reactor accidents. Recently, the Nuclear Regulatory Commission reassessed the risks of severe accidents at five US power reactors (NUREG-1150). Offsite accident consequences for NUREG-1150 source terms were estimated using the MELCOR Accident Consequence Code System (MACCS). Before these calculations were performed, most MACCS input parameters were reviewed, and for each parameter reviewed, a best-estimate value was recommended. This report presents the results of these reviews. Specifically, recommended values and the basis for their selection are presented for MACCS atmospheric and biospheric transport, emergency response, food pathway, and economic input parameters. Dose conversion factors and health effect parameters are not reviewed in this report. 134 refs., 15 figs., 110 tabs.

  14. Integration of Advanced Probabilistic Analysis Techniques with Multi-Physics Models

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cetiner, Mustafa Sacit; none,; Flanagan, George F.

    2014-07-30

    An integrated simulation platform that couples probabilistic analysis-based tools with model-based simulation tools can provide valuable insights for reactive and proactive responses to plant operating conditions. The objective of this work is to demonstrate the benefits of a partial implementation of the Small Modular Reactor (SMR) Probabilistic Risk Assessment (PRA) Detailed Framework Specification through the coupling of advanced PRA capabilities and accurate multi-physics plant models. Coupling a probabilistic model with a multi-physics model will aid in design, operations, and safety by providing a more accurate understanding of plant behavior. This represents the first attempt at actually integrating these two types of analyses for a control system used for operations, on a faster than real-time basis. This report documents the development of the basic communication capability to exchange data with the probabilistic model using Reliability Workbench (RWB) and the multi-physics model using Dymola. The communication pathways from injecting a fault (i.e., failing a component) to the probabilistic and multi-physics models were successfully completed. This first version was tested with prototypic models represented in both RWB and Modelica. First, a simple event tree/fault tree (ET/FT) model was created to develop the software code to implement the communication capabilities between the dynamic-link library (dll) and RWB. A program, written in C#, successfully communicates faults to the probabilistic model through the dll. A systems model of the Advanced Liquid-Metal Reactor–Power Reactor Inherently Safe Module (ALMR-PRISM) design developed under another DOE project was upgraded using Dymola to include proper interfaces to allow data exchange with the control application (ConApp). A program, written in C+, successfully communicates faults to the multi-physics model. The results of the example simulation were successfully plotted.
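    The coupling pattern described here, injecting a single component failure and letting both the probabilistic model and the multi-physics model react, can be illustrated with the generic stub below. The class names, the placeholder quantification, and the pump example are hypothetical; the actual project exchanges data with Reliability Workbench and Dymola through a dynamic-link library, which is not reproduced here.

      # Generic fault-injection stand-ins for the two coupled models.
      class ProbabilisticModel:
          def __init__(self):
              self.failed = set()
          def inject(self, component):
              self.failed.add(component)
          def top_event_probability(self):
              # placeholder quantification: more failed components -> higher probability
              return min(1.0, 1e-5 * 10 ** len(self.failed))

      class MultiPhysicsModel:
          def __init__(self):
              self.pump_flow_fraction = 1.0
          def inject(self, component):
              if component == "primary_pump_A":
                  self.pump_flow_fraction = 0.5   # degraded flow boundary condition

      pra, plant = ProbabilisticModel(), MultiPhysicsModel()
      for model in (pra, plant):
          model.inject("primary_pump_A")
      print("top event probability:", pra.top_event_probability())
      print("remaining primary flow fraction:", plant.pump_flow_fraction)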

  15. 77 FR 61446 - Proposed Revision Probabilistic Risk Assessment and Severe Accident Evaluation for New Reactors

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-10-09

    ... documents online in the NRC Library at http://www.nrc.gov/reading-rm/adams.html . To begin the search... of digital instrumentation and control system PRAs, including common cause failures in PRAs and uncertainty analysis associated with new reactor digital systems, and (4) incorporation of additional...

  16. 77 FR 66649 - Proposed Revision to Probabilistic Risk Assessment and Severe Accident Evaluation for New Reactors

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-11-06

    ... in the NRC Library at http://www.nrc.gov/reading-rm/adams.html . To begin the search, select ``ADAMS... of digital instrumentation and control system PRAs, including common cause failures in PRAs and uncertainty analysis associated with new reactor digital systems, and (4) incorporation of additional...

  17. The Experimental Breeder Reactor II seismic probabilistic risk assessment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Roglans, J; Hill, D J

    1994-02-01

    The Experimental Breeder Reactor II (EBR-II) is a US Department of Energy (DOE) Category A research reactor located at Argonne National Laboratory (ANL)-West in Idaho. EBR-II is a 62.5 MW-thermal Liquid Metal Reactor (LMR) that started operation in 1964 and is currently being used as a testbed in the Integral Fast Reactor (IFR) Program. ANL has completed a Level 1 Probabilistic Risk Assessment (PRA) for EBR-II. The Level 1 PRA for internal events and most external events was completed in June 1991. The seismic PRA for EBR-II has recently been completed. The EBR-II reactor building contains the reactor, the primary system, and the decay heat removal systems. The reactor vessel, which contains the core, and the primary system, consisting of two primary pumps and an intermediate heat exchanger, are immersed in the sodium-filled primary tank, which is suspended by six hangers from a beam support structure. Three systems or functions in EBR-II were identified as the most significant from the standpoint of risk of seismic-induced fuel damage: (1) the reactor shutdown system, (2) the structural integrity of the passive decay heat removal systems, and (3) the integrity of major structures, like the primary tank containing the reactor, that could threaten both the reactivity control and decay heat removal functions. As part of the seismic PRA, efforts were concentrated on studying these three functions or systems. The passive safety response of the EBR-II reactor -- both passive reactivity shutdown and passive decay heat removal, demonstrated in a series of tests in 1986 -- was explicitly accounted for in the seismic PRA, as it had been included in the internal events assessment.

  18. 77 FR 58590 - Determining Technical Adequacy of Probabilistic Risk Assessment for Risk-Informed License...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-09-21

    ... reactors or for activities associated with review of applications for early site permits and combined licenses (COL) for the Office of New Reactors (NRO). DATES: The effective date of this SRP update is... Rulemaking Web Site: Go to http://www.regulations.gov and search for Docket ID NRC-2010-0138. Address...

  19. Analysis of the stochastic excitability in the flow chemical reactor

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bashkirtseva, Irina

    2015-11-30

    A dynamic model of the thermochemical process in the flow reactor is considered. We study the influence of random disturbances on the stationary regime of this model. A phenomenon of noise-induced excitability is demonstrated. For the analysis of this phenomenon, a constructive technique based on the stochastic sensitivity functions and confidence domains is applied. It is shown how the elaborated technique can be used for the probabilistic analysis of the generation of mixed-mode stochastic oscillations in the flow chemical reactor.
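    Noise-induced excitability of the sort described here can be reproduced qualitatively by integrating a generic excitable two-variable system with small additive noise and counting the large excursions that the noise triggers from an otherwise stable equilibrium. The Euler-Maruyama sketch below uses a standard FitzHugh-Nagumo-type surrogate with assumed parameters; it is not the paper's flow-reactor model or its stochastic sensitivity analysis.

      import numpy as np

      rng = np.random.default_rng(1)
      dt, n_steps, eps, a, sigma = 1e-3, 200_000, 0.05, 1.05, 0.15
      x, y = -1.05, -0.664               # start at the stable equilibrium
      noise = sigma * np.sqrt(dt) * rng.standard_normal(n_steps)

      excursions, above = 0, False
      for k in range(n_steps):
          dx = (x - x**3 / 3.0 - y) / eps * dt + noise[k]
          dy = (x + a) * dt
          x, y = x + dx, y + dy
          if x > 0.5 and not above:      # count large, spike-like excursions
              excursions, above = excursions + 1, True
          elif x < 0.0:
              above = False
      print("noise-induced excursions observed:", excursions)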

  20. Analysis of the stochastic excitability in the flow chemical reactor

    NASA Astrophysics Data System (ADS)

    Bashkirtseva, Irina

    2015-11-01

    A dynamic model of the thermochemical process in the flow reactor is considered. We study the influence of random disturbances on the stationary regime of this model. A phenomenon of noise-induced excitability is demonstrated. For the analysis of this phenomenon, a constructive technique based on the stochastic sensitivity functions and confidence domains is applied. It is shown how the elaborated technique can be used for the probabilistic analysis of the generation of mixed-mode stochastic oscillations in the flow chemical reactor.

  1. Probabilistic pipe fracture evaluations for leak-rate-detection applications

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rahman, S.; Ghadiali, N.; Paul, D.

    1995-04-01

    Regulatory Guide 1.45, "Reactor Coolant Pressure Boundary Leakage Detection Systems," was published by the U.S. Nuclear Regulatory Commission (NRC) in May 1973, and provides guidance on leak detection methods and system requirements for Light Water Reactors. Additionally, leak detection limits are specified in plant Technical Specifications and are different for Boiling Water Reactors (BWRs) and Pressurized Water Reactors (PWRs). These leak detection limits are also used in leak-before-break evaluations performed in accordance with Draft Standard Review Plan, Section 3.6.3, "Leak Before Break Evaluation Procedures," where a margin of 10 on the leak detection limit is used in determining the crack size considered in subsequent fracture analyses. This study was requested by the NRC to: (1) evaluate the conditional failure probability for BWR and PWR piping for pipes that were leaking at the allowable leak detection limit, and (2) evaluate the margin of 10 to determine if it was unnecessarily large. A probabilistic approach was undertaken to conduct fracture evaluations of circumferentially cracked pipes for leak-rate-detection applications. Sixteen nuclear piping systems in BWR and PWR plants were analyzed to evaluate conditional failure probability and effects of crack-morphology variability on the current margins used in leak rate detection for leak-before-break.

  2. GRIZZLY/FAVOR Interface Project Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dickson, Terry L; Williams, Paul T; Yin, Shengjun

    As part of the Light Water Reactor Sustainability (LWRS) Program, the objective of the GRIZZLY/FAVOR Interface project is to create the capability to apply GRIZZLY 3-D finite element (thermal and stress) analysis results as input to FAVOR probabilistic fracture mechanics (PFM) analyses. The key benefit of FAVOR to Grizzly is its probabilistic capability. This document describes the implementation of the GRIZZLY/FAVOR Interface, the preliminary verification and test results, and a user guide that provides detailed step-by-step instructions to run the program.

  3. A probabilistic framework for single-sensor acoustic emission source localization in thin metallic plates

    NASA Astrophysics Data System (ADS)

    Ebrahimkhanlou, Arvin; Salamone, Salvatore

    2017-09-01

    Tracking edge-reflected acoustic emission (AE) waves can allow the localization of their sources. Specifically, in bounded isotropic plate structures, only one sensor may be used to perform these source localizations. The primary goal of this paper is to develop a three-step probabilistic framework to quantify the uncertainties associated with such single-sensor localizations. According to this framework, a probabilistic approach is first used to estimate the direct distances between AE sources and the sensor. Then, an analytical model is used to reconstruct the envelope of edge-reflected AE signals based on the source-to-sensor distance estimations and their first arrivals. Finally, the correlation between the probabilistically reconstructed envelopes and recorded AE signals is used to estimate confidence contours for the location of AE sources. To validate the proposed framework, Hsu-Nielsen pencil lead break (PLB) tests were performed on the surface as well as the edges of an aluminum plate. The localization results show that the estimated confidence contours surround the actual source locations. In addition, the performance of the framework was tested in a noisy environment simulated by two dummy transducers and an arbitrary wave generator. The results show that in low-noise environments, the shape and size of the confidence contours depend on the sources and their locations. However, in highly noisy environments, the size of the confidence contours monotonically increases with the noise floor. These probabilistic results suggest that the proposed framework could provide more comprehensive information regarding the location of AE sources.
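    The first step of the framework, a probabilistic estimate of the direct source-to-sensor distance, can be illustrated by propagating uncertainty in mode velocities and in the measured arrival-time difference between two wave modes through the usual modal-AE distance relation. The velocities, time picks, and uncertainties below are assumed values for illustration, not the paper's measurements.

      import numpy as np

      rng = np.random.default_rng(3)
      n = 100_000

      v_fast = rng.normal(5200.0, 150.0, n)   # faster mode group velocity, m/s (assumed)
      v_slow = rng.normal(3100.0, 150.0, n)   # slower mode group velocity, m/s (assumed)
      dt = rng.normal(32e-6, 2e-6, n)         # measured arrival-time difference, s (assumed)

      # d / v_slow - d / v_fast = dt  =>  d = dt * v_fast * v_slow / (v_fast - v_slow)
      d = dt * v_fast * v_slow / (v_fast - v_slow)

      lo, med, hi = np.percentile(d, [2.5, 50, 97.5])
      print(f"source-to-sensor distance ~ {med * 100:.1f} cm "
            f"(95% interval {lo * 100:.1f}-{hi * 100:.1f} cm)")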

  4. Safety design approach for external events in Japan sodium-cooled fast reactor

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yamano, H.; Kubo, S.; Tani, A.

    2012-07-01

    This paper describes a safety design approach for external events in the design study of the Japan sodium-cooled fast reactor. An emphasis is the introduction of the design extension external condition (DEEC). In addition to seismic design, other external events such as tsunami, strong wind, abnormal temperature, etc. were addressed in this study. From a wide variety of external events consisting of natural hazards and human-induced ones, a screening method was developed in terms of siting, consequence, and frequency to select representative events. Design approaches for these events were categorized on a probabilistic, statistical, or deterministic basis. External hazard conditions were considered mainly for DEECs. In the probabilistic approach, the DEECs of earthquake, tsunami, and strong wind were defined as 1/10 of the exceedance probability of the external design bases. The other representative DEECs were also defined based on statistical or deterministic approaches. (authors)

  5. Probabilistic location estimation of acoustic emission sources in isotropic plates with one sensor

    NASA Astrophysics Data System (ADS)

    Ebrahimkhanlou, Arvin; Salamone, Salvatore

    2017-04-01

    This paper presents a probabilistic acoustic emission (AE) source localization algorithm for isotropic plate structures. The proposed algorithm requires only one sensor and uniformly monitors the entire area of such plates without any blind zones. In addition, it takes a probabilistic approach and quantifies localization uncertainties. The algorithm combines a modal acoustic emission (MAE) and a reflection-based technique to obtain information pertaining to the location of AE sources. To estimate confidence contours for the location of sources, uncertainties are quantified and propagated through the two techniques. The approach was validated using standard pencil lead break (PLB) tests on an aluminum plate. The results demonstrate that the proposed source localization algorithm successfully estimates confidence contours for the location of AE sources.

  6. Exploratory study on a statistical method to analyse time resolved data obtained during nanomaterial exposure measurements

    NASA Astrophysics Data System (ADS)

    Clerc, F.; Njiki-Menga, G.-H.; Witschger, O.

    2013-04-01

    Most of the measurement strategies suggested at the international level to assess workplace exposure to nanomaterials rely on devices measuring, in real time, airborne particle concentrations (according to different metrics). Since none of the instruments used to measure aerosols can distinguish a particle of interest from the background aerosol, the statistical analysis of time resolved data requires special attention. So far, very few approaches have been used for statistical analysis in the literature. These range from simple qualitative analysis of graphs to the implementation of more complex statistical models. To date, there is still no consensus on a particular approach, and the search for an appropriate and robust method continues. In this context, this exploratory study investigates a statistical method to analyse time resolved data based on a Bayesian probabilistic approach. To investigate and illustrate the use of this statistical method, particle number concentration data from a workplace study that investigated the potential for exposure via inhalation from cleanout operations by sandpapering of a reactor producing nanocomposite thin films have been used. In this workplace study, the background issue has been addressed through the near-field and far-field approaches, and several size integrated and time resolved devices have been used. The analysis of the results presented here focuses only on data obtained with two handheld condensation particle counters. While one was measuring at the source of the released particles, the other one was measuring in parallel far-field. The Bayesian probabilistic approach allows a probabilistic modelling of data series, and the observed task is modelled in the form of probability distributions. The probability distributions issuing from time resolved data obtained at the source can be compared with the probability distributions issuing from the time resolved data obtained far-field, leading to a quantitative estimation of the airborne particles released at the source when the task is performed. Beyond the obtained results, this exploratory study indicates that the analysis of the results requires specific experience in statistics.
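    A minimal Bayesian sketch of the near-field/far-field comparison described above treats the particle counts as Poisson, gives the count rates conjugate Gamma posteriors, and takes the posterior difference of the near-field and far-field rates as the contribution of the task. The counts, sampling times, and prior below are hypothetical, not the study's data.

      import numpy as np

      rng = np.random.default_rng(7)

      near_counts, near_seconds = 5400, 600   # counter at the source (assumed)
      far_counts, far_seconds = 3300, 600     # counter far-field (assumed)

      # Gamma(shape, rate) prior; Poisson likelihood gives a Gamma posterior.
      prior_shape, prior_rate = 0.5, 1e-3
      near_rate = rng.gamma(prior_shape + near_counts,
                            1.0 / (prior_rate + near_seconds), 100_000)
      far_rate = rng.gamma(prior_shape + far_counts,
                           1.0 / (prior_rate + far_seconds), 100_000)

      source = near_rate - far_rate           # particles/s attributable to the task
      lo, med, hi = np.percentile(source, [2.5, 50, 97.5])
      print(f"released at source: {med:.2f} particles/s (95% interval {lo:.2f}-{hi:.2f})")
      print(f"P(source contribution > 0) = {np.mean(source > 0):.3f}")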

  7. Mixture Modeling for Background and Sources Separation in x-ray Astronomical Images

    NASA Astrophysics Data System (ADS)

    Guglielmetti, Fabrizia; Fischer, Rainer; Dose, Volker

    2004-11-01

    A probabilistic technique for the joint estimation of background and sources in high-energy astrophysics is described. Bayesian probability theory is applied to gain insight into the coexistence of background and sources through a probabilistic two-component mixture model, which provides consistent uncertainties of background and sources. The present analysis is applied to ROSAT PSPC data (0.1-2.4 keV) in Survey Mode. A background map is modelled using a thin-plate spline. Source probability maps are obtained for each pixel (45 arcsec) independently and for larger correlation lengths, revealing faint and extended sources. We demonstrate that the described probabilistic method improves the detection of faint, extended celestial sources compared to the Standard Analysis Software System (SASS) used for the production of the ROSAT All-Sky Survey (RASS) catalogues.
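
    The two-component idea can be illustrated with a toy per-pixel calculation (a sketch under assumed values, not the paper's model): with Poisson photon counts, an assumed background rate, an assumed source rate, and a prior probability that a pixel contains a source, Bayes' rule gives the source probability for each observed count.

      import numpy as np
      from scipy.stats import poisson

      b = 2.0        # assumed expected background counts per pixel
      s = 4.0        # assumed additional expected counts if a source is present
      prior = 0.01   # assumed prior probability that a pixel hosts a source

      def p_source_given_counts(k):
          # Two-component mixture: background-only vs. background + source.
          like_src = poisson.pmf(k, b + s)
          like_bkg = poisson.pmf(k, b)
          return prior * like_src / (prior * like_src + (1 - prior) * like_bkg)

      for k in [0, 2, 5, 8, 12]:
          print(f"counts={k:2d}  P(source|counts)={p_source_given_counts(k):.3f}")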

  8. Probabilistic drug connectivity mapping

    PubMed Central

    2014-01-01

    Background The aim of connectivity mapping is to match drugs using drug-treatment gene expression profiles from multiple cell lines. This can be viewed as an information retrieval task, with the goal of finding the most relevant profiles for a given query drug. We infer the relevance for retrieval by data-driven probabilistic modeling of the drug responses, resulting in probabilistic connectivity mapping, and further consider the available cell lines as different data sources. We use a special type of probabilistic model to separate what is shared and specific between the sources, in contrast to earlier connectivity mapping methods that have intentionally aggregated all available data, neglecting information about the differences between the cell lines. Results We show that the probabilistic multi-source connectivity mapping method is superior to alternatives in finding functionally and chemically similar drugs from the Connectivity Map data set. We also demonstrate that an extension of the method is capable of retrieving combinations of drugs that match different relevant parts of the query drug response profile. Conclusions The probabilistic modeling-based connectivity mapping method provides a promising alternative to earlier methods. Principled integration of data from different cell lines helps to identify relevant responses for specific drug repositioning applications. PMID:24742351

  9. 75 FR 64366 - Advisory Committee on Reactor Safeguards

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-10-19

    ... NUREG/CR- 6997, ``Modeling a Digital Feedwater Control System Using Traditional Probabilistic Risk.../reading-rm/adams.html or http://www.nrc.gov/reading-rm/doc-collections/ACRS/ . Video teleconferencing... line charges and for providing the equipment and facilities that they use to establish the video...

  10. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bozoki, G.E.; Fitzpatrick, R.G.; Bohn, M.P.

    This report details the review of the Diablo Canyon Probabilistic Risk Assessment (DCPRA). The study was performed by Brookhaven National Laboratory under contract from the Probabilistic Risk Analysis Branch, Office of Nuclear Reactor Research, USNRC. The DCPRA is a full-scope Level I effort, and although the review touched on all aspects of the PRA, the internal events and seismic events received the vast majority of the review effort. The report includes a number of independent systems analyses, sensitivity studies, and importance analyses, as well as conclusions on the adequacy of the DCPRA for use in the Diablo Canyon Long Term Seismic Program.

  11. Initial Probabilistic Evaluation of Reactor Pressure Vessel Fracture with Grizzly and Raven

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Spencer, Benjamin; Hoffman, William; Sen, Sonat

    2015-10-01

    The Grizzly code is being developed with the goal of creating a general tool that can be applied to study a variety of degradation mechanisms in nuclear power plant components. The first application of Grizzly has been to study fracture in embrittled reactor pressure vessels (RPVs). Grizzly can be used to model the thermal/mechanical response of an RPV under transient conditions that would be observed in a pressurized thermal shock (PTS) scenario. The global response of the vessel provides boundary conditions for local models of the material in the vicinity of a flaw. Fracture domain integrals are computed to obtain stress intensity factors, which can in turn be used to assess whether a fracture would initiate at a pre-existing flaw. These capabilities have been demonstrated previously. A typical RPV is likely to contain a large population of pre-existing flaws introduced during the manufacturing process. This flaw population is characterized statistically through probability density functions of the flaw distributions. The use of probabilistic techniques is necessary to assess the likelihood of crack initiation during a transient event. This report documents initial work to perform probabilistic analysis of RPV fracture during a PTS event using a combination of the RAVEN risk analysis code and Grizzly. This work is limited in scope, considering only a single flaw with deterministic geometry, but with uncertainty introduced in the parameters that influence fracture toughness. These results are benchmarked against equivalent models run in the FAVOR code. When fully developed, the RAVEN/Grizzly methodology for modeling probabilistic fracture in RPVs will provide a general capability that can be used to consider a wider variety of vessel and flaw conditions that are difficult to consider with current tools. In addition, this will provide access to advanced probabilistic techniques provided by RAVEN, including adaptive sampling and parallelism, which can dramatically decrease run times.
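
    The essence of such a single-flaw probabilistic calculation can be sketched as follows (illustrative only, not the Grizzly/RAVEN or FAVOR implementation): the applied stress intensity factor at the flaw is taken as given from a deterministic PTS analysis, fracture toughness is sampled from an assumed distribution whose parameters carry the embrittlement uncertainty, and the conditional probability of crack initiation is the fraction of samples in which the applied driving force exceeds the toughness.

      import numpy as np

      rng = np.random.default_rng(2)

      # Assumed applied stress intensity factor at the flaw during the transient
      # (MPa*sqrt(m)), taken from a deterministic thermal/mechanical analysis.
      K_applied = 60.0

      # Assumed lognormal fracture toughness model; the median reflects embrittlement
      # (e.g., a shifted RT_NDT) and the log-standard-deviation reflects material scatter.
      median_KIc = 85.0
      sigma_ln   = 0.25

      n = 1_000_000
      K_Ic = rng.lognormal(mean=np.log(median_KIc), sigma=sigma_ln, size=n)

      # Conditional probability of crack initiation given the transient.
      p_init = np.mean(K_applied > K_Ic)
      print(f"conditional probability of initiation ≈ {p_init:.2e}")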

  12. Development/Modernization of an Advanced Non-Light Water Reactor Probabilistic Risk Assessment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Henneke, Dennis W.; Robinson, James

    In 2015, GE Hitachi Nuclear Energy (GEH) teamed with Argonne National Laboratory (Argonne) to perform Research and Development (R&D) of next-generation Probabilistic Risk Assessment (PRA) methodologies for the modernization of an advanced non-Light Water Reactor (non-LWR) PRA. This effort built upon a PRA developed in the early 1990s for GEH’s Power Reactor Inherently Safe Module (PRISM) Sodium Fast Reactor (SFR). The work had four main tasks: internal events development modeling the risk from the reactor for hazards occurring at-power internal to the plant; an all hazards scoping review to analyze the risk at a high level from external hazards such as earthquakes and high winds; an all modes scoping review to understand the risk at a high level from operating modes other than at-power; and risk insights to integrate the results from each of the three phases above. To achieve these objectives, GEH and Argonne used and adapted proven PRA methodologies and techniques to build a modern non-LWR all hazards/all modes PRA. The teams also advanced non-LWR PRA methodologies, which is an important outcome from this work. This report summarizes the project outcomes in two major phases. The first phase presents the methodologies developed for non-LWR PRAs. The methodologies are grouped by scope, from Internal Events At-Power (IEAP) to hazards analysis to modes analysis. The second phase presents details of the PRISM PRA model which was developed as a validation of the non-LWR methodologies. The PRISM PRA was performed in detail for IEAP, and at a broader level for hazards and modes. In addition to contributing methodologies, this project developed risk insights applicable to non-LWR PRA, including focus-areas for future R&D, and conclusions about the PRISM design.

  13. Improved Point-source Detection in Crowded Fields Using Probabilistic Cataloging

    NASA Astrophysics Data System (ADS)

    Portillo, Stephen K. N.; Lee, Benjamin C. G.; Daylan, Tansu; Finkbeiner, Douglas P.

    2017-10-01

    Cataloging is challenging in crowded fields because sources are extremely covariant with their neighbors and blending makes even the number of sources ambiguous. We present the first optical probabilistic catalog, cataloging a crowded (˜0.1 sources per pixel brighter than 22nd mag in F606W) Sloan Digital Sky Survey r-band image from M2. Probabilistic cataloging returns an ensemble of catalogs inferred from the image and thus can capture source-source covariance and deblending ambiguities. By comparing to a traditional catalog of the same image and a Hubble Space Telescope catalog of the same region, we show that our catalog ensemble better recovers sources from the image. It goes more than a magnitude deeper than the traditional catalog while having a lower false-discovery rate brighter than 20th mag. We also present an algorithm for reducing this catalog ensemble to a condensed catalog that is similar to a traditional catalog, except that it explicitly marginalizes over source-source covariances and nuisance parameters. We show that this condensed catalog has a similar completeness and false-discovery rate to the catalog ensemble. Future telescopes will be more sensitive, and thus more of their images will be crowded. Probabilistic cataloging performs better than existing software in crowded fields and so should be considered when creating photometric pipelines in the Large Synoptic Survey Telescope era.

  14. Boosting probabilistic graphical model inference by incorporating prior knowledge from multiple sources.

    PubMed

    Praveen, Paurush; Fröhlich, Holger

    2013-01-01

    Inferring regulatory networks from experimental data via probabilistic graphical models is a popular framework to gain insights into biological systems. However, the inherent noise in experimental data coupled with a limited sample size reduces the performance of network reverse engineering. Prior knowledge from existing sources of biological information can address this low signal to noise problem by biasing the network inference towards biologically plausible network structures. Although integrating various sources of information is desirable, their heterogeneous nature makes this task challenging. We propose two computational methods to incorporate various information sources into a probabilistic consensus structure prior to be used in graphical model inference. Our first model, called Latent Factor Model (LFM), assumes a high degree of correlation among external information sources and reconstructs a hidden variable as a common source in a Bayesian manner. The second model, a Noisy-OR, picks up the strongest support for an interaction among information sources in a probabilistic fashion. Our extensive computational studies on KEGG signaling pathways as well as on gene expression data from breast cancer and yeast heat shock response reveal that both approaches can significantly enhance the reconstruction accuracy of Bayesian Networks compared to other competing methods as well as to the situation without any prior. Our framework allows for using diverse information sources, like pathway databases, GO terms and protein domain data, etc. and is flexible enough to integrate new sources, if available.
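
    As a concrete illustration of the Noisy-OR combination described above (a sketch of the general Noisy-OR rule, not the authors' full model), suppose each of several knowledge sources assigns a confidence that a particular edge exists; under Noisy-OR the combined prior probability is one minus the product of the per-source miss probabilities, so a single strong source is enough to produce a high prior.

      import numpy as np

      def noisy_or(confidences, leak=0.0):
          # Combine per-source edge confidences q_k with a Noisy-OR gate;
          # `leak` is an optional baseline probability with no supporting source.
          q = np.asarray(confidences, dtype=float)
          return 1.0 - (1.0 - leak) * np.prod(1.0 - q)

      # Hypothetical support for one edge from a pathway database, GO term
      # co-annotation, and protein domain data.
      print(noisy_or([0.7, 0.2, 0.1]))             # one strong source dominates
      print(noisy_or([0.2, 0.2, 0.2]))             # several weak sources
      print(noisy_or([0.0, 0.0, 0.0], leak=0.05))  # leak term only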

  15. 77 FR 58420 - Advisory Committee On Reactor Safeguards; Notice of Meeting

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-09-20

    ... Pike, Rockville, Maryland. Thursday, October 4, 2012, Conference Room T2-B1, 11545 Rockville Pike....: Safety Evaluation Report (SER) Associated with WCAP-16793-NP, Revision 2, ``Evaluation of Long-Term..., ``Evaluation of JNES Equipment Fragility Tests for Use in Seismic Probabilistic Risk Assessments for U.S...

  16. Is Probabilistic Evidence a Source of Knowledge?

    ERIC Educational Resources Information Center

    Friedman, Ori; Turri, John

    2015-01-01

    We report a series of experiments examining whether people ascribe knowledge for true beliefs based on probabilistic evidence. Participants were less likely to ascribe knowledge for beliefs based on probabilistic evidence than for beliefs based on perceptual evidence (Experiments 1 and 2A) or testimony providing causal information (Experiment 2B).…

  17. Boosting Probabilistic Graphical Model Inference by Incorporating Prior Knowledge from Multiple Sources

    PubMed Central

    Praveen, Paurush; Fröhlich, Holger

    2013-01-01

    Inferring regulatory networks from experimental data via probabilistic graphical models is a popular framework to gain insights into biological systems. However, the inherent noise in experimental data coupled with a limited sample size reduces the performance of network reverse engineering. Prior knowledge from existing sources of biological information can address this low signal to noise problem by biasing the network inference towards biologically plausible network structures. Although integrating various sources of information is desirable, their heterogeneous nature makes this task challenging. We propose two computational methods to incorporate various information sources into a probabilistic consensus structure prior to be used in graphical model inference. Our first model, called Latent Factor Model (LFM), assumes a high degree of correlation among external information sources and reconstructs a hidden variable as a common source in a Bayesian manner. The second model, a Noisy-OR, picks up the strongest support for an interaction among information sources in a probabilistic fashion. Our extensive computational studies on KEGG signaling pathways as well as on gene expression data from breast cancer and yeast heat shock response reveal that both approaches can significantly enhance the reconstruction accuracy of Bayesian Networks compared to other competing methods as well as to the situation without any prior. Our framework allows for using diverse information sources, like pathway databases, GO terms and protein domain data, etc. and is flexible enough to integrate new sources, if available. PMID:23826291

  18. 75 FR 30077 - Advisory Committee On Reactor Safeguards (ACRS); Meeting of the ACRS Subcommittee On Digital I&C...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-05-28

    ... Subcommittee On Digital I&C Systems The ACRS Subcommittee on Digital Instrumentation and Control (DI&C) Systems... the area of Digital Instrumentation and Control (DI&C) Probabilistic Risk Assessment (PRA). Topics... software reliability methods (QSRMs), NUREG/CR--6997, ``Modeling a Digital Feedwater Control System Using...

  19. Probabilistic distributions of pinhole defects in atomic layer deposited films on polymeric substrates

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yersak, Alexander S., E-mail: alexander.yersak@colorado.edu; Lee, Yung-Cheng

    Pinhole defects in atomic layer deposition (ALD) coatings were measured over an area of 30 cm² in an ALD reactor, and these defects were represented by a probabilistic cluster model instead of a single defect-density value (number of defects per unit area). With the probabilistic cluster model, the pinhole defects were simulated over a manufacturing-scale surface area of ∼1 m². Large-area pinhole defect simulations were used to develop an improved design method for ALD-based devices. A flexible thermal ground plane (FTGP) device requiring ALD hermetic coatings was used as an example. Using a single defect-density value, it was determined that the FTGP device would not be feasible for applications with operating temperatures higher than 60 °C. The new probabilistic cluster model shows that up to 40.3% of the FTGP devices would be acceptable. With this new approach, the manufacturing yield of ALD-enabled or other thin-film-based devices with different design configurations can be determined, which is important for guiding process optimization, process control, and design for manufacturability.
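
    A minimal sketch of how a clustered (rather than uniform) defect model changes yield estimates, using assumed parameters rather than the paper's fitted values: defects are simulated as a parent-offspring (Neyman-Scott) cluster process over a large panel, the panel is tiled into devices, and yield is the fraction of devices containing no defects; a uniform Poisson model with the same mean density is shown for comparison.

      import numpy as np

      rng = np.random.default_rng(3)

      area_side = 1.0          # simulate a 1 m x 1 m panel
      device    = 0.05         # 5 cm x 5 cm devices (400 per panel)
      mean_defects = 800       # assumed mean total number of pinholes on the panel

      def yield_fraction(xy):
          # Fraction of device tiles that contain zero defects.
          nbins = int(area_side / device)
          counts, _, _ = np.histogram2d(xy[:, 0], xy[:, 1],
                                        bins=nbins, range=[[0, 1], [0, 1]])
          return np.mean(counts == 0)

      # Uniform Poisson model: defects scattered independently over the panel.
      uniform_xy = rng.random((rng.poisson(mean_defects), 2))

      # Clustered model: Poisson parents, each with Poisson offspring scattered nearby.
      parents = rng.random((rng.poisson(40), 2))
      offspring = [p + rng.normal(0, 0.01, (rng.poisson(mean_defects / 40), 2))
                   for p in parents]
      cluster_xy = np.clip(np.vstack(offspring), 0, 1 - 1e-9)

      print("yield, uniform model:   %.2f" % yield_fraction(uniform_xy))
      print("yield, clustered model: %.2f" % yield_fraction(cluster_xy))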

  20. Towards an Artificial Space Object Taxonomy

    DTIC Science & Technology

    2013-09-01

    demonstrate how to implement this taxonomy in Figaro, an open source probabilistic programming language.

  1. Effect of time dependence on probabilistic seismic-hazard maps and deaggregation for the central Apennines, Italy

    USGS Publications Warehouse

    Akinci, A.; Galadini, F.; Pantosti, D.; Petersen, M.; Malagnini, L.; Perkins, D.

    2009-01-01

    We produce probabilistic seismic-hazard assessments for the central Apennines, Italy, using time-dependent models that are characterized by a Brownian passage time recurrence model. Using aperiodicity parameters α of 0.3, 0.5, and 0.7, we examine the sensitivity of the probabilistic ground motion and its deaggregation to these parameters. For the seismic source model we incorporate both smoothed historical seismicity over the area and geological information on faults. We use the maximum magnitude model for the fault sources together with a uniform probability of rupture along the fault (floating fault model) to model fictitious faults, to account for earthquakes that cannot be correlated with known geologic structural segmentation.
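
    For context, a Brownian passage time (BPT) recurrence model is an inverse Gaussian distribution of inter-event times, parameterized by a mean recurrence interval and an aperiodicity α; the conditional probability of rupture in a coming exposure window, given the time elapsed since the last event, can be estimated as in the sketch below (illustrative values, not the paper's faults).

      import numpy as np

      rng = np.random.default_rng(4)

      mean_recurrence = 1000.0   # assumed mean inter-event time (years)
      alpha = 0.5                # aperiodicity; BPT shape parameter = mean / alpha**2
      t_elapsed = 700.0          # assumed time since the last earthquake (years)
      window = 50.0              # exposure time of interest (years)

      # The BPT distribution is the inverse Gaussian (Wald) distribution; numpy's
      # wald sampler takes the mean and the shape parameter (scale).
      samples = rng.wald(mean=mean_recurrence,
                         scale=mean_recurrence / alpha**2,
                         size=2_000_000)

      # Conditional probability of an event in the next `window` years,
      # given that `t_elapsed` years have passed without one.
      survived = samples > t_elapsed
      p_cond = np.mean(samples[survived] <= t_elapsed + window)
      print(f"P(event in next {window:.0f} yr | {t_elapsed:.0f} yr elapsed) ≈ {p_cond:.3f}")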

  2. Probabilistic evaluation of n traces with no putative source: A likelihood ratio based approach in an investigative framework.

    PubMed

    De March, I; Sironi, E; Taroni, F

    2016-09-01

    Analysis of marks recovered from different crime scenes can be useful to detect a linkage between criminal cases, even though a putative source for the recovered traces is not available. This circumstance is often encountered in the early stages of an investigation, so the evaluation of evidence association may provide useful information for the investigators. This association is evaluated here from a probabilistic point of view: a likelihood ratio based approach is suggested in order to quantify the strength of the evidence of trace association in the light of two mutually exclusive propositions, namely that the n traces come from a common source or from an unspecified number of sources. To deal with this kind of problem, probabilistic graphical models are used, in the form of Bayesian networks and object-oriented Bayesian networks, allowing users to intuitively handle the uncertainty related to the inferential problem. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
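
    A toy numerical version of such a likelihood ratio (a sketch under a simple Gaussian measurement model, not the paper's Bayesian networks): each trace yields one measured feature, sources vary according to an assumed population distribution, and the LR compares the marginal likelihood of all n measurements sharing one source against each measurement coming from its own source; marginal likelihoods are approximated by Monte Carlo over the source distribution.

      import numpy as np
      from scipy.stats import norm

      rng = np.random.default_rng(5)

      # Hypothetical measured feature values for n = 3 recovered traces.
      x = np.array([1.52, 1.50, 1.53])
      sigma_within = 0.02              # assumed within-source measurement spread
      mu_pop, sigma_pop = 1.40, 0.10   # assumed between-source population distribution

      # Monte Carlo draws of candidate source means from the population.
      theta = rng.normal(mu_pop, sigma_pop, 200_000)

      # H1: all traces share one (unknown) source mean.
      lik_h1 = np.mean(np.prod(norm.pdf(x[None, :], theta[:, None], sigma_within), axis=1))

      # H2: each trace has its own independent source mean.
      lik_h2 = np.prod([np.mean(norm.pdf(xi, theta, sigma_within)) for xi in x])

      print("likelihood ratio LR =", lik_h1 / lik_h2)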

  3. A PROBABILISTIC MODELING FRAMEWORK FOR PREDICTING POPULATION EXPOSURES TO BENZENE

    EPA Science Inventory

    The US Environmental Protection Agency (EPA) is modifying their probabilistic Stochastic Human Exposure Dose Simulation (SHEDS) model to assess aggregate exposures to air toxics. Air toxics include urban Hazardous Air Pollutants (HAPS) such as benzene from mobile sources, part...

  4. INTEGRATING PROBABILISTIC AND FIXED-SITE MONITORING FOR ROBUST WATER QUALITY ASSESSMENTS

    EPA Science Inventory

    Determining the extent of water-quality degradation, controlling nonpoint sources, and defining allowable amounts of contaminants are important water-quality issues defined in the Clean Water Act that require new monitoring data. Probabilistic, randomized stream water-quality mon...

  5. Fast, noise-free memory for photon synchronization at room temperature.

    PubMed

    Finkelstein, Ran; Poem, Eilon; Michel, Ohad; Lahad, Ohr; Firstenberg, Ofer

    2018-01-01

    Future quantum photonic networks require coherent optical memories for synchronizing quantum sources and gates of probabilistic nature. We demonstrate a fast ladder memory (FLAME) mapping the optical field onto the superposition between electronic orbitals of rubidium vapor. Using a ladder-level system of orbital transitions with nearly degenerate frequencies simultaneously enables high bandwidth, low noise, and long memory lifetime. We store and retrieve 1.7-ns-long pulses, containing 0.5 photons on average, and observe short-time external efficiency of 25%, memory lifetime (1/e) of 86 ns, and below 10⁻⁴ added noise photons. Consequently, coupling this memory to a probabilistic source would enhance the on-demand photon generation probability by a factor of 12, the highest number yet reported for a noise-free, room temperature memory. This paves the way toward the controlled production of large quantum states of light from probabilistic photon sources.

  6. A spatio-temporal model for probabilistic seismic hazard zonation of Tehran

    NASA Astrophysics Data System (ADS)

    Hashemi, Mahdi; Alesheikh, Ali Asghar; Zolfaghari, Mohammad Reza

    2013-08-01

    A precondition for all disaster management steps, building damage prediction, and construction code development is a hazard assessment that shows the exceedance probabilities of different ground motion levels at a site, considering different near- and far-field earthquake sources. The seismic sources are usually categorized as time-independent area sources and time-dependent fault sources. While the former incorporate the small and medium events, the latter take into account only the large characteristic earthquakes. In this article, a probabilistic approach is proposed to aggregate the effects of time-dependent and time-independent sources on seismic hazard. The methodology is then applied to generate three probabilistic seismic hazard maps of Tehran for 10%, 5%, and 2% exceedance probabilities in 50 years. The results indicate an increase in peak ground acceleration (PGA) values toward the southeastern part of the study area, and the PGA variations are mostly controlled by the shear wave velocities across the city. In addition, the implementation of the methodology takes advantage of GIS capabilities, especially raster-based analyses and representations. During the estimation of the PGA exceedance rates, the emphasis has been placed on incorporating the effects of different attenuation relationships and seismic source models by using a logic tree.
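
    The aggregation step can be illustrated with a small sketch (assumed rates, not the Tehran model): each source contributes an annual rate of exceeding a given PGA level, the rates of time-independent area sources and an equivalent rate for a time-dependent fault source are summed, and the exceedance probability over a 50-year exposure follows from a Poisson assumption.

      import numpy as np

      T = 50.0   # exposure time (years)

      # Time-independent area sources: assumed annual rates of exceeding PGA = 0.3 g.
      area_rates = np.array([2.0e-3, 8.0e-4, 3.0e-4])

      # Time-dependent fault source: assumed conditional probability of exceeding
      # the same PGA level within the next 50 years (e.g., from a renewal model).
      p_fault_50yr = 0.06

      # Convert the fault's 50-year probability to an equivalent effective rate,
      # sum all contributions, and convert back to a 50-year exceedance probability.
      fault_rate_eff = -np.log(1.0 - p_fault_50yr) / T
      total_rate = area_rates.sum() + fault_rate_eff
      p_total = 1.0 - np.exp(-total_rate * T)

      print(f"total annual exceedance rate ≈ {total_rate:.2e} /yr")
      print(f"P(exceed 0.3 g in 50 yr) ≈ {p_total:.3f}")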

  7. PROBABILISTIC ASSESSMENT OF GROUNDWATER VULNERABILITY TO NONPOINT SOURCE POLLUTION IN AGRICULTURAL WATERSHEDS

    EPA Science Inventory

    This paper presents a probabilistic framework for the assessment of groundwater pollution potential by pesticides in two adjacent agricultural watersheds in the Mid-Altantic Coastal Plain. Indices for estimating streams vulnerability to pollutants' load from the surficial aquifer...

  8. INTEGRATION OF RELIABILITY WITH MECHANISTIC THERMALHYDRAULICS: REPORT ON APPROACH AND TEST PROBLEM RESULTS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    J. S. Schroeder; R. W. Youngblood

    The Risk-Informed Safety Margin Characterization (RISMC) pathway of the Light Water Reactor Sustainability Program is developing simulation-based methods and tools for analyzing safety margin from a modern perspective. [1] There are multiple definitions of 'margin.' One class of definitions defines margin in terms of the distance between a point estimate of a given performance parameter (such as peak clad temperature) and a point-value acceptance criterion defined for that parameter (such as 2200 °F). The present perspective on margin is that it relates to the probability of failure, and not just the distance between a nominal operating point and a criterion. In this work, margin is characterized through a probabilistic analysis of the 'loads' imposed on systems, structures, and components, and their 'capacity' to resist those loads without failing. Given the probabilistic load and capacity spectra, one can assess the probability that load exceeds capacity, leading to component failure. Within the project, we refer to a plot of these probabilistic spectra as 'the logo.' Refer to Figure 1 for a notional illustration. The implications of referring to 'the logo' are (1) RISMC is focused on being able to analyze loads and spectra probabilistically, and (2) calling it 'the logo' tacitly acknowledges that it is a highly simplified picture: meaningful analysis of a given component failure mode may require development of probabilistic spectra for multiple physical parameters, and in many practical cases, 'load' and 'capacity' will not vary independently.
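
    The load-versus-capacity picture reduces to a simple calculation once probabilistic spectra are available; a minimal sketch (generic distributions, not a RISMC analysis) estimates the failure probability as the chance that a sampled load exceeds a sampled capacity, here with both treated as independent lognormals.

      import numpy as np
      from scipy.stats import norm

      rng = np.random.default_rng(6)

      # Assumed probabilistic "spectra": load and capacity as independent lognormals
      # (e.g., peak clad temperature demand vs. the temperature the component can resist).
      load     = rng.lognormal(mean=np.log(1800.0), sigma=0.08, size=2_000_000)  # deg F
      capacity = rng.lognormal(mean=np.log(2200.0), sigma=0.05, size=2_000_000)  # deg F

      # Failure probability: a sampled load exceeds a sampled capacity.
      p_fail = np.mean(load > capacity)
      print(f"sampling estimate of P(load > capacity) ≈ {p_fail:.2e}")

      # For independent lognormals the same probability has a closed form,
      # useful as a sanity check on the sampling estimate.
      mu_diff = np.log(1800.0) - np.log(2200.0)
      sd_diff = np.hypot(0.08, 0.05)
      print(f"analytic check                          ≈ {norm.cdf(mu_diff / sd_diff):.2e}")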

  9. Development of Probabilistic Flood Inundation Mapping For Flooding Induced by Dam Failure

    NASA Astrophysics Data System (ADS)

    Tsai, C.; Yeh, J. J. J.

    2017-12-01

    A primary function of flood inundation mapping is to forecast flood hazards and assess potential losses. However, uncertainties limit the reliability of inundation hazard assessments, and the major sources of uncertainty should be taken into consideration by an optimal flood management strategy. This study focuses on the 20 km reach downstream of the Shihmen Reservoir in Taiwan. A dam-failure-induced flood provides the upstream boundary conditions for flood routing. The two major sources of uncertainty considered in the hydraulic model and the flood inundation mapping are uncertainty in the dam break model and uncertainty in the roughness coefficient. The perturbance moment method is applied to a dam break model and the hydrosystem model to develop probabilistic flood inundation mapping. Various numbers of uncertain variables can be considered in these models and the variability of the outputs can be quantified. Probabilistic flood inundation mapping for dam-break-induced floods can thus be developed with the commonly used HEC-RAS model while accounting for output variability. Different probabilistic flood inundation mappings are discussed and compared. Probabilistic flood inundation mappings are expected to provide new physical insights in support of evaluating flooded areas downstream of reservoirs of concern.

  10. Probabilistic tsunami hazard assessment at Seaside, Oregon, for near- and far-field seismic sources

    USGS Publications Warehouse

    Gonzalez, F.I.; Geist, E.L.; Jaffe, B.; Kanoglu, U.; Mofjeld, H.; Synolakis, C.E.; Titov, V.V.; Areas, D.; Bellomo, D.; Carlton, D.; Horning, T.; Johnson, J.; Newman, J.; Parsons, T.; Peters, R.; Peterson, C.; Priest, G.; Venturato, A.; Weber, J.; Wong, F.; Yalciner, A.

    2009-01-01

    The first probabilistic tsunami flooding maps have been developed. The methodology, called probabilistic tsunami hazard assessment (PTHA), integrates tsunami inundation modeling with methods of probabilistic seismic hazard assessment (PSHA). Application of the methodology to Seaside, Oregon, has yielded estimates of the spatial distribution of 100- and 500-year maximum tsunami amplitudes, i.e., amplitudes with 1% and 0.2% annual probability of exceedance. The 100-year tsunami is generated most frequently by far-field sources in the Alaska-Aleutian Subduction Zone and is characterized by maximum amplitudes that do not exceed 4 m, with an inland extent of less than 500 m. In contrast, the 500-year tsunami is dominated by local sources in the Cascadia Subduction Zone and is characterized by maximum amplitudes in excess of 10 m and an inland extent of more than 1 km. The primary sources of uncertainty in these results include those associated with interevent time estimates, modeling of background sea level, and accounting for temporal changes in bathymetry and topography. Nonetheless, PTHA represents an important contribution to tsunami hazard assessment techniques; viewed in the broader context of risk analysis, PTHA provides a method for quantifying estimates of the likelihood and severity of the tsunami hazard, which can then be combined with vulnerability and exposure to yield estimates of tsunami risk. Copyright 2009 by the American Geophysical Union.

  11. Probabilistic tsunami hazard analysis: Multiple sources and global applications

    USGS Publications Warehouse

    Grezio, Anita; Babeyko, Andrey; Baptista, Maria Ana; Behrens, Jörn; Costa, Antonio; Davies, Gareth; Geist, Eric L.; Glimsdal, Sylfest; González, Frank I.; Griffin, Jonathan; Harbitz, Carl B.; LeVeque, Randall J.; Lorito, Stefano; Løvholt, Finn; Omira, Rachid; Mueller, Christof; Paris, Raphaël; Parsons, Thomas E.; Polet, Jascha; Power, William; Selva, Jacopo; Sørensen, Mathilde B.; Thio, Hong Kie

    2017-01-01

    Applying probabilistic methods to infrequent but devastating natural events is intrinsically challenging. For tsunami analyses, a suite of geophysical assessments should be in principle evaluated because of the different causes generating tsunamis (earthquakes, landslides, volcanic activity, meteorological events, and asteroid impacts) with varying mean recurrence rates. Probabilistic Tsunami Hazard Analyses (PTHAs) are conducted in different areas of the world at global, regional, and local scales with the aim of understanding tsunami hazard to inform tsunami risk reduction activities. PTHAs enhance knowledge of the potential tsunamigenic threat by estimating the probability of exceeding specific levels of tsunami intensity metrics (e.g., run-up or maximum inundation heights) within a certain period of time (exposure time) at given locations (target sites); these estimates can be summarized in hazard maps or hazard curves. This discussion presents a broad overview of PTHA, including (i) sources and mechanisms of tsunami generation, emphasizing the variety and complexity of the tsunami sources and their generation mechanisms, (ii) developments in modeling the propagation and impact of tsunami waves, and (iii) statistical procedures for tsunami hazard estimates that include the associated epistemic and aleatoric uncertainties. Key elements in understanding the potential tsunami hazard are discussed, in light of the rapid development of PTHA methods during the last decade and the globally distributed applications, including the importance of considering multiple sources, their relative intensities, probabilities of occurrence, and uncertainties in an integrated and consistent probabilistic framework.

  12. Probabilistic Tsunami Hazard Analysis: Multiple Sources and Global Applications

    NASA Astrophysics Data System (ADS)

    Grezio, Anita; Babeyko, Andrey; Baptista, Maria Ana; Behrens, Jörn; Costa, Antonio; Davies, Gareth; Geist, Eric L.; Glimsdal, Sylfest; González, Frank I.; Griffin, Jonathan; Harbitz, Carl B.; LeVeque, Randall J.; Lorito, Stefano; Løvholt, Finn; Omira, Rachid; Mueller, Christof; Paris, Raphaël.; Parsons, Tom; Polet, Jascha; Power, William; Selva, Jacopo; Sørensen, Mathilde B.; Thio, Hong Kie

    2017-12-01

    Applying probabilistic methods to infrequent but devastating natural events is intrinsically challenging. For tsunami analyses, a suite of geophysical assessments should be in principle evaluated because of the different causes generating tsunamis (earthquakes, landslides, volcanic activity, meteorological events, and asteroid impacts) with varying mean recurrence rates. Probabilistic Tsunami Hazard Analyses (PTHAs) are conducted in different areas of the world at global, regional, and local scales with the aim of understanding tsunami hazard to inform tsunami risk reduction activities. PTHAs enhance knowledge of the potential tsunamigenic threat by estimating the probability of exceeding specific levels of tsunami intensity metrics (e.g., run-up or maximum inundation heights) within a certain period of time (exposure time) at given locations (target sites); these estimates can be summarized in hazard maps or hazard curves. This discussion presents a broad overview of PTHA, including (i) sources and mechanisms of tsunami generation, emphasizing the variety and complexity of the tsunami sources and their generation mechanisms, (ii) developments in modeling the propagation and impact of tsunami waves, and (iii) statistical procedures for tsunami hazard estimates that include the associated epistemic and aleatoric uncertainties. Key elements in understanding the potential tsunami hazard are discussed, in light of the rapid development of PTHA methods during the last decade and the globally distributed applications, including the importance of considering multiple sources, their relative intensities, probabilities of occurrence, and uncertainties in an integrated and consistent probabilistic framework.

  13. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Monteleone, S.

    This three-volume report contains 90 papers out of the 102 that were presented at the Twenty-First Water Reactor Safety Information Meeting held at the Bethesda Marriott Hotel, Bethesda, Maryland, during the week of October 25-27, 1993. The papers are printed in the order of their presentation in each session and describe progress and results of programs in nuclear safety research conducted in this country and abroad. Foreign participation in the meeting included papers presented by researchers from France, Germany, Japan, Russia, Switzerland, Taiwan, and United Kingdom. The titles of the papers and the names of the authors have been updated and may differ from those that appeared in the final program of the meeting. Individual papers have been cataloged separately. This document, Volume 1, covers the following topics: Advanced Reactor Research; Advanced Instrumentation and Control Hardware; Advanced Control System Technology; Human Factors Research; Probabilistic Risk Assessment Topics; Thermal Hydraulics; and Thermal Hydraulic Research for Advanced Passive Light Water Reactors.

  14. Dynamic event tree analysis with the SAS4A/SASSYS-1 safety analysis code

    DOE PAGES

    Jankovsky, Zachary K.; Denman, Matthew R.; Aldemir, Tunc

    2018-02-02

    The consequences of a transient in an advanced sodium-cooled fast reactor are difficult to capture with the traditional approach to probabilistic risk assessment (PRA). Numerous safety-relevant systems are passive and may have operational states that cannot be represented by binary success or failure. In addition, the specific order and timing of events may be crucial, which necessitates the use of dynamic PRA tools such as ADAPT. The modifications to the SAS4A/SASSYS-1 sodium-cooled fast reactor safety analysis code for linking it to ADAPT to perform a dynamic PRA are described. A test case is used to demonstrate the linking process and to illustrate the type of insights that may be gained with this process. Finally, newly-developed dynamic importance measures are used to assess the significance of reactor parameters/constituents on calculated consequences of initiating events.

  15. Dynamic event tree analysis with the SAS4A/SASSYS-1 safety analysis code

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jankovsky, Zachary K.; Denman, Matthew R.; Aldemir, Tunc

    The consequences of a transient in an advanced sodium-cooled fast reactor are difficult to capture with the traditional approach to probabilistic risk assessment (PRA). Numerous safety-relevant systems are passive and may have operational states that cannot be represented by binary success or failure. In addition, the specific order and timing of events may be crucial, which necessitates the use of dynamic PRA tools such as ADAPT. The modifications to the SAS4A/SASSYS-1 sodium-cooled fast reactor safety analysis code for linking it to ADAPT to perform a dynamic PRA are described. A test case is used to demonstrate the linking process and to illustrate the type of insights that may be gained with this process. Finally, newly-developed dynamic importance measures are used to assess the significance of reactor parameters/constituents on calculated consequences of initiating events.

  16. Modeling of a Flooding Induced Station Blackout for a Pressurized Water Reactor Using the RISMC Toolkit

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mandelli, Diego; Prescott, Steven R; Smith, Curtis L

    2011-07-01

    In the Risk Informed Safety Margin Characterization (RISMC) approach we want to understand not just the frequency of an event like core damage, but how close we are (or are not) to key safety-related events and how we might increase our safety margins. The RISMC Pathway uses the probabilistic margin approach to quantify impacts to reliability and safety by coupling both probabilistic (via stochastic simulation) and mechanistic (via physics models) approaches. This coupling takes place through the interchange of physical parameters and operational or accident scenarios. In this paper we apply the RISMC approach to evaluate the impact of a power uprate on a pressurized water reactor (PWR) for a tsunami-induced flooding test case. This analysis is performed using the RISMC toolkit: the RELAP-7 and RAVEN codes. RELAP-7 is the new generation of system analysis codes that is responsible for simulating the thermal-hydraulic dynamics of PWR and boiling water reactor systems. RAVEN has two capabilities: to act as a controller of the RELAP-7 simulation (e.g., system activation) and to perform statistical analyses (e.g., run multiple RELAP-7 simulations where the sequencing/timing of events has been changed according to a set of stochastic distributions). By using the RISMC toolkit, we can evaluate how the power uprate affects the system recovery measures needed to avoid core damage after the PWR has lost all available AC power due to tsunami-induced flooding. The simulation of the actual flooding is performed using a smoothed particle hydrodynamics code, NEUTRINO.

  17. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Grabaskas, David; Brunett, Acacia J.; Passerini, Stefano

    GE Hitachi Nuclear Energy (GEH) and Argonne National Laboratory (Argonne) participated in a two year collaboration to modernize and update the probabilistic risk assessment (PRA) for the PRISM sodium fast reactor. At a high level, the primary outcome of the project was the development of a next-generation PRA that is intended to enable risk-informed prioritization of safety- and reliability-focused research and development. A central Argonne task during this project was a reliability assessment of passive safety systems, which included the Reactor Vessel Auxiliary Cooling System (RVACS) and the inherent reactivity feedbacks of the metal fuel core. Both systems were examined utilizing a methodology derived from the Reliability Method for Passive Safety Functions (RMPS), with an emphasis on developing success criteria based on mechanistic system modeling while also maintaining consistency with the Fuel Damage Categories (FDCs) of the mechanistic source term assessment. This paper provides an overview of the reliability analyses of both systems, including highlights of the FMEAs, the construction of best-estimate models, uncertain parameter screening and propagation, and the quantification of system failure probability. In particular, special focus is given to the methodologies to perform the analysis of uncertainty propagation and the determination of the likelihood of violating FDC limits. Additionally, important lessons learned are also reviewed, such as optimal sampling methodologies for the discovery of low likelihood failure events and strategies for the combined treatment of aleatory and epistemic uncertainties.

  18. Construction of a Calibrated Probabilistic Classification Catalog: Application to 50k Variable Sources in the All-Sky Automated Survey

    NASA Astrophysics Data System (ADS)

    Richards, Joseph W.; Starr, Dan L.; Miller, Adam A.; Bloom, Joshua S.; Butler, Nathaniel R.; Brink, Henrik; Crellin-Quick, Arien

    2012-12-01

    With growing data volumes from synoptic surveys, astronomers necessarily must become more abstracted from the discovery and introspection processes. Given the scarcity of follow-up resources, there is a particularly sharp onus on the frameworks that replace these human roles to provide accurate and well-calibrated probabilistic classification catalogs. Such catalogs inform the subsequent follow-up, allowing consumers to optimize the selection of specific sources for further study and permitting rigorous treatment of classification purities and efficiencies for population studies. Here, we describe a process to produce a probabilistic classification catalog of variability with machine learning from a multi-epoch photometric survey. In addition to producing accurate classifications, we show how to estimate calibrated class probabilities and motivate the importance of probability calibration. We also introduce a methodology for feature-based anomaly detection, which allows discovery of objects in the survey that do not fit within the predefined class taxonomy. Finally, we apply these methods to sources observed by the All-Sky Automated Survey (ASAS), and release the Machine-learned ASAS Classification Catalog (MACC), a 28 class probabilistic classification catalog of 50,124 ASAS sources in the ASAS Catalog of Variable Stars. We estimate that MACC achieves a sub-20% classification error rate and demonstrate that the class posterior probabilities are reasonably calibrated. MACC classifications compare favorably to the classifications of several previous domain-specific ASAS papers and to the ASAS Catalog of Variable Stars, which had classified only 24% of those sources into one of 12 science classes.

  19. CONSTRUCTION OF A CALIBRATED PROBABILISTIC CLASSIFICATION CATALOG: APPLICATION TO 50k VARIABLE SOURCES IN THE ALL-SKY AUTOMATED SURVEY

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Richards, Joseph W.; Starr, Dan L.; Miller, Adam A.

    2012-12-15

    With growing data volumes from synoptic surveys, astronomers necessarily must become more abstracted from the discovery and introspection processes. Given the scarcity of follow-up resources, there is a particularly sharp onus on the frameworks that replace these human roles to provide accurate and well-calibrated probabilistic classification catalogs. Such catalogs inform the subsequent follow-up, allowing consumers to optimize the selection of specific sources for further study and permitting rigorous treatment of classification purities and efficiencies for population studies. Here, we describe a process to produce a probabilistic classification catalog of variability with machine learning from a multi-epoch photometric survey. In addition to producing accurate classifications, we show how to estimate calibrated class probabilities and motivate the importance of probability calibration. We also introduce a methodology for feature-based anomaly detection, which allows discovery of objects in the survey that do not fit within the predefined class taxonomy. Finally, we apply these methods to sources observed by the All-Sky Automated Survey (ASAS), and release the Machine-learned ASAS Classification Catalog (MACC), a 28 class probabilistic classification catalog of 50,124 ASAS sources in the ASAS Catalog of Variable Stars. We estimate that MACC achieves a sub-20% classification error rate and demonstrate that the class posterior probabilities are reasonably calibrated. MACC classifications compare favorably to the classifications of several previous domain-specific ASAS papers and to the ASAS Catalog of Variable Stars, which had classified only 24% of those sources into one of 12 science classes.

  20. Review of reactor pressure vessel evaluation report for Yankee Rowe Nuclear Power Station (YAEC No. 1735)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cheverton, R.D.; Dickson, T.L.; Merkle, J.G.

    1992-03-01

    The Yankee Atomic Electric Company has performed an Integrated Pressurized Thermal Shock (IPTS)-type evaluation of the Yankee Rowe reactor pressure vessel in accordance with the PTS Rule (10 CFR 50.61) and US Regulatory Guide 1.154. The Oak Ridge National Laboratory (ORNL) reviewed the YAEC document and performed an independent probabilistic fracture-mechanics analysis. The review included a comparison of the Pacific Northwest Laboratory (PNL) and ORNL probabilistic fracture-mechanics codes (VISA-II and OCA-P, respectively). The review identified minor errors and one significant difference in philosophy. Also, the two codes have a few dissimilar peripheral features. Aside from these differences, VISA-II and OCA-P are very similar and, with the errors corrected and when adjusted for the difference in the treatment of the fracture toughness distribution through the wall, yield essentially the same value of the conditional probability of failure. The ORNL independent evaluation indicated RT_NDT values considerably greater than those corresponding to the PTS-Rule screening criteria and a frequency of failure substantially greater than that corresponding to the 'primary acceptance criterion' in US Regulatory Guide 1.154. Time constraints, however, prevented as rigorous a treatment as the situation deserves. Thus, these results are very preliminary.

  1. FAVOR: A new fracture mechanics code for reactor pressure vessels subjected to pressurized thermal shock

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dickson, T.L.

    1993-01-01

    This report discusses probabilistic fracture mechanics (PFM) analysis, which is a major element of the comprehensive probabilistic methodology endorsed by the NRC for evaluation of the integrity of Pressurized Water Reactor (PWR) pressure vessels subjected to pressurized-thermal-shock (PTS) transients. It is anticipated that there will be an increasing need for an improved and validated PTS PFM code which is accepted by the NRC and utilities, as more plants approach the PTS screening criteria and are required to perform plant-specific analyses. The NRC-funded Heavy Section Steel Technology (HSST) Program at Oak Ridge National Laboratories is currently developing the FAVOR (Fracture Analysis of Vessels: Oak Ridge) PTS PFM code, which is intended to meet this need. The FAVOR code incorporates the most important features of both OCA-P and VISA-II and contains some new capabilities such as PFM global modeling methodology, the capability to approximate the effects of thermal streaming on circumferential flaws located inside a plume region created by fluid and thermal stratification, a library of stress intensity factor influence coefficients, generated by the NQA-1 certified ABAQUS computer code, for an adequate range of two and three dimensional inside surface flaws, the flexibility to generate a variety of output reports, and user friendliness.

  2. FAVOR: A new fracture mechanics code for reactor pressure vessels subjected to pressurized thermal shock

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dickson, T.L.

    1993-04-01

    This report discusses probabilistic fracture mechanics (PFM) analysis, which is a major element of the comprehensive probabilistic methodology endorsed by the NRC for evaluation of the integrity of Pressurized Water Reactor (PWR) pressure vessels subjected to pressurized-thermal-shock (PTS) transients. It is anticipated that there will be an increasing need for an improved and validated PTS PFM code which is accepted by the NRC and utilities, as more plants approach the PTS screening criteria and are required to perform plant-specific analyses. The NRC-funded Heavy Section Steel Technology (HSST) Program at Oak Ridge National Laboratories is currently developing the FAVOR (Fracture Analysis of Vessels: Oak Ridge) PTS PFM code, which is intended to meet this need. The FAVOR code incorporates the most important features of both OCA-P and VISA-II and contains some new capabilities such as PFM global modeling methodology, the capability to approximate the effects of thermal streaming on circumferential flaws located inside a plume region created by fluid and thermal stratification, a library of stress intensity factor influence coefficients, generated by the NQA-1 certified ABAQUS computer code, for an adequate range of two and three dimensional inside surface flaws, the flexibility to generate a variety of output reports, and user friendliness.

  3. Review of reactor pressure vessel evaluation report for Yankee Rowe Nuclear Power Station (YAEC No. 1735)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cheverton, R.D.; Dickson, T.L.; Merkle, J.G.

    1992-03-01

    The Yankee Atomic Electric Company has performed an Integrated Pressurized Thermal Shock (IPTS)-type evaluation of the Yankee Rowe reactor pressure vessel in accordance with the PTS Rule (10 CFR 50.61) and US Regulatory Guide 1.154. The Oak Ridge National Laboratory (ORNL) reviewed the YAEC document and performed an independent probabilistic fracture-mechanics analysis. The review included a comparison of the Pacific Northwest Laboratory (PNL) and ORNL probabilistic fracture-mechanics codes (VISA-II and OCA-P, respectively). The review identified minor errors and one significant difference in philosophy. Also, the two codes have a few dissimilar peripheral features. Aside from these differences, VISA-II and OCA-P are very similar and, with the errors corrected and when adjusted for the difference in the treatment of the fracture toughness distribution through the wall, yield essentially the same value of the conditional probability of failure. The ORNL independent evaluation indicated RT_NDT values considerably greater than those corresponding to the PTS-Rule screening criteria and a frequency of failure substantially greater than that corresponding to the 'primary acceptance criterion' in US Regulatory Guide 1.154. Time constraints, however, prevented as rigorous a treatment as the situation deserves. Thus, these results are very preliminary.

  4. SOARCA Peach Bottom Atomic Power Station Long-Term Station Blackout Uncertainty Analysis: Convergence of the Uncertainty Results

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bixler, Nathan E.; Osborn, Douglas M.; Sallaberry, Cedric Jean-Marie

    2014-02-01

    This paper describes the convergence of MELCOR Accident Consequence Code System, Version 2 (MACCS2) probabilistic results of offsite consequences for the uncertainty analysis of the State-of-the-Art Reactor Consequence Analyses (SOARCA) unmitigated long-term station blackout scenario at the Peach Bottom Atomic Power Station. The consequence metrics evaluated are individual latent-cancer fatality (LCF) risk and individual early fatality risk. Consequence results are presented as conditional risk (i.e., assuming the accident occurs, risk per event) to individuals of the public as a result of the accident. In order to verify convergence for this uncertainty analysis, as recommended by the Nuclear Regulatory Commission’s Advisory Committee on Reactor Safeguards, a ‘high’ source term from the original population of Monte Carlo runs has been selected to be used for: (1) a study of the distribution of consequence results stemming solely from epistemic uncertainty in the MACCS2 parameters (i.e., separating the effect from the source term uncertainty), and (2) a comparison between Simple Random Sampling (SRS) and Latin Hypercube Sampling (LHS) in order to validate the original results obtained with LHS. Three replicates (each using a different random seed) of size 1,000 each using LHS and another set of three replicates of size 1,000 using SRS are analyzed. The results show that the LCF risk results are well converged with either LHS or SRS sampling. The early fatality risk results are less well converged at radial distances beyond 2 miles, and this is expected due to the sparse data (predominance of “zero” results).
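
    A small sketch of the sampling comparison described above (a generic toy model, not MACCS2): estimate the mean of a response that depends on a few uncertain inputs using replicates of simple random sampling and of Latin hypercube sampling, and compare the spread of the replicate means; LHS stratifies each input's range, which typically reduces replicate-to-replicate variability for the same sample size.

      import numpy as np

      rng = np.random.default_rng(7)

      def response(u):
          # Toy consequence model: a smooth function of three uncertain inputs in [0, 1).
          return 10.0 * u[:, 0] + 5.0 * u[:, 1] ** 2 + 2.0 * np.sin(2 * np.pi * u[:, 2])

      def srs(n, d):
          # Simple random sampling over the unit hypercube.
          return rng.random((n, d))

      def lhs(n, d):
          # Latin hypercube: one randomly placed sample per equal-probability bin,
          # with independently permuted bin order in each dimension.
          strata = np.stack([rng.permutation(n) for _ in range(d)], axis=1)
          return (strata + rng.random((n, d))) / n

      n, d, replicates = 1000, 3, 20
      means_srs = [response(srs(n, d)).mean() for _ in range(replicates)]
      means_lhs = [response(lhs(n, d)).mean() for _ in range(replicates)]

      print("std of replicate means, SRS:", np.std(means_srs))
      print("std of replicate means, LHS:", np.std(means_lhs))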

  5. Initiating Event Analysis of a Lithium Fluoride Thorium Reactor

    NASA Astrophysics Data System (ADS)

    Geraci, Nicholas Charles

    The primary purpose of this study is to perform an Initiating Event Analysis for a Lithium Fluoride Thorium Reactor (LFTR) as the first step of a Probabilistic Safety Assessment (PSA). The major objective of the research is to compile a list of key initiating events capable of resulting in failure of safety systems and release of radioactive material from the LFTR. Due to the complex interactions between engineering design, component reliability and human reliability, probabilistic safety assessments are most useful when the scope is limited to a single reactor plant. Thus, this thesis will study the LFTR design proposed by Flibe Energy. An October 2015 Electric Power Research Institute report on the Flibe Energy LFTR asked "what-if?" questions of subject matter experts and compiled a list of key hazards with the most significant consequences to the safety or integrity of the LFTR. The potential exists for unforeseen hazards to pose additional risk for the LFTR, but the scope of this thesis is limited to evaluation of those key hazards already identified by Flibe Energy. These key hazards are the starting point for the Initiating Event Analysis performed in this thesis. Engineering evaluation and technical study of the plant using a literature review and comparison to reference technology revealed four hazards with high potential to cause reactor core damage. To determine the initiating events resulting in realization of these four hazards, reference was made to previous PSAs and existing NRC and EPRI initiating event lists. Finally, fault tree and event tree analyses were conducted, completing the logical classification of initiating events. Results are qualitative as opposed to quantitative due to the early stages of system design descriptions and lack of operating experience or data for the LFTR. In summary, this thesis analyzes initiating events using previous research and inductive and deductive reasoning through traditional risk management techniques to arrive at a list of key initiating events that can be used to address vulnerabilities during the design phases of LFTR development.

  6. Probabilistic risk assessment for a loss of coolant accident in McMaster Nuclear Reactor and application of reliability physics model for modeling human reliability

    NASA Astrophysics Data System (ADS)

    Ha, Taesung

    A probabilistic risk assessment (PRA) was conducted for a loss of coolant accident (LOCA) in the McMaster Nuclear Reactor (MNR). A level 1 PRA was completed including event sequence modeling, system modeling, and quantification. To support the quantification of the accident sequences identified, data analysis using the Bayesian method and human reliability analysis (HRA) using the accident sequence evaluation procedure (ASEP) approach were performed. Since human performance in research reactors is significantly different from that in power reactors, a time-oriented HRA model (reliability physics model) was applied for the human error probability (HEP) estimation of the core relocation. This model is based on two competing random variables: phenomenological time and performance time. The response surface and direct Monte Carlo simulation with Latin Hypercube sampling were applied for estimating the phenomenological time, whereas the performance time was obtained from interviews with operators. An appropriate probability distribution for the phenomenological time was assigned by statistical goodness-of-fit tests. The human error probability (HEP) for the core relocation was estimated from these two competing quantities: phenomenological time and operators' performance time. The sensitivity of each probability distribution in human reliability estimation was investigated. In order to quantify the uncertainty in the predicted HEPs, a Bayesian approach was selected due to its capability of incorporating uncertainties in the model itself and in the parameters of that model. The HEP from the current time-oriented model was compared with that from the ASEP approach. Both results were used to evaluate the sensitivity of alternative human reliability modeling for the manual core relocation in the LOCA risk model. This exercise demonstrated the applicability of a reliability physics model supplemented with a Bayesian approach for modeling human reliability and its potential usefulness for quantifying model uncertainty through sensitivity analysis in the PRA model.
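
    The reliability physics idea of two competing random variables can be sketched numerically (illustrative distributions only, not the thesis data): the human error probability is the chance that the operators' performance time exceeds the phenomenological time available before core damage, estimated here by sampling both.

      import numpy as np

      rng = np.random.default_rng(8)

      n = 1_000_000

      # Assumed phenomenological time available (minutes), e.g. from thermal-hydraulic
      # simulations summarized by a fitted distribution.
      t_phenom = rng.normal(loc=45.0, scale=8.0, size=n)

      # Assumed operator performance time (minutes), e.g. elicited from interviews
      # and represented as a lognormal.
      t_perform = rng.lognormal(mean=np.log(20.0), sigma=0.5, size=n)

      # Human error probability: the required action is completed too late.
      hep = np.mean(t_perform > t_phenom)
      print(f"HEP ≈ {hep:.3f}")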

  7. Seismic Sources and Recurrence Rates as Adopted by USGS Staff for the Production of the 1982 and 1990 Probabilistic Ground Motion Maps for Alaska and the Conterminous United States

    USGS Publications Warehouse

    Hanson, Stanley L.; Perkins, David M.

    1995-01-01

    The construction of a probabilistic ground-motion hazard map for a region follows a sequence of analyses beginning with the selection of an earthquake catalog and ending with the mapping of calculated probabilistic ground-motion values (Hanson and others, 1992). An integral part of this process is the creation of sources used for the calculation of earthquake recurrence rates and ground motions. These sources consist of areas and lines that are representative of geologic or tectonic features and faults. After the design of the sources, it is necessary to arrange the coordinate points in a particular order compatible with the input format for the SEISRISK-III program (Bender and Perkins, 1987). Source zones are usually modeled as a point-rupture source. Where applicable, linear rupture sources are modeled with articulated lines, representing known faults, or a field of parallel lines, representing a generalized distribution of hypothetical faults. Based on the distribution of earthquakes throughout the individual source zones (or a collection of several sources), earthquake recurrence rates are computed for each of the sources, and minimum and maximum magnitudes are assigned. From 1978 to 1980, several conferences were held by the USGS to solicit information on regions of the United States for the purpose of creating source zones for computation of probabilistic ground motions (Thenhaus, 1983). As a result of these regional meetings and previous work in the Pacific Northwest (Perkins and others, 1980), the California continental shelf (Thenhaus and others, 1980), and the Eastern outer continental shelf (Perkins and others, 1979), a consensus set of source zones was agreed upon and subsequently used to produce a national ground motion hazard map for the United States (Algermissen and others, 1982). In this report and on the accompanying disk we provide a complete list of source areas and line sources as used for the 1982 and later 1990 seismic hazard maps for the conterminous U.S. and Alaska. These source zones are represented in the input form required for the hazard program SEISRISK-III, and they include the attenuation table and several other input parameter lines normally found at the beginning of an input data set for SEISRISK-III.
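    As a hedged illustration of how recurrence rates are typically derived from a catalog for such source zones (not the USGS procedure itself), the sketch below fits a Gutenberg-Richter b-value to synthetic magnitudes with the Aki maximum-likelihood estimator and evaluates cumulative annual rates; all numbers are assumed.

```python
import numpy as np

rng = np.random.default_rng(0)
years = 100.0
# Synthetic catalog magnitudes for one source zone (illustrative only, b ~ 1.0).
mags = 4.0 + rng.exponential(scale=1.0 / 2.303, size=500)

m_min = 4.0
# Aki (1965) maximum-likelihood estimate of the Gutenberg-Richter b-value.
b_hat = np.log10(np.e) / (mags.mean() - m_min)
# Annual rate of events with magnitude >= m_min.
rate_mmin = len(mags) / years

def annual_rate(m):
    """Cumulative annual recurrence rate of events with magnitude >= m."""
    return rate_mmin * 10.0 ** (-b_hat * (m - m_min))

for m in (5.0, 6.0, 7.0):
    print(f"N(>= M{m:.1f}) ~ {annual_rate(m):.4f} per year")
```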

  8. Application of reliability-centered-maintenance to BWR ECCS motor operator valve performance

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Feltus, M.A.; Choi, Y.A.

    1993-01-01

    This paper describes the application of reliability-centered maintenance (RCM) methods to plant probabilistic risk assessment (PRA) and safety analyses for four boiling water reactor emergency core cooling systems (ECCSs): (1) high-pressure coolant injection (HPCI); (2) reactor core isolation cooling (RCIC); (3) residual heat removal (RHR); and (4) core spray systems. Reliability-centered maintenance is a system function-based technique for improving a preventive maintenance program that is applied on a component basis. Those components that truly affect plant function are identified, and maintenance tasks are focused on preventing their failures. The RCM evaluation establishes the relevant criteria that preserve system function so that an RCM-focused approach can be flexible and dynamic.

  9. Probabilistic evaluation of seismic isolation effect with respect to siting of a fusion reactor facility

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Takeda, Masatoshi; Komura, Toshiyuki; Hirotani, Tsutomu

    1995-12-01

    Annual failure probabilities of buildings and equipment were roughly evaluated for two fusion-reactor-like buildings, with and without seismic base isolation, in order to examine the effectiveness of the base isolation system regarding siting issues. The probabilities are calculated considering nonlinearity and rupture of isolators. While the probability of building failure for both buildings on the same site was almost equal, the functional failure probabilities for equipment showed that the base-isolated building had higher reliability than the non-isolated building. Even if the base-isolated building alone is located in a higher seismic hazard area, it could compete favorably with the ordinary one in reliability of equipment.

  10. Evaluation of Horizontal Seismic Hazard of Shahrekord, Iran

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Amiri, G. Ghodrati; Dehkordi, M. Raeisi; Amrei, S. A. Razavian

    2008-07-08

    This paper presents a probabilistic horizontal seismic hazard assessment of Shahrekord, Iran. It displays the probabilistic estimate of Peak Ground Horizontal Acceleration (PGHA) for return periods of 75, 225, 475 and 2475 years. The output of the probabilistic seismic hazard analysis is based on peak ground acceleration (PGA), which is the most common criterion in the design of buildings. A catalogue of seismic events that includes both historical and instrumental events was developed and covers the period from 840 to 2007. The seismic sources that affect the hazard in Shahrekord were identified within a radius of 150 km, and the recurrence relationships of these sources were generated. Finally, four maps have been prepared to indicate the earthquake hazard of Shahrekord in the form of iso-acceleration contour lines for different hazard levels using the SEISRISK III software.
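    For readers unfamiliar with the return periods quoted above, the sketch below shows the standard conversion between a return period and the probability of exceedance during an exposure time, under a Poisson occurrence assumption; the 50-year exposure is an assumed, commonly used design life.

```python
import numpy as np

return_periods = np.array([75.0, 225.0, 475.0, 2475.0])  # years, as in the study
exposure = 50.0                                           # years, an assumed design life

annual_rate = 1.0 / return_periods
# Probability of at least one exceedance during the exposure time (Poisson assumption).
p_exceed = 1.0 - np.exp(-annual_rate * exposure)

for T, p in zip(return_periods, p_exceed):
    print(f"Return period {T:6.0f} yr -> {100 * p:5.1f}% chance of exceedance in {exposure:.0f} yr")
```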

  11. Acoustic emission based damage localization in composites structures using Bayesian identification

    NASA Astrophysics Data System (ADS)

    Kundu, A.; Eaton, M. J.; Al-Jumali, S.; Sikdar, S.; Pullin, R.

    2017-05-01

    Acoustic emission based damage detection in composite structures is based on detection of ultra high frequency packets of acoustic waves emitted from damage sources (such as fibre breakage, fatigue fracture, amongst others) with a network of distributed sensors. This non-destructive monitoring scheme requires solving an inverse problem where the measured signals are linked back to the location of the source. This in turn enables rapid deployment of mitigative measures. The presence of significant amount of uncertainty associated with the operating conditions and measurements makes the problem of damage identification quite challenging. The uncertainties stem from the fact that the measured signals are affected by the irregular geometries, manufacturing imprecision, imperfect boundary conditions, existing damages/structural degradation, amongst others. This work aims to tackle these uncertainties within a framework of automated probabilistic damage detection. The method trains a probabilistic model of the parametrized input and output model of the acoustic emission system with experimental data to give probabilistic descriptors of damage locations. A response surface modelling the acoustic emission as a function of parametrized damage signals collected from sensors would be calibrated with a training dataset using Bayesian inference. This is used to deduce damage locations in the online monitoring phase. During online monitoring, the spatially correlated time data is utilized in conjunction with the calibrated acoustic emissions model to infer the probabilistic description of the acoustic emission source within a hierarchical Bayesian inference framework. The methodology is tested on a composite structure consisting of carbon fibre panel with stiffeners and damage source behaviour has been experimentally simulated using standard H-N sources. The methodology presented in this study would be applicable in the current form to structural damage detection under varying operational loads and would be investigated in future studies.

  12. Low pass filter for plasma discharge

    DOEpatents

    Miller, Paul A.

    1994-01-01

    An isolator is disposed between a plasma reactor and its electrical energy source in order to isolate the reactor from the electrical energy source. The isolator operates as a filter to attenuate the transmission of harmonics of a fundamental frequency of the electrical energy source generated by the reactor from interacting with the energy source. By preventing harmonic interaction with the energy source, plasma conditions can be readily reproduced independent of the electrical characteristics of the electrical energy source and/or its associated coupling network.

  13. External events analysis for the Savannah River Site K reactor

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Brandyberry, M.D.; Wingo, H.E.

    1990-01-01

    The probabilistic external events analysis performed for the Savannah River Site K-reactor PRA considered many different events which are generally perceived to be "external" to the reactor and its systems, such as fires, floods, seismic events, and transportation accidents (as well as many others). Events which have been shown to be significant contributors to risk include seismic events, tornados, a crane failure scenario, fires and dam failures. The total contribution to the core melt frequency from external initiators has been found to be 2.2 × 10^-4 per year, of which seismic events are the major contributor (1.2 × 10^-4 per year). Fire-initiated events contribute 1.4 × 10^-7 per year, tornados 5.8 × 10^-7 per year, dam failures 1.5 × 10^-6 per year, and the crane failure scenario less than 10^-4 per year to the core melt frequency. 8 refs., 3 figs., 5 tabs.
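    The quoted contributions can be cross-checked by simple aggregation, as in the sketch below; the crane failure scenario is reported only as a bound (less than 10^-4 per year), so an assumed upper-bound value is used for it.

```python
# Point estimates quoted in the abstract (per reactor-year); the crane scenario is
# given only as a bound (< 1e-4), so it is treated here as an assumed upper bound.
contributions = {
    "seismic":       1.2e-4,
    "fire":          1.4e-7,
    "tornado":       5.8e-7,
    "dam failure":   1.5e-6,
    "crane (bound)": 1.0e-4,
}

total = sum(contributions.values())
print(f"Aggregated external-event core melt frequency <= {total:.2e} per year "
      f"(abstract reports a total of 2.2e-4)")
for name, f in sorted(contributions.items(), key=lambda kv: -kv[1]):
    print(f"  {name:14s} {f:8.2e}  ({100 * f / total:5.1f}% of the bound)")
```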

  14. Interim reliability evaluation program, Browns Ferry 1

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mays, S.E.; Poloski, J.P.; Sullivan, W.H.

    1981-01-01

    Probabilistic risk analysis techniques, i.e., event tree and fault tree analysis, were utilized to provide a risk assessment of the Browns Ferry Nuclear Plant Unit 1. Browns Ferry 1 is a General Electric boiling water reactor of the BWR 4 product line with a Mark 1 (drywell and torus) containment. Within the guidelines of the IREP Procedure and Schedule Guide, dominant accident sequences that contribute to public health and safety risks were identified and grouped according to release categories.

  15. Secondary Startup Neutron Sources as a Source of Tritium in a Pressurized Water Reactor (PWR) Reactor Coolant System (RCS)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shaver, Mark W.; Lanning, Donald D.

    2010-02-01

    The hypothesis of this paper is that the Zircaloy clad fuel source is minimal and that secondary startup neutron sources are the significant contributors of the tritium in the RCS that was previously assigned to release from fuel. Currently there are large uncertainties in the attribution of tritium in a Pressurized Water Reactor (PWR) Reactor Coolant System (RCS). The measured amount of tritium in the coolant cannot be separated out empirically into its individual sources. Therefore, to quantify individual contributors, all sources of tritium in the RCS of a PWR must be understood theoretically and verified by the sum of the individual components equaling the measured values.

  16. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sattison, M.B.; Thatcher, T.A.; Knudsen, J.K.

    The US Nuclear Regulatory Commission (NRC) has been using full-power, Level 1, limited-scope risk models for the Accident Sequence Precursor (ASP) program for over fifteen years. These models have evolved and matured over the years, as have probabilistic risk assessment (PRA) and computer technologies. Significant upgrading activities have been undertaken over the past three years, with involvement from the Offices of Nuclear Reactor Regulation (NRR), Analysis and Evaluation of Operational Data (AEOD), and Nuclear Regulatory Research (RES), and several national laboratories. Part of these activities was an RES-sponsored feasibility study investigating the ability to extend the ASP models to include contributors to core damage from events initiated with the reactor at low power or shutdown (LP/SD), both internal events and external events. This paper presents only the LP/SD internal event modeling efforts.

  17. Topics in Nuclear Power

    NASA Astrophysics Data System (ADS)

    Budnitz, Robert J.

    2011-11-01

    The 104 nuclear plants operating in the US today are far safer than they were 20-30 years ago. For example, there's been about a 100-fold reduction in the occurrence of "significant events" since the late 1970s. Although the youngest of currently operating US plants was designed in the 1970s, all have been significantly modified over the years. Key contributors to the safety gains are a vigilant culture, much improved equipment reliability, greatly improved training of operators and maintenance workers, worldwide sharing of experience, and the effective use of probabilistic risk assessment. Several manufacturers have submitted high quality new designs for large reactors to the U.S. Nuclear Regulatory Commission (NRC) for design approval, and some designers are taking a second look at the economies of smaller, modular reactors.

  18. Radiation Embrittlement Archive Project

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Klasky, Hilda B; Bass, Bennett Richard; Williams, Paul T

    2013-01-01

    The Radiation Embrittlement Archive Project (REAP), which is being conducted by the Probabilistic Integrity Safety Assessment (PISA) Program at Oak Ridge National Laboratory under funding from the U.S. Nuclear Regulatory Commission's (NRC) Office of Nuclear Regulatory Research, aims to provide an archival source of information about the effect of neutron radiation on the properties of reactor pressure vessel (RPV) steels. Specifically, this project is an effort to create an Internet-accessible RPV steel embrittlement database. The project's website, https://reap.ornl.gov, provides information in two forms: (1) a document archive with surveillance capsule reports and related technical reports, in PDF format, for the 104 commercial nuclear power plants (NPPs) in the United States, with similar reports from other countries; and (2) a relational database archive with detailed information extracted from the reports. The REAP project focuses on data collected from surveillance capsule programs for light-water moderated nuclear power reactor vessels operated in the United States, including data on Charpy V-notch energy testing results, tensile properties, composition, exposure temperatures, neutron flux (rate of irradiation damage), and fluence (fast neutron fluence, a cumulative measure of irradiation for E > 1 MeV). Additionally, REAP contains data from surveillance programs conducted in other countries. REAP is presently being extended to focus on embrittlement data analysis as well. This paper summarizes the current status of the REAP database and highlights opportunities to access the data and to participate in the project.

  19. Development of probabilistic emission inventories of air toxics for Jacksonville, Florida, USA.

    PubMed

    Zhao, Yuchao; Frey, H Christopher

    2004-11-01

    Probabilistic emission inventories were developed for 1,3-butadiene, mercury (Hg), arsenic (As), benzene, formaldehyde, and lead for Jacksonville, FL. To quantify inter-unit variability in empirical emission factor data, the Maximum Likelihood Estimation (MLE) method or the Method of Matching Moments was used to fit parametric distributions. For data sets that contain nondetected measurements, a method based upon MLE was used for parameter estimation. To quantify the uncertainty in urban air toxic emission factors, parametric bootstrap simulation and empirical bootstrap simulation were applied to uncensored and censored data, respectively. The probabilistic emission inventories were developed based on the product of the uncertainties in the emission factors and in the activity factors. The uncertainties in the urban air toxics emission inventories range from as small as -25 to +30% for Hg to as large as -83 to +243% for As. The key sources of uncertainty in the emission inventory for each toxic are identified based upon sensitivity analysis. Typically, uncertainty in the inventory of a given pollutant can be attributed primarily to a small number of source categories. Priorities for improving the inventories and for refining the probabilistic analysis are discussed.
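    A minimal sketch of the kind of parametric bootstrap described above is given below; the emission-factor sample, the lognormal choice, and the activity-level uncertainty are all illustrative assumptions, not the paper's data.

```python
import numpy as np

rng = np.random.default_rng(1)
B = 5000  # bootstrap replicates

# Illustrative (assumed) inputs: measured emission factors for one source category
# and an activity level with its own uncertainty.
ef_data = np.array([0.8, 1.1, 0.6, 1.4, 0.9, 2.0, 0.7])   # g pollutant per unit activity
activity_mean, activity_sd = 1.0e5, 1.0e4                  # units of activity per year

# Parametric bootstrap: refit a lognormal to resampled data, then propagate.
inventories = np.empty(B)
for b in range(B):
    resample = rng.choice(ef_data, size=ef_data.size, replace=True)
    mu, sigma = np.log(resample).mean(), np.log(resample).std(ddof=1)
    ef_mean = np.exp(mu + 0.5 * sigma**2)                  # mean of the fitted lognormal
    activity = rng.normal(activity_mean, activity_sd)
    inventories[b] = ef_mean * activity                    # g per year

lo, med, hi = np.percentile(inventories, [2.5, 50, 97.5])
print(f"Inventory: {med:.3e} g/yr, 95% range {100*(lo/med-1):+.0f}% to {100*(hi/med-1):+.0f}%")
```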

  20. Bayesian networks improve causal environmental ...

    EPA Pesticide Factsheets

    Rule-based weight of evidence approaches to ecological risk assessment may not account for uncertainties and generally lack probabilistic integration of lines of evidence. Bayesian networks allow causal inferences to be made from evidence by including causal knowledge about the problem, using this knowledge with probabilistic calculus to combine multiple lines of evidence, and minimizing biases in predicting or diagnosing causal relationships. Too often, sources of uncertainty in conventional weight of evidence approaches are ignored that can be accounted for with Bayesian networks. Specifying and propagating uncertainties improve the ability of models to incorporate strength of the evidence in the risk management phase of an assessment. Probabilistic inference from a Bayesian network allows evaluation of changes in uncertainty for variables from the evidence. The network structure and probabilistic framework of a Bayesian approach provide advantages over qualitative approaches in weight of evidence for capturing the impacts of multiple sources of quantifiable uncertainty on predictions of ecological risk. Bayesian networks can facilitate the development of evidence-based policy under conditions of uncertainty by incorporating analytical inaccuracies or the implications of imperfect information, structuring and communicating causal issues through qualitative directed graph formulations, and quantitatively comparing the causal power of multiple stressors on value

  1. SELF-REACTIVATING NEUTRON SOURCE FOR A NEUTRONIC REACTOR

    DOEpatents

    Newson, H.W.

    1959-02-01

    Reactors of the type employing beryllium in a reflector region around the active portion, and a neutron source for use therewith, are discussed. The neutron source is comprised of a quantity of antimony permanently incorporated in, and as an integral part of, the reactor in or near the beryllium reflector region. During operation of the reactor the naturally occurring antimony isotope of atomic weight 123 absorbs neutrons and is thereby transformed to the antimony isotope of atomic weight 124, which is radioactive and emits gamma rays. The gamma rays react with the beryllium to produce neutrons. The beryllium and antimony thus cooperate to produce a built-in neutron source which is automatically reactivated by the operation of the reactor itself and which is of sufficient strength to maintain the slow neutron flux at a sufficiently high level to be reliably measured during periods when the reactor is shut down.

  2. Near-source mobile methane emission estimates using EPA Method 33A and a novel probabilistic approach as a basis for leak quantification in urban areas

    NASA Astrophysics Data System (ADS)

    Albertson, J. D.

    2015-12-01

    Methane emissions from underground pipeline leaks remain an ongoing issue in the development of accurate methane emission inventories for the natural gas supply chain. Application of mobile methods during routine street surveys would help address this issue, but there are large uncertainties in current approaches. In this paper, we describe results from a series of near-source (< 30 m) controlled methane releases where an instrumented van was used to measure methane concentrations during both fixed location sampling and during mobile traverses immediately downwind of the source. The measurements were used to evaluate the application of EPA Method 33A for estimating methane emissions downwind of a source and also to test the application of a new probabilistic approach for estimating emission rates from mobile traverse data.

  3. A Probabilistic Tsunami Hazard Study of the Auckland Region, Part II: Inundation Modelling and Hazard Assessment

    NASA Astrophysics Data System (ADS)

    Lane, E. M.; Gillibrand, P. A.; Wang, X.; Power, W.

    2013-09-01

    Regional source tsunamis pose a potentially devastating hazard to communities and infrastructure on the New Zealand coast, but major events are very uncommon. This dichotomy of infrequent but potentially devastating hazards makes realistic assessment of the risk challenging. Here, we describe a method to determine a probabilistic assessment of the tsunami hazard posed by regional source tsunamis with an "Average Recurrence Interval" of 2,500 years. The method is applied to the east Auckland region of New Zealand. From an assessment of potential regional tsunamigenic events over 100,000 years, the inundation of the Auckland region from the worst 100 events was modelled using a hydrodynamic model, and probabilistic inundation depths on a 2,500-year time scale were determined. Tidal effects on the potential inundation were included by coupling the predicted wave heights with the probability density function of tidal heights at the inundation site. Results show that the more exposed northern section of the east coast and outer islands in the Hauraki Gulf face the greatest hazard from regional tsunamis in the Auckland region. Incorporating tidal effects into predictions of inundation reduced the predicted hazard compared to modelling all the tsunamis as arriving at high tide, giving a more accurate hazard assessment on the specified time scale. This study presents the first probabilistic analysis of dynamic modelling of tsunami inundation for the New Zealand coast and as such provides the most comprehensive assessment of tsunami inundation of the Auckland region from regional source tsunamis available to date.
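    The tidal coupling step can be sketched as follows: for a single modelled event, the probability that the combined tsunami-plus-tide water level exceeds a threshold is obtained by weighting over the tidal height distribution. The amplitude, tide proxy, and threshold below are assumptions for illustration only.

```python
import numpy as np

rng = np.random.default_rng(7)

# Assumed inputs: maximum modelled tsunami amplitude at a shoreline point (m above MSL)
# and an empirical sample of tidal heights (m relative to MSL) standing in for the
# tidal probability density function.
tsunami_amplitude = 2.3
tide_sample = 1.5 * np.sin(rng.uniform(0.0, 2.0 * np.pi, size=100_000))  # crude tide proxy

threshold = 3.0  # m: elevation of the coastline segment or asset of interest

# Probability that tsunami + tide exceeds the threshold, i.e. the tidal weighting
# applied to a single scenario before aggregation over the event catalogue.
p_inundation = np.mean(tsunami_amplitude + tide_sample > threshold)
print(f"P(total water level > {threshold} m | this event) = {p_inundation:.2f}")
```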

  4. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Liebetrau, A.M.

    Work is underway at Pacific Northwest Laboratory (PNL) to improve the probabilistic analysis used to model pressurized thermal shock (PTS) incidents in reactor pressure vessels, and, further, to incorporate these improvements into the existing Vessel Integrity Simulation Analysis (VISA) code. Two topics related to work on input distributions in VISA are discussed in this paper. The first involves the treatment of flaw size distributions and the second concerns errors in the parameters in the (Guthrie) equation which is used to compute ΔRT_NDT, the shift in reference temperature for nil ductility transition.

  5. Maritime Threat Detection Using Probabilistic Graphical Models

    DTIC Science & Technology

    2012-01-01

    A CRF, unlike an HMM, can represent local features and does not require feature concatenation. For MLNs, we used Alchemy (Alchemy 2011), an open source statistical relational learning and probabilistic inferencing package. Alchemy supports generative and discriminative weight learning, and ... Alchemy creates a new formula for every possible combination of the values for a1 and a2 that fit the type specified in their predicate

  6. Methodology for the Incorporation of Passive Component Aging Modeling into the RAVEN/ RELAP-7 Environment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mandelli, Diego; Rabiti, Cristian; Cogliati, Joshua

    2014-11-01

    Passive systems, structures and components (SSCs) degrade over their operating life, and this degradation may cause a reduction in the safety margins of a nuclear power plant. In traditional probabilistic risk assessment (PRA) using the event-tree/fault-tree methodology, passive SSC failure rates are generally based on generic plant failure data, and the true state of a specific plant is not reflected realistically. To address aging effects of passive SSCs in the traditional PRA methodology, [1] does consider physics-based models that account for the operating conditions in the plant; however, [1] does not include the effects of surveillance/inspection. This paper presents an overall methodology for the incorporation of aging modeling of passive components into the RAVEN/RELAP-7 environment, which provides a framework for performing dynamic PRA. Dynamic PRA allows consideration of both epistemic and aleatory uncertainties (including those associated with maintenance activities) in a consistent phenomenological and probabilistic framework and is often needed when there is complex process/hardware/software/firmware/human interaction [2]. Dynamic PRA has gained attention recently due to difficulties in the traditional PRA modeling of aging effects of passive components using physics-based models and also in the modeling of digital instrumentation and control systems. RAVEN (Reactor Analysis and Virtual control Environment) [3] is a software package under development at the Idaho National Laboratory (INL) as an online control logic driver and post-processing tool. It is coupled to the plant transient code RELAP-7 (Reactor Excursion and Leak Analysis Program), also currently under development at INL [3], as well as to RELAP5 [4]. The overall methodology aims to: (1) address multiple aging mechanisms involving large numbers of components in a computationally feasible manner, where the sequencing of events is conditioned on the physical conditions predicted in a simulation environment such as RELAP-7; (2) identify the risk-significant passive components, their failure modes and anticipated rates of degradation; (3) incorporate surveillance and maintenance activities and their effects into the plant state and into component aging progress; and (4) assess aging effects in a dynamic simulation environment. References: 1. C. L. SMITH, V. N. SHAH, T. KAO, G. APOSTOLAKIS, "Incorporating Ageing Effects into Probabilistic Risk Assessment - A Feasibility Study Utilizing Reliability Physics Models," NUREG/CR-5632, USNRC (2001). 2. T. ALDEMIR, "A Survey of Dynamic Methodologies for Probabilistic Safety Assessment of Nuclear Power Plants," Annals of Nuclear Energy, 52, 113-124 (2013). 3. C. RABITI, A. ALFONSI, J. COGLIATI, D. MANDELLI and R. KINOSHITA, "Reactor Analysis and Virtual Control Environment (RAVEN) FY12 Report," INL/EXT-12-27351 (2012). 4. D. ANDERS et al., "RELAP-7 Level 2 Milestone Report: Demonstration of a Steady State Single Phase PWR Simulation with RELAP-7," INL/EXT-12-25924 (2012).

  7. Statistical Analyses for Probabilistic Assessments of the Reactor Pressure Vessel Structural Integrity: Building a Master Curve on an Extract of the 'Euro' Fracture Toughness Dataset, Controlling Statistical Uncertainty for Both Mono-Temperature and multi-temperature tests

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Josse, Florent; Lefebvre, Yannick; Todeschini, Patrick

    2006-07-01

    Assessing the structural integrity of a nuclear Reactor Pressure Vessel (RPV) subjected to pressurized-thermal-shock (PTS) transients is extremely important to safety. In addition to conventional deterministic calculations to confirm RPV integrity, Electricite de France (EDF) carries out probabilistic analyses. Probabilistic analyses are interesting because some key variables, albeit conventionally taken at conservative values, can be modeled more accurately through statistical variability. One variable which significantly affects RPV structural integrity assessment is cleavage fracture initiation toughness. The reference fracture toughness method currently in use at EDF is the RCCM and ASME Code lower-bound K_IC based on the indexing parameter RT_NDT. However, in order to quantify the toughness scatter for probabilistic analyses, the master curve method is being analyzed at present. Furthermore, the master curve method is a direct means of evaluating fracture toughness based on K_JC data. In the framework of the master curve investigation undertaken by EDF, this article deals with the following two statistical items: building a master curve from an extract of a fracture toughness dataset (from the European project 'Unified Reference Fracture Toughness Design curves for RPV Steels') and controlling statistical uncertainty for both mono-temperature and multi-temperature tests. Concerning the first point, master curve temperature dependence is empirical in nature. To determine the 'original' master curve, Wallin postulated that a unified description of fracture toughness temperature dependence for ferritic steels is possible, and used a large number of data corresponding to nuclear-grade pressure vessel steels and welds. Our working hypothesis is that some ferritic steels may behave in slightly different ways. Therefore we focused exclusively on the base metal of French reactor vessels, of types A508 Class 3 and A533 Grade B Class 1, taking the sampling level and direction into account as well as the test specimen type. As for the second point, the emphasis is placed on the uncertainties in applying the master curve approach. For a toughness dataset based on different specimens of a single product, application of the master curve methodology requires the statistical estimation of one parameter: the reference temperature T_0. Because of the limited number of specimens, estimation of this temperature is uncertain. The ASTM standard provides a rough evaluation of this statistical uncertainty through an approximate confidence interval. In this paper, a thorough study is carried out to build more meaningful confidence intervals (for both mono-temperature and multi-temperature tests). These results ensure better control over uncertainty, and allow rigorous analysis of the impact of its influencing factors: the number of specimens and the temperatures at which they have been tested. (authors)
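    For context, a sketch of the master curve's commonly quoted form (ASTM E1921, 1T specimen size) is given below: the fracture toughness percentile K_Jc(p) = 20 + [-ln(1-p)]^(1/4) (11 + 77 exp[0.019(T - T_0)]) MPa*sqrt(m), which approximately reduces to the familiar median curve 30 + 70 exp[0.019(T - T_0)]. The reference temperature T_0 = -60 C used here is an assumed value, not an EDF result.

```python
import numpy as np

def kjc_percentile(T, T0, p=0.5):
    """Master curve fracture toughness (MPa*sqrt(m)) at cumulative probability p,
    in the commonly quoted ASTM E1921 form for 1T-size specimens."""
    return 20.0 + (np.log(1.0 / (1.0 - p))) ** 0.25 * (11.0 + 77.0 * np.exp(0.019 * (T - T0)))

T0 = -60.0                                   # assumed reference temperature, deg C
for temp in np.arange(-150.0, 51.0, 50.0):
    lo = kjc_percentile(temp, T0, 0.05)      # 5% tolerance bound
    med = kjc_percentile(temp, T0, 0.50)     # median curve
    print(f"T = {temp:6.1f} C  K_Jc(5%) = {lo:6.1f}  K_Jc(median) = {med:6.1f} MPa*sqrt(m)")
```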

  8. Utilization of 134Cs/137Cs in the environment to identify the reactor units that caused atmospheric releases during the Fukushima Daiichi accident

    NASA Astrophysics Data System (ADS)

    Chino, Masamichi; Terada, Hiroaki; Nagai, Haruyasu; Katata, Genki; Mikami, Satoshi; Torii, Tatsuo; Saito, Kimiaki; Nishizawa, Yukiyasu

    2016-08-01

    The Fukushima Daiichi nuclear power reactor units that generated large amounts of airborne discharges during the period of March 12-21, 2011 were identified individually by analyzing the combination of measured 134Cs/137Cs depositions on ground surfaces and atmospheric transport and deposition simulations. Because the values of 134Cs/137Cs are different in reactor units owing to fuel burnup differences, the 134Cs/137Cs ratio measured in the environment was used to determine which reactor unit ultimately contaminated a specific area. Atmospheric dispersion model simulations were used for predicting specific areas contaminated by each dominant release. Finally, by comparing the results from both sources, the specific reactor units that yielded the most dominant atmospheric release quantities could be determined. The major source reactor units were Unit 1 in the afternoon of March 12, 2011, Unit 2 during the period from the late night of March 14 to the morning of March 15, 2011. These results corresponded to those assumed in our previous source term estimation studies. Furthermore, new findings suggested that the major source reactors from the evening of March 15, 2011 were Units 2 and 3 and that the dominant source reactor on March 20, 2011 temporally changed from Unit 3 to Unit 2.

  9. Utilization of (134)Cs/(137)Cs in the environment to identify the reactor units that caused atmospheric releases during the Fukushima Daiichi accident.

    PubMed

    Chino, Masamichi; Terada, Hiroaki; Nagai, Haruyasu; Katata, Genki; Mikami, Satoshi; Torii, Tatsuo; Saito, Kimiaki; Nishizawa, Yukiyasu

    2016-08-22

    The Fukushima Daiichi nuclear power reactor units that generated large amounts of airborne discharges during the period of March 12-21, 2011 were identified individually by analyzing the combination of measured (134)Cs/(137)Cs depositions on ground surfaces and atmospheric transport and deposition simulations. Because the values of (134)Cs/(137)Cs are different in reactor units owing to fuel burnup differences, the (134)Cs/(137)Cs ratio measured in the environment was used to determine which reactor unit ultimately contaminated a specific area. Atmospheric dispersion model simulations were used for predicting specific areas contaminated by each dominant release. Finally, by comparing the results from both sources, the specific reactor units that yielded the most dominant atmospheric release quantities could be determined. The major source reactor units were Unit 1 in the afternoon of March 12, 2011, Unit 2 during the period from the late night of March 14 to the morning of March 15, 2011. These results corresponded to those assumed in our previous source term estimation studies. Furthermore, new findings suggested that the major source reactors from the evening of March 15, 2011 were Units 2 and 3 and that the dominant source reactor on March 20, 2011 temporally changed from Unit 3 to Unit 2.

  10. Utilization of 134Cs/137Cs in the environment to identify the reactor units that caused atmospheric releases during the Fukushima Daiichi accident

    PubMed Central

    Chino, Masamichi; Terada, Hiroaki; Nagai, Haruyasu; Katata, Genki; Mikami, Satoshi; Torii, Tatsuo; Saito, Kimiaki; Nishizawa, Yukiyasu

    2016-01-01

    The Fukushima Daiichi nuclear power reactor units that generated large amounts of airborne discharges during the period of March 12–21, 2011 were identified individually by analyzing the combination of measured 134Cs/137Cs depositions on ground surfaces and atmospheric transport and deposition simulations. Because the values of 134Cs/137Cs are different in reactor units owing to fuel burnup differences, the 134Cs/137Cs ratio measured in the environment was used to determine which reactor unit ultimately contaminated a specific area. Atmospheric dispersion model simulations were used for predicting specific areas contaminated by each dominant release. Finally, by comparing the results from both sources, the specific reactor units that yielded the most dominant atmospheric release quantities could be determined. The major source reactor units were Unit 1 in the afternoon of March 12, 2011, Unit 2 during the period from the late night of March 14 to the morning of March 15, 2011. These results corresponded to those assumed in our previous source term estimation studies. Furthermore, new findings suggested that the major source reactors from the evening of March 15, 2011 were Units 2 and 3 and that the dominant source reactor on March 20, 2011 temporally changed from Unit 3 to Unit 2. PMID:27546490

  11. Probabilistic numerics and uncertainty in computations

    PubMed Central

    Hennig, Philipp; Osborne, Michael A.; Girolami, Mark

    2015-01-01

    We deliver a call to arms for probabilistic numerical methods: algorithms for numerical tasks, including linear algebra, integration, optimization and solving differential equations, that return uncertainties in their calculations. Such uncertainties, arising from the loss of precision induced by numerical calculation with limited time or hardware, are important for much contemporary science and industry. Within applications such as climate science and astrophysics, the need to make decisions on the basis of computations with large and complex data have led to a renewed focus on the management of numerical uncertainty. We describe how several seminal classic numerical methods can be interpreted naturally as probabilistic inference. We then show that the probabilistic view suggests new algorithms that can flexibly be adapted to suit application specifics, while delivering improved empirical performance. We provide concrete illustrations of the benefits of probabilistic numeric algorithms on real scientific problems from astrometry and astronomical imaging, while highlighting open problems with these new algorithms. Finally, we describe how probabilistic numerical methods provide a coherent framework for identifying the uncertainty in calculations performed with a combination of numerical algorithms (e.g. both numerical optimizers and differential equation solvers), potentially allowing the diagnosis (and control) of error sources in computations. PMID:26346321

  12. Probabilistic numerics and uncertainty in computations.

    PubMed

    Hennig, Philipp; Osborne, Michael A; Girolami, Mark

    2015-07-08

    We deliver a call to arms for probabilistic numerical methods : algorithms for numerical tasks, including linear algebra, integration, optimization and solving differential equations, that return uncertainties in their calculations. Such uncertainties, arising from the loss of precision induced by numerical calculation with limited time or hardware, are important for much contemporary science and industry. Within applications such as climate science and astrophysics, the need to make decisions on the basis of computations with large and complex data have led to a renewed focus on the management of numerical uncertainty. We describe how several seminal classic numerical methods can be interpreted naturally as probabilistic inference. We then show that the probabilistic view suggests new algorithms that can flexibly be adapted to suit application specifics, while delivering improved empirical performance. We provide concrete illustrations of the benefits of probabilistic numeric algorithms on real scientific problems from astrometry and astronomical imaging, while highlighting open problems with these new algorithms. Finally, we describe how probabilistic numerical methods provide a coherent framework for identifying the uncertainty in calculations performed with a combination of numerical algorithms (e.g. both numerical optimizers and differential equation solvers), potentially allowing the diagnosis (and control) of error sources in computations.

  13. Poisson-Like Spiking in Circuits with Probabilistic Synapses

    PubMed Central

    Moreno-Bote, Rubén

    2014-01-01

    Neuronal activity in cortex is variable both spontaneously and during stimulation, and it has the remarkable property that it is Poisson-like over broad ranges of firing rates covering from virtually zero to hundreds of spikes per second. The mechanisms underlying cortical-like spiking variability over such a broad continuum of rates are currently unknown. We show that neuronal networks endowed with probabilistic synaptic transmission, a well-documented source of variability in cortex, robustly generate Poisson-like variability over several orders of magnitude in their firing rate without fine-tuning of the network parameters. Other sources of variability, such as random synaptic delays or spike generation jittering, do not lead to Poisson-like variability at high rates because they cannot be sufficiently amplified by recurrent neuronal networks. We also show that probabilistic synapses predict Fano factor constancy of synaptic conductances. Our results suggest that synaptic noise is a robust and sufficient mechanism for the type of variability found in cortex. PMID:25032705
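    A much-simplified, feedforward illustration of the mechanism (ignoring the recurrent-network amplification that the paper analyses) is sketched below: presynaptic spikes are transmitted through Bernoulli (probabilistic) synapses, and the Fano factor of the postsynaptic spike count is checked for Poisson-like variability. All parameters are assumed.

```python
import numpy as np

rng = np.random.default_rng(3)

trials, timesteps = 1000, 1000        # 1 ms steps -> 1 s per trial
n_synapses = 100
release_prob = 0.2                    # probabilistic transmission per presynaptic spike
presyn_rate = 0.02                    # presynaptic spike probability per synapse per step
threshold = 2                         # released inputs needed to trigger a postsynaptic spike

spike_counts = np.empty(trials)
for k in range(trials):
    presyn = rng.random((timesteps, n_synapses)) < presyn_rate               # presynaptic spikes
    released = presyn & (rng.random((timesteps, n_synapses)) < release_prob)  # stochastic release
    spike_counts[k] = np.sum(released.sum(axis=1) >= threshold)               # postsynaptic spikes

fano = spike_counts.var(ddof=1) / spike_counts.mean()
print(f"Mean rate {spike_counts.mean():.1f} Hz, Fano factor {fano:.2f} (Poisson-like ~ 1)")
```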

  14. The energy release and temperature field in the ultracold neutron source of the WWR-M reactor at the Petersburg Nuclear Physics Institute

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Serebrov, A. P., E-mail: serebrov@pnpi.spb.ru; Kislitsin, B. V.; Onegin, M. S.

    2016-12-15

    Results of calculations of energy releases and temperature fields in the ultracold neutron source under design at the WWR-M reactor are presented. It is shown that, with the reactor power of 18 MW, the power of energy release in the 40-L volume of the source with superfluid helium will amount to 28.5 W, while 356 W will be released in a liquid-deuterium premoderator. The lead shield between the reactor core and the source reduces the radiative heat release by an order of magnitude. A thermal power of 22 kW is released in it, which is removed by passage of water. The distribution of temperatures in all components of the vacuum structure is presented, and the temperature does not exceed 100°C at full reactor power. The calculations performed make it possible to go to design of the source.

  15. Probabilistic n/γ discrimination with robustness against outliers for use in neutron profile monitors

    NASA Astrophysics Data System (ADS)

    Uchida, Y.; Takada, E.; Fujisaki, A.; Kikuchi, T.; Ogawa, K.; Isobe, M.

    2017-08-01

    A method to stochastically discriminate neutron and γ-ray signals measured with a stilbene organic scintillator is proposed. Each pulse signal was stochastically categorized into two groups: neutron and γ-ray. In previous work, the Expectation Maximization (EM) algorithm was used with the assumption that the measured data followed a Gaussian mixture distribution. It was shown that probabilistic discrimination between these groups is possible. Moreover, by setting the initial parameters for the Gaussian mixture distribution with a k-means algorithm, the possibility of automatic discrimination was demonstrated. In this study, the Student's t-mixture distribution was used as a probabilistic distribution with the EM algorithm to improve the robustness against the effect of outliers caused by pileup of the signals. To validate the proposed method, the figures of merit (FOMs) were compared for the EM algorithm assuming a t-mixture distribution and a Gaussian mixture distribution. The t-mixture distribution resulted in an improvement of the FOMs compared with the Gaussian mixture distribution. The proposed data processing technique is a promising tool not only for neutron and γ-ray discrimination in fusion experiments but also in other fields, for example, homeland security, cancer therapy with high energy particles, nuclear reactor decommissioning, pattern recognition, and so on.
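    The Gaussian-mixture baseline that the paper improves upon can be sketched with scikit-learn, as below, on a synthetic pulse-shape-discrimination feature (e.g., a tail-to-total charge ratio); the Student's t-mixture EM used in the paper is not part of scikit-learn and is not shown. The cluster locations and pile-up outliers are assumptions.

```python
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(5)

# Synthetic pulse-shape feature: gamma rays cluster low, neutrons cluster high,
# plus a few pile-up outliers that motivate the paper's heavier-tailed t-mixture.
gammas   = rng.normal(0.12, 0.02, size=4000)
neutrons = rng.normal(0.22, 0.03, size=1000)
outliers = rng.uniform(0.0, 0.6, size=50)
x = np.concatenate([gammas, neutrons, outliers]).reshape(-1, 1)

# Two-component EM fit; k-means initialisation makes the labelling automatic.
gmm = GaussianMixture(n_components=2, init_params="kmeans", random_state=0).fit(x)
posterior = gmm.predict_proba(x)           # per-pulse probability of each class
neutron_comp = int(np.argmax(gmm.means_))  # component with the larger mean = neutrons
p_neutron = posterior[:, neutron_comp]

print(f"Fraction classified as neutrons (p > 0.5): {np.mean(p_neutron > 0.5):.3f}")
```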

  16. Model fitting data from syllogistic reasoning experiments.

    PubMed

    Hattori, Masasi

    2016-12-01

    The data presented in this article are related to the research article entitled "Probabilistic representation in syllogistic reasoning: A theory to integrate mental models and heuristics" (M. Hattori, 2016) [1]. This article presents data predicted by three signature probabilistic models of syllogistic reasoning and model fitting results for each of a total of 12 experiments (N = 404) in the literature. Models are implemented in R, and their source code is also provided.

  17. The Epistemic Representation of Information Flow Security in Probabilistic Systems

    DTIC Science & Technology

    1995-06-01

    The new characterization also means that our security criterion is expressible in a simpler logic and model. Multilevel security is ... (e.g., a random number generator) during its execution. Such probabilistic choices are useful in a multilevel security context.

  18. Improving online risk assessment with equipment prognostics and health monitoring

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Coble, Jamie B.; Liu, Xiaotong; Briere, Chris

    The current approach to evaluating the risk of nuclear power plant (NPP) operation relies on static probabilities of component failure, which are based on industry experience with the existing fleet of nominally similar light water reactors (LWRs). As the nuclear industry looks to advanced reactor designs that feature non-light water coolants (e.g., liquid metal, high temperature gas, molten salt), this operating history is not available. Many advanced reactor designs use advanced components, such as electromagnetic pumps, that have not been used in the US commercial nuclear fleet. Given the lack of rich operating experience, we cannot accurately estimate the evolving probability of failure for basic components to populate the fault trees and event trees that typically comprise probabilistic risk assessment (PRA) models. Online equipment prognostics and health management (PHM) technologies can bridge this gap to estimate the failure probabilities for components under operation. The enhanced risk monitor (ERM) incorporates equipment condition assessment into the existing PRA and risk monitor framework to provide accurate and timely estimates of operational risk.

  19. A probabilistic approach to radiative energy loss calculations for optically thick atmospheres - Hydrogen lines and continua

    NASA Technical Reports Server (NTRS)

    Canfield, R. C.; Ricchiazzi, P. J.

    1980-01-01

    An approximate probabilistic radiative transfer equation and the statistical equilibrium equations are simultaneously solved for a model hydrogen atom consisting of three bound levels and ionization continuum. The transfer equation for L-alpha, L-beta, H-alpha, and the Lyman continuum is explicitly solved assuming complete redistribution. The accuracy of this approach is tested by comparing source functions and radiative loss rates to values obtained with a method that solves the exact transfer equation. Two recent model solar-flare chromospheres are used for this test. It is shown that for the test atmospheres the probabilistic method gives values of the radiative loss rate that are characteristically good to a factor of 2. The advantage of this probabilistic approach is that it retains a description of the dominant physical processes of radiative transfer in the complete redistribution case, yet it achieves a major reduction in computational requirements.

  20. From information processing to decisions: Formalizing and comparing psychologically plausible choice models.

    PubMed

    Heck, Daniel W; Hilbig, Benjamin E; Moshagen, Morten

    2017-08-01

    Decision strategies explain how people integrate multiple sources of information to make probabilistic inferences. In the past decade, increasingly sophisticated methods have been developed to determine which strategy explains decision behavior best. We extend these efforts to test psychologically more plausible models (i.e., strategies), including a new, probabilistic version of the take-the-best (TTB) heuristic that implements a rank order of error probabilities based on sequential processing. Within a coherent statistical framework, deterministic and probabilistic versions of TTB and other strategies can directly be compared using model selection by minimum description length or the Bayes factor. In an experiment with inferences from given information, only three of 104 participants were best described by the psychologically plausible, probabilistic version of TTB. As in previous studies, most participants were classified as users of weighted-additive, a strategy that integrates all available information and approximates rational decisions. Copyright © 2017 Elsevier Inc. All rights reserved.
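    A minimal sketch of a probabilistic take-the-best rule of the kind described above is shown below: cues are inspected in validity order, the first discriminating cue decides, and each such decision can be flipped with a position-dependent error probability. The cue patterns and error probabilities are assumed, not the fitted values from the experiment.

```python
import numpy as np

rng = np.random.default_rng(11)

# Cue values for two options (1 = cue present, 0 = absent), ordered by cue validity,
# and an error probability per cue position (assumed to increase down the order).
cues_a = np.array([1, 0, 1, 0])
cues_b = np.array([1, 0, 0, 1])
error_prob = np.array([0.05, 0.10, 0.15, 0.20])

def probabilistic_ttb(a, b, errors, rng):
    """Return 'A' or 'B'; each discriminating cue look-up may be misread."""
    for cue_a, cue_b, e in zip(a, b, errors):
        if cue_a != cue_b:                       # first discriminating cue decides
            correct = "A" if cue_a > cue_b else "B"
            wrong = "B" if correct == "A" else "A"
            return correct if rng.random() > e else wrong
    return rng.choice(["A", "B"])                # no cue discriminates: guess

choices = [probabilistic_ttb(cues_a, cues_b, error_prob, rng) for _ in range(10_000)]
print(f"P(choose A) = {choices.count('A') / len(choices):.3f}")
```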

  1. A framework for the probabilistic analysis of meteotsunamis

    USGS Publications Warehouse

    Geist, Eric L.; ten Brink, Uri S.; Gove, Matthew D.

    2014-01-01

    A probabilistic technique is developed to assess the hazard from meteotsunamis. Meteotsunamis are unusual sea-level events, generated when the speed of an atmospheric pressure or wind disturbance is comparable to the phase speed of long waves in the ocean. A general aggregation equation is proposed for the probabilistic analysis, based on previous frameworks established for both tsunamis and storm surges, incorporating different sources and source parameters of meteotsunamis. Parameterization of atmospheric disturbances and numerical modeling is performed for the computation of maximum meteotsunami wave amplitudes near the coast. A historical record of pressure disturbances is used to establish a continuous analytic distribution of each parameter as well as the overall Poisson rate of occurrence. A demonstration study is presented for the northeast U.S. in which only isolated atmospheric pressure disturbances from squall lines and derechos are considered. For this study, Automated Surface Observing System stations are used to determine the historical parameters of squall lines from 2000 to 2013. The probabilistic equations are implemented using a Monte Carlo scheme, where a synthetic catalog of squall lines is compiled by sampling the parameter distributions. For each entry in the catalog, ocean wave amplitudes are computed using a numerical hydrodynamic model. Aggregation of the results from the Monte Carlo scheme results in a meteotsunami hazard curve that plots the annualized rate of exceedance with respect to maximum event amplitude for a particular location along the coast. Results from using multiple synthetic catalogs, resampled from the parent parameter distributions, yield mean and quantile hazard curves. Further refinements and improvements for probabilistic analysis of meteotsunamis are discussed.
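    The aggregation step can be sketched as a Monte Carlo loop: sample disturbance parameters from fitted distributions, map each synthetic squall line to a maximum amplitude, and scale the exceedance fractions by the Poisson occurrence rate to get an annualized hazard curve. The placeholder amplitude function below stands in for the hydrodynamic model, and all parameter values are assumptions.

```python
import numpy as np

rng = np.random.default_rng(2)

poisson_rate = 5.0        # assumed squall-line occurrences per year affecting the site
catalog_size = 20_000     # synthetic events in one Monte Carlo catalogue

# Sample disturbance parameters from assumed parametric distributions.
pressure_jump = rng.gamma(shape=2.0, scale=1.5, size=catalog_size)      # hPa
speed = rng.normal(22.0, 5.0, size=catalog_size).clip(5.0, None)        # m/s

def max_amplitude(dp, u):
    # Placeholder for the hydrodynamic model: amplitude grows with the pressure jump
    # and peaks when the disturbance speed nears the long-wave speed (~30 m/s here).
    proudman_factor = 1.0 / np.maximum(np.abs(1.0 - (u / 30.0) ** 2), 0.05)
    return 0.01 * dp * proudman_factor                                    # metres

amps = max_amplitude(pressure_jump, speed)

# Annualized exceedance rate: fraction of synthetic events above each level,
# multiplied by the Poisson rate of events per year.
levels = np.linspace(0.05, 1.0, 20)
exceed_rate = [poisson_rate * np.mean(amps > a) for a in levels]
for a, r in zip(levels[::5], exceed_rate[::5]):
    print(f"amplitude > {a:.2f} m : {r:.3f} events/year")
```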

  2. Evaluating bacterial gene-finding HMM structures as probabilistic logic programs.

    PubMed

    Mørk, Søren; Holmes, Ian

    2012-03-01

    Probabilistic logic programming offers a powerful way to describe and evaluate structured statistical models. To investigate the practicality of probabilistic logic programming for structure learning in bioinformatics, we undertook a simplified bacterial gene-finding benchmark in PRISM, a probabilistic dialect of Prolog. We evaluate Hidden Markov Model structures for bacterial protein-coding gene potential, including a simple null model structure, three structures based on existing bacterial gene finders and two novel model structures. We test standard versions as well as ADPH length modeling and three-state versions of the five model structures. The models are all represented as probabilistic logic programs and evaluated using the PRISM machine learning system in terms of statistical information criteria and gene-finding prediction accuracy, in two bacterial genomes. Neither of our implementations of the two currently most used model structures are best performing in terms of statistical information criteria or prediction performances, suggesting that better-fitting models might be achievable. The source code of all PRISM models, data and additional scripts are freely available for download at: http://github.com/somork/codonhmm. Supplementary data are available at Bioinformatics online.

  3. Modeling marine oily wastewater treatment by a probabilistic agent-based approach.

    PubMed

    Jing, Liang; Chen, Bing; Zhang, Baiyu; Ye, Xudong

    2018-02-01

    This study developed a novel probabilistic agent-based approach for modeling of marine oily wastewater treatment processes. It begins by constructing a probability-based agent simulation model, followed by a global sensitivity analysis and a genetic algorithm-based calibration. The proposed modeling approach was tested through a case study of the removal of naphthalene from marine oily wastewater using UV irradiation. The removal of naphthalene was described by an agent-based simulation model using 8 types of agents and 11 reactions. Each reaction was governed by a probability parameter to determine its occurrence. The modeling results showed that the root mean square errors between modeled and observed removal rates were 8.73 and 11.03% for calibration and validation runs, respectively. Reaction competition was analyzed by comparing agent-based reaction probabilities, while agents' heterogeneity was visualized by plotting their real-time spatial distribution, showing a strong potential for reactor design and process optimization. Copyright © 2017 Elsevier Ltd. All rights reserved.
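    The probability-governed reaction mechanism can be illustrated with a deliberately stripped-down sketch (one agent type and one reaction, instead of the paper's 8 agent types and 11 reactions): at each step, every remaining naphthalene agent is degraded with a calibrated probability. The probability and step count below are assumptions.

```python
import numpy as np

rng = np.random.default_rng(9)

n_naphthalene = 10_000        # initial naphthalene agents
p_photolysis = 0.004          # assumed per-step probability that UV degrades an agent
steps = 600                   # simulated irradiation steps

remaining = n_naphthalene
history = [remaining]
for _ in range(steps):
    degraded = rng.binomial(remaining, p_photolysis)   # stochastic reaction occurrence
    remaining -= degraded
    history.append(remaining)

removal = 100.0 * (1.0 - remaining / n_naphthalene)
print(f"Modelled naphthalene removal after {steps} steps: {removal:.1f}%")
```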

  4. Georgia Tech Studies of Sub-Critical Advanced Burner Reactors with a D-T Fusion Tokamak Neutron Source for the Transmutation of Spent Nuclear Fuel

    NASA Astrophysics Data System (ADS)

    Stacey, W. M.

    2009-09-01

    The possibility that a tokamak D-T fusion neutron source, based on ITER physics and technology, could be used to drive sub-critical, fast-spectrum nuclear reactors fueled with the transuranics (TRU) in spent nuclear fuel discharged from conventional nuclear reactors has been investigated at Georgia Tech in a series of studies which are summarized in this paper. It is found that sub-critical operation of such fast transmutation reactors is advantageous in allowing longer fuel residence time, hence greater TRU burnup between fuel reprocessing stages, and in allowing higher TRU loading without compromising safety, relative to what could be achieved in a similar critical transmutation reactor. The required plasma and fusion technology operating parameter range of the fusion neutron source is generally within the anticipated operational range of ITER. The implications of these results for fusion development policy, if they hold up under more extensive and detailed analysis, is that a D-T fusion tokamak neutron source for a sub-critical transmutation reactor, built on the basis of the ITER operating experience, could possibly be a logical next step after ITER on the path to fusion electrical power reactors. At the same time, such an application would allow fusion to contribute to meeting the nation's energy needs at an earlier stage by helping to close the fission reactor nuclear fuel cycle.

  5. Probabilistic source mechanism estimation based on body-wave waveforms through shift and stack algorithm

    NASA Astrophysics Data System (ADS)

    Massin, F.; Malcolm, A. E.

    2017-12-01

    Knowing earthquake source mechanisms gives valuable information for earthquake response planning and hazard mitigation. Earthquake source mechanisms can be analyzed using long period waveform inversion (for moderate size sources with sufficient signal to noise ratio) and body-wave first motion polarity or amplitude ratio inversion (for micro-earthquakes with sufficient data coverage). A robust approach that gives both source mechanisms and their associated probabilities across all source scales would greatly simplify the determination of source mechanisms and allow for more consistent interpretations of the results. Following previous work on shift and stack approaches, we develop such a probabilistic source mechanism analysis, using waveforms, which does not require polarity picking. For a given source mechanism, the first period of the observed body-waves is selected for all stations, multiplied by their corresponding theoretical polarity and stacked together. (The first period is found from a manually picked travel time by measuring the central period where the signal power is concentrated, using the second moment of the power spectral density function.) As in other shift and stack approaches, our method is not based on the optimization of an objective function through an inversion. Instead, the power of the polarity-corrected stack is a proxy for the likelihood of the trial source mechanism, with the most powerful stack corresponding to the most likely source mechanism. Using synthetic data, we test our method for robustness to the data coverage, coverage gap, signal to noise ratio, travel-time picking errors and non-double couple component. We then present results for field data in a volcano-tectonic context. Our results are reliable when constrained by 15 body-wavelets, with gap below 150 degrees, signal to noise ratio over 1 and arrival time error below a fifth of the period (0.2T) of the body-wave. We demonstrate that the source scanning approach for source mechanism analysis has similar advantages to waveform inversion (full waveform data, no manual intervention, probabilistic approach) and similar applicability to polarity inversion (any source size, any instrument type).
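    A toy version of the polarity-corrected shift-and-stack scoring is sketched below: trial mechanisms are reduced to sign patterns over stations, the first-period waveforms are multiplied by the predicted polarities and stacked, and the stack power serves as an unnormalised likelihood proxy. The synthetic wavelet, noise level, and station count are assumptions.

```python
import numpy as np

rng = np.random.default_rng(13)

n_stations, n_samples = 15, 64
t = np.linspace(0.0, 1.0, n_samples)
wavelet = np.sin(2.0 * np.pi * t)                       # one period of the body wave

# True polarities at the stations (in reality predicted from the focal mechanism).
true_polarity = rng.choice([-1.0, 1.0], size=n_stations)
records = true_polarity[:, None] * wavelet + 0.5 * rng.normal(size=(n_stations, n_samples))

def stack_power(trial_polarity, data):
    """Power of the polarity-corrected stack: high when the trial matches the data."""
    stack = np.mean(trial_polarity[:, None] * data, axis=0)
    return np.sum(stack ** 2)

# Score a set of random trial mechanisms plus the true one; the powers act as
# unnormalised likelihoods, so they can be converted into probabilities.
trials = [rng.choice([-1.0, 1.0], size=n_stations) for _ in range(200)] + [true_polarity]
powers = np.array([stack_power(p, records) for p in trials])
probs = powers / powers.sum()
print(f"Best trial is the true mechanism: {int(np.argmax(powers)) == len(trials) - 1}")
print(f"Relative weight of the true mechanism: {probs[-1]:.3f}")
```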

  6. Alloying of steel and graphite by hydrogen in nuclear reactor

    NASA Astrophysics Data System (ADS)

    Krasikov, E.

    2017-02-01

    In traditional power engineering, hydrogen may be one of the primary sources of equipment damage. This problem is highly relevant to both nuclear and thermonuclear power engineering. Study of radiation-hydrogen embrittlement of steel raises the question of an unknown source of hydrogen in reactors. Unexpectedly high hydrogen concentrations were later detected in irradiated graphite. It is necessary to look for this source of hydrogen, especially because hydrogen flakes were detected in the reactor vessels of Belgian NPPs. As a possible initial hypothesis about this enigmatic source of hydrogen, one can propose proton generation during beta-decay of free neutrons, inasmuch as protons have been detected by researchers at nuclear reactors as a signature of free-neutron beta-decay.

  7. A Probabilistic Tsunami Hazard Study of the Auckland Region, Part I: Propagation Modelling and Tsunami Hazard Assessment at the Shoreline

    NASA Astrophysics Data System (ADS)

    Power, William; Wang, Xiaoming; Lane, Emily; Gillibrand, Philip

    2013-09-01

    Regional source tsunamis represent a potentially devastating threat to coastal communities in New Zealand, yet are infrequent events for which little historical information is available. It is therefore essential to develop robust methods for quantitatively estimating the hazards posed, so that effective mitigation measures can be implemented. We develop a probabilistic model for the tsunami hazard posed to the Auckland region of New Zealand from the Kermadec Trench and the southern New Hebrides Trench subduction zones. An innovative feature of our model is the systematic analysis of uncertainty regarding the magnitude-frequency distribution of earthquakes in the source regions. The methodology is first used to estimate the tsunami hazard at the coastline, and then used to produce a set of scenarios that can be applied to produce probabilistic maps of tsunami inundation for the study region; the production of these maps is described in part II. We find that the 2,500 year return period regional source tsunami hazard for the densely populated east coast of Auckland is dominated by events originating in the Kermadec Trench, while the equivalent hazard to the sparsely populated west coast is approximately equally due to events on the Kermadec Trench and the southern New Hebrides Trench.
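
    As a minimal illustration of how magnitude-frequency uncertainty can enter such a model, the sketch below weights a few alternative Gutenberg-Richter b-values in a crudely truncated recurrence relation; the parameter values are placeholders, not those of the Kermadec or New Hebrides sources.

      def annual_rate_ge(m, a, b, m_max):
          # Annual rate of events with magnitude >= m under a simple truncated
          # Gutenberg-Richter relation (rate forced to zero at m_max).
          if m >= m_max:
              return 0.0
          return 10 ** (a - b * m) - 10 ** (a - b * m_max)

      branches = [(0.3, 0.9), (0.4, 1.0), (0.3, 1.1)]   # (weight, b-value) logic-tree branches
      a_value, m_max, m_target = 5.5, 9.0, 8.5          # placeholder source parameters
      rate = sum(w * annual_rate_ge(m_target, a_value, b, m_max) for w, b in branches)
      print("weighted annual rate of M >= 8.5:", rate, " return period (yr):", 1.0 / rate)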

  8. Conceptual design study of Fusion Experimental Reactor (FY86 FER): Safety

    NASA Astrophysics Data System (ADS)

    Seki, Yasushi; Iida, Hiromasa; Honda, Tsutomu

    1987-08-01

    This report describes the study on safety for FER (Fusion Experimental Reactor), which has been designed as a next-step machine after the JT-60. Though the final purpose of this study is to characterize the design basis accident and maximum credible accident and to assess their risk or probability for the FER plant system, the emphasis of this year's study is placed on the fuel-gas circulation system, where the tritium inventory is largest. The report consists of two chapters. The first chapter summarizes the FER system and describes the FMEA (Failure Mode and Effect Analysis) and related accident progression sequences for the FER plant system as a whole. The second chapter focuses on the fuel-gas circulation system, including purification, isotope separation, and storage. Risk is assessed by the probabilistic risk analysis (PRA) procedure based on FMEA, ETA, and FTA.
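
    For readers unfamiliar with the FTA arithmetic mentioned above, a minimal sketch follows; the events and probabilities are illustrative placeholders, not values from the FER study.

      def or_gate(*p):    # at least one input event occurs (independent events)
          q = 1.0
          for pi in p:
              q *= (1.0 - pi)
          return 1.0 - q

      def and_gate(*p):   # all input events occur (independent events)
          q = 1.0
          for pi in p:
              q *= pi
          return q

      p_pump_fails = 1e-3
      p_valve_stuck = 5e-4
      p_operator_misses_alarm = 1e-2
      # Hypothetical top event: tritium release requires a leak path AND a missed alarm
      p_leak_path = or_gate(p_pump_fails, p_valve_stuck)
      p_top = and_gate(p_leak_path, p_operator_misses_alarm)
      print(f"top event probability ~ {p_top:.2e}")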

  9. A preliminary probabilistic analysis of tsunami sources of seismic and non-seismic origin applied to the city of Naples, Italy

    NASA Astrophysics Data System (ADS)

    Tonini, R.; Anita, G.

    2011-12-01

    In both worldwide and regional historical catalogues, most tsunamis are caused by earthquakes, and only a minor percentage is attributed to all other, non-seismic sources. On the other hand, tsunami hazard and risk studies are often applied to very specific areas, where this global trend can differ or even be inverted, depending on the kinds of potential tsunamigenic sources that characterize the case study. So far, few probabilistic approaches treat the contribution of landslides and/or phenomena derived from volcanic activity, such as pyroclastic flows and flank collapses, as predominant in the PTHA, partly because of the difficulty of estimating the corresponding recurrence times. These considerations apply, for example, to the city of Naples, Italy, which is surrounded by a complex active volcanic system (Vesuvio, Campi Flegrei, Ischia) that presents a significant number of potential tsunami sources of non-seismic origin compared to the seismic ones. In this work we present the preliminary results of a probabilistic multi-source tsunami hazard assessment applied to Naples. The uncertainties are estimated using Bayesian inference. This is the first step towards a more comprehensive task that will provide a tsunami risk quantification for this city in the frame of the Italian national project ByMuR (http://bymur.bo.ingv.it). This ongoing three-year project has the final objective of developing a Bayesian multi-risk methodology to quantify the risk related to different natural hazards (volcanoes, earthquakes and tsunamis) applied to the city of Naples.

  10. The procedure and results of calculations of the equilibrium isotopic composition of a demonstration subcritical molten salt reactor

    NASA Astrophysics Data System (ADS)

    Nevinitsa, V. A.; Dudnikov, A. A.; Blandinskiy, V. Yu.; Balanin, A. L.; Alekseev, P. N.; Titarenko, Yu. E.; Batyaev, V. F.; Pavlov, K. V.; Titarenko, A. Yu.

    2015-12-01

    A subcritical molten salt reactor with an external neutron source is studied computationally as a facility for incineration and transmutation of minor actinides from spent nuclear fuel of reactors of VVER-1000 type and for producing 233U from 232Th. The reactor configuration is chosen, the requirements to be imposed on the external neutron source are formulated, and the equilibrium isotopic composition of heavy nuclides and the key parameters of the fuel cycle are calculated.

  11. Bayesian Networks Improve Causal Environmental Assessments for Evidence-Based Policy.

    PubMed

    Carriger, John F; Barron, Mace G; Newman, Michael C

    2016-12-20

    Rule-based weight of evidence approaches to ecological risk assessment may not account for uncertainties and generally lack probabilistic integration of lines of evidence. Bayesian networks allow causal inferences to be made from evidence by including causal knowledge about the problem, using this knowledge with probabilistic calculus to combine multiple lines of evidence, and minimizing biases in predicting or diagnosing causal relationships. Too often, sources of uncertainty in conventional weight of evidence approaches are ignored that can be accounted for with Bayesian networks. Specifying and propagating uncertainties improve the ability of models to incorporate strength of the evidence in the risk management phase of an assessment. Probabilistic inference from a Bayesian network allows evaluation of changes in uncertainty for variables from the evidence. The network structure and probabilistic framework of a Bayesian approach provide advantages over qualitative approaches in weight of evidence for capturing the impacts of multiple sources of quantifiable uncertainty on predictions of ecological risk. Bayesian networks can facilitate the development of evidence-based policy under conditions of uncertainty by incorporating analytical inaccuracies or the implications of imperfect information, structuring and communicating causal issues through qualitative directed graph formulations, and quantitatively comparing the causal power of multiple stressors on valued ecological resources. These aspects are demonstrated through hypothetical problem scenarios that explore some major benefits of using Bayesian networks for reasoning and making inferences in evidence-based policy.
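
    A minimal sketch of the probabilistic calculus being described: a single stressor node with two conditionally independent lines of evidence, combined by exact enumeration. The conditional probability tables below are invented for illustration, not taken from the paper.

      # Hypothetical network: Stressor -> Bioassay, Stressor -> FieldEffect
      P_S = {True: 0.2, False: 0.8}        # prior P(stressor present)
      P_BIO = {True: 0.85, False: 0.10}    # P(bioassay positive | stressor state)
      P_FIELD = {True: 0.70, False: 0.20}  # P(field effect observed | stressor state)

      def posterior_stressor(bio_obs, field_obs):
          # P(stressor | both lines of evidence), enumerating the two stressor states
          def joint(s):
              pb = P_BIO[s] if bio_obs else 1 - P_BIO[s]
              pf = P_FIELD[s] if field_obs else 1 - P_FIELD[s]
              return P_S[s] * pb * pf
          jt, jf = joint(True), joint(False)
          return jt / (jt + jf)

      print(posterior_stressor(bio_obs=True, field_obs=True))   # agreeing evidence -> high posterior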

  12. Hydrogasification reactor and method of operating same

    DOEpatents

    Hobbs, Raymond; Karner, Donald; Sun, Xiaolei; Boyle, John; Noguchi, Fuyuki

    2013-09-10

    The present invention provides a system and method for evaluating effects of process parameters on hydrogasification processes. The system includes a hydrogasification reactor, a pressurized feed system, a hopper system, a hydrogen gas source, and a carrier gas source. Pressurized carbonaceous material, such as coal, is fed to the reactor using the carrier gas and reacted with hydrogen to produce natural gas.

  13. Skyshine study for next generation of fusion devices

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gohar, Y.; Yang, S.

    1987-02-01

    A shielding analysis for next generation of fusion devices (ETR/INTOR) was performed to study the dose equivalent outside the reactor building during operation including the contribution from neutrons and photons scattered back by collisions with air nuclei (skyshine component). Two different three-dimensional geometrical models for a tokamak fusion reactor based on INTOR design parameters were developed for this study. In the first geometrical model, the reactor geometry and the spatial distribution of the deuterium-tritium neutron source were simplified for a parametric survey. The second geometrical model employed an explicit representation of the toroidal geometry of the reactor chamber and the spatial distribution of the neutron source. The MCNP general Monte Carlo code for neutron and photon transport was used to perform all the calculations. The energy distribution of the neutron source was used explicitly in the calculations with ENDF/B-V data. The dose equivalent results were analyzed as a function of the concrete roof thickness of the reactor building and the location outside the reactor building.

  14. Combustion flame-plasma hybrid reactor systems, and chemical reactant sources

    DOEpatents

    Kong, Peter C

    2013-11-26

    Combustion flame-plasma hybrid reactor systems, chemical reactant sources, and related methods are disclosed. In one embodiment, a combustion flame-plasma hybrid reactor system comprising a reaction chamber, a combustion torch positioned to direct a flame into the reaction chamber, and one or more reactant feed assemblies configured to electrically energize at least one electrically conductive solid reactant structure to form a plasma and feed each electrically conductive solid reactant structure into the plasma to form at least one product is disclosed. In an additional embodiment, a chemical reactant source for a combustion flame-plasma hybrid reactor comprising an elongated electrically conductive reactant structure consisting essentially of at least one chemical reactant is disclosed. In further embodiments, methods of forming a chemical reactant source and methods of chemically converting at least one reactant into at least one product are disclosed.

  15. A Comprehensive Probabilistic Tsunami Hazard Assessment: Multiple Sources and Short-Term Interactions

    NASA Astrophysics Data System (ADS)

    Anita, G.; Selva, J.; Laura, S.

    2011-12-01

    We develop a comprehensive and total probabilistic tsunami hazard assessment (TotPTHA), in which many different possible source types contribute to the definition of the total tsunami hazard at given target sites. From a multi-hazard and multi-risk perspective, such an innovative approach makes it possible, in principle, to consider all tsunamigenic sources, from seismic events to slides, asteroids, volcanic eruptions, etc. In this respect, we also formally introduce and discuss the treatment of interaction/cascade effects in the TotPTHA analysis. We demonstrate how external triggering events may induce significant temporary variations in the tsunami hazard. Because of this, such effects should always be considered, at least in short-term applications, to obtain unbiased analyses. Finally, we prove the feasibility of the TotPTHA and of the treatment of interaction/cascade effects by applying this methodology to an idealized region with realistic characteristics (Neverland).
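
    One elementary ingredient of such a total assessment is the aggregation of exceedance rates across source types. The sketch below combines independent Poissonian sources, before any interaction/cascade corrections; the rates are placeholders, not results from the paper.

      import math

      # Illustrative annual rates at which each source type produces a tsunami
      # exceeding a target height at the site (placeholder numbers).
      rates = {"subduction_earthquake": 1e-3,
               "submarine_landslide": 2e-4,
               "volcanic_flank_collapse": 5e-5}

      lam_total = sum(rates.values())             # total exceedance rate, independent sources
      T = 50.0                                    # exposure window in years
      p_exceed = 1.0 - math.exp(-lam_total * T)   # Poisson probability of at least one exceedance
      print(f"P(exceedance in {T:.0f} yr) = {p_exceed:.3f}")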

  16. 40 CFR 63.1406 - Reactor batch process vent provisions.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 40 Protection of Environment 11 2011-07-01 2011-07-01 false Reactor batch process vent provisions... § 63.1406 Reactor batch process vent provisions. (a) Emission standards. Owners or operators of reactor... reactor batch process vent located at a new affected source shall control organic HAP emissions by...

  17. 40 CFR 63.1406 - Reactor batch process vent provisions.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 40 Protection of Environment 11 2010-07-01 2010-07-01 true Reactor batch process vent provisions... § 63.1406 Reactor batch process vent provisions. (a) Emission standards. Owners or operators of reactor... reactor batch process vent located at a new affected source shall control organic HAP emissions by...

  18. Anticipatory systems using a probabilistic-possibilistic formalism

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tsoukalas, L.H.

    1989-01-01

    A methodology for the realization of the Anticipatory Paradigm in the diagnosis and control of complex systems, such as power plants, is developed. The objective is to synthesize engineering systems as analogs of certain biological systems which are capable of modifying their present states on the basis of anticipated future states. These future states are construed to be the output of predictive, numerical, stochastic or symbolic models. The mathematical basis of the implementation is developed on the basis of a formulation coupling probabilistic (random) and possibilistic (fuzzy) data in the form of an Information Granule. Random data are generated from observations and sensor inputs from the environment. Fuzzy data consist of epistemic information, such as criteria or constraints qualifying the environmental inputs. The approach generates mathematical performance measures upon which diagnostic inferences and control functions are based. Anticipated performance is generated using a fuzzified Bayes formula. Triplex arithmetic is used in the numerical estimation of the performance measures. Representation of the system is based upon a goal-tree within the rule-based paradigm from the field of Applied Artificial Intelligence. The ensuing construction incorporates a coupling of Symbolic and Procedural programming methods. As a demonstration of the possibility of constructing such systems, a model-based system of a nuclear reactor is constructed. A numerical model of the reactor as a damped simple harmonic oscillator is used. The neutronic behavior is described by a point kinetics model with temperature feedback. The resulting system is programmed in OPS5 for the symbolic component and in FORTRAN for the procedural part.

  19. SSHAC Level 1 Probabilistic Seismic Hazard Analysis for the Idaho National Laboratory

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Payne, Suzette Jackson; Coppersmith, Ryan; Coppersmith, Kevin

    A Probabilistic Seismic Hazard Analysis (PSHA) was completed for the Materials and Fuels Complex (MFC), Advanced Test Reactor (ATR), and Naval Reactors Facility (NRF) at the Idaho National Laboratory (INL). The PSHA followed the approaches and procedures for Senior Seismic Hazard Analysis Committee (SSHAC) Level 1 study and included a Participatory Peer Review Panel (PPRP) to provide the confident technical basis and mean-centered estimates of the ground motions. A new risk-informed methodology for evaluating the need for an update of an existing PSHA was developed as part of the Seismic Risk Assessment (SRA) project. To develop and implement the new methodology, the SRA project elected to perform two SSHAC Level 1 PSHAs. The first was for the Fuel Manufacturing Facility (FMF), which is classified as a Seismic Design Category (SDC) 3 nuclear facility. The second was for the ATR Complex, which has facilities classified as SDC-4. The new methodology requires defensible estimates of ground motion levels (mean and full distribution of uncertainty) for its criteria and evaluation process. The INL SSHAC Level 1 PSHA demonstrates the use of the PPRP, evaluation and integration through utilization of a small team with multiple roles and responsibilities (four team members and one specialty contractor), and the feasibility of a short duration schedule (10 months). Additionally, a SSHAC Level 1 PSHA was conducted for NRF to provide guidance on the potential use of a design margin above rock hazard levels for the Spent Fuel Handling Recapitalization Project (SFHP) process facility.

  20. A novel probabilistic framework for event-based speech recognition

    NASA Astrophysics Data System (ADS)

    Juneja, Amit; Espy-Wilson, Carol

    2003-10-01

    One of the reasons for unsatisfactory performance of the state-of-the-art automatic speech recognition (ASR) systems is the inferior acoustic modeling of low-level acoustic-phonetic information in the speech signal. An acoustic-phonetic approach to ASR, on the other hand, explicitly targets linguistic information in the speech signal, but such a system for continuous speech recognition (CSR) is not known to exist. A probabilistic and statistical framework for CSR based on the idea of the representation of speech sounds by bundles of binary valued articulatory phonetic features is proposed. Multiple probabilistic sequences of linguistically motivated landmarks are obtained using binary classifiers of manner phonetic features-syllabic, sonorant and continuant-and the knowledge-based acoustic parameters (APs) that are acoustic correlates of those features. The landmarks are then used for the extraction of knowledge-based APs for source and place phonetic features and their binary classification. Probabilistic landmark sequences are constrained using manner class language models for isolated or connected word recognition. The proposed method could overcome the disadvantages encountered by the early acoustic-phonetic knowledge-based systems that led the ASR community to switch to systems highly dependent on statistical pattern analysis methods and probabilistic language or grammar models.

  1. Probabilistic Exposure Analysis for Chemical Risk Characterization

    PubMed Central

    Bogen, Kenneth T.; Cullen, Alison C.; Frey, H. Christopher; Price, Paul S.

    2009-01-01

    This paper summarizes the state of the science of probabilistic exposure assessment (PEA) as applied to chemical risk characterization. Current probabilistic risk analysis methods applied to PEA are reviewed. PEA within the context of risk-based decision making is discussed, including probabilistic treatment of related uncertainty, interindividual heterogeneity, and other sources of variability. Key examples of recent experience gained in assessing human exposures to chemicals in the environment, and other applications to chemical risk characterization and assessment, are presented. It is concluded that, although improvements continue to be made, existing methods suffice for effective application of PEA to support quantitative analyses of the risk of chemically induced toxicity that play an increasing role in key decision-making objectives involving health protection, triage, civil justice, and criminal justice. Different types of information required to apply PEA to these different decision contexts are identified, and specific PEA methods are highlighted that are best suited to exposure assessment in these separate contexts. PMID:19223660

  2. Develop Probabilistic Tsunami Design Maps for ASCE 7

    NASA Astrophysics Data System (ADS)

    Wei, Y.; Thio, H. K.; Chock, G.; Titov, V. V.

    2014-12-01

    A national standard for engineering design for tsunami effects has not existed before, and this significant risk is mostly ignored in engineering design. The American Society of Civil Engineers (ASCE) 7 Tsunami Loads and Effects Subcommittee is completing a chapter for the 2016 edition of the ASCE/SEI 7 Standard. Chapter 6, Tsunami Loads and Effects, would become the first national tsunami design provisions. These provisions will apply to essential facilities and critical infrastructure, and the standard will apply to designs as part of tsunami preparedness. The provisions will also be significant as a post-tsunami recovery tool for planning and evaluating reconstruction. Maps of 2,500-year probabilistic tsunami inundation for Alaska, Washington, Oregon, California, and Hawaii need to be developed for use with the ASCE design provisions. These new tsunami design zone maps will define the coastal zones where structures of greater importance would be designed for tsunami resistance and community resilience. The NOAA Center for Tsunami Research (NCTR) has developed 75 tsunami inundation models as part of the operational tsunami model forecast capability for the U.S. coastline. NCTR, UW, and URS are collaborating with ASCE to develop the 2,500-year tsunami design maps for the Pacific states using these tsunami models. This ensures the probabilistic criteria are established in ASCE's tsunami design maps. URS established a Probabilistic Tsunami Hazard Assessment approach consisting of a large number of tsunami scenarios that include both epistemic uncertainty and aleatory variability (Thio et al., 2010). Their study provides 2,500-year offshore tsunami heights at the 100-m water depth, along with the disaggregated earthquake sources. NOAA's tsunami models are used to identify a group of sources that produce these 2,500-year tsunami heights. The tsunami inundation limits and runup heights derived from these sources establish the tsunami design map for the study site. ASCE's Energy Grade Line Analysis then uses these modeling constraints to derive hydrodynamic forces for structures within the tsunami design zone. The probabilistic tsunami design maps will be validated by comparison to state inundation maps under the coordination of the National Tsunami Hazard Mitigation Program.

  3. The procedure and results of calculations of the equilibrium isotopic composition of a demonstration subcritical molten salt reactor

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nevinitsa, V. A., E-mail: Neviniza-VA@nrcki.ru; Dudnikov, A. A.; Blandinskiy, V. Yu.

    2015-12-15

    A subcritical molten salt reactor with an external neutron source is studied computationally as a facility for incineration and transmutation of minor actinides from spent nuclear fuel of reactors of VVER-1000 type and for producing 233U from 232Th. The reactor configuration is chosen, the requirements to be imposed on the external neutron source are formulated, and the equilibrium isotopic composition of heavy nuclides and the key parameters of the fuel cycle are calculated.

  4. Noise source and reactor stability estimation in a boiling water reactor using a multivariate autoregressive model

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kanemoto, S.; Andoh, Y.; Sandoz, S.A.

    1984-10-01

    A method for evaluating reactor stability in boiling water reactors has been developed. The method is based on multivariate autoregressive (M-AR) modeling of steady-state neutron and process noise signals. In this method, two kinds of power spectral densities (PSDs) for the measured neutron signal and the corresponding noise source signal are separately identified by the M-AR modeling. The closed- and open-loop stability parameters are evaluated from these PSDs. The method is applied to actual plant noise data that were measured together with artificial perturbation test data. Stability parameters identified from noise data are compared to those from perturbation test data, and it is shown that both results are in good agreement. In addition to these stability estimations, driving noise sources for the neutron signal are evaluated by the M-AR modeling. Contributions from void, core flow, and pressure noise sources are quantitatively evaluated, and the void noise source is shown to be the most dominant.
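
    The following sketch illustrates the autoregressive idea on a single (scalar) signal rather than the multivariate model used in the paper: fit an AR model to stationary noise and read a decay ratio off the dominant pole. The data are synthetic and the example assumes an oscillatory (complex) dominant pole.

      import numpy as np

      def fit_ar(x, order):
          # Least-squares fit of a univariate AR(order) model: x[t] ~ sum_k a[k] * x[t-1-k]
          X = np.column_stack([x[order - k - 1:len(x) - k - 1] for k in range(order)])
          y = x[order:]
          coeffs, *_ = np.linalg.lstsq(X, y, rcond=None)
          return coeffs

      def decay_ratio(coeffs):
          # Stability index from the dominant (least-damped) pole of the fitted AR model
          poles = np.roots(np.r_[1.0, -coeffs])
          z = poles[np.argmax(np.abs(poles))]
          return np.abs(z) ** (2 * np.pi / abs(np.angle(z)))   # amplitude decay per oscillation

      # Synthetic "neutron noise" from a noise-driven damped oscillator
      rng = np.random.default_rng(0)
      x = np.zeros(5000)
      for t in range(2, len(x)):
          x[t] = 1.6 * x[t - 1] - 0.9 * x[t - 2] + rng.normal()
      print("decay ratio ~", decay_ratio(fit_ar(x, order=8)))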

  5. Isotopic composition and neutronics of the Okelobondo natural reactor

    NASA Astrophysics Data System (ADS)

    Palenik, Christopher Samuel

    The Oklo-Okelobondo and Bangombe uranium deposits, in Gabon, Africa host Earth's only known natural nuclear fission reactors. These 2 billion year old reactors represent a unique opportunity to study used nuclear fuel over geologic periods of time. The reactors in these deposits have been studied as a means by which to constrain the source term of fission product concentrations produced during reactor operation. The source term depends on the neutronic parameters, which include reactor operation duration, neutron flux and the neutron energy spectrum. Reactor operation has been modeled using a point-source computer simulation (Oak Ridge Isotope Generation and Depletion, ORIGEN, code) for a light water reactor. Model results have been constrained using secondary ionization mass spectroscopy (SIMS) isotopic measurements of the fission products Nd and Te, as well as U in uraninite from samples collected in the Okelobondo reactor zone. Based upon the constraints on the operating conditions, the pre-reactor concentrations of Nd (150 ppm +/- 75 ppm) and Te (<1 ppm) in uraninite were estimated. Related to the burnup measured in Okelobondo samples (0.7 to 13.8 GWd/MTU), the final fission product inventories of Nd (90 to 1200 ppm) and Te (10 to 110 ppm) were calculated. By the same means, the ranges of all other fission products and actinides produced during reactor operation were calculated as a function of burnup. These results provide a source term against which the present elemental and decay abundances at the fission reactor can be compared. Furthermore, they provide new insights into the extent to which a "fossil" nuclear reactor can be characterized on the basis of its isotopic signatures. In addition, results from the study of two other natural systems related to the radionuclide and fission product transport are included. A detailed mineralogical characterization of the uranyl mineralogy at the Bangombe uranium deposit in Gabon, Africa was completed to improve geochemical models of the solubility-limiting phase. A study of the competing effects of radiation damage and annealing in a U-bearing crystal of zircon shows that low temperature annealing in actinide-bearing phases is significant in the annealing of radiation damage.

  6. 10 CFR 2.103 - Action on applications for byproduct, source, special nuclear material, facility and operator...

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ..., Office of Nuclear Reactor Regulation, Director, Office of New Reactors, Director, Office of Federal and..., Office of Nuclear Reactor Regulation, Director, Office of New Reactors, Director, Office of Nuclear... of this chapter, see § 2.106(d). (b) If the Director, Office of Nuclear Reactor Regulation, Director...

  7. 10 CFR 2.103 - Action on applications for byproduct, source, special nuclear material, facility and operator...

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ..., Office of Nuclear Reactor Regulation, Director, Office of New Reactors, Director, Office of Federal and..., Office of Nuclear Reactor Regulation, Director, Office of New Reactors, Director, Office of Nuclear... of this chapter, see § 2.106(d). (b) If the Director, Office of Nuclear Reactor Regulation, Director...

  8. Topics in nuclear power

    NASA Astrophysics Data System (ADS)

    Budnitz, Robert J.

    2015-03-01

    The 101 nuclear plants operating in the US today are far safer than they were 20-30 years ago. For example, there's been about a 100-fold reduction in the occurrence of "significant events" since the late 1970s. Although the youngest of currently operating US plants was designed in the 1970s, all have been significantly modified over the years. Key contributors to the safety gains are a vigilant culture, much improved equipment reliability, greatly improved training of operators and maintenance workers, worldwide sharing of experience, and the effective use of probabilistic risk assessment. Several manufacturers have submitted high quality new designs for large reactors to the U.S. Nuclear Regulatory Commission (NRC) for design approval, and several companies are vigorously working on designs for smaller, modular reactors. Although the Fukushima reactor accident in March 2011 in Japan has been an almost unmitigated disaster for the local population due to their being displaced from their homes and workplaces and also due to the land contamination, its "lessons learned" have been important for the broader nuclear industry, and will surely result in safer nuclear plants worldwide - indeed, have already done so, with more safety improvements to come.

  9. Topics in nuclear power

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Budnitz, Robert J.

    The 101 nuclear plants operating in the US today are far safer than they were 20-30 years ago. For example, there's been about a 100-fold reduction in the occurrence of 'significant events' since the late 1970s. Although the youngest of currently operating US plants was designed in the 1970s, all have been significantly modified over the years. Key contributors to the safety gains are a vigilant culture, much improved equipment reliability, greatly improved training of operators and maintenance workers, worldwide sharing of experience, and the effective use of probabilistic risk assessment. Several manufacturers have submitted high quality new designs for large reactors to the U.S. Nuclear Regulatory Commission (NRC) for design approval, and several companies are vigorously working on designs for smaller, modular reactors. Although the Fukushima reactor accident in March 2011 in Japan has been an almost unmitigated disaster for the local population due to their being displaced from their homes and workplaces and also due to the land contamination, its 'lessons learned' have been important for the broader nuclear industry, and will surely result in safer nuclear plants worldwide - indeed, have already done so, with more safety improvements to come.

  10. A Figure of Merit: Quantifying the Probability of a Nuclear Reactor Accident.

    PubMed

    Wellock, Thomas R

    In recent decades, probabilistic risk assessment (PRA) has become an essential tool in risk analysis and management in many industries and government agencies. The origins of PRA date to the 1975 publication of the U.S. Nuclear Regulatory Commission's (NRC) Reactor Safety Study led by MIT professor Norman Rasmussen. The "Rasmussen Report" inspired considerable political and scholarly disputes over the motives behind it and the value of its methods and numerical estimates of risk. The Report's controversies have overshadowed the deeper technical origins of risk assessment. Nuclear experts had long sought to express risk in a "figure of merit" to verify the safety of weapons and, later, civilian reactors. By the 1970s, technical advances in PRA gave the methodology the potential to serve political ends, too. The Report, it was hoped, would prove nuclear power's safety to a growing chorus of critics. Subsequent attacks on the Report's methods and numerical estimates damaged the NRC's credibility. PRA's fortunes revived when the 1979 Three Mile Island accident demonstrated PRA's potential for improving the safety of nuclear power and other technical systems. Nevertheless, the Report's controversies endure in mistrust of PRA and its experts.

  11. Expert judgments about RD&D and the future of nuclear energy.

    PubMed

    Anadón, Laura D; Bosetti, Valentina; Bunn, Matthew; Catenacci, Michela; Lee, Audrey

    2012-11-06

    Probabilistic estimates of the cost and performance of future nuclear energy systems under different scenarios of government research, development, and demonstration (RD&D) spending were obtained from 30 U.S. and 30 European nuclear technology experts. We used a novel elicitation approach which combined individual and group elicitation. With no change from current RD&D funding levels, experts on average expected current (Gen. III/III+) designs to be somewhat more expensive in 2030 than they were in 2010, and they expected the next generation of designs (Gen. IV) to be more expensive still as of 2030. Projected costs of proposed small modular reactors (SMRs) were similar to those of Gen. IV systems. The experts almost unanimously recommended large increases in government support for nuclear RD&D (generally 2-3 times current spending). The majority expected that such RD&D would have only a modest effect on cost, but would improve performance in other areas, such as safety, waste management, and uranium resource utilization. The U.S. and E.U. experts were in relative agreement regarding how government RD&D funds should be allocated, placing particular focus on very high temperature reactors, sodium-cooled fast reactors, fuels and materials, and fuel cycle technologies.

  12. "What--me worry?" "Why so serious?": a personal view on the Fukushima nuclear reactor accidents.

    PubMed

    Gallucci, Raymond

    2012-09-01

    Infrequently, it seems that a significant accident precursor or, worse, an actual accident, involving a commercial nuclear power reactor occurs to remind us of the need to reexamine the safety of this important electrical power technology from a risk perspective. Twenty-five years since the major core damage accident at Chernobyl in the Ukraine, the Fukushima reactor complex in Japan experienced multiple core damages as a result of an earthquake-induced tsunami beyond either the earthquake or tsunami design basis for the site. Although the tsunami itself killed tens of thousands of people and left the area devastated and virtually uninhabitable, much concern still arose from the potential radioactive releases from the damaged reactors, even though there was little population left in the area to be affected. As a lifelong probabilistic safety analyst in nuclear engineering, even I must admit to a recurrence of the doubt regarding nuclear power safety after Fukushima that I had experienced after Three Mile Island and Chernobyl. This article is my attempt to "recover" my personal perspective on acceptable risk by examining both the domestic and worldwide history of commercial nuclear power plant accidents and attempting to quantify the risk in terms of the frequency of core damage that one might glean from a review of operational history. © 2012 Society for Risk Analysis.

  13. Reactor for making uniform capsules

    NASA Technical Reports Server (NTRS)

    Wang, Taylor G. (Inventor); Anikumar, Amrutur V. (Inventor); Lacik, Igor (Inventor)

    1999-01-01

    The present invention provides a novel reactor for making capsules with uniform membrane. The reactor includes a source for providing a continuous flow of a first liquid through the reactor; a source for delivering a steady stream of drops of a second liquid to the entrance of the reactor; a main tube portion having at least one loop, and an exit opening, where the exit opening is at a height substantially equal to the entrance. In addition, a method for using the novel reactor is provided. This method involves providing a continuous stream of a first liquid; introducing uniformly-sized drops of the second liquid into the stream of the first liquid; allowing the drops to react in the stream for a pre-determined period of time; and collecting the capsules.

  14. Global Infrasound Association Based on Probabilistic Clutter Categorization

    NASA Astrophysics Data System (ADS)

    Arora, N. S.; Mialle, P.

    2015-12-01

    The IDC collects waveforms from a global network of infrasound sensors maintained by the IMS, and automatically detects signal onsets and associates them to form event hypotheses. However, a large number of signal onsets are due to local clutter sources such as microbaroms (from standing waves in the oceans), waterfalls, dams, gas flares, surf (ocean breaking waves), etc. These sources are either too diffuse or too local to form events. Worse still, the repetitive nature of this clutter leads to a large number of false event hypotheses due to the random matching of clutter at multiple stations. Previous studies, for example [1], have worked on categorization of clutter using long term trends on detection azimuth, frequency, and amplitude at each station. In this work we continue the same line of reasoning to build a probabilistic model of clutter that is used as part of NET-VISA [2], a Bayesian approach to network processing. The resulting model is a fusion of seismic, hydro-acoustic and infrasound processing built on a unified probabilistic framework. Notes: The attached figure shows all the unassociated arrivals detected at IMS station I09BR for 2012 distributed by azimuth and center frequency. (The title displays the bandwidth of the kernel density estimate along the azimuth and frequency dimensions.) This plot shows multiple micro-barom sources as well as other sources of infrasound clutter. A diverse clutter-field such as this one is quite common for most IMS infrasound stations, and it highlights the dangers of forming events without due consideration of this source of noise. References: [1] Infrasound categorization: Towards a statistics-based approach. J. Vergoz, P. Gaillard, A. Le Pichon, N. Brachet, and L. Ceranna. ITW 2011 [2] NET-VISA: Network Processing Vertically Integrated Seismic Analysis. N. S. Arora, S. Russell, and E. Sudderth. BSSA 2013.
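
    A simple stand-in for the kind of station clutter model mentioned in the notes is a Gaussian kernel density over detection azimuth (with wrap-around) and log center frequency, evaluated on a grid; the sketch below uses synthetic detections and assumed bandwidths, not IMS data.

      import numpy as np

      def azimuth_frequency_density(az, freq, grid_az, grid_f, h_az=10.0, h_f=0.2):
          # Kernel density over azimuth (deg, circular) and log10 center frequency
          logf = np.log10(freq)
          dens = np.zeros((len(grid_az), len(grid_f)))
          for i, a in enumerate(grid_az):
              daz = (az - a + 180.0) % 360.0 - 180.0        # wrap-around azimuth difference
              for j, f in enumerate(grid_f):
                  w = np.exp(-0.5 * ((daz / h_az) ** 2 + ((logf - f) / h_f) ** 2))
                  dens[i, j] = w.sum()
          return dens / dens.sum()

      # Synthetic unassociated detections: two microbarom-like clutter lobes
      rng = np.random.default_rng(0)
      az = np.r_[rng.normal(60, 5, 400), rng.normal(250, 8, 300)] % 360
      freq = np.r_[rng.lognormal(np.log(0.2), 0.2, 400), rng.lognormal(np.log(0.3), 0.2, 300)]
      dens = azimuth_frequency_density(az, freq, np.arange(0, 360, 5), np.linspace(-1.5, 1.0, 26))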

  15. Fluidizing a mixture of particulate coal and char

    DOEpatents

    Green, Norman W.

    1979-08-07

    Method of mixing particulate materials comprising contacting a primary source and a secondary source thereof whereby resulting mixture ensues; preferably at least one of the two sources has enough motion to insure good mixing and the particulate materials may be heat treated if desired. Apparatus for such mixing comprising an inlet for a primary source, a reactor communicating therewith, a feeding means for supplying a secondary source to the reactor, and an inlet for the secondary source. Feeding means is preferably adapted to supply fluidized materials.

  16. Mixing method and apparatus

    DOEpatents

    Green, Norman W.

    1982-06-15

    Method of mixing particulate materials comprising contacting a primary source and a secondary source thereof whereby resulting mixture ensues; preferably at least one of the two sources has enough motion to insure good mixing and the particulate materials may be heat treated if desired. Apparatus for such mixing comprising an inlet for a primary source, a reactor communicating therewith, a feeding means for supplying a secondary source to the reactor, and an inlet for the secondary source. Feeding means is preferably adapted to supply fluidized materials.

  17. SURROGATE MODEL DEVELOPMENT AND VALIDATION FOR RELIABILITY ANALYSIS OF REACTOR PRESSURE VESSELS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hoffman, William M.; Riley, Matthew E.; Spencer, Benjamin W.

    In nuclear light water reactors (LWRs), the reactor coolant, core and shroud are contained within a massive, thick walled steel vessel known as a reactor pressure vessel (RPV). Given the tremendous size of these structures, RPVs typically contain a large population of pre-existing flaws introduced in the manufacturing process. After many years of operation, irradiation-induced embrittlement makes these vessels increasingly susceptible to fracture initiation at the locations of the pre-existing flaws. Because of the uncertainty in the loading conditions, flaw characteristics and material properties, probabilistic methods are widely accepted and used in assessing RPV integrity. The Fracture Analysis of Vessels – Oak Ridge (FAVOR) computer program developed by researchers at Oak Ridge National Laboratory is widely used for this purpose. This program can be used in order to perform deterministic and probabilistic risk-informed analyses of the structural integrity of an RPV subjected to a range of thermal-hydraulic events. FAVOR uses a one-dimensional representation of the global response of the RPV, which is appropriate for the beltline region, which experiences the most embrittlement, and employs an influence coefficient technique to rapidly compute stress intensity factors for axis-aligned surface-breaking flaws. The Grizzly code is currently under development at Idaho National Laboratory (INL) to be used as a general multiphysics simulation tool to study a variety of degradation mechanisms in nuclear power plant components. The first application of Grizzly has been to study fracture in embrittled RPVs. Grizzly can be used to model the thermo-mechanical response of an RPV under transient conditions observed in a pressurized thermal shock (PTS) scenario. The global response of the vessel provides boundary conditions for local 3D models of the material in the vicinity of a flaw. Fracture domain integrals are computed to obtain stress intensity factors, which can in turn be used to assess whether a fracture would initiate at a pre-existing flaw. To use Grizzly for probabilistic analysis, it is necessary to have a way to rapidly evaluate stress intensity factors. To accomplish this goal, a reduced order model (ROM) has been developed to efficiently represent the behavior of a detailed 3D Grizzly model used to calculate fracture parameters. This approach uses the stress intensity factor influence coefficient method that has been used with great success in FAVOR. Instead of interpolating between tabulated solutions, as FAVOR does, the ROM approach uses a response surface methodology to compute fracture solutions based on a sampled set of results used to train the ROM. The main advantages of this approach are that the process of generating the training data can be fully automated, and the procedure can be readily used to consider more general flaw configurations. This paper demonstrates the procedure used to generate a ROM to rapidly compute stress intensity factors for axis-aligned flaws. The results from this procedure are in good agreement with those produced using the traditional influence coefficient interpolation procedure, which gives confidence in this method. This paves the way for applying this procedure for more general flaw configurations.
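
    The response-surface idea can be sketched in a few lines: fit a quadratic surface mapping flaw parameters to a stress intensity factor from a sampled set of training runs. In the sketch below the training data and the choice of inputs (flaw depth, aspect ratio, transient time) are synthetic assumptions; in practice the training set would come from detailed 3D evaluations such as those described above.

      import numpy as np

      def quad_features(X):
          # Quadratic response-surface basis: 1, x_i, and all second-order products x_i*x_j
          n, d = X.shape
          cols = [np.ones(n)]
          cols += [X[:, i] for i in range(d)]
          cols += [X[:, i] * X[:, j] for i in range(d) for j in range(i, d)]
          return np.column_stack(cols)

      class StressIntensitySurrogate:
          def fit(self, X_train, k_train):
              A = quad_features(X_train)
              self.beta, *_ = np.linalg.lstsq(A, k_train, rcond=None)
              return self
          def predict(self, X):
              return quad_features(np.atleast_2d(X)) @ self.beta

      # Synthetic stand-in for training runs (inputs: depth, aspect ratio, transient time)
      rng = np.random.default_rng(1)
      X = rng.uniform([2.0, 2.0, 0.0], [25.0, 10.0, 1.0], size=(200, 3))
      k = 5 + 0.8 * X[:, 0] + 0.3 * X[:, 1] * X[:, 2] + rng.normal(0, 0.1, 200)
      rom = StressIntensitySurrogate().fit(X, k)
      print(rom.predict([10.0, 6.0, 0.5]))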

  18. Worst case encoder-decoder policies for a communication system in the presence of an unknown probabilistic jammer

    NASA Astrophysics Data System (ADS)

    Cascio, David M.

    1988-05-01

    States of nature or observed data are often stochastically modelled as Gaussian random variables. At times it is desirable to transmit this information from a source to a destination with minimal distortion. Complicating this objective is the possible presence of an adversary attempting to disrupt this communication. In this report, solutions are provided to a class of minimax and maximin decision problems, which involve the transmission of a Gaussian random variable over a communications channel corrupted by both additive Gaussian noise and probabilistic jamming noise. The jamming noise is termed probabilistic in the sense that with nonzero probability 1-P, the jamming noise is prevented from corrupting the channel. We shall seek to obtain optimal linear encoder-decoder policies which minimize given quadratic distortion measures.

  19. Optimally moderated nuclear fission reactor and fuel source therefor

    DOEpatents

    Ougouag, Abderrafi M [Idaho Falls, ID; Terry, William K [Shelley, ID; Gougar, Hans D [Idaho Falls, ID

    2008-07-22

    An improved nuclear fission reactor of the continuous fueling type involves determining an asymptotic equilibrium state for the nuclear fission reactor and providing the reactor with a moderator-to-fuel ratio that is optimally moderated for the asymptotic equilibrium state of the nuclear fission reactor; the fuel-to-moderator ratio allowing the nuclear fission reactor to be substantially continuously operated in an optimally moderated state.

  20. Alternative approaches to fusion. [reactor design and reactor physics for Tokamak fusion reactors

    NASA Technical Reports Server (NTRS)

    Roth, R. J.

    1976-01-01

    The limitations of the Tokamak fusion reactor concept are discussed and various other fusion reactor concepts are considered that employ the containment of thermonuclear plasmas by magnetic fields (i.e., stellarators). Progress made in the containment of plasmas in toroidal devices is reported. Reactor design concepts are illustrated. The possibility of using fusion reactors as a power source in interplanetary space travel and electric power plants is briefly examined.

  1. Philosophy of ATHEANA

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bley, D.C.; Cooper, S.E.; Forester, J.A.

    ATHEANA, a second-generation Human Reliability Analysis (HRA) method, integrates advances in psychology with engineering, human factors, and Probabilistic Risk Analysis (PRA) disciplines to provide an HRA quantification process and PRA modeling interface that can accommodate and represent human performance in real nuclear power plant events. The method uses the characteristics of serious accidents identified through retrospective analysis of serious operational events to set priorities in a search process for significant human failure events, unsafe acts, and error-forcing context (unfavorable plant conditions combined with negative performance-shaping factors). ATHEANA has been tested in a demonstration project at an operating pressurized water reactor.

  2. Source Detection with Bayesian Inference on ROSAT All-Sky Survey Data Sample

    NASA Astrophysics Data System (ADS)

    Guglielmetti, F.; Voges, W.; Fischer, R.; Boese, G.; Dose, V.

    2004-07-01

    We employ Bayesian inference for the joint estimation of sources and background on ROSAT All-Sky Survey (RASS) data. The probabilistic method allows for detection improvement of faint extended celestial sources compared to the Standard Analysis Software System (SASS). Background maps were estimated in a single step together with the detection of sources without pixel censoring. Consistent uncertainties of background and sources are provided. The source probability is evaluated for single pixels as well as for pixel domains to enhance source detection of weak and extended sources.
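
    A toy version of the pixel-level source probability: compare a background-only Poisson model against a background-plus-source model for the observed counts. The rates and prior below are invented for illustration; the actual method estimates background and sources jointly over the image.

      from math import exp, factorial

      def source_probability(counts, b, s, prior=0.01):
          # P(source present in a pixel | observed counts), background rate b, source rate s
          def pois(k, lam):
              return exp(-lam) * lam ** k / factorial(k)
          like_bg = pois(counts, b)
          like_src = pois(counts, b + s)
          return prior * like_src / (prior * like_src + (1 - prior) * like_bg)

      print(source_probability(counts=9, b=3.0, s=4.0))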

  3. A Methodology for Modeling Nuclear Power Plant Passive Component Aging in Probabilistic Risk Assessment under the Impact of Operating Conditions, Surveillance and Maintenance Activities

    NASA Astrophysics Data System (ADS)

    Guler Yigitoglu, Askin

    In the context of long operation of nuclear power plants (NPPs) (i.e., 60-80 years, and beyond), investigation of the aging of passive systems, structures and components (SSCs) is important to assess safety margins and to decide on reactor life extension as indicated within the U.S. Department of Energy (DOE) Light Water Reactor Sustainability (LWRS) Program. In the traditional probabilistic risk assessment (PRA) methodology, evaluating the potential significance of aging of passive SSCs on plant risk is challenging. Although passive SSC failure rates can be added as initiating event frequencies or basic event failure rates in the traditional event-tree/fault-tree methodology, these failure rates are generally based on generic plant failure data, which means that the true state of a specific plant with respect to aging effects is not reflected in a realistic manner. Dynamic PRA methodologies have gained attention recently due to their capability to account for the plant state and thus address the difficulties in the traditional PRA modeling of aging effects of passive components using physics-based models (and also in the modeling of digital instrumentation and control systems). Physics-based models can capture the impact of complex aging processes (e.g., fatigue, stress corrosion cracking, flow-accelerated corrosion, etc.) on SSCs and can be utilized to estimate passive SSC failure rates using realistic NPP data from reactor simulation, as well as considering effects of surveillance and maintenance activities. The objectives of this dissertation are twofold: the development of a methodology for the incorporation of aging modeling of passive SSCs into a reactor simulation environment to provide a framework for evaluation of their risk contribution in both the dynamic and traditional PRA; and the demonstration of the methodology through its application to pressurizer surge line pipe welds and steam generator tubes in commercial nuclear power plants. In the proposed methodology, a multi-state physics-based model is selected to represent the aging process. The model is modified via a sojourn-time approach to reflect the operational and maintenance history dependence of the transition rates. Thermal-hydraulic parameters of the model are calculated via the reactor simulation environment, and uncertainties associated with both parameters and the models are assessed via a two-loop Monte Carlo approach (Latin hypercube sampling) to propagate input probability distributions through the physical model. The effort documented in this thesis towards this overall objective consists of: i) defining a process for selecting critical passive components and related aging mechanisms, ii) aging model selection, iii) calculating the probability that aging would cause the component to fail, iv) uncertainty/sensitivity analyses, v) procedure development for modifying an existing PRA to accommodate consideration of passive component failures, and vi) including the calculated failure probability in the modified PRA. The proposed methodology is applied to pressurizer surge line pipe weld aging and steam generator tube degradation in pressurized water reactors.
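
    The two-loop sampling scheme can be sketched as follows: an outer Latin-hypercube loop over epistemic parameters and an inner loop over aleatory variability, with an illustrative (not physical) limit state standing in for the multi-state aging model; all ranges and distributions below are assumptions.

      import numpy as np

      def latin_hypercube(n, d, rng):
          # One stratified sample per row and dimension, then decouple the columns
          u = (rng.random((n, d)) + np.arange(n)[:, None]) / n
          for j in range(d):
              u[:, j] = u[rng.permutation(n), j]
          return u

      def failure_probability(n_outer=200, n_inner=500, seed=0):
          rng = np.random.default_rng(seed)
          outer = latin_hypercube(n_outer, 2, rng)          # epistemic parameters
          C = 1e-3 + 4e-3 * outer[:, 0]                     # toy growth-law coefficient
          a_crit = 3.0 + 3.0 * outer[:, 1]                  # toy critical flaw depth (mm)
          pf = np.empty(n_outer)
          for i in range(n_outer):
              a0 = rng.lognormal(mean=0.5, sigma=0.4, size=n_inner)   # initial flaw depth
              cycles = rng.uniform(1e4, 4e4, size=n_inner)            # aleatory demand
              a_end = a0 + C[i] * cycles * 1e-2                       # toy growth over the period
              pf[i] = np.mean(a_end > a_crit[i])
          return pf                                          # distribution of failure probability

      pf = failure_probability()
      print("mean Pf =", pf.mean(), " 95th percentile =", np.quantile(pf, 0.95))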

  4. Probabilistic seismic hazard analysis for a nuclear power plant site in southeast Brazil

    NASA Astrophysics Data System (ADS)

    de Almeida, Andréia Abreu Diniz; Assumpção, Marcelo; Bommer, Julian J.; Drouet, Stéphane; Riccomini, Claudio; Prates, Carlos L. M.

    2018-05-01

    A site-specific probabilistic seismic hazard analysis (PSHA) has been performed for the only nuclear power plant site in Brazil, located 130 km southwest of Rio de Janeiro at Angra dos Reis. Logic trees were developed for both the seismic source characterisation and ground-motion characterisation models, in both cases seeking to capture the appreciable ranges of epistemic uncertainty with relatively few branches. This logic-tree structure allowed the hazard calculations to be performed efficiently while obtaining results that reflect the inevitable uncertainty in long-term seismic hazard assessment in this tectonically stable region. An innovative feature of the study is an additional seismic source zone added to capture the potential contributions of characteristic earthquakes associated with geological faults in the region surrounding the coastal site.

  5. Modeling Uncertainties in EEG Microstates: Analysis of Real and Imagined Motor Movements Using Probabilistic Clustering-Driven Training of Probabilistic Neural Networks.

    PubMed

    Dinov, Martin; Leech, Robert

    2017-01-01

    Part of the process of EEG microstate estimation involves clustering EEG channel data at the global field power (GFP) maxima, very commonly using a modified K-means approach. Clustering has also been done deterministically, despite there being uncertainties in multiple stages of the microstate analysis, including the GFP peak definition, the clustering itself and in the post-clustering assignment of microstates back onto the EEG timecourse of interest. We perform a fully probabilistic microstate clustering and labeling, to account for these sources of uncertainty using the closest probabilistic analog to KM called Fuzzy C-means (FCM). We train softmax multi-layer perceptrons (MLPs) using the KM and FCM-inferred cluster assignments as target labels, to then allow for probabilistic labeling of the full EEG data instead of the usual correlation-based deterministic microstate label assignment typically used. We assess the merits of the probabilistic analysis vs. the deterministic approaches in EEG data recorded while participants perform real or imagined motor movements from a publicly available data set of 109 subjects. Though FCM group template maps that are almost topographically identical to KM were found, there is considerable uncertainty in the subsequent assignment of microstate labels. In general, imagined motor movements are less predictable on a time point-by-time point basis, possibly reflecting the more exploratory nature of the brain state during imagined, compared to during real motor movements. We find that some relationships may be more evident using FCM than using KM and propose that future microstate analysis should preferably be performed probabilistically rather than deterministically, especially in situations such as with brain computer interfaces, where both training and applying models of microstates need to account for uncertainty. Probabilistic neural network-driven microstate assignment has a number of advantages that we have discussed, which are likely to be further developed and exploited in future studies. In conclusion, probabilistic clustering and a probabilistic neural network-driven approach to microstate analysis is likely to better model and reveal details and the variability hidden in current deterministic and binarized microstate assignment and analyses.
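
    A compact Fuzzy C-means sketch (with synthetic data standing in for EEG topographies at GFP peaks) shows how soft memberships replace the hard assignments of K-means; the cluster count and fuzzifier value are illustrative choices.

      import numpy as np

      def fuzzy_c_means(X, n_clusters=4, m=2.0, n_iter=100, seed=0):
          # Returns soft memberships U (n_samples x n_clusters) and centroids
          # (the analogue of microstate template maps).
          rng = np.random.default_rng(seed)
          U = rng.dirichlet(np.ones(n_clusters), size=len(X))   # random soft assignment
          for _ in range(n_iter):
              W = U ** m
              centroids = (W.T @ X) / W.sum(axis=0)[:, None]
              d = np.linalg.norm(X[:, None, :] - centroids[None, :, :], axis=2) + 1e-12
              U = 1.0 / np.sum((d[:, :, None] / d[:, None, :]) ** (2 / (m - 1)), axis=2)
          return U, centroids

      # Synthetic "topographies": four clusters of 8-channel patterns
      rng = np.random.default_rng(1)
      X = np.vstack([rng.normal(mu, 0.3, size=(100, 8)) for mu in (-1.0, 0.0, 1.0, 2.0)])
      U, maps = fuzzy_c_means(X, n_clusters=4)
      print(U[:3].round(2))          # probabilistic microstate memberships for the first samples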

  6. Modeling Uncertainties in EEG Microstates: Analysis of Real and Imagined Motor Movements Using Probabilistic Clustering-Driven Training of Probabilistic Neural Networks

    PubMed Central

    Dinov, Martin; Leech, Robert

    2017-01-01

    Part of the process of EEG microstate estimation involves clustering EEG channel data at the global field power (GFP) maxima, very commonly using a modified K-means approach. Clustering has also been done deterministically, despite there being uncertainties in multiple stages of the microstate analysis, including the GFP peak definition, the clustering itself and in the post-clustering assignment of microstates back onto the EEG timecourse of interest. We perform a fully probabilistic microstate clustering and labeling, to account for these sources of uncertainty using the closest probabilistic analog to KM called Fuzzy C-means (FCM). We train softmax multi-layer perceptrons (MLPs) using the KM and FCM-inferred cluster assignments as target labels, to then allow for probabilistic labeling of the full EEG data instead of the usual correlation-based deterministic microstate label assignment typically used. We assess the merits of the probabilistic analysis vs. the deterministic approaches in EEG data recorded while participants perform real or imagined motor movements from a publicly available data set of 109 subjects. Though FCM group template maps that are almost topographically identical to KM were found, there is considerable uncertainty in the subsequent assignment of microstate labels. In general, imagined motor movements are less predictable on a time point-by-time point basis, possibly reflecting the more exploratory nature of the brain state during imagined, compared to during real motor movements. We find that some relationships may be more evident using FCM than using KM and propose that future microstate analysis should preferably be performed probabilistically rather than deterministically, especially in situations such as with brain computer interfaces, where both training and applying models of microstates need to account for uncertainty. Probabilistic neural network-driven microstate assignment has a number of advantages that we have discussed, which are likely to be further developed and exploited in future studies. In conclusion, probabilistic clustering and a probabilistic neural network-driven approach to microstate analysis is likely to better model and reveal details and the variability hidden in current deterministic and binarized microstate assignment and analyses. PMID:29163110

  7. Development of optimization-based probabilistic earthquake scenarios for the city of Tehran

    NASA Astrophysics Data System (ADS)

    Zolfaghari, M. R.; Peyghaleh, E.

    2016-01-01

    This paper presents the methodology and a practical example for the application of an optimization process to select earthquake scenarios that best represent the probabilistic earthquake hazard in a given region. The method is based on simulation of a large dataset of potential earthquakes representing the long-term seismotectonic characteristics of that region. The simulation process uses Monte-Carlo simulation and regional seismogenic source parameters to generate a synthetic earthquake catalogue consisting of a large number of earthquakes, each characterized by magnitude, location, focal depth and fault characteristics. Such a catalogue provides full distributions of events in time, space and size; however, it demands large computation power when used for risk assessment, particularly when other sources of uncertainty are involved in the process. To reduce the number of selected earthquake scenarios, a mixed-integer linear program formulation is developed in this study. This approach results in a reduced set of optimization-based probabilistic earthquake scenarios while maintaining the shape of the hazard curves and the full probabilistic picture, by minimizing the error between the hazard curves driven by the full and reduced sets of synthetic earthquake scenarios. To test the model, the regional seismotectonic and seismogenic characteristics of northern Iran are used to simulate 10,000 years' worth of events consisting of some 84,000 earthquakes. The optimization model is then run multiple times with various input data, taking the probabilistic seismic hazard for the city of Tehran as the main constraint. The sensitivity of the selected scenarios to the user-specified site/return-period error weight is also assessed. The methodology could reduce the run time of full probabilistic earthquake studies such as seismic hazard and risk assessment: the reduced set is representative of the contributions of all possible earthquakes but requires far less computation power. The authors have used this approach for risk assessment aimed at identifying the effectiveness and profitability of risk mitigation measures, using an optimization model for resource allocation. Based on the error-computation trade-off, 62 earthquake scenarios are chosen for this purpose.
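
    The paper formulates scenario selection as a mixed-integer linear program; the sketch below substitutes a much simpler greedy heuristic for that MILP, only to illustrate the idea of matching a reduced-set hazard curve to the full-catalogue curve. All inputs are synthetic and the total-rate rescaling is an assumed, simplistic weighting rule.

      import numpy as np

      def hazard_curve(rates, gm, levels):
          # Annual exceedance rate at each ground-motion level
          return np.array([(rates * (gm >= x)).sum() for x in levels])

      def greedy_reduce(rates, gm, levels, k):
          # Greedy stand-in for the MILP: pick k scenarios and rescale their rates
          # so the reduced hazard curve tracks the full one.
          target = hazard_curve(rates, gm, levels)
          chosen = []
          for _ in range(k):
              best, best_err = None, np.inf
              for s in range(len(rates)):
                  if s in chosen:
                      continue
                  trial = chosen + [s]
                  scale = rates.sum() / rates[trial].sum()
                  err = np.sum((hazard_curve(rates[trial] * scale, gm[trial], levels) - target) ** 2)
                  if err < best_err:
                      best, best_err = s, err
              chosen.append(best)
          return chosen

      rng = np.random.default_rng(0)
      rates = rng.uniform(1e-4, 1e-3, 500)       # synthetic scenario annual rates
      gm = rng.lognormal(-1.0, 0.6, 500)         # synthetic ground motion at the site
      levels = np.linspace(0.05, 1.5, 20)
      subset = greedy_reduce(rates, gm, levels, k=20)
      print(len(subset), "scenarios selected")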

  8. Stability metrics for multi-source biomedical data based on simplicial projections from probability distribution distances.

    PubMed

    Sáez, Carlos; Robles, Montserrat; García-Gómez, Juan M

    2017-02-01

    Biomedical data may be composed of individuals generated from distinct, meaningful sources. Due to possible contextual biases in the processes that generate data, there may exist an undesirable and unexpected variability among the probability distribution functions (PDFs) of the source subsamples, which, when uncontrolled, may lead to inaccurate or unreproducible research results. Classical statistical methods may have difficulty uncovering such variability when dealing with multi-modal, multi-type, multi-variate data. This work proposes two metrics for the analysis of stability among multiple data sources, robust to the aforementioned conditions, and defined in the context of data quality assessment. Specifically, a global probabilistic deviation metric and a source probabilistic outlyingness metric are proposed. The first provides a bounded degree of the global multi-source variability, designed as an estimator equivalent to the notion of normalized standard deviation of PDFs. The second provides a bounded degree of the dissimilarity of each source to a latent central distribution. The metrics are based on the projection of a simplex geometrical structure constructed from the Jensen-Shannon distances among the sources' PDFs. The metrics have been evaluated and demonstrated correct behaviour on a simulated benchmark and on real multi-source biomedical data using the UCI Heart Disease data set. The biomedical data quality assessment based on the proposed stability metrics may improve the efficiency and effectiveness of biomedical data exploitation and research.
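
    The building block of both metrics, the matrix of pairwise Jensen-Shannon distances between source PDFs, is easy to sketch. The code below is an illustration only, not the authors' implementation; the toy sources and histogram binning are assumptions.

        # Pairwise Jensen-Shannon distances between the (histogram-estimated) PDFs of several sources.
        import numpy as np
        from scipy.spatial.distance import jensenshannon

        rng = np.random.default_rng(2)
        sources = [rng.normal(loc=mu, scale=1.0, size=1000) for mu in (0.0, 0.2, 1.5)]

        bins = np.linspace(-5, 6, 60)
        pdfs = [np.histogram(s, bins=bins, density=True)[0] + 1e-12 for s in sources]

        n = len(pdfs)
        D = np.zeros((n, n))
        for i in range(n):
            for j in range(n):
                D[i, j] = jensenshannon(pdfs[i], pdfs[j], base=2)  # bounded in [0, 1]
        print(D.round(3))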

  9. Source processes for the probabilistic assessment of tsunami hazards

    USGS Publications Warehouse

    Geist, Eric L.; Lynett, Patrick J.

    2014-01-01

    The importance of tsunami hazard assessment has increased in recent years as a result of catastrophic consequences from events such as the 2004 Indian Ocean and 2011 Japan tsunamis. In particular, probabilistic tsunami hazard assessment (PTHA) methods have been emphasized to include all possible ways a tsunami could be generated. Owing to the scarcity of tsunami observations, a computational approach is used to define the hazard. This approach includes all relevant sources that may cause a tsunami to impact a site and all quantifiable uncertainty. Although only earthquakes were initially considered for PTHA, recent efforts have also attempted to include landslide tsunami sources. Including these sources in PTHA is considerably more difficult because of a general lack of information relating landslide area and volume to mean return period. The large variety of failure types and rheologies associated with submarine landslides translates to considerable uncertainty in determining the efficiency of tsunami generation. Resolution of these and several other outstanding problems is described, which will further advance PTHA methodologies and lead to a more accurate understanding of tsunami hazard.

  10. Development and application of the dynamic system doctor to nuclear reactor probabilistic risk assessments.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kunsman, David Marvin; Aldemir, Tunc; Rutt, Benjamin

    2008-05-01

    This LDRD project has produced a tool that makes probabilistic risk assessments (PRAs) of nuclear reactors - analyses which are very resource intensive - more efficient. PRAs of nuclear reactors are being increasingly relied on by the United States Nuclear Regulatory Commission (U.S.N.R.C.) for licensing decisions for current and advanced reactors. Yet, PRAs are produced much as they were 20 years ago. The work here applied a modern systems analysis technique to the accident progression analysis portion of the PRA; the technique was a system-independent multi-task computer driver routine. Initially, the objective of the work was to fuse the accident progression event tree (APET) portion of a PRA to the dynamic system doctor (DSD) created by Ohio State University. Instead, during the initial efforts, it was found that the DSD could be linked directly to a detailed accident progression phenomenological simulation code - the type on which APET construction and analysis relies, albeit indirectly - and thereby directly create and analyze the APET. The expanded DSD computational architecture and infrastructure that was created during this effort is called ADAPT (Analysis of Dynamic Accident Progression Trees). ADAPT is a system software infrastructure that supports execution and analysis of multiple dynamic event-tree simulations on distributed environments. A simulator abstraction layer was developed, and a generic driver was implemented for executing simulators on a distributed environment. As a demonstration of the use of the methodological tool, ADAPT was applied to quantify the likelihood of competing accident progression pathways occurring for a particular accident scenario in a particular reactor type using MELCOR, an integrated severe accident analysis code developed at Sandia. (ADAPT was intentionally created with flexibility, however, and is not limited to interacting with only one code. With minor coding changes to input files, ADAPT can be linked to other such codes.) The results of this demonstration indicate that the approach can significantly reduce the resources required for Level 2 PRAs. From the phenomenological viewpoint, ADAPT can also treat the associated epistemic and aleatory uncertainties. This methodology can also be used for analyses of other complex systems. Any complex system can be analyzed using ADAPT if the workings of that system can be displayed as an event tree, there is a computer code that simulates how those events could progress, and that simulator code has switches to turn on and off system events, phenomena, etc. Using and applying ADAPT to particular problems is not human independent. While the human resources for the creation and analysis of the accident progression are significantly decreased, knowledgeable analysts are still necessary for a given project to apply ADAPT successfully. This research and development effort has met its original goals and then exceeded them.

  11. Modelling of the anti-neutrino production and spectra from a Magnox reactor

    NASA Astrophysics Data System (ADS)

    Mills, Robert W.; Mountford, David J.; Coleman, Jonathon P.; Metelko, Carl; Murdoch, Matthew; Schnellbach, Yan-Jie

    2018-01-01

    The anti-neutrino source properties of a fission reactor are governed by the production and beta decay of the radionuclides present and the summation of their individual anti-neutrino spectra. The fission product radionuclide production changes during reactor operation and different fissioning species give rise to different product distributions. It is thus possible to determine some details of reactor operation, such as power, from the anti-neutrino emission to confirm safeguards records. Also according to some published calculations, it may be feasible to observe different anti-neutrino spectra depending on the fissile contents of the reactor fuel and thus determine the reactor's fissile material inventory during operation, which could considerably improve safeguards. In mid-2014 the University of Liverpool deployed a prototype anti-neutrino detector at the Wylfa R1 station in Anglesey, United Kingdom based upon plastic scintillator technology developed for the T2K project. The deployment was used to develop the detector electronics and software until the reactor was finally shut down in December 2015. To support the development of this detector technology for reactor monitoring and to understand its capabilities, the National Nuclear Laboratory modelled this graphite moderated and natural uranium fuelled reactor with existing codes used to support Magnox reactor operations and waste management. The 3D multi-physics code PANTHER was used to determine the individual powers of each fuel element (8×6152) during the year and a half period of monitoring based upon reactor records. The WIMS/TRAIL/FISPIN code route was then used to determine the radionuclide inventory of each nuclide on a daily basis in each element. These nuclide inventories were then used with the BTSPEC code to determine the anti-neutrino spectra and source strength using JEFF-3.1.1 data. Finally the anti-neutrino source from the reactor for each day during the year and a half of monitored reactor operation was calculated. The results of the preliminary calculations are shown and limitations in the methods and data discussed.

  12. CONTROL ROD DRIVE

    DOEpatents

    Chapellier, R.A.

    1960-05-24

    A drive mechanism was invented for the control rod of a nuclear reactor. Power is provided by an electric motor and an outside source of fluid pressure is utilized in conjunction with the fluid pressure within the reactor to balance the loadings on the motor. The force exerted on the drive mechanism in the direction of scramming the rod is derived from the reactor fluid pressure so that failure of the outside pressure source will cause prompt scramming of the rod.

  13. Evaluating the Use of Existing Data Sources, Probabilistic Linkage, and Multiple Imputation to Build Population-based Injury Databases Across Phases of Trauma Care

    PubMed Central

    Newgard, Craig; Malveau, Susan; Staudenmayer, Kristan; Wang, N. Ewen; Hsia, Renee Y.; Mann, N. Clay; Holmes, James F.; Kuppermann, Nathan; Haukoos, Jason S.; Bulger, Eileen M.; Dai, Mengtao; Cook, Lawrence J.

    2012-01-01

    Objectives The objective was to evaluate the process of using existing data sources, probabilistic linkage, and multiple imputation to create large population-based injury databases matched to outcomes. Methods This was a retrospective cohort study of injured children and adults transported by 94 emergency medical systems (EMS) agencies to 122 hospitals in seven regions of the western United States over a 36-month period (2006 to 2008). All injured patients evaluated by EMS personnel within specific geographic catchment areas were included, regardless of field disposition or outcome. The authors performed probabilistic linkage of EMS records to four hospital and postdischarge data sources (emergency department [ED] data, patient discharge data, trauma registries, and vital statistics files) and then handled missing values using multiple imputation. The authors compare and evaluate matched records, match rates (proportion of matches among eligible patients), and injury outcomes within and across sites. Results There were 381,719 injured patients evaluated by EMS personnel in the seven regions. Among transported patients, match rates ranged from 14.9% to 87.5% and were directly affected by the availability of hospital data sources and proportion of missing values for key linkage variables. For vital statistics records (1-year mortality), estimated match rates ranged from 88.0% to 98.7%. Use of multiple imputation (compared to complete case analysis) reduced bias for injury outcomes, although sample size, percentage missing, type of variable, and combined-site versus single-site imputation models all affected the resulting estimates and variance. Conclusions This project demonstrates the feasibility and describes the process of constructing population-based injury databases across multiple phases of care using existing data sources and commonly available analytic methods. Attention to key linkage variables and decisions for handling missing values can be used to increase match rates between data sources, minimize bias, and preserve sampling design. PMID:22506952
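
    Record pairs in probabilistic linkage are typically scored with Fellegi-Sunter style agreement weights before a match threshold is applied. The snippet below is a minimal, hypothetical sketch of that idea; the fields, m/u probabilities and decision rule are illustrative assumptions, not the study's actual linkage configuration.

        # Hypothetical Fellegi-Sunter style match weight for one candidate record pair.
        import math

        # m = P(field agrees | records truly match), u = P(field agrees | records do not match)
        fields = {
            "date_of_birth": (0.98, 0.01),
            "sex":           (0.99, 0.50),
            "zip_code":      (0.95, 0.05),
        }

        def match_weight(agreement: dict) -> float:
            """Sum of log2 likelihood ratios over the compared fields."""
            w = 0.0
            for field, (m, u) in fields.items():
                if agreement[field]:
                    w += math.log2(m / u)
                else:
                    w += math.log2((1 - m) / (1 - u))
            return w

        pair = {"date_of_birth": True, "sex": True, "zip_code": False}
        print(f"weight = {match_weight(pair):.2f}")  # link the pair if the weight exceeds a chosen threshold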

  14. Probabilistic Seismic Risk Model for Western Balkans

    NASA Astrophysics Data System (ADS)

    Stejskal, Vladimir; Lorenzo, Francisco; Pousse, Guillaume; Radovanovic, Slavica; Pekevski, Lazo; Dojcinovski, Dragi; Lokin, Petar; Petronijevic, Mira; Sipka, Vesna

    2010-05-01

    A probabilistic seismic risk model for insurance and reinsurance purposes is presented for an area of Western Balkans, covering former Yugoslavia and Albania. This territory experienced many severe earthquakes during past centuries producing significant damage to many population centres in the region. The highest hazard is related to external Dinarides, namely to the collision zone of the Adriatic plate. The model is based on a unified catalogue for the region and a seismic source model consisting of more than 30 zones covering all the three main structural units - Southern Alps, Dinarides and the south-western margin of the Pannonian Basin. A probabilistic methodology using Monte Carlo simulation was applied to generate the hazard component of the model. Unique set of damage functions based on both loss experience and engineering assessments is used to convert the modelled ground motion severity into the monetary loss.

  15. Integrating probabilistic models of perception and interactive neural networks: a historical and tutorial review

    PubMed Central

    McClelland, James L.

    2013-01-01

    This article seeks to establish a rapprochement between explicitly Bayesian models of contextual effects in perception and neural network models of such effects, particularly the connectionist interactive activation (IA) model of perception. The article is in part an historical review and in part a tutorial, reviewing the probabilistic Bayesian approach to understanding perception and how it may be shaped by context, and also reviewing ideas about how such probabilistic computations may be carried out in neural networks, focusing on the role of context in interactive neural networks, in which both bottom-up and top-down signals affect the interpretation of sensory inputs. It is pointed out that connectionist units that use the logistic or softmax activation functions can exactly compute Bayesian posterior probabilities when the bias terms and connection weights affecting such units are set to the logarithms of appropriate probabilistic quantities. Bayesian concepts such as the prior, likelihood, (joint and marginal) posterior, probability matching and maximizing, and calculating vs. sampling from the posterior are all reviewed and linked to neural network computations. Probabilistic and neural network models are explicitly linked to the concept of a probabilistic generative model that describes the relationship between the underlying target of perception (e.g., the word intended by a speaker or other source of sensory stimuli) and the sensory input that reaches the perceiver for use in inferring the underlying target. It is shown how a new version of the IA model called the multinomial interactive activation (MIA) model can sample correctly from the joint posterior of a proposed generative model for perception of letters in words, indicating that interactive processing is fully consistent with principled probabilistic computation. Ways in which these computations might be realized in real neural systems are also considered. PMID:23970868

  16. Integrating probabilistic models of perception and interactive neural networks: a historical and tutorial review.

    PubMed

    McClelland, James L

    2013-01-01

    This article seeks to establish a rapprochement between explicitly Bayesian models of contextual effects in perception and neural network models of such effects, particularly the connectionist interactive activation (IA) model of perception. The article is in part an historical review and in part a tutorial, reviewing the probabilistic Bayesian approach to understanding perception and how it may be shaped by context, and also reviewing ideas about how such probabilistic computations may be carried out in neural networks, focusing on the role of context in interactive neural networks, in which both bottom-up and top-down signals affect the interpretation of sensory inputs. It is pointed out that connectionist units that use the logistic or softmax activation functions can exactly compute Bayesian posterior probabilities when the bias terms and connection weights affecting such units are set to the logarithms of appropriate probabilistic quantities. Bayesian concepts such as the prior, likelihood, (joint and marginal) posterior, probability matching and maximizing, and calculating vs. sampling from the posterior are all reviewed and linked to neural network computations. Probabilistic and neural network models are explicitly linked to the concept of a probabilistic generative model that describes the relationship between the underlying target of perception (e.g., the word intended by a speaker or other source of sensory stimuli) and the sensory input that reaches the perceiver for use in inferring the underlying target. It is shown how a new version of the IA model called the multinomial interactive activation (MIA) model can sample correctly from the joint posterior of a proposed generative model for perception of letters in words, indicating that interactive processing is fully consistent with principled probabilistic computation. Ways in which these computations might be realized in real neural systems are also considered.
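
    The central claim, that a softmax unit computes an exact Bayesian posterior when its biases are log priors and its net input carries log likelihoods, can be checked numerically. This sketch is illustrative only and is not the MIA model itself; the prior and likelihood values are arbitrary.

        # Numerical check: softmax over (log prior + log likelihood) equals the Bayesian posterior.
        import numpy as np

        prior = np.array([0.7, 0.2, 0.1])            # P(h) over three hypotheses
        likelihood = np.array([0.05, 0.60, 0.35])    # P(data | h)

        # Direct Bayes rule
        posterior = prior * likelihood
        posterior /= posterior.sum()

        # Softmax over net inputs = log prior (bias term) + log likelihood (weighted evidence)
        net = np.log(prior) + np.log(likelihood)
        softmax = np.exp(net - net.max())
        softmax /= softmax.sum()

        assert np.allclose(posterior, softmax)
        print(posterior)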

  17. The meta-Gaussian Bayesian Processor of forecasts and associated preliminary experiments

    NASA Astrophysics Data System (ADS)

    Chen, Fajing; Jiao, Meiyan; Chen, Jing

    2013-04-01

    Public weather services are trending toward providing users with probabilistic weather forecasts, in place of traditional deterministic forecasts. Probabilistic forecasting techniques are continually being improved to optimize available forecasting information. The Bayesian Processor of Forecast (BPF), a new statistical method for probabilistic forecast, can transform a deterministic forecast into a probabilistic forecast according to the historical statistical relationship between observations and forecasts generated by that forecasting system. This technique accounts for the typical forecasting performance of a deterministic forecasting system in quantifying the forecast uncertainty. The meta-Gaussian likelihood model is suitable for a variety of stochastic dependence structures with monotone likelihood ratios. The meta-Gaussian BPF adopting this kind of likelihood model can therefore be applied across many fields, including meteorology and hydrology. The Bayes theorem with two continuous random variables and the normal-linear BPF are briefly introduced. The meta-Gaussian BPF for a continuous predictand using a single predictor is then presented and discussed. The performance of the meta-Gaussian BPF is tested in a preliminary experiment. Control forecasts of daily surface temperature at 0000 UTC at Changsha and Wuhan stations are used as the deterministic forecast data. These control forecasts are taken from ensemble predictions with a 96-h lead time generated by the National Meteorological Center of the China Meteorological Administration, the European Centre for Medium-Range Weather Forecasts, and the US National Centers for Environmental Prediction during January 2008. The results of the experiment show that the meta-Gaussian BPF can transform a deterministic control forecast of surface temperature from any one of the three ensemble predictions into a useful probabilistic forecast of surface temperature. These probabilistic forecasts quantify the uncertainty of the control forecast; accordingly, the performance of the probabilistic forecasts differs based on the source of the underlying deterministic control forecasts.
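
    For intuition, the normal-linear special case mentioned above reduces to a conjugate Gaussian update: a climatological prior for the predictand is combined with a likelihood fitted to historical forecast-observation pairs. The sketch below is a hedged illustration; the prior, regression coefficients and forecast value are invented numbers, not the paper's data.

        # Sketch of a normal-linear Bayesian Processor of Forecasts style update.
        # Climatological prior for the predictand (e.g., surface temperature in deg C)
        prior_mean, prior_var = 4.0, 9.0

        # Likelihood model fitted to history: forecast = a + b * observation + Gaussian noise
        a, b, noise_var = 0.5, 0.9, 4.0

        def posterior(forecast):
            """Posterior mean and variance of the observation given one deterministic forecast."""
            # Conjugate update for x ~ N(a + b*w, noise_var), w ~ N(prior_mean, prior_var)
            post_var = 1.0 / (1.0 / prior_var + b**2 / noise_var)
            post_mean = post_var * (prior_mean / prior_var + b * (forecast - a) / noise_var)
            return post_mean, post_var

        m, v = posterior(forecast=-2.0)
        print(f"posterior: N({m:.2f}, {v:.2f})")  # quantifies the uncertainty of the control forecast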

  18. Probabilistic finite elements

    NASA Technical Reports Server (NTRS)

    Belytschko, Ted; Wing, Kam Liu

    1987-01-01

    In the Probabilistic Finite Element Method (PFEM), finite element methods have been efficiently combined with second-order perturbation techniques to provide an effective method for informing the designer of the range of response which is likely in a given problem. The designer must provide as input the statistical character of the input variables, such as yield strength, load magnitude, and Young's modulus, by specifying their mean values and their variances. The output then consists of the mean response and the variance in the response. Thus the designer is given a much broader picture of the predicted performance than with simply a single response curve. These methods are applicable to a wide class of problems, provided that the scale of randomness is not too large and the probability density functions possess decaying tails. By incorporating the computational techniques we have developed in the past 3 years for efficiency, the probabilistic finite element methods are capable of handling large systems with many sources of uncertainties. Sample results for an elastic-plastic ten-bar structure and an elastic-plastic plane continuum with a circular hole subject to cyclic loadings with the yield stress treated as a random field are given.
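
    A first-order second-moment propagation of input means and variances through a response function conveys the flavour of the perturbation idea, albeit for a trivial axial bar rather than the ten-bar structure, and only to first order. The sketch is an assumption-laden illustration, not the PFEM formulation itself.

        # First-order propagation of input means/variances to the mean and variance of a response.
        import numpy as np

        # Random inputs: Young's modulus E [Pa] and load P [N], with means and variances
        mean = np.array([200e9, 1.0e4])
        var = np.array([(10e9) ** 2, (1.5e3) ** 2])

        L, A = 2.0, 1.0e-4  # bar length [m] and cross-section [m^2], deterministic

        def tip_displacement(x):
            E, P = x
            return P * L / (E * A)

        # Mean response evaluated at mean inputs; variance from the gradient (central differences)
        eps = mean * 1e-6
        grad = np.array([
            (tip_displacement(mean + np.eye(2)[i] * eps[i]) -
             tip_displacement(mean - np.eye(2)[i] * eps[i])) / (2 * eps[i])
            for i in range(2)
        ])
        u_mean = tip_displacement(mean)
        u_var = np.sum(grad**2 * var)   # inputs assumed independent
        print(f"mean response {u_mean:.3e} m, std {np.sqrt(u_var):.3e} m")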

  19. Renewable energy in electric utility capacity planning: a decomposition approach with application to a Mexican utility

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Staschus, K.

    1985-01-01

    In this dissertation, efficient algorithms for electric-utility capacity expansion planning with renewable energy are developed. The algorithms include a deterministic phase that quickly finds a near-optimal expansion plan using derating and a linearized approximation to the time-dependent availability of nondispatchable energy sources. A probabilistic second phase needs comparatively few computer-time consuming probabilistic simulation iterations to modify this solution towards the optimal expansion plan. For the deterministic first phase, two algorithms, based on a Lagrangian Dual decomposition and a Generalized Benders Decomposition, are developed. The probabilistic second phase uses a Generalized Benders Decomposition approach. Extensive computational tests of the algorithms are reported. Among the deterministic algorithms, the one based on Lagrangian Duality proves fastest. The two-phase approach is shown to save up to 80% in computing time as compared to a purely probabilistic algorithm. The algorithms are applied to determine the optimal expansion plan for the Tijuana-Mexicali subsystem of the Mexican electric utility system. A strong recommendation to push conservation programs in the desert city of Mexicali results from this implementation.

  20. A Framework to Expand and Advance Probabilistic Risk Assessment to Support Small Modular Reactors

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Curtis Smith; David Schwieder; Robert Nourgaliev

    2012-09-01

    During the early development of nuclear power plants, researchers and engineers focused on many aspects of plant operation, two of which were getting the newly-found technology to work and minimizing the likelihood of perceived accidents through redundancy and diversity. As time, and our experience, has progressed, the realization of plant operational risk/reliability has entered into the design, operation, and regulation of these plants. But, to date, we have only dabbled at the surface of risk and reliability technologies. For the next generation of small modular reactors (SMRs), it is imperative that these technologies evolve into an accepted, encompassing, validated, and integral part of the plant in order to reduce costs and to demonstrate safe operation. Further, while it is presumed that safety margins are substantial for proposed SMR designs, the depiction and demonstration of these margins needs to be better understood in order to optimize the licensing process.

  1. Sequencing batch-reactor control using Gaussian-process models.

    PubMed

    Kocijan, Juš; Hvala, Nadja

    2013-06-01

    This paper presents a Gaussian-process (GP) model for the design of sequencing batch-reactor (SBR) control for wastewater treatment. The GP model is a probabilistic, nonparametric model with uncertainty predictions. In the case of SBR control, it is used for the on-line optimisation of the batch-phases duration. The control algorithm follows the course of the indirect process variables (pH, redox potential and dissolved oxygen concentration) and recognises the characteristic patterns in their time profile. The control algorithm uses GP-based regression to smooth the signals and GP-based classification for the pattern recognition. When tested on the signals from an SBR laboratory pilot plant, the control algorithm provided a satisfactory agreement between the proposed completion times and the actual termination times of the biodegradation processes. In a set of tested batches the final ammonia and nitrate concentrations were below 1 and 0.5 mg L(-1), respectively, while the aeration time was shortened considerably. Copyright © 2013 Elsevier Ltd. All rights reserved.
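
    A flavour of the GP-based smoothing step can be given with an off-the-shelf GP regressor applied to a synthetic pH trace; the kernel choice, signal shape and noise level below are assumptions, and the snippet is not the controller described in the paper.

        # Illustrative sketch: GP regression used to smooth a noisy indirect process signal.
        import numpy as np
        from sklearn.gaussian_process import GaussianProcessRegressor
        from sklearn.gaussian_process.kernels import RBF, WhiteKernel

        rng = np.random.default_rng(3)
        t = np.linspace(0, 3, 120)[:, None]                   # time [h]
        ph = 7.5 - 0.6 * np.tanh(2 * (t.ravel() - 1.5)) + rng.normal(0, 0.05, t.size)

        kernel = 1.0 * RBF(length_scale=0.5) + WhiteKernel(noise_level=0.05**2)
        gp = GaussianProcessRegressor(kernel=kernel, normalize_y=True).fit(t, ph)

        smooth, std = gp.predict(t, return_std=True)          # smoothed signal with uncertainty
        # A pattern detector could now search `smooth` for the characteristic bend marking the end of
        # a biodegradation phase, using `std` to judge whether the detected feature is significant.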

  2. Probabilistic analysis on the failure of reactivity control for the PWR

    NASA Astrophysics Data System (ADS)

    Sony Tjahyani, D. T.; Deswandri; Sunaryo, G. R.

    2018-02-01

    The fundamental safety function of the power reactor is to control reactivity, to remove heat from the reactor, and to confine radioactive material. The safety analysis is used to ensure that each parameter is fulfilled during the design and is done by deterministic and probabilistic method. The analysis of reactivity control is important to be done because it will affect the other of fundamental safety functions. The purpose of this research is to determine the failure probability of the reactivity control and its failure contribution on a PWR design. The analysis is carried out by determining intermediate events, which cause the failure of reactivity control. Furthermore, the basic event is determined by deductive method using the fault tree analysis. The AP1000 is used as the object of research. The probability data of component failure or human error, which is used in the analysis, is collected from IAEA, Westinghouse, NRC and other published documents. The results show that there are six intermediate events, which can cause the failure of the reactivity control. These intermediate events are uncontrolled rod bank withdrawal at low power or full power, malfunction of boron dilution, misalignment of control rod withdrawal, malfunction of improper position of fuel assembly and ejection of control rod. The failure probability of reactivity control is 1.49E-03 per year. The causes of failures which are affected by human factor are boron dilution, misalignment of control rod withdrawal and malfunction of improper position for fuel assembly. Based on the assessment, it is concluded that the failure probability of reactivity control on the PWR is still within the IAEA criteria.
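
    The quantitative backbone of such a fault tree analysis is the combination of basic-event probabilities through AND and OR gates up to the top event. The sketch below is a minimal illustration with placeholder probabilities; it is not the authors' tree and does not use AP1000 data.

        # Minimal fault-tree style combination: top event = OR of intermediate events,
        # each intermediate event = AND of its (assumed independent) basic events.
        intermediate = {
            "uncontrolled_rod_withdrawal": [1.0e-2, 5.0e-2],   # placeholder annual probabilities
            "boron_dilution_malfunction":  [4.0e-3, 1.0e-1],
            "control_rod_ejection":        [2.0e-4],
        }

        def and_gate(ps):
            out = 1.0
            for p in ps:
                out *= p
            return out

        def or_gate(ps):
            out = 1.0
            for p in ps:
                out *= (1.0 - p)
            return 1.0 - out

        top = or_gate([and_gate(ps) for ps in intermediate.values()])
        print(f"P(failure of reactivity control) ~ {top:.2e} per year")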

  3. Soviet space nuclear reactor incidents - Perception versus reality

    NASA Technical Reports Server (NTRS)

    Bennett, Gary L.

    1992-01-01

    Since the Soviet Union reportedly began flying nuclear power sources in 1965 it has had four publicly known accidents involving space reactors, two publicly known accidents involving radioisotope power sources and one close call with a space reactor (Cosmos 1900). The reactor accidents, particularly Cosmos 954 and Cosmos 1402, indicated that the Soviets had adopted burnup as their reentry philosophy which is consistent with the U.S. philosophy from the 1960s and 1970s. While quantitative risk analyses have shown that the Soviet accidents have not posed a serious risk to the world's population, concerns still remain about Soviet space nuclear safety practices.

  4. 40 CFR 63.1407 - Non-reactor batch process vent provisions.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 40 Protection of Environment 11 2010-07-01 2010-07-01 true Non-reactor batch process vent... § 63.1407 Non-reactor batch process vent provisions. (a) Emission standards. (1) Owners or operators of non-reactor batch process vents located at new or existing affected sources with 0.25 tons per year (0...

  5. 40 CFR 63.1407 - Non-reactor batch process vent provisions.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 40 Protection of Environment 11 2011-07-01 2011-07-01 false Non-reactor batch process vent... § 63.1407 Non-reactor batch process vent provisions. (a) Emission standards. (1) Owners or operators of non-reactor batch process vents located at new or existing affected sources with 0.25 tons per year (0...

  6. Catalog of experimental projects for a fissioning plasma reactor

    NASA Technical Reports Server (NTRS)

    Lanzo, C. D.

    1973-01-01

    Experimental and theoretical investigations were carried out to determine the feasibility of using a small scale fissioning uranium plasma as the power source in a driver reactor. The driver system is a light water cooled and moderated reactor of the MTR type. The eight experiments and proposed configurations for the reactor are outlined.

  7. SOURCELESS STARTUP. A MACHINE CODE FOR COMPUTING LOW-SOURCE REACTOR STARTUPS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    MacMillan, D.B.

    1960-06-01

    A revision to the sourceless start-up code is presented. The code solves a system of differential equations encountered in computing the probability distribution of activity at an observed power level during reactor start-up from a very low source level. (J.R.D.)

  8. Antineutrino analysis for continuous monitoring of nuclear reactors: Sensitivity study

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Stewart, Christopher; Erickson, Anna

    This paper explores the various contributors to uncertainty on predictions of the antineutrino source term which is used for reactor antineutrino experiments and is proposed as a safeguard mechanism for future reactor installations. The errors introduced during simulation of the reactor burnup cycle from variation in nuclear reaction cross sections, operating power, and other factors are combined with those from experimental and predicted antineutrino yields, resulting from fissions, evaluated, and compared. The most significant contributor to uncertainty on the reactor antineutrino source term when the reactor was modeled in 3D fidelity with assembly-level heterogeneity was found to be the uncertainty on the antineutrino yields. Using the reactor simulation uncertainty data, the dedicated observation of a rigorously modeled small, fast reactor by a few-ton near-field detector was estimated to offer reduction of uncertainty on antineutrino yields in the 3.0–6.5 MeV range to a few percent for the primary power-producing fuel isotopes, even with zero prior knowledge of the yields.

  9. Bacterial community dynamics in a biodenitrification reactor packed with polylactic acid/poly (3-hydroxybutyrate-co-3-hydroxyvalerate) blend as the carbon source and biofilm carrier.

    PubMed

    Qiu, Tianlei; Xu, Ying; Gao, Min; Han, Meilin; Wang, Xuming

    2017-05-01

    While heterotrophic denitrification has been widely used for treating nitrogen-rich wastewater, it requires the use of additional carbon sources. With fluctuations in the nitrate concentration in the influent, controlling the C/N ratio to avoid carbon breakthrough becomes difficult. To overcome this obstacle, solid-phase denitrification (SPD) using biodegradable polymers has been used, where denitrification and carbon source biodegradation depend on microorganisms growing within the reactor. However, the microbial community dynamics in continuous-flow SPD reactors have not been fully elucidated yet. Here, we aimed to study bacterial community dynamics in a biodenitrification reactor packed with a polylactic acid/poly (3-hydroxybutyrate-co-3-hydroxyvalerate) (PLA/PHBV) blend as the carbon source and biofilm carrier. A lab-scale denitrifying reactor filled with a PLA/PHBV blend was used. With 85 mg/L of influent NO3-N concentration and a hydraulic retention time (HRT) of 2.5 h, more than 92% of the nitrate was removed. The bacterial community of inoculated activated sludge had the highest species richness in all samples. Bacterial species diversity in the reactor first decreased and then increased to a stable level. Diaphorobacter species were predominant in the reactor after day 24. In total, 178 clones were retrieved from the 16S rRNA gene clone library constructed from the biofilm samples in the reactor at 62 days of operation, and 80.9% of the clones were affiliated with Betaproteobacteria. Of these, 97.2% were classified into phylotypes corresponding to Diaphorobacter nitroreducens strain NA10B with 99% sequence similarity. Diaphorobacter, Rhizobium, Acidovorax, Rubrivivax, Azospira, Thermomonas, and Acidaminobacter constituted the biofilm microflora in the stably running reactor. Copyright © 2016 The Society for Biotechnology, Japan. Published by Elsevier B.V. All rights reserved.

  10. Developing an Event-Tree Probabilistic Tsunami Inundation Model for NE Atlantic Coasts: Application to a Case Study

    NASA Astrophysics Data System (ADS)

    Omira, R.; Matias, L.; Baptista, M. A.

    2016-12-01

    This study constitutes a preliminary assessment of probabilistic tsunami inundation in the NE Atlantic region. We developed an event-tree approach to calculate the likelihood of tsunami flood occurrence and exceedance of a specific near-shore wave height for a given exposure time. Only tsunamis of tectonic origin are considered here, taking into account local, regional, and far-field sources. The approach used here consists of an event-tree method that gathers probability models for seismic sources, tsunami numerical modeling, and statistical methods. It also includes a treatment of aleatoric uncertainties related to source location and tidal stage. Epistemic uncertainties are not addressed in this study. The methodology is applied to the coastal test-site of Sines located in the NE Atlantic coast of Portugal. We derive probabilistic high-resolution maximum wave amplitudes and flood distributions for the study test-site considering 100- and 500-year exposure times. We find that the probability that maximum wave amplitude exceeds 1 m somewhere along the Sines coasts reaches about 60 % for an exposure time of 100 years and is up to 97 % for an exposure time of 500 years. The probability of inundation occurrence (flow depth >0 m) varies between 10 % and 57 %, and from 20 % up to 95 % for 100- and 500-year exposure times, respectively. No validation has been performed here with historical tsunamis. This paper illustrates a methodology through a case study, which is not an operational assessment.
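
    The exposure-time probabilities quoted above follow directly from a Poisson arrival model: if floods exceeding a threshold occur at an annual rate lambda, the probability of at least one exceedance in T years is 1 - exp(-lambda*T). The rate used below is an assumed, illustrative value chosen only to produce numbers of a similar magnitude, not a result from the study.

        # Exceedance probability over an exposure time under a Poisson arrival assumption.
        import math

        lam = 0.009  # assumed annual rate of exceeding a 1 m wave amplitude at the site

        for T in (100, 500):
            p = 1.0 - math.exp(-lam * T)
            print(f"P(at least one exceedance in {T} years) = {p:.0%}")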

  11. Modular neuron-based body estimation: maintaining consistency over different limbs, modalities, and frames of reference

    PubMed Central

    Ehrenfeld, Stephan; Herbort, Oliver; Butz, Martin V.

    2013-01-01

    This paper addresses the question of how the brain maintains a probabilistic body state estimate over time from a modeling perspective. The neural Modular Modality Frame (nMMF) model simulates such a body state estimation process by continuously integrating redundant, multimodal body state information sources. The body state estimate itself is distributed over separate, but bidirectionally interacting modules. nMMF compares the incoming sensory and present body state information across the interacting modules and fuses the information sources accordingly. At the same time, nMMF enforces body state estimation consistency across the modules. nMMF is able to detect conflicting sensory information and to consequently decrease the influence of implausible sensor sources on the fly. In contrast to the previously published Modular Modality Frame (MMF) model, nMMF offers a biologically plausible neural implementation based on distributed, probabilistic population codes. Besides its neural plausibility, the neural encoding has the advantage of enabling (a) additional probabilistic information flow across the separate body state estimation modules and (b) the representation of arbitrary probability distributions of a body state. The results show that the neural estimates can detect and decrease the impact of false sensory information, can propagate conflicting information across modules, and can improve overall estimation accuracy due to additional module interactions. Even bodily illusions, such as the rubber hand illusion, can be simulated with nMMF. We conclude with an outlook on the potential of modeling human data and of invoking goal-directed behavioral control. PMID:24191151

  12. Background radiation measurements at high power research reactors

    NASA Astrophysics Data System (ADS)

    Ashenfelter, J.; Balantekin, B.; Baldenegro, C. X.; Band, H. R.; Barclay, G.; Bass, C. D.; Berish, D.; Bowden, N. S.; Bryan, C. D.; Cherwinka, J. J.; Chu, R.; Classen, T.; Davee, D.; Dean, D.; Deichert, G.; Dolinski, M. J.; Dolph, J.; Dwyer, D. A.; Fan, S.; Gaison, J. K.; Galindo-Uribarri, A.; Gilje, K.; Glenn, A.; Green, M.; Han, K.; Hans, S.; Heeger, K. M.; Heffron, B.; Jaffe, D. E.; Kettell, S.; Langford, T. J.; Littlejohn, B. R.; Martinez, D.; McKeown, R. D.; Morrell, S.; Mueller, P. E.; Mumm, H. P.; Napolitano, J.; Norcini, D.; Pushin, D.; Romero, E.; Rosero, R.; Saldana, L.; Seilhan, B. S.; Sharma, R.; Stemen, N. T.; Surukuchi, P. T.; Thompson, S. J.; Varner, R. L.; Wang, W.; Watson, S. M.; White, B.; White, C.; Wilhelmi, J.; Williams, C.; Wise, T.; Yao, H.; Yeh, M.; Yen, Y.-R.; Zhang, C.; Zhang, X.; Prospect Collaboration

    2016-01-01

    Research reactors host a wide range of activities that make use of the intense neutron fluxes generated at these facilities. Recent interest in performing measurements with relatively low event rates, e.g. reactor antineutrino detection, at these facilities necessitates a detailed understanding of background radiation fields. Both reactor-correlated and naturally occurring background sources are potentially important, even at levels well below those of importance for typical activities. Here we describe a comprehensive series of background assessments at three high-power research reactors, including γ-ray, neutron, and muon measurements. For each facility we describe the characteristics and identify the sources of the background fields encountered. The general understanding gained of background production mechanisms and their relationship to facility features will prove valuable for the planning of any sensitive measurement conducted therein.

  13. Developing an event-tree probabilistic tsunami inundation model for NE Atlantic coasts: Application to case studies

    NASA Astrophysics Data System (ADS)

    Omira, Rachid; Baptista, Maria Ana; Matias, Luis

    2015-04-01

    This study constitutes the first assessment of probabilistic tsunami inundation in the NE Atlantic region, using an event-tree approach. It aims to develop a probabilistic tsunami inundation approach for the NE Atlantic coast with an application to two test sites of the ASTARTE project, Tangier-Morocco and Sines-Portugal. Only tsunamis of tectonic origin are considered here, taking into account near-, regional- and far-field sources. The multidisciplinary approach, proposed here, consists of an event-tree method that gathers seismic hazard assessment, tsunami numerical modelling, and statistical methods. It also presents a treatment of uncertainties related to source location and tidal stage in order to derive the likelihood of tsunami flood occurrence and exceedance of a specific near-shore wave height during a given return period. We derive high-resolution probabilistic maximum wave heights and flood distributions for both test-sites Tangier and Sines considering 100-, 500-, and 1000-year return periods. We find that the probability that a maximum wave height exceeds 1 m somewhere along the Sines coasts reaches about 55% for 100-year return period, and is up to 100% for 1000-year return period. Along Tangier coast, the probability of inundation occurrence (flow depth > 0m) is up to 45% for 100-year return period and reaches 96% in some near-shore coastal locations for 500-year return period. Acknowledgements: This work is funded by project ASTARTE - Assessment, STrategy And Risk Reduction for Tsunamis in Europe. Grant 603839, 7th FP (ENV.2013.6.4-3).

  14. Analysis of space reactor system components: Investigation through simulation and non-nuclear testing

    NASA Astrophysics Data System (ADS)

    Bragg-Sitton, Shannon M.

    The use of fission energy in space power and propulsion systems offers considerable advantages over chemical propulsion. Fission provides over six orders of magnitude higher energy density, which translates to higher vehicle specific impulse and lower specific mass. These characteristics enable ambitious space exploration missions. The natural space radiation environment provides an external source of protons and high energy, high Z particles that can result in the production of secondary neutrons through interactions in reactor structures. Applying the approximate proton source in geosynchronous orbit during a solar particle event, investigation using MCNPX 2.5.b for proton transport through the SAFE-400 heat pipe cooled reactor indicates an incoming secondary neutron current of (1.16 +/- 0.03) x 10^7 n/s at the core-reflector interface. This neutron current may affect reactor operation during low power maneuvers (e.g., start-up) and may provide a sufficient reactor start-up source. It is important that a reactor control system be designed to automatically adjust to changes in reactor power levels, maintaining nominal operation without user intervention. A robust, autonomous control system is developed and analyzed for application during reactor start-up, accounting for fluctuations in the radiation environment that result from changes in vehicle location or to temporal variations in the radiation field. Development of a nuclear reactor for space applications requires a significant amount of testing prior to deployment of a flight unit. High confidence in fission system performance can be obtained through relatively inexpensive non-nuclear tests performed in relevant environments, with the heat from nuclear fission simulated using electric resistance heaters. A series of non-nuclear experiments was performed to characterize various aspects of reactor operation. This work includes measurement of reactor core deformation due to material thermal expansion and implementation of a virtual reactivity feedback control loop; testing and thermal hydraulic characterization of the coolant flow paths for two space reactor concepts; and analysis of heat pipe operation during start-up and steady state operation.

  15. XID+: Next generation XID development

    NASA Astrophysics Data System (ADS)

    Hurley, Peter

    2017-04-01

    XID+ is a prior-based source extraction tool which carries out photometry in the Herschel SPIRE (Spectral and Photometric Imaging Receiver) maps at the positions of known sources. It uses a probabilistic Bayesian framework that provides a natural framework in which to include prior information, and uses the Bayesian inference tool Stan to obtain the full posterior probability distribution on flux estimates.

  16. Carbothermic reduction with parallel heat sources

    DOEpatents

    Troup, Robert L.; Stevenson, David T.

    1984-12-04

    Disclosed are apparatus and method of carbothermic direct reduction for producing an aluminum alloy from a raw material mix including aluminum oxide, silicon oxide, and carbon wherein parallel heat sources are provided by a combustion heat source and by an electrical heat source at essentially the same position in the reactor, e.g., such as at the same horizontal level in the path of a gravity-fed moving bed in a vertical reactor. The present invention includes providing at least 79% of the heat energy required in the process by the electrical heat source.

  17. Study on the evaluation method for fault displacement based on characterized source model

    NASA Astrophysics Data System (ADS)

    Tonagi, M.; Takahama, T.; Matsumoto, Y.; Inoue, N.; Irikura, K.; Dalguer, L. A.

    2016-12-01

    IAEA Specific Safety Guide (SSG) 9 describes that probabilistic methods for evaluating fault displacement should be used if the deterministic methodology provides no sufficient basis to decide conclusively that the fault is not capable. In addition, the International Seismic Safety Centre has compiled an ANNEX to SSG-9 on realizing seismic hazard assessment for nuclear facilities, which shows the utility of the deterministic and probabilistic evaluation methods for fault displacement. In Japan, it is required that important nuclear facilities be established on ground where fault displacement will not arise when earthquakes occur in the future. Under these circumstances, and based on these requirements, we need to develop evaluation methods for fault displacement to enhance safety in nuclear facilities. We are studying deterministic and probabilistic methods with tentative analyses using observed records, such as surface fault displacement and near-fault strong ground motions, of inland crustal earthquakes in which fault displacements arose. In this study, we introduce the concept of evaluation methods for fault displacement. After that, we show parts of the tentative analysis results for the deterministic method as follows: (1) For the 1999 Chi-Chi earthquake, referring to the slip distribution estimated by waveform inversion, we construct a characterized source model (Miyake et al., 2003, BSSA) which can explain the observed near-fault broadband strong ground motions. (2) Referring to the characterized source model constructed in (1), we study an evaluation method for surface fault displacement using a hybrid method that combines the particle method and the distinct element method. Finally, we suggest a deterministic method to evaluate fault displacement based on the characterized source model. This research was part of the 2015 research project `Development of evaluating method for fault displacement` by the Secretariat of Nuclear Regulation Authority (S/NRA), Japan.

  18. BA3b and BA1 activate in a serial fashion after median nerve stimulation: direct evidence from combining source analysis of evoked fields and cytoarchitectonic probabilistic maps.

    PubMed

    Papadelis, Christos; Eickhoff, Simon B; Zilles, Karl; Ioannides, Andreas A

    2011-01-01

    This study combines source analysis of imaging data for early somatosensory processing with probabilistic cytoarchitectonic maps (PCMs). Human somatosensory evoked fields (SEFs) were recorded by stimulating left and right median nerves. Filtering the recorded responses in different frequency ranges identified the most responsive frequency band. The short-latency averaged SEFs were analyzed using a single equivalent current dipole (ECD) model and magnetic field tomography (MFT). The identified foci of activity were superimposed with PCMs. Two major components of opposite polarity were prominent around 21 and 31 ms. A weak component around 25 ms was also identified. For the most responsive frequency band (50-150 Hz) ECD and MFT revealed one focal source at the contralateral Brodmann area 3b (BA3b) at the peak of N20. The component at ~25 ms was localised in Brodmann area 1 (BA1) in the 50-150 Hz band. Using ECD, focal generators around 28-30 ms were located initially in BA3b and 2 ms later in BA1. MFT also revealed two focal sources - one in BA3b and one in BA1 for these latencies. Our results provide direct evidence that the earliest cortical response after median nerve stimulation is generated within the contralateral BA3b. BA1 activation a few milliseconds later indicates a serial mode of somatosensory processing within cytoarchitectonic SI subdivisions. Analysis of non-invasive magnetoencephalography (MEG) data and the use of PCMs allow unambiguous and quantitative (probabilistic) interpretation of cytoarchitectonic identity of activated areas following median nerve stimulation, even with the simple ECD model, but only when the model fits the data extremely well. Copyright © 2010 Elsevier Inc. All rights reserved.

  19. Investigation of a Possibility of Chromium-51 Accumulation in the SM-3 Reactor to Fabricate a Neutrino Source

    NASA Astrophysics Data System (ADS)

    Romanov, E. G.; Gavrin, V. N.; Tarasov, V. A.; Malkov, A. P.; Kupriyanov, A. V.; Danshin, S. N.; Veretenkin, E. P.

    2017-01-01

    Compact high-intensity neutrino sources based on the 51Cr isotope are needed for very short baseline neutrino experiments. In particular, a 3 MCi 51Cr neutrino source is needed for the experiment BEST on the search for transitions of electron neutrinos to sterile states. The paper presents the results of an analysis of options for irradiating highly enriched 50Cr in the existing thermal-neutron trap of the high-flux reactor SM-3, as well as in the most promising variants of the trap after the upcoming reconstruction of the reactor. It is shown that it is possible to obtain an intensity of 51Cr up to 3.85 MCi at the end of irradiation of 50Cr enriched to 97% in the high-flux reactor SM-3 of the JSC “SSC NIIAR”.

  20. 40 CFR 63.107 - Identification of process vents subject to this subpart.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... process vents associated with an air oxidation reactor, distillation unit, or reactor that is in a source.... (b) Some, or all, of the gas stream originates as a continuous flow from an air oxidation reactor... specified in paragraphs (c)(1) through (3) of this section. (1) Is directly from an air oxidation reactor...

  1. A Passive System Reliability Analysis for a Station Blackout

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Brunett, Acacia; Bucknor, Matthew; Grabaskas, David

    2015-05-03

    The latest iterations of advanced reactor designs have included increased reliance on passive safety systems to maintain plant integrity during unplanned sequences. While these systems are advantageous in reducing the reliance on human intervention and availability of power, the phenomenological foundations on which these systems are built require a novel approach to a reliability assessment. Passive systems possess the unique ability to fail functionally without failing physically, a result of their explicit dependency on existing boundary conditions that drive their operating mode and capacity. Argonne National Laboratory is performing ongoing analyses that demonstrate various methodologies for the characterization of passive system reliability within a probabilistic framework. Two reliability analysis techniques are utilized in this work. The first approach, the Reliability Method for Passive Systems, provides a mechanistic technique employing deterministic models and conventional static event trees. The second approach, a simulation-based technique, utilizes discrete dynamic event trees to treat time-dependent phenomena during scenario evolution. For this demonstration analysis, both reliability assessment techniques are used to analyze an extended station blackout in a pool-type sodium fast reactor (SFR) coupled with a reactor cavity cooling system (RCCS). This work demonstrates the entire process of a passive system reliability analysis, including identification of important parameters and failure metrics, treatment of uncertainties and analysis of results.

  2. Toward a Mechanistic Source Term in Advanced Reactors: A Review of Past U.S. SFR Incidents, Experiments, and Analyses

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bucknor, Matthew; Brunett, Acacia J.; Grabaskas, David

    In 2015, as part of a Regulatory Technology Development Plan (RTDP) effort for sodium-cooled fast reactors (SFRs), Argonne National Laboratory investigated the current state of knowledge of source term development for a metal-fueled, pool-type SFR. This paper provides a summary of past domestic metal-fueled SFR incidents and experiments and highlights information relevant to source term estimations that were gathered as part of the RTDP effort. The incidents described in this paper include fuel pin failures at the Sodium Reactor Experiment (SRE) facility in July of 1959, the Fermi I meltdown that occurred in October of 1966, and the repeated melting of a fuel element within an experimental capsule at the Experimental Breeder Reactor II (EBR-II) from November 1967 to May 1968. The experiments described in this paper include the Run-Beyond-Cladding-Breach tests that were performed at EBR-II in 1985 and a series of severe transient overpower tests conducted at the Transient Reactor Test Facility (TREAT) in the mid-1980s.

  3. 75 FR 13 - Alternate Fracture Toughness Requirements for Protection Against Pressurized Thermal Shock Events

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-01-04

    ...The Nuclear Regulatory Commission (NRC) is amending its regulations to provide alternate fracture toughness requirements for protection against pressurized thermal shock (PTS) events for pressurized water reactor (PWR) pressure vessels. This final rule provides alternate PTS requirements based on updated analysis methods. This action is desirable because the existing requirements are based on unnecessarily conservative probabilistic fracture mechanics analyses. This action reduces regulatory burden for those PWR licensees who expect to exceed the existing requirements before the expiration of their licenses, while maintaining adequate safety, and may choose to comply with the final rule as an alternative to complying with the existing requirements.

  4. SETS. Set Equation Transformation System

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Worrell, R.B.

    1992-01-13

    SETS is used for symbolic manipulation of Boolean equations, particularly the reduction of equations by the application of Boolean identities. It is a flexible and efficient tool for performing probabilistic risk analysis (PRA), vital area analysis, and common cause analysis. The equation manipulation capabilities of SETS can also be used to analyze noncoherent fault trees and determine prime implicants of Boolean functions, to verify circuit design implementation, to determine minimum cost fire protection requirements for nuclear reactor plants, to obtain solutions to combinatorial optimization problems with Boolean constraints, and to determine the susceptibility of a facility to unauthorized access through nullification of sensors in its protection system.
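
    The kind of identity-based reduction SETS automates can be illustrated with a general-purpose symbolic algebra package; the snippet below uses SymPy on a toy expression and is in no way the SETS code itself.

        # Illustrative only: reduce a small Boolean (fault-tree style) expression by Boolean identities.
        from sympy import symbols
        from sympy.logic.boolalg import simplify_logic

        a, b, c = symbols("a b c")
        top = (a & b) | (a & b & c) | (a & ~a)      # redundant term plus a contradiction

        reduced = simplify_logic(top, form="dnf")   # apply Boolean identities, minimal DNF
        print(reduced)                               # -> a & b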

  5. DOE Office of Scientific and Technical Information (OSTI.GOV)

    A. Alfonsi; C. Rabiti; D. Mandelli

    The Reactor Analysis and Virtual control ENvironment (RAVEN) code is a software tool that acts as the control logic driver and post-processing engine for the newly developed Thermal-Hydraulic code RELAP-7. RAVEN is now a multi-purpose Probabilistic Risk Assessment (PRA) software framework that allows dispatching different functionalities: (1) derive and actuate the control logic required to simulate the plant control system and operator actions (guided procedures), allowing on-line monitoring/controlling in the phase space; (2) perform both Monte-Carlo sampling of randomly distributed events and Dynamic Event Tree based analysis; and (3) facilitate the input/output handling through a Graphical User Interface (GUI) and a post-processing data mining module.

  6. Regulatory Technology Development Plan - Sodium Fast Reactor: Mechanistic Source Term – Trial Calculation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Grabaskas, David; Bucknor, Matthew; Jerden, James

    2016-10-01

    The potential release of radioactive material during a plant incident, referred to as the source term, is a vital design metric and will be a major focus of advanced reactor licensing. The U.S. Nuclear Regulatory Commission has stated an expectation for advanced reactor vendors to present a mechanistic assessment of the potential source term in their license applications. The mechanistic source term presents an opportunity for vendors to realistically assess the radiological consequences of an incident, and may allow reduced emergency planning zones and smaller plant sites. However, the development of a mechanistic source term for advanced reactors is not without challenges, as there are often numerous phenomena impacting the transportation and retention of radionuclides. This project sought to evaluate U.S. capabilities regarding the mechanistic assessment of radionuclide release from core damage incidents at metal fueled, pool-type sodium fast reactors (SFRs). The purpose of the analysis was to identify, and prioritize, any gaps regarding computational tools or data necessary for the modeling of radionuclide transport and retention phenomena. To accomplish this task, a parallel-path analysis approach was utilized. One path, led by Argonne and Sandia National Laboratories, sought to perform a mechanistic source term assessment using available codes, data, and models, with the goal to identify gaps in the current knowledge base. The second path, performed by an independent contractor, performed sensitivity analyses to determine the importance of particular radionuclides and transport phenomena in regards to offsite consequences. The results of the two pathways were combined to prioritize gaps in current capabilities.

  7. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Coleman, Justin; Slaughter, Andrew; Veeraraghavan, Swetha

    Multi-hazard Analysis for STOchastic time-DOmaiN phenomena (MASTODON) is a finite element application that aims at analyzing the response of 3-D soil-structure systems to natural and man-made hazards such as earthquakes, floods and fire. MASTODON currently focuses on the simulation of seismic events and has the capability to perform extensive ‘source-to-site’ simulations including earthquake fault rupture, nonlinear wave propagation and nonlinear soil-structure interaction (NLSSI) analysis. MASTODON is being developed to be a dynamic probabilistic risk assessment framework that enables analysts to not only perform deterministic analyses, but also easily perform probabilistic or stochastic simulations for the purpose of risk assessment.

  8. Background radiation measurements at high power research reactors

    DOE PAGES

    Ashenfelter, J.; Yeh, M.; Balantekin, B.; ...

    2015-10-23

    Research reactors host a wide range of activities that make use of the intense neutron fluxes generated at these facilities. Recent interest in performing measurements with relatively low event rates, e.g. reactor antineutrino detection, at these facilities necessitates a detailed understanding of background radiation fields. Both reactor-correlated and naturally occurring background sources are potentially important, even at levels well below those of importance for typical activities. Here we describe a comprehensive series of background assessments at three high-power research reactors, including γ-ray, neutron, and muon measurements. For each facility we describe the characteristics and identify the sources of the background fields encountered. Furthermore, the general understanding gained of background production mechanisms and their relationship to facility features will prove valuable for the planning of any sensitive measurement conducted therein.

  9. What is the Value Added to Adaptation Planning by Probabilistic Projections of Climate Change?

    NASA Astrophysics Data System (ADS)

    Wilby, R. L.

    2008-12-01

    Probabilistic projections of climate change offer new sources of risk information to support regional impacts assessment and adaptation options appraisal. However, questions continue to surround how best to apply these scenarios in a practical context, and whether the added complexity and computational burden leads to more robust decision-making. This paper provides an overview of recent efforts in the UK to 'bench-test' frameworks for employing probabilistic projections ahead of the release of the next generation, UKCIP08 projections (in November 2008). This has involved close collaboration between government agencies and the research and stakeholder communities. Three examples will be cited to illustrate how probabilistic projections are already informing decisions about future flood risk management in London, water resource planning in trial river basins, and assessments of risks from rising water temperatures to Atlantic salmon stocks in southern England. When compared with conventional deterministic scenarios, ensemble projections allow exploration of a wider range of management options and highlight timescales for implementing adaptation measures. Users of probabilistic scenarios must keep in mind that other uncertainties (e.g., due to impacts model structure and parameterisation) should be handled as rigorously as those arising from climate models and emission scenarios. Finally, it is noted that a commitment to long-term monitoring is also critical for tracking environmental change, testing model projections, and evaluating the success (or not) of any scenario-led interventions.

  10. Improving medium-range ensemble streamflow forecasts through statistical post-processing

    NASA Astrophysics Data System (ADS)

    Mendoza, Pablo; Wood, Andy; Clark, Elizabeth; Nijssen, Bart; Clark, Martyn; Ramos, Maria-Helena; Nowak, Kenneth; Arnold, Jeffrey

    2017-04-01

    Probabilistic hydrologic forecasts are a powerful source of information for decision-making in water resources operations. A common approach is the hydrologic model-based generation of streamflow forecast ensembles, which can be implemented to account for different sources of uncertainties - e.g., from initial hydrologic conditions (IHCs), weather forecasts, and hydrologic model structure and parameters. In practice, hydrologic ensemble forecasts typically have biases and spread errors stemming from errors in the aforementioned elements, resulting in a degradation of probabilistic properties. In this work, we compare several statistical post-processing techniques applied to medium-range ensemble streamflow forecasts obtained with the System for Hydromet Applications, Research and Prediction (SHARP). SHARP is a fully automated prediction system for the assessment and demonstration of short-term to seasonal streamflow forecasting applications, developed by the National Center for Atmospheric Research, University of Washington, U.S. Army Corps of Engineers, and U.S. Bureau of Reclamation. The suite of post-processing techniques includes linear blending, quantile mapping, extended logistic regression, quantile regression, ensemble analogs, and the generalized linear model post-processor (GLMPP). We assess and compare these techniques using multi-year hindcasts in several river basins in the western US. This presentation discusses preliminary findings about the effectiveness of the techniques for improving probabilistic skill, reliability, discrimination, sharpness and resolution.
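
    Empirical quantile mapping, one of the post-processing techniques named above, replaces each raw forecast value with the value of the observed climatology at the same non-exceedance probability. The following Python sketch is a hedged, minimal illustration with synthetic data; it is not taken from SHARP, and the variable names and distributions are invented.

    ```python
    # Minimal empirical quantile-mapping sketch for bias-correcting ensemble streamflow forecasts.
    import numpy as np

    def quantile_map(forecast, hindcast_fcst, hindcast_obs):
        """Map raw forecast values onto the observed climatology.

        forecast      : raw ensemble streamflow values to correct (1-D array)
        hindcast_fcst : historical forecast values at the same site and lead time
        hindcast_obs  : observations paired with the hindcast forecasts
        """
        # Empirical non-exceedance probability of each forecast value within the hindcast forecasts
        ranks = np.searchsorted(np.sort(hindcast_fcst), forecast, side="right") / len(hindcast_fcst)
        ranks = np.clip(ranks, 0.0, 1.0)
        # Read the observation climatology at the same quantile
        return np.quantile(hindcast_obs, ranks)

    # Toy usage with synthetic data
    rng = np.random.default_rng(0)
    hind_f = rng.gamma(2.0, 50.0, size=1000)     # biased-high synthetic hindcast forecasts
    hind_o = rng.gamma(2.0, 40.0, size=1000)     # synthetic paired observations
    raw_ens = rng.gamma(2.0, 50.0, size=30)      # one raw 30-member forecast
    corrected = quantile_map(raw_ens, hind_f, hind_o)
    ```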

  11. Probabilistic seasonal Forecasts to deterministic Farm Level Decisions: Innovative Approach

    NASA Astrophysics Data System (ADS)

    Mwangi, M. W.

    2015-12-01

    Climate change and vulnerability are major challenges in ensuring household food security. Climate information services have the potential to cushion rural households from extreme climate risks. However, the probabilistic nature of climate information products is not easily understood by the majority of smallholder farmers. Despite this, climate information has proved to be a valuable climate risk adaptation strategy at the farm level. This calls for innovative ways to help farmers understand and apply climate information services to inform their farm-level decisions. The study endeavored to co-design and test appropriate innovation systems for climate information services uptake and scale-up necessary for achieving climate risk development. In addition, it determined the conditions necessary to support the effective performance of the proposed innovation system. Data and information sources included a systematic literature review, secondary sources, government statistics, focus group discussions, household surveys and semi-structured interviews. Data were analyzed using both quantitative and qualitative techniques. Quantitative data were analyzed using the Statistical Package for Social Sciences (SPSS) software. Qualitative data were analyzed by establishing categories, themes, relationships and patterns, and conclusions in line with the study objectives. Sustainable livelihoods, reduced household poverty and climate change resilience were the impacts that resulted from the study.

  12. Simplifying microbial electrosynthesis reactor design.

    PubMed

    Giddings, Cloelle G S; Nevin, Kelly P; Woodward, Trevor; Lovley, Derek R; Butler, Caitlyn S

    2015-01-01

    Microbial electrosynthesis, an artificial form of photosynthesis, can efficiently convert carbon dioxide into organic commodities; however, this process has only previously been demonstrated in reactors that have features likely to be a barrier to scale-up. Therefore, the possibility of simplifying reactor design by both eliminating potentiostatic control of the cathode and removing the membrane separating the anode and cathode was investigated with biofilms of Sporomusa ovata, which reduces carbon dioxide to acetate and acts as the microbial catalyst, with plain graphite stick cathodes serving as the electron donor. In traditional 'H-cell' reactors, where the anode and cathode chambers were separated with a proton-selective membrane, the rates and coulombic efficiencies of microbial electrosynthesis remained high when electron delivery at the cathode was powered with a direct current power source rather than with the potentiostat-poised cathode utilized in previous studies. A membrane-less reactor with a direct-current power source, with the cathode and anode positioned to avoid oxygen exposure at the cathode, retained high rates of acetate production as well as high coulombic and energetic efficiencies. The finding that microbial electrosynthesis is feasible without a membrane separating the anode from the cathode, coupled with a direct current power source supplying the energy for electron delivery, is expected to greatly simplify future reactor design and lower construction costs.

  13. Monitoring and modeling as a continuing learning process: the use of hydrological models in a general probabilistic framework.

    NASA Astrophysics Data System (ADS)

    Baroni, G.; Gräff, T.; Reinstorf, F.; Oswald, S. E.

    2012-04-01

    Nowadays, uncertainty and sensitivity analysis are considered basic tools for the assessment of hydrological models and the evaluation of the most important sources of uncertainty. In this context, several methods have been developed and applied under different hydrological conditions in recent decades. However, in most cases these studies have investigated mainly the influence of parameter uncertainty on the simulated outputs, and few approaches have also tried to consider other sources of uncertainty, i.e. input and model structure. Moreover, several constraints arise when spatially distributed parameters are involved. To overcome these limitations, a general probabilistic framework based on Monte Carlo simulations and the Sobol method has been proposed. In this study, the general probabilistic framework was applied at field scale using a 1D physically based hydrological model (SWAP). Furthermore, the framework was extended to catchment scale in combination with a spatially distributed hydrological model (SHETRAN). The models are applied at two experimental sites in Germany: a relatively flat cropped field close to Potsdam (Brandenburg) and a small mountainous catchment with agricultural land use (Schaefertal, Harz Mountains). In both cases, input and parameters are considered the major sources of uncertainty. Evaluation of the models was based on soil moisture measured at plot scale at different depths and, for the catchment site, also on daily discharge values. The study shows how the framework can take into account all the various sources of uncertainty, i.e. input data, parameters (either in scalar or spatially distributed form) and model structures. The framework can be used in a loop in order to optimize further monitoring activities aimed at improving the performance of the model. In the particular applications, the results show how the sources of uncertainty are specific to each process considered. The influence of the input data, as well as the presence of compensating errors, becomes clear through the different processes simulated.
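
    Variance-based (Sobol) sensitivity indices of the kind used in the framework can be estimated by plain Monte Carlo "pick-freeze" sampling. The sketch below shows a first-order estimator on an invented toy function; it stands in for, and is much simpler than, the SWAP or SHETRAN applications described above.

    ```python
    # Hedged sketch of a Monte Carlo (pick-freeze) estimate of first-order Sobol indices.
    # The model below is a stand-in toy function, not SWAP or SHETRAN.
    import numpy as np

    def toy_model(x):
        # columns: (saturated conductivity, rainfall multiplier, root depth) -- all invented
        return x[:, 0] * 2.0 + np.sin(x[:, 1] * np.pi) + 0.1 * x[:, 2] ** 2

    rng = np.random.default_rng(42)
    n, k = 10_000, 3
    A = rng.uniform(0.0, 1.0, size=(n, k))
    B = rng.uniform(0.0, 1.0, size=(n, k))

    fA, fB = toy_model(A), toy_model(B)
    var_total = np.var(np.concatenate([fA, fB]))

    first_order = []
    for i in range(k):
        ABi = A.copy()
        ABi[:, i] = B[:, i]                          # "freeze" all inputs except parameter i
        fABi = toy_model(ABi)
        Si = np.mean(fB * (fABi - fA)) / var_total   # Saltelli (2010) first-order estimator
        first_order.append(Si)

    print(dict(zip(["K_sat", "rain_mult", "root_depth"], np.round(first_order, 3))))
    ```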

  14. The Dependency of Probabilistic Tsunami Hazard Assessment on Magnitude Limits of Seismic Sources in the South China Sea and Adjoining Basins

    NASA Astrophysics Data System (ADS)

    Li, Hongwei; Yuan, Ye; Xu, Zhiguo; Wang, Zongchen; Wang, Juncheng; Wang, Peitao; Gao, Yi; Hou, Jingming; Shan, Di

    2017-06-01

    The South China Sea (SCS) and its adjacent small basins, including the Sulu Sea and Celebes Sea, are commonly identified as a tsunami-prone region by its historical records of seismicity and tsunamis. However, quantification of tsunami hazard in the SCS region has remained an intractable issue due to the highly complex tectonic setting and multiple seismic sources within and surrounding this area. Probabilistic Tsunami Hazard Assessment (PTHA) is performed in the present study to evaluate tsunami hazard in the SCS region based on a brief review of seismological and tsunami records. Five regional and local potential tsunami sources are tentatively identified, and earthquake catalogs are generated using Monte Carlo simulation following the Tapered Gutenberg-Richter relationship for each zone. Considering the lack of consensus on the magnitude upper bound for each seismic source, as well as its critical role in PTHA, the major concern of the present study is to define the upper and lower limits of tsunami hazard in the SCS region comprehensively by adopting different corner magnitudes derived by multiple principles and approaches, including TGR regression of the historical catalog, fault-length scaling, tectonic and seismic moment balance, and repetition of the historical largest event. The results show that tsunami hazard in the SCS and adjoining basins is subject to large variations when adopting different corner magnitudes, with the upper bounds 2-6 times the lower. The probabilistic tsunami hazard maps for specified return periods reveal a much higher threat from the Cotabato Trench and Sulawesi Trench in the Celebes Sea, whereas the tsunami hazard received by the coasts of the SCS and Sulu Sea is relatively moderate, yet non-negligible. By combining empirical methods with numerical studies of historical tsunami events, the present PTHA results are tentatively validated. The correspondence lends confidence to our study. Considering the proximity of major sources to population-laden cities around the SCS region, the tsunami hazard and risk should be further highlighted in the future.
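
    A synthetic catalog can be drawn from a Tapered Gutenberg-Richter (TGR) magnitude distribution by inverting its survival function sample by sample. The Python sketch below is a hedged illustration of that step only; the beta value, corner magnitude and magnitude range are placeholders rather than the study's fitted values.

    ```python
    # Hedged illustration of drawing magnitudes from a Tapered Gutenberg-Richter distribution
    # by bracketed root finding on its survival function (monotone decreasing in moment).
    import numpy as np
    from scipy.optimize import brentq

    def moment(mw):                      # seismic moment (N*m) from moment magnitude
        return 10.0 ** (1.5 * mw + 9.05)

    def tgr_survival(m0, m0_min, beta, m0_corner):
        return (m0_min / m0) ** beta * np.exp((m0_min - m0) / m0_corner)

    def sample_tgr_magnitudes(n, mw_min=5.0, beta=0.65, mw_corner=8.5, seed=1):
        rng = np.random.default_rng(seed)
        m0_min, m0_c = moment(mw_min), moment(mw_corner)
        samples = []
        for u in np.clip(rng.uniform(size=n), 1e-12, 1 - 1e-12):
            f = lambda m0: tgr_survival(m0, m0_min, beta, m0_c) - u   # solve S(m0) = u
            m0 = brentq(f, m0_min, m0_min * 1e12)
            samples.append((np.log10(m0) - 9.05) / 1.5)               # back to Mw
        return np.array(samples)

    catalog = sample_tgr_magnitudes(1000)   # one synthetic catalog of 1000 events
    ```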

  15. The history and perspective of Romania-USA cooperation in the field of technologic transfer of TRIGA reactor concept

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ciocaanescu, M.; Ionescu, M.

    1996-08-01

    The cooperation between Romania and the USA in the field of technologic transfer of nuclear research reactor technology began with the steady state 14 MW{sub t} TRIGA reactor installed at INR Pitesti, Romania. It is the first in the range of TRIGA reactors proposed as a materials testing reactor. First criticality was reached on November 19, 1979, and first operation at the 14 MW{sub t} level was in February 1980. The paper presents a short history of this cooperation and the perspective for a new cooperation to build a Nuclear Heating Plant using the TRIGA reactor concept for demonstration purposes. The energy crisis is a world-wide problem which affects each country in different ways because the resources and the consumption are unevenly distributed. World-wide research points out that fossil fuel sources are not to be considered the main energy sources for the long term, as they are limited.

  16. Probabilistic power flow using improved Monte Carlo simulation method with correlated wind sources

    NASA Astrophysics Data System (ADS)

    Bie, Pei; Zhang, Buhan; Li, Hang; Deng, Weisi; Wu, Jiasi

    2017-01-01

    Probabilistic Power Flow (PPF) is a very useful tool for power system steady-state analysis. However, the correlation among different random injection powers (such as wind power) makes PPF difficult to calculate. Monte Carlo simulation (MCS) and analytical methods are the two commonly used approaches to solving PPF. MCS has high accuracy but is very time consuming. Analytical methods such as the cumulants method (CM) have high computing efficiency, but calculating the cumulants is not convenient when the wind power output does not follow any typical distribution, especially when correlated wind sources are considered. In this paper, an Improved Monte Carlo Simulation method (IMCS) is proposed. The joint empirical distribution is applied to model the correlated outputs of different wind sources. This method combines the advantages of both MCS and analytical methods: it not only has high computing efficiency, but also provides solutions with sufficient accuracy, which makes it very suitable for on-line analysis.
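
    One simple way to honor a joint empirical distribution in Monte Carlo sampling is to resample whole rows of a historical record of simultaneous wind outputs, which preserves the cross-farm correlation without fitting a parametric model. The sketch below illustrates the sampling step under that assumption; run_power_flow() is a placeholder for a deterministic power-flow solver and is not part of the paper's method.

    ```python
    # Hedged sketch of the sampling step of a Monte Carlo probabilistic power flow with
    # correlated wind injections; the historical record here is synthetic.
    import numpy as np

    def sample_correlated_wind(history, n_samples, rng):
        """history: (n_obs, n_farms) array of simultaneous historical wind power outputs."""
        idx = rng.integers(0, history.shape[0], size=n_samples)
        return history[idx, :]              # each resampled row preserves cross-farm correlation

    def run_power_flow(injections):
        # Placeholder for a deterministic AC power-flow solve; returns e.g. bus voltage magnitudes
        return np.ones(3)

    rng = np.random.default_rng(7)
    hist = rng.multivariate_normal([30.0, 25.0], [[40.0, 28.0], [28.0, 35.0]], size=5000).clip(min=0.0)
    scenarios = sample_correlated_wind(hist, 2000, rng)
    results = np.array([run_power_flow(s) for s in scenarios])
    voltage_p95 = np.percentile(results, 95, axis=0)   # probabilistic output statistics
    ```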

  17. bnstruct: an R package for Bayesian Network structure learning in the presence of missing data.

    PubMed

    Franzin, Alberto; Sambo, Francesco; Di Camillo, Barbara

    2017-04-15

    A Bayesian Network is a probabilistic graphical model that encodes probabilistic dependencies between a set of random variables. We introduce bnstruct, an open source R package to (i) learn the structure and the parameters of a Bayesian Network from data in the presence of missing values and (ii) perform reasoning and inference on the learned Bayesian Networks. To the best of our knowledge, there is no other open source software that provides methods for all of these tasks, particularly the manipulation of missing data, which is a common situation in practice. The software is implemented in R and C and is available on CRAN under a GPL licence. Contact: francesco.sambo@unipd.it. Supplementary data are available at Bioinformatics online.

  18. Research on response spectrum of dam based on scenario earthquake

    NASA Astrophysics Data System (ADS)

    Zhang, Xiaoliang; Zhang, Yushan

    2017-10-01

    Taking a large hydropower station as an example, the response spectrum based on a scenario earthquake is determined. Firstly, the potential source zone with the greatest contribution to the site is determined on the basis of the results of probabilistic seismic hazard analysis (PSHA). Secondly, the magnitude and epicentral distance of the scenario earthquake are calculated according to the main faults and historical earthquakes of the potential seismic source zone. Finally, the response spectrum of the scenario earthquake is calculated using the Next Generation Attenuation (NGA) relations. The response spectrum obtained with the scenario earthquake method is lower than the probability-consistent response spectrum obtained with the PSHA method. The empirical analysis shows that the response spectrum of the scenario earthquake considers both the probability level and the structural factors, and combines the advantages of the deterministic and probabilistic seismic hazard analysis methods. It is easy to accept and provides a basis for the seismic design of hydraulic engineering structures.

  19. Pyrolysis reactor and fluidized bed combustion chamber

    DOEpatents

    Green, Norman W.

    1981-01-06

    A solid carbonaceous material is pyrolyzed in a descending flow pyrolysis reactor in the presence of a particulate source of heat to yield a particulate carbon containing solid residue. The particulate source of heat is obtained by educting with a gaseous source of oxygen the particulate carbon containing solid residue from a fluidized bed into a first combustion zone coupled to a second combustion zone. A source of oxygen is introduced into the second combustion zone to oxidize carbon monoxide formed in the first combustion zone to heat the solid residue to the temperature of the particulate source of heat.

  20. PROBABILISTIC SAFETY ASSESSMENT OF OPERATIONAL ACCIDENTS AT THE WASTE ISOLATION PILOT PLANT

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rucker, D.F.

    2000-09-01

    This report presents a probabilistic safety assessment of radioactive doses as consequences from accident scenarios to complement the deterministic assessment presented in the Waste Isolation Pilot Plant (WIPP) Safety Analysis Report (SAR). The International Commission on Radiological Protection (ICRP) recommends that both assessments be conducted to ensure that ''an adequate level of safety has been achieved and that no major contributors to risk are overlooked'' (ICRP 1993). To that end, the probabilistic assessment for the WIPP accident scenarios addresses the wide range of assumptions, e.g. the range of values representing the radioactive source of an accident, that could possibly have been overlooked by the SAR. Routine releases of radionuclides from the WIPP repository to the environment during waste emplacement operations are expected to be essentially zero. In contrast, potential accidental releases from postulated accident scenarios during waste handling and emplacement could be substantial, which necessitates radiological air monitoring and confinement barriers (DOE 1999). The WIPP SAR calculated doses from accidental releases to the on-site (at 100 m from the source) and off-site (at the Exclusive Use Boundary and Site Boundary) public by a deterministic approach. This approach, as demonstrated in the SAR, uses single-point values of key parameters to assess the 50-year, whole-body committed effective dose equivalent (CEDE). The basic assumptions used in the SAR to formulate the CEDE are retained for this report's probabilistic assessment. However, for the probabilistic assessment, single-point parameter values were replaced with probability density functions (PDFs) and were sampled over an expected range. Monte Carlo simulations were run, in which 10,000 iterations were performed by randomly selecting one value for each parameter and calculating the dose. Statistical information was then derived from the 10,000-iteration batch, including the 5%, 50%, and 95% dose likelihoods and the sensitivity of each assumption to the calculated doses. As one would intuitively expect, the doses from the probabilistic assessment for most scenarios were found to be much less than those from the deterministic assessment. The lower dose of the probabilistic assessment can be attributed to a ''smearing'' of values from the high and low ends of the PDF spectrum of the various input parameters. The analysis also found a potential weakness in the deterministic analysis used in the SAR: a detail of drum loading was not taken into consideration. Waste emplacement operations thus far have handled drums from each shipment as a single unit, i.e. drums from each shipment are kept together. Shipments typically come from a single waste stream, and therefore the curie loading of each drum can be considered nearly identical to that of its neighbor. Calculations show that if a large number of drums is used in the accident scenario assessment, e.g. 28 drums in the waste hoist failure scenario (CH5), then the probabilistic dose assessment calculations will diverge from the deterministically determined doses. As currently calculated, the deterministic dose assessment assumes one drum loaded to the maximum allowable (80 PE-Ci), with the remainder at 10% of the maximum. The effective average drum curie content is therefore less in the deterministic assessment than in the probabilistic assessment for a large number of drums. EEG recommends that the WIPP SAR calculations be revisited and updated to include a probabilistic safety assessment.
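
    The probabilistic approach described above can be illustrated with a short Monte Carlo sketch in which single-point parameters are replaced by distributions, the dose equation is evaluated 10,000 times, and percentiles and rank-correlation sensitivities are reported. The dose model and distribution choices below are illustrative stand-ins, not the WIPP SAR equations or parameter values.

    ```python
    # Hedged sketch of a Monte Carlo dose assessment with distributional inputs.
    import numpy as np
    from scipy.stats import spearmanr

    rng = np.random.default_rng(2000)
    n_iter = 10_000

    source_ci   = rng.triangular(1.0, 8.0, 80.0, n_iter)                    # drum curie content (PE-Ci), assumed triangular
    release_fr  = rng.lognormal(mean=np.log(1e-3), sigma=1.0, size=n_iter)  # airborne release fraction, assumed lognormal
    dose_factor = rng.normal(1.0, 0.1, n_iter).clip(min=0.0)                # dose per released curie, normalized units

    dose = source_ci * release_fr * dose_factor      # surrogate for the 50-year CEDE (arbitrary units)
    p05, p50, p95 = np.percentile(dose, [5, 50, 95])
    print(f"5%: {p05:.3g}   50%: {p50:.3g}   95%: {p95:.3g}")

    # Simple sensitivity measure: rank correlation of each sampled input with the dose
    for name, x in [("source term", source_ci), ("release fraction", release_fr), ("dose factor", dose_factor)]:
        print(name, round(spearmanr(x, dose)[0], 2))
    ```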

  1. Probabilistic Tsunami Hazard Analysis

    NASA Astrophysics Data System (ADS)

    Thio, H. K.; Ichinose, G. A.; Somerville, P. G.; Polet, J.

    2006-12-01

    The recent tsunami disaster caused by the 2004 Sumatra-Andaman earthquake has focused our attention on the hazard posed by large earthquakes that occur under water, in particular subduction zone earthquakes, and the tsunamis that they generate. Even though these kinds of events are rare, the very large loss of life and material destruction caused by this earthquake warrant a significant effort towards the mitigation of the tsunami hazard. For ground motion hazard, Probabilistic Seismic Hazard Analysis (PSHA) has become a standard practice in the evaluation and mitigation of seismic hazard to populations, in particular with respect to structures, infrastructure and lifelines. Its ability to condense the complexities and variability of seismic activity into a manageable set of parameters greatly facilitates the design of effective seismic-resistant buildings as well as the planning of infrastructure projects. Probabilistic Tsunami Hazard Analysis (PTHA) achieves the same goal for hazards posed by tsunami. There are great advantages to implementing such a method to evaluate the total risk (seismic and tsunami) to coastal communities. The method that we have developed is based on traditional PSHA and is therefore completely consistent with standard seismic practice. Because of the strong dependence of tsunami wave heights on bathymetry, we use full tsunami waveform computations in lieu of the attenuation relations that are common in PSHA. By pre-computing and storing the tsunami waveforms at points along the coast generated for sets of subfaults that comprise larger earthquake faults, we can efficiently synthesize tsunami waveforms for any slip distribution on those faults by summing the individual subfault tsunami waveforms (weighted by their slip). This efficiency makes it feasible to use Green's function summation in lieu of attenuation relations to provide very accurate estimates of tsunami height for probabilistic calculations, where one typically computes thousands of earthquake scenarios. We have carried out preliminary tsunami hazard calculations for different return periods for western North America and Hawaii based on thousands of earthquake scenarios around the Pacific rim and along the coast of North America. We will present tsunami hazard maps for several return periods and also discuss how to use these results for probabilistic inundation and runup mapping. Our knowledge of certain types of tsunami sources is very limited (e.g. submarine landslides), but a probabilistic framework for tsunami hazard evaluation can include even such sources and their uncertainties and present the overall hazard in a meaningful and consistent way.
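
    The Green's function summation idea is simple to sketch: the scenario waveform at a coastal point is a slip-weighted sum of precomputed unit-slip subfault waveforms, which can then be evaluated for thousands of sampled slip distributions. The waveform library below is a synthetic placeholder, not a real set of tsunami Green's functions.

    ```python
    # Hedged sketch of Green's function summation for scenario tsunami waveforms.
    import numpy as np

    n_subfaults, n_times = 40, 2048
    rng = np.random.default_rng(3)

    # Precomputed library: unit-slip tsunami waveform at one coastal site for each subfault (synthetic here)
    unit_waveforms = rng.normal(0.0, 0.01, size=(n_subfaults, n_times)).cumsum(axis=1)

    def scenario_waveform(slip):
        """slip: length-n_subfaults vector of slip (m) for one earthquake scenario."""
        return slip @ unit_waveforms          # linear superposition, weighted by slip

    # Probabilistic use: peak coastal amplitude for many sampled slip distributions
    peaks = []
    for _ in range(5000):
        slip = rng.gamma(shape=1.5, scale=1.0, size=n_subfaults)
        peaks.append(np.max(np.abs(scenario_waveform(slip))))
    exceedance_2m = np.mean(np.array(peaks) > 2.0)   # fraction of scenarios exceeding 2 m
    ```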

  2. ProbCD: enrichment analysis accounting for categorization uncertainty.

    PubMed

    Vêncio, Ricardo Z N; Shmulevich, Ilya

    2007-10-12

    As in many other areas of science, systems biology makes extensive use of statistical association and significance estimates in contingency tables, a type of categorical data analysis known in this field as enrichment (also over-representation or enhancement) analysis. In spite of efforts to create probabilistic annotations, especially in the Gene Ontology context, and to deal with uncertainty in high-throughput datasets, current enrichment methods largely ignore this probabilistic information since they are mainly based on variants of the Fisher Exact Test. We developed ProbCD, an open-source R-based software package for probabilistic categorical data analysis that does not require a static contingency table. The contingency table for the enrichment problem is built using the expectation of a Bernoulli Scheme stochastic process given the categorization probabilities. An on-line interface was created to allow usage by non-programmers and is available at: http://xerad.systemsbiology.net/ProbCD/. We present an analysis framework and software tools to address the issue of uncertainty in categorical data analysis. In particular, concerning enrichment analysis, ProbCD can accommodate: (i) the stochastic nature of high-throughput experimental techniques and (ii) probabilistic gene annotation.
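
    The Bernoulli-scheme expectation mentioned above can be illustrated by building an "expected" 2x2 enrichment table directly from annotation probabilities rather than hard category assignments. The Python sketch below is a generic illustration of that idea, not ProbCD itself (which is an R package); the probabilities and study-set flags are invented.

    ```python
    # Hedged sketch: expected 2x2 enrichment contingency table from probabilistic annotations.
    import numpy as np

    p_annot  = np.array([0.9, 0.8, 0.2, 0.6, 0.1, 0.05, 0.7, 0.3])   # P(gene i carries the annotation), invented
    selected = np.array([1,   1,   1,   0,   0,   0,    1,   0], dtype=bool)  # membership in the study set

    n11 = p_annot[selected].sum()          # E[selected & annotated]
    n12 = (1 - p_annot)[selected].sum()    # E[selected & not annotated]
    n21 = p_annot[~selected].sum()         # E[not selected & annotated]
    n22 = (1 - p_annot)[~selected].sum()   # E[not selected & not annotated]

    expected_table = np.array([[n11, n12], [n21, n22]])
    print(expected_table)
    ```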

  3. MrLavaLoba: A new probabilistic model for the simulation of lava flows as a settling process

    NASA Astrophysics Data System (ADS)

    de'Michieli Vitturi, Mattia; Tarquini, Simone

    2018-01-01

    A new code to simulate lava flow spread, MrLavaLoba, is presented. In the code, erupted lava is itemized in parcels having an elliptical shape and prescribed volume. New parcels bud from existing ones according to a probabilistic law influenced by the local steepest slope direction and by tunable input settings. MrLavaLoba must be counted among the probabilistic codes for the simulation of lava flows, because it is not intended to mimic the actual process of flowing or to provide directly the progression with time of the flow field, but rather to estimate the most probable inundated area and the final thickness of the lava deposit. The code's flexibility allows it to produce variable lava flow spread and emplacement according to different dynamics (e.g. pahoehoe or channelized 'a'ā). For a given scenario, it is shown that model outputs converge, in probabilistic terms, towards a single solution. The code is applied to real cases in Hawaii and at Mt. Etna, and the resulting maps are shown. The model is written in Python and the source code is available at http://demichie.github.io/MrLavaLoba/.
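
    A highly simplified, hedged sketch of the budding rule follows: each new parcel direction is drawn from a distribution concentrated around the local steepest-descent azimuth, with a tunable spread. This toy illustration is not the MrLavaLoba source code, which is available at the URL above.

    ```python
    # Toy sketch of slope-biased probabilistic budding of lava parcels.
    import numpy as np

    rng = np.random.default_rng(11)

    def steepest_descent_azimuth(dem, i, j, cell=10.0):
        """Azimuth (radians) of the steepest downhill direction from finite-difference gradients."""
        dzdx = (dem[i, j + 1] - dem[i, j - 1]) / (2 * cell)
        dzdy = (dem[i + 1, j] - dem[i - 1, j]) / (2 * cell)
        return np.arctan2(-dzdy, -dzdx)            # downhill direction

    def bud_direction(azimuth, spread=0.6):
        """Draw the new parcel direction around the steepest-descent azimuth."""
        return azimuth + rng.normal(0.0, spread)

    # Toy DEM: a plane dipping toward increasing x
    y, x = np.mgrid[0:50, 0:50]
    dem = 500.0 - 2.0 * x.astype(float)

    az = steepest_descent_azimuth(dem, 25, 25)
    directions = [bud_direction(az) for _ in range(1000)]   # ensemble of budding directions
    ```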

  4. Economic Dispatch for Microgrid Containing Electric Vehicles via Probabilistic Modeling: Preprint

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yao, Yin; Gao, Wenzhong; Momoh, James

    In this paper, an economic dispatch model with probabilistic modeling is developed for a microgrid. The electric power supply in a microgrid consists of conventional power plants and renewable energy power plants, such as wind and solar power plants. Because of the fluctuation in the output of solar and wind power plants, an empirical probabilistic model is developed to predict their hourly output. According to different characteristics of wind and solar power plants, the parameters for probabilistic distribution are further adjusted individually for both. On the other hand, with the growing trend in plug-in electric vehicles (PHEVs), an integrated microgrid system must also consider the impact of PHEVs. The charging loads from PHEVs as well as the discharging output via the vehicle-to-grid (V2G) method can greatly affect the economic dispatch for all of the micro energy sources in a microgrid. This paper presents an optimization method for economic dispatch in a microgrid considering conventional power plants, renewable power plants, and PHEVs. The simulation results reveal that PHEVs with V2G capability can be an indispensable supplement in a modern microgrid.

  5. High Power LaB6 Plasma Source Performance for the Lockheed Martin Compact Fusion Reactor Experiment

    NASA Astrophysics Data System (ADS)

    Heinrich, Jonathon

    2016-10-01

    Lockheed Martin's Compact Fusion Reactor (CFR) concept is a linear encapsulated ring cusp. Due to the complex field geometry, plasma injection into the device requires careful consideration. A high power thermionic plasma source (>0.25 MW; >10 A/cm2) has been developed with consideration of phase space for optimal coupling. We present the performance of the plasma source, a comparison with alternative plasma sources, and plasma coupling with the CFR field configuration.

  6. Applying probabilistic temporal and multisite data quality control methods to a public health mortality registry in Spain: a systematic approach to quality control of repositories.

    PubMed

    Sáez, Carlos; Zurriaga, Oscar; Pérez-Panadés, Jordi; Melchor, Inma; Robles, Montserrat; García-Gómez, Juan M

    2016-11-01

    To assess the variability in data distributions among data sources and over time through a case study of a large multisite repository, as a systematic approach to data quality (DQ). Novel probabilistic DQ control methods based on information theory and geometry are applied to the Public Health Mortality Registry of the Region of Valencia, Spain, with 512,143 entries from 2000 to 2012, disaggregated into 24 health departments. The methods provide DQ metrics and exploratory visualizations for (1) assessing the variability among multiple sources and (2) monitoring and exploring changes with time. The methods are suited to big data and multitype, multivariate, and multimodal data. The repository was partitioned into 2 probabilistically separated temporal subgroups following a change in the Spanish National Death Certificate in 2009. Punctual temporal anomalies were noticed due to a punctual increment in missing data, along with outlying and clustered health departments due to differences in populations or in practices. Changes in protocols, differences in populations, biased practices, or other systematic DQ problems affected data variability. Even if semantic and integration aspects are addressed in data sharing infrastructures, probabilistic variability may still be present. Solutions include fixing or excluding data and analyzing different sites or time periods separately. A systematic approach to assessing temporal and multisite variability is proposed. Multisite and temporal variability in data distributions affects DQ, hindering data reuse, and an assessment of such variability should be part of systematic DQ procedures.
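
    An information-theoretic variability metric of the general kind these methods rely on can be sketched as the Jensen-Shannon distance between the empirical distributions of a coded variable in successive monthly batches; a jump in the series flags a temporal change point. The Python sketch below is generic and uses synthetic data, not the registry's actual pipeline.

    ```python
    # Hedged sketch of a temporal data-quality drift metric based on the Jensen-Shannon distance.
    import numpy as np
    from scipy.spatial.distance import jensenshannon

    def monthly_distributions(codes_by_month, vocabulary):
        """Return one normalized histogram per month over a fixed code vocabulary."""
        dists = []
        for codes in codes_by_month:
            counts = np.array([np.sum(np.asarray(codes) == c) for c in vocabulary], dtype=float)
            dists.append(counts / counts.sum())
        return np.array(dists)

    # Toy data: a coding-practice change after month 3 shifts the distribution
    rng = np.random.default_rng(5)
    vocab = ["A", "B", "C"]
    months = [rng.choice(vocab, p=[0.6, 0.3, 0.1], size=500) for _ in range(3)]
    months += [rng.choice(vocab, p=[0.3, 0.3, 0.4], size=500) for _ in range(3)]

    dists = monthly_distributions(months, vocab)
    drift = [jensenshannon(dists[i], dists[i + 1]) for i in range(len(dists) - 1)]
    print(np.round(drift, 3))   # a jump in this series flags the temporal change point
    ```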

  7. Site-specific probabilistic ecological risk assessment of a volatile chlorinated hydrocarbon-contaminated tidal estuary.

    PubMed

    Hunt, James; Birch, Gavin; Warne, Michael St J

    2010-05-01

    Groundwater contaminated with volatile chlorinated hydrocarbons (VCHs) was identified as discharging to Penrhyn Estuary, an intertidal embayment of Botany Bay, New South Wales, Australia. A screening-level hazard assessment of surface water in Penrhyn Estuary identified an unacceptable hazard to marine organisms posed by VCHs. Given the limitations of hazard assessments, the present study conducted a higher-tier, quantitative probabilistic risk assessment using the joint probability curve (JPC) method, which accounts for variability in exposure and toxicity profiles to quantify risk (delta). Risk was assessed for 24 scenarios, covering four areas of the estuary, three exposure scenarios (low tide, high tide, and both low and high tides) and two toxicity scenarios (chronic no-observed-effect concentrations [NOEC] and 50% effect concentrations [EC50]). Risk (delta) was greater at low tide than at high tide and varied throughout the tidal cycle. Spatial distributions of risk in the estuary were similar using both NOEC and EC50 data. The exposure scenario including data combined from both tides was considered the most accurate representation of the ecological risk in the estuary. When assessing risk using data across both tides, the greatest risk was identified in the Springvale tributary (delta = 25%), closest to the source area, followed by the inner estuary (delta = 4%) and the Floodvale tributary (delta = 2%), with the lowest risk in the outer estuary (delta = 0.1%), farthest from the source area. Going from the screening-level ecological risk assessment (ERA) to the probabilistic ERA changed the risk from unacceptable to acceptable in 50% of exposure scenarios in two of the four areas within the estuary. The probabilistic ERA provided a more realistic assessment of risk than the screening-level hazard assessment.

  8. Computation of probabilistic hazard maps and source parameter estimation for volcanic ash transport and dispersion

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Madankan, R.; Pouget, S.; Singla, P., E-mail: psingla@buffalo.edu

    Volcanic ash advisory centers are charged with forecasting the movement of volcanic ash plumes for aviation, health and safety preparation. Deterministic mathematical equations model the advection and dispersion of these plumes. However, initial plume conditions – height, profile of particle location, volcanic vent parameters – are known only approximately at best, and other features of the governing system, such as the wind field, are stochastic. These uncertainties make forecasting plume motion difficult. As a result, ash advisories based on a deterministic approach tend to be conservative and many times over- or under-estimate the extent of a plume. This paper presents an end-to-end framework for probabilistic ash plume forecasting. The framework uses an ensemble of solutions, guided by the Conjugate Unscented Transform (CUT) method for evaluating expectation integrals. This ensemble is used to construct a polynomial chaos expansion that can be sampled cheaply to provide a probabilistic model forecast. The CUT method is then combined with a minimum variance condition to provide a full posterior pdf of the uncertain source parameters, based on observed satellite imagery. The April 2010 eruption of the Eyjafjallajökull volcano in Iceland is employed as a test example. The puff advection/dispersion model is used to hindcast the motion of the ash plume through time, concentrating on the period 14–16 April 2010. Variability in the height and particle loading of that eruption is introduced through a volcano column model called bent. Output uncertainty due to the assumed uncertain input parameter probability distributions, and a probabilistic spatial-temporal estimate of ash presence, are computed.

  9. A probabilistic tsunami hazard assessment for Indonesia

    NASA Astrophysics Data System (ADS)

    Horspool, N.; Pranantyo, I.; Griffin, J.; Latief, H.; Natawidjaja, D. H.; Kongko, W.; Cipta, A.; Bustaman, B.; Anugrah, S. D.; Thio, H. K.

    2014-11-01

    Probabilistic hazard assessments are a fundamental tool for assessing the threats posed by hazards to communities and are important for underpinning evidence-based decision-making regarding risk mitigation activities. Indonesia has been the focus of intense tsunami risk mitigation efforts following the 2004 Indian Ocean tsunami, but these have been largely concentrated on the Sunda Arc, with little attention to other tsunami-prone areas of the country such as eastern Indonesia. We present the first nationally consistent probabilistic tsunami hazard assessment (PTHA) for Indonesia. This assessment produces time-independent forecasts of tsunami hazards at the coast using data from tsunamis generated by local, regional and distant earthquake sources. The methodology is based on the established Monte Carlo approach to probabilistic seismic hazard assessment (PSHA) and has been adapted to tsunami. We account for sources of epistemic and aleatory uncertainty in the analysis through the use of logic trees and sampling of probability density functions. For short return periods (100 years) the highest tsunami hazard is along the west coast of Sumatra, the south coast of Java and the north coast of Papua. For longer return periods (500-2500 years), the tsunami hazard is highest along the Sunda Arc, reflecting the larger maximum magnitudes. The annual probability of experiencing a tsunami with a height of > 0.5 m at the coast is greater than 10% for Sumatra, Java, the Sunda islands (Bali, Lombok, Flores, Sumba) and north Papua. The annual probability of experiencing a tsunami with a height of > 3.0 m, which would cause significant inundation and fatalities, is 1-10% in Sumatra, Java, Bali, Lombok and north Papua, and 0.1-1% for north Sulawesi, Seram and Flores. The results of this national-scale hazard assessment provide evidence for disaster managers to prioritise regions for risk mitigation activities and/or more detailed hazard or risk assessments.

  10. Seismic hazard assessment for Guam and the Northern Mariana Islands

    USGS Publications Warehouse

    Mueller, Charles S.; Haller, Kathleen M.; Luco, Nicholas; Petersen, Mark D.; Frankel, Arthur D.

    2012-01-01

    We present the results of a new probabilistic seismic hazard assessment for Guam and the Northern Mariana Islands. The Mariana island arc has formed in response to northwestward subduction of the Pacific plate beneath the Philippine Sea plate, and this process controls seismic activity in the region. Historical seismicity, the Mariana megathrust, and two crustal faults on Guam were modeled as seismic sources, and ground motions were estimated by using published relations for a firm-rock site condition. Maps of peak ground acceleration, 0.2-second spectral acceleration for 5 percent critical damping, and 1.0-second spectral acceleration for 5 percent critical damping were computed for exceedance probabilities of 2 percent and 10 percent in 50 years. For 2 percent probability of exceedance in 50 years, probabilistic peak ground acceleration is 0.94 gravitational acceleration at Guam and 0.57 gravitational acceleration at Saipan, 0.2-second spectral acceleration is 2.86 gravitational acceleration at Guam and 1.75 gravitational acceleration at Saipan, and 1.0-second spectral acceleration is 0.61 gravitational acceleration at Guam and 0.37 gravitational acceleration at Saipan. For 10 percent probability of exceedance in 50 years, probabilistic peak ground acceleration is 0.49 gravitational acceleration at Guam and 0.29 gravitational acceleration at Saipan, 0.2-second spectral acceleration is 1.43 gravitational acceleration at Guam and 0.83 gravitational acceleration at Saipan, and 1.0-second spectral acceleration is 0.30 gravitational acceleration at Guam and 0.18 gravitational acceleration at Saipan. The dominant hazard source at the islands is upper Benioff-zone seismicity (depth 40–160 kilometers). The large probabilistic ground motions reflect the strong concentrations of this activity below the arc, especially near Guam.

  11. Toward a probabilistic acoustic emission source location algorithm: A Bayesian approach

    NASA Astrophysics Data System (ADS)

    Schumacher, Thomas; Straub, Daniel; Higgins, Christopher

    2012-09-01

    Acoustic emissions (AE) are stress waves initiated by sudden strain releases within a solid body. These can be caused by internal mechanisms such as crack opening or propagation, crushing, or rubbing of crack surfaces. One application for the AE technique in the field of Structural Engineering is Structural Health Monitoring (SHM). With piezo-electric sensors mounted to the surface of the structure, stress waves can be detected, recorded, and stored for later analysis. An important step in quantitative AE analysis is the estimation of the stress wave source locations. Commonly, source location results are presented in a rather deterministic manner as spatial and temporal points, excluding information about uncertainties and errors. Due to variability in the material properties and uncertainty in the mathematical model, measures of uncertainty are needed beyond best-fit point solutions for source locations. This paper introduces a novel holistic framework for the development of a probabilistic source location algorithm. Bayesian analysis methods with Markov Chain Monte Carlo (MCMC) simulation are employed where all source location parameters are described with posterior probability density functions (PDFs). The proposed methodology is applied to an example employing data collected from a realistic section of a reinforced concrete bridge column. The selected approach is general and has the advantage that it can be extended and refined efficiently. Results are discussed and future steps to improve the algorithm are suggested.
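
    A minimal Bayesian source-location step can be sketched as a random-walk Metropolis sampler over the source coordinates and origin time, with a Gaussian likelihood on sensor arrival times and an assumed known wave speed. The geometry, noise level and priors below are invented for illustration and may differ from the paper's formulation.

    ```python
    # Hedged sketch of MCMC-based AE source location from sensor arrival times.
    import numpy as np

    rng = np.random.default_rng(8)
    sensors = np.array([[0.0, 0.0], [1.0, 0.0], [1.0, 1.0], [0.0, 1.0]])   # sensor positions (m)
    v, sigma = 4000.0, 2e-6                                                # wave speed (m/s), timing noise (s)

    true_src, true_t0 = np.array([0.3, 0.7]), 0.0
    arrivals = true_t0 + np.linalg.norm(sensors - true_src, axis=1) / v + rng.normal(0, sigma, 4)

    def log_post(theta):
        (x, y), t0 = theta[:2], theta[2]
        if not (0 <= x <= 1 and 0 <= y <= 1):
            return -np.inf                                   # uniform prior on the plate
        pred = t0 + np.linalg.norm(sensors - np.array([x, y]), axis=1) / v
        return -0.5 * np.sum((arrivals - pred) ** 2) / sigma ** 2

    theta = np.array([0.5, 0.5, 0.0])
    chain, lp = [], log_post(theta)
    for _ in range(20_000):
        prop = theta + rng.normal(0, [0.02, 0.02, 1e-6])     # random-walk proposal
        lp_prop = log_post(prop)
        if np.log(rng.uniform()) < lp_prop - lp:
            theta, lp = prop, lp_prop
        chain.append(theta.copy())

    posterior = np.array(chain[5000:])                        # discard burn-in
    print(posterior[:, :2].mean(axis=0), posterior[:, :2].std(axis=0))   # posterior location and spread
    ```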

  12. KINETICS OF LOW SOURCE REACTOR STARTUPS. PART II

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hurwitz, H. Jr.; MacMillan, D.B.; Smith, J.H.

    1962-06-01

    A computational technique is described for obtaining the probability distribution of power level during a low-source reactor startup. The technique uses a mathematical model for the time-dependent probability distribution of neutron and precursor concentrations, with a finite neutron lifetime, one group of delayed neutron precursors, and no spatial dependence. Results obtained with the technique are given. (auth)

  13. Subcritical unity for the Argonaut reactor (in Portuguese)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mongiovi, G.; Aghina, L.O.B.

    1971-04-01

    tubetype fuel elements aiming at the construction of a subcritical unit employing the internal thermal column of an Argonaut reactor as a source. The results confirmed the feasibility of the use of natural UO/sub 2/ for the proposed arrangement as long as one has a strong source or a subcritical unit diameter greater than 100 cm. (INIS)

  14. Levels, sources and probabilistic health risks of polycyclic aromatic hydrocarbons in the agricultural soils from sites neighboring suburban industries in Shanghai.

    PubMed

    Tong, Ruipeng; Yang, Xiaoyi; Su, Hanrui; Pan, Yue; Zhang, Qiuzhuo; Wang, Juan; Long, Mingce

    2018-03-01

    The levels, sources and quantitative probabilistic health risks for polycyclic aromatic hydrocarbons (PAHs) in agricultural soils in the vicinity of power, steel and petrochemical plants in the suburbs of Shanghai are discussed. The total concentration of 16 PAHs in the soils ranges from 223 to 8214 ng g^-1. The sources of PAHs were analyzed by both isomeric ratios and a principal component analysis-multiple linear regression method. The results indicate that PAHs mainly originated from the incomplete combustion of coal and oil. The probabilistic assessments of both carcinogenic and non-carcinogenic risks posed by PAHs in soils, with adult farmers as the receptors of concern, were quantitatively calculated by Monte Carlo simulation. The estimated total carcinogenic risk (TCR) for the agricultural soils has a 45% possibility of exceeding the acceptable threshold value (10^-6), indicating potential adverse health effects. However, all non-carcinogenic risks are below the threshold value. Oral intake is the dominant exposure pathway, accounting for 77.7% of TCR, while inhalation intake is negligible. The three PAHs with the highest contributions to TCR are BaP (64.35%), DBA (17.56%) and InP (9.06%). Sensitivity analyses indicate that exposure frequency has the greatest impact on the total risk uncertainty, followed by the exposure dose through oral intake and the exposure duration. These results indicate that it is essential to manage the health risks of PAH-contaminated agricultural soils in the vicinity of typical industries in megacities.
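
    For the oral (soil ingestion) pathway, the Monte Carlo carcinogenic-risk calculation amounts to propagating distributions through the standard incremental lifetime cancer risk equation, ILCR = C × IngR × CF × EF × ED × CSF / (BW × AT). The sketch below is a hedged illustration; the exposure-factor distributions and slope factor are textbook-style placeholders, not the study's fitted values.

    ```python
    # Hedged sketch of a Monte Carlo ILCR calculation for the soil ingestion pathway.
    import numpy as np

    rng = np.random.default_rng(16)
    n = 100_000

    bap_eq   = rng.lognormal(np.log(500.0), 0.8, n)      # BaP-equivalent soil concentration (ng/g), assumed lognormal
    ing_rate = rng.triangular(50.0, 100.0, 200.0, n)     # soil ingestion rate (mg/day)
    ef       = rng.uniform(180.0, 350.0, n)              # exposure frequency (days/year)
    ed       = rng.triangular(10.0, 25.0, 40.0, n)       # exposure duration (years)
    bw       = rng.normal(60.0, 8.0, n).clip(min=40.0)   # body weight (kg)
    csf_oral = 7.3                                       # oral cancer slope factor for BaP ((mg/kg-day)^-1)
    at_days  = 70.0 * 365.0                              # averaging time for carcinogens (days)

    c_mg_per_kg = bap_eq * 1e-3                          # ng/g soil -> mg/kg soil
    cf          = 1e-6                                   # kg soil per mg soil ingested
    dose        = c_mg_per_kg * ing_rate * cf * ef * ed / (bw * at_days)   # mg/(kg body weight * day)
    ilcr        = dose * csf_oral

    print("P(ILCR > 1e-6):", np.mean(ilcr > 1e-6))
    print("95th percentile ILCR:", np.percentile(ilcr, 95))
    ```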

  15. Station Blackout: A case study in the interaction of mechanistic and probabilistic safety analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Curtis Smith; Diego Mandelli; Cristian Rabiti

    2013-11-01

    The ability to better characterize and quantify safety margins is important to improved decision making about nuclear power plant design, operation, and plant life extension. As research and development (R&D) in the Light-Water Reactor Sustainability (LWRS) Program and other collaborative efforts yield new data, sensors, and improved scientific understanding of the physical processes that govern the aging and degradation of plant SSCs, needs and opportunities to better optimize plant safety and performance will become known. The purpose of the Risk Informed Safety Margin Characterization (RISMC) Pathway R&D is to support plant decisions for risk-informed margin management with the aim to improve economics and reliability, and sustain safety, of current NPPs. In this paper, we describe the RISMC analysis process, illustrating how mechanistic and probabilistic approaches are combined in order to estimate a safety margin. We use the scenario of a “station blackout” wherein offsite power and onsite power are lost, thereby challenging plant safety systems. We describe the RISMC approach, illustrate the station blackout modeling, and contrast this with traditional risk analysis modeling for this type of accident scenario.

  16. Impacts of potential seismic landslides on lifeline corridors.

    DOT National Transportation Integrated Search

    2015-02-01

    This report presents a fully probabilistic method for regional seismically induced landslide hazard analysis and : mapping. The method considers the most current predictions for strong ground motions and seismic sources : through use of the U.S.G.S. ...

  17. How to Produce a Reactor Neutron Spectrum Using a Proton Accelerator

    DOE PAGES

    Burns, Kimberly A.; Wootan, David W.; Gates, Robert O.; ...

    2015-06-18

    A method for reproducing the neutron energy spectrum present in the core of an operating nuclear reactor using an engineered target in an accelerator proton beam is proposed. The protons interact with a target to create neutrons through various (p,n) type reactions. Spectral tailoring of the emitted neutrons can be used to modify the energy of the generated neutron spectrum to represent various reactor spectra. Through the use of moderators and reflectors, the neutron spectrum can be modified to reproduce many different spectra of interest, including spectra in small thermal test reactors, large pressurized water reactors, and fast reactors. The particular application of this methodology is the design of an experimental approach for using an accelerator to measure the betas produced during fission, to be used to reduce uncertainties in the interpretation of reactor antineutrino measurements. This approach involves using a proton accelerator to produce a neutron field representative of a power reactor, and using this neutron field to irradiate fission foils of the primary isotopes contributing to fission in the reactor, creating unstable, neutron-rich fission products that subsequently beta decay and emit electron antineutrinos. A major advantage of an accelerator neutron source over a neutron beam from a thermal reactor is that the fast neutrons can be slowed down or tailored to approximate various power reactor spectra. An accelerator-based neutron source that can be tailored to match various reactor neutron spectra provides an advantage for control in studying how changes in the neutron spectra affect parameters such as the resulting fission product beta spectrum.

  18. Preparation of dilute magnetic semiconductor films by metalorganic chemical vapor deposition

    NASA Technical Reports Server (NTRS)

    Nouhi, Akbar (Inventor); Stirn, Richard J. (Inventor)

    1988-01-01

    A method for preparation of a dilute magnetic semiconductor (DMS) film is provided, in which a Group II metal source, a Group VI metal source and a transition metal magnetic ion source are pyrolyzed in the reactor of a metalorganic chemical vapor deposition (MOCVD) system by contact with a heated substrate. As an example, the preparation of films of Cd(1-x)Mn(x)Te, in which 0 ≤ x ≤ 0.7, on suitable substrates (e.g., GaAs) is described. As a source of manganese, tricarbonyl (methylcyclopentadienyl) manganese (TCPMn) is employed. To prevent TCPMn condensation during its introduction into the reactor, the gas lines, valves and reactor tubes are heated. A thin-film solar cell of n-i-p structure, in which the i-type layer comprises a DMS, is also described; the i-type layer is suitably prepared by MOCVD.

  19. Preparation of dilute magnetic semiconductor films by metalorganic chemical vapor deposition

    NASA Technical Reports Server (NTRS)

    Nouhi, Akbar (Inventor); Stirn, Richard J. (Inventor)

    1990-01-01

    A method for preparation of a dilute magnetic semiconductor (DMS) film is provided, wherein a Group II metal source, a Group VI metal source and a transition metal magnetic ion source are pyrolyzed in the reactor of a metalorganic chemical vapor deposition (MOCVD) system by contact with a heated substrate. As an example, the preparation of films of Cd(1-x)Mn(x)Te, wherein 0 ≤ x ≤ 0.7, on suitable substrates (e.g., GaAs) is described. As a source of manganese, tricarbonyl (methylcyclopentadienyl) manganese (TCPMn) is employed. To prevent TCPMn condensation during its introduction into the reactor, the gas lines, valves and reactor tubes are heated. A thin-film solar cell of n-i-p structure, wherein the i-type layer comprises a DMS, is also described; the i-type layer is suitably prepared by MOCVD.

  20. Analysis of feature selection with Probabilistic Neural Network (PNN) to classify sources influencing indoor air quality

    NASA Astrophysics Data System (ADS)

    Saad, S. M.; Shakaff, A. Y. M.; Saad, A. R. M.; Yusof, A. M.; Andrew, A. M.; Zakaria, A.; Adom, A. H.

    2017-03-01

    There are various sources influencing indoor air quality (IAQ) which can emit dangerous gases such as carbon monoxide (CO), carbon dioxide (CO2), ozone (O3) and particulate matter. These gases are usually safe to breathe if they are emitted in safe quantities, but if their amounts exceed the safe level they can be hazardous to human beings, especially children and people with asthma. Therefore, a smart indoor air quality monitoring system (IAQMS) is needed that is able to tell the occupants which sources trigger the indoor air pollution. In this project, an IAQMS that is able to classify sources influencing IAQ has been developed. This IAQMS applies a classification method based on a Probabilistic Neural Network (PNN). It is used to classify the sources of indoor air pollution based on five conditions: ambient air, human activity, presence of chemical products, presence of food and beverage, and presence of fragrance. In order to obtain the best classification accuracy, an analysis of several feature selections based on data pre-processing methods is done to discriminate among the sources. The output from each data pre-processing method has been used as the input for the neural network. The result shows that PNN analysis with the data pre-processing method gives a good classification accuracy of 99.89% and is able to classify the sources influencing IAQ at a high classification rate.
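
    A Probabilistic Neural Network is essentially a Parzen-window classifier: each class score is a sum of Gaussian kernels centered on that class's training patterns, and the decision layer picks the largest score. The sketch below uses synthetic "sensor" features and the five source conditions listed above purely for illustration; it is not the authors' implementation.

    ```python
    # Hedged sketch of a Probabilistic Neural Network (Parzen-window) classifier.
    import numpy as np

    rng = np.random.default_rng(21)
    classes = ["ambient air", "human activity", "chemical products", "food and beverage", "fragrance"]

    # Synthetic training data: 40 samples per class, 4 gas-sensor features per sample
    centers = rng.uniform(0.0, 5.0, size=(len(classes), 4))
    X_train = np.vstack([c + rng.normal(0, 0.3, size=(40, 4)) for c in centers])
    y_train = np.repeat(np.arange(len(classes)), 40)

    def pnn_predict(x, X, y, sigma=0.3):
        """Assign x to the class with the largest summed Gaussian kernel response."""
        scores = []
        for k in np.unique(y):
            d2 = np.sum((X[y == k] - x) ** 2, axis=1)
            scores.append(np.mean(np.exp(-d2 / (2.0 * sigma ** 2))))   # pattern + summation layers
        return int(np.argmax(scores))                                  # decision layer

    x_new = centers[2] + rng.normal(0, 0.3, 4)        # a reading near the "chemical products" class
    print(classes[pnn_predict(x_new, X_train, y_train)])
    ```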

  1. Strengthening the fission reactor nuclear science and engineering program at UCLA. Final technical report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Okrent, D.

    1997-06-23

    This is the final report on DOE Award No. DE-FG03-92ER75838 A000, a three-year matching grant program with Pacific Gas and Electric Company (PG and E) to support strengthening of the fission reactor nuclear science and engineering program at UCLA. The program began on September 30, 1992. The program has enabled UCLA to use its strong existing background to train students in technological problems which are simultaneously of interest to the industry and of specific interest to PG and E. The program included undergraduate scholarships, graduate traineeships and distinguished lecturers. Four topics were selected for research in the first year, with the benefit of active collaboration with personnel from PG and E. These topics remained the same during the second year of this program. During the third year, two topics ended with the departure of the students involved (reflux cooling in a PWR during a shutdown, and erosion/corrosion of carbon steel piping). Two new topics (long-term risk and fuel relocation within the reactor vessel) were added; hence, the topics during the third award year were the following: reflux condensation and the effect of non-condensable gases; erosion/corrosion of carbon steel piping; use of artificial intelligence in severe accident diagnosis for PWRs (diagnosis of plant status during a PWR station blackout scenario); the influence of organization and management quality on risk; considerations of long-term risk from the disposal of hazardous wastes; and a probabilistic treatment of fuel motion and fuel relocation within the reactor vessel during a severe core damage accident.

  2. Development of a First-of-a-Kind Deterministic Decision-Making Tool for Supervisory Control System

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cetiner, Sacit M; Kisner, Roger A; Muhlheim, Michael David

    2015-07-01

    Decision-making is the process of identifying and choosing alternatives where each alternative offers a different approach or path to move from a given state or condition to a desired state or condition. The generation of consistent decisions requires that a structured, coherent process be defined, immediately leading to a decision-making framework. The overall objective of the generalized framework is for it to be adopted into an autonomous decision-making framework and tailored to specific requirements for various applications. In this context, automation is the use of computing resources to make decisions and implement a structured decision-making process with limited or no human intervention. The overriding goal of automation is to replace or supplement human decision makers with reconfigurable decision-making modules that can perform a given set of tasks reliably. Risk-informed decision making requires a probabilistic assessment of the likelihood of success given the status of the plant/systems and component health, and a deterministic assessment between plant operating parameters and reactor protection parameters to prevent unnecessary trips and challenges to plant safety systems. The implementation of the probabilistic portion of the decision-making engine of the proposed supervisory control system was detailed in previous milestone reports. Once the control options are identified and ranked based on the likelihood of success, the supervisory control system transmits the options to the deterministic portion of the platform. The deterministic multi-attribute decision-making framework uses variable sensor data (e.g., outlet temperature) and calculates where it is within the challenge state, its trajectory, and margin within the controllable domain, using utility functions to evaluate current and projected plant state space for different control decisions. Metrics to be evaluated include stability, cost, time to complete (action), power level, etc. The integration of deterministic calculations using multi-physics analyses (i.e., neutronics, thermal, and thermal-hydraulics) and probabilistic safety calculations allows for the examination and quantification of margin recovery strategies. This also provides validation of the control options identified from the probabilistic assessment. Thus, the thermal-hydraulics analyses are used to validate the control options identified from the probabilistic assessment. Future work includes evaluating other possible metrics and computational efficiencies.
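
    The deterministic multi-attribute step can be illustrated as a weighted sum of per-metric utility functions scoring each candidate control option. The metrics, weights and utility shapes in the sketch below are invented placeholders, not the values used by the supervisory control system described above.

    ```python
    # Hedged sketch of multi-attribute utility ranking of candidate control options.
    import numpy as np

    def utility(value, worst, best):
        """Map a raw metric value onto [0, 1]; higher is better."""
        return np.clip((value - worst) / (best - worst), 0.0, 1.0)

    # Candidate options with projected (margin to trip [degC], cost [$k], time to complete [min])
    options = {
        "reduce power 10%":   (35.0, 120.0, 15.0),
        "start standby pump": (25.0,  20.0,  5.0),
        "no action":          (10.0,   0.0,  0.0),
    }
    weights = np.array([0.6, 0.2, 0.2])     # relative importance of margin, cost, time

    def score(margin, cost, minutes):
        u = np.array([
            utility(margin, worst=0.0, best=50.0),      # larger thermal margin is better
            utility(-cost, worst=-200.0, best=0.0),     # cheaper is better
            utility(-minutes, worst=-30.0, best=0.0),   # faster is better
        ])
        return float(weights @ u)

    ranked = sorted(options, key=lambda k: score(*options[k]), reverse=True)
    print(ranked)   # highest-utility control option first
    ```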

  3. Sodium leak detection system for liquid metal cooled nuclear reactors

    DOEpatents

    Modarres, Dariush

    1991-01-01

    A light source is projected across the gap between the containment vessel and the reactor vessel. The reflected light is then analyzed with an absorption spectrometer. The presence of any sodium vapor along the optical path results in a change of the optical transmissivity of the medium. Since the absorption spectrum of sodium is well known, the light source is chosen such that the sensor is responsive only to the presence of sodium molecules. The optical sensor is designed to be small and to require a minimum amount of change to the reactor containment vessel.

  4. Abstract probabilistic CNOT gate model based on double encoding: study of the errors and physical realizability

    NASA Astrophysics Data System (ADS)

    Gueddana, Amor; Attia, Moez; Chatta, Rihab

    2015-03-01

    In this work, we study the error sources behind the imperfect linear optical quantum components composing a non-deterministic quantum CNOT gate model, which performs the CNOT function with a success probability of 4/27 and uses a double encoding technique to represent photonic qubits at the control and the target. We generalize this model to an abstract probabilistic CNOT version and determine the realizability limits depending on a realistic range of the errors. Finally, we discuss physical constraints allowing the implementation of the Asymmetric Partially Polarizing Beam Splitter (APPBS), which is at the heart of correctly realizing the CNOT function.

  5. Synfuel production in nuclear reactors

    DOEpatents

    Henning, C.D.

    Apparatus and method for producing synthetic fuels and synthetic fuel components by using a neutron source as the energy source, such as a fusion reactor. Neutron absorbers are disposed inside a reaction pipe and are heated by capturing neutrons from the neutron source. Synthetic fuel feedstock is then placed into contact with the heated neutron absorbers. The feedstock is heated and dissociates into its constituent synfuel components, or alternatively is at least preheated sufficiently to use in a subsequent electrolysis process to produce synthetic fuels and synthetic fuel components.

  6. High aspect ratio catalytic reactor and catalyst inserts therefor

    DOEpatents

    Lin, Jiefeng; Kelly, Sean M.

    2018-04-10

    The present invention relates to a highly efficient tubular catalytic steam reforming reactor configured from a high temperature metal alloy tube or pipe of about 0.2 inch to about 2 inch inside diameter and loaded with a plurality of rolled catalyst inserts comprising metallic monoliths. The catalyst insert substrate is formed from a single metal foil without a central supporting structure, in the form of a spiral monolith. The single metal foil is treated to have 3-dimensional surface features that provide mechanical support and establish open gas channels between each of the rolled layers. This unique geometry accelerates gas mixing and heat transfer and provides a high catalytic active surface area. The small diameter, high aspect ratio tubular catalytic steam reforming reactors loaded with rolled catalyst inserts can be arranged in a multi-pass non-vertical parallel configuration thermally coupled with a heat source to carry out steam reforming of hydrocarbon-containing feeds. The rolled catalyst inserts are self-supported on the reactor wall and enable efficient heat transfer from the reactor wall to the reactor interior, with a lower pressure drop than known particulate catalysts. The heat source can be oxygen transport membrane reactors.

  7. Probabilistic Description of the Hydrologic Risk in Agriculture

    NASA Astrophysics Data System (ADS)

    Vico, G.; Porporato, A. M.

    2011-12-01

    Supplemental irrigation represents one of the main strategies to mitigate the effects of climatic variability on agroecosystem productivity and profitability, at the expense of increasing water requirements for irrigation purposes. Optimizing water allocation for crop yield preservation and sustainable development needs to account for hydro-climatic variability, which is by far the main source of uncertainty affecting crop yields and irrigation water requirements. In this contribution, a widely applicable probabilistic framework is proposed to quantitatively define the hydrologic risk of yield reduction for both rainfed and irrigated agriculture. The occurrence of rainfall events and irrigation applications are linked probabilistically to crop development during the growing season. Based on these linkages, long-term and real-time yield reduction risk indices are defined as a function of climate, soil and crop parameters, as well as irrigation strategy. The former risk index is suitable for long-term irrigation strategy assessment and investment planning, while the latter provides a rigorous probabilistic quantification of the emergence of drought conditions during a single growing season. This probabilistic framework also allows assessing the impact of limited water availability on crop yield, thus guiding the optimal allocation of water resources for human and environmental needs. Our approach employs relatively few parameters and is thus easily and broadly applicable to different crops and sites, under current and future climate scenarios, facilitating the assessment of the impact of increasingly frequent water shortages on agricultural productivity, profitability, and sustainability.

  8. Risk-Based Treatment Targets for Onsite Non-Potable Water Reuse

    EPA Science Inventory

    This presentation presents risk-based enteric pathogen log reduction targets for non-potable and potable uses of a variety of alternative source waters (i.e., municipal wastewater, locally-collected greywater, rainwater, and stormwater). A probabilistic, forward Quantitative Micr...

  9. Reactor monitoring using antineutrino detectors

    NASA Astrophysics Data System (ADS)

    Bowden, N. S.

    2011-08-01

    Nuclear reactors have served as the antineutrino source for many fundamental physics experiments. The techniques developed by these experiments make it possible to use these weakly interacting particles for a practical purpose. The large flux of antineutrinos that leaves a reactor carries information about two quantities of interest for safeguards: the reactor power and fissile inventory. Measurements made with antineutrino detectors could therefore offer an alternative means for verifying the power history and fissile inventory of a reactor as part of International Atomic Energy Agency (IAEA) and/or other reactor safeguards regimes. Several efforts to develop this monitoring technique are underway worldwide.

  10. Nuclear fuel requirements for the American economy - A model

    NASA Astrophysics Data System (ADS)

    Curtis, Thomas Dexter

    A model is provided to determine the amounts of various fuel streams required to supply energy from planned and projected nuclear plant operations, including new builds. Flexible, user-defined scenarios can be constructed with respect to energy requirements, choices of reactors and choices of fuels. The model includes interactive effects and extends through 2099. Outputs include energy provided by reactors, the number of reactors, and masses of natural Uranium and other fuels used. Energy demand, including electricity and hydrogen, is obtained from US DOE historical data and projections, along with other studies of potential hydrogen demand. An option to include other energy demand to nuclear power is included. Reactor types modeled include (thermal reactors) PWRs, BWRs and MHRs and (fast reactors) GFRs and SFRs. The MHRs (VHTRs), GFRs and SFRs are similar to those described in the 2002 DOE "Roadmap for Generation IV Nuclear Energy Systems." Fuel source choices include natural Uranium, self-recycled spent fuel, Plutonium from breeder reactors and existing stockpiles of surplus HEU, military Plutonium, LWR spent fuel and depleted Uranium. Other reactors and fuel sources can be added to the model. Fidelity checks of the model's results indicate good agreement with historical Uranium use and number of reactors, and with DOE projections. The model supports conclusions that substantial use of natural Uranium will likely continue to the end of the 21st century, though legacy spent fuel and depleted uranium could easily supply all nuclear energy demand by shifting to predominant use of fast reactors.

  11. Towards a robust framework for Probabilistic Tsunami Hazard Assessment (PTHA) for local and regional tsunami in New Zealand

    NASA Astrophysics Data System (ADS)

    Mueller, Christof; Power, William; Fraser, Stuart; Wang, Xiaoming

    2013-04-01

    Probabilistic Tsunami Hazard Assessment (PTHA) is conceptually closely related to Probabilistic Seismic Hazard Assessment (PSHA). The main difference is that PTHA needs to simulate propagation of tsunami waves through the ocean and cannot rely on attenuation relationships, which makes PTHA computationally more expensive. The wave propagation process can be assumed to be linear as long as water depth is much larger than the wave amplitude of the tsunami. Beyond this limit a non-linear scheme has to be employed with significantly higher algorithmic run times. PTHA considering far-field tsunami sources typically uses unit source simulations, and relies on the linearity of the process by later scaling and combining the wave fields of individual simulations to represent the intended earthquake magnitude and rupture area. Probabilistic assessments are typically made for locations offshore but close to the coast. Inundation is calculated only for significantly contributing events (de-aggregation). For local and regional tsunami it has been demonstrated that earthquake rupture complexity has a significant effect on the tsunami amplitude distribution offshore and also on inundation. In this case PTHA has to take variable slip distributions and non-linearity into account. A unit source approach cannot easily be applied. Rupture complexity is seen as an aleatory uncertainty and can be incorporated directly into the rate calculation. We have developed a framework that manages the large number of simulations required for local PTHA. As an initial case study the effect of rupture complexity on tsunami inundation and the statistics of the distribution of wave heights have been investigated for plate-interface earthquakes in the Hawke's Bay region in New Zealand. Assessing the probability that water levels will be in excess of a certain threshold requires the calculation of empirical cumulative distribution functions (ECDF). We compare our results with traditional estimates for tsunami inundation simulations that do not consider rupture complexity. De-aggregation based on moment magnitude alone might not be appropriate, because the hazard posed by any individual event can be underestimated locally if rupture complexity is ignored.
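
    As a minimal illustration of the ECDF-based exceedance calculation mentioned above, the sketch below uses a made-up set of simulated maximum wave heights at a single site; it is not the framework described in the abstract.

      import numpy as np

      # Synthetic maximum wave heights (m) at one coastal site from an ensemble of
      # rupture-complexity scenarios; the values are made up for illustration.
      wave_heights = np.array([0.4, 1.2, 0.8, 2.5, 0.6, 1.9, 3.1, 0.9, 1.4, 2.2])

      def exceedance_probability(samples, threshold):
          """Empirical (ECDF-based) probability that wave height exceeds the threshold."""
          return np.mean(np.asarray(samples) > threshold)

      print(exceedance_probability(wave_heights, 2.0))  # fraction of scenarios above 2 m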

  12. Internal validation of STRmix™ for the interpretation of single source and mixed DNA profiles.

    PubMed

    Moretti, Tamyra R; Just, Rebecca S; Kehl, Susannah C; Willis, Leah E; Buckleton, John S; Bright, Jo-Anne; Taylor, Duncan A; Onorato, Anthony J

    2017-07-01

    The interpretation of DNA evidence can entail analysis of challenging STR typing results. Genotypes inferred from low quality or quantity specimens, or mixed DNA samples originating from multiple contributors, can result in weak or inconclusive match probabilities when a binary interpretation method and necessary thresholds (such as a stochastic threshold) are employed. Probabilistic genotyping approaches, such as fully continuous methods that incorporate empirically determined biological parameter models, enable usage of more of the profile information and reduce subjectivity in interpretation. As a result, software-based probabilistic analyses tend to produce more consistent and more informative results regarding potential contributors to DNA evidence. Studies to assess and internally validate the probabilistic genotyping software STRmix™ for casework usage at the Federal Bureau of Investigation Laboratory were conducted using lab-specific parameters and more than 300 single-source and mixed contributor profiles. Simulated forensic specimens, including constructed mixtures that included DNA from two to five donors across a broad range of template amounts and contributor proportions, were used to examine the sensitivity and specificity of the system via more than 60,000 tests comparing hundreds of known contributors and non-contributors to the specimens. Conditioned analyses, concurrent interpretation of amplification replicates, and application of an incorrect contributor number were also performed to further investigate software performance and probe the limitations of the system. In addition, the results from manual and probabilistic interpretation of both prepared and evidentiary mixtures were compared. The findings support that STRmix™ is sufficiently robust for implementation in forensic laboratories, offering numerous advantages over historical methods of DNA profile analysis and greater statistical power for the estimation of evidentiary weight, and can be used reliably in human identification testing. With few exceptions, likelihood ratio results reflected intuitively correct estimates of the weight of the genotype possibilities and known contributor genotypes. This comprehensive evaluation provides a model in accordance with SWGDAM recommendations for internal validation of a probabilistic genotyping system for DNA evidence interpretation. Copyright © 2017. Published by Elsevier B.V.

  13. Comparison of irradiation behaviour of HTR graphite grades

    NASA Astrophysics Data System (ADS)

    Heijna, M. C. R.; de Groot, S.; Vreeling, J. A.

    2017-08-01

    The INNOGRAPH irradiations were executed in the High Flux Reactor (HFR) in Petten by NRG, supported by the European Framework programs HTR-M, RAPHAEL, and ARCHER, to generate data on the irradiation behaviour of the graphite grades available at that time for High Temperature Reactor (HTR) application. Samples of the graphite grades NBG-10, NBG-17, NBG-18, NBG-20, NBG-25, PCEA, PPEA, PCIB, and IG-110 have been irradiated at 750 °C and 950 °C. Because of the inherent scatter induced by the probabilistic material behaviour of graphite, uncertainty and scatter induced by test conditions and post-irradiation examination have to be minimized. The INNOGRAPH irradiations supplied an adequate number of irradiated samples to enable accurate determination of material properties and their evolution under irradiation. This allows comparison of different graphite grades and a qualitative assessment of their appropriateness for HTR applications, as a basis for grade selection, design, and core component lifetime assessment. The results indicate that coarse grained graphite grades exhibit more favourable behaviour for application in HTRs due to their low dimensional anisotropy and fracture propagation resilience.

  14. Probabilistic Tsunami Hazard Assessment: the Seaside, Oregon Pilot Study

    NASA Astrophysics Data System (ADS)

    Gonzalez, F. I.; Geist, E. L.; Synolakis, C.; Titov, V. V.

    2004-12-01

    A pilot study of Seaside, Oregon is underway, to develop methodologies for probabilistic tsunami hazard assessments that can be incorporated into Flood Insurance Rate Maps (FIRMs) developed by FEMA's National Flood Insurance Program (NFIP). Current NFIP guidelines for tsunami hazard assessment rely on the science, technology and methodologies developed in the 1970s; although generally regarded as groundbreaking and state-of-the-art for its time, this approach is now superseded by modern methods that reflect substantial advances in tsunami research achieved in the last two decades. In particular, post-1990 technical advances include: improvements in tsunami source specification; improved tsunami inundation models; better computational grids by virtue of improved bathymetric and topographic databases; a larger database of long-term paleoseismic and paleotsunami records and short-term, historical earthquake and tsunami records that can be exploited to develop improved probabilistic methodologies; better understanding of earthquake recurrence and probability models. The NOAA-led U.S. National Tsunami Hazard Mitigation Program (NTHMP), in partnership with FEMA, USGS, NSF and Emergency Management and Geotechnical agencies of the five Pacific States, incorporates these advances into site-specific tsunami hazard assessments for coastal communities in Alaska, California, Hawaii, Oregon and Washington. NTHMP hazard assessment efforts currently focus on developing deterministic, "credible worst-case" scenarios that provide valuable guidance for hazard mitigation and emergency management. The NFIP focus, on the other hand, is on actuarial needs that require probabilistic hazard assessments such as those that characterize 100- and 500-year flooding events. There are clearly overlaps in NFIP and NTHMP objectives. NTHMP worst-case scenario assessments that include an estimated probability of occurrence could benefit the NFIP; NFIP probabilistic assessments of 100- and 500-yr events could benefit the NTHMP. The joint NFIP/NTHMP pilot study at Seaside, Oregon is organized into three closely related components: Probabilistic, Modeling, and Impact studies. Probabilistic studies (Geist, et al., this session) are led by the USGS and include the specification of near- and far-field seismic tsunami sources and their associated probabilities. Modeling studies (Titov, et al., this session) are led by NOAA and include the development and testing of a Seaside tsunami inundation model and an associated database of computed wave height and flow velocity fields. Impact studies (Synolakis, et al., this session) are led by USC and include the computation and analyses of indices for the categorization of hazard zones. The results of each component study will be integrated to produce a Seaside tsunami hazard map. This presentation will provide a brief overview of the project and an update on progress, while the above-referenced companion presentations will provide details on the methods used and the preliminary results obtained by each project component.

  15. Un regard international sur la sécurité nucléaire

    NASA Astrophysics Data System (ADS)

    Birkhofer, Adolf

    2002-10-01

    Safety has always been an important objective in nuclear technology. Starting with a set of sound physical principles and prudent design approaches, safety concepts have gradually been refined and cover now a wide range of provisions related to design, quality and operation. Research, the evaluation of operating experiences and probabilistic risk assessments constitute an essential basis and international co-operation plays a significant role in that context. Concerning future developments a major objective for new reactor concepts, such as the EPR, is to practically exclude a severe core damage accident with large scale consequences outside the plant. To cite this article: A. Birkhofer, C. R. Physique 3 (2002) 1059-1065.

  16. Upgrading the fuel-handling machine of the Novovoronezh nuclear power plant unit no. 5

    NASA Astrophysics Data System (ADS)

    Terekhov, D. V.; Dunaev, V. I.

    2014-02-01

    The calculation of safety parameters was carried out in the process of upgrading the fuel-handling machine (FHM) of the Novovoronezh nuclear power plant (NPP) unit no. 5 based on the results of quantitative safety analysis of nuclear fuel transfer operations using a dynamic logical-and-probabilistic model of the processing procedure. Specific engineering and design concepts that made it possible to reduce the probability of damaging the fuel assemblies (FAs) when performing various technological operations by an order of magnitude and introduce more flexible algorithms into the modernized FHM control system were developed. The results of pilot operation during two refueling campaigns prove that the total reactor shutdown time is lowered.

  17. 10 CFR 2.103 - Action on applications for byproduct, source, special nuclear material, facility and operator...

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... nuclear material, facility and operator licenses. (a) If the Director, Office of Nuclear Reactor... repository operations area under parts 60 or 63 of this chapter, the Director, Office of Nuclear Reactor Regulation, Director, Office of New Reactors, Director, Office of Nuclear Material Safety and Safeguards, or...

  18. 10 CFR 2.103 - Action on applications for byproduct, source, special nuclear material, facility and operator...

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... nuclear material, facility and operator licenses. (a) If the Director, Office of Nuclear Reactor... repository operations area under parts 60 or 63 of this chapter, the Director, Office of Nuclear Reactor Regulation, Director, Office of New Reactors, Director, Office of Nuclear Material Safety and Safeguards, or...

  19. Probabilistic analysis of tsunami hazards

    USGS Publications Warehouse

    Geist, E.L.; Parsons, T.

    2006-01-01

    Determining the likelihood of a disaster is a key component of any comprehensive hazard assessment. This is particularly true for tsunamis, even though most tsunami hazard assessments have in the past relied on scenario or deterministic type models. We discuss probabilistic tsunami hazard analysis (PTHA) from the standpoint of integrating computational methods with empirical analysis of past tsunami runup. PTHA is derived from probabilistic seismic hazard analysis (PSHA), with the main difference being that PTHA must account for far-field sources. The computational methods rely on numerical tsunami propagation models rather than empirical attenuation relationships as in PSHA in determining ground motions. Because a number of source parameters affect local tsunami runup height, PTHA can become complex and computationally intensive. Empirical analysis can function in one of two ways, depending on the length and completeness of the tsunami catalog. For site-specific studies where there is sufficient tsunami runup data available, hazard curves can primarily be derived from empirical analysis, with computational methods used to highlight deficiencies in the tsunami catalog. For region-wide analyses and sites where there are little to no tsunami data, a computationally based method such as Monte Carlo simulation is the primary method to establish tsunami hazards. Two case studies that describe how computational and empirical methods can be integrated are presented for Acapulco, Mexico (site-specific) and the U.S. Pacific Northwest coastline (region-wide analysis).
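
    As a rough sketch of the Monte Carlo approach mentioned above, the example below samples a synthetic earthquake catalogue and builds hazard-curve ordinates; the occurrence rates, magnitude bounds, and especially the magnitude-to-runup relation are placeholders, not a calibrated model from the paper.

      import numpy as np

      rng = np.random.default_rng(0)

      # Monte Carlo hazard-curve sketch; rates, magnitude bounds, and the
      # magnitude-to-runup relation are placeholders, not a calibrated model.
      n_years = 100_000          # length of the simulated catalogue (yr)
      annual_rate = 0.05         # rate of tsunamigenic earthquakes (events/yr)
      b_value = 1.0              # Gutenberg-Richter b-value
      m_min, m_max = 7.0, 9.0

      n_events = rng.poisson(annual_rate * n_years)
      beta = b_value * np.log(10)
      u = rng.random(n_events)
      # Truncated-exponential (Gutenberg-Richter) magnitudes.
      mags = m_min - np.log(1 - u * (1 - np.exp(-beta * (m_max - m_min)))) / beta

      # Placeholder source-to-runup relation with lognormal scatter.
      runup = 0.5 * np.exp(1.2 * (mags - 7.0)) * rng.lognormal(0.0, 0.5, n_events)

      thresholds = [0.5, 1.0, 2.0, 5.0]                    # runup levels (m)
      annual_exceedance = {h: (runup > h).sum() / n_years for h in thresholds}
      print(annual_exceedance)                             # hazard-curve ordinates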

  20. Comparing the new generation accelerator driven subcritical reactor system (ADS) to traditional critical reactors

    NASA Astrophysics Data System (ADS)

    Kemah, Elif; Akkaya, Recep; Tokgöz, Seyit Rıza

    2017-02-01

    In recent years, accelerator driven subcritical reactors have attracted great interest worldwide. The Accelerator Driven System (ADS) produces neutrons in a subcritical core by means of an external proton beam source. These reactors, which are hybrid systems, are important for the production of clean and safe energy and for the conversion of radioactive waste. With the selection of reliable and robust target materials, the ADS has become a new generation of fission reactor. In addition, ADS reactors can address the problems of long-lived radioactive fission products and waste actinides encountered in the fission process by incinerating them, and the ADS has brought thorium to the forefront as a fuel for these reactors.

  1. Evaluation of Enhanced Risk Monitors for Use on Advanced Reactors

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ramuhalli, Pradeep; Veeramany, Arun; Bonebrake, Christopher A.

    This study provides an overview of the methodology for integrating time-dependent failure probabilities into nuclear power reactor risk monitors. This prototypic enhanced risk monitor (ERM) methodology was evaluated using a hypothetical probabilistic risk assessment (PRA) model, generated using a simplified design of a liquid-metal-cooled advanced reactor (AR). Component failure data from industry compilations of failures of components similar to those in the simplified AR model were used to initialize the PRA model. Core damage frequency (CDF) over time was computed and analyzed. In addition, a study on alternative risk metrics for ARs was conducted. Risk metrics that quantify the normalized cost of repairs, replacements, or other operations and management (O&M) actions were defined and used, along with an economic model, to compute the likely economic risk of future actions such as deferred maintenance based on the anticipated change in CDF due to current component condition and future anticipated degradation. Such integration of conventional risk metrics with alternative risk metrics provides a convenient mechanism for assessing the impact of O&M decisions on safety and economics of the plant. It is expected that, when integrated with supervisory control algorithms, such integrated risk monitors will provide a mechanism for real-time control decision-making that ensures safety margins are maintained while operating the plant in an economically viable manner.
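
    A toy sketch of the kind of calculation implied above, in which a change in CDF from deferred maintenance is translated into an expected economic risk; the cost model and all numbers are assumptions, not values from the study.

      # Toy integration of a conventional risk metric (CDF) with an economic metric.
      # All numbers and the cost model are assumptions, not values from the study.
      baseline_cdf = 1.0e-6         # core damage frequency, events per reactor-year
      degraded_cdf = 2.5e-6         # CDF with a degraded component left in service
      deferred_years = 1.0          # how long the maintenance would be deferred
      repair_cost = 2.0e5           # cost of replacing the component now ($)
      core_damage_cost = 5.0e9      # notional consequence cost of a core damage event ($)

      # Expected incremental economic risk of deferring the maintenance.
      delta_risk = (degraded_cdf - baseline_cdf) * deferred_years * core_damage_cost
      print(f"expected risk increase ${delta_risk:,.0f} vs. repair cost ${repair_cost:,.0f}")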

  2. A comprehensive Probabilistic Tsunami Hazard Assessment for the city of Naples (Italy)

    NASA Astrophysics Data System (ADS)

    Anita, G.; Tonini, R.; Selva, J.; Sandri, L.; Pierdominici, S.; Faenza, L.; Zaccarelli, L.

    2012-12-01

    A comprehensive Probabilistic Tsunami Hazard Assessment (PTHA) should consider different tsunamigenic sources (seismic events, slide failures, volcanic eruptions) to calculate the hazard at given target sites. This implies a multi-disciplinary analysis of all natural tsunamigenic sources, in a multi-hazard/risk framework, which also considers the effects of interaction/cascade events. Our approach shows the ongoing effort to analyze the comprehensive PTHA for the city of Naples (Italy), including all types of sources located in the Tyrrhenian Sea, as developed within the Italian project ByMuR (Bayesian Multi-Risk Assessment). The project combines a multi-hazard/risk approach to treat the interactions among different hazards, and a Bayesian approach to handle the uncertainties. The natural potential tsunamigenic sources analyzed are: 1) submarine seismic sources located on active faults in the Tyrrhenian Sea and close to the Southern Italian shore line (we also consider the effects of inshore seismic sources and the associated active faults, for which we provide rupture properties), 2) mass failures and collapses around the target area (spatially identified on the basis of their propensity to failure), and 3) volcanic sources mainly identified by pyroclastic flows and collapses from the volcanoes in the Neapolitan area (Vesuvius, Campi Flegrei and Ischia). All these natural sources are preliminarily analyzed and combined here, in order to provide a complete picture of a PTHA for the city of Naples. In addition, the treatment of interaction/cascade effects is formally discussed in the case of significant temporary variations in the short-term PTHA due to an earthquake.

  3. Reactors are indispensable for radioisotope production.

    PubMed

    Mushtaq, Ahmad

    2010-12-01

    Radioisotopes can be produced by reactors and accelerators. For certain isotopes there could be an advantage to a certain production method. However, many reports nowadays suggest that useful isotopes needed in medicine, industry and research could be produced efficiently and that dependence on reactors using enriched U-235 may be eliminated. In my view, reactors and accelerators will continue to play their roles side by side in the supply of suitable and economical sources of isotopes.

  4. Variational approach to probabilistic finite elements

    NASA Technical Reports Server (NTRS)

    Belytschko, T.; Liu, W. K.; Mani, A.; Besterfield, G.

    1991-01-01

    Probabilistic finite element methods (PFEM), synthesizing the power of finite element methods with second-moment techniques, are formulated for various classes of problems in structural and solid mechanics. Time-invariant random materials, geometric properties and loads are incorporated in terms of their fundamental statistics viz. second-moments. Analogous to the discretization of the displacement field in finite element methods, the random fields are also discretized. Preserving the conceptual simplicity, the response moments are calculated with minimal computations. By incorporating certain computational techniques, these methods are shown to be capable of handling large systems with many sources of uncertainties. By construction, these methods are applicable when the scale of randomness is not very large and when the probabilistic density functions have decaying tails. The accuracy and efficiency of these methods, along with their limitations, are demonstrated by various applications. Results obtained are compared with those of Monte Carlo simulation and it is shown that good accuracy can be obtained for both linear and nonlinear problems. The methods are amenable to implementation in deterministic FEM based computer codes.
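
    A minimal sketch of the second-moment idea underlying PFEM, here reduced to a single random input and a scalar response (an axially loaded bar); the example and numbers are illustrative, not the authors' formulation.

      import numpy as np

      # First-order second-moment (FOSM) propagation for a single random input:
      # tip displacement of an axially loaded bar, u = P*L/(E*A), with random E.
      P, L, A = 1.0e4, 2.0, 1.0e-3           # load (N), length (m), area (m^2)
      E_mean, E_std = 200e9, 20e9            # Young's modulus statistics (Pa)

      u = lambda E: P * L / (E * A)
      du_dE = -P * L / (E_mean ** 2 * A)     # sensitivity evaluated at the mean

      u_mean = u(E_mean)                     # first-order mean of the response
      u_std = abs(du_dE) * E_std             # second-moment (first-order) std. dev.

      # Monte Carlo check of the same statistics.
      samples = u(np.random.default_rng(1).normal(E_mean, E_std, 100_000))
      print(u_mean, u_std, samples.mean(), samples.std())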

  5. Variational approach to probabilistic finite elements

    NASA Astrophysics Data System (ADS)

    Belytschko, T.; Liu, W. K.; Mani, A.; Besterfield, G.

    1991-08-01

    Probabilistic finite element methods (PFEM), synthesizing the power of finite element methods with second-moment techniques, are formulated for various classes of problems in structural and solid mechanics. Time-invariant random materials, geometric properties and loads are incorporated in terms of their fundamental statistics viz. second-moments. Analogous to the discretization of the displacement field in finite element methods, the random fields are also discretized. Preserving the conceptual simplicity, the response moments are calculated with minimal computations. By incorporating certain computational techniques, these methods are shown to be capable of handling large systems with many sources of uncertainties. By construction, these methods are applicable when the scale of randomness is not very large and when the probabilistic density functions have decaying tails. The accuracy and efficiency of these methods, along with their limitations, are demonstrated by various applications. Results obtained are compared with those of Monte Carlo simulation and it is shown that good accuracy can be obtained for both linear and nonlinear problems. The methods are amenable to implementation in deterministic FEM based computer codes.

  6. Variational approach to probabilistic finite elements

    NASA Technical Reports Server (NTRS)

    Belytschko, T.; Liu, W. K.; Mani, A.; Besterfield, G.

    1987-01-01

    Probabilistic finite element method (PFEM), synthesizing the power of finite element methods with second-moment techniques, are formulated for various classes of problems in structural and solid mechanics. Time-invariant random materials, geometric properties, and loads are incorporated in terms of their fundamental statistics viz. second-moments. Analogous to the discretization of the displacement field in finite element methods, the random fields are also discretized. Preserving the conceptual simplicity, the response moments are calculated with minimal computations. By incorporating certain computational techniques, these methods are shown to be capable of handling large systems with many sources of uncertainties. By construction, these methods are applicable when the scale of randomness is not very large and when the probabilistic density functions have decaying tails. The accuracy and efficiency of these methods, along with their limitations, are demonstrated by various applications. Results obtained are compared with those of Monte Carlo simulation and it is shown that good accuracy can be obtained for both linear and nonlinear problems. The methods are amenable to implementation in deterministic FEM based computer codes.

  7. A high-resolution probabilistic in vivo atlas of human subcortical brain nuclei

    PubMed Central

    Pauli, Wolfgang M.; Nili, Amanda N.; Tyszka, J. Michael

    2018-01-01

    Recent advances in magnetic resonance imaging methods, including data acquisition, pre-processing and analysis, have benefited research on the contributions of subcortical brain nuclei to human cognition and behavior. At the same time, these developments have led to an increasing need for a high-resolution probabilistic in vivo anatomical atlas of subcortical nuclei. In order to address this need, we constructed high spatial resolution, three-dimensional templates, using high-accuracy diffeomorphic registration of T1- and T2- weighted structural images from 168 typical adults between 22 and 35 years old. In these templates, many tissue boundaries are clearly visible, which would otherwise be impossible to delineate in data from individual studies. The resulting delineations of subcortical nuclei complement current histology-based atlases. We further created a companion library of software tools for atlas development, to offer an open and evolving resource for the creation of a crowd-sourced in vivo probabilistic anatomical atlas of the human brain. PMID:29664465

  8. Spatial planning using probabilistic flood maps

    NASA Astrophysics Data System (ADS)

    Alfonso, Leonardo; Mukolwe, Micah; Di Baldassarre, Giuliano

    2015-04-01

    Probabilistic flood maps account for uncertainty in flood inundation modelling and convey a degree of certainty in the outputs. Major sources of uncertainty include input data, topographic data, model structure, observation data and parametric uncertainty. Decision makers prefer less ambiguous information from modellers; this implies that uncertainty is suppressed to yield binary flood maps. However, suppressing information may lead to surprises or misleading decisions. Including uncertain information in the decision-making process is therefore desirable and more transparent. To this end, we utilise Prospect theory and information from a probabilistic flood map to evaluate potential decisions. Consequences related to the decisions were evaluated using flood risk analysis. Prospect theory explains how choices are made given options for which probabilities of occurrence are known, and accounts for decision makers' characteristics such as loss aversion and risk seeking. Our results show that the effect on decision making is most pronounced when there are high gains and losses, implying higher payoffs and penalties and therefore a higher gamble. Thus the methodology may be appropriately considered when making decisions based on uncertain information.
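
    A minimal sketch of how a prospect-theory value function could be combined with a probability read from a probabilistic flood map. The alpha, beta and lambda parameters are the commonly cited Tversky-Kahneman estimates, the flood probability and payoffs are made up, and probability weighting is omitted, so this only illustrates the idea rather than the paper's model.

      # Prospect-theory value function; alpha, beta, lam are the commonly cited
      # Tversky-Kahneman estimates; the flood probability and payoffs are made up.
      def value(x, alpha=0.88, beta=0.88, lam=2.25):
          """Concave for gains, convex and steeper for losses (loss aversion)."""
          return x ** alpha if x >= 0 else -lam * (-x) ** beta

      def prospect(outcomes):
          """Sum of probability-weighted values over (probability, payoff) pairs.
          The probability-weighting function is omitted for brevity."""
          return sum(p * value(x) for p, x in outcomes)

      p_flood = 0.02  # annual flood probability read from a probabilistic flood map
      build_on_floodplain = [(p_flood, -500_000), (1 - p_flood, 50_000)]
      build_elsewhere = [(1.0, 20_000)]

      print(prospect(build_on_floodplain), prospect(build_elsewhere))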

  9. Probabilistic risk analysis of building contamination.

    PubMed

    Bolster, D T; Tartakovsky, D M

    2008-10-01

    We present a general framework for probabilistic risk assessment (PRA) of building contamination. PRA provides a powerful tool for the rigorous quantification of risk in contamination of building spaces. A typical PRA starts by identifying relevant components of a system (e.g. ventilation system components, potential sources of contaminants, remediation methods) and proceeds by using available information and statistical inference to estimate the probabilities of their failure. These probabilities are then combined by means of fault-tree analyses to yield probabilistic estimates of the risk of system failure (e.g. building contamination). A sensitivity study of PRAs can identify features and potential problems that need to be addressed with the most urgency. Often PRAs are amenable to approximations, which can significantly simplify the approach. All these features of PRA are presented in this paper via a simple illustrative example, which can be built upon in further studies. The tool presented here can be used to design and maintain adequate ventilation systems to minimize exposure of occupants to contaminants.
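
    A minimal sketch of the probability combination that a fault-tree analysis performs, assuming independent basic events; the component names and probabilities are illustrative, not from the paper.

      # Fault-tree combination of basic-event probabilities (independent events);
      # component names and probabilities are illustrative, not from the paper.
      def p_or(*probs):
          """Probability that at least one input event occurs."""
          q = 1.0
          for p in probs:
              q *= 1.0 - p
          return 1.0 - q

      def p_and(*probs):
          """Probability that all input events occur."""
          out = 1.0
          for p in probs:
              out *= p
          return out

      p_fan_fails = 1e-3
      p_filter_fails = 5e-4
      p_source_present = 1e-2

      # Top event: contaminant reaches occupants = source present AND ventilation
      # fails, where ventilation fails if the fan OR the filter fails.
      p_top = p_and(p_source_present, p_or(p_fan_fails, p_filter_fails))
      print(p_top)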

  10. Effects of shipping on marine acoustic habitats in Canadian Arctic estimated via probabilistic modeling and mapping.

    PubMed

    Aulanier, Florian; Simard, Yvan; Roy, Nathalie; Gervaise, Cédric; Bandet, Marion

    2017-12-15

    Canadian Arctic and Subarctic regions experience a rapid decrease of sea ice accompanied by increasing shipping traffic. The resulting time-space changes in shipping noise are studied for four key regions of this pristine environment, for 2013 traffic conditions and a hypothetical tenfold traffic increase. A probabilistic modeling and mapping framework, called Ramdam, which integrates the intrinsic variability and uncertainties of shipping noise and its effects on marine habitats, is developed and applied. A substantial transformation of soundscapes is observed in areas where shipping noise changes from an occasional, transient contributor to a dominant noise source. Examination of impacts on low-frequency mammals within ecologically and biologically significant areas reveals that shipping noise has the potential to trigger behavioral responses and masking in the future, although no risk of temporary or permanent hearing threshold shifts is noted. Such probabilistic modeling and mapping is strategic in marine spatial planning for these emerging noise issues. Crown Copyright © 2017. Published by Elsevier Ltd. All rights reserved.

  11. Common Randomness Principles of Secrecy

    ERIC Educational Resources Information Center

    Tyagi, Himanshu

    2013-01-01

    This dissertation concerns the secure processing of distributed data by multiple terminals, using interactive public communication among themselves, in order to accomplish a given computational task. In the setting of a probabilistic multiterminal source model in which several terminals observe correlated random signals, we analyze secure…

  12. [Optimization Study on the Nitrogen and Phosphorus Removal of Modified Two- sludge System Under the Condition of Low Carbon Source].

    PubMed

    Yang, Wei-qiang; Wang, Dong-bo; Li, Xiao-ming; Yang, Qi; Xu, Qiu-xiang; Zhang, Zhi-bei; Li, Zhi-jun; Xiang, Hai-hong; Wang, Ya-li; Sun, Jian

    2016-04-15

    This paper explored methods of addressing the insufficient carbon source in urban sewage by comparing and analyzing the nitrogen and phosphorus removal (NPR) performance of a modified two-sludge system and a traditional anaerobic-aerobic-anoxic process under low carbon source wastewater conditions. The modified two-sludge system was the experimental reactor, which was optimized by adding two stages of micro-aeration (aeration rate 0.5 L · min⁻¹) in the anoxic period of the original two-sludge system, and a multi-stage anaerobic-aerobic-anoxic SBR was the control reactor. When the influent COD, ammonia nitrogen and SOP concentrations were 200, 35 and 10 mg · L⁻¹, respectively, the NPR performance of the experimental reactor was better than that of the control reactor, with TN removal efficiency of 94.8% vs 60.9% and TP removal of 96.5% vs 75%, respectively. The effluent SOP, ammonia and TN concentrations of the experimental reactor were 0.35, 0.50 and 1.82 mg · L⁻¹, respectively, which could fully meet the first class A standard of the Pollutants Emission Standard for Urban Wastewater Treatment Plants (GB 18918-2002). Using the optimized treatment process, the largest amounts of nitrogen and phosphorus removed per unit carbon source (as COD) were 0.17 g · g⁻¹ and 0.048 g · g⁻¹ respectively, which could best address the low carbon source concentration in current municipal wastewater.

  13. Probabilistic margin evaluation on accidental transients for the ASTRID reactor project

    NASA Astrophysics Data System (ADS)

    Marquès, Michel

    2014-06-01

    ASTRID is a technological demonstrator of the Sodium cooled Fast Reactor (SFR) under development. The conceptual design studies are being conducted in accordance with the Generation IV reactor objectives, particularly in terms of improving safety. For the hypothetical events belonging to the accidental category "severe accident prevention situations", which have a very low frequency of occurrence, the safety demonstration is no longer based on a deterministic demonstration with conservative assumptions on models and parameters but on a "Best-Estimate Plus Uncertainty" (BEPU) approach. This BEPU approach is presented in this paper for an Unprotected Loss-of-Flow (ULOF) event. The Best-Estimate (BE) analysis of this ULOF transient is performed with the CATHARE2 code, which is the French reference system code for SFR applications. The objective of the BEPU analysis is twofold: first, evaluate the safety margin to sodium boiling taking into account the uncertainties on the input parameters of the CATHARE2 code (twenty-two uncertain input parameters have been identified, which can be classified into five groups: reactor power, accident management, pump characteristics, reactivity coefficients, thermal parameters and head losses); secondly, quantify the contribution of each input uncertainty to the overall uncertainty of the safety margins, in order to refocus R&D efforts on the most influential factors. This paper focuses on the methodological aspects of the evaluation of the safety margin. At least for the preliminary phase of the project (conceptual design), a probabilistic criterion has been fixed in the context of this BEPU analysis; this criterion is the value of the margin to sodium boiling which has a probability of 95% to be exceeded, obtained with a confidence level of 95% (i.e. the M5,95 percentile of the margin distribution). This paper presents two methods used to assess this percentile: the Wilks method and the Bootstrap method; the effectiveness of the two methods is compared on the basis of 500 simulations performed with the CATHARE2 code. We conclude that, with only 100 simulations performed with the CATHARE2 code, which is a number of simulations workable in the conceptual design phase of the ASTRID project where the models and the hypotheses are often modified, it is best, in order to evaluate the percentile M5,95 of the margin to sodium boiling, to use the bootstrap method, which will provide a slightly conservative result. On the other hand, in order to obtain an accurate estimation of the percentile M5,95, for the safety report for example, it will be necessary to perform at least 300 simulations with the CATHARE2 code. In this case, both methods (Wilks and Bootstrap) would give equivalent results.
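
    The two percentile estimators compared in the abstract can be illustrated with a short sketch. The margin sample below is synthetic, and the implementation details (one-sided first-order Wilks, simple percentile bootstrap) are assumptions for illustration only, not the study's actual procedure.

      import numpy as np

      rng = np.random.default_rng(2)

      # Synthetic margins to sodium boiling (K) standing in for CATHARE2 runs.
      margins = rng.normal(150.0, 30.0, 100)

      # Wilks, one-sided, first order: smallest N such that the sample minimum is a
      # 95%-confidence lower bound on the 5th percentile, i.e. 1 - 0.95**N >= 0.95.
      N = 1
      while 1 - 0.95 ** N < 0.95:
          N += 1                               # gives N = 59
      wilks_estimate = np.min(margins[:N])     # conservative M5,95 estimate

      # Percentile bootstrap: resample the available runs, estimate the 5th
      # percentile each time, and take a low quantile of those estimates.
      boot = np.array([np.percentile(rng.choice(margins, margins.size, replace=True), 5)
                       for _ in range(5000)])
      bootstrap_estimate = np.percentile(boot, 5)   # conservative lower-bound estimate

      print(N, wilks_estimate, bootstrap_estimate)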

  14. Kernel Smoothing Methods for Non-Poissonian Seismic Hazard Analysis

    NASA Astrophysics Data System (ADS)

    Woo, Gordon

    2017-04-01

    For almost fifty years, the mainstay of probabilistic seismic hazard analysis has been the methodology developed by Cornell, which assumes that earthquake occurrence is a Poisson process, and that the spatial distribution of epicentres can be represented by a set of polygonal source zones, within which seismicity is uniform. Based on Vere-Jones' use of kernel smoothing methods for earthquake forecasting, these methods were adapted in 1994 by the author for application to probabilistic seismic hazard analysis. There is no need for ambiguous boundaries of polygonal source zones, nor for the hypothesis of time independence of earthquake sequences. In Europe, there are many regions where seismotectonic zones are not well delineated, and where there is a dynamic stress interaction between events, so that they cannot be described as independent. Following the Amatrice earthquake of 24 August 2016, the damaging earthquakes that struck Central Italy over the following months were not independent events. Removing foreshocks and aftershocks is not only an ill-defined task, it has a material effect on seismic hazard computation. Because of the spatial dispersion of epicentres, and the clustering of magnitudes for the largest events in a sequence, which might all be around magnitude 6, the specific event causing the highest ground motion can vary from one site location to another. Where significant active faults have been clearly identified geologically, they should be modelled as individual seismic sources. The remaining background seismicity should be modelled as non-Poissonian using statistical kernel smoothing methods. This approach was first applied for seismic hazard analysis at a UK nuclear power plant two decades ago, and should be included within logic-trees for future probabilistic seismic hazard analyses at critical installations within Europe. In this paper, various salient European applications are given.
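
    A minimal sketch of the kernel-smoothing idea, assuming a synthetic epicentre catalogue and a fixed Gaussian bandwidth; operational models typically use adaptive bandwidths and magnitude-dependent weights, so this is only an illustration of the principle.

      import numpy as np

      rng = np.random.default_rng(3)

      # Synthetic epicentre catalogue (x, y in km) and a fixed Gaussian bandwidth;
      # operational models use adaptive bandwidths and magnitude-dependent weights.
      epicentres = rng.uniform(0.0, 100.0, size=(200, 2))
      bandwidth = 10.0         # km
      catalogue_years = 50.0

      def rate_density(x, y, events, h, years):
          """Kernel-smoothed annual rate density (events / yr / km^2) at one point."""
          d2 = (events[:, 0] - x) ** 2 + (events[:, 1] - y) ** 2
          kernel = np.exp(-d2 / (2.0 * h ** 2)) / (2.0 * np.pi * h ** 2)
          return kernel.sum() / years

      grid = np.linspace(0.0, 100.0, 21)
      density = np.array([[rate_density(x, y, epicentres, bandwidth, catalogue_years)
                           for x in grid] for y in grid])
      print(density.max())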

  15. Integrated reformer and shift reactor

    DOEpatents

    Bentley, Jeffrey M.; Clawson, Lawrence G.; Mitchell, William L.; Dorson, Matthew H.

    2006-06-27

    A hydrocarbon fuel reformer for producing diatomic hydrogen gas is disclosed. The reformer includes a first reaction vessel, a shift reactor vessel annularly disposed about the first reaction vessel, including a first shift reactor zone, and a first helical tube disposed within the first shift reactor zone having an inlet end communicating with a water supply source. The water supply source is preferably adapted to supply liquid-phase water to the first helical tube at flow conditions sufficient to ensure discharge of liquid-phase and steam-phase water from an outlet end of the first helical tube. The reformer may further include a first catalyst bed disposed in the first shift reactor zone, having a low-temperature shift catalyst in contact with the first helical tube. The catalyst bed includes a plurality of coil sections disposed in coaxial relation to other coil sections and to the central longitudinal axis of the reformer, each coil section extending between the first and second ends, and each coil section being in direct fluid communication with at least one other coil section.

  16. Hydroponic potato production on nutrients derived from anaerobically-processed potato plant residues

    NASA Astrophysics Data System (ADS)

    Mackowiak, C. L.; Stutte, G. W.; Garland, J. L.; Finger, B. W.; Ruffe, L. M.

    1997-01-01

    Bioregenerative methods are being developed for recycling plant minerals from harvested inedible biomass as part of NASA's Advanced Life Support (ALS) research. Anaerobic processing produces secondary metabolites, a food source for yeast production, while providing a source of water soluble nutrients for plant growth. Since NH_4-N is the nitrogen product, processing the effluent through a nitrification reactor was used to convert this to NO_3-N, a more acceptable form for plants. Potato (Solanum tuberosum L.) cv. Norland plants were used to test the effects of anaerobically-produced effluent after processing through a yeast reactor or nitrification reactor. These treatments were compared to a mixed-N treatment (75:25, NO_3:NH_4) or a NO_3-N control, both containing only reagent-grade salts. Plant growth and tuber yields were greatest in the NO_3-N control and yeast reactor effluent treatments, which is noteworthy, considering the yeast reactor treatment had high organic loading in the nutrient solution and concomitant microbial activity.

  17. Probability and possibility-based representations of uncertainty in fault tree analysis.

    PubMed

    Flage, Roger; Baraldi, Piero; Zio, Enrico; Aven, Terje

    2013-01-01

    Expert knowledge is an important source of input to risk analysis. In practice, experts might be reluctant to characterize their knowledge and the related (epistemic) uncertainty using precise probabilities. The theory of possibility allows for imprecision in probability assignments. The associated possibilistic representation of epistemic uncertainty can be combined with, and transformed into, a probabilistic representation; in this article, we show this with reference to a simple fault tree analysis. We apply an integrated (hybrid) probabilistic-possibilistic computational framework for the joint propagation of the epistemic uncertainty on the values of the (limiting relative frequency) probabilities of the basic events of the fault tree, and we use possibility-probability (probability-possibility) transformations for propagating the epistemic uncertainty within purely probabilistic and possibilistic settings. The results of the different approaches (hybrid, probabilistic, and possibilistic) are compared with respect to the representation of uncertainty about the top event (limiting relative frequency) probability. Both the rationale underpinning the approaches and the computational efforts they require are critically examined. We conclude that the approaches relevant in a given setting depend on the purpose of the risk analysis, and that further research is required to make the possibilistic approaches operational in a risk analysis context. © 2012 Society for Risk Analysis.

  18. Deaggregation of Probabilistic Ground Motions in the Central and Eastern United States

    USGS Publications Warehouse

    Harmsen, S.; Perkins, D.; Frankel, A.

    1999-01-01

    Probabilistic seismic hazard analysis (PSHA) is a technique for estimating the annual rate of exceedance of a specified ground motion at a site due to known and suspected earthquake sources. The relative contributions of the various sources to the total seismic hazard are determined as a function of their occurrence rates and their ground-motion potential. The separation of the exceedance contributions into bins whose base dimensions are magnitude and distance is called deaggregation. We have deaggregated the hazard analyses for the new USGS national probabilistic ground-motion hazard maps (Frankel et al., 1996). For points on a 0.2° grid in the central and eastern United States (CEUS), we show color maps of the geographical variation of mean and modal magnitudes (M̄, M̂) and distances (D̄, D̂) for ground motions having a 2% chance of exceedance in 50 years. These maps are displayed for peak horizontal acceleration and for spectral response accelerations of 0.2, 0.3, and 1.0 sec. We tabulate M̄, D̄, M̂, and D̂ for 49 CEUS cities for 0.2- and 1.0-sec response. Thus, these maps and tables are PSHA-derived estimates of the potential earthquakes that dominate seismic hazard at short and intermediate periods in the CEUS. The contribution to hazard of the New Madrid and Charleston sources dominates over much of the CEUS; for 0.2-sec response, over 40% of the area; for 1.0-sec response, over 80% of the area. For 0.2-sec response, D̄ ranges from 20 to 200 km, for 1.0 sec, 30 to 600 km. For sites influenced by New Madrid or Charleston, D̄ is less than the distance to these sources, and M̄ is less than the characteristic magnitude of these sources, because averaging takes into account the effect of smaller magnitude and closer sources. On the other hand, D̂ is directly the distance to New Madrid or Charleston and M̂ for 0.2- and 1.0-sec response corresponds to the dominating source over much of the CEUS. For some cities in the North Atlantic states, short-period seismic hazard is apt to be controlled by local seismicity, whereas intermediate period (1.0 sec) hazard is commonly controlled by regional seismicity, such as that of the Charlevoix seismic zone.
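
    A minimal sketch of the deaggregation step, assuming made-up magnitude-distance bin contributions; the weighted averages correspond to the mean values (M̄, D̄) and the largest-contribution bin to the modal values (M̂, D̂).

      import numpy as np

      # Made-up magnitude-distance bins; each carries the annual rate with which it
      # causes the target ground motion to be exceeded at the site.
      mags = np.array([5.5, 6.0, 6.5, 7.5, 8.0])
      dists = np.array([30.0, 50.0, 80.0, 200.0, 250.0])   # km
      contrib = np.array([2e-5, 3e-5, 1e-5, 8e-5, 6e-5])   # exceedance rate per bin

      weights = contrib / contrib.sum()
      mean_M, mean_D = weights @ mags, weights @ dists      # M-bar, D-bar
      modal = np.argmax(contrib)                            # largest-contribution bin
      modal_M, modal_D = mags[modal], dists[modal]          # M-hat, D-hat

      print(mean_M, mean_D, modal_M, modal_D)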

  19. Information Expensiveness Perceived by Vietnamese Patients with Respect to Healthcare Provider's Choice.

    PubMed

    Quan-Hoang, Vuong

    2016-10-01

    Patients have to acquire information to support their decision on choosing a suitable healthcare provider. But in developing countries like Vietnam, accessibility issues remain an obstacle and thus adversely affect both the quality and the costliness of healthcare information. Vietnamese patients use sources from both health professionals and friends/relatives, especially when the quality of cheaper Internet-based sources still appears questionable. The search for information from both professionals and friends/relatives incurs some cost, which can be viewed as low or high depending on low or high accessibility to the sources. These views potentially affect their choices. The aim is to investigate the effects of access to medical/health services information on patients' perceived expensiveness of their information search (labor) costs. Two related objectives are a) establishing empirical relations between accessibility to sources and expensiveness; and b) estimating probabilistic trends of the probabilities of perceived expensiveness. There is evidence for established relations among the variables "Convexp" and "Convrel" (all p's < 0.01), indicating that both information sources (experts and friends/relatives) influence patients' perception of information expensiveness. Use of the expert source tends to increase the probability of perceived expensiveness. a) Probabilistic trends show Vietnamese patients have a propensity to value healthcare information highly and do not see it as "expensive"; b) the majority of Vietnamese households still take non-professional advice at their own risk; c) there is more for the public healthcare information system to do to reduce the costliness and risk of information. The Internet-based health service user communities cannot replace this system.

  20. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Carmack, Jon; Hayes, Steven; Walters, L. C.

    This document explores startup fuel options for a proposed test/demonstration fast reactor. The fuel options considered are the metallic fuels U-Zr and U-Pu-Zr and the ceramic fuels UO2 and UO2-PuO2 (MOX). Attributes of the candidate fuel choices considered were feedstock availability, fabrication feasibility, rough order of magnitude cost and schedule, and the existing irradiation performance database. The reactor-grade plutonium bearing fuels (U-Pu-Zr and MOX) were eliminated from consideration as the initial startup fuels because the availability and isotopics of domestic plutonium feedstock are uncertain. There are international sources of reactor grade plutonium feedstock, but isotopics and availability are also uncertain. Weapons grade plutonium is the only possible source of Pu feedstock in the quantities needed to fuel a startup core. Currently, the available U.S. source of (excess) weapons-grade plutonium is designated for irradiation in commercial light water reactors (LWR) to a level that would preclude diversion. Weapons-grade plutonium also contains a significant concentration of gallium. Gallium presents a potential issue for the fabrication of MOX fuel as well as possible performance issues for metallic fuel. Also, the construction of a fuel fabrication line for plutonium fuels, with or without a line to remove gallium, is expected to be considerably more expensive than for uranium fuels. In the case of U-Pu-Zr, a relatively small number of fuel pins have been irradiated to high burnup, and in no case has a full assembly been irradiated to high burnup without disassembly and re-constitution. For MOX fuel, the irradiation database from the Fast Flux Test Facility (FFTF) is extensive. If a significant source of either weapons-grade or reactor-grade Pu became available (i.e., from an international source), a startup core based on Pu could be reconsidered.

  1. KINETICS OF LOW SOURCE REACTOR STARTUPS. PART I

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hurwitz, H. Jr.; MacMillan, D.B.; Smith, J.H.

    1962-06-01

    Statistical fluctuations of neutron populations in reactors are analyzed by means of an approximate theoretical model. Development of the model is given in detail; also included are extensive numerical results derived from its application to systems with time-dependent reactivity, namely, a reactor during start-up. The special relationships of fluctuations to safety considerations are discussed. (auth)

  2. Si impurity concentration in nominally undoped Al0.7Ga0.3N grown in a planetary MOVPE reactor

    NASA Astrophysics Data System (ADS)

    Jeschke, J.; Knauer, A.; Weyers, M.

    2018-02-01

    The unintentional silicon incorporation during the metalorganic vapor phase epitaxy (MOVPE) of nominally undoped Al0.7Ga0.3N in a Planetary Reactor under various growth conditions was investigated. Depending on growth temperature, pressure and V/III ratio, Si concentrations from below 1 × 10¹⁶ up to 4 × 10¹⁷ cm⁻³ were measured. Potential Si sources are discussed and, by comparing samples grown in a SiC coated reactor setup and in a TaC coated setup, the SiC coatings in the reactor are identified as the most likely source for the unintentional Si doping at elevated temperatures above 1080 °C. Under identical growth conditions the background Si concentration can be reduced by up to an order of magnitude when using TaC coatings.

  3. Application of the Monte Carlo method to estimate doses due to neutron activation of different materials in a nuclear reactor

    NASA Astrophysics Data System (ADS)

    Ródenas, José

    2017-11-01

    All materials exposed to a neutron flux can be activated, regardless of the kind of neutron source. In this study, a nuclear reactor is considered as the neutron source. In particular, the activation of control rods in a BWR is studied to obtain the doses produced around the storage pool for irradiated fuel when control rods are withdrawn from the reactor and placed into this pool. It is important to calculate these doses because they can affect plant workers in the area. The MCNP code, based on the Monte Carlo method, has been applied to simulate the activation reactions produced in the control rods inserted into the reactor. The obtained activities are introduced as input into another MC model to estimate the doses they produce. The comparison of simulation results with experimental measurements allows validation of the developed models. The developed MC models have also been applied to simulate the activation of other materials, such as components of a stainless steel sample introduced into a training reactor. These models, once validated, can be applied to other situations and materials where a neutron flux is present, not only nuclear reactors: for instance, activation analysis with an Am-Be source, neutrography techniques in both medical applications and non-destructive analysis of materials, civil engineering applications using a Troxler gauge, analysis of materials in the decommissioning of nuclear power plants, etc.
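
    For orientation, the activation-and-decay bookkeeping behind such a study can be written in a few lines. The sketch below is not MCNP; it is a plain Python illustration of the saturation-activity formula, and the nuclide choice (Co-59 capture to Co-60), cross section, flux and time values are placeholder numbers for a hypothetical steel component.

    ```python
    # Hypothetical illustration of the activation/decay bookkeeping that sits behind
    # a Monte Carlo activation study; all nuclide data below are placeholder values.
    import math

    def activity_bq(n_atoms, sigma_cm2, flux, lam, t_irr, t_cool):
        """A = N*sigma*phi*(1 - exp(-lambda*t_irr))*exp(-lambda*t_cool)."""
        return n_atoms * sigma_cm2 * flux * (1.0 - math.exp(-lam * t_irr)) * math.exp(-lam * t_cool)

    # Example: Co-59(n,gamma)Co-60 in a steel sample (illustrative numbers only).
    N_CO59 = 1.0e20            # target atoms in the sample
    SIGMA = 37.0e-24           # thermal capture cross section, cm^2
    FLUX = 1.0e13              # neutron flux, n/cm^2/s
    LAMBDA = math.log(2) / (5.27 * 365.25 * 24 * 3600)   # Co-60 decay constant, 1/s

    a = activity_bq(N_CO59, SIGMA, FLUX, LAMBDA, t_irr=30 * 24 * 3600, t_cool=7 * 24 * 3600)
    print(f"Induced Co-60 activity: {a:.3e} Bq")
    ```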

  4. TUNABLE IRRADIATION TESTBED

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wootan, David W.; Casella, Andrew M.; Asner, David M.

    PNNL has developed and continues to develop innovative methods for characterizing irradiated materials from nuclear reactors and particle accelerators for various clients and collaborators around the world. The continued development of these methods, in addition to the ability to perform unique scientific investigations of the effects of radiation on materials, could be greatly enhanced with easy access to irradiation facilities. A Tunable Irradiation Testbed with customized targets (a 30 MeV, 1 mA cyclotron or similar coupled to a unique target system) is shown to provide a much more flexible and cost-effective source of irradiating particles than a test reactor or isotopic source. The configuration investigated was a single shielded building with multiple beam lines from a small, flexible, high flux irradiation source. Potential applications investigated were the characterization of radiation damage to materials applicable to advanced reactors, fusion reactors, and legacy waste (via neutron spectra tailored to HTGR, molten salt, LWR, LMR, and fusion environments); 252Cf replacement; characterization of radiation damage to materials of interest to High Energy Physics to enable the neutrino program; and research into production of short-lived isotopes for potential medical and other applications.

  5. Piloted rich-catalytic lean-burn hybrid combustor

    DOEpatents

    Newburry, Donald Maurice

    2002-01-01

    A catalytic combustor assembly which includes an air source, a fuel delivery means, a catalytic reactor assembly, a mixing chamber, and a means for igniting a fuel/air mixture. The catalytic reactor assembly is in fluid communication with the air source and fuel delivery means and has a fuel/air plenum which is coated with a catalytic material. The fuel/air plenum has cooling air conduits passing therethrough which have an upstream end. The upstream end of the cooling conduits is in fluid communication with the air source but not the fuel delivery means.

  6. National Centers for Environmental Prediction

    Science.gov Websites

    ENSEMBLE PRODUCTS & DATA SOURCES: Probabilistic Forecasts of Quantitative Precipitation from NCEP; Predictability Research with Indian Monsoon Examples (PDF, 28 Mar 2005); North American Ensemble Forecast System; QUANTITATIVE PRECIPITATION (PQPF) charts showing the probability of 24-hour precipitation amounts exceeding given thresholds.

  7. Maritime Threat Detection using Plan Recognition

    DTIC Science & Technology

    2012-11-01

    logic with a probabilistic interpretation to represent expert domain knowledge [13]. We used Alchemy [14] to implement MLN-BR. It interfaces with the...Domingos, P., & Lowd, D. (2009). Markov logic: An interface layer for AI. Morgan & Claypool. [14] Alchemy (2011). Alchemy ─ Open source AI. [http

  8. Fully probabilistic earthquake source inversion on teleseismic scales

    NASA Astrophysics Data System (ADS)

    Stähler, Simon; Sigloch, Karin

    2017-04-01

    Seismic source inversion is a non-linear problem in seismology where not just the earthquake parameters but also estimates of their uncertainties are of great practical importance. We have developed a method of fully Bayesian inference for source parameters, based on measurements of waveform cross-correlation between broadband, teleseismic body-wave observations and their modelled counterparts. This approach yields not only depth and moment tensor estimates but also source time functions. These unknowns are parameterised efficiently by harnessing as prior knowledge solutions from a large number of non-Bayesian inversions. The source time function is expressed as a weighted sum of a small number of empirical orthogonal functions, which were derived from a catalogue of >1000 source time functions (STFs) by a principal component analysis. We use a likelihood model based on the cross-correlation misfit between observed and predicted waveforms. The resulting ensemble of solutions provides full uncertainty and covariance information for the source parameters, and permits propagating these source uncertainties into travel time estimates used for seismic tomography. The computational effort is such that routine, global estimation of earthquake mechanisms and source time functions from teleseismic broadband waveforms is feasible. A prerequisite for Bayesian inference is the proper characterisation of the noise afflicting the measurements. We show that, for realistic broadband body-wave seismograms, the systematic error due to an incomplete physical model affects waveform misfits more strongly than random, ambient background noise. In this situation, the waveform cross-correlation coefficient CC, or rather its decorrelation D = 1 - CC, performs more robustly as a misfit criterion than ℓp norms, more commonly used as sample-by-sample measures of misfit based on distances between individual time samples. From a set of over 900 user-supervised, deterministic earthquake source solutions treated as a quality-controlled reference, we derive the noise distribution on signal decorrelation D of the broadband seismogram fits between observed and modelled waveforms. The noise on D is found to approximately follow a log-normal distribution, a fortunate fact that readily accommodates the formulation of an empirical likelihood function for D for our multivariate problem. The first and second moments of this multivariate distribution are shown to depend mostly on the signal-to-noise ratio (SNR) of the CC measurements and on the back-azimuthal distances of seismic stations. References: Stähler, S. C. and Sigloch, K.: Fully probabilistic seismic source inversion - Part 1: Efficient parameterisation, Solid Earth, 5, 1055-1069, doi:10.5194/se-5-1055-2014, 2014. Stähler, S. C. and Sigloch, K.: Fully probabilistic seismic source inversion - Part 2: Modelling errors and station covariances, Solid Earth, 7, 1521-1536, doi:10.5194/se-7-1521-2016, 2016.
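
    A minimal sketch of the misfit ingredients described above, assuming simple synthetic waveforms: the decorrelation D = 1 - CC between an observed and a modelled trace, and a log-normal log-likelihood for D whose parameters (mu_d, sigma_d) would in practice be estimated from a reference catalogue of solutions, as the abstract describes.

    ```python
    # Sketch of the decorrelation misfit D = 1 - CC and a log-normal likelihood on D;
    # mu_d and sigma_d are hypothetical noise parameters, and the waveforms are synthetic.
    import numpy as np

    def decorrelation(obs, syn):
        """D = 1 - CC, with CC the normalised zero-lag cross-correlation coefficient."""
        obs = obs - obs.mean()
        syn = syn - syn.mean()
        cc = np.dot(obs, syn) / (np.linalg.norm(obs) * np.linalg.norm(syn))
        return 1.0 - cc

    def log_likelihood(d, mu_d, sigma_d):
        """Log-normal log-likelihood of a single decorrelation value d > 0."""
        return -np.log(d * sigma_d * np.sqrt(2 * np.pi)) - (np.log(d) - mu_d) ** 2 / (2 * sigma_d ** 2)

    rng = np.random.default_rng(0)
    t = np.linspace(0, 60, 1500)
    observed = np.sin(2 * np.pi * 0.1 * t) + 0.2 * rng.standard_normal(t.size)
    synthetic = np.sin(2 * np.pi * 0.1 * (t - 0.3))

    d = decorrelation(observed, synthetic)
    print(d, log_likelihood(d, mu_d=np.log(0.05), sigma_d=0.8))
    ```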

  9. Relative multiplexing for minimising switching in linear-optical quantum computing

    NASA Astrophysics Data System (ADS)

    Gimeno-Segovia, Mercedes; Cable, Hugo; Mendoza, Gabriel J.; Shadbolt, Pete; Silverstone, Joshua W.; Carolan, Jacques; Thompson, Mark G.; O'Brien, Jeremy L.; Rudolph, Terry

    2017-06-01

    Many existing schemes for linear-optical quantum computing (LOQC) depend on multiplexing (MUX), which uses dynamic routing to enable near-deterministic gates and sources to be constructed using heralded, probabilistic primitives. MUXing accounts for the overwhelming majority of active switching demands in current LOQC architectures. In this manuscript we introduce relative multiplexing (RMUX), a general-purpose optimisation which can dramatically reduce the active switching requirements for MUX in LOQC, and thereby reduce hardware complexity and energy consumption, as well as relaxing demands on performance for various photonic components. We discuss the application of RMUX to the generation of entangled states from probabilistic single-photon sources, and argue that an order of magnitude improvement in the rate of generation of Bell states can be achieved. In addition, we apply RMUX to the proposal for percolation of a 3D cluster state by Gimeno-Segovia et al (2015 Phys. Rev. Lett. 115 020502), and we find that RMUX allows a 2.4× increase in loss tolerance for this architecture.

  10. SSHAC Level 1 Probabilistic Seismic Hazard Analysis for the Idaho National Laboratory

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Payne, Suzette; Coppersmith, Ryan; Coppersmith, Kevin

    A Probabilistic Seismic Hazard Analysis (PSHA) was completed for the Materials and Fuels Complex (MFC), Naval Reactors Facility (NRF), and the Advanced Test Reactor (ATR) at Idaho National Laboratory (INL) (Figure 1-1). The PSHA followed the approaches and procedures appropriate for a Study Level 1 provided in the guidance advanced by the Senior Seismic Hazard Analysis Committee (SSHAC) in U.S. Nuclear Regulatory Commission (NRC) NUREG/CR-6372 and NUREG-2117 (NRC, 1997; 2012a). The SSHAC Level 1 PSHAs for MFC and ATR were conducted as part of the Seismic Risk Assessment (SRA) project (INL Project number 31287) to develop and apply a new risk-informed methodology, respectively. The SSHAC Level 1 PSHA was conducted for NRF to provide guidance on the potential use of a design margin above rock hazard levels. The SRA project is developing a new risk-informed methodology that will provide a systematic approach for evaluating the need for an update of an existing PSHA. The new methodology proposes criteria to be employed at specific analysis, decision, or comparison points in its evaluation process. The first four of seven criteria address changes in inputs and results of the PSHA and are given in U.S. Department of Energy (DOE) Standard DOE-STD-1020-2012 (DOE, 2012a) and American National Standards Institute/American Nuclear Society (ANSI/ANS) 2.29 (ANS, 2008a). The last three criteria address evaluation of quantitative hazard and risk-focused information of an existing nuclear facility. The seven criteria and decision points are applied to Seismic Design Categories (SDC) 3, 4, and 5, which are defined in American Society of Civil Engineers/Structural Engineers Institute (ASCE/SEI) 43-05 (ASCE, 2005). The application of the criteria and decision points could lead to an update or could determine that such an update is not necessary.

  11. Scoping Calculations of Power Sources for Nuclear Electric Propulsion

    NASA Technical Reports Server (NTRS)

    Difilippo, F. C.

    1994-01-01

    This technical memorandum describes models and calculational procedures to fully characterize the nuclear island of power sources for nuclear electric propulsion. Two computer codes were written: one for the gas-cooled NERVA derivative reactor and the other for liquid metal-cooled fuel pin reactors. These codes are going to be interfaced by NASA with the balance of plant in order to make scoping calculations for mission analysis.

  12. MTR BASEMENT. DOORWAY TO SOURCE STORAGE VAULT IS AT CENTER ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    MTR BASEMENT. DOORWAY TO SOURCE STORAGE VAULT IS AT CENTER OF VIEW; TO DECONTAMINATION ROOM, AT RIGHT. PART OF MAZE ENTRY IS VISIBLE INSIDE VAULT DOORWAY. INL NEGATIVE NO. 7763. Unknown Photographer, photo was dated as 3/30/1953, but this was probably an error. The more likely date is 3/30/1952. - Idaho National Engineering Laboratory, Test Reactor Area, Materials & Engineering Test Reactors, Scoville, Butte County, ID

  13. Probabilistic tsunami inundation map based on stochastic earthquake source model: A demonstration case in Macau, the South China Sea

    NASA Astrophysics Data System (ADS)

    Li, Linlin; Switzer, Adam D.; Wang, Yu; Chan, Chung-Han; Qiu, Qiang; Weiss, Robert

    2017-04-01

    Current tsunami inundation maps are commonly generated using deterministic scenarios, either for real-time forecasting or based on hypothetical "worst-case" events. Such maps are mainly used for emergency response and evacuation planning and do not include information on return period. In practice, however, probabilistic tsunami inundation maps are required in a wide variety of applications, such as land-use planning, engineering design and insurance. In this study, we present a method to develop a probabilistic tsunami inundation map using a stochastic earthquake source model. To demonstrate the methodology, we take Macau, a coastal city on the South China Sea, as an example. Two major advances of this method are: it incorporates the most up-to-date information on seismic tsunamigenic sources along the Manila megathrust, and it integrates a stochastic source model into a Monte Carlo-type simulation in which a broad range of slip distribution patterns are generated for large numbers of synthetic earthquake events. Aggregating the large number of inundation simulation results, we analyze the uncertainties associated with variability of earthquake rupture location and slip distribution. We also explore how tsunami hazard evolves in Macau in the context of sea level rise. Our results suggest Macau faces moderate tsunami risk due to its low-lying elevation, extensive land reclamation, high coastal population and major infrastructure density. Macau consists of four districts: Macau Peninsula, Taipa Island, Coloane Island and the Cotai Strip. Of these, Macau Peninsula is the most vulnerable to tsunami due to its low elevation and its exposure to direct waves, refracted waves from the offshore region and reflected waves from the mainland. Earthquakes with magnitude larger than Mw 8.0 in the northern Manila trench would likely cause hazardous inundation in Macau. Using a stochastic source model, we are able to derive a spread of potential tsunami impacts for earthquakes of the same magnitude; the diversity is caused by both random rupture locations and heterogeneous slip distribution. Adding the sea level rise component, the inundation depth caused by 1 m of sea level rise is equivalent to that caused by the 90th percentile of an ensemble of Mw 8.4 earthquakes.

  14. Contribution of crop model structure, parameters and climate projections to uncertainty in climate change impact assessments.

    PubMed

    Tao, Fulu; Rötter, Reimund P; Palosuo, Taru; Gregorio Hernández Díaz-Ambrona, Carlos; Mínguez, M Inés; Semenov, Mikhail A; Kersebaum, Kurt Christian; Nendel, Claas; Specka, Xenia; Hoffmann, Holger; Ewert, Frank; Dambreville, Anaelle; Martre, Pierre; Rodríguez, Lucía; Ruiz-Ramos, Margarita; Gaiser, Thomas; Höhn, Jukka G; Salo, Tapio; Ferrise, Roberto; Bindi, Marco; Cammarano, Davide; Schulman, Alan H

    2018-03-01

    Climate change impact assessments are plagued with uncertainties from many sources, such as climate projections or the inadequacies in structure and parameters of the impact model. Previous studies tried to account for the uncertainty from one or two of these. Here, we developed a triple-ensemble probabilistic assessment using seven crop models, multiple sets of model parameters and eight contrasting climate projections together to comprehensively account for uncertainties from these three important sources. We demonstrated the approach in assessing climate change impact on barley growth and yield at Jokioinen, Finland in the Boreal climatic zone and Lleida, Spain in the Mediterranean climatic zone, for the 2050s. We further quantified and compared the contribution of crop model structure, crop model parameters and climate projections to the total variance of ensemble output using Analysis of Variance (ANOVA). Based on the triple-ensemble probabilistic assessment, the median of simulated yield change was -4% and +16%, and the probability of decreasing yield was 63% and 31% in the 2050s, at Jokioinen and Lleida, respectively, relative to 1981-2010. The contribution of crop model structure to the total variance of ensemble output was larger than that from downscaled climate projections and model parameters. The relative contribution of crop model parameters and downscaled climate projections to the total variance of ensemble output varied greatly among the seven crop models and between the two sites. The contribution of downscaled climate projections was on average larger than that of crop model parameters. This information on the uncertainty from different sources can be quite useful for model users to decide where to put the most effort when preparing or choosing models or parameters for impact analyses. We concluded that the triple-ensemble probabilistic approach that accounts for the uncertainties from multiple important sources provides more comprehensive information for quantifying uncertainties in climate change impact assessments than the conventional approaches that are deterministic or only account for the uncertainties from one or two of the uncertainty sources. © 2017 John Wiley & Sons Ltd.
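
    The ANOVA-style attribution described above can be illustrated with a toy decomposition: given a yield array indexed by (crop model, parameter set, climate projection), the variance of the factor-wise means approximates each main effect. The array below is synthetic and all effect sizes are invented.

    ```python
    # Sketch (assumed setup): decompose the variance of a triple-ensemble output into
    # main effects of crop model, parameter set and climate projection.
    import numpy as np

    rng = np.random.default_rng(3)
    n_model, n_param, n_clim = 7, 5, 8
    # Synthetic yields (t/ha): additive effects plus noise, for illustration only.
    model_eff = rng.normal(0, 0.8, n_model)[:, None, None]
    param_eff = rng.normal(0, 0.3, n_param)[None, :, None]
    clim_eff = rng.normal(0, 0.5, n_clim)[None, None, :]
    yields = 5.0 + model_eff + param_eff + clim_eff + rng.normal(0, 0.2, (n_model, n_param, n_clim))

    total_var = yields.var()
    main_effects = {
        "crop model": yields.mean(axis=(1, 2)).var(),
        "parameters": yields.mean(axis=(0, 2)).var(),
        "climate":    yields.mean(axis=(0, 1)).var(),
    }
    for name, var in main_effects.items():
        print(f"{name:>11}: {var / total_var:.1%} of total ensemble variance")
    ```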

  15. Location error uncertainties - an advanced using of probabilistic inverse theory

    NASA Astrophysics Data System (ADS)

    Debski, Wojciech

    2016-04-01

    The spatial location of sources of seismic waves is one of the first tasks when transient waves from natural (uncontrolled) sources are analyzed in many branches of physics, including seismology and oceanology, to name a few. Source activity and its spatial variability in time, the geometry of the recording network, and the complexity and heterogeneity of the wave velocity distribution are all factors influencing the performance of location algorithms and the accuracy of the achieved results. While estimating the location of earthquake foci is relatively simple, quantitatively estimating the location accuracy is a challenging task even when the probabilistic inverse method is used, because it requires knowledge of the statistics of observational, modelling, and a priori uncertainties. In this presentation we address this task when the statistics of observational and/or modelling errors are unknown. This common situation requires the introduction of a priori constraints on the likelihood (misfit) function, which significantly influence the estimated errors. Based on the results of an analysis of 120 seismic events from the Rudna copper mine operating in southwestern Poland, we illustrate an approach based on an analysis of Shannon's entropy calculated for the a posteriori distribution. We show that this meta-characteristic of the a posteriori distribution carries some information on the uncertainties of the solution found.
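
    As a rough illustration of using Shannon's entropy as a summary of location uncertainty, the sketch below computes the entropy of a gridded a posteriori location density; the Gaussian-shaped posterior and the grid are hypothetical, not the mine data analysed in the study.

    ```python
    # Minimal sketch (not the paper's code): Shannon entropy of a discretised
    # a posteriori location PDF, used as a scalar summary of solution uncertainty.
    import numpy as np

    def shannon_entropy(pdf, cell_volume):
        """H = -sum p_i * log(p_i) * dV for a gridded, normalised posterior."""
        p = pdf / (pdf.sum() * cell_volume)          # normalise to a proper density
        mask = p > 0
        return -np.sum(p[mask] * np.log(p[mask])) * cell_volume

    # Hypothetical 2-D posterior over candidate hypocentre positions.
    x, y = np.meshgrid(np.linspace(-1, 1, 101), np.linspace(-1, 1, 101))
    posterior = np.exp(-(x**2 + y**2) / (2 * 0.1**2))
    dV = (2 / 100) ** 2

    print(f"Posterior entropy: {shannon_entropy(posterior, dV):.3f} nats")
    ```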

  16. Statistical aspects of carbon fiber risk assessment modeling. [fire accidents involving aircraft

    NASA Technical Reports Server (NTRS)

    Gross, D.; Miller, D. R.; Soland, R. M.

    1980-01-01

    The probabilistic and statistical aspects of the carbon fiber risk assessment modeling of fire accidents involving commercial aircraft are examined. Three major sources of uncertainty in the modeling effort are identified. These are: (1) imprecise knowledge in establishing the model; (2) parameter estimation; and (3) Monte Carlo sampling error. All three sources of uncertainty are treated and statistical procedures are utilized and/or developed to control them wherever possible.

  17. The Angra Neutrino Project: precise measurement of {theta}{sub 13} and safeguards applications of neutrino detectors

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Casimiro, E.; Anjos, J. C.

    2009-04-20

    We present an introduction to the Angra Neutrino Project. The goal of the project is to explore the use of neutrino detectors to monitor reactor activity. The Angra Project will employ as neutrino sources the reactors of the nuclear power complex in Brazil, located in Angra dos Reis, some 150 km south of the city of Rio de Janeiro. The Angra collaboration will develop and operate a low-mass neutrino detector to monitor the nuclear reactor activity, in particular to measure the reactor thermal power and the reactor fuel isotopic composition.

  18. The Angra Neutrino Project: precise measurement of θ13 and safeguards applications of neutrino detectors

    NASA Astrophysics Data System (ADS)

    Casimiro, E.; Anjos, J. C.

    2009-04-01

    We present an introduction to the Angra Neutrino Project. The goal of the project is to explore the use of neutrino detectors to monitor reactor activity. The Angra Project will employ as neutrino sources the reactors of the nuclear power complex in Brazil, located in Angra dos Reis, some 150 km south of the city of Rio de Janeiro. The Angra collaboration will develop and operate a low-mass neutrino detector to monitor the nuclear reactor activity, in particular to measure the reactor thermal power and the reactor fuel isotopic composition.

  19. Neutrino oscillation studies with reactors

    PubMed Central

    Vogel, P.; Wen, L.J.; Zhang, C.

    2015-01-01

    Nuclear reactors are one of the most intense, pure, controllable, cost-effective and well-understood sources of neutrinos. Reactors have played a major role in the study of neutrino oscillations, a phenomenon that indicates that neutrinos have mass and that neutrino flavours are quantum mechanical mixtures. Over the past several decades, reactors were used in the discovery of neutrinos, were crucial in solving the solar neutrino puzzle, and allowed the determination of the smallest mixing angle θ13. In the near future, reactors will help to determine the neutrino mass hierarchy and to solve the puzzling issue of sterile neutrinos. PMID:25913819

  20. Neutrino oscillation studies with reactors

    DOE PAGES

    Vogel, P.; Wen, L.J.; Zhang, C.

    2015-04-27

    Nuclear reactors are one of the most intense, pure, controllable, cost-effective and well-understood sources of neutrinos. Reactors have played a major role in the study of neutrino oscillations, a phenomenon that indicates that neutrinos have mass and that neutrino flavours are quantum mechanical mixtures. Over the past several decades, reactors were used in the discovery of neutrinos, were crucial in solving the solar neutrino puzzle, and allowed the determination of the smallest mixing angle θ 13. In the near future, reactors will help to determine the neutrino mass hierarchy and to solve the puzzling issue of sterile neutrinos.

  1. Neutrino oscillation studies with reactors.

    PubMed

    Vogel, P; Wen, L J; Zhang, C

    2015-04-27

    Nuclear reactors are one of the most intense, pure, controllable, cost-effective and well-understood sources of neutrinos. Reactors have played a major role in the study of neutrino oscillations, a phenomenon that indicates that neutrinos have mass and that neutrino flavours are quantum mechanical mixtures. Over the past several decades, reactors were used in the discovery of neutrinos, were crucial in solving the solar neutrino puzzle, and allowed the determination of the smallest mixing angle θ13. In the near future, reactors will help to determine the neutrino mass hierarchy and to solve the puzzling issue of sterile neutrinos.

  2. USSR Report, Energy, No. 147.

    DTIC Science & Technology

    1983-05-18

    based on low-temperature reactors ; atomic heat and electric power stations (ATETs); The restructuring of the energy balance for the 1980-2000 period...ASPT) based on low-temperature reactors ; atomic heat and electric power stations (TETs); industrial atomic power stations (AETS) based on high-temper...ature reactors ) and high-efficiency long-distance heat transport (in conjunc- tion with high-temperature nuclear power sources: ASDT). The

  3. Method for reducing iron losses in an iron smelting process

    DOEpatents

    Sarma, B.; Downing, K.B.

    1999-03-23

    A process of smelting iron that comprises the steps of: (a) introducing a source of iron oxide, oxygen, nitrogen, and a source of carbonaceous fuel to a smelting reactor, at least some of said oxygen being continuously introduced through an overhead lance; (b) maintaining conditions in said reactor to cause (1) at least some of the iron oxide to be chemically reduced, (2) a bath of molten iron to be created and stirred in the bottom of the reactor, surmounted by a layer of slag, and (3) carbon monoxide gas to rise through the slag; (c) causing at least some of said carbon monoxide to react in the reactor with the incoming oxygen, thereby generating heat for reactions taking place in the reactor; and (d) releasing from the reactor an offgas effluent, is run in a way that keeps iron losses in the offgas relatively low. After start-up of the process is complete, steps (a) and (b) are controlled so as to: (1) keep the temperature of the molten iron at or below about 1550 °C and (2) keep the slag weight at or above about 0.8 ton per square meter. 13 figs.

  4. Method for reducing iron losses in an iron smelting process

    DOEpatents

    Sarma, Balu; Downing, Kenneth B.

    1999-01-01

    A process of smelting iron that comprises the steps of: a) introducing a source of iron oxide, oxygen, nitrogen, and a source of carbonaceous fuel to a smelting reactor, at least some of said oxygen being continuously introduced through an overhead lance; b) maintaining conditions in said reactor to cause (i) at least some of the iron oxide to be chemically reduced, (ii) a bath of molten iron to be created and stirred in the bottom of the reactor, surmounted by a layer of slag, and (iii) carbon monoxide gas to rise through the slag; c) causing at least some of said carbon monoxide to react in the reactor with the incoming oxygen, thereby generating heat for reactions taking place in the reactor; and d) releasing from the reactor an offgas effluent, is run in a way that keeps iron losses in the offgas relatively low. After start-up of the process is complete, steps (a) and (b) are controlled so as to: e) keep the temperature of the molten iron at or below about 1550 °C and f) keep the slag weight at or above about 0.8 tonne per square meter.

  5. Markov Model of Accident Progression at Fukushima Daiichi

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cuadra A.; Bari R.; Cheng, L-Y

    2012-11-11

    On March 11, 2011, a magnitude 9.0 earthquake followed by a tsunami caused loss of offsite power and disabled the emergency diesel generators, leading to a prolonged station blackout at the Fukushima Daiichi site. After successful reactor trip for all operating reactors, the inability to remove decay heat over an extended period led to boil-off of the water inventory and fuel uncovery in Units 1-3. A significant amount of metal-water reaction occurred, as evidenced by the quantities of hydrogen generated that led to hydrogen explosions in the auxiliary buildings of Units 1 & 3, and in the de-fuelled Unit 4. Although it was assumed that extensive fuel damage, including fuel melting, slumping, and relocation was likely to have occurred in the core of the affected reactors, the status of the fuel, vessel, and drywell was uncertain. To understand the possible evolution of the accident conditions at Fukushima Daiichi, a Markov model of the likely state of one of the reactors was constructed and executed under different assumptions regarding system performance and reliability. The Markov approach was selected for several reasons: it is a probabilistic model that provides flexibility in scenario construction and incorporates time dependence of different model states. It also readily allows for sensitivity and uncertainty analyses of different failure and repair rates of cooling systems. While the analysis was motivated by a need to gain insight on the course of events for the damaged units at Fukushima Daiichi, the work reported here provides a more general analytical basis for studying and evaluating severe accident evolution over extended periods of time. This work was performed at the request of the U.S. Department of Energy to explore 'what-if' scenarios in the immediate aftermath of the accidents.
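
    A toy version of such a model, assuming invented states and rates rather than the actual analysis described above, is sketched below: a three-state continuous-time Markov chain (cooled, uncovered, damaged) whose state probabilities are integrated forward in time, so that failure and repair rates can be varied for sensitivity studies.

    ```python
    # Illustrative sketch, not the reported model: a small continuous-time Markov chain
    # for core state with hypothetical transition and repair rates, integrated with a
    # simple forward-Euler step.
    import numpy as np

    STATES = ["cooled", "uncovered", "damaged"]
    lam_uncover = 1.0 / 8.0     # 1/h, loss-of-cooling progression rate (assumed)
    lam_damage = 1.0 / 4.0      # 1/h, fuel damage rate once uncovered (assumed)
    mu_recover = 1.0 / 24.0     # 1/h, cooling restoration rate (assumed)

    # Generator matrix Q: each row sums to zero; "damaged" is absorbing.
    Q = np.array([
        [-lam_uncover,                lam_uncover,        0.0],
        [  mu_recover, -(mu_recover + lam_damage), lam_damage],
        [         0.0,                        0.0,        0.0],
    ])

    p = np.array([1.0, 0.0, 0.0])       # start in the cooled state
    dt, horizon = 0.01, 72.0            # hours
    for _ in range(int(horizon / dt)):
        p = p + dt * (p @ Q)

    for name, prob in zip(STATES, p):
        print(f"P({name} at {horizon:.0f} h) = {prob:.3f}")
    ```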

  6. POPULATION EXPOSURES TO PARTICULATE MATTER: A COMPARISON OF EXPOSURE MODEL PREDICTIONS AND MEASUREMENT DATA

    EPA Science Inventory

    The US EPA National Exposure Research Laboratory (NERL) is currently developing an integrated human exposure source-to-dose modeling system (HES2D). This modeling system will incorporate models that use a probabilistic approach to predict population exposures to environmental ...

  7. SHEDS-HT: An Integrated Probabilistic Exposure Model for Prioritizing Exposures to Chemicals with Near-Field and Dietary Sources

    EPA Science Inventory

    United States Environmental Protection Agency (USEPA) researchers are developing a strategy for highthroughput (HT) exposure-based prioritization of chemicals under the ExpoCast program. These novel modeling approaches for evaluating chemicals based on their potential for biologi...

  8. Force Limited Vibration Testing: Computation C2 for Real Load and Probabilistic Source

    NASA Astrophysics Data System (ADS)

    Wijker, J. J.; de Boer, A.; Ellenbroek, M. H. M.

    2014-06-01

    To prevent over-testing of the test item during random vibration testing, Scharton proposed and discussed force-limited random vibration testing (FLVT) in a number of publications, in which the factor C2 is, besides the random vibration specification, the total mass and the turnover frequency of the load (test item), a very important parameter. A number of computational methods to estimate C2 are described in the literature, i.e. the simple and the complex two-degrees-of-freedom systems, STDFS and CTDFS, respectively. Both the STDFS and the CTDFS describe in a very reduced (simplified) manner the load and the source (the adjacent structure transferring the excitation forces to the test item, e.g. a spacecraft supporting an instrument). The motivation of this work is to establish a method for the computation of a realistic value of C2 to perform a representative random vibration test based on force limitation, when the description of the adjacent structure (source) is more or less unknown. Marchand formulated a conservative estimate of C2 based on the maximum modal effective mass and damping of the test item (load), when no description of the supporting structure (source) is available [13]. Marchand also discussed the formal derivation of C2 using the maximum PSD of the acceleration and the maximum PSD of the force, both at the interface between load and source, in combination with the apparent mass and total mass of the load. This method is very convenient for computing the factor C2; however, finite element models are needed to compute the PSD spectra of both the acceleration and the force at the interface between load and source. Stevens presented the coupled systems modal approach (CSMA), where simplified asparagus-patch models (parallel-oscillator representations) of load and source are connected, consisting of modal effective masses and the spring stiffnesses associated with the natural frequencies. When the random acceleration vibration specification is given, the CSMA method is suitable to compute the value of the parameter C2. When no mathematical model of the source can be made available, estimates of the value of C2 can be found in the literature. In this paper a probabilistic mathematical representation of the unknown source is proposed, such that the asparagus-patch model of the source can be approximated. The computation of the value of C2 can then be done in conjunction with the CSMA method, knowing the apparent mass of the load and the random acceleration specification at the interface between load and source. Strength and stiffness design rules for spacecraft, instrumentation, units, etc., as given in ECSS Standards and Handbooks, Launch Vehicle User's Manuals, papers, books, etc., are applied, and a probabilistic description of the design parameters is foreseen. As an example, a simple experiment has been worked out.
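
    For context, a commonly quoted semi-empirical form of the force limit ties the interface force PSD to the input acceleration PSD through C2, the total mass M0 and the turnover frequency f0, with a roll-off above f0. The sketch below assumes that form and uses invented values for C2, M0, f0 and the acceleration spectrum; it is not the CSMA or probabilistic-source computation developed in the paper.

    ```python
    # Hedged sketch of the semi-empirical force limit often associated with FLVT:
    # S_FF = C2 * M0^2 * S_AA below the turnover frequency f0, rolled off as (f0/f)^2
    # above it. All numerical values are assumed example inputs.
    import numpy as np

    def force_limit_psd(freq, s_aa, c2, m0, f0):
        """Force PSD limit [N^2/Hz] from an acceleration PSD s_aa [(m/s^2)^2/Hz]."""
        rolloff = np.where(freq <= f0, 1.0, (f0 / freq) ** 2)
        return c2 * m0 ** 2 * s_aa * rolloff

    freq = np.array([20.0, 50.0, 100.0, 200.0, 400.0, 800.0, 2000.0])   # Hz
    s_aa = np.full_like(freq, 0.04 * 9.81 ** 2)       # flat 0.04 g^2/Hz converted to (m/s^2)^2/Hz
    limit = force_limit_psd(freq, s_aa, c2=2.0, m0=50.0, f0=100.0)      # 50 kg test item, assumed C2 = 2

    for f, s in zip(freq, limit):
        print(f"{f:7.1f} Hz : {s:10.1f} N^2/Hz")
    ```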

  9. Performance assessment of conventional and base-isolated nuclear power plants for earthquake and blast loadings

    NASA Astrophysics Data System (ADS)

    Huang, Yin-Nan

    Nuclear power plants (NPPs) and spent nuclear fuel (SNF) are required by code and regulations to be designed for a family of extreme events, including very rare earthquake shaking, loss of coolant accidents, and tornado-borne missile impacts. Blast loading due to malevolent attack became a design consideration for NPPs and SNF after the terrorist attacks of September 11, 2001. The studies presented in this dissertation assess the performance of sample conventional and base isolated NPP reactor buildings subjected to seismic effects and blast loadings. The response of the sample reactor building to tornado-borne missile impacts and internal events (e.g., loss of coolant accidents) will not change if the building is base isolated and so these hazards were not considered. The sample NPP reactor building studied in this dissertation is composed of containment and internal structures with a total weight of approximately 75,000 tons. Four configurations of the reactor building are studied, including one conventional fixed-base reactor building and three base-isolated reactor buildings using Friction Pendulum(TM), lead rubber and low damping rubber bearings. The seismic assessment of the sample reactor building is performed using a new procedure proposed in this dissertation that builds on the methodology presented in the draft ATC-58 Guidelines and the widely used Zion method, which uses fragility curves defined in terms of ground-motion parameters for NPP seismic probabilistic risk assessment. The new procedure improves the Zion method by using fragility curves that are defined in terms of structural response parameters since damage and failure of NPP components are more closely tied to structural response parameters than to ground motion parameters. Alternate ground motion scaling methods are studied to help establish an optimal procedure for scaling ground motions for the purpose of seismic performance assessment. The proposed performance assessment procedure is used to evaluate the vulnerability of the conventional and base-isolated NPP reactor buildings. The seismic performance assessment confirms the utility of seismic isolation at reducing spectral demands on secondary systems. Procedures to reduce the construction cost of secondary systems in isolated reactor buildings are presented. A blast assessment of the sample reactor building is performed for an assumed threat of 2000 kg of TNT explosive detonated on the surface with a closest distance to the reactor building of 10 m. The air and ground shock waves produced by the design threat are generated and used for performance assessment. The air blast loading to the sample reactor building is computed using a Computational Fluid Dynamics code Air3D and the ground shock time series is generated using an attenuation model for soil/rock response. Response-history analysis of the sample conventional and base isolated reactor buildings to external blast loadings is performed using the hydrocode LS-DYNA. The spectral demands on the secondary systems in the isolated reactor building due to air blast loading are greater than those for the conventional reactor building but much smaller than those spectral demands associated with Safe Shutdown Earthquake shaking. The isolators are extremely effective at filtering out high acceleration, high frequency ground shock loading.

  10. Nuclear reactor transient analysis via a quasi-static kinetics Monte Carlo method

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jo, YuGwon; Cho, Bumhee; Cho, Nam Zin, E-mail: nzcho@kaist.ac.kr

    2015-12-31

    The predictor-corrector quasi-static (PCQS) method is applied to the Monte Carlo (MC) calculation for reactor transient analysis. To solve the transient fixed-source problem of the PCQS method, fission source iteration is used and a linear approximation of fission source distributions during a macro-time step is introduced to provide the delayed neutron source. The conventional particle-tracking procedure is modified to solve the transient fixed-source problem via MC calculation. The PCQS method with MC calculation is compared with the direct time-dependent method of characteristics (MOC) on a TWIGL two-group problem for verification of the computer code. Then, the results on a continuous-energy problem are presented.

  11. Solid-Core Heat-Pipe Nuclear Battery Type Reactor

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ehud Greenspan

    This project was devoted to a preliminary assessment of the feasibility of designing an Encapsulated Nuclear Heat Source (ENHS) reactor to have a solid core from which heat is removed by liquid-metal heat pipes (HP).

  12. Seismic hazard assessment of the Province of Murcia (SE Spain): analysis of source contribution to hazard

    NASA Astrophysics Data System (ADS)

    García-Mayordomo, J.; Gaspar-Escribano, J. M.; Benito, B.

    2007-10-01

    A probabilistic seismic hazard assessment of the Province of Murcia in terms of peak ground acceleration (PGA) and spectral accelerations [SA(T)] is presented in this paper. In contrast to most of the previous studies in the region, which were performed for PGA making use of intensity-to-PGA relationships, hazard is here calculated in terms of magnitude and using European spectral ground-motion models. Moreover, we have considered the most important faults in the region as specific seismic sources, and also comprehensively reviewed the earthquake catalogue. Hazard calculations are performed following the Probabilistic Seismic Hazard Assessment (PSHA) methodology using a logic tree, which accounts for three different seismic source zonings and three different ground-motion models. Hazard maps in terms of PGA and SA(0.1, 0.2, 0.5, 1.0 and 2.0 s) and the coefficient of variation (COV) for the 475-year return period are shown. Subsequent analysis is focused on three sites of the province, namely the cities of Murcia, Lorca and Cartagena, which are important industrial and tourism centres. Results at these sites have been analysed to evaluate the influence of the different input options. The most important factor affecting the results is the choice of the attenuation relationship, whereas the influence of the selected seismic source zonings appears strongly site-dependent. Finally, we have performed an analysis of source contribution to hazard at each of these cities to provide preliminary guidance in devising specific risk scenarios. We have found that local source zones control the hazard for PGA and SA(T ≤ 1.0 s), although the contribution from specific fault sources and long-distance north Algerian sources becomes significant from SA(0.5 s) onwards.
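
    The logic-tree PSHA recipe summarised above reduces, in its simplest form, to summing annual exceedance rates over sources and weighting alternative ground-motion models. The sketch below assumes a toy ground-motion relation and invented source rates, coefficients and weights; it only illustrates the structure of the calculation, not the models used in the study.

    ```python
    # Minimal PSHA sketch: annual rate of exceeding a PGA level is summed over sources,
    # and a two-branch logic tree over ground-motion models is combined with weights.
    # All source rates, coefficients and weights are invented for illustration.
    import math

    def gmpe_ln_pga(mag, dist_km, c0, c1, c2):
        """Toy ground-motion model: ln(PGA[g]) = c0 + c1*M - c2*ln(R + 10)."""
        return c0 + c1 * mag - c2 * math.log(dist_km + 10.0)

    def prob_exceed(ln_pga, mu, sigma):
        """P(ln PGA > ln_pga) for a lognormal ground-motion distribution."""
        return 0.5 * math.erfc((ln_pga - mu) / (sigma * math.sqrt(2.0)))

    def exceedance_rate(pga, sources, coeffs, sigma_ln=0.6):
        return sum(rate * prob_exceed(math.log(pga), gmpe_ln_pga(m, r, *coeffs), sigma_ln)
                   for m, r, rate in sources)

    # (magnitude, distance to site [km], annual rate) -- invented example sources.
    sources = [(5.0, 15.0, 0.10), (6.0, 30.0, 0.02), (6.8, 60.0, 0.005)]
    # Two-branch logic tree over ground-motion model coefficients (weights sum to 1).
    branches = [((-4.0, 1.0, 1.3), 0.6), ((-4.3, 1.05, 1.3), 0.4)]

    for pga in (0.05, 0.10, 0.20, 0.40):
        lam = sum(w * exceedance_rate(pga, sources, c) for c, w in branches)
        print(f"PGA > {pga:.2f} g : annual rate {lam:.3e}, return period {1.0 / lam:,.0f} yr")
    ```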

  13. Probing light sterile neutrino signatures at reactor and Spallation Neutron Source neutrino experiments

    NASA Astrophysics Data System (ADS)

    Kosmas, T. S.; Papoulias, D. K.; Tórtola, M.; Valle, J. W. F.

    2017-09-01

    We investigate the impact of a fourth sterile neutrino at reactor and Spallation Neutron Source neutrino detectors. Specifically, we explore the discovery potential of the TEXONO and COHERENT experiments to subleading sterile neutrino effects through the measurement of the coherent elastic neutrino-nucleus scattering event rate. Our dedicated χ2-sensitivity analysis employs realistic nuclear structure calculations adequate for high purity sub-keV threshold Germanium detectors.

  14. Effects of humidity on sterilization of Geobacillus stearothermophilus spores with plasma-excited neutral gas

    NASA Astrophysics Data System (ADS)

    Matsui, Kei; Ikenaga, Noriaki; Sakudo, Noriyuki

    2015-06-01

    We investigate the effects of relative humidity on the sterilization process using a plasma-excited neutral gas that uniformly sterilizes both the space and inner wall of the reactor chamber at atmospheric pressure. Only reactive neutral species such as plasma-excited gas molecules and radicals were separated from the plasma and sent to the reactor chamber for chemical sterilization. The plasma source gas is nitrogen mixed with 0.1% oxygen, and the relative humidity in the source gas is controlled by changing the mixing ratio of water vapor. The relative humidity near the sample in the reactor chamber is controlled by changing the sample temperature. As a result, the relative humidity near the sample should be kept in the range from 60 to 90% for the sterilization of Geobacillus stearothermophilus spores. When the relative humidity in the source gas increases from 30 to 90%, the sterilization effect is enhanced by the same degree.

  15. Investigation of applications for high-power, self-critical fissioning uranium plasma reactors

    NASA Technical Reports Server (NTRS)

    Rodgers, R. J.; Latham, T. S.; Krascella, N. L.

    1976-01-01

    Analytical studies were conducted to investigate potentially attractive applications for gaseous nuclear cavity reactors fueled by uranium hexafluoride and its decomposition products at temperatures of 2000 to 6000 K and total pressures of a few hundred atmospheres. Approximate operating conditions and performance levels for a class of nuclear reactors in which fission energy removal is accomplished principally by radiant heat transfer from the high temperature gaseous nuclear fuel to surrounding absorbing media were determined. The results show the radiant energy deposited in the absorbing media may be efficiently utilized in energy conversion system applications which include (1) a primary energy source for high thrust, high specific impulse space propulsion, (2) an energy source for highly efficient generation of electricity, and (3) a source of high intensity photon flux for heating working fluid gases for hydrogen production or MHD power extraction.

  16. International workshop on cold neutron sources

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Russell, G.J.; West, C.D.

    1991-08-01

    The first meeting devoted to cold neutron sources was held at the Los Alamos National Laboratory on March 5-8, 1990. Cosponsored by Los Alamos and Oak Ridge National Laboratories, the meeting was organized as an International Workshop on Cold Neutron Sources and brought together experts in the field of cold-neutron-source design for reactors and spallation sources. Eighty-four people from seven countries attended. Because the meeting was the first of its kind in over forty years, much time was spent acquainting participants with past and planned activities at reactor and spallation facilities worldwide. As a result, the meeting had more of a conference flavor than one of a workshop. The general topics covered at the workshop included: criteria for cold source design; neutronic predictions and performance; energy deposition and removal; engineering design, fabrication, and operation; material properties; radiation damage; instrumentation; safety; existing cold sources; and future cold sources.

  17. Toward a Mechanistic Source Term in Advanced Reactors: Characterization of Radionuclide Transport and Retention in a Sodium Cooled Fast Reactor

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Brunett, Acacia J.; Bucknor, Matthew; Grabaskas, David

    A vital component of the U.S. reactor licensing process is an integrated safety analysis in which a source term representing the release of radionuclides during normal operation and accident sequences is analyzed. Historically, source term analyses have utilized bounding, deterministic assumptions regarding radionuclide release. However, advancements in technical capabilities and the knowledge state have enabled the development of more realistic and best-estimate retention and release models, such that a mechanistic source term assessment can be expected to be a required component of future licensing of advanced reactors. Recently, as part of a Regulatory Technology Development Plan effort for sodium-cooled fast reactors (SFRs), Argonne National Laboratory has investigated the current state of knowledge of potential source terms in an SFR via an extensive review of previous domestic experiments, accidents, and operation. As part of this work, the significant sources and transport processes of radionuclides in an SFR have been identified and characterized. This effort examines all stages of release and source term evolution, beginning with release from the fuel pin and ending with retention in containment. Radionuclide sources considered in this effort include releases originating both in-vessel (e.g. in-core fuel, primary sodium, cover gas cleanup system, etc.) and ex-vessel (e.g. spent fuel storage, handling, and movement). Releases resulting from a primary sodium fire are also considered as a potential source. For each release group, dominant transport phenomena are identified and qualitatively discussed. The key product of this effort was the development of concise, inclusive diagrams that illustrate the release and retention mechanisms at a high level, where unique schematics have been developed for in-vessel, ex-vessel and sodium fire releases. This review effort has also found that despite the substantial range of phenomena affecting radionuclide release, the current state of knowledge is extensive, and in most areas may be sufficient. Several knowledge gaps were identified, such as uncertainty in release from molten fuel and availability of thermodynamic data for lanthanides and actinides in liquid sodium. However, the overall findings suggest that high retention rates can be expected within the fuel and primary sodium for all radionuclides other than noble gases.

  18. Adequacy assessment of composite generation and transmission systems incorporating wind energy conversion systems

    NASA Astrophysics Data System (ADS)

    Gao, Yi

    The development and utilization of wind energy for satisfying electrical demand has received considerable attention in recent years due to its tremendous environmental, social and economic benefits, together with public support and government incentives. Electric power generation from wind energy behaves quite differently from that of conventional sources. The fundamentally different operating characteristics of wind energy facilities therefore affect power system reliability in a different manner than those of conventional systems. The reliability impact of such a highly variable energy source is an important aspect that must be assessed when the wind power penetration is significant. The focus of the research described in this thesis is on the utilization of state sampling Monte Carlo simulation in wind integrated bulk electric system reliability analysis and the application of these concepts in system planning and decision making. Load forecast uncertainty is an important factor in long range planning and system development. This thesis describes two approximate approaches developed to reduce the number of steps in a load duration curve which includes load forecast uncertainty, and to provide reasonably accurate generating and bulk system reliability index predictions. The developed approaches are illustrated by application to two composite test systems. A method of generating correlated random numbers with uniform distributions and a specified correlation coefficient in the state sampling method is proposed and used to conduct adequacy assessment in generating systems and in bulk electric systems containing correlated wind farms in this thesis. The studies described show that it is possible to use the state sampling Monte Carlo simulation technique to quantitatively assess the reliability implications associated with adding wind power to a composite generation and transmission system including the effects of multiple correlated wind sites. This is an important development as it permits correlated wind farms to be incorporated in large practical system studies without requiring excessive increases in computer solution time. The procedures described in this thesis for creating monthly and seasonal wind farm models should prove useful in situations where time period models are required to incorporate scheduled maintenance of generation and transmission facilities. There is growing interest in combining deterministic considerations with probabilistic assessment in order to evaluate the quantitative system risk and conduct bulk power system planning. A relatively new approach that incorporates deterministic and probabilistic considerations in a single risk assessment framework has been designated as the joint deterministic-probabilistic approach. The research work described in this thesis illustrates that the joint deterministic-probabilistic approach can be effectively used to integrate wind power in bulk electric system planning. The studies described in this thesis show that the application of the joint deterministic-probabilistic method provides more stringent results for a system with wind power than the traditional deterministic N-1 method because the joint deterministic-probabilistic technique is driven by the deterministic N-1 criterion with an added probabilistic perspective which recognizes the power output characteristics of a wind turbine generator.
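
    A minimal state-sampling Monte Carlo sketch of the generating-system part of such an adequacy study is given below; the unit list, forced outage rates, load level and the crude common-factor wind correlation model are all invented for illustration and are far simpler than the composite-system models used in the thesis.

    ```python
    # Illustrative state-sampling Monte Carlo sketch for generating-system adequacy
    # (loss-of-load probability), with a simple correlated wind output added via a
    # shared random factor; all data are invented.
    import numpy as np

    rng = np.random.default_rng(42)

    # Conventional units: (capacity MW, forced outage rate).
    units = [(200, 0.05)] * 4 + [(100, 0.08)] * 6
    LOAD_MW = 1000.0

    def sample_wind(n, rho=0.8):
        """Two 150 MW wind farms whose per-unit output shares a common factor (correlation ~ rho)."""
        common = rng.uniform(size=n)
        w1 = np.clip(rho * common + (1 - rho) * rng.uniform(size=n), 0, 1)
        w2 = np.clip(rho * common + (1 - rho) * rng.uniform(size=n), 0, 1)
        return 150.0 * w1 + 150.0 * w2

    N = 200_000
    available = np.zeros(N)
    for cap, forced_outage in units:
        available += cap * (rng.uniform(size=N) > forced_outage)   # sample each unit's state
    available += sample_wind(N)

    lolp = np.mean(available < LOAD_MW)
    print(f"Estimated LOLP = {lolp:.4f}  (LOLE ~ {lolp * 8760:.1f} h/yr)")
    ```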

  19. A probabilistic tsunami hazard assessment for Indonesia

    NASA Astrophysics Data System (ADS)

    Horspool, N.; Pranantyo, I.; Griffin, J.; Latief, H.; Natawidjaja, D. H.; Kongko, W.; Cipta, A.; Bustaman, B.; Anugrah, S. D.; Thio, H. K.

    2014-05-01

    Probabilistic hazard assessments are a fundamental tool for assessing the threats posed by hazards to communities and are important for underpinning evidence-based decision making on risk mitigation activities. Indonesia has been the focus of intense tsunami risk mitigation efforts following the 2004 Indian Ocean Tsunami, but this has been largely concentrated on the Sunda Arc, with little attention to other tsunami-prone areas of the country such as eastern Indonesia. We present the first nationally consistent Probabilistic Tsunami Hazard Assessment (PTHA) for Indonesia. This assessment produces time-independent forecasts of tsunami hazard at the coast from tsunamis generated by local, regional and distant earthquake sources. The methodology is based on the established Monte Carlo approach to probabilistic seismic hazard assessment (PSHA) and has been adapted to tsunami. We account for sources of epistemic and aleatory uncertainty in the analysis through the use of logic trees and through sampling probability density functions. For short return periods (100 years) the highest tsunami hazard is on the west coast of Sumatra, the south coast of Java and the north coast of Papua. For longer return periods (500-2500 years), the tsunami hazard is highest along the Sunda Arc, reflecting larger maximum magnitudes along the Sunda Arc. The annual probability of experiencing a tsunami with a height at the coast of > 0.5 m is greater than 10% for Sumatra, Java, the Sunda Islands (Bali, Lombok, Flores, Sumba) and north Papua. The annual probability of experiencing a tsunami with a height of > 3.0 m, which would cause significant inundation and fatalities, is 1-10% in Sumatra, Java, Bali, Lombok and north Papua, and 0.1-1% for north Sulawesi, Seram and Flores. The results of this national-scale hazard assessment provide evidence for disaster managers to prioritise regions for risk mitigation activities and/or more detailed hazard or risk assessment.

  20. Aggregate exposure approaches for parabens in personal care products: a case assessment for children between 0 and 3 years old

    PubMed Central

    Gosens, Ilse; Delmaar, Christiaan J E; ter Burg, Wouter; de Heer, Cees; Schuur, A Gerlienke

    2014-01-01

    In the risk assessment of chemical substances, aggregation of exposure to a substance from different sources via different pathways is not common practice. Focusing the exposure assessment on a substance from a single source can lead to a significant underestimation of the risk. To gain more insight on how to perform an aggregate exposure assessment, we applied a deterministic (tier 1) and a person-oriented probabilistic approach (tier 2) to exposure to the four most common parabens through personal care products in children between 0 and 3 years old. Following a deterministic approach, a worst-case exposure estimate is calculated for methyl-, ethyl-, propyl- and butylparaben. As an illustration for risk assessment, Margins of Exposure (MoE) are calculated. These are 991 and 4966 for methyl- and ethylparaben, and 8 and 10 for propyl- and butylparaben, respectively. In tier 2, more detailed information on product use has been obtained from a small survey of consumer product use. A probabilistic exposure assessment is performed to estimate the variability and uncertainty of exposure in a population. Results show that the internal exposure for each paraben is below the level determined in tier 1. However, for propyl- and butylparaben, the percentage of the population with an exposure probability above the assumed "safe" MoE of 100 is 13% and 7%, respectively. In conclusion, a tier 1 approach can be performed using simple equations and default point estimates, and serves as a starting point for exposure and risk assessment. If refinement is warranted, the more data-demanding person-oriented probabilistic approach should be used. This probabilistic approach results in a more realistic exposure estimate, including the uncertainty, and allows determining the main drivers of exposure. Furthermore, it allows estimating the percentage of the population for which the exposure is likely to be above a specific value. PMID:23801276
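
    The two-tier idea can be sketched numerically: a deterministic worst-case aggregate exposure (tier 1) followed by a simple person-oriented Monte Carlo over product use (tier 2). All product amounts, paraben fractions, absorption factors and the toxicological reference value below are placeholder numbers, not the survey data used in the study.

    ```python
    # Hedged sketch of a tier 1 (deterministic worst case) versus tier 2 (Monte Carlo)
    # aggregate exposure estimate; every number here is an invented placeholder.
    import numpy as np

    rng = np.random.default_rng(1)
    BODY_WEIGHT_KG = 10.0
    REFERENCE_MG_KG_DAY = 2.0       # assumed toxicological reference value
    # product: (worst-case amount g/day, paraben fraction, dermal absorption fraction)
    products = {
        "lotion":    (1.0, 0.003, 0.1),
        "sunscreen": (0.8, 0.003, 0.1),
    }

    # Tier 1: deterministic worst case, every product used at its maximum every day.
    tier1 = sum(a * f * ab * 1000 / BODY_WEIGHT_KG for a, f, ab in products.values())
    print(f"Tier 1 exposure {tier1:.3f} mg/kg bw/day, MoE = {REFERENCE_MG_KG_DAY / tier1:.0f}")

    # Tier 2: person-oriented Monte Carlo over use probability and use amount.
    n = 100_000
    exposure = np.zeros(n)
    for a_max, frac, absorb in products.values():
        used = rng.uniform(size=n) < 0.6                          # not everyone uses the product daily
        amount = rng.triangular(0.2 * a_max, 0.5 * a_max, a_max, size=n)
        exposure += used * amount * frac * absorb * 1000 / BODY_WEIGHT_KG
    share_low_moe = np.mean(REFERENCE_MG_KG_DAY / np.maximum(exposure, 1e-9) < 100)
    print(f"Share of the population with MoE < 100: {share_low_moe:.1%}")
    ```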

  1. A Parallel Fast Sweeping Method for the Eikonal Equation

    NASA Astrophysics Data System (ADS)

    Baker, B.

    2017-12-01

    Recently, there has been an exciting emergence of probabilistic methods for travel time tomography. Unlike gradient-based optimization strategies, probabilistic tomographic methods are resistant to becoming trapped in a local minimum and provide a much better quantification of parameter resolution than, say, appealing to ray density or performing checkerboard reconstruction tests. The benefits associated with random sampling methods, however, are only realized by successive computation of predicted travel times in, potentially, strongly heterogeneous media. To this end, this abstract is concerned with expediting the solution of the Eikonal equation. While many Eikonal solvers use a fast marching method, the proposed solver will use the iterative fast sweeping method because the eight fixed sweep orderings in each iteration are natural targets for parallelization. To reduce the number of iterations and grid points required, the high-accuracy finite difference stencil of Nobel et al. (2014) is implemented. A directed acyclic graph (DAG) is created with a priori knowledge of the sweep ordering and finite difference stencil. By performing a topological sort of the DAG, sets of independent nodes are identified as candidates for concurrent updating. Additionally, the proposed solver will address scalability during earthquake relocation, a necessary step in local and regional earthquake tomography and a barrier to extending probabilistic methods from active-source to passive-source applications, by introducing an asynchronous parallel forward-solve phase for all receivers in the network. Synthetic examples using the SEG over-thrust model will be presented.
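
    For reference, a serial fast sweeping update for the 2-D Eikonal equation |∇T| = s with the first-order Godunov stencil and the four fixed sweep orderings can be written compactly; the sketch below omits the high-accuracy stencil, DAG scheduling and parallelism discussed in the abstract.

    ```python
    # Sketch of a serial 2-D fast sweeping Eikonal solver (first-order Godunov update,
    # four fixed sweep orderings); simplified relative to the solver described above.
    import numpy as np

    def fast_sweep(slowness, h, src, n_iter=6):
        ny, nx = slowness.shape
        T = np.full((ny, nx), 1e10)
        T[src] = 0.0
        orders = [(1, 1), (1, -1), (-1, 1), (-1, -1)]   # the four 2-D sweep directions
        for _ in range(n_iter):
            for sy, sx in orders:
                for i in range(ny)[::sy]:
                    for j in range(nx)[::sx]:
                        if (i, j) == src:
                            continue
                        a = min(T[max(i - 1, 0), j], T[min(i + 1, ny - 1), j])
                        b = min(T[i, max(j - 1, 0)], T[i, min(j + 1, nx - 1)])
                        s = slowness[i, j]
                        if abs(a - b) >= s * h:                      # one-sided update
                            t_new = min(a, b) + s * h
                        else:                                        # two-sided quadratic update
                            t_new = 0.5 * (a + b + np.sqrt(2 * s**2 * h**2 - (a - b) ** 2))
                        T[i, j] = min(T[i, j], t_new)
        return T

    # Homogeneous-medium check: travel times approach slowness * distance.
    T = fast_sweep(np.ones((101, 101)), h=0.01, src=(50, 50))
    print(T[50, 100], T[100, 100])   # expected ~0.5 and ~0.707 for unit slowness
    ```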

  2. Global Infrasound Association Based on Probabilistic Clutter Categorization

    NASA Astrophysics Data System (ADS)

    Arora, Nimar; Mialle, Pierrick

    2016-04-01

    The IDC advances its methods and continuously improves its automatic system for the infrasound technology. The IDC focuses on enhancing the automatic system for the identification of valid signals and the optimization of the network detection threshold by identifying ways to refine signal characterization methodology and association criteria. An objective of this study is to reduce the number of associated infrasound arrivals that are rejected from the automatic bulletins when generating the reviewed event bulletins. Indeed, a considerable number of signal detections are due to local clutter sources such as microbaroms, waterfalls, dams, gas flares, surf (ocean breaking waves) etc. These sources are either too diffuse or too local to form events. Worse still, the repetitive nature of this clutter leads to a large number of false event hypotheses due to the random matching of clutter at multiple stations. Previous studies, for example [1], have worked on categorization of clutter using long term trends on detection azimuth, frequency, and amplitude at each station. In this work we continue the same line of reasoning to build a probabilistic model of clutter that is used as part of NETVISA [2], a Bayesian approach to network processing. The resulting model is a fusion of seismic, hydroacoustic and infrasound processing built on a unified probabilistic framework. References: [1] Infrasound categorization Towards a statistics based approach. J. Vergoz, P. Gaillard, A. Le Pichon, N. Brachet, and L. Ceranna. ITW 2011 [2] NETVISA: Network Processing Vertically Integrated Seismic Analysis. N. S. Arora, S. Russell, and E. Sudderth. BSSA 2013

  3. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Adamov, E.O.; Lebedev, V.A.; Kuznetsov, Yu.N.

    Zheleznogorsk is situated near the territorial center -- Krasnoyarsk on the Yenisei river. Mining and chemical complex is the main industrial enterprise of the town, which has been constructed for generation and used for isolation of weapons-grade plutonium. Heat supply to the chemical complex and town at the moment is largely provided by nuclear co-generation plant (NCGP) on the basis of the ADEh-2 dual-purpose reactor, generating 430 Gcal/h of heat and, partially, by coal backup peak-load boiler houses. NCGP also provides 73% of electric power consumed. In line with agreements between Russia and USA on strategic arms reduction and phasingmore » out of weapons-grade plutonium production, decommissioning of the ADEh-2 reactor by 2000 is planned. Thus, a problem arises relative to compensation for electric and thermal power generation for the needs of the town and industrial enterprises, which is now supplied by the reactor. A nuclear power plant constructed on the same site as a substituting power source should be considered as the most practical option. Basic requirements to the reactor of substituting nuclear power plant are as follows. It is to be a new generation reactor on the basis of verified technologies, having an operating prototype optimal for underground siting and permitting utmost utilization of the available mining workings and those being disengaged. NCGP with the reactor is to be constructed in the time period required and is to become competitive with other possible power sources. Analysis has shown that the VK-300 simplified vessel-type boiling reactor meets the requirements made in the maximum extent. Its design is based on the experience of the VK-50 reactor operation for a period of 30 years in Dimitrovgrad (Russia) and allows for experience in the development of the SBWR type reactors. The design of the reactor is discussed.« less

  4. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ramuhalli, Pradeep; Hirt, Evelyn H.; Veeramany, Arun

    This research report summaries the development and evaluation of a prototypic enhanced risk monitor (ERM) methodology (framework) that includes alternative risk metrics and uncertainty analysis. This updated ERM methodology accounts for uncertainty in the equipment condition assessment (ECA), the prognostic result, and the probabilistic risk assessment (PRA) model. It is anticipated that the ability to characterize uncertainty in the estimated risk and update the risk estimates in real time based on equipment condition assessment (ECA) will provide a mechanism for optimizing plant performance while staying within specified safety margins. These results (based on impacting active component O&M using real-time equipmentmore » condition information) are a step towards ERMs that, if integrated with AR supervisory plant control systems, can help control O&M costs and improve affordability of advanced reactors.« less

  5. Level-2 IPE for the Laguna Verde NPS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Arellano, J.; De Loera, M.A.; Rea, R.

    1996-12-31

    In response to generic letter GL 88-20, Comision Federal de Electricidad and Instituto de Investigaciones Electricas have jointly developed the individual plant examination (IPE) for the Laguna Verde nuclear power station unit I (LVNPS). This plant is a 675-MW(electric) boiling water reactor (BWR/5) with a reinforced concrete Mark-II containment. The approach used to fulfill the IPE requirements was to make a level-1 probabilistic risk assessment (IPE level 1) plus a containment performance analysis including the behavior and release of the fission products to the environment (IPE level 2). This paper describes the level-2 portion of the LVNPS IPE, paying specialmore » attention to both some improvements to the traditional analytical methods and to the main results.« less

  6. Probabilistic techniques for obtaining accurate patient counts in Clinical Data Warehouses

    PubMed Central

    Myers, Risa B.; Herskovic, Jorge R.

    2011-01-01

    Proposal and execution of clinical trials, computation of quality measures and discovery of correlation between medical phenomena are all applications where an accurate count of patients is needed. However, existing sources of this type of patient information, including Clinical Data Warehouses (CDW) may be incomplete or inaccurate. This research explores applying probabilistic techniques, supported by the MayBMS probabilistic database, to obtain accurate patient counts from a clinical data warehouse containing synthetic patient data. We present a synthetic clinical data warehouse (CDW), and populate it with simulated data using a custom patient data generation engine. We then implement, evaluate and compare different techniques for obtaining patients counts. We model billing as a test for the presence of a condition. We compute billing’s sensitivity and specificity both by conducting a “Simulated Expert Review” where a representative sample of records are reviewed and labeled by experts, and by obtaining the ground truth for every record. We compute the posterior probability of a patient having a condition through a “Bayesian Chain”, using Bayes’ Theorem to calculate the probability of a patient having a condition after each visit. The second method is a “one-shot” approach that computes the probability of a patient having a condition based on whether the patient is ever billed for the condition Our results demonstrate the utility of probabilistic approaches, which improve on the accuracy of raw counts. In particular, the simulated review paired with a single application of Bayes’ Theorem produces the best results, with an average error rate of 2.1% compared to 43.7% for the straightforward billing counts. Overall, this research demonstrates that Bayesian probabilistic approaches improve patient counts on simulated patient populations. We believe that total patient counts based on billing data are one of the many possible applications of our Bayesian framework. Use of these probabilistic techniques will enable more accurate patient counts and better results for applications requiring this metric. PMID:21986292

  7. A probabilistic seismic model for the European Arctic

    NASA Astrophysics Data System (ADS)

    Hauser, Juerg; Dyer, Kathleen M.; Pasyanos, Michael E.; Bungum, Hilmar; Faleide, Jan I.; Clark, Stephen A.; Schweitzer, Johannes

    2011-01-01

    The development of three-dimensional seismic models for the crust and upper mantle has traditionally focused on finding one model that provides the best fit to the data while observing some regularization constraints. In contrast to this, the inversion employed here fits the data in a probabilistic sense and thus provides a quantitative measure of model uncertainty. Our probabilistic model is based on two sources of information: (1) prior information, which is independent from the data, and (2) different geophysical data sets, including thickness constraints, velocity profiles, gravity data, surface wave group velocities, and regional body wave traveltimes. We use a Markov chain Monte Carlo (MCMC) algorithm to sample models from the prior distribution, the set of plausible models, and test them against the data to generate the posterior distribution, the ensemble of models that fit the data with assigned uncertainties. While being computationally more expensive, such a probabilistic inversion provides a more complete picture of solution space and allows us to combine various data sets. The complex geology of the European Arctic, encompassing oceanic crust, continental shelf regions, rift basins and old cratonic crust, as well as the nonuniform coverage of the region by data with varying degrees of uncertainty, makes it a challenging setting for any imaging technique and, therefore, an ideal environment for demonstrating the practical advantages of a probabilistic approach. Maps of depth to basement and depth to Moho derived from the posterior distribution are in good agreement with previously published maps and interpretations of the regional tectonic setting. The predicted uncertainties, which are as important as the absolute values, correlate well with the variations in data coverage and quality in the region. A practical advantage of our probabilistic model is that it can provide estimates for the uncertainties of observables due to model uncertainties. We will demonstrate how this can be used for the formulation of earthquake location algorithms that take model uncertainties into account when estimating location uncertainties.

  8. Seismic Hazard analysis of Adjaria Region in Georgia

    NASA Astrophysics Data System (ADS)

    Jorjiashvili, Nato; Elashvili, Mikheil

    2014-05-01

    The most commonly used approach to determining seismic-design loads for engineering projects is probabilistic seismic-hazard analysis (PSHA). The primary output from a PSHA is a hazard curve showing the variation of a selected ground-motion parameter, such as peak ground acceleration (PGA) or spectral acceleration (SA), against the annual frequency of exceedance (or its reciprocal, return period). The design value is the ground-motion level that corresponds to a preselected design return period. For many engineering projects, such as standard buildings and typical bridges, the seismic loading is taken from the appropriate seismic-design code, the basis of which is usually a PSHA. For more important engineering projects— where the consequences of failure are more serious, such as dams and chemical plants—it is more usual to obtain the seismic-design loads from a site-specific PSHA, in general, using much longer return periods than those governing code based design. Calculation of Probabilistic Seismic Hazard was performed using Software CRISIS2007 by Ordaz, M., Aguilar, A., and Arboleda, J., Instituto de Ingeniería, UNAM, Mexico. CRISIS implements a classical probabilistic seismic hazard methodology where seismic sources can be modelled as points, lines and areas. In the case of area sources, the software offers an integration procedure that takes advantage of a triangulation algorithm used for seismic source discretization. This solution improves calculation efficiency while maintaining a reliable description of source geometry and seismicity. Additionally, supplementary filters (e.g. fix a sitesource distance that excludes from calculation sources at great distance) allow the program to balance precision and efficiency during hazard calculation. Earthquake temporal occurrence is assumed to follow a Poisson process, and the code facilitates two types of MFDs: a truncated exponential Gutenberg-Richter [1944] magnitude distribution and a characteristic magnitude distribution [Youngs and Coppersmith, 1985]. Notably, the software can deal with uncertainty in the seismicity input parameters such as maximum magnitude value. CRISIS offers a set of built-in GMPEs, as well as the possibility of defining new ones by providing information in a tabular format. Our study shows that in case of Ajaristkali HPP study area, significant contribution to Seismic Hazard comes from local sources with quite low Mmax values, thus these two attenuation lows give us quite different PGA and SA values.

  9. Applications of plasma core reactors to terrestrial energy systems

    NASA Technical Reports Server (NTRS)

    Latham, T. S.; Biancardi, F. R.; Rodgers, R. J.

    1974-01-01

    Plasma core reactors offer several new options for future energy needs in addition to space power and propulsion applications. Power extraction from plasma core reactors with gaseous nuclear fuel allows operation at temperatures higher than conventional reactors. Highly efficient thermodynamic cycles and applications employing direct coupling of radiant energy are possible. Conceptual configurations of plasma core reactors for terrestrial applications are described. Closed-cycle gas turbines, MHD systems, photo- and thermo-chemical hydrogen production processes, and laser systems using plasma core reactors as prime energy sources are considered. Cycle efficiencies in the range of 50 to 65 percent are calculated for closed-cycle gas turbine and MHD electrical generators. Reactor advantages include continuous fuel reprocessing which limits inventory of radioactive by-products and thorium-U-233 breeder configurations with about 5-year doubling times.-

  10. Flowsheets and source terms for radioactive waste projections

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Forsberg, C.W.

    1985-03-01

    Flowsheets and source terms used to generate radioactive waste projections in the Integrated Data Base (IDB) Program are given. Volumes of each waste type generated per unit product throughput have been determined for the following facilities: uranium mining, UF/sub 6/ conversion, uranium enrichment, fuel fabrication, boiling-water reactors (BWRs), pressurized-water reactors (PWRs), and fuel reprocessing. Source terms for DOE/defense wastes have been developed. Expected wastes from typical decommissioning operations for each facility type have been determined. All wastes are also characterized by isotopic composition at time of generation and by general chemical composition. 70 references, 21 figures, 53 tables.

  11. Quantifying uncertainty in stable isotope mixing models

    DOE PAGES

    Davis, Paul; Syme, James; Heikoop, Jeffrey; ...

    2015-05-19

    Mixing models are powerful tools for identifying biogeochemical sources and determining mixing fractions in a sample. However, identification of actual source contributors is often not simple, and source compositions typically vary or even overlap, significantly increasing model uncertainty in calculated mixing fractions. This study compares three probabilistic methods, SIAR [ Parnell et al., 2010] a pure Monte Carlo technique (PMC), and Stable Isotope Reference Source (SIRS) mixing model, a new technique that estimates mixing in systems with more than three sources and/or uncertain source compositions. In this paper, we use nitrate stable isotope examples (δ 15N and δ 18O) butmore » all methods tested are applicable to other tracers. In Phase I of a three-phase blind test, we compared methods for a set of six-source nitrate problems. PMC was unable to find solutions for two of the target water samples. The Bayesian method, SIAR, experienced anchoring problems, and SIRS calculated mixing fractions that most closely approximated the known mixing fractions. For that reason, SIRS was the only approach used in the next phase of testing. In Phase II, the problem was broadened where any subset of the six sources could be a possible solution to the mixing problem. Results showed a high rate of Type I errors where solutions included sources that were not contributing to the sample. In Phase III some sources were eliminated based on assumed site knowledge and assumed nitrate concentrations, substantially reduced mixing fraction uncertainties and lowered the Type I error rate. These results demonstrate that valuable insights into stable isotope mixing problems result from probabilistic mixing model approaches like SIRS. The results also emphasize the importance of identifying a minimal set of potential sources and quantifying uncertainties in source isotopic composition as well as demonstrating the value of additional information in reducing the uncertainty in calculated mixing fractions.« less

  12. The United Arab Emirates Nuclear Program and Proposed U.S. Nuclear Cooperation

    DTIC Science & Technology

    2009-10-28

    global efforts to prevent nuclear proliferation” and, “the establishment of reliable sources of nuclear fuel for future civilian light water reactors ...nuclear reactor or on handling spent reactor fuel. (...continued) May 4, 2008; and, Chris...related to the UAE’s proposed nuclear program has already taken place. In August 2008, Virginia’s Thorium Power Ltd. signed two consulting and

  13. The United Arab Emirates Nuclear Program and Proposed U.S. Nuclear Cooperation

    DTIC Science & Technology

    2009-07-17

    global efforts to prevent nuclear proliferation” and, “the establishment of reliable sources of nuclear fuel for future civilian light water reactors ...planned nuclear reactor or on handling spent reactor fuel. (...continued) May 4, 2008...contracting between U.S. firms and the UAE related to the UAE’s proposed nuclear program has already taken place. In August 2008, Virginia’s Thorium Power

  14. Safety and Environment aspects of Tokamak- type Fusion Power Reactor- An Overview

    NASA Astrophysics Data System (ADS)

    Doshi, Bharat; Reddy, D. Chenna

    2017-04-01

    Naturally occurring thermonuclear fusion reaction (of light atoms to form a heavier nucleus) in the sun and every star in the universe, releases incredible amounts of energy. Demonstrating the controlled and sustained reaction of deuterium-tritium plasma should enable the development of fusion as an energy source here on Earth. The promising fusion power reactors could be operated on the deuterium-tritium fuel cycle with fuel self-sufficiency. The potential impact of fusion power on the environment and the possible risks associated with operating large-scale fusion power plants is being studied by different countries. The results show that fusion can be a very safe and sustainable energy source. A fusion power plant possesses not only intrinsic advantages with respect to safety compared to other sources of energy, but also a negligible long term impact on the environment provided certain precautions are taken in its design. One of the important considerations is in the selection of low activation structural materials for reactor vessel. Selection of the materials for first wall and breeding blanket components is also important from safety issues. It is possible to fully benefit from the advantages of fusion energy if safety and environmental concerns are taken into account when considering the conceptual studies of a reactor design. The significant safety hazards are due to the tritium inventory and energetic neutron fluence induced activity in the reactor vessel, first wall components, blanket system etc. The potential of release of radioactivity under operational and accident conditions needs attention while designing the fusion reactor. Appropriate safety analysis for the quantification of the risk shall be done following different methods such as FFMEA (Functional Failure Modes and Effects Analysis) and HAZOP (Hazards and operability). Level of safety and safety classification such as nuclear safety and non-nuclear safety is very important for the FPR (Fusion Power Reactor). This paper describes an overview of safety and environmental merits of fusion power reactor, issues and design considerations and need for R&D on safety and environmental aspects of Tokamak type fusion reactor.

  15. Characterization of elemental release during microbe granite interactions at T = 28 °C

    NASA Astrophysics Data System (ADS)

    Wu, Lingling; Jacobson, Andrew D.; Hausner, Martina

    2008-02-01

    This study used batch reactors to characterize the mechanisms and rates of elemental release (Al, Ca, K, Mg, Na, F, Fe, P, Sr, and Si) during interaction of a single bacterial species ( Burkholderia fungorum) with granite at T = 28 °C for 35 days. The objective was to evaluate how actively metabolizing heterotrophic bacteria might influence granite weathering on the continents. We supplied glucose as a C source, either NH 4 or NO 3 as N sources, and either dissolved PO 4 or trace apatite in granite as P sources. Cell growth occurred under all experimental conditions. However, solution pH decreased from ˜7 to 4 in NH 4-bearing reactors, whereas pH remained near-neutral in NO 3-bearing reactors. Measurements of dissolved CO 2 and gluconate together with mass-balances for cell growth suggest that pH lowering in NH 4-bearing reactors resulted from gluconic acid release and H + extrusion during NH 4 uptake. In NO 3-bearing reactors, B. fungormum likely produced gluconic acid and consumed H + simultaneously during NO 3 utilization. Over the entire 35-day period, NH 4-bearing biotic reactors yielded the highest release rates for all elements considered. However, chemical analyses of biomass show that bacteria scavenged Na, P, and Sr during growth. Abiotic control reactors followed different reaction paths and experienced much lower elemental release rates compared to biotic reactors. Because release rates inversely correlate with pH, we conclude that proton-promoted dissolution was the dominant reaction mechanism. Solute speciation modeling indicates that formation of Al-F and Fe-F complexes in biotic reactors may have enhanced mineral solubilities and release rates by lowering Al and Fe activities. Mass-balances further reveal that Ca-bearing trace phases (calcite, fluorite, and fluorapatite) provided most of the dissolved Ca, whereas more abundant phases (plagioclase) contributed negligible amounts. Our findings imply that during the incipient stages of granite weathering, heterotrophic bacteria utilizing glucose and NH 4 only moderately elevate silicate weathering reactions that consume atmospheric CO 2. However, by enhancing the dissolution of non-silicate, Ca-bearing trace minerals, they could contribute to high Ca/Na ratios commonly observed in granitic watersheds.

  16. Probabilistic Modeling of Childhood Multimedia Lead Exposures: Examining the Soil Ingestion Pathway

    EPA Science Inventory

    BACKGROUND: Drinking water and other sources for lead are the subject of public health concerns around the Flint, Michigan, drinking water and East Chicago, Indiana, lead in soil crises. In 2015, the U.S. Environmental Protection Agency (EPA)’s National Drinking Water Advis...

  17. Distillation of squeezing from non-Gaussian quantum states.

    PubMed

    Heersink, J; Marquardt, Ch; Dong, R; Filip, R; Lorenz, S; Leuchs, G; Andersen, U L

    2006-06-30

    We show that single copy distillation of squeezing from continuous variable non-Gaussian states is possible using linear optics and conditional homodyne detection. A specific non-Gaussian noise source, corresponding to a random linear displacement, is investigated experimentally. Conditioning the signal on a tap measurement, we observe probabilistic recovery of squeezing.

  18. [The SILENE reactor: a tool adapted for applied study of moderate and large doses].

    PubMed

    Verrey, B; Leo, Y; Fouillaud, P

    2002-07-01

    Designed in 1974 to study the phenomenology and consequences of a critical accident, the SILENE experimental reactor, an intense source of mixed neutron and gamma radiation, is also suited to radiobiological studies.

  19. Two-phase anaerobic digestion of source sorted OFMSW (organic fraction of municipal solid waste): performance and kinetic study.

    PubMed

    Pavan, P; Battistoni, P; Cecchi, F; Mata-Alvarez, J

    2000-01-01

    The results of a two-phase system operated in different conditions, treating the source-sorted organic fraction of municipal solid waste (SS-OFMSW), coming mainly from fruit and vegetable markets, are presented. Hydraulic retention time (HRT) in the hydrolytic reactor and in the methanogenic reactor and also the temperature in the hydrolytic reactor (mesophilic and thermophilic conditions) are varied in order to evaluate the effect of these factors. The methanogenic reactor is always operated within the thermophilic range. Optimum operating conditions are found to be around 12 days (total system) using the mesophilic range of temperature in the first reactor. Specific gas production (SGP) in these conditions is around 0.6 m3/kg TVS. A kinetic study is also carried out, using the first and the step diffusional models. The latter gives much better results, with fitted constants comparable to other studies. Finally, a comparison with a one-phase system is carried out, showing that a two-phase system is much more appropriate for the digestion of this kind of highly biodegradable substrate in thermophilic conditions.

  20. Status and problems of fusion reactor development.

    PubMed

    Schumacher, U

    2001-03-01

    Thermonuclear fusion of deuterium and tritium constitutes an enormous potential for a safe, environmentally compatible and sustainable energy supply. The fuel source is practically inexhaustible. Further, the safety prospects of a fusion reactor are quite favourable due to the inherently self-limiting fusion process, the limited radiologic toxicity and the passive cooling property. Among a small number of approaches, the concept of toroidal magnetic confinement of fusion plasmas has achieved most impressive scientific and technical progress towards energy release by thermonuclear burn of deuterium-tritium fuels. The status of thermonuclear fusion research activity world-wide is reviewed and present solutions to the complicated physical and technological problems are presented. These problems comprise plasma heating, confinement and exhaust of energy and particles, plasma stability, alpha particle heating, fusion reactor materials, reactor safety and environmental compatibility. The results and the high scientific level of this international research activity provide a sound basis for the realisation of the International Thermonuclear Experimental Reactor (ITER), whose goal is to demonstrate the scientific and technological feasibility of a fusion energy source for peaceful purposes.

  1. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Soldevilla, M.; Salmons, S.; Espinosa, B.

    The new application BDDR (Reactor database) has been developed at CEA in order to manage nuclear reactors technological and operating data. This application is a knowledge management tool which meets several internal needs: -) to facilitate scenario studies for any set of reactors, e.g. non-proliferation assessments; -) to make core physics studies easier, whatever the reactor design (PWR-Pressurized Water Reactor-, BWR-Boiling Water Reactor-, MAGNOX- Magnesium Oxide reactor-, CANDU - CANada Deuterium Uranium-, FBR - Fast Breeder Reactor -, etc.); -) to preserve the technological data of all reactors (past and present, power generating or experimental, naval propulsion,...) in a uniquemore » repository. Within the application database are enclosed location data and operating history data as well as a tree-like structure containing numerous technological data. These data address all kinds of reactors features and components. A few neutronics data are also included (neutrons fluxes). The BDDR application is based on open-source technologies and thin client/server architecture. The software architecture has been made flexible enough to allow for any change. (authors)« less

  2. Information Expensiveness Perceived by Vietnamese Patients with Respect to Healthcare Provider’s Choice

    PubMed Central

    Quan-Hoang, Vuong

    2016-01-01

    Background: Patients have to acquire information to support their decision on choosing a suitable healthcare provider. But in developing countries like Vietnam, accessibility issues remain an obstacle, thus adversely affect both quality and costliness of healthcare information. Vietnamese use both sources from health professionals and friends/relatives, especially when quality of the Internet-based cheaper sources appear to be still questionable. The search of information from both professionals and friends/relatives incurs some cost, which can be viewed as low or high depending low or high accessibility to the sources. These views potentially affect their choices. Aim and Objectives: To investigate the effects that medical/health services information on perceived expensiveness of patients’ labor costs. Two related objectives are a) establishing empirical relations between accessibility to sources and expensiveness; and, b) probabilistic trends of probabilities for perceived expensiveness. Results: There is evidence for established relations among the variables “Convexp” and “Convrel” (all p’s < 0.01), indicating that both information sources (experts and friends/relatives) have influence on patients perception of information expensiveness. The use of experts source tends to increase the probability of perceived expensiveness. Conclusion: a) Probabilistic trends show Vietnamese patients have propensity to value healthcare information highly and do not see it as “expensive”; b) The majority of Vietnamese households still take non-professional advices at their own risks; c) There is more for the public healthcare information system to do to reduce costliness and risk of information. The Internet-based health service users communities cannot replace this system. PMID:28077894

  3. Probabilistic Estimates of Global Mean Sea Level and its Underlying Processes

    NASA Astrophysics Data System (ADS)

    Hay, C.; Morrow, E.; Kopp, R. E.; Mitrovica, J. X.

    2015-12-01

    Local sea level can vary significantly from the global mean value due to a suite of processes that includes ongoing sea-level changes due to the last ice age, land water storage, ocean circulation changes, and non-uniform sea-level changes that arise when modern-day land ice rapidly melts. Understanding these sources of spatial and temporal variability is critical to estimating past and present sea-level change and projecting future sea-level rise. Using two probabilistic techniques, a multi-model Kalman smoother and Gaussian process regression, we have reanalyzed 20th century tide gauge observations to produce a new estimate of global mean sea level (GMSL). Our methods allow us to extract global information from the sparse tide gauge field by taking advantage of the physics-based and model-derived geometry of the contributing processes. Both methods provide constraints on the sea-level contribution of glacial isostatic adjustment (GIA). The Kalman smoother tests multiple discrete models of glacial isostatic adjustment (GIA), probabilistically computing the most likely GIA model given the observations, while the Gaussian process regression characterizes the prior covariance structure of a suite of GIA models and then uses this structure to estimate the posterior distribution of local rates of GIA-induced sea-level change. We present the two methodologies, the model-derived geometries of the underlying processes, and our new probabilistic estimates of GMSL and GIA.

  4. Willingness-to-pay for a probabilistic flood forecast: a risk-based decision-making game

    NASA Astrophysics Data System (ADS)

    Arnal, Louise; Ramos, Maria-Helena; Coughlan de Perez, Erin; Cloke, Hannah Louise; Stephens, Elisabeth; Wetterhall, Fredrik; van Andel, Schalk Jan; Pappenberger, Florian

    2016-08-01

    Probabilistic hydro-meteorological forecasts have over the last decades been used more frequently to communicate forecast uncertainty. This uncertainty is twofold, as it constitutes both an added value and a challenge for the forecaster and the user of the forecasts. Many authors have demonstrated the added (economic) value of probabilistic over deterministic forecasts across the water sector (e.g. flood protection, hydroelectric power management and navigation). However, the richness of the information is also a source of challenges for operational uses, due partially to the difficulty in transforming the probability of occurrence of an event into a binary decision. This paper presents the results of a risk-based decision-making game on the topic of flood protection mitigation, called "How much are you prepared to pay for a forecast?". The game was played at several workshops in 2015, which were attended by operational forecasters and academics working in the field of hydro-meteorology. The aim of this game was to better understand the role of probabilistic forecasts in decision-making processes and their perceived value by decision-makers. Based on the participants' willingness-to-pay for a forecast, the results of the game show that the value (or the usefulness) of a forecast depends on several factors, including the way users perceive the quality of their forecasts and link it to the perception of their own performances as decision-makers.

  5. How safe is safe enough. The relation of environmental characteristics and economic competitiveness in fusion-reactor design

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Holdren, J.P.

    The need for fusion energy depends strongly on fusion's potential to achieve ambitious safety goals more completely or more economically than fission can. The history and present complexion of public opinion about environment and safety gives little basis for expecting either that these concerns will prove to be a passing fad or that the public will make demands for zero risk that no energy source can meet. Hazard indices based on ''worst case'' accidents and exposures should be used as design tools to promote combinations of fusion-reactor materials and configurations that bring the worst cases down to levels small comparedmore » to the hazards people tolerate from electricity at the point of end use. It may well be possible, by building such safety into fusion from the ground up, to accomplish this goal at costs competitive with other inexhaustible electricity sources. Indeed, the still rising and ultimately indeterminate costs of meeting safety and environmental requirements in nonbreeder fission reactors and coal-burning power plants mean that fusion reactors meeting ambitious safety goals may be able to compete economically with these ''interim'' electricity sources as well.« less

  6. Gaseous fuel reactors for power systems

    NASA Technical Reports Server (NTRS)

    Kendall, J. S.; Rodgers, R. J.

    1977-01-01

    Gaseous-fuel nuclear reactors have significant advantages as energy sources for closed-cycle power systems. The advantages arise from the removal of temperature limits associated with conventional reactor fuel elements, the wide variety of methods of extracting energy from fissioning gases, and inherent low fissile and fission product in-core inventory due to continuous fuel reprocessing. Example power cycles and their general performance characteristics are discussed. Efficiencies of gaseous fuel reactor systems are shown to be high with resulting minimal environmental effects. A technical overview of the NASA-funded research program in gaseous fuel reactors is described and results of recent tests of uranium hexafluoride (UF6)-fueled critical assemblies are presented.

  7. The role of PRA in the safety assessment of VVER Nuclear Power Plants in Ukraine.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kot, C.

    1999-05-10

    Ukraine operates thirteen (13) Soviet-designed pressurized water reactors, VVERS. All Ukrainian plants are currently operating with annually renewable permits until they update their safety analysis reports (SARs), in accordance with new SAR content requirements issued in September 1995, by the Nuclear Regulatory Authority and the Government Nuclear Power Coordinating Committee of Ukraine. The requirements are in three major areas: design basis accident (DBA) analysis, probabilistic risk assessment (PRA), and beyond design-basis accident (BDBA) analysis. The last two requirements, on PRA and BDBA, are new, and the DBA requirements are an expanded version of the older SAR requirements. The US Departmentmore » of Energy (USDOE), as part of its Soviet-Designed Reactor Safety activities, is providing assistance and technology transfer to Ukraine to support their nuclear power plants (NPPs) in developing a Western-type technical basis for the new SARs. USDOE sponsored In-Depth Safety Assessments (ISAs) are in progress at three pilot nuclear reactor units in Ukraine, South Ukraine Unit 1, Zaporizhzhya Unit 5, and Rivne Unit 1, and a follow-on study has been initiated at Khmenytskyy Unit 1. The ISA projects encompass most areas of plant safety evaluation, but the initial emphasis is on performing a detailed, plant-specific Level 1 Internal Events PRA. This allows the early definition of the plant risk profile, the identification of risk significant accident sequences and plant vulnerabilities and provides guidance for the remainder of the safety assessments.« less

  8. Probabilistic seismic hazard analyses for ground motions and fault displacement at Yucca Mountain, Nevada

    USGS Publications Warehouse

    Stepp, J.C.; Wong, I.; Whitney, J.; Quittmeyer, R.; Abrahamson, N.; Toro, G.; Young, S.R.; Coppersmith, K.; Savy, J.; Sullivan, T.

    2001-01-01

    Probabilistic seismic hazard analyses were conducted to estimate both ground motion and fault displacement hazards at the potential geologic repository for spent nuclear fuel and high-level radioactive waste at Yucca Mountain, Nevada. The study is believed to be the largest and most comprehensive analyses ever conducted for ground-shaking hazard and is a first-of-a-kind assessment of probabilistic fault displacement hazard. The major emphasis of the study was on the quantification of epistemic uncertainty. Six teams of three experts performed seismic source and fault displacement evaluations, and seven individual experts provided ground motion evaluations. State-of-the-practice expert elicitation processes involving structured workshops, consensus identification of parameters and issues to be evaluated, common sharing of data and information, and open exchanges about the basis for preliminary interpretations were implemented. Ground-shaking hazard was computed for a hypothetical rock outcrop at -300 m, the depth of the potential waste emplacement drifts, at the designated design annual exceedance probabilities of 10-3 and 10-4. The fault displacement hazard was calculated at the design annual exceedance probabilities of 10-4 and 10-5.

  9. Portable thermo-photovoltaic power source

    DOEpatents

    Zuppero, Anthony C.; Krawetz, Barton; Barklund, C. Rodger; Seifert, Gary D.

    1997-01-14

    A miniature thermo-photovoltaic (TPV) device for generation of electrical power for use in portable electronic devices. A TPV power source is constructed to provide a heat source chemical reactor capable of using various fuels, such as liquid hydrocarbons, including but not limited to propane, LPG, butane, alcohols, oils and diesel fuels to generate a source of photons. A reflector dish guides misdirected photon energy from the photon source toward a photovoltaic array. A thin transparent protector sheet is disposed between the photon source and the array to reflect back thermal energy that cannot be converted to electricity, and protect the array from thermal damage. A microlens disposed between the protector sheet and the array further focuses the tailored band of photon energy from the photon source onto an array of photovoltaic cells, whereby the photon energy is converted to electrical power. A heat recuperator removes thermal energy from reactor chamber exhaust gases, preferably using mini- or micro-bellows to force air and fuel past the exhaust gases, and uses the energy to preheat the fuel and oxidant before it reaches the reactor, increasing system efficiency. Mini- or micro-bellows force ambient air through the system both to supply oxidant and to provide cooling. Finally, an insulator, which is preferably a super insulator, is disposed around the TPV power source to reduce fuel consumption, and to keep the TPV power source cool to the touch so it can be used in hand-held devices.

  10. Nuclear reactors built, being built, or planned, 1991

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Simpson, B.

    1992-07-01

    This document contains unclassified information about facilities built, being built, or planned in the United States for domestic use or export as of December 31, 1991. The book is divided into three major sections: Section 1 consists of a reactor locator map and reactor tables; Section 2 includes nuclear reactors that are operating, being built, or planned; and Section 3 includes reactors that have been shut down permanently or dismantled. Sections 2 and 3 contain the following classification of reactors: Civilian, Production, Military, Export, and Critical Assembly. Export reactor refers to a reactor for which the principal nuclear contractor ismore » an American company -- working either independently or in cooperation with a foreign company (Part 4, in each section). Critical assembly refers to an assembly of fuel and assembly of fuel and moderator that requires an external source of neutrons to initiate and maintain fission. A critical assembly is used for experimental measurements (Part 5).« less

  11. Site specific probabilistic seismic hazard analysis at Dubai Creek on the west coast of UAE

    NASA Astrophysics Data System (ADS)

    Shama, Ayman A.

    2011-03-01

    A probabilistic seismic hazard analysis (PSHA) was conducted to establish the hazard spectra for a site located at Dubai Creek on the west coast of the United Arab Emirates (UAE). The PSHA considered all the seismogenic sources that affect the site, including plate boundaries such as the Makran subduction zone, the Zagros fold-thrust region and the transition fault system between them; and local crustal faults in UAE. PSHA indicated that local faults dominate the hazard. The peak ground acceleration (PGA) for the 475-year return period spectrum is 0.17 g and 0.33 g for the 2,475-year return period spectrum. The hazard spectra are then employed to establish rock ground motions using the spectral matching technique.

  12. Durability reliability analysis for corroding concrete structures under uncertainty

    NASA Astrophysics Data System (ADS)

    Zhang, Hao

    2018-02-01

    This paper presents a durability reliability analysis of reinforced concrete structures subject to the action of marine chloride. The focus is to provide insight into the role of epistemic uncertainties on durability reliability. The corrosion model involves a number of variables whose probabilistic characteristics cannot be fully determined due to the limited availability of supporting data. All sources of uncertainty, both aleatory and epistemic, should be included in the reliability analysis. Two methods are available to formulate the epistemic uncertainty: the imprecise probability-based method and the purely probabilistic method in which the epistemic uncertainties are modeled as random variables. The paper illustrates how the epistemic uncertainties are modeled and propagated in the two methods, and shows how epistemic uncertainties govern the durability reliability.

  13. Method for producing H.sub.2 using a rotating drum reactor with a pulse jet heat source

    DOEpatents

    Paulson, Leland E.

    1990-01-01

    A method of producing hydrogen by an endothermic steam-carbon reaction using a rotating drum reactor and a pulse jet combustor. The pulse jet combustor uses coal dust as a fuel to provide reaction temperatures of 1300.degree. to 1400.degree. F. Low-rank coal, water, limestone and catalyst are fed into the drum reactor where they are heated, tumbled and reacted. Part of the reaction product from the rotating drum reactor is hydrogen which can be utilized in suitable devices.

  14. Movable-molybdenum-reflector reactivity experiments for control studies of compact space power reactor concepts

    NASA Technical Reports Server (NTRS)

    Fox, T. A.

    1973-01-01

    An experimental reflector reactivity study was made with a compact cylindrical reactor using a uranyl fluoride - water fuel solution. The reactor was axially unreflected and radially reflected with segments of molybdenum. The reflector segments were displaced incrementally in both the axial and radial dimensions, and the shutdown of each configuration was measured by using the pulsed-neutron source technique. The reactivity effects for axial and radial displacement of reflector segments are tabulated separately and compared. The experiments provide data for control-system studies of compact-space-power-reactor concepts.

  15. Method of production H/sub 2/ using a rotating drum reactor with a pulse jet heat source

    DOEpatents

    Paulson, L.E.

    1988-05-13

    A method of producing hydrogen by an endothermic steam-carbon reaction using a rotating drum reactor and a pulse jet combustor. The pulse jet combustor uses coal dust as a fuel to provide reaction temperatures of 1300/degree/ to 1400/degree/F. Low-rank coal, water, limestone and catalyst are fed into the drum reactor where they are heated, tumbled and reacted. Part of the reaction product from the rotating drum reactor is hydrogen which can be utilized in suitable devices. 1 fig.

  16. Production and study of radionuclides at the research institute of atomic reactors (NIIAR)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Karelin, E.A.; Gordeev, Y.N.; Filimonov, V.T.

    1995-01-01

    The main works of the Radionuclide Sources and Preparations Department (ORIP) of the Research Institute of Atomic Reactors (NIIAR) are summarized. The major activity of the Radionuclide Sources and Preparations Department (ORIP) is aimed at production of radioactive preparations of trans-plutonium elements (TPE) and also of lighter elements (from P to Ir), manufacture of ionizing radiation sources thereof, and scientific research to develop new technologies. One of the radionuclides that recently has received major attention is gadolinium-153. Photon sources based on it are used in densimeters for diagnostics of bone deseases. The procedure for separating gadolinium and europium, which ismore » currently used at the Research Institute of Atomic Reactors (NILAR), is based on europium cementation with the use of sodium amalgam. The method, though efficient, did not until recently permit an exhaustive removal of radioactive europium from {sup 153}Gd. The authors have thoroughly studied the separation process in semi-countercurrent mode, using citrate solutions. A special attention was given to the composition of europium complex species.« less

  17. Influence of carbon source and inoculum type on anaerobic biomass adhesion on polyurethane foam in reactors fed with acid mine drainage.

    PubMed

    Rodriguez, Renata P; Zaiat, Marcelo

    2011-04-01

    This paper analyzes the influence of carbon source and inoculum origin on the dynamics of biomass adhesion to an inert support in anaerobic reactors fed with acid mine drainage. Formic acid, lactic acid and ethanol were used as carbon sources. Two different inocula were evaluated: one taken from an UASB reactor and other from the sediment of a uranium mine. The values of average colonization rates and the maximum biomass concentration (C(max)) were inversely proportional to the number of carbon atoms in each substrate. The highest C(max) value (0.35 g TVS g(-1) foam) was observed with formic acid and anaerobic sludge as inoculum. Maximum colonization rates (v(max)) were strongly influenced by the type of inoculum when ethanol and lactic acid were used. For both carbon sources, the use of mine sediment as inoculum resulted in a v(max) of 0.013 g TVS g(-1) foam day(-1), whereas 0.024 g TVS g(-1) foam day(-1) was achieved with anaerobic sludge. Copyright © 2011 Elsevier Ltd. All rights reserved.

  18. Probabilistic assessment of dynamic system performance. Part 3

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Belhadj, Mohamed

    1993-01-01

    Accurate prediction of dynamic system failure behavior can be important for the reliability and risk analyses of nuclear power plants, as well as for their backfitting to satisfy given constraints on overall system reliability, or optimization of system performance. Global analysis of dynamic systems through investigating the variations in the structure of the attractors of the system and the domains of attraction of these attractors as a function of the system parameters is also important for nuclear technology in order to understand the fault-tolerance as well as the safety margins of the system under consideration and to insure a safemore » operation of nuclear reactors. Such a global analysis would be particularly relevant to future reactors with inherent or passive safety features that are expected to rely on natural phenomena rather than active components to achieve and maintain safe shutdown. Conventionally, failure and global analysis of dynamic systems necessitate the utilization of different methodologies which have computational limitations on the system size that can be handled. Using a Chapman-Kolmogorov interpretation of system dynamics, a theoretical basis is developed that unifies these methodologies as special cases and which can be used for a comprehensive safety and reliability analysis of dynamic systems.« less

  19. New techniques for modeling the reliability of reactor pressure vessels

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Johnson, K.I.; Simonen, F.A.; Liebetrau, A.M.

    1985-12-01

    In recent years several probabilistic fracture mechanics codes, including the VISA code, have been developed to predict the reliability of reactor pressure vessels. This paper describes new modeling techniques used in a second generation of the VISA code entitled VISA-II. Results are presented that show the sensitivity of vessel reliability predictions to such factors as inservice inspection to detect flaws, random positioning of flaws within the vessel walls thickness, and fluence distributions that vary through-out the vessel. The algorithms used to implement these modeling techniques are also described. Other new options in VISA-II are also described in this paper. Themore » effect of vessel cladding has been included in the heat transfer, stress, and fracture mechanics solutions in VISA-II. The algorithm for simulating flaws has been changed to consider an entire vessel rather than a single flaw in a single weld. The flaw distribution was changed to include the distribution of both flaw depth and length. A menu of several alternate equations has been included to predict the shift in RTNDT. For flaws that arrest and later re-initiate, an option was also included to allow correlating the current arrest thoughness with subsequent initiation toughnesses. 21 refs.« less

  20. Quantification of source uncertainties in Seismic Probabilistic Tsunami Hazard Analysis (SPTHA)

    NASA Astrophysics Data System (ADS)

    Selva, J.; Tonini, R.; Molinari, I.; Tiberti, M. M.; Romano, F.; Grezio, A.; Melini, D.; Piatanesi, A.; Basili, R.; Lorito, S.

    2016-06-01

    We propose a procedure for uncertainty quantification in Probabilistic Tsunami Hazard Analysis (PTHA), with a special emphasis on the uncertainty related to statistical modelling of the earthquake source in Seismic PTHA (SPTHA), and on the separate treatment of subduction and crustal earthquakes (treated as background seismicity). An event tree approach and ensemble modelling are used in spite of more classical approaches, such as the hazard integral and the logic tree. This procedure consists of four steps: (1) exploration of aleatory uncertainty through an event tree, with alternative implementations for exploring epistemic uncertainty; (2) numerical computation of tsunami generation and propagation up to a given offshore isobath; (3) (optional) site-specific quantification of inundation; (4) simultaneous quantification of aleatory and epistemic uncertainty through ensemble modelling. The proposed procedure is general and independent of the kind of tsunami source considered; however, we implement step 1, the event tree, specifically for SPTHA, focusing on seismic source uncertainty. To exemplify the procedure, we develop a case study considering seismic sources in the Ionian Sea (central-eastern Mediterranean Sea), using the coasts of Southern Italy as a target zone. The results show that an efficient and complete quantification of all the uncertainties is feasible even when treating a large number of potential sources and a large set of alternative model formulations. We also find that (i) treating separately subduction and background (crustal) earthquakes allows for optimal use of available information and for avoiding significant biases; (ii) both subduction interface and crustal faults contribute to the SPTHA, with different proportions that depend on source-target position and tsunami intensity; (iii) the proposed framework allows sensitivity and deaggregation analyses, demonstrating the applicability of the method for operational assessments.

  1. Fast probabilistic file fingerprinting for big data

    PubMed Central

    2013-01-01

    Background Biological data acquisition is raising new challenges, both in data analysis and handling. Not only is it proving hard to analyze the data at the rate it is generated today, but simply reading and transferring data files can be prohibitively slow due to their size. This primarily concerns logistics within and between data centers, but is also important for workstation users in the analysis phase. Common usage patterns, such as comparing and transferring files, are proving computationally expensive and are tying down shared resources. Results We present an efficient method for calculating file uniqueness for large scientific data files, that takes less computational effort than existing techniques. This method, called Probabilistic Fast File Fingerprinting (PFFF), exploits the variation present in biological data and computes file fingerprints by sampling randomly from the file instead of reading it in full. Consequently, it has a flat performance characteristic, correlated with data variation rather than file size. We demonstrate that probabilistic fingerprinting can be as reliable as existing hashing techniques, with provably negligible risk of collisions. We measure the performance of the algorithm on a number of data storage and access technologies, identifying its strengths as well as limitations. Conclusions Probabilistic fingerprinting may significantly reduce the use of computational resources when comparing very large files. Utilisation of probabilistic fingerprinting techniques can increase the speed of common file-related workflows, both in the data center and for workbench analysis. The implementation of the algorithm is available as an open-source tool named pfff, as a command-line tool as well as a C library. The tool can be downloaded from http://biit.cs.ut.ee/pfff. PMID:23445565

  2. Grammaticality, Acceptability, and Probability: A Probabilistic View of Linguistic Knowledge.

    PubMed

    Lau, Jey Han; Clark, Alexander; Lappin, Shalom

    2017-07-01

    The question of whether humans represent grammatical knowledge as a binary condition on membership in a set of well-formed sentences, or as a probabilistic property has been the subject of debate among linguists, psychologists, and cognitive scientists for many decades. Acceptability judgments present a serious problem for both classical binary and probabilistic theories of grammaticality. These judgements are gradient in nature, and so cannot be directly accommodated in a binary formal grammar. However, it is also not possible to simply reduce acceptability to probability. The acceptability of a sentence is not the same as the likelihood of its occurrence, which is, in part, determined by factors like sentence length and lexical frequency. In this paper, we present the results of a set of large-scale experiments using crowd-sourced acceptability judgments that demonstrate gradience to be a pervasive feature in acceptability judgments. We then show how one can predict acceptability judgments on the basis of probability by augmenting probabilistic language models with an acceptability measure. This is a function that normalizes probability values to eliminate the confounding factors of length and lexical frequency. We describe a sequence of modeling experiments with unsupervised language models drawn from state-of-the-art machine learning methods in natural language processing. Several of these models achieve very encouraging levels of accuracy in the acceptability prediction task, as measured by the correlation between the acceptability measure scores and mean human acceptability values. We consider the relevance of these results to the debate on the nature of grammatical competence, and we argue that they support the view that linguistic knowledge can be intrinsically probabilistic. Copyright © 2016 Cognitive Science Society, Inc.
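
    One acceptability measure of the kind described in this line of work is a SLOR-style score: the sentence log probability under a language model, with the unigram (lexical frequency) log probability subtracted and the result divided by sentence length. The sketch below assumes per-token log probabilities are available from some upstream language model and unigram model; the numbers in the example are invented.

        import math

        def slor(token_logprobs, unigram_logprobs):
            """SLOR-style acceptability measure: mean per-token log probability minus
            the mean unigram log probability, removing the confounds of sentence
            length and lexical frequency. Both inputs are assumed to come from
            upstream models (a language model and a unigram model)."""
            n = len(token_logprobs)
            return (sum(token_logprobs) - sum(unigram_logprobs)) / n

        # Example with made-up numbers: higher values suggest higher acceptability.
        print(slor([-2.1, -0.7, -3.0], [-6.0, -4.5, -7.2]))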

  3. Helium-3 blankets for tritium breeding in fusion reactors

    NASA Technical Reports Server (NTRS)

    Steiner, Don; Embrechts, Mark; Varsamis, Georgios; Vesey, Roger; Gierszewski, Paul

    1988-01-01

    It is concluded that He-3 blankets offer considerable promise for tritium breeding in fusion reactors: good breeding potential, low operational risk, and attractive safety features. The availability of He-3 resources is the key issue for this concept. There is sufficient He-3 from the decay of military stockpiles to meet the needs of the International Thermonuclear Experimental Reactor. Extraterrestrial sources of He-3 would be required for a fusion power economy.

  4. Characterization of efficient aerobic denitrifiers isolated from two different sequencing batch reactors by 16S-rRNA analysis.

    PubMed

    Wang, Ping; Li, Xiuting; Xiang, Mufei; Zhai, Qian

    2007-06-01

    Using two sequencing batch reactors (SBRs), A and B, with nitrate as the substrate and an intermittent aeration mode, activated sludge was acclimated to enrich aerobic denitrifiers. The pH of reactor A was approximately 6.3 at dissolved oxygen (DO) levels of 2.2-6.1 mg/l with a carbon source of 720 mg/l COD; the pH of reactor B was 6.8-7.8 at DO levels of 2.2-3.0 mg/l with a carbon source of 1500 mg/l COD. Both reactors maintained an influent nitrate concentration of 80 mg/l NO3(-)-N. When the total inorganic nitrogen (TIN) removal efficiency of both reactors reached 60%, aerobic denitrifier accumulation was regarded as complete. Using bromothymol blue (BTB) medium, 20 bacterial strains were isolated from the two SBRs, and DNA samples of 8 of these 20 strains were amplified by PCR and processed for 16S rRNA sequencing. The resulting sequences were analysed by a BLAST similarity search of the GenBank database and by constructing a phylogenetic tree for identification by comparison. The 8 bacteria were found to belong to the genera Pseudomonas, Delftia, Herbaspirillum and Comamonas. At present, no Delftia species has been reported to be an aerobic denitrifier.

  5. Purified silicon production system

    DOEpatents

    Wang, Tihu; Ciszek, Theodore F.

    2004-03-30

    Method and apparatus for producing purified bulk silicon from highly impure metallurgical-grade silicon source material at atmospheric pressure. Method involves: (1) initially reacting iodine and metallurgical-grade silicon to create silicon tetraiodide and impurity iodide byproducts in a cold-wall reactor chamber; (2) isolating silicon tetraiodide from the impurity iodide byproducts and purifying it by distillation in a distillation chamber; and (3) transferring the purified silicon tetraiodide back to the cold-wall reactor chamber, reacting it with additional iodine and metallurgical-grade silicon to produce silicon diiodide and depositing the silicon diiodide onto a substrate within the cold-wall reactor chamber. The two chambers are at atmospheric pressure and the system is open to allow the introduction of additional source material and to remove and replace finished substrates.

  6. Shielding calculation and criticality safety analysis of spent fuel transportation cask in research reactors.

    PubMed

    Mohammadi, A; Hassanzadeh, M; Gharib, M

    2016-02-01

    In this study, shielding calculations and a criticality safety analysis were carried out for the interim storage of general materials testing reactor (MTR) research reactors and the relevant transportation cask. Three major tasks were considered: source term, shielding, and criticality calculations. The Monte Carlo transport code MCNP5 was used for the shielding and criticality safety calculations, and the ORIGEN2.1 code for the source term calculation. According to the results obtained, a cylindrical cask with body, top, and bottom thicknesses of 18, 13, and 13 cm, respectively, was accepted as the dual-purpose cask. Furthermore, it is shown that the total dose rates are below the normal transport criteria and meet the specified standards. Copyright © 2015 Elsevier Ltd. All rights reserved.

  7. NEUTRONIC REACTOR SYSTEM

    DOEpatents

    Treshow, M.

    1959-02-10

    A reactor system incorporating a reactor of the heterogeneous boiling water type is described. The reactor consists essentially of a core submerged in water in the lower half of a pressure vessel; two distribution rings connected to a source of water are disposed within the pressure vessel above the reactor core, the lower distribution ring being submerged adjacent to the upper end of the reactor core and the other distribution ring being located adjacent to the top of the pressure vessel. A feedwater control valve, responsive to the steam demand of the load, is provided in the feedwater line to the distribution rings and regulates the amount of feedwater flowing to each distribution ring, the proportion of water flowing to the submerged distribution ring being proportional to the steam demand of the load. This invention provides an automatic means exterior to the reactor to control the reactivity of the reactor over relatively long periods of time without relying upon movement of control rods or of other moving parts within the reactor structure.

  8. Nozzle for electric dispersion reactor

    DOEpatents

    Sisson, Warren G.; Basaran, Osman A.; Harris, Michael T.

    1998-01-01

    A nozzle for an electric dispersion reactor includes two concentric electrodes, the inner one of the two delivering disperse phase fluid into a continuous phase fluid. A potential difference generated by a voltage source creates a dispersing electric field at the end of the inner electrode.

  9. Nozzle for electric dispersion reactor

    DOEpatents

    Sisson, Warren G.; Basaran, Osman A.; Harris, Michael T.

    1995-01-01

    A nozzle for an electric dispersion reactor includes two concentric electrodes, the inner one of the two delivering disperse phase fluid into a continuous phase fluid. A potential difference generated by a voltage source creates a dispersing electric field at the end of the inner electrode.

  10. Inertial confinement fusion method producing line source radiation fluence

    DOEpatents

    Rose, Ronald P.

    1984-01-01

    An inertial confinement fusion method in which target pellets are imploded in sequence by laser light beams or other energy beams at an implosion site which is variable between pellet implosions along a line. The effect of the variability in position of the implosion site along a line is to distribute the radiation fluence in surrounding reactor components as a line source of radiation would do, thereby permitting the utilization of cylindrical geometry in the design of the reactor and internal components.

  11. Fusion pumped laser

    DOEpatents

    Pappas, D.S.

    1987-07-31

    The apparatus of this invention may comprise a system for generating laser radiation from a high-energy neutron source. The neutron source is a tokamak fusion reactor generating a long pulse of high-energy neutrons and having a temperature and magnetic field effective to generate a neutron flux of at least 10^15 neutrons/cm^2·s. Conversion means are provided adjacent the fusion reactor at a location operable for converting the high-energy neutrons to an energy source with an intensity and energy effective to excite a preselected lasing medium. A lasing medium is spaced about and responsive to the energy source to generate a population inversion effective to support laser oscillations for generating output radiation. 2 figs., 2 tabs.

  12. Assessing and reporting uncertainties in dietary exposure analysis: Mapping of uncertainties in a tiered approach.

    PubMed

    Kettler, Susanne; Kennedy, Marc; McNamara, Cronan; Oberdörfer, Regina; O'Mahony, Cian; Schnabel, Jürgen; Smith, Benjamin; Sprong, Corinne; Faludi, Roland; Tennant, David

    2015-08-01

    Uncertainty analysis is an important component of dietary exposure assessments in order to correctly understand the strengths and limits of their results. Often, standard screening procedures are applied in a first step, which results in conservative estimates. If those screening procedures indicate a potential exceedance of health-based guidance values, more refined models are applied within the tiered approach. However, the sources and types of uncertainties in deterministic and probabilistic models can vary or differ. A key objective of this work has been the mapping of different sources and types of uncertainties to better understand how to best use uncertainty analysis to generate a more realistic comprehension of dietary exposure. In dietary exposure assessments, uncertainties can be introduced by knowledge gaps about the exposure scenario, the parameters, and the model itself. With this mapping, general and model-independent uncertainties have been identified and described, as well as those which can be introduced and influenced by the specific model during the tiered approach. This analysis identifies that there are general uncertainties common to point estimates (screening or deterministic methods) and probabilistic exposure assessment methods. To provide further clarity, general sources of uncertainty affecting many dietary exposure assessments should be separated from model-specific uncertainties. Copyright © 2015 The Authors. Published by Elsevier Ltd. All rights reserved.

  13. NORTH AMERICAN FREE TRADE AGREEMENT (NAFTA) BORDER PROJECT - A COMPARISON OF THE ARIZONA BORDER POPULATION WITH THE STATE POPULATION

    EPA Science Inventory

    There is a perception among the population of the border communities that they have increased exposure due to their proximity to pollution sources in Mexico. This study provides exposure data for the border population that will be compared with data from a probabilistic exposure...

  14. Risk-based enteric pathogen reduction targets for non-potable and direct potable use of roof runoff, stormwater, and greywater

    EPA Science Inventory

    This paper presents risk-based enteric pathogen log reduction targets for non-potable and potable uses of a variety of alternative source waters (i.e., locally-collected greywater, roof runoff, and stormwater). A probabilistic Quantitative Microbial Risk Assessment (QMRA) was use...

  15. THE CONTRIBUTION OF AMBIENT PM2.5 TO TOTAL PERSONAL EXPOSURES: RESULTS FROM A POPULATION EXPOSURE MODEL FOR PHILADELPHIA, PA

    EPA Science Inventory

    The US EPA National Exposure Research Laboratory (NERL) is currently developing an integrated human exposure source-to-dose modeling system (HES2D). This modeling system will incorporate population exposure modules that use a probabilistic approach to predict population exposu...

  16. SHEDS-PM: A POPULATION EXPOSURE MODEL FOR PREDICTING DISTRIBUTIONS OF PM EXPOSURE AND DOSE FROM BOTH OUTDOOR AND INDOOR SOURCES

    EPA Science Inventory

    The US EPA National Exposure Research Laboratory (NERL) has developed a population exposure and dose model for particulate matter (PM), called the Stochastic Human Exposure and Dose Simulation (SHEDS) model. SHEDS-PM uses a probabilistic approach that incorporates both variabi...

  17. The Power of Implicit Social Relation in Rating Prediction of Social Recommender Systems

    PubMed Central

    Reafee, Waleed; Salim, Naomie; Khan, Atif

    2016-01-01

    The explosive growth of social networks in recent times has presented a powerful source of information to be utilized as an extra source for assisting in the social recommendation problems. The social recommendation methods that are based on probabilistic matrix factorization improved the recommendation accuracy and partly solved the cold-start and data sparsity problems. However, these methods only exploited the explicit social relations and almost completely ignored the implicit social relations. In this article, we firstly propose an algorithm to extract the implicit relation in the undirected graphs of social networks by exploiting the link prediction techniques. Furthermore, we propose a new probabilistic matrix factorization method to alleviate the data sparsity problem through incorporating explicit friendship and implicit friendship. We evaluate our proposed approach on two real datasets, Last.Fm and Douban. The experimental results show that our method performs much better than the state-of-the-art approaches, which indicates the importance of incorporating implicit social relations in the recommendation process to address the poor prediction accuracy. PMID:27152663
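
    A minimal sketch of the general idea, not the article's exact model: a matrix-factorization objective with an added social regularization term that pulls each user's latent factors toward those of that user's explicit or implicit friends. The function names, weights, and the tiny synthetic data are illustrative assumptions.

        import numpy as np

        def social_mf_step(U, V, R, mask, friends, lr=0.01, lam=0.1, beta=0.05):
            """One gradient step of matrix factorization with a social term.

            R is the (users x items) rating matrix, mask marks observed ratings,
            friends[u] lists u's (explicit or implicit) friends. Illustrative
            sketch only; not the cited article's exact formulation.
            """
            err = mask * (R - U @ V.T)                   # error on observed entries only
            dU = -err @ V + lam * U
            dV = -err.T @ U + lam * V
            for u, fr in friends.items():                # social regularization:
                if fr:                                   # pull u toward the mean of friends
                    dU[u] += beta * (U[u] - U[list(fr)].mean(axis=0))
            U -= lr * dU
            V -= lr * dV
            return U, V

        # Tiny synthetic example
        rng = np.random.default_rng(0)
        R = rng.integers(1, 6, size=(4, 5)).astype(float)
        mask = (rng.random((4, 5)) < 0.6).astype(float)
        U, V = rng.normal(size=(4, 3)), rng.normal(size=(5, 3))
        friends = {0: {1}, 1: {0, 2}, 2: {1}, 3: set()}
        for _ in range(100):
            U, V = social_mf_step(U, V, R, mask, friends)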

  18. Registration of reactor neutrinos with the highly segmented plastic scintillator detector DANSSino

    NASA Astrophysics Data System (ADS)

    Belov, V.; Brudanin, V.; Danilov, M.; Egorov, V.; Fomina, M.; Kobyakin, A.; Rusinov, V.; Shirchenko, M.; Shitov, Yu; Starostin, A.; Zhitnikov, I.

    2013-05-01

    DANSSino is a simplified pilot version of a solid-state detector of reactor antineutrinos (it is being created within the DANSS project and will be installed close to an industrial nuclear power reactor). Numerous tests performed under a 3 GWth reactor of the Kalinin NPP at a distance of 11 m from the core demonstrate the operability of the chosen design and reveal the main sources of the background. In spite of its small size (20 × 20 × 100 cm3), the pilot detector turned out to be quite sensitive to reactor neutrinos, detecting about 70 IBD events per day with a signal-to-background ratio of about unity.

  19. Method of producing gaseous products using a downflow reactor

    DOEpatents

    Cortright, Randy D; Rozmiarek, Robert T; Hornemann, Charles C

    2014-09-16

    Reactor systems and methods are provided for the catalytic conversion of liquid feedstocks to synthesis gases and other noncondensable gaseous products. The reactor systems include a heat exchange reactor configured to allow the liquid feedstock and gas product to flow concurrently in a downflow direction. The reactor systems and methods are particularly useful for producing hydrogen and light hydrocarbons from biomass-derived oxygenated hydrocarbons using aqueous phase reforming. The generated gases may find use as a fuel source for energy generation via PEM fuel cells, solid-oxide fuel cells, internal combustion engines, or gas turbine gensets, or be used in other chemical processes to produce additional products. The gaseous products may also be collected for later use or distribution.

  20. Nozzle for electric dispersion reactor

    DOEpatents

    Sisson, Warren G.; Harris, Michael T.; Scott, Timothy C.; Basaran, Osman A.

    1998-01-01

    A nozzle for an electric dispersion reactor includes two coaxial cylindrical bodies, the inner one of the two delivering disperse phase fluid into a continuous phase fluid. A potential difference generated by a voltage source creates a dispersing electric field at the end of the inner electrode.

  1. Nozzle for electric dispersion reactor

    DOEpatents

    Sisson, Warren G.; Harris, Michael T.; Scott, Timothy C.; Basaran, Osman A.

    1996-01-01

    A nozzle for an electric dispersion reactor includes two coaxial cylindrical bodies, the inner one of the two delivering disperse phase fluid into a continuous phase fluid. A potential difference generated by a voltage source creates a dispersing electric field at the end of the inner electrode.

  2. Nozzle for electric dispersion reactor

    DOEpatents

    Sisson, W.G.; Basaran, O.A.; Harris, M.T.

    1998-04-14

    A nozzle for an electric dispersion reactor includes two concentric electrodes, the inner one of the two delivering disperse phase fluid into a continuous phase fluid. A potential difference generated by a voltage source creates a dispersing electric field at the end of the inner electrode. 4 figs.

  3. Nozzle for electric dispersion reactor

    DOEpatents

    Sisson, W.G.; Basaran, O.A.; Harris, M.T.

    1995-11-07

    A nozzle for an electric dispersion reactor includes two concentric electrodes, the inner one of the two delivering disperse phase fluid into a continuous phase fluid. A potential difference generated by a voltage source creates a dispersing electric field at the end of the inner electrode. 4 figs.

  4. Reactor Application for Coaching Newbies

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    2015-06-17

    RACCOON is a MOOSE-based reactor physics application designed to engage undergraduate and first-year graduate students. The code can solve the multigroup neutron diffusion equation in eigenvalue and fixed-source form and will soon include a provision for simple thermal feedback. These capabilities are sufficient to solve example problems found in Duderstadt & Hamilton (the typical textbook of senior-level reactor physics classes). RACCOON does not contain any advanced capabilities as found in YAK.

  5. Hybrid fusion reactor for production of nuclear fuel with minimum radioactive contamination of the fuel cycle

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Velikhov, E. P.; Kovalchuk, M. V.; Azizov, E. A., E-mail: Azizov-EA@nrcki.ru

    2015-12-15

    The paper presents the results of the system research on the coordinated development of nuclear and fusion power engineering in the current century. Considering the increasing problems of resource procurement, including limited natural uranium resources, it seems reasonable to use fusion reactors as high-power neutron sources for production of nuclear fuel in a blanket. It is shown that the share of fusion sources in this structural configuration of the energy system can be relatively small. A fundamentally important aspect of this solution to the problem of closure of the fuel cycle is that recycling of highly active spent fuel can be abandoned. Radioactivity released during the recycling of the spent fuel from the hybrid reactor blanket is at least two orders of magnitude lower than during the production of the same number of fissile isotopes after the recycling of the spent fuel from a fast reactor.

  6. High-irradiance reactor design with practical unfolded optics

    NASA Astrophysics Data System (ADS)

    Feuermann, Daniel; Gordon, Jeffrey M.

    2008-08-01

    In the design of high-temperature chemical reactors and furnaces, as well as high-radiance light projection applications, reconstituting the ultra-high radiance of short-arc discharge lamps at maximum radiative efficiency constitutes a significant challenge. The difficulty is exacerbated by the high numerical aperture necessary at both the source and the target. Separating the optic from both the light source and the target allows practical operation, control, monitoring, diagnostics and maintenance. We present near-field unfolded aplanatic optics as a feasible solution. The concept is illustrated with a design customized to a high-temperature chemical reactor for nano-material synthesis, driven by an ultra-bright xenon short-arc discharge lamp, with near-unity numerical aperture for both light input and light output. We report preliminary optical measurements for the first prototype, which constitutes a double-ellipsoid solution. We also propose compound unfolded aplanats that collect the full angular extent of lamp emission (in lieu of light recycling optics) and additionally permit nearly full-circumference irradiation of the reactor.

  7. Conceptual Design of Low-Temperature Hydrogen Production and High-Efficiency Nuclear Reactor Technology

    NASA Astrophysics Data System (ADS)

    Fukushima, Kimichika; Ogawa, Takashi

    Hydrogen, a potential alternative energy source, is produced commercially by methane (or LPG) steam reforming, a process that requires high temperatures, which are produced by burning fossil fuels. However, as this process generates large amounts of CO2, replacement of the combustion heat source with a nuclear heat source for 773-1173K processes has been proposed in order to eliminate these CO2 emissions. In this paper, a novel method of nuclear hydrogen production by reforming dimethyl ether (DME) with steam at about 573K is proposed. From a thermodynamic equilibrium analysis of DME steam reforming, the authors identified conditions that provide a high hydrogen production fraction at low pressure and temperatures of about 523-573K. By setting this low-temperature hydrogen production process upstream of a turbine and nuclear reactor at about 573K, the total energy utilization efficiency according to equilibrium mass and heat balance analysis is about 50%, and about 75% for a fast breeder reactor (FBR), where the turbine is upstream of the reformer.

  8. A Probabilistic Approach to Quantify the Impact of Uncertainty Propagation in Musculoskeletal Simulations

    PubMed Central

    Myers, Casey A.; Laz, Peter J.; Shelburne, Kevin B.; Davidson, Bradley S.

    2015-01-01

    Uncertainty that arises from measurement error and parameter estimation can significantly affect the interpretation of musculoskeletal simulations; however, these effects are rarely addressed. The objective of this study was to develop an open-source probabilistic musculoskeletal modeling framework to assess how measurement error and parameter uncertainty propagate through a gait simulation. A baseline gait simulation was performed for a male subject using OpenSim for three stages: inverse kinematics, inverse dynamics, and muscle force prediction. A series of Monte Carlo simulations were performed that considered intrarater variability in marker placement, movement artifacts in each phase of gait, variability in body segment parameters, and variability in muscle parameters calculated from cadaveric investigations. Propagation of uncertainty was performed by also using the output distributions from one stage as input distributions to subsequent stages. Confidence bounds (5–95%) and sensitivity of outputs to model input parameters were calculated throughout the gait cycle. The combined impact of uncertainty resulted in mean bounds that ranged from 2.7° to 6.4° in joint kinematics, 2.7 to 8.1 N m in joint moments, and 35.8 to 130.8 N in muscle forces. The impact of movement artifact was 1.8 times larger than any other propagated source. Sensitivity to specific body segment parameters and muscle parameters were linked to where in the gait cycle they were calculated. We anticipate that through the increased use of probabilistic tools, researchers will better understand the strengths and limitations of their musculoskeletal simulations and more effectively use simulations to evaluate hypotheses and inform clinical decisions. PMID:25404535
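
    The propagation scheme can be sketched generically: sample the uncertain inputs from assumed distributions, run the model stage for each sample, and report 5-95% bounds and a simple sensitivity indicator. The linear placeholder model and the input distributions below are invented for illustration and do not reproduce the OpenSim pipeline or the study's values.

        import numpy as np

        def model(marker_offset_deg, segment_mass_kg, fmax_n):
            """Placeholder for one gait-simulation stage (e.g., a joint moment output)."""
            return 40.0 + 1.5 * marker_offset_deg + 0.8 * segment_mass_kg + 0.02 * fmax_n

        rng = np.random.default_rng(42)
        n = 5000
        # Assumed input uncertainties (illustrative, not the study's actual values)
        marker = rng.normal(0.0, 2.0, n)       # marker-placement error [deg]
        mass   = rng.normal(10.0, 0.5, n)      # body-segment mass [kg]
        fmax   = rng.normal(1500.0, 150.0, n)  # max isometric muscle force [N]

        out = model(marker, mass, fmax)
        lo, hi = np.percentile(out, [5, 95])               # 5-95% confidence bounds
        sens = {name: np.corrcoef(x, out)[0, 1]            # correlation-based sensitivity
                for name, x in [("marker", marker), ("mass", mass), ("fmax", fmax)]}
        print(f"5-95% bounds: [{lo:.1f}, {hi:.1f}]", sens)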

  9. Quantifying volcanic hazard at Campi Flegrei caldera (Italy) with uncertainty assessment: 2. Pyroclastic density current invasion maps

    NASA Astrophysics Data System (ADS)

    Neri, Augusto; Bevilacqua, Andrea; Esposti Ongaro, Tomaso; Isaia, Roberto; Aspinall, Willy P.; Bisson, Marina; Flandoli, Franco; Baxter, Peter J.; Bertagnini, Antonella; Iannuzzi, Enrico; Orsucci, Simone; Pistolesi, Marco; Rosi, Mauro; Vitale, Stefano

    2015-04-01

    Campi Flegrei (CF) is an example of an active caldera containing densely populated settlements at very high risk of pyroclastic density currents (PDCs). We present here an innovative method for assessing background spatial PDC hazard in a caldera setting with probabilistic invasion maps conditional on the occurrence of an explosive event. The method encompasses the probabilistic assessment of potential vent opening positions, derived in the companion paper, combined with inferences about the spatial density distribution of PDC invasion areas from a simplified flow model, informed by reconstruction of deposits from eruptions in the last 15 ka. The flow model describes the PDC kinematics and accounts for main effects of topography on flow propagation. Structured expert elicitation is used to incorporate certain sources of epistemic uncertainty, and a Monte Carlo approach is adopted to produce a set of probabilistic hazard maps for the whole CF area. Our findings show that, in case of eruption, almost the entire caldera is exposed to invasion with a mean probability of at least 5%, with peaks greater than 50% in some central areas. Some areas outside the caldera are also exposed to this danger, with mean probabilities of invasion of the order of 5-10%. Our analysis suggests that these probability estimates have location-specific uncertainties which can be substantial. The results prove to be robust with respect to alternative elicitation models and allow the influence on hazard mapping of different sources of uncertainty, and of theoretical and numerical assumptions, to be quantified.

  10. Associations of Topics of Discussion on Twitter With Survey Measures of Attitudes, Knowledge, and Behaviors Related to Zika: Probabilistic Study in the United States

    PubMed Central

    Winneg, Kenneth; Chan, Man-Pui Sally; Hall Jamieson, Kathleen; Albarracin, Dolores

    2018-01-01

    Background Recent outbreaks of Zika virus around the world led to increased discussions about this issue on social media platforms such as Twitter. These discussions may provide useful information about attitudes, knowledge, and behaviors of the population regarding issues that are important for public policy. Objective We sought to identify the associations of the topics of discussions on Twitter and survey measures of Zika-related attitudes, knowledge, and behaviors, not solely based upon the volume of such discussions but by analyzing the content of conversations using probabilistic techniques. Methods Using probabilistic topic modeling with US county and week as the unit of analysis, we analyzed the content of Twitter online communications to identify topics related to the reported attitudes, knowledge, and behaviors captured in a national representative survey (N=33,193) of the US adult population over 33 weeks. Results Our analyses revealed topics related to “congress funding for Zika,” “microcephaly,” “Zika-related travel discussions,” “insect repellent,” “blood transfusion technology,” and “Zika in Miami” were associated with our survey measures of attitudes, knowledge, and behaviors observed over the period of the study. Conclusions Our results demonstrated that it is possible to uncover topics of discussions from Twitter communications that are associated with the Zika-related attitudes, knowledge, and behaviors of populations over time. Social media data can be used as a complementary source of information alongside traditional data sources to gauge the patterns of attitudes, knowledge, and behaviors in a population. PMID:29426815
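
    A minimal sketch of the workflow, under the assumption that tweets are grouped into documents (e.g., by county-week), a topic model such as LDA is fit, and the per-document topic prevalence is then related to a survey measure. The corpus, scores, and two-topic model below are toy placeholders, not the study's data or pipeline.

        import numpy as np
        from sklearn.decomposition import LatentDirichletAllocation
        from sklearn.feature_extraction.text import CountVectorizer

        # Hypothetical corpus: each "document" is all tweets from one county-week.
        docs = ["zika mosquito repellent travel", "congress funding zika bill",
                "microcephaly babies zika risk", "miami zika travel warning"]
        survey_knowledge = np.array([0.4, 0.3, 0.7, 0.6])   # made-up survey scores

        X = CountVectorizer().fit_transform(docs)
        lda = LatentDirichletAllocation(n_components=2, random_state=0).fit(X)
        theta = lda.transform(X)                            # per-document topic shares

        # Association of each topic's prevalence with the survey measure
        for k in range(theta.shape[1]):
            r = np.corrcoef(theta[:, k], survey_knowledge)[0, 1]
            print(f"topic {k}: correlation with survey measure = {r:.2f}")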

  11. The Angra Project: Monitoring Nuclear Reactors with Antineutrino Detectors

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Anjos, J. C.; Barbosa, A. F.; Lima, H. P. Jr.

    2010-03-30

    We present the status of the Angra Neutrino project, describing the development of an antineutrino detector aimed at monitoring nuclear reactor activity. The experiment will take place at the Brazilian nuclear power plant located in Angra dos Reis. The Angra II reactor, with 4 GW of thermal power, will be used as a source of antineutrinos. A water Cherenkov detector will be placed above ground in a commercial container outside the reactor containment, about 30 m from the reactor core. With a detector of one-ton scale a few thousand antineutrino interactions per day are expected. We intend, in a first step, to use the measured neutrino event rate to monitor the on-off status and the thermal power delivered by the reactor. In addition to the safeguards issues the project will provide an alternative tool to have an independent measurement of the reactor power.

  12. The Angra Project: Monitoring Nuclear Reactors with Antineutrino Detectors

    NASA Astrophysics Data System (ADS)

    Anjos, J. C.; Barbosa, A. F.; Bezerra, T. J. C.; Chimenti, P.; Gonzalez, L. F. G.; Kemp, E.; de Oliveira, M. A. Leigui; Lima, H. P.; Lima, R. M.; Nunokawa, H.

    2010-03-01

    We present the status of the Angra Neutrino project, describing the development of an antineutrino detector aimed at monitoring nuclear reactor activity. The experiment will take place at the Brazilian nuclear power plant located in Angra dos Reis. The Angra II reactor, with 4 GW of thermal power, will be used as a source of antineutrinos. A water Cherenkov detector will be placed above ground in a commercial container outside the reactor containment, about 30 m from the reactor core. With a detector of one-ton scale a few thousand antineutrino interactions per day are expected. We intend, in a first step, to use the measured neutrino event rate to monitor the on-off status and the thermal power delivered by the reactor. In addition to the safeguards issues the project will provide an alternative tool to have an independent measurement of the reactor power.

  13. Oxygen transport membrane reactor based method and system for generating electric power

    DOEpatents

    Kelly, Sean M.; Chakravarti, Shrikar; Li, Juan

    2017-02-07

    A carbon capture enabled system and method for generating electric power and/or fuel from methane containing sources using oxygen transport membranes by first converting the methane containing feed gas into a high pressure synthesis gas. Then, in one configuration the synthesis gas is combusted in oxy-combustion mode in oxygen transport membranes based boiler reactor operating at a pressure at least twice that of ambient pressure and the heat generated heats steam in thermally coupled steam generation tubes within the boiler reactor; the steam is expanded in steam turbine to generate power; and the carbon dioxide rich effluent leaving the boiler reactor is processed to isolate carbon. In another configuration the synthesis gas is further treated in a gas conditioning system configured for carbon capture in a pre-combustion mode using water gas shift reactors and acid gas removal units to produce hydrogen or hydrogen-rich fuel gas that fuels an integrated gas turbine and steam turbine system to generate power. The disclosed method and system can also be adapted to integrate with coal gasification systems to produce power from both coal and methane containing sources with greater than 90% carbon isolation.

  14. Probabilistic versus deterministic hazard assessment in liquefaction susceptible zones

    NASA Astrophysics Data System (ADS)

    Daminelli, Rosastella; Gerosa, Daniele; Marcellini, Alberto; Tento, Alberto

    2015-04-01

    Probabilistic seismic hazard assessment (PSHA), usually adopted in the framework of seismic code redaction, is based on a Poissonian description of the temporal occurrence, a negative exponential distribution of magnitude, and an attenuation relationship with a log-normal distribution of PGA or response spectrum. The main positive aspect of this approach stems from the fact that it is presently a standard for the majority of countries, but there are weak points, in particular regarding the physical description of the earthquake phenomenon. Factors like site effects, source characteristics such as the duration of the strong motion, and directivity, which could significantly influence the expected motion at the site, are not taken into account by PSHA. Deterministic models can better evaluate the ground motion at a site from a physical point of view, but their prediction reliability depends on the degree of knowledge of the source, wave propagation and soil parameters. We compare these two approaches at selected sites affected by the May 2012 Emilia-Romagna and Lombardia earthquake, which caused widespread liquefaction phenomena, unusual for a magnitude of less than 6. We focus on sites that are liquefiable because of their soil mechanical parameters and water table level. Our analysis shows that the choice between deterministic and probabilistic hazard analysis is strongly dependent on site conditions. The looser the soil and the higher the liquefaction potential, the more suitable is the deterministic approach. Source characteristics, in particular the duration of strong ground motion, have long been recognized as relevant to inducing liquefaction; unfortunately, a quantitative prediction of these parameters appears very unlikely, dramatically reducing the possibility of their adoption in hazard assessment. Last but not least, economic factors are relevant in the choice of the approach. The case history of the 2012 Emilia-Romagna and Lombardia earthquake, with an officially estimated cost of 6 billion euros, shows that the geological and geophysical investigations necessary for a reliable deterministic hazard evaluation are largely justified.

  15. Seismic Hazard Maps for Seattle, Washington, Incorporating 3D Sedimentary Basin Effects, Nonlinear Site Response, and Rupture Directivity

    USGS Publications Warehouse

    Frankel, Arthur D.; Stephenson, William J.; Carver, David L.; Williams, Robert A.; Odum, Jack K.; Rhea, Susan

    2007-01-01

    This report presents probabilistic seismic hazard maps for Seattle, Washington, based on over 500 3D simulations of ground motions from scenario earthquakes. These maps include 3D sedimentary basin effects and rupture directivity. Nonlinear site response for soft-soil sites of fill and alluvium was also applied in the maps. The report describes the methodology for incorporating source and site dependent amplification factors into a probabilistic seismic hazard calculation. 3D simulations were conducted for the various earthquake sources that can affect Seattle: Seattle fault zone, Cascadia subduction zone, South Whidbey Island fault, and background shallow and deep earthquakes. The maps presented in this document used essentially the same set of faults and distributed-earthquake sources as in the 2002 national seismic hazard maps. The 3D velocity model utilized in the simulations was validated by modeling the amplitudes and waveforms of observed seismograms from five earthquakes in the region, including the 2001 M6.8 Nisqually earthquake. The probabilistic seismic hazard maps presented here depict 1 Hz response spectral accelerations with 10%, 5%, and 2% probabilities of exceedance in 50 years. The maps are based on determinations of seismic hazard for 7236 sites with a spacing of 280 m. The maps show that the most hazardous locations for this frequency band (around 1 Hz) are soft-soil sites (fill and alluvium) within the Seattle basin and along the inferred trace of the frontal fault of the Seattle fault zone. The next highest hazard is typically found for soft-soil sites in the Duwamish Valley south of the Seattle basin. In general, stiff-soil sites in the Seattle basin exhibit higher hazard than stiff-soil sites outside the basin. Sites with shallow bedrock outside the Seattle basin have the lowest estimated hazard for this frequency band.

  16. Probabilistic Seismic Hazard Analysis of Victoria, British Columbia, Canada: Considering an Active Leech River Fault

    NASA Astrophysics Data System (ADS)

    Kukovica, J.; Molnar, S.; Ghofrani, H.

    2017-12-01

    The Leech River fault is situated on Vancouver Island near the city of Victoria, British Columbia, Canada. The 60 km transpressional reverse fault zone runs east to west along the southern tip of Vancouver Island, dividing the lithologic units of Jurassic-Cretaceous Leech River Complex schists to the north and Eocene Metchosin Formation basalts to the south. This fault system poses a considerable hazard due to its proximity to Victoria and 3 major hydroelectric dams. The Canadian seismic hazard model for the 2015 National Building Code of Canada (NBCC) considered the fault system to be inactive. However, recent paleoseismic evidence suggests that at least 2 surface-rupturing events exceeding a moment magnitude (M) of 6.5 have occurred within the last 15,000 years (Morell et al. 2017). We perform a Probabilistic Seismic Hazard Analysis (PSHA) for the city of Victoria with consideration of the Leech River fault as an active source. A PSHA for Victoria that replicates the 2015 NBCC estimates is first carried out to calibrate our PSHA procedure. The same seismic source zones, magnitude recurrence parameters, and Ground Motion Prediction Equations (GMPEs) are used. We replicate the uniform hazard spectrum for a probability of exceedance of 2% in 50 years for a 500 km radial area around Victoria. An active Leech River fault zone, with its known length and dip, is then added. Magnitude recurrence parameters for the Leech River fault are determined from a Gutenberg-Richter relationship based on various catalogues of the recorded seismicity (M 2-3) within the fault's vicinity and on the proposed paleoseismic events. We seek to understand whether inclusion of an active Leech River fault source will significantly increase the probabilistic seismic hazard for Victoria. Morell et al. 2017. Quaternary rupture of a crustal fault beneath Victoria, British Columbia, Canada. GSA Today, 27, doi: 10.1130/GSATG291A.1
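
    Estimating Gutenberg-Richter recurrence parameters from a catalogue can be sketched with the Aki (1965) maximum-likelihood b-value; the synthetic magnitudes, completeness magnitude, and catalogue duration below are placeholders, not the Leech River data.

        import numpy as np

        def gutenberg_richter_params(mags, m_c, duration_yr):
            """Estimate Gutenberg-Richter a- and b-values from a catalogue.

            Uses the Aki (1965) maximum-likelihood b-value for magnitudes above the
            completeness threshold m_c; `mags` and `duration_yr` stand in for a real
            catalogue of the fault's vicinity."""
            m = np.asarray(mags)
            m = m[m >= m_c]
            b = np.log10(np.e) / (m.mean() - m_c)      # Aki (1965) estimator
            rate = m.size / duration_yr                 # annual rate of M >= m_c
            a = np.log10(rate) + b * m_c                # so that log10 N(M>=m) = a - b*m
            return a, b

        mags = np.random.default_rng(1).exponential(0.5, 300) + 2.0   # synthetic catalogue
        print(gutenberg_richter_params(mags, m_c=2.0, duration_yr=25.0))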

  17. A generalized sizing method for revolutionary concepts under probabilistic design constraints

    NASA Astrophysics Data System (ADS)

    Nam, Taewoo

    Internal combustion (IC) engines that consume hydrocarbon fuels have dominated the propulsion systems of air-vehicles for the first century of aviation. In recent years, however, growing concern over rapid climate changes and national energy security has galvanized the aerospace community into delving into new alternatives that could challenge the dominance of the IC engine. Nevertheless, traditional aircraft sizing methods have significant shortcomings for the design of such unconventionally powered aircraft. First, the methods are specialized for aircraft powered by IC engines, and thus are not flexible enough to assess revolutionary propulsion concepts that produce propulsive thrust through a completely different energy conversion process. Another deficiency associated with the traditional methods is that a user of these methods must rely heavily on experts' experience and advice for determining appropriate design margins. However, the introduction of revolutionary propulsion systems and energy sources is very likely to entail an unconventional aircraft configuration, which inexorably disqualifies the conjecture of such "connoisseurs" as a means of risk management. Motivated by such deficiencies, this dissertation aims at advancing two aspects of aircraft sizing: (1) to develop a generalized aircraft sizing formulation applicable to a wide range of unconventionally powered aircraft concepts and (2) to formulate a probabilistic optimization technique that is able to quantify appropriate design margins that are tailored towards the level of risk deemed acceptable to a decision maker. A more generalized aircraft sizing formulation, named the Architecture Independent Aircraft Sizing Method (AIASM), was developed for sizing revolutionary aircraft powered by alternative energy sources by modifying several assumptions of the traditional aircraft sizing method. Along with advances in deterministic aircraft sizing, a non-deterministic sizing technique, named the Probabilistic Aircraft Sizing Method (PASM), was developed. The method allows one to quantify adequate design margins to account for the various sources of uncertainty via the application of the chance-constrained programming (CCP) strategy to AIASM. In this way, PASM can also provide insights into a good compromise between cost and safety.

  18. Bayesian probabilistic approach for inverse source determination from limited and noisy chemical or biological sensor concentration measurements

    NASA Astrophysics Data System (ADS)

    Yee, Eugene

    2007-04-01

    Although a great deal of research effort has been focused on the forward prediction of the dispersion of contaminants (e.g., chemical and biological warfare agents) released into the turbulent atmosphere, much less work has been directed toward the inverse prediction of agent source location and strength from the measured concentration, even though the importance of this problem for a number of practical applications is obvious. In general, the inverse problem of source reconstruction is ill-posed and unsolvable without additional information. It is demonstrated that a Bayesian probabilistic inferential framework provides a natural and logically consistent method for source reconstruction from a limited number of noisy concentration data. In particular, the Bayesian approach permits one to incorporate prior knowledge about the source as well as additional information regarding both model and data errors. The latter enables a rigorous determination of the uncertainty in the inference of the source parameters (e.g., spatial location, emission rate, release time, etc.), hence extending the potential of the methodology as a tool for quantitative source reconstruction. A model (or, source-receptor relationship) that relates the source distribution to the concentration data measured by a number of sensors is formulated, and Bayesian probability theory is used to derive the posterior probability density function of the source parameters. A computationally efficient methodology for determination of the likelihood function for the problem, based on an adjoint representation of the source-receptor relationship, is described. Furthermore, we describe the application of efficient stochastic algorithms based on Markov chain Monte Carlo (MCMC) for sampling from the posterior distribution of the source parameters, the latter of which is required to undertake the Bayesian computation. The Bayesian inferential methodology for source reconstruction is validated against real dispersion data for two cases involving contaminant dispersion in highly disturbed flows over urban and complex environments where the idealizations of horizontal homogeneity and/or temporal stationarity in the flow cannot be applied to simplify the problem. Furthermore, the methodology is applied to the case of reconstruction of multiple sources.
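
    A minimal sketch of the Bayesian machinery (not the paper's adjoint-based source-receptor model or its dispersion physics): a toy forward model maps source location and emission rate to sensor concentrations, and a random-walk Metropolis sampler draws from the posterior of the source parameters given noisy data. The sensor positions, noise level, priors, and proposal scales are all assumptions.

        import numpy as np

        rng = np.random.default_rng(0)
        sensors = np.array([[0.0, 0.0], [50.0, 10.0], [100.0, -20.0]])   # receptor positions

        def forward(theta):
            """Toy source-receptor model: concentration ~ q / (1 + distance^2)."""
            x, y, q = theta
            d2 = ((sensors - [x, y]) ** 2).sum(axis=1)
            return q / (1.0 + d2)

        truth = np.array([60.0, 5.0, 500.0])
        data = forward(truth) + rng.normal(0.0, 0.05, sensors.shape[0])  # noisy measurements

        def log_post(theta, sigma=0.05):
            x, y, q = theta
            if not (0 <= x <= 200 and -100 <= y <= 100 and 0 < q < 1e4): # uniform prior box
                return -np.inf
            resid = data - forward(theta)
            return -0.5 * np.sum(resid ** 2) / sigma ** 2

        # Random-walk Metropolis over (x, y, q)
        theta, lp, chain = np.array([100.0, 0.0, 100.0]), None, []
        lp = log_post(theta)
        for _ in range(20000):
            prop = theta + rng.normal(0, [5.0, 5.0, 20.0])
            lp_prop = log_post(prop)
            if np.log(rng.random()) < lp_prop - lp:
                theta, lp = prop, lp_prop
            chain.append(theta.copy())
        post = np.array(chain[5000:])
        print(post.mean(axis=0), post.std(axis=0))    # posterior mean and spread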

  19. A framework for fast probabilistic centroid-moment-tensor determination—inversion of regional static displacement measurements

    NASA Astrophysics Data System (ADS)

    Käufl, Paul; Valentine, Andrew P.; O'Toole, Thomas B.; Trampert, Jeannot

    2014-03-01

    The determination of earthquake source parameters is an important task in seismology. For many applications, it is also valuable to understand the uncertainties associated with these determinations, and this is particularly true in the context of earthquake early warning (EEW) and hazard mitigation. In this paper, we develop a framework for probabilistic moment tensor point source inversions in near real time. Our methodology allows us to find an approximation to p(m|d), the conditional probability of source models (m) given observations (d). This is obtained by smoothly interpolating a set of random prior samples, using Mixture Density Networks (MDNs), a class of neural networks which output the parameters of a Gaussian mixture model. By combining multiple networks as 'committees', we are able to obtain a significant improvement in performance over that of a single MDN. Once a committee has been constructed, new observations can be inverted within milliseconds on a standard desktop computer. The method is therefore well suited for use in situations such as EEW, where inversions must be performed routinely and rapidly for a fixed station geometry. To demonstrate the method, we invert regional static GPS displacement data for the 2010 MW 7.2 El Mayor Cucapah earthquake in Baja California to obtain estimates of magnitude, centroid location, depth, and focal mechanism. We investigate the extent to which we can constrain moment tensor point sources with static displacement observations under realistic conditions. Our inversion results agree well with published point source solutions for this event, once the uncertainty bounds of each are taken into account.

  20. Faint Object Detection in Multi-Epoch Observations via Catalog Data Fusion

    NASA Astrophysics Data System (ADS)

    Budavári, Tamás; Szalay, Alexander S.; Loredo, Thomas J.

    2017-03-01

    Astronomy in the time-domain era faces several new challenges. One of them is the efficient use of observations obtained at multiple epochs. The work presented here addresses faint object detection and describes an incremental strategy for separating real objects from artifacts in ongoing surveys. The idea is to produce low-threshold single-epoch catalogs and to accumulate information across epochs. This is in contrast to more conventional strategies based on co-added or stacked images. We adopt a Bayesian approach, addressing object detection by calculating the marginal likelihoods for hypotheses asserting that there is no object or one object in a small image patch containing at most one cataloged source at each epoch. The object-present hypothesis interprets the sources in a patch at different epochs as arising from a genuine object; the no-object hypothesis interprets candidate sources as spurious, arising from noise peaks. We study the detection probability for constant-flux objects in a Gaussian noise setting, comparing results based on single and stacked exposures to results based on a series of single-epoch catalog summaries. Our procedure amounts to generalized cross-matching: it is the product of a factor accounting for the matching of the estimated fluxes of the candidate sources and a factor accounting for the matching of their estimated directions. We find that probabilistic fusion of multi-epoch catalogs can detect sources with similar sensitivity and selectivity compared to stacking. The probabilistic cross-matching framework underlying our approach plays an important role in maintaining detection sensitivity and points toward generalizations that could accommodate variability and complex object structure.
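
    The structure of the marginal-likelihood comparison can be sketched with Gaussian approximations: under the object-present hypothesis, the per-epoch fluxes and positions share a common latent value (marginalized analytically under a Gaussian prior), whereas under the no-object hypothesis each candidate is an independent noise peak, here taken uniform over an assumed flux range and patch area. The priors, ranges, and numbers below are illustrative, not the paper's calibrated values.

        import numpy as np
        from scipy.stats import multivariate_normal

        def log_marglik_object(y, s, prior_mean, prior_sd):
            """log p(y | one object): a common latent value with a Gaussian prior,
            marginalized analytically (y_i = value + Gaussian noise_i)."""
            cov = np.diag(s ** 2) + prior_sd ** 2 * np.ones((len(y), len(y)))
            return multivariate_normal.logpdf(y, mean=np.full(len(y), prior_mean), cov=cov)

        def log_bayes_factor(flux, flux_err, x, y, pos_err,
                             flux_range=100.0, patch_area=25.0):
            """Illustrative fusion of single-epoch detections: a flux-matching factor
            times a direction-matching factor, against independent noise peaks assumed
            uniform over the flux range and patch area."""
            lH1 = (log_marglik_object(flux, flux_err, prior_mean=flux.mean(), prior_sd=50.0)
                   + log_marglik_object(x, pos_err, prior_mean=x.mean(), prior_sd=2.0)
                   + log_marglik_object(y, pos_err, prior_mean=y.mean(), prior_sd=2.0))
            n = len(flux)
            lH0 = -n * np.log(flux_range) - n * np.log(patch_area)
            return lH1 - lH0

        flux = np.array([3.1, 2.8, 3.4]); flux_err = np.array([0.5, 0.6, 0.5])
        x = np.array([0.1, -0.2, 0.05]); y = np.array([0.0, 0.15, -0.1])
        print(log_bayes_factor(flux, flux_err, x, y, pos_err=np.array([0.2, 0.2, 0.2])))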

  1. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Coble, Jamie B.; Coles, Garill A.; Ramuhalli, Pradeep

    Advanced small modular reactors (aSMRs) can provide the United States with a safe, sustainable, and carbon-neutral energy source. The controllable day-to-day costs of aSMRs are expected to be dominated by operation and maintenance costs. Health and condition assessment coupled with online risk monitors can potentially enhance affordability of aSMRs through optimized operational planning and maintenance scheduling. Currently deployed risk monitors are an extension of probabilistic risk assessment (PRA). For complex engineered systems like nuclear power plants, PRA systematically combines event likelihoods and the probability of failure (POF) of key components with the magnitude of possible adverse consequences to determine risk. Traditional PRA uses population-based POF information to estimate the average plant risk over time. Currently, most nuclear power plants have a PRA that reflects the as-operated, as-modified plant; this model is updated periodically, typically once a year. Risk monitors expand on living PRA by incorporating changes in the day-by-day plant operation and configuration (e.g., changes in equipment availability, operating regime, environmental conditions). However, population-based POF (or population- and time-based POF) is still used to populate fault trees. Health monitoring techniques can be used to establish condition indicators and monitoring capabilities that indicate the component-specific POF at a desired point in time (or over a desired period), which can then be incorporated in the risk monitor to provide a more accurate estimate of the plant risk in different configurations. This is particularly important for active systems, structures, and components (SSCs) proposed for use in aSMR designs. These SSCs may differ significantly from those used in the operating fleet of light-water reactors (or even in LWR-based SMR designs). Additionally, the operating characteristics of aSMRs can present significantly different requirements, including the need to operate in different coolant environments, higher operating temperatures, and longer operating cycles between planned refueling and maintenance outages. These features, along with the relative lack of operating experience for some of the proposed advanced designs, may limit the ability to estimate event probability and component POF with a high degree of certainty. Incorporating real-time estimates of component POF may compensate for a relative lack of established knowledge about the long-term component behavior and improve operational and maintenance planning and optimization. The particular eccentricities of advanced reactors and small modular reactors provide unique challenges and needs for advanced instrumentation, control, and human-machine interface (ICHMI) techniques such as enhanced risk monitors (ERM) in aSMRs. Several features of aSMR designs increase the need for accurate characterization of the real-time risk during operation and maintenance activities. A number of technical gaps in realizing ERM exist, and these gaps are largely independent of the specific reactor technology. As a result, the development of a framework for ERM would enable greater situational awareness regardless of the specific class of reactor technology. A set of research tasks is identified in a preliminary research plan to enable the development, testing, and demonstration of such a framework.
Although some aspects of aSMRs, such as specific operational characteristics, will vary and are not now completely defined, the proposed framework is expected to be relevant regardless of such uncertainty. The development of an ERM framework will provide one of the key technical developments necessary to ensure the economic viability of aSMRs.
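
    The core idea of an enhanced risk monitor can be sketched with a toy fault tree: replace population-average basic-event probabilities with condition-specific POF estimates from health monitoring and recompute the top-event probability. The system layout and all numbers below are invented for illustration.

        # A minimal sketch: a two-train cooling function fails only if both trains fail;
        # each train fails if its pump OR its valve fails. Basic-event probabilities are
        # first population-based, then replaced by hypothetical condition-based estimates.

        def or_gate(*p):                 # exact OR of independent events: 1 - prod(1 - p_i)
            q = 1.0
            for x in p:
                q *= (1.0 - x)
            return 1.0 - q

        def train_failure(p_pump, p_valve):
            return or_gate(p_pump, p_valve)

        def top_event(p):
            # AND gate over the two redundant trains (assumed independent)
            return train_failure(p["pump_A"], p["valve_A"]) * train_failure(p["pump_B"], p["valve_B"])

        population_based = {"pump_A": 1e-3, "valve_A": 5e-4, "pump_B": 1e-3, "valve_B": 5e-4}
        condition_based = dict(population_based, pump_A=4e-3)   # degraded pump A detected online

        print(top_event(population_based))   # average-plant risk
        print(top_event(condition_based))    # configuration/condition-specific risk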

  2. Purification and deposition of silicon by an iodide disproportionation reaction

    DOEpatents

    Wang, Tihu; Ciszek, Theodore F.

    2002-01-01

    Method and apparatus for producing purified bulk silicon from highly impure metallurgical-grade silicon source material at atmospheric pressure. Method involves: (1) initially reacting iodine and metallurgical-grade silicon to create silicon tetraiodide and impurity iodide byproducts in a cold-wall reactor chamber; (2) isolating silicon tetraiodide from the impurity iodide byproducts and purifying it by distillation in a distillation chamber; and (3) transferring the purified silicon tetraiodide back to the cold-wall reactor chamber, reacting it with additional iodine and metallurgical-grade silicon to produce silicon diiodide and depositing the silicon diiodide onto a substrate within the cold-wall reactor chamber. The two chambers are at atmospheric pressure and the system is open to allow the introduction of additional source material and to remove and replace finished substrates.

  3. Multi-Level Reduced Order Modeling Equipped with Probabilistic Error Bounds

    NASA Astrophysics Data System (ADS)

    Abdo, Mohammad Gamal Mohammad Mostafa

    This thesis develops robust reduced order modeling (ROM) techniques to achieve the efficiency needed to render feasible the use of high-fidelity tools for routine engineering analyses. Markedly different from state-of-the-art ROM techniques, our work focuses only on techniques that can quantify the credibility of the reduction, measured by reduction errors that are upper-bounded over the envisaged range of ROM model application. Our objective is two-fold. First, further developments of ROM techniques are proposed when conventional ROM techniques are too taxing to be computationally practical. This is achieved via a multi-level ROM methodology designed to take advantage of the multi-scale modeling strategy typically employed for computationally taxing models such as those associated with the modeling of nuclear reactor behavior. Second, the discrepancies between the original model and ROM model predictions over the full range of model application conditions are upper-bounded in a probabilistic sense with high probability. ROM techniques may be classified into two broad categories: surrogate construction techniques and dimensionality reduction techniques, with the latter being the primary focus of this work. We focus on dimensionality reduction because it offers a rigorous approach by which reduction errors can be quantified via upper bounds that are met in a probabilistic sense. Surrogate techniques typically rely on fitting a parametric model form to the original model at a number of training points, with the residual of the fit taken as a measure of the prediction accuracy of the surrogate. This approach, however, does not generally guarantee that the surrogate model predictions at points not included in the training process will be bounded by the error estimated from the fitting residual. Dimensionality reduction techniques, however, employ a different philosophy to render the reduction, wherein randomized snapshots of the model variables, such as the model parameters, responses, or state variables, are projected onto lower-dimensional subspaces, referred to as the "active subspaces", which are selected to capture a user-defined portion of the snapshot variations. Once determined, the ROM model application involves constraining the variables to the active subspaces. In doing so, the contribution from the variables' discarded components can be estimated using a fundamental theorem from random matrix theory which has its roots in Dixon's theory, developed in 1983. This theory was initially presented for linear matrix operators. The thesis extends this theorem's results to allow reduction of general smooth nonlinear operators. The result is an approach by which the adequacy of a given active subspace, determined using a given set of snapshots (generated either with the full high-fidelity model or with other models of lower fidelity), can be assessed; this provides insight to the analyst on the type of snapshots required to reach a reduction that satisfies user-defined preset tolerance limits on the reduction errors. Reactor physics calculations are employed as a test bed for the proposed developments. The focus will be on reducing the effective dimensionality of the various data streams such as the cross-section data and the neutron flux.
The developed methods will be applied to representative assembly-level calculations, where the size of the cross-section and flux spaces is typically large, as required by downstream core calculations, in order to capture the broad range of conditions expected during reactor operation. (Abstract shortened by ProQuest.).
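
    A minimal sketch of snapshot-based reduction with an empirical error check, assuming numpy and synthetic snapshot data: the active subspace is taken from the leading left singular vectors of a snapshot matrix, and the residual of held-out test snapshots after projection stands in for the reduction error that the thesis bounds probabilistically.

        import numpy as np

        rng = np.random.default_rng(0)
        modes = rng.normal(size=(500, 3))                  # hidden low-dimensional structure
        snapshots = modes @ rng.normal(size=(3, 40)) + 0.01 * rng.normal(size=(500, 40))

        # Build the "active subspace" from the leading left singular vectors of the snapshots
        U, s, _ = np.linalg.svd(snapshots, full_matrices=False)
        energy = np.cumsum(s**2) / np.sum(s**2)
        r = int(np.searchsorted(energy, 0.999)) + 1        # rank capturing 99.9% of the variation
        basis = U[:, :r]

        # Empirical error check on independent (held-out) snapshots: the projection
        # residual stands in for the quantity that is upper-bounded probabilistically.
        test = modes @ rng.normal(size=(3, 10)) + 0.01 * rng.normal(size=(500, 10))
        residual = test - basis @ (basis.T @ test)
        rel_err = np.linalg.norm(residual, axis=0) / np.linalg.norm(test, axis=0)
        print(r, rel_err.max())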

  4. Developments and Tendencies in Fission Reactor Concepts

    NASA Astrophysics Data System (ADS)

    Adamov, E. O.; Fuji-Ie, Y.

    This chapter describes, in two parts, new-generation nuclear energy systems that are required to be in harmony with nature and to make full use of nuclear resources. The issues of transmutation and containment of radioactive waste will also be addressed. After a short introduction to the first part, Sect. 58.1.2 will detail the requirements these systems must satisfy on the basic premise of peaceful use of nuclear energy. The expected designs themselves are described in Sect. 58.1.3. The subsequent sections discuss various types of advanced reactor systems. Section 58.1.4 deals with the light water reactor (LWR) whose performance is still expected to improve, which would extend its application in the future. The supercritical-water-cooled reactor (SCWR) will also be shortly discussed. Section 58.1.5 is mainly on the high temperature gas-cooled reactor (HTGR), which offers efficient and multipurpose use of nuclear energy. The gas-cooled fast reactor (GFR) is also included. Section 58.1.6 focuses on the sodium-cooled fast reactor (SFR) as a promising concept for advanced nuclear reactors, which may help both to achieve expansion of energy sources and environmental protection thus contributing to the sustainable development of mankind. The molten-salt reactor (MSR) is shortly described in Sect. 58.1.7. The second part of the chapter deals with reactor systems of a new generation, which are now found at the research and development (R&D) stage and in the medium term of 20-30 years can shape up as reliable, economically efficient, and environmentally friendly energy sources. They are viewed as technologies of cardinal importance, capable of resolving the problems of fuel resources, minimizing the quantities of generated radioactive waste and the environmental impacts, and strengthening the regime of nonproliferation of the materials suitable for nuclear weapons production. Particular attention has been given to naturally safe fast reactors with a closed fuel cycle (CFC) - as an advanced and promising reactor system that offers solutions to the above problems. The difference (not confrontation) between the approaches to nuclear power development based on the principles of “inherent safety” and “natural safety” is demonstrated.

  5. Probabilistic seismic hazard assessments of Sabah, east Malaysia: accounting for local earthquake activity near Ranau

    NASA Astrophysics Data System (ADS)

    Khalil, Amin E.; Abir, Ismail A.; Ginsos, Hanteh; Abdel Hafiez, Hesham E.; Khan, Sohail

    2018-02-01

    Sabah state in eastern Malaysia, unlike most other Malaysian states, is characterized by relatively frequent seismic activity; an earthquake of moderate magnitude is generally experienced roughly every 20 years, originating mainly from two types of source, either local (e.g. Ranau and Lahad Datu) or regional (e.g. the Kalimantan and South Philippines subductions). The seismicity map of Sabah shows the presence of two zones of distinctive seismicity, near Ranau (near Kota Kinabalu) and Lahad Datu in the southeast of Sabah. The seismicity record of Ranau begins in 1991, according to the international seismicity bulletins (e.g. the United States Geological Survey and the International Seismological Centre), and this short record is not sufficient for seismic source characterization. Fortunately, active Quaternary fault systems have been delineated in the area, so the seismicity of the area is modeled as line sources along these faults. Two main fault systems are believed to be the source of this activity, namely the Mensaban fault zone and the Crocker fault zone, in addition to some other faults in their vicinity. Seismic hazard assessment has become an important and needed study for the extensive development projects in Sabah, especially given the presence of earthquake activity. A probabilistic seismic hazard assessment is adopted for the present work since it can provide the probability of exceeding various ground motion levels expected from future large earthquakes. The output results are presented in terms of spectral acceleration curves and uniform hazard curves for return periods of 500, 1000 and 2500 years. Since this is the first time that a complete hazard study has been done for the area, the output will serve as a base and standard for any future strategic plans in the area.
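
    A minimal sketch of the hazard integral underlying such an assessment is shown below (a Cornell-type calculation for a single site and a single source). The recurrence parameters, the placeholder ground-motion relation, and the source-to-site distance are invented for illustration and are not the values used in the Sabah study.

```python
import numpy as np
from scipy.stats import norm

# Gutenberg-Richter recurrence for one line source (illustrative a/b values).
a_val, b_val = 3.0, 1.0
m_min, m_max = 5.0, 7.5
rate_m_min = 10.0 ** (a_val - b_val * m_min)     # annual rate of events with M >= m_min

mags = np.linspace(m_min, m_max, 400)
beta = b_val * np.log(10.0)
# Truncated exponential magnitude density
pdf_m = beta * np.exp(-beta * (mags - m_min)) / (1.0 - np.exp(-beta * (m_max - m_min)))

def gmpe(mag, dist_km):
    """Placeholder ground-motion relation: mean ln(PGA in g) and its sigma."""
    return -3.5 + 0.9 * mag - 1.0 * np.log(dist_km), 0.6

dist_km = 30.0                                   # assumed site-to-source distance
dm = mags[1] - mags[0]

for pga in (0.05, 0.10, 0.20, 0.40):
    mu, sigma = gmpe(mags, dist_km)
    p_exceed = 1.0 - norm.cdf(np.log(pga), mu, sigma)   # P(PGA > level | M, R)
    lam = rate_m_min * np.sum(p_exceed * pdf_m) * dm    # annual exceedance rate
    p50 = 1.0 - np.exp(-50.0 * lam)                     # Poisson, 50-year window
    print(f"PGA > {pga:.2f} g: annual rate {lam:.2e}, return period {1/lam:.0f} yr, "
          f"P(exceedance in 50 yr) {p50:.3f}")
```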

  6. Nozzle for electric dispersion reactor

    DOEpatents

    Sisson, W.G.; Harris, M.T.; Scott, T.C.; Basaran, O.A.

    1996-04-02

    A nozzle for an electric dispersion reactor includes two coaxial cylindrical bodies, the inner one of the two delivering disperse phase fluid into a continuous phase fluid. A potential difference generated by a voltage source creates a dispersing electric field at the end of the inner electrode. 5 figs.

  7. Nozzle for electric dispersion reactor

    DOEpatents

    Sisson, W.G.; Harris, M.T.; Scott, T.C.; Basaran, O.A.

    1998-06-02

    A nozzle for an electric dispersion reactor includes two coaxial cylindrical bodies, the inner one of the two delivering disperse phase fluid into a continuous phase fluid. A potential difference generated by a voltage source creates a dispersing electric field at the end of the inner electrode. 5 figs.

  8. Key Assets for a Sustainable Low Carbon Energy Future

    NASA Astrophysics Data System (ADS)

    Carre, Frank

    2011-10-01

    Since the beginning of the 21st century, concerns over energy security and climate change have given rise to energy policies focused on energy conservation and diversified low-carbon energy sources. Provided the lessons of the Fukushima accident are convincingly accounted for, nuclear energy will probably be confirmed in most of today's nuclear countries as a low-carbon energy source needed to limit imports of oil and gas and to meet fast-growing energy needs. The future challenges of nuclear energy then lie in three directions: i) enhancing safety performance so as to preclude any long-term impact of a severe accident outside the plant site, even in the case of hypothetical external events, ii) full use of uranium and minimization of the long-lived radioactive waste burden for sustainability, and iii) extension to non-electricity energy products to maximize the share of low-carbon energy sources in transportation fuels, industrial process heat and district heating. Advanced LWRs (Gen-III) are today's best available technologies and can somewhat advance nuclear energy in these three directions. However, breakthroughs in sustainability call for fast neutron reactors and closed fuel cycles, and non-electric applications prompt a revival of interest in high temperature reactors for exceeding the cogeneration performance achievable with LWRs. Both types of Gen-IV nuclear systems by nature call for technology breakthroughs to surpass LWR capabilities. The current resumption in France of research on sodium-cooled fast neutron reactors (SFRs) definitely aims at significant progress in safety and economic competitiveness compared to earlier reactors of this type, in order to progress towards a new generation of commercially viable sodium-cooled fast reactors. Along with advancing a new generation of sodium-cooled fast reactors, research and development on alternative fast reactor types such as gas- or lead-alloy-cooled systems (GFR & LFR) is strategic to overcome technical difficulties and/or political opposition specific to sodium. In conclusion, research and technology breakthroughs in nuclear power are needed for shaping a sustainable low-carbon future. International cooperation is key for sharing the costs of research and development of the required novel technologies and the cost of the first experimental reactors needed to demonstrate enabling technologies. At the same time as technology breakthroughs are developed, pre-normative research is required to support codification work and harmonized regulations that will ultimately apply to the safety and security features of the resulting innovative reactor types and fuel cycles.

  9. Rapid solar-thermal decarbonization of methane

    NASA Astrophysics Data System (ADS)

    Dahl, Jaimee Kristen

    Due to the ever-increasing demand for energy and the concern over the environmental impact of continuing to produce energy using current methods, there is interest in developing a hydrogen economy. Hydrogen is a desirable energy source because it is abundant in nature and burns cleanly. One method for producing hydrogen is to utilize a renewable energy source to obtain high enough temperatures to decompose a fossil fuel into its elements. This thesis work is directed at developing a solar-thermal aerosol flow reactor to dissociate methane to carbon black and hydrogen. The technology is intended as a "bridge" between current hydrogen production methods, such as conventional steam-methane reformers, and future "zero emission" technology for producing hydrogen, such as dissociating water using a renewable heating source. A solar furnace is used to heat a reactor to temperatures in excess of 2000 K. The final reactor design studied consists of three concentric vertical tubes---an outer quartz protection tube, a middle solid graphite heating tube, and an inner porous graphite reaction tube. A "fluid-wall" is created on the inside wall of the porous reaction tube in order to prevent deposition of the carbon black co-product on the reactor tube wall. The amorphous carbon black produced aids in heating the gas stream by absorbing radiation from the reactor wall. Conversions of 90% are obtained at a reactor wall temperature of 2100 K and an average residence time of 0.01 s. Computer modeling is also performed to study the gas flow and temperature profiles in the reactor as well as the kinetics of the methane dissociation reaction. The simulations indicate that there is little flow of the fluid-wall gas through the porous wall in the hot zone region, but this can be remedied by increasing the inlet temperature of the fluid-wall gas and/or increasing the tube permeability only in the hot zone region of the wall. The following expression describes the kinetics of methane dissociation in a solar-thermal fluid-wall reactor: dX/dt = 5.8 x 10^8 exp(-155,600/RT) (1-X)^7.2 s^-1. The experimental and theoretical work reported in this thesis is the groundwork that will be utilized in scaling up the reactor to produce hydrogen in distributed or centralized facilities.
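
    As a quick illustration of what such a rate expression implies, the sketch below integrates it numerically for a fixed wall temperature and residence time. The placement of the 7.2 exponent follows the reconstruction of the garbled expression above and should be treated as an assumption, as should the chosen temperatures and residence time.

```python
import numpy as np

# Rate expression as reconstructed from the abstract (the placement of the
# 7.2 exponent is an assumption; units: Ea in J/mol, T in K, t in s).
A_PRE = 5.8e8          # pre-exponential factor, 1/s
EA = 155_600.0         # activation energy, J/mol
R_GAS = 8.314          # J/(mol K)
ORDER = 7.2            # apparent reaction order in (1 - X), assumed placement

def conversion(T_wall, residence_time, n_steps=20_000):
    """Integrate dX/dt = A*exp(-Ea/RT)*(1-X)**n with explicit Euler steps."""
    k = A_PRE * np.exp(-EA / (R_GAS * T_wall))
    dt = residence_time / n_steps
    X = 0.0
    for _ in range(n_steps):
        X += k * (1.0 - X) ** ORDER * dt
    return min(X, 1.0)

for T in (1800.0, 2000.0, 2100.0):   # illustrative wall temperatures
    print(f"T = {T:.0f} K, tau = 0.01 s -> X = {conversion(T, 0.01):.2f}")
```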

  10. Stochastic simulation of predictive space–time scenarios of wind speed using observations and physical model outputs

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bessac, Julie; Constantinescu, Emil; Anitescu, Mihai

    We propose a statistical space-time model for predicting atmospheric wind speed based on deterministic numerical weather predictions and historical measurements. We consider a Gaussian multivariate space-time framework that combines multiple sources of past physical model outputs and measurements in order to produce a probabilistic wind speed forecast within the prediction window. We illustrate this strategy on wind speed forecasts during several months in 2012 for a region near the Great Lakes in the United States. The results show that the prediction is improved in the mean-squared sense relative to the numerical forecasts as well as in probabilistic scores. Moreover, the samples are shown to produce realistic wind scenarios based on sample spectra and space-time correlation structure.
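
    The sketch below illustrates the general idea of conditioning a deterministic forecast on recent measurements within a Gaussian framework (a simple Kalman-style update). The synthetic data, the assumed error variances, and the exponential temporal correlation are placeholders and do not correspond to the space-time model fitted in the paper.

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic example: wind speed at 5 stations over a 24-hour prediction window.
n_sta, n_lead = 5, 24
truth = 8.0 + 2.0 * np.sin(np.linspace(0.0, 2.0 * np.pi, n_lead))
nwp_bias = rng.normal(0.0, 1.5, (n_sta, 1))                 # persistent model bias
nwp = truth + nwp_bias + rng.normal(0.0, 0.5, (n_sta, n_lead))
obs = truth[0] + rng.normal(0.0, 0.3, n_sta)                # measurements at lead 0

# Gaussian conditioning: prior mean = NWP output; the prior sd and the temporal
# correlation of its error with lead 0 are assumed, not estimated from data.
prior_sd, obs_sd = 1.6, 0.3
corr = np.exp(-np.arange(n_lead) / 8.0)                     # correlation with lead 0
gain = prior_sd**2 * corr / (prior_sd**2 + obs_sd**2)

post_mean = nwp + gain[None, :] * (obs - nwp[:, 0])[:, None]
post_sd = np.sqrt(prior_sd**2 * (1.0 - corr**2 * prior_sd**2 /
                                 (prior_sd**2 + obs_sd**2)))

rmse = lambda f: np.sqrt(np.mean((f - truth) ** 2))
print(f"RMSE of raw NWP forecast:      {rmse(nwp):.2f} m/s")
print(f"RMSE of conditioned forecast:  {rmse(post_mean):.2f} m/s")
print(f"predictive sd, lead 0 vs 23 h: {post_sd[0]:.2f} vs {post_sd[-1]:.2f} m/s")
```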

  11. DCMDN: Deep Convolutional Mixture Density Network

    NASA Astrophysics Data System (ADS)

    D'Isanto, Antonio; Polsterer, Kai Lars

    2017-09-01

    Deep Convolutional Mixture Density Network (DCMDN) estimates probabilistic photometric redshift directly from multi-band imaging data by combining a version of a deep convolutional network with a mixture density network. The estimates are expressed as Gaussian mixture models representing the probability density functions (PDFs) in the redshift space. In addition to the traditional scores, the continuous ranked probability score (CRPS) and the probability integral transform (PIT) are applied as performance criteria. DCMDN is able to predict redshift PDFs independently from the type of source, e.g. galaxies, quasars or stars and renders pre-classification of objects and feature extraction unnecessary; the method is extremely general and allows the solving of any kind of probabilistic regression problems based on imaging data, such as estimating metallicity or star formation rate in galaxies.
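
    For readers unfamiliar with these scores, the sketch below evaluates the CRPS and the PIT value for a single object whose predictive redshift PDF is a Gaussian mixture. The mixture weights, means, widths, and the true redshift are invented for illustration; they are not DCMDN outputs.

```python
import numpy as np
from scipy.stats import norm

# Predictive photo-z PDF as a Gaussian mixture (illustrative parameters only).
weights = np.array([0.6, 0.3, 0.1])
means   = np.array([0.45, 0.55, 0.80])
sigmas  = np.array([0.03, 0.05, 0.10])
z_true  = 0.47

z = np.linspace(0.0, 1.5, 4000)
cdf = np.sum(weights[:, None] * norm.cdf(z[None, :], means[:, None], sigmas[:, None]),
             axis=0)

# Continuous ranked probability score: integral of (F(z) - H(z - z_true))^2.
heaviside = (z >= z_true).astype(float)
crps = np.sum((cdf - heaviside) ** 2) * (z[1] - z[0])

# Probability integral transform value F(z_true); over many objects a flat
# PIT histogram indicates well-calibrated PDFs.
pit = np.interp(z_true, z, cdf)
print(f"CRPS = {crps:.4f}, PIT = {pit:.3f}")
```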

  12. Stochastic simulation of predictive space–time scenarios of wind speed using observations and physical model outputs

    DOE PAGES

    Bessac, Julie; Constantinescu, Emil; Anitescu, Mihai

    2018-03-01

    We propose a statistical space-time model for predicting atmospheric wind speed based on deterministic numerical weather predictions and historical measurements. We consider a Gaussian multivariate space-time framework that combines multiple sources of past physical model outputs and measurements in order to produce a probabilistic wind speed forecast within the prediction window. We illustrate this strategy on wind speed forecasts during several months in 2012 for a region near the Great Lakes in the United States. The results show that the prediction is improved in the mean-squared sense relative to the numerical forecasts as well as in probabilistic scores. Moreover, the samples are shown to produce realistic wind scenarios based on sample spectra and space-time correlation structure.

  13. Applicability of a neuroprobabilistic integral risk index for the environmental management of polluted areas: a case study.

    PubMed

    Nadal, Martí; Kumar, Vikas; Schuhmacher, Marta; Domingo, José L

    2008-04-01

    Recently, we developed a GIS-Integrated Integral Risk Index (IRI) to assess human health risks in areas with the presence of environmental pollutants. Contaminants were previously ranked by applying a self-organizing map (SOM) to their characteristics of persistence, bioaccumulation, and toxicity in order to obtain the Hazard Index (HI). In the present study, the original IRI was substantially improved by allowing the entry of probabilistic data. A neuroprobabilistic HI was developed by combining SOM and Monte Carlo analysis. In general terms, the deterministic and probabilistic HIs followed a similar pattern: polychlorinated biphenyls (PCBs) and light polycyclic aromatic hydrocarbons (PAHs) were the pollutants showing the highest and lowest values of HI, respectively. However, the bioaccumulation value of heavy metals notably increased after considering a probability density function to describe the bioaccumulation factor. To check its applicability, a case study was investigated. The probabilistic integral risk was calculated in the chemical/petrochemical industrial area of Tarragona (Catalonia, Spain), where an environmental program has been carried out since 2002. The risk change between 2002 and 2005 was evaluated on the basis of probabilistic data on the levels of various pollutants in soils. The results indicated that the risk of the chemicals under study did not follow a homogeneous tendency. However, the current levels of pollution do not represent a relevant source of health risks for the local population. Moreover, the neuroprobabilistic HI seems to be an adequate tool to be taken into account in risk assessment processes.

  14. Uncertainty Estimation in Tsunami Initial Condition From Rapid Bayesian Finite Fault Modeling

    NASA Astrophysics Data System (ADS)

    Benavente, R. F.; Dettmer, J.; Cummins, P. R.; Urrutia, A.; Cienfuegos, R.

    2017-12-01

    It is well known that kinematic rupture models for a given earthquake can present discrepancies even when similar datasets are employed in the inversion process. While quantifying this variability can be critical when making early estimates of the earthquake and triggered tsunami impact, "most likely models" are normally used for this purpose. In this work, we quantify the uncertainty of the tsunami initial condition for the great Illapel earthquake (Mw = 8.3, 2015, Chile). We focus on utilizing data and inversion methods that are suitable to rapid source characterization yet provide meaningful and robust results. Rupture models from teleseismic body and surface waves as well as W-phase are derived and accompanied by Bayesian uncertainty estimates from linearized inversion under positivity constraints. We show that robust and consistent features about the rupture kinematics appear when working within this probabilistic framework. Moreover, by using static dislocation theory, we translate the probabilistic slip distributions into seafloor deformation which we interpret as a tsunami initial condition. After considering uncertainty, our probabilistic seafloor deformation models obtained from different data types appear consistent with each other providing meaningful results. We also show that selecting just a single "representative" solution from the ensemble of initial conditions for tsunami propagation may lead to overestimating information content in the data. Our results suggest that rapid, probabilistic rupture models can play a significant role during emergency response by providing robust information about the extent of the disaster.

  15. Feasibility study on the use of probabilistic migration modeling in support of exposure assessment from food contact materials.

    PubMed

    Poças, Maria F; Oliveira, Jorge C; Brandsch, Rainer; Hogg, Timothy

    2010-07-01

    The use of probabilistic approaches in exposure assessments of contaminants migrating from food packages is of increasing interest, but the lack of concentration or migration data is often referred to as a limitation. Data accounting for the variability and uncertainty that can be expected in migration, for example, due to heterogeneity in the packaging system, variation of the temperature along the distribution chain, and different times of consumption of each individual package, are required for probabilistic analysis. The objective of this work was to characterize quantitatively the uncertainty and variability in estimates of migration. A Monte Carlo simulation was applied to a typical solution of Fick's law with given variability in the input parameters. The analysis was performed based on experimental data for a model system (migration of Irgafos 168 from polyethylene into isooctane) and illustrates how important sources of variability and uncertainty can be identified in order to refine analyses. For long migration times and controlled conditions of temperature, the affinity of the migrant to the food can be the major factor determining the variability in the migration values (more than 70% of the variance). In situations where both the time of consumption and the temperature can vary, these factors can be responsible, respectively, for more than 60% and 20% of the variance in the migration estimates. The approach presented can be used with databases from consumption surveys to yield a true probabilistic estimate of exposure.
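
    A minimal sketch of this kind of Monte Carlo migration estimate is shown below. The short-contact Fickian expression, the parameter distributions, and the crude variance apportionment are illustrative assumptions; they are not the Irgafos 168 data or the variance decomposition reported in the paper.

```python
import numpy as np

rng = np.random.default_rng(7)
n = 50_000

# Illustrative inputs (hypothetical values, not the paper's data).
c_p0  = 500.0                                   # initial concentration in polymer, mg/kg
rho_p = 0.95                                    # polymer density, kg/dm^3

T = rng.uniform(278.0, 308.0, n)                # storage temperature, K
t = rng.lognormal(mean=np.log(10 * 24 * 3600), sigma=0.5, size=n)   # contact time, s

# Diffusion coefficient: Arrhenius temperature dependence plus lognormal scatter.
D_ref, Ea, R_gas = 1e-10, 80_000.0, 8.314       # dm^2/s at 298 K, J/mol, J/(mol K)
scatter = rng.lognormal(0.0, 0.3, n)
D = D_ref * np.exp(-Ea / R_gas * (1.0 / T - 1.0 / 298.0)) * scatter

# Short-time Fickian migration per unit contact area, mg/dm^2.
M = 2.0 * c_p0 * rho_p * np.sqrt(D * t / np.pi)

print("migration mg/dm^2: median {:.2f}, 95th percentile {:.2f}".format(
      np.median(M), np.percentile(M, 95)))

# Crude variance apportionment: squared correlation of each input with ln(M).
for name, x in [("temperature", T), ("contact time", t), ("D scatter", scatter)]:
    r = np.corrcoef(np.log(x), np.log(M))[0, 1]
    print(f"{name:12s} ~ {100 * r**2:.0f}% of log-variance (rough indicator)")
```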

  16. Evaluating sub-seasonal skill in probabilistic forecasts of Atmospheric Rivers and associated extreme events

    NASA Astrophysics Data System (ADS)

    Subramanian, A. C.; Lavers, D.; Matsueda, M.; Shukla, S.; Cayan, D. R.; Ralph, M.

    2017-12-01

    Atmospheric rivers (ARs) - elongated plumes of intense moisture transport - are a primary source of hydrological extremes, water resources and impactful weather along the West Coast of North America and Europe. There is strong demand in the water management, societal infrastructure and humanitarian sectors for reliable sub-seasonal forecasts, particularly of extreme events, such as floods and droughts so that actions to mitigate disastrous impacts can be taken with sufficient lead-time. Many recent studies have shown that ARs in the Pacific and the Atlantic are modulated by large-scale modes of climate variability. Leveraging the improved understanding of how these large-scale climate modes modulate the ARs in these two basins, we use the state-of-the-art multi-model forecast systems such as the North American Multi-Model Ensemble (NMME) and the Subseasonal-to-Seasonal (S2S) database to help inform and assess the probabilistic prediction of ARs and related extreme weather events over the North American and European West Coasts. We will present results from evaluating probabilistic forecasts of extreme precipitation and AR activity at the sub-seasonal scale. In particular, results from the comparison of two winters (2015-16 and 2016-17) will be shown, winters which defied canonical El Niño teleconnection patterns over North America and Europe. We further extend this study to analyze probabilistic forecast skill of AR events in these two basins and the variability in forecast skill during certain regimes of large-scale climate modes.

  17. Probabilistic Learning in Junior High School: Investigation of Student Probabilistic Thinking Levels

    NASA Astrophysics Data System (ADS)

    Kurniasih, R.; Sujadi, I.

    2017-09-01

    This paper investigates students' levels of probabilistic thinking, that is, thinking about probabilistic or uncertain matters in probability material. The subjects were 8th grade junior high school students. The main instrument was the researcher, supported by a probabilistic thinking skills test and interview guidelines. Data were analyzed using the triangulation method. The results showed that before instruction the students' probabilistic thinking was at the subjective and transitional levels; after instruction, their probabilistic thinking levels changed. Based on the results, some 8th grade students reached the numerical level, the highest of the levels. Students' probabilistic thinking levels can be used as a reference for designing learning materials and strategies.

  18. Isomer Energy Source for Space Propulsion Systems

    DTIC Science & Technology

    2004-03-01

    1,590 Engine F/W (no shield) 3.4 5.0 20.0 A similar core design replacing the fission fuel with the isomer 178Hfm2 is the starting point for this...particles interact and collide with other atoms in the fuel material, reactor core, or coolant, their energy can be transferred to thermal energy...thrust (44). The program produced several reactors that made it all the way through the testing stages of development. The reactors used uranium-235

  19. Method and apparatus for a combination moving bed thermal treatment reactor and moving bed filter

    DOEpatents

    Badger, Phillip C.; Dunn, Jr., Kenneth J.

    2015-09-01

    A moving bed gasification/thermal treatment reactor includes a geometry in which moving bed reactor particles serve as both a moving bed filter and a heat carrier to provide thermal energy for thermal treatment reactions, such that the moving bed filter and the heat carrier are one and the same to remove solid particulates or droplets generated by thermal treatment processes or injected into the moving bed filter from other sources.

  20. Small Modular Reactors: The Army’s Secure Source of Energy?

    DTIC Science & Technology

    2012-03-21

    significant advantages of SMRs is the minimal amount of carbon dioxide (greenhouse gases) that is released in conjunction with the lifecycle operations...moderator in these reactors as well as the cooling agent and the means by which heat is removed to produce steam for turning the turbines of the...separate water system to generate steam to turn a turbine which then produces electricity. In the second type of light water reactors, the boiling water

  1. Performance of a full scale prototype detector at the BR2 reactor for the SoLid experiment

    NASA Astrophysics Data System (ADS)

    Abreu, Y.; Amhis, Y.; Arnold, L.; Ban, G.; Beaumont, W.; Bongrand, M.; Boursette, D.; Castle, B. C.; Clark, K.; Coupé, B.; Cussans, D.; De Roeck, A.; D'Hondt, J.; Durand, D.; Fallot, M.; Ghys, L.; Giot, L.; Guillon, B.; Ihantola, S.; Janssen, X.; Kalcheva, S.; Kalousis, L. N.; Koonen, E.; Labare, M.; Lehaut, G.; Manzanillas, L.; Mermans, J.; Michiels, I.; Moortgat, C.; Newbold, D.; Park, J.; Pestel, V.; Petridis, K.; Piñera, I.; Pommery, G.; Popescu, L.; Pronost, G.; Rademacker, J.; Ryckbosch, D.; Ryder, N.; Saunders, D.; Schune, M.-H.; Simard, L.; Vacheret, A.; Van Dyck, S.; Van Mulders, P.; van Remortel, N.; Vercaemer, S.; Verstraeten, M.; Weber, A.; Yermia, F.

    2018-05-01

    The SoLid collaboration has developed a new detector technology to detect electron anti-neutrinos at close proximity to the Belgian BR2 reactor at surface level. A 288 kg prototype detector was deployed in 2015 and collected data during the operational period of the reactor and during reactor shut-down. Dedicated calibration campaigns were also performed with gamma and neutron sources. This paper describes the construction of the prototype detector with a high control on its proton content and the stability of its operation over a period of several months after deployment at the BR2 reactor site. All detector cells provide sufficient light yields to achieve a target energy resolution of better than 20%/√E(MeV). The capability of the detector to track muons is exploited to equalize the light response of a large number of channels to a precision of 3% and to demonstrate the stability of the energy scale over time. Particle identification based on pulse-shape discrimination is demonstrated with calibration sources. Despite a lower neutron detection efficiency due to triggering constraints, the main backgrounds at the reactor site were determined and taken into account in the shielding strategy for the main experiment. The results obtained with this prototype proved essential in the design optimization of the final detector.

  2. Radiation Source Mapping with Bayesian Inverse Methods

    DOE PAGES

    Hykes, Joshua M.; Azmy, Yousry Y.

    2017-03-22

    In this work, we present a method to map the spectral and spatial distributions of radioactive sources using a limited number of detectors. Locating and identifying radioactive materials is important for border monitoring, in accounting for special nuclear material in processing facilities, and in cleanup operations following a radioactive material spill. Most methods to analyze these types of problems make restrictive assumptions about the distribution of the source. In contrast, the source mapping method presented here allows an arbitrary three-dimensional distribution in space and a gamma peak distribution in energy. To apply the method, the problem is cast as an inverse problem where the system's geometry and material composition are known and fixed, while the radiation source distribution is sought. A probabilistic Bayesian approach is used to solve the resulting inverse problem since the system of equations is ill-posed. The posterior is maximized with a Newton optimization method. The probabilistic approach also provides estimates of the confidence in the final source map prediction. A set of adjoint, discrete ordinates flux solutions, obtained in this work by the Denovo code, is required to efficiently compute detector responses from a candidate source distribution. These adjoint fluxes form the linear mapping from the state space to the response space. The test of the method's success is simultaneously locating a set of 137Cs and 60Co gamma sources in a room. This test problem is solved using experimental measurements that we collected for this purpose. Because of the weak sources available for use in the experiment, some of the expected photopeaks were not distinguishable from the Compton continuum. However, by supplanting 14 flawed measurements (out of a total of 69) with synthetic responses computed by MCNP, the proof-of-principle source mapping was successful. The locations of the sources were predicted within 25 cm for two of the sources and 90 cm for the third, in a room with a ~4 m x 4 m floor plan. Finally, the predicted source intensities were within a factor of ten of their true value.
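
    A toy version of the underlying inverse problem is sketched below: a linear detector-response matrix standing in for the adjoint flux solutions, a Gaussian likelihood and prior, and a MAP estimate with a simple nonnegativity projection in place of the constrained Newton solve. The 1-D geometry, response model, and noise levels are invented and are not the Denovo/MCNP setup used in the paper.

```python
import numpy as np

rng = np.random.default_rng(3)

# Synthetic 1-D "room" discretized into 30 voxels; 6 detectors with known
# response rows (playing the role of the adjoint flux solutions).
n_vox, n_det = 30, 6
x = np.linspace(0.0, 4.0, n_vox)
det_pos = np.linspace(0.3, 3.7, n_det)
A = 1.0 / (1.0 + (x[None, :] - det_pos[:, None]) ** 2)    # made-up response matrix

s_true = np.zeros(n_vox)
s_true[[7, 21]] = [5.0, 3.0]                              # two point-like sources
d = A @ s_true + rng.normal(0.0, 0.05, n_det)             # noisy detector readings

# Gaussian MAP estimate with a smallness (ridge) prior; nonnegativity enforced
# by projection as a stand-in for the constrained optimization.
sigma_d, sigma_s = 0.05, 2.0
H = A.T @ A / sigma_d**2 + np.eye(n_vox) / sigma_s**2     # posterior precision
s_map = np.linalg.solve(H, A.T @ d / sigma_d**2)
s_map = np.clip(s_map, 0.0, None)

post_sd = np.sqrt(np.diag(np.linalg.inv(H)))              # per-voxel confidence
top = sorted(np.argsort(s_map)[-2:].tolist())
print("true source voxels:", [7, 21], "| two strongest recovered voxels:", top)
print("MAP intensities there:", np.round(s_map[top], 2),
      "posterior sd:", np.round(post_sd[top], 2))
```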

  3. Performance of probabilistic method to detect duplicate individual case safety reports.

    PubMed

    Tregunno, Philip Michael; Fink, Dorthe Bech; Fernandez-Fernandez, Cristina; Lázaro-Bengoa, Edurne; Norén, G Niklas

    2014-04-01

    Individual case reports of suspected harm from medicines are fundamental for signal detection in postmarketing surveillance. Their effective analysis requires reliable data and one challenge is report duplication. These are multiple unlinked records describing the same suspected adverse drug reaction (ADR) in a particular patient. They distort statistical screening and can mislead clinical assessment. Many organisations rely on rule-based detection, but probabilistic record matching is an alternative. The aim of this study was to evaluate probabilistic record matching for duplicate detection, and to characterise the main sources of duplicate reports within each data set. vigiMatch™, a published probabilistic record matching algorithm, was applied to the WHO global individual case safety reports database, VigiBase(®), for reports submitted between 2000 and 2010. Reported drugs, ADRs, patient age, sex, country of origin, and date of onset were considered in the matching. Suspected duplicates for the UK, Denmark, and Spain were reviewed and classified by the respective national centre. This included evaluation to determine whether confirmed duplicates had already been identified by in-house, rule-based screening. Furthermore, each confirmed duplicate was classified with respect to the likely source of duplication. For each country, the proportions of suspected duplicates classified as confirmed duplicates, likely duplicates, otherwise related, and unrelated were obtained. The proportions of confirmed or likely duplicates that were not previously known by the national organisation were determined, and variations in the rates of suspected duplicates across subsets of reports were characterised. Overall, 2.5 % of the reports with sufficient information to be evaluated by vigiMatch were classified as suspected duplicates. The rates for the three countries considered in this study were 1.4 % (UK), 1.0 % (Denmark), and 0.7 % (Spain). Higher rates of suspected duplicates were observed for literature reports (11 %) and reports with fatal outcome (5 %), whereas a lower rate was observed for reports from consumers and non-health professionals (0.5 %). The predictive value for confirmed or likely duplicates among reports flagged as suspected duplicates by vigiMatch ranged from 86 % for the UK, to 64 % for Denmark and 33 % for Spain. The proportions of confirmed duplicates that were previously unknown to national centres ranged from 89 % for Spain, to 60 % for the UK and 38 % for Denmark, despite in-house duplicate detection processes in routine use. The proportion of unrelated cases among suspected duplicates were below 10 % for each national centre in the study. Probabilistic record matching, as implemented in vigiMatch, achieved good predictive value for confirmed or likely duplicates in each data source. Most of the false positives corresponded to otherwise related reports; less than 10 % were altogether unrelated. A substantial proportion of the correctly identified duplicates had not previously been detected by national centre activity. On one hand, vigiMatch highlighted duplicates that had been missed by rule-based methods, and on the other hand its lower total number of suspected duplicates to review improved the accuracy of manual review.
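
    The sketch below shows the general flavor of probabilistic record matching, using a Fellegi-Sunter-style log-likelihood-ratio score over report fields. The field list, the agreement (m) and chance-agreement (u) probabilities, the example records, and the decision threshold are all invented; this is not the vigiMatch algorithm itself.

```python
import math

# Agreement (m) and chance-agreement (u) probabilities per field
# (illustrative values, not vigiMatch parameters).
FIELDS = {
    "drug":       {"m": 0.95, "u": 0.05},
    "reaction":   {"m": 0.90, "u": 0.10},
    "age":        {"m": 0.85, "u": 0.02},
    "sex":        {"m": 0.98, "u": 0.50},
    "onset_date": {"m": 0.80, "u": 0.01},
    "country":    {"m": 0.99, "u": 0.20},
}

def match_score(rec_a, rec_b):
    """Sum of log-likelihood ratios over fields; higher = more likely duplicate."""
    score = 0.0
    for field, p in FIELDS.items():
        a, b = rec_a.get(field), rec_b.get(field)
        if a is None or b is None:
            continue                      # missing fields contribute no evidence
        if a == b:
            score += math.log(p["m"] / p["u"])
        else:
            score += math.log((1 - p["m"]) / (1 - p["u"]))
    return score

r1 = {"drug": "drugX", "reaction": "rash", "age": 34, "sex": "F",
      "onset_date": "2009-06-01", "country": "UK"}
r2 = {"drug": "drugX", "reaction": "rash", "age": 34, "sex": "F",
      "onset_date": "2009-06-02", "country": "UK"}

THRESHOLD = 5.0                           # flag as suspected duplicate above this
s = match_score(r1, r2)
print(f"score = {s:.2f}, suspected duplicate: {s > THRESHOLD}")
```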

  4. Nuclear Forensics Attributing the Source of Spent Fuel Used in an RDD Event

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Scott, Mark Robert

    2005-05-01

    An RDD attack against the U.S. is something America needs to prepare against. If such an event occurs, the ability to quickly identify the source of the radiological material used in an RDD would aid investigators in identifying the perpetrators. Spent fuel is one of the most dangerous possible radiological sources for an RDD. In this work, a forensics methodology was developed and implemented to attribute spent fuel to a source reactor. The specific attributes determined are the spent fuel burnup, age from discharge, reactor type, and initial fuel enrichment. It is shown that by analyzing the post-event material, these attributes can be determined with enough accuracy to be useful for investigators. The burnup can be found within a 5% accuracy, enrichment with a 2% accuracy, and age with a 10% accuracy. Reactor type can be determined if specific nuclides are measured. The methodology developed was implemented into a code called NEMASYS. NEMASYS is easy to use and it takes a minimum amount of time to learn its basic functions. It will process data within a few minutes and provide detailed information about the results and conclusions.

  5. Effect of air-assisted backwashing on the performance of an anaerobic fixed-bed bioreactor that simultaneously removes nitrate and arsenic from drinking water sources.

    PubMed

    Upadhyaya, Giridhar; Clancy, Tara M; Snyder, Kathryn V; Brown, Jess; Hayes, Kim F; Raskin, Lutgarde

    2012-03-15

    Contaminant removal from drinking water sources under reducing conditions conducive for the growth of denitrifying, arsenate reducing, and sulfate reducing microbes using a fixed-bed bioreactor may require oxygen-free gas (e.g., N2 gas) during backwashing. However, the use of air-assisted backwashing has practical advantages, including simpler operation, improved safety, and lower cost. A study was conducted to evaluate whether replacing N2 gas with air during backwashing would impact performance in a nitrate and arsenic removing anaerobic bioreactor system that consisted of two biologically active carbon reactors in series. Gas-assisted backwashing, comprised of 2 min of gas injection to fluidize the bed and dislodge biomass and solid phase products, was performed in the first reactor (reactor A) every two days. The second reactor (reactor B) was subjected to N2 gas-assisted backwashing every 3-4 months. Complete removal of 50 mg/L NO3- was achieved in reactor A before and after the switch from N2-assisted backwashing (NAB) to air-assisted backwashing (AAB). Substantial sulfate removal was achieved with both backwashing strategies. Prolonged practice of AAB (more than two months), however, diminished sulfate reduction in reactor B somewhat. Arsenic removal in reactor A was impacted slightly by long-term use of AAB, but arsenic removals achieved by the entire system during NAB and AAB periods were not significantly different (p>0.05) and arsenic concentrations were reduced from approximately 200 μg/L to below 20 μg/L. These results indicate that AAB can be implemented in anaerobic nitrate and arsenic removal systems. Copyright © 2011 Elsevier Ltd. All rights reserved.

  6. Comparative evaluation of solar, fission, fusion, and fossil energy resources, part 3

    NASA Technical Reports Server (NTRS)

    Clement, J. D.; Reupke, W. A.

    1974-01-01

    The role of nuclear fission reactors in becoming an important power source in the world is discussed. The supply of fissile nuclear fuel will be severely depleted by the year 2000. With breeder reactors the world supply of uranium could last thousands of years. However, breeder reactors have problems of a large radioactive inventory and an accident potential which could present an unacceptable hazard. Although breeder reactors afford a possible solution to the energy shortage, their ultimate role will depend on demonstrated safety and acceptable risks and environmental effects. Fusion power would also be a long range, essentially permanent, solution to the world's energy problem. Fusion appears to compare favorably with breeders in safety and environmental effects. Research comparing a controlled fusion reactor with the breeder reactor in solving our long range energy needs is discussed.

  7. Isotope Mixes, Corresponding Nuclear Properties and Reactor Design Implications of Naturally Occurring Lead Sources

    DTIC Science & Technology

    2013-06-01

    Table 8. Required enrichment for criticality (keff ~ 1) - 1. Table 9. Required enrichment for criticality (keff ~ 1) - 2. Table 10. Required enrichment for SSTAR based model reactor to achieve criticality using various natural lead concentrations.

  8. Nuclear Power from Fission Reactors. An Introduction.

    ERIC Educational Resources Information Center

    Department of Energy, Washington, DC. Technical Information Center.

    The purpose of this booklet is to provide a basic understanding of nuclear fission energy and different fission reaction concepts. Topics discussed are: energy use and production, current uses of fuels, oil and gas consumption, alternative energy sources, fossil fuel plants, nuclear plants, boiling water and pressurized water reactors, the light…

  9. Recovery of cesium and palladium from nuclear reactor fuel processing waste

    DOEpatents

    Campbell, David O.

    1976-01-01

    A method of recovering cesium and palladium values from nuclear reactor fission product waste solution involves contacting the solution with a source of chloride ions and oxidizing palladium ions present in the solution to precipitate cesium and palladium as Cs₂PdCl₆.

  10. The analysis of probability task completion; Taxonomy of probabilistic thinking-based across gender in elementary school students

    NASA Astrophysics Data System (ADS)

    Sari, Dwi Ivayana; Budayasa, I. Ketut; Juniati, Dwi

    2017-08-01

    The formulation of mathematics learning goals is now oriented not only toward cognitive products but also toward cognitive processes, such as probabilistic thinking. Probabilistic thinking is needed by students to make decisions. Elementary school students are required to develop probabilistic thinking as a foundation for learning probability at higher levels. A framework of students' probabilistic thinking had been developed using the SOLO taxonomy, which consists of prestructural probabilistic thinking, unistructural probabilistic thinking, multistructural probabilistic thinking and relational probabilistic thinking. This study aimed to analyze probability task completion based on this taxonomy of probabilistic thinking. The subjects were two fifth grade students, a boy and a girl, selected on the basis of a mathematical ability test as having high math ability. The subjects were given probability tasks covering sample space, probability of an event and probability comparison. The data analysis consisted of categorization, reduction, interpretation and conclusion. Data credibility was established using time triangulation. The results showed that the boy's probabilistic thinking in completing the probability tasks was at the multistructural level, while the girl's was at the unistructural level, indicating that the boy's level of probabilistic thinking was higher than the girl's. These results could help curriculum developers formulate probability learning goals for elementary school students, and teachers could take gender differences into account when teaching probability.

  11. Stimulation of the hydrolytic stage for biogas production from cattle manure in an electrochemical bioreactor.

    PubMed

    Samani, Saeed; Abdoli, Mohammad Ali; Karbassi, Abdolreza; Amin, Mohammad Mehdi

    Electrical current in the hydrolytic phase of the biogas process might affect biogas yield. In this study, four 1,150 mL single membrane-less chamber electrochemical bioreactors, each containing two parallel titanium plates, were connected to an electrical source at voltages of 0, -0.5, -1 and -1.5 V, respectively. Reactor 1, at 0 V, was considered the control reactor. The trend of biogas production was precisely checked against pH, oxidation reduction potential and electrical power at a temperature of 37 ± 0.5°C, with cattle manure as substrate, for 120 days. Biogas production increased with the voltage applied to Reactors 2 and 3 when compared with the control reactor. In addition, the electricity in Reactors 2 and 3 caused more biogas production than in Reactor 4. The acetogenic phase occurred more quickly in Reactor 3 than in the other reactors. The results obtained from Reactor 4 were indicative of acidogenic domination and its continuous behavior under electrical stimulation. The results of the present investigation clearly revealed that phasic electrical current could enhance the efficiency of biogas production.

  12. Experiment on search for neutron-antineutron oscillations using a projected UCN source at the WWR-M reactor

    NASA Astrophysics Data System (ADS)

    Fomin, A. K.; Serebrov, A. P.; Zherebtsov, O. M.; Leonova, E. N.; Chaikovskii, M. E.

    2017-01-01

    We propose an experiment on search for neutron-antineutron oscillations based on the storage of ultracold neutrons (UCN) in a material trap. The sensitivity of the experiment mostly depends on the trap size and the amount of UCN in it. In Petersburg Nuclear Physics Institute (PNPI) a high-intensity UCN source is projected at the WWR-M reactor, which must provide UCN density 2-3 orders of magnitude higher than existing sources. The results of simulations of the designed experimental scheme show that the sensitivity can be increased by ˜ 10-40 times compared to sensitivity of previous experiment depending on the model of neutron reflection from walls.

  13. A probabilistic approach to aircraft design emphasizing stability and control uncertainties

    NASA Astrophysics Data System (ADS)

    Delaurentis, Daniel Andrew

    In order to address identified deficiencies in current approaches to aerospace systems design, a new method has been developed. This new method for design is based on the premise that design is a decision making activity, and that deterministic analysis and synthesis can lead to poor, or misguided decision making. This is due to a lack of disciplinary knowledge of sufficient fidelity about the product, to the presence of uncertainty at multiple levels of the aircraft design hierarchy, and to a failure to focus on overall affordability metrics as measures of goodness. Design solutions are desired which are robust to uncertainty and are based on the maximum knowledge possible. The new method represents advances in the two following general areas. 1. Design models and uncertainty. The research performed completes a transition from a deterministic design representation to a probabilistic one through a modeling of design uncertainty at multiple levels of the aircraft design hierarchy, including: (1) Consistent, traceable uncertainty classification and representation; (2) Concise mathematical statement of the Probabilistic Robust Design problem; (3) Variants of the Cumulative Distribution Functions (CDFs) as decision functions for Robust Design; (4) Probabilistic Sensitivities which identify the most influential sources of variability. 2. Multidisciplinary analysis and design. Imbedded in the probabilistic methodology is a new approach for multidisciplinary design analysis and optimization (MDA/O), employing disciplinary analysis approximations formed through statistical experimentation and regression. These approximation models are a function of design variables common to the system level as well as other disciplines. For aircraft, it is proposed that synthesis/sizing is the proper avenue for integrating multiple disciplines. Research hypotheses are translated into a structured method, which is subsequently tested for validity. Specifically, the implementation involves the study of the relaxed static stability technology for a supersonic commercial transport aircraft. The probabilistic robust design method is exercised resulting in a series of robust design solutions based on different interpretations of "robustness". Insightful results are obtained and the ability of the method to expose trends in the design space are noted as a key advantage.

  14. Site-specific seismic probabilistic tsunami hazard analysis: performances and potential applications

    NASA Astrophysics Data System (ADS)

    Tonini, Roberto; Volpe, Manuela; Lorito, Stefano; Selva, Jacopo; Orefice, Simone; Graziani, Laura; Brizuela, Beatriz; Smedile, Alessandra; Romano, Fabrizio; De Martini, Paolo Marco; Maramai, Alessandra; Piatanesi, Alessio; Pantosti, Daniela

    2017-04-01

    Seismic Probabilistic Tsunami Hazard Analysis (SPTHA) provides probabilities to exceed different thresholds of tsunami hazard intensity, at a specific site or region and in a given time span, for tsunamis caused by seismic sources. Results obtained by SPTHA (i.e., probabilistic hazard curves and inundation maps) represent a very important input to risk analyses and land use planning. However, the large variability of source parameters implies the definition of a huge number of potential tsunami scenarios, whose omission could lead to a biased analysis. Moreover, tsunami propagation from source to target requires the use of very expensive numerical simulations. At regional scale, the computational cost can be reduced using assumptions on the tsunami modeling (i.e., neglecting non-linear effects, using coarse topo-bathymetric meshes, empirically extrapolating maximum wave heights on the coast). On the other hand, moving to local scale, a much higher resolution is required and such assumptions drop out, since detailed inundation maps require significantly greater computational resources. In this work we apply a multi-step method to perform a site-specific SPTHA which can be summarized in the following steps: i) to perform a regional hazard assessment to account for both the aleatory and epistemic uncertainties of the seismic source, by combining the use of an event tree and an ensemble modeling technique; ii) to apply a filtering procedure which use a cluster analysis to define a significantly reduced number of representative scenarios contributing to the hazard of a specific target site; iii) to perform high resolution numerical simulations only for these representative scenarios and for a subset of near field sources placed in very shallow waters and/or whose coseismic displacements induce ground uplift or subsidence at the target. The method is applied to three target areas in the Mediterranean located around the cities of Milazzo (Italy), Thessaloniki (Greece) and Siracusa (Italy). The latter target analysis is enriched by the use of local observed tsunami data, both geological and historical. Indeed, tsunami data-sets available for Siracusa are particularly rich with respect to the scarce and heterogeneous data-sets usually available elsewhere. Therefore, they can represent a further valuable source of information to benchmark and strengthen the results of such kind of studies. The work is funded by the Italian Flagship Project RITMARE, the two EC FP7 ASTARTE (Grant agreement 603839) and STREST (Grant agreement 603389) projects, the TSUMAPS-NEAM (Grant agreement ECHO/SUB/2015/718568/PREV26) project and the INGV-DPC Agreement.

  15. Preliminary study of fusion reactor: Solution of Grad-Shafranov equation

    NASA Astrophysics Data System (ADS)

    Setiawan, Y.; Fermi, N.; Su'ud, Z.

    2012-06-01

    Nuclear fusion is a prospective energy source for the future due to the abundance of its fuel, and it can be categorized as a clean energy source. The problem is how to contain very hot plasma, at temperatures of a few hundred million degrees, safely and reliably. Tokamak-type fusion reactors are considered the most prospective concept. To analyze the plasma confinement process and its movement, the Grad-Shafranov equation must be solved. This paper discusses the solution of the Grad-Shafranov equation using Whittaker functions. The formulation is then applied to the ITER design as an example.

  16. A Probabilistic Ontology Development Methodology

    DTIC Science & Technology

    2014-06-01

    Test, and Evaluation; Acquisition; and Planning and Marketing," in Handbook of Systems Engineering and Management: John Wiley & Sons, 2009, pp...Intelligence and knowledge management. However, many real world problems in these disciplines are burdened by incomplete information and other sources...knowledge engineering, Artificial Intelligence and knowledge management.

  17. Information Measures for Multisensor Systems

    DTIC Science & Technology

    2013-12-11

    permuted to generate spectra that were non-physical but preserved the entropy of the source spectra. Another 1000 spectra were constructed to mimic co...Research Laboratory (NRL) has yielded probabilistic models for spectral data that enable the computation of information measures such as entropy and...Chemical sensing Information theory Spectral data Information entropy Information divergence Mass spectrometry Infrared spectroscopy Multisensor

  18. Statistical techniques for sampling and monitoring natural resources

    Treesearch

    Hans T. Schreuder; Richard Ernst; Hugo Ramirez-Maldonado

    2004-01-01

    We present the statistical theory of inventory and monitoring from a probabilistic point of view. We start with the basics and show the interrelationships between designs and estimators illustrating the methods with a small artificial population as well as with a mapped realistic population. For such applications, useful open source software is given in Appendix 4....

  19. Near-source mobile methane emission estimates using EPA Method33a and a novel probabilistic approach as a basis for leak quantification in urban areas

    EPA Science Inventory

    Methane emissions from underground pipeline leaks remain an ongoing issue in the development of accurate methane emission inventories for the natural gas supply chain. Application of mobile methods during routine street surveys would help address this issue, but there are large ...

  20. Stan: A Probabilistic Programming Language for Bayesian Inference and Optimization

    ERIC Educational Resources Information Center

    Gelman, Andrew; Lee, Daniel; Guo, Jiqiang

    2015-01-01

    Stan is a free and open-source C++ program that performs Bayesian inference or optimization for arbitrary user-specified models. It can be called from the command line, R, Python, Matlab, or Julia, and has great promise for fitting large and complex statistical models in many areas of application. We discuss Stan from users' and developers'…

  1. Comparative Probabilistic Assessment of Occupational Pesticide Exposures Based on Regulatory Assessments

    PubMed Central

    Pouzou, Jane G.; Cullen, Alison C.; Yost, Michael G.; Kissel, John C.; Fenske, Richard A.

    2018-01-01

    Implementation of probabilistic analyses in exposure assessment can provide valuable insight into the risks of those at the extremes of population distributions, including more vulnerable or sensitive subgroups. Incorporation of these analyses into current regulatory methods for occupational pesticide exposure is enabled by the exposure data sets and associated data currently used in the risk assessment approach of the Environmental Protection Agency (EPA). Monte Carlo simulations were performed on exposure measurements from the Agricultural Handler Exposure Database and the Pesticide Handler Exposure Database along with data from the Exposure Factors Handbook and other sources to calculate exposure rates for three different neurotoxic compounds (azinphos methyl, acetamiprid, emamectin benzoate) across four pesticide-handling scenarios. Probabilistic estimates of doses were compared with the no observable effect levels used in the EPA occupational risk assessments. Some percentage of workers were predicted to exceed the level of concern for all three compounds: 54% for azinphos methyl, 5% for acetamiprid, and 20% for emamectin benzoate. This finding has implications for pesticide risk assessment and offers an alternative procedure that may be more protective of those at the extremes of exposure than the current approach. PMID:29105804
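
    The sketch below illustrates the shape of such a probabilistic comparison for a single handler scenario: Monte Carlo sampling of exposure inputs, an absorbed daily dose, and the fraction of simulated workers falling below a target margin of exposure. Every distribution and value here (unit exposure, amount handled, absorption fraction, body weight, NOEL, target MOE) is a placeholder, not the AHED/PHED data or the EPA values used in the paper.

```python
import numpy as np

rng = np.random.default_rng(11)
n = 200_000

# Hypothetical inputs for a single handler scenario (illustrative only).
unit_exposure = rng.lognormal(np.log(0.03), 0.9, n)   # mg exposure per lb a.i. handled
amount_handled = rng.triangular(20, 80, 200, n)       # lb a.i. handled per day
dermal_absorption = rng.uniform(0.05, 0.20, n)        # fraction absorbed
body_weight = rng.normal(80, 12, n).clip(45, None)    # kg

# Absorbed daily dose, mg/kg/day.
dose = unit_exposure * amount_handled * dermal_absorption / body_weight

NOEL = 1.0            # mg/kg/day, placeholder no-observable-effect level
TARGET_MOE = 100.0    # margin of exposure considered protective (assumed)

moe = NOEL / dose
frac_exceed = np.mean(moe < TARGET_MOE)
print(f"median dose {np.median(dose):.4f} mg/kg/day; "
      f"{100 * frac_exceed:.1f}% of simulated workers fall below MOE {TARGET_MOE:.0f}")
```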

  2. Enhancing Flood Prediction Reliability Using Bayesian Model Averaging

    NASA Astrophysics Data System (ADS)

    Liu, Z.; Merwade, V.

    2017-12-01

    Uncertainty analysis is an indispensable part of modeling the hydrology and hydrodynamics of non-idealized environmental systems. Compared to relying on the prediction from one model simulation, using an ensemble of predictions that considers uncertainty from different sources is more reliable. In this study, Bayesian model averaging (BMA) is applied to the Black River watershed in Arkansas and Missouri by combining multi-model simulations to obtain reliable deterministic water stage and probabilistic inundation extent predictions. The simulation ensemble is generated from 81 LISFLOOD-FP subgrid model configurations that include uncertainty from channel shape, channel width, channel roughness and discharge. Model simulation outputs are trained with observed water stage data during one flood event, and BMA prediction ability is validated for another flood event. Results from this study indicate that BMA does not always outperform all members in the ensemble, but it provides relatively robust deterministic flood stage predictions across the basin. Station-based BMA (BMA_S) water stage prediction performs better than global-based BMA (BMA_G) prediction, which in turn is superior to the ensemble mean prediction. Additionally, the high-frequency flood inundation extent (probability greater than 60%) in the BMA_G probabilistic map is more accurate than the probabilistic flood inundation extent based on equal weights.
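
    A minimal sketch of the BMA combination step is given below: member weights derived from each member's likelihood against training observations, then a weighted predictive mean and exceedance probability for a new time step. The synthetic data, the common error standard deviation, and the likelihood-based weighting shortcut (in place of the usual EM estimation of weights and variances) are illustrative assumptions, not the LISFLOOD-FP ensemble or the study's calibration.

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(5)

# Synthetic training period: observed stages and 5 ensemble members' predictions.
n_train, n_members = 40, 5
obs = 3.0 + 0.5 * np.sin(np.linspace(0, 6, n_train)) + rng.normal(0, 0.05, n_train)
bias = rng.normal(0, 0.2, n_members)
members = obs[None, :] + bias[:, None] + rng.normal(0, 0.1, (n_members, n_train))

# BMA weights: here a simple likelihood-based weight with a common error sd
# (a shortcut standing in for the full EM algorithm).
sigma = 0.15
loglik = norm.logpdf(obs[None, :], members, sigma).sum(axis=1)
w = np.exp(loglik - loglik.max())
w /= w.sum()

# Predictive distribution for a new time step: mixture of member Gaussians.
new_members = np.array([3.42, 3.55, 3.38, 3.60, 3.47])   # hypothetical forecasts
bma_mean = np.sum(w * new_members)
threshold = 3.5                                           # stage of interest, m
p_exceed = np.sum(w * (1.0 - norm.cdf(threshold, new_members, sigma)))

print("member weights:", np.round(w, 3))
print(f"BMA mean stage = {bma_mean:.2f} m, P(stage > {threshold} m) = {p_exceed:.2f}")
```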

  3. Probabilistic TSUnami Hazard MAPS for the NEAM Region: The TSUMAPS-NEAM Project

    NASA Astrophysics Data System (ADS)

    Basili, R.; Babeyko, A. Y.; Baptista, M. A.; Ben Abdallah, S.; Canals, M.; El Mouraouah, A.; Harbitz, C. B.; Ibenbrahim, A.; Lastras, G.; Lorito, S.; Løvholt, F.; Matias, L. M.; Omira, R.; Papadopoulos, G. A.; Pekcan, O.; Nmiri, A.; Selva, J.; Yalciner, A. C.

    2016-12-01

    As global awareness of tsunami hazard and risk grows, the North-East Atlantic, the Mediterranean, and connected Seas (NEAM) region still lacks a thorough probabilistic tsunami hazard assessment. The TSUMAPS-NEAM project aims to fill this gap in the NEAM region by 1) producing the first region-wide long-term homogenous Probabilistic Tsunami Hazard Assessment (PTHA) from earthquake sources, and by 2) triggering a common tsunami risk management strategy. The specific objectives of the project are tackled by the following four consecutive actions: 1) Conduct a state-of-the-art, standardized, and updatable PTHA with full uncertainty treatment; 2) Review the entire process with international experts; 3) Produce the PTHA database, with documentation of the entire hazard assessment process; and 4) Publicize the results through an awareness raising and education phase, and a capacity building phase. This presentation will illustrate the project layout, summarize its current status of advancement and prospective results, and outline its connections with similar initiatives in the international context. The TSUMAPS-NEAM Project (http://www.tsumaps-neam.eu/) is co-financed by the European Union Civil Protection Mechanism, Agreement Number: ECHO/SUB/2015/718568/PREV26.

  4. SHM-Based Probabilistic Fatigue Life Prediction for Bridges Based on FE Model Updating

    PubMed Central

    Lee, Young-Joo; Cho, Soojin

    2016-01-01

    Fatigue life prediction for a bridge should be based on the current condition of the bridge, and various sources of uncertainty, such as material properties, anticipated vehicle loads and environmental conditions, make the prediction very challenging. This paper presents a new approach for probabilistic fatigue life prediction for bridges using finite element (FE) model updating based on structural health monitoring (SHM) data. Recently, various types of SHM systems have been used to monitor and evaluate the long-term structural performance of bridges. For example, SHM data can be used to estimate the degradation of an in-service bridge, which makes it possible to update the initial FE model. The proposed method consists of three steps: (1) identifying the modal properties of a bridge, such as mode shapes and natural frequencies, based on the ambient vibration under passing vehicles; (2) updating the structural parameters of an initial FE model using the identified modal properties; and (3) predicting the probabilistic fatigue life using the updated FE model. The proposed method is demonstrated by application to a numerical model of a bridge, and the impact of FE model updating on the bridge fatigue life is discussed. PMID:26950125

  5. Probabilistic TSUnami Hazard MAPS for the NEAM Region: The TSUMAPS-NEAM Project

    NASA Astrophysics Data System (ADS)

    Basili, Roberto; Babeyko, Andrey Y.; Hoechner, Andreas; Baptista, Maria Ana; Ben Abdallah, Samir; Canals, Miquel; El Mouraouah, Azelarab; Bonnevie Harbitz, Carl; Ibenbrahim, Aomar; Lastras, Galderic; Lorito, Stefano; Løvholt, Finn; Matias, Luis Manuel; Omira, Rachid; Papadopoulos, Gerassimos A.; Pekcan, Onur; Nmiri, Abdelwaheb; Selva, Jacopo; Yalciner, Ahmet C.; Thio, Hong K.

    2017-04-01

    As global awareness of tsunami hazard and risk grows, the North-East Atlantic, the Mediterranean, and connected Seas (NEAM) region still lacks a thorough probabilistic tsunami hazard assessment. The TSUMAPS-NEAM project aims to fill this gap in the NEAM region by 1) producing the first region-wide long-term homogeneous Probabilistic Tsunami Hazard Assessment (PTHA) from earthquake sources, and by 2) triggering a common tsunami risk management strategy. The specific objectives of the project are tackled by the following four consecutive actions: 1) Conduct a state-of-the-art, standardized, and updatable PTHA with full uncertainty treatment; 2) Review the entire process with international experts; 3) Produce the PTHA database, with documentation of the entire hazard assessment process; and 4) Publicize the results through an awareness-raising and education phase, and a capacity-building phase. This presentation will illustrate the project layout, summarize its current status of advancement including the first preliminary release of the assessment, and outline its connections with similar initiatives in the international context. The TSUMAPS-NEAM Project (http://www.tsumaps-neam.eu/) is co-financed by the European Union Civil Protection Mechanism, Agreement Number: ECHO/SUB/2015/718568/PREV26.

  6. Surrogate modeling of joint flood risk across coastal watersheds

    NASA Astrophysics Data System (ADS)

    Bass, Benjamin; Bedient, Philip

    2018-03-01

    This study discusses the development and performance of a rapid prediction system capable of representing the joint rainfall-runoff and storm surge flood response of tropical cyclones (TCs) for probabilistic risk analysis. Due to the computational demand required for accurately representing storm surge with the high-fidelity ADvanced CIRCulation (ADCIRC) hydrodynamic model and its coupling with additional numerical models to represent rainfall-runoff, a surrogate or statistical model was trained to represent the relationship between hurricane wind- and pressure-field characteristics and their peak joint flood response typically determined from physics-based numerical models. This builds upon past studies that have only evaluated surrogate models for predicting peak surge, and provides the first system capable of probabilistically representing joint flood levels from TCs. The utility of this joint flood prediction system is then demonstrated by improving upon probabilistic TC flood risk products, which currently account for storm surge but do not take into account TC-associated rainfall-runoff. Results demonstrate the source apportionment of rainfall-runoff versus storm surge and highlight that slight increases in flood risk levels may occur due to the interaction between rainfall-runoff and storm surge as compared to the Federal Emergency Management Agency's (FEMA's) current practices.

  7. Comparative Probabilistic Assessment of Occupational Pesticide Exposures Based on Regulatory Assessments.

    PubMed

    Pouzou, Jane G; Cullen, Alison C; Yost, Michael G; Kissel, John C; Fenske, Richard A

    2017-11-06

    Implementation of probabilistic analyses in exposure assessment can provide valuable insight into the risks of those at the extremes of population distributions, including more vulnerable or sensitive subgroups. Incorporation of these analyses into current regulatory methods for occupational pesticide exposure is enabled by the exposure data sets and associated data currently used in the risk assessment approach of the Environmental Protection Agency (EPA). Monte Carlo simulations were performed on exposure measurements from the Agricultural Handler Exposure Database and the Pesticide Handler Exposure Database along with data from the Exposure Factors Handbook and other sources to calculate exposure rates for three different neurotoxic compounds (azinphos methyl, acetamiprid, emamectin benzoate) across four pesticide-handling scenarios. Probabilistic estimates of doses were compared with the no observable effect levels used in the EPA occupational risk assessments. Some percentage of workers were predicted to exceed the level of concern for all three compounds: 54% for azinphos methyl, 5% for acetamiprid, and 20% for emamectin benzoate. This finding has implications for pesticide risk assessment and offers an alternative procedure that may be more protective of those at the extremes of exposure than the current approach. © 2017 Society for Risk Analysis.
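
    The following sketch illustrates the general shape of such a Monte Carlo exposure calculation rather than the study's actual inputs from the Agricultural Handler Exposure Database and the Pesticide Handler Exposure Database: all distribution parameters, the no-observable-effect level, and the target margin of exposure below are assumed placeholders.

    import numpy as np

    rng = np.random.default_rng(42)
    n = 100_000

    # Assumed sampling distributions for the exposure factors (placeholders)
    unit_exposure = rng.lognormal(mean=np.log(0.05), sigma=1.0, size=n)    # mg a.i. per lb handled
    amount_handled = rng.lognormal(mean=np.log(400.0), sigma=0.5, size=n)  # lb a.i. handled per day
    body_weight = rng.normal(80.0, 12.0, size=n).clip(40, None)            # kg

    dose = unit_exposure * amount_handled / body_weight   # mg/kg/day absorbed dose
    noel = 1.0                                             # mg/kg/day, assumed NOEL
    target_moe = 100.0                                     # assumed uncertainty factor

    moe = noel / dose
    frac_exceeding = np.mean(moe < target_moe)             # workers above the level of concern
    print(f"fraction of simulated workers exceeding the level of concern: {frac_exceeding:.1%}")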

  8. Spectral measurements of direct and scattered gamma radiation at a boiling-water reactor site

    NASA Astrophysics Data System (ADS)

    Block, R. C.; Preiss, I. L.; Ryan, R. M.; Vargo, G. J.

    1990-12-01

    Quantitative surveys of direct and scattered gamma radiation emitted from the steam-power conversion systems of a boiling-water reactor and other on-site radiation sources were made using a directionally shielded HPGe gamma spectrometry system. The purpose of this study was to obtain data on the relative contributions and energy distributions of direct and scattered gamma radiation in the site environs. The principal radionuclide of concern in this study is 16N produced by the 16O(n,p) 16N reaction in the reactor coolant. Due to changes in facility operation resulting from the implementation of hydrogen water chemistry (HWC), the amount of 16N transported from the reactor to the main steam system under full power operation is expected to increase by a factor of 1.2 to 5.0. This increase in the 16N source term in the nuclear steam must be considered in the design of new facilities to be constructed on site as well as the evaluation of existing facilities with respect to ALARA (As Low As Reasonably Achievable) dose limits in unrestricted areas. This study consisted of baseline measurements taken under normal BWR chemistry conditions in October 1987 and a corresponding set taken under HWC conditions in July 1988. Ground-level and elevated measurements, corresponding to second-story building height, were obtained. The primary conclusion of this study is that direct radiation from the steam-power conversion system is the predominant source of radiation in the site environs of this reactor and that air scattering (i.e. skyshine) does not appear to be significant.

  9. Activation product transport in fusion reactors. [RAPTOR

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Klein, A.C.

    1983-01-01

    Activated corrosion and neutron sputtering products will enter the coolant and/or tritium breeding material of fusion reactor power plants and experiments and cause personnel access problems. Radiation levels around plant components due to these products will cause difficulties with maintenance and repair operations throughout the plant. Similar problems are experienced around fission reactor systems. The determination of the transport of radioactive corrosion and neutron sputtering products through the system is achieved using the computer code RAPTOR. This code calculates the mass transfer of a number of activation products based on the corrosion and sputtering rates through the system, the deposition and release characteristics of various plant components, the neutron flux spectrum, as well as other plant parameters. RAPTOR assembles a system of first-order linear differential equations into a matrix equation based upon the reactor system parameters. Included in the transfer matrix are the deposition and erosion coefficients, and the decay and activation data for the various plant nodes and radioactive isotopes. A source vector supplies the corrosion and neutron sputtering source rates. This matrix equation is then solved using a matrix operator technique to give the specific activity distribution of each radioactive species throughout the plant. Once the amount of mass transfer is determined, the photon transport due to the radioactive corrosion and sputtering product sources can be evaluated, and dose rates around the plant components of interest as a function of time can be determined. This method has been used to estimate the radiation hazards around a number of fusion reactor system designs.

  10. Fully probabilistic seismic source inversion - Part 2: Modelling errors and station covariances

    NASA Astrophysics Data System (ADS)

    Stähler, Simon C.; Sigloch, Karin

    2016-11-01

    Seismic source inversion, a central task in seismology, is concerned with the estimation of earthquake source parameters and their uncertainties. Estimating uncertainties is particularly challenging because source inversion is a non-linear problem. In a companion paper, Stähler and Sigloch (2014) developed a method of fully Bayesian inference for source parameters, based on measurements of waveform cross-correlation between broadband, teleseismic body-wave observations and their modelled counterparts. This approach yields not only depth and moment tensor estimates but also source time functions. A prerequisite for Bayesian inference is the proper characterisation of the noise afflicting the measurements, a problem we address here. We show that, for realistic broadband body-wave seismograms, the systematic error due to an incomplete physical model affects waveform misfits more strongly than random, ambient background noise. In this situation, the waveform cross-correlation coefficient CC, or rather its decorrelation D = 1 - CC, performs more robustly as a misfit criterion than ℓp norms, more commonly used as sample-by-sample measures of misfit based on distances between individual time samples. From a set of over 900 user-supervised, deterministic earthquake source solutions treated as a quality-controlled reference, we derive the noise distribution on signal decorrelation D = 1 - CC of the broadband seismogram fits between observed and modelled waveforms. The noise on D is found to approximately follow a log-normal distribution, a fortunate fact that readily accommodates the formulation of an empirical likelihood function for D for our multivariate problem. The first and second moments of this multivariate distribution are shown to depend mostly on the signal-to-noise ratio (SNR) of the CC measurements and on the back-azimuthal distances of seismic stations. By identifying and quantifying this likelihood function, we make D and thus waveform cross-correlation measurements usable for fully probabilistic sampling strategies, in source inversion and related applications such as seismic tomography.
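
    A schematic of the likelihood construction described above, assuming a log-normal distribution on the decorrelation D = 1 - CC whose moments depend on station signal-to-noise ratio; the coefficients in lognormal_params are invented stand-ins for the empirically calibrated dependence derived from the reference solution set.

    import numpy as np

    def lognormal_params(snr):
        """Placeholder dependence of the log-normal moments on signal-to-noise ratio."""
        mu = -1.0 - 0.5 * np.log(snr)        # assumption: higher SNR -> smaller expected D
        sigma = 0.6 / np.sqrt(snr)           # assumption: higher SNR -> tighter spread
        return mu, sigma

    def log_likelihood(D_obs, snr):
        """Joint log-likelihood of observed decorrelations at independent stations."""
        mu, sigma = lognormal_params(np.asarray(snr, float))
        d = np.asarray(D_obs, float)
        return np.sum(-np.log(d * sigma * np.sqrt(2 * np.pi))
                      - (np.log(d) - mu) ** 2 / (2 * sigma ** 2))

    # Compare two candidate source models by the decorrelation they leave at 4 stations
    snr = [8.0, 15.0, 5.0, 20.0]
    print(log_likelihood([0.10, 0.05, 0.22, 0.04], snr))   # better-fitting candidate
    print(log_likelihood([0.35, 0.20, 0.50, 0.15], snr))   # poorer-fitting candidate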

  11. Probabilistic data integration and computational complexity

    NASA Astrophysics Data System (ADS)

    Hansen, T. M.; Cordua, K. S.; Mosegaard, K.

    2016-12-01

    Inverse problems in Earth Sciences typically refer to the problem of inferring information about properties of the Earth from observations of geophysical data (the result of nature's solution to the `forward' problem). This problem can be formulated more generally as a problem of `integration of information'. A probabilistic formulation of data integration is in principle simple: If all information available (from e.g. geology, geophysics, remote sensing, chemistry…) can be quantified probabilistically, then different algorithms exist that allow solving the data integration problem either through an analytical description of the combined probability function, or sampling the probability function. In practice, however, probabilistically based data integration may not be easy to apply successfully. This may be related to the use of sampling methods, which are known to be computationally costly. However, another source of computational complexity is related to how the individual types of information are quantified. In one case a data integration problem is demonstrated where the goal is to determine the existence of buried channels in Denmark, based on multiple sources of geo-information. Due to one type of information being too informative (and hence conflicting), this leads to a difficult sampling problem with unrealistic uncertainty. Resolving this conflict prior to data integration leads to an easy data integration problem, with no biases. In another case it is demonstrated how imperfections in the description of the geophysical forward model (related to solving the wave-equation) can lead to a difficult data integration problem, with severe bias in the results. If the modeling error is accounted for, the data integration problem becomes relatively easy, with no apparent biases. Both examples demonstrate that biased information can have a dramatic effect on the computational efficiency of solving a data integration problem and lead to biased results and under-estimation of uncertainty. However, in both examples, one can also analyze the performance of the sampling methods used to solve the data integration problem to indicate the existence of biased information. This can be used actively to avoid biases in the available information and subsequently in the final uncertainty evaluation.

  12. Seismic probabilistic tsunami hazard: from regional to local analysis and use of geological and historical observations

    NASA Astrophysics Data System (ADS)

    Tonini, R.; Lorito, S.; Orefice, S.; Graziani, L.; Brizuela, B.; Smedile, A.; Volpe, M.; Romano, F.; De Martini, P. M.; Maramai, A.; Selva, J.; Piatanesi, A.; Pantosti, D.

    2016-12-01

    Site-specific probabilistic tsunami hazard analyses demand very high computational efforts that are often reduced by introducing approximations on tsunami sources and/or tsunami modeling. On one hand, the large variability of source parameters implies the definition of a huge number of potential tsunami scenarios, whose omission could easily lead to important bias in the analysis. On the other hand, detailed inundation maps computed by tsunami numerical simulations require very long running times. When tsunami effects are calculated at regional scale, a common practice is to propagate tsunami waves in deep waters (up to 50-100 m depth) neglecting non-linear effects and using coarse bathymetric meshes. Then, maximum wave heights on the coast are empirically extrapolated, saving a significant amount of computational time. However, moving to the local scale, such assumptions no longer hold and tsunami modeling requires much greater computational resources. In this work, we perform a local Seismic Probabilistic Tsunami Hazard Analysis (SPTHA) for the 50 km long coastal segment between Augusta and Siracusa, a tourist and commercial area located along the south-eastern coast of Sicily, Italy. The procedure consists of using the outcomes of a regional SPTHA as input for a two-step filtering method to select and substantially reduce the number of scenarios contributing to the specific target area. These selected scenarios are modeled using high resolution topo-bathymetry for producing detailed inundation maps. Results are presented as probabilistic hazard curves and maps, with the goal of analyzing, comparing, and highlighting the different results provided by regional and local hazard assessments. Moreover, the analysis is enriched by the use of locally observed tsunami data, both geological and historical. Indeed, tsunami data-sets available for the selected target areas are particularly rich with respect to the scarce and heterogeneous data-sets usually available elsewhere. Therefore, they can represent valuable benchmarks for testing and strengthening the results of such studies. The work is funded by the Italian Flagship Project RITMARE, the two EC FP7 projects ASTARTE (Grant agreement 603839) and STREST (Grant agreement 603389), and the INGV-DPC Agreement.

  13. REVIEW OF PROPOSED METHODOLOGY FOR A RISK-INFORMED RELAXATION TO ASME SECTION XI APPENDIX G

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dickson, Terry L; Kirk, Mark

    2010-01-01

    The current regulations, as set forth by the United States Nuclear Regulatory Commission (NRC), to ensure that light-water nuclear reactor pressure vessels (RPVs) maintain their structural integrity when subjected to planned normal reactor startup (heat-up) and shut-down (cool-down) transients are specified in Appendix G to 10 CFR Part 50, which incorporates by reference Appendix G to Section XI of the American Society of Mechanical Engineers (ASME) Code. The technical basis for these regulations is now recognized by the technical community as being conservative, and some plants are finding it increasingly difficult to comply with the current regulations. Consequently, the nuclear industry has developed, and submitted to the ASME Code for approval, an alternative risk-informed methodology that reduces the conservatism and is consistent with the methods previously used to develop a risk-informed revision to the regulations for accidental transients such as pressurized thermal shock (PTS). The objective of the alternative methodology is to provide a relaxation to the current regulations which will provide more operational flexibility, particularly for reactor pressure vessels with relatively high irradiation levels and radiation-sensitive materials, while continuing to provide reasonable assurance of adequate protection to public health and safety. The NRC and its contractor at Oak Ridge National Laboratory (ORNL) have recently performed an independent review of the industry-proposed methodology. The NRC/ORNL review consisted of performing probabilistic fracture mechanics (PFM) analyses for a matrix of cool-down and heat-up rates, permuted over various reactor geometries and characteristics, each at multiple levels of embrittlement, including 60 effective full power years (EFPY) and beyond, for various postulated flaw characterizations. The objective of this review is to quantify the risk of a reactor vessel experiencing non-ductile fracture, and possible subsequent failure, over a wide range of normal transient conditions, when the maximum allowable thermal-hydraulic boundary conditions, derived from both the current ASME code and the industry-proposed methodology, are imposed on the inner surface of the reactor vessel. This paper discusses the results of the NRC/ORNL review of the industry proposal including the matrices of PFM analyses, results, insights, and conclusions derived from these analyses.

  14. Future Reactor Neutrino Experiments (RRNOLD)

    NASA Astrophysics Data System (ADS)

    Jaffe, David E.

    The prospects for future reactor neutrino experiments that would use tens of kilotons of liquid scintillator with a ∼ 50 km baseline are discussed. These experiments are generically dubbed "RRNOLD" for Radical Reactor Neutrino Oscillation Liquid scintillator Detector experiment. Such experiments are designed to resolve the neutrino mass hierarchy and make sub-percent measurements of sin²θ12, Δm²32, and Δm²12. RRNOLD would also be sensitive to neutrinos from other sources and have notable sensitivity to proton decay.

  15. A Programmable Liquid Collimator for Both Coded Aperture Adaptive Imaging and Multiplexed Compton Scatter Tomography

    DTIC Science & Technology

    2012-03-01

    environments where a source is either weak or shielded. A vehicle of this type could survey large areas after a nuclear attack or a nuclear reactor accident...to prevent its detection by γ-rays. The best application for unmanned vehicles is the detection of radioactive material after a nuclear reactor ...accident or a nuclear weapon detonation [70]. Whether by a nuclear detonation or a nuclear reactor accident, highly radioactive substances could be dis

  16. Chemical vapor deposition of epitaxial silicon

    DOEpatents

    Berkman, Samuel

    1984-01-01

    A single-chamber continuous chemical vapor deposition (CVD) reactor is described for depositing, for example, epitaxial layers of semiconductor materials continuously on flat substrates. The single-chamber reactor is divided into three separate zones by baffles or tubes that carry chemical source material and a carrier gas in one gas stream and hydrogen gas in the other stream, without interaction, while the wafers are heated to deposition temperature. Diffusion of the two gas streams onto the heated wafers effects the epitaxial deposition in the intermediate zone, and the wafers are cooled in the final zone by coolant gases. A CVD reactor for batch processing that embodies the deposition principles of the continuous reactor is also described.

  17. DANSSino: a pilot version of the DANSS neutrino detector

    NASA Astrophysics Data System (ADS)

    Alekseev, I.; Belov, V.; Brudanin, V.; Danilov, M.; Egorov, V.; Filosofov, D.; Fomina, M.; Hons, Z.; Kobyakin, A.; Medvedev, D.; Mizuk, R.; Novikov, E.; Olshevsky, A.; Rozov, S.; Rumyantseva, N.; Rusinov, V.; Salamatin, A.; Shevchik, Ye.; Shirchenko, M.; Shitov, Yu.; Starostin, A.; Svirida, D.; Tarkovsky, E.; Tikhomirov, I.; Yakushev, E.; Zhitnikov, I.; Zinatulina, D.

    2014-07-01

    DANSSino is a reduced pilot version of a solid-state detector of reactor antineutrinos (to be created within the DANSS project and installed under the industrial 3 GW(th) reactor of the Kalinin Nuclear Power Plant, KNPP). Numerous tests performed at a distance of 11 m from the reactor core demonstrate operability of the chosen design and reveal the main sources of the background. In spite of its small size (20 × 20 × 100 cm³), the pilot detector turned out to be quite sensitive to reactor antineutrinos, detecting about 70 IBD events per day with a signal-to-background ratio of about unity.

  18. Detecting Dark Photons with Reactor Neutrino Experiments

    NASA Astrophysics Data System (ADS)

    Park, H. K.

    2017-08-01

    We propose to search for light U(1) dark photons, A', produced via kinetic mixing with ordinary photons in the Compton-like process γe⁻ → A'e⁻ in a nuclear reactor and detected by their interactions with the material in the active volumes of reactor neutrino experiments. We derive 95% confidence-level upper limits on the A'-γ kinetic mixing parameter ε for dark-photon masses below 1 MeV of ε < 1.3 × 10⁻⁵ and ε < 2.1 × 10⁻⁵, from NEOS and TEXONO experimental data, respectively. This study demonstrates the applicability of nuclear reactors as potential sources of intense fluxes of low-mass dark photons.

  19. PWR Facility Dose Modeling Using MCNP5 and the CADIS/ADVANTG Variance-Reduction Methodology

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Blakeman, Edward D; Peplow, Douglas E.; Wagner, John C

    2007-09-01

    The feasibility of modeling a pressurized-water-reactor (PWR) facility and calculating dose rates at all locations within the containment and adjoining structures using MCNP5 with mesh tallies is presented. Calculations of dose rates resulting from neutron and photon sources from the reactor (operating and shut down for various periods) and the spent fuel pool, as well as for the photon source from the primary coolant loop, were all of interest. Identification of the PWR facility, development of the MCNP-based model and automation of the run process, calculation of the various sources, and development of methods for visually examining mesh tally files and extracting dose rates were all a significant part of the project. Advanced variance reduction, which was required because of the size of the model and the large amount of shielding, was performed via the CADIS/ADVANTG approach. This methodology uses an automatically generated three-dimensional discrete ordinates model to calculate adjoint fluxes from which MCNP weight windows and source bias parameters are generated. Investigative calculations were performed using a simple block model and a simplified full-scale model of the PWR containment, in which the adjoint source was placed in various regions. In general, it was shown that placement of the adjoint source on the periphery of the model provided adequate results for regions reasonably close to the source (e.g., within the containment structure for the reactor source). A modification to the CADIS/ADVANTG methodology was also studied in which a global adjoint source is weighted by the reciprocal of the dose response calculated by an earlier forward discrete ordinates calculation. This method showed improved results over those using the standard CADIS/ADVANTG approach, and its further investigation is recommended for future efforts.

  20. Comparison of Radionuclide Ratios in Atmospheric Nuclear Explosions and Nuclear Releases from Chernobyl and Fukushima seen in Gamma Ray Spectrometry

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Friese, Judah I.; Kephart, Rosara F.; Lucas, Dawn D.

    2013-05-01

    The Comprehensive Nuclear-Test-Ban Treaty (CTBT) provides for remote radionuclide monitoring followed by an On-Site Inspection (OSI) to clarify the nature of a suspect event. An important aspect of radionuclide measurements on site is the discrimination of other potential sources of similar radionuclides, such as reactor accidents or medical isotope production. The Chernobyl and Fukushima nuclear reactor disasters offer two different reactor source term environmental inputs that can be compared against historical measurements of nuclear explosions. The comparison of whole-sample gamma spectrometry measurements from these three events and the analysis of their similarities and differences are presented. This analysis is a step toward confirming what is needed for measurements during an OSI under the auspices of the CTBT.

  1. Calculated performance of a mercury-compressor-jet powered airplane using a nuclear reactor as an energy source

    NASA Technical Reports Server (NTRS)

    Doyle, R B

    1951-01-01

    An analysis was made, at a flight Mach number of 1.5, an altitude of 45,000 feet, and a turbine-inlet temperature of 1460 degrees R, of a mercury compressor-jet powered airplane using a nuclear reactor as an energy source. The calculations covered a range of turbine-exhaust and turbine-inlet pressures and condenser-inlet Mach numbers. For a turbine-inlet pressure of 40 pounds per square inch absolute, a turbine-exhaust pressure of 14 pounds per square inch absolute, and a condenser-inlet Mach number of 0.23, the calculated airplane gross weight required to carry a 20,000 pound payload was 322,000 pounds and the reactor heat release per unit volume was 8.9 kilowatts per cubic inch. These values do not represent optimum operating conditions.

  2. Fukushima Daiichi reactor source term attribution using cesium isotope ratios from contaminated environmental samples

    DOE PAGES

    Snow, Mathew S.; Snyder, Darin C.; Delmore, James E.

    2016-01-18

    Source term attribution of environmental contamination following the Fukushima Daiichi Nuclear Power Plant (FDNPP) disaster is complicated by a large number of possible similar emission source terms (e.g. FDNPP reactor cores 1–3 and spent fuel ponds 1–4). Cesium isotopic analyses can be utilized to discriminate between environmental contamination from different FDNPP source terms and, if samples are sufficiently temporally resolved, potentially provide insights into the extent of reactor core damage at a given time. Rice, soil, mushroom, and soybean samples taken 100–250 km from the FDNPP site were dissolved using microwave digestion. Radiocesium was extracted and purified using two sequential ammonium molybdophosphate-polyacrylonitrile columns, following which 135Cs/137Cs isotope ratios were measured using thermal ionization mass spectrometry (TIMS). Results were compared with data reported previously from locations to the northwest of FDNPP and 30 km to the south of FDNPP. 135Cs/137Cs isotope ratios from samples 100–250 km to the southwest of the FDNPP site show a consistent value of 0.376 ± 0.008. 135Cs/137Cs versus 134Cs/137Cs correlation plots suggest that radiocesium to the southwest is derived from a mixture of FDNPP reactor cores 1, 2, and 3. Conclusions from the cesium isotopic data are in agreement with those derived independently based upon the event chronology combined with meteorological conditions at the time of the disaster. In conclusion, cesium isotopic analyses provide a powerful tool for source term discrimination of environmental radiocesium contamination at the FDNPP site. For higher precision source term attribution and forensic determination of the FDNPP core conditions based upon cesium, analyses of a larger number of samples from locations to the north and south of the FDNPP site (particularly time-resolved air filter samples) are needed. Published in 2016. This article is a U.S. Government work and is in the public domain in the USA.

  3. Fukushima Daiichi reactor source term attribution using cesium isotope ratios from contaminated environmental samples

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Snow, Mathew S.; Snyder, Darin C.; Delmore, James E.

    Source term attribution of environmental contamination following the Fukushima Daiichi Nuclear Power Plant (FDNPP) disaster is complicated by a large number of possible similar emission source terms (e.g. FDNPP reactor cores 1–3 and spent fuel ponds 1–4). Cesium isotopic analyses can be utilized to discriminate between environmental contamination from different FDNPP source terms and, if samples are sufficiently temporally resolved, potentially provide insights into the extent of reactor core damage at a given time. Rice, soil, mushroom, and soybean samples taken 100–250 km from the FDNPP site were dissolved using microwave digestion. Radiocesium was extracted and purified using two sequential ammonium molybdophosphate-polyacrylonitrile columns, following which 135Cs/137Cs isotope ratios were measured using thermal ionization mass spectrometry (TIMS). Results were compared with data reported previously from locations to the northwest of FDNPP and 30 km to the south of FDNPP. 135Cs/137Cs isotope ratios from samples 100–250 km to the southwest of the FDNPP site show a consistent value of 0.376 ± 0.008. 135Cs/137Cs versus 134Cs/137Cs correlation plots suggest that radiocesium to the southwest is derived from a mixture of FDNPP reactor cores 1, 2, and 3. Conclusions from the cesium isotopic data are in agreement with those derived independently based upon the event chronology combined with meteorological conditions at the time of the disaster. In conclusion, cesium isotopic analyses provide a powerful tool for source term discrimination of environmental radiocesium contamination at the FDNPP site. For higher precision source term attribution and forensic determination of the FDNPP core conditions based upon cesium, analyses of a larger number of samples from locations to the north and south of the FDNPP site (particularly time-resolved air filter samples) are needed. Published in 2016. This article is a U.S. Government work and is in the public domain in the USA.

  4. Fukushima Daiichi reactor source term attribution using cesium isotope ratios from contaminated environmental samples.

    PubMed

    Snow, Mathew S; Snyder, Darin C; Delmore, James E

    2016-02-28

    Source term attribution of environmental contamination following the Fukushima Daiichi Nuclear Power Plant (FDNPP) disaster is complicated by a large number of possible similar emission source terms (e.g. FDNPP reactor cores 1-3 and spent fuel ponds 1-4). Cesium isotopic analyses can be utilized to discriminate between environmental contamination from different FDNPP source terms and, if samples are sufficiently temporally resolved, potentially provide insights into the extent of reactor core damage at a given time. Rice, soil, mushroom, and soybean samples taken 100-250 km from the FDNPP site were dissolved using microwave digestion. Radiocesium was extracted and purified using two sequential ammonium molybdophosphate-polyacrylonitrile columns, following which (135)Cs/(137)Cs isotope ratios were measured using thermal ionization mass spectrometry (TIMS). Results were compared with data reported previously from locations to the northwest of FDNPP and 30 km to the south of FDNPP. (135)Cs/(137)Cs isotope ratios from samples 100-250 km to the southwest of the FDNPP site show a consistent value of 0.376 ± 0.008. (135)Cs/(137)Cs versus (134)Cs/(137)Cs correlation plots suggest that radiocesium to the southwest is derived from a mixture of FDNPP reactor cores 1, 2, and 3. Conclusions from the cesium isotopic data are in agreement with those derived independently based upon the event chronology combined with meteorological conditions at the time of the disaster. Cesium isotopic analyses provide a powerful tool for source term discrimination of environmental radiocesium contamination at the FDNPP site. For higher precision source term attribution and forensic determination of the FDNPP core conditions based upon cesium, analyses of a larger number of samples from locations to the north and south of the FDNPP site (particularly time-resolved air filter samples) are needed. Published in 2016. This article is a U.S. Government work and is in the public domain in the USA.
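
    The attribution logic rests on simple isotope-ratio mixing arithmetic, sketched below with invented end-member ratios and contribution fractions; only the observed ratio of 0.376 is taken from the abstract, and the core-specific values are illustrative stand-ins rather than the actual FDNPP core inventories.

    import numpy as np

    core_ratio = np.array([0.36, 0.38, 0.40])   # assumed 135Cs/137Cs of cores 1-3 (placeholders)
    frac_137cs = np.array([0.3, 0.5, 0.2])      # assumed share of 137Cs from each core

    # Forward direction: the observed ratio is the 137Cs-weighted mean of the core ratios
    mixed_ratio = np.sum(frac_137cs * core_ratio)
    print(f"predicted 135Cs/137Cs of the mixture: {mixed_ratio:.3f}")

    # Inverse direction (two end-members): fraction of end-member A needed to explain
    # an observed ratio r_obs lying between ratios r_a and r_b
    r_a, r_b, r_obs = 0.36, 0.40, 0.376
    f_a = (r_b - r_obs) / (r_b - r_a)
    print(f"fraction of 137Cs from end-member A: {f_a:.2f}")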

  5. Analysis of a nuclear accident: fission and activation product releases from the Fukushima Daiichi nuclear facility as remote indicators of source identification, extent of release, and state of damaged spent nuclear fuel.

    PubMed

    Schwantes, Jon M; Orton, Christopher R; Clark, Richard A

    2012-08-21

    Researchers evaluated radionuclide measurements of environmental samples taken from the Fukushima Daiichi nuclear facility and reported on the Tokyo Electric Power Co. Website following the 2011 tsunami-initiated catastrophe. This effort identified Units 1 and 3 as the major source of radioactive contamination to the surface soil near the facility. Radionuclide trends identified in the soils suggested that: (1) chemical volatility driven by temperature and reduction potential within the vented reactors' primary containment vessels dictated the extent of release of radiation; (2) all coolant had likely evaporated by the time of venting; and (3) physical migration through the fuel matrix and across the cladding wall were minimally effective at containing volatile species, suggesting damage to fuel bundles was extensive. Plutonium isotopic ratios and their distance from the source indicated that the damaged reactors were the major contributor of plutonium to surface soil at the source, decreasing rapidly with distance from the facility. Two independent evaluations estimated the fraction of the total plutonium inventory released to the environment relative to cesium from venting Units 1 and 3 to be ∼0.002-0.004%. This study suggests significant volatile radionuclides within the spent fuel at the time of venting, but not as yet observed and reported within environmental samples, as potential analytes of concern for future environmental surveys around the site. The majority of the reactor inventories of isotopes of less volatile elements like Pu, Nb, and Sr were likely contained within the damaged reactors during venting.

  6. Nitrogen source effects on the denitrifying anaerobic methane oxidation culture and anaerobic ammonium oxidation bacteria enrichment process.

    PubMed

    Fu, Liang; Ding, Jing; Lu, Yong-Ze; Ding, Zhao-Wei; Zeng, Raymond J

    2017-05-01

    The co-culture system of denitrifying anaerobic methane oxidation (DAMO) and anaerobic ammonium oxidation (Anammox) has potential applications in wastewater treatment plants. This study explored the effects of the permutation and combination of nitrate, nitrite, and ammonium on culture enrichment from freshwater sediments. The co-existence of NO₃⁻, NO₂⁻, and NH₄⁺ shortened the enrichment time from 75 to 30 days and achieved a total nitrogen removal rate of 106.5 mg/L/day on day 132. Even though ammonium addition led to an increase in Anammox bacteria and a higher nitrogen removal rate, DAMO bacteria still dominated in the different reactors, with the highest proportion of 64.7% and a maximum abundance of 3.07 ± 0.25 × 10⁸ copies/L (an increase of five orders of magnitude) in the nitrite reactor. DAMO bacteria showed greater diversity in the nitrate reactor, and one was similar to M. oxyfera; DAMO bacteria in the nitrite reactor were relatively unified and similar to M. sinica. Interestingly, no DAMO archaea were found in the nitrate reactor. This study will improve understanding of the impact of nitrogen source on DAMO and Anammox co-culture enrichment.

  7. 76 FR 52715 - Advisory Committee on Reactor Safeguards (ACRS); Meeting of the ACRS Subcommittee on Regulatory...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-08-23

    ... review Draft Final Regulatory Guide (RG) 1.93, ``Availability of Electric Power Sources,'' Revision 1 and new Draft Final RG 1.218, ``Condition Monitoring Techniques for Electric Cables Used in Nuclear Power... NUCLEAR REGULATORY COMMISSION Advisory Committee on Reactor Safeguards (ACRS); Meeting of the ACRS...

  8. 76 FR 53979 - Advisory Committee on Reactor Safeguards (ACRS); Meeting of the ACRS Subcommittee on Thermal...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-08-30

    ... NUCLEAR REGULATORY COMMISSION Advisory Committee on Reactor Safeguards (ACRS); Meeting of the ACRS Subcommittee on Thermal Hydraulics Phenomena; Notice of Meeting The ACRS Subcommittee on Thermal Hydraulics... Revision 4 to Regulatory Guide 1.82, ``Water Sources for Long-Term Recirculation Cooling Following a Loss...

  9. Reliability and cost evaluation of small isolated power systems containing photovoltaic and wind energy

    NASA Astrophysics Data System (ADS)

    Karki, Rajesh

    Renewable energy application in electric power systems is growing rapidly worldwide due to enhanced public concerns about adverse environmental impacts and escalation in energy costs associated with the use of conventional energy sources. Photovoltaics and wind energy sources are being increasingly recognized as cost-effective generation sources. A comprehensive evaluation of reliability and cost is required to analyze the actual benefits of utilizing these energy sources. The reliability aspects of utilizing renewable energy sources have largely been ignored in the past due to the relatively insignificant contribution of these sources in major power systems, and consequently due to the lack of appropriate techniques. Renewable energy sources have the potential to play a significant role in the electrical energy requirements of small isolated power systems which are primarily supplied by costly diesel fuel. A relatively high renewable energy penetration can significantly reduce the system fuel costs but can also have considerable impact on the system reliability. Small isolated systems routinely plan their generating facilities using deterministic adequacy methods that cannot incorporate the highly erratic behavior of renewable energy sources. The utilization of a single probabilistic risk index has not been generally accepted in small isolated system evaluation despite its use in most large power utilities. Deterministic and probabilistic techniques are combined in this thesis using a system well-being approach to provide useful adequacy indices for small isolated systems that include renewable energy. This thesis presents an evaluation model for small isolated systems containing renewable energy sources by integrating simulation models that generate appropriate atmospheric data, evaluate chronological renewable power outputs and combine total available energy and load to provide useful system indices. A software tool SIPSREL+ has been developed which generates risk, well-being and energy-based indices to provide realistic cost/reliability measures of utilizing renewable energy. The concepts presented and the examples illustrated in this thesis will help system planners to decide on appropriate installation sites, the types and mix of different energy generating sources, the optimum operating policies, and the optimum generation expansion plans required to meet increasing load demands in small isolated power systems containing photovoltaic and wind energy sources.

  10. Objectively combining AR5 instrumental period and paleoclimate climate sensitivity evidence

    NASA Astrophysics Data System (ADS)

    Lewis, Nicholas; Grünwald, Peter

    2018-03-01

    Combining instrumental period evidence regarding equilibrium climate sensitivity with largely independent paleoclimate proxy evidence should enable a more constrained sensitivity estimate to be obtained. Previous, subjective Bayesian approaches involved selection of a prior probability distribution reflecting the investigators' beliefs about climate sensitivity. Here a recently developed approach employing two different statistical methods—objective Bayesian and frequentist likelihood-ratio—is used to combine instrumental period and paleoclimate evidence based on data presented and assessments made in the IPCC Fifth Assessment Report. Probabilistic estimates from each source of evidence are represented by posterior probability density functions (PDFs) of physically-appropriate form that can be uniquely factored into a likelihood function and a noninformative prior distribution. The three-parameter form is shown accurately to fit a wide range of estimated climate sensitivity PDFs. The likelihood functions relating to the probabilistic estimates from the two sources are multiplicatively combined and a prior is derived that is noninformative for inference from the combined evidence. A posterior PDF that incorporates the evidence from both sources is produced using a single-step approach, which avoids the order-dependency that would arise if Bayesian updating were used. Results are compared with an alternative approach using the frequentist signed root likelihood ratio method. Results from these two methods are effectively identical, and provide a 5-95% range for climate sensitivity of 1.1-4.05 K (median 1.87 K).
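
    A toy numerical illustration of the combination step, assuming placeholder likelihoods that are normal in the climate feedback 1/S and a simple uniform prior on the grid; the paper instead derives a noninformative prior formally and uses a three-parameter PDF form, so the numbers printed here are not the paper's results.

    import numpy as np

    S = np.linspace(0.1, 10.0, 2000)                    # climate sensitivity grid (K)
    dS = S[1] - S[0]

    # Placeholder likelihoods: each line of evidence taken as roughly normal in 1/S
    def feedback_likelihood(S, centre, sd):
        return np.exp(-0.5 * ((1.0 / S - centre) / sd) ** 2)

    L_instrumental = feedback_likelihood(S, centre=0.55, sd=0.15)   # assumed shape
    L_paleo = feedback_likelihood(S, centre=0.45, sd=0.20)          # assumed shape

    prior = np.ones_like(S)                             # placeholder prior, not the derived one
    post = L_instrumental * L_paleo * prior             # single-step multiplicative combination
    post /= post.sum() * dS                             # normalize to a density on the grid

    cdf = np.cumsum(post) * dS
    for q in (0.05, 0.5, 0.95):
        print(f"{q:.0%} quantile: {S[np.searchsorted(cdf, q)]:.2f} K")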

  11. Planar seismic source characterization models developed for probabilistic seismic hazard assessment of Istanbul

    NASA Astrophysics Data System (ADS)

    Gülerce, Zeynep; Buğra Soyman, Kadir; Güner, Barış; Kaymakci, Nuretdin

    2017-12-01

    This contribution provides an updated planar seismic source characterization (SSC) model to be used in the probabilistic seismic hazard assessment (PSHA) for Istanbul. It defines planar rupture systems for the four main segments of the North Anatolian fault zone (NAFZ) that are critical for the PSHA of Istanbul: segments covering the rupture zones of the 1999 Kocaeli and Düzce earthquakes, central Marmara, and Ganos/Saros segments. In each rupture system, the source geometry is defined in terms of fault length, fault width, fault plane attitude, and segmentation points. Activity rates and the magnitude recurrence models for each rupture system are established by considering geological and geodetic constraints and are tested based on the observed seismicity that is associated with the rupture system. Uncertainty in the SSC model parameters (e.g., b value, maximum magnitude, slip rate, weights of the rupture scenarios) is considered, whereas the uncertainty in the fault geometry is not included in the logic tree. To acknowledge the effect of earthquakes that are not associated with the defined rupture systems on the hazard, a background zone is introduced and the seismicity rates in the background zone are calculated using a smoothed-seismicity approach. The state-of-the-art SSC model presented here is the first fully documented and ready-to-use fault-based SSC model developed for the PSHA of Istanbul.
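
    One common way to turn geological and geodetic constraints of the kind mentioned above into activity rates is to balance the moment accumulated by the fault's slip rate against the moment released by a characteristic magnitude; the sketch below uses illustrative fault dimensions, slip rate, and magnitude, not the values adopted for the NAFZ segments or the paper's actual recurrence model.

    import numpy as np

    MU = 3.0e10                 # shear modulus (Pa), standard crustal value

    def moment_from_magnitude(Mw):
        """Seismic moment (N*m) from moment magnitude (Hanks & Kanamori relation)."""
        return 10 ** (1.5 * Mw + 9.05)

    def characteristic_rate(length_km, width_km, slip_rate_mm_yr, Mchar):
        """Annual rate of characteristic events that releases the accumulated moment."""
        area = length_km * 1e3 * width_km * 1e3          # rupture area (m^2)
        moment_rate = MU * area * slip_rate_mm_yr * 1e-3 # accumulated moment per year (N*m/yr)
        return moment_rate / moment_from_magnitude(Mchar)

    # Illustrative planar source: 110 km x 15 km, 18 mm/yr slip rate, Mchar 7.4
    rate = characteristic_rate(length_km=110, width_km=15, slip_rate_mm_yr=18, Mchar=7.4)
    print(f"characteristic event rate: {rate:.4f} per year (return period ~{1/rate:.0f} yr)")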

  12. Isotopic signature of atmospheric xenon released from light water reactors.

    PubMed

    Kalinowski, Martin B; Pistner, Christoph

    2006-01-01

    A global monitoring system for atmospheric xenon radioactivity is being established as part of the International Monitoring System to verify compliance with the Comprehensive Nuclear-Test-Ban Treaty (CTBT). The isotopic activity ratios of (135)Xe, (133m)Xe, (133)Xe and (131m)Xe are of interest for distinguishing nuclear explosion sources from civilian releases. Simulations of light water reactor (LWR) fuel burn-up through three operational reactor power cycles are conducted to explore the possible xenon isotopic signature of nuclear reactor releases under different operational conditions. The study examines how changes in the ratios are related to various parameters, including the neutron flux, uranium enrichment, and fuel burn-up. Further, the impact of diffusion and mixing on the variability of the isotopic activity ratios is explored. The simulations are validated with reported reactor emissions. In addition, activity ratios are calculated for xenon isotopes released from nuclear explosions and these are compared to the reactor ratios in order to determine whether the discrimination of explosion releases from reactor effluents is possible based on isotopic activity ratios.

  13. A Dosimetry Assessment for the Core Restraint of an Advanced Gas Cooled Reactor

    NASA Astrophysics Data System (ADS)

    Thornton, D. A.; Allen, D. A.; Tyrrell, R. J.; Meese, T. C.; Huggon, A. P.; Whiley, G. S.; Mossop, J. R.

    2009-08-01

    This paper describes calculations of neutron damage rates within the core restraint structures of Advanced Gas Cooled Reactors (AGRs). Using advanced features of the Monte Carlo radiation transport code MCBEND, and neutron source data from core follow calculations performed with the reactor physics code PANTHER, a detailed model of the reactor cores of two of British Energy's AGR power plants has been developed for this purpose. Because there are no relevant neutron fluence measurements directly supporting this assessment, results of benchmark comparisons and successful validation of MCBEND for Magnox reactors have been used to estimate systematic and random uncertainties on the predictions. In particular, it has been necessary to address the known under-prediction of lower energy fast neutron responses associated with the penetration of large thicknesses of graphite.

  14. Development of a reactor with carbon catalysts for modular-scale, low-cost electrochemical generation of H₂O₂

    DOE PAGES

    Chen, Zhihua; Chen, Shucheng; Siahrostami, Samira; ...

    2017-03-01

    The development of small-scale, decentralized reactors for H₂O₂ production that can couple to renewable energy sources would be of great benefit, particularly for water purification in the developing world. Herein, we describe our efforts to develop electrochemical reactors for H₂O₂ generation with high Faradaic efficiencies of >90%, requiring cell voltages of only ~1.6 V. The reactor employs a carbon-based catalyst that demonstrates excellent performance for H₂O₂ production under alkaline conditions, as demonstrated by fundamental studies involving rotating-ring disk electrode methods. Finally, the low-cost, membrane-free reactor design represents a step towards a continuous, modular-scale, decentralized production of H₂O₂.

  15. Palbociclib in hormone receptor positive advanced breast cancer: A cost-utility analysis.

    PubMed

    Raphael, J; Helou, J; Pritchard, K I; Naimark, D M

    2017-11-01

    The addition of palbociclib to letrozole improves progression-free survival in the first-line treatment of hormone receptor positive advanced breast cancer (ABC). This study assesses the cost-utility of palbociclib from the Canadian healthcare payer perspective. A probabilistic discrete event simulation (DES) model was developed and parameterised with data from the PALOMA 1 and 2 trials and other sources. The incremental cost per quality-adjusted life-month (QALM) gained for palbociclib was calculated. A time horizon of 15 years was used in the base case with costs and effectiveness discounted at 5% annually. Time-to-progression and time-to-death were derived from Weibull and exponential distributions, respectively. Expected costs were based on Ontario fees and other sources. Probabilistic sensitivity analyses were conducted to account for parameter uncertainty. Compared to letrozole, the addition of palbociclib provided an additional 14.7 QALM at an incremental cost of $161,508. The resulting incremental cost-effectiveness ratio was $10,999/QALM gained. Assuming a willingness-to-pay (WTP) of $4167/QALM, the probability that palbociclib would be cost-effective was 0%. Cost-effectiveness acceptability curves derived from a probabilistic sensitivity analysis showed that at a WTP of $11,000/QALM gained, the probability that palbociclib would be cost-effective was 50%. The addition of palbociclib to letrozole is unlikely to be cost-effective for the treatment of ABC from a Canadian healthcare perspective at its current price. While ABC patients derive a meaningful clinical benefit from palbociclib, consideration should be given to increasing the WTP threshold and reducing the drug price to render this strategy more affordable. Copyright © 2017 Elsevier Ltd. All rights reserved.
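
    The headline ratio follows from simple division of the reported incremental cost by the incremental effectiveness (about $11,000 per QALM, matching the stated value up to rounding of the model outputs), and the acceptability-curve statements can be reproduced qualitatively with a toy probabilistic sensitivity analysis; the normal sampling distributions below are placeholders for the study's discrete event simulation outputs.

    import numpy as np

    delta_cost, delta_qalm = 161_508.0, 14.7
    print(f"ICER = {delta_cost / delta_qalm:,.0f} $/QALM gained")

    # Toy probabilistic sensitivity analysis via net monetary benefit
    rng = np.random.default_rng(7)
    dc = rng.normal(delta_cost, 15_000, 10_000)     # assumed uncertainty on incremental cost
    de = rng.normal(delta_qalm, 2.0, 10_000)        # assumed uncertainty on incremental QALMs
    for wtp in (4_167, 11_000, 20_000):
        prob_ce = np.mean(wtp * de - dc > 0)        # P(net monetary benefit > 0)
        print(f"P(cost-effective) at WTP ${wtp}/QALM: {prob_ce:.2f}")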

  16. Associations of Topics of Discussion on Twitter With Survey Measures of Attitudes, Knowledge, and Behaviors Related to Zika: Probabilistic Study in the United States.

    PubMed

    Farhadloo, Mohsen; Winneg, Kenneth; Chan, Man-Pui Sally; Hall Jamieson, Kathleen; Albarracin, Dolores

    2018-02-09

    Recent outbreaks of Zika virus around the world led to increased discussions about this issue on social media platforms such as Twitter. These discussions may provide useful information about attitudes, knowledge, and behaviors of the population regarding issues that are important for public policy. We sought to identify the associations between the topics of discussion on Twitter and survey measures of Zika-related attitudes, knowledge, and behaviors, not solely based upon the volume of such discussions but by analyzing the content of conversations using probabilistic techniques. Using probabilistic topic modeling with US county and week as the unit of analysis, we analyzed the content of Twitter online communications to identify topics related to the reported attitudes, knowledge, and behaviors captured in a nationally representative survey (N=33,193) of the US adult population over 33 weeks. Our analyses revealed that topics related to "congress funding for Zika," "microcephaly," "Zika-related travel discussions," "insect repellent," "blood transfusion technology," and "Zika in Miami" were associated with our survey measures of attitudes, knowledge, and behaviors observed over the period of the study. Our results demonstrated that it is possible to uncover topics of discussion from Twitter communications that are associated with the Zika-related attitudes, knowledge, and behaviors of populations over time. Social media data can be used as a complementary source of information alongside traditional data sources to gauge the patterns of attitudes, knowledge, and behaviors in a population. ©Mohsen Farhadloo, Kenneth Winneg, Man-Pui Sally Chan, Kathleen Hall Jamieson, Dolores Albarracin. Originally published in JMIR Public Health and Surveillance (http://publichealth.jmir.org), 09.02.2018.
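
    A hedged sketch of the general workflow, probabilistic topic modeling (here latent Dirichlet allocation via scikit-learn) on documents standing in for county-week tweet collections, followed by a simple correlation with a survey measure; the toy corpus, topic count, and survey numbers are illustrative only and do not reproduce the study's data or its statistical model.

    import numpy as np
    from sklearn.feature_extraction.text import CountVectorizer
    from sklearn.decomposition import LatentDirichletAllocation

    docs = [  # each "document" stands in for the tweets of one county-week
        "congress funding zika emergency bill vote",
        "microcephaly birth defects zika babies study",
        "travel warning zika miami mosquitoes advisory",
        "insect repellent deet mosquito bites protection",
        "blood transfusion screening zika virus technology",
        "miami beach zika local transmission cases",
    ]
    survey_measure = np.array([0.3, 0.7, 0.5, 0.4, 0.2, 0.6])  # placeholder survey values

    X = CountVectorizer(stop_words="english").fit_transform(docs)
    lda = LatentDirichletAllocation(n_components=3, random_state=0).fit(X)
    theta = lda.transform(X)                       # per-document topic proportions

    # Association of each topic's prevalence with the survey measure
    for k in range(theta.shape[1]):
        r = np.corrcoef(theta[:, k], survey_measure)[0, 1]
        print(f"topic {k}: correlation with survey measure = {r:+.2f}")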

  17. Evaluation of probabilistic forecasts with the scoringRules package

    NASA Astrophysics Data System (ADS)

    Jordan, Alexander; Krüger, Fabian; Lerch, Sebastian

    2017-04-01

    Over the last decades probabilistic forecasts in the form of predictive distributions have become popular in many scientific disciplines. With the proliferation of probabilistic models arises the need for decision-theoretically principled tools to evaluate the appropriateness of models and forecasts in a generalized way in order to better understand sources of prediction errors and to improve the models. Proper scoring rules are functions S(F,y) which evaluate the accuracy of a forecast distribution F, given that an outcome y was observed. In coherence with decision-theoretic principles, they allow alternative models to be compared, a crucial ability given the variety of theories, data sources, and statistical specifications that are available in many situations. This contribution presents the software package scoringRules for the statistical programming language R, which provides functions to compute popular scoring rules such as the continuous ranked probability score for a variety of distributions F that come up in applied work. For univariate variables, two main classes are parametric distributions like normal, t, or gamma distributions, and distributions that are not known analytically, but are indirectly described through a sample of simulation draws. For example, ensemble weather forecasts take this form. The scoringRules package aims to be a convenient dictionary-like reference for computing scoring rules. We offer state-of-the-art implementations of several known (but not routinely applied) formulas, and implement closed-form expressions that were previously unavailable. Whenever more than one implementation variant exists, we offer statistically principled default choices. Recent developments include the addition of scoring rules to evaluate multivariate forecast distributions. The use of the scoringRules package is illustrated in an example on post-processing ensemble forecasts of temperature.
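
    For readers unfamiliar with proper scoring rules, the sketch below shows the kind of computation the package provides, written here in Python rather than R: the closed-form continuous ranked probability score (CRPS) for a normal predictive distribution and a sample-based estimate for an ensemble forecast; the function names are illustrative and are not the scoringRules API.

    import numpy as np
    from scipy.stats import norm

    def crps_normal(mu, sigma, y):
        """CRPS of N(mu, sigma^2) for observation y (Gneiting & Raftery closed form)."""
        z = (y - mu) / sigma
        return sigma * (z * (2 * norm.cdf(z) - 1) + 2 * norm.pdf(z) - 1 / np.sqrt(np.pi))

    def crps_ensemble(x, y):
        """Sample CRPS estimate E|X - y| - 0.5 E|X - X'| from ensemble members x."""
        x = np.asarray(x, float)
        return np.mean(np.abs(x - y)) - 0.5 * np.mean(np.abs(x[:, None] - x[None, :]))

    print(crps_normal(mu=2.0, sigma=1.5, y=3.2))
    print(crps_ensemble(np.random.default_rng(0).normal(2.0, 1.5, 50), 3.2))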

  18. Nondeterministic computational fluid dynamics modeling of Escherichia coli inactivation by peracetic acid in municipal wastewater contact tanks.

    PubMed

    Santoro, Domenico; Crapulli, Ferdinando; Raisee, Mehrdad; Raspa, Giuseppe; Haas, Charles N

    2015-06-16

    Wastewater disinfection processes are typically designed according to heuristics derived from batch experiments in which the interaction among wastewater quality, reactor hydraulics, and inactivation kinetics is often neglected. In this paper, a computational fluid dynamics (CFD) study was conducted in a nondeterministic (ND) modeling framework to predict Escherichia coli inactivation by peracetic acid (PAA) in municipal contact tanks fed by secondary settled wastewater effluent. The extent and variability associated with the observed inactivation kinetics were both satisfactorily predicted by the stochastic inactivation model at a 95% confidence level. Moreover, it was found that (a) the process variability induced by reactor hydraulics is negligible when compared with that caused by inactivation kinetics, (b) the PAA dose required for meeting regulations is dictated equally by the fixed limit on microbial concentration and by its probability of occurrence, and (c) neglecting the probability of occurrence during process sizing could lead to an underestimation of the required PAA dose by as much as 100%. Finally, the ND-CFD model was used to generate sizing information in the form of probabilistic disinfection curves relating E. coli inactivation and probability of occurrence with the average PAA dose and PAA residual concentration at the outlet of the contact tank.

  19. New techniques for modeling the reliability of reactor pressure vessels

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Johnson, K.I.; Simonen, F.A.; Liebetrau, A.M.

    1986-01-01

    In recent years several probabilistic fracture mechanics codes, including the VISA code, have been developed to predict the reliability of reactor pressure vessels. This paper describes several new modeling techniques used in a second generation of the VISA code entitled VISA-II. Results are presented that show the sensitivity of vessel reliability predictions to such factors as inservice inspection to detect flaws, random positioning of flaws within the vessel wall thickness, and fluence distributions that vary throughout the vessel. The algorithms used to implement these modeling techniques are also described. Other new options in VISA-II are also described in this paper. The effect of vessel cladding has been included in the heat transfer, stress, and fracture mechanics solutions in VISA-II. The algorithms for simulating flaws have been changed to consider an entire vessel rather than a single flaw in a single weld. The flaw distribution was changed to include the distribution of both flaw depth and length. A menu of several alternate equations has been included to predict the shift in RT_NDT. For flaws that arrest and later re-initiate, an option was also included to allow correlating the current arrest toughness with subsequent initiation toughnesses.

  20. Parametric study of irradiation effects on the ductile damage and flow stress behavior in ferritic-martensitic steels

    NASA Astrophysics Data System (ADS)

    Chakraborty, Pritam; Biner, S. Bulent

    2015-10-01

    Ferritic-martensitic steels are currently being considered as structural materials in fusion and Gen-IV nuclear reactors. These materials are expected to experience high dose radiation, which can increase their ductile to brittle transition temperature and susceptibility to failure during operation. Hence, to estimate the safe operational life of the reactors, precise evaluation of the ductile to brittle transition temperatures of ferritic-martensitic steels is necessary. Owing to the scarcity of irradiated samples, particularly at high dose levels, micro-mechanistic models are being employed to predict the shifts in the ductile to brittle transition temperatures. These models consider the ductile damage evolution, in the form of nucleation, growth and coalescence of voids, and the brittle fracture, in the form of probabilistic cleavage initiation, to estimate the influence of irradiation on the ductile to brittle transition temperature. However, the assessment of irradiation-dependent material parameters is challenging and influences the accuracy of these models. In the present study, the effects of irradiation on the overall flow stress and ductile damage behavior of two ferritic-martensitic steels are parametrically investigated. The results indicate that the ductile damage model parameters are mostly insensitive to irradiation at higher dose levels, though the resulting flow stress behavior varies significantly.

  1. Historical relationship between performance assessment for radioactive waste disposal and other types of risk assessment.

    PubMed

    Rechard, R P

    1999-10-01

    This article describes the evolution of the process for assessing the hazards of a geologic disposal system for radioactive waste and, similarly, nuclear power reactors, and the relationship of this process with other assessments of risk, particularly assessments of hazards from manufactured carcinogenic chemicals during use and disposal. This perspective reviews the common history of scientific concepts for risk assessment developed until the 1950s. Computational tools and techniques developed in the late 1950s and early 1960s to analyze the reliability of nuclear weapon delivery systems were adopted in the early 1970s for probabilistic risk assessment of nuclear power reactors, a technology for which behavior was unknown. In turn, these analyses became an important foundation for performance assessment of nuclear waste disposal in the late 1970s. The evaluation of risk to human health and the environment from chemical hazards is built on methods for assessing the dose response of radionuclides in the 1950s. Despite a shared background, however, societal events, often in the form of legislation, have affected the development path for risk assessment for human health, producing dissimilarities between these risk assessments and those for nuclear facilities. An important difference is the regulator's interest in accounting for uncertainty.

  2. Nitrate removal performance of Diaphorobacter nitroreducens using biodegradable plastics as the source of reducing power

    NASA Astrophysics Data System (ADS)

    Khan, S. T.; Nagao, Y.; Hiraishi, A.

    2015-02-01

    Strain NA10BT and two other strains of the denitrifying betaproteobacterium Diaphorobacter nitroreducens were studied for the performance of solid-phase denitrification (SPD) using poly(3-hydroxybutyrate-co-3-hydroxyvalerate) (PHBV) and some other biodegradable plastics as the source of reducing power in wastewater treatment. Sequencing-batch SPD reactors with these organisms and PHBV granules or flakes as the substrate exhibited good nitrate removal performance. Vial tests using cultures from these parent reactors showed higher nitrate removal rates with PHBV granules (ca. 20 mg NO3⁻-N g⁻¹ [dry wt cells] h⁻¹) than with PHBV pellets and flakes. In continuous-flow SPD reactors using strain NA10BT and PHBV flakes, nitrate was not detected even at a loading rate of 21 mg NO3⁻-N L⁻¹ h⁻¹. This corresponded to a nitrate removal rate of 47 mg NO3⁻-N g⁻¹ (dry wt cells) h⁻¹. In the continuous-flow reactor, the transcription level of the phaZ gene, coding for PHB depolymerase, decreased with time, while that of the nosZ gene, involved in denitrification, was relatively constant. These results suggest that the bioavailability of soluble metabolites as electron donor and carbon sources increases with time in the continuous-flow SPD process, thereby yielding much higher nitrate removal rates than the process with fresh PHBV as the substrate.

  3. Quantum Social Science

    NASA Astrophysics Data System (ADS)

    Haven, Emmanuel; Khrennikov, Andrei

    2013-01-01

    Preface; Part I. Physics Concepts in Social Science? A Discussion: 1. Classical, statistical and quantum mechanics: all in one; 2. Econophysics: statistical physics and social science; 3. Quantum social science: a non-mathematical motivation; Part II. Mathematics and Physics Preliminaries: 4. Vector calculus and other mathematical preliminaries; 5. Basic elements of quantum mechanics; 6. Basic elements of Bohmian mechanics; Part III. Quantum Probabilistic Effects in Psychology: Basic Questions and Answers: 7. A brief overview; 8. Interference effects in psychology - an introduction; 9. A quantum-like model of decision making; Part IV. Other Quantum Probabilistic Effects in Economics, Finance and Brain Sciences: 10. Financial/economic theory in crisis; 11. Bohmian mechanics in finance and economics; 12. The Bohm-Vigier Model and path simulation; 13. Other applications to economic/financial theory; 14. The neurophysiological sources of quantum-like processing in the brain; Conclusion; Glossary; Index.

  4. Smart Sampling and HPC-based Probabilistic Look-ahead Contingency Analysis Implementation and its Evaluation with Real-world Data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chen, Yousu; Etingov, Pavel V.; Ren, Huiying

    This paper describes a probabilistic look-ahead contingency analysis application that incorporates smart sampling and high-performance computing (HPC) techniques. Smart sampling techniques are implemented to effectively represent the structure and statistical characteristics of uncertainty introduced by different sources in the power system. They can significantly reduce the data set size required for multiple look-ahead contingency analyses, and therefore reduce the time required to compute them. HPC techniques are used to further reduce computational time. These two techniques enable a predictive capability that forecasts the impact of various uncertainties on potential transmission limit violations. The developed package has been tested with real-world data from the Bonneville Power Administration. Case study results are presented to demonstrate the performance of the applications developed.
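
    One common smart-sampling technique, Latin hypercube sampling, can be sketched in Python as follows. This illustrates the general idea rather than the specific method used in the application; the uncertainty sources (wind output and load forecast error) and their ranges are assumptions.

      import numpy as np
      from scipy.stats import qmc

      # Latin hypercube sampling covers the uncertainty space far more evenly than
      # the same number of plain random draws, so fewer look-ahead contingency
      # cases need to be evaluated to capture the statistical structure.
      dim, n = 2, 50
      lhs = qmc.LatinHypercube(d=dim, seed=1).random(n)
      scenarios = qmc.scale(lhs, l_bounds=[0.0, -300.0], u_bounds=[1200.0, 300.0])  # MW, assumed ranges

      # Each row (wind MW, load error MW) would parameterize one contingency-analysis
      # run, which can then be distributed across HPC nodes.
      print(scenarios[:3])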

  5. Probabilistic Volcanic Hazard and Risk Assessment

    NASA Astrophysics Data System (ADS)

    Marzocchi, W.; Neri, A.; Newhall, C. G.; Papale, P.

    2007-08-01

    Quantifying Long- and Short-Term Volcanic Hazard: Building Up a Common Strategy for Italian Volcanoes, Erice, Italy, 8 November 2006. The term ``hazard'' can lead to some misunderstanding. In English, hazard has the generic meaning ``potential source of danger,'' but for more than 30 years [e.g., Fournier d'Albe, 1979], hazard has also been used in a more quantitative way, meaning ``the probability of a certain hazardous event in a specific time-space window.'' However, many volcanologists still use ``hazard'' and ``volcanic hazard'' in purely descriptive and subjective ways. A recent meeting held in November 2006 at Erice, Italy, entitled ``Quantifying Long- and Short-Term Volcanic Hazard: Building up a Common Strategy for Italian Volcanoes'' (http://www.bo.ingv.it/erice2006) concluded that a more suitable term for the estimation of quantitative hazard is ``probabilistic volcanic hazard assessment'' (PVHA).

  6. Issues relating to spent nuclear fuel storage on the Oak Ridge Reservation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Klein, J.A.; Turner, D.W.

    1994-12-31

    Currently, about 2,800 metric tons of spent nuclear fuel (SNF) are stored in the US; about 1,000 kg of SNF (roughly 0.03% of the nation's total) are stored at the US Department of Energy (DOE) complex in Oak Ridge, Tennessee. However small the total quantity of material stored at Oak Ridge, some of the material is quite singular in character and, thus, poses unique management concerns. The various types of SNF stored at Oak Ridge will be discussed, including: (1) High-Flux Isotope Reactor (HFIR) and future Advanced Neutron Source (ANS) fuels; (2) Material Testing Reactor (MTR) fuels, including Bulk Shielding Reactor (BSR) and Oak Ridge Research Reactor (ORR) fuels; (3) Molten Salt Reactor Experiment (MSRE) fuel; (4) Homogeneous Reactor Experiment (HRE) fuel; (5) Miscellaneous SNF stored in Oak Ridge National Laboratory's (ORNL's) Solid Waste Storage Areas (SWSAs); (6) SNF stored in the Y-12 Plant 9720-5 Warehouse, including Health Physics Reactor (HPRR), Space Nuclear Auxiliary Power (SNAP)-10A, and DOE Demonstration Reactor fuels.

  7. Prediction Uncertainty and Groundwater Management: Approaches to get the Most out of Probabilistic Outputs

    NASA Astrophysics Data System (ADS)

    Peeters, L. J.; Mallants, D.; Turnadge, C.

    2017-12-01

    Groundwater impact assessments are increasingly being undertaken in a probabilistic framework whereby various sources of uncertainty (model parameters, model structure, boundary conditions, and calibration data) are taken into account. This has resulted in groundwater impact metrics being presented as probability density functions and/or cumulative distribution functions, spatial maps displaying isolines of percentile values for specific metrics, etc. Groundwater management, on the other hand, typically uses single values (i.e., a deterministic framework) to evaluate what decisions are required to protect groundwater resources. For instance, in New South Wales, Australia, a nominal drawdown value of two metres is specified by the NSW Aquifer Interference Policy as a trigger-level threshold. In many cases, when drawdowns induced by groundwater extraction exceed two metres, "make-good" provisions are enacted (such as the surrendering of extraction licenses). The information obtained from a quantitative uncertainty analysis can be used to guide decision making in several ways. Two examples are discussed here: the first would not require modification of existing "deterministic" trigger or guideline values, whereas the second assumes that the regulatory criteria are also expressed in probabilistic terms. The first example is a straightforward interpretation of calculated percentile values for specific impact metrics. The second example goes a step further, as the existing deterministic thresholds do not currently allow for a probabilistic interpretation; e.g., there is no statement that "the probability of exceeding the threshold shall not be larger than 50%". It would indeed be sensible to have a set of thresholds with an associated acceptable probability of exceedance (or probability of not exceeding a threshold) that decreases as the impact increases. We illustrate here how both the prediction uncertainty and management rules can be expressed in a probabilistic framework, using groundwater metrics derived for a highly stressed groundwater system.
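
    A minimal Python sketch of the two usages, assuming a hypothetical ensemble of predicted drawdowns at a receptor bore and an illustrative 20% acceptable probability of exceedance:

      import numpy as np

      # Ensemble predictions of drawdown at a receptor bore, in metres (assumed values).
      drawdown = np.random.default_rng(7).lognormal(mean=0.2, sigma=0.6, size=5000)

      # Example 1: report a percentile of the impact metric against the deterministic 2 m trigger.
      p95 = np.percentile(drawdown, 95)

      # Example 2: express the criterion probabilistically, e.g. "P(drawdown > 2 m) must not exceed 20%".
      p_exceed = np.mean(drawdown > 2.0)

      print(f"95th percentile drawdown: {p95:.2f} m")
      print(f"P(drawdown > 2 m) = {p_exceed:.2%} (acceptable if <= 20%, assumed criterion)")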

  8. Faint Object Detection in Multi-Epoch Observations via Catalog Data Fusion

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Budavári, Tamás; Szalay, Alexander S.; Loredo, Thomas J.

    Astronomy in the time-domain era faces several new challenges. One of them is the efficient use of observations obtained at multiple epochs. The work presented here addresses faint object detection and describes an incremental strategy for separating real objects from artifacts in ongoing surveys. The idea is to produce low-threshold single-epoch catalogs and to accumulate information across epochs. This is in contrast to more conventional strategies based on co-added or stacked images. We adopt a Bayesian approach, addressing object detection by calculating the marginal likelihoods for hypotheses asserting that there is no object or one object in a small image patch containing at most one cataloged source at each epoch. The object-present hypothesis interprets the sources in a patch at different epochs as arising from a genuine object; the no-object hypothesis interprets candidate sources as spurious, arising from noise peaks. We study the detection probability for constant-flux objects in a Gaussian noise setting, comparing results based on single and stacked exposures to results based on a series of single-epoch catalog summaries. Our procedure amounts to generalized cross-matching: it is the product of a factor accounting for the matching of the estimated fluxes of the candidate sources and a factor accounting for the matching of their estimated directions. We find that probabilistic fusion of multi-epoch catalogs can detect sources with similar sensitivity and selectivity compared to stacking. The probabilistic cross-matching framework underlying our approach plays an important role in maintaining detection sensitivity and points toward generalizations that could accommodate variability and complex object structure.
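
    The hypothesis comparison can be illustrated with a toy Python calculation: a constant-flux object hypothesis, with the flux marginalized over a flat prior, versus a noise-only hypothesis, given per-epoch flux estimates with Gaussian uncertainties. This is a simplified stand-in for the paper's marginal-likelihood treatment; the prior, fluxes, and uncertainties below are assumptions.

      import numpy as np
      from scipy.stats import norm

      def log_bayes_factor(fluxes, sigmas, prior_max=50.0, n_grid=2000):
          # H1: one constant-flux object, flux marginalized over a flat prior on (0, prior_max].
          # H0: noise only (true flux 0). All priors and numbers here are assumptions.
          fluxes, sigmas = np.asarray(fluxes, float), np.asarray(sigmas, float)
          f = np.linspace(1e-3, prior_max, n_grid)
          log_like_h1 = np.sum(norm.logpdf(fluxes[:, None], loc=f, scale=sigmas[:, None]), axis=0)
          shift = log_like_h1.max()
          log_z1 = np.log(np.trapz(np.exp(log_like_h1 - shift), f) / prior_max) + shift
          log_z0 = np.sum(norm.logpdf(fluxes, loc=0.0, scale=sigmas))
          return log_z1 - log_z0

      # Several low-significance epochs can together give substantial evidence for a real object.
      print(log_bayes_factor(fluxes=[2.1, 1.7, 2.4], sigmas=[1.0, 1.0, 1.0]))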

  9. A setup for active neutron analysis of the fissile material content in fuel assemblies of nuclear reactors

    NASA Astrophysics Data System (ADS)

    Bushuev, A. V.; Kozhin, A. F.; Aleeva, T. B.; Zubarev, V. N.; Petrova, E. V.; Smirnov, V. E.

    2016-12-01

    An active neutron method for measuring the residual mass of 235U in spent fuel assemblies (FAs) of the IRT MEPhI research reactor is presented. The special measuring stand design and uniform irradiation of the fuel with neutrons along the entire length of the active part of the FA provide high accuracy of determination of the residual 235U content. AmLi neutron sources yield a higher effect/background ratio than other types of sources and do not induce the fission of 238U. The proposed method of transfer of the isotope source in accordance with a given algorithm may be used in experiments where the studied object needs to be irradiated with a uniform fluence.

  10. A Probabilistic Approach to Data Integration in Biomedical Research: The IsBIG Experiments

    ERIC Educational Resources Information Center

    Anand, Vibha

    2010-01-01

    Biomedical research has produced vast amounts of new information in the last decade but has been slow to find its use in clinical applications. Data from disparate sources such as genetic studies and summary data from published literature have been amassed, but there is a significant gap, primarily due to a lack of normative methods, in combining…

  11. Trends Concerning Four Misconceptions in Students' Intuitively-Based Probabilistic Reasoning Sourced in the Heuristic of Representativeness

    ERIC Educational Resources Information Center

    Kustos, Paul Nicholas

    2010-01-01

    Student difficulty in the study of probability arises in intuitively-based misconceptions derived from heuristics. One such heuristic, the one of note for this research study, is that of representativeness, in which an individual informally assesses the probability of an event based on the degree to which the event is similar to the sample from…

  12. A comparison of geospatially modeled fire behavior and fire management utility of three data sources in the southeastern United States

    Treesearch

    LaWen T. Hollingsworth; Laurie L. Kurth; Bernard R. Parresol; Roger D. Ottmar; Susan J. Prichard

    2012-01-01

    Landscape-scale fire behavior analyses are important to inform decisions on resource management projects that meet land management objectives and protect values from adverse consequences of fire. Deterministic and probabilistic geospatial fire behavior analyses are conducted with various modeling systems including FARSITE, FlamMap, FSPro, and Large Fire Simulation...

  13. Probabilistic projections of 21st century climate change over Northern Eurasia

    NASA Astrophysics Data System (ADS)

    Monier, E.; Sokolov, A. P.; Schlosser, C. A.; Scott, J. R.; Gao, X.

    2013-12-01

    We present probabilistic projections of 21st century climate change over Northern Eurasia using the Massachusetts Institute of Technology (MIT) Integrated Global System Model (IGSM), an integrated assessment model that couples an earth system model of intermediate complexity, with a two-dimensional zonal-mean atmosphere, to a human activity model. Regional climate change is obtained by two downscaling methods: a dynamical downscaling, where the IGSM is linked to a three-dimensional atmospheric model; and a statistical downscaling, where a pattern scaling algorithm uses climate-change patterns from 17 climate models. This framework allows for key sources of uncertainty in future projections of regional climate change to be accounted for: emissions projections; climate system parameters (climate sensitivity, strength of aerosol forcing and ocean heat uptake rate); natural variability; and structural uncertainty. Results show that the choice of climate policy and the climate parameters are the largest drivers of uncertainty. We also find that different initial conditions lead to differences in patterns of change as large as when using different climate models. Finally, this analysis reveals the wide range of possible climate change over Northern Eurasia, emphasizing the need to consider all sources of uncertainty when modeling climate impacts over Northern Eurasia.

  14. Probabilistic projections of 21st century climate change over Northern Eurasia

    NASA Astrophysics Data System (ADS)

    Monier, Erwan; Sokolov, Andrei; Schlosser, Adam; Scott, Jeffery; Gao, Xiang

    2013-12-01

    We present probabilistic projections of 21st century climate change over Northern Eurasia using the Massachusetts Institute of Technology (MIT) Integrated Global System Model (IGSM), an integrated assessment model that couples an Earth system model of intermediate complexity with a two-dimensional zonal-mean atmosphere to a human activity model. Regional climate change is obtained by two downscaling methods: a dynamical downscaling, where the IGSM is linked to a three-dimensional atmospheric model, and a statistical downscaling, where a pattern scaling algorithm uses climate change patterns from 17 climate models. This framework allows for four major sources of uncertainty in future projections of regional climate change to be accounted for: emissions projections, climate system parameters (climate sensitivity, strength of aerosol forcing and ocean heat uptake rate), natural variability, and structural uncertainty. The results show that the choice of climate policy and the climate parameters are the largest drivers of uncertainty. We also find that different initial conditions lead to differences in patterns of change as large as when using different climate models. Finally, this analysis reveals the wide range of possible climate change over Northern Eurasia, emphasizing the need to consider these sources of uncertainty when modeling climate impacts over Northern Eurasia.

  15. A global probabilistic tsunami hazard assessment from earthquake sources

    USGS Publications Warehouse

    Davies, Gareth; Griffin, Jonathan; Lovholt, Finn; Glimsdal, Sylfest; Harbitz, Carl; Thio, Hong Kie; Lorito, Stefano; Basili, Roberto; Selva, Jacopo; Geist, Eric L.; Baptista, Maria Ana

    2017-01-01

    Large tsunamis occur infrequently but have the capacity to cause enormous numbers of casualties, damage to the built environment and critical infrastructure, and economic losses. A sound understanding of tsunami hazard is required to underpin management of these risks, and while tsunami hazard assessments are typically conducted at regional or local scales, globally consistent assessments are required to support international disaster risk reduction efforts, and can serve as a reference for local and regional studies. This study presents a global-scale probabilistic tsunami hazard assessment (PTHA), extending previous global-scale assessments based largely on scenario analysis. Only earthquake sources are considered, as they represent about 80% of the recorded damaging tsunami events. Globally extensive estimates of tsunami run-up height are derived at various exceedance rates, and the associated uncertainties are quantified. Epistemic uncertainties in the exceedance rates of large earthquakes often lead to large uncertainties in tsunami run-up. Deviations between modelled tsunami run-up and event observations are quantified, and found to be larger than suggested in previous studies. Accounting for these deviations in PTHA is important, as it leads to a pronounced increase in predicted tsunami run-up for a given exceedance rate.

  16. A Probabilistic Framework for Quantifying Mixed Uncertainties in Cyber Attacker Payoffs

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chatterjee, Samrat; Tipireddy, Ramakrishna; Oster, Matthew R.

    Quantification and propagation of uncertainties in cyber attacker payoffs is a key aspect within multiplayer, stochastic security games. These payoffs may represent penalties or rewards associated with player actions and are subject to various sources of uncertainty, including: (1) cyber-system state, (2) attacker type, (3) choice of player actions, and (4) cyber-system state transitions over time. Past research has primarily focused on representing defender beliefs about attacker payoffs as point utility estimates. More recently, within the physical security domain, attacker payoff uncertainties have been represented as uniform and Gaussian probability distributions and as mathematical intervals. For cyber-systems, probability distributions may help address statistical (aleatory) uncertainties where the defender may assume inherent variability or randomness in the factors contributing to the attacker payoffs. However, systematic (epistemic) uncertainties may exist, where the defender may not have sufficient knowledge or there is insufficient information about the attacker’s payoff generation mechanism. Such epistemic uncertainties are more suitably represented as generalizations of probability boxes. This paper explores the mathematical treatment of such mixed payoff uncertainties. A conditional probabilistic reasoning approach is adopted to organize the dependencies between a cyber-system’s state, attacker type, player actions, and state transitions. This also enables the application of probabilistic theories to propagate various uncertainties in the attacker payoffs. An example implementation of this probabilistic framework and resulting attacker payoff distributions are discussed. A goal of this paper is also to highlight this uncertainty quantification problem space to the cyber security research community and encourage further advancements in this area.
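
    A probability box can be illustrated with a toy Python construction from interval-valued payoff assessments; the intervals below are assumptions, and the construction shown is a simple Dempster-Shafer-style bound on the payoff CDF, not necessarily the generalization used by the authors.

      import numpy as np

      # Interval-valued assessments of an attacker payoff (e.g. from several experts);
      # the values are illustrative assumptions.
      intervals = np.array([[2.0, 5.0], [3.0, 6.5], [1.5, 4.0], [4.0, 7.0], [2.5, 5.5]])

      def p_box(x, intervals):
          # Lower and upper bounds on the payoff CDF at x, built from the interval endpoints.
          lower_cdf = np.mean(intervals[:, 1] <= x)  # all mass pushed as far right as possible
          upper_cdf = np.mean(intervals[:, 0] <= x)  # all mass pushed as far left as possible
          return lower_cdf, upper_cdf

      for x in (3.0, 5.0, 7.0):
          lo, hi = p_box(x, intervals)
          print(f"x={x}: {lo:.2f} <= F(x) <= {hi:.2f}")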

  17. A global empirical system for probabilistic seasonal climate prediction

    NASA Astrophysics Data System (ADS)

    Eden, J. M.; van Oldenborgh, G. J.; Hawkins, E.; Suckling, E. B.

    2015-12-01

    Preparing for episodes with risks of anomalous weather a month to a year ahead is an important challenge for governments, non-governmental organisations, and private companies and is dependent on the availability of reliable forecasts. The majority of operational seasonal forecasts are made using process-based dynamical models, which are complex, computationally challenging and prone to biases. Empirical forecast approaches built on statistical models to represent physical processes offer an alternative to dynamical systems and can provide either a benchmark for comparison or independent supplementary forecasts. Here, we present a simple empirical system based on multiple linear regression for producing probabilistic forecasts of seasonal surface air temperature and precipitation across the globe. The global CO2-equivalent concentration is taken as the primary predictor; subsequent predictors, including large-scale modes of variability in the climate system and local-scale information, are selected on the basis of their physical relationship with the predictand. The focus given to the climate change signal as a source of skill and the probabilistic nature of the forecasts produced constitute a novel approach to global empirical prediction. Hindcasts for the period 1961-2013 are validated against observations using deterministic (correlation of seasonal means) and probabilistic (continuous rank probability skill scores) metrics. Good skill is found in many regions, particularly for surface air temperature and most notably in much of Europe during the spring and summer seasons. For precipitation, skill is generally limited to regions with known El Niño-Southern Oscillation (ENSO) teleconnections. The system is used in a quasi-operational framework to generate empirical seasonal forecasts on a monthly basis.

  18. An empirical system for probabilistic seasonal climate prediction

    NASA Astrophysics Data System (ADS)

    Eden, Jonathan; van Oldenborgh, Geert Jan; Hawkins, Ed; Suckling, Emma

    2016-04-01

    Preparing for episodes with risks of anomalous weather a month to a year ahead is an important challenge for governments, non-governmental organisations, and private companies and is dependent on the availability of reliable forecasts. The majority of operational seasonal forecasts are made using process-based dynamical models, which are complex, computationally challenging and prone to biases. Empirical forecast approaches built on statistical models to represent physical processes offer an alternative to dynamical systems and can provide either a benchmark for comparison or independent supplementary forecasts. Here, we present a simple empirical system based on multiple linear regression for producing probabilistic forecasts of seasonal surface air temperature and precipitation across the globe. The global CO2-equivalent concentration is taken as the primary predictor; subsequent predictors, including large-scale modes of variability in the climate system and local-scale information, are selected on the basis of their physical relationship with the predictand. The focus given to the climate change signal as a source of skill and the probabilistic nature of the forecasts produced constitute a novel approach to global empirical prediction. Hindcasts for the period 1961-2013 are validated against observations using deterministic (correlation of seasonal means) and probabilistic (continuous rank probability skill scores) metrics. Good skill is found in many regions, particularly for surface air temperature and most notably in much of Europe during the spring and summer seasons. For precipitation, skill is generally limited to regions with known El Niño-Southern Oscillation (ENSO) teleconnections. The system is used in a quasi-operational framework to generate empirical seasonal forecasts on a monthly basis.
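
    The regression idea behind such an empirical system can be sketched in Python as follows. This is a minimal stand-in, not the authors' system: it uses synthetic data, a CO2-trend proxy plus a single ENSO-like predictor, and a Gaussian predictive distribution built from the residual spread.

      import numpy as np

      rng = np.random.default_rng(3)

      # Illustrative training data: seasonal-mean temperature anomaly driven by a CO2
      # trend and an ENSO index, plus noise (all values are synthetic assumptions).
      years = np.arange(1961, 2014)
      co2 = 0.02 * (years - 1961)            # stand-in for the CO2-equivalent predictor
      enso = rng.normal(0.0, 1.0, years.size)
      temp = 0.8 * co2 + 0.3 * enso + rng.normal(0.0, 0.2, years.size)

      X = np.column_stack([np.ones_like(co2), co2, enso])
      beta, *_ = np.linalg.lstsq(X, temp, rcond=None)
      resid_sd = np.std(temp - X @ beta, ddof=X.shape[1])

      # Probabilistic forecast for a new season: mean from the regression, spread from
      # the residual standard deviation (a Gaussian predictive distribution).
      x_new = np.array([1.0, 0.02 * (2014 - 1961), 1.5])  # e.g. an El Nino season
      mean = x_new @ beta
      print(f"forecast: N({mean:.2f}, {resid_sd:.2f}^2)")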

  19. Impact of non-ionic surfactant on the long-term development of lab-scale-activated sludge bacterial communities.

    PubMed

    Lozada, Mariana; Basile, Laura; Erijman, Leonardo

    2007-01-01

    The development of bacterial communities in replicate lab-scale-activated sludge reactors degrading a non-ionic surfactant was evaluated by statistical analysis of denaturing gradient gel electrophoresis (DGGE) fingerprints. Four sequential batch reactors were fed with synthetic sewage, two of which additionally received 0.01% nonylphenol ethoxylates (NPE). The dynamic character of bacterial community structure was confirmed by the differences in species composition among replicate reactors. Similarities between reactors were measured by pairwise similarity analysis using the Bray-Curtis coefficient. The group of NPE-amended reactors exhibited the highest similarity values (Sjk = 0.53 ± 0.03), indicating that the bacterial community structure of NPE-amended reactors was better replicated than that of control reactors (Sjk = 0.36 ± 0.04). Replicate NPE-amended reactors sampled at different times of operation clustered together, whereas analogous relations within the control reactor cluster were not observed. The DGGE patterns of isolates grown in conditioned media prepared with medium taken at the end of the aeration cycle grouped separately from those in other conditioned and synthetic media, regardless of the carbon source amendment, suggesting that NPE degradation residuals could have a role in shaping the community structure.
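
    For reference, the Bray-Curtis similarity used to compare fingerprints can be computed directly from band-intensity profiles; the profiles in this Python sketch are hypothetical.

      import numpy as np

      def bray_curtis_similarity(x, y):
          # Bray-Curtis similarity between two DGGE band-intensity profiles:
          # 2 * sum(min(x_i, y_i)) / (sum(x_i) + sum(y_i)).
          x, y = np.asarray(x, float), np.asarray(y, float)
          return 2.0 * np.minimum(x, y).sum() / (x.sum() + y.sum())

      # Hypothetical band intensities for two replicate reactors (values assumed).
      reactor_a = [0.0, 3.0, 5.0, 1.0, 0.0, 2.0]
      reactor_b = [1.0, 2.0, 4.0, 0.0, 0.0, 3.0]
      print(f"Sjk = {bray_curtis_similarity(reactor_a, reactor_b):.2f}")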

  20. Response surface method in geotechnical/structural analysis, phase 1

    NASA Astrophysics Data System (ADS)

    Wong, F. S.

    1981-02-01

    In the response surface approach, an approximating function is fit to a long-running computer code based on a limited number of code calculations. The approximating function, called the response surface, is then used to replace the code in subsequent repetitive computations required in a statistical analysis. The procedure of response surface development and the feasibility of the method are shown using a sample problem in slope stability, which is based on data from centrifuge experiments of model soil slopes and involves five random soil parameters. It is shown that a response surface can be constructed based on as few as four code calculations and that the response surface is computationally extremely efficient compared to the code calculation. Potential applications of this research include probabilistic analysis of dynamic, complex, nonlinear soil/structure systems such as slope stability, liquefaction, and nuclear reactor safety.
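
    A minimal Python sketch of the approach, assuming a stand-in for the long-running code, a handful of design-point runs, and a linear-plus-interaction surface that is then reused in a Monte Carlo analysis:

      import numpy as np

      def expensive_code(x1, x2):
          # Stand-in for a long-running analysis code (e.g. a slope-stability solver);
          # in practice its response is unknown and each call is costly.
          return 1.5 - 0.4 * x1 + 0.2 * x2 + 0.1 * x1 * x2

      # A handful of code calculations at chosen design points.
      design = np.array([[-1, -1], [1, -1], [-1, 1], [1, 1], [0, 0]], float)
      f_runs = np.array([expensive_code(*p) for p in design])

      # Fit a simple response surface (constant, linear, and interaction terms) by least squares.
      A = np.column_stack([np.ones(len(design)), design[:, 0], design[:, 1], design[:, 0] * design[:, 1]])
      coef, *_ = np.linalg.lstsq(A, f_runs, rcond=None)

      # The cheap surrogate now replaces the code in the repetitive probabilistic analysis.
      rng = np.random.default_rng(0)
      s = rng.normal(0.0, 0.5, size=(100000, 2))            # two random soil parameters (assumed)
      fos = coef[0] + coef[1] * s[:, 0] + coef[2] * s[:, 1] + coef[3] * s[:, 0] * s[:, 1]
      print("P(factor of safety < 1) ≈", np.mean(fos < 1.0))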
