Sample records for nuclear parameter library

  1. RIPL - Reference Input Parameter Library for Calculation of Nuclear Reactions and Nuclear Data Evaluations

    NASA Astrophysics Data System (ADS)

    Capote, R.; Herman, M.; Obložinský, P.; Young, P. G.; Goriely, S.; Belgya, T.; Ignatyuk, A. V.; Koning, A. J.; Hilaire, S.; Plujko, V. A.; Avrigeanu, M.; Bersillon, O.; Chadwick, M. B.; Fukahori, T.; Ge, Zhigang; Han, Yinlu; Kailas, S.; Kopecky, J.; Maslov, V. M.; Reffo, G.; Sin, M.; Soukhovitskii, E. Sh.; Talou, P.

    2009-12-01

    We describe the physics and data included in the Reference Input Parameter Library, which is devoted to input parameters needed in calculations of nuclear reactions and nuclear data evaluations. Advanced modelling codes require substantial numerical input, therefore the International Atomic Energy Agency (IAEA) has worked extensively since 1993 on a library of validated nuclear-model input parameters, referred to as the Reference Input Parameter Library (RIPL). A final RIPL coordinated research project (RIPL-3) was brought to a successful conclusion in December 2008, after 15 years of challenging work carried out through three consecutive IAEA projects. The RIPL-3 library was released in January 2009, and is available on the Web through http://www-nds.iaea.org/RIPL-3/. This work and the resulting database are extremely important to theoreticians involved in the development and use of nuclear reaction modelling (ALICE, EMPIRE, GNASH, UNF, TALYS) both for theoretical research and nuclear data evaluations. The numerical data and computer codes included in RIPL-3 are arranged in seven segments: MASSES contains ground-state properties of nuclei for about 9000 nuclei, including three theoretical predictions of masses and the evaluated experimental masses of Audi et al. (2003). DISCRETE LEVELS contains 117 datasets (one for each element) with all known level schemes, electromagnetic and γ-ray decay probabilities available from ENSDF in October 2007. NEUTRON RESONANCES contains average resonance parameters prepared on the basis of the evaluations performed by Ignatyuk and Mughabghab. OPTICAL MODEL contains 495 sets of phenomenological optical model parameters defined in a wide energy range. When there are insufficient experimental data, the evaluator has to resort to either global parameterizations or microscopic approaches. Radial density distributions to be used as input for microscopic calculations are stored in the MASSES segment. 
LEVEL DENSITIES contains phenomenological parameterizations based on the modified Fermi gas and superfluid models and microscopic calculations which are based on a realistic microscopic single-particle level scheme. Partial level densities formulae are also recommended. All tabulated total level densities are consistent with both the recommended average neutron resonance parameters and discrete levels. GAMMA contains parameters that quantify giant resonances, experimental gamma-ray strength functions and methods for calculating gamma emission in statistical model codes. The experimental GDR parameters are represented by Lorentzian fits to the photo-absorption cross sections for 102 nuclides ranging from 51V to 239Pu. FISSION includes global prescriptions for fission barriers and nuclear level densities at fission saddle points based on microscopic HFB calculations constrained by experimental fission cross sections.
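    The Lorentzian representation of the GDR photo-absorption cross sections mentioned above can be sketched as follows; the parameter values are illustrative only, not taken from RIPL:

```python
def lorentzian_sigma(e, e0, gamma0, sigma0):
    """Standard Lorentzian photo-absorption cross section (mb).

    e      : photon energy (MeV)
    e0     : GDR peak energy (MeV)
    gamma0 : GDR width (MeV)
    sigma0 : peak cross section (mb)
    """
    return sigma0 * (e * gamma0) ** 2 / ((e ** 2 - e0 ** 2) ** 2 + (e * gamma0) ** 2)

# At e = e0 the expression reduces to sigma0, a quick sanity check when fitting.
peak = lorentzian_sigma(14.0, 14.0, 4.5, 320.0)
```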

  2. RIPL - Reference Input Parameter Library for Calculation of Nuclear Reactions and Nuclear Data Evaluations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Capote, R.; Herman, M.; Oblozinsky, P.

    We describe the physics and data included in the Reference Input Parameter Library, which is devoted to input parameters needed in calculations of nuclear reactions and nuclear data evaluations. Advanced modelling codes require substantial numerical input, therefore the International Atomic Energy Agency (IAEA) has worked extensively since 1993 on a library of validated nuclear-model input parameters, referred to as the Reference Input Parameter Library (RIPL). A final RIPL coordinated research project (RIPL-3) was brought to a successful conclusion in December 2008, after 15 years of challenging work carried out through three consecutive IAEA projects. The RIPL-3 library was released in January 2009, and is available on the Web through http://www-nds.iaea.org/RIPL-3/. This work and the resulting database are extremely important to theoreticians involved in the development and use of nuclear reaction modelling (ALICE, EMPIRE, GNASH, UNF, TALYS) both for theoretical research and nuclear data evaluations. The numerical data and computer codes included in RIPL-3 are arranged in seven segments: MASSES contains ground-state properties of nuclei for about 9000 nuclei, including three theoretical predictions of masses and the evaluated experimental masses of Audi et al. (2003). DISCRETE LEVELS contains 117 datasets (one for each element) with all known level schemes, electromagnetic and γ-ray decay probabilities available from ENSDF in October 2007. NEUTRON RESONANCES contains average resonance parameters prepared on the basis of the evaluations performed by Ignatyuk and Mughabghab. OPTICAL MODEL contains 495 sets of phenomenological optical model parameters defined in a wide energy range. When there are insufficient experimental data, the evaluator has to resort to either global parameterizations or microscopic approaches. Radial density distributions to be used as input for microscopic calculations are stored in the MASSES segment. 
LEVEL DENSITIES contains phenomenological parameterizations based on the modified Fermi gas and superfluid models and microscopic calculations which are based on a realistic microscopic single-particle level scheme. Partial level densities formulae are also recommended. All tabulated total level densities are consistent with both the recommended average neutron resonance parameters and discrete levels. GAMMA contains parameters that quantify giant resonances, experimental gamma-ray strength functions and methods for calculating gamma emission in statistical model codes. The experimental GDR parameters are represented by Lorentzian fits to the photo-absorption cross sections for 102 nuclides ranging from 51V to 239Pu. FISSION includes global prescriptions for fission barriers and nuclear level densities at fission saddle points based on microscopic HFB calculations constrained by experimental fission cross sections.

  3. RIPL - Reference Input Parameter Library for Calculation of Nuclear Reactions and Nuclear Data Evaluations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Capote, R.; Herman, M.

    We describe the physics and data included in the Reference Input Parameter Library, which is devoted to input parameters needed in calculations of nuclear reactions and nuclear data evaluations. Advanced modelling codes require substantial numerical input, therefore the International Atomic Energy Agency (IAEA) has worked extensively since 1993 on a library of validated nuclear-model input parameters, referred to as the Reference Input Parameter Library (RIPL). A final RIPL coordinated research project (RIPL-3) was brought to a successful conclusion in December 2008, after 15 years of challenging work carried out through three consecutive IAEA projects. The RIPL-3 library was released in January 2009, and is available on the Web through http://www-nds.iaea.org/RIPL-3/. This work and the resulting database are extremely important to theoreticians involved in the development and use of nuclear reaction modelling (ALICE, EMPIRE, GNASH, UNF, TALYS) both for theoretical research and nuclear data evaluations. The numerical data and computer codes included in RIPL-3 are arranged in seven segments: MASSES contains ground-state properties of nuclei for about 9000 nuclei, including three theoretical predictions of masses and the evaluated experimental masses of Audi et al. (2003). DISCRETE LEVELS contains 117 datasets (one for each element) with all known level schemes, electromagnetic and γ-ray decay probabilities available from ENSDF in October 2007. NEUTRON RESONANCES contains average resonance parameters prepared on the basis of the evaluations performed by Ignatyuk and Mughabghab. OPTICAL MODEL contains 495 sets of phenomenological optical model parameters defined in a wide energy range. When there are insufficient experimental data, the evaluator has to resort to either global parameterizations or microscopic approaches. Radial density distributions to be used as input for microscopic calculations are stored in the MASSES segment. 
LEVEL DENSITIES contains phenomenological parameterizations based on the modified Fermi gas and superfluid models and microscopic calculations which are based on a realistic microscopic single-particle level scheme. Partial level densities formulae are also recommended. All tabulated total level densities are consistent with both the recommended average neutron resonance parameters and discrete levels. GAMMA contains parameters that quantify giant resonances, experimental gamma-ray strength functions and methods for calculating gamma emission in statistical model codes. The experimental GDR parameters are represented by Lorentzian fits to the photo-absorption cross sections for 102 nuclides ranging from 51V to 239Pu. FISSION includes global prescriptions for fission barriers and nuclear level densities at fission saddle points based on microscopic HFB calculations constrained by experimental fission cross sections.

  4. Impact of nuclear data on sodium-cooled fast reactor calculations

    NASA Astrophysics Data System (ADS)

    Aures, Alexander; Bostelmann, Friederike; Zwermann, Winfried; Velkov, Kiril

    2016-03-01

    Neutron transport and depletion calculations are performed in combination with various nuclear data libraries in order to assess the impact of nuclear data on safety-relevant parameters of sodium-cooled fast reactors. These calculations are supplemented by systematic uncertainty analyses with respect to nuclear data. Analysed quantities are the multiplication factor and nuclide densities as a function of burn-up, and the Doppler and Na-void reactivity coefficients at the beginning of the cycle. While ENDF/B-VII.0 and ENDF/B-VII.1 yield rather consistent results, larger discrepancies are observed between the JEFF libraries. While the newest evaluation, JEFF-3.2, agrees with the ENDF/B-VII libraries, the JEFF-3.1.2 library yields significantly larger multiplication factors.
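    As a minimal illustration of how a Na-void reactivity coefficient of the kind analysed above is derived from two multiplication factors; the k-eff values below are hypothetical, not results from the paper:

```python
def reactivity_pcm(k_eff):
    """Reactivity rho = (k - 1)/k, expressed in pcm (1e-5)."""
    return (k_eff - 1.0) / k_eff * 1.0e5

def void_worth_pcm(k_nominal, k_voided):
    """Sodium-void worth: reactivity difference between voided and nominal states."""
    return reactivity_pcm(k_voided) - reactivity_pcm(k_nominal)

# Hypothetical multiplication factors, for illustration only.
worth = void_worth_pcm(1.00000, 1.00500)
```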

  5. Impact of New Nuclear Data Libraries on Small Sized Long Life CANDLE HTGR Design Parameters

    NASA Astrophysics Data System (ADS)

    Liem, Peng Hong; Hartanto, Donny; Tran, Hoai Nam

    2017-01-01

    The impact of new evaluated nuclear data libraries (JENDL-4.0, ENDF/B-VII.0 and JEFF-3.1) on the core characteristics of small-sized long-life CANDLE High Temperature Gas-Cooled Reactors (HTGRs) with uranium and thorium fuel cycles was investigated. The most important parameters of the CANDLE core characteristics investigated here covered (1) the infinite multiplication factor of the fresh fuel containing burnable poison, (2) the effective multiplication factor of the equilibrium core, (3) the moving velocity of the burning region, (4) the attained discharge burnup, and (5) the maximum power density. The reference case was taken from the current JENDL-3.3 results. For the uranium fuel cycle, the impact of the new libraries was small, while a significant impact was found for the thorium fuel cycle. The findings indicate the need for more accurate nuclear data libraries for nuclides involved in the thorium fuel cycle.

  6. Comparison Of A Neutron Kinetics Parameter For A Polyethylene Moderated Highly Enriched Uranium System

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    McKenzie, IV, George Espy; Goda, Joetta Marie; Grove, Travis Justin

    This paper examines the capability of the MCNP® code to calculate kinetics parameters effectively for a thermal system containing highly enriched uranium (HEU). The Rossi-α parameter was chosen for this examination because it is relatively easy to measure as well as easy to calculate using MCNP®'s kopts card. The Rossi-α also incorporates many other parameters of interest in nuclear kinetics, most of which are more difficult to measure precisely. Two nuclear data libraries, ENDF/B-VI (.66c) and ENDF/B-VII (.80c), are compared against the experimental data.
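    A minimal sketch of the Rossi-α relation this kind of comparison rests on, using the common convention α = (ρ − β_eff)/Λ; conventions vary between generation time and prompt lifetime, and the β_eff and Λ values below are illustrative, not the measured HEU ones:

```python
def rossi_alpha(rho, beta_eff, gen_time):
    """Prompt-neutron decay constant alpha = (rho - beta_eff) / Lambda (1/s).

    rho      : reactivity (absolute, dimensionless)
    beta_eff : effective delayed-neutron fraction
    gen_time : neutron generation time Lambda (s)
    """
    return (rho - beta_eff) / gen_time

# At delayed critical (rho = 0), alpha reduces to -beta_eff / Lambda.
alpha_dc = rossi_alpha(0.0, 0.0072, 5.0e-5)
```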

  7. Random sampling and validation of covariance matrices of resonance parameters

    NASA Astrophysics Data System (ADS)

    Plevnik, Lucijan; Žerovnik, Gašper

    2017-09-01

    Analytically exact methods for random sampling of arbitrarily correlated parameters are presented. Emphasis is given, on the one hand, to possible inconsistencies in the covariance data, concentrating on positive semi-definiteness and consistent sampling of correlated, inherently positive parameters, and, on the other hand, to optimization of the implementation of the methods themselves. The methods have been applied in the program ENDSAM, written in Fortran, which, starting from an ENDF-6 format file for a chosen isotope from a nuclear data library, produces an arbitrary number of new ENDF-6 files containing random samples of the resonance parameters (in accordance with the corresponding covariance matrices) in place of the original values. The source code for ENDSAM is available from the OECD/NEA Data Bank. The program works in the following steps: it reads resonance parameters and their covariance data from the nuclear data library, checks whether the covariance data are consistent, and produces random samples of the resonance parameters. The code has been validated with both realistic and artificial data to show that the produced samples are statistically consistent. Additionally, the code was used to validate covariance data in existing nuclear data libraries. A list of inconsistencies observed in the covariance data of resonance parameters in ENDF/B-VII.1, JEFF-3.2 and JENDL-4.0 is presented. For now, the work has been limited to resonance parameters; however, the methods presented are general and can in principle be extended to the sampling and validation of any nuclear data.
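    The core sampling step described above can be sketched with a standard Cholesky factorization. This is a generic illustration, not the ENDSAM implementation: it simply raises an error on a non-positive-definite covariance matrix instead of repairing it, and it does not handle the lognormal transformation used for inherently positive parameters:

```python
import math
import random

def cholesky(a):
    """Lower-triangular Cholesky factor L of a symmetric positive-definite matrix."""
    n = len(a)
    l = [[0.0] * n for _ in range(n)]
    for i in range(n):
        for j in range(i + 1):
            s = sum(l[i][k] * l[j][k] for k in range(j))
            if i == j:
                d = a[i][i] - s
                if d <= 0.0:
                    raise ValueError("covariance matrix is not positive definite")
                l[i][j] = math.sqrt(d)
            else:
                l[i][j] = (a[i][j] - s) / l[j][j]
    return l

def sample_correlated(mean, cov, rng=random):
    """One random sample of correlated parameters: x = mean + L z, z ~ N(0, I)."""
    l = cholesky(cov)
    z = [rng.gauss(0.0, 1.0) for _ in mean]
    return [m + sum(l[i][k] * z[k] for k in range(len(z)))
            for i, m in enumerate(mean)]
```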

  8. Uncertainty analysis on reactivity and discharged inventory for a pressurized water reactor fuel assembly due to 235,238U nuclear data uncertainties

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Da Cruz, D. F.; Rochman, D.; Koning, A. J.

    2012-07-01

    This paper discusses the uncertainty analysis on reactivity and inventory for a typical PWR fuel element as a result of uncertainties in 235,238U nuclear data. A typical Westinghouse 3-loop fuel assembly fuelled with UO2 fuel of 4.8% enrichment has been selected. The Total Monte Carlo (TMC) method has been applied using the deterministic transport code DRAGON. This code allows the generation of few-group nuclear data libraries directly from data contained in the nuclear data evaluation files. The nuclear data used in this study are from the JEFF-3.1 evaluation, and the nuclear data files for 238U and 235U (randomized for the generation of the various DRAGON libraries) are taken from the nuclear data library TENDL. The total uncertainty (obtained by randomizing all 238U and 235U nuclear data in the ENDF files) on the reactor parameters has been split into different components (different nuclear reaction channels). Results show that the TMC method in combination with a deterministic transport code constitutes a powerful tool for performing uncertainty and sensitivity analysis of reactor physics parameters. (authors)
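    The TMC post-processing amounts to simple statistics over the randomized-library runs; a generic sketch, with the quadrature combination of channel-wise components valid under an independence assumption:

```python
import math

def tmc_uncertainty(k_samples):
    """Sample mean and standard deviation of k-eff over TMC random-library runs."""
    n = len(k_samples)
    mean = sum(k_samples) / n
    var = sum((k - mean) ** 2 for k in k_samples) / (n - 1)
    return mean, math.sqrt(var)

def combine_channels(channel_sigmas):
    """Total uncertainty from channel-wise components, assuming independence."""
    return math.sqrt(sum(s * s for s in channel_sigmas))
```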

  9. The U.S. national nuclear forensics library, nuclear materials information program, and data dictionary

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lamont, Stephen Philip; Brisson, Marcia; Curry, Michael

    2011-02-17

    Nuclear forensics assessments to determine material process history require careful comparison of sample data to both measured and modeled nuclear material characteristics. Developing centralized databases, or nuclear forensics libraries, to house this information is an important step to ensure all relevant data will be available for comparison during a nuclear forensics analysis and help expedite the assessment of material history. The approach most widely accepted by the international community at this time is the implementation of National Nuclear Forensics libraries, which would be developed and maintained by individual nations. This is an attractive alternative to an international database since it provides an understanding that each country has data on materials produced and stored within their borders, but eliminates the need to reveal any proprietary or sensitive information to other nations. To support the concept of National Nuclear Forensics libraries, the United States Department of Energy has developed a model library, based on a data dictionary, or set of parameters designed to capture all nuclear forensic relevant information about a nuclear material. Specifically, information includes material identification, collection background and current location, analytical laboratories where measurements were made, material packaging and container descriptions, physical characteristics including mass and dimensions, chemical and isotopic characteristics, particle morphology or metallurgical properties, process history including facilities, and measurement quality assurance information. While not necessarily required, it may also be valuable to store modeled data sets including reactor burn-up or enrichment cascade data for comparison. 
It is fully expected that only a subset of this information is available or relevant to many materials, and much of the data populating a National Nuclear Forensics library would be process analytical or material accountability measurement data as opposed to a complete forensic analysis of each material in the library.

  10. A Computerized Library and Evaluation System for Integral Neutron Experiments.

    ERIC Educational Resources Information Center

    Hampel, Viktor E.; And Others

    A computerized library of references to integral neutron experiments has been developed at the Lawrence Radiation Laboratory at Livermore. This library serves as a data base for the systematic retrieval of documents describing diverse critical and bulk nuclear experiments. The evaluation and reduction of the physical parameters of the experiments…

  11. n+235U resonance parameters and neutron multiplicities in the energy region below 100 eV

    NASA Astrophysics Data System (ADS)

    Pigni, Marco T.; Capote, Roberto; Trkov, Andrej; Pronyaev, Vladimir G.

    2017-09-01

    In August 2016, following the recent effort within the Collaborative International Evaluated Library Organization (CIELO) pilot project to improve the neutron cross sections of 235U, Oak Ridge National Laboratory (ORNL) collaborated with the International Atomic Energy Agency (IAEA) to release a resonance parameter evaluation. This evaluation restores the performance of the evaluated cross sections for the thermal- and above-thermal-solution benchmarks on the basis of newly evaluated thermal neutron constants (TNCs) and thermal prompt fission neutron spectra (PFNS). Performed with support from the US Nuclear Criticality Safety Program (NCSP) in an effort to provide the highest-fidelity general-purpose nuclear database for nuclear criticality applications, the resonance parameter evaluation was submitted as an ENDF-compatible file to be part of the next release of the ENDF/B-VIII.0 nuclear data library. The resonance parameter evaluation methodology used the Reich-Moore approximation of the R-matrix formalism implemented in the code SAMMY to fit the available time-of-flight (TOF) measured data for the n+235U cross sections up to 100 eV. While maintaining reasonably good agreement with the experimental data, the validation analysis focused on restoring the benchmark performance for 235U solutions by combining changes to the resonance parameters and to the prompt neutron multiplicity ν̄.

  12. EMPIRE: A code for nuclear astrophysics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Palumbo, A.

    The nuclear reaction code EMPIRE is presented as a useful tool for nuclear astrophysics. EMPIRE combines a variety of reaction models with a comprehensive library of input parameters, providing a diversity of options for the user. With the exclusion of direct-semidirect capture, all reaction mechanisms relevant to the nuclear astrophysics energy range of interest are implemented in the code. Comparisons to experimental data show consistent agreement for all relevant channels.

  13. Real time method and computer system for identifying radioactive materials from HPGe gamma-ray spectroscopy

    DOEpatents

    Rowland, Mark S.; Howard, Douglas E.; Wong, James L.; Jessup, James L.; Bianchini, Greg M.; Miller, Wayne O.

    2007-10-23

    A real-time method and computer system for identifying radioactive materials which collects gamma count rates from a HPGe gamma-radiation detector to produce a high-resolution gamma-ray energy spectrum. A library of nuclear material definitions ("library definitions") is provided, with each uniquely associated with a nuclide or isotope material and each comprising at least one logic condition associated with a spectral parameter of a gamma-ray energy spectrum. The method determines whether the spectral parameters of said high-resolution gamma-ray energy spectrum satisfy all the logic conditions of any one of the library definitions, and subsequently uniquely identifies the material type as that nuclide or isotope material associated with the satisfied library definition. The method is iteratively repeated to update the spectrum and identification in real time.
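    The library-definition matching described in this record can be sketched as a set of logic conditions (predicates) per nuclide, all of which must be satisfied; the definitions, spectral parameters and thresholds below are hypothetical, not taken from the patent:

```python
# Hypothetical library definitions: each nuclide maps to a list of logic
# conditions on spectral parameters extracted from the gamma-ray spectrum.
LIBRARY_DEFINITIONS = {
    "Cs-137": [
        lambda p: abs(p["peak_kev"] - 661.7) < 1.0,   # photopeak near 661.7 keV
        lambda p: p["net_counts"] > 100,              # enough statistics
    ],
    "Co-60": [
        lambda p: abs(p["peak_kev"] - 1332.5) < 1.0,  # photopeak near 1332.5 keV
        lambda p: p["net_counts"] > 100,
    ],
}

def identify(params):
    """Return the nuclides whose library definitions are fully satisfied."""
    return [nuclide for nuclide, conds in LIBRARY_DEFINITIONS.items()
            if all(cond(params) for cond in conds)]
```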

  14. Validation of tungsten cross sections in the neutron energy region up to 100 keV

    NASA Astrophysics Data System (ADS)

    Pigni, Marco T.; Žerovnik, Gašper; Leal, Luiz C.; Trkov, Andrej

    2017-09-01

    Following a series of recent cross section evaluations on tungsten isotopes performed at Oak Ridge National Laboratory (ORNL), this paper presents the validation work carried out to test the performance of the evaluated cross sections based on lead-slowing-down (LSD) benchmarks conducted in Grenoble. ORNL completed the resonance parameter evaluation of four tungsten isotopes - 182,183,184,186W - in August 2014 and submitted it as an ENDF-compatible file to be part of the next release of the ENDF/B-VIII.0 nuclear data library. The evaluations were performed with support from the US Nuclear Criticality Safety Program in an effort to provide improved tungsten cross section and covariance data for criticality safety sensitivity analyses. The validation analysis based on the LSD benchmarks showed an improved agreement with the experimental response when the ORNL tungsten evaluations were included in the ENDF/B-VII.1 library. Comparison with the results obtained with the JEFF-3.2 nuclear data library are also discussed.

  15. Effects of the Application of the New Nuclear Data Library ENDF/B to the Criticality Analysis of AP1000

    NASA Astrophysics Data System (ADS)

    Kuntoro, Iman; Sembiring, T. M.; Susilo, Jati; Deswandri; Sunaryo, G. R.

    2018-02-01

    Criticality calculations of the AP1000 core with new editions of the nuclear data library, namely ENDF/B-VII and ENDF/B-VII.1, have been performed. This work aims to assess the accuracy of ENDF/B-VII.1 compared to ENDF/B-VII and ENDF/B-VI.8 in determining the criticality parameter of AP1000. The analysis was applied to the core at cold zero power (CZP) conditions. The calculations were carried out with the MCNP computer code in three-dimensional geometry. The results show that the criticality parameter, namely the effective multiplication factor of the AP1000 core, is higher than the one obtained with ENDF/B-VI.8, with relative differences of 0.39% for ENDF/B-VII and 0.34% for ENDF/B-VII.1.

  16. The Decay Data Evaluation Project (DDEP) and the JEFF-3.3 radioactive decay data library: Combining international collaborative efforts on evaluated decay data

    NASA Astrophysics Data System (ADS)

    Kellett, Mark A.; Bersillon, Olivier

    2017-09-01

    The Decay Data Evaluation Project (DDEP) is an international collaboration of decay data evaluators formed with groups from France, Germany, USA, China, Romania, Russia, Spain and the UK, mainly from the metrology community. DDEP members have evaluated over 220 radionuclides, following an agreed-upon methodology, including a peer review. Evaluations include all relevant parameters relating to the nuclear decay and the associated atomic processes. An important output of these evaluations is a set of recommendations for new measurements, which can serve as a basis for future measurement programmes. Recently evaluated radionuclides include: 18F, 59Fe, 82Rb, 82Sr, 88Y, 90Y, 89Zr, 94mTc, 109Cd, 133Ba, 140Ba, 140La, 151Sm and 169Er. The DDEP recommended data have recently been incorporated into the JEFF-3.3 Radioactive Decay Data Library. Other sources of nuclear data include some 900 radionuclides converted from the Evaluated Nuclear Structure Data File (ENSDF), 500 from two UK libraries (UKPADD6.12 and UKHEDD2.6), the IAEA Actinide Decay Data Library, with the remainder converted from the NUBASE evaluation of nuclear properties. Mean decay energies for a number of radionuclides determined from total absorption gamma-ray spectroscopy (TAGS) have also been included, as well as more recent European results from TAGS measurements performed at the University of Jyväskylä by groups from the University of Valencia, Spain and SUBATECH, the University of Nantes, France. The current status of the DDEP collaboration and the JEFF Radioactive Decay Data Library will be presented.
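    A mean decay energy of the kind obtained from TAGS measurements is, in its simplest form, an intensity-weighted sum over the emissions; a sketch with an illustrative (not evaluated) emission line:

```python
def mean_energy_kev(emissions):
    """Mean energy released per decay: sum of energy * intensity over emissions.

    emissions : list of (energy_keV, intensity_per_decay) pairs
    """
    return sum(energy * intensity for energy, intensity in emissions)

# Single illustrative gamma line; real evaluations sum many transitions.
e_mean = mean_energy_kev([(661.7, 0.851)])
```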

  17. Used Nuclear Fuel-Storage, Transportation & Disposal Analysis Resource and Data System (UNF-ST&DARDS)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Banerjee, Kaushik; Clarity, Justin B; Cumberland, Riley M

    This will be licensed via RSICC. A new, integrated data and analysis system has been designed to simplify and automate the performance of accurate and efficient evaluations for characterizing the input to the overall nuclear waste management system: UNF-Storage, Transportation & Disposal Analysis Resource and Data System (UNF-ST&DARDS). A relational database within UNF-ST&DARDS provides a standard means by which UNF-ST&DARDS can succinctly store and retrieve modeling and simulation (M&S) parameters for specific spent nuclear fuel analyses. A library of various analysis model templates provides the ability to communicate each set of M&S parameters to the most appropriate M&S application. Interactive visualization capabilities facilitate data analysis and results interpretation. The current analysis capabilities of UNF-ST&DARDS include (1) assembly-specific depletion and decay, and (2) spent nuclear fuel cask-specific criticality and shielding. Currently, UNF-ST&DARDS uses the SCALE nuclear analysis code system for performing nuclear analysis.

  18. FISPACT-II: An Advanced Simulation System for Activation, Transmutation and Material Modelling

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sublet, J.-Ch.; Eastwood, J.W.; Morgan, J.G.

    Fispact-II is a code system and library database for modelling activation-transmutation processes, depletion-burn-up, time dependent inventory and radiation damage source terms caused by nuclear reactions and decays. The Fispact-II code, written in object-style Fortran, follows the evolution of material irradiated by neutrons, alphas, gammas, protons, or deuterons, and provides a wide range of derived radiological output quantities to satisfy most needs for nuclear applications. It can be used with any ENDF-compliant group library data for nuclear reactions, particle-induced and spontaneous fission yields, and radioactive decay (including but not limited to TENDL-2015, ENDF/B-VII.1, JEFF-3.2, JENDL-4.0u, CENDL-3.1 processed into fine-group-structure files, GEFY-5.2 and UKDD-16), as well as resolved and unresolved resonance range probability tables for self-shielding corrections and updated radiological hazard indices. The code has many novel features including: extension of the energy range up to 1 GeV; additional neutron physics including self-shielding effects, temperature dependence, thin and thick target yields; pathway analysis; and sensitivity and uncertainty quantification and propagation using full covariance data. The latest ENDF libraries such as TENDL encompass thousands of target isotopes. Nuclear data libraries for Fispact-II are prepared from these using processing codes PREPRO, NJOY and CALENDF. These data include resonance parameters, cross sections with covariances, probability tables in the resonance ranges, PKA spectra, kerma, dpa, gas and radionuclide production and energy-dependent fission yields, supplemented with all 27 decay types. All such data for the five most important incident particles are provided in evaluated data tables. The Fispact-II simulation software is described in detail in this paper, together with the nuclear data libraries. 
The Fispact-II system also includes several utility programs for code-use optimisation, visualisation and production of secondary radiological quantities. Included in the paper are summaries of results from the suite of verification and validation reports available with the code.

  19. FISPACT-II: An Advanced Simulation System for Activation, Transmutation and Material Modelling

    NASA Astrophysics Data System (ADS)

    Sublet, J.-Ch.; Eastwood, J. W.; Morgan, J. G.; Gilbert, M. R.; Fleming, M.; Arter, W.

    2017-01-01

    Fispact-II is a code system and library database for modelling activation-transmutation processes, depletion-burn-up, time dependent inventory and radiation damage source terms caused by nuclear reactions and decays. The Fispact-II code, written in object-style Fortran, follows the evolution of material irradiated by neutrons, alphas, gammas, protons, or deuterons, and provides a wide range of derived radiological output quantities to satisfy most needs for nuclear applications. It can be used with any ENDF-compliant group library data for nuclear reactions, particle-induced and spontaneous fission yields, and radioactive decay (including but not limited to TENDL-2015, ENDF/B-VII.1, JEFF-3.2, JENDL-4.0u, CENDL-3.1 processed into fine-group-structure files, GEFY-5.2 and UKDD-16), as well as resolved and unresolved resonance range probability tables for self-shielding corrections and updated radiological hazard indices. The code has many novel features including: extension of the energy range up to 1 GeV; additional neutron physics including self-shielding effects, temperature dependence, thin and thick target yields; pathway analysis; and sensitivity and uncertainty quantification and propagation using full covariance data. The latest ENDF libraries such as TENDL encompass thousands of target isotopes. Nuclear data libraries for Fispact-II are prepared from these using processing codes PREPRO, NJOY and CALENDF. These data include resonance parameters, cross sections with covariances, probability tables in the resonance ranges, PKA spectra, kerma, dpa, gas and radionuclide production and energy-dependent fission yields, supplemented with all 27 decay types. All such data for the five most important incident particles are provided in evaluated data tables. The Fispact-II simulation software is described in detail in this paper, together with the nuclear data libraries. 
The Fispact-II system also includes several utility programs for code-use optimisation, visualisation and production of secondary radiological quantities. Included in the paper are summaries of results from the suite of verification and validation reports available with the code.
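
    The activation-transmutation inventories that Fispact-II follows obey coupled first-order rate equations, dN/dt = A N, whose formal solution is a matrix exponential. A minimal sketch of that structure, using a hypothetical two-nuclide decay chain with an invented rate constant (an illustration of the governing equations, not Fispact-II's actual solver):

```python
import numpy as np

# Toy two-nuclide chain: parent -> daughter (stable). The decay constant is
# invented for illustration; real inventory codes couple thousands of nuclides.
lam = 0.1                        # parent decay constant, 1/s (hypothetical)
A = np.array([[-lam, 0.0],
              [ lam, 0.0]])      # rate matrix: dN/dt = A @ N
N0 = np.array([1.0e6, 0.0])      # initial number of atoms per nuclide

# N(t) = expm(A t) @ N0, computed here via eigen-decomposition of A
t = 20.0                         # seconds
w, V = np.linalg.eig(A)
N = (V @ np.diag(np.exp(w * t)) @ np.linalg.inv(V) @ N0).real

# Sanity checks against the analytic Bateman solution
assert np.isclose(N[0], 1.0e6 * np.exp(-lam * t))   # parent decays exponentially
assert np.isclose(N.sum(), 1.0e6)                   # atoms conserved in a closed chain
```

    With irradiation, neutron-induced production and destruction add flux- and cross-section-dependent terms to the rate matrix, which is where the nuclear data libraries described above enter.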

  20. EMPIRE: Nuclear Reaction Model Code System for Data Evaluation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Herman, M.; Capote, R.; Carlson, B.V.

    EMPIRE is a modular system of nuclear reaction codes, comprising various nuclear models, and designed for calculations over a broad range of energies and incident particles. A projectile can be a neutron, proton, any ion (including heavy ions) or a photon. The energy range extends from the beginning of the unresolved resonance region for neutron-induced reactions (~keV) up to several hundred MeV for heavy-ion induced reactions. The code accounts for the major nuclear reaction mechanisms, including direct, pre-equilibrium and compound nucleus ones. Direct reactions are described by a generalized optical model (ECIS03) or by the simplified coupled-channels approach (CCFUS). The pre-equilibrium mechanism can be treated by a deformation-dependent multi-step direct (ORION + TRISTAN) model, by a NVWY multi-step compound one, by a pre-equilibrium exciton model with cluster emission (PCROSS), or by another with full angular momentum coupling (DEGAS). Finally, the compound nucleus decay is described by the full-featured Hauser-Feshbach model with γ-cascade and width fluctuations. Advanced treatment of the fission channel takes into account transmission through a multiple-humped fission barrier with absorption in the wells. The fission probability is derived in the WKB approximation within the optical model of fission. Several options for nuclear level densities include the EMPIRE-specific approach, which accounts for the effects of the dynamic deformation of a fast rotating nucleus, the classical Gilbert-Cameron approach, and pre-calculated tables obtained with a microscopic model based on HFB single-particle level schemes with collective enhancement. A comprehensive library of input parameters covers nuclear masses, optical model parameters, ground-state deformations, discrete levels and decay schemes, level densities, fission barriers, moments of inertia and γ-ray strength functions.
    The results can be converted into ENDF-6 formatted files using the accompanying code EMPEND and completed with neutron resonances extracted from the existing evaluations. The package contains the full EXFOR (CSISRS) library of experimental reaction data that are automatically retrieved during the calculations. Publication-quality graphs can be obtained using the powerful and flexible plotting package ZVView. The graphical user interface, written in Tcl/Tk, provides for easy operation of the system. This paper describes the capabilities of the code, outlines the physical models and indicates the parameter libraries used by EMPIRE to predict reaction cross sections and spectra, mainly for nucleon-induced reactions. Selected applications of EMPIRE are discussed, the most important being the extensive use of the code in evaluations of neutron reactions for the new US library ENDF/B-VII.0. Future extensions of the system are outlined, including a neutron resonance module as well as capabilities for generating covariances, using both KALMAN and Monte Carlo methods, which are still being advanced and refined.
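
    In the compound-nucleus stage described above, the Hauser-Feshbach model distributes the formation cross section among exit channels in proportion to their transmission coefficients. A schematic sketch with invented numbers (width-fluctuation corrections, which EMPIRE does apply, are ignored here):

```python
# Schematic Hauser-Feshbach branching: sigma(a->b) = sigma_CN * T_b / sum_c T_c.
# The transmission coefficients and formation cross section are invented values.
T = {"n": 0.80, "p": 0.25, "gamma": 0.05}   # hypothetical exit channels
sigma_CN = 1.2                              # barns, hypothetical CN formation

total_T = sum(T.values())
sigma = {ch: sigma_CN * t / total_T for ch, t in T.items()}

# The exit channels exhaust the compound-nucleus formation cross section
assert abs(sum(sigma.values()) - sigma_CN) < 1e-12
```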

  1. Nuclear Data Uncertainty Propagation to Reactivity Coefficients of a Sodium Fast Reactor

    NASA Astrophysics Data System (ADS)

    Herrero, J. J.; Ochoa, R.; Martínez, J. S.; Díez, C. J.; García-Herranz, N.; Cabellos, O.

    2014-04-01

    The assessment of the uncertainty levels on the design and safety parameters for the innovative European Sodium Fast Reactor (ESFR) is mandatory. Some of these relevant safety quantities are the Doppler and void reactivity coefficients, whose uncertainties are quantified here. In addition, the nuclear reaction data whose improvement would most benefit the design accuracy are identified. This work was performed with the SCALE 6.1 code suite and its multigroup cross-section library based on the ENDF/B-VII.0 evaluation.

  2. Three-dimensional Monte Carlo calculation of some nuclear parameters

    NASA Astrophysics Data System (ADS)

    Günay, Mehtap; Şeker, Gökmen

    2017-09-01

    In this study, a fusion-fission hybrid reactor system was designed using 9Cr2WVTa ferritic steel as the structural material and the molten salt-heavy metal mixtures 99-95% Li20Sn80 + 1-5% RG-Pu, 99-95% Li20Sn80 + 1-5% RG-PuF4, and 99-95% Li20Sn80 + 1-5% RG-PuO2 as fluids. The fluids were used in the liquid first wall, blanket and shield zones of the fusion-fission hybrid reactor system. A beryllium (Be) zone with a width of 3 cm was used for neutron multiplication between the liquid first wall and the blanket. This study analyzes nuclear parameters such as the tritium breeding ratio (TBR), energy multiplication factor (M), heat deposition rate and fission reaction rate in the liquid first wall, blanket and shield zones, and investigates the effects of the reactor-grade Pu content on these nuclear parameters in the designed system. Three-dimensional analyses were performed using the Monte Carlo code MCNPX-2.7.0 and the nuclear data library ENDF/B-VII.0.

  3. Covariance generation and uncertainty propagation for thermal and fast neutron induced fission yields

    NASA Astrophysics Data System (ADS)

    Terranova, Nicholas; Serot, Olivier; Archier, Pascal; De Saint Jean, Cyrille; Sumini, Marco

    2017-09-01

    Fission product yields (FY) are fundamental nuclear data for several applications, including decay heat, shielding, dosimetry and burn-up calculations. To be safe and sustainable, modern and future nuclear systems require accurate knowledge of reactor parameters, with reduced margins of uncertainty. Present nuclear data libraries for FY do not provide consistent and complete uncertainty information, which is limited, in many cases, to variances only. In the present work we propose a methodology to evaluate covariance matrices for thermal and fast neutron induced fission yields. The semi-empirical models adopted to evaluate the JEFF-3.1.1 FY library have been used in the Generalized Least Square Method available in CONRAD (COde for Nuclear Reaction Analysis and Data assimilation) to generate covariance matrices for several fissioning systems, such as the thermal fission of U235, Pu239 and Pu241 and the fast fission of U238, Pu239 and Pu240. The impact of such covariances on nuclear applications has been estimated using deterministic and Monte Carlo uncertainty propagation techniques. We studied the effects on decay heat and reactivity loss uncertainty estimation for simplified test-case geometries, such as PWR and SFR pin-cells. The impact on existing nuclear reactors, such as the Jules Horowitz Reactor under construction at CEA-Cadarache, has also been considered.
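
    The covariance-generation step described above rests on the standard generalized least squares (Bayesian) update, in which prior model-parameter covariances are constrained by data through a sensitivity matrix. A minimal numerical sketch with invented parameters, sensitivities and pseudo-data (not the CONRAD implementation):

```python
import numpy as np

# Prior model parameters and covariance (2 parameters, invented values)
x_prior = np.array([1.0, 0.5])
C_prior = np.diag([0.04, 0.01])

# Linearized sensitivities of 3 "measured" yields to the parameters (invented)
S = np.array([[1.0, 0.2],
              [0.5, 1.0],
              [0.3, 0.7]])
y = np.array([1.12, 1.05, 0.66])   # pseudo-experimental values
V = np.diag([0.02, 0.02, 0.02])    # experimental covariance

# GLS update: posterior mean and covariance
K = C_prior @ S.T @ np.linalg.inv(S @ C_prior @ S.T + V)
x_post = x_prior + K @ (y - S @ x_prior)
C_post = C_prior - K @ S @ C_prior

# Data can only reduce (or leave unchanged) the parameter variances,
# and the posterior covariance stays positive definite.
assert np.all(np.diag(C_post) <= np.diag(C_prior) + 1e-12)
assert np.all(np.linalg.eigvalsh(C_post) > 0.0)
```

    Mapping C_post back through the yield model's sensitivities then gives the yield covariance matrix that is propagated to applications.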

  4. KAOS/LIB-V: A library of nuclear response functions generated by KAOS-V code from ENDF/B-V and other data files

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Farawila, Y.; Gohar, Y.; Maynard, C.

    1989-04-01

    KAOS/LIB-V, a library of processed nuclear responses for neutronics analyses of nuclear systems, has been generated. The library was prepared using the KAOS-V code and nuclear data from ENDF/B-V. The library includes kerma (kinetic energy released in materials) factors and other nuclear response functions for all materials presently of interest in fusion and fission applications: 43 nonfissionable and 15 fissionable isotopes and elements. The nuclear response functions include gas production and tritium-breeding functions, and all important reaction cross sections. KAOS/LIB-V employs the VITAMIN-E weighting function and an energy group structure of 174 neutron groups. Auxiliary nuclear data bases, e.g., the Japanese evaluated nuclear data library JENDL-2, were used as a source of isotopic cross sections where these data are not provided in the ENDF/B-V files for a natural element. These are needed mainly to estimate average quantities such as effective Q-values for the natural element. The analysis of local energy deposition was instrumental in detecting and understanding energy balance deficiencies and other problems in the ENDF/B-V data. Pertinent information about the library and a graphical display of the main nuclear response functions for all materials in the library are given. 35 refs.

  5. Preliminary neutronics design of china lead-alloy cooled demonstration reactor (CLEAR-III) for nuclear waste transmutation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chen, Z.; Southwest Science and Technology Univ., No.350 Shushanhu Road, Shushan District, Hefei, Anhui, 230031; Chen, Y.

    2012-07-01

    The China Lead-Alloy cooled Demonstration Reactor (CLEAR-III), a concept for a lead-bismuth cooled accelerator-driven sub-critical reactor for nuclear waste transmutation, was proposed and designed by the FDS team in China. In this study, preliminary neutronics design studies have primarily focused on three important performance parameters: the Transmutation Support Ratio (TSR), the effective multiplication factor and the blanket thermal power. Constraint parameters, such as the power peaking factor and the initial TRU loading, were also considered. In the specific design, uranium-free metallic dispersion fuel of (TRU-Zr)-Zr was used as one of the CLEAR-III fuel types, and the ratio between MA and Pu was adjusted to maximize the transmutation ratio. In addition, three fuel zones differing in the TRU fraction of the fuel were employed for this subcritical reactor, and the zone sizes and TRU fractions were determined such that the linear powers of these zones were close to each other. The neutronics calculations and analyses were performed using the Multi-Functional 4D Neutronics Simulation System named VisualBUS and the nuclear data library HENDL (Hybrid Evaluated Nuclear Data Library). In the preliminary design, the maximum TSRLLMA was ≈11 and the blanket thermal power was ≈1000 MW when the effective multiplication factor was 0.98. The results showed that good transmutation performance could be achieved with the subcritical reactor loaded with uranium-free fuel. (authors)

  6. Comparative Studies on UO2 Fueled HTTR with Several Nuclear Data Libraries

    NASA Astrophysics Data System (ADS)

    Hidayati, Anni N.; Prastyo, Puguh A.; Waris, Abdul; Irwanto, Dwi

    2017-07-01

    HTTR (High Temperature Engineering Test Reactor) is one of the Generation IV nuclear reactors that has been developed by JAERI (the former name of JAEA, Japan). The HTTR uses a graphite moderator and helium gas coolant with UO2 fuel, and has an outlet coolant temperature of 900°C or higher. Several studies of the HTTR have been performed employing the JENDL 3.2 nuclear data library. In this paper, a comparative evaluation of the HTTR with several nuclear data libraries (JENDL 3.3, JENDL 4.0, and JEFF 3.1) has been conducted. The 3-D calculation was performed using the CITATION module of the SRAC 2006 code. The results show some differences between these nuclear data libraries. The core effective multiplication factor (k-eff) is about 1.17, 1.18 and 1.19 (for JENDL 3.3, JENDL 4.0 and JEFF 3.1, respectively) at the beginning of life, and 1.16, 1.17 and 1.17 at the end of life (after two years of operation). Although the k-eff results differ somewhat, the neutron spectra obtained with these libraries are essentially the same.

  7. Towards a More Complete and Accurate Experimental Nuclear Reaction Data Library (EXFOR): International Collaboration Between Nuclear Reaction Data Centres (NRDC)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Otuka, N., E-mail: n.otsuka@iaea.org; Dupont, E.; Semkova, V.

    The International Network of Nuclear Reaction Data Centres (NRDC), coordinated by the IAEA Nuclear Data Section (NDS), successfully collaborates in the maintenance and development of the EXFOR library. As the scope of published data expands (e.g. to higher energies and heavier projectiles) to meet the needs of research and applications, it has become a challenging task to maintain both the completeness and accuracy of the EXFOR library. The evolution of the library is described, highlighting recent developments.

  8. Converting point-wise nuclear cross sections to pole representation using regularized vector fitting

    NASA Astrophysics Data System (ADS)

    Peng, Xingjie; Ducru, Pablo; Liu, Shichang; Forget, Benoit; Liang, Jingang; Smith, Kord

    2018-03-01

    Direct Doppler broadening of nuclear cross sections in Monte Carlo codes has been widely sought for coupled reactor simulations. One recent approach proposed analytical broadening using a pole representation of the commonly used resonance models and the introduction of a local windowing scheme to improve performance (Hwang, 1987; Forget et al., 2014; Josey et al., 2015, 2016). This pole representation has been achieved in the past by converting resonance parameters in the evaluated nuclear data library into poles and residues. However, the cross sections of some isotopes are only provided as point-wise data in the ENDF/B-VII.1 library. To convert these isotopes to the pole representation, a recent approach has been proposed using the relaxed vector fitting (RVF) algorithm (Gustavsen and Semlyen, 1999; Gustavsen, 2006; Liu et al., 2018). This approach, however, needs the number of poles to be specified ahead of time. This article addresses this issue by adding a pole and residue filtering step to the RVF procedure. The resulting regularized VF (ReV-Fit) algorithm is shown to efficiently converge the poles close to the physical ones, eliminating most of the superfluous poles and thus enabling the conversion of point-wise nuclear cross sections.
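
    Once poles and residues are available, evaluating the cross section reduces to summing simple rational terms in the momentum-like variable √E. A schematic sketch of that representation (the poles and residues below are invented for illustration and are not taken from any evaluation; real conversions carry many more pole pairs plus the windowing scheme mentioned above):

```python
import numpy as np

# Schematic pole-residue evaluation in the sqrt(E) variable:
#   sigma(E) = (1/E) * Re[ sum_j r_j / (p_j - sqrt(E)) ]
poles = np.array([30.0 + 0.5j, 45.0 + 1.0j])   # sqrt(eV) space, hypothetical
residues = np.array([2000.0j, 1500.0j])        # chosen so each term is Lorentzian-like

def sigma(E):
    u = np.sqrt(E)
    return np.sum((residues / (poles - u)).real) / E

# The first pole sits at sqrt(E) = 30, i.e. E = 900 eV: a resonance peak there
assert sigma(900.0) > sigma(400.0) > 0.0
```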

  9. Nuclear Data Online Services at Peking University

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fan, T.S.; Guo, Z.Y.; Ye, W.G.

    2005-05-24

    The Institute of Heavy Ion Physics at Peking University has developed a new nuclear data online services software package. Through the web site (http://ndos.nst.pku.edu.cn), it offers online access to the main relational nuclear databases: five evaluated neutron libraries (BROND, CENDL, ENDF, JEF, JENDL), the ENSDF library, the EXFOR library, the IAEA photonuclear library and the charged-particle data of the FENDL library. This software allows the comparison and graphic representation of the different data sets. The computer programs of this package are based on the Linux implementation of PHP and the MySQL software.

  10. Nuclear Data Online Services at Peking University

    NASA Astrophysics Data System (ADS)

    Fan, T. S.; Guo, Z. Y.; Ye, W. G.; Liu, W. L.; Liu, T. J.; Liu, C. X.; Chen, J. X.; Tang, G. Y.; Shi, Z. M.; Huang, X. L.; Chen, J. E.

    2005-05-01

    The Institute of Heavy Ion Physics at Peking University has developed a new nuclear data online services software package. Through the web site (http://ndos.nst.pku.edu.cn), it offers online access to the main relational nuclear databases: five evaluated neutron libraries (BROND, CENDL, ENDF, JEF, JENDL), the ENSDF library, the EXFOR library, the IAEA photonuclear library and the charged-particle data of the FENDL library. This software allows the comparison and graphic representation of the different data sets. The computer programs of this package are based on the Linux implementation of PHP and the MySQL software.

  11. VizieR Online Data Catalog: Brussels nuclear reaction rate library (Aikawa+, 2005)

    NASA Astrophysics Data System (ADS)

    Aikawa, M.; Arnould, M.; Goriely, S.; Jorissen, A.; Takahashi, K.

    2005-07-01

    The present data is part of the Brussels nuclear reaction rate library (BRUSLIB) for astrophysics applications and concerns nuclear reaction rate predictions calculated within the statistical Hauser-Feshbach approximation and making use of global and coherent microscopic nuclear models for the quantities (nuclear masses, nuclear structure properties, nuclear level densities, gamma-ray strength functions, optical potentials) entering the rate calculations. (4 data files).

  12. Nuclear Data Uncertainties for Typical LWR Fuel Assemblies and a Simple Reactor Core

    NASA Astrophysics Data System (ADS)

    Rochman, D.; Leray, O.; Hursin, M.; Ferroukhi, H.; Vasiliev, A.; Aures, A.; Bostelmann, F.; Zwermann, W.; Cabellos, O.; Diez, C. J.; Dyrda, J.; Garcia-Herranz, N.; Castro, E.; van der Marck, S.; Sjöstrand, H.; Hernandez, A.; Fleming, M.; Sublet, J.-Ch.; Fiorito, L.

    2017-01-01

    The impact of the covariances in current nuclear data libraries such as ENDF/B-VII.1, JEFF-3.2, JENDL-4.0, SCALE and TENDL on relevant current reactors is presented in this work. The uncertainties due to nuclear data are calculated for existing PWR and BWR fuel assemblies (with burn-up up to 40 GWd/tHM, followed by 10 years of cooling time) and for a simplified PWR full-core model (without burn-up) for quantities such as k∞, macroscopic cross sections, pin power and isotope inventory. In this work, the method of propagation of uncertainties is based on random sampling of nuclear data, either from covariance files or directly from basic parameters. Additionally, possible biases on calculated quantities, such as those from the self-shielding treatment, are investigated. Different calculation schemes are used, based on CASMO, SCALE, DRAGON, MCNP or FISPACT-II, thus simulating real-life assignments for technical-support organizations. The outcome of such a study is a comparison of uncertainties with two consequences. First, although this study is not expected to lead to identical results between the calculation schemes involved, it provides insight into what can happen when calculating uncertainties and gives some perspective on the range of validity of these uncertainties. Second, it allows a picture to be drawn of the present state of knowledge, using existing nuclear data library covariances and current methods.
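
    The random-sampling propagation method referred to above can be illustrated with a deliberately tiny model: perturb the nuclear data according to a covariance matrix, re-evaluate an output quantity per sample, and read the uncertainty off the sample statistics. Everything here (the one-group data, its covariance, the k∞ formula) is invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical one-group data [nu*Sigma_f, Sigma_a] with an invented covariance
# (1% and 2% relative standard deviations, uncorrelated).
mean = np.array([0.0105, 0.0100])
cov = np.diag([(0.01 * 0.0105) ** 2, (0.02 * 0.0100) ** 2])

def k_inf(d):
    nu_sig_f, sig_a = d
    return nu_sig_f / sig_a       # infinite-medium one-group neutron balance

samples = rng.multivariate_normal(mean, cov, size=5000)
k = np.array([k_inf(s) for s in samples])

rel_unc = k.std() / k.mean()      # propagated relative uncertainty on k_inf
# For small uncorrelated perturbations this tends to sqrt(0.01^2 + 0.02^2) ~ 2.2%
assert 0.018 < rel_unc < 0.027
```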

  13. Reasons for 2011 Release of the Evaluated Nuclear Data Library (ENDL2011.0)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Brown, D.; Escher, J.; Hoffman, R.

    LLNL's Computational Nuclear Physics Group and Nuclear Theory and Modeling Group have collaborated to create the 2011 release of the Evaluated Nuclear Data Library (ENDL2011). ENDL2011 is designed to support LLNL's current and future nuclear data needs. This database is currently the most complete nuclear database for Monte Carlo and deterministic transport of neutrons and charged particles, surpassing ENDL2009.0 [1]. The ENDL2011 release [2] contains 918 transport-ready evaluations in the neutron sub-library alone. ENDL2011 was assembled with strong support from the ASC program, leveraged with support from NNSA science campaigns and the DOE/Office of Science US Nuclear Data Program.

  14. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wacker, John F.; Curry, Michael

    The interpretation of data from the nuclear forensic analysis of illicit nuclear material of unknown origin requires comparative data from samples of known origin. One way to provide such comparative data is to create a system of national nuclear forensics libraries, in which each participating country stores information about nuclear or other radioactive material that either resides in or was manufactured by that country. Such national libraries could provide an authoritative record of the material located in or produced by a particular country, and thus form an essential prerequisite for a government to investigate illicit uses of nuclear or other radioactive material within its borders. We describe the concept of the national nuclear forensic library, recommendations for content and structure, and suggested querying methods for utilizing the information to address nuclear smuggling.

  15. First Conclusions of the WPEC/Subgroup-22 Nuclear Data for Improved LEU-LWR Reactivity Predictions

    NASA Astrophysics Data System (ADS)

    Courcelle, Arnaud

    2005-05-01

    This paper is a summary of a collective work in the framework of the Working Party on International Nuclear Data Evaluation and Co-operation (WPEC) to investigate the reasons for the systematic reactivity underprediction of thermal LEU-LWR (Low-Enriched Uranium, Light-Water Reactor) systems. This k-eff underprediction (≈ -500 pcm) is observed with the most recent nuclear data libraries (ENDF/B-VI.8, JENDL3.3 and JEFF3.0). This report reviews the evaluation work performed at several laboratories [Oak Ridge National Laboratory (ORNL), Los Alamos National Laboratory (LANL), Commissariat à l'énergie atomique de Bruyères-le-Châtel (CEA-BRC), International Atomic Energy Agency (IAEA)] as well as the integral tests (mainly at LANL, Knolls Atomic Power Laboratory (KAPL), Bettis Atomic Power Laboratory (BAPL), the Nuclear Research and Consultancy Group (NRG-Petten), CEA and IAEA) of the successive versions of the new evaluated files. The present status of the work can be summarized as follows: • Improved evaluations of 238U inelastic data proposed by LANL and CEA-BRC were tested against integral benchmarks and partially improve the reactivity prediction. • The thermal capture cross-section of 238U has been revised, and a new evaluation of 238U resonance parameters, up to 20 keV, is in progress at ORNL. Integral tests have ensured that the modifications of the 238U capture cross-section in the thermal and resolved ranges are still compatible with 238U integral measurements (238U capture rate ratios measured in critical facilities and the 239Pu build-up prediction in a depleted pressurized water reactor (PWR) assembly). It is demonstrated that the combination of the new inelastic data (LANL or BRC) with the preliminary ORNL resonance parameter set gives a good correction of the reactivity under-estimation. The provisional conclusions of this collective work are expected to contribute toward the improvement of future versions of nuclear data libraries.

  16. Structure for Storing Properties of Particles (PoP)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Patel, N. R.; Mattoon, C. M.; Beck, B. R.

    2014-06-01

    Evaluated nuclear databases are critical for applications such as nuclear energy, nuclear medicine, homeland security, and stockpile stewardship. Particle masses, nuclear excitation levels, and other “Properties of Particles” are essential for making evaluated nuclear databases. Currently, these properties are obtained from various databases that are stored in outdated formats. A “Properties of Particles” (PoP) structure is therefore being designed that will allow storing all information for one or more particles in a single place, so that each evaluation, simulation, model calculation, etc. can link to the same data. Information provided in PoP will include properties of nuclei, gammas and electrons (along with other particles such as pions, as evaluations extend to higher energies). Presently, PoP includes masses from the Atomic Mass Evaluation version 2003 (AME2003), and level schemes and gamma decays from the Reference Input Parameter Library (RIPL-3). The data are stored in a hierarchical structure. An example of how PoP stores nuclear masses and energy levels will be presented here.

  17. Structure for Storing Properties of Particles (PoP)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Patel, N.R., E-mail: infinidhi@llnl.gov; Mattoon, C.M.; Beck, B.R.

    2014-06-15

    Evaluated nuclear databases are critical for applications such as nuclear energy, nuclear medicine, homeland security, and stockpile stewardship. Particle masses, nuclear excitation levels, and other “Properties of Particles” are essential for making evaluated nuclear databases. Currently, these properties are obtained from various databases that are stored in outdated formats. A “Properties of Particles” (PoP) structure is being designed that will allow storing all information for one or more particles in a single place, so that each evaluation, simulation, model calculation, etc. can link to the same data. Information provided in PoP will include properties of nuclei, gammas and electrons (along with other particles such as pions, as evaluations extend to higher energies). Presently, PoP includes masses from the Atomic Mass Evaluation version 2003 (AME2003), and level schemes and gamma decays from the Reference Input Parameter Library (RIPL-3). The data are stored in a hierarchical structure. An example of how PoP stores nuclear masses and energy levels will be presented here.

  18. Production and testing of the ENEA-Bologna VITJEFF32.BOLIB (JEFF-3.2) multi-group (199 n + 42 γ) cross section library in AMPX format for nuclear fission applications

    NASA Astrophysics Data System (ADS)

    Pescarini, Massimo; Orsi, Roberto; Frisoni, Manuela

    2017-09-01

    The ENEA-Bologna Nuclear Data Group produced the VITJEFF32.BOLIB multi-group coupled neutron/photon (199 n + 42 γ) cross section library in AMPX format, based on the OECD-NEA Data Bank JEFF-3.2 evaluated nuclear data library. VITJEFF32.BOLIB was conceived for nuclear fission applications as the European counterpart of the similar ORNL VITAMIN-B7 library (ENDF/B-VII.0 data). VITJEFF32.BOLIB has the same neutron and photon energy group structure as the former ORNL VITAMIN-B6 reference library (ENDF/B-VI.3 data) and was produced using similar data processing methodologies, based on the LANL NJOY-2012.53 nuclear data processing system for the generation of the nuclide cross section data files in GENDF format. The ENEA-Bologna 2007 Revision of the ORNL SCAMPI nuclear data processing system was then used for the conversion into the AMPX format. VITJEFF32.BOLIB contains processed cross section data files for 190 nuclides, obtained through the Bondarenko (f-factor) method for the treatment of neutron resonance self-shielding and temperature effects. Collapsed working libraries of self-shielded cross sections in FIDO-ANISN format, used by the deterministic transport codes of the ORNL DOORS system, can be generated from VITJEFF32.BOLIB with the cited SCAMPI version. This paper describes the methodology and specifications of the data processing performed and presents some results of the VITJEFF32.BOLIB validation.
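
    The Bondarenko (f-factor) method mentioned above weights the point-wise cross section with a 1/(σ(E) + σ0) flux shape, so the effective group cross section depends on the background cross section σ0 seen by the absorber. A numerical sketch on an invented single-resonance cross section (an illustration of the weighting, not the actual VITJEFF32.BOLIB processing):

```python
import numpy as np

# Invented cross section over a toy group: flat background plus one resonance.
E = np.linspace(5.0, 8.0, 4001)                          # eV
sigma = 10.0 + 5000.0 / (1.0 + ((E - 6.5) / 0.05) ** 2)  # barns, illustrative

def shielded(sigma_0):
    """Bondarenko-weighted group cross section for background sigma_0 (barns)."""
    w = 1.0 / (sigma + sigma_0)          # narrow-resonance flux weight
    return np.sum(sigma * w) / np.sum(w) # uniform grid, so plain sums suffice

sigma_inf = sigma.mean()              # infinite-dilution (sigma_0 -> inf) average
f = shielded(50.0) / sigma_inf        # f-factor at sigma_0 = 50 b

assert 0.0 < f < 1.0                                   # self-shielding depresses the average
assert shielded(50.0) < shielded(1000.0) < sigma_inf   # recovers the dilution limit
```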

  19. 2011.2 Revision of the Evaluated Nuclear Data Library (ENDL2011.2)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Beck, B.; Descalles, M. A.; Mattoon, C.

    LLNL's Computational Nuclear Physics Group and Nuclear Theory and Modeling Group have collaborated to create the 2011.2 revised release of the Evaluated Nuclear Data Library (ENDL2011.2). ENDL2011.2 is designed to support LLNL's current and future nuclear data needs and will be employed in nuclear reactor, nuclear security and stockpile stewardship simulations with ASC codes. This database is currently the most complete nuclear database for Monte Carlo and deterministic transport of neutrons and charged particles. This library was assembled with strong support from the ASC PEM and Attribution programs, leveraged with support from Campaign 4 and the DOE/Office of Science's US Nuclear Data Program. This document lists the revisions made in ENDL2011.2 compared with the data existing in the original ENDL2011.0 release and the ENDL2011.1-rc4 release candidate of April 2015. These changes are made in parallel with some similar revisions for ENDL2009.2.

  20. Software Quality Assurance and Verification for the MPACT Library Generation Process

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Liu, Yuxuan; Williams, Mark L.; Wiarda, Dorothea

    This report fulfills the requirements for the Consortium for the Advanced Simulation of Light-Water Reactors (CASL) milestone L2:RTM.P14.02, “SQA and Verification for MPACT Library Generation,” by documenting the current status of the software quality, verification, and acceptance testing of nuclear data libraries for MPACT. It provides a brief overview of the library generation process, from general-purpose evaluated nuclear data files (ENDF/B) to a problem-dependent cross section library for modeling of light-water reactors (LWRs). The software quality assurance (SQA) programs associated with each of the software packages used to generate the nuclear data libraries are discussed; specific tests within the SCALE/AMPX and VERA/XSTools repositories are described. The methods and associated tests used to verify the quality of the library during the generation process are described in detail. The library generation process has been automated to (1) ensure that it can be run without user intervention and (2) ensure that the library can be reproduced. Finally, the acceptance testing process that will be performed by representatives from the Radiation Transport Methods (RTM) Focus Area prior to the production library’s release is described in detail.

  1. Determining the nuclear data uncertainty on MONK10 and WIMS10 criticality calculations

    NASA Astrophysics Data System (ADS)

    Ware, Tim; Dobson, Geoff; Hanlon, David; Hiles, Richard; Mason, Robert; Perry, Ray

    2017-09-01

    The ANSWERS Software Service is developing a number of techniques to better understand and quantify the uncertainty on calculations of the neutron multiplication factor, k-effective, in nuclear fuel and other systems containing fissile material. The uncertainty on the calculated k-effective arises from a number of sources, including nuclear data uncertainties, manufacturing tolerances, modelling approximations and, for Monte Carlo simulation, stochastic uncertainty. For determining the uncertainties due to nuclear data, a set of application libraries has been generated for use with the MONK10 Monte Carlo and WIMS10 deterministic criticality and reactor physics codes. This paper gives an overview of the generation of these nuclear data libraries by Latin hypercube sampling of JEFF-3.1.2 evaluated data, based upon a library of covariance data taken from the JEFF, ENDF/B, JENDL and TENDL evaluations. Criticality calculations have been performed with MONK10 and WIMS10 using these sampled libraries for a number of benchmark models of fissile systems. Results are presented which show the uncertainty on k-effective for these systems arising from the uncertainty on the input nuclear data.
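
    Latin hypercube sampling, as used above to generate the sampled libraries, stratifies each input dimension into n equiprobable bins and places exactly one sample in each bin per dimension. A minimal self-contained sketch producing uniform [0,1) variates (a real application would map these through the inverse distribution functions implied by the covariance data):

```python
import numpy as np

def latin_hypercube(n_samples, n_dims, rng):
    """One stratified uniform sample per equal-probability bin, per dimension."""
    # Stratify: sample i lands in [i/n, (i+1)/n) before shuffling
    u = (rng.random((n_samples, n_dims)) + np.arange(n_samples)[:, None]) / n_samples
    for d in range(n_dims):          # decorrelate dimensions by shuffling each column
        rng.shuffle(u[:, d])
    return u

rng = np.random.default_rng(0)
u = latin_hypercube(10, 3, rng)

# Each column hits every decile exactly once -- the defining LHS property
for d in range(3):
    assert sorted(np.floor(u[:, d] * 10).astype(int)) == list(range(10))
```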

  2. 2011 Release of the Evaluated Nuclear Data Library (ENDL2011.0)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Brown, D. A.; Beck, B.; Descalles, M. A.

    LLNL’s Computational Nuclear Physics Group and Nuclear Theory and Modeling Group have collaborated to produce the last of three major releases of LLNL’s evaluated nuclear database, ENDL2011. ENDL2011 is designed to support LLNL’s current and future nuclear data needs by providing the best nuclear data available to our programmatic customers. This library contains many new evaluations for radiochemical diagnostics, structural materials, and thermonuclear reactions. We have made an effort to eliminate all holes in reaction networks, allowing in-line isotopic creation and depletion calculations. We have striven to keep ENDL2011 at the leading edge of nuclear data library development by reviewing and incorporating new evaluations as they are made available to the nuclear data community. Finally, this release is our most highly tested release, as we have strengthened our already rigorous testing regime by adding tests against IPPE Activation Ratio Measurements, many more new critical assemblies and a more complete set of classified testing (to be detailed separately).

  3. 2009.1 Revision of the Evaluated Nuclear Data Library (ENDL2009.1)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Thompson, I. J.; Beck, B.; Descalle, M. A.

    LLNL’s Computational Nuclear Data and Theory Group have created a 2009.1 revised release of the Evaluated Nuclear Data Library (ENDL2009.1). This library is designed to support LLNL’s current and future nuclear data needs and will be employed in nuclear reactor, nuclear security and stockpile stewardship simulations with ASC codes. The ENDL2009 database was the most complete nuclear database for Monte Carlo and deterministic transport of neutrons and charged particles. It was assembled with strong support from the ASC PEM and Attribution programs, leveraged with support from Campaign 4 and the DOE/Office of Science’s US Nuclear Data Program. This document lists the revisions and fixes made in a new release called ENDL2009.1, by comparing with the existing data in the original release, which is now called ENDL2009.0. These changes are made in conjunction with the revisions for ENDL2011.1, so that both the .1 releases are as free as possible of known defects.

  4. 2009.3 Revision of the Evaluated Nuclear Data Library (ENDL2009.3)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Thompson, I. J.; Beck, B.; Descalle, M. A.

    LLNL's Computational Nuclear Data and Theory Group have created a 2009.3 revised release of the Evaluated Nuclear Data Library (ENDL2009.3). This library is designed to support LLNL's current and future nuclear data needs and will be employed in nuclear reactor, nuclear security and stockpile stewardship simulations with ASC codes. The ENDL2009 database was the most complete nuclear database for Monte Carlo and deterministic transport of neutrons and charged particles. It was assembled with strong support from the ASC PEM and Attribution programs, leveraged with support from Campaign 4 and the DOE/Office of Science's US Nuclear Data Program. This document lists the revisions and fixes made in a new release called ENDL2009.3, by comparing with the existing data in the previous release, ENDL2009.2. These changes are made in conjunction with the revisions for ENDL2011.3, so that both the .3 releases are as free as possible of known defects.

  5. Leveraging existing information for use in a National Nuclear Forensics Library (NNFL)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Davydov, Jerry; Dion, Heather; LaMont, Stephen

    A National Nuclear Forensics Library (NNFL) assists a State in assessing whether nuclear material encountered out of regulatory control is of domestic or international origin. By leveraging nuclear material registries, nuclear enterprise records, and safeguards accountancy information, as well as existing domestic technical capability and subject-matter expertise, States can better assess the effort required to set up an NNFL. States that are largely recipients of nuclear and radiological materials and have no internal production capabilities may create an NNFL that relies on existing information rather than carrying out advanced analyses on domestic materials.

  6. Leveraging existing information for use in a National Nuclear Forensics Library (NNFL)

    DOE PAGES

    Davydov, Jerry; Dion, Heather; LaMont, Stephen; ...

    2015-12-16

    A National Nuclear Forensics Library (NNFL) assists a State in assessing whether nuclear material encountered out of regulatory control is of domestic or international origin. By leveraging nuclear material registries, nuclear enterprise records, and safeguards accountancy information, as well as existing domestic technical capability and subject-matter expertise, States can better assess the effort required to set up an NNFL. States that are largely recipients of nuclear and radiological materials and have no internal production capabilities may create an NNFL that relies on existing information rather than carrying out advanced analyses on domestic materials.

  7. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Otuka, N.; Pritychenko, B.

    The International Network of Nuclear Reaction Data Centres (NRDC), coordinated by the IAEA Nuclear Data Section (NDS), collaborates successfully in the maintenance and development of the EXFOR library. However, as the scope of published data expands (e.g., to higher energies and heavier projectiles) to meet the needs of research and applications, maintaining both the completeness and accuracy of the EXFOR library has become a challenging task. The evolution of the library is described, highlighting recent developments.

  8. Combinatorial investigation of Fe–B thin-film nanocomposites

    PubMed Central

    Brunken, Hayo; Grochla, Dario; Savan, Alan; Kieschnick, Michael; Meijer, Jan D; Ludwig, Alfred

    2011-01-01

    Combinatorial magnetron sputter deposition from elemental targets was used to create Fe–B composition-spread-type thin-film materials libraries on thermally oxidized 4-in. Si wafers. The materials libraries, consisting of wedge-type multilayer thin films, were annealed at 500 or 700 °C to transform the multilayers into multiphase alloys. The libraries were characterized by nuclear reaction analysis, Rutherford backscattering, nanoindentation, vibrating sample magnetometry, X-ray diffraction (XRD) and transmission electron microscopy (TEM). Young's modulus and hardness values were related to the annealing parameters, structure and composition of the films. The magnetic properties of the films were improved by annealing in a H2 atmosphere, showing a more than tenfold decrease in the coercive field values in comparison to those of the vacuum-annealed films. The hardness values increased from 8 to 18 GPa when the annealing temperature was increased from 500 to 700 °C. The appearance of Fe2B phases, as revealed by XRD and TEM, had a significant effect on the mechanical properties of the films. PMID:27877435

  9. ENDF/B-VII.0: Next Generation Evaluated Nuclear Data Library for Nuclear Science and Technology

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chadwick, M B; Oblozinsky, P; Herman, M

    2006-10-02

    We describe the next generation general purpose Evaluated Nuclear Data File, ENDF/B-VII.0, of recommended nuclear data for advanced nuclear science and technology applications. The library, released by the U.S. Cross Section Evaluation Working Group (CSEWG) in December 2006, contains data primarily for reactions with incident neutrons, protons, and photons on almost 400 isotopes. The new evaluations are based on both experimental data and nuclear reaction theory predictions. The principal advances over the previous ENDF/B-VI library are the following: (1) New cross sections for U, Pu, Th, Np and Am actinide isotopes, with improved performance in integral validation criticality and neutron transmission benchmark tests; (2) More precise standard cross sections for neutron reactions on H, {sup 6}Li, {sup 10}B, Au and for {sup 235,238}U fission, developed by a collaboration with the IAEA and the OECD/NEA Working Party on Evaluation Cooperation (WPEC); (3) Improved thermal neutron scattering; (4) An extensive set of neutron cross sections on fission products developed through a WPEC collaboration; (5) A large suite of photonuclear reactions; (6) Extension of many neutron- and proton-induced reactions up to an energy of 150 MeV; (7) Many new light nucleus neutron and proton reactions; (8) Post-fission beta-delayed photon decay spectra; (9) New radioactive decay data; and (10) New methods developed to provide uncertainties and covariances, together with covariance evaluations for some sample cases. The paper provides an overview of this library, consisting of 14 sublibraries in the same ENDF-6 format as the earlier ENDF/B-VI library. We describe each of the 14 sublibraries, focusing on neutron reactions. 
Extensive validation, using radiation transport codes to simulate measured critical assemblies, shows major improvements: (a) The long-standing underprediction of low enriched U thermal assemblies is removed; (b) The {sup 238}U, {sup 208}Pb, and {sup 9}Be reflector biases in fast systems are largely removed; (c) ENDF/B-VI.8 good agreement for simulations of highly enriched uranium assemblies is preserved; (d) The underprediction of fast criticality of {sup 233,235}U and {sup 239}Pu assemblies is removed; and (e) The intermediate spectrum critical assemblies are predicted more accurately. We anticipate that the new library will play an important role in nuclear technology applications, including transport simulations supporting national security, nonproliferation, advanced reactor and fuel cycle concepts, criticality safety, medicine, space applications, nuclear astrophysics, and nuclear physics facility design. The ENDF/B-VII.0 library is archived at the National Nuclear Data Center, BNL. The complete library, or any part of it, may be retrieved from www.nndc.bnl.gov.

  10. FENDL: International reference nuclear data library for fusion applications

    NASA Astrophysics Data System (ADS)

    Pashchenko, A. B.; Wienke, H.; Ganesan, S.

    1996-10-01

    The IAEA Nuclear Data Section, in co-operation with several national nuclear data centres and research groups, has created the first version of an internationally available Fusion Evaluated Nuclear Data Library (FENDL-1). The FENDL library has been selected to serve as a comprehensive source of processed and tested nuclear data tailored to the requirements of the engineering design activity (EDA) of the ITER project and other fusion-related development projects. The present version of FENDL consists of the following sublibraries, covering the necessary nuclear input for all physics and engineering aspects of the material development, design, operation and safety of the ITER project in its current EDA phase: FENDL/A-1.1: neutron activation cross-sections for 636 nuclides, selected from different available sources; FENDL/D-1.0: nuclear decay data for 2900 nuclides in ENDF-6 format; FENDL/DS-1.0: neutron activation data for dosimetry by foil activation; FENDL/C-1.0: data for the fusion reactions D(d,n), D(d,p), T(d,n), T(t,2n) and He-3(d,p), extracted from ENDF/B-6 and processed; FENDL/E-1.0: data for coupled neutron-photon transport calculations, including a data library for neutron interaction and photon production for 63 elements or isotopes, selected from ENDF/B-6, JENDL-3 or BROND-2, and a photon-atom interaction data library for 34 elements. The benchmark validation of FENDL-1, as required by the customer, i.e. the ITER team, is considered to be a task of high priority in the coming months. The well-tested and validated processed nuclear data libraries of FENDL-2 are expected to be ready by mid-1996 for use by the ITER team in the final phase of the ITER EDA, after extensive benchmarking and integral validation studies in the 1995-1996 period. The FENDL data files can be transferred electronically to users from the IAEA Nuclear Data Section online system through the Internet. 
A grand total of 54 (sub)directories containing 845 files, with a total size of about 2 million blocks (1 block = 512 bytes), i.e. about 1 Gigabyte of numerical data, is currently available on-line.

  11. BUGJEFF311.BOLIB (JEFF-3.1.1) and BUGENDF70.BOLIB (ENDF/B-VII.0) - Generation Methodology and Preliminary Testing of two ENEA-Bologna Group Cross Section Libraries for LWR Shielding and Pressure Vessel Dosimetry

    NASA Astrophysics Data System (ADS)

    Pescarini, Massimo; Sinitsa, Valentin; Orsi, Roberto; Frisoni, Manuela

    2016-02-01

    Two broad-group coupled neutron/photon working cross section libraries in FIDO-ANISN format, dedicated to LWR shielding and pressure vessel dosimetry applications, were generated following the methodology recommended by the US ANSI/ANS-6.1.2-1999 (R2009) standard. These libraries, named BUGJEFF311.BOLIB and BUGENDF70.BOLIB, are respectively based on JEFF-3.1.1 and ENDF/B-VII.0 nuclear data and adopt the same broad-group energy structure (47 n + 20 γ) as the analogous ORNL BUGLE-96 library. They were respectively obtained from the ENEA-Bologna VITJEFF311.BOLIB and VITENDF70.BOLIB libraries in AMPX format for nuclear fission applications, through problem-dependent cross section collapsing with the ENEA-Bologna 2007 revision of the ORNL SCAMPI nuclear data processing system. Both parent libraries are based on the Bondarenko self-shielding factor method and have the same AMPX format and fine-group energy structure (199 n + 42 γ) as the analogous ORNL VITAMIN-B6 library, from which BUGLE-96 was obtained at ORNL. A synthesis of a preliminary validation of the cited BUGLE-type libraries, performed through 3D fixed source transport calculations with the ORNL TORT-3.2 SN code, is included. The calculations were dedicated to the PCA-Replica 12/13 and VENUS-3 engineering neutron shielding benchmark experiments, specifically conceived to test the accuracy of nuclear data and transport codes in LWR shielding and radiation damage analyses.

  12. CASMO5 JENDL-4.0 and ENDF/B-VII.1beta4 libraries

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rhodes, J.; Gheorghiu, N.; Ferrer, R.

    2012-07-01

    This paper details the generation of neutron data libraries for the CASMO5 lattice physics code based on the recently released JENDL-4.0 and ENDF/B-VII.1beta4 nuclear data evaluations. This data represents state-of-the-art nuclear data for late 2011. The key features of the new evaluations are briefly described, along with the procedure for processing this data into CASMO5 586-energy-group neutron data libraries. Finally, some CASMO5 results for standard UO{sub 2} and MOX critical experiments for the two new libraries and the current ENDF/B-VII.0 CASMO5 library are presented, including the B&W 1810 series, DIMPLE S06A, S06B, TCA reflector criticals with iron plates, and the PNL-30-35 MOX criticals. The results show that CASMO5 with the new libraries performs well for these criticals, with a very slight edge in results to the JENDL-4.0 nuclear data evaluation over the ENDF/B-VII.1beta4 evaluation. Work is currently underway to generate a CASMO5 library based on the final ENDF/B-VII.1 evaluation released Dec. 22, 2011. (authors)

  13. Processing and validation of JEFF-3.1.1 and ENDF/B-VII.0 group-wise cross section libraries for shielding calculations

    NASA Astrophysics Data System (ADS)

    Pescarini, M.; Sinitsa, V.; Orsi, R.; Frisoni, M.

    2013-03-01

    This paper presents a synthesis of the ENEA-Bologna Nuclear Data Group programme dedicated to generating and validating group-wise cross section libraries for shielding and radiation damage deterministic calculations in nuclear fission reactors, following the data processing methodology recommended in the ANSI/ANS-6.1.2-1999 (R2009) American Standard. The VITJEFF311.BOLIB and VITENDF70.BOLIB fine-group coupled n-γ (199 n + 42 γ, VITAMIN-B6 structure) multi-purpose cross section libraries, based on the Bondarenko method for neutron resonance self-shielding and respectively on JEFF-3.1.1 and ENDF/B-VII.0 evaluated nuclear data, were produced in AMPX format using NJOY-99.259 and the ENEA-Bologna 2007 Revision of the SCAMPI nuclear data processing system. Two derived broad-group coupled n-γ (47 n + 20 γ, BUGLE-96 structure) working cross section libraries in FIDO-ANISN format for LWR shielding and pressure vessel dosimetry calculations, named BUGJEFF311.BOLIB and BUGENDF70.BOLIB, were generated by the revised version of SCAMPI, through problem-dependent cross section collapsing and self-shielding from the cited fine-group libraries. The validation results on criticality safety benchmark experiments for the fine-group libraries, and the preliminary validation results for the broad-group working libraries on the PCA-Replica and VENUS-3 engineering neutron shielding benchmark experiments, are summarized.
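At its core, the fine-to-broad group collapsing step described above is a flux-weighted average that preserves reaction rates within each broad group. A minimal Python sketch, with invented fine-group numbers (not the actual 199-group VITAMIN-B6 or 47-group BUGLE-96 structures):

```python
import numpy as np

# Hypothetical fine-group cross sections (barns) and weighting flux (arb. units)
sigma_fine = np.array([10.0, 8.0, 5.0, 3.0, 2.0, 1.5])
flux_fine  = np.array([1.0, 2.0, 4.0, 4.0, 2.0, 1.0])
# Map: which broad group each fine group collapses into (2 broad groups here)
broad_index = np.array([0, 0, 0, 1, 1, 1])

def collapse(sigma, flux, idx, n_broad):
    """Flux-weighted collapse: sigma_G = sum_g(sigma_g * phi_g) / sum_g(phi_g),
    the sums running over the fine groups g belonging to broad group G."""
    num = np.bincount(idx, weights=sigma * flux, minlength=n_broad)
    den = np.bincount(idx, weights=flux, minlength=n_broad)
    return num / den

sigma_broad = collapse(sigma_fine, flux_fine, broad_index, 2)
print(sigma_broad)
```

Because the numerator is the fine-group reaction rate, the broad-group constants reproduce the total reaction rate of the fine-group calculation for the chosen weighting flux; the problem dependence enters entirely through that flux spectrum.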

  14. High-Energy Activation Simulation Coupling TENDL and SPACS with FISPACT-II

    NASA Astrophysics Data System (ADS)

    Fleming, Michael; Sublet, Jean-Christophe; Gilbert, Mark

    2018-06-01

    To address the needs of activation-transmutation simulation in incident-particle fields with energies above a few hundred MeV, the FISPACT-II code has been extended to splice TENDL standard ENDF-6 nuclear data with extended nuclear data forms. The JENDL/HE-2007 and HEAD-2009 libraries were processed for FISPACT-II and used to demonstrate the capabilities of the new code version. Tests of the libraries and comparisons against both experimental yield data and the most recent intra-nuclear cascade model results demonstrate that there is a need for improved nuclear data libraries up to and above 1 GeV. Simulations on lead targets show that important radionuclides, such as 148Gd, can vary by more than an order of magnitude, whereas more advanced models find agreement within the experimental uncertainties.

  15. Kiwi: An Evaluated Library of Uncertainties in Nuclear Data and Package for Nuclear Sensitivity Studies

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pruet, J

    2007-06-23

    This report describes Kiwi, a program developed at Livermore to enable mature studies of the relation between imperfectly known nuclear physics and uncertainties in simulations of complicated systems. Kiwi includes a library of evaluated nuclear data uncertainties, tools for modifying data according to these uncertainties, and a simple interface for generating processed data used by transport codes. Kiwi also provides access to calculations of k eigenvalues for critical assemblies, allowing the user to check the implications of data modifications against integral experiments for multiplying systems. Kiwi is written in Python. The uncertainty library has the same format and directory structure as the native ENDL used at Livermore. Calculations for critical assemblies rely on deterministic and Monte Carlo codes developed by B Division.

  16. Calculations of Maxwellian-averaged cross sections and astrophysical reaction rates using the ENDF/B-VII.0, JEFF-3.1, JENDL-3.3, and ENDF/B-VI.8 evaluated nuclear reaction data libraries

    NASA Astrophysics Data System (ADS)

    Pritychenko, B.; Mughabghab, S. F.; Sonzogni, A. A.

    2010-11-01

    We have calculated the Maxwellian-averaged cross sections and astrophysical reaction rates of the stellar nucleosynthesis reactions (n, γ), (n, fission), (n, p), (n, α), and (n, 2n) using the ENDF/B-VII.0, JEFF-3.1, JENDL-3.3, and ENDF/B-VI.8 evaluated nuclear reaction data libraries. These four major nuclear reaction libraries were processed under the same conditions for Maxwellian temperatures (kT) ranging from 1 keV to 1 MeV. We compare our current calculations of the s-process nucleosynthesis nuclei with previous data sets and discuss the differences between them and the implications for nuclear astrophysics.
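The Maxwellian-averaged cross section (MACS) underlying such calculations is MACS(kT) = (2/√π) (kT)⁻² ∫ σ(E) E exp(−E/kT) dE. The following Python sketch evaluates it by simple trapezoidal quadrature over a pointwise cross-section table; the 1/v cross section is a toy input chosen because, for a pure 1/v law, the MACS should analytically equal σ evaluated at E = kT, which gives a built-in sanity check.

```python
import numpy as np

def trapz(y, x):
    """Trapezoidal quadrature over a tabulated grid."""
    return np.sum(0.5 * (y[1:] + y[:-1]) * np.diff(x))

def macs(energy, sigma, kT):
    """MACS = (2/sqrt(pi)) * Int[ sigma(E) * E * exp(-E/kT) dE ] / (kT)^2."""
    integrand = sigma * energy * np.exp(-energy / kT)
    return (2.0 / np.sqrt(np.pi)) * trapz(integrand, energy) / kT**2

# Toy 1/v cross section: 1 barn at thermal energy (25.3e-3 keV)
E = np.linspace(1e-3, 2000.0, 200_000)   # keV grid
sigma = np.sqrt(25.3e-3 / E)             # barns

kT = 30.0                                # keV, a typical s-process temperature
print(f"MACS(kT=30 keV)           = {macs(E, sigma, kT):.5f} b")
print(f"sigma(E=kT) for a 1/v law = {np.sqrt(25.3e-3 / kT):.5f} b")
```

A production calculation, as in the paper above, would instead use the reconstructed pointwise cross sections from each evaluated library on a grid dense enough to resolve resonances, and sweep kT from 1 keV to 1 MeV.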

  17. Development code for sensitivity and uncertainty analysis of input on the MCNPX for neutronic calculation in PWR core

    NASA Astrophysics Data System (ADS)

    Hartini, Entin; Andiwijayakusuma, Dinan

    2014-09-01

    This research developed a code for uncertainty analysis based on a statistical approach to assessing uncertainties in input parameters. In the fuel burn-up calculation, the uncertainty analysis was performed for the input parameters fuel density, coolant density and fuel temperature. The calculation is performed over the irradiation period using the Monte Carlo N-Particle Transport code. The uncertainty method is based on the probability density function. The code was developed as a Python script coupled to MCNPX for criticality and burn-up calculations. The simulation models the geometry of a PWR core with MCNPX at a power of 54 MW with UO2 pellet fuel. The calculation uses the ENDF/B-VI continuous-energy cross-section library. Since MCNPX requires nuclear data in ACE format, interfaces were developed to obtain ACE-format nuclear data from ENDF through dedicated NJOY processing for temperature changes over a certain range.
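The statistical approach described above can be sketched as follows. The nominal values, 1-sigma uncertainties and the smooth response surface standing in for an actual MCNPX criticality run are all invented for illustration; a real study would write each sampled parameter set into an MCNPX input deck and collect the resulting k-effective values.

```python
import numpy as np

rng = np.random.default_rng(7)

# Hypothetical nominal values and 1-sigma uncertainties for the three inputs
params = {
    "fuel_density_g_cc":    (10.4, 0.05),
    "coolant_density_g_cc": (0.72, 0.01),
    "fuel_temperature_K":   (900.0, 15.0),
}

n_samples = 200
samples = {name: rng.normal(mu, sd, n_samples) for name, (mu, sd) in params.items()}

# Each row is one perturbed input set that would be written into an MCNPX deck
decks = np.column_stack(list(samples.values()))

def toy_k_eff(row):
    """Stand-in for an MCNPX run: a made-up linear response around nominal."""
    rho_f, rho_c, T = row
    return 1.0 + 0.02*(rho_f - 10.4) - 0.05*(rho_c - 0.72) - 1e-5*(T - 900.0)

k = np.apply_along_axis(toy_k_eff, 1, decks)
print(f"k-eff: mean={k.mean():.5f}, std={k.std():.5f}")
```

The sample standard deviation of k then estimates the output uncertainty induced by the input probability density functions.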

  18. Development code for sensitivity and uncertainty analysis of input on the MCNPX for neutronic calculation in PWR core

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hartini, Entin, E-mail: entin@batan.go.id; Andiwijayakusuma, Dinan, E-mail: entin@batan.go.id

    2014-09-30

    This research developed a code for uncertainty analysis based on a statistical approach to assessing uncertainties in input parameters. In the fuel burn-up calculation, the uncertainty analysis was performed for the input parameters fuel density, coolant density and fuel temperature. The calculation is performed over the irradiation period using the Monte Carlo N-Particle Transport code. The uncertainty method is based on the probability density function. The code was developed as a Python script coupled to MCNPX for criticality and burn-up calculations. The simulation models the geometry of a PWR core with MCNPX at a power of 54 MW with UO2 pellet fuel. The calculation uses the ENDF/B-VI continuous-energy cross-section library. Since MCNPX requires nuclear data in ACE format, interfaces were developed to obtain ACE-format nuclear data from ENDF through dedicated NJOY processing for temperature changes over a certain range.

  19. Monte Carlo Calculation of Thermal Neutron Inelastic Scattering Cross Section Uncertainties by Sampling Perturbed Phonon Spectra

    NASA Astrophysics Data System (ADS)

    Holmes, Jesse Curtis

    Nuclear data libraries provide fundamental reaction information required by nuclear system simulation codes. The inclusion of data covariances in these libraries allows the user to assess uncertainties in system response parameters as a function of uncertainties in the nuclear data. Formats and procedures are currently established for representing covariances for various types of reaction data in ENDF libraries. This covariance data is typically generated utilizing experimental measurements and empirical models, consistent with the method of parent data production. However, ENDF File 7 thermal neutron scattering library data is, by convention, produced theoretically through fundamental scattering physics model calculations. Currently, there is no published covariance data for ENDF File 7 thermal libraries. Furthermore, no accepted methodology exists for quantifying or representing uncertainty information associated with this thermal library data. The quality of thermal neutron inelastic scattering cross section data can be of high importance in reactor analysis and criticality safety applications. These cross sections depend on the material's structure and dynamics. The double-differential scattering law, S(alpha, beta), tabulated in ENDF File 7 libraries contains this information. For crystalline solids, S(alpha, beta) is primarily a function of the material's phonon density of states (DOS). Published ENDF File 7 libraries are commonly produced by calculation and processing codes, such as the LEAPR module of NJOY, which utilize the phonon DOS as the fundamental input for inelastic scattering calculations to directly output an S(alpha, beta) matrix. To determine covariances for the S(alpha, beta) data generated by this process, information about uncertainties in the DOS is required. The phonon DOS may be viewed as a probability density function of atomic vibrational energy states that exist in a material. 
Probable variation in the shape of this spectrum may be established that depends on uncertainties in the physics models and methodology employed to produce the DOS. Through Monte Carlo sampling of perturbations from the reference phonon spectrum, an S(alpha, beta) covariance matrix may be generated. In this work, density functional theory and lattice dynamics in the harmonic approximation are used to calculate the phonon DOS for hexagonal crystalline graphite. This form of graphite is used as an example material for the purpose of demonstrating procedures for analyzing, calculating and processing thermal neutron inelastic scattering uncertainty information. Several sources of uncertainty in thermal neutron inelastic scattering calculations are examined, including sources which cannot be directly characterized through a description of the phonon DOS uncertainty, and their impacts are evaluated. Covariances for hexagonal crystalline graphite S(alpha, beta) data are quantified by coupling the standard methodology of LEAPR with a Monte Carlo sampling process. The mechanics of efficiently representing and processing this covariance information is also examined. Finally, with appropriate sensitivity information, it is shown that an S(alpha, beta) covariance matrix can be propagated to generate covariance data for integrated cross sections, secondary energy distributions, and coupled energy-angle distributions. This approach enables a complete description of thermal neutron inelastic scattering cross section uncertainties which may be employed to improve the simulation of nuclear systems.
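The Monte Carlo covariance construction described above can be sketched in Python. The "derived quantities" here are simple spectral moments standing in for the LEAPR S(α, β) output, and the Debye-like DOS together with the 5% lognormal shape noise are illustrative assumptions, not the graphite physics of the dissertation:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy reference phonon density of states (Debye-like) on an energy grid
e = np.linspace(0.01, 0.2, 50)          # eV
dos = e**2
dos /= dos.sum()                         # normalized reference spectrum

def spectral_moments(dos, e):
    """Stand-in for the LEAPR step mapping a DOS to S(alpha,beta) data:
    a few spectral moments keep the covariance machinery visible."""
    return np.array([np.sum(dos * e), np.sum(dos * e**2), np.sum(dos / e)])

# Monte Carlo over perturbed spectra: multiplicative lognormal shape noise,
# re-normalized so each sample remains a valid probability density
n = 2000
samples = np.empty((n, 3))
for i in range(n):
    pert = dos * np.exp(rng.normal(0.0, 0.05, size=dos.shape))
    pert /= pert.sum()
    samples[i] = spectral_moments(pert, e)

cov = np.cov(samples, rowvar=False)      # covariance of the derived quantities
corr = cov / np.sqrt(np.outer(np.diag(cov), np.diag(cov)))
print(cov.shape)
print(np.round(corr, 3))
```

In the actual methodology each sample would pass the perturbed DOS through LEAPR to produce a full S(α, β) matrix, and the covariance would be accumulated over the flattened matrix elements rather than over three moments.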

  20. ENDF/B-VII.0: Next Generation Evaluated Nuclear Data Library for Nuclear Science and Technology

    NASA Astrophysics Data System (ADS)

    Chadwick, M. B.; Obložinský, P.; Herman, M.; Greene, N. M.; McKnight, R. D.; Smith, D. L.; Young, P. G.; MacFarlane, R. E.; Hale, G. M.; Frankle, S. C.; Kahler, A. C.; Kawano, T.; Little, R. C.; Madland, D. G.; Moller, P.; Mosteller, R. D.; Page, P. R.; Talou, P.; Trellue, H.; White, M. C.; Wilson, W. B.; Arcilla, R.; Dunford, C. L.; Mughabghab, S. F.; Pritychenko, B.; Rochman, D.; Sonzogni, A. A.; Lubitz, C. R.; Trumbull, T. H.; Weinman, J. P.; Brown, D. A.; Cullen, D. E.; Heinrichs, D. P.; McNabb, D. P.; Derrien, H.; Dunn, M. E.; Larson, N. M.; Leal, L. C.; Carlson, A. D.; Block, R. C.; Briggs, J. B.; Cheng, E. T.; Huria, H. C.; Zerkle, M. L.; Kozier, K. S.; Courcelle, A.; Pronyaev, V.; van der Marck, S. C.

    2006-12-01

    We describe the next generation general purpose Evaluated Nuclear Data File, ENDF/B-VII.0, of recommended nuclear data for advanced nuclear science and technology applications. The library, released by the U.S. Cross Section Evaluation Working Group (CSEWG) in December 2006, contains data primarily for reactions with incident neutrons, protons, and photons on almost 400 isotopes, based on experimental data and theory predictions. The principal advances over the previous ENDF/B-VI library are the following: (1) New cross sections for U, Pu, Th, Np and Am actinide isotopes, with improved performance in integral validation criticality and neutron transmission benchmark tests; (2) More precise standard cross sections for neutron reactions on H, 6Li, 10B, Au and for 235,238U fission, developed by a collaboration with the IAEA and the OECD/NEA Working Party on Evaluation Cooperation (WPEC); (3) Improved thermal neutron scattering; (4) An extensive set of neutron cross sections on fission products developed through a WPEC collaboration; (5) A large suite of photonuclear reactions; (6) Extension of many neutron- and proton-induced evaluations up to 150 MeV; (7) Many new light nucleus neutron and proton reactions; (8) Post-fission beta-delayed photon decay spectra; (9) New radioactive decay data; (10) New methods for uncertainties and covariances, together with covariance evaluations for some sample cases; and (11) New actinide fission energy deposition. The paper provides an overview of this library, consisting of 14 sublibraries in the same ENDF-6 format as the earlier ENDF/B-VI library. We describe each of the 14 sublibraries, focusing on neutron reactions. 
Extensive validation, using radiation transport codes to simulate measured critical assemblies, shows major improvements: (a) The long-standing underprediction of low enriched uranium thermal assemblies is removed; (b) The 238U and 208Pb reflector biases in fast systems are largely removed; (c) ENDF/B-VI.8 good agreement for simulations of thermal high-enriched uranium assemblies is preserved; (d) The underprediction of fast criticality of 233,235U and 239Pu assemblies is removed; and (e) The intermediate spectrum critical assemblies are predicted more accurately. We anticipate that the new library will play an important role in nuclear technology applications, including transport simulations supporting national security, nonproliferation, advanced reactor and fuel cycle concepts, criticality safety, fusion, medicine, space applications, nuclear astrophysics, and nuclear physics facility design. The ENDF/B-VII.0 library is archived at the National Nuclear Data Center, BNL, and can be retrieved from www.nndc.bnl.gov.

  1. General Guidelines on Criteria for Adoption or Rejection of Evaluated Libraries and Data by the Nuclear Data Team

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Neudecker, Denise; Conlin, Jeremy Lloyd; Gray, Mark Girard

    This memo contains general guidelines on the documentation and tools that need to be in place, as well as the format and data testing requirements, for evaluated nuclear data sets or entire libraries to be adopted by the nuclear data team. Additional requirements beyond this memo might apply for specific nuclear data observables. These guidelines were established based on discussions between J.L. Conlin, M.G. Gray, A.P. McCartney, D. Neudecker, D.K. Parsons and M.C. White.

  2. TALYS/TENDL verification and validation processes: Outcomes and recommendations

    NASA Astrophysics Data System (ADS)

    Fleming, Michael; Sublet, Jean-Christophe; Gilbert, Mark R.; Koning, Arjan; Rochman, Dimitri

    2017-09-01

    The TALYS-generated Evaluated Nuclear Data Libraries (TENDL) provide truly general-purpose nuclear data files assembled from the outputs of the T6 nuclear model code system for direct use in both basic physics and engineering applications. The most recent TENDL-2015 version is based on both default and adjusted parameters of the most recent TALYS, TAFIS, TANES, TARES, TEFAL and TASMAN codes, wrapped into a Total Monte Carlo loop for uncertainty quantification. TENDL-2015 contains complete neutron-incident evaluations for all target nuclides with Z ≤ 116 and half-lives longer than 1 second (2809 isotopes with 544 isomeric states), up to 200 MeV, with covariances and all reaction daughter products including isomers with half-lives greater than 100 milliseconds. With the added High Fidelity Resonance (HFR) approach, all resonances are unique and follow statistical rules. The validation of the TENDL-2014/2015 libraries against standard, evaluated, microscopic and integral cross sections has been performed against a newly compiled UKAEA database of thermal, resonance integral, Maxwellian-averaged, 14 MeV and various accelerator-driven neutron source spectra. This database was assembled using the most up-to-date, internationally recognised data sources, including the Atlas of Neutron Resonances, CRC, evaluated EXFOR, activation databases, fusion, fission and MACS. Excellent agreement was found, with a small set of errors identified within both the reference databases and the TENDL-2014 predictions.

  3. Evaluated cross-section libraries and kerma factors for neutrons up to 100 MeV on {sup 12}C

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chadwick, M.B.; Blann, M.; Cox, L.

    1995-04-11

    A program is being carried out at Lawrence Livermore National Laboratory to develop high-energy evaluated nuclear data libraries for use in Monte Carlo simulations of cancer radiation therapy. In this report we describe evaluated cross sections and kerma factors for neutrons with incident energies up to 100 MeV on {sup 12}C. The aim of this effort is to combine advanced nuclear physics modeling methods with new experimental measurements to generate the cross section libraries needed for an accurate simulation of dose deposition in fast neutron therapy. The evaluated libraries are based mainly on nuclear model calculations, benchmarked to experimental measurements where they exist. We use the GNASH code system, which includes Hauser-Feshbach, preequilibrium, and direct reaction mechanisms. The libraries tabulate elastic and nonelastic cross sections, angle-energy correlated production spectra for light ejectiles with A ≤ 4, and the kinetic energies given to light ejectiles and heavy recoil fragments. The major steps involved in this effort are: (1) development and validation of nuclear models for incident energies up to 100 MeV; (2) collation of experimental measurements, including new results from Louvain-la-Neuve and Los Alamos; (3) extension of the Livermore ENDL formats for representing high-energy data; (4) calculation and evaluation of nuclear data; and (5) validation of the libraries. We describe the evaluations in detail, with particular emphasis on our new high-energy modeling developments. Our evaluations agree well with experimental measurements of integrated and differential cross sections. We compare our results with the recent ENDF/B-VI evaluation, which extends up to 32 MeV.
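Kerma factors of the kind tabulated in these libraries are used by folding them with a neutron fluence spectrum, K = ∫ φ(E) k(E) dE. A short Python sketch with invented fluence and kerma-factor values (orders of magnitude only, not the evaluated 12C data of the report):

```python
import numpy as np

# Hypothetical neutron fluence spectrum (n/cm^2 per MeV) and kerma factors
# (Gy cm^2) on a coarse energy grid up to 100 MeV
E     = np.array([1.0, 10.0, 30.0, 60.0, 100.0])                  # MeV
phi   = np.array([2e8, 1e8, 5e7, 2e7, 1e7])                       # fluence per MeV
kerma = np.array([2.0e-11, 5.0e-11, 6.0e-11, 6.5e-11, 7.0e-11])   # Gy cm^2

def total_kerma(E, phi, k):
    """Kerma = Int[ phi(E) * k(E) dE ], trapezoidal over the tabulated grid."""
    y = phi * k
    return np.sum(0.5 * (y[1:] + y[:-1]) * np.diff(E))

print(f"{total_kerma(E, phi, kerma):.3e} Gy")
```

Under charged-particle equilibrium this kerma approximates the absorbed dose, which is why accurate kerma factors are central to treatment-planning simulations in fast neutron therapy.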

  4. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sin, M.; Capote, R.; Herman, M. W.

    Comprehensive calculations of cross sections for neutron-induced reactions on 232–237U targets are performed in this paper in the 10 keV–30 MeV incident energy range with the code EMPIRE-3.2 Malta. The advanced modelling and consistent calculation scheme are aimed at improving our knowledge of the neutron scattering and emission cross sections, and at assessing the consistency of available evaluated libraries for light uranium isotopes. The reaction model considers a dispersive optical potential (RIPL 2408) that couples from five (even targets) to nine (odd targets) levels of the ground-state rotational band, and a triple-humped fission barrier with absorption in the wells described within the optical model for fission. A modified Lorentzian model (MLO) of the radiative strength function and Enhanced Generalized Superfluid Model nuclear level densities are used in Hauser-Feshbach calculations of the compound-nuclear decay that include width fluctuation corrections. The starting values for the model parameters are retrieved from RIPL. Excellent agreement with available experimental data for neutron emission and fission is achieved, giving confidence that the quantities for which there is no experimental information are also accurately predicted. Finally, deficiencies in existing evaluated libraries are highlighted.

  5. Validation of MCNP6 Version 1.0 with the ENDF/B-VII.1 Cross Section Library for Uranium Metal, Oxide, and Solution Systems on the High Performance Computing Platform Moonlight

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chapman, Bryan Scott; MacQuigg, Michael Robert; Wysong, Andrew Russell

    In this document, the code MCNP is validated with ENDF/B-VII.1 cross section data under the purview of ANSI/ANS-8.24-2007, for use with uranium systems. MCNP is a computer code based on Monte Carlo transport methods. While MCNP has wide-ranging capability in nuclear transport simulation, this validation is limited to the functionality related to neutron transport and the calculation of criticality parameters such as keff.

  6. Nuclear data activities at the n_TOF facility at CERN

    NASA Astrophysics Data System (ADS)

    Gunsing, F.; Aberle, O.; Andrzejewski, J.; Audouin, L.; Bécares, V.; Bacak, M.; Balibrea-Correa, J.; Barbagallo, M.; Barros, S.; Bečvář, F.; Beinrucker, C.; Belloni, F.; Berthoumieux, E.; Billowes, J.; Bosnar, D.; Brugger, M.; Caamaño, M.; Calviño, F.; Calviani, M.; Cano-Ott, D.; Cardella, R.; Casanovas, A.; Castelluccio, D. M.; Cerutti, F.; Chen, Y. H.; Chiaveri, E.; Colonna, N.; Cortés-Giraldo, M. A.; Cortés, G.; Cosentino, L.; Damone, L. A.; Deo, K.; Diakaki, M.; Domingo-Pardo, C.; Dressler, R.; Dupont, E.; Durán, I.; Fernández-Domínguez, B.; Ferrari, A.; Ferreira, P.; Finocchiaro, P.; Frost, R. J. W.; Furman, V.; Ganesan, S.; García, A. R.; Gawlik, A.; Gheorghe, I.; Glodariu, T.; Gonçalves, I. F.; González, E.; Goverdovski, A.; Griesmayer, E.; Guerrero, C.; Göbel, K.; Harada, H.; Heftrich, T.; Heinitz, S.; Hernández-Prieto, A.; Heyse, J.; Jenkins, D. G.; Jericha, E.; Käppeler, F.; Kadi, Y.; Katabuchi, T.; Kavrigin, P.; Ketlerov, V.; Khryachkov, V.; Kimura, A.; Kivel, N.; Kokkoris, M.; Krtička, M.; Leal-Cidoncha, E.; Lederer, C.; Leeb, H.; Lerendegui, J.; Licata, M.; Lo Meo, S.; Lonsdale, S. J.; Losito, R.; Macina, D.; Marganiec, J.; Martínez, T.; Masi, A.; Massimi, C.; Mastinu, P.; Mastromarco, M.; Matteucci, F.; Maugeri, E. A.; Mazzone, A.; Mendoza, E.; Mengoni, A.; Milazzo, P. M.; Mingrone, F.; Mirea, M.; Montesano, S.; Musumarra, A.; Nolte, R.; Oprea, A.; Palomo-Pinto, F. R.; Paradela, C.; Patronis, N.; Pavlik, A.; Perkowski, J.; Porras, I.; Praena, J.; Quesada, J. M.; Rajeev, K.; Rauscher, T.; Reifarth, R.; Riego-Perez, A.; Robles, M.; Rout, P.; Radeck, D.; Rubbia, C.; Ryan, J. A.; Sabaté-Gilarte, M.; Saxena, A.; Schillebeeckx, P.; Schmidt, S.; Schumann, D.; Sedyshev, P.; Smith, A. G.; Stamatopoulos, A.; Suryanarayana, S. V.; Tagliente, G.; Tain, J. 
L.; Tarifeño-Saldivia, A.; Tarrío, D.; Tassan-Got, L.; Tsinganis, A.; Valenta, S.; Vannini, G.; Variale, V.; Vaz, P.; Ventura, A.; Vlachoudis, V.; Vlastou, R.; Wallner, A.; Warren, S.; Weigand, M.; Weiss, C.; Wolf, C.; Woods, P. J.; Wright, T.; Žugec, P.

    2016-10-01

    Nuclear data in general, and neutron-induced reaction cross sections in particular, are important for a wide variety of research fields. They play a key role in the safety and criticality assessment of nuclear technology, not only for existing power reactors but also for radiation dosimetry, medical applications, the transmutation of nuclear waste, accelerator-driven systems, fuel cycle investigations and future reactor systems such as Generation IV. Applications of nuclear data are also related to research fields such as the study of nuclear level densities and stellar nucleosynthesis. Simulations and calculations of nuclear technology applications largely rely on evaluated nuclear data libraries. The evaluations in these libraries are based both on experimental data and theoretical models. Experimental nuclear reaction data are compiled on a worldwide basis by the international network of Nuclear Reaction Data Centres (NRDC) in the EXFOR database. The EXFOR database forms an important link between nuclear data measurements and the evaluated data libraries. CERN's neutron time-of-flight facility n_TOF has produced a considerable amount of experimental data since becoming fully operational with the start of the scientific measurement programme in 2001. While for a long period a single measurement station (EAR1), located at 185 m from the neutron production target, was available, the construction of a second beam line at 20 m (EAR2) in 2014 has substantially increased the measurement capabilities of the facility. An outline of the experimental nuclear data activities at CERN's neutron time-of-flight facility n_TOF is presented.

  7. Experimental validation of depletion calculations with VESTA 2.1.5 using JEFF-3.2

    NASA Astrophysics Data System (ADS)

    Haeck, Wim; Ichou, Raphaëlle

    2017-09-01

    The removal of decay heat is a significant safety concern in nuclear engineering, both for the operation of a nuclear reactor in normal and accident conditions and for intermediate- and long-term waste storage facilities. The correct evaluation of the decay heat produced by an irradiated material requires first of all the calculation of the composition of the irradiated material by depletion codes such as VESTA 2.1, currently under development at IRSN in France. A set of PWR assembly decay heat measurements performed by the Swedish Central Interim Storage Facility (CLAB) located in Oskarshamn (Sweden) has been calculated using different nuclear data libraries: ENDF/B-VII.0, JEFF-3.1, JEFF-3.2 and JEFF-3.3T1. Using these nuclear data libraries, VESTA 2.1 calculates the assembly decay heat within 4% of the measured decay heat for almost all cases. On average, the ENDF/B-VII.0 calculated decay heat values show a systematic underestimation of only 0.5%. When using the JEFF-3.1 library, this becomes a systematic underestimation of about 2%. Switching to the JEFF-3.2 library reduces this systematic underestimation slightly (to about 1.5%). The changes made in the JEFF-3.3T1 beta library appear to be overcorrecting, as the systematic underestimation is transformed into a systematic overestimation of about 1.5%.
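
    The systematic under- or overestimation quoted above is simply the average calculated-to-measured deviation over the assembly set. A minimal sketch, with made-up numbers in place of the CLAB measurements:

```python
def systematic_bias(calculated, measured):
    # Mean relative deviation of calculation from experiment; a negative
    # value indicates a systematic underestimation of the decay heat.
    ratios = [c / e - 1.0 for c, e in zip(calculated, measured)]
    return sum(ratios) / len(ratios)

# Illustrative decay heat values in watts, not the actual CLAB data.
measured = [500.0, 620.0, 710.0]
calculated = [495.0, 612.0, 703.0]
bias = systematic_bias(calculated, measured)  # about -1%: underestimation
```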

  8. Radiative neutron capture cross section from 236U

    NASA Astrophysics Data System (ADS)

    Baramsai, B.; Jandel, M.; Bredeweg, T. A.; Bond, E. M.; Roman, A. R.; Rusev, G.; Walker, C. L.; Couture, A.; Mosby, S.; O'Donnell, J. M.; Ullmann, J. L.; Kawano, T.

    2017-08-01

    The 236U(n,γ) reaction cross section has been measured over the incident neutron energy range from 10 eV to 800 keV using the Detector for Advanced Neutron Capture Experiments (DANCE) γ-ray calorimeter at the Los Alamos Neutron Science Center. The cross section was determined with the ratio method, a technique that uses the 235U(n,f) reaction as a reference. The results of the experiment are reported in the resolved and unresolved resonance energy regions. Individual neutron resonance parameters were obtained below 1 keV incident energy using the R-matrix code SAMMY. The cross section in the unresolved resonance region is determined with improved experimental uncertainty. It agrees with both the ENDF/B-VII.1 and JEFF-3.2 nuclear data libraries. The results above 10 keV agree better with the JEFF-3.2 library.
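
    A rough sketch of the ratio method described above (function and variable names are illustrative, not taken from the DANCE analysis): since both samples see the same neutron flux, the flux cancels, and the unknown capture cross section follows from the reference 235U(n,f) cross section scaled by count-rate, efficiency and atom-number ratios.

```python
def capture_xs_from_ratio(counts_capture, counts_fission, sigma_fission_ref,
                          atoms_capture, atoms_fission,
                          eff_capture, eff_fission):
    # The incident flux is common to both samples and cancels, so the
    # unknown capture cross section is the reference fission cross section
    # scaled by the efficiency-corrected count ratio and the atom ratio.
    rate_ratio = (counts_capture / eff_capture) / (counts_fission / eff_fission)
    atom_ratio = atoms_fission / atoms_capture
    return sigma_fission_ref * rate_ratio * atom_ratio

# Example with round illustrative numbers (barns, counts, atoms):
sigma_capture = capture_xs_from_ratio(
    counts_capture=100.0, counts_fission=200.0, sigma_fission_ref=10.0,
    atoms_capture=1.0e20, atoms_fission=2.0e20,
    eff_capture=0.5, eff_fission=0.8)
```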

  9. National Security in the Nuclear Age: Public Library Proposal and Booklist. May 1987 Update.

    ERIC Educational Resources Information Center

    Dane, Ernest B.

    To increase public understanding of national security issues, this document proposes that a balanced and up-to-date collection of books and other materials on national security in the nuclear age be included in all U.S. public libraries. The proposal suggests that the books be grouped together on an identified shelf. Selection criteria for the…

  10. Release of the ENDF/B-VII.1 Evaluated Nuclear Data File

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Brown, David

    2012-06-30

    The Cross Section Evaluation Working Group (CSEWG) released the ENDF/B-VII.1 library on December 22, 2011. The ENDF/B-VII.1 library is CSEWG's latest recommended evaluated nuclear data file for use in nuclear science and technology applications, and incorporates advances made in the five years since the release of ENDF/B-VII.0, including: many new evaluations in the neutron sublibrary (423 in all, over 190 of which contain covariances), new fission product yields, and a greatly improved decay data sublibrary. This summary barely touches on the five years' worth of advances present in the ENDF/B-VII.1 library. We expect that these changes will lead to improved integral performance in reactors and other applications. Furthermore, the expansion of covariance data in this release will allow for better uncertainty quantification, reducing design margins and costs. The ENDF library is an ongoing and evolving effort. Currently, the ENDF data community is embarking on several parallel efforts to improve library management: (1) the adoption of a continuous integration system to provide evaluators 'instant' feedback on the quality of their evaluations and to provide data users with working 'beta' quality libraries in between major releases; (2) the transition to a new hierarchical data format, the Generalized Nuclear Data (GND) format, which we expect to enable new kinds of evaluated data that cannot be accommodated in the legacy ENDF format; and (3) the development of data assimilation and uncertainty propagation techniques to enable the consistent use of integral experimental data in the evaluation process.

  11. ENDF/B-VIII.0: The 8th Major Release of the Nuclear Reaction Data Library with CIELO-project Cross Sections, New Standards and Thermal Scattering Data

    NASA Astrophysics Data System (ADS)

    Brown, D. A.; Chadwick, M. B.; Capote, R.; Kahler, A. C.; Trkov, A.; Herman, M. W.; Sonzogni, A. A.; Danon, Y.; Carlson, A. D.; Dunn, M.; Smith, D. L.; Hale, G. M.; Arbanas, G.; Arcilla, R.; Bates, C. R.; Beck, B.; Becker, B.; Brown, F.; Casperson, R. J.; Conlin, J.; Cullen, D. E.; Descalle, M.-A.; Firestone, R.; Gaines, T.; Guber, K. H.; Hawari, A. I.; Holmes, J.; Johnson, T. D.; Kawano, T.; Kiedrowski, B. C.; Koning, A. J.; Kopecky, S.; Leal, L.; Lestone, J. P.; Lubitz, C.; Márquez Damián, J. I.; Mattoon, C. M.; McCutchan, E. A.; Mughabghab, S.; Navratil, P.; Neudecker, D.; Nobre, G. P. A.; Noguere, G.; Paris, M.; Pigni, M. T.; Plompen, A. J.; Pritychenko, B.; Pronyaev, V. G.; Roubtsov, D.; Rochman, D.; Romano, P.; Schillebeeckx, P.; Simakov, S.; Sin, M.; Sirakov, I.; Sleaford, B.; Sobes, V.; Soukhovitskii, E. S.; Stetcu, I.; Talou, P.; Thompson, I.; van der Marck, S.; Welser-Sherrill, L.; Wiarda, D.; White, M.; Wormald, J. L.; Wright, R. Q.; Zerkle, M.; Žerovnik, G.; Zhu, Y.

    2018-02-01

    We describe the new ENDF/B-VIII.0 evaluated nuclear reaction data library. ENDF/B-VIII.0 fully incorporates the new IAEA standards, includes improved thermal neutron scattering data and uses new evaluated data from the CIELO project for neutron reactions on 1H, 16O, 56Fe, 235U, 238U and 239Pu described in companion papers in the present issue of Nuclear Data Sheets. The evaluations benefit from recent experimental data obtained in the U.S. and Europe, and improvements in theory and simulation. Notable advances include updated evaluated data for light nuclei, structural materials, actinides, fission energy release, prompt fission neutron and γ-ray spectra, thermal neutron scattering data, and charged-particle reactions. Integral validation testing is shown for a wide range of criticality, reaction rate, and neutron transmission benchmarks. In general, integral validation performance of the library is improved relative to the previous ENDF/B-VII.1 library.

  12. Examination of total cross section resonance structure of niobium and silicon in neutron transmission experiments

    NASA Astrophysics Data System (ADS)

    Andrianova, Olga; Lomakov, Gleb; Manturov, Gennady

    2017-09-01

    Neutron transmission experiments are one of the main sources of information about the neutron cross section resonance structure and the self-shielding effect. Such data for niobium and silicon nuclides in the energy range 7 keV to 3 MeV can be obtained from low-resolution transmission measurements performed earlier in Russia (with samples of 0.027 to 0.871 atom/barn for niobium and 0.076 to 1.803 atom/barn for silicon). A significant calculation-to-experiment discrepancy in the energy ranges 100 to 600 keV and 300 to 800 keV for niobium and silicon, respectively, was found using the evaluated nuclear data library ROSFOND. The EVPAR code was used to estimate the average resonance parameters in the energy range 7 to 600 keV for niobium. For silicon, a stochastic optimization method was used to modify the resolved resonance parameters in the energy range 300 to 800 keV. The improved ROSFOND evaluated nuclear data files were tested in calculations of ICSBEP integral benchmark experiments.

  13. National Security in the Nuclear Age. A Proposed Booklist and Public Education Ideas for Libraries.

    ERIC Educational Resources Information Center

    Dane, Ernest B.

    A bibliography on national security in the nuclear age is divided into three sections. The first section describes a proposal calling for the compilation of a balanced and up-to-date collection of books and other materials on this issue to be included in all U.S. public libraries. Also discussed are selection criteria for the book list, project…

  14. ENDF/B-VII.1 Nuclear Data for Science and Technology: Cross Sections, Covariances, Fission Product Yields and Decay Data

    NASA Astrophysics Data System (ADS)

    Chadwick, M. B.; Herman, M.; Obložinský, P.; Dunn, M. E.; Danon, Y.; Kahler, A. C.; Smith, D. L.; Pritychenko, B.; Arbanas, G.; Arcilla, R.; Brewer, R.; Brown, D. A.; Capote, R.; Carlson, A. D.; Cho, Y. S.; Derrien, H.; Guber, K.; Hale, G. M.; Hoblit, S.; Holloway, S.; Johnson, T. D.; Kawano, T.; Kiedrowski, B. C.; Kim, H.; Kunieda, S.; Larson, N. M.; Leal, L.; Lestone, J. P.; Little, R. C.; McCutchan, E. A.; MacFarlane, R. E.; MacInnes, M.; Mattoon, C. M.; McKnight, R. D.; Mughabghab, S. F.; Nobre, G. P. A.; Palmiotti, G.; Palumbo, A.; Pigni, M. T.; Pronyaev, V. G.; Sayer, R. O.; Sonzogni, A. A.; Summers, N. C.; Talou, P.; Thompson, I. J.; Trkov, A.; Vogt, R. L.; van der Marck, S. C.; Wallner, A.; White, M. C.; Wiarda, D.; Young, P. G.

    2011-12-01

    The ENDF/B-VII.1 library is our latest recommended evaluated nuclear data file for use in nuclear science and technology applications, and incorporates advances made in the five years since the release of ENDF/B-VII.0. These advances focus on neutron cross sections, covariances, fission product yields and decay data, and represent work by the US Cross Section Evaluation Working Group (CSEWG) in nuclear data evaluation that utilizes developments in nuclear theory, modeling, simulation, and experiment. The principal advances in the new library are: (1) An increase in the breadth of neutron reaction cross section coverage, extending from 393 nuclides to 423 nuclides; (2) Covariance uncertainty data for 190 of the most important nuclides, as documented in companion papers in this edition; (3) R-matrix analyses of neutron reactions on light nuclei, including isotopes of He, Li, and Be; (4) Resonance parameter analyses at lower energies and statistical high energy reactions for isotopes of Cl, K, Ti, V, Mn, Cr, Ni, Zr and W; (5) Modifications to thermal neutron reactions on fission products (isotopes of Mo, Tc, Rh, Ag, Cs, Nd, Sm, Eu) and neutron absorber materials (Cd, Gd); (6) Improved minor actinide evaluations for isotopes of U, Np, Pu, and Am (we are not making changes to the major actinides 235,238U and 239Pu at this point, except for delayed neutron data and covariances, and instead we intend to update them after a further period of research in experiment and theory), and our adoption of JENDL-4.0 evaluations for isotopes of Cm, Bk, Cf, Es, Fm, and some other minor actinides; (7) Fission energy release evaluations; (8) Fission product yield advances for fission-spectrum neutrons and 14 MeV neutrons incident on 239Pu; and (9) A new decay data sublibrary. 
Integral validation testing of the ENDF/B-VII.1 library is provided for a variety of quantities: For nuclear criticality, the VII.1 library maintains the generally-good performance seen for VII.0 for a wide range of MCNP simulations of criticality benchmarks, with improved performance coming from new structural material evaluations, especially for Ti, Mn, Cr, Zr and W. For Be we see some improvements although the fast assembly data appear to be mutually inconsistent. Actinide cross section updates are also assessed through comparisons of fission and capture reaction rate measurements in critical assemblies and fast reactors, and improvements are evident. Maxwellian-averaged capture cross sections at 30 keV are also provided for astrophysics applications. We describe the cross section evaluations that have been updated for ENDF/B-VII.1 and the measured data and calculations that motivated the changes, and therefore this paper augments the ENDF/B-VII.0 publication [M. B. Chadwick, P. Obložinský, M. Herman, N. M. Greene, R. D. McKnight, D. L. Smith, P. G. Young, R. E. MacFarlane, G. M. Hale, S. C. Frankle, A. C. Kahler, T. Kawano, R. C. Little, D. G. Madland, P. Moller, R. D. Mosteller, P. R. Page, P. Talou, H. Trellue, M. C. White, W. B. Wilson, R. Arcilla, C. L. Dunford, S. F. Mughabghab, B. Pritychenko, D. Rochman, A. A. Sonzogni, C. R. Lubitz, T. H. Trumbull, J. P. Weinman, D. A. Br, D. E. Cullen, D. P. Heinrichs, D. P. McNabb, H. Derrien, M. E. Dunn, N. M. Larson, L. C. Leal, A. D. Carlson, R. C. Block, J. B. Briggs, E. T. Cheng, H. C. Huria, M. L. Zerkle, K. S. Kozier, A. Courcelle, V. Pronyaev, and S. C. van der Marck, "ENDF/B-VII.0: Next Generation Evaluated Nuclear Data Library for Nuclear Science and Technology," Nuclear Data Sheets 107, 2931 (2006)].

  15. Modelling Neutron-induced Reactions on 232–237U from 10 keV up to 30 MeV

    DOE PAGES

    Sin, M.; Capote, R.; Herman, M. W.; ...

    2017-01-17

    Comprehensive calculations of cross sections for neutron-induced reactions on 232–237U targets are performed in this paper in the 10 keV–30 MeV incident energy range with the code EMPIRE-3.2 Malta. The advanced modelling and consistent calculation scheme are aimed at improving our knowledge of the neutron scattering and emission cross sections, and at assessing the consistency of available evaluated libraries for light uranium isotopes. The reaction model considers a dispersive optical potential (RIPL 2408) that couples from five (even targets) to nine (odd targets) levels of the ground-state rotational band, and a triple-humped fission barrier with absorption in the wells described within the optical model for fission. A modified Lorentzian model (MLO) of the radiative strength function and Enhanced Generalized Superfluid Model nuclear level densities are used in Hauser-Feshbach calculations of the compound-nuclear decay that include width fluctuation corrections. The starting values for the model parameters are retrieved from RIPL. Excellent agreement with available experimental data for neutron emission and fission is achieved, giving confidence that the quantities for which there is no experimental information are also accurately predicted. Finally, deficiencies in existing evaluated libraries are highlighted.

  16. JEFF-3.1, ENDF/B-VII and JENDL-3.3 Critical Assemblies Benchmarking With the Monte Carlo Code TRIPOLI

    NASA Astrophysics Data System (ADS)

    Sublet, Jean-Christophe

    2008-02-01

    ENDF/B-VII.0, the first release of the ENDF/B-VII nuclear data library, was formally released in December 2006. Prior to this, the European JEFF-3.1 nuclear data library was distributed in April 2005, while the Japanese JENDL-3.3 library has been available since 2002. The recent releases of these neutron transport libraries and special purpose files, the updates of the processing tools, and the significant progress in computer power today allow far better integration of lean Monte Carlo codes with pointwise libraries, leading to enhanced benchmarking studies. A TRIPOLI-4.4 critical assembly suite has been set up as a collection of 86 benchmarks taken principally from the International Handbook of Evaluated Criticality Safety Benchmark Experiments (2006 Edition). It contains cases for a variety of U and Pu fuels and systems, ranging from fast to deep thermal solutions and assemblies. It covers cases with a variety of moderators, reflectors, absorbers, spectra and geometries. The results presented show that while the most recent library, ENDF/B-VII.0, which benefited from the timely development of JENDL-3.3 and JEFF-3.1, produces better overall results, they also clearly suggest that improvements are still needed. This is true in particular in light water reactor applications for thermal and epithermal plutonium data for all libraries, and for fast uranium data for JEFF-3.1 and JENDL-3.3. Other domains in which Monte Carlo codes are used, such as astrophysics, fusion, high-energy physics or medicine, and radiation transport in general, also benefit notably from such enhanced libraries. This is particularly noticeable in terms of the number of isotopes and materials available, the overall quality of the data, and the much broader energy range for which evaluated (as opposed to modeled) data are available, spanning from meV to hundreds of MeV.
In pointing out the impact of the different nuclear data at both the library and isotopic levels, one cannot help noticing the importance of, and differences between, the compensating effects that result from their individual usage. Library differences are still important but tend to diminish thanks to the ever-increasing and beneficial worldwide collaboration in the field of nuclear data measurement and evaluation.

  17. ENDF/B-VIII.0: The 8 th Major Release of the Nuclear Reaction Data Library with CIELO-project Cross Sections, New Standards and Thermal Scattering Data

    DOE PAGES

    Brown, D. A.; Chadwick, M. B.; Capote, R.; ...

    2018-02-01

    We describe the new ENDF/B-VIII.0 evaluated nuclear reaction data library. ENDF/B-VIII.0 fully incorporates the new IAEA standards, includes improved thermal neutron scattering data and uses new evaluated data from the CIELO project for neutron reactions on 1H, 16O, 56Fe, 235U, 238U and 239Pu described in companion papers in the present issue of Nuclear Data Sheets. The evaluations benefit from recent experimental data obtained in the U.S. and Europe, and improvements in theory and simulation. Notable advances include updated evaluated data for light nuclei, structural materials, actinides, fission energy release, prompt fission neutron and γ-ray spectra, thermal neutron scattering data, and charged-particle reactions. Integral validation testing is shown for a wide range of criticality, reaction rate, and neutron transmission benchmarks. In general, integral validation performance of the library is improved relative to the previous ENDF/B-VII.1 library.

  18. ENDF/B-VIII.0: The 8 th Major Release of the Nuclear Reaction Data Library with CIELO-project Cross Sections, New Standards and Thermal Scattering Data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Brown, D. A.; Chadwick, M. B.; Capote, R.

    We describe the new ENDF/B-VIII.0 evaluated nuclear reaction data library. ENDF/B-VIII.0 fully incorporates the new IAEA standards, includes improved thermal neutron scattering data and uses new evaluated data from the CIELO project for neutron reactions on 1H, 16O, 56Fe, 235U, 238U and 239Pu described in companion papers in the present issue of Nuclear Data Sheets. The evaluations benefit from recent experimental data obtained in the U.S. and Europe, and improvements in theory and simulation. Notable advances include updated evaluated data for light nuclei, structural materials, actinides, fission energy release, prompt fission neutron and γ-ray spectra, thermal neutron scattering data, and charged-particle reactions. Integral validation testing is shown for a wide range of criticality, reaction rate, and neutron transmission benchmarks. In general, integral validation performance of the library is improved relative to the previous ENDF/B-VII.1 library.

  19. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pritychenko, B., E-mail: pritychenko@bnl.go; Mughabghab, S.F.; Sonzogni, A.A.

    We have calculated the Maxwellian-averaged cross sections and astrophysical reaction rates of the stellar nucleosynthesis reactions (n,γ), (n,fission), (n,p), (n,α), and (n,2n) using the ENDF/B-VII.0, JEFF-3.1, JENDL-3.3, and ENDF/B-VI.8 evaluated nuclear reaction data libraries. These four major nuclear reaction libraries were processed under the same conditions for Maxwellian temperatures (kT) ranging from 1 keV to 1 MeV. We compare our current calculations for the s-process nucleosynthesis nuclei with previous data sets and discuss the differences between them and the implications for nuclear astrophysics.
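
    A Maxwellian-averaged cross section (MACS) at temperature kT is ⟨σ⟩ = (2/√π)(kT)⁻² ∫ σ(E) E e^(−E/kT) dE. A small numerical sketch, checked against the analytic result for a 1/v cross section, for which the MACS equals σ evaluated at E = kT:

```python
import math

def macs(sigma, kT, n=20000, emax_factor=30.0):
    # <sigma> = (2/sqrt(pi)) * (kT)**-2 * integral sigma(E)*E*exp(-E/kT) dE,
    # evaluated with a simple trapezoidal rule on [0, emax_factor*kT].
    h = emax_factor * kT / n
    total = 0.0
    for i in range(n + 1):
        e = i * h
        w = 0.5 if i in (0, n) else 1.0
        total += w * sigma(e) * e * math.exp(-e / kT)
    return (2.0 / math.sqrt(math.pi)) * total * h / kT ** 2

# Analytic check: for a 1/v cross section sigma(E) = sigma_th*sqrt(E_th/E),
# the MACS is sigma_th*sqrt(E_th/kT), i.e. sigma evaluated at E = kT.
sigma_th, e_th, kT = 100.0, 0.0253e-6, 0.030  # barns, MeV, MeV (kT = 30 keV)
one_over_v = lambda e: sigma_th * math.sqrt(e_th / e) if e > 0.0 else 0.0
approx = macs(one_over_v, kT)
exact = sigma_th * math.sqrt(e_th / kT)
```

    Library processing codes perform the same average over pointwise evaluated cross sections; the quadrature above is merely the simplest workable version.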

  20. Nuclear data libraries assessment for modelling a small fluoride salt-cooled, high-temperature reactor

    NASA Astrophysics Data System (ADS)

    Mohamed, Hassan; Lindley, Benjamin; Parks, Geoffrey

    2017-01-01

    Nuclear data consist of measured or evaluated probabilities of various fundamental physical interactions involving the nuclei of atoms and their properties. Most fluoride salt-cooled high-temperature reactor (FHR) studies that were reviewed do not give detailed information on the data libraries used in their assessments. Therefore, the main objective of this data library comparison study is to investigate whether there are any significant discrepancies between the main data libraries, namely ENDF/B-VII, JEFF-3.1 and JEF-2.2. Knowing the discrepancies, and especially their magnitude, is important and relevant for readers in deciding whether further caution is necessary in any future verification or validation processes when modelling an FHR. The study is performed using AMEC's reactor physics software tool, WIMS. The WIMS calculation is simply a 2-D infinite-lattice fuel assembly calculation. The comparison between the data libraries in terms of the infinite multiplication factor, kinf, and the pin power map is presented. Results show that the discrepancy between the JEFF-3.1 and ENDF/B-VII libraries is reasonably small but increases as the fuel depletes, due to data library differences that accumulate at each burnup step. Additionally, there are large discrepancies between JEF-2.2 and ENDF/B-VII because of the inadequacy of the JEF-2.2 library.

  1. PAR -- Interface to the ADAM Parameter System

    NASA Astrophysics Data System (ADS)

    Currie, Malcolm J.; Chipperfield, Alan J.

    PAR is a library of Fortran subroutines that provides convenient mechanisms for applications to exchange information with the outside world, through input-output channels called parameters. Parameters enable a user to control an application's behaviour. PAR supports numeric, character, and logical parameters, and is currently implemented only on top of the ADAM parameter system. The PAR library permits parameter values to be obtained, with or without a variety of constraints. Results may be put into parameters to be passed on to other applications. Other facilities include setting a prompt string and suggested defaults. This document also introduces a preliminary C interface for the PAR library; this may be subject to change in the light of experience.

  2. Experimental critical loadings and control rod worths in LWR-PROTEUS configurations compared with MCNPX results

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Plaschy, M.; Murphy, M.; Jatuff, F.

    2006-07-01

    The PROTEUS research reactor at the Paul Scherrer Institute (PSI) has been operating since the sixties and, owing to its high flexibility, has already permitted the investigation of a large range of very different nuclear systems. The ongoing experimental programme, called LWR-PROTEUS, was started in 1997 and concerns large-scale investigations of advanced light water reactor (LWR) fuels. To date, the different LWR-PROTEUS phases have permitted the study of more than fifteen different configurations, each of which had to be demonstrated to be operationally safe, in particular for the Swiss safety authorities. In this context, recent developments of the PSI computer capabilities have made possible the use of full-scale 3-D heterogeneous MCNPX models to accurately calculate different safety-related parameters (e.g. the critical driver loading and the shutdown rod worth). The current paper presents the MCNPX predictions of these operational characteristics for seven different LWR-PROTEUS configurations using a large number of nuclear data libraries. More specifically, this significant benchmarking exercise is based on the ENDF/B6v2, ENDF/B6v8, JEF2.2, JEFF3.0, JENDL3.2, and JENDL3.3 libraries. The results highlight certain library-specific trends in the prediction of the multiplication factor keff (e.g. the systematically larger reactivity calculated with JEF2.2 and the smaller reactivity associated with JEFF3.0). They also confirm the satisfactory determination of reactivity variations by all calculational schemes, for instance due to the introduction of a safety rod pair, these calculations having been compared with experiments. (authors)

  3. Production and Testing of the VITAMIN-B7 Fine-Group and BUGLE-B7 Broad-Group Coupled Neutron/Gamma Cross-Section Libraries Derived from ENDF/B-VII.0 Nuclear Data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Risner, J. M.; Wiarda, D.; Dunn, M. E.

    2011-09-30

    New coupled neutron-gamma cross-section libraries have been developed for use in light water reactor (LWR) shielding applications, including pressure vessel dosimetry calculations. The libraries, which were generated using Evaluated Nuclear Data File/B Version VII Release 0 (ENDF/B-VII.0), use the same fine-group and broad-group energy structures as the VITAMIN-B6 and BUGLE-96 libraries. The processing methodology used to generate both libraries is based on the methods used to develop VITAMIN-B6 and BUGLE-96 and is consistent with ANSI/ANS 6.1.2. The ENDF data were first processed into the fine-group pseudo-problem-independent VITAMIN-B7 library and then collapsed into the broad-group BUGLE-B7 library. The VITAMIN-B7 library contains data for 391 nuclides. This represents a significant increase compared to the VITAMIN-B6 library, which contained data for 120 nuclides. The BUGLE-B7 library contains data for the same nuclides as BUGLE-96, and maintains the same numeric IDs for those nuclides. The broad-group data include nuclides which are infinitely dilute and group collapsed using a concrete weighting spectrum, as well as nuclides which are self-shielded and group collapsed using weighting spectra representative of important regions of LWRs. The verification and validation of the new libraries include a set of critical benchmark experiments, a set of regression tests that are used to evaluate multigroup cross-section libraries in the SCALE code system, and three pressure vessel dosimetry benchmarks. Results of these tests confirm that the new libraries are appropriate for use in LWR shielding analyses and meet the requirements of Regulatory Guide 1.190.

  4. Verification of Plutonium Content in PuBe Sources Using MCNP® 6.2.0 Beta with TENDL 2012 Libraries

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lockhart, Madeline Louise; McMath, Garrett Earl

    Although the production of PuBe neutron sources has been discontinued, hundreds of sources with unknown or inaccurately declared plutonium content exist around the world. Institutions have undertaken the task of assaying these sources, measuring and calculating their isotopic composition, plutonium content, and neutron yield. The nominal plutonium content, based on the neutron yield per gram of pure 239Pu, has been shown to be highly inaccurate. New methods of measuring the plutonium content allow a more accurate estimate of the true Pu content, but these measurements need verification. Using the TENDL 2012 nuclear data libraries, MCNP6 has the capability to simulate the (α, n) interactions in a PuBe source. In principle, if the source is modeled according to the plutonium content, isotopic composition, and other source characteristics, the neutron yield calculated in MCNP can be compared to the experimental yield, offering an indication of the accuracy of the declared plutonium content. In this study, three sets of PuBe sources from various backgrounds were modeled in MCNP 6.2.0 Beta, according to the source specifications dictated by the individuals who assayed the sources. Verification of the source parameters with MCNP6 also serves as a means to test the alpha transport capabilities of MCNP 6.2.0 Beta with the TENDL 2012 alpha transport libraries. Finally, good agreement in the comparison would indicate the accuracy of the source parameters in addition to demonstrating MCNP's capabilities in simulating (α, n) interactions.
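
    The basic verification logic, comparing a measured yield against the yield implied by a declared Pu mass, can be sketched as follows. The specific-yield constant and the sample numbers are illustrative placeholders, not values from the study:

```python
# Sketch: checking a declared plutonium mass against a measured neutron
# yield. SPECIFIC_YIELD is an assumed illustrative value, not an evaluated
# constant from the study.

SPECIFIC_YIELD = 6.0e4  # assumed (alpha,n) neutrons/s per gram of Pu in PuBe

def declared_yield(pu_mass_g: float) -> float:
    """Neutron yield implied by the declared Pu mass."""
    return pu_mass_g * SPECIFIC_YIELD

def relative_discrepancy(measured_yield: float, declared_mass_g: float) -> float:
    """Fractional difference between measured and mass-implied yields."""
    expected = declared_yield(declared_mass_g)
    return (measured_yield - expected) / expected

# A source declared at 1.0 g but measuring 20% above the nominal yield
# would suggest the declared content understates the true Pu content.
print(f"{relative_discrepancy(7.2e4, 1.0):+.2f}")
```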

  5. Verification of Plutonium Content in PuBe Sources Using MCNP® 6.2.0 Beta with TENDL 2012 Libraries

    DOE PAGES

    Lockhart, Madeline Louise; McMath, Garrett Earl

    2017-10-26

    Although the production of PuBe neutron sources has been discontinued, hundreds of sources with unknown or inaccurately declared plutonium content exist around the world. Institutions have undertaken the task of assaying these sources, measuring and calculating their isotopic composition, plutonium content, and neutron yield. The nominal plutonium content, based on the neutron yield per gram of pure 239Pu, has been shown to be highly inaccurate. New methods of measuring the plutonium content allow a more accurate estimate of the true Pu content, but these measurements need verification. Using the TENDL 2012 nuclear data libraries, MCNP6 has the capability to simulate the (α, n) interactions in a PuBe source. In principle, if the source is modeled according to the plutonium content, isotopic composition, and other source characteristics, the neutron yield calculated in MCNP can be compared to the experimental yield, offering an indication of the accuracy of the declared plutonium content. In this study, three sets of PuBe sources from various backgrounds were modeled in MCNP 6.2.0 Beta, according to the source specifications dictated by the individuals who assayed the sources. Verification of the source parameters with MCNP6 also serves as a means to test the alpha transport capabilities of MCNP 6.2.0 Beta with the TENDL 2012 alpha transport libraries. Finally, good agreement in the comparison would indicate the accuracy of the source parameters in addition to demonstrating MCNP's capabilities in simulating (α, n) interactions.

  6. Status of MAPA (Modular Accelerator Physics Analysis) and the Tech-X Object-Oriented Accelerator Library

    NASA Astrophysics Data System (ADS)

    Cary, J. R.; Shasharina, S.; Bruhwiler, D. L.

    1998-04-01

    The MAPA code is a fully interactive accelerator modeling and design tool consisting of a GUI and two object-oriented C++ libraries: a general library suitable for treatment of any dynamical system, and an accelerator library including many element types plus an accelerator class. The accelerator library inherits directly from the system library, which uses hash tables to store any relevant parameters or strings. The GUI can access these hash tables in a general way, allowing the user to invoke a window displaying all relevant parameters for a particular element type or for the accelerator class, with the option to change those parameters. The system library can advance an arbitrary number of dynamical variables through an arbitrary mapping. The accelerator class inherits this capability and overloads the relevant functions to advance the phase space variables of a charged particle through a string of elements. Among other things, the GUI makes phase space plots and finds fixed points of the map. We discuss the object hierarchy of the two libraries and use of the code.
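
    The design described above, a generic dynamical-system base class with hash-table parameter storage and an overloaded advance-through-a-mapping method, can be sketched briefly. Class and parameter names here are hypothetical illustrations, not the actual Tech-X C++ API:

```python
# Sketch of the MAPA-style object hierarchy: a generic dynamical-system
# class keeps named parameters in a hash table (here a dict) that a GUI
# could introspect, and advances state through an arbitrary mapping; a
# subclass overloads the mapping for a specific accelerator element.
# All names are illustrative.

class DynamicalSystem:
    def __init__(self, **params):
        self.params = dict(params)  # hash-table parameter storage

    def mapping(self, state):
        """One application of the system's map; subclasses overload this."""
        raise NotImplementedError

    def advance(self, state, n_turns=1):
        """Advance the dynamical variables through the map n_turns times."""
        for _ in range(n_turns):
            state = self.mapping(state)
        return state

class Drift(DynamicalSystem):
    """A drift element: x advances by length * x', x' is unchanged."""
    def mapping(self, state):
        x, xp = state
        return (x + self.params["length"] * xp, xp)

drift = Drift(length=2.0)
print(drift.advance((0.0, 0.001), n_turns=3))
```

    The inheritance mirrors the abstract's description: the element subclass reuses the generic advance loop and only supplies its own phase-space map.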

  7. Comparison of ENDF/B-VII.1 and JEFF-3.2 in VVER-1000 operational data calculation

    NASA Astrophysics Data System (ADS)

    Frybort, Jan

    2017-09-01

    Safe operation of a nuclear reactor requires extensive calculational support. Operational data are determined by full-core calculations during the design phase of a fuel loading. The loading pattern and the design of the fuel assemblies are adjusted to meet safety requirements and optimize reactor operation. The nodal diffusion code ANDREA is used for this task in the case of the Czech VVER-1000 reactors. Nuclear data for this diffusion code are prepared regularly with the lattice code HELIOS; these calculations are conducted in 2D at the fuel assembly level. It is also possible to calculate these macroscopic data with the Monte Carlo code Serpent, which can make use of alternative evaluated libraries. All calculations are affected by inherent uncertainties in nuclear data. It is therefore useful to compare the results of full-core calculations based on two sets of diffusion data obtained from Serpent calculations with the ENDF/B-VII.1 and JEFF-3.2 nuclear data, including their decay data and fission yield libraries. The comparison is based both on the fuel-assembly-level macroscopic data and on the resulting operational data. This study illustrates the effect of the evaluated nuclear data library on full-core calculations of a large PWR core. The level of difference that results exclusively from the nuclear data selection helps in understanding the inherent uncertainties of such full-core calculations.

  8. Working Party on International Nuclear Data Evaluation Cooperation (WPEC)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dupont, E., E-mail: wpec@oecd-nea.org; Chadwick, M.B.; Danon, Y.

    The OECD Nuclear Energy Agency (NEA) organizes cooperation between the major nuclear data evaluation projects in the world. The NEA Working Party on International Nuclear Data Evaluation Cooperation (WPEC) was established to promote the exchange of information on nuclear data evaluation, measurement, nuclear model calculation, validation, and related topics, and to provide a framework for cooperative activities between the participating projects. The working party assesses nuclear data improvement needs and addresses these needs by initiating joint activities in the framework of dedicated WPEC subgroups. Studies recently completed comprise a number of works related to nuclear data covariance and associated processing issues, as well as more specific studies related to the resonance parameter representation in the unresolved resonance region, the gamma production from fission product capture reactions, the {sup 235}U capture cross section, the EXFOR database, and the improvement of nuclear data for advanced reactor systems. Ongoing activities focus on the evaluation of {sup 239}Pu in the resonance region, scattering angular distribution in the fast energy range, and reporting/usage of experimental data for evaluation in the resolved resonance region. New activities include two subgroups on improved fission product yield evaluation methodologies and on modern nuclear database structures. Future activities under discussion include a pilot project for a Collaborative International Evaluated Library Organization (CIELO) and methods to provide feedback from nuclear and covariance data adjustment for improvement of nuclear data. In addition to the above-mentioned short-term task-oriented subgroups, WPEC also hosts a longer-term subgroup charged with reviewing and compiling the most important nuclear data requirements in a high priority request list (HPRL).

  9. Working Party on International Nuclear Data Evaluation Cooperation (WPEC)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Giuseppe Palmiotti

    The OECD Nuclear Energy Agency (NEA) organizes cooperation between the major nuclear data evaluation projects in the world. The NEA Working Party on International Nuclear Data Evaluation Cooperation (WPEC) was established to promote the exchange of information on nuclear data evaluation, measurement, nuclear model calculation, validation, and related topics, and to provide a framework for cooperative activities between the participating projects. The working party assesses nuclear data improvement needs and addresses these needs by initiating joint activities in the framework of dedicated WPEC subgroups. Studies recently completed comprise a number of works related to nuclear data covariance and associated processing issues, as well as more specific studies related to the resonance parameter representation in the unresolved resonance region, the gamma production from fission-product capture reactions, the U-235 capture cross-section, the EXFOR database, and the improvement of nuclear data for advanced reactor systems. Ongoing activities focus on the evaluation of Pu-239 in the resonance region, scattering angular distribution in the fast energy range, and reporting/usage of experimental data for evaluation in the resolved resonance region. New activities include two new subgroups on improved fission product yield evaluation methodologies and on modern nuclear database structures. Future activities under discussion include a pilot project for a Collaborative International Evaluated Library Organization (CIELO) and methods to provide feedback from nuclear and covariance data adjustment for improvement of nuclear data. In addition to the above-mentioned short-term, task-oriented subgroups, the WPEC also hosts a longer-term subgroup charged with reviewing and compiling the most important nuclear data requirements in a high priority request list (HPRL).

  10. Working Party on International Nuclear Data Evaluation Cooperation (WPEC)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dupont, E.; Herman, M.; Dupont, E.

    The OECD Nuclear Energy Agency (NEA) organizes cooperation between the major nuclear data evaluation projects in the world. The NEA Working Party on International Nuclear Data Evaluation Cooperation (WPEC) was established to promote the exchange of information on nuclear data evaluation, measurement, nuclear model calculation, validation, and related topics, and to provide a framework for cooperative activities between the participating projects. The working party assesses nuclear data improvement needs and addresses these needs by initiating joint activities in the framework of dedicated WPEC subgroups. Studies recently completed comprise a number of works related to nuclear data covariance and associated processing issues, as well as more specific studies related to the resonance parameter representation in the unresolved resonance region, the gamma production from fission product capture reactions, the 235U capture cross section, the EXFOR database, and the improvement of nuclear data for advanced reactor systems. Ongoing activities focus on the evaluation of 239Pu in the resonance region, scattering angular distribution in the fast energy range, and reporting/usage of experimental data for evaluation in the resolved resonance region. New activities include two subgroups on improved fission product yield evaluation methodologies and on modern nuclear database structures. Some future activities under discussion include a pilot project for a Collaborative International Evaluated Library Organization (CIELO) and methods to provide feedback from nuclear and covariance data adjustment for improvement of nuclear data. In addition to the above-mentioned short-term task-oriented subgroups, WPEC also hosts a longer-term subgroup charged with reviewing and compiling the most important nuclear data requirements in a high priority request list (HPRL).

  11. Use and Impact of Covariance Data in the Japanese Latest Adjusted Library ADJ2010 Based on JENDL-4.0

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yokoyama, K., E-mail: yokoyama.kenji09@jaea.go.jp; Ishikawa, M.

    2015-01-15

    The current status of covariance applications to fast reactor analysis and design in Japan is summarized. In Japan, the covariance data are mainly used for three purposes: (1) to quantify the uncertainty of nuclear core parameters, (2) to identify the important nuclides, reactions, and energy ranges that dominate the uncertainty of core parameters, and (3) to improve the accuracy of core design values by incorporating integral data such as critical experiments and power reactor operation data. For the last purpose, cross-section adjustment based on Bayes' theorem is used. After the release of JENDL-4.0, a development project for the new adjusted group-constant set ADJ2010 was started in 2010 and completed in 2013. In the present paper, the final results of ADJ2010 are briefly summarized. In addition, the adjustment results of ADJ2010 are discussed from the viewpoint of the use and impact of nuclear data covariances, focusing on the {sup 239}Pu capture cross-section alterations. For this purpose three kinds of indices, called “degree of mobility,” “adjustment motive force,” and “adjustment potential,” are proposed.
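
    The Bayesian cross-section adjustment mentioned above is commonly formulated as a generalized least-squares (GLS) update of the group constants. A minimal sketch under that assumption, with tiny illustrative matrices rather than any ADJ2010 data:

```python
# Sketch: generalized least-squares cross-section adjustment of the kind
# the abstract describes (Bayesian update of group constants from integral
# data). All matrices are tiny illustrative placeholders.
import numpy as np

def gls_adjust(sigma, M, S, d, V):
    """One GLS adjustment step.

    sigma: prior group constants
    M:     prior covariance of sigma
    S:     sensitivities of integral parameters to sigma
    d:     (measured - calculated) integral discrepancies
    V:     covariance of the integral measurements
    Returns the adjusted constants and their reduced covariance.
    """
    G = M @ S.T @ np.linalg.inv(S @ M @ S.T + V)  # gain matrix
    sigma_post = sigma + G @ d
    M_post = M - G @ S @ M
    return sigma_post, M_post

sigma = np.array([1.0, 2.0])          # two group constants
M = np.diag([0.04, 0.09])             # prior covariance
S = np.array([[0.5, 0.5]])            # one integral experiment
d = np.array([0.02])                  # C/E discrepancy
V = np.array([[0.01]])                # experimental covariance
sig_post, M_post = gls_adjust(sigma, M, S, d, V)
print(sig_post)
```

    The posterior covariance M_post is always smaller than the prior, which is the "improved accuracy of core design values" the abstract refers to.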

  12. New Features in the Computational Infrastructure for Nuclear Astrophysics

    NASA Astrophysics Data System (ADS)

    Smith, M. S.; Lingerfelt, E. J.; Scott, J. P.; Hix, W. R.; Nesaraja, C. D.; Koura, H.; Roberts, L. F.

    2006-04-01

    The Computational Infrastructure for Nuclear Astrophysics is a suite of computer codes online at nucastrodata.org that streamlines the incorporation of recent nuclear physics results into astrophysical simulations. The freely available, cross-platform suite enables users to upload cross sections and S-factors, convert them into reaction rates, parameterize the rates, store the rates in customizable libraries, set up and run custom post-processing element synthesis calculations, and visualize the results. New features include the ability for users to comment on rates or libraries using an email-type interface, a nuclear mass model evaluator, enhanced techniques for rate parameterization, better treatment of rate inverses, and the creation and export of custom animations of simulation results. We also have online animations of r-process, rp-process, and neutrino-p process element synthesis occurring in stellar explosions.
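
    Rate parameterization of the kind mentioned above is often done in the standard seven-coefficient REACLIB form; a minimal sketch of evaluating such a fit, with hypothetical coefficients rather than any evaluated rate:

```python
# Sketch: the standard REACLIB-style seven-parameter form commonly used to
# parameterize astrophysical reaction rates as a function of temperature
# T9 (temperature in GK). Coefficients are hypothetical, not an evaluated
# rate from the library.
import math

def reaclib_rate(T9, a):
    """rate = exp(a0 + a1/T9 + a2/T9^(1/3) + a3*T9^(1/3)
                  + a4*T9 + a5*T9^(5/3) + a6*ln(T9))"""
    return math.exp(
        a[0] + a[1] / T9 + a[2] / T9 ** (1 / 3) + a[3] * T9 ** (1 / 3)
        + a[4] * T9 + a[5] * T9 ** (5 / 3) + a[6] * math.log(T9)
    )

a = [1.0, 0.0, -2.0, 0.5, 0.0, 0.0, 1.5]  # hypothetical fit coefficients
print(f"{reaclib_rate(1.0, a):.4f}")
```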

  13. A broad-group cross-section library based on ENDF/B-VII.0 for fast neutron dosimetry Applications

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Alpan, F.A.

    2011-07-01

    A new ENDF/B-VII.0-based coupled 44-neutron, 20-gamma-ray-group cross-section library was developed to investigate the latest evaluated nuclear data file (ENDF), in comparison to the ENDF/B-VI.3 data used in BUGLE-96, as well as to generate an objective-specific library. The objectives selected for this work consisted of dosimetry calculations for in-vessel and ex-vessel reactor locations, iron atom displacement calculations for the reactor internals and pressure vessel, and the {sup 58}Ni(n,{gamma}) calculation that is important for gas generation in the baffle plate. The new library was generated based on the contribution- and point-wise cross-section-driven (CPXSD) methodology and was applied to one of the most widely used benchmarks, the Oak Ridge National Laboratory Pool Critical Assembly benchmark problem. In addition to the new library, an ENDF/B-VII.0-based coupled 47-neutron, 20-gamma-ray-group cross-section library was generated, and both it and BUGLE-96 were used with the SNLRML and IRDF dosimetry cross sections to compute reaction rates. All reaction rates computed with the multigroup libraries are within ±20% of the measurement data and meet the U.S. Nuclear Regulatory Commission acceptance criterion for reactor vessel neutron exposure evaluations specified in Regulatory Guide 1.190. (authors)
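
    The reaction rates compared against measurement above come from folding a multigroup flux with dosimetry cross sections. A minimal two-group sketch with illustrative numbers:

```python
# Sketch: the basic multigroup reaction-rate folding performed in a
# dosimetry calculation: fold a group flux with dosimetry cross sections.
# The two-group values below are illustrative placeholders.

def reaction_rate(flux, sigma):
    """R = sum_g phi_g * sigma_g (reactions per target atom per second)."""
    return sum(p * s for p, s in zip(flux, sigma))

flux = [1.0e10, 5.0e9]      # group fluxes, n/cm^2/s
sigma = [1.0e-24, 4.0e-24]  # group cross sections, cm^2 (1 b = 1e-24 cm^2)
print(reaction_rate(flux, sigma))
```

    Comparing such computed rates with measured activities, reaction by reaction, is how the ±20% agreement quoted above is established.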

  14. Neutron Capture Gamma-Ray Libraries for Nuclear Applications

    NASA Astrophysics Data System (ADS)

    Sleaford, B. W.; Firestone, R. B.; Summers, N.; Escher, J.; Hurst, A.; Krticka, M.; Basunia, S.; Molnar, G.; Belgya, T.; Revay, Z.; Choi, H. D.

    2011-06-01

    The neutron capture reaction is useful in identifying and analyzing the gamma-ray spectrum from an unknown assembly, as it gives unambiguous information on its composition. This can be done passively, or actively when an external neutron source is used to probe the unknown assembly. There are known capture gamma-ray data gaps in the ENDF libraries used by transport codes for various nuclear applications. The Evaluated Gamma-ray Activation File (EGAF) is a new thermal neutron capture database of discrete line spectra and cross sections for over 260 isotopes that was developed as part of an IAEA Coordinated Research Project. EGAF is being used to improve the capture gamma production in the ENDF libraries. For medium to heavy nuclei, the quasi-continuum contribution to the gamma cascades is not experimentally resolved. The continuum contains up to 90% of all the decay energy and is modeled here with the statistical nuclear structure code DICEBOX. This code also provides a consistency check of the level-scheme nuclear structure evaluation. The calculated continuum is of sufficient accuracy to include in the ENDF libraries. This analysis also determines new total thermal capture cross sections and provides an improved RIPL database. For higher-energy neutron capture there are fewer experimental data available, making benchmarking of the modeling codes more difficult. We are investigating the capture spectra from higher-energy neutrons experimentally using surrogate reactions and modeling them with Hauser-Feshbach codes. This can then be used to benchmark CASINO, a version of DICEBOX modified for neutron capture at higher energy, which can simulate spectra from neutron capture at incident neutron energies up to 20 MeV to improve the gamma-ray spectrum in neutron data libraries used for transport modeling of unknown assemblies.

  15. Sensitivity analysis of Monju using ERANOS with JENDL-4.0

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tamagno, P.; Van Rooijen, W. F. G.; Takeda, T.

    2012-07-01

    This paper deals with sensitivity analysis of the Monju reactor using JENDL-4.0 nuclear data. In 2010 the Japan Atomic Energy Agency (JAEA) released a new set of nuclear data: JENDL-4.0. This new evaluation is expected to contain improved data on actinides and covariance matrices. Covariance matrices are a key point in the quantification of uncertainties due to basic nuclear data. For the sensitivity analysis, the well-established ERANOS [1] code was chosen because of its integrated modules that allow users to perform a sensitivity analysis of complex reactor geometries. A JENDL-4.0 cross-section library is not available for ERANOS; therefore a cross-section library had to be made from the original nuclear data set, available as ENDF formatted files. This was achieved by using the NJOY, CALENDF, MERGE, and GECCO codes to create a library for the ECCO cell code (part of ERANOS). To verify the accuracy of the new ECCO library, two benchmark experiments were analyzed: the MZA and MZB cores of the MOZART program measured at the ZEBRA facility in the UK. These were chosen due to their similarity to the Monju core. Using the JENDL-4.0 ECCO library we have analyzed the criticality of Monju during the restart in 2010 and obtained good agreement with the measured criticality. Perturbation calculations have been performed between JENDL-3.3- and JENDL-4.0-based models. The isotopes {sup 239}Pu, {sup 238}U, {sup 241}Am and {sup 241}Pu account for a major part of the observed differences. (authors)
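
    Sensitivity coefficients of the kind computed here are typically combined with covariance matrices through the first-order "sandwich rule" to quantify k_eff uncertainty. A minimal sketch with invented inputs:

```python
# Sketch: first-order ("sandwich rule") propagation of nuclear data
# covariances through sensitivity coefficients, the quantity a sensitivity
# analysis like the one above feeds into. Inputs are small illustrative
# placeholders, not Monju data.
import numpy as np

def keff_uncertainty(S, C):
    """Relative k_eff uncertainty: sqrt(S C S^T), with S the relative
    sensitivities and C the relative covariance matrix."""
    return float(np.sqrt(S @ C @ S.T))

S = np.array([0.3, -0.1])         # dk/k per dsigma/sigma, two parameters
C = np.array([[0.0004, 0.0],
              [0.0,    0.0025]])  # relative covariance matrix
print(f"{keff_uncertainty(S, C) * 1e5:.0f} pcm")
```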

  16. Development of ENDF/B-IV multigroup neutron cross-section libraries for the LEOPARD and LASER codes. Technical report on Phase 1

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jenquin, U.P.; Stewart, K.B.; Heeb, C.M.

    1975-07-01

    The principal aim of this neutron cross-section research is to provide the utility industry with a 'standard nuclear data base' that will perform satisfactorily when used for analysis of thermal power reactor systems. EPRI is coordinating its activities with those of the Cross Section Evaluation Working Group (CSEWG), responsible for the development of the Evaluated Nuclear Data File-B (ENDF/B) library, in order to improve the performance of the ENDF/B library in thermal reactors and other applications of interest to the utility industry. Battelle-Northwest (BNW) was commissioned to process the ENDF/B Version-4 data files into a group-constant form for use in the LASER and LEOPARD neutronics codes. Performance information on the library should provide the necessary feedback for improving the next version of the library, and a consistent data base is expected to be useful in intercomparing the versions of the LASER and LEOPARD codes presently being used by different utility groups. This report describes the BNW multi-group libraries and the procedures followed in their preparation and testing. (GRA)

  17. CESAR5.3: Isotopic depletion for Research and Testing Reactor decommissioning

    NASA Astrophysics Data System (ADS)

    Ritter, Guillaume; Eschbach, Romain; Girieud, Richard; Soulard, Maxime

    2018-05-01

    CESAR stands in French for "simplified depletion applied to reprocessing". The current version is number 5.3, the code having started 30 years ago from a long-lasting cooperation with ORANO, co-owner of the code with CEA. This computer code can characterize several types of nuclear fuel assemblies, from the most regular PWR power plants to the most unexpected gas-cooled and graphite-moderated old-timer research facilities. Each type of fuel can also include numerous ranges of compositions such as UOX, MOX, LEU or HEU. Such versatility comes from a broad catalog of cross-section libraries, each corresponding to a specific reactor and fuel matrix design. CESAR goes beyond fuel characterization and can also provide an evaluation of structural material activation. The cross-section libraries are generated using the most refined assembly- or core-level transport code calculation schemes (CEA APOLLO2 or ERANOS), based on the European JEFF3.1.1 nuclear data base. Each new CESAR self-shielded cross-section library benefits from the most recent CEA recommendations for deterministic physics options. The resulting cross sections are organized as a function of burn-up and initial fuel enrichment, which allows this costly process to be condensed into a series of Legendre polynomials. The final outcome is a fast, accurate and compact CESAR cross-section library. Each library is fully validated against a stochastic transport code (CEA TRIPOLI 4) if needed and against a reference depletion code (CEA DARWIN). Using CESAR does not require any of the neutron physics expertise implemented in the cross-section library generation. It is based on top-quality nuclear data (JEFF3.1.1 for ~400 isotopes) and includes up-to-date Bateman equation solving algorithms. However, defining a CESAR computation case can be very straightforward: most results are only three steps away from any beginner's ambition, namely the initial composition, the in-core depletion, and the pool decay scenario.
On top of a simple utilization architecture, CESAR includes a portable graphical user interface which can be broadly deployed in R&D or industrial facilities. Aging facilities currently face decommissioning and dismantling issues. This final step of the nuclear fuel cycle requires a careful assessment of the source terms in the fuel, the core structures, and all parts of a facility that must be disposed of under "industrial nuclear" constraints. In that perspective, several CESAR cross-section libraries were constructed for early CEA Research and Testing Reactors (RTRs). The aim of this paper is to describe how CESAR operates and how it can be used to help these facilities handle waste disposal, nuclear materials transport, or basic safety cases. The test case is based on the PHEBUS facility located at CEA Cadarache.
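
    The condensation of burnup-dependent cross sections into a Legendre series, as described above, can be sketched with standard polynomial fitting. Grid, data, and degree are illustrative, not CESAR's actual format:

```python
# Sketch: condensing a cross section tabulated against burnup into a short
# Legendre polynomial series, the compact representation the abstract
# describes. The burnup grid and synthetic trend are illustrative.
import numpy as np

burnup = np.linspace(0.0, 60.0, 13)               # GWd/t, illustrative grid
sigma = 50.0 - 0.3 * burnup + 0.002 * burnup**2   # synthetic trend

# Map burnup onto [-1, 1] and fit a low-degree Legendre series
x = 2.0 * burnup / 60.0 - 1.0
coeffs = np.polynomial.legendre.legfit(x, sigma, deg=3)

def sigma_of_burnup(b):
    """Evaluate the compact representation at an arbitrary burnup."""
    return np.polynomial.legendre.legval(2.0 * b / 60.0 - 1.0, coeffs)

print(round(float(sigma_of_burnup(30.0)), 3))
```

    A handful of coefficients per reaction then replaces a full burnup table, which is what makes the final library fast and compact.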

  18. Identifying Understudied Nuclear Reactions by Text-mining the EXFOR Experimental Nuclear Reaction Library

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hirdt, J.A.; Brown, D.A., E-mail: dbrown@bnl.gov

    The EXFOR library contains the largest collection of experimental nuclear reaction data available as well as the data's bibliographic information and experimental details. We text-mined the REACTION and MONITOR fields of the ENTRYs in the EXFOR library in order to identify understudied reactions and quantities. Using the results of the text-mining, we created an undirected graph from the EXFOR datasets with each graph node representing a single reaction and quantity and graph links representing the various types of connections between these reactions and quantities. This graph is an abstract representation of the connections in EXFOR, similar to graphs of social networks, authorship networks, etc. We use various graph theoretical tools to identify important yet understudied reactions and quantities in EXFOR. Although we identified a few cross sections relevant for shielding applications and isotope production, mostly we identified charged particle fluence monitor cross sections. As a side effect of this work, we learn that our abstract graph is typical of other real-world graphs.
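
    The approach can be illustrated in miniature: build an undirected graph of reaction/quantity nodes and score nodes that are well connected but have few datasets. The reactions, edges, and scoring heuristic below are invented placeholders, not the paper's actual metric:

```python
# Sketch: a toy version of the abstract's method. Nodes are
# reaction/quantity strings, edges are connections (shared monitors, same
# target, etc.), and a simple degree-vs-dataset-count score flags
# "important yet understudied" nodes. All data are illustrative.
from collections import defaultdict

edges = [
    ("Fe-56(n,g) SIG", "Au-197(n,g) SIG"),  # shares a monitor
    ("Fe-56(n,g) SIG", "Fe-56(n,p) SIG"),   # same target
    ("Au-197(n,g) SIG", "Al-27(a,n) SIG"),
]
n_datasets = {"Fe-56(n,g) SIG": 120, "Au-197(n,g) SIG": 300,
              "Fe-56(n,p) SIG": 40, "Al-27(a,n) SIG": 2}

graph = defaultdict(set)
for a, b in edges:
    graph[a].add(b)
    graph[b].add(a)

# High connectivity, few datasets -> candidate understudied quantity
scores = {node: len(nbrs) / (1 + n_datasets[node])
          for node, nbrs in graph.items()}
print(max(scores, key=scores.get))
```

    In the real study the graph metrics are more sophisticated, but the idea is the same: connectivity identifies importance, dataset counts identify neglect.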

  19. Identifying Understudied Nuclear Reactions by Text-mining the EXFOR Experimental Nuclear Reaction Library

    NASA Astrophysics Data System (ADS)

    Hirdt, J. A.; Brown, D. A.

    2016-01-01

    The EXFOR library contains the largest collection of experimental nuclear reaction data available as well as the data's bibliographic information and experimental details. We text-mined the REACTION and MONITOR fields of the ENTRYs in the EXFOR library in order to identify understudied reactions and quantities. Using the results of the text-mining, we created an undirected graph from the EXFOR datasets with each graph node representing a single reaction and quantity and graph links representing the various types of connections between these reactions and quantities. This graph is an abstract representation of the connections in EXFOR, similar to graphs of social networks, authorship networks, etc. We use various graph theoretical tools to identify important yet understudied reactions and quantities in EXFOR. Although we identified a few cross sections relevant for shielding applications and isotope production, mostly we identified charged particle fluence monitor cross sections. As a side effect of this work, we learn that our abstract graph is typical of other real-world graphs.

  20. Generation of the V4.2m5 and AMPX and MPACT 51 and 252-Group Libraries with ENDF/B-VII.0 and VII.1

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kim, Kang Seog

    The evaluated nuclear data file (ENDF)/B-7.0 v4.1m3 MPACT 47-group library has been used as the main library for the Consortium for Advanced Simulation of Light Water Reactors (CASL) neutronics simulator in simulating pressurized water reactor (PWR) problems. Recent analysis of high-void boiling water reactor (BWR) fuels and burnt fuels indicates that the 47-group library introduces a relatively large reactivity bias. Since the 47-group structure does not match the SCALE 6.2 252-group boundaries, the CASL Virtual Environment for Reactor Applications Core Simulator (VERA-CS) MPACT library must be maintained independently, which causes quality assurance concerns. In order to address this issue, a new 51-group structure has been proposed based on the MPACT 47-group and SCALE 252-group structures. In addition, the new CASL library will include a 19-group structure for gamma production and interaction cross-section data based on the SCALE 19-group structure. New AMPX and MPACT 51-group libraries have been developed with the ENDF/B-7.0 and 7.1 evaluated nuclear data. The 19-group gamma data have also been generated for future use, but they are only available in the AMPX 51-group library. In addition, ENDF/B-7.0 and 7.1 MPACT 252-group libraries have been generated for verification purposes. Various benchmark calculations have been performed to verify and validate the newly developed libraries.

  1. Monte Carlo Determination of Gamma Ray Exposure from a Homogeneous Ground Plane

    DTIC Science & Technology

    1990-03-01

    A HOMOGENEOUS GROUND PLANE SOURCE THESIS Presented to the Faculty of the School of Engineering of the Air Force Institute of Technology Air University...come from a standard ANISN format library called FEWG1-85. This state-of-the-art cross section library which contains 37 neutron energy groups and 21...purpose. The FEWG1 library, a state-of-the-art cross section library developed for the Defense Nuclear Agency consisting of 21 gamma-ray energy

  2. Economical analysis of saturation mutagenesis experiments

    PubMed Central

    Acevedo-Rocha, Carlos G.; Reetz, Manfred T.; Nov, Yuval

    2015-01-01

    Saturation mutagenesis is a powerful technique for engineering proteins, metabolic pathways and genomes. In spite of its numerous applications, creating high-quality saturation mutagenesis libraries remains a challenge, as various experimental parameters influence the resulting diversity in a complex manner. We explore, from an economic perspective, various aspects of saturation mutagenesis library preparation: we introduce a cheaper and faster control for assessing library quality based on liquid media; analyze the role of primer purity and supplier in libraries with and without redundancy; compare library quality, yield, randomization efficiency, and annealing bias using traditional and emergent randomization schemes based on mixtures of mutagenic primers; and establish a methodology for choosing the most cost-effective randomization scheme given the screening costs and other experimental parameters. We show that by carefully considering these parameters, laboratory expenses can be significantly reduced. PMID:26190439
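
    A key quantity behind such cost analyses is the oversampling needed for library completeness: the number of clones T to screen so that each of V variants is seen with probability F, commonly estimated as T = -V ln(1 - F). A minimal sketch of this standard formula (numbers are the textbook single-site NNK case, not results from the paper):

```python
# Sketch: the classic oversampling estimate used when costing saturation
# mutagenesis screens, T = -V * ln(1 - F), where V is the number of
# distinct variants and F the desired fractional completeness.
import math

def clones_needed(num_variants: int, completeness: float) -> int:
    """Clones to screen for the desired library completeness."""
    return math.ceil(-num_variants * math.log(1.0 - completeness))

# One NNK-randomized codon encodes 32 codon variants; 95% completeness
# requires roughly threefold oversampling.
print(clones_needed(32, 0.95))  # → 96
```

    Because T grows steeply with F, the choice of randomization scheme (which sets V) dominates screening cost, which is exactly the trade-off the paper quantifies.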

  3. The measurement programme at the neutron time-of-flight facility n_TOF at CERN

    NASA Astrophysics Data System (ADS)

    Gunsing, F.; Aberle, O.; Andrzejewski, J.; Audouin, L.; Bécares, V.; Bacak, M.; Balibrea-Correa, J.; Barbagallo, M.; Barros, S.; Bečvář, F.; Beinrucker, C.; Belloni, F.; Berthoumieux, E.; Billowes, J.; Bosnar, D.; Brown, A.; Brugger, M.; Caamaño, M.; Calviño, F.; Calviani, M.; Cano-Ott, D.; Cardella, R.; Casanovas, A.; Castelluccio, D. M.; Cerutti, F.; Chen, Y. H.; Chiaveri, E.; Colonna, N.; Cortés-Giraldo, M. A.; Cortés, G.; Cosentino, L.; Damone, L. A.; Deo, K.; Diakaki, M.; Domingo-Pardo, C.; Dressler, R.; Dupont, E.; Durán, I.; Fernández-Domínguez, B.; Ferrari, A.; Ferreira, P.; Finocchiaro, P.; Frost, R. J. W.; Furman, V.; Ganesan, S.; García, A. R.; Gawlik, A.; Gheorghe, I.; Gilardoni, S.; Glodariu, T.; Gonçalves, I. F.; González, E.; Goverdovski, A.; Griesmayer, E.; Guerrero, C.; Göbel, K.; Harada, H.; Heftrich, T.; Heinitz, S.; Hernández-Prieto, A.; Heyse, J.; Jenkins, D. G.; Jericha, E.; Käppeler, F.; Kadi, Y.; Kalamara, A.; Katabuchi, T.; Kavrigin, P.; Ketlerov, V.; Khryachkov, V.; Kimura, A.; Kivel, N.; Kokkoris, M.; Krtička, M.; Kurtulgil, D.; Leal-Cidoncha, E.; Lederer, C.; Leeb, H.; Lerendegui, J.; Licata, M.; Meo, S. Lo; Lonsdale, S. J.; Losito, R.; Macina, D.; Marganiec, J.; Martínez, T.; Masi, A.; Massimi, C.; Mastinu, P.; Mastromarco, M.; Matteucci, F.; Maugeri, E. A.; Mazzone, A.; Mendoza, E.; Mengoni, A.; Milazzo, P. M.; Mingrone, F.; Mirea, M.; Montesano, S.; Musumarra, A.; Nolte, R.; Negret, A.; Oprea, A.; Palomo-Pinto, F. R.; Paradela, C.; Patronis, N.; Pavlik, A.; Perkowski, J.; Porras, I.; Praena, J.; Quesada, J. M.; Radeck, D.; Rajeev, K.; Rauscher, T.; Reifarth, R.; Riego-Perez, A.; Robles, M.; Rout, P.; Rubbia, C.; Ryan, J. A.; Sabaté-Gilarte, M.; Saxena, A.; Schillebeeckx, P.; Schmidt, S.; Schumann, D.; Sedyshev, P.; Smith, A. G.; Sosnin, N. V.; Stamatopoulos, A.; Suryanarayana, S. V.; Tagliente, G.; Tain, J. L.; Tarifeño-Saldivia, A.; Tarrío, D.; Tassan-Got, L.; Tsinganis, A.; Valenta, S.; Vannini, G.; Variale, V.; Vaz, P.; Ventura, A.; Vlachoudis, V.; Vlastou, R.; Wallner, A.; Warren, S.; Weigand, M.; Weiss, C.; Wolf, C.; Woods, P. J.; Wright, T.; Žugec, P.

    2017-09-01

    Neutron-induced reaction cross sections are important for a wide variety of research fields, ranging from the study of nuclear level densities and nucleosynthesis to applications of nuclear technology such as the design, criticality, and safety assessment of existing and future nuclear reactors, radiation dosimetry, medical applications, nuclear waste transmutation, accelerator-driven systems, and fuel cycle investigations. Simulations and calculations for nuclear technology applications largely rely on evaluated nuclear data libraries. The evaluations in these libraries are based both on experimental data and on theoretical models. CERN's neutron time-of-flight facility n_TOF has produced a considerable amount of experimental data since becoming fully operational with the start of its scientific measurement programme in 2001. While for a long period a single measurement station (EAR1), located at 185 m from the neutron production target, was available, the construction of a second beam line at 20 m (EAR2) in 2014 has substantially increased the measurement capabilities of the facility. An outline of the experimental nuclear data activities at n_TOF is presented.

  4. VVER-440 and VVER-1000 reactor dosimetry benchmark - BUGLE-96 versus ALPAN VII.0

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Duo, J. I.

    2011-07-01

    Document available in abstract form only, full text of document follows: Analytical results of the vodo-vodyanoi energetichesky reactor (VVER-440 and VVER-1000) dosimetry benchmarks developed from engineering mockups at the Nuclear Research Institute Rez LR-0 reactor are discussed. These benchmarks provide accurate determination of radiation field parameters in the vicinity and over the thickness of the reactor pressure vessel. Measurements are compared to results calculated with two sets of tools: the TORT discrete ordinates code with the BUGLE-96 cross-section library versus the newly developed Westinghouse RAPTOR-M3G code with the ALPAN VII.0 library. The parallel code RAPTOR-M3G enables detailed neutron distributions in energy and space in reduced computational time. The ALPAN VII.0 cross-section library is based on ENDF/B-VII.0 and is designed for reactor dosimetry applications. It uses a unique broad-group structure to enhance resolution in the thermal-neutron-energy range compared to other analogous libraries. The comparison of fast neutron (E > 0.5 MeV) results shows good agreement (within 10%) between the BUGLE-96 and ALPAN VII.0 libraries. Furthermore, the results compare well with analogous results of participants of the REDOS program (2005). Finally, the analytical results for fast neutrons agree within 15% with the measurements for most locations in all three mockups. In general, however, the analytical results underestimate the attenuation through the reactor pressure vessel thickness compared to the measurements. (authors)

  5. Evaluation of neutron thermalization parameters and benchmark reactor calculations using a synthetic scattering function for molecular gases

    NASA Astrophysics Data System (ADS)

    Gillette, V. H.; Patiño, N. E.; Granada, J. R.; Mayer, R. E.

    1989-08-01

    Using a synthetic incoherent scattering function which describes the interaction of neutrons with molecular gases, we provide analytical expressions for the zero- and first-order scattering kernels, σ0(E0 → E) and σ1(E0 → E), and the total cross section σ0(E0). Based on these quantities, we have performed calculations of thermalization parameters and transport coefficients for H2O, D2O, C6H6 and (CH2)n at room temperature. Comparison of such values with available experimental data and other calculations is satisfactory. We also generated nuclear data libraries for H2O with 47 thermal groups at 300 K and performed some benchmark calculations (235U, 239Pu, PWR cell and typical APWR cell); the resulting reactivities are compared with experimental data and ENDF/B-IV calculations.
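
    Group-collapsing of the kind described (e.g. 47 thermal groups for H2O) is conceptually a flux-weighted average of a pointwise cross section over each group. The sketch below uses generic assumptions (a 1/v-like cross section and a Maxwellian weighting flux, not the paper's synthetic scattering function) purely to illustrate the condensation step:

```python
import numpy as np

kT = 0.0253  # eV, room-temperature thermal energy

sigma = lambda e: 1.0 / np.sqrt(e)     # generic 1/v-like cross section (arb. units)
flux  = lambda e: e * np.exp(-e / kT)  # Maxwellian weighting spectrum

def trap(y, x):
    """Trapezoid-rule integral of samples y over points x."""
    return float(np.sum(0.5 * (y[1:] + y[:-1]) * np.diff(x)))

def group_average(bounds, n=2001):
    """Flux-weighted group cross sections for the given group boundaries."""
    out = []
    for lo, hi in zip(bounds[:-1], bounds[1:]):
        e = np.linspace(lo, hi, n)
        out.append(trap(sigma(e) * flux(e), e) / trap(flux(e), e))
    return np.array(out)

# 47 equal-lethargy thermal groups between 1e-4 eV and 4 eV (illustrative bounds)
bounds = np.geomspace(1e-4, 4.0, 48)
sig_g = group_average(bounds)
print(f"{len(sig_g)} groups; highest-energy group sigma = {sig_g[-1]:.3f}")
```

    For a 1/v cross section the group values decrease monotonically with energy, as expected.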

  6. Use of the results of measurements at the KBR facility for testing neutron data of the main structural materials for fast reactors

    NASA Astrophysics Data System (ADS)

    Koscheev, Vladimir; Manturov, Gennady; Pronyaev, Vladimir; Rozhikhin, Evgeny; Semenov, Mikhail; Tsibulya, Anatoly

    2017-09-01

    Several k∞ experiments were performed at the KBR critical facility of the Institute of Physics and Power Engineering (IPPE), Obninsk, Russia, during the 1970s and 80s to study the neutron absorption properties of Cr, Mn, Fe, Ni, Zr, and Mo. Calculations of these benchmarks with almost any modern evaluated nuclear data library demonstrate poor agreement with experiment. Neutron capture cross sections of the odd isotopes of Cr, Mn, Fe, and Ni in the ROSFOND-2010 library have been re-evaluated, and another evaluation of the Zr nuclear data has been adopted. Use of the modified nuclear data for Cr, Mn, Fe, Ni, and Zr leads to a significant improvement of the C/E ratio for the KBR assemblies. A significant improvement in agreement between calculated and evaluated values for benchmarks with Fe reflectors was also observed. C/E results obtained with the modified ROSFOND library for complex benchmark models that are highly sensitive to the cross sections of structural materials are no worse than results obtained with other major evaluated data libraries. A possible further improvement, by decreasing the capture cross sections of Zr and Mo at energies above 1 keV, is indicated.
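
    Benchmark comparisons of this kind are conventionally reported as C/E (calculated-over-experimental) ratios, often together with the deviation in units of the experimental uncertainty. A minimal sketch with invented numbers (not actual KBR results):

```python
# C/E comparison sketch; all benchmark names and values below are invented.
def c_over_e(calculated, experiment, sigma_exp):
    """Return the C/E ratio and the deviation from experiment in units of
    the experimental standard uncertainty."""
    ratio = calculated / experiment
    deviation = (calculated - experiment) / sigma_exp
    return ratio, deviation

# hypothetical k-infinity results: (calculated, experimental, 1-sigma exp.)
benchmarks = {
    "KBR-Cr": (0.995, 1.000, 0.004),
    "KBR-Fe": (1.012, 1.000, 0.005),
}

for name, (c, e, sig) in benchmarks.items():
    ratio, dev = c_over_e(c, e, sig)
    print(f"{name}: C/E = {ratio:.3f} ({dev:+.1f} sigma)")
```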

  7. Testing of ENDF71x: A new ACE-formatted neutron data library based on ENDF/B-VII.1

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gardiner, S. J.; Conlin, J. L.; Kiedrowski, B. C.

    The ENDF71x library [1] is the most thoroughly tested set of ACE-format data tables ever released by the Nuclear Data Team at Los Alamos National Laboratory (LANL). It is based on ENDF/B-VII.1, the most recently released set of evaluated nuclear data files produced by the US Cross Section Evaluation Working Group (CSEWG). A variety of techniques were used to test and verify the ENDF71x library before its public release. These include the use of automated checking codes written by members of the Nuclear Data Team, visual inspections of key neutron data, MCNP6 calculations designed to test data for every included combination of isotope and temperature as comprehensively as possible, and direct comparisons between ENDF71x and previous ACE library releases. Visual inspection of some of the most important neutron data revealed energy balance problems and unphysical discontinuities in the cross sections for some nuclides. Doppler broadening of the total cross sections with increasing temperature was found to be qualitatively correct. Test calculations performed using MCNP prompted two modifications to the MCNP6 source code and also exposed bad secondary neutron yields for 231,233Pa that are present in both ENDF/B-VII.1 and ENDF/B-VII.0. A comparison of ENDF71x with its predecessor ACE library, ENDF70, showed that dramatic changes have been made in the neutron cross section data for a number of isotopes between ENDF/B-VII.0 and ENDF/B-VII.1. Based on the results of these verification tests and the validation tests performed by Kahler et al. [2], the ENDF71x library is recommended for use in all Monte Carlo applications. (authors)

  8. Encoded physics knowledge in checking codes for nuclear cross section libraries at Los Alamos

    NASA Astrophysics Data System (ADS)

    Parsons, D. Kent

    2017-09-01

    Checking procedures for processed nuclear data at Los Alamos are described. Both continuous-energy and multigroup nuclear data are verified by locally developed checking codes which use basic physics knowledge and common-sense rules. A list of nuclear data problems that have been identified with the help of these checking codes is also given.

  9. Nuclear Data Needs for the Neutronic Design of MYRRHA Fast Spectrum Research Reactor

    NASA Astrophysics Data System (ADS)

    Stankovskiy, A.; Malambu, E.; Van den Eynde, G.; Díez, C. J.

    2014-04-01

    A global sensitivity analysis of the effective neutron multiplication factor keff to the choice of nuclear data library has been performed. It revealed that the test version of the JEFF-3.2 neutron-induced evaluated data library produces results closer to ENDF/B-VII.1 than JEFF-3.1.2 does. The analysis of the contributions of individual evaluations to the keff sensitivity resulted in a priority list of nuclides whose cross-section and fission-neutron-multiplicity uncertainties have to be reduced by setting up dedicated differential and integral experiments.

  10. Development and testing of the VITAMIN-B7/BUGLE-B7 coupled neutron-gamma multigroup cross-section libraries

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Risner, J.M.; Wiarda, D.; Miller, T.M.

    2011-07-01

    The U.S. Nuclear Regulatory Commission's Regulatory Guide 1.190 states that calculational methods used to estimate reactor pressure vessel (RPV) fluence should use the latest version of the evaluated nuclear data file (ENDF). The VITAMIN-B6 fine-group library and BUGLE-96 broad-group library, which are widely used for RPV fluence calculations, were generated using ENDF/B-VI.3 data, which was the most current data when Regulatory Guide 1.190 was issued. We have developed new fine-group (VITAMIN-B7) and broad-group (BUGLE-B7) libraries based on ENDF/B-VII.0. These new libraries, which were processed using the AMPX code system, maintain the same group structures as the VITAMIN-B6 and BUGLE-96 libraries. Verification and validation of the new libraries were accomplished using diagnostic checks in AMPX, 'unit tests' for each element in VITAMIN-B7, and a diverse set of benchmark experiments including critical evaluations for fast and thermal systems, a set of experimental benchmarks that are used for SCALE regression tests, and three RPV fluence benchmarks. The benchmark evaluation results demonstrate that VITAMIN-B7 and BUGLE-B7 are appropriate for use in RPV fluence calculations and meet the calculational uncertainty criterion in Regulatory Guide 1.190. (authors)

  11. NSUF Irradiated Materials Library

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cole, James Irvin

    The Nuclear Science User Facilities (NSUF) has been establishing an innovative Irradiated Materials Library concept for maximizing the value of previous and on-going materials and nuclear fuels irradiation test campaigns, including utilization of real-world components retrieved from current and decommissioned reactors. When the ATR National Scientific User Facility was established in 2007, one of the goals of the program was to establish a library of irradiated samples for users to access and conduct research on through a competitively reviewed proposal process. As part of the initial effort, staff at the user facility identified legacy materials from previous programs that are still being stored in laboratories and hot-cell facilities at the INL. In addition, other materials of interest stored outside the INL were identified, and their current owners have volunteered to enter them into the library. Finally, over the course of the last several years, the ATR NSUF has irradiated more than 3500 specimens as part of NSUF competitively awarded research projects. The logistics of managing this large inventory of highly radioactive material poses unique challenges. This document describes the materials in the library, outlines the policy for accessing them, and puts forth a strategy for making new additions to the library, as well as establishing guidelines for the minimum pedigree needed for inclusion, so as to limit the amount of material stored indefinitely without identified value.

  12. Development and Testing of the VITAMIN-B7/BUGLE-B7 Coupled Neutron-Gamma Multigroup Cross-Section Libraries

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Risner, Joel M; Wiarda, Dorothea; Miller, Thomas Martin

    2011-01-01

    The U.S. Nuclear Regulatory Commission's Regulatory Guide 1.190 states that calculational methods used to estimate reactor pressure vessel (RPV) fluence should use the latest version of the Evaluated Nuclear Data File (ENDF). The VITAMIN-B6 fine-group library and BUGLE-96 broad-group library, which are widely used for RPV fluence calculations, were generated using ENDF/B-VI data, which was the most current data when Regulatory Guide 1.190 was issued. We have developed new fine-group (VITAMIN-B7) and broad-group (BUGLE-B7) libraries based on ENDF/B-VII. These new libraries, which were processed using the AMPX code system, maintain the same group structures as the VITAMIN-B6 and BUGLE-96 libraries. Verification and validation of the new libraries was accomplished using diagnostic checks in AMPX, unit tests for each element in VITAMIN-B7, and a diverse set of benchmark experiments, including critical evaluations for fast and thermal systems, a set of experimental benchmarks that are used for SCALE regression tests, and three RPV fluence benchmarks. The benchmark evaluation results demonstrate that VITAMIN-B7 and BUGLE-B7 are appropriate for use in LWR shielding applications and meet the calculational uncertainty criterion in Regulatory Guide 1.190.

  13. Benchmarking and validation activities within JEFF project

    NASA Astrophysics Data System (ADS)

    Cabellos, O.; Alvarez-Velarde, F.; Angelone, M.; Diez, C. J.; Dyrda, J.; Fiorito, L.; Fischer, U.; Fleming, M.; Haeck, W.; Hill, I.; Ichou, R.; Kim, D. H.; Klix, A.; Kodeli, I.; Leconte, P.; Michel-Sendis, F.; Nunnenmann, E.; Pecchia, M.; Peneliau, Y.; Plompen, A.; Rochman, D.; Romojaro, P.; Stankovskiy, A.; Sublet, J. Ch.; Tamagno, P.; Marck, S. van der

    2017-09-01

    The challenge for any nuclear data evaluation project is to periodically release a revised, fully consistent and complete library, with all needed data and covariances, and ensure that it is robust and reliable for a variety of applications. Within an evaluation effort, benchmarking activities play an important role in validating proposed libraries. The Joint Evaluated Fission and Fusion (JEFF) Project aims to provide such a nuclear data library, and thus, requires a coherent and efficient benchmarking process. The aim of this paper is to present the activities carried out by the new JEFF Benchmarking and Validation Working Group, and to describe the role of the NEA Data Bank in this context. The paper will also review the status of preliminary benchmarking for the next JEFF-3.3 candidate cross-section files.

  14. SkyNet: Modular nuclear reaction network library

    NASA Astrophysics Data System (ADS)

    Lippuner, Jonas; Roberts, Luke F.

    2017-10-01

    The general-purpose nuclear reaction network SkyNet evolves the abundances of nuclear species under the influence of nuclear reactions. SkyNet can be used to compute the nucleosynthesis evolution in all astrophysical scenarios where nucleosynthesis occurs. Any list of isotopes can be evolved, and SkyNet supports many different types of nuclear reactions. SkyNet is modular, permitting new or existing physics, such as nuclear reactions or equations of state, to be easily added or modified.

  15. Norms Versus Security: What is More Important to Japan’s View of Nuclear Weapons

    DTIC Science & Technology

    2017-03-01

    objectives: “1) prevent the spread of nuclear weapons and weapons technology, 2) promote cooperation in the peaceful uses of nuclear energy, and 3...http://www.world-nuclear.org/information-library/safety-and-security/safety-of-plants/fukushima-accident.aspx. 40 “Japanese Wary of Nuclear Energy...PewResearchCenter, accessed February 22, 2017. http://www.pewglobal.org/2012/06/05/japanese-wary-of-nuclear-energy/ 41 Malcolm Foster, “Thousands

  16. Screening the 10K Tox21 chemical library for thyroid hormone receptor modulators

    EPA Science Inventory

    Few ligands for the thyroid hormone receptor (TR) have been identified outside of endogenous ligands and pharmaceuticals, which suggests that TR is a very selective nuclear receptor (NR). However, large and diverse chemical libraries, particularly of environmental chemicals, have...

  17. The effects of nuclear data library processing on Geant4 and MCNP simulations of the thermal neutron scattering law

    NASA Astrophysics Data System (ADS)

    Hartling, K.; Ciungu, B.; Li, G.; Bentoumi, G.; Sur, B.

    2018-05-01

    Monte Carlo codes such as MCNP and Geant4 rely on a combination of physics models and evaluated nuclear data files (ENDF) to simulate the transport of neutrons through various materials and geometries. The grid representation used for the final-state scattering energies and angles associated with neutron scattering interactions can significantly affect the predictions of these codes. In particular, the default thermal scattering libraries used by MCNP6.1 and Geant4.10.3 do not accurately reproduce the ENDF/B-VII.1 model in simulations of the double-differential cross section for thermal neutrons interacting with hydrogen nuclei in a thin layer of water. However, agreement between model and simulation can be achieved within the statistical error by re-processing ENDF/B-VII.1 thermal scattering libraries with the NJOY code. The structure of the thermal scattering libraries and sampling algorithms in MCNP and Geant4 are also reviewed.
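
    The grid-representation effect described above can be illustrated with a toy inverse-CDF tabulation. A generic Maxwellian-like spectrum stands in for the actual S(α,β) data, and none of this is the MCNP or Geant4 machinery; the point is only that a coarser energy grid reproduces the evaluated distribution less faithfully:

```python
import numpy as np

kT = 0.0253  # eV, room-temperature thermal energy

def tabulated_cdf(grid):
    """CDF of a Maxwellian-like spectrum tabulated on `grid` (trapezoid rule),
    normalised to 1 at the last grid point."""
    pdf = np.sqrt(grid) * np.exp(-grid / kT)
    steps = 0.5 * (pdf[1:] + pdf[:-1]) * np.diff(grid)
    cdf = np.concatenate(([0.0], np.cumsum(steps)))
    return cdf / cdf[-1]

dense = np.linspace(0.0, 0.5, 20_001)   # dense reference grid, eV
ref = tabulated_cdf(dense)

errs = {}
for n in (10, 100, 1000):
    grid = np.linspace(0.0, 0.5, n + 1)
    # linearly interpolate the coarse CDF back onto the dense reference grid
    errs[n] = np.max(np.abs(np.interp(dense, grid, tabulated_cdf(grid)) - ref))
    print(f"{n:4d} intervals: max CDF deviation = {errs[n]:.1e}")
```

    Since sampling codes draw outgoing energies by inverting such a tabulated CDF, the deviation shown here translates directly into a distortion of the sampled spectrum.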

  18. Nuclear Structure of 186Re

    DTIC Science & Technology

    2016-12-24

    D population-depopulation ... AME Atomic Mass Evaluation...this mass region are important for validating models of nuclear structure and reactions. The ENSDF feeds a specific data library relevant to nuclear...spherically asymmetric. Spherical asymmetry is common for nuclei between shell closures, such as those in the mid-shell 150 ≤ A ≤ 190 mass range of interest

  19. 20180311 - Screening the 10K Tox21 chemical library for thyroid hormone receptor modulators (SOT)

    EPA Science Inventory

    Few ligands for the thyroid hormone receptor (TR) have been identified outside of endogenous ligands and pharmaceuticals, which suggests that TR is a very selective nuclear receptor (NR). However, large and diverse chemical libraries, particularly of environmental chemicals, have...

  20. Determination of total plutonium content in spent nuclear fuel assemblies with the differential die-away self-interrogation instrument

    NASA Astrophysics Data System (ADS)

    Kaplan, Alexis C.; Henzl, Vladimir; Menlove, Howard O.; Swinhoe, Martyn T.; Belian, Anthony P.; Flaska, Marek; Pozzi, Sara A.

    2014-11-01

    As a part of the Next Generation Safeguards Initiative Spent Fuel project, we simulate the response of the Differential Die-away Self-Interrogation (DDSI) instrument to determine total elemental plutonium content in an assayed spent nuclear fuel assembly (SFA). We apply recently developed concepts that relate total plutonium mass with SFA multiplication and passive neutron count rate. In this work, the multiplication of the SFA is determined from the die-away time in the early time domain of the Rossi-Alpha distributions measured directly by the DDSI instrument. We utilize MCNP to test the method against 44 pressurized water reactor SFAs from a simulated spent fuel library with a wide dynamic range of characteristic parameters such as initial enrichment, burnup, and cooling time. Under ideal conditions, discounting possible errors of a real world measurement, a root mean square agreement between true and determined total Pu mass of 2.1% is achieved.
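
    The die-away time extraction described above amounts to fitting an exponential decay to the early time domain of the Rossi-Alpha histogram. A minimal log-linear least-squares sketch on noise-free synthetic data (not actual DDSI data; the 55 µs value is an arbitrary choice):

```python
import numpy as np

def fit_die_away(t, counts):
    """Return (amplitude, tau) from a log-linear least-squares fit of
    counts ~ A * exp(-t / tau)."""
    slope, intercept = np.polyfit(t, np.log(counts), 1)
    return np.exp(intercept), -1.0 / slope

t = np.linspace(0.0, 100.0, 50)        # time bins, microseconds
true_tau = 55.0                        # hypothetical die-away time, us
counts = 1e4 * np.exp(-t / true_tau)   # noise-free synthetic histogram

amp, tau = fit_die_away(t, counts)
print(f"fitted die-away time: {tau:.1f} us")   # prints "fitted die-away time: 55.0 us"
```

    On real coincidence histograms one would weight the fit by counting statistics; the log-linear form is just the simplest estimator of the decay constant.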

  1. Reactivity effects in VVER-1000 of the third unit of the Kalinin Nuclear Power Plant at physical start-up. Computations in ShIPR intellectual code system with library of two-group cross sections generated by UNK code

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zizin, M. N.; Zimin, V. G.; Zizina, S. N., E-mail: zizin@adis.vver.kiae.ru

    2010-12-15

    The ShIPR intellectual code system for mathematical simulation of nuclear reactors includes a set of computing modules implementing the preparation of macro cross sections on the basis of the two-group library of neutron-physics cross sections obtained for the SKETCH-N nodal code. This library is created by using the UNK code for 3D diffusion computation of first VVER-1000 fuel loadings. Computation of neutron fields in the ShIPR system is performed using the DP3 code in the two-group diffusion approximation in 3D triangular geometry. The efficiency of all groups of control rods for the first fuel loading of the third unit of the Kalinin Nuclear Power Plant is computed. The temperature, barometric, and density effects of reactivity as well as the reactivity coefficient due to the concentration of boric acid in the reactor were computed additionally. Results of computations are compared with the experiment.

  2. Reactivity effects in VVER-1000 of the third unit of the Kalinin Nuclear Power Plant at physical start-up. Computations in ShIPR intellectual code system with library of two-group cross sections generated by UNK code

    NASA Astrophysics Data System (ADS)

    Zizin, M. N.; Zimin, V. G.; Zizina, S. N.; Kryakvin, L. V.; Pitilimov, V. A.; Tereshonok, V. A.

    2010-12-01

    The ShIPR intellectual code system for mathematical simulation of nuclear reactors includes a set of computing modules implementing the preparation of macro cross sections on the basis of the two-group library of neutron-physics cross sections obtained for the SKETCH-N nodal code. This library is created by using the UNK code for 3D diffusion computation of first VVER-1000 fuel loadings. Computation of neutron fields in the ShIPR system is performed using the DP3 code in the two-group diffusion approximation in 3D triangular geometry. The efficiency of all groups of control rods for the first fuel loading of the third unit of the Kalinin Nuclear Power Plant is computed. The temperature, barometric, and density effects of reactivity as well as the reactivity coefficient due to the concentration of boric acid in the reactor were computed additionally. Results of computations are compared with the experiment.

  3. The Army Study Program Fiscal Year 1988 Report

    DTIC Science & Technology

    1988-03-24

    Studies Institute (ATTN: AWCI) 4 Defense Nuclear Agency (ATTN: LASS) 1 Engineering Studies Center (ATTN: ESC) 5 Commandant: US Army War College (ATTN...Library) 5 US Navy War College (ATTN: Library) 5 US Air War College (ATTN: Library) 5 Chief of Naval Operations (ATTN: 0P916) 5 Headquarters, US Air... war . Agreement is needed on the key words describing Army functional areas and related terms and on the relationships among them and other factors

  4. The IR properties of ringed galaxies and the IRAS database

    NASA Technical Reports Server (NTRS)

    Buta, Ronald J.; Crocker, Deborah A.

    1993-01-01

    Our study of the Infrared Astronomical Satellite (IRAS) properties of ringed galaxies has been largely successful. We have identified what we think is the probable cause of the differences in the IRAS properties among non-interacting barred galaxies as the pattern speed of the bar. The key to identifying this parameter has been focusing the study on outer-ringed galaxies, where we know precisely what is present in the central regions (from available BVI CCD images in our library of images). The theory is that outer rings, through their morphology and other characteristics, can be identified with the outer Lindblad resonance, one of the major resonances in galaxy structure. Using a library of n-body simulations for comparison, we can reliably infer both low and high pattern speed galaxies from the appearance of outer rings and the existence of other ring features. It is clear that in some barred galaxies the bar pattern speed is high enough to avoid an inner Lindblad resonance; hence such objects do not contain nuclear or circumnuclear star formation. The IRAS observations are most sensitive to nuclear star formation in early-type barred galaxies, and will thus select those barred galaxies where the pattern speed is low enough to allow an inner Lindblad resonance to exist. High pattern speed barred galaxies therefore weaken the correlation between bars and infrared excess. This finding helps to reconcile the inconsistent results found between different studies on the correlation between bars and far-IR emission.

  5. Propagation of neutron-reaction uncertainties through multi-physics models of novel LWR's

    NASA Astrophysics Data System (ADS)

    Hernandez-Solis, Augusto; Sjöstrand, Henrik; Helgesson, Petter

    2017-09-01

    The novel design of the renewable boiling water reactor (RBWR) allows a breeding ratio greater than unity and thus, it aims at providing for a self-sustained fuel cycle. The neutron reactions that compose the different microscopic cross-sections and angular distributions are uncertain, so when they are employed in the determination of the spatial distribution of the neutron flux in a nuclear reactor, a methodology should be employed to account for these associated uncertainties. In this work, the Total Monte Carlo (TMC) method is used to propagate the different neutron-reactions (as well as angular distributions) covariances that are part of the TENDL-2014 nuclear data (ND) library. The main objective is to propagate them through coupled neutronic and thermal-hydraulic models in order to assess the uncertainty of important safety parameters related to multi-physics, such as peak cladding temperature along the axial direction of an RBWR fuel assembly. The objective of this study is to quantify the impact that ND covariances of important nuclides such as U-235, U-238, Pu-239 and the thermal scattering of hydrogen in H2O have in the deterministic safety analysis of novel nuclear reactors designs.
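
    The Total Monte Carlo idea can be caricatured in a few lines: draw many random realizations of the uncertain nuclear data, run the model once per realization, and take the spread of the outputs as the propagated uncertainty. The "model" below is a deliberately simple stand-in, not a coupled neutronic/thermal-hydraulic calculation, and all numbers are invented:

```python
import numpy as np

rng = np.random.default_rng(42)

def model(capture_xs):
    """Toy response: an output observable that depends nonlinearly on the
    sampled cross section (units arbitrary)."""
    return 1.0 / (1.0 + capture_xs)

# 500 random realizations, mimicking a TENDL-style set of sampled data files
samples = rng.normal(loc=2.0, scale=0.1, size=500)   # hypothetical xs, barns
outputs = np.array([model(x) for x in samples])

# the output distribution's spread is the propagated uncertainty
print(f"output mean = {outputs.mean():.4f}, std = {outputs.std(ddof=1):.4f}")
```

    In a real TMC study each "sample" is an entire random nuclear data file and each model run is a full transport calculation, which is why the method is computationally expensive but free of linearity assumptions.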

  6. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Van Der Marck, S. C.

    Three nuclear data libraries have been tested extensively using criticality safety benchmark calculations. The three libraries are the new release of the US library ENDF/B-VII.1 (2011), the new release of the Japanese library JENDL-4.0 (2011), and the OECD/NEA library JEFF-3.1 (2006). All calculations were performed with the continuous-energy Monte Carlo code MCNP (version 4C3, as well as version 6-beta1). Around 2000 benchmark cases from the International Handbook of Criticality Safety Benchmark Experiments (ICSBEP) were used. The results were analyzed per ICSBEP category, and per element. Overall, the three libraries show similar performance on most criticality safety benchmarks. The largest differencesmore » are probably caused by elements such as Be, C, Fe, Zr, W. (authors)« less

  7. Statistical uncertainty analysis applied to the DRAGONv4 code lattice calculations and based on JENDL-4 covariance data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hernandez-Solis, A.; Demaziere, C.; Ekberg, C.

    2012-07-01

    In this paper, multi-group microscopic cross-section uncertainty is propagated through the DRAGON (version 4) lattice code in order to perform uncertainty analysis on k∞ and two-group homogenized macroscopic cross-section predictions. A statistical methodology is employed for this purpose, where cross sections of certain isotopes of various elements belonging to the 172-group DRAGLIB library format are considered as normal random variables. This library is based on JENDL-4 data, because JENDL-4 contains the largest amount of isotopic covariance matrices among the major nuclear data libraries. The aim is to propagate multi-group nuclide uncertainty by running the DRAGONv4 code 500 times and to assess the output uncertainty of a test case corresponding to a 17 x 17 PWR fuel assembly segment without poison. The chosen sampling strategy for the current study is Latin Hypercube Sampling (LHS). The quasi-random LHS allows a much better coverage of the input uncertainties than simple random sampling (SRS), because it densely stratifies across the range of each input probability distribution. Output uncertainty assessment is based on the tolerance-limits concept, where the sample formed by the code calculations is inferred to cover 95% of the output population with at least 95% confidence. This analysis is the first attempt to propagate parameter uncertainties of modern multi-group libraries, which are used to feed advanced lattice codes that perform state-of-the-art resonant self-shielding calculations such as DRAGONv4. (authors)
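
    A minimal Latin Hypercube Sampling implementation (an assumed sketch of the general technique, not the tool used in the paper) shows the stratification property that distinguishes LHS from simple random sampling: each input dimension gets exactly one sample per equal-probability stratum.

```python
import numpy as np

rng = np.random.default_rng(1)

def latin_hypercube(n, d):
    """Return an (n, d) LHS design on the unit hypercube."""
    # one point per stratum, jittered uniformly within the stratum
    u = (np.arange(n)[:, None] + rng.random((n, d))) / n
    # independent random permutation of the strata in every dimension
    for j in range(d):
        u[:, j] = u[rng.permutation(n), j]
    return u

design = latin_hypercube(500, 3)
# every column has exactly one point in each of the 500 strata
print(np.histogram(design[:, 0], bins=500, range=(0, 1))[0].max())   # prints 1
```

    Samples on the unit cube are then mapped through the inverse CDFs of the input distributions (here, the normal distributions assigned to the multi-group cross sections).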

  8. libSRES: a C library for stochastic ranking evolution strategy for parameter estimation.

    PubMed

    Ji, Xinglai; Xu, Ying

    2006-01-01

    Estimation of kinetic parameters in a biochemical pathway or network represents a common problem in systems studies of biological processes. We have implemented a C library, named libSRES, to facilitate a fast implementation of computer software for study of non-linear biochemical pathways. This library implements a (mu, lambda)-ES evolutionary optimization algorithm that uses stochastic ranking as the constraint handling technique. Considering the amount of computing time it might require to solve a parameter-estimation problem, an MPI version of libSRES is provided for parallel implementation, as well as a simple user interface. libSRES is freely available and could be used directly in any C program as a library function. We have extensively tested the performance of libSRES on various pathway parameter-estimation problems and found its performance to be satisfactory. The source code (in C) is free for academic users at http://csbl.bmb.uga.edu/~jix/science/libSRES/
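
    libSRES itself is written in C; the sketch below only illustrates, in Python, the algorithm family it implements: a (mu, lambda) evolution strategy whose selection uses stochastic ranking (Runarsson and Yao) as the constraint-handling technique. The toy problem, population sizes, and annealing schedule are arbitrary choices, not libSRES defaults.

```python
import numpy as np

rng = np.random.default_rng(7)

def stochastic_rank(f, phi, pf=0.45):
    """Bubble-sort-like ranking: adjacent pairs are compared by objective f
    when both are feasible (or with probability pf), else by the total
    constraint violation phi. Returns the ordered indices."""
    idx = np.arange(len(f))
    for _ in range(len(f)):
        swapped = False
        for i in range(len(f) - 1):
            a, b = idx[i], idx[i + 1]
            use_objective = (phi[a] == phi[b] == 0) or rng.random() < pf
            key = f if use_objective else phi
            if key[a] > key[b]:
                idx[i], idx[i + 1] = b, a
                swapped = True
        if not swapped:
            break
    return idx

def sres(f, phi, dim, mu=10, lam=60, gens=100, sigma=0.3):
    """Toy (mu, lambda)-ES with stochastic ranking selection."""
    pop = rng.normal(0.0, 1.0, (lam, dim))
    for _ in range(gens):
        order = stochastic_rank(np.array([f(x) for x in pop]),
                                np.array([phi(x) for x in pop]))
        parents = pop[order[:mu]]
        pop = parents[rng.integers(mu, size=lam)] + rng.normal(0, sigma, (lam, dim))
        sigma *= 0.97                      # simple deterministic step-size annealing
    feasible = [x for x in pop if phi(x) == 0]
    return min(feasible, key=f) if feasible else min(pop, key=phi)

# Toy constrained problem: minimise sum(x^2) subject to x[0] >= 1 (optimum f = 1).
f   = lambda x: float(np.sum(x * x))
phi = lambda x: float(max(0.0, 1.0 - x[0]))    # violation of x[0] >= 1

best = sres(f, phi, dim=3)
print(f"best feasible f = {f(best):.3f}")      # expect a value close to 1.0
```

    The production library adds self-adaptive per-coordinate step sizes and, in its MPI build, parallel evaluation of the lambda offspring, which matters when each objective evaluation is a costly pathway simulation.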

  9. Towards a library of synthetic galaxy spectra and preliminary results of classification and parametrization of unresolved galaxies for Gaia. II

    NASA Astrophysics Data System (ADS)

    Tsalmantza, P.; Kontizas, M.; Rocca-Volmerange, B.; Bailer-Jones, C. A. L.; Kontizas, E.; Bellas-Velidis, I.; Livanou, E.; Korakitis, R.; Dapergolas, A.; Vallenari, A.; Fioc, M.

    2009-09-01

    Aims: This paper is the second in a series, implementing a classification system for Gaia observations of unresolved galaxies. Our goals are to determine spectral classes and estimate intrinsic astrophysical parameters via synthetic templates. Here we describe (1) a new extended library of synthetic galaxy spectra; (2) its comparison with various observations; and (3) first results of classification and parametrization experiments using simulated Gaia spectrophotometry of this library. Methods: Using the PÉGASE.2 code, based on galaxy evolution models that take account of metallicity evolution, extinction correction, and emission lines (with stellar spectra based on the BaSeL library), we improved our first library and extended it to cover the domain of most of the SDSS catalogue. Our classification and regression models were support vector machines (SVMs). Results: We produce an extended library of 28 885 synthetic galaxy spectra at zero redshift covering four general Hubble types of galaxies, over the wavelength range between 250 and 1050 nm at a sampling of 1 nm or less. The library is also produced for 4 random values of redshift in the range of 0-0.2. It is computed on a random grid of four key astrophysical parameters (infall timescale and 3 parameters defining the SFR) and, depending on the galaxy type, on two values of the age of the galaxy. The synthetic library was compared and found to be in good agreement with various observations. The first results from the SVM classifiers and parametrizers are promising, indicating that Hubble types can be reliably predicted and several parameters estimated with low bias and variance.

  10. General Economic and Demographic Background and Projections for Indiana Library Services.

    ERIC Educational Resources Information Center

    Foust, James D.; Tower, Carl B.

    Before future library needs can be estimated, economic and demographic variables that influence the demand for library services must be projected and estimating equations relating library needs to economic and demographic parameters developed. This study considers the size, location and age-sex characteristics of Indiana's current population and…

  11. Editorial Library: User Survey.

    ERIC Educational Resources Information Center

    Surace, Cecily J.

    This report presents the findings of a survey conducted by the editorial library of the Los Angeles Times to measure usage and satisfaction with library service, provide background information on library user characteristics, collect information on patterns of use of the Times' clipping files, relate data on usage and satisfaction parameters to…

  12. 77 FR 28407 - Special Nuclear Material Control and Accounting Systems for Nuclear Power Plants

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-05-14

    ... and Management System (ADAMS): You may access publicly-available documents online in the NRC Library... revised ANSI N15.8 in February 2009. ANSI N15.8-2009 provides guidance on the fundamentals of an SNM...

  13. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Brown, D. A.; Chadwick, M. B.; Capote, R.

    We describe the new ENDF/B-VIII.0 evaluated nuclear reaction data library. ENDF/B-VIII.0 fully incorporates the new IAEA standards, includes improved thermal neutron scattering data and uses new evaluated data from the CIELO project for neutron reactions on 1H, 16O, 56Fe, 235U, 238U and 239Pu described in companion papers in the present issue of Nuclear Data Sheets. The evaluations benefit from recent experimental data obtained in the U.S. and Europe, and improvements in theory and simulation. Notable advances include updated evaluated data for light nuclei, structural materials, actinides, fission energy release, prompt fission neutron and γ-ray spectra, thermal neutron scattering data, and charged-particle reactions. Integral validation testing is shown for a wide range of criticality, reaction rate, and neutron transmission benchmarks. In general, integral validation performance of the library is improved relative to the previous ENDF/B-VII.1 library.

  14. On use of ZPR research reactors and associated instrumentation and measurement methods for reactor physics studies

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chauvin, J.P.; Blaise, P.; Lyoussi, A.

    2015-07-01

    The French Alternative Energies and Atomic Energy Commission (CEA) is strongly involved in research and development programs concerning the use of nuclear energy as a clean and reliable source of energy, and is consequently working on present and future generations of reactors on various topics such as ageing plant management, optimization of the plutonium stockpile, waste management and the exploration of innovative systems. Core physics studies are an essential part of this comprehensive R and D effort. In particular, the zero power reactors (ZPRs) of the CEA, EOLE, MINERVE and MASURCA, play an important role in the validation of neutron (as well as photon) physics calculation tools (codes and nuclear data). The experimental programs defined in the CEA's ZPR facilities aim at improving the calculation routes by reducing the uncertainties of the experimental databases. They also provide accurate data on innovative systems in terms of new materials (moderating and decoupling materials) and new concepts (ADS, ABWR, new MTRs (e.g. JHR), GEN IV) involving new fuels, absorbers and coolant materials. Conducting such experimental R and D programs relies on determining and measuring the main parameters of the phenomena of interest in order to qualify calculation tools and nuclear data libraries. Determining these parameters in turn relies on the use of numerous different experimental techniques with specific and appropriate instrumentation and detection tools. The main ZPR experimental programs at the CEA, their objectives and challenges will be presented and discussed. Future developments and perspectives regarding the ZPR reactors and associated programs will also be presented. (authors)

  15. 75 FR 41902 - License Nos. DPR-31 and DPR-41; Florida Power & Light Company; Notice of Issuance of Director's...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-07-19

    ...), Rockville, Maryland, and from the ADAMS Public Library component on the NRC's Web site, http://www.nrc.gov... initiating a review of nuclear safety culture issues by the corporate nuclear review board, benchmarking SCWE...

  16. Testing of the ABBN-RF multigroup data library in photon transport calculations

    NASA Astrophysics Data System (ADS)

    Koscheev, Vladimir; Lomakov, Gleb; Manturov, Gennady; Tsiboulia, Anatoly

    2017-09-01

    Gamma radiation is produced in both nuclear fuel and shielding materials. Photon interaction data are known with adequate accuracy, but secondary gamma-ray production is known much less well. The purpose of this work is to study secondary gamma-ray production from neutron-induced reactions in iron and lead using the MCNP code and modern nuclear data libraries such as ROSFOND, ENDF/B-VII.1, JEFF-3.2 and JENDL-4.0. The calculations show that these libraries contain different photon-production data for neutron-induced reactions and agree poorly with the evaluated benchmark experiment. The ABBN-RF multigroup cross-section library is based on the ROSFOND data. It is provided in two forms of microscopic cross sections: the ABBN and MATXS formats. Comparison of group-wise calculations using both ABBN and MATXS data with point-wise calculations using the ROSFOND library shows good agreement. The discrepancies between calculated and experimental (C/E) results for the neutron spectra lie within the experimental errors; for the photon spectrum they lie outside them. Calculations using group-wise and point-wise representations of the cross sections agree well for both photon and neutron spectra.
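    The group-wise versus point-wise comparison rests on the standard flux-weighted collapse of a point-wise cross section into group constants, sigma_g = ∫_g sigma(E) phi(E) dE / ∫_g phi(E) dE. A minimal sketch, using an invented 1/v cross section and 1/E weighting spectrum rather than any ABBN-RF data:

```python
import numpy as np

E = np.logspace(-3, 7, 5000)          # energy grid, eV
sigma = 10.0 / np.sqrt(E)             # toy 1/v cross section, barns
phi = 1.0 / E                         # toy 1/E weighting spectrum
dE = np.gradient(E)                   # local grid widths for the quadrature

bounds = np.logspace(-3, 7, 11)       # a coarse 10-group structure, one decade per group
sigma_g = np.empty(len(bounds) - 1)
for g, (lo, hi) in enumerate(zip(bounds[:-1], bounds[1:])):
    m = (E >= lo) & (E < hi)
    # flux-weighted average of sigma over the group
    sigma_g[g] = np.sum(sigma[m] * phi[m] * dE[m]) / np.sum(phi[m] * dE[m])

print(np.round(sigma_g, 4))           # decreases with energy, as a 1/v cross section must
```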

  17. RADIOLOGICAL SEALED SOURCE LIBRARY: A NUCLEAR FORENSICS TOOL

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Canaday, Jodi; Chamberlain, David; Finck, Martha

    If a terrorist were to obtain and possibly detonate a device that contained radiological material, radiological forensic analysis of the material and source capsule could provide law enforcement with valuable clues about the origin of the radiological material; this information could then provide further leads on where the material and sealed source were obtained, and on the point at which control was lost. This information could potentially be used for attribution and prosecution. Nuclear forensic signatures for radiological materials are generally understood to include isotopic ratios, trace element concentrations, the time since irradiation or purification, and morphology. Radiological forensic signatures for sealed sources provide additional information that leverages the physical design and chemical composition of the source capsule and containers, and physical markings indicative of an owner or manufacturer. Argonne National Laboratory (Argonne), in collaboration with Idaho National Laboratory (INL), has been working since 2003 to understand signatures that could be used to identify specific source manufacturers. These signatures include the materials from which the capsule is constructed, dimensions, weld details, elemental composition, and isotopic abundances of the radioactive material. These signatures have been compiled in a library known as the Argonne/INL Radiological Sealed Source Library. Data collected for the library have included open-source information from vendor catalogs and web pages; discussions with source manufacturers and tours of production facilities (both protected through non-disclosure agreements); technical publications; and government registries such as the U.S. Nuclear Regulatory Commission's Sealed Source and Device Registry.

  18. Peace and Conflict: Resources Available from the Manitoba Education Library.

    ERIC Educational Resources Information Center

    Barich, Phyllis

    This bibliography of books, kits, and films for elementary and secondary education, available from Manitoba (Canada) Education Library, covers the topics of peace education, nuclear issues, violence, and the history of war. The list contains 55 books, 21 kits, and 50 16mm films. The films include the 13-part "Canada at War Series" and…

  19. The NJOY Nuclear Data Processing System, Version 2016

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Macfarlane, Robert; Muir, Douglas W.; Boicourt, R. M.

    The NJOY Nuclear Data Processing System, version 2016, is a comprehensive computer code package for producing pointwise and multigroup cross sections and related quantities from evaluated nuclear data in the ENDF-4 through ENDF-6 legacy card-image formats. NJOY works with evaluated files for incident neutrons, photons, and charged particles, producing libraries for a wide variety of particle transport and reactor analysis codes.

  20. Development of ORIGEN Libraries for Mixed Oxide (MOX) Fuel Assembly Designs

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mertyurek, Ugur; Gauld, Ian C.

    In this research, ORIGEN cross section libraries for reactor-grade mixed oxide (MOX) fuel assembly designs have been developed to provide fast and accurate depletion calculations to predict nuclide inventories, radiation sources and thermal decay heat information needed in safety evaluations and safeguards verification measurements of spent nuclear fuel. These ORIGEN libraries are generated using two-dimensional lattice physics assembly models that include enrichment zoning and cross section data based on ENDF/B-VII.0 evaluations. Using the SCALE depletion sequence, burnup-dependent cross sections are created for selected commercial reactor assembly designs and a representative range of reactor operating conditions, fuel enrichments, and fuel burnup. The burnup-dependent cross sections are then interpolated to provide problem-dependent cross sections for ORIGEN, avoiding the need for time-consuming lattice physics calculations. The ORIGEN libraries for MOX assembly designs are validated against destructive radiochemical assay measurements of MOX fuel from the MALIBU international experimental program. This program included measurements of MOX fuel from a 15 × 15 pressurized water reactor assembly and a 9 × 9 boiling water reactor assembly. The ORIGEN MOX libraries are also compared against detailed assembly calculations from the Phase IV-B numerical MOX fuel burnup credit benchmark coordinated by the Nuclear Energy Agency within the Organization for Economic Cooperation and Development. Finally, the nuclide compositions calculated by ORIGEN using the MOX libraries are shown to be in good agreement with other physics codes and with experimental data.
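    The interpolation step described above can be sketched generically: pre-tabulated burnup-dependent one-group cross sections are interpolated to the problem's burnup instead of rerunning lattice physics for every case. The grid and values below are invented placeholders, not ORIGEN library data.

```python
import numpy as np

# Hypothetical pre-tabulated one-group capture cross section vs. burnup.
burnup_grid = np.array([0.0, 10.0, 20.0, 30.0, 40.0, 50.0])   # GWd/tHM
capture_xs = np.array([30.0, 28.5, 27.4, 26.6, 26.0, 25.6])   # barns (toy trend)

def xs_at(burnup):
    """Linear interpolation onto the problem-dependent burnup."""
    return np.interp(burnup, burnup_grid, capture_xs)

print(xs_at(25.0))   # midway between the 20 and 30 GWd/tHM entries
```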

  1. Development of ORIGEN Libraries for Mixed Oxide (MOX) Fuel Assembly Designs

    DOE PAGES

    Mertyurek, Ugur; Gauld, Ian C.

    2015-12-24

    In this research, ORIGEN cross section libraries for reactor-grade mixed oxide (MOX) fuel assembly designs have been developed to provide fast and accurate depletion calculations to predict nuclide inventories, radiation sources and thermal decay heat information needed in safety evaluations and safeguards verification measurements of spent nuclear fuel. These ORIGEN libraries are generated using two-dimensional lattice physics assembly models that include enrichment zoning and cross section data based on ENDF/B-VII.0 evaluations. Using the SCALE depletion sequence, burnup-dependent cross sections are created for selected commercial reactor assembly designs and a representative range of reactor operating conditions, fuel enrichments, and fuel burnup. The burnup-dependent cross sections are then interpolated to provide problem-dependent cross sections for ORIGEN, avoiding the need for time-consuming lattice physics calculations. The ORIGEN libraries for MOX assembly designs are validated against destructive radiochemical assay measurements of MOX fuel from the MALIBU international experimental program. This program included measurements of MOX fuel from a 15 × 15 pressurized water reactor assembly and a 9 × 9 boiling water reactor assembly. The ORIGEN MOX libraries are also compared against detailed assembly calculations from the Phase IV-B numerical MOX fuel burnup credit benchmark coordinated by the Nuclear Energy Agency within the Organization for Economic Cooperation and Development. Finally, the nuclide compositions calculated by ORIGEN using the MOX libraries are shown to be in good agreement with other physics codes and with experimental data.

  2. Uncertainty-driven nuclear data evaluation including thermal (n,α) applied to 59Ni

    NASA Astrophysics Data System (ADS)

    Helgesson, P.; Sjöstrand, H.; Rochman, D.

    2017-11-01

    This paper presents a novel approach to the evaluation of nuclear data (ND), combining experimental data for thermal cross sections with resonance parameters and nuclear reaction modeling. The method involves sampling of various uncertain parameters, in particular uncertain components in experimental setups, and provides extensive covariance information, including consistent cross-channel correlations over the whole energy spectrum. The method is developed for, and applied to, 59Ni, but may be used as a whole, or in part, for other nuclides. 59Ni is particularly interesting since a substantial amount of 59Ni is produced in thermal nuclear reactors by neutron capture in 58Ni and since it has a non-threshold (n,α) cross section. Therefore, 59Ni makes a very important contribution to the helium production in stainless steel in a thermal reactor. However, current evaluated ND libraries contain old information for 59Ni, without any uncertainty information. The work includes a study of thermal cross section experiments and a novel combination of this experimental information, giving the full multivariate distribution of the thermal cross sections. In particular, the thermal (n,α) cross section is found to be 12.7 ± 0.7 b. This is consistent with, yet different from, currently established values. Further, the distribution of thermal cross sections is combined with reported resonance parameters, and with TENDL-2015 data, to provide full random ENDF files; all of this is done in a novel way, keeping uncertainties and correlations in mind. The random files are also condensed into one single ENDF file with covariance information, which is now part of a beta version of JEFF 3.3. Finally, the random ENDF files have been processed and used in an MCNP model to study the helium production in stainless steel.
The increase in the (n,α) rate due to 59Ni compared to fresh stainless steel is found to be a factor of 5.2 at a certain time in the reactor vessel, with a relative uncertainty due to the 59Ni data of 5.4%.
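    The sampling idea at the core of the method can be illustrated with a toy propagation of uncertain experimental components (here, a normalization factor and a background subtraction) into a distribution for a thermal cross section. All numbers below are invented for illustration; this is not the 59Ni evaluation itself.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 100_000
measured = 12.7                           # nominal measured value, barns (toy)
norm = rng.normal(1.0, 0.04, n)           # uncertain normalization, 4% (assumed)
background = rng.normal(0.0, 0.2, n)      # uncertain background, barns (assumed)

# Each sample is one realization of the "true" cross section consistent
# with the assumed experimental uncertainties.
samples = measured * norm - background

print(samples.mean(), samples.std())      # the propagated mean and spread
```

In the actual evaluation, many such uncertain components across several experiments are sampled jointly, yielding the full multivariate distribution rather than a single spread.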

  3. 78 FR 48504 - Proposed Revisions to Maintenance Rule Standard Review Plan

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-08-08

    ... Review Plan AGENCY: Nuclear Regulatory Commission. ACTION: Standard review plan-draft section revision... Review Plan for the Review of Safety Analysis Reports for Nuclear Power Plants: LWR Edition,'' Section 17... and Management System (ADAMS): You may access publicly available documents online in the NRC Library...

  4. Nuclear choices

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wolfson, R.

    This book is part of the series New Liberal Arts, which is intended to make science and technology more accessible to students of the liberal arts. The volume in hand provides a comprehensive, multifaceted examination of nuclear energy in nontechnical terms. Wolfson explains the basics of nuclear energy and radiation, nuclear power..., and nuclear weapons..., and he invites readers to make their own judgments on controversial nuclear issues. Illustrated with photos and diagrams. Each chapter contains suggestions for additional reading and a glossary. For policy, science, and general collections in all libraries. (ES) Topics covered include: atoms and nuclei; effects and uses of radiation; energy and people; reactor safety; nuclear strategy; defense in the nuclear age; and nuclear power, nuclear weapons, and nuclear futures.

  5. Correlated Production and Analog Transport of Fission Neutrons and Photons using Fission Models FREYA, FIFRELIN and the Monte Carlo Code TRIPOLI-4®.

    NASA Astrophysics Data System (ADS)

    Verbeke, Jérôme M.; Petit, Odile; Chebboubi, Abdelhazize; Litaize, Olivier

    2018-01-01

    Fission modeling in general-purpose Monte Carlo transport codes often relies on average nuclear data provided by international evaluation libraries. As such, only average fission multiplicities are available, and correlations between fission neutrons and photons are missing. Whereas uncorrelated fission physics is usually sufficient for standard reactor core and radiation shielding calculations, correlated fission secondaries are required for specialized nuclear instrumentation and detector modeling. For coincidence counting detector optimization, for instance, precise simulation of fission neutrons and photons that remain correlated in time from birth to detection is essential. New developments were recently integrated into the Monte Carlo transport code TRIPOLI-4 to model fission physics more precisely, the purpose being to access fission secondaries on an event-by-event basis from two different fission models: FREYA and FIFRELIN. TRIPOLI-4 simulations can now be performed either by connecting via an API to the LLNL fission library, which includes FREYA, or by reading external fission event data files produced beforehand by FIFRELIN. These new capabilities enable us to easily compare results from Monte Carlo transport calculations using the two fission models in a nuclear instrumentation application. In the first part of this paper, the broad underlying principles of the two fission models are recalled. We then present experimental measurements of neutron angular correlations for 252Cf(sf) and 240Pu(sf). The correlations were measured for several neutron kinetic energy thresholds. In the latter part of the paper, simulation results are compared to experimental data. Spontaneous fissions in 252Cf and 240Pu are modeled by FREYA or FIFRELIN. Emitted neutrons and photons are subsequently transported to an array of scintillators by TRIPOLI-4 in analog mode to preserve their correlations.
Angular correlations between fission neutrons obtained independently from these TRIPOLI-4 simulations, using either FREYA or FIFRELIN, are compared to experimental results. For 240Pu(sf), the measured correlations were used to tune the model parameters.

  6. Graphical Environment Tools for Application to Gamma-Ray Energy Tracking Arrays

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Todd, Richard A.; Radford, David C.

    2013-12-30

    Highly segmented, position-sensitive germanium detector systems are being developed for nuclear physics research where traditional electronic signal processing with mixed analog and digital function blocks would be enormously complex and costly. Future systems will be constructed using pipelined processing of high-speed digitized signals as is done in the telecommunications industry. Techniques which provide rapid algorithm and system development for future systems are desirable. This project has used digital signal processing concepts and existing graphical system design tools to develop a set of re-usable modular functions and libraries targeted for the nuclear physics community. Researchers working with complex nuclear detector arrays such as the Gamma-Ray Energy Tracking Array (GRETA) have been able to construct advanced data processing algorithms for implementation in field programmable gate arrays (FPGAs) through application of these library functions using intuitive graphical interfaces.

  7. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nordborg, C.

    A new improved version of the OECD Nuclear Energy Agency (NEA) co-ordinated Joint Evaluated Fission and Fusion (JEFF) data library, JEFF-3.1, was released in May 2005. It comprises a general purpose library and the following five special purpose libraries: activation; thermal scattering law; radioactive decay; fission yield; and proton library. The objective of the previous version of the library (JEFF-2.2) was to achieve improved performance for existing reactors and fuel cycles. In addition to this objective, the JEFF-3.1 library aims to provide users with data for a wider range of applications. These include innovative reactor concepts, transmutation of radioactive waste, fusion, and various other energy and non-energy related industrial applications. Initial benchmark testing has confirmed the expected very good performance of the JEFF-3.1 library. Additional benchmarking of the libraries is underway, both for the general purpose and for the special purpose libraries. A new three-year mandate to continue developing the JEFF library was recently granted by the NEA. For the next version of the library, JEFF-3.2, it is foreseen to put more effort into fission product and minor actinide evaluations, as well as the inclusion of more covariance data. (authors)

  8. Stochastic hyperfine interactions modeling library-Version 2

    NASA Astrophysics Data System (ADS)

    Zacate, Matthew O.; Evenson, William E.

    2016-02-01

    The stochastic hyperfine interactions modeling library (SHIML) provides a set of routines to assist in the development and application of stochastic models of hyperfine interactions. The library provides routines written in the C programming language that (1) read a text description of a model for fluctuating hyperfine fields, (2) set up the Blume matrix, upon which the evolution operator of the system depends, and (3) find the eigenvalues and eigenvectors of the Blume matrix so that theoretical spectra of experimental techniques that measure hyperfine interactions can be calculated. The optimized vector and matrix operations of the BLAS and LAPACK libraries are utilized. The original version of SHIML constructed and solved Blume matrices for methods that measure hyperfine interactions of nuclear probes in a single spin state. Version 2 provides additional support for methods that measure interactions on two different spin states such as Mössbauer spectroscopy and nuclear resonant scattering of synchrotron radiation. Example codes are provided to illustrate the use of SHIML to (1) generate perturbed angular correlation spectra for the special case of polycrystalline samples when anisotropy terms of higher order than A22 can be neglected and (2) generate Mössbauer spectra for polycrystalline samples for pure dipole or pure quadrupole transitions.
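    The Blume-matrix workflow that SHIML automates (build the matrix, diagonalize it, assemble a relaxation function or spectrum from the eigenvalues and eigenvectors) can be sketched for the simplest case of a probe frequency jumping between two values. This toy 2×2 scalar model and its parameter values are illustrative assumptions, not SHIML's API or data.

```python
import numpy as np

# A probe's precession frequency jumps between +w0 and -w0 at rate r.
# The relaxation function follows from the eigendecomposition of the
# (here 2x2, complex) Blume-like matrix.
w0, r = 2.0, 0.5                       # frequency and jump rate (arbitrary units)
B = np.array([[1j * w0 - r, r],
              [r, -1j * w0 - r]])

vals, vecs = np.linalg.eig(B)          # eigenvalues give damped oscillation modes
p0 = np.array([0.5, 0.5])              # equal initial occupation of both states

def G(t):
    """Relaxation function G(t) = sum_jk p_j [exp(B t)]_jk."""
    expBt = vecs @ np.diag(np.exp(vals * t)) @ np.linalg.inv(vecs)
    return (p0 @ expBt).sum().real

print(G(0.0))   # starts at ~1.0 by construction, then damps and oscillates
```

SHIML performs the analogous construction and eigenproblem for much larger state spaces (and, in version 2, for two probe spin states), delegating the linear algebra to BLAS/LAPACK.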

  9. Probabilistic accident consequence uncertainty analysis -- Late health effects uncertainty assessment. Volume 2: Appendices

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Little, M.P.; Muirhead, C.R.; Goossens, L.H.J.

    1997-12-01

    The development of two new probabilistic accident consequence codes, MACCS and COSYMA, was completed in 1990. These codes estimate the consequences of accidental releases of radiological material from hypothesized accidents at nuclear installations. In 1991, the US Nuclear Regulatory Commission and the Commission of the European Communities began cosponsoring a joint uncertainty analysis of the two codes. The ultimate objective of this joint effort was to systematically develop credible and traceable uncertainty distributions for the respective code input variables. A formal expert judgment elicitation and evaluation process was identified as the best technology available for developing a library of uncertainty distributions for these consequence parameters. This report focuses on the results of the study to develop distributions for variables related to the MACCS and COSYMA late health effects models. This volume contains appendices that include (1) a summary of the MACCS and COSYMA consequence codes, (2) the elicitation questionnaires and case structures, (3) the rationales and results for the expert panel on late health effects, (4) short biographies of the experts, and (5) the aggregated results of their responses.

  10. Contributions of each isotope in some fluids on neutronic performance in a fusion-fission hybrid reactor: a Monte Carlo method

    NASA Astrophysics Data System (ADS)

    Günay, M.; Şarer, B.; Kasap, H.

    2014-08-01

    In the present investigation, a fusion-fission hybrid reactor system was designed using 9Cr2WVTa ferritic steel as the structural material and molten salt-heavy metal mixtures of 95-99% Li20Sn80 with 1-5% SFG-Pu, SFG-PuF4, or SFG-PuO2 as fluids. The fluids were used in the liquid first wall, blanket and shield zones of the fusion-fission hybrid reactor system. A 3 cm wide beryllium zone was placed between the liquid first wall and the blanket for neutron multiplication. The contributions of each isotope in the fluids to the nuclear parameters of the fusion-fission hybrid reactor, such as the tritium breeding ratio, energy multiplication factor, and heat deposition rate, were computed in the liquid first wall, blanket and shield zones. Three-dimensional analyses were performed using the Monte Carlo code MCNPX-2.7.0 and the nuclear data library ENDF/B-VII.0.

  11. Probabilistic accident consequence uncertainty analysis -- Uncertainty assessment for internal dosimetry. Volume 2: Appendices

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Goossens, L.H.J.; Kraan, B.C.P.; Cooke, R.M.

    1998-04-01

    The development of two new probabilistic accident consequence codes, MACCS and COSYMA, was completed in 1990. These codes estimate the consequences of accidental releases of radiological material from hypothesized accidents at nuclear installations. In 1991, the US Nuclear Regulatory Commission and the Commission of the European Communities began cosponsoring a joint uncertainty analysis of the two codes. The ultimate objective of this joint effort was to systematically develop credible and traceable uncertainty distributions for the respective code input variables. A formal expert judgment elicitation and evaluation process was identified as the best technology available for developing a library of uncertainty distributions for these consequence parameters. This report focuses on the results of the study to develop distributions for variables related to the MACCS and COSYMA internal dosimetry models. This volume contains appendices that include (1) a summary of the MACCS and COSYMA consequence codes, (2) the elicitation questionnaires and case structures, (3) the rationales and results for the panel on internal dosimetry, (4) short biographies of the experts, and (5) the aggregated results of their responses.

  12. Nuclear Computerized Library for Assessing Reactor Reliability (NUCLARR). Version 3.5, Quick Reference Guide

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gilbert, B.G.; Richards, R.E.; Reece, W.J.

    1992-10-01

    This Reference Guide contains instructions on how to install and use Version 3.5 of the NRC-sponsored Nuclear Computerized Library for Assessing Reactor Reliability (NUCLARR). The NUCLARR data management system is contained in compressed files on the floppy diskettes that accompany this Reference Guide. NUCLARR is comprised of hardware component failure data (HCFD) and human error probability (HEP) data, both of which are available via a user-friendly, menu-driven retrieval system. The data may be saved to a file in a format compatible with IRRAS 3.0 and commercially available statistical packages, or used to formulate log-plots and reports of data retrieval and aggregation findings.

  13. Nuclear Computerized Library for Assessing Reactor Reliability (NUCLARR)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gilbert, B.G.; Richards, R.E.; Reece, W.J.

    1992-10-01

    This Reference Guide contains instructions on how to install and use Version 3.5 of the NRC-sponsored Nuclear Computerized Library for Assessing Reactor Reliability (NUCLARR). The NUCLARR data management system is contained in compressed files on the floppy diskettes that accompany this Reference Guide. NUCLARR is comprised of hardware component failure data (HCFD) and human error probability (HEP) data, both of which are available via a user-friendly, menu-driven retrieval system. The data may be saved to a file in a format compatible with IRRAS 3.0 and commercially available statistical packages, or used to formulate log-plots and reports of data retrieval and aggregation findings.

  14. Use of integral experiments in support to the validation of JEFF-3.2 nuclear data evaluation

    NASA Astrophysics Data System (ADS)

    Leclaire, Nicolas; Cochet, Bertrand; Jinaphanh, Alexis; Haeck, Wim

    2017-09-01

    For many years now, IRSN has developed its own Monte Carlo continuous-energy capability, which allows testing various nuclear data libraries. To that end, a validation database of 1136 experiments was built from cases used for the validation of the APOLLO2-MORET 5 multigroup route of the CRISTAL V2.0 package. In this paper, the keff values obtained for more than 200 benchmarks using the JEFF-3.1.1 and JEFF-3.2 libraries are compared with the benchmark keff values, and the main discrepancies are analyzed with respect to the neutron spectrum. Special attention is paid to benchmarks whose results changed significantly between the two JEFF-3 versions.

  15. Neutron radiation damage studies in the structural materials of a 500 MWe fast breeder reactor using DPA cross-sections from ENDF/B-VII.1

    NASA Astrophysics Data System (ADS)

    Saha, Uttiyoarnab; Devan, K.; Bachchan, Abhitab; Pandikumar, G.; Ganesan, S.

    2018-04-01

    The radiation damage in the structural materials of a 500 MWe Indian prototype fast breeder reactor (PFBR) is re-assessed by computing neutron displacement per atom (dpa) cross-sections from ENDF/B-VII.1, the recent nuclear data library evaluated by the USA, in which the basic nuclear data were revised on the basis of state-of-the-art neutron cross-section experiments, nuclear-model-based predictions and modern data evaluation techniques. An indigenous computer code, Computation of Radiation Damage (CRaD), has been developed at our centre to compute primary knock-on atom (PKA) spectra and displacement cross-sections of materials, both point-wise and in any chosen group structure, from the evaluated nuclear data libraries. The new radiation damage model, athermal recombination-corrected displacement per atom (arc-dpa), developed on the basis of molecular dynamics simulations, is also incorporated in our study. This work is the result of our earlier initiatives to overcome some of the limitations experienced while using codes like RECOIL, SPECTER and NJOY 2016 to estimate radiation damage. The agreement of CRaD results with other codes and with the ASTM standard for the Fe dpa cross-section is found to be good. The present estimate of total dpa in the D-9 steel of the PFBR necessitates renormalisation of the experimental correlations between dpa and radiation damage to ensure consistency of damage prediction with the ENDF/B-VII.1 library.

  16. Principal component analysis as a tool for library design: a case study investigating natural products, brand-name drugs, natural product-like libraries, and drug-like libraries.

    PubMed

    Wenderski, Todd A; Stratton, Christopher F; Bauer, Renato A; Kopp, Felix; Tan, Derek S

    2015-01-01

    Principal component analysis (PCA) is a useful tool in the design and planning of chemical libraries. PCA can be used to reveal differences in structural and physicochemical parameters between various classes of compounds by displaying them in a convenient graphical format. Herein, we demonstrate the use of PCA to gain insight into structural features that differentiate natural products, synthetic drugs, natural product-like libraries, and drug-like libraries, and show how the results can be used to guide library design.
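    A minimal sketch of the kind of PCA workflow described here, assuming a toy matrix of invented physicochemical descriptors (molecular weight, cLogP, fraction sp3) rather than the authors' actual dataset:

```python
import numpy as np

# Rows = compounds, columns = physicochemical descriptors
# (MW, cLogP, fraction sp3). All values are invented for illustration.
X = np.array([
    [300.0, 2.1, 0.25],
    [450.0, 3.8, 0.10],
    [520.0, 1.2, 0.60],
    [280.0, 0.8, 0.55],
])

# Standardize each descriptor so scale differences do not dominate.
Z = (X - X.mean(axis=0)) / X.std(axis=0)

# Principal components are the eigenvectors of the covariance matrix.
cov = np.cov(Z, rowvar=False)
eigvals, eigvecs = np.linalg.eigh(cov)
order = np.argsort(eigvals)[::-1]      # sort by explained variance
scores = Z @ eigvecs[:, order[:2]]     # project onto the first two PCs
```

    Plotting the two score columns for each compound class gives the kind of graphical comparison the abstract describes.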

  17. Principal Component Analysis as a Tool for Library Design: A Case Study Investigating Natural Products, Brand-Name Drugs, Natural Product-Like Libraries, and Drug-Like Libraries

    PubMed Central

    Wenderski, Todd A.; Stratton, Christopher F.; Bauer, Renato A.; Kopp, Felix; Tan, Derek S.

    2015-01-01

    Principal component analysis (PCA) is a useful tool in the design and planning of chemical libraries. PCA can be used to reveal differences in structural and physicochemical parameters between various classes of compounds by displaying them in a convenient graphical format. Herein, we demonstrate the use of PCA to gain insight into structural features that differentiate natural products, synthetic drugs, natural product-like libraries, and drug-like libraries, and show how the results can be used to guide library design. PMID:25618349

  18. Impact of Americium-241 (n,γ) Branching Ratio on SFR Core Reactivity and Spent Fuel Characteristics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hiruta, Hikaru; Youinou, Gilles J.; Dixon, Brent W.

    An accurate prediction of core physics and fuel cycle parameters depends largely on the level of detail and accuracy of the nuclear data used in the actual calculations. 241Am is a major gateway nuclide to most of the minor actinides and is thus an important nuclide for core physics and fuel-cycle calculations. The 241Am(n,γ) branching ratio (BR) is in fact energy dependent (see Fig. 1); it is therefore necessary to take the spectrum effect into account when calculating the average BR for full-core depletion calculations. Moreover, the accuracy of the BR used in the depletion calculations can significantly influence the predicted core physics performance and post-irradiation fuel compositions. The BR of 241Am(n,γ) in the ENDF/B-VII.0 library is relatively small and flat in the thermal energy range, increases gradually in the intermediate energy range, and becomes larger still in the fast energy range. This indicates that the properly collapsed BR for fast reactors can differ significantly from that for thermal reactors. The evaluated BRs also differ from one evaluation to another. As seen in Table I, the average BRs for several evaluated libraries, calculated with a fast spectrum, are similar but show some differences. Most currently available depletion codes use a single pre-determined BR value for each library; ideally, however, it should be determined on the fly, like the one-group cross sections. These issues provide a strong incentive to investigate the effect of different 241Am(n,γ) BRs on core and spent fuel parameters. This paper investigates the impact of the 241Am(n,γ) BR on the results of SFR full-core-based fuel-cycle calculations. The analysis is performed by gradually increasing the value of the BR from 0.15 to 0.25 and studying its impact on the core reactivity and on the characteristics of SFR spent fuels over extended storage times (~10,000 years).
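    The spectrum-collapsed branching ratio the abstract refers to can be sketched as a capture-rate-weighted average of the energy-dependent BR; the group fluxes, cross sections and group-wise BR values below are invented for illustration:

```python
import numpy as np

# Three-group toy data (thermal, intermediate, fast); all values invented.
flux      = np.array([1.0, 5.0, 2.0])     # group fluxes (arbitrary units)
sigma_cap = np.array([600.0, 50.0, 1.5])  # 241Am capture sigma per group (barn)
br        = np.array([0.10, 0.15, 0.22])  # energy-dependent BR per group

# One-group BR: weight each group's BR by its capture rate.
capture_rate = flux * sigma_cap
br_avg = (br * capture_rate).sum() / capture_rate.sum()
```

    Because the thermal capture rate dominates here, the collapsed value sits near the thermal BR; a harder spectrum shifts the weights toward the larger fast-range BR, which is the effect motivating the study.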

  19. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Koning, A.J.; Bersillon, O.; Forrest, R. A.

    The status of the Joint Evaluated Fission and Fusion file (JEFF) is described. The next version of the library, JEFF-3.1, comprises a significant update of actinide evaluations, evaluations emerging from European nuclear data projects, the activation library JEFF-3/A, the decay data and fission yield library, and fusion-related data files from the EFF project. The revisions were motivated by the availability of new measurements, modelling capabilities, or trends from integral experiments. Various pre-release validation efforts are underway, mainly for criticality and shielding of thermal and fast systems. The JEFF-3.1 library is expected to provide improved performance with respect to previous releases for a variety of scientific and industrial applications.

  20. SCALE Code System

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jessee, Matthew Anderson

    The SCALE Code System is a widely used modeling and simulation suite for nuclear safety analysis and design that is developed, maintained, tested, and managed by the Reactor and Nuclear Systems Division (RNSD) of Oak Ridge National Laboratory (ORNL). SCALE provides a comprehensive, verified and validated, user-friendly tool set for criticality safety, reactor and lattice physics, radiation shielding, spent fuel and radioactive source term characterization, and sensitivity and uncertainty analysis. Since 1980, regulators, licensees, and research institutions around the world have used SCALE for safety analysis and design. SCALE provides an integrated framework with dozens of computational modules, including three deterministic and three Monte Carlo radiation transport solvers that are selected based on the desired solution strategy. SCALE includes current nuclear data libraries and problem-dependent processing tools for continuous-energy (CE) and multigroup (MG) neutronics and coupled neutron-gamma calculations, as well as activation, depletion, and decay calculations. SCALE includes unique capabilities for automated variance reduction for shielding calculations, as well as sensitivity and uncertainty analysis. SCALE's graphical user interfaces assist with accurate system modeling, visualization of nuclear data, and convenient access to desired results.
    SCALE 6.2 provides many new capabilities and significant improvements of existing features. New capabilities include:
    • ENDF/B-VII.1 nuclear data libraries (CE and MG) with enhanced group structures,
    • Neutron covariance data based on ENDF/B-VII.1 and supplemented with ORNL data,
    • Covariance data for fission product yields and decay constants,
    • Stochastic uncertainty and correlation quantification for any SCALE sequence with Sampler,
    • Parallel calculations with KENO,
    • Problem-dependent temperature corrections for CE calculations,
    • CE shielding and criticality accident alarm system analysis with MAVRIC,
    • CE depletion with TRITON (T5-DEPL/T6-DEPL),
    • CE sensitivity/uncertainty analysis with TSUNAMI-3D,
    • Simplified and efficient LWR lattice physics with Polaris,
    • Large-scale detailed spent fuel characterization with ORIGAMI and ORIGAMI Automator,
    • Advanced fission source convergence acceleration capabilities with Sourcerer,
    • Nuclear data library generation with AMPX, and
    • Integrated user interface with Fulcrum.
    Enhanced capabilities include:
    • Accurate and efficient CE Monte Carlo methods for eigenvalue and fixed-source calculations,
    • Improved MG resonance self-shielding methodologies and data,
    • Resonance self-shielding with modernized and efficient XSProc integrated into most sequences,
    • Accelerated calculations with TRITON/NEWT (generally 4x faster than SCALE 6.1),
    • Spent fuel characterization with 1470 new reactor-specific libraries for ORIGEN,
    • Modernization of ORIGEN (Chebyshev Rational Approximation Method [CRAM] solver, API for high-performance depletion, new keyword input format),
    • Extension of the maximum mixture number from the previous limit of 2147 to ~2 billion,
    • Nuclear data formats enabling the use of more than 999 energy groups,
    • Updated standard composition library to provide more accurate use of natural abundances, and
    • Numerous other enhancements for improved usability and stability.

  1. ADVANCED WAVEFORM SIMULATION FOR SEISMIC MONITORING EVENTS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Helmberger, Donald V.; Tromp, Jeroen; Rodgers, Arthur J.

    Earthquake source parameters underpin several aspects of nuclear explosion monitoring: calibration of moment magnitudes (including coda magnitudes) and magnitude and distance amplitude corrections (MDAC); source depths; discrimination by isotropic moment tensor components; and waveform modeling for structure (including waveform tomography). This project seeks to improve methods for, and broaden the applicability of, estimating source parameters from broadband waveforms using the Cut-and-Paste (CAP) methodology. The CAP method uses a library of Green's functions for a one-dimensional (1D, depth-varying) seismic velocity model. The method separates the main arrivals of the regional waveform into five windows: Pnl (vertical and radial components), Rayleigh (vertical and radial components) and Love (transverse component). Source parameters are estimated by a grid search over strike, dip, rake and depth, and the seismic moment (or, equivalently, the moment magnitude MW) is adjusted to fit the amplitudes. Key to the CAP method is allowing the synthetic seismograms to shift in time relative to the data in order to account for path-propagation errors (delays) in the 1D seismic velocity model used to compute the Green's functions. The CAP method has been shown to improve estimates of source parameters, especially when delay and amplitude biases are calibrated using high signal-to-noise data from moderate earthquakes (CAP+).
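    The time-shift step that is key to CAP can be illustrated with a toy cross-correlation: a synthetic wavelet arriving 0.5 s early is aligned to the "data" by picking the lag that maximizes the correlation. The waveform is an invented Gaussian-windowed sine pulse, not a real Green's function.

```python
import numpy as np

t = np.linspace(0.0, 10.0, 500)

def wavelet(t0):
    # Gaussian-windowed sine pulse centered at time t0 (illustrative)
    return np.exp(-(t - t0) ** 2) * np.sin(8.0 * (t - t0))

data  = wavelet(4.0)   # "observed" arrival
synth = wavelet(3.5)   # synthetic computed with a biased 1D model

# Best alignment: the lag maximizing the cross-correlation gives the
# time shift to apply to the synthetic before measuring amplitude misfit.
corr = np.correlate(data, synth, mode="full")
lag = int(np.argmax(corr)) - (len(synth) - 1)
delay = lag * (t[1] - t[0])   # ≈ +0.5 s path delay
```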

  2. Comparative study of Monte Carlo particle transport code PHITS and nuclear data processing code NJOY for recoil cross section spectra under neutron irradiation

    NASA Astrophysics Data System (ADS)

    Iwamoto, Yosuke; Ogawa, Tatsuhiko

    2017-04-01

    Because primary knock-on atoms (PKAs) create point defects and clusters in materials that are irradiated with neutrons, it is important to validate the calculations of recoil cross section spectra that are used to estimate radiation damage in materials. Here, the recoil cross section spectra of fission- and fusion-relevant materials were calculated using the Event Generator Mode (EGM) of the Particle and Heavy Ion Transport code System (PHITS) and also using the data processing code NJOY2012 with the nuclear data libraries TENDL-2015, ENDF/B-VII.1, and JEFF-3.2. The heating number, which is the integral of the recoil cross section spectra, was also calculated using PHITS-EGM and compared with data extracted from the ACE files of TENDL-2015, ENDF/B-VII.1, and JENDL-4.0. In general, only a small difference was found between the PKA spectra of PHITS + TENDL-2015 and NJOY + TENDL-2015. From analyzing the recoil cross section spectra extracted from the nuclear data libraries using NJOY2012, we found that the recoil cross section spectra were incorrect for 72Ge, 75As, 89Y, and 109Ag in the ENDF/B-VII.1 library, and for 90Zr and 55Mn in the JEFF-3.2 library. From analyzing the heating number, we found that the data extracted from the ACE file of TENDL-2015 for all nuclides were problematic in the neutron capture region because of incorrect data regarding the emitted gamma energy. However, PHITS + TENDL-2015 can calculate PKA spectra and heating numbers correctly.
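    As a rough numerical sketch of the heating-number integral described above, assuming (as an illustration only) that the heating number is the recoil-energy-weighted integral of the PKA spectrum; the spectrum itself is invented:

```python
import numpy as np

E = np.linspace(1.0e3, 1.0e6, 1000)   # recoil energy grid (eV)
dsigma_dE = 1.0e-24 / E               # toy PKA spectrum (per eV)

# Energy-weighted integral of the recoil spectrum (kerma-like quantity).
# For this 1/E toy spectrum the integrand is constant, so the result is
# simply 1e-24 * (Emax - Emin).
heating = np.trapz(E * dsigma_dE, E)
```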

  3. President Clinton's Statement on the Comprehensive Nuclear Test Ban Treaty

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Clinton, Bill

    This is video footage of President Clinton delivering a statement to the press on signing the Comprehensive Nuclear Test Ban Treaty and answering press pool questions before departing Kansas City, Missouri. This footage is official public record produced by the White House Television (WHTV) crew, provided by the Clinton Presidential Library.

  4. Information Scanning and Processing at the Nuclear Safety Information Center.

    ERIC Educational Resources Information Center

    Parks, Celia; Julian, Carol

    This report is a detailed manual of the information specialist's duties at the Nuclear Safety Information Center. Information specialists scan the literature for documents to be reviewed, procure the documents (books, journal articles, reports, etc.), keep the document location records, and return the documents to the plant library or other…

  5. 76 FR 43356 - Evaluations of Explosions Postulated To Occur at Nearby Facilities and on Transportation Routes...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-07-20

    ... the ``Regulatory Guides'' collection of the NRC's Library at http://www.nrc.gov/reading-rm/doc... NUCLEAR REGULATORY COMMISSION [NRC-2011-0152] Evaluations of Explosions Postulated To Occur at..., ``Evaluations of Explosions Postulated to Occur at Nearby Facilities and on Transportation Routes Near Nuclear...

  6. Inventory Uncertainty Quantification using TENDL Covariance Data in Fispact-II

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Eastwood, J.W.; Morgan, J.G.; Sublet, J.-Ch., E-mail: jean-christophe.sublet@ccfe.ac.uk

    2015-01-15

    The new inventory code FISPACT-II provides predictions of inventory and radiological quantities, and their uncertainties, using nuclear data covariance information. Central to the method is a novel fast pathways-search algorithm using directed graphs. The pathways output provides (1) an aid to identifying important reactions, (2) fast estimates of uncertainties, and (3) reduced models that retain the important nuclides and reactions for use in the code's Monte Carlo sensitivity analysis module. Described are the methods being implemented to improve uncertainty prediction, quantification and propagation using the covariance data contained in the recent nuclear data libraries. In the TENDL library, above the upper energy of the resolved resonance range, a Monte Carlo method is used in which the covariance data come from uncertainties of the nuclear model calculations. The nuclear data files are read directly by FISPACT-II without any further intermediate processing. Variance and covariance data are processed and used by FISPACT-II to compute uncertainties in collapsed cross sections, and these are in turn used to predict uncertainties in inventories and all derived radiological data.
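    The directed-graph pathways idea can be sketched as a depth-first enumeration of production routes whose cumulative fraction stays above a cut-off. The graph, fractions and nuclide names below are invented and bear no relation to FISPACT-II's actual data structures:

```python
# Toy nuclide graph: edges carry illustrative per-step production
# fractions (all numbers invented).
graph = {
    "Fe56": [("Fe57", 0.9), ("Mn56", 0.1)],
    "Fe57": [("Fe58", 0.8), ("Co57", 0.2)],
    "Mn56": [("Fe58", 0.5)],
}

def pathways(node, target, frac=1.0, path=None, cutoff=0.05):
    """Depth-first enumeration of routes node -> target whose cumulative
    fraction stays at or above the cut-off."""
    path = (path or []) + [node]
    if node == target:
        return [(path, frac)]
    found = []
    for child, f in graph.get(node, []):
        if frac * f >= cutoff:
            found += pathways(child, target, frac * f, path, cutoff)
    return found

routes = pathways("Fe56", "Fe58")
```

    The dominant route (Fe56 → Fe57 → Fe58, fraction 0.72 here) is the kind of "important pathway" a reduced model would retain for the sensitivity analysis step.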

  7. A Multi-User Microcomputer System for Small Libraries.

    ERIC Educational Resources Information Center

    Leggate, Peter

    1988-01-01

    Describes the development of Bookshelf, a multi-user microcomputer system for small libraries that uses an integrated software package. The discussion covers the design parameters of the package, which were based on a survey of seven small libraries, and some characteristics of the software. (three notes with references) (CLB)

  8. IDENTIFYING COMPOUNDS USING SOURCE CID ON AN ORTHOGONAL ACCELERATION TIME-OF-FLIGHT MASS SPECTROMETER

    EPA Science Inventory

    Exact mass libraries of ESI and APCI mass spectra are not commercially available. In-house libraries are dependent on CID parameters and are instrument-specific. The ability to identify compounds without reliance on mass spectral libraries is therefore more crucial for liquid sam...

  9. Macro and Microenvironments at the British Library.

    ERIC Educational Resources Information Center

    Shenton, Helen

    This paper describes the storage of the 12 million items that have just been moved into the new British Library building. The specifications for the storage and environmental conditions for different types of library and archive material are explained. The varying environmental parameters for storage areas and public areas, including reading rooms…

  10. MT71x: Multi-Temperature Library Based on ENDF/B-VII.1

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Conlin, Jeremy Lloyd; Parsons, Donald Kent; Gray, Mark Girard

    The Nuclear Data Team has released a multi-temperature transport library, MT71x, based upon ENDF/B-VII.1 with a few modifications, as well as additional evaluations, for a total of 427 isotope tables. The library was processed using NJOY2012.39 into 23 temperatures. MT71x consists of two sub-libraries: MT71xMG for multigroup-energy-representation data and MT71xCE for continuous-energy-representation data. These sub-libraries are suitable for deterministic transport and Monte Carlo transport applications, respectively. The SZAs used are the same for the two sub-libraries; that is, the same SZA can be used for both. This makes comparisons between the two libraries, and between deterministic and Monte Carlo codes, straightforward. Both the multigroup energy and continuous energy libraries were verified and validated with our checking codes checkmg and checkace (multigroup and continuous energy, respectively). An expanded suite of tests was then used for additional verification, and the library was finally validated against an extensive suite of critical benchmark models. We feel that this library is suitable for all calculations and is particularly useful for calculations sensitive to temperature effects.

  11. Analysis of dosimetry from the H.B. Robinson unit 2 pressure vessel benchmark using RAPTOR-M3G and ALPAN

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fischer, G.A.

    2011-07-01

    Document available in abstract form only; full text of document follows: The dosimetry from the H. B. Robinson Unit 2 Pressure Vessel Benchmark is analyzed with a suite of Westinghouse-developed codes and data libraries. The radiation transport from the reactor core to the surveillance capsule and ex-vessel locations is performed by RAPTOR-M3G, a parallel deterministic radiation transport code that calculates high-resolution neutron flux information in three dimensions. The cross-section library used in this analysis is the ALPAN library, an Evaluated Nuclear Data File (ENDF)/B-VII.0-based library designed for reactor dosimetry and fluence analysis applications. Dosimetry is evaluated with the industry-standard SNLRML reactor dosimetry cross-section data library. (authors)

  12. ENDF-6 Formats Manual Data Formats and Procedures for the Evaluated Nuclear Data File ENDF/B-VI and ENDF/B-VII

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Herman, M.; Members of the Cross Sections Evaluation Working Group

    2009-06-01

    In December 2006, the Cross Section Evaluation Working Group (CSEWG) of the United States released the new ENDF/B-VII.0 library. This represented a considerable achievement, as it was the first major release since 1990, when ENDF/B-VI was made publicly available. The two libraries were released in the same format, ENDF-6, which was originally developed for the ENDF/B-VI library. In the early stage of work on the seventh generation of the library, CSEWG made the important decision to use the same format. This decision was adopted even though it was argued that it would be timely to modernize the formats, and several interesting ideas were proposed. After careful deliberation, CSEWG concluded that an actual implementation would require considerable resources to modify the processing codes and to guarantee the high quality of the files processed by these codes. In view of this, the idea of format modernization was postponed and the ENDF-6 format was adopted for the new ENDF/B-VII library. In several other areas related to ENDF we did our best to move beyond established tradition and achieve maximum modernization. Thus, the 'Big Paper' on ENDF/B-VII.0 was published, also in December 2006, as a Special Issue of Nuclear Data Sheets 107 (2006) 2931-3060. The new web retrieval and plotting system for ENDF-6 formatted data, Sigma, was developed by the NNDC and released in 2007. An extensive paper on the advanced tool for nuclear reaction data evaluation, EMPIRE, was published in 2007. This effort was complemented by the release of an updated set of ENDF checking codes in 2009. As the final item on this list, a major revision of the ENDF-6 Formats Manual was made. This work started in 2006 and came to fruition in 2009, as documented in the present report.

  13. Impact of Nuclear Data Uncertainties on Advanced Fuel Cycles and their Irradiated Fuel - a Comparison between Libraries

    NASA Astrophysics Data System (ADS)

    Díez, C. J.; Cabellos, O.; Martínez, J. S.

    2014-04-01

    The uncertainties in the isotopic composition throughout burnup due to nuclear data uncertainties are analysed. The different sources of uncertainty (decay data, fission yields and cross sections) are propagated individually, and their effects assessed. Two applications are studied: EFIT (an ADS-like reactor) and the ESFR (a sodium fast reactor). The impact of the cross-section uncertainties provided by the EAF-2010, SCALE6.1 and COMMARA-2.0 libraries is compared. These uncertainty quantification (UQ) studies have been carried out with a Monte Carlo sampling approach implemented in the depletion/activation code ACAB. The implementation has been improved to handle depletion/activation problems with variations of the neutron spectrum.
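    In the spirit of the Monte Carlo sampling approach described, the sketch below samples an uncertain one-group cross section and propagates each sample through the single-nuclide depletion solution N(t)/N0 = exp(-σφt). All numbers (cross section, 5% uncertainty, flux, time) are illustrative assumptions, not values from ACAB or the cited libraries.

```python
import math
import random

random.seed(0)
sigma0, rel_unc = 2.7e-24, 0.05   # cm^2 and 5% relative uncertainty (assumed)
phi, t = 3.0e14, 3.0e7            # flux (n/cm^2/s) and time (~1 year), assumed

# Sample the cross section and propagate each sample through the
# analytic depletion solution to build a distribution of N(t)/N0.
samples = []
for _ in range(1000):
    sigma = random.gauss(sigma0, rel_unc * sigma0)
    samples.append(math.exp(-sigma * phi * t))

mean = sum(samples) / len(samples)
var = sum((x - mean) ** 2 for x in samples) / (len(samples) - 1)
std = math.sqrt(var)   # uncertainty in the remaining fraction
```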

  14. Methods and Issues for the Combined Use of Integral Experiments and Covariance Data: Results of a NEA International Collaborative Study

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Palmiotti, Giuseppe; Salvatores, Massimo

    2014-04-01

    The Working Party on International Nuclear Data Evaluation Cooperation (WPEC) of the Nuclear Science Committee under the Nuclear Energy Agency (NEA/OECD) established a subgroup ("Subgroup 33") in 2009 on "Methods and issues for the combined use of integral experiments and covariance data." The first stage was devoted to describing the different adjustment methodologies and assessing their merits; a detailed document on this first stage has been issued. Nine leading organizations, often with long and recognized expertise in the field, contributed: ANL, CEA, INL, IPPE, JAEA, JSI, NRG, IRSN and ORNL. In the second stage, a practical benchmark exercise was defined in order to test the reliability of the nuclear data adjustment methodology. A comparison of the results obtained by the participants and the major lessons learned in the exercise are discussed in the present paper, which summarizes individual contributions that often include several original developments not reported separately. The paper provides an analysis of the most important results of the adjustment of the main nuclear data of 11 major isotopes in a 33-group energy structure. The benchmark exercise was based on a set of 20 well-defined integral parameters from 7 fast-assembly experiments. The exercise showed that, when a common shared set of integral experiments is used with different starting evaluated libraries and/or different covariance matrices, there is good convergence of the adjustment trends. Moreover, a significant reduction of the original uncertainties is often observed. Using the a posteriori covariance data, there is a strong reduction of the uncertainties of integral parameters for reference reactor designs, mainly due to the new correlations in the a posteriori covariance matrix. Furthermore, criteria have been proposed and applied to verify the consistency of the differential and integral data used in the adjustment. Finally, recommendations are given for an appropriate use of sensitivity analysis methods, and indications for future work are provided.

  15. Multicapillary Flow Reactor: Synthesis of 1,2,5-Thiadiazepane 1,1-Dioxide Library Utilizing One-Pot Elimination and Inter-/Intramolecular Double aza-Michael Addition Via Microwave-Assisted, Continuous-Flow Organic Synthesis (MACOS)

    PubMed Central

    Ullah, Farman; Zang, Qin; Javed, Salim; Zhou, Aihua; Knudtson, Christopher A.; Bi, Danse; Hanson, Paul R.; Organ, Michael G.

    2013-01-01

    A microwave-assisted, continuous-flow organic synthesis (MACOS) protocol for the synthesis of a functionalized 1,2,5-thiadiazepane 1,1-dioxide library, utilizing a one-pot elimination and inter-/intramolecular double aza-Michael addition strategy, is reported. The optimized MACOS protocol was utilized for scale-out and further extended to library production using a multicapillary flow reactor. A 50-member library of 1,2,5-thiadiazepane 1,1-dioxides was prepared on a 100- to 300-mg scale with overall yields between 50 and 80% and over 90% purity, as determined by proton nuclear magnetic resonance (1H NMR) spectroscopy. PMID:24244871

  16. Retrospective Reconstruction of Radiation Doses of Chernobyl Liquidators by Electron Paramagnetic Resonance

    DTIC Science & Technology

    1997-12-01

    Armed Forces Radiobiology Research Institute. Retrospective Reconstruction of Radiation Doses of Chernobyl Liquidators by Electron Paramagnetic Resonance. Authored by the Scientific Center of Radiation Medicine, Academy of Medical...libraries associated with the U.S. Government's Depository Library System. Preface: On April 26, 1986, Reactor #4 at the Chernobyl Nuclear Power Plant near

  17. Measurement of leakage neutron spectra from graphite cylinders irradiated with D-T neutrons for validation of evaluated nuclear data.

    PubMed

    Luo, F; Han, R; Chen, Z; Nie, Y; Shi, F; Zhang, S; Lin, W; Ren, P; Tian, G; Sun, Q; Gou, B; Ruan, X; Ren, J; Ye, M

    2016-10-01

    A benchmark experiment for the validation of evaluated graphite data from nuclear data libraries was conducted with 14 MeV neutrons irradiating graphite cylinder samples. The experiments were performed using the benchmark experimental facility at the China Institute of Atomic Energy (CIAE). The leakage neutron spectra from the surface of a graphite cylinder (Φ13 cm × 20 cm) at 60° and 120°, and of a graphite cylinder (Φ13 cm × 2 cm) at 60°, were measured by the time-of-flight (TOF) method. The results were compared with calculations made by the Monte Carlo neutron transport code MCNP-4C with the ENDF/B-VII.1, CENDL-3.1 and JENDL-4.0 libraries. The results for the 20 cm-thick sample revealed that the calculations with the CENDL-3.1 and JENDL-4.0 libraries agreed well with the experiments over the whole energy region; however, a large discrepancy of approximately 40% was observed below 3 MeV with the ENDF/B-VII.1 library. For the 2 cm-thick sample, the calculated results from all three libraries failed to reproduce the experimental data in the energy range of 5-7 MeV. The graphite data in CENDL-3.1 were verified for the first time and proved to be reliable.

  18. Library fingerprints: a novel approach to the screening of virtual libraries.

    PubMed

    Klon, Anthony E; Diller, David J

    2007-01-01

    We propose a novel method to prioritize libraries for combinatorial synthesis and high-throughput screening that assesses the viability of a particular library on the basis of the aggregate physicochemical properties of its compounds, using a naïve Bayesian classifier. This approach prioritizes collections of related compounds according to the aggregate values of their physicochemical parameters, in contrast to single-compound screening. The method is also shown to be useful in screening existing noncombinatorial libraries when the compounds in these libraries have been previously clustered according to their molecular graphs. We show that the method is comparable or superior to single-compound virtual screening of combinatorial and noncombinatorial libraries, and superior to pairwise Tanimoto similarity searching of a collection of combinatorial libraries.
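    A toy naïve Bayes score for a library, in the spirit of the approach: a library is summarized by binned physicochemical properties, and each bin contributes an independent log-likelihood ratio for "active-like" versus "inactive-like" libraries. All probabilities, bin names and profiles below are invented:

```python
import math

# P(bin | active) and P(bin | inactive) for three molecular-weight bins
# (hypothetical training statistics).
p_active   = {"MW<300": 0.2, "300-450": 0.6, ">450": 0.2}
p_inactive = {"MW<300": 0.5, "300-450": 0.3, ">450": 0.2}

def library_score(bin_fractions):
    """Aggregate log-likelihood ratio over the library's bin profile:
    positive means the library looks more 'active-like'."""
    return sum(frac * (math.log(p_active[b]) - math.log(p_inactive[b]))
               for b, frac in bin_fractions.items())

lead_like = {"MW<300": 0.1, "300-450": 0.8, ">450": 0.1}
score = library_score(lead_like)
```

    The score is computed once per library from its aggregate profile, which is what distinguishes this scheme from scoring each compound individually.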

  19. Fluctuating hyperfine interactions: an updated computational implementation

    NASA Astrophysics Data System (ADS)

    Zacate, M. O.; Evenson, W. E.

    2015-04-01

    The stochastic hyperfine interactions modeling library (SHIML) is a set of routines written in the C programming language designed to assist in the analysis of stochastic models of hyperfine interactions. The routines read a text-file description of the model, set up the Blume matrix, upon which the evolution operator of the quantum mechanical system depends, and calculate the eigenvalues and eigenvectors of the Blume matrix, from which theoretical spectra of experimental techniques can be calculated. The original version of SHIML constructs Blume matrices applicable for methods that measure hyperfine interactions with only a single nuclear spin state. In this paper, we report an extension of the library to provide support for methods such as Mössbauer spectroscopy and nuclear resonant scattering of synchrotron radiation, which are sensitive to interactions with two nuclear spin states. Examples will be presented that illustrate the use of this extension of SHIML to generate Mössbauer spectra for polycrystalline samples under a number of fluctuating hyperfine field models.
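    A schematic of the eigen-decomposition step described above: a small two-state fluctuation matrix combining a hyperfine frequency term and stochastic jump rates is diagonalized, and its eigenvalues give damping rates (real parts) and observed frequencies (imaginary parts). The matrix and the numbers are invented for illustration and are not SHIML's actual Blume-matrix construction.

```python
import numpy as np

w = 50.0   # hyperfine frequency in environment 1 (arbitrary units, assumed)
r = 10.0   # stochastic jump rate between two environments (assumed)

# Two-state fluctuation matrix: diagonal terms carry the +/- hyperfine
# frequency and the loss rate; off-diagonal terms carry the jump rate.
B = np.array([[1j * w - r, r],
              [r, -1j * w - r]])

eigvals, eigvecs = np.linalg.eig(B)
# Real parts -> line broadening (damping); imaginary parts -> the
# frequencies that appear in the computed spectrum.
```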

  20. Neutron Thermal Cross Sections, Westcott Factors, Resonance Integrals, Maxwellian Averaged Cross Sections and Astrophysical Reaction Rates Calculated from the ENDF/B-VII.1, JEFF-3.1.2, JENDL-4.0, ROSFOND-2010, CENDL-3.1 and EAF-2010 Evaluated Data Libraries

    NASA Astrophysics Data System (ADS)

    Pritychenko, B.; Mughabghab, S. F.

    2012-12-01

    We present calculations of neutron thermal cross sections, Westcott factors, resonance integrals, Maxwellian-averaged cross sections and astrophysical reaction rates for 843 ENDF materials using data from the major evaluated nuclear libraries and European activation file. Extensive analysis of newly-evaluated neutron reaction cross sections, neutron covariances, and improvements in data processing techniques motivated us to calculate nuclear industry and neutron physics quantities, produce s-process Maxwellian-averaged cross sections and astrophysical reaction rates, systematically calculate uncertainties, and provide additional insights on currently available neutron-induced reaction data. Nuclear reaction calculations are discussed and new results are presented. Due to space limitations, the present paper contains only calculated Maxwellian-averaged cross sections and their uncertainties. The complete data sets for all results are published in the Brookhaven National Laboratory report.
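    The Maxwellian-averaged cross section mentioned above can be sketched numerically from its standard definition, MACS(kT) = (2/√π) ∫ σ(E) E e^(−E/kT) dE / ∫ E e^(−E/kT) dE. For a pure 1/v absorber, MACS(kT) equals σ evaluated at E = kT, which makes a convenient sanity check; the thermal cross section used is an arbitrary illustrative value.

```python
import math

kT = 30.0e3                    # eV, a typical s-process thermal energy
sigma_th, E_th = 1.0, 0.0253   # barn at thermal energy; assumed 1/v law

def sigma(E):
    # Pure 1/v cross section (illustrative)
    return sigma_th * math.sqrt(E_th / E)

def macs(kT, n=100000, Emax_factor=30.0):
    # Simple rectangle-rule evaluation of the MACS definition.
    dE = Emax_factor * kT / n
    num = den = 0.0
    for i in range(1, n + 1):
        E = i * dE
        w = E * math.exp(-E / kT) * dE
        num += sigma(E) * w
        den += w
    return (2.0 / math.sqrt(math.pi)) * num / den
```

    For real resonant cross sections the integral no longer collapses to σ(kT), which is why the tabulated MACS values in the paper require the full evaluated pointwise data.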

  1. Probabilistic accident consequence uncertainty analysis: Dispersion and deposition uncertainty assessment, appendices A and B

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Harper, F.T.; Young, M.L.; Miller, L.A.

    MACCS and COSYMA, two new probabilistic accident consequence codes completed in 1990, estimate the risks presented by nuclear installations based on postulated frequencies and magnitudes of potential accidents. In 1991, the US Nuclear Regulatory Commission (NRC) and the Commission of the European Communities (CEC) began a joint uncertainty analysis of the two codes. The objective was to develop credible and traceable uncertainty distributions for the input variables of the codes. Expert elicitation, developed independently, was identified as the best available technology for developing a library of uncertainty distributions for the selected consequence parameters. The study was formulated jointly and was limited to the current code models and to physical quantities that could be measured in experiments. To validate the distributions generated for the wet deposition input variables, samples were taken from these distributions and propagated through the wet deposition code model along with the Gaussian plume model (GPM) implemented in the MACCS and COSYMA codes. The resulting distributions closely replicated the aggregated elicited wet deposition distributions. Project teams from the NRC and CEC cooperated successfully to develop and implement a unified process for the elaboration of uncertainty distributions for consequence code input parameters. Formal expert judgment elicitation proved valuable for synthesizing the best available information. Distributions on measurable atmospheric dispersion and deposition parameters were successfully elicited from experts involved in the many phenomenological areas of consequence analysis. This volume is the second of a three-volume document describing the project; it contains two appendices describing the rationales for the dispersion and deposition data, along with short biographies of the 16 experts who participated in the project.

  2. Kinetic Parameter Measurements in the MINERVE Reactor

    NASA Astrophysics Data System (ADS)

    Perret, Grégory; Geslot, Benoit; Gruel, Adrien; Blaise, Patrick; Di-Salvo, Jacques; De Izarra, Grégoire; Jammes, Christian; Hursin, Mathieu; Pautz, Andréas

    2017-01-01

In the framework of an international collaboration, teams from the PSI and CEA research institutes measured the critical decay constant (α0 = β/Λ), the delayed neutron fraction (β) and the generation time (Λ) of the Minerve reactor using the Feynman-α, Power Spectral Density and Rossi-α neutron noise measurement techniques. These measurements contribute to the experimental database of kinetic parameters used to improve nuclear data files and validate modern methods in Monte Carlo codes. Minerve is a zero-power pool reactor composed of a central experimental test lattice surrounded by a large aluminum buffer and four high-enriched driver regions. Measurements were performed in three slightly subcritical configurations (-2 cents to -30 cents) using two high-efficiency 235U fission chambers in the driver regions. The values of α0 and β obtained by the two institutes and with the different techniques are consistent for the configurations considered. A slight increase of the β values with the subcriticality level is observed. Best-estimate values are obtained with the Cross-Power Spectral Density technique at -2 cents: β = 716.9±9.0 pcm, α0 = 79.0±0.6 s-1 and Λ = 90.7±1.4 μs. The kinetic parameters are predicted with MCNP5-v1.6 and TRIPOLI4.9 and the JEFF-3.1/3.1.1 and ENDF/B-VII.1 nuclear data libraries. The predictions for β and α0 overestimate the experimental results by 3-5% and 10-12%, respectively; that for Λ underestimates the experimental result by 6-7%. The discrepancies are suspected to arise from the driven-system nature of Minerve and the location of the detectors in the driver regions, which prevent accounting for the full reactor.
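The three best-estimate values quoted above are internally consistent with the stated relation between the critical decay constant, the delayed neutron fraction, and the generation time. A quick check, in which only the unit conversions are assumed:

```python
# Consistency check of the Minerve best-estimate kinetic parameters:
# the critical decay constant should satisfy alpha0 = beta / Lambda.

beta_pcm = 716.9        # delayed neutron fraction, pcm (1 pcm = 1e-5)
Lambda_us = 90.7        # neutron generation time, microseconds
alpha0_quoted = 79.0    # quoted critical decay constant, 1/s

beta = beta_pcm * 1e-5          # dimensionless
Lambda = Lambda_us * 1e-6       # seconds

alpha0 = beta / Lambda          # 1/s
print(f"alpha0 = {alpha0:.1f} 1/s")   # ~79.0 1/s, matching the quoted value
```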

  3. Development of a New 47-Group Library for the CASL Neutronics Simulators

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kim, Kang Seog; Williams, Mark L; Wiarda, Dorothea

The CASL core simulator MPACT is under development for coupled neutronics and thermal-hydraulics simulation of pressurized light water reactors. The key characteristics of the MPACT code include a subgroup method for resonance self-shielding and a whole-core solver with a 1D/2D synthesis method. The ORNL AMPX/SCALE code packages have been significantly improved to support various intermediate resonance self-shielding approximations, such as the subgroup and embedded self-shielding methods. New 47-group AMPX and MPACT libraries based on ENDF/B-VII.0, whose group structure comes from the HELIOS library, have been generated for the CASL core simulator MPACT. The new 47-group MPACT library includes all nuclear data required for static and transient core simulations. This study discusses a detailed procedure to generate the 47-group AMPX and MPACT libraries and benchmark results for the VERA progression problems.

  4. Measurement and calculation of neutron leakage spectra from slab samples of beryllium, gallium and tungsten irradiated with 14.8 MeV neutrons

    NASA Astrophysics Data System (ADS)

    Nie, Y. B.; Ruan, X. C.; Ren, J.; Zhang, S.; Han, R.; Bao, J.; Huang, H. X.; Ding, Y. Y.; Wu, H. C.; Liu, P.; Zhou, Z. Y.

    2017-09-01

To provide benchmark validation of the nuclear data for gallium (Ga), tungsten (W) and beryllium (Be) in existing modern evaluated nuclear data files, neutron leakage spectra in the range from 0.8 to 15 MeV from slab samples were measured by the time-of-flight technique with a BC501 scintillation detector. The measurements were performed at the China Institute of Atomic Energy (CIAE) using a D-T neutron source. The thicknesses of the slabs were 0.5 to 2.5 mean free paths for 14.8 MeV neutrons, and the measurement angles were chosen to be 60∘ and 120∘. The measured spectra were compared with those calculated by the continuous-energy Monte Carlo transport code MCNP, using data from the CENDL-3.1, ENDF/B-VII.1 and JENDL-4.0 nuclear data files. The comparison between the experimental and calculated results shows that: the results from all three libraries significantly underestimate the cross section in the 10-13 MeV energy range for Ga; for W, the calculated spectra using data from the CENDL-3.1 and JENDL-4.0 libraries show larger discrepancies with the measured ones, especially around 8.5-13.5 MeV; and for Be, all the libraries lead to underestimation below 3 MeV at 120∘.
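The time-of-flight technique mentioned above converts a measured flight time over a known path into neutron energy relativistically. The sketch below illustrates the conversion; the 3 m flight path is an invented example value, not the CIAE geometry:

```python
import math

C = 299_792_458.0        # speed of light, m/s
MN_MEV = 939.56542       # neutron rest mass, MeV/c^2

def tof_to_energy(flight_path_m, tof_s):
    """Relativistic neutron kinetic energy (MeV) from time of flight."""
    beta = flight_path_m / (C * tof_s)
    gamma = 1.0 / math.sqrt(1.0 - beta * beta)
    return (gamma - 1.0) * MN_MEV

def energy_to_tof(flight_path_m, e_mev):
    """Inverse: flight time (s) for a neutron of given kinetic energy."""
    gamma = 1.0 + e_mev / MN_MEV
    beta = math.sqrt(1.0 - 1.0 / (gamma * gamma))
    return flight_path_m / (beta * C)

# Example with a hypothetical 3 m flight path: a 14.8 MeV source neutron
t = energy_to_tof(3.0, 14.8)
print(f"flight time {t:.3e} s -> {tof_to_energy(3.0, t):.1f} MeV")
```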

  5. Parameterizable Library Components for SAW Devices

    NASA Technical Reports Server (NTRS)

    Wilson, William C.; Atkinson, Gary M.

    2006-01-01

To facilitate quick fabrication of Surface Acoustic Wave (SAW) sensors, we have found it necessary to develop a library of parameterizable components. This library is the first module in our strategy towards a design tool that is integrated into existing Electronic Design Automation (EDA) tools. This library is similar to the standard cell libraries found in digital design packages. The library cells allow the user to input design parameters, from which a detailed layout of the SAW component is automatically generated. This paper presents the results of our development of parameterizable cells for an InterDigitated Transducer (IDT), a reflector, a SAW delay line, and both one- and two-port resonators.

  6. Sources of Information on Atomic Energy, International Series of Monographs in Library and Information Science, Volume 2.

    ERIC Educational Resources Information Center

    Anthony, L. J.

    This book provides a comprehensive survey of the principal national and international organizations which are sources of information on atomic and nuclear energy and of the published literature in this field. Organizations in all the major nuclear countries such as the United States, Britain, the Soviet Union, France, and Japan are described, and…

  7. 78 FR 44603 - Byron Nuclear Station, Units 1 and 2, and Braidwood Nuclear Station, Units 1 and 2; Exelon...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-07-24

    ... rule, the participant must file the document using the NRC's online, Web-based submission form. In... form, including the installation of the Web browser plug-in, is available on the NRC's public Web site... 61010, and near Braidwood at the Fossil Ridge (Braidwood) Public Library, 386 W. Kennedy Road, Braidwood...

  8. Re-analysis of HCPB/HCLL Blanket Mock-up Experiments Using Recent Nuclear Data Libraries

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kondo, K., E-mail: keitaro.kondo@kit.edu; Fischer, U.; Klix, A.

    2014-06-15

We have re-analysed two breeding blanket experiments performed previously in the frame of the European fusion programme on mock-ups of the European Helium-Cooled Lithium-Lead (HCLL) and Helium-Cooled Pebble-Bed (HCPB) test blanket modules for ITER. The tritium production rate and the neutron and photon spectra measured in these mock-ups were compared with calculations using the FENDL-3 Starter Library, release 4, and state-of-the-art nuclear data evaluations: JEFF-3.1.2, JENDL-4.0 and ENDF/B-VII.0. The tritium production calculated for the HCPB mock-up underestimates the experimental result by about 10%. The calculation with FENDL-3/SLIB4 gives a tritium production about 2% smaller than the one with FENDL-2.1. The difference is attributed to the slight modification of the total and elastic scattering cross sections of Be. For the HCLL experiment, all libraries reproduce the experimental results well. FENDL-3/SLIB4 gives better results than FENDL-2.1 for both the measured spectra and the tritium production.

  9. Covariance Applications in Criticality Safety, Light Water Reactor Analysis, and Spent Fuel Characterization

    DOE PAGES

    Williams, M. L.; Wiarda, D.; Ilas, G.; ...

    2014-06-15

    Recently, we processed a new covariance data library based on ENDF/B-VII.1 for the SCALE nuclear analysis code system. The multigroup covariance data are discussed here, along with testing and application results for critical benchmark experiments. Moreover, the cross section covariance library, along with covariances for fission product yields and decay data, is used to compute uncertainties in the decay heat produced by a burned reactor fuel assembly.
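Propagating a multigroup covariance library to a response uncertainty is conventionally done with the first-order "sandwich rule", var(R) = S^T C S, where S holds the relative sensitivities of the response to each cross section and C is the relative covariance matrix. A minimal sketch with invented sensitivities and covariances:

```python
import numpy as np

# "Sandwich rule" uncertainty propagation: var(R) = S^T C S. The sensitivity
# vector S and relative covariance matrix C below are illustrative numbers,
# not values from the ENDF/B-VII.1-based library discussed above.

S = np.array([0.20, 0.35, 0.10])          # relative sensitivities dR/R per dx/x
C = np.array([[4.0, 1.0, 0.0],            # relative covariances, (%)^2
              [1.0, 9.0, 2.0],
              [0.0, 2.0, 1.0]])

var_R = S @ C @ S                          # (%)^2
print(f"relative uncertainty in R: {np.sqrt(var_R):.2f} %")
```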

  10. Chemical Space of DNA-Encoded Libraries.

    PubMed

    Franzini, Raphael M; Randolph, Cassie

    2016-07-28

In recent years, DNA-encoded chemical libraries (DECLs) have attracted considerable attention as a potential discovery tool in drug development. Screening encoded libraries may offer advantages over conventional hit discovery approaches and has the potential to complement such methods in pharmaceutical research. As a result of the increased application of encoded libraries in drug discovery, a growing number of hit compounds are emerging in the scientific literature. In this review we evaluate reported encoded-library-derived structures and identify general trends of these compounds in relation to library design parameters. In particular, we emphasize the combinatorial nature of these libraries. Generally, the reported molecules demonstrate the ability of this technology to afford hits suitable for further lead development, and on their basis we derive guidelines for DECL design.

  11. ORIGEN-based Nuclear Fuel Inventory Module for Fuel Cycle Assessment: Final Project Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Skutnik, Steven E.

The goal of this project, “ORIGEN-based Nuclear Fuel Depletion Module for Fuel Cycle Assessment," is to create a physics-based reactor depletion and decay module for the Cyclus nuclear fuel cycle simulator in order to assess nuclear fuel inventories over a broad space of reactor operating conditions. The overall goal of this approach is to facilitate evaluations of nuclear fuel inventories for a broad space of scenarios, including extended used nuclear fuel storage and cascading impacts on fuel cycle options such as actinide recovery in used nuclear fuel, particularly for multiple-recycle scenarios. The advantage of a physics-based approach (compared to a recipe-based approach, which has typically been employed in fuel cycle simulators) is its inherent flexibility; such an approach can more readily accommodate the broad space of potential isotopic vectors that may be encountered under advanced fuel cycle options. In order to develop this flexible reactor analysis capability, we are leveraging the Origen nuclear fuel depletion and decay module from SCALE to produce a standalone “depletion engine” which will serve as the kernel of a Cyclus-based reactor analysis module. The ORIGEN depletion module is a rigorously benchmarked and extensively validated tool for nuclear fuel analysis, and thus its incorporation into the Cyclus framework can bring these capabilities to bear on the problem of evaluating long-term impacts of fuel cycle option choices on relevant metrics of interest, including materials inventories and availability (for multiple-recycle scenarios), long-term waste management and repository impacts, etc.
Developing this Origen-based analysis capability for Cyclus requires the refinement of the Origen analysis sequence to the point where it can reasonably be compiled as a standalone sequence outside of SCALE; i.e., wherein all of the computational aspects of Origen (including reactor cross-section library processing and interpolation, input and output processing, and depletion/decay solvers) can be self-contained into a single executable sequence. Further, to embed this capability into other software environments (such as the Cyclus fuel cycle simulator) requires that Origen’s capabilities be encapsulated into a portable, self-contained library which other codes can then call directly through function calls, thereby directly accessing the solver and data processing capabilities of Origen. Additional components relevant to this work include modernization of the reactor data libraries used by Origen for conducting nuclear fuel depletion calculations. This work has included the development of new fuel assembly lattices not previously available (such as for CANDU heavy-water reactor assemblies) as well as validation of updated lattices for light-water reactors updated to employ modern nuclear data evaluations. The CyBORG reactor analysis module as-developed under this workscope is fully capable of dynamic calculation of depleted fuel compositions from all commercial U.S. reactor assembly types as well as a number of international fuel types, including MOX, VVER, MAGNOX, and PHWR CANDU fuel assemblies. In addition, the Origen-based depletion engine allows for CyBORG to evaluate novel fuel assembly and reactor design types via creation of Origen reactor data libraries via SCALE. 
The establishment of this new modeling capability affords fuel cycle modelers a substantially improved ability to model dynamically changing fuel cycle and reactor conditions, including recycled fuel compositions from fuel cycle scenarios involving material recycle into thermal-spectrum systems.

  12. Probabilistic accident consequence uncertainty analysis -- Uncertainty assessment for deposited material and external doses. Volume 2: Appendices

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Goossens, L.H.J.; Kraan, B.C.P.; Cooke, R.M.

    1997-12-01

The development of two new probabilistic accident consequence codes, MACCS and COSYMA, was completed in 1990. These codes estimate the consequences of accidental releases of radiological material from hypothesized accidents at nuclear installations. In 1991, the US Nuclear Regulatory Commission and the Commission of the European Communities began cosponsoring a joint uncertainty analysis of the two codes. The ultimate objective of this joint effort was to systematically develop credible and traceable uncertainty distributions for the respective code input variables. A formal expert judgment elicitation and evaluation process was identified as the best technology available for developing a library of uncertainty distributions for these consequence parameters. This report focuses on the results of the study to develop distributions for variables related to the MACCS and COSYMA deposited material and external dose models. This volume contains appendices that include (1) a summary of the MACCS and COSYMA consequence codes, (2) the elicitation questionnaires and case structures, (3) the rationales and results for the panel on deposited material and external doses, (4) short biographies of the experts, and (5) the aggregated results of their responses.

  13. FRENDY: A new nuclear data processing system being developed at JAEA

    NASA Astrophysics Data System (ADS)

    Tada, Kenichi; Nagaya, Yasunobu; Kunieda, Satoshi; Suyama, Kenya; Fukahori, Tokio

    2017-09-01

JAEA has provided the evaluated nuclear data library JENDL and nuclear application codes such as MARBLE, SRAC, MVP and PHITS. These domestic codes have been widely used in many universities and industrial companies in Japan. However, we sometimes find problems in imported processing systems and need to revise them when a new JENDL release appears. To overcome such problems and to process the nuclear data immediately when they are released, JAEA started developing a new nuclear data processing system, FRENDY, in 2013. This paper describes the outline of the development of FRENDY and its capabilities and performance, demonstrated through analyses of criticality experiments. The verification results indicate that FRENDY properly generates ACE files.

  14. Visualization Based Data Mining for Comparison Between Two Solar Cell Libraries.

    PubMed

    Yosipof, Abraham; Kaspi, Omer; Majhi, Koushik; Senderowitz, Hanoch

    2016-12-01

Material informatics may provide meaningful insights and powerful predictions for the development of new and efficient Metal Oxide (MO) based solar cells. The main objective of this paper is to establish the usefulness of data reduction and visualization methods for analyzing data sets emerging from multiple all-MO solar cell libraries. For this purpose, two libraries, TiO2|Co3O4 and TiO2|Co3O4|MoO3, differing only by the presence of a MoO3 layer in the latter, were analyzed with Principal Component Analysis and Self-Organizing Maps. Both analyses suggest that the addition of the MoO3 layer to the TiO2|Co3O4 library has affected the overall photovoltaic (PV) activity profile of the solar cells, making the two libraries clearly distinguishable from one another. Furthermore, while MoO3 had an overall favorable effect on PV parameters, a sub-population of cells was identified which were either indifferent to its presence or even demonstrated a reduction in several parameters.
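A Principal Component Analysis of the kind used to compare the two libraries can be sketched with a plain SVD; the two synthetic data blocks below are placeholders for the measured photovoltaic parameter tables (cells × parameters) of the two libraries:

```python
import numpy as np

# Minimal PCA via SVD, the data-reduction step used to compare two solar
# cell libraries. The two synthetic "libraries" are invented stand-ins for
# measured PV parameter tables, with library B shifted to mimic an added layer.

rng = np.random.default_rng(0)
lib_a = rng.normal(loc=0.0, scale=1.0, size=(40, 5))   # e.g. TiO2|Co3O4
lib_b = rng.normal(loc=2.0, scale=1.0, size=(40, 5))   # e.g. with MoO3 layer
X = np.vstack([lib_a, lib_b])

Xc = X - X.mean(axis=0)                 # center columns
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
scores = Xc @ Vt.T[:, :2]               # project onto first two PCs

# If the added layer shifts the PV profile, the two libraries separate
# along the first principal component:
print(scores[:40, 0].mean(), scores[40:, 0].mean())
```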

  15. After Action Report - Kazakhstan NSDD July 2015

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fox, Caterina; Eppich, Gary; Kips, Ruth

On Monday 20 July, Caterina Fox, Ruth Kips and Kim Knight were invited to participate in Kazakhstan's nuclear material inventory management working group meeting, coordinated by Alexander Vasilliev, as nuclear forensics subject matter experts. The meeting included participants from Kazakhstan's nuclear regulatory agency (CAESC, the Committee on Atomic and Energetic Supervision and Control) and three institutes: the Institute of Nuclear Physics, INP (Almaty); the National Nuclear Center, NNC (Kurchatov); and the Ulba Metallurgical Plant, UMP (Oskemen). CAESC requested the attendance of an MC&A expert, an IT specialist, and a physical security specialist from each site. The general meeting concerned considerations for creating unified or compatible systems for nuclear material inventory management. NSDD representatives provided an overview of nuclear forensics and presented considerations for developments of inventory management that might be synergistic with future development of a National Nuclear Forensics Library to support nuclear forensics investigations.

  16. SAR target recognition using behaviour library of different shapes in different incidence angles and polarisations

    NASA Astrophysics Data System (ADS)

    Fallahpour, Mojtaba Behzad; Dehghani, Hamid; Jabbar Rashidi, Ali; Sheikhi, Abbas

    2018-05-01

Target recognition is one of the most important issues in the interpretation of synthetic aperture radar (SAR) images. Modelling, analysis, and recognition of the effects of influential parameters in SAR can provide a better understanding of SAR imaging systems, and therefore facilitate the interpretation of the produced images. Influential parameters in SAR images can be divided into five general categories of radar, radar platform, channel, imaging region, and processing section, each of which has different physical, structural, hardware, and software sub-parameters with clear roles in the final images. In this paper, for the first time, a behaviour library that includes the effects of polarisation, incidence angle, and shape of targets, as radar and imaging-region sub-parameters, on SAR images is extracted. This library shows that the pattern created for each of the cylindrical, conical, and cubic shapes is unique, and owing to these unique properties such shapes can be recognised in SAR images. This capability is applied to data acquired with the Canadian RADARSAT-1 satellite.

  17. A comparison of different functions for predicted protein model quality assessment.

    PubMed

    Li, Juan; Fang, Huisheng

    2016-07-01

In protein structure prediction, a considerable number of models are usually produced by either the template-based method (TBM) or ab initio prediction. The purpose of this study is to find the critical parameter in assessing the quality of the predicted models. A non-redundant template library was developed and 138 target sequences were modeled. The target sequences were all distant from the proteins in the template library and were aligned with template library proteins on the basis of the transformation matrix. The quality of each model was first assessed with QMEAN and its six parameters, which are C_β interaction energy (C_beta), all-atom pairwise energy (PE), solvation energy (SE), torsion angle energy (TAE), secondary structure agreement (SSA), and solvent accessibility agreement (SAE). Finally, the alignment score (score) was also used to assess model quality. Hence, a total of eight parameters (i.e., QMEAN, C_beta, PE, SE, TAE, SSA, SAE, score) were independently used to assess the quality of each model. The results indicate that SSA is the best parameter for estimating the quality of a model.
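The comparison described above amounts to correlating each candidate parameter's per-model scores against a reference quality measure and keeping the strongest. A sketch with invented scores (the helper function, the reference values, and the per-parameter scores are all illustrative, not the study's data):

```python
import math

# For each candidate quality-assessment parameter, correlate its per-model
# scores with a reference quality measure and keep the parameter with the
# strongest correlation. All numbers below are invented for illustration.

def pearson(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

true_quality = [0.82, 0.45, 0.63, 0.91, 0.30]      # reference per-model quality
params = {
    "SSA":   [0.80, 0.50, 0.60, 0.88, 0.35],       # tracks quality closely
    "score": [0.40, 0.42, 0.45, 0.50, 0.41],       # weaker discriminator
}
best = max(params, key=lambda p: pearson(params[p], true_quality))
print(best)   # SSA
```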

  18. Reactivity impact of {sup 16}O thermal elastic-scattering nuclear data for some numerical and critical benchmark systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kozier, K. S.; Roubtsov, D.; Plompen, A. J. M.

    2012-07-01

The thermal neutron elastic-scattering cross-section data for ¹⁶O used in various modern evaluated nuclear data libraries were reviewed and found to be generally too high compared with the best available experimental measurements. Some of the proposed revisions to the ENDF/B-VII.0 ¹⁶O data library and recent results from the TENDL system increase this discrepancy further. The reactivity impact of revising the ¹⁶O data downward to be consistent with the best measurements was tested using the JENDL-3.3 ¹⁶O cross-section values and was found to be very small in MCNP5 simulations of the UO₂ and reactor-recycle MOX-fuel cases of the ANS Doppler-defect numerical benchmark. However, large reactivity differences of up to about 14 mk (1400 pcm) were observed using ¹⁶O data files from several evaluated nuclear data libraries in MCNP5 simulations of the Los Alamos National Laboratory HEU heavy-water solution thermal critical experiments, which were performed in the 1950s. The latter result suggests that new measurements using HEU in a heavy-water-moderated critical facility, such as the ZED-2 zero-power reactor at the Chalk River Laboratories, might help to resolve the discrepancy between the ¹⁶O thermal elastic-scattering cross-section values and thereby reduce or better define its uncertainty, although additional assessment work would be needed to confirm this. (authors)

  19. Deuteron nuclear data for the design of accelerator-based neutron sources: Measurement, model analysis, evaluation, and application

    NASA Astrophysics Data System (ADS)

    Watanabe, Yukinobu; Kin, Tadahiro; Araki, Shouhei; Nakayama, Shinsuke; Iwamoto, Osamu

    2017-09-01

    A comprehensive research program on deuteron nuclear data motivated by development of accelerator-based neutron sources is being executed. It is composed of measurements of neutron and gamma-ray yields and production cross sections, modelling of deuteron-induced reactions and code development, nuclear data evaluation and benchmark test, and its application to medical radioisotopes production. The goal of this program is to develop a state-of-the-art deuteron nuclear data library up to 200 MeV which will be useful for the design of future (d,xn) neutron sources. The current status and future plan are reviewed.

  20. Dissemination of data measured at the CERN n_TOF facility

    NASA Astrophysics Data System (ADS)

    Dupont, E.; Otuka, N.; Cabellos, O.; Aberle, O.; Aerts, G.; Altstadt, S.; Alvarez, H.; Alvarez-Velarde, F.; Andriamonje, S.; Andrzejewski, J.; Audouin, L.; Bacak, M.; Badurek, G.; Balibrea, J.; Barbagallo, M.; Barros, S.; Baumann, P.; Bécares, V.; Bečvář, F.; Beinrucker, C.; Belloni, F.; Berthier, B.; Berthoumieux, E.; Billowes, J.; Boccone, V.; Bosnar, D.; Brown, A.; Brugger, M.; Caamaño, M.; Calviani, M.; Calviño, F.; Cano-Ott, D.; Capote, R.; Cardella, R.; Carrapiço, C.; Casanovas, A.; Castelluccio, D. M.; Cennini, P.; Cerutti, F.; Chen, Y. H.; Chiaveri, E.; Chin, M.; Colonna, N.; Cortés, G.; Cortés-Giraldo, M. A.; Cosentino, L.; Couture, A.; Cox, J.; Damone, L. A.; David, S.; Deo, K.; Diakaki, M.; Dillmann, I.; Domingo-Pardo, C.; Dressler, R.; Dridi, W.; Duran, I.; Eleftheriadis, C.; Embid-Segura, M.; Fernández-Domínguez, B.; Ferrant, L.; Ferrari, A.; Ferreira, P.; Finocchiaro, P.; Fraval, K.; Frost, R. J. W.; Fujii, K.; Furman, W.; Ganesan, S.; Garcia, A. R.; Gawlik, A.; Gheorghe, I.; Gilardoni, S.; Giubrone, G.; Glodariu, T.; Göbel, K.; Gomez-Hornillos, M. B.; Goncalves, I. F.; Gonzalez-Romero, E.; Goverdovski, A.; Gramegna, F.; Griesmayer, E.; Guerrero, C.; Gunsing, F.; Gurusamy, P.; Haight, R.; Harada, H.; Heftrich, T.; Heil, M.; Heinitz, S.; Hernández-Prieto, A.; Heyse, J.; Igashira, M.; Isaev, S.; Jenkins, D. G.; Jericha, E.; Kadi, Y.; Kaeppeler, F.; Kalamara, A.; Karadimos, D.; Karamanis, D.; Katabuchi, T.; Kavrigin, P.; Kerveno, M.; Ketlerov, V.; Khryachkov, V.; Kimura, A.; Kivel, N.; Kokkoris, M.; Konovalov, V.; Krtička, M.; Kroll, J.; Kurtulgil, D.; Lampoudis, C.; Langer, C.; Leal-Cidoncha, E.; Lederer, C.; Leeb, H.; Naour, C. Le; Lerendegui-Marco, J.; Leong, L. S.; Licata, M.; Meo, S. Lo; Lonsdale, S. J.; Losito, R.; Lozano, M.; Macina, D.; Manousos, A.; Marganiec, J.; Martinez, T.; Marrone, S.; Masi, A.; Massimi, C.; Mastinu, P.; Mastromarco, M.; Matteucci, F.; Maugeri, E. A.; Mazzone, A.; Mendoza, E.; Mengoni, A.; Milazzo, P. 
M.; Mingrone, F.; Mirea, M.; Mondelaers, W.; Montesano, S.; Moreau, C.; Mosconi, M.; Musumarra, A.; Negret, A.; Nolte, R.; O'Brien, S.; Oprea, A.; Palomo-Pinto, F. R.; Pancin, J.; Paradela, C.; Patronis, N.; Pavlik, A.; Pavlopoulos, P.; Perkowski, J.; Perrot, L.; Pigni, M. T.; Plag, R.; Plompen, A.; Plukis, L.; Poch, A.; Porras, I.; Praena, J.; Pretel, C.; Quesada, J. M.; Radeck, D.; Rajeev, K.; Rauscher, T.; Reifarth, R.; Riego, A.; Robles, M.; Roman, F.; Rout, P. C.; Rudolf, G.; Rubbia, C.; Rullhusen, P.; Ryan, J. A.; Sabaté-Gilarte, M.; Salgado, J.; Santos, C.; Sarchiapone, L.; Sarmento, R.; Saxena, A.; Schillebeeckx, P.; Schmidt, S.; Schumann, D.; Sedyshev, P.; Smith, A. G.; Sosnin, N. V.; Stamatopoulos, A.; Stephan, C.; Suryanarayana, S. V.; Tagliente, G.; Tain, J. L.; Tarifeño-Saldivia, A.; Tarrío, D.; Tassan-Got, L.; Tavora, L.; Terlizzi, R.; Tsinganis, A.; Valenta, S.; Vannini, G.; Variale, V.; Vaz, P.; Ventura, A.; Versaci, R.; Vermeulen, M. J.; Villamarin, D.; Vicente, M. C.; Vlachoudis, V.; Vlastou, R.; Voss, F.; Wallner, A.; Walter, S.; Ware, T.; Warren, S.; Weigand, M.; Weiß, C.; Wolf, C.; Wiesher, M.; Wisshak, K.; Woods, P. J.; Wright, T.; Žugec, P.

    2017-09-01

    The n_TOF neutron time-of-flight facility at CERN is used for high quality nuclear data measurements from thermal energy up to hundreds of MeV. In line with the CERN open data policy, the n_TOF Collaboration takes actions to preserve its unique data, facilitate access to them in standardised format, and allow their re-use by a wide community in the fields of nuclear physics, nuclear astrophysics and various nuclear technologies. The present contribution briefly describes the n_TOF outcomes, as well as the status of dissemination and preservation of n_TOF final data in the international EXFOR library.

  1. A Coincidence Signature Library for Multicoincidence Radionuclide Analysis Systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Smith, Leon E.; Ellis, J E.; Valsan, Andrei B.

Pacific Northwest National Laboratory (PNNL) is currently developing multicoincidence systems to perform trace radionuclide analysis at or near the sample collection point, for applications that include emergency response, nuclear forensics, and environmental monitoring. Quantifying radionuclide concentrations with these systems requires a library of accurate emission intensities for each detected signature, for all candidate radionuclides. To meet this need, a Coincidence Lookup Library (CLL) is being developed to calculate the emission intensities of coincident signatures from a user-specified radionuclide, or conversely, to determine the radionuclides that may be responsible for a specific detected coincident signature. The algorithms used to generate absolute emission intensities and various query modes for our developmental CLL are described.

  2. Hypercluster parallel processing library user's manual

    NASA Technical Reports Server (NTRS)

    Quealy, Angela

    1990-01-01

    This User's Manual describes the Hypercluster Parallel Processing Library, composed of FORTRAN-callable subroutines which enable a FORTRAN programmer to manipulate and transfer information throughout the Hypercluster at NASA Lewis Research Center. Each subroutine and its parameters are described in detail. A simple heat flow application using Laplace's equation is included to demonstrate the use of some of the library's subroutines. The manual can be used initially as an introduction to the parallel features provided by the library. Thereafter it can be used as a reference when programming an application.
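The heat flow demonstration mentioned in the manual solves Laplace's equation; the serial kernel that such a parallel library would distribute across nodes is a Jacobi sweep, sketched here in Python (the grid size, boundary temperatures, and iteration count are arbitrary illustration values):

```python
# Serial Jacobi iteration for Laplace's equation on a square plate -- the
# computational kernel a parallel heat flow demonstration would distribute.
# Grid size, boundary temperatures, and iteration count are arbitrary.

N = 20
grid = [[0.0] * N for _ in range(N)]
for j in range(N):
    grid[0][j] = 100.0          # hot top edge; other edges held at 0

for _ in range(500):            # fixed iteration count for the sketch
    new = [row[:] for row in grid]
    for i in range(1, N - 1):
        for j in range(1, N - 1):
            # Each interior point relaxes toward the mean of its neighbors
            new[i][j] = 0.25 * (grid[i - 1][j] + grid[i + 1][j]
                                + grid[i][j - 1] + grid[i][j + 1])
    grid = new

print(f"center temperature: {grid[N // 2][N // 2]:.1f}")
```

In the parallel version, the grid would be partitioned across nodes and the library's transfer subroutines would exchange the boundary rows between neighbors after each sweep.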

  3. Ret Receptor: Functional Consequences of Oncogenic Rearrangements.

    DTIC Science & Technology

    1996-10-01

incorporation of the thymidine analog 5-bromodeoxyuridine (BrdU) and its subsequent detection by immunostaining (33). Following nuclear ...other LexA-fusions to test for Ret/ptc2-specific interaction. Seventeen of the library plasmids yielded co-transformants which were β-galactosidase...cells expressing the EGFR/Ret chimera and M. Pierotti for the Ret/ptc2 clone. ...events in papillary thyroid carcinoma (28). In a nuclear micro-injection assay the

  4. GENIE Production Release 2.10.0

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Alam, M.; Andreopoulos, C.; Athar, M.

    2015-12-25

    GENIE is a neutrino Monte Carlo event generator that simulates the primary interaction of a neutrino with a nuclear target, along with the subsequent propagation of the reaction products through the nuclear medium. It additionally contains libraries for fully-featured detector geometries and for managing various types of neutrino flux. This note details recent updates to GENIE, in particular, changes introduced into the newest production release, version 2.10.0.

  5. Isotopic yield measurement in the heavy mass region for 239Pu thermal neutron induced fission

    NASA Astrophysics Data System (ADS)

    Bail, A.; Serot, O.; Mathieu, L.; Litaize, O.; Materna, T.; Köster, U.; Faust, H.; Letourneau, A.; Panebianco, S.

    2011-09-01

Despite the huge number of fission yield data available in the different evaluated nuclear data libraries, such as JEFF-3.1.1, ENDF/B-VII.0, and JENDL-4.0, more accurate data are still needed, both for nuclear energy applications and for our understanding of the fission process itself. Within this framework, measurements were undertaken on the recoil mass spectrometer Lohengrin (at the Institut Laue-Langevin, Grenoble, France) to determine isotopic yields for the heavy fission products from the 239Pu(nth,f) reaction. To do this, a new experimental method based on γ-ray spectrometry was developed and validated by comparing our results with those obtained in the light mass region with completely different setups. About 65 fission product yields were measured, with an uncertainty reduced on average by a factor of 2 compared with that previously available in the nuclear data libraries. In addition, for some fission products an ionic charge distribution strongly deformed compared with a normal Gaussian shape was found, which was interpreted as being caused by the presence of a nanosecond isomeric state. Finally, a nuclear charge polarization has been observed, in agreement with that described for other close fissioning systems.

  6. Computer Simulation of the Circulation Subsystem of a Library

    ERIC Educational Resources Information Center

    Shaw, W. M., Jr.

    1975-01-01

    When circulation data are used as input parameters for a computer simulation of a library's circulation subsystem, the results of the simulation provide information on book availability and delays. The model may be used to simulate alternative loan policies. (Author/LS)
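
    The kind of circulation model described above can be sketched as a small Monte Carlo simulation. Everything below is illustrative (the abstract gives no actual parameters or loan policies): requests are assumed to arrive as a Poisson process, each satisfied request checks a copy out for a fixed loan period, and availability is the fraction of requests satisfied immediately.

```python
import random

def simulate_circulation(copies, requests_per_day, loan_days, horizon_days, seed=1):
    """Toy circulation subsystem: exponential interarrival times between
    patron requests; a request is satisfied if a copy is on the shelf,
    and each satisfied request removes one copy for `loan_days`."""
    rng = random.Random(seed)
    t = 0.0
    on_loan = []                 # due times of copies currently checked out
    satisfied = total = 0
    while True:
        t += rng.expovariate(requests_per_day)      # next request arrives
        if t > horizon_days:
            break
        on_loan = [due for due in on_loan if due > t]   # process returns
        total += 1
        if len(on_loan) < copies:                   # a copy is available
            satisfied += 1
            on_loan.append(t + loan_days)
    return satisfied / total if total else 1.0

# Compare two loan policies for a 3-copy title with one request every 2 days:
print(simulate_circulation(3, 0.5, loan_days=28, horizon_days=3650))
print(simulate_circulation(3, 0.5, loan_days=14, horizon_days=3650))
```

Running the same arrival stream under alternative loan periods is exactly the kind of policy comparison the abstract describes; a shorter loan period typically raises the estimated availability.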

  7. Computation of a Canadian SCWR unit cell with deterministic and Monte Carlo codes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Harrisson, G.; Marleau, G.

    2012-07-01

    The Canadian SCWR has the potential to achieve the goals that Generation IV nuclear reactors must meet. As part of the optimization process for this design concept, lattice cell calculations are routinely performed using deterministic codes. In this study, the first step (self-shielding treatment) of the computation scheme developed with the deterministic code DRAGON for the Canadian SCWR has been validated. Several options available in the module responsible for the resonance self-shielding calculation in DRAGON 3.06, together with different microscopic cross section libraries based on the ENDF/B-VII.0 evaluated nuclear data file, have been tested and compared to a reference calculation performed with the Monte Carlo code SERPENT under the same conditions. Compared to SERPENT, DRAGON underestimates the infinite multiplication factor in all cases. In general, the original Stammler model with the Livolant-Jeanpierre approximations is the most appropriate self-shielding option for this case study. In addition, the 89-group WIMS-AECL library for slightly enriched uranium and the 172-group WLUP library for a mixture of plutonium and thorium give the results most consistent with those of SERPENT. (authors)

  8. ZASPE: A Code to Measure Stellar Atmospheric Parameters and their Covariance from Spectra

    NASA Astrophysics Data System (ADS)

    Brahm, Rafael; Jordán, Andrés; Hartman, Joel; Bakos, Gáspár

    2017-05-01

    We describe the Zonal Atmospheric Stellar Parameters Estimator (zaspe), a new algorithm, and its associated code, for determining precise stellar atmospheric parameters and their uncertainties from high-resolution echelle spectra of FGK-type stars. zaspe estimates stellar atmospheric parameters by comparing the observed spectrum against a grid of synthetic spectra, using only the spectral zones most sensitive to changes in the atmospheric parameters. Realistic uncertainties in the parameters are computed from the data themselves, by taking into account the systematic mismatches between the observed spectrum and the best-fitting synthetic one. The covariances between the parameters are also estimated in the process. zaspe can in principle use any pre-calculated grid of synthetic spectra, but unbiased grids are required to obtain accurate parameters. We tested the performance of two existing libraries and concluded that neither is suitable for computing precise atmospheric parameters. We describe a process for synthesizing a new library of synthetic spectra that was found to generate consistent results when compared with parameters obtained by different methods (interferometry, asteroseismology, equivalent widths).
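
    The zonal idea, comparing against the grid only where the spectrum actually responds to the parameters, can be sketched as follows. This is an illustration of the general technique, not the published zaspe code; the sensitivity measure (per-pixel spread across the grid) and the `frac` cutoff are assumptions made for the sketch.

```python
import numpy as np

def zonal_best_fit(obs_flux, grid_params, grid_fluxes, frac=0.2):
    """Compare an observed spectrum to a synthetic grid using only the
    pixels where the grid varies most (a stand-in 'sensitivity' measure).
    Returns the best-fitting grid parameters and the chi^2 per model."""
    grid_fluxes = np.asarray(grid_fluxes, dtype=float)
    obs_flux = np.asarray(obs_flux, dtype=float)
    sensitivity = grid_fluxes.std(axis=0)            # spread across the grid
    n_keep = max(1, int(frac * grid_fluxes.shape[1]))
    zone = np.argsort(sensitivity)[-n_keep:]         # most sensitive pixels
    chi2 = ((grid_fluxes[:, zone] - obs_flux[zone]) ** 2).sum(axis=1)
    return grid_params[int(np.argmin(chi2))], chi2

# Toy grid: pixels 0-4 are insensitive continuum, pixels 5-9 scale with Teff.
teffs = [5000, 5500, 6000]
grid = np.ones((3, 10))
for i, teff in enumerate(teffs):
    grid[i, 5:] = teff / 6000.0
best, _ = zonal_best_fit(grid[1], teffs, grid, frac=0.5)
print(best)   # 5500
```

Restricting the fit to the sensitive zone is what keeps insensitive continuum pixels from diluting the χ² signal.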

  9. Investigation of inconsistent ENDF/B-VII.1 independent and cumulative fission product yields with proposed revisions

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pigni, Marco T; Francis, Matthew W; Gauld, Ian C

    A recent implementation of ENDF/B-VII.1 independent fission product yields and nuclear decay data identified inconsistencies caused by the use of updated decay schemes in the decay sub-library that are not reflected in the legacy fission product yield data. Recent changes in the decay data sub-library, particularly the delayed neutron branching fractions, result in calculated fission product concentrations that are incompatible with the cumulative fission yields in the library, and also with experimental measurements. A comprehensive set of independent fission product yields was generated for thermal and fission-spectrum neutron-induced fission of 235,238U and 239,241Pu in order to provide a preliminary assessment of the consistency of the updated fission product yield data. These updated independent fission product yields were used in the ORIGEN code to compare calculated fission product inventories with experimentally measured inventories, with particular attention given to the noble gases. An important outcome of this work is the development of the fission product yield covariance data necessary for fission product uncertainty quantification. The evaluation methodology combines a sequential Bayesian method, which guarantees consistency between independent and cumulative yields, with the physical constraints on the independent yields. This work was motivated by the need to improve the performance of the ENDF/B-VII.1 library for stable and long-lived cumulative yields, given the inconsistency between the ENDF/B-VII.1 fission product yield and decay data sub-libraries. The revised fission product yields and the new covariance data are proposed as a revision to the fission yield data currently in ENDF/B-VII.1.

  10. New evaluation of thermal neutron scattering libraries for light and heavy water

    NASA Astrophysics Data System (ADS)

    Marquez Damian, Jose Ignacio; Granada, Jose Rolando; Cantargi, Florencia; Roubtsov, Danila

    2017-09-01

    In order to improve the design and safety of thermal nuclear reactors, and to verify criticality safety conditions in systems with significant amounts of fissile material and water, it is necessary to perform high-precision neutron transport calculations and to estimate the uncertainties of the results. These calculations are based on neutron interaction data distributed in evaluated nuclear data libraries. To improve the evaluations of the thermal scattering sub-libraries, we developed a set of thermal neutron scattering cross sections (scattering kernels) for hydrogen bound in light water, and for deuterium and oxygen bound in heavy water, in the ENDF-6 format, from room temperature up to the critical temperatures of the molecular liquids. The new evaluations were generated so as to be processable with NJOY99, with NJOY-2012 (with minor modifications/updates), and with the new version, NJOY-2016. The new TSL libraries are based on molecular dynamics simulations with GROMACS and on recent experimental data, and they improve the calculation of single neutron scattering quantities. In this work, we discuss the importance of taking self-diffusion in liquids into account to accurately describe neutron scattering at low neutron energies (the quasi-elastic peak problem). To improve the modeling of heavy water, it is important to take into account temperature-dependent static structure factors and to apply the Sköld approximation to the coherent inelastic components of the scattering matrix. The use of the new set of scattering matrices and cross sections improves the calculated results for thermal critical systems moderated and/or reflected by light or heavy water from the International Criticality Safety Benchmark Evaluation Project (ICSBEP) handbook.
For example, the use of the new thermal scattering library for heavy water, combined with the ROSFOND-2010 evaluation of the cross sections for deuterium, results in an improvement of the C/E ratio in 48 out of 65 international benchmark cases calculated with the Monte Carlo code MCNP5, in comparison with the existing library based on the ENDF/B-VII.0 evaluation.

  11. Exploring Pandora's Box: Potential and Pitfalls of Low Coverage Genome Surveys for Evolutionary Biology

    PubMed Central

    Leese, Florian; Mayer, Christoph; Agrawal, Shobhit; Dambach, Johannes; Dietz, Lars; Doemel, Jana S.; Goodall-Copstake, William P.; Held, Christoph; Jackson, Jennifer A.; Lampert, Kathrin P.; Linse, Katrin; Macher, Jan N.; Nolzen, Jennifer; Raupach, Michael J.; Rivera, Nicole T.; Schubart, Christoph D.; Striewski, Sebastian; Tollrian, Ralph; Sands, Chester J.

    2012-01-01

    High throughput sequencing technologies are revolutionizing genetic research. With this “rise of the machines”, genomic sequences can be obtained even for unknown genomes within a short time and for reasonable costs. This has enabled evolutionary biologists studying genetically unexplored species to identify molecular markers or genomic regions of interest (e.g. micro- and minisatellites, mitochondrial and nuclear genes) by sequencing only a fraction of the genome. However, when using such datasets from non-model species, it is possible that DNA from non-target contaminant species such as bacteria, viruses, fungi, or other eukaryotic organisms may complicate the interpretation of the results. In this study we analysed 14 genomic pyrosequencing libraries of aquatic non-model taxa from four major evolutionary lineages. We quantified the amount of suitable micro- and minisatellites, mitochondrial genomes, known nuclear genes and transposable elements and searched for contamination from various sources using bioinformatic approaches. Our results show that in all sequence libraries with estimated coverage of about 0.02–25%, many appropriate micro- and minisatellites, mitochondrial gene sequences and nuclear genes from different KEGG (Kyoto Encyclopedia of Genes and Genomes) pathways could be identified and characterized. These can serve as markers for phylogenetic and population genetic analyses. A central finding of our study is that several genomic libraries suffered from different biases owing to non-target DNA or mobile elements. In particular, viruses, bacteria or eukaryote endosymbionts contributed significantly (up to 10%) to some of the libraries analysed. If not identified as such, genetic markers developed from high-throughput sequencing data for non-model organisms may bias evolutionary studies or fail completely in experimental tests. 
In conclusion, our study demonstrates the enormous potential of low-coverage genome survey sequences and suggests bioinformatic analysis workflows. The results also argue for more sophisticated filtering of problematic and non-target genome sequences prior to marker development. PMID:23185309

  12. Development of the FITS tools package for multiple software environments

    NASA Technical Reports Server (NTRS)

    Pence, W. D.; Blackburn, J. K.

    1992-01-01

    The HEASARC is developing a package of general purpose software for analyzing data files in FITS format. This paper describes the design philosophy which makes the software both machine-independent (it runs on VAXs, Suns, and DEC-stations) and software environment-independent. Currently the software can be compiled and linked to produce IRAF tasks, or alternatively, the same source code can be used to generate stand-alone tasks using one of two implementations of a user-parameter interface library. The machine independence of the software is achieved by writing the source code in ANSI standard Fortran or C, using the machine-independent FITSIO subroutine interface for all data file I/O, and using a standard user-parameter subroutine interface for all user I/O. The latter interface is based on the Fortran IRAF Parameter File interface developed at STScI. The IRAF tasks are built by linking to the IRAF implementation of this parameter interface library. Two other implementations of this parameter interface library, which have no IRAF dependencies, are now available which can be used to generate stand-alone executable tasks. These stand-alone tasks can simply be executed from the machine operating system prompt either by supplying all the task parameters on the command line or by entering the task name after which the user will be prompted for any required parameters. A first release of this FTOOLS package is now publicly available. The currently available tasks are described, along with instructions on how to obtain a copy of the software.
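
    A flavor of the parameter interface can be given with a toy reader for IRAF-style parameter files. This is a sketch under the assumption of the common `name,type,mode,default,min,max,prompt` line layout; the real IRAF and HEASARC parameter libraries handle much more (value indirection, learned parameters, interactive prompting).

```python
import csv
import io

def read_par_file(text):
    """Parse IRAF-style parameter lines: name,type,mode,default,min,max,prompt.
    csv.reader is used so that quoted prompts containing commas survive."""
    params = {}
    for row in csv.reader(io.StringIO(text)):
        if not row or row[0].startswith("#") or len(row) < 4:
            continue                      # skip comments and malformed lines
        name, ptype, mode, default = row[:4]
        params[name] = {"type": ptype, "mode": mode, "value": default}
    return params

sample = (
    'infile,s,a,"events.fits",,,"Name of input FITS file"\n'
    'rows,i,h,10,1,1000,"Number of rows to process"\n'
)
pars = read_par_file(sample)
print(pars["infile"]["value"])   # events.fits
```

A stand-alone task built this way can take `infile=...` from the command line or fall back to prompting with the stored default, which is the behavior the abstract describes.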

  13. AQUATOX Data Sources Documents

    EPA Pesticide Factsheets

    Contains the data sources for parameter values of the AQUATOX model including: a bibliography for the AQUATOX data libraries and the compendia of parameter values for US Army Corps of Engineers models.

  14. Relationships between bacterial diversity and environmental variables in a tropical marine environment, Rio de Janeiro.

    PubMed

    Vieira, Ricardo P; Gonzalez, Alessandra M; Cardoso, Alexander M; Oliveira, Denise N; Albano, Rodolpho M; Clementino, Maysa M; Martins, Orlando B; Paranhos, Rodolfo

    2008-01-01

    This study is the first to apply a comparative analysis of environmental chemistry, microbiological parameters and bacterioplankton 16S rRNA clone libraries from different areas of a 50 km transect along a trophic gradient in the tropical Guanabara Bay ecosystem. Higher bacterial diversity was found in the coastal area, whereas lower richness was observed in the more polluted inner bay water. The significance of differences between clone libraries was examined with LIBSHUFF statistics. Paired reciprocal comparisons indicated that each of the libraries differs significantly from the others, and this is in agreement with direct interpretation of the phylogenetic tree. Furthermore, correspondence analyses showed that some taxa are related to specific abiotic, trophic and microbiological parameters in Guanabara Bay estuarine system.

  15. Construction, database integration, and application of an Oenothera EST library.

    PubMed

    Mrácek, Jaroslav; Greiner, Stephan; Cho, Won Kyong; Rauwolf, Uwe; Braun, Martha; Umate, Pavan; Altstätter, Johannes; Stoppel, Rhea; Mlcochová, Lada; Silber, Martina V; Volz, Stefanie M; White, Sarah; Selmeier, Renate; Rudd, Stephen; Herrmann, Reinhold G; Meurer, Jörg

    2006-09-01

    Coevolution of cellular genetic compartments is a fundamental aspect of eukaryotic genome evolution that becomes apparent in serious developmental disturbances after interspecific organelle exchanges. The genus Oenothera represents a unique resource, at present the only one available, for studying the role of the compartmentalized plant genome in the diversification of populations and in speciation processes. An integrated approach involving cDNA cloning, EST sequencing, and bioinformatic data mining was chosen using Oenothera elata, with the genetic constitution nuclear genome AA and plastome type I. The Gene Ontology system grouped 1621 unique gene products into 17 different functional categories. Application of arrays generated from a selected fraction of ESTs revealed significantly differing expression profiles among closely related Oenothera species possessing the potential to generate fertile but incompatible plastid/nuclear hybrids (hybrid bleaching). Furthermore, the EST library provides a valuable source of PCR-based polymorphic molecular markers that are instrumental for genotyping and molecular mapping approaches.

  16. The ENSDF Java Package

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sonzogni, A.A.

    2005-05-24

    A package of computer codes has been developed to process and display nuclear structure and decay data stored in the ENSDF (Evaluated Nuclear Structure Data File) library. The codes were written in an object-oriented fashion using the Java language, which allows for easy implementation across multiple platforms as well as deployment on web pages. The structure of the different Java classes that make up the package is discussed, along with several different implementations.

  17. The Library of the Institute of Theoretical Astronomy of the R.A.S. (1924-1994). History, Present State, Perspectives for Future

    NASA Astrophysics Data System (ADS)

    Lapteva, M. V.

    The building up of the specialized collection of the Library of the Institute of Theoretical Astronomy of the Russian Academy of Sciences, from the foundation of the Library (1924) to the present, is considered in historical perspective. The main acquisition sources, stock figures, and various parameters of the collection composition, including information on rare foreign editions, are also dealt with. Data on the existing retrieval systems and the prospects for developing computerized, problem-oriented reference bibliographic complexes are considered as well.

  18. ENDF/B-VII.1 Neutron Cross Section Data Testing with Critical Assembly Benchmarks and Reactor Experiments

    NASA Astrophysics Data System (ADS)

    Kahler, A. C.; MacFarlane, R. E.; Mosteller, R. D.; Kiedrowski, B. C.; Frankle, S. C.; Chadwick, M. B.; McKnight, R. D.; Lell, R. M.; Palmiotti, G.; Hiruta, H.; Herman, M.; Arcilla, R.; Mughabghab, S. F.; Sublet, J. C.; Trkov, A.; Trumbull, T. H.; Dunn, M.

    2011-12-01

    The ENDF/B-VII.1 library is the latest revision to the United States' Evaluated Nuclear Data File (ENDF). The ENDF library is currently in its seventh generation, with ENDF/B-VII.0 having been released in 2006. This revision expands upon that library, adding new evaluated files (the neutron sub-library has grown from 393 to 423 files, including the replacement of elemental vanadium and zinc evaluations with isotopic evaluations) and extending or updating many existing neutron data files. Complete details are provided in the companion paper [M. B. Chadwick et al., "ENDF/B-VII.1 Nuclear Data for Science and Technology: Cross Sections, Covariances, Fission Product Yields and Decay Data," Nuclear Data Sheets, 112, 2887 (2011)]. This paper focuses on how accurately application libraries may be expected to perform in criticality calculations with these data. Continuous-energy cross section libraries, suitable for use with the MCNP Monte Carlo transport code, have been generated and applied to a suite of nearly one thousand critical benchmark assemblies defined in the International Criticality Safety Benchmark Evaluation Project's International Handbook of Evaluated Criticality Safety Benchmark Experiments. This suite covers uranium and plutonium fuel systems in a variety of forms, such as metallic, oxide, or solution, and under a variety of spectral conditions, including unmoderated (i.e., bare), metal-reflected, and water- or other light-element-reflected. Assembly eigenvalues that were accurately predicted with ENDF/B-VII.0 cross sections, such as unmoderated and uranium-reflected 235U and 239Pu assemblies, HEU solution systems, and LEU oxide lattice systems that mimic commercial PWR configurations, continue to be accurately calculated with ENDF/B-VII.1 cross sections, and deficiencies in predicted eigenvalues for assemblies containing selected materials, including titanium, manganese, cadmium and tungsten, are greatly reduced. 
Improvements are also confirmed for selected actinide reaction rates such as 236U, 238,242Pu and 241,243Am capture in fast systems. Other deficiencies, such as the overprediction of Pu solution system critical eigenvalues and a decreasing trend in calculated eigenvalue for 233U fueled systems as a function of Above-Thermal Fission Fraction remain. The comprehensive nature of this critical benchmark suite and the generally accurate calculated eigenvalues obtained with ENDF/B-VII.1 neutron cross sections support the conclusion that this is the most accurate general purpose ENDF/B cross section library yet released to the technical community.

  19. Fission yields data generation and benchmarks of decay heat estimation of a nuclear fuel

    NASA Astrophysics Data System (ADS)

    Gil, Choong-Sup; Kim, Do Heon; Yoo, Jae Kwon; Lee, Jounghwa

    2017-09-01

    Fission yield data in the ENDF-6 format for 235U, 239Pu, and several other actinides, dependent on incident neutron energy, have been generated using the GEF code. In addition, fission yield data libraries for the ORIGEN-S and ORIGEN-ARP modules of the SCALE code have been generated with the new data. Decay heats calculated by ORIGEN-S using the new fission yield data have been compared with measured data for validation in this study. ORIGEN-S fission yield libraries based on ENDF/B-VII.1, JEFF-3.1.1, and JENDL/FPY-2011 have also been generated, and decay heats were calculated using these libraries for analysis and comparison.
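
    The quantity being benchmarked can be illustrated with a one-step toy model: after a fission pulse, each product contributes Y_i λ_i Q_i exp(−λ_i t) to the decay heat. The three nuclides and values below are invented for illustration only; ORIGEN-S itself follows full decay chains with evaluated half-lives and recoverable energies.

```python
import math

def pulse_decay_heat(products, t_seconds):
    """Decay heat (MeV/s per fission) at time t after a fission pulse,
    summing independent contributions Y * lambda * Q * exp(-lambda * t).
    `products` is a list of (yield_per_fission, half_life_s, q_mev) tuples."""
    total = 0.0
    for y, t_half, q in products:
        lam = math.log(2.0) / t_half          # decay constant
        total += y * lam * q * math.exp(-lam * t_seconds)
    return total

# Hypothetical three-product inventory (values are NOT evaluated data):
toy = [(0.06, 30.0, 2.0), (0.03, 600.0, 1.5), (0.01, 86400.0, 1.0)]
for t in (1.0, 60.0, 3600.0):
    print(t, pulse_decay_heat(toy, t))
```

Swapping one yield set for another in such a sum is, in miniature, what exchanging the ENDF/B-VII.1, JEFF-3.1.1, and JENDL/FPY-2011 based libraries does in the full ORIGEN-S comparison.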

  20. Improvements to Nuclear Data and Its Uncertainties by Theoretical Modeling

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Danon, Yaron; Nazarewicz, Witold; Talou, Patrick

    2013-02-18

    This project addresses three important gaps in existing evaluated nuclear data libraries that represent a significant hindrance to highly advanced modeling and simulation capabilities for the Advanced Fuel Cycle Initiative (AFCI). This project will: develop advanced theoretical tools to compute prompt fission neutron and gamma-ray characteristics well beyond average spectra and multiplicities, and produce new evaluated files for U and Pu isotopes, along with some minor actinides; perform state-of-the-art fission cross-section modeling and calculations using global and microscopic model input parameters, leading to truly predictive fission cross-section capabilities, with consistent calculations performed for a suite of Pu isotopes; and implement innovative data assimilation tools, which will reflect the nuclear data evaluation process much more accurately and lead to a new generation of uncertainty quantification files, with new covariance matrices obtained for Pu isotopes and compared to existing ones. The deployment of a fleet of safe and efficient advanced reactors that minimize radiotoxic waste and are proliferation-resistant is a clear and ambitious goal of AFCI. While in the past the design, construction and operation of a reactor were supported through empirical trials, this new phase in nuclear energy production is expected to rely heavily on advanced modeling and simulation capabilities. To be truly successful, a program for advanced simulations of innovative reactors will have to develop advanced multi-physics capabilities, run on massively parallel supercomputers, and incorporate adequate and precise underlying physics; all of these areas have to be developed simultaneously to achieve those ambitious goals. Of particular interest are reliable fission cross-section uncertainty estimates (including important correlations) and evaluations of prompt fission neutron and gamma-ray spectra and their uncertainties.

  1. MatProps: Material Properties Database and Associated Access Library

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Durrenberger, J K; Becker, R C; Goto, D M

    2007-08-13

    Coefficients for analytic constitutive and equation of state (EOS) models, which are used by many hydro codes at LLNL, are currently stored in a legacy material database (Steinberg, UCRL-MA-106349). Parameters for numerous materials are available through this database, including Steinberg-Guinan and Steinberg-Lund constitutive models for metals, JWL equations of state for high explosives, and Mie-Grüneisen equations of state for metals. These constitutive models are used in most of the simulations done by ASC codes at Livermore today. Analytic EOSs are also still used, but have been superseded in many cases by tabular representations in LEOS (http://leos.llnl.gov). Numerous advanced constitutive models have been developed and implemented in ASC codes over the past 20 years. These newer models have more physics and better representations of material strength properties than their predecessors, and therefore more model coefficients. However, a material database of these coefficients is not readily available, so incorporating them, together with the coefficients of the legacy models, into a portable database that can be shared among codes would be most welcome. The goal of this paper is to describe the MatProp effort at LLNL to create such a database and an associated access library that could be used by codes throughout the DOE complex and beyond. We have written an initial version of the MatProp database and access library, and our DOE/ASC code ALE3D (Nichols et al., UCRL-MA-152204) is able to import information from the database. The database, a link to which exists on the Sourceforge server at LLNL, contains coefficients for many materials and models (see Appendix), and includes material parameters in the following categories: flow stress, shear modulus, strength, damage, and equation of state. 
Future versions of the MatProp database and access library will include the ability to read and write material descriptions that can be exchanged between codes. They will also include the ability to perform unit conversions, i.e., to have the library return parameters in user-specified unit systems. In addition, further material categories can be added (e.g., phase change kinetics). The MatProp database and access library is part of a larger set of tools used at LLNL for assessing material model behavior. One of these is MSlib, a shared constitutive material model library. Another is the Material Strength Database (MSD), which allows users to compare parameter fits for specific constitutive models to available experimental data. Together with MatProp, these tools create a suite of capabilities that provide state-of-the-art models, and parameters for those models, to integrated simulation codes. This document is organized into several appendices. Appendix A contains a code example that retrieves several material coefficients. Appendix B contains the API for the MatProp data access library. Appendix C contains a list of the material names and model types currently available in the MatProp database. Appendix D contains a list of the parameter names for the currently recognized model types. Appendix E contains a full XML description of the material tantalum.
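
    Since the database stores a material description as XML (Appendix E), coefficient retrieval can be sketched as below. The element and attribute names (`material`, `model`, `param`) and the sample Steinberg-Guinan values are invented for illustration; the actual MatProp schema and access-library API are those documented in the report's appendices.

```python
import xml.etree.ElementTree as ET

# Invented XML layout standing in for a MatProp-style material description:
SAMPLE = """\
<material name="Tantalum">
  <model type="steinberg-guinan">
    <param name="G0" value="69.0" units="GPa"/>
    <param name="Y0" value="0.77" units="GPa"/>
  </model>
</material>
"""

def get_coefficients(xml_text, model_type):
    """Return {name: value} for every parameter of one model type."""
    root = ET.fromstring(xml_text)
    for model in root.iter("model"):
        if model.get("type") == model_type:
            return {p.get("name"): float(p.get("value"))
                    for p in model.iter("param")}
    raise KeyError(f"no model of type {model_type!r}")

coeffs = get_coefficients(SAMPLE, "steinberg-guinan")
print(coeffs["G0"])   # 69.0
```

A portable text schema of this kind is what lets multiple hydro codes import the same coefficients without sharing binary formats.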

  2. Use of Data Libraries for IAEA Nuclear Security Assessment Methodologies (NUSAM) [section 5.4]

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shull, D.; Lane, M.

    2015-06-23

    Data libraries are essential for the characterization of the facility and provide the documented input that underpins the facility assessment results and subsequent conclusions. Data libraries are historical, verifiable, quantified, and applicable collections of testing data on different types of barriers, sensors, cameras, procedures, and/or personnel. Data libraries are developed and maintained as part of any assessment program or process. Data are collected during the initial stages of facility characterization to aid in the model and/or simulation development process. Data library values may also be developed through the use of state testing centers and/or site resources by testing different types of barriers, sensors, cameras, procedures, and/or personnel. If no data exist, subject matter expert opinion and manufacturers' specifications/testing values can be the basis for initially assigning values, but these are generally less reliable and lack appropriate confidence measures. The use of existing data libraries that have been developed by a state testing organization reduces assessment costs by establishing standard delay, detection, and assessment values for use by multiple sites or facilities where common barrier and alarm systems exist.

  3. The EPRDATA Format: A Dialogue

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hughes, III, Henry Grady

    2015-08-18

    Recently the Los Alamos Nuclear Data Team has communicated certain issues of concern in relation to the new electron/photon/relaxation ACE data format as released in the eprdata12 library. In this document those issues are parsed, analyzed, and answered.

  4. Investigation of activation cross section data of alpha particle induced nuclear reaction on molybdenum up to 40 MeV: Review of production routes of medically relevant 97,103Ru

    NASA Astrophysics Data System (ADS)

    Tárkányi, F.; Hermanne, A.; Ditrói, F.; Takács, S.; Ignatyuk, A.

    2017-05-01

    The main goals of this investigation were to expand and consolidate reliable activation cross-section data for the natMo(α,x) reactions in connection with the production of the medically relevant 97,103Ru and the use of the natMo(α,x)97Ru reaction for monitoring beam parameters. The excitation functions for the formation of the gamma-emitting radionuclides 94Ru, 95Ru, 97Ru, 103Ru, 93mTc, 93gTc(m+), 94mTc, 94gTc, 95mTc, 95gTc, 96gTc(m+), 99mTc, 93mMo, 99Mo(cum), 90Nb(m+) and 88Zr were measured up to 40 MeV alpha-particle energy using the stacked-foil technique and the activation method. Data from our earlier, similar experiments were re-evaluated, resulting in corrections to the previously reported results. Our experimental data were compared with critically analyzed literature data and with the results of model calculations obtained using the ALICE-IPPE, EMPIRE 3.1 (Rivoli) and TALYS codes (TENDL-2011 and TENDL-2015 on-line libraries). Nuclear data for the different production routes of 97Ru and 103Ru are compiled and reviewed.

  5. SP_Ace: a new code to derive stellar parameters and elemental abundances

    NASA Astrophysics Data System (ADS)

    Boeche, C.; Grebel, E. K.

    2016-03-01

    Context. Ongoing and future massive spectroscopic surveys will collect large numbers (10^6-10^7) of stellar spectra that need to be analyzed. Highly automated software is needed to derive stellar parameters and chemical abundances from these spectra. Aims: We developed a new method of estimating the stellar parameters Teff, log g, [M/H], and elemental abundances. This method was implemented in a new code, SP_Ace (Stellar Parameters And Chemical abundances Estimator). This is a highly automated code suitable for analyzing the spectra of large spectroscopic surveys with low or medium spectral resolution (R = 2000-20 000). Methods: After the astrophysical calibration of the oscillator strengths of 4643 absorption lines covering the wavelength ranges 5212-6860 Å and 8400-8924 Å, we constructed a library that contains the equivalent widths (EWs) of these lines for a grid of stellar parameters. The EWs of each line are fit by a polynomial function that describes the EW of the line as a function of the stellar parameters. The coefficients of these polynomial functions are stored in a library called the "GCOG library". SP_Ace, a code written in FORTRAN95, uses the GCOG library to compute the EWs of the lines, constructs models of spectra as a function of the stellar parameters and abundances, and searches for the model that minimizes the χ2 deviation when compared to the observed spectrum. The code has been tested on synthetic and real spectra for a wide range of signal-to-noise ratios and spectral resolutions. Results: SP_Ace derives stellar parameters such as Teff, log g, [M/H], and chemical abundances of up to ten elements for low- to medium-resolution spectra of FGK-type stars with a precision comparable to the one usually obtained with spectra of higher resolution. Systematic errors in stellar parameters and chemical abundances are presented and identified with tests on synthetic and real spectra. 
Stochastic errors are automatically estimated by the code for all the parameters. A simple Web front end of SP_Ace can be found at http://dc.g-vo.org/SP_ACE while the source code will be published soon. Full Tables D.1-D.3 are only available at the CDS via anonymous ftp to http://cdsarc.u-strasbg.fr (ftp://130.79.128.5) or via http://cdsarc.u-strasbg.fr/viz-bin/qcat?J/A+A/587/A2
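
    The GCOG idea, tabulating each line's EW as a polynomial in the stellar parameters, can be sketched with an ordinary least-squares fit. The quadratic basis below is an assumption made for illustration; SP_Ace's actual functional form and coefficients are those defined by the authors' library.

```python
import numpy as np

def fit_ew_poly(params, ews):
    """Fit EW(Teff, logg, [M/H]) with a full quadratic in the three
    parameters; returns the 10 polynomial coefficients."""
    t, g, m = np.asarray(params, dtype=float).T
    design = np.column_stack(
        [np.ones_like(t), t, g, m, t*t, g*g, m*m, t*g, t*m, g*m])
    coef, *_ = np.linalg.lstsq(design, np.asarray(ews, dtype=float), rcond=None)
    return coef

def eval_ew_poly(coef, t, g, m):
    """Evaluate the fitted EW polynomial at one parameter point."""
    basis = np.array([1.0, t, g, m, t*t, g*g, m*m, t*g, t*m, g*m])
    return float(basis @ coef)

# Toy grid of stellar parameters, with EWs drawn from a known (linear) law:
grid = [(t, g, m) for t in (4500.0, 5500.0, 6500.0)
                  for g in (1.0, 3.0, 5.0)
                  for m in (-1.0, 0.0, 0.5)]
ews = [10.0 + 0.001 * t - 2.0 * g + 5.0 * m for t, g, m in grid]
coef = fit_ew_poly(grid, ews)
print(round(eval_ew_poly(coef, 5000.0, 2.0, -0.5), 3))
```

Once every line has such coefficients, a model spectrum for any trial (Teff, log g, [M/H]) costs only polynomial evaluations, which is what makes the χ² search over parameters fast.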

  6. Architectural Optimization of Digital Libraries

    NASA Technical Reports Server (NTRS)

    Biser, Aileen O.

    1998-01-01

    This work investigates performance and scaling issues relevant to large-scale distributed digital libraries. Present performance and scaling studies focus on specific implementations of production or prototype digital libraries. Although these yield useful information for designers and other researchers, the broader issues relevant to very large scale distributed libraries are not addressed; in particular, no current studies look at the extreme or worst-case possibilities in digital library implementations. A survey of digital library research issues is presented. Scaling and performance issues are mentioned frequently in the digital library literature but are generally not the focus of much current research. In this thesis a model for a Generic Distributed Digital Library (GDDL) and nine cases of typical user activities are defined. This model is used to facilitate a basic analysis of scaling issues: specifically, the calculation of the Internet traffic generated for different configurations of the study parameters and an estimate of the future bandwidth needed for a large-scale distributed digital library implementation. This analysis demonstrates the potential impact a future distributed digital library implementation would have on the Internet traffic load and raises questions concerning the architecture decisions being made for future distributed digital library designs.

  7. Validation of light water reactor calculation methods and JEF-1-based data libraries by TRX and BAPL critical experiments

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Paratte, J.M.; Pelloni, S.; Grimm, P.

    1991-04-01

    This paper analyzes the capability of various code systems and JEF-1-based nuclear data libraries to compute light water reactor lattices by comparing calculations with results from thermal reactor benchmark experiments TRX and BAPL and with previously published values. With the JEF-1 evaluation, eigenvalues are generally well predicted within 8 mk (1 mk = 0.001) or less by all code systems, and all methods give reasonable results for the measured reaction rate ratios within, or not too far from, the experimental uncertainty.
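
    The 8 mk figure quoted above is simply a scaled eigenvalue difference. A minimal sketch of the conversion (the function name and sample value are hypothetical):

```python
# Tiny illustration of the reactivity unit used above: 1 mk = 0.001 in k.
def deviation_mk(k_calc, k_exp=1.0):
    """Deviation of a calculated eigenvalue from a reference value, in mk."""
    return (k_calc - k_exp) * 1000.0

# A calculated k-eff of 1.006 against a critical benchmark (k = 1)
# is a 6 mk overprediction, within the 8 mk envelope quoted above.
print(round(deviation_mk(1.006), 3))
```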

  8. Dietary Regulation of PTEN Signaling and Mammary Tumor Initiating Cells: Implications for Breast Cancer Prevention

    DTIC Science & Technology

    2011-01-01

    All rights reserved. For Permissions, please email: journals.permissions@oxfordjournals.org ... nuclear PTEN–p53 cross talk by GEN ...

  9. Multi-registration of software library resources

    DOEpatents

    Archer, Charles J [Rochester, MN; Blocksome, Michael A [Rochester, MN; Ratterman, Joseph D [Rochester, MN; Smith, Brian E [Rochester, MN

    2011-04-05

    Data communications, including issuing, by an application program to a high level data communications library, a request for initialization of a data communications service; issuing to a low level data communications library a request for registration of data communications functions; registering the data communications functions, including instantiating a factory object for each of the one or more data communications functions; issuing by the application program an instruction to execute a designated data communications function; issuing, to the low level data communications library, an instruction to execute the designated data communications function, including passing to the low level data communications library a call parameter that identifies a factory object; creating with the identified factory object the data communications object that implements the data communications function according to the protocol; and executing by the low level data communications library the designated data communications function.
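
    The registration flow described above (factory objects instantiated at registration time, then selected by a call parameter at execution time) can be sketched as follows. This is a hypothetical Python illustration of the pattern only, not the patented implementation; all class and function names are invented:

```python
class BroadcastOp:
    """Data communications object implementing one function/protocol."""
    def execute(self, payload):
        return f"broadcast:{payload}"

class Factory:
    """Creates the object that implements one data communications function."""
    def __init__(self, op_class):
        self.op_class = op_class
    def create(self):
        return self.op_class()

class LowLevelLib:
    def __init__(self):
        self.factories = {}
    def register(self, name, op_class):
        # Registration instantiates one factory object per function.
        self.factories[name] = Factory(op_class)
    def execute(self, factory_id, payload):
        # The call parameter identifies a factory; the factory creates the
        # object implementing the function, which is then executed.
        op = self.factories[factory_id].create()
        return op.execute(payload)

class HighLevelLib:
    """The application talks to this layer; it delegates to the low level."""
    def __init__(self, low):
        self.low = low
    def initialize(self):
        self.low.register("broadcast", BroadcastOp)
    def run(self, name, payload):
        return self.low.execute(name, payload)

low = LowLevelLib()
high = HighLevelLib(low)
high.initialize()                     # registration of functions
print(high.run("broadcast", "data"))  # prints "broadcast:data"
```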

  10. Overview of Particle and Heavy Ion Transport Code System PHITS

    NASA Astrophysics Data System (ADS)

    Sato, Tatsuhiko; Niita, Koji; Matsuda, Norihiro; Hashimoto, Shintaro; Iwamoto, Yosuke; Furuta, Takuya; Noda, Shusaku; Ogawa, Tatsuhiko; Iwase, Hiroshi; Nakashima, Hiroshi; Fukahori, Tokio; Okumura, Keisuke; Kai, Tetsuya; Chiba, Satoshi; Sihver, Lembit

    2014-06-01

    A general purpose Monte Carlo Particle and Heavy Ion Transport code System, PHITS, is being developed through the collaboration of several institutes in Japan and Europe. The Japan Atomic Energy Agency is responsible for managing the entire project. PHITS can deal with the transport of nearly all particles, including neutrons, protons, heavy ions, photons, and electrons, over wide energy ranges using various nuclear reaction models and data libraries. It is written in Fortran and can be executed on almost all computers. All components of PHITS, such as its source, executable, and data-library files, are assembled in one package and distributed to many countries via the Research Organization for Information Science and Technology, the Data Bank of the Organization for Economic Co-operation and Development's Nuclear Energy Agency, and the Radiation Safety Information Computational Center. More than 1,000 researchers have been registered as PHITS users, and they apply the code to various research and development fields such as nuclear technology, accelerator design, medical physics, and cosmic-ray research. This paper briefly summarizes the physics models implemented in PHITS and introduces some important functions useful for specific applications, such as an event generator mode and beam transport functions.

  11. Development of an image analysis screen for estrogen receptor alpha (ERα) ligands through measurement of nuclear translocation dynamics.

    PubMed

    Dull, Angie; Goncharova, Ekaterina; Hager, Gordon; McMahon, James B

    2010-11-01

    We have developed a robust high-content assay to screen for novel estrogen receptor alpha (ERα) agonists and antagonists by quantitation of cytoplasmic-to-nuclear translocation of an estrogen receptor chimera in 384-well plates. The screen utilizes a green fluorescent protein-tagged glucocorticoid/estrogen receptor (GFP-GRER) chimera, which consists of the N-terminus of the glucocorticoid receptor fused to the human ER ligand binding domain. The GFP-GRER exhibited cytoplasmic localization in the absence of ERα ligands and translocated to the nucleus in response to stimulation with ERα agonists or antagonists. The BD Pathway 435 imaging system was used for image acquisition, analysis of translocation dynamics, and cytotoxicity measurements. The assay was validated with known ERα agonists and antagonists and with the Library of Pharmacologically Active Compounds (LOPAC1280). Additionally, screening of crude natural product extracts demonstrated the robustness of the assay and its ability to quantitate the effects of toxicity on nuclear translocation dynamics. The GFP-GRER nuclear translocation assay was very robust, with z' values >0.7 and CVs <5%; it has been validated with known ER ligands, and the inclusion of cytotoxicity filters will facilitate screening of natural product extracts. This assay has been developed for future primary screening of the synthetic, pure natural product, and natural product extract libraries available at the National Cancer Institute at Frederick. Copyright © 2010 Elsevier Ltd. All rights reserved.
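
    The z' values quoted above refer to the standard screening-quality statistic of Zhang et al. (1999), computed from positive- and negative-control wells. A minimal sketch, with hypothetical control-well data:

```python
import statistics

def z_prime(pos, neg):
    """Z'-factor (Zhang et al. 1999): 1 - 3*(sd_pos + sd_neg)/|mean_pos - mean_neg|.
    Values above 0.5 are conventionally considered an excellent assay."""
    mu_p, mu_n = statistics.mean(pos), statistics.mean(neg)
    sd_p, sd_n = statistics.stdev(pos), statistics.stdev(neg)
    return 1 - 3 * (sd_p + sd_n) / abs(mu_p - mu_n)

# Hypothetical control wells (e.g. fraction of nuclear-localized signal).
pos = [0.92, 0.95, 0.90, 0.93, 0.94]  # agonist-treated controls
neg = [0.11, 0.13, 0.10, 0.12, 0.14]  # vehicle-only controls
print(round(z_prime(pos, neg), 2))    # -> 0.87, well above the 0.7 quoted above
```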

  12. Investigating the binding properties of porous drug delivery systems using nuclear sensors (radiotracers) and positron annihilation lifetime spectroscopy--predicting conditions for optimum performance.

    PubMed

    Mume, Eskender; Lynch, Daniel E; Uedono, Akira; Smith, Suzanne V

    2011-06-21

    Understanding how the size, charge, and number of available pores in a porous material influence its uptake and release properties is important for optimising its design and ultimately its application. Unfortunately, there are no standard methods for screening porous materials in solution, and therefore formulations must be developed for each encapsulated agent. This study investigates the potential of a library of radiotracers (nuclear sensors) for assessing the binding properties of hollow silica shell materials. Uptake and release of Cu(2+) and Co(2+) and their respective complexes with polyazacarboxylate macrocycles (dota and teta) and a series of hexaaza cages (diamsar, sarar and bis-(p-aminobenzyl)-diamsar) from the hollow silica shells were monitored using their radioisotopic analogues. The coordination chemistry of the metal (M) species and subtle alterations in the molecular architecture of the ligands (Ligand) and their resultant complexes (M-Ligand) were found to significantly influence their uptake over pH 3 to 9 at room temperature. Positively charged species were selectively and rapidly (within 10 min) absorbed at pH 7 to 9. Negatively charged species were preferentially absorbed at low pH (3 to 5). Rates of release varied for each nuclear sensor, and the time to establish equilibrium varied from minutes to days. The subtle changes in the design of the nuclear sensors proved to be a valuable tool for determining the binding properties of porous materials. The data support the development of a library of nuclear sensors for screening porous materials, both to optimise their design and to enable high-throughput screening of materials.

  13. Role of Nuclear Morphometry in Breast Cancer and its Correlation with Cytomorphological Grading of Breast Cancer: A Study of 64 Cases.

    PubMed

    Kashyap, Anamika; Jain, Manjula; Shukla, Shailaja; Andley, Manoj

    2018-01-01

    Fine needle aspiration cytology (FNAC) is a simple, rapid, inexpensive, and reliable method of diagnosis of a breast mass. Cytoprognostic grading in breast cancers is important to identify high-grade tumors. Computer-assisted image morphometric analysis has been developed to quantitate as well as standardize various grading systems. To apply nuclear morphometry to cytological aspirates of breast cancer and evaluate its correlation with cytomorphological grading, with derivation of suitable cutoff values between the various grades. Descriptive cross-sectional hospital-based study. This study included 64 breast cancer cases (29 of grade 1, 22 of grade 2, and 13 of grade 3). Image analysis was performed on Papanicolaou-stained FNAC slides with the NIS-Elements Advanced Research software (Ver. 4.00). The nuclear morphometric parameters analyzed included 5 nuclear size, 2 shape, 4 texture, and 2 density parameters. Nuclear size parameters showed an increase in values with increasing cytological grade of carcinoma. Nuclear shape parameters were not found to be significantly different between the three grades. Among the nuclear texture parameters, sum intensity and sum brightness were found to differ between the three grades. Nuclear morphometry can be applied to augment the cytology grading of breast cancer and thus help in classifying patients into low- and high-risk groups.

  14. JANIS: NEA JAva-based Nuclear Data Information System

    NASA Astrophysics Data System (ADS)

    Soppera, Nicolas; Bossant, Manuel; Cabellos, Oscar; Dupont, Emmeric; Díez, Carlos J.

    2017-09-01

    JANIS (JAva-based Nuclear Data Information System) software is developed by the OECD Nuclear Energy Agency (NEA) Data Bank to facilitate the visualization and manipulation of nuclear data, giving access to evaluated nuclear data libraries, such as ENDF, JEFF, JENDL, TENDL, etc., and also to experimental nuclear data (EXFOR) and bibliographical references (CINDA). It is available as a standalone Java program, downloadable and distributed on DVD, and also as a web application available on the NEA website. One of the main new features in JANIS is the scripting capability via the command line, which notably automates plot generation and permits automatic extraction of data from the JANIS database. Recent NEA software developments rely on these JANIS features to access nuclear data; for example, the Nuclear Data Sensitivity Tool (NDaST) makes use of covariance data in BOXER and COVERX formats, which are retrieved from the JANIS database. The new features added in this version of the JANIS software are described in this paper, along with some examples.

  15. Multigroup cross section library for GFR2400

    NASA Astrophysics Data System (ADS)

    Čerba, Štefan; Vrban, Branislav; Lüley, Jakub; Haščík, Ján; Nečas, Vladimír

    2017-09-01

    In this paper the development and optimization of the SBJ_E71 multigroup cross section library for GFR2400 applications is discussed. A cross section processing scheme merging Monte Carlo and deterministic codes was developed. Several fine and coarse group structures and two weighting-flux options were analysed through 18 benchmark experiments selected from the ICSBEP handbook on the basis of similarity assessments. The performance of the collapsed version of the SBJ_E71 library was compared with MCNP5 continuous-energy ENDF/B-VII.1 calculations and with the Korean KAFAX-E70 library. The comparison was based on integral parameters of calculations performed on full-core homogeneous models.
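
    Collapsing a fine group structure into coarse groups with a weighting flux follows the standard flux-weighted average. A generic sketch (group boundaries and numbers below are illustrative, not the SBJ_E71 data):

```python
import numpy as np

def collapse(sigma_fine, flux_fine, coarse_edges):
    """Flux-weighted group collapse:
    sigma_G = sum(phi_g * sigma_g) / sum(phi_g) over fine groups g in G."""
    sigma_coarse = []
    for lo, hi in zip(coarse_edges[:-1], coarse_edges[1:]):
        phi = flux_fine[lo:hi]
        sigma_coarse.append(np.sum(phi * sigma_fine[lo:hi]) / np.sum(phi))
    return np.array(sigma_coarse)

sigma = np.array([10.0, 8.0, 6.0, 4.0, 2.0, 1.0])  # fine-group sigma (barns), made up
flux = np.array([1.0, 1.0, 2.0, 2.0, 4.0, 4.0])    # weighting flux, made up
# Collapse 6 fine groups into 3 coarse groups of 2 fine groups each:
# coarse-group sigmas come out as 9.0, 5.0, and 1.5 barns.
print(collapse(sigma, flux, [0, 2, 4, 6]))
```

    The choice of weighting flux matters: the same fine-group data collapsed with two different spectra yield different coarse-group constants, which is why the abstract compares two weighting-flux options against benchmarks.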

  16. Measurement and Analysis of Neutron Leakage Spectra from Pb and LBE Cylinders with D-T Neutrons

    NASA Astrophysics Data System (ADS)

    Chen, Size; Gan, Leting; Li, Taosheng; Han, Yuncheng; Liu, Chao; Jiang, Jieqiong; Wu, Yican

    2017-09-01

    To validate the current evaluated neutron data libraries, neutron leakage spectra from lead and lead-bismuth eutectic (LBE) cylinders have been measured using an intense pulsed D-T neutron source and the time-of-flight (TOF) method at the Institute of Nuclear Energy Safety Technology (INEST), Chinese Academy of Sciences (CAS). The measured leakage spectra have been compared with those calculated using the Super Monte Carlo Simulation Program for Nuclear and Radiation Process (SuperMC) with the evaluated pointwise data of lead and bismuth processed from the ENDF/B-VII.1, JEFF-3.1, and JENDL-4.0 libraries. This work shows that the calculations with all three libraries are generally consistent with the lead experimental result. For the LBE experiment, the JEFF-3.1 and JENDL-4.0 calculations both agree well with the measurement. However, the ENDF/B-VII.1 result fails to fit the measured data, especially in the energy range of 5.5 to 7 MeV, where the difference exceeds 80%. Through a sensitivity analysis with the partial cross sections of 209Bi in ENDF/B-VII.1 and JEFF, the difference between the measurement and the ENDF/B-VII.1 calculation in the LBE experiment is found to be due to the neutron data of 209Bi.

  17. Collection Development Policy: Academic Library, St. Mary's University. Revised.

    ERIC Educational Resources Information Center

    Sylvia, Margaret

    This guide spells out the collection development policy of the library of St. Mary's University in San Antonio, Texas. The guide is divided into the following five topic areas: (1) introduction to the community served, parameters of the collection, cooperation in collection development, and priorities of the collection; (2) considerations in…

  18. Smoothing Forecasting Methods for Academic Library Circulations: An Evaluation and Recommendation.

    ERIC Educational Resources Information Center

    Brooks, Terrence A.; Forys, John W., Jr.

    1986-01-01

    Circulation time-series data from 50 midwestern academic libraries were used to test 110 variants of 8 smoothing forecasting methods. The data and methodologies, with illustrations of the two recommended methods--the single exponential smoothing method and Brown's one-parameter linear exponential smoothing method--are given. Eight references are cited. (EJS)
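
    The two recommended methods have simple closed forms. A sketch using the standard textbook formulas (not necessarily the authors' exact variants), applied to hypothetical monthly circulation counts:

```python
def single_es(series, alpha):
    """Single exponential smoothing: s_t = alpha*x_t + (1-alpha)*s_(t-1).
    The forecast is the last smoothed value, so it lags a trending series."""
    s = series[0]
    for x in series[1:]:
        s = alpha * x + (1 - alpha) * s
    return s

def brown_les(series, alpha, horizon=1):
    """Brown's one-parameter linear exponential smoothing: two smoothing
    passes give a level and a trend, combined into an m-step-ahead forecast."""
    s1 = s2 = series[0]
    for x in series[1:]:
        s1 = alpha * x + (1 - alpha) * s1   # first smoothing pass
        s2 = alpha * s1 + (1 - alpha) * s2  # second smoothing pass
    level = 2 * s1 - s2
    trend = alpha / (1 - alpha) * (s1 - s2)
    return level + trend * horizon

# Hypothetical monthly circulation counts with a steady upward trend.
circ = [100, 104, 108, 112, 116, 120]
print(single_es(circ, 0.5))  # -> 116.125, lags behind the trend
print(brown_les(circ, 0.5))  # -> 123.25, extrapolates the trend
```

    The contrast illustrates why a one-parameter linear method is recommended alongside single smoothing: on trending circulation data, single smoothing systematically underforecasts.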

  19. INTRIGOSS: A new Library of High Resolution Synthetic Spectra

    NASA Astrophysics Data System (ADS)

    Franchini, Mariagrazia; Morossi, Carlo; Di Marcancantonio, Paolo; Chavez, Miguel; GES-Builders

    2018-01-01

    INTRIGOSS (INaf Trieste Grid Of Synthetic Spectra) is a new high-resolution (HiRes) synthetic spectral library designed for studying F, G, and K stars. The library is based on atmosphere models computed with specified individual element abundances via the ATLAS12 code. Normalized SPectra (NSP) and surface Flux SPectra (FSP), in the 4800-5400 Å wavelength range, were computed by means of the SPECTRUM code. The synthetic spectra are computed with an atomic and diatomic molecular line list including "bona fide" Predicted Lines (PLs), built by tuning log gf values to reproduce a very high-S/N solar spectrum and the UVES-U580 spectra of five cool giants extracted from the Gaia-ESO Survey (GES). The astrophysical gf-values were then assessed by using more than 2000 stars with homogeneous and accurate atmospheric parameters and detailed chemical compositions from GES. The validity and greater accuracy of the INTRIGOSS NSPs and FSPs with respect to other available spectral libraries are discussed. INTRIGOSS will be available on the web and will be a valuable tool for both stellar atmospheric parameter and stellar population studies.

  20. Nuclear Data Needs for Generation IV Nuclear Energy Systems

    NASA Astrophysics Data System (ADS)

    Rullhusen, Peter

    2006-04-01

    Nuclear data needs for generation IV systems. Future of nuclear energy and the role of nuclear data / P. Finck. Nuclear data needs for generation IV nuclear energy systems-summary of U.S. workshop / T. A. Taiwo, H. S. Khalil. Nuclear data needs for the assessment of gen. IV systems / G. Rimpault. Nuclear data needs for generation IV-lessons from benchmarks / S. C. van der Marck, A. Hogenbirk, M. C. Duijvestijn. Core design issues of the supercritical water fast reactor / M. Mori ... [et al.]. GFR core neutronics studies at CEA / J. C. Bosq ... [et al.]. Comparative study on different phonon frequency spectra of graphite in GCR / Young-Sik Cho ... [et al.]. Innovative fuel types for minor actinides transmutation / D. Haas, A. Fernandez, J. Somers. The importance of nuclear data in modeling and designing generation IV fast reactors / K. D. Weaver. The GIF and Mexico-"everything is possible" / C. Arrenondo Sánchez -- Benchmarks, sensitivity calculations, uncertainties. Sensitivity of advanced reactor and fuel cycle performance parameters to nuclear data uncertainties / G. Aliberti ... [et al.]. Sensitivity and uncertainty study for thermal molten salt reactors / A. Biduad ... [et al.]. Integral reactor physics benchmarks--the International Criticality Safety Benchmark Evaluation Project (ICSBEP) and the International Reactor Physics Experiment Evaluation Project (IRPhEP) / J. B. Briggs, D. W. Nigg, E. Sartori. Computer model of an error propagation through micro-campaign of fast neutron gas cooled nuclear reactor / E. Ivanov. Combining differential and integral experiments on [symbol] for reducing uncertainties in nuclear data applications / T. Kawano ... [et al.]. Sensitivity of activation cross sections of the Hafnium, Tantalum and Tungsten stable isotopes to nuclear reaction mechanisms / V. Avrigeanu ... [et al.]. Generating covariance data with nuclear models / A. J. Koning. Sensitivity of CANDU-SCWR reactor physics calculations to nuclear data files / K. S. Kozier, G. R. Dyck. The lead cooled fast reactor benchmark BREST-300: analysis with sensitivity method / V. Smirnov ... [et al.]. Sensitivity analysis of neutron cross-sections considered for design and safety studies of LFR and SFR generation IV systems / K. Tucek, J. Carlsson, H. Wider -- Experiments. INL capabilities for nuclear data measurements using the Argonne intense pulsed neutron source facility / J. D. Cole ... [et al.]. Cross-section measurements in the fast neutron energy range / A. Plompen. Recent measurements of neutron capture cross sections for minor actinides by a JNC and Kyoto University group / H. Harada ... [et al.]. Determination of minor actinides fission cross sections by means of transfer reactions / M. Aiche ... [et al.] -- Evaluated data libraries. Nuclear data services from the NEA / H. Henriksson, Y. Rugama. Nuclear databases for energy applications: an IAEA perspective / R. Capote Noy, A. L. Nichols, A. Trkov. Nuclear data evaluation for generation IV / G. Noguère ... [et al.]. Improved evaluations of neutron-induced reactions on americium isotopes / P. Talou ... [et al.]. Using improved ENDF-based nuclear data for CANDU reactor calculations / J. Prodea. A comparative study on graphite-moderated reactors using different evaluated nuclear data / Do Heon Kim ... [et al.].

  1. Evaluation of the DRAGON code for VHTR design analysis.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Taiwo, T. A.; Kim, T. K.; Nuclear Engineering Division

    2006-01-12

    This letter report summarizes three activities that were undertaken in FY 2005 to gather information on the DRAGON code and to perform limited evaluations of the code performance when used in the analysis of Very High Temperature Reactor (VHTR) designs. These activities include: (1) use of the code to model the fuel elements of the helium-cooled and liquid-salt-cooled VHTR designs, with results compared to those from another deterministic lattice code (WIMS8) and a Monte Carlo code (MCNP); (2) a preliminary assessment of the nuclear data library currently used with the code and of the libraries provided by the IAEA WIMS-D4 Library Update Project (WLUP); (3) a DRAGON workshop held to discuss the code capabilities for modeling the VHTR.

  2. Measurements and analysis of leakage neutron spectra from multiple-slab sample assemblies comprising W,U,C, and CH2 with D-T neutron irradiation.

    PubMed

    Luo, F; Han, R; Chen, Z; Nie, Y; Sun, Q; Shi, F; Zhang, S; Tian, G; Song, L; Ruan, X; Ye, M Y

    2018-07-01

    The accelerator driven subcritical system (ADS) is regarded as a safe and clean nuclear power system, which can be used for the transmutation of nuclear waste and the breeding of nuclear fuel. In this study, in order to validate nuclear data and the neutron transport performance of the materials related to ADS, we measured the leakage neutron spectra from multiple-slab sample assemblies using 14.8 MeV D-T neutrons. Two types of assemblies, A-1 (W+U+C+CH2) and A-2 (U+C+CH2), were built up gradually starting with the first wall. The measured spectra were compared with those calculated using the Monte Carlo N-Particle transport code MCNP-4C. A comparison of the results showed that the experimental leakage neutron spectra for both A-1 and A-2 were reproduced well by the three evaluated nuclear data libraries, with discrepancies of less than 15% (A-1) and 12% (A-2), except below 3 MeV. For the 2-cm and 5-cm uranium samples, the CENDL-3.1 calculations exhibited large discrepancies in the energy range of 2-8 MeV and above 13 MeV. Thus, the CENDL-3.1 library for uranium should be reevaluated, especially around this energy range. It is significant that the leakage neutron spectra changed clearly when the latest material layer was added during the building of assemblies A-1 and A-2. Copyright © 2018 Elsevier Ltd. All rights reserved.

  3. Neutronic experiments with fluorine rich compounds at LR-0 reactor

    DOE PAGES

    Losa, Evzen; Kostal, Michal; Czakoj, T.; ...

    2018-06-06

    Here, research on molten salt reactor (MSR) neutronics continues in Research Centre Rez (Czech Republic) with experimental work being conducted using fluoride salt that was originally used in the Molten Salt Reactor Experiment (MSRE). Previous results identified significant variations in the neutron spectrum measured in LiF-NaF salt. These variations could originate from the fluorine description in current nuclear data sets. Subsequent experiments were performed to try to confirm this phenomenon. Therefore, another fluorine-rich compound, Teflon, was used for testing. Critical experiments showed slight discrepancies in C/E-1 for both compounds, Teflon and FLIBE, and a systematic overestimation of criticality was observed in the calculations. Different nuclear data libraries were used for data set testing. For Teflon, the overestimation is higher when using the JENDL-3.3, JENDL-4, and RUSFOND-2010 libraries, all three of which share the same inelastic-to-elastic scattering cross section ratio. Calculations using other libraries (ENDF/B-VII.1, ENDF/B-VII.0, JEFF-3.2, JEFF-3.1, and CENDL-3.1) tend to be closer to the experimental value. Neutron spectrum measurements in both substances revealed structure similar to that seen in previous measurements using LiF-NaF salt, which indicates that the neutron spectrum seems to be strongly shaped by fluorine. Discrepancies between experimental and calculational results seem to be larger in the neutron energy range of 100-1300 keV than at higher energies. In the case of neutron spectrum calculation, none of the tested libraries gives overall better results than the others.

  5. Investigation of Inconsistent ENDF/B-VII.1 Independent and Cumulative Fission Product Yields with Proposed Revisions

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pigni, M.T., E-mail: pignimt@ornl.gov; Francis, M.W.; Gauld, I.C.

    A recent implementation of ENDF/B-VII.1 independent fission product yields and nuclear decay data identified inconsistencies in the data caused by the use of updated nuclear schemes in the decay sub-library that are not reflected in the legacy fission product yield data. Recent changes in the decay data sub-library, particularly the delayed neutron branching fractions, result in calculated fission product concentrations that do not agree with the cumulative fission yields in the library or with experimental measurements. To address these issues, a comprehensive set of independent fission product yields was generated for thermal and fission-spectrum neutron-induced fission of {sup 235,238}U and {sup 239,241}Pu in order to provide a preliminary assessment of the updated fission product yield data consistency. These updated independent fission product yields were utilized in the ORIGEN code to compare the calculated fission product inventories with experimentally measured inventories, with particular attention given to the noble gases. Another important outcome of this work is the development of the fission product yield covariance data necessary for fission product uncertainty quantification. The evaluation methodology combines a sequential Bayesian method, to guarantee consistency between independent and cumulative yields, with the physical constraints on the independent yields. This work was motivated by the need to improve the performance of the ENDF/B-VII.1 library for stable and long-lived fission products. The revised fission product yields and the new covariance data are proposed as a revision to the fission yield data currently in ENDF/B-VII.1.

  6. Methodology comparison for gamma-heating calculations in material-testing reactors

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lemaire, M.; Vaglio-Gaudard, C.; Lyoussi, A.

    2015-07-01

    The Jules Horowitz Reactor (JHR) is a Material-Testing Reactor (MTR) under construction in the south of France at CEA Cadarache (French Alternative Energies and Atomic Energy Commission). It will typically host about 20 simultaneous irradiation experiments in the core and in the beryllium reflector. These experiments will help us better understand the complex phenomena occurring during the accelerated ageing of materials and the irradiation of nuclear fuels. Gamma heating, i.e. photon energy deposition, is mainly responsible for the temperature rise in non-fuelled zones of nuclear reactors, including JHR internal structures and irradiation devices. As temperature is a key parameter for the physical models describing the behavior of materials, accurate control of temperature, and hence gamma heating, is required in irradiation devices and samples in order to perform an advanced, suitable analysis of future experimental results. From a broader point of view, JHR's global attractiveness as an MTR depends on its ability to monitor experimental parameters with high accuracy, including gamma heating. Strict control of temperature levels is also necessary in terms of safety. As JHR structures are warmed up by gamma heating, they must be appropriately cooled down to prevent creep deformation or melting. Cooling-power sizing is based on calculated levels of gamma heating in the JHR. Due to these safety concerns, accurate calculation of gamma heating, with well-controlled bias and an associated uncertainty as low as possible, is all the more important. There are two main kinds of calculation bias: bias coming from nuclear data on the one hand, and bias coming from the physical approximations assumed by computer codes and by the general calculation route on the other hand. The former must be determined by comparison between calculation and experimental data; the latter by calculation comparisons between codes and between methodologies. 
    In this presentation, we focus on this latter kind of bias. Nuclear heating is represented by the physical quantity called absorbed dose (energy deposition induced by particle-matter interactions, divided by mass). Its calculation with Monte Carlo codes is possible but computationally expensive, as it requires the transport simulation of charged particles along with neutrons and photons. For that reason, the calculation of another physical quantity, called KERMA, is often preferred, as KERMA calculation with Monte Carlo codes only requires the transport of neutral particles. However, KERMA is only an estimator of the absorbed dose, and many conditions must be fulfilled for KERMA to be equal to the absorbed dose, including the so-called condition of electronic equilibrium. Also, Monte Carlo computations of absorbed dose still involve some physical approximations, even though there is only a limited number of them. Some of these approximations are linked to the way Monte Carlo codes handle the transport simulation of charged particles and the productive and destructive interactions between photons, electrons, and positrons. There exists a huge variety of electromagnetic shower models which tackle this topic. Differences in the implementation of these models can lead to discrepancies in the calculated values of absorbed dose between different Monte Carlo codes. The order of magnitude of such potential discrepancies should be quantified for JHR gamma-heating calculations. We consequently present a two-pronged plan. In a first phase, we intend to perform compared absorbed dose / KERMA Monte Carlo calculations in the JHR. In this way, we will study the presence or absence of electronic equilibrium in the different JHR structures and experimental devices, and we will give recommendations for the choice of KERMA or absorbed dose when calculating gamma heating in the JHR. 
    In a second phase, we intend to perform compared TRIPOLI4 / MCNP absorbed dose calculations in a simplified JHR-representative geometry. For this comparison, we will use the same nuclear data libraries for both codes (the European library JEFF-3.1.1 and the photon library EPDL97) so as to isolate the effects of the electromagnetic shower models on the absorbed dose calculation. In this way, we hope to get insightful feedback on these models and their implementation in Monte Carlo codes. (authors)

  7. Collection Metadata Solutions for Digital Library Applications

    NASA Technical Reports Server (NTRS)

    Hill, Linda L.; Janee, Greg; Dolin, Ron; Frew, James; Larsgaard, Mary

    1999-01-01

    Within a digital library, collections may range from an ad hoc set of objects that serve a temporary purpose to established library collections intended to persist through time. The objects in these collections vary widely, from library and data center holdings to pointers to real-world objects, such as geographic places, and the various metadata schemas that describe them. The key to integrated use of such a variety of collections in a digital library is collection metadata that represents the inherent and contextual characteristics of a collection. The Alexandria Digital Library (ADL) Project has designed and implemented collection metadata for several purposes: in XML form, the collection metadata "registers" the collection with the user interface client; in HTML form, it is used for user documentation; eventually, it will be used to describe the collection to network search agents; and it is used for internal collection management, including mapping the object metadata attributes to the common search parameters of the system.
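A minimal sketch of the idea in this record, using hypothetical collection and parameter names (not the actual ADL schema): collection metadata can register, for each collection, a mapping from its native object-metadata attributes to the system's common search parameters, letting one query address heterogeneous collections.

```python
# Hypothetical sketch of collection registration; the collection names and
# attribute/parameter names below are invented for illustration only.

COLLECTION_REGISTRY = {
    "gazetteer": {"placeName": "q.text", "footprint": "q.bbox"},
    "map-scans": {"sheetTitle": "q.text", "coverage": "q.bbox"},
}

def translate_query(collection, query):
    """Rewrite a common-parameter query into a collection's native attributes."""
    mapping = COLLECTION_REGISTRY[collection]
    reverse = {common: native for native, common in mapping.items()}
    return {reverse[param]: value for param, value in query.items()}

# One common query, answered against two differently described collections.
native = translate_query("gazetteer", {"q.text": "Santa Barbara"})
```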

  8. Microsatellite primers for red drum (Sciaenops ocellatus)

    USDA-ARS?s Scientific Manuscript database

In this note, we document polymerase-chain-reaction (PCR) primer pairs for 101 nuclear-encoded microsatellites designed and developed from a red drum (Sciaenops ocellatus) genomic library. The 101 microsatellites (GenBank Accession Numbers EU015882-EU015982) were amplified successfully and used to...

  9. Evaluation of the 235 U resonance parameters to fit the standard recommended values

    DOE PAGES

    Leal, Luiz; Noguere, Gilles; Paradela, Carlos; ...

    2017-09-13

A great deal of effort has been dedicated to the revision of the standard values connected with the neutron interaction for some actinides. While standard data compilations have been available for decades, the nuclear data evaluations included in existing nuclear data libraries (ENDF, JEFF, JENDL, etc.) do not follow the standard recommended values. Indeed, the majority of evaluations for major actinides do not conform to the standards whatsoever. In particular, for the n + 235U interaction the only value in agreement with the standard is the thermal fission cross section. We performed a resonance re-evaluation of the n + 235U interaction in order to address the issues regarding standard values in the energy range from 10^-5 eV to 2250 eV. Recently, 235U fission cross-section measurements have been performed at the CERN Neutron Time-of-Flight facility (TOF), known as n_TOF, in the energy range from 0.7 eV to 10 keV. The data were normalized according to the recommended standard of the fission integral in the energy range 7.8 eV to 11 eV. As a result, the n_TOF averaged fission cross sections above 100 eV are in good agreement with the standard recommended values. The n_TOF data were included in the 235U resonance analysis that was performed with the code SAMMY. In addition to the average standard values related to the fission cross section, standard thermal values for fission, capture, and elastic cross sections were also included in the evaluation. Our paper presents the procedure used for re-evaluating the 235U resonance parameters including the recommended standard values as well as new cross section measurements.
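The normalization step described here can be sketched as follows; the energy grid, cross-section values, and standard integral are made-up placeholders, not the n_TOF data or the actual evaluation code:

```python
# Illustrative sketch: scale a measured fission cross-section curve so that
# its integral over 7.8-11 eV matches a recommended standard value, as the
# abstract describes. All numbers below are placeholders.

def trapezoid(x, y):
    """Trapezoidal integral of y(x) over tabulated points."""
    return sum((x[i + 1] - x[i]) * (y[i + 1] + y[i]) / 2.0
               for i in range(len(x) - 1))

def normalize_to_standard(energies, sigma, e_lo, e_hi, standard_integral):
    """Scale sigma so its integral over [e_lo, e_hi] equals standard_integral."""
    pts = [(e, s) for e, s in zip(energies, sigma) if e_lo <= e <= e_hi]
    xs, ys = zip(*pts)
    scale = standard_integral / trapezoid(list(xs), list(ys))
    return [s * scale for s in sigma], scale

# Placeholder data: a flat 100 b cross section tabulated from 7 to 12 eV.
energies = [7.0, 7.8, 9.0, 10.0, 11.0, 12.0]
sigma = [100.0] * len(energies)
normalized, scale = normalize_to_standard(energies, sigma, 7.8, 11.0, 246.4)
```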

  10. Evaluation of the 235 U resonance parameters to fit the standard recommended values

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Leal, Luiz; Noguere, Gilles; Paradela, Carlos

A great deal of effort has been dedicated to the revision of the standard values connected with the neutron interaction for some actinides. While standard data compilations have been available for decades, the nuclear data evaluations included in existing nuclear data libraries (ENDF, JEFF, JENDL, etc.) do not follow the standard recommended values. Indeed, the majority of evaluations for major actinides do not conform to the standards whatsoever. In particular, for the n + 235U interaction the only value in agreement with the standard is the thermal fission cross section. We performed a resonance re-evaluation of the n + 235U interaction in order to address the issues regarding standard values in the energy range from 10^-5 eV to 2250 eV. Recently, 235U fission cross-section measurements have been performed at the CERN Neutron Time-of-Flight facility (TOF), known as n_TOF, in the energy range from 0.7 eV to 10 keV. The data were normalized according to the recommended standard of the fission integral in the energy range 7.8 eV to 11 eV. As a result, the n_TOF averaged fission cross sections above 100 eV are in good agreement with the standard recommended values. The n_TOF data were included in the 235U resonance analysis that was performed with the code SAMMY. In addition to the average standard values related to the fission cross section, standard thermal values for fission, capture, and elastic cross sections were also included in the evaluation. Our paper presents the procedure used for re-evaluating the 235U resonance parameters including the recommended standard values as well as new cross section measurements.

  11. Evaluation of the 235U resonance parameters to fit the standard recommended values

    NASA Astrophysics Data System (ADS)

    Leal, Luiz; Noguere, Gilles; Paradela, Carlos; Durán, Ignacio; Tassan-Got, Laurent; Danon, Yaron; Jandel, Marian

    2017-09-01

A great deal of effort has been dedicated to the revision of the standard values connected with the neutron interaction for some actinides. While standard data compilations have been available for decades, the nuclear data evaluations included in existing nuclear data libraries (ENDF, JEFF, JENDL, etc.) do not follow the standard recommended values. Indeed, the majority of evaluations for major actinides do not conform to the standards whatsoever. In particular, for the n + 235U interaction the only value in agreement with the standard is the thermal fission cross section. A resonance re-evaluation of the n + 235U interaction has been performed to address the issues regarding standard values in the energy range from 10^-5 eV to 2250 eV. Recently, 235U fission cross-section measurements have been performed at the CERN Neutron Time-of-Flight facility (TOF), known as n_TOF, in the energy range from 0.7 eV to 10 keV. The data were normalized according to the recommended standard of the fission integral in the energy range 7.8 eV to 11 eV. As a result, the n_TOF averaged fission cross sections above 100 eV are in good agreement with the standard recommended values. The n_TOF data were included in the 235U resonance analysis that was performed with the code SAMMY. In addition to the average standard values related to the fission cross section, standard thermal values for fission, capture, and elastic cross sections were also included in the evaluation. This paper presents the procedure used for re-evaluating the 235U resonance parameters including the recommended standard values as well as new cross section measurements.

  12. Simulation of Thermal Neutron Transport Processes Directly from the Evaluated Nuclear Data Files

    NASA Astrophysics Data System (ADS)

    Androsenko, P. A.; Malkov, M. R.

The main idea of the method proposed in this paper is to extract the required information for Monte Carlo calculations directly from nuclear data files. The method being developed allows direct use of the data obtained from the libraries and appears to be the most accurate technique. Direct simulation of neutron scattering in the thermal energy range using File 7 of the ENDF-6 format has been achieved in the BRAND code system. The simulation algorithms have been verified using the χ² criterion.
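As an illustration of what sampling "directly from the evaluated data files" involves, the sketch below draws samples from a tabulated distribution (such as a File 7 secondary-energy spectrum) by inverse-CDF lookup. The tabulated values are placeholders, BRAND's actual algorithms are not reproduced here, and the inversion is linear within each bin as an approximation:

```python
import bisect
import random

def build_cdf(x, pdf):
    """Cumulative distribution from a tabulated pdf via the trapezoidal rule."""
    cdf = [0.0]
    for i in range(len(x) - 1):
        cdf.append(cdf[-1] + (x[i + 1] - x[i]) * (pdf[i + 1] + pdf[i]) / 2.0)
    return [c / cdf[-1] for c in cdf]  # normalize so cdf[-1] == 1

def sample(x, cdf, u):
    """Invert the tabulated CDF at random number u in [0, 1), linearly per bin."""
    i = min(bisect.bisect_right(cdf, u) - 1, len(x) - 2)
    span = cdf[i + 1] - cdf[i]
    frac = 0.0 if span == 0.0 else (u - cdf[i]) / span
    return x[i] + frac * (x[i + 1] - x[i])

# Placeholder tabulated distribution (e.g. a secondary-energy pdf, in eV).
x = [0.0, 0.01, 0.025, 0.05, 0.1]
pdf = [0.0, 2.0, 3.0, 1.0, 0.0]
cdf = build_cdf(x, pdf)
random.seed(1)
samples = [sample(x, cdf, random.random()) for _ in range(1000)]
```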

  13. User Guide for the Plotting Software for the Los Alamos National Laboratory Nuclear Weapons Analysis Tools Version 2.0

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cleland, Timothy James

    The Los Alamos National Laboratory Plotting Software for the Nuclear Weapons Analysis Tools is a Java™ application based upon the open source library JFreeChart. The software provides a capability for plotting data on graphs with a rich variety of display options while allowing the viewer interaction via graph manipulation and scaling to best view the data. The graph types include XY plots, Date XY plots, Bar plots and Histogram plots.

  14. It is not Just a Press Conference: The Consequences of Crisis Communication While the World Watches

    DTIC Science & Technology

    2014-03-01

criticized for sluggish action and the release of inaccurate information in the hours and days following the disaster at the Fukushima nuclear power...at 21:23 on the evening of March 11.”10 Two months after the disaster, a nationwide poll showed 81 percent of respondents to the survey said they did...Effective Crisis Communication: Moving from Crisis to Opportunity (Sage Publications, 2010). 10 National Diet Library, The Fukushima Nuclear Accident

  15. Integrated cloud infrastructure of the LIT JINR, PE "NULITS" and INP's Astana branch

    NASA Astrophysics Data System (ADS)

    Mazhitova, Yelena; Balashov, Nikita; Baranov, Aleksandr; Kutovskiy, Nikolay; Semenov, Roman

    2018-04-01

The article describes the distributed cloud infrastructure deployed on the basis of the resources of the Laboratory of Information Technologies of the Joint Institute for Nuclear Research (LIT JINR) and some JINR Member State organizations. It explains the motivation for that work and the approach it is based on, and lists its participants, among which are the private entity "Nazarbayev University Library and IT Services" (PE "NULITS") of the Autonomous Education Organization "Nazarbayev University" (AO NU) and the Astana branch of the Institute of Nuclear Physics (INP).

  16. Library Staff operate a Microfilm Reader at the Lewis Research Center

    NASA Image and Video Library

    1961-04-21

Jean Neidengard and George Mandel operate a Kodak Recordak microfilm reader in the library at the National Aeronautics and Space Administration (NASA) Lewis Research Center. The library was located in the Administration Building until the mid-1960s. It was then moved to the Propulsion Systems Laboratory Office Building. In 2008 the library was moved once again, to the Research Analysis Center. At the time of this photograph, the Lewis library claimed to possess “One of the most complete aero-technical collections in the world.” It was doing a brisk business in the early 1960s. During 1960 alone the library acquired 19,000 new documents and provided 100,000 documents to customers. The library’s eleven-person staff provided reference services, archived technical reports, and supplied periodicals. The staff also included Sam Reiss, a full-time translator who could read 30 languages. He translated technical reports from all over the world for the Lewis research staff. Jean Neidengard oversaw the secret Atomic Energy Commission (AEC) documents in the collection. NASA was partnering with the AEC at the time on the Nuclear Engine for Rocket Vehicle Application (NERVA) program. NASA Lewis was the agency’s lead center in the NERVA program. Neidengard’s husband Bill was the head mechanic in the Propulsion Systems Laboratory. George Mandel led the library staff from 1955 to 1968.

  17. Quantification of aquifer properties with surface nuclear magnetic resonance in the Platte River valley, central Nebraska, using a novel inversion method

    USGS Publications Warehouse

    Irons, Trevor P.; Hobza, Christopher M.; Steele, Gregory V.; Abraham, Jared D.; Cannia, James C.; Woodward, Duane D.

    2012-01-01

    Surface nuclear magnetic resonance, a noninvasive geophysical method, measures a signal directly related to the amount of water in the subsurface. This allows for low-cost quantitative estimates of hydraulic parameters. In practice, however, additional factors influence the signal, complicating interpretation. The U.S. Geological Survey, in cooperation with the Central Platte Natural Resources District, evaluated whether hydraulic parameters derived from surface nuclear magnetic resonance data could provide valuable input into groundwater models used for evaluating water-management practices. Two calibration sites in Dawson County, Nebraska, were chosen based on previous detailed hydrogeologic and geophysical investigations. At both sites, surface nuclear magnetic resonance data were collected, and derived parameters were compared with results from four constant-discharge aquifer tests previously conducted at those same sites. Additionally, borehole electromagnetic-induction flowmeter data were analyzed as a less-expensive surrogate for traditional aquifer tests. Building on recent work, a novel surface nuclear magnetic resonance modeling and inversion method was developed that incorporates electrical conductivity and effects due to magnetic-field inhomogeneities, both of which can have a substantial impact on the data. After comparing surface nuclear magnetic resonance inversions at the two calibration sites, the nuclear magnetic-resonance-derived parameters were compared with previously performed aquifer tests in the Central Platte Natural Resources District. This comparison served as a blind test for the developed method. The nuclear magnetic-resonance-derived aquifer parameters were in agreement with results of aquifer tests where the environmental noise allowed data collection and the aquifer test zones overlapped with the surface nuclear magnetic resonance testing. 
In some cases, the previously performed aquifer tests were not designed to fully characterize the aquifer, and the surface nuclear magnetic resonance was able to provide the missing data. In favorable locations, surface nuclear magnetic resonance can provide valuable noninvasive information about aquifer parameters and should be a useful tool for groundwater managers in Nebraska.

  18. Synthesis of Actinide Materials for the Study of Basic Actinide Science and Rapid Separation of Fission Products

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dorhout, Jacquelyn Marie

This dissertation covers several distinct projects relating to the fields of nuclear forensics and basic actinide science. Post-detonation nuclear forensics, in particular the study of fission products resulting from a nuclear device to determine device attributes and information, often depends on the comparison of fission products to a library of known ratios. The expansion of this library is imperative as technology advances. Rapid separation of fission products from a target material, without the need to dissolve the target, is an important technique to develop to improve the library and to provide a means to develop samples and standards for testing separations. Several materials were studied as a proof of concept that fission products can be extracted from a solid target, including microparticulate (< 10 μm diameter) dUO2, porous metal-organic frameworks (MOFs) synthesized from depleted uranium (dU), and other organic-based frameworks containing dU. The targets were irradiated with fast neutrons from one of two different neutron sources, contacted with dilute acids to facilitate the separation of fission products, and analyzed via gamma spectroscopy for separation yields. The results indicate that smaller particle sizes of dUO2 in contact with the secondary matrix KBr yield higher separation yields than particles without a secondary matrix. It was also discovered that using 0.1 M HNO3 as a contact acid leads to dissolution of the target material; lower concentrations of acid were used for subsequent experiments. In the case of the MOFs, a larger pore size in the framework leads to higher separation yields when contacted with 0.01 M HNO3. Different types of frameworks also yield different results.

  19. Measuring and Validating Neutron Capture Cross Sections Using a Lead Slowing-Down Spectrometer

    NASA Astrophysics Data System (ADS)

    Thompson, Nicholas

    Accurate nuclear data is essential for the modeling, design, and operation of nuclear systems. In this work, the Rensselaer Polytechnic Institute (RPI) Lead Slowing-Down Spectrometer (LSDS) at the Gaerttner Linear Accelerator Center (LINAC) was used to measure neutron capture cross sections and validate capture cross sections in cross section libraries. The RPI LINAC was used to create a fast burst of neutrons in the center of the LSDS, a large cube of high purity lead. A sample and YAP:Ce scintillator were placed in the LSDS, and as neutrons lost energy through scattering interactions with the lead, the scintillator detected capture gammas resulting from neutron capture events in the sample. Samples of silver, gold, cobalt, iron, indium, molybdenum, niobium, nickel, tin, tantalum, and zirconium were measured. Data was collected as a function of time after neutron pulse, or slowing-down time, which is correlated to average neutron energy. An analog and a digital data acquisition system collected data simultaneously, allowing for collection of pulse shape information as well as timing. Collection of digital data allowed for pulse shape analysis after the experiment. This data was then analyzed and compared to Monte Carlo simulations to validate the accuracy of neutron capture cross section libraries. These measurements represent the first time that neutron capture cross sections have been measured using an LSDS in the United States, and the first time tools such as coincidence measurements and pulse height weighting have been applied to measurements of neutron capture cross sections using an LSDS. Significant differences between measurement results and simulation results were found in multiple materials, and some errors in nuclear data libraries have already been identified due to these measurements.
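The correlation between slowing-down time and average neutron energy mentioned above is conventionally written E ≈ K/(t + t0)², where K and t0 are facility-specific calibration constants. The sketch below uses typical order-of-magnitude placeholder values, not the RPI LSDS calibration:

```python
# Illustrative lead slowing-down-time/energy relation E = K / (t + t0)^2.
# K and t0 below are assumed placeholders, not a measured calibration.

K_KEV_US2 = 165.0   # keV * us^2 (placeholder calibration constant)
T0_US = 0.3         # us (placeholder time offset)

def mean_energy_kev(t_us):
    """Mean neutron energy (keV) at slowing-down time t (microseconds)."""
    return K_KEV_US2 / (t_us + T0_US) ** 2

for t in (1.0, 10.0, 100.0):
    print(f"t = {t:6.1f} us  ->  E ~ {mean_energy_kev(t):.4g} keV")
```

The inverse-square form reflects the nearly constant lethargy gain per collision as neutrons scatter in lead, so later detection times correspond to progressively lower mean energies.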

  20. Role of Nuclear Morphometry in Breast Cancer and its Correlation with Cytomorphological Grading of Breast Cancer: A Study of 64 Cases

    PubMed Central

    Kashyap, Anamika; Jain, Manjula; Shukla, Shailaja; Andley, Manoj

    2018-01-01

Background: Fine needle aspiration cytology (FNAC) is a simple, rapid, inexpensive, and reliable method of diagnosis of breast mass. Cytoprognostic grading in breast cancers is important to identify high-grade tumors. Computer-assisted image morphometric analysis has been developed to quantitate as well as standardize various grading systems. Aims: To apply nuclear morphometry on cytological aspirates of breast cancer and evaluate its correlation with cytomorphological grading with derivation of suitable cutoff values between various grades. Settings and Designs: Descriptive cross-sectional hospital-based study. Materials and Methods: This study included 64 breast cancer cases (29 of grade 1, 22 of grade 2, and 13 of grade 3). Image analysis was performed on Papanicolaou-stained FNAC slides by NIS-Elements Advanced Research software (Ver 4.00). Nuclear morphometric parameters analyzed included 5 nuclear size, 2 shape, 4 texture, and 2 density parameters. Results: Nuclear size parameters showed an increase in values with increasing cytological grades of carcinoma. Nuclear shape parameters were not found to be significantly different between the three grades. Among nuclear texture parameters, sum intensity and sum brightness were found to differ between the three grades. Conclusion: Nuclear morphometry can be applied to augment the cytology grading of breast cancer and thus help in classifying patients into low- and high-risk groups. PMID:29403169

  1. A Method of Predicting Queuing at Library Online PCs

    ERIC Educational Resources Information Center

    Beranek, Lea G.

    2006-01-01

    On-campus networked personal computer (PC) usage at La Trobe University Library was surveyed during September 2005. The survey's objectives were to confirm peak usage times, to measure some of the relevant parameters of online PC usage, and to determine the effect that 24 new networked PCs had on service quality. The survey found that clients…

  2. A new stellar spectrum interpolation algorithm and its application to Yunnan-III evolutionary population synthesis models

    NASA Astrophysics Data System (ADS)

    Cheng, Liantao; Zhang, Fenghui; Kang, Xiaoyu; Wang, Lang

    2018-05-01

    In evolutionary population synthesis (EPS) models, we need to convert stellar evolutionary parameters into spectra via interpolation in a stellar spectral library. For theoretical stellar spectral libraries, the spectrum grid is homogeneous on the effective-temperature and gravity plane for a given metallicity. It is relatively easy to derive stellar spectra. For empirical stellar spectral libraries, stellar parameters are irregularly distributed and the interpolation algorithm is relatively complicated. In those EPS models that use empirical stellar spectral libraries, different algorithms are used and the codes are often not released. Moreover, these algorithms are often complicated. In this work, based on a radial basis function (RBF) network, we present a new spectrum interpolation algorithm and its code. Compared with the other interpolation algorithms that are used in EPS models, it can be easily understood and is highly efficient in terms of computation. The code is written in MATLAB scripts and can be used on any computer system. Using it, we can obtain the interpolated spectra from a library or a combination of libraries. We apply this algorithm to several stellar spectral libraries (such as MILES, ELODIE-3.1 and STELIB-3.2) and give the integrated spectral energy distributions (ISEDs) of stellar populations (with ages from 1 Myr to 14 Gyr) by combining them with Yunnan-III isochrones. Our results show that the differences caused by the adoption of different EPS model components are less than 0.2 dex. All data about the stellar population ISEDs in this work and the RBF spectrum interpolation code can be obtained by request from the first author or downloaded from http://www1.ynao.ac.cn/˜zhangfh.
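The authors' MATLAB code is not reproduced here; as a minimal illustration of RBF interpolation over irregularly distributed stellar parameters, the pure-Python sketch below fits Gaussian RBF weights to a handful of made-up (Teff, log g) points and interpolates a scalar standing in for the flux at one wavelength:

```python
import math

def gaussian_rbf(r, eps=1.0):
    """Gaussian radial basis function."""
    return math.exp(-(eps * r) ** 2)

def dist(p, q):
    """Euclidean distance in the (normalized) parameter plane."""
    return math.hypot(p[0] - q[0], p[1] - q[1])

def solve(A, b):
    """Gaussian elimination with partial pivoting (small dense systems)."""
    n = len(A)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(col + 1, n):
            f = M[r][col] / M[col][col]
            for c in range(col, n + 1):
                M[r][c] -= f * M[col][c]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][c] * x[c] for c in range(r + 1, n))) / M[r][r]
    return x

def rbf_fit(points, values, eps=1.0):
    """Solve for RBF weights so the interpolant passes through all points."""
    A = [[gaussian_rbf(dist(p, q), eps) for q in points] for p in points]
    return solve(A, values)

def rbf_eval(points, weights, x, eps=1.0):
    return sum(w * gaussian_rbf(dist(x, p), eps) for p, w in zip(points, weights))

# Placeholder "library": normalized (Teff, log g) coordinates of four library
# stars and a made-up flux value for each.
stars = [(0.0, 0.0), (1.0, 0.0), (0.0, 1.0), (1.0, 1.0)]
flux = [1.0, 2.0, 1.5, 2.5]
w = rbf_fit(stars, flux)
interp = rbf_eval(stars, w, (0.5, 0.5))  # flux estimate for a star between the four
```

By construction the interpolant reproduces the library values exactly at the library points, which is the property that matters when converting evolutionary-track parameters to spectra.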

  3. Associations and World Issues.

    ERIC Educational Resources Information Center

    Harris, Virginia

    1985-01-01

    This article reviews activities of selected associations in dealing with three issues of major interest and controversy in the 1980s--foreign policy and defense, the economy, and energy and nuclear power. Important publications in areas of interest to library users and methods of acquisition are noted. (EJS)

  4. Validation of MCNP6 Version 1.0 with the ENDF/B-VII.1 Cross Section Library for Plutonium Metals, Oxides, and Solutions on the High Performance Computing Platform Moonlight

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chapman, Bryan Scott; Gough, Sean T.

    This report documents a validation of the MCNP6 Version 1.0 computer code on the high performance computing platform Moonlight, for operations at Los Alamos National Laboratory (LANL) that involve plutonium metals, oxides, and solutions. The validation is conducted using the ENDF/B-VII.1 continuous energy group cross section library at room temperature. The results are for use by nuclear criticality safety personnel in performing analysis and evaluation of various facility activities involving plutonium materials.

  5. Prediction of the Reactor Antineutrino Flux for the Double Chooz Experiment

    NASA Astrophysics Data System (ADS)

    Jones, Christopher LaDon

This thesis benchmarks the deterministic lattice code, DRAGON, against data, and then applies this code to make a prediction for the antineutrino flux from the Chooz B1 and B2 reactors. Data from the destructive assay of rods from the Takahama-3 reactor and from the SONGS antineutrino detector are used for comparisons. The resulting prediction from the tuned DRAGON code is then compared to the first antineutrino event spectra from Double Chooz. Use of this simulation in nuclear nonproliferation studies is discussed. (Copies available exclusively from MIT Libraries, libraries.mit.edu/docs - docs@mit.edu)

  6. Multifractal Characterization of Geologic Noise for Improved UXO Detection and Discrimination

    DTIC Science & Technology

    2008-03-01

12 Recovery of the Universal Multifractal Parameters ...dipole-model to each magnetic anomaly and compares the extracted model parameters with a library of UXO items. They found that remnant magnetization...the survey parameters, and the geologic environment. In this pilot study we have focused on the multifractal representation of natural variations

  7. The relative pose estimation of aircraft based on contour model

    NASA Astrophysics Data System (ADS)

    Fu, Tai; Sun, Xiangyi

    2017-02-01

    This paper proposes a relative pose estimation approach based on object contour model. The first step is to obtain a two-dimensional (2D) projection of three-dimensional (3D)-model-based target, which will be divided into 40 forms by clustering and LDA analysis. Then we proceed by extracting the target contour in each image and computing their Pseudo-Zernike Moments (PZM), thus a model library is constructed in an offline mode. Next, we spot a projection contour that resembles the target silhouette most in the present image from the model library with reference of PZM; then similarity transformation parameters are generated as the shape context is applied to match the silhouette sampling location, from which the identification parameters of target can be further derived. Identification parameters are converted to relative pose parameters, in the premise that these values are the initial result calculated via iterative refinement algorithm, as the relative pose parameter is in the neighborhood of actual ones. At last, Distance Image Iterative Least Squares (DI-ILS) is employed to acquire the ultimate relative pose parameters.
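A sketch of the library-lookup step described above, with hypothetical feature vectors standing in for the Pseudo-Zernike moments of the 40 pre-rendered projection contours (computing actual PZMs is omitted):

```python
import math

# Hypothetical feature library: in the real pipeline each entry would be the
# Pseudo-Zernike moment vector of one 2D projection of the 3D aircraft model.
model_library = {
    "view_00": [0.90, 0.10, 0.30],
    "view_01": [0.20, 0.80, 0.40],
    "view_02": [0.55, 0.55, 0.10],
}

def nearest_view(features, library):
    """Return the library view whose feature vector is closest (Euclidean)."""
    return min(library.items(), key=lambda kv: math.dist(features, kv[1]))[0]

observed = [0.88, 0.12, 0.28]   # features of the contour in the current image
best = nearest_view(observed, model_library)
```

The matched view then seeds the similarity transformation and iterative refinement stages, which assume the relative pose parameters start in the neighborhood of the true ones.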

  8. NMRbot: Python scripts enable high-throughput data collection on current Bruker BioSpin NMR spectrometers.

    PubMed

    Clos, Lawrence J; Jofre, M Fransisca; Ellinger, James J; Westler, William M; Markley, John L

    2013-06-01

To facilitate the high-throughput acquisition of nuclear magnetic resonance (NMR) experimental data on large sets of samples, we have developed a simple and straightforward automated methodology that capitalizes on recent advances in Bruker BioSpin NMR spectrometer hardware and software. Given the daunting challenge for non-NMR experts to collect quality spectra, our goal was to increase user accessibility, provide customized functionality, and improve the consistency and reliability of resultant data. This methodology, NMRbot, is encoded in a set of scripts written in the Python programming language accessible within the Bruker BioSpin TopSpin™ software. NMRbot improves automated data acquisition and offers novel tools for use in optimizing experimental parameters on the fly. This automated procedure has been successfully implemented for investigations in metabolomics, small-molecule library profiling, and protein-ligand titrations on four Bruker BioSpin NMR spectrometers at the National Magnetic Resonance Facility at Madison. The investigators reported benefits from ease of setup, improved spectral quality, convenient customizations, and overall time savings.

  9. Measurement of the intensity ratio of Auger and conversion electrons for the electron capture decay of 125I.

    PubMed

    Alotiby, M; Greguric, I; Kibédi, T; Lee, B Q; Roberts, M; Stuchbery, A E; Tee, Pi; Tornyi, T; Vos, M

    2018-03-21

Auger electrons emitted after nuclear decay have potential application in targeted cancer therapy. For this purpose it is important to know the Auger electron yield per nuclear decay. In this work we describe a measurement of the ratio of the number of conversion electrons (emitted as part of the nuclear decay process) to the number of Auger electrons (emitted as part of the atomic relaxation process after the nuclear decay) for the case of 125I. Results are compared with Monte-Carlo type simulations of the relaxation cascade using the BrIccEmis code. Our results indicate that for 125I the calculations based on rates from the Evaluated Atomic Data Library underestimate the K Auger yields by 20%.

  10. Measurement of the intensity ratio of Auger and conversion electrons for the electron capture decay of 125I

    NASA Astrophysics Data System (ADS)

    Alotiby, M.; Greguric, I.; Kibédi, T.; Lee, B. Q.; Roberts, M.; Stuchbery, A. E.; Tee, Pi; Tornyi, T.; Vos, M.

    2018-03-01

    Auger electrons emitted after nuclear decay have potential application in targeted cancer therapy. For this purpose it is important to know the Auger electron yield per nuclear decay. In this work we describe a measurement of the ratio of the number of conversion electrons (emitted as part of the nuclear decay process) to the number of Auger electrons (emitted as part of the atomic relaxation process after the nuclear decay) for the case of 125I. Results are compared with Monte-Carlo type simulations of the relaxation cascade using the BrIccEmis code. Our results indicate that for 125I the calculations based on rates from the Evaluated Atomic Data Library underestimate the K Auger yields by 20%.

  11. Transcriptome Analyses of Mosaic (MSC) Mitochondrial Mutants of Cucumber in a Highly Inbred Nuclear Background

    PubMed Central

    Mróz, Tomasz L.; Eves-van den Akker, Sebastian; Bernat, Agata; Skarzyńska, Agnieszka; Pryszcz, Leszek; Olberg, Madeline; Havey, Michael J.; Bartoszewski, Grzegorz

    2018-01-01

    Cucumber (Cucumis sativus L.) has a large, paternally transmitted mitochondrial genome. Cucumber plants regenerated from cell cultures occasionally show paternally transmitted mosaic (MSC) phenotypes, characterized by slower growth, chlorotic patterns on the leaves and fruit, lower fertility, and rearrangements in their mitochondrial DNAs (mtDNAs). MSC lines 3, 12, and 16 originated from different cell cultures all established using the highly inbred, wild-type line B. These MSC lines possess different rearrangements and under-represented regions in their mtDNAs. We completed RNA-seq on normalized and non-normalized cDNA libraries from MSC3, MSC12, and MSC16 to study their nuclear gene-expression profiles relative to inbred B. Results from both libraries indicated that gene expression in MSC12 and MSC16 were more similar to each other than MSC3. Forty-one differentially expressed genes (DEGs) were upregulated and one downregulated in the MSC lines relative to B. Gene functional classifications revealed that more than half of these DEGs are associated with stress-response pathways. Consistent with this observation, we detected elevated levels of hydrogen peroxide throughout leaf tissue in all MSC lines compared to wild-type line B. These results demonstrate that independently produced MSC lines with different mitochondrial polymorphisms show unique and shared nuclear responses. This study revealed genes associated with stress response that could become selection targets to develop cucumber cultivars with increased stress tolerance, and further support of cucumber as a model plant to study nuclear-mitochondrial interactions. PMID:29330162

  12. Developments in capture-γ libraries for nonproliferation applications

    NASA Astrophysics Data System (ADS)

    Hurst, A. M.; Firestone, R. B.; Sleaford, B. W.; Bleuel, D. L.; Basunia, M. S.; Bečvář, F.; Belgya, T.; Bernstein, L. A.; Carroll, J. J.; Detwiler, B.; Escher, J. E.; Genreith, C.; Goldblum, B. L.; Krtička, M.; Lerch, A. G.; Matters, D. A.; McClory, J. W.; McHale, S. R.; Révay, Zs.; Szentmiklosi, L.; Turkoglu, D.; Ureche, A.; Vujic, J.

    2017-09-01

    The neutron-capture reaction is fundamental for identifying and analyzing the γ-ray spectrum from an unknown assembly because it provides unambiguous information on the neutron-absorbing isotopes. Nondestructive-assay applications may exploit this phenomenon passively, for example, in the presence of spontaneous-fission neutrons, or actively, where an external neutron source is used as a probe. There are known gaps in the Evaluated Nuclear Data File libraries corresponding to neutron-capture γ-ray data that otherwise limit transport-modeling applications. In this work, we describe how new thermal neutron-capture data are being used to improve information in the neutron-data libraries for isotopes relevant to nonproliferation applications. We address this problem by providing new experimentally deduced partial and total neutron-capture reaction cross sections and then evaluating these data by comparison with statistical-model calculations.
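The evaluation step described above combines measured partial γ-ray cross sections with a modeled contribution for unobserved transitions. A minimal sketch of that combination, with hypothetical function and parameter names (the actual procedure in the cited work involves decay-scheme balance and statistical-model simulation of the quasi-continuum):

```python
# Hedged sketch: estimate a total thermal capture cross section from measured
# partial gamma-ray cross sections for transitions feeding the ground state,
# plus a statistical-model estimate of the unobserved (quasi-continuum)
# feeding. All names and values are illustrative, not from the evaluation.
def total_capture_cross_section(measured_gs_partials_b, modeled_unobserved_b):
    """Sum experimental partials (barns) and add the modeled contribution."""
    experimental = sum(measured_gs_partials_b)
    return experimental + modeled_unobserved_b
```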

  13. An Integrated Nonlinear Analysis (INA) library for solar system plasma turbulence

    NASA Astrophysics Data System (ADS)

    Munteanu, Costel; Kovacs, Peter; Echim, Marius; Koppan, Andras

    2014-05-01

    We present an integrated software library dedicated to the analysis of time series recorded in space, adapted to investigate turbulence, intermittency and multifractals. The library is written in MATLAB and provides a graphical user interface (GUI) customized for the analysis of space physics data available online, such as: Coordinated Data Analysis Web (CDAWeb), Automated Multi Dataset Analysis system (AMDA), Planetary Science Archive (PSA), World Data Center Kyoto (WDC), Ulysses Final Archive (UFA) and Cluster Active Archive (CAA). Three main modules are already implemented in INA: the Power Spectral Density (PSD) Analysis, the Wavelet and Intermittency Analysis, and the Probability Density Functions (PDF) Analysis. The layered structure of the software allows the user to easily switch between different modules/methods while retaining the same time interval for the analysis. The wavelet analysis module includes algorithms to compute and analyse the PSD, the Scalogram, the Local Intermittency Measure (LIM) or the Flatness parameter. The PDF analysis module includes algorithms for computing the PDFs for a range of scales and parameters fully customizable by the user; it also computes the Flatness parameter and enables fast comparison with standard PDF profiles like, for instance, the Gaussian PDF. The library has already been tested on Cluster and Venus Express data and we will show relevant examples. Research supported by the European Community's Seventh Framework Programme (FP7/2007-2013) under grant agreement no 313038/STORM, and a grant of the Romanian Ministry of National Education, CNCS UEFISCDI, project number PN-II-ID PCE-2012-4-0418.

  14. Study of nuclear morphometry on cytology specimens of benign and malignant breast lesions: A study of 122 cases

    PubMed Central

    Kashyap, Anamika; Jain, Manjula; Shukla, Shailaja; Andley, Manoj

    2017-01-01

    Background: Breast cancer has emerged as a leading site of cancer among women in India. Fine needle aspiration cytology (FNAC) has been routinely applied in the assessment of breast lesions. Cytological evaluation of breast lesions is subjective, with a “gray zone” of 6.9–20%. Quantitative evaluation of nuclear size, shape, texture, and density parameters by morphometry can be of diagnostic help in breast tumors. Aims: To apply nuclear morphometry to cytological breast aspirates and assess its role in differentiating between benign and malignant breast lesions, with derivation of suitable cut-off values between the two groups. Settings and Designs: The present study was a descriptive cross-sectional hospital-based study of nuclear morphometric parameters of benign and malignant cases. Materials and Methods: The study included 50 benign breast disease (BBD), 8 atypical ductal hyperplasia (ADH), and 64 carcinoma cases. Image analysis was performed on Papanicolaou-stained FNAC slides with Nikon Imaging Software (NIS)–Elements Advanced Research software (Version 4.00). Nuclear morphometric parameters analyzed included 5 nuclear size, 2 shape, 4 texture, and 2 density parameters. Results: Nuclear morphometry could differentiate between benign and malignant aspirates, with nuclear size parameters increasing gradually from BBD to ADH to carcinoma. Cut-off values of 31.93 μm², 6.325 μm, 5.865 μm, 7.855 μm, and 21.55 μm for mean nuclear area, equivalent diameter, minimum feret, maximum feret, and perimeter, respectively, were derived between benign and malignant cases, which could correctly classify 7 out of 8 ADH cases. Conclusion: Nuclear morphometry is a highly objective tool that could be used to supplement FNAC in differentiating benign from malignant lesions, with an important role in cases with diagnostic dilemma. PMID:28182052

  15. Study of nuclear morphometry on cytology specimens of benign and malignant breast lesions: A study of 122 cases.

    PubMed

    Kashyap, Anamika; Jain, Manjula; Shukla, Shailaja; Andley, Manoj

    2017-01-01

    Breast cancer has emerged as a leading site of cancer among women in India. Fine needle aspiration cytology (FNAC) has been routinely applied in the assessment of breast lesions. Cytological evaluation of breast lesions is subjective, with a "gray zone" of 6.9-20%. Quantitative evaluation of nuclear size, shape, texture, and density parameters by morphometry can be of diagnostic help in breast tumors. To apply nuclear morphometry to cytological breast aspirates and assess its role in differentiating between benign and malignant breast lesions, with derivation of suitable cut-off values between the two groups. The present study was a descriptive cross-sectional hospital-based study of nuclear morphometric parameters of benign and malignant cases. The study included 50 benign breast disease (BBD), 8 atypical ductal hyperplasia (ADH), and 64 carcinoma cases. Image analysis was performed on Papanicolaou-stained FNAC slides with Nikon Imaging Software (NIS)-Elements Advanced Research software (Version 4.00). Nuclear morphometric parameters analyzed included 5 nuclear size, 2 shape, 4 texture, and 2 density parameters. Nuclear morphometry could differentiate between benign and malignant aspirates, with nuclear size parameters increasing gradually from BBD to ADH to carcinoma. Cut-off values of 31.93 μm², 6.325 μm, 5.865 μm, 7.855 μm, and 21.55 μm for mean nuclear area, equivalent diameter, minimum feret, maximum feret, and perimeter, respectively, were derived between benign and malignant cases, which could correctly classify 7 out of 8 ADH cases. Nuclear morphometry is a highly objective tool that could be used to supplement FNAC in differentiating benign from malignant lesions, with an important role in cases with diagnostic dilemma.
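As a concrete reading of the cut-off values quoted above, a simple rule could flag which size parameters of a given nucleus exceed the derived benign/malignant thresholds. This is only an illustrative sketch; the function name and dictionary keys are hypothetical, not from the study:

```python
# Cut-off values reported in the abstract (size parameters, micrometres;
# mean nuclear area in square micrometres). Keys are illustrative.
CUTOFFS_UM = {
    "mean_nuclear_area": 31.93,
    "equivalent_diameter": 6.325,
    "minimum_feret": 5.865,
    "maximum_feret": 7.855,
    "perimeter": 21.55,
}

def exceeded_cutoffs(measurements):
    """Return, sorted, the size parameters whose values exceed the cut-offs."""
    return sorted(k for k, cut in CUTOFFS_UM.items()
                  if measurements.get(k, 0.0) > cut)
```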

  16. FIER: Software for analytical modeling of delayed gamma-ray spectra

    NASA Astrophysics Data System (ADS)

    Matthews, E. F.; Goldblum, B. L.; Bernstein, L. A.; Quiter, B. J.; Brown, J. A.; Younes, W.; Burke, J. T.; Padgett, S. W.; Ressler, J. J.; Tonchev, A. P.

    2018-05-01

    A new software package, the Fission Induced Electromagnetic Response (FIER) code, has been developed to analytically predict delayed γ-ray spectra following fission. FIER uses evaluated nuclear data and solutions to the Bateman equations to calculate the time-dependent populations of fission products and their decay daughters resulting from irradiation of a fissionable isotope. These populations are then used in the calculation of γ-ray emission rates to obtain the corresponding delayed γ-ray spectra. FIER output was compared to experimental data obtained by irradiation of a 235U sample in the Godiva critical assembly. This investigation illuminated discrepancies in the input nuclear data libraries, showcasing the usefulness of FIER as a tool to address nuclear data deficiencies through comparison with experimental data. FIER provides traceability between γ-ray emissions and their contributing nuclear species, decay chains, and parent fission fragments, yielding a new capability for the nuclear science community.
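The Bateman equations mentioned above have closed-form solutions for linear decay chains; a minimal sketch for a two-member parent-to-daughter chain (distinct decay constants, illustrative values) shows the kind of analytic population calculation FIER builds on:

```python
import math

# Hedged sketch of the analytic Bateman solution for a chain 1 -> 2:
#   dN1/dt = -lam1*N1,   dN2/dt = lam1*N1 - lam2*N2
# with lam1 != lam2. Names and numbers are illustrative, not FIER's API.
def bateman_two_member(n1_0, lam1, lam2, t):
    """Return (N1(t), N2(t)) given initial parent population n1_0."""
    n1 = n1_0 * math.exp(-lam1 * t)
    n2 = n1_0 * lam1 / (lam2 - lam1) * (math.exp(-lam1 * t) - math.exp(-lam2 * t))
    return n1, n2
```

The daughter population N2(t) then multiplies γ-ray emission probabilities to give delayed γ-ray emission rates.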

  17. IAEA Nuclear Data Section: provision of atomic and nuclear databases for user applications.

    PubMed

    Humbert, Denis P; Nichols, Alan L; Schwerer, Otto

    2004-01-01

    The Nuclear Data Section (NDS) of the International Atomic Energy Agency (IAEA) provides a wide range of atomic and nuclear data services to scientists worldwide, with particular emphasis placed on the needs of developing countries. Highly focused Co-ordinated Research Projects and multinational data networks are sponsored under the auspices of the IAEA for the development and assembly of databases through the organised participation of specialists from Member States. More than 100 data libraries are readily available cost-free through the Internet, CD-ROM and other media. These databases are used in a wide range of applications, including fission and fusion energy, non-energy applications and basic research studies. Further information concerning the various services can be found through the website of the IAEA Nuclear Data Section and a mirror site at IPEN, Brazil, that is maintained by NDS staff.

  18. Using Partial Genomic Fosmid Libraries for Sequencing Complete Organellar Genomes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    McNeal, Joel R.; Leebens-Mack, James H.; Arumuganathan, K.

    2005-08-26

    Organellar genome sequences provide numerous phylogenetic markers and yield insight into organellar function and molecular evolution. These genomes are much smaller in size than their nuclear counterparts; thus, their complete sequencing is much less expensive than total nuclear genome sequencing, making broader phylogenetic sampling feasible. However, for some organisms it is challenging to isolate plastid DNA for sequencing using standard methods. To overcome these difficulties, we constructed partial genomic libraries from total DNA preparations of two heterotrophic and two autotrophic angiosperm species using fosmid vectors. We then used macroarray screening to isolate clones containing large fragments of plastid DNA. A minimum tiling path of clones comprising the entire genome sequence of each plastid was selected, and these clones were shotgun-sequenced and assembled into complete genomes. Although this method worked well for both heterotrophic and autotrophic plants, nuclear genome size had a dramatic effect on the proportion of screened clones containing plastid DNA and, consequently, the overall number of clones that must be screened to ensure full plastid genome coverage. This technique makes it possible to determine complete plastid genome sequences for organisms that defy other available organellar genome sequencing methods, especially those for which limited amounts of tissue are available.

  19. Planetary Image Geometry Library

    NASA Technical Reports Server (NTRS)

    Deen, Robert C.; Pariser, Oleg

    2010-01-01

    The Planetary Image Geometry (PIG) library is a multi-mission library used for projecting images (EDRs, or Experiment Data Records) and managing their geometry for in-situ missions. A collection of models describes cameras and their articulation, allowing application programs such as mosaickers, terrain generators, and pointing correction tools to be written in a multi-mission manner, without any knowledge of parameters specific to the supported missions. Camera model objects allow transformation of image coordinates to and from view vectors in XYZ space. Pointing models, specific to each mission, describe how to orient the camera models based on telemetry or other information. Surface models describe the surface in general terms. Coordinate system objects manage the various coordinate systems involved in most missions. File objects manage access to metadata (labels, including telemetry information) in the input EDRs and RDRs (Reduced Data Records). Label models manage metadata information in output files. Site objects keep track of different locations where the spacecraft might be at a given time. Radiometry models allow correction of radiometry for an image. Mission objects contain basic mission parameters. Pointing adjustment ("nav") files allow pointing to be corrected. The object-oriented structure (C++) makes it easy to subclass just the pieces of the library that are truly mission-specific. Typically, this involves just the pointing model and coordinate systems, and parts of the file model. Once the library was developed (initially for Mars Polar Lander, MPL), adding new missions ranged from two days to a few months, resulting in significant cost savings as compared to rewriting all the application programs for each mission. Currently supported missions include Mars Pathfinder (MPF), MPL, Mars Exploration Rover (MER), Phoenix, and Mars Science Lab (MSL). Applications based on this library create the majority of operational image RDRs for those missions. 
A Java wrapper around the library allows parts of it to be used from Java code (via a native JNI interface). Future conversions of all or part of the library to Java are contemplated.
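The camera-model interface described above (image coordinates to and from view vectors in XYZ space) can be illustrated with a simple pinhole model. PIG itself uses mission-specific CAHV/CAHVOR-style models; this sketch, with hypothetical class and method names, only conveys the shape of the interface:

```python
import numpy as np

# Illustrative stand-in for a PIG-style camera model object. A real camera
# model also carries pointing and articulation state; this pinhole form
# shows only the coordinate <-> view-vector transformation.
class PinholeCameraModel:
    def __init__(self, focal_px, cx=0.0, cy=0.0):
        self.f, self.cx, self.cy = focal_px, cx, cy

    def image_to_view(self, sample, line):
        """Unit view vector (camera frame) for pixel (sample, line)."""
        v = np.array([sample - self.cx, line - self.cy, self.f], dtype=float)
        return v / np.linalg.norm(v)

    def view_to_image(self, v):
        """Project a view vector back to pixel coordinates."""
        return (self.cx + self.f * v[0] / v[2],
                self.cy + self.f * v[1] / v[2])
```

Application code such as a mosaicker would call only this interface, leaving the mission-specific subclass to supply the actual geometry.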

  20. A new scripting library for modeling flow and transport in fractured rock with channel networks

    NASA Astrophysics Data System (ADS)

    Dessirier, Benoît; Tsang, Chin-Fu; Niemi, Auli

    2018-02-01

    Deep crystalline bedrock formations are targeted to host spent nuclear fuel owing to their overall low permeability. They are however highly heterogeneous and only a few preferential paths pertaining to a small set of dominant rock fractures usually carry most of the flow or mass fluxes, a behavior known as channeling that needs to be accounted for in the performance assessment of repositories. Channel network models have been developed and used to investigate the effect of channeling. They are usually simpler than discrete fracture networks based on rock fracture mappings and rely on idealized full or sparsely populated lattices of channels. This study reexamines the fundamental parameter structure required to describe a channel network in terms of groundwater flow and solute transport, leading to an extended description suitable for unstructured arbitrary networks of channels. An implementation of this formalism in a Python scripting library is presented and released along with this article. A new algebraic multigrid preconditioner delivers a significant speedup in the flow solution step compared to previous channel network codes. 3D visualization is readily available for verification and interpretation of the results by exporting the results to an open and free dedicated software. The new code is applied to three example cases to verify its results on full uncorrelated lattices of channels, sparsely populated percolation lattices and to exemplify the use of unstructured networks to accommodate knowledge on local rock fractures.
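The flow step on an arbitrary channel network amounts to solving a linear system over a conductance-weighted graph. A minimal dense sketch is below; node and channel layout are illustrative and not the library's actual API, and the real code uses sparse solvers with the algebraic multigrid preconditioner mentioned above:

```python
import numpy as np

# Hedged sketch: steady groundwater flow on an unstructured channel network.
# Assemble a graph Laplacian weighted by channel conductances, impose
# fixed-head (Dirichlet) boundary nodes, and solve for nodal heads.
def solve_heads(n_nodes, channels, fixed_heads):
    """channels: iterable of (i, j, conductance); fixed_heads: {node: head}."""
    A = np.zeros((n_nodes, n_nodes))
    b = np.zeros(n_nodes)
    for i, j, c in channels:
        A[i, i] += c; A[j, j] += c     # mass balance at each node
        A[i, j] -= c; A[j, i] -= c
    for node, head in fixed_heads.items():
        A[node, :] = 0.0               # overwrite row with boundary condition
        A[node, node] = 1.0
        b[node] = head
    return np.linalg.solve(A, b)
```

For a two-channel chain with unit conductances and fixed heads of 1 and 0 at the ends, the interior head comes out midway, as expected from symmetry.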

  1. X-ray Pulsars Across the Parameter Space of Luminosity, Accretion Mode, and Spin

    NASA Astrophysics Data System (ADS)

    Laycock, Silas; Yang, Jun; Christodoulou, Dimitris; Coe, Malcolm; Cappallo, Rigel; Zezas, Andreas; Ho, Wynn C. G.; Hong, JaeSub; Fingerman, Samuel; Drake, Jeremy J.; Kretschmar, Peter; Antoniou, Vallia

    2017-08-01

    We present our multi-satellite library of X-ray pulsar observations to the community, and highlight recent science results. Available at www.xraypulsars.space, the library provides a range of high-level data products, including: activity histories, pulse-profiles, phased event files, and a unique pulse-profile modeling interface. The initial release (v1.0) contains some 15 years of RXTE-PCA, Chandra ACIS-I, and XMM-PN observations of the Small Magellanic Cloud, creating a valuable record of pulsar behavior. Our library is intended to enable new progress on fundamental NS parameters and accretion physics. The major motivations are (1) Assemble a large homogeneous sample to enable population statistics. This has so far been used to map the propeller transition, and explore the role of retrograde and prograde accretion disks. (2) Obtain pulse-profiles for the same pulsars on many different occasions, at different luminosities and states, in order to break model degeneracies. This effort has led to preliminary measurements of the offsets between magnetic and spin axes. With the addition of other satellites, and Galactic pulsars, the library will cover the entire available range of luminosity, variability timescales and accretion regimes.

  2. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sleaford, Brad W.; Hurst, Aaron M.

    This report describes the measurement, evaluation and incorporation of new γ-ray spectroscopic data into the Evaluated Nuclear Data File (ENDF) for nonproliferation applications. Analysis and processing techniques are described along with key deliverables that have been met over the course of this project. A total of nine new ENDF libraries have been submitted to the National Nuclear Data Center at the Brookhaven National Laboratory and are now available in the ENDF/B-VIII.beta2 release. Furthermore, this project has led to more than ten peer-reviewed publications and provided theses for five graduate students. This project is a component of the NA-22 venture collaboration on "Correlated Nuclear Data in Fission Events" (LA14-V-CorrData-PD2Jb).

  3. Copper benchmark experiment for the testing of JEFF-3.2 nuclear data for fusion applications

    NASA Astrophysics Data System (ADS)

    Angelone, M.; Flammini, D.; Loreti, S.; Moro, F.; Pillon, M.; Villar, R.; Klix, A.; Fischer, U.; Kodeli, I.; Perel, R. L.; Pohorecky, W.

    2017-09-01

    A neutronics benchmark experiment on a pure copper block (dimensions 60 × 70 × 70 cm3), aimed at testing and validating recent nuclear data libraries for fusion applications, was performed in the frame of the European Fusion Program at the 14 MeV ENEA Frascati Neutron Generator (FNG). Reaction rates, neutron flux spectra and doses were measured using different experimental techniques (e.g. activation-foil techniques, an NE213 scintillator and thermoluminescent detectors). This paper first summarizes the analyses of the experiment carried out using the MCNP5 Monte Carlo code and the European JEFF-3.2 library. Large discrepancies between calculation (C) and experiment (E) were found for the reaction rates in both the high and low neutron energy ranges. The analysis was complemented by sensitivity/uncertainty (S/U) analyses using the deterministic SUSD3D and Monte Carlo MCSEN codes, respectively. The S/U analyses made it possible to identify the cross sections and energy ranges that most affect the calculated responses. The largest discrepancy among the C/E values was observed for the thermal (capture) reactions, indicating severe deficiencies in the 63,65Cu capture and elastic cross sections at lower rather than at higher energies. Deterministic and MC codes produced similar results. The 14 MeV copper experiment and its analysis thus call for a revision of the JEFF-3.2 copper cross-section and covariance data evaluation. A new analysis of the experiment was performed with the MCNP5 code using the revised JEFF-3.3-T2 library released by the NEA and a new, not yet distributed, revised JEFF-3.2 Cu evaluation produced by KIT. A noticeable improvement in the C/E results was obtained with both new libraries.

  4. Increasing Chemical Space Coverage by Combining Empirical and Computational Fragment Screens

    PubMed Central

    2015-01-01

    Most libraries for fragment-based drug discovery are restricted to 1,000–10,000 compounds, but over 500,000 fragments are commercially available and potentially accessible by virtual screening. Whether this larger set would increase chemotype coverage, and whether a computational screen can pragmatically prioritize them, is debated. To investigate this question, a 1281-fragment library was screened by nuclear magnetic resonance (NMR) against AmpC β-lactamase, and hits were confirmed by surface plasmon resonance (SPR). Nine hits with novel chemotypes were confirmed biochemically with KI values from 0.2 to low mM. We also computationally docked 290,000 purchasable fragments with chemotypes unrepresented in the empirical library, finding 10 that had KI values from 0.03 to low mM. Though less novel than those discovered by NMR, the docking-derived fragments filled chemotype holes from the empirical library. Crystal structures of nine of the fragments in complex with AmpC β-lactamase revealed new binding sites and explained the relatively high affinity of the docking-derived fragments. The existence of chemotype holes is likely a general feature of fragment libraries, as calculation suggests that to represent the fragment substructures of even known biogenic molecules would demand a library of minimally over 32,000 fragments. Combining computational and empirical fragment screens enables the discovery of unexpected chemotypes, here by the NMR screen, while capturing chemotypes missing from the empirical library and tailored to the target, with little extra cost in resources. PMID:24807704

  5. Development and Testing of Neutron Cross Section Covariance Data for SCALE 6.2

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Marshall, William BJ J; Williams, Mark L; Wiarda, Dorothea

    2015-01-01

    Neutron cross-section covariance data are essential for many sensitivity/uncertainty and uncertainty quantification assessments performed both within the TSUNAMI suite and more broadly throughout the SCALE code system. The release of ENDF/B-VII.1 included a more complete set of neutron cross-section covariance data; these data form the basis for a new cross-section covariance library to be released in SCALE 6.2. A range of testing is conducted to investigate the properties of these covariance data and ensure that the data are reasonable. These tests include examination of the uncertainty in critical experiment benchmark model keff values due to nuclear data uncertainties, as well as similarity assessments of irradiated pressurized water reactor (PWR) and boiling water reactor (BWR) fuel with suites of critical experiments. The contents of the new covariance library, the testing performed, and the behavior of the new covariance data are described in this paper. The neutron cross-section covariances can be combined with a sensitivity data file generated using the TSUNAMI suite of codes within SCALE to determine the uncertainty in system keff caused by nuclear data uncertainties. The Verified, Archived Library of Inputs and Data (VALID) maintained at Oak Ridge National Laboratory (ORNL) contains over 400 critical experiment benchmark models, and sensitivity data are generated for each of these models. The nuclear data uncertainty in keff is generated for each experiment, and the resulting uncertainties are tabulated and compared to the differences in measured and calculated results. The magnitude of the uncertainty for categories of nuclides (such as actinides, fission products, and structural materials) is calculated for irradiated PWR and BWR fuel to quantify the effect of covariance library changes between the SCALE 6.1 and 6.2 libraries.
    One of the primary applications of sensitivity/uncertainty methods within SCALE is the assessment of similarities between benchmark experiments and safety applications. This is described by a ck value for each experiment with each application. Several studies have analyzed typical ck values for a range of critical experiments compared with hypothetical irradiated fuel applications. The ck value is sensitive to the cross-section covariance data because the contribution of each nuclide is influenced by its uncertainty; large uncertainties indicate more likely bias sources and are thus given more weight. Changes in ck values resulting from different covariance data can be used to examine and assess underlying data changes. These comparisons are performed for PWR and BWR fuel in storage and transportation systems.
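Combining a sensitivity data file with a covariance library, as described above, follows the standard "sandwich rule": the relative variance of keff is S C Sᵀ for a sensitivity vector S and relative covariance matrix C. A minimal sketch with illustrative numbers (not SCALE data):

```python
import numpy as np

# Hedged sketch of the sandwich rule for nuclear-data uncertainty in keff.
# S holds dk/k per fractional cross-section change for three hypothetical
# nuclide-reaction pairs; C is their relative covariance matrix.
S = np.array([0.30, -0.10, 0.05])
C = np.array([[4e-4, 1e-4, 0.0],
              [1e-4, 9e-4, 0.0],
              [0.0,  0.0,  1e-4]])

rel_var_k = S @ C @ S          # relative variance of keff: S C S^T
rel_std_k = np.sqrt(rel_var_k) # relative standard deviation of keff
```

Off-diagonal covariance terms can raise or lower the total uncertainty depending on the signs of the sensitivities, which is why the covariance data themselves, not just the diagonal uncertainties, matter for these assessments.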

  6. Moldauer's sum rule as a test of the consistency of transmission coefficients in Hauser-Feshbach theory

    NASA Astrophysics Data System (ADS)

    Brown, David; Nobre, Gustavo; Herman, Michal

    2017-09-01

    For neutron-induced reactions below 20 MeV incident energy, the Unresolved Resonance Region (URR) connects the fast neutron region with the Resolved Resonance Region (RRR). The URR is problematic since resonances are not resolvable experimentally, yet the fluctuations in the neutron cross sections play a discernible and technologically important role: the URR in a typical nucleus lies in the 100 keV - 2 MeV window where the typical fission spectrum peaks. The URR also represents the transition between R-matrix theory, used to describe isolated resonances, and Hauser-Feshbach theory, which accurately describes the average cross sections. In practice, only average or systematic features of the resonances in the URR are known, and these are tabulated in evaluations in a nuclear data library such as ENDF/B-VII.1. Here we apply Moldauer's "sum rule for resonance reactions" to compute the effective transmission coefficients for reactions in the RRR and URR regions. We compare these to the transmission coefficients used in the fast region in the EMPIRE Hauser-Feshbach code, demonstrating the consistency (or lack thereof) between these different physical regimes. This work suggests a better approach to evaluating the URR average parameters using the results from fast-region modeling. This material is based upon work supported by the US Department of Energy, Office of Science, Office of Nuclear Physics, under Contract No. DE-SC0012704 (BNL).
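To make the connection between URR average parameters and transmission coefficients concrete, the well-known weak-coupling relation T_c ≈ 2π⟨Γ_c⟩/D links an average partial width ⟨Γ_c⟩ and mean level spacing D to a Hauser-Feshbach transmission coefficient; Moldauer's sum rule is the more general relation tested in the work above. A minimal sketch (illustrative values, valid only when T ≪ 1):

```python
import math

# Hedged sketch: weak-coupling estimate of a transmission coefficient from
# URR average resonance parameters. Not Moldauer's full sum rule, which
# remains valid beyond the weak-coupling limit.
def weak_coupling_T(avg_width_eV, mean_spacing_eV):
    """T_c = 2*pi*<Gamma_c>/D; sensible only while the result is << 1."""
    return 2.0 * math.pi * avg_width_eV / mean_spacing_eV
```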

  7. Comparative Study on Various Geometrical Core Design of 300 MWth Gas Cooled Fast Reactor with UN-PuN Fuel Longlife without Refuelling

    NASA Astrophysics Data System (ADS)

    Dewi Syarifah, Ratna; Su'ud, Zaki; Basar, Khairul; Irwanto, Dwi

    2017-07-01

    Nuclear power has seen progressive improvement in the operating performance of existing reactors, ensuring the economic competitiveness of nuclear electricity around the world. The GFR uses a gas coolant and a fast neutron spectrum. This research uses a helium coolant, which has low neutron moderation, is chemically inert, and remains single phase. A comparative study on various geometrical core designs for a modular GFR with UN-PuN fuel, long-life without refuelling, has been carried out. The calculations use the SRAC2006 code, with both PIJ and CITATION calculations, and the JENDL 4.0 data libraries. The fuel fraction is varied from 40% to 65%. In this research, we varied the reactor core geometry to find the optimum design. The first variation is the balanced cylinder, in which the active core diameter (D) equals the active core height (H); the second is the pancake cylinder (D > H); and the third is the tall cylinder (D < H).

  8. Uncertainty evaluation of nuclear reaction model parameters using integral and microscopic measurements. Covariances evaluation with CONRAD code

    NASA Astrophysics Data System (ADS)

    de Saint Jean, C.; Habert, B.; Archier, P.; Noguere, G.; Bernard, D.; Tommasi, J.; Blaise, P.

    2010-10-01

    In the [eV; MeV] energy range, modelling of neutron-induced reactions is based on parameterized nuclear reaction models. Estimation of covariances on cross sections or on nuclear reaction model parameters is a recurrent puzzle in nuclear data evaluation. Nuclear reactor physicists have called for major breakthroughs in assessing proper uncertainties for use in applications. In this paper, mathematical methods developed in the CONRAD code [2] are presented to explain the treatment of all types of uncertainties, including experimental ones (statistical and systematic), and to propagate them to nuclear reaction model parameters or cross sections. The marginalization procedure is thus exposed using analytical or Monte Carlo solutions. Furthermore, one major drawback identified by reactor physicists is that integral or analytical experiments (reactor mock-ups or simple integral experiments, e.g. ICSBEP, …) were not taken into account sufficiently early in the evaluation process to remove discrepancies. In this paper, we describe a mathematical framework to take this kind of information into account properly.

  9. Sensitivity Analysis and Optimization of the Nuclear Fuel Cycle: A Systematic Approach

    NASA Astrophysics Data System (ADS)

    Passerini, Stefano

    For decades, nuclear energy development was based on the expectation that recycling of the fissionable materials in the used fuel from today's light water reactors into advanced (fast) reactors would be implemented as soon as technically feasible in order to extend the nuclear fuel resources. More recently, arguments have been made for deployment of fast reactors in order to reduce the amount of higher actinides, hence the longevity of radioactivity, in the materials destined to a geologic repository. The cost of the fast reactors, together with concerns about the proliferation of the technology of extraction of plutonium from used LWR fuel as well as the large investments in construction of reprocessing facilities have been the basis for arguments to defer the introduction of recycling technologies in many countries including the US. In this thesis, the impacts of alternative reactor technologies on the fuel cycle are assessed. Additionally, metrics to characterize the fuel cycles and systematic approaches to using them to optimize the fuel cycle are presented. The fuel cycle options of the 2010 MIT fuel cycle study are re-examined in light of the expected slower rate of growth in nuclear energy today, using the CAFCA (Code for Advanced Fuel Cycle Analysis). The Once Through Cycle (OTC) is considered as the base-line case, while advanced technologies with fuel recycling characterize the alternative fuel cycle options available in the future. The options include limited recycling in LWRs and full recycling in fast reactors and in high conversion LWRs. Fast reactor technologies studied include both oxide and metal fueled reactors. Additional fuel cycle scenarios presented for the first time in this work assume the deployment of innovative recycling reactor technologies such as the Reduced Moderation Boiling Water Reactors and Uranium-235 initiated Fast Reactors.
A sensitivity study focused on system and technology parameters of interest has been conducted to test the robustness of the conclusions presented in the MIT Fuel Cycle Study. These conclusions are found to still hold, even when considering alternative technologies and different sets of simulation assumptions. Additionally, a first-of-a-kind optimization scheme for nuclear fuel cycle analysis is proposed and the applications of such an optimization are discussed. Optimization metrics of interest for different stakeholders in the fuel cycle (economics, fuel resource utilization, high level waste, transuranics/proliferation management, and environmental impact) are utilized for two different optimization techniques: a linear one and a stochastic one. Stakeholder elicitation provided sets of relative weights for the identified metrics appropriate to each stakeholder group, which were then successfully used to arrive at optimum fuel cycle configurations for recycling technologies. The stochastic optimization tool, based on a genetic algorithm, was used to identify non-inferior solutions according to Pareto's dominance approach to optimization. The main tradeoff for fuel cycle optimization was found to be between economics and most of the other identified metrics. (Copies available exclusively from MIT Libraries, libraries.mit.edu/docs - docs@mit.edu)
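The non-inferior (Pareto) selection used by the stochastic optimizer above can be sketched in a few lines. With all metrics cast as minimization objectives, a solution is kept only if no other solution dominates it; function names and metric tuples are illustrative, not from the thesis code:

```python
# Hedged sketch of Pareto-dominance filtering for multi-metric fuel cycle
# solutions. Each solution is a tuple of metric values, all to be minimized.
def dominates(a, b):
    """True if a is no worse than b in every metric and better in at least one."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def pareto_front(solutions):
    """Return the non-inferior (non-dominated) subset of the solutions."""
    return [s for s in solutions
            if not any(dominates(o, s) for o in solutions if o is not s)]
```

A genetic algorithm repeats this filtering over generations of candidate configurations, which is how the economics-versus-other-metrics tradeoff surfaces as a front rather than a single optimum.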

  10. Photofission cross-section ratio measurement of 235U/238U using monoenergetic photons in the energy range of 9.0–16.6 MeV

    DOE PAGES

    Krishichayan; Bhike, Megha; Finch, S. W.; ...

    2017-05-01

    Photofission cross-section ratios of 235U and 238U have been measured using monoenergetic photon beams from the High Intensity Gamma-ray Source facility at the Triangle Universities Nuclear Laboratory. These measurements were performed in small energy steps between 9.0 and 16.6 MeV using a dual-fission ionization chamber. The measured cross-section ratios are compared with previous experimental data as well as with the recent ENDF evaluated nuclear data library.

  11. JANIS 4: An Improved Version of the NEA Java-based Nuclear Data Information System

    NASA Astrophysics Data System (ADS)

    Soppera, N.; Bossant, M.; Dupont, E.

    2014-06-01

    JANIS is software developed to facilitate the visualization and manipulation of nuclear data, giving access to evaluated data libraries, and to the EXFOR and CINDA databases. It is stand-alone Java software, downloadable from the web and distributed on DVD. Used offline, the system also makes use of an internet connection to access the NEA Data Bank database. It is now also offered as a full web application, only requiring a browser. The features added in the latest version of the software and this new web interface are described.

  12. JANIS 4: An Improved Version of the NEA Java-based Nuclear Data Information System

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Soppera, N., E-mail: nicolas.soppera@oecd.org; Bossant, M.; Dupont, E.

    JANIS is software developed to facilitate the visualization and manipulation of nuclear data, giving access to evaluated data libraries, and to the EXFOR and CINDA databases. It is stand-alone Java software, downloadable from the web and distributed on DVD. Used offline, the system also makes use of an internet connection to access the NEA Data Bank database. It is now also offered as a full web application, only requiring a browser. The features added in the latest version of the software and this new web interface are described.

  13. Computer program FPIP-REV calculates fission product inventory for U-235 fission

    NASA Technical Reports Server (NTRS)

    Brown, W. S.; Call, D. W.

    1967-01-01

    The computer program calculates fission product inventories and source strengths associated with the operation of a U-235 fueled nuclear power reactor. It utilizes a fission-product nuclide library of 254 nuclides and calculates the time-dependent behavior of the fission product nuclides formed by fissioning of U-235.
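
    For a single nuclide with constant production and no precursor chain, the time-dependent behavior such a code tracks reduces to a standard saturation formula. A minimal sketch (not the FPIP-REV algorithm; the nuclide, half-life, and production rate are illustrative):

```python
import math

def inventory(production_rate, decay_const, t):
    """Atoms of one fission product after irradiation time t (s), starting
    from zero: N(t) = (P / lambda) * (1 - exp(-lambda * t))."""
    return (production_rate / decay_const) * (1.0 - math.exp(-decay_const * t))

half_life = 8.02 * 24.0 * 3600.0           # ~8.02 d (I-131), in seconds
lam = math.log(2.0) / half_life            # decay constant, 1/s
P = 1.0e10                                 # assumed constant production rate
saturated = inventory(P, lam, 50.0 * half_life)   # activity approaches P
```

    After many half-lives the activity (lambda times N) saturates at the production rate, which is why short-lived fission products dominate source strengths during operation but decay away quickly after shutdown.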

  14. Electron Microscopy Lab

    Science.gov Websites


  15. 76 FR 63330 - Policy Regarding Submittal of Amendments for Processing of Equivalent Feed at Licensed Uranium...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-10-12

    ... NUCLEAR REGULATORY COMMISSION [NRC-2011-0217] Policy Regarding Submittal of Amendments for... NRC's policy regarding receipt and processing, without a license amendment, of equivalent feed at an... and Management System (ADAMS) and in the NRC Library, and update the date voluntary responses should...

  16. 77 FR 64834 - Computational Fluid Dynamics Best Practice Guidelines for Dry Cask Applications

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-10-23

    ... NUCLEAR REGULATORY COMMISSION [NRC-2012-0250] Computational Fluid Dynamics Best Practice... public comments on draft NUREG-2152, ``Computational Fluid Dynamics Best Practice Guidelines for Dry Cask... System (ADAMS): You may access publicly-available documents online in the NRC Library at http://www.nrc...

  17. Resonance Parameter Adjustment Based on Integral Experiments

    DOE PAGES

    Sobes, Vladimir; Leal, Luiz; Arbanas, Goran; ...

    2016-06-02

    Our project seeks to allow coupling of differential and integral data evaluation in a continuous-energy framework and to use the generalized linear least-squares (GLLS) methodology in the TSURFER module of the SCALE code package to update the parameters of a resolved resonance region evaluation. Recognizing that the GLLS methodology in TSURFER is mathematically identical to a Bayesian update in SAMMY, the SAMINT code was created to use the mathematical machinery of SAMMY to update resolved resonance parameters based on integral data. Traditionally, SAMMY used differential experimental data to adjust nuclear data parameters. Integral experimental data, such as those in the International Criticality Safety Benchmark Experiments Project, remain a tool for validation of completed nuclear data evaluations. SAMINT extracts information from integral benchmarks to aid the nuclear data evaluation process. Integral data can then be used to resolve any remaining ambiguity between differential data sets, highlight troublesome energy regions, determine key nuclear data parameters for integral benchmark calculations, and improve the nuclear data covariance matrix evaluation. SAMINT is not intended to bias nuclear data toward specific integral experiments but should be used to supplement the evaluation of differential experimental data. Using GLLS ensures proper weight is given to the differential data.
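
    The GLLS/Bayesian update at the heart of TSURFER and SAMINT can be illustrated in the scalar case. The sketch below uses a Kalman-style gain with invented numbers; it shows the mechanism only, not the SAMMY implementation:

```python
def glls_update(p, var_p, sens, measured, calculated, var_m):
    """Update one parameter p (prior variance var_p) against a benchmark
    response with sensitivity sens = d(response)/d(p), a measured value,
    the value calculated at p, and measurement variance var_m."""
    gain = var_p * sens / (sens * var_p * sens + var_m)   # Kalman-like gain
    p_new = p + gain * (measured - calculated)            # pull toward data
    var_new = var_p - gain * sens * var_p                 # reduced posterior variance
    return p_new, var_new

# Illustrative numbers: a keff benchmark measured 2% above the calculation.
p_new, var_new = glls_update(p=1.00, var_p=0.04, sens=0.5,
                             measured=1.02, calculated=1.00, var_m=0.0001)
```

    The posterior variance is always smaller than the prior, which is how integral data tighten the covariance matrix; the relative sizes of var_p and var_m set the "proper weight" between differential and integral information.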

  18. Determination of parameters of a nuclear reactor through noise measurements

    DOEpatents

    Cohn, C.E.

    1975-07-15

    A method of measuring parameters of a nuclear reactor by noise measurements is described. Noise signals are developed by the detectors placed in the reactor core. The polarity coincidence between the noise signals is used to develop quantities from which various parameters of the reactor can be calculated. (auth)
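
    A sketch of the polarity-coincidence idea, under the standard assumption of jointly Gaussian noise signals: the probability that two signals share sign is 1/2 + arcsin(rho)/pi, so the correlation rho can be recovered from a simple coincidence count. The signal model below is synthetic, not detector data:

```python
import math
import random

rng = random.Random(7)
rho_true = 0.6                       # assumed detector-signal correlation
n = 50000
coincidences = 0
for _ in range(n):
    x = rng.gauss(0.0, 1.0)                                  # detector 1 noise
    y = rho_true * x + math.sqrt(1.0 - rho_true ** 2) * rng.gauss(0.0, 1.0)
    if (x >= 0.0) == (y >= 0.0):     # polarity coincidence event
        coincidences += 1

p = coincidences / n                 # expected: 1/2 + arcsin(rho)/pi
rho_est = math.sin(math.pi * (p - 0.5))   # invert the arcsine law
```

    Counting sign agreements needs only one-bit comparators, which is the practical appeal of the method for in-core instrumentation.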

  19. FreeSASA: An open source C library for solvent accessible surface area calculations.

    PubMed

    Mitternacht, Simon

    2016-01-01

    Calculating solvent accessible surface areas (SASA) is a run-of-the-mill calculation in structural biology. Although there are many programs available for this calculation, there are no free-standing, open-source tools designed for easy tool-chain integration. FreeSASA is an open source C library for SASA calculations that provides both command-line and Python interfaces in addition to its C API. The library implements both Lee and Richards' and Shrake and Rupley's approximations, and is highly configurable to allow the user to control molecular parameters, accuracy and output granularity. It only depends on standard C libraries and should therefore be easy to compile and install on any platform. The library is well-documented, stable and efficient. The command-line interface can easily replace closed source legacy programs, with comparable or better accuracy and speed, and with some added functionality.
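
    The Shrake-Rupley approximation implemented by FreeSASA can be sketched in a few lines: sample test points on each atom's solvent-expanded sphere and keep the fraction not buried inside any neighbor. This is a toy reimplementation, not the FreeSASA C API; radii and coordinates are illustrative:

```python
import math

def sphere_points(n):
    """Roughly uniform points on a unit sphere (golden-spiral construction)."""
    pts = []
    golden = math.pi * (3.0 - math.sqrt(5.0))
    for i in range(n):
        z = 1.0 - 2.0 * (i + 0.5) / n
        r = math.sqrt(1.0 - z * z)
        theta = golden * i
        pts.append((r * math.cos(theta), r * math.sin(theta), z))
    return pts

def sasa(atoms, probe=1.4, n_points=500):
    """atoms: list of (x, y, z, vdw_radius). Returns per-atom accessible area."""
    areas = []
    for i, (x, y, z, rad) in enumerate(atoms):
        R = rad + probe                       # solvent-expanded radius
        exposed = 0
        for (dx, dy, dz) in sphere_points(n_points):
            px, py, pz = x + R * dx, y + R * dy, z + R * dz
            buried = any(
                (px - xj) ** 2 + (py - yj) ** 2 + (pz - zj) ** 2 < (rj + probe) ** 2
                for j, (xj, yj, zj, rj) in enumerate(atoms) if j != i)
            if not buried:
                exposed += 1
        areas.append(4.0 * math.pi * R * R * exposed / n_points)
    return areas

# An isolated atom is fully exposed; a close neighbor shadows part of it.
free = sasa([(0.0, 0.0, 0.0, 1.7)])[0]
paired = sasa([(0.0, 0.0, 0.0, 1.7), (2.0, 0.0, 0.0, 1.7)])[0]
```

    Accuracy is controlled by the number of test points per sphere, which is the kind of resolution parameter the library exposes to the user.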

  20. ANITA-2000 activation code package - updating of the decay data libraries and validation on the experimental data of the 14 MeV Frascati Neutron Generator

    NASA Astrophysics Data System (ADS)

    Frisoni, Manuela

    2016-03-01

    ANITA-2000 is a code package for the activation characterization of materials exposed to neutron irradiation released by ENEA to OECD-NEADB and ORNL-RSICC. The main component of the package is the activation code ANITA-4M that computes the radioactive inventory of a material exposed to neutron irradiation. The code requires the decay data library (file fl1) containing the quantities describing the decay properties of the unstable nuclides and the library (file fl2) containing the gamma ray spectra emitted by the radioactive nuclei. The fl1 and fl2 files of the ANITA-2000 code package, originally based on the evaluated nuclear data library FENDL/D-2.0, were recently updated on the basis of the JEFF-3.1.1 Radioactive Decay Data Library. This paper presents the results of the validation of the new fl1 decay data library through the comparison of the ANITA-4M calculated values with the measured electron and photon decay heats and activities of fusion material samples irradiated at the 14 MeV Frascati Neutron Generator (FNG) of the NEA-Frascati Research Centre. Twelve material samples were considered, namely: Mo, Cu, Hf, Mg, Ni, Cd, Sn, Re, Ti, W, Ag and Al. The ratios between calculated and experimental values (C/E) are shown and discussed in this paper.

  1. Advances toward fully automated in vivo assessment of oral epithelial dysplasia by nuclear endomicroscopy-A pilot study.

    PubMed

    Liese, Jan; Winter, Karsten; Glass, Änne; Bertolini, Julia; Kämmerer, Peer Wolfgang; Frerich, Bernhard; Schiefke, Ingolf; Remmerbach, Torsten W

    2017-11-01

    Uncertainties in the detection of oral epithelial dysplasia (OED) frequently result from sampling error, especially in inflammatory oral lesions. Endomicroscopy allows non-invasive, "en face" imaging of the upper oral epithelium, but parameters of OED are unknown. Mucosal nuclei were imaged in 34 toluidine blue-stained oral lesions with a commercial endomicroscopy system. Histopathological diagnosis placed four biopsies in the "dys-/neoplastic," 23 in the "inflammatory," and seven in the "others" disease groups. The strength of different assessment strategies (nuclear scoring, nuclear count, and automated nuclear analysis) was measured by area under the ROC curve (AUC) to identify the histopathological "dys-/neoplastic" group. Nuclear objects from automated image analysis were visually corrected. The best-performing parameters of nuclear-to-image ratios were the count of large nuclei (AUC=0.986) and the 6-nearest neighborhood relation (AUC=0.896), and the best parameters of nuclear polymorphism were the count of atypical nuclei (AUC=0.996) and the compactness of nuclei (AUC=0.922). Excluding low-grade OED, nuclear scoring and count reached 100% sensitivity and 98% specificity for detection of dys-/neoplastic lesions. In automated analysis, combining parameters enhanced diagnostic strength. Sensitivity of 100% and specificity of 87% were seen for distances of 6-nearest neighbors and aspect ratios, even in uncorrected objects. Correction improved measures of nuclear polymorphism only. The hue of the background color was stronger than nuclear density (AUC=0.779 vs 0.687) in detecting the dys-/neoplastic group, indicating that the macroscopic aspect is biased. Nuclear-to-image ratios are applicable to automated optical in vivo diagnostics for oral potentially malignant disorders. Nuclear endomicroscopy may promote non-invasive, early detection of dys-/neoplastic lesions by reducing sampling error. © 2017 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.
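
    AUC values like those quoted above can be computed without any curve fitting via the rank-sum (Mann-Whitney) identity: the AUC equals the fraction of (negative, positive) pairs ranked correctly, with ties counting half. A sketch with invented nuclear-area values, not the study's data:

```python
def auc(negatives, positives):
    """Fraction of (negative, positive) pairs ranked correctly (ties = 0.5)."""
    wins = 0.0
    for b in negatives:
        for m in positives:
            if m > b:
                wins += 1.0
            elif m == b:
                wins += 0.5
    return wins / (len(negatives) * len(positives))

# Illustrative nuclear-area measurements for two diagnostic groups.
benign_area = [40, 42, 45, 47, 50]
malignant_area = [48, 55, 60, 62, 70]
score = auc(benign_area, malignant_area)   # close to 1.0 = strong separation
```

    An AUC of 0.5 means the parameter carries no diagnostic information; values near 1.0, like the nuclear counts above, indicate nearly perfect ranking.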

  2. CHEMICAL EVOLUTION LIBRARY FOR GALAXY FORMATION SIMULATION

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Saitoh, Takayuki R., E-mail: saitoh@elsi.jp

    We have developed a software library for chemical evolution simulations of galaxy formation under the simple stellar population (SSP) approximation. In this library, all of the necessary components concerning chemical evolution, such as initial mass functions, stellar lifetimes, yields from Type II and Type Ia supernovae, asymptotic giant branch stars, and neutron star mergers, are compiled from the literature. Various models are pre-implemented in this library so that users can choose their favorite combination of models. Subroutines of this library return released energy and masses of individual elements depending on a given event type. Since the redistribution manner of these quantities depends on the implementation of users' simulation codes, this library leaves it up to the simulation code. As demonstrations, we carry out both one-zone, closed-box simulations and 3D simulations of a collapsing gas and dark matter system using this library. In these simulations, we can easily compare the impact of individual models on the chemical evolution of galaxies, just by changing the control flags and parameters of the library. Since this library only deals with the part of chemical evolution under the SSP approximation, any simulation codes that use the SSP approximation—namely, particle-based and mesh codes, as well as semianalytical models—can use it. This library is named "CELib" after the term "Chemical Evolution Library" and is made available to the community.

  3. Optimizing the Performance of Radionuclide Identification Software in the Hunt for Nuclear Security Threats

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fotion, Katherine A.

    2016-08-18

    The Radionuclide Analysis Kit (RNAK), my team’s most recent nuclide identification software, is entering the testing phase. A question arises: will removing rare nuclides from the software’s library improve its overall performance? An affirmative response indicates fundamental errors in the software’s framework, while a negative response confirms the effectiveness of the software’s key machine learning algorithms. After thorough testing, I found that the performance of RNAK cannot be improved with the library choice effect, thus verifying the effectiveness of RNAK’s algorithms—multiple linear regression, Bayesian network using the Viterbi algorithm, and branch and bound search.

  4. Real time gamma-ray signature identifier

    DOEpatents

    Rowland, Mark [Alamo, CA; Gosnell, Tom B [Moraga, CA; Ham, Cheryl [Livermore, CA; Perkins, Dwight [Livermore, CA; Wong, James [Dublin, CA

    2012-05-15

    A real time gamma-ray signature/source identification method and system using principal components analysis (PCA) for transforming and substantially reducing one or more comprehensive spectral libraries of nuclear materials types and configurations into a corresponding concise representation/signature(s) representing and indexing each individual predetermined spectrum in principal component (PC) space, wherein an unknown gamma-ray signature may be compared against the representative signature to find a match or at least characterize the unknown signature from among all the entries in the library with a single regression or simple projection into the PC space, so as to substantially reduce processing time and computing resources and enable real-time characterization and/or identification.
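
    The patent's scheme (transform a spectral library into principal-component space once, then identify an unknown with a single projection and distance scan) can be sketched as follows. The spectra here are synthetic random vectors, not real gamma-ray data:

```python
import numpy as np

rng = np.random.default_rng(0)
library = rng.random((5, 8))                 # 5 reference spectra, 8 channels

# One-time transformation: PCA of the mean-centered library via SVD.
mean = library.mean(axis=0)
U, S, Vt = np.linalg.svd(library - mean, full_matrices=False)
components = Vt[:3]                          # keep 3 principal components

def signature(spectrum):
    """Concise PC-space representation of a spectrum."""
    return components @ (spectrum - mean)

index = np.array([signature(s) for s in library])   # precomputed signatures

def identify(unknown):
    """Nearest library entry in PC space: one projection, one distance scan."""
    d = np.linalg.norm(index - signature(unknown), axis=1)
    return int(np.argmin(d))

match = identify(library[2])   # an exact copy of entry 2 maps back to itself
```

    The point of the reduction is that the distance scan runs over short signatures rather than full spectra, which is what enables the real-time matching the patent claims.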

  5. Anisn-Dort Neutron-Gamma Flux Intercomparison Exercise for a Simple Testing Model

    NASA Astrophysics Data System (ADS)

    Boehmer, B.; Konheiser, J.; Borodkin, G.; Brodkin, E.; Egorov, A.; Kozhevnikov, A.; Zaritsky, S.; Manturov, G.; Voloschenko, A.

    2003-06-01

    The ability of transport codes ANISN, DORT, ROZ-6, MCNP and TRAMO, as well as nuclear data libraries BUGLE-96, ABBN-93, VITAMIN-B6 and ENDF/B-6 to deliver consistent gamma and neutron flux results was tested in the calculation of a one-dimensional cylindrical model consisting of a homogeneous core and an outer zone with a single material. Model variants with H2O, Fe, Cr and Ni in the outer zones were investigated. The results are compared with MCNP-ENDF/B-6 results. Discrepancies are discussed. The specified test model is proposed as a computational benchmark for testing calculation codes and data libraries.

  6. Implementation of the NMR CHEmical Shift Covariance Analysis (CHESCA): A Chemical Biologist's Approach to Allostery.

    PubMed

    Boulton, Stephen; Selvaratnam, Rajeevan; Ahmed, Rashik; Melacini, Giuseppe

    2018-01-01

    Mapping allosteric sites is emerging as one of the central challenges in physiology, pathology, and pharmacology. Nuclear Magnetic Resonance (NMR) spectroscopy is ideally suited to map allosteric sites, given its ability to sense at atomic resolution the dynamics underlying allostery. Here, we focus specifically on the NMR CHEmical Shift Covariance Analysis (CHESCA), in which allosteric systems are interrogated through a targeted library of perturbations (e.g., mutations and/or analogs of the allosteric effector ligand). The atomic resolution readout for the response to such perturbation library is provided by NMR chemical shifts. These are then subject to statistical correlation and covariance analyses resulting in clusters of allosterically coupled residues that exhibit concerted responses to the common set of perturbations. This chapter provides a description of how each step in the CHESCA is implemented, starting from the selection of the perturbation library and ending with an overview of different clustering options.
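
    The covariance step of CHESCA can be sketched as follows: correlate each residue's chemical shift response across the perturbation library and group residues whose responses co-vary above a cutoff. Residue names, shift values, and the cutoff below are invented for illustration:

```python
import math

def pearson(x, y):
    """Pearson correlation of two equal-length response profiles."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# Chemical shifts of 4 residues across a 4-member perturbation library.
shifts = {
    "A10": [0.00, 0.10, 0.20, 0.30],
    "G45": [0.01, 0.11, 0.21, 0.29],   # tracks A10: same allosteric cluster
    "L72": [0.30, 0.05, 0.25, 0.00],   # uncorrelated response
    "K88": [0.00, 0.09, 0.19, 0.31],   # tracks A10 as well
}

cutoff = 0.98
cluster = {r for r in shifts if pearson(shifts["A10"], shifts[r]) >= cutoff}
```

    Residues whose shifts respond in concert to the common perturbation set fall into one cluster, which is read as a network of allosterically coupled sites.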

  7. Validation of the BUGJEFF311.BOLIB, BUGENDF70.BOLIB and BUGLE-B7 broad-group libraries on the PCA-Replica (H2O/Fe) neutron shielding benchmark experiment

    NASA Astrophysics Data System (ADS)

    Pescarini, Massimo; Orsi, Roberto; Frisoni, Manuela

    2016-03-01

    The PCA-Replica 12/13 (H2O/Fe) neutron shielding benchmark experiment was analysed using the TORT-3.2 3D SN code. PCA-Replica reproduces a PWR ex-core radial geometry with alternate layers of water and steel including a pressure vessel simulator. Three broad-group coupled neutron/photon working cross section libraries in FIDO-ANISN format with the same energy group structure (47 n + 20 γ) and based on different nuclear data were alternatively used: the ENEA BUGJEFF311.BOLIB (JEFF-3.1.1) and BUGENDF70.BOLIB (ENDF/B-VII.0) libraries and the ORNL BUGLE-B7 (ENDF/B-VII.0) library. Dosimeter cross sections derived from the IAEA IRDF-2002 dosimetry file were employed. The calculated reaction rates for the Rh-103(n,n')Rh-103m, In-115(n,n')In-115m and S-32(n,p)P-32 threshold activation dosimeters and the calculated neutron spectra are compared with the corresponding experimental results.

  8. Library of Giant Planet Reflection Spectra for WFirst and Future Space Telescopes

    NASA Astrophysics Data System (ADS)

    Smith, Adam J. R. W.; Fortney, Jonathan; Morley, Caroline; Batalha, Natasha E.; Lewis, Nikole K.

    2018-01-01

    Future large space telescopes will be able to directly image exoplanets in optical light. The optical light of a resolved planet is due to stellar flux reflected by Rayleigh scattering or cloud scattering, with absorption features imprinted due to molecular bands in the planetary atmosphere. To aid in the design of such missions, and to better understand a wide range of giant planet atmospheres, we have built a library of model giant planet reflection spectra, for the purpose of determining effective methods of spectral analysis as well as for comparison with actual imaged objects. This library covers a wide range of parameters: objects are modeled at ten orbital distances between 0.5 AU and 5.0 AU, which ranges from planets too warm for water clouds out to those that are true Jupiter analogs. These calculations include six metallicities between solar and 100x solar, with a variety of different cloud thickness parameters, and across all possible phase angles.

  9. Role of morphometry in the cytological differentiation of benign and malignant thyroid lesions

    PubMed Central

    Khatri, Pallavi; Choudhury, Monisha; Jain, Manjula; Thomas, Shaji

    2017-01-01

    Context: Thyroid nodules represent a common problem, with an estimated prevalence of 4–7%. Although fine needle aspiration cytology (FNAC) has been accepted as a first line diagnostic test, the rate of false negative reports of malignancy is still high. Nuclear morphometry is the measurement of nuclear parameters by image analysis. Image analysis can merge the advantages of morphologic interpretation with those of quantitative data. Aims: To evaluate the nuclear morphometric parameters in fine needle aspirates of thyroid lesions and to study its role in differentiating benign from malignant thyroid lesions. Material and Methods: The study included 19 benign and 16 malignant thyroid lesions. Image analysis was performed on Giemsa-stained FNAC slides by Nikon NIS-Elements Advanced Research software (Version 4.00). Nuclear morphometric parameters analyzed included nuclear size, shape, texture, and density parameters. Statistical Analysis: Normally distributed continuous variables were compared using the unpaired t-test for two groups and analysis of variance was used for three or more groups. Tukey or Tamhane's T2 multiple comparison test was used to assess the differences between the individual groups. Categorical variables were analyzed using the chi square test. Results and Conclusion: Five out of the six nuclear size parameters as well as all the texture and density parameters studied were significant in distinguishing between benign and malignant thyroid lesions (P < 0.05). Cut-off values were derived to differentiate between benign and malignant cases. PMID:28182069

  10. Using Computerized Cytomorphometry to Distinguish between Benign and Malignant Cases in Thyroid Fine-Needle Aspiration Cytology.

    PubMed

    Celik, Zeliha Esin; Altinay, Serdar; Kilinc, Fahriye; Arslan, Nur; Yilmaz, Burcu Sanal; Karabagli, Pınar; Ugurluoglu, Ceyhan

    2016-11-01

    Only a small number of studies on computerized cytomorphometry have been performed for thyroid FNAC. The present study aimed to determine the usefulness of computerized cytomorphometry methods to further classify thyroid lesions as benign or malignant and to compare the practicability and value of using Papanicolaou (Pap) and Giemsa stains in thyroid FNAC by evaluating their association to various cytologic nuclear parameters. Fifty-eight thyroid lesions diagnosed by FNAC and categorized according to the Bethesda system for reporting thyroid cytopathology were evaluated in terms of various cytologic nuclear parameters, including nuclear area (NA), nuclear perimeter (NP), nuclear density (ND), long nuclear diameter (LND), and short nuclear diameter (SND). The Pap- and Giemsa-stained slides were examined separately. In the malignant cases, NA, NP, LND, and SND were higher than in the benign cases for both the Pap and Giemsa stains. NA, NP, LND, and SND were higher in Giemsa than Pap for both the benign and malignant groups. Statistically significant differences were detected between the benign and malignant cases in the AUS category. Computerized cytomorphometry is useful in distinguishing between benign and malignant lesions in thyroid FNAC. The measurement of cytologic nuclear parameters in cases suggestive of AUS may be useful for the probable classification of cases as benign or malignant. Although further studies are needed, in nuclear morphometric assessment of thyroid FNAC, Giemsa staining may be more useful and valuable than the Pap stain because of its association with various cytologic nuclear parameters. Diagn. Cytopathol. 2016;44:902-911. © 2016 Wiley Periodicals, Inc.

  11. Revived STIS. II. Properties of Stars in the Next Generation Spectral Library

    NASA Technical Reports Server (NTRS)

    Heap, Sara R.; Lindler, D.

    2010-01-01

    Spectroscopic surveys of galaxies at high redshift will bring the rest-frame ultraviolet into view of large, ground-based telescopes. The UV-blue spectral region is rich in diagnostics, but these diagnostics have not yet been calibrated in terms of the properties of the responsible stellar population(s). Such calibrations are now possible with Hubble's Next Generation Spectral Library (NGSL). The NGSL contains UV-optical spectra (0.2 - 1.0 microns) of 374 stars having a wide range in temperature, luminosity, and metallicity. We will describe our work to derive basic stellar parameters from NGSL spectra using modern model spectra and to use these stellar parameters to develop UV-blue spectral diagnostics.

  12. New features and improved uncertainty analysis in the NEA nuclear data sensitivity tool (NDaST)

    NASA Astrophysics Data System (ADS)

    Dyrda, J.; Soppera, N.; Hill, I.; Bossant, M.; Gulliford, J.

    2017-09-01

    Following the release and initial testing period of the NEA's Nuclear Data Sensitivity Tool [1], new features have been designed and implemented in order to expand its uncertainty analysis capabilities. The aim is to provide a free online tool for integral benchmark testing, that is both efficient and comprehensive, meeting the needs of the nuclear data and benchmark testing communities. New features include access to P1 sensitivities for the neutron scattering angular distribution [2] and constrained Chi sensitivities for the prompt fission neutron energy sampling. Both of these are compatible with covariance data accessed via the JANIS nuclear data software, enabling propagation of the resultant uncertainties in keff to a large series of integral experiment benchmarks. These capabilities are available using a number of different covariance libraries, e.g., ENDF/B, JEFF, JENDL and TENDL, allowing comparison of the broad range of results it is possible to obtain. The IRPhE database of reactor physics measurements is now also accessible within the tool, in addition to the criticality benchmarks from ICSBEP. Other improvements include the ability to determine and visualise the energy dependence of a given calculated result in order to better identify specific regions of importance or high uncertainty contribution. Sorting and statistical analysis of the selected benchmark suite is now also provided. Examples of the plots generated by the software are included to illustrate such capabilities. Finally, a number of analytical expressions, for example Maxwellian and Watt fission spectra, will be included. This will allow the analyst to determine the impact of varying such distributions within the data evaluation, either through adjustment of parameters within the expressions, or by comparison to a more general probability distribution fitted to measured data. The impact of such changes is verified through calculations which are compared to a `direct' measurement found by adjustment of the original ENDF format file.

  13. Latin Hypercube Sampling (LHS) UNIX Library/Standalone

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    2004-05-13

    The LHS UNIX Library/Standalone software provides the capability to draw random samples from over 30 distribution types. It performs the sampling by a stratified sampling method called Latin Hypercube Sampling (LHS). Multiple distributions can be sampled simultaneously, with user-specified correlations amongst the input distributions; LHS UNIX Library/Standalone thus provides a way to generate multi-variate samples. The LHS samples can be generated either via a callable library (e.g., from within the DAKOTA software framework) or as a standalone capability. In LHS, a constrained Monte Carlo sampling scheme, the range of each variable is divided into non-overlapping intervals on the basis of equal probability. A sample is selected at random with respect to the probability density in each interval. If multiple variables are sampled simultaneously, the values obtained for each are paired in a random manner with the n values of the other variables. In some cases, the pairing is restricted to obtain specified correlations amongst the input variables. Many simulation codes have input parameters that are uncertain and can be specified by a distribution. To perform uncertainty analysis and sensitivity analysis, random values are drawn from the input parameter distributions, and the simulation is run with these values to obtain output values. If this is done repeatedly, with many input samples drawn, one can build up a distribution of the output as well as examine correlations between input and output variables.
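
    The stratified scheme described above can be sketched in a few lines of Python (an illustration of the LHS method on the unit hypercube, not the LHS UNIX Library code): each variable's range is split into n equal-probability intervals, one value is drawn per interval, and the columns are shuffled to pair values randomly:

```python
import random

def lhs(n_samples, n_vars, seed=42):
    """Latin hypercube sample on the unit hypercube: one draw per stratum."""
    rng = random.Random(seed)
    columns = []
    for _ in range(n_vars):
        # one value from each of the n equal-probability intervals [i/n, (i+1)/n)
        column = [(i + rng.random()) / n_samples for i in range(n_samples)]
        rng.shuffle(column)               # random pairing across variables
        columns.append(column)
    return list(zip(*columns))            # n_samples points of n_vars coords

points = lhs(5, 2)
```

    By construction every marginal stratum is hit exactly once, which is why LHS covers the input space far more evenly than plain Monte Carlo at the same sample count. Mapping each unit-interval value through an inverse CDF would extend this to arbitrary distributions.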

  14. Uncertainty quantification in (α,n) neutron source calculations for an oxide matrix

    DOE PAGES

    Pigni, M. T.; Croft, S.; Gauld, I. C.

    2016-04-25

    Here we present a methodology to propagate nuclear data covariance information in neutron source calculations from (α,n) reactions. The approach is applied to estimate the uncertainty in the neutron generation rates for uranium oxide fuel types due to uncertainties in (1) 17,18O(α,n) reaction cross sections and (2) uranium and oxygen stopping power cross sections. The procedure to generate reaction cross section covariance information is based on the Bayesian fitting method implemented in the R-matrix SAMMY code. The evaluation methodology uses the Reich-Moore approximation to fit the 17,18O(α,n) reaction cross sections in order to derive a set of resonance parameters and a related covariance matrix, which is then used to calculate the energy-dependent cross section covariance matrix. The stopping power cross sections and related covariance information for uranium and oxygen were obtained by fitting stopping power data in the energy range of 1 keV up to 12 MeV. Cross section perturbation factors based on the covariance information for the evaluated 17,18O(α,n) reaction cross sections, as well as the uranium and oxygen stopping power cross sections, were used to generate a varied set of nuclear data libraries used in SOURCES4C and ORIGEN for inventory and source term calculations. The set of randomly perturbed output (α,n) source responses provides the mean values and standard deviations of the calculated responses, reflecting the uncertainties in the nuclear data used in the calculations. Lastly, the results and related uncertainties are compared with experimental thick-target (α,n) yields for uranium oxide.
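
    The perturbation-factor approach can be illustrated with a toy model: draw correlated relative perturbations from a small covariance matrix via its Cholesky factor, push them through a response model, and read off the spread of the responses. The covariance, response model, and sample count below are invented, not the SOURCES4C/ORIGEN chain:

```python
import math
import random

rng = random.Random(1)
cov = [[0.0004, 0.0001],   # illustrative relative covariance of two cross sections
       [0.0001, 0.0009]]

# Cholesky factor of the 2x2 covariance, for correlated normal sampling.
l11 = math.sqrt(cov[0][0])
l21 = cov[1][0] / l11
l22 = math.sqrt(cov[1][1] - l21 ** 2)

def yield_model(f1, f2):
    """Toy neutron yield, linear in the two perturbed cross sections."""
    return 100.0 * f1 + 50.0 * f2

samples = []
for _ in range(20000):
    z1, z2 = rng.gauss(0.0, 1.0), rng.gauss(0.0, 1.0)
    f1 = 1.0 + l11 * z1                    # perturbation factor, nominal = 1
    f2 = 1.0 + l21 * z1 + l22 * z2         # correlated with f1
    samples.append(yield_model(f1, f2))

mean = sum(samples) / len(samples)
std = math.sqrt(sum((s - mean) ** 2 for s in samples) / (len(samples) - 1))
```

    For this linear model the spread can be checked analytically (sensitivity-weighted covariance), but the Monte Carlo route works unchanged when the response is a full transport or depletion calculation.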

  15. Comparative study on neutronics characteristics of a 1500 MWe metal fuel sodium-cooled fast reactor

    DOE PAGES

    Ohgama, Kazuya; Aliberti, Gerardo; Stauff, Nicolas E.; ...

    2017-02-28

    Under the cooperative effort of the Civil Nuclear Energy R&D Working Group, within the framework of the U.S.-Japan bilateral, Argonne National Laboratory (ANL) and the Japan Atomic Energy Agency (JAEA) have been performing a benchmark study using the Japan Sodium-cooled Fast Reactor (JSFR) design with metal fuel. In this benchmark study, core characteristic parameters at the beginning of cycle were evaluated by the best-estimate deterministic and stochastic methodologies of ANL and JAEA. The results obtained by both institutions show good agreement, with less than 200 pcm of discrepancy on the neutron multiplication factor, and less than 3% of discrepancy on the sodium void reactivity, Doppler reactivity, and control rod worth. The results of the stochastic and deterministic approaches were compared by each party to investigate the impacts of the deterministic approximations and to understand potential variations in the results due to the different calculation methodologies employed. From the detailed analysis of methodologies, it was found that the good agreement in multiplication factor from the deterministic calculations comes from the cancellation of the differences in methodology (0.4%) and nuclear data (0.6%). The different treatment of reflector cross section generation was estimated to be the major cause of the discrepancy between the multiplication factors from the JAEA and ANL deterministic methodologies. Impacts of the nuclear data libraries were also investigated using a sensitivity analysis methodology. The differences in the inelastic scattering cross sections of U-238, the ν values and fission cross sections of Pu-239, and the µ-average of Na-23 are the major contributors to the difference in the multiplication factors.

  16. Comparative study on neutronics characteristics of a 1500 MWe metal fuel sodium-cooled fast reactor

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ohgama, Kazuya; Aliberti, Gerardo; Stauff, Nicolas E.

    Under the cooperative effort of the Civil Nuclear Energy R&D Working Group, within the framework of the U.S.-Japan bilateral, Argonne National Laboratory (ANL) and the Japan Atomic Energy Agency (JAEA) have been performing a benchmark study using the Japan Sodium-cooled Fast Reactor (JSFR) design with metal fuel. In this benchmark study, core characteristic parameters at the beginning of cycle were evaluated by the best-estimate deterministic and stochastic methodologies of ANL and JAEA. The results obtained by both institutions show good agreement, with less than 200 pcm of discrepancy on the neutron multiplication factor, and less than 3% of discrepancy on the sodium void reactivity, Doppler reactivity, and control rod worth. The results of the stochastic and deterministic approaches were compared by each party to investigate the impacts of the deterministic approximations and to understand potential variations in the results due to the different calculation methodologies employed. From the detailed analysis of methodologies, it was found that the good agreement in multiplication factor from the deterministic calculations comes from the cancellation of the differences in methodology (0.4%) and nuclear data (0.6%). The different treatment of reflector cross section generation was estimated to be the major cause of the discrepancy between the multiplication factors from the JAEA and ANL deterministic methodologies. Impacts of the nuclear data libraries were also investigated using a sensitivity analysis methodology. The differences in the inelastic scattering cross sections of U-238, the ν values and fission cross sections of Pu-239, and the µ-average of Na-23 are the major contributors to the difference in the multiplication factors.

  17. Subgroup A : nuclear model codes report to the Sixteenth Meeting of the WPEC

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Talou, P.; Chadwick, M. B.; Dietrich, F. S.

    2004-01-01

    The Subgroup A activities focus on the development of nuclear reaction models and codes used in evaluation work for nuclear reactions from the unresolved-resonance energy region up to the pion production threshold, and for target nuclides with mass numbers from the low teens and heavier. Much of the effort is devoted by each participant to the continuing development of their own institution's codes. Progress in this arena is reported in detail for each code in the present document. EMPIRE-II is publicly accessible. The release of the TALYS code has been announced for the ND2004 Conference in Santa Fe, NM, October 2004. McGNASH is still under development and is not expected to be released in the very near future. In addition, Subgroup A members have demonstrated a growing interest in working on common modeling and code capabilities, which would significantly reduce the amount of duplicated work, help manage the growing body of existing codes efficiently, and render code inter-comparison much easier. A recent and important activity of Subgroup A has therefore been to develop the framework and the first bricks of the ModLib library, which consists of mostly independent pieces of code written in Fortran 90 (and above) to be used in existing and future nuclear reaction codes. Significant progress in the development of ModLib has been made during the past year. Several physics modules have been added to the library, and a few more have been planned in detail for the coming year.

  18. The UF/NCI family of hybrid computational phantoms representing the current US population of male and female children, adolescents, and adults—application to CT dosimetry

    NASA Astrophysics Data System (ADS)

    Geyer, Amy M.; O'Reilly, Shannon; Lee, Choonsik; Long, Daniel J.; Bolch, Wesley E.

    2014-09-01

    Substantial increases in pediatric and adult obesity in the US have prompted a major revision of the current UF/NCI (University of Florida/National Cancer Institute) family of hybrid computational phantoms to more accurately reflect current trends toward larger body morphometry. A decision was made to construct the new library in a gridded fashion by height/weight without further reference to age-dependent weight/height percentiles, as these become quickly outdated. At each height/weight combination, circumferential parameters were defined and used for phantom construction. All morphometric data for the new library were taken from the CDC NHANES survey data over the period 1999-2006, the most recent reported survey period. A subset of the phantom library was then used in a CT organ-dose sensitivity study to examine the degree to which body morphometry influences the magnitude of organ doses for patients ranging from underweight to morbidly obese. Using primary and secondary morphometric parameters, grids containing 100 adult male height/weight bins, 93 adult female height/weight bins, 85 pediatric male height/weight bins and 73 pediatric female height/weight bins were constructed. These grids served as the blueprints for construction of a comprehensive library of 351 patient-dependent computational phantoms. At a given phantom standing height, normalized CT organ doses were shown to decrease linearly with increasing phantom BMI for pediatric males, while curvilinear decreases in organ dose with increasing phantom BMI were shown for adult females. These results suggest that one very useful application of the phantom library would be the construction of a pre-computed dose library for CT imaging as needed for patient dose-tracking.
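    A pre-computed dose library of the kind proposed above would be queried by interpolating over the height/weight grid. The sketch below uses bilinear interpolation on a small hypothetical dose table; the bin values are invented for illustration and are not NHANES- or phantom-derived.

```python
from bisect import bisect_right

# Hypothetical pre-computed CT organ-dose grid (mGy per fixed technique),
# indexed by standing height (cm) and body weight (kg); values are invented.
heights = [150.0, 160.0, 170.0, 180.0]
weights = [50.0, 70.0, 90.0, 110.0]
dose = [  # dose[i][j] corresponds to heights[i], weights[j]
    [12.0, 10.5, 9.4, 8.6],
    [11.4, 10.0, 9.0, 8.3],
    [10.9,  9.6, 8.7, 8.0],
    [10.5,  9.3, 8.4, 7.8],
]

def interp_dose(h, w):
    """Bilinear interpolation of the dose grid at height h (cm), weight w (kg)."""
    i = min(max(bisect_right(heights, h) - 1, 0), len(heights) - 2)
    j = min(max(bisect_right(weights, w) - 1, 0), len(weights) - 2)
    th = (h - heights[i]) / (heights[i + 1] - heights[i])
    tw = (w - weights[j]) / (weights[j + 1] - weights[j])
    return ((1 - th) * (1 - tw) * dose[i][j]
            + (1 - th) * tw * dose[i][j + 1]
            + th * (1 - tw) * dose[i + 1][j]
            + th * tw * dose[i + 1][j + 1])
```

A patient-specific estimate, e.g. `interp_dose(165, 80)`, then needs no new Monte Carlo run, which is the point of a pre-computed library.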

  19. Single-level resonance parameters fit nuclear cross-sections

    NASA Technical Reports Server (NTRS)

    Drawbaugh, D. W.; Gibson, G.; Miller, M.; Page, S. L.

    1970-01-01

    Least squares analyses of experimental differential cross-section data for the U-235 nucleus have yielded single level Breit-Wigner resonance parameters that fit, simultaneously, three nuclear cross sections of capture, fission, and total.

  20. Computer program TRACK_TEST for calculating parameters and plotting profiles for etch pits in nuclear track materials

    NASA Astrophysics Data System (ADS)

    Nikezic, D.; Yu, K. N.

    2006-01-01

    A computer program called TRACK_TEST for calculating parameters (lengths of the major and minor axes) and plotting profiles of etch pits in nuclear track materials resulting from light-ion irradiation and subsequent chemical etching is described. The programming steps are outlined, including calculations of alpha-particle ranges, determination of the distance along the particle trajectory penetrated by the chemical etchant, calculation of track coordinates, determination of the lengths of the major and minor axes, and determination of the contour of the track opening. Descriptions of the program are given, including the built-in V functions for the two commonly employed nuclear track materials commercially known as LR 115 (cellulose nitrate) and CR-39 (poly allyl diglycol carbonate) irradiated by alpha particles.
    Program summary
    Title of the program: TRACK_TEST
    Catalogue identifier: ADWT
    Program obtainable from: CPC Program Library, Queen's University of Belfast, N. Ireland
    Program summary URL: http://cpc.cs.qub.ac.uk/summaries/ADWT
    Computer: Pentium PC
    Operating systems: Windows 95+
    Programming language: Fortran 90
    Memory required to execute with typical data: 256 MB
    No. of lines in distributed program, including test data, etc.: 2739
    No. of bytes in distributed program, including test data, etc.: 204 526
    Distribution format: tar.gz
    External subprograms used: the entire code must be linked with the MSFLIB library
    Nature of problem: Fast heavy charged particles (such as alpha particles and other light ions) create latent tracks in some dielectric materials. After chemical etching in aqueous NaOH or KOH solutions, these tracks become visible under an optical microscope. The growth of a track is based on the simultaneous actions of the etchant on undamaged regions (with the bulk etch rate V_b) and along the particle track (with the track etch rate V_t). Growth of the track is described satisfactorily by these two parameters (V_b and V_t).
    Several models describing track development have been presented in the past, one of which is the model of Nikezic and Yu (2003) [D. Nikezic, K.N. Yu, Three-dimensional analytical determination of the track parameters. Over-etched tracks, Radiat. Meas. 37 (2003) 39-45] used in the present program. The present computer program has been written to calculate coordinates of points on the track wall and to determine other relevant track parameters. Solution method: Coordinates of points on the track wall, assuming normal incidence, were calculated using the method described by Fromm et al. (1988) [M. Fromm, A. Chambaudet, F. Membrey, Data bank for alpha particle tracks in CR39 with energies ranging from 0.5 to 5 MeV recording for various incident angles, Nucl. Tracks Radiat. Meas. 15 (1988) 115-118]. The track is then rotated through the incident angle in order to obtain the coordinates of the oblique track [D. Nikezic, K.N. Yu, Three-dimensional analytical determination of the track parameters. Over-etched tracks, Radiat. Meas. 37 (2003) 39-45; D. Nikezic, Three dimensional analytical determination of the track parameters, Radiat. Meas. 32 (2000) 277-282]. In this way, the track profile in two dimensions (2D) is obtained. In the next step, points in the track wall profile are rotated around the particle trajectory, yielding the circles that outline the track in three dimensions (3D). The intersection between the post-etching surface of the detector and the 3D track is the track opening (or track contour). Coordinates of the 2D and 3D track profiles and of the track opening are saved in separate output data files. Restrictions: The program cannot calculate track parameters for an incident angle of exactly 90°. The alpha-particle energy should be smaller than 10 MeV. Furthermore, the program cannot perform calculations for tracks in some extreme cases, such as very low incident energies or very small incident angles.
    Additional comments: This program is freeware, but publications arising from its use should cite the present paper and the paper describing the track growth model [D. Nikezic, K.N. Yu, Three-dimensional analytical determination of the track parameters. Over-etched tracks, Radiat. Meas. 37 (2003) 39-45]. Moreover, the references for the V functions used should also be cited. For the CR-39 detector: Function (1): S.A. Durrani, R.K. Bull, Solid State Nuclear Track Detection. Principles, Methods and Applications, Pergamon Press, 1987. Function (2): C. Brun, M. Fromm, M. Jouffroy, P. Meyer, J.E. Groetz, F. Abel, A. Chambaudet, B. Dorschel, D. Hermsdorf, R. Bretschneider, K. Kadner, H. Kuhne, Intercomparative study of the detection characteristics of the CR-39 SSNTD for light ions: Present status of the Besancon-Dresden approaches, Radiat. Meas. 31 (1999) 89-98. Function (3): K.N. Yu, F.M.F. Ng, D. Nikezic, Measuring depths of sub-micron tracks in a CR-39 detector from replicas using atomic force microscopy, Radiat. Meas. 40 (2005) 380-383. For the LR 115 detector: Function (1): S.A. Durrani, P.F. Green, The effect of etching conditions on the response of LR 115, Nucl. Tracks 8 (1984) 21-24. Function (2): C.W.Y. Yip, D. Nikezic, J.P.Y Ho, K.N. Yu, Chemical etching characteristics for cellulose nitrate, Mat. Chem. Phys. 95 (2005) 307-312. Running time: On the order of several minutes, depending on the input parameters and the resolution requested by the user.
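    For the idealized case underlying such programs (normal incidence and a constant etch-rate ratio V = V_t/V_b), the diameter of the track opening in the conical phase has the closed form D = 2h·sqrt((V-1)/(V+1)), where h = V_b·t is the removed layer thickness. The sketch below assumes that idealized geometry; it is not code from TRACK_TEST itself.

```python
import math

def track_diameter(v_ratio, bulk_etch_rate_um_h, etch_time_h):
    """Opening diameter (um) of an etched track for normal incidence and a
    constant etch-rate ratio V = V_t/V_b, valid in the conical track phase:
    D = 2*h*sqrt((V - 1)/(V + 1)), with h = V_b * t the removed-layer thickness."""
    if v_ratio <= 1.0:
        return 0.0  # no visible track unless the track etch rate exceeds the bulk rate
    h = bulk_etch_rate_um_h * etch_time_h
    return 2.0 * h * math.sqrt((v_ratio - 1.0) / (v_ratio + 1.0))
```

Real detectors have energy-dependent V(x), which is exactly why TRACK_TEST carries built-in V functions rather than a constant ratio.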

  1. A New ENDF/B-VII.0 Based Multigroup Cross-Section Library for Reactor Dosimetry

    NASA Astrophysics Data System (ADS)

    Alpan, F. A.; Anderson, S. L.

    2009-08-01

    The latest of the ENDF/B libraries, ENDF/B-VII.0, was released in December 2006. In this paper, the ENDF/B-VII.0 evaluations were used to generate a new coupled neutron/gamma multigroup library having the same group structure as VITAMIN-B6, i.e., the 199-neutron, 42-gamma group structure. The new library was generated utilizing NJOY99.259 for pre-processing and the AMPX modules for post-processing of cross sections. An ENDF/B-VI.3 based VITAMIN-B6-like library was also generated. The fine-group libraries and the ENDF/B-VI.3 based 47-neutron, 20-gamma group BUGLE-96 library were used with the discrete ordinates code DORT to obtain a three-dimensional synthesized flux distribution from r, r-θ, and r-z models for a standard Westinghouse 3-loop design reactor. Reaction rates were calculated for ex-vessel neutron dosimetry containing 63Cu(n,α)60Co, 46Ti(n,p)46Sc, 54Fe(n,p)54Mn, 58Ni(n,p)58Co, 238U(n,f)137Cs, 237Np(n,f)137Cs, and 59Co(n,γ)60Co (bare and cadmium-covered) reactions. Results were compared to measurements. In comparing the 199-neutron, 42-gamma group ENDF/B-VI.3 and ENDF/B-VII.0 libraries, it was observed that the ENDF/B-VI.3 based library results were in better agreement with measurements. There is a maximum difference of 7% (for the 63Cu(n,α)60Co reaction-rate calculation) between ENDF/B-VI.3 and ENDF/B-VII.0. Differences between the ENDF/B-VI.3 and ENDF/B-VII.0 libraries are due to the 16O, 1H, 90Zr, 91Zr, 92Zr, 238U, and 239Pu evaluations. Both the ENDF/B-VI.3 and ENDF/B-VII.0 library calculated reaction rates are within 20% of measurement and meet the criterion specified in U.S. Nuclear Regulatory Commission Regulatory Guide 1.190, "Calculational and Dosimetry Methods for Determining Pressure Vessel Neutron Fluence."
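    The dosimetry reaction rates above are obtained by folding a group-collapsed dosimeter cross section with the calculated multigroup flux, R = Σ_g σ_g·φ_g, and then forming a calculated-to-measured (C/M) ratio. A toy sketch of that folding with made-up 4-group numbers (not values from the paper):

```python
# Multigroup reaction-rate folding: R = sum_g sigma_g * phi_g, followed by the
# calculated/measured (C/M) check against the 20% criterion of Reg. Guide 1.190.
# The 4-group cross sections and fluxes below are illustrative only.

sigma = [0.60, 0.08, 0.004, 0.0002]      # group cross sections (barns)
phi = [2.0e9, 8.0e9, 3.0e10, 1.0e11]     # group fluxes (n/cm^2/s)

BARN = 1.0e-24                           # cm^2 per barn
rate = sum(s * BARN * p for s, p in zip(sigma, phi))   # reactions/atom/s

measured = 2.1e-15                       # hypothetical measured rate
c_over_m = rate / measured
print(f"C = {rate:.3e}, C/M = {c_over_m:.2f}, within 20%: {abs(c_over_m - 1) <= 0.2}")
```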

  2. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pritychenko, B.; Mughabghab, S.F.

    We present calculations of neutron thermal cross sections, Westcott factors, resonance integrals, Maxwellian-averaged cross sections, and astrophysical reaction rates for 843 ENDF materials using data from the major evaluated nuclear libraries and the European Activation File. Extensive analysis of newly evaluated neutron reaction cross sections and neutron covariances, together with improvements in data-processing techniques, motivated us to calculate nuclear-industry and neutron-physics quantities, produce s-process Maxwellian-averaged cross sections and astrophysical reaction rates, systematically calculate uncertainties, and provide additional insights on currently available neutron-induced reaction data. Nuclear reaction calculations are discussed and new results are presented. Due to space limitations, the present paper contains only the calculated Maxwellian-averaged cross sections and their uncertainties. The complete data sets for all results are published in the Brookhaven National Laboratory report.
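    The Maxwellian-averaged cross section (MACS) central to such work is MACS(kT) = (2/√π)·(kT)^(-2)·∫σ(E)·E·exp(-E/kT)dE. A numerical sketch using a toy 1/v cross section, for which the MACS equals σ evaluated at E = kT:

```python
import numpy as np

def macs(energy_eV, sigma_b, kT_eV):
    """Maxwellian-averaged cross section (barns):
    MACS(kT) = (2/sqrt(pi)) * (1/kT^2) * integral of sigma(E)*E*exp(-E/kT) dE,
    evaluated by trapezoidal quadrature on the supplied energy grid."""
    f = sigma_b * energy_eV * np.exp(-energy_eV / kT_eV)
    integral = np.sum(0.5 * (f[1:] + f[:-1]) * np.diff(energy_eV))
    return (2.0 / np.sqrt(np.pi)) * integral / kT_eV**2

kT = 30.0e3                               # 30 keV, the canonical s-process thermal energy
E_grid = np.geomspace(1.0e-2, 20.0 * kT, 40000)
sigma_1v = np.sqrt(0.0253 / E_grid)       # toy 1/v capture: 1 barn at 0.0253 eV

# For a pure 1/v cross section, MACS(kT) equals sigma evaluated at E = kT.
print(macs(E_grid, sigma_1v, kT), np.sqrt(0.0253 / kT))
```

The 1/v identity provides a convenient self-check before applying the same quadrature to tabulated evaluated cross sections.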

  3. The relationship of quantitative nuclear morphology to molecular genetic alterations in the adenoma-carcinoma sequence of the large bowel.

    PubMed Central

    Mulder, J. W.; Offerhaus, G. J.; de Feyter, E. P.; Floyd, J. J.; Kern, S. E.; Vogelstein, B.; Hamilton, S. R.

    1992-01-01

    The relationship of abnormal nuclear morphology to molecular genetic alterations that are important in colorectal tumorigenesis is unknown. Therefore, Feulgen-stained isolated nuclei from 22 adenomas and 42 carcinomas that had been analyzed for ras gene mutations and allelic deletions on chromosomes 5q, 18q, and 17p were characterized by computerized image analysis. Both nuclear area and the nuclear shape factor representing irregularity correlated with adenoma-carcinoma progression (r = 0.57 and r = 0.52, P < 0.0001), whereas standard nuclear texture, a parameter of chromatin homogeneity, was inversely correlated with progression (r = -0.80, P < 0.0001). The nuclear parameters were strongly interrelated (P < 0.0005). In multivariate analysis, the nuclear parameters were predominantly associated with adenoma-carcinoma progression (P ≤ 0.0001) and were not influenced significantly by the individual molecular genetic alterations. Nuclear texture, however, was inversely correlated with fractional allelic loss, a global measure of genetic changes, in carcinomas (r = -0.39, P = 0.011). The findings indicate that nuclear morphology in colorectal neoplasms is strongly related to tumor progression. Nuclear morphology and biologic behavior appear to be influenced by accumulated alterations in cancer-associated genes. PMID: 1357973

  4. Atmospheric and Fundamental Parameters of Stars in Hubble's Next Generation Spectral Library

    NASA Technical Reports Server (NTRS)

    Heap, Sally

    2010-01-01

    Hubble's Next Generation Spectral Library (NGSL) consists of R ≈ 1000 spectra of 374 stars of assorted temperature, gravity, and metallicity. We are presently working to determine the atmospheric and fundamental parameters of the stars from the NGSL spectra themselves via full-spectrum fitting of model spectra to the observed (extinction-corrected) spectrum over the full wavelength range, 0.2-1.0 micron. We use two grids of model spectra for this purpose: the very low-resolution spectral grid from Castelli-Kurucz (2004), and the grid from MARCS (2008). Both the observed spectrum and the MARCS spectra are first degraded in resolution to match the very low resolution of the Castelli-Kurucz models, so that our fitting technique is the same for both model grids. We will present our preliminary results and compare them with those from the Sloan/SEGUE Stellar Parameter Pipeline, ELODIE, MILES, etc.
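    Full-spectrum fitting of this kind amounts to minimizing χ² between the observed spectrum and each model-grid node, with a free flux-scale factor solved analytically. A toy sketch using synthetic Planck-like stand-ins, not NGSL data or the Castelli-Kurucz/MARCS grids:

```python
import numpy as np

# Toy full-spectrum fit: choose the model-grid node minimizing
# chi^2 = sum(((obs - scale*model)/err)^2), with the optimal flux scale
# solved analytically per node. Spectra here are crude synthetic stand-ins.

wave = np.linspace(0.2, 1.0, 400)            # wavelength grid (microns)

def model_spectrum(teff):
    """Planck-like stand-in for a model grid spectrum (not a real model)."""
    return wave**-2 / (np.exp(14400.0 / (wave * teff)) - 1.0)

grid_teff = [4000.0, 5000.0, 6000.0, 7000.0]
grid = {t: model_spectrum(t) for t in grid_teff}

rng = np.random.default_rng(0)
obs = 3.0 * model_spectrum(6000.0)           # "observed" star: node at 6000 K
obs = obs * (1.0 + 0.01 * rng.standard_normal(obs.size))
err = 0.01 * obs

def chi2(model):
    scale = np.sum(obs * model / err**2) / np.sum(model**2 / err**2)
    return np.sum(((obs - scale * model) / err) ** 2)

best = min(grid_teff, key=lambda t: chi2(grid[t]))
print("best-fit Teff:", best)
```

In practice the grid would first be convolved to the common (lowest) resolution, as the abstract describes, before the χ² comparison.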

  5. The Aldermaston Nuclear Data Library.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    1983-12-01

    UKNDL81 contains the 1981 editions of NDL1, NDL2, and NDL3, and also references nearly 500 UKNDL archived files. The user can see what neutron reaction cross sections are available in any given file and the energy range over which these data are tabulated by referring to the Story and Smith report cited in references.

  6. 76 FR 53500 - Notice of the Nuclear Regulatory Commission Issuance of Materials License SUA-1598 and Record of...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-08-26

    ... (ADAMS), which provides text and image files of the NRC's public documents in the NRC Library at http... considered, but eliminated from detailed analysis, include conventional uranium mining and milling, conventional mining and heap leach processing, alternate lixiviants, and alternative wastewater disposal...

  7. Preparation of Ferroelectric Samples for Electrical and Radiation Characterization Studies

    DTIC Science & Technology

    1991-12-01

    Nuclear Agency Attn Technology Dir Attn RAEE , LTC A. Constantine 5001 Eisenhower Ave Attn RAEE , MAJ G. Kweder Alexandria, VA 22333-0001 Attn RAEE , L...Palkuti Attn RAEE , LCDR L. Cohn Director Attn TITL, Technical Library Div Night Vision & Electro-Optics Lab.. LABCOM 680’ Telegraph RD Attn AMSEL-TMS

  8. Computer program calculates gamma ray source strengths of materials exposed to neutron fluxes

    NASA Technical Reports Server (NTRS)

    Heiser, P. C.; Ricks, L. O.

    1968-01-01

    Computer program contains an input library of nuclear data for 44 elements and their isotopes to determine the induced radioactivity for gamma emitters. Minimum input requires the irradiation history of the element, a four-energy-group neutron flux, specification of an alloy composition by elements, and selection of the output.

  9. Exclusive data-based modeling of neutron-nuclear reactions below 20 MeV

    NASA Astrophysics Data System (ADS)

    Savin, Dmitry; Kosov, Mikhail

    2017-09-01

    We are developing the CHIPS-TPT physics library for exclusive simulation of neutron-nuclear reactions below 20 MeV. Exclusive modeling reproduces each individual scattering event and thus requires conservation of energy, momentum, and quantum numbers in each reaction. Inclusive modeling reproduces only selected values while averaging over the others, and imposes no such constraints. Exclusive modeling therefore makes it possible to simulate additional quantities, such as secondary-particle correlations and gamma-line broadening, and to avoid artificial fluctuations. CHIPS-TPT is based on the CHIPS library formerly included in Geant4, which follows the exclusive approach, and extends it to incident neutrons with energies below 20 MeV. The NeutronHP model for neutrons below 20 MeV included in Geant4 follows the inclusive approach, like the well-known MCNP code. Unfortunately, the available data in this energy region are mostly presented in ENDF-6 format and are semi-inclusive. Imposing additional constraints on secondary particles complicates modeling, but it also makes it possible to detect inconsistencies in the input data and to avoid errors that may remain unnoticed in inclusive modeling.
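    The conservation constraints that distinguish exclusive modeling can be illustrated with non-relativistic two-body elastic kinematics; this is a generic textbook sketch, not CHIPS-TPT code:

```python
import math

def elastic_scatter(E_n, A, mu_cm):
    """Non-relativistic elastic scattering of a neutron (mass 1) on a nucleus
    of mass A at rest; mu_cm is the cosine of the CM scattering angle.
    Returns lab-frame velocity components (vx, vy) of neutron and recoil."""
    v = math.sqrt(2.0 * E_n)            # incident neutron speed (units m_n = 1)
    v_cm = v / (1.0 + A)                # centre-of-mass velocity
    v_n = A * v_cm                      # neutron speed in the CM frame
    s = math.sqrt(1.0 - mu_cm**2)
    # rotate in the CM frame, then boost back to the lab frame
    neutron = (v_n * mu_cm + v_cm, v_n * s)
    recoil = (v_cm - v_cm * mu_cm, -v_cm * s)   # nucleus CM speed equals v_cm
    return neutron, recoil

# Exclusive modeling demands that every sampled event balances energy and
# momentum exactly; an inclusive sampler never has to pass this check.
n, r = elastic_scatter(2.0, 238.0, 0.3)
px = n[0] + 238.0 * r[0]
py = n[1] + 238.0 * r[1]
E_tot = 0.5 * (n[0]**2 + n[1]**2) + 0.5 * 238.0 * (r[0]**2 + r[1]**2)
```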

  10. SCALE: A modular code system for performing standardized computer analyses for licensing evaluation. Functional modules F1--F8 -- Volume 2, Part 1, Revision 4

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Greene, N.M.; Petrie, L.M.; Westfall, R.M.

    SCALE--a modular code system for Standardized Computer Analyses for Licensing Evaluation--has been developed by Oak Ridge National Laboratory at the request of the US Nuclear Regulatory Commission. The SCALE system utilizes well-established computer codes and methods within standard analysis sequences that (1) allow an input format designed for the occasional user and/or novice, (2) automate the data processing and coupling between modules, and (3) provide accurate and reliable results. System development has been directed at problem-dependent cross-section processing and analysis of criticality safety, shielding, heat transfer, and depletion/decay problems. Since the initial release of SCALE in 1980, the code system has been heavily used for evaluation of nuclear fuel facility and package designs. This revision documents Version 4.2 of the system. The manual is divided into three volumes: Volume 1--for the control module documentation; Volume 2--for functional module documentation; and Volume 3--for documentation of the data libraries and subroutine libraries.

  11. CHEMKIN2. General Gas-Phase Chemical Kinetics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rupley, F.M.

    1992-01-24

    CHEMKIN is a high-level tool with which chemists can describe arbitrary gas-phase chemical reaction mechanisms and systems of governing equations. It remains, however, for the user to select and implement a solution method; this is not provided. CHEMKIN consists of two major components: the Interpreter and the Gas-phase Subroutine Library. The Interpreter reads a symbolic description of an arbitrary, user-specified chemical reaction mechanism. A data file is generated which forms a link to the Gas-phase Subroutine Library, a collection of about 200 modular subroutines which may be called to return thermodynamic properties, chemical production rates, derivatives of thermodynamic properties, derivatives of chemical production rates, or sensitivity parameters. Both single- and double-precision versions of CHEMKIN are included. Also provided is a set of FORTRAN subroutines for evaluating gas-phase transport properties such as thermal conductivities, viscosities, and diffusion coefficients. These properties are an important part of any computational simulation of a chemically reacting flow. The transport-properties subroutines are designed to be used in conjunction with the CHEMKIN Subroutine Library. The transport properties depend on the state of the gas and on certain molecular parameters. The parameters considered are the Lennard-Jones potential well depth and collision diameter, the dipole moment, the polarizability, and the rotational relaxation collision number.
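    Transport properties of the kind these subroutines return follow from Chapman-Enskog kinetic theory. A standalone sketch of the dilute-gas viscosity from Lennard-Jones parameters, using the Neufeld et al. (1972) collision-integral fit rather than the actual CHEMKIN routines:

```python
import math

def viscosity_LJ(T, M, sigma_A, eps_over_k):
    """Chapman-Enskog dilute-gas viscosity (Pa*s) of a pure species.
    T in K, M molar mass in g/mol, sigma_A Lennard-Jones collision diameter
    in Angstrom, eps_over_k well depth in K."""
    t_star = T / eps_over_k
    # Neufeld et al. (1972) fit to the reduced (2,2) collision integral
    omega22 = (1.16145 * t_star**-0.14874
               + 0.52487 * math.exp(-0.77320 * t_star)
               + 2.16178 * math.exp(-2.43787 * t_star))
    return 2.6693e-6 * math.sqrt(M * T) / (sigma_A**2 * omega22)

# N2 at 300 K with commonly tabulated LJ parameters (sigma = 3.798 A, eps/k = 71.4 K);
# the result lands close to the measured dilute-gas value of roughly 1.8e-5 Pa*s.
mu = viscosity_LJ(300.0, 28.014, 3.798, 71.4)
print(f"{mu:.3e} Pa*s")
```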

  12. Scanning electron microscope measurement of width and shape of 10nm patterned lines using a JMONSEL-modeled library.

    PubMed

    Villarrubia, J S; Vladár, A E; Ming, B; Kline, R J; Sunday, D F; Chawla, J S; List, S

    2015-07-01

    The width and shape of 10 nm to 12 nm wide lithographically patterned SiO2 lines were measured in the scanning electron microscope by fitting the measured intensity vs. position to a physics-based model in which the lines' widths and shapes are parameters. The approximately 32 nm pitch sample was patterned at Intel using a state-of-the-art pitch-quartering process. The narrow widths and asymmetrical shapes are representative of near-future-generation transistor gates. These pose a challenge: the narrowness because electrons landing near one edge may scatter out of the other, so that the intensity profile at each edge becomes width-dependent, and the asymmetry because the shape requires more parameters to describe and measure. Modeling was performed with JMONSEL (Java Monte Carlo Simulation of Secondary Electrons), which produces a predicted yield vs. position for a given sample shape and composition. The simulator produces a library of predicted profiles for varying sample geometry. Shape parameter values are adjusted until interpolation of the library with those values best matches the measured image. Profiles thereby determined agreed with those determined by transmission electron microscopy and critical-dimension small-angle x-ray scattering to better than 1 nm. Published by Elsevier B.V.

  13. SELECTIVE DISSEMINATION OF INFORMATION--REVIEW OF SELECTED SYSTEMS AND A DESIGN FOR ARMY TECHNICAL LIBRARIES. FINAL REPORT. ARMY TECHNICAL LIBRARY IMPROVEMENT STUDIES (ATLIS), REPORT NO. 8.

    ERIC Educational Resources Information Center

    BIVONA, WILLIAM A.

    THIS REPORT PRESENTS AN ANALYSIS OF OVER EIGHTEEN SMALL, INTERMEDIATE, AND LARGE SCALE SYSTEMS FOR THE SELECTIVE DISSEMINATION OF INFORMATION (SDI). SYSTEMS ARE COMPARED AND ANALYZED WITH RESPECT TO DESIGN CRITERIA AND THE FOLLOWING NINE SYSTEM PARAMETERS--(1) INFORMATION INPUT, (2) METHODS OF INDEXING AND ABSTRACTING, (3) USER INTEREST PROFILE…

  14. Nuclear envelope expansion is crucial for proper chromosomal segregation during a closed mitosis.

    PubMed

    Takemoto, Ai; Kawashima, Shigehiro A; Li, Juan-Juan; Jeffery, Linda; Yamatsugu, Kenzo; Elemento, Olivier; Nurse, Paul

    2016-03-15

    Here, we screened a library of 10,371 diverse molecules using a drug-sensitive fission yeast strain to identify compounds which cause defects in chromosome segregation during mitosis. We identified a phosphonium-ylide-based compound, Cutin-1, which inhibits nuclear envelope expansion and nuclear elongation during the closed mitosis of fission yeast, and showed that its target is the β-subunit of fatty acid synthase. A point mutation in the dehydratase domain of Fas1 conferred in vivo and in vitro resistance to Cutin-1. Time-lapse photomicrography showed that the bulk of the chromosomes were only transiently separated during mitosis, and nucleolar separation was defective. Subsequently, sister chromatids re-associated, leading to chromosomal mis-segregation. These segregation defects were reduced when the nuclear volume was increased and exacerbated when the nuclear volume was reduced. We propose that there needs to be sufficient nuclear volume to allow the nuclear elongation necessary during a closed mitosis for proper chromosome segregation, and that inhibition of fatty acid synthase compromises nuclear elongation and leads to defects in chromosomal segregation. © 2016. Published by The Company of Biologists Ltd.

  15. AQUATOX Frequently Asked Questions

    EPA Pesticide Factsheets

    Capabilities, Installation, Source Code, Example Study Files, Biotic State Variables, Initial Conditions, Loadings, Volume, Sediments, Parameters, Libraries, Ecotoxicology, Waterbodies, Link to Watershed Models, Output, Metals, Troubleshooting

  16. Computer Center CDC Libraries/NSRD (Subprograms).

    DTIC Science & Technology

    1984-06-01

    VALUES Y - ARRAY OR CORRESPONDING Y-VALUES N - NUMBER OF VALUES CM REQUIRED: IOOB ERROR MESSAGE ’ L=XXXXX, X=X.XXXXXXX E+YY, X NOT MONOTONE STOP SELF ...PARAMETERS (SUBSEQUENT REPORTS MAY BE UNSOLICITED) . PCRTP1 - REQUEST TERMINAL PARAMETERS (SUBSEQUENT REPORTS ONLY IN RESPOSE TO HOST REQUEST) DA - REQUEST

  17. Balancing novelty with confined chemical space in modern drug discovery.

    PubMed

    Medina-Franco, José L; Martinez-Mayorga, Karina; Meurice, Nathalie

    2014-02-01

    The concept of chemical space has broad applications in drug discovery. In response to the needs of drug discovery campaigns, different approaches are followed to efficiently populate, mine, and select relevant chemical spaces that overlap with biologically relevant chemical spaces. This paper reviews major trends in current drug discovery and their impact on the mining and population of chemical space. We also survey different approaches to developing screening libraries with confined chemical spaces that balance physicochemical properties. In this context, the confinement is guided by criteria that can be divided into two broad categories: i) library design focused on a relevant therapeutic target or disease, and ii) library design focused on the chemistry or a desired molecular function. The design and development of chemical libraries should be associated with the specific purpose of the library and the project goals. The high complexity of drug discovery and the inherent imperfection of individual experimental and computational technologies prompt the integration of complementary library design and screening approaches to expedite the identification of new and better drugs. Library design approaches including diversity-oriented synthesis, biology-oriented synthesis, or combinatorial library design, to name a few, and the design of focused libraries driven by target/disease, chemical structure, or molecular function are more efficient if they are guided by multi-parameter optimization. In this context, consideration of pharmaceutically relevant properties is essential for balancing novelty with chemical space in drug discovery.
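    Multi-parameter optimization of a screening library is commonly implemented as a desirability score that combines property criteria. The sketch below is a generic, hypothetical illustration; the property ranges and compounds are invented:

```python
import math

# Hypothetical multi-parameter desirability filter for a screening library.
# Each property maps to a [0, 1] desirability; the compound score is the
# geometric mean, so one unacceptable property strongly penalizes the compound.

def desirability(value, lo, hi):
    """1.0 inside the preferred range [lo, hi], decaying linearly outside."""
    if value < lo:
        return max(0.0, 1.0 - (lo - value) / lo)
    if value > hi:
        return max(0.0, 1.0 - (value - hi) / hi)
    return 1.0

ranges = {"mw": (200.0, 500.0), "logp": (0.0, 5.0), "hbd": (0.0, 5.0)}

def score(mw, logp, hbd):
    ds = [desirability(mw, *ranges["mw"]),
          desirability(logp, *ranges["logp"]),
          desirability(hbd, *ranges["hbd"])]
    return math.prod(ds) ** (1.0 / len(ds))

# (name, molecular weight, logP, H-bond donors): illustrative entries
library = [
    ("cmpd-1", 320.0, 2.1, 1.0),
    ("cmpd-2", 610.0, 5.9, 4.0),
    ("cmpd-3", 410.0, 3.4, 2.0),
]
ranked = sorted(library, key=lambda c: -score(*c[1:]))
```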

  18. Determination of nuclear quadrupolar parameters using singularities in field-swept NMR patterns.

    PubMed

    Ichijo, Naoki; Takeda, Kazuyuki; Yamada, Kazuhiko; Takegoshi, K

    2016-10-07

    We propose a simple data-analysis scheme to determine the coupling constant and the asymmetry parameter of nuclear quadrupolar interactions in field-swept nuclear magnetic resonance (NMR) of static powder samples. This approach correlates the quadrupolar parameters with the positions of the singularities, which can readily be identified as sharp peaks in the field-swept pattern. Moreover, the parameters can be determined without quantitative acquisition and elaborate calculation of the overall profile of the pattern. Since both experimental and computational efforts are significantly reduced, the approach presented in this work will enhance the power of field-swept NMR for as yet unexplored quadrupolar nuclei. We demonstrate this approach for 33S in α-S8 and 35Cl in chloranil. The accuracy of the obtained quadrupolar parameters is also discussed.

  19. Hubble's Next Generation Spectral Library

    NASA Astrophysics Data System (ADS)

    Heap, Sara R.; Lindler, D.

    2008-03-01

    Spectroscopic surveys of galaxies at z ≈ 1 or more bring the rest-frame ultraviolet into view of large, ground-based telescopes. This spectral region is rich in diagnostics, but these diagnostics have not yet been calibrated in terms of the properties of the responsible stellar population(s). Such calibrations are now possible with Hubble's Next Generation Spectral Library (NGSL). This library contains UV-optical spectra (0.2-1.0 microns) of 378 stars having a wide range in temperature, luminosity, and metallicity. We have derived the basic stellar parameters from the optical spectral region (0.35-1.0 microns) and are using them to calibrate UV spectral diagnostic indices and colors.

  20. Modeling of frequency agile devices: development of PKI neuromodeling library based on hierarchical network structure

    NASA Astrophysics Data System (ADS)

    Sanchez, P.; Hinojosa, J.; Ruiz, R.

    2005-06-01

    Recently, neuromodeling methods for microwave devices have been developed. These methods are suitable for generating models of novel devices, and they allow fast and accurate simulations and optimizations. However, developing libraries with these methods is a formidable task, since they require massive input-output data provided by an electromagnetic simulator or measurements, together with repeated artificial neural network (ANN) training. This paper presents a strategy that reduces the cost of library development while retaining the advantages of the neuromodeling methods: high accuracy, a large range of geometrical and material parameters, and reduced CPU time. The library models are developed from a set of base prior-knowledge-input (PKI) models, which capture the characteristics common to all the models in the library, and high-level ANNs which produce the library model outputs from the base PKI models. The technique is illustrated for a microwave multiconductor tunable phase shifter using anisotropic substrates. Closed-form relationships have been developed and are presented in this paper. The results show good agreement with the expected ones.

  1. Nuclear Decay Data Evaluations at IFIN-HH, Romania

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Luca, A., E-mail: aluca@nipne.ro

    2014-06-15

    An IAEA Coordinated Research Project (CRP) on Updated Decay Data Library for Actinides was implemented during the period 2005-2012. The author participated in the CRP as a representative of the Horia Hulubei National Institute of Physics and Nuclear Engineering (IFIN-HH), the Radionuclide Metrology Laboratory. Decay data for five actinide nuclides were evaluated by the author, according to the procedures and rules of the international cooperation Decay Data Evaluation Project (DDEP): {sup 236}U, {sup 234}Th, {sup 228}Ra, {sup 211}Bi and {sup 211}Po. The most important results, conclusions and some recommendations of the evaluator are presented. The IFIN-HH involvement in several new international and national research projects in the field is briefly mentioned; new evaluations and experimental determination of some nuclear decay data (photon absolute emission probability, half-life) for nuclear medicine applications are foreseen.

  2. Extension of the energy range of the experimental activation cross-section data of longer-lived products of proton-induced nuclear reactions on dysprosium up to 65 MeV.

    PubMed

    Tárkányi, F; Ditrói, F; Takács, S; Hermanne, A; Ignatyuk, A V

    2015-04-01

    Activation cross-section data of longer-lived products of proton-induced nuclear reactions on dysprosium were extended up to 65 MeV by using the stacked-foil irradiation and gamma spectrometry experimental methods. Experimental cross-section data for the formation of the radionuclides (159)Dy, (157)Dy, (155)Dy, (161)Tb, (160)Tb, (156)Tb, (155)Tb, (154m2)Tb, (154m1)Tb, (154g)Tb, (153)Tb, (152)Tb and (151)Tb are reported in the 36-65 MeV energy range and compared with an old dataset from 1964. The experimental data were also compared with the results of cross-section calculations of the ALICE and EMPIRE nuclear model codes and of the TALYS nuclear reaction model code, as listed in the latest on-line library TENDL-2013. Copyright © 2015. Published by Elsevier Ltd.
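    For orientation, cross sections in such activation experiments follow, per foil, from the standard thin-target activation equation, sigma = A_EOB / [N_t phi (1 - exp(-lambda t_irr))]. The sketch below uses invented placeholder values for the end-of-bombardment activity, areal density and proton flux; none are the paper's numbers:

```python
import math

# Thin-target activation: A_EOB = N_t * phi * sigma * (1 - exp(-lambda*t_irr)),
# solved for sigma. Every numerical value below is an invented placeholder.
def cross_section_barn(a_eob, n_t, phi, half_life_s, t_irr_s):
    lam = math.log(2.0) / half_life_s            # decay constant [1/s]
    saturation = 1.0 - math.exp(-lam * t_irr_s)  # build-up during irradiation
    sigma_cm2 = a_eob / (n_t * phi * saturation)
    return sigma_cm2 / 1.0e-24                   # 1 barn = 1e-24 cm^2

sigma = cross_section_barn(
    a_eob=2.0e4,        # end-of-bombardment activity [Bq] (placeholder)
    n_t=1.0e20,         # target nuclei per cm^2 in one foil (placeholder)
    phi=6.24e12,        # ~1 uA proton beam over 1 cm^2 [protons/cm^2/s]
    half_life_s=5.3 * 3600.0,   # a few-hour product half-life (placeholder)
    t_irr_s=3600.0,     # irradiation time [s]
)
```

    In practice the measured gamma-line activity is also corrected for decay during cooling and counting, detector efficiency and gamma intensity; those factors are omitted here for brevity.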

  3. New Neutron Cross-Section Measurements at ORELA for Improved Nuclear Data Calculations

    NASA Astrophysics Data System (ADS)

    Guber, K. H.; Leal, L. C.; Sayer, R. O.; Koehler, P. E.; Valentine, T. E.; Derrien, H.; Harvey, J. A.

    2005-05-01

    Many older neutron cross-section evaluations from libraries such as ENDF/B-VI or JENDL-3.2 exhibit deficiencies or do not cover energy ranges that are important for criticality safety applications. These deficiencies may occur in the resolved and unresolved-resonance regions. Consequently, these evaluated data may not be adequate for nuclear criticality calculations where effects such as self-shielding, multiple scattering, or Doppler broadening are important. To support the Nuclear Criticality Predictability Program, neutron cross-section measurements have been initiated at the Oak Ridge Electron Linear Accelerator (ORELA). ORELA is the only high-power white neutron source with excellent time resolution still operating in the United States. It is ideally suited to measure fission, neutron total, and capture cross sections in the energy range from 1 eV to ~600 keV, which is important for many nuclear criticality safety applications.

  4. Calculations of Nuclear Astrophysics and Californium Fission Neutron Spectrum Averaged Cross Section Uncertainties Using ENDF/B-VII.1, JEFF-3.1.2, JENDL-4.0 and Low-fidelity Covariances

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pritychenko, B., E-mail: pritychenko@bnl.gov

    Nuclear astrophysics and californium fission neutron spectrum averaged cross sections and their uncertainties for ENDF materials have been calculated. Absolute values were deduced with Maxwellian and Mannhart spectra, while uncertainties are based on ENDF/B-VII.1, JEFF-3.1.2, JENDL-4.0 and Low-Fidelity covariances. These quantities are compared with available data, independent benchmarks, EXFOR library, and analyzed for a wide range of cases. Recommendations for neutron cross section covariances are given and implications are discussed.
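    The Maxwellian spectrum-averaged cross section used in such calculations is MACS(kT) = (2/sqrt(pi)) * Int[sigma(E) E exp(-E/kT) dE] / Int[E exp(-E/kT) dE]. A minimal numerical sketch, using a toy 1/v cross section because its Maxwellian average must equal sigma(kT), which makes the integration easy to verify (all values invented):

```python
import numpy as np

# MACS(kT) = (2/sqrt(pi)) * Int(sigma(E) E exp(-E/kT)) / Int(E exp(-E/kT)).
kT = 30.0                                    # keV, typical s-process energy
E = np.linspace(1e-6, 40.0 * kT, 400_001)    # keV grid

sigma0 = 0.5                                 # barn at E0 = kT (toy value)
sigma = sigma0 * np.sqrt(kT / E)             # pure 1/v cross-section shape

def trapz(y, x):                             # plain trapezoidal rule
    return float(np.sum(0.5 * (y[1:] + y[:-1]) * np.diff(x)))

num = trapz(sigma * E * np.exp(-E / kT), E)
den = trapz(E * np.exp(-E / kT), E)          # analytically (kT)^2
macs = (2.0 / np.sqrt(np.pi)) * num / den    # should equal sigma(kT) = 0.5
```

    For evaluated ENDF cross sections the same average is taken over the pointwise data; the Mannhart spectrum average mentioned above replaces the Maxwellian weight with the evaluated californium fission spectrum.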

  5. Minutes of the HUSKY PUP Prefielding Instrumentation Meeting Held on 28- 29 April 1975 at Kirtland AFB, New Mexico

    DTIC Science & Technology

    1976-02-26

    [OCR text of the scanned document is largely unrecoverable. Identifiable content: Defense Nuclear Agency Test Directorate, Kirtland AFB, New Mexico, 28-29 April 1975; Defense Nuclear Agency Technical Library; signatory Henry J. Thayer, LTC, USA, Chief, Engineering Branch; distribution list including Lockheed Missiles and Space Company, Sunnyvale, CA.]

  6. Constraints on the nuclear equation of state from nuclear masses and radii in a Thomas-Fermi meta-modeling approach

    NASA Astrophysics Data System (ADS)

    Chatterjee, D.; Gulminelli, F.; Raduta, Ad. R.; Margueron, J.

    2017-12-01

    The question of correlations among empirical equation of state (EoS) parameters constrained by nuclear observables is addressed in a Thomas-Fermi meta-modeling approach. A recently proposed meta-modeling for the nuclear EoS in nuclear matter is augmented with a single finite size term to produce a minimal unified EoS functional able to describe the smooth part of the nuclear ground state properties. This meta-model can reproduce the predictions of a large variety of models, and interpolate continuously between them. An analytical approximation to the full Thomas-Fermi integrals is further proposed giving a fully analytical meta-model for nuclear masses. The parameter space is sampled and filtered through the constraint of nuclear mass reproduction with Bayesian statistical tools. We show that this simple analytical meta-modeling has a predictive power on masses, radii, and skins comparable to full Hartree-Fock or extended Thomas-Fermi calculations with realistic energy functionals. The covariance analysis on the posterior distribution shows that no physical correlation is present between the different EoS parameters. Concerning nuclear observables, a strong correlation between the slope of the symmetry energy and the neutron skin is observed, in agreement with previous studies.
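    The Bayesian filtering step described above (sample the parameter space from broad priors, then weight each sample by how well the meta-model reproduces the masses) can be sketched with a deliberately crude stand-in for the meta-model. The parameter names, the linear toy "mass" formula and all numbers below are invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(1)

# Invented stand-in for the meta-model: a "mass" depending on two
# empirical parameters (saturation energy E_sat and symmetry slope L).
def toy_mass(e_sat, l_sym):
    return -16.0 + 0.8 * (e_sat + 16.0) + 0.01 * (l_sym - 60.0)

# Prior sampling of the parameter space.
e_sat = rng.normal(-16.0, 0.5, 100_000)      # MeV (toy prior)
l_sym = rng.normal(60.0, 20.0, 100_000)      # MeV (toy prior)

# Filter through a pseudo mass datum with Gaussian errors; the importance
# weights play the role of the Bayesian mass-reproduction constraint.
m_obs, err = -16.0, 0.05
w = np.exp(-0.5 * ((toy_mass(e_sat, l_sym) - m_obs) / err) ** 2)
w /= w.sum()

prior_std = float(e_sat.std())
mean_post = float(np.sum(w * e_sat))
post_std = float(np.sqrt(np.sum(w * (e_sat - mean_post) ** 2)))
```

    The posterior spread of the constrained parameter shrinks relative to its prior; the same weighted samples feed the covariance analysis the paper performs on the real mass data.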

  7. Assessment of antibody library diversity through next generation sequencing and technical error compensation

    PubMed Central

    Lisi, Simonetta; Chirichella, Michele; Arisi, Ivan; Goracci, Martina; Cremisi, Federico; Cattaneo, Antonino

    2017-01-01

    Antibody libraries are important resources for deriving antibodies for a wide range of applications, from structural and functional studies to intracellular protein interference studies to the development of new diagnostics and therapeutics. Whatever the goal, the key parameter for an antibody library is its complexity (also known as diversity), i.e. the number of distinct elements in the collection, which directly reflects the probability of finding in the library an antibody against a given antigen of sufficiently high affinity. Quantitative evaluation of antibody library complexity and quality has long been inadequately addressed, due to the high similarity and length of the sequences of the library. Complexity was usually inferred from the transformation efficiency and tested by fingerprinting and/or sequencing of a few hundred random library elements. Inferring complexity from such a small sampling is, however, very rudimentary and gives limited information about the real diversity, because complexity does not scale linearly with sample size. Next-generation sequencing (NGS) has opened new ways to tackle antibody library complexity assessment. However, much remains to be done to fully exploit the potential of NGS for the quantitative analysis of antibody repertoires and to overcome current limitations. To obtain a more reliable estimate of antibody library complexity, here we show a new, PCR-free NGS approach to sequencing antibody libraries on the Illumina platform, coupled to a new bioinformatic analysis and software (Diversity Estimator of Antibody Library, DEAL) that allows the complexity to be reliably estimated, taking the sequencing error into consideration. PMID:28505201
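    The claim that complexity cannot be read off linearly from a small sample can be illustrated with the textbook Chao1 richness estimator. DEAL, the paper's software, implements a more elaborate, sequencing-error-aware analysis; the sketch below is only the generic estimator applied to an idealized uniform library:

```python
import numpy as np
from collections import Counter

rng = np.random.default_rng(2)
D = 10_000                                  # true complexity of the toy library

def observed_and_chao1(n_reads):
    # Draw reads uniformly from D distinct clones (idealized library).
    counts = Counter(rng.integers(0, D, n_reads).tolist())
    s_obs = len(counts)                     # distinct clones actually seen
    f1 = sum(1 for c in counts.values() if c == 1)   # singletons
    f2 = sum(1 for c in counts.values() if c == 2)   # doubletons
    # Chao1: observed richness plus an unseen-species correction.
    chao1 = s_obs + f1 * f1 / (2 * f2) if f2 else float(s_obs)
    return s_obs, chao1

s5k, chao5k = observed_and_chao1(5_000)     # shallow sampling
s50k, _ = observed_and_chao1(50_000)        # 10x deeper sampling
```

    With 5,000 reads, far fewer than 5,000 distinct clones are seen, and a tenfold deeper run does not see ten times as many: observed diversity saturates, which is why an abundance-based estimator rather than a raw count is needed.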

  8. Assessment of antibody library diversity through next generation sequencing and technical error compensation.

    PubMed

    Fantini, Marco; Pandolfini, Luca; Lisi, Simonetta; Chirichella, Michele; Arisi, Ivan; Terrigno, Marco; Goracci, Martina; Cremisi, Federico; Cattaneo, Antonino

    2017-01-01

    Antibody libraries are important resources for deriving antibodies for a wide range of applications, from structural and functional studies to intracellular protein interference studies to the development of new diagnostics and therapeutics. Whatever the goal, the key parameter for an antibody library is its complexity (also known as diversity), i.e. the number of distinct elements in the collection, which directly reflects the probability of finding in the library an antibody against a given antigen of sufficiently high affinity. Quantitative evaluation of antibody library complexity and quality has long been inadequately addressed, due to the high similarity and length of the sequences of the library. Complexity was usually inferred from the transformation efficiency and tested by fingerprinting and/or sequencing of a few hundred random library elements. Inferring complexity from such a small sampling is, however, very rudimentary and gives limited information about the real diversity, because complexity does not scale linearly with sample size. Next-generation sequencing (NGS) has opened new ways to tackle antibody library complexity assessment. However, much remains to be done to fully exploit the potential of NGS for the quantitative analysis of antibody repertoires and to overcome current limitations. To obtain a more reliable estimate of antibody library complexity, here we show a new, PCR-free NGS approach to sequencing antibody libraries on the Illumina platform, coupled to a new bioinformatic analysis and software (Diversity Estimator of Antibody Library, DEAL) that allows the complexity to be reliably estimated, taking the sequencing error into consideration.

  9. Qualification of APOLLO2 BWR calculation scheme on the BASALA mock-up

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Vaglio-Gaudard, C.; Santamarina, A.; Sargeni, A.

    2006-07-01

    A new neutronic APOLLO2/MOC/SHEM/CEA2005 calculation scheme for BWR applications has been developed by the French 'Commissariat a l'Energie Atomique'. This scheme is based on the latest calculation methodology (accurate mutual- and self-shielding formalism, MOC treatment of the transport equation) and the recent JEFF3.1 nuclear data library. This paper presents the experimental validation of this new calculation scheme on the BASALA BWR mock-up. The BASALA programme is devoted to measurements of the physical parameters of high-moderation, 100% MOX BWR cores, in hot and cold conditions. The experimental validation of the calculation scheme deals with core reactivity, fission rate maps, reactivity worth of void and absorbers (cruciform control blades and Gd pins), as well as the temperature coefficient. Results of the analysis using APOLLO2/MOC/SHEM/CEA2005 show an overestimation of the core reactivity by 600 pcm for BASALA-Hot and 750 pcm for BASALA-Cold. Reactivity worths of gadolinium poison pins and of hafnium or B{sub 4}C control blades are predicted by the APOLLO2 calculation within 2% accuracy. Furthermore, the radial power map is well predicted for every core configuration, including the Void configuration and the Hf / B{sub 4}C configurations: fission rates in the central assembly are calculated within the {+-}2% experimental uncertainty for the reference cores. The C/E bias on the isothermal Moderator Temperature Coefficient, using the CEA2005 library based on the JEFF3.1 file, amounts to -1.7{+-}0.3 pcm/deg. C over the range 10 deg. C-80 deg. C. (authors)

  10. TaqMan Real-Time PCR Assays To Assess Arbuscular Mycorrhizal Responses to Field Manipulation of Grassland Biodiversity: Effects of Soil Characteristics, Plant Species Richness, and Functional Traits▿ †

    PubMed Central

    König, Stephan; Wubet, Tesfaye; Dormann, Carsten F.; Hempel, Stefan; Renker, Carsten; Buscot, François

    2010-01-01

    Large-scale (temporal and/or spatial) molecular investigations of the diversity and distribution of arbuscular mycorrhizal fungi (AMF) require considerable sampling efforts and high-throughput analysis. To facilitate such efforts, we have developed a TaqMan real-time PCR assay to detect and identify AMF in environmental samples. First, we screened the diversity in clone libraries, generated by nested PCR, of the nuclear ribosomal DNA internal transcribed spacer (ITS) of AMF in environmental samples. We then generated probes and forward primers based on the detected sequences, enabling AMF sequence type-specific detection in TaqMan multiplex real-time PCR assays. In comparisons to conventional clone library screening and Sanger sequencing, the TaqMan assay approach provided similar accuracy but higher sensitivity with cost and time savings. The TaqMan assays were applied to analyze the AMF community composition within plots of a large-scale plant biodiversity manipulation experiment, the Jena Experiment, primarily designed to investigate the interactive effects of plant biodiversity on element cycling and trophic interactions. The results show that environmental variables hierarchically shape AMF communities and that the sequence type spectrum is strongly affected by previous land use and disturbance, which appears to favor disturbance-tolerant members of the genus Glomus. The AMF species richness of disturbance-associated communities can be largely explained by richness of plant species and plant functional groups, while plant productivity and soil parameters appear to have only weak effects on the AMF community. PMID:20418424

  11. Code C# for chaos analysis of relativistic many-body systems with reactions

    NASA Astrophysics Data System (ADS)

    Grossu, I. V.; Besliu, C.; Jipa, Al.; Stan, E.; Esanu, T.; Felea, D.; Bordeianu, C. C.

    2012-04-01

    In this work we present a reaction module for the “Chaos Many-Body Engine” (Grossu et al., 2010 [1]). Following our goal of creating a customizable, object-oriented code library, the list of all possible reactions, including the corresponding properties (particle types, probability, cross section, particle lifetime, etc.), can be supplied as a parameter using a specific XML input file. Inspired by the Poincaré section, we also propose the “Clusterization Map” as a new, intuitive analysis method for many-body systems. As an example, we implemented a numerical toy model for relativistic nuclear collisions at 4.5 A GeV/c (the SKM200 Collaboration). Encouraging agreement with experimental data was obtained for momentum, energy, rapidity, and angular π distributions. Catalogue identifier: AEGH_v2_0 Program summary URL:http://cpc.cs.qub.ac.uk/summaries/AEGH_v2_0.html Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html No. of lines in distributed program, including test data, etc.: 184 628 No. of bytes in distributed program, including test data, etc.: 7 905 425 Distribution format: tar.gz Programming language: Visual C#.NET 2005 Computer: PC Operating system: Net Framework 2.0 running on MS Windows Has the code been vectorized or parallelized?: Each many-body system is simulated on a separate execution thread. One processor used for each many-body system. RAM: 128 Megabytes Classification: 6.2, 6.5 Catalogue identifier of previous version: AEGH_v1_0 Journal reference of previous version: Comput. Phys. Comm. 181 (2010) 1464 External routines: Net Framework 2.0 Library Does the new version supersede the previous version?: Yes Nature of problem: Chaos analysis of three-dimensional, relativistic many-body systems with reactions. Solution method: Second order Runge-Kutta algorithm for simulating relativistic many-body systems with reactions. 
Object-oriented solution, easy to reuse, extend and customize, in any development environment that accepts .NET assemblies or COM components. Treatment of two-particle reactions and decays. For each particle, calculation of the time measured in the particle's reference frame, according to its instantaneous velocity. Possibility to dynamically add particle properties (spin, isospin, etc.) and reactions/decays, using a specific XML input file. Basic support for Monte Carlo simulations. Implementation of: Lyapunov exponent, “fragmentation level”, “average system radius”, “virial coefficient”, “clusterization map”, and an energy-conservation precision test. As an example of use, we implemented a toy model for relativistic nuclear collisions at 4.5 A GeV/c. Reasons for new version: Following our goal of applying chaos theory to nuclear relativistic collisions at 4.5 A GeV/c, we developed a reaction module integrated with the Chaos Many-Body Engine. In the previous version, inheriting the Particle class was the only way to implement additional particle properties (spin, isospin, and so on). In the new version, particle properties can be dynamically added using a dictionary object. The application was improved to calculate the time measured in the reference frame of each particle. The supported processes are: two-particle reactions a+b→c+d; decays a→c+d; stimulated decays; and more complicated schemas, implemented as various combinations of the previous reactions. Following our goal of creating a flexible application, the reaction list, including the corresponding properties (cross sections, particle lifetimes, etc.), can be supplied as a parameter using a specific XML configuration file. The simulation output files were modified for systems with reactions, while also assuring backward compatibility. We propose the “Clusterization Map” as a new investigation method for many-body systems. 
The multi-dimensional Lyapunov Exponent was adapted in order to be used for systems with variable structure. Basic support for Monte Carlo simulations was also added. Additional comments: Windows forms application for testing the engine. Easy copy/paste based deployment method. Running time: Quadratic complexity.
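    The second-order Runge-Kutta scheme named in the solution method is the generic midpoint rule. A language-neutral sketch (Python here, rather than the distributed C#), together with the kind of energy-conservation precision test the summary mentions, applied to a harmonic oscillator rather than a relativistic system:

```python
import math

# Generic second-order Runge-Kutta (midpoint) step for dy/dt = f(t, y),
# the integration scheme named in the program summary.
def rk2_step(f, t, y, h):
    k1 = f(t, y)
    k2 = f(t + 0.5 * h, [yi + 0.5 * h * ki for yi, ki in zip(y, k1)])
    return [yi + h * ki for yi, ki in zip(y, k2)]

# Demo system: harmonic oscillator x'' = -x, written as state (x, v).
def osc(t, y):
    x, v = y
    return [v, -x]

y, h = [1.0, 0.0], 0.001
for i in range(10_000):                       # integrate to t = 10
    y = rk2_step(osc, i * h, y, h)

# Energy-conservation precision test: E = (x^2 + v^2)/2 should stay 0.5.
energy_drift = abs((y[0] ** 2 + y[1] ** 2) / 2.0 - 0.5)
```

    The residual drift measures the integrator's accuracy, which is the same diagnostic idea the engine uses on its many-body trajectories.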

  12. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Blom, Philip Stephen; Marcillo, Omar Eduardo; Euler, Garrett Gene

    InfraPy is a Python-based analysis toolkit being developed at LANL. The algorithms are intended for ground-based nuclear detonation detection applications: detecting, locating, and characterizing explosive sources using infrasonic observations. The implementation is usable as a stand-alone Python library or as a command-line-driven tool operating directly on a database. With multiple scientists working on the project, we have begun using a LANL git repository for collaborative development and version control. Current and planned work on InfraPy focuses on the development of new algorithms and propagation models. Collaboration with Southern Methodist University (SMU) has helped identify bugs and limitations of the algorithms. Current usage development focuses on library imports and the CLI.

  13. Variance Reduction Factor of Nuclear Data for Integral Neutronics Parameters

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chiba, G., E-mail: go_chiba@eng.hokudai.ac.jp; Tsuji, M.; Narabayashi, T.

    We propose a new quantity, a variance reduction factor, to identify nuclear data for which further improvements are required to reduce uncertainties of target integral neutronics parameters. Important energy ranges can also be identified with this variance reduction factor. Variance reduction factors are calculated for several integral neutronics parameters. The usefulness of the variance reduction factors is demonstrated.
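    Although the paper's exact definition of the variance reduction factor is not reproduced here, the underlying machinery is the standard sandwich rule, variance = S M Sᵀ, with S the sensitivity vector of the integral parameter and M the nuclear-data covariance matrix. The sketch below re-evaluates the sandwich rule after shrinking one energy group's uncertainty; it is a generic construction with invented numbers, and the paper's precise factor may be defined differently:

```python
import numpy as np

# Sandwich rule for the uncertainty of an integral neutronics parameter:
# variance = S @ M @ S, with S the sensitivity vector and M the
# nuclear-data (relative) covariance matrix. Three-group toy numbers.
S = np.array([0.6, 0.3, 0.1])                   # sensitivities (hypothetical)
M = np.diag([0.04 ** 2, 0.02 ** 2, 0.01 ** 2])  # 4%, 2%, 1% uncertainties

var_total = S @ M @ S

# Halve the first group's data uncertainty (quarter its variance) and
# re-evaluate: the ratio plays the role of a variance reduction factor.
M_improved = M.copy()
M_improved[0, 0] *= 0.25
var_improved = S @ M_improved @ S
reduction_factor = float(var_improved / var_total)
```

    Repeating this group by group ranks which nuclear data, and which energy ranges, would most reduce the uncertainty of the target parameter if improved.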

  14. Flight Software Math Library

    NASA Technical Reports Server (NTRS)

    McComas, David

    2013-01-01

    The flight software (FSW) math library is a collection of reusable math components that provides typical math utilities required by spacecraft flight software. These utilities are intended to increase flight software quality, reusability, and maintainability by providing a set of consistent, well-documented, and tested math utilities. This library only has dependencies on ANSI C, so it is easily ported. Prior to this library, each mission typically created its own math utilities using ideas/code from previous missions. Part of the reason for this is that math libraries can be written with different strategies in areas like error handling, parameter order, naming conventions, etc. Changing the utilities for each mission introduces risks and costs. The obvious risks and costs are that the utilities must be coded and revalidated. The hidden risks and costs arise from miscommunication between engineers. These utilities must be understood by both the flight software engineers and other subsystem engineers (primarily guidance, navigation and control). The FSW math library is part of a larger goal to produce a library of reusable Guidance, Navigation and Control (GN&C) FSW components. A GN&C FSW library cannot be created unless a standardized math basis is created. This library solves the standardization problem by defining a common feature set and establishing policies for the library's design. This allows the library to be maintained with the same strategy used in its initial development, which supports a library of reusable GN&C FSW components. The FSW math library is written for an embedded software environment in C. This places restrictions on the language features that can be used by the library. Another advantage of the FSW math library is that it can be used in the FSW as well as in other environments like the GN&C analysts' simulators. This helps communication between the teams because they can use the same utilities with the same feature set and syntax.

  15. SP_Ace: A new code to estimate Teff, log g, and elemental abundances

    NASA Astrophysics Data System (ADS)

    Boeche, C.

    2016-09-01

    SP_Ace is a FORTRAN95 code that derives stellar parameters and elemental abundances from stellar spectra. To derive these parameters, SP_Ace neither measures equivalent widths of lines nor uses templates of synthetic spectra; instead, it employs a new method based on a library of General Curve-Of-Growths. To date SP_Ace works on the wavelength ranges 5212-6860 Å and 8400-8921 Å, at spectral resolutions R=2000-20000. Extensions of these limits are possible. SP_Ace is a highly automated code suitable for application to large spectroscopic surveys. A web front end to this service is publicly available at http://dc.g-vo.org/SP_ACE together with the library and the binary code.

  16. A technical assessment of the porcine ejaculated spermatozoa for a sperm-specific RNA-seq analysis.

    PubMed

    Gòdia, Marta; Mayer, Fabiana Quoos; Nafissi, Julieta; Castelló, Anna; Rodríguez-Gil, Joan Enric; Sánchez, Armand; Clop, Alex

    2018-04-26

    The study of the boar sperm transcriptome by RNA-seq can provide relevant information on sperm quality and fertility and might contribute to animal breeding strategies. However, the analysis of the spermatozoa RNA is challenging, as these cells harbor very low amounts of highly fragmented RNA, and the ejaculates also contain other cell types with larger amounts of non-fragmented RNA. Here, we describe a strategy for a successful boar sperm purification, RNA extraction and RNA-seq library preparation. Using these approaches, our objectives were: (i) to evaluate the sperm recovery rate (SRR) after boar spermatozoa purification by density centrifugation using the non-porcine-specific commercial reagent BoviPure™; (ii) to assess the correlation between SRR and sperm quality characteristics; (iii) to evaluate the relationship between sperm cell RNA load and sperm quality traits and (iv) to compare different library preparation kits for both total RNA-seq (SMARTer Universal Low Input RNA and TruSeq RNA Library Prep kit) and small RNA-seq (NEBNext Small RNA and TailorMix miRNA Sample Prep v2) for high-throughput sequencing. Our results show that pig SRR (~22%) is lower than in other mammalian species and that it is not significantly dependent on the sperm quality parameters analyzed in our study. Moreover, no relationship between the RNA yield per sperm cell and sperm phenotypes was found. We compared an RNA-seq library preparation kit optimized for low amounts of fragmented RNA with a standard kit designed for high amounts of high-quality input RNA, and found that for sperm, a protocol designed to work on low-quality RNA is essential. We also compared two small RNA-seq kits and did not find substantial differences in their performance. We propose the methodological workflow described for the RNA-seq screening of the boar spermatozoa transcriptome. 
FPKM: fragments per kilobase of transcript per million mapped reads; KRT1: keratin 1; miRNA: micro-RNA; miscRNA: miscellaneous RNA; Mt rRNA: mitochondrial ribosomal RNA; Mt tRNA: mitochondrial transference RNA; OAZ3: ornithine decarboxylase antizyme 3; ORT: osmotic resistance test; piRNA: Piwi-interacting RNA; PRM1: protamine 1; PTPRC: protein tyrosine phosphatase receptor type C; rRNA: ribosomal RNA; snoRNA: small nucleolar RNA; snRNA: small nuclear RNA; SRR: sperm recovery rate; tRNA: transfer RNA.

  17. The HST/STIS Next Generation Spectral Library

    NASA Technical Reports Server (NTRS)

    Gregg, M. D.; Silva, D.; Rayner, J.; Worthey, G.; Valdes, F.; Pickles, A.; Rose, J.; Carney, B.; Vacca, W.

    2006-01-01

    During Cycles 10, 12, and 13, we obtained STIS G230LB, G430L, and G750L spectra of 378 bright stars covering a wide range in abundance, effective temperature, and luminosity. This HST/STIS Next Generation Spectral Library was scheduled to reach its goal of 600 targets by the end of Cycle 13 when STIS came to an untimely end. Even at 2/3 complete, the library significantly improves the sampling of stellar atmosphere parameter space compared to most other spectral libraries by including the near-UV and significant numbers of metal poor and super-solar abundance stars. Numerous calibration challenges have been encountered, some expected, some not; these arise from the use of the E1 aperture location, non-standard wavelength calibration, and, most significantly, the serious contamination of the near-UV spectra by red light. Maximizing the utility of the library depends directly on overcoming or at least minimizing these problems, especially correcting the UV spectra.

  18. PrecisePrimer: an easy-to-use web server for designing PCR primers for DNA library cloning and DNA shuffling.

    PubMed

    Pauthenier, Cyrille; Faulon, Jean-Loup

    2014-07-01

    PrecisePrimer is a web-based primer design software tool made to assist experimentalists in repetitive primer design tasks such as preparing, cloning and shuffling DNA libraries. Unlike other popular primer design tools, it is conceived to generate primer libraries with popular PCR polymerase buffers proposed as pre-set options. PrecisePrimer is also meant to design primers in batches, such as for DNA library creation or DNA shuffling experiments, and to have the simplest interface possible. It integrates the most up-to-date melting temperature algorithms, validated with experimental data and cross-validated with other computational tools. We generated a library of primers for the extraction and cloning of 61 genes from a yeast genomic DNA extract using default parameters. All primer pairs efficiently amplified their target without any optimization of the PCR conditions. © The Author(s) 2014. Published by Oxford University Press on behalf of Nucleic Acids Research.
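    Melting-temperature estimation ranges from the classic Wallace rule to the nearest-neighbor thermodynamic models that tools like PrecisePrimer integrate. The sketch below shows only the first-order Wallace rule, a rough estimate for short primers; it is not PrecisePrimer's algorithm:

```python
# Wallace rule: Tm ~ 2 degC per A/T base plus 4 degC per G/C base, a rough
# first-order estimate for short (under ~14 nt) primers. Modern tools use
# more accurate nearest-neighbor models with salt and buffer corrections.
def tm_wallace(seq: str) -> int:
    seq = seq.upper()
    at = seq.count("A") + seq.count("T")
    gc = seq.count("G") + seq.count("C")
    return 2 * at + 4 * gc

tm = tm_wallace("ATGCATGCATGC")   # 6 A/T + 6 G/C -> 2*6 + 4*6 = 36 degC
```

    The GC-heavy weighting in even this crude rule explains why primer pairs are matched by Tm rather than by length alone.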

  19. Raster graphics display library

    NASA Technical Reports Server (NTRS)

    Grimsrud, Anders; Stephenson, Michael B.

    1987-01-01

    The Raster Graphics Display Library (RGDL) is a high-level subroutine package that gives the advanced raster graphics display capabilities needed. The RGDL uses FORTRAN source code routines to build subroutines modular enough to be used as stand-alone routines in a black-box type of environment. Six examples are presented that teach the use of RGDL in the fastest, most complete way possible. Routines within the display library that are used to produce raster graphics are presented in alphabetical order, each on a separate page. Each user-callable routine is described by function and calling parameters. All common blocks that are used in the display library are listed, and the use of each variable within each common block is discussed. A reference on the include files that are necessary to compile the display library is included; each include file and its purpose are listed. The link map for MOVIE.BYU version 6, a general-purpose computer graphics display system that uses RGDL software, is also included.

  20. The structure of a thermophilic kinase shapes fitness upon random circular permutation

    PubMed Central

    Jones, Alicia M.; Mehta, Manan M.; Thomas, Emily E.; Atkinson, Joshua T.; Segall-Shapiro, Thomas H.; Liu, Shirley; Silberg, Jonathan J.

    2016-01-01

    Proteins can be engineered for synthetic biology through circular permutation, a sequence rearrangement in which native protein termini become linked and new termini are created elsewhere through backbone fission. However, it remains challenging to anticipate a protein’s functional tolerance to circular permutation. Here, we describe new transposons for creating libraries of randomly circularly permuted proteins that minimize peptide additions at their termini, and we use transposase mutagenesis to study the tolerance of a thermophilic adenylate kinase (AK) to circular permutation. We find that libraries expressing permuted AKs with either short or long peptides amended to their N-terminus yield distinct sets of active variants and present evidence that this trend arises because permuted protein expression varies across libraries. Mapping all sites that tolerate backbone cleavage onto AK structure reveals that the largest contiguous regions of sequence that lack cleavage sites are proximal to the phosphotransfer site. A comparison of our results with a range of structure-derived parameters further showed that retention of function correlates to the strongest extent with the distance to the phosphotransfer site, amino acid variability in an AK family sequence alignment, and residue-level deviations in superimposed AK structures. Our work illustrates how permuted protein libraries can be created with minimal peptide additions using transposase mutagenesis, and it reveals a challenge of maintaining consistent expression across permuted variants in a library that minimizes peptide additions. Furthermore, these findings provide a basis for interpreting responses of thermophilic phosphotransferases to circular permutation by calibrating how different structure-derived parameters relate to retention of function in a cellular selection. PMID:26976658

  1. The Structure of a Thermophilic Kinase Shapes Fitness upon Random Circular Permutation.

    PubMed

    Jones, Alicia M; Mehta, Manan M; Thomas, Emily E; Atkinson, Joshua T; Segall-Shapiro, Thomas H; Liu, Shirley; Silberg, Jonathan J

    2016-05-20

    Proteins can be engineered for synthetic biology through circular permutation, a sequence rearrangement in which native protein termini become linked and new termini are created elsewhere through backbone fission. However, it remains challenging to anticipate a protein's functional tolerance to circular permutation. Here, we describe new transposons for creating libraries of randomly circularly permuted proteins that minimize peptide additions at their termini, and we use transposase mutagenesis to study the tolerance of a thermophilic adenylate kinase (AK) to circular permutation. We find that libraries expressing permuted AKs with either short or long peptides amended to their N-terminus yield distinct sets of active variants and present evidence that this trend arises because permuted protein expression varies across libraries. Mapping all sites that tolerate backbone cleavage onto AK structure reveals that the largest contiguous regions of sequence that lack cleavage sites are proximal to the phosphotransfer site. A comparison of our results with a range of structure-derived parameters further showed that retention of function correlates to the strongest extent with the distance to the phosphotransfer site, amino acid variability in an AK family sequence alignment, and residue-level deviations in superimposed AK structures. Our work illustrates how permuted protein libraries can be created with minimal peptide additions using transposase mutagenesis, and it reveals a challenge of maintaining consistent expression across permuted variants in a library that minimizes peptide additions. Furthermore, these findings provide a basis for interpreting responses of thermophilic phosphotransferases to circular permutation by calibrating how different structure-derived parameters relate to retention of function in a cellular selection.

  2. 75 FR 28822 - Duke Energy Carolina, LLC; William States Lee III Combined License Application; Notice of Intent...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-05-24

    ..., LLC; William States Lee III Combined License Application; Notice of Intent To Conduct a Supplemental... an application for combined licenses (COL) for its William States Lee III Nuclear Station (Lee) site.../new-licensing/col/lee.html . In addition, the Cherokee County Public Library, 300 E. Rutledge Avenue...

  3. Activation cross-section measurement of proton induced reactions on cerium

    NASA Astrophysics Data System (ADS)

    Tárkányi, F.; Hermanne, A.; Ditrói, F.; Takács, S.; Spahn, I.; Spellerberg, S.

    2017-12-01

    In the framework of a systematic study of proton induced nuclear reactions on lanthanides we have measured the excitation functions on natural cerium for the production of 142,139,138m,137Pr, 141,139,137m,137g,135Ce and 133La up to 65 MeV proton energy, using the activation method with the stacked-foil irradiation technique and high-resolution γ-ray spectrometry. The cross-sections of the investigated reactions were compared with the data retrieved from the TENDL-2014 and TENDL-2015 libraries, based on the latest version of the TALYS code system. No earlier experimental data were found in the literature. The measured cross-section data are important for further improvement of nuclear reaction models and for practical applications in nuclear medicine and in other labeling and activation studies.

  4. Properties of ΣQ*, ΞQ* and ΩQ* heavy baryons in cold nuclear matter

    NASA Astrophysics Data System (ADS)

    Azizi, K.; Er, N.

    2018-02-01

    The in-medium properties of the heavy spin-3/2 ΣQ*, ΞQ* and ΩQ* baryons, with Q being a b or c quark, are investigated. The shifts in some spectroscopic parameters of these particles due to saturated cold nuclear matter are calculated. The variations of these parameters with the density of the cold nuclear medium are also studied. It is observed that the parameters of the ΣQ* baryons are considerably affected by the nuclear matter, whereas the ΞQ* and ΩQ* particles are roughly insensitive to the medium. The results obtained may be used in analyses of the data to be provided by in-medium experiments such as PANDA.

  5. Determination of elastomeric foam parameters for simulations of complex loading.

    PubMed

    Petre, M T; Erdemir, A; Cavanagh, P R

    2006-08-01

    Finite element (FE) analysis has shown promise for the evaluation of elastomeric foam personal protection devices. Although appropriate representation of foam materials is necessary in order to obtain realistic simulation results, material definitions used in the literature vary widely and often fail to account for the multi-mode loading experienced by these devices. This study aims to provide a library of elastomeric foam material parameters that can be used in FE simulations of complex loading scenarios. Twelve foam materials used in footwear were tested in uni-axial compression, simple shear and volumetric compression. For each material, parameters for a common compressible hyperelastic material model used in FE analysis were determined using: (a) compression; (b) compression and shear data; and (c) data from all three tests. Material parameters and Drucker stability limits for the best fits are provided with their associated errors. The material model was able to reproduce deformation modes for which data was provided during parameter determination but was unable to predict behavior in other deformation modes. Simulation results were found to be highly dependent on the extent of the test data used to determine the parameters in the material definition. This finding calls into question the many published results of simulations of complex loading that use foam material parameters obtained from a single mode of testing. The library of foam parameters developed here presents associated errors in three deformation modes that should provide for a more informed selection of material parameters.

  6. Using herbarium-derived DNAs to assemble a large-scale DNA barcode library for the vascular plants of Canada.

    PubMed

    Kuzmina, Maria L; Braukmann, Thomas W A; Fazekas, Aron J; Graham, Sean W; Dewaard, Stephanie L; Rodrigues, Anuar; Bennett, Bruce A; Dickinson, Timothy A; Saarela, Jeffery M; Catling, Paul M; Newmaster, Steven G; Percy, Diana M; Fenneman, Erin; Lauron-Moreau, Aurélien; Ford, Bruce; Gillespie, Lynn; Subramanyam, Ragupathy; Whitton, Jeannette; Jennings, Linda; Metsger, Deborah; Warne, Connor P; Brown, Allison; Sears, Elizabeth; Dewaard, Jeremy R; Zakharov, Evgeny V; Hebert, Paul D N

    2017-12-01

    Constructing complete, accurate plant DNA barcode reference libraries can be logistically challenging for large-scale floras. Here we demonstrate the promise and challenges of using herbarium collections for building a DNA barcode reference library for the vascular plant flora of Canada. Our study examined 20,816 specimens representing 5076 of 5190 vascular plant species in Canada (98%). For 98% of the specimens, at least one of the DNA barcode regions was recovered from the plastid loci rbcL and matK and from the nuclear ITS2 region. We used beta regression to quantify the effects of age, type of preservation, and taxonomic affiliation (family) on DNA sequence recovery. Specimen age and method of preservation had significant effects on sequence recovery for all markers, but influenced some families more (e.g., Boraginaceae) than others (e.g., Asteraceae). Our DNA barcode library represents an unparalleled resource for metagenomic and ecological genetic research on temperate and arctic biomes. An observed decline in sequence recovery with specimen age may be associated with poor primer matches, intragenomic variation (for ITS2), or inhibitory secondary compounds in some taxa.

  7. Using herbarium-derived DNAs to assemble a large-scale DNA barcode library for the vascular plants of Canada1

    PubMed Central

    Kuzmina, Maria L.; Braukmann, Thomas W. A.; Fazekas, Aron J.; Graham, Sean W.; Dewaard, Stephanie L.; Rodrigues, Anuar; Bennett, Bruce A.; Dickinson, Timothy A.; Saarela, Jeffery M.; Catling, Paul M.; Newmaster, Steven G.; Percy, Diana M.; Fenneman, Erin; Lauron-Moreau, Aurélien; Ford, Bruce; Gillespie, Lynn; Subramanyam, Ragupathy; Whitton, Jeannette; Jennings, Linda; Metsger, Deborah; Warne, Connor P.; Brown, Allison; Sears, Elizabeth; Dewaard, Jeremy R.; Zakharov, Evgeny V.; Hebert, Paul D. N.

    2017-01-01

    Premise of the study: Constructing complete, accurate plant DNA barcode reference libraries can be logistically challenging for large-scale floras. Here we demonstrate the promise and challenges of using herbarium collections for building a DNA barcode reference library for the vascular plant flora of Canada. Methods: Our study examined 20,816 specimens representing 5076 of 5190 vascular plant species in Canada (98%). For 98% of the specimens, at least one of the DNA barcode regions was recovered from the plastid loci rbcL and matK and from the nuclear ITS2 region. We used beta regression to quantify the effects of age, type of preservation, and taxonomic affiliation (family) on DNA sequence recovery. Results: Specimen age and method of preservation had significant effects on sequence recovery for all markers, but influenced some families more (e.g., Boraginaceae) than others (e.g., Asteraceae). Discussion: Our DNA barcode library represents an unparalleled resource for metagenomic and ecological genetic research on temperate and arctic biomes. An observed decline in sequence recovery with specimen age may be associated with poor primer matches, intragenomic variation (for ITS2), or inhibitory secondary compounds in some taxa. PMID:29299394

  8. The U. S. Geological Survey, Digital Spectral Library: Version 1 (0.2 to 3.0um)

    USGS Publications Warehouse

    Clark, Roger N.; Swayze, Gregg A.; Gallagher, Andrea J.; King, Trude V.V.; Calvin, Wendy M.

    1993-01-01

    We have developed a digital reflectance spectral library, with management and spectral analysis software. The library includes 498 spectra of 444 samples (some samples include a series of grain sizes) measured from approximately 0.2 to 3.0 um. The spectral resolution (Full Width Half Maximum) of the reflectance data is <= 4 nm in the visible (0.2-0.8 um) and <= 10 nm in the NIR (0.8-2.35 um). All spectra were corrected to absolute reflectance using an NIST Halon standard. Library management software lets users search on parameters (e.g. chemical formulae, chemical analyses, purity of samples, mineral groups, etc.) as well as spectral features. Minerals from borate, carbonate, chloride, element, halide, hydroxide, nitrate, oxide, phosphate, sulfate, sulfide, sulfosalt, and the silicate (cyclosilicate, inosilicate, nesosilicate, phyllosilicate, sorosilicate, and tectosilicate) classes are represented. X-Ray and chemical analyses are tabulated for many of the entries, and all samples have been evaluated for spectral purity. The library also contains end and intermediate members for the olivine, garnet, scapolite, montmorillonite, muscovite, jarosite, and alunite solid-solution series. We have included representative spectra of H2O ice, kerogen, ammonium-bearing minerals, rare-earth oxides, desert varnish coatings, kaolinite crystallinity series, kaolinite-smectite series, zeolite series, and an extensive evaporite series. Because of the importance of vegetation to climate-change studies, we have included 17 spectra of tree leaves, bushes, and grasses. The library and software are available as a series of U.S.G.S. Open File reports. PC user software is available to convert the binary data to ascii files (a separate U.S.G.S. open file report). Additionally, the binary data files are available online at the U.S.G.S. in Denver for anonymous ftp to users on the Internet. The library search software enables a user to search on documentation parameters as well as spectral features. 
The analysis system includes general spectral analysis routines, plotting packages, radiative transfer software for computing intimate mixtures, routines to derive optical constants from reflectance spectra, tools to analyze spectral features, and the capability to access imaging spectrometer data cubes for spectral analysis. Users may build customized libraries (at specific wavelengths and spectral resolution) for their own instruments using the library software. We are currently extending spectral coverage to 150 um. The libraries (original and convolved) will be made available in the future on a CD-ROM.

  9. CCFpams: Atmospheric stellar parameters from cross-correlation functions

    NASA Astrophysics Data System (ADS)

    Malavolta, Luca; Lovis, Christophe; Pepe, Francesco; Sneden, Christopher; Udry, Stephane

    2017-07-01

    CCFpams allows the measurement of stellar temperature, metallicity and gravity within a few seconds and in a completely automated fashion. Rather than performing comparisons with spectral libraries, the technique is based on the determination of several cross-correlation functions (CCFs) obtained by including spectral features with different sensitivity to the photospheric parameters. Literature stellar parameters of high signal-to-noise (SNR) and high-resolution HARPS spectra of FGK Main Sequence stars are used to calibrate the stellar parameters as a function of CCF areas.

  10. ANITA-IEAF activation code package - updating of the decay and cross section data libraries and validation on the experimental data from the Karlsruhe Isochronous Cyclotron

    NASA Astrophysics Data System (ADS)

    Frisoni, Manuela

    2017-09-01

    ANITA-IEAF is an activation package (code and libraries) previously developed at ENEA-Bologna to assess the activation of materials exposed to neutrons with energies greater than 20 MeV. An updated version of the ANITA-IEAF activation code package has been developed. It is suitable for studying the irradiation effects on materials in facilities such as the International Fusion Materials Irradiation Facility (IFMIF) and the DEMO Oriented Neutron Source (DONES), in which a considerable amount of neutrons with energies above 20 MeV is produced. The present paper summarizes the main characteristics of the updated version of ANITA-IEAF, which uses decay and cross section data based on more recent evaluated nuclear data libraries, i.e. the JEFF-3.1.1 Radioactive Decay Data Library and the EAF-2010 neutron activation cross section library. In this paper the validation effort, based on the comparison between the code predictions and the activity measurements obtained at the Karlsruhe Isochronous Cyclotron, is presented. In this integral experiment, samples of two different steels, SS-316 and F82H, of pure vanadium and of a vanadium alloy, structural materials of interest in fusion technology, were activated in a neutron spectrum similar to the IFMIF neutron field.

  11. Nuclear magnetic and nuclear quadrupole resonance parameters of β-carboline derivatives calculated using density functional theory

    NASA Astrophysics Data System (ADS)

    Ahmadinejad, Neda; Tari, Mostafa Talebi

    2017-04-01

    Density functional theory (DFT) calculations using the B3LYP/6-311++G(d,p) method were carried out to investigate the relative stability of the molecules of β-carboline derivatives such as harmaline, harmine, harmalol, harmane and norharmane. Calculated nuclear quadrupole resonance (NQR) parameters were used to determine the 14N nuclear quadrupole coupling constant χ, the asymmetry parameter η and the EFG tensor component (q_zz). For a better understanding of the electronic structure of the β-carboline derivatives, natural bond orbital (NBO) analysis and isotropic and anisotropic NMR chemical shieldings were calculated for the 14N nuclei using the GIAO method for the optimized structures. The NBO analysis shows that the pyrrole ring nitrogen (N9) atom has a greater tendency than the pyridine ring nitrogen (N2) atom to participate in resonance interactions and aromaticity development in all of these structures. The NMR and NQR parameters were studied in order to find correlations between the electronic structure and the structural stability of the studied molecules.

  12. Consequences of Normalizing Transcriptomic and Genomic Libraries of Plant Genomes Using a Duplex-Specific Nuclease and Tetramethylammonium Chloride

    PubMed Central

    Froenicke, Lutz; Lavelle, Dean; Martineau, Belinda; Perroud, Bertrand; Michelmore, Richard

    2013-01-01

    Several applications of high throughput genome and transcriptome sequencing would benefit from a reduction of the high-copy-number sequences in the libraries being sequenced and analyzed, particularly when applied to species with large genomes. We adapted and analyzed the consequences of a method that utilizes a thermostable duplex-specific nuclease for reducing the high-copy components in transcriptomic and genomic libraries prior to sequencing. This reduces the time, cost, and computational effort of obtaining informative transcriptomic and genomic sequence data for both fully sequenced and non-sequenced genomes. It also reduces contamination from organellar DNA in preparations of nuclear DNA. Hybridization in the presence of 3 M tetramethylammonium chloride (TMAC), which equalizes the rates of hybridization of GC and AT nucleotide pairs, reduced the bias against sequences with high GC content. Consequences of this method on the reduction of high-copy and enrichment of low-copy sequences are reported for Arabidopsis and lettuce. PMID:23409088

  13. How to Use Benchmark and Cross-section Studies to Improve Data Libraries and Models

    NASA Astrophysics Data System (ADS)

    Wagner, V.; Suchopár, M.; Vrzalová, J.; Chudoba, P.; Svoboda, O.; Tichý, P.; Krása, A.; Majerle, M.; Kugler, A.; Adam, J.; Baldin, A.; Furman, W.; Kadykov, M.; Solnyshkin, A.; Tsoupko-Sitnikov, S.; Tyutyunikov, S.; Vladimirovna, N.; Závorka, L.

    2016-06-01

    Improvements of the Monte Carlo transport codes and cross-section libraries are very important steps towards the use of accelerator-driven transmutation systems. We have conducted many benchmark experiments with different set-ups consisting of lead, natural uranium and moderator irradiated by relativistic protons and deuterons, within the framework of the collaboration “Energy and Transmutation of Radioactive Waste”. Unfortunately, the knowledge of the total or partial cross-sections of important reactions is insufficient. For this reason we have started extensive studies of different reaction cross-sections. We measure cross-sections of important neutron reactions by means of quasi-monoenergetic neutron sources based on the cyclotrons at the Nuclear Physics Institute in Řež and at The Svedberg Laboratory in Uppsala. Measurements of partial cross-sections of relativistic deuteron reactions were the second direction of our studies. The new results obtained during the last years are shown. Possible use of these data for the improvement of libraries, models and benchmark studies is discussed.

  14. Measurement of the 23Na(n,2n) cross section in 235U and 252Cf fission neutron spectra

    NASA Astrophysics Data System (ADS)

    Košťál, Michal; Schulc, Martin; Rypar, Vojtěch; Losa, Evžen; Švadlenková, Marie; Baroň, Petr; Jánský, Bohumil; Novák, Evžen; Mareček, Martin; Uhlíř, Jan

    2017-09-01

    The presented paper aims to compare the calculated and experimental reaction rates of 23Na(n,2n)22Na in a well-defined reactor spectrum and in the spontaneous fission spectrum of 252Cf. The experimentally determined reaction rate, derived using gamma spectroscopy of an irradiated NaF sample, is used for average cross-section determination. Estimation of this cross-section is important, as it is included in the International Reactor Dosimetry and Fusion File and is also relevant to the correct estimation of the long-term activity of the Na coolant in Sodium Fast Reactors. The calculations were performed with the MCNP6 code using the ENDF/B-VII.0, JEFF-3.1, JEFF-3.2, JENDL-3.3, JENDL-4, ROSFOND-2010, CENDL-3.1 and IRDFF nuclear data libraries. In the case of the reactor spectrum, reasonable agreement was not achieved with any library. In the case of the 252Cf spectrum, however, agreement was achieved with the IRDFF, JEFF-3.1 and JENDL libraries.

  15. Using single nuclei for RNA-seq to capture the transcriptome of postmortem neurons

    PubMed Central

    Krishnaswami, Suguna Rani; Grindberg, Rashel V; Novotny, Mark; Venepally, Pratap; Lacar, Benjamin; Bhutani, Kunal; Linker, Sara B; Pham, Son; Erwin, Jennifer A; Miller, Jeremy A; Hodge, Rebecca; McCarthy, James K; Kelder, Martin; McCorrison, Jamison; Aevermann, Brian D; Fuertes, Francisco Diez; Scheuermann, Richard H; Lee, Jun; Lein, Ed S; Schork, Nicholas; McConnell, Michael J; Gage, Fred H; Lasken, Roger S

    2016-01-01

    A protocol is described for sequencing the transcriptome of a cell nucleus. Nuclei are isolated from specimens and sorted by FACS, cDNA libraries are constructed and RNA-seq is performed, followed by data analysis. Some steps follow published methods (Smart-seq2 for cDNA synthesis and Nextera XT barcoded library preparation) and are not described in detail here. Previous single-cell approaches for RNA-seq from tissues include cell dissociation using protease treatment at 30 °C, which is known to alter the transcriptome. We isolate nuclei at 4 °C from tissue homogenates, which causes minimal damage. Nuclear transcriptomes can be obtained from postmortem human brain tissue stored at −80 °C, making brain archives accessible for RNA-seq from individual neurons. The method also allows investigation of biological features unique to nuclei, such as enrichment of certain transcripts and precursors of some noncoding RNAs. By following this procedure, it takes about 4 d to construct cDNA libraries that are ready for sequencing. PMID:26890679

  16. Consequences of normalizing transcriptomic and genomic libraries of plant genomes using a duplex-specific nuclease and tetramethylammonium chloride.

    PubMed

    Matvienko, Marta; Kozik, Alexander; Froenicke, Lutz; Lavelle, Dean; Martineau, Belinda; Perroud, Bertrand; Michelmore, Richard

    2013-01-01

    Several applications of high throughput genome and transcriptome sequencing would benefit from a reduction of the high-copy-number sequences in the libraries being sequenced and analyzed, particularly when applied to species with large genomes. We adapted and analyzed the consequences of a method that utilizes a thermostable duplex-specific nuclease for reducing the high-copy components in transcriptomic and genomic libraries prior to sequencing. This reduces the time, cost, and computational effort of obtaining informative transcriptomic and genomic sequence data for both fully sequenced and non-sequenced genomes. It also reduces contamination from organellar DNA in preparations of nuclear DNA. Hybridization in the presence of 3 M tetramethylammonium chloride (TMAC), which equalizes the rates of hybridization of GC and AT nucleotide pairs, reduced the bias against sequences with high GC content. Consequences of this method on the reduction of high-copy and enrichment of low-copy sequences are reported for Arabidopsis and lettuce.

  17. Cytological Evaluation of Thyroid Lesions by Nuclear Morphology and Nuclear Morphometry.

    PubMed

    Yashaswini, R; Suresh, T N; Sagayaraj, A

    2017-01-01

    Fine needle aspiration (FNA) of the thyroid gland is an effective diagnostic method. The Bethesda system for reporting thyroid cytopathology classifies lesions into six categories and gives the implied risk of malignancy and the management protocol for each category. Though the system gives specific criteria, diagnostic dilemmas still exist. Using nuclear morphometry, we can quantify a number of parameters, such as those related to nuclear size and shape. The evaluation of nuclear morphometry is not well established in thyroid cytology. The aims were to classify thyroid lesions on fine needle aspiration cytology (FNAC) using the Bethesda system and to evaluate the significance of nuclear parameters in improving the prediction of thyroid malignancy. In the present study, 120 FNAC cases of thyroid lesions with histological diagnosis were included. Computerized nuclear morphometry was done on 81 cases which had confirmed cytohistological correlation, using Aperio computer software. One hundred nuclei from each case were outlined and eight nuclear parameters were analyzed. In the present study, thyroid lesions were more common in females, with an M:F ratio of 1:5, and occurred most commonly at 40-60 years. Under the Bethesda system, 73 (60.83%) were category II; 14 (11.6%) were category III; 3 (2.5%) were category IV; 8 (6.6%) were category V; and 22 (18.3%) were category VI, which were malignant on histopathological correlation. The sensitivity, specificity, and diagnostic accuracy of the Bethesda reporting system were 62.5%, 84.38%, and 74.16%, respectively. Minimal nuclear diameter, maximal nuclear diameter, nuclear perimeter, and nuclear area were higher in the malignant group compared to the nonneoplastic and benign groups. The Bethesda system is a useful standardized system for reporting thyroid cytopathology, and it gives an implied risk of malignancy. Nuclear morphometry by computerized image analysis can be utilized as an additional diagnostic tool.
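    The size parameters named above can be illustrated with a toy computation on a traced nuclear outline. This is a hedged sketch on a hypothetical contour, not the Aperio software's method; the minimal diameter is omitted because it requires a rotating-calipers computation.

    ```python
    import math

    def nuclear_morphometry(contour):
        """Size parameters from a closed polygon of (x, y) points (micrometres)."""
        n = len(contour)
        # Perimeter: sum of edge lengths around the closed outline.
        perimeter = sum(math.dist(contour[i], contour[(i + 1) % n]) for i in range(n))
        # Area: shoelace formula over the polygon vertices.
        area = abs(sum(contour[i][0] * contour[(i + 1) % n][1]
                       - contour[(i + 1) % n][0] * contour[i][1]
                       for i in range(n))) / 2
        # Maximal diameter: largest point-to-point distance on the outline.
        max_diameter = max(math.dist(p, q)
                           for i, p in enumerate(contour) for q in contour[i + 1:])
        return {"perimeter": perimeter, "area": area, "max_diameter": max_diameter}

    # A 10 x 8 um rectangle as a stand-in for a nuclear outline.
    m = nuclear_morphometry([(0, 0), (10, 0), (10, 8), (0, 8)])
    print(m)
    ```

    On the rectangle the perimeter is 36 um, the area 80 um², and the maximal diameter the diagonal; a real pipeline would apply the same formulas to the hundred traced nuclei per case.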

  18. The Medical Library Association Benchmarking Network: development and implementation.

    PubMed

    Dudden, Rosalind Farnam; Corcoran, Kate; Kaplan, Janice; Magouirk, Jeff; Rand, Debra C; Smith, Bernie Todd

    2006-04-01

    This article explores the development and implementation of the Medical Library Association (MLA) Benchmarking Network from the initial idea and test survey, to the implementation of a national survey in 2002, to the establishment of a continuing program in 2004. Started as a program for hospital libraries, it has expanded to include other nonacademic health sciences libraries. The activities and timelines of MLA's Benchmarking Network task forces and editorial board from 1998 to 2004 are described. The Benchmarking Network task forces successfully developed an extensive questionnaire with parameters of size and measures of library activity and published a report of the data collected by September 2002. The data were available to all MLA members in the form of aggregate tables. Utilization of Web-based technologies proved feasible for data intake and interactive display. A companion article analyzes and presents some of the data. MLA has continued to develop the Benchmarking Network with the completion of a second survey in 2004. The Benchmarking Network has provided many small libraries with comparative data to present to their administrators. It is a challenge for the future to convince all MLA members to participate in this valuable program.

  19. The Medical Library Association Benchmarking Network: development and implementation*

    PubMed Central

    Dudden, Rosalind Farnam; Corcoran, Kate; Kaplan, Janice; Magouirk, Jeff; Rand, Debra C.; Smith, Bernie Todd

    2006-01-01

    Objective: This article explores the development and implementation of the Medical Library Association (MLA) Benchmarking Network from the initial idea and test survey, to the implementation of a national survey in 2002, to the establishment of a continuing program in 2004. Started as a program for hospital libraries, it has expanded to include other nonacademic health sciences libraries. Methods: The activities and timelines of MLA's Benchmarking Network task forces and editorial board from 1998 to 2004 are described. Results: The Benchmarking Network task forces successfully developed an extensive questionnaire with parameters of size and measures of library activity and published a report of the data collected by September 2002. The data were available to all MLA members in the form of aggregate tables. Utilization of Web-based technologies proved feasible for data intake and interactive display. A companion article analyzes and presents some of the data. MLA has continued to develop the Benchmarking Network with the completion of a second survey in 2004. Conclusions: The Benchmarking Network has provided many small libraries with comparative data to present to their administrators. It is a challenge for the future to convince all MLA members to participate in this valuable program. PMID:16636702

  20. Design of a genetic algorithm for the simulated evolution of a library of asymmetric transfer hydrogenation catalysts.

    PubMed

    Vriamont, Nicolas; Govaerts, Bernadette; Grenouillet, Pierre; de Bellefon, Claude; Riant, Olivier

    2009-06-15

    A library of catalysts was designed for asymmetric hydrogen transfer to acetophenone. First, the whole library was evaluated using high-throughput experiments (HTE). The catalysts were ranked in ascending order of performance, and the best catalysts were identified. In the second step, various simulated evolution experiments, based on a genetic algorithm, were applied to this library. A small part of the library, called the mother generation (G0), thus evolved from generation to generation. The goal was to use our collection of HTE data to adjust the parameters of the genetic algorithm, in order to obtain a maximum of the best catalysts within a minimal number of generations. It was found, notably, that the results of simulated evolution depended on the selection of G0 and that a random G0 should be preferred. We also demonstrated that it was possible to obtain 5 to 6 of the ten best catalysts while investigating only 10% of the library. Moreover, we developed a double algorithm that makes this result still achievable when the evolution starts with one of the worst G0.
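    The generation-to-generation scheme described above can be sketched with a minimal genetic algorithm. This is an illustrative toy, not the authors' code: catalysts are encoded as hypothetical (ligand, complex) index pairs, and a random fitness table stands in for the HTE performance data.

    ```python
    import random

    random.seed(1)
    N_LIGANDS, N_COMPLEXES = 8, 8
    # Hypothetical fitness surface standing in for the HTE screening results.
    FITNESS = {(l, c): random.random()
               for l in range(N_LIGANDS) for c in range(N_COMPLEXES)}

    def evolve(pop, generations=5, pop_size=8, mutation_rate=0.2):
        """Evolve a mother generation by selection, crossover and mutation."""
        evaluated = set(pop)                   # catalysts actually "tested"
        for _ in range(generations):
            ranked = sorted(pop, key=FITNESS.get, reverse=True)
            parents = ranked[: pop_size // 2]  # truncation selection
            children = []
            while len(children) < pop_size:
                a, b = random.sample(parents, 2)
                child = (a[0], b[1])           # crossover: ligand from a, complex from b
                if random.random() < mutation_rate:
                    child = (random.randrange(N_LIGANDS), child[1])  # mutate ligand
                children.append(child)
            pop = children
            evaluated.update(pop)
        return max(evaluated, key=FITNESS.get), evaluated

    # Random G0, as the abstract recommends.
    g0 = [(random.randrange(N_LIGANDS), random.randrange(N_COMPLEXES)) for _ in range(8)]
    best, tested = evolve(g0)
    print(best, len(tested), "of", len(FITNESS), "catalysts evaluated")
    ```

    The point of tuning such an algorithm, as in the abstract, is to keep `len(tested)` small (here a fraction of the 64-entry library) while still recovering top-ranked catalysts.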

  1. Power quality considerations for nuclear spectroscopy applications: Grounding

    NASA Astrophysics Data System (ADS)

    García-Hernández, J. M.; Ramírez-Jiménez, F. J.; Mondragón-Contreras, L.; López-Callejas, R.; Torres-Bribiesca, M. A.; Peña-Eguiluz, R.

    2013-11-01

    Traditionally, electrical installations are designed to supply power and to ensure personnel safety. In nuclear analysis laboratories, additional grounding issues must also be considered for the proper operation of high-resolution nuclear spectroscopy systems. This paper reviews the traditional ways of grounding nuclear spectroscopy systems and, through different scenarios, shows the effects on the most sensitive parameter of these systems: the energy resolution. It also proposes constant monitoring of a power quality parameter as a way to preserve or improve the resolution of the systems, avoiding the influence of excessive extrinsic noise.

  2. Nuclear-size correction to the Lamb shift of one-electron atoms

    NASA Astrophysics Data System (ADS)

    Yerokhin, Vladimir A.

    2011-01-01

    The nuclear-size effect on the one-loop self-energy and vacuum polarization is evaluated for the 1s, 2s, 3s, 2p1/2, and 2p3/2 states of hydrogen-like ions. The calculation is performed to all orders in the nuclear binding strength parameter Zα. Detailed comparison is made with previous all-order calculations and calculations based on the expansion in the parameter Zα. Extrapolation of the all-order numerical results obtained toward Z=1 provides results for the radiative nuclear-size effect on the hydrogen Lamb shift.

  3. Equation of state for dense nucleonic matter from metamodeling. I. Foundational aspects

    NASA Astrophysics Data System (ADS)

    Margueron, Jérôme; Hoffmann Casali, Rudiney; Gulminelli, Francesca

    2018-02-01

    Metamodeling for the nucleonic equation of state (EOS), inspired from a Taylor expansion around the saturation density of symmetric nuclear matter, is proposed and parameterized in terms of the empirical parameters. The present knowledge of nuclear empirical parameters is first reviewed in order to estimate their average values and associated uncertainties, thus defining the parameter space of the metamodeling. They are divided into isoscalar and isovector types, and ordered according to their power in the density expansion. The goodness of the metamodeling is analyzed against the predictions of the original models. In addition, since no correlation among the empirical parameters is assumed a priori, all arbitrary density dependences can be explored, which might not be accessible in existing functionals. Spurious correlations due to the assumed functional form are also removed. This meta-EOS allows direct relations between the uncertainties on the empirical parameters and the density dependence of the nuclear equation of state and its derivatives, and the mapping between the two can be done with standard Bayesian techniques. A sensitivity analysis shows that the most influential empirical parameters are the isovector parameters Lsym and Ksym, and that laboratory constraints at supersaturation densities are essential to reduce the present uncertainties. The present metamodeling for the EOS for nuclear matter is proposed for further applications in neutron stars and supernova matter.
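    The Taylor-expansion structure behind such a metamodel can be sketched in standard notation (the exact truncation order and any low-density corrections are the paper's choices; this is the generic form around saturation):

    ```latex
    % Energy per nucleon of asymmetric nucleonic matter, expanded around the
    % saturation density n_sat of symmetric matter, with expansion variable
    % x = (n - n_sat)/(3 n_sat) and isospin asymmetry \delta = (n_n - n_p)/n:
    \begin{align*}
    e(n,\delta) &\simeq e_{\mathrm{IS}}(n) + \delta^{2}\, e_{\mathrm{IV}}(n), \\
    e_{\mathrm{IS}}(n) &= E_{\mathrm{sat}} + \tfrac{1}{2} K_{\mathrm{sat}} x^{2}
        + \tfrac{1}{6} Q_{\mathrm{sat}} x^{3} + \tfrac{1}{24} Z_{\mathrm{sat}} x^{4}, \\
    e_{\mathrm{IV}}(n) &= E_{\mathrm{sym}} + L_{\mathrm{sym}}\, x
        + \tfrac{1}{2} K_{\mathrm{sym}} x^{2}
        + \tfrac{1}{6} Q_{\mathrm{sym}} x^{3} + \tfrac{1}{24} Z_{\mathrm{sym}} x^{4}.
    \end{align*}
    ```

    The isoscalar series has no linear term because the pressure of symmetric matter vanishes at saturation; the isovector series retains one, which is why the slope parameter Lsym appears among the most influential parameters in the sensitivity analysis.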

  4. Experimental Determination of η/s for Finite Nuclear Matter.

    PubMed

    Mondal, Debasish; Pandit, Deepak; Mukhopadhyay, S; Pal, Surajit; Dey, Balaram; Bhattacharya, Srijit; De, A; Bhattacharya, Soumik; Bhattacharyya, S; Roy, Pratap; Banerjee, K; Banerjee, S R

    2017-05-12

    We present, for the first time, simultaneous determination of shear viscosity (η) and entropy density (s) and thus, η/s for equilibrated nuclear systems from A∼30 to A∼208 at different temperatures. At finite temperature, η is estimated by utilizing the γ decay of the isovector giant dipole resonance populated via fusion evaporation reaction, while s is evaluated from the nuclear level density parameter (a) and nuclear temperature (T), determined precisely by the simultaneous measurements of the evaporated neutron energy spectra and the compound nuclear angular momenta. The transport parameter η and the thermodynamic parameter s both increase with temperature, resulting in a mild decrease of η/s with temperature. The extracted η/s is also found to be independent of the neutron-proton asymmetry at a given temperature. Interestingly, the measured η/s values are comparable to that of the high-temperature quark-gluon plasma, pointing towards the fact that strong fluidity may be the universal feature of the strong interaction of many-body quantum systems.

  5. Atmospheric particulate analysis using angular light scattering

    NASA Technical Reports Server (NTRS)

    Hansen, M. Z.

    1980-01-01

    Using the light scattering matrix elements measured by a polar nephelometer, a procedure for estimating the characteristics of atmospheric particulates was developed. A theoretical library data set of scattering matrices derived from Mie theory was tabulated for a range of values of the size parameter and refractive index typical of atmospheric particles. Integration over the size parameter yielded the scattering matrix elements for a variety of hypothesized particulate size distributions. A least squares curve fitting technique was used to find a best fit from the library data for the experimental measurements. This was used as a first guess for a nonlinear iterative inversion of the size distributions. A real index of 1.50 and an imaginary index of -0.005 are representative of the smoothed inversion results for the near ground level atmospheric aerosol in Tucson.
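    The library-matching step described above amounts to a least-squares search over tabulated theoretical curves. A minimal stdlib-only sketch, where the refractive-index grid and the "measured" curve are invented for illustration:

    ```python
    # Pick the best-fitting entry from a precomputed "library" of theoretical
    # scattering curves by minimizing the sum of squared residuals; the winner
    # then serves as the first guess for an iterative nonlinear inversion.

    def sum_sq(measured, theory):
        return sum((m - t) ** 2 for m, t in zip(measured, theory))

    def best_library_match(measured, library):
        """library maps (n_real, n_imag) -> tabulated scattering curve."""
        return min(library, key=lambda key: sum_sq(measured, library[key]))

    # Invented library of three candidate refractive indices.
    library = {
        (1.45, -0.001): [1.0, 0.6, 0.3, 0.2],
        (1.50, -0.005): [1.0, 0.5, 0.25, 0.15],
        (1.55, -0.010): [1.0, 0.4, 0.2, 0.1],
    }
    measured = [1.02, 0.49, 0.26, 0.14]   # invented "measurement"
    ```

    With these numbers the smallest residual belongs to the (1.50, -0.005) entry, mirroring the indices the abstract reports for the Tucson aerosol.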

  6. Prototype of Multifunctional Full-text Library in the Architecture Web-browser / Web-server / SQL-server

    NASA Astrophysics Data System (ADS)

    Lyapin, Sergey; Kukovyakin, Alexey

    Within the framework of the research program "Textaurus", an operational prototype of the multifunctional library T-Libra v.4.1 has been created, which makes it possible to carry out flexible, parametrizable search within a full-text database. The information system is realized in the architecture Web-browser / Web-server / SQL-server. This makes it possible to combine universality and efficiency of text processing, on the one hand, with convenience and minimal expense for the end user (thanks to a standard Web-browser serving as the client application), on the other. The following principles underlie the information system: a) multifunctionality, b) intelligence, c) multilingual primary texts and full-text searching, d) development of the digital library (DL) by a user ("administrative client"), e) multi-platform operation. A "library of concepts", i.e. a block of functional models of semantic (concept-oriented) searching, together with a closely connected subsystem of parametrizable queries to the full-text database, serves as the conceptual basis of the multifunctionality and "intelligence" of the DL T-Libra v.4.1. The author's paragraph is the unit of full-text searching in the suggested technology, and the "logic" of an educational or scientific topic can be built into the multilevel, flexible structure of a query and into the "library of concepts", which is replenishable by developers and experts. About 10 queries of various levels of complexity and conceptuality are realized in this version of the information system: from simple terminological searching (taking into account the lexical and grammatical paradigms of Russian) to several kinds of explication of terminological fields and adjustable two-parameter thematic searching (the two parameters being a set of terms and the distance between terms within the limits of an author's paragraph).

  7. Surface water retardation around single-chain polymeric nanoparticles: critical for catalytic function?

    PubMed

    Stals, Patrick J M; Cheng, Chi-Yuan; van Beek, Lotte; Wauters, Annelies C; Palmans, Anja R A; Han, Songi; Meijer, E W

    2016-03-01

    A library of water-soluble dynamic single-chain polymeric nanoparticles (SCPN) was prepared using a controlled radical polymerisation technique followed by the introduction of functional groups, including probes at targeted positions. The combined tools of electron paramagnetic resonance (EPR) and Overhauser dynamic nuclear polarization (ODNP) reveal that these SCPNs have structural and surface hydration properties resembling those of enzymes.

  8. 78 FR 49305 - Luminant Generation Company LLC, Comanche Peak Nuclear Power Plant, Unit Nos. 1 and 2...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-08-13

    ... http://www.regulations.gov and search for Docket ID NRC-2013-0182. Address questions about NRC dockets... NRC Library at http://www.nrc.gov/reading-rm/adams.html . To begin the search, select ``ADAMS Public Documents'' and then select ``Begin Web- based ADAMS Search.'' For problems with ADAMS, please contact the...

  9. 78 FR 27260 - Southern California Edison, San Onofre Nuclear Generating Station, Units 2 and 3 Request for Action

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-05-09

    ... regarding this document. You may access information related to this document, which the NRC possesses and is... and Management System (ADAMS): You may access publicly available documents online in the NRC Library... available in ADAMS) is provided the first time that a document is referenced. NRC's PDR: You may examine and...

  10. 78 FR 11688 - Notice of Issuance of Amendment to Facility License R-77 Incorporating a Decommissioning Plan for...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-02-19

    ... Reactor at the State University of New York at Buffalo AGENCY: Nuclear Regulatory Commission. ACTION... University of New York at Buffalo (UB) decommissioning plan (DP) by amendment to the Facility License R-77... in the NRC Library at http://www.nrc.gov/reading-rm/adams.html . To begin the search, select ``ADAMS...

  11. Assessment of beryllium and molybdenum nuclear data files with the RPI neutron scattering system in the energy region from 0.5 to 20 MeV

    NASA Astrophysics Data System (ADS)

    Daskalakis, Adam; Blain, Ezekiel; Leinweber, Gregory; Rapp, Michael; Barry, Devin; Block, Robert; Danon, Yaron

    2017-09-01

    A series of neutron scattering benchmark measurements were performed on beryllium and molybdenum with the Rensselaer Polytechnic Institute's Neutron Scattering System. The pulsed neutron source was produced by the Rensselaer Polytechnic Institute's linear accelerator, and a well-collimated neutron beam was incident on the samples located at a distance of 30.07 m. Neutrons that scattered from the sample were measured with the time-of-flight technique by eight EJ-301 liquid scintillator detectors positioned 0.5 m from the sample of interest. A total of eight experiments were performed with two sample thicknesses each, measured by detectors placed at two sets of angles. All data were processed using pulse shape analysis that separated the neutron and gamma ray events and included a gamma misclassification correction to account for erroneously identified gamma rays. A detailed model of the neutron scattering system simulated each experiment with several current evaluated nuclear data libraries and their predecessors. Results for each evaluation were compared to the experimental data using a figure-of-merit. The neutron scattering system has thus been used as a means to quantify a library's performance.
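    The abstract does not spell out its figure-of-merit; a common choice for comparing a simulated spectrum against measured time-of-flight data is a reduced chi-square, sketched here with invented numbers:

    ```python
    def figure_of_merit(measured, simulated, sigma):
        """Reduced chi-square: lower values mean the evaluation tracks the data better."""
        chi2 = sum((m - s) ** 2 / e ** 2
                   for m, s, e in zip(measured, simulated, sigma))
        return chi2 / len(measured)

    measured  = [100.0, 80.0, 60.0, 40.0]   # invented counts per TOF bin
    simulated = [ 98.0, 83.0, 57.0, 41.0]   # invented library prediction
    sigma     = [ 10.0,  9.0,  8.0,  6.0]   # invented uncertainties
    ```

    Evaluated libraries (and their predecessors) can then be ranked per sample and angle by comparing their figure-of-merit values on the same measured spectrum.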

  12. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Brown, Nicholas R.; Carlsen, Brett W.; Dixon, Brent W.

    Dynamic fuel cycle simulation tools are intended to model holistic transient nuclear fuel cycle scenarios. As with all simulation tools, fuel cycle simulators require verification through unit tests, benchmark cases, and integral tests. Model validation is a vital aspect as well. Although comparative studies have been performed, there is no comprehensive unit test and benchmark library for fuel cycle simulator tools. The objective of this paper is to identify the must-test functionalities of a fuel cycle simulator tool within the context of specific problems of interest to the Fuel Cycle Options Campaign within the U.S. Department of Energy's Office of Nuclear Energy. The approach in this paper identifies the features needed to cover the range of promising fuel cycle options identified in the DOE-NE Fuel Cycle Evaluation and Screening (E&S) and categorizes these features to facilitate prioritization. Features were categorized as essential functions, integrating features, and exemplary capabilities. One objective of this paper is to propose a library of unit tests applicable to each of the essential functions. Another underlying motivation for this paper is to encourage an international dialog on the functionalities and standard test methods for fuel cycle simulator tools.

  13. Nuclear Decay Data for the International Reactor Dosimetry Library for Fission and Fusion (IRDFF): Updated Evaluations of the Half-Lives and Gamma Ray Intensities

    NASA Astrophysics Data System (ADS)

    Chechev, Valery P.; Kuzmenko, Nikolay K.

    2016-02-01

    Updated evaluations of the half-lives and prominent gamma ray intensities have been presented for 20 radionuclides - dosimetry reaction residuals. The new values of these decay characteristics recommended for the IRDFF library were obtained using the approaches and methodology adopted by the working group of the Decay Data Evaluation Project (DDEP) cooperation. The experimental data published up to 2014 were taken into account in updated evaluations. The list of radionuclides includes 3H, 18F, 22Na, 24Na, 46Sc, 51Cr, 54Mn, 59Fe, 57Co, 60Co, 57Ni, 64Cu, 88Y, 132Te, 131I, 140Ba, 140La, 141Ce, 182Ta, 198Au.

  14. SkyNet: A Modular Nuclear Reaction Network Library

    NASA Astrophysics Data System (ADS)

    Lippuner, Jonas; Roberts, Luke F.

    2017-12-01

    Almost all of the elements heavier than hydrogen that are present in our solar system were produced by nuclear burning processes either in the early universe or at some point in the life cycle of stars. In all of these environments, there are dozens to thousands of nuclear species that interact with each other to produce successively heavier elements. In this paper, we present SkyNet, a new general-purpose nuclear reaction network that evolves the abundances of nuclear species under the influence of nuclear reactions. SkyNet can be used to compute the nucleosynthesis evolution in all astrophysical scenarios where nucleosynthesis occurs. SkyNet is free and open source, and aims to be easy to use and flexible. Any list of isotopes can be evolved, and SkyNet supports different types of nuclear reactions. SkyNet is modular so that new or existing physics, like nuclear reactions or equations of state, can easily be added or modified. Here, we present in detail the physics implemented in SkyNet with a focus on a self-consistent transition between nuclear statistical equilibrium and non-equilibrium nuclear burning, our implementation of electron screening, and coupling of the network to an equation of state. We also present comprehensive code tests and comparisons with existing nuclear reaction networks. We find that SkyNet agrees with published results and other codes to an accuracy of a few percent. Discrepancies, where they exist, can be traced to differences in the physics implementations.
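    SkyNet itself is a full reaction network library; the core idea of evolving abundances under reaction rates can be illustrated with a toy one-reaction network (a single decay A → B) integrated in Python, with an invented rate:

    ```python
    import math

    def evolve_decay(y_a0, lam, dt, steps):
        """Evolve a toy network A -> B with rate lam using implicit Euler steps.

        dY_A/dt = -lam * Y_A,  dY_B/dt = +lam * Y_A; total abundance is conserved.
        Real networks do the same thing for thousands of coupled species, which
        is why implicit (stiff) integration is the standard choice.
        """
        y_a, y_b = y_a0, 0.0
        for _ in range(steps):
            y_a_new = y_a / (1.0 + lam * dt)   # implicit update, unconditionally stable
            y_b += y_a - y_a_new               # whatever A loses, B gains
            y_a = y_a_new
        return y_a, y_b

    # Integrate to t = 1 with an invented rate lam = 2; Y_A should approach
    # the analytic solution exp(-lam * t) while Y_A + Y_B stays equal to 1.
    y_a, y_b = evolve_decay(1.0, lam=2.0, dt=1e-4, steps=10000)
    ```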

  15. A comment on the validity of fragmentation parameters measured in nuclear emulsions. [cosmic ray nuclei

    NASA Technical Reports Server (NTRS)

    Waddington, C. J.

    1978-01-01

    Evidence is reexamined which has been cited as suggesting serious errors in the use of fragmentation parameters appropriate to an airlike medium deduced from measurements made in nuclear emulsions to evaluate corrections for certain effects in balloon-borne observations of cosmic-ray nuclei. Fragmentation parameters for hydrogenlike interactions are calculated and shown to be in overall good agreement with those obtained previously for air. Experimentally measured fragmentation parameters in emulsion are compared with values computed semiempirically, and reasonable agreement is indicated.

  16. Construction of a directed hammerhead ribozyme library: towards the identification of optimal target sites for antisense-mediated gene inhibition.

    PubMed Central

    Pierce, M L; Ruffner, D E

    1998-01-01

    Antisense-mediated gene inhibition uses short complementary DNA or RNA oligonucleotides to block expression of any mRNA of interest. A key parameter in the success or failure of an antisense therapy is the identification of a suitable target site on the chosen mRNA. Ultimately, the accessibility of the target to the antisense agent determines target suitability. Since accessibility is a function of many complex factors, it is currently beyond our ability to predict. Consequently, identification of the most effective target(s) requires examination of every site. Towards this goal, we describe a method to construct directed ribozyme libraries against any chosen mRNA. The library contains nearly equal amounts of ribozymes targeting every site on the chosen transcript and the library only contains ribozymes capable of binding to that transcript. Expression of the ribozyme library in cultured cells should allow identification of optimal target sites under natural conditions, subject to the complexities of a fully functional cell. Optimal target sites identified in this manner should be the most effective sites for therapeutic intervention. PMID:9801305

  17. Hole filling and library optimization: application to commercially available fragment libraries.

    PubMed

    An, Yuling; Sherman, Woody; Dixon, Steven L

    2012-09-15

    Compound libraries comprise an integral component of drug discovery in the pharmaceutical and biotechnology industries. While in-house libraries often contain millions of molecules, this number pales in comparison to the accessible space of drug-like molecules. Therefore, care must be taken when adding new compounds to an existing library in order to ensure that unexplored regions in the chemical space are filled efficiently while not needlessly increasing the library size. In this work, we present an automated method to fill holes in an existing library using compounds from an external source and apply it to commercially available fragment libraries. The method, called Canvas HF, uses distances computed from 2D chemical fingerprints and selects compounds that fill vacuous regions while not suffering from the problem of selecting only compounds at the edge of the chemical space. We show that the method is robust with respect to different databases and the number of requested compounds to retrieve. We also present an extension of the method where chemical properties can be considered simultaneously with the selection process to bias the compounds toward a desired property space without imposing hard property cutoffs. We compare the results of Canvas HF to those obtained with a standard sphere exclusion method and with random compound selection and find that Canvas HF performs favorably. Overall, the method presented here offers an efficient and effective hole-filling strategy to augment compound libraries with compounds from external sources. The method does not have any fit parameters and therefore it should be applicable in most hole-filling applications. Copyright © 2012 Elsevier Ltd. All rights reserved.
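    The selection step can be sketched as a greedy max-min pick over fingerprint distances: repeatedly choose the external compound farthest (in minimum distance) from everything already in the library, so vacuous regions are filled before crowded ones. Tanimoto distance on bit sets and the tiny fingerprints below are invented for illustration and are far smaller than real 2D fingerprints; this is not the Canvas HF implementation itself:

    ```python
    def tanimoto_distance(a, b):
        """1 - Tanimoto similarity between two fingerprint bit sets."""
        if not a and not b:
            return 0.0
        return 1.0 - len(a & b) / len(a | b)

    def fill_holes(library, candidates, n_pick):
        """Greedily pick candidates that maximize the minimum distance to the
        current library, filling holes in chemical space first."""
        lib = list(library)
        picked = []
        pool = dict(candidates)
        for _ in range(n_pick):
            best = max(pool, key=lambda c: min(tanimoto_distance(pool[c], f)
                                               for f in lib))
            picked.append(best)
            lib.append(pool.pop(best))   # treat the pick as part of the library now
        return picked

    library = [{1, 2, 3}, {2, 3, 4}]                 # invented library fingerprints
    candidates = {"c1": {1, 2, 3}, "c2": {7, 8, 9}}  # c2 sits in an unexplored region
    ```

    Here `fill_holes(library, candidates, 1)` selects "c2": "c1" duplicates an existing library member (minimum distance 0), while "c2" shares no bits with anything in the library.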

  18. Nuclear analysis of structural damage and nuclear heating on enhanced K-DEMO divertor model

    NASA Astrophysics Data System (ADS)

    Park, J.; Im, K.; Kwon, S.; Kim, J.; Kim, D.; Woo, M.; Shin, C.

    2017-12-01

    This paper addresses nuclear analysis on the Korean fusion demonstration reactor (K-DEMO) divertor to estimate the overall trend of nuclear heating values and displacement damage. The K-DEMO divertor model was created and converted by the CAD (Pro-Engineer™) and Monte Carlo automatic modeling programs as a 22.5° sector of the tokamak. The Monte Carlo neutron photon transport and ADVANTG codes were used in this calculation with the FENDL-2.1 nuclear data library. The calculation results indicate that the highest values appear on the upper outboard target (OT) area, meaning the OT is exposed to the highest radiation conditions among the three plasma-facing parts (inboard, central and outboard) of the divertor. In particular, the lower part of the OT shows much lower nuclear heating values and displacement damage than the other regions. These results contribute to thermal-hydraulic and thermo-mechanical analyses of the divertor, and they suggest that copper alloys, owing to their high thermal conductivity, could partially replace the reduced activation ferritic-martensitic steel as a heat sink at the lower part of the OT.

  19. NNDC Databases

    Science.gov Websites

    Databases of nuclear structure, decay, and reaction data, including an interactive chart of nuclides and a level plotting tool. XUNDL (Experimental Unevaluated Nuclear Data List) provides experimental nuclear structure and decay data covering more than 2,500 recent datasets; CSISRS (alias EXFOR) provides experimental nuclear reaction data.

  20. In-medium effects via nuclear stopping in asymmetric colliding nuclei

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kaur, Mandeep

    2016-05-06

    The nuclear stopping is studied using isospin-dependent quantum molecular dynamics (IQMD) model in asymmetric colliding nuclei by varying mass asymmetry. The calculations have been done at incident energies varying between 50 and 400 MeV/nucleon for different impact parameters. We investigate the relative role of constant scaled and density-dependent scaled cross-sections. Our study reveals that nuclear stopping depends on the mass asymmetry, incident energy and impact parameter, however, it is independent of the way of scaling the cross-section.

  1. Extension of the energy range of experimental activation cross-sections data of deuteron induced nuclear reactions on indium up to 50MeV.

    PubMed

    Tárkányi, F; Ditrói, F; Takács, S; Hermanne, A; Ignatyuk, A V

    2015-11-01

    The energy range of our earlier measured activation cross-section data for longer-lived products of deuteron-induced nuclear reactions on indium was extended from 40 MeV up to 50 MeV. The traditional stacked foil irradiation technique and non-destructive gamma spectrometry were used. No experimental data were found in the literature for this higher energy range. Experimental cross-sections for the formation of the radionuclides (113,110)Sn, (116m,115m,114m,113m,111,110g,109)In and (115)Cd are reported in the 37-50 MeV energy range; for the production of (110)Sn and (110g,109)In these are the first measurements ever. The experimental data were compared with the results of cross-section calculations of the ALICE and EMPIRE nuclear model codes and of the TALYS 1.6 nuclear model code as listed in the on-line library TENDL-2014. Copyright © 2015 Elsevier Ltd. All rights reserved.

  2. Electron lithography STAR design guidelines. Part 2: The design of a STAR for space applications

    NASA Technical Reports Server (NTRS)

    Trotter, J. D.; Newman, W.

    1982-01-01

    The STAR design system developed by NASA enables any user with a logic diagram to design a semicustom digital MOS integrated circuit. The system is comprised of a library of standard logic cells and computer programs to place, route, and display designs implemented with cells from the library. Also described is the development of a radiation-hard array designed for the STAR system. The design is based on the CMOS silicon gate technology developed by SANDIA National Laboratories. The design rules used are given as well as the model parameters developed for the basic array element. Library cells of the CMOS metal gate and CMOS silicon gate technologies were simulated using SPICE, and the results are shown and compared.

  3. An Old Story in the Parallel Synthesis World: An Approach to Hydantoin Libraries.

    PubMed

    Bogolubsky, Andrey V; Moroz, Yurii S; Savych, Olena; Pipko, Sergey; Konovets, Angelika; Platonov, Maxim O; Vasylchenko, Oleksandr V; Hurmach, Vasyl V; Grygorenko, Oleksandr O

    2018-01-08

    An approach to the parallel synthesis of hydantoin libraries by reaction of in situ generated 2,2,2-trifluoroethylcarbamates and α-amino esters was developed. To demonstrate the utility of the method, a library of 1158 hydantoins designed according to the lead-likeness criteria (MW 200-350, cLogP 1-3) was prepared. The success rate of the method was analyzed as a function of physicochemical parameters of the products, and it was found that the method can be considered a tool for lead-oriented synthesis. A hydantoin-bearing submicromolar primary hit acting as an Aurora kinase A inhibitor was discovered with a combination of rational design, parallel synthesis using the procedures developed, and in silico and in vitro screenings.

  4. Developing an academic medical library core journal collection in the (almost) post-print era: the Florida State University College of Medicine Medical Library experience

    PubMed Central

    Shearer, Barbara S.; Nagy, Suzanne P.

    2003-01-01

    The Florida State University (FSU) College of Medicine Medical Library is the first academic medical library to be established since the Web's dramatic appearance during the 1990s. A large customer base for electronic medical information resources is both comfortable with and eager to migrate to the electronic format completely, and vendors are designing radical pricing models that make print journal cancellations economically advantageous. In this (almost) post-print environment, the new FSU Medical Library is being created and will continue to evolve. By analyzing print journal subscription lists of eighteen academic medical libraries with similar missions to the community-based FSU College of Medicine and by entering these and selected quality indicators into a Microsoft Access database, a core list was created. This list serves as a selection guide, as a point for discussion with faculty and curriculum leaders when creating budgets, and for financial negotiations in a broader university environment. After journal titles specific to allied health sciences, veterinary medicine, dentistry, pharmacy, library science, and nursing were eliminated from the list, 4,225 unique journal titles emerged. Based on a ten-point scale including SERHOLD holdings and DOCLINE borrowing activity, a list of 449 core titles is identified. The core list has been saved in spreadsheet format for easy sorting by a number of parameters. PMID:12883565

  5. Performance evaluation of phage-displayed synthetic human single-domain antibody libraries: A retrospective analysis.

    PubMed

    Henry, Kevin A; Tanha, Jamshid

    2018-05-01

    Fully human synthetic single-domain antibodies (sdAbs) are desirable therapeutic molecules but their development is a considerable challenge. Here, using a retrospective analysis of in-house historical data, we examined the parameters that impact the outcome of screening phage-displayed synthetic human sdAb libraries to discover antigen-specific binders. We found no evidence for a differential effect of domain type (VH or VL), library randomization strategy, incorporation of a stabilizing disulfide linkage or sdAb display format (monovalent vs. multivalent) on the probability of obtaining any antigen-binding human sdAbs, instead finding that the success of library screens was primarily related to properties of target antigens, especially molecular mass. The solubility and binding affinity of sdAbs isolated from successful screens depended both on properties of the sdAb libraries (primarily domain type) and the target antigens. Taking attrition of sdAbs with major manufacturability concerns (aggregation; low expression) and sdAbs that do not recognize native cell-surface antigens as independent probabilities, we calculate the overall likelihood of obtaining ≥1 antigen-binding human sdAb from a single library-target screen as ~24%. Successful library-target screens should be expected to yield ~1.3 human sdAbs on average, each with average binding affinity of ~2 μM. Copyright © 2018 Elsevier B.V. All rights reserved.

  6. Developing an academic medical library core journal collection in the (almost) post-print era: the Florida State University College of Medicine Medical Library experience.

    PubMed

    Shearer, Barbara S; Nagy, Suzanne P

    2003-07-01

    The Florida State University (FSU) College of Medicine Medical Library is the first academic medical library to be established since the Web's dramatic appearance during the 1990s. A large customer base for electronic medical information resources is both comfortable with and eager to migrate to the electronic format completely, and vendors are designing radical pricing models that make print journal cancellations economically advantageous. In this (almost) post-print environment, the new FSU Medical Library is being created and will continue to evolve. By analyzing print journal subscription lists of eighteen academic medical libraries with similar missions to the community-based FSU College of Medicine and by entering these and selected quality indicators into a Microsoft Access database, a core list was created. This list serves as a selection guide, as a point for discussion with faculty and curriculum leaders when creating budgets, and for financial negotiations in a broader university environment. After journal titles specific to allied health sciences, veterinary medicine, dentistry, pharmacy, library science, and nursing were eliminated from the list, 4,225 unique journal titles emerged. Based on a ten-point scale including SERHOLD holdings and DOCLINE borrowing activity, a list of 449 core titles is identified. The core list has been saved in spreadsheet format for easy sorting by a number of parameters.

  7. An approach to adjustment of relativistic mean field model parameters

    NASA Astrophysics Data System (ADS)

    Bayram, Tuncay; Akkoyun, Serkan

    2017-09-01

    The Relativistic Mean Field (RMF) model with a small number of adjusted parameters is a powerful tool for correct predictions of various ground-state properties of nuclei. Its success in describing nuclear properties is directly related to the adjustment of its parameters using experimental data. In the present study, the Artificial Neural Network (ANN) method, which mimics brain functionality, has been employed to improve the RMF model parameters. In particular, the ability of the ANN method to capture the relations between the RMF model parameters and the resulting predictions for the binding energies (BEs) of 58Ni and 208Pb has been found to be in agreement with the literature values.

  8. The SCALE Verified, Archived Library of Inputs and Data - VALID

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Marshall, William BJ J; Rearden, Bradley T

    The Verified, Archived Library of Inputs and Data (VALID) at ORNL contains high quality, independently reviewed models and results that improve confidence in analysis. VALID is developed and maintained according to a procedure of the SCALE quality assurance (QA) plan. This paper reviews the origins of the procedure and its intended purpose, the philosophy of the procedure, some highlights of its implementation, and the future of the procedure and associated VALID library. The original focus of the procedure was the generation of high-quality models that could be archived at ORNL and applied to many studies. The review process associated with model generation minimized the chances of errors in these archived models. Subsequently, the scope of the library and procedure was expanded to provide high quality, reviewed sensitivity data files for deployment through the International Handbook of Evaluated Criticality Safety Benchmark Experiments (IHECSBE). Sensitivity data files for approximately 400 such models are currently available. The VALID procedure and library continue fulfilling these multiple roles. The VALID procedure is based on the quality assurance principles of ISO 9001 and nuclear safety analysis. Some of these key concepts include: independent generation and review of information, generation and review by qualified individuals, use of appropriate references for design data and documentation, and retrievability of the models, results, and documentation associated with entries in the library. Some highlights of the detailed procedure are discussed to provide background on its implementation and to indicate limitations of data extracted from VALID for use by the broader community. Specifically, external users of data generated within VALID must take responsibility for ensuring that the files are used within the QA framework of their organization and that use is appropriate. The future plans for the VALID library include expansion to include additional experiments from the IHECSBE, to include experiments from areas beyond criticality safety, such as reactor physics and shielding, and to include application models. In the future, external SCALE users may also obtain qualification under the VALID procedure and be involved in expanding the library. The VALID library provides a pathway for the criticality safety community to leverage modeling and analysis expertise at ORNL.

  9. Symmetry Parameter Constraints from a Lower Bound on Neutron-matter Energy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tews, Ingo; Lattimer, James M.; Ohnishi, Akira

    We propose the existence of a lower bound on the energy of pure neutron matter (PNM) on the basis of unitary-gas considerations. We discuss its justification from experimental studies of cold atoms as well as from theoretical studies of neutron matter. We demonstrate that this bound results in limits to the density-dependent symmetry energy, which is the difference between the energies of symmetric nuclear matter and PNM. In particular, this bound leads to a lower limit to the volume symmetry energy parameter S0. In addition, for assumed values of S0 above this minimum, this bound implies both upper and lower limits to the symmetry energy slope parameter L, which describes the lowest-order density dependence of the symmetry energy. A lower bound on neutron-matter incompressibility is also obtained. These bounds are found to be consistent with both recent calculations of the energies of PNM and constraints from nuclear experiments. Our results are significant because several equations of state that are currently used in astrophysical simulations of supernovae and neutron star mergers, as well as in nuclear physics simulations of heavy-ion collisions, have symmetry energy parameters that violate these bounds. Furthermore, below the nuclear saturation density, the bound on neutron-matter energies leads to a lower limit to the density-dependent symmetry energy, which leads to upper limits to the nuclear surface symmetry parameter and the neutron-star crust-core boundary. We also obtain a lower limit to the neutron-skin thicknesses of neutron-rich nuclei. Above the nuclear saturation density, the bound on neutron-matter energies also leads to an upper limit to the symmetry energy, with implications for neutron-star cooling via the direct Urca process.

  10. Symmetry Parameter Constraints from a Lower Bound on Neutron-matter Energy

    NASA Astrophysics Data System (ADS)

    Tews, Ingo; Lattimer, James M.; Ohnishi, Akira; Kolomeitsev, Evgeni E.

    2017-10-01

    We propose the existence of a lower bound on the energy of pure neutron matter (PNM) on the basis of unitary-gas considerations. We discuss its justification from experimental studies of cold atoms as well as from theoretical studies of neutron matter. We demonstrate that this bound results in limits to the density-dependent symmetry energy, which is the difference between the energies of symmetric nuclear matter and PNM. In particular, this bound leads to a lower limit to the volume symmetry energy parameter S0. In addition, for assumed values of S0 above this minimum, this bound implies both upper and lower limits to the symmetry energy slope parameter L, which describes the lowest-order density dependence of the symmetry energy. A lower bound on neutron-matter incompressibility is also obtained. These bounds are found to be consistent with both recent calculations of the energies of PNM and constraints from nuclear experiments. Our results are significant because several equations of state that are currently used in astrophysical simulations of supernovae and neutron star mergers, as well as in nuclear physics simulations of heavy-ion collisions, have symmetry energy parameters that violate these bounds. Furthermore, below the nuclear saturation density, the bound on neutron-matter energies leads to a lower limit to the density-dependent symmetry energy, which leads to upper limits to the nuclear surface symmetry parameter and the neutron-star crust-core boundary. We also obtain a lower limit to the neutron-skin thicknesses of neutron-rich nuclei. Above the nuclear saturation density, the bound on neutron-matter energies also leads to an upper limit to the symmetry energy, with implications for neutron-star cooling via the direct Urca process.
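    The unitary-gas bound underlying this work can be sketched numerically: the conjecture is E_PNM(n) ≥ ξ0 · (3/5) E_F(n), where E_F is the free neutron Fermi energy and ξ0 ≈ 0.37 is the Bertsch parameter. The rounded constants below are standard values used here only for illustration, not the paper's precise inputs:

    ```python
    import math

    HBAR2_OVER_2M = 20.72   # MeV fm^2, hbar^2 / (2 m_n), rounded
    XI_0 = 0.37             # Bertsch parameter of the unitary Fermi gas, approximate

    def e_unitary_gas(n):
        """Conjectured lower bound (MeV per neutron) on PNM at density n (fm^-3)."""
        k_f = (3.0 * math.pi ** 2 * n) ** (1.0 / 3.0)   # neutron Fermi momentum
        e_fermi = HBAR2_OVER_2M * k_f ** 2              # free Fermi energy
        return XI_0 * 0.6 * e_fermi                     # xi_0 * (3/5) E_F

    # At the empirical saturation density n ~ 0.16 fm^-3 this bound is roughly
    # 13 MeV per neutron, so equations of state whose symmetry parameters push
    # E_PNM below this value there would violate the conjecture.
    ```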

  11. A reliable computational workflow for the selection of optimal screening libraries.

    PubMed

    Gilad, Yocheved; Nadassy, Katalin; Senderowitz, Hanoch

    2015-01-01

    The experimental screening of compound collections is a common starting point in many drug discovery projects. Successes of such screening campaigns critically depend on the quality of the screened library. Many libraries are currently available from different vendors, yet the selection of the optimal screening library for a specific project is challenging. We have devised a novel workflow for the rational selection of project-specific screening libraries. The workflow accepts as input a set of virtual candidate libraries and applies the following steps to each library: (1) data curation; (2) assessment of ADME/T profile; (3) assessment of the number of promiscuous binders/frequent HTS hitters; (4) assessment of internal diversity; (5) assessment of similarity to known active compound(s) (optional); (6) assessment of similarity to in-house or otherwise accessible compound collections (optional). For ADME/T profiling, Lipinski's and Veber's rule-based filters were implemented and a new blood-brain barrier permeation model was developed and validated (85% and 74% success rates for the training and test sets, respectively). Diversity and similarity descriptors which demonstrated the best performance in terms of their ability to select either diverse or focused sets of compounds from three databases (DrugBank, CMC and ChEMBL) were identified and used for diversity and similarity assessments. The workflow was used to analyze nine common screening libraries available from six vendors. The results of this analysis are reported for each library, providing an assessment of its quality. Furthermore, a consensus approach was developed to combine the results of these analyses into a single score for selecting the optimal library under different scenarios. We have devised and tested a new workflow for the rational selection of screening libraries under different scenarios.
The current workflow was implemented using the Pipeline Pilot software; however, because it uses generic components, it can be easily adapted and reproduced by computational groups interested in the rational selection of screening libraries. Furthermore, the workflow could be readily modified to include additional components. This workflow has been routinely used in our laboratory for the selection of libraries in multiple projects and consistently selects libraries that are well balanced across multiple parameters.
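The scoring-and-consensus structure of the six-step workflow can be sketched as follows. This is an illustrative reconstruction, not the authors' Pipeline Pilot implementation: molecules are plain dicts of precomputed properties, the frequent-hitter flags and diversity value are assumed to come from upstream tools, and the equal consensus weights are an assumption.

```python
def passes_lipinski(m):
    # Lipinski's rule of five: MW <= 500, logP <= 5, HBD <= 5, HBA <= 10.
    return m["mw"] <= 500 and m["logp"] <= 5 and m["hbd"] <= 5 and m["hba"] <= 10

def passes_veber(m):
    # Veber's rules: rotatable bonds <= 10, TPSA <= 140 A^2.
    return m["rotb"] <= 10 and m["tpsa"] <= 140

def assess_library(mols, frequent_hitter_flags, pairwise_diversity):
    """Steps 2-4 of the workflow as per-criterion scores in [0, 1]."""
    n = len(mols)
    return {
        "admet": sum(passes_lipinski(m) and passes_veber(m) for m in mols) / n,
        "promiscuity": 1.0 - sum(frequent_hitter_flags) / n,
        "diversity": pairwise_diversity,  # e.g. mean (1 - Tanimoto similarity)
    }

def consensus_score(scores, weights):
    """Combine the per-criterion scores into one number for ranking libraries."""
    total = sum(weights[k] for k in scores)
    return sum(weights[k] * scores[k] for k in scores) / total

# Toy two-compound "library": one drug-like molecule, one rule-of-five violator.
mols = [
    {"mw": 320, "logp": 2.1, "hbd": 1, "hba": 4, "rotb": 5, "tpsa": 60},
    {"mw": 640, "logp": 6.5, "hbd": 3, "hba": 9, "rotb": 12, "tpsa": 150},
]
s = assess_library(mols, frequent_hitter_flags=[0, 1], pairwise_diversity=0.7)
score = consensus_score(s, {"admet": 1.0, "promiscuity": 1.0, "diversity": 1.0})
```

Candidate libraries would then be ranked by `score`, with the weights adjusted per project scenario.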

  12. Environment Modeling Using Runtime Values for JPF-Android

    NASA Technical Reports Server (NTRS)

    van der Merwe, Heila; Tkachuk, Oksana; Nel, Seal; van der Merwe, Brink; Visser, Willem

    2015-01-01

    Software applications are developed to be executed in a specific environment. This environment includes external native libraries that add functionality to the application and drivers that trigger the application's execution. For testing and verification, the environment of an application is abstracted using simplified models or stubs. Empty stubs, returning default values, are simple to generate automatically, but they do not perform well when the application expects specific return values. Symbolic execution is used to find input parameters for drivers and return values for library stubs, but it struggles to detect the values of complex objects. In this work-in-progress paper, we explore an approach to generate drivers and stubs based on values collected during runtime instead of using default values. Entry points and methods that need to be modeled are instrumented to log their parameters and return values. The instrumented applications are then executed using a driver and instrumented libraries. The values collected during runtime are used to generate driver and stub values on-the-fly that improve coverage during verification by enabling the execution of code that previously crashed or was missed. We are implementing this approach to improve the environment model of JPF-Android, our model checking and analysis tool for Android applications.
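The record-then-replay idea described above can be illustrated with a minimal sketch. This is a hypothetical Python analogue, not JPF-Android itself (which instruments Android/Java bytecode): a wrapper logs arguments and return values during a real run, and a generated stub replays the observed values instead of empty defaults.

```python
from functools import wraps

OBSERVED = {}  # method name -> list of (args, return value) seen at runtime

def record(fn):
    """Instrumentation wrapper: log each call's arguments and return value."""
    @wraps(fn)
    def wrapper(*args):
        result = fn(*args)
        OBSERVED.setdefault(fn.__name__, []).append((args, result))
        return result
    return wrapper

def make_stub(name, default=None):
    """Generate a stub that replays logged values, falling back to a default."""
    replay = dict(OBSERVED.get(name, []))
    def stub(*args):
        return replay.get(args, default)
    return stub

@record
def get_device_id(slot):          # stand-in for an environment/library call
    return "IMEI-%04d" % slot

get_device_id(0)                  # instrumented run collects real values
stub = make_stub("get_device_id", default="")
```

During verification, `stub` stands in for the real library: inputs seen at runtime get realistic values, so code paths that an empty default would miss (or crash) become reachable.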

  13. DiSCaMB: a software library for aspherical atom model X-ray scattering factor calculations with CPUs and GPUs.

    PubMed

    Chodkiewicz, Michał L; Migacz, Szymon; Rudnicki, Witold; Makal, Anna; Kalinowski, Jarosław A; Moriarty, Nigel W; Grosse-Kunstleve, Ralf W; Afonine, Pavel V; Adams, Paul D; Dominiak, Paulina Maria

    2018-02-01

    It has been recently established that the accuracy of structural parameters from X-ray refinement of crystal structures can be improved by using a bank of aspherical pseudoatoms instead of the classical spherical model of atomic form factors. This comes, however, at the cost of increased complexity of the underlying calculations. In order to facilitate the adoption of this more advanced electron density model by the broader community of crystallographers, a new software implementation called DiSCaMB ('densities in structural chemistry and molecular biology') has been developed. It addresses the challenge of providing high performance on modern computing architectures. With parallelization options for both multi-core processors and graphics processing units (using CUDA), the library features calculation of X-ray scattering factors and their derivatives with respect to structural parameters, gives access to intermediate steps of the scattering factor calculations (thus allowing for experimentation with modifications of the underlying electron density model), and provides tools for basic structural crystallographic operations. Permissively (MIT) licensed, DiSCaMB is an open-source C++ library that can be embedded in both academic and commercial tools for X-ray structure refinement.
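For orientation, the classical spherical-atom model that the aspherical pseudoatom bank refines upon is the structure-factor sum F(h) = Σ_j f_j(s) exp(2πi h·x_j). The sketch below is a toy pure-Python version with a made-up Gaussian form factor; real (and DiSCaMB's aspherical) form factors are parameterized quite differently, and DiSCaMB itself is a C++ library.

```python
import cmath

def structure_factor(hkl, frac_coords, widths):
    """F(h) = sum_j f_j(s) * exp(2*pi*i * h . x_j) for one reflection hkl."""
    s2 = sum(h * h for h in hkl)           # crude proxy for (sin(theta)/lambda)^2
    F = 0j
    for (x, y, z), b in zip(frac_coords, widths):
        phase = cmath.exp(2j * cmath.pi * (hkl[0] * x + hkl[1] * y + hkl[2] * z))
        F += cmath.exp(-b * s2) * phase    # toy isotropic Gaussian form factor
    return F

coords = [(0.0, 0.0, 0.0), (0.5, 0.5, 0.5)]   # CsCl-like two-atom motif
widths = [0.05, 0.05]
F110 = structure_factor((1, 1, 0), coords, widths)   # atoms scatter in phase
F100 = structure_factor((1, 0, 0), coords, widths)   # extinct for equal atoms
```

Derivatives of F with respect to positions and displacement parameters, which DiSCaMB also provides, follow by differentiating each term of this sum.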

  14. Use of Integral Data to Improve the European Activation File

    NASA Astrophysics Data System (ADS)

    Forrest, R. A.; Bém, P.; Kopecky, J.; von Möllendorff, U.; Pillon, M.; Seidel, K.; Simakov, S. P.; Sublet, J.-Ch.

    2005-05-01

    The European Activation File (EAF) is the source of nuclear data for fusion activation calculations that has been developed in Europe. In order to trust calculations made with the data, validation is essential. A key part of this is the comparison of the EAF data with integral experiments made in fusion-relevant neutron spectra on a wide range of materials. A review of the results for the EAF-2001 and -2003 libraries is given, leading on to the recent work on the test library EAF-2004. The latter is innovative in extending the upper energy range from 20 to 60 MeV. Although integral data above 20 MeV are scarce, recent measurements have meant that a start at these energies can be made. Examples of reactions that are considered to be validated are given; validation requires that both the integral and differential data are consistent with the EAF data. Cases where integral data are good but differential data are lacking or discrepant are highlighted, as are cases where both types of experimental data differ from EAF. The methodology for extracting effective cross sections from measurements of activity and heat, and for using these to present C/E plots, is detailed. This technique has the advantage that the integral data can be used during EAF library development rather than only when the library has been finalised. The improvement of the EAF cross-section data in the various versions of the library is demonstrated.
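The quantities behind such C/E plots can be sketched generically: a spectrum-averaged ("effective") cross section is the flux-weighted mean of the group-wise cross sections, and C/E is the ratio of the library-calculated value to the experimentally derived one. This is a generic sketch of those two definitions, not the EAF processing codes.

```python
def effective_xs(sigma, flux):
    """Spectrum-averaged cross section: sum(sigma_g * phi_g) / sum(phi_g)."""
    assert len(sigma) == len(flux)
    return sum(s * f for s, f in zip(sigma, flux)) / sum(flux)

def c_over_e(sigma_calc, sigma_exp):
    """C/E ratio: calculated over experimental effective cross section."""
    return sigma_calc / sigma_exp

# Toy 3-group data (cross sections in barns, flux in arbitrary units):
sigma_library = [0.10, 0.50, 2.00]
phi = [1.0, 2.0, 1.0]
sig_eff = effective_xs(sigma_library, phi)   # (0.1 + 1.0 + 2.0) / 4
```

A C/E value near 1 for a reaction, together with consistent differential data, is what the abstract means by "validated".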

  15. Detection of Volatile Organic Compounds by Self-assembled Monolayer Coated Sensor Array with Concentration-independent Fingerprints

    PubMed Central

    Chang, Ye; Tang, Ning; Qu, Hemi; Liu, Jing; Zhang, Daihua; Zhang, Hao; Pang, Wei; Duan, Xuexin

    2016-01-01

    In this paper, we have modeled and analyzed the affinities and kinetics of volatile organic compound (VOC) adsorption (and desorption) on various surface chemical groups using a film bulk acoustic resonator (FBAR) array functionalized with multiple self-assembled monolayers (SAMs). The high-frequency, micro-scale resonator provides improved sensitivity in the detection of VOCs at trace levels. From the study of affinities and kinetics, three concentration-independent intrinsic parameters (monolayer adsorption capacity, adsorption energy constant and desorption rate) of gas-surface interactions are obtained to contribute to a multi-parameter fingerprint library of VOC analytes. Effects of the functional group's properties on gas-surface interactions are also discussed. The proposed sensor array with a concentration-independent fingerprint library shows potential as a portable electronic nose (e-nose) system for VOC discrimination and gas-sensitive material selection. PMID:27045012
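The "concentration-independent fingerprint" idea can be illustrated with a Langmuir adsorption model: responses at different concentrations all share the same monolayer capacity and adsorption energy constant, and the desorption transient yields a rate constant. This is a minimal sketch under the assumption that the FBAR response follows Langmuir-type kinetics; the numbers and fitting scheme are illustrative, not the paper's data analysis.

```python
import math

def langmuir_response(c, capacity, K):
    """Steady-state Langmuir response at concentration c: cap * K*c / (1 + K*c)."""
    return capacity * K * c / (1.0 + K * c)

def fit_langmuir_two_points(c1, r1, c2, r2):
    """Recover (capacity, K) exactly from two (concentration, response) pairs
    via the double-reciprocal linearization 1/r = 1/cap + (1/(cap*K)) * (1/c)."""
    slope = (1 / r1 - 1 / r2) / (1 / c1 - 1 / c2)
    intercept = 1 / r1 - slope / c1
    return 1.0 / intercept, intercept / slope   # (capacity, K)

def desorption_rate(t, r0, rt):
    """k_d from an exponential decay r(t) = r0 * exp(-k_d * t)."""
    return math.log(r0 / rt) / t

# Two measurements of the same analyte/surface pair at different doses
# recover the same intrinsic (capacity, K) fingerprint:
cap, K = fit_langmuir_two_points(1.0, langmuir_response(1.0, 5.0, 2.0),
                                 4.0, langmuir_response(4.0, 5.0, 2.0))
```

Because `cap`, `K`, and `k_d` do not depend on the dose, they can serve as per-analyte fingerprints across an array of differently functionalized surfaces.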

  16. Amarillo National Resource Center for Plutonium quarterly technical progress report, August 1, 1997--October 31, 1997

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    NONE

    This report summarizes activities of the Amarillo National Resource Center for Plutonium during the quarter. The report describes the Electronic Resource Library; DOE support activities; current and future environmental health and safety programs; pollution prevention and pollution avoidance; communication, education, training, and community involvement programs; and nuclear and other material studies, including plutonium storage and disposition studies.

  17. Development of microsatellite markers in Myotis sodalis and cross-species amplification in M. gricescens, M. leibii, M. lucifugus, and M. septentrionalis

    Treesearch

    Robert G. Trujillo; Sybill K. Amelon

    2009-01-01

    The Indiana bat (Myotis sodalis) is a highly endangered vespertilionid bat whose distribution is associated with limestone caves in the eastern United States. We present nine new polymorphic nuclear microsatellite markers for Myotis sodalis developed using an enriched library method. A total of 62 M. sodalis...

  18. New Kohn-Sham density functional based on microscopic nuclear and neutron matter equations of state

    NASA Astrophysics Data System (ADS)

    Baldo, M.; Robledo, L. M.; Schuck, P.; Viñas, X.

    2013-06-01

    A new version of the Barcelona-Catania-Paris energy functional is applied to a study of nuclear masses and other properties. The functional is largely based on calculated ab initio nuclear and neutron matter equations of state. Compared to typical Skyrme functionals having 10-12 parameters apart from spin-orbit and pairing terms, the new functional has only 2 or 3 adjusted parameters, fine-tuning the nuclear matter binding energy and fixing the surface energy of finite nuclei. An energy rms value of 1.58 MeV is obtained from a fit of these three parameters to the 579 measured masses reported in the Audi and Wapstra compilation [Nucl. Phys. A 729, 337 (2003)]. This rms value compares favorably with those obtained using other successful mean-field theories, which range from 1.5 to 3.0 MeV for optimized Skyrme functionals and 0.7 to 3.0 MeV for the Gogny functionals. The other properties that have been calculated and compared to experiment are nuclear radii, the giant monopole resonance, and spontaneous fission lifetimes.
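The quoted rms value is the usual figure of merit for such mass fits; over the N = 579 measured masses it is

```latex
\sigma_{\mathrm{rms}}
= \sqrt{\frac{1}{N}\sum_{i=1}^{N}\bigl(M_i^{\mathrm{exp}} - M_i^{\mathrm{calc}}\bigr)^{2}}
= 1.58~\mathrm{MeV}.
```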

  19. Systematic Biological Filter Design with a Desired I/O Filtering Response Based on Promoter-RBS Libraries.

    PubMed

    Hsu, Chih-Yuan; Pan, Zhen-Ming; Hu, Rei-Hsing; Chang, Chih-Chun; Cheng, Hsiao-Chun; Lin, Che; Chen, Bor-Sen

    2015-01-01

    In this study, robust biological filters with an external control to match a desired input/output (I/O) filtering response are engineered based on well-characterized promoter-RBS libraries and a cascade gene circuit topology. In the field of synthetic biology, a biological filter system serves as a powerful detector or sensor that senses different molecular signals and produces a specific output response only if the concentration of the input molecular signal is higher or lower than a specified threshold. The proposed systematic design method for robust biological filters is summarized in three steps. Firstly, several well-characterized promoter-RBS libraries are established for biological filter design by identifying and collecting the quantitative and qualitative characteristics of their promoter-RBS components via a nonlinear parameter estimation method. Then, the topology of the synthetic biological filter is decomposed into three cascaded gene regulatory modules, and an appropriate promoter-RBS library is selected for each module to achieve the desired I/O specification of the biological filter. Finally, based on the proposed systematic method, a robust, externally tunable biological filter is engineered by searching the promoter-RBS component libraries and a control inducer concentration library to achieve the optimal reference match for the specified I/O filtering response.
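Threshold behavior of the kind described above is commonly modeled with Hill-type input/output functions for each regulatory stage. The sketch below is illustrative only: the Hill coefficient, threshold K, and the choice of a single activation stage are assumptions, not parameters taken from the paper's promoter-RBS libraries.

```python
def hill_activation(u, K, n):
    """Promoter activity rising with input u (high-pass building block)."""
    return u ** n / (K ** n + u ** n)

def hill_repression(u, K, n):
    """Promoter activity falling with input u (low-pass building block)."""
    return K ** n / (K ** n + u ** n)

def filter_output(u, K=1.0, n=4):
    """Steady-state high-pass filter: output near 0 below K, near 1 above K.
    A steeper Hill coefficient n sharpens the ON/OFF transition."""
    return hill_activation(u, K, n)

low = filter_output(0.25)   # input well below the threshold K
high = filter_output(4.0)   # input well above the threshold K
```

Cascading such stages (e.g. repression followed by activation), with each stage's parameters drawn from a characterized promoter-RBS library, is one way to shape the overall I/O response toward a desired specification.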

  20. Construction of Chinese adult male phantom library and its application in the virtual calibration of in vivo measurement.

    PubMed

    Chen, Yizheng; Qiu, Rui; Li, Chunyan; Wu, Zhen; Li, Junli

    2016-03-07

    In vivo measurement is a main method of internal contamination evaluation, particularly for large numbers of people after a nuclear accident. Before practical application, it is necessary to obtain the counting efficiency of the detector by calibration. Virtual calibration based on Monte Carlo simulation usually uses a reference human computational phantom, and the morphological difference between the monitored personnel and the calibrated phantom may lead to deviations in the counting efficiency. Therefore, a phantom library covering a wide range of heights and total body masses is needed. In this study, a Chinese reference adult male polygon surface (CRAM_S) phantom was constructed based on the CRAM voxel phantom, with the organ models adjusted to match the Chinese reference data. The CRAM_S phantom was then transformed to a sitting posture for convenience in practical monitoring. Referring to the mass and height distribution of Chinese adult males, a phantom library containing 84 phantoms was constructed by deforming the reference surface phantom. Phantoms in the library have 7 different heights ranging from 155 cm to 185 cm, with 12 phantoms of different total body masses at each height. As an example application, organ-specific and total counting efficiencies of Ba-133 were calculated using the MCNPX code, with two series of phantoms selected from the library. The influence of morphological variation on the counting efficiency was analyzed. The results show that using only the reference phantom in virtual calibration may lead to an error of 68.9% in the total counting efficiency. Thus, the influence of morphological difference on virtual calibration can be greatly reduced by using a phantom library with a wide range of masses and heights instead of a single reference phantom.
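The matching step implied by such a library (pick the phantom closest to the monitored person, then use its precomputed counting efficiency) can be sketched as a nearest-neighbor lookup over the 7 × 12 grid. The mass grid values and the distance normalization below are illustrative assumptions; the paper specifies only the height range (155-185 cm) and the 7 × 12 layout.

```python
HEIGHTS_CM = [155, 160, 165, 170, 175, 180, 185]                 # 7 heights
MASSES_KG = [50, 55, 60, 65, 70, 75, 80, 85, 90, 95, 100, 105]  # 12 masses (assumed)

def nearest_phantom(height_cm, mass_kg):
    """Return the (height, mass) grid phantom nearest to the monitored person,
    using a simple distance with both axes normalized to one grid step."""
    def dist(h, m):
        return ((h - height_cm) / 5.0) ** 2 + ((m - mass_kg) / 5.0) ** 2
    return min(((h, m) for h in HEIGHTS_CM for m in MASSES_KG),
               key=lambda hm: dist(*hm))
```

The counting efficiency calibrated for the selected phantom then replaces the single-reference-phantom value, reducing the morphological error discussed in the abstract.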

  1. Construction of Chinese adult male phantom library and its application in the virtual calibration of in vivo measurement

    NASA Astrophysics Data System (ADS)

    Chen, Yizheng; Qiu, Rui; Li, Chunyan; Wu, Zhen; Li, Junli

    2016-03-01

    In vivo measurement is a main method of internal contamination evaluation, particularly for large numbers of people after a nuclear accident. Before practical application, it is necessary to obtain the counting efficiency of the detector by calibration. Virtual calibration based on Monte Carlo simulation usually uses a reference human computational phantom, and the morphological difference between the monitored personnel and the calibrated phantom may lead to deviations in the counting efficiency. Therefore, a phantom library covering a wide range of heights and total body masses is needed. In this study, a Chinese reference adult male polygon surface (CRAM_S) phantom was constructed based on the CRAM voxel phantom, with the organ models adjusted to match the Chinese reference data. The CRAM_S phantom was then transformed to a sitting posture for convenience in practical monitoring. Referring to the mass and height distribution of Chinese adult males, a phantom library containing 84 phantoms was constructed by deforming the reference surface phantom. Phantoms in the library have 7 different heights ranging from 155 cm to 185 cm, with 12 phantoms of different total body masses at each height. As an example application, organ-specific and total counting efficiencies of Ba-133 were calculated using the MCNPX code, with two series of phantoms selected from the library. The influence of morphological variation on the counting efficiency was analyzed. The results show that using only the reference phantom in virtual calibration may lead to an error of 68.9% in the total counting efficiency. Thus, the influence of morphological difference on virtual calibration can be greatly reduced by using a phantom library with a wide range of masses and heights instead of a single reference phantom.

  2. Numerical study of wave propagation around an underground cavity: acoustic case

    NASA Astrophysics Data System (ADS)

    Esterhazy, Sofi; Perugia, Ilaria; Schöberl, Joachim; Bokelmann, Götz

    2015-04-01

    Motivated by the need to detect an underground cavity, possibly caused by a nuclear explosion or weapon test, within the procedure of an On-Site Inspection (OSI) of the Comprehensive Nuclear-Test-Ban Treaty Organization (CTBTO), we aim to provide a basic numerical study of the wave propagation around and inside such an underground cavity. The aim of the CTBTO is to ban all nuclear explosions of any size anywhere, by anyone. Therefore, it is essential to build a powerful strategy to efficiently investigate and detect critical signatures such as gas-filled cavities, rubble zones and fracture networks below the surface. One method to investigate the geophysical properties of an underground cavity allowed by the Comprehensive Nuclear-Test-Ban Treaty is referred to as 'resonance seismometry', a resonance method that uses passive or active seismic techniques relying on seismic cavity vibrations. This method is in fact not yet entirely determined by the Treaty, and there are only a few experimental examples that have been documented well enough to build a proper scientific groundwork. This motivates us to investigate the problem on a purely numerical level and to simulate these events based on recent advances in the mathematical understanding of the underlying physical phenomena. Here, we focus our numerical study on the propagation of P-waves in two dimensions. An extension to three dimensions, as well as inclusion of the full elastic wave field, is planned as follow-up work. For the numerical simulations of wave propagation we use a high-order finite element discretization, which has the significant advantage that it can be extended easily from simple toy designs to complex and irregularly shaped geometries without excessive effort. Our computations are done with the parallel finite element library NGSolve on top of the automatic 2D/3D tetrahedral mesh generator NETGEN (http://sourceforge.net/projects/ngsolve/).
Using the basic mathematical understanding of the physical equations and the numerical algorithms, it is possible to investigate the wave field over a large bandwidth of wave numbers. This means we can apply our calculations to a wide range of parameters while keeping the numerical error explicitly under control. The accurate numerical modeling can facilitate the development of proper analysis techniques to detect the remnants of an underground nuclear test, help to set a rigorous scientific base for OSI, and contribute to bringing the Treaty into force.
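The underlying PDE being solved is the 2D acoustic wave equation p_tt = c²(p_xx + p_yy). As a deliberately simplified stand-in for the paper's high-order FEM (NGSolve/NETGEN) approach, a second-order finite-difference leapfrog update with a low-velocity block mimicking a gas-filled cavity looks like this; all geometry and velocity values are illustrative.

```python
def leapfrog_step(p, p_old, c2dt2_dx2):
    """One explicit time step of p_tt = c^2 * laplacian(p) on a square grid.
    c2dt2_dx2[i][j] = (c(i,j) * dt / dx)**2; stability needs values <= 0.5."""
    n = len(p)
    p_new = [[0.0] * n for _ in range(n)]
    for i in range(1, n - 1):
        for j in range(1, n - 1):
            lap = p[i+1][j] + p[i-1][j] + p[i][j+1] + p[i][j-1] - 4 * p[i][j]
            p_new[i][j] = 2 * p[i][j] - p_old[i][j] + c2dt2_dx2[i][j] * lap
    return p_new  # boundaries held fixed at p = 0

n = 41
# Slow (low-velocity) block in the middle plays the role of the cavity:
coef = [[0.25 if 15 <= i < 25 and 15 <= j < 25 else 0.45
         for j in range(n)] for i in range(n)]
p0 = [[0.0] * n for _ in range(n)]
p0[5][20] = 1.0                       # point source above the cavity
p1 = leapfrog_step(p0, [[0.0] * n for _ in range(n)], coef)
```

Stepping this update in time lets the pulse propagate, scatter off the velocity contrast, and set up the cavity vibrations that resonance seismometry aims to exploit; the FEM treatment in the paper does the same on unstructured, geometry-conforming meshes at high order.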

  3. Neutron Data Compilation Centre, European Nuclear Energy Agency, Newsletter No. 13

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    None

    1972-02-15

    This edition of the newsletter is intended to inform all users of neutron data about the content of the CCDN Experimental Neutron Data Library as of February 1972. It supersedes the last index issue, no. 11, published in October 1969. Since then, the database has been greatly enlarged thanks to the collaboration of neutron data users in the ENEA area (Western Europe plus Japan) and to the truly worldwide cooperation between the four existing data centers: NNCSC at Brookhaven Lab. in Upton, NY, United States; CCDN in Gif-sur-Yvette, France; Centr po Jadernym Dannym in Obninsk, USSR; and the Nuclear Data Section, IAEA, Vienna, Austria.

  4. CERT tribal internship program. Final intern report: Lewis Yellowrobe, 1995

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    NONE

    1998-09-01

    The purpose of this internship was to present state legislators with the history and an overview of the Department of Energy's policies towards occupational health and safety during cleanup of nuclear weapons production facilities. The approach used library research and phone and personal interviews to acquire information on DOE policies. This intern report contains the final report to legislators, entitled "Environmental restoration and waste management: Worker health and safety concerns during nuclear facility cleanup." It presents the current status of DOE occupational health and safety at production facilities, Congressional intent, past DOE occupational policies, and options for state legislators to use to get involved with DOE policy direction.

  5. Optimization of the water chemistry of the primary coolant at nuclear power plants with VVER

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Barmin, L. F.; Kruglova, T. K.; Sinitsyn, V. P.

    2005-01-15

    Results of the use of an automatic hydrogen-content meter for controlling the 'hydrogen' parameter in the primary coolant circuit of the Kola nuclear power plant are presented. It is shown that the correlation between the 'hydrogen' parameter in the coolant and the 'hydrazine' parameter in the makeup water can be used for controlling the water chemistry of the primary coolant system, which should make it possible to optimize the water chemistry at different power levels.

  6. Does the Fuhrman or World Health Organization/International Society of Urological Pathology Grading System Apply to the Xp11.2 Translocation Renal Cell Carcinoma?: A 10-Year Single-Center Study.

    PubMed

    Liu, Ning; Gan, Weidong; Qu, Feng; Wang, Zhen; Zhuang, Wenyuan; Agizamhan, Sezim; Xu, Linfeng; Yin, Juanjuan; Guo, Hongqian; Li, Dongmei

    2018-04-01

    The Fuhrman and World Health Organization/International Society of Urological Pathology (WHO/ISUP) grading systems are widely used to predict survival for patients with conventional renal cell carcinoma. To determine the validity of these nuclear grading systems (both Fuhrman and WHO/ISUP) and of the individual components of the Fuhrman grading system in predicting the prognosis of Xp11.2 translocation renal cell carcinoma (Xp11.2 tRCC), we identified and followed up 47 patients with Xp11.2 tRCC in our center from January 2007 to June 2017. Fuhrman and WHO/ISUP grades were reassigned by two pathologists. Nuclear size and shape were determined for each case based on the greatest degree of nuclear pleomorphism using image analysis software. Univariate and multivariate analyses were performed to evaluate the capacity of the grading systems and nuclear parameters to predict overall survival and progression-free survival. On univariate Cox regression analysis, the parameters of nuclear size were associated significantly with overall survival and progression-free survival, whereas the grading systems and the parameters of nuclear shape failed to reach a significant correlation. On multivariate analysis, however, none of the parameters was independently associated with survival. Our findings indicate that neither the Fuhrman nor the WHO/ISUP grading system is applicable to Xp11.2 tRCC. The assessment of nuclear size instead may be a novel outcome predictor for patients with Xp11.2 tRCC. Copyright © 2018 American Society for Investigative Pathology. Published by Elsevier Inc. All rights reserved.

  7. Analysis of a library of macaque nuclear mitochondrial sequences confirms macaque origin of divergent sequences from old oral polio vaccine samples.

    PubMed

    Vartanian, Jean-Pierre; Wain-Hobson, Simon

    2002-05-28

    Nuclear mtDNA sequences (numts) are a widespread family of paralogs evolving as pseudogenes in chromosomal DNA [Zhang, D. E. & Hewitt, G. M. (1996) TREE 11, 247-251 and Bensasson, D., Zhang, D., Hartl, D. L. & Hewitt, G. M. (2001) TREE 16, 314-321]. When trying to identify the species origin of an unknown DNA sample by way of an mtDNA locus, PCR may amplify both mtDNA and numts. Indeed, occasionally numts dominate, confounding attempts at species identification [Bensasson, D., Zhang, D. X. & Hewitt, G. M. (2000) Mol. Biol. Evol. 17, 406-415; Wallace, D. C., et al. (1997) Proc. Natl. Acad. Sci. USA 94, 14900-14905]. Rhesus and cynomolgus macaque mtDNA haplotypes were identified in a study of oral polio vaccine samples dating from the late 1950s [Blancou, P., et al. (2001) Nature (London) 410, 1045-1046]. They were accompanied by a number of putative numts. To confirm that these putative numts were of macaque origin, a library of numts corresponding to a small segment of the 12S rDNA locus was made using DNA from a Chinese rhesus macaque. A broad distribution was found, with up to 30% sequence variation. Phylogenetic analysis showed that the evolutionary trajectories of numts and bona fide mtDNA haplotypes do not overlap, with the single exception of the host species; mtDNA fragments are continually crossing over into the germ line. In the case of the divergent mtDNA sequences from old oral polio vaccine samples [Blancou, P., et al. (2001) Nature (London) 410, 1045-1046], all were closely related to numts in the Chinese macaque library.

  8. Development of Neutron Energy Spectral Signatures for Passive Monitoring of Spent Nuclear Fuels in Dry Cask Storage

    NASA Astrophysics Data System (ADS)

    Harkness, Ira; Zhu, Ting; Liang, Yinong; Rauch, Eric; Enqvist, Andreas; Jordan, Kelly A.

    2018-01-01

    Demand for spent nuclear fuel dry casks as an interim storage solution has increased globally, and the IAEA has expressed a need for robust safeguards and verification technologies for ensuring the continuity of knowledge and the integrity of radioactive materials inside spent fuel casks. Existing research has focused on "fingerprinting" casks based on count-rate statistics to represent radiation emission signatures. The current research aims to expand this approach to include neutron energy spectral information as part of the fuel characteristics. First, spent fuel composition data are taken from the Next Generation Safeguards Initiative Spent Fuel Libraries, representative of Westinghouse 17×17 PWR assemblies. The ORIGEN-S code then calculates the spontaneous fission and (α,n) emissions for individual fuel rods, followed by detailed MCNP simulations of neutrons transported through the fuel assemblies. A comprehensive database of neutron energy spectral profiles is to be constructed, with different enrichment, burn-up, and cooling time conditions. The end goal is to utilize the computational spent fuel library, a predictive algorithm, and a pressurized 4He scintillator to verify the spent fuel assemblies inside a cask. This work identifies neutron spectral signatures that correlate with the cooling time of spent fuel. Both the total and relative contributions from spontaneous fission and (α,n) change noticeably with respect to cooling time, due to the relatively short half-life (18 years) of the major neutron source 244Cm. Identification of this and other neutron spectral signatures allows the characterization of spent nuclear fuels in dry cask storage.
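The cooling-time dependence described above follows from simple decay arithmetic: the spontaneous-fission source, dominated by 244Cm, decays with its roughly 18-year half-life, shifting the balance between spontaneous-fission and (α,n) emission. A minimal sketch under two illustrative assumptions: the source magnitudes are made up, and the (α,n) term is crudely held constant (in reality it decays on its own timescales).

```python
import math

CM244_HALF_LIFE_Y = 18.1  # approximate half-life of 244Cm, in years

def sf_source(t_years, s0=1.0e8):
    """Spontaneous-fission neutron rate after t years of cooling (n/s)."""
    return s0 * math.exp(-math.log(2.0) * t_years / CM244_HALF_LIFE_Y)

def sf_fraction(t_years, s0=1.0e8, alpha_n=5.0e6):
    """Spontaneous-fission share of total emission, with a crudely
    constant (alpha,n) component for illustration."""
    sf = sf_source(t_years, s0)
    return sf / (sf + alpha_n)

f0, f40 = sf_fraction(0.0), sf_fraction(40.0)  # SF share drops with cooling
```

Because spontaneous-fission and (α,n) neutrons have different energy spectra, this shifting ratio shows up as the cooling-time signature in the measured neutron spectrum.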

  9. Parameter study of dual-mode space nuclear fission solid core power and propulsion systems, NUROC3A. AMS report No. 1239c

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Smith, W.W.; Layton, J.P.

    1976-09-13

    The three-volume report describes a dual-mode nuclear space power and propulsion system concept that employs an advanced solid-core nuclear fission reactor coupled via heat pipes to one of several electric power conversion systems. The NUROC3A systems analysis code was designed to provide the user with performance characteristics of the dual-mode system. Volume 3 describes utilization of the NUROC3A code to produce a detailed parameter study of the system.

  10. Shutdown Dose Rate Analysis for the long-pulse D-D Operation Phase in KSTAR

    NASA Astrophysics Data System (ADS)

    Park, Jin Hun; Han, Jung-Hoon; Kim, D. H.; Joo, K. S.; Hwang, Y. S.

    2017-09-01

    KSTAR is a medium-size, fully superconducting tokamak. The deuterium-deuterium (D-D) reaction in the KSTAR tokamak generates neutrons with a peak yield of 3.5×10^16 per second through a pulse operation of 100 seconds. The effects of neutron generation in the full D-D high-power KSTAR operation mode on the machine, such as activation, shutdown dose rate, and nuclear heating, are estimated to ensure safety during operation, maintenance, and machine upgrades. The nuclear heating of the in-vessel components and the neutron activation of the surrounding materials have been investigated. The dose rates during operation and after shutdown of KSTAR have been calculated with a 3D CAD model of KSTAR using the Monte Carlo code MCNP5 (neutron flux and decay photons), the inventory code FISPACT (activation and decay photons) and the FENDL-2.1 nuclear data library.

  11. Neutronics Analysis of Water-Cooled Ceramic Breeder Blanket for CFETR

    NASA Astrophysics Data System (ADS)

    Zhu, Qingjun; Li, Jia; Liu, Songlin

    2016-07-01

    In order to investigate the nuclear response of the water-cooled ceramic breeder (WCCB) blanket models for CFETR, a detailed 3D neutronics model of a 22.5° torus sector was developed based on the integrated geometry of CFETR, including heterogeneous WCCB blanket models, shield, divertor, vacuum vessel, toroidal and poloidal magnets, and ports. Using the Monte Carlo N-Particle transport code MCNP5 and the IAEA Fusion Evaluated Nuclear Data Library FENDL-2.1, the neutronics analyses were performed. The neutron wall loading, tritium breeding ratio (TBR), nuclear heating, neutron-induced atomic displacement damage, and gas production were determined. The results indicate that achieving a global TBR of no less than 1.2 will be a major challenge for the water-cooled ceramic breeder blanket for CFETR. This work was supported by the National Magnetic Confinement Fusion Science Program of China (Nos. 2013GB108004, 2014GB122000, and 2014GB119000) and the National Natural Science Foundation of China (No. 11175207).

  12. Nuclear Engine System Simulation (NESS) version 2.0

    NASA Technical Reports Server (NTRS)

    Pelaccio, Dennis G.; Scheil, Christine M.; Petrosky, Lyman J.

    1993-01-01

    The topics are presented in viewgraph form and include the following: nuclear thermal propulsion (NTP) engine system analysis program development; nuclear thermal propulsion engine analysis capability requirements; team resources used to support NESS development; expanded liquid engine simulations (ELES) computer model; ELES verification examples; NESS program development evolution; past NTP ELES analysis code modifications and verifications; general NTP engine system features modeled by NESS; representative NTP expander, gas generator, and bleed engine system cycles modeled by NESS; NESS program overview; NESS program flow logic; enabler (NERVA-type) nuclear thermal rocket engine; prismatic fuel elements and supports; reactor fuel and support element parameters; reactor parameters as a function of thrust level; internal shield sizing; and reactor thermal model.

  13. Use of Massive Parallel Computing Libraries in the Context of Global Gravity Field Determination from Satellite Data

    NASA Astrophysics Data System (ADS)

    Brockmann, J. M.; Schuh, W.-D.

    2011-07-01

    The estimation of the global Earth's gravity field, parametrized as a finite spherical harmonic series, is computationally demanding. The computational effort depends on the one hand on the maximal resolution of the spherical harmonic expansion (i.e. the number of parameters to be estimated) and on the other hand on the number of observations (which number in the millions, e.g. for observations from the GOCE satellite mission). To cope with these demands, a massively parallel software package based on high-performance computing (HPC) libraries such as ScaLAPACK, PBLAS and BLACS was designed in the context of GOCE HPF WP6000 and the GOCO consortium. A prerequisite for the use of these libraries is that all matrices are block-cyclic distributed on a processor grid formed by a large number of (distributed memory) computers. Using this set of standard HPC libraries has the benefit that, once the matrices are distributed across the computer cluster, a huge set of efficient and highly scalable linear algebra operations can be used.
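    The block-cyclic distribution mentioned above can be illustrated with a short sketch of the 1-D index mapping that ScaLAPACK applies independently to rows and columns of a 2-D processor grid. The function name and parameters below are illustrative, not part of any library API.

```python
def block_cyclic_owner(g, nb, nprocs):
    """Map a global (0-based) index g to (owning process, local index)
    for block size nb on nprocs processes (1-D block-cyclic layout)."""
    block = g // nb                # which block the index falls in
    proc = block % nprocs          # blocks are dealt out cyclically
    local_block = block // nprocs  # position of that block on its owner
    local = local_block * nb + g % nb
    return proc, local

# Distribute 10 rows with block size 2 over 3 processes:
layout = [block_cyclic_owner(g, nb=2, nprocs=3) for g in range(10)]
# Rows 0-1 land on process 0, rows 2-3 on process 1, rows 4-5 on
# process 2, then the cycle repeats with rows 6-7 back on process 0.
```

    This cyclic dealing of blocks is what keeps the load balanced when operations touch only part of the matrix, which is why the libraries require it before any distributed linear algebra can run.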

  14. Empirical calibration of the near-infrared Ca ii triplet - I. The stellar library and index definition

    NASA Astrophysics Data System (ADS)

    Cenarro, A. J.; Cardiel, N.; Gorgas, J.; Peletier, R. F.; Vazdekis, A.; Prada, F.

    2001-09-01

    A new stellar library at the near-IR spectral region, developed for the empirical calibration of the Ca ii triplet and stellar population synthesis modelling, is presented. The library covers the range λλ8348-9020 at 1.5-Å (FWHM) spectral resolution, and consists of 706 stars spanning a wide range in atmospheric parameters. We have defined a new set of near-IR indices, CaT*, CaT and PaT, which mostly overcome the limitations of previous definitions, the former being especially suited for the measurement of the Ca ii triplet strength corrected for the contamination from Paschen lines. We also present a comparative study of the new and the previous Ca indices, as well as the corresponding transformations between the different systems. A thorough analysis of the sources of index errors and the procedure to calculate them is given. Finally, index and error measurements for the whole stellar library are provided together with the final spectra.

  15. Comprehensive overview of the Point-by-Point model of prompt emission in fission

    NASA Astrophysics Data System (ADS)

    Tudora, A.; Hambsch, F.-J.

    2017-08-01

    The investigation of prompt emission in fission is very important for understanding the fission process and for improving the quality of evaluated nuclear data required for new applications. In the last decade remarkable efforts were made both in the development of prompt emission models and in the experimental investigation of the properties of fission fragments and of prompt neutron and γ-ray emission. The accurate experimental data concerning the prompt neutron multiplicity as a function of fragment mass and total kinetic energy (TKE) for 252Cf(SF) and 235U(n, f) recently measured at JRC-Geel (as well as various other prompt emission data) allow a consistent and very detailed validation of the Point-by-Point (PbP) deterministic model of prompt emission. The PbP model results describe very well a large variety of experimental data, starting from the multi-parametric matrices of prompt neutron multiplicity ν(A,TKE) and γ-ray energy E_{γ}(A,TKE), which validate the model itself, passing through different average prompt emission quantities as a function of A (e.g., ν(A), E_{γ}(A), ⟨ε⟩(A), etc.) and as a function of TKE (e.g., ν(TKE), E_{γ}(TKE)), up to the prompt neutron distribution P(ν) and the total average prompt neutron spectrum. The PbP model does not use free or adjustable parameters. To calculate the multi-parametric matrices it needs only data included in the Reference Input Parameter Library (RIPL) of the IAEA. To provide average prompt emission quantities as a function of A, as a function of TKE and total average quantities, the multi-parametric matrices are averaged over reliable experimental fragment distributions. The PbP results are also in agreement with the results of the Monte Carlo prompt emission codes FIFRELIN, CGMF and FREYA. 
The good description of a large variety of experimental data proves the capability of the PbP model to be used in nuclear data evaluations and its reliability to predict prompt emission data for fissioning nuclei and incident energies for which the experimental information is completely missing. The PbP treatment can also provide input parameters of the improved Los Alamos model with non-equal residual temperature distributions recently reported by Madland and Kahler, especially for fissioning nuclei without any experimental information concerning the prompt emission.

  16. Refinement of parameters of weak nuclear explosions conducted at the Semipalatinsk test site on the basis of historical seismograms study

    NASA Astrophysics Data System (ADS)

    Sokolova, Inna

    2014-05-01

    Many researchers working in the field of monitoring and discrimination of nuclear tests encounter the problem that seismic catalogues lack source-parameter information for weak nuclear explosions. Typically, the origin time, coordinates and magnitude are absent; only the date, approximate coordinates and information about the explosion yield are available. A huge amount of work on the recovery of parameters of small underground nuclear explosions conducted at the Semipalatinsk Test Site (STS), using records of analogue seismic stations of the USSR located at regional distances, was carried out by V. Khalturin, T. Rautian and P. Richards (Pure and Applied Geophysics, 2001). However, while underground nuclear explosions are studied and described in the literature quite well, the air and contact explosions were small and were not recorded by standard permanent seismic stations. In 1961-1962 the maximum number of air and contact explosions was conducted at the Opytnoye polye site of the STS. We managed to find and analyze additional seismic data from some temporary and permanent stations. At that time IPE AS USSR had installed a network of high-sensitivity stations along the Pamir-Baykal profile to study the structure of the Earth's crust and upper mantle; the profile length was 3500 km. The epicentral distance from some stations of the profile to Opytnoye polye was 300-400 km. In addition, the permanent seismic station Semipalatinsk (SEM), located 175 km away from the site, had started its operation; the seismograms from this station became available only recently. The digitized historical seismograms allowed us to recover and add parameters for more than 36 air and surface explosions. Origin time, coordinates, magnitudes mpv and MLV, and energy class K were determined for the explosions. 
    A regional travel-time curve for Central Kazakhstan, constructed using records of calibration chemical explosions conducted at the STS in 1997-2000 and of ground-truth underground nuclear explosions, was used to determine the kinematic parameters of the explosions. MLV, mpv and energy class K were determined for all underground nuclear explosions conducted at the STS using historical seismograms from Central Asian stations. Dependencies of regional magnitudes on yield were obtained for air and underground nuclear explosions. Thus, the use of historical seismograms recorded at regional distances makes it possible to recover and replenish the seismic catalogues of past nuclear explosions for further use in scientific investigations and monitoring tasks.

  17. Nuclear Data Sheets page at the NNDC

    Science.gov Websites

    A few plots that help characterize the Nuclear Data Sheets (NDS) journal are shown on this page; the number of citations per article during the 1992-2002 period is plotted in the figure below.

  18. Gas inflow patterns and nuclear rings in barred galaxies

    NASA Astrophysics Data System (ADS)

    Shen, Juntai; Li, Zhi

    2017-06-01

    Nuclear rings, dust lanes, and nuclear spirals are common structures in the inner region of barred galaxies, with their shapes and properties linked to the physical parameters of the galaxies. We use high-resolution hydrodynamical simulations to study gas inflow patterns in barred galaxies, with special attention to the nuclear rings. The location and thickness of nuclear rings are tightly correlated with galactic properties, such as the bar pattern speed and bulge central density, within certain ranges. We identify the backbone of nuclear rings with a major orbital family of bars. The rings form exactly at the radius where the residual angular momentum of inflowing gas balances the centrifugal force. We propose a new simple method to predict the bar pattern speed for barred galaxies possessing a nuclear ring, without actually running simulations. We apply this method to some real galaxies and find that our predicted bar pattern speed compares reasonably well with other estimates. Our study may have important implications for using nuclear rings to measure the parameters of real barred galaxies with detailed gas kinematics. We have also extended current hydrodynamical simulations to model gas features in the Milky Way.

  19. Differential diagnosis of well-differentiated squamous cell carcinoma from non-neoplastic oral mucosal lesions: New cytopathologic evaluation method dependent on keratinization-related parameters but not nuclear atypism.

    PubMed

    Hara, Hitoshi; Misawa, Tsuneo; Ishii, Eri; Nakagawa, Miki; Koshiishi, Saki; Amemiya, Kenji; Oyama, Toshio; Tominaga, Kazuya; Cheng, Jun; Tanaka, Akio; Saku, Takashi

    2017-05-01

    The cytology of oral squamous cell carcinoma (SCC) is challenging because oral SCC cells tend to be well differentiated and lack nuclear atypia, often resulting in a false negative diagnosis. The purpose of this study was to establish practical cytological parameters specific to oral SCCs. We reviewed 123 cases of malignancy and 53 of non-neoplastic lesions of the oral mucosa, which had been diagnosed using both cytology and histopathology specimens. From those, we selected 12 SCC and 4 CIS cases that had initially been categorized as NILM to ASC-H with the Bethesda system, as well as 4 non-neoplastic samples categorized as LSIL or ASC-H as controls, and compared their characteristic findings. After careful examination, we highlighted five cytological parameters, as described in Results. Those 20 cytology samples were then reevaluated by 4 independent examiners using the Bethesda system as well as the 5 parameters. Five cytological features, (i) concentric arrangement of orangeophilic cells (indicating keratin pearls), (ii) large number of orangeophilic cells, (iii) bizarre-shaped orangeophilic cells without nuclear atypia, (iv) keratoglobules, and (v) uneven filamentous cytoplasm, were found to be significant parameters. All malignant cases contained at least one of those parameters, while none were observed in the four non-neoplastic cases with nuclear atypia. In reevaluations, the Bethesda system did not help the screeners distinguish oral SCCs from non-neoplastic lesions, while use of the five parameters enabled them to make a diagnosis of SCC. Recognition of the present five parameters is useful for oral SCC cytology. Diagn. Cytopathol. 2017;45:406-417. © 2017 Wiley Periodicals, Inc.

  20. 10 CFR 52.93 - Exemptions and variances.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... referencing a nuclear power reactor manufactured under a manufacturing license issued under subpart F of this... NUCLEAR REGULATORY COMMISSION (CONTINUED) LICENSES, CERTIFICATIONS, AND APPROVALS FOR NUCLEAR POWER PLANTS..., site parameters, terms and conditions, or approved design of the manufactured reactor. The Commission...

  1. Evaluation of prompt gamma-ray data and nuclear structure of niobium-94 with statistical model calculations

    NASA Astrophysics Data System (ADS)

    Turkoglu, Danyal

    Precise knowledge of prompt gamma-ray intensities following neutron capture is critical for elemental and isotopic analyses, homeland security, modeling nuclear reactors, etc. A recently-developed database of prompt gamma-ray production cross sections and nuclear structure information in the form of a decay scheme, called the Evaluated Gamma-ray Activation File (EGAF), is under revision. Statistical model calculations are useful for checking the consistency of the decay scheme, providing insight on its completeness and accuracy. Furthermore, these statistical model calculations are necessary to estimate the contribution of continuum gamma-rays, which cannot be experimentally resolved due to the high density of excited states in medium- and heavy-mass nuclei. Decay-scheme improvements in EGAF lead to improvements to other databases (Evaluated Nuclear Structure Data File, Reference Input Parameter Library) that are ultimately used in nuclear-reaction models to generate the Evaluated Nuclear Data File (ENDF). Gamma-ray transitions following neutron capture in 93Nb have been studied at the cold-neutron beam facility at the Budapest Research Reactor. Measurements have been performed using a coaxial HPGe detector with Compton suppression. Partial gamma-ray production capture cross sections at a neutron velocity of 2200 m/s have been deduced relative to that of the 255.9-keV transition after cold-neutron capture by 93Nb. With the measurement of a niobium chloride target, this partial cross section was internally standardized to the cross section for the 1951-keV transition after cold-neutron capture by 35Cl. The resulting (0.1377 +/- 0.0018) barn (b) partial cross section produced a calibration factor that was 23% lower than previously measured for the EGAF database. 
    The thermal-neutron cross sections were deduced for the 93Nb(n,γ)94mNb and 93Nb(n,γ)94gNb reactions by summing the experimentally measured partial gamma-ray production cross sections associated with the ground-state transitions below the 396-keV level and combining that summation with the contribution to the ground state from the quasi-continuum above 396 keV, determined with Monte Carlo statistical model calculations using the DICEBOX computer code. These values, σ_m and σ_0, were (0.83 +/- 0.05) b and (1.16 +/- 0.11) b, respectively, and were found to be in agreement with literature values. Comparison of the modeled population and experimental depopulation of individual levels confirmed tentative spin assignments and suggested changes where imbalances existed.

  2. Air Pollution and Quality of Sperm: A Meta-Analysis

    PubMed Central

    Fathi Najafi, Tahereh; Latifnejad Roudsari, Robab; Namvar, Farideh; Ghavami Ghanbarabadi, Vahid; Hadizadeh Talasaz, Zahra; Esmaeli, Mahin

    2015-01-01

    Context: Air pollution is common in all countries and affects reproductive functions in men and women. In particular, it impacts sperm parameters in men. This meta-analysis aimed to examine the impact of air pollution on the quality of sperm. Evidence Acquisition: The scientific databases Medline, PubMed, Scopus, Google Scholar, the Cochrane Library, and Elsevier were searched to identify relevant articles published between 1978 and 2013. In the first step, 76 articles were selected. These were ecological-correlation, cohort, retrospective, cross-sectional, and case-control studies found through electronic and hand searches of references on air pollution and male infertility. The outcome measure was the change in sperm parameters. A total of 11 articles were ultimately included in a meta-analysis examining the impact of air pollution on sperm parameters. The authors applied meta-analysis sheets from the Cochrane Library, then extracted the data, including means and standard deviations of the sperm parameters, and finally compared their confidence intervals (CIs) to the CIs of standard parameters. Results: The CIs for the pooled means were as follows: 2.68 ± 0.32 for ejaculation volume (mL), 62.1 ± 15.88 for sperm concentration (million per milliliter), 39.4 ± 5.52 for sperm motility (%), 23.91 ± 13.43 for sperm morphology (%) and 49.53 ± 11.08 for sperm count. Conclusions: The results of this meta-analysis showed that air pollution reduces sperm motility but has no impact on the other sperm parameters of the spermogram. PMID:26023349
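    The pooling of per-study means reported above can be sketched with a generic fixed-effect (inverse-variance) scheme. This is a standard textbook method, not necessarily the exact procedure the authors used, and the study values below are made up for illustration.

```python
import math

def pool_fixed_effect(means, ses):
    """Fixed-effect (inverse-variance) pooled mean and its standard error.
    means: per-study means; ses: per-study standard errors."""
    weights = [1.0 / se ** 2 for se in ses]   # more precise studies weigh more
    wsum = sum(weights)
    pooled = sum(w * m for w, m in zip(weights, means)) / wsum
    pooled_se = math.sqrt(1.0 / wsum)
    return pooled, pooled_se

# Hypothetical sperm-motility means (%) and standard errors from three studies:
pooled, se = pool_fixed_effect([38.0, 41.5, 39.2], [1.2, 2.0, 1.5])
ci = (pooled - 1.96 * se, pooled + 1.96 * se)  # approximate 95% CI
```

    The pooled CI is then compared against the reference range for the parameter, which is the comparison the abstract describes.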

  3. Air pollution and quality of sperm: a meta-analysis.

    PubMed

    Fathi Najafi, Tahereh; Latifnejad Roudsari, Robab; Namvar, Farideh; Ghavami Ghanbarabadi, Vahid; Hadizadeh Talasaz, Zahra; Esmaeli, Mahin

    2015-04-01

    Air pollution is common in all countries and affects reproductive functions in men and women. In particular, it impacts sperm parameters in men. This meta-analysis aimed to examine the impact of air pollution on the quality of sperm. The scientific databases Medline, PubMed, Scopus, Google Scholar, the Cochrane Library, and Elsevier were searched to identify relevant articles published between 1978 and 2013. In the first step, 76 articles were selected. These were ecological-correlation, cohort, retrospective, cross-sectional, and case-control studies found through electronic and hand searches of references on air pollution and male infertility. The outcome measure was the change in sperm parameters. A total of 11 articles were ultimately included in a meta-analysis examining the impact of air pollution on sperm parameters. The authors applied meta-analysis sheets from the Cochrane Library, then extracted the data, including means and standard deviations of the sperm parameters, and finally compared their confidence intervals (CIs) to the CIs of standard parameters. The CIs for the pooled means were as follows: 2.68 ± 0.32 for ejaculation volume (mL), 62.1 ± 15.88 for sperm concentration (million per milliliter), 39.4 ± 5.52 for sperm motility (%), 23.91 ± 13.43 for sperm morphology (%) and 49.53 ± 11.08 for sperm count. The results of this meta-analysis showed that air pollution reduces sperm motility but has no impact on the other sperm parameters of the spermogram.

  4. Case Studies of Seismic Discrimination Problems and Regional Discriminant Transportability.

    DTIC Science & Technology

    1995-07-31

    UCRL-JC-118551, Part 1, Lawrence Livermore National Laboratory, September 1994. Wuster, J. (1993). Discrimination of chemical explosions and...

  5. Verification of the databases EXFOR and ENDF

    NASA Astrophysics Data System (ADS)

    Berton, Gottfried; Damart, Guillaume; Cabellos, Oscar; Beauzamy, Bernard; Soppera, Nicolas; Bossant, Manuel

    2017-09-01

    The objective of this work is the verification of large experimental (EXFOR) and evaluated nuclear reaction databases (JEFF, ENDF, JENDL, TENDL…). The work is applied to neutron reactions in the EXFOR data, including threshold reactions, isomeric transitions, angular distributions and data in the resonance region for both isotopes and natural elements. Finally, a comparison of the resonance integrals compiled in the EXFOR database with those derived from the evaluated libraries is also performed.
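    The resonance-integral comparison mentioned above rests on the conventional definition RI = ∫ σ(E) dE/E over the resonance region. A minimal numerical sketch, assuming a pointwise cross-section table and a trapezoidal rule in ln(E) (the actual verification codes are more elaborate), is:

```python
import math

def resonance_integral(energies, sigmas):
    """Approximate RI = integral of sigma(E) dE/E by the trapezoidal
    rule in ln(E). energies in eV (ascending), sigmas in barns."""
    ri = 0.0
    for i in range(len(energies) - 1):
        dlnE = math.log(energies[i + 1] / energies[i])
        ri += 0.5 * (sigmas[i] + sigmas[i + 1]) * dlnE
    return ri

# Sanity check on a constant 2-barn cross section from 0.5 eV (the
# conventional lower cutoff) upward: the exact value is 2 * ln(Emax/Emin).
grid = [0.5 * 10 ** (k / 10) for k in range(54)]
ri = resonance_integral(grid, [2.0] * len(grid))
```

    An EXFOR-compiled integral and one computed this way from an evaluated library can then be compared point by point, which is the consistency check the abstract describes.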

  6. Identifying MicroRNAs and Transcript Targets in Jatropha Seeds

    PubMed Central

    Galli, Vanessa; Guzman, Frank; de Oliveira, Luiz F. V.; Loss-Morais, Guilherme; Körbes, Ana P.; Silva, Sérgio D. A.; Margis-Pinheiro, Márcia M. A. N.; Margis, Rogério

    2014-01-01

    MicroRNAs, or miRNAs, are endogenously encoded small RNAs that play a key role in diverse plant biological processes. Jatropha curcas L. has received significant attention as a potential oilseed crop for the production of renewable oil. Here, a sRNA library of mature seeds and three mRNA libraries from three different seed development stages were generated by deep sequencing to identify and characterize the miRNAs and pre-miRNAs of J. curcas. Computational analysis was used for the identification of 180 conserved miRNAs and 41 precursors (pre-miRNAs) as well as 16 novel pre-miRNAs. The predicted miRNA target genes are involved in a broad range of physiological functions, including cellular structure, nuclear function, translation, transport, hormone synthesis, defense, and lipid metabolism. Some pre-miRNA and miRNA targets vary in abundance between the three stages of seed development. A search for sequences that produce siRNA was performed, and the results indicated that J. curcas siRNAs play a role in nuclear functions, transport, catalytic processes and disease resistance. This study presents the first large scale identification of J. curcas miRNAs and their targets in mature seeds based on deep sequencing, and it contributes to a functional understanding of these miRNAs. PMID:24551031

  7. Identification of fuel cycle simulator functionalities for analysis of transition to a new fuel cycle

    DOE PAGES

    Brown, Nicholas R.; Carlsen, Brett W.; Dixon, Brent W.; ...

    2016-06-09

    Dynamic fuel cycle simulation tools are intended to model holistic transient nuclear fuel cycle scenarios. As with all simulation tools, fuel cycle simulators require verification through unit tests, benchmark cases, and integral tests. Model validation is a vital aspect as well. Although comparative studies have been performed, there is no comprehensive unit-test and benchmark library for fuel cycle simulator tools. The objective of this paper is to identify the must-test functionalities of a fuel cycle simulator tool within the context of specific problems of interest to the Fuel Cycle Options Campaign within the U.S. Department of Energy's Office of Nuclear Energy. The approach in this paper identifies the features needed to cover the range of promising fuel cycle options identified in the DOE-NE Fuel Cycle Evaluation and Screening (E&S) and categorizes these features to facilitate prioritization. Features were categorized as essential functions, integrating features, and exemplary capabilities. One objective of this paper is to propose a library of unit tests applicable to each of the essential functions. Another underlying motivation for this paper is to encourage an international dialog on the functionalities and standard test methods for fuel cycle simulator tools.

  8. Development of Probabilistic Socio-Economic Emissions Scenarios (2012)

    EPA Pesticide Factsheets

    The purpose of this analysis is to help overcome these limitations through the development of a publically available library of socio-economic-emissions projections derived from a systematic examination of uncertainty in key underlying model parameters, w

  9. Application of the JENDL-4.0 nuclear data set for uncertainty analysis of the prototype FBR Monju

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tamagno, P.; Van Rooijen, W. F. G.; Takeda, T.

    2012-07-01

    This paper deals with uncertainty analysis of the Monju reactor using JENDL-4.0 and the ERANOS code [1]. In 2010 the Japan Atomic Energy Agency (JAEA) released the JENDL-4.0 nuclear data set. This new evaluation contains improved values of cross-sections and emphasizes accurate covariance matrices. Also in 2010, JAEA restarted the sodium-cooled fast reactor prototype Monju after about 15 years of shutdown. The long shutdown time resulted in a build-up of 241Am by natural decay from the initially loaded Pu. As well as improved covariance matrices, JENDL-4.0 is announced to contain improved data for minor actinides [2]. The choice of the Monju reactor as an application of the new evaluation thus seems even more relevant. The uncertainty analysis requires the determination of sensitivity coefficients. The well-established ERANOS code was chosen because of its integrated modules that allow users to perform sensitivity and uncertainty analysis. A JENDL-4.0 cross-section library is not available for ERANOS, so a cross-section library had to be made from the original ENDF files for the ECCO cell code (part of ERANOS). For confirmation of the newly made library, calculations of a benchmark core were performed. These calculations used the MZA and MZB benchmarks and showed results consistent with other libraries. Calculations for the Monju reactor were performed using hexagonal 3D geometry and PN transport theory. However, the ERANOS sensitivity modules cannot use the resulting fluxes, as these modules require finite-difference-based fluxes obtained from RZ SN-transport or 3D diffusion calculations. The corresponding geometrical models have been made and the results verified against Monju restart experimental data [4]. Uncertainty analysis was performed using the RZ model. 
The JENDL-4.0 uncertainty analysis showed a significant reduction of the uncertainty related to the fission cross-section of Pu, along with an increase of the uncertainty related to the capture cross-section of 238U, compared with the previous JENDL-3.3 version. Covariance data recently added in JENDL-4.0 for 241Am appear to have a non-negligible contribution. (authors)

  10. Nuclear Physical Uncertainties in Modeling X-Ray Bursts

    NASA Astrophysics Data System (ADS)

    Regis, Eric; Amthor, A. Matthew

    2017-09-01

    Type I x-ray bursts occur when a neutron star accretes material from the surface of another star in a compact binary star system. For certain accretion rates and material compositions, much of the nuclear material is burned in short, explosive bursts. Using a one-dimensional stellar model, Kepler, and a comprehensive nuclear reaction rate library, ReacLib, we have simulated chains of type I x-ray bursts. Unfortunately, there are large remaining uncertainties in the nuclear reaction rates involved, since many of the isotopes reacting are unstable and have not yet been studied experimentally. Some individual reactions, when varied within their estimated uncertainty, alter the light curves dramatically. This limits our ability to understand the structure of the neutron star. Previous studies have looked at the effects of individual reaction rate uncertainties. We have applied a Monte Carlo method, simultaneously varying a set of reaction rates, in order to probe the expected uncertainty in x-ray burst behaviour due to the total uncertainty in all nuclear reaction rates. Furthermore, we aim to discover any nonlinear effects due to the coupling between different reaction rates. Early results show clear non-linear effects. This research was made possible by NSF-DUE Grant 1317446 and the BUScholars Program.
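    The simultaneous-variation idea can be sketched as follows: draw one log-normal multiplier per reaction rate, apply all of them at once, and look at the spread of an output quantity. The reaction names, uncertainty factors, and the toy observable below are all made up for illustration; a real study would feed the multipliers into the Kepler/ReacLib burst simulation.

```python
import math
import random

def sample_rate_multipliers(uncert_factors, rng):
    """Draw one log-normal multiplier per reaction rate.
    uncert_factors: dict {reaction: factor f}, where f is the
    multiplicative 1-sigma spread (multiplier = f**g, g ~ N(0,1))."""
    return {rxn: f ** rng.gauss(0.0, 1.0) for rxn, f in uncert_factors.items()}

def toy_observable(mult):
    """Toy stand-in for a burst observable (NOT a real burst model):
    something that depends nonlinearly on two rate multipliers."""
    return mult["15O(a,g)19Ne"] * math.sqrt(mult["rp-process"])

rng = random.Random(42)  # fixed seed for reproducibility
uncerts = {"15O(a,g)19Ne": 2.0, "rp-process": 1.5}  # hypothetical factors
samples = [toy_observable(sample_rate_multipliers(uncerts, rng))
           for _ in range(2000)]
mean = sum(samples) / len(samples)
spread = (sum((s - mean) ** 2 for s in samples) / len(samples)) ** 0.5
```

    Because every rate is varied in every sample, the resulting spread captures couplings between rates that one-at-a-time variation would miss, which is the nonlinear effect the study is probing.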

  11. Amplicon Sequencing Reveals Microbiological Signatures in Spent Nuclear Fuel Storage Basins

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bagwell, Christopher E.; Noble, Peter A.; Milliken, Charles E.

    Water quality is an important determinant for the structural integrity of alloy cladded fuels and assemblies during long-term wet storage. Detailed characterization of a water-filled storage basin for spent nuclear reactor fuel was performed following the formation and proliferation of an amorphous white flocculent. The white precipitant was sampled throughout the storage basin for chemical and spectroscopic characterization, and eDNA was extracted for pyrosequencing of bacterial rRNA gene diversity. Accordingly, spectroscopic analyses indicated the precipitant to be primarily amorphous to crystalline aluminum (oxy)hydroxides with minor associated elemental components including Fe, Si, Ti, and U. High levels of dissolved carbon were co-localized with the precipitant relative to the bulk water. Bacterial densities were highly variable between sampling locations and with depth; cell numbers (log10 scale) ranged from 5.6 to 4.89 cells/mL. Bacterial diversity that was physically associated with the aluminum (oxy)hydroxide complexes exceeded an estimated 4,000 OTUs per amplicon library (3% cutoff), and the greatest percent majority of sequences were aligned to the families Burkholderiales (23%), Nitrospiraceae (23%), Hyphomicrobiaceae (17%), and Comamonadaceae (6%). We surmise that episodic changes in the physical and chemical properties of the basin contribute to the polymerization of aluminum (oxy)hydroxides, which in turn can chemisorb nutrients, carbon ligands and bacterial cells from the surrounding bulk aqueous phase. As such, these precipitants should establish favorable microhabitats for bacterial colonization and growth. Comparative analyses of 16S rRNA gene amplicon libraries across diverse environmental landscapes were performed, and microbiological signatures unique to the spent nuclear fuel storage basin environment were revealed. 
These insights could spur the development of tractable bioindicators that are specific of and diagnostic for water quality at discrete locations and finer scales of resolution, marking an important contribution toward improved water quality and management of spent nuclear fuel storage facilities.

  12. Needs of Accurate Prompt and Delayed γ-spectrum and Multiplicity for Nuclear Reactor Designs

    NASA Astrophysics Data System (ADS)

    Rimpault, G.; Bernard, D.; Blanchet, D.; Vaglio-Gaudard, C.; Ravaux, S.; Santamarina, A.

    The local photon energy deposition must be accounted for accurately in Gen-IV fast reactors, advanced light-water reactors (Gen-III+) and the new experimental Jules Horowitz Reactor (JHR). The γ energy accounts for about 10% of the total energy released in the core of a thermal or fast reactor. The γ-energy release is much greater in the core of the reactor than in its structural sub-assemblies (such as the reflector, control rod followers and dummy sub-assemblies). However, because of the propagation of γ-rays from the core regions to the neighboring fuel-free assemblies, the contribution of γ energy to the total heating there can be dominant. For reasons related to their performance, power reactors require a 7.5% (1σ) uncertainty on the energy deposition in non-fuelled zones. For the JHR material-testing reactor, a 5% (1σ) uncertainty is required in experimental positions. In order to verify the adequacy of the calculation of γ-heating, TLDs and γ-fission chambers were used to derive experimental heating values. Experimental programs were and are still being conducted in different Cadarache facilities such as MASURCA (for SFR), MINERVE and EOLE (for JHR and Gen-III+ reactors). The comparison of calculated and measured γ-heating values shows an underestimation in all experimental programs, indicating that the γ-production data for 239Pu in current nuclear-data libraries are highly suspect. The first evaluation priority is the prompt γ-multiplicity for U and Pu fission, but similar values for other actinides are also required. The nuclear data library JEFF-3.1.1 contains most of the photon production data; however, there are some nuclei for which data are missing or erroneous and need to be completed or corrected. A review of the available data shows a lack of measurements for conducting serious evaluation efforts. New measurements are needed to guide new evaluation efforts, which will benefit from consolidated modeling techniques.

  13. A cross-platform GUI to control instruments compliant with SCPI through VISA

    NASA Astrophysics Data System (ADS)

    Roach, Eric; Liu, Jing

    2015-10-01

    In nuclear physics experiments, it is necessary and important to control instruments from a PC, which automates many tasks that would otherwise require human operation. Not only does this make long-term measurements possible, it also makes repetitive operations less error-prone. We created a graphical user interface (GUI) to control instruments connected to a PC through RS232, USB, LAN, etc. The GUI is developed using Qt Creator, a cross-platform integrated development environment, which makes it portable to various operating systems, including those commonly used in mobile devices. The NI-VISA library is used in the back end so that the GUI can control instruments connected through various I/O interfaces without any modification. Commonly used SCPI commands can be sent to different instruments using buttons, sliders, knobs, and the other various widgets provided by Qt Creator. As an example, we demonstrate how to set and fetch parameters and how to retrieve and display data from an Agilent Digital Storage Oscilloscope X3034A with the GUI. Our GUI can easily be used with other instruments compliant with SCPI and VISA with little or no modification.
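    The SCPI command flow behind such a GUI can be sketched as follows. The helpers below only build SCPI command strings, so they run without hardware; the commented PyVISA calls show how the back end would send them, and the VISA resource address is made up. This is an illustrative sketch, not the authors' C++/Qt implementation.

```python
def scpi_set(subsystem, parameter, value):
    """Build a SCPI 'set' command, e.g. ':TIMebase:SCALe 0.001'."""
    return f":{subsystem}:{parameter} {value}"

def scpi_query(subsystem, parameter):
    """Build a SCPI query, e.g. ':CHANnel1:SCALe?'."""
    return f":{subsystem}:{parameter}?"

cmd = scpi_set("TIMebase", "SCALe", 1e-3)   # set timebase to 1 ms/div
qry = scpi_query("CHANnel1", "SCALe")       # ask for channel 1 scale

# With an instrument attached, the back end would do roughly:
#   import pyvisa
#   rm = pyvisa.ResourceManager()
#   scope = rm.open_resource("TCPIP0::192.168.0.10::INSTR")  # hypothetical address
#   print(scope.query("*IDN?"))   # identify the instrument
#   scope.write(cmd)              # apply the timebase setting
#   print(scope.query(qry))       # read back the channel scale
```

    Because VISA abstracts the transport, the same `write`/`query` calls work over RS232, USB or LAN, which is why the GUI needs no per-interface code.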

  14. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Perret, Gregory

    The critical decay constant (B/A), delayed neutron fraction (B) and generation time (A) of the Minerve reactor were measured by the Paul Scherrer Institut (PSI) and the Commissariat a l'Energie Atomique (CEA) in September 2014 using the Feynman-alpha and Power Spectral Density neutron noise measurement techniques. Three slightly subcritical configurations were measured using two 1-g 235U fission chambers. This paper reports on the results obtained by PSI in the near-critical configuration (-2g). The most reliable and precise results were obtained with the Cross-Power Spectral Density technique: B = 708.4±9.2 pcm, B/A = 79.0±0.6 s^-1 and A = 89.7±1.4 μs. Predictions of the same kinetic parameters were obtained with MCNP5-v1.6 and the JEFF-3.1 and ENDF/B-VII.1 nuclear data libraries. On average the predictions for B and B/A overestimate the experimental results by 5% and 11%, respectively. The discrepancy is suspected to come either from a corruption of the data or from the inadequacy of the point kinetics equations for interpreting the measurements in the Minerve driven system. (authors)
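    The three reported quantities are mutually constrained, since A = B / (B/A); a quick arithmetic check with the record's own central values confirms that the quoted generation time is consistent:

```python
# Consistency check of the reported kinetic parameters:
# B (delayed neutron fraction) = 708.4 pcm and B/A = 79.0 s^-1
# imply a generation time A = B / (B/A).
B = 708.4e-5          # 708.4 pcm expressed as a fraction
B_over_A = 79.0       # s^-1
A = B / B_over_A      # seconds
print(A * 1e6)        # about 89.7 microseconds, matching the quoted A
```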

  15. Fractal Model of Fission Product Release in Nuclear Fuel

    NASA Astrophysics Data System (ADS)

    Stankunas, Gediminas

    2012-09-01

    A model of fission gas migration in a nuclear fuel pellet is proposed. The diffusion of fission gas in the granular structure of nuclear fuel, with inter-granular bubbles present in the fuel matrix, is simulated by a fractional diffusion model. The Grünwald-Letnikov derivative parameter characterizes the influence of the porous fuel matrix on the diffusion of fission gas. A finite-difference method for solving fractional diffusion equations is considered. The numerical solution of the diffusion equation shows a correlation between fission gas release and the Grünwald-Letnikov derivative parameter. The calculated profile of the fission gas concentration distribution is similar to that obtained in experimental studies. Diffusion of fission gas is modeled for real RBMK-1500 fuel operating conditions. A functional dependence of the Grünwald-Letnikov derivative parameter on fuel burn-up is established.
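    The fractional-derivative ingredient can be sketched generically (my illustration, not the paper's code): the Grünwald-Letnikov derivative of order α is discretized with weights w_k = (-1)^k C(α, k), computed by a simple recurrence. For integer α the weights collapse to the familiar finite-difference stencils, which makes the sketch easy to check.

```python
# Grünwald-Letnikov weights w_k = (-1)^k * binomial(alpha, k),
# via the recurrence w_k = w_{k-1} * (k - 1 - alpha) / k, w_0 = 1.
# They discretize a fractional derivative of order alpha:
#   D^alpha u(x) ~ h**(-alpha) * sum_k w_k * u(x - k*h).

def gl_weights(alpha: float, n: int) -> list[float]:
    w = [1.0]
    for k in range(1, n):
        w.append(w[-1] * (k - 1 - alpha) / k)
    return w

print(gl_weights(1.0, 2))  # reduces to the first-derivative stencil [1, -1]
print(gl_weights(2.0, 4))  # reduces to the second-derivative stencil [1, -2, 1, 0]
```

    For non-integer α the weights decay slowly instead of truncating, which is how the memory of the porous medium enters the finite-difference scheme.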

  16. The effect of call libraries and acoustic filters on the identification of bat echolocation.

    PubMed

    Clement, Matthew J; Murray, Kevin L; Solick, Donald I; Gruver, Jeffrey C

    2014-09-01

    Quantitative methods for species identification are commonly used in acoustic surveys for animals. While various identification models have been studied extensively, there has been little study of methods for selecting calls prior to modeling or methods for validating results after modeling. We obtained two call libraries with a combined 1556 pulse sequences from 11 North American bat species. We used four acoustic filters to automatically select and quantify bat calls from the combined library. For each filter, we trained a species identification model (a quadratic discriminant function analysis) and compared the classification ability of the models. In a separate analysis, we trained a classification model using just one call library. We then compared a conventional model assessment that used the training library against an alternative approach that used the second library. We found that filters differed in the share of known pulse sequences that were selected (68 to 96%), the share of non-bat noises that were excluded (37 to 100%), their measurement of various pulse parameters, and their overall correct classification rate (41% to 85%). Although the top two filters did not differ significantly in overall correct classification rate (85% and 83%), rates differed significantly for some bat species. In our assessment of call libraries, overall correct classification rates were significantly lower (15% to 23% lower) when tested on the second call library instead of the training library. Well-designed filters obviated the need for subjective and time-consuming manual selection of pulses. Accordingly, researchers should carefully design and test filters and include adequate descriptions in publications. Our results also indicate that it may not be possible to extend inferences about model accuracy beyond the training library. If so, the accuracy of acoustic-only surveys may be lower than commonly reported, which could affect ecological understanding or management decisions based on acoustic surveys.
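    The validation idea, scoring a trained classifier on an independent call library rather than on its own training library, can be sketched generically. The toy nearest-centroid classifier below is my illustration, not the authors' quadratic discriminant function analysis, and the one-dimensional "pulse parameter" values are invented.

```python
# Toy illustration of testing on an independent library.  Features are 1-D
# stand-ins for pulse parameters; the real study used multivariate QDA.

def train_centroids(data: dict[str, list[float]]) -> dict[str, float]:
    """Mean feature value per species (a stand-in for a real model)."""
    return {sp: sum(v) / len(v) for sp, v in data.items()}

def classify(x: float, centroids: dict[str, float]) -> str:
    return min(centroids, key=lambda sp: abs(x - centroids[sp]))

def correct_rate(model, labeled: list[tuple[float, str]]) -> float:
    hits = sum(classify(x, model) == sp for x, sp in labeled)
    return hits / len(labeled)

train_lib = {"sp_A": [40.0, 42.0], "sp_B": [25.0, 27.0]}    # e.g. kHz
model = train_centroids(train_lib)
holdout = [(41.0, "sp_A"), (30.0, "sp_B"), (33.0, "sp_A")]  # independent library
print(correct_rate(model, holdout))  # the last call is misclassified
```

    The drop from a perfect training-library score to 2/3 on the holdout library mirrors, in miniature, the 15-23% drop the study reports.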

  19. PHITS-2.76, Particle and Heavy Ion Transport code System

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    2015-08-01

    Version 03 PHITS can deal with the transport of almost all particles (nucleons, nuclei, mesons, photons, and electrons) over wide energy ranges, using several nuclear reaction models and nuclear data libraries. The geometrical configuration of the simulation can be set with GG (General Geometry) or CG (Combinatorial Geometry). Various quantities such as heat deposition, track length and production yields can be deduced from the simulation, using implemented estimator functions called "tallies". The code also has a function to draw 2D and 3D figures of the calculated results as well as of the setup geometries, using the code ANGEL. The physical processes included in PHITS can be divided into two categories, transport processes and collision processes. In the transport process, PHITS can simulate the motion of particles under external fields such as magnetic and gravitational fields. Without external fields, neutral particles move along a straight trajectory with constant energy up to the next collision point. Charged particles, however, interact many times with the electrons in the material, losing energy and changing direction. PHITS treats ionization not as a collision but as a transport process, using the continuous-slowing-down approximation. The average stopping power is given by the charge density of the material and the momentum of the particle, taking into account the fluctuations of the energy loss and the angular deviation. In the collision process, PHITS can simulate elastic and inelastic interactions as well as the decay of particles. The total reaction cross section, or the lifetime of the particle, is an essential quantity in the determination of the mean free path of the transported particle. According to the mean free path, PHITS chooses the next collision point using the Monte Carlo method. To generate the secondary particles of a collision, information on the final states of the collision is needed.
    For neutron-induced reactions in the low energy region, PHITS employs cross sections from the evaluated nuclear data library JENDL-4.0 (Shibata et al 2011). For high energy neutrons and other particles, several models such as JAM (Nara et al 1999), INCL (Cugnon et al 2011), INCL-ELF (Sawada et al 2012) and JQMD (Niita et al 1995) have been incorporated to simulate nuclear reactions up to 100 GeV/u. The special features of PHITS are the event generator mode (Iwamoto et al 2007) and the microdosimetric function (Sato et al 2009). Owing to the event generator mode, PHITS can determine the profiles of all secondary particles generated from a single nuclear interaction even when using nuclear data libraries, taking momentum and energy conservation into account. The microdosimetric function gives the probability densities of deposited energy in microscopic sites, such as the lineal energy y and the specific energy z, using a mathematical model developed from the results of track structure simulations. These features are very important for various purposes such as the estimation of soft-error rates of semiconductor devices induced by neutrons, and of the relative biological effectiveness of charged particles. From version 2.64, prompt gamma spectra and isomer production rates can be precisely estimated, owing to the implementation of EBITEM (ENSDF-Based Isomeric Transition and isomEr production Model). The photo-nuclear reaction model was improved up to 140 MeV. From version 2.76, an electron and photon transport algorithm based on EGS5 (Hirayama et al. 2005) was incorporated. Models describing photo-nuclear reactions above 140 MeV and muon-nuclear reactions were implemented. Event-generator mode version 2 was developed. Relativistic theory can be taken into account in the JQMD model.
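    The collision-point selection described above is standard Monte Carlo practice: the free path is sampled from an exponential distribution whose mean is the mean free path. A minimal sketch (not PHITS code; the cross-section value is illustrative):

```python
# Sampling the distance to the next collision: with total macroscopic cross
# section Sigma (1/cm), the mean free path is 1/Sigma and the free path d has
# density p(d) = Sigma * exp(-Sigma * d); inverse-CDF sampling gives
# d = -ln(1 - u) / Sigma for u uniform in [0, 1).
import math
import random

def sample_free_path(sigma_total: float, rng: random.Random) -> float:
    return -math.log(1.0 - rng.random()) / sigma_total

rng = random.Random(42)              # fixed seed for reproducibility
sigma = 0.5                          # 1/cm, illustrative value
paths = [sample_free_path(sigma, rng) for _ in range(100_000)]
print(sum(paths) / len(paths))       # close to the mean free path 1/sigma = 2 cm
```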

  20. Uncertainty quantification and propagation in nuclear density functional theory

    DOE PAGES

    Schunck, N.; McDonnell, J. D.; Higdon, D.; ...

    2015-12-23

    Nuclear density functional theory (DFT) is one of the main theoretical tools used to study the properties of heavy and superheavy elements, or to describe the structure of nuclei far from stability. While on-going efforts seek to better root nuclear DFT in the theory of nuclear forces, energy functionals remain semi-phenomenological constructions that depend on a set of parameters adjusted to experimental data in finite nuclei. In this study, we review recent efforts to quantify the related uncertainties, and propagate them to model predictions. In particular, we cover the topics of parameter estimation for inverse problems, statistical analysis of model uncertainties and Bayesian inference methods. Illustrative examples are taken from the literature.
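    The Bayesian inference step can be illustrated schematically (a toy one-parameter example of my own, not the paper's DFT machinery): evaluate likelihood times prior on a parameter grid, then normalize to get a posterior.

```python
# Grid-based Bayesian update for one model parameter theta.
# Data: observations y_i = theta + noise; Gaussian likelihood, flat prior.
import math

def posterior_on_grid(grid, data, sigma):
    logL = [-0.5 * sum((y - t) ** 2 for y in data) / sigma**2 for t in grid]
    m = max(logL)                              # subtract max for stability
    w = [math.exp(l - m) for l in logL]        # flat prior: posterior ∝ likelihood
    Z = sum(w)
    return [wi / Z for wi in w]

grid = [i * 0.01 for i in range(201)]          # theta in [0, 2]
data = [0.9, 1.1, 1.0, 0.95, 1.05]             # synthetic measurements near 1.0
post = posterior_on_grid(grid, data, sigma=0.1)
best = grid[post.index(max(post))]
print(best)                                    # posterior mode at the sample mean
```

    The same normalize-on-a-grid logic generalizes to the multi-parameter DFT case, where the grid is replaced by samplers or emulators.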

  1. The State-of-the-Art of Materials Technology Used for Fossil and Nuclear Power Plants in China

    NASA Astrophysics Data System (ADS)

    Weng, Yuqing

    Drawing on the development of energy in China over the past 30 years, this paper explains that ultra-supercritical (USC) coal-fired power plants with high steam parameters and 1000 MW nuclear power plants are the most important means of optimizing the energy structure and achieving China's national goals for energy saving and CO2 emission reduction. In addition, the requirements on materials technology in high-steam-parameter USC coal-fired power plants and 1000 MW nuclear power plants, together with current research and major developments in the relevant materials technology in China, are briefly described.

  2. SCALE Code System

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rearden, Bradley T.; Jessee, Matthew Anderson

    The SCALE Code System is a widely used modeling and simulation suite for nuclear safety analysis and design that is developed, maintained, tested, and managed by the Reactor and Nuclear Systems Division (RNSD) of Oak Ridge National Laboratory (ORNL). SCALE provides a comprehensive, verified and validated, user-friendly tool set for criticality safety, reactor and lattice physics, radiation shielding, spent fuel and radioactive source term characterization, and sensitivity and uncertainty analysis. Since 1980, regulators, licensees, and research institutions around the world have used SCALE for safety analysis and design. SCALE provides an integrated framework with dozens of computational modules, including three deterministic and three Monte Carlo radiation transport solvers that are selected based on the desired solution strategy. SCALE includes current nuclear data libraries and problem-dependent processing tools for continuous-energy (CE) and multigroup (MG) neutronics and coupled neutron-gamma calculations, as well as activation, depletion, and decay calculations. SCALE includes unique capabilities for automated variance reduction for shielding calculations, as well as sensitivity and uncertainty analysis. SCALE's graphical user interfaces assist with accurate system modeling, visualization of nuclear data, and convenient access to desired results.

  3. SCALE Code System 6.2.1

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rearden, Bradley T.; Jessee, Matthew Anderson

    The SCALE Code System is a widely used modeling and simulation suite for nuclear safety analysis and design that is developed, maintained, tested, and managed by the Reactor and Nuclear Systems Division (RNSD) of Oak Ridge National Laboratory (ORNL). SCALE provides a comprehensive, verified and validated, user-friendly tool set for criticality safety, reactor and lattice physics, radiation shielding, spent fuel and radioactive source term characterization, and sensitivity and uncertainty analysis. Since 1980, regulators, licensees, and research institutions around the world have used SCALE for safety analysis and design. SCALE provides an integrated framework with dozens of computational modules, including three deterministic and three Monte Carlo radiation transport solvers that are selected based on the desired solution strategy. SCALE includes current nuclear data libraries and problem-dependent processing tools for continuous-energy (CE) and multigroup (MG) neutronics and coupled neutron-gamma calculations, as well as activation, depletion, and decay calculations. SCALE includes unique capabilities for automated variance reduction for shielding calculations, as well as sensitivity and uncertainty analysis. SCALE's graphical user interfaces assist with accurate system modeling, visualization of nuclear data, and convenient access to desired results.

  4. mr: A C++ library for the matching and running of the Standard Model parameters

    NASA Astrophysics Data System (ADS)

    Kniehl, Bernd A.; Pikelner, Andrey F.; Veretin, Oleg L.

    2016-09-01

    We present the C++ program library mr that allows us to reliably calculate the values of the running parameters in the Standard Model at high energy scales. The initial conditions are obtained by relating the running parameters in the MS bar renormalization scheme to observables at lower energies with full two-loop precision. The evolution is then performed in accordance with the renormalization group equations with full three-loop precision. Pure QCD corrections to the matching and running are included through four loops. We also provide a Mathematica interface for this program library. Catalogue identifier: AFAI_v1_0 Program summary URL:http://cpc.cs.qub.ac.uk/summaries/AFAI_v1_0.html Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland Licensing provisions: GNU General Public License, version 3 No. of lines in distributed program, including test data, etc.: 517613 No. of bytes in distributed program, including test data, etc.: 2358729 Distribution format: tar.gz Programming language: C++. Computer: IBM PC. Operating system: Linux, Mac OS X. RAM: 1 GB Classification: 11.1. External routines: TSIL [1], OdeInt [2], boost [3] Nature of problem: The running parameters of the Standard Model renormalized in the MS bar scheme at some high renormalization scale, which is chosen by the user, are evaluated in perturbation theory as precisely as possible in two steps. First, the initial conditions at the electroweak energy scale are evaluated from the Fermi constant GF and the pole masses of the W, Z, and Higgs bosons and the bottom and top quarks including the full two-loop threshold corrections. Second, the evolution to the high energy scale is performed by numerically solving the renormalization group evolution equations through three loops. Pure QCD corrections to the matching and running are included through four loops. 
    Solution method: Numerical integration of analytic expressions. Additional comments: Available for download from URL: http://apik.github.io/mr/. The MathLink interface is tested to work with Mathematica 7-9 and, with an additional flag, also with Mathematica 10 under Linux and with Mathematica 10 under Mac OS X. Running time: less than 1 second. References: [1] S. P. Martin and D. G. Robertson, Comput. Phys. Commun. 174 (2006) 133-151 [hep-ph/0501132]. [2] K. Ahnert and M. Mulansky, AIP Conf. Proc. 1389 (2011) 1586-1589 [arxiv:1110.3397 [cs.MS]].
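    The running step mr performs can be illustrated at one loop (a sketch of the general technique; mr itself works to three and four loops with the full Standard Model beta functions): solve dα_s/d ln μ = -b0 α_s²/(2π) numerically and check against the closed-form one-loop solution.

```python
# One-loop QCD running of alpha_s as an ODE in t = ln(mu), solved with RK4
# and checked against the exact one-loop solution
#   1/alpha_s(mu) = 1/alpha_s(mu0) + b0/(2*pi) * ln(mu/mu0).
import math

B0 = 7.0  # one-loop coefficient b0 = 11 - 2*nf/3 for nf = 6 flavors

def beta(alpha: float) -> float:
    return -B0 * alpha**2 / (2.0 * math.pi)

def run_alpha(alpha0: float, t0: float, t1: float, steps: int = 1000) -> float:
    h = (t1 - t0) / steps
    a = alpha0
    for _ in range(steps):                      # classic RK4 stepping
        k1 = beta(a)
        k2 = beta(a + 0.5 * h * k1)
        k3 = beta(a + 0.5 * h * k2)
        k4 = beta(a + h * k3)
        a += h * (k1 + 2 * k2 + 2 * k3 + k4) / 6.0
    return a

alpha_mz = 0.1181                      # alpha_s at mu0 = 91.19 GeV (illustrative)
t0, t1 = math.log(91.19), math.log(1000.0)
num = run_alpha(alpha_mz, t0, t1)
exact = 1.0 / (1.0 / alpha_mz + B0 / (2.0 * math.pi) * (t1 - t0))
print(num, exact)                      # the two values agree closely
```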

  5. Improvement of Modeling HTGR Neutron Physics by Uncertainty Analysis with the Use of Cross-Section Covariance Information

    NASA Astrophysics Data System (ADS)

    Boyarinov, V. F.; Grol, A. V.; Fomichenko, P. A.; Ternovykh, M. Yu

    2017-01-01

    This work is aimed at improvement of HTGR neutron physics design calculations by application of uncertainty analysis with the use of cross-section covariance information. Methodology and codes for preparation of multigroup libraries of covariance information for individual isotopes from the basic 44-group library of SCALE-6 code system were developed. A 69-group library of covariance information in a special format for main isotopes and elements typical for high temperature gas cooled reactors (HTGR) was generated. This library can be used for estimation of uncertainties, associated with nuclear data, in analysis of HTGR neutron physics with design codes. As an example, calculations of one-group cross-section uncertainties for fission and capture reactions for main isotopes of the MHTGR-350 benchmark, as well as uncertainties of the multiplication factor (k∞) for the MHTGR-350 fuel compact cell model and fuel block model were performed. These uncertainties were estimated by the developed technology with the use of WIMS-D code and modules of SCALE-6 code system, namely, by TSUNAMI, KENO-VI and SAMS. Eight most important reactions on isotopes for MHTGR-350 benchmark were identified, namely: 10B(capt), 238U(n,γ), ν5, 235U(n,γ), 238U(el), natC(el), 235U(fiss)-235U(n,γ), 235U(fiss).
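    The propagation of cross-section covariances to a response uncertainty, as performed here by TSUNAMI/SAMS, follows the standard "sandwich rule" var(R)/R² = S C Sᵀ, with S the relative sensitivity vector and C the relative covariance matrix. A minimal sketch with hypothetical two-parameter numbers (not the benchmark's actual sensitivities or covariances):

```python
# Sandwich rule: relative variance of a response R (e.g. k-infinity) from a
# sensitivity vector S (dR/R per dx/x) and a relative covariance matrix C.
# All numbers below are purely illustrative.

def sandwich(S, C):
    n = len(S)
    return sum(S[i] * C[i][j] * S[j] for i in range(n) for j in range(n))

S = [0.3, -0.1]                       # sensitivities to two cross sections
C = [[4.0e-4, 1.0e-4],                # relative covariance matrix
     [1.0e-4, 9.0e-4]]
rel_var = sandwich(S, C)
print(rel_var ** 0.5 * 100)           # relative uncertainty of R in percent
```

    The off-diagonal terms are why correlated reactions (such as the 235U fission/capture pair listed above) must be treated jointly rather than summed in quadrature.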

  6. X-ray Pulsars Across the Parameter Space of Luminosity, Accretion Mode, and Spin

    NASA Astrophysics Data System (ADS)

    Laycock, Silas

    We propose to expand the scope of our successful project providing a multi-satellite library of X-ray Pulsar observations to the community. The library provides high-level products, activity monitoring, pulse-profiles, phased event files, spectra, and a unique pulse-profile modeling interface. The library's scientific footprint will expand in 4 key directions: (1) Update, by processing all new XMM-Newton and Chandra observations (2015-2017) of X-ray Binary Pulsars in the Magellanic Clouds. (2) Expand, by including all archival Suzaku, Swift and NuStar observations, and including Galactic pulsars. (3) Improve, by offering innovative data products that provide deeper insight. (4) Advance, by implementing a new generation of physically motivated emission and pulse-profile models. The library currently includes some 2000 individual RXTE-PCA, 200 Chandra ACIS-I, and 120 XMM-PN observations of the SMC spanning 15 years, creating an unrivaled record of pulsar temporal behavior. In Phase-2, additional observations of SMC pulsars will be added: 221 Chandra (ACIS-S and ACIS-I), 22 XMM-PN, 142 XMM-MOS, 92 Suzaku, 25 NuSTAR, and >10,000 Swift; leveraging our pipeline and analysis techniques already developed. With the addition of 7 Galactic pulsars each having many hundred multisatellite observations, these datasets cover the entire range of variability timescales and accretion regimes. We will model the pulse-profiles using state of the art techniques to parameterize their morphology and obtain the distribution of offsets between magnetic and spin axes, and create samples of profiles under specific accretion modes (whether pencil-beam or fan-beam dominated). These products are needed for the next generation of advances in neutron star theory and modeling. 
The long duration of the dataset and "whole-galaxy" nature of the SMC sample make possible a new statistical approach to uncover the duty-cycle distribution and hence the population demographics of transient High Mass X-ray Binary (HMXB) populations. Our unique library is already fueling progress on fundamental NS parameters and accretion physics.

  7. Nuclear power propulsion system for spacecraft

    NASA Astrophysics Data System (ADS)

    Koroteev, A. S.; Oshev, Yu. A.; Popov, S. A.; Karevsky, A. V.; Solodukhin, A. Ye.; Zakharenkov, L. E.; Semenkin, A. V.

    2015-12-01

    The proposed designs of high-power space tugs that utilize solar or nuclear energy to power an electric jet engine are reviewed. The conceptual design of a nuclear power propulsion system (NPPS) is described; its structural diagram, gas circuit, and electric diagram are discussed. The NPPS incorporates a nuclear reactor, a thermal-to-electric energy conversion system, a system for the conversion and distribution of electric energy, and an electric propulsion system. Two criterion parameters were chosen in the considered NPPS design: the temperature of gaseous working medium at the nuclear reactor outlet and the rotor speed of turboalternators. The maintenance of these parameters at a given level guarantees that the needed electric voltage is generated and allows for power mode control. The processes of startup/shutdown and increasing/reducing the power, the principles of distribution of electric energy over loads, and the probable emergencies for the proposed NPPS design are discussed.

  8. Influence of flow constraints on the properties of the critical endpoint of symmetric nuclear matter

    NASA Astrophysics Data System (ADS)

    Ivanytskyi, A. I.; Bugaev, K. A.; Sagun, V. V.; Bravina, L. V.; Zabrodin, E. E.

    2018-06-01

    We propose a novel family of equations of state for symmetric nuclear matter based on the induced surface tension concept for the hard-core repulsion. It is shown that, with only four adjustable parameters, the suggested equations of state can simultaneously reproduce not only the main properties of the nuclear matter ground state but also the proton flow constraint up to its maximal particle number densities. By varying the model parameters we carefully examine the ranges of the incompressibility constant of normal nuclear matter and of its critical temperature that are consistent with the proton flow constraint. This analysis allows us to show that the physically most justified value of the nuclear matter critical temperature is 15.5-18 MeV, the incompressibility constant is 270-315 MeV, and the hard-core radius of nucleons is less than 0.4 fm.

  9. Shape Memory Micro- and Nanowire Libraries for the High-Throughput Investigation of Scaling Effects.

    PubMed

    Oellers, Tobias; König, Dennis; Kostka, Aleksander; Xie, Shenqie; Brugger, Jürgen; Ludwig, Alfred

    2017-09-11

    The scaling behavior of Ti-Ni-Cu shape memory thin-film micro- and nanowires of different geometry is investigated with respect to its influence on the martensitic transformation properties. Two processes for the high-throughput fabrication of Ti-Ni-Cu micro- to nanoscale thin-film wire libraries and the subsequent investigation of the transformation properties are reported. The libraries are fabricated with compositional and geometrical (wire width) variations in order to investigate the influence of these parameters on the transformation properties. Interesting behaviors were observed: the phase transformation temperatures change in the range from 1 to 72 °C (austenite finish, Af) and from 13 to 66 °C (martensite start, Ms), and the thermal hysteresis from -3.5 to 20 K. It is shown that a vanishing hysteresis can be achieved for particular combinations of sample geometry and composition.

  10. CARS Spectral Fitting with Multiple Resonant Species using Sparse Libraries

    NASA Technical Reports Server (NTRS)

    Cutler, Andrew D.; Magnotti, Gaetano

    2010-01-01

    The dual-pump CARS technique is often used in the study of turbulent flames. Fast and accurate algorithms are needed for fitting dual-pump CARS spectra for temperature and multiple chemical species. This paper describes the development of such an algorithm. The algorithm employs sparse libraries, whose size grows much more slowly with the number of species than that of a conventional library. The method was demonstrated by fitting synthetic "experimental" spectra containing four resonant species (N2, O2, H2 and CO2), both with and without noise, and by fitting experimental spectra from a H2-air flame produced by a Hencken burner. In both studies, weighted least-squares fitting of the signal, as opposed to unweighted least-squares fitting of the signal or of its square root, was shown to produce the least random error and to minimize bias error in the fitted parameters.
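    The weighting comparison in the abstract can be sketched with the simplest one-parameter model y ≈ a·x, for which the weighted least-squares estimate has the closed form a = Σwᵢxᵢyᵢ / Σwᵢxᵢ². This is a generic illustration of the technique, not the CARS fitting code, and the data points are invented.

```python
# Weighted vs. unweighted least squares for a one-parameter model y = a*x.
# With signal-dependent noise, weighting by inverse variance (w = 1/sigma^2)
# down-weights the noisiest points.

def fit_slope(x, y, w=None):
    if w is None:
        w = [1.0] * len(x)                    # unweighted case
    num = sum(wi * xi * yi for wi, xi, yi in zip(w, x, y))
    den = sum(wi * xi * xi for wi, xi in zip(w, x))
    return num / den

x = [1.0, 2.0, 3.0]
y = [2.1, 3.9, 9.0]          # last point is an outlier with large noise
w = [1.0, 1.0, 0.01]         # inverse-variance weights (illustrative)
print(fit_slope(x, y))       # unweighted fit, pulled up by the outlier
print(fit_slope(x, y, w))    # weighted fit, close to the true slope 2
```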

  11. Engineering emergent multicellular behavior through synthetic adhesion

    NASA Astrophysics Data System (ADS)

    Glass, David; Riedel-Kruse, Ingmar

    Over more than a decade, synthetic biology has developed increasingly robust gene networks within single cells but has constructed very few systems that demonstrate multicellular spatio-temporal dynamics. We are filling this gap in synthetic biology's toolbox by developing an E. coli self-assembly platform based on modular cell-cell adhesion. In our system, adhesive selectivity is provided by a library of outer-membrane-displayed peptides with intra-library specificities, while affinity is provided by consistent expression across the entire library. We further provide a biophysical model to help understand the parameter regimes in which this tool can be used to self-assemble cells into clusters, filaments, or meshes. The combined platform will enable future development of synthetic multicellular systems for use in consortia-based metabolic engineering, in living materials, and in the controlled study of minimal multicellular systems. Stanford Bio-X Bowes Fellowship.

  12. Cytological Study of Breast Carcinoma Before and After Oncotherapy with Special Reference to Morphometry and Proliferative Activity.

    PubMed

    Koley, Sananda; Chakrabarti, Srabani; Pathak, Swapan; Manna, Asim Kumar; Basu, Siddhartha

    2015-12-01

    Our study was done to assess the cytological changes due to oncotherapy in breast carcinoma, with special reference to morphometry and proliferative activity. Cytological aspirates were collected from a total of 32 cases of invasive ductal carcinoma, both before and after oncotherapy. Morphometry was performed on the stained cytological smears to assess different morphological parameters of cell dimension, using an ocular morphometer and the software AutoCAD 2007. Staining was done with Ki-67 and proliferating cell nuclear antigen (PCNA) as proliferative markers. The morphological parameters before and after oncotherapy were compared by the unpaired Student's t test, and p values were obtained. Statistically significant differences were found between breast carcinoma cells before and after oncotherapy in the morphometric parameters (e.g., mean nuclear diameter, mean nuclear area, mean cell diameter, and mean cell area) and in the expression of the proliferative markers (Ki-67 and PCNA).

  13. A method to investigate the diffusion properties of nuclear calcium.

    PubMed

    Queisser, Gillian; Wittum, Gabriel

    2011-10-01

    Modeling biophysical processes generally requires knowledge of the underlying biological parameters. The quality of simulation results is strongly influenced by the accuracy of these parameters; hence, identifying the parameter values a model includes is a major part of simulating biophysical processes. In many cases, secondary data can be gathered by experimental setups that are exploitable by mathematical inverse-modeling techniques. Here we describe a method for identifying the diffusion properties of calcium in the nuclei of rat hippocampal neurons. The method is based on a Gauss-Newton method for solving a least-squares minimization problem and was formulated in such a way that it is readily implementable in the simulation platform uG. Making use of independently published space- and time-dependent calcium imaging data, generated from laser-assisted calcium uncaging experiments, we identified the diffusion properties of nuclear calcium and validated a previously published model that describes nuclear calcium dynamics as a diffusion process.
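    The Gauss-Newton step for a one-parameter least-squares problem can be sketched generically. The exponential-decay toy model below is my stand-in for the actual calcium diffusion model; only the iteration structure is the point.

```python
# Gauss-Newton identification of a single parameter k in the model
# f(t; k) = exp(-k * t), fitted to data by least squares.  Each iteration
# solves the linearized normal equations: delta_k = -(J^T r) / (J^T J),
# where r_i = y_i - f(t_i; k) and J_i = dr_i/dk = t_i * exp(-k * t_i).
import math

def gauss_newton(ts, ys, k0: float, iters: int = 20) -> float:
    k = k0
    for _ in range(iters):
        r = [y - math.exp(-k * t) for t, y in zip(ts, ys)]  # residuals
        J = [t * math.exp(-k * t) for t in ts]              # dr/dk
        k -= sum(Ji * ri for Ji, ri in zip(J, r)) / sum(Ji * Ji for Ji in J)
    return k

ts = [0.5, 1.0, 1.5, 2.0]
ys = [math.exp(-0.8 * t) for t in ts]      # synthetic noise-free data, true k = 0.8
print(gauss_newton(ts, ys, k0=0.3))        # converges to the true value 0.8
```

    In the real inverse problem the scalar parameter is replaced by diffusion coefficients and the model evaluation by a PDE solve, but each Gauss-Newton iteration has the same residual/Jacobian/update structure.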

  14. Computing Properties of Hadrons, Nuclei and Nuclear Matter from Quantum Chromodynamics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Savage, Martin J.

    This project was part of a coordinated software development effort that the nuclear physics lattice QCD community pursues to ensure that lattice calculations can make optimal use of present and forthcoming leadership-class and dedicated hardware, including that of the national laboratories, and to prepare for the exploitation of future computational resources in the exascale era. The UW team improved and extended software libraries used in lattice QCD calculations related to multi-nucleon systems, enhanced production codes for load balancing multi-nucleon production runs on large-scale computing platforms, developed SQLite (addressable database) interfaces to efficiently archive and analyze multi-nucleon data, and developed a Mathematica interface for the SQLite databases.
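
    The SQLite archiving idea can be illustrated with Python's built-in sqlite3 module. The table layout, column names, and toy correlator values below are invented for illustration; the record does not describe the UW team's actual schema.

```python
import sqlite3

# Hypothetical schema for archiving per-configuration lattice measurements.
conn = sqlite3.connect(":memory:")
conn.execute("""CREATE TABLE correlators (
                    ensemble TEXT, config INTEGER, channel TEXT,
                    t INTEGER, value REAL)""")
# Toy data: one correlator channel measured on four gauge configurations.
rows = [("A1", cfg, "deuteron", t, 0.8 ** t)
        for cfg in range(4) for t in range(8)]
conn.executemany("INSERT INTO correlators VALUES (?, ?, ?, ?, ?)", rows)
# Analysis then aggregates over gauge configurations directly in SQL:
avg = conn.execute("""SELECT t, AVG(value) FROM correlators
                      WHERE channel = 'deuteron'
                      GROUP BY t ORDER BY t""").fetchall()
# avg[0] is (0, 1.0): every configuration stores value 0.8**0 at t=0
```

    Keeping measurements addressable by (ensemble, configuration, channel, timeslice) is what makes this kind of database useful for later statistical analysis.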

  15. Measurement of activation cross-section of long-lived products in deuteron induced nuclear reactions on palladium in the 30-50 MeV energy range.

    PubMed

    Ditrói, F; Tárkányi, F; Takács, S; Hermanne, A; Ignatyuk, A V

    2017-10-01

    Excitation functions were measured in the 31-49.2 MeV energy range for the natPd(d,xn)111,110m,106m,105,104g,103Ag, natPd(d,x)111m,109,101,100Pd, natPd(d,x)105,102m,102g,101m,101g,100,99m,99gRh and natPd(d,x)103,97Ru nuclear reactions using the stacked-foil irradiation technique. The experimental results are compared with our previous results and with the theoretical predictions of the ALICE-D, EMPIRE-D and TALYS (TENDL libraries) codes.

  16. Minimal nuclear energy density functional

    NASA Astrophysics Data System (ADS)

    Bulgac, Aurel; Forbes, Michael McNeil; Jin, Shi; Perez, Rodrigo Navarro; Schunck, Nicolas

    2018-04-01

    We present a minimal nuclear energy density functional (NEDF) called "SeaLL1" that has the smallest number of phenomenological parameters to date. SeaLL1 is defined by seven significant phenomenological parameters, each related to a specific nuclear property. It describes the nuclear masses of even-even nuclei with a mean energy error of 0.97 MeV and a standard deviation of 1.46 MeV, two-neutron and two-proton separation energies with rms errors of 0.69 MeV and 0.59 MeV respectively, and the charge radii of 345 even-even nuclei with a mean error ε_r = 0.022 fm and a standard deviation σ_r = 0.025 fm. SeaLL1 incorporates constraints on the equation of state (EoS) of pure neutron matter from quantum Monte Carlo calculations with chiral effective field theory two-body (NN) interactions at next-to-next-to-next-to-leading order (N3LO) and three-body (NNN) interactions at next-to-next-to-leading order (N2LO). Two of the seven parameters are related to the saturation density and the energy per particle of homogeneous symmetric nuclear matter, one is related to the nuclear surface tension, two are related to the symmetry energy and its density dependence, one is related to the strength of the spin-orbit interaction, and one is the coupling constant of the pairing interaction. We identify additional phenomenological parameters that have little effect on ground-state properties but can be used to fine-tune features such as the Thomas-Reiche-Kuhn sum rule, the excitation energy of the giant dipole and Gamow-Teller resonances, the static dipole electric polarizability, and the neutron skin thickness.

  17. Minimal nuclear energy density functional

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bulgac, Aurel; Forbes, Michael McNeil; Jin, Shi

    In this paper, we present a minimal nuclear energy density functional (NEDF) called "SeaLL1" that has the smallest number of phenomenological parameters to date. SeaLL1 is defined by seven significant phenomenological parameters, each related to a specific nuclear property. It describes the nuclear masses of even-even nuclei with a mean energy error of 0.97 MeV and a standard deviation of 1.46 MeV, two-neutron and two-proton separation energies with rms errors of 0.69 MeV and 0.59 MeV respectively, and the charge radii of 345 even-even nuclei with a mean error ε_r = 0.022 fm and a standard deviation σ_r = 0.025 fm. SeaLL1 incorporates constraints on the equation of state (EoS) of pure neutron matter from quantum Monte Carlo calculations with chiral effective field theory two-body (NN) interactions at next-to-next-to-next-to-leading order (N3LO) and three-body (NNN) interactions at next-to-next-to-leading order (N2LO). Two of the seven parameters are related to the saturation density and the energy per particle of homogeneous symmetric nuclear matter, one is related to the nuclear surface tension, two are related to the symmetry energy and its density dependence, one is related to the strength of the spin-orbit interaction, and one is the coupling constant of the pairing interaction. Finally, we identify additional phenomenological parameters that have little effect on ground-state properties but can be used to fine-tune features such as the Thomas-Reiche-Kuhn sum rule, the excitation energy of the giant dipole and Gamow-Teller resonances, the static dipole electric polarizability, and the neutron skin thickness.

  18. Minimal nuclear energy density functional

    DOE PAGES

    Bulgac, Aurel; Forbes, Michael McNeil; Jin, Shi; ...

    2018-04-17

    In this paper, we present a minimal nuclear energy density functional (NEDF) called "SeaLL1" that has the smallest number of phenomenological parameters to date. SeaLL1 is defined by seven significant phenomenological parameters, each related to a specific nuclear property. It describes the nuclear masses of even-even nuclei with a mean energy error of 0.97 MeV and a standard deviation of 1.46 MeV, two-neutron and two-proton separation energies with rms errors of 0.69 MeV and 0.59 MeV respectively, and the charge radii of 345 even-even nuclei with a mean error ε_r = 0.022 fm and a standard deviation σ_r = 0.025 fm. SeaLL1 incorporates constraints on the equation of state (EoS) of pure neutron matter from quantum Monte Carlo calculations with chiral effective field theory two-body (NN) interactions at next-to-next-to-next-to-leading order (N3LO) and three-body (NNN) interactions at next-to-next-to-leading order (N2LO). Two of the seven parameters are related to the saturation density and the energy per particle of homogeneous symmetric nuclear matter, one is related to the nuclear surface tension, two are related to the symmetry energy and its density dependence, one is related to the strength of the spin-orbit interaction, and one is the coupling constant of the pairing interaction. Finally, we identify additional phenomenological parameters that have little effect on ground-state properties but can be used to fine-tune features such as the Thomas-Reiche-Kuhn sum rule, the excitation energy of the giant dipole and Gamow-Teller resonances, the static dipole electric polarizability, and the neutron skin thickness.

  19. Technical Transfer Report on a TNT Enzyluminescent Vapor Detection System

    DTIC Science & Technology

    1991-02-01

  20. Feasibility of Nanoparticle-Guided Radiation Therapy (NGRT) Using a Conventional CT Scanner

    DTIC Science & Technology

    2010-10-01

    deliverability of plan on CT scanner 2c. Calibrate dosimeters (TLDs) in phantom material 2d. Deliver dose distribution to phantom with TLDs in...phantom (SOW 2a). Next, small thermoluminescent dosimeters (TLDs) are placed within the tumor cavity. The TLDs are irradiated both with and without...nuclear data files. Electron interaction data is taken from the RSICC-EL03 library. The tumor volume was simulated as a small cavity containing

  1. AGAMA: Action-based galaxy modeling framework

    NASA Astrophysics Data System (ADS)

    Vasiliev, Eugene

    2018-05-01

    The AGAMA library models galaxies. It computes gravitational potential and forces, performs orbit integration and analysis, and can convert between position/velocity and action/angle coordinates. It offers a framework for finding best-fit parameters of a model from data and self-consistent multi-component galaxy models, and contains useful auxiliary utilities such as various mathematical routines. The core of the library is written in C++, and there are Python and Fortran interfaces. AGAMA may be used as a plugin for the stellar-dynamical software packages galpy (ascl:1411.008), AMUSE (ascl:1107.007), and NEMO (ascl:1010.051).

  2. Monte-Carlo Simulations of the Nuclear Energy Deposition Inside the CARMEN-1P Differential Calorimeter Irradiated into OSIRIS Reactor

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Amharrak, H.; Reynard-Carette, C.; Carette, M.

    Nuclear heating measurements in Material Testing Reactors (MTRs) are crucial for the study of nuclear materials and fuels under irradiation. Reference measurements of this nuclear heating are performed in particular with a differential calorimeter containing a graphite sample. These measurements are then used for other experimental conditions in order to predict the nuclear heating and thermal conditions induced in the irradiation devices. Nuclear heating is currently of great interest, as its measurement is an important issue for MTRs. This need is driven in particular by the new Jules Horowitz Reactor (JHR), under construction at the CEA/Cadarache centre of the French Alternative Energies and Atomic Energy Commission. This new reactor, which will be operational in late 2019, is a new facility for nuclear research on materials and fuels; the expected nuclear heating rate is about 20 W/g at the nominal capacity of 100 MW. The present Monte Carlo calculations belong to IN-CORE (Instrumentation for Nuclear radiation and Calorimetry On line in Reactor), a joint research program between the CEA and Aix-Marseille University begun in 2009. One scientific aim of this program is to design and develop a multi-sensor device, called CARMEN, dedicated to the simultaneous measurement of the main physical parameters encountered inside the JHR's experimental channels (core and reflector), such as neutron fluxes, photon fluxes, temperature, and nuclear heating. A first prototype was already developed, comprising two mock-ups dedicated respectively to neutronic measurements (CARMEN-1N) and to photonic measurements (CARMEN-1P), the latter with a specific differential calorimeter. Two irradiation campaigns were performed successfully in the periphery of the OSIRIS reactor (an MTR located at Saclay, France) in 2012 for nuclear heating levels up to 2 W/g.
First Monte Carlo calculations restricted to the graphite sample of the calorimeter were carried out. A preliminary analysis shows that the numerical results overestimate the measurements by about 20%. A new approach has been developed to estimate the nuclear heating by two methods (energy deposition or KERMA), considering the complete geometry of the sensor. This new approach will contribute to the interpretation of the irradiation campaign and will be useful for improving the out-of-pile calibration procedure of the sensor and its thermal response during irradiations. The aim of this paper is to present simulations made with the MCNP5 Monte Carlo transport code (using the ENDF/B-VI nuclear data library) of the nuclear heating inside the different parts of the calorimeter (head, rod and base). Calculations will be performed in two steps. As an input source in the model, we will use new spectra (neutrons, prompt photons and delayed photons) calculated with the Monte Carlo code TRIPOLI-4 inside different experimental channels (water) located in the OSIRIS periphery and used during the CARMEN-1P irradiation campaign. We will consider Neutron-Photon-Electron and Photon-Electron modes. We will begin with a brief description of the differential-calorimeter geometry. Then the MCNP5 model used for the calculations of nuclear heating inside the calorimeter elements will be introduced. The energy deposition due to prompt gamma rays, delayed gamma rays and neutrons, as well as the neutron activation of the device, will be considered. The different components of the nuclear heating inside the different parts of the calorimeter will be detailed. Moreover, a comparison between KERMA and nuclear-energy-deposition estimates will be given. Finally, the total nuclear heating in the graphite sample will be compared between calculation and experiment.

  3. Variants of closing the nuclear fuel cycle

    NASA Astrophysics Data System (ADS)

    Andrianova, E. A.; Davidenko, V. D.; Tsibulskiy, V. F.; Tsibulskiy, S. V.

    2015-12-01

    Influence of the nuclear energy structure, the conditions of fuel burnup, and accumulation of new fissile isotopes from the raw isotopes on the main parameters of a closed fuel cycle is considered. The effects of the breeding ratio, the cooling time of the spent fuel in the external fuel cycle, and the separation of the breeding area and the fissile isotope burning area on the parameters of the fuel cycle are analyzed.

  4. Sensitivity analysis of TRX-2 lattice parameters with emphasis on epithermal 238U capture. Final report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tomlinson, E.T.; deSaussure, G.; Weisbin, C.R.

    1977-03-01

    The main purpose of this study is to determine the sensitivity of TRX-2 thermal-lattice performance parameters to nuclear cross-section data, particularly the epithermal resonance capture cross section of 238U. An energy-dependent sensitivity profile was generated for each of the performance parameters with respect to the most important cross sections of the various isotopes in the lattice. Uncertainties in the calculated values of the performance parameters due to estimated uncertainties in the basic nuclear data, deduced in this study, were shown to be small compared to the uncertainties in the measured values of the performance parameters and compared to differences among calculations based upon the same data but with different methodologies.

  5. NanoTopoChip: High-throughput nanotopographical cell instruction.

    PubMed

    Hulshof, Frits F B; Zhao, Yiping; Vasilevich, Aliaksei; Beijer, Nick R M; de Boer, Meint; Papenburg, Bernke J; van Blitterswijk, Clemens; Stamatialis, Dimitrios; de Boer, Jan

    2017-10-15

    Surface topography can influence cell phenotype in numerous ways and offers opportunities to manipulate cells and tissues. In this work, we develop the NanoTopoChip and study the cell-instructive effects of nanoscale topographies. A combination of deep-UV projection lithography and conventional lithography was used to fabricate a library of more than 1200 different defined nanotopographies. To illustrate the cell-instructive effects of nanotopography, actin-RFP labeled U2OS osteosarcoma cells were cultured and imaged on the NanoTopoChip. Automated image analysis shows that, of the many cell morphological parameters, cell spreading, cell orientation and actin morphology are most affected by the nanotopographies. Additionally, by modeling, changes in cell morphological parameters could be predicted from several feature-shape parameters such as lateral size and spacing. This work overcomes the technological challenges of fabricating high-quality defined nanoscale features on unprecedentedly large surface areas of a material relevant for tissue culture, such as PS, and the screening system is able to infer nanotopography-cell morphology relationships. Our screening platform provides opportunities to identify and study nanotopographies with beneficial properties for the culture of various cell types. The nanotopography of biomaterial surfaces can be modified to influence adhering cells, with the aim of improving the performance of medical implants and tissue-culture substrates. However, knowledge of the underlying mechanisms remains incomplete, partly because of the limited availability of high-resolution nanotopographies on relevant biomaterials suitable for systematic biological studies. The present study shows the fabrication of a high-fidelity library of nano-sized surface topographies.
The potential of this library, called the NanoTopoChip, is shown in a proof-of-principle high-throughput screening study which demonstrates how cells are affected by nanotopographies. The large dataset, acquired by quantitative high-content imaging, allowed us to use predictive modeling to describe how feature dimensions affect cell morphology.

  6. Empirical Performance Model-Driven Data Layout Optimization and Library Call Selection for Tensor Contraction Expressions

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lu, Qingda; Gao, Xiaoyang; Krishnamoorthy, Sriram

    Empirical optimizers like ATLAS have been very effective in optimizing computational kernels in libraries. The best choice of parameters such as tile size and degree of loop unrolling is determined by executing different versions of the computation. In contrast, optimizing compilers use a model-driven approach to program transformation. While the model-driven approach of optimizing compilers is generally orders of magnitude faster than ATLAS-like library generators, its effectiveness can be limited by the accuracy of the performance models used. In this paper, we describe an approach in which a class of computations is modeled in terms of constituent operations that are empirically measured, thereby allowing modeling of the overall execution time. The performance model with empirically determined cost components is used to perform data layout optimization together with the selection of library calls and layout transformations in the context of the Tensor Contraction Engine, a compiler for a high-level domain-specific language for expressing computational models in quantum chemistry. The effectiveness of the approach is demonstrated through experimental measurements on representative computations from quantum chemistry.
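
    The empirical approach contrasted above can be sketched as follows: time the same tiled matrix-multiply kernel at several candidate tile sizes and keep the fastest, as ATLAS-like generators do at install time. The kernel, problem size, and candidate set are illustrative only.

```python
import time

def matmul_tiled(A, B, n, tile):
    """Tiled n-by-n matrix multiply; `tile` is the blocking factor."""
    C = [[0.0] * n for _ in range(n)]
    for ii in range(0, n, tile):
        for kk in range(0, n, tile):
            for i in range(ii, min(ii + tile, n)):
                for k in range(kk, min(kk + tile, n)):
                    a, row_c, row_b = A[i][k], C[i], B[k]
                    for j in range(n):
                        row_c[j] += a * row_b[j]
    return C

def pick_tile(n=64, candidates=(8, 16, 32, 64)):
    """Empirically select the fastest tile size by timing each candidate."""
    A = [[1.0] * n for _ in range(n)]
    B = [[1.0] * n for _ in range(n)]
    best, best_t = None, float("inf")
    for tile in candidates:
        t0 = time.perf_counter()
        matmul_tiled(A, B, n, tile)
        dt = time.perf_counter() - t0
        if dt < best_t:
            best, best_t = tile, dt
    return best

best_tile = pick_tile()
```

    The model-driven alternative discussed in the abstract replaces this timing loop with a cost model whose constituent operation costs are measured once and then composed analytically.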

  7. The WAGGS project - I. The WiFeS Atlas of Galactic Globular cluster Spectra

    NASA Astrophysics Data System (ADS)

    Usher, Christopher; Pastorello, Nicola; Bellstedt, Sabine; Alabi, Adebusola; Cerulo, Pierluigi; Chevalier, Leonie; Fraser-McKelvie, Amelia; Penny, Samantha; Foster, Caroline; McDermid, Richard M.; Schiavon, Ricardo P.; Villaume, Alexa

    2017-07-01

    We present the WiFeS Atlas of Galactic Globular cluster Spectra, a library of integrated spectra of Milky Way and Local Group globular clusters. We used the WiFeS integral field spectrograph on the Australian National University 2.3 m telescope to observe the central regions of 64 Milky Way globular clusters and 22 globular clusters hosted by the Milky Way's low-mass satellite galaxies. The spectra have wider wavelength coverage (3300-9050 Å) and higher spectral resolution (R = 6800) than existing spectral libraries of Milky Way globular clusters. By including Large and Small Magellanic Cloud star clusters, we extend the coverage of parameter space of existing libraries towards young and intermediate ages. While testing stellar population synthesis models and analysis techniques is the main aim of this library, the observations may also further our understanding of the stellar populations of Local Group globular clusters and make possible the direct comparison of extragalactic globular cluster integrated light observations with well-understood globular clusters in the Milky Way. The integrated spectra are publicly available via the project website.

  8. Report of the Nuclear Propulsion Mission Analysis, Figures of Merit Subpanel: Quantifiable figures of merit for nuclear thermal propulsion

    NASA Technical Reports Server (NTRS)

    Haynes, Davy A.

    1991-01-01

    The results of an inquiry by the Nuclear Propulsion Mission Analysis, Figures of Merit subpanel are given. The subpanel was tasked to consider which quantifiable parameters are appropriate for the definition of an overall figure of merit (FoM) for Mars transportation system (MTS) nuclear thermal rocket (NTR) engines. Such a characterization is needed to resolve the NTR engine design trades by a logical and orderly means, and to provide a meaningful method for comparing the various NTR engine concepts. The subpanel was specifically tasked to identify the quantifiable engine parameters that would be the most significant factors affecting an overall FoM for an MTS; it was not tasked with determining 'acceptable' or 'recommended' values for the identified parameters. In addition, the subpanel was asked not to define an overall FoM for an MTS. Thus, the selection of a specific approach, applicable weighting factors, and any interrelationships for establishing an overall numerical FoM were considered beyond the scope of the subpanel inquiry.

  9. Double β-decay nuclear matrix elements for the A=48 and A=58 systems

    NASA Astrophysics Data System (ADS)

    Skouras, L. D.; Vergados, J. D.

    1983-11-01

    The nuclear matrix elements entering the double β decays of the 48Ca-48Ti and 58Ni-58Fe systems have been calculated using a realistic two-nucleon interaction and realistic shell-model spaces. Effective transition operators corresponding to a variety of gauge-theory models have been considered. The stability of such matrix elements against variations of the nuclear parameters is examined. Appropriate lepton-violating parameters are extracted from the A=48 data, and predictions are made for the lifetimes of the positron decays of the A=58 system. Keywords: RADIOACTIVITY; double β decay; gauge theories; lepton nonconservation; neutrino mass; shell-model calculations.

  10. Design and implementation of a simple nuclear power plant simulator

    NASA Astrophysics Data System (ADS)

    Miller, William H.

    1983-02-01

    A simple PWR nuclear power plant simulator has been designed and implemented on a minicomputer system. The system is intended for students' use in understanding the power operation of a nuclear power plant. A PDP-11 minicomputer calculates reactor parameters in real time, displays the results on a graphics terminal, and takes control inputs from a keyboard and joystick. Plant parameters calculated by the model include the core reactivity (based upon control rod positions, soluble boron concentration, and reactivity feedback effects), the total core power, the axial core power distribution, and the temperature and pressure in the primary and secondary coolant loops, among others.
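
    The reactivity bookkeeping described above can be sketched as a sum of simple linear terms. All coefficient values below are invented placeholders for illustration, not those of the PDP-11 model.

```python
def core_reactivity(rod_fraction_inserted, boron_ppm, fuel_temp_K,
                    rod_worth=0.08,         # total rod worth (dk/k), assumed
                    boron_worth=-1.0e-5,    # dk/k per ppm boron, assumed
                    doppler_coeff=-2.5e-5,  # dk/k per K of fuel temperature, assumed
                    ref_temp_K=900.0):
    """Total core reactivity (dk/k) as a sum of simple linear terms."""
    rho_rods = -rod_worth * rod_fraction_inserted             # control rods
    rho_boron = boron_worth * boron_ppm                       # soluble boron
    rho_doppler = doppler_coeff * (fuel_temp_K - ref_temp_K)  # fuel-temperature feedback
    return rho_rods + rho_boron + rho_doppler

rho = core_reactivity(0.25, 800.0, 950.0)  # negative: a subcritical configuration
```

    A real simulator would feed this reactivity into point kinetics to advance the core power in time; here only the bookkeeping step is shown.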

  11. Electronic structures of elements according to ionization energies.

    PubMed

    Zadeh, Dariush H

    2017-11-28

    The electronic structures of elements in the periodic table were analyzed using available experimental ionization energies. Two new parameters were defined to carry out the study. The first parameter, the apparent nuclear charge (ANC), quantified the overall charge of the nucleus and inner electrons as observed by an outer electron during the ionization process. This parameter was used to define a second parameter, which represents the shielding ability of an electron against the nuclear charge. This second parameter, the electron shielding effect (ESE), provided insight into the electronic structure of atoms. The approach avoids any sort of approximation, interpolation or extrapolation. First, experimental ionization energies were used to obtain the two aforementioned parameters. The ESE was then graphed against the electron number of each element and used to read off the corresponding electronic structure: the ESE shows spikes at the end of each electronic shell, indicating when one shell closes and a new one starts. The electronic structures of the elements in the periodic table were mapped using this methodology. The resulting graphs did not show complete agreement with the previously known "Aufbau" filling rule, and a new filling rule was suggested based on the present observations. Finally, a new way to organize elements in the periodic table is suggested. The two earlier concepts of effective nuclear charge and shielding factor are also briefly discussed and compared numerically to demonstrate the capability of the new approach.
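
    The abstract does not reproduce the paper's exact formula for the apparent nuclear charge, but the idea of inferring an effective charge from a measured ionization energy can be illustrated with the standard hydrogen-like estimate Z_app = n * sqrt(E_ion / Ry); treat this as an assumed stand-in, not the article's definition.

```python
import math

RYDBERG_EV = 13.605693  # hydrogen ground-state ionization energy in eV

def apparent_nuclear_charge(ionization_energy_eV, n):
    """Hydrogen-like effective charge seen by the departing electron:
    E = Z_app**2 * Ry / n**2  =>  Z_app = n * sqrt(E / Ry)."""
    return n * math.sqrt(ionization_energy_eV / RYDBERG_EV)

# Hydrogen (n=1, 13.606 eV) gives Z_app = 1 exactly; helium's first
# ionization (24.587 eV, n=1) gives Z_app ~ 1.34 < 2, because the second
# 1s electron shields part of the nuclear charge.
z_h = apparent_nuclear_charge(13.605693, 1)
z_he = apparent_nuclear_charge(24.587, 1)
```

    The gap between Z_app and the true nuclear charge Z is what a shielding parameter like the ESE is built to quantify.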

  12. Assessment of the Effects of Entrainment and Wind Shear on Nuclear Cloud Rise Modeling

    NASA Astrophysics Data System (ADS)

    Zalewski, Daniel; Jodoin, Vincent

    2001-04-01

    Accurate modeling of nuclear cloud rise is critical to hazard prediction following a nuclear detonation. This thesis recommends improvements to the model currently used by the DoD. It considers a single-term versus a three-term entrainment equation, the values of the entrainment and eddy viscous drag parameters, and the effect of wind shear on the cloud rise following a nuclear detonation. It examines departures of the current code used in the Hazard Prediction and Assessment Capability (HPAC) version 3.2 from the 1979 version of the Department of Defense Land Fallout Interpretive Code (DELFIC). The recommendation for a single-term entrainment equation, with constant-value parameters, without wind-shear corrections, and without cloud oscillations is based on both a statistical analysis of 67 U.S. atmospheric nuclear test shots and the physical representation of the modeling. The statistical analysis optimized the parameter values of interest for four cases: the three-term entrainment equation with and without wind shear, and the single-term entrainment equation with and without wind shear. The thesis then examines the effect of cloud oscillations as a significant departure in the code. Modifications to user-input atmospheric tables are identified as a potential problem in the calculation of stabilized cloud dimensions in HPAC.

  13. Sensitivity study of experimental measures for the nuclear liquid-gas phase transition in the statistical multifragmentation model

    NASA Astrophysics Data System (ADS)

    Lin, W.; Ren, P.; Zheng, H.; Liu, X.; Huang, M.; Wada, R.; Qu, G.

    2018-05-01

    The experimental measures of the multiplicity derivatives, the moment parameters, the bimodal parameter, the fluctuation of the maximum fragment charge number (normalized variance of Zmax, or NVZ), the Fisher exponent (τ), and the Zipf-law parameter (ξ) are examined to search for the liquid-gas phase transition in nuclear multifragmentation processes within the framework of the statistical multifragmentation model (SMM). The sensitivities of these measures are studied. All these measures predict a critical signature at or near the critical point, both for the primary and the secondary fragments. Among them, the total multiplicity derivative and the NVZ provide accurate measures of the critical point from the final cold fragments as well as from the primary fragments. The present study provides a guide for future experiments and analyses in the study of the nuclear liquid-gas phase transition.
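
    One common definition of the NVZ observable named above is the event-to-event variance of the largest fragment charge, normalized by its mean; it can be computed directly from event lists. The events below are toy data, not SMM output, and the normalization convention is an assumption stated here rather than taken from the paper.

```python
def nvz(events):
    """Normalized variance of Zmax.
    events: list of per-event fragment-charge lists."""
    zmax = [max(ev) for ev in events]                       # largest charge per event
    mean = sum(zmax) / len(zmax)
    var = sum((z - mean) ** 2 for z in zmax) / len(zmax)    # population variance
    return var / mean

events = [[20, 3, 1], [15, 6, 2, 2], [25, 1], [10, 8, 4]]
value = nvz(events)  # large event-to-event Zmax fluctuations give a large NVZ
```

    Near the transition, fluctuations of Zmax are maximal, so a peak in NVZ as a function of excitation energy is the critical signature the abstract refers to.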

  14. Gene Expression Profiling in the Thiamethoxam Resistant and Susceptible B-biotype Sweetpotato Whitefly, Bemisia tabaci

    PubMed Central

    Xie, Wen; Yang, Xin; Wang, Shao-Ii; Wu, Qing-jun; Yang, Ni-na; Li, Ru-mei; Jiao, Xiaoguo; Pan, Hui-peng; Liu, Bai-ming; Feng, Yun-tao; Xu, Bao-yun; Zhou, Xu-guo; Zhang, You-jun

    2012-01-01

    Thiamethoxam has been used as a major insecticide to control the B-biotype sweetpotato whitefly, Bemisia tabaci (Gennadius) (Hemiptera: Aleyrodidae). Due to its excessive use, a high level of resistance to thiamethoxam has developed worldwide over the past several years. To better understand the molecular mechanisms underlying this resistance in B. tabaci, gene profiles of thiamethoxam-resistant and thiamethoxam-susceptible strains were compared using the suppression subtractive hybridization (SSH) library approach. A total of 72 up-regulated and 52 down-regulated genes were obtained from the forward and reverse SSH libraries, respectively. These expressed sequence tags (ESTs) belong to several functional categories based on their gene ontology annotation. Some categories, such as cell communication, response to abiotic stimulus, lipid particle, and nuclear envelope, were identified only in the forward library of the thiamethoxam-resistant strain. In contrast, categories such as behavior, cell proliferation, nutrient reservoir activity, sequence-specific DNA-binding transcription factor activity, and signal transducer activity were identified solely in the reverse library. To assess the validity of the SSH method, 16 differentially expressed genes from both the forward and reverse SSH libraries were selected at random for further analysis using quantitative real-time PCR (qRT-PCR). The qRT-PCR results were fairly consistent with the SSH results; however, only 50% of the genes showed significantly different expression profiles between the thiamethoxam-resistant and thiamethoxam-susceptible whiteflies. Among these genes, a putative NAD-dependent methanol dehydrogenase was substantially over-expressed in thiamethoxam-resistant adults compared to their susceptible counterparts. Its distribution profile shows that it was highly expressed during the egg stage and was most abundant in the abdomen of adult females. PMID:22957505

  15. Orphan nuclear receptor TR3 acts in autophagic cell death via mitochondrial signaling pathway.

    PubMed

    Wang, Wei-jia; Wang, Yuan; Chen, Hang-zi; Xing, Yong-zhen; Li, Feng-wei; Zhang, Qian; Zhou, Bo; Zhang, Hong-kui; Zhang, Jie; Bian, Xue-li; Li, Li; Liu, Yuan; Zhao, Bi-xing; Chen, Yan; Wu, Rong; Li, An-zhong; Yao, Lu-ming; Chen, Ping; Zhang, Yi; Tian, Xu-yang; Beermann, Friedrich; Wu, Mian; Han, Jiahuai; Huang, Pei-qiang; Lin, Tianwei; Wu, Qiao

    2014-02-01

    Autophagy is linked to cell death, yet the associated mechanisms remain largely uncharacterized. We discovered that melanoma, which is generally resistant to drug-induced apoptosis, can undergo autophagic cell death with the participation of the orphan nuclear receptor TR3. A sequence of molecular events leading to cellular demise is launched by a specific chemical compound, 1-(3,4,5-trihydroxyphenyl)nonan-1-one, newly acquired from screening a library of TR3-targeting compounds. The autophagic cascade comprises TR3 translocation to mitochondria through interaction with the mitochondrial outer-membrane protein Nix, crossing into the mitochondrial inner membrane through the Tom40 and Tom70 channel proteins, dissipation of the mitochondrial membrane potential by the permeability transition pore complex ANT1-VDAC1, and induction of autophagy. This process leads to excessive mitochondrial clearance and irreversible cell death. It suggests a new approach to melanoma therapy through activation of a mitochondrial signaling pathway that integrates a nuclear receptor with autophagy for cell death.

  16. Hyb-Seq: Combining target enrichment and genome skimming for plant phylogenomics

    PubMed Central

    Weitemier, Kevin; Straub, Shannon C. K.; Cronn, Richard C.; Fishbein, Mark; Schmickl, Roswitha; McDonnell, Angela; Liston, Aaron

    2014-01-01

    • Premise of the study: Hyb-Seq, the combination of target enrichment and genome skimming, allows simultaneous data collection for low-copy nuclear genes and high-copy genomic targets for plant systematics and evolution studies. • Methods and Results: Genome and transcriptome assemblies for milkweed (Asclepias syriaca) were used to design enrichment probes for 3385 exons from 768 genes (>1.6 Mbp) followed by Illumina sequencing of enriched libraries. Hyb-Seq of 12 individuals (10 Asclepias species and two related genera) resulted in at least partial assembly of 92.6% of exons and 99.7% of genes and an average assembly length >2 Mbp. Importantly, complete plastomes and nuclear ribosomal DNA cistrons were assembled using off-target reads. Phylogenomic analyses demonstrated signal conflict between genomes. • Conclusions: The Hyb-Seq approach enables targeted sequencing of thousands of low-copy nuclear exons and flanking regions, as well as genome skimming of high-copy repeats and organellar genomes, to efficiently produce genome-scale data sets for phylogenomics. PMID:25225629

  17. Metrology of deep trench etched memory structures using 3D scatterometry

    NASA Astrophysics Data System (ADS)

    Reinig, Peter; Dost, Rene; Moert, Manfred; Hingst, Thomas; Mantz, Ulrich; Moffitt, Jasen; Shakya, Sushil; Raymond, Christopher J.; Littau, Mike

    2005-05-01

    Scatterometry is receiving considerable attention as an emerging optical metrology in the silicon industry. One area of progress in deploying these powerful measurements for process control is performing measurements on real device structures, as opposed to limiting scatterometry to periodic structures, such as line-space gratings, placed in the wafer scribe. In this work we discuss applications of 3D scatterometry to the measurement of advanced trench memory devices. This is a challenging and complex scatterometry application that requires exceptionally high-performance computational capabilities. In order to represent the physical device, the relatively tall structures require a high number of slices in the rigorous coupled wave analysis (RCWA) theoretical model. This is complicated further by the presence of an amorphous silicon hard mask on the surface, which strongly influences the reflectance signature and therefore needs to be modeled in detail. The overall structure comprises several layers, with the trenches presenting a complex bowed sidewall that must be measured. Finally, the double periodicity of the structures demands significantly greater computational capabilities. Our results demonstrate that angular scatterometry is sensitive to the key parameters of interest. The influence of further model parameters and parameter cross-correlations has to be taken carefully into account. Profile results obtained by non-library optimization methods compare favorably with cross-section SEM images. Generating a model library suitable for process control, which is preferred for precision, presents numerical throughput challenges. Details are discussed regarding library generation approaches and strategies for reducing the numerical overhead. Scatterometry and SEM results are compared, leading to conclusions about the feasibility of this advanced application.
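
    Library-based scatterometry of the kind described above pre-computes signatures over a parameter grid and then matches each measured signature against the library. The sketch below substitutes a smooth toy function for the RCWA forward model; all names, the two-parameter (depth, CD) grid, and the value ranges are illustrative assumptions, not the authors' model:

    ```python
    import numpy as np

    def toy_signature(depth, cd, angles):
        """Stand-in forward model (NOT RCWA): a smooth function of trench
        depth and critical dimension, evaluated at the measurement angles."""
        return np.cos(0.01 * depth * angles) * np.exp(-0.001 * cd * angles)

    angles = np.linspace(5, 45, 50)                        # degrees
    # Pre-compute the library over a (depth, CD) parameter grid [nm]
    grid = [(float(d), float(c))
            for d in np.arange(6000, 8001, 100)
            for c in np.arange(100, 201, 5)]
    library = np.array([toy_signature(d, c, angles) for d, c in grid])

    measured = toy_signature(7200, 150, angles)            # synthetic "measurement"
    # Best match = library entry with the smallest squared-error distance
    best = int(np.argmin(np.sum((library - measured) ** 2, axis=1)))
    print(grid[best])                                      # → (7200.0, 150.0)
    ```

    The throughput challenge mentioned in the abstract comes from the real forward model: each library entry requires a full RCWA solve, so the grid size multiplies directly into the generation cost.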

  18. Uncertainties for Swiss LWR spent nuclear fuels due to nuclear data

    NASA Astrophysics Data System (ADS)

    Rochman, Dimitri A.; Vasiliev, Alexander; Dokhane, Abdelhamid; Ferroukhi, Hakim

    2018-05-01

    This paper presents a study of the impact of nuclear data (cross sections, neutron emission and spectra) on different quantities for spent nuclear fuel (SNF) from Swiss power plants: activities, decay heat, neutron and gamma sources, and isotopic vectors. Realistic irradiation histories are considered using validated core follow-up models based on CASMO and SIMULATE. Two pressurized water reactors and one boiling water reactor (PWR and BWR) are considered over a large number of operated cycles. All assemblies at the end of each cycle are studied, whether reloaded or finally discharged, spanning a large range of exposure (from 4 to 60 MWd/kgU for ≃9200 assembly-cycles). Both UO2 and MOX fuels were used during the reactor cycles, with enrichments from 1.9 to 4.7% for the UO2 and 2.2 to 5.8% Pu for the MOX. The SNF characteristics presented in this paper are calculated with the SNF code. The calculated uncertainties, based on the ENDF/B-VII.1 library, are obtained using a simple Monte Carlo sampling method. It is demonstrated that the impact of nuclear data is relatively important (e.g. up to 17% for the decay heat), showing the necessity of considering it in safety analyses of SNF handling and disposal.
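
    The simple Monte Carlo sampling method mentioned above can be illustrated on a toy decay-heat model: a perturbation factor for each nuclide's underlying nuclear data is sampled from a distribution, propagated to the output quantity, and the relative standard deviation over the samples is the reported uncertainty. The nuclides, inventories, and uncertainty magnitudes below are assumed for illustration only:

    ```python
    import random
    import statistics

    def decay_heat(perturb):
        """Toy decay-heat model: heat = sum over nuclides of inventory x
        specific heat, with inventories scaled by sampled perturbation factors."""
        # (nominal inventory [arbitrary units], specific heat [W per unit])
        nominal = {"Cs137": (1.0, 0.42), "Sr90": (0.9, 0.25), "Pu238": (0.1, 1.9)}
        return sum(inv * perturb[n] * q for n, (inv, q) in nominal.items())

    random.seed(1)
    samples = []
    for _ in range(5000):
        # Sample each nuclide's nuclear-data perturbation from a normal
        # distribution (assumed 5-10% relative standard deviations)
        p = {"Cs137": random.gauss(1.0, 0.05),
             "Sr90": random.gauss(1.0, 0.08),
             "Pu238": random.gauss(1.0, 0.10)}
        samples.append(decay_heat(p))

    mean = statistics.mean(samples)
    rel_unc = 100.0 * statistics.stdev(samples) / mean
    print(f"decay heat uncertainty ~ {rel_unc:.1f}%")
    ```

    In the study itself the sampling is over the full ENDF/B-VII.1 covariance information and the depletion chain, but the principle is the same: the spread of the sampled outputs quantifies the nuclear-data contribution to the SNF uncertainty.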

  19. Interviewing a Silent (Radioactive) Witness through Nuclear Forensic Analysis.

    PubMed

    Mayer, Klaus; Wallenius, Maria; Varga, Zsolt

    2015-12-01

    Nuclear forensics is a relatively young scientific discipline that aims at providing information on nuclear material of unknown origin. The determination of characteristic parameters through tailored analytical techniques makes it possible to establish linkages to the material's processing history and hence provides hints about its place and date of production and its intended use.

  20. Radiation Evaluation of the AM2901A Microprocessor.

    DTIC Science & Technology

    1980-08-01

