Analysis of transient fission gas behaviour in oxide fuel using BISON and TRANSURANUS
NASA Astrophysics Data System (ADS)
Barani, T.; Bruschi, E.; Pizzocri, D.; Pastore, G.; Van Uffelen, P.; Williamson, R. L.; Luzzi, L.
2017-04-01
The modelling of fission gas behaviour is a crucial aspect of nuclear fuel performance analysis in view of the related effects on the thermo-mechanical performance of the fuel rod, which can be particularly significant during transients. In particular, experimental observations indicate that substantial fission gas release (FGR) can occur on a small time scale during transients (burst release). To accurately reproduce the rapid kinetics of the burst release process in fuel performance calculations, a model that accounts for non-diffusional mechanisms such as fuel micro-cracking is needed. In this work, we present and assess a model for transient fission gas behaviour in oxide fuel, which is applied as an extension of conventional diffusion-based models to introduce the burst release effect. The concept and governing equations of the model are presented, and the sensitivity of results to the newly introduced parameters is evaluated through an analytic sensitivity analysis. The model is assessed for application to integral fuel rod analysis by implementation in two structurally different fuel performance codes: BISON (multi-dimensional finite element code) and TRANSURANUS (1.5D code). Model assessment is based on the analysis of 19 light water reactor fuel rod irradiation experiments from the OECD/NEA IFPE (International Fuel Performance Experiments) database, all of which are simulated with both codes. The results point out an improvement in both the quantitative predictions of integral fuel rod FGR and the qualitative representation of the FGR kinetics with the transient model relative to the canonical, purely diffusion-based models of the codes. The overall quantitative improvement of the integral FGR predictions in the two codes is comparable. Moreover, calculated radial profiles of xenon concentration after irradiation are investigated and compared to experimental data, illustrating the underlying representation of the physical mechanisms of burst release.
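The transient extension described above couples a diffusion-based release path with a burst term triggered by rapid temperature change. A minimal sketch of that idea follows; the diffusivity constants, grain radius `a`, temperature-jump threshold `dT_crit`, and the 80% burst fraction are illustrative assumptions, not the paper's calibrated model:

```python
import math

def simulate_fgr(times, temps, burn_rate=1.0, a=5e-6, dT_crit=50.0):
    """Toy transient FGR: diffusion of gas to grain boundaries plus a
    micro-cracking 'burst' term triggered by rapid temperature change.
    All parameter values are illustrative, not from the paper."""
    produced = 0.0       # gas created by fission
    boundary = 0.0       # gas stored at grain boundaries
    released = 0.0       # gas released to the rod free volume
    for i in range(1, len(times)):
        dt = times[i] - times[i - 1]
        T = temps[i]
        produced += burn_rate * dt
        # Arrhenius-type diffusivity (illustrative constants)
        D = 1e-8 * math.exp(-3.5e4 / T)
        # fraction of intra-granular gas reaching boundaries this step
        # (short-time equivalent-sphere estimate)
        f = min(1.0, 6.0 * math.sqrt(D * dt / math.pi) / a)
        boundary += f * (produced - boundary - released)
        # burst release: a rapid temperature change cracks grain faces
        # and vents most of the grain-boundary inventory at once
        if abs(temps[i] - temps[i - 1]) > dT_crit:
            released += 0.8 * boundary
            boundary *= 0.2
    return released / produced if produced else 0.0
```

The equivalent-sphere fraction here stands in for the codes' full diffusion solvers; the point is only that a purely diffusional path cannot release gas on a transient's time scale, while the burst term can.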
DOE Office of Scientific and Technical Information (OSTI.GOV)
Barani, T.; Bruschi, E.; Pizzocri, D.
The modelling of fission gas behaviour is a crucial aspect of nuclear fuel analysis in view of the related effects on the thermo-mechanical performance of the fuel rod, which can be particularly significant during transients. Experimental observations indicate that substantial fission gas release (FGR) can occur on a small time scale during transients (burst release). To accurately reproduce the rapid kinetics of burst release in fuel performance calculations, a model that accounts for non-diffusional mechanisms such as fuel micro-cracking is needed. In this work, we present and assess a model for transient fission gas behaviour in oxide fuel, which is applied as an extension of diffusion-based models to allow for the burst release effect. The concept and governing equations of the model are presented, and the effect of the newly introduced parameters is evaluated through an analytic sensitivity analysis. Then, the model is assessed for application to integral fuel rod analysis. The approach that we take for model assessment involves implementation in two structurally different fuel performance codes, namely, BISON (multi-dimensional finite element code) and TRANSURANUS (1.5D semi-analytic code). The model is validated against 19 light water reactor fuel rod irradiation experiments from the OECD/NEA IFPE (International Fuel Performance Experiments) database, all of which are simulated with both codes. The results point out an improvement in both the qualitative representation of the FGR kinetics and the quantitative predictions of integral fuel rod FGR, relative to the canonical, purely diffusion-based models, with both codes. The overall quantitative improvement of the FGR predictions in the two codes is comparable. Furthermore, calculated radial profiles of xenon concentration are investigated and compared to experimental data, demonstrating the representation of the underlying mechanisms of burst release by the new model.
MCNP Version 6.2 Release Notes
DOE Office of Scientific and Technical Information (OSTI.GOV)
Werner, Christopher John; Bull, Jeffrey S.; Solomon, C. J.
Monte Carlo N-Particle, or MCNP®, is a general-purpose Monte Carlo radiation-transport code designed to track many particle types over broad ranges of energies. MCNP Version 6.2 follows the MCNP6.1.1 beta version and has been released to provide the radiation transport community with the latest feature developments and bug fixes for MCNP. Since the last release of MCNP, major work has been conducted to improve the code base, add features, and provide tools that facilitate ease of use of MCNP Version 6.2 as well as the analysis of results. These release notes serve as a general guide for the new and improved physics, source, data, tallies, unstructured mesh, code enhancements, and tools. For more detailed information on each of the topics, please refer to the appropriate references or the user manual, which can be found at http://mcnp.lanl.gov. This release of MCNP Version 6.2 contains 39 new features in addition to 172 bug fixes and code enhancements. There are still some 33 known issues with which users should familiarize themselves (see Appendix).
Code Pulse: Software Assurance (SWA) Visual Analytics for Dynamic Analysis of Code
2014-09-01
...competitive market analysis to assess the tool potential. The final transition targets were selected and expressed along with our research on the topic... public release milestones. Details of our testing methodology are in our Software Test Plan deliverable, CP-STP-0001. A summary of this approach is
Langley Stability and Transition Analysis Code (LASTRAC) Version 1.2 User Manual
NASA Technical Reports Server (NTRS)
Chang, Chau-Lyan
2004-01-01
LASTRAC is a general-purpose, physics-based transition prediction code released by NASA for Laminar Flow Control studies and transition research. The design and development of the LASTRAC code are aimed at providing an engineering tool that is easy to use and yet capable of dealing with a broad range of transition-related issues. It was written from scratch based on state-of-the-art numerical methods for stability analysis and modern software technologies. At low fidelity, it allows users to perform linear stability analysis and N-factor transition correlation for a broad range of flow regimes and configurations by using either the linear stability theory or the linear parabolized stability equations method. At high fidelity, users may use nonlinear PSE to track finite-amplitude disturbances until the onset of skin friction rise. This document describes the governing equations, numerical methods, code development, a detailed description of input/output parameters, and case studies for the current release of LASTRAC.
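The N-factor correlation mentioned above amounts to integrating a disturbance amplification rate along the surface and flagging transition where the accumulated factor crosses an empirical threshold, commonly around 9. A toy sketch with made-up growth-rate samples (not LASTRAC's actual solver):

```python
def n_factor_transition(x, sigma, n_crit=9.0):
    """e^N method sketch: N(x) is the integral of the amplification
    rate sigma along the streamwise coordinate x (trapezoidal rule).
    Returns (x_transition, N) at the first station where N >= n_crit,
    or (None, N_final) if transition is not predicted."""
    n = 0.0
    for i in range(1, len(x)):
        n += 0.5 * (sigma[i] + sigma[i - 1]) * (x[i] - x[i - 1])
        if n >= n_crit:
            return x[i], n
    return None, n
```

In practice the growth rates come from linear stability theory or PSE solutions for the most amplified disturbance; here they are simply tabulated inputs.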
DOE Office of Scientific and Technical Information (OSTI.GOV)
Vesselinov, Velimir; O'Malley, Daniel; Lin, Youzuo
2016-07-01
Mads.jl (Model analysis and decision support in Julia) is a code that streamlines the process of using data and models for analysis and decision support. It is based on another open-source code developed at LANL and written in C/C++ (MADS; http://mads.lanl.gov; LA-CC-11-035). Mads.jl can work with external models of arbitrary complexity as well as built-in models of flow and transport in porous media. It enables a number of data- and model-based analyses, including model calibration, sensitivity analysis, uncertainty quantification, and decision analysis. The code can also use a series of alternative adaptive computational techniques for Bayesian sampling, Monte Carlo analysis, and Bayesian Information-Gap Decision Theory. The code is implemented in the Julia programming language and has high-performance (parallel) computing and memory management capabilities. The code uses a series of third-party modules developed by others. The code development will also include contributions to the existing third-party modules written in Julia; these contributions will be important for the efficient implementation of the algorithms used by Mads.jl. The code also uses a series of modules developed at LANL by Dan O'Malley; these modules will also be a part of the Mads.jl release. Mads.jl will be released under the GPL v3 license. The code will be distributed as a Git repo at gitlab.com and github.com. The Mads.jl manual and documentation will be posted at madsjulia.lanl.gov.
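Of the analyses listed, model calibration is the simplest to illustrate: choose the parameters that minimize the misfit between model output and observations. A deliberately minimal sketch, with grid search standing in for the adaptive samplers a code like Mads.jl actually provides (all names are ours):

```python
def calibrate(model, observations, param_grid):
    """Toy model calibration: pick the parameter value minimizing the
    sum of squared residuals against observations. `model(p, t)` is a
    user-supplied forward model; observations are (t, y) pairs."""
    best, best_sse = None, float("inf")
    for p in param_grid:
        sse = sum((model(p, t) - y) ** 2 for t, y in observations)
        if sse < best_sse:
            best, best_sse = p, sse
    return best, best_sse
```

Real calibration frameworks replace the grid with gradient-based or Bayesian search, but the objective, a residual norm between model and data, is the same.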
NASA Technical Reports Server (NTRS)
Radhakrishnan, Krishnan
1993-01-01
A detailed analysis of the accuracy of several techniques recently developed for integrating stiff ordinary differential equations is presented. The techniques include two general-purpose codes, EPISODE and LSODE, developed for an arbitrary system of ordinary differential equations, and three specialized codes, CHEMEQ, CREK1D, and GCKP4, developed specifically to solve chemical kinetic rate equations. The accuracy study is made by application of these codes to two practical combustion kinetics problems. Both problems describe adiabatic, homogeneous, gas-phase chemical reactions at constant pressure, and include all three combustion regimes: induction, heat release, and equilibration. To illustrate the error variation in the different combustion regimes, the species are divided into three types (reactants, intermediates, and products), and error versus time plots are presented for each species type and the temperature. These plots show that CHEMEQ is the most accurate code during induction and early heat release. During late heat release and equilibration, however, the other codes are more accurate. A single global quantity, a mean integrated root-mean-square error, that measures the average error incurred in solving the complete problem is used to compare the accuracy of the codes. Among the codes examined, LSODE is the most accurate for solving chemical kinetics problems. It is also the most efficient code, in the sense that it requires the least computational work to attain a specified accuracy level. An important finding is that use of the algebraic enthalpy conservation equation to compute the temperature can be more accurate and efficient than integrating the temperature differential equation.
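The stiffness issue these codes address can be seen on the scalar test problem y' = -1000y: an explicit method with a step outside its stability region diverges, while an implicit method of the same order stays accurate. A self-contained illustration (not one of the five codes under study):

```python
import math

def explicit_euler(lam, y0, h, steps):
    """Forward Euler for y' = -lam*y; unstable when h*lam > 2."""
    y = y0
    for _ in range(steps):
        y += h * (-lam * y)
    return y

def implicit_euler(lam, y0, h, steps):
    """Backward Euler for y' = -lam*y; unconditionally stable.
    Each step solves y_new = y + h*(-lam*y_new) for y_new."""
    y = y0
    for _ in range(steps):
        y = y / (1.0 + h * lam)
    return y

# Stiff test problem: y' = -1000 y, y(0) = 1, integrated to t = 1
# with h = 0.01, so h*lam = 10 and forward Euler must diverge.
lam, h, steps = 1000.0, 0.01, 100
exact = math.exp(-lam * 1.0)                      # essentially zero
err_exp = abs(explicit_euler(lam, 1.0, h, steps) - exact)
err_imp = abs(implicit_euler(lam, 1.0, h, steps) - exact)
```

This is the mechanism behind the study's finding: the implicit BDF machinery in LSODE can take steps sized by accuracy rather than by stability.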
Initial verification and validation of RAZORBACK - A research reactor transient analysis code
DOE Office of Scientific and Technical Information (OSTI.GOV)
Talley, Darren G.
2015-09-01
This report describes the work and results of the initial verification and validation (V&V) of the beta release of the Razorback code. Razorback is a computer code designed to simulate the operation of a research reactor (such as the Annular Core Research Reactor (ACRR)) by a coupled numerical solution of the point reactor kinetics equations, the energy conservation equation for fuel element heat transfer, and the mass, momentum, and energy conservation equations for the water cooling of the fuel elements. This initial V&V effort was intended to confirm that the code work to date shows good agreement between simulation and actual ACRR operations, indicating that the subsequent V&V effort for the official release of the code will be successful.
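The point reactor kinetics equations that Razorback couples to the thermal model can be sketched with a single delayed-neutron group; the constants below are generic textbook values, not Razorback's, and explicit Euler with a small step stands in for its coupled solver:

```python
def point_kinetics(rho, n0=1.0, beta=0.0075, lam=0.08, Lambda=1e-4,
                   dt=1e-6, t_end=0.05):
    """One-delayed-group point reactor kinetics, explicit Euler:
        dn/dt = ((rho - beta)/Lambda) * n + lam * C
        dC/dt = (beta/Lambda) * n - lam * C
    rho: reactivity, beta: delayed fraction, lam: precursor decay
    constant, Lambda: neutron generation time (illustrative values).
    Returns the relative power n at t_end."""
    n = n0
    c = beta * n0 / (Lambda * lam)   # start with equilibrium precursors
    for _ in range(int(t_end / dt)):
        dn = ((rho - beta) / Lambda) * n + lam * c
        dc = (beta / Lambda) * n - lam * c
        n += dt * dn
        c += dt * dc
    return n
```

At zero reactivity the power holds its initial value; a positive insertion below prompt critical produces the expected prompt jump followed by a slower, delayed-neutron-controlled rise.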
Selection of a computer code for Hanford low-level waste engineered-system performance assessment
DOE Office of Scientific and Technical Information (OSTI.GOV)
McGrail, B.P.; Mahoney, L.A.
Planned performance assessments for the proposed disposal of low-level waste (LLW) glass produced from remediation of wastes stored in underground tanks at Hanford, Washington, will require calculations of radionuclide release rates from the subsurface disposal facility. These calculations will be done with the aid of computer codes. Currently available computer codes were ranked in terms of the feature sets implemented in the code that match a set of physical, chemical, numerical, and functional capabilities needed to assess release rates from the engineered system. The needed capabilities were identified from an analysis of the important physical and chemical processes expected to affect LLW glass corrosion and the mobility of radionuclides. The highest ranked computer code was found to be the ARES-CT code, developed at PNL for the US Department of Energy for the evaluation of land disposal sites.
Insights Gained from Forensic Analysis with MELCOR of the Fukushima-Daiichi Accidents.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Andrews, Nathan C.; Gauntt, Randall O.
Since the accidents at Fukushima-Daiichi, Sandia National Laboratories has been modeling these accident scenarios using the severe accident analysis code MELCOR. MELCOR is a widely used computer code developed at Sandia National Laboratories since ~1982 for the U.S. Nuclear Regulatory Commission. Insights from the modeling of these accidents are being used to better inform future code development and potentially improved accident management. To date, the need to better capture in-vessel thermal-hydraulics, ex-vessel melt coolability, and concrete interactions has led to the implementation of new models. The most recent analyses, presented in this paper, have been in support of the Organization for Economic Cooperation and Development Nuclear Energy Agency's (OECD/NEA) Benchmark Study of the Accident at the Fukushima Daiichi Nuclear Power Station (BSAF) Project. The goal of this project is to accurately capture the source term from all three releases and then model the atmospheric dispersion. To do this, a forensic approach is being used in which available plant data and release timings inform the modeled MELCOR accident scenario. For example, containment failures, core slumping events, and lower head failure timings are all enforced parameters in these analyses. This approach is fundamentally different from the blind code assessment analyses often used in standard problem exercises. The timings of these events are informed by representative spikes or decreases in plant data. The combination of improvements to the MELCOR source code resulting from analyses of previous accidents and this forensic approach has allowed Sandia to generate representative and plausible source terms for all three accidents at Fukushima Daiichi out to three weeks after the accident, capturing both early and late releases.
In particular, using the source terms developed with MELCOR as input to the MACCS code, which models atmospheric dispersion and deposition, we are able to reasonably capture the deposition of radionuclides to the northwest of the reactor site.
Bidirectional automatic release of reserve for low voltage network made with low capacity PLCs
NASA Astrophysics Data System (ADS)
Popa, I.; Popa, G. N.; Diniş, C. M.; Deaconu, S. I.
2018-01-01
The article presents the design of a bidirectional automatic release of reserve made with two types of low-capacity programmable logic controllers: the PS-3 from Klöckner-Moeller and the Zelio from Schneider. It analyses the electronic timing circuits that can be used for making the bidirectional automatic release of reserve: a time-on delay circuit and two types of time-off delay circuit. The paper presents the code sequences for timing performed on the PS-3 PLC, the logical functions for the bidirectional automatic release of reserve, the classical control electrical diagram (with contacts, relays, and time relays), the electronic control diagram (with logic gates and timing circuits), the code (in IL language) for the PS-3 PLC, and the code (in FBD language) for the Zelio PLC. A comparative analysis of the use of the two types of PLC is carried out, and the advantages of using PLCs are presented.
VICTORIA: A mechanistic model for radionuclide behavior in the reactor coolant system
DOE Office of Scientific and Technical Information (OSTI.GOV)
Schaperow, J.H.; Bixler, N.E.
1996-12-31
VICTORIA is the U.S. Nuclear Regulatory Commission's (NRC's) mechanistic, best-estimate code for analysis of fission product release from the core and subsequent transport in the reactor vessel and reactor coolant system. VICTORIA requires thermal-hydraulic data (i.e., temperatures, pressures, and velocities) as input. In the past, these data have been taken from the results of calculations from thermal-hydraulic codes such as SCDAP/RELAP5, MELCOR, and MAAP. Validation and assessment of VICTORIA 1.0 have been completed. An independent peer review of VICTORIA, directed by Brookhaven National Laboratory and supported by experts in the areas of fuel release, fission product chemistry, and aerosol physics, has been undertaken. This peer review, which will independently assess the code's capabilities, is nearing completion, with the peer review committee's final report expected in December 1996. A limited amount of additional development is expected as a result of the peer review. Following this additional development, the NRC plans to release VICTORIA 1.1 and an updated and improved code manual. Future plans mainly involve use of the code for plant calculations to investigate specific safety issues as they arise. Also, the code will continue to be used in support of the Phebus experiments.
Porcupine: A visual pipeline tool for neuroimaging analysis
Snoek, Lukas; Knapen, Tomas
2018-01-01
The field of neuroimaging is rapidly adopting a more reproducible approach to data acquisition and analysis. Data structures and formats are being standardised and data analyses are getting more automated. However, as data analysis becomes more complicated, researchers often have to write longer analysis scripts, spanning different tools across multiple programming languages. This makes it more difficult to share or recreate code, reducing the reproducibility of the analysis. We present a tool, Porcupine, with which one constructs an analysis visually and automatically produces the corresponding analysis code. The graphical representation improves understanding of the performed analysis, while retaining the flexibility of modifying the produced code manually to custom needs. Not only does Porcupine produce the analysis code, it also creates a shareable environment for running the code in the form of a Docker image. Together, this forms a reproducible way of constructing, visualising and sharing one's analysis. Currently, Porcupine links to Nipype functionalities, which in turn access most standard neuroimaging analysis tools. Our goal is to release researchers from the constraints of specific implementation details, thereby freeing them to think about novel and creative ways to solve a given problem. Porcupine improves the overview researchers have of their processing pipelines, and facilitates both the development and communication of their work. This will lower the threshold at which less expert users can generate reusable pipelines. With Porcupine, we bridge the gap between the conceptual and the implementational levels of analysis and make it easier for researchers to create reproducible and shareable science. We provide a wide range of examples and documentation, as well as installer files for all platforms, on our website: https://timvanmourik.github.io/Porcupine. Porcupine is free, open source, and released under the GNU General Public License v3.0. PMID:29746461
Grizzly Usage and Theory Manual
DOE Office of Scientific and Technical Information (OSTI.GOV)
Spencer, B. W.; Backman, M.; Chakraborty, P.
2016-03-01
Grizzly is a multiphysics simulation code for characterizing the behavior of nuclear power plant (NPP) structures, systems and components (SSCs) subjected to a variety of age-related degradation mechanisms. Grizzly simulates both the progression of aging processes and the capacity of aged components to perform safely. This initial beta release of Grizzly includes capabilities for engineering-scale thermo-mechanical analysis of reactor pressure vessels (RPVs). Grizzly will ultimately include capabilities for a wide range of components and materials. Grizzly is in a state of constant development, and future releases will broaden the capabilities of this code for RPV analysis, as well as expand it to address degradation in other critical NPP components.
Application of Aeroelastic Solvers Based on Navier-Stokes Equations
NASA Technical Reports Server (NTRS)
Keith, Theo G., Jr.; Srivastava, Rakesh
1998-01-01
A pre-release version of the Navier-Stokes solver (TURBO) was obtained from MSU. Together with Dr. Milind Bakhle of the University of Toledo, subroutines for aeroelastic analysis were developed and added to the TURBO code to create versions 1 and 2 of the TURBO-AE code. For a specified mode shape, frequency, and inter-blade phase angle, the code calculates the work done by the fluid on the rotor for a prescribed sinusoidal motion. Positive work on the rotor indicates instability of the rotor. Version 1 of the code calculates the work for in-phase blade motions only. In version 2 of the code, the capability for analyzing all possible inter-blade phase angles was added. Version 2 of the TURBO-AE code was validated and delivered to NASA and the industry partners of the AST project. The capabilities and features of the code are summarized in Refs. [1] & [2]. To release version 2 of TURBO-AE, a workshop was organized at NASA Lewis by Dr. Srivastava and Dr. M. A. Bakhle, both of the University of Toledo, in October 1996 for the industry partners of NASA Lewis. The workshop provided the potential users of TURBO-AE all the relevant information required for preparing the input data, executing the code, interpreting the results, and benchmarking the code on their computer systems. After the code was delivered to the industry partners, user support was also provided. A new version of the Navier-Stokes solver (TURBO) was later released by MSU. This version had significant changes and upgrades over the previous version. This new version was merged with the TURBO-AE code. Also, new boundary conditions for 3-D unsteady non-reflecting boundaries were developed by researchers from UTRC, Ref. [3]. Time was spent on understanding, familiarizing with, executing, and implementing the new boundary conditions in the TURBO-AE code. Work was started on the phase-lagged (time-shifted) boundary condition version (version 4) of the code.
This will allow users to calculate non-zero inter-blade phase angles using only one blade passage for analysis.
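The stability criterion used above, the work done by the fluid on the blade over one vibration cycle, reduces to the cyclic integral W = ∮ F v dt, whose sign decides stability. A toy numerical check with harmonic motion and illustrative force phasing (not TURBO-AE's actual computation):

```python
import math

def cycle_work(force, velocity, dt):
    """Work done by the fluid on the blade over one vibration cycle,
    W = ∮ F v dt, integrated with the trapezoidal rule. W > 0 means
    the fluid feeds energy into the blade motion: flutter instability."""
    w = 0.0
    for i in range(1, len(force)):
        w += 0.5 * (force[i] * velocity[i]
                    + force[i - 1] * velocity[i - 1]) * dt
    return w

# Blade velocity v = cos(t) over one period; a fluid force in phase
# with v pumps energy in, an opposing force damps the motion.
n, dt = 1000, 2.0 * math.pi / 1000
v = [math.cos(i * dt) for i in range(n + 1)]
w_unstable = cycle_work(v, v, dt)              # F in phase with v
w_damped = cycle_work([-x for x in v], v, dt)  # F opposes v
```

For the in-phase case the integral of cos² over a period gives W = π, confirming the sign convention: positive work per cycle means the prescribed motion would grow.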
CFL3D Version 6.4-General Usage and Aeroelastic Analysis
NASA Technical Reports Server (NTRS)
Bartels, Robert E.; Rumsey, Christopher L.; Biedron, Robert T.
2006-01-01
This document contains the course notes on the computational fluid dynamics code CFL3D version 6.4. It is intended to provide users, from basic to advanced, with the information necessary to successfully use the code for a broad range of cases. Much of the course covers capability that has been a part of previous versions of the code, with material compiled from a CFL3D v5.0 manual and from the CFL3D v6 web site prior to the current release. This part of the material is presented to users of the code not familiar with computational fluid dynamics. There is new capability in CFL3D version 6.4 presented here that has not previously been published. There are also outdated features no longer used or recommended in recent releases of the code. The information offered here supersedes earlier manuals and updates outdated usage. Where current usage supersedes older versions, notation of that is made. These course notes also provide hints for usage, code installation, and examples not found elsewhere.
A Rocket Engine Design Expert System
NASA Technical Reports Server (NTRS)
Davidian, Kenneth J.
1989-01-01
The overall structure and capabilities of an expert system designed to evaluate rocket engine performance are described. The expert system incorporates a JANNAF standard reference computer code to determine rocket engine performance and a state of the art finite element computer code to calculate the interactions between propellant injection, energy release in the combustion chamber, and regenerative cooling heat transfer. Rule-of-thumb heuristics were incorporated for the H2-O2 coaxial injector design, including a minimum gap size constraint on the total number of injector elements. One dimensional equilibrium chemistry was used in the energy release analysis of the combustion chamber. A 3-D conduction and/or 1-D advection analysis is used to predict heat transfer and coolant channel wall temperature distributions, in addition to coolant temperature and pressure drop. Inputting values to describe the geometry and state properties of the entire system is done directly from the computer keyboard. Graphical display of all output results from the computer code analyses is facilitated by menu selection of up to five dependent variables per plot.
A rocket engine design expert system
NASA Technical Reports Server (NTRS)
Davidian, Kenneth J.
1989-01-01
The overall structure and capabilities of an expert system designed to evaluate rocket engine performance are described. The expert system incorporates a JANNAF standard reference computer code to determine rocket engine performance and a state-of-the-art finite element computer code to calculate the interactions between propellant injection, energy release in the combustion chamber, and regenerative cooling heat transfer. Rule-of-thumb heuristics were incorporated for the hydrogen-oxygen coaxial injector design, including a minimum gap size constraint on the total number of injector elements. One-dimensional equilibrium chemistry was employed in the energy release analysis of the combustion chamber and three-dimensional finite-difference analysis of the regenerative cooling channels was used to calculate the pressure drop along the channels and the coolant temperature as it exits the coolant circuit. Inputting values to describe the geometry and state properties of the entire system is done directly from the computer keyboard. Graphical display of all output results from the computer code analyses is facilitated by menu selection of up to five dependent variables per plot.
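The minimum-gap heuristic mentioned for the coaxial injector can be stated as simple ring geometry: the number of elements on a ring is capped by how many centre-to-centre pitches fit on the circumference. The function below is our illustration of such a rule of thumb, not the expert system's actual code:

```python
import math

def max_elements_per_ring(ring_radius, element_dia, min_gap):
    """Heuristic cap on coaxial injector elements per ring: keep at
    least min_gap between neighbouring element rims. All units are
    metres; the geometry is illustrative."""
    pitch = element_dia + min_gap                  # centre-to-centre spacing
    circumference = 2.0 * math.pi * ring_radius
    return int(circumference // pitch)
```

An expert system encodes many such rules and fires them during design evaluation; this one simply turns a spacing constraint into an integer element count.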
Radio controlled release apparatus for animal data acquisition devices
Stamps, James Frederick
2000-01-01
A novel apparatus for reliably and selectively releasing a data acquisition package from an animal for recovery. The data package comprises two parts: 1) an animal data acquisition device and 2) a co-located release apparatus. In one embodiment, which is useful for land animals, the release apparatus includes two major components: 1) an electronics package, comprising a receiver; a decoder comparator, having a plurality of individually selectable codes; and an actuator circuit, and 2) a release device, which can be a mechanical device, which acts to release the data package from the animal. To release a data package from a particular animal, a radio transmitter sends a coded signal, which is decoded to determine if the code is valid for that animal data package. Having received a valid code, the release device is activated to release the data package from the animal for subsequent recovery. A second embodiment includes flotation means and is useful for releasing animal data acquisition devices attached to sea animals. This embodiment further provides for releasing a data package underwater by employing an acoustic signal.
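The decoder-comparator logic, receive a code, compare it against the code programmed for this package, and actuate only on a match, is a few lines in any language. A sketch with hypothetical animal IDs and code values (the actual apparatus implements this in hardware):

```python
# Hypothetical registry of per-animal release codes.
VALID_CODES = {"seal-07": 0x3A5, "elk-12": 0x1C4}

def release_if_valid(received_code, animal_id, actuate):
    """Decoder-comparator sketch: trigger the release actuator only
    when the received code matches the code selected for this animal's
    data package. Returns True if the release fired."""
    if VALID_CODES.get(animal_id) == received_code:
        actuate()
        return True
    return False
```

Because each package compares against its own selectable code, one transmitter can address many tagged animals without releasing the wrong package.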
Design of an Orbital Inspection Satellite
1986-12-01
Captain, USAF. December 1986. Approved for public release; distribution unlimited. ... lends itself to the technique of multi-objective analysis. The final step is planning for action. This communicates the entire systems engineering
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ryding, Kristen E.; Skalski, John R.
1999-06-01
The purpose of this report is to illustrate the development of a stochastic model using coded wire-tag (CWT) release and age-at-return data, in order to regress first year ocean survival probabilities against coastal ocean conditions and climate covariates.
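Regressing survival probabilities on covariates from release/return counts is commonly done with a logit link on binomial data. A minimal sketch fitted by gradient ascent, with one covariate and made-up numbers; the report's stochastic model is more elaborate:

```python
import math

def fit_logistic(xs, survived, total, lr=0.1, iters=5000):
    """Regress survival probability on one ocean-condition covariate
    with a logit link, p = 1/(1 + exp(-(a + b*x))), by gradient ascent
    on the binomial log-likelihood. xs are covariate values; survived
    and total are return and release counts per group (illustrative)."""
    a, b = 0.0, 0.0
    n_all = sum(total)
    for _ in range(iters):
        ga = gb = 0.0
        for x, s, n in zip(xs, survived, total):
            p = 1.0 / (1.0 + math.exp(-(a + b * x)))
            ga += s - n * p          # d(logL)/da
            gb += (s - n * p) * x    # d(logL)/db
        a += lr * ga / n_all
        b += lr * gb / n_all
    return a, b
```

With return fractions 0.1, 0.5, and 0.9 at covariate values -1, 0, and 1, the fitted slope approaches ln 9 ≈ 2.2, the logit spacing implied by the data.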
NASA Technical Reports Server (NTRS)
Valisetty, R. R.; Chamis, C. C.
1987-01-01
A computer code is presented for the sublaminate/ply level analysis of composite structures. This code is useful for obtaining stresses in regions affected by delaminations, transverse cracks, and discontinuities related to inherent fabrication anomalies, geometric configurations, and loading conditions. Particular attention is focused on those layers or groups of layers (sublaminates) which are immediately affected by the inherent flaws. These layers are analyzed as homogeneous bodies in equilibrium and in isolation from the rest of the laminate. The theoretical model used to analyze the individual layers allows the relevant stresses and displacements near discontinuities to be represented in the form of pure exponential-decay-type functions which are selected to eliminate the exponential-precision-related difficulties in sublaminate/ply level analysis. Thus, sublaminate analysis can be conducted without any restriction on the maximum number of layers, delaminations, transverse cracks, or other types of discontinuities. In conjunction with the strain energy release rate (SERR) concept and composite micromechanics, this computational procedure is used to model select cases of end-notch and mixed-mode fracture specimens. The computed stresses are in good agreement with those from a three-dimensional finite element analysis. Also, SERRs compare well with limited available experimental data.
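The strain energy release rate concept used with the sublaminate stresses can be illustrated by its defining finite difference: at fixed displacement, G ≈ -ΔU/ΔA as the crack area grows. The numbers below are toy values, not the paper's specimens:

```python
def serr(strain_energy, crack_areas):
    """Strain energy release rate by finite difference, G ≈ -dU/dA
    (fixed-displacement convention: stored energy drops as the crack
    grows). strain_energy[i] is the stored energy at crack area
    crack_areas[i]; returns G between consecutive crack sizes."""
    return [(strain_energy[i - 1] - strain_energy[i])
            / (crack_areas[i] - crack_areas[i - 1])
            for i in range(1, len(strain_energy))]
```

In practice the energies come from the sublaminate stress solution at two slightly different crack lengths; comparing G against a critical value then predicts delamination growth.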
NOAA/DOE CWP structural analysis package. [CWPFLY, CWPEXT, COTEC, and XOTEC codes]
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pompa, J.A.; Lunz, D.F.
1979-09-01
The theoretical development and computer code user's manual for analysis of the Ocean Thermal Energy Conversion (OTEC) plant cold water pipe (CWP) are presented. The analysis of the CWP includes coupled platform/CWP loadings and dynamic responses. This report, with the exception of the Introduction and Appendix F, was originally published as Hydronautics, Inc., Technical Report No. 7825-2 (by Barr, Chang, and Thasanatorn) in November 1978. A detailed theoretical development of the equations describing the coupled platform/CWP system and preliminary validation efforts are described. The appendices encompass a complete user's manual, describing the inputs, outputs and operation of the four component programs, and detail changes and updates implemented since the original release of the code by Hydronautics. The code itself is available through NOAA's Office of Ocean Technology and Engineering Services.
Modeling of thermo-mechanical and irradiation behavior of mixed oxide fuel for sodium fast reactors
NASA Astrophysics Data System (ADS)
Karahan, Aydın; Buongiorno, Jacopo
2010-01-01
An engineering code to model the irradiation behavior of UO2-PuO2 mixed oxide fuel pins in sodium-cooled fast reactors was developed. The code was named Fuel Engineering and Structural analysis Tool (FEAST-OXIDE). FEAST-OXIDE has several modules working in coupled form with an explicit numerical algorithm. These modules describe: (1) fission gas release and swelling, (2) fuel chemistry and restructuring, (3) temperature distribution, (4) fuel-clad chemical interaction and (5) fuel-clad mechanical analysis. Given the fuel pin geometry, composition and irradiation history, FEAST-OXIDE can analyze fuel and cladding thermo-mechanical behavior in both steady-state and design-basis transient scenarios. The code was written in the FORTRAN-90 programming language. The mechanical analysis module implements the LIFE algorithm. Fission gas release and swelling behavior is described by the OGRES and NEFIG models. However, the original OGRES model has been extended to include the effects of joint oxide-gaine (JOG) formation on fission gas release and swelling. A detailed fuel chemistry model has been included to describe the cesium radial migration and JOG formation, oxygen and plutonium radial distribution and the axial migration of cesium. The fuel restructuring model includes the effects of as-fabricated porosity migration, irradiation-induced fuel densification, grain growth, hot pressing and fuel cracking and relocation. Finally, a kinetics model is included to predict the clad wastage formation. FEAST-OXIDE predictions have been compared to the available FFTF, EBR-II and JOYO databases, as well as the LIFE-4 code predictions. The agreement was found to be satisfactory for steady-state and slow-ramp over-power accidents.
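The explicit, loosely coupled module sequence described above can be sketched as a per-time-step loop in which each module advances once using the previous step's state. The module physics below are placeholder relaxation laws for illustration only, not FEAST-OXIDE's actual models:

```python
# Sketch of an explicitly coupled multi-module time step. Numbering mirrors
# the module list above; all rate constants and targets are invented.
def advance_timestep(state, dt):
    state = dict(state)  # work on a copy of the previous step's state
    # (1) fission gas release and swelling: simple first-order release kinetics
    state["fgr"] += dt * 0.001 * (1.0 - state["fgr"])
    # (3) temperature distribution: relaxation toward a quasi-steady value
    state["T_fuel"] += dt * 0.1 * (1500.0 - state["T_fuel"])
    # (5) fuel-clad mechanical analysis: thermal strain from the new temperature
    state["strain"] = 1.0e-5 * (state["T_fuel"] - 600.0)
    return state

state = {"fgr": 0.0, "T_fuel": 600.0, "strain": 0.0}
for _ in range(100):
    state = advance_timestep(state, dt=1.0)
```

The design trade-off of explicit coupling is that each module sees slightly stale data from the others, which is acceptable when the time step is small relative to the physics' time constants.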
Code manual for CONTAIN 2.0: A computer code for nuclear reactor containment analysis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Murata, K.K.; Williams, D.C.; Griffith, R.O.
1997-12-01
The CONTAIN 2.0 computer code is an integrated analysis tool used for predicting the physical conditions, chemical compositions, and distributions of radiological materials inside a containment building following the release of material from the primary system in a light-water reactor accident. It can also predict the source term to the environment. CONTAIN 2.0 is intended to replace the earlier CONTAIN 1.12, which was released in 1991. The purpose of this Code Manual is to provide full documentation of the features and models in CONTAIN 2.0. Besides complete descriptions of the models, this Code Manual provides a complete description of the input and output from the code. CONTAIN 2.0 is a highly flexible and modular code that can run problems that are either quite simple or highly complex. An important aspect of CONTAIN is that the interactions among thermal-hydraulic phenomena, aerosol behavior, and fission product behavior are taken into account. The code includes atmospheric models for steam/air thermodynamics, intercell flows, condensation/evaporation on structures and aerosols, aerosol behavior, and gas combustion. It also includes models for reactor cavity phenomena such as core-concrete interactions and coolant pool boiling. Heat conduction in structures, fission product decay and transport, radioactive decay heating, and the thermal-hydraulic and fission product decontamination effects of engineered safety features are also modeled. To the extent possible, the best available models for severe accident phenomena have been incorporated into CONTAIN, but it is intrinsic to the nature of accident analysis that significant uncertainty exists regarding numerous phenomena. In those cases, sensitivity studies can be performed with CONTAIN by means of user-specified input parameters. Thus, the code can be viewed as a tool designed to assist the knowledgeable reactor safety analyst in evaluating the consequences of specific modeling assumptions.
Consequence analysis in LPG installation using an integrated computer package.
Ditali, S; Colombi, M; Moreschini, G; Senni, S
2000-01-07
This paper presents the prototype of the computer code, Atlantide, developed to assess the consequences associated with accidental events that can occur in an LPG storage plant. The characteristic of Atlantide is to be simple enough but at the same time adequate to cope with consequence analysis as required by Italian legislation in fulfilling the Seveso Directive. The application of Atlantide is appropriate for LPG storage/transferring installations. The models and correlations implemented in the code are relevant to flashing liquid releases, heavy gas dispersion and other typical phenomena such as BLEVE/Fireball. The computer code allows, on the basis of the operating/design characteristics, the study of the relevant accidental events from the evaluation of the release rate (liquid, gaseous and two-phase) in the unit involved, to the analysis of the subsequent evaporation and dispersion, up to the assessment of the final phenomena of fire and explosion. This is done taking as reference simplified Event Trees which describe the evolution of accidental scenarios, taking into account the most likely meteorological conditions, the different release situations and other features typical of an LPG installation. The limited input data required and the automatic linking between the single models, which are activated in a defined sequence depending on the accidental event selected, minimize both the time required for the risk analysis and the possibility of errors. Models and equations implemented in Atlantide have been selected from public literature or in-house developed software and tailored with the aim to be easy to use and fast to run but, nevertheless, able to provide realistic simulation of the accidental event as well as reliable results, in terms of physical effects and hazardous areas.
The results have been compared with those of other internationally recognized codes and with the criteria adopted by Italian authorities to verify the Safety Reports for LPG installations. A brief description of the theoretical basis of each model implemented in Atlantide and an example of application are included in the paper.
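The simplified Event Tree logic described above reduces to multiplying an initiating release frequency by conditional branch probabilities to obtain outcome frequencies. A minimal sketch with invented numbers, not Atlantide's models:

```python
# Hypothetical event tree for a pressurized LPG release; all frequencies and
# branch probabilities below are illustrative assumptions.
initiating_frequency = 1.0e-4   # releases per year (assumed)
p_imm = 0.1                     # P(immediate ignition) -> jet fire / fireball
p_del = 0.2                     # P(delayed ignition | no immediate) -> flash fire / VCE

outcomes = {
    "jet_fire_or_fireball": initiating_frequency * p_imm,
    "flash_fire_or_vce": initiating_frequency * (1 - p_imm) * p_del,
    "safe_dispersion": initiating_frequency * (1 - p_imm) * (1 - p_del),
}
```

Because each branch partitions the remaining probability, the outcome frequencies always sum back to the initiating frequency, which is a useful consistency check in any event-tree implementation.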
Application of the DART Code for the Assessment of Advanced Fuel Behavior
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rest, J.; Totev, T.
2007-07-01
The Dispersion Analysis Research Tool (DART) code is a dispersion fuel analysis code that contains mechanistically-based fuel and reaction-product swelling models, a one dimensional heat transfer analysis, and mechanical deformation models. DART has been used to simulate the irradiation behavior of uranium oxide, uranium silicide, and uranium molybdenum aluminum dispersion fuels, as well as their monolithic counterparts. The thermal-mechanical DART code has been validated against RERTR tests performed in the ATR for irradiation data on interaction thickness, fuel, matrix, and reaction product volume fractions, and plate thickness changes. The DART fission gas behavior model has been validated against UO2 fission gas release data as well as measured fission gas-bubble size distributions. Here DART is utilized to analyze various aspects of the observed bubble growth in U-Mo/Al interaction product.
RTE: A computer code for Rocket Thermal Evaluation
NASA Technical Reports Server (NTRS)
Naraghi, Mohammad H. N.
1995-01-01
The numerical model for a rocket thermal analysis code (RTE) is discussed. RTE is a comprehensive thermal analysis code for thermal analysis of regeneratively cooled rocket engines. The input to the code consists of the composition of fuel/oxidant mixture and flow rates, chamber pressure, coolant temperature and pressure, dimensions of the engine, materials and the number of nodes in different parts of the engine. The code allows for temperature variation in axial, radial and circumferential directions. By implementing an iterative scheme, it provides nodal temperature distribution, rates of heat transfer, hot gas and coolant thermal and transport properties. The fuel/oxidant mixture ratio can be varied along the thrust chamber. This feature allows the user to incorporate a non-equilibrium model or an energy release model for the hot-gas-side. The user has the option of bypassing the hot-gas-side calculations and directly inputting the gas-side fluxes. This feature is used to link RTE to a boundary layer module for the hot-gas-side heat flux calculations.
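The iterative nodal scheme described above can be illustrated with a Gauss-Seidel sweep over a 1D wall section with hot-gas convection on one side and coolant convection on the other. Geometry, material properties, and temperatures below are assumed for illustration and are not RTE's:

```python
import numpy as np

# Gauss-Seidel iteration on a 1D steady conduction problem with convective
# boundaries; all numbers are illustrative, not engine data.
T_gas, T_cool = 3000.0, 100.0     # hot-gas and coolant temperatures, K
h_gas, h_cool = 5000.0, 20000.0   # convective coefficients, W/(m^2 K)
k, L, n = 300.0, 0.01, 11         # wall conductivity W/(m K), thickness m, nodes
dx = L / (n - 1)

T = np.full(n, 1000.0)            # initial guess for nodal temperatures
for _ in range(100000):
    T_old = T.copy()
    # hot-gas-side node: convective flux balanced by conduction into the wall
    T[0] = (h_gas * T_gas + (k / dx) * T[1]) / (h_gas + k / dx)
    for i in range(1, n - 1):     # interior nodes: steady conduction
        T[i] = 0.5 * (T[i - 1] + T[i + 1])
    # coolant-side node: conduction balanced by convection to the coolant
    T[-1] = (h_cool * T_cool + (k / dx) * T[-2]) / (h_cool + k / dx)
    if np.max(np.abs(T - T_old)) < 1e-8:
        break

q_wall = h_gas * (T_gas - T[0])                            # converged heat flux
q_exact = (T_gas - T_cool) / (1/h_gas + L/k + 1/h_cool)    # 1D resistance network
```

The converged nodal flux matches the series thermal-resistance result, which is the kind of energy-balance check an iterative thermal code must satisfy at every node.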
Roland, Carl L; Lake, Joanita; Oderda, Gary M
2016-12-01
We conducted a systematic review to evaluate the worldwide English-language literature published from 2009 to 2014 on the prevalence of opioid misuse/abuse in retrospective databases where International Classification of Diseases (ICD) codes were used. Inclusion criteria were use of a retrospective database; measurement of abuse, dependence, and/or poisoning using ICD codes; stated or derivable prevalence; and a documented time frame. A meta-analysis was not performed. A qualitative narrative synthesis was used, and 16 studies were included for data abstraction. ICD code use varies; 10 studies used ICD codes that encompassed all three terms: abuse, dependence, or poisoning. Eight studies limited determination of misuse/abuse to an opioid user population. Abuse prevalence among opioid users in commercial databases using all three terms of ICD codes varied depending on the opioid; 21 per 1000 persons (reformulated extended-release oxymorphone; 2011-2012) to 113 per 1000 persons (immediate-release opioids; 2010-2011). Abuse prevalence in general populations using all three ICD code terms ranged from 1.15 per 1000 persons (commercial; 6 months 2010) to 8.7 per 1000 persons (Medicaid; 2002-2003). Prevalence increased over time. When similar ICD codes are used, the highest prevalence is in US government-insured populations. Limiting population to continuous opioid users increases prevalence. Prevalence varies depending on ICD codes used, population, time frame, and years studied. Researchers using ICD codes to determine opioid abuse prevalence need to be aware of cautions and limitations.
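The per-1000-persons prevalence figures quoted above follow from a simple rate calculation; the counts below are invented purely to illustrate the arithmetic:

```python
# Period prevalence expressed per 1,000 persons, as used in the review above.
def prevalence_per_1000(cases, population):
    """Cases identified by ICD codes, divided by the denominator population."""
    return 1000.0 * cases / population

# e.g. 113 per 1,000 opioid users corresponds to 11,300 cases among
# 100,000 continuous opioid users (hypothetical counts).
rate = prevalence_per_1000(11300, 100000)
```

The review's key caveat maps directly onto this formula: changing either the ICD code set (numerator) or the denominator population (all insured persons vs. opioid users only) changes the reported prevalence.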
Computer models and output, Spartan REM: Appendix B
NASA Technical Reports Server (NTRS)
Marlowe, D. S.; West, E. J.
1984-01-01
A computer model of the Spartan Release Engagement Mechanism (REM) is presented in a series of numerical charts and engineering drawings. A crack growth analysis code is used to predict the fracture mechanics of critical components.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gordienko, P. V., E-mail: gorpavel@vver.kiae.ru; Kotsarev, A. V.; Lizorkin, M. P.
2014-12-15
The procedure for recovering pin-by-pin energy-release fields for the BIPR-8 code is briefly described, along with the BIPR-8 algorithm used in the nodal computation of the reactor core, on which the recovery of pin-by-pin energy-release fields is based. The description and results of the verification using the pin-by-pin energy-release recovery module and the TVS-M program are given.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rest, J; Gehl, S M
1979-01-01
GRASS-SST and FASTGRASS are mechanistic computer codes for predicting fission-gas behavior in UO2-base fuels during steady-state and transient conditions. FASTGRASS was developed in order to satisfy the need for a fast-running alternative to GRASS-SST. Although based on GRASS-SST, FASTGRASS is approximately an order of magnitude quicker in execution. The GRASS-SST transient analysis has evolved through comparisons of code predictions with the fission-gas release and physical phenomena that occur during reactor operation and transient direct-electrical-heating (DEH) testing of irradiated light-water reactor fuel. The FASTGRASS calculational procedure is described in this paper, along with models of key physical processes included in both FASTGRASS and GRASS-SST. Predictions of fission-gas release obtained from GRASS-SST and FASTGRASS analyses are compared with experimental observations from a series of DEH tests. The major conclusion is that the computer codes should include an improved model for the evolution of the grain-edge porosity.
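The diffusion-based release that mechanistic codes of this kind refine is often summarized by the classic Booth equivalent-sphere approximation. A sketch (not the GRASS models themselves), using the short-time form for release under constant gas production:

```python
import math

# Booth equivalent-sphere approximation for fission gas release under constant
# gas production; tau = D * t / a**2 is the dimensionless diffusion time for an
# equivalent sphere of radius a. A textbook closed form, not a GRASS model.
def booth_release_fraction(tau):
    """Short-time Booth approximation, capped at complete release."""
    if tau < 0:
        raise ValueError("tau must be non-negative")
    f = 4.0 * math.sqrt(tau / math.pi) - 1.5 * tau
    return min(f, 1.0)

f = booth_release_fraction(0.01)   # roughly 21% released
```

The square-root dependence on time is the signature of diffusion-limited release; transient burst release, as discussed in the work above, departs from this curve, which is why non-diffusional terms are added.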
DOE Office of Scientific and Technical Information (OSTI.GOV)
O'Malley, Daniel; Vesselinov, Velimir V.
MADSpython (Model analysis and decision support tools in Python) is a code in Python that streamlines the process of using data and models for analysis and decision support using the code MADS. MADS is open-source code developed at LANL and written in C/C++ (MADS; http://mads.lanl.gov; LA-CC-11-035). MADS can work with external models of arbitrary complexity as well as built-in models of flow and transport in porous media. The Python scripts in MADSpython facilitate the generation of the input and output files needed by MADS as well as by the external simulators, which include FEHM and PFLOTRAN. MADSpython enables a number of data- and model-based analyses including model calibration, sensitivity analysis, uncertainty quantification, and decision analysis. MADSpython will be released under GPL V3 license. MADSpython will be distributed as a Git repo at gitlab.com and github.com. MADSpython manual and documentation will be posted at http://madspy.lanl.gov.
clusterProfiler: an R package for comparing biological themes among gene clusters.
Yu, Guangchuang; Wang, Li-Gen; Han, Yanyan; He, Qing-Yu
2012-05-01
Increasing quantitative data generated from transcriptomics and proteomics require integrative strategies for analysis. Here, we present an R package, clusterProfiler that automates the process of biological-term classification and the enrichment analysis of gene clusters. The analysis module and visualization module were combined into a reusable workflow. Currently, clusterProfiler supports three species, including humans, mice, and yeast. Methods provided in this package can be easily extended to other species and ontologies. The clusterProfiler package is released under Artistic-2.0 License within Bioconductor project. The source code and vignette are freely available at http://bioconductor.org/packages/release/bioc/html/clusterProfiler.html.
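The gene-cluster enrichment such a package computes rests on a hypergeometric over-representation test. Sketched here in Python rather than R (clusterProfiler itself is an R/Bioconductor package), with invented counts:

```python
from math import comb

# Hypergeometric over-representation test: is a functional category seen more
# often in a gene cluster than expected by chance? Counts are illustrative.
def enrichment_pvalue(N, K, n, k):
    """P(X >= k) for X ~ Hypergeometric(N genes, K annotated, n drawn)."""
    total = comb(N, n)
    return sum(comb(K, i) * comb(N - K, n - i)
               for i in range(k, min(K, n) + 1)) / total

# 20,000 background genes, 200 annotated to a hypothetical term,
# a cluster of 100 genes containing 10 of them (~1 expected by chance).
p = enrichment_pvalue(20000, 200, 100, 10)
```

In practice such p-values are computed per term across an ontology and then corrected for multiple testing, which is the part a package like clusterProfiler automates.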
NASA Technical Reports Server (NTRS)
2005-01-01
This document contains the final report to the NASA Glenn Research Center (GRC) for the research project entitled Development, Implementation, and Application of Micromechanical Analysis Tools for Advanced High-Temperature Composites. The research supporting this initiative has been conducted by Dr. Brett A. Bednarcyk, a Senior Scientist at OM in Brookpark, Ohio from the period of August 1998 to March 2005. Most of the work summarized herein involved development, implementation, and application of enhancements and new capabilities for NASA GRC's Micromechanics Analysis Code with Generalized Method of Cells (MAC/GMC) software package. When the project began, this software was at a low TRL (3-4) and at release version 2.0. Due to this project, the TRL of MAC/GMC has been raised to 7 and two new versions (3.0 and 4.0) have been released. The most important accomplishments with respect to MAC/GMC are: (1) A multi-scale framework has been built around the software, enabling coupled design and analysis from the global structure scale down to the micro fiber-matrix scale; (2) The software has been expanded to analyze smart materials; (3) State-of-the-art micromechanics theories have been implemented and validated within the code; (4) The damage, failure, and lifing capabilities of the code have been expanded from a very limited state to a vast degree of functionality and utility; and (5) The user flexibility of the code has been significantly enhanced. MAC/GMC is now the premier code for design and analysis of advanced composite and smart materials. It is a candidate for the 2005 NASA Software of the Year Award. The work completed over the course of the project is summarized below on a year by year basis. All publications resulting from the project are listed at the end of this report.
NASA Astrophysics Data System (ADS)
Karahan, Aydın; Buongiorno, Jacopo
2010-01-01
An engineering code to predict the irradiation behavior of U-Zr and U-Pu-Zr metallic alloy fuel pins and UO2-PuO2 mixed oxide fuel pins in sodium-cooled fast reactors was developed. The code was named Fuel Engineering and Structural analysis Tool (FEAST). FEAST has several modules working in coupled form with an explicit numerical algorithm. These modules describe fission gas release and fuel swelling, fuel chemistry and restructuring, temperature distribution, fuel-clad chemical interaction, and fuel and clad mechanical analysis including transient creep-fracture for the clad. Given the fuel pin geometry, composition and irradiation history, FEAST can analyze fuel and clad thermo-mechanical behavior at both steady-state and design-basis (non-disruptive) transient scenarios. FEAST was written in FORTRAN-90 and has a simple input file similar to that of the LWR fuel code FRAPCON. The metal-fuel version is called FEAST-METAL, and is described in this paper. The oxide-fuel version, FEAST-OXIDE is described in a companion paper. With respect to the old Argonne National Laboratory code LIFE-METAL and other same-generation codes, FEAST-METAL emphasizes more mechanistic, less empirical models, whenever available. Specifically, fission gas release and swelling are modeled with the GRSIS algorithm, which is based on detailed tracking of fission gas bubbles within the metal fuel. Migration of the fuel constituents is modeled by means of thermo-transport theory. Fuel-clad chemical interaction models based on precipitation kinetics were developed for steady-state operation and transients. Finally, a transient intergranular creep-fracture model for the clad, which tracks the nucleation and growth of the cavities at the grain boundaries, was developed for and implemented in the code. Reducing the empiricism in the constitutive models should make it more acceptable to extrapolate FEAST-METAL to new fuel compositions and higher burnup, as envisioned in advanced sodium reactors. 
FEAST-METAL was benchmarked against the open-literature EBR-II database for steady state and furnace tests (transients). The results show that the code is able to predict important phenomena such as clad strain, fission gas release, clad wastage, clad failure time, axial fuel slug deformation and fuel constituent redistribution, satisfactorily.
Items Supporting the Hanford Internal Dosimetry Program Implementation of the IMBA Computer Code
DOE Office of Scientific and Technical Information (OSTI.GOV)
Carbaugh, Eugene H.; Bihl, Donald E.
2008-01-07
The Hanford Internal Dosimetry Program has adopted the computer code IMBA (Integrated Modules for Bioassay Analysis) as its primary code for bioassay data evaluation and dose assessment using methodologies of ICRP Publications 60, 66, 67, 68, and 78. The adoption of this code was part of the implementation plan for the June 8, 2007 amendments to 10 CFR 835. This information release includes action items unique to IMBA that were required by PNNL quality assurance standards for implementation of safety software. Copies of the IMBA software verification test plan and the outline of the briefing given to new users are also included.
A Business Case Analysis of the M4/AR-15 Market
2015-09-01
Approved for public release; distribution is unlimited. This research provides a business case analysis of the M4/AR-15 market. The market analysis was conducted to fill missing gaps on the M4/AR-15 market...
Coupled Physics Environment (CouPE) library - Design, Implementation, and Release
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mahadevan, Vijay S.
Over several years, high fidelity, validated mono-physics solvers with proven scalability on peta-scale architectures have been developed independently. Based on a unified component-based architecture, these existing codes can be coupled with a unified mesh-data backplane and a flexible coupling-strategy-based driver suite to produce a viable tool for analysts. In this report, we present details on the design decisions and developments on CouPE, an acronym that stands for Coupled Physics Environment, which orchestrates a coupled physics solver through the interfaces exposed by the MOAB array-based unstructured mesh, both of which are part of the SIGMA (Scalable Interfaces for Geometry and Mesh-Based Applications) toolkit. The SIGMA toolkit contains libraries that enable scalable geometry and unstructured mesh creation and handling in a memory and computationally efficient implementation. The CouPE version being prepared for a full open-source release along with updated documentation will contain several useful examples that will enable users to start developing their applications natively using the native MOAB mesh and couple their models to existing physics applications to analyze and solve real world problems of interest. An integrated multi-physics simulation capability for the design and analysis of current and future nuclear reactor models is also being investigated as part of the NEAMS RPL, to tightly couple neutron transport, thermal-hydraulics and structural mechanics physics under the SHARP framework. This report summarizes the efforts that have been invested in CouPE to bring together several existing physics applications namely PROTEUS (neutron transport code), Nek5000 (computational fluid-dynamics code) and Diablo (structural mechanics code).
The goal of the SHARP framework is to perform fully resolved coupled physics analysis of a reactor on heterogeneous geometry, in order to reduce the overall numerical uncertainty while leveraging available computational resources. The design of CouPE along with motivations that led to implementation choices are also discussed. The first release of the library will be different from the current version of the code that integrates the components in SHARP and explanation on the need for forking the source base will also be provided. Enhancements in the functionality and improved user guides will be available as part of the release. CouPE v0.1 is scheduled for an open-source release in December 2014 along with SIGMA v1.1 components that provide support for language-agnostic mesh loading, traversal and query interfaces along with scalable solution transfer of fields between different physics codes. The coupling methodology and software interfaces of the library are presented, along with verification studies on two representative fast sodium-cooled reactor demonstration problems to prove the usability of the CouPE library.
DOE Office of Scientific and Technical Information (OSTI.GOV)
NONE
SCALE--a modular code system for Standardized Computer Analyses Licensing Evaluation--has been developed by Oak Ridge National Laboratory at the request of the US Nuclear Regulatory Commission. The SCALE system utilizes well-established computer codes and methods within standard analysis sequences that (1) allow an input format designed for the occasional user and/or novice, (2) automate the data processing and coupling between modules, and (3) provide accurate and reliable results. System development has been directed at problem-dependent cross-section processing and analysis of criticality safety, shielding, heat transfer, and depletion/decay problems. Since the initial release of SCALE in 1980, the code system has been heavily used for evaluation of nuclear fuel facility and package designs. This revision documents Version 4.3 of the system.
Sandia National Laboratories analysis code data base
NASA Astrophysics Data System (ADS)
Peterson, C. W.
1994-11-01
Sandia National Laboratories' mission is to solve important problems in the areas of national defense, energy security, environmental integrity, and industrial technology. The laboratories' strategy for accomplishing this mission is to conduct research to provide an understanding of the important physical phenomena underlying any problem, and then to construct validated computational models of the phenomena which can be used as tools to solve the problem. In the course of implementing this strategy, Sandia's technical staff has produced a wide variety of numerical problem-solving tools which they use regularly in the design, analysis, performance prediction, and optimization of Sandia components, systems, and manufacturing processes. This report provides the relevant technical and accessibility data on the numerical codes used at Sandia, including information on the technical competency or capability area that each code addresses, code 'ownership' and release status, and references describing the physical models and numerical implementation.
Analysis of Photonic Phase-Shifting Technique Employing Amplitude-Controlled Fiber-Optic Delay Lines
2012-01-13
January 13, 2012. Meredith N. Draa, Vincent J. Urick, and Keith J. Williams, Naval Research Laboratory, Code 5652, 4555 Overlook Avenue, SW, Washington, DC 20375-5320. NRL/MR/5650--12-9376. Approved for public release; distribution is unlimited.
Seals Flow Code Development 1993
NASA Technical Reports Server (NTRS)
Liang, Anita D. (Compiler); Hendricks, Robert C. (Compiler)
1994-01-01
Seals Workshop of 1993 code releases include SPIRALI for spiral grooved cylindrical and face seal configurations; IFACE for face seals with pockets, steps, tapers, turbulence, and cavitation; GFACE for gas face seals with 'lift pad' configurations; and SCISEAL, a CFD code for research and design of seals of cylindrical configuration. GUI (graphical user interface) and code usage was discussed with hands on usage of the codes, discussions, comparisons, and industry feedback. Other highlights for the Seals Workshop-93 include environmental and customer driven seal requirements; 'what's coming'; and brush seal developments including flow visualization, numerical analysis, bench testing, T-700 engine testing, tribological pairing and ceramic configurations, and cryogenic and hot gas facility brush seal results. Also discussed are seals for hypersonic engines and dynamic results for spiral groove and smooth annular seals.
GENCODE: the reference human genome annotation for The ENCODE Project.
Harrow, Jennifer; Frankish, Adam; Gonzalez, Jose M; Tapanari, Electra; Diekhans, Mark; Kokocinski, Felix; Aken, Bronwen L; Barrell, Daniel; Zadissa, Amonida; Searle, Stephen; Barnes, If; Bignell, Alexandra; Boychenko, Veronika; Hunt, Toby; Kay, Mike; Mukherjee, Gaurab; Rajan, Jeena; Despacio-Reyes, Gloria; Saunders, Gary; Steward, Charles; Harte, Rachel; Lin, Michael; Howald, Cédric; Tanzer, Andrea; Derrien, Thomas; Chrast, Jacqueline; Walters, Nathalie; Balasubramanian, Suganthi; Pei, Baikang; Tress, Michael; Rodriguez, Jose Manuel; Ezkurdia, Iakes; van Baren, Jeltje; Brent, Michael; Haussler, David; Kellis, Manolis; Valencia, Alfonso; Reymond, Alexandre; Gerstein, Mark; Guigó, Roderic; Hubbard, Tim J
2012-09-01
The GENCODE Consortium aims to identify all gene features in the human genome using a combination of computational analysis, manual annotation, and experimental validation. Since the first public release of this annotation data set, few new protein-coding loci have been added, yet the number of alternative splicing transcripts annotated has steadily increased. The GENCODE 7 release contains 20,687 protein-coding and 9640 long noncoding RNA loci and has 33,977 coding transcripts not represented in UCSC genes and RefSeq. It also has the most comprehensive annotation of long noncoding RNA (lncRNA) loci publicly available with the predominant transcript form consisting of two exons. We have examined the completeness of the transcript annotation and found that 35% of transcriptional start sites are supported by CAGE clusters and 62% of protein-coding genes have annotated polyA sites. Over one-third of GENCODE protein-coding genes are supported by peptide hits derived from mass spectrometry spectra submitted to Peptide Atlas. New models derived from the Illumina Body Map 2.0 RNA-seq data identify 3689 new loci not currently in GENCODE, of which 3127 consist of two exon models indicating that they are possibly unannotated long noncoding loci. GENCODE 7 is publicly available from gencodegenes.org and via the Ensembl and UCSC Genome Browsers.
Publicly Releasing a Large Simulation Dataset with NDS Labs
NASA Astrophysics Data System (ADS)
Goldbaum, Nathan
2016-03-01
Optimally, all publicly funded research should be accompanied by the tools, code, and data necessary to fully reproduce the analysis performed in journal articles describing the research. This ideal can be difficult to attain, particularly when dealing with large (>10 TB) simulation datasets. In this lightning talk, we describe the process of publicly releasing a large simulation dataset to accompany the submission of a journal article. The simulation was performed using Enzo, an open source, community-developed N-body/hydrodynamics code and was analyzed using a wide range of community-developed tools in the scientific Python ecosystem. Although the simulation was performed and analyzed using an ecosystem of sustainably developed tools, we enable sustainable science using our data by making it publicly available. Combining the data release with the NDS Labs infrastructure allows a substantial amount of added value, including web-based access to analysis and visualization using the yt analysis package through an IPython notebook interface. In addition, we are able to accompany the paper submission to the arXiv preprint server with links to the raw simulation data as well as interactive real-time data visualizations that readers can explore on their own or share with colleagues during journal club discussions. It is our hope that the value added by these services will substantially increase the impact and readership of the paper.
Comparative Analysis of Fusion Center Outreach to Fire and EMS Agencies
2015-12-01
Master's thesis by Scott E. Goldstein, December 2015. Thesis Advisor: Fathali Moghaddam. Approved for public release; distribution is unlimited. Fire and EMS responders have had little
Delamination modeling of laminate plate made of sublaminates
NASA Astrophysics Data System (ADS)
Kormaníková, Eva; Kotrasová, Kamila
2017-07-01
The paper presents the mixed-mode delamination of plates made of sublaminates. For this purpose, an opening-load mode of delamination is proposed as the failure model. The failure model is implemented in the ANSYS code to calculate the mixed-mode delamination response as an energy release rate. The analysis is based on interface techniques. Within the interface finite element model, the individual damage parameters are calculated: spring reaction forces, relative displacements, and energy release rates along the delamination front.
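The abstract does not give the interface formulas, but the quantities it names (spring reaction forces, relative displacements, energy release rates along the front) match the standard virtual crack closure technique (VCCT) for interface elements. A minimal sketch under that assumption, with all inputs hypothetical:

```python
def vcct_energy_release_rates(f_normal, f_shear, du_normal, du_shear, da, width):
    """One-point VCCT sketch for an interface node pair at the delamination
    front: G_I from the opening (spring) force and relative opening
    displacement, G_II from the sliding pair. Consistent units are assumed;
    these are illustrative inputs, not values from the paper."""
    area = da * width  # crack surface created when the front advances by da
    g_i = f_normal * du_normal / (2.0 * area)   # mode I energy release rate
    g_ii = f_shear * du_shear / (2.0 * area)    # mode II energy release rate
    return g_i, g_ii, g_i + g_ii                # total mixed-mode G

# Hypothetical node-pair data: forces in N, displacements in m, lengths in m
g_i, g_ii, g_total = vcct_energy_release_rates(10.0, 5.0, 0.002, 0.001, 0.5, 1.0)
```

The mode mix G_II/(G_I + G_II) computed this way is what a mixed-mode failure criterion would consume.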
DOE Office of Scientific and Technical Information (OSTI.GOV)
Talley, Darren G.
2017-04-01
This report describes the work and results of the verification and validation (V&V) of the version 1.0 release of the Razorback code. Razorback is a computer code designed to simulate the operation of a research reactor (such as the Annular Core Research Reactor (ACRR)) by a coupled numerical solution of the point reactor kinetics equations, the energy conservation equation for fuel element heat transfer, the equation of motion for fuel element thermal expansion, and the mass, momentum, and energy conservation equations for the water cooling of the fuel elements. This V&V effort was intended to confirm that the code shows good agreement between simulation and actual ACRR operations.
Striatal dopamine release codes uncertainty in pathological gambling.
Linnet, Jakob; Mouridsen, Kim; Peterson, Ericka; Møller, Arne; Doudet, Doris Jeanne; Gjedde, Albert
2012-10-30
Two mechanisms of midbrain and striatal dopaminergic projections may be involved in pathological gambling: hypersensitivity to reward and sustained activation toward uncertainty. The midbrain-striatal dopamine system distinctly codes reward and uncertainty, where dopaminergic activation is a linear function of expected reward and an inverse U-shaped function of uncertainty. In this study, we investigated the dopaminergic coding of reward and uncertainty in 18 pathological gambling sufferers and 16 healthy controls. We used positron emission tomography (PET) with the tracer [(11)C]raclopride to measure dopamine release, and we used performance on the Iowa Gambling Task (IGT) to determine overall reward and uncertainty. We hypothesized that we would find a linear function between dopamine release and IGT performance if dopamine release coded reward in pathological gambling. If, on the other hand, dopamine release coded uncertainty, we would find an inversely U-shaped function. The data supported an inverse U-shaped relation between striatal dopamine release and IGT performance in the pathological gambling group, but not in the healthy control group. These results are consistent with the hypothesis of dopaminergic sensitivity toward uncertainty, and suggest that dopaminergic sensitivity to uncertainty is pronounced in pathological gambling, but not among non-gambling healthy controls. The findings have implications for understanding dopamine dysfunctions in pathological gambling and addictive behaviors. Copyright © 2012 Elsevier Ireland Ltd. All rights reserved.
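The linear-versus-inverted-U distinction described above can be made concrete with a least-squares model comparison. The sketch below uses synthetic stand-in data, not the study's PET or IGT measurements:

```python
import numpy as np

def compare_fits(x, y):
    """Fit linear and quadratic (inverted-U) models by least squares and
    return (r2_linear, r2_quadratic, quadratic_leading_coefficient)."""
    def r_squared(y, y_hat):
        ss_res = np.sum((y - y_hat) ** 2)
        ss_tot = np.sum((y - y.mean()) ** 2)
        return 1.0 - ss_res / ss_tot

    lin = np.polyfit(x, y, 1)
    quad = np.polyfit(x, y, 2)
    return (r_squared(y, np.polyval(lin, x)),
            r_squared(y, np.polyval(quad, x)),
            quad[0])

# Synthetic stand-in data: an idealized inverted-U relation between
# task performance (x) and dopamine release (y), with mild noise.
rng = np.random.default_rng(0)
x = np.linspace(-1.0, 1.0, 25)
y = 1.0 - x ** 2 + rng.normal(0.0, 0.02, x.size)
r2_lin, r2_quad, a = compare_fits(x, y)
# The inverted-U signature: the quadratic fit dominates and opens downward (a < 0).
```

A linear (reward-coding) relation would instead show comparable R² for both fits, with no downward-opening quadratic term.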
Hu, Junjie; Liu, Fei; Ju, Huangxian
2015-04-21
A peptide-encoded microplate was proposed for MALDI-TOF mass spectrometric (MS) analysis of protease activity. The peptide codes were designed to contain a coding region and a protease substrate region for enzymatic cleavage, and an internal standard method was proposed for MS quantitation of the cleavage products of these peptide codes. Upon cleavage in the presence of the target proteases, the coding regions were released from the microplate and directly quantitated using corresponding peptides with a one-amino-acid difference as internal standards. The coding region could be used as a unique "Protease ID" for identification of the corresponding protease, and the amount of the cleavage product was used for protease activity analysis. Using trypsin and chymotrypsin as model proteases to verify the multiplex protease assay, the designed "Trypsin ID" and "Chymotrypsin ID" occurred at m/z 761.6 and 711.6, respectively. The logarithm of the intensity ratio of "Protease ID" to internal standard was proportional to trypsin and chymotrypsin concentration over ranges of 5.0 to 500 and 10 to 500 nM, respectively. The detection limits for trypsin and chymotrypsin were 2.3 and 5.2 nM, respectively. The peptide-encoded microplate showed good selectivity. This proposed method provides a powerful tool for convenient identification and activity analysis of multiplex proteases.
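The reported log-linear relation (log of the Protease-ID/internal-standard intensity ratio proportional to concentration) implies a simple calibration-and-inversion workflow. The sketch below fits a hypothetical calibration line and inverts it; the slope and intercept are fitted to made-up points, not the paper's values:

```python
import math

def calibrate(concentrations, log_ratios):
    """Least-squares line log10(I_ID / I_IS) = slope * C + intercept.
    A generic internal-standard calibration sketch."""
    n = len(concentrations)
    mx = sum(concentrations) / n
    my = sum(log_ratios) / n
    sxx = sum((c - mx) ** 2 for c in concentrations)
    sxy = sum((c - mx) * (r - my) for c, r in zip(concentrations, log_ratios))
    slope = sxy / sxx
    return slope, my - slope * mx

def concentration(i_id, i_is, slope, intercept):
    """Invert the calibration to estimate protease concentration (nM)
    from the measured peak intensities of the ID and internal standard."""
    return (math.log10(i_id / i_is) - intercept) / slope

# Hypothetical calibration points spanning the reported 5-500 nM range
cs = [5.0, 50.0, 100.0, 500.0]
rs = [0.004 * c - 0.1 for c in cs]  # placeholder slope/intercept
slope, intercept = calibrate(cs, rs)
```

An unknown sample's intensities would then be passed to `concentration()` with the fitted parameters.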
CBP TOOLBOX VERSION 2.0: CODE INTEGRATION ENHANCEMENTS
DOE Office of Scientific and Technical Information (OSTI.GOV)
Smith, F.; Flach, G.; BROWN, K.
2013-06-01
This report describes enhancements made to code integration aspects of the Cementitious Barriers Project (CBP) Toolbox as a result of development work performed at the Savannah River National Laboratory (SRNL) in collaboration with Vanderbilt University (VU) in the first half of fiscal year 2013. Code integration refers to the interfacing of standalone CBP partner codes, used to analyze the performance of cementitious materials, with the CBP Software Toolbox. The most significant enhancements are: 1) improved graphical display of model results; 2) improved error analysis and reporting; 3) an increase in the default maximum model mesh size from 301 to 501 nodes; and 4) the ability to set the LeachXS/Orchestra simulation times through the GoldSim interface. These code interface enhancements have been included in a new release (Version 2.0) of the CBP Toolbox.
Bring out your codes! Bring out your codes! (Increasing Software Visibility and Re-use)
NASA Astrophysics Data System (ADS)
Allen, A.; Berriman, B.; Brunner, R.; Burger, D.; DuPrie, K.; Hanisch, R. J.; Mann, R.; Mink, J.; Sandin, C.; Shortridge, K.; Teuben, P.
2013-10-01
Progress is being made in code discoverability and preservation, but as discussed at ADASS XXI, many codes still remain hidden from public view. With the Astrophysics Source Code Library (ASCL) now indexed by the SAO/NASA Astrophysics Data System (ADS), the introduction of a new journal, Astronomy & Computing, focused on astrophysics software, and the increasing success of education efforts such as Software Carpentry and SciCoder, the community has the opportunity to set a higher standard for its science by encouraging the release of software for examination and possible reuse. We assembled representatives of the community to present issues inhibiting code release and sought suggestions for tackling these factors. The session began with brief statements by panelists; the floor was then opened for discussion and ideas. Comments covered a diverse range of related topics and points of view, with apparent support for the propositions that algorithms should be readily available, code used to produce published scientific results should be made available, and there should be discovery mechanisms to allow these to be found easily. With increased use of resources such as GitHub (for code availability), ASCL (for code discovery), and a stated strong preference from the new journal Astronomy & Computing for code release, we expect to see additional progress over the next few years.
Topological Analysis of Wireless Networks (TAWN)
2016-05-31
The goal of this project was to develop topological methods to detect and localize vulnerabilities of wireless networks. Definition 1: a wireless network vulnerability is its susceptibility to becoming disconnected when a single source of … transmissions from any other node. Approved for public release; distribution unlimited.
High Productivity Computing Systems Analysis and Performance
2005-07-01
One of the HPCchallenge codes, RandomAccess, which measures Global Updates per second (GUP/s) on a cubic grid, is derived from the HPCS discrete math benchmarks that we released; further details can be found at the web site. Contact: Bob Lucas (ISI).
DOE Office of Scientific and Technical Information (OSTI.GOV)
Little, M.P.; Muirhead, C.R.; Goossens, L.H.J.
1997-12-01
The development of two new probabilistic accident consequence codes, MACCS and COSYMA, was completed in 1990. These codes estimate the consequences of accidental releases of radiological material from hypothesized accidents at nuclear installations. In 1991, the US Nuclear Regulatory Commission and the Commission of the European Communities began cosponsoring a joint uncertainty analysis of the two codes. The ultimate objective of this joint effort was to systematically develop credible and traceable uncertainty distributions for the respective code input variables. A formal expert judgment elicitation and evaluation process was identified as the best technology available for developing a library of uncertainty distributions for these consequence parameters. This report focuses on the results of the study to develop distributions for variables related to the MACCS and COSYMA late health effects models. This volume contains appendices that include (1) a summary of the MACCS and COSYMA consequence codes, (2) the elicitation questionnaires and case structures, (3) the rationales and results for the expert panel on late health effects, (4) short biographies of the experts, and (5) the aggregated results of their responses.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Goossens, L.H.J.; Kraan, B.C.P.; Cooke, R.M.
1998-04-01
The development of two new probabilistic accident consequence codes, MACCS and COSYMA, was completed in 1990. These codes estimate the consequences of accidental releases of radiological material from hypothesized accidents at nuclear installations. In 1991, the US Nuclear Regulatory Commission and the Commission of the European Communities began cosponsoring a joint uncertainty analysis of the two codes. The ultimate objective of this joint effort was to systematically develop credible and traceable uncertainty distributions for the respective code input variables. A formal expert judgment elicitation and evaluation process was identified as the best technology available for developing a library of uncertainty distributions for these consequence parameters. This report focuses on the results of the study to develop distributions for variables related to the MACCS and COSYMA internal dosimetry models. This volume contains appendices that include (1) a summary of the MACCS and COSYMA consequence codes, (2) the elicitation questionnaires and case structures, (3) the rationales and results for the panel on internal dosimetry, (4) short biographies of the experts, and (5) the aggregated results of their responses.
NASA Astrophysics Data System (ADS)
Ferland, G. J.; Chatzikos, M.; Guzmán, F.; Lykins, M. L.; van Hoof, P. A. M.; Williams, R. J. R.; Abel, N. P.; Badnell, N. R.; Keenan, F. P.; Porter, R. L.; Stancil, P. C.
2017-10-01
We describe the 2017 release of the spectral synthesis code Cloudy, summarizing the many improvements to the scope and accuracy of the physics which have been made since the previous release. Exporting the atomic data into external data files has enabled many new large datasets to be incorporated into the code. The use of the complete datasets is not realistic for most calculations, so we describe the limited subset of data used by default, which predicts significantly more lines than the previous release of Cloudy. This version is nevertheless faster than the previous release, as a result of code optimizations. We give examples of the accuracy limits using small models, and the performance requirements of large complete models. We summarize several advances in the H- and He-like iso-electronic sequences and use our complete collisional-radiative models to establish the densities where the coronal and local thermodynamic equilibrium approximations work.
CIRMIS Data system. Volume 2. Program listings
DOE Office of Scientific and Technical Information (OSTI.GOV)
Friedrichs, D.R.
1980-01-01
The Assessment of Effectiveness of Geologic Isolation Systems (AEGIS) Program is developing and applying the methodology for assessing the far-field, long-term post-closure safety of deep geologic nuclear waste repositories. AEGIS is being performed by Pacific Northwest Laboratory (PNL) under contract with the Office of Nuclear Waste Isolation (ONWI) for the Department of Energy (DOE). One task within AEGIS is the development of methodology for analysis of the consequences (water pathway) from loss of repository containment as defined by various release scenarios. Analysis of the long-term, far-field consequences of release scenarios requires the application of numerical codes which simulate the hydrologic systems, model the transport of released radionuclides through the hydrologic systems to the biosphere, and, where applicable, assess the radiological dose to humans. The various input parameters required in the analysis are compiled in data systems. The data are organized and prepared by various input subroutines for utilization by the hydraulic and transport codes. The hydrologic models simulate the groundwater flow systems and provide water flow directions, rates, and velocities as inputs to the transport models. Outputs from the transport models are basically graphs of radionuclide concentration in the groundwater plotted against time. After dilution in the receiving surface-water body (e.g., lake, river, bay), these data are the input source terms for the dose models, if dose assessments are required. The dose models calculate radiation dose to individuals and populations. The CIRMIS (Comprehensive Information Retrieval and Model Input Sequence) Data System is a storage and retrieval system for model input and output data, including graphical interpretation and display. This is the second of four volumes of the description of the CIRMIS Data System.
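The analysis chain described above (hydrologic flow feeds transport, whose diluted output drives the dose calculation) can be sketched as a simple pipeline. Every callable here is a hypothetical stand-in, not a CIRMIS or AEGIS interface:

```python
def consequence_pipeline(flow_model, transport_model, dose_model, dilution_factor):
    """Sketch of the AEGIS water-pathway chain: the hydrologic model yields
    flow velocities, the transport model turns them into radionuclide
    concentration vs. time, dilution in the receiving surface-water body
    gives the dose-model source terms."""
    velocities = flow_model()                           # directions, rates, velocities
    concentration_vs_time = transport_model(velocities)  # concentration in groundwater
    source_terms = [c / dilution_factor for c in concentration_vs_time]
    return dose_model(source_terms)                     # dose to individuals/populations
```

With toy models, e.g. `consequence_pipeline(lambda: 2.0, lambda v: [v, 2 * v], sum, 2.0)`, the call simply threads data through the four stages in order.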
SHARP pre-release v1.0 - Current Status and Documentation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mahadevan, Vijay S.; Rahaman, Ronald O.
The NEAMS Reactor Product Line effort aims to develop an integrated multiphysics simulation capability for the design and analysis of future generations of nuclear power plants. The Reactor Product Line code suite’s multi-resolution hierarchy is being designed to ultimately span the full range of length and time scales present in relevant reactor design and safety analyses, as well as scale from desktop to petaflop computing platforms. In this report, building on a previous report issued in September 2014, we describe our continued efforts to integrate thermal/hydraulics, neutronics, and structural mechanics modeling codes to perform coupled analysis of a representative fast sodium-cooled reactor core in preparation for a unified release of the toolkit. The work reported in the current document covers the software engineering aspects of managing the entire stack of components in the SHARP toolkit and the continuous integration efforts ongoing to prepare a release candidate for interested reactor analysis users. Here we report on the continued integration of PROTEUS/Nek5000 and Diablo into the NEAMS framework and the software processes that enable users to utilize these capabilities without losing scientific productivity. Due to the complexity of the individual modules and their necessary/optional dependency chains, we focus on the configuration and build aspects of the SHARP toolkit, which includes the capability to auto-download dependencies and configure/install with optimal flags in an architecture-aware fashion. Such complexity is untenable without strong software engineering processes such as source management, source control, change reviews, unit tests, integration tests, and continuous test suites. Details on these processes are provided in the report as a building step for a SHARP user guide that will accompany the first release, expected by March 2016.
NASA Technical Reports Server (NTRS)
Towne, Charles E.
1999-01-01
The WIND code is a general-purpose, structured, multizone, compressible flow solver that can be used to analyze steady or unsteady flow for a wide range of geometric configurations and over a wide range of flow conditions. WIND is the latest product of the NPARC Alliance, a formal partnership between the NASA Lewis Research Center and the Air Force Arnold Engineering Development Center (AEDC). WIND Version 1.0 was released in February 1998, and Version 2.0 will be released in February 1999. The WIND code represents a merger of the capabilities of three existing computational fluid dynamics codes--NPARC (the original NPARC Alliance flow solver), NXAIR (an Air Force code used primarily for unsteady store separation problems), and NASTD (the primary flow solver at McDonnell Douglas, now part of Boeing).
Direct simulations of chemically reacting turbulent mixing layers, part 2
NASA Technical Reports Server (NTRS)
Metcalfe, Ralph W.; Mcmurtry, Patrick A.; Jou, Wen-Huei; Riley, James J.; Givi, Peyman
1988-01-01
The results of direct numerical simulations of chemically reacting turbulent mixing layers are presented. This work extends earlier three-dimensional simulations of cold reacting flows to a more detailed study, together with the development, validation, and use of codes to simulate chemically reacting shear layers with heat release. Additional analysis of the earlier simulations showed good agreement with self-similarity theory and laboratory data. Simulations with a two-dimensional code including the effects of heat release showed that the rate of chemical product formation, the thickness of the mixing layer, and the amount of mass entrained into the layer all decrease with increasing rates of heat release. Subsequent three-dimensional simulations showed similar behavior, in agreement with laboratory observations. Baroclinic torques and thermal expansion in the mixing layer were found to produce changes in the flame vortex structure that act to diffuse the pairing vortices, resulting in a net reduction in vorticity. Previously unexplained anomalies observed in the mean velocity profiles of reacting jets and mixing layers were shown to result from vorticity generation by baroclinic torques.
CFD Based Computations of Flexible Helicopter Blades for Stability Analysis
NASA Technical Reports Server (NTRS)
Guruswamy, Guru P.
2011-01-01
As a collaborative effort among government aerospace research laboratories, an advanced version of a widely used computational fluid dynamics code, OVERFLOW, was recently released. This latest version includes additions to model flexible rotating multiple blades. In this paper, the OVERFLOW code is applied to improve the accuracy of airload computations over the linear lifting-line theory that uses displacements from a beam model. Data transfers required at every revolution are managed through a Unix-based script that runs jobs on large super-cluster computers. Results are demonstrated for the 4-bladed UH-60A helicopter, and deviations of the computed data from flight data are evaluated. Fourier analysis post-processing suitable for aeroelastic stability computations is performed.
New Tool Released for Engine-Airframe Blade-Out Structural Simulations
NASA Technical Reports Server (NTRS)
Lawrence, Charles
2004-01-01
Researchers at the NASA Glenn Research Center have enhanced a general-purpose finite element code, NASTRAN, for engine-airframe structural simulations during steady-state and transient operating conditions. For steady-state simulations, the code can predict critical operating speeds, natural modes of vibration, and forced response (e.g., cabin noise and component fatigue). The code can be used to perform static analysis to predict engine-airframe response and component stresses due to maneuver loads. For transient response, the simulation code can be used to predict response due to bladeoff events and subsequent engine shutdown and windmilling conditions. In addition, the code can be used as a pretest analysis tool to predict the results of the bladeout test required for FAA certification of new and derivative aircraft engines. Before the present analysis code was developed, all the major aircraft engine and airframe manufacturers in the United States and overseas were performing similar types of analyses to ensure the structural integrity of engine-airframe systems. Although there were many similarities among the analysis procedures, each manufacturer was developing and maintaining its own structural analysis capabilities independently. This situation led to high software development and maintenance costs, complications with manufacturers exchanging models and results, and limitations in predicting the structural response to the desired degree of accuracy. An industry-NASA team was formed to overcome these problems by developing a common analysis tool that would satisfy all the structural analysis needs of the industry and that would be available and supported by a commercial software vendor so that the team members would be relieved of maintenance and development responsibilities. Input from all the team members was used to ensure that everyone's requirements were satisfied and that the best technology was incorporated into the code. 
Furthermore, because the code would be distributed by a commercial software vendor, it would be more readily available to engine and airframe manufacturers, as well as to nonaircraft companies that did not previously have access to this capability.
15 CFR 740.7 - Computers (APP).
Code of Federal Regulations, 2011 CFR
2011-01-01
... 4A003. (2) Technology and software. License Exception APP authorizes exports of technology and software... License Exception. (2) Access and release restrictions. (i)[Reserved] (ii) Technology and source code. Technology and source code eligible for License Exception APP may not be released to nationals of Cuba, Iran...
DOE Office of Scientific and Technical Information (OSTI.GOV)
R. L. Williamson
A powerful multidimensional fuels performance analysis capability, applicable to both steady and transient fuel behavior, is developed based on enhancements to the commercially available ABAQUS general-purpose thermomechanics code. Enhanced capabilities are described, including: UO2 temperature and burnup dependent thermal properties, solid and gaseous fission product swelling, fuel densification, fission gas release, cladding thermal and irradiation creep, cladding irradiation growth, gap heat transfer, and gap/plenum gas behavior during irradiation. This new capability is demonstrated using a 2D axisymmetric analysis of the upper section of a simplified multipellet fuel rod, during both steady and transient operation. Comparisons are made between discrete and smeared-pellet simulations. Computational results demonstrate the importance of a multidimensional, multipellet, fully-coupled thermomechanical approach. Interestingly, many of the inherent deficiencies in existing fuel performance codes (e.g., 1D thermomechanics, loose thermomechanical coupling, separate steady and transient analysis, cumbersome pre- and post-processing) are, in fact, ABAQUS strengths.
Enhancing the ABAQUS Thermomechanics Code to Simulate Steady and Transient Fuel Rod Behavior
DOE Office of Scientific and Technical Information (OSTI.GOV)
R. L. Williamson; D. A. Knoll
2009-09-01
A powerful multidimensional fuels performance capability, applicable to both steady and transient fuel behavior, is developed based on enhancements to the commercially available ABAQUS general-purpose thermomechanics code. Enhanced capabilities are described, including: UO2 temperature and burnup dependent thermal properties, solid and gaseous fission product swelling, fuel densification, fission gas release, cladding thermal and irradiation creep, cladding irradiation growth, gap heat transfer, and gap/plenum gas behavior during irradiation. The various modeling capabilities are demonstrated using a 2D axisymmetric analysis of the upper section of a simplified multi-pellet fuel rod, during both steady and transient operation. Computational results demonstrate the importance of a multidimensional fully-coupled thermomechanics treatment. Interestingly, many of the inherent deficiencies in existing fuel performance codes (e.g., 1D thermomechanics, loose thermo-mechanical coupling, separate steady and transient analysis, cumbersome pre- and post-processing) are, in fact, ABAQUS strengths.
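As an illustration of the temperature- and burnup-dependent material properties these enhancements describe, the sketch below uses the generic 1/(A + B·T) phonon-conduction form for UO2 thermal conductivity with a placeholder burnup degradation. The coefficients and the degradation factor are illustrative only, not the correlations implemented in the ABAQUS enhancements:

```python
def uo2_conductivity(temperature_k, burnup_mwd_per_kgu):
    """Illustrative UO2 thermal conductivity (W/m-K): phonon-dominated
    1/(A + B*T) fresh-fuel form, degraded linearly with burnup.
    A, B, and the 0.005/MWd-kgU factor are placeholders for illustration."""
    A = 0.0375        # m-K/W, placeholder
    B = 2.165e-4      # m/W-K, placeholder
    k_fresh = 1.0 / (A + B * temperature_k)
    return k_fresh / (1.0 + 0.005 * burnup_mwd_per_kgu)
```

The qualitative behavior a fuel performance code relies on survives even in this toy form: conductivity falls with both temperature and burnup, which feeds back into the fuel temperature solution.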
Recent MELCOR and VICTORIA Fission Product Research at the NRC
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bixler, N.E.; Cole, R.K.; Gauntt, R.O.
1999-01-21
The MELCOR and VICTORIA severe accident analysis codes, which were developed at Sandia National Laboratories for the U.S. Nuclear Regulatory Commission, are designed to estimate fission product releases during nuclear reactor accidents in light water reactors. MELCOR is an integrated plant-assessment code that models the key phenomena in adequate detail for risk-assessment purposes. VICTORIA is a more specialized fission-product code that provides detailed modeling of chemical reactions and aerosol processes under the high-temperature conditions encountered in the reactor coolant system during a severe reactor accident. This paper focuses on recent enhancements and assessments of the two codes in the area of fission product chemistry modeling. Recently, a model for iodine chemistry in aqueous pools in the containment building was incorporated into the MELCOR code. The model calculates dissolution of iodine into the pool and releases of organic and inorganic iodine vapors from the pool into the containment atmosphere. The main purpose of this model is to evaluate the effect of long-term revolatilization of dissolved iodine. Inputs to the model include the dose rate in the pool, the amount of chloride-containing polymer, such as Hypalon, and the amount of buffering agents in the containment. Model predictions are compared against the Radioiodine Test Facility (RTF) experiments conducted by Atomic Energy of Canada Limited (AECL), specifically International Standard Problem 41. Improvements to VICTORIA's chemical reactions models were implemented as a result of recommendations from a peer review of VICTORIA that was completed last year. Specifically, an option is now included to model aerosols and deposited fission products as three condensed phases in addition to the original option of a single condensed phase. The three-condensed-phase model results in somewhat higher predicted fission product volatilities than does the single-condensed-phase model.
Modeling of UO2 thermochemistry was also improved, resulting in better prediction of the vaporization of uranium from fuel, which can react with released fission products to affect their volatility. This model also improves the prediction of fission product release rates from fuel. Finally, recent comparisons of MELCOR and VICTORIA with International Standard Problem 40 (STORM) data are presented. These comparisons focus on predicted thermophoretic deposition, which is the dominant deposition mechanism. Sensitivity studies were performed with the codes to examine experimental and modeling uncertainties.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wessel, Silvia; Harvey, David
2013-06-28
The durability of PEM fuel cells is a primary requirement for large-scale commercialization of these power systems in transportation and stationary market applications, which target operational lifetimes of 5,000 hours and 40,000 hours by 2015, respectively. Key degradation modes contributing to fuel cell lifetime limitations have been largely associated with the platinum-based cathode catalyst layer. Furthermore, as fuel cells are driven to low-cost materials and lower catalyst loadings in order to meet the cost targets for commercialization, catalyst durability has become even more important. While significant progress has been made over the past few years in identifying the underlying causes of fuel cell degradation and the key parameters that greatly influence degradation rates, many gaps remain in the knowledge of the driving mechanisms; in particular, the acceleration of the mechanisms due to different structural compositions and under different fuel cell conditions remains poorly understood. The focus of this project was to address catalyst durability using a dual-path approach that coupled an extensive range of experimental analysis and testing with a multi-scale modeling approach. The major technical areas/issues of catalyst and catalyst layer performance and durability that were addressed are: 1. Catalyst and catalyst layer degradation mechanisms (Pt dissolution, agglomeration, Pt loss, e.g. Pt in the membrane, carbon oxidation and/or corrosion): a. driving forces for the different degradation mechanisms; b. relationships between MEA performance, catalyst and catalyst layer degradation and operational conditions, catalyst layer composition, and structure. 2. Materials properties: a. changes in catalyst, catalyst layer, and MEA materials properties due to degradation. 3. Catalyst performance: a. relationships between catalyst structural changes and performance; b.
Stability of the three-phase boundary and its effect on performance/catalyst degradation. The key accomplishments of this project are:
• The development of a molecular-dynamics based description of the carbon-supported Pt and ionomer system
• The development of a composition-based, 1D statistical Unit Cell Performance model
• A modified and improved multi-pathway ORR model
• An extension of the existing micro-structural catalyst model to transient operation
• The coupling of a Pt dissolution model to the modified ORR pathway model
• The development of a semi-empirical carbon corrosion model
• The integration and release of an open-source forward-predictive MEA performance and degradation model
• Completion of correlations of BOT (beginning of test) and EOT (end of test) performance loss breakdown with cathode catalyst layer composition, morphology, material properties, and operational conditions
• Catalyst layer durability windows and design curves
• A design flow path of interactions from materials properties and catalyst layer effective properties to performance loss breakdown for virgin and degraded catalyst layers
In order to ensure the best possible user experience, we will perform a staged release of the software leading up to the webinar scheduled in October 2013.
The release schedule will be as follows (note that the manual will be released with the beta release, as direct support is provided in Stage 1):
• Stage 0 - Internal Ballard Release: cross-check of compilation and installation to ensure machine independence; implement code on a portable virtual machine to allow for non-UNIX use (pending)
• Stage 1 - Alpha Release: the model code will be made available via a Git, SourceForge, or other repository (under discussion at Ballard) for download and installation by a small pre-selected group of users; users will be given three weeks to install, apply, and evaluate features of the code, providing feedback on issues or software bugs that require correction prior to beta release
• Stage 2 - Beta Release: the model code repository is opened to the general public on a beta-release basis, with a mechanism for bug tracking and feedback from a large user group; the code will be tracked and patched for any discovered bugs or relevant feedback from the user community, and upon completion of three months without a major bug submission the code will be moved to a full version release
• Stage 3 - Full Version Release: the code is versioned to revision 1.0 and that version is frozen in development/patching
Status of VICTORIA: NRC peer review and recent code applications
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bixler, N.E.; Schaperow, J.H.
1997-12-01
VICTORIA is a mechanistic computer code designed to analyze fission product behavior within a nuclear reactor coolant system (RCS) during a severe accident. It provides detailed predictions of the release of radioactive and nonradioactive materials from the reactor core and the transport and deposition of these materials within the RCS. A summary of the results and recommendations of an independent peer review of VICTORIA by the US Nuclear Regulatory Commission (NRC) is presented, along with recent applications of the code. The latter include analyses of a temperature-induced steam generator tube rupture sequence and post-test analyses of the Phebus FPT-1 test. The next planned Phebus test, FPT-4, will focus on fission product releases from a rubble bed, especially those of the less-volatile elements, and on the speciation of the released elements. Pretest analyses using VICTORIA to estimate the magnitude and timing of releases are presented. The predicted release of uranium is a matter of particular importance because of concern about filter plugging during the test.
An Evaluation of Four Methods of Numerical Analysis for Two-Dimensional Airfoil Flows. Revision.
1985-07-06
[Scanned-report excerpt; recoverable content:] The pressure distribution determined by the Eppler and Chang potential codes for the four airfoil geometries is shown in Figures 3-6, where the pressure coefficient is defined as Cp = (P - P0)/(0.5 rho U0^2). Cited references include: Eppler, R., and D. M. Somers, "A Computer Program for the Design and Analysis of Low Speed Airfoils," NASA Technical Memorandum 80210. Title page: "An Evaluation of Four Methods of Numerical Analysis for Two-Dimensional Airfoil Flows," Roger Burke, David Taylor Naval Ship R&D Center. Approved for public release; distribution unlimited.
An Analysis of the Navy's Fiscal Year 2016 Shipbuilding Plan
2015-12-01
[Report-documentation-page excerpt; recoverable content:] Technical Report - Congressional Testimony: "An Analysis of the Navy's Fiscal Year 2016 Shipbuilding Plan," Eric J. Labs, Senior… Congress of the United States. Approved for public release, 12/4/2015. The testimony addresses the Navy's 2016 shipbuilding plan and the 2014 update to the service's 2012 force structure…
An Open Source Agenda for Research Linking Text and Image Content Features.
ERIC Educational Resources Information Center
Goodrum, Abby A.; Rorvig, Mark E.; Jeong, Ki-Tai; Suresh, Chitturi
2001-01-01
Proposes methods to utilize image primitives to support term assignment for image classification. Proposes to release code for image analysis in a common tool set for other researchers to use. Of particular focus is the expansion of work by researchers in image indexing to include image content-based feature extraction capabilities in their work.…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Goossens, L.H.J.; Kraan, B.C.P.; Cooke, R.M.
1997-12-01
The development of two new probabilistic accident consequence codes, MACCS and COSYMA, was completed in 1990. These codes estimate the consequences of accidental releases of radiological material from hypothesized accidents at nuclear installations. In 1991, the US Nuclear Regulatory Commission and the Commission of the European Communities began cosponsoring a joint uncertainty analysis of the two codes. The ultimate objective of this joint effort was to systematically develop credible and traceable uncertainty distributions for the respective code input variables. A formal expert judgment elicitation and evaluation process was identified as the best technology available for developing a library of uncertainty distributions for these consequence parameters. This report focuses on the results of the study to develop distributions for variables related to the MACCS and COSYMA deposited material and external dose models. This volume contains appendices that include (1) a summary of the MACCS and COSYMA consequence codes, (2) the elicitation questionnaires and case structures, (3) the rationales and results for the panel on deposited material and external doses, (4) short biographies of the experts, and (5) the aggregated results of their responses.
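One common way to combine elicited expert distributions into a single library entry is equal-weight pooling, i.e., treating the aggregate as a mixture of the experts' distributions. The sketch below illustrates the idea on invented data (the lognormal "deposition-velocity" samples and the weights are purely hypothetical, not the study's actual elicitation results):

```python
import numpy as np

def aggregate_experts(samples_per_expert, weights=None, n=10_000, seed=0):
    """Pool expert distributions (given as Monte Carlo samples from each
    expert's elicited distribution) into a single mixture distribution.
    Equal weights by default; performance-based weights could be passed."""
    k = len(samples_per_expert)
    w = np.full(k, 1.0 / k) if weights is None else np.asarray(weights, float)
    rng = np.random.default_rng(seed)
    # Draw from the mixture: pick an expert by weight, then one of its samples.
    choices = rng.choice(k, size=n, p=w / w.sum())
    pooled = np.array([rng.choice(samples_per_expert[i]) for i in choices])
    return pooled

# Three hypothetical experts' samples for a deposition-velocity parameter (m/s),
# differing mainly in how wide their uncertainty bands are:
experts = [np.random.default_rng(i).lognormal(-6, s, 500)
           for i, s in enumerate([0.5, 1.0, 1.5])]
pooled = aggregate_experts(experts)
print(np.percentile(pooled, [5, 50, 95]))
```

The pooled 5th/50th/95th percentiles are then what would be tabulated as the aggregated uncertainty distribution for that input variable.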
Annual Coded Wire Tag Program; Missing Production Groups, 1996 Annual Report.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pastor, Stephen M.
1997-01-01
In 1989 the Bonneville Power Administration (BPA) began funding the evaluation of production groups of juvenile anadromous fish not being coded-wire tagged for other programs. These groups were the ''Missing Production Groups''. Production fish released by the U.S. Fish and Wildlife Service (USFWS) without representative coded-wire tags during the 1980s are indicated as blank spaces on the survival graphs in this report. The objectives of the ''Missing Production Groups'' program are: (1) to estimate the total survival of each production group, (2) to estimate the contribution of each production group to various fisheries, and (3) to prepare an annual report for all USFWS hatcheries in the Columbia River basin. Coded-wire tag recovery information will be used to evaluate the relative success of individual brood stocks. This information can also be used by salmon harvest managers to develop plans to allow the harvest of excess hatchery fish while protecting threatened, endangered, or other stocks of concern. In order to meet these objectives, a minimum of one marked group of fish is necessary for each production release. The level of marking varies according to location, species, and age at release. In general, 50,000 fish are marked with a coded-wire tag (CWT) to represent each production release group at hatcheries below John Day Dam. More than 100,000 fish per group are usually marked at hatcheries above John Day Dam. All fish release information, including marked/unmarked ratios, is reported to the Pacific States Marine Fisheries Commission (PSMFC). Fish recovered in the various fisheries or at the hatcheries are sampled to recover coded-wire tags. This recovery information is also reported to PSMFC.
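The arithmetic behind estimating a release group's fishery contribution from tag recoveries is a two-stage expansion: scale observed tags up by the fraction of the catch that was sampled, then by the fraction of the release that was marked. A minimal sketch, with invented numbers (the 40 tags, 20% sampling rate, and 50,000-of-1,000,000 mark rate are illustrative, not report data):

```python
def expand_recoveries(observed_tags, sampling_fraction, marked_fraction):
    """Estimate the total contribution of a release group from coded-wire-tag
    recoveries: observed tags / (fraction of catch sampled) gives estimated
    tags in the whole catch; dividing by the marked fraction of the release
    expands that to all fish (marked and unmarked) from the group."""
    estimated_tags_in_catch = observed_tags / sampling_fraction
    return estimated_tags_in_catch / marked_fraction

# Hypothetical: 40 tags recovered, 20% of the catch sampled,
# 50,000 of a 1,000,000-fish release carried tags (5%).
total = expand_recoveries(40, 0.20, 50_000 / 1_000_000)
print(total)
```

Here the estimate is 40 / 0.20 / 0.05 = 4,000 fish contributed to that fishery.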
DOE Office of Scientific and Technical Information (OSTI.GOV)
New, Joshua Ryan; Kumar, Jitendra; Hoffman, Forrest M.
Statement of the Problem: ASHRAE releases updates to 90.1, "Energy Standard for Buildings except Low-Rise Residential Buildings," every three years, resulting in a 3.7%-17.3% increase in energy efficiency for buildings with each release. This standard is adopted by or informs building codes in nations across the globe and is the national standard for the US; individual states elect which release year of the standard they will enforce. These codes are built upon Standard 169, "Climatic Data for Building Design Standards," the latest 2017 release of which defines climate zones based on 8,118 weather stations throughout the world and data from the past 8-25 years. This data may not be indicative of the weather that new buildings built today will see during their upcoming 30-120 year lifespan. Methodology & Theoretical Orientation: Using more modern, high-resolution datasets from climate satellites, IPCC climate models (PCM and HadGCM), high-performance computing resources (Titan), and new capabilities for clustering and optimization, the authors analyzed different methods for redefining climate zones, using bottom-up analysis of multiple meteorological variables that subject-matter experts selected as being important to energy consumption, rather than the heating/cooling degree days currently used. Findings: We analyzed the accuracy of the redefined climate zones compared to the current climate zones, examined how the climate zones moved under different climate change scenarios, and quantified the accuracy of these methods at a local level and at a national scale for the US. Conclusion & Significance: A significant annual national energy and cost (billions USD) savings could likely be realized by adjusting climate zones to take into account anticipated trends or scenarios in regional weather patterns.
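A bottom-up redefinition of climate zones from several meteorological variables amounts to clustering stations in feature space. The sketch below uses plain k-means on invented station features (the variables, values, and two-zone setup are illustrative; the study's actual methods and variables may differ):

```python
import numpy as np

def kmeans(X, k, iters=50, seed=0):
    """Plain k-means: assign each station to the nearest center, then move
    each center to the mean of its assigned stations, and repeat."""
    rng = np.random.default_rng(seed)
    centers = X[rng.choice(len(X), k, replace=False)]
    for _ in range(iters):
        labels = np.argmin(((X[:, None, :] - centers[None]) ** 2).sum(-1), axis=1)
        for j in range(k):
            if np.any(labels == j):          # keep a center with no members fixed
                centers[j] = X[labels == j].mean(axis=0)
    return labels, centers

# Hypothetical station features: [mean temperature (C), annual precip (m), humidity]
rng = np.random.default_rng(1)
X = np.vstack([rng.normal([10.0, 0.5, 0.6], 0.1, (50, 3)),   # cool/dry stations
               rng.normal([25.0, 1.5, 0.8], 0.1, (50, 3))])  # warm/wet stations
labels, centers = kmeans(X, 2)
```

In practice the features would be standardized first (temperature otherwise dominates the distance), and k and the variable set chosen by the subject-matter experts.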
The DRG shift: a new twist for ICD-10 preparation.
Long, Peri L
2012-06-01
Analysis of your specific business is a key component of ICD-10 implementation. An understanding of your organization's current reimbursement trends will go a long way toward assessing and preparing for the impact of ICD-10 in your environment. If you cannot be prepared for each detailed scenario, remember that much of the analysis and resolution relies on familiar coding, DRG analysis, and claims-processing best practices; these simply have the new twist of researching new codes and some new concepts. The news of a delay in the implementation compliance date, along with the release of grouper Version 29, should encourage your educational and business analysis efforts. This is a great opportunity to maintain open communication with the Centers for Medicare & Medicaid Services, the Department of Health and Human Services, and the Centers for Disease Control. This is also a key time to report any unusual or discrepant findings in order to provide input to the final rule.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Greene, N.M.; Petrie, L.M.; Westfall, R.M.
SCALE--a modular code system for Standardized Computer Analyses Licensing Evaluation--has been developed by Oak Ridge National Laboratory at the request of the US Nuclear Regulatory Commission. The SCALE system utilizes well-established computer codes and methods within standard analysis sequences that (1) allow an input format designed for the occasional user and/or novice, (2) automate the data processing and coupling between modules, and (3) provide accurate and reliable results. System development has been directed at problem-dependent cross-section processing and analysis of criticality safety, shielding, heat transfer, and depletion/decay problems. Since the initial release of SCALE in 1980, the code system has been heavily used for evaluation of nuclear fuel facility and package designs. This revision documents Version 4.2 of the system. The manual is divided into three volumes: Volume 1 for the control module documentation; Volume 2 for functional module documentation; and Volume 3 for documentation of the data libraries and subroutine libraries.
DYNA3D Code Practices and Developments
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lin, L.; Zywicz, E.; Raboin, P.
2000-04-21
DYNA3D is an explicit, finite element code developed to solve high rate dynamic simulations for problems of interest to the engineering mechanics community. The DYNA3D code has been under continuous development since 1976 [1] by the Methods Development Group in the Mechanical Engineering Department of Lawrence Livermore National Laboratory. The pace of code development activities has substantially increased in the past five years, growing from one to between four and six code developers. This has necessitated the use of software tools such as CVS (Concurrent Versions System) to help manage multiple version updates. While on-line documentation with an Adobe PDF manual helps to communicate software developments, periodically a summary document describing recent changes and improvements in the DYNA3D software is needed. The first part of this report describes issues surrounding software versions and source control. The remainder of this report details the major capability improvements since the last publicly released version of DYNA3D in 1996. Not included here are the many hundreds of bug corrections and minor enhancements, nor the development in DYNA3D between the manual release in 1993 [2] and the public code release in 1996.
Systematic analysis of coding and noncoding DNA sequences using methods of statistical linguistics
NASA Technical Reports Server (NTRS)
Mantegna, R. N.; Buldyrev, S. V.; Goldberger, A. L.; Havlin, S.; Peng, C. K.; Simons, M.; Stanley, H. E.
1995-01-01
We compare the statistical properties of coding and noncoding regions in eukaryotic and viral DNA sequences by adapting two tests developed for the analysis of natural languages and symbolic sequences. The data set comprises all 30 sequences of length above 50,000 base pairs in GenBank Release No. 81.0, as well as the recently published sequences of C. elegans chromosome III (2.2 Mbp) and yeast chromosome XI (661 Kbp). We find that for the three chromosomes we studied the statistical properties of noncoding regions appear to be closer to those observed in natural languages than those of coding regions. In particular, (i) an n-tuple Zipf analysis of noncoding regions reveals a regime close to power-law behavior, while the coding regions show logarithmic behavior over a wide interval, and (ii) an n-gram entropy measurement shows that the noncoding regions have a lower n-gram entropy (and hence a larger "n-gram redundancy") than the coding regions. In contrast to the three chromosomes, we find that for vertebrates such as primates and rodents and for viral DNA, the difference between the statistical properties of coding and noncoding regions is not pronounced, and therefore the results of the analyses of the investigated sequences are less conclusive. After noting the intrinsic limitations of the n-gram redundancy analysis, we also briefly discuss the failure of the zeroth- and first-order Markovian models or simple nucleotide repeats to account fully for these "linguistic" features of DNA. Finally, we emphasize that our results by no means prove the existence of a "language" in noncoding DNA.
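The n-gram entropy measurement described above can be sketched in a few lines: count overlapping n-tuples, then compute the Shannon entropy of their frequencies. A lower entropy for the same n means a larger n-gram redundancy. The toy sequences below are invented to show the contrast, not GenBank data:

```python
from collections import Counter
from math import log2

def ngram_entropy(seq, n):
    """Shannon entropy (bits) of overlapping n-tuples of a symbol sequence.
    Redundant sequences reuse few n-tuples, so their entropy is lower."""
    counts = Counter(seq[i:i + n] for i in range(len(seq) - n + 1))
    total = sum(counts.values())
    return -sum(c / total * log2(c / total) for c in counts.values())

# A highly repetitive (redundant) toy sequence vs. a more varied one:
e_rep = ngram_entropy("ATATATATATAT", 2)   # only 'AT' and 'TA' ever occur
e_var = ngram_entropy("ATCGGCTAAGCT", 2)
print(e_rep, e_var)
```

On real sequences one would compute this for a range of n and compare coding vs. noncoding regions, with care about finite-size effects at large n (the intrinsic limitation the abstract mentions).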
Long-range correlation properties of coding and noncoding DNA sequences: GenBank analysis.
Buldyrev, S V; Goldberger, A L; Havlin, S; Mantegna, R N; Matsa, M E; Peng, C K; Simons, M; Stanley, H E
1995-05-01
An open question in computational molecular biology is whether long-range correlations are present in both coding and noncoding DNA or only in the latter. To answer this question, we consider all 33301 coding and all 29453 noncoding eukaryotic sequences--each of length larger than 512 base pairs (bp)--in the present release of GenBank to determine whether there is any statistically significant distinction in their long-range correlation properties. Standard fast Fourier transform (FFT) analysis indicates that coding sequences have practically no correlations in the range from 10 bp to 100 bp (spectral exponent beta=0.00 +/- 0.04, where the uncertainty is two standard deviations). In contrast, for noncoding sequences, the average value of the spectral exponent beta is positive (0.16 +/- 0.05) which unambiguously shows the presence of long-range correlations. We also separately analyze the 874 coding and the 1157 noncoding sequences that have more than 4096 bp and find a larger region of power-law behavior. We calculate the probability that these two data sets (coding and noncoding) were drawn from the same distribution and we find that it is less than 10(-10). We obtain independent confirmation of these findings using the method of detrended fluctuation analysis (DFA), which is designed to treat sequences with statistical heterogeneity, such as DNA's known mosaic structure ("patchiness") arising from the nonstationarity of nucleotide concentration. The near-perfect agreement between the two independent analysis methods, FFT and DFA, increases the confidence in the reliability of our conclusion.
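The DFA method mentioned above can be sketched compactly: integrate the (mean-subtracted) series, remove a linear trend within windows of each scale, and regress the log of the residual fluctuation against the log of the scale. For uncorrelated data the slope alpha is near 0.5, which corresponds to spectral exponent beta = 2*alpha - 1 = 0. This is a generic illustration on synthetic white noise, not the paper's pipeline:

```python
import numpy as np

def dfa_exponent(x, scales):
    """Detrended fluctuation analysis (first order): returns the scaling
    exponent alpha from log F(s) vs. log s."""
    y = np.cumsum(x - np.mean(x))          # integrated profile
    F = []
    for s in scales:
        n = len(y) // s
        segs = y[:n * s].reshape(n, s)
        t = np.arange(s)
        rms = []
        for seg in segs:                   # linear detrend in each window
            coef = np.polyfit(t, seg, 1)
            rms.append(np.mean((seg - np.polyval(coef, t)) ** 2))
        F.append(np.sqrt(np.mean(rms)))    # fluctuation at this scale
    alpha = np.polyfit(np.log(scales), np.log(F), 1)[0]
    return alpha

# White noise: expect alpha near 0.5 (i.e., beta near 0, no correlations).
x = np.random.default_rng(0).standard_normal(4096)
alpha = dfa_exponent(x, [8, 16, 32, 64, 128])
print(alpha)
```

A long-range-correlated sequence would instead give alpha above 0.5, consistent with the positive beta the paper reports for noncoding DNA.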
Numerical studies of the deposition of material released from fixed and rotary wing aircraft
NASA Technical Reports Server (NTRS)
Bilanin, A. J.; Teske, M. E.
1984-01-01
The computer code AGDISP (AGricultural DISPersal) has been developed to predict the deposition of material released from fixed and rotary wing aircraft in a single-pass, computationally efficient manner. The formulation of the code is novel in that the mean particle trajectory and the variance about the mean resulting from turbulent fluid fluctuations are simultaneously predicted. The code presently includes the capability of assessing the influence of neutral atmospheric conditions, inviscid wake vortices, particle evaporation, plant canopy and terrain on the deposition pattern. In this report, the equations governing the motion of aerially released particles are developed, including a description of the evaporation model used. A series of case studies performed with AGDISP is included.
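The "mean plus variance" formulation can be illustrated with a deliberately simplified toy: advance the mean particle position ballistically while growing the positional variance from turbulent diffusion (sigma^2 grows as 2*K*t), instead of tracking an ensemble of random particles. All parameter values and the constant-wind, constant-settling assumptions are invented for illustration and are far cruder than AGDISP's actual equations:

```python
import numpy as np

def mean_and_variance_trajectory(release_height, settling_velocity,
                                 wind_speed, K, dt=0.01):
    """Integrate the mean trajectory of a released droplet and the growth of
    its positional variance (single pass, no random sampling).
    K is an assumed turbulent diffusivity (m^2/s)."""
    t, x, z, var = 0.0, 0.0, release_height, 0.0
    while z > 0:
        t += dt
        x += wind_speed * dt            # mean downwind drift
        z -= settling_velocity * dt     # mean fall
        var += 2.0 * K * dt             # turbulent spread about the mean
    return x, np.sqrt(var)              # mean deposition point and its spread

# Hypothetical: 3 m release height, 1 m/s settling, 5 m/s wind, K = 0.1 m^2/s.
x_mean, sigma = mean_and_variance_trajectory(3.0, 1.0, 5.0, 0.1)
print(x_mean, sigma)
```

The deposition pattern is then described by the mean landing point (about 15 m downwind here) and a Gaussian spread sigma about it, which is the computational advantage over Monte Carlo particle tracking.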
Geant4 Computing Performance Benchmarking and Monitoring
Dotti, Andrea; Elvira, V. Daniel; Folger, Gunter; ...
2015-12-23
Performance evaluation and analysis of large scale computing applications is essential for optimal use of resources. As detector simulation is one of the most compute intensive tasks and Geant4 is the simulation toolkit most widely used in contemporary high energy physics (HEP) experiments, it is important to monitor Geant4 through its development cycle for changes in computing performance and to identify problems and opportunities for code improvements. All Geant4 development and public releases are being profiled with a set of applications that utilize different input event samples, physics parameters, and detector configurations. Results from multiple benchmarking runs are compared to previous public and development reference releases to monitor CPU and memory usage. Observed changes are evaluated and correlated with code modifications. Besides the full summary of call stack and memory footprint, a detailed call graph analysis is available to Geant4 developers for further analysis. The set of software tools used in the performance evaluation procedure, both in sequential and multi-threaded modes, includes FAST, IgProf and Open|Speedshop. Finally, the scalability of CPU time and memory performance in the multi-threaded application is evaluated by measuring event throughput and memory gain as a function of the number of threads for selected event samples.
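The release-over-release comparison step reduces to: for each benchmark, compute the relative change in CPU time against the reference release and flag anything above a tolerance. A minimal sketch with invented benchmark names and timings (not actual Geant4 profiling data or tooling):

```python
def flag_regressions(reference, current, threshold=0.05):
    """Compare benchmark CPU times (seconds) between a reference release and
    the current one. Returns {name: (relative_change, regressed?)}, where a
    regression is a slowdown larger than `threshold` (5% by default)."""
    report = {}
    for name, ref_time in reference.items():
        cur = current.get(name)
        if cur is None:            # benchmark not run against this release
            continue
        change = (cur - ref_time) / ref_time
        report[name] = (change, change > threshold)
    return report

# Hypothetical per-benchmark CPU times for two releases:
ref = {"em_shower": 120.0, "hadronic": 340.0}
cur = {"em_shower": 131.0, "hadronic": 338.0}
print(flag_regressions(ref, cur))
```

In a real monitoring setup the times would be medians over repeated runs, and flagged changes would then be correlated with the commits between the two releases.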
2015 TRI National Analysis: Toxics Release Inventory Releases at Various Summary Levels
The TRI National Analysis is EPA's annual interpretation of TRI data at various summary levels. It highlights how toxic chemical wastes were managed, where toxic chemicals were released and how the 2015 TRI data compare to data from previous years. This dataset reports US state, county, large aquatic ecosystem, metro/micropolitan statistical area, and facility level statistics from 2015 TRI releases, including information on: number of 2015 TRI facilities in the geographic area and their releases (total, water, air, land); population information, including populations living within 1 mile of TRI facilities (total, minority, in poverty); and Risk Screening Environmental Indicators (RSEI) model related pounds, toxicity-weighted pounds, and RSEI score. The source of administrative boundary data is the 2013 cartographic boundary shapefiles. Location of facilities is provided by EPA's Facility Registry Service (FRS). Large Aquatic Ecosystems boundaries were dissolved from the hydrologic unit boundaries and codes for the United States, Puerto Rico, and the U.S. Virgin Islands. It was revised for inclusion in the National Atlas of the United States of America (November 2002), and updated to match the streams file created by the USGS National Mapping Division (NMD) for the National Atlas of the United States of America.
Diffusive deposition of aerosols in Phebus containment during FPT-2 test
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kontautas, A.; Urbonavicius, E.
2012-07-01
At present, lumped-parameter codes are the main tools used to investigate the complex response of the containment of a Nuclear Power Plant in case of an accident. Continuous development and validation of the codes is required to perform realistic investigation of the processes that determine the possible source term of radioactive products to the environment. Validation of the codes is based on the comparison of the calculated results with the measurements performed in experimental facilities. The most extensive experimental program to investigate fission product release from the molten fuel, transport through the cooling circuit, and deposition in the containment is performed in the PHEBUS test facility. Test FPT-2, performed in this facility, is considered for analysis of processes taking place in the containment. Earlier investigations using the COCOSYS code showed that the code could be successfully used for analysis of thermal-hydraulic processes and deposition of aerosols, but it was also noticed that diffusive deposition on the vertical walls does not fit well with the measured results. The CPA module of the ASTEC code implements a different model for diffusive deposition; therefore, the PHEBUS containment model was transferred from the COCOSYS code to ASTEC-CPA to investigate the influence of the diffusive deposition modelling. The analysis was performed using a PHEBUS containment model of 16 nodes. The calculated thermal-hydraulic parameters are in good agreement with measured results, which gives a basis for realistic simulation of aerosol transport and deposition processes. The performed investigations showed that the diffusive deposition model has an influence on the aerosol deposition distribution on different surfaces in the test facility. (authors)
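To give a sense of the magnitudes involved in diffusive wall deposition of aerosols: a particle's Brownian diffusivity follows Stokes-Einstein, and a simple film model takes the deposition velocity as v_d = D/delta for an assumed concentration boundary layer delta. This is a generic textbook-style estimate, not the model implemented in COCOSYS or ASTEC-CPA, and the slip correction is neglected:

```python
import math

def brownian_diffusivity(d_p, T=298.0, mu=1.8e-5):
    """Stokes-Einstein diffusivity (m^2/s) of a spherical aerosol particle of
    diameter d_p (m) in air at temperature T (K) with viscosity mu (Pa s)."""
    k_B = 1.380649e-23  # Boltzmann constant, J/K
    return k_B * T / (3 * math.pi * mu * d_p)

D = brownian_diffusivity(1e-7)  # 0.1 micron particle
v_d = D / 1e-3                  # film model with an assumed 1 mm boundary layer
print(D, v_d)
```

The resulting deposition velocities (here of order 1e-7 m/s) are small compared with gravitational settling of larger particles, which is why the choice of diffusive-deposition model matters mainly for the vertical walls.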
Centrifuge Modeling of Explosion-Induced Craters in Unsaturated Sand
1992-11-01
[Scanned-report excerpt; recoverable content:] This report was submitted as a thesis to Colorado State University. Funding was provided by the U.S. Air Force Palace Knight program and by the U.S… Approved for public release; distribution unlimited. Dimensional analysis is used to generate a list of pi terms; dimensional analysis is an extension of the Buckingham pi theorem (Buckingham, 1914), which states that given…
2018-03-29
[Scanned-letter excerpt; recoverable content:] www.apl.washington.edu, 29 Mar 2018. To: Dr. Robert H. Headrick, Office of Naval Research (Code 322), 875 North Randolph Street, Arlington, VA 22203-1995… Benjamin Blake, Naval Research Laboratory; Defense Technical Information Center. DISTRIBUTION STATEMENT A: Approved for public release; distribution is… [environmental factors] quantitatively impact sound behavior. To gain quantitative knowledge, TREX13 was designed to contemporaneously measure acoustic quantities and environmental…
Updated User's Guide for Sammy: Multilevel R-Matrix Fits to Neutron Data Using Bayes' Equations
DOE Office of Scientific and Technical Information (OSTI.GOV)
Larson, Nancy M
2008-10-01
In 1980 the multilevel multichannel R-matrix code SAMMY was released for use in analysis of neutron-induced cross section data at the Oak Ridge Electron Linear Accelerator. Since that time, SAMMY has evolved to the point where it is now in use around the world for analysis of many different types of data. SAMMY is not limited to incident neutrons but can also be used for incident protons, alpha particles, or other charged particles; likewise, Coulomb exit channels can be included. Corrections for a wide variety of experimental conditions are available in the code: Doppler and resolution broadening, multiple-scattering corrections for capture or reaction yields, normalizations and backgrounds, to name but a few. The fitting procedure is Bayes' method, and data and parameter covariance matrices are properly treated within the code. Pre- and post-processing capabilities are also available, including (but not limited to) connections with the Evaluated Nuclear Data Files. Though originally designed for use in the resolved resonance region, SAMMY also includes a treatment for data analysis in the unresolved resonance region.
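The core of a Bayes'-equations fit with full covariance treatment is a generalized-least-squares update of the parameter vector and its covariance against new data through a linearized model. The toy below keeps everything linear (a direct observation model) to show the algebra; it is a generic sketch, not SAMMY's implementation, and the numbers are invented:

```python
import numpy as np

def bayes_update(prior_mean, prior_cov, G, data, data_cov):
    """One Bayes/GLS update for a linearized model data ~ G @ params.
    Returns the posterior mean and covariance of the parameters."""
    prior_cov = np.atleast_2d(prior_cov)
    data_cov = np.atleast_2d(data_cov)
    # Kalman-style gain: K = P G^T (G P G^T + V)^-1
    S = G @ prior_cov @ G.T + data_cov
    K = prior_cov @ G.T @ np.linalg.inv(S)
    post_mean = prior_mean + K @ (data - G @ prior_mean)
    post_cov = prior_cov - K @ G @ prior_cov
    return post_mean, post_cov

# One parameter (prior mean 1.0, variance 1.0) observed directly twice,
# each observation with variance 0.01:
m, P = bayes_update(np.array([1.0]), np.array([[1.0]]),
                    np.array([[1.0], [1.0]]), np.array([2.0, 2.2]),
                    np.eye(2) * 0.01)
print(m, P)
```

The posterior mean here is the precision-weighted average (1*1 + 100*2.0 + 100*2.2)/201, and the posterior variance 1/201; in a resonance fit, G would be the matrix of cross-section derivatives with respect to the resonance parameters, re-evaluated each iteration.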
ANNz2: Photometric Redshift and Probability Distribution Function Estimation using Machine Learning
NASA Astrophysics Data System (ADS)
Sadeh, I.; Abdalla, F. B.; Lahav, O.
2016-10-01
We present ANNz2, a new implementation of the public software for photometric redshift (photo-z) estimation of Collister & Lahav, which now includes generation of full probability distribution functions (PDFs). ANNz2 utilizes multiple machine learning methods, such as artificial neural networks and boosted decision/regression trees. The objective of the algorithm is to optimize the performance of the photo-z estimation, to properly derive the associated uncertainties, and to produce both single-value solutions and PDFs. In addition, estimators are made available, which mitigate possible problems of non-representative or incomplete spectroscopic training samples. ANNz2 has already been used as part of the first weak lensing analysis of the Dark Energy Survey, and is included in the experiment's first public data release. Here we illustrate the functionality of the code using data from the tenth data release of the Sloan Digital Sky Survey and the Baryon Oscillation Spectroscopic Survey. The code is available for download at http://github.com/IftachSadeh/ANNZ.
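The regression at the heart of a photo-z estimator can be illustrated with a deliberately simple stand-in for the machine-learning methods named above: predict redshift as the mean of the k nearest training galaxies in magnitude space, with the neighbor spread as a crude per-object uncertainty. This is a generic sketch on invented data, not the ANNz2 algorithm:

```python
import numpy as np

def knn_photoz(train_mags, train_z, query_mags, k=5):
    """k-nearest-neighbor photometric redshift: for each query galaxy, find
    the k training galaxies closest in magnitude space and return the mean
    and standard deviation of their spectroscopic redshifts."""
    d2 = ((train_mags[None, :, :] - query_mags[:, None, :]) ** 2).sum(-1)
    idx = np.argsort(d2, axis=1)[:, :k]
    zs = train_z[idx]
    return zs.mean(axis=1), zs.std(axis=1)

# Hypothetical training set: two magnitudes correlated with redshift.
rng = np.random.default_rng(0)
z_train = rng.uniform(0.0, 1.0, 200)
mags = np.column_stack([20 + 2 * z_train + rng.normal(0, 0.05, 200),
                        21 + 1 * z_train + rng.normal(0, 0.05, 200)])
z_hat, z_err = knn_photoz(mags, z_train, mags[:10])
```

The neighbor redshifts themselves form a crude per-object PDF; the representativeness caveat in the abstract shows up here directly, since queries far from any training galaxy get unreliable neighbors.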
NASA Astrophysics Data System (ADS)
Ivanov, A. S.; Rusinkevich, A. A.; Taran, M. D.
2018-01-01
The FP Kinetics computer code [1], designed for calculation of fission product release from HTGR coated fuel particles, was modified to allow consideration of chemical bonding, effects of limited solubility, and component concentration jumps at interfaces between coating layers. Curves of Cs release from coated particles calculated with the FP Kinetics and PARFUME [2] codes were compared. It was found that considering concentration jumps at silicon carbide layer interfaces explains some experimental data on Cs release obtained from post-irradiation heating tests. The need to perform experiments to measure solubility limits in coating materials was noted.
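The effect of a solubility-driven concentration jump at a layer interface is easiest to see in the steady-state limit: flux continuity across two layers, with the interface condition c2 = K*c1 expressing the jump. This is a generic two-layer diffusion sketch with invented parameter values, not the FP Kinetics model:

```python
def two_layer_steady_flux(C_in, D1, L1, D2, L2, K):
    """Steady diffusion through two coating layers in series with a
    concentration jump at their interface (layer-2 side = K * layer-1 side)
    and zero concentration at the outer surface. Returns the flux and the
    two interface concentrations.
    Flux continuity: J = D1*(C_in - c1)/L1 = D2*(K*c1 - 0)/L2."""
    c1 = C_in * (D1 / L1) / (D1 / L1 + K * D2 / L2)
    J = D1 * (C_in - c1) / L1
    return J, c1, K * c1

# Hypothetical values: inner concentration 1.0 (arbitrary units),
# D1 = 1e-14 m^2/s over 50 um, D2 = 1e-15 m^2/s over 35 um, K = 0.1.
J, c1, c2 = two_layer_steady_flux(1.0, 1e-14, 50e-6, 1e-15, 35e-6, 0.1)
print(J, c1, c2)
```

With K < 1 the second layer admits only a fraction of the interface concentration, throttling the release flux, which is qualitatively how an SiC-interface jump can reconcile calculated and measured Cs release.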
Stryker, Jo Ellen
2002-11-01
Characteristics defining the newsworthiness of journal articles appearing in JAMA and NEJM were examined to determine whether they affect visibility in the news media. It was also hypothesized that press releases affected the amount of news coverage of a journal article, because the most newsworthy journal articles are selected for press releases. Journal articles (N = 95) were coded for characteristics believed to describe the "newsworthiness" of journal articles. Quantity of news coverage of the journal articles was estimated using the LEXIS-NEXIS database. Bivariate associations were examined using one-way analysis of variance, and multivariate analyses utilized OLS regression. Characteristics of the newsworthiness of medical journal articles predicted their visibility in newspapers. The issuing of press releases also predicted newspaper coverage. However, press releases predicted newspaper coverage largely because more newsworthy journal articles had accompanying press releases, rather than because the press release itself was influential. Journalists report on medical information that is topical, stratifies risk based on demographic and lifestyle variables, and has lifestyle rather than medical implications. Medical journals issue press releases for articles that possess the characteristics journalists are looking for, thereby further highlighting their importance.
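The multivariate step, regressing coverage on both newsworthiness and a press-release indicator so their effects can be separated, can be sketched with ordinary least squares. The data below are simulated purely to show the mechanics (press releases correlated with newsworthiness, a small direct effect), not the study's actual data:

```python
import numpy as np

def ols(X, y):
    """Ordinary least squares with an intercept column prepended.
    Returns [intercept, coefficients...]."""
    A = np.column_stack([np.ones(len(X)), X])
    beta, *_ = np.linalg.lstsq(A, y, rcond=None)
    return beta

rng = np.random.default_rng(0)
newsworthiness = rng.normal(size=95)                      # coded article score
press_release = (newsworthiness + rng.normal(0, 0.5, 95) > 0).astype(float)
coverage = 2.0 * newsworthiness + 0.3 * press_release + rng.normal(0, 0.2, 95)
beta = ols(np.column_stack([newsworthiness, press_release]), coverage)
print(beta)
```

Because the press-release indicator is itself driven by newsworthiness, its marginal (bivariate) association with coverage is much larger than its coefficient once newsworthiness is controlled for, which is exactly the pattern the study reports.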
DOE Office of Scientific and Technical Information (OSTI.GOV)
Salinger, Andrew; Phipps, Eric; Ostien, Jakob
2016-01-13
The Albany code is a general-purpose finite element code for solving partial differential equations (PDEs). Albany is a research code that demonstrates how a PDE code can be built by interfacing many of the open-source software libraries that are released under Sandia's Trilinos project. Part of the mission of Albany is to be a testbed for new Trilinos libraries, to refine their methods, usability, and interfaces. Albany includes hooks to optimization and uncertainty quantification algorithms, including those in Trilinos as well as those in the Dakota toolkit. Because of this, Albany is a desirable starting point for new code development efforts that wish to make heavy use of Trilinos. Albany is both a framework and the host for specific finite element applications. These applications have project names and can be controlled by configuration option when the code is compiled, but are all developed and released as part of the single Albany code base. These include the LCM, QCAD, FELIX, Aeras, and ATO applications.
Altered Standards of Care: An Analysis of Existing Federal, State, and Local Guidelines
2011-12-01
[Report-documentation-page excerpt; recoverable content:] Approved for public release; distribution is unlimited. Abstract fragments: …data systems for communications and the transference of data. Losing data systems during disasters cuts off access to electronic medical records… emergency procedures such as mouth-to-mouth resuscitation, external chest compression, electric shock, insertion of a tube to open the patient's airway…
A Logical Design of the Naval Postgraduate School Housing Office.
1985-03-01
[Scanned-thesis excerpt; recoverable content:] March 1985; thesis advisor: Barry A. Frew. Approved for public release; distribution is unlimited. Cited references include: Information Systems Development: Analysis and Design, South-Western, 1984; Pressman, R. S., Software Engineering: A Practitioner's Approach, McGraw… Distribution list includes: Naval Postgraduate School, Monterey, California 93943; Lt. Barry A. Frew, Code 54 Fw, Administrative Services Department, Naval Postgraduate School, Monterey.
19 CFR 142.45 - Use of bar code by entry filer.
Code of Federal Regulations, 2010 CFR
2010-04-01
Code of Federal Regulations excerpt (19 Customs Duties, 2010-04-01 edition): Department of the Treasury (continued), Entry Process, Line Release. § 142.45 Use of bar code by entry filer. (a) …in accordance with instructions from the port director, shall preprint invoices with the C-4 Code in bar code and…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Friedrichs, D.R.
1980-01-01
The Assessment of Effectiveness of Geologic Isolation Systems (AEGIS) Program is developing and applying the methodology for assessing the far-field, long-term post-closure safety of deep geologic nuclear waste repositories. AEGIS is being performed by Pacific Northwest Laboratory (PNL) under contract with the Office of Nuclear Waste Isolation (ONWI) for the Department of Energy (DOE). One task within AEGIS is the development of methodology for analysis of the consequences (water pathway) from loss of repository containment as defined by various release scenarios. Analysis of the long-term, far-field consequences of release scenarios requires the application of numerical codes which simulate the hydrologic systems, model the transport of released radionuclides through the hydrologic systems to the biosphere, and, where applicable, assess the radiological dose to humans. The various input parameters required in the analysis are compiled in data systems. The data are organized and prepared by various input subroutines for use by the hydrologic and transport codes. The hydrologic models simulate the groundwater flow systems and provide water flow directions, rates, and velocities as inputs to the transport models. Outputs from the transport models are basically graphs of radionuclide concentration in the groundwater plotted against time. After dilution in the receiving surface-water body (e.g., lake, river, bay), these data are the input source terms for the dose models, if dose assessments are required. The dose models calculate radiation dose to individuals and populations. CIRMIS (Comprehensive Information Retrieval and Model Input Sequence) Data System is a storage and retrieval system for model input and output data, including graphical interpretation and display. This is the fourth of four volumes of the description of the CIRMIS Data System.
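The transport-code output described above, radionuclide concentration in groundwater versus time, typically looks like a dispersed, decayed pulse arriving at the biosphere. The toy below stands in for a full advection-dispersion solution with a Gaussian-in-time pulse scaled by radioactive decay; the travel time, dispersion, and half-life values are invented for illustration:

```python
import numpy as np

def arrival_concentration(t_years, travel_time, dispersion, decay_halflife, C0=1.0):
    """Crude stand-in for a 1D advection-dispersion breakthrough curve:
    a pulse centered at the groundwater travel time, spread by dispersion,
    and attenuated by radioactive decay along the way."""
    lam = np.log(2) / decay_halflife                      # decay constant, 1/yr
    pulse = np.exp(-((t_years - travel_time) ** 2) / (2 * dispersion ** 2))
    return C0 * pulse * np.exp(-lam * t_years)

# Hypothetical scenario: 1000-year travel time, 200-year spread,
# a C-14-like 5730-year half-life.
t = np.linspace(0, 5000, 501)
C = arrival_concentration(t, travel_time=1000, dispersion=200, decay_halflife=5730)
print(t[np.argmax(C)], C.max())
```

A curve like this, after dilution in the receiving surface-water body, is what would feed the dose models as a source term; short-lived nuclides with travel times much longer than their half-life would simply never produce an appreciable peak.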
NASA Technical Reports Server (NTRS)
Teske, M. E.
1984-01-01
This is a user manual for the computer code "AGDISP" (AGricultural DISPersal) which has been developed to predict the deposition of material released from fixed and rotary wing aircraft in a single-pass, computationally efficient manner. The formulation of the code is novel in that the mean particle trajectory and the variance about the mean resulting from turbulent fluid fluctuations are simultaneously predicted. The code presently includes the capability of assessing the influence of neutral atmospheric conditions, inviscid wake vortices, particle evaporation, plant canopy and terrain on the deposition pattern.
Aeroacoustic Analysis of Turbofan Noise Generation
NASA Technical Reports Server (NTRS)
Meyer, Harold D.; Envia, Edmane
1996-01-01
This report provides an updated version of analytical documentation for the V072 Rotor Wake/Stator Interaction Code. It presents the theoretical derivation of the equations used in the code and, where necessary, it documents the enhancements and changes made to the original code since its first release. V072 is a package of FORTRAN computer programs which calculate the in-duct acoustic modes excited by a fan/stator stage operating in a subsonic mean flow. Sound is generated by the stator vanes interacting with the mean wakes of the rotor blades. In this updated version, only the tonal noise produced at the blade passing frequency and its harmonics is described. The broadband noise component analysis, which was part of the original report, is not included here. The code provides outputs of modal pressure and power amplitudes generated by the rotor-wake/stator interaction. The rotor/stator stage is modeled as an ensemble of blades and vanes of zero camber and thickness enclosed within an infinite hard-walled annular duct. The amplitude of each propagating mode is computed and summed to obtain the harmonics of sound power flux within the duct for both upstream and downstream propagating modes.
Evans, A.F.; Roby, D.D.; Collis, K.; Cramer, B.M.; Sheggeby, J.A.; Adrean, L.J.; Battaglia, D.S.; Lyons, Donald E.
2011-01-01
We recovered coded wire tags (CWTs) from a colony of Caspian terns Hydroprogne caspia on Brooks Island in San Francisco Bay, California, to evaluate predation on juvenile salmonids originating from the Sacramento and San Joaquin rivers. Subsamples of colony substrate representing 11.7% of the nesting habitat used by the terns yielded 2,079 salmonid CWTs from fish released and subsequently consumed by terns in 2008. The estimated number of CWTs deposited on the entire tern colony was 40,143 (ranging from 26,763 to 80,288), once adjustments were made to account for tag loss and the total amount of nesting habitat used by terns. Tags ingested by terns and then egested on the colony were undamaged, and the tags' complete numeric codes were still identifiable. The CWTs found on the tern colony indicated that hatchery Chinook salmon Oncorhynchus tshawytscha trucked to and released in San Pablo Bay were significantly more likely to be consumed by Caspian terns than Chinook salmon that migrated in-river to the bay; 99.7% of all tags recovered were from bay-released Chinook salmon. Of the CWTs recovered on the tern colony, 98.0% were from fall-run Chinook salmon, indicating a higher susceptibility to tern predation than for the spring run type. None of the approximately 518,000 wild Chinook salmon that were coded-wire-tagged and released in the basin were recovered on the tern colony, suggesting that the impacts on wild, U.S. Endangered Species Act-listed Chinook salmon populations were minimal in 2008. Overall, we estimate that 0.3% of the approximately 12.3 million coded-wire-tagged Chinook salmon released in the basin in 2008 were subsequently consumed by Caspian terns from the Brooks Island colony. These results indicate that CWTs implanted in juvenile salmon can be recovered from a piscivorous waterbird colony and used to evaluate smolt losses for runs that are tagged. 
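The colony-wide tag estimate above follows from scaling the subsample count by the fraction of nesting habitat sampled and then applying correction factors (for tag loss and detection). The correction factor in this sketch is back-computed for illustration, since the paper's exact adjustments are not given in the abstract.

```python
# Scale a subsample tag count up to a whole-colony estimate (illustrative).
def colony_estimate(tags_in_subsample, habitat_fraction_sampled, correction=1.0):
    """Raw scale-up by the sampled habitat fraction, times an optional
    correction factor (tag loss, detection efficiency); 1.0 = no adjustment."""
    return tags_in_subsample / habitat_fraction_sampled * correction

raw = colony_estimate(2079, 0.117)        # ~17,769 tags before corrections
# The published estimate (40,143) implies additional upward corrections;
# the factor below is simply back-computed, not taken from the paper.
implied_correction = 40143 / raw
```

The back-computed factor (roughly 2.3) shows how much of the published estimate comes from the tag-loss and habitat adjustments rather than the raw proportional scale-up.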
Radioactive release during nuclear accidents in Chernobyl and Fukushima
NASA Astrophysics Data System (ADS)
Nur Ain Sulaiman, Siti; Mohamed, Faizal; Rahim, Ahmad Nabil Ab
2018-01-01
The nuclear accidents at Chernobyl and Fukushima have motivated extensive research into the causes and mechanisms of radioactive release within the reactor compound and to the environment. The most common radionuclides released are fission products from the irradiated fuel rods themselves. In the event of a nuclear accident, monitoring focuses mostly on the release of noble gases, I-131, and Cs-137. As these are the only accidents rated at Level 7 on the International Nuclear Event Scale (INES), the radioactive release to the environment was one of the critical quantities to be monitored. The release of radioactive material to the atmosphere from the Fukushima accident was estimated at approximately 10% of that from the Chernobyl accident. Previous studies using computational code systems to model the release rate found that the release activity of I-131 and Cs-137 at Chernobyl was significantly higher than at Fukushima. The simulation codes also showed that Chernobyl had the higher release rate of both radionuclides on the day of the accident. Other factors affecting the radioactive release in the two accidents, such as the reactor technology and safety measures of the time, are also compared and discussed.
Hu, Zhitao; Tong, Xia-Jing; Kaplan, Joshua M
2013-01-01
Synaptic transmission consists of fast and slow components of neurotransmitter release. Here we show that these components are mediated by distinct exocytic proteins. The Caenorhabditis elegans unc-13 gene is required for synaptic vesicle (SV) exocytosis, and encodes long and short isoforms (UNC-13L and UNC-13S). Fast release was mediated by UNC-13L, whereas slow release required both UNC-13 proteins and was inhibited by Tomosyn. The spatial location of each protein correlated with its effect. Proteins adjacent to the dense projection mediated fast release, while those controlling slow release were more distal or diffuse. Two UNC-13L domains accelerated release. C2A, which binds RIM (a protein associated with calcium channels), anchored UNC-13 at active zones and shortened the latency of release. A calmodulin binding site accelerated release but had little effect on UNC-13's spatial localization. These results suggest that UNC-13L, UNC-13S, and Tomosyn form a molecular code that dictates the timing of neurotransmitter release. DOI: http://dx.doi.org/10.7554/eLife.00967.001 PMID:23951547
Irma 5.1 multisensor signature prediction model
NASA Astrophysics Data System (ADS)
Savage, James; Coker, Charles; Edwards, Dave; Thai, Bea; Aboutalib, Omar; Chow, Anthony; Yamaoka, Neil; Kim, Charles
2006-05-01
The Irma synthetic signature prediction code is being developed to facilitate the research and development of multi-sensor systems. Irma was one of the first high resolution, physics-based Infrared (IR) target and background signature models to be developed for tactical weapon applications. Originally developed in 1980 by the Munitions Directorate of the Air Force Research Laboratory (AFRL/MN), the Irma model was used exclusively to generate IR scenes. In 1988, a number of significant upgrades to Irma were initiated including the addition of a laser (or active) channel. This two-channel version was released to the user community in 1990. In 1992, an improved scene generator was incorporated into the Irma model, which supported correlated frame-to-frame imagery. A passive IR/millimeter wave (MMW) code was completed in 1994. This served as the cornerstone for the development of the co-registered active/passive IR/MMW model, Irma 4.0. In 2000, Irma version 5.0 was released which encompassed several upgrades to both the physical models and software. Circular polarization was added to the passive channel, and a Doppler capability was added to the active MMW channel. In 2002, the multibounce technique was added to the Irma passive channel. In the ladar channel, a user-friendly Ladar Sensor Assistant (LSA) was incorporated which provides capability and flexibility for sensor modeling. Irma 5.0 runs on several platforms including Windows, Linux, Solaris, and SGI Irix. Irma is currently used to support a number of civilian and military applications. The Irma user base includes over 130 agencies within the Air Force, Army, Navy, DARPA, NASA, Department of Transportation, academia, and industry. In 2005, Irma version 5.1 was released to the community. 
In addition to upgrading the Ladar channel code to an object oriented language (C++) and providing a new graphical user interface to construct scenes, this new release significantly improves the modeling of the ladar channel and includes polarization effects, time jittering, speckle effect, and atmospheric turbulence. More importantly, the Munitions Directorate has funded three field tests to verify and validate the re-engineered ladar channel. Each of the field tests was comprehensive and included one month of sensor characterization and a week of data collection. After each field test, the analysis included comparisons of Irma predicted signatures with measured signatures, and if necessary, refining the model to produce realistic imagery. This paper will focus on two areas of the Irma 5.1 development effort: report on the analysis results of the validation and verification of the Irma 5.1 ladar channel, and the software development plan and validation efforts of the Irma passive channel. As scheduled, the Irma passive code is being re-engineered using object oriented language (C++), and field data collection is being conducted to validate the re-engineered passive code. This software upgrade will remove many constraints and limitations of the legacy code including limits on image size and facet counts. The field test to validate the passive channel is expected to be complete in the second quarter of 2006.
Constitutive relations in TRAC-P1A
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rohatgi, U.S.; Saha, P.
1980-08-01
The purpose of this document is to describe the basic thermal-hydraulic models and correlations that are in the TRAC-P1A code, as released in March 1979. It is divided into two parts, A and B. Part A describes the models in the three-dimensional vessel module of TRAC, whereas Part B focuses on the loop components that are treated by one-dimensional formulations. The report follows the format of the questions prepared by the Analysis Development Branch of USNRC and the questionnaire has been attached to this document for completeness. Concerted efforts have been made in understanding the present models in TRAC-P1A by going through the FORTRAN listing of the code. Some discrepancies between the code and the TRAC-P1A manual have been found. These are pointed out in this document. Efforts have also been made to check the TRAC references for the range of applicability of the models and correlations used in the code. 26 refs., 5 figs., 1 tab.
Verification and Validation in a Rapid Software Development Process
NASA Technical Reports Server (NTRS)
Callahan, John R.; Easterbrook, Steve M.
1997-01-01
The high cost of software production is driving development organizations to adopt more automated design and analysis methods such as rapid prototyping, computer-aided software engineering (CASE) tools, and high-level code generators. Even developers of safety-critical software systems have adopted many of these new methods while striving to achieve high levels of quality and reliability. While these new methods may enhance productivity and quality in many cases, we examine some of the risks involved in the use of new methods in safety-critical contexts. We examine a case study involving the use of a CASE tool that automatically generates code from high-level system designs. We show that while high-level testing on the system structure is highly desirable, significant risks exist in the automatically generated code and in re-validating releases of the generated code after subsequent design changes. We identify these risks and suggest process improvements that retain the advantages of rapid, automated development methods within the quality and reliability contexts of safety-critical projects.
Physics of Shock Compression and Release: NEMD Simulations of Tantalum and Silicon
NASA Astrophysics Data System (ADS)
Hahn, Eric; Meyers, Marc; Zhao, Shiteng; Remington, Bruce; Bringa, Eduardo; Germann, Tim; Ravelo, Ramon; Hammerberg, James
2015-06-01
Shock compression and release allow us to evaluate physical deformation and damage mechanisms occurring in extreme environments. The SPaSM and LAMMPS molecular dynamics codes were employed to simulate single and polycrystalline tantalum and silicon at strain rates above 10⁸ s⁻¹. Visualization and analysis were accomplished using OVITO, the Crystal Analysis Tool, and a redesigned orientation imaging function implemented into SPaSM. A comparison between interatomic potentials for both Si and Ta (as pertaining to shock conditions) is conducted, and the influence on phase transformation and plastic relaxation is discussed. Partial dislocations, shear-induced disordering, and metastable phase changes are observed in compressed silicon. For tantalum, grain boundary and twin intersections are evaluated for their role in ductile spallation. Finally, the temperature-dependent response of both Ta and Si is investigated.
Arabic Natural Language Processing System Code Library
2014-06-01
Contents: Code Compilation; Training Instructions; Applying the System to New Examples; License; History; Important Note; Papers ... a slightly different English dependency scheme and contained a variety of improvements. However, the PropBank-style SRL module was not maintained ... than those in the http://sourceforge.net/projects/miacp/ release. This release contains a variety of bug fixes and other generally ...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Perez, R. Navarro; Schunck, N.; Lasseri, R.
2017-03-09
HFBTHO is a physics computer code that is used to model the structure of the nucleus. It is an implementation of the nuclear energy Density Functional Theory (DFT), where the energy of the nucleus is obtained by integration over space of some phenomenological energy density, which is itself a functional of the neutron and proton densities. In HFBTHO, the energy density derives either from the zero-range Skyrme or the finite-range Gogny effective two-body interaction between nucleons. Nuclear superfluidity is treated at the Hartree-Fock-Bogoliubov (HFB) approximation, and axial symmetry of the nuclear shape is assumed. This version is the 3rd release of the program; the two previous versions were published in Computer Physics Communications [1,2]. The previous version was released at LLNL under GPL 3 Open Source License and was given release code LLNL-CODE-573953.
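The DFT idea described above — total energy as the spatial integral of an energy density that is a functional of the local neutron and proton densities — can be shown with a minimal one-dimensional toy. The Gaussian densities and the quadratic functional below are illustrative placeholders, not the Skyrme or Gogny functionals actually used in HFBTHO.

```python
import math

# Toy 1-D energy density functional: E = ∫ eps(rho_n(x), rho_p(x)) dx.
# Densities and functional form are illustrative, not HFBTHO's.

def rho(x, center=0.0, width=1.0):
    """Toy Gaussian nucleon density profile."""
    return math.exp(-((x - center) / width) ** 2)

def eps(rho_n, rho_p, a=-1.0, b=0.5):
    """Toy energy density: attractive term linear in the total density,
    repulsive term quadratic in it."""
    rho_t = rho_n + rho_p
    return a * rho_t + b * rho_t ** 2

def total_energy(n=2000, x_max=10.0):
    """Trapezoid-rule integration of the energy density over x."""
    dx = 2 * x_max / n
    e = 0.0
    for i in range(n + 1):
        x = -x_max + i * dx
        w = 0.5 if i in (0, n) else 1.0
        e += w * eps(rho(x), rho(x)) * dx
    return e
```

For these Gaussians the integral has a closed form (-2√π + 2√(π/2) ≈ -1.038), which the quadrature reproduces, illustrating the "energy by integration over space" structure.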
User's manual for the Composite HTGR Analysis Program (CHAP-1)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gilbert, J.S.; Secker, P.A. Jr.; Vigil, J.C.
1977-03-01
CHAP-1 is the first release version of an HTGR overall plant simulation program with both steady-state and transient solution capabilities. It consists of a model-independent systems analysis program and a collection of linked modules, each representing one or more components of the HTGR plant. Detailed instructions on the operation of the code and detailed descriptions of the HTGR model are provided. Information is also provided to allow the user to easily incorporate additional component modules, to modify or replace existing modules, or to incorporate a completely new simulation model into the CHAP systems analysis framework.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Audren, Benjamin; Lesgourgues, Julien; Benabed, Karim
Models for the latest stages of the cosmological evolution rely on a less solid theoretical and observational ground than the description of earlier stages like BBN and recombination. As suggested in a previous work by Vonlanthen et al., it is possible to tweak the analysis of CMB data in such a way as to avoid making assumptions on the late evolution, and obtain robust constraints on "early cosmology parameters". We extend this method in order to marginalise the results over CMB lensing contamination, and present updated results based on recent CMB data. Our constraints on the minimal early cosmology model are weaker than in a standard ΛCDM analysis, but do not conflict with this model. Besides, we obtain conservative bounds on the effective neutrino number and neutrino mass, showing no hints for extra relativistic degrees of freedom, and proving in a robust way that neutrinos experienced their non-relativistic transition after the time of photon decoupling. This analysis is also an occasion to describe the main features of the new parameter inference code MONTE PYTHON, which we release together with this paper. MONTE PYTHON is a user-friendly alternative to other public codes like COSMOMC, interfaced with the Boltzmann code CLASS.
2017-06-01
Naval Postgraduate School, Monterey, California. Master's thesis, June 2017; approved for public release, distribution is unlimited. Title: ECONOMIC PREPARATION OF THE ... Abstract: Over the past decade, the People's Republic of China has increasingly used its economic might ...
Fourteen Years of R/qtl: Just Barely Sustainable
Broman, Karl W.
2014-01-01
R/qtl is an R package for mapping quantitative trait loci (genetic loci that contribute to variation in quantitative traits) in experimental crosses. Its development began in 2000. There have been 38 software releases since 2001. The latest release contains 35k lines of R code and 24k lines of C code, plus 15k lines of code for the documentation. Challenges in the development and maintenance of the software are discussed. A key to the success of R/qtl is that it remains a central tool for the chief developer's own research work, and so its maintenance is of selfish importance. PMID:25364504
DOE Office of Scientific and Technical Information (OSTI.GOV)
McKenzie-Carter, M.A.; Lyon, R.E.; Rope, S.K.
This report contains information to support the Environmental Assessment for the Burning Plasma Experiment (BPX) Project proposed for the Princeton Plasma Physics Laboratory (PPPL). The assumptions and methodology used to assess the impact to members of the public from operational and accidental releases of radioactive material from the proposed BPX during the operational period of the project are described. A description of the tracer release tests conducted at PPPL by NOAA is included; dispersion values from these tests are used in the dose calculations. Radiological releases, doses, and resulting health risks are calculated and summarized. The computer code AIRDOS-EPA, which is part of the computer code system CAP-88, is used to calculate the individual and population doses for routine releases; FUSCRAC3 is used to calculate doses resulting from off-normal releases where direct application of the NOAA tracer test data is not practical. Where applicable, doses are compared to regulatory limits and guideline values. 48 refs., 16 tabs.
Seals Code Development Workshop
NASA Technical Reports Server (NTRS)
Hendricks, Robert C. (Compiler); Liang, Anita D. (Compiler)
1996-01-01
The 1995 Seals Workshop industrial code (INDSEAL) release includes ICYL, GCYLT, IFACE, GFACE, SPIRALG, SPIRALI, DYSEAL, and KTK. The scientific code (SCISEAL) release includes conjugate heat transfer and multidomain with rotordynamic capability. Several seals and bearings codes (e.g., HYDROFLEX, HYDROTRAN, HYDROB3D, FLOWCON1, FLOWCON2) are presented and results compared. Current computational and experimental emphasis includes multiple connected cavity flows with goals of reducing parasitic losses and gas ingestion. Labyrinth seals continue to play a significant role in sealing, with face, honeycomb, and new sealing concepts under investigation for advanced engine concepts in view of strict environmental constraints. The clean-sheet approach to engine design is advocated, with program directions and anticipated percentage SFC reductions cited. Future activities center on engine applications with coupled seal/power/secondary flow streams.
28 CFR 2.12 - Initial hearings: Setting presumptive release dates.
Code of Federal Regulations, 2011 CFR
2011-07-01
28 CFR § 2.12 — Initial hearings: Setting presumptive release dates (Judicial Administration; Department of Justice; Parole, Release, Supervision and Recommitment of Prisoners, Youth Offenders, and Juvenile Delinquents; United States Code ...)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wright, A.L.
This report presents a summary of the status of research activities associated with fission product behavior (release and transport) under severe accident conditions within the primary systems of water-moderated and water-cooled nuclear reactors. For each of the areas of fission product release and fission product transport, the report summarizes relevant information on important phenomena, major experiments performed, relevant computer models and codes, comparisons of computer code calculations with experimental results, and general conclusions on the overall state of the art. Finally, the report provides an assessment of the overall importance and knowledge of primary system release and transport phenomena and presents major conclusions on the state of the art.
Kinetics of silver release from microfuel taking into account the limited-solubility effect
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ivanov, A. S., E-mail: asi.kiae@gmail.com; Rusinkevich, A. A., E-mail: rusinkevich_andr@mail.ru
2014-12-15
The effect of a limited solubility of silver in silicon carbide on silver release from a microfuel with a TRISO coating is studied. It is shown that a limited solubility affects substantially both concentration profiles and silver release from a microfuel over a broad range of temperatures. A procedure is developed for obtaining fission-product concentration profiles in a microfuel and graphs representing the flow and integrated release of fission products on the basis of data from neutron-physics calculations and results obtained by calculating thermodynamics with the aid of the Ivtanthermo code and kinetics with the aid of the FP-Kinetics code. This procedure takes into account a limited solubility of fission products in protective coatings of microfuel.
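The coupling of diffusion with a solubility limit in a coating layer can be illustrated with a toy explicit finite-difference scheme in which the concentration in the barrier cells is capped at a saturation value. This sketch is purely illustrative — the geometry, constants, and the capping rule are assumptions, not the Ivtanthermo or FP-Kinetics treatment.

```python
# Toy 1-D diffusion with a solubility cap in a barrier region (illustrative).
def diffuse(c, d, dt, dx, c_sat, barrier):
    """One explicit Euler diffusion step; cells in `barrier` are capped at
    c_sat, mimicking limited solubility in a coating layer."""
    new = c[:]
    for i in range(1, len(c) - 1):
        new[i] = c[i] + d * dt / dx**2 * (c[i-1] - 2*c[i] + c[i+1])
        if i in barrier:
            new[i] = min(new[i], c_sat)   # solubility limit in the layer
    return new

# fixed source on the left, barrier cells 5-7, sink on the right
c = [1.0] + [0.0] * 9
for _ in range(500):
    c = diffuse(c, d=1.0, dt=0.1, dx=1.0, c_sat=0.2, barrier={5, 6, 7})
    c[0], c[-1] = 1.0, 0.0   # fixed boundary values
```

Running this shows the qualitative effect the abstract describes: the capped layer flattens the concentration profile inside the coating and throttles the flux (and hence the integrated release) reaching the outer boundary.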
2012-02-01
Approved for public release; distribution unlimited. Acronyms: I-DEAS/TMG (thermal analysis software), IR (Initial Review), ITAR (International Traffic in Arms ...). ... the finite element code I-DEAS/TMG. A mesh refinement study was conducted on the first panel to determine the mesh density required to accurately ... heat transfer analysis conducted with I-DEAS/TMG exercises mapping of temperatures to ...
Cost-Minimization Analysis of Open and Endoscopic Carpal Tunnel Release.
Zhang, Steven; Vora, Molly; Harris, Alex H S; Baker, Laurence; Curtin, Catherine; Kamal, Robin N
2016-12-07
Carpal tunnel release is the most common upper-limb surgical procedure performed annually in the U.S. There are 2 surgical methods of carpal tunnel release: open or endoscopic. Currently, there is no clear clinical or economic evidence supporting the use of one procedure over the other. We completed a cost-minimization analysis of open and endoscopic carpal tunnel release, testing the null hypothesis that there is no difference between the procedures in terms of cost. We conducted a retrospective review using a private-payer and Medicare Advantage database composed of 16 million patient records from 2007 to 2014. The cohort consisted of records with an ICD-9 (International Classification of Diseases, Ninth Revision) diagnosis of carpal tunnel syndrome and a CPT (Current Procedural Terminology) code for carpal tunnel release. Payer fees were used to define cost. We also assessed other associated costs of care, including those of electrodiagnostic studies and occupational therapy. Bivariate comparisons were performed using the chi-square test and the Student t test. Data showed that 86% of the patients underwent open carpal tunnel release. Reimbursement fees for endoscopic release were significantly higher than for open release. Facility fees were responsible for most of the difference between the procedures in reimbursement: facility fees averaged $1,884 for endoscopic release compared with $1,080 for open release (p < 0.0001). Endoscopic release also demonstrated significantly higher physician fees than open release (an average of $555 compared with $428; p < 0.0001). Occupational therapy fees associated with endoscopic release were less than those associated with open release (an average of $237 per session compared with $272; p = 0.07). The total average annual reimbursement per patient for endoscopic release (facility, surgeon, and occupational therapy fees) was significantly higher than for open release ($2,602 compared with $1,751; p < 0.0001). 
Our data showed that the total average fees per patient for endoscopic release were significantly higher than those for open release, although there currently is no strong evidence supporting better clinical outcomes of either technique. Value-based health-care models that favor delivering high-quality care and improving patient health, while also minimizing costs, may favor open carpal tunnel release.
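The cost gap reported above is dominated by the facility fee. Using only the average figures quoted in the abstract, the component differences can be tabulated directly; note that the per-patient totals also fold in occupational-therapy utilization, so the components do not sum exactly to the total difference.

```python
# Average per-patient fees quoted in the abstract (USD).
fees = {
    "facility":  {"endoscopic": 1884, "open": 1080},
    "physician": {"endoscopic": 555,  "open": 428},
    "total":     {"endoscopic": 2602, "open": 1751},
}

# Endoscopic-minus-open difference for each component.
diff = {k: v["endoscopic"] - v["open"] for k, v in fees.items()}
facility_share = diff["facility"] / diff["total"]   # fraction of the gap
```

The facility-fee difference ($804) accounts for roughly 94% of the $851 total gap, consistent with the abstract's statement that facility fees were responsible for most of the difference.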
An Approach for Assessing Delamination Propagation Capabilities in Commercial Finite Element Codes
NASA Technical Reports Server (NTRS)
Krueger, Ronald
2007-01-01
An approach for assessing the delamination propagation capabilities in commercial finite element codes is presented and demonstrated for one code. For this investigation, the Double Cantilever Beam (DCB) specimen and the Single Leg Bending (SLB) specimen were chosen for full three-dimensional finite element simulations. First, benchmark results were created for both specimens. Second, starting from an initially straight front, the delamination was allowed to propagate. Good agreement between the load-displacement relationship obtained from the propagation analysis results and the benchmark results could be achieved by selecting the appropriate input parameters. Selecting the appropriate input parameters, however, was not straightforward and often required an iterative procedure. Qualitatively, the delamination front computed for the DCB specimen did not take the shape of a curved front as expected. However, the analysis of the SLB specimen yielded a curved front as may be expected from the distribution of the energy release rate and the failure index across the width of the specimen. Overall, the results are encouraging but further assessment on a structural level is required.
Sić, Siniša; Maier, Norbert M; Rizzi, Andreas M
2016-09-07
The potential and benefits of isotope-coded labeling in the context of MS-based glycan profiling are evaluated focusing on the analysis of O-glycans. For this purpose, a derivatization strategy using d0/d5-1-phenyl-3-methyl-5-pyrazolone (PMP) is employed, allowing O-glycan release and derivatization to be achieved in one single step. The paper demonstrates that this release and derivatization reaction can be carried out also in-gel with only marginal loss in sensitivity compared to in-solution derivatization. Such an effective in-gel reaction allows one to extend this release/labeling method also to glycoprotein/glycoform samples pre-separated by gel-electrophoresis without the need of extracting the proteins/digested peptides from the gel. With highly O-glycosylated proteins (e.g. mucins) LODs in the range of 0.4 μg glycoprotein (100 fmol) loaded onto the electrophoresis gel can be attained; with minor glycosylated proteins (like IgAs, FVII, FIX) the LODs were in the range of 80-100 μg (250 pmol-1.5 nmol) glycoprotein loaded onto the gel. As a second aspect, the potential of isotope-coded labeling as an internal standardization strategy for the reliable determination of quantitative glycan profiles via MALDI-MS is investigated. Towards this goal, a number of established and emerging MALDI matrices were tested for PMP-glycan quantitation, and their performance is compared with that of ESI-based measurements. The crystalline matrix 2,6-dihydroxyacetophenone (DHAP) and the ionic liquid matrix N,N-diisopropyl-ethyl-ammonium 2,4,6-trihydroxyacetophenone (DIEA-THAP) showed potential for MALDI-based quantitation of PMP-labeled O-glycans. We also provide a comprehensive overview on the performance of MS-based glycan quantitation approaches by comparing sensitivity, LOD, accuracy and repeatability data obtained with RP-HPLC-ESI-MS, stand-alone nano-ESI-MS with a spray-nozzle chip, and MALDI-MS.
Finally, the suitability of the isotope-coded PMP labeling strategy for O-glycan profiling of biologically important proteins is demonstrated by comparative analysis of IgA immunoglobulins and two coagulation factors. Copyright © 2016 Elsevier B.V. All rights reserved.
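The internal-standardization idea behind d0/d5 labeling above is that each glycan's light-labeled signal is quantified against the co-measured heavy-labeled standard. A minimal version of that ratio calculation looks like the following; the intensity and standard-amount values are made up for illustration.

```python
# Relative quantitation from paired light/heavy (d0/d5-PMP) peak intensities.
# All numeric values are invented for illustration.
def relative_amount(light_intensity, heavy_intensity, standard_amount):
    """Analyte amount inferred from the d0/d5 intensity ratio, assuming the
    two labeled forms ionize with equal efficiency."""
    return light_intensity / heavy_intensity * standard_amount

amt = relative_amount(light_intensity=3.0e5, heavy_intensity=1.5e5,
                      standard_amount=50.0)   # units of the spiked standard
```

Because the light and heavy forms co-elute and ionize nearly identically, the ratio cancels much of the run-to-run and matrix variability, which is what makes the approach attractive for MALDI quantitation.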
CBP Toolbox Version 3.0 “Beta Testing” Performance Evaluation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Smith, III, F. G.
2016-07-29
One function of the Cementitious Barriers Partnership (CBP) is to assess available models of cement degradation and to assemble suitable models into a “Toolbox” that would be made available to members of the partnership, as well as the DOE Complex. To this end, SRNL and Vanderbilt University collaborated to develop an interface using the GoldSim software to the STADIUM @ code developed by SIMCO Technologies, Inc. and LeachXS/ORCHESTRA developed by Energy research Centre of the Netherlands (ECN). Release of Version 3.0 of the CBP Toolbox is planned in the near future. As a part of this release, an increased levelmore » of quality assurance for the partner codes and the GoldSim interface has been developed. This report documents results from evaluation testing of the ability of CBP Toolbox 3.0 to perform simulations of concrete degradation applicable to performance assessment of waste disposal facilities. Simulations of the behavior of Savannah River Saltstone Vault 2 and Vault 1/4 concrete subject to sulfate attack and carbonation over a 500- to 1000-year time period were run using a new and upgraded version of the STADIUM @ code and the version of LeachXS/ORCHESTRA released in Version 2.0 of the CBP Toolbox. Running both codes allowed comparison of results from two models which take very different approaches to simulating cement degradation. In addition, simulations of chloride attack on the two concretes were made using the STADIUM @ code. The evaluation sought to demonstrate that: 1) the codes are capable of running extended realistic simulations in a reasonable amount of time; 2) the codes produce “reasonable” results; the code developers have provided validation test results as part of their code QA documentation; and 3) the two codes produce results that are consistent with one another. Results of the evaluation testing showed that the three criteria listed above were met by the CBP partner codes. 
Therefore, it is concluded that the codes can be used to support performance assessment. This conclusion takes into account the QA documentation produced for the partner codes and for the CBP Toolbox.
Runtime Detection of C-Style Errors in UPC Code
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pirkelbauer, P; Liao, C; Panas, T
2011-09-29
Unified Parallel C (UPC) extends the C programming language (ISO C 99) with explicit parallel programming support for the partitioned global address space (PGAS), which provides a global memory space with localized partitions to each thread. Like its ancestor C, UPC is a low-level language that emphasizes code efficiency over safety. The absence of dynamic (and static) safety checks allows programmer oversights and software flaws that can be hard to spot. In this paper, we present an extension of a dynamic analysis tool, ROSE-Code Instrumentation and Runtime Monitor (ROSE-CIRM), for UPC to help programmers find C-style errors involving the global address space. Built on top of the ROSE source-to-source compiler infrastructure, the tool instruments source files with code that monitors operations and keeps track of changes to the system state. The resulting code is linked to a runtime monitor that observes the program execution and finds software defects. We describe the extensions to ROSE-CIRM that were necessary to support UPC. We discuss complications that arise from parallel code and our solutions. We test ROSE-CIRM against a runtime error detection test suite, and present performance results obtained from running error-free codes. ROSE-CIRM is released as part of the ROSE compiler under a BSD-style open source license.
Structural integrity of a confinement vessel for testing nuclear fuels for space propulsion
NASA Astrophysics Data System (ADS)
Bergmann, V. L.
Nuclear propulsion systems for rockets could significantly reduce the travel time to distant destinations in space. However, long before such a concept can become reality, a significant effort must be invested in analysis and ground testing to guide the development of nuclear fuels. Any testing in support of development of nuclear fuels for space propulsion must be safely contained to prevent the release of radioactive materials. This paper describes analyses performed to assess the structural integrity of a test confinement vessel. The confinement structure, a stainless steel pressure vessel with bolted flanges, was designed for operating static pressures in accordance with the ASME Boiler and Pressure Vessel Code. In addition to the static operating pressures, the confinement barrier must withstand static overpressures from off-normal conditions without releasing radioactive material. Results from axisymmetric finite element analyses are used to evaluate the response of the confinement structure under design and accident conditions. For the static design conditions, the stresses computed from the ASME code are compared with the stresses computed by the finite element method.
Uncertainty and sensitivity analysis of fission gas behavior in engineering-scale fuel modeling
Pastore, Giovanni; Swiler, L. P.; Hales, Jason D.; ...
2014-10-12
The role of uncertainties in fission gas behavior calculations as part of engineering-scale nuclear fuel modeling is investigated using the BISON fuel performance code and a recently implemented physics-based model for coupled fission gas release and swelling. Through the integration of BISON with the DAKOTA software, a sensitivity analysis of the results to selected model parameters is carried out based on UO2 single-pellet simulations covering different power regimes. The parameters are varied within ranges representative of the relative uncertainties and consistent with the information from the open literature. The study leads to an initial quantitative assessment of the uncertainty in fission gas behavior modeling with the parameter characterization presently available. Also, the relative importance of the single parameters is evaluated. Moreover, a sensitivity analysis is carried out based on simulations of a fuel rod irradiation experiment, pointing out a significant impact of the considered uncertainties on the calculated fission gas release and cladding diametral strain. The results of the study indicate that the commonly accepted deviation between calculated and measured fission gas release by a factor of 2 approximately corresponds to the inherent modeling uncertainty at high fission gas release. Nevertheless, higher deviations may be expected for values around 10% and lower. Implications are discussed in terms of directions of research for the improved modeling of fission gas behavior for engineering purposes.
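The sampling-based workflow described in this abstract can be illustrated with a small sketch: sample uncertain parameters within assumed ranges, evaluate a model for each sample, and rank the parameters by rank correlation with the output. The "model" below is a deliberately simple stand-in, not BISON physics, and the parameter names and ranges are illustrative assumptions.

```python
import numpy as np

# Schematic Monte Carlo sensitivity study in the spirit of the BISON/DAKOTA
# workflow. All parameter ranges and the response function are assumptions
# made for illustration only.
rng = np.random.default_rng(42)
n = 2000
# multiplicative uncertainty factors on three illustrative parameters
diff = rng.uniform(0.1, 10.0, n)      # gas diffusion coefficient factor
resol = rng.uniform(0.5, 2.0, n)      # re-solution rate factor
surft = rng.uniform(0.8, 1.2, n)      # bubble surface tension factor

def toy_fgr(d, b, g):
    # stand-in response: release grows with diffusion, is suppressed by
    # re-solution, and depends only weakly on surface tension
    return np.sqrt(d) / (1.0 + 0.5 * b) * g ** 0.2

fgr = toy_fgr(diff, resol, surft)

def rank_corr(x, y):
    # Spearman rank correlation computed as Pearson correlation of ranks
    rx, ry = x.argsort().argsort(), y.argsort().argsort()
    return np.corrcoef(rx, ry)[0, 1]

sens = {name: rank_corr(p, fgr)
        for name, p in [("diffusion", diff), ("resolution", resol),
                        ("surface_tension", surft)]}
# for this stand-in model, diffusion dominates the sensitivity ranking
```

A real study would replace `toy_fgr` with full fuel performance calculations and use the measured parameter uncertainty ranges; the ranking machinery is unchanged.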
77 FR 13061 - Electronic Reporting of Toxics Release Inventory Data
Federal Register 2010, 2011, 2012, 2013, 2014
2012-03-05
...--Reporting Year SIC--Standard Industrial Code TRI--Toxics Release Inventory TRI-ME--TRI-Made Easy Desktop... EPA to ``publish a uniform toxic chemical release form for facilities covered'' by the TRI Program. 42... practicable. Similarly, EPA's Cross-Media Electronic Reporting Regulation (CROMERR) (40 CFR Part 3), published...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Shumaker, Dana E.; Steefel, Carl I.
The code CRUNCH_PARALLEL is a parallel version of the CRUNCH code. CRUNCH code version 2.0 was previously released by LLNL (UCRL-CODE-200063). CRUNCH is a general-purpose reactive transport code developed by Steefel and Yabusaki (Steefel and Yabusaki, 1996). The code handles non-isothermal transport and reaction in one, two, and three dimensions. The reaction algorithm is generic in form, handling an arbitrary number of aqueous and surface complexation reactions as well as mineral dissolution/precipitation. A standardized database is used containing thermodynamic and kinetic data. The code includes advective, dispersive, and diffusive transport.
Lessons Learned through the Development and Publication of AstroImageJ
NASA Astrophysics Data System (ADS)
Collins, Karen
2018-01-01
As lead author of the scientific image processing software package AstroImageJ (AIJ), I will discuss the reasoning behind why we decided to release AIJ to the public, and the lessons we learned related to the development, publication, distribution, and support of AIJ. I will also summarize the AIJ code language selection, code documentation and testing approaches, code distribution, update, and support facilities used, and the code citation and licensing decisions. Since AIJ was initially developed as part of my graduate research and was my first scientific open source software publication, many of my experiences and difficulties encountered may parallel those of others new to scientific software publication. Finally, I will discuss the benefits and disadvantages of releasing scientific software that I now recognize after having AIJ in the public domain for more than five years.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Strenge, D.L.; Peloquin, R.A.
The computer code HADOC (Hanford Acute Dose Calculations) is described and instructions for its use are presented. The code calculates external dose from air submersion and inhalation doses following acute radionuclide releases. Atmospheric dispersion is calculated using the Hanford model with options to determine maximum conditions. Building wake effects and terrain variation may also be considered. Doses are calculated using dose conversion factors supplied in a data library. Doses are reported for one and fifty year dose commitment periods for the maximum individual and the regional population (within 50 miles). The fractional contributions to dose by radionuclide and exposure mode are also printed if requested.
Hou, Zhouhua; Xu, Xuwen; Zhou, Ledu; Fu, Xiaoyu; Tao, Shuhui; Zhou, Jiebin; Tan, Deming; Liu, Shuiping
2017-07-01
Increasing evidence supports the significance of long non-coding RNA in cancer development. Several recent studies suggest the oncogenic activity of long non-coding RNA metastasis-associated lung adenocarcinoma transcript 1 (MALAT1) in hepatocellular carcinoma. In this study, we explored the molecular mechanisms by which MALAT1 modulates hepatocellular carcinoma biological behaviors. We found that microRNA-204 was significantly downregulated in sh-MALAT1 HepG2 cells and 15 hepatocellular carcinoma tissues by quantitative real-time polymerase chain reaction analysis. Through bioinformatic screening, luciferase reporter assay, RNA-binding protein immunoprecipitation, and RNA pull-down assay, we identified microRNA-204 as a potential interacting partner for MALAT1. Functionally, wound-healing and transwell assays revealed that microRNA-204 significantly inhibited the migration and invasion of hepatocellular carcinoma cells. Notably, sirtuin 1 was recognized as a direct downstream target of microRNA-204 in HepG2 cells. Moreover, si-SIRT1 significantly inhibited cell invasion and migration. These data indicate that, by sponging and competitively binding microRNA-204, MALAT1 releases the suppression of sirtuin 1, which in turn promotes hepatocellular carcinoma migration and invasion. This study reveals a novel mechanism by which MALAT1 stimulates hepatocellular carcinoma progression and justifies targeting metastasis-associated lung adenocarcinoma transcript 1 as a potential therapy for hepatocellular carcinoma.
Comparison of LEWICE 1.6 and LEWICE/NS with IRT experimental data from modern air foil tests
DOT National Transportation Integrated Search
1998-01-01
A research project is underway at NASA Lewis to produce a computer code which can accurately predict ice growth under any meteorological conditions for any aircraft surface. The most recent release of this code is LEWICE 1.6. This code is modular in ...
Testing and Life Prediction for Composite Rotor Hub Flexbeams
NASA Technical Reports Server (NTRS)
Murri, Gretchen B.
2004-01-01
A summary of several studies of delamination in tapered composite laminates with internal ply-drops is presented. Initial studies used 2D FE models to calculate interlaminar stresses at the ply-ending locations in linear tapered laminates under tension loading. Strain energy release rates for delamination in these laminates indicated that delamination would likely start at the juncture of the tapered and thin regions and grow unstably in both directions. Tests of glass/epoxy and graphite/epoxy linear tapered laminates under axial tension delaminated as predicted. Nonlinear tapered specimens were cut from a full-size helicopter rotor hub and were tested under combined constant axial tension and cyclic transverse bending loading to simulate the loading experienced by a rotor hub flexbeam in flight. For all the tested specimens, delamination began at the tip of the outermost dropped ply group and grew first toward the tapered region. A 2D FE model was created that duplicated the test flexbeam layup, geometry, and loading. Surface strains calculated by the model agreed very closely with the measured surface strains in the specimens. The delamination patterns observed in the tests were simulated in the model by releasing pairs of multi-point constraints (MPCs) along those interfaces. Strain energy release rates associated with the delamination growth were calculated for several configurations and using two different FE analysis codes. Calculations from the codes agreed very closely. The strain energy release rate results were used with material characterization data to predict fatigue delamination onset lives for nonlinear tapered flexbeams with two different ply-dropping schemes. The predicted curves agreed well with the test data for each case studied.
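The constraint-release approach described above is closely related to the virtual crack closure technique, in which the energy released by freeing a tied node pair is estimated from the crack-tip nodal forces and the relative displacements one element behind the tip. The following sketch shows that bookkeeping; all numerical values are illustrative assumptions, not data from the study.

```python
# Back-of-the-envelope virtual-crack-closure sketch of the energy released
# when a tied node pair at a delamination front is freed. Inputs are the
# crack-tip nodal forces, the relative opening/sliding displacements one
# element behind the tip, the element length da, and the specimen width.
def vcct_G(f_y, f_x, du_y, du_x, da, width):
    """Mode I and mode II strain energy release rates (per unit crack area)."""
    G_I = f_y * du_y / (2.0 * da * width)    # opening contribution
    G_II = f_x * du_x / (2.0 * da * width)   # sliding contribution
    return G_I, G_II

# assumed tip forces (N), relative displacements (m), element size/width (m)
G_I, G_II = vcct_G(f_y=12.0, f_x=3.0, du_y=2.0e-5, du_x=0.5e-5,
                   da=1.0e-3, width=25.0e-3)
G_total = G_I + G_II  # compared against delamination-onset characterization data
```

In a fatigue-life prediction, `G_total` (or the mode mix) would be evaluated at each load level and compared against delamination-onset curves from coupon tests.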
A new code for modelling the near field diffusion releases from the final disposal of nuclear waste
NASA Astrophysics Data System (ADS)
Vopálka, D.; Vokál, A.
2003-01-01
The canisters with spent nuclear fuel produced during the operation of WWER reactors at the Czech power plants are planned, like in other countries, to be disposed of in an underground repository. Canisters will be surrounded by compacted bentonite that will retard the migration of safety-relevant radionuclides into the host rock. A new code was developed that models the transport of the critical radionuclides from the canister through the bentonite layer in cylindrical geometry. The code solves the diffusion equation for various types of initial and boundary conditions by means of the finite difference method and takes into account the non-linear shape of the sorption isotherm. A comparison of the code reported here with the code PAGODA, which is based on an analytical solution of the transport equation, was made for the actinide chain 4N+3, which includes 239Pu. A simple parametric study of the releases of 239Pu, 129I, and 14C into the geosphere is discussed.
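The ingredients named in this abstract (radial diffusion through a bentonite annulus, finite differences, a non-linear sorption isotherm) can be sketched as follows. This is an independent illustration, not the authors' code: the geometry, diffusion coefficient, and Freundlich isotherm constants are assumed values chosen only to make the example run.

```python
import numpy as np

# Explicit finite-difference solution of radial diffusion through a bentonite
# annulus, with a concentration-dependent (Freundlich-type) retardation factor.
# All physical parameters below are illustrative assumptions.
D = 1.0e-10              # apparent diffusion coefficient, m^2/s (assumed)
r_in, r_out = 0.4, 0.7   # canister surface / host-rock interface radii, m (assumed)
n = 60
r = np.linspace(r_in, r_out, n)
dr = r[1] - r[0]
c = np.zeros(n)
c[0] = 1.0               # normalized constant concentration at the canister surface

def retardation(c):
    # Freundlich isotherm s = k*c**p gives a local retardation factor
    # R(c) = 1 + (rho/eps) * k * p * c**(p-1); constants are illustrative
    k, p, rho_over_eps = 0.5, 0.8, 10.0
    return 1.0 + rho_over_eps * k * p * np.maximum(c, 1e-12) ** (p - 1.0)

dt = 0.2 * dr**2 / D     # well inside the explicit stability limit
face_r = r[:-1] + dr / 2.0           # cell-face radii r_{i+1/2}
for step in range(20000):
    dcdr = np.diff(c) / dr           # face gradients
    div = np.zeros(n)
    div[1:-1] = (face_r[1:] * dcdr[1:] - face_r[:-1] * dcdr[:-1]) / (r[1:-1] * dr)
    c[1:-1] += dt * D * div[1:-1] / retardation(c[1:-1])
    c[0], c[-1] = 1.0, 0.0           # fixed inner source, zero-concentration outer sink

# the release rate into the geosphere is proportional to the outer-boundary gradient
release = -D * (c[-1] - c[-2]) / dr
```

A production code would add decay chains, time-dependent boundary conditions, and an implicit solver; the radial divergence stencil and the local retardation evaluation are the core of the method.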
DOE Office of Scientific and Technical Information (OSTI.GOV)
Young, Mitchell T.; Johnson, Seth R.; Prokopenko, Andrey V.
With the development of a Fortran interface to Trilinos, ForTrilinos, modelers using modern Fortran will be able to provide their codes the capability to use solvers and other capabilities on exascale machines via a straightforward infrastructure that accesses Trilinos. This document outlines what ForTrilinos does and explains briefly how it works. We show it provides general access to packages via an entry point and uses an xml file from Fortran code. With the first release, ForTrilinos will enable Teuchos to take xml parameter lists from Fortran code and set up data structures. It will provide access to linear solvers and eigensolvers. Several examples are provided to illustrate the capabilities in practice. We explain what the user should have already with their code and what Trilinos provides and returns to the Fortran code. We provide information about the build process for ForTrilinos, with a practical example. In future releases, nonlinear solvers, time iteration, advanced preconditioning techniques, and inversion of control (IoC), to enable callbacks to Fortran routines, will be available.
Open source tools and toolkits for bioinformatics: significance, and where are we?
Stajich, Jason E; Lapp, Hilmar
2006-09-01
This review summarizes important work in open-source bioinformatics software that has occurred over the past couple of years. The survey is intended to illustrate how programs and toolkits whose source code has been developed or released under an Open Source license have changed informatics-heavy areas of life science research. Rather than creating a comprehensive list of all tools developed over the last 2-3 years, we use a few selected projects encompassing toolkit libraries, analysis tools, data analysis environments and interoperability standards to show how freely available and modifiable open-source software can serve as the foundation for building important applications, analysis workflows and resources.
RMG An Open Source Electronic Structure Code for Multi-Petaflops Calculations
NASA Astrophysics Data System (ADS)
Briggs, Emil; Lu, Wenchang; Hodak, Miroslav; Bernholc, Jerzy
RMG (Real-space Multigrid) is an open source, density functional theory code for quantum simulations of materials. It solves the Kohn-Sham equations on real-space grids, which allows for natural parallelization via domain decomposition. Either subspace or Davidson diagonalization, coupled with multigrid methods, is used to accelerate convergence. RMG is a cross platform open source package which has been used in the study of a wide range of systems, including semiconductors, biomolecules, and nanoscale electronic devices. It can optionally use GPU accelerators to improve performance on systems where they are available. The recently released versions (>2.0) support multiple GPUs per compute node, have improved performance and scalability, enhanced accuracy and support for additional hardware platforms. New versions of the code are regularly released at http://www.rmgdft.org. The releases include binaries for Linux, Windows and Macintosh systems, automated builds for clusters using cmake, as well as versions adapted to the major supercomputing installations and platforms. Several recent, large-scale applications of RMG will be discussed.
Open source clustering software.
de Hoon, M J L; Imoto, S; Nolan, J; Miyano, S
2004-06-12
We have implemented k-means clustering, hierarchical clustering and self-organizing maps in a single multipurpose open-source library of C routines, callable from other C and C++ programs. Using this library, we have created an improved version of Michael Eisen's well-known Cluster program for Windows, Mac OS X and Linux/Unix. In addition, we generated a Python and a Perl interface to the C Clustering Library, thereby combining the flexibility of a scripting language with the speed of C. The C Clustering Library and the corresponding Python C extension module Pycluster were released under the Python License, while the Perl module Algorithm::Cluster was released under the Artistic License. The GUI code Cluster 3.0 for Windows, Macintosh and Linux/Unix, as well as the corresponding command-line program, were released under the same license as the original Cluster code. The complete source code is available at http://bonsai.ims.u-tokyo.ac.jp/mdehoon/software/cluster. Alternatively, Algorithm::Cluster can be downloaded from CPAN, while Pycluster is also available as part of the Biopython distribution.
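The kind of routine the C Clustering Library exposes (for instance through Pycluster's clustering functions) can be illustrated with a minimal k-means sketch in plain NumPy. This is an independent illustration of the algorithm, not the library's implementation; the function name and the demo data are assumptions made for the example.

```python
import numpy as np

# Minimal k-means sketch: alternate nearest-centroid assignment and centroid
# update until the centroids stop moving. Illustrative only.
def kmeans(data, k, init=None, n_iter=100, seed=0):
    rng = np.random.default_rng(seed)
    if init is None:
        centers = data[rng.choice(len(data), size=k, replace=False)].astype(float)
    else:
        centers = np.asarray(init, dtype=float)
    for _ in range(n_iter):
        # assign each row to its nearest centroid (Euclidean distance)
        d = np.linalg.norm(data[:, None, :] - centers[None, :, :], axis=2)
        labels = d.argmin(axis=1)
        # recompute centroids; keep an empty cluster's centroid where it was
        new = np.array([data[labels == j].mean(axis=0) if np.any(labels == j)
                        else centers[j] for j in range(k)])
        if np.allclose(new, centers):
            break
        centers = new
    return labels, centers

# two well-separated blobs should be recovered as two clusters
pts = np.vstack([np.zeros((20, 2)), np.full((20, 2), 10.0)])
labels, centers = kmeans(pts, k=2, init=pts[[0, -1]])
```

The library versions add the distance measures, weighting, and repeated-start logic that make this practical for gene expression data, but the assignment/update loop is the same.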
Design and validation of Segment--freely available software for cardiovascular image analysis.
Heiberg, Einar; Sjögren, Jane; Ugander, Martin; Carlsson, Marcus; Engblom, Henrik; Arheden, Håkan
2010-01-11
Commercially available software for cardiovascular image analysis often has limited functionality and frequently lacks the careful validation that is required for clinical studies. We have already implemented a cardiovascular image analysis software package and released it as freeware for the research community. However, it was distributed as a stand-alone application and other researchers could not extend it by writing their own custom image analysis algorithms. We believe that the work required to make a clinically applicable prototype can be reduced by making the software extensible, so that researchers can develop their own modules or improvements. Such an initiative might then serve as a bridge between image analysis research and cardiovascular research. The aim of this article is therefore to present the design and validation of a cardiovascular image analysis software package (Segment) and to announce its release in a source code format. Segment can be used for image analysis in magnetic resonance imaging (MRI), computed tomography (CT), single photon emission computed tomography (SPECT) and positron emission tomography (PET). Some of its main features include loading of DICOM images from all major scanner vendors, simultaneous display of multiple image stacks and plane intersections, automated segmentation of the left ventricle, quantification of MRI flow, tools for manual and general object segmentation, quantitative regional wall motion analysis, myocardial viability analysis and image fusion tools. Here we present an overview of the validation results and validation procedures for the functionality of the software. We describe a technique to ensure continued accuracy and validity of the software by implementing and using a test script that tests the functionality of the software and validates the output. The software has been made freely available for research purposes in a source code format on the project home page http://segment.heiberg.se. 
Segment is a well-validated comprehensive software package for cardiovascular image analysis. It is freely available for research purposes provided that relevant original research publications related to the software are cited.
Leadership Class Configuration Interaction Code - Status and Opportunities
NASA Astrophysics Data System (ADS)
Vary, James
2011-10-01
With support from SciDAC-UNEDF (www.unedf.org) nuclear theorists have developed and are continuously improving a Leadership Class Configuration Interaction Code (LCCI) for forefront nuclear structure calculations. The aim of this project is to make state-of-the-art nuclear structure tools available to the entire community of researchers including graduate students. The project includes codes such as NuShellX, MFDn and BIGSTICK that run on a range of computers from laptops to leadership class supercomputers. Codes, scripts, test cases and documentation have been assembled, are under continuous development and are scheduled for release to the entire research community in November 2011. A covering script that accesses the appropriate code and supporting files is under development. In addition, a Data Base Management System (DBMS) that records key information from large production runs and archives results of those runs has been developed (http://nuclear.physics.iastate.edu/info/) and will be released. Following an outline of the project, the code structure, capabilities, the DBMS and current efforts, I will suggest a path forward that would benefit greatly from a significant partnership between researchers who use the codes, code developers and the National Nuclear Data efforts. This research is supported in part by DOE under grant DE-FG02-87ER40371 and grant DE-FC02-09ER41582 (SciDAC-UNEDF).
An approach for coupled-code multiphysics core simulations from a common input
Schmidt, Rodney; Belcourt, Kenneth; Hooper, Russell; ...
2014-12-10
This study describes an approach for coupled-code multiphysics reactor core simulations that is being developed by the Virtual Environment for Reactor Applications (VERA) project in the Consortium for Advanced Simulation of Light-Water Reactors (CASL). In this approach a user creates a single problem description, called the “VERAIn” common input file, to define and set up the desired coupled-code reactor core simulation. A preprocessing step accepts the VERAIn file and generates a set of fully consistent input files for the different physics codes being coupled. The problem is then solved using a single-executable coupled-code simulation tool applicable to the problem, which is built using VERA infrastructure software tools and the set of physics codes required for the problem of interest. The approach is demonstrated by performing an eigenvalue and power distribution calculation of a typical three-dimensional 17 × 17 assembly with thermal–hydraulic and fuel temperature feedback. All neutronics aspects of the problem (cross-section calculation, neutron transport, power release) are solved using the Insilico code suite and are fully coupled to a thermal–hydraulic analysis calculated by the Cobra-TF (CTF) code. The single-executable coupled-code (Insilico-CTF) simulation tool is created using several VERA tools, including LIME (Lightweight Integrating Multiphysics Environment for coupling codes), DTK (Data Transfer Kit), Trilinos, and TriBITS. Parallel calculations are performed on the Titan supercomputer at Oak Ridge National Laboratory using 1156 cores, and a synopsis of the solution results and code performance is presented. Finally, ongoing development of this approach is also briefly described.
Quicklook overview of model changes in Melcor 2.2: Rev 6342 to Rev 9496
DOE Office of Scientific and Technical Information (OSTI.GOV)
Humphries, Larry L.
2017-05-01
MELCOR 2.2 is a significant official release of the MELCOR code with many new models and model improvements. This report provides the code user with a quick review and characterization of new models added, changes to existing models, the effect of code changes during this code development cycle (rev 6342 to rev 9496), and a preview of validation results with this code version. More detailed information is found in the code Subversion logs as well as the User Guide and Reference Manuals.
Nanoparticle based bio-bar code technology for trace analysis of aflatoxin B1 in Chinese herbs.
Yu, Yu-Yan; Chen, Yuan-Yuan; Gao, Xuan; Liu, Yuan-Yuan; Zhang, Hong-Yan; Wang, Tong-Ying
2018-04-01
A novel and sensitive assay for aflatoxin B1 (AFB1) detection has been developed using the bio-bar code assay (BCA). The method relies on polyclonal antibodies encoded with DNA-modified gold nanoparticles (NPs) and monoclonal antibody-modified magnetic microparticles (MMPs), with subsequent detection of the amplified target in the form of a bio-bar code using a fluorescent quantitative polymerase chain reaction (FQ-PCR) detection method. First, NP probes encoded with DNA that was unique to AFB1 and MMP probes with monoclonal antibodies that bind AFB1 specifically were prepared. Then, the MMP-AFB1-NP sandwich compounds were acquired; dehybridization of the oligonucleotides on the nanoparticle surface allows the determination of the presence of AFB1 by identifying the oligonucleotide sequence released from the NP through FQ-PCR detection. The bio-bar code system for detecting AFB1 was established, and the sensitivity limit was about 10^-8 ng/mL, comparable to ELISA assays for detecting the same target, showing that AFB1 can be detected at low attomolar levels with the bio-bar-code amplification approach. This is also the first demonstration of a bio-bar code type assay for the detection of AFB1 in Chinese herbs. Copyright © 2017. Published by Elsevier B.V.
Spartan Release Engagement Mechanism (REM) stress and fracture analysis
NASA Technical Reports Server (NTRS)
Marlowe, D. S.; West, E. J.
1984-01-01
The revised stress and fracture analysis of the Spartan REM hardware for current load conditions and mass properties is presented. The stress analysis was performed using a NASTRAN math model of the Spartan REM adapter, base, and payload. Appendix A contains the material properties, loads, and stress analysis of the hardware. The computer output and model description are in Appendix B. Factors of safety used in the stress analysis were 1.4 on tested items and 2.0 on all other items. Fracture analysis of the items considered fracture critical was accomplished using the MSFC Crack Growth Analysis code. Loads and stresses were obtained from the stress analysis. The fracture analysis notes are located in Appendix A and the computer output in Appendix B. All items analyzed met design and fracture criteria.
2015-09-30
DISTRIBUTION STATEMENT A: Distribution approved for public release; distribution is unlimited. NPS-NRL-Rice-UIUC Collaboration on Navy Atmosphere...portability. There is still a gap in the OCCA support for Fortran programmers who do not have accelerator experience. Activities at Rice/Virginia Tech are...for automated data movement and for kernel optimization using source code analysis and run-time detective work. In this quarter the Rice/Virginia
2013-09-01
2012.0002-IR-EP7-A 12a. DISTRIBUTION / AVAILABILITY STATEMENT Approved for public release; distribution is unlimited 12b. DISTRIBUTION CODE A...extremist web forums is directed at Western audiences and supports Homeland attacks. (U.S. Department of Homeland Security Office of Intelligence and...23 In this context, “before the event.” 24 Yung and Benichou’s paper originally was presented at the 5th Fire
2014-09-01
High Fructose Corn Syrup Diluted 1 to 10 percent by weight 50 to 500 mg/l Slow Release Whey (fresh/powdered) Dissolved (powdered form) or injected...the assessment of remedial progress and functioning. This project also addressed several high priority needs from the Navy Environmental Quality...memory high-performance computing systems. For instance, as of March 2012 the code has been successfully executed on 2 CPUs for an inversion problem
FROG: Time Series Analysis for the Web Service Era
NASA Astrophysics Data System (ADS)
Allan, A.
2005-12-01
The FROG application is part of the next generation Starlink{http://www.starlink.ac.uk} software work (Draper et al. 2005) and released under the GNU Public License{http://www.gnu.org/copyleft/gpl.html} (GPL). Written in Java, it has been designed for the Web and Grid Service era as an extensible, pluggable, tool for time series analysis and display. With an integrated SOAP server the packages functionality is exposed to the user for use in their own code, and to be used remotely over the Grid, as part of the Virtual Observatory (VO).
2014-03-27
VERIFICATION AND VALIDATION OF MONTE CARLO N-PARTICLE CODE 6 (MCNP6) WITH NEUTRON PROTECTION FACTOR... PARTICLE CODE 6 (MCNP6) WITH NEUTRON PROTECTION FACTOR MEASUREMENTS OF AN IRON BOX THESIS Presented to the Faculty Department of Engineering...STATEMENT A. APPROVED FOR PUBLIC RELEASE; DISTRIBUTION UNLIMITED iv AFIT-ENP-14-M-05 VERIFICATION AND VALIDATION OF MONTE CARLO N-PARTICLE CODE 6
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zieb, Kristofer James Ekhart; Hughes, Henry Grady III; Xu, X. George
The release of version 6.2 of the MCNP6 radiation transport code is imminent. To complement the newest release, a summary of the heavy charged particle physics models used in the 1 MeV to 1 GeV energy regime is presented. Several changes have been introduced into the charged particle physics models since the merger of the MCNP5 and MCNPX codes into MCNP6. This article discusses the default models used in MCNP6 for continuous energy loss, energy straggling, and angular scattering of heavy charged particles. Explanations of the physics models’ theories are included as well.
RNAcentral: an international database of ncRNA sequences
Williams, Kelly Porter
2014-10-28
The field of non-coding RNA biology has been hampered by the lack of availability of a comprehensive, up-to-date collection of accessioned RNA sequences. Here we present the first release of RNAcentral, a database that collates and integrates information from an international consortium of established RNA sequence databases. The initial release contains over 8.1 million sequences, including representatives of all major functional classes. A web portal (http://rnacentral.org) provides free access to data, search functionality, cross-references, source code and an integrated genome browser for selected species.
Stochastic Plume Simulations for the Fukushima Accident and the Deep Water Horizon Oil Spill
NASA Astrophysics Data System (ADS)
Coelho, E.; Peggion, G.; Rowley, C.; Hogan, P.
2012-04-01
The Fukushima Dai-ichi power plant suffered damage leading to radioactive contamination of coastal waters. Major issues in characterizing the extent of the affected waters were a poor knowledge of the radiation released to the coastal waters and the rather complex coastal dynamics of the region, not deterministically captured by the available prediction systems. Similarly, during the Gulf of Mexico Deep Water Horizon oil platform accident in April 2010, significant amounts of oil and gas were released from the ocean floor. For this case, issues in mapping and predicting the extent of the affected waters in real-time were a poor knowledge of the actual amounts of oil reaching the surface and the fact that coastal dynamics over the region were not deterministically captured by the available prediction systems. To assess the ocean regions and times that were most likely affected by these accidents while capturing the above sources of uncertainty, ensembles of the Navy Coastal Ocean Model (NCOM) were configured over the two regions (NE Japan and the Northern Gulf of Mexico). For the Fukushima case, tracers were released on each ensemble member; their locations at each instant provided reference positions of water volumes where the signature of water released from the plant could be found. For the Deep Water Horizon oil spill case, each ensemble member was coupled with a diffusion-advection solution to estimate possible scenarios of oil concentrations using perturbed estimates of the released amounts as the source terms at the surface. Stochastic plumes were then defined using a Risk Assessment Code (RAC) analysis that assigns each grid point a number from 1 to 5, determined by the likelihood of having a tracer particle within short range (for the Fukushima case), hence defining the high risk areas and those recommended for monitoring.
For the Oil Spill case the RAC codes were determined by the likelihood of reaching oil concentrations as defined in the Bonn Agreement Oil Appearance Code. The likelihoods were taken in both cases from probability distribution functions derived from the ensemble runs. Results were compared with a control-deterministic solution and checked against available reports to assess their skill in capturing the actual observed plumes and other in-situ data, as well as their relevance for planning surveys and reconnaissance flights for both cases.
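The RAC mapping described above can be sketched in a few lines. The likelihood thresholds below are illustrative placeholders, since the record does not give the actual cut-offs used in the analysis.

```python
from bisect import bisect_right

def rac_code(likelihood, thresholds=(0.05, 0.25, 0.5, 0.75)):
    """Map a likelihood in [0, 1] to a Risk Assessment Code from 1 (low)
    to 5 (high).  Threshold values are illustrative assumptions."""
    return 1 + bisect_right(thresholds, likelihood)

# Fraction of ensemble members with a tracer particle within short range
# of a grid point gives the likelihood at that point.
member_hits = [1, 0, 1, 1, 0, 1, 1, 0, 1, 1]      # 10 ensemble members
likelihood = sum(member_hits) / len(member_hits)  # 0.7
print(rac_code(likelihood))  # 4: a high-risk grid point
```

In practice the likelihood field would come from the ensemble probability distribution functions mentioned above, evaluated at every grid point.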
Petersen, James H.; Gadomski, Dena M.; Poe, Thomas P.
1994-01-01
Juvenile salmonids (Oncorhynchus spp.) that have been killed or injured during dam passage may be highly vulnerable or preferred prey of predators that aggregate below dams. Salmonid loss due to predation will be overestimated using gut content analysis if some prey were dead or moribund when consumed. To examine this issue, field experiments were conducted in the Bonneville Dam tailrace (Columbia River) to compare rates of capture of live and dead juvenile salmonids by northern squawfish (Ptychocheilus oregonensis). Known numbers of coded-wire-tagged live and dead chinook salmon (O. tshawytscha) were released into the tailrace on six nights. Northern squawfish were collected after each release and their gut contents were examined for tags. When 50% of the salmon released were dead, 62% of the salmon consumed by northern squawfish were dead. When 10% of the salmon released were dead, comparable with dam passage mortality, 22% of the tags found in northern squawfish digestive tracts were from dead salmon. These results indicate that predator feeding behavior and prey condition are important considerations when estimating the impact of predation on a prey population.
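The bias this experiment quantifies can be expressed as an odds ratio: how much more readily dead salmon are taken than live ones, given the release mix. A minimal calculation from the reported percentages (an illustration, not a method used in the paper):

```python
def dead_prey_selectivity(frac_released_dead, frac_consumed_dead):
    """Odds ratio comparing dead vs. live prey in predator guts against
    the release mix; values > 1 mean dead prey are taken preferentially."""
    odds_consumed = frac_consumed_dead / (1.0 - frac_consumed_dead)
    odds_released = frac_released_dead / (1.0 - frac_released_dead)
    return odds_consumed / odds_released

# 50% released dead -> 62% of consumed salmon dead:
print(round(dead_prey_selectivity(0.50, 0.62), 2))  # 1.63
# 10% released dead -> 22% of recovered tags from dead salmon:
print(round(dead_prey_selectivity(0.10, 0.22), 2))  # 2.54
```

Both conditions imply dead or moribund salmon are consumed well out of proportion to their availability, which is exactly why raw gut contents overestimate predation loss.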
TRACE/PARCS analysis of the OECD/NEA Oskarshamn-2 BWR stability benchmark
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kozlowski, T.; Downar, T.; Xu, Y.
2012-07-01
On February 25, 1999, the Oskarshamn-2 NPP experienced a stability event which culminated in diverging power oscillations with a decay ratio of about 1.4. The event was successfully modeled by the TRACE/PARCS coupled code system, and further analysis of the event is described in this paper. The results show very good agreement with the plant data, capturing the entire behavior of the transient including the onset of instability, growth of the oscillations (decay ratio) and oscillation frequency. This provides confidence in the prediction of other parameters which are not available from the plant records. The event provides coupled code validation for a challenging BWR stability event, which involves the accurate simulation of neutron kinetics (NK), thermal-hydraulics (TH), and TH/NK coupling. The success of this work has demonstrated the ability of the 3-D coupled systems code TRACE/PARCS to capture the complex behavior of BWR stability events. The problem was released as an international OECD/NEA benchmark, and it is the first benchmark based on measured plant data for a stability event with a decay ratio greater than one. Interested participants are invited to contact the authors for more information. (authors)
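The decay ratio quoted above is the ratio of successive oscillation peak amplitudes; a value above one means the oscillations diverge. A quick illustration on a synthetic signal (not plant data):

```python
import math

def decay_ratio(signal):
    """Mean ratio of successive local maxima of an oscillating signal;
    a value > 1 indicates diverging oscillations."""
    peaks = [signal[i] for i in range(1, len(signal) - 1)
             if signal[i - 1] < signal[i] > signal[i + 1]]
    ratios = [b / a for a, b in zip(peaks, peaks[1:])]
    return sum(ratios) / len(ratios)

# Synthetic diverging oscillation: amplitude grows by 1.4 every period.
period, dt = 2.0, 0.01  # s
signal = [1.4 ** (t / period) * math.sin(2 * math.pi * t / period)
          for t in (i * dt for i in range(1000))]
print(round(decay_ratio(signal), 2))  # 1.4
```

Real plant signals would first be band-pass filtered around the oscillation frequency before peak extraction; that step is omitted here.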
NEQAIRv14.0 Release Notes: Nonequilibrium and Equilibrium Radiative Transport Spectra Program
NASA Technical Reports Server (NTRS)
Brandis, Aaron Michael; Cruden, Brett A.
2014-01-01
NEQAIR v14.0 is the first parallelized version of NEQAIR. Starting from the last version of the code that went through the internal software release process at NASA Ames (NEQAIR 2008), there have been significant updates to the physics in the code and the computational efficiency. NEQAIR v14.0 supersedes NEQAIR v13.2, v13.1 and the suite of NEQAIR2009 versions. These updates have predominantly been performed by Brett Cruden and Aaron Brandis from ERC Inc at NASA Ames Research Center in 2013 and 2014. A new naming convention is being adopted with this current release. The current and future versions of the code will be named NEQAIR vY.X. The Y will refer to a major release increment. Minor revisions and update releases will involve incrementing X. This is to keep NEQAIR more in line with common software release practices. NEQAIR v14.0 is a standalone software tool for line-by-line spectral computation of radiative intensities and/or radiative heat flux, with one-dimensional transport of radiation. In order to accomplish this, NEQAIR v14.0, as in previous versions, requires the specification of distances (in cm), temperatures (in K) and number densities (in parts/cc) of constituent species along lines of sight. Therefore, it is assumed that flow quantities have been extracted from flow fields computed using other tools, such as CFD codes like DPLR or LAURA, and that lines of sight have been constructed and written out in the format required by NEQAIR v14.0. There are two principal modes for running NEQAIR v14.0. In the first mode NEQAIR v14.0 is used as a tool for creating synthetic spectra of any desired resolution (including convolution with a specified instrument/slit function). The first mode is typically exercised in simulating/interpreting spectroscopic measurements of different sources (e.g. shock tube data, plasma torches, etc.). In the second mode, NEQAIR v14.0 is used as a radiative heat flux prediction tool for flight projects. 
Correspondingly, NEQAIR has also been used to simulate the radiance measured on previous flight missions. This report summarizes the database updates, corrections that have been made to the code, changes to input files, parallelization, the current usage recommendations, including test cases, and an indication of the performance enhancements achieved.
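The line-of-sight transport that NEQAIR performs can be illustrated with the standard slab-by-slab formal solution. This is a generic sketch of one-dimensional radiative transport under an assumed constant source function per slab, not NEQAIR's actual numerics or input format.

```python
import math

def line_of_sight_intensity(segments):
    """Integrate intensity toward the observer along a line of sight.
    segments: (path length ds, emission eps, absorption kappa) per slab,
    nearest slab to the observer first.  Each slab contributes its source
    function attenuated by the optical depth in front of it."""
    intensity, tau = 0.0, 0.0
    for ds, eps, kappa in segments:
        if kappa > 0.0:
            # constant source function S = eps/kappa within the slab
            intensity += (eps / kappa) * math.exp(-tau) * (1.0 - math.exp(-kappa * ds))
        else:
            intensity += eps * ds * math.exp(-tau)  # transparent slab
        tau += kappa * ds
    return intensity

# Optically thin uniform slab: intensity ~ emission x path length.
print(line_of_sight_intensity([(1.0, 1.0, 1e-9)] * 10))  # ~10
# Optically thick slab: intensity saturates at S = eps/kappa.
print(line_of_sight_intensity([(1.0, 2.0, 5.0)] * 10))   # ~0.4
```

In NEQAIR the emission and absorption coefficients are themselves computed line-by-line from the temperatures and number densities supplied along the line of sight.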
JEnsembl: a version-aware Java API to Ensembl data systems.
Paterson, Trevor; Law, Andy
2012-11-01
The Ensembl Project provides release-specific Perl APIs for efficient high-level programmatic access to data stored in various Ensembl database schema. Although Perl scripts are perfectly suited for processing large volumes of text-based data, Perl is not ideal for developing large-scale software applications nor embedding in graphical interfaces. The provision of a novel Java API would facilitate type-safe, modular, object-oriented development of new bioinformatics tools with which to access, analyse and visualize Ensembl data. The JEnsembl API implementation provides basic data retrieval and manipulation functionality from the Core, Compara and Variation databases for all species in Ensembl and EnsemblGenomes and is a platform for the development of a richer API to Ensembl data sources. The JEnsembl architecture uses a text-based configuration module to provide evolving, versioned mappings from database schema to code objects. A single installation of the JEnsembl API can therefore simultaneously and transparently connect to current and previous database instances (such as those in the public archive) thus facilitating better analysis repeatability and allowing 'through time' comparative analyses to be performed. Project development, released code libraries, Maven repository and documentation are hosted at SourceForge (http://jensembl.sourceforge.net).
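The versioned schema-to-object mapping at the heart of the JEnsembl architecture can be sketched as a lookup keyed by release ranges. This is a Python stand-in for illustration only: the table and column names and release ranges below are invented, not the real Ensembl schema history or the JEnsembl API.

```python
# Hypothetical mapping table: release range -> schema details.
MAPPINGS = {
    (0, 67): {"gene_table": "gene", "name_col": "display_label"},
    (68, None): {"gene_table": "gene", "name_col": "display_name"},
}

def mapping_for(release):
    """Select the schema-to-object mapping whose release range covers the
    requested release, enabling 'through time' access to archived
    database instances without changing calling code."""
    for (lo, hi), mapping in MAPPINGS.items():
        if release >= lo and (hi is None or release <= hi):
            return mapping
    raise KeyError(f"no mapping for release {release}")

print(mapping_for(75)["name_col"])  # display_name
print(mapping_for(60)["name_col"])  # display_label
```

The same client code can thus query a current database and a public-archive instance side by side, which is what makes repeatable and comparative analyses practical.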
Force Identification from Structural Response
1999-12-01
Adding kinetics and hydrodynamics to the CHEETAH thermochemical code
DOE Office of Scientific and Technical Information (OSTI.GOV)
Fried, L.E., Howard, W.M., Souers, P.C.
1997-01-15
In FY96 we released CHEETAH 1.40, which made extensive improvements on the stability and user friendliness of the code. CHEETAH now has over 175 users in government, academia, and industry. Efforts have also been focused on adding new advanced features to CHEETAH 2.0, which is scheduled for release in FY97. We have added a new chemical kinetics capability to CHEETAH. In the past, CHEETAH assumed complete thermodynamic equilibrium and independence of time. The addition of a chemical kinetic framework will allow for modeling of time-dependent phenomena, such as partial combustion and detonation in composite explosives with large reaction zones. We have implemented a Wood-Kirkwood detonation framework in CHEETAH, which allows for the treatment of nonideal detonations and explosive failure. A second major effort in the project this year has been linking CHEETAH to hydrodynamic codes to yield an improved HE product equation of state. We have linked CHEETAH to 1- and 2-D hydrodynamic codes, and have compared the code to experimental data. 15 refs., 13 figs., 1 tab.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Vance, J.N.; Holderness, J.H.; James, D.W.
1992-12-01
Waste stream scaling factors based on sampling programs are vulnerable to one or more of the following factors: sample representativeness, analytic accuracy, and measurement sensitivity. As an alternative to sample analyses or as a verification of the sampling results, this project proposes the use of the RADSOURCE code, which accounts for the release of fuel-source radionuclides. Once the release rates of these nuclides from fuel are known, the code develops scaling factors for waste streams based on easily measured Cobalt-60 (Co-60) and Cesium-137 (Cs-137). The project team developed mathematical models to account for the appearance rate of 10CFR61 radionuclides in reactor coolant. They based these models on the chemistry and nuclear physics of the radionuclides involved. Next, they incorporated the models into a computer code that calculates plant waste stream scaling factors based on reactor coolant gamma-isotopic data. Finally, the team performed special sampling at 17 reactors to validate the models in the RADSOURCE code.
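The scaling-factor idea is simple arithmetic once nuclide inventories are known: hard-to-measure nuclides are ratioed to an easily gamma-measured key nuclide. A sketch with invented activities (illustrative values only, not plant data or RADSOURCE output):

```python
# Hypothetical activities in a reference waste stream (Bq/g).
reference_stream = {"Co-60": 4.0e3, "Cs-137": 2.0e3,
                    "Ni-63": 1.2e4, "Sr-90": 6.0e1}

def scaling_factor(nuclide, key_nuclide, activities):
    """Ratio of a hard-to-measure nuclide's activity to an easily
    measured key nuclide (Co-60 or Cs-137) in the same stream."""
    return activities[nuclide] / activities[key_nuclide]

# Estimate Ni-63 in another stream from its measured Co-60 activity.
sf = scaling_factor("Ni-63", "Co-60", reference_stream)  # 3.0
print(sf * 1.5e3)  # estimated Ni-63 (Bq/g) where Co-60 = 1.5e3 Bq/g
```

The value of a code like RADSOURCE is in supplying defensible numerators for these ratios from fuel release models, rather than relying solely on difficult radiochemical sampling.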
NASA Astrophysics Data System (ADS)
Prabhu Verleker, Akshay; Fang, Qianqian; Choi, Mi-Ran; Clare, Susan; Stantz, Keith M.
2015-03-01
The purpose of this study is to develop an alternate empirical approach to estimate near-infrared (NIR) photon propagation and quantify optically induced drug release in brain metastasis, without relying on computationally expensive Monte Carlo techniques (the gold standard). Targeted drug delivery with optically induced drug release is a noninvasive means to treat cancers and metastasis. This study is part of a larger project to treat brain metastasis by delivering lapatinib-drug-nanocomplexes and activating NIR-induced drug release. The empirical model was developed using a weighted approach to estimate photon scattering in tissues and calibrated using a GPU-based 3D Monte Carlo. The empirical model was developed and tested against Monte Carlo in optical brain phantoms for pencil beams (width 1 mm) and broad beams (width 10 mm). The empirical algorithm was tested against the Monte Carlo for different albedos, along with the diffusion equation, in simulated brain phantoms resembling white matter (μs' = 8.25 mm-1, μa = 0.005 mm-1) and gray matter (μs' = 2.45 mm-1, μa = 0.035 mm-1) at a wavelength of 800 nm. The goodness of fit between the two models was determined using the coefficient of determination (R-squared analysis). Preliminary results show the empirical algorithm matches Monte Carlo simulated fluence over a wide range of albedo (0.7 to 0.99), while the diffusion equation fails for lower albedo. The photon fluence generated by the empirical code matched the Monte Carlo in homogeneous phantoms (R2 = 0.99). While the GPU-based Monte Carlo achieved 300X acceleration compared to earlier CPU-based models, the empirical code is 700X faster than the Monte Carlo for a typical super-Gaussian laser beam.
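The R-squared comparison used to validate the empirical model against the Monte Carlo reference can be sketched directly. The fluence profile below is a synthetic stand-in, not the study's data.

```python
import math

def r_squared(reference, model):
    """Coefficient of determination of model values against a reference
    (here standing in for: empirical fluence vs. Monte Carlo)."""
    mean_ref = sum(reference) / len(reference)
    ss_res = sum((r - m) ** 2 for r, m in zip(reference, model))
    ss_tot = sum((r - mean_ref) ** 2 for r in reference)
    return 1.0 - ss_res / ss_tot

depth = [0.2 * i for i in range(50)]               # mm
monte_carlo = [math.exp(-0.8 * d) for d in depth]  # synthetic fluence decay
empirical = [f * (1.0 + 0.01 * math.sin(d))        # ~1% model deviation
             for f, d in zip(monte_carlo, depth)]
print(r_squared(monte_carlo, empirical) > 0.99)  # True
```

An R2 near 1 over many depths and albedos is what justifies substituting the fast empirical code for the Monte Carlo in routine dosimetry.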
1993-03-01
Education Requirements of the Air Force Wargaming Center. Thesis presented to the Faculty of the School of Engineering of the Air Force Institute of Technology. Student: Capt Scott E. Goehring, USAF; class GST-93M-04; AFIT/GST/ENS/93M-04. Approved for public release; distribution unlimited.
2013-05-01
... contract or a PhD dissertation typically are a "proof-of-concept" code base that can only read a single set of inputs and are not designed ... AFRL-RX-WP-TR-2013-0210, COLLABORATIVE RESEARCH AND DEVELOPMENT (CR&D) III, Task Order 0090: Image Processing Framework. Approved for public release; distribution unlimited. STINFO COPY. AIR FORCE RESEARCH LABORATORY.
NONCODE v2.0: decoding the non-coding.
He, Shunmin; Liu, Changning; Skogerbø, Geir; Zhao, Haitao; Wang, Jie; Liu, Tao; Bai, Baoyan; Zhao, Yi; Chen, Runsheng
2008-01-01
The NONCODE database is an integrated knowledge database designed for the analysis of non-coding RNAs (ncRNAs). Since NONCODE was first released 3 years ago, the number of known ncRNAs has grown rapidly, and there is growing recognition that ncRNAs play important regulatory roles in most organisms. In the updated version of NONCODE (NONCODE v2.0), the number of collected ncRNAs has reached 206 226, including a wide range of microRNAs, Piwi-interacting RNAs and mRNA-like ncRNAs. The improvements brought to the database include not only new and updated ncRNA data sets, but also an incorporation of BLAST alignment search service and access through our custom UCSC Genome Browser. NONCODE can be found under http://www.noncode.org or http://noncode.bioinfo.org.cn.
Facility Targeting, Protection and Mission Decision Making Using the VISAC Code
NASA Technical Reports Server (NTRS)
Morris, Robert H.; Sulfredge, C. David
2011-01-01
The Visual Interactive Site Analysis Code (VISAC) has been used by DTRA and several other agencies to aid in targeting facilities and to predict the associated collateral effects for the go/no-go mission decision-making process. VISAC integrates the three concepts of target geometric modeling, damage assessment capabilities, and an event/fault tree methodology for evaluating accident/incident consequences. It can analyze a variety of accidents/incidents at nuclear or industrial facilities, ranging from simple component sabotage to an attack with military or terrorist weapons. For nuclear facilities, VISAC predicts the facility damage, estimated downtime, and the amount and timing of any radionuclides released. Used in conjunction with DTRA's HPAC code, VISAC can also analyze transport and dispersion of the radionuclides, levels of contamination of the surrounding area, and the population at risk. VISAC has also been used by the NRC to aid in the development of protective measures for nuclear facilities that may be subjected to attacks by car/truck bombs.
Recommended Parameter Values for GENII Modeling of Radionuclides in Routine Air and Water Releases
DOE Office of Scientific and Technical Information (OSTI.GOV)
Snyder, Sandra F.; Arimescu, Carmen; Napier, Bruce A.
The GENII v2 code is used to estimate dose to individuals or populations from the release of radioactive materials into air or water. Numerous parameter values are required as input to this code. User-defined parameters cover the spectrum from chemical data, meteorological data, agricultural data, and behavioral data. This document is a summary of parameter values that reflect conditions in the United States. Reasonable regional and age-dependent data are summarized. Data availability and quality vary. The set of parameters described addresses scenarios for chronic air emissions or chronic releases to public waterways. Considerations for the special tritium and carbon-14 models are briefly addressed. GENII v2.10.0 is the current software version that this document supports.
Bedell, Precious; Wilson, John L; White, Ann Marie; Morse, Diane S
Re-entry community health workers (CHWs) are individuals who connect diverse community residents at risk for chronic health issues such as Hepatitis C virus and cardiovascular disease with post-prison healthcare and re-entry services. While the utilization of CHWs has been documented in other marginalized populations, there is little knowledge surrounding the work of re-entry CHWs with individuals released from incarceration. Specifically, CHWs' experiences and perceptions of the uniqueness of their efforts to link individuals to healthcare have not been documented systematically. This study explored what is meaningful to formerly incarcerated CHWs as they work with released individuals. The authors conducted a qualitative thematic analysis of twelve meaningful experiences written by re-entry CHWs employed by the Transitions Clinic Network who attended a CHW training program during a conference in San Francisco, CA. Study participants were encouraged to recount meaningful CHW experiences and motivations for working with re-entry populations in a manner consistent with journal-based qualitative analysis techniques. Narratives were coded using an iterative process and subsequently organized according to themes in ATLAS.ti. Study personnel came to consensus on coding and major themes. The narratives highlighted thought processes and meaning related to re-entry CHWs' work helping patients navigate complex social services for successful re-integration. Six major themes emerged from the analysis: advocacy and support, empathy relating to a personal history of incarceration, giving back, professional satisfaction and responsibilities, resiliency and educational advancement, and experiences of social inequities related to race. Re-entry CHWs described former incarceration, employment, and social justice as sources of meaning in helping justice-involved individuals receive effective, efficient, and high-quality healthcare.
Health clinics for individuals released from incarceration provide a unique setting that links high risk patients to needed care and professionalizes career opportunities for formerly incarcerated re-entry CHWs. The commonality of past correctional involvement is a strong indicator of the meaning and perceived effectiveness re-entry CHWs find in working with individuals leaving prison. Expansion of reimbursable visits with re-entry CHWs in transitions clinics designed for re-entering individuals is worthy of further consideration.
Extended capability of the integrated transport analysis suite, TASK3D-a, for LHD experiment
NASA Astrophysics Data System (ADS)
Yokoyama, M.; Seki, R.; Suzuki, C.; Sato, M.; Emoto, M.; Murakami, S.; Osakabe, M.; Tsujimura, T. Ii.; Yoshimura, Y.; Ido, T.; Ogawa, K.; Satake, S.; Suzuki, Y.; Goto, T.; Ida, K.; Pablant, N.; Gates, D.; Warmer, F.; Vincenzi, P.; Simulation Reactor Research Project, Numerical; LHD Experiment Group
2017-12-01
The integrated transport analysis suite, TASK3D-a (Analysis), has been developed to be capable of routine whole-discharge analyses of plasmas confined in three-dimensional (3D) magnetic configurations such as the LHD. The routine dynamic energy balance analysis for NBI-heated plasmas was made possible in the first version, released in September 2012. The suite has been further extended through implementing additional modules for neoclassical transport and ECH deposition for 3D configurations. A module has also been added for creating systematic data for the International Stellarator-Heliotron Confinement and Profile Database. Improvement of neutral beam injection modules for multiple-ion-species plasmas and loose coupling with a large simulation code are also highlights of recent developments.
A mathematical model of diffusion from a steady source of short duration in a finite mixing layer
NASA Astrophysics Data System (ADS)
Bianconi, Roberto; Tamponi, Matteo
This paper presents an analytical unsteady-state solution to the atmospheric dispersion equation for substances subject to chemical-physical decay in a finite mixing layer, for releases of short duration. This solution is suitable for describing critical events involving accidental release of toxic, flammable or explosive substances. To implement the solution, the Modello per Rilasci a Breve Termine (MRBT) code has been developed; results of a sensitivity analysis on some of its characteristic parameters are presented. Moreover, some examples of application to the calculation of exposure to toxic substances and to the determination of the ignition field of flammable substances are described. Finally, the mathematical model described can be used to interpret the phenomenon of pollutant accumulation.
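The finite-mixing-layer geometry is classically handled with image sources reflecting at the ground and at the top of the layer; the sketch below shows that construction for an instantaneous release with first-order decay. It is a generic textbook solution under assumed parameter values, not the MRBT formulation itself.

```python
import math

def concentration(z, t, z0, H, K, Q=1.0, lam=1e-4, n_images=20):
    """Vertical concentration from an instantaneous unit-area release at
    height z0 in a mixing layer of depth H with reflecting boundaries at
    z = 0 and z = H, via the image-source expansion, multiplied by
    exp(-lam*t) for chemical-physical decay.  Values are illustrative."""
    pref = Q / math.sqrt(4.0 * math.pi * K * t)
    total = 0.0
    for n in range(-n_images, n_images + 1):
        total += math.exp(-((z - z0 + 2 * n * H) ** 2) / (4.0 * K * t))
        total += math.exp(-((z + z0 + 2 * n * H) ** 2) / (4.0 * K * t))
    return pref * total * math.exp(-lam * t)

# At long times the layer is well mixed: concentration -> Q*exp(-lam*t)/H.
H, K, z0, t = 1000.0, 10.0, 50.0, 1.0e5  # m, m^2/s, m, s
print(concentration(500.0, t, z0, H, K))  # ~ exp(-10)/1000
```

A short-duration (rather than instantaneous) source is obtained by convolving this kernel over the emission interval, which is the regime the MRBT solution addresses.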
DOE Office of Scientific and Technical Information (OSTI.GOV)
Simmonds, M. J.; Yu, J. H.; Wang, Y. Q.
Simulating the implantation and thermal desorption evolution in a reaction-diffusion model requires solving a set of coupled differential equations that describe the trapping and release of atomic species in Plasma Facing Materials (PFMs). These fundamental equations are well outlined by the Tritium Migration Analysis Program (TMAP), which can model systems with no more than three active traps per atomic species. To overcome this limitation, we have developed a Pseudo Trap and Temperature Partition (PTTP) scheme allowing us to lump multiple inactive traps into one pseudo trap, simplifying the system of equations to be solved. For all temperatures, we show the trapping of atoms from solute is exactly accounted for when using a pseudo trap. However, a single effective pseudo-trap energy cannot well replicate the release from multiple traps, each with its own detrapping energy. Atoms held in a high-energy trap will remain trapped at relatively low temperatures, and thus there is a temperature range in which release from high-energy traps is effectively inactive. By partitioning the temperature range into segments, a pseudo trap can be defined for each segment to account for multiple high-energy traps that are actively trapping but are effectively not releasing atoms. With increasing temperature, as in controlled thermal desorption, the lowest-energy trap is nearly emptied and can be removed from the set of coupled equations, while the next higher-energy trap becomes an actively releasing trap. Each segment is thus calculated sequentially, with the last time step of a given segment's solution used as the initial input for the next segment, as only the pseudo and actively releasing traps are modeled. This PTTP scheme is then applied to experimental thermal desorption data for tungsten (W) samples damaged with heavy ions, which display six distinct release peaks during thermal desorption.
Without modifying the TMAP7 source code, the PTTP scheme is shown to successfully model the D retention in all six traps. In conclusion, we demonstrate the full reconstruction from the plasma implantation phase through the controlled thermal desorption phase with detrapping energies near 0.9, 1.1, 1.4, 1.7, 1.9 and 2.1 eV for a W sample damaged at room temperature.
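The trap-release kinetics underlying this scheme are first-order desorption with Arrhenius detrapping. The sketch below reproduces the qualitative behaviour the PTTP scheme exploits, namely that higher-energy traps release at distinctly higher temperatures and are effectively inert until then. The attempt frequency, ramp rate and energies are illustrative, not the paper's fitted values, and no pseudo-trap lumping is performed.

```python
import math

def release_peak_temperatures(energies_eV, ramp=1.0, T0=300.0, T1=1300.0,
                              nu=1.0e13, dt=0.05):
    """For independent traps obeying dn/dt = -nu*exp(-E/kT)*n under a
    linear temperature ramp (K/s), return the temperature at which each
    trap's release flux peaks.  The per-step update uses the exact
    exponential decay, so the integration is unconditionally stable."""
    k_B = 8.617e-5                          # Boltzmann constant, eV/K
    n = [1.0] * len(energies_eV)            # normalized trapped inventories
    best = [(0.0, T0)] * len(energies_eV)   # (max flux, peak temperature)
    T = T0
    while T < T1:
        for i, E in enumerate(energies_eV):
            rate = nu * math.exp(-E / (k_B * T))
            flux = rate * n[i]
            if flux > best[i][0]:
                best[i] = (flux, T)
            n[i] *= math.exp(-rate * dt)    # exact decay over dt
        T += ramp * dt
    return [Tp for _, Tp in best]

# Distinct detrapping energies give distinct, ordered release peaks.
print(release_peak_temperatures([0.9, 1.4, 1.9]))
```

The temperature-partition trick follows from this separation of peaks: while a low-energy trap is releasing, the higher-energy traps only capture atoms and can safely be lumped into a single pseudo trap for that segment.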
Simmonds, M. J.; Yu, J. H.; Wang, Y. Q.; ...
2018-06-04
Predicting the Where and the How Big of Solar Flares
NASA Astrophysics Data System (ADS)
Barnes, Graham; Leka, K. D.; Gilchrist, Stuart
2017-08-01
The approach to predicting solar flares generally characterizes global properties of a solar active region, for example the total magnetic flux or the total length of a sheared magnetic neutral line, and compares new data (from which to make a prediction) to similar observations of active regions and their associated propensity for flare production. We take here a different tack, examining solar active regions in the context of their energy storage capacity. Specifically, we characterize not the region as a whole, but summarize the energy-release prospects of different sub-regions within, using a sub-area analysis of the photospheric boundary, the CFIT non-linear force-free extrapolation code, and the Minimum Current Corona model. We present here early results from this approach whose objective is to understand the different pathways available for regions to release stored energy, thus eventually providing better estimates of the where (what sub-areas are storing how much energy) and the how big (how much energy is stored, and how much is available for release) of solar flares.
Annual Report of the ECSU Home-Institution Support Program (1993)
1993-09-30
summer of 1992. Stephanie plans to attend graduate school at the University of Alabama at Birmingham. 3. Deborah Jones has attended the ISSP program for ... computer equipment. Component #2: a visiting lecturer series. Component #3: student pay & faculty release time. Component #4: student/sponsor travel program ... S.O. CODE: 1133. DISBURSING CODE: N00179. AGO CODE: N66005. CAGE CODE: OJLKO. PART I: A succinct narrative which should ...
Parser for Sabin-to-Mahoney Transition Model of Quasispecies Replication
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ecale Zhou, Carol
2016-01-03
This code is a data parser for preparing output from the Qspp agent-based stochastic simulation model for plotting in Excel. This code is specific to a set of simulations that were run for the purpose of preparing data for a publication. It is necessary to make this code open-source in order to publish the model code (Qspp), which has already been released. There is a necessity of assuring that results from using Qspp for a publication ...
Subgroup A : nuclear model codes report to the Sixteenth Meeting of the WPEC
DOE Office of Scientific and Technical Information (OSTI.GOV)
Talou, P.; Chadwick, M. B.; Dietrich, F. S.
2004-01-01
The Subgroup A activities focus on the development of nuclear reaction models and codes, used in evaluation work for nuclear reactions from the unresolved energy region up to the pion production threshold, and for target nuclides from the low teens and heavier. Much of the effort is devoted by each participant to the continuing development of their own institution's codes. Progress in this arena is reported in detail for each code in the present document. EMPIRE-II is publicly accessible. The release of the TALYS code has been announced for the ND2004 Conference in Santa Fe, NM, October 2004. McGNASH is still under development and is not expected to be released in the very near future. In addition, Subgroup A members have demonstrated a growing interest in working on common modeling and code capabilities, which would significantly reduce the amount of duplicated work, help manage efficiently the growing lines of existing codes, and render code intercomparison much easier. A recent and important activity of Subgroup A has therefore been to develop the framework and the first bricks of the ModLib library, which is constituted of mostly independent pieces of code written in Fortran 90 (and above) to be used in existing and future nuclear reaction codes. Significant progress in the development of ModLib has been made during the past year. Several physics modules have been added to the library, and a few more have been planned in detail for the coming year.
Capabilities of LEWICE 1.6 and Comparison With Experimental Data
DOT National Transportation Integrated Search
1996-01-01
A research project is underway at NASA Lewis to produce a computer code which can accurately predict ice growth under any meteorological conditions for any aircraft surface. The most recent release of this code is LEWICE 1.6. This paper will demonstrate ...
Code of Sustainable Practice in Occupational and Environmental Health and Safety for Corporations.
Castleman, Barry; Allen, Barbara; Barca, Stefania; Bohme, Susanna Rankin; Henry, Emmanuel; Kaur, Amarjit; Massard-Guilbaud, Genvieve; Melling, Joseph; Menendez-Navarro, Alfredo; Renfrew, Daniel; Santiago, Myrna; Sellers, Christopher; Tweedale, Geoffrey; Zalik, Anna; Zavestoski, Stephen
2008-01-01
At a conference held at Stony Brook University in December 2007, "Dangerous Trade: Histories of Industrial Hazard across a Globalizing World," participants endorsed a Code of Sustainable Practice in Occupational and Environmental Health and Safety for Corporations. The Code outlines practices that would ensure corporations enact the highest health and environmentally protective measures in all the locations in which they operate. Corporations should observe international guidelines on occupational exposure to air contaminants, plant safety, air and water pollutant releases, hazardous waste disposal practices, remediation of polluted sites, public disclosure of toxic releases, product hazard labeling, sale of products for specific uses, storage and transport of toxic intermediates and products, corporate safety and health auditing, and corporate environmental auditing. Protective measures in all locations should be consonant with the most protective measures applied anywhere in the world, and should apply to the corporations' subsidiaries, contractors, suppliers, distributors, and licensees of technology. Key words: corporations, sustainability, environmental protection, occupational health, code of practice.
A Sequential Fluid-mechanic Chemical-kinetic Model of Propane HCCI Combustion
DOE Office of Scientific and Technical Information (OSTI.GOV)
Aceves, S M; Flowers, D L; Martinez-Frias, J
2000-11-29
We have developed a methodology for predicting combustion and emissions in a Homogeneous Charge Compression Ignition (HCCI) engine. This methodology combines a detailed fluid mechanics code with a detailed chemical kinetics code. Instead of directly linking the two codes, which would require an extremely long computational time, the methodology consists of first running the fluid mechanics code to obtain temperature profiles as a function of time. These temperature profiles are then used as input to a multi-zone chemical kinetics code. The advantage of this procedure is that a small number of zones (10) is enough to obtain accurate results. This procedure achieves the benefits of linking the fluid mechanics and the chemical kinetics codes with a great reduction in computational effort, to a level that can be handled with current computers. The success of this procedure is in large part a consequence of the fact that for much of the compression stroke the chemistry is inactive and thus has little influence on fluid mechanics and heat transfer. Then, when chemistry is active, combustion is rather sudden, leaving little time for interaction between chemistry and fluid mixing and heat transfer. This sequential methodology has been capable of explaining the main characteristics of HCCI combustion observed in experiments. In this paper, we use our model to explore an HCCI engine running on propane. The paper compares experimental and numerical pressure traces, heat release rates, and hydrocarbon and carbon monoxide emissions. The results show excellent agreement, even in parameters that are difficult to predict, such as chemical heat release rates. Carbon monoxide emissions are reasonably well predicted, even though it is intrinsically difficult to make good predictions of CO emissions in HCCI engines. The paper includes a sensitivity study on the effect of the heat transfer correlation on the results of the analysis. Importantly, the paper also shows a numerical study on how parameters such as swirl rate, crevices, and ceramic walls could help in reducing HC and CO emissions from HCCI engines.
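The two-stage methodology described in this abstract (a fluid-mechanics pre-run producing per-zone temperature histories, then independent chemistry in each zone) can be sketched in miniature. This is a toy: the zone temperatures are mocked with a linear compression law and the chemistry is a single Arrhenius ignition integral (Livengood-Wu), not the detailed kinetics code used in the paper; all rate constants and temperatures are illustrative assumptions.

```python
# Stage 1 (mocked): CFD would supply each zone's temperature history;
# here a linear heating law with a per-zone offset stands in for it.
# Stage 2: advance a one-step Arrhenius ignition integral per zone,
# independently, as the sequential methodology allows.
import math

def zone_temperature(t, offset, t_end=3e-3, T0=700.0, dT=450.0):
    """Mock CFD output: zone temperature [K] during compression."""
    return T0 + offset + dT * min(t / t_end, 1.0)

def ignition_time(offset, A=2.0e9, Ea_over_R=15000.0, dt=1e-6, t_end=3e-3):
    """Integrate the Livengood-Wu knock integral until it reaches 1."""
    s, t = 0.0, 0.0
    while t < t_end and s < 1.0:
        T = zone_temperature(t, offset, t_end=t_end)
        s += A * math.exp(-Ea_over_R / T) * dt
        t += dt
    return t  # time at which the zone ignites (or t_end if it never does)

# ~10 zones, as in the paper: cooler near-wall zones ignite later,
# giving the staged heat release characteristic of HCCI.
ignition_times = [ignition_time(offset) for offset in range(0, 100, 10)]
```

The point of the sketch is the decoupling: because each zone's chemistry sees only its pre-computed temperature history, the zones can be solved one after another instead of inside the flow solver.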
The historical, ethical, and legal background of human-subjects research.
Rice, Todd W
2008-10-01
The current system of human-subject-research oversight and protections has developed over the last 5 decades. The principles of conducting human research were first developed as the Nuremberg code to try Nazi war criminals. The 3 basic elements of the Nuremberg Code (voluntary informed consent, favorable risk/benefit analysis, and right to withdraw without repercussions) became the foundation for subsequent ethical codes and research regulations. In 1964 the World Medical Association released the Declaration of Helsinki, which built on the principles of the Nuremberg Code. Numerous research improprieties between 1950 and 1974 in the United States prompted Congressional deliberations about human-subject-research oversight. Congress's first legislation to protect the rights and welfare of human subjects was the National Research Act of 1974, which created the National Commission for Protection of Human Subjects of Biomedical and Behavioral Research, which issued the Belmont Report. The Belmont Report stated 3 fundamental principles for conducting human-subjects research: respect for persons, beneficence, and justice. The Office of Human Research Protections oversees Title 45, Part 46 of the Code for Federal Regulations, which pertains to human-subjects research. That office indirectly oversees human-subjects research through local institutional review boards (IRB). Since their inception, the principles of conducting human research, IRBs, and the Code for Federal Regulations have all advanced substantially. This paper describes the history and current status of human-subjects-research regulations.
Review of heavy charged particle transport in MCNP6.2
NASA Astrophysics Data System (ADS)
Zieb, K.; Hughes, H. G.; James, M. R.; Xu, X. G.
2018-04-01
The release of version 6.2 of the MCNP6 radiation transport code is imminent. To complement the newest release, a summary of the heavy charged particle physics models used in the 1 MeV to 1 GeV energy regime is presented. Several changes have been introduced into the charged particle physics models since the merger of the MCNP5 and MCNPX codes into MCNP6. This paper discusses the default models used in MCNP6 for continuous energy loss, energy straggling, and angular scattering of heavy charged particles. Explanations of the physics models' theories are included as well.
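The continuous energy-loss model summarized in this abstract rests, at its core, on a stopping-power evaluation. A minimal sketch of the uncorrected Bethe formula for a proton in water follows; this omits the straggling, angular scattering, and density/shell corrections that the production code includes, and the constants are the standard published values, not anything specific to MCNP6.2.

```python
import math

ME_C2 = 0.510999   # electron rest energy [MeV]
K     = 0.307075   # 4*pi*N_A*r_e^2*m_e*c^2 [MeV cm^2/mol]

def bethe_stopping_power(T_kin, M=938.272, z=1, Z_over_A=0.5551, I=75e-6):
    """Mass stopping power -dE/d(rho x) [MeV cm^2/g] for a heavy charged
    particle of kinetic energy T_kin [MeV] and rest mass M [MeV/c^2]."""
    gamma = 1.0 + T_kin / M
    beta2 = 1.0 - 1.0 / gamma**2
    bg2 = beta2 * gamma**2
    # Maximum energy transfer to a free electron in a single collision.
    t_max = 2 * ME_C2 * bg2 / (1 + 2 * gamma * ME_C2 / M + (ME_C2 / M)**2)
    log_term = 0.5 * math.log(2 * ME_C2 * bg2 * t_max / I**2)
    return K * z**2 * Z_over_A / beta2 * (log_term - beta2)

# 100 MeV proton in water; the tabulated PSTAR value is about 7.29 MeV cm^2/g.
dedx = bethe_stopping_power(100.0)
```

Even this bare formula reproduces the tabulated value to within a percent or so at this energy, which is why the corrections matter mainly at the low and high ends of the 1 MeV to 1 GeV regime the paper covers.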
DOE Office of Scientific and Technical Information (OSTI.GOV)
Blaylock, Myra L.; LaFleur, Chris Bensdotter; Muna, Alice Baca
Safety standards development for maintenance facilities of liquid and compressed natural gas fueled vehicles is required to ensure proper facility design and operating procedures. Standards development organizations are utilizing risk-informed concepts to develop natural gas vehicle (NGV) codes and standards so that maintenance facilities meet acceptable risk levels. The present report summarizes Phase II work for existing NGV repair facility code requirements and highlights inconsistencies that need quantitative analysis of their effectiveness. A Hazard and Operability study was performed to identify key scenarios of interest using risk ranking. Detailed simulations and modeling were performed to estimate the location and behavior of natural gas releases based on these scenarios. Specific code conflicts were identified, and ineffective code requirements were highlighted and resolutions proposed. These include the basis of the ventilation rate (floor area or room volume), as well as a ceiling offset that seems ineffective at protecting against flammable gas concentrations. The authors gratefully acknowledge Bill Houf (SNL, retired) for his assistance with the set-up and post-processing of the numerical simulations, Doug Horne (retired) for his helpful discussions, and the support of the Clean Cities program of DOE's Vehicle Technologies Office.
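The report's point about the ventilation-rate basis (floor area versus room volume) can be made concrete with a toy comparison. The rates below (0.75 cfm/ft² and 5 air changes per hour) are placeholder assumptions for illustration, not values taken from the report or from any specific code edition.

```python
def flow_by_area(floor_area_ft2, rate_cfm_per_ft2=0.75):
    """Required flow if the code bases ventilation on floor area [cfm]."""
    return floor_area_ft2 * rate_cfm_per_ft2

def flow_by_volume(volume_ft3, air_changes_per_hour=5.0):
    """Required flow if the code bases ventilation on room volume [cfm]."""
    return volume_ft3 * air_changes_per_hour / 60.0

# Same 50 ft x 100 ft bay at two ceiling heights: the area basis ignores
# height, while the volume basis scales with it, which is the kind of
# inconsistency the report flags for quantitative analysis.
area_ft2 = 50 * 100
q_area = flow_by_area(area_ft2)          # 3750.0 cfm at any height
q_low  = flow_by_volume(area_ft2 * 15)   # 6250.0 cfm at 15 ft ceilings
q_high = flow_by_volume(area_ft2 * 30)   # 12500.0 cfm at 30 ft ceilings
```

For tall repair bays the two bases diverge by a factor that grows with ceiling height, so which basis a code adopts directly changes the required fan capacity.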
PolyPole-1: An accurate numerical algorithm for intra-granular fission gas release
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pizzocri, D.; Rabiti, C.; Luzzi, L.
2016-09-01
This paper describes the development of a new numerical algorithm (called PolyPole-1) to efficiently solve the equation for intra-granular fission gas release in nuclear fuel. The work was carried out in collaboration with Politecnico di Milano and the Institute for Transuranium Elements. The PolyPole-1 algorithm is being implemented in INL's fuel performance code BISON as part of BISON's fission gas release model. The transport of fission gas from within the fuel grains to the grain boundaries (intra-granular fission gas release) is a fundamental controlling mechanism of fission gas release and gaseous swelling in nuclear fuel. Hence, accurate numerical solution of the corresponding mathematical problem needs to be included in fission gas behaviour models used in fuel performance codes. Under the assumption of equilibrium between trapping and resolution, the process can be described mathematically by a single diffusion equation for the gas atom concentration in a grain. In this work, we propose a new numerical algorithm (PolyPole-1) to efficiently solve the fission gas diffusion equation in time-varying conditions. The PolyPole-1 algorithm is based on the analytic modal solution of the diffusion equation for constant conditions, with the addition of polynomial corrective terms that embody the information on the deviation from constant conditions. The new algorithm is verified by comparing the results to a finite difference solution over a large number of randomly generated operation histories. Furthermore, comparison to state-of-the-art algorithms used in fuel performance codes demonstrates that the accuracy of the PolyPole-1 solution is superior to other algorithms, with similar computational effort. Finally, the concept of PolyPole-1 may be extended to the solution of the general problem of intra-granular fission gas diffusion during non-equilibrium trapping and resolution, which will be the subject of future work.
2015-03-01
Approved for public release; distribution is unlimited. [Report documentation page residue; recoverable thesis title, truncated: "Undersea Communications Between Submarines and Unmanned Undersea Vehicles in a ..."]
'Skidding' of the CRRES G-9 barium release
NASA Technical Reports Server (NTRS)
Huba, J. D.; Mitchell, H. G.; Fedder, J. A.; Bernhardt, P. A.
1992-01-01
A simulation study and experimental data of the CRRES G-9 ionospheric barium release are presented. The simulation study is based on a 2D electrostatic code that incorporates time-dependent coupling to the background plasma. It is shown that the densest portion of the barium ion cloud 'skids' about 15 km within the first three seconds following the release, consistent with the optical data analyses.
Posttest calculations of bundle quench test CORA-13 with ATHLET-CD
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bestele, J.; Trambauer, K.; Schubert, J.D.
Gesellschaft fuer Anlagen- und Reaktorsicherheit is developing, in cooperation with the Institut fuer Kernenergetik und Energiesysteme, Stuttgart, the system code Analysis of Thermalhydraulics of Leaks and Transients with Core Degradation (ATHLET-CD). The code consists of detailed models of the thermal hydraulics of the reactor coolant system. This thermo-fluid dynamics module is coupled with modules describing the early phase of the core degradation, like cladding deformation, oxidation and melt relocation, and the release and transport of fission products. The assessment of the code is being done by the analysis of separate effect tests, integral tests, and plant events. The code will be applied to the verification of severe accident management procedures. The out-of-pile test CORA-13 was conducted by Forschungszentrum Karlsruhe in their CORA test facility. The test consisted of two phases, a heatup phase and a quench phase. At the beginning of the quench phase, a sharp peak in the hydrogen generation rate was observed. Both phases of the test have been calculated with the system code ATHLET-CD. Special efforts have been made to simulate the heat losses and the flow distribution in the test facility and the thermal hydraulics during the quench phase. In addition to previous calculations, the material relocation and the quench phase have been modeled. The temperature increase during the heatup phase, the starting time of the temperature escalation, and the maximum temperatures have been calculated correctly. At the beginning of the quench phase, an increased hydrogen generation rate has been calculated as measured in the experiment.
SBEToolbox: A Matlab Toolbox for Biological Network Analysis
Konganti, Kranti; Wang, Gang; Yang, Ence; Cai, James J.
2013-01-01
We present SBEToolbox (Systems Biology and Evolution Toolbox), an open-source Matlab toolbox for biological network analysis. It takes a network file as input, calculates a variety of centralities and topological metrics, clusters nodes into modules, and displays the network using different graph layout algorithms. Straightforward implementation and the inclusion of high-level functions allow the functionality to be easily extended or tailored through developing custom plugins. SBEGUI, a menu-driven graphical user interface (GUI) of SBEToolbox, enables easy access to various network and graph algorithms for programmers and non-programmers alike. All source code and sample data are freely available at https://github.com/biocoder/SBEToolbox/releases. PMID:24027418
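The core operation SBEToolbox performs (score every node of a network with a centrality metric) can be sketched in a few lines. The toolbox itself is Matlab; this is a pure-Python analogue on a made-up four-node network, showing only degree centrality, the simplest of the metrics the abstract mentions.

```python
# Toy undirected network as an adjacency mapping (hypothetical data).
adjacency = {
    "A": {"B", "C"},
    "B": {"A", "C", "D"},
    "C": {"A", "B"},
    "D": {"B"},
}

def degree_centrality(adj):
    """Degree of each node, normalized by the maximum possible degree
    (n - 1 neighbours in an n-node simple graph)."""
    n = len(adj)
    return {node: len(neigh) / (n - 1) for node, neigh in adj.items()}

dc = degree_centrality(adjacency)
# "B" touches all three other nodes, so it scores the maximum, 1.0.
```

Betweenness, closeness, and the clustering step the abstract lists follow the same pattern: one pass over the graph producing a per-node (or per-module) score.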
DOE Office of Scientific and Technical Information (OSTI.GOV)
Aartsen, M.G.; Abraham, K.; Ackermann, M.
We present an improved event-level likelihood formalism for including neutrino telescope data in global fits to new physics. We derive limits on spin-dependent dark matter-proton scattering by employing the new formalism in a re-analysis of data from the 79-string IceCube search for dark matter annihilation in the Sun, including explicit energy information for each event. The new analysis excludes a number of models in the weak-scale minimal supersymmetric standard model (MSSM) for the first time. This work is accompanied by the public release of the 79-string IceCube data, as well as an associated computer code for applying the new likelihood to arbitrary dark matter models.
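The "event-level likelihood" idea is that, instead of binning, each observed event contributes its own probability under a signal/background mixture. A generic unbinned sketch follows; the PDFs and event energies are toy assumptions, not the IceCube ones, and the real formalism adds a Poisson term for the total event count.

```python
import math

def log_likelihood(events, f_sig, pdf_sig, pdf_bkg):
    """Unbinned log-likelihood for a signal fraction f_sig in [0, 1]:
    each event enters through its own mixture probability."""
    return sum(math.log(f_sig * pdf_sig(e) + (1 - f_sig) * pdf_bkg(e))
               for e in events)

# Toy energy PDFs, both normalized on [0, 10]:
def pdf_bkg(e):
    return 0.1           # flat background

def pdf_sig(e):
    return 0.02 * e      # signal peaked at high energy

events = [8.5, 9.1, 7.7, 2.0]  # hypothetical reconstructed energies

# Grid scan of the signal fraction: the three high-energy events pull
# the maximum-likelihood estimate well above zero.
best = max((f / 100 for f in range(101)),
           key=lambda f: log_likelihood(events, f, pdf_sig, pdf_bkg))
```

Carrying each event's energy through the likelihood, rather than just counting events, is exactly what lets the analysis separate a hard signal spectrum from a soft background one.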
DOSE: an R/Bioconductor package for disease ontology semantic and enrichment analysis.
Yu, Guangchuang; Wang, Li-Gen; Yan, Guang-Rong; He, Qing-Yu
2015-02-15
Disease Ontology (DO) annotates human genes in the context of disease. DO is an important annotation resource for translating molecular findings from high-throughput data into clinical relevance. DOSE is an R package providing semantic similarity computations among DO terms and genes, which allows biologists to explore the similarities of diseases and of gene functions from a disease perspective. Enrichment analyses, including the hypergeometric model and gene set enrichment analysis, are also implemented to support discovering disease associations in high-throughput biological data. This allows biologists to verify disease relevance in a biological experiment and identify unexpected disease associations. Comparison among gene clusters is also supported. DOSE is released under the Artistic-2.0 License. The source code and documents are freely available through Bioconductor (http://www.bioconductor.org/packages/release/bioc/html/DOSE.html). Supplementary data are available at Bioinformatics online. Contact: gcyu@connect.hku.hk or tqyhe@jnu.edu.cn.
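The hypergeometric enrichment model the abstract mentions is compact enough to sketch with the standard library. DOSE itself is an R package; this Python analogue uses hypothetical gene counts, and the function name is ours, not part of the DOSE API.

```python
from math import comb

def enrichment_pvalue(N, K, n, k):
    """Upper-tail hypergeometric p-value P(X >= k): of N annotated
    genes, K carry the disease term; a study set of n genes contains
    k hits. Small p means the term is over-represented."""
    total = comb(N, n)
    return sum(comb(K, i) * comb(N - K, n - i)
               for i in range(k, min(K, n) + 1)) / total

# Hypothetical numbers: 20000 genes overall, 100 annotated to the term,
# 8 hits in a 150-gene list (about 0.75 expected by chance).
p = enrichment_pvalue(20000, 100, 150, 8)
```

With fewer than one hit expected by chance, eight observed hits yield a p-value many orders of magnitude below typical significance thresholds, which is the signal an enrichment analysis reports.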
Multiplexed Detection of Cytokines Based on Dual Bar-Code Strategy and Single-Molecule Counting.
Li, Wei; Jiang, Wei; Dai, Shuang; Wang, Lei
2016-02-02
Cytokines play important roles in the immune system and have been regarded as biomarkers. Because a single cytokine is not specific and accurate enough for strict diagnosis in practice, in this work we constructed a multiplexed detection method for cytokines based on a dual bar-code strategy and single-molecule counting. Taking interferon-γ (IFN-γ) and tumor necrosis factor-α (TNF-α) as model analytes, first, the magnetic nanobead was functionalized with the second antibody and primary bar-code strands, forming a magnetic nanoprobe. Then, through the specific reaction of the second antibody with the antigen fixed by the primary antibody, a sandwich-type immunocomplex was formed on the substrate. Next, the primary bar-code strands, acting as amplification units, triggered a multibranched hybridization chain reaction (mHCR), producing nicked double-stranded polymers with multiple branched arms, which served as secondary bar-code strands. Finally, the secondary bar-code strands hybridized with the multimolecule-labeled fluorescence probes, generating enhanced fluorescence signals. The fluorescence dots were counted one by one for quantification with an epi-fluorescence microscope. By integrating the primary and secondary bar-code-based amplification strategy with the multimolecule-labeled fluorescence probes, this method displayed excellent sensitivity, with detection limits of 5 fM for both analytes. Unlike the typical bar-code assay, in which the bar-code strands must be released and identified on a microarray, this method is more direct. Moreover, because of the selective immune reaction and the dual bar-code mechanism, the resulting method could detect the two targets simultaneously. Multiplexed analysis in human serum was also performed, suggesting that our strategy is reliable and has great potential application in early clinical diagnosis.
Python-Based Applications for Hydrogeological Modeling
NASA Astrophysics Data System (ADS)
Khambhammettu, P.
2013-12-01
Python is a general-purpose, high-level programming language whose design philosophy emphasizes code readability. Add-on packages supporting fast array computation (numpy), plotting (matplotlib), and scientific/mathematical functions (scipy) have resulted in a powerful ecosystem for scientists interested in exploratory data analysis, high-performance computing, and data visualization. Three examples are provided to demonstrate the applicability of the Python environment in hydrogeological applications. Python programs were used to model an aquifer test and estimate aquifer parameters at a Superfund site. The aquifer test conducted at a Groundwater Circulation Well was modeled with the Python/FORTRAN-based TTim analytic element code. The aquifer parameters were estimated with PEST such that a good match was produced between the simulated and observed drawdowns. Python scripts were written to interface with PEST and visualize the results. A convolution-based approach was used to estimate source concentration histories based on observed concentrations at receptor locations. Unit Response Functions (URFs) that relate the receptor concentrations to a unit release at the source were derived with the ATRANS code. The impact of any releases at the source could then be estimated by convolving the source release history with the URFs. Python scripts were written to compute and visualize receptor concentrations for user-specified source histories. The framework provided a simple and elegant way to test various hypotheses about the site. A Python/FORTRAN-based program, TYPECURVEGRID-Py, was developed to compute and visualize groundwater elevations and drawdown through time in response to a regional uniform hydraulic gradient and the influence of pumping wells, using either the Theis solution for a fully confined aquifer or the Hantush-Jacob solution for a leaky confined aquifer. The program supports an arbitrary number of wells that can operate according to arbitrary schedules. The Python wrapper invokes the underlying FORTRAN layer to compute transient groundwater elevations and processes this information to create time-series and 2D plots.
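The convolution-based source-to-receptor approach described in this abstract can be shown in miniature: receptor concentration is the source release history convolved with a unit response function (URF). The kernel and release history below are made-up numbers standing in for ATRANS output, purely to illustrate the mechanics.

```python
def convolve(history, urf):
    """Discrete convolution: c[t] = sum_k history[k] * urf[t - k],
    i.e. each release is delayed and scaled by the unit response."""
    out = [0.0] * (len(history) + len(urf) - 1)
    for k, h in enumerate(history):
        for j, u in enumerate(urf):
            out[k + j] += h * u
    return out

urf = [0.0, 0.1, 0.3, 0.2, 0.1]   # hypothetical response to a unit release
history = [5.0, 0.0, 0.0, 2.0]    # hypothetical releases in years 0 and 3

concentrations = convolve(history, urf)
```

Because convolution is linear, the effect of any candidate release history can be evaluated by reusing the same URFs, which is what made the hypothesis testing at the site cheap once the URFs were derived.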
McCarty, Dennis; Rieckmann, Traci; Baker, Robin L; McConnell, K John
2017-03-01
Title 42 of the Code of Federal Regulations Part 2 (42 CFR Part 2) controls the release of patient information about treatment for substance use disorders. In 2016, the Substance Abuse and Mental Health Services Administration (SAMHSA) released a proposed rule to update the regulations, reduce provider burdens, and facilitate information exchange. Oregon's Medicaid program (Oregon Health Plan) altered the financing and structure of medical, dental, and behavioral care to promote greater integration and coordination. A qualitative analysis examined the perceived impact of 42 CFR Part 2 on care coordination and integration. Interviews with 76 stakeholders (114 interviews) conducted in 2012-2015 probed the processes of integrating behavioral health into primary care settings in Oregon and assessed issues associated with adherence to 42 CFR Part 2. Respondents expressed concerns that the regulations caused legal confusion, inhibited communication and information sharing, and required updating. Addiction treatment directors noted the challenges of obtaining patient consent to share information with primary care providers. The confidentiality regulations were perceived as a barrier to care coordination and integration. The Oregon Health Authority, therefore, requested regulatory changes. SAMHSA's proposed revisions permit a general consent to an entire health care team and allow inclusion of substance use disorder information within health information exchanges, but they mandate data segmentation of diagnostic and procedure codes related to substance use disorders and restrict access only to parties with authorized consent, possibly adding barriers to the coordination and integration of addiction treatment with primary care.
19 CFR 142.42 - Application for Line Release processing.
Code of Federal Regulations, 2014 CFR
2014-04-01
... Company Information: name, address, city, state, contact person, phone number of contact person, and... identification number of the shipper or manufacturer. (f) Importer information (if importer is different than filer): Name, address, city, state and country, zip code, importer number, bond number, and surety code...
19 CFR 142.42 - Application for Line Release processing.
Code of Federal Regulations, 2013 CFR
2013-04-01
... Company Information: name, address, city, state, contact person, phone number of contact person, and... identification number of the shipper or manufacturer. (f) Importer information (if importer is different than filer): Name, address, city, state and country, zip code, importer number, bond number, and surety code...
19 CFR 142.42 - Application for Line Release processing.
Code of Federal Regulations, 2012 CFR
2012-04-01
... Company Information: name, address, city, state, contact person, phone number of contact person, and... identification number of the shipper or manufacturer. (f) Importer information (if importer is different than filer): Name, address, city, state and country, zip code, importer number, bond number, and surety code...
19 CFR 142.42 - Application for Line Release processing.
Code of Federal Regulations, 2011 CFR
2011-04-01
... Company Information: name, address, city, state, contact person, phone number of contact person, and... identification number of the shipper or manufacturer. (f) Importer information (if importer is different than filer): Name, address, city, state and country, zip code, importer number, bond number, and surety code...
The ALICE Software Release Validation cluster
NASA Astrophysics Data System (ADS)
Berzano, D.; Krzewicki, M.
2015-12-01
One of the most important steps of the software lifecycle is Quality Assurance: this process comprises both automatic tests and manual reviews, and all of them must pass successfully before the software is approved for production. Some tests, such as source code static analysis, are executed on a single dedicated service; in High Energy Physics, a full simulation and reconstruction chain on a distributed computing environment, backed with a sample "golden" dataset, is also necessary for the quality sign-off. The ALICE experiment uses dedicated and virtualized computing infrastructures for the Release Validation in order not to taint the production environment (i.e. CVMFS and the Grid) with non-validated software and validation jobs: the ALICE Release Validation cluster is a disposable virtual cluster appliance based on CernVM and the Virtual Analysis Facility, capable of deploying on demand, with a single command, a dedicated virtual HTCondor cluster with an automatically scalable number of virtual workers on any cloud supporting the standard EC2 interface. Input and output data are stored externally on EOS, and a dedicated CVMFS service is used to provide the software to be validated. We will show how Release Validation cluster deployment and disposal are completely transparent for the Release Manager, who simply triggers the validation from the ALICE build system's web interface. CernVM 3, based entirely on CVMFS, makes it possible to boot any past snapshot of the operating system: we will show how this allows us to certify each ALICE software release for an exact CernVM snapshot, addressing the problem of Long-Term Data Preservation by ensuring a consistent environment for software execution and data reprocessing in the future.
ERIC Educational Resources Information Center
Group of Eight (NJ1), 2008
2008-01-01
The release of the "Australian Code for the Responsible Conduct of Research 2007" by the Australian Government in 2007 was welcomed by Go8 (Group of Eight) institutions, particularly in relation to the improvements and broader scope of the matters covered by Part A of that Code. However, as foreshadowed by the Go8 during the consultation…
Conflict Containment in the Balkans: Testing Extended Deterrence.
1995-03-01
Approved for public release; distribution is unlimited. [Report documentation page residue; recoverable fragment: "This thesis critically analyzes a prominent theoretical ..." Subject terms: Conflict Containment in the Balkans; Deterrence; Coercive Diplomacy; Balance of Forces. 161 pages.]
Annual Coded Wire Tag Program; Oregon Missing Production Groups, 1997 Annual Report.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lewis, Mark A.; Mallette, Christine; Murray, William M.
1998-03-01
This annual report is in fulfillment of contract obligations with the Bonneville Power Administration, which is the funding source for the Oregon Department of Fish and Wildlife's Annual Coded Wire Tag Program - Oregon Missing Production Groups Project. Tule stock fall chinook were caught primarily in British Columbia and Washington ocean, and Oregon freshwater fisheries. Up-river bright stock fall chinook contributed primarily to Alaska and British Columbia ocean commercial, and Columbia River gillnet and other freshwater fisheries. Contribution of Rogue stock fall chinook released in the lower Columbia River occurred primarily in Oregon ocean commercial and Columbia River gillnet fisheries. Willamette stock spring chinook contributed primarily to Alaska and British Columbia ocean commercial, Oregon freshwater sport, and Columbia River gillnet fisheries. Willamette stock spring chinook released by CEDC contributed to similar ocean fisheries, but had much higher catch in gillnet fisheries than the same stocks released in the Willamette system. Up-river stocks of spring chinook contributed almost exclusively to Columbia River sport fisheries and other freshwater recovery areas. The up-river stocks of Columbia River summer steelhead contributed primarily to the Columbia River gillnet and other freshwater fisheries. Coho ocean fisheries from Washington to California were closed or very limited from 1994 through 1997 (1991 through 1994 broods). This has resulted in a greater average percent of catch for other fishery areas. Coho stocks released by ODFW below Bonneville Dam contributed mainly to Oregon and Washington ocean, Columbia gillnet, and other freshwater fisheries. Coho stocks released in the Klaskanine River and Youngs Bay area had similar ocean catch, but much higher contribution to gillnet fisheries than the other coho releases. Coho stocks released above Bonneville Dam had similar contribution to ocean fisheries as other coho releases. However, they contributed more to gillnet fisheries above Bonneville Dam than coho released below the dam. Survival rates of salmon and steelhead are influenced not only by factors in the hatchery (disease, density, diet, size and time of release) but also by environmental factors in the river and ocean. These environmental factors are influenced by large-scale weather patterns such as El Nino, over which man has no influence. Changes in rearing conditions in the hatchery, over which man has some influence, do impact the survival rates. However, these impacts can be offset by impacts caused by environmental factors. Coho salmon released in the Columbia River generally experience better survival rates when released later in the spring. However, for the 1990 brood year, June releases of Columbia River coho had much lower survival than May releases for all ODFW hatcheries. In general, survival of ODFW Columbia River hatchery coho has declined to low levels since the 1989 brood year. In an effort to evaluate photonic marking as a tool to mass-mark salmonids, two groups of 1995 brood juvenile coho salmon were marked at Sandy Hatchery. The first group (Group A) received a fluorescent red mark, adipose fin clip, and coded-wire tag. The second group (Group B) received a cryptic blue mark, adipose fin clip, and coded-wire tag. Both groups were released in the spring of 1997. No photonic marks were detected in the precocious males (jacks) returning to Sandy Hatchery in the fall of 1997.
28 CFR 2.40 - Conditions of release.
Code of Federal Regulations, 2013 CFR
2013-07-01
... special condition, because available information indicates a low risk of future substance abuse by the..., YOUTH OFFENDERS, AND JUVENILE DELINQUENTS United States Code Prisoners and Parolees § 2.40 Conditions of... substance. If the Commission finds after a revocation hearing that a releasee, released after December 31...
28 CFR 2.40 - Conditions of release.
Code of Federal Regulations, 2014 CFR
2014-07-01
... special condition, because available information indicates a low risk of future substance abuse by the..., YOUTH OFFENDERS, AND JUVENILE DELINQUENTS United States Code Prisoners and Parolees § 2.40 Conditions of... substance. If the Commission finds after a revocation hearing that a releasee, released after December 31...
28 CFR 2.40 - Conditions of release.
Code of Federal Regulations, 2011 CFR
2011-07-01
... special condition, because available information indicates a low risk of future substance abuse by the..., YOUTH OFFENDERS, AND JUVENILE DELINQUENTS United States Code Prisoners and Parolees § 2.40 Conditions of... substance. If the Commission finds after a revocation hearing that a releasee, released after December 31...
28 CFR 2.40 - Conditions of release.
Code of Federal Regulations, 2012 CFR
2012-07-01
... special condition, because available information indicates a low risk of future substance abuse by the..., YOUTH OFFENDERS, AND JUVENILE DELINQUENTS United States Code Prisoners and Parolees § 2.40 Conditions of... substance. If the Commission finds after a revocation hearing that a releasee, released after December 31...
Nagy, Amber; Harrison, Alistair; Sabbani, Supriya; Munson, Robert S; Dutta, Prabir K; Waldman, W James
2011-01-01
Background The focus of this study is on the antibacterial properties of silver nanoparticles embedded within a zeolite membrane (AgNP-ZM). Methods and Results These membranes were effective in killing Escherichia coli and were bacteriostatic against methicillin-resistant Staphylococcus aureus. E. coli suspended in Luria Bertani (LB) broth and isolated from physical contact with the membrane were also killed. Elemental analysis indicated slow release of Ag+ from the AgNP-ZM into the LB broth. The E. coli killing efficiency of AgNP-ZM was found to decrease with repeated use, and this was correlated with decreased release of silver ions with each use of the support. Gene expression microarrays revealed upregulation of several antioxidant genes as well as genes coding for metal transport, metal reduction, and ATPase pumps in response to silver ions released from AgNP-ZM. Gene expression of iron transporters was reduced, and increased expression of ferrochelatase was observed. In addition, upregulation of multiple antibiotic resistance genes was demonstrated. The expression levels of multicopper oxidase, glutaredoxin, and thioredoxin decreased with each support use, reflecting the lower amounts of Ag+ released from the membrane. The antibacterial mechanism of AgNP-ZM is proposed to be related to the exhaustion of antioxidant capacity. Conclusion These results indicate that AgNP-ZM provide a novel matrix for gradual release of Ag+. PMID:21931480
Fundamental Studies in the Molecular Basis of Laser Induced Retinal Damage
1988-01-01
Cornell University, School of Applied & Engineering Physics, Ithaca, NY 14853. DOD DISTRIBUTION STATEMENT: Approved for public release; distribution unlimited.
ERIC Educational Resources Information Center
De Nigris, Rosemarie Previti
2017-01-01
The hypothesis of the study was explicit gradual release of responsibility comprehension instruction (GRR) (Pearson & Gallagher, 1983; Fisher & Frey, 2008) with the researcher-created Story Grammar Code (SGC) strategy would significantly increase third graders' comprehension of narrative fiction and nonfiction text. SGC comprehension…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Yun, Di; Mo, Kun; Ye, Bei
2015-09-30
This activity is supported by the US Nuclear Energy Advanced Modeling and Simulation (NEAMS) Fuels Product Line (FPL). Two major accomplishments in FY 15 are summarized in this report: (1) implementation of the FASTGRASS module in the BISON code; and (2) a Xe implantation experiment for large-grained UO2. Both the BISON and MARMOT codes have been developed by Idaho National Laboratory (INL) to enable next-generation fuel performance modeling capability as part of the NEAMS Program FPL. To contribute to the development of the Moose-Bison-Marmot (MBM) code suite, we have implemented the FASTGRASS fission gas model as a module in the BISON code. Based on rate theory formulations, the coupled FASTGRASS module in BISON is capable of modeling LWR oxide fuel fission gas behavior and fission gas release. In addition, we conducted a Xe implantation experiment at the Argonne Tandem Linac Accelerator System (ATLAS) in order to produce UO2 samples with the desired bubble morphology. With these samples, further experiments to study fission gas diffusivity are planned to provide validation data for the fission gas release model in the MARMOT code.
Correlation of recent fission product release data
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kress, T.S.; Lorenz, R.A.; Nakamura, T.
For the calculation of source terms associated with severe accidents, it is necessary to model the release of fission products from fuel as it heats and melts. Perhaps the most definitive model for fission product release is that of the FASTGRASS computer code developed at Argonne National Laboratory. There is persuasive evidence that these processes, as well as additional chemical and gas-phase mass transport processes, are important in the release of fission products from fuel. Nevertheless, it has been found convenient to have simplified fission product release correlations that may not be as definitive as models like FASTGRASS but which attempt in some simple way to capture the essence of the mechanisms. One of the most widely used of these correlations is CORSOR-M, which is the present fission product/aerosol release model used in the NRC Source Term Code Package. CORSOR has been criticized as having too much uncertainty in the calculated releases and as not accurately reproducing some experimental data. It is currently believed that these discrepancies between CORSOR and the more recent data result from the better time resolution of the more recent data compared to the data base that went into the CORSOR correlation. This document discusses a simple correlational model for use in connection with NUREG risk uncertainty exercises. 8 refs., 4 figs., 1 tab.
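A CORSOR-M-style correlation is a simple Arrhenius expression for a fractional release-rate coefficient. The sketch below uses placeholder constants (the qualified CORSOR-M coefficients are element-specific and are not reproduced here); it only illustrates the functional form described in the abstract.

```python
import math

def corsor_m_release_rate(T_kelvin, k0=2.0e5, Q=63.8):
    """Arrhenius-type fractional release-rate coefficient (1/min).

    k0 (1/min) and Q (kcal/mol) are illustrative placeholders, not
    the qualified element-specific CORSOR-M values.
    """
    R = 1.987e-3  # gas constant, kcal/(mol*K)
    return k0 * math.exp(-Q / (R * T_kelvin))

def fractional_release(T_kelvin, minutes):
    """Fraction released during an isothermal hold: f = 1 - exp(-k t)."""
    k = corsor_m_release_rate(T_kelvin)
    return 1.0 - math.exp(-k * minutes)

# Release increases steeply with fuel temperature.
assert fractional_release(2400.0, 10.0) > fractional_release(1800.0, 10.0)
```

The strong exponential temperature dependence is what makes the time resolution of the underlying release data so important, which is the criticism of the original CORSOR fit noted above.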
DOE Office of Scientific and Technical Information (OSTI.GOV)
Collin, Blaise P.; Demkowicz, Paul A.; Baldwin, Charles A.
2016-11-01
The PARFUME (PARticle FUel ModEl) code was used to predict silver release from tristructural isotropic (TRISO) coated fuel particles and compacts during the second irradiation experiment (AGR-2) of the Advanced Gas Reactor Fuel Development and Qualification program. The PARFUME model for the AGR-2 experiment used the fuel compact volume average temperature for each of the 559 days of irradiation to calculate the release of fission product silver from a representative particle for a select number of AGR-2 compacts and individual fuel particles containing either mixed uranium carbide/oxide (UCO) or 100% uranium dioxide (UO2) kernels. Post-irradiation examination (PIE) measurements were performed to provide data on the release of silver from these compacts and individual fuel particles. The available experimental fractional releases of silver were compared to their corresponding PARFUME predictions. Preliminary comparisons show that PARFUME under-predicts the PIE results for UCO compacts and is in reasonable agreement with experimental data for UO2 compacts. The accuracy of PARFUME predictions is impacted by the code's limitations in modeling the temporal and spatial distributions of temperature across the compacts. Nevertheless, the comparisons on silver release lie within the same order of magnitude.
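Diffusive fractional release from a fuel kernel or particle is commonly approximated with the Booth equivalent-sphere solution. The following is a minimal sketch of the short-time Booth form with illustrative numbers, not PARFUME's actual silver transport model:

```python
import math

def booth_fractional_release(D, a, t):
    """Short-time Booth approximation for diffusive release from an
    equivalent sphere: f ~= 6*sqrt(tau/pi) - 3*tau, with tau = D*t/a^2.

    D: effective diffusivity (m^2/s), a: sphere radius (m), t: time (s).
    Valid only for small tau; clipped at 1.0 for safety.
    """
    tau = D * t / (a * a)
    return min(1.0, 6.0 * math.sqrt(tau / math.pi) - 3.0 * tau)

# With an illustrative slow diffusivity, a 559-day irradiation releases
# only a few percent in this toy picture.
f = booth_fractional_release(1e-18, 250e-6, 559 * 86400.0)
assert 0.0 < f < 0.2
```

The diffusivity and radius here are assumptions chosen only to show the shape of the model; a fractional-release comparison of the kind discussed in the abstract would use measured kernel and coating transport properties.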
Federal Register 2010, 2011, 2012, 2013, 2014
2010-05-04
...The FAA proposes to rule and invite public comment on the release of land at the Reading Regional Airport, Reading, Pennsylvania under the provisions of Section 47125(a) of Title 49 United States Code (U.S.C.).
Application of Uniform Measurement Error Distribution
2016-03-18
Point of contact: subrata.sanyal@navy.mil, Measurement Science & Engineering Department Operations (Code MS02), P.O. Box 5000, Corona, CA 92878. NSWC Corona Division, Corona, California 92878-5000. March 18, 2016. DISTRIBUTION STATEMENT A. Approved for public release; distribution is unlimited. (NSWC Corona Public Release Control Number 16-005)
THE HYDROCARBON SPILL SCREENING MODEL (HSSM), VOLUME 2: THEORETICAL BACKGROUND AND SOURCE CODES
A screening model for subsurface release of a nonaqueous phase liquid which is less dense than water (LNAPL) is presented. The model conceptualizes the release as consisting of 1) vertical transport from near the surface to the capillary fringe, 2) radial spreading of an LNAPL l...
NASA Technical Reports Server (NTRS)
Watts, Michael E.; Dejpour, Shabob R.
1989-01-01
The changes made on the data analysis and management program DATAMAP (Data from Aeromechanics Test and Analytics - Management and Analysis Package) are detailed. These changes are made to Version 3.07 (released February 1981) and are called Version 4.0. Version 4.0 improvements were performed by Sterling Software under contract to NASA Ames Research Center. The increased capabilities instituted in this version include the breakout of the source code into modules for ease of modification, addition of a more accurate curve fit routine, ability to handle higher-frequency data, additional data analysis features, and improvements in the functionality of existing features. These modifications will allow DATAMAP to be used on more data sets and will make future modifications and additions easier to implement.
Eigenvalue Contributon Estimator for Sensitivity Calculations with TSUNAMI-3D
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rearden, Bradley T; Williams, Mark L
2007-01-01
Since the release of the Tools for Sensitivity and Uncertainty Analysis Methodology Implementation (TSUNAMI) codes in SCALE [1], the use of sensitivity and uncertainty analysis techniques for criticality safety applications has greatly increased within the user community. In general, sensitivity and uncertainty analysis is transitioning from a technique used only by specialists to a practical tool in routine use. With the desire to use the tool more routinely comes the need to improve the solution methodology to reduce the input and computational burden on the user. This paper reviews the current solution methodology of the Monte Carlo eigenvalue sensitivity analysis sequence TSUNAMI-3D, describes an alternative approach, and presents results from both methodologies.
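The sensitivity coefficients such sequences compute are relative derivatives of k with respect to cross-section data. The idea can be illustrated with a toy one-group model and a finite-difference check (an illustration only; TSUNAMI-3D itself uses adjoint-weighted Monte Carlo estimators, not brute-force perturbation):

```python
def k_inf(nu, sigma_f, sigma_c):
    """Toy one-group infinite-medium multiplication factor."""
    return nu * sigma_f / (sigma_f + sigma_c)

def rel_sensitivity(param, base, h=1e-6):
    """Relative sensitivity S = (x/k) dk/dx via central differences."""
    x = base[param]
    up = dict(base, **{param: x * (1.0 + h)})
    dn = dict(base, **{param: x * (1.0 - h)})
    dk = k_inf(**up) - k_inf(**dn)
    return (x / k_inf(**base)) * dk / (2.0 * x * h)

base = dict(nu=2.4, sigma_f=0.05, sigma_c=0.01)
# Analytic value for capture: S = -sigma_c/(sigma_f + sigma_c) = -1/6
print(round(rel_sensitivity("sigma_c", base), 4))  # → -0.1667
```

Note that S for nu comes out exactly 1 (k is linear in nu), a handy sanity check when validating a sensitivity sequence against a simple model.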
Micromagnetic Code Development of Advanced Magnetic Structures Final Report CRADA No. TC-1561-98
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cerjan, Charles J.; Shi, Xizeng
The specific goals of this project were to: further develop the previously written micromagnetic code DADIMAG (DOE code release number 980017); and validate the code. The resulting code was expected to be more realistic and useful for simulations of magnetic structures of specific interest to Read-Rite programs. We also planned to further develop the code for use in internal LLNL programs. This project complemented LLNL CRADA TC-840-94 between LLNL and Read-Rite, which allowed for simulations of the advanced magnetic head development completed under the CRADA. TC-1561-98 was effective concurrently with LLNL non-exclusive copyright license (TL-1552-98) to Read-Rite for DADIMAG Version 2 executable code.
A Simulation Testbed for Adaptive Modulation and Coding in Airborne Telemetry (Brief)
2014-10-01
Testbed parameters (recovered from briefing slides): SOQPSK modulation/coding state table; MATLAB GUI interface; OFDM waveform per 802.11a; modulations BPSK, QPSK, 16 QAM, 64 QAM; configurable cyclic prefix lengths and number of subcarriers; LDPC coding at rates 1/2, 2/3, 3/4 and 4/5. October 2014. DISTRIBUTION STATEMENT A. Approved for public release: distribution unlimited.
DIANA-LncBase v2: indexing microRNA targets on non-coding transcripts
Paraskevopoulou, Maria D.; Vlachos, Ioannis S.; Karagkouni, Dimitra; Georgakilas, Georgios; Kanellos, Ilias; Vergoulis, Thanasis; Zagganas, Konstantinos; Tsanakas, Panayiotis; Floros, Evangelos; Dalamagas, Theodore; Hatzigeorgiou, Artemis G.
2016-01-01
microRNAs (miRNAs) are short non-coding RNAs (ncRNAs) that act as post-transcriptional regulators of coding gene expression. Long non-coding RNAs (lncRNAs) have been recently reported to interact with miRNAs. The sponge-like function of lncRNAs introduces an extra layer of complexity in the miRNA interactome. DIANA-LncBase v1 provided a database of experimentally supported and in silico predicted miRNA Recognition Elements (MREs) on lncRNAs. The second version of LncBase (www.microrna.gr/LncBase) presents an extensive collection of miRNA:lncRNA interactions. The significantly enhanced database includes more than 70 000 low and high-throughput, (in)direct miRNA:lncRNA experimentally supported interactions, derived from manually curated publications and the analysis of 153 AGO CLIP-Seq libraries. The new experimental module presents a 14-fold increase compared to the previous release. LncBase v2 hosts in silico predicted miRNA targets on lncRNAs, identified with the DIANA-microT algorithm. The relevant module provides millions of predicted miRNA binding sites, accompanied by detailed metadata and MRE conservation metrics. LncBase v2 provides information on cell type specific miRNA:lncRNA regulation and enables users to easily identify interactions in 66 different cell types, spanning 36 tissues for human and mouse. Database entries are also supported by accurate lncRNA expression information, derived from the analysis of more than 6 billion RNA-Seq reads. PMID:26612864
Improving the Multi-Wavelength Capability of Chandra Large Programs
NASA Astrophysics Data System (ADS)
Pacucci, Fabio
2017-09-01
In order to fully exploit the joint Chandra/JWST/HST ventures to detect faint sources, we urgently need an advanced matching algorithm between optical/NIR and X-ray catalogs/images. This will be of paramount importance in bridging the gap between upcoming optical/NIR facilities (JWST) and later X-ray ones (Athena, Lynx). We propose to develop an advanced and automated tool to improve the identification of Chandra X-ray counterparts detected in deep optical/NIR fields based on T-PHOT, a software widely used in the community. The developed code will include more than 20 years in advancements of X-ray data analysis and will be released to the public. Finally, we will release an updated catalog of X-ray sources in the CANDELS regions: a leap forward in our endeavor of charting the Universe.
Power Aware Signal Processing Environment (PASPE) for PAC/C
2003-02-01
For this implementation, the Annapolis FFT core was radix-256, and therefore the smallest PN code length that could be processed was PN-64. A C-code version of the correlator was compared to the FPGA implementation; the results in Figure 68 show that for a PN-1024, the... DISTRIBUTION / AVAILABILITY STATEMENT: APPROVED FOR PUBLIC RELEASE; DISTRIBUTION UNLIMITED.
The Italian Code of Medical Deontology: characterizing features of its 2014 edition.
Conti, Andrea Alberto
2015-09-14
The latest edition of the Italian Code of Medical Deontology was released by the Italian Federation of the Registers of Physicians and Dentists in May 2014 (1). The previous edition of the Italian Code dated back to 2006 (2); it has been integrated and updated by a multi-professional and inter-disciplinary panel involving, besides physicians, representatives of scientific societies and trade unions, jurisconsults and experts in bioethics....
Health Physics Code System for Evaluating Accidents Involving Radioactive Materials.
DOE Office of Scientific and Technical Information (OSTI.GOV)
2014-10-01
Version 03 The HOTSPOT Health Physics codes were created to provide Health Physics personnel with a fast, field-portable calculational tool for evaluating accidents involving radioactive materials. HOTSPOT codes provide a first-order approximation of the radiation effects associated with the atmospheric release of radioactive materials. The developer's website is: http://www.llnl.gov/nhi/hotspot/. Four general programs, PLUME, EXPLOSION, FIRE, and RESUSPENSION, calculate a downwind assessment following the release of radioactive material resulting from a continuous or puff release, explosive release, fuel fire, or an area contamination event. Additional programs deal specifically with the release of plutonium, uranium, and tritium to expedite an initial assessment of accidents involving nuclear weapons. The FIDLER program can calibrate radiation survey instruments for ground survey measurements and initial screening of personnel for possible plutonium uptake in the lung. The HOTSPOT codes are fast, portable, easy to use, and fully documented in electronic help files. HOTSPOT supports color high-resolution monitors and printers for concentration plots and contours. The codes have been extensively used by the DOE community since 1985. Tables and graphical output can be directed to the computer screen, printer, or a disk file. The graphical output consists of dose and ground contamination as a function of plume centerline downwind distance, and radiation dose and ground contamination contours. Users have the option of displaying scenario text on the plots. HOTSPOT 3.0.1 fixes three significant Windows 7 issues: the executable now installs properly under "Program Files/HotSpot 3.0", and the installation package is smaller because the dependency on older Windows DLL files it previously needed has been removed; forms now scale properly based on DPI instead of font for users who change their screen resolution to something other than 100%, a more common practice in Windows 7; and the Windows installer was previously starting every time most users started the program, even after HotSpot was already installed. Now, after the program is installed, the installer may come up once for each new user, but only the first time they run HotSpot on a particular machine, so no user should see the installer come up more than once over many uses. In addition, GPS capability was updated to directly use a serial port through a USB connection (non-USB connections should still work), and table output inconsistencies for fire scenarios were fixed.
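The "first-order approximation" of downwind consequences described above is typically a Gaussian plume model. A minimal centerline-concentration sketch is shown below, using one commonly published set of Briggs rural dispersion fits for stability class D; these coefficients are an illustrative assumption, not HOTSPOT's actual implementation:

```python
import math

def sigma_y(x):
    """Briggs rural horizontal dispersion, stability class D (m)."""
    return 0.08 * x / math.sqrt(1.0 + 0.0001 * x)

def sigma_z(x):
    """Briggs rural vertical dispersion, stability class D (m)."""
    return 0.06 * x / math.sqrt(1.0 + 0.0015 * x)

def centerline_conc(Q, u, x, H=0.0):
    """Ground-level plume-centerline air concentration (Bq/m^3) for
    source term Q (Bq/s), wind speed u (m/s), downwind distance x (m)
    and effective release height H (m), with ground reflection folded
    into the usual ground-level-receptor form."""
    sy, sz = sigma_y(x), sigma_z(x)
    return Q / (math.pi * sy * sz * u) * math.exp(-H * H / (2.0 * sz * sz))

# Concentration falls off monotonically with downwind distance here.
assert centerline_conc(1e9, 5.0, 1000.0) < centerline_conc(1e9, 5.0, 100.0)
```

An elevated release dilutes the ground-level concentration close in, which is why codes like HOTSPOT distinguish puff, explosive, and fire releases with different effective release heights.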
A New Python Library for Spectroscopic Analysis with MIDAS Style
NASA Astrophysics Data System (ADS)
Song, Y.; Luo, A.; Zhao, Y.
2013-10-01
The ESO MIDAS is a system for analyzing astronomical data that is used by many astronomers. Python is a high-level scripting language with many applications in astronomical data processing. We are releasing a new Python library that implements some MIDAS commands in Python, so that users can write MIDAS-style Python code. We call it PydasLib. It is a Python library based on ESO MIDAS functions, easily used by astronomers who are familiar with the usage of MIDAS.
JEnsembl: a version-aware Java API to Ensembl data systems
Paterson, Trevor; Law, Andy
2012-01-01
Motivation: The Ensembl Project provides release-specific Perl APIs for efficient high-level programmatic access to data stored in various Ensembl database schema. Although Perl scripts are perfectly suited for processing large volumes of text-based data, Perl is not ideal for developing large-scale software applications nor embedding in graphical interfaces. The provision of a novel Java API would facilitate type-safe, modular, object-orientated development of new Bioinformatics tools with which to access, analyse and visualize Ensembl data. Results: The JEnsembl API implementation provides basic data retrieval and manipulation functionality from the Core, Compara and Variation databases for all species in Ensembl and EnsemblGenomes and is a platform for the development of a richer API to Ensembl data sources. The JEnsembl architecture uses a text-based configuration module to provide evolving, versioned mappings from database schema to code objects. A single installation of the JEnsembl API can therefore simultaneously and transparently connect to current and previous database instances (such as those in the public archive) thus facilitating better analysis repeatability and allowing ‘through time’ comparative analyses to be performed. Availability: Project development, released code libraries, Maven repository and documentation are hosted at SourceForge (http://jensembl.sourceforge.net). Contact: jensembl-develop@lists.sf.net, andy.law@roslin.ed.ac.uk, trevor.paterson@roslin.ed.ac.uk PMID:22945789
Michael, Claudia; Rizzi, Andreas M
2015-02-27
Glycan reductive isotope labeling (GRIL) using (12)C6-/(13)C6-aniline as labeling reagent is reported with the aim of quantitative N-glycan fingerprinting. Porous graphitized carbon (PGC) as stationary phase in capillary-scale HPLC coupled to electrospray mass spectrometry with a time-of-flight analyzer was applied for the determination of labeled N-glycans released from glycoproteins. The main benefit of using stable isotope coding in the context of comparative glycomics lies in the improved accuracy and precision of the quantitative analysis in combined samples and in the potential of correcting for structure-dependent incomplete enzymatic release of oligosaccharides when comparing identical target proteins. The method was validated with respect to mobile phase parameters, reproducibility, accuracy, linearity and limit of detection/quantification (LOD/LOQ) using test glycoproteins. It is shown that the developed method is capable of determining relative amounts of N-glycans (including isomers) comparing two samples in one single HPLC-MS run. The analytical potential and usefulness of GRIL in combination with PGC-ESI-TOF-MS is demonstrated comparing glycosylation in human monoclonal antibodies produced in Chinese hamster ovary (CHO) cells and hybridoma cell lines.
FY16 Status Report on NEAMS Neutronics Activities
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lee, C. H.; Shemon, E. R.; Smith, M. A.
2016-09-30
The goal of the NEAMS neutronics effort is to develop a neutronics toolkit for use on sodium-cooled fast reactors (SFRs) which can be extended to other reactor types. The neutronics toolkit includes the high-fidelity deterministic neutron transport code PROTEUS and many supporting tools such as the cross section generation code MC2-3, a cross section library generation code, alternative cross section generation tools, mesh generation and conversion utilities, and an automated regression test tool. The FY16 effort for NEAMS neutronics focused on supporting the release of the SHARP toolkit and existing and new users, continuing to develop PROTEUS functions necessary for performance improvement as well as the SHARP release, verifying PROTEUS against available existing benchmark problems, and developing new benchmark problems as needed. The FY16 research effort was focused on further updates of PROTEUS-SN and PROTEUS-MOCEX and cross section generation capabilities as needed.
CometBoards Users Manual Release 1.0
NASA Technical Reports Server (NTRS)
Guptill, James D.; Coroneos, Rula M.; Patnaik, Surya N.; Hopkins, Dale A.; Berke, Lazlo
1996-01-01
Several nonlinear mathematical programming algorithms for structural design applications are available at present. These include the sequence of unconstrained minimizations technique, the method of feasible directions, and the sequential quadratic programming technique. The optimality criteria technique and the fully utilized design concept are two other structural design methods. A project was undertaken to bring all these design methods under a common computer environment so that a designer can select any one of these tools that may be suitable for his/her application. To facilitate selection of a design algorithm, to validate and check out the computer code, and to ascertain the relative merits of the design tools, modest finite element structural analysis programs based on the concept of stiffness and integrated force methods have been coupled to each design method. The code that contains both these design and analysis tools, by reading input information from analysis and design data files, can cast the design of a structure as a minimum-weight optimization problem. The code can then solve it with a user-specified optimization technique and a user-specified analysis method. This design code is called CometBoards, which is an acronym for Comparative Evaluation Test Bed of Optimization and Analysis Routines for the Design of Structures. This manual describes for the user a step-by-step procedure for setting up the input data files and executing CometBoards to solve a structural design problem. The manual includes the organization of CometBoards; instructions for preparing input data files; the procedure for submitting a problem; illustrative examples; and several demonstration problems. A set of 29 structural design problems have been solved by using all the optimization methods available in CometBoards. A summary of the optimum results obtained for these problems is appended to this users manual. 
CometBoards, at present, is available for Posix-based Cray and Convex computers, Iris and Sun workstations, and the VM/CMS system.
Variable thickness transient ground-water flow model. Volume 3. Program listings
DOE Office of Scientific and Technical Information (OSTI.GOV)
Reisenauer, A.E.
1979-12-01
The Assessment of Effectiveness of Geologic Isolation Systems (AEGIS) Program is developing and applying the methodology for assessing the far-field, long-term post-closure safety of deep geologic nuclear waste repositories. AEGIS is being performed by Pacific Northwest Laboratory (PNL) under contract with the Office of Nuclear Waste Isolation (ONWI) for the Department of Energy (DOE). One task within AEGIS is the development of methodology for analysis of the consequences (water pathway) from loss of repository containment as defined by various release scenarios. Analysis of the long-term, far-field consequences of release scenarios requires the application of numerical codes which simulate the hydrologic systems, model the transport of released radionuclides through the hydrologic systems to the biosphere, and, where applicable, assess the radiological dose to humans. Hydrologic and transport models are available at several levels of complexity or sophistication. Model selection and use are determined by the quantity and quality of input data. Model development under AEGIS and related programs provides three levels of hydrologic models, two levels of transport models, and one level of dose models (with several separate models). This is the third of 3 volumes of the description of the VTT (Variable Thickness Transient) Groundwater Hydrologic Model - second level (intermediate complexity) two-dimensional saturated groundwater flow.
Computer Simulation of the VASIMR Engine
NASA Technical Reports Server (NTRS)
Garrison, David
2005-01-01
The goal of this project is to develop a magneto-hydrodynamic (MHD) computer code for simulation of the VASIMR engine. This code is designed to be easy to modify and use. We achieve this using the Cactus framework, a system originally developed for research in numerical relativity. Since its release, Cactus has become an extremely powerful and flexible open source framework. The development of the code will be done in stages, starting with a basic fluid dynamic simulation and working towards a more complex MHD code. Once developed, this code can be used by students and researchers in order to further test and improve the VASIMR engine.
Preliminary Numerical Simulation of IR Structure Development in a Hypothetical Uranium Release.
1981-11-16
Keywords (from report documentation page): IR structure; power spectrum; uranium release; parallax effects; numerical simulation; PHARO code; isophots; LWIR. Abstract (fragmentary): ...release at 200 km altitude. Of interest is the LWIR emission from uranium oxide ions, induced by sunlight and earthshine. Assuming a one-level fluid... The effect on defense systems of long wave infrared (LWIR) emissions from metallic oxides in the debris from a high altitude nuclear explosion (HANE) is an...
Initial Multidisciplinary Design and Analysis Framework
NASA Technical Reports Server (NTRS)
Ozoroski, L. P.; Geiselhart, K. A.; Padula, S. L.; Li, W.; Olson, E. D.; Campbell, R. L.; Shields, E. W.; Berton, J. J.; Gray, J. S.; Jones, S. M.;
2010-01-01
Within the Supersonics (SUP) Project of the Fundamental Aeronautics Program (FAP), an initial multidisciplinary design & analysis framework has been developed. A set of low- and intermediate-fidelity discipline design and analysis codes were integrated within a multidisciplinary design and analysis framework and demonstrated on two challenging test cases. The first test case demonstrates an initial capability to design for low boom and performance. The second test case demonstrates rapid assessment of a well-characterized design. The current system has been shown to greatly increase the design and analysis speed and capability, and many future areas for development were identified. This work has established a state-of-the-art capability for immediate use by supersonic concept designers and systems analysts at NASA, while also providing a strong base to build upon for future releases as more multifidelity capabilities are developed and integrated.
Secure Hardware Design for Trust
2014-03-01
The Grain VHDL code was obtained from [13] and implemented in the same fashion as shown in Figure 5. A CRC implementation for the USB token protocol was chosen as the main candidate; the VHDL source code was generated from [14] using the standard CRC5... Approved for Public Release.
DOE Office of Scientific and Technical Information (OSTI.GOV)
2017-05-17
PelePhysics is a suite of physics packages that provides functionality of use to reacting hydrodynamics CFD codes. The initial release includes an interface to reaction rate mechanism evaluation, transport coefficient evaluation, and a generalized equation of state (EOS) facility. Both generic evaluators and interfaces to code from externally available tools (Fuego for chemical rates, EGLib for transport coefficients) are provided.
SolTrace | Concentrating Solar Power | NREL
SolTrace is available as an NREL packaged distribution or from source code at the SolTrace open source project website. The code uses a Monte-Carlo ray-tracing methodology. With the release of the SolTrace open source project, the software has adopted...
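The Monte-Carlo ray-tracing idea can be illustrated with a toy intercept-factor estimate, lumping all optical errors into a single Gaussian displacement at the focal plane. This is an assumption made for illustration only; SolTrace's actual surface, sun-shape, and error models are far richer:

```python
import math, random

def intercept_factor(sigma_mrad, focal_m, r_receiver_m,
                     n_rays=200_000, seed=1):
    """Monte Carlo estimate of the fraction of rays landing on a
    circular receiver, assuming a Gaussian angular error that maps to
    a displacement of focal_m * sigma at the focal plane (a toy of the
    ray-tracing idea, not SolTrace's algorithm)."""
    rng = random.Random(seed)
    s = focal_m * sigma_mrad * 1e-3  # small-angle displacement (m)
    hits = sum(
        1 for _ in range(n_rays)
        if rng.gauss(0.0, s) ** 2 + rng.gauss(0.0, s) ** 2
        <= r_receiver_m ** 2
    )
    return hits / n_rays

# For a circular-Gaussian spot the analytic answer is 1 - exp(-r^2/(2 s^2)),
# so the estimate can be checked against it.
est = intercept_factor(3.0, 10.0, 0.06)
assert abs(est - (1.0 - math.exp(-2.0))) < 0.01
```

The analytic cross-check above is exactly the kind of closed-form validation case a ray tracer is tested against before tackling real concentrator geometries.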
Copper benchmark experiment for the testing of JEFF-3.2 nuclear data for fusion applications
NASA Astrophysics Data System (ADS)
Angelone, M.; Flammini, D.; Loreti, S.; Moro, F.; Pillon, M.; Villar, R.; Klix, A.; Fischer, U.; Kodeli, I.; Perel, R. L.; Pohorecky, W.
2017-09-01
A neutronics benchmark experiment on a pure copper block (dimensions 60 × 70 × 70 cm3) aimed at testing and validating the recent nuclear data libraries for fusion applications was performed in the frame of the European Fusion Program at the 14 MeV ENEA Frascati Neutron Generator (FNG). Reaction rates, neutron flux spectra and doses were measured using different experimental techniques (e.g. activation foil techniques, NE213 scintillator and thermoluminescent detectors). This paper first summarizes the analyses of the experiment carried out using the MCNP5 Monte Carlo code and the European JEFF-3.2 library. Large discrepancies between calculation (C) and experiment (E) were found for the reaction rates in both the high and low neutron energy ranges. The analysis was complemented by sensitivity/uncertainty (S/U) analyses using the deterministic SUSD3D and Monte Carlo MCSEN codes, respectively. The S/U analyses made it possible to identify the cross sections and energy ranges that most affect the calculated responses. The largest discrepancy among the C/E values was observed for the thermal (capture) reactions, indicating severe deficiencies in the 63,65Cu capture and elastic cross sections at lower rather than at high energy. Deterministic and MC codes produced similar results. The 14 MeV copper experiment and its analysis thus call for a revision of the JEFF-3.2 copper cross section and covariance data evaluation. A new analysis of the experiment was performed with the MCNP5 code using the revised JEFF-3.3-T2 library released by NEA and a new, not yet distributed, revised JEFF-3.2 Cu evaluation produced by KIT. A noticeable improvement of the C/E results was obtained with both new libraries.
NASA Technical Reports Server (NTRS)
Nguyen, H. L.; Ying, S.-J.
1990-01-01
Jet-A spray combustion has been evaluated in a gas turbine combustor using propane chemical kinetics as a first approximation for the chemical reactions. Here, the numerical solutions are obtained by using the KIVA-2 computer code. The KIVA-2 code is the most developed of the available multidimensional combustion computer programs for application to the in-cylinder combustion dynamics of internal combustion engines. The released version of KIVA-2 assumes that 12 chemical species are present; the code uses an Arrhenius kinetic-controlled combustion model governed by a four-step global chemical reaction and six equilibrium reactions. The researchers' efforts involve the addition of Jet-A thermophysical properties and the implementation of detailed reaction mechanisms for propane oxidation. Three different detailed reaction mechanism models are considered. The first model consists of 131 reactions and 45 species. This is considered the full mechanism, developed through the study of the chemical kinetics of propane combustion in an enclosed chamber. The full mechanism is evaluated by comparing calculated ignition delay times with available shock tube data. However, this detailed mechanism requires too much computer memory and CPU time for the computation. Therefore, it only serves as a benchmark case by which to evaluate other simplified models. Two possible simplified models were tested in the existing computer code KIVA-2 for the same conditions as used with the full mechanism. One model is obtained through a sensitivity analysis using LSENS, the general kinetics and sensitivity analysis code of D. A. Bittker and K. Radhakrishnan. This model consists of 45 chemical reactions and 27 species. The other model is based on the work published by C. K. Westbrook and F. L. Dryer.
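The "Arrhenius kinetic-controlled" rates underlying such global-step models follow the modified Arrhenius form k = A·T^b·exp(−Ea/(R·T)). A minimal sketch, with illustrative parameters (not KIVA-2's actual global-step constants):

```python
import math

R = 8.314  # universal gas constant, J/(mol K)

def arrhenius(A, b, Ea, T):
    """Modified Arrhenius rate coefficient k = A * T**b * exp(-Ea / (R*T))."""
    return A * T ** b * math.exp(-Ea / (R * T))

# Illustrative parameters for a single global propane-oxidation step:
A, b, Ea = 8.6e11, 0.0, 1.256e5   # Ea in J/mol
for T in (1000.0, 1500.0, 2000.0):
    print(f"T = {T:.0f} K, k = {arrhenius(A, b, Ea, T):.3e}")
```

The steep temperature dependence of k is what makes a kinetics-controlled heat-release model sensitive to the local gas temperature field.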
Modelling of the Gadolinium Fuel Test IFA-681 using the BISON Code
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pastore, Giovanni; Hales, Jason Dean; Novascone, Stephen Rhead
2016-05-01
In this work, application of Idaho National Laboratory's fuel performance code BISON to modelling of fuel rods from the Halden IFA-681 gadolinium fuel test is presented. First, an overview is given of BISON models, focusing on UO2/UO2-Gd2O3 fuel and Zircaloy cladding. Then, BISON analyses of selected fuel rods from the IFA-681 test are performed. For the first time in a BISON application to integral fuel rod simulations, the analysis is informed by detailed neutronics calculations in order to accurately capture the radial power profile throughout the fuel, which is strongly affected by the complex evolution of absorber Gd isotopes. In particular, radial power profiles calculated at the IFE-Halden Reactor Project with the HELIOS code are used. The work has been carried out in the frame of the collaboration between Idaho National Laboratory and the Halden Reactor Project. Some slides have been added as an Appendix to present the newly developed PolyPole-1 algorithm for modelling of intra-granular fission gas release.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Faydide, B.
1997-07-01
This paper presents the current and planned numerical developments for improving computing performance in Cathare applications that need real time, such as simulator applications. Cathare is a thermal-hydraulic code developed by CEA (DRN), IPSN, EDF and FRAMATOME for PWR safety analysis. First, the general characteristics of the code are presented, dealing with physical models, numerical topics, and validation strategy. Then, the current and planned applications of Cathare in the field of simulators are discussed. Some of these applications were made in the past, using a simplified and fast-running version of Cathare (Cathare-Simu); the status of the numerical improvements obtained with Cathare-Simu is presented. The planned developments concern mainly the Simulator Cathare Release (SCAR) project, which deals with the use of the most recent version of Cathare inside simulators. In this frame, the numerical developments concern speeding up the calculation process using parallel processing and improving code reliability on a large set of NPP transients.
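For real-time targets reached through parallel processing, the achievable speedup is bounded by the fraction of a time step that actually parallelizes (Amdahl's law). A minimal sketch with an illustrative 90% parallel fraction, not a measured Cathare figure:

```python
def amdahl_speedup(parallel_fraction, n_procs):
    """Ideal speedup when a fraction p of the work parallelizes over n processors."""
    p = parallel_fraction
    return 1.0 / ((1.0 - p) + p / n_procs)

# If 90% of a thermal-hydraulic time step parallelizes (illustrative figure):
for n in (2, 4, 8, 16):
    print(n, round(amdahl_speedup(0.9, n), 2))
```

Even with unlimited processors the speedup saturates at 1/(1−p), which is why a serial fast-running variant like Cathare-Simu remained attractive before the parallel SCAR work.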
Subramanian, Amarnath; Westra, Bonnie; Matney, Susan; Wilson, Patricia S; Delaney, Connie W; Huff, Stanley M; Huber, Diane
2008-11-06
This poster describes the process used to integrate the Nursing Management Minimum Data Set (NMMDS), an instrument to measure the nursing context of care, into the Logical Observation Identifier Names and Codes (LOINC) system to facilitate contextualization of quality measures. Integration of the first three of 18 elements resulted in 48 new codes including five panels. The LOINC Clinical Committee has approved the presented mapping for their next release.
Flow Instability Tests for a Particle Bed Reactor Nuclear Thermal Rocket Fuel Element
1993-05-01
The source code was written in BASIC (run under GWBASIC 2.0 or higher; DOS 5.0 was installed on the machine), so modifications were easy to make. Approved for public release; distribution unlimited. (Report, 339 pages.)
Dissemination and support of ARGUS for accelerator applications
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
The ARGUS code is a three-dimensional code system for simulating interactions between charged particles, electric and magnetic fields, and complex structures. It is a system of modules that share common utilities for grid and structure input, data handling, memory management, diagnostics, and other specialized functions. The code includes the fields due to the space charge and current density of the particles to achieve a self-consistent treatment of the particle dynamics. The physics modules in ARGUS include three-dimensional field solvers for electrostatics and electromagnetics, a three-dimensional electromagnetic frequency-domain module, a full particle-in-cell (PIC) simulation module, and a steady-state PIC model. These are described in the Appendix to this report. This project has a primary mission of developing the capabilities of ARGUS in accelerator modeling for release to the accelerator design community. Five major activities are being pursued in parallel during the first year of the project: to improve the code and/or add new modules that provide capabilities needed for accelerator design; to produce a User's Guide that documents the use of the code for all users; to release the code and the User's Guide to accelerator laboratories for their own use, and to obtain feedback from them; to build an interactive user interface for setting up ARGUS calculations; and to explore the use of ARGUS on high-power workstation platforms.
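The particle-dynamics core of a PIC module reduces, for each particle, to a leapfrog push: a velocity "kick" from the fields followed by a position "drift". A minimal one-particle sketch in a uniform electric field (all values illustrative, not ARGUS inputs):

```python
# Minimal electrostatic leapfrog particle push: velocity is staggered half a
# step behind position, as in a PIC mover. Units are arbitrary.
q_m = -1.0   # charge-to-mass ratio
E = 0.5      # uniform electric field
dt = 0.01

x, v = 0.0, 1.0
v -= 0.5 * dt * q_m * E      # shift velocity back half a step (leapfrog init)
for _ in range(1000):
    v += dt * q_m * E        # kick: acceleration from the field
    x += dt * v              # drift: advance position with mid-step velocity
print(x, v)
```

For a constant field the scheme is exact in position; a full PIC code closes the loop by depositing particle charge on the grid and re-solving the fields each step, which is what makes the treatment self-consistent.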
Federal Register 2010, 2011, 2012, 2013, 2014
2012-08-29
... International Policy, Fuel Economy and Consumer Programs, National Highway Traffic Safety Administration, 1200... key, powered by a battery and consists of a transmitter/receiver which communicates with the EWS... and starter. The ignition and fuel supply are only released when a correct coded release signal has...
75 FR 9248 - Endangered and Threatened Wildlife and Plants; Permit Applications
Federal Register 2010, 2011, 2012, 2013, 2014
2010-03-01
... endangered species in the Code of Federal Regulations (CFR) at 50 CFR 17. Submit your written data, comments... renewal to take (capture and release) Indiana bats, gray bats, and Virginia big-eared bats (Corynorhinus... and release) Indiana bats, gray bats, Virginia big-eared bats, Ozark big-eared bats (Corynorhinus...
User Manual for the NASA Glenn Ice Accretion Code LEWICE. Version 2.2.2
NASA Technical Reports Server (NTRS)
Wright, William B.
2002-01-01
A research project is underway at NASA Glenn to produce a computer code which can accurately predict ice growth under a wide range of meteorological conditions for any aircraft surface. This report will present a description of the code inputs and outputs from version 2.2.2 of this code, which is called LEWICE. This version differs from release 2.0 due to the addition of advanced thermal analysis capabilities for de-icing and anti-icing applications using electrothermal heaters or bleed air applications. An extensive effort was also undertaken to compare the results against the database of electrothermal results which have been generated in the NASA Glenn Icing Research Tunnel (IRT) as was performed for the validation effort for version 2.0. This report will primarily describe the features of the software related to the use of the program. Appendix A of this report has been included to list some of the inner workings of the software or the physical models used. This information is also available in the form of several unpublished documents internal to NASA. This report is intended as a replacement for all previous user manuals of LEWICE. In addition to describing the changes and improvements made for this version, information from previous manuals may be duplicated so that the user will not need to consult previous manuals to use this code.
Characterization of in situ oil shale retorts prior to ignition
Turner, Thomas F.; Moore, Dennis F.
1984-01-01
Method and system for characterizing a vertical modified in situ oil shale retort prior to ignition of the retort. The retort is formed by mining a void at the bottom of a proposed retort in an oil shale deposit. The deposit is then sequentially blasted into the void to form a plurality of layers of rubble. A plurality of units each including a tracer gas cannister are installed at the upper level of each rubble layer prior to blasting to form the next layer. Each of the units includes a receiver that is responsive to a coded electromagnetic (EM) signal to release gas from the associated cannister into the rubble. Coded EM signals are transmitted to the receivers to selectively release gas from the cannisters. The released gas flows through the retort to an outlet line connected to the floor of the retort. The time of arrival of the gas at a detector unit in the outlet line relative to the time of release of gas from the cannisters is monitored. This information enables the retort to be characterized prior to ignition.
A Roadmap to Continuous Integration for ATLAS Software Development
NASA Astrophysics Data System (ADS)
Elmsheuser, J.; Krasznahorkay, A.; Obreshkov, E.; Undrus, A.; ATLAS Collaboration
2017-10-01
The ATLAS software infrastructure facilitates the efforts of more than 1000 developers working on a code base of 2200 packages with 4 million lines of C++ and 1.4 million lines of Python code. The ATLAS offline code management system is the powerful, flexible framework for processing new package version requests, probing code changes in the Nightly Build System, migration to new platforms and compilers, deployment of production releases for worldwide access, and supporting physicists with tools and interfaces for efficient software use. It maintains a multi-stream, parallel development environment with about 70 multi-platform branches of nightly releases and provides vast opportunities for testing new packages, for verifying patches to existing software and for migrating to new platforms and compilers. The system's evolution is currently aimed at the adoption of modern continuous integration (CI) practices focused on building nightly releases early and often, with rigorous unit and integration testing. This paper describes the CI incorporation program for the ATLAS software infrastructure. It brings modern open source tools such as Jenkins and GitLab into the ATLAS Nightly System, rationalizes hardware resource allocation and administrative operations, and gives developers improved feedback and the means to fix broken builds promptly. Once adopted, ATLAS CI practices will improve and accelerate innovation cycles and result in increased confidence in new software deployments. The paper reports the status of Jenkins integration with the ATLAS Nightly System as well as short and long term plans for the incorporation of CI practices.
Mathematical Description of Complex Chemical Kinetics and Application to CFD Modeling Codes
NASA Technical Reports Server (NTRS)
Bittker, D. A.
1993-01-01
A major effort in combustion research at the present time is devoted to the theoretical modeling of practical combustion systems. These include turbojet and ramjet air-breathing engines as well as ground-based gas-turbine power generating systems. The ability to use computational modeling extensively in designing these products not only saves time and money, but also helps designers meet the quite rigorous environmental standards that have been imposed on all combustion devices. The goal is to combine the very complex solution of the Navier-Stokes flow equations with realistic turbulence and heat-release models into a single computer code. Such a computational fluid-dynamic (CFD) code simulates the coupling of fluid mechanics with the chemistry of combustion to describe the practical devices. This paper will focus on the task of developing a simplified chemical model which can predict realistic heat-release rates as well as species composition profiles, and is also computationally rapid. We first discuss the mathematical techniques used to describe a complex, multistep fuel oxidation chemical reaction and develop a detailed mechanism for the process. We then show how this mechanism may be reduced and simplified to give an approximate model which adequately predicts heat release rates and a limited number of species composition profiles, but is computationally much faster than the original one. Only such a model can be incorporated into a CFD code without adding significantly to long computation times. Finally, we present some of the recent advances in the development of these simplified chemical mechanisms.
pyOpenMS: a Python-based interface to the OpenMS mass-spectrometry algorithm library.
Röst, Hannes L; Schmitt, Uwe; Aebersold, Ruedi; Malmström, Lars
2014-01-01
pyOpenMS is an open-source, Python-based interface to the C++ OpenMS library, providing facile access to a feature-rich, open-source algorithm library for MS-based proteomics analysis. It contains Python bindings that allow raw access to the data structures and algorithms implemented in OpenMS, specifically those for file access (mzXML, mzML, TraML, mzIdentML among others), basic signal processing (smoothing, filtering, de-isotoping, and peak-picking) and complex data analysis (including label-free, SILAC, iTRAQ, and SWATH analysis tools). pyOpenMS thus allows fast prototyping and efficient workflow development in a fully interactive manner (using the interactive Python interpreter) and is also ideally suited for researchers not proficient in C++. In addition, our code to wrap a complex C++ library is completely open-source, allowing other projects to create similar bindings with ease. The pyOpenMS framework is freely available at https://pypi.python.org/pypi/pyopenms while the autowrap tool to create Cython code automatically is available at https://pypi.python.org/pypi/autowrap (both released under the 3-clause BSD licence). © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
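The "basic signal processing" such a library exposes includes centroiding a profile spectrum by local-maximum peak picking. A generic, self-contained sketch of the idea (this is not the pyOpenMS API itself):

```python
# Generic local-maximum peak picking of the kind an MS library exposes:
# keep (m/z, intensity) points that are local maxima above a noise threshold.
def pick_peaks(mz, intensity, threshold=10.0):
    """Return (mz, intensity) pairs that are local maxima above a threshold."""
    peaks = []
    for i in range(1, len(intensity) - 1):
        if intensity[i] >= threshold and intensity[i - 1] < intensity[i] > intensity[i + 1]:
            peaks.append((mz[i], intensity[i]))
    return peaks

mz        = [100.0, 100.1, 100.2, 100.3, 100.4, 100.5, 100.6]
intensity = [  2.0,  15.0,  80.0,  30.0,   5.0,  40.0,   3.0]
print(pick_peaks(mz, intensity))
```

Real implementations add smoothing and sub-sample peak interpolation first; the value of Python bindings is exactly this kind of interactive prototyping against the optimized C++ routines.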
Ndah, Elvis; Jonckheere, Veronique
2017-01-01
Proteogenomics is an emerging research field that still lacks a uniform method of analysis. Proteogenomic studies in which N-terminal proteomics and ribosome profiling are combined suggest that a high number of protein start sites are currently missing in genome annotations. We constructed a proteogenomic pipeline specific for the analysis of N-terminal proteomics data, with the aim of discovering novel translational start sites outside annotated protein coding regions. In summary, unidentified MS/MS spectra were matched to a specific N-terminal peptide library encompassing protein N termini encoded in the Arabidopsis thaliana genome. After a stringent false discovery rate filtering, 117 protein N termini compliant with N-terminal methionine excision specificity and indicative of translation initiation were found. These include N-terminal protein extensions and translation from transposable elements and pseudogenes. Gene prediction provided supporting protein-coding models for approximately half of the protein N termini. Besides the prediction of functional domains (partially) contained within the newly predicted ORFs, further supporting evidence of translation was found in the recently released Araport11 genome re-annotation of Arabidopsis and computational translations of sequences stored in public repositories. Most interestingly, complementary evidence by ribosome profiling was found for 23 protein N termini. Finally, by analyzing protein N-terminal peptides, an in silico analysis demonstrates the applicability of our N-terminal proteogenomics strategy in revealing protein-coding potential in species with well- and poorly-annotated genomes. PMID:28432195
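The N-terminal methionine excision (NME) compliance filter mentioned above can be sketched from the commonly cited specificity rule: the initiator Met is typically cleaved when the second residue has a small side chain (A, C, G, P, S, T, V). This is a simplified sketch of such a check, not the paper's exact filter:

```python
# NME compliance check: given the genome-encoded start (beginning with Met)
# and the N-terminal peptide observed by MS, decide whether the pair is
# consistent with N-terminal methionine excision specificity.
SMALL = set("ACGPSTV")  # residues that permit iMet cleavage (common rule)

def nme_compliant(encoded_start, observed_nterm):
    if observed_nterm == encoded_start:          # iMet retained
        return encoded_start[1] not in SMALL
    if observed_nterm == encoded_start[1:]:      # iMet removed
        return encoded_start[1] in SMALL
    return False

print(nme_compliant("MAVLKT", "AVLKT"))    # Met removed before Ala: compliant
print(nme_compliant("MKVLST", "MKVLST"))   # Met retained before Lys: compliant
```

Filtering candidate N termini this way discards spurious peptide matches whose start would be inconsistent with co-translational Met processing.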
Advances in Geologic Disposal System Modeling and Application to Crystalline Rock
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mariner, Paul E.; Stein, Emily R.; Frederick, Jennifer M.
The Used Fuel Disposition Campaign (UFDC) of the U.S. Department of Energy (DOE) Office of Nuclear Energy (NE), Office of Fuel Cycle Technology (OFCT) is conducting research and development (R&D) on geologic disposal of used nuclear fuel (UNF) and high-level nuclear waste (HLW). Two of the high priorities for UFDC disposal R&D are design concept development and disposal system modeling (DOE 2011). These priorities are directly addressed in the UFDC Generic Disposal Systems Analysis (GDSA) work package, which is charged with developing a disposal system modeling and analysis capability for evaluating disposal system performance for nuclear waste in geologic media (e.g., salt, granite, clay, and deep borehole disposal). This report describes specific GDSA activities in fiscal year 2016 (FY 2016) toward the development of the enhanced disposal system modeling and analysis capability for geologic disposal of nuclear waste. The GDSA framework employs the PFLOTRAN thermal-hydrologic-chemical multi-physics code and the Dakota uncertainty sampling and propagation code. Each code is designed for massively-parallel processing in a high-performance computing (HPC) environment. Multi-physics representations in PFLOTRAN are used to simulate various coupled processes including heat flow, fluid flow, waste dissolution, radionuclide release, radionuclide decay and ingrowth, precipitation and dissolution of secondary phases, and radionuclide transport through engineered barriers and natural geologic barriers to the biosphere. Dakota is used to generate sets of representative realizations and to analyze parameter sensitivity.
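The "decay and ingrowth" bookkeeping such a simulator must do per radionuclide follows the Bateman equations. A minimal two-member chain (parent → daughter → stable) with illustrative half-lives, not a real waste inventory:

```python
import math

# Two-member Bateman chain: N1 decays with half-life t_half1 into N2,
# which decays with half-life t_half2. Returns both inventories at time t.
def bateman_pair(N1_0, t_half1, t_half2, t):
    l1 = math.log(2) / t_half1
    l2 = math.log(2) / t_half2
    N1 = N1_0 * math.exp(-l1 * t)
    N2 = N1_0 * l1 / (l2 - l1) * (math.exp(-l1 * t) - math.exp(-l2 * t))
    return N1, N2

# Illustrative: 1e6 atoms of a 10,000-year parent with a 1,000-year daughter,
# evaluated halfway through the parent's half-life.
N1, N2 = bateman_pair(1.0e6, t_half1=1.0e4, t_half2=1.0e3, t=5.0e3)
print(N1, N2)
```

In a performance-assessment code this analytic kernel is embedded in each transport cell, so released daughters are tracked consistently with their parents.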
EUPDF: An Eulerian-Based Monte Carlo Probability Density Function (PDF) Solver. User's Manual
NASA Technical Reports Server (NTRS)
Raju, M. S.
1998-01-01
EUPDF is an Eulerian-based Monte Carlo PDF solver developed for application with sprays, combustion, parallel computing and unstructured grids. It is designed to be massively parallel and can easily be coupled with any existing gas-phase flow and spray solvers. The solver accommodates the use of an unstructured mesh with mixed elements of triangular, quadrilateral, and/or tetrahedral type. The manual provides the user with the coding required to couple the PDF code to any given flow code and a basic understanding of the EUPDF code structure as well as the models involved in the PDF formulation. The source code of EUPDF will be available with the release of the National Combustion Code (NCC) as a complete package.
Ribeiro, Kleber Silva; Vasconcellos, Camilla Ioshida; Soares, Rodrigo Pedro; Mendes, Maria Tays; Ellis, Cameron C; Aguilera-Flores, Marcela; de Almeida, Igor Correia; Schenkman, Sergio; Iwai, Leo Kei; Torrecilhas, Ana Claudia
2018-01-01
Trypanosoma cruzi , the aetiologic agent of Chagas disease, releases vesicles containing a wide range of surface molecules known to affect the host immunological responses and the cellular infectivity. Here, we compared the secretome of two distinct strains (Y and YuYu) of T. cruzi , which were previously shown to differentially modulate host innate and acquired immune responses. Tissue culture-derived trypomastigotes of both strains secreted extracellular vesicles (EVs), as demonstrated by electron scanning microscopy. EVs were purified by exclusion chromatography or ultracentrifugation and quantitated using nanoparticle tracking analysis. Trypomastigotes from YuYu strain released higher number of EVs than those from Y strain, enriched with virulence factors trans -sialidase (TS) and cruzipain. Proteomic analysis confirmed the increased abundance of proteins coded by the TS gene family, mucin-like glycoproteins, and some typical exosomal proteins in the YuYu strain, which also showed considerable differences between purified EVs and vesicle-free fraction as compared to the Y strain. To evaluate whether such differences were related to parasite infectivity, J774 macrophages and LLC-MK2 kidney cells were preincubated with purified EVs from both strains and then infected with Y strain trypomastigotes. EVs released by YuYu strain caused a lower infection but higher intracellular proliferation in J774 macrophages than EVs from Y strain. In contrast, YuYu strain-derived EVs caused higher infection of LLC-MK2 cells than Y strain-derived EVs. In conclusion, quantitative and qualitative differences in EVs and secreted proteins from different T. cruzi strains may correlate with infectivity/virulence during the host-parasite interaction.
Comprehensive analysis of transport aircraft flight performance
NASA Astrophysics Data System (ADS)
Filippone, Antonio
2008-04-01
This paper reviews the state-of-the art in comprehensive performance codes for fixed-wing aircraft. The importance of system analysis in flight performance is discussed. The paper highlights the role of aerodynamics, propulsion, flight mechanics, aeroacoustics, flight operation, numerical optimisation, stochastic methods and numerical analysis. The latter discipline is used to investigate the sensitivities of the sub-systems to uncertainties in critical state parameters or functional parameters. The paper discusses critically the data used for performance analysis, and the areas where progress is required. Comprehensive analysis codes can be used for mission fuel planning, envelope exploration, competition analysis, a wide variety of environmental studies, marketing analysis, aircraft certification and conceptual aircraft design. A comprehensive program that uses the multi-disciplinary approach for transport aircraft is presented. The model includes a geometry deck, a separate engine input deck with the main parameters, a database of engine performance from an independent simulation, and an operational deck. The comprehensive code has modules for deriving the geometry from bitmap files, an aerodynamics model for all flight conditions, a flight mechanics model for flight envelopes and mission analysis, an aircraft noise model and engine emissions. The model is validated at different levels. Validation of the aerodynamic model is done against the scale models DLR-F4 and F6. A general model analysis and flight envelope exploration are shown for the Boeing B-777-300 with GE-90 turbofan engines with intermediate passenger capacity (394 passengers in 2 classes). Validation of the flight model is done by sensitivity analysis on the wetted area (or profile drag), on the specific air range, the brake-release gross weight and the aircraft noise. 
A variety of results is shown, including specific air range charts, take-off weight-altitude charts, payload-range performance, atmospheric effects, economic Mach number and noise trajectories at F.A.R. landing points.
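Two of the quantities swept in such charts, specific air range and cruise range, are linked by the Breguet range equation R = (V/TSFC)·(L/D)·ln(W_start/W_end). A sketch with illustrative jet-transport figures, not B-777/GE-90 data:

```python
import math

# Breguet cruise range for a jet at constant speed, TSFC and L/D.
# TSFC here is in 1/s (fuel weight flow per unit thrust).
def breguet_range(V, TSFC, L_over_D, W_start, W_end):
    return (V / TSFC) * L_over_D * math.log(W_start / W_end)

V = 250.0        # cruise speed, m/s
TSFC = 1.7e-4    # thrust-specific fuel consumption, 1/s (illustrative)
R = breguet_range(V, TSFC, L_over_D=17.0, W_start=2.7e6, W_end=2.1e6)
print(f"cruise range: {R / 1000:.0f} km")
```

The integrand V/(fuel flow) is the specific air range; a comprehensive code evaluates it point by point along the mission instead of assuming the constants above.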
The Automated Instrumentation and Monitoring System (AIMS) reference manual
NASA Technical Reports Server (NTRS)
Yan, Jerry; Hontalas, Philip; Listgarten, Sherry
1993-01-01
Whether a researcher is designing the 'next parallel programming paradigm,' another 'scalable multiprocessor,' or investigating resource allocation algorithms for multiprocessors, a facility that enables parallel program execution to be captured and displayed is invaluable. Careful analysis of execution traces can help computer designers and software architects to uncover system behavior and to take advantage of specific application characteristics and hardware features. A software tool kit that facilitates performance evaluation of parallel applications on multiprocessors is described. The Automated Instrumentation and Monitoring System (AIMS) has four major software components: a source code instrumentor, which automatically inserts active event recorders into the program's source code before compilation; a run-time performance-monitoring library, which collects performance data; a trace file animation and analysis tool kit, which reconstructs program execution from the trace file; and a trace post-processor, which compensates for data-collection overhead. Besides being used as a prototype for developing new techniques for instrumenting, monitoring, and visualizing parallel program execution, AIMS is also being incorporated into the run-time environments of various hardware test beds to evaluate their impact on user productivity. Currently, AIMS instrumentors accept FORTRAN and C parallel programs written for Intel's NX operating system on the iPSC family of multicomputers. A run-time performance-monitoring library for the iPSC/860 is included in this release. We plan to release monitors for other platforms (such as PVM and TMC's CM-5) in the near future. Performance data collected can be graphically displayed on workstations (e.g. Sun SPARC and SGI) supporting X Windows (in particular, X11R5 with Motif 1.1.3).
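What a source-code instrumentor injects can be pictured as an "active event recorder" wrapped around each routine, appending timestamped entry/exit events to a trace buffer for later reconstruction. A minimal sketch (names are illustrative, not AIMS's actual monitor API):

```python
import time
from functools import wraps

trace = []  # in-memory stand-in for the trace file a monitor library writes

def instrument(fn):
    """Wrap a routine with entry/exit event recording, as an instrumentor would."""
    @wraps(fn)
    def wrapper(*args, **kwargs):
        trace.append(("enter", fn.__name__, time.perf_counter()))
        try:
            return fn(*args, **kwargs)
        finally:
            trace.append(("exit", fn.__name__, time.perf_counter()))
    return wrapper

@instrument
def solve_step(n):
    return sum(i * i for i in range(n))

solve_step(1000)
print(trace[0][:2], trace[-1][:2])
```

A post-processor can then subtract the recording overhead from the timestamps, which is exactly the compensation step AIMS performs on its trace files.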
GPCRdb: an information system for G protein-coupled receptors
Isberg, Vignir; Mordalski, Stefan; Munk, Christian; Rataj, Krzysztof; Harpsøe, Kasper; Hauser, Alexander S.; Vroling, Bas; Bojarski, Andrzej J.; Vriend, Gert; Gloriam, David E.
2016-01-01
Recent developments in G protein-coupled receptor (GPCR) structural biology and pharmacology have greatly enhanced our knowledge of receptor structure-function relations, and have helped improve the scientific foundation for drug design studies. The GPCR database, GPCRdb, serves a dual role in disseminating and enabling new scientific developments by providing reference data, analysis tools and interactive diagrams. This paper highlights new features in the fifth major GPCRdb release: (i) GPCR crystal structure browsing, superposition and display of ligand interactions; (ii) direct deposition by users of point mutations and their effects on ligand binding; (iii) refined snake and helix box residue diagram looks; and (iv) phylogenetic trees with receptor classification colour schemes. Under the hood, the entire GPCRdb front- and back-ends have been re-coded within one infrastructure, ensuring a smooth browsing experience and development. GPCRdb is available at http://www.gpcrdb.org/ and its open-source code at https://bitbucket.org/gpcr/protwis. PMID:26582914
Recent Salmon Declines: A Result of Lost Feeding Opportunities Due to Bad Timing?
Chittenden, Cedar M.; Jensen, Jenny L. A.; Ewart, David; Anderson, Shannon; Balfry, Shannon; Downey, Elan; Eaves, Alexandra; Saksida, Sonja; Smith, Brian; Vincent, Stephen; Welch, David; McKinley, R. Scott
2010-01-01
As the timing of spring productivity blooms in near-shore areas advances due to warming trends in global climate, the selection pressures on out-migrating salmon smolts are shifting. Species and stocks that leave natal streams earlier may be favoured over later-migrating fish. The low post-release survival of hatchery fish during recent years may be in part due to static release times that do not take the timing of plankton blooms into account. This study examined the effects of release time on the migratory behaviour and survival of wild and hatchery-reared coho salmon (Oncorhynchus kisutch) using acoustic and coded-wire telemetry. Plankton monitoring and near-shore seining were also conducted to determine which habitat and food sources were favoured. Acoustic tags (n = 140) and coded-wire tags (n = 266,692) were implanted into coho salmon smolts at the Seymour and Quinsam Rivers, in British Columbia, Canada. Differences between wild and hatchery fish, and early and late releases were examined during the entire lifecycle. Physiological sampling was also carried out on 30 fish from each release group. The smolt-to-adult survival of coho salmon released during periods of high marine productivity was 1.5- to 3-fold greater than those released both before and after, and the fish's degree of smoltification affected their downstream migration time and duration of stay in the estuary. Therefore, hatchery managers should consider having smolts fully developed and ready for release during the peak of the near-shore plankton blooms. Monitoring chlorophyll a levels and water temperature early in the spring could provide a forecast of the timing of these blooms, giving hatcheries time to adjust their release schedule. PMID:20805978
FRENDY: A new nuclear data processing system being developed at JAEA
NASA Astrophysics Data System (ADS)
Tada, Kenichi; Nagaya, Yasunobu; Kunieda, Satoshi; Suyama, Kenya; Fukahori, Tokio
2017-09-01
JAEA has provided the evaluated nuclear data library JENDL and nuclear application codes such as MARBLE, SRAC, MVP and PHITS. These domestic codes have been widely used in many universities and industrial companies in Japan. However, problems are sometimes found in imported processing systems, which must then be revised when a new version of JENDL is released. To overcome such problems and to process nuclear data immediately upon release, JAEA started developing a new nuclear data processing system, FRENDY, in 2013. This paper outlines the development of FRENDY and demonstrates its capabilities and performance through analyses of criticality experiments. The verification results indicate that FRENDY properly generates ACE files.
NASA Technical Reports Server (NTRS)
Krueger, Ronald
2008-01-01
An approach for assessing the delamination propagation simulation capabilities of commercial finite element codes is presented and demonstrated. For this investigation, the Double Cantilever Beam (DCB) specimen and the Single Leg Bending (SLB) specimen were chosen for full three-dimensional finite element simulations. First, benchmark results were created for both specimens. Second, starting from an initially straight front, the delamination was allowed to propagate. The load-displacement relationship and the total strain energy obtained from the propagation analysis were compared with the benchmark results, and good agreement was achieved by selecting the appropriate input parameters. Selecting the appropriate input parameters, however, was not straightforward and often required an iterative procedure. Qualitatively, the delamination front computed for the DCB specimen did not take the shape of a curved front as expected. However, the analysis of the SLB specimen yielded a curved front, as was expected from the distribution of the energy release rate and the failure index across the width of the specimen. Overall, the results are encouraging, but further assessment on a structural level is required.
Fatigue Life Analysis of Tapered Hybrid Composite Flexbeams
NASA Technical Reports Server (NTRS)
Murri, Gretchen B.; Schaff, Jeffery R.; Dobyns, Alan L.
2002-01-01
Nonlinear-tapered flexbeam laminates from a full-size composite helicopter rotor hub flexbeam were tested under combined constant axial tension and cyclic bending loads. The two different graphite/glass hybrid configurations tested under cyclic loading failed by delamination in the tapered region. A 2-D finite element model was developed which closely approximated the flexbeam geometry, boundary conditions, and loading. The analysis results from two geometrically nonlinear finite element codes, ANSYS and ABAQUS, are presented and compared. Strain energy release rates (G) obtained from the above codes using the virtual crack closure technique (VCCT) at a resin crack location in the flexbeams are presented for both hybrid material types. These results compare well with each other and suggest that the initial delamination growth from the resin crack toward the thick region of the flexbeam is strongly mode II. The peak calculated G values were used with material characterization data to calculate fatigue life curves and compared with test data. A curve relating maximum surface strain to number of loading cycles at delamination onset compared reasonably well with the test results.
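The VCCT evaluation of G referred to above can be sketched in its textbook 2-D form: the energy release rate components follow from the nodal forces at the crack tip and the relative displacements of the node pair just behind it. The function name and the input numbers below are illustrative, not the flexbeam model or its data.

```python
def vcct_2d(Fx, Fy, du, dv, da, b):
    """Virtual crack closure technique, 2-D form with 4-node elements:
    Fx, Fy = nodal forces at the crack tip; du, dv = relative sliding
    and opening displacements of the node pair behind the tip;
    da = element length at the tip; b = specimen width."""
    GI  = Fy * dv / (2.0 * da * b)   # opening component (mode I)
    GII = Fx * du / (2.0 * da * b)   # sliding component (mode II)
    return GI, GII

# Illustrative numbers only (N, mm): a sliding-dominated crack tip.
GI, GII = vcct_2d(Fx=120.0, Fy=35.0, du=2.0e-3, dv=0.4e-3, da=0.5, b=25.0)
mode_mix = GII / (GI + GII)          # fraction of the total G in mode II
```

With these made-up inputs the mode mixity comes out above 0.9, i.e. strongly mode II, the same qualitative picture the abstract reports for growth toward the thick region.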
A Taxonomy for Insourcing in the Aerospace Industry
2011-06-01
Patterson Air Force Base, Ohio. APPROVED FOR PUBLIC RELEASE; DISTRIBUTION UNLIMITED. Sources cited include: Lambert, Douglas M., editor (2008); DAU Acquipedia (2011); US Code Title 10 (2011); Robert M. Morgan and Shelby Hunt (1994); Miller and Whitford (2006); Harris and Raviv; G. Bennett (1991).
15 CFR 734.2 - Important EAR terms and principles.
Code of Federal Regulations, 2010 CFR
2010-01-01
... technology and software not subject to the EAR are described in §§ 734.7 through 734.11 and supplement no. 1... of items subject to the EAR out of the United States, or release of technology or software subject to... source code and object code software subject to the EAR. (2) Export of technology or software. (See...
19 CFR 142.46 - Presentation of invoice and assignment of entry number.
Code of Federal Regulations, 2011 CFR
2011-04-01
... transportation, the appropriate manifest document. (b) Verification of data. If after scanning the bar code at the Line Release site, the Customs officer verifies the data on the bar code with the information on... assigned to the transaction. If there are any differences between the system data and the invoice and bar...
Geographic Information Systems: A Primer
1990-10-01
Approved for public release; distribution unlimited. ...utilizing sophisticated integrated databases (usually vector-based), avoid the indirect value coding scheme by recognizing names or direct magnitudes... intricate involvement required by the operator in order to establish a functional coding scheme. A simple raster system, in which cell values indicate
DOE Office of Scientific and Technical Information (OSTI.GOV)
Andrews, Nathan; Faucett, Christopher; Haskin, Troy Christopher
Following the conclusion of the first phase of the crosswalk analysis, one of the key unanswered questions was whether or not the deviations found would persist during a partially recovered accident scenario, similar to the one that occurred in TMI-2. In particular, this analysis aims to compare the impact of core degradation morphology on the quenching models inherent within the two codes and on the coolability of debris during partially recovered accidents. A primary motivation for this study is the development of insights into how uncertainties in core damage progression models impact the ability to assess the potential for recovery of a degraded core. These quench and core recovery models are of the most interest when there is a significant amount of core damage, but intact and degraded fuel still remain in the core region or the lower plenum. Accordingly, this analysis presents a spectrum of partially recovered accident scenarios by varying both water injection timing and rate to highlight the impact of core degradation phenomena on recovered accident scenarios. This analysis uses the newly released MELCOR 2.2 rev. 9665 and MAAP5, Version 5.04. These code versions incorporate a significant number of modifications driven by analyses and forensic evidence obtained from the Fukushima-Daiichi reactor site.
The SeaDAS Processing and Analysis System: SeaWiFS, MODIS, and Beyond
NASA Astrophysics Data System (ADS)
MacDonald, M. D.; Ruebens, M.; Wang, L.; Franz, B. A.
2005-12-01
The SeaWiFS Data Analysis System (SeaDAS) is a comprehensive software package for the processing, display, and analysis of ocean data from a variety of satellite sensors. Continuous development and user support by programmers and scientists for more than a decade has helped to make SeaDAS the most widely used software package in the world for ocean color applications, with a growing base of users from the land and sea surface temperature community. Full processing support for past (CZCS, OCTS, MOS) and present (SeaWiFS, MODIS) sensors, and anticipated support for future missions such as NPP/VIIRS, enables end users to reproduce the standard ocean archive product suite distributed by NASA's Ocean Biology Processing Group (OBPG), as well as a variety of evaluation and intermediate ocean, land, and atmospheric products. Availability of the processing algorithm source codes and a software build environment also provide users with the tools to implement custom algorithms. Recent SeaDAS enhancements include synchronization of MODIS processing with the latest code and calibration updates from the MODIS Calibration Support Team (MCST), support for all levels of MODIS processing including Direct Broadcast, a port to the Macintosh OS X operating system, release of the display/analysis-only SeaDAS-Lite, and an extremely active web-based user support forum.
Large Scale Software Building with CMake in ATLAS
NASA Astrophysics Data System (ADS)
Elmsheuser, J.; Krasznahorkay, A.; Obreshkov, E.; Undrus, A.; ATLAS Collaboration
2017-10-01
The offline software of the ATLAS experiment at the Large Hadron Collider (LHC) serves as the platform for detector data reconstruction, simulation and analysis. It is also used in the detector’s trigger system to select LHC collision events during data taking. The ATLAS offline software consists of several million lines of C++ and Python code organized in a modular design of more than 2000 specialized packages. Because of different workflows, many stable numbered releases are in parallel production use. To accommodate specific workflow requests, software patches with modified libraries are distributed on top of existing software releases on a daily basis. The different ATLAS software applications also require a flexible build system that strongly supports unit and integration tests. Within the last year this build system was migrated to CMake. A CMake configuration has been developed that allows one to easily set up and build the above mentioned software packages. This also makes it possible to develop and test new and modified packages on top of existing releases. The system also allows one to detect and execute partial rebuilds of the release based on single package changes. The build system makes use of CPack for building RPM packages out of the software releases, and CTest for running unit and integration tests. We report on the migration and integration of the ATLAS software to CMake and show working examples of this large scale project in production.
2016-03-01
in this report are not to be construed as an official Department of the Army position unless so designated by other authorized documents. Citation of... gravity, or pretest. Approved for public release; distribution is unlimited. Fine location (code positions 9–10): this substring represents the spatial... itself. For example, upper, pretest, or Hybrid III mid-sized male ATD. Physical dimension (code positions 13–14): this substring represents the type of the
The General Mission Analysis Tool (GMAT): Current Features And Adding Custom Functionality
NASA Technical Reports Server (NTRS)
Conway, Darrel J.; Hughes, Steven P.
2010-01-01
The General Mission Analysis Tool (GMAT) is a software system for trajectory optimization, mission analysis, trajectory estimation, and prediction developed by NASA, the Air Force Research Lab, and private industry. GMAT's design and implementation are based on four basic principles: open source visibility for both the source code and design documentation; platform independence; modular design; and user extensibility. The system, released under the NASA Open Source Agreement, runs on Windows, Mac and Linux. User extensions, loaded at run time, have been built for optimization, trajectory visualization, force model extension, and estimation, by parties outside of GMAT's development group. The system has been used to optimize maneuvers for the Lunar Crater Observation and Sensing Satellite (LCROSS) and ARTEMIS missions and is being used for formation design and analysis for the Magnetospheric Multiscale Mission (MMS).
An exploratory study of reiki experiences in women who have cancer.
Kirshbaum, Marilynne N; Stead, Maxine; Bartys, Serena
2016-04-02
To explore the perceptions and experiences of reiki for women who have cancer and to identify outcome measures for an intervention study, a cross-sectional qualitative study of 10 women who had received reiki after cancer treatment was conducted. Interviews were audiotaped, transcribed and coded using framework analysis. Key themes identified were: limited understanding of reiki prior to receiving any reiki; release of emotional strain during reiki (feelings of a release of energy, a clearing of the mind from cancer, inner peace/relaxation, hope, a sense of being cared for); experience of physical sensations during reiki, such as pain relief and tingling; and physical, emotional and cognitive improvements after reiki, such as improved sleep, a sense of calm and peace, reduced depression and improved self-confidence. Findings suggest that reiki could be a beneficial tool in the self-management of quality-of-life issues for women who have cancer.
Creating Synthetic Coronal Observational Data From MHD Models: The Forward Technique
NASA Technical Reports Server (NTRS)
Rachmeler, Laurel A.; Gibson, Sarah E.; Dove, James; Kucera, Therese Ann
2010-01-01
We present a generalized forward code for creating simulated coronal observables off the limb from numerical and analytical MHD models. This generalized forward model is capable of creating emission maps in various wavelengths for instruments such as SXT, EIT, and EIS, and for coronagraphs, as well as spectropolarimetric images and line profiles. The inputs to our code can be analytic models (of which four come with the code) or 2.5D and 3D numerical datacubes. We present some examples of the observable data created with our code, as well as its functional capabilities. This code is currently available for beta-testing (contact the authors), with the ultimate goal of release as a SolarSoft package.
Glass Fiber Resin Composites and Components at Arctic Temperatures
2015-06-01
NAVAL POSTGRADUATE SCHOOL, MONTEREY, CALIFORNIA. Master's Thesis: Glass Fiber Resin Composites and Components at Arctic Temperatures. Approved for public release; distribution is unlimited. ABSTRACT: Glass fiber reinforced composites (GFRC
Federal Register 2010, 2011, 2012, 2013, 2014
2011-10-31
... Response, Compensation, and Liability Act and the Texas Solid Waste Disposal Act Notice is hereby given... Texas Solid Waste Disposal Act, Texas Health & Safety Code Ann. Sec. Sec. 361.001 to 361.966 (hereafter... responding to the releases and threatened releases of solid wastes and hazardous substances at and from the...
Resonant Terahertz Absorption Using Metamaterial Structures
2012-12-01
NAVAL POSTGRADUATE SCHOOL, MONTEREY, CALIFORNIA. Thesis: Resonant Terahertz Absorption Using Metamaterial Structures. Second Reader: James H. Newman. Approved for public release; distribution is unlimited. ABSTRACT: The Sensor Research Lab at the Naval Postgraduate
USDA-ARS?s Scientific Manuscript database
Unlike the classical gonadotropin-releasing hormone (GNRH1), the second mammalian isoform (GNRH2) is an ineffective stimulant of gonadotropin release. Species that produce GNRH2 may not maintain a functional GNRH2 receptor (GNRHR2) due to coding errors. A full length GNRHR2 gene has been identified ...
GISMO: A MATLAB toolbox for seismic research, monitoring, & education
NASA Astrophysics Data System (ADS)
Thompson, G.; Reyes, C. G.; Kempler, L. A.
2017-12-01
GISMO is an open-source MATLAB toolbox which provides an object-oriented framework to build workflows and applications that read, process, visualize and write seismic waveform, catalog and instrument response data. GISMO can retrieve data from a variety of sources (e.g. FDSN web services, Earthworm/Winston servers) and data formats (SAC, Seisan, etc.), and it can handle waveform data that crosses file boundaries. All this alleviates one of the most time-consuming parts of developing one's own codes. GISMO simplifies seismic data analysis by providing a common interface for your data, regardless of its source. Several common plots are built in to GISMO, such as record section plots, spectrograms, depth-time sections, event counts per unit time, energy release per unit time, etc. Other visualizations include map views and cross-sections of hypocentral data. Several common processing methods are also included, such as an extensive set of tools for correlation analysis. Support is being added to interface GISMO with ObsPy. GISMO encourages community development of an integrated set of codes and accompanying documentation, eliminating the need for seismologists to "reinvent the wheel". By sharing code, the consistency and repeatability of results can be enhanced. GISMO is hosted on GitHub, with documentation both within the source code and in the project wiki. GISMO has been used at the University of South Florida and the University of Alaska Fairbanks in graduate-level courses including Seismic Data Analysis, Time Series Analysis and Computational Seismology. GISMO has also been tailored to interface with the common seismic monitoring software and data formats used by volcano observatories in the US and elsewhere; as an example, toolbox training was delivered to researchers at INETER (Nicaragua). Applications built on GISMO include IceWeb (e.g. web-based spectrograms), which has been used by the Alaska Volcano Observatory since 1998 and became the prototype for the USGS Pensive system.
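One of the built-in plots mentioned, event count per unit time, reduces to a simple binning of catalog origin times. GISMO itself is MATLAB; the stdlib Python sketch below is a generic illustration with hypothetical names, not GISMO code.

```python
from collections import Counter

def counts_per_bin(event_times, bin_width):
    """Bin event origin times (in seconds) into fixed-width windows
    and return {bin_start: count} -- the data behind an 'event count
    per unit time' monitoring plot."""
    c = Counter((t // bin_width) * bin_width for t in event_times)
    return dict(sorted(c.items()))

# e.g. hourly counts from origin times given in seconds
hourly = counts_per_bin([30, 90, 3700, 3750, 3800], bin_width=3600)
```

Energy release per unit time follows the same pattern, summing a per-event energy into each bin instead of counting.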
JSPAM: A restricted three-body code for simulating interacting galaxies
NASA Astrophysics Data System (ADS)
Wallin, J. F.; Holincheck, A. J.; Harvey, A.
2016-07-01
Restricted three-body codes have a proven ability to recreate much of the disturbed morphology of actual interacting galaxies. As more sophisticated n-body models were developed and computer speed increased, restricted three-body codes fell out of favor. However, their supporting role for performing wide searches of parameter space when fitting orbits to real systems demonstrates a continuing need for their use. Here we present the model and algorithm used in the JSPAM code. A precursor of this code was originally described in 1990, and was called SPAM. We have recently updated the software with an alternate potential and a treatment of dynamical friction to more closely mimic the results from n-body tree codes. The code is released publicly for use under the terms of the Academic Free License ("AFL") v. 3.0 and has been added to the Astrophysics Source Code Library.
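The core of a restricted three-body code of this kind is inexpensive: massless test particles are integrated in the combined, softened potential of the two galaxies. The sketch below is a generic Python illustration, not JSPAM itself; it holds the two mass centers fixed and omits the alternate potential and dynamical friction treatment the paper adds.

```python
import math

def accel(x, y, bodies, eps=0.1):
    """Acceleration on a massless test particle from softened point
    masses `bodies` = [(M, bx, by), ...], in G = 1 units."""
    ax = ay = 0.0
    for M, bx, by in bodies:
        dx, dy = bx - x, by - y
        r2 = dx * dx + dy * dy + eps * eps   # Plummer-style softening
        f = M / (r2 * math.sqrt(r2))
        ax += f * dx
        ay += f * dy
    return ax, ay

def leapfrog(x, y, vx, vy, bodies, dt, steps):
    """Kick-drift-kick leapfrog; the 'galaxies' stay fixed for brevity."""
    ax, ay = accel(x, y, bodies)
    for _ in range(steps):
        vx += 0.5 * dt * ax; vy += 0.5 * dt * ay   # half kick
        x += dt * vx;        y += dt * vy          # drift
        ax, ay = accel(x, y, bodies)
        vx += 0.5 * dt * ax; vy += 0.5 * dt * ay   # half kick
    return x, y, vx, vy
```

A quick sanity check: a particle launched on a near-circular orbit around a single unit mass stays at nearly constant radius over many steps, which is the property that makes such cheap integrators usable for wide parameter-space searches.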
DIANA-LncBase v2: indexing microRNA targets on non-coding transcripts.
Paraskevopoulou, Maria D; Vlachos, Ioannis S; Karagkouni, Dimitra; Georgakilas, Georgios; Kanellos, Ilias; Vergoulis, Thanasis; Zagganas, Konstantinos; Tsanakas, Panayiotis; Floros, Evangelos; Dalamagas, Theodore; Hatzigeorgiou, Artemis G
2016-01-04
microRNAs (miRNAs) are short non-coding RNAs (ncRNAs) that act as post-transcriptional regulators of coding gene expression. Long non-coding RNAs (lncRNAs) have been recently reported to interact with miRNAs. The sponge-like function of lncRNAs introduces an extra layer of complexity in the miRNA interactome. DIANA-LncBase v1 provided a database of experimentally supported and in silico predicted miRNA Recognition Elements (MREs) on lncRNAs. The second version of LncBase (www.microrna.gr/LncBase) presents an extensive collection of miRNA:lncRNA interactions. The significantly enhanced database includes more than 70 000 low and high-throughput, (in)direct miRNA:lncRNA experimentally supported interactions, derived from manually curated publications and the analysis of 153 AGO CLIP-Seq libraries. The new experimental module presents a 14-fold increase compared to the previous release. LncBase v2 hosts in silico predicted miRNA targets on lncRNAs, identified with the DIANA-microT algorithm. The relevant module provides millions of predicted miRNA binding sites, accompanied by detailed metadata and MRE conservation metrics. LncBase v2 provides information regarding cell type-specific miRNA:lncRNA regulation and enables users to easily identify interactions in 66 different cell types, spanning 36 tissues for human and mouse. Database entries are also supported by accurate lncRNA expression information, derived from the analysis of more than 6 billion RNA-Seq reads. © The Author(s) 2015. Published by Oxford University Press on behalf of Nucleic Acids Research.
The Ballistic Research Laboratory CAD Package Release, 4.0, Volume 1: The BRL CAD Philosophy
1991-12-01
basis. As with many large systems, parts of it were the result of years of evolution, with many band-aids, hacks, and backward compatibility... [6] and from a PATCH file via the FASTGEN code [5]. Over a period spanning more than fifteen years, two significant communities of vulnerability... Release 4.0 follow the man pages. Release 3.0 Notes and Errata Sheets are found in the last sections of this volume. Papers discussing supplemental
New developments and prospects on COSI, the simulation software for fuel cycle analysis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Eschbach, R.; Meyer, M.; Coquelet-Pascal, C.
2013-07-01
COSI, software developed by the Nuclear Energy Division of the CEA, is a code simulating a pool of nuclear power plants with its associated fuel cycle facilities. This code has been designed to study various short-, medium- and long-term options for the introduction of various types of nuclear reactors and for the use of the associated nuclear materials. In the frame of the French Act for waste management, scenario studies are carried out with COSI to compare different options for the evolution of the French reactor fleet and options for the partitioning and transmutation of plutonium and minor actinides. Those studies aim in particular at evaluating the sustainability of Sodium-cooled Fast Reactor (SFR) deployment and the possibility of transmuting minor actinides. The COSI6 version is completely renewed software released in 2006. COSI6 is now coupled with the latest version of CESAR (CESAR5.3, based on JEFF3.1.1 nuclear data), allowing calculations on irradiated fuel with 200 fission products and 100 heavy nuclides. A new release is planned in 2013, including in particular the coupling with a recommended database of reactors. An exercise of validation of COSI6, carried out on the French PWR historic nuclear fleet, has been performed. During this exercise, quantities such as cumulative natural uranium consumption, cumulative depleted uranium, UOX/MOX spent fuel storage, stocks of reprocessed uranium, plutonium content in fresh MOX fuel, and the annual production of high-level waste have been computed by COSI6 and compared to industrial data. The results have allowed us to validate the essential phases of the fuel cycle computation, and reinforce the credibility of the results provided by the code.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ozaki, N.; Lappalainen, J.; Linnoila, M.
Serotonin (5-HT){sub 1D} receptors are 5-HT release-regulating autoreceptors in the human brain. Abnormalities in brain 5-HT function have been hypothesized in the pathophysiology of various psychiatric disorders, including obsessive-compulsive disorder, autism, mood disorders, eating disorders, impulsive violent behavior, and alcoholism. Thus, mutations occurring in 5-HT autoreceptors may cause or increase the vulnerability to any of these conditions. The 5-HT{sub 1D{alpha}} and 5-HT{sub 1D{Beta}} subtypes have previously been localized to chromosomes 1p36.3-p34.3 and 6q13, respectively, using rodent-human hybrids and in situ localization. In this communication, we report the detection of a 5-HT{sub 1D{alpha}} receptor gene polymorphism by single-strand conformation polymorphism (SSCP) analysis of the coding sequence. The polymorphism was used for fine-scale linkage mapping of 5-HT{sub 1D{alpha}} on chromosome 1. This polymorphism should also be useful for linkage studies in populations and in families. Our analysis also demonstrates that functionally significant coding sequence variants of 5-HT{sub 1D{alpha}} are probably not abundant either among alcoholics or in the general population. 14 refs., 1 fig., 1 tab.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
The ARGUS code is a three-dimensional code system for simulating interactions between charged particles, electric and magnetic fields, and complex structures. It is a system of modules that share common utilities for grid and structure input, data handling, memory management, diagnostics, and other specialized functions. The code includes the fields due to the space charge and current density of the particles to achieve a self-consistent treatment of the particle dynamics. The physics modules in ARGUS include three-dimensional field solvers for electrostatics and electromagnetics, a three-dimensional electromagnetic frequency-domain module, a full particle-in-cell (PIC) simulation module, and a steady-state PIC model. These are described in the Appendix to this report. This project has a primary mission of developing the capabilities of ARGUS in accelerator modeling for release to the accelerator design community. Five major activities are being pursued in parallel during the first year of the project: to improve the code and/or add new modules that provide capabilities needed for accelerator design; to produce a User's Guide that documents the use of the code for all users; to release the code and the User's Guide to accelerator laboratories for their own use, and to obtain feedback from them; to build an interactive user interface for setting up ARGUS calculations; and to explore the use of ARGUS on high-power workstation platforms.
Combustion-acoustic stability analysis for premixed gas turbine combustors
NASA Technical Reports Server (NTRS)
Darling, Douglas; Radhakrishnan, Krishnan; Oyediran, Ayo; Cowan, Lizabeth
1995-01-01
Lean, prevaporized, premixed combustors are susceptible to combustion-acoustic instabilities. A model was developed to predict eigenvalues of axial modes for combustion-acoustic interactions in a premixed combustor. This work extends previous work by including variable area and detailed chemical kinetics mechanisms, using the code LSENS. Thus the acoustic equations could be integrated through the flame zone. Linear perturbations were made of the continuity, momentum, energy, chemical species, and state equations. The qualitative accuracy of our approach was checked by examining its predictions for various unsteady heat release rate models. Perturbations in fuel flow rate are currently being added to the model.
Giloteaux, Ludovic; Holmes, Dawn E; Williams, Kenneth H; Wrighton, Kelly C; Wilkins, Michael J; Montgomery, Alison P; Smith, Jessica A; Orellana, Roberto; Thompson, Courtney A; Roper, Thomas J; Long, Philip E; Lovley, Derek R
2013-01-01
The possibility of arsenic release and the potential role of Geobacter in arsenic biogeochemistry during in situ uranium bioremediation was investigated because increased availability of organic matter has been associated with substantial releases of arsenic in other subsurface environments. In a field experiment conducted at the Rifle, CO study site, groundwater arsenic concentrations increased when acetate was added. The number of transcripts from arrA, which codes for the α-subunit of dissimilatory As(V) reductase, and acr3, which codes for the arsenic pump protein Acr3, were determined with quantitative reverse transcription-PCR. Most of the arrA (>60%) and acr3-1 (>90%) sequences that were recovered were most similar to Geobacter species, while the majority of acr3-2 (>50%) sequences were most closely related to Rhodoferax ferrireducens. Analysis of transcript abundance demonstrated that transcription of acr3-1 by the subsurface Geobacter community was correlated with arsenic concentrations in the groundwater. In contrast, Geobacter arrA transcript numbers lagged behind the major arsenic release and remained high even after arsenic concentrations declined. This suggested that factors other than As(V) availability regulated the transcription of arrA in situ, even though the presence of As(V) increased the transcription of arrA in cultures of Geobacter lovleyi, which was capable of As(V) reduction. These results demonstrate that subsurface Geobacter species can tightly regulate their physiological response to changes in groundwater arsenic concentrations. The transcriptomic approach developed here should be useful for the study of a diversity of other environments in which Geobacter species are considered to have an important influence on arsenic biogeochemistry. PMID:23038171
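The key quantitative step described above, relating acr3-1 transcript abundance to groundwater arsenic, amounts to a correlation between two paired time series. A minimal sketch follows; the paired values are illustrative stand-ins, not the Rifle measurements.

```python
import math

# Hypothetical paired measurements (illustrative only): Geobacter acr3-1
# transcript abundance and groundwater arsenic per sampling day.
transcripts = [1.2e3, 3.5e3, 8.9e3, 7.1e3, 2.4e3]
arsenic_um = [0.10, 0.35, 0.80, 0.65, 0.22]

def pearson_r(x, y):
    """Pearson correlation coefficient between two equal-length series."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

r = pearson_r(transcripts, arsenic_um)
print(f"acr3-1 transcripts vs. As: r = {r:.3f}")
```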
The Future of ECHO: Evaluating Open Source Possibilities
NASA Astrophysics Data System (ADS)
Pilone, D.; Gilman, J.; Baynes, K.; Mitchell, A. E.
2012-12-01
NASA's Earth Observing System ClearingHOuse (ECHO) is a format-agnostic metadata repository supporting over 3000 collections and 100M science granules. ECHO exposes FTP and RESTful Data Ingest APIs in addition to both SOAP and RESTful search and order capabilities. Built on top of ECHO is a human-facing search and order web application named Reverb. ECHO processes hundreds of orders, tens of thousands of searches, and 1-2M ingest actions each week. As ECHO's holdings, metadata format support, and visibility have increased, the ECHO team has received requests from non-NASA entities for copies of ECHO that can be run locally against their data holdings. ESDIS and the ECHO Team have begun investigations into various deployment and Open Source models that can balance the real constraints faced by the ECHO project with the benefits of providing ECHO capabilities to a broader set of users and providers. This talk will discuss several release and Open Source models being investigated by the ECHO team, along with the impacts those models are expected to have on the project. We discuss: - Addressing complex deployment or setup issues for potential users - Models of vetting code contributions - Balancing external (public) user requests versus our primary partners - Preparing project code for public release, including navigating licensing issues related to leveraged libraries - Dealing with non-free project dependencies such as commercial databases - Dealing with sensitive aspects of project code such as database passwords, authentication approaches, security through obscurity, etc. - Ongoing support for the released code, including increased testing demands, bug fixes, security fixes, and new features.
Metalloid Aluminum Clusters with Fluorine
2016-12-01
Keywords: molecular dynamics, binding energy, SIESTA code, density of states, projected density of states. ...high energy density compared to explosives, but typically release this energy slowly via diffusion-limited combustion. There is recent interest in using... examine the cluster binding energy and electronic structure. Partial fluorine substitution in a prototypical aluminum-cyclopentadienyl cluster results
30 CFR 948.13 - State statutory and regulatory provisions set aside.
Code of Federal Regulations, 2010 CFR
2010-07-01
... wording in section 22A-3-23(c)(3) of the Code of West Virginia is inconsistent with section 519(c)(3) of..., That such a release may be made where the quality of the untreated postmining water discharged is...(e) of the Code of West Virginia is inconsistent with section 515(e) of the Surface Mining Control...
2014-03-01
ADMINISTRATIVE INFORMATION The work described in this report was performed by the Unmanned Systems Science & Technology Branch (Code 71710) and the...Unmanned Systems Advanced Development Branch (Code 71720), Space and Naval Warfare Systems Center Pacific (SSC Pacific), San Diego, CA, and the Air...Earth™ is a trademark of Google Inc. Released by T. Pastore, Head Unmanned Systems Science & Technology Branch Under authority of A. D
Reaction path of energetic materials using THOR code
NASA Astrophysics Data System (ADS)
Duraes, L.; Campos, J.; Portugal, A.
1997-07-01
The THOR thermochemical computer code predicts reaction paths by calculating, for isobaric and isochoric adiabatic combustion and CJ detonation regimes, the composition and thermodynamic properties of the reaction products of energetic materials. The THOR code assumes thermodynamic equilibrium of all possible products, at the minimum Gibbs free energy, using a thermal equation of state (EoS). The HL EoS used here was developed in previous work; it is supported by a Boltzmann EoS, taking α = 13.5 as the exponent of the intermolecular potential and θ = 1.4 as the dimensionless temperature. The code now allows the estimation of successive sets of reaction products, obtained by decomposition of the original reacting compound, as a function of the released energy. Two case studies of the thermal decomposition procedure were selected, described, calculated and discussed - ammonium nitrate based explosives and nitromethane - because they are well-known explosives whose equivalence ratios are, respectively, near and greater than stoichiometry. Predictions of the detonation properties of other condensed explosives, as a function of energy release, are in good agreement with experimental values.
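The core operation described above, finding the product composition that minimizes Gibbs free energy subject to element conservation, can be sketched for a toy CO2/CO/O2 system. This is not THOR's solver or its fitted EoS data: the chemical potentials below are illustrative placeholder values and the ideal-mixing form stands in for the HL EoS.

```python
import numpy as np
from scipy.optimize import minimize

R, T = 8.314, 3000.0                    # J/mol/K, a detonation-scale temperature
# Standard-state chemical potentials at T (J/mol) -- illustrative placeholders:
mu0 = np.array([-396e3, -110e3, 0.0])   # CO2, CO, O2
# Element balance matrix: rows = elements (C, O), cols = species (CO2, CO, O2)
A = np.array([[1, 1, 0],
              [2, 1, 2]])
b = np.array([1.0, 2.0])                # element totals for 1 mol of CO2

def gibbs(n):
    """Ideal-mixture Gibbs free energy G = sum n_i (mu0_i + RT ln(n_i/n_tot))."""
    n = np.maximum(n, 1e-12)            # keep the logarithms finite
    return np.sum(n * (mu0 + R * T * np.log(n / n.sum())))

cons = {"type": "eq", "fun": lambda n: A @ n - b}
res = minimize(gibbs, x0=np.array([0.5, 0.5, 0.25]),   # element-balanced guess
               bounds=[(1e-10, None)] * 3, constraints=cons, method="SLSQP")
print("equilibrium moles (CO2, CO, O2):", res.x.round(4))
```

With these placeholder potentials the minimum sits almost entirely at CO2, as expected for a strongly bound product; THOR repeats this minimization over much larger product sets and couples it to the HL EoS.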
Wilson, Sarah E; Deeks, Shelley L; Rosella, Laura C
2015-09-15
In Ontario, Canada, we conducted an evaluation of rotavirus (RV) vaccine impact on hospitalizations and Emergency Department (ED) visits for acute gastroenteritis (AGE). In our original analysis, any one of the following International Classification of Diseases, 10th Revision (ICD-10) codes was used for outcome ascertainment: RV-specific (A08.0), viral (A08.3, A08.4, A08.5), and unspecified infectious gastroenteritis (A09). Annual age-specific rates per 10,000 population were calculated. The average monthly rate of AGE hospitalization for children under age two increased from 0.82 per 10,000 over January 2003 to March 2009, to 2.35 over April 2009 to March 31, 2013. Similar trends were found for ED consultations and in other age groups. A rise in events corresponding to the A09 code was found when the outcome definition was disaggregated by ICD-10 code. Documentation obtained from the World Health Organization confirmed that a change in directive for the classification of unspecified gastroenteritis occurred with the release of ICD-10 in April 2009. AGE events previously classified under the code K52.9 are now classified under code A09.9. Based on this change in the classification of unspecified gastroenteritis, we modified our outcome definition to also include unspecified non-infectious gastroenteritis (K52.9). We recommend other investigators consider using both A09.9 and K52.9 ICD-10 codes for outcome ascertainment in future rotavirus vaccine impact studies, to ensure that all unspecified cases of AGE are captured, especially if the study period spans 2009.
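The outcome-ascertainment step with the broadened code set can be sketched as a simple filter-and-rate calculation. The code list follows the abstract; the record layout and population figure are illustrative assumptions.

```python
# ICD-10 codes for AGE outcome ascertainment, including the post-2009
# unspecified codes A09.9 and K52.9 recommended in the abstract.
AGE_CODES = {"A08.0", "A08.3", "A08.4", "A08.5", "A09", "A09.9", "K52.9"}

def monthly_rate(records, population):
    """AGE events per 10,000 population for one month of discharge records."""
    events = sum(1 for r in records if r["icd10"] in AGE_CODES)
    return events / population * 10_000

# Hypothetical one-month extract: two AGE events, one unrelated diagnosis.
records = [{"icd10": "A09.9"}, {"icd10": "K52.9"}, {"icd10": "J45.0"}]
print(monthly_rate(records, population=250_000))  # -> 0.08
```

Omitting K52.9 from `AGE_CODES` would silently drop unspecified cases coded before the 2009 directive change, which is exactly the undercounting the authors warn about.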
BRD4 assists elongation of both coding and enhancer RNAs guided by histone acetylation
Kanno, Tomohiko; Kanno, Yuka; LeRoy, Gary; Campos, Eric; Sun, Hong-Wei; Brooks, Stephen R; Vahedi, Golnaz; Heightman, Tom D; Garcia, Benjamin A; Reinberg, Danny; Siebenlist, Ulrich; O’Shea, John J; Ozato, Keiko
2016-01-01
Small-molecule BET inhibitors interfere with the epigenetic interactions between acetylated histones and the bromodomains of the BET family proteins, including BRD4, and they potently inhibit growth of malignant cells by targeting cancer-promoting genes. BRD4 interacts with the pause-release factor P-TEFb, and has been proposed to release Pol II from promoter-proximal pausing. We show that BRD4 occupied widespread genomic regions in mouse cells, and directly stimulated elongation of both protein-coding transcripts and non-coding enhancer RNAs (eRNAs), dependent on the function of bromodomains. BRD4 interacted physically with elongating Pol II complexes, and assisted Pol II progression through hyper-acetylated nucleosomes by interacting with acetylated histones via bromodomains. On active enhancers, the BET inhibitor JQ1 antagonized BRD4-associated eRNA synthesis. Thus, BRD4 is involved in multiple steps of the transcription hierarchy, primarily by assisting transcript elongation both at enhancers and on gene bodies. PMID:25383670
DOE Office of Scientific and Technical Information (OSTI.GOV)
Greene, N.M.; Ford, W.E. III; Petrie, L.M.
AMPX-77 is a modular system of computer programs that pertain to nuclear analyses, with a primary emphasis on tasks associated with the production and use of multigroup cross sections. All basic cross-section data are input in the formats used by the Evaluated Nuclear Data Files (ENDF/B), and output can be obtained in a variety of formats, including its own internal and very general formats, along with a variety of other useful formats used by major transport, diffusion theory, and Monte Carlo codes. Processing is provided for both neutron and gamma-ray data. The present release contains codes all written in the FORTRAN-77 dialect of FORTRAN and will process ENDF/B-V and earlier evaluations; major modules are being upgraded to process ENDF/B-VI and will be released when a complete collection of usable routines is available.
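The basic operation behind multigroup cross-section production is a flux-weighted collapse of fine-group (or pointwise) data into broad groups. The sketch below shows that collapse in isolation; the cross sections, weighting flux, and group boundaries are illustrative numbers, not AMPX data.

```python
import numpy as np

# Flux-weighted group collapse: sigma_G = sum(sigma_g * phi_g) / sum(phi_g)
# over the fine groups g belonging to each broad group G.
sigma_fine = np.array([10.0, 8.0, 5.0, 2.0, 1.0, 0.5])   # barns per fine group
phi_fine = np.array([1.0, 2.0, 4.0, 6.0, 3.0, 1.0])      # weighting flux
broad_map = [(0, 3), (3, 6)]    # fine-group index ranges for two broad groups

sigma_broad = [
    float((sigma_fine[a:b] * phi_fine[a:b]).sum() / phi_fine[a:b].sum())
    for a, b in broad_map
]
print(sigma_broad)  # -> [6.571..., 1.55]
```

The weighting flux matters: a collapse weighted toward the thermal groups gives a different broad-group constant than one weighted toward a fast spectrum, which is why processing codes carry a representative spectrum through this step.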
Preliminary study of the CRRES magnetospheric barium releases
NASA Technical Reports Server (NTRS)
Huba, J. D.; Bernhardt, P. A.; Lyon, J. G.
1992-01-01
Preliminary theoretical and computational analyses of the Combined Release and Radiation Effects Satellite (CRRES) magnetospheric barium releases are presented. The focus of the studies is on the evolution of the diamagnetic cavity which is formed by the barium ions as they expand outward, and on the structuring of the density and magnetic field during the expansion phase of the releases. Two sets of simulation studies are discussed. The first set is based upon a 2D ideal MHD code and provides estimates of the time and length scales associated with the formation and collapse of the diamagnetic cavity. The second set uses a nonideal MHD code; specifically, the Hall term is included. This additional term is critical to the dynamics of sub-Alfvenic plasma expansions, such as the CRRES barium releases, because it leads to instability of the expanding plasma. Detailed simulations of the G4 and G10 releases were performed. In both cases the expanding plasma rapidly structured: the G4 release structured at time t less than about 3 s and developed scale sizes of about 1-2 km, while the G10 release structured at time t less than about 22 s and developed scale sizes of about 10-15 km. It is also found that the diamagnetic cavity size is reduced from that obtained in the ideal MHD results because of the structuring. On the other hand, the structuring allows the formation of plasma blobs which appear to free-stream across the magnetic field; thus, the barium plasma can propagate to larger distances transverse to the magnetic field than in the case where no structuring occurs. Finally, a new normal mode of the system was discovered which may be excited at the leading edge of the expanding barium plasma.
NASA Astrophysics Data System (ADS)
Aartsen, M. G.; Abraham, K.; Ackermann, M.; Adams, J.; Aguilar, J. A.; Ahlers, M.; Ahrens, M.; Altmann, D.; Anderson, T.; Ansseau, I.; Anton, G.; Archinger, M.; Arguelles, C.; Arlen, T. C.; Auffenberg, J.; Bai, X.; Barwick, S. W.; Baum, V.; Bay, R.; Beatty, J. J.; Becker Tjus, J.; Becker, K.-H.; Beiser, E.; BenZvi, S.; Berghaus, P.; Berley, D.; Bernardini, E.; Bernhard, A.; Besson, D. Z.; Binder, G.; Bindig, D.; Bissok, M.; Blaufuss, E.; Blumenthal, J.; Boersma, D. J.; Bohm, C.; Börner, M.; Bos, F.; Bose, D.; Böser, S.; Botner, O.; Braun, J.; Brayeur, L.; Bretz, H.-P.; Buzinsky, N.; Casey, J.; Casier, M.; Cheung, E.; Chirkin, D.; Christov, A.; Clark, K.; Classen, L.; Coenders, S.; Collin, G. H.; Conrad, J. M.; Cowen, D. F.; Cruz Silva, A. H.; Danninger, M.; Daughhetee, J.; Davis, J. C.; Day, M.; de André, J. P. A. M.; De Clercq, C.; del Pino Rosendo, E.; Dembinski, H.; De Ridder, S.; Desiati, P.; de Vries, K. D.; de Wasseige, G.; de With, M.; DeYoung, T.; Díaz-Vélez, J. C.; di Lorenzo, V.; Dumm, J. P.; Dunkman, M.; Eberhardt, B.; Edsjö, J.; Ehrhardt, T.; Eichmann, B.; Euler, S.; Evenson, P. A.; Fahey, S.; Fazely, A. R.; Feintzeig, J.; Felde, J.; Filimonov, K.; Finley, C.; Flis, S.; Fösig, C.-C.; Fuchs, T.; Gaisser, T. K.; Gaior, R.; Gallagher, J.; Gerhardt, L.; Ghorbani, K.; Gier, D.; Gladstone, L.; Glagla, M.; Glüsenkamp, T.; Goldschmidt, A.; Golup, G.; Gonzalez, J. G.; Góra, D.; Grant, D.; Griffith, Z.; Groß, A.; Ha, C.; Haack, C.; Haj Ismail, A.; Hallgren, A.; Halzen, F.; Hansen, E.; Hansmann, B.; Hanson, K.; Hebecker, D.; Heereman, D.; Helbing, K.; Hellauer, R.; Hickford, S.; Hignight, J.; Hill, G. C.; Hoffman, K. D.; Hoffmann, R.; Holzapfel, K.; Homeier, A.; Hoshina, K.; Huang, F.; Huber, M.; Huelsnitz, W.; Hulth, P. O.; Hultqvist, K.; In, S.; Ishihara, A.; Jacobi, E.; Japaridze, G. S.; Jeong, M.; Jero, K.; Jones, B. J. P.; Jurkovic, M.; Kappes, A.; Karg, T.; Karle, A.; Katz, U.; Kauer, M.; Keivani, A.; Kelley, J. 
L.; Kemp, J.; Kheirandish, A.; Kiryluk, J.; Klein, S. R.; Kohnen, G.; Koirala, R.; Kolanoski, H.; Konietz, R.; Köpke, L.; Kopper, C.; Kopper, S.; Koskinen, D. J.; Kowalski, M.; Krings, K.; Kroll, G.; Kroll, M.; Krückl, G.; Kunnen, J.; Kurahashi, N.; Kuwabara, T.; Labare, M.; Lanfranchi, J. L.; Larson, M. J.; Lesiak-Bzdak, M.; Leuermann, M.; Leuner, J.; Lu, L.; Lünemann, J.; Madsen, J.; Maggi, G.; Mahn, K. B. M.; Mandelartz, M.; Maruyama, R.; Mase, K.; Matis, H. S.; Maunu, R.; McNally, F.; Meagher, K.; Medici, M.; Meier, M.; Meli, A.; Menne, T.; Merino, G.; Meures, T.; Miarecki, S.; Middell, E.; Mohrmann, L.; Montaruli, T.; Morse, R.; Nahnhauer, R.; Naumann, U.; Neer, G.; Niederhausen, H.; Nowicki, S. C.; Nygren, D. R.; Obertacke Pollmann, A.; Olivas, A.; Omairat, A.; O'Murchadha, A.; Palczewski, T.; Pandya, H.; Pankova, D. V.; Paul, L.; Pepper, J. A.; Pérez de los Heros, C.; Pfendner, C.; Pieloth, D.; Pinat, E.; Posselt, J.; Price, P. B.; Przybylski, G. T.; Quinnan, M.; Raab, C.; Rädel, L.; Rameez, M.; Rawlins, K.; Reimann, R.; Relich, M.; Resconi, E.; Rhode, W.; Richman, M.; Richter, S.; Riedel, B.; Robertson, S.; Rongen, M.; Rott, C.; Ruhe, T.; Ryckbosch, D.; Sabbatini, L.; Sander, H.-G.; Sandrock, A.; Sandroos, J.; Sarkar, S.; Savage, C.; Schatto, K.; Schimp, M.; Schlunder, P.; Schmidt, T.; Schoenen, S.; Schöneberg, S.; Schönwald, A.; Schulte, L.; Schumacher, L.; Scott, P.; Seckel, D.; Seunarine, S.; Silverwood, H.; Soldin, D.; Song, M.; Spiczak, G. M.; Spiering, C.; Stahlberg, M.; Stamatikos, M.; Stanev, T.; Stasik, A.; Steuer, A.; Stezelberger, T.; Stokstad, R. G.; Stößl, A.; Ström, R.; Strotjohann, N. L.; Sullivan, G. W.; Sutherland, M.; Taavola, H.; Taboada, I.; Tatar, J.; Ter-Antonyan, S.; Terliuk, A.; Tešić, G.; Tilav, S.; Toale, P. A.; Tobin, M. 
N.; Toscano, S.; Tosi, D.; Tselengidou, M.; Turcati, A.; Unger, E.; Usner, M.; Vallecorsa, S.; Vandenbroucke, J.; van Eijndhoven, N.; Vanheule, S.; van Santen, J.; Veenkamp, J.; Vehring, M.; Voge, M.; Vraeghe, M.; Walck, C.; Wallace, A.; Wallraff, M.; Wandkowsky, N.; Weaver, Ch.; Wendt, C.; Westerhoff, S.; Whelan, B. J.; Wiebe, K.; Wiebusch, C. H.; Wille, L.; Williams, D. R.; Wills, L.; Wissing, H.; Wolf, M.; Wood, T. R.; Woschnagg, K.; Xu, D. L.; Xu, X. W.; Xu, Y.; Yanez, J. P.; Yodh, G.; Yoshida, S.; Zoll, M.
2016-04-01
We present an improved event-level likelihood formalism for including neutrino telescope data in global fits to new physics. We derive limits on spin-dependent dark matter-proton scattering by employing the new formalism in a re-analysis of data from the 79-string IceCube search for dark matter annihilation in the Sun, including explicit energy information for each event. The new analysis excludes a number of models in the weak-scale minimal supersymmetric standard model (MSSM) for the first time. This work is accompanied by the public release of the 79-string IceCube data, as well as an associated computer code for applying the new likelihood to arbitrary dark matter models.
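The event-level likelihood described above can be sketched as an unbinned mixture: a Poisson term for the total observed count times, for each event, a weighted sum of signal and background energy PDFs. This is an illustrative stand-in, not the IceCube likelihood; the spectra, rates, and event energies below are toy assumptions.

```python
import math

def log_likelihood(mu_s, mu_b, energies, f_sig, f_bg):
    """Unbinned extended log-likelihood: Poisson(N; mu_s+mu_b) times a
    per-event signal/background mixture of energy PDFs (constants dropped)."""
    mu = mu_s + mu_b
    logL = -mu + len(energies) * math.log(mu)        # extended (Poisson) term
    for E in energies:
        logL += math.log((mu_s * f_sig(E) + mu_b * f_bg(E)) / mu)
    return logL

f_sig = lambda E: math.exp(-E / 50.0) / 50.0         # toy signal spectrum (1/GeV)
f_bg = lambda E: math.exp(-E / 10.0) / 10.0          # toy background spectrum
events = [5.0, 12.0, 30.0, 75.0]                     # hypothetical event energies
print(log_likelihood(2.0, 3.0, events, f_sig, f_bg))
```

Scanning `mu_s` over the predictions of candidate dark matter models, with the per-event energy term included, is what lets an event-level formalism exclude models that a count-only analysis cannot distinguish.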
Design and Development of a Counter Swarm Prototype Air Vehicle
2017-12-01
NAVAL POSTGRADUATE SCHOOL, MONTEREY, CALIFORNIA. Thesis. Approved for public release. Distribution is unlimited.
28 CFR 2.68 - Prisoners transferred pursuant to treaty.
Code of Federal Regulations, 2010 CFR
2010-07-01
... date set by the Commission controls. If the release date set by the Commission under 18 U.S.C. 4106A(b... controls. (7) It is the Commission's interpretation of 18 U.S.C. 4106A that U.S. Code provisions for... transferees in mind. Thus, in every transfer treaty case, the release date will be determined through an...
2010-06-01
NAVAL POSTGRADUATE SCHOOL, MONTEREY, CALIFORNIA. Thesis. Approved for public release; distribution is unlimited. HIGH TERAHERTZ... Advisors: Gamani Karunasiri, Dragoslav Grbovic. The terahertz (THz) region of the
Tunable Bandwidth Quantum Well Infrared Photo Detector (TB-QWIP)
2003-12-01
NAVAL POSTGRADUATE SCHOOL, MONTEREY, CALIFORNIA. Thesis. Approved for public release; distribution is unlimited. TUNABLE... In this thesis a
Spectroscopic Imaging with an Uncooled Microbolometer Infrared Camera and Step-Scan FTIR
2006-12-01
NAVAL POSTGRADUATE SCHOOL, MONTEREY, CALIFORNIA. Thesis. Approved for public release; distribution is unlimited. SPECTROSCOPIC... The purpose of this
2014-06-01
NAVAL POSTGRADUATE SCHOOL, MONTEREY, CALIFORNIA. Thesis. Approved for public release; distribution is unlimited. MODELING AND...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gupta, S.K.; Cole, C.R.; Bond, F.W.
1979-12-01
The Assessment of Effectiveness of Geologic Isolation Systems (AEGIS) Program is developing and applying methodology for assessing the far-field, long-term, post-closure safety of deep geologic nuclear waste repositories. AEGIS is being performed by Pacific Northwest Laboratory (PNL) under contract with the Office of Nuclear Waste Isolation (ONWI) for the Department of Energy (DOE). One task within AEGIS is the development of methodology for analysis of the consequences (water pathway) from loss of repository containment as defined by various release scenarios. Analysis of the long-term, far-field consequences of release scenarios requires the application of numerical codes which simulate the hydrologic systems, model the transport of released radionuclides through the hydrologic systems to the biosphere, and, where applicable, assess the radiological dose to humans. Hydrologic and transport models are available at several levels of complexity or sophistication. Model selection and use are determined by the quantity and quality of input data. Model development under AEGIS and related programs provides three levels of hydrologic models, two levels of transport models, and one level of dose models (with several separate models). This document describes the FE3DGW (Finite Element, Three-Dimensional Groundwater) hydrologic model, a third-level (high-complexity), three-dimensional finite element approach (Galerkin formulation) for saturated groundwater flow.
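A Galerkin finite-element formulation of saturated groundwater flow can be illustrated by its simplest one-dimensional analogue: steady flow -d/dx(K dh/dx) = 0 with fixed heads at both ends, assembled from linear elements. This is a toy sketch of the method, not the FE3DGW code; the domain, conductivity, and boundary heads are illustrative assumptions.

```python
import numpy as np

# 1-D Galerkin FE sketch of steady saturated flow with Dirichlet heads.
n_el, L, K = 10, 100.0, 1e-4          # elements, domain length (m), K (m/s)
h_left, h_right = 10.0, 2.0           # prescribed boundary heads (m)
dx = L / n_el
n_nodes = n_el + 1

# Assemble the global conductance (stiffness) matrix from identical
# linear-element matrices ke = (K/dx) * [[1, -1], [-1, 1]].
Kmat = np.zeros((n_nodes, n_nodes))
ke = (K / dx) * np.array([[1.0, -1.0], [-1.0, 1.0]])
for e in range(n_el):
    Kmat[e:e + 2, e:e + 2] += ke

# Impose the Dirichlet conditions by row replacement, then solve K h = rhs.
rhs = np.zeros(n_nodes)
Kmat[0, :], Kmat[0, 0], rhs[0] = 0.0, 1.0, h_left
Kmat[-1, :], Kmat[-1, -1], rhs[-1] = 0.0, 1.0, h_right
h = np.linalg.solve(Kmat, rhs)
print(h[:3])   # linear head profile: 10.0, 9.2, 8.4, ...
```

With uniform K the discrete solution reproduces the exact linear head profile at the nodes; the three-dimensional code follows the same assembly-and-solve pattern with tetrahedral or hexahedral elements and spatially varying conductivity.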
The Production Data Approach for Full Lifecycle Management
NASA Astrophysics Data System (ADS)
Schopf, J.
2012-04-01
The amount of data generated by scientists is growing exponentially, and studies have shown [Koe04] that un-archived data sets have a resource half-life that is only a fraction of that of resources that are electronically archived. Most groups still lack standard approaches and procedures for data management. Arguably, however, scientists know something about building software. A recent article in Nature [Mer10] stated that 45% of research scientists spend more time now developing software than they did 5 years ago, and 38% spent at least one fifth of their time developing software. Fox argues [Fox10] that a simple release of data is not the correct approach to data curation. In addition, just as software is used in a wide variety of ways never initially envisioned by its developers, we are seeing this to an even greater extent with data sets. In order to address the need for better data preservation and access, we propose that data sets should be managed in a similar fashion to production-quality software. These production data sets are not simply published once, but go through a cyclical process with phases such as design, development, verification, deployment, support, analysis, and then development again, thereby supporting the full lifecycle of a data set. The process involved in academically produced software changes over time with respect to issues such as how much it is used outside the development group, but it factors in aspects such as knowing who is using the code, enabling multiple developers to contribute with common procedures, formal testing and release processes, developing documentation, and licensing. When we work with data, whether as a collection source, as someone tagging data, or as someone re-using it, many of the lessons learned in building production software are applicable. Table 1 shows a comparison of production software elements to production data elements.
Table 1: Comparison of production software and production data.
Production Software | Production Data
End-user considerations | End-user considerations
Multiple coders: repository with check-in procedures; coding standards | Multiple producers/collectors: local archive with check-in procedure; metadata standards
Formal testing | Formal testing
Bug tracking and fixes | Bug tracking and fixes, QA/QC
Documentation | Documentation
Formal release process | Formal release process to external archive
License | Citation/usage statement
The full presentation of this abstract will include a detailed discussion of these issues so that researchers can produce usable and accessible data sets as a first step toward reproducible science. By creating production-quality data sets, we extend the potential of our data, both in terms of usability and usefulness to ourselves and other researchers. The more we treat data with formal processes and release cycles, the more relevant and useful it can be to the scientific community.
Sensitivity analysis of Monju using ERANOS with JENDL-4.0
DOE Office of Scientific and Technical Information (OSTI.GOV)
Tamagno, P.; Van Rooijen, W. F. G.; Takeda, T.
2012-07-01
This paper deals with sensitivity analysis of the Monju reactor using JENDL-4.0 nuclear data. In 2010 the Japan Atomic Energy Agency (JAEA) released a new set of nuclear data, JENDL-4.0, which is expected to contain improved data on actinides and covariance matrices. Covariance matrices are a key point in the quantification of uncertainties due to basic nuclear data. For the sensitivity analysis, the well-established ERANOS [1] code was chosen because of its integrated modules that allow users to perform a sensitivity analysis of complex reactor geometries. A JENDL-4.0 cross-section library is not available for ERANOS, so a cross-section library had to be made from the original nuclear data set, available as ENDF formatted files. This is achieved by using the NJOY, CALENDF, MERGE and GECCO codes to create a library for the ECCO cell code (part of ERANOS). To verify the accuracy of the new ECCO library, two benchmark experiments were analyzed: the MZA and MZB cores of the MOZART program, measured at the ZEBRA facility in the UK and chosen for their similarity to the Monju core. Using the JENDL-4.0 ECCO library we have analyzed the criticality of Monju during the restart in 2010 and obtained good agreement with the measured criticality. Perturbation calculations have been performed between JENDL-3.3 and JENDL-4.0 based models; the isotopes 239Pu, 238U, 241Am and 241Pu account for a major part of the observed differences. (authors)
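The role of covariance matrices in this kind of analysis can be shown with the standard "sandwich rule": the relative variance of a response such as k-eff is S^T C S, where S is the sensitivity vector and C the relative covariance matrix of the nuclear data. The three-parameter numbers below are illustrative, not Monju or JENDL-4.0 values.

```python
import numpy as np

# Sandwich rule for uncertainty propagation: var(k)/k^2 = S^T C S.
S = np.array([0.30, -0.15, 0.05])          # sensitivities (dk/k per dsigma/sigma)
C = np.array([[4.0e-4, 1.0e-4, 0.0],
              [1.0e-4, 9.0e-4, 0.0],
              [0.0,    0.0,    2.5e-3]])   # relative covariance matrix (toy)

var_k = S @ C @ S                          # relative variance of k-eff
print(f"k-eff uncertainty: {np.sqrt(var_k) * 100:.3f} %")
```

The off-diagonal terms are why covariance data matter: with correlated cross sections, individually small sensitivities can reinforce or cancel, so dropping the off-diagonal entries changes the propagated uncertainty.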
Report of the Defense Review Committee for the Code of Conduct. Volume 2. Supplement,
1976-01-01
think of all the innocent people killed, schools and churches destroyed, medical aid stations ruined ...." "During the (preflight) briefings, I was... other people could live by the Code of Conduct, then I could too, but I don't think you could ever find anybody who was ever released that lived... Code of Conduct; it was trying just to survive." "I think there is a need for some sort of document to assist people, particularly people who could
esATAC: An Easy-to-use Systematic pipeline for ATAC-seq data analysis.
Wei, Zheng; Zhang, Wei; Fang, Huan; Li, Yanda; Wang, Xiaowo
2018-03-07
ATAC-seq is rapidly emerging as one of the major experimental approaches to probe chromatin accessibility genome-wide. Here, we present "esATAC", a highly integrated, easy-to-use R/Bioconductor package for systematic ATAC-seq data analysis. It covers the essential steps of the full analysis procedure, including raw data processing, quality control, and downstream statistical analysis such as peak calling, enrichment analysis and transcription factor footprinting. esATAC supports one-command-line execution of preset pipelines, and provides flexible interfaces for building customized pipelines. The esATAC package is open source under the GPL-3.0 license. It is implemented in R and C++. Source code and binaries for Linux, Mac OS X and Windows are available through Bioconductor (https://www.bioconductor.org/packages/release/bioc/html/esATAC.html). Contact: xwwang@tsinghua.edu.cn. Supplementary data are available at Bioinformatics online.
Interface requirements to couple thermal-hydraulic codes to severe accident codes: ATHLET-CD
DOE Office of Scientific and Technical Information (OSTI.GOV)
Trambauer, K.
1997-07-01
The system code ATHLET-CD is being developed by GRS in cooperation with IKE and IPSN. Its field of application comprises the whole spectrum of leaks and large breaks, as well as operational and abnormal transients, for LWRs and VVERs. At present the analyses cover the in-vessel thermal-hydraulics, the early phases of core degradation, and the release of fission products and aerosols from the core and their transport in the reactor coolant system. The aim of the code development is to extend the simulation of core degradation up to failure of the reactor pressure vessel and to cover all physically reasonable accident sequences for western and eastern LWRs, including RBMKs. The ATHLET-CD structure is highly modular in order to include a manifold spectrum of models and to offer an optimum basis for further development. The code consists of four general modules describing the reactor coolant system thermal-hydraulics, the core degradation, the fission product release from the core, and the fission product and aerosol transport. Each general module consists of basic modules which correspond to the process to be simulated or to its specific purpose. Besides the code structure based on the physical modelling, the code follows four strictly separated steps during the course of a calculation: (1) input of structure, geometrical data, and initial and boundary conditions; (2) initialization of derived quantities; (3) steady-state calculation or input of restart data; and (4) transient calculation. In this paper, the transient solution method is briefly presented and the coupling methods are discussed. Three aspects have to be considered for the coupling of different modules in one code system. The first is the conservation of mass and energy in the different subsystems (fluid, structures, and fission products and aerosols). The second is the convergence of the numerical solution and the stability of the calculation. The third is code performance and running time.
Python Radiative Transfer Emission code (PyRaTE): non-LTE spectral lines simulations
NASA Astrophysics Data System (ADS)
Tritsis, A.; Yorke, H.; Tassis, K.
2018-05-01
We describe PyRaTE, a new, non-local thermodynamic equilibrium (non-LTE) line radiative transfer code developed specifically for post-processing astrochemical simulations. Population densities are estimated using the escape probability method. When computing the escape probability, the optical depth is calculated towards all directions, with density, molecular abundance, temperature and velocity variations all taken into account. A very easy-to-use interface, capable of importing data from simulation outputs produced by all major astrophysical codes, is also developed. The code is written in PYTHON using an "embarrassingly parallel" strategy and can handle all geometries and projection angles. We benchmark the code by comparing our results with those from RADEX (van der Tak et al. 2007) and against analytical solutions, and present case studies using hydrochemical simulations. The code will be released for public use.
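The escape-probability step can be illustrated with a minimal sketch. This is not PyRaTE's actual implementation; the function names are hypothetical, and only the standard one-zone form beta(tau) = (1 - e^(-tau)) / tau, averaged over the optical depths computed along many directions, is assumed.

```python
import numpy as np

def escape_probability(tau):
    """One-zone escape probability beta(tau) = (1 - exp(-tau)) / tau.
    beta -> 1 in the optically thin limit (tau -> 0) and
    beta -> 1/tau in the optically thick limit."""
    tau = np.atleast_1d(np.asarray(tau, dtype=float))
    beta = np.ones_like(tau)            # thin-limit value
    thick = tau > 1e-8
    beta[thick] = (1.0 - np.exp(-tau[thick])) / tau[thick]
    return beta

def directional_average(tau_rays):
    """Average the escape probability over the optical depths found
    along different directions (density/velocity variations enter
    through the per-ray tau values)."""
    return float(np.mean(escape_probability(np.asarray(tau_rays))))
```

For example, `directional_average([0.1, 1.0, 10.0])` mixes thin and thick rays into a single mean escape probability used in the statistical-equilibrium update.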
1975-02-01
[Extraction residue from a distribution-limitation notice and report front matter (AD B013811; approved for public release, distribution unlimited). Recoverable fragments concern image coding: effects of changing sampling frequency and bits/sample, image coding methods, a basic dual-mode coder code assignment, and a transmission threshold corresponding to a finite change on the gray scale.]
Joint Experimentation on Scalable Parallel Processors (JESPP)
2006-04-01
Approved for public release; distribution unlimited. The Joint Experimentation Directorate (J9) required expansion of its joint semi-automated forces (JSAF) code capabilities, including number of entities and behavior complexity. The work made use of local embedded relational databases, implemented using SQLite on each node of an SPP, to execute queries and return results via an ad hoc ...
Mobile Tracking and Location Awareness in Disaster Relief and Humanitarian Assistance Situations
2012-09-01
Approved for public release; distribution is unlimited. ... establishing mobile ad hoc networks. Smartphones also have accelerometers that are used to detect any motion by the device. Furthermore, almost every ... Subject terms include ... Picture, Situational Awareness.
Study of Software Tools to Support Systems Engineering Management
2015-06-01
Determining Market Categorization of United States Zip Codes for Purposes of Army Recruiting
2016-06-01
Naval Postgraduate School, Monterey, California; Master's thesis. Approved for public release; distribution is unlimited. The Army uses commercial market segmentation data to analyze markets and past accessions to assign recruiters and quotas to maximize production. We use ...
S2LET: A code to perform fast wavelet analysis on the sphere
NASA Astrophysics Data System (ADS)
Leistedt, B.; McEwen, J. D.; Vandergheynst, P.; Wiaux, Y.
2013-10-01
We describe S2LET, a fast and robust implementation of the scale-discretised wavelet transform on the sphere. Wavelets are constructed through a tiling of the harmonic line and can be used to probe spatially localised, scale-dependent features of signals on the sphere. The reconstruction of a signal from its wavelet coefficients is made exact here through the use of a sampling theorem on the sphere. Moreover, a multiresolution algorithm is presented to capture all information of each wavelet scale in the minimal number of samples on the sphere. In addition, S2LET supports the HEALPix pixelisation scheme, in which case the transform is not exact but nevertheless achieves good numerical accuracy. The core routines of S2LET are written in C and have interfaces in Matlab, IDL and Java. Real signals can be written to and read from FITS files and plotted as Mollweide projections. The S2LET code is made publicly available, is extensively documented, and ships with several examples in the four languages supported. At present the code is restricted to axisymmetric wavelets but will be extended to directional, steerable wavelets in a future release.
NASA Astrophysics Data System (ADS)
Coindreau, O.; Duriez, C.; Ederli, S.
2010-10-01
Progress in the treatment of air oxidation of zirconium in severe accident (SA) codes is required for a reliable analysis of severe accidents involving air ingress. Air oxidation of zirconium can lead to accelerated core degradation and increased fission product release, especially for the highly radiotoxic ruthenium. This paper presents a model to simulate air oxidation kinetics of Zircaloy-4 in the 600-1000 °C temperature range. It is based on available experimental data, including separate-effect experiments performed at IRSN and at Forschungszentrum Karlsruhe. The kinetic transition, named "breakaway", from a diffusion-controlled regime to an accelerated oxidation is taken into account in the modelling via a critical mass gain parameter. The progressive propagation of the locally initiated breakaway is modelled by a linear increase in oxidation rate with time. Finally, when breakaway propagation is complete, the oxidation rate stabilizes and the kinetics is modelled by a linear law. This new modelling is integrated in the severe accident code ASTEC, jointly developed by IRSN and GRS. Model predictions and experimental data from thermogravimetric results show good agreement for different air flow rates and for slow temperature transient conditions.
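The three-regime kinetics described above (parabolic growth, a breakaway rate ramp triggered by a critical mass gain, then a constant linear rate) can be sketched with a toy time integration. All parameter values and names here are illustrative assumptions, not the fitted constants of the ASTEC model.

```python
import numpy as np

def mass_gain_history(kp=1.0e-4, m0=0.01, m_crit=0.05, ramp=5.0e-6,
                      r_final=2.0e-3, dt=1.0, t_end=600.0):
    """Sketch of the breakaway oxidation kinetics (arbitrary units):
    (1) parabolic, diffusion-controlled regime dm/dt = kp / (2 m);
    (2) once the mass gain m exceeds m_crit (breakaway initiation),
        the rate increases linearly in time as breakaway propagates;
    (3) the rate saturates at r_final, i.e. linear kinetics."""
    m, t, t_break = m0, 0.0, None
    ts, ms = [0.0], [m]
    while t < t_end:
        if t_break is None and m >= m_crit:
            t_break = t                              # breakaway detected
        if t_break is None:
            rate = kp / (2.0 * m)                    # parabolic regime
        else:                                        # ramp, then plateau
            rate = min(kp / (2.0 * m_crit) + ramp * (t - t_break), r_final)
        m += rate * dt                               # explicit Euler step
        t += dt
        ts.append(t)
        ms.append(m)
    return np.array(ts), np.array(ms)
```

The resulting mass-gain curve shows the characteristic change of slope at breakaway and a constant final rate, mirroring the qualitative shape of the thermogravimetric data the abstract refers to.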
Possible etiologies of increased incidence of gastroschisis.
Souther, Christina; Puapong, Devin P; Woo, Russell; Johnson, Sidney M
2017-11-01
Gastroschisis incidence has increased over the past decade nationally and in Hawaii. Pesticides have been implicated as potential causative factors for gastroschisis, and use of restricted use pesticides (RUPs) is widespread in Hawaii. This study was conducted to characterize gastroschisis cases in Hawaii and determine whether RUP application correlates with gastroschisis incidence. Gastroschisis patients treated in Hawaii between September, 2008 and August, 2015 were mapped by zip code along with RUP use. Spatial analysis software was used to identify patients' homes located within the pesticide application zone and agricultural land use areas. 71 gastroschisis cases were identified. 2.8% of patients were from Kauai, 64.8% from Oahu, 16.9% from Hawaii, 14.1% from Maui, and 1.4% from Molokai. RUPs have been used on all of these islands. 78.9% of patients lived in zip codes overlapping agricultural land use areas. 85.9% of patients shared zip codes with RUP-use areas. The majority of gastroschisis patients were from RUP-use areas, supporting the idea that pesticides may contribute to the development of gastroschisis, although limited data on specific releases make it difficult to apply these findings. As more RUP-use data become available to the public, these important research questions can be investigated further.
Interpretation and modelling of fission product Ba and Mo releases from fuel
NASA Astrophysics Data System (ADS)
Brillant, G.
2010-02-01
The release mechanisms of two fission products (namely barium and molybdenum) in severe accident conditions are studied using the VERCORS experimental observations. Barium is observed to be mostly released under reducing conditions, while molybdenum release is mostly observed under oxidizing conditions. In addition, the volatility of some precipitates in fuel is evaluated by thermodynamic equilibrium calculations. The polymeric species (MoO3)n are calculated to contribute largely to the molybdenum partial pressure, and barium volatility is greatly enhanced if the gas atmosphere is reducing. Analytical models of fission product release from fuel are proposed for barium and molybdenum. Finally, these models have been integrated in the ASTEC/ELSA code and validation calculations have been performed on several experimental tests.
Tobacco use in popular movies during the past decade.
Mekemson, C; Glik, D; Titus, K; Myerson, A; Shaivitz, A; Ang, A; Mitchell, S
2004-12-01
The top 50 commercially successful films released per year from 1991 to 2000 were content coded to assess trends in tobacco use over time and attributes of films predictive of higher smoking rates. This observational study used media content analysis methods to generate data about tobacco use depictions in the films studied (n = 497). Films are the basic unit of analysis. Once films were coded and preliminary analysis completed, outcome data were transformed to approximate multivariate normality before being analysed with general linear models and longitudinal mixed-model regression methods. Tobacco use per minute of film was the main outcome measure. Predictor variables include attributes of films and actors. Tobacco use was defined as any cigarette, cigar, and chewing tobacco use, as well as the display of smoke and cigarette paraphernalia such as ashtrays, brand names, or logos within frames of the films reviewed. Smoking rates in the top films fluctuated yearly over the decade with an overall modest downward trend (p < 0.005), with the exception of R-rated films, where rates went up. The decrease in smoking rates found in films in the past decade is modest given extensive efforts to educate the entertainment industry on this issue over the past decade. Monitoring, education, advocacy, and policy change to bring tobacco depiction rates down further should continue.
Tycho 2: A Proxy Application for Kinetic Transport Sweeps
DOE Office of Scientific and Technical Information (OSTI.GOV)
Garrett, Charles Kristopher; Warsa, James S.
2016-09-14
Tycho 2 is a proxy application that implements discrete ordinates (SN) kinetic transport sweeps on unstructured, 3D, tetrahedral meshes. It has been designed to be small and require minimal dependencies to make collaboration and experimentation as easy as possible. Tycho 2 has been released as open source software. The software is currently in a beta release with plans for a stable release (version 1.0) before the end of the year. The code is parallelized via MPI across spatial cells and OpenMP across angles. Currently, several parallelization algorithms are implemented.
FY17Q4 Ristra project: Release Version 1.0 of a production toolkit
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hungerford, Aimee L.; Daniel, David John
2017-09-21
The Next Generation Code project will release Version 1.0 of a production toolkit for multi-physics application development on advanced architectures. Features of this toolkit will include remap and link utilities, control and state manager, setup, visualization and I/O, as well as support for a variety of mesh and particle data representations. Numerical physics packages that operate atop this foundational toolkit will be employed in a multi-physics demonstration problem and released to the community along with results from the demonstration.
In-Orbit Collision Analysis for VEGA Second Flight
NASA Astrophysics Data System (ADS)
Volpi, M.; Fossati, T.; Battie, F.
2013-08-01
ELV, as prime contractor of the VEGA launcher, which operates in the protected LEO zone (up to 2000 km altitude), has to demonstrate that it abides by ESA debris mitigation rules, as well as by those imposed by the French Law on Space Operations (LOS). After the full success of the VEGA qualification flight, the second flight (VV02) will extend the qualification domain of the launcher to multi-payload missions, with the release of two satellites (Proba-V and VNREDSat-1) and one CubeSat (ESTCube-1) on different SSO orbits. The multi-payload adapter, VESPA, also separates its upper part before the second payload release. This paper presents the results of the long-term analyses of in-orbit collision between these different bodies. The typical propagation duration requested by the ELV customer is around 50 orbits, requiring a state-of-the-art simulator able to efficiently compute orbital perturbations that are usually neglected in launcher trajectory optimization itself. To address the issue of in-orbit collision, ELV has therefore developed its own simulator, POLPO [1], a FORTRAN code which performs the long-term propagation of the released objects' trajectories and computes the mutual distance between them. The first part of the paper introduces the simulator itself, explaining the computation method chosen and briefly discussing the perturbing effects and the models taken into account in the tool, namely: gravity field modelling (zonal and tesseral harmonics), atmospheric model, solar pressure, and third-body interaction. A second part describes the application of the in-orbit collision analysis to the second flight mission. The main characteristics of the second flight are introduced, as well as the dispersions considered for the Monte Carlo analysis performed. The results of the long-term collision analysis between all the separated bodies are then presented and discussed.
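The core of such a long-term analysis (propagating released objects and tracking their mutual distance) can be sketched with a minimal two-body RK4 propagator. This is a sketch under point-mass gravity only; POLPO's zonal/tesseral harmonics, drag, solar pressure and third-body terms are omitted, and all names and values are illustrative, not POLPO's.

```python
import numpy as np

MU = 398600.4418  # Earth's gravitational parameter, km^3/s^2

def two_body_accel(r):
    """Point-mass gravitational acceleration (perturbations omitted)."""
    return -MU * r / np.linalg.norm(r) ** 3

def rk4_step(state, dt):
    """Classical RK4 step for state = [x, y, z, vx, vy, vz] (km, km/s)."""
    def f(s):
        return np.concatenate([s[3:], two_body_accel(s[:3])])
    k1 = f(state)
    k2 = f(state + 0.5 * dt * k1)
    k3 = f(state + 0.5 * dt * k2)
    k4 = f(state + dt * k3)
    return state + dt / 6.0 * (k1 + 2 * k2 + 2 * k3 + k4)

def min_mutual_distance(s1, s2, dt=10.0, steps=600):
    """Propagate two released objects and track their closest approach (km)."""
    dmin = np.linalg.norm(s1[:3] - s2[:3])
    for _ in range(steps):
        s1, s2 = rk4_step(s1, dt), rk4_step(s2, dt)
        dmin = min(dmin, np.linalg.norm(s1[:3] - s2[:3]))
    return dmin
```

A real tool would extend `two_body_accel` with the perturbation models listed in the abstract and propagate for the requested ~50 orbits, flagging any close approach below a safety threshold.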
DOE Office of Scientific and Technical Information (OSTI.GOV)
Herman, M.; Members of the Cross Sections Evaluation Working Group
2009-06-01
In December 2006, the Cross Section Evaluation Working Group (CSEWG) of the United States released the new ENDF/B-VII.0 library. This represented a considerable achievement, as it was the first major release since 1990, when ENDF/B-VI was made publicly available. The two libraries have been released in the same format, ENDF-6, which was originally developed for the ENDF/B-VI library. In the early stage of work on the seventh generation of the library, CSEWG made the important decision to retain the same format. This decision was adopted even though it was argued that it would be timely to modernize the formats, and several interesting ideas were proposed. After careful deliberation, CSEWG concluded that actual implementation would require the considerable resources needed to modify processing codes and to guarantee the high quality of the files processed by these codes. In view of this, the idea of format modernization was postponed and the ENDF-6 format was adopted for the new ENDF/B-VII library. In several other areas related to ENDF we did our best to move beyond established tradition and achieve maximum modernization. Thus, the "Big Paper" on ENDF/B-VII.0 was published, also in December 2006, as a Special Issue of Nuclear Data Sheets 107 (2006) 2931-3060. The new web retrieval and plotting system for ENDF-6 formatted data, Sigma, was developed by the NNDC and released in 2007. An extensive paper was published on the advanced tool for nuclear reaction data evaluation, EMPIRE, in 2007. This effort was complemented by the release of an updated set of ENDF checking codes in 2009. As the final item on this list, a major revision of the ENDF-6 Formats Manual was made. This work started in 2006 and came to fruition in 2009, as documented in the present report.
Konikow, Leonard F.; Sanford, W.E.; Campbell, P.J.
1997-01-01
In a solute-transport model, if a constant-concentration boundary condition is applied at a node in an active flow field, a solute flux can occur by both advective and dispersive processes. The potential for advective release is demonstrated by reexamining the Hydrologic Code Intercomparison (HYDROCOIN) project case 5 problem, which represents a salt dome overlain by a shallow groundwater system. The resulting flow field includes significant salinity and fluid density variations. Several independent teams simulated this problem using finite difference or finite element numerical models. We applied a method-of-characteristics model (MOCDENSE). The previous numerical implementations by HYDROCOIN teams of a constant-concentration boundary to represent salt release by lateral dispersion only (as stipulated in the original problem definition) were flawed, because this boundary condition allows the release of salt into the flow field by both dispersion and advection. When the constant-concentration boundary is modified to allow salt release by dispersion only, significantly less salt is released into the flow field. The calculated brine distribution for case 5 depends very little on which numerical model is used, as long as the selected model is solving the proper equations. Instead, the accuracy of the solution depends strongly on the proper conceptualization of the problem, including the detailed design of the constant-concentration boundary condition. The importance of, and sensitivity to, the manner of specification of this boundary does not appear to have been recognized previously in the analysis of this problem.
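The distinction between advective and dispersive release at a constant-concentration node can be made concrete with a one-line flux balance. This is a hypothetical illustration (names and unit choices are assumptions, not MOCDENSE code): the total solute mass flux is the advective part q·c plus the Fickian dispersive part -D·dc/dx, and the corrected boundary condition simply suppresses the advective term.

```python
def boundary_solute_flux(q, c_boundary, D, dc_dx, dispersive_only=False):
    """Solute mass flux at a constant-concentration boundary node.
    q        : Darcy flux through the node (e.g. m/s)
    c_boundary : fixed concentration at the node (e.g. kg/m^3)
    D        : dispersion coefficient (e.g. m^2/s)
    dc_dx    : concentration gradient into the flow field
    dispersive_only=True reproduces the corrected, dispersion-only
    salt-release condition discussed in the abstract."""
    advective = 0.0 if dispersive_only else q * c_boundary
    dispersive = -D * dc_dx
    return advective + dispersive
```

With a positive flux through a high-concentration node, the dispersion-only flux is strictly smaller than the combined flux, which is exactly why the original boundary implementation over-released salt.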
Recent plant studies using Victoria 2.0
DOE Office of Scientific and Technical Information (OSTI.GOV)
BIXLER,NATHAN E.; GASSER,RONALD D.
2000-03-08
VICTORIA 2.0 is a mechanistic computer code designed to analyze fission product behavior within the reactor coolant system (RCS) during a severe nuclear reactor accident. It provides detailed predictions of the release of radioactive and nonradioactive materials from the reactor core and the transport and deposition of these materials within the RCS and secondary circuits. These predictions account for the chemical and aerosol processes that affect radionuclide behavior. VICTORIA 2.0 was released in early 1999; a new version, VICTORIA 2.1, is now under development. The largest improvements in VICTORIA 2.1 are connected with the thermochemical database, which is being revised and expanded following the recommendations of a peer review. Three risk-significant severe accident sequences have recently been investigated using the VICTORIA 2.0 code. The focus here is on how various chemistry options affect the predictions. Additionally, the VICTORIA predictions are compared with ones made using the MELCOR code. The three sequences are a station blackout in a GE BWR and steam generator tube rupture (SGTR) and pump-seal LOCA sequences in a 3-loop Westinghouse PWR. These sequences cover a range of system pressures, from fully depressurized to full system pressure. The chief results of this study are the fission product fractions that are retained in the core, RCS, secondary, and containment, and the fractions that are released into the environment.
Service Wear Test Evaluation of Structural/Proximity Firefighters Gloves
1991-06-05
Navy Clothing and Textile Research Facility, Natick, Massachusetts; Technical Report No. NCTRF 188. Approved for public release; distribution unlimited. The Navy Clothing and Textile Research Facility (NCTRF ...
Demonstration of a Near and Mid-Infrared Detector Using Multiple Step Quantum Wells
2003-09-01
Naval Postgraduate School thesis, Monterey, California. Approved for public release; distribution is unlimited. In this ...
Decay heat uncertainty for BWR used fuel due to modeling and nuclear data uncertainties
Ilas, Germina; Liljenfeldt, Henrik
2017-05-19
Characterization of the energy released from radionuclide decay in nuclear fuel discharged from reactors is essential for the design, safety, and licensing analyses of used nuclear fuel storage, transportation, and repository systems. There are a limited number of decay heat measurements available for commercial used fuel applications. Because decay heat measurements can be expensive or impractical for covering the multitude of existing fuel designs, operating conditions, and specific application purposes, decay heat estimation relies heavily on computer code prediction. Uncertainty evaluation for calculated decay heat is an important aspect when assessing code prediction and a key factor supporting decision making for used fuel applications. While previous studies have largely focused on uncertainties in code predictions due to nuclear data uncertainties, this study discusses uncertainties in calculated decay heat due to uncertainties in assembly modeling parameters as well as in nuclear data. Capabilities in the SCALE nuclear analysis code system were used to quantify the effect on calculated decay heat of uncertainties in nuclear data and selected manufacturing and operation parameters for a typical boiling water reactor (BWR) fuel assembly. Furthermore, the BWR fuel assembly used as the reference case for this study was selected from a set of assemblies for which high-quality decay heat measurements are available, to assess the significance of the results through comparison with calculated and measured decay heat data.
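The summation principle behind code-predicted decay heat, and how a data uncertainty propagates into it, can be sketched in a few lines. This is a toy illustration with a hypothetical two-nuclide inventory, unrelated to SCALE or any evaluated library: P(t) = Σ λ_i N_i(t) Q_i, with a simple Monte Carlo sampling of the decay energies Q_i.

```python
import numpy as np

def decay_heat(t, lam, n0, q):
    """Summation decay heat P(t) = sum_i lambda_i * N_i(t) * Q_i,
    with N_i(t) = N0_i * exp(-lambda_i * t). Result is in watts
    if lambda is in 1/s and Q in joules."""
    lam, n0, q = (np.asarray(a, dtype=float) for a in (lam, n0, q))
    return float(np.sum(lam * n0 * np.exp(-lam * t) * q))

def decay_heat_uncertainty(t, lam, n0, q, rel_sigma_q=0.05,
                           n_samples=2000, seed=0):
    """Toy Monte Carlo propagation of decay-energy (Q) uncertainties
    to the calculated decay heat: perturb each Q_i independently by a
    relative Gaussian factor and collect the spread of P(t)."""
    rng = np.random.default_rng(seed)
    q = np.asarray(q, dtype=float)
    samples = [decay_heat(t, lam, n0,
                          q * rng.normal(1.0, rel_sigma_q, q.size))
               for _ in range(n_samples)]
    return float(np.mean(samples)), float(np.std(samples))
```

A study like the one in the abstract additionally perturbs modeling parameters (enrichment, burnup, moderator density, etc.), but the propagation pattern is the same: sample inputs, recompute, and report the spread of the calculated decay heat.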
Analysis of the influence of the heat transfer phenomena on the late phase of the ThAI Iod-12 test
NASA Astrophysics Data System (ADS)
Gonfiotti, B.; Paci, S.
2014-11-01
Iodine is one of the major contributors to the source term during a severe accident in a nuclear power plant, owing to its volatility and high radiological consequences. Therefore, large efforts have been made to describe iodine behaviour during an accident, especially in the containment system. Due to the lack of experimental data, many attempts have been made in recent years to fill the gaps in the knowledge of iodine behaviour. In this framework, two tests (ThAI Iod-11 and Iod-12) were carried out inside a multi-compartment steel vessel. A quite complex transient characterizes these two tests; they are therefore also suitable for thermal-hydraulic benchmarks. The two tests were originally released for a benchmark exercise during the SARNET2 EU project. At the end of this benchmark a report covering the main findings was issued, stating that the codes commonly employed in severe accident studies were able to simulate the tests, but with large discrepancies. The present work applies the new versions of the ASTEC and MELCOR codes with the aim of carrying out a new code-to-code comparison against the ThAI Iod-12 experimental data, focusing on the influence of the heat exchanges with the outer environment, which appears to be one of the most challenging issues to cope with.
An assessment of the CORCON-MOD3 code. Part 1: Thermal-hydraulic calculations
DOE Office of Scientific and Technical Information (OSTI.GOV)
Strizhov, V.; Kanukova, V.; Vinogradova, T.
1996-09-01
This report deals with the subject of CORCON-Mod3 code validation (thermal-hydraulic modeling capability only) based on MCCI (molten core concrete interaction) experiments conducted under different programs in the past decade. Thermal-hydraulic calculations (i.e., concrete ablation, melt temperature, melt energy, concrete temperature, and condensible and non-condensible gas generation) were performed with the code and compared with the data from 15 experiments, conducted at different scales using both simulant (metallic and oxidic) and prototypic melt materials, using different concrete types, and with and without an overlying water pool. Sensitivity studies were performed in a few cases involving, for example, heat transfer from melt to concrete, condensed phase chemistry, etc. Further, a special analysis was performed using the ACE L8 experimental data to illustrate the differences between the experimental and the reactor conditions, and to demonstrate that with proper corrections made to the code, the calculated results were in better agreement with the experimental data. Generally, in the case of dry cavity and metallic melts, CORCON-Mod3 thermal-hydraulic calculations were in good agreement with the test data. For oxidic melts in a dry cavity, uncertainties in heat transfer models played an important role for two melt configurations: a stratified geometry with segregated metal and oxide layers, and a heterogeneous mixture. Some discrepancies in the gas release data were noted in a few cases.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zhao, Haihua; Zhang, Hongbin; Zou, Ling
2014-10-01
The RELAP-7 code is the next-generation nuclear reactor system safety analysis code being developed at the Idaho National Laboratory (INL). The RELAP-7 code development effort started in October 2011, and by the end of the second development year a number of physical components with simplified two-phase flow capability had been developed to support simplified boiling water reactor (BWR) extended station blackout (SBO) analyses. The demonstration case includes the major components of the primary system of a BWR, as well as the safety system components for the safety relief valves (SRVs), the reactor core isolation cooling (RCIC) system, and the wet well. Three scenarios for the SBO simulations have been considered. Since RELAP-7 is not a severe accident analysis code, the simulation stops when the fuel clad temperature reaches the damage point. Scenario I represents an extreme station blackout accident without any external cooling or cooling water injection. The system pressure is controlled by automatically releasing steam through SRVs. Scenario II includes the RCIC system but no SRV. The RCIC system is fully coupled with the reactor primary system, and all the major components are dynamically simulated. The third scenario includes both the RCIC system and the SRV to provide a more realistic simulation. This paper describes the major models and discusses the results for the three scenarios. The RELAP-7 simulations for the three simplified SBO scenarios show the importance of dynamically simulating the SRVs, the RCIC system, and the wet well system for reactor safety during extended SBO accidents.
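The automatic SRV pressure control described for Scenario I can be sketched as a simple hysteresis (bang-bang) valve driving a lumped pressure model. Setpoints, rates and names below are illustrative assumptions, not RELAP-7 models or plant data.

```python
def srv_is_open(pressure, was_open, p_open=7.6e6, p_close=7.2e6):
    """Hysteresis logic of a safety relief valve: opens when system
    pressure (Pa) exceeds p_open, recloses only below p_close;
    inside the deadband the previous state is kept."""
    if pressure >= p_open:
        return True
    if pressure <= p_close:
        return False
    return was_open

def simulate_pressure(p0=7.0e6, heat_rate=2.0e3, relief_rate=6.0e3,
                      dt=1.0, t_end=3600.0):
    """Toy lumped model: decay heat raises pressure at heat_rate (Pa/s);
    an open SRV relieves it at relief_rate (Pa/s). Returns the
    pressure history over t_end seconds."""
    p, is_open, hist = p0, False, []
    for _ in range(int(t_end / dt)):
        is_open = srv_is_open(p, is_open)
        p += (heat_rate - (relief_rate if is_open else 0.0)) * dt
        hist.append(p)
    return hist
```

The resulting sawtooth pressure trace (repeated SRV open/close cycles between the two setpoints) is the qualitative behaviour such an extended-SBO simulation must capture dynamically.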
JEWEL 2.0.0: directions for use
NASA Astrophysics Data System (ADS)
Zapp, Korinna
2014-02-01
In this publication the first official release of the Jewel 2.0.0 code is presented [the first version, Jewel 1 (Zapp et al. in Eur Phys J C 60:617, 2009), could only treat elastic scattering explicitly, and that code was never published; the code can be downloaded from the official Jewel homepage http://jewel.hepforge.org]. Jewel is a Monte Carlo event generator simulating QCD jet evolution in heavy-ion collisions. It treats the interplay of QCD radiation and re-scattering in a medium with fully microscopic dynamics in a consistent perturbative framework with minimal assumptions. After a qualitative introduction to the physics of Jewel, detailed information about the practical aspects of using the code is given. The code is available from the official Jewel homepage http://jewel.hepforge.org.
The EMEP MSC-W chemical transport model - technical description
NASA Astrophysics Data System (ADS)
Simpson, D.; Benedictow, A.; Berge, H.; Bergström, R.; Emberson, L. D.; Fagerli, H.; Flechard, C. R.; Hayman, G. D.; Gauss, M.; Jonson, J. E.; Jenkin, M. E.; Nyíri, A.; Richter, C.; Semeena, V. S.; Tsyro, S.; Tuovinen, J.-P.; Valdebenito, Á.; Wind, P.
2012-08-01
The Meteorological Synthesizing Centre-West (MSC-W) of the European Monitoring and Evaluation Programme (EMEP) has been performing model calculations in support of the Convention on Long Range Transboundary Air Pollution (CLRTAP) for more than 30 years. The EMEP MSC-W chemical transport model is still one of the key tools within European air pollution policy assessments. Traditionally, the model has covered all of Europe with a resolution of about 50 km × 50 km, and extending vertically from ground level to the tropopause (100 hPa). The model has changed extensively over the last ten years, however, with flexible processing of chemical schemes, meteorological inputs, and with nesting capability: the code is now applied on scales ranging from local (ca. 5 km grid size) to global (with 1 degree resolution). The model is used to simulate photo-oxidants and both inorganic and organic aerosols. In 2008 the EMEP model was released for the first time as public domain code, along with all required input data for model runs for one year. The second release of the EMEP MSC-W model became available in mid 2011, and a new release is targeted for summer 2012. This publication is intended to document this third release of the EMEP MSC-W model. The model formulations are given, along with details of input data-sets which are used, and a brief background on some of the choices made in the formulation is presented. The model code itself is available at www.emep.int, along with the data required to run for a full year over Europe.
Optimum Boundaries of Signal-to-Noise Ratio for Adaptive Code Modulations
2017-11-14
1510–1521, Feb. 2015. [2]. Pursley, M. B. and Royster, T. C., "Adaptive-rate nonbinary LDPC coding for frequency-hop communications," IEEE...and this can cause a very narrowband noise near the center frequency during USRP signal acquisition and generation. This can cause a high BER...Final Report. APPROVED FOR PUBLIC RELEASE; DISTRIBUTION IS UNLIMITED. AIR FORCE RESEARCH LABORATORY, Space Vehicles Directorate, 3550 Aberdeen Ave
2015-09-30
02.003’N, 07°01.981’W) To be recovered in 2016 Ranging code #08D1; releasing code #0803 In collaboration with Rune Hansen of the University of...the animal with PTT 134760 was tracked moving all the way south to the Azores Archipelago. Figure courtesy of Rune Hansen. Objective 4. conduct
Overriding Ethical Constraints in Lethal Autonomous Systems
2012-01-01
absolve the guilt from the party that issued the order in the first place. During the Nuremberg trials it was not sufficient for a soldier to merely...with coded authorization by two separate individuals, ideally the operator and his immediate superior. The inverse situation, denying the system...potentially violating. Permission to override in case 2 requires a coded two-key release by two separate operators, each going through the override
Topics in the Sequential Design of Experiments
1992-03-01
decision, unless so designated by other documentation. 12a. DISTRIBUTION/AVAILABILITY STATEMENT 12b. DISTRIBUTION CODE Approved for public release...3 0 1992 D 14. SUBJECT TERMS 15. NUMBER OF PAGES 12 Design of Experiments, Renewal Theory, Sequential Testing 12. PRICE CODE Limit Theory, Local...distributions for one parameter exponential families," by Michael Woodroofe. Statistica Sinica, 2 (1991), 91-112. [6] "A nonlinear renewal theory for a functional of
Code of Federal Regulations, 2012 CFR
2012-07-01
... 40 Protection of Environment 29 2012-07-01 2012-07-01 false Form: Application for Reimbursement to... RESPONSE TO HAZARDOUS SUBSTANCE RELEASES Pt. 310, App. III Appendix III to Part 310—Form: Application for... ER18FE98.000 ER18FE98.001 Attachment 1 to Form 9310-1 Cost Element Codes and Comments [Cost Element Codes...
Code of Federal Regulations, 2014 CFR
2014-07-01
... 40 Protection of Environment 28 2014-07-01 2014-07-01 false Form: Application for Reimbursement to... RESPONSE TO HAZARDOUS SUBSTANCE RELEASES Pt. 310, App. III Appendix III to Part 310—Form: Application for... ER18FE98.000 ER18FE98.001 Attachment 1 to Form 9310-1 Cost Element Codes and Comments [Cost Element Codes...
Code of Federal Regulations, 2013 CFR
2013-07-01
... 40 Protection of Environment 29 2013-07-01 2013-07-01 false Form: Application for Reimbursement to... RESPONSE TO HAZARDOUS SUBSTANCE RELEASES Pt. 310, App. III Appendix III to Part 310—Form: Application for... ER18FE98.000 ER18FE98.001 Attachment 1 to Form 9310-1 Cost Element Codes and Comments [Cost Element Codes...
Asres, Yihunie Hibstie; Mathuthu, Manny; Birhane, Marelgn Derso
2018-04-22
This study provides current evidence about cross-section production processes in theoretical and experimental results for neutron-induced reactions of uranium isotopes over the projectile energy range of 1-100 MeV, in order to improve the reliability of nuclear simulation. In fission reactions of 235U within nuclear reactors, a large amount of energy is released, enough to help satisfy worldwide energy needs without the polluting processes associated with other sources. The main objective of this work is to convey knowledge of neutron-induced fission reactions on 235U by describing, analyzing, and interpreting the theoretical cross sections obtained from the computer code COMPLET and comparing them with experimental data from EXFOR. The cross-section values of 235U(n,2n)234U, 235U(n,3n)233U, 235U(n,γ)236U, and 235U(n,f) were obtained using COMPLET, and the corresponding experimental values were retrieved from the EXFOR database (IAEA). COMPLET was run with the same set of input parameters for all cases, and the graphs were plotted with spreadsheet and Origin-8 software. Quantification of the uncertainties stemming from both the experimental data and the code calculations plays a significant role in the final evaluated results. The calculated total cross sections were compared with the experimental data from EXFOR, and good agreement was found between the experimental and theoretical values. This comparison is analyzed and interpreted with tables and graphical descriptions, and the results are briefly discussed in the text of this work. Copyright © 2018 The Authors. Published by Elsevier Ltd. All rights reserved.
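The agreement check described above, comparing COMPLET cross sections with EXFOR data, reduces to computing relative deviations point by point. A minimal sketch in Python; the (calculated, experimental) pairs below are illustrative placeholders, not actual COMPLET or EXFOR values:

```python
# Hypothetical (sigma_calc, sigma_exp) cross-section pairs in barns;
# illustrative only -- not actual COMPLET output or EXFOR data.
pairs = [(1.20, 1.25), (1.10, 1.05), (2.05, 2.00)]

def relative_deviation(calc, exp):
    """Relative deviation of a calculated cross section from experiment."""
    return (calc - exp) / exp

devs = [relative_deviation(c, e) for c, e in pairs]
# A simple summary statistic for overall theory/experiment agreement:
mean_abs_dev = sum(abs(d) for d in devs) / len(devs)
```

A small mean absolute deviation across the energy grid is one way to quantify the "good agreement" the abstract reports.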
DOE Office of Scientific and Technical Information (OSTI.GOV)
Summers, R.M.; Cole, R.K. Jr.; Smith, R.C.
1995-03-01
MELCOR is a fully integrated, engineering-level computer code that models the progression of severe accidents in light water reactor nuclear power plants. MELCOR is being developed at Sandia National Laboratories for the U.S. Nuclear Regulatory Commission as a second-generation plant risk assessment tool and the successor to the Source Term Code Package. A broad spectrum of severe accident phenomena in both boiling and pressurized water reactors is treated in MELCOR in a unified framework. These include: thermal-hydraulic response in the reactor coolant system, reactor cavity, containment, and confinement buildings; core heatup, degradation, and relocation; core-concrete attack; hydrogen production, transport, and combustion; fission product release and transport; and the impact of engineered safety features on thermal-hydraulic and radionuclide behavior. Current uses of MELCOR include estimation of severe accident source terms and their sensitivities and uncertainties in a variety of applications. This publication of the MELCOR computer code manuals corresponds to MELCOR 1.8.3, released to users in August 1994. Volume 1 contains a primer that describes MELCOR's phenomenological scope, organization (by package), and documentation. The remainder of Volume 1 contains the MELCOR Users Guides, which provide the input instructions and guidelines for each package. Volume 2 contains the MELCOR Reference Manuals, which describe the phenomenological models that have been implemented in each package.
Gene-Auto: Automatic Software Code Generation for Real-Time Embedded Systems
NASA Astrophysics Data System (ADS)
Rugina, A.-E.; Thomas, D.; Olive, X.; Veran, G.
2008-08-01
This paper gives an overview of the Gene-Auto ITEA European project, which aims at building a qualified C code generator from mathematical models under Matlab-Simulink and Scilab-Scicos. The project is driven by major European industry partners active in the real-time embedded systems domains. The Gene-Auto code generator will significantly improve the current development processes in such domains by shortening the time to market and by guaranteeing the quality of the generated code through the use of formal methods. The first version of the Gene-Auto code generator has already been released and has gone through a validation phase on real-life case studies defined by each project partner. The validation results are taken into account in the implementation of the second version of the code generator. The partners aim at introducing the Gene-Auto results into industrial development by 2010.
Analysis of xRAGE and flag high explosive burn models with PBX 9404 cylinder tests
DOE Office of Scientific and Technical Information (OSTI.GOV)
Harrier, Danielle; Andersen, Kyle Richard
High explosives are energetic materials that release their chemical energy in a short interval of time. They are able to generate extreme heat and pressure by a shock-driven chemical decomposition reaction, which makes them valuable tools that must be understood. This study investigated the accuracy and performance of two Los Alamos National Laboratory hydrodynamic codes, which are used to determine the behavior of explosives within a variety of systems: xRAGE, which utilizes an Eulerian mesh, and FLAG, which utilizes a Lagrangian mesh. Various programmed and reactive burn models within both codes were tested using a copper cylinder expansion test. The test was based on a recent experimental setup which contained the plastic-bonded explosive PBX 9404. Detonation velocity versus time curves for this explosive were obtained using Photon Doppler Velocimetry (PDV). The modeled results from each of the burn models tested were then compared to one another and to the experimental results, validating the burn models against the cylinder-test data.
NASA Astrophysics Data System (ADS)
Martin, Rodger; Ghoniem, Nasr M.
1986-11-01
A pin-type fusion reactor blanket is designed using γ-LiAlO2 solid tritium breeder. Tritium transport and diffusive inventory are modeled using the DIFFUSE code. Two approaches are used to obtain characteristic LiAlO2 grain temperatures. DIFFUSE provides intragranular diffusive inventories which scale up to blanket size. These results compare well with a numerical analysis, giving a steady-state blanket tritium inventory of 13 g. Start-up transient inventories are modeled using DIFFUSE for both full and restricted coolant flow. Full flow gives rapid inventory buildup while restricted flow prevents this buildup. Inventories after shutdown are modeled: reduced cooling is found to have little effect on removing tritium, but preheating rapidly purges inventory. DIFFUSE provides parametric modeling of solid breeder density, radiation, and surface effects. Fully dense (100%) pins are found to give massive inventory and marginal tritium release. Only large trapping energies and concentrations significantly increase inventory. Diatomic surface recombination is only significant at high temperatures.
Fatigue Life Methodology for Tapered Hybrid Composite Flexbeams
NASA Technical Reports Server (NTRS)
Murri, Gretchen B.; Schaff, Jeffery R.
2006-01-01
Nonlinear-tapered flexbeam specimens from a full-size composite helicopter rotor hub flexbeam were tested under combined constant axial tension and cyclic bending loads. Two different graphite/glass hybrid configurations tested under cyclic loading failed by delamination in the tapered region. A 2-D finite element model was developed which closely approximated the flexbeam geometry, boundary conditions, and loading. The analysis results from two geometrically nonlinear finite element codes, ANSYS and ABAQUS, are presented and compared. Strain energy release rates (G) associated with simulated delamination growth in the flexbeams are presented from both codes. These results compare well with each other and suggest that the initial delamination growth from the tip of the ply-drop toward the thick region of the flexbeam is strongly mode II. The peak calculated G values were used with material characterization data to calculate fatigue life curves for comparison with test data. A curve relating maximum surface strain to number of loading cycles at delamination onset compared well with the test results.
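The final step described above, converting peak calculated G values and material characterization data into a fatigue-life curve, can be sketched as a power-law delamination-onset relation. The functional form and all numbers below are assumptions for illustration, not the paper's characterization data:

```python
def cycles_to_delamination_onset(g_max, g_c, m):
    """Power-law onset curve G_max = g_c * N**(-m), solved for N.
    g_max: peak strain energy release rate in the flexbeam (J/m^2)
    g_c:   single-cycle critical value from characterization tests (J/m^2)
    m:     empirical exponent fitted to the G-N characterization curve
    All parameter values used here are hypothetical."""
    return (g_c / g_max) ** (1.0 / m)

# Hypothetical load case: G_max = 60 J/m^2 against an assumed
# characterization curve with g_c = 300 J/m^2 and m = 0.12.
n_onset = cycles_to_delamination_onset(60.0, 300.0, 0.12)
```

Curves of this general shape are what allow a maximum-surface-strain-versus-cycles prediction to be compared with test data, as the abstract describes.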
The HIV Prison Paradox: Agency and HIV-Positive Women's Experiences in Jail and Prison in Alabama.
Sprague, Courtenay; Scanlon, Michael L; Radhakrishnan, Bharathi; Pantalone, David W
2017-08-01
Incarcerated women face significant barriers to achieve continuous HIV care. We employed a descriptive, exploratory design using qualitative methods and the theoretical construct of agency to investigate participants' self-reported experiences accessing HIV services in jail, in prison, and post-release in two Alabama cities. During January 2014, we conducted in-depth interviews with 25 formerly incarcerated HIV-positive women. Two researchers completed independent coding, producing preliminary codes from transcripts using content analysis. Themes were developed iteratively, verified, and refined. They encompassed (a) special rules for HIV-positive women: isolation, segregation, insults, food rationing, and forced disclosure; (b) absence of counseling following initial HIV diagnosis; and (c) HIV treatment impediments: delays, interruption, and denial. Participants deployed agentic strategies of accommodation, resistance, and care-seeking to navigate the social world of prison and HIV services. Findings illuminate the "HIV prison paradox": the chief opportunities that remain unexploited to engage and re-engage justice-involved women in the HIV care continuum.
Inter-comparison of Computer Codes for TRISO-based Fuel Micro-Modeling and Performance Assessment
DOE Office of Scientific and Technical Information (OSTI.GOV)
Brian Boer; Chang Keun Jo; Wen Wu
2010-10-01
The Next Generation Nuclear Plant (NGNP), the Deep Burn Pebble Bed Reactor (DB-PBR) and the Deep Burn Prismatic Block Reactor (DB-PMR) are all based on fuels that use TRISO particles as their fundamental constituent. The TRISO particle properties include very high durability in radiation environments, hence the designs' reliance on the TRISO to form the principal barrier to radioactive materials release. This durability forms the basis for the selection of this fuel type for applications such as Deep Burn (DB), which require exposures up to four times those expected for light water reactors. It follows that the study and prediction of the durability of TRISO particles must be carried out as part of the safety and overall performance characterization of all the designs mentioned above. Such evaluations have been carried out independently by the performers of the DB project using independently developed codes. These codes, PASTA, PISA and COPA, incorporate models for stress analysis on the various layers of the TRISO particle (and of the intervening matrix material for some of them); models for fission product release, migration, and accumulation within the SiC layer of the TRISO particle; models for free oxygen and CO formation and migration to the same location; models for the temperature field within the various layers of the TRISO particle; and models for the prediction of failure rates. All these models may be either internal to the code or external. This large number of models, the possibility of different constitutive data and model formulations, and the variety of solution techniques make it highly unlikely that the codes would give identical results in the modeling of identical situations. The purpose of this paper is to present the results of an inter-comparison between the codes and to identify areas of agreement and areas that need reconciliation.
The inter-comparison has been carried out by the cooperating institutions using a set of pre-defined TRISO conditions (burnup levels, temperature or power levels, etc.), and the outcome will be tabulated in the full-length paper. The areas of agreement will be pointed out and the areas that require further modeling or reconciliation will be shown. In general, the agreement between the codes is good, within less than one order of magnitude in the prediction of TRISO failure rates.
Integer sequence discovery from small graphs
Hoppe, Travis; Petrone, Anna
2015-01-01
We have exhaustively enumerated all simple, connected graphs of a finite order and have computed a selection of invariants over this set. Integer sequences were constructed from these invariants and checked against the Online Encyclopedia of Integer Sequences (OEIS). 141 new sequences were added and six sequences were extended. From the graph database, we were able to programmatically suggest relationships among the invariants. It will be shown that we can readily visualize any sequence of graphs meeting given criteria. The code has been released as an open-source framework for further analysis, and the database was constructed to be extensible to invariants not considered in this work. PMID:27034526
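The enumeration described above, generating all simple connected graphs of a given order up to isomorphism and deriving integer sequences from them, can be sketched by brute force for small orders. The canonicalization below (minimum edge list over all vertex relabelings) is a naive stand-in for the paper's actual machinery and is only practical for very small n:

```python
from itertools import combinations, permutations

def connected(n, edges):
    """True if the graph on vertices 0..n-1 with the given edges is connected."""
    adj = {i: set() for i in range(n)}
    for u, v in edges:
        adj[u].add(v)
        adj[v].add(u)
    seen, stack = {0}, [0]
    while stack:
        for w in adj[stack.pop()]:
            if w not in seen:
                seen.add(w)
                stack.append(w)
    return len(seen) == n

def canonical(n, edges):
    """Canonical form: lexicographically smallest edge list over all relabelings."""
    best = None
    for p in permutations(range(n)):
        key = tuple(sorted(tuple(sorted((p[u], p[v]))) for u, v in edges))
        if best is None or key < best:
            best = key
    return best

def count_connected(n):
    """Number of simple connected graphs of order n, up to isomorphism."""
    all_edges = list(combinations(range(n), 2))
    reps = set()
    for k in range(len(all_edges) + 1):
        for es in combinations(all_edges, k):
            if connected(n, es):
                reps.add(canonical(n, es))
    return len(reps)

counts = [count_connected(n) for n in range(1, 5)]  # → [1, 1, 2, 6]
```

The resulting counts 1, 1, 2, 6, ... are the start of OEIS A001349 (connected graphs on n unlabeled nodes), the kind of invariant-derived sequence the abstract describes checking against the OEIS.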
Planck 2015 results: V. LFI calibration
Ade, P. A. R.; Aghanim, N.; Ashdown, M.; ...
2016-09-20
In this paper, we present a description of the pipeline used to calibrate the Planck Low Frequency Instrument (LFI) timelines into thermodynamic temperatures for the Planck 2015 data release, covering four years of uninterrupted operations. As in the 2013 data release, our calibrator is provided by the spin-synchronous modulation of the cosmic microwave background dipole, but we now use the orbital component, rather than adopting the Wilkinson Microwave Anisotropy Probe (WMAP) solar dipole. This allows our 2015 LFI analysis to provide an independent Solar dipole estimate, which is in excellent agreement with that of HFI and within 1σ (0.3% in amplitude) of the WMAP value. This 0.3% shift in the peak-to-peak dipole temperature from WMAP and a general overhaul of the iterative calibration code increases the overall level of the LFI maps by 0.45% (30 GHz), 0.64% (44 GHz), and 0.82% (70 GHz) in temperature with respect to the 2013 Planck data release, thus reducing the discrepancy with the power spectrum measured by WMAP. We estimate that the LFI calibration uncertainty is now at the level of 0.20% for the 70 GHz map, 0.26% for the 44 GHz map, and 0.35% for the 30 GHz map. Finally, we provide a detailed description of the impact of all the changes implemented in the calibration since the previous data release.
Sediment pollution characteristics and in situ control in a deep drinking water reservoir.
Zhou, Zizhen; Huang, Tinglin; Li, Yang; Ma, Weixing; Zhou, Shilei; Long, Shenghai
2017-02-01
Sediment pollution characteristics, in situ sediment release potential, and in situ inhibition of sediment release were investigated in a drinking water reservoir. Results showed that organic carbon (OC), total nitrogen (TN), and total phosphorus (TP) in sediments increased from the reservoir mouth to the main reservoir. Fraction analysis indicated that nitrogen in ion exchangeable form and NaOH-extractable P (Fe/Al-P) accounted for 43% and 26% of TN and TP in sediments of the main reservoir. The Risk Assessment Code for metal elements showed that Fe and Mn posed high to very high risk. The results of the in situ reactor experiment in the main reservoir showed the same trends as those observed in the natural state of the reservoir in 2011 and 2012; the maximum concentrations of total OC, TN, TP, Fe, and Mn reached 4.42 mg/L, 3.33 mg/L, 0.22 mg/L, 2.56 mg/L, and 0.61 mg/L, respectively. An in situ sediment release inhibition technology, the water-lifting aerator, was utilized in the reservoir. The results of operating the water-lifting aerator indicated that sediment release was successfully inhibited and that OC, TN, TP, Fe, and Mn in surface sediment could be reduced by 13.25%, 15.23%, 14.10%, 5.32%, and 3.94%, respectively. Copyright © 2016. Published by Elsevier B.V.
Giménez, Estela; Sanz-Nebot, Victòria; Rizzi, Andreas
2013-09-01
Glycan reductive isotope labeling (GRIL) using [(12)C]- and [(13)C]-coded aniline was used for relative quantitation of N-glycans. In a first step, the labeling method by reductive amination was optimized for this reagent. It could be demonstrated that selecting aniline as limiting reactant and using the reductant in excess is critical for achieving high derivatization yields (over 95 %) and good reproducibility (relative standard deviations ∼1-5 % for major and ∼5-10 % for minor N-glycans). In a second step, zwitterionic-hydrophilic interaction liquid chromatography in capillary columns coupled to electrospray mass spectrometry with time-of-flight analyzer (μZIC-HILIC-ESI-TOF-MS) was applied for the analysis of labeled N-glycans released from intact glycoproteins. Ovalbumin, bovine α1-acid-glycoprotein and bovine fetuin were used as test glycoproteins to establish and evaluate the methodology. Excellent separation of isomeric N-glycans and reproducible quantitation via the extracted ion chromatograms indicate a great potential of the proposed methodology for glycoproteomic analysis and for reliable relative quantitation of glycosylation variants in biological samples.
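Relative quantitation via extracted ion chromatograms, as described above, amounts to taking light/heavy peak-area ratios per glycan for the [12C]/[13C]-aniline pair. A minimal sketch; the glycan names and peak areas are invented placeholders, not data from the study:

```python
# Hypothetical extracted-ion-chromatogram peak areas for three N-glycans,
# labeled with [12C]-aniline (sample A) and [13C]-aniline (sample B).
peak_areas = {
    "Man5":    (1.85e6, 2.05e6),
    "G0F":     (9.40e5, 4.62e5),
    "G2F+2SA": (3.10e5, 3.25e5),
}

def relative_quantitation(areas):
    """Light/heavy area ratio per glycan (abundance in A relative to B)."""
    return {g: light / heavy for g, (light, heavy) in areas.items()}

ratios = relative_quantitation(peak_areas)
```

A ratio near 1 indicates an unchanged glycoform between the two samples, while a ratio of ~2 (as for the hypothetical "G0F" entry) would flag a glycosylation variant enriched in sample A.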
Cost Metric Algorithms for Internetwork Applications
1989-04-01
5000. Released by M. B. Vineberg, Head; under authority of E. Jahn, Head, System Design and Battle Force and Theater Architecture Branch...for public release; distribution unlimited. 4. PERFORMING ORGANIZATION REPORT NUMBER(S) 5. MONITORING ORGANIZATION REPORT NUMBER(S) NOSC TR 1284 6a. NAME OF PERFORMING ORGANIZATION 6b. OFFICE SYMBOL 7a. NAME OF MONITORING ORGANIZATION Naval Ocean Systems Center Code 854 6c. ADDRESS (City, State, and ZIP Code)
Optimal Sensor Placement in Active Multistatic Sonar Networks
2014-06-01
As b → 0, the Fermi function approaches the cookie-cutter model. ¹Discovered in 1926 by Enrico Fermi and Paul Dirac when researching electron...Thesis Co-Advisors: Emily M. Craparo, Craig W. Rasmussen; Second Reader: Mümtaz Karataş. Approved for public release; distribution is unlimited. THIS PAGE...A. 12a. DISTRIBUTION / AVAILABILITY STATEMENT Approved for public release; distribution is unlimited 12b. DISTRIBUTION CODE 13. ABSTRACT (maximum 200
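The limit stated in the snippet, the Fermi detection function approaching the cookie-cutter (definite-range) model as b → 0, can be checked numerically. The parameterization P(r) = 1/(1 + exp((r − R)/b)) is one common form in detection modeling and is assumed here:

```python
import math

def fermi_detection(r, R, b):
    """Fermi detection law P(r) = 1 / (1 + exp((r - R) / b)),
    evaluated in a numerically stable way. R is the nominal detection
    range; b controls how sharply detection falls off with range."""
    x = (r - R) / b
    if x >= 0:
        e = math.exp(-x)
        return e / (1.0 + e)
    return 1.0 / (1.0 + math.exp(x))

def cookie_cutter(r, R):
    """Definite-range ('cookie cutter') law: detect iff r <= R."""
    return 1.0 if r <= R else 0.0

# As b -> 0 the Fermi curve approaches the cookie-cutter step:
R = 10.0
for r in (5.0, 9.9, 10.1, 15.0):
    assert abs(fermi_detection(r, R, b=1e-4) - cookie_cutter(r, R)) < 1e-9
```

The stable two-branch evaluation avoids the overflow that `math.exp` would raise for large positive (r − R)/b at small b.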
Development of New Generation of Multibody System Computer Codes
2012-11-02
Illinois at Chicago, 842 West Taylor Street, Chicago, IL 60607. Paramsothy Jayakumar, Michael D. Letherwood, U.S. Army RDECOM-TARDEC, 6501 E. 11 Mile...NUMBER W56HZV-13-C-0032 5b. GRANT NUMBER 5c. PROGRAM ELEMENT NUMBER 6. AUTHOR(S) Ahmed Shabana; Paramsothy Jayakumar; Michael Letherwood 5d...public release. UNCLASSIFIED: Distribution Statement A. Approved for public release. REFERENCES 1. Contreras, U., Jayakumar, P., Letherwood, M.
The Navy’s Coupled Atmosphere-Ocean-Wave Prediction System
2011-04-15
[Garbled OCR of a report documentation page; recoverable fields:] 1. REPORT DATE (DD-MM-YYYY): 15-04-2011. 2. REPORT TYPE: Conference Proceeding. 4. TITLE AND SUBTITLE: The Navy's Coupled Atmosphere-Ocean-Wave Prediction System. 1. Release of this paper is approved. 2. To the best knowledge of
Mironov, Vladimir; Moskovsky, Alexander; D’Mello, Michael; ...
2017-10-04
The Hartree-Fock (HF) method in the quantum chemistry package GAMESS represents one of the most irregular algorithms in computation today. Major steps in the calculation are the irregular computation of electron repulsion integrals (ERIs) and the building of the Fock matrix. These are the central components of the main Self-Consistent Field (SCF) loop, the key hotspot in Electronic Structure (ES) codes. By threading the MPI ranks in the official release of the GAMESS code, we not only speed up the main SCF loop (4x to 6x for large systems), but also achieve a significant (>2x) reduction in the overall memory footprint. These improvements are a direct consequence of memory access optimizations within the MPI ranks. We benchmark our implementation against the official release of the GAMESS code on the Intel Xeon Phi supercomputer. Scaling numbers are reported on up to 7,680 cores on Intel Xeon Phi coprocessors.
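The reported 4x-6x SCF-loop speedups can be put in rough context with Amdahl's law. A hedged sketch; the parallel fraction and thread count below are assumptions for illustration, not measured GAMESS figures:

```python
def amdahl_speedup(parallel_fraction, workers):
    """Ideal speedup when only `parallel_fraction` of the runtime
    benefits from `workers`-way threading (Amdahl's law)."""
    serial = 1.0 - parallel_fraction
    return 1.0 / (serial + parallel_fraction / workers)

# A ~6x speedup on, say, 8 threads per rank is consistent with roughly
# 95% of the SCF loop's runtime being threaded (hypothetical figures):
speedup = amdahl_speedup(0.95, 8)
```

Under these assumed numbers the model predicts about 5.9x, illustrating why ERI computation and Fock-matrix assembly, which dominate the loop, must both thread well to reach the reported range.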
Maxdose-SR and popdose-SR routine release atmospheric dose models used at SRS
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jannik, G. T.; Trimor, P. P.
MAXDOSE-SR and POPDOSE-SR are used to calculate dose to the offsite Reference Person and to the surrounding Savannah River Site (SRS) population, respectively, following routine releases of atmospheric radioactivity. These models are currently accessed through the Dose Model Version 2014 graphical user interface (GUI). MAXDOSE-SR and POPDOSE-SR are personal computer (PC) versions of MAXIGASP and POPGASP, which both resided on the SRS IBM mainframe. These two codes follow U.S. Nuclear Regulatory Commission (USNRC) Regulatory Guides 1.109 and 1.111 (1977a, 1977b). The basis for MAXDOSE-SR and POPDOSE-SR are the USNRC-developed codes XOQDOQ (Sagendorf et al. 1982) and GASPAR (Eckerman et al. 1980). Both of these codes have previously been verified for use at SRS (Simpkins 1999 and 2000). The revisions incorporated into MAXDOSE-SR and POPDOSE-SR Version 2014 (hereafter referred to as MAXDOSE-SR and POPDOSE-SR unless otherwise noted) were made per Computer Program Modification Tracker (CPMT) number Q-CMT-A-00016 (Appendix D). Version 2014 was verified for use at SRS in Dixon (2014).
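XOQDOQ, the meteorological basis of MAXDOSE-SR, computes relative atmospheric concentrations (χ/Q) for routine releases. The sketch below is a much-simplified ground-level, centerline Gaussian-plume estimate, not the actual sector-averaged XOQDOQ implementation; the dispersion parameters and wind speed are placeholders:

```python
import math

def chi_over_q(sigma_y, sigma_z, u):
    """Ground-level centerline chi/Q (s/m^3) for a ground-level release
    with full ground reflection: chi/Q = 1 / (pi * sigma_y * sigma_z * u).
    sigma_y, sigma_z: lateral/vertical dispersion parameters (m)
    u: mean wind speed (m/s)."""
    return 1.0 / (math.pi * sigma_y * sigma_z * u)

# Hypothetical dispersion parameters at some downwind distance:
xq = chi_over_q(sigma_y=30.0, sigma_z=15.0, u=2.0)  # s/m^3
```

Multiplying χ/Q by a release rate (Bq/s) gives an air concentration at the receptor, which dose codes like GASPAR then convert to dose via pathway and dose-factor models.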
DOE Office of Scientific and Technical Information (OSTI.GOV)
Carruthers, L.M.; Lee, C.E.
1976-10-01
The theoretical and numerical data base development of the LARC-1 code is described. Four analytical models of fission product release from an HTGR core during the loss of forced circulation accident are developed. Effects of diffusion, adsorption and evaporation of the metallics and precursors are neglected in this first LARC model. Comparison of the analytic models indicates that the constant release-renormalized model is adequate to describe the processes involved. The numerical data base for release constants, temperature modeling, fission product release rates, coated fuel particle failure fraction and aged coated fuel particle failure fractions is discussed. Analytic fits and graphic displays for these data are given for the Ft. St. Vrain and GASSAR models.
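A constant-release-constant model of the kind described reduces, for a single nuclide, to a birth/decay/escape balance dN/dt = B − (λ + ε)N. A minimal sketch under that assumption; the symbols and rate constants below are illustrative, not LARC-1 data:

```python
import math

def inventory(t, B, lam, eps):
    """Core inventory N(t) from dN/dt = B - (lam + eps) * N with N(0) = 0.
    B: birth rate (atoms/s), lam: decay constant (1/s),
    eps: constant release (escape-rate) coefficient (1/s).
    All values used here are hypothetical."""
    k = lam + eps
    return (B / k) * (1.0 - math.exp(-k * t))

def release_rate(t, B, lam, eps):
    """Release rate from the core to the circuit: eps * N(t)."""
    return eps * inventory(t, B, lam, eps)

# At long times the inventory saturates at B/(lam + eps) and the
# release rate at eps * B / (lam + eps).
n_ss = inventory(1e3, B=1.0, lam=0.1, eps=0.05)
```

The steady-state release fraction ε/(λ + ε) is the quantity a constant-release model effectively tabulates per nuclide and temperature.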
GEN-IV Benchmarking of Triso Fuel Performance Models under accident conditions modeling input data
DOE Office of Scientific and Technical Information (OSTI.GOV)
Collin, Blaise Paul
This document presents the benchmark plan for the calculation of particle fuel performance on safety testing experiments that are representative of operational accidental transients. The benchmark is dedicated to the modeling of fission product release under accident conditions by fuel performance codes from around the world, and the subsequent comparison to post-irradiation experiment (PIE) data from the modeled heating tests. The accident condition benchmark is divided into three parts: • The modeling of a simplified benchmark problem to assess potential numerical calculation issues at low fission product release. • The modeling of the AGR-1 and HFR-EU1bis safety testing experiments. •more » The comparison of the AGR-1 and HFR-EU1bis modeling results with PIE data. The simplified benchmark case, thereafter named NCC (Numerical Calculation Case), is derived from “Case 5” of the International Atomic Energy Agency (IAEA) Coordinated Research Program (CRP) on coated particle fuel technology [IAEA 2012]. It is included so participants can evaluate their codes at low fission product release. “Case 5” of the IAEA CRP-6 showed large code-to-code discrepancies in the release of fission products, which were attributed to “effects of the numerical calculation method rather than the physical model” [IAEA 2012]. The NCC is therefore intended to check if these numerical effects subsist. The first two steps imply the involvement of the benchmark participants with a modeling effort following the guidelines and recommendations provided by this document. The third step involves the collection of the modeling results by Idaho National Laboratory (INL) and the comparison of these results with the available PIE data. The objective of this document is to provide all necessary input data to model the benchmark cases, and to give some methodology guidelines and recommendations in order to make all results suitable for comparison with each other. 
The participants should read this document thoroughly to make sure all the data needed for their calculations is provided in the document. Missing data will be added to a revision of the document if necessary. 09/2016: Tables 6 and 8 updated. AGR-2 input data added.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Collin, Blaise P.
2014-09-01
This document presents the benchmark plan for the calculation of particle fuel performance on safety testing experiments that are representative of operational accident transients. The benchmark is dedicated to the modeling of fission product release under accident conditions by fuel performance codes from around the world, and the subsequent comparison to post-irradiation examination (PIE) data from the modeled heating tests. The accident condition benchmark is divided into three parts: the modeling of a simplified benchmark problem to assess potential numerical calculation issues at low fission product release; the modeling of the AGR-1 and HFR-EU1bis safety testing experiments; and the comparison of the AGR-1 and HFR-EU1bis modeling results with PIE data. The simplified benchmark case, hereafter named NCC (Numerical Calculation Case), is derived from “Case 5” of the International Atomic Energy Agency (IAEA) Coordinated Research Program (CRP) on coated particle fuel technology [IAEA 2012]. It is included so participants can evaluate their codes at low fission product release. “Case 5” of the IAEA CRP-6 showed large code-to-code discrepancies in the release of fission products, which were attributed to “effects of the numerical calculation method rather than the physical model” [IAEA 2012]. The NCC is therefore intended to check whether these numerical effects persist. The first two steps involve the benchmark participants in a modeling effort following the guidelines and recommendations provided by this document. The third step involves the collection of the modeling results by Idaho National Laboratory (INL) and the comparison of these results with the available PIE data. The objective of this document is to provide all necessary input data to model the benchmark cases, and to give some methodology guidelines and recommendations in order to make all results suitable for comparison with each other.
The participants should read this document thoroughly to make sure all the data needed for their calculations is provided in the document. Missing data will be added to a revision of the document if necessary.
GSAC - Generic Seismic Application Computing
NASA Astrophysics Data System (ADS)
Herrmann, R. B.; Ammon, C. J.; Koper, K. D.
2004-12-01
With the success of the IRIS data management center, the use of large data sets in seismological research has become common. Such data sets, and especially the significantly larger data sets expected from EarthScope, present challenges for analysis with existing tools developed over the last 30 years. For much of the community, the primary format for data analysis is the Seismic Analysis Code (SAC) format developed by Lawrence Livermore National Laboratory. Although somewhat restrictive in meta-data storage, the simplicity and stability of the format have established it as an important component of seismological research. Tools for working with SAC files fall into two categories: custom research-quality processing codes, and shared display and processing tools such as SAC2000, MatSeis, etc., which were developed primarily for the needs of individual seismic research groups. While the current graphics display and platform dependence of SAC2000 may be resolved if the source code is released, the code complexity and the lack of large-data-set analysis or even introductory tutorials could preclude code improvements and development of expertise in its use. We believe that there is a place for new, especially open-source, tools. The GSAC effort is an approach that focuses on ease of use, computational speed, transportability, rapid addition of new features, and openness, so that new and advanced students, researchers and instructors can quickly browse and process large data sets. We highlight several approaches toward data processing under this model. gsac, part of the Computer Programs in Seismology 3.30 distribution, has much of the functionality of SAC2000 and works on UNIX/LINUX/MacOS-X/Windows (CYGWIN). It is completely programmed in C from scratch, is small, fast, and easy to maintain and extend. It is command-line based and is easily included within shell processing scripts.
PySAC is a set of Python functions that allow easy access to SAC files and enable efficient manipulation of SAC files under a variety of operating systems. PySAC has proven to be valuable in organizing large data sets. An array processing package includes standard beamforming algorithms and a search based method for inference of slowness vectors. The search results can be visualized using GMT scripts output by the C programs, and the resulting snapshots can be combined into an animation of the time evolution of the 2D slowness field.
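The stability of the SAC format is what makes lightweight readers like those in PySAC practical: a fixed-size binary header followed by the waveform samples. The following is a minimal sketch of that kind of access, not PySAC's actual API; it assumes the conventional 632-byte little-endian header layout (70 floats, 40 ints, 192 bytes of characters), with the sample interval in header word 0 and npts in word 79:

```python
import struct

def write_sac(path, delta, data):
    """Write a minimal little-endian SAC-style file: header + float32 samples."""
    floats = [-12345.0] * 70          # SAC's "undefined" sentinel value
    ints = [-12345] * 40
    floats[0] = delta                 # header word 0: sample interval
    ints[9] = len(data)               # header word 79: npts
    with open(path, "wb") as f:
        f.write(struct.pack("<70f", *floats))
        f.write(struct.pack("<40i", *ints))
        f.write(b" " * 192)           # character header block
        f.write(struct.pack("<%df" % len(data), *data))

def read_sac(path):
    """Return (delta, samples) from a file written by write_sac."""
    with open(path, "rb") as f:
        raw = f.read()
    floats = struct.unpack("<70f", raw[:280])
    ints = struct.unpack("<40i", raw[280:440])
    npts = ints[9]
    samples = struct.unpack("<%df" % npts, raw[632:632 + 4 * npts])
    return floats[0], list(samples)
```

A round trip through these two functions preserves the sample interval and the data, which is the kind of easy programmatic access to SAC files that the Python layer provides.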
Arsenic Detoxification by Geobacter Species.
Dang, Yan; Walker, David J F; Vautour, Kaitlin E; Dixon, Steven; Holmes, Dawn E
2017-02-15
Insight into the mechanisms for arsenic detoxification by Geobacter species is expected to improve the understanding of global cycling of arsenic in iron-rich subsurface sedimentary environments. Analysis of 14 different Geobacter genomes showed that all of these species have genes coding for an arsenic detoxification system (ars operon), and several have genes required for arsenic respiration (arr operon) and methylation (arsM). Genes encoding four arsenic repressor-like proteins were detected in the genome of G. sulfurreducens; however, only one (ArsR1) regulated transcription of the ars operon. Elimination of arsR1 from the G. sulfurreducens chromosome resulted in enhanced transcription of genes coding for the arsenic efflux pump (Acr3) and arsenate reductase (ArsC). When the gene coding for Acr3 was deleted, cells were not able to grow in the presence of either the oxidized or reduced form of arsenic, while arsC deletion mutants could grow in the presence of arsenite but not arsenate. These studies shed light on how Geobacter influences arsenic mobility in anoxic sediments and may help us develop methods to remediate arsenic contamination in the subsurface. This study examines arsenic transformation mechanisms utilized by Geobacter, a genus of iron-reducing bacteria that are predominant in many anoxic iron-rich subsurface environments. Geobacter species play a major role in microbially mediated arsenic release from metal hydroxides in the subsurface. This release raises arsenic concentrations in drinking water to levels that are high enough to cause major health problems. Therefore, information obtained from studies of Geobacter should shed light on arsenic cycling in iron-rich subsurface sedimentary environments, which may help reduce arsenic-associated illnesses. These studies should also help in the development of biosensors that can be used to detect arsenic contaminants in anoxic subsurface environments. 
We examined 14 different Geobacter genomes and found that all of these species possess genes coding for an arsenic detoxification system (ars operon), and some also have genes required for arsenic respiration (arr operon) and arsenic methylation (arsM). Copyright © 2017 American Society for Microbiology.
Arsenic Detoxification by Geobacter Species
Walker, David J. F.; Vautour, Kaitlin E.; Dixon, Steven
2016-01-01
ABSTRACT Insight into the mechanisms for arsenic detoxification by Geobacter species is expected to improve the understanding of global cycling of arsenic in iron-rich subsurface sedimentary environments. Analysis of 14 different Geobacter genomes showed that all of these species have genes coding for an arsenic detoxification system (ars operon), and several have genes required for arsenic respiration (arr operon) and methylation (arsM). Genes encoding four arsenic repressor-like proteins were detected in the genome of G. sulfurreducens; however, only one (ArsR1) regulated transcription of the ars operon. Elimination of arsR1 from the G. sulfurreducens chromosome resulted in enhanced transcription of genes coding for the arsenic efflux pump (Acr3) and arsenate reductase (ArsC). When the gene coding for Acr3 was deleted, cells were not able to grow in the presence of either the oxidized or reduced form of arsenic, while arsC deletion mutants could grow in the presence of arsenite but not arsenate. These studies shed light on how Geobacter influences arsenic mobility in anoxic sediments and may help us develop methods to remediate arsenic contamination in the subsurface. IMPORTANCE This study examines arsenic transformation mechanisms utilized by Geobacter, a genus of iron-reducing bacteria that are predominant in many anoxic iron-rich subsurface environments. Geobacter species play a major role in microbially mediated arsenic release from metal hydroxides in the subsurface. This release raises arsenic concentrations in drinking water to levels that are high enough to cause major health problems. Therefore, information obtained from studies of Geobacter should shed light on arsenic cycling in iron-rich subsurface sedimentary environments, which may help reduce arsenic-associated illnesses. These studies should also help in the development of biosensors that can be used to detect arsenic contaminants in anoxic subsurface environments. 
We examined 14 different Geobacter genomes and found that all of these species possess genes coding for an arsenic detoxification system (ars operon), and some also have genes required for arsenic respiration (arr operon) and arsenic methylation (arsM). PMID:27940542
Understanding Coronal Heating through Time-Series Analysis and Nanoflare Modeling
NASA Astrophysics Data System (ADS)
Romich, Kristine; Viall, Nicholeen
2018-01-01
Periodic intensity fluctuations in coronal loops, a signature of temperature evolution, have been observed using the Atmospheric Imaging Assembly (AIA) aboard NASA’s Solar Dynamics Observatory (SDO) spacecraft. We examine the proposal that nanoflares, or impulsive bursts of energy release in the solar atmosphere, are responsible for the intensity fluctuations as well as the megakelvin-scale temperatures observed in the corona. Drawing on the work of Cargill (2014) and Bradshaw & Viall (2016), we develop a computer model of the energy released by a sequence of nanoflare events in a single magnetic flux tube. We then use EBTEL (Enthalpy-Based Thermal Evolution of Loops), a hydrodynamic model of plasma response to energy input, to simulate intensity as a function of time across the coronal AIA channels. We test the EBTEL output for periodicities using a spectral code based on Mann and Lees’ (1996) multitaper method and present preliminary results here. Our ultimate goal is to establish whether quasi-continuous or impulsive energy bursts better approximate the original SDO data.
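The spectral step of that workflow can be sketched with a far simpler estimator than the multitaper method the study actually uses: a plain periodogram of a synthetic intensity series with a known periodicity. All numbers below are illustrative, not AIA data:

```python
import cmath, math

def power_spectrum(x):
    """Naive DFT periodogram (a stand-in for the multitaper estimator)."""
    n = len(x)
    mean = sum(x) / n
    xd = [v - mean for v in x]                     # remove the DC offset
    spec = []
    for k in range(1, n // 2):                     # k cycles per record length
        s = sum(xd[t] * cmath.exp(-2j * math.pi * k * t / n)
                for t in range(n))
        spec.append(abs(s) ** 2)
    return spec

# Synthetic "loop intensity": 5 cycles over 200 samples plus a weak trend
n = 200
series = [math.sin(2 * math.pi * 5 * t / n) + 0.001 * t for t in range(n)]
spec = power_spectrum(series)
peak_k = spec.index(max(spec)) + 1                 # dominant periodicity
```

For this series the spectrum peaks at 5 cycles per record, recovering the injected period despite the trend; distinguishing such a peak from broadband, impulsive energy input is the essence of the periodicity test.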
Effective Thermal Conductivity of Graphite Materials with Cracks
NASA Astrophysics Data System (ADS)
Pestchanyi, S. E.; Landman, I. S.
The dependence of effective thermal diffusivity on temperature caused by volumetric cracks is modelled for macroscopic graphite samples using the three-dimensional thermomechanics code Pegasus-3D. At high off-normal heat loads typical of the divertor armour, thermostress due to the anisotropy of graphite grains is much larger than that due to the temperature gradient. Numerical simulation demonstrated that the volumetric crack density both in fine-grain graphites and in the CFC matrix depends mainly on the local sample temperature, not on the temperature gradient. This makes it possible to define an effective thermal diffusivity for graphite with cracks. The results obtained are used to explain intense cracking and particle release from carbon-based materials under electron beam heat load. The decrease of graphite thermal diffusivity with increasing crack density explains the particle release mechanism in the experiments with CFC, where a clear energy threshold for the onset of particle release has been observed (J. Linke et al., Fusion Eng. Design, in press; Bazyler et al., these proceedings). Surface temperature measurement is necessary to calibrate the Pegasus-3D code for simulation of ITER divertor armour brittle destruction.
Xu, Wei; Morishita, Wade; Buckmaster, Paul S; Pang, Zhiping P; Malenka, Robert C; Südhof, Thomas C
2012-03-08
Neurons encode information by firing spikes in isolation or bursts and propagate information by spike-triggered neurotransmitter release that initiates synaptic transmission. Isolated spikes trigger neurotransmitter release unreliably but with high temporal precision. In contrast, bursts of spikes trigger neurotransmission reliably (i.e., boost transmission fidelity), but the resulting synaptic responses are temporally imprecise. However, the relative physiological importance of different spike-firing modes remains unclear. Here, we show that knockdown of synaptotagmin-1, the major Ca(2+) sensor for neurotransmitter release, abrogated neurotransmission evoked by isolated spikes but only delayed, without abolishing, neurotransmission evoked by bursts of spikes. Nevertheless, knockdown of synaptotagmin-1 in the hippocampal CA1 region did not impede acquisition of recent contextual fear memories, although it did impair the precision of such memories. In contrast, knockdown of synaptotagmin-1 in the prefrontal cortex impaired all remote fear memories. These results indicate that different brain circuits and types of memory employ distinct spike-coding schemes to encode and transmit information. Copyright © 2012 Elsevier Inc. All rights reserved.
32 CFR 806b.45 - Releasable information.
Code of Federal Regulations, 2011 CFR
2011-07-01
...-inclusive. (a) Name. (b) Rank. (c) Grade. (d) Air Force specialty code. (e) Pay (including base pay, special...) Pay date. (n) Source of commission. (o) Professional military education. (p) Promotion sequence number...
32 CFR 806b.45 - Releasable information.
Code of Federal Regulations, 2010 CFR
2010-07-01
...-inclusive. (a) Name. (b) Rank. (c) Grade. (d) Air Force specialty code. (e) Pay (including base pay, special...) Pay date. (n) Source of commission. (o) Professional military education. (p) Promotion sequence number...
Discrete Spring Model for Predicting Delamination Growth in Z-Fiber Reinforced DCB Specimens
NASA Technical Reports Server (NTRS)
Ratcliffe, James G.; OBrien, T. Kevin
2004-01-01
Beam theory analysis was applied to predict delamination growth in Double Cantilever Beam (DCB) specimens reinforced in the thickness direction with pultruded pins, known as Z-fibers. The specimen arms were modeled as cantilever beams supported by discrete springs, which were included to represent the pins. A bi-linear, irreversible damage law was used to represent Z-fiber damage, the parameters of which were obtained from previous experiments. Closed-form solutions were developed for specimen compliance and displacements corresponding to Z-fiber row locations. A solution strategy was formulated to predict delamination growth, in which the parent laminate mode I critical strain energy release rate was used as the criterion for delamination growth. The solution procedure was coded into FORTRAN 90, giving a dedicated software tool for performing the delamination prediction. Comparison of analysis results with previous analysis and experiment showed good agreement, yielding an initial verification for the analytical procedure.
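The bi-linear, irreversible damage law can be sketched as a state function for a single spring: linear loading to a peak opening, linear softening to failure, and unloading along a degraded secant so that accumulated damage never heals. The parameter values below are illustrative, not those obtained from the Z-fiber experiments:

```python
def zpin_force(delta, delta_max, k, d0, df):
    """Force in one Z-pin spring under a bi-linear, irreversible law.

    delta      current opening at the spring location
    delta_max  largest opening seen so far (carries the damage state)
    k          initial stiffness; d0: peak opening; df: failure opening
    """
    dmax = max(delta_max, delta)
    if dmax <= d0:
        return k * delta                      # undamaged, fully reversible
    if dmax >= df:
        return 0.0                            # pin fully failed
    # softening-branch force at the historical maximum opening
    f_at_dmax = k * d0 * (df - dmax) / (df - d0)
    return (f_at_dmax / dmax) * delta         # damaged secant on unloading
```

Loading past d0 and then unloading returns a lower force than the virgin stiffness would give; this irreversibility is what lets the discrete-spring model track progressive Z-fiber damage as the delamination front advances.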
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cardoni, Jeffrey N.; Kalinich, Donald A.
2014-02-01
Sandia National Laboratories (SNL) plans to conduct uncertainty analyses (UA) on the Fukushima Daiichi unit 1 (1F1) plant with the MELCOR code. The model to be used was developed for a previous accident reconstruction investigation jointly sponsored by the US Department of Energy (DOE) and Nuclear Regulatory Commission (NRC). However, that study only examined a handful of various model inputs and boundary conditions, and the predictions yielded only fair agreement with plant data and current release estimates. The goal of this uncertainty study is to perform a focused evaluation of uncertainty in core melt progression behavior and its effect on key figures-of-merit (e.g., hydrogen production, vessel lower head failure, etc.). In preparation for the SNL Fukushima UA work, a scoping study has been completed to identify important core melt progression parameters for the uncertainty analysis. The study also lays out a preliminary UA methodology.
A Finite Rate Chemical Analysis of Nitric Oxide Flow Contamination Effects on Scramjet Performance
NASA Technical Reports Server (NTRS)
Cabell, Karen F.; Rock, Kenneth E.
2003-01-01
The level of nitric oxide contamination in the test gas of the Langley Research Center Arc-Heated Scramjet Test Facility and the effect of the contamination on scramjet test engine performance were investigated analytically. A finite rate chemical analysis was performed to determine the levels of nitric oxide produced in the facility at conditions corresponding to Mach 6 to 8 flight simulations. Results indicate that nitric oxide levels range from one to three mole percent, corroborating previously obtained measurements. A three-stream combustor code with finite rate chemistry was used to investigate the effects of nitric oxide on scramjet performance. Results indicate that nitric oxide in the test gas causes a small increase in heat release and thrust performance for the test conditions investigated. However, a rate constant uncertainty analysis suggests that the effect of nitric oxide ranges from no net effect, to an increase of about 10 percent in thrust performance.
EXP-PAC: providing comparative analysis and storage of next generation gene expression data.
Church, Philip C; Goscinski, Andrzej; Lefèvre, Christophe
2012-07-01
Microarrays and, more recently, RNA sequencing have led to an increase in available gene expression data. How to manage and store this data is becoming a key issue. In response, we have developed EXP-PAC, a web-based software package for storage, management and analysis of gene expression and sequence data. Unique to this package is SQL-based querying of gene expression data sets, distributed normalization of raw gene expression data and analysis of gene expression data across experiments and species. This package has been populated with lactation data in the international milk genomic consortium web portal (http://milkgenomics.org/). Source code is also available which can be hosted on a Windows, Linux or Mac APACHE server connected to a private or public network (http://mamsap.it.deakin.edu.au/~pcc/Release/EXP_PAC.html). Copyright © 2012 Elsevier Inc. All rights reserved.
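The SQL-based querying the package highlights can be illustrated with Python's built-in sqlite3 module; the schema, gene symbols and expression values below are invented for the sketch and are not EXP-PAC's actual schema:

```python
import sqlite3

# Toy expression store: one row per (gene, experiment) measurement
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE expression (gene TEXT, experiment TEXT, level REAL)")
con.executemany(
    "INSERT INTO expression VALUES (?, ?, ?)",
    [("CSN2", "lactation_d7", 812.5),
     ("CSN2", "lactation_d30", 1140.0),
     ("LALBA", "lactation_d7", 655.0),
     ("LALBA", "lactation_d30", 990.2)],
)
# Cross-experiment comparison for one gene -- the kind of query an
# SQL-backed expression store makes trivial
rows = con.execute(
    "SELECT experiment, level FROM expression "
    "WHERE gene = ? ORDER BY level DESC", ("CSN2",)
).fetchall()
```

Expressing comparisons as declarative queries rather than per-file scripts is what makes cross-experiment and cross-species analysis tractable as data sets grow.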
User's Manual for Space Debris Surfaces (SD_SURF)
NASA Technical Reports Server (NTRS)
Elfer, N. C.
1996-01-01
A unique collection of computer codes, Space Debris Surfaces (SD_SURF), has been developed to assist in the design and analysis of space debris protection systems. SD_SURF calculates and summarizes a vehicle's vulnerability to space debris as a function of impact velocity and obliquity. An SD_SURF analysis will show which velocities and obliquities are the most probable to cause a penetration. This determination can help the analyst select a shield design which is best suited to the predominant penetration mechanism. The analysis also indicates the most suitable parameters for development or verification testing. The SD_SURF programs offer the option of either FORTRAN programs or Microsoft Excel spreadsheets and macros. The FORTRAN programs work with BUMPERII version 1.2a or 1.3 (COSMIC released). The Excel spreadsheets and macros can be used independently or with selected output from the SD_SURF FORTRAN programs.
NASA Astrophysics Data System (ADS)
Sandalski, Stou
Smooth particle hydrodynamics is an efficient method for modeling the dynamics of fluids. It is commonly used to simulate astrophysical processes such as binary mergers. We present a newly developed GPU accelerated smooth particle hydrodynamics code for astrophysical simulations. The code is named
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kicker, Dwayne Curtis; Herrick, Courtney G; Zeitler, Todd
The numerical code DRSPALL (from direct release spallings) is written to calculate the volume of Waste Isolation Pilot Plant solid waste subject to material failure and transport to the surface (i.e., spallings) as a result of a hypothetical future inadvertent drilling intrusion into the repository. An error in the implementation of the DRSPALL finite difference equations was discovered and documented in a software problem report in accordance with the quality assurance procedure for software requirements. This paper describes the corrections to DRSPALL and documents the impact of the new spallings data from the modified DRSPALL on previous performance assessment calculations. Updated performance assessments result in more simulations with spallings, which generally translates to an increase in spallings releases to the accessible environment. Total normalized radionuclide releases using the modified DRSPALL data were determined by forming the summation of releases across each potential release pathway, namely borehole cuttings and cavings releases, spallings releases, direct brine releases, and transport releases. Because spallings releases are not a major contributor to the total releases, the updated performance assessment calculations of overall mean complementary cumulative distribution functions for total releases are virtually unchanged. Therefore, the corrections to the spallings volume calculation did not impact Waste Isolation Pilot Plant performance assessment calculation results.
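The bookkeeping behind the total is a plain sum over the four pathways named above; a sketch with invented numbers shows why a modest change in a minor pathway barely moves the overall release:

```python
def total_release(pathways):
    """Total normalized release as the sum over the release pathways."""
    return sum(pathways.values())

# Illustrative (invented) normalized releases by pathway
releases = {
    "cuttings_and_cavings": 0.08,
    "spallings": 0.003,   # minor contributor, as in the paper's conclusion
    "direct_brine": 0.05,
    "transport": 0.01,
}
total = total_release(releases)
```

With these invented figures, doubling the spallings term changes the total by only about 2%, mirroring why the corrected DRSPALL data left the overall mean CCDFs for total releases virtually unchanged.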
The kinetics of aerosol particle formation and removal in NPP severe accidents
NASA Astrophysics Data System (ADS)
Zatevakhin, Mikhail A.; Arefiev, Valentin K.; Semashko, Sergey E.; Dolganov, Rostislav A.
2016-06-01
Severe Nuclear Power Plant (NPP) accidents are accompanied by release of a massive amount of energy, radioactive products and hydrogen into the atmosphere of the NPP containment. A valid estimation of consequences of such accidents can only be carried out through the use of the integrated codes comprising a description of the basic processes which determine the consequences. A brief description of a coupled aerosol and thermal-hydraulic code to be used for the calculation of the aerosol kinetics within the NPP containment in case of a severe accident is given. The code comprises a KIN aerosol unit integrated into the KUPOL-M thermal-hydraulic code. Some features of aerosol behavior in severe NPP accidents are briefly described.
The kinetics of aerosol particle formation and removal in NPP severe accidents
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zatevakhin, Mikhail A.; Arefiev, Valentin K.; Semashko, Sergey E.
2016-06-08
Severe Nuclear Power Plant (NPP) accidents are accompanied by release of a massive amount of energy, radioactive products and hydrogen into the atmosphere of the NPP containment. A valid estimation of consequences of such accidents can only be carried out through the use of the integrated codes comprising a description of the basic processes which determine the consequences. A brief description of a coupled aerosol and thermal–hydraulic code to be used for the calculation of the aerosol kinetics within the NPP containment in case of a severe accident is given. The code comprises a KIN aerosol unit integrated into the KUPOL-M thermal–hydraulic code. Some features of aerosol behavior in severe NPP accidents are briefly described.
Haley, Danielle F; Golin, Carol E; Farel, Claire E; Wohl, David A; Scheyett, Anna M; Garrett, Jenna J; Rosen, David L; Parker, Sharon D
2014-12-09
Although prison provides the opportunity for HIV diagnosis and access to in-prison care, following release, many HIV-infected inmates experience clinical setbacks, including nonadherence to antiretrovirals, elevations in viral load, and HIV disease progression. HIV-infected former inmates face numerous barriers to successful community reentry and to accessing healthcare. However, little is known about the outcome expectations of HIV-infected inmates for release, how their post-release lives align with pre-release expectations, and how these processes influence engagement in HIV care following release from prison. We conducted semi-structured interviews (24 pre- and 13 post-release) with HIV-infected inmates enrolled in a randomized controlled trial of a case management intervention to enhance post-release linkage to care. Two researchers independently coded data using a common codebook. Intercoder reliability was strong (kappa = 0.86). We analyzed data using Grounded Theory methodology and Applied Thematic Analysis. We collected and compared baseline sociodemographic and behavioral characteristics of all cohort participants who did and did not participate in the qualitative interviews using Fisher's Exact Tests for categorical measures and Wilcoxon rank-sum tests for continuous measures. Most participants were heterosexual, middle-aged, single, African American men and women with histories of substance use. Substudy participants were more likely to anticipate living with family/friends and needing income assistance post-release. Most were taking antiretrovirals prior to release and anticipated needing help securing health benefits and medications post-release. Before release, most participants felt confident they would be able to manage their HIV. However, upon release, many experienced intermittent or prolonged periods of antiretroviral nonadherence, largely due to substance use relapse or delays in care initiation. 
Substance use was precipitated by stressful life experiences, including stigma, and contact with drug-using social networks. As informed by the Social Cognitive Theory and HIV Stigma Framework, findings illustrate the reciprocal relationships among substance use, experiences of stigma, pre- and post-release environments, and skills needed to engage in HIV care. These findings underscore the need for comprehensive evidence-based interventions to prepare inmates to transition from incarceration to freedom, particularly those that strengthen linkage to HIV care and focus on realities of reentry, including stigma, meeting basic needs, preventing substance abuse, and identifying community resources.
Python-Assisted MODFLOW Application and Code Development
NASA Astrophysics Data System (ADS)
Langevin, C.
2013-12-01
The U.S. Geological Survey (USGS) has a long history of developing and maintaining free, open-source software for hydrological investigations. The MODFLOW program is one of the most popular hydrologic simulation programs released by the USGS, and it is considered to be the most widely used groundwater flow simulation code. MODFLOW was written using a modular design and a procedural FORTRAN style, which resulted in code that could be understood, modified, and enhanced by many hydrologists. The code is fast, and because it uses standard FORTRAN it can be run on most operating systems. Most MODFLOW users rely on proprietary graphical user interfaces for constructing models and viewing model results. Some recent efforts, however, have focused on construction of MODFLOW models using open-source Python scripts. Customizable Python packages, such as FloPy (https://code.google.com/p/flopy), can be used to generate input files, read simulation results, and visualize results in two and three dimensions. Automating this sequence of steps leads to models that can be reproduced directly from original data and rediscretized in space and time. Python is also being used in the development and testing of new MODFLOW functionality. New packages and numerical formulations can be quickly prototyped and tested first with Python programs before implementation in MODFLOW. This is made possible by the flexible object-oriented design capabilities available in Python, the ability to call FORTRAN code from Python, and the ease with which linear systems of equations can be solved using SciPy, for example. Once new features are added to MODFLOW, Python can then be used to automate comprehensive regression testing and ensure reliability and accuracy of new versions prior to release.
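The prototyping loop described above can be made concrete with a deliberately small example (pure Python, not FloPy or MODFLOW code): a one-dimensional steady-state, fixed-head flow problem reduces to a tridiagonal linear system, and a few lines suffice to verify that the discretization recovers the exact linear head profile before any FORTRAN is written. All numbers are illustrative:

```python
def solve_tridiag(a, b, c, d):
    """Thomas algorithm: sub-diagonal a, diagonal b, super-diagonal c, rhs d."""
    n = len(d)
    cp, dp = [0.0] * n, [0.0] * n
    cp[0], dp[0] = c[0] / b[0], d[0] / b[0]
    for i in range(1, n):
        m = b[i] - a[i] * cp[i - 1]
        cp[i] = c[i] / m
        dp[i] = (d[i] - a[i] * dp[i - 1]) / m
    x = [0.0] * n
    x[-1] = dp[-1]
    for i in range(n - 2, -1, -1):
        x[i] = dp[i] - cp[i] * x[i + 1]
    return x

# 1-D steady confined flow with fixed heads of 10 m and 4 m at the ends,
# discretized on 5 interior nodes: h[i-1] - 2*h[i] + h[i+1] = 0
n, h_left, h_right = 5, 10.0, 4.0
a = [0.0] + [1.0] * (n - 1)
b = [-2.0] * n
c = [1.0] * (n - 1) + [0.0]
d = [0.0] * n
d[0] -= h_left                  # boundary heads move to the right-hand side
d[-1] -= h_right
heads = solve_tridiag(a, b, c, d)   # exact answer: the straight line 9..5
```

Once a formulation like this checks out against an analytic solution, it can be carried over to FORTRAN with confidence, and the same script doubles as a regression test for the new code path.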
Hong, Yonglong; He, Haitao; Sui, Wen; Zhang, Jingge; Zhang, Shenfu; Yang, Dajiang
2018-06-01
Following the publication of this article, we realize that the title appeared incorrectly: This appeared in print as "Long non‑coding RNA H1 promotes cell proliferation and invasion by acting as a ceRNA of miR‑138 and releasing EZH2 in oral squamous cell carcinoma", and the corrected title is now featured above ("H1" should have read as "H19"). Note that this error did not have any bearing on the results reported in the paper, or on the conclusions therein. We regret any inconvenience that this mistake has caused. [the original article was published in the International Journal of Oncology 52: 901‑912, 2018; DOI: 10.3892/ijo.2018.4247].
Rosetta3: An Object-Oriented Software Suite for the Simulation and Design of Macromolecules
Leaver-Fay, Andrew; Tyka, Michael; Lewis, Steven M.; Lange, Oliver F.; Thompson, James; Jacak, Ron; Kaufman, Kristian; Renfrew, P. Douglas; Smith, Colin A.; Sheffler, Will; Davis, Ian W.; Cooper, Seth; Treuille, Adrien; Mandell, Daniel J.; Richter, Florian; Ban, Yih-En Andrew; Fleishman, Sarel J.; Corn, Jacob E.; Kim, David E.; Lyskov, Sergey; Berrondo, Monica; Mentzer, Stuart; Popović, Zoran; Havranek, James J.; Karanicolas, John; Das, Rhiju; Meiler, Jens; Kortemme, Tanja; Gray, Jeffrey J.; Kuhlman, Brian; Baker, David; Bradley, Philip
2013-01-01
We have recently completed a full re-architecturing of the Rosetta molecular modeling program, generalizing and expanding its existing functionality. The new architecture enables the rapid prototyping of novel protocols by providing easy-to-use interfaces to powerful tools for molecular modeling. The source code of this re-architecturing has been released as Rosetta3 and is freely available for academic use. At the time of its release, it contained 470,000 lines of code. Counting currently unpublished protocols at the time of this writing, the source includes 1,285,000 lines. Its rapid growth is a testament to its ease of use. This document describes the requirements for our new architecture, justifies the design decisions, sketches out central classes, and highlights a few of the common tasks that the new software can perform. PMID:21187238
General Mission Analysis Tool (GMAT) Architectural Specification. Draft
NASA Technical Reports Server (NTRS)
Hughes, Steven P.; Conway, Darrel J.
2007-01-01
Early in 2002, Goddard Space Flight Center (GSFC) began to identify requirements for the flight dynamics software needed to fly upcoming missions that use formations of spacecraft to collect data. These requirements ranged from low-level modeling features to large-scale interoperability requirements. In 2003 we began work on a system designed to meet these requirements; this system is GMAT. The General Mission Analysis Tool (GMAT) is a general purpose flight dynamics modeling tool built on open source principles. The GMAT code is written in C++, and uses modern C++ constructs extensively. GMAT can be run through either a fully functional Graphical User Interface (GUI) or as a command line program with minimal user feedback. The system is built and runs on Microsoft Windows, Linux, and Macintosh OS X platforms. The GMAT GUI is written using wxWidgets, a cross-platform library of components that streamlines the development and extension of the user interface. Flight dynamics modeling is performed in GMAT by building components that represent the players in the analysis problem that is being modeled. These components interact through the sequential execution of instructions, embodied in the GMAT Mission Sequence. A typical Mission Sequence will model the trajectories of a set of spacecraft evolving over time, calculating relevant parameters during this propagation, and maneuvering individual spacecraft to maintain a set of mission constraints as established by the mission analyst. All of the elements used in GMAT for mission analysis can be viewed in the GMAT GUI or through a custom scripting language. Analysis problems modeled in GMAT are saved as script files, and these files can be read into GMAT. When a script is read into the GMAT GUI, the corresponding user interface elements are constructed in the GMAT GUI. The GMAT system was developed from the ground up to run in a platform-agnostic environment.
The source code compiles on numerous different platforms, and is regularly exercised running on Windows, Linux and Macintosh computers by the development and analysis teams working on the project. The system can be run using either a graphical user interface, written using the open source wxWidgets framework, or from a text console. The GMAT source code was written using open source tools. GSFC has released the code using the NASA open source license.
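The Mission Sequence pattern described above, in which components representing the players in an analysis are mutated by a strictly ordered list of instructions, can be sketched generically. This is an illustrative analogy only: the class and command names below are invented for the sketch and are not GMAT's actual API.

```python
# Illustrative sketch of a sequential mission-sequence pattern (NOT GMAT's API).
from dataclasses import dataclass

@dataclass
class Spacecraft:
    epoch: float = 0.0            # elapsed time, s (toy 1-D state)
    velocity: float = 7.7e3       # along-track speed, m/s

@dataclass
class Propagate:
    duration: float               # s
    def execute(self, sc: Spacecraft) -> None:
        sc.epoch += self.duration

@dataclass
class Maneuver:
    delta_v: float                # m/s
    def execute(self, sc: Spacecraft) -> None:
        sc.velocity += self.delta_v

def run_sequence(sc, commands):
    for cmd in commands:          # commands execute strictly in order
        cmd.execute(sc)
    return sc

sc = run_sequence(Spacecraft(), [Propagate(3600.0), Maneuver(12.5),
                                 Propagate(1800.0)])
```

The point of the pattern is that the analysis is entirely driven by the ordered command list, which is what makes a script-file representation of the problem natural.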
Atmospheric Renewable Energy Research, Volume 3: Solar-Power Microgrids and Atmospheric Influences
2016-09-01
Approved for public release; distribution unlimited.
2009-06-01
M. Harkins. Approved for public release; distribution is unlimited. The integration of a MWIR signature into VMIFF will add a daytime capability. A new generation of compact MWIR sources is emerging to meet demands
2008-09-01
NPS-OC-08-005, Naval Postgraduate School, Monterey, California. Approved for public release; distribution is unlimited. A universally accessible web-based marine
Maxine: A spreadsheet for estimating dose from chronic atmospheric radioactive releases
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jannik, Tim; Bell, Evaleigh; Dixon, Kenneth
MAXINE is an Excel spreadsheet used to estimate dose to individuals for routine and accidental atmospheric releases of radioactive materials. MAXINE does not contain an atmospheric dispersion model; rather, doses are estimated using air and ground concentrations as input. Minimal input is required to run the program, and site-specific parameters are used when possible. A complete code description, verification of the models, and a user's manual are included.
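The concentration-to-dose bookkeeping described above can be sketched as follows. The dose coefficients, intake rate, and pathway names here are illustrative placeholders, not values or pathways from the MAXINE spreadsheet.

```python
# Hypothetical sketch of dose estimation from air and ground concentrations.
# All coefficient values are illustrative assumptions, NOT MAXINE's data.

BREATHING_RATE_M3_PER_YR = 8400.0   # assumed adult inhalation rate

def inhalation_dose_mrem(air_conc_bq_m3, dose_coeff_mrem_per_bq):
    """Annual inhalation dose from a chronic air concentration."""
    intake_bq = air_conc_bq_m3 * BREATHING_RATE_M3_PER_YR
    return intake_bq * dose_coeff_mrem_per_bq

def ground_shine_dose_mrem(ground_conc_bq_m2, shine_coeff, occupancy=1.0):
    """Annual external dose from ground deposition (coefficient assumed)."""
    return ground_conc_bq_m2 * shine_coeff * occupancy

total = (inhalation_dose_mrem(2.0e-3, 1.0e-2)
         + ground_shine_dose_mrem(5.0, 1.0e-3, occupancy=0.7))
```

A spreadsheet implementation performs the same multiplications cell by cell, which is why only concentrations and a few site parameters are needed as input.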
High-Content Optical Codes for Protecting Rapid Diagnostic Tests from Counterfeiting.
Gökçe, Onur; Mercandetti, Cristina; Delamarche, Emmanuel
2018-06-19
Warnings and reports on counterfeit diagnostic devices are released several times a year by regulators and public health agencies. Unfortunately, mishandling, altering, and counterfeiting point-of-care diagnostics (POCDs) and rapid diagnostic tests (RDTs) are lucrative, relatively simple, and can lead to devastating consequences. Here, we demonstrate how to implement optical security codes in silicon- and nitrocellulose-based flow paths for device authentication using a smartphone. The codes are created by inkjet spotting inks directly on nitrocellulose or on micropillars. Codes containing up to 32 elements per mm² and 8 colors can encode as many as 10⁴⁵ combinations. Codes on silicon micropillars can be erased by setting a continuous flow path across the entire array of code elements or, for nitrocellulose, by simply wicking a liquid across the code. Static or labile code elements can further be formed on nitrocellulose to create a hidden code using poly(ethylene glycol) (PEG) or glycerol additives to the inks. More advanced codes having a specific deletion sequence can also be created in silicon microfluidic devices using an array of passive routing nodes, which activate in a particular, programmable sequence. Such codes are simple to fabricate, easy to view, and efficient in coding information; they can be ideally used in combination with information on a package to protect diagnostic devices from counterfeiting.
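The coding capacity quoted above follows from simple combinatorics: with c distinguishable colors and n independent code elements, the number of distinct codes is c^n. The element count of 50 used below to reach the ~10⁴⁵ figure with 8 colors is an inference for illustration, not a number stated in the abstract.

```python
# Capacity of an optical code: c colors over n independent elements.
def code_capacity(colors: int, elements: int) -> int:
    return colors ** elements

# 8 colors over 50 elements already exceeds 10^45 combinations
# (element count assumed for illustration):
cap = code_capacity(8, 50)
```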
Analytical model for release calculations in solid thin-foils ISOL targets
NASA Astrophysics Data System (ADS)
Egoriti, L.; Boeckx, S.; Ghys, L.; Houngbo, D.; Popescu, L.
2016-10-01
A detailed analytical model has been developed to simulate isotope-release curves from thin-foil ISOL targets. It involves the separate modeling of diffusion and effusion inside the target. The former has been modeled using both Fick's first and second laws. The latter, effusion from the surface of the target material to the end of the ionizer, was simulated with the Monte Carlo code MolFlow+. The calculated delay-time distribution for this process was then fitted using a double-exponential function. The release curve obtained from the convolution of diffusion and effusion shows good agreement with experimental data from two different target geometries used at ISOLDE. Moreover, the experimental yields are well reproduced when combining the release fraction with calculated in-target production.
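The release-curve construction described above, a diffusion release-rate profile convolved with a double-exponential effusion delay-time distribution, can be sketched numerically. All time constants and the first-term Fick approximation below are illustrative assumptions, not the paper's fitted values.

```python
# Sketch: release curve = (diffusion rate) convolved with (effusion delay).
# Time constants are illustrative, not fitted values from the paper.
import math

N, DT = 1000, 0.01                      # time grid: t = 0 .. 10 s

def diffusion_rate(t, tau_d=1.0):
    """Single-exponential stand-in for the Fick diffusion release rate."""
    return math.exp(-t / tau_d) / tau_d

def effusion_delay(t, a=0.7, tau1=0.2, tau2=2.0):
    """Double-exponential effusion delay-time distribution (parameters assumed)."""
    return a / tau1 * math.exp(-t / tau1) + (1 - a) / tau2 * math.exp(-t / tau2)

d = [diffusion_rate(i * DT) for i in range(N)]
e = [effusion_delay(i * DT) for i in range(N)]

# Discrete convolution gives the overall release rate; its running integral
# is the cumulative released fraction.
release_rate = [DT * sum(d[j] * e[i - j] for j in range(i + 1)) for i in range(N)]
released = [0.0] * N
for i in range(1, N):
    released[i] = released[i - 1] + release_rate[i] * DT
```

Because both factors are normalized densities, the cumulative released fraction approaches one as the window grows, matching the qualitative shape of an experimental release curve.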
Mars Global Reference Atmospheric Model (Mars-GRAM): Release No. 2 - Overview and applications
NASA Technical Reports Server (NTRS)
James, B.; Johnson, D.; Tyree, L.
1993-01-01
The Mars Global Reference Atmospheric Model (Mars-GRAM), a science and engineering model for empirically parameterizing the temperature, pressure, density, and wind structure of the Martian atmosphere, is described with particular attention to the model's newest version, Mars-GRAM Release No. 2, and to the improvements incorporated into the Release No. 2 model as compared with the Release No. 1 version. These improvements include (1) the addition of a new capability to simulate local-scale Martian dust storms and the growth and decay of these storms; (2) the addition of the Zurek and Haberle (1988) wave perturbation model for simulating tidal perturbation effects; and (3) a new modular version of Mars-GRAM, for incorporation as a subroutine into other codes.
Rcount: simple and flexible RNA-Seq read counting.
Schmid, Marc W; Grossniklaus, Ueli
2015-02-01
Analysis of differential gene expression by RNA sequencing (RNA-Seq) is frequently done using feature counts, i.e. the number of reads mapping to a gene. However, commonly used count algorithms (e.g. HTSeq) do not address the problem of reads aligning with multiple locations in the genome (multireads) or reads aligning with positions where two or more genes overlap (ambiguous reads). Rcount specifically addresses these issues. Furthermore, Rcount allows the user to assign priorities to certain feature types (e.g. higher priority for protein-coding genes compared to rRNA-coding genes) or to add flanking regions. Rcount provides a fast and easy-to-use graphical user interface requiring no command line or programming skills. It is implemented in C++ using the SeqAn (www.seqan.de) and the Qt libraries (qt-project.org). Source code and 64 bit binaries for (Ubuntu) Linux, Windows (7) and MacOSX are released under the GPLv3 license and are freely available on github.com/MWSchmid/Rcount. marcschmid@gmx.ch Test data, genome annotation files, useful Python and R scripts and a step-by-step user guide (including run-time and memory usage tests) are available on github.com/MWSchmid/Rcount.
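The priority-based assignment idea described above can be illustrated with a toy counter: when a read overlaps several features, the feature type with the highest priority wins, and reads that remain ambiguous after prioritising are left uncounted. The priority values and gene names are invented for the sketch and are not Rcount's defaults.

```python
# Toy priority-based read assignment (illustrative, not Rcount's algorithm).
from collections import Counter

PRIORITY = {"protein_coding": 2, "rRNA": 1}   # higher value = higher priority

def count_reads(read_hits):
    """read_hits: one list per read of (gene, feature_type) overlaps."""
    counts = Counter()
    for hits in read_hits:
        best = max(PRIORITY[ft] for _, ft in hits)
        winners = [g for g, ft in hits if PRIORITY[ft] == best]
        if len(winners) == 1:                 # unambiguous after prioritising
            counts[winners[0]] += 1
    return counts

counts = count_reads([
    [("geneA", "protein_coding"), ("rrnA", "rRNA")],             # geneA wins
    [("geneA", "protein_coding"), ("geneB", "protein_coding")],  # still ambiguous
    [("geneB", "protein_coding")],                               # unique hit
])
```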
Tobacco use in popular movies during the past decade
Mekemson, C; Glik, D; Titus, K; Myerson, A; Shaivitz, A; Ang, A; Mitchell, S
2004-01-01
Objective: The top 50 commercially successful films released per year from 1991 to 2000 were content coded to assess trends in tobacco use over time and attributes of films predictive of higher smoking rates. Design: This observational study used media content analysis methods to generate data about tobacco use depictions in films studied (n = 497). Films are the basic unit of analysis. Once films were coded and preliminary analysis completed, outcome data were transformed to approximate multivariate normality before being analysed with general linear models and longitudinal mixed method regression methods. Main outcome measures: Tobacco use per minute of film was the main outcome measure used. Predictor variables include attributes of films and actors. Tobacco use was defined as any cigarette, cigar, and chewing tobacco use as well as the display of smoke and cigarette paraphernalia such as ashtrays, brand names, or logos within frames of films reviewed. Results: Smoking rates in the top films fluctuated yearly over the decade with an overall modest downward trend (p < 0.005), with the exception of R rated films where rates went up. Conclusions: The decrease in smoking rates found in films in the past decade is modest given extensive efforts to educate the entertainment industry on this issue over the past decade. Monitoring, education, advocacy, and policy change to bring tobacco depiction rates down further should continue. PMID:15564625
Condor-COPASI: high-throughput computing for biochemical networks
2012-01-01
Background Mathematical modelling has become a standard technique to improve our understanding of complex biological systems. As models become larger and more complex, simulations and analyses require increasing amounts of computational power. Clusters of computers in a high-throughput computing environment can help to provide the resources required for computationally expensive model analysis. However, exploiting such a system can be difficult for users without the necessary expertise. Results We present Condor-COPASI, a server-based software tool that integrates COPASI, a biological pathway simulation tool, with Condor, a high-throughput computing environment. Condor-COPASI provides a web-based interface, which makes it extremely easy for a user to run a number of model simulation and analysis tasks in parallel. Tasks are transparently split into smaller parts, and submitted for execution on a Condor pool. Result output is presented to the user in a number of formats, including tables and interactive graphical displays. Conclusions Condor-COPASI can effectively use a Condor high-throughput computing environment to provide significant gains in performance for a number of model simulation and analysis tasks. Condor-COPASI is free, open source software, released under the Artistic License 2.0, and is suitable for use by any institution with access to a Condor pool. Source code is freely available for download at http://code.google.com/p/condor-copasi/, along with full instructions on deployment and usage. PMID:22834945
Hou, Xiaodong; Du, Yongmei; Liu, Xinmin; Zhang, Hongbo; Liu, Yanhua; Yan, Ning; Zhang, Zhongfeng
2017-01-01
Sprouting is a key factor affecting the quality of potato tubers. The present study aimed to compare the differential expression of long non-coding RNAs (lncRNAs) in the apical meristem during the dormancy release and sprouting stages by using lncRNA sequencing. Microscopic observations and Gene Ontology (GO) and Kyoto Encyclopedia of Genes and Genomes (KEGG) enrichment analyses revealed the changes in the morphology and expression of lncRNAs in potato tubers during sprouting. Meristematic cells of potato tuber apical buds divided continuously and exhibited vegetative cone bulging and vascularisation. In all, 3175 lncRNAs were identified from the apical buds of potato tubers, among which 383 lncRNAs were up-regulated and 340 were down-regulated during sprouting. The GO enrichment analysis revealed that sprouting mainly influenced the expression of lncRNAs related to the cellular components of potato apical buds (e.g., cytoplasm and organelles) and cellular metabolic processes. The KEGG enrichment analysis also showed significant enrichment of specific metabolic pathways. In addition, 386 differentially expressed lncRNAs during sprouting were identified as putative targets of 235 potato miRNAs. Quantitative real-time polymerase chain reaction results agreed with the sequencing data. Our study provides the first systematic study of numerous lncRNAs involved in the potato tuber sprouting process and lays the foundation for further studies to elucidate their precise functions. PMID:29286332
Accuracy comparison among different machine learning techniques for detecting malicious codes
NASA Astrophysics Data System (ADS)
Narang, Komal
2016-03-01
In this paper, a machine learning based model for malware detection is proposed. It can detect newly released malware, i.e. zero-day attacks, by analyzing operation codes on the Android operating system. The accuracy of Naïve Bayes, Support Vector Machine (SVM), and Neural Network classifiers for detecting malicious code has been compared for the proposed model. In the experiment, 400 benign files, 100 system files, and 500 malicious files have been used to construct the model. The model yields the best accuracy, 88.9%, when a neural network is used as the classifier, and achieved a sensitivity of 95% and a specificity of 82.8%.
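The three metrics quoted above are all derived from a confusion matrix. The definitions can be made concrete with a small sketch; the label vectors below are fabricated to illustrate the arithmetic and are not the paper's data.

```python
# Accuracy, sensitivity, and specificity from a confusion matrix.
# The toy predictions below are fabricated, not the paper's results.

def confusion(y_true, y_pred, positive=1):
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == positive and p == positive)
    tn = sum(1 for t, p in zip(y_true, y_pred) if t != positive and p != positive)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t != positive and p == positive)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == positive and p != positive)
    return tp, tn, fp, fn

def metrics(y_true, y_pred):
    tp, tn, fp, fn = confusion(y_true, y_pred)
    return {
        "accuracy": (tp + tn) / len(y_true),
        "sensitivity": tp / (tp + fn),   # true-positive (malware) rate
        "specificity": tn / (tn + fp),   # true-negative (benign) rate
    }

# 1 = malicious, 0 = benign; toy predictions:
y_true = [1, 1, 1, 1, 0, 0, 0, 0, 0, 0]
y_pred = [1, 1, 1, 0, 0, 0, 0, 0, 1, 0]
m = metrics(y_true, y_pred)
```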
Extension of Generalized Fluid System Simulation Program's Fluid Property Database
NASA Technical Reports Server (NTRS)
Patel, Kishan
2011-01-01
This internship focused on the development of additional capabilities for the General Fluid Systems Simulation Program (GFSSP). GFSSP is a thermo-fluid code used to evaluate system performance by a finite volume-based network analysis method. The program was developed primarily to analyze the complex internal flow of propulsion systems and is capable of solving many problems related to thermodynamics and fluid mechanics. GFSSP is integrated with thermodynamic programs that provide fluid properties for sub-cooled, superheated, and saturation states. For fluids that are not included in the thermodynamic property program, look-up property tables can be provided. The look-up property tables of the current release version can only handle sub-cooled and superheated states. The primary purpose of the internship was to extend the look-up tables to handle saturated states. This involves a) generation of a property table using REFPROP, a widely used thermodynamic property program, and b) modifications of the Fortran source code to read in an additional property table containing saturation data for both saturated liquid and saturated vapor states. Also, a method was implemented to calculate the thermodynamic properties of user-fluids within the saturation region, given values of pressure and enthalpy. These additions required new code to be written, and older code had to be adjusted to accommodate the new capabilities. Ultimately, the changes will lead to the incorporation of this new capability in future versions of GFSSP. This paper describes the development and validation of the new capability.
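The saturated-state evaluation described above, given pressure and enthalpy, can be sketched as follows: interpolate the saturated-liquid and saturated-vapor enthalpies from a lookup table, form the quality x, and mix other properties linearly in x. The two-row water table and linear interpolation below are illustrative assumptions, not REFPROP output or GFSSP's actual table format.

```python
# Sketch of quality calculation from a saturation lookup table.
# Table rows are illustrative round numbers for water, not REFPROP data.
from bisect import bisect_left

# (pressure kPa, h_f kJ/kg, h_g kJ/kg): saturated liquid/vapor enthalpies
SAT_TABLE = [
    (100.0, 417.4, 2675.0),
    (200.0, 504.7, 2706.2),
]

def interp(p, lo, hi):
    """Linear interpolation of all table columns between two rows."""
    frac = (p - lo[0]) / (hi[0] - lo[0])
    return tuple(a + frac * (b - a) for a, b in zip(lo[1:], hi[1:]))

def quality(p_kpa, h_kj_kg):
    i = bisect_left([row[0] for row in SAT_TABLE], p_kpa)
    i = min(max(i, 1), len(SAT_TABLE) - 1)
    h_f, h_g = interp(p_kpa, SAT_TABLE[i - 1], SAT_TABLE[i])
    return (h_kj_kg - h_f) / (h_g - h_f)   # 0 = sat. liquid, 1 = sat. vapor

x = quality(150.0, 1500.0)
```

Any two-phase property can then be formed as `prop = prop_f + x * (prop_g - prop_f)` from the same table rows.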
mdFoam+: Advanced molecular dynamics in OpenFOAM
NASA Astrophysics Data System (ADS)
Longshaw, S. M.; Borg, M. K.; Ramisetti, S. B.; Zhang, J.; Lockerby, D. A.; Emerson, D. R.; Reese, J. M.
2018-03-01
This paper introduces mdFoam+, which is an MPI parallelised molecular dynamics (MD) solver implemented entirely within the OpenFOAM software framework. It is open-source and released under the same GNU General Public License (GPL) as OpenFOAM. The source code is released as a publicly open software repository that includes detailed documentation and tutorial cases. Since mdFoam+ is designed entirely within the OpenFOAM C++ object-oriented framework, it inherits a number of key features. The code is designed for extensibility and flexibility, so it is aimed first and foremost as an MD research tool, in which new models and test cases can be developed and tested rapidly. Implementing mdFoam+ in OpenFOAM also enables easier development of hybrid methods that couple MD with continuum-based solvers. Setting up MD cases follows the standard OpenFOAM format, as mdFoam+ also relies upon the OpenFOAM dictionary-based directory structure. This ensures that useful pre- and post-processing capabilities provided by OpenFOAM remain available even though the fully Lagrangian nature of an MD simulation is not typical of most OpenFOAM applications. Results show that mdFoam+ compares well in benchmark problems with other well-known MD codes such as LAMMPS, while also providing functionality that does not exist in other open-source MD codes.
MELCOR computer code manuals: Primer and user's guides, Version 1.8.3 September 1994. Volume 1
DOE Office of Scientific and Technical Information (OSTI.GOV)
Summers, R.M.; Cole, R.K. Jr.; Smith, R.C.
1995-03-01
MELCOR is a fully integrated, engineering-level computer code that models the progression of severe accidents in light water reactor nuclear power plants. MELCOR is being developed at Sandia National Laboratories for the US Nuclear Regulatory Commission as a second-generation plant risk assessment tool and the successor to the Source Term Code Package. A broad spectrum of severe accident phenomena in both boiling and pressurized water reactors is treated in MELCOR in a unified framework. These include: thermal-hydraulic response in the reactor coolant system, reactor cavity, containment, and confinement buildings; core heatup, degradation, and relocation; core-concrete attack; hydrogen production, transport, and combustion; fission product release and transport; and the impact of engineered safety features on thermal-hydraulic and radionuclide behavior. Current uses of MELCOR include estimation of severe accident source terms and their sensitivities and uncertainties in a variety of applications. This publication of the MELCOR computer code manuals corresponds to MELCOR 1.8.3, released to users in August, 1994. Volume 1 contains a primer that describes MELCOR's phenomenological scope, organization (by package), and documentation. The remainder of Volume 1 contains the MELCOR Users' Guides, which provide the input instructions and guidelines for each package. Volume 2 contains the MELCOR Reference Manuals, which describe the phenomenological models that have been implemented in each package.
Validating the BISON fuel performance code to integral LWR experiments
Williamson, R. L.; Gamble, K. A.; Perez, D. M.; ...
2016-03-24
BISON is a modern finite element-based nuclear fuel performance code that has been under development at the Idaho National Laboratory (INL) since 2009. The code is applicable to both steady and transient fuel behavior and has been used to analyze a variety of fuel forms in 1D spherical, 2D axisymmetric, or 3D geometries. Code validation is underway and is the subject of this study. A brief overview of BISON's computational framework, governing equations, and general material and behavioral models is provided. BISON code and solution verification procedures are described, followed by a summary of the experimental data used to date for validation of Light Water Reactor (LWR) fuel. Validation comparisons focus on fuel centerline temperature, fission gas release, and rod diameter both before and following fuel-clad mechanical contact. Comparisons for 35 LWR rods are consolidated to provide an overall view of how the code is predicting physical behavior, with a few select validation cases discussed in greater detail. Our results demonstrate that 1) fuel centerline temperature comparisons through all phases of fuel life are very reasonable, with deviations between predictions and experimental data within ±10% for early life through high burnup fuel and only slightly out of these bounds for power ramp experiments, 2) accuracy in predicting fission gas release appears to be consistent with state-of-the-art modeling and with the involved uncertainties, and 3) comparison of rod diameter results indicates a tendency to overpredict clad diameter reduction early in life, when clad creepdown dominates, and more significantly overpredict the diameter increase late in life, when fuel expansion controls the mechanical response. The initial rod diameter comparisons were unsatisfactory and led to consideration of additional separate effects experiments to better understand and predict clad and fuel mechanical behavior.
Results from this study are being used to define priorities for ongoing code development and validation activities.
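The ±10% acceptance band quoted above amounts to a relative-deviation check between prediction and measurement. A minimal sketch follows; the temperature pairs are fabricated for illustration, not BISON validation data.

```python
# Relative-deviation acceptance check (temperature pairs are fabricated).

def within_band(predicted, measured, band=0.10):
    """True if |prediction - measurement| is within the fractional band."""
    return abs(predicted - measured) / measured <= band

# (predicted K, measured K) fuel centerline temperatures, invented:
cases = [(1150.0, 1100.0), (905.0, 950.0), (1480.0, 1400.0)]
passes = [within_band(p, m) for p, m in cases]
fraction_in_band = sum(passes) / len(passes)
```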
DOE Office of Scientific and Technical Information (OSTI.GOV)
Miller, C.W.; Sjoreen, A.L.; Begovich, C.L.
This code estimates concentrations in air and ground deposition rates for Atmospheric Nuclides Emitted from Multiple Operating Sources. ANEMOS is one component of an integrated Computerized Radiological Risk Investigation System (CRRIS) developed for the US Environmental Protection Agency (EPA) for use in performing radiological assessments and in developing radiation standards. The concentrations and deposition rates calculated by ANEMOS are used in subsequent portions of the CRRIS for estimating doses and risks to man. The calculations made in ANEMOS are based on the use of a straight-line Gaussian plume atmospheric dispersion model with both dry and wet deposition parameter options. The code will accommodate a ground-level or elevated point and area source or windblown source. Adjustments may be made during the calculations for surface roughness, building wake effects, terrain height, wind speed at the height of release, the variation in plume rise as a function of downwind distance, and the in-growth and decay of daughter products in the plume as it travels downwind. ANEMOS can also accommodate multiple particle sizes and clearance classes, and it may be used to calculate the dose from a finite plume of gamma-ray-emitting radionuclides passing overhead. The output of this code is presented for 16 sectors of a circular grid. ANEMOS can calculate both the sector-average concentrations and deposition rates at a given set of downwind distances in each sector and the average of these quantities over an area within each sector bounded by two successive downwind distances. ANEMOS is designed to be used primarily for continuous, long-term radionuclide releases. This report describes the models used in the code, their computer implementation, the uncertainty associated with their use, and the use of ANEMOS in conjunction with other codes in the CRRIS. A listing of the code is included in Appendix C.
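The straight-line Gaussian plume at the heart of such dispersion codes has the textbook form below, with a ground-reflection term from the image-source method. This is a generic sketch, not ANEMOS source code: the dispersion coefficients sigma_y and sigma_z are passed in directly here rather than derived from stability classes, and deposition and decay corrections are omitted.

```python
# Textbook straight-line Gaussian plume with ground reflection (generic sketch).
import math

def plume_concentration(q, u, y, z, h, sigma_y, sigma_z):
    """Concentration (Bq/m^3) at crosswind offset y (m) and height z (m),
    from a source of strength q (Bq/s) at effective release height h (m),
    with wind speed u (m/s) and dispersion coefficients sigma_y, sigma_z (m)."""
    lateral = math.exp(-y**2 / (2 * sigma_y**2))
    # Vertical term with ground reflection (image-source method):
    vertical = (math.exp(-(z - h)**2 / (2 * sigma_z**2))
                + math.exp(-(z + h)**2 / (2 * sigma_z**2)))
    return q / (2 * math.pi * u * sigma_y * sigma_z) * lateral * vertical

c = plume_concentration(q=1.0e6, u=3.0, y=0.0, z=0.0, h=50.0,
                        sigma_y=100.0, sigma_z=50.0)
```

Sector-averaged output, as described above, replaces the lateral Gaussian term with an average over each sector's angular width.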
MitoNuc: a database of nuclear genes coding for mitochondrial proteins. Update 2002.
Attimonelli, Marcella; Catalano, Domenico; Gissi, Carmela; Grillo, Giorgio; Licciulli, Flavio; Liuni, Sabino; Santamaria, Monica; Pesole, Graziano; Saccone, Cecilia
2002-01-01
Mitochondria, besides their central role in energy metabolism, have recently been found to be involved in a number of basic processes of cell life and to contribute to the pathogenesis of many degenerative diseases. All functions of mitochondria depend on the interaction of nuclear and organelle genomes. Mitochondrial genomes have been extensively sequenced and analysed and data have been collected in several specialised databases. In order to collect information on nuclear coded mitochondrial proteins we developed MitoNuc, a database containing detailed information on sequenced nuclear genes coding for mitochondrial proteins in Metazoa. The MitoNuc database can be retrieved through SRS and is available via the web site http://bighost.area.ba.cnr.it/mitochondriome where other mitochondrial databases developed by our group, the complete list of the sequenced mitochondrial genomes, links to other mitochondrial sites and related information, are available. The MitoAln database, related to MitoNuc in the previous release, reporting the multiple alignments of the relevant homologous protein coding regions, is no longer supported in the present release. In order to keep the links among MitoNuc entries derived from homologous proteins, a new field in the database has been defined: the cluster identifier, an alphanumeric code used to identify each cluster of homologous proteins. A comment field derived from the corresponding SWISS-PROT entry has been introduced; this reports clinical data related to dysfunction of the protein. The logic scheme of the MitoNuc database has been implemented in the ORACLE DBMS. This will allow end-users to retrieve data through a friendly interface that will soon be implemented.
Facile and High-Throughput Synthesis of Functional Microparticles with Quick Response Codes.
Ramirez, Lisa Marie S; He, Muhan; Mailloux, Shay; George, Justin; Wang, Jun
2016-06-01
Encoded microparticles are in high demand for multiplexed assays and labeling. However, current methods for the synthesis and coding of microparticles either lack robustness and reliability or possess limited coding capacity. Here, a massive coding of dissociated elements (MiCODE) technology, based on a chemically reactive off-stoichiometry thiol-allyl photocurable polymer and standard lithography, is introduced to produce a large number of quick response (QR) code microparticles. The coding process is performed by photobleaching the QR code patterns on microparticles when fluorophores are incorporated into the prepolymer formulation. The fabricated encoded microparticles can be released from a substrate without changing their features. Excess thiol functionality on the microparticle surface allows for grafting of amine groups and further DNA probes. A multiplexed assay is demonstrated using the DNA-grafted QR code microparticles. The MiCODE technology is further characterized by showing the incorporation of BODIPY-maleimide (BDP-M) and Nile Red fluorophores for coding and the use of microcontact printing for immobilizing DNA probes on microparticle surfaces. This versatile technology leverages mature lithography facilities for fabrication and thus is amenable to scale-up in the future, with potential applications in bioassays and in labeling consumer products.
Status and Plans for the Vienna VLBI and Satellite Software (VieVS 3.0)
NASA Astrophysics Data System (ADS)
Gruber, Jakob; Böhm, Johannes; Böhm, Sigrid; Girdiuk, Anastasiia; Hellerschmied, Andreas; Hofmeister, Armin; Krásná, Hana; Kwak, Younghee; Landskron, Daniel; Madzak, Matthias; Mayer, David; McCallum, Jamie; Plank, Lucia; Schartner, Matthias; Shabala, Stas; Teke, Kamil; Sun, Jing
2017-04-01
The Vienna VLBI and Satellite Software (VieVS) is a geodetic analysis software developed and maintained at Technische Universität Wien (TU Wien) with contributions from groups all over the world. It is used for both academic purposes in university courses as well as for providing Very Long Baseline Interferometry (VLBI) analysis results to the geodetic community. Written in a modular structure in Matlab, VieVS offers easy access to the source code and the possibility to adapt the programs for particular purposes. The new version 3.0, released in early 2017, includes several new features, e.g., improved scheduling capabilities for observing quasars and satellites. This poster gives an overview of all VLBI-related activities in Vienna and provides an outlook to future plans concerning the Vienna VLBI and Satellite Software (VieVS).
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hinkebein, Thomas E.
The intrusion of gas into oils stored within the SPR has been examined. When oil is stored in domal salts, gases intrude into the stored oil from the surrounding salt. Aspects of the mechanism of gas intrusion have been examined. In all cases, this gas intrusion results in increases in the oil vapor pressure. Data that have been gathered from 1993 to August 2002 are presented to show the resultant increases in bubble-point pressure on a cavern-by-cavern as well as on a stream basis. The measurement techniques are presented with particular emphasis on the TVP 95. Data analysis methods are presented to show the procedures required to obtain recombined cavern oil compositions. Gas-oil ratios are also computed from the data and are presented on a cavern-by-cavern and stream basis. The observed increases in bubble-point pressure and gas-oil ratio are further statistically analyzed to allow data interpretation. Emissions plume modeling is used to determine adherence to state air regulations. Gas intrusion is observed to be variable among the sites and within each dome. Gas intrusions at Bryan Mound and Big Hill have resulted in the largest increases in bubble-point pressure for the Strategic Petroleum Reserve (SPR). The streams at Bayou Choctaw and West Hackberry show minimal bubble-point pressure increases. Emissions plume modeling of oil storage tanks, using the state-mandated ISCST code, showed that virtually no gas may be released when H2S standards are considered. DOE plans to scavenge H2S to comply with the very tight standards on this gas. With the assumption of scavenging, benzene releases become the next most controlling factor. Model results show that a GOR of 0.6 SCF/BBL may yield emissions that are within standards. Employing the benzene gas release standard will significantly improve oil deliverability. New plume modeling using the computational fluid dynamics code, FLUENT, is addressing limitations of the state-mandated ISCST model.
Sensitivity Analysis of OECD Benchmark Tests in BISON
DOE Office of Scientific and Technical Information (OSTI.GOV)
Swiler, Laura Painton; Gamble, Kyle; Schmidt, Rodney C.
2015-09-01
This report summarizes a NEAMS (Nuclear Energy Advanced Modeling and Simulation) project focused on sensitivity analysis of a fuels performance benchmark problem. The benchmark problem was defined by the Uncertainty Analysis in Modeling working group of the Nuclear Science Committee, part of the Nuclear Energy Agency of the Organization for Economic Cooperation and Development (OECD). The benchmark problem involved steady-state behavior of a fuel pin in a Pressurized Water Reactor (PWR). The problem was created in the BISON Fuels Performance code. Dakota was used to generate and analyze 300 samples of 17 input parameters defining core boundary conditions, manufacturing tolerances, and fuel properties. There were 24 responses of interest, including fuel centerline temperatures at a variety of locations and burnup levels, fission gas released, axial elongation of the fuel pin, etc. Pearson and Spearman correlation coefficients and Sobol' variance-based indices were used to perform the sensitivity analysis. This report summarizes the process and presents results from this study.
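The Pearson and Spearman coefficients used in the study above can be computed with the standard library alone. The sample below is fabricated, a single input varied against a nonlinear but monotonic response, to show why the two measures differ: Spearman works on ranks and scores a perfect 1 for any monotonic relation, while Pearson measures only linear association. No tie handling is included in this sketch.

```python
# Pearson and (tie-free) Spearman correlation on a fabricated sample.
import statistics

def pearson(x, y):
    mx, my = statistics.fmean(x), statistics.fmean(y)
    num = sum((a - mx) * (b - my) for a, b in zip(x, y))
    den = (sum((a - mx) ** 2 for a in x) * sum((b - my) ** 2 for b in y)) ** 0.5
    return num / den

def ranks(v):
    order = sorted(range(len(v)), key=lambda i: v[i])
    r = [0.0] * len(v)
    for rank, i in enumerate(order):
        r[i] = float(rank)
    return r

def spearman(x, y):
    return pearson(ranks(x), ranks(y))   # rank correlation (no tie handling)

power = [200.0, 220.0, 250.0, 265.0, 300.0]   # input sample (fabricated)
temp = [p ** 1.5 for p in power]              # nonlinear but monotonic response
```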
MATH77 - A LIBRARY OF MATHEMATICAL SUBPROGRAMS FOR FORTRAN 77, RELEASE 4.0
NASA Technical Reports Server (NTRS)
Lawson, C. L.
1994-01-01
MATH77 is a high quality library of ANSI FORTRAN 77 subprograms implementing contemporary algorithms for the basic computational processes of science and engineering. The portability of MATH77 meets the needs of present-day scientists and engineers who typically use a variety of computing environments. Release 4.0 of MATH77 contains 454 user-callable and 136 lower-level subprograms. Usage of the user-callable subprograms is described in 69 sections of the 416-page users' manual. The topics covered by MATH77 are indicated by the following list of chapter titles in the users' manual: Mathematical Functions, Pseudo-random Number Generation, Linear Systems of Equations and Linear Least Squares, Matrix Eigenvalues and Eigenvectors, Matrix Vector Utilities, Nonlinear Equation Solving, Curve Fitting, Table Look-Up and Interpolation, Definite Integrals (Quadrature), Ordinary Differential Equations, Minimization, Polynomial Rootfinding, Finite Fourier Transforms, Special Arithmetic, Sorting, Library Utilities, Character-based Graphics, and Statistics. Besides subprograms that are adaptations of public domain software, MATH77 contains a number of unique packages developed by the authors of MATH77. Instances of the latter type include (1) adaptive quadrature, allowing for exceptional generality in multidimensional cases, (2) the ordinary differential equations solver used in spacecraft trajectory computation for JPL missions, (3) univariate and multivariate table look-up and interpolation, allowing for "ragged" tables, and providing error estimates, and (4) univariate and multivariate derivative-propagation arithmetic. MATH77 release 4.0 is a subroutine library which has been carefully designed to be usable on any computer system that supports the full ANSI standard FORTRAN 77 language.
It has been successfully implemented on a CRAY Y/MP computer running UNICOS, a UNISYS 1100 computer running EXEC 8, a DEC VAX series computer running VMS, a Sun4 series computer running SunOS, a Hewlett-Packard 720 computer running HP-UX, a Macintosh computer running MacOS, and an IBM PC compatible computer running MS-DOS. Accompanying the library is a set of 196 "demo" drivers that exercise all of the user-callable subprograms. The FORTRAN source code for MATH77 comprises 109K lines of code in 375 files with a total size of 4.5Mb. The demo drivers comprise 11K lines of code totalling 418K bytes. Forty-four percent of the lines of the library code and 29% of those in the demo code are comment lines. The standard distribution medium for MATH77 is a .25 inch streaming magnetic tape cartridge in UNIX tar format. It is also available on a 9-track 1600 BPI magnetic tape in VAX BACKUP format and a TK50 tape cartridge in VAX BACKUP format. An electronic copy of the documentation is included on the distribution media. Previous releases of MATH77 have been used over a number of years in a variety of JPL applications. MATH77 Release 4.0 was completed in 1992. MATH77 is a copyrighted work with all copyright vested in NASA.
Olfactory modulation by dopamine in the context of aversive learning
Riffell, Jeffrey A.; Martin, Joshua P.; Gage, Stephanie L.; Nighorn, Alan J.
2012-01-01
The need to detect and process sensory cues varies in different behavioral contexts. Plasticity in sensory coding can be achieved by the context-specific release of neuromodulators in restricted brain areas. The context of aversion triggers the release of dopamine in the insect brain, yet the effects of dopamine on sensory coding are unknown. In this study, we characterize the morphology of dopaminergic neurons that innervate each of the antennal lobes (ALs; the first synaptic neuropils of the olfactory system) of the moth Manduca sexta and demonstrate with electrophysiology that dopamine enhances odor-evoked responses of the majority of AL neurons while reducing the responses of a small minority. Because dopamine release in higher brain areas mediates aversive learning, we developed a naturalistic, ecologically inspired aversive learning paradigm in which an innately appetitive host plant floral odor is paired with a mimic of the aversive nectar of herbivorized host plants. This pairing resulted in a decrease in feeding behavior that was blocked when dopamine receptor antagonists were injected directly into the ALs. These results suggest that a transient dopaminergic enhancement of sensory output from the AL contributes to the formation of aversive memories. We propose a model of olfactory modulation in which specific contexts trigger the release of different neuromodulators in the AL to increase olfactory output to downstream areas of processing. PMID:22552185
Sevgi, Ferhan; Kaynarsoy, Buket; Ozyazici, Mine; Pekcetin, Cetin; Ozyurt, Dogan
2008-01-01
The new mefenamic acid-alginate bead formulation prepared by the ionotropic gelation method using a 3 x 2(2) factorial design has shown adequate controlled release properties in vitro. In the present study, the irritation effects of mefenamic acid (MA), a prominent non-steroidal anti-inflammatory (NSAI) drug, were evaluated on rat gastric and duodenal mucosa when suspended in 0.5% (w/v) sodium carboxymethylcellulose (NaCMC) solution and loaded in alginate beads. Wistar albino rats weighing 200 +/- 50 g were used during in vivo animal studies. In this work, biodegradable controlled release MA beads and free MA were evaluated according to the degree of gastric or duodenal damage following oral administration in rats. The gastric and duodenal mucosa was examined for any haemorrhagic changes. Formulation code A10, showing both Case II transport and zero order drug release and a t(50)% value of 5.22 h, was chosen for in vivo animal studies. For in vivo trials, free MA (100 mg kg(-1)), blank and MA (100 mg kg(-1)) loaded alginate beads (formulation code A10) were suspended in 0.5% (w/v) NaCMC solution and each group was given to six rats orally by gavage. NaCMC solution was used as a control in experimental studies. In vivo data showed that the administration of MA in alginate beads prevented the gastric lesions.
Photometric redshifts for Hyper Suprime-Cam Subaru Strategic Program Data Release 1
NASA Astrophysics Data System (ADS)
Tanaka, Masayuki; Coupon, Jean; Hsieh, Bau-Ching; Mineo, Sogo; Nishizawa, Atsushi J.; Speagle, Joshua; Furusawa, Hisanori; Miyazaki, Satoshi; Murayama, Hitoshi
2018-01-01
Photometric redshifts are a key component of many science objectives in the Hyper Suprime-Cam Subaru Strategic Program (HSC-SSP). In this paper, we describe and compare the codes used to compute photometric redshifts for HSC-SSP, how we calibrate them, and the typical accuracy we achieve with the HSC five-band photometry (grizy). We introduce a new point estimator based on an improved loss function and demonstrate that it works better than other commonly used estimators. We find that our photo-z's are most accurate at 0.2 ≲ zphot ≲ 1.5, where we can straddle the 4000 Å break. We achieve σ[Δzphot/(1 + zphot)] ˜ 0.05 and an outlier rate of about 15% for galaxies down to i = 25 within this redshift range. If we limit ourselves to a brighter sample of i < 24, we achieve σ ˜ 0.04 and ˜8% outliers. Our photo-z's should thus enable many science cases for HSC-SSP. We also characterize the accuracy of our redshift probability distribution function (PDF) and discover that some codes over-/underestimate the redshift uncertainties, which has implications for N(z) reconstruction. Our photo-z products for the entire area in Public Data Release 1 are publicly available, and both our catalog products (such as point estimates) and full PDFs can be retrieved from the data release site, "https://hsc-release.mtk.nao.ac.jp/".
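The headline accuracy metrics quoted above can be computed in a few lines. This is a generic sketch on synthetic redshifts, not the HSC-SSP pipeline; the 0.04 scatter and the |Δz/(1+z)| > 0.15 outlier cut are illustrative choices:

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic "truth" and photo-z estimates with 0.04 * (1 + z) scatter
z_spec = rng.uniform(0.2, 1.5, 5000)
z_phot = z_spec + 0.04 * (1 + z_spec) * rng.normal(size=5000)

# Conventional photo-z quality metrics
dz = (z_phot - z_spec) / (1 + z_spec)
sigma = np.std(dz)                        # sigma[dz/(1+z)]
outlier_rate = np.mean(np.abs(dz) > 0.15) # fraction beyond a fixed cut

print(f"sigma = {sigma:.3f}, outliers = {100 * outlier_rate:.2f}%")
```

With purely Gaussian scatter the outlier rate is tiny; the ~8-15% rates reported for real surveys come from catastrophic failures (e.g., confusion between spectral breaks) that this toy error model does not include.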
Zhu, Debin; Tang, Yabing; Xing, Da; Chen, Wei R
2008-05-15
A bio bar code assay based on oligonucleotide-modified gold nanoparticles (Au-NPs) provides a PCR-free method for quantitative detection of nucleic acid targets. However, the current bio bar code assay requires lengthy experimental procedures including the preparation and release of bar code DNA probes from the target-nanoparticle complex and immobilization and hybridization of the probes for quantification. Herein, we report a novel PCR-free electrochemiluminescence (ECL)-based bio bar code assay for the quantitative detection of genetically modified organism (GMO) from raw materials. It consists of tris-(2,2'-bipyridyl) ruthenium (TBR)-labeled bar code DNA, nucleic acid hybridization using Au-NPs and biotin-labeled probes, and selective capture of the hybridization complex by streptavidin-coated paramagnetic beads. The detection of target DNA is realized by direct measurement of ECL emission of TBR. It can quantitatively detect target nucleic acids with high speed and sensitivity. This method can be used to quantitatively detect GMO fragments from real GMO products.
Reaction path of energetic materials using THOR code
NASA Astrophysics Data System (ADS)
Durães, L.; Campos, J.; Portugal, A.
1998-07-01
The method of predicting reaction paths using the THOR code allows the calculation of the composition and thermodynamic properties of the reaction products of energetic materials for isobaric and isochoric adiabatic combustion and CJ detonation regimes. The THOR code assumes thermodynamic equilibrium of all possible products at the minimum Gibbs free energy, using the HL EoS. The code allows the possibility of estimating various sets of reaction products, obtained successively by the decomposition of the original reacting compound, as a function of the released energy. Two case studies of the thermal decomposition procedure were selected, calculated and discussed: pure Ammonium Nitrate and the AN-based explosive ANFO, and Nitromethane, because their equivalence ratios are respectively lower than, near, and greater than stoichiometric. Predictions of the reaction path are in good correlation with experimental values, proving the validity of the proposed method.
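The equilibrium-composition step can be illustrated with a minimal Gibbs-minimization sketch: minimize the total Gibbs free energy of an ideal gas mixture subject to element conservation. The water-gas-shift species set and the standard chemical potentials below are illustrative stand-ins, not THOR's HL EoS thermochemical data:

```python
import numpy as np
from scipy.optimize import minimize

R, T = 8.314, 1500.0  # gas constant J/(mol K), temperature K

# ILLUSTRATIVE standard chemical potentials (J/mol), chosen so the
# water-gas-shift reaction CO + H2O <-> CO2 + H2 has dG0 = 0 (K = 1).
species = ["CO", "H2O", "CO2", "H2"]
mu0 = np.array([-250e3, -350e3, -550e3, -50e3])

# Element balance matrix (rows: C, O, H; columns follow `species`)
A = np.array([[1, 0, 1, 0],   # C
              [1, 1, 2, 0],   # O
              [0, 2, 0, 2]])  # H
b = A @ np.array([1.0, 1.0, 0.0, 0.0])  # start from 1 mol CO + 1 mol H2O

def gibbs(n):
    """Dimensionless total Gibbs energy of an ideal-gas mixture."""
    n = np.clip(n, 1e-10, None)
    return np.sum(n * (mu0 + R * T * np.log(n / n.sum()))) / (R * T)

res = minimize(gibbs, x0=np.array([0.9, 0.9, 0.1, 0.1]), method="SLSQP",
               bounds=[(1e-10, None)] * 4,
               constraints={"type": "eq", "fun": lambda n: A @ n - b})
for s, n in zip(species, res.x):
    print(f"{s:4s} {n:.3f} mol")
```

Because the illustrative potentials give K = 1, the minimizer should land near 0.5 mol of each species; a production code like THOR additionally handles condensed phases and a realistic equation of state.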
Observations Regarding Use of Advanced CFD Analysis, Sensitivity Analysis, and Design Codes in MDO
NASA Technical Reports Server (NTRS)
Newman, Perry A.; Hou, Gene J. W.; Taylor, Arthur C., III
1996-01-01
Observations regarding the use of advanced computational fluid dynamics (CFD) analysis, sensitivity analysis (SA), and design codes in gradient-based multidisciplinary design optimization (MDO) reflect our perception of the interactions required of CFD and our experience in recent aerodynamic design optimization studies using CFD. Sample results from these latter studies are summarized for conventional optimization (analysis-SA codes) and simultaneous analysis and design optimization (design code) using both Euler and Navier-Stokes flow approximations. The amount of computational resources required for aerodynamic design using CFD via analysis-SA codes is greater than that required for design codes. Thus, an MDO formulation that utilizes the more efficient design codes where possible is desired. However, in the aerovehicle MDO problem, the various disciplines that are involved have different design points in the flight envelope; therefore, CFD analysis-SA codes are required at the aerodynamic 'off design' points. The suggested MDO formulation is a hybrid multilevel optimization procedure that consists of both multipoint CFD analysis-SA codes and multipoint CFD design codes that perform suboptimizations.
The Effect of Spray Initial Conditions on Heat Release and Emissions in LDI CFD Calculations
NASA Technical Reports Server (NTRS)
Iannetti, Anthony C.; Liu, Nan-Suey; Davoudzadeh, Farhad
2008-01-01
The mass and velocity distribution of liquid spray has a primary effect on the combustion heat release process. This heat release process in turn affects emissions such as nitrogen oxides (NOx) and carbon monoxide (CO). Computational Fluid Dynamics gives the engineer insight into these processes, but various setup options exist (number of droplet groups and initial droplet temperature) for spray initial conditions. This paper studies these spray initial condition options using the National Combustion Code (NCC) on a single-swirler lean direct injection (LDI) flame tube. Using laminar finite rate chemistry, comparisons are made against experimental data for velocity measurements, temperature, and emissions (NOx, CO).
TankSIM: A Cryogenic Tank Performance Prediction Program
NASA Technical Reports Server (NTRS)
Bolshinskiy, L. G.; Hedayat, A.; Hastings, L. J.; Moder, J. P.; Schnell, A. R.; Sutherlin, S. G.
2015-01-01
TankSIM was developed for predicting the behavior of cryogenic liquids inside propellant tanks under various environmental and operating conditions. It provides a multi-node analysis of pressurization, ullage venting, and thermodynamic vent system (TVS) pressure control using an axial jet or spray bar TVS. It allows the user to combine several different phases to predict liquid behavior for the entire flight mission timeline or part of it. TankSIM is a NASA in-house code based on FORTRAN 90/95 and the Intel Visual Fortran compiler, but it can be used on any other platform (Unix-Linux, Compaq Visual Fortran, etc.). The latest release, Version 7, issued in December 2014, includes a detailed User's Manual. The code uses several REFPROP subroutines for calculating fluid properties.
Modeling of outgassing and matrix decomposition in carbon-phenolic composites
NASA Technical Reports Server (NTRS)
Mcmanus, Hugh L.
1993-01-01
A new release rate equation to model the phase change of water to steam in composite materials was derived from the theory of molecular diffusion and equilibrium moisture concentration. The new model is dependent on internal pressure, the microstructure of the voids and channels in the composite materials, and the diffusion properties of the matrix material. Hence, it is more fundamental and accurate than the empirical Arrhenius rate equation currently in use. The model was mathematically formalized and integrated into the thermostructural analysis code CHAR. Parametric studies on variation of several parameters have been done. Comparisons to Arrhenius and straight-line models show that the new model produces physically realistic results under all conditions.
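For reference, the empirical Arrhenius rate law that the new diffusion-based release model replaces has the familiar exponential temperature dependence. The pre-exponential factor and activation energy below are illustrative values, not parameters from the report:

```python
import numpy as np

R = 8.314              # gas constant, J/(mol K)
A, E = 1.0e7, 8.0e4    # ILLUSTRATIVE pre-exponential (1/s) and
                       # activation energy (J/mol)

# Arrhenius rate k = A * exp(-E / (R T)) at a few temperatures
temps = np.array([300.0, 400.0, 500.0])
rates = A * np.exp(-E / (R * temps))
for T, k in zip(temps, rates):
    print(f"T = {T:.0f} K  rate = {k:.3e} 1/s")
```

The Arrhenius form depends only on temperature, which is exactly the limitation the abstract identifies: the new model adds dependence on internal pressure, void microstructure, and matrix diffusion properties.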
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kicker, Dwayne Curtis; Herrick, Courtney G; Zeitler, Todd
2015-11-01
The numerical code DRSPALL (from direct release spallings) is written to calculate the volume of Waste Isolation Pilot Plant solid waste subject to material failure and transport to the surface (i.e., spallings) as a result of a hypothetical future inadvertent drilling intrusion into the repository. An error in the implementation of the DRSPALL finite difference equations was discovered and documented in a software problem report in accordance with the quality assurance procedure for software requirements. This paper describes the corrections to DRSPALL and documents the impact of the new spallings data from the modified DRSPALL on previous performance assessment calculations. Updated performance assessments result in more simulations with spallings, which generally translates to an increase in spallings releases to the accessible environment. Total normalized radionuclide releases using the modified DRSPALL data were determined by forming the summation of releases across each potential release pathway, namely borehole cuttings and cavings releases, spallings releases, direct brine releases, and transport releases. Because spallings releases are not a major contributor to the total releases, the updated performance assessment calculations of overall mean complementary cumulative distribution functions for total releases are virtually unchanged. Therefore, the corrections to the spallings volume calculation did not impact Waste Isolation Pilot Plant performance assessment calculation results.
Evolution of the ATLAS Nightly Build System
NASA Astrophysics Data System (ADS)
Undrus, A.
2012-12-01
The ATLAS Nightly Build System is a major component in the ATLAS collaborative software organization, validation, and code approval scheme. Over more than 10 years of development it has evolved into a factory for automatic release production and grid distribution. The 50 multi-platform branches of ATLAS releases provide vast opportunities for testing new packages, verification of patches to existing software, and migration to new platforms and compilers for ATLAS code that currently contains 2200 packages with 4 million C++ and 1.4 million python scripting lines written by about 1000 developers. Recent development was focused on the integration of the ATLAS Nightly Build and Installation systems. The nightly releases are distributed and validated, and some are transformed into stable releases used for data processing worldwide. The ATLAS Nightly System is managed by the NICOS control tool on a computing farm with 50 powerful multiprocessor nodes. NICOS provides the fully automated framework for the release builds, testing, and creation of distribution kits. The ATN testing framework of the Nightly System runs unit and integration tests in parallel suites, fully utilizing the resources of multi-core machines, and provides the first results even before compilations complete. The NICOS error detection system is based on several techniques and classifies the compilation and test errors according to their severity. It is periodically tuned to place greater emphasis on certain software defects by highlighting the problems on NICOS web pages and sending automatic e-mail notifications to responsible developers. These and other recent developments will be presented and future plans will be described.
DOE Office of Scientific and Technical Information (OSTI.GOV)
MacKinnon, R.J.; Sullivan, T.M.; Kinsey, R.R.
1997-05-01
The BLT-EC computer code has been developed, implemented, and tested. BLT-EC is a two-dimensional finite element computer code capable of simulating the time-dependent release and reactive transport of aqueous phase species in a subsurface soil system. BLT-EC contains models to simulate the processes (container degradation, waste-form performance, transport, chemical reactions, and radioactive production and decay) most relevant to estimating the release and transport of contaminants from a subsurface disposal system. Water flow is provided through tabular input or auxiliary files. Container degradation considers localized failure due to pitting corrosion and general failure due to uniform surface degradation processes. Waste-form performance considers release to be limited by one of four mechanisms: rinse with partitioning, diffusion, uniform surface degradation, and solubility. Transport considers the processes of advection, dispersion, diffusion, chemical reaction, radioactive production and decay, and sources (waste form releases). Chemical reactions accounted for include complexation, sorption, dissolution-precipitation, oxidation-reduction, and ion exchange. Radioactive production and decay in the waste form is simulated. To improve the usefulness of BLT-EC, a pre-processor, ECIN, which assists in the creation of chemistry input files, and a post-processor, BLTPLOT, which provides a visual display of the data, have been developed. BLT-EC also includes an extensive database of thermodynamic data that is also accessible to ECIN. This document reviews the models implemented in BLT-EC and serves as a guide to creating input files and applying BLT-EC.
The Chandra Source Catalog 2.0
NASA Astrophysics Data System (ADS)
Evans, Ian N.; Allen, Christopher E.; Anderson, Craig S.; Budynkiewicz, Jamie A.; Burke, Douglas; Chen, Judy C.; Civano, Francesca Maria; D'Abrusco, Raffaele; Doe, Stephen M.; Evans, Janet D.; Fabbiano, Giuseppina; Gibbs, Danny G., II; Glotfelty, Kenny J.; Graessle, Dale E.; Grier, John D.; Hain, Roger; Hall, Diane M.; Harbo, Peter N.; Houck, John C.; Lauer, Jennifer L.; Laurino, Omar; Lee, Nicholas P.; Martínez-Galarza, Juan Rafael; McCollough, Michael L.; McDowell, Jonathan C.; McLaughlin, Warren; Miller, Joseph; Morgan, Douglas L.; Mossman, Amy E.; Nguyen, Dan T.; Nichols, Joy S.; Nowak, Michael A.; Paxson, Charles; Plummer, David A.; Primini, Francis Anthony; Rots, Arnold H.; Siemiginowska, Aneta; Sundheim, Beth A.; Tibbetts, Michael; Van Stone, David W.; Zografou, Panagoula
2018-01-01
The current version of the Chandra Source Catalog (CSC) continues to be well utilized by the astronomical community. Usage over the past year has continued to average more than 15,000 searches per month. Version 1.1 of the CSC, released in 2010, includes properties and data for 158,071 detections, corresponding to 106,586 distinct X-ray sources on the sky. The second major release of the catalog, CSC 2.0, will be made available to the user community in early 2018, and preliminary lists of detections and sources are available now. Release 2.0 will roughly triple the size of the current version of the catalog to an estimated 375,000 detections, corresponding to ~315,000 unique X-ray sources. Compared to release 1.1, the limiting sensitivity for compact sources in CSC 2.0 is significantly enhanced. This improvement is achieved by using a two-stage approach that involves stacking (co-adding) multiple observations of the same field prior to source detection, and then using an improved source detection approach that enables us to detect point sources down to ~5 net counts on-axis for exposures shorter than ~15 ks. In addition to enhanced source detection capabilities, improvements to the Bayesian aperture photometry code included in release 2.0 provide robust photometric probability density functions (PDFs) in crowded fields even for low count detections. All post-aperture photometry properties (e.g., hardness ratios, source variability) work directly from the PDFs in release 2.0. 
CSC 2.0 also adds a Bayesian Blocks analysis of the multi-band aperture photometry PDFs to identify multiple observations of the same source that have similar photometric properties, and therefore can be analyzed simultaneously to improve S/N. We briefly describe these and other updates that significantly enhance the scientific utility of CSC 2.0 when compared to the earlier catalog release. This work has been supported by NASA under contract NAS 8-03060 to the Smithsonian Astrophysical Observatory for operation of the Chandra X-ray Center.
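The idea of carrying a full photometric PDF rather than a point estimate can be shown with a minimal Bayesian aperture-photometry sketch. This is not the CSC 2.0 code; it is the textbook Poisson case with a known background and a flat prior, and the count values are illustrative:

```python
import numpy as np

# Posterior PDF of source intensity s given N observed counts in an
# aperture with known expected background b (illustrative values),
# Poisson likelihood, flat prior on s >= 0.
N, b = 8, 3.2

s = np.linspace(0.0, 30.0, 3001)
log_like = N * np.log(s + b) - (s + b)   # Poisson log-likelihood, N! dropped
pdf = np.exp(log_like - log_like.max())
ds = s[1] - s[0]
pdf /= pdf.sum() * ds                    # normalize on the grid

mode = s[np.argmax(pdf)]                 # classical net counts N - b
mean = (s * pdf).sum() * ds              # posterior mean exceeds the mode
print(f"posterior mode {mode:.2f}, mean {mean:.2f} net counts")
```

Downstream quantities such as hardness ratios can then be propagated from the full PDF instead of a single background-subtracted value, which is what makes low-count and crowded-field photometry robust.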
SOPHAEROS code development and its application to falcon tests
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lajtha, G.; Missirlian, M.; Kissane, M.
1996-12-31
One of the key issues in source-term evaluation in nuclear reactor severe accidents is determination of the transport behavior of fission products released from the degrading core. The SOPHAEROS computer code is being developed to predict fission product transport in a mechanistic way in light water reactor circuits. These applications of the SOPHAEROS code to the Falcon experiments, among others not presented here, indicate that the numerical scheme of the code is robust, and no convergence problems are encountered. The calculation is also very fast, taking only about three times real time on a Sun SPARC 5 workstation and running typically ~10 times faster than an identical calculation with the VICTORIA code. The study demonstrates that the SOPHAEROS 1.3 code is a suitable tool for prediction of the vapor chemistry and fission product transport with a reasonable level of accuracy. Furthermore, the flexibility of the code material data bank allows improvement of understanding of fission product transport and deposition in the circuit. Performing sensitivity studies with different chemical species or with different properties (saturation pressure, chemical equilibrium constants) is very straightforward.
Mazzaferri, Javier; Larrivée, Bruno; Cakir, Bertan; Sapieha, Przemyslaw; Costantino, Santiago
2018-03-02
Preclinical studies of vascular retinal diseases rely on the assessment of developmental dystrophies in the oxygen-induced retinopathy rodent model. The quantification of vessel tufts and avascular regions is typically computed manually from flat-mounted retinas imaged using fluorescent probes that highlight the vascular network. Such manual measurements are time-consuming and hampered by user variability and bias, so a rapid and objective method is needed. Here, we introduce a machine learning approach to segment and characterize vascular tufts, delineate the whole vasculature network, and identify and analyze avascular regions. Our quantitative retinal vascular assessment (QuRVA) technique uses a simple machine learning method and morphological analysis to provide reliable computations of vascular density and pathological vascular tuft regions within seconds and without user intervention. We demonstrate the high degree of error and variability of manual segmentations, and designed, coded, and implemented a set of algorithms to perform this task in a fully automated manner. We benchmark and validate the results of our analysis pipeline against the consensus of several manually curated segmentations obtained using commonly used computer tools. The source code of our implementation is released under version 3 of the GNU General Public License ( https://www.mathworks.com/matlabcentral/fileexchange/65699-javimazzaf-qurva ).
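The morphological-analysis part of such a pipeline can be sketched with a simple threshold-and-label step. This is a deliberately simplified stand-in for QuRVA's machine-learning segmentation, run on a synthetic image rather than a retinal flat mount:

```python
import numpy as np
from scipy import ndimage

rng = np.random.default_rng(2)

# Synthetic 64x64 "fluorescence" image: dim background noise plus two
# bright crossing bands standing in for vessels.
img = rng.normal(0.1, 0.05, (64, 64))
img[20:24, :] += 1.0
img[:, 40:42] += 1.0

mask = img > 0.5                          # intensity threshold
labels, n = ndimage.label(mask)           # connected components
sizes = ndimage.sum(mask, labels, range(1, n + 1))
keep = np.isin(labels, 1 + np.flatnonzero(sizes >= 20))  # drop small specks

print(f"{n} component(s), vascular fraction {keep.mean():.3f}")
```

A vascular-density estimate then falls out as the kept-pixel fraction; a real pipeline would replace the fixed threshold with a trained classifier and add skeletonization for the network delineation.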
Station Blackout at Browns Ferry Unit One - accident sequence analysis. Volume 1
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cook, D.H.; Harrington, R.M.; Greene, S.R.
1981-11-01
This study describes the predicted response of Unit 1 at the Browns Ferry Nuclear Plant to Station Blackout, defined as a loss of offsite power combined with failure of all onsite emergency diesel-generators to start and load. Every effort has been made to employ the most realistic assumptions during the process of defining the sequence of events for this hypothetical accident. DC power is assumed to remain available from the unit batteries during the initial phase, and the operator actions and corresponding events during this period are described using results provided by an analysis code developed specifically for this purpose. The Station Blackout is assumed to persist beyond the point of battery exhaustion, and the events during this second phase of the accident in which dc power would be unavailable were determined through use of the MARCH code. Without dc power, cooling water could no longer be injected into the reactor vessel and the events of the second phase include core meltdown and subsequent containment failure. An estimate of the magnitude and timing of the concomitant release of the noble gas, cesium, and iodine-based fission products to the environment is provided in Volume 2 of this report. 58 refs., 75 figs., 8 tabs.
Phase II Evaluation of Clinical Coding Schemes
Campbell, James R.; Carpenter, Paul; Sneiderman, Charles; Cohn, Simon; Chute, Christopher G.; Warren, Judith
1997-01-01
Objective: To compare three potential sources of controlled clinical terminology (READ codes version 3.1, SNOMED International, and Unified Medical Language System (UMLS) version 1.6) relative to attributes of completeness, clinical taxonomy, administrative mapping, term definitions and clarity (duplicate coding rate). Methods: The authors assembled 1929 source concept records from a variety of clinical information taken from four medical centers across the United States. The source data included medical as well as ample nursing terminology. The source records were coded in each scheme by an investigator and checked by the coding scheme owner. The codings were then scored by an independent panel of clinicians for acceptability. Codes were checked for definitions provided with the scheme. Codes for a random sample of source records were analyzed by an investigator for “parent” and “child” codes within the scheme. Parent and child pairs were scored by an independent panel of medical informatics specialists for clinical acceptability. Administrative and billing code mapping from the published scheme were reviewed for all coded records and analyzed by independent reviewers for accuracy. The investigator for each scheme exhaustively searched a sample of coded records for duplications. Results: SNOMED was judged to be significantly more complete in coding the source material than the other schemes (SNOMED* 70%; READ 57%; UMLS 50%; *p < .00001). SNOMED also had a richer clinical taxonomy as judged by the number of acceptable first-degree relatives per coded concept (SNOMED* 4.56; UMLS 3.17; READ 2.14; *p < .005). Only the UMLS provided any definitions; these were found for 49% of records which had a coding assignment. READ and UMLS had better administrative mappings (composite score: READ* 40.6%; UMLS* 36.1%; SNOMED 20.7%; *p < .00001), and SNOMED had substantially more duplications of coding assignments (duplication rate: READ 0%; UMLS 4.2%; SNOMED* 13.9%; *p < .004) associated with a loss of clarity. Conclusion: No major terminology source can lay claim to being the ideal resource for a computer-based patient record. However, based upon this analysis of releases for April 1995, SNOMED International is considerably more complete, has a compositional nature and a richer taxonomy. It suffers from less clarity, resulting from a lack of syntax and evolutionary changes in its coding scheme. READ has greater clarity and better mapping to administrative schemes (ICD-10 and OPCS-4), is rapidly changing and is less complete. UMLS is a rich lexical resource, with mappings to many source vocabularies. It provides definitions for many of its terms. However, due to the varying granularities and purposes of its source schemes, it has limitations for representation of clinical concepts within a computer-based patient record. PMID:9147343
DOE Office of Scientific and Technical Information (OSTI.GOV)
RIECK, C.A.
1999-02-23
This Software Configuration Management Plan (SCMP) provides the instructions for change control of the W-211 Project, Retrieval Control System (RCS) software after initial approval/release but prior to the transfer of custody to the waste tank operations contractor. This plan applies to the W-211 system software developed by the project, consisting of the computer human-machine interface (HMI) and programmable logic controller (PLC) software source and executable code, for production use by the waste tank operations contractor. The plan encompasses that portion of the W-211 RCS software represented on project-specific AUTOCAD drawings that are released as part of the C1 definitive design package (these drawings are identified on the drawing list associated with each C-1 package), and the associated software code. Implementation of the plan is required for formal acceptance testing and production release. The software configuration management plan does not apply to reports and data generated by the software except where specifically identified. Control of information produced by the software once it has been transferred for operation is the responsibility of the receiving organization.
Identification of coronal heating events in 3D simulations
NASA Astrophysics Data System (ADS)
Kanella, Charalambos; Gudiksen, Boris V.
2017-07-01
Context. The solar coronal heating problem has been an open question in the science community since 1939. One of the proposed models for the transport and release of mechanical energy generated in the sub-photospheric layers and photosphere is the magnetic reconnection model that incorporates Ohmic heating, which releases a part of the energy stored in the magnetic field. In this model many unresolved flaring events occur in the solar corona, releasing enough energy to heat the corona. Aims: The problem with the verification and quantification of this model is that we cannot resolve small-scale events due to limitations of the current observational instrumentation. Flaring events have scaling behavior extending from large X-class flares down to the so far unobserved nanoflares. Histograms of observable characteristics of flares show power-law behavior for energy release rate, size, and total energy. Depending on the power-law index of the energy release, nanoflares might be an important candidate for coronal heating; we seek to find that index. Methods: In this paper we employ a numerical three-dimensional (3D) magnetohydrodynamic (MHD) simulation produced by the numerical code Bifrost, which enables us to look into smaller structures, and a new technique to identify the 3D heating events at a specific instant. The quantity we explore is the Joule heating, a term calculated directly by the code, which is explicitly correlated with magnetic reconnection because it depends on the curl of the magnetic field. Results: We are able to identify 4136 events in a volume of 24 × 24 × 9.5 Mm3 (i.e., 768 × 786 × 331 grid cells) of a specific snapshot. We find a power-law slope of the released energy per second equal to αP = 1.5 ± 0.02, and two power-law slopes of the identified volume equal to αV = 1.53 ± 0.03 and αV = 2.53 ± 0.22.
The identified energy events do not represent all the released energy, but among the identified events the total energy of the largest events dominates the energy release. Most of the energy release happens in the lower corona, and heating drops with height. We find that with a specific identification method large events can be resolved into smaller ones, but at the expense of the total identified energy release. The energy release that cannot be identified as an event favors a low-energy release mechanism. Conclusions: This is the first step to quantitatively identify magnetic reconnection sites and measure the energy released by current sheet formation.
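The power-law indices quoted above can be estimated from an event catalogue with a standard maximum-likelihood fit. The sketch below is not the authors' pipeline; the event energies are synthetic, drawn from a distribution with a known index of 1.5, which the estimator should recover:

```python
import numpy as np

def powerlaw_mle(x, xmin):
    """Maximum-likelihood estimate of alpha for p(x) ~ x^(-alpha), x >= xmin."""
    x = np.asarray(x, dtype=float)
    x = x[x >= xmin]
    n = x.size
    alpha = 1.0 + n / np.sum(np.log(x / xmin))
    sigma = (alpha - 1.0) / np.sqrt(n)   # approximate standard error
    return alpha, sigma

# Synthetic "event energies" drawn from p(E) ~ E^(-1.5) by inverse-transform sampling
rng = np.random.default_rng(42)
u = rng.random(50_000)
energies = (1.0 - u) ** (-1.0 / 0.5)     # alpha = 1.5 -> exponent -1/(alpha - 1)
alpha_hat, err = powerlaw_mle(energies, xmin=1.0)
```

The same estimator applied to event volumes would give the αV slopes; binning-free maximum likelihood avoids the bias of fitting a straight line to a log-log histogram.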
DOE Office of Scientific and Technical Information (OSTI.GOV)
McKenzie-Carter, M.A.; Lyon, R.E.
This report contains information to support the Environmental Assessment for the Compact Ignition Tokamak Project (CIT) proposed for Princeton Plasma Physics Laboratory (PPPL). The assumptions and methodology used to assess the impact to members of the public from operational and accidental releases of radioactive material from the proposed CIT during the operational period of the project are described. A description of the tracer release tests conducted at PPPL by NOAA is included; dispersion values from these tests are used in the dose calculation. Radiological releases, doses, and resulting health risks are calculated. The computer code AIRDOS-EPA is used to calculate the individual and population doses for routine releases; FUSCRAC3 is used to calculate doses resulting from off-normal releases where direct application of the NOAA tracer test data is not practical. Where applicable, doses are compared to regulatory limits and guideline values. 44 refs., 5 figs., 18 tabs.
1989-11-01
Approved for public release; distribution unlimited. Proceedings of the Spacecraft Charging Technology conference. In the simple charging model discussed, the spacecraft charges negatively according to dV/dt = 4πa² J_th e^(V/θ)/C, whose solution is V/θ = -ln(1 + t/τ).
2014-04-01
The important data structures of RTEMS are introduced, including Object, a critical data structure in the SCORE, and task threads. Section 3.2.2 discusses the problems found in RTEMS that may cause security vulnerabilities in these important system codes; example code shows that a user can delete a system thread, so system threads need protection. Approved for Public Release.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Carpenter, John H.; Belcourt, Kenneth Noel
Completion of the CASL L3 milestone THM.CFD.P6.03 provides a tabular material properties capability to the Hydra code. A tabular interpolation package used in Sandia codes was modified to support the needs of multi-phase solvers in Hydra. Use of the interface is described. The package was released to Hydra under a government use license. A dummy physics was created in Hydra to prototype use of the interpolation routines. Finally, a test using the dummy physics verifies the correct behavior of the interpolation for a test water table.
Security Police Officer Utilization Field, AFSCs 8111, 8116, 8121, and 8124.
1981-06-01
Approved for public release; distribution unlimited.
1988-01-21
Approved for public release; distribution unlimited. Report AFGL-TR-88-0016; performing/monitoring organization: Air Force Geophysics Laboratory.
2012-01-01
Action potentials at the neurons and graded signals at the synapses are primary codes in the brain. In terms of their functional interaction, previous studies focused on the influence of presynaptic spike patterns on synaptic activities. How synapse dynamics quantitatively regulates the encoding of postsynaptic digital spikes remains unclear. We investigated this question at unitary glutamatergic synapses on cortical GABAergic neurons, especially the quantitative influences of release probability on synapse dynamics and neuronal encoding. Glutamate release probability and synaptic strength are proportionally upregulated by presynaptic sequential spikes. The upregulation of release probability and the efficiency of probability-driven synaptic facilitation are strengthened by elevating presynaptic spike frequency and Ca2+. The upregulation of release probability improves spike capacity and timing precision at the postsynaptic neuron. These results suggest that the upregulation of presynaptic glutamate release facilitates a conversion of synaptic analogue signals into digital spikes in postsynaptic neurons, i.e., a functional compatibility between presynaptic and postsynaptic partners. PMID:22852823
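The frequency dependence of release-probability upregulation can be illustrated with a generic short-term facilitation model. This is a minimal sketch with made-up parameters (p0, dp, tau_f, isi), not the model fitted in the study:

```python
import math

def facilitating_train(n_spikes, p0=0.2, dp=0.15, tau_f=100.0, isi=20.0):
    """Release probability across a presynaptic spike train with facilitation.

    Each spike pushes p toward 1 by dp * (1 - p); between spikes p relaxes back
    to its resting value p0 with time constant tau_f (ms). All parameters are
    illustrative, not fitted values.
    """
    p, probs = p0, []
    for _ in range(n_spikes):
        probs.append(p)
        p += dp * (1.0 - p)                         # facilitation at the spike
        p = p0 + (p - p0) * math.exp(-isi / tau_f)  # decay over the inter-spike interval
    return probs

fast = facilitating_train(10, isi=10.0)   # 100 Hz train
slow = facilitating_train(10, isi=50.0)   # 20 Hz train
```

A 100 Hz train ends with a higher release probability than a 20 Hz train, mirroring the reported strengthening of facilitation at elevated presynaptic spike frequency.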
Tasiemski, Aurélie; Hammad, Hamida; Vandenbulcke, Franck; Breton, Christophe; Bilfinger, Thomas J; Pestel, Joel; Salzet, Michel
2002-07-15
Chromogranin A (CGA) and chromogranin B (CGB) are acidic proteins stored in secretory organelles of endocrine cells and neurons. In addition to their roles as helper proteins in the packaging of peptides, they may serve as prohormones to generate biologically active peptides such as vasostatin-1 and secretolytin. These molecules derived from CGA and CGB, respectively, possess antimicrobial properties. The present study demonstrates that plasmatic levels of both vasostatin-1 and secretolytin increase during surgery in patients undergoing cardiopulmonary bypass (CPB). Vasostatin-1 and secretolytin, initially present in plasma at low levels, are released just after skin incision. Consequently, they can be added to enkelytin, an antibacterial peptide derived from proenkephalin A, for the panoply of components acting as a first protective barrier against hypothetical invasion of pathogens, which may occur during surgery. CGA and CGB, more commonly viewed as markers for endocrine and neuronal cells, were also found to have an immune origin. RNA messengers coding for CGB were amplified by reverse transcription-polymerase chain reaction in human monocytes, and immunocytochemical analysis by confocal microscopy revealed the presence of CGA or CGB or both in monocytes and neutrophils. A combination of techniques including confocal microscopic analysis, mass spectrometry measurement, and antibacterial tests allowed for the identification of the positive role of interleukin 6 (IL-6) in the secretolytin release from monocytes in vitro. Because IL-6 release is known to be strongly enhanced during CPB, we suggest a possible relationship between IL-6 and the increased level of secretolytin in patients undergoing CPB.
Application Program Interface for the Orion Aerodynamics Database
NASA Technical Reports Server (NTRS)
Robinson, Philip E.; Thompson, James
2013-01-01
The Application Programming Interface (API) for the Crew Exploration Vehicle (CEV) Aerodynamic Database has been developed to provide the developers of software an easily implemented, fully self-contained method of accessing the CEV Aerodynamic Database for use in their analysis and simulation tools. The API is programmed in C and provides a series of functions to interact with the database, such as initialization, selecting various options, and calculating the aerodynamic data. No special functions (file read/write, table lookup) are required on the host system other than those included with a standard ANSI C installation. It reads one or more files of aero data tables. Previous releases of aerodynamic databases for space vehicles have only included data tables and a document of the algorithm and equations to combine them for the total aerodynamic forces and moments. This process required each software tool to have a unique implementation of the database code. Errors or omissions in the documentation, or errors in the implementation, led to a lengthy and burdensome process of having to debug each instance of the code. Additionally, input file formats differ for each space vehicle simulation tool, requiring the aero database tables to be reformatted to meet the tool's input file structure requirements. Finally, the capabilities of built-in table lookup routines vary for each simulation tool. Implementation of a new database may require an update to and verification of the table lookup routines. This may be required if the number of dimensions of a data table exceeds the capability of the simulation tool's built-in lookup routines. A single software solution was created to provide an aerodynamics software model that could be integrated into other simulation and analysis tools. The highly complex Orion aerodynamics model can then be quickly included in a wide variety of tools. The API code is written in ANSI C for ease of portability to a wide variety of systems.
The input data files are in standard formatted ASCII, also for improved portability. The API contains its own implementation of multidimensional table reading and lookup routines. The same aerodynamics input file can be used without modification on all implementations. The turnaround time from aerodynamics model release to a working implementation is significantly reduced.
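The kind of multidimensional table lookup the API bundles can be illustrated in two dimensions. The grid, table values, and function name below are hypothetical, and the real API is ANSI C rather than Python; this is only a sketch of the interpolation idea:

```python
import numpy as np

def bilinear_lookup(xgrid, ygrid, table, x, y):
    """Bilinear interpolation in a 2-D table: a minimal stand-in for the
    API's built-in multidimensional lookup routines."""
    i = np.clip(np.searchsorted(xgrid, x) - 1, 0, len(xgrid) - 2)
    j = np.clip(np.searchsorted(ygrid, y) - 1, 0, len(ygrid) - 2)
    tx = (x - xgrid[i]) / (xgrid[i + 1] - xgrid[i])
    ty = (y - ygrid[j]) / (ygrid[j + 1] - ygrid[j])
    return ((1 - tx) * (1 - ty) * table[i, j]     + tx * (1 - ty) * table[i + 1, j]
          + (1 - tx) * ty       * table[i, j + 1] + tx * ty       * table[i + 1, j + 1])

# Hypothetical 2-D aero table: a coefficient indexed by Mach number and angle of attack (deg)
mach  = np.array([0.5, 0.9, 1.2, 2.0])
alpha = np.array([0.0, 5.0, 10.0, 15.0])
table = 0.01 * np.outer(mach, alpha)     # made-up coefficient values for illustration

cd = bilinear_lookup(mach, alpha, table, 1.0, 7.5)
```

Bundling the lookup with the data is what lets every host tool share one verified implementation instead of re-coding (and re-debugging) its own.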
2013-05-23
Approved for public release; distribution is unlimited. Prepared at the US Army Command and General Staff College, Ft. Leavenworth, KS 66027. The views expressed do not necessarily represent those of the Command and General Staff College, the United States Army, the Department of Defense, or any other US government agency.
An Evaluation of Sea Turtle Populations and Survival Status on Vieques Island.
1982-06-22
Pritchard and T. H. Stubbs, Florida Audubon Society (contract N66001-80-C-0560), 22 June 1982. Prepared for the NOSC Marine Sciences Division (Code 513). Approved for public release. Keywords: reptiles, hawksbill, loggerhead, green turtle, leatherback, nesting turtles, Vieques Island.
Biometric Identification Verification Technology Status and Feasibility Study
1994-09-01
Contract No. DNA 001-93-C-0137. Approved for public release; distribution is unlimited. The issue is reduced to one of positive identification and control; traditionally, this has been accomplished by posting a guard.
Application of the JENDL-4.0 nuclear data set for uncertainty analysis of the prototype FBR Monju
DOE Office of Scientific and Technical Information (OSTI.GOV)
Tamagno, P.; Van Rooijen, W. F. G.; Takeda, T.
2012-07-01
This paper deals with uncertainty analysis of the Monju reactor using JENDL-4.0 and the ERANOS code [1]. In 2010 the Japan Atomic Energy Agency (JAEA) released the JENDL-4.0 nuclear data set. This new evaluation contains improved values of cross-sections and emphasizes accurate covariance matrices. Also in 2010, JAEA restarted the sodium-cooled fast reactor prototype Monju after about 15 years of shutdown. The long shutdown time resulted in a build-up of {sup 241}Am by natural decay from the initially loaded Pu. As well as improved covariance matrices, JENDL-4.0 is announced to contain improved data for minor actinides [2]. The choice of the Monju reactor as an application of the new evaluation therefore seems even more relevant. The uncertainty analysis requires the determination of sensitivity coefficients. The well-established ERANOS code was chosen because of its integrated modules that allow users to perform sensitivity and uncertainty analysis. A JENDL-4.0 cross-sections library is not available for ERANOS. Therefore a cross-sections library had to be made from the original ENDF files for the ECCO cell code (part of ERANOS). For confirmation of the newly made library, calculations of a benchmark core were performed. These calculations used the MZA and MZB benchmarks and showed consistent results with other libraries. Calculations for the Monju reactor were performed using hexagonal 3D geometry and PN transport theory. However, the ERANOS sensitivity modules cannot use the resulting fluxes, as these modules require finite-differences-based fluxes, obtained from RZ SN-transport or 3D diffusion calculations. The corresponding geometrical models have been made and the results verified with Monju restart experimental data [4]. Uncertainty analysis was performed using the RZ model.
JENDL-4.0 uncertainty analysis showed a significant reduction of the uncertainty related to the fission cross-section of Pu along with an increase of the uncertainty related to the capture cross-section of {sup 238}U compared with the previous JENDL-3.3 version. Covariance data recently added in JENDL-4.0 for {sup 241}Am appears to have a non-negligible contribution. (authors)
Douzery, Emmanuel J P; Scornavacca, Celine; Romiguier, Jonathan; Belkhir, Khalid; Galtier, Nicolas; Delsuc, Frédéric; Ranwez, Vincent
2014-07-01
Comparative genomic studies extensively rely on alignments of orthologous sequences. Yet, selecting, gathering, and aligning orthologous exons and protein-coding sequences (CDS) that are relevant for a given evolutionary analysis can be a difficult and time-consuming task. In this context, we developed OrthoMaM, a database of ORTHOlogous MAmmalian Markers describing the evolutionary dynamics of orthologous genes in mammalian genomes using a phylogenetic framework. Since its first release in 2007, OrthoMaM has regularly evolved, not only to include newly available genomes but also to incorporate up-to-date software in its analytic pipeline. This eighth release integrates the 40 complete mammalian genomes available in Ensembl v73 and provides alignments, phylogenies, evolutionary descriptor information, and functional annotations for 13,404 single-copy orthologous CDS and 6,953 long exons. The graphical interface allows users to easily explore OrthoMaM to identify markers with specific characteristics (e.g., taxa availability, alignment size, %G+C, evolutionary rate, chromosome location). It hence provides an efficient solution to sample preprocessed markers adapted to user-specific needs. OrthoMaM has proven to be a valuable resource for researchers interested in mammalian phylogenomics and evolutionary genomics, and has served as a source of benchmark empirical data sets in several methodological studies. OrthoMaM is available for browsing, query and complete or filtered downloads at http://www.orthomam.univ-montp2.fr/. © The Author 2014. Published by Oxford University Press on behalf of the Society for Molecular Biology and Evolution. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.
Tang, Songsong; Gu, Yuan; Lu, Huiting; Dong, Haifeng; Zhang, Kai; Dai, Wenhao; Meng, Xiangdan; Yang, Fan; Zhang, Xueji
2018-04-03
Herein, a highly sensitive microRNA (miRNA) detection strategy was developed by combining bio-bar-code assay (BBA) with catalytic hairpin assembly (CHA). In the proposed system, two nanoprobes were designed: magnetic nanoparticles functionalized with DNA probes (MNPs-DNA) and gold nanoparticles carrying numerous barcode DNA strands (AuNPs-DNA). In the presence of target miRNA, the MNP-DNA and AuNP-DNA hybridized with the target miRNA to form a "sandwich" structure. After the "sandwich" structures were separated from the solution by a magnetic field and dehybridized at high temperature, the barcode DNA sequences were released by dissolving the AuNPs. The released barcode DNA sequences triggered the toehold strand-displacement assembly of two hairpin probes, leading to recycling of the barcode DNA sequences and producing numerous fluorescent CHA products for miRNA detection. Under optimal experimental conditions, the proposed two-stage amplification system could sensitively detect target miRNA ranging from 10 pM to 10 aM with a limit of detection (LOD) down to 97.9 zM. It displayed good capability to discriminate single-base and three-base mismatches due to the unique sandwich structure. Notably, it presented good feasibility for selective multiplexed detection of various combinations of synthetic miRNA sequences and miRNAs extracted from different cell lysates, in agreement with traditional polymerase chain reaction analysis. The two-stage amplification strategy may have significant implications for biological detection and clinical diagnosis. Copyright © 2017 Elsevier B.V. All rights reserved.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Medin, Stanislav A.; Basko, Mikhail M.; Orlov, Yurii N.
2012-07-11
Radiation hydrodynamics 1D simulations were performed with two concurrent codes, DEIRA and RAMPHY. The DEIRA code was used for DT capsule implosion and burn, and the RAMPHY code was used for computation of X-ray and fast ions deposition in the first wall liquid film of the reactor chamber. The simulations were run for 740 MJ direct drive DT capsule and Pb thin liquid wall reactor chamber of 10 m diameter. Temporal profiles for DT capsule leaking power of X-rays, neutrons and fast {sup 4}He ions were obtained and spatial profiles of the liquid film flow parameter were computed and analyzed.
The jmzQuantML programming interface and validator for the mzQuantML data standard.
Qi, Da; Krishna, Ritesh; Jones, Andrew R
2014-03-01
The mzQuantML standard from the HUPO Proteomics Standards Initiative has recently been released, capturing quantitative data about peptides and proteins, following analysis of MS data. We present a Java application programming interface (API) for mzQuantML called jmzQuantML. The API provides robust bridges between Java classes and elements in mzQuantML files and allows random access to any part of the file. The API provides read and write capabilities, and is designed to be embedded in other software packages, enabling mzQuantML support to be added to proteomics software tools (http://code.google.com/p/jmzquantml/). The mzQuantML standard is designed around a multilevel validation system to ensure that files are structurally and semantically correct for different proteomics quantitative techniques. In this article, we also describe a Java software tool (http://code.google.com/p/mzquantml-validator/) for validating mzQuantML files, which is a formal part of the data standard. © 2014 The Authors. Proteomics published by Wiley-VCH Verlag GmbH & Co. KGaA, Weinheim.
Natural selection drove metabolic specialization of the chromatophore in Paulinella chromatophora.
Valadez-Cano, Cecilio; Olivares-Hernández, Roberto; Resendis-Antonio, Osbaldo; DeLuna, Alexander; Delaye, Luis
2017-04-14
Genome degradation of host-restricted mutualistic endosymbionts has been attributed to inactivating mutations and genetic drift, while genes coding for host-relevant functions are conserved by purifying selection. Unlike their free-living relatives, the metabolism of mutualistic endosymbionts and endosymbiont-originated organelles is specialized in the production of metabolites which are released to the host. This specialization suggests that natural selection crafted these metabolic adaptations. In this work, we analyzed the evolution of the metabolism of the chromatophore of Paulinella chromatophora by in silico modeling. We asked whether genome reduction is driven by metabolic engineering strategies resulting from the interaction with the host. As is widely known, the loss of enzyme-coding genes leads to metabolic network restructuring that sometimes improves production rates, in this case the production rate of reduced carbon in the metabolism of the chromatophore. We reconstructed the metabolic networks of the chromatophore of P. chromatophora CCAC 0185 and a close free-living relative, the cyanobacterium Synechococcus sp. WH 5701. We found that the evolution from a free-living to a host-restricted lifestyle rendered a fragile metabolic network in which >80% of genes in the chromatophore are essential for metabolic functionality. Despite the lack of experimental information, the metabolic reconstruction of the chromatophore suggests that the host provides several metabolites to the endosymbiont. By using these metabolites as intracellular conditions, in silico simulations of genome evolution by gene loss recover with 77% accuracy the actual metabolic gene content of the chromatophore. Also, the metabolic model of the chromatophore allowed us to predict by flux balance analysis a maximum rate of reduced carbon released by the endosymbiont to the host.
By inspecting the central metabolism of the chromatophore and the free-living cyanobacterium, we found that improvements in the gluconeogenic pathway allow the metabolism of the endosymbiont to use the carbon source more efficiently for reduced-carbon production. In addition, our in silico simulations of the evolutionary process leading to the reduced metabolic network of the chromatophore showed that the predicted rate of released reduced carbon is obtained less than 5% of the time under a process guided by random gene deletion and genetic drift. We interpret these findings as evidence that natural selection at the holobiont level shaped the rate at which reduced carbon is exported to the host. Finally, our model also predicts that the ABC phosphate transporter (pstSACB), which is conserved in the genome of the chromatophore of P. chromatophora strain CCAC 0185, is a necessary component to release reduced-carbon molecules to the host. Our evolutionary analysis suggests that in the case of Paulinella chromatophora natural selection at the holobiont level played a prominent role in shaping the metabolic specialization of the chromatophore. We propose that natural selection acted as a "metabolic engineer" by favoring metabolic restructurings that led to an increased release of reduced carbon to the host.
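The flux balance analysis used to predict maximum reduced-carbon release can be sketched as a small linear program. The three-reaction network, bounds, and flux values below are a toy, not the chromatophore reconstruction:

```python
import numpy as np
from scipy.optimize import linprog

# Toy stoichiometric matrix S (metabolites x reactions): uptake -> A -> B -> export
#                R1    R2    R3
S = np.array([[ 1.0, -1.0,  0.0],    # metabolite A
              [ 0.0,  1.0, -1.0]])   # metabolite B
BOUNDS = [(0, 10), (0, 1000), (0, 1000)]   # uptake flux capped at 10 units

def fba_max_export(knockout=None):
    """Flux balance analysis: maximize the export flux v3 subject to S v = 0."""
    bounds = list(BOUNDS)
    if knockout is not None:
        bounds[knockout] = (0, 0)          # simulate a gene deletion
    res = linprog(c=[0, 0, -1], A_eq=S, b_eq=[0, 0], bounds=bounds, method="highs")
    return -res.fun

wild_type = fba_max_export()               # all uptake is exported
r2_mutant = fba_max_export(knockout=1)     # R2 is essential: export drops to zero
```

Forcing a reaction's flux bounds to zero mimics a gene deletion; repeating this across all reactions is how gene essentiality (>80% of chromatophore genes, per the abstract) is probed in silico.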
Xu, Xuebin; Hu, Xin; Ding, Zhuhong; Chen, Yijun
2017-12-01
The potential release of toxic elements and the stability of carbon in sludge-based biochars are important for their application in soil remediation and wastewater treatment. In this study, municipal sludge was co-pyrolyzed with calcium carbonate (CaCO3) and calcium dihydrogen phosphate [Ca(H2PO4)2] at 300 and 600 °C, respectively. The basic physicochemical properties of the resultant biochars were characterized, and laboratory chemical-oxidation and leaching experiments were conducted to evaluate the chemical stability of carbon in the biochars and the potential release of toxic elements from them. Results show that the exogenous minerals greatly changed the physicochemical properties of the resultant biochars. Biochars with exogenous minerals, especially Ca(H2PO4)2, decreased the release of Zn, Cr, Ni, Cu, Pb, and As, with release ratios of less than 1%. Tessier's sequential extraction analysis revealed that labile toxic elements were transferred to the residual fraction in the biochars with high pyrolysis temperature (600 °C) and exogenous minerals. Low risks for biochar-bound Pb, Zn, Cd, As, Cr, and Cu were confirmed according to the risk assessment code (RAC), while the potential ecological risk index (PERI) revealed that the exogenous Ca(H2PO4)2 significantly decreased the risks from considerable to moderate level. Moreover, the exogenous minerals significantly increased the chemical stability of carbon in 600 °C-pyrolyzed biochars by 10-20%. These results indicate that co-pyrolysis of sludge with phosphate and carbonate, especially phosphate, is an effective method to prepare sludge-based biochars with immobilized toxic elements and enhanced chemical stability of carbon. Copyright © 2017 Elsevier Ltd. All rights reserved.
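The RAC classification mentioned above assigns risk from the share of a metal held in the mobile fractions of a sequential extraction. The sketch below uses the commonly cited RAC thresholds and invented input numbers, not data from this study:

```python
def risk_assessment_code(exchangeable, carbonate, total):
    """RAC: percentage of a metal in the mobile (exchangeable + carbonate-bound)
    fractions of a sequential extraction, classified with the commonly used
    thresholds. All inputs are in the same units (e.g., mg/kg)."""
    pct = 100.0 * (exchangeable + carbonate) / total
    if pct < 1.0:
        level = "no risk"
    elif pct <= 10.0:
        level = "low risk"
    elif pct <= 30.0:
        level = "medium risk"
    elif pct <= 50.0:
        level = "high risk"
    else:
        level = "very high risk"
    return pct, level

# Invented example: 0.5 + 2.5 mg/kg mobile out of 100 mg/kg total -> 3%, low risk
pct, level = risk_assessment_code(exchangeable=0.5, carbonate=2.5, total=100.0)
```

A release ratio below 1%, as reported for the mineral-amended biochars, would fall in the "no risk" class under these thresholds.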
Stand-alone containment analysis of Phébus FPT tests with ASTEC and MELCOR codes: the FPT-2 test.
Gonfiotti, Bruno; Paci, Sandro
2018-03-01
During the last 40 years, many studies have been carried out to investigate the different phenomena occurring during a Severe Accident (SA) in a Nuclear Power Plant (NPP). Such efforts have been supported by the execution of different experimental campaigns, and the integral Phébus FP tests were probably some of the most important experiments in this field. In these tests, the degradation of a Pressurized Water Reactor (PWR) fuel bundle was investigated employing different control rod materials and burn-up levels in strongly or weakly oxidizing conditions. From the findings on these and previous tests, numerical codes such as ASTEC and MELCOR have been developed to analyze the evolution of a SA in real NPPs. After the termination of the Phébus FP campaign, these two codes have been furthermore improved to implement the more recent findings coming from different experimental campaigns. Therefore, continuous verification and validation is still necessary to check that the new improvements introduced in such codes allow also a better prediction of these Phébus tests. The aim of the present work is to re-analyze the Phébus FPT-2 test employing the updated ASTEC and MELCOR code versions. The analysis focuses on the stand-alone containment aspects of this test, and three different spatial nodalizations of the containment vessel (CV) have been developed. The paper summarizes the main thermal-hydraulic results and presents different sensitivity analyses carried out on the aerosols and fission products (FP) behavior. When possible, a comparison among the results obtained during this work and by different authors in previous work is also performed. This paper is part of a series of publications covering the four Phébus FP tests using a PWR fuel bundle: FPT-0, FPT-1, FPT-2, and FPT-3, excluding the FPT-4 one, related to the study of the release of low-volatility FP and transuranic elements from a debris bed and a pool of melted fuel.
CLUMPY: A code for γ-ray signals from dark matter structures
NASA Astrophysics Data System (ADS)
Charbonnier, Aldée; Combet, Céline; Maurin, David
2012-03-01
We present the first public code for semi-analytical calculation of the γ-ray flux astrophysical J-factor from dark matter annihilation/decay in the Galaxy, including dark matter substructures. The core of the code is the calculation of the line of sight integral of the dark matter density squared (for annihilations) or density (for decaying dark matter). The code can be used in three modes: i) to draw skymaps from the Galactic smooth component and/or the substructure contributions, ii) to calculate the flux from a specific halo (that is not the Galactic halo, e.g. dwarf spheroidal galaxies) or iii) to perform simple statistical operations from a list of allowed DM profiles for a given object. Extragalactic contributions and other tracers of DM annihilation (e.g. positrons, anti-protons) will be included in a second release.
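The core line-of-sight integral that the abstract describes can be sketched numerically. The NFW profile parameters, distances, and function names below are arbitrary illustrations, not CLUMPY's actual implementation:

```python
import numpy as np
from scipy.integrate import quad

RHO_S, R_S = 1.0, 1.0      # hypothetical NFW normalization and scale radius
D, R_MAX = 100.0, 10.0     # distance to the halo and its outer radius (same units)

def rho_nfw(r):
    """NFW density profile: rho_s / ((r/r_s) (1 + r/r_s)^2)."""
    x = r / R_S
    return RHO_S / (x * (1.0 + x) ** 2)

def los_integral(b, power=2):
    """Integrate rho^power along a line of sight with impact parameter b from
    the halo center: power=2 for annihilating dark matter, power=1 for decay."""
    r_of_l = lambda l: np.sqrt(b ** 2 + (l - D) ** 2)
    val, _ = quad(lambda l: rho_nfw(r_of_l(l)) ** power, D - R_MAX, D + R_MAX)
    return val

j_inner = los_integral(0.1)   # line of sight passing close to the halo center
j_outer = los_integral(1.0)   # line of sight farther from the center
```

Because the integrand is density squared for annihilation, lines of sight near the halo center dominate the J-factor map; a skymap is just this integral evaluated over a grid of directions.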
Three empirical essays on consumer behavior related to climate change and energy
NASA Astrophysics Data System (ADS)
Jacobsen, Grant Douglas
This dissertation consists of three essays. All of the chapters address a topic in the area of household and consumer behavior related to climate change or energy. The first chapter is titled "The Al Gore Effect: An Inconvenient Truth and Voluntary Carbon Offsets". This chapter examines the relationship between climate change awareness and household behavior by testing whether Al Gore's documentary An Inconvenient Truth caused an increase in the purchase of voluntary carbon offsets. The analysis shows that in the two months following the film's release, zip codes within a 10-mile radius of a zip code where the film was shown experienced a 50 percent relative increase in the purchase of voluntary carbon offsets. The second chapter is titled "Are Building Codes Effective at Saving Energy? Evidence from Residential Billing Data in Florida". The analysis shows that Florida's energy-code change that took effect in 2002 is associated with a 4-percent decrease in electricity consumption and a 6-percent decrease in natural-gas consumption in Gainesville, FL. The estimated private payback period for the average residence is 6.4 years and the social payback period ranges between 3.5 and 5.3 years. The third chapter in this dissertation is titled "Do Environmental Offsets Increase Demand for Dirty Goods? Evidence from Residential Electricity Demand". This study evaluates the relationship between green products and existing patterns of consumer behavior by examining the relationship between household enrollment in a green electricity program and consumption of residential electricity. The results suggest there are two different types of green consumers. One type makes a small monthly donation and partially views the donation as a substitute for a previously existing pattern of green behavior, in this case, energy conservation. The other type makes a larger monthly donation and views the donation as a way to make strictly additional improvements in environmental quality.
HYDRATE v1.5 OPTION OF TOUGH+ v1.5
DOE Office of Scientific and Technical Information (OSTI.GOV)
Moridis, George
HYDRATE v1.5 is a numerical code for the simulation of the behavior of hydrate-bearing geologic systems, and represents the third update of the code since its first release [Moridis et al., 2008]. It is an option of TOUGH+ v1.5 [Moridis and Pruess, 2014], a successor to the TOUGH2 [Pruess et al., 1999, 2012] family of codes for multi-component, multiphase fluid and heat flow developed at the Lawrence Berkeley National Laboratory. HYDRATE v1.5 needs the TOUGH+ v1.5 core code in order to compile and execute. It is written in standard FORTRAN 95/2003, and can be run on any computational platform (workstation, PC, Macintosh) for which such compilers are available. By solving the coupled equations of mass and heat balance, the fully operational TOUGH+HYDRATE code can model the non-isothermal gas release, phase behavior and flow of fluids and heat under conditions typical of common natural CH4-hydrate deposits (i.e., in the permafrost and in deep ocean sediments) in complex geological media at any scale (from laboratory to reservoir) at which Darcy's law is valid. TOUGH+HYDRATE v1.5 includes both an equilibrium and a kinetic model of hydrate formation and dissociation. The model accounts for heat and up to four mass components, i.e., water, CH4, hydrate, and water-soluble inhibitors such as salts or alcohols. These are partitioned among four possible phases (gas phase, liquid phase, ice phase and hydrate phase). Hydrate dissociation or formation, phase changes and the corresponding thermal effects are fully described, as are the effects of inhibitors. The model can describe all possible hydrate dissociation mechanisms, i.e., depressurization, thermal stimulation, salting-out effects and inhibitor-induced effects.
Annual Stock Assessment - CWT [Coded Wire Tag program] (USFWS), Annual Report 2007.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pastor, Stephen M.
2009-07-21
In 1989 the Bonneville Power Administration (BPA) began funding the evaluation of production groups of juvenile anadromous fish not being coded-wire tagged for other programs. These groups were the 'Missing Production Groups'. Production fish released by the U.S. Fish and Wildlife Service (FWS) without representative coded-wire tags during the 1980s are indicated as blank spaces on the survival graphs in this report. This program is now referred to as 'Annual Stock Assessment - CWT'. The objectives of the 'Annual Stock Assessment' program are to: (1) estimate the total survival of each production group, (2) estimate the contribution of each production group to fisheries, and (3) prepare an annual report for USFWS hatcheries in the Columbia River basin. Coded-wire tag recovery information will be used to evaluate the relative success of individual brood stocks. This information can also be used by salmon harvest managers to develop plans to allow the harvest of excess hatchery fish while protecting threatened, endangered, or other stocks of concern. All fish release information, including marked/unmarked ratios, is reported to the Pacific States Marine Fisheries Commission (PSMFC). Fish recovered in the various fisheries or at the hatcheries are sampled to recover coded-wire tags. This recovery information is also reported to PSMFC. This report has been prepared annually starting with the report labeled 'Annual Report 1994'. Although the current report has the title 'Annual Report 2007', it was written in fall of 2008 using data available from RMIS that same year, and submitted as final in January 2009. The main objective of the report is to evaluate survival of groups which have been tagged under this ongoing project.
NASA Technical Reports Server (NTRS)
Koppenhoefer, Kyle C.; Gullerud, Arne S.; Ruggieri, Claudio; Dodds, Robert H., Jr.; Healy, Brian E.
1998-01-01
This report describes theoretical background material and commands necessary to use the WARP3D finite element code. WARP3D is under continuing development as a research code for the solution of very large-scale, 3-D solid models subjected to static and dynamic loads. Specific features in the code oriented toward the investigation of ductile fracture in metals include a robust finite strain formulation, a general J-integral computation facility (with inertia, face loading), an element extinction facility to model crack growth, nonlinear material models including viscoplastic effects, and the Gurson-Tvergaard dilatant plasticity model for void growth. The nonlinear, dynamic equilibrium equations are solved using an incremental-iterative, implicit formulation with full Newton iterations to eliminate residual nodal forces. The history integration of the nonlinear equations of motion is accomplished with Newmark's beta method. A central feature of WARP3D involves the use of a linear-preconditioned conjugate gradient (LPCG) solver implemented in an element-by-element format to replace a conventional direct linear equation solver. This software architecture dramatically reduces both the memory requirements and CPU time for very large, nonlinear solid models since formation of the assembled (dynamic) stiffness matrix is avoided. Analyses thus exhibit the numerical stability for large time (load) steps provided by the implicit formulation coupled with the low memory requirements characteristic of an explicit code. In addition to the much lower memory requirements of the LPCG solver, the CPU time required for solution of the linear equations during each Newton iteration is generally one-half or less of the CPU time required for a traditional direct solver. All other computational aspects of the code (element stiffnesses, element strains, stress updating, element internal forces) are implemented in the element-by-element, blocked architecture. This greatly improves vectorization of the code on uni-processor hardware and enables straightforward parallel-vector processing of element blocks on multi-processor hardware.
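The linear-preconditioned conjugate gradient idea at the heart of WARP3D's solver can be sketched in miniature. Below is a minimal Jacobi-preconditioned CG solve in plain Python; this is not WARP3D's actual LPCG implementation (which works element-by-element on blocked data), and the small test system is made up for illustration.

```python
# Minimal Jacobi-preconditioned conjugate gradient solver for A x = b,
# with A symmetric positive-definite. Illustrative sketch only.

def pcg(A, b, tol=1e-10, max_iter=1000):
    n = len(b)
    M_inv = [1.0 / A[i][i] for i in range(n)]   # Jacobi (diagonal) preconditioner
    x = [0.0] * n
    r = b[:]                                    # residual r = b - A x, with x = 0
    z = [M_inv[i] * r[i] for i in range(n)]     # preconditioned residual
    p = z[:]
    rz = sum(r[i] * z[i] for i in range(n))
    for _ in range(max_iter):
        Ap = [sum(A[i][j] * p[j] for j in range(n)) for i in range(n)]
        alpha = rz / sum(p[i] * Ap[i] for i in range(n))
        x = [x[i] + alpha * p[i] for i in range(n)]
        r = [r[i] - alpha * Ap[i] for i in range(n)]
        if sum(ri * ri for ri in r) ** 0.5 < tol:
            break
        z = [M_inv[i] * r[i] for i in range(n)]
        rz_new = sum(r[i] * z[i] for i in range(n))
        p = [z[i] + (rz_new / rz) * p[i] for i in range(n)]
        rz = rz_new
    return x

# Small SPD test system (hypothetical, for demonstration)
A = [[4.0, 1.0], [1.0, 3.0]]
b = [1.0, 2.0]
x = pcg(A, b)
```

The appeal for large models is exactly what the abstract describes: the solver only ever needs matrix-vector products, so the assembled stiffness matrix never has to be formed or stored.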
A model describing intra-granular fission gas behaviour in oxide fuel for advanced engineering tools
NASA Astrophysics Data System (ADS)
Pizzocri, D.; Pastore, G.; Barani, T.; Magni, A.; Luzzi, L.; Van Uffelen, P.; Pitts, S. A.; Alfonsi, A.; Hales, J. D.
2018-04-01
The description of intra-granular fission gas behaviour is a fundamental part of any model for the prediction of fission gas release and swelling in nuclear fuel. In this work we present a model describing the evolution of intra-granular fission gas bubbles in terms of bubble number density and average size, coupled to gas release to grain boundaries. The model considers the fundamental processes of single gas atom diffusion, gas bubble nucleation, re-solution and gas atom trapping at bubbles. The model is derived from a detailed cluster dynamics formulation, yet it consists of only three differential equations in its final form; hence, it can be efficiently applied in engineering fuel performance codes while retaining a physical basis. We discuss improvements relative to previous single-size models for intra-granular bubble evolution. We validate the model against experimental data, both in terms of bubble number density and average bubble radius. Lastly, we perform an uncertainty and sensitivity analysis by propagating the uncertainties in the parameters to model results.
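To make the structure of such a reduced model concrete, here is a toy forward-Euler integration of a three-equation system with the same general shape (single-atom concentration, gas in bubbles, bubble number density). All rate constants, units, and coupling terms below are hypothetical placeholders, not the paper's model or its fitted parameters.

```python
# Toy three-equation intra-granular gas system, integrated with forward Euler.
# c1: single gas atom concentration; cb: gas held in bubbles; N: bubble number
# density. All constants are made-up illustrative values.

beta = 1.0e-3   # gas production rate (arbitrary units)
nu   = 0.05     # trapping rate of single atoms at bubbles
b    = 0.05     # re-solution rate from bubbles back into the matrix
k_gb = 0.01     # effective diffusion loss to grain boundaries
eta  = 1.0e-4   # bubble nucleation rate

c1, cb, N = 0.0, 0.0, 0.0
dt, steps = 0.1, 50_000
for _ in range(steps):
    dc1 = beta - (nu + k_gb) * c1 + b * cb   # production, trapping, re-solution
    dcb = nu * c1 - b * cb                   # gas entering/leaving bubbles
    dN  = eta                                # constant nucleation (toy choice)
    c1 += dt * dc1
    cb += dt * dcb
    N  += dt * dN
# at steady state of this toy system, c1 and cb both approach beta/k_gb = 0.1
```

The point of the sketch is the footprint: three coupled ODEs per material point are cheap enough to evaluate inside every step of an engineering fuel performance code.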
Bostelmann, Friederike; Hammer, Hans R.; Ortensi, Javier; ...
2015-12-30
Within the framework of the IAEA Coordinated Research Project on HTGR Uncertainty Analysis in Modeling, criticality calculations of the Very High Temperature Critical Assembly experiment were performed as the validation reference to the prismatic MHTGR-350 lattice calculations. Criticality measurements performed at several temperature points at this Japanese graphite-moderated facility were recently included in the International Handbook of Evaluated Reactor Physics Benchmark Experiments, and represent one of the few data sets available for the validation of HTGR lattice physics. Here, this work compares VHTRC criticality simulations utilizing the Monte Carlo codes Serpent and SCALE/KENO-VI. Reasonable agreement was found between Serpent and KENO-VI, but only the use of the latest ENDF cross section library release, namely the ENDF/B-VII.1 library, led to an improved match with the measured data. Furthermore, the fourth beta release of SCALE 6.2/KENO-VI showed significant improvements from the current SCALE 6.1.2 version, compared to the experimental values and Serpent.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Schultz, Peter Andrew
The objective of the U.S. Department of Energy Office of Nuclear Energy Advanced Modeling and Simulation Waste Integrated Performance and Safety Codes (NEAMS Waste IPSC) is to provide an integrated suite of computational modeling and simulation (M&S) capabilities to quantitatively assess the long-term performance of waste forms in the engineered and geologic environments of a radioactive-waste storage facility or disposal repository. Achieving the objective of modeling the performance of a disposal scenario requires describing processes involved in waste form degradation and radionuclide release at the subcontinuum scale, beginning with mechanistic descriptions of chemical reactions and chemical kinetics at the atomic scale, and upscaling into effective, validated constitutive models for input to high-fidelity continuum scale codes for coupled multiphysics simulations of release and transport. Verification and validation (V&V) is required throughout the system to establish evidence-based metrics for the level of confidence in M&S codes and capabilities, including at the subcontinuum scale and the constitutive models they inform or generate. This report outlines the nature of the V&V challenge at the subcontinuum scale, an approach to incorporate V&V concepts into subcontinuum scale modeling and simulation (M&S), and a plan to incrementally incorporate effective V&V into subcontinuum scale M&S destined for use in the NEAMS Waste IPSC work flow to meet requirements of quantitative confidence in the constitutive models informed by subcontinuum scale phenomena.
TERA submitted by University of California, Riverside and given the tracking designation of R-03-0001. The microorganism has been modified to carry a coding sequence of DsRed for expressing a red fluorescent protein.
RADTRAD: A simplified model for RADionuclide Transport and Removal And Dose estimation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Humphreys, S.L.; Miller, L.A.; Monroe, D.K.
1998-04-01
This report documents the RADTRAD computer code developed for the U.S. Nuclear Regulatory Commission (NRC) Office of Nuclear Reactor Regulation (NRR) to estimate transport and removal of radionuclides and dose at selected receptors. The document includes a users' guide to the code, a description of the technical basis for the code, the quality assurance and code acceptance testing documentation, and a programmers' guide. The RADTRAD code can be used to estimate the containment release using either the NRC TID-14844 or NUREG-1465 source terms and assumptions, or a user-specified table. In addition, the code can account for a reduction in the quantity of radioactive material due to containment sprays, natural deposition, filters, and other natural and engineered safety features. The RADTRAD code uses a combination of tables and/or numerical models of source term reduction phenomena to determine the time-dependent dose at user-specified locations for a given accident scenario. The code system also provides the inventory, decay chain, and dose conversion factor tables needed for the dose calculation. The RADTRAD code can be used to assess occupational radiation exposures, typically in the control room; to estimate site boundary doses; and to estimate dose attenuation due to modification of a facility or accident sequence.
NASA Astrophysics Data System (ADS)
Miller, C. J.; Gasson, D.; Fuentes, E.
2007-10-01
The NOAO NVO Portal is a web application for one-stop discovery, analysis, and access to VO-compliant imaging data and services. The current release allows for GUI-based discovery of nearly a half million images from archives such as the NOAO Science Archive, the Hubble Space Telescope WFPC2 and ACS instruments, XMM-Newton, Chandra, and ESO's INT Wide-Field Survey, among others. The NOAO Portal allows users to view image metadata, footprint wire-frames, FITS image previews, and provides one-click access to science quality imaging data throughout the entire sky via the Firefox web browser (i.e., no applet or code to download). Users can stage images from multiple archives at the NOAO NVO Portal for quick and easy bulk downloads. The NOAO NVO Portal also provides simplified and direct access to VO analysis services, such as the WESIX catalog generation service. We highlight the features of the NOAO NVO Portal (http://nvo.noao.edu).
Construction of a cDNA microarray derived from the ascidian Ciona intestinalis.
Azumi, Kaoru; Takahashi, Hiroki; Miki, Yasufumi; Fujie, Manabu; Usami, Takeshi; Ishikawa, Hisayoshi; Kitayama, Atsusi; Satou, Yutaka; Ueno, Naoto; Satoh, Nori
2003-10-01
A cDNA microarray was constructed from a basal chordate, the ascidian Ciona intestinalis. The draft genome of Ciona has been read and inferred to contain approximately 16,000 protein-coding genes, and cDNAs for transcripts of 13,464 genes have been characterized and compiled as the "Ciona intestinalis Gene Collection Release I". In the present study, we constructed a cDNA microarray of these 13,464 Ciona genes. A preliminary experiment with Cy3- and Cy5-labeled probes showed extensive differential gene expression between fertilized eggs and larvae. In addition, there was a good correlation between results obtained by the present microarray analysis and those from previous EST analyses. This first microarray of a large collection of Ciona intestinalis cDNA clones should facilitate the analysis of global gene expression and gene networks during the embryogenesis of basal chordates.
NASA Technical Reports Server (NTRS)
Darling, Douglas; Radhakrishnan, Krishnan; Oyediran, Ayo
1995-01-01
Premixed combustors, which are being considered for low NOx engines, are susceptible to instabilities due to feedback between pressure perturbations and combustion. This feedback can cause damaging mechanical vibrations of the system as well as degrade the emissions characteristics and combustion efficiency. In a lean combustor instabilities can also lead to blowout. A model was developed to perform linear combustion-acoustic stability analysis using detailed chemical kinetic mechanisms. The Lewis Kinetics and Sensitivity Analysis Code, LSENS, was used to calculate the sensitivities of the heat release rate to perturbations in density and temperature. In the present work, an assumption was made that the mean flow velocity was small relative to the speed of sound. Results of this model showed the regions of growth of perturbations to be most sensitive to the reflectivity of the boundary when reflectivities were close to unity.
Generalized Support Software: Domain Analysis and Implementation
NASA Technical Reports Server (NTRS)
Stark, Mike; Seidewitz, Ed
1995-01-01
For the past five years, the Flight Dynamics Division (FDD) at NASA's Goddard Space Flight Center has been carrying out a detailed domain analysis effort and is now beginning to implement Generalized Support Software (GSS) based on this analysis. GSS is part of the larger Flight Dynamics Distributed System (FDDS), and is designed to run under the FDDS User Interface / Executive (UIX). The FDD is transitioning from a mainframe based environment to systems running on engineering workstations. The GSS will be a library of highly reusable components that may be configured within the standard FDDS architecture to quickly produce low-cost satellite ground support systems. The estimate for the first release is that this library will contain approximately 200,000 lines of code. The main driver for developing generalized software is development cost and schedule improvement. The goal is to ultimately have at least 80 percent of all software required for a spacecraft mission (within the domain supported by the GSS) configured from the generalized components.
Kepler Uniform Modeling of KOIs: MCMC Notes for Data Release 25
NASA Technical Reports Server (NTRS)
Hoffman, Kelsey L.; Rowe, Jason F.
2017-01-01
This document describes data products related to the reported planetary parameters and uncertainties for the Kepler Objects of Interest (KOIs) based on a Markov-Chain-Monte-Carlo (MCMC) analysis. Reported parameters, uncertainties and data products can be found at the NASA Exoplanet Archive. The codes used for this data analysis are available on the GitHub website (Rowe 2016). The relevant paper for details of the calculations is Rowe et al. (2015). The main differences between the model fits discussed here and those in the DR24 catalogue are that the DR25 light curves were used in the analysis, our processing of the MAST light curves took into account different data flags, the number of chains calculated was doubled to 200 000, and the parameters which are reported are based on a damped least-squares fit, instead of the median value from the Markov chain or the chain with the lowest χ² as reported in the past.
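As a reminder of the underlying technique, here is a minimal random-walk Metropolis sampler on a toy one-dimensional Gaussian posterior. It illustrates the MCMC idea only; nothing here reflects the actual Kepler transit-fitting pipeline, and the target, proposal width, and chain length are made up.

```python
import math
import random

# Random-walk Metropolis sampling of a toy 1-D Gaussian "posterior".
random.seed(42)

def log_post(x, mu=0.5, sigma=0.1):
    # Toy log-posterior: Gaussian centered on a "true" parameter value
    return -0.5 * ((x - mu) / sigma) ** 2

x, chain = 0.0, []
for _ in range(20000):
    prop = x + random.gauss(0.0, 0.05)   # symmetric random-walk proposal
    # Accept with probability min(1, post(prop)/post(x))
    if random.random() < math.exp(min(0.0, log_post(prop) - log_post(x))):
        x = prop
    chain.append(x)

burned = chain[5000:]                    # discard burn-in
mean = sum(burned) / len(burned)         # posterior-mean estimate, near 0.5
```

A production analysis replaces the toy target with the log-likelihood of a physical transit model and runs many such chains, which is why chain count and convergence diagnostics dominate the catalogue notes.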
Mapping the polysaccharide degradation potential of Aspergillus niger
2012-01-01
Background The degradation of plant materials by enzymes is an industry of increasing importance. For sustainable production of second generation biofuels and other products of industrial biotechnology, efficient degradation of non-edible plant polysaccharides such as hemicellulose is required. For each type of hemicellulose, a complex mixture of enzymes is required for complete conversion to fermentable monosaccharides. In plant-biomass degrading fungi, these enzymes are regulated and released by complex regulatory structures. In this study, we present a methodology for evaluating the potential of a given fungus for polysaccharide degradation. Results Through the compilation of information from 203 articles, we have systematized knowledge on the structure and degradation of 16 major types of plant polysaccharides to form a graphical overview. As a case example, we have combined this with a list of 188 genes coding for carbohydrate-active enzymes from Aspergillus niger, thus forming an analysis framework, which can be queried. Combination of this information network with gene expression analysis on mono- and polysaccharide substrates has allowed elucidation of concerted gene expression from this organism. One such example is the identification of a full set of extracellular polysaccharide-acting genes for the degradation of oat spelt xylan. Conclusions The mapping of plant polysaccharide structures along with the corresponding enzymatic activities is a powerful framework for expression analysis of carbohydrate-active enzymes. Applying this network-based approach, we provide the first genome-scale characterization of all genes coding for carbohydrate-active enzymes identified in A. niger. PMID:22799883
Mapping the polysaccharide degradation potential of Aspergillus niger.
Andersen, Mikael R; Giese, Malene; de Vries, Ronald P; Nielsen, Jens
2012-07-16
The degradation of plant materials by enzymes is an industry of increasing importance. For sustainable production of second generation biofuels and other products of industrial biotechnology, efficient degradation of non-edible plant polysaccharides such as hemicellulose is required. For each type of hemicellulose, a complex mixture of enzymes is required for complete conversion to fermentable monosaccharides. In plant-biomass degrading fungi, these enzymes are regulated and released by complex regulatory structures. In this study, we present a methodology for evaluating the potential of a given fungus for polysaccharide degradation. Through the compilation of information from 203 articles, we have systematized knowledge on the structure and degradation of 16 major types of plant polysaccharides to form a graphical overview. As a case example, we have combined this with a list of 188 genes coding for carbohydrate-active enzymes from Aspergillus niger, thus forming an analysis framework, which can be queried. Combination of this information network with gene expression analysis on mono- and polysaccharide substrates has allowed elucidation of concerted gene expression from this organism. One such example is the identification of a full set of extracellular polysaccharide-acting genes for the degradation of oat spelt xylan. The mapping of plant polysaccharide structures along with the corresponding enzymatic activities is a powerful framework for expression analysis of carbohydrate-active enzymes. Applying this network-based approach, we provide the first genome-scale characterization of all genes coding for carbohydrate-active enzymes identified in A. niger.
Unit Testing for the Application Control Language (ACL) Software
NASA Technical Reports Server (NTRS)
Heinich, Christina Marie
2014-01-01
In the software development process, code needs to be tested before it can be packaged for release, both to make sure the program actually does what it is supposed to do and to check how the program deals with errors and edge cases (such as negative or very large numbers). One of the major parts of the testing process is unit testing, where you test specific units of the code to make sure each individual part of the code works. This project is about unit testing many different components of the ACL software and fixing any errors encountered. To do this, mocks of other objects need to be created and every line of code needs to be exercised to make sure every case is accounted for. Mocks are important because they give direct control of the environment the unit lives in, instead of attempting to work with the entire program. This makes it easier to achieve the second goal of exercising every line of code.
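The mocking idea described above can be illustrated with Python's standard unittest.mock module. The ACL software is a different codebase; the Thermostat class and sensor interface here are hypothetical stand-ins.

```python
from unittest import mock

class Thermostat:
    """Unit under test: classifies a temperature reading from a sensor."""
    def __init__(self, sensor):
        self.sensor = sensor

    def status(self):
        t = self.sensor.read_temp()
        if t < -40 or t > 125:    # edge case: physically implausible reading
            return "fault"
        return "hot" if t > 30 else "ok"

# The mock replaces the real sensor, giving direct control of the unit's
# environment instead of running the whole program against real hardware.
fake_sensor = mock.Mock()
fake_sensor.read_temp.return_value = 999   # force the out-of-range edge case
result = Thermostat(fake_sensor).status()
fake_sensor.read_temp.assert_called_once() # verify the collaborator was used
```

Because the mock returns whatever value the test dictates, the fault branch is exercised deterministically, which is exactly the "every line of code" goal the abstract describes.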
Probabilistic Seismic Hazard Assessment for Iraq
DOE Office of Scientific and Technical Information (OSTI.GOV)
Onur, Tuna; Gok, Rengin; Abdulnaby, Wathiq
Probabilistic Seismic Hazard Assessments (PSHA) form the basis for most contemporary seismic provisions in building codes around the world. The current building code of Iraq was published in 1997. An update to this edition is in the process of being released. However, there are no national PSHA studies in Iraq for the new building code to refer to for seismic loading in terms of spectral accelerations. As an interim solution, the new draft building code considered referring to PSHA results produced in the late 1990s as part of the Global Seismic Hazard Assessment Program (GSHAP; Giardini et al., 1999). However these results are: a) more than 15 years outdated, b) PGA-based only, necessitating rough conversion factors to calculate spectral accelerations at 0.3 s and 1.0 s for seismic design, and c) at a probability level of 10% chance of exceedance in 50 years, not the 2% that the building code requires. Hence there is a pressing need for a new, updated PSHA for Iraq.
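The two probability levels mentioned above correspond to very different return periods. Under the standard Poisson (memoryless) occurrence assumption, a probability P of exceedance in T years implies a return period T_R = -T / ln(1 - P); the short calculation below applies that textbook relation (it is not code from the study itself).

```python
import math

def return_period(p, t=50.0):
    """Return period (years) for probability p of exceedance in t years,
    assuming Poisson occurrence: p = 1 - exp(-t / T_R)."""
    return -t / math.log(1.0 - p)

tr_10 = return_period(0.10)   # GSHAP level: 10% in 50 years -> ~475 years
tr_02 = return_period(0.02)   # code level:    2% in 50 years -> ~2475 years
```

The roughly fivefold gap between the ~475-year and ~2475-year levels is why the GSHAP results cannot simply be reused at the probability level the building code requires.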
Microwave Spectroscopy of 2-PENTANONE
NASA Astrophysics Data System (ADS)
Andresen, Maike; Nguyen, Ha Vinh Lam; Kleiner, Isabelle; Stahl, Wolfgang
2017-06-01
Methyl propyl ketone (MPK), or 2-pentanone, is known to be an alarm pheromone released by the mandibular glands of bees. It is a highly volatile compound. This molecule was studied by a combination of quantum chemical calculations and microwave spectroscopy in order to obtain information about the lowest energy conformers and their structures. The rotational spectrum of 2-pentanone was measured using the molecular beam Fourier transform microwave spectrometer in Aachen operating between 2 and 26.5 GHz. Ab initio calculations determine 4 conformers, but only two of them are observed under our jet-beam conditions. The lowest conformer has a C_{1} structure and its spectrum shows internal rotation splittings arising from two methyl groups. The internal splittings of 305 transitions for this conformer were analyzed using the XIAM code. This led to the determination of the barrier heights hindering the internal rotation of the two methyl groups, 239 cm^{-1} and 980 cm^{-1}, respectively. The next energy conformer has a C_{s} structure, and the analysis of the internal splittings of 134 transitions using the XIAM code and the BELGI code led to the determination of an internal rotation barrier height of 186 cm^{-1}. Comparisons of quantum chemistry and experimental results will be discussed. H. Hartwig, H. Dreizler, Z. Naturforsch. 51a, 923 (1996). J. T. Hougen, I. Kleiner and M. Godefroid, J. Mol. Spectrosc., 163, 559-586 (1994).
Reliability in content analysis: The case of semantic feature norms classification.
Bolognesi, Marianna; Pilgram, Roosmaryn; van den Heerik, Romy
2017-12-01
Semantic feature norms (e.g., STIMULUS: car → RESPONSE:
Center for Parallel Optimization
1993-09-30
Approved for public release; distribution unlimited. Thinking Machines Corporation, March 16-19, 1993, "A Branch-and-Bound Method for Mixed Integer Programming on the CM-5"; Dr. Roberto Musmanno, University of
28 CFR 2.14 - Subsequent proceedings.
Code of Federal Regulations, 2011 CFR
2011-07-01
28 Judicial Administration 1 (2011-07-01). Judicial Administration, DEPARTMENT OF JUSTICE, PAROLE, RELEASE, SUPERVISION AND RECOMMITMENT OF PRISONERS, YOUTH OFFENDERS, AND JUVENILE DELINQUENTS, United States Code Prisoners and Parolees, § 2.14 Subsequent...
28 CFR 2.14 - Subsequent proceedings.
Code of Federal Regulations, 2010 CFR
2010-07-01
28 Judicial Administration 1 (2010-07-01). Judicial Administration, DEPARTMENT OF JUSTICE, PAROLE, RELEASE, SUPERVISION AND RECOMMITMENT OF PRISONERS, YOUTH OFFENDERS, AND JUVENILE DELINQUENTS, United States Code Prisoners and Parolees, § 2.14 Subsequent...
28 CFR 2.5 - Sentence aggregation.
Code of Federal Regulations, 2011 CFR
2011-07-01
28 Judicial Administration 1 (2011-07-01). Judicial Administration, DEPARTMENT OF JUSTICE, PAROLE, RELEASE, SUPERVISION AND RECOMMITMENT OF PRISONERS, YOUTH OFFENDERS, AND JUVENILE DELINQUENTS, United States Code Prisoners and Parolees, § 2.5 Sentence...
28 CFR 2.5 - Sentence aggregation.
Code of Federal Regulations, 2010 CFR
2010-07-01
28 Judicial Administration 1 (2010-07-01). Judicial Administration, DEPARTMENT OF JUSTICE, PAROLE, RELEASE, SUPERVISION AND RECOMMITMENT OF PRISONERS, YOUTH OFFENDERS, AND JUVENILE DELINQUENTS, United States Code Prisoners and Parolees, § 2.5 Sentence...
A Framework for Global Electronic Commerce: An Executive Summary.
ERIC Educational Resources Information Center
Office of the Press Secretary of the White House
1997-01-01
An abbreviated version of a longer policy document on electronic commerce released by the Clinton Administration, this article examines principles and recommendations on tariffs, taxes, electronic payment systems, uniform commercial code for electronic commerce, intellectual property protection, privacy, security, telecommunications infrastructure…
32 CFR 518.15 - General provisions.
Code of Federal Regulations, 2010 CFR
2010-07-01
... required for disclosure from a PA system of records, to include the subject's attorney. (4) Release of...; nonjudicial punishment of military personnel under the Uniform Code of Military Justice, Article 15... for action, including the recommendations of the transmitting agency and copies of the requested...
Optical Constant Determination of Bacterial Spores in the MIR
2005-12-05
Approved for public release; distribution unlimited.
Goloborodko, Anton A; Levitsky, Lev I; Ivanov, Mark V; Gorshkov, Mikhail V
2013-02-01
Pyteomics is a cross-platform, open-source Python library providing a rich set of tools for MS-based proteomics. It provides modules for reading LC-MS/MS data, search engine output, protein sequence databases, theoretical prediction of retention times, electrochemical properties of polypeptides, mass and m/z calculations, and sequence parsing. Pyteomics is available under Apache license; release versions are available at the Python Package Index http://pypi.python.org/pyteomics, the source code repository at http://hg.theorchromo.ru/pyteomics, documentation at http://packages.python.org/pyteomics. Pyteomics.biolccc documentation is available at http://packages.python.org/pyteomics.biolccc/. Questions on installation and usage can be addressed to pyteomics mailing list: pyteomics@googlegroups.com.
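The kind of mass and m/z calculation Pyteomics provides can be illustrated in plain Python from standard monoisotopic residue masses. This sketch shows only the underlying arithmetic; it is not the Pyteomics API itself.

```python
# Monoisotopic residue masses (Da) for the 20 standard amino acids,
# plus the mass of water (peptide termini) and a proton (charging).
MONO = {
    "G": 57.02146, "A": 71.03711, "S": 87.03203, "P": 97.05276,
    "V": 99.06841, "T": 101.04768, "C": 103.00919, "L": 113.08406,
    "I": 113.08406, "N": 114.04293, "D": 115.02694, "Q": 128.05858,
    "K": 128.09496, "E": 129.04259, "M": 131.04049, "H": 137.05891,
    "F": 147.06841, "R": 156.10111, "Y": 163.06333, "W": 186.07931,
}
WATER, PROTON = 18.010565, 1.007276

def peptide_mass(seq):
    """Neutral monoisotopic mass of an unmodified peptide."""
    return sum(MONO[aa] for aa in seq) + WATER

def mz(seq, charge=1):
    """m/z of the [M + n*H]^n+ ion."""
    return (peptide_mass(seq) + charge * PROTON) / charge

m = peptide_mass("PEPTIDE")   # ~799.360 Da
```

In practice a library call handles modifications, isotopes and elemental compositions, but the core bookkeeping is this residue-mass summation.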
MSAViewer: interactive JavaScript visualization of multiple sequence alignments.
Yachdav, Guy; Wilzbach, Sebastian; Rauscher, Benedikt; Sheridan, Robert; Sillitoe, Ian; Procter, James; Lewis, Suzanna E; Rost, Burkhard; Goldberg, Tatyana
2016-11-15
The MSAViewer is a quick and easy visualization and analysis JavaScript component for Multiple Sequence Alignment data of any size. Core features include interactive navigation through the alignment, application of popular color schemes, sorting, selecting and filtering. The MSAViewer is 'web ready': written entirely in JavaScript, compatible with modern web browsers and does not require any specialized software. The MSAViewer is part of the BioJS collection of components. The MSAViewer is released as open source software under the Boost Software License 1.0. Documentation, source code and the viewer are available at http://msa.biojs.net/. Supplementary data are available at Bioinformatics online. Contact: msa@bio.sh. © The Author 2016. Published by Oxford University Press.
MSAViewer: interactive JavaScript visualization of multiple sequence alignments
Yachdav, Guy; Wilzbach, Sebastian; Rauscher, Benedikt; Sheridan, Robert; Sillitoe, Ian; Procter, James; Lewis, Suzanna E.; Rost, Burkhard; Goldberg, Tatyana
2016-01-01
Summary: The MSAViewer is a quick and easy visualization and analysis JavaScript component for Multiple Sequence Alignment data of any size. Core features include interactive navigation through the alignment, application of popular color schemes, sorting, selecting and filtering. The MSAViewer is ‘web ready’: written entirely in JavaScript, compatible with modern web browsers and does not require any specialized software. The MSAViewer is part of the BioJS collection of components. Availability and Implementation: The MSAViewer is released as open source software under the Boost Software License 1.0. Documentation, source code and the viewer are available at http://msa.biojs.net/. Supplementary information: Supplementary data are available at Bioinformatics online. Contact: msa@bio.sh PMID:27412096
gadfly: A pandas-based Framework for Analyzing GADGET Simulation Data
NASA Astrophysics Data System (ADS)
Hummel, Jacob A.
2016-11-01
We present the first public release (v0.1) of the open-source gadget Dataframe Library: gadfly. The aim of this package is to leverage the capabilities of the broader python scientific computing ecosystem by providing tools for analyzing simulation data from the astrophysical simulation codes gadget and gizmo using pandas, a thoroughly documented, open-source library providing high-performance, easy-to-use data structures that is quickly becoming the standard for data analysis in python. Gadfly is a framework for analyzing particle-based simulation data stored in the HDF5 format using pandas DataFrames. The package enables efficient memory management, includes utilities for unit handling, coordinate transformations, and parallel batch processing, and provides highly optimized routines for visualizing smoothed-particle hydrodynamics data sets.
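The DataFrame-centric workflow gadfly builds on can be illustrated generically with pandas itself. The column names and synthetic particle data below are made up for illustration and do not reflect gadfly's actual API.

```python
import numpy as np
import pandas as pd

# Synthetic "particle snapshot": positions and masses as DataFrame columns.
rng = np.random.default_rng(0)
n = 1000
particles = pd.DataFrame({
    "x": rng.uniform(-1, 1, n),
    "y": rng.uniform(-1, 1, n),
    "z": rng.uniform(-1, 1, n),
    "mass": np.full(n, 1.0e-3),
})

# Coordinate transformation: derive a radial-distance column...
particles["r"] = np.sqrt(particles.x**2 + particles.y**2 + particles.z**2)

# ...then a radial mass profile via binned aggregation.
bins = pd.cut(particles["r"], bins=10)
profile = particles.groupby(bins, observed=False)["mass"].sum()
total = particles["mass"].sum()
```

Once the snapshot lives in a DataFrame, the whole pandas toolkit (selection, groupby, merging, plotting) applies unchanged, which is the design argument the abstract makes.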
Changes in the prevalence of alcohol in rap music lyrics 1979-2009.
Herd, Denise
2014-02-01
This study examines the prevalence and context of alcohol references in rap music lyrics from 1979 through 2009. Four hundred nine top-ranked rap music songs released during this period were sampled from Billboard magazine rating charts. Songs were analyzed using systematic content analysis and were coded for alcohol beverage types and brand names, drinking behaviors, drinking contexts, attitudes towards alcohol, and consequences of drinking. Trends were analyzed using regression analyses. The results of the study reveal significant increases in the presence of alcohol in rap songs; a decline in negative attitudes towards alcohol; decreases in consequences attributed to alcohol; increases in the association of alcohol with glamour and wealth, drugs, and nightclubs; and increases in references to liquor and champagne.
CREME96 and Related Error Rate Prediction Methods
NASA Technical Reports Server (NTRS)
Adams, James H., Jr.
2012-01-01
Predicting the rate of occurrence of single event effects (SEEs) in space requires knowledge of the radiation environment and the response of electronic devices to that environment. Several analytical models have been developed over the past 36 years to predict SEE rates. The first error rate calculations were performed by Binder, Smith and Holman. Bradford, and Pickel and Blandford in their CRIER (Cosmic-Ray-Induced-Error-Rate) analysis code, introduced the basic Rectangular Parallelepiped (RPP) method for error rate calculations. For the radiation environment at the part, both made use of the cosmic ray LET (Linear Energy Transfer) spectra calculated by Heinrich for various absorber depths. A more detailed model for the space radiation environment within spacecraft was developed by Adams and co-workers. This model, together with a reformulation of the RPP method published by Pickel and Blandford, was used to create the CREME (Cosmic Ray Effects on Micro-Electronics) code. About the same time, Shapiro wrote the CRUP (Cosmic Ray Upset Program) based on the RPP method published by Bradford. It was the first code to specifically take into account charge collection from outside the depletion region due to deformation of the electric field caused by the incident cosmic ray. Other early rate prediction methods and codes include the Single Event Figure of Merit, NOVICE, the Space Radiation code and the effective flux method of Binder, which is the basis of the SEFA (Scott Effective Flux Approximation) model. By the early 1990s it was becoming clear that CREME and the other early models needed revision. This revision, CREME96, was completed and released as a WWW-based tool, one of the first of its kind. The revisions in CREME96 included improved environmental models and improved models for calculating single event effects.
The need for a revision of CREME also stimulated the development of CHIME (CRRES/SPACERAD Heavy Ion Model of the Environment) and MACREE (Modeling and Analysis of Cosmic Ray Effects in Electronics). The Single Event Figure of Merit method was also revised to use the solar minimum galactic cosmic ray spectrum and extended to circular orbits down to 200 km at any inclination. More recently, a series of commercial codes was developed by TRAD (Test & Radiations), including the OMERE code, which calculates single event effects. There are other error rate prediction methods that use Monte Carlo techniques. In this chapter, the analytic methods for estimating the environment within spacecraft will be discussed.
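The RPP method folds the LET spectrum with the chord-length distribution of a rectangular-parallelepiped sensitive volume. As a minimal illustration of one ingredient of that geometry, not the full rate integral, the following sketch computes the mean chord length via Cauchy's formula (mean chord = 4V/S for any convex body):

```python
def mean_chord_rpp(a, b, c):
    """Mean chord length of a rectangular parallelepiped with sides
    a, b, c, from Cauchy's formula: mean chord = 4 * volume / surface."""
    volume = a * b * c
    surface = 2.0 * (a * b + b * c + c * a)
    return 4.0 * volume / surface

# For a cube the mean chord is 4*a**3 / (6*a**2) = 2a/3.
cube_mean_chord = mean_chord_rpp(1.0, 1.0, 1.0)
```

In an actual RPP rate calculation this geometric factor enters alongside the differential chord-length distribution and the device's critical charge; those pieces are beyond the scope of this sketch.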
A new nuclide transport model in soil in the GENII-LIN health physics code
NASA Astrophysics Data System (ADS)
Teodori, F.
2017-11-01
The nuclide soil transfer model originally included in the GENII-LIN software system was intended for residual contamination from long-term activities and from waste form degradation. Short-lived nuclides were assumed to be absent or in equilibrium with their long-lived parents. Here we present an enhanced soil transport model in which the contributions of short-lived nuclides are correctly accounted for. This improvement extends the code's capabilities to handle incidental releases of contaminant to soil, by evaluating exposure from the very beginning of the contamination event, before the radioactive decay chain reaches equilibrium.
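The transient contribution of a short-lived daughter before equilibrium is reached can be made concrete with the two-member Bateman solution. The sketch below (illustrative physics, not GENII-LIN code) computes parent and daughter activities for an initially pure parent and shows the daughter approaching secular equilibrium only after several daughter half-lives:

```python
import math

def bateman_activities(t, lam1, lam2, n1_0):
    """Activities of a parent (1) and its daughter (2) at time t for an
    initially pure parent sample of n1_0 atoms, using the two-member
    Bateman solution (lam1 != lam2; decay constants in 1/time)."""
    n1 = n1_0 * math.exp(-lam1 * t)
    n2 = n1_0 * lam1 / (lam2 - lam1) * (math.exp(-lam1 * t) - math.exp(-lam2 * t))
    return lam1 * n1, lam2 * n2

# Long-lived parent (half-life 30 y) feeding a short-lived daughter
# (half-life 1 d); times in days. After ~10 daughter half-lives the
# daughter activity has essentially reached the parent's (secular
# equilibrium), but at early times it is far below it.
lam_parent = math.log(2.0) / (30.0 * 365.25)
lam_daughter = math.log(2.0) / 1.0
a1, a2 = bateman_activities(10.0, lam_parent, lam_daughter, 1.0e6)
ratio = a2 / a1
```

Evaluating the same expression at, say, t = 0.1 d gives a daughter-to-parent activity ratio far below one, which is exactly the early-time regime the enhanced model must resolve.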
[The Abbreviated Injury Scale (AIS). Options and problems in application].
Haasper, C; Junge, M; Ernstberger, A; Brehme, H; Hannawald, L; Langer, C; Nehmzow, J; Otte, D; Sander, U; Krettek, C; Zwipp, H
2010-05-01
The new AIS (Abbreviated Injury Scale) was released as an update by the AAAM (Association for the Advancement of Automotive Medicine) in 2008. It is a universal scoring system in the field of trauma, applicable in clinical practice and research. In engineering it is used as a classification system for vehicle safety. The AIS can therefore be considered an international, interdisciplinary and universal code of injury severity. This review focuses on a historical overview, potential applications and new coding options in the current version, and also outlines the associated problems.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Carbajo, J.J.
1995-12-31
This study compares results obtained with two U.S. Nuclear Regulatory Commission (NRC)-sponsored codes, MELCOR version 1.8.3 (1.8PQ) and SCDAP/RELAP5 Mod3.1 release C, for the same transient: a low-pressure, short-term station blackout accident at the Browns Ferry nuclear plant. This work is part of MELCOR assessment activities comparing core damage progression calculations of MELCOR against those of SCDAP/RELAP5, since the two codes model core damage progression very differently.
dsmcFoam+: An OpenFOAM based direct simulation Monte Carlo solver
NASA Astrophysics Data System (ADS)
White, C.; Borg, M. K.; Scanlon, T. J.; Longshaw, S. M.; John, B.; Emerson, D. R.; Reese, J. M.
2018-03-01
dsmcFoam+ is a direct simulation Monte Carlo (DSMC) solver for rarefied gas dynamics, implemented within the OpenFOAM software framework and parallelised with MPI. It is open-source and released under the GNU General Public License in a publicly available software repository that includes detailed documentation and tutorial DSMC gas flow cases. This release of the code includes many features not found in standard dsmcFoam, such as molecular vibrational and electronic energy modes, chemical reactions, and subsonic pressure boundary conditions. Since dsmcFoam+ is designed entirely within OpenFOAM's C++ object-oriented framework, it benefits from a number of key features: the code emphasises extensibility and flexibility, so it is intended first and foremost as a research tool for DSMC, allowing new models and test cases to be developed and tested rapidly. Setting up a DSMC case is as straightforward as setting up any standard OpenFOAM case, as dsmcFoam+ relies on the standard OpenFOAM dictionary-based directory structure. This ensures that useful pre- and post-processing capabilities provided by OpenFOAM remain available even though the fully Lagrangian nature of a DSMC simulation is not typical of most OpenFOAM applications. We show that dsmcFoam+ compares well to other well-known DSMC codes and to analytical solutions in terms of benchmark results.
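A core ingredient of any DSMC solver is sampling molecular velocities from the local Maxwellian. As a minimal, code-independent illustration (plain Python, not dsmcFoam+ source), the sketch below draws an equilibrium velocity sample and compares the mean molecular speed against the kinetic-theory value sqrt(8kT/(pi*m)):

```python
import math
import random

random.seed(42)

def sample_maxwellian_speed(kT_over_m):
    """Draw one molecular speed from a Maxwell-Boltzmann distribution:
    each velocity component is Gaussian with variance kT/m."""
    sigma = math.sqrt(kT_over_m)
    vx, vy, vz = (random.gauss(0.0, sigma) for _ in range(3))
    return math.sqrt(vx * vx + vy * vy + vz * vz)

# Mean speed over a large sample, in nondimensional units kT/m = 1,
# versus the analytical mean speed sqrt(8 kT / (pi m)) ~ 1.596.
kT_over_m = 1.0
speeds = [sample_maxwellian_speed(kT_over_m) for _ in range(20000)]
mean_speed = sum(speeds) / len(speeds)
analytical = math.sqrt(8.0 * kT_over_m / math.pi)
```

In a real DSMC code this sampling feeds particle initialisation and boundary injection; collision selection and the time-stepping loop are separate machinery not shown here.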