Sample records for scale code package

  1. GenomeDiagram: a python package for the visualization of large-scale genomic data.

    PubMed

    Pritchard, Leighton; White, Jennifer A; Birch, Paul R J; Toth, Ian K

    2006-03-01

    We present GenomeDiagram, a flexible, open-source Python module for the visualization of large-scale genomic, comparative genomic and other data with reference to a single chromosome or other biological sequence. GenomeDiagram may be used to generate publication-quality vector graphics, rastered images and in-line streamed graphics for webpages. The package integrates with datatypes from the BioPython project, and is available for Windows, Linux and Mac OS X systems. GenomeDiagram is freely available as source code (under GNU Public License) at http://bioinf.scri.ac.uk/lp/programs.html, and requires Python 2.3 or higher, and recent versions of the ReportLab and BioPython packages. A user manual, example code and images are available at http://bioinf.scri.ac.uk/lp/programs.html.
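
    GenomeDiagram was later incorporated into Biopython as Bio.Graphics.GenomeDiagram; the minimal sketch below uses that interface (the input file and track layout are illustrative placeholders, not from the record).

      from reportlab.lib import colors
      from Bio import SeqIO
      from Bio.Graphics import GenomeDiagram

      # Load an annotated sequence (GenBank file name is a placeholder).
      record = SeqIO.read("example.gbk", "genbank")

      diagram = GenomeDiagram.Diagram("Example genome")
      track = diagram.new_track(1, name="CDS features", greytrack=True)
      features = track.new_set()

      # Draw each coding sequence as an arrow sigil, alternating colours.
      for i, feature in enumerate(f for f in record.features if f.type == "CDS"):
          features.add_feature(feature, sigil="ARROW",
                               color=colors.blue if i % 2 else colors.lightblue)

      # Publication-quality vector output, as the abstract describes.
      diagram.draw(format="linear", pagesize="A4", fragments=4)
      diagram.write("example_genome.pdf", "PDF")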

  2. Development of an object-oriented ORIGEN for advanced nuclear fuel modeling applications

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Skutnik, S.; Havloej, F.; Lago, D.

    2013-07-01

    The ORIGEN package serves as the core depletion and decay calculation module within the SCALE code system. A recent major refactoring of the ORIGEN code architecture, undertaken as part of an overall modernization of the SCALE code system, has both greatly enhanced its maintainability and afforded several new capabilities useful for incorporating depletion analysis into other code frameworks. This paper will present an overview of the improved ORIGEN code architecture (including the methods and data structures introduced) as well as current and potential future applications utilizing the new ORIGEN framework. (authors)
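
    ORIGEN's core task, coupled depletion and decay, reduces to a linear ODE system dN/dt = A N; the sketch below solves a toy three-nuclide Bateman chain with a matrix exponential. Nuclides and rates are arbitrary placeholders; this is not ORIGEN's API or data.

      import numpy as np
      from scipy.linalg import expm

      # Decay constants (1/s) for a hypothetical chain A -> B -> C (C stable).
      lam = np.array([1e-3, 5e-4, 0.0])

      # Bateman matrix: each parent's loss term feeds its daughter.
      A = np.array([[-lam[0],  0.0,     0.0],
                    [ lam[0], -lam[1],  0.0],
                    [ 0.0,     lam[1], -lam[2]]])

      N0 = np.array([1.0e20, 0.0, 0.0])   # initial atom inventory
      t = 3600.0                          # one hour of decay

      # The matrix exponential gives the exact solution of dN/dt = A @ N.
      N = expm(A * t) @ N0
      print(dict(zip("ABC", N)))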

  3. Moving from Batch to Field Using the RT3D Reactive Transport Modeling System

    NASA Astrophysics Data System (ADS)

    Clement, T. P.; Gautam, T. R.

    2002-12-01

    The public domain reactive transport code RT3D (Clement, 1997) is a general-purpose numerical code for solving coupled, multi-species reactive transport in saturated groundwater systems. The code uses MODFLOW to simulate flow and several modules of MT3DMS to simulate the advection and dispersion processes. RT3D employs the operator-split strategy, which allows the code to solve the coupled reactive transport problem in a modular fashion. The coupling between reaction and transport is defined through a separate module where the reaction equations are specified. The code supports a versatile user-defined reaction option that allows users to define their own reaction system through a Fortran-90 subroutine, known as the RT3D-reaction package. Further, a utility code, known as BATCHRXN, allows users to independently test and debug their reaction package. To analyze a new reaction system at a batch scale, users should first run BATCHRXN to test the ability of their reaction package to model the batch data. After testing, the reaction package can simply be ported to the RT3D environment to study the model response under 1-, 2-, or 3-dimensional transport conditions. This paper presents example problems that demonstrate the methods for moving from batch to field-scale simulations using the BATCHRXN and RT3D codes. The first example describes a simple first-order reaction system for simulating the sequential degradation of tetrachloroethene (PCE) and its daughter products. The second example uses a relatively complex reaction system for describing the multiple degradation pathways of tetrachloroethane (PCA) and its daughter products. References: 1) Clement, T.P., RT3D - A modular computer code for simulating reactive multi-species transport in 3-Dimensional groundwater aquifers, Battelle Pacific Northwest National Laboratory Research Report, PNNL-SA-28967, September 1997. Available at: http://bioprocess.pnl.gov/rt3d.htm.
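
    The batch-scale step described (BATCHRXN before RT3D) amounts to integrating the reaction ODEs alone; a sketch of the first example's reaction system, sequential first-order PCE degradation, follows. The rate constants are illustrative, and an actual RT3D reaction package would be a Fortran-90 subroutine as noted above.

      import numpy as np
      from scipy.integrate import solve_ivp

      # Illustrative first-order rate constants (1/day), PCE -> TCE -> DCE -> VC.
      k = {"PCE": 0.05, "TCE": 0.03, "DCE": 0.02, "VC": 0.01}

      def rhs(t, c):
          # Each daughter is produced by its parent and decays at its own rate.
          pce, tce, dce, vc = c
          return [-k["PCE"] * pce,
                  k["PCE"] * pce - k["TCE"] * tce,
                  k["TCE"] * tce - k["DCE"] * dce,
                  k["DCE"] * dce - k["VC"] * vc]

      # Batch test: 1 mg/L of PCE, no daughters, simulated for one year.
      sol = solve_ivp(rhs, (0.0, 365.0), [1.0, 0.0, 0.0, 0.0])
      print(sol.y[:, -1])   # final concentrations of PCE, TCE, DCE, VC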

  4. The fastclime Package for Linear Programming and Large-Scale Precision Matrix Estimation in R.

    PubMed

    Pang, Haotian; Liu, Han; Vanderbei, Robert

    2014-02-01

    We develop an R package, fastclime, for solving a family of regularized linear programming (LP) problems. Our package efficiently implements the parametric simplex algorithm, which provides a scalable and sophisticated tool for solving large-scale linear programs. As an illustrative example, one use of our LP solver is to implement an important sparse precision matrix estimation method called CLIME (Constrained L1 Minimization Estimator). Compared with existing packages for this problem such as clime and flare, our package has three advantages: (1) it efficiently calculates the full piecewise-linear regularization path; (2) it provides an accurate dual certificate as a stopping criterion; (3) it is completely coded in C and is highly portable. This package is designed to be useful to statisticians and machine learning researchers for solving a wide range of problems.
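
    fastclime itself is an R package; as a generic illustration of the LP family it targets, the sketch below solves one CLIME column problem (minimize ||b||_1 subject to ||S b - e_j||_inf <= lambda) with scipy's LP solver, rather than with the parametric simplex method the package implements.

      import numpy as np
      from scipy.optimize import linprog

      def clime_column(S, j, lam):
          """Solve min ||b||_1  s.t.  ||S b - e_j||_inf <= lam  as an LP.

          Split b = u - v with u, v >= 0 so the objective becomes linear."""
          p = S.shape[0]
          e = np.zeros(p); e[j] = 1.0
          c = np.ones(2 * p)                 # objective: sum(u) + sum(v)
          M = np.hstack([S, -S])             # S b with b = u - v
          A_ub = np.vstack([M, -M])          # encodes |S b - e| <= lam
          b_ub = np.concatenate([lam + e, lam - e])
          res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=(0, None), method="highs")
          u, v = res.x[:p], res.x[p:]
          return u - v

      rng = np.random.default_rng(0)
      X = rng.normal(size=(200, 5))
      S = np.cov(X, rowvar=False)
      print(clime_column(S, j=0, lam=0.1))   # one column of the precision estimate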

  5. 49 CFR 178.905 - Large Packaging identification codes.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    Title 49 (Transportation), Standards for Packagings, Large Packagings, § 178.905 Large Packaging identification codes: Large Packaging code designations consist of: two numerals specified in paragraph (a) of this section; followed by...

  6. High Resolution Aerospace Applications using the NASA Columbia Supercomputer

    NASA Technical Reports Server (NTRS)

    Mavriplis, Dimitri J.; Aftosmis, Michael J.; Berger, Marsha

    2005-01-01

    This paper focuses on the parallel performance of two high-performance aerodynamic simulation packages on the newly installed NASA Columbia supercomputer. These packages include both a high-fidelity, unstructured, Reynolds-averaged Navier-Stokes solver, and a fully-automated inviscid flow package for cut-cell Cartesian grids. The complementary combination of these two simulation codes enables high-fidelity characterization of aerospace vehicle design performance over the entire flight envelope through extensive parametric analysis and detailed simulation of critical regions of the flight envelope. Both packages are industrial-level codes designed for complex geometry and incorporate customized multigrid solution algorithms. The performance of these codes on Columbia is examined using both MPI and OpenMP and using both the NUMAlink and InfiniBand interconnect fabrics. Numerical results demonstrate good scalability on up to 2016 CPUs using the NUMAlink4 interconnect, with measured computational rates in the vicinity of 3 TFLOP/s, while InfiniBand showed some performance degradation at high CPU counts, particularly with multigrid. Nonetheless, the results are encouraging enough to indicate that larger test cases using combined MPI/OpenMP communication should scale well on even more processors.

  7. Eddylicious: A Python package for turbulent inflow generation

    NASA Astrophysics Data System (ADS)

    Mukha, Timofey; Liefvendahl, Mattias

    2018-01-01

    A Python package for generating inflow for scale-resolving computer simulations of turbulent flow is presented. The purpose of the package is to unite existing inflow generation methods in a single code-base and make them accessible to users of various Computational Fluid Dynamics (CFD) solvers. The currently existing functionality consists of an accurate inflow generation method suitable for flows with a turbulent boundary layer inflow and input/output routines for coupling with the open-source CFD solver OpenFOAM.
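
    Not the Eddylicious API: the sketch below only illustrates the kind of artifact an inflow generator produces, namely a velocity plane with a prescribed mean profile plus fluctuations, which a CFD solver then reads at its inlet. All profile parameters are generic textbook values.

      import numpy as np

      # Illustrative only: one inflow plane with a log-law mean profile plus
      # Gaussian noise. Real generators (e.g. the rescaling method the package
      # implements) impose physically correct spectra and correlations.
      ny, nz = 64, 64
      u_tau, nu, kappa, B = 0.05, 1.5e-5, 0.41, 5.2

      y = np.linspace(1e-4, 0.1, ny)              # wall-normal coordinate (m)
      y_plus = y * u_tau / nu
      u_mean = u_tau * (np.log(y_plus) / kappa + B)

      rng = np.random.default_rng(1)
      fluct = 0.1 * u_tau * rng.standard_normal((ny, nz))

      inflow_plane = u_mean[:, None] + fluct      # streamwise velocity (ny, nz)
      np.save("inflow_plane.npy", inflow_plane)   # e.g. for a solver to read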

  8. SCALE: A modular code system for performing Standardized Computer Analyses for Licensing Evaluation. Volume 1, Part 2: Control modules S1--H1; Revision 5

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    NONE

    SCALE--a modular code system for Standardized Computer Analyses Licensing Evaluation--has been developed by Oak Ridge National Laboratory at the request of the US Nuclear Regulatory Commission. The SCALE system utilizes well-established computer codes and methods within standard analysis sequences that (1) allow an input format designed for the occasional user and/or novice, (2) automate the data processing and coupling between modules, and (3) provide accurate and reliable results. System development has been directed at problem-dependent cross-section processing and analysis of criticality safety, shielding, heat transfer, and depletion/decay problems. Since the initial release of SCALE in 1980, the code system has been heavily used for evaluation of nuclear fuel facility and package designs. This revision documents Version 4.3 of the system.

  9. Salvus: A flexible open-source package for waveform modelling and inversion from laboratory to global scales

    NASA Astrophysics Data System (ADS)

    Afanasiev, M.; Boehm, C.; van Driel, M.; Krischer, L.; May, D.; Rietmann, M.; Fichtner, A.

    2016-12-01

    Recent years have been witness to the application of waveform inversion to new and exciting domains, ranging from non-destructive testing to global seismology. Often, each new application brings with it novel wave propagation physics, spatial and temporal discretizations, and models of variable complexity. Adapting existing software to these novel applications often requires a significant investment of time, and acts as a barrier to progress. To combat these problems we introduce Salvus, a software package designed to solve large-scale full-waveform inverse problems, with a focus on both flexibility and performance. Based on a high order finite (spectral) element discretization, we have built Salvus to work on unstructured quad/hex meshes in both 2 and 3 dimensions, with support for P1-P3 bases on triangles and tetrahedra. A diverse (and expanding) collection of wave propagation physics is supported (e.g. coupled solid-fluid). With a focus on the inverse problem, functionality is provided to ease integration with internal and external optimization libraries. Additionally, a Python-based meshing package is included to simplify the generation and manipulation of regional to global scale Earth models (quad/hex), with interfaces available to external mesh generators for complex engineering-scale applications (quad/hex/tri/tet). Finally, to ensure that the code remains accurate and maintainable, we build upon software libraries such as PETSc and Eigen, and follow modern software design and testing protocols. Salvus bridges the gap between research and production codes with a design based on C++ mixins and Python wrappers that separates the physical equations from the numerical core. This allows domain scientists to add new equations using a high-level interface, without having to worry about optimized implementation details. Our goal in this presentation is to introduce the code, show several examples across the scales, and discuss some of the extensible design points.

  10. Salvus: A flexible high-performance and open-source package for waveform modelling and inversion from laboratory to global scales

    NASA Astrophysics Data System (ADS)

    Afanasiev, Michael; Boehm, Christian; van Driel, Martin; Krischer, Lion; May, Dave; Rietmann, Max; Fichtner, Andreas

    2017-04-01

    Recent years have been witness to the application of waveform inversion to new and exciting domains, ranging from non-destructive testing to global seismology. Often, each new application brings with it novel wave propagation physics, spatial and temporal discretizations, and models of variable complexity. Adapting existing software to these novel applications often requires a significant investment of time, and acts as a barrier to progress. To combat these problems we introduce Salvus, a software package designed to solve large-scale full-waveform inverse problems, with a focus on both flexibility and performance. Currently based on an abstract implementation of high order finite (spectral) elements, we have built Salvus to work on unstructured quad/hex meshes in both 2 and 3 dimensions, with support for P1-P3 bases on triangles and tetrahedra. A diverse (and expanding) collection of wave propagation physics is supported (e.g. viscoelastic, coupled solid-fluid). With a focus on the inverse problem, functionality is provided to ease integration with internal and external optimization libraries. Additionally, a Python-based meshing package is included to simplify the generation and manipulation of regional to global scale Earth models (quad/hex), with interfaces available to external mesh generators for complex engineering-scale applications (quad/hex/tri/tet). Finally, to ensure that the code remains accurate and maintainable, we build upon software libraries such as PETSc and Eigen, and follow modern software design and testing protocols. Salvus bridges the gap between research and production codes with a design based on C++ template mixins and Python wrappers that separates the physical equations from the numerical core. This allows domain scientists to add new equations using a high-level interface, without having to worry about optimized implementation details. Our goal in this presentation is to introduce the code, show several examples across the scales, and discuss some of the extensible design points.
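
    The mixin design described for Salvus (physics separated from the numerical core) can be sketched in Python; all class names below are hypothetical, and the real code is C++ template mixins with Python wrappers.

      import numpy as np

      class NumericalCore:
          """Numerical core: generic explicit time stepping on a state vector."""
          def __init__(self, u0, dt):
              self.u, self.dt = np.asarray(u0, float), dt

          def step(self):
              # rhs() is supplied by whichever physics mixin is combined in.
              self.u = self.u + self.dt * self.rhs(self.u)

      class AcousticMixin:
          """Physics mixin: supplies only the equation-specific terms."""
          c = 1.0
          def rhs(self, u):
              # Toy 1-D wave system in first-order form, finite differences.
              p, v = np.split(u, 2)
              dp = np.gradient(v)
              dv = self.c ** 2 * np.gradient(p)
              return np.concatenate([dp, dv])

      class AcousticSolver(AcousticMixin, NumericalCore):
          """Domain scientists combine a physics mixin with the shared core."""

      solver = AcousticSolver(np.zeros(200), dt=1e-3)
      solver.step()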

  11. Large-scale 3D simulations of ICF and HEDP targets

    NASA Astrophysics Data System (ADS)

    Marinak, Michael M.

    2000-10-01

    The radiation hydrodynamics code HYDRA continues to be developed and applied to 3D simulations of a variety of targets for both inertial confinement fusion (ICF) and high energy density physics. Several packages have been added enabling this code to perform ICF target simulations with accuracy similar to that of the two-dimensional codes in long-time historical use. These include a laser ray trace and deposition package, a heavy ion deposition package, implicit Monte Carlo photonics, and non-LTE opacities, derived from XSN or the linearized response matrix approach (R. More, T. Kato, Phys. Rev. Lett. 81, 814 (1998); S. Libby, F. Graziani, R. More, T. Kato, Proceedings of the 13th International Conference on Laser Interactions and Related Plasma Phenomena, AIP, New York, 1997). LTE opacities can also be calculated for arbitrary mixtures online by combining tabular values generated by different opacity codes. Thermonuclear burn, charged particle transport, neutron energy deposition, electron-ion coupling and conduction, and multigroup radiation diffusion packages are also installed. HYDRA can employ ALE hydrodynamics; a number of grid motion algorithms are available. Multi-material flows are resolved using material interface reconstruction. Results from large-scale simulations run on up to 1680 processors, using a combination of massively parallel processing and symmetric multiprocessing, will be described. A large solid angle simulation of Rayleigh-Taylor instability growth in a NIF ignition capsule has resolved simultaneously the full spectrum of the most dangerous modes that grow from surface roughness. Simulations of a NIF hohlraum illuminated with the initial 96 beam configuration have also been performed. The effect of the hohlraum's 3D intrinsic drive asymmetry on the capsule implosion will be considered. We will also discuss results from a Nova experiment in which a copper sphere is crushed by a planar shock. Several interacting hydrodynamic instabilities, including the Widnall instability, cause breakup of the resulting vortex ring.

  12. tran-SAS v1.0: a numerical model to compute catchment-scale hydrologic transport using StorAge Selection functions

    NASA Astrophysics Data System (ADS)

    Benettin, Paolo; Bertuzzo, Enrico

    2018-04-01

    This paper presents the tran-SAS package, which includes a set of codes to model solute transport and water residence times through a hydrological system. The model is based on a catchment-scale approach that aims at reproducing the integrated response of the system at one of its outlets. The codes are implemented in MATLAB and are meant to be easy to edit, so that users with minimal programming knowledge can adapt them to the desired application. The problem of large-scale solute transport has both theoretical and practical implications. On the one hand, the ability to represent the ensemble of water flow trajectories through a heterogeneous system helps unravel streamflow generation processes and allows us to make inferences on plant-water interactions. On the other hand, transport models are a practical tool that can be used to estimate the persistence of solutes in the environment. The core of the package is based on the implementation of an age master equation (ME), which is solved using general StorAge Selection (SAS) functions. The age ME is first converted into a set of ordinary differential equations, each addressing the transport of an individual precipitation input through the catchment, and then it is discretized using an explicit numerical scheme. Results show that the implementation is efficient and allows the model to run in a short time. The numerical accuracy is critically evaluated and is shown to be satisfactory in most cases of hydrologic interest. Additionally, a higher-order implementation is provided within the package to evaluate and, if necessary, to improve the numerical accuracy of the results. The codes can be used to model streamflow age and solute concentration, but a number of additional outputs can be obtained by editing the codes to further advance the ability to understand and model catchment transport processes.
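
    The explicit scheme described can be sketched as parcel bookkeeping: each step adds the new precipitation as the youngest parcel and draws discharge across parcels according to a SAS function. A simple power-law SAS is assumed here for brevity; tran-SAS itself is MATLAB and supports general SAS functions.

      import numpy as np

      def step_sas(parcels, J, Q, dt, k=0.7):
          """One explicit step of parcel-based transport with a power-law SAS
          function Omega(P) = P**k (k < 1 preferentially releases young water).

          parcels: array of water volumes, ordered young -> old."""
          parcels = np.concatenate([[J * dt], parcels])  # new water enters storage
          S = parcels.sum()
          P = np.cumsum(parcels) / S                     # cumulative storage fraction
          omega = np.diff(np.concatenate([[0.0], P ** k]))
          parcels = parcels - Q * dt * omega             # discharge drawn by age
          return np.clip(parcels, 0.0, None)

      # Spin up with constant fluxes; parcel index doubles as age in steps.
      parcels = np.array([10.0])
      for _ in range(100):
          parcels = step_sas(parcels, J=1.0, Q=1.0, dt=1.0)
      print("storage:", parcels.sum(),
            "young-water fraction:", parcels[:10].sum() / parcels.sum())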

  13. Development of a SCALE Tool for Continuous-Energy Eigenvalue Sensitivity Coefficient Calculations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Perfetti, Christopher M; Rearden, Bradley T

    2013-01-01

    Two methods for calculating eigenvalue sensitivity coefficients in continuous-energy Monte Carlo applications were implemented in the KENO code within the SCALE code package. The methods were used to calculate sensitivity coefficients for several criticality safety problems and produced sensitivity coefficients that agreed well with both reference sensitivities and multigroup TSUNAMI-3D sensitivity coefficients. The newly developed CLUTCH method was observed to produce sensitivity coefficients with high figures of merit and low memory requirements, and both continuous-energy sensitivity methods met or exceeded the accuracy of the multigroup TSUNAMI-3D calculations.

  14. Characterization of open-cycle coal-fired MHD generators. Quarterly technical summary report No. 6, October 1--December 31, 1977. [PACKAGE code]

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kolb, C.E.; Yousefian, V.; Wormhoudt, J.

    1978-01-30

    Research has included theoretical modeling of important plasma chemical effects such as: conductivity reductions due to condensed slag/electron interactions; conductivity and generator efficiency reductions due to the formation of slag-related negative ion species; and the loss of alkali seed due to chemical combination with condensed slag. A summary of the major conclusions in each of these areas is presented. A major output of the modeling effort has been the development of an MHD plasma chemistry core flow model. This model has been formulated into a computer program designated the PACKAGE code (Plasma Analysis, Chemical Kinetics, And Generator Efficiency). The PACKAGE code is designed to calculate the effect of coal rank, ash percentage, ash composition, air preheat temperatures, equivalence ratio, and various generator channel parameters on the overall efficiency of open-cycle, coal-fired MHD generators. A complete description of the PACKAGE code and a preliminary version of the PACKAGE user's manual are included. A laboratory measurements program involving direct, mass spectrometric sampling of the positive and negative ions formed in a one-atmosphere coal combustion plasma was also completed during the contract's initial phase. The relative ion concentrations formed in a plasma due to the methane-augmented combustion of pulverized Montana Rosebud coal with potassium carbonate seed and preheated air are summarized. Positive ions measured include K⁺, KO⁺, Na⁺, Rb⁺, Cs⁺, and CsO⁺, while negative ions identified include PO₃⁻, PO₂⁻, BO₂⁻, OH⁻, SH⁻, and probably HCrO₃⁻, HMoO₄⁻, and HWO₃⁻. Comparisons of the measurements with PACKAGE code predictions are presented. Preliminary design considerations for a mass spectrometric sampling probe capable of characterizing coal combustion plasmas from full-scale combustors and flow trains are presented and discussed.

  15. SCALE: A modular code system for performing standardized computer analyses for licensing evaluation. Functional modules F1--F8 -- Volume 2, Part 1, Revision 4

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Greene, N.M.; Petrie, L.M.; Westfall, R.M.

    SCALE--a modular code system for Standardized Computer Analyses Licensing Evaluation--has been developed by Oak Ridge National Laboratory at the request of the US Nuclear Regulatory Commission. The SCALE system utilizes well-established computer codes and methods within standard analysis sequences that (1) allow an input format designed for the occasional user and/or novice, (2) automate the data processing and coupling between modules, and (3) provide accurate and reliable results. System development has been directed at problem-dependent cross-section processing and analysis of criticality safety, shielding, heat transfer, and depletion/decay problems. Since the initial release of SCALE in 1980, the code system has been heavily used for evaluation of nuclear fuel facility and package designs. This revision documents Version 4.2 of the system. The manual is divided into three volumes: Volume 1--for the control module documentation; Volume 2--for functional module documentation; and Volume 3--for documentation of the data libraries and subroutine libraries.

  16. Implementation of radiation shielding calculation methods. Volume 1: Synopsis of methods and summary of results

    NASA Technical Reports Server (NTRS)

    Capo, M. A.; Disney, R. K.

    1971-01-01

    The work performed in the following areas is summarized: (1) A realistic nuclear-propelled vehicle was analyzed using the Marshall Space Flight Center computer code package. This code package includes one- and two-dimensional discrete ordinates transport, point kernel, and single-scatter techniques, as well as cross-section preparation and data processing codes. (2) Techniques were developed to improve the automated data transfer in the coupled computation method of the computer code package and to improve the utilization of this code package on the Univac-1108 computer system. (3) The MSFC master data libraries were updated.

  17. A QR code identification technology in package auto-sorting system

    NASA Astrophysics Data System (ADS)

    di, Yi-Juan; Shi, Jian-Ping; Mao, Guo-Yong

    2017-07-01

    Traditional manual sorting cannot keep pace with the development of Chinese logistics. To sort packages more effectively, a QR code recognition technology is proposed to identify the QR code labels on packages in a package auto-sorting system. The experimental results, compared with other algorithms in the literature, demonstrate that the proposed method is valid and its performance is superior to the other algorithms.
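
    The paper's own recognition algorithm is not reproduced in the record; as a generic illustration, the decoding step of such a pipeline can be done with OpenCV's built-in QR detector (image file name is a placeholder).

      import cv2

      # Decode the QR label on a package image using OpenCV's built-in
      # detector; this illustrates only the decoding step of an
      # auto-sorting pipeline, not the paper's proposed algorithm.
      img = cv2.imread("package.jpg")
      detector = cv2.QRCodeDetector()
      data, points, _ = detector.detectAndDecode(img)

      if data:
          print("route package to:", data)   # e.g. a destination code
      else:
          print("no QR code found; divert to manual sorting")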

  18. Translation of one high-level language to another: COBOL to ADA, an example

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hill, J.A.

    1986-01-01

    This dissertation discusses the difficulties encountered in, and explores possible solutions to, the task of automatically converting programs written in one HLL, COBOL, into programs written in another HLL, Ada, while still maintaining readability. This paper presents at least one set of techniques and algorithms to solve many of the problems that were encountered. The differing views of records are handled by isolating those instances where they are a problem, then using the RENAMES option of Ada. Several solutions for the decimal-arithmetic translation are discussed. One method used is to emulate COBOL arithmetic in an arithmetic package. Another partial solution suggested is to convert the values to decimal-scaled integers and use modular arithmetic. Conversion to fixed-point type and conversion to floating-point type are the third and fourth methods. The work of another researcher, Bobby Othmer, is utilized to correct any unstructured code, to remap statements not directly translatable such as ALTER, and to pull together isolated code sections. Algorithms are then presented to convert this restructured COBOL code into Ada code with local variables, parameters, and packages. The input/output requirements are partially met by mapping them to a series of procedure calls that interface with Ada's standard input-output package. Several examples are given of hand translations of COBOL programs. In addition, a possibly new method is shown for measuring the readability of programs.
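
    The decimal-scaled-integer strategy mentioned above can be sketched directly: a COBOL PIC 9(5)V99 value is held as an integer count of hundredths, so arithmetic stays exact. Python stands in here for the generated Ada.

      # Sketch of the "decimal-scaled integer" translation strategy: a COBOL
      # PIC 9(5)V99 field is held as an integer number of hundredths, so
      # addition and comparison avoid floating-point rounding entirely.
      SCALE = 100  # two implied decimal places (V99)

      def to_scaled(text):          # "123.45" -> 12345
          whole, _, frac = text.partition(".")
          return int(whole) * SCALE + int(frac.ljust(2, "0")[:2])

      def add(a, b):                # exact integer addition
          return a + b

      def show(x):                  # 12345 -> "123.45"
          return f"{x // SCALE}.{x % SCALE:02d}"

      price = to_scaled("123.45")
      tax = to_scaled("7.01")
      print(show(add(price, tax)))  # 130.46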

  19. Optimization and Control of Burning Plasmas Through High Performance Computing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pankin, Alexei

    This project has revived the FACETS code, which was developed under SciDAC funding in 2008-2012. The code had been dormant for a number of years after the SciDAC funding stopped. FACETS depends on external packages. The external packages and libraries such as PETSc, FFTW, HDF5 and NETCDF that are included in FACETS have evolved during these years. Some packages in FACETS are also parts of other codes such as PlasmaState, NUBEAM, GACODES, and UEDGE. These packages have also evolved together with their host codes, which include TRANSP, TGYRO and XPTOR. Finally, there is also a set of packages in FACETS that are being developed and maintained by Tech-X. These packages include BILDER, SciMake, and FcioWrappers. Many of these packages evolved significantly during the last several years, and FACETS had to be updated to synchronize with the recent progress in the external packages. The PI has introduced new changes to the BILDER package to support the updated interfaces to the external modules. During the last year of the project, the FACETS version of the UEDGE code was extracted from FACETS as a standalone package. The PI collaborates with scientists from LLNL on the updated UEDGE model in FACETS. Drs. T. Rognlien, M. Umansky and A. Dimits from LLNL are contributing to this task.

  20. Radio controlled release apparatus for animal data acquisition devices

    DOEpatents

    Stamps, James Frederick

    2000-01-01

    A novel apparatus for reliably and selectively releasing a data acquisition package from an animal for recovery. The data package comprises two parts: 1) an animal data acquisition device and 2) a co-located release apparatus. In one embodiment, which is useful for land animals, the release apparatus includes two major components: 1) an electronics package, comprising a receiver; a decoder comparator, having a plurality of individually selectable codes; and an actuator circuit, and 2) a release device, which can be a mechanical device, which acts to release the data package from the animal. To release a data package from a particular animal, a radio transmitter sends a coded signal which is decoded to determine if the code is valid for that animal data package. Having received a valid code, the release device is activated to release the data package from the animal for subsequent recovery. A second embodiment includes flotation means and is useful for releasing animal data acquisition devices attached to sea animals. This embodiment further provides for releasing a data package underwater by employing an acoustic signal.

  1. Efficient population-scale variant analysis and prioritization with VAPr.

    PubMed

    Birmingham, Amanda; Mark, Adam M; Mazzaferro, Carlo; Xu, Guorong; Fisch, Kathleen M

    2018-04-06

    With the growing availability of population-scale whole-exome and whole-genome sequencing, demand for reproducible, scalable variant analysis has spread within genomic research communities. To address this need, we introduce the Python package VAPr (Variant Analysis and Prioritization). VAPr leverages existing annotation tools ANNOVAR and MyVariant.info with MongoDB-based flexible storage and filtering functionality. It offers biologists and bioinformatics generalists easy-to-use and scalable analysis and prioritization of genomic variants from large cohort studies. VAPr is developed in Python and is available for free use and extension under the MIT License. An install package is available on PyPi at https://pypi.python.org/pypi/VAPr, while source code and extensive documentation are on GitHub at https://github.com/ucsd-ccbb/VAPr. kfisch@ucsd.edu.
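
    VAPr builds on the MyVariant.info service; the sketch below queries that service directly through the separate myvariant Python client (not VAPr's own API) to show the kind of annotation lookup involved. The variant identifier and fields are arbitrary examples.

      import myvariant

      # VAPr leverages MyVariant.info for annotation; the standalone
      # myvariant client illustrates the kind of lookup involved.
      mv = myvariant.MyVariantInfo()
      anno = mv.getvariant("chr7:g.140453134T>C", fields="cadd,dbsnp.rsid")
      print(anno)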

  2. InterProScan 5: genome-scale protein function classification

    PubMed Central

    Jones, Philip; Binns, David; Chang, Hsin-Yu; Fraser, Matthew; Li, Weizhong; McAnulla, Craig; McWilliam, Hamish; Maslen, John; Mitchell, Alex; Nuka, Gift; Pesseat, Sebastien; Quinn, Antony F.; Sangrador-Vegas, Amaia; Scheremetjew, Maxim; Yong, Siew-Yit; Lopez, Rodrigo; Hunter, Sarah

    2014-01-01

    Motivation: Robust large-scale sequence analysis is a major challenge in modern genomic science, where biologists are frequently trying to characterize many millions of sequences. Here, we describe a new Java-based architecture for the widely used protein function prediction software package InterProScan. Developments include improvements and additions to the outputs of the software and the complete reimplementation of the software framework, resulting in a flexible and stable system that is able to use both multiprocessor machines and/or conventional clusters to achieve scalable distributed data analysis. InterProScan is freely available for download from the EMBL-EBI FTP site and the open source code is hosted at Google Code. Availability and implementation: InterProScan is distributed via FTP at ftp://ftp.ebi.ac.uk/pub/software/unix/iprscan/5/ and the source code is available from http://code.google.com/p/interproscan/. Contact: http://www.ebi.ac.uk/support or interhelp@ebi.ac.uk or mitchell@ebi.ac.uk PMID:24451626

  3. Programming with BIG data in R: Scaling analytics from one to thousands of nodes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Schmidt, Drew; Chen, Wei-Chen; Matheson, Michael A.

    Here, we present a tutorial overview showing how one can achieve scalable performance with R. We do so by utilizing several package extensions, including those from the pbdR project. These packages consist of high performance, high-level interfaces to and extensions of MPI, PBLAS, ScaLAPACK, I/O libraries, profiling libraries, and more. While these libraries shine brightest on large distributed platforms, they also work rather well on small clusters and often, surprisingly, even on a laptop with only two cores. Our tutorial begins with recommendations on how to get more performance out of your R code before considering parallel implementations. Because R is a high-level language, a function can have a deep hierarchy of operations. For big data, this can easily lead to inefficiency. Profiling is an important tool to understand the performance of an R code for both serial and parallel improvements.

  4. Programming with BIG data in R: Scaling analytics from one to thousands of nodes

    DOE PAGES

    Schmidt, Drew; Chen, Wei-Chen; Matheson, Michael A.; ...

    2016-11-09

    Here, we present a tutorial overview showing how one can achieve scalable performance with R. We do so by utilizing several package extensions, including those from the pbdR project. These packages consist of high performance, high-level interfaces to and extensions of MPI, PBLAS, ScaLAPACK, I/O libraries, profiling libraries, and more. While these libraries shine brightest on large distributed platforms, they also work rather well on small clusters and often, surprisingly, even on a laptop with only two cores. Our tutorial begins with recommendations on how to get more performance out of your R code before considering parallel implementations. Because R is a high-level language, a function can have a deep hierarchy of operations. For big data, this can easily lead to inefficiency. Profiling is an important tool to understand the performance of an R code for both serial and parallel improvements.

  5. Design Aspects of the Rayleigh Convection Code

    NASA Astrophysics Data System (ADS)

    Featherstone, N. A.

    2017-12-01

    Understanding the long-term generation of planetary or stellar magnetic fields requires complementary knowledge of the large-scale fluid dynamics pervading large fractions of the object's interior. Such large-scale motions are sensitive to the system's geometry which, in planets and stars, is spherical to a good approximation. As a result, computational models designed to study such systems often solve the MHD equations in spherical geometry, frequently employing a spectral approach involving spherical harmonics. We present computational and user-interface design aspects of one such modeling tool, the Rayleigh convection code, which is suitable for deployment on desktop and petascale HPC architectures alike. In this poster, we will present an overview of this code's parallel design and its built-in diagnostics-output package. Rayleigh has been developed with NSF support through the Computational Infrastructure for Geodynamics and is expected to be released as open-source software in winter 2017/2018.

  6. PSRPOPPy: an open-source package for pulsar population simulations

    NASA Astrophysics Data System (ADS)

    Bates, S. D.; Lorimer, D. R.; Rane, A.; Swiggum, J.

    2014-04-01

    We have produced a new software package for the simulation of pulsar populations, PSRPOPPY, based on the PSRPOP package. The codebase has been re-written in Python (save for some external libraries, which remain in their native Fortran), utilizing the object-oriented features of the language, and improving the modularity of the code. Pre-written scripts are provided for running the simulations in `standard' modes of operation, but the code is flexible enough to support the writing of personalised scripts. The modular structure also makes the addition of experimental features (such as new models for period or luminosity distributions) more straightforward than with the previous code. We also discuss potential additions to the modelling capabilities of the software. Finally, we demonstrate some potential applications of the code; first, using results of surveys at different observing frequencies, we find pulsar spectral indices are best fitted by a normal distribution with mean -1.4 and standard deviation 1.0. Secondly, we model pulsar spin evolution to calculate the best fit for a relationship between a pulsar's luminosity and spin parameters. We used the code to replicate the analysis of Faucher-Giguère & Kaspi, and have subsequently optimized their power-law dependence of radio luminosity, L, with period, P, and period derivative, Ṗ. We find that the underlying population is best described by L ∝ P^(-1.39±0.09) Ṗ^(0.48±0.04) and is very similar to that found for γ-ray pulsars by Perera et al. Using this relationship, we generate a model population and examine the age-luminosity relation for the entire pulsar population, which may be measurable after future large-scale surveys with the Square Kilometre Array.
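
    The fitted distributions quoted in the abstract can be used directly to draw a toy population; the sketch below is plain numpy, not the PSRPOPPY API, and the spin-parameter ranges are arbitrary placeholders.

      import numpy as np

      rng = np.random.default_rng(42)
      n = 10000

      # Spectral indices: the best-fit normal distribution from the abstract.
      spectral_index = rng.normal(-1.4, 1.0, n)

      # Illustrative spin parameters (log-uniform draws; the real population
      # model in PSRPOPPY is more sophisticated).
      P = 10 ** rng.uniform(-3, 1, n)          # period (s)
      Pdot = 10 ** rng.uniform(-18, -12, n)    # period derivative

      # Fitted luminosity law from the abstract: L ∝ P^-1.39 Pdot^0.48.
      L = P ** -1.39 * Pdot ** 0.48
      print("median pseudo-luminosity (arbitrary units):", np.median(L))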

  7. SC'11 Poster: A Highly Efficient MGPT Implementation for LAMMPS; with Strong Scaling

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Oppelstrup, T; Stukowski, A; Marian, J

    2011-12-07

    The MGPT potential has been implemented as a drop-in package for the general molecular dynamics code LAMMPS. We implement an improved communication scheme that shrinks the communication layer thickness and increases the load balancing. This results in unprecedented strong scaling, with speedup continuing beyond 1/8 atom/core. In addition, we have optimized the small-matrix linear algebra with generic blocking (for all processors) and specific SIMD intrinsics for vectorization on Intel, AMD, and BlueGene CPUs.

  8. The Cloud Feedback Model Intercomparison Project Observational Simulator Package: Version 2

    NASA Astrophysics Data System (ADS)

    Swales, Dustin J.; Pincus, Robert; Bodas-Salcedo, Alejandro

    2018-01-01

    The Cloud Feedback Model Intercomparison Project Observational Simulator Package (COSP) gathers together a collection of observation proxies or satellite simulators that translate model-simulated cloud properties to synthetic observations as would be obtained by a range of satellite observing systems. This paper introduces COSP2, an evolution focusing on more explicit and consistent separation between host model, coupling infrastructure, and individual observing proxies. Revisions also enhance flexibility by allowing for model-specific representation of sub-grid-scale cloudiness, provide greater clarity by clearly separating tasks, support greater use of shared code and data including shared inputs across simulators, and follow more uniform software standards to simplify implementation across a wide range of platforms. The complete package including a testing suite is freely available.

  9. Computational thermochemistry: Automated generation of scale factors for vibrational frequencies calculated by electronic structure model chemistries

    NASA Astrophysics Data System (ADS)

    Yu, Haoyu S.; Fiedler, Lucas J.; Alecu, I. M.; Truhlar, Donald G.

    2017-01-01

    We present a Python program, FREQ, for calculating the optimal scale factors for harmonic vibrational frequencies, fundamental vibrational frequencies, and zero-point vibrational energies obtained from electronic structure calculations. The program utilizes a previously published scale factor optimization model (Alecu et al., 2010) to efficiently obtain all three scale factors from a set of computed vibrational harmonic frequencies. In order to obtain the three scale factors, the user only needs to provide zero-point energies of 15 or 6 selected molecules. If the user has access to the Gaussian 09 or Gaussian 03 program, we provide the option to run the program by entering the keywords for a certain method and basis set in the Gaussian 09 or Gaussian 03 program. Four other Python programs, input.py, input6, pbs.py, and pbs6.py, are also provided for generating Gaussian 09 or Gaussian 03 input and PBS files. The program can also be used with data from any other electronic structure package. A manual describing how to use this program is included in the code package.
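
    In the scale-factor optimization model cited (Alecu et al., 2010), the fit reduces to a one-parameter least squares; the sketch below computes the factor lambda minimizing sum((lambda*omega - theta)**2), i.e. lambda = sum(omega*theta)/sum(omega**2). The numbers are toy values, not real ZPE data, and this is not the FREQ program itself.

      import numpy as np

      def optimal_scale_factor(computed, reference):
          """Least-squares scale factor: minimizes sum((lam*computed - reference)**2),
          which gives lam = sum(computed*reference) / sum(computed**2)."""
          computed, reference = np.asarray(computed), np.asarray(reference)
          return float(computed @ reference / (computed @ computed))

      # Toy numbers only: computed harmonic ZPEs vs. benchmark values (kcal/mol).
      computed = [64.1, 31.9, 13.5, 89.7]
      reference = [62.9, 31.2, 13.1, 87.8]
      lam = optimal_scale_factor(computed, reference)
      print(f"ZPE scale factor: {lam:.4f}")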

  10. Toddler foods, children's foods: assessing sodium in packaged supermarket foods targeted at children.

    PubMed

    Elliott, Charlene D; Conlon, Martin J

    2011-03-01

    To critically examine child-oriented packaged food products sold in Canada for their sodium content, and to assess them in light of intake recommendations, the current policy context and suggested targets. Baby/toddler foods (n = 186) and child-oriented packaged foods (n = 354) were coded for various attributes (including sodium). Summary statistics were created for sodium, then the children's food products were compared with the UK Food Standards Agency (FSA) 'targets' for sodium in packaged foods. The products' per-serving sodium levels were also assessed in light of the US Institute of Medicine's dietary reference intakes and Canada's Food Guide. Setting: Calgary, Alberta, Canada. Subjects: None. Twenty per cent of products could be classified as having high sodium levels. Certain sub-categories of food (i.e. toddler entrées, children's packaged lunches, soups and canned pastas) were problematic. Significantly, when scaled according to Schedule M or viewed in light of the serving sizes on the Nutrition Facts table, the sodium level in various dry goods products generally fell within, and below, the Adequate Intake (AI)/Tolerable Upper Intake Level (UL) band for sodium. When scaled in accordance with the UK FSA targets, however, none of the (same) products met the targets. In light of AI/UL thresholds based on age and per-serving cut-offs, packaged foodstuffs for youngsters fare relatively well, with the exception of some problematic areas. 'Stealth sodium' and 'subtle sodium' are important considerations; so is use of the FSA's scaling method to evaluate sodium content, because it is highly sensitive to the difference between the reference amount and the actual real-world serving size for the product being considered.

  11. Corrigendum to "Nearest neighbor imputation of species-level, plot-scale forest structure attributes from LiDAR data"

    Treesearch

    Andrew T. Hudak; Nicholas L. Crookston; Jeffrey S. Evans; David E. Hall; Michael J. Falkowski

    2009-01-01

    The authors regret that an error was discovered in the code within the R software package, yaImpute (Crookston & Finley, 2008), which led to incorrect results reported in the above article. The Most Similar Neighbor (MSN) method computes the distance between reference observations and target observations in a projected space defined using canonical correlation...

  12. NORTICA—a new code for cyclotron analysis

    NASA Astrophysics Data System (ADS)

    Gorelov, D.; Johnson, D.; Marti, F.

    2001-12-01

    The new package of computer codes NORTICA (Numerical ORbit Tracking In Cyclotrons with Analysis) for beam dynamics simulations is under development at NSCL. The package was started as a replacement for the code MONSTER [1], developed in the laboratory in the past. The new codes are capable of beam dynamics simulations in both CCF (Coupled Cyclotron Facility) accelerators, the K500 and K1200 superconducting cyclotrons. The general purpose of this package is assisting in setting and tuning the cyclotrons, taking into account the main field and extraction channel imperfections. The computer platform for the package is an Alpha Station with the UNIX operating system and an X-Windows graphic interface. A multiple programming language approach was used in order to combine the reliability of the numerical algorithms developed over a long period of time in the laboratory with the friendliness of a modern-style user interface. This paper describes the capability and features of the codes in their present state.

  13. ON UPGRADING THE NUMERICS IN COMBUSTION CHEMISTRY CODES. (R824970)

    EPA Science Inventory

    A method of updating and reusing legacy FORTRAN codes for combustion simulations is presented using the DAEPACK software package. The procedure is demonstrated on two codes that come with the CHEMKIN-II package, CONP and SENKIN, for the constant-pressure batch reactor simulati...

  14. Watershed boundaries and digital elevation model of Oklahoma derived from 1:100,000-scale digital topographic maps

    USGS Publications Warehouse

    Cederstrand, J.R.; Rea, A.H.

    1995-01-01

    This document provides a general description of the procedures used to develop the data sets included on this compact disc. This compact disc contains watershed boundaries for Oklahoma, a digital elevation model, and other data sets derived from the digital elevation model. The digital elevation model was produced using the ANUDEM software package, written by Michael Hutchinson and licensed from the Centre for Resource and Environmental Studies at The Australian National University. Elevation data (hypsography) and streams (hydrography) from digital versions of the U.S. Geological Survey 1:100,000-scale topographic maps were used by the ANUDEM package to produce a hydrologically conditioned digital elevation model with a 60-meter cell size. This digital elevation model is well suited for drainage-basin delineation using automated techniques. Additional data sets include flow-direction, flow-accumulation, and shaded-relief grids, all derived from the digital elevation model, and the hydrography data set used in producing the digital elevation model. The watershed boundaries derived from the digital elevation model have been edited to be consistent with contours and streams from the U.S. Geological Survey 1:100,000-scale topographic maps. The watershed data set includes boundaries for 11-digit Hydrologic Unit Codes (watersheds) within Oklahoma, and 8-digit Hydrologic Unit Codes (cataloging units) outside Oklahoma. Cataloging-unit boundaries based on 1:250,000-scale maps outside Oklahoma for the Arkansas, Red, and White River basins are included. The other data sets cover Oklahoma, and where available, portions of 1:100,000-scale quadrangles adjoining Oklahoma.
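
    The flow-direction grid included on the disc is the standard D8 product, in which each cell points to its steepest-descent neighbor; a minimal sketch of that computation follows (plain numpy, not the ANUDEM algorithm; the 60-meter cell size matches the record).

      import numpy as np

      def d8_flow_direction(dem, cellsize=60.0):
          """Minimal D8: each interior cell gets the index (0-7) of its
          steepest-downslope neighbor, or -1 where no neighbor is lower."""
          offsets = [(-1, -1), (-1, 0), (-1, 1), (0, 1),
                     (1, 1), (1, 0), (1, -1), (0, -1)]
          nrow, ncol = dem.shape
          fdir = np.full(dem.shape, -1, dtype=int)
          for i in range(1, nrow - 1):
              for j in range(1, ncol - 1):
                  # Drop per unit distance; diagonals are sqrt(2) cells away.
                  drops = [(dem[i, j] - dem[i + di, j + dj]) /
                           (cellsize * (2 ** 0.5 if di and dj else 1.0))
                           for di, dj in offsets]
                  k = int(np.argmax(drops))
                  if drops[k] > 0:
                      fdir[i, j] = k
          return fdir

      dem = np.array([[5., 5., 5.], [5., 4., 3.], [5., 4., 2.]])
      print(d8_flow_direction(dem))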

  15. Large Scale Software Building with CMake in ATLAS

    NASA Astrophysics Data System (ADS)

    Elmsheuser, J.; Krasznahorkay, A.; Obreshkov, E.; Undrus, A.; ATLAS Collaboration

    2017-10-01

    The offline software of the ATLAS experiment at the Large Hadron Collider (LHC) serves as the platform for detector data reconstruction, simulation and analysis. It is also used in the detector’s trigger system to select LHC collision events during data taking. The ATLAS offline software consists of several million lines of C++ and Python code organized in a modular design of more than 2000 specialized packages. Because of different workflows, many stable numbered releases are in parallel production use. To accommodate specific workflow requests, software patches with modified libraries are distributed on top of existing software releases on a daily basis. The different ATLAS software applications also require a flexible build system that strongly supports unit and integration tests. Within the last year this build system was migrated to CMake. A CMake configuration has been developed that allows one to easily set up and build the above-mentioned software packages. This also makes it possible to develop and test new and modified packages on top of existing releases. The system also allows one to detect and execute partial rebuilds of the release based on single package changes. The build system makes use of CPack for building RPM packages out of the software releases, and CTest for running unit and integration tests. We report on the migration and integration of the ATLAS software to CMake and show working examples of this large scale project in production.

  16. Increasing the availability of national mapping products.

    USGS Publications Warehouse

    Roney, J.I.; Ogilvie, B.C.

    1981-01-01

    A discussion of the means employed by the US Geological Survey to facilitate map usage, covering aspects of the Map Accessibility Program, including special rolled and folded map packaging, new market testing, a parks and campgrounds program, an expanded map dealer program, a new booklet-type State sales index and catalog, and a new USGS map reference code. The USGS is seen as the producer of a tremendous nation-wide inventory of topographic and related map products available in unprecedented types, formats and scales, and as endeavouring to increase access to its products. The new USGS map reference code is appended. -J.C. Stone

  17. Continuous-energy eigenvalue sensitivity coefficient calculations in TSUNAMI-3D

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Perfetti, C. M.; Rearden, B. T.

    2013-07-01

    Two methods for calculating eigenvalue sensitivity coefficients in continuous-energy Monte Carlo applications were implemented in the KENO code within the SCALE code package. The methods were used to calculate sensitivity coefficients for several test problems and produced sensitivity coefficients that agreed well with both reference sensitivities and multigroup TSUNAMI-3D sensitivity coefficients. The newly developed CLUTCH method was observed to produce sensitivity coefficients with high figures of merit and a low memory footprint, and both continuous-energy sensitivity methods met or exceeded the accuracy of the multigroup TSUNAMI-3D calculations. (authors)

  18. A predictive transport modeling code for ICRF-heated tokamaks

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Phillips, C.K.; Hwang, D.Q.; Houlberg, W.

    In this report, a detailed description of the physics included in the WHIST/RAZE package as well as a few illustrative examples of the capabilities of the package will be presented. An in-depth analysis of ICRF heating experiments using WHIST/RAZE will be discussed in a forthcoming report. A general overview of the philosophy behind the structure of the WHIST/RAZE package, a summary of the features of the WHIST code, and a description of the interface to the RAZE subroutines are presented in section 2 of this report. Details of the physics contained in the RAZE code are examined in section 3. Sample results from the package follow in section 4, with concluding remarks and a discussion of possible improvements to the package discussed in section 5.

  19. A predictive transport modeling code for ICRF-heated tokamaks

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Phillips, C.K.; Hwang, D.Q.; Houlberg, W.

    1992-02-01

    In this report, a detailed description of the physics included in the WHIST/RAZE package as well as a few illustrative examples of the capabilities of the package will be presented. An in-depth analysis of ICRF heating experiments using WHIST/RAZE will be discussed in a forthcoming report. A general overview of the philosophy behind the structure of the WHIST/RAZE package, a summary of the features of the WHIST code, and a description of the interface to the RAZE subroutines are presented in section 2 of this report. Details of the physics contained in the RAZE code are examined in section 3. Sample results from the package follow in section 4, with concluding remarks and a discussion of possible improvements to the package discussed in section 5.

  20. Efficient parallel simulation of CO2 geologic sequestration in saline aquifers

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhang, Keni; Doughty, Christine; Wu, Yu-Shu

    2007-01-01

    An efficient parallel simulator for large-scale, long-term CO2 geologic sequestration in saline aquifers has been developed. The parallel simulator is a three-dimensional, fully implicit model that solves large, sparse linear systems arising from discretization of the partial differential equations for mass and energy balance in porous and fractured media. The simulator is based on the ECO2N module of the TOUGH2 code and inherits all the process capabilities of the single-CPU TOUGH2 code, including a comprehensive description of the thermodynamics and thermophysical properties of H2O-NaCl-CO2 mixtures, modeling single and/or two-phase isothermal or non-isothermal flow processes, two-phase mixtures, fluid phases appearing or disappearing, as well as salt precipitation or dissolution. The new parallel simulator uses MPI for parallel implementation, the METIS software package for simulation domain partitioning, and the iterative parallel linear solver package Aztec for solving linear equations by multiple processors. In addition, the parallel simulator has been implemented with an efficient communication scheme. Test examples show that a linear or super-linear speedup can be obtained on Linux clusters as well as on supercomputers. Because of the significant improvement in both simulation time and memory requirement, the new simulator provides a powerful tool for tackling larger-scale and more complex problems than can be solved by single-CPU codes. A high-resolution simulation example is presented that models buoyant convection, induced by a small increase in brine density caused by dissolution of CO2.

  1. GDCRNATools: an R/Bioconductor package for integrative analysis of lncRNA, miRNA, and mRNA data in GDC.

    PubMed

    Li, Ruidong; Qu, Han; Wang, Shibo; Wei, Julong; Zhang, Le; Ma, Renyuan; Lu, Jianming; Zhu, Jianguo; Zhong, Wei-De; Jia, Zhenyu

    2018-03-02

    The large-scale multidimensional omics data in the Genomic Data Commons (GDC) provides opportunities to investigate the crosstalk among different RNA species and their regulatory mechanisms in cancers. Easy-to-use bioinformatics pipelines are needed to facilitate such studies. We have developed a user-friendly R/Bioconductor package, named GDCRNATools, for downloading, organizing, and analyzing RNA data in GDC with an emphasis on deciphering the lncRNA-mRNA related competing endogenous RNAs (ceRNAs) regulatory network in cancers. Many widely used bioinformatics tools and databases are utilized in our package. Users can easily pack preferred downstream analysis pipelines or integrate their own pipelines into the workflow. Interactive shiny web apps built in GDCRNATools greatly improve visualization of results from the analysis. GDCRNATools is an R/Bioconductor package that is freely available at Bioconductor (http://bioconductor.org/packages/devel/bioc/html/GDCRNATools.html). Detailed instructions, manual and example code are also available in Github (https://github.com/Jialab-UCR/GDCRNATools). arthur.jia@ucr.edu or zhongwd2009@live.cn or doctorzhujianguo@163.com.

  2. Building America Case Study: Performance of a Hot-Dry Climate Whole House Retrofit, Stockton, California (Fact Sheet)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    ARBI

    2014-09-01

    The Stockton house retrofit is a two-story Tudor-style single-family deep retrofit in the hot-dry climate of Stockton, CA. The home is representative of the deep retrofit option among the scaled home energy upgrade packages offered to targeted neighborhoods under the pilot Large-Scale Retrofit Program (LSRP) administered by the Alliance for Residential Building Innovation (ARBI). Deep retrofit packages expand on the standard package by adding HVAC, water heater and window upgrades to the ducting, attic and floor insulation, domestic hot water insulation, envelope sealing, lighting and ventilation upgrades. Site energy savings with the deep retrofit were 23% compared to the pre-retrofit case, and 15% higher than the savings estimated for the standard retrofit package. Energy savings were largely a result of the water heater upgrade, and a combination of the envelope sealing, insulation and HVAC upgrade. The HVAC system was of higher efficiency than the building code standard. Overall, the financed retrofit would have been more cost-effective had a less expensive HVAC system been selected and barriers to wall insulation remedied. The homeowner experienced improved comfort throughout the monitored period and was satisfied with the resulting utility bill savings.

  3. Cluster-lensing: A Python Package for Galaxy Clusters and Miscentering

    NASA Astrophysics Data System (ADS)

    Ford, Jes; VanderPlas, Jake

    2016-12-01

    We describe a new open source package for calculating properties of galaxy clusters, including Navarro, Frenk, and White halo profiles with and without the effects of cluster miscentering. This pure-Python package, cluster-lensing, provides well-documented and easy-to-use classes and functions for calculating cluster scaling relations, including mass-richness and mass-concentration relations from the literature, as well as the surface mass density Σ(R) and differential surface mass density ΔΣ(R) profiles, probed by weak lensing magnification and shear. Galaxy cluster miscentering is especially a concern for stacked weak lensing shear studies of galaxy clusters, where offsets between the assumed and the true underlying matter distribution can lead to a significant bias in the mass estimates if not accounted for. This software has been developed and released in a public GitHub repository, and is licensed under the permissive MIT license. The cluster-lensing package is archived on Zenodo. Full documentation, source code, and installation instructions are available at http://jesford.github.io/cluster-lensing/.
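
    The two profiles named in the abstract are related by ΔΣ(R) = Σ̄(<R) − Σ(R): given any tabulated Σ(R), ΔΣ follows numerically. The sketch below is generic numpy with a toy power-law profile, not the cluster-lensing API.

      import numpy as np

      def delta_sigma(R, sigma):
          """Differential surface mass density: mean Sigma within R minus Sigma at R.

          Sigma_bar(<R) = (2 / R**2) * integral_0^R Sigma(r) r dr, evaluated with
          the trapezoid rule (the disc interior to R[0] is neglected, so the
          innermost bins are only approximate)."""
          integrand = sigma * R
          cumint = np.concatenate([[0.0], np.cumsum(
              0.5 * (integrand[1:] + integrand[:-1]) * np.diff(R))])
          sigma_bar = 2.0 * cumint / R ** 2
          return sigma_bar - sigma

      # Toy power-law profile Sigma ~ R^-0.8 on radii 0.1-10 (arbitrary units).
      R = np.linspace(0.1, 10.0, 500)
      sigma = R ** -0.8
      print(delta_sigma(R, sigma)[-5:])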

  4. Development of high performance scientific components for interoperability of computing packages

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gulabani, Teena Pratap

    2008-01-01

    Three major high performance quantum chemistry computational packages, NWChem, GAMESS and MPQC, have been developed by different research efforts following different design patterns. The goal is to achieve interoperability among these packages by overcoming the challenges caused by the different communication patterns and software designs of each of these packages. Chemistry algorithms are difficult and time consuming to develop; integration of large quantum chemistry packages will allow resource sharing and thus avoid reinvention of the wheel. Creating connections between these incompatible packages is the major motivation of the proposed work. This interoperability is achieved by bringing the benefits of Component Based Software Engineering through a plug-and-play component framework called the Common Component Architecture (CCA). In this thesis, I present a strategy and process used for interfacing two widely used and important computational chemistry methodologies: Quantum Mechanics and Molecular Mechanics. To show the feasibility of the proposed approach, the Tuning and Analysis Utility (TAU) has been coupled with the NWChem code and its CCA components. Results show that the overhead is negligible when compared to the ease and potential of organizing and coping with large-scale software applications.

  5. MIFT: GIFT Combinatorial Geometry Input to VCS Code

    DTIC Science & Technology

    1977-03-01

    BRL Report No. 1967. MIFT: GIFT Combinatorial Geometry Input to VCS Code. Albert E... Final report. ...Vehicle Code System (VCS) called MORSE was modified to accept the GIFT combinatorial geometry package. GIFT, as opposed to the geometry package

  6. The Model 9977 Radioactive Material Packaging Primer

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Abramczyk, G.

    2015-10-09

    The Model 9977 Packaging is a single containment drum style radioactive material (RAM) shipping container designed, tested, and analyzed to meet the performance requirements of Title 10 of the Code of Federal Regulations, Part 71. A radioactive material shipping package, in combination with its contents, must perform three functions (please note that the performance criteria specified in the Code of Federal Regulations have alternate limits for normal operations and after accident conditions): Containment, the package must “contain” the radioactive material within it; Shielding, the packaging must limit its users and the public to radiation doses within specified limits; and Subcriticality, the package must maintain its radioactive material as subcritical.

  7. Deployment of the OSIRIS EM-PIC code on the Intel Knights Landing architecture

    NASA Astrophysics Data System (ADS)

    Fonseca, Ricardo

    2017-10-01

    Electromagnetic particle-in-cell (EM-PIC) codes such as OSIRIS have found widespread use in modelling the highly nonlinear and kinetic processes that occur in several relevant plasma physics scenarios, ranging from astrophysical settings to high-intensity laser plasma interaction. Being computationally intensive, these codes require large scale HPC systems and a continuous effort in adapting the algorithm to new hardware and computing paradigms. In this work, we report on our efforts in deploying the OSIRIS code on the new Intel Knights Landing (KNL) architecture. Unlike the previous generation (Knights Corner), these boards are standalone systems and introduce several new features, including the new AVX-512 instructions and on-package MCDRAM. We will focus on the parallelization and vectorization strategies followed, as well as memory management, and present a detailed evaluation of code performance in comparison with the CPU code. This work was partially supported by Fundação para a Ciência e a Tecnologia (FCT), Portugal, through Grant No. PTDC/FIS-PLA/2940/2014.

  8. Lightweight computational steering of very large scale molecular dynamics simulations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Beazley, D.M.; Lomdahl, P.S.

    1996-09-01

    We present a computational steering approach for controlling, analyzing, and visualizing very large scale molecular dynamics simulations involving tens to hundreds of millions of atoms. Our approach relies on extensible scripting languages and an easy-to-use tool for building extensions and modules. The system is extremely easy to modify, works with existing C code, is memory efficient, and can be used from inexpensive workstations and networks. We demonstrate how we have used this system to manipulate data from production MD simulations involving as many as 104 million atoms running on the CM-5 and Cray T3D. We also show how this approach can be used to build systems that integrate common scripting languages (including Tcl/Tk, Perl, and Python), simulation code, user extensions, and commercial data analysis packages.
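
    The steering idea, a long-running computation that periodically hands control to an interpreted layer, can be sketched in a few lines of pure Python. This is a generic illustration of the concept only; the paper's system wraps compiled C simulation code with script-language extension modules.

      # Generic sketch of lightweight computational steering: a time-stepping
      # loop that executes user-supplied script commands between batches of
      # steps. Illustrative only; not the paper's Tcl/Perl/Python tooling.
      import numpy as np

      state = {"x": np.zeros(1000), "v": np.random.randn(1000), "dt": 0.01}

      def advance(state, nsteps):
          """Toy 'simulation kernel' standing in for compiled C code."""
          for _ in range(nsteps):
              state["x"] += state["v"] * state["dt"]

      # queued steering commands; interactively these would come from a
      # scripting console attached to the running simulation
      commands = ["state['dt'] = 0.005",
                  "print('mean x =', state['x'].mean())"]

      for batch in range(3):
          advance(state, 100)        # compute-heavy phase
          while commands:            # steering phase
              exec(commands.pop(0))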

  9. A Turbine Based Combined Cycle Engine Inlet Model and Mode Transition Simulation Based on HiTECC Tool

    NASA Technical Reports Server (NTRS)

    Csank, Jeffrey; Stueber, Thomas

    2012-01-01

    An inlet system is being tested to evaluate methodologies for a turbine based combined cycle propulsion system to perform a controlled inlet mode transition. Prior to wind tunnel based hardware testing of controlled mode transitions, simulation models are used to test, debug, and validate potential control algorithms. One candidate simulation package for this purpose is the High Mach Transient Engine Cycle Code (HiTECC). The HiTECC simulation package models the inlet system, propulsion systems, thermal energy, geometry, nozzle, and fuel systems. This paper discusses the modification and redesign of the simulation package and control system to represent the NASA large-scale inlet model for Combined Cycle Engine mode transition studies, mounted in NASA Glenn's 10-foot by 10-foot Supersonic Wind Tunnel. This model will be used for designing and testing candidate control algorithms before implementation.

  11. Modeling the Galaxy-Halo Connection: An open-source approach with Halotools

    NASA Astrophysics Data System (ADS)

    Hearin, Andrew

    2016-03-01

    Although the modern form of galaxy-halo modeling has been in place for over ten years, there exists no common code base for carrying out large-scale structure calculations. Considering, for example, the advances in CMB science made possible by Boltzmann-solvers such as CMBFast, CAMB and CLASS, there are clear precedents for how theorists working in a well-defined subfield can mutually benefit from such a code base. Motivated by these and other examples, I present Halotools: an open-source, object-oriented python package for building and testing models of the galaxy-halo connection. Halotools is community-driven, and already includes contributions from over a dozen scientists spread across numerous universities. Designed with high-speed performance in mind, the package generates mock observations of synthetic galaxy populations with sufficient speed to conduct expansive MCMC likelihood analyses over a diverse and highly customizable set of models. The package includes an automated test suite and extensive web-hosted documentation and tutorials (halotools.readthedocs.org). I conclude the talk by describing how Halotools can be used to analyze existing datasets to obtain robust and novel constraints on galaxy evolution models, and by outlining the Halotools program to prepare the field of cosmology for the arrival of Stage IV dark energy experiments.
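
    A typical interaction with the package follows a build-then-populate pattern. The sketch below uses class and method names from the package's hosted documentation, but they should be treated as assumptions rather than a verified interface:

      # Hedged sketch of populating a mock galaxy catalogue with Halotools.
      # Class and method names are assumptions based on the package docs
      # (halotools.readthedocs.org); check there for the current API.
      from halotools.empirical_models import PrebuiltHodModelFactory
      from halotools.sim_manager import FakeSim

      model = PrebuiltHodModelFactory('zheng07')   # prebuilt HOD model
      halocat = FakeSim()                          # small fake halo catalogue for testing
      model.populate_mock(halocat)                 # Monte Carlo galaxy population

      galaxies = model.mock.galaxy_table           # table of mock galaxies
      print(len(galaxies), galaxies.colnames[:5])

    Swapping the fake catalogue for a cached simulation snapshot is the usual route to production-scale mocks.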

  12. Increasing Flexibility in Energy Code Compliance: Performance Packages

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hart, Philip R.; Rosenberg, Michael I.

    Energy codes and standards have provided significant increases in building efficiency over the last 38 years, since the first national energy code was published in late 1975. The most commonly used path in energy codes, the prescriptive path, appears to be reaching a point of diminishing returns. As the code matures, the prescriptive path becomes more complicated, and also more restrictive. It is likely that an approach that considers the building as an integrated system will be necessary to achieve the next real gains in building efficiency. Performance code paths are increasing in popularity; however, there remains a significant design-team overhead in following the performance path, especially for smaller buildings. This paper focuses on the development of one alternative format, prescriptive packages. A method to develop building-specific prescriptive packages is reviewed, based on multiple runs of prototypical building models that feed a parametric decision analysis to determine a set of packages with equivalent energy performance. The approach is designed to be cost-effective and flexible for the design team while achieving a desired level of energy efficiency performance. A demonstration of the approach based on mid-sized office buildings with two HVAC system types is shown, along with a discussion of potential applicability in the energy code process.
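
    The package-construction step described above can be pictured as a small search problem: enumerate combinations of efficiency measures and keep those whose modeled energy use meets the target. The sketch below uses an invented additive savings model purely for illustration; the paper's method relies on whole-building simulations of prototypical models.

      # Toy enumeration of equivalent-performance prescriptive packages.
      # The additive % savings per measure are invented for illustration;
      # the paper derives performance from prototypical building simulations.
      from itertools import product

      measures = {
          "wall_insulation": {"R13": 0.0, "R20": -3.0},
          "lighting_lpd":    {"1.0 W/ft2": 0.0, "0.8 W/ft2": -4.0},
          "hvac_eff":        {"baseline": 0.0, "high": -5.0},
      }
      target = -7.0   # required % site-energy change vs. the base package

      packages = []
      for combo in product(*(opts.items() for opts in measures.values())):
          if sum(delta for _, delta in combo) <= target:   # meets target
              packages.append({m: name for m, (name, _) in zip(measures, combo)})

      for p in packages:
          print(p)   # each dict is one equivalent-performance package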

  13. Design Considerations of a Virtual Laboratory for Advanced X-ray Sources

    NASA Astrophysics Data System (ADS)

    Luginsland, J. W.; Frese, M. H.; Frese, S. D.; Watrous, J. J.; Heileman, G. L.

    2004-11-01

    The field of scientific computation has greatly advanced in the last few years, resulting in the ability to perform complex computer simulations that can predict the performance of real-world experiments in a number of fields of study. Among the forces driving this new computational capability is the advent of parallel algorithms, allowing calculations in three-dimensional space with realistic time scales. Electromagnetic radiation sources driven by high-voltage, high-current electron beams offer an area to further push the state-of-the-art in high fidelity, first-principles simulation tools. The physics of these x-ray sources combines kinetic plasma physics (electron beams) with dense fluid-like plasma physics (anode plasmas) and x-ray generation (bremsstrahlung). There are a number of mature techniques and software packages for dealing with the individual aspects of these sources, such as Particle-In-Cell (PIC), Magneto-Hydrodynamics (MHD), and radiation transport codes. The current effort is focused on developing an object-oriented software environment using the Rational Unified Process and the Unified Modeling Language (UML) to provide a framework where multiple 3D parallel physics packages, such as a PIC code (ICEPIC), an MHD code (MACH), and an x-ray transport code (ITS), can co-exist in a system-of-systems approach to modeling advanced x-ray sources. Initial software design and assessments of the various physics algorithms' fidelity will be presented.

  14. QuantWorm: a comprehensive software package for Caenorhabditis elegans phenotypic assays.

    PubMed

    Jung, Sang-Kyu; Aleman-Meza, Boanerges; Riepe, Celeste; Zhong, Weiwei

    2014-01-01

    Phenotypic assays are crucial in genetics; however, traditional methods that rely on human observation are unsuitable for quantitative, large-scale experiments. Furthermore, there is an increasing need for comprehensive analyses of multiple phenotypes to provide multidimensional information. Here we developed an automated, high-throughput computer imaging system for quantifying multiple Caenorhabditis elegans phenotypes. Our imaging system is composed of a microscope equipped with a digital camera and a motorized stage connected to a computer running the QuantWorm software package. Currently, the software package contains one data acquisition module and four image analysis programs: WormLifespan, WormLocomotion, WormLength, and WormEgg. The data acquisition module collects images and videos. The WormLifespan software counts the number of moving worms by using two time-lapse images; the WormLocomotion software computes the velocity of moving worms; the WormLength software measures worm body size; and the WormEgg software counts the number of eggs. To evaluate the performance of our software, we compared its results with manual measurements. We then demonstrated the application of the QuantWorm software in a drug assay and a genetic assay. Overall, the QuantWorm software provided accurate measurements at a high speed. Software source code, executable programs, and sample images are available at www.quantworm.org. Our software package has several advantages over current imaging systems for C. elegans. It is an all-in-one package for quantifying multiple phenotypes. The QuantWorm software is written in Java and its source code is freely available, so it does not require use of commercial software or libraries. It can be run on multiple platforms and easily customized to cope with new methods and requirements.

  15. GPU-accelerated Red Blood Cells Simulations with Transport Dissipative Particle Dynamics.

    PubMed

    Blumers, Ansel L; Tang, Yu-Hang; Li, Zhen; Li, Xuejin; Karniadakis, George E

    2017-08-01

    Mesoscopic numerical simulations provide a unique approach for the quantification of the chemical influences on red blood cell functionalities. The transport Dissipative Particle Dynamics (tDPD) method can lead to such effective multiscale simulations due to its ability to simultaneously capture mesoscopic advection, diffusion, and reaction. In this paper, we present a GPU-accelerated red blood cell simulation package based on a tDPD adaptation of our red blood cell model, which can correctly recover the cell membrane viscosity, elasticity, bending stiffness, and cross-membrane chemical transport. The package essentially processes all computational workloads in parallel on the GPU, and it incorporates multi-stream scheduling and non-blocking MPI communications to improve inter-node scalability. Our code is validated for accuracy and compared against the CPU counterpart for speed. Strong scaling and weak scaling are also presented to characterize scalability. We observe a speedup of 10.1 on one GPU over all 16 cores within a single node, and a weak scaling efficiency of 91% across 256 nodes. The program enables quick-turnaround and high-throughput numerical simulations for investigating chemical-driven red blood cell phenomena and disorders.

  16. RevEcoR: an R package for the reverse ecology analysis of microbiomes.

    PubMed

    Cao, Yang; Wang, Yuanyuan; Zheng, Xiaofei; Li, Fei; Bo, Xiaochen

    2016-07-29

    All species live in complex ecosystems. The structure and complexity of a microbial community reflects not only diversity and function, but also the environment in which it occurs. However, traditional ecological methods can only be applied on a small scale and for relatively well-understood biological systems. Recently, a graph-theory-based algorithm called the reverse ecology approach has been developed that can analyze the metabolic networks of all the species in a microbial community, and predict the metabolic interface between species and their environment. Here, we present RevEcoR, an R package and a Shiny Web application that implements the reverse ecology algorithm for determining microbe-microbe interactions in microbial communities. This software allows users to obtain large-scale ecological insights into species' ecology directly from high-throughput metagenomic data. The software has great potential for facilitating the study of microbiomes. RevEcoR is open source software for the study of microbial community ecology. The RevEcoR R package is freely available under the GNU General Public License v. 2.0 at http://cran.r-project.org/web/packages/RevEcoR/ with the vignette and typical usage examples, and the interactive Shiny web application is available at http://yiluheihei.shinyapps.io/shiny-RevEcoR , or can be installed locally with the source code accessed from https://github.com/yiluheihei/shiny-RevEcoR .

  17. The ENSDF Java Package

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sonzogni, A.A.

    2005-05-24

    A package of computer codes has been developed to process and display nuclear structure and decay data stored in the ENSDF (Evaluated Nuclear Structure Data File) library. The codes were written in an object-oriented fashion using the Java language. This allows for an easy implementation across multiple platforms as well as deployment on web pages. The structure of the different Java classes that make up the package is discussed, as well as several different implementations.

  18. Implementation, capabilities, and benchmarking of Shift, a massively parallel Monte Carlo radiation transport code

    DOE PAGES

    Pandya, Tara M.; Johnson, Seth R.; Evans, Thomas M.; ...

    2015-12-21

    This paper discusses the implementation, capabilities, and validation of Shift, a massively parallel Monte Carlo radiation transport package developed and maintained at Oak Ridge National Laboratory. It has been developed to scale well from laptop to small computing clusters to advanced supercomputers. Special features of Shift include hybrid capabilities for variance reduction such as CADIS and FW-CADIS, and advanced parallel decomposition and tally methods optimized for scalability on supercomputing architectures. Shift has been validated and verified against various reactor physics benchmarks and compares well to other state-of-the-art Monte Carlo radiation transport codes such as MCNP5, CE KENO-VI, and OpenMC. Some specific benchmarks used for verification and validation include the CASL VERA criticality test suite and several Westinghouse AP1000® problems. These benchmark and scaling studies show promising results.

  19. Extending R packages to support 64-bit compiled code: An illustration with spam64 and GIMMS NDVI3g data

    NASA Astrophysics Data System (ADS)

    Gerber, Florian; Mösinger, Kaspar; Furrer, Reinhard

    2017-07-01

    Software packages for spatial data often implement a hybrid approach of interpreted and compiled programming languages. The compiled parts are usually written in C, C++, or Fortran, and are efficient in terms of computational speed and memory usage. Conversely, the interpreted part serves as a convenient user interface and calls the compiled code for computationally demanding operations. The price paid for the user friendliness of the interpreted component is, besides performance, the limited access to low-level and optimized code. An example of such a restriction is the 64-bit vector support of the widely used statistical language R. Since many R packages for spatial data could benefit from 64-bit vectors, we investigate strategies to efficiently pass 64-bit vectors to compiled languages. More precisely, we show how to simply extend existing R packages using the foreign function interface to seamlessly support 64-bit vectors. On the R side, users do not need to change existing code and may not even notice the extension; on the other hand, interfacing 64-bit compiled code efficiently is challenging. This extension is shown with the sparse matrix algebra R package spam. The new capabilities are illustrated with an example of GIMMS NDVI3g data featuring a parametric modeling approach for a non-stationary covariance matrix.
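
    The indexing limitation that motivates this work is easy to demonstrate outside of R; the following generic numpy snippet (not the spam64 code) shows a signed 32-bit index wrapping around at 2^31 - 1, which is why long vectors need 64-bit support:

      # A signed 32-bit index saturates at 2**31 - 1 elements and then wraps;
      # generic numpy illustration of the limitation spam64 removes in R.
      import numpy as np

      i32_max = np.iinfo(np.int32).max                 # 2147483647
      with np.errstate(over="ignore"):
          wrapped = np.int32(i32_max) + np.int32(1)    # wraps to -2**31
      print(i32_max, int(wrapped))                     # 2147483647 -2147483648

      print(np.int64(i32_max) + 1)                     # 2147483648: safe in 64 bits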

  20. Optimization of large matrix calculations for execution on the Cray X-MP vector supercomputer

    NASA Technical Reports Server (NTRS)

    Hornfeck, William A.

    1988-01-01

    A considerable volume of large computational codes was developed for NASA over the past twenty-five years. This code represents algorithms developed for machines of an earlier generation. With the emergence of the vector supercomputer as a viable, commercially available machine, an opportunity exists to evaluate optimization strategies to improve the efficiency of existing software. This opportunity arises primarily from architectural differences between the latest generation of large-scale machines and the earlier, mostly uniprocessor, machines. A software package being used by NASA to perform computations on large matrices is described, along with a strategy for its conversion to the Cray X-MP vector supercomputer.

  1. Parallel Monte Carlo transport modeling in the context of a time-dependent, three-dimensional multi-physics code

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Procassini, R.J.

    1997-12-31

    The fine-scale, multi-space resolution that is envisioned for accurate simulations of complex weapons systems in three spatial dimensions implies flop-rate and memory-storage requirements that will only be obtained in the near future through the use of parallel computational techniques. Since the Monte Carlo transport models in these simulations usually stress both of these computational resources, they are prime candidates for parallelization. The MONACO Monte Carlo transport package, which is currently under development at LLNL, will utilize two types of parallelism within the context of a multi-physics design code: decomposition of the spatial domain across processors (spatial parallelism) and distribution of particles in a given spatial subdomain across additional processors (particle parallelism). This implementation of the package will utilize explicit data communication between domains (message passing). Such a parallel implementation of a Monte Carlo transport model will result in non-deterministic communication patterns. The communication of particles between subdomains during a Monte Carlo time step may require a significant level of effort to achieve a high parallel efficiency.
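
    The two-level layout described above maps naturally onto communicator splitting; the following mpi4py sketch is a generic illustration of spatial-plus-particle parallelism, not MONACO's implementation:

      # Generic two-level Monte Carlo parallelism with mpi4py:
      # ranks are grouped by spatial subdomain, and each group
      # shares that subdomain's particle workload. Run under mpirun.
      from mpi4py import MPI

      world = MPI.COMM_WORLD
      n_domains = 4                              # spatial subdomains

      domain_id = world.rank % n_domains         # this rank's subdomain
      particle_comm = world.Split(color=domain_id, key=world.rank)

      print(f"world rank {world.rank}: domain {domain_id}, "
            f"particle rank {particle_comm.rank}/{particle_comm.size}")

      # particles that cross a subdomain boundary would be shipped to the
      # owning group by explicit message passing (world.send/world.recv)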

  2. 49 CFR 178.523 - Standards for composite packagings with inner glass, porcelain, or stoneware receptacles.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 49 Transportation 2 2010-10-01 2010-10-01 false Standards for composite packagings with inner... Packaging Standards § 178.523 Standards for composite packagings with inner glass, porcelain, or stoneware receptacles. (a) The following are identification codes for composite packagings with inner receptacles of...

  3. An Object-Oriented Serial DSMC Simulation Package

    NASA Astrophysics Data System (ADS)

    Liu, Hongli; Cai, Chunpei

    2011-05-01

    A newly developed three-dimensional direct simulation Monte Carlo (DSMC) simulation package, named GRASP ("Generalized Rarefied gAs Simulation Package"), is reported in this paper. This package utilizes the concept of a simulation engine, many C++ features, and software design patterns. The package has an open architecture which can benefit further development and maintenance of the code. In order to reduce the engineering time for three-dimensional models, a hybrid grid scheme, combined with a flexible data structure implemented in C++, is employed in this package. This scheme utilizes a local data structure based on the computational cell to achieve high performance on workstation processors. This data structure allows the DSMC algorithm to be very efficiently parallelized with domain decomposition and it provides much flexibility in terms of grid types. This package can utilize traditional structured, unstructured or hybrid grids within the framework of a single code to model arbitrarily complex geometries and to simulate rarefied gas flows. Benchmark test cases indicate that this package has satisfactory accuracy for complex rarefied gas flows.

  4. PlasmaPy: beginning a community developed Python package for plasma physics

    NASA Astrophysics Data System (ADS)

    Murphy, Nicholas A.; Huang, Yi-Min; PlasmaPy Collaboration

    2016-10-01

    In recent years, researchers in several disciplines have collaborated on community-developed open source Python packages such as Astropy, SunPy, and SpacePy. These packages provide core functionality, common frameworks for data analysis and visualization, and educational tools. We propose that our community begin the development of PlasmaPy: a new open source core Python package for plasma physics. PlasmaPy could include commonly used functions in plasma physics, easy-to-use plasma simulation codes, Grad-Shafranov solvers, eigenmode solvers, and tools to analyze both simulations and experiments. The development will include modern programming practices such as version control, embedding documentation in the code, unit tests, and avoiding premature optimization. We will describe early code development on PlasmaPy, and discuss plans moving forward. The success of PlasmaPy depends on active community involvement and a welcoming and inclusive environment, so anyone interested in joining this collaboration should contact the authors.

  5. 76 FR 30551 - Specifications for Packagings

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-05-26

    ... 178 Specifications for Packagings CFR Correction In Title 49 of the Code of Federal Regulations, Parts... design qualification test and each periodic retest on a packaging, a test report must be prepared. The test report must be maintained at each location where the packaging is manufactured and each location...

  6. 27 CFR 19.276 - Package scales.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 27 Alcohol, Tobacco Products and Firearms 1 2010-04-01 2010-04-01 false Package scales. 19.276... Package scales. Proprietors shall ensure the accuracy of scales used for weighing packages of spirits through tests conducted at intervals of not more than 6 months or whenever scales are adjusted or repaired...

  7. Forthon

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Grote, D. P.

    Forthon generates links between Fortran and Python. Python is a high-level, object-oriented, interactive scripting language that allows a flexible and versatile interface to computational tools. The Forthon package generates the necessary wrapping code which allows access to the Fortran database and to the Fortran subroutines and functions. This provides a development package where the computationally intensive parts of a code can be written in efficient Fortran, and the high-level controlling code can be written in the much more versatile Python language.

  8. sbtools: A package connecting R to cloud-based data for collaborative online research

    USGS Publications Warehouse

    Winslow, Luke; Chamberlain, Scott; Appling, Alison P.; Read, Jordan S.

    2016-01-01

    The adoption of high-quality tools for collaboration and reproducible research such as R and GitHub is becoming more common in many research fields. While GitHub and other version management systems are excellent resources, they were originally designed to handle code and scale poorly to large text-based or binary datasets. A number of scientific data repositories are coming online and are often focused on dataset archival and publication. To handle collaborative workflows using large scientific datasets, there is increasing need to connect cloud-based online data storage to R. In this article, we describe how the new R package sbtools enables direct access to the advanced online data functionality provided by ScienceBase, the U.S. Geological Survey's online scientific data storage platform.

  9. 49 CFR 178.522 - Standards for composite packagings with inner plastic receptacles.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 49 Transportation 2 2010-10-01 2010-10-01 false Standards for composite packagings with inner... Standards for composite packagings with inner plastic receptacles. (a) The following are the identification codes for composite packagings with inner plastic receptacles: (1) 6HA1 for a plastic receptacle within...

  10. Enhancement of the CAVE computer code. [aerodynamic heating package for nose cones and scramjet engine sidewalls

    NASA Technical Reports Server (NTRS)

    Rathjen, K. A.; Burk, H. O.

    1983-01-01

    The computer code CAVE (Conduction Analysis via Eigenvalues) is a convenient and efficient computer code for predicting two-dimensional temperature histories within thermal protection systems for hypersonic vehicles. The capabilities of CAVE were enhanced by incorporation of the following features into the code: real gas effects in the aerodynamic heating predictions; a geometry and aerodynamic heating package for analyses of cone-shaped bodies; an input option to change from laminar to turbulent heating predictions on leading edges; a modification to account for the reduction in adiabatic wall temperature with increase in leading-edge sweep; a geometry package for a two-dimensional scramjet engine sidewall, with an option for heat transfer to external and internal surfaces; a printout modification to provide tables of select temperatures for plotting and storage; and modifications to the radiation calculation procedure to eliminate temperature oscillations induced by high heating rates. These new features are described.

  11. The Composite Analytic and Simulation Package or RFI (CASPR) on a coded channel

    NASA Technical Reports Server (NTRS)

    Freedman, Jeff; Berman, Ted

    1993-01-01

    CASPR is an analysis package which determines the performance of a coded signal in the presence of Radio Frequency Interference (RFI) and Additive White Gaussian Noise (AWGN). It can analyze a system with convolutional coding, Reed-Solomon (RS) coding, or a concatenation of the two. The signals can either be interleaved or non-interleaved. The model measures the system performance in terms of either the Eb/N0 required to achieve a given Bit Error Rate (BER) or the BER needed for a constant Eb/N0.
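
    As a point of reference for such studies, the uncoded BPSK baseline over AWGN has a closed form, BER = 0.5*erfc(sqrt(Eb/N0)); the short script below evaluates it (a standard textbook relation, not CASPR's coded-channel models):

      # Uncoded BPSK bit error rate over AWGN versus Eb/N0 in dB:
      #   BER = 0.5 * erfc( sqrt(Eb/N0) )
      # Textbook baseline only; coded/RFI performance needs CASPR-style models.
      import math

      def bpsk_ber(ebn0_db):
          ebn0 = 10.0 ** (ebn0_db / 10.0)   # dB -> linear
          return 0.5 * math.erfc(math.sqrt(ebn0))

      for db in (0, 2, 4, 6, 8, 10):
          print(f"Eb/N0 = {db:2d} dB  ->  BER = {bpsk_ber(db):.2e}")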

  12. LakeMetabolizer: An R package for estimating lake metabolism from free-water oxygen using diverse statistical models

    USGS Publications Warehouse

    Winslow, Luke; Zwart, Jacob A.; Batt, Ryan D.; Dugan, Hilary; Woolway, R. Iestyn; Corman, Jessica; Hanson, Paul C.; Read, Jordan S.

    2016-01-01

    Metabolism is a fundamental process in ecosystems that crosses multiple scales of organization from individual organisms to whole ecosystems. To improve sharing and reuse of published metabolism models, we developed LakeMetabolizer, an R package for estimating lake metabolism from in situ time series of dissolved oxygen, water temperature, and, optionally, additional environmental variables. LakeMetabolizer implements 5 different metabolism models with diverse statistical underpinnings: bookkeeping, ordinary least squares, maximum likelihood, Kalman filter, and Bayesian. Each of these 5 metabolism models can be combined with 1 of 7 models for computing the coefficient of gas exchange across the air–water interface (k). LakeMetabolizer also features a variety of supporting functions that compute conversions and implement calculations commonly applied to raw data prior to estimating metabolism (e.g., oxygen saturation and optical conversion models). These tools have been organized into an R package that contains example data, example use-cases, and function documentation. The release package version is available on the Comprehensive R Archive Network (CRAN), and the full open-source GPL-licensed code is freely available for examination and extension online. With this unified, open-source, and freely available package, we hope to improve access and facilitate the application of metabolism in studies and management of lentic ecosystems.

  13. BEARCLAW: Boundary Embedded Adaptive Refinement Conservation LAW package

    NASA Astrophysics Data System (ADS)

    Mitran, Sorin

    2011-04-01

    The BEARCLAW package is a multidimensional, Eulerian AMR-capable computational code written in Fortran to solve hyperbolic systems for astrophysical applications. It is part of AstroBEAR, a hydrodynamic & magnetohydrodynamic code environment designed for a variety of astrophysical applications which allows simulations in 2, 2.5 (i.e., cylindrical), and 3 dimensions, in either cartesian or curvilinear coordinates.

  14. LB3D: A parallel implementation of the Lattice-Boltzmann method for simulation of interacting amphiphilic fluids

    NASA Astrophysics Data System (ADS)

    Schmieschek, S.; Shamardin, L.; Frijters, S.; Krüger, T.; Schiller, U. D.; Harting, J.; Coveney, P. V.

    2017-08-01

    We introduce the lattice-Boltzmann code LB3D, version 7.1. Building on a parallel program and supporting tools which have enabled research utilising high performance computing resources for nearly two decades, LB3D version 7 provides a subset of the research code functionality as an open source project. Here, we describe the theoretical basis of the algorithm as well as computational aspects of the implementation. The software package is validated against simulations of meso-phases resulting from self-assembly in ternary fluid mixtures comprising immiscible and amphiphilic components such as water-oil-surfactant systems. The impact of the surfactant species on the dynamics of spinodal decomposition is tested, and quantitative measurement of the permeability of a body centred cubic (BCC) model porous medium for a simple binary mixture is described. Single-core performance and scaling behaviour of the code are reported for simulations on current supercomputer architectures.
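
    The stream-collide structure that such codes parallelize can be written very compactly for a single-component fluid; the numpy sketch below is a generic D2Q9 BGK step, far simpler than LB3D's multicomponent amphiphilic model:

      # Generic single-phase D2Q9 lattice-Boltzmann (BGK) loop in numpy.
      # Illustrates only the stream-collide skeleton, not LB3D itself.
      import numpy as np

      nx, ny, tau = 64, 64, 0.8
      c = np.array([[0,0],[1,0],[0,1],[-1,0],[0,-1],
                    [1,1],[-1,1],[-1,-1],[1,-1]])        # D2Q9 velocities
      w = np.array([4/9] + [1/9]*4 + [1/36]*4)           # lattice weights

      f = np.ones((9, nx, ny)) * w[:, None, None]        # fluid at rest, rho = 1

      def equilibrium(rho, ux, uy):
          cu = c[:, 0, None, None]*ux + c[:, 1, None, None]*uy
          return w[:, None, None]*rho*(1 + 3*cu + 4.5*cu**2 - 1.5*(ux**2 + uy**2))

      for step in range(100):
          rho = f.sum(axis=0)
          ux = (c[:, 0, None, None]*f).sum(axis=0) / rho
          uy = (c[:, 1, None, None]*f).sum(axis=0) / rho
          f -= (f - equilibrium(rho, ux, uy)) / tau      # BGK collision
          for i in range(9):                             # periodic streaming
              f[i] = np.roll(np.roll(f[i], c[i, 0], axis=0), c[i, 1], axis=1)

      print("mass conserved:", np.isclose(f.sum(), nx * ny))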

  15. Spectral-element Seismic Wave Propagation on CUDA/OpenCL Hardware Accelerators

    NASA Astrophysics Data System (ADS)

    Peter, D. B.; Videau, B.; Pouget, K.; Komatitsch, D.

    2015-12-01

    Seismic wave propagation codes are essential tools to investigate a variety of wave phenomena in the Earth. Furthermore, they can now be used for seismic full-waveform inversions in regional- and global-scale adjoint tomography. Although these seismic wave propagation solvers are crucial ingredients to improve the resolution of tomographic images to answer important questions about the nature of Earth's internal processes and subsurface structure, their practical application is often limited due to high computational costs. They thus need high-performance computing (HPC) facilities to improve the current state of knowledge. At present, numerous large HPC systems embed many-core architectures such as graphics processing units (GPUs) to enhance numerical performance. Such hardware accelerators can be programmed using either the CUDA programming environment or the OpenCL language standard. CUDA software development targets NVIDIA graphics cards while OpenCL was adopted by additional hardware accelerators, e.g., AMD graphics cards, ARM-based processors, and Intel Xeon Phi coprocessors. For seismic wave propagation simulations using the open-source spectral-element code package SPECFEM3D_GLOBE, we incorporated an automatic source-to-source code generation tool (BOAST) which allows us to use meta-programming of all computational kernels for forward and adjoint runs. Using our BOAST kernels, we generate optimized source code for both CUDA and OpenCL languages within the source code package. Thus, seismic wave simulations are now able to fully utilize CUDA and OpenCL hardware accelerators. We show benchmarks of forward seismic wave propagation simulations using SPECFEM3D_GLOBE on CUDA/OpenCL GPUs, validating results and comparing performances for different simulations and hardware usages.

  16. 78 FR 29016 - Establishing Quality Assurance Programs for Packaging Used in Transport of Radioactive Material

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-05-16

    ... Establishing Quality Assurance Programs for Packaging Used in Transport of Radioactive Material AGENCY: Nuclear..., ``Establishing Quality Assurance Programs for Packaging Used in Transport of Radioactive Material.'' This draft... regulations for the packaging and transportation of radioactive material in Part 71 of Title 10 of the Code of...

  17. 76 FR 5215 - Draft Regulatory Guide: Issuance, Availability

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-01-28

    ... Compliance with Packaging Requirements for Shipment and Receipt of Radioactive Material,'' is temporarily... Code of Federal Regulations, Part 71, ``Packaging and Transportation of Radioactive Material'' (10 CFR... Compliance with Packaging Requirements for Shipments of Radioactive Materials,'' as an acceptable process for...

  18. ATLAS software configuration and build tool optimisation

    NASA Astrophysics Data System (ADS)

    Rybkin, Grigory; Atlas Collaboration

    2014-06-01

    The ATLAS software code base is over 6 million lines organised in about 2000 packages. It makes use of some 100 external software packages, is developed by more than 400 developers and used by more than 2500 physicists from over 200 universities and laboratories in 6 continents. To meet the challenge of configuration and building of this software, the Configuration Management Tool (CMT) is used. CMT expects each package to describe its build targets, build and environment setup parameters, and dependencies on other packages in a text file called requirements, and each project (group of packages) to describe its policies and dependencies on other projects in a text project file. Based on the effective set of configuration parameters read from the requirements files of dependent packages and project files, CMT commands build the packages, generate the environment for their use, or query the packages. The main focus was on build time performance, which was optimised through several approaches: reduction of the number of reads of requirements files, which are now read once per package by a CMT build command that generates cached requirements files for subsequent CMT build commands; introduction of more fine-grained build parallelism at package task level, i.e., dependent applications and libraries are compiled in parallel; code optimisation of CMT commands used for build; and introduction of package-level build parallelism, i.e., parallelising the build of independent packages. By default, CMT launches NUMBER-OF-PROCESSORS build commands in parallel. The other focus was on optimisation of CMT commands in general, which made them approximately 2 times faster. CMT can generate a cached requirements file for the environment setup command, which is especially useful for deployment on distributed file systems like AFS or CERN VMFS. The use of parallelism, caching and code optimisation reduced software build time and environment setup time significantly, by several times, increased the efficiency of multi-core computing resources utilisation, and considerably improved the software developer and user experience.

  19. HZETRN: A heavy ion/nucleon transport code for space radiations

    NASA Technical Reports Server (NTRS)

    Wilson, John W.; Chun, Sang Y.; Badavi, Forooz F.; Townsend, Lawrence W.; Lamkin, Stanley L.

    1991-01-01

    The galactic heavy ion transport code (GCRTRN) and the nucleon transport code (BRYNTRN) are integrated into a code package (HZETRN). The code package is computationally efficient and capable of operating in an engineering design environment for manned deep space mission studies. The nuclear data set used by the code is discussed, including current limitations. Although the heavy ion nuclear cross sections are assumed constant, the nucleon-nuclear cross sections of BRYNTRN with full energy dependence are used. The relation of the final code to the Boltzmann equation is discussed in the context of simplifying assumptions. Error generation and propagation are discussed, and comparison is made with simplified analytic solutions to test the numerical accuracy of the final results. A brief discussion of biological issues and their impact on fundamental developments in shielding technology is given.

  20. Code Development in Coupled PARCS/RELAP5 for Supercritical Water Reactor

    DOE PAGES

    Hu, Po; Wilson, Paul

    2014-01-01

    A new capability is added to the existing coupled code package PARCS/RELAP5 in order to analyze SCWR designs under supercritical pressure with separated water coolant and moderator channels. This expansion is carried out on both codes. In PARCS, modification is focused on extending the water property tables to supercritical pressure, modifying the variable mapping input file and related code modules for processing thermal-hydraulic information from separated coolant/moderator channels, and modifying the neutronics feedback module to deal with the separated coolant/moderator channels. In RELAP5, modification is focused on incorporating more accurate water properties near SCWR operation/transient pressure and temperature in the code. Confirming tests of the modifications are presented, and the major analysis results from the extended code package are summarized.

  1. Development of Ultra-Fine Multigroup Cross Section Library of the AMPX/SCALE Code Packages

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jeon, Byoung Kyu; Sik Yang, Won; Kim, Kang Seog

    The Consortium for Advanced Simulation of Light Water Reactors Virtual Environment for Reactor Applications (VERA) neutronic simulator MPACT is being developed by Oak Ridge National Laboratory and the University of Michigan for various reactor applications. The MPACT and simplified MPACT 51- and 252-group cross section libraries have been developed for the MPACT neutron transport calculations by using the AMPX and Standardized Computer Analyses for Licensing Evaluations (SCALE) code packages developed at Oak Ridge National Laboratory. It has been noted that the conventional AMPX/SCALE procedure has limited applications for fast-spectrum systems such as boiling water reactor (BWR) fuels with very high void fractions and fast reactor fuels because of its poor accuracy in the unresolved and fast energy regions. This lack of accuracy can introduce additional error sources to MPACT calculations, which are already limited by the Bondarenko approach for resolved resonance self-shielding calculation. To enhance the prediction accuracy of MPACT for fast-spectrum reactor analyses, the accuracy of the AMPX/SCALE code packages should be improved first. The purpose of this study is to identify the major problems of the AMPX/SCALE procedure in generating fast-spectrum cross sections and to devise ways to improve the accuracy. For this, various benchmark problems including a typical pressurized water reactor fuel, BWR fuels with various void fractions, and several fast reactor fuels were analyzed using the AMPX 252-group libraries. Isotopic reaction rates were determined by SCALE multigroup (MG) calculations and compared with continuous energy (CE) Monte Carlo calculation results. This reaction rate analysis revealed three main contributors to the observed differences in reactivity and reaction rates: (1) the limitation of the Bondarenko approach in a coarse energy group structure, (2) the normalization issue of probability tables, and (3) neglect of the self-shielding effect of resonance-like cross sections in the high energy range, such as the (n,p) cross section of Cl-35. The first error source can be eliminated by an ultra-fine group (UFG) structure in which the broad scattering resonances of intermediate-weight nuclides can be represented accurately by a piecewise constant function. A UFG AMPX library was generated with modified probability tables and tested against various benchmark problems. The reactivity and reaction rates determined with the new UFG AMPX library agreed very well with Monte Carlo N-Particle (MCNP) results. To enhance the lattice calculation accuracy without significantly increasing the computational time, performing the UFG lattice calculation in two steps was proposed. In the first step, a UFG slowing-down calculation is performed for the corresponding homogenized composition, and UFG cross sections are collapsed into an intermediate group structure. In the second step, the lattice calculation is performed at the intermediate group level using the condensed group cross sections. A preliminary test showed that the condensed library reproduces the results obtained with the UFG cross section library. This result suggests that the proposed two-step lattice calculation approach is a promising option to enhance the applicability of the AMPX/SCALE system to fast-system analysis.
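
    The group-collapse operation at the heart of the proposed two-step scheme is plain flux weighting; a minimal sketch with toy numbers (not the AMPX library data) shows the reaction-rate-preserving condensation from a UFG structure to an intermediate one:

      # Flux-weighted condensation of ultra-fine-group (UFG) cross sections
      # onto a coarser intermediate structure, using the UFG slowing-down
      # flux as the weight. Toy data for illustration only.
      import numpy as np

      n_ufg = 12
      sigma_ufg = np.random.uniform(1.0, 10.0, n_ufg)   # UFG cross sections [b]
      flux_ufg = np.random.uniform(0.1, 1.0, n_ufg)     # UFG weighting flux

      coarse_of = np.repeat(np.arange(3), n_ufg // 3)   # UFG -> coarse group map

      sigma_coarse = np.array([
          np.sum(sigma_ufg[coarse_of == g] * flux_ufg[coarse_of == g])
          / np.sum(flux_ufg[coarse_of == g])
          for g in range(3)
      ])
      print(sigma_coarse)   # preserves reaction rates for the given flux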

  2. Modeling Solar Wind Flow with the Multi-Scale Fluid-Kinetic Simulation Suite

    DOE PAGES

    Pogorelov, N.V.; Borovikov, S. N.; Bedford, M. C.; ...

    2013-04-01

    Multi-Scale Fluid-Kinetic Simulation Suite (MS-FLUKSS) is a package of numerical codes capable of performing adaptive mesh refinement simulations of complex plasma flows in the presence of discontinuities and charge exchange between ions and neutral atoms. The flow of the ionized component is described with the ideal MHD equations, while the transport of atoms is governed either by the Boltzmann equation or multiple Euler gas dynamics equations. We have enhanced the code with additional physical treatments for the transport of turbulence and acceleration of pickup ions in the interplanetary space and at the termination shock. In this article, we present the results of our numerical simulation of the solar wind (SW) interaction with the local interstellar medium (LISM) in different time-dependent and stationary formulations. Numerical results are compared with the Ulysses, Voyager, and OMNI observations. Finally, the SW boundary conditions are derived from in-situ spacecraft measurements and remote observations.

  3. Parallel Computation of the Regional Ocean Modeling System (ROMS)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wang, P; Song, Y T; Chao, Y

    2005-04-05

    The Regional Ocean Modeling System (ROMS) is a regional ocean general circulation modeling system solving the free surface, hydrostatic, primitive equations over varying topography. It is free software distributed world-wide for studying both complex coastal ocean problems and the basin-to-global scale ocean circulation. The original ROMS code could only be run on shared-memory systems. With the increasing need to simulate larger model domains with finer resolutions and on a variety of computer platforms, there is a need in the ocean-modeling community to have a ROMS code that can be run on any parallel computer ranging from 10 to hundreds of processors. Recently, we have explored parallelization for ROMS using the MPI programming model. In this paper, an efficient parallelization strategy for such a large-scale scientific software package, based on an existing shared-memory computing model, is presented. In addition, scientific applications and data-performance issues on a couple of SGI systems, including Columbia, the world's third-fastest supercomputer, are discussed.

  4. ECCD-induced tearing mode stabilization in coupled IPS/NIMROD/GENRAY HPC simulations

    NASA Astrophysics Data System (ADS)

    Jenkins, Thomas; Kruger, S. E.; Held, E. D.; Harvey, R. W.; Elwasif, W. R.

    2012-03-01

    We summarize ongoing developments toward an integrated, predictive model for determining optimal ECCD-based NTM stabilization strategies in ITER. We demonstrate the capability of the SWIM Project's Integrated Plasma Simulator (IPS) framework to choreograph multiple executions of, and data exchanges between, physics codes modeling various spatiotemporal scales of this coupled RF/MHD problem on several thousand HPC processors. As NIMROD evolves fluid equations to model bulk plasma behavior, self-consistent propagation/deposition of RF power in the ensuing plasma profiles is calculated by GENRAY. Data from both codes is then processed by computational geometry packages to construct the RF-induced quasilinear diffusion tensor; moments of this tensor (entering as additional terms in NIMROD's fluid equations due to the disparity in RF/MHD spatiotemporal scales) influence the dynamics of current, momentum, and energy evolution as well as the MHD closures. Initial results are shown to correctly capture the physics of magnetic island stabilization; we also discuss the development of a numerical plasma control system for active feedback stabilization of tearing modes.

  5. 9 CFR 381.144 - Packaging materials.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ..., from the packaging supplier under whose brand name and firm name the material is marketed to the... distinguishing brand name or code designation appearing on the packaging material shipping container; must....13) will be acceptable. The management of the establishment must maintain a file containing...

  6. Fault-tolerant, high-level quantum circuits: form, compilation and description

    NASA Astrophysics Data System (ADS)

    Paler, Alexandru; Polian, Ilia; Nemoto, Kae; Devitt, Simon J.

    2017-06-01

    Fault-tolerant quantum error correction is a necessity for any quantum architecture destined to tackle interesting, large-scale problems. Its theoretical formalism has been well founded for nearly two decades. However, we still do not have an appropriate compiler to produce a fault-tolerant, error-corrected description from a higher-level quantum circuit for state-of-the-art hardware models. There are many technical hurdles, including dynamic circuit constructions that occur when constructing fault-tolerant circuits with commonly used error correcting codes. We introduce a package that converts high-level quantum circuits consisting of commonly used gates into a form employing all decompositions and ancillary protocols needed for fault-tolerant error correction. We call this form the (I)nitialisation, (C)NOT, (M)easurement (ICM) form; it consists of an initialisation layer of qubits into one of four distinct states, a massive, deterministic array of CNOT operations, and a series of time-ordered X- or Z-basis measurements. The form allows a more flexible approach towards circuit optimisation. At the same time, the package outputs a standard circuit or a canonical geometric description which is a necessity for operating current state-of-the-art hardware architectures using topological quantum codes.
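
    A concrete (if hypothetical) container for the ICM form makes the three layers explicit; all names below are invented for illustration, as the package described in the paper defines its own formats:

      # Hypothetical container for the ICM (Initialisation, CNOT, Measurement)
      # form: per-qubit initial states, a deterministic CNOT array, and a
      # time-ordered X/Z measurement schedule. Names invented for illustration.
      from dataclasses import dataclass, field
      from typing import List, Tuple

      @dataclass
      class ICMCircuit:
          inits: List[str] = field(default_factory=list)               # one of four initial states per qubit
          cnots: List[Tuple[int, int]] = field(default_factory=list)   # (control, target) pairs
          meas: List[Tuple[int, str]] = field(default_factory=list)    # (qubit, 'X' or 'Z'), time-ordered

      circ = ICMCircuit(inits=["0", "A", "+"],
                        cnots=[(1, 0), (2, 0)],
                        meas=[(1, "X"), (2, "Z")])
      print(len(circ.inits), "qubits,", len(circ.cnots), "CNOTs")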

  7. ANITA-2000 activation code package - updating of the decay data libraries and validation on the experimental data of the 14 MeV Frascati Neutron Generator

    NASA Astrophysics Data System (ADS)

    Frisoni, Manuela

    2016-03-01

    ANITA-2000 is a code package for the activation characterization of materials exposed to neutron irradiation, released by ENEA to the OECD-NEADB and ORNL-RSICC. The main component of the package is the activation code ANITA-4M, which computes the radioactive inventory of a material exposed to neutron irradiation. The code requires a decay data library (file fl1) containing the quantities describing the decay properties of the unstable nuclides and a library (file fl2) containing the gamma-ray spectra emitted by the radioactive nuclei. The fl1 and fl2 files of the ANITA-2000 code package, originally based on the evaluated nuclear data library FENDL/D-2.0, were recently updated on the basis of the JEFF-3.1.1 Radioactive Decay Data Library. This paper presents the results of the validation of the new fl1 decay data library through the comparison of the ANITA-4M calculated values with the measured electron and photon decay heats and activities of fusion material samples irradiated at the 14 MeV Frascati Neutron Generator (FNG) of the ENEA Frascati Research Centre. Twelve material samples were considered, namely: Mo, Cu, Hf, Mg, Ni, Cd, Sn, Re, Ti, W, Ag and Al. The ratios between calculated and experimental values (C/E) are shown and discussed in this paper.

  8. Watermarking spot colors in packaging

    NASA Astrophysics Data System (ADS)

    Reed, Alastair; Filler, Tomáš; Falkenstern, Kristyn; Bai, Yang

    2015-03-01

    In January 2014, Digimarc announced Digimarc® Barcode for the packaging industry to improve the check-out efficiency and customer experience for retailers. Digimarc Barcode is a machine readable code that carries the same information as a traditional Universal Product Code (UPC) and is introduced by adding a robust digital watermark to the package design. It is imperceptible to the human eye but can be read by a modern barcode scanner at the Point of Sale (POS) station. Compared to a traditional linear barcode, Digimarc Barcode covers the whole package with minimal impact on the graphic design. This significantly improves the Items per Minute (IPM) metric, which retailers use to track the checkout efficiency since it closely relates to their profitability. Increasing IPM by a few percent could lead to potential savings of millions of dollars for retailers, giving them a strong incentive to add the Digimarc Barcode to their packages. Testing performed by Digimarc showed increases in IPM of at least 33% using the Digimarc Barcode, compared to using a traditional barcode. A method of watermarking print-ready image data used in the commercial packaging industry is described. A significant proportion of packages are printed using spot colors; therefore spot colors need to be supported by an embedder for Digimarc Barcode. Digimarc Barcode supports the PANTONE spot color system, which is commonly used in the packaging industry. The Digimarc Barcode embedder allows a user to insert the UPC code in an image while minimizing perceptibility to the Human Visual System (HVS). The Digimarc Barcode is inserted in the printing ink domain, using an Adobe Photoshop plug-in as the last step before printing. Since Photoshop is an industry standard widely used by pre-press shops in the packaging industry, a Digimarc Barcode can be easily inserted and proofed.

  9. Progress on 3-D ICF simulations and Ray-Traced Power Deposition Method

    NASA Astrophysics Data System (ADS)

    Schmitt, Andrew J.; Fyfe, David E.

    2016-10-01

    We have performed 3D simulations of Omega-scale and NIF-scale spherical direct-drive targets with the massively parallel fastrad3d code. Of particular interest is the robustness of the targets to the low-mode perturbations impressed on the target by the laser system, and how it compares to the influence of the perturbations produced by laser imprinting. As part of this simulation capability, we have upgraded our smoothed 3D raytrace package to run in spherical geometry. This package, which connects rays to form bundles and performs power deposition calculations on the bundles, can decrease laser absorption noise while using fewer rays and less message passing. This model produces both the imprint and the low-mode asymmetry drive that we are interested in here. We show recent simulation results of directly driven targets using conventional ignition drive, and report on the influence of the two sources, low-mode asymmetry and laser imprint, as the pellet conditions (e.g., adiabat) are varied. Work supported by DoE/NNSA.

  10. Kranc: a Mathematica package to generate numerical codes for tensorial evolution equations

    NASA Astrophysics Data System (ADS)

    Husa, Sascha; Hinder, Ian; Lechner, Christiane

    2006-06-01

    We present a suite of Mathematica-based computer-algebra packages, termed "Kranc", which comprise a toolbox to convert certain (tensorial) systems of partial differential evolution equations to parallelized C or Fortran code for solving initial boundary value problems. Kranc can be used as a "rapid prototyping" system for physicists or mathematicians handling very complicated systems of partial differential equations, but through integration into the Cactus computational toolkit we can also produce efficient parallelized production codes. Our work is motivated by the field of numerical relativity, where Kranc is used as a research tool by the authors. In this paper we describe the design and implementation of both the Mathematica packages and the resulting code, we discuss some example applications, and provide results on the performance of an example numerical code for the Einstein equations.

    Program summary:
    Title of program: Kranc
    Catalogue identifier: ADXS_v1_0
    Program summary URL: http://cpc.cs.qub.ac.uk/summaries/ADXS_v1_0
    Program obtainable from: CPC Program Library, Queen's University of Belfast, N. Ireland
    Distribution format: tar.gz
    Computers: general computers which run Mathematica (for code generation) and Cactus (for numerical simulations); tested under Linux
    Programming language used: Mathematica, C, Fortran 90
    Memory required to execute with typical data: depends on the number of variables and grid size; the included ADM example requires 4308 KB
    Has the code been vectorized or parallelized: the code is parallelized based on the Cactus framework
    Number of bytes in distributed program, including test data, etc.: 1 578 142
    Number of lines in distributed program, including test data, etc.: 11 711
    Nature of physical problem: solution of partial differential equations in three space dimensions, formulated as an initial value problem; in particular, the program is geared towards handling very complex tensorial equations as they appear, e.g., in numerical relativity. The worked-out examples comprise the Klein-Gordon equations, the Maxwell equations, and the ADM formulation of the Einstein equations.
    Method of solution: finite differencing and method-of-lines time integration; the numerical code is generated through a high-level Mathematica interface.
    Restrictions on the complexity of the program: typical numerical relativity applications will contain up to several dozen evolution variables and thousands of source terms; Cactus applications have shown scaling up to several thousand processors and grid sizes exceeding 500^3.
    Typical running time: depends on the number of variables and the grid size; the included ADM example takes approximately 100 seconds on a 1600 MHz Intel Pentium M processor.
    Unusual features of the program: based on Mathematica and Cactus.

  11. Data Packages for the Hanford Immobilized Low Activity Tank Waste Performance Assessment 2001 Version [SEC 1 THRU 5]

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    MANN, F.M.

    This data package supports the 2001 Hanford Immobilized Low-Activity Tank Waste Performance Assessment. Geology, hydrology, geochemistry, facility, waste form, and dosimetry data based on recent investigations are provided. Verification and benchmarking packages for selected software codes are also provided.

  12. HEPMath 1.4: A Mathematica package for semi-automatic computations in high energy physics

    NASA Astrophysics Data System (ADS)

    Wiebusch, Martin

    2015-10-01

    This article introduces the Mathematica package HEPMath which provides a number of utilities and algorithms for High Energy Physics computations in Mathematica. Its functionality is similar to packages like FormCalc or FeynCalc, but it takes a more complete and extensible approach to implementing common High Energy Physics notations in the Mathematica language, in particular those related to tensors and index contractions. It also provides a more flexible method for the generation of numerical code which is based on new features for C code generation in Mathematica. In particular it can automatically generate Python extension modules which make the compiled functions callable from Python, thus eliminating the need to write any code in a low-level language like C or Fortran. It also contains seamless interfaces to LHAPDF, FeynArts, and LoopTools.
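
    The code-generation workflow the abstract describes can be illustrated with SymPy as an analogous, openly available stand-in (SymPy is not part of HEPMath): a symbolic expression is derived once and then emitted as compilable C.

        import sympy as sp

        # Toy squared amplitude; the expression is illustrative only.
        s, m, g = sp.symbols("s m g", positive=True)
        msq = g**4 * (s**2 + m**4) / (s - m**2) ** 2

        # Emit a compilable C assignment statement for the expression;
        # in HEPMath the analogous step also wraps the compiled result
        # as a Python extension module callable from Python.
        print(sp.ccode(msq, assign_to="msq"))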

  13. An analysis of options available for developing a common laser ray tracing package for Ares and Kull code frameworks

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Weeratunga, S K

    Ares and Kull are mature code frameworks that support ALE hydrodynamics for a variety of HEDP applications at LLNL, using two widely different meshing approaches. While Ares is based on a 2-D/3-D block-structured mesh data base, Kull is designed to support unstructured, arbitrary polygonal/polyhedral meshes. In addition, both frameworks are capable of running applications on large, distributed-memory parallel machines. Currently, both these frameworks separately support assorted collections of physics packages related to HEDP, including one for the energy deposition by laser/ion-beam ray tracing. This study analyzes the options available for developing a common laser/ion-beam ray tracing package that can be easily shared between these two code frameworks and concludes with a set of recommendations for its development.

  14. The Modularized Software Package ASKI - Full Waveform Inversion Based on Waveform Sensitivity Kernels Utilizing External Seismic Wave Propagation Codes

    NASA Astrophysics Data System (ADS)

    Schumacher, F.; Friederich, W.

    2015-12-01

    We present the modularized software package ASKI, which is a flexible and extendable toolbox for seismic full waveform inversion (FWI) as well as sensitivity or resolution analysis operating on the sensitivity matrix. It utilizes established wave propagation codes for solving the forward problem and offers an alternative to the monolithic, inflexible and hard-to-modify codes that have typically been written for solving inverse problems. It is available under the GPL at www.rub.de/aski. The Gauss-Newton FWI method for 3D-heterogeneous elastic earth models is based on waveform sensitivity kernels and can be applied to inverse problems at various spatial scales in both Cartesian and spherical geometries. The kernels are derived in the frequency domain from Born scattering theory as the Fréchet derivatives of linearized full waveform data functionals, quantifying the influence of elastic earth model parameters on the particular waveform data values. As an important innovation, we keep two independent spatial descriptions of the earth model - one for solving the forward problem and one representing the inverted model updates. Thereby we account for the independent needs of spatial model resolution of the forward and inverse problems, respectively. Due to pre-integration of the kernels over the (in general much coarser) inversion grid, storage requirements for the sensitivity kernels are dramatically reduced. ASKI can be flexibly extended to other forward codes by providing it with specific interface routines that contain knowledge about forward-code-specific file formats and auxiliary information provided by the new forward code. In order to sustain flexibility, the ASKI tools must communicate via file output/input, so large storage capacities need to be accessible in a convenient way. Storing the complete sensitivity matrix to file, however, permits the scientist full manual control over each step in a customized procedure of sensitivity/resolution analysis and full waveform inversion.
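
    The pre-integration step that shrinks the kernel storage can be sketched in a few lines of Python; the shapes and the fine-cell volume below are invented for illustration and do not reflect ASKI's file formats.

        import numpy as np

        fine_per_coarse = 10                  # fine cells per inversion cell
        kernel_fine = np.random.rand(3000)    # kernel on the forward grid
        cell_volume = 1.0e6                   # volume of one fine cell (m^3)

        # Integrate (here: sum kernel * volume) over the fine cells that
        # make up each coarser inversion-grid cell before writing to file.
        kernel_coarse = (kernel_fine.reshape(-1, fine_per_coarse).sum(axis=1)
                         * cell_volume)
        print(kernel_fine.size, "->", kernel_coarse.size)   # 3000 -> 300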

  15. Quality assurance of reference standards from nine European solar-ultraviolet monitoring laboratories.

    PubMed

    Gröbner, Julian; Rembges, Diana; Bais, Alkiviadis F; Blumthaler, Mario; Cabot, Thierry; Josefsson, Weine; Koskela, Tapani; Thorseth, Trond M; Webb, Ann R; Wester, Ulf

    2002-07-20

    A program for quality assurance of reference standards has been initiated among nine solar-UV monitoring laboratories. By means of a traveling lamp package that comprises several 1000-W ANSI code DXW-type quartz-halogen lamps, a 0.1-ohm shunt, and a 6-1/2 digit voltmeter, the irradiance scales used by the nine laboratories were compared with one another; a relative uncertainty of 1.2% was found. The comparison of 15 reference standards yielded differences of as much as 9%; the average difference was less than 3%.
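
    The underlying arithmetic is simple; a short Python sketch with invented readings shows how a shunt voltage converts to lamp current and how a lab-to-lab relative spread is formed.

        shunt_ohms = 0.1
        shunt_volts = 0.7921                  # hypothetical voltmeter reading
        current = shunt_volts / shunt_ohms    # 7.921 A lamp operating current

        scales = [0.996, 1.004, 1.012]        # lab irradiance scales vs. mean
        spread = (max(scales) - min(scales)) / (sum(scales) / len(scales))
        print(f"I = {current:.3f} A, spread = {spread:.1%}")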

  16. Substructured multibody molecular dynamics.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Grest, Gary Stephen; Stevens, Mark Jackson; Plimpton, Steven James

    2006-11-01

    We have enhanced our parallel molecular dynamics (MD) simulation software LAMMPS (Large-scale Atomic/Molecular Massively Parallel Simulator, lammps.sandia.gov) to include many new features for accelerated simulation including articulated rigid body dynamics via coupling to the Rensselaer Polytechnic Institute code POEMS (Parallelizable Open-source Efficient Multibody Software). We use new features of the LAMMPS software package to investigate rhodopsin photoisomerization, and water model surface tension and capillary waves at the vapor-liquid interface. Finally, we motivate the recipes of MD for practitioners and researchers in numerical analysis and computational mechanics.

  17. Integrated Electronic Warfare System Advanced Development Model (ADM); Appendix 1 - Functional Requirement Specification.

    DTIC Science & Technology

    1977-10-01

    [Scanned front matter only: an approval/revision block (writer J. Kolanek, 2/6/76), a contents fragment listing deliverable documents - Computer Data Base Design Document (CDBDD), Computer Program Package (CPP), Computer Program Operator's Manual (CPOM), and Computer Program Test Plan (CPTPL) - and a list of figures including an IEWS simplified block diagram and a system controller architecture diagram.]

  18. Sky Polarization Data for Volcanic and Non-Volcanic Periods.

    DTIC Science & Technology

    1986-10-01

    [Scanned report fragment: DD Form 1473 boilerplate and partial figure-caption text noting that the vertical axes were chosen to share a common scale, that the plotting package formats the panels, and that dust and pollen aerosols of about 2 microns are comparable in size to fog and cloud droplets, as shown by clear-day observations.]

  19. Advanced Software Development Workstation Project

    NASA Technical Reports Server (NTRS)

    Lee, Daniel

    1989-01-01

    The Advanced Software Development Workstation Project, funded by Johnson Space Center, is investigating knowledge-based techniques for software reuse in NASA software development projects. Two prototypes have been demonstrated and a third is now in development. The approach is to build a foundation that provides passive reuse support, add a layer that uses domain-independent programming knowledge, add a layer that supports the acquisition of domain-specific programming knowledge to provide active support, and enhance maintainability and modifiability through an object-oriented approach. The development of new application software would use specification-by-reformulation, based on a cognitive theory of retrieval from very long-term memory in humans, and would draw on an Ada code library and an object base. Current tasks include enhancements to the knowledge representation of Ada packages and abstract data types, extensions to support Ada package instantiation knowledge acquisition, integration with Ada compilers and relational databases, enhancements to the graphical user interface, and demonstration of the system with a NASA contractor-developed trajectory simulation package. Future work will focus on investigating issues involving scale-up and integration.

  20. 49 CFR 173.242 - Bulk packagings for certain medium hazard liquids and solids, including solids with dual hazards.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... provisions specified in column 7 of the § 172.101 table. (a) Rail cars: Class DOT 103, 104, 105, 109, 111... IM 101, IM 102, and UN portable tanks when a T Code is specified in Column (7) of the § 172.101... authorized according to the IBC packaging code specified for the specific hazardous material in Column (7) of...

  1. 49 CFR 173.242 - Bulk packagings for certain medium hazard liquids and solids, including solids with dual hazards.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... provisions specified in column 7 of the § 172.101 table. (a) Rail cars: Class DOT 103, 104, 105, 109, 111...; Specification IM 101, IM 102, and UN portable tanks when a T Code is specified in Column (7) of the § 172.101... authorized according to the IBC packaging code specified for the specific hazardous material in Column (7) of...

  2. 49 CFR 173.242 - Bulk packagings for certain medium hazard liquids and solids, including solids with dual hazards.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... provisions specified in column 7 of the § 172.101 table. (a) Rail cars: Class DOT 103, 104, 105, 109, 111... IM 101, IM 102, and UN portable tanks when a T Code is specified in Column (7) of the § 172.101... authorized according to the IBC packaging code specified for the specific hazardous material in Column (7) of...

  3. 49 CFR 173.242 - Bulk packagings for certain medium hazard liquids and solids, including solids with dual hazards.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... provisions specified in column 7 of the § 172.101 table. (a) Rail cars: Class DOT 103, 104, 105, 109, 111... IM 101, IM 102, and UN portable tanks when a T Code is specified in Column (7) of the § 172.101... authorized according to the IBC packaging code specified for the specific hazardous material in Column (7) of...

  4. 49 CFR 173.242 - Bulk packagings for certain medium hazard liquids and solids, including solids with dual hazards.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... provisions specified in column 7 of the § 172.101 table. (a) Rail cars: Class DOT 103, 104, 105, 109, 111... IM 101, IM 102, and UN portable tanks when a T Code is specified in Column (7) of the § 172.101... authorized according to the IBC packaging code specified for the specific hazardous material in Column (7) of...

  5. MELCOR computer code manuals

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Summers, R.M.; Cole, R.K. Jr.; Smith, R.C.

    1995-03-01

    MELCOR is a fully integrated, engineering-level computer code that models the progression of severe accidents in light water reactor nuclear power plants. MELCOR is being developed at Sandia National Laboratories for the U.S. Nuclear Regulatory Commission as a second-generation plant risk assessment tool and the successor to the Source Term Code Package. A broad spectrum of severe accident phenomena in both boiling and pressurized water reactors is treated in MELCOR in a unified framework. These include: thermal-hydraulic response in the reactor coolant system, reactor cavity, containment, and confinement buildings; core heatup, degradation, and relocation; core-concrete attack; hydrogen production, transport, and combustion; fission product release and transport; and the impact of engineered safety features on thermal-hydraulic and radionuclide behavior. Current uses of MELCOR include estimation of severe accident source terms and their sensitivities and uncertainties in a variety of applications. This publication of the MELCOR computer code manuals corresponds to MELCOR 1.8.3, released to users in August 1994. Volume 1 contains a primer that describes MELCOR's phenomenological scope, organization (by package), and documentation. The remainder of Volume 1 contains the MELCOR Users Guides, which provide the input instructions and guidelines for each package. Volume 2 contains the MELCOR Reference Manuals, which describe the phenomenological models that have been implemented in each package.

  6. Software Integration in Multi-scale Simulations: the PUPIL System

    NASA Astrophysics Data System (ADS)

    Torras, J.; Deumens, E.; Trickey, S. B.

    2006-10-01

    The state of the art for computational tools in both computational chemistry and computational materials physics includes many algorithms and functionalities which are implemented again and again. Several projects aim to reduce, eliminate, or avoid this problem. Most such efforts seem to be focused within a particular specialty, either quantum chemistry or materials physics. Multi-scale simulations, by their very nature however, cannot respect that specialization. In simulation of fracture, for example, the energy gradients that drive the molecular dynamics (MD) come from a quantum mechanical treatment that most often derives from quantum chemistry. That “QM” region is linked to a surrounding “CM” region in which potentials yield the forces. The approach therefore requires the integration or at least inter-operation of quantum chemistry and materials physics algorithms. The same problem occurs in “QM/MM” simulations in computational biology. The challenge grows if pattern recognition or other analysis codes of some kind must be used as well. The most common mode of inter-operation is user intervention: codes are modified as needed and data files are managed “by hand” by the user (interactively and via shell scripts). User intervention is however inefficient by nature, difficult to transfer to the community, and prone to error. Some progress (e.g., Sethna’s work at Cornell [C.R. Myers et al., Mat. Res. Soc. Symp. Proc., 538 (1999) 509; C.-S. Chen et al., poster presented at the Material Research Society Meeting (2000)]) has been made on using Python scripts to achieve a more efficient level of interoperation. In this communication we present an alternative approach to merging current working packages without the necessity of major recoding and with only a relatively light wrapper interface. The scheme supports communication among the different components required for a given multi-scale calculation and access to the functionalities of those components for the potential user. A general main program manages every package, with a special communication protocol between their interfaces following the directives introduced by the user, which are stored in an XML structured file. The initial prototype of the PUPIL (Program for User Packages Interfacing and Linking) system was implemented in Java as a fast, easy prototyping object-oriented (OO) language. To test it, we applied this prototype to a previously studied problem, the fracture of a silica nanorod, joining two different packages to perform a QM/MD calculation. The results show the potential of this software system for different kinds of simulations and its simplicity of maintenance.
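
    As a sketch of the directive-file idea, the following Python snippet parses a hypothetical XML input of the kind described; the element and attribute names are invented for illustration and are not PUPIL's actual schema.

        import xml.etree.ElementTree as ET

        directives = """
        <simulation name="silica-nanorod">
          <package role="QM" name="quantum-engine"/>
          <package role="MD" name="md-engine"/>
          <exchange from="QM" to="MD" data="forces" every="1"/>
        </simulation>
        """
        root = ET.fromstring(directives)
        for pkg in root.findall("package"):
            print(pkg.get("role"), "->", pkg.get("name"))
        for ex in root.findall("exchange"):
            print("pass", ex.get("data"), "from", ex.get("from"), "to", ex.get("to"))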

  7. 49 CFR 178.523 - Standards for composite packagings with inner glass, porcelain, or stoneware receptacles.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... glass, porcelain, or stoneware receptacles. 178.523 Section 178.523 Transportation Other Regulations... Standards § 178.523 Standards for composite packagings with inner glass, porcelain, or stoneware receptacles. (a) The following are identification codes for composite packagings with inner receptacles of glass...

  8. 49 CFR 178.522 - Standards for composite packagings with inner plastic receptacles.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... plastic receptacles. 178.522 Section 178.522 Transportation Other Regulations Relating to Transportation... packagings with inner plastic receptacles. (a) The following are the identification codes for composite packagings with inner plastic receptacles: (1) 6HA1 for a plastic receptacle within a protective steel drum...

  9. 49 CFR 178.522 - Standards for composite packagings with inner plastic receptacles.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... plastic receptacles. 178.522 Section 178.522 Transportation Other Regulations Relating to Transportation... packagings with inner plastic receptacles. (a) The following are the identification codes for composite packagings with inner plastic receptacles: (1) 6HA1 for a plastic receptacle within a protective steel drum...

  10. 49 CFR 178.522 - Standards for composite packagings with inner plastic receptacles.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... plastic receptacles. 178.522 Section 178.522 Transportation Other Regulations Relating to Transportation... packagings with inner plastic receptacles. (a) The following are the identification codes for composite packagings with inner plastic receptacles: (1) 6HA1 for a plastic receptacle within a protective steel drum...

  11. 49 CFR 178.522 - Standards for composite packagings with inner plastic receptacles.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... plastic receptacles. 178.522 Section 178.522 Transportation Other Regulations Relating to Transportation... packagings with inner plastic receptacles. (a) The following are the identification codes for composite packagings with inner plastic receptacles: (1) 6HA1 for a plastic receptacle within a protective steel drum...

  12. Mixing of the Interstellar and Solar Plasmas at the Heliospheric Interface

    DOE PAGES

    Pogorelov, N. V.; Borovikov, S. N.

    2015-10-12

    From the ideal MHD perspective, the heliopause is a tangential discontinuity that separates the solar wind plasma from the local interstellar medium plasma. There are physical processes, however, that make the heliopause permeable. They can be subdivided into kinetic and MHD categories. Kinetic processes occur on small length and time scales, and cannot be resolved with MHD equations. On the other hand, MHD instabilities of the heliopause have much larger scales and can be easily observed by spacecraft. The heliopause may also be a subject of magnetic reconnection. In this paper, we discuss mechanisms of plasma mixing at the heliopause in the context of Voyager 1 observations. Numerical results are obtained with a Multi-Scale Fluid-Kinetic Simulation Suite (MS-FLUKSS), which is a package of numerical codes capable of performing adaptive mesh refinement simulations of complex plasma flows in the presence of discontinuities and charge exchange between ions and neutral atoms. The flow of the ionized component is described with the ideal MHD equations, while the transport of atoms is governed either by the Boltzmann equation or multiple Euler gas dynamics equations. The code can also treat nonthermal ions and turbulence produced by them.

  13. Basic Business and Economics: Understanding the Uses of the Universal Product Code

    ERIC Educational Resources Information Center

    Blockhus, Wanda

    1977-01-01

    Describes the Universal Product Code (UPC), the two-part food labeling and packaging code which is both human- and electronic scanner-readable. Discusses how it affects both consumer and business, and suggests how to teach the UPC code to business education students. (HD)

  14. Supporting 64-bit global indices in Epetra and other Trilinos packages

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jhurani, Chetan; Austin, Travis M.; Heroux, Michael Allen

    The Trilinos Project is an effort to facilitate the design, development, integration and ongoing support of mathematical software libraries within an object-oriented framework. It is intended for large-scale, complex multiphysics engineering and scientific applications [2, 4, 3]. Epetra is one of its basic packages. It provides serial and parallel linear algebra capabilities. Before Trilinos version 11.0, released in 2012, Epetra used the C++ int data-type for storing global and local indices for degrees of freedom (DOFs). Since int is typically 32-bit, this limited the largest problem size to be smaller than approximately two billion DOFs. This was true even if a distributed memory machine could handle larger problems. We have added optional support for the C++ long long data-type, which is at least 64-bit wide, for global indices. To save memory, maintain the speed of memory-bound operations, and reduce further changes to the code, the local indices are still 32-bit. We document the changes required to achieve this feature and how the new functionality can be used. We also report on the lessons learned in modifying a mature and popular package from various perspectives: design goals, backward compatibility, engineering decisions, C++ language features, effects on existing users and other packages, and build integration.
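
    A small Python/NumPy sketch of the indexing arithmetic shows why 32-bit global indices cap problem sizes near 2^31 degrees of freedom, while per-process local indices can safely stay 32-bit (the ID values below are illustrative).

        import numpy as np

        print(np.iinfo(np.int32).max)         # 2147483647: the 32-bit ceiling

        # 64-bit global IDs well past the 32-bit limit...
        global_ids = np.arange(2_999_999_000, 3_000_000_000, dtype=np.int64)

        # ...map to small local offsets on each process, which fit in int32.
        offset = global_ids[0]                # first global ID on this rank
        local_ids = (global_ids - offset).astype(np.int32)
        print(global_ids[-1], local_ids[-1])  # 2999999999 999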

  15. FRAMES Metadata Reporting Templates for Ecohydrological Observations, version 1.1

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Christianson, Danielle; Varadharajan, Charuleka; Christoffersen, Brad

    FRAMES is a set of Excel metadata files and package-level descriptive metadata that are designed to facilitate and improve capture of desired metadata for ecohydrological observations. The metadata are bundled with data files into a data package and submitted to a data repository (e.g., the NGEE Tropics Data Repository) via a web form. FRAMES standardizes reporting of diverse ecohydrological and biogeochemical data for synthesis across a range of spatiotemporal scales and incorporates many best data science practices. This version of FRAMES supports observations for primarily automated measurements collected by permanently located sensors, including sap flow (tree water use), leaf surface temperature, soil water content, dendrometry (stem diameter growth increment), and solar radiation. Version 1.1 extends the controlled vocabulary and incorporates functionality to facilitate programmatic use of data and FRAMES metadata (R code available at the NGEE Tropics Data Repository).

  16. A study of the compatibility of an existing CFD package with a broader class of material constitutions

    NASA Technical Reports Server (NTRS)

    French, K. W., Jr.

    1985-01-01

    The flexibility of the PHOENICS computational fluid dynamics package was assessed along two general avenues: parallel modeling and analog modeling. In parallel modeling the dependent and independent variables retain their identity within some scaling factors, even though the boundary conditions and especially the constitutive relations do not correspond to any realistic fluid dynamic situation. PHOENICS was used to generate a CFD model that should exhibit the physical anomalies of a granular medium and permit reasonable similarity with boundary conditions typical of membrane or porous piston loading. A considerable portion of the study was spent prying into the existing code, with a prejudice toward rate-type constitutive behavior, and disarming any inherent fluid behavior. The final stages of the study were directed at the more specific problem of multiaxis loading of cylindrical geometry with a concern for the appearance of bulging and cross-slab shear failure modes.

  17. Parallel Adaptive Mesh Refinement Library

    NASA Technical Reports Server (NTRS)

    MacNeice, Peter; Olson, Kevin

    2005-01-01

    Parallel Adaptive Mesh Refinement Library (PARAMESH) is a package of Fortran 90 subroutines designed to provide a computer programmer with an easy route to extension of (1) a previously written serial code that uses a logically Cartesian structured mesh into (2) a parallel code with adaptive mesh refinement (AMR). Alternatively, in its simplest use, and with minimal effort, PARAMESH can operate as a domain-decomposition tool for users who want to parallelize their serial codes but who do not wish to utilize adaptivity. The package builds a hierarchy of sub-grids to cover the computational domain of a given application program, with spatial resolution varying to satisfy the demands of the application. The sub-grid blocks form the nodes of a tree data structure (a quad-tree in two or an oct-tree in three dimensions). Each grid block has a logically Cartesian mesh. The package supports one-, two- and three-dimensional models.
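
    A minimal Python sketch (not PARAMESH itself, which is Fortran 90) of the block-tree idea: each node owns a small logically Cartesian mesh, and refining a 2-D block spawns four half-size children, forming a quad-tree.

        class Block:
            def __init__(self, x0, y0, size, level, nxb=8):
                self.x0, self.y0, self.size, self.level = x0, y0, size, level
                self.nxb = nxb                # cells per block edge
                self.children = []

            def refine(self):
                h = self.size / 2
                self.children = [
                    Block(self.x0 + i * h, self.y0 + j * h, h,
                          self.level + 1, self.nxb)
                    for j in (0, 1) for i in (0, 1)
                ]

        def leaves(b):
            return [b] if not b.children else [l for c in b.children
                                               for l in leaves(c)]

        root = Block(0.0, 0.0, 1.0, level=0)
        root.refine()
        root.children[0].refine()             # finer resolution in one corner
        print(len(leaves(root)), "leaf blocks")   # 7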

  18. 49 CFR 178.503 - Marking of packagings.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... that is represented as manufactured to meet a UN standard with the marks specified in this section. The... marks should be used to separate this information. A packaging conforming to a UN standard must be... “UN” may be applied in place of the symbol); (2) A packaging identification code designating the type...

  19. 49 CFR 178.503 - Marking of packagings.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... that is represented as manufactured to meet a UN standard with the marks specified in this section. The... marks should be used to separate this information. A packaging conforming to a UN standard must be... “UN” may be applied in place of the symbol); (2) A packaging identification code designating the type...

  20. 49 CFR 178.503 - Marking of packagings.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... that is represented as manufactured to meet a UN standard with the marks specified in this section. The... marks should be used to separate this information. A packaging conforming to a UN standard must be... “UN” may be applied in place of the symbol); (2) A packaging identification code designating the type...

  1. 49 CFR 178.503 - Marking of packagings.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... that is represented as manufactured to meet a UN standard with the marks specified in this section. The... marks should be used to separate this information. A packaging conforming to a UN standard must be... “UN” may be applied in place of the symbol); (2) A packaging identification code designating the type...

  2. RMG An Open Source Electronic Structure Code for Multi-Petaflops Calculations

    NASA Astrophysics Data System (ADS)

    Briggs, Emil; Lu, Wenchang; Hodak, Miroslav; Bernholc, Jerzy

    RMG (Real-space Multigrid) is an open source, density functional theory code for quantum simulations of materials. It solves the Kohn-Sham equations on real-space grids, which allows for natural parallelization via domain decomposition. Either subspace or Davidson diagonalization, coupled with multigrid methods, is used to accelerate convergence. RMG is a cross-platform open source package which has been used in the study of a wide range of systems, including semiconductors, biomolecules, and nanoscale electronic devices. It can optionally use GPU accelerators to improve performance on systems where they are available. The recently released versions (>2.0) support multiple GPUs per compute node and have improved performance and scalability, enhanced accuracy, and support for additional hardware platforms. New versions of the code are regularly released at http://www.rmgdft.org. The releases include binaries for Linux, Windows and Macintosh systems, automated builds for clusters using cmake, as well as versions adapted to the major supercomputing installations and platforms. Several recent, large-scale applications of RMG will be discussed.

  3. PARAMESH: A Parallel Adaptive Mesh Refinement Community Toolkit

    NASA Technical Reports Server (NTRS)

    MacNeice, Peter; Olson, Kevin M.; Mobarry, Clark; deFainchtein, Rosalinda; Packer, Charles

    1999-01-01

    In this paper, we describe a community toolkit which is designed to provide parallel support with adaptive mesh capability for a large and important class of computational models, those using structured, logically Cartesian meshes. The package of Fortran 90 subroutines, called PARAMESH, is designed to provide an application developer with an easy route to extend an existing serial code which uses a logically Cartesian structured mesh into a parallel code with adaptive mesh refinement. Alternatively, in its simplest use, and with minimal effort, it can operate as a domain decomposition tool for users who want to parallelize their serial codes, but who do not wish to use adaptivity. The package can provide them with an incremental evolutionary path for their code, converting it first to a uniformly refined parallel code, and then later, if they so desire, adding adaptivity.

  4. JADAMILU: a software code for computing selected eigenvalues of large sparse symmetric matrices

    NASA Astrophysics Data System (ADS)

    Bollhöfer, Matthias; Notay, Yvan

    2007-12-01

    A new software code for computing selected eigenvalues and associated eigenvectors of a real symmetric matrix is described. The eigenvalues are either the smallest or those closest to some specified target, which may be in the interior of the spectrum. The underlying algorithm combines the Jacobi-Davidson method with efficient multilevel incomplete LU (ILU) preconditioning. Key features are modest memory requirements and robust convergence to accurate solutions. Parameters needed for incomplete LU preconditioning are automatically computed and may be updated at run time depending on the convergence pattern. The software is easy to use by non-experts and its top-level routines are written in FORTRAN 77. Its potentialities are demonstrated on a few applications taken from computational physics. Program summary: Program title: JADAMILU. Catalogue identifier: ADZT_v1_0. Program summary URL: http://cpc.cs.qub.ac.uk/summaries/ADZT_v1_0.html. Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland. Licensing provisions: standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html. No. of lines in distributed program, including test data, etc.: 101 359. No. of bytes in distributed program, including test data, etc.: 7 493 144. Distribution format: tar.gz. Programming language: Fortran 77. Computer: Intel or AMD with g77 and pgf; Intel EM64T or Itanium with ifort; AMD Opteron with g77, pgf and ifort; Power (IBM) with xlf90. Operating system: Linux, AIX. RAM: problem dependent. Word size: real: 8; integer: 4 or 8, according to user's choice. Classification: 4.8. Nature of problem: any physical problem requiring the computation of a few eigenvalues of a symmetric matrix. Solution method: Jacobi-Davidson combined with multilevel ILU preconditioning. Additional comments: binaries rather than source code are supplied because JADAMILU uses the following external packages: MC64 (copyrighted software, not freely available; copyright (c) 1999 Council for the Central Laboratory of the Research Councils); AMD (copyright (c) 2004-2006 by Timothy A. Davis, Patrick R. Amestoy, and Iain S. Duff; source code distributed by the authors under the GNU LGPL licence); BLAS (the reference BLAS is a freely available software package, available from netlib via anonymous ftp and the World Wide Web); and LAPACK (the complete LAPACK package or individual routines from LAPACK are freely available on netlib via the World Wide Web or anonymous ftp). For maximal benefit to the community, the sources we are proprietary of were added to the tar.gz file submitted for inclusion in the CPC library; however, as explained in the README file, users willing to compile the code instead of using binaries should first obtain the sources for the external packages mentioned above (email and/or web addresses are provided). Running time: problem dependent; the test examples provided with the code only take a few seconds to run; timing results for large-scale problems are given in Section 5.
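
    For readers who want to experiment with the same problem class from Python, SciPy's sparse eigensolver offers an analogous computation (shift-invert rather than Jacobi-Davidson/ILU) of a few eigenvalues near an interior target; this is a stand-in, not JADAMILU's FORTRAN 77 interface.

        import numpy as np
        from scipy.sparse import diags
        from scipy.sparse.linalg import eigsh

        # Sparse symmetric test matrix: the 1D discrete Laplacian.
        n = 2000
        A = diags([-1.0, 2.0, -1.0], [-1, 0, 1], shape=(n, n), format="csc")

        # Five eigenvalues closest to the interior target 1.0.
        vals, vecs = eigsh(A, k=5, sigma=1.0, which="LM")
        print(np.sort(vals))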

  5. An electron-beam dose deposition experiment: TIGER 1-D simulation code versus thermoluminescent dosimetry

    NASA Astrophysics Data System (ADS)

    Murrill, Steven R.; Tipton, Charles W.; Self, Charles T.

    1991-03-01

    The dose absorbed in an integrated circuit (IC) die exposed to a pulse of low-energy electrons is a strong function of both electron energy and surrounding packaging materials. This report describes an experiment designed to measure how well the Integrated TIGER Series one-dimensional (1-D) electron transport simulation program predicts dose correction factors for a state-of-the-art IC package and package/printed circuit board (PCB) combination. These derived factors are compared with data obtained experimentally using thermoluminescent dosimeters (TLD's) and the FX-45 flash x-ray machine (operated in electron-beam (e-beam) mode). The results of this experiment show that the TIGER 1-D simulation code can be used to accurately predict FX-45 e-beam dose deposition correction factors for reasonably complex IC packaging configurations.

  6. SpecBit, DecayBit and PrecisionBit: GAMBIT modules for computing mass spectra, particle decay rates and precision observables

    NASA Astrophysics Data System (ADS)

    Athron, Peter; Balázs, Csaba; Dal, Lars A.; Edsjö, Joakim; Farmer, Ben; Gonzalo, Tomás E.; Kvellestad, Anders; McKay, James; Putze, Antje; Rogan, Chris; Scott, Pat; Weniger, Christoph; White, Martin

    2018-01-01

    We present the GAMBIT modules SpecBit, DecayBit and PrecisionBit. Together they provide a new framework for linking publicly available spectrum generators, decay codes and other precision observable calculations in a physically and statistically consistent manner. This allows users to automatically run various combinations of existing codes as if they are a single package. The modular design allows software packages fulfilling the same role to be exchanged freely at runtime, with the results presented in a common format that can easily be passed to downstream dark matter, collider and flavour codes. These modules constitute an essential part of the broader GAMBIT framework, a major new software package for performing global fits. In this paper we present the observable calculations, data, and likelihood functions implemented in the three modules, as well as the conventions and assumptions used in interfacing them with external codes. We also present 3-BIT-HIT, a command-line utility for computing mass spectra, couplings, decays and precision observables in the MSSM, which shows how the three modules can easily be used independently of GAMBIT.

  7. Computational models for the viscous/inviscid analysis of jet aircraft exhaust plumes

    NASA Astrophysics Data System (ADS)

    Dash, S. M.; Pergament, H. S.; Thorpe, R. D.

    1980-05-01

    Computational models which analyze viscous/inviscid flow processes in jet aircraft exhaust plumes are discussed. These models are component parts of a NASA-LaRC method for the prediction of nozzle afterbody drag. Inviscid/shock processes are analyzed by the SCIPAC code which is a compact version of a generalized shock capturing, inviscid plume code (SCIPPY). The SCIPAC code analyzes underexpanded jet exhaust gas mixtures with a self-contained thermodynamic package for hydrocarbon exhaust products and air. A detailed and automated treatment of the embedded subsonic zones behind Mach discs is provided in this analysis. Mixing processes along the plume interface are analyzed by two upgraded versions of an overlaid, turbulent mixing code (BOAT) developed previously for calculating nearfield jet entrainment. The BOATAC program is a frozen chemistry version of BOAT containing the same aircraft thermodynamic package as SCIPAC; BOATAB is an afterburning version with a self-contained aircraft (hydrocarbon/air) finite-rate chemistry package. The coupling of viscous and inviscid flow processes is achieved by an overlaid procedure with interactive effects accounted for by a displacement thickness type correction to the inviscid plume interface.

  8. Computational models for the viscous/inviscid analysis of jet aircraft exhaust plumes. [predicting afterbody drag

    NASA Technical Reports Server (NTRS)

    Dash, S. M.; Pergament, H. S.; Thorpe, R. D.

    1980-01-01

    Computational models which analyze viscous/inviscid flow processes in jet aircraft exhaust plumes are discussed. These models are component parts of a NASA-LaRC method for the prediction of nozzle afterbody drag. Inviscid/shock processes are analyzed by the SCIPAC code which is a compact version of a generalized shock capturing, inviscid plume code (SCIPPY). The SCIPAC code analyzes underexpanded jet exhaust gas mixtures with a self-contained thermodynamic package for hydrocarbon exhaust products and air. A detailed and automated treatment of the embedded subsonic zones behind Mach discs is provided in this analysis. Mixing processes along the plume interface are analyzed by two upgraded versions of an overlaid, turbulent mixing code (BOAT) developed previously for calculating nearfield jet entrainment. The BOATAC program is a frozen chemistry version of BOAT containing the same aircraft thermodynamic package as SCIPAC; BOATAB is an afterburning version with a self-contained aircraft (hydrocarbon/air) finite-rate chemistry package. The coupling of viscous and inviscid flow processes is achieved by an overlaid procedure with interactive effects accounted for by a displacement thickness type correction to the inviscid plume interface.

  9. MELCOR computer code manuals: Primer and user`s guides, Version 1.8.3 September 1994. Volume 1

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Summers, R.M.; Cole, R.K. Jr.; Smith, R.C.

    1995-03-01

    MELCOR is a fully integrated, engineering-level computer code that models the progression of severe accidents in light water reactor nuclear power plants. MELCOR is being developed at Sandia National Laboratories for the US Nuclear Regulatory Commission as a second-generation plant risk assessment tool and the successor to the Source Term Code Package. A broad spectrum of severe accident phenomena in both boiling and pressurized water reactors is treated in MELCOR in a unified framework. These include: thermal-hydraulic response in the reactor coolant system, reactor cavity, containment, and confinement buildings; core heatup, degradation, and relocation; core-concrete attack; hydrogen production, transport, and combustion; fission product release and transport; and the impact of engineered safety features on thermal-hydraulic and radionuclide behavior. Current uses of MELCOR include estimation of severe accident source terms and their sensitivities and uncertainties in a variety of applications. This publication of the MELCOR computer code manuals corresponds to MELCOR 1.8.3, released to users in August 1994. Volume 1 contains a primer that describes MELCOR's phenomenological scope, organization (by package), and documentation. The remainder of Volume 1 contains the MELCOR Users' Guides, which provide the input instructions and guidelines for each package. Volume 2 contains the MELCOR Reference Manuals, which describe the phenomenological models that have been implemented in each package.

  10. Easily extensible unix software for spectral analysis, display, modification, and synthesis of musical sounds

    NASA Astrophysics Data System (ADS)

    Beauchamp, James W.

    2002-11-01

    Software has been developed which enables users to perform time-varying spectral analysis of individual musical tones or successions of them and to perform further processing of the data. The package, called sndan, is freely available in source code, uses EPS graphics for display, and is written in ANSI C for ease of code modification and extension. Two analyzers, a fixed-filter-bank phase vocoder ("pvan") and a frequency-tracking analyzer ("mqan"), constitute the analysis front end of the package. While pvan's output consists of continuous amplitudes and frequencies of harmonics, mqan produces disjoint "tracks." However, another program extracts a fundamental frequency and separates harmonics from the tracks, resulting in a continuous harmonic output. "monan" is a program used to display harmonic data in a variety of formats, perform various spectral modifications, and perform additive resynthesis of the harmonic partials, including possible pitch-shifting and time-scaling. Sounds can also be synthesized according to a musical score using a companion synthesis language, Music 4C. Several other programs in the sndan suite can be used for specialized tasks, such as signal display and editing. Applications of the software include producing specialized sounds for music compositions or psychoacoustic experiments, or serving as a basis for developing new synthesis algorithms.
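
    The kind of time-varying harmonic data pvan produces can be sketched in Python: frame the signal, window it, take an FFT per frame, and sample the magnitude at multiples of an assumed fundamental. This illustrates the analysis idea only, not sndan's C implementation or file formats.

        import numpy as np

        sr, f0 = 8000, 220.0                       # sample rate, fundamental
        t = np.arange(sr) / sr                     # one second of signal
        x = 0.6 * np.sin(2 * np.pi * f0 * t) + 0.2 * np.sin(4 * np.pi * f0 * t)

        frame, hop = 1024, 256
        window = np.hanning(frame)
        freqs = np.fft.rfftfreq(frame, 1 / sr)
        amps = []
        for start in range(0, len(x) - frame, hop):
            spec = np.fft.rfft(x[start:start + frame] * window)
            bins = [np.argmin(np.abs(freqs - k * f0)) for k in (1, 2, 3)]
            amps.append([2 * np.abs(spec[b]) / window.sum() for b in bins])
        amps = np.array(amps)   # rows: frames; columns: harmonics 1-3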

  11. The R package 'Luminescence': a history of unexpected complexity and concepts to deal with it

    NASA Astrophysics Data System (ADS)

    Kreutzer, Sebastian; Burow, Christoph; Dietze, Michael; Fuchs, Margret C.; Friedrich, Johannes; Fischer, Manfred; Schmidt, Christoph

    2017-04-01

    Overcoming limitations in the standard software used so far, developing an efficient solution of low weight for a very specific task, or creating graphs of high quality: the reasons that may have initially led a scientist to work with R are manifold. And as long as developed solutions, e.g., R scripts, are needed for personal use only, code can remain unstructured and documentation is not compulsory. However, this changes with the first friendly request for help after the code has been reused by others. In contrast to single scripts, written without intention to ever get published, for R packages the CRAN policy demands a more structured and elaborated approach, including a minimum of documentation. Nevertheless, growing projects with thousands of lines of code that need to be maintained can become overwhelming, in particular as researchers are not by definition experts on managing software projects. The R package 'Luminescence' (Kreutzer et al., 2017), a collection of tools dealing with the analysis of luminescence data in a geoscientific, geochronological context, started as one single R script, but quickly evolved into a comprehensive solution connected with various other R packages. We present (1) a very brief development history of the package 'Luminescence', before we (2) sketch technical challenges encountered over time and solutions that have been found to deal with them using various open source tools. Our presentation is offered as a collection of concepts and approaches to set up R projects in geosciences. References: Kreutzer, S., Dietze, M., Burow, C., Fuchs, M. C., Schmidt, C., Fischer, M., Friedrich, J., 2017. Luminescence: Comprehensive Luminescence Dating Data Analysis. R package version 0.6.4. https://CRAN.R-project.org/package=Luminescence

  12. Higgs mass prediction in the MSSM at three-loop level in a pure $\overline{\text{DR}}$ context

    NASA Astrophysics Data System (ADS)

    Harlander, Robert V.; Klappert, Jonas; Voigt, Alexander

    2017-12-01

    The impact of the three-loop effects of order $\alpha_t \alpha_s^2$ on the mass of the light CP-even Higgs boson in the MSSM is studied in a pure $\overline{\text{DR}}$ context. For this purpose, we implement the results of Kant et al. (JHEP 08:104, 2010) into the C++ module Himalaya and link it to FlexibleSUSY, a Mathematica and C++ package to create spectrum generators for BSM models. The three-loop result is compared to the fixed-order two-loop calculations of the original FlexibleSUSY and of FeynHiggs, as well as to the result based on an EFT approach. Aside from the expected reduction of the renormalization scale dependence with respect to the lower-order results, we find that the three-loop contributions significantly reduce the difference from the EFT prediction in the TeV-region of the SUSY scale $M_S$. Himalaya can be linked also to other two-loop $\overline{\text{DR}}$ codes, thus allowing for the elevation of these codes to the three-loop level.

  13. ECCD-induced tearing mode stabilization in coupled IPS/NIMROD/GENRAY HPC simulations

    NASA Astrophysics Data System (ADS)

    Jenkins, Thomas; Kruger, S. E.; Held, E. D.; Harvey, R. W.; Elwasif, W. R.; Schnack, D. D.; SWIM Project Team

    2011-10-01

    We present developments toward an integrated, predictive model for determining optimal ECCD-based NTM stabilization strategies in ITER. We demonstrate the capability of the SWIM Project's Integrated Plasma Simulator (IPS) framework to choreograph multiple executions of, and data exchanges between, physics codes modeling various spatiotemporal scales of this coupled RF/MHD problem on several thousand HPC processors. As NIMROD evolves fluid equations to model bulk plasma behavior, self-consistent propagation/deposition of RF power in the ensuing plasma profiles is calculated by GENRAY. A third code (QLCALC) then interfaces with computational geometry packages to construct the RF-induced quasilinear diffusion tensor from NIMROD/GENRAY data, and the moments of this tensor (entering as additional terms in NIMROD's fluid equations due to the disparity in RF/MHD spatiotemporal scales) influence the dynamics of current, momentum, and energy evolution. Initial results are shown to correctly capture the physics of magnetic island stabilization [Jenkins et al., PoP 17, 012502 (2010)]; we also discuss the development of a numerical plasma control system for active feedback stabilization of tearing modes. Funded by USDoE SciDAC.

  14. ARES: Automated response function code. User's manual. [HPGAM and LSQVM]

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Maung, T.; Reynolds, G.M.

    This ARES user's manual provides detailed instructions for a general understanding of the Automated Response Function Code and gives step-by-step instructions for using the complete code package on an HP-1000 system. This code is designed to calculate response functions of NaI gamma-ray detectors with cylindrical or rectangular geometries.

  15. Universal Noiseless Coding Subroutines

    NASA Technical Reports Server (NTRS)

    Schlutsmeyer, A. P.; Rice, R. F.

    1986-01-01

    The software package consists of FORTRAN subroutines that perform universal noiseless coding and decoding of integer and binary data strings. The purpose of this type of coding is to achieve data compression in the sense that the coded data represent the original data perfectly (noiselessly) while taking fewer bits to do so. The routines are universal because they apply to virtually any "real-world" data source.
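
    A compact Python sketch of one classic universal technique in this family, Rice/Golomb coding of non-negative integers; this illustrates the idea only, while the actual subroutines are FORTRAN and implement the Rice algorithms more elaborately.

        def rice_encode(values, k):
            """Unary quotient + k-bit remainder per value."""
            bits = ""
            for v in values:
                q, r = v >> k, v & ((1 << k) - 1)
                bits += "1" * q + "0" + format(r, f"0{k}b")
            return bits

        def rice_decode(bits, k, count):
            out, i = [], 0
            for _ in range(count):
                q = 0
                while bits[i] == "1":
                    q, i = q + 1, i + 1
                i += 1                       # skip the terminating 0
                out.append((q << k) | int(bits[i:i + k], 2))
                i += k
            return out

        data = [3, 0, 7, 2, 5]
        code = rice_encode(data, k=2)
        assert rice_decode(code, 2, len(data)) == data   # noiseless round trip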

  16. SPEXTRA: Optimal extraction code for long-slit spectra in crowded fields

    NASA Astrophysics Data System (ADS)

    Sarkisyan, A. N.; Vinokurov, A. S.; Solovieva, Yu. N.; Sholukhova, O. N.; Kostenkov, A. E.; Fabrika, S. N.

    2017-10-01

    We present a code for the optimal extraction of long-slit 2D spectra in crowded stellar fields. Its main advantage and difference from existing spectrum extraction codes is the presence of a graphical user interface (GUI) and a convenient visualization system for data and extraction parameters. On the whole, the package is designed to study stars in crowded fields of nearby galaxies and star clusters in galaxies. Apart from extracting the spectra of several closely spaced or overlapping stars, it allows the spectra of objects to be extracted with subtraction of superimposed nebulae of different shapes and different degrees of ionization. The package can also be used to study single stars in the case of a strong background. In the current version, optimal extraction of 2D spectra with an aperture and a Gaussian function as the PSF (point spread function) is provided. In the future, the package will be supplemented with the option to build a PSF based on a Moffat function. We present the details of the GUI, illustrate the main features of the package, and show extraction results for several interesting objects observed with different telescopes.

  17. The equation of state package FEOS for high energy density matter

    NASA Astrophysics Data System (ADS)

    Faik, Steffen; Tauschwitz, Anna; Iosilevskiy, Igor

    2018-06-01

    Adequate equation of state (EOS) data are of high interest in the growing field of high energy density physics and especially essential for hydrodynamic simulation codes. The semi-analytical method used in the newly developed Frankfurt equation of state (FEOS) package provides easy and fast access to the EOS of, in principle, arbitrary materials. The code is based on the well-known QEOS model (More et al., 1988; Young and Corey, 1995) and is a further development of the MPQeos code (Kemp and Meyer-ter-Vehn, 1988; Kemp and Meyer-ter-Vehn, 1998) from the Max-Planck-Institut für Quantenoptik (MPQ) in Garching, Germany. Features include the calculation of homogeneous mixtures of chemical elements and the description of the liquid-vapor two-phase region with or without a Maxwell construction. Full flexibility of the package is assured by its structure: a program library provides the EOS through an interface designed for Fortran or C/C++ codes. Two additional software tools allow for the generation of EOS tables in different file output formats and for the calculation and visualization of isolines and Hugoniot shock adiabats. As an example, the EOS of fused silica (SiO2) is calculated and compared to experimental data and other EOS codes.

  18. Packaging of electro-microfluidic devices

    DOEpatents

    Benavides, Gilbert L.; Galambos, Paul C.; Emerson, John A.; Peterson, Kenneth A.; Giunta, Rachel K.; Zamora, David Lee; Watson, Robert D.

    2003-04-15

    A new architecture for packaging surface micromachined electro-microfluidic devices is presented. This architecture relies on two scales of packaging to bring fluid to the device scale (picoliters) from the macro-scale (microliters). The architecture emulates and utilizes electronics packaging technology. The larger package consists of a circuit board with embedded fluidic channels and standard fluidic connectors (e.g. Fluidic Printed Wiring Board). The embedded channels connect to the smaller package, an Electro-Microfluidic Dual-Inline-Package (EMDIP) that takes fluid to the microfluidic integrated circuit (MIC). The fluidic connection is made to the back of the MIC through Bosch-etched holes that take fluid to surface micromachined channels on the front of the MIC. Electrical connection is made to bond pads on the front of the MIC.

  19. Packaging of electro-microfluidic devices

    DOEpatents

    Benavides, Gilbert L.; Galambos, Paul C.; Emerson, John A.; Peterson, Kenneth A.; Giunta, Rachel K.; Watson, Robert D.

    2002-01-01

    A new architecture for packaging surface micromachined electro-microfluidic devices is presented. This architecture relies on two scales of packaging to bring fluid to the device scale (picoliters) from the macro-scale (microliters). The architecture emulates and utilizes electronics packaging technology. The larger package consists of a circuit board with embedded fluidic channels and standard fluidic connectors (e.g. Fluidic Printed Wiring Board). The embedded channels connect to the smaller package, an Electro-Microfluidic Dual-Inline-Package (EMDIP) that takes fluid to the microfluidic integrated circuit (MIC). The fluidic connection is made to the back of the MIC through Bosch-etched holes that take fluid to surface micromachined channels on the front of the MIC. Electrical connection is made to bond pads on the front of the MIC.

  20. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Carpenter, John H.; Belcourt, Kenneth Noel

    Completion of the CASL L3 milestone THM.CFD.P6.03 provides a tabular material properties capability to the Hydra code. A tabular interpolation package used in Sandia codes was modified to support the needs of multi-phase solvers in Hydra. Use of the interface is described. The package was released to Hydra under a government use license. A dummy physics was created in Hydra to prototype use of the interpolation routines. Finally, a test using the dummy physics verifies the correct behavior of the interpolation for a test water table.
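
    The behavior being verified - interpolating a material property from a table - can be sketched with SciPy as a stand-in; this is not the Sandia package's interface, and the table values below are invented.

        import numpy as np
        from scipy.interpolate import RegularGridInterpolator

        T = np.array([300.0, 400.0, 500.0])       # temperature (K)
        P = np.array([1.0e5, 1.0e6, 1.0e7])       # pressure (Pa)
        rho = np.array([[996.0, 997.0, 1001.0],   # density (kg/m^3) vs (T, P)
                        [937.0, 938.0, 943.0],
                        [830.0, 832.0, 838.0]])

        lookup = RegularGridInterpolator((T, P), rho)   # linear by default
        print(lookup([[350.0, 5.0e5]]))                 # off-node query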

  1. The VENUS/NWChem software package. Tight coupling between chemical dynamics simulations and electronic structure theory

    NASA Astrophysics Data System (ADS)

    Lourderaj, Upakarasamy; Sun, Rui; Kohale, Swapnil C.; Barnes, George L.; de Jong, Wibe A.; Windus, Theresa L.; Hase, William L.

    2014-03-01

    The interface for VENUS and NWChem, and the resulting software package for direct dynamics simulations are described. The coupling of the two codes is considered to be a tight coupling since the two codes are compiled and linked together and act as one executable with data being passed between the two codes through routine calls. The advantages of this type of coupling are discussed. The interface has been designed to have as little interference as possible with the core codes of both VENUS and NWChem. VENUS is the code that propagates the direct dynamics trajectories and, therefore, is the program that drives the overall execution of VENUS/NWChem. VENUS has remained an essentially sequential code, which uses the highly parallel structure of NWChem. Subroutines of the interface that accomplish the data transmission and communication between the two computer programs are described. Recent examples of the use of VENUS/NWChem for direct dynamics simulations are summarized.

  2. User's guide to the Variably Saturated Flow (VSF) process to MODFLOW

    USGS Publications Warehouse

    Thoms, R. Brad; Johnson, Richard L.; Healy, Richard W.

    2006-01-01

    A new process for simulating three-dimensional (3-D) variably saturated flow (VSF) using Richards' equation has been added to the 3-D modular finite-difference ground-water model MODFLOW. Five new packages are presented here as part of the VSF Process--the Richards' Equation Flow (REF1) Package, the Seepage Face (SPF1) Package, the Surface Ponding (PND1) Package, the Surface Evaporation (SEV1) Package, and the Root Zone Evapotranspiration (RZE1) Package. Additionally, a new Adaptive Time-Stepping (ATS1) Package is presented for use by both the Ground-Water Flow (GWF) Process and VSF. The VSF Process allows simulation of flow in unsaturated media above the ground-water zone and facilitates modeling of ground-water/surface-water interactions. Model performance is evaluated by comparison to an analytical solution for one-dimensional (1-D) constant-head infiltration (Dirichlet boundary condition), field experimental data for a 1-D constant-head infiltration, laboratory experimental data for two-dimensional (2-D) constant-flux infiltration (Neumann boundary condition), laboratory experimental data for 2-D transient drainage through a seepage face, and numerical model results (VS2DT) of a 2-D flow-path simulation using realistic surface boundary conditions. A hypothetical 3-D example case also is presented to demonstrate the new capability using periodic boundary conditions (for example, daily precipitation) and varied surface topography over a larger spatial scale (0.133 square kilometer). The new model capabilities retain the modular structure of the MODFLOW code and preserve MODFLOW's existing capabilities as well as compatibility with commercial pre-/post-processors. The overall success of the VSF Process in simulating mixed boundary conditions and variable soil types demonstrates its utility for future hydrologic investigations. This report presents a new flow package implementing the governing equations for variably saturated ground-water flow, four new boundary condition packages unique to unsaturated flow, the Adaptive Time-Stepping Package for use with both the GWF Process and the new VSF Process, detailed descriptions of the input and output files for each package, and six simulation examples verifying model performance.
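
    The role of an adaptive time-stepping package can be illustrated with a generic controller in Python; the growth/cutback logic and constants below are a common pattern, assumed for illustration rather than taken from the ATS1 Package.

        def advance(solve, t_end, dt, dt_min=1e-6, dt_max=3600.0):
            """March to t_end, adapting dt to solver behavior."""
            t = 0.0
            while t < t_end:
                dt = min(dt, t_end - t)
                ok, iters = solve(t, dt)         # user-supplied nonlinear solve
                if not ok:
                    dt = max(dt * 0.5, dt_min)   # failed step: cut and retry
                    continue
                t += dt
                if iters < 5:                    # easy convergence: grow step
                    dt = min(dt * 1.5, dt_max)
            return t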

  3. The 9th international symposium on the packaging and transportation of radioactive materials

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    None

    1989-06-01

    This three-volume document contains the papers and poster sessions presented at the symposium. Volume 3 contains 87 papers on topics such as structural codes and benchmarking, shipment of plutonium by air, spent fuel shipping, planning, package design and risk assessment, package testing, OCRWM operations experience, and regulations. Individual papers were processed separately for the database. (TEM)

  4. AlgoRun: a Docker-based packaging system for platform-agnostic implemented algorithms.

    PubMed

    Hosny, Abdelrahman; Vera-Licona, Paola; Laubenbacher, Reinhard; Favre, Thibauld

    2016-08-01

    There is a growing need in bioinformatics for easy-to-use software implementations of algorithms that are usable across platforms. At the same time, reproducibility of computational results is critical and often a challenge, owing to source code changes over time and to dependencies. The approach introduced in this paper addresses both of these needs with AlgoRun, a dedicated packaging system for implemented algorithms, using Docker technology. Implemented algorithms, packaged with AlgoRun, can be executed through a user-friendly interface directly from a web browser or via a standardized RESTful web API to allow easy integration into more complex workflows. The packaged algorithm includes the entire software execution environment, thereby eliminating the common problem of software dependencies and the irreproducibility of computations over time. AlgoRun-packaged algorithms can be published on http://algorun.org, a centralized searchable directory of existing AlgoRun-packaged algorithms. AlgoRun is available at http://algorun.org, and the source code is available under the GPL license at https://github.com/algorun. Contact: laubenbacher@uchc.edu. Supplementary data are available at Bioinformatics online. © The Author 2016. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.
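
    The standardized web API lends itself to scripted use. The sketch below is a hypothetical illustration of such a call from Python: the port, endpoint path, and payload field are assumptions for a locally running AlgoRun container, not the documented interface (see http://algorun.org for the actual API).

    ```python
    # Hypothetical sketch: calling an AlgoRun-packaged algorithm over HTTP.
    # The port, endpoint path, and payload field name are illustrative assumptions.
    import requests

    container_url = "http://localhost:8765"             # assumed address of a running container
    payload = {"input": open("input_data.txt").read()}  # algorithm input as plain text

    response = requests.post(container_url + "/v1/run", data=payload)
    response.raise_for_status()
    print(response.text)                                # output returned by the container
    ```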

  5. The Decoding Toolbox (TDT): a versatile software package for multivariate analyses of functional imaging data

    PubMed Central

    Hebart, Martin N.; Görgen, Kai; Haynes, John-Dylan

    2015-01-01

    The multivariate analysis of brain signals has recently sparked a great amount of interest, yet accessible and versatile tools to carry out decoding analyses are scarce. Here we introduce The Decoding Toolbox (TDT) which represents a user-friendly, powerful and flexible package for multivariate analysis of functional brain imaging data. TDT is written in Matlab and equipped with an interface to the widely used brain data analysis package SPM. The toolbox allows running fast whole-brain analyses, region-of-interest analyses and searchlight analyses, using machine learning classifiers, pattern correlation analysis, or representational similarity analysis. It offers automatic creation and visualization of diverse cross-validation schemes, feature scaling, nested parameter selection, a variety of feature selection methods, multiclass capabilities, and pattern reconstruction from classifier weights. While basic users can implement a generic analysis in one line of code, advanced users can extend the toolbox to their needs or exploit the structure to combine it with external high-performance classification toolboxes. The toolbox comes with an example data set which can be used to try out the various analysis methods. Taken together, TDT offers a promising option for researchers who want to employ multivariate analyses of brain activity patterns. PMID:25610393

  6. ExoData: A Python package to handle large exoplanet catalogue data

    NASA Astrophysics Data System (ADS)

    Varley, Ryan

    2016-10-01

    Exoplanet science often involves using the system parameters of real exoplanets for tasks such as simulations, fitting routines, and target selection for proposals. Several exoplanet catalogues are already well established but often lack a version history and code-friendly interfaces. Software that bridges the gap between the catalogues and code enables users to improve the repeatability of results by facilitating the retrieval of the exact system parameters used in an article's results, along with unifying the equations and software used. As exoplanet science moves towards large data sets, gone are the days when researchers could recall the current population from memory; an interface able to query the population becomes invaluable for target selection and population analysis. ExoData is a Python interface and exploratory analysis tool for the Open Exoplanet Catalogue. It allows exoplanet systems to be loaded into Python as objects (Planet, Star, Binary, etc.) from which common orbital and system equations can be calculated and measured parameters retrieved. This gives researchers tested implementations of the common equations they require (with units) and provides a large science input catalogue of planets for easy plotting and use in research. Advanced querying of targets is possible using the database and the Python programming language. ExoData is also able to parse spectral types and fill in missing parameters according to programmable specifications and equations. Example use cases include integrating the equations into data reduction pipelines, selecting planets for observing proposals, and serving as an input catalogue for large-scale simulation and analysis of planets. ExoData is a Python package available freely on GitHub.
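
    A minimal sketch of the kind of session the package supports is shown below, assuming a local checkout of the Open Exoplanet Catalogue; the class and attribute names follow the ExoData README and may differ between versions.

    ```python
    # Sketch of ExoData usage; the catalogue path is a placeholder.
    import exodata

    # load the catalogue's per-system XML files into Python objects
    exocat = exodata.OECDatabase('/path/to/open_exoplanet_catalogue/systems/')

    for planet in exocat.planets[:5]:
        # measured parameters are returned with units attached
        print(planet.name, planet.R, planet.M)
    ```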

  7. Method Of Packaging And Assembling Electro-Microfluidic Devices

    DOEpatents

    Benavides, Gilbert L.; Galambos, Paul C.; Emerson, John A.; Peterson, Kenneth A.; Giunta, Rachel K.; Zamora, David Lee; Watson, Robert D.

    2004-11-23

    A new architecture for packaging surface micromachined electro-microfluidic devices is presented. This architecture relies on two scales of packaging to bring fluid to the device scale (picoliters) from the macro-scale (microliters). The architecture emulates and utilizes electronics packaging technology. The larger package consists of a circuit board with embedded fluidic channels and standard fluidic connectors (e.g. Fluidic Printed Wiring Board). The embedded channels connect to the smaller package, an Electro-Microfluidic Dual-Inline-Package (EMDIP) that takes fluid to the microfluidic integrated circuit (MIC). The fluidic connection is made to the back of the MIC through Bosch-etched holes that take fluid to surface micromachined channels on the front of the MIC. Electrical connection is made to bond pads on the front of the MIC.

  8. Development, Implementation and Application of Micromechanical Analysis Tools for Advanced High Temperature Composites

    NASA Technical Reports Server (NTRS)

    2005-01-01

    This document contains the final report to the NASA Glenn Research Center (GRC) for the research project entitled Development, Implementation, and Application of Micromechanical Analysis Tools for Advanced High-Temperature Composites. The research supporting this initiative has been conducted by Dr. Brett A. Bednarcyk, a Senior Scientist at OM in Brookpark, Ohio from the period of August 1998 to March 2005. Most of the work summarized herein involved development, implementation, and application of enhancements and new capabilities for NASA GRC's Micromechanics Analysis Code with Generalized Method of Cells (MAC/GMC) software package. When the project began, this software was at a low TRL (3-4) and at release version 2.0. Due to this project, the TRL of MAC/GMC has been raised to 7 and two new versions (3.0 and 4.0) have been released. The most important accomplishments with respect to MAC/GMC are: (1) A multi-scale framework has been built around the software, enabling coupled design and analysis from the global structure scale down to the micro fiber-matrix scale; (2) The software has been expanded to analyze smart materials; (3) State-of-the-art micromechanics theories have been implemented and validated within the code; (4) The damage, failure, and lifing capabilities of the code have been expanded from a very limited state to a vast degree of functionality and utility; and (5) The user flexibility of the code has been significantly enhanced. MAC/GMC is now the premier code for design and analysis of advanced composite and smart materials. It is a candidate for the 2005 NASA Software of the Year Award. The work completed over the course of the project is summarized below on a year by year basis. All publications resulting from the project are listed at the end of this report.

  9. Enhancement of the CAVE computer code

    NASA Astrophysics Data System (ADS)

    Rathjen, K. A.; Burk, H. O.

    1983-12-01

    The computer code CAVE (Conduction Analysis via Eigenvalues) is a convenient and efficient code for predicting two-dimensional temperature histories within thermal protection systems for hypersonic vehicles. The capabilities of CAVE were enhanced by incorporating the following features into the code: real-gas effects in the aerodynamic heating predictions; a geometry and aerodynamic heating package for analyses of cone-shaped bodies; an input option to change from laminar to turbulent heating predictions on leading edges; a modification to account for the reduction in adiabatic wall temperature with increasing leading-edge sweep; a geometry package for a two-dimensional scramjet engine sidewall, with an option for heat transfer to external and internal surfaces; printout modifications to provide tables of selected temperatures for plotting and storage; and modifications to the radiation calculation procedure to eliminate temperature oscillations induced by high heating rates. These new features are described.

  10. A multicenter collaborative approach to reducing pediatric codes outside the ICU.

    PubMed

    Hayes, Leslie W; Dobyns, Emily L; DiGiovine, Bruno; Brown, Ann-Marie; Jacobson, Sharon; Randall, Kelly H; Wathen, Beth; Richard, Heather; Schwab, Carolyn; Duncan, Kathy D; Thrasher, Jodi; Logsdon, Tina R; Hall, Matthew; Markovitz, Barry

    2012-03-01

    The Child Health Corporation of America formed a multicenter collaborative to decrease the rate of pediatric codes outside the ICU by 50%, double the days between these events, and improve the patient safety culture scores by 5 percentage points. A multidisciplinary pediatric advisory panel developed a comprehensive change package of process improvement strategies and measures for tracking progress. Learning sessions, conference calls, and data submission facilitated collaborative group learning and implementation. Twenty Child Health Corporation of America hospitals participated in this 12-month improvement project. Each hospital identified at least 1 noncritical care target unit in which to implement selected elements of the change package. Strategies to improve prevention, detection, and correction of the deteriorating patient ranged from relatively simple, foundational changes to more complex, advanced changes. Each hospital selected a broad range of change package elements for implementation using rapid-cycle methodologies. The primary outcome measure was reduction in codes per 1000 patient days. Secondary outcomes were days between codes and change in patient safety culture scores. Code rate for the collaborative did not decrease significantly (3% decrease). Twelve hospitals reported additional data after the collaborative and saw significant improvement in code rates (24% decrease). Patient safety culture scores improved by 4.5% to 8.5%. A complex process, such as patient deterioration, requires sufficient time and effort to achieve improved outcomes and create a deeply embedded culture of patient safety. The collaborative model can accelerate improvements achieved by individual institutions.

  11. CImbinator: a web-based tool for drug synergy analysis in small- and large-scale datasets.

    PubMed

    Flobak, Åsmund; Vazquez, Miguel; Lægreid, Astrid; Valencia, Alfonso

    2017-08-01

    Drug synergies are sought in order to identify combinations of drugs that are particularly beneficial. User-friendly software solutions that can assist in the analysis of large-scale datasets are required. CImbinator is a web service that can aid in batch-wise and in-depth analyses of data from small-scale and large-scale drug combination screens. CImbinator can quantify drug combination effects using both the commonly employed median-effect equation and more advanced mathematical models describing dose-response relationships. CImbinator is written in Ruby and R, and uses the R package drc for advanced drug-response modeling. CImbinator is available at http://cimbinator.bioinfo.cnio.es; the source code is open and available at https://github.com/Rbbt-Workflows/combination_index. A Docker image is also available at https://hub.docker.com/r/mikisvaz/rbbt-ci_mbinator/. Contact: asmund.flobak@ntnu.no or miguel.vazquez@cnio.es. Supplementary data are available at Bioinformatics online. © The Author(s) 2017. Published by Oxford University Press.
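
    For reference, the median-effect equation mentioned above is Chou's model, from which the combination index used to quantify synergy is derived; in standard notation:

    ```latex
    % Chou's median-effect equation: fraction affected f_a versus dose D
    \frac{f_a}{f_u} = \left(\frac{D}{D_m}\right)^{m}, \qquad f_u = 1 - f_a
    % D_m: median-effect dose; m: sigmoidicity of the dose-response curve.

    % Combination index for two drugs jointly producing effect level x:
    \mathrm{CI} = \frac{D_1}{(D_x)_1} + \frac{D_2}{(D_x)_2}
    % (D_x)_i: dose of drug i alone giving effect x.
    % CI < 1 indicates synergy, CI = 1 additivity, CI > 1 antagonism.
    ```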

  12. xSDK Foundations: Toward an Extreme-scale Scientific Software Development Kit

    DOE PAGES

    Heroux, Michael A.; Bartlett, Roscoe; Demeshko, Irina; ...

    2017-03-01

    Here, extreme-scale computational science increasingly demands multiscale and multiphysics formulations. Combining software developed by independent groups is imperative: no single team has resources for all predictive science and decision support capabilities. Scientific libraries provide high-quality, reusable software components for constructing applications with improved robustness and portability. However, without coordination, many libraries cannot be easily composed. Namespace collisions, inconsistent arguments, lack of third-party software versioning, and additional difficulties make composition costly. The Extreme-scale Scientific Software Development Kit (xSDK) defines community policies to improve code quality and compatibility across independently developed packages (hypre, PETSc, SuperLU, Trilinos, and Alquimia) and provides a foundation for addressing broader issues in software interoperability, performance portability, and sustainability. The xSDK provides turnkey installation of member software and seamless combination of aggregate capabilities, and it marks first steps toward extreme-scale scientific software ecosystems from which future applications can be composed rapidly with assured quality and scalability.

  13. MT3D-USGS version 1: A U.S. Geological Survey release of MT3DMS updated with new and expanded transport capabilities for use with MODFLOW

    USGS Publications Warehouse

    Bedekar, Vivek; Morway, Eric D.; Langevin, Christian D.; Tonkin, Matthew J.

    2016-09-30

    MT3D-USGS, a U.S. Geological Survey updated release of the groundwater solute transport code MT3DMS, includes new transport modeling capabilities to accommodate flow terms calculated by MODFLOW packages that were previously unsupported by MT3DMS and to provide greater flexibility in the simulation of solute transport and reactive solute transport. Unsaturated-zone transport and transport within streams and lakes, including solute exchange with connected groundwater, are among the new capabilities included in the MT3D-USGS code. MT3D-USGS also includes the capability to route a solute through dry cells that may occur in the Newton-Raphson formulation of MODFLOW (that is, MODFLOW-NWT). New chemical reaction package options include the ability to simulate inter-species reactions and parent-daughter chain reactions. A new pump-and-treat recirculation package enables the simulation of dynamic recirculation with or without treatment for combinations of wells that are represented in the flow model, mimicking the above-ground treatment of extracted water. A reformulation of the treatment of transient mass storage improves conservation of mass and yields solutions in better agreement with analytical benchmarks. Several additional features of MT3D-USGS are (1) the separate specification of the partitioning coefficient (Kd) within mobile and immobile domains; (2) the capability to assign prescribed concentrations to the top-most active layer; (3) the change in mass storage owing to the change in water volume now appearing as its own budget item in the global mass balance summary; (4) the ability to ignore cross-dispersion terms; (5) the definition of Hydrocarbon Spill-Source Package (HSS) mass loading zones using regular and irregular polygons, in addition to the currently supported circular zones; and (6) the ability to specify an absolute minimum thickness rather than the default percent minimum thickness in dry-cell circumstances. Benchmark problems that implement the new features and packages test the accuracy of the new code through comparison to analytical benchmarks, as well as to solutions from other published codes. The input file structure for MT3D-USGS adheres to MT3DMS conventions for backward compatibility: the new capabilities and packages described herein are readily invoked by adding three-letter package name acronyms to the name file or by setting input flags as needed. Memory is managed in MT3D-USGS using FORTRAN modules in order to simplify code development and expansion.
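
    Because the input structure keeps MT3DMS conventions, existing scripted workflows carry over. The following is a hedged sketch of driving MT3D-USGS through FloPy's mt3d module (one common route, not part of this report); the file names and the choice of packages are placeholders.

    ```python
    # Sketch: build an MT3D-USGS transport model on top of an existing
    # MODFLOW-NWT flow model using FloPy. All file names are placeholders.
    import flopy

    mf = flopy.modflow.Modflow.load("flowmodel.nam", version="mfnwt")

    mt = flopy.mt3d.Mt3dms(modelname="transport", version="mt3d-usgs",
                           exe_name="mt3d-usgs", modflowmodel=mf)
    btn = flopy.mt3d.Mt3dBtn(mt)   # basic transport: grid, porosity, initial conc.
    adv = flopy.mt3d.Mt3dAdv(mt)   # advection
    dsp = flopy.mt3d.Mt3dDsp(mt)   # dispersion
    rct = flopy.mt3d.Mt3dRct(mt)   # reactions, incl. parent-daughter chains
    ssm = flopy.mt3d.Mt3dSsm(mt)   # sources and sinks

    mt.write_input()               # writes the name file with package acronyms
    mt.run_model()
    ```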

  14. ALICE: A non-LTE plasma atomic physics, kinetics and lineshape package

    NASA Astrophysics Data System (ADS)

    Hill, E. G.; Pérez-Callejo, G.; Rose, S. J.

    2018-03-01

    All three parts of an atomic physics, atomic kinetics and lineshape code, ALICE, are described. Examples of the code being used to model the emissivity and opacity of plasmas are discussed and interesting features of the code which build on the existing corpus of models are shown throughout.

  15. Development of a New 47-Group Library for the CASL Neutronics Simulators

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kim, Kang Seog; Williams, Mark L; Wiarda, Dorothea

    The CASL core simulator MPACT is under development for coupled neutronics and thermal-hydraulics simulation of pressurized light water reactors. The key characteristics of the MPACT code include a subgroup method for resonance self-shielding and a whole-core solver with a 1D/2D synthesis method. The ORNL AMPX/SCALE code packages have been significantly improved to support various intermediate resonance self-shielding approximations such as the subgroup and embedded self-shielding methods. New 47-group AMPX and MPACT libraries based on ENDF/B-VII.0, with a group structure taken from the HELIOS library, have been generated for the CASL core simulator MPACT. The new 47-group MPACT library includes all nuclear data required for static and transient core simulations. This study discusses a detailed procedure to generate the 47-group AMPX and MPACT libraries and benchmark results for the VERA progression problems.

  16. Development of an integrated thermal-hydraulics capability incorporating RELAP5 and PANTHER neutronics code

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Page, R.; Jones, J.R.

    1997-07-01

    Ensuring that safety analysis needs are met in the future is likely to lead to the development of new codes and the further development of existing codes. It is therefore advantageous to define standards for data interfaces and to develop software interfacing techniques which can readily accommodate changes when they are made. Defining interface standards is beneficial but is necessarily restricted in application if future requirements are not known in detail. Code interfacing methods are of particular relevance with the move towards automatic grid frequency response operation, where the integration of plant dynamic, core follow and fault study calculation tools is considered advantageous. This paper describes the background and features of a new code, TALINK (Transient Analysis code LINKage program), used to provide a flexible interface to link the RELAP5 thermal-hydraulics code with the PANTHER neutron kinetics code and the SIBDYM whole-plant dynamic modelling code used by Nuclear Electric. The complete package enables the codes to be executed in parallel and provides an integrated whole-plant thermal-hydraulics and neutron kinetics model. In addition, the paper discusses the capabilities and pedigree of the component codes used to form the integrated transient analysis package, and the details of the calculation of a postulated Sizewell B loss-of-offsite-power fault transient.

  17. OPAL: An Open-Source MPI-IO Library over Cray XT

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yu, Weikuan; Vetter, Jeffrey S; Canon, Richard Shane

    Parallel IO over Cray XT is supported by a vendor-supplied MPI-IO package. This package contains a proprietary ADIO implementation built on top of the sysio library. While it is reasonable to maintain a stable code base for application scientists' convenience, it is also very important for system developers and researchers to analyze and assess the effectiveness of parallel IO software and, accordingly, tune and optimize the MPI-IO implementation. A proprietary parallel IO code base relinquishes such flexibilities. On the other hand, a generic UFS-based MPI-IO implementation is typically used on many Linux-based platforms. We have developed an open-source MPI-IO package over Lustre, referred to as OPAL (OPportunistic and Adaptive MPI-IO Library over Lustre). OPAL provides a single source-code base for MPI-IO over Lustre on Cray XT and Linux platforms. Compared to the Cray implementation, OPAL provides a number of useful features, including arbitrary specification of striping patterns and Lustre-stripe-aligned file domain partitioning. This paper presents performance comparisons between OPAL and Cray's proprietary implementation. Our evaluation demonstrates that OPAL achieves performance comparable to the Cray implementation. We also exemplify the benefits of an open-source package in revealing the underpinnings of parallel IO performance.

  18. Developing Information Power Grid Based Algorithms and Software

    NASA Technical Reports Server (NTRS)

    Dongarra, Jack

    1998-01-01

    This was an exploratory study to enhance our understanding of problems involved in developing large scale applications in a heterogeneous distributed environment. It is likely that the large scale applications of the future will be built by coupling specialized computational modules together. For example, efforts now exist to couple ocean and atmospheric prediction codes to simulate a more complete climate system. These two applications differ in many respects. They have different grids, the data is in different unit systems, and the algorithms for integrating in time are different. In addition, the code for each application is likely to have been developed on different architectures and tends to have poor performance when run on an architecture for which the code was not designed, if it runs at all. Architectural differences may also induce differences in data representation which affect precision and convergence criteria as well as data transfer issues. In order to couple such dissimilar codes some form of translation must be present. This translation should be able to handle interpolation from one grid to another as well as construction of the correct data field in the correct units from available data. Even if a code is to be developed from scratch, a modular approach will likely be followed in that standard scientific packages will be used to do the more mundane tasks such as linear algebra or Fourier transform operations. This approach allows the developers to concentrate on their science rather than becoming experts in linear algebra or signal processing. Problems associated with this development approach include difficulties associated with data extraction and translation from one module to another, module performance on different nodal architectures, and others. In addition to these data and software issues there exist operational issues such as platform stability and resource management.
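
    The translation step described above is easy to picture with a toy example (entirely illustrative, not from the study): interpolating a field from one model's grid onto another's while converting units.

    ```python
    # Toy "translation module": regrid a temperature field from model A's
    # coarse grid (degrees Fahrenheit) to model B's finer grid (degrees Celsius).
    import numpy as np

    grid_a = np.linspace(0.0, 1.0, 11)    # model A's coarse 1-D grid
    temp_f = 60.0 + 40.0 * grid_a         # field produced by model A (deg F)

    grid_b = np.linspace(0.0, 1.0, 101)   # model B's finer grid
    temp_c = (np.interp(grid_b, grid_a, temp_f) - 32.0) * 5.0 / 9.0
    ```

    A production coupling layer must additionally handle multi-dimensional grids, conservative (rather than pointwise) interpolation, and differences in data representation across architectures, as the text notes.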

  19. What's New in GSAS-II

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Toby, Brian H.; Von Dreele, Robert B.

    The General Structure and Analysis Software II (GSAS-II) package is an all-new crystallographic analysis package written to replace and extend the capabilities of the universal and widely used GSAS and EXPGUI packages. GSAS-II was described in a 2013 article, but considerable work has been completed since then. This paper describes the advances, which include: rigid body fitting and structure solution modules; improved treatment for parametric refinements and equation of state fitting; and small-angle scattering data reduction and analysis. GSAS-II offers versatile and extensible modules for import and export of data and results. Capabilities are provided for users to select any version of the code. Code documentation has reached 150 pages and 17 web-tutorials are offered. © 2014 International Centre for Diffraction Data.

  1. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dustin Popp; Zander Mausolff; Sedat Goluoglu

    We are proposing to use the code TDKENO to model TREAT. TDKENO solves the time-dependent, three-dimensional Boltzmann transport equation with explicit representation of delayed neutrons. Instead of directly integrating this equation, TDKENO uses the Improved Quasi-Static method, in which the neutron flux is factored into two components that are solved separately on different time scales. One component is a purely time-dependent and rapidly varying amplitude function, which is solved deterministically and very frequently (small time steps). The other is a slowly varying flux shape function that depends only weakly on time and is solved only when needed (significantly larger time steps), using the 3D Monte Carlo transport code KENO from Oak Ridge National Laboratory's SCALE code package. Using the Monte Carlo method to solve the shape equation is still computationally intensive, but because the operation is performed only when needed, the method gives an accurate time-dependent solution without repeatedly performing the expensive transport calculation. We have modified TDKENO to incorporate KENO-VI so that we may accurately represent the geometries within TREAT. This paper explains the motivation behind using generalized geometry and provides the results of our modifications.
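
    In the standard notation of the Improved Quasi-Static method, the factorization described above is:

    ```latex
    % Improved Quasi-Static factorization of the neutron flux:
    \phi(\mathbf{r},E,\mathbf{\Omega},t) = n(t)\,\psi(\mathbf{r},E,\mathbf{\Omega},t)
    % n(t): rapidly varying amplitude, solved deterministically with small time steps;
    % \psi:  slowly varying shape, solved by KENO only at much larger time steps,
    %        subject to a normalization constraint that makes the factorization unique.
    ```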

  2. An efficient MPI/OpenMP parallelization of the Hartree–Fock–Roothaan method for the first generation of Intel® Xeon Phi™ processor architecture

    DOE PAGES

    Mironov, Vladimir; Moskovsky, Alexander; D’Mello, Michael; ...

    2017-10-04

    The Hartree-Fock (HF) method in the quantum chemistry package GAMESS represents one of the most irregular algorithms in computation today. Major steps in the calculation are the irregular computation of electron repulsion integrals (ERIs) and the building of the Fock matrix. These are the central components of the main Self-Consistent Field (SCF) loop, the key hotspot in electronic structure (ES) codes. By threading the MPI ranks in the official release of the GAMESS code, we not only speed up the main SCF loop (4x to 6x for large systems), but also achieve a significant (>2x) reduction in the overall memory footprint. These improvements are a direct consequence of memory access optimizations within the MPI ranks. We benchmark our implementation against the official release of the GAMESS code on the Intel® Xeon Phi™ supercomputer. Scaling numbers are reported on up to 7,680 cores on Intel Xeon Phi coprocessors.

  3. TOUGH2_MP: A parallel version of TOUGH2

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhang, Keni; Wu, Yu-Shu; Ding, Chris

    2003-04-09

    TOUGH2_MP is a massively parallel version of TOUGH2. It was developed for running on distributed-memory parallel computers to solve large simulation problems that cannot be handled by the standard, single-CPU TOUGH2 code. The new code implements an efficient massively parallel scheme, while preserving the full capacity and flexibility of the original TOUGH2 code. The new software uses the METIS software package for grid partitioning and the AZTEC software package for linear-equation solving. The standard message-passing interface is adopted for communication among processors. Numerical performance of the current version of the code has been tested on CRAY-T3E and IBM RS/6000 SP platforms. In addition, the parallel code has been successfully applied to real field problems of multi-million-cell simulations for three-dimensional multiphase and multicomponent fluid and heat flow, as well as solute transport. In this paper, we review the development of TOUGH2_MP and discuss its basic features, modules, and applications.

  4. Optical Excitations and Energy Transfer in Nanoparticle Waveguides

    DTIC Science & Technology

    2009-03-01

    All calculations were performed using the authors' own codes, given in the Appendix, written in the Scilab programming package. Scilab is free software compatible with the well-known Matlab package and can be obtained from the Scilab project webpage.

  5. Development of a MELCOR Sodium Chemistry (NAC) Package - FY17 Progress.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Louie, David; Humphries, Larry L.

    This report describes the status of the development of the MELCOR Sodium Chemistry (NAC) package. This development is based on the CONTAIN-LMR sodium physics and chemistry models, which are being implemented in MELCOR. In the past three years, the sodium equation of state as a working fluid, drawn from nuclear fusion safety research and from the SIMMER code, has been implemented in MELCOR. The chemistry models from the CONTAIN-LMR code, such as the spray and pool fire models, have also been implemented in MELCOR. This report describes the implemented models and the issues encountered. Model descriptions and input descriptions are provided. Development testing of the spray and pool fire models is described, including a code-to-code comparison with CONTAIN-LMR. The report ends with an expected timeline for implementing the remaining models, such as atmosphere chemistry and sodium-concrete interactions, and for the experimental validation tests.

  6. Supplemental Fingerprint Card Data (SFCD) for NIST Special Database 9

    National Institute of Standards and Technology Data Gateway

    Supplemental Fingerprint Card Data (SFCD) for NIST Special Database 9 (Web, free access)   NIST Special Database 10 (Supplemental Fingerprint Card Data for Special Database 9 - 8-Bit Gray Scale Images) provides a larger sample of fingerprint patterns that have a low natural frequency of occurrence and transitional fingerprint classes in NIST Special Database 9. The software is the same code used with NIST Special Database 4 and 9. A newer version of the compression/decompression software on the CDROM can be found at the website http://www.nist.gov/itl/iad/ig/nigos.cfm as part of the NBIS package.

  7. Documentation of the seawater intrusion (SWI2) package for MODFLOW

    USGS Publications Warehouse

    Bakker, Mark; Schaars, Frans; Hughes, Joseph D.; Langevin, Christian D.; Dausman, Alyssa M.

    2013-01-01

    The SWI2 Package is the latest release of the Seawater Intrusion (SWI) Package for MODFLOW. The SWI2 Package allows three-dimensional vertically integrated variable-density groundwater flow and seawater intrusion in coastal multiaquifer systems to be simulated using MODFLOW-2005. Vertically integrated variable-density groundwater flow is based on the Dupuit approximation in which an aquifer is vertically discretized into zones of differing densities, separated from each other by defined surfaces representing interfaces or density isosurfaces. The numerical approach used in the SWI2 Package does not account for diffusion and dispersion and should not be used where these processes are important. The resulting differential equations are equivalent in form to the groundwater flow equation for uniform-density flow. The approach implemented in the SWI2 Package allows density effects to be incorporated into MODFLOW-2005 through the addition of pseudo-source terms to the groundwater flow equation without the need to solve a separate advective-dispersive transport equation. Vertical and horizontal movement of defined density surfaces is calculated separately using a combination of fluxes calculated through solution of the groundwater flow equation and a simple tip and toe tracking algorithm. Use of the SWI2 Package in MODFLOW-2005 only requires the addition of a single additional input file and modification of boundary heads to freshwater heads referenced to the top of the aquifer. Fluid density within model layers can be represented using zones of constant density (stratified flow) or continuously varying density (piecewise linear in the vertical direction) in the SWI2 Package. The main advantage of using the SWI2 Package instead of variable-density groundwater flow and dispersive solute transport codes, such as SEAWAT and SUTRA, is that fewer model cells are required for simulations using the SWI2 Package because every aquifer can be represented by a single layer of cells. This reduction in number of required model cells and the elimination of the need to solve the advective-dispersive transport equation results in substantial model run-time savings, which can be large for regional aquifers. The accuracy and use of the SWI2 Package is demonstrated through comparison with existing exact solutions and numerical solutions with SEAWAT. Results for an unconfined aquifer are also presented to demonstrate application of the SWI2 Package to a large-scale regional problem.
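
    As an illustration of how little input the package requires, the sketch below attaches SWI2 to a one-layer model through FloPy, a common scripted route. Parameter names follow flopy.modflow.ModflowSwi2; all values are placeholders rather than a worked example from this report.

    ```python
    # Sketch: a one-layer MODFLOW-2005 model with the SWI2 package via FloPy.
    import numpy as np
    import flopy

    mf = flopy.modflow.Modflow("swi_sketch", exe_name="mf2005")
    dis = flopy.modflow.ModflowDis(mf, nlay=1, nrow=1, ncol=50,
                                   delr=10.0, delc=1.0, top=0.0, botm=-40.0)
    bas = flopy.modflow.ModflowBas(mf)
    lpf = flopy.modflow.ModflowLpf(mf, hk=2.0)

    # one active interface (zeta surface) separating fresh water from seawater
    zeta = -40.0 * np.ones((1, 1, 50))
    swi = flopy.modflow.ModflowSwi2(mf, nsrf=1, istrat=1,
                                    toeslope=0.2, tipslope=0.2,
                                    nu=[0.0, 0.025],   # dimensionless zone densities
                                    zeta=[zeta], ssz=0.2, isource=0)
    pcg = flopy.modflow.ModflowPcg(mf)
    oc = flopy.modflow.ModflowOc(mf)
    mf.write_input()
    ```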

  8. Next-generation acceleration and code optimization for light transport in turbid media using GPUs

    PubMed Central

    Alerstam, Erik; Lo, William Chun Yip; Han, Tianyi David; Rose, Jonathan; Andersson-Engels, Stefan; Lilge, Lothar

    2010-01-01

    A highly optimized Monte Carlo (MC) code package for simulating light transport is developed on the latest graphics processing unit (GPU) built for general-purpose computing from NVIDIA - the Fermi GPU. In biomedical optics, the MC method is the gold standard approach for simulating light transport in biological tissue, both due to its accuracy and its flexibility in modelling realistic, heterogeneous tissue geometry in 3-D. However, the widespread use of MC simulations in inverse problems, such as treatment planning for PDT, is limited by their long computation time. Despite its parallel nature, optimizing MC code on the GPU has been shown to be a challenge, particularly when the sharing of simulation result matrices among many parallel threads demands the frequent use of atomic instructions to access the slow GPU global memory. This paper proposes an optimization scheme that utilizes the fast shared memory to resolve the performance bottleneck caused by atomic access, and discusses numerous other optimization techniques needed to harness the full potential of the GPU. Using these techniques, a widely accepted MC code package in biophotonics, called MCML, was successfully accelerated on a Fermi GPU by approximately 600x compared to a state-of-the-art Intel Core i7 CPU. A skin model consisting of 7 layers was used as the standard simulation geometry. To demonstrate the possibility of GPU cluster computing, the same GPU code was executed on four GPUs, showing a linear improvement in performance with an increasing number of GPUs. The GPU-based MCML code package, named GPU-MCML, is compatible with a wide range of graphics cards and is released as an open-source software in two versions: an optimized version tuned for high performance and a simplified version for beginners (http://code.google.com/p/gpumcml). PMID:21258498

  9. 78 FR 41721 - New Standards to Enhance Package Visibility

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-07-11

    ... supporting electronic documentation including piece-level address or ZIP+4® Code information effective... package strategy relies on the availability of piece-level information provided through the widespread use of IMpb. IMpb can offer a number of benefits to mailers by providing piece-level visibility...

  10. ULFEM time series analysis package

    USGS Publications Warehouse

    Karl, Susan M.; McPhee, Darcy K.; Glen, Jonathan M. G.; Klemperer, Simon L.

    2013-01-01

    This manual describes how to use the Ultra-Low-Frequency ElectroMagnetic (ULFEM) software package. Casual users can read the quick-start guide and will probably not need any more information than this. For users who may wish to modify the code, we provide further description of the routines.

  11. 78 FR 44894 - Specifications for Packagings

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-07-25

    ... DEPARTMENT OF TRANSPORTATION Pipeline and Hazardous Materials Safety Administration 49 CFR Part 178 Specifications for Packagings CFR Correction 0 In Title 49 of the Code of Federal Regulations, Parts 178 to 199, revised as of October 1, 2012, in Sec. 178.68, on page 80, paragraph (i)(2) is...

  12. NPTFit: A Code Package for Non-Poissonian Template Fitting

    NASA Astrophysics Data System (ADS)

    Mishra-Sharma, Siddharth; Rodd, Nicholas L.; Safdi, Benjamin R.

    2017-06-01

    We present NPTFit, an open-source code package, written in Python and Cython, for performing non-Poissonian template fits (NPTFs). The NPTF is a recently developed statistical procedure for characterizing the contribution of unresolved point sources (PSs) to astrophysical data sets. The NPTF was first applied to Fermi gamma-ray data to provide evidence that the excess of ~GeV gamma-rays observed in the inner regions of the Milky Way likely arises from a population of sub-threshold point sources, and the NPTF has since found additional applications studying sub-threshold extragalactic sources at high Galactic latitudes. The NPTF generalizes traditional astrophysical template fits to allow searches for populations of unresolved PSs that may follow a given spatial distribution. NPTFit builds upon the framework of the fluctuation analyses developed in X-ray astronomy, and thus likely has applications beyond those demonstrated with gamma-ray data. The NPTFit package utilizes novel computational methods to perform the NPTF efficiently. The code is available at http://github.com/bsafdi/NPTFit and up-to-date and extensive documentation may be found at http://nptfit.readthedocs.io.
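
    A condensed sketch of the documented workflow follows (see http://nptfit.readthedocs.io); the maps, the isotropic template, and the prior ranges are placeholders rather than a physics-ready configuration.

    ```python
    # Sketch of an NPTFit scan: one Poissonian plus one non-Poissonian component.
    import numpy as np
    from NPTFit import nptfit

    counts = np.load("counts_map.npy")      # integer photon counts per pixel (placeholder)
    exposure = np.load("exposure_map.npy")  # instrument exposure map (placeholder)
    iso = np.ones_like(exposure)            # trivial isotropic spatial template

    n = nptfit.NPTF(tag="example")
    n.load_data(counts, exposure)
    n.add_template(iso, "iso")

    # smooth (Poissonian) emission following the template ...
    n.add_poiss_model("iso", "$A_\\mathrm{iso}$", [0, 2], False)
    # ... plus an unresolved point-source (non-Poissonian) population
    n.add_non_poiss_model("iso",
                          ["$A_\\mathrm{ps}$", "$n_1$", "$n_2$", "$S_b$"],
                          [[-6, 1], [2.05, 30], [-2, 1.95], [0.05, 30]],
                          [True, False, False, True])  # which priors are log-flat

    n.configure_for_scan()
    n.perform_scan(nlive=100)
    ```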

  13. COBRApy: COnstraints-Based Reconstruction and Analysis for Python.

    PubMed

    Ebrahim, Ali; Lerman, Joshua A; Palsson, Bernhard O; Hyduke, Daniel R

    2013-08-08

    COnstraint-Based Reconstruction and Analysis (COBRA) methods are widely used for genome-scale modeling of metabolic networks in both prokaryotes and eukaryotes. Due to the successes with metabolism, there is an increasing effort to apply COBRA methods to reconstruct and analyze integrated models of cellular processes. The COBRA Toolbox for MATLAB is a leading software package for genome-scale analysis of metabolism; however, it was not designed to elegantly capture the complexity inherent in integrated biological networks and lacks an integration framework for the multiomics data used in systems biology. The openCOBRA Project is a community effort to promote constraints-based research through the distribution of freely available software. Here, we describe COBRA for Python (COBRApy), a Python package that provides support for basic COBRA methods. COBRApy is designed in an object-oriented fashion that facilitates the representation of the complex biological processes of metabolism and gene expression. COBRApy does not require MATLAB to function; however, it includes an interface to the COBRA Toolbox for MATLAB to facilitate use of legacy codes. For improved performance, COBRApy includes parallel processing support for computationally intensive processes. COBRApy is an object-oriented framework designed to meet the computational challenges associated with the next generation of stoichiometric constraint-based models and high-density omics data sets. http://opencobra.sourceforge.net/
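
    A minimal session with a recent release might look like the following; the SBML file name is a placeholder, and the attribute names reflect the current documented API rather than the 2013 release.

    ```python
    # Minimal COBRApy sketch: load a model, run flux balance analysis, test a knockout.
    import cobra

    model = cobra.io.read_sbml_model("e_coli_core.xml")  # placeholder model file
    solution = model.optimize()                          # flux balance analysis
    print(solution.objective_value)                      # e.g. predicted growth rate

    rxn = model.reactions[0]
    with model:                     # changes inside the context are reverted on exit
        rxn.knock_out()
        print(model.optimize().objective_value)          # objective after knockout
    ```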

  14. Open-access programs for injury categorization using ICD-9 or ICD-10.

    PubMed

    Clark, David E; Black, Adam W; Skavdahl, David H; Hallagan, Lee D

    2018-04-09

    The article introduces Programs for Injury Categorization, using the International Classification of Diseases (ICD) and R statistical software (ICDPIC-R). Starting with ICD-8, methods have been described to map injury diagnosis codes to severity scores, especially the Abbreviated Injury Scale (AIS) and Injury Severity Score (ISS). ICDPIC was originally developed for this purpose using Stata, and ICDPIC-R is an open-access update that accepts both ICD-9 and ICD-10 codes. Data were obtained from the National Trauma Data Bank (NTDB), Admission Year 2015. ICDPIC-R derives CDC injury mechanism categories and an approximate ISS ("RISS") from either ICD-9 or ICD-10 codes. For ICD-9-coded cases, RISS is derived similar to the Stata package (with some improvements reflecting user feedback). For ICD-10-coded cases, RISS may be calculated in several ways: The "GEM" methods convert ICD-10 to ICD-9 (using General Equivalence Mapping tables from CMS) and then calculate ISS with options similar to the Stata package; a "ROCmax" method calculates RISS directly from ICD-10 codes, based on diagnosis-specific mortality in the NTDB, maximizing the C-statistic for predicting NTDB mortality while attempting to minimize the difference between RISS and ISS submitted by NTDB registrars (ISSAIS). Findings were validated using data from the National Inpatient Survey (NIS, 2015). NTDB contained 917,865 cases, of which 86,878 had valid ICD-10 injury codes. For a random 100,000 ICD-9-coded cases in NTDB, RISS using the GEM methods was nearly identical to ISS calculated by the Stata version, which has been previously validated. For ICD-10-coded cases in NTDB, categorized ISS using any version of RISS was similar to ISSAIS; for both NTDB and NIS cases, increasing ISS was associated with increasing mortality. Prediction of NTDB mortality was associated with C-statistics of 0.81 for ISSAIS, 0.75 for RISS using the GEM methods, and 0.85 for RISS using the ROCmax method; prediction of NIS mortality was associated with C-statistics of 0.75-0.76 for RISS using the GEM methods, and 0.78 for RISS using the ROCmax method. Instructions are provided for accessing ICDPIC-R at no cost. The ideal methods of injury categorization and injury severity scoring involve trained personnel with access to injured persons or their medical records. ICDPIC-R may be a useful substitute when this ideal cannot be obtained.

  15. User’s guide for MapMark4—An R package for the probability calculations in three-part mineral resource assessments

    USGS Publications Warehouse

    Ellefsen, Karl J.

    2017-06-27

    MapMark4 is a software package that implements the probability calculations in three-part mineral resource assessments. Functions within the software package are written in the R statistical programming language. These functions, their documentation, and a copy of this user’s guide are bundled together in R’s unit of shareable code, which is called a “package.” This user’s guide includes step-by-step instructions showing how the functions are used to carry out the probability calculations. The calculations are demonstrated using test data, which are included in the package.

  16. clusterProfiler: an R package for comparing biological themes among gene clusters.

    PubMed

    Yu, Guangchuang; Wang, Li-Gen; Han, Yanyan; He, Qing-Yu

    2012-05-01

    Increasing quantitative data generated from transcriptomics and proteomics require integrative strategies for analysis. Here, we present an R package, clusterProfiler, that automates the process of biological-term classification and the enrichment analysis of gene clusters. The analysis module and visualization module were combined into a reusable workflow. Currently, clusterProfiler supports three species: humans, mice, and yeast. Methods provided in this package can be easily extended to other species and ontologies. The clusterProfiler package is released under the Artistic-2.0 License within the Bioconductor project. The source code and vignette are freely available at http://bioconductor.org/packages/release/bioc/html/clusterProfiler.html.

  17. Vector-matrix-quaternion, array and arithmetic packages: All HAL/S functions implemented in Ada

    NASA Technical Reports Server (NTRS)

    Klumpp, Allan R.; Kwong, David D.

    1986-01-01

    The HAL/S avionics programmers have enjoyed a variety of tools built into a language tailored to their special requirements. Ada is designed for a broader group of applications. Rather than providing built-in tools, Ada provides the elements with which users can build their own. Standard avionic packages remain to be developed. These must enable programmers to code in Ada as they have coded in HAL/S. The packages under development at JPL will provide all of the vector-matrix, array, and arithmetic functions described in the HAL/S manuals. In addition, the linear algebra package will provide all of the quaternion functions used in Shuttle steering and Galileo attitude control. Furthermore, using Ada's extensibility, many quaternion functions are being implemented as infix operations; equivalent capabilities were never implemented in HAL/S because doing so would entail modifying the compiler and expanding the language. With these packages, many HAL/S expressions will compile and execute in Ada, unchanged. Others can be converted simply by replacing the implicit HAL/S multiply operator with the Ada *. Errors will be trapped and identified. Input/output will be convenient and readable.

  18. Parallelization of Rocket Engine Simulator Software (PRESS)

    NASA Technical Reports Server (NTRS)

    Cezzar, Ruknet

    1997-01-01

    The Parallelization of Rocket Engine Simulator Software (PRESS) project is part of a collaborative effort with Southern University at Baton Rouge (SUBR), University of West Florida (UWF), and Jackson State University (JSU). The second-year funding, which supports two graduate students enrolled in our new Master's program in Computer Science at Hampton University and the principal investigator, has been obtained for the period from October 19, 1996 through October 18, 1997. The key part of the interim report was new directions for the second-year funding. This came about from discussions during the Rocket Engine Numeric Simulator (RENS) project meeting in Pensacola on January 17-18, 1997. At that time, a software agreement between Hampton University and NASA Lewis Research Center had already been concluded. That agreement concerns off-NASA-site experimentation with the PUMPDES/TURBDES software. Before this agreement, during the first year of the project, another large-scale FORTRAN-based software package, Two-Dimensional Kinetics (TDK), was being used for translation to an object-oriented language and for parallelization experiments. However, that package proved to be too complex and lacking sufficient documentation for an effective translation effort to object-oriented C++ source code. The focus, this time with the better documented and more manageable PUMPDES/TURBDES package, was still on translation to C++ with design improvements. At the RENS meeting, however, the impetus for the RENS projects in general, and PRESS in particular, shifted in two important ways. One was closer alignment with the work on the Numerical Propulsion System Simulator (NPSS) through cooperation and collaboration with the LERC ACLU organization. The other was to see whether and how NASA's various rocket design software can be run over local networks and intranets without any radical efforts for redesign and translation into object-oriented source code. There were also suggestions that the FORTRAN-based code be encapsulated in C++ code, thereby facilitating reuse without undue development effort. The details are covered in the aforementioned section of the interim report filed on April 28, 1997.

  19. Nanotechnology for the Solid Waste Reduction of Military Food Packaging

    DTIC Science & Technology

    2016-06-01

    (WP-200816) Nanotechnology for the Solid Waste Reduction of Military Food Packaging, June 2016. Cost and Performance Report covering the period 04/01/2008 - 01/01/2015; this document has been cleared for public release. The PIs have been dedicated to these efforts, and it is anticipated that this nanotechnology packaging will someday be used by the Warfighter.

  20. The Islamic State Battle Plan: Press Release Natural Language Processing

    DTIC Science & Technology

    2016-06-01

    Keywords: natural language processing, text mining, corpus, generalized linear model, cascade, R Shiny, leaflet, data visualization. Abbreviations used include TDM (Term Document Matrix), TF (Term Frequency), TF-IDF (Term Frequency-Inverse Document Frequency), and tm (the R text mining package). Reference: Feinerer I, Hornik K (2015) Text Mining Package “tm,” Version 0.6-2. (Jul 3) https://cran.r-project.org/web/packages/tm/tm.pdf

  1. Assessment of the prevailing physics codes: LEOPARD, LASER, and EPRI-CELL

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lan, J.S.

    1981-01-01

    In order to analyze core performance and fuel management, it is necessary to verify reactor physics codes in great detail. This kind of work not only serves the purpose of understanding and controlling the characteristics of each code, but also ensures their reliability as the codes continually change due to constant modifications and machine transfers. This paper presents the results of a comprehensive verification of three code packages: LEOPARD, LASER, and EPRI-CELL.

  2. 49 CFR 178.515 - Standards for reconstituted wood boxes.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... 49 Transportation 3 2012-10-01 2012-10-01 false Standards for reconstituted wood boxes. 178.515... PACKAGINGS Non-bulk Performance-Oriented Packaging Standards § 178.515 Standards for reconstituted wood boxes. (a) The identification code for a reconstituted wood box is 4F. (b) Construction requirements for...

  3. 49 CFR 178.515 - Standards for reconstituted wood boxes.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... 49 Transportation 3 2013-10-01 2013-10-01 false Standards for reconstituted wood boxes. 178.515... PACKAGINGS Non-bulk Performance-Oriented Packaging Standards § 178.515 Standards for reconstituted wood boxes. (a) The identification code for a reconstituted wood box is 4F. (b) Construction requirements for...

  4. 49 CFR 178.515 - Standards for reconstituted wood boxes.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 49 Transportation 3 2014-10-01 2014-10-01 false Standards for reconstituted wood boxes. 178.515... PACKAGINGS Non-bulk Performance-Oriented Packaging Standards § 178.515 Standards for reconstituted wood boxes. (a) The identification code for a reconstituted wood box is 4F. (b) Construction requirements for...

  5. 78 FR 19007 - Certain Products Having Laminated Packaging, Laminated Packaging, and Components Thereof...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-03-28

    ... INTERNATIONAL TRADE COMMISSION [Investigation No. 337-TA-874] Certain Products Having Laminated... States Code AGENCY: U.S. International Trade Commission. ACTION: Notice. SUMMARY: Notice is hereby given that a complaint was filed with the U.S. International Trade Commission on February 20, 2013, under...

  6. 49 CFR 178.515 - Standards for reconstituted wood boxes.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 49 Transportation 3 2011-10-01 2011-10-01 false Standards for reconstituted wood boxes. 178.515... PACKAGINGS Non-bulk Performance-Oriented Packaging Standards § 178.515 Standards for reconstituted wood boxes. (a) The identification code for a reconstituted wood box is 4F. (b) Construction requirements for...

  7. Learn by Yourself: The Self-Learning Tools for Qualitative Analysis Software Packages

    ERIC Educational Resources Information Center

    Freitas, Fábio; Ribeiro, Jaime; Brandão, Catarina; Reis, Luís Paulo; de Souza, Francislê Neri; Costa, António Pedro

    2017-01-01

    Computer Assisted Qualitative Data Analysis Software (CAQDAS) are tools that help researchers to develop qualitative research projects. These software packages help the users with tasks such as transcription analysis, coding and text interpretation, writing and annotation, content search and analysis, recursive abstraction, grounded theory…

  8. Effect of thermal cycling ramp rate on CSP assembly reliability

    NASA Technical Reports Server (NTRS)

    Ghaffarian, R.

    2001-01-01

    A JPL-led chip scale package consortium of enterprises recently joined together to pool in-kind resources for developing the quality and reliability of chip scale packages for a variety of projects. The consortium's experience in building more than 150 test vehicle assemblies, with single- and double-sided multilayer PWBs, together with the environmental test results, has now been published as a chip scale package guidelines document.

  9. EEE Links. Volume 5

    NASA Technical Reports Server (NTRS)

    Humphrey, Robert (Editor)

    1999-01-01

    The EEE Links Newsletter is a quarterly publication produced by Code 562 in support of the NASA HQ-funded NASA Electronic Parts and Packaging (NEPP) Program. The newsletter is produced as an electronic-format deliverable made available via the referenced www site administered by Code 562. The newsletter publishes brief articles on topics of interest to NASA programs and projects in the area of electronic parts and packaging. The newsletter does not provide information pertaining to patented or proprietary information. The information provided is at the level of that produced by industry and university researchers and is published at national and international conferences.

  10. Medicare's "Global" terrorism: where is the pay for performance?

    PubMed

    Reed, R Lawrence; Luchette, Fred A; Esposito, Thomas J; Pyrz, Karen; Gamelli, Richard L

    2008-02-01

    Medicare and Medicaid Services (CMS) payment policies for surgical operations are based on a global package concept. CMS' physician fee schedule splits the global package into preoperative, intraoperative, and postoperative components of each procedure. We hypothesized that these global package component valuations were often lower than comparable evaluation and management (E&M) services and that billing for E&M services instead of the operation could often be more profitable. Our billing database and Trauma Registry were queried for the operative procedures and hospital lengths of stay for trauma patients during the past 5 years. Determinations of preoperative, intraoperative, and postoperative payments were calculated for 10-day and 90-day global packages, comparing them to CMS payments for comparable E&M codes. Of 90-day and 10-day Current Procedural Terminology codes, 88% and 100%, respectively, do not pay for the comprehensive history and physical that trauma patients usually receive, whereas 41% and 98%, respectively, do not even meet payment levels for a simple history and physical. Of 90-day global package procedures, 70% would have generated more revenue had comprehensive daily visits been billed instead of the operation ($3,057,500 vs. $1,658,058). For 10-day global package procedures, 56% would have generated more revenue with merely problem-focused daily visits instead of the operation ($161,855 vs. $156,318). Medicare's global surgical package underpays E&M services in trauma patients. In most cases, trauma surgeons would fare better by not billing for operations to receive higher reimbursement for E&M services that are considered "bundled" in the global package payment.

  11. ANTS — a simulation package for secondary scintillation Anger-camera type detector in thermal neutron imaging

    NASA Astrophysics Data System (ADS)

    Morozov, A.; Defendi, I.; Engels, R.; Fraga, F. A. F.; Fraga, M. M. F. R.; Guerard, B.; Jurkovic, M.; Kemmerling, G.; Manzin, G.; Margato, L. M. S.; Niko, H.; Pereira, L.; Petrillo, C.; Peyaud, A.; Piscitelli, F.; Raspino, D.; Rhodes, N. J.; Sacchetti, F.; Schooneveld, E. M.; Van Esch, P.; Zeitelhack, K.

    2012-08-01

    A custom and fully interactive simulation package ANTS (Anger-camera type Neutron detector: Toolkit for Simulations) has been developed to optimize the design and operation conditions of secondary scintillation Anger-camera type gaseous detectors for thermal neutron imaging. The simulation code accounts for all physical processes related to the neutron capture, energy deposition pattern, drift of electrons of the primary ionization and secondary scintillation. The photons are traced considering the wavelength-resolved refraction and transmission of the output window. Photo-detection accounts for the wavelength-resolved quantum efficiency, angular response, area sensitivity, gain and single-photoelectron spectra of the photomultipliers (PMTs). The package allows for several geometrical shapes of the PMT photocathode (round, hexagonal and square) and offers a flexible PMT array configuration: up to 100 PMTs in a custom arrangement with square or hexagonal packing. Several read-out patterns of the PMT array are implemented. Reconstruction of the neutron capture position (projection on the plane of the light emission) is performed using the center of gravity, maximum likelihood or weighted least squares algorithm. Simulation results reproduce well the preliminary results obtained with a small-scale detector prototype. ANTS executables can be downloaded from http://coimbra.lip.pt/~andrei/.
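
    The center-of-gravity reconstruction mentioned above is simple enough to illustrate directly. The following is a minimal Python sketch of an Anger-camera centroid estimate, not ANTS code (which is distributed as executables); the PMT positions and signal amplitudes are made-up inputs.

        import numpy as np

        def center_of_gravity(pmt_xy, amplitudes):
            """Estimate the light-emission position as the amplitude-weighted
            mean of the PMT center positions (Anger-camera centroid)."""
            pmt_xy = np.asarray(pmt_xy, dtype=float)          # (n_pmt, 2)
            amplitudes = np.asarray(amplitudes, dtype=float)  # (n_pmt,)
            return (amplitudes[:, None] * pmt_xy).sum(axis=0) / amplitudes.sum()

        # Hypothetical 2x2 PMT array (positions in mm) and one event's signals.
        pmts = [(0.0, 0.0), (30.0, 0.0), (0.0, 30.0), (30.0, 30.0)]
        signals = [120.0, 80.0, 60.0, 40.0]
        print(center_of_gravity(pmts, signals))

    Center-of-gravity estimates are known to be biased near the edges of the array, which is one motivation for the maximum likelihood and weighted least squares alternatives the package also implements.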

  12. SMOG 2: A Versatile Software Package for Generating Structure-Based Models.

    PubMed

    Noel, Jeffrey K; Levi, Mariana; Raghunathan, Mohit; Lammert, Heiko; Hayes, Ryan L; Onuchic, José N; Whitford, Paul C

    2016-03-01

    Molecular dynamics simulations with coarse-grained or simplified Hamiltonians have proven to be an effective means of capturing the functionally important long-time and large-length scale motions of proteins and RNAs. Originally developed in the context of protein folding, structure-based models (SBMs) have since been extended to probe a diverse range of biomolecular processes, spanning from protein and RNA folding to functional transitions in molecular machines. The hallmark feature of a structure-based model is that part, or all, of the potential energy function is defined by a known structure. Within this general class of models, there exist many possible variations in resolution and energetic composition. SMOG 2 is a downloadable software package that reads user-designated structural information and user-defined energy definitions, in order to produce the files necessary to use SBMs with high performance molecular dynamics packages: GROMACS and NAMD. SMOG 2 is bundled with XML-formatted template files that define commonly used SBMs, and it can process template files that are altered according to the needs of each user. This computational infrastructure also allows for experimental or bioinformatics-derived restraints or novel structural features to be included, e.g. novel ligands, prosthetic groups and post-translational/transcriptional modifications. The code and user guide can be downloaded at http://smog-server.org/smog2.

  13. From Global to Cloud Resolving Scale: Experiments with a Scale- and Aerosol-Aware Physics Package and Impact on Tracer Transport

    NASA Astrophysics Data System (ADS)

    Grell, G. A.; Freitas, S. R.; Olson, J.; Bela, M.

    2017-12-01

    We will start by providing a summary of the latest cumulus parameterization modeling efforts at NOAA's Earth System Research Laboratory (ESRL), on both regional and global scales. The physics package includes a scale-aware parameterization of subgrid cloudiness feedback to radiation (coupled PBL, microphysics, radiation, shallow and congestus type convection), the stochastic Grell-Freitas (GF) scale- and aerosol-aware convective parameterization, and an aerosol-aware microphysics package. GF is based on a stochastic approach originally implemented by Grell and Devenyi (2002) and described in more detail in Grell and Freitas (2014, ACP). It was expanded to include PDFs for vertical mass flux, as well as modifications to improve the diurnal cycle. This physics package will be used on different scales, spanning global to cloud resolving, to look at the impact on scalar transport and numerical weather prediction.

  14. beachmat: A Bioconductor C++ API for accessing high-throughput biological data from a variety of R matrix types

    PubMed Central

    Pagès, Hervé

    2018-01-01

    Biological experiments involving genomics or other high-throughput assays typically yield a data matrix that can be explored and analyzed using the R programming language with packages from the Bioconductor project. Improvements in the throughput of these assays have resulted in an explosion of data even from routine experiments, which poses a challenge to the existing computational infrastructure for statistical data analysis. For example, single-cell RNA sequencing (scRNA-seq) experiments frequently generate large matrices containing expression values for each gene in each cell, requiring sparse or file-backed representations for memory-efficient manipulation in R. These alternative representations are not easily compatible with high-performance C++ code used for computationally intensive tasks in existing R/Bioconductor packages. Here, we describe a C++ interface named beachmat, which enables agnostic data access from various matrix representations. This allows package developers to write efficient C++ code that is interoperable with dense, sparse and file-backed matrices, amongst others. We evaluated the performance of beachmat for accessing data from each matrix representation using both simulated and real scRNA-seq data, and defined a clear memory/speed trade-off to motivate the choice of an appropriate representation. We also demonstrate how beachmat can be incorporated into the code of other packages to drive analyses of a very large scRNA-seq data set. PMID:29723188

  15. beachmat: A Bioconductor C++ API for accessing high-throughput biological data from a variety of R matrix types.

    PubMed

    Lun, Aaron T L; Pagès, Hervé; Smith, Mike L

    2018-05-01

    Biological experiments involving genomics or other high-throughput assays typically yield a data matrix that can be explored and analyzed using the R programming language with packages from the Bioconductor project. Improvements in the throughput of these assays have resulted in an explosion of data even from routine experiments, which poses a challenge to the existing computational infrastructure for statistical data analysis. For example, single-cell RNA sequencing (scRNA-seq) experiments frequently generate large matrices containing expression values for each gene in each cell, requiring sparse or file-backed representations for memory-efficient manipulation in R. These alternative representations are not easily compatible with high-performance C++ code used for computationally intensive tasks in existing R/Bioconductor packages. Here, we describe a C++ interface named beachmat, which enables agnostic data access from various matrix representations. This allows package developers to write efficient C++ code that is interoperable with dense, sparse and file-backed matrices, amongst others. We evaluated the performance of beachmat for accessing data from each matrix representation using both simulated and real scRNA-seq data, and defined a clear memory/speed trade-off to motivate the choice of an appropriate representation. We also demonstrate how beachmat can be incorporated into the code of other packages to drive analyses of a very large scRNA-seq data set.

  16. Spectral-Element Seismic Wave Propagation Codes for both Forward Modeling in Complex Media and Adjoint Tomography

    NASA Astrophysics Data System (ADS)

    Smith, J. A.; Peter, D. B.; Tromp, J.; Komatitsch, D.; Lefebvre, M. P.

    2015-12-01

    We present the SPECFEM3D_Cartesian and SPECFEM3D_GLOBE open-source codes, representing high-performance numerical wave solvers simulating seismic wave propagation for local-, regional-, and global-scale applications. These codes are suitable for both forward propagation in complex media and tomographic imaging. Both solvers compute highly accurate seismic wave fields using the continuous Galerkin spectral-element method on unstructured meshes. Lateral variations in compressional- and shear-wave speeds, density, as well as 3D attenuation Q models, topography and fluid-solid coupling are all readily included in both codes. For global simulations, effects due to rotation, ellipticity, the oceans, 3D crustal models, and self-gravitation are additionally included. Both packages provide forward and adjoint functionality suitable for adjoint tomography on high-performance computing architectures. We highlight the most recent release of the global version which includes improved performance, simultaneous MPI runs, OpenCL and CUDA support via an automatic source-to-source transformation library (BOAST), parallel I/O readers and writers for databases using ADIOS and seismograms using the recently developed Adaptable Seismic Data Format (ASDF) with built-in provenance. This makes our spectral-element solvers current state-of-the-art, open-source community codes for high-performance seismic wave propagation on arbitrarily complex 3D models. Together with these solvers, we provide full-waveform inversion tools to image the Earth's interior at unprecedented resolution.

  17. 39 CFR Appendix A to Part 121 - Tables Depicting Service Standard Day Ranges

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 1-3 (AK)7 (JNU) 7 (KTN) 1 (HI)7 (GU) 1-2 1-2 6-7 5-6 Standard Mail 2 3 3 3-4 10 10 9 Package Services 1 2 2 2-3 8 8 7 AK = Alaska 3-digit ZIP Codes 995-997; JNU = Juneau AK 3-digit ZIP Code 998; KTN = Ketchikan AK 3-digit ZIP Code 999; HI = Hawaii 3-digit ZIP Codes 967 and 968; GU = Guam 3-digit ZIP Code 969...

  18. 39 CFR Appendix A to Part 121 - Tables Depicting Service Standard Day Ranges

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 1-3 (AK)7 (JNU) 7 (KTN) 1 (HI)7 (GU) 1-2 1-2 6-7 5-6 Standard Mail 2 3 3 3-4 10 10 9 Package Services 1 2 2 2-3 8 8 7 AK = Alaska 3-digit ZIP Codes 995-997; JNU = Juneau AK 3-digit ZIP Code 998; KTN = Ketchikan AK 3-digit ZIP Code 999; HI = Hawaii 3-digit ZIP Codes 967 and 968; GU = Guam 3-digit ZIP Code 969...

  19. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Klaiman, Shachar; Gilary, Ido; Moiseyev, Nimrod

    Analytical expressions for the resonances of the long-range potential (LRP), V(r) = a/r - b/r², as a function of the Hamiltonian parameters were derived by Doolen a long time ago [Int. J. Quant. Chem. 14, 523 (1979)]. Here we show that converged numerical results are obtained by applying the shifted complex scaling and the smooth-exterior scaling (SES) methods rather than the usual complex coordinate method (i.e., complex scaling). The narrow and broad shape-type resonances are shown to be localized inside or over the potential barrier and not inside the potential well. Therefore, the resonances for Doolen LRPs are not associated with tunneling through the potential barrier as one might expect. The fact that the SES provides a universal reflection-free absorbing potential is especially important in view of future applications. In particular, it is most convenient to calculate the molecular autoionizing resonances by adding one-electron complex absorbing potentials into the codes of the available quantum molecular electronic packages.
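
    As a reminder of the underlying technique (a textbook sketch in atomic units, not the specific shifted or smooth-exterior scaling of this work), uniform complex scaling r → r e^{iθ} turns the radial Hamiltonian for this potential into a non-Hermitian operator whose complex eigenvalues expose the resonance positions and widths:

        H_\theta = -\tfrac{1}{2} e^{-2i\theta} \frac{d^2}{dr^2}
                   + e^{-i\theta} \frac{a}{r}
                   - e^{-2i\theta} \frac{b}{r^2},
        \qquad
        H_\theta \, \psi_{\mathrm{res}} = \Big( E_r - \tfrac{i}{2}\,\Gamma \Big) \psi_{\mathrm{res}},

    where E_r is the resonance position and Γ its width; the resonance eigenvalues become θ-independent once the rotation angle exceeds half the argument of the complex resonance energy.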

  20. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hart, David; Klise, Katherine A.

    The PyEPANET package is a set of commands for the Python programming language that are built to wrap the EPANET toolkit library commands, without requiring the end user to program using the ctypes package. This package does not contain the EPANET code, nor does it implement the functions within the EPANET software, and it requires the separately downloaded or compiled EPANET2 toolkit dynamic library (epanet.dll, libepanet.so, or epanet.dylib) and/or the EPANET-MSX dynamic library in order to function.
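
    The wrapping approach described above can be illustrated with a small ctypes sketch. This is not PyEPANET source code; it only shows the general pattern of loading a separately obtained EPANET2 toolkit library and calling two of its documented entry points (ENopen and ENclose). The library file name and location are assumptions.

        import ctypes
        import os

        # Load the separately downloaded/compiled EPANET2 toolkit library
        # (file name and location assumed).
        libname = "epanet.dll" if os.name == "nt" else "libepanet.so"
        en = ctypes.CDLL(os.path.join(os.getcwd(), libname))

        def open_project(inp_file, rpt_file="report.rpt", out_file=""):
            """Call the toolkit's ENopen(inpFile, rptFile, outFile);
            returns the toolkit's integer error code (0 on success)."""
            return en.ENopen(inp_file.encode(), rpt_file.encode(),
                             out_file.encode())

        def close_project():
            """Call ENclose() to free the toolkit's data structures."""
            return en.ENclose()

        # Hypothetical usage with a user-supplied network model:
        # err = open_project("Net1.inp")
        # ... node/link queries via further wrapped toolkit calls ...
        # close_project()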

  1. Motmot, an open-source toolkit for realtime video acquisition and analysis.

    PubMed

    Straw, Andrew D; Dickinson, Michael H

    2009-07-22

    Video cameras sense passively from a distance, offer a rich information stream, and provide intuitively meaningful raw data. Camera-based imaging has thus proven critical for many advances in neuroscience and biology, with applications ranging from cellular imaging of fluorescent dyes to tracking of whole-animal behavior at ecologically relevant spatial scales. Here we present 'Motmot': an open-source software suite for acquiring, displaying, saving, and analyzing digital video in real-time. At the highest level, Motmot is written in the Python computer language. The large amounts of data produced by digital cameras are handled by low-level, optimized functions, usually written in C. This high-level/low-level partitioning and use of select external libraries allow Motmot, with only modest complexity, to perform well as a core technology for many high-performance imaging tasks. In its current form, Motmot allows for: (1) image acquisition from a variety of camera interfaces (package motmot.cam_iface), (2) the display of these images with minimal latency and computer resources using wxPython and OpenGL (package motmot.wxglvideo), (3) saving images with no compression in a single-pass, low-CPU-use format (package motmot.FlyMovieFormat), (4) a pluggable framework for custom analysis of images in realtime and (5) firmware for an inexpensive USB device to synchronize image acquisition across multiple cameras, with analog input, or with other hardware devices (package motmot.fview_ext_trig). These capabilities are brought together in a graphical user interface, called 'FView', allowing an end user to easily view and save digital video without writing any code. One plugin for FView, 'FlyTrax', which tracks the movement of fruit flies in real-time, is included with Motmot, and is described to illustrate the capabilities of FView. Motmot enables realtime image processing and display using the Python computer language. In addition to the provided complete applications, the architecture allows the user to write relatively simple plugins, which can accomplish a variety of computer vision tasks and be integrated within larger software systems. The software is available at http://code.astraw.com/projects/motmot.
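
    The pluggable realtime-analysis pattern described above can be sketched generically. All class and method names below are hypothetical illustrations of a per-frame plugin, not the actual motmot or FView API; in Motmot's high-level/low-level partitioning, heavy per-pixel work would live in optimized low-level code.

        import numpy as np

        class FrameAnalyzer:
            """Hypothetical realtime plugin: receives each camera frame as
            a NumPy array and maintains a running mean intensity."""

            def __init__(self):
                self.n_frames = 0
                self.mean_intensity = 0.0

            def process_frame(self, frame, timestamp):
                # Keep per-frame work cheap so acquisition is never stalled.
                self.n_frames += 1
                x = float(frame.mean())
                self.mean_intensity += (x - self.mean_intensity) / self.n_frames

        # Simulated acquisition loop standing in for a camera driver:
        analyzer = FrameAnalyzer()
        for t in range(100):
            frame = np.random.randint(0, 256, (480, 640), dtype=np.uint8)
            analyzer.process_frame(frame, timestamp=t / 30.0)
        print(analyzer.n_frames, analyzer.mean_intensity)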

  2. User’s guide for GcClust—An R package for clustering of regional geochemical data

    USGS Publications Warehouse

    Ellefsen, Karl J.; Smith, David B.

    2016-04-08

    GcClust is a software package developed by the U.S. Geological Survey for statistical clustering of regional geochemical data, and similar data such as regional mineralogical data. Functions within the software package are written in the R statistical programming language. These functions, their documentation, and a copy of the user’s guide are bundled together in R’s unit of sharable code, which is called a “package.” The user’s guide includes step-by-step instructions showing how the functions are used to cluster data and to evaluate the clustering results. These functions are demonstrated in this report using test data, which are included in the package.

  3. A VHDL Interface for Altera Design Files

    DTIC Science & Technology

    1990-01-01

    This requirement dictated that all prototype products developed during this research would have to mirror standard VHDL code. In fact, the final... product would have to meet the syntactic and semantic requirements of standard VHDL. The coding style used to create the transformation program was the... (Appendices include a transformed decoder file and supplemental VHDL package source code, Altpk.vhd.)

  4. 49 CFR 178.505 - Standards for aluminum drums.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 49 Transportation 2 2010-10-01 2010-10-01 false Standards for aluminum drums. 178.505 Section 178... PACKAGINGS Non-bulk Performance-Oriented Packaging Standards § 178.505 Standards for aluminum drums. (a) The following are the identification codes for aluminum drums: (1) 1B1 for a non-removable head aluminum drum...

  5. 49 CFR 178.519 - Standards for plastic film bags.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 49 Transportation 2 2010-10-01 2010-10-01 false Standards for plastic film bags. 178.519 Section... PACKAGINGS Non-bulk Performance-Oriented Packaging Standards § 178.519 Standards for plastic film bags. (a) The identification code for a plastic film bag is 5H4. (b) Construction requirements for plastic film...

  6. 49 CFR 178.509 - Standards for plastic drums and jerricans.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 49 Transportation 3 2011-10-01 2011-10-01 false Standards for plastic drums and jerricans. 178.509... PACKAGINGS Non-bulk Performance-Oriented Packaging Standards § 178.509 Standards for plastic drums and jerricans. (a) The following are identification codes for plastic drums and jerricans: (1) 1H1 for a non...

  7. 49 CFR 178.518 - Standards for woven plastic bags.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 49 Transportation 2 2010-10-01 2010-10-01 false Standards for woven plastic bags. 178.518 Section... PACKAGINGS Non-bulk Performance-Oriented Packaging Standards § 178.518 Standards for woven plastic bags. (a) The following are identification codes for woven plastic bags: (1) 5H1 for an unlined or non-coated...

  8. 49 CFR 178.509 - Standards for plastic drums and jerricans.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 49 Transportation 3 2014-10-01 2014-10-01 false Standards for plastic drums and jerricans. 178.509... PACKAGINGS Non-bulk Performance-Oriented Packaging Standards § 178.509 Standards for plastic drums and jerricans. (a) The following are identification codes for plastic drums and jerricans: (1) 1H1 for a non...

  9. 49 CFR 178.509 - Standards for plastic drums and jerricans.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... 49 Transportation 3 2012-10-01 2012-10-01 false Standards for plastic drums and jerricans. 178.509... PACKAGINGS Non-bulk Performance-Oriented Packaging Standards § 178.509 Standards for plastic drums and jerricans. (a) The following are identification codes for plastic drums and jerricans: (1) 1H1 for a non...

  10. 49 CFR 178.509 - Standards for plastic drums and jerricans.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... 49 Transportation 3 2013-10-01 2013-10-01 false Standards for plastic drums and jerricans. 178.509... PACKAGINGS Non-bulk Performance-Oriented Packaging Standards § 178.509 Standards for plastic drums and jerricans. (a) The following are identification codes for plastic drums and jerricans: (1) 1H1 for a non...

  11. 49 CFR 178.512 - Standards for steel, aluminum or other metal boxes.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... 49 Transportation 3 2013-10-01 2013-10-01 false Standards for steel, aluminum or other metal boxes...) SPECIFICATIONS FOR PACKAGINGS Non-bulk Performance-Oriented Packaging Standards § 178.512 Standards for steel, aluminum or other metal boxes. (a) The following are identification codes for steel, aluminum, or other...

  12. 49 CFR 178.512 - Standards for steel or aluminum boxes.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... 49 Transportation 3 2012-10-01 2012-10-01 false Standards for steel or aluminum boxes. 178.512... PACKAGINGS Non-bulk Performance-Oriented Packaging Standards § 178.512 Standards for steel or aluminum boxes. (a) The following are identification codes for steel or aluminum boxes: (1) 4A for a steel box; and...

  13. 49 CFR 178.512 - Standards for steel, aluminum or other metal boxes.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 49 Transportation 3 2014-10-01 2014-10-01 false Standards for steel, aluminum or other metal boxes...) SPECIFICATIONS FOR PACKAGINGS Non-bulk Performance-Oriented Packaging Standards § 178.512 Standards for steel, aluminum or other metal boxes. (a) The following are identification codes for steel, aluminum, or other...

  14. 49 CFR 178.512 - Standards for steel or aluminum boxes.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 49 Transportation 2 2010-10-01 2010-10-01 false Standards for steel or aluminum boxes. 178.512... FOR PACKAGINGS Non-bulk Performance-Oriented Packaging Standards § 178.512 Standards for steel or aluminum boxes. (a) The following are identification codes for steel or aluminum boxes: (1) 4A for a steel...

  15. 49 CFR 178.512 - Standards for steel or aluminum boxes.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 49 Transportation 3 2011-10-01 2011-10-01 false Standards for steel or aluminum boxes. 178.512... PACKAGINGS Non-bulk Performance-Oriented Packaging Standards § 178.512 Standards for steel or aluminum boxes. (a) The following are identification codes for steel or aluminum boxes: (1) 4A for a steel box; and...

  16. The Effects of Prohibiting Gestures on Children's Lexical Retrieval Ability

    ERIC Educational Resources Information Center

    Pine, Karen J.; Bird, Hannah; Kirk, Elizabeth

    2007-01-01

    Two alternative accounts have been proposed to explain the role of gestures in thinking and speaking. The Information Packaging Hypothesis (Kita, 2000) claims that gestures are important for the conceptual packaging of information before it is coded into a linguistic form for speech. The Lexical Retrieval Hypothesis (Rauscher, Krauss & Chen, 1996)…

  17. Nmrglue: an open source Python package for the analysis of multidimensional NMR data.

    PubMed

    Helmus, Jonathan J; Jaroniec, Christopher P

    2013-04-01

    Nmrglue, an open source Python package for working with multidimensional NMR data, is described. When used in combination with other Python scientific libraries, nmrglue provides a highly flexible and robust environment for spectral processing, analysis and visualization and includes a number of common utilities such as linear prediction, peak picking and lineshape fitting. The package also enables existing NMR software programs to be readily tied together, currently facilitating the reading, writing and conversion of data stored in Bruker, Agilent/Varian, NMRPipe, Sparky, SIMPSON, and Rowland NMR Toolkit file formats. In addition to standard applications, the versatility offered by nmrglue makes the package particularly suitable for tasks that include manipulating raw spectrometer data files, automated quantitative analysis of multidimensional NMR spectra with irregular lineshapes such as those frequently encountered in the context of biomacromolecular solid-state NMR, and rapid implementation and development of unconventional data processing methods such as covariance NMR and other non-Fourier approaches. Detailed documentation, install files and source code for nmrglue are freely available at http://nmrglue.com. The source code can be redistributed and modified under the New BSD license.
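
    To give a flavor of the workflow, here is a minimal 1D processing sketch using nmrglue's pipe and proc_base modules; the file names are placeholders, and the spectral-header bookkeeping a careful script would perform is omitted.

        import nmrglue as ng

        # Read time-domain data stored in NMRPipe format (placeholder name).
        dic, data = ng.pipe.read("test.fid")

        # Basic processing with nmrglue's proc_base utilities.
        data = ng.proc_base.zf_size(data, 2 * data.shape[-1])  # zero fill
        data = ng.proc_base.fft(data)                          # Fourier transform
        data = ng.proc_base.di(data)                           # drop imaginaries

        # Write the processed spectrum back out in NMRPipe format.
        ng.pipe.write("test.ft", dic, data, overwrite=True)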

  18. Nmrglue: An Open Source Python Package for the Analysis of Multidimensional NMR Data

    PubMed Central

    Helmus, Jonathan J.; Jaroniec, Christopher P.

    2013-01-01

    Nmrglue, an open source Python package for working with multidimensional NMR data, is described. When used in combination with other Python scientific libraries, nmrglue provides a highly flexible and robust environment for spectral processing, analysis and visualization and includes a number of common utilities such as linear prediction, peak picking and lineshape fitting. The package also enables existing NMR software programs to be readily tied together, currently facilitating the reading, writing and conversion of data stored in Bruker, Agilent/Varian, NMRPipe, Sparky, SIMPSON, and Rowland NMR Toolkit file formats. In addition to standard applications, the versatility offered by nmrglue makes the package particularly suitable for tasks that include manipulating raw spectrometer data files, automated quantitative analysis of multidimensional NMR spectra with irregular lineshapes such as those frequently encountered in the context of biomacromolecular solid-state NMR, and rapid implementation and development of unconventional data processing methods such as covariance NMR and other non-Fourier approaches. Detailed documentation, install files and source code for nmrglue are freely available at http://nmrglue.com. The source code can be redistributed and modified under the New BSD license. PMID:23456039

  19. ANITA-IEAF activation code package - updating of the decay and cross section data libraries and validation on the experimental data from the Karlsruhe Isochronous Cyclotron

    NASA Astrophysics Data System (ADS)

    Frisoni, Manuela

    2017-09-01

    ANITA-IEAF is an activation package (code and libraries) developed previously at ENEA-Bologna to assess the activation of materials exposed to neutrons with energies greater than 20 MeV. An updated version of the ANITA-IEAF activation code package has been developed. It is suitable for application to the study of irradiation effects on materials in facilities like the International Fusion Materials Irradiation Facility (IFMIF) and the DEMO Oriented Neutron Source (DONES), in which a considerable amount of neutrons with energies above 20 MeV is produced. The present paper summarizes the main characteristics of the updated version of ANITA-IEAF, which is able to use decay and cross section data based on more recent evaluated nuclear data libraries, i.e. the JEFF-3.1.1 Radioactive Decay Data Library and the EAF-2010 neutron activation cross section library. In this paper the validation effort, based on the comparison between the code predictions and the activity measurements obtained from the Karlsruhe Isochronous Cyclotron, is presented. In this integral experiment samples of two different steels, SS-316 and F82H, pure vanadium and a vanadium alloy, structural materials of interest in fusion technology, were activated in a neutron spectrum similar to the IFMIF neutron field.

  20. OpenSWPC: an open-source integrated parallel simulation code for modeling seismic wave propagation in 3D heterogeneous viscoelastic media

    NASA Astrophysics Data System (ADS)

    Maeda, Takuto; Takemura, Shunsuke; Furumura, Takashi

    2017-07-01

    We have developed an open-source software package, Open-source Seismic Wave Propagation Code (OpenSWPC), for parallel numerical simulations of seismic wave propagation in 3D and 2D (P-SV and SH) viscoelastic media based on the finite difference method in local-to-regional scales. This code is equipped with a frequency-independent attenuation model based on the generalized Zener body and an efficient perfectly matched layer for absorbing boundary condition. A hybrid-style programming using OpenMP and the Message Passing Interface (MPI) is adopted for efficient parallel computation. OpenSWPC has wide applicability for seismological studies and great portability, allowing excellent performance on machines from PC clusters to supercomputers. Without modifying the code, users can conduct seismic wave propagation simulations using their own velocity structure models and the necessary source representations by specifying them in an input parameter file. The code has various modes for different types of velocity structure model input and different source representations such as single force, moment tensor and plane-wave incidence, which can easily be selected via the input parameters. Widely used binary data formats, the Network Common Data Form (NetCDF) and the Seismic Analysis Code (SAC) are adopted for the input of the heterogeneous structure model and the outputs of the simulation results, so users can easily handle the input/output datasets. All codes are written in Fortran 2003 and are available with detailed documents in a public repository.

  1. Radio Astronomy Tools in Python: Spectral-cube, pvextractor, and more

    NASA Astrophysics Data System (ADS)

    Ginsburg, A.; Robitaille, T.; Beaumont, C.; Rosolowsky, E.; Leroy, A.; Brogan, C.; Hunter, T.; Teuben, P.; Brisbin, D.

    2015-12-01

    The radio-astro-tools organization has been established to facilitate development of radio and millimeter analysis tools by the scientific community. The first packages developed under its umbrella are:
    • The spectral-cube package, for reading, writing, and analyzing spectral data cubes
    • The pvextractor package, for extracting position-velocity slices from position-position-velocity cubes along arbitrary paths
    • The radio-beam package, to handle Gaussian beams in the context of the astropy quantity and unit framework
    • casa-python, to enable installation of these packages - and any others - into users' CASA environments without conflicting with the underlying CASA package.
    Community input in the form of code contributions, suggestions, questions and comments is welcome on all of these tools. They can all be found at http://radio-astro-tools.github.io.
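
    As a taste of the spectral-cube interface, the sketch below reads a FITS cube, selects a velocity slab, and computes an integrated-intensity map. The file names are placeholders, and the cube header is assumed to carry the rest-frequency information needed for the velocity conversion.

        from astropy import units as u
        from spectral_cube import SpectralCube

        # Read a position-position-velocity cube from FITS (placeholder name).
        cube = SpectralCube.read("cube.fits")

        # Convert the spectral axis to velocity and select a slab of interest.
        vcube = cube.with_spectral_unit(u.km / u.s, velocity_convention="radio")
        slab = vcube.spectral_slab(-10 * u.km / u.s, 10 * u.km / u.s)

        # Integrated-intensity (moment 0) map of the slab.
        moment0 = slab.moment(order=0)
        moment0.write("moment0.fits", overwrite=True)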

  2. A tactile-output paging communication system for the deaf-blind

    NASA Technical Reports Server (NTRS)

    Baer, J. A.

    1979-01-01

    A radio frequency paging communication system that has coded vibrotactile outputs suitable for use by deaf-blind people was developed. In concept, the system consists of a base station transmitting and receiving unit and many on-body transmitting and receiving units. The completed system has seven operating modes: fire alarm; time signal; repeated single character Morse code; manual Morse code; emergency aid request; operational status test; and message acknowledge. The on-body units can be addressed in three ways: all units; a group of units; or an individual unit. All the functions developed were integrated into a single package that can be worn on the user's wrist. The control portion of the on-body unit is implemented by a microcomputer. The microcomputer is packaged in a custom-designed hybrid circuit to reduce its physical size.

  3. Simulations of neutron transport at low energy: a comparison between GEANT and MCNP.

    PubMed

    Colonna, N; Altieri, S

    2002-06-01

    The use of the simulation tool GEANT for neutron transport at energies below 20 MeV is discussed, in particular with regard to shielding and dose calculations. The reliability of the GEANT/MICAP package for neutron transport has been verified by comparing the results of simulations performed with this package over a wide energy range with the predictions of MCNP-4B, a code commonly used for neutron transport at low energy. A reasonable agreement between the results of the two codes is found for the neutron flux through a slab of material (iron and ordinary concrete), as well as for the dose released in soft tissue by neutrons. These results justify the use of the GEANT/MICAP code for neutron transport in a wide range of applications, including health physics problems.

  4. WOLF: a computer code package for the calculation of ion beam trajectories

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Vogel, D.L.

    1985-10-01

    The WOLF code solves Poisson's equation within a user-defined problem boundary of arbitrary shape. The code is compatible with ANSI FORTRAN and uses a two-dimensional Cartesian coordinate geometry represented on a triangular lattice. The vacuum electric fields and equipotential lines are calculated for the input problem. The user may then introduce a series of emitters from which particles of different charge-to-mass ratios and initial energies can originate. These non-relativistic particles will then be traced by WOLF through the user-defined region. Effects of ion and electron space charge are included in the calculation. A subprogram PISA forms part of this code and enables optimization of various aspects of the problem. The WOLF package also allows detailed graphics analysis of the computed results to be performed.

  5. A user's guide to the ssWavelets package

    Treesearch

    J.H. Gove

    2017-01-01

    ssWavelets is an R package that is meant to be used in conjunction with the sampSurf package (Gove, 2012) to perform wavelet decomposition on the results of a sampling surface simulation. In general, the wavelet filter decomposes the sampSurf simulation results by scale (distance), with each scale corresponding to a different level of the...

  6. An Interactive Computer Aided Design and Analysis Package.

    DTIC Science & Technology

    1986-03-01

    An Interactive Computer Aided Design and Analysis Package. Thesis by Terrence L. Ewald, Naval Postgraduate School, Monterey, California, March 1986.

  7. Community-based benchmarking of the CMIP DECK experiments

    NASA Astrophysics Data System (ADS)

    Gleckler, P. J.

    2015-12-01

    A diversity of community-based efforts are independently developing "diagnostic packages" with little or no coordination between them. A short list of examples includes NCAR's Climate Variability Diagnostics Package (CVDP), ORNL's International Land Model Benchmarking (ILAMB), LBNL's Toolkit for Extreme Climate Analysis (TECA), PCMDI's Metrics Package (PMP), the EU EMBRACE ESMValTool, the WGNE MJO diagnostics package, and CFMIP diagnostics. The full value of these efforts cannot be realized without some coordination. As a first step, a WCRP effort has initiated a catalog to document candidate packages that could potentially be applied in a "repeat-use" fashion to all simulations contributed to the CMIP DECK (Diagnostic, Evaluation and Characterization of Klima) experiments. Some coordination of community-based diagnostics has the additional potential to improve how CMIP modeling groups analyze their simulations during model development. The fact that most modeling groups now maintain a "CMIP compliant" data stream means that in principle, without much effort, they could readily adopt a set of well organized diagnostic capabilities specifically designed to operate on CMIP DECK experiments. Ultimately, a detailed listing of and access to analysis codes that are demonstrated to work "out of the box" with CMIP data could enable model developers (and others) to select those codes they wish to implement in-house, potentially enabling more systematic evaluation during the model development process.

  8. Determinant Computation on the GPU using the Condensation Method

    NASA Astrophysics Data System (ADS)

    Anisul Haque, Sardar; Moreno Maza, Marc

    2012-02-01

    We report on a GPU implementation of the condensation method designed by Abdelmalek Salem and Kouachi Said for computing the determinant of a matrix. We consider two types of coefficients: modular integers and floating point numbers. We evaluate the performance of our code by measuring its effective bandwidth and argue that it is numerically stable in the floating point number case. In addition, we compare our code with serial implementations of determinant computation from well-known mathematical packages. Our results suggest that a GPU implementation of the condensation method has a large potential for improving those packages in terms of running time and numerical stability.
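
    To make the idea concrete, here is a serial Python sketch of determinant computation by condensation. It uses Chio's classical condensation with row pivoting, which shrinks the matrix one order per step in the same spirit as the Salem-Kouachi scheme, but it is not the GPU algorithm of the paper.

        import numpy as np

        def det_by_condensation(A):
            """Chio condensation: repeatedly reduce an n x n matrix to
            (n-1) x (n-1), tracking the accumulated pivot scaling."""
            A = np.array(A, dtype=float)
            sign, scale = 1.0, 1.0
            while A.shape[0] > 1:
                if A[0, 0] == 0.0:          # pivot by row swap if needed
                    nz = np.flatnonzero(A[:, 0])
                    if nz.size == 0:
                        return 0.0          # zero first column => singular
                    A[[0, nz[0]]] = A[[nz[0], 0]]
                    sign = -sign
                n = A.shape[0]
                # Matrix of 2x2 minors taken against the pivot row/column.
                B = A[0, 0] * A[1:, 1:] - np.outer(A[1:, 0], A[0, 1:])
                scale *= A[0, 0] ** (n - 2)  # det(A) = det(B) / a11^(n-2)
                A = B
            return sign * A[0, 0] / scale

        M = np.random.rand(6, 6)
        print(det_by_condensation(M), np.linalg.det(M))  # should agree closely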

  9. magnum.fe: A micromagnetic finite-element simulation code based on FEniCS

    NASA Astrophysics Data System (ADS)

    Abert, Claas; Exl, Lukas; Bruckner, Florian; Drews, André; Suess, Dieter

    2013-11-01

    We have developed a finite-element micromagnetic simulation code based on the FEniCS package called magnum.fe. Here we describe the numerical methods that are applied as well as their implementation with FEniCS. We apply a transformation method for the solution of the demagnetization-field problem. A semi-implicit weak formulation is used for the integration of the Landau-Lifshitz-Gilbert equation. Numerical experiments show the validity of simulation results. magnum.fe is open source and well documented. The broad feature range of the FEniCS package makes magnum.fe a good choice for the implementation of novel micromagnetic finite-element algorithms.
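
    The dynamics being integrated can be illustrated with a minimal explicit Landau-Lifshitz-Gilbert step for a single macrospin in NumPy. This toy integrator is not magnum.fe's semi-implicit finite-element formulation, and the field and material values are illustrative only.

        import numpy as np

        GAMMA = 2.211e5   # gyromagnetic ratio, m/(A s)
        ALPHA = 0.1       # Gilbert damping, dimensionless

        def llg_rhs(m, h_eff):
            """Landau-Lifshitz form of the LLG right-hand side, |m| = 1."""
            pre = -GAMMA / (1.0 + ALPHA ** 2)
            mxh = np.cross(m, h_eff)
            return pre * (mxh + ALPHA * np.cross(m, mxh))

        m = np.array([1.0, 0.0, 0.0])     # initial unit magnetization
        h = np.array([0.0, 0.0, 8.0e4])   # static field along z, A/m
        dt = 1.0e-13                      # time step, s

        for _ in range(20000):
            m = m + dt * llg_rhs(m, h)
            m /= np.linalg.norm(m)        # renormalize after each step

        print(m)  # relaxes toward +z for positive damping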

  10. Use of the MATRIXx Integrated Toolkit on the Microwave Anisotropy Probe Attitude Control System

    NASA Technical Reports Server (NTRS)

    Ward, David K.; Andrews, Stephen F.; McComas, David C.; ODonnell, James R., Jr.

    1999-01-01

    Recent advances in analytical software tools allow the analysis, simulation, flight code, and documentation of an algorithm to be generated from a single source, all within one integrated analytical design package. NASA's Microwave Anisotropy Probe project has used one such package, Integrated Systems' MATRIXx suite, in the design of the spacecraft's Attitude Control System. The project's experience with the linear analysis, simulation, code generation, and documentation tools will be presented and compared with more traditional development tools. In particular, the quality of the flight software generated will be examined in detail. Finally, lessons learned on each of the tools will be shared.

  11. NPTFit: A Code Package for Non-Poissonian Template Fitting

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mishra-Sharma, Siddharth; Rodd, Nicholas L.; Safdi, Benjamin R., E-mail: smsharma@princeton.edu, E-mail: nrodd@mit.edu, E-mail: bsafdi@mit.edu

    We present NPTFit, an open-source code package, written in Python and Cython, for performing non-Poissonian template fits (NPTFs). The NPTF is a recently developed statistical procedure for characterizing the contribution of unresolved point sources (PSs) to astrophysical data sets. The NPTF was first applied to Fermi gamma-ray data to provide evidence that the excess of ∼GeV gamma-rays observed in the inner regions of the Milky Way likely arises from a population of sub-threshold point sources, and the NPTF has since found additional applications studying sub-threshold extragalactic sources at high Galactic latitudes. The NPTF generalizes traditional astrophysical template fits to allow for searches for populations of unresolved PSs that may follow a given spatial distribution. NPTFit builds upon the framework of the fluctuation analyses developed in X-ray astronomy, thus it likely has applications beyond those demonstrated with gamma-ray data. The NPTFit package utilizes novel computational methods to perform the NPTF efficiently. The code is available at http://github.com/bsafdi/NPTFit and up-to-date and extensive documentation may be found at http://nptfit.readthedocs.io.
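
    The statistical signature the NPTF targets is easy to demonstrate in miniature: at a fixed mean intensity, unresolved point sources make the pixel-count histogram broader than Poisson. The NumPy toy below (not NPTFit code) contrasts the two cases.

        import numpy as np

        rng = np.random.default_rng(0)
        n_pix = 100_000

        # Smooth (Poissonian) emission: mean of 10 counts in every pixel.
        smooth = rng.poisson(10.0, n_pix)

        # Unresolved point sources with the same mean: ~0.1 source per
        # pixel, each source contributing Poisson(100) counts, so the
        # total per pixel is Poisson(100 * N) with N ~ Poisson(0.1).
        n_src = rng.poisson(0.1, n_pix)
        ps = rng.poisson(100.0 * n_src)

        print(smooth.mean(), smooth.var())  # variance ~ mean (Poisson)
        print(ps.mean(), ps.var())          # variance >> mean (non-Poissonian)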

  12. Practical Problems with Medication Use that Older People Experience: A Qualitative Study

    PubMed Central

    Notenboom, Kim; Beers, Erna; van Riet-Nales, Diana A; Egberts, Toine C G; Leufkens, Hubert G M; Jansen, Paul A F; Bouvy, Marcel L

    2014-01-01

    Objectives To identify the practical problems that older people experience with the daily use of their medicines and their management strategies to address these problems and to determine the potential clinical relevance thereof. Design Qualitative study with semistructured face-to-face interviews. Setting A community pharmacy and a geriatric outpatient ward. Participants Community-dwelling people aged 70 and older (N = 59). Measurements Participants were interviewed at home. Two researchers coded the reported problems and management strategies independently according to a coding scheme. An expert panel classified the potential clinical relevance of every identified practical problem and associated management strategy using a 3-point scale. Results Two hundred eleven practical problems and 184 management strategies were identified. Ninety-five percent of the participants experienced one or more practical problems with the use of their medicines: problems reading and understanding the instructions for use, handling the outer packaging, handling the immediate packaging, completing preparation before use, and taking the medicine. For 10 participants, at least one of their problems, in combination with the applied management strategy, had potential clinical consequences and 11 cases (5% of the problems) had the potential to cause moderate or severe clinical deterioration. Conclusion Older people experience a number of practical problems using their medicines, and their strategies to manage these problems are sometimes suboptimal. These problems can lead to incorrect medication use with clinically relevant consequences. The findings pose a challenge for healthcare professionals, drug developers, and regulators to diminish these problems. PMID:25516030

  13. Bayesian Atmospheric Radiative Transfer (BART): Model, Statistics Driver, and Application to HD 209458b

    NASA Astrophysics Data System (ADS)

    Cubillos, Patricio; Harrington, Joseph; Blecic, Jasmina; Stemm, Madison M.; Lust, Nate B.; Foster, Andrew S.; Rojo, Patricio M.; Loredo, Thomas J.

    2014-11-01

    Multi-wavelength secondary-eclipse and transit depths probe the thermo-chemical properties of exoplanets. In recent years, several research groups have developed retrieval codes to analyze the existing data and study the prospects of future facilities. However, the scientific community has limited access to these packages. Here we premiere the open-source Bayesian Atmospheric Radiative Transfer (BART) code. We discuss the key aspects of the radiative-transfer algorithm and the statistical package. The radiation code includes line databases for all HITRAN molecules, high-temperature H2O, TiO, and VO, and includes a preprocessor for adding additional line databases without recompiling the radiation code. Collision-induced absorption lines are available for H2-H2 and H2-He. The parameterized thermal and molecular abundance profiles can be modified arbitrarily without recompilation. The generated spectra are integrated over arbitrary bandpasses for comparison to data. BART's statistical package, Multi-core Markov-chain Monte Carlo (MC3), is a general-purpose MCMC module. MC3 implements the Differential-Evolution Markov-chain Monte Carlo algorithm (ter Braak 2006, 2009). MC3 converges 20-400 times faster than the usual Metropolis-Hastings MCMC algorithm, and in addition uses the Message Passing Interface (MPI) to parallelize the MCMC chains. We apply the BART retrieval code to the HD 209458b data set to estimate the planet's temperature profile and molecular abundances. This work was supported by NASA Planetary Atmospheres grant NNX12AI69G and NASA Astrophysics Data Analysis Program grant NNX13AF38G. JB holds a NASA Earth and Space Science Fellowship.
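
    The differential-evolution proposal at the heart of MC3's sampler (ter Braak 2006) is compact enough to sketch. The target density below is a stand-in Gaussian, not BART's radiative-transfer likelihood, and no MPI parallelization is shown.

        import numpy as np

        rng = np.random.default_rng(1)

        def log_prob(x):
            return -0.5 * np.sum(x ** 2)   # stand-in target: N(0, I)

        n_chains, n_dim, n_steps = 10, 3, 5000
        gamma = 2.38 / np.sqrt(2 * n_dim)  # ter Braak's recommended scale
        X = rng.normal(size=(n_chains, n_dim))
        logp = np.array([log_prob(x) for x in X])

        samples = []
        for _ in range(n_steps):
            for i in range(n_chains):
                # The difference of two other chains drives the proposal.
                a, b = rng.choice([j for j in range(n_chains) if j != i],
                                  2, replace=False)
                prop = X[i] + gamma * (X[a] - X[b]) + rng.normal(0, 1e-4, n_dim)
                lp = log_prob(prop)
                if np.log(rng.random()) < lp - logp[i]:  # Metropolis rule
                    X[i], logp[i] = prop, lp
            samples.append(X.copy())

        chain = np.concatenate(samples[n_steps // 2:])  # discard burn-in
        print(chain.mean(axis=0), chain.std(axis=0))    # ~0 and ~1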

  14. What makes computational open source software libraries successful?

    NASA Astrophysics Data System (ADS)

    Bangerth, Wolfgang; Heister, Timo

    2013-01-01

    Software is the backbone of scientific computing. Yet, while we regularly publish detailed accounts about the results of scientific software, and while there is a general sense of which numerical methods work well, our community is largely unaware of best practices in writing the large-scale, open source scientific software upon which our discipline rests. This is particularly apparent in the commonly held view that writing successful software packages is largely the result of simply ‘being a good programmer’ when in fact there are many other factors involved, for example the social skill of community building. In this paper, we consider what we have found to be the necessary ingredients for successful scientific software projects and, in particular, for software libraries upon which the vast majority of scientific codes are built today. In particular, we discuss the roles of code, documentation, communities, project management and licenses. We also briefly comment on the impact on academic careers of engaging in software projects.

  15. FlexibleSUSY-A spectrum generator generator for supersymmetric models

    NASA Astrophysics Data System (ADS)

    Athron, Peter; Park, Jae-hyeon; Stöckinger, Dominik; Voigt, Alexander

    2015-05-01

    We introduce FlexibleSUSY, a Mathematica and C++ package, which generates a fast, precise C++ spectrum generator for any SUSY model specified by the user. The generated code is designed with both speed and modularity in mind, making it easy to adapt and extend with new features. The model is specified by supplying the superpotential, gauge structure and particle content in a SARAH model file; specific boundary conditions e.g. at the GUT, weak or intermediate scales are defined in a separate FlexibleSUSY model file. From these model files, FlexibleSUSY generates C++ code for self-energies, tadpole corrections, renormalization group equations (RGEs) and electroweak symmetry breaking (EWSB) conditions and combines them with numerical routines for solving the RGEs and EWSB conditions simultaneously. The resulting spectrum generator is then able to solve for the spectrum of the model, including loop-corrected pole masses, consistent with user specified boundary conditions. The modular structure of the generated code allows for individual components to be replaced with an alternative if available. FlexibleSUSY has been carefully designed to grow as alternative solvers and calculators are added. Predefined models include the MSSM, NMSSM, E6SSM, USSM, R-symmetric models and models with right-handed neutrinos.

  16. Modular assembly of chimeric phi29 packaging RNAs that support DNA packaging.

    PubMed

    Fang, Yun; Shu, Dan; Xiao, Feng; Guo, Peixuan; Qin, Peter Z

    2008-08-08

    The bacteriophage phi29 DNA packaging motor is a protein/RNA complex that can produce strong force to condense the linear-double-stranded DNA genome into a pre-formed protein capsid. The RNA component, called the packaging RNA (pRNA), utilizes magnesium-dependent inter-molecular base-pairing interactions to form ring-shaped complexes. The pRNA is a class of non-coding RNA, interacting with phi29 motor proteins to enable DNA packaging. Here, we report a two-piece chimeric pRNA construct that is fully competent in interacting with partner pRNA to form ring-shaped complexes, in packaging DNA via the motor, and in assembling infectious phi29 virions in vitro. This is the first example of a fully functional pRNA assembled using two non-covalently interacting fragments. The results support the notion of modular pRNA architecture in the phi29 packaging motor.

  17. Modular assembly of chimeric phi29 packaging RNAs that support DNA packaging

    PubMed Central

    Fang, Yun; Shu, Dan; Xiao, Feng; Guo, Peixuan; Qin, Peter Z.

    2008-01-01

    The bacteriophage phi29 DNA packaging motor is a protein/RNA complex that can produce strong force to condense the linear-double stranded DNA genome into a pre-formed protein capsid. The RNA component, called the packaging RNA (pRNA), utilizes magnesium-dependent intermolecular base-pairing interactions to form ring-shaped complexes. The pRNA is a class of non-coding RNA, interacting with phi29 motor proteins to enable DNA packaging. Here, we report a 2-piece chimeric pRNA construct that is fully competent in interacting with partner pRNA to form ring-shaped complexes, in packaging DNA via the motor, and in assembling infectious phi29 virions in vitro. This is the first example of a fully functional pRNA assembled using two non-covalently interacting fragments. The results support the notion of modular pRNA architecture in the phi29 packaging motor. PMID:18514064

  18. 49 CFR 178.517 - Standards for plastic boxes.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 49 Transportation 2 2010-10-01 2010-10-01 false Standards for plastic boxes. 178.517 Section 178... PACKAGINGS Non-bulk Performance-Oriented Packaging Standards § 178.517 Standards for plastic boxes. (a) The following are identification codes for plastic boxes: (1) 4H1 for an expanded plastic box; and (2) 4H2 for a...

  19. APINetworks Java. A Java approach to the efficient treatment of large-scale complex networks

    NASA Astrophysics Data System (ADS)

    Muñoz-Caro, Camelia; Niño, Alfonso; Reyes, Sebastián; Castillo, Miriam

    2016-10-01

    We present a new version of the core structural package of our Application Programming Interface, APINetworks, for the treatment of complex networks in arbitrary computational environments. The new version is written in Java and presents several advantages over the previous C++ version: the portability of the Java code, the ease of object-oriented design implementations, and the simplicity of memory management. In addition, new data structures are introduced for storing the sets of nodes and edges. Also, by resorting to the different garbage collectors currently available in the JVM, the Java version is much more efficient than the C++ one with respect to memory management. In particular, the G1 collector is the most efficient one because of the parallel execution of G1 and the Java application. Using G1, APINetworks Java outperforms the C++ version and the well-known NetworkX and JGraphT packages in the building and BFS traversal of linear and complete networks. The better memory management of the present version allows for the modeling of much larger networks.

  20. Phonon Calculations Using the Real-Space Multigrid Method (RMG)

    NASA Astrophysics Data System (ADS)

    Zhang, Jiayong; Lu, Wenchang; Briggs, Emil; Cheng, Yongqiang; Ramirez-Cuesta, A. J.; Bernholc, Jerry

    RMG, a DFT-based open-source package using the real-space multigrid method, has proven to work effectively on large-scale systems with thousands of atoms. Our recent work has shown its practicability for high-accuracy phonon calculations employing the frozen phonon method. In this method, a primary unit cell with a small lattice constant is enlarged to a supercell that is sufficiently large to obtain the force constants matrix by finite displacements of atoms in the supercell. The open-source package PhonoPy is used to determine the necessary displacements by taking symmetry into account. A Python script coupling RMG and PhonoPy enables us to perform high-throughput calculations of phonon properties. We have applied this method to many systems, such as silicon, silica glass, ZIF-8, etc. Results from RMG are compared to the experimental spectra measured using the VISION inelastic neutron scattering spectrometer at the Spallation Neutron Source at ORNL, as well as results from other DFT codes. The computing resources were made available through the VirtuES (Virtual Experiments in Spectroscopy) project, funded by the Laboratory Directed Research and Development program (LDRD project No. 7739).
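
    The coupling pattern described above (PhonoPy proposes symmetry-reduced displacements, the DFT code supplies forces) can be sketched with PhonoPy's Python API as found in recent releases; the structure is a one-atom toy cell, and the force call is a placeholder standing in for RMG.

        import numpy as np
        from phonopy import Phonopy
        from phonopy.structure.atoms import PhonopyAtoms

        def dft_forces(supercell):
            """Placeholder: the RMG coupling script would run RMG on this
            displaced supercell and return the per-atom forces."""
            return np.zeros_like(supercell.scaled_positions)  # dummy values

        # Toy one-atom cubic cell; a real study uses the relaxed structure.
        unitcell = PhonopyAtoms(symbols=["Si"],
                                cell=np.eye(3) * 5.43,
                                scaled_positions=[[0.0, 0.0, 0.0]])

        phonon = Phonopy(unitcell, supercell_matrix=np.diag([2, 2, 2]))
        phonon.generate_displacements(distance=0.01)

        # One force evaluation per displaced supercell, then force constants.
        phonon.forces = [dft_forces(sc)
                         for sc in phonon.supercells_with_displacements]
        phonon.produce_force_constants()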

  1. Fast and Accurate Protein False Discovery Rates on Large-Scale Proteomics Data Sets with Percolator 3.0

    NASA Astrophysics Data System (ADS)

    The, Matthew; MacCoss, Michael J.; Noble, William S.; Käll, Lukas

    2016-11-01

    Percolator is a widely used software tool that increases yield in shotgun proteomics experiments and assigns reliable statistical confidence measures, such as q values and posterior error probabilities, to peptides and peptide-spectrum matches (PSMs) from such experiments. Percolator's processing speed has been sufficient for typical data sets consisting of hundreds of thousands of PSMs. With our new scalable approach, we can now also analyze millions of PSMs in a matter of minutes on a commodity computer. Furthermore, with the increasing awareness for the need for reliable statistics on the protein level, we compared several easy-to-understand protein inference methods and implemented the best-performing method—grouping proteins by their corresponding sets of theoretical peptides and then considering only the best-scoring peptide for each protein—in the Percolator package. We used Percolator 3.0 to analyze the data from a recent study of the draft human proteome containing 25 million spectra (PM:24870542). The source code and Ubuntu, Windows, MacOS, and Fedora binary packages are available from http://percolator.ms/ under an Apache 2.0 license.
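
    The protein inference rule adopted here (group proteins that share the same set of theoretical peptides, then score each group by its single best peptide) is easy to state in code. A minimal sketch with made-up inputs:

        from collections import defaultdict

        # Hypothetical inputs: protein -> theoretical peptide set, plus the
        # best observed search score per peptide (higher = better here).
        protein_peptides = {
            "protA": frozenset({"PEPTIDEK", "LSVGNR"}),
            "protB": frozenset({"PEPTIDEK", "LSVGNR"}),  # same set as protA
            "protC": frozenset({"MKWVTFR"}),
        }
        peptide_scores = {"PEPTIDEK": 3.2, "LSVGNR": 1.4, "MKWVTFR": 2.7}

        # Group proteins with identical theoretical peptide sets.
        groups = defaultdict(list)
        for prot, peps in protein_peptides.items():
            groups[peps].append(prot)

        # Score each group by its best-scoring peptide.
        for peps, prots in groups.items():
            best = max(peptide_scores[p] for p in peps)
            print(sorted(prots), best)  # protA/protB remain one group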

  2. Fast and Accurate Protein False Discovery Rates on Large-Scale Proteomics Data Sets with Percolator 3.0.

    PubMed

    The, Matthew; MacCoss, Michael J; Noble, William S; Käll, Lukas

    2016-11-01

    Percolator is a widely used software tool that increases yield in shotgun proteomics experiments and assigns reliable statistical confidence measures, such as q values and posterior error probabilities, to peptides and peptide-spectrum matches (PSMs) from such experiments. Percolator's processing speed has been sufficient for typical data sets consisting of hundreds of thousands of PSMs. With our new scalable approach, we can now also analyze millions of PSMs in a matter of minutes on a commodity computer. Furthermore, with the increasing awareness for the need for reliable statistics on the protein level, we compared several easy-to-understand protein inference methods and implemented the best-performing method (grouping proteins by their corresponding sets of theoretical peptides and then considering only the best-scoring peptide for each protein) in the Percolator package. We used Percolator 3.0 to analyze the data from a recent study of the draft human proteome containing 25 million spectra (PM:24870542). The source code and Ubuntu, Windows, MacOS, and Fedora binary packages are available from http://percolator.ms/ under an Apache 2.0 license.

  3. Entity- Version 1.0

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hart, Brian; Oppel, Fred; Rigdon, Brian

    2012-09-13

    This package contains classes that capture high-level aspects of characters and vehicles. Vehicles manage seats and riders. Vehicles and characters can now be configured to compose different behaviors and have certain capabilities, by adding them through XML data. These behaviors and capabilities are not included in this package, but instead are part of other packages such as mobility behavior, path planning, sight, and sound. Entity is not dependent on these other packages. This package also contains the icons used for the Umbra applications Dante Scenario Editor, Dante Tabletop, and OpShed. This assertion includes managed C++ wrapper code (EntityWrapper) to enable C# applications, such as Dante Scenario Editor, Dante Tabletop, and OpShed, to incorporate this library.

  4. Interactive Finite Elements for General Engine Dynamics Analysis

    NASA Technical Reports Server (NTRS)

    Adams, M. L.; Padovan, J.; Fertis, D. G.

    1984-01-01

    General nonlinear finite element codes were adapted for the purpose of analyzing the dynamics of gas turbine engines. In particular, this adaptation required the development of a squeeze-film damper element software package and its implementation into a representative current-generation code. The ADINA code was selected because of prior use of it and familiarity with its internal structure and logic. This objective was met, and the results indicate that such use of general-purpose codes is a viable alternative to specialized codes for general dynamics analysis of engines.

  5. NWChem: A comprehensive and scalable open-source solution for large scale molecular simulations

    NASA Astrophysics Data System (ADS)

    Valiev, M.; Bylaska, E. J.; Govind, N.; Kowalski, K.; Straatsma, T. P.; Van Dam, H. J. J.; Wang, D.; Nieplocha, J.; Apra, E.; Windus, T. L.; de Jong, W. A.

    2010-09-01

    The latest release of NWChem delivers an open-source computational chemistry package with extensive capabilities for large scale simulations of chemical and biological systems. Utilizing a common computational framework, diverse theoretical descriptions can be used to provide the best solution for a given scientific problem. Scalable parallel implementations and modular software design enable efficient utilization of current computational architectures. This paper provides an overview of NWChem focusing primarily on the core theoretical modules provided by the code and their parallel performance.
    Program summary
    Program title: NWChem
    Catalogue identifier: AEGI_v1_0
    Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AEGI_v1_0.html
    Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland
    Licensing provisions: Open Source Educational Community License
    No. of lines in distributed program, including test data, etc.: 11 709 543
    No. of bytes in distributed program, including test data, etc.: 680 696 106
    Distribution format: tar.gz
    Programming language: Fortran 77, C
    Computer: all Linux based workstations and parallel supercomputers, Windows and Apple machines
    Operating system: Linux, OS X, Windows
    Has the code been vectorised or parallelized?: Code is parallelized
    Classification: 2.1, 2.2, 3, 7.3, 7.7, 16.1, 16.2, 16.3, 16.10, 16.13
    Nature of problem: Large-scale atomistic simulations of chemical and biological systems require efficient and reliable methods for ground and excited solutions of many-electron Hamiltonian, analysis of the potential energy surface, and dynamics.
    Solution method: Ground and excited solutions of many-electron Hamiltonian are obtained utilizing density-functional theory, many-body perturbation approach, and coupled cluster expansion. These solutions or a combination thereof with classical descriptions are then used to analyze potential energy surface and perform dynamical simulations.
    Additional comments: Full documentation is provided in the distribution file. This includes an INSTALL file giving details of how to build the package. A set of test runs is provided in the examples directory. The distribution file for this program is over 90 Mbytes and therefore is not delivered directly when download or Email is requested. Instead a html file giving details of how the program can be obtained is sent.
    Running time: Running time depends on the size of the chemical system, complexity of the method, number of cpu's and the computational task. It ranges from several seconds for serial DFT energy calculations on a few atoms to several hours for parallel coupled cluster energy calculations on tens of atoms or ab-initio molecular dynamics simulation on hundreds of atoms.

  6. Distributed chemical computing using ChemStar: an open source java remote method invocation architecture applied to large scale molecular data from PubChem.

    PubMed

    Karthikeyan, M; Krishnan, S; Pandey, Anil Kumar; Bender, Andreas; Tropsha, Alexander

    2008-04-01

    We present the application of a Java remote method invocation (RMI) based open source architecture to distributed chemical computing. This architecture was previously employed for distributed data harvesting of chemical information from the Internet via the Google application programming interface (API; ChemXtreme). Due to its open source character and its flexibility, the underlying server/client framework can be quickly adapted to virtually any computational task that can be parallelized. Here, we present the server/client communication framework as well as an application to distributed computing of chemical properties on a large scale (currently the size of PubChem; about 18 million compounds), using both the Marvin toolkit and the open-source JOELib package. As an application, the agreement of log P and TPSA values between the two packages was compared for this set of compounds. Outliers were found to be mostly non-druglike compounds, and differences could usually be explained by differences in the underlying algorithms. ChemStar is the first open source distributed chemical computing environment built on Java RMI, and it is easily adaptable to user demands due to its "plug-in architecture". The complete source codes as well as calculated properties along with links to PubChem resources are available on the Internet via a graphical user interface at http://moltable.ncl.res.in/chemstar/.
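
    The property-agreement check described above is straightforward to reproduce in outline. The sketch below uses RDKit as a stand-in for the Marvin and JOELib calculators (neither is shown in this record); `other_logp` is a placeholder to be swapped for the second toolkit's call, so in this sketch the two calculators agree by construction.

```python
# Sketch of a ChemStar-style log P agreement check between two property
# calculators. RDKit stands in for Marvin/JOELib here; `other_logp` is a
# placeholder for the second package's implementation.
from rdkit import Chem
from rdkit.Chem import Crippen, Descriptors

def other_logp(mol):
    # Replace with the second toolkit's log P call (e.g., Marvin or JOELib).
    return Crippen.MolLogP(mol)

for smi in ["CCO", "c1ccccc1O", "CC(=O)Oc1ccccc1C(=O)O"]:
    mol = Chem.MolFromSmiles(smi)
    lp1, lp2 = Descriptors.MolLogP(mol), other_logp(mol)
    tpsa = Descriptors.TPSA(mol)
    flag = "OUTLIER" if abs(lp1 - lp2) > 1.0 else "ok"  # arbitrary threshold
    print(f"{smi:28s} logP {lp1:6.2f} vs {lp2:6.2f} ({flag}); TPSA {tpsa:6.1f}")
```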

  7. 49 CFR 171.25 - Additional requirements for the use of the IMDG Code.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... 176 of this subchapter. (3) Packages containing primary lithium batteries and cells that are transported in accordance with Special Provision 188 of the IMDG Code must be marked “PRIMARY LITHIUM BATTERIES—FORBIDDEN FOR TRANSPORT ABOARD PASSENGER AIRCRAFT” or “LITHIUM METAL BATTERIES—FORBIDDEN FOR...

  8. 49 CFR 171.25 - Additional requirements for the use of the IMDG Code.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 176 of this subchapter. (3) Packages containing primary lithium batteries and cells that are transported in accordance with Special Provision 188 of the IMDG Code must be marked “PRIMARY LITHIUM BATTERIES—FORBIDDEN FOR TRANSPORT ABOARD PASSENGER AIRCRAFT” or “LITHIUM METAL BATTERIES—FORBIDDEN FOR...

  9. 49 CFR 171.25 - Additional requirements for the use of the IMDG Code.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ...) Packages containing primary lithium batteries and cells that are transported in accordance with Special Provision 188 of the IMDG Code must be marked “PRIMARY LITHIUM BATTERIES—FORBIDDEN FOR TRANSPORT ABOARD PASSENGER AIRCRAFT” or “LITHIUM METAL BATTERIES—FORBIDDEN FOR TRANSPORT ABOARD PASSENGER AIRCRAFT.” This...

  10. 49 CFR 171.25 - Additional requirements for the use of the IMDG Code.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... 176 of this subchapter. (3) Packages containing primary lithium batteries and cells that are transported in accordance with Special Provision 188 of the IMDG Code must be marked “PRIMARY LITHIUM BATTERIES—FORBIDDEN FOR TRANSPORT ABOARD PASSENGER AIRCRAFT” or “LITHIUM METAL BATTERIES—FORBIDDEN FOR...

  11. Introducing Python tools for magnetotellurics: MTpy

    NASA Astrophysics Data System (ADS)

    Krieger, L.; Peacock, J.; Inverarity, K.; Thiel, S.; Robertson, K.

    2013-12-01

    Within the framework of geophysical exploration techniques, the magnetotelluric method (MT) is relatively immature: It is still not as widespread as other geophysical methods like seismology, and its processing schemes and data formats are not thoroughly standardized. As a result, the file handling and processing software within the academic community is mainly based on a loose collection of codes, which are sometimes highly adapted to the respective local specifications. Although tools for the estimation of the frequency dependent MT transfer function, as well as inversion and modelling codes, are available, the standards and software for handling MT data are generally not unified throughout the community. To overcome problems that arise from missing standards, and to simplify the general handling of MT data, we have developed the software package "MTpy", which allows the handling, processing, and imaging of magnetotelluric data sets. It is written in Python and the code is open-source. The setup of this package follows the modular approach of successful software packages like GMT or ObsPy. It contains sub-packages and modules for various tasks within the standard MT data processing and handling scheme. Besides pure Python classes and functions, MTpy provides wrappers and convenience scripts to call external software, e.g. modelling and inversion codes. Even though still under development, MTpy already contains ca. 250 functions that work on raw and preprocessed data. However, as our aim is not to produce a static collection of software, we rather introduce MTpy as a flexible framework, which will be dynamically extended in the future. It then has the potential to help standardise processing procedures and at the same time be a versatile supplement for existing algorithms. We introduce the concept and structure of MTpy, and we illustrate the workflow of MT data processing utilising MTpy on an example data set collected over a geothermal exploration site in South Australia. [Figure: workflow of MT data processing. Within the structural diagram, the MTpy sub-packages are shown in red (time series data processing), green (handling of EDI files and impedance tensor data), yellow (connection to modelling/inversion algorithms), black (impedance tensor interpretation, e.g. by Phase Tensor calculations), and blue (generation of visual representations, e.g. pseudo-sections or resistivity models).]
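
    To make the modular workflow concrete, here is a hypothetical usage sketch. The module paths and method names below (`mtpy.core.edi.Edi`, `apparent_resistivity`, `pseudosection.plot`) are illustrative assumptions based on the sub-package roles described above, not MTpy's confirmed API.

```python
# Hypothetical sketch of an MTpy-style workflow; all names are illustrative
# assumptions, not the package's verified interface.
from mtpy.core import edi               # assumed sub-package: EDI file handling
from mtpy.imaging import pseudosection  # assumed sub-package: visualization

# Read a transfer-function file produced by standard MT processing.
station = edi.Edi("example_station.edi")      # hypothetical class

# Inspect the impedance tensor and derived apparent resistivity.
z = station.Z                                 # impedance tensor vs. frequency
rho = station.apparent_resistivity()          # hypothetical convenience method

# Generate a pseudo-section plot across several stations.
pseudosection.plot(["sta01.edi", "sta02.edi", "sta03.edi"])   # hypothetical
```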

  12. An Analysis of Information Assurance Relating to the Department of Defense Radio Frequency Identification (RFID) Passive Network

    DTIC Science & Technology

    2005-03-01

    codes speed up consumer shopping, package shipping, and inventory tracking. RFID offers many advantages over bar codes, as the table below shows...sunlight” (Accenture, 2001, p. 4). Finally, one of the most significant advantages of RFID is the advent of anti-collision. Anti-collision allows an...RFID reader to read and/or write to multiple tags at one time, which is not possible for bar codes. Despite the many advantages RFID over bar codes

  13. 39 CFR Appendix A to Part 121 - Tables Depicting Service Standard Day Ranges

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... & USVI Periodicals 1 1-3 1 1-3 1-4 (AK) 11 (JNU) 11 (KTN) 1 (HI) 2 (GU) 1-4 10-11 10 8-10 Standard Mail 2 3 3-4 3-4 14 13 12 Package Services 1 2 2-3 2-3 12 11 11 AK = Alaska 3-digit ZIP Codes 995-997; JNU = Juneau AK 3-digit ZIP Code 998; KTN = Ketchikan AK 3-digit ZIP Code 999; HI = Hawaii 3-digit ZIP Codes...

  14. 39 CFR Appendix A to Part 121 - Tables Depicting Service Standard Day Ranges

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... & USVI Periodicals 1 1-3 1 1-3 1-4 (AK) 11 (JNU) 11 (KTN) 1 (HI) 2 (GU) 1-4 10-11 10 8-10 Standard Mail 2 3 3-4 3-4 14 13 12 Package Services 1 2 2-3 2-3 12 11 11 AK = Alaska 3-digit ZIP Codes 995-997; JNU = Juneau AK 3-digit ZIP Code 998; KTN = Ketchikan AK 3-digit ZIP Code 999; HI = Hawaii 3-digit ZIP Codes...

  15. Development of Fuel Shuffling Module for PHISICS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Allan Mabe; Andrea Alfonsi; Cristian Rabiti

    2013-06-01

    The PHISICS (Parallel and Highly Innovative Simulation for the INL Code System) [4] code toolkit has been under development at the Idaho National Laboratory. This package is intended to provide a modern analysis tool for reactor physics investigation. It is designed to maximize accuracy for a given availability of computational resources and to give state-of-the-art tools to the modern nuclear engineer. This is obtained by implementing several different algorithms and meshing approaches among which the user can choose, in order to balance computational cost against accuracy needs. The software is completely modular in order to simplify the independent development of modules by different teams and future maintenance. The package is coupled with the thermal-hydraulic code RELAP5-3D [3]. In the following, the structure of the different PHISICS modules is briefly recalled, focusing on the new shuffling module (SHUFFLE), the subject of this paper.

  16. The EGS4 Code System: Solution of Gamma-ray and Electron Transport Problems

    DOE R&D Accomplishments Database

    Nelson, W. R.; Namito, Yoshihito

    1990-03-01

    In this paper we present an overview of the EGS4 Code System -- a general purpose package for the Monte Carlo simulation of the transport of electrons and photons. During the last 10-15 years EGS has been widely used to design accelerators and detectors for high-energy physics. More recently the code has been found to be of tremendous use in medical radiation physics and dosimetry. The problem-solving capabilities of EGS4 will be demonstrated by means of a variety of practical examples. To facilitate this review, we will take advantage of a new add-on package, called SHOWGRAF, to display particle trajectories in complicated geometries. These are shown as 2-D laser pictures in the written paper and as photographic slides of a 3-D high-resolution color monitor during the oral presentation. 11 refs., 15 figs.

  17. Modeling ICF With RAGE, BHR, And The New Laser Package

    NASA Astrophysics Data System (ADS)

    Cliche, Dylan; Welser-Sherrill, Leslie; Haines, Brian; Mancini, Roberto

    2017-10-01

    Inertial Confinement Fusion (ICF) is one method used to obtain thermonuclear burn through either direct or indirect ablation of a millimeter-scale capsule with several lasers. Although progress has been made in theory, experiment, and diagnostics, the community has yet to reach ignition. A way of investigating this is through the use of high performance computer simulations of the implosion. RAGE is an advanced 1D, 2D, and 3D radiation adaptive grid Eulerian code used to simulate the hydrodynamics of a system. Because the interface between two fluids of unequal density accelerating into one another is unstable, it is important to include a turbulence model. BHR is a turbulence model which uses Reynolds-averaged Navier-Stokes (RANS) equations to model the mixing that occurs between the shell and the fusion fuel material. Until recently, it was difficult to model direct-drive experiments because RAGE had no laser energy deposition model. A new laser energy deposition model has now been implemented using the same ray tracing method as the Mazinisin laser package used at the OMEGA laser facility at the Laboratory for Laser Energetics (LLE) in Rochester, New York. Using the new laser package along with BHR for mixing allows us to more accurately simulate ICF implosions and obtain spatially and temporally resolved information (e.g. position, temperature, density, and mix concentrations) to give insight into what is happening inside the implosion.

  18. Forward Modeling of Large-scale Structure: An Open-source Approach with Halotools

    NASA Astrophysics Data System (ADS)

    Hearin, Andrew P.; Campbell, Duncan; Tollerud, Erik; Behroozi, Peter; Diemer, Benedikt; Goldbaum, Nathan J.; Jennings, Elise; Leauthaud, Alexie; Mao, Yao-Yuan; More, Surhud; Parejko, John; Sinha, Manodeep; Sipöcz, Brigitta; Zentner, Andrew

    2017-11-01

    We present the first stable release of Halotools (v0.2), a community-driven Python package designed to build and test models of the galaxy-halo connection. Halotools provides a modular platform for creating mock universes of galaxies starting from a catalog of dark matter halos obtained from a cosmological simulation. The package supports many of the common forms used to describe galaxy-halo models: the halo occupation distribution, the conditional luminosity function, abundance matching, and alternatives to these models that include effects such as environmental quenching or variable galaxy assembly bias. Satellite galaxies can be modeled to live in subhalos or to follow custom number density profiles within their halos, including spatial and/or velocity bias with respect to the dark matter profile. The package has an optimized toolkit to make mock observations on a synthetic galaxy population—including galaxy clustering, galaxy-galaxy lensing, galaxy group identification, RSD multipoles, void statistics, pairwise velocities and others—allowing direct comparison to observations. Halotools is object-oriented, enabling complex models to be built from a set of simple, interchangeable components, including those of your own creation. Halotools has an automated testing suite and is exhaustively documented on http://halotools.readthedocs.io, which includes quickstart guides, source code notes and a large collection of tutorials. The documentation is effectively an online textbook on how to build and study empirical models of galaxy formation with Python.
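
    As a concrete illustration of the modeling pattern described above, here is a minimal sketch based on the package's documented quickstart workflow: build a prebuilt HOD model, populate a halo catalog, and read off the mock galaxies. Exact class and argument names may differ between Halotools versions; `FakeSim` is the package's small built-in test catalog rather than a real simulation.

```python
# Minimal Halotools-style sketch: populate a halo catalog with an HOD model
# and extract mock galaxy positions. Names follow the documented quickstart
# pattern but may vary across versions.
from halotools.empirical_models import PrebuiltHodModelFactory
from halotools.sim_manager import FakeSim  # small built-in test catalog

model = PrebuiltHodModelFactory("zheng07", threshold=-20)  # prebuilt HOD model
halocat = FakeSim()                 # stand-in for a cached simulation catalog
model.populate_mock(halocat)        # Monte Carlo realization of the model

galaxies = model.mock.galaxy_table             # astropy Table of mock galaxies
print(galaxies["x", "y", "z"][:5])             # comoving positions
```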

  19. A Roadmap to Continuous Integration for ATLAS Software Development

    NASA Astrophysics Data System (ADS)

    Elmsheuser, J.; Krasznahorkay, A.; Obreshkov, E.; Undrus, A.; ATLAS Collaboration

    2017-10-01

    The ATLAS software infrastructure facilitates the efforts of more than 1000 developers working on a code base of 2200 packages with 4 million lines of C++ and 1.4 million lines of Python code. The ATLAS offline code management system is a powerful, flexible framework for processing requests for new package versions, probing code changes in the Nightly Build System, migrating to new platforms and compilers, deploying production releases for worldwide access, and supporting physicists with tools and interfaces for efficient software use. It maintains a multi-stream, parallel development environment with about 70 multi-platform branches of nightly releases and provides vast opportunities for testing new packages, for verifying patches to existing software, and for migrating to new platforms and compilers. The system's evolution is currently aimed at the adoption of modern continuous integration (CI) practices focused on building nightly releases early and often, with rigorous unit and integration testing. This paper describes the CI incorporation program for the ATLAS software infrastructure. It brings modern open source tools such as Jenkins and GitLab into the ATLAS Nightly System, rationalizes hardware resource allocation and administrative operations, and gives developers improved feedback and the means to fix broken builds promptly. Once adopted, ATLAS CI practices will improve and accelerate innovation cycles and result in increased confidence in new software deployments. The paper reports the status of Jenkins integration with the ATLAS Nightly System as well as short- and long-term plans for the incorporation of CI practices.

  20. Report of AAPM Task Group 162: Software for planar image quality metrology.

    PubMed

    Samei, Ehsan; Ikejimba, Lynda C; Harrawood, Brian P; Rong, John; Cunningham, Ian A; Flynn, Michael J

    2018-02-01

    The AAPM Task Group 162 aimed to provide a standardized approach for the assessment of image quality in planar imaging systems. This report offers a description of the approach as well as the details of the resultant software bundle to measure detective quantum efficiency (DQE) as well as its basis components and derivatives. The methodology and the associated software include the characterization of the noise power spectrum (NPS) from planar images acquired under specific acquisition conditions, the modulation transfer function (MTF) using an edge test object, the DQE, and the effective DQE (eDQE). First, a methodological framework is provided to highlight the theoretical basis of the work. Then, a step-by-step guide is included to assist in proper execution of each component of the code. Lastly, an evaluation of the method is included to validate its accuracy against model-based and experimental data. The code was built on the Macintosh OS X operating system. The software package contains all the source codes to permit an experienced user to build the suite on a Linux or other *nix type system. The package further includes manuals and sample images and scripts to demonstrate use of the software for new users. The results of the code are in close alignment with theoretical expectations and published results of experimental data. The methodology and the software package offered in AAPM TG162 can be used as a baseline for characterization of the inherent image quality attributes of planar imaging systems. © 2017 American Association of Physicists in Medicine.
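
    For reference, the DQE that such software measures is conventionally assembled from the measured MTF and the normalized noise power spectrum (NNPS); a standard textbook form (TG162's exact conventions are defined in the report itself) is

    \[ \mathrm{DQE}(f) = \frac{\mathrm{MTF}^{2}(f)}{\bar{q}\,\mathrm{NNPS}(f)}, \]

    where \(f\) is spatial frequency and \(\bar{q}\) is the incident photon fluence per unit area.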

  1. 49 CFR 178.502 - Identification codes for packagings.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ...” means a jerrican. (iv) “4” means a box. (v) “5” means a bag. (vi) “6” means a composite packaging. (vii... natural wood. (iv) “D” means plywood. (v) “F” means reconstituted wood. (vi) “G” means fiberboard. (vii... (other than steel or aluminum). (xi) “P” means glass, porcelain or stoneware. (3) A numeral indicating...

  2. 49 CFR 178.502 - Identification codes for packagings.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ...” means a jerrican. (iv) “4” means a box. (v) “5” means a bag. (vi) “6” means a composite packaging. (vii... natural wood. (iv) “D” means plywood. (v) “F” means reconstituted wood. (vi) “G” means fiberboard. (vii... (other than steel or aluminum). (xi) “P” means glass, porcelain or stoneware. (3) A numeral indicating...

  3. 49 CFR 178.502 - Identification codes for packagings.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ...” means a jerrican. (iv) “4” means a box. (v) “5” means a bag. (vi) “6” means a composite packaging. (vii... natural wood. (iv) “D” means plywood. (v) “F” means reconstituted wood. (vi) “G” means fiberboard. (vii... (other than steel or aluminum). (xi) “P” means glass, porcelain or stoneware. (3) A numeral indicating...

  4. 49 CFR 178.502 - Identification codes for packagings.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ...” means a jerrican. (iv) “4” means a box. (v) “5” means a bag. (vi) “6” means a composite packaging. (vii... natural wood. (iv) “D” means plywood. (v) “F” means reconstituted wood. (vi) “G” means fiberboard. (vii... (other than steel or aluminum). (xi) “P” means glass, porcelain or stoneware. (3) A numeral indicating...

  5. 49 CFR 173.240 - Bulk packaging for certain low hazard solid materials.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... this subchapter and the special provisions specified in column 7 of the § 172.101 table. (a) Rail cars... the IBC packaging code specified for the specific hazardous material in Column (7) of the § 172.101... subchapter at the Packing Group performance level as specified in Column (5) of the § 172.101 Table of this...

  6. 49 CFR 173.240 - Bulk packaging for certain low hazard solid materials.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... this subchapter and the special provisions specified in column 7 of the § 172.101 table. (a) Rail cars... the IBC packaging code specified for the specific hazardous material in Column (7) of the § 172.101... subchapter at the Packing Group performance level as specified in Column (5) of the § 172.101 Table of this...

  7. 49 CFR 173.240 - Bulk packaging for certain low hazard solid materials.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... this subchapter and the special provisions specified in column 7 of the § 172.101 table. (a) Rail cars... the IBC packaging code specified for the specific hazardous material in Column (7) of the § 172.101... subchapter at the Packing Group performance level as specified in Column (5) of the § 172.101 Table of this...

  8. 49 CFR 173.240 - Bulk packaging for certain low hazard solid materials.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... this subchapter and the special provisions specified in column 7 of the § 172.101 table. (a) Rail cars... the IBC packaging code specified for the specific hazardous material in Column (7) of the § 172.101... subchapter at the Packing Group performance level as specified in Column (5) of the § 172.101 Table of this...

  9. 49 CFR 173.240 - Bulk packaging for certain low hazard solid materials.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... this subchapter and the special provisions specified in column 7 of the § 172.101 table. (a) Rail cars... the IBC packaging code specified for the specific hazardous material in Column (7) of the § 172.101... subchapter at the Packing Group performance level as specified in Column (5) of the § 172.101 Table of this...

  10. CoFFEE: Corrections For Formation Energy and Eigenvalues for charged defect simulations

    NASA Astrophysics Data System (ADS)

    Naik, Mit H.; Jain, Manish

    2018-05-01

    Charged point defects in materials are widely studied using Density Functional Theory (DFT) packages with periodic boundary conditions. The formation energy and defect level computed from these simulations need to be corrected to remove the contributions from the spurious long-range interaction between the defect and its periodic images. To this end, the CoFFEE code implements the Freysoldt-Neugebauer-Van de Walle (FNV) correction scheme. The corrections can be applied to charged defects in a complete range of material shapes and sizes: bulk, slab (or two-dimensional), wires, and nanoribbons. The code is written in Python and features MPI parallelization and optimizations using the Cython package for the slow steps.
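
    For context, the quantity being corrected is the standard charged-defect formation energy; in the usual notation (not spelled out in this record),

    \[ E_f[X^q] = E_{\mathrm{tot}}[X^q] - E_{\mathrm{tot}}[\mathrm{bulk}] - \sum_i n_i \mu_i + q\,(E_{\mathrm{VBM}} + E_F) + E_{\mathrm{corr}}, \]

    where \(n_i\) and \(\mu_i\) are the number and chemical potential of atoms added or removed, \(E_F\) is the Fermi level referenced to the valence band maximum \(E_{\mathrm{VBM}}\), and \(E_{\mathrm{corr}}\) is the finite-size correction term that schemes such as FNV supply.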

  11. Validation of the new code package APOLLO2.8 for accurate PWR neutronics calculations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Santamarina, A.; Bernard, D.; Blaise, P.

    2013-07-01

    This paper summarizes the qualification work performed to demonstrate the accuracy of the new APOLLO2.8/SHEM-MOC package, based on the JEFF3.1.1 nuclear data file, for the prediction of PWR neutronics parameters. This experimental validation is based on PWR mock-up critical experiments performed in the EOLE/MINERVE zero-power reactors and on post-irradiation examinations (P.I.E.s) of spent fuel assemblies from the French PWRs. The Calculation-Experiment comparison for the main design parameters is presented: reactivity of UOX and MOX lattices, depletion calculation and fuel inventory, reactivity loss with burnup, pin-by-pin power maps, Doppler coefficient, Moderator Temperature Coefficient, Void coefficient, UO₂-Gd₂O₃ poisoning worth, efficiency of Ag-In-Cd and B₄C control rods, and reflector saving for both the standard 2-cm baffle and the GEN3 advanced thick SS reflector. From this qualification process, calculation biases and associated uncertainties are derived. The code package APOLLO2.8 is already implemented in the ARCADIA new AREVA calculation chain for core physics and is currently under implementation in the future neutronics package of the French utility Electricite de France. (authors)

  12. The U. S. Department of Energy SARP review training program

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mauck, C.J.

    1988-01-01

    In support of its radioactive material packaging certification program, the U.S. Department of Energy (DOE) has established a special training workshop. The purpose of the two-week workshop is to develop skills in reviewing Safety Analysis Reports for Packagings (SARPs) and performing confirmatory analyses. The workshop, conducted by the Lawrence Livermore National Laboratory (LLNL) for DOE, is divided into two parts: methods of review and methods of analysis. The sessions covering methods of review are based on the DOE document, ''Packaging Review Guide for Reviewing Safety Analysis Reports for Packagings'' (PRG). The sessions cover relevant DOE Orders and all areas of review in the applicable Nuclear Regulatory Commission (NRC) Regulatory Guides. The technical areas addressed include structural and thermal behavior, materials, shielding, criticality, and containment. The course sessions on methods of analysis provide hands-on experience in the use of calculational methods and codes for reviewing SARPs. Analytical techniques and computer codes are discussed and sample problems are worked. Homework is assigned each night and over the included weekend; at the conclusion, a comprehensive take-home examination is given requiring six to ten hours to complete.

  13. Basic mathematical function libraries for scientific computation

    NASA Technical Reports Server (NTRS)

    Galant, David C.

    1989-01-01

    Ada packages implementing selected mathematical functions for the support of scientific and engineering applications were written. The packages provide the Ada programmer with the mathematical function support found in the languages Pascal and FORTRAN as well as an extended precision arithmetic and a complete complex arithmetic. The algorithms used are fully described and analyzed. Implementation assumes that the Ada type FLOAT objects fully conform to the IEEE 754-1985 standard for single binary floating-point arithmetic, and that INTEGER objects are 32-bit entities. Codes for the Ada packages are included as appendixes.

  14. Automated UMLS-Based Comparison of Medical Forms

    PubMed Central

    Dugas, Martin; Fritz, Fleur; Krumm, Rainer; Breil, Bernhard

    2013-01-01

    Medical forms are very heterogeneous: on a European scale there are thousands of data items in several hundred different systems. To enable data exchange for clinical care and research purposes there is a need to develop interoperable documentation systems with harmonized forms for data capture. A prerequisite in this harmonization process is comparison of forms. So far – to our knowledge – an automated method for comparison of medical forms is not available. A form contains a list of data items with corresponding medical concepts. An automatic comparison needs data types, item names, and especially items with unique concept codes from medical terminologies. The scope of the proposed method is a comparison of these items by comparing their concept codes (coded in UMLS). Each data item is represented by item name, concept code, and value domain. Two items are called identical if item name, concept code, and value domain are the same. Two items are called matching if only concept code and value domain are the same. Two items are called similar if their concept codes are the same but the value domains are different. Based on these definitions, an open-source implementation for automated comparison of medical forms in ODM format with UMLS-based semantic annotations was developed. It is available as package compareODM from http://cran.r-project.org. To evaluate this method, it was applied to a set of 7 real medical forms with 285 data items from a large public ODM repository with forms for different medical purposes (research, quality management, routine care). Comparison results were visualized with grid images and dendrograms. Automated comparison of semantically annotated medical forms is feasible. Dendrograms allow a view on clustered similar forms. The approach is scalable for a large set of real medical forms. PMID:23861827
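
    The identical/matching/similar rules above translate directly into code. The following is a minimal Python sketch of the classification logic (the published implementation is the R package compareODM; the item names, concept code, and value domains below are illustrative):

```python
# Sketch of the item-comparison rules described above (identical / matching /
# similar), in Python rather than the paper's R package compareODM.
from dataclasses import dataclass

@dataclass
class Item:
    name: str          # item name as shown on the form
    concept: str       # UMLS concept code (semantic annotation)
    value_domain: str  # e.g. "integer", "float", or a codelist identifier

def compare(a: Item, b: Item) -> str:
    if a.concept != b.concept:
        return "different"
    if a.value_domain != b.value_domain:
        return "similar"    # same concept, different value domains
    if a.name == b.name:
        return "identical"  # name, concept, and value domain all equal
    return "matching"       # concept and value domain equal, names differ

weight = Item("Body weight", "C0005910", "float")
gewicht = Item("Gewicht", "C0005910", "float")
print(compare(weight, gewicht))  # -> "matching"
```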

  15. PelePhysics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    2017-05-17

    PelePhysics is a suite of physics packages that provides functionality of use to reacting hydrodynamics CFD codes. The initial release includes an interface to reaction rate mechanism evaluation, transport coefficient evaluation, and a generalized equation of state (EOS) facility. Both generic evaluators and interfaces to code from externally available tools (Fuego for chemical rates, EGLib for transport coefficients) are provided.

  16. SolTrace | Concentrating Solar Power | NREL

    Science.gov Websites

    SolTrace is available as an NREL packaged distribution or from source code at the SolTrace open source project website. The code uses a Monte Carlo ray-tracing methodology. With the release of the SolTrace open source project, the software has adopted an open development model.

  17. PlasmaPy: initial development of a Python package for plasma physics

    NASA Astrophysics Data System (ADS)

    Murphy, Nicholas; Leonard, Andrew J.; Stańczak, Dominik; Haggerty, Colby C.; Parashar, Tulasi N.; Huang, Yu-Min; PlasmaPy Community

    2017-10-01

    We report on initial development of PlasmaPy: an open source community-driven Python package for plasma physics. PlasmaPy seeks to provide core functionality that is needed for the formation of a fully open source Python ecosystem for plasma physics. PlasmaPy prioritizes code readability, consistency, and maintainability while using best practices for scientific computing such as version control, continuous integration testing, embedding documentation in code, and code review. We discuss our current and planned capabilities, including features presently under development. The development roadmap includes features such as fluid and particle simulation capabilities, a Grad-Shafranov solver, a dispersion relation solver, atomic data retrieval methods, and tools to analyze simulations and experiments. We describe several ways to contribute to PlasmaPy. PlasmaPy has a code of conduct and is being developed under a BSD license, with a version 0.1 release planned for 2018. The success of PlasmaPy depends on active community involvement, so anyone interested in contributing to this project should contact the authors. This work was partially supported by the U.S. Department of Energy.

  18. qtcm 0.1.2: A Python Implementation of the Neelin-Zeng Quasi-Equilibrium Tropical Circulation model

    NASA Astrophysics Data System (ADS)

    Lin, J. W.-B.

    2008-10-01

    Historically, climate models have been developed incrementally and in compiled languages like Fortran. While the use of legacy compiled languages results in fast, time-tested code, the resulting model is limited in its modularity and cannot take advantage of functionality available with modern computer languages. Here we describe an effort at using the open-source, object-oriented language Python to create more flexible climate models: the package qtcm, a Python implementation of the intermediate-level Neelin-Zeng Quasi-Equilibrium Tropical Circulation model (QTCM1) of the atmosphere. The qtcm package retains the core numerics of QTCM1, written in Fortran to optimize model performance, but uses Python structures and utilities to wrap the QTCM1 Fortran routines and manage model execution. The resulting "mixed language" modeling package allows order and choice of subroutine execution to be altered at run time, and model analysis and visualization to be integrated interactively with model execution at run time. This flexibility facilitates more complex scientific analysis using less complex code than would be possible using traditional languages alone, and provides tools to transform the traditional "formulate hypothesis → write and test code → run model → analyze results" sequence into a feedback loop that can be executed automatically by the computer.
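
    The run-time flexibility described above is the package's central design point. A hypothetical usage sketch follows; the keyword, attribute, and method names (`compiled_form`, `init_model_input`, `run_session`) are assumptions based on this description, not verified against the released package:

```python
# Hypothetical sketch of a qtcm-style run; names are assumptions based on
# the paper's description, not the package's verified interface.
from qtcm import Qtcm

model = Qtcm(compiled_form="parts")   # Fortran numerics wrapped part-by-part
model.runname = "test_run"            # assumed attribute-style configuration
model.init_model_input()              # hypothetical initialization call

# Because subroutine execution is managed from Python, the call order can be
# inspected or altered at run time before the session is executed.
model.run_session()
```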

  19. qtcm 0.1.2: a Python implementation of the Neelin-Zeng Quasi-Equilibrium Tropical Circulation Model

    NASA Astrophysics Data System (ADS)

    Lin, J. W.-B.

    2009-02-01

    Historically, climate models have been developed incrementally and in compiled languages like Fortran. While the use of legacy compiled languages results in fast, time-tested code, the resulting model is limited in its modularity and cannot take advantage of functionality available with modern computer languages. Here we describe an effort at using the open-source, object-oriented language Python to create more flexible climate models: the package qtcm, a Python implementation of the intermediate-level Neelin-Zeng Quasi-Equilibrium Tropical Circulation model (QTCM1) of the atmosphere. The qtcm package retains the core numerics of QTCM1, written in Fortran to optimize model performance, but uses Python structures and utilities to wrap the QTCM1 Fortran routines and manage model execution. The resulting "mixed language" modeling package allows order and choice of subroutine execution to be altered at run time, and model analysis and visualization to be integrated interactively with model execution at run time. This flexibility facilitates more complex scientific analysis using less complex code than would be possible using traditional languages alone, and provides tools to transform the traditional "formulate hypothesis → write and test code → run model → analyze results" sequence into a feedback loop that can be executed automatically by the computer.

  20. A Python Implementation of an Intermediate-Level Tropical Circulation Model and Implications for How Modeling Science is Done

    NASA Astrophysics Data System (ADS)

    Lin, J. W. B.

    2015-12-01

    Historically, climate models have been developed incrementally and in compiled languages like Fortran. While the use of legacy compiled languages results in fast, time-tested code, the resulting model is limited in its modularity and cannot take advantage of functionality available with modern computer languages. Here we describe an effort at using the open-source, object-oriented language Python to create more flexible climate models: the package qtcm, a Python implementation of the intermediate-level Neelin-Zeng Quasi-Equilibrium Tropical Circulation model (QTCM1) of the atmosphere. The qtcm package retains the core numerics of QTCM1, written in Fortran, to optimize model performance but uses Python structures and utilities to wrap the QTCM1 Fortran routines and manage model execution. The resulting "mixed language" modeling package allows order and choice of subroutine execution to be altered at run time, and model analysis and visualization to be integrated interactively with model execution at run time. This flexibility facilitates more complex scientific analysis using less complex code than would be possible using traditional languages alone and provides tools to transform the traditional "formulate hypothesis → write and test code → run model → analyze results" sequence into a feedback loop that can be executed automatically by the computer.

  1. Validation of a Laser-Ray Package in an Eulerian Code

    NASA Astrophysics Data System (ADS)

    Bradley, Paul; Hall, Mike; McKenty, Patrick; Collins, Tim; Keller, David

    2014-10-01

    A laser-ray absorption package was recently installed in the RAGE code by the Laboratory for Laser Energetics (LLE). In this presentation, we describe our use of this package to simulate implosions of Omega 60-beam symmetric direct-drive capsules. The capsules have outer diameters of about 860 microns, CH plastic shell thicknesses between 8 and 32 microns, DD or DT gas fills between 5 and 20 atmospheres, and a 1 ns square pulse of 23 to 27 kJ. These capsule implosions were previously modeled with a calibrated energy source in the outer layer of the capsule, where we matched bang time and burn ion temperature well, but the simulated yields were two to three times higher than the data. We will run simulations with laser-ray energy deposition for these experiments and compare the results to the yield and spectroscopic data. Work performed by Los Alamos National Laboratory under Contract DE-AC52-06NA25396 for the National Nuclear Security Administration of the U.S. Department of Energy.

  2. Flight experiment of thermal energy storage

    NASA Technical Reports Server (NTRS)

    Namkoong, David

    1989-01-01

    Thermal energy storage (TES) enables a solar dynamic system to deliver constant electric power through periods of sun and shade. Brayton and Stirling power systems under current considerations for missions in the near future require working fluid temperatures in the 1100 to 1300+ K range. TES materials that meet these requirements fall into the fluoride family of salts. These salts store energy as a heat of fusion, thereby transferring heat to the fluid at constant temperature during shade. The principal feature of fluorides that must be taken into account is the change in volume that occurs with melting and freezing. Salts shrink as they solidify, a change reaching 30 percent for some salts. The location of voids that form as result of the shrinkage is critical when the solar dynamic system reemerges into the sun. Hot spots can develop in the TES container or the container can become distorted if the melting salt cannot expand elsewhere. Analysis of the transient, two-phase phenomenon is being incorporated into a three-dimensional computer code. The code is capable of analysis under microgravity as well as 1 g. The objective of the flight program is to verify the predictions of the code, particularly of the void location and its effect on containment temperature. The four experimental packages comprising the program will be the first tests of melting and freezing conducted under microgravity. Each test package will be installed in a Getaway Special container to be carried by the shuttle. The package will be self-contained and independent of shuttle operations other than the initial opening of the container lid and the final closing of the lid. Upon the return of the test package from flight, the TES container will be radiographed and finally partitioned to examine the exact location and shape of the void. Visual inspection of the void and the temperature data during flight will constitute the bases for code verification.

  3. Training and support to improve ICD coding quality: A controlled before-and-after impact evaluation.

    PubMed

    Dyers, Robin; Ward, Grant; Du Plooy, Shane; Fourie, Stephanus; Evans, Juliet; Mahomed, Hassan

    2017-05-24

    The proposed National Health Insurance policy for South Africa (SA) requires hospitals to maintain high-quality International Statistical Classification of Diseases (ICD) codes for patient records. While considerable strides had been made to improve ICD coding coverage by digitising the discharge process in the Western Cape Province, further intervention was required to improve data quality. The aim of this controlled before-and-after study was to evaluate the impact of a clinician training and support initiative on ICD coding quality. ICD coding quality was compared between two central hospitals in the Western Cape before and after the implementation of a training and support initiative for clinicians at one of the sites. The difference in differences in data quality between the intervention site and the control site was calculated. Multiple logistic regression was also used to determine the odds of data quality improvement after the intervention and to adjust for potential differences between the groups. The intervention had a positive impact of 38.0% on ICD coding completeness over and above changes that occurred at the control site. Relative to the baseline, patient records at the intervention site had an adjusted odds ratio of 6.6 (95% confidence interval 3.5 - 16.2) of having a complete set of ICD codes for an admission episode after the introduction of the training and support package. The findings on the impact on ICD coding accuracy were not significant. There is sufficient pragmatic evidence that a training and support package will have a considerable positive impact on ICD coding completeness in the SA setting.
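
    The difference-in-differences estimate used here reduces to simple arithmetic; a toy illustration in Python (completeness proportions invented to reproduce the reported 38-point effect, not taken from the study):

```python
# Toy difference-in-differences calculation. All numbers are invented for
# illustration; they are not the study's data.
pre_intervention, post_intervention = 0.40, 0.85   # completeness proportions
pre_control, post_control = 0.42, 0.49

did = (post_intervention - pre_intervention) - (post_control - pre_control)
print(f"difference in differences: {did:+.2f}")    # +0.38, i.e. 38 points
```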

  4. Parachute Dynamics Investigations Using a Sensor Package Airdropped from a Small-Scale Airplane

    NASA Technical Reports Server (NTRS)

    Dooley, Jessica; Lorenz, Ralph D.

    2005-01-01

    We explore the utility of various sensors by recovering parachute-probe dynamics information from a package released from a small-scale, remote-controlled airplane. The airdrops aid in the development of datasets for the exploration of planetary probe trajectory recovery algorithms, supplementing data collected from instrumented, full-scale tests and computer models.

  5. Effect of Various Packaging Methods on Small-Scale Hanwoo (Korean Native Cattle) during Refrigerated Storage

    PubMed Central

    Yu, Hwan Hee; Song, Myung Wook; Kim, Tae-Kyung; Choi, Yun-Sang; Cho, Gyu Yong; Lee, Na-Kyoung; Paik, Hyun-Dong

    2018-01-01

    The objective of this study was to compare the physicochemical, microbiological, and sensory characteristics of Hanwoo eye of round under various packaging methods [wrapped packaging (WP), modified atmosphere packaging (MAP), vacuum packaging (VP) with three different vacuum films, and vacuum skin packaging (VSP)] at a small scale. Packaged Hanwoo beef samples were stored in refrigerated conditions (4±1°C) for 28 days. Packaged beef was sampled on days 0, 7, 14, 21, and 28. Physicochemical [pH, surface color, thiobarbituric acid reactive substances (TBARS), and volatile basic nitrogen (VBN) values], microbiological, and sensory analyses of packaged beef samples were performed. VP and VSP samples showed low TBARS and VBN values, and their pH and surface color did not change substantially during the 28-day period. For VSP, total viable bacteria, psychrotrophic bacteria, lactic acid bacteria, and coliform counts were lower than those for the other packaging systems. Salmonella spp. and Escherichia coli O157:H7 were not detected in any packaged beef samples. A sensory analysis showed that the scores for appearance, flavor, color, and overall acceptability did not change significantly until day 7. Overall, VSP was effective in Hanwoo packaging, with significantly higher a* values, physicochemical stability, and microbial safety (p<0.05). PMID:29805283

  6. Suitability of point kernel dose calculation techniques in brachytherapy treatment planning

    PubMed Central

    Lakshminarayanan, Thilagam; Subbaiah, K. V.; Thayalan, K.; Kannan, S. E.

    2010-01-01

    A brachytherapy treatment planning system (TPS) is necessary to estimate the dose to the target volume and organs at risk (OAR). A TPS is recommended to account for the effects of the tissue, applicator, and shielding material heterogeneities that exist in applicators. However, most brachytherapy TPS software packages estimate the absorbed dose at a point, taking into account only the contributions of individual sources and the source distribution, neglecting the dose perturbations arising from the applicator design and construction. This introduces some degree of uncertainty into dose rate estimates under realistic clinical conditions. In this regard, an attempt is made to explore the suitability of point kernels for brachytherapy dose rate calculations and to develop a new interactive brachytherapy package, named BrachyTPS, to suit clinical conditions. BrachyTPS is an interactive point kernel code package developed to perform independent dose rate calculations that take into account the effect of these heterogeneities, using the two-region buildup factors proposed by Kalos. The primary aim of this study is to validate the developed point kernel code package, integrated with treatment planning computational systems, against Monte Carlo (MC) results. In the present work, three brachytherapy applicators commonly used in the treatment of uterine cervical carcinoma, namely (i) the Board of Radiation and Isotope Technology (BRIT) low dose rate (LDR) applicator, (ii) the Fletcher Green type LDR applicator, and (iii) the Fletcher Williamson high dose rate (HDR) applicator, are studied to test the accuracy of the software. Dose rates computed using the developed code are compared with the relevant results of the MC simulations. Further, attempts are also made to study the dose rate distribution around the commercially available shielded vaginal applicator set (Nucletron). The percentage deviations of BrachyTPS-computed dose rate values from the MC results are observed to be within ±5.5% for the BRIT LDR applicator, vary from 2.6 to 5.1% for the Fletcher Green type LDR applicator, and are up to −4.7% for the Fletcher-Williamson HDR applicator. The isodose distribution plots also show good agreement with previously published results. The isodose distributions around the shielded vaginal cylinder computed using the BrachyTPS code show better agreement (less than two per cent deviation) with MC results in the unshielded region than in the shielded region, where deviations of up to five per cent are observed. The present study implies that accurate and fast validation of complicated treatment planning calculations is possible with the point kernel code package. PMID:20589118
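
    For orientation, the dose rate such a point-kernel package evaluates around an isotropic point source has the generic textbook form (not necessarily BrachyTPS's exact formulation)

    \[ \dot{D}(r) = \frac{S\,\Gamma}{r^{2}}\, B(\mu r)\, e^{-\mu r}, \]

    where \(S\) is the source strength, \(\Gamma\) the dose rate constant, \(e^{-\mu r}\) the primary-photon attenuation over distance \(r\), and \(B(\mu r)\) the buildup factor that restores the contribution of scattered photons; a two-region formulation extends \(B\) to a path crossing two different media, such as source, shield, and tissue.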

  7. Technical Review Report for the Model 9978-96 Package Safety Analysis Report for Packaging (S-SARP-G-00002, Revision 1, March 2009)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    West, M

    2009-03-06

    This Technical Review Report (TRR) documents the review, performed by Lawrence Livermore National Laboratory (LLNL) Staff, at the request of the Department of Energy (DOE), on the 'Safety Analysis Report for Packaging (SARP), Model 9978 B(M)F-96', Revision 1, March 2009 (S-SARP-G-00002). The Model 9978 Package complies with 10 CFR 71, and with 'Regulations for the Safe Transport of Radioactive Material-1996 Edition (As Amended, 2000)-Safety Requirements', International Atomic Energy Agency (IAEA) Safety Standards Series No. TS-R-1. The Model 9978 Packaging is designed, analyzed, fabricated, and tested in accordance with Section III of the American Society of Mechanical Engineers Boiler and Pressure Vessel Code (ASME B&PVC). The review presented in this TRR was performed using the methods outlined in Revision 3 of the DOE's 'Packaging Review Guide (PRG) for Reviewing Safety Analysis Reports for Packages'. The format of the SARP follows that specified in Revision 2 of the Nuclear Regulatory Commission's Regulatory Guide 7.9, i.e., 'Standard Format and Content of Part 71 Applications for Approval of Packages for Radioactive Material'. Although the two documents are similar in their content, they are not identical. Formatting differences have been noted in this TRR, where appropriate. The Model 9978 Packaging is a single containment package, using a 5-inch containment vessel (5CV). It uses a nominal 35-gallon drum package design. In comparison, the Model 9977 Packaging uses a 6-inch containment vessel (6CV). The Model 9977 and Model 9978 Packagings were developed concurrently, and they were referred to as the General Purpose Fissile Material Package, Version 1 (GPFP). Both packagings use General Plastics FR-3716 polyurethane foam as insulation and as impact limiters. The 5CV is used as the Primary Containment Vessel (PCV) in the Model 9975-96 Packaging. The Model 9975-96 Packaging also has the 6CV as its Secondary Containment Vessel (SCV). In comparison, the Model 9975 Packagings use Celotex™ for insulation and as impact limiters. To provide a historical perspective, it is noted that the Model 9975-96 Packaging is a 35-gallon drum package design that has evolved from a family of packages designed by DOE contractors at the Savannah River Site. Earlier package designs, i.e., the Model 9965, the Model 9966, the Model 9967, and the Model 9968 Packagings, were originally designed and certified in the early 1980s. In the 1990s, updated package designs that incorporated design features consistent with the then-newer safety requirements were proposed. The updated package designs at the time were the Model 9972, the Model 9973, the Model 9974, and the Model 9975 Packagings, respectively. The Model 9975 Package was certified by the Packaging Certification Program, under the Office of Safety Management and Operations. The Model 9978 Package has six Content Envelopes: C.1 (²³⁸Pu Heat Sources), C.2 (Pu/U Metals), C.3 (Pu/U Oxides, Reserved), C.4 (U Metal or Alloy), C.5 (U Compounds), and C.6 (Samples and Sources). Per 10 CFR 71.59 (Code of Federal Regulations), the value of N is 50 for the Model 9978 Package, leading to a Criticality Safety Index (CSI) of 1.0. The Transport Index (TI), based on dose rate, is calculated to be a maximum of 4.1.

  8. Incorporation of coupled nonequilibrium chemistry into a two-dimensional nozzle code (SEAGULL)

    NASA Technical Reports Server (NTRS)

    Ratliff, A. W.

    1979-01-01

    A two-dimensional multiple shock nozzle code (SEAGULL) was extended to include the effects of finite rate chemistry. The basic code that treats multiple shocks and contact surfaces was fully coupled with a generalized finite rate chemistry and vibrational energy exchange package. The modified code retains all of the original SEAGULL features plus the capability to treat chemical and vibrational nonequilibrium reactions. Any chemical and/or vibrational energy exchange mechanism can be handled as long as thermodynamic data and rate constants are available for all participating species.
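
    Since the record describes a generalized finite-rate chemistry package, it is worth recalling the form such packages typically evaluate: forward rate constants in the modified Arrhenius form (a generic statement, not quoted from the report),

    \[ k_f(T) = A\,T^{n}\exp\!\left(-\frac{E_a}{R\,T}\right), \]

    with species production rates then assembled from the law of mass action, given thermodynamic data and rate constants for all participating species.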

  9. An Examination of the Reliability of the Organizational Assessment Package (OAP).

    DTIC Science & Technology

    1981-07-01

    reactivity or pretest sensitization (Bracht and Glass, 1968) may occur. In this case, the change from pretest to posttest can be caused just by the...content items. The blocks for supervisor's code were left blank, work group code was coded as all ones, and each person's seminar number was coded in... [Table fragment: reliability coefficients for OAP scales, including Work Group Effectiveness, Job Related Satisfaction, and other Job Related scales.]

  10. Child-resistant and tamper-resistant packaging: A systematic review to inform tobacco packaging regulation.

    PubMed

    Jo, Catherine L; Ambs, Anita; Dresler, Carolyn M; Backinger, Cathy L

    2017-02-01

    We aimed to investigate the effects of special packaging (child-resistant, adult-friendly) and tamper-resistant packaging on health and behavioral outcomes in order to identify research gaps and implications for packaging standards for tobacco products. We searched seven databases for keywords related to special and tamper-resistant packaging, consulted experts, and reviewed citations of potentially relevant studies. 733 unique papers were identified. Two coders independently screened each title and abstract for eligibility. They then reviewed the full text of the remaining papers for a second round of eligibility screening. Included studies investigated a causal relationship between type of packaging or packaging regulation and behavioral or health outcomes and had a study population composed of consumers. Studies were excluded on the basis of publication type, if they were not peer-reviewed, and if they had low external validity. Two reviewers independently coded each paper for study and methodological characteristics and limitations. Discrepancies were discussed and resolved. The review included eight studies: four assessing people's ability to access the contents of different packaging types and four evaluating the impact of packaging requirements on health-related outcomes. Child-resistant packaging was generally more difficult to open than non-child-resistant packaging. Child-resistant packaging requirements have been associated with reductions in child mortality. Child-resistant packaging can be expected to reduce tobacco product poisonings among children under six. Published by Elsevier Inc.

  11. Child-resistant and tamper-resistant packaging: A systematic review to inform tobacco packaging regulation

    PubMed Central

    Jo, Catherine L.; Ambs, Anita; Dresler, Carolyn M.; Backinger, Cathy L.

    2017-01-01

    Objective We aimed to investigate the effects of special packaging (child-resistant, adult-friendly) and tamper-resistant packaging on health and behavioral outcomes in order to identify research gaps and implications for packaging standards for tobacco products. Methods We searched seven databases for keywords related to special and tamper-resistant packaging, consulted experts, and reviewed citations of potentially relevant studies. 733 unique papers were identified. Two coders independently screened each title and abstract for eligibility. They then reviewed the full text of the remaining papers for a second round of eligibility screening. Included studies investigated a causal relationship between type of packaging or packaging regulation and behavioral or health outcomes and had a study population composed of consumers. Studies were excluded on the basis of publication type, if they were not peer-reviewed, and if they had low external validity. Two reviewers independently coded each paper for study and methodological characteristics and limitations. Discrepancies were discussed and resolved. Results The review included eight studies: four assessing people's ability to access the contents of different packaging types and four evaluating the impact of packaging requirements on health-related outcomes. Child-resistant packaging was generally more difficult to open than non-child-resistant packaging. Child-resistant packaging requirements have been associated with reductions in child mortality. Conclusions Child-resistant packaging can be expected to reduce tobacco product poisonings among children under six. PMID:27939602

  12. Thermal cycling test results of CSP and RF assemblies

    NASA Technical Reports Server (NTRS)

    Ghaffarian, R.; Nelson, G.; Cooper, M.; Lam, D.; Strudler, S.; Umdekar, A.; Selk, K.; Bjorndahl, B.; Duprey, R.

    2000-01-01

    A JPL-led chip scale package (CSP) consortium, composed of participating agencies and private companies, recently joined together to pool in-kind resources for establishing the quality and reliability of chip scale packages (CSPs) for a variety of projects.

  13. Impact of external influences on food packaging.

    PubMed

    Brody, A L

    1977-09-01

    Since the food supply is dependent upon an effective packaging system, threats to packaging represent implied threats to food processing and distribution. Enacted and potential legislation and regulation are retarding technological and commercial progress in food packaging and have already restricted some food packaging/processing systems. The result of these external influences is not simply the sum of the individual acts but a cascading, self-imposed arresting of food packaging/processing advancement. The technological bases for the enacted and proposed legislation and regulation are presented in the enumeration of the external influences on food packaging. Economic and sociological arguments and facts surrounding the issues are also presented. Among the external influences on food packaging detailed are indirect additives, nutritional labeling, benefit:risk, solid waste and litter, environmental pollution, the universal product code, and food industry productivity. The magnitude of the total impact of these external influences upon the food supply is so large that assertive action must be taken to channel these influences into more productive awareness. An objective and comprehensive public communications program supported by the technological community appears mandatory.

  14. Real-Time Pattern Recognition - An Industrial Example

    NASA Astrophysics Data System (ADS)

    Fitton, Gary M.

    1981-11-01

    Rapid advancements in cost-effective sensors and microcomputers are now making practical the on-line implementation of pattern recognition based systems for a variety of industrial applications requiring high processing speeds. One major application area for real-time pattern recognition is the sorting of packaged/cartoned goods at high speed for automated warehousing and return goods cataloging. While there are many OCR and bar code readers available to perform these functions, it is often impractical to use such codes (package too small, adverse esthetics, poor print quality), and an approach which recognizes an item by its graphic content alone is desirable. This paper describes a specific application within the tobacco industry, that of sorting returned cigarette goods by brand and size.

  15. Dakota Uncertainty Quantification Methods Applied to the CFD code Nek5000

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Delchini, Marc-Olivier; Popov, Emilian L.; Pointer, William David

    This report presents the state of advancement of a Nuclear Energy Advanced Modeling and Simulation (NEAMS) project to characterize the uncertainty of the computational fluid dynamics (CFD) code Nek5000 using the Dakota package for flows encountered in the nuclear engineering industry. Nek5000 is a high-order spectral element CFD code developed at Argonne National Laboratory for high-resolution spectral-filtered large eddy simulations (LESs) and unsteady Reynolds-averaged Navier-Stokes (URANS) simulations.

  16. An initial investigation into methods of computing transonic aerodynamic sensitivity coefficients

    NASA Technical Reports Server (NTRS)

    Carlson, Leland A.

    1991-01-01

    The three-dimensional quasi-analytical sensitivity analysis and the ancillary driver programs needed to carry out the studies and perform comparisons are developed. The code is essentially contained in one unified package which includes the following: (1) a three dimensional transonic wing analysis program (ZEBRA); (2) a quasi-analytical portion which determines the matrix elements in the quasi-analytical equations; (3) a method for computing the sensitivity coefficients from the resulting quasi-analytical equations; (4) a package to determine, for comparison purposes, sensitivity coefficients via the finite difference approach; and (5) a graphics package.
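
    As a hedged illustration of the finite difference approach in item (4), the short Python sketch below computes sensitivity coefficients by central differences. The lift_coefficient function is a made-up stand-in for a wing analysis code such as ZEBRA, not part of the package described above.

        def lift_coefficient(design):
            # Placeholder aerodynamic model (purely illustrative):
            # CL depends on angle of attack and thickness ratio.
            alpha, thickness = design
            return 2.0 * 3.14159 * alpha * (1.0 + 0.77 * thickness)

        def fd_sensitivities(f, design, h=1e-6):
            """Central differences dF/dx_i for each design variable x_i."""
            sens = []
            for i in range(len(design)):
                up = list(design); up[i] += h
                dn = list(design); dn[i] -= h
                sens.append((f(up) - f(dn)) / (2.0 * h))
            return sens

        print(fd_sensitivities(lift_coefficient, [0.05, 0.12]))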

  17. New algorithm for tensor contractions on multi-core CPUs, GPUs, and accelerators enables CCSD and EOM-CCSD calculations with over 1000 basis functions on a single compute node.

    PubMed

    Kaliman, Ilya A; Krylov, Anna I

    2017-04-30

    A new hardware-agnostic contraction algorithm for tensors of arbitrary symmetry and sparsity is presented. The algorithm is implemented as a stand-alone open-source code libxm. This code is also integrated with general tensor library libtensor and with the Q-Chem quantum-chemistry package. An overview of the algorithm, its implementation, and benchmarks are presented. Similarly to other tensor software, the algorithm exploits efficient matrix multiplication libraries and assumes that tensors are stored in a block-tensor form. The distinguishing features of the algorithm are: (i) efficient repackaging of the individual blocks into large matrices and back, which affords efficient graphics processing unit (GPU)-enabled calculations without modifications of higher-level codes; (ii) fully asynchronous data transfer between disk storage and fast memory. The algorithm enables canonical all-electron coupled-cluster and equation-of-motion coupled-cluster calculations with single and double substitutions (CCSD and EOM-CCSD) with over 1000 basis functions on a single quad-GPU machine. We show that the algorithm exhibits predicted theoretical scaling for canonical CCSD calculations, O(N^6), irrespective of the data size on disk. © 2017 Wiley Periodicals, Inc.
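
    A minimal NumPy sketch of the repackaging idea in point (i): a tensor contraction is folded into one large matrix-matrix multiplication so that an optimized GEMM (on CPU or GPU) does all the work. This illustrates the general technique only; it is not libxm code.

        import numpy as np

        # Contraction C_ijl = sum_k A_ijk * B_kl, done as a single GEMM.
        i, j, k, l = 8, 6, 10, 4
        A = np.random.rand(i, j, k)
        B = np.random.rand(k, l)

        A_mat = A.reshape(i * j, k)        # repackage: fold (i, j) into one axis
        C = (A_mat @ B).reshape(i, j, l)   # multiply, then unpack to tensor form

        # check against a direct einsum contraction
        assert np.allclose(C, np.einsum('ijk,kl->ijl', A, B))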

  18. Xyce parallel electronic simulator users guide, version 6.1

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Keiter, Eric R; Mei, Ting; Russo, Thomas V.

    This manual describes the use of the Xyce Parallel Electronic Simulator. Xyce has been designed as a SPICE-compatible, high-performance analog circuit simulator, and has been written to support the simulation needs of the Sandia National Laboratories electrical designers. This development has focused on improving capability over the current state-of-the-art in the following areas: capability to solve extremely large circuit problems by supporting large-scale parallel computing platforms (up to thousands of processors), including support for most popular parallel and serial computers; a differential-algebraic-equation (DAE) formulation, which better isolates the device model package from solver algorithms and allows one to develop new types of analysis without requiring the implementation of analysis-specific device models; device models that are specifically tailored to meet Sandia's needs, including some radiation-aware devices (for Sandia users only); and object-oriented code design and implementation using modern coding practices. Xyce is a parallel code in the most general sense of the phrase (a message-passing parallel implementation), which allows it to run efficiently on a wide range of computing platforms, including serial, shared-memory, and distributed-memory parallel platforms. Attention has been paid to the specific nature of circuit-simulation problems to ensure that optimal parallel efficiency is achieved as the number of processors grows.

  19. Xyce parallel electronic simulator users' guide, Version 6.0.1.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Keiter, Eric R; Mei, Ting; Russo, Thomas V.

    This manual describes the use of the Xyce Parallel Electronic Simulator. Xyce has been designed as a SPICE-compatible, high-performance analog circuit simulator, and has been written to support the simulation needs of the Sandia National Laboratories electrical designers. This development has focused on improving capability over the current state-of-the-art in the following areas: capability to solve extremely large circuit problems by supporting large-scale parallel computing platforms (up to thousands of processors), including support for most popular parallel and serial computers; a differential-algebraic-equation (DAE) formulation, which better isolates the device model package from solver algorithms and allows one to develop new types of analysis without requiring the implementation of analysis-specific device models; device models that are specifically tailored to meet Sandia's needs, including some radiation-aware devices (for Sandia users only); and object-oriented code design and implementation using modern coding practices. Xyce is a parallel code in the most general sense of the phrase (a message-passing parallel implementation), which allows it to run efficiently on a wide range of computing platforms, including serial, shared-memory, and distributed-memory parallel platforms. Attention has been paid to the specific nature of circuit-simulation problems to ensure that optimal parallel efficiency is achieved as the number of processors grows.

  20. Xyce parallel electronic simulator users guide, version 6.0.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Keiter, Eric R; Mei, Ting; Russo, Thomas V.

    This manual describes the use of the Xyce Parallel Electronic Simulator. Xyce has been designed as a SPICE-compatible, high-performance analog circuit simulator, and has been written to support the simulation needs of the Sandia National Laboratories electrical designers. This development has focused on improving capability over the current state-of-the-art in the following areas: capability to solve extremely large circuit problems by supporting large-scale parallel computing platforms (up to thousands of processors), including support for most popular parallel and serial computers; a differential-algebraic-equation (DAE) formulation, which better isolates the device model package from solver algorithms and allows one to develop new types of analysis without requiring the implementation of analysis-specific device models; device models that are specifically tailored to meet Sandia's needs, including some radiation-aware devices (for Sandia users only); and object-oriented code design and implementation using modern coding practices. Xyce is a parallel code in the most general sense of the phrase (a message-passing parallel implementation), which allows it to run efficiently on a wide range of computing platforms, including serial, shared-memory, and distributed-memory parallel platforms. Attention has been paid to the specific nature of circuit-simulation problems to ensure that optimal parallel efficiency is achieved as the number of processors grows.

  1. SHIPMENT OF TWO DOE-STD-3013 CONTAINERS IN A 9977 TYPE B PACKAGE

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Abramczyk, G.; Bellamy, S.; Loftin, B.

    2011-06-06

    The 9977 is a certified Type B Packaging authorized to ship uranium and plutonium in metal and oxide forms. Historically, the standard container for these materials has been the DOE-STD-3013, which was specifically designed for the long-term storage of plutonium-bearing materials. The Department of Energy has used the 9975 Packaging containing a single 3013 container for the transportation and storage of these materials. In order to reduce container, shipping, and storage costs, the 9977 Packaging is being certified for transportation and storage of two 3013 containers. The challenges and risks of this content and the 9977's ability to meet the Code of Federal Regulations for the transport of these materials are presented.

  2. A domain specific language for performance portable molecular dynamics algorithms

    NASA Astrophysics Data System (ADS)

    Saunders, William Robert; Grant, James; Müller, Eike Hermann

    2018-03-01

    Developers of Molecular Dynamics (MD) codes face significant challenges when adapting existing simulation packages to new hardware. In a continuously diversifying hardware landscape it becomes increasingly difficult for scientists to be experts both in their own domain (physics/chemistry/biology) and specialists in the low level parallelisation and optimisation of their codes. To address this challenge, we describe a "Separation of Concerns" approach for the development of parallel and optimised MD codes: the science specialist writes code at a high abstraction level in a domain specific language (DSL), which is then translated into efficient computer code by a scientific programmer. In a related context, an abstraction for the solution of partial differential equations with grid based methods has recently been implemented in the (Py)OP2 library. Inspired by this approach, we develop a Python code generation system for molecular dynamics simulations on different parallel architectures, including massively parallel distributed memory systems and GPUs. We demonstrate the efficiency of the auto-generated code by studying its performance and scalability on different hardware and compare it to other state-of-the-art simulation packages. With growing data volumes the extraction of physically meaningful information from the simulation becomes increasingly challenging and requires equally efficient implementations. A particular advantage of our approach is the easy expression of such analysis algorithms. We consider two popular methods for deducing the crystalline structure of a material from the local environment of each atom, show how they can be expressed in our abstraction and implement them in the code generation framework.
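
    The sketch below illustrates the separation-of-concerns idea in plain Python: the domain scientist supplies only a per-pair kernel, and a looping framework applies it over all particle pairs (here a naive O(N^2) loop standing in for the generated, optimised parallel code). The kernel name and framework are illustrative, not part of the package described above.

        import numpy as np

        def lj_kernel(r2, eps=1.0, sigma=1.0):
            """Lennard-Jones pair energy given the squared separation r2."""
            s6 = (sigma * sigma / r2) ** 3
            return 4.0 * eps * (s6 * s6 - s6)

        def pairwise_energy(positions, kernel):
            # the "framework": visit each unordered pair exactly once
            n, total = len(positions), 0.0
            for a in range(n):
                for b in range(a + 1, n):
                    d = positions[a] - positions[b]
                    total += kernel(float(np.dot(d, d)))
            return total

        pos = np.random.rand(16, 3) * 5.0
        print(pairwise_energy(pos, lj_kernel))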

  3. MODEL 9977 B(M)F-96 SAFETY ANALYSIS REPORT FOR PACKAGING

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Abramczyk, G.; Blanton, P.; Eberl, K.

    2006-05-18

    This Safety Analysis Report for Packaging (SARP) documents the analysis and testing performed on and for the 9977 Shipping Package, referred to as the General Purpose Fissile Package (GPFP). The performance evaluation presented in this SARP documents the compliance of the 9977 package with the regulatory safety requirements for Type B packages. Per 10 CFR 71.59, for the 9977 packages evaluated in this SARP, the value of ''N'' is 50, and the Transport Index based on nuclear criticality control is 1.0. The 9977 package is designed with a high degree of single containment. The 9977 complies with 10 CFR 71 (2002), Department of Energy (DOE) Order 460.1B, DOE Order 460.2, and 10 CFR 20 (2003) for As Low As Reasonably Achievable (ALARA) principles. The 9977 also satisfies the requirements of the Regulations for the Safe Transport of Radioactive Material--1996 Edition (Revised)--Requirements, IAEA Safety Standards Safety Series No. TS-R-1 (ST-1, Rev.), International Atomic Energy Agency, Vienna, Austria (2000). The 9977 package is designed, analyzed and fabricated in accordance with Section III of the American Society of Mechanical Engineers (ASME) Boiler and Pressure Vessel (B&PV) Code, 1992 edition.

  4. Performance oriented packaging testing of nine Mk 3 Mod 0 signal containers in PPP-B-621 wood box for packing group II solid hazardous materials. Final report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Libbert, K.J.

    1992-10-01

    A PPP-B-621 wood box containing nine Mk 3 Mod 0 Signal containers was tested for conformance to Performance Oriented Packaging criteria established by Code of Federal Regulations Title 49 CFR. The container was tested with a gross weight of 123.3 pounds (56 kilograms) and met all requirements.

  5. Smurf2 Regulates DNA Repair and Packaging to Prevent Tumors | Center for Cancer Research

    Cancer.gov

    The blueprint for all of a cell’s functions is written in the genetic code of DNA sequences as well as in the landscape of DNA and histone modifications. DNA is wrapped around histones to package it into chromatin, which is stored in the nucleus. It is important to maintain the integrity of the chromatin structure to ensure that the cell continues to behave appropriately.

  6. CPMC-Lab: A MATLAB package for Constrained Path Monte Carlo calculations

    NASA Astrophysics Data System (ADS)

    Nguyen, Huy; Shi, Hao; Xu, Jie; Zhang, Shiwei

    2014-12-01

    We describe CPMC-Lab, a MATLAB program for the constrained-path and phaseless auxiliary-field Monte Carlo methods. These methods have allowed applications ranging from the study of strongly correlated models, such as the Hubbard model, to ab initio calculations in molecules and solids. The present package implements the full ground-state constrained-path Monte Carlo (CPMC) method in MATLAB with a graphical interface, using the Hubbard model as an example. The package can perform calculations in finite supercells in any dimension, under periodic or twist boundary conditions. Importance sampling and all other algorithmic details of a total energy calculation are included and illustrated. This open-source tool allows users to experiment with various model and run parameters and visualize the results. It provides a direct and interactive environment to learn the method and study the code with minimal overhead for setup. Furthermore, the package can be easily generalized for auxiliary-field quantum Monte Carlo (AFQMC) calculations in many other models for correlated electron systems, and can serve as a template for developing a production code for AFQMC total energy calculations in real materials. Several illustrative studies are carried out in one- and two-dimensional lattices on total energy, kinetic energy, potential energy, and charge- and spin-gaps.
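
    As a small companion example (written in Python rather than MATLAB, and not CPMC-Lab code), the sketch below builds the one-body hopping matrix of the 1-D Hubbard model that such calculations start from, and fills its lowest eigenvalues to obtain the non-interacting (U = 0) ground-state energy as a sanity check.

        import numpy as np

        L, t, n_up = 8, 1.0, 4            # sites, hopping amplitude, up-electrons

        K = np.zeros((L, L))
        for s in range(L):                # nearest-neighbour hopping, periodic ring
            K[s, (s + 1) % L] = -t
            K[(s + 1) % L, s] = -t

        eps = np.linalg.eigvalsh(K)
        E0 = 2.0 * np.sum(eps[:n_up])     # two spin species at equal filling
        print("U=0 ground-state energy:", E0)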

  7. Status report on the 'Merging' of the Electron-Cloud Code POSINST with the 3-D Accelerator PIC CODE WARP

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Vay, J.-L.; Furman, M.A.; Azevedo, A.W.

    2004-04-19

    We have integrated the electron-cloud code POSINST [1] with WARP [2]--a 3-D parallel Particle-In-Cell accelerator code developed for Heavy Ion Inertial Fusion--so that the two can interoperate. Both codes are run in the same process, communicate through a Python interpreter (already used in WARP), and share certain key arrays (so far, particle positions and velocities). Currently, POSINST provides primary and secondary sources of electrons, beam bunch kicks, a particle mover, and diagnostics. WARP provides the field solvers and diagnostics. Secondary emission routines are provided by the Tech-X package CMEE.

  8. Green Packaging Management of Logistics Enterprises

    NASA Astrophysics Data System (ADS)

    Zhang, Guirong; Zhao, Zongjian

    Starting from the connotation of green logistics management, we discuss the principles of green packaging and, at the two levels of government and enterprises, put forward specific management strategies. The management of green packaging can be promoted directly and indirectly by laws, regulations, taxation, and institutional and other measures. The government can also promote new investment in the development of green packaging materials and establish specialized institutions to identify new packaging materials; standardization of packaging must also be accomplished through the power of the government. Large-scale enterprises can reduce the use of packaging materials through standardized packaging and containerization, and can develop and use green, easily recyclable packaging materials.

  9. Advanced simulation of mixed-material erosion/evolution and application to low and high-Z containing plasma facing components

    NASA Astrophysics Data System (ADS)

    Brooks, J. N.; Hassanein, A.; Sizyuk, T.

    2013-07-01

    Plasma interactions with mixed-material surfaces are being analyzed using advanced modeling of time-dependent surface evolution/erosion. Simulations use the REDEP/WBC erosion/redeposition code package coupled to the HEIGHTS package ITMC-DYN mixed-material formation/response code, with plasma parameter input from codes and data. We report here on analysis for a DIII-D Mo/C containing tokamak divertor. A DIII-D/DiMES probe experiment simulation predicts that sputtered molybdenum from a 1 cm diameter central spot quickly saturates (~4 s) in the 5 cm diameter surrounding carbon probe surface, with subsequent re-sputtering and transport to off-probe divertor regions, and with high (~50%) redeposition on the Mo spot. Predicted Mo content in the carbon agrees well with post-exposure probe data. We discuss implications and mixed-material analysis issues for Be/W mixing at the ITER outer divertor, and Li, C, Mo mixing at an NSTX divertor.

  10. Modelling of an Orthovoltage X-ray Therapy Unit with the EGSnrc Monte Carlo Package

    NASA Astrophysics Data System (ADS)

    Knöös, Tommy; Rosenschöld, Per Munck Af; Wieslander, Elinore

    2007-06-01

    Simulations with the EGSnrc code package of an orthovoltage x-ray machine have been performed. The BEAMnrc code was used to transport electrons, produce x-ray photons in the target, and transport these through the treatment machine down to the exit level of the applicator. Further transport in water or CT-based phantoms was facilitated by the DOSXYZnrc code. Phase space files were scored with BEAMnrc and analysed regarding the energy spectra at the end of the applicator. Tuning of simulation parameters was based on the half-value layer quantity for the beams in either Al or Cu. Calculated depth dose and profile curves have been compared against measurements and show good agreement except at shallow depths. The MC model tested in this study can be used for various dosimetric studies as well as for generating a library of typical treatment cases that can serve as both educational material and guidance in clinical practice.
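
    For readers unfamiliar with the half-value layer (HVL) quantity used for the tuning above, the minimal example below shows the underlying relation for an idealised monoenergetic beam, I(x) = I0*exp(-mu*x), giving HVL = ln(2)/mu. Real orthovoltage beams are polyenergetic, and the attenuation coefficient used here is illustrative only.

        import math

        mu_al = 0.45                      # 1/mm, illustrative value for Al
        hvl = math.log(2.0) / mu_al
        print("HVL = %.2f mm Al" % hvl)

        # check: transmission through one HVL is 50%
        print(math.exp(-mu_al * hvl))     # -> 0.5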

  11. CFD analyses for advanced pump design

    NASA Technical Reports Server (NTRS)

    Dejong, F. J.; Choi, S.-K.; Govindan, T. R.

    1994-01-01

    As one of the activities of the NASA/MSFC Pump Stage Technology Team, the present effort was focused on using CFD in the design and analysis of high performance rocket engine pumps. Under this effort, a three-dimensional Navier-Stokes code was used for various inducer and impeller flow field calculations. An existing algebraic grid generation procedure was extended to allow for nonzero blade thickness, splitter blades, and hub/shroud cavities upstream or downstream of the (main) blades. This resulted in a fast, robust inducer/impeller geometry/grid generation package. Problems associated with running a compressible flow code to simulate an incompressible flow were resolved; related aspects of the numerical algorithm (viz., the matrix preconditioning, the artificial dissipation, and the treatment of low Mach number flows) were addressed. As shown by the calculations performed under the present effort, the resulting code, in conjunction with the grid generation package, is an effective tool for the rapid solution of three-dimensional viscous inducer and impeller flows.

  12. LARC: computer codes for Lagrangian analysis of stress-gauge data to obtain decomposition rates through correlation to thermodynamic variables

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Anderson, A.B.; Wackerle, J.

    1983-07-01

    This report describes a package of five computer codes for analyzing stress-gauge data from shock-wave experiments on reactive materials. The aim of the analysis is to obtain rate laws from experiment. A Lagrangian analysis of the stress records, performed by program LANAL, provides flow histories of particle velocity, density, and energy. Three postprocessing programs, LOOKIT, LOOK1, and LOOK2, are included in the package of codes for producing graphical output of the results of LANAL. Program RATE uses the flow histories in conjunction with an equation of state to calculate reaction-rate histories. RATE can be programmed to examine correlations between the rate histories and thermodynamic variables. Observed correlations can be incorporated into an appropriately parameterized rate law. Program RATE determines the values of these parameters that best reproduce the observed rate histories. The procedure is illustrated with a sample problem.

  13. A LAMMPS implementation of volume-temperature replica exchange molecular dynamics

    NASA Astrophysics Data System (ADS)

    Liu, Liang-Chun; Kuo, Jer-Lai

    2015-04-01

    A driver module for executing volume-temperature replica exchange molecular dynamics (VTREMD) was developed for the LAMMPS package. As a patch code, the VTREMD module performs classical molecular dynamics (MD) with Monte Carlo (MC) decisions between MD runs. The goal of inserting the MC step was to increase the breadth of sampled configurational space. In this method, states receive better sampling by making temperature or density swaps with their neighboring states. As an accelerated sampling method, VTREMD is particularly useful to explore states at low temperatures, where systems are easily trapped in local potential wells. As functional examples, TIP4P/Ew and TIP4P/2005 water models were analyzed using VTREMD. The phase diagram in this study covered the deeply supercooled regime, and this test served as a suitable demonstration of the usefulness of VTREMD in overcoming the slow dynamics problem. To facilitate using the current code, attention was also paid to optimizing the exchange efficiency by using grid allocation. VTREMD was useful for studying systems with rough energy landscapes, such as those with numerous local minima or multiple characteristic time scales.
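
    A generic sketch of the Monte Carlo decision inserted between MD runs (not the LAMMPS patch itself): for the temperature-swap half of such a scheme, neighbouring replicas i and j exchange with the standard Metropolis probability min(1, exp[(beta_i - beta_j)(E_i - E_j)]).

        import math, random

        def attempt_swap(beta_i, E_i, beta_j, E_j):
            """Accept or reject a temperature swap between two replicas."""
            delta = (beta_i - beta_j) * (E_i - E_j)
            return delta >= 0.0 or random.random() < math.exp(delta)

        kB = 0.0019872                    # kcal/(mol K), classical force-field units
        beta = lambda T: 1.0 / (kB * T)

        # replica at 280 K with lower energy vs. replica at 300 K
        print(attempt_swap(beta(280.0), -110.0, beta(300.0), -100.0))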

  14. Porting ONETEP to graphical processing unit-based coprocessors. 1. FFT box operations.

    PubMed

    Wilkinson, Karl; Skylaris, Chris-Kriton

    2013-10-30

    We present the first graphical processing unit (GPU) coprocessor-enabled version of the Order-N Electronic Total Energy Package (ONETEP) code for linear-scaling first principles quantum mechanical calculations on materials. This work focuses on porting to the GPU the parts of the code that involve atom-localized fast Fourier transform (FFT) operations. These are among the most computationally intensive parts of the code and are used in core algorithms such as the calculation of the charge density, the local potential integrals, the kinetic energy integrals, and the nonorthogonal generalized Wannier function gradient. We have found that direct porting of the isolated FFT operations did not provide any benefit. Instead, it was necessary to tailor the port to each of the aforementioned algorithms to optimize data transfer to and from the GPU. A detailed discussion of the methods used and tests of the resulting performance are presented, which show that individual steps in the relevant algorithms are accelerated by a significant amount. However, the transfer of data between the GPU and host machine is a significant bottleneck in the reported version of the code. In addition, an initial investigation into a dynamic precision scheme for the ONETEP energy calculation has been performed to take advantage of the enhanced single precision capabilities of GPUs. The methods used here result in no disruption to the existing code base. Furthermore, as the developments reported here concern the core algorithms, they will benefit the full range of ONETEP functionality. Our use of a directive-based programming model ensures portability to other forms of coprocessors and will allow this work to form the basis of future developments to the code designed to support emerging high-performance computing platforms. Copyright © 2013 Wiley Periodicals, Inc.

  15. Chip Scale Package Integrity Assessment by Isothermal Aging

    NASA Technical Reports Server (NTRS)

    Ghaffarian, Reza

    1998-01-01

    Many aspects of chip scale package (CSP) technology, with a focus on assembly reliability characteristics, are being investigated by the JPL-led consortia. Three types of test vehicles were considered for evaluation, and currently two configurations have been built to optimize attachment processes. These test vehicles use numerous package types. To understand potential failure mechanisms of the packages, particularly solder ball attachment, the grid CSPs were subjected to environmental exposure. Package I/Os ranged from 40 to nearly 300. This paper presents shear test results and photomicrographs, both as-assembled and after up to 1,000 hours of isothermal aging, as well as tensile test results for CSPs before and after 1,500 thermal cycles in the range of -30/100 C. Results are compared to BGAs subjected to the same isothermal aging exposures.

  16. Analytical Design Package (ADP2): A computer aided engineering tool for aircraft transparency design

    NASA Technical Reports Server (NTRS)

    Wuerer, J. E.; Gran, M.; Held, T. W.

    1994-01-01

    The Analytical Design Package (ADP2) is being developed as a part of the Air Force Frameless Transparency Program (FTP). ADP2 is an integrated design tool consisting of existing analysis codes and Computer Aided Engineering (CAE) software. The objective of the ADP2 is to develop and confirm an integrated design methodology for frameless transparencies, related aircraft interfaces, and their corresponding tooling. The application of this methodology will generate high confidence for achieving a qualified part prior to mold fabrication. ADP2 is a customized integration of analysis codes, CAE software, and material databases. The primary CAE integration tool for the ADP2 is P3/PATRAN, a commercial-off-the-shelf (COTS) software tool. The open architecture of P3/PATRAN allows customized installations with different applications modules for specific site requirements. Integration of material databases allows the engineer to select a material, and those material properties are automatically called into the relevant analysis code. The ADP2 materials database will be composed of four independent schemas: CAE Design, Processing, Testing, and Logistics Support. The design of ADP2 places major emphasis on the seamless integration of CAE and analysis modules with a single intuitive graphical interface. This tool is being designed to serve and be used by an entire project team, i.e., analysts, designers, materials experts, and managers. The final version of the software will be delivered to the Air Force in Jan. 1994. The Analytical Design Package (ADP2) will then be ready for transfer to industry. The package will be capable of a wide range of design and manufacturing applications.

  17. PsiQuaSP-A library for efficient computation of symmetric open quantum systems.

    PubMed

    Gegg, Michael; Richter, Marten

    2017-11-24

    In a recent publication we showed that permutation symmetry reduces the numerical complexity of Lindblad quantum master equations for identical multi-level systems from exponential to polynomial scaling. This is important for open system dynamics including realistic system bath interactions and dephasing in, for instance, the Dicke model, multi-Λ system setups, etc. Here we present an object-oriented C++ library that allows one to set up and solve arbitrary quantum optical Lindblad master equations, especially those that are permutationally symmetric in the multi-level systems. PsiQuaSP (Permutation symmetry for identical Quantum Systems Package) uses the PETSc package for sparse linear algebra methods and differential equations as a basis. The aim of PsiQuaSP is to provide flexible, storage efficient and scalable code while being as user friendly as possible. It is easily applied to many quantum optical or quantum information systems with more than one multi-level system. We first review the basics of the permutation symmetry for multi-level systems in quantum master equations. The application of PsiQuaSP to quantum dynamical problems is illustrated with several typical, simple examples of open quantum optical systems.
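
    As a generic illustration of the kind of equation PsiQuaSP solves (this is not its C++/PETSc API), the sketch below Euler-integrates the Lindblad master equation for a single two-level emitter with decay rate gamma. PsiQuaSP's point is that for N identical emitters, permutation symmetry collapses the otherwise exponentially large density matrix to polynomial size.

        import numpy as np

        sm = np.array([[0, 1], [0, 0]], dtype=complex)         # lowering operator
        sp = sm.conj().T
        H = 0.5 * np.array([[-1, 0], [0, 1]], dtype=complex)   # omega = hbar = 1
        gamma, dt = 0.1, 0.01

        rho = np.array([[0, 0], [0, 1]], dtype=complex)        # start excited
        for _ in range(1000):                                  # evolve to t = 10
            drho = -1j * (H @ rho - rho @ H)
            drho += gamma * (sm @ rho @ sp
                             - 0.5 * (sp @ sm @ rho + rho @ sp @ sm))
            rho = rho + dt * drho

        print("excited-state population at t=10:", rho[1, 1].real)  # ~ exp(-1)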

  18. Muon simulation codes MUSIC and MUSUN for underground physics

    NASA Astrophysics Data System (ADS)

    Kudryavtsev, V. A.

    2009-03-01

    The paper describes two Monte Carlo codes dedicated to muon simulations: MUSIC (MUon SImulation Code) and MUSUN (MUon Simulations UNderground). MUSIC is a package for muon transport through matter. It is particularly useful for propagating muons through large thicknesses of rock or water, for instance from the surface down to an underground/underwater laboratory. MUSUN is designed to use the results of muon transport through rock/water to generate muons in or around an underground laboratory, taking into account their energy spectrum and angular distribution.
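
    A back-of-envelope companion to the codes described above (not MUSIC itself): average muon energy loss in rock is commonly parameterised as -dE/dx = a + b*E, which integrates to a closed form for the mean surviving energy after a slant depth X. The a and b values below are typical round numbers, used here purely for illustration.

        import math

        a = 2.0e-3     # GeV per g/cm^2 (ionisation), illustrative
        b = 4.0e-6     # 1/(g/cm^2) (radiative losses), illustrative

        def mean_energy_after(E0, X):
            """Mean muon energy (GeV) after depth X in g/cm^2 (<= 0: stopped)."""
            return (E0 + a / b) * math.exp(-b * X) - a / b

        # 1 TeV muon after 1 km water-equivalent (1e5 g/cm^2)
        print(mean_energy_after(1000.0, 1.0e5))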

  19. Burner liner thermal-structural load modeling

    NASA Technical Reports Server (NTRS)

    Maffeo, R.

    1986-01-01

    The software package Transfer Analysis Code to Interface Thermal/Structural Problems (TRANCITS) was developed. The TRANCITS code is used to interface temperature data between thermal and structural analytical models. The use of this transfer module allows the heat transfer analyst to select the thermal mesh density and thermal analysis code best suited to solve the thermal problem and gives the same freedoms to the stress analyst, without the efficiency penalties associated with common meshes and the accuracy penalties associated with the manual transfer of thermal data.

  20. BCM-2.0 - The new version of computer code "Basic Channeling with Mathematica©"

    NASA Astrophysics Data System (ADS)

    Abdrashitov, S. V.; Bogdanov, O. V.; Korotchenko, K. B.; Pivovarov, Yu. L.; Rozhkova, E. I.; Tukhfatullin, T. A.; Eikhorn, Yu. L.

    2017-07-01

    A new symbolic-numerical code devoted to the investigation of channeling phenomena in the periodic potential of a crystal has been developed. The code has been written in Wolfram Language, taking advantage of the analytical programming method. Newly developed packages were successfully applied to simulate scattering, radiation, electron-positron pair production and other effects connected with the channeling of relativistic particles in aligned crystals. The results of the simulations have been validated against data from channeling experiments carried out at SAGA LS.

  1. SEGY to ASCII: Conversion and Plotting Program

    USGS Publications Warehouse

    Goldman, Mark R.

    1999-01-01

    This report documents a computer program to convert standard 4-byte, IBM floating point SEGY files to ASCII xyz format. The program then optionally plots the seismic data using the GMT plotting package. The material for this publication is contained in a standard tar file (of99-126.tar) that is uncompressed and 726 K in size. It can be downloaded by any Unix machine. Move the tar file to the directory you wish to use it in, then type 'tar xvf of99-126.tar'. The archive files (and diskette) contain a NOTE file, a README file, a version-history file, source code, a makefile for easy compilation, and an ASCII version of the documentation. The archive files (and diskette) also contain example test files, including a typical SEGY file along with the resulting ASCII xyz and postscript files. Compiling the source code into an executable requires a C++ compiler. The program has been successfully compiled using Gnu's g++ version 2.8.1; use of other compilers may require modifications to the existing source code. The g++ compiler is a free, high quality C++ compiler and may be downloaded from the ftp site: ftp://ftp.gnu.org/gnu Plotting the seismic data requires the GMT plotting package, which may be downloaded from the web site: http://www.soest.hawaii.edu/gmt/
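
    The central step in such a conversion is decoding the big-endian 4-byte IBM System/360 floats (sign bit, 7-bit base-16 exponent biased by 64, 24-bit fraction). A minimal Python sketch of that step, independent of the program's actual C++ source, is shown below.

        import struct

        def ibm32_to_float(b):
            """Decode one big-endian IBM 360 single-precision float."""
            (u,) = struct.unpack('>I', b)
            sign = -1.0 if u >> 31 else 1.0
            exponent = (u >> 24) & 0x7f
            fraction = (u & 0x00ffffff) / float(1 << 24)
            return sign * fraction * 16.0 ** (exponent - 64)

        # 0xC276A000 encodes -118.625 in IBM format
        print(ibm32_to_float(bytes([0xC2, 0x76, 0xA0, 0x00])))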

  2. PynPoint code for exoplanet imaging

    NASA Astrophysics Data System (ADS)

    Amara, A.; Quanz, S. P.; Akeret, J.

    2015-04-01

    We announce the public release of PynPoint, a Python package that we have developed for analysing exoplanet data taken with the angular differential imaging observing technique. In particular, PynPoint is designed to model the point spread function of the central star and to subtract its flux contribution to reveal nearby faint companion planets. The current version of the package does this correction by using a principal component analysis method to build a basis set for modelling the point spread function of the observations. We demonstrate the performance of the package by reanalysing publicly available data on the exoplanet β Pictoris b, which consists of close to 24,000 individual image frames. We show that PynPoint is able to analyse this typical dataset in roughly 1.5 min on a Mac Pro when the number of images is reduced by co-adding in sets of 5. The main computational work, the calculation of the singular value decomposition, parallelises well as a result of a reliance on the SciPy and NumPy packages. For this calculation the peak memory load is 6 GB, which can be run comfortably on most workstations. A simpler calculation, co-adding over 50, takes 3 s with a peak memory usage of 600 MB; this can be performed easily on a laptop. In developing the package we have modularised the code so that we will be able to extend functionality in future releases, through the inclusion of more modules, without affecting the user's application programming interface. We distribute the PynPoint package under the GPLv3 licence through the central PyPI server, and the documentation is available online (http://pynpoint.ethz.ch).
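
    A generic NumPy sketch of the principal component analysis step described above (not PynPoint's actual interface): build an orthogonal basis for the stellar point spread function from the image stack and subtract each frame's projection onto the leading components, leaving residuals in which a faint companion can survive.

        import numpy as np

        def psf_subtract(frames, k=5):
            """frames: (n_frames, n_pixels) array of flattened images."""
            X = frames - frames.mean(axis=0)
            # principal components via SVD; rows of Vt span the PSF subspace
            _, _, Vt = np.linalg.svd(X, full_matrices=False)
            basis = Vt[:k]
            model = X @ basis.T @ basis    # projection onto the PSF subspace
            return X - model               # residual frames

        stack = np.random.rand(100, 64 * 64)   # stand-in for real ADI frames
        print(psf_subtract(stack, k=10).shape)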

  3. PHYLUCE is a software package for the analysis of conserved genomic loci.

    PubMed

    Faircloth, Brant C

    2016-03-01

    Targeted enrichment of conserved and ultraconserved genomic elements allows universal collection of phylogenomic data from hundreds of species at multiple time scales (<5 Ma to >300 Ma). Prior to downstream inference, data from these types of targeted enrichment studies must undergo preprocessing to assemble contigs from sequence data; identify targeted, enriched loci from the off-target background data; align enriched contigs representing conserved loci to one another; and prepare and manipulate these alignments for subsequent phylogenomic inference. PHYLUCE is an efficient and easy-to-install software package that accomplishes these tasks across hundreds of taxa and thousands of enriched loci. PHYLUCE is written for Python 2.7. PHYLUCE is supported on OSX and Linux (RedHat/CentOS) operating systems. PHYLUCE source code is distributed under a BSD-style license from https://www.github.com/faircloth-lab/phyluce/ PHYLUCE is also available as a package (https://binstar.org/faircloth-lab/phyluce) for the Anaconda Python distribution that installs all dependencies, and users can request a PHYLUCE instance on iPlant Atmosphere (tag: phyluce). The software manual and a tutorial are available from http://phyluce.readthedocs.org/en/latest/ and test data are available from doi: 10.6084/m9.figshare.1284521. brant@faircloth-lab.org Supplementary data are available at Bioinformatics online. © The Author 2015. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.

  4. A frequency-based linguistic approach to protein decoding and design: Simple concepts, diverse applications, and the SCS Package

    PubMed Central

    Motomura, Kenta; Nakamura, Morikazu; Otaki, Joji M.

    2013-01-01

    Protein structure and function information is coded in amino acid sequences. However, the relationship between primary sequences and three-dimensional structures and functions remains enigmatic. Our approach to this fundamental biochemistry problem is based on the frequencies of short constituent sequences (SCSs) or words. A protein amino acid sequence is considered analogous to an English sentence, where SCSs are equivalent to words. Availability scores, which are defined as real SCS frequencies in the non-redundant amino acid database relative to their probabilistically expected frequencies, demonstrate the biological usage bias of SCSs. As a result, this frequency-based linguistic approach is expected to have diverse applications, such as secondary structure specifications by structure-specific SCSs and immunological adjuvants with rare or non-existent SCSs. Linguistic similarities (e.g., wide ranges of scale-free distributions) and dissimilarities (e.g., behaviors of low-rank samples) between proteins and the natural English language have been revealed in the rank-frequency relationships of SCSs or words. We have developed a web server, the SCS Package, which contains five applications for analyzing protein sequences based on the linguistic concept. These tools have the potential to assist researchers in deciphering structurally and functionally important protein sites, species-specific sequences, and functional relationships between SCSs. The SCS Package also provides researchers with a tool to construct amino acid sequences de novo based on the idiomatic usage of SCSs. PMID:24688703

  5. A frequency-based linguistic approach to protein decoding and design: Simple concepts, diverse applications, and the SCS Package.

    PubMed

    Motomura, Kenta; Nakamura, Morikazu; Otaki, Joji M

    2013-01-01

    Protein structure and function information is coded in amino acid sequences. However, the relationship between primary sequences and three-dimensional structures and functions remains enigmatic. Our approach to this fundamental biochemistry problem is based on the frequencies of short constituent sequences (SCSs) or words. A protein amino acid sequence is considered analogous to an English sentence, where SCSs are equivalent to words. Availability scores, which are defined as real SCS frequencies in the non-redundant amino acid database relative to their probabilistically expected frequencies, demonstrate the biological usage bias of SCSs. As a result, this frequency-based linguistic approach is expected to have diverse applications, such as secondary structure specifications by structure-specific SCSs and immunological adjuvants with rare or non-existent SCSs. Linguistic similarities (e.g., wide ranges of scale-free distributions) and dissimilarities (e.g., behaviors of low-rank samples) between proteins and the natural English language have been revealed in the rank-frequency relationships of SCSs or words. We have developed a web server, the SCS Package, which contains five applications for analyzing protein sequences based on the linguistic concept. These tools have the potential to assist researchers in deciphering structurally and functionally important protein sites, species-specific sequences, and functional relationships between SCSs. The SCS Package also provides researchers with a tool to construct amino acid sequences de novo based on the idiomatic usage of SCSs.
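
    A toy version of the availability score defined above, computed over a tiny made-up sequence set standing in for the non-redundant database: the observed count of a short constituent sequence (SCS) is divided by its expected count under an independent-residue model.

        from collections import Counter

        seqs = ["MKTAYIAKQR", "MKKLLPTAAA", "MTAYQRQRKT"]   # illustrative only
        word = "TAY"

        total_aa = sum(len(s) for s in seqs)
        aa_freq = Counter("".join(seqs))

        observed = sum(s.count(word) for s in seqs)
        positions = sum(len(s) - len(word) + 1 for s in seqs)
        expected = positions
        for aa in word:
            expected *= aa_freq[aa] / total_aa

        print("availability score of %s: %.1f" % (word, observed / expected))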

  6. Forward Modeling of Large-scale Structure: An Open-source Approach with Halotools

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hearin, Andrew P.; Campbell, Duncan; Tollerud, Erik

    We present the first stable release of Halotools (v0.2), a community-driven Python package designed to build and test models of the galaxy-halo connection. Halotools provides a modular platform for creating mock universes of galaxies starting from a catalog of dark matter halos obtained from a cosmological simulation. The package supports many of the common forms used to describe galaxy-halo models: the halo occupation distribution, the conditional luminosity function, abundance matching, and alternatives to these models that include effects such as environmental quenching or variable galaxy assembly bias. Satellite galaxies can be modeled to live in subhalos or to follow custom number density profiles within their halos, including spatial and/or velocity bias with respect to the dark matter profile. The package has an optimized toolkit to make mock observations on a synthetic galaxy population—including galaxy clustering, galaxy–galaxy lensing, galaxy group identification, RSD multipoles, void statistics, pairwise velocities and others—allowing direct comparison to observations. Halotools is object-oriented, enabling complex models to be built from a set of simple, interchangeable components, including those of your own creation. Halotools has an automated testing suite and is exhaustively documented on http://halotools.readthedocs.io, which includes quickstart guides, source code notes and a large collection of tutorials. The documentation is effectively an online textbook on how to build and study empirical models of galaxy formation with Python.
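
    As a bare-bones sketch of one model family named above, the example below draws a halo occupation distribution (HOD) with Bernoulli centrals and Poisson satellites using the standard Zheng et al. (2007) mean occupations. It uses plain NumPy/SciPy rather than the Halotools API, omits the satellite cutoff mass for brevity, and the parameter values are illustrative.

        import numpy as np
        from scipy.special import erf

        logMmin, sigma, logM1, alpha = 12.0, 0.25, 13.3, 1.0
        rng = np.random.default_rng(42)

        def mean_ncen(logM):
            return 0.5 * (1.0 + erf((logM - logMmin) / sigma))

        def mean_nsat(logM):
            return mean_ncen(logM) * 10.0 ** ((logM - logM1) * alpha)

        halo_logM = rng.uniform(11.0, 15.0, size=100000)           # mock halo masses
        ncen = rng.random(halo_logM.size) < mean_ncen(halo_logM)   # Bernoulli centrals
        nsat = rng.poisson(mean_nsat(halo_logM))                   # Poisson satellites

        print("mean galaxies per halo:", (ncen.sum() + nsat.sum()) / halo_logM.size)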

  7. Potential Flow Theory and Operation Guide for the Panel Code PMARC. Version 14

    NASA Technical Reports Server (NTRS)

    Ashby, Dale L.

    1999-01-01

    The theoretical basis for PMARC, a low-order panel code for modeling complex three-dimensional bodies in potential flow, is outlined. PMARC can be run on a wide variety of computer platforms, including desktop machines, workstations, and supercomputers. Execution times for PMARC vary tremendously depending on the computer resources used, but typically range from several minutes for simple or moderately complex cases to several hours for very large complex cases. Several of the advanced features currently included in the code, such as internal flow modeling, boundary layer analysis, and time-dependent flow analysis, including problems involving relative motion, are discussed in some detail. The code is written in Fortran77, using adjustable-size arrays so that it can be easily redimensioned to match problem requirements and computer hardware constraints. An overview of the program input is presented. A detailed description of the input parameters is provided in the appendices. PMARC results for several test cases are presented along with analytic or experimental data, where available. The input files for these test cases are given in the appendices. PMARC currently supports plotfile output formats for several commercially available graphics packages. The supported graphics packages are Plot3D, Tecplot, and PmarcViewer.

  8. Assembly reliability of CSPs with various chip sizes by accelerated thermal and mechanical cycling test

    NASA Technical Reports Server (NTRS)

    Ghaffarian, R.

    2000-01-01

    A JPL-led chip scale package (CSP) Consortium, composed of team members representing government agencies and private companies, recently joined together to pool in-kind resources for developing the quality and reliability of chip scale packages (CSPs) for a variety of projects.

  9. InSAR Scientific Computing Environment - The Home Stretch

    NASA Astrophysics Data System (ADS)

    Rosen, P. A.; Gurrola, E. M.; Sacco, G.; Zebker, H. A.

    2011-12-01

    The Interferometric Synthetic Aperture Radar (InSAR) Scientific Computing Environment (ISCE) is a software development effort in its third and final year within the NASA Advanced Information Systems and Technology program. The ISCE is a new computing environment for geodetic image processing for InSAR sensors enabling scientists to reduce measurements directly from radar satellites to new geophysical products with relative ease. The environment can serve as the core of a centralized processing center to bring Level-0 raw radar data up to Level-3 data products, but is adaptable to alternative processing approaches for science users interested in new and different ways to exploit mission data. Upcoming international SAR missions will deliver data of unprecedented quantity and quality, making possible global-scale studies in climate research, natural hazards, and Earth's ecosystem. The InSAR Scientific Computing Environment has the functionality to become a key element in processing data from NASA's proposed DESDynI mission into higher level data products, supporting a new class of analyses that take advantage of the long time and large spatial scales of these new data. At the core of ISCE is a new set of efficient and accurate InSAR algorithms. These algorithms are placed into an object-oriented, flexible, extensible software package that is informed by modern programming methods, including rigorous componentization of processing codes, abstraction and generalization of data models. The environment is designed to easily allow user contributions, enabling an open source community to extend the framework into the indefinite future. ISCE supports data from nearly all of the available satellite platforms, including ERS, EnviSAT, Radarsat-1, Radarsat-2, ALOS, TerraSAR-X, and Cosmo-SkyMed. The code applies a number of parallelization techniques and sensible approximations for speed. It is configured to work on modern Linux-based computers with gcc compilers and Python. ISCE is now a complete, functional package, under configuration management, and with extensive documentation and tested use cases appropriate to geodetic imaging applications. The software has been tested with canonical simulated radar data ("point targets") as well as with a variety of existing satellite data, cross-compared with other software packages. Its extensibility has already been proven by the straightforward addition of polarimetric processing and calibration, and derived filtering and estimation routines associated with polarimetry that supplement the original InSAR geodetic functionality. As of October 2011, the software is available for non-commercial use through UNAVCO's WinSAR consortium.

  10. Mocking the weak lensing universe: The LensTools Python computing package

    NASA Astrophysics Data System (ADS)

    Petri, A.

    2016-10-01

    We present a newly developed software package which implements a wide range of routines frequently used in Weak Gravitational Lensing (WL). With the continuously increasing size of the WL scientific community we feel that easy-to-use Application Program Interfaces (APIs) for common calculations are a necessity to ensure efficiency and coordination across different working groups. Coupled with existing open source codes, such as CAMB (Lewis et al., 2000) and Gadget2 (Springel, 2005), LensTools brings together a cosmic shear simulation pipeline which, complemented with a variety of WL feature measurement tools and parameter sampling routines, provides easy access to the numerics for theoretical studies of WL as well as for experiment forecasts. Being implemented in PYTHON (Rossum, 1995), LensTools takes full advantage of a range of state-of-the-art techniques developed by the large and growing open-source software community (Jones et al., 2001; McKinney, 2010; Astropy Collaboration, 2013; Pedregosa et al., 2011; Foreman-Mackey et al., 2013). We made the LensTools code available on the Python Package Index and published its documentation on http://lenstools.readthedocs.io.
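
    A generic sketch of one measurement such a pipeline must provide (not the LensTools API): the azimuthally averaged power spectrum of a square convergence map, obtained with an FFT and binned in multipole. The flat-sky normalisation shown is one common convention.

        import numpy as np

        def power_spectrum(kappa, side_deg, n_bins=15):
            n = kappa.shape[0]
            fov = np.radians(side_deg)
            # multipole grid matching the FFT frequencies
            ell_1d = 2.0 * np.pi * np.fft.fftfreq(n, d=fov / n)
            lx, ly = np.meshgrid(ell_1d, ell_1d)
            ell = np.sqrt(lx**2 + ly**2)
            p2d = np.abs(np.fft.fft2(kappa)) ** 2 * fov**2 / n**4

            # azimuthal average in linear bins of |ell|
            bins = np.linspace(ell[ell > 0].min(), ell.max() / 2, n_bins + 1)
            idx = np.digitize(ell.ravel(), bins)
            pk = [p2d.ravel()[idx == i].mean() for i in range(1, n_bins + 1)]
            return 0.5 * (bins[1:] + bins[:-1]), np.array(pk)

        ells, pk = power_spectrum(np.random.randn(128, 128), side_deg=3.5)
        print(ells[0], pk[0])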

  11. WinTRAX: A raytracing software package for the design of multipole focusing systems

    NASA Astrophysics Data System (ADS)

    Grime, G. W.

    2013-07-01

    The software package TRAX was a simulation tool for modelling the path of charged particles through linear cylindrical multipole fields described by analytical expressions and was a development of the earlier OXRAY program (Grime and Watt, 1983; Grime et al., 1982) [1,2]. In a 2005 comparison of raytracing software packages (Incerti et al., 2005) [3], TRAX/OXRAY was compared with Geant4 and Zgoubi and was found to give close agreement with the more modern codes. TRAX was a text-based program which was only available for operation in a now rare VMS workstation environment, so a new program, WinTRAX, has been developed for the Windows operating system. This implements the same basic computing strategy as TRAX, and key sections of the code are direct translations from FORTRAN to C++, but the Windows environment is exploited to make an intuitive graphical user interface which simplifies and enhances many operations including system definition and storage, optimisation, beam simulation (including with misaligned elements) and aberration coefficient determination. This paper describes the program and presents comparisons with other software and real installations.

  12. Amesos2 and Belos: Direct and Iterative Solvers for Large Sparse Linear Systems

    DOE PAGES

    Bavier, Eric; Hoemmen, Mark; Rajamanickam, Sivasankaran; ...

    2012-01-01

    Solvers for large sparse linear systems come in two categories: direct and iterative. Amesos2, a package in the Trilinos software project, provides direct methods, and Belos, another Trilinos package, provides iterative methods. Amesos2 offers a common interface to many different sparse matrix factorization codes, and can handle any implementation of sparse matrices and vectors, via an easy-to-extend C++ traits interface. It can also factor matrices whose entries have arbitrary “Scalar” type, enabling extended-precision and mixed-precision algorithms. Belos includes many different iterative methods for solving large sparse linear systems and least-squares problems. Unlike competing iterative solver libraries, Belos completely decouples the algorithms from the implementations of the underlying linear algebra objects. This lets Belos exploit the latest hardware without changes to the code. Belos favors algorithms that solve higher-level problems, such as multiple simultaneous linear systems and sequences of related linear systems, faster than standard algorithms. The package also supports extended-precision and mixed-precision algorithms. Together, Amesos2 and Belos form a complete suite of sparse linear solvers.
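
    The two categories can be illustrated with SciPy stand-ins (Trilinos itself is C++): a sparse direct factorisation playing Amesos2's role and a conjugate-gradient Krylov iteration playing Belos's role, applied to the same symmetric positive-definite tridiagonal system.

        import numpy as np
        from scipy.sparse import diags
        from scipy.sparse.linalg import spsolve, cg

        n = 1000
        A = diags([-1.0, 2.0, -1.0], [-1, 0, 1], shape=(n, n), format='csr')
        b = np.ones(n)

        x_direct = spsolve(A, b)      # direct: factor once, then solve
        x_iter, info = cg(A, b)       # iterative: Krylov subspace (CG)

        print(info, np.max(np.abs(x_direct - x_iter)))  # info == 0: converged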

  13. Introducing MCgrid 2.0: Projecting cross section calculations on grids

    NASA Astrophysics Data System (ADS)

    Bothmann, Enrico; Hartland, Nathan; Schumann, Steffen

    2015-11-01

    MCgrid is a software package that provides access to interpolation tools for Monte Carlo event generator codes, allowing for the fast and flexible variation of scales, coupling parameters and PDFs in cutting edge leading- and next-to-leading-order QCD calculations. We present the upgrade to version 2.0, which has a broader scope of interfaced interpolation tools, now providing access to fastNLO, and features an approximate treatment for the projection of MC@NLO-type calculations onto interpolation grids. MCgrid 2.0 also now supports the extended information provided through the HepMC event record used in the recent SHERPA version 2.2.0. The additional information provided therein allows for the support of multi-jet merged QCD calculations in a future update of MCgrid.

  14. 40 CFR 98.9 - Addresses.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    .... Director, Climate Change Division, 1200 Pennsylvania Ave., NW., Mail Code: 6207J, Washington, DC 20460. (b) For package deliveries. Director, Climate Change Division, 1310 L St, NW., Washington, DC 20005. [74...

  15. 40 CFR 98.9 - Addresses.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    .... Director, Climate Change Division, 1200 Pennsylvania Ave., NW., Mail Code: 6207J, Washington, DC 20460. (b) For package deliveries. Director, Climate Change Division, 1310 L St, NW., Washington, DC 20005. [74...

  16. 40 CFR 98.9 - Addresses.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    .... Director, Climate Change Division, 1200 Pennsylvania Ave., NW., Mail Code: 6207J, Washington, DC 20460. (b) For package deliveries. Director, Climate Change Division, 1310 L St, NW., Washington, DC 20005. [74...

  17. 49 CFR 176.2 - Definitions.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ..., spillage, or other accident. INF cargo means packaged irradiated nuclear fuel, plutonium or high-level... Irradiated Nuclear Fuel, Plutonium and High-Level Radioactive Wastes on Board Ships” (INF Code) contained in...

  18. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lanier, Nicholas Edward

    We have completed implementation of a laser package in LANL's principal AGEX design code, Cassio. Although we have greatly improved our target characterization and uncertainty quantification, we remain unable to satisfactorily simulate the NIF Pleiades data.

  19. Turbofan noise generation. Volume 2: Computer programs

    NASA Technical Reports Server (NTRS)

    Ventres, C. S.; Theobald, M. A.; Mark, W. D.

    1982-01-01

    The use of a package of computer programs developed to calculate the in-duct acoustic modes excited by a fan/stator stage operating at subsonic tip speed is described. The following three noise source mechanisms are included: (1) sound generated by the rotor blades interacting with turbulence ingested into, or generated within, the inlet duct; (2) sound generated by the stator vanes interacting with the turbulent wakes of the rotor blades; and (3) sound generated by the stator vanes interacting with the velocity deficits in the mean wakes of the rotor blades. The computations for the three different noise mechanisms are coded as three separate computer program packages. The computer codes are described by means of block diagrams, tables of data and variables, and example program executions; FORTRAN listings are included.

  20. Seismology software: state of the practice

    NASA Astrophysics Data System (ADS)

    Smith, W. Spencer; Zeng, Zheng; Carette, Jacques

    2018-05-01

    We analyzed the state of practice for software development in the seismology domain by comparing 30 software packages on four aspects: product, implementation, design, and process. We found room for improvement in most seismology software packages. The principal areas of concern include a lack of adequate requirements and design specification documents, a lack of test data to assess reliability, a lack of examples to get new users started, and a lack of technological tools to assist with managing the development process. To assist going forward, we provide recommendations for a document-driven development process that includes a problem statement, development plan, requirement specification, verification and validation (V&V) plan, design specification, code, V&V report, and a user manual. We also provide advice on tool use, including issue tracking, version control, code documentation, and testing tools.

  1. Numerical Simulation of Doped Targets for ICF

    NASA Astrophysics Data System (ADS)

    Phillips, Lee; Gardner, John H.; Bodner, Stephen E.; Colombant, Denis; Klapisch, Marcel; Bar-Shalom, Avraham

    1997-11-01

    The ablative Rayleigh-Taylor (RT) instability can be reduced by preheating the ablator, thereby reducing the peak density and increasing the mass ablation velocity. The ablator can be preheated with radiation from higher-Z dopants (Gardner, J.H., Bodner, S.E., Dahlburg, J.P., Phys. Fluids 3, 1070 (1991)). Dopants also reduce the density gradient at the ablator, which provides a second mechanism to reduce the RT growth rate. We have recently developed a more sophisticated and detailed radiation package that uses opacities generated by an STA code, with non-LTE radiation transport based on the Busquet method. This radiation package has been incorporated into NRL's FAST2D radiation hydrodynamics code, which has been used to evaluate and optimize the use of various dopants that can provide interesting levels of preheat for an ICF target.

  2. Turbofan noise generation. Volume 2: Computer programs

    NASA Astrophysics Data System (ADS)

    Ventres, C. S.; Theobald, M. A.; Mark, W. D.

    1982-07-01

    The use of a package of computer programs developed to calculate the in-duct acoustic modes excited by a fan/stator stage operating at subsonic tip speed is described. The following three noise source mechanisms are included: (1) sound generated by the rotor blades interacting with turbulence ingested into, or generated within, the inlet duct; (2) sound generated by the stator vanes interacting with the turbulent wakes of the rotor blades; and (3) sound generated by the stator vanes interacting with the velocity deficits in the mean wakes of the rotor blades. The computations for the three different noise mechanisms are coded as three separate computer program packages. The computer codes are described by means of block diagrams, tables of data and variables, and example program executions; FORTRAN listings are included.

  3. Seismology software: state of the practice

    NASA Astrophysics Data System (ADS)

    Smith, W. Spencer; Zeng, Zheng; Carette, Jacques

    2018-02-01

    We analyzed the state of practice for software development in the seismology domain by comparing 30 software packages on four aspects: product, implementation, design, and process. We found room for improvement in most seismology software packages. The principal areas of concern include a lack of adequate requirements and design specification documents, a lack of test data to assess reliability, a lack of examples to get new users started, and a lack of technological tools to assist with managing the development process. To assist going forward, we provide recommendations for a document-driven development process that includes a problem statement, development plan, requirement specification, verification and validation (V&V) plan, design specification, code, V&V report, and a user manual. We also provide advice on tool use, including issue tracking, version control, code documentation, and testing tools.

  4. AstroBlend: Visualization package for use with Blender

    NASA Astrophysics Data System (ADS)

    Naiman, J. P.

    2015-12-01

    AstroBlend is a visualization package for use in the three-dimensional animation and modeling software Blender. It reads data in via a text file or can use pre-built isosurface files stored in the Wavefront OBJ format. AstroBlend supports a variety of codes such as FLASH (ascl:1010.082), Enzo (ascl:1010.072), and Athena (ascl:1010.014), and combines artistic 3D models with computational astrophysics datasets to create models and animations.

  5. Total Ionizing Dose Test Report BFR92A NPN 5 GHz Wide Band Transistor from NXP

    NASA Technical Reports Server (NTRS)

    Phan, Anthony M.; Oldham, Timothy R.

    2011-01-01

    The purpose of this test was to characterize the Philips/NXP BFR92A NPN 5 GHz wide-band silicon transistor for total dose response. This test serves as the radiation lot acceptance test (RLAT) for lot date code (LDC) 1027. The BFR92A is packaged in a 3-pin plastic SOT23 package. Low-dose-rate (LDR/ELDRS) irradiations were performed.

  6. EUPDF: An Eulerian-Based Monte Carlo Probability Density Function (PDF) Solver. User's Manual

    NASA Technical Reports Server (NTRS)

    Raju, M. S.

    1998-01-01

    EUPDF is an Eulerian-based Monte Carlo PDF solver developed for application with sprays, combustion, parallel computing, and unstructured grids. It is designed to be massively parallel and can easily be coupled with any existing gas-phase flow and spray solvers. The solver accommodates the use of an unstructured mesh with mixed elements of triangular, quadrilateral, and/or tetrahedral type. The manual provides the user with the coding required to couple the PDF code to any given flow code, a basic understanding of the EUPDF code structure, and an overview of the models involved in the PDF formulation. The source code of EUPDF will be available with the release of the National Combustion Code (NCC) as a complete package.

  7. Test report: DOT 7A Type A liquid packaging

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ketusky, E. T.; Brandjes, C.; Benoit, T. J.

    This test report documents the performance of Savannah River National Laboratory’s (SRNL’s) U.S. Department of Transportation (DOT) Specification 7A, General Packaging, Type A shielded liquid shipping packaging and its compliance with the regulatory requirements of Title 49 of the Code of Federal Regulations (CFR). The primary use of this packaging design is for the transport of radioactive liquids of up to 1.3 liters in an unshielded configuration and up to 113 mL in a shielded configuration, with no more than an A2 quantity in either configuration, over public highways and/or by commercial aircraft. The contents are liquid radioactive materials sufficiently shielded and within the activity limits specified in 49 CFR 173.435 or 173.433 for A2 (normal form) materials, as well as within the analyzed thermal heat limits. Any contents must be compatibly packaged and must be compatible with the packaging. The basic packaging design is based on the U.S. Department of Energy’s (DOE’s) Model 9979 Type A fissile shipping packaging designed and tested by SRNL. The shielded liquid configuration consists of the outer and inner drums of the 9979 package with additional low-density polyethylene (LDPE) dunnage nesting a tungsten shielded cask assembly (WSCA) within the 30-gallon inner drum. The packaging model for the DOT Specification 7A, Type A liquids packaging is HVYTAL.

  8. Powerlaw: a Python package for analysis of heavy-tailed distributions.

    PubMed

    Alstott, Jeff; Bullmore, Ed; Plenz, Dietmar

    2014-01-01

    Power laws are theoretically interesting probability distributions that are also frequently used to describe empirical data. In recent years, effective statistical methods for fitting power laws have been developed, but appropriate use of these techniques requires significant programming and statistical insight. To lower the barriers to using good statistical methods for fitting power-law distributions, we developed the powerlaw Python package. This software package provides easy commands for basic fitting and statistical analysis of distributions. Notably, it also supports a wide variety of user needs by being exhaustive in the options available to the user. The source code is publicly available and easily extensible.
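
    A minimal sketch of the workflow the abstract describes, assuming a one-dimensional array of empirical values (the synthetic Pareto sample below is a placeholder, not data from the paper):

        import numpy as np
        import powerlaw

        # Synthetic heavy-tailed sample standing in for empirical data
        data = np.random.pareto(2.5, 10000) + 1

        fit = powerlaw.Fit(data)        # estimates xmin and the exponent alpha
        print(fit.power_law.alpha, fit.power_law.xmin)

        # Likelihood-ratio comparison against an alternative distribution
        R, p = fit.distribution_compare('power_law', 'lognormal')
        print(R, p)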

  9. Modeling unsaturated zone flow and runoff processes by integrating MODFLOW-LGR and VSF, and creating the new CFL package

    USGS Publications Warehouse

    Borsia, I.; Rossetto, R.; Schifani, C.; Hill, Mary C.

    2013-01-01

    In this paper two modifications to the MODFLOW code are presented. The first extends the Local Grid Refinement (LGR) capability to the Variably Saturated Flow (VSF) process, allowing the user to solve the 3D Richards’ equation only in selected parts of the model domain. The second introduces a new package, named CFL (Cascading Flow), which improves the computation of overland flow when ground-surface saturation is simulated using either VSF or the Unsaturated Zone Flow (UZF) package. The modeling concepts are presented and demonstrated. Programmer documentation is included in appendices.

  10. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lee, Edwin S.

    Under the CRADA, NREL will provide assistance to NRGsim to debug and convert the EnergyPlus Hysteresis Phase Change Material ('PCM') model to C++ for adoption into the main code package of the EnergyPlus simulation engine.

  11. Automatic generation of user material subroutines for biomechanical growth analysis.

    PubMed

    Young, Jonathan M; Yao, Jiang; Ramasubramanian, Ashok; Taber, Larry A; Perucchio, Renato

    2010-10-01

    The analysis of the biomechanics of growth and remodeling in soft tissues requires the formulation of specialized pseudoelastic constitutive relations. The nonlinear finite element analysis package ABAQUS allows the user to implement such specialized material responses through the coding of a user material subroutine called UMAT. However, hand coding UMAT subroutines is a challenge even for simple pseudoelastic materials and requires substantial time to debug and test the code. To resolve this issue, we develop an automatic UMAT code generation procedure for pseudoelastic materials using the symbolic mathematics package MATHEMATICA and extend the UMAT generator to include continuum growth. The performance of the automatically coded UMAT is tested by simulating the stress-stretch response of a material defined by a Fung-orthotropic strain energy function, subject to uniaxial stretching, equibiaxial stretching, and simple shear in ABAQUS. The MATHEMATICA UMAT generator is then extended to include continuum growth by adding a growth subroutine to the automatically generated UMAT. The MATHEMATICA UMAT generator correctly derives the variables required in the UMAT code, quickly providing a ready-to-use UMAT. In turn, the UMAT accurately simulates the pseudoelastic response. In order to test the growth UMAT, we simulate the growth-based bending of a bilayered bar with differing fiber directions in a nongrowing passive layer. The anisotropic passive layer, being topologically tied to the growing isotropic layer, causes the bending bar to twist laterally. The results of simulations demonstrate the validity of the automatically coded UMAT, used in both standardized tests of hyperelastic materials and for a biomechanical growth analysis.
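
    The generate-rather-than-hand-code idea can be illustrated with a short Python/SymPy sketch (an analogue of, not the authors' MATHEMATICA generator): derive the stress symbolically from a strain-energy function, then emit Fortran suitable for a UMAT-style routine. The neo-Hookean energy and all names below are illustrative assumptions.

        import sympy as sp

        lam = sp.Symbol('lam', positive=True)  # uniaxial stretch
        mu = sp.Symbol('mu', positive=True)    # shear modulus

        # Incompressible neo-Hookean strain energy under uniaxial stretch
        W = mu / 2 * (lam**2 + 2 / lam - 3)

        # Cauchy stress for incompressible uniaxial tension: sigma = lam * dW/dlam
        sigma = sp.simplify(lam * sp.diff(W, lam))

        # Emit Fortran, the language of ABAQUS UMAT subroutines
        print(sp.fcode(sigma, assign_to='SIGMA'))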

  12. TU-AB-BRC-12: Optimized Parallel Monte Carlo Dose Calculations for Secondary MU Checks

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    French, S; Nazareth, D; Bellor, M

    Purpose: Secondary MU checks are an important tool used during a physics review of a treatment plan. Commercial software packages offer varying degrees of theoretical dose calculation accuracy, depending on the modality involved. Dose calculations of VMAT plans are especially prone to error due to the large approximations involved. Monte Carlo (MC) methods are not commonly used due to their long run times. We investigated two methods to increase the computational efficiency of MC dose simulations with the BEAMnrc code: distributed computing resources and optimized code compilation, which together allow for accurate and efficient VMAT dose calculations. Methods: The BEAMnrc package was installed on a high-performance computing cluster accessible to our clinic. MATLAB and Python scripts were developed to convert a clinical VMAT DICOM plan into BEAMnrc input files. The BEAMnrc installation was optimized by running the VMAT simulations through profiling tools, which indicated the behavior of the constituent routines in the code, e.g., the bremsstrahlung splitting routine and the specified random number generator. This information aided in determining the most efficient parallel compilation configuration for the specific CPUs available on our cluster, resulting in the fastest VMAT simulation times. Our method was evaluated with calculations involving 10^8 - 10^9 particle histories, which are sufficient to verify patient dose using VMAT. Results: Parallelization allowed the calculation of patient dose on the order of 10 - 15 hours with 100 parallel jobs. Due to the compiler optimization process, further speed increases of 23% were achieved compared with the default open-source BEAMnrc build. Conclusion: Analysis of the BEAMnrc code allowed us to optimize the compiler configuration for VMAT dose calculations. In future work, the optimized MC code, in conjunction with the parallel processing capabilities of BEAMnrc, will be applied to provide accurate and efficient secondary MU checks.

  13. A new software for deformation source optimization, the Bayesian Earthquake Analysis Tool (BEAT)

    NASA Astrophysics Data System (ADS)

    Vasyura-Bathke, H.; Dutta, R.; Jonsson, S.; Mai, P. M.

    2017-12-01

    Modern studies of crustal deformation and the related source estimation, including magmatic and tectonic sources, increasingly use non-linear optimization strategies to estimate geometric and/or kinematic source parameters, and often jointly consider geodetic and seismic data. Bayesian inference is increasingly being used for estimating posterior distributions of deformation source model parameters, given measured/estimated/assumed data and model uncertainties. For instance, some studies consider uncertainties of a layered medium and propagate these into source parameter uncertainties, while others use informative priors to reduce the model parameter space. In addition, innovative sampling algorithms have been developed to efficiently explore the high-dimensional parameter spaces. Compared to earlier studies, these improvements have resulted in overall more robust source model parameter estimates that include uncertainties. However, the computational burden of these methods is high, and estimation codes are rarely made available along with the published results. Even if the codes are accessible, it is usually challenging to assemble them into a single optimization framework, as they are typically coded in different programming languages. Therefore, further progress and future applications of these methods/codes are hampered, while reproducibility and validation of results has become essentially impossible. In the spirit of providing open-access and modular codes to facilitate progress and reproducible research in deformation source estimations, we undertook the effort of developing BEAT, a Python package that comprises all the above-mentioned features in one single programming environment. The package builds on the pyrocko seismological toolbox (www.pyrocko.org), and uses the pymc3 module for Bayesian statistical model fitting. BEAT is an open-source package (https://github.com/hvasbath/beat), and we encourage and solicit contributions to the project. Here, we present our strategy for developing BEAT and show application examples, especially the effect of including the model prediction uncertainty of the velocity model in subsequent source optimizations: full moment tensor, Mogi source, and a moderate strike-slip earthquake.

  14. Color-coded automated signal intensity curves for detection and characterization of breast lesions: preliminary evaluation of a new software package for integrated magnetic resonance-based breast imaging.

    PubMed

    Pediconi, Federica; Catalano, Carlo; Venditti, Fiammetta; Ercolani, Mauro; Carotenuto, Luigi; Padula, Simona; Moriconi, Enrica; Roselli, Antonella; Giacomelli, Laura; Kirchin, Miles A; Passariello, Roberto

    2005-07-01

    The objective of this study was to evaluate the value of a color-coded automated signal intensity curve software package for contrast-enhanced magnetic resonance mammography (CE-MRM) in patients with suspected breast cancer. Thirty-six women with suspected breast cancer based on mammographic and sonographic examinations were preoperatively evaluated with CE-MRM. CE-MRM was performed on a 1.5-T magnet using a 2D FLASH dynamic T1-weighted sequence. A dose of 0.1 mmol/kg of Gd-BOPTA was administered at a flow rate of 2 mL/s followed by 10 mL of saline. Images were analyzed with the new software package and separately with a standard display method. Statistical comparison was performed of the confidence for lesion detection and characterization with the 2 methods and of the diagnostic accuracy for characterization compared with histopathologic findings. At pathology, 54 malignant lesions and 14 benign lesions were evaluated. All 68 (100%) lesions were detected with both methods, and good correlation with histopathologic specimens was obtained. Confidence for both detection and characterization was significantly (P ≤ 0.025) better with the color-coded method, although no difference (P > 0.05) between the methods was noted in terms of the sensitivity, specificity, and overall accuracy for lesion characterization. Excellent agreement between the 2 methods was noted for both the determination of lesion size (kappa = 0.77) and the determination of signal intensity/time (SI/T) curves (kappa = 0.85). The novel color-coded signal intensity curve software allows lesions to be visualized as false-color maps that correspond to conventional signal intensity time curves. Detection and characterization of breast lesions with this method is quick and easily interpretable.

  15. Sedimentary rhythms in coastal dunes as a record of intra-annual changes in wind climate (Łeba, Poland)

    NASA Astrophysics Data System (ADS)

    Ludwig, J.; Lindhorst, S.; Betzler, C.; Bierstedt, S. E.; Borówka, R. K.

    2017-08-01

    It is shown that coastal dunes bear a so-far-unread archive of annual wind intensity. Active dunes at the Polish coast near Łeba consist of two genetic units: primary dunes with up to 18 m high eastward-dipping foresets, temporarily superimposed by smaller secondary dunes. Ground-penetrating radar (GPR) data reveal that the foresets of the primary dunes are bundled into alternating packages imaged as either low- or high-amplitude reflections. High-amplitude packages are composed of quartz sand with intercalated heavy-mineral layers. Low-amplitude packages lack these heavy-mineral concentrations. Dune net-progradation is towards the east, reflecting the prevalence of westerly winds. Winds blowing parallel to the dune crest winnow the lee slope, leaving layers enriched in heavy minerals. Sediment transport to the slip face of the dunes is enhanced during the winter months, whereas winnowing predominantly takes place during the spring to autumn months, when the wind field is bi-directional. As a consequence of this seasonal shift, the sedimentary record of one year comprises one low- and one high-amplitude GPR reflection interval. This sedimentary pattern is a persistent feature of the Łeba dunes and is recognized to resemble a sedimentary "bar code". To overcome hiatuses in the bar code of individual dunes and dune-to-dune variations in bar-code quality, dendrochronological methods were adopted to compile a composite bar code from several dunes. The resulting data series shows annual variations in west-wind intensity at the southern Baltic coast for the period 1987 to 2012. Proxy-based wind data are validated against instrument-based weather observations.
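
    The cross-dating step borrowed from dendrochronology can be illustrated with a toy Python sketch (an illustration, not the authors' procedure): slide one dune's bar code against another and keep the lag with the best agreement before merging overlapping records into a composite.

        import numpy as np

        def best_lag(a, b, min_overlap=5):
            """Return the lag of b relative to a that maximizes agreement."""
            best, best_score = 0, -np.inf
            for lag in range(-len(b) + 1, len(a)):
                a_seg = a[max(0, lag):min(len(a), lag + len(b))]
                b_seg = b[max(0, -lag):max(0, -lag) + len(a_seg)]
                if len(a_seg) < min_overlap:
                    continue
                score = np.mean(a_seg == b_seg)  # fraction of matching layers
                if score > best_score:
                    best, best_score = lag, score
            return best, best_score

        a = np.array([1, 0, 1, 1, 0, 0, 1, 0, 1, 1, 0, 1])
        b = a[3:10]                # a fragment of the same record
        print(best_lag(a, b))      # recovers lag 3 with perfect agreement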

  16. ORNL Pre-test Analyses of a Large-scale Experiment in STYLE

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Williams, Paul T; Yin, Shengjun; Klasky, Hilda B

    Oak Ridge National Laboratory (ORNL) is conducting a series of numerical analyses to simulate a large-scale mock-up experiment planned within the European Network for Structural Integrity for Lifetime Management of non-RPV Components (STYLE). STYLE is a European cooperative effort to assess the structural integrity of (non-reactor pressure vessel) reactor coolant pressure boundary components relevant to ageing and life-time management and to integrate the knowledge created in the project into mainstream nuclear industry assessment codes. ORNL contributes work-in-kind support to STYLE Work Package 2 (Numerical Analysis/Advanced Tools) and Work Package 3 (Engineering Assessment Methods/LBB Analyses). This paper summarizes the current status of ORNL analyses of the STYLE Mock-Up3 large-scale experiment to simulate and evaluate crack growth in a cladded ferritic pipe. The analyses are being performed in two parts. In the first part, advanced fracture mechanics models are being developed and applied to evaluate several experiment designs, taking into account the capabilities of the test facility while satisfying the test objectives. These advanced fracture mechanics models will then be utilized to simulate the crack growth in the large-scale mock-up test. For the second part, the recently developed ORNL SIAM-PFM open-source, cross-platform, probabilistic computational tool will be used to generate an alternative assessment for comparison with the advanced fracture mechanics model results. The SIAM-PFM probabilistic analysis of the Mock-Up3 experiment will utilize fracture modules that are installed into a general probabilistic framework. The probabilistic results of the Mock-Up3 experiment obtained from SIAM-PFM will be compared to those generated using the deterministic 3D nonlinear finite-element modeling approach. The objective of the probabilistic analysis is to provide uncertainty bounds that will assist in assessing the more detailed 3D finite-element solutions and to assess the level of confidence that can be placed in the best-estimate finite-element solutions.

  17. Study of providing omnidirectional vibration isolation to entire space shuttle payload packages

    NASA Technical Reports Server (NTRS)

    Chang, C. S.; Robinson, G. D.; Weber, D. E.

    1974-01-01

    Techniques to provide omnidirectional vibration isolation for a space shuttle payload package were investigated via a reduced-scale model. Development, design, fabrication, assembly and test evaluation of a 0.125-scale isolation model are described. Final drawings for fabricated mechanical components are identified, and prints of all drawings are included.

  18. CORRELATIONS BETWEEN HOMOLOGUE CONCENTRATIONS OF PCDD/FS AND TOXIC EQUIVALENCY VALUES IN LABORATORY-, PACKAGE BOILER-, AND FIELD-SCALE INCINERATORS

    EPA Science Inventory

    The toxic equivalency (TEQ) values of polychlorinated dibenzo-p-dioxins and polychlorinated dibenzofurans (PCDD/Fs) are predicted with a model based on the homologue concentrations measured from a laboratory-scale reactor (124 data points), a package boiler (61 data points), and ...

  19. Creating Synthetic Coronal Observational Data From MHD Models: The Forward Technique

    NASA Technical Reports Server (NTRS)

    Rachmeler, Laurel A.; Gibson, Sarah E.; Dove, James; Kucera, Therese Ann

    2010-01-01

    We present a generalized forward code for creating simulated coronal observables off the limb from numerical and analytical MHD models. This generalized forward model is capable of creating emission maps in various wavelengths for instruments such as SXT, EIT, EIS, and coronagraphs, as well as spectropolarimetric images and line profiles. The inputs to our code can be analytic models (of which four come with the code) or 2.5D and 3D numerical datacubes. We present some examples of the observable data created with our code as well as its functional capabilities. This code is currently available for beta-testing (contact authors), with the ultimate goal of release as a SolarSoft package.

  20. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Prasad, M.K.; Kershaw, D.S.; Shaw, M.J.

    The authors present detailed features of the ICF3D hydrodynamics code used for inertial fusion simulations. This code is intended to be a state-of-the-art upgrade of the well-known fluid code, LASNEX. ICF3D employs discontinuous finite elements on a discrete unstructured mesh consisting of a variety of 3D polyhedra including tetrahedra, prisms, and hexahedra. The authors discuss how the Roe-averaged second-order convection is applied on the discrete elements, and how the C++ coding interface has helped to simplify implementing the many physics and numerics modules within the code package. The authors emphasize the virtues of object-oriented design in large-scale projects such as ICF3D.
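
    For orientation, the Roe-type upwind flux referred to above has the standard textbook form (quoted as background; ICF3D's exact discretization may differ):

        \hat{F}(U_L, U_R) = \tfrac{1}{2}\left[F(U_L) + F(U_R)\right]
                          - \tfrac{1}{2}\,\lvert\tilde{A}(U_L, U_R)\rvert\,(U_R - U_L)

    where U_L and U_R are the states on either side of an element face and \tilde{A} is the flux Jacobian evaluated at the Roe-averaged state.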

  1. Data Parallel Line Relaxation (DPLR) Code User Manual: Acadia - Version 4.01.1

    NASA Technical Reports Server (NTRS)

    Wright, Michael J.; White, Todd; Mangini, Nancy

    2009-01-01

    Data-Parallel Line Relaxation (DPLR) code is a computational fluid dynamic (CFD) solver that was developed at NASA Ames Research Center to help mission support teams generate high-value predictive solutions for hypersonic flow field problems. The DPLR Code Package is an MPI-based, parallel, full three-dimensional Navier-Stokes CFD solver with generalized models for finite-rate reaction kinetics, thermal and chemical non-equilibrium, accurate high-temperature transport coefficients, and ionized flow physics incorporated into the code. DPLR also includes a large selection of generalized realistic surface boundary conditions and links to enable loose coupling with external thermal protection system (TPS) material response and shock layer radiation codes.

  2. Novel Ruggedized Packaging Technology for VCSELs

    DTIC Science & Technology

    2017-03-01

    Novel Ruggedized Packaging Technology for VCSELs. Charlie Kuznia (ckuznia@ultracomm-inc.com), Ultra Communications, Inc., Vista, CA, USA 92081. ... can achieve low-power, EMI-immune links within high-performance military computing and sensor systems. [Figure 1: chip-scale packaging; caption truncated in source.]

  3. Facilitating Internet-Scale Code Retrieval

    ERIC Educational Resources Information Center

    Bajracharya, Sushil Krishna

    2010-01-01

    Internet-Scale code retrieval deals with the representation, storage, and access of relevant source code from a large amount of source code available on the Internet. Internet-Scale code retrieval systems support common emerging practices among software developers related to finding and reusing source code. In this dissertation we focus on some…

  4. Packaging Software Assets for Reuse

    NASA Astrophysics Data System (ADS)

    Mattmann, C. A.; Marshall, J. J.; Downs, R. R.

    2010-12-01

    The reuse of existing software assets such as code, architecture, libraries, and modules in current software and systems development projects can provide many benefits, including reduced costs in time and effort, and increased reliability. Many reusable assets are currently available in various online catalogs and repositories, usually broken down by disciplines such as programming language (Ibiblio for Maven/Java developers, PyPI for Python developers, CPAN for Perl developers, etc.). The way these assets are packaged for distribution can play a role in their reuse - an asset that is packaged simply and logically is typically easier to understand, install, and use, thereby increasing its reusability. A well-packaged asset has advantages in being more reusable and thus more likely to provide benefits through its reuse. This presentation will discuss various aspects of software asset packaging and how they can affect the reusability of the assets. The characteristics of well-packaged software will be described. A software packaging domain model will be introduced, and some existing packaging approaches examined. An example case study of a Reuse Enablement System (RES), currently being created by near-term Earth science decadal survey missions, will provide information about the use of the domain model. Awareness of these factors will help software developers package their reusable assets so that they can provide the most benefits for software reuse.

  5. The NJOY Nuclear Data Processing System, Version 2016

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Macfarlane, Robert; Muir, Douglas W.; Boicourt, R. M.

    The NJOY Nuclear Data Processing System, version 2016, is a comprehensive computer code package for producing pointwise and multigroup cross sections and related quantities from evaluated nuclear data in the ENDF-4 through ENDF-6 legacy card-image formats. NJOY works with evaluated files for incident neutrons, photons, and charged particles, producing libraries for a wide variety of particle transport and reactor analysis codes.

  6. SQA of finite element method (FEM) codes used for analyses of pit storage/transport packages

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Russel, E.

    1997-11-01

    This report contains viewgraphs on the software quality assurance of finite element method codes used for analyses of pit storage and transport projects. This methodology utilizes the ISO 9000-3: Guideline for application of 9001 to the development, supply, and maintenance of software, for establishing well-defined software engineering processes to consistently maintain high quality management approaches.

  7. Optical systems integrated modeling

    NASA Technical Reports Server (NTRS)

    Shannon, Robert R.; Laskin, Robert A.; Brewer, SI; Burrows, Chris; Epps, Harlan; Illingworth, Garth; Korsch, Dietrich; Levine, B. Martin; Mahajan, Vini; Rimmer, Chuck

    1992-01-01

    An integrated modeling capability that provides the tools by which entire optical systems and instruments can be simulated and optimized is a key technology development, applicable to all mission classes, especially astrophysics. Many of the future missions require optical systems that are physically much larger than anything flown before and yet must retain the characteristic sub-micron diffraction limited wavefront accuracy of their smaller precursors. It is no longer feasible to follow the path of 'cut and test' development; the sheer scale of these systems precludes many of the older techniques that rely upon ground evaluation of full size engineering units. The ability to accurately model (by computer) and optimize the entire flight system's integrated structural, thermal, and dynamic characteristics is essential. Two distinct integrated modeling capabilities are required. These are an initial design capability and a detailed design and optimization system. The content of an initial design package is shown. It would be a modular, workstation based code which allows preliminary integrated system analysis and trade studies to be carried out quickly by a single engineer or a small design team. A simple concept for a detailed design and optimization system is shown. This is a linkage of interface architecture that allows efficient interchange of information between existing large specialized optical, control, thermal, and structural design codes. The computing environment would be a network of large mainframe machines and its users would be project level design teams. More advanced concepts for detailed design systems would support interaction between modules and automated optimization of the entire system. Technology assessment and development plans for integrated package for initial design, interface development for detailed optimization, validation, and modeling research are presented.

  8. Common Ada Missile Packages. Phase 2. (CAMP-2). Volume 2. 11th Missile Demonstration

    DTIC Science & Technology

    1988-11-01

    The report describes the work performed, the results obtained, and the conclusions reached during the Common Ada Missile Packages Phase-2 (CAMP-2) contract ... The contract was performed between September 1985 and March 1988. The MDAC-STL CAMP program manager was Dr. Daniel G. McNicholl, Technology Branch... [Table residue: a development-tools checklist listing DEC Code Management System, Software Development Files, Development Status Database, and Smart Code Counter.]

  9. YAMM - Yet Another Menu Manager

    NASA Technical Reports Server (NTRS)

    Mazer, Alan S.; Weidner, Richard J.

    1991-01-01

    Yet Another Menu Manager (YAMM) is an application-independent menuing software package designed to remove much of the difficulty, and save much of the time, inherent in implementing the front ends of large software packages. It provides a complete menuing front end for a wide variety of applications, with provisions for independence from specific types of terminals, configurations that meet specific needs of users, and dynamic creation of menu trees. It consists of two parts: a description of the menu configuration and a body of application code. Written in C.

  10. STEMsalabim: A high-performance computing cluster friendly code for scanning transmission electron microscopy image simulations of thin specimens.

    PubMed

    Oelerich, Jan Oliver; Duschek, Lennart; Belz, Jürgen; Beyer, Andreas; Baranovskii, Sergei D; Volz, Kerstin

    2017-06-01

    We present a new multislice code for the computer simulation of scanning transmission electron microscope (STEM) images based on the frozen lattice approximation. Unlike existing software packages, the code is optimized to perform well on highly parallelized computing clusters, combining distributed and shared memory architectures. This enables efficient calculation of large lateral scanning areas of the specimen within the frozen lattice approximation and fine-grained sweeps of parameter space.

  11. 3D-PDR: Three-dimensional photodissociation region code

    NASA Astrophysics Data System (ADS)

    Bisbas, T. G.; Bell, T. A.; Viti, S.; Yates, J.; Barlow, M. J.

    2018-03-01

    3D-PDR is a three-dimensional photodissociation region code written in Fortran. It uses the Sundials package (written in C) to solve the set of ordinary differential equations and it is the successor of the one-dimensional PDR code UCL_PDR (ascl:1303.004). Using the HEALpix ray-tracing scheme (ascl:1107.018), 3D-PDR solves a three-dimensional escape probability routine and evaluates the attenuation of the far-ultraviolet radiation in the PDR and the propagation of FIR/submm emission lines out of the PDR. The code is parallelized (OpenMP) and can be applied to 1D and 3D problems.

  12. Custodial Management in the Information Age.

    ERIC Educational Resources Information Center

    Harris, Jim, Sr.

    1999-01-01

    Explains how computerizing the custodial department can be achieved through bar coding, hand-held readers, and the appropriate software packages. Software programs that aid cleaning management, track assets, and manage stock are discussed. (GR)

  13. 21 CFR 129.80 - Processes and controls.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... determine whether any of the coliform organisms are E. coli. (2) For chemical, physical, and radiological... bactericidal action to that required in paragraph (d)(3) of this section. (e) Unit package production code...

  14. 21 CFR 129.80 - Processes and controls.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... determine whether any of the coliform organisms are E. coli. (2) For chemical, physical, and radiological... bactericidal action to that required in paragraph (d)(3) of this section. (e) Unit package production code...

  15. PyMC: Bayesian Stochastic Modelling in Python

    PubMed Central

    Patil, Anand; Huard, David; Fonnesbeck, Christopher J.

    2010-01-01

    This user guide describes a Python package, PyMC, that allows users to efficiently code a probabilistic model and draw samples from its posterior distribution using Markov chain Monte Carlo techniques. PMID:21603108
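
    A minimal sketch of the workflow the guide describes, using the PyMC 2.x API of that era (the uniform-binomial model below is an illustrative example, not one from the guide):

        import pymc

        # Prior on an unknown success probability
        p = pymc.Uniform('p', lower=0.0, upper=1.0)

        # Observed data: 7 successes out of 10 trials
        obs = pymc.Binomial('obs', n=10, p=p, value=7, observed=True)

        # Draw posterior samples with Markov chain Monte Carlo
        mcmc = pymc.MCMC([p, obs])
        mcmc.sample(iter=20000, burn=5000)
        print(mcmc.trace('p')[:].mean())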

  16. Lessons learned from an Ada conversion project

    NASA Technical Reports Server (NTRS)

    Porter, Tim

    1988-01-01

    Background; SAVVAS architecture; software portability; history of Ada; isolation of non-portable code; simple terminal interface package; constraints of language features; and virtual interfaces are outlined. This presentation is represented by viewgraphs only.

  17. DFTBaby: A software package for non-adiabatic molecular dynamics simulations based on long-range corrected tight-binding TD-DFT(B)

    NASA Astrophysics Data System (ADS)

    Humeniuk, Alexander; Mitrić, Roland

    2017-12-01

    A software package, called DFTBaby, is published, which provides the electronic structure needed for running non-adiabatic molecular dynamics simulations at the level of tight-binding DFT. A long-range correction is incorporated to avoid spurious charge-transfer states. Excited state energies, their analytic gradients, and scalar non-adiabatic couplings are computed using tight-binding TD-DFT. These quantities are fed into a molecular dynamics code, which integrates Newton's equations of motion for the nuclei together with the electronic Schrödinger equation. Non-adiabatic effects are included by surface hopping. As an example, the program is applied to the optimization of excited states and non-adiabatic dynamics of polyfluorene. The Python and Fortran source code is available at http://www.dftbaby.chemie.uni-wuerzburg.de.

  18. PAREMD: A parallel program for the evaluation of momentum space properties of atoms and molecules

    NASA Astrophysics Data System (ADS)

    Meena, Deep Raj; Gadre, Shridhar R.; Balanarayan, P.

    2018-03-01

    The present work describes a code for evaluating the electron momentum density (EMD), its moments and the associated Shannon information entropy for a multi-electron molecular system. The code works specifically for electronic wave functions obtained from traditional electronic structure packages such as GAMESS and GAUSSIAN. For the momentum space orbitals, the general expression for Gaussian basis sets in position space is analytically Fourier transformed to momentum space Gaussian basis functions. The molecular orbital coefficients of the wave function are taken as an input from the output file of the electronic structure calculation. The analytic expressions of EMD are evaluated over a fine grid and the accuracy of the code is verified by a normalization check and a numerical kinetic energy evaluation which is compared with the analytic kinetic energy given by the electronic structure package. Apart from electron momentum density, electron density in position space has also been integrated into this package. The program is written in C++ and is executed through a Shell script. It is also tuned for multicore machines with shared memory through OpenMP. The program has been tested for a variety of molecules and correlated methods such as CISD, Møller-Plesset second order (MP2) theory and density functional methods. For correlated methods, the PAREMD program uses natural spin orbitals as an input. The program has been benchmarked for a variety of Gaussian basis sets for different molecules showing a linear speedup on a parallel architecture.
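
    The transform underlying the package is the standard one (quoted as background, not from the paper): the momentum-space orbital is the Fourier transform of its position-space counterpart, and a primitive s-type Gaussian stays Gaussian under this map,

        \phi(\mathbf{p}) = (2\pi)^{-3/2} \int \psi(\mathbf{r})\, e^{-i\mathbf{p}\cdot\mathbf{r}}\, d^3r,
        \qquad
        e^{-\alpha r^2} \;\longmapsto\; (2\alpha)^{-3/2}\, e^{-p^2/(4\alpha)},

    which is why the EMD can be evaluated analytically on a grid without numerical transforms.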

  19. Joint Communications Support Element: The Voice Heard Round the World

    DTIC Science & Technology

    2013-01-01

    Initial Entry Package (IEP), Early Entry Package (EEP), and Joint Mobility Package provide secure and nonsecure voice, video, and data to small mobile teams operating worldwide. The IEP and EEP can be rapidly scaled to meet force surge requirements from small dismounted teams up to an advance

  20. The Chip-Scale Atomic Clock - Low-Power Physics Package

    DTIC Science & Technology

    2004-12-01

    From the Proceedings of the 36th Annual Precise Time and Time Interval (PTTI) Meeting: "The Chip-Scale Atomic Clock - Low-Power Physics Package," R. Lutwak et al. [Only title-header and reference-list fragments survive in the source record, citing Lutwak et al. 2003 ("The Chip-Scale Atomic Clock - Coherent Population Trapping vs...") and Lutwak et al. 2004.]

  1. Mass decomposition of galaxies using DECA software package

    NASA Astrophysics Data System (ADS)

    Mosenkov, A. V.

    2014-01-01

    The new DECA software package, designed to perform photometric analysis of images of disk and elliptical galaxies having a regular structure, is presented. DECA is written in the interpreted language Python and combines the capabilities of several widely used packages for astronomical data processing, such as IRAF, SExtractor, and the GALFIT code, the last of which performs the two-dimensional decomposition of galaxy images into several photometric components (bulge+disk). DECA has the advantage that it can be applied to large samples of galaxies with different orientations with respect to the line of sight (including edge-on galaxies) and requires minimal human intervention. Examples of using the package to study a sample of simulated galaxy images and a sample of real objects demonstrate that DECA can be a reliable tool for the study of the structure of galaxies.
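
    For reference, bulge+disk decomposition of the kind DECA automates conventionally fits the sum of a Sérsic bulge and an exponential disk (standard profiles, quoted for orientation rather than from the DECA paper):

        I_{\mathrm{bulge}}(r) = I_e \exp\left\{-b_n\left[(r/r_e)^{1/n} - 1\right]\right\},
        \qquad
        I_{\mathrm{disk}}(r) = I_0 \exp(-r/h),

    where r_e is the bulge effective radius, n the Sérsic index, b_n a normalization constant that depends on n, and h the disk scale length.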

  2. The AGORA High-resolution Galaxy Simulations Comparison Project II: Isolated disk test

    DOE PAGES

    Kim, Ji-hoon; Agertz, Oscar; Teyssier, Romain; ...

    2016-12-20

    Using an isolated Milky Way-mass galaxy simulation, we compare results from 9 state-of-the-art gravito-hydrodynamics codes widely used in the numerical community. We utilize the infrastructure we have built for the AGORA High-resolution Galaxy Simulations Comparison Project. This includes the common disk initial conditions, common physics models (e.g., radiative cooling and UV background by the standardized package Grackle) and common analysis toolkit yt, all of which are publicly available. Subgrid physics models such as Jeans pressure floor, star formation, supernova feedback energy, and metal production are carefully constrained across code platforms. With numerical accuracy that resolves the disk scale height, we find that the codes overall agree well with one another in many dimensions including: gas and stellar surface densities, rotation curves, velocity dispersions, density and temperature distribution functions, disk vertical heights, stellar clumps, star formation rates, and Kennicutt-Schmidt relations. Quantities such as velocity dispersions are very robust (agreement within a few tens of percent at all radii) while measures like newly-formed stellar clump mass functions show more significant variation (difference by up to a factor of ~3). Systematic differences exist, for example, between mesh-based and particle-based codes in the low density region, and between more diffusive and less diffusive schemes in the high density tail of the density distribution. Yet intrinsic code differences are generally small compared to the variations in numerical implementations of the common subgrid physics such as supernova feedback. Lastly, our experiment reassures that, if adequately designed in accordance with our proposed common parameters, results of a modern high-resolution galaxy formation simulation are more sensitive to input physics than to intrinsic differences in numerical schemes.

  3. The AGORA High-resolution Galaxy Simulations Comparison Project II: Isolated disk test

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kim, Ji-hoon; Agertz, Oscar; Teyssier, Romain

    Using an isolated Milky Way-mass galaxy simulation, we compare results from 9 state-of-the-art gravito-hydrodynamics codes widely used in the numerical community. We utilize the infrastructure we have built for the AGORA High-resolution Galaxy Simulations Comparison Project. This includes the common disk initial conditions, common physics models (e.g., radiative cooling and UV background by the standardized package Grackle) and common analysis toolkit yt, all of which are publicly available. Subgrid physics models such as Jeans pressure floor, star formation, supernova feedback energy, and metal production are carefully constrained across code platforms. With numerical accuracy that resolves the disk scale height, we find that the codes overall agree well with one another in many dimensions including: gas and stellar surface densities, rotation curves, velocity dispersions, density and temperature distribution functions, disk vertical heights, stellar clumps, star formation rates, and Kennicutt-Schmidt relations. Quantities such as velocity dispersions are very robust (agreement within a few tens of percent at all radii) while measures like newly-formed stellar clump mass functions show more significant variation (difference by up to a factor of ~3). Systematic differences exist, for example, between mesh-based and particle-based codes in the low density region, and between more diffusive and less diffusive schemes in the high density tail of the density distribution. Yet intrinsic code differences are generally small compared to the variations in numerical implementations of the common subgrid physics such as supernova feedback. Lastly, our experiment reassures that, if adequately designed in accordance with our proposed common parameters, results of a modern high-resolution galaxy formation simulation are more sensitive to input physics than to intrinsic differences in numerical schemes.

  4. THE AGORA HIGH-RESOLUTION GALAXY SIMULATIONS COMPARISON PROJECT. II. ISOLATED DISK TEST

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kim, Ji-hoon; Agertz, Oscar; Teyssier, Romain

    Using an isolated Milky Way-mass galaxy simulation, we compare results from nine state-of-the-art gravito-hydrodynamics codes widely used in the numerical community. We utilize the infrastructure we have built for the AGORA High-resolution Galaxy Simulations Comparison Project. This includes the common disk initial conditions, common physics models (e.g., radiative cooling and UV background by the standardized package Grackle) and common analysis toolkit yt, all of which are publicly available. Subgrid physics models such as Jeans pressure floor, star formation, supernova feedback energy, and metal production are carefully constrained across code platforms. With numerical accuracy that resolves the disk scale height, we find that the codes overall agree well with one another in many dimensions including: gas and stellar surface densities, rotation curves, velocity dispersions, density and temperature distribution functions, disk vertical heights, stellar clumps, star formation rates, and Kennicutt–Schmidt relations. Quantities such as velocity dispersions are very robust (agreement within a few tens of percent at all radii) while measures like newly formed stellar clump mass functions show more significant variation (difference by up to a factor of ∼3). Systematic differences exist, for example, between mesh-based and particle-based codes in the low-density region, and between more diffusive and less diffusive schemes in the high-density tail of the density distribution. Yet intrinsic code differences are generally small compared to the variations in numerical implementations of the common subgrid physics such as supernova feedback. Our experiment reassures that, if adequately designed in accordance with our proposed common parameters, results of a modern high-resolution galaxy formation simulation are more sensitive to input physics than to intrinsic differences in numerical schemes.

  5. Schroedinger’s code: Source code availability and transparency in astrophysics

    NASA Astrophysics Data System (ADS)

    Ryan, PW; Allen, Alice; Teuben, Peter

    2018-01-01

    Astronomers use software for their research, but how many of the codes they use are available as source code? We examined a sample of 166 papers from 2015 for clearly identified software use, then searched for source code for the software packages mentioned in these research papers. We categorized the software to indicate whether source code is available for download and whether there are restrictions to accessing it, and, if source code was not available, whether some other form of the software, such as a binary, was. Over 40% of the source code for the software used in our sample was not available for download. As URLs have often been used as proxy citations for software, we also extracted URLs from one journal's 2015 research articles, removed those from certain long-term, reliable domains, and tested the remainder to determine what percentage of these URLs were still accessible in September and October 2017.
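
    A link-rot check of the kind described can be sketched in a few lines of Python (an illustrative sketch, not the authors' script; the URL list is a placeholder):

        import requests

        urls = ["http://example.org/code", "http://example.org/data"]  # placeholder

        accessible = 0
        for url in urls:
            try:
                r = requests.head(url, allow_redirects=True, timeout=10)
                if r.status_code < 400:
                    accessible += 1
            except requests.RequestException:
                pass  # connection failures count as inaccessible

        print(f"{accessible}/{len(urls)} URLs still accessible")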

  6. Study of SOL in DIII-D tokamak with SOLPS suite of codes.

    NASA Astrophysics Data System (ADS)

    Pankin, Alexei; Bateman, Glenn; Brennan, Dylan; Coster, David; Hogan, John; Kritz, Arnold; Kukushkin, Andrey; Schnack, Dalton; Snyder, Phil

    2005-10-01

    The scrape-off layer (SOL) region of the DIII-D tokamak is studied with the SOLPS integrated suite of codes. The SOLPS package includes the 3D multi-species Monte Carlo neutral code EIRENE and the 2D multi-fluid code B2, cross-coupled through the B2-EIRENE interface. The results of SOLPS simulations are used in the integrated modeling of the plasma edge in the DIII-D tokamak with the ASTRA transport code. Parameterized dependences for neutral particle fluxes computed with the SOLPS code are implemented in a model for the H-mode pedestal and ELMs [1] in the ASTRA code. The effects of neutrals on the H-mode pedestal and ELMs are studied in this report. [1] A. Y. Pankin, I. Voitsekhovitch, G. Bateman, et al., Plasma Phys. Control. Fusion 47, 483 (2005).

  7. Xyce Parallel Electronic Simulator Users' Guide Version 6.8

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Keiter, Eric R.; Aadithya, Karthik Venkatraman; Mei, Ting

    This manual describes the use of the Xyce Parallel Electronic Simulator. Xyce has been designed as a SPICE-compatible, high-performance analog circuit simulator, and has been written to support the simulation needs of the Sandia National Laboratories electrical designers. This development has focused on improving capability over the current state of the art in the following areas: the capability to solve extremely large circuit problems by supporting large-scale parallel computing platforms (up to thousands of processors), including most popular parallel and serial computers; a differential-algebraic-equation (DAE) formulation, which better isolates the device model package from solver algorithms and allows one to develop new types of analysis without requiring the implementation of analysis-specific device models; device models that are specifically tailored to meet Sandia's needs, including some radiation-aware devices (for Sandia users only); and object-oriented code design and implementation using modern coding practices. Xyce is a parallel code in the most general sense of the phrase: a message-passing parallel implementation, which allows it to run efficiently on a wide range of computing platforms, including serial, shared-memory, and distributed-memory parallel platforms. Attention has been paid to the specific nature of circuit-simulation problems to ensure that optimal parallel efficiency is achieved as the number of processors grows.
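
    The DAE formulation mentioned above takes, in the usual SPICE-style statement of the problem (a standard form paraphrased here, not quoted from the Xyce manual),

        \frac{d}{dt}\, q\big(x(t)\big) + f\big(x(t)\big) - b(t) = 0,

    where x(t) collects node voltages and branch currents, q the device charges and fluxes, f the resistive device currents, and b(t) the independent sources. Because devices contribute only q, f, and their derivatives, new analysis types can be added without touching the device model package.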

  8. Monte Carlo Shielding Comparative Analysis Applied to TRIGA HEU and LEU Spent Fuel Transport

    NASA Astrophysics Data System (ADS)

    Margeanu, C. A.; Margeanu, S.; Barbos, D.; Iorgulis, C.

    2010-12-01

    The paper is a comparative study of the effects of LEU versus HEU fuel utilization on the shielding analysis for spent fuel transport. A comparison against measured data for HEU spent fuel, available from the last stage of spent fuel repatriation completed in the summer of 2008, is also presented. All geometrical and material data for the shipping cask were taken from the approved NAC-LWT cask model. The shielding analysis estimates radiation doses at the shipping cask wall surface and in air at 1 m and 2 m, respectively, from the cask, by means of the 3D Monte Carlo MORSE-SGC code. Before loading into the shipping cask, TRIGA spent fuel source terms and spent fuel parameters were obtained by means of the ORIGEN-S code. Both codes are included in ORNL's SCALE 5 package. The actinide contribution to total fuel radioactivity is very low for HEU spent fuel, becoming 10 times greater for LEU spent fuel. Dose rates for both HEU and LEU fuel contents are below regulatory limits, with LEU spent fuel photon dose rates greater than the HEU ones. Comparison between HEU spent fuel theoretical and measured dose rates at selected measuring points shows good agreement, with calculated values greater than the measured ones both at the cask wall surface (about 34% relative difference) and in air at 1 m from the cask surface (about 15% relative difference).

  9. Scoria: a Python module for manipulating 3D molecular data.

    PubMed

    Ropp, Patrick; Friedman, Aaron; Durrant, Jacob D

    2017-09-18

    Third-party packages have transformed the Python programming language into a powerful computational-biology tool. Package installation is easy for experienced users, but novices sometimes struggle with dependencies and compilers. This presents a barrier that can hinder the otherwise broad adoption of new tools. We present Scoria, a Python package for manipulating three-dimensional molecular data. Unlike similar packages, Scoria requires no dependencies, compilation, or system-wide installation. One can incorporate the Scoria source code directly into their own programs. But Scoria is not designed to compete with other similar packages. Rather, it complements them. Our package leverages others (e.g. NumPy, SciPy), if present, to speed and extend its own functionality. To show its utility, we use Scoria to analyze a molecular dynamics trajectory. Our FootPrint script colors the atoms of one chain by the frequency of their contacts with a second chain. We are hopeful that Scoria will be a useful tool for the computational-biology community. A copy is available for download free of charge (Apache License 2.0) at http://durrantlab.com/scoria/.

  10. Simulation of nonlinear propagation of biomedical ultrasound using pzflex and the Khokhlov-Zabolotskaya-Kuznetsov Texas code

    PubMed Central

    Qiao, Shan; Jackson, Edward; Coussios, Constantin C.; Cleveland, Robin O.

    2016-01-01

    Nonlinear acoustics plays an important role in both diagnostic and therapeutic applications of biomedical ultrasound and a number of research and commercial software packages are available. In this manuscript, predictions of two solvers available in a commercial software package, pzflex, one using the finite-element-method (FEM) and the other a pseudo-spectral method, spectralflex, are compared with measurements and the Khokhlov-Zabolotskaya-Kuznetsov (KZK) Texas code (a finite-difference time-domain algorithm). The pzflex methods solve the continuity equation, momentum equation and equation of state where they account for nonlinearity to second order whereas the KZK code solves a nonlinear wave equation with a paraxial approximation for diffraction. Measurements of the field from a single element 3.3 MHz focused transducer were compared with the simulations and there was good agreement for the fundamental frequency and the harmonics; however the FEM pzflex solver incurred a high computational cost to achieve equivalent accuracy. In addition, pzflex results exhibited non-physical oscillations in the spatial distribution of harmonics when the amplitudes were relatively low. It was found that spectralflex was able to accurately capture the nonlinear fields at reasonable computational cost. These results emphasize the need to benchmark nonlinear simulations before using codes as predictive tools. PMID:27914432
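
    For readers unfamiliar with it, the KZK equation referred to above is conventionally written (standard form; the Texas code's exact nondimensionalization may differ)

        \frac{\partial^2 p}{\partial z\,\partial\tau}
          = \frac{c_0}{2}\,\nabla_\perp^2 p
          + \frac{\delta}{2 c_0^3}\,\frac{\partial^3 p}{\partial \tau^3}
          + \frac{\beta}{2 \rho_0 c_0^3}\,\frac{\partial^2 p^2}{\partial \tau^2},

    where p is the acoustic pressure, z the propagation direction, \tau = t - z/c_0 the retarded time, \delta the sound diffusivity, and \beta the coefficient of nonlinearity; the \nabla_\perp^2 term is the paraxial diffraction term mentioned in the abstract.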

  11. Simulation of nonlinear propagation of biomedical ultrasound using pzflex and the Khokhlov-Zabolotskaya-Kuznetsov Texas code.

    PubMed

    Qiao, Shan; Jackson, Edward; Coussios, Constantin C; Cleveland, Robin O

    2016-09-01

    Nonlinear acoustics plays an important role in both diagnostic and therapeutic applications of biomedical ultrasound and a number of research and commercial software packages are available. In this manuscript, predictions of two solvers available in a commercial software package, pzflex, one using the finite-element-method (FEM) and the other a pseudo-spectral method, spectralflex, are compared with measurements and the Khokhlov-Zabolotskaya-Kuznetsov (KZK) Texas code (a finite-difference time-domain algorithm). The pzflex methods solve the continuity equation, momentum equation and equation of state where they account for nonlinearity to second order whereas the KZK code solves a nonlinear wave equation with a paraxial approximation for diffraction. Measurements of the field from a single element 3.3 MHz focused transducer were compared with the simulations and there was good agreement for the fundamental frequency and the harmonics; however the FEM pzflex solver incurred a high computational cost to achieve equivalent accuracy. In addition, pzflex results exhibited non-physical oscillations in the spatial distribution of harmonics when the amplitudes were relatively low. It was found that spectralflex was able to accurately capture the nonlinear fields at reasonable computational cost. These results emphasize the need to benchmark nonlinear simulations before using codes as predictive tools.

  12. WannierTools: An open-source software package for novel topological materials

    NASA Astrophysics Data System (ADS)

    Wu, QuanSheng; Zhang, ShengNan; Song, Hai-Feng; Troyer, Matthias; Soluyanov, Alexey A.

    2018-03-01

    We present an open-source software package, WannierTools, a tool for the investigation of novel topological materials. The code works in the tight-binding framework; the tight-binding model can be generated by another software package, Wannier90 (Mostofi et al., 2008). WannierTools can help to classify the topological phase of a given material by calculating the Wilson loop, and can obtain the surface-state spectrum that is detected in angle-resolved photoemission spectroscopy (ARPES) and scanning tunneling microscopy (STM) experiments. It also identifies the positions of Weyl/Dirac points and nodal-line structures, and calculates the Berry phase around a closed momentum loop and the Berry curvature in a part of the Brillouin zone (BZ).
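
    The discretized Berry phase that such codes compute can be sketched in a few lines for a generic two-band model; the Hamiltonian and loop below are illustrative choices, not WannierTools' interface:

        import numpy as np

        sx = np.array([[0, 1], [1, 0]], complex)
        sy = np.array([[0, -1j], [1j, 0]])
        sz = np.array([[1, 0], [0, -1]], complex)

        def h(kx, ky, m=-1.0):
            # illustrative two-band tight-binding model H(k) = d(k) . sigma
            return np.sin(kx) * sx + np.sin(ky) * sy + (m + np.cos(kx) + np.cos(ky)) * sz

        def lower_band_state(kx, ky):
            w, v = np.linalg.eigh(h(kx, ky))
            return v[:, 0]                       # eigenvector of the lower band

        # gauge-invariant discrete Berry phase: phi = -Im log prod <u_i|u_{i+1}>
        theta = np.linspace(0.0, 2.0 * np.pi, 201)[:-1]
        loop = [(0.5 * np.cos(t) + np.pi / 2, 0.5 * np.sin(t)) for t in theta]
        states = [lower_band_state(kx, ky) for kx, ky in loop]
        prod = 1.0 + 0j
        for u1, u2 in zip(states, states[1:] + states[:1]):
            prod *= np.vdot(u1, u2)
        print(-np.angle(prod))                   # Berry phase around the loop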

  13. fastBMA: scalable network inference and transitive reduction.

    PubMed

    Hung, Ling-Hong; Shi, Kaiyuan; Wu, Migao; Young, William Chad; Raftery, Adrian E; Yeung, Ka Yee

    2017-10-01

    Inferring genetic networks from genome-wide expression data is extremely demanding computationally. We have developed fastBMA, a distributed, parallel, and scalable implementation of Bayesian model averaging (BMA) for this purpose. fastBMA also includes a computationally efficient module for eliminating redundant indirect edges in the network by mapping the transitive reduction to an easily solved shortest-path problem. We evaluated the performance of fastBMA on synthetic data and on experimental genome-wide time series yeast and human datasets. When using a single CPU core, fastBMA is up to 100 times faster than the next fastest method, LASSO, with increased accuracy. It is a memory-efficient, parallel, and distributed application that scales to human genome-wide expression data. A 10 000-gene regulatory network can be obtained in a matter of hours using a 32-core cloud cluster (2 nodes of 16 cores). fastBMA is a significant improvement over its predecessor ScanBMA. It is more accurate and orders of magnitude faster than other fast network inference methods such as the one based on LASSO. The improved scalability allows it to calculate networks from genome-scale data in a reasonable time frame. The transitive reduction method can improve accuracy in denser networks. fastBMA is available as code (M.I.T. license) from GitHub (https://github.com/lhhunghimself/fastBMA), as part of the updated networkBMA Bioconductor package (https://www.bioconductor.org/packages/release/bioc/html/networkBMA.html) and as ready-to-deploy Docker images (https://hub.docker.com/r/biodepot/fastbma/). © The Authors 2017. Published by Oxford University Press.
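
    The shortest-path view of transitive reduction can be sketched as follows; this is a simplified illustration, not fastBMA's implementation. Edge confidences map to weights -log(p), and a direct edge is dropped when some indirect route is at least as strong:

        import math

        def transitive_reduction(edges, n):
            """edges: {(u, v): p} with edge confidence p in (0, 1]; nodes are 0..n-1.
            Drops (u, v) when an indirect path has total weight <= the direct edge."""
            INF = math.inf
            w = [[INF] * n for _ in range(n)]
            for (u, v), p in edges.items():
                w[u][v] = -math.log(p)
            d = [row[:] for row in w]            # all-pairs shortest paths
            for k in range(n):                   # (Floyd-Warshall)
                for i in range(n):
                    for j in range(n):
                        if d[i][k] + d[k][j] < d[i][j]:
                            d[i][j] = d[i][k] + d[k][j]
            return {(u, v): p for (u, v), p in edges.items()
                    if not any(w[u][k] + d[k][v] <= w[u][v]
                               for k in range(n) if k not in (u, v))}

        # the weak direct edge (0, 2) is removed: the path 0 -> 1 -> 2 is stronger
        print(transitive_reduction({(0, 1): 0.9, (1, 2): 0.9, (0, 2): 0.5}, 3))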

  14. Beyond Widgets -- Systems Incentive Programs for Utilities

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Regnier, Cindy; Mathew, Paul; Robinson, Alastair

    Utility incentive programs remain one of the most significant means of deploying commercialized, but underutilized, building technologies to scale. However, these programs have been largely limited to component-based products (e.g., lamps, RTUs). While some utilities do provide ‘custom’ incentive programs with whole-building and system-level technical assistance, these programs require deeper levels of analysis, resulting in higher program costs. As a result, custom programs are restricted to utilities with greater resources and are typically applied mainly to large or energy-intensive facilities, leaving much of the market without cost-effective access to, and incentives for, these solutions. In addition, with increasingly stringent energy codes, cost-effective component-based solutions that achieve significant savings are dwindling. Building systems (e.g., integrated façade, HVAC and/or lighting solutions) can deliver higher savings that translate into large sector-wide savings if deployed at the scale of these programs. However, systems application poses a number of challenges: baseline energy use must be defined and measured; the metrics for energy and performance must be defined and tested against; and system savings must be validated under well-understood conditions. This paper presents a sample of findings of a project to develop validated utility incentive program packages for three specific integrated building systems, in collaboration with Xcel Energy (CO, MN), ComEd, and a consortium of California Public Owned Utilities (CA POUs) (the Northern California Power Agency (NCPA) and the Southern California Public Power Authority (SCPPA)). These program packages consist of system specifications, system performance, M&V protocols, streamlined assessment methods, market assessment and implementation guidance.

  15. A Comparative Study of Inspection Techniques for Array Packages

    NASA Technical Reports Server (NTRS)

    Mohammed, Jelila; Green, Christopher

    2008-01-01

    This viewgraph presentation reviews inspection techniques for Column Grid Array (CGA) packages. The CGA is a method of chip-scale packaging that uses high-temperature solder columns to attach the part to the board. It is becoming more popular than other techniques (e.g., quad flat pack (QFP) or ball grid array (BGA)). However, there are environmental stresses and workmanship challenges that require good inspection techniques for these packages.

  16. Morse Monte Carlo Radiation Transport Code System

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Emmett, M.B.

    1975-02-01

    The report contains sections with descriptions of the MORSE and PICTURE codes, input descriptions, sample problems, derivations of the physical equations and explanations of the various error messages. The MORSE code is a multipurpose neutron and gamma-ray transport Monte Carlo code. Time dependence for both shielding and criticality problems is provided. General three-dimensional geometry may be used, with an albedo option available at any material surface. The PICTURE code provides aid in preparing correct input data for the combinatorial geometry package CG. It provides a printed view of arbitrary two-dimensional slices through the geometry. By inspecting these pictures one may determine if the geometry specified by the input cards is indeed the desired geometry. 23 refs. (WRF)

  17. Coded Modulation in C and MATLAB

    NASA Technical Reports Server (NTRS)

    Hamkins, Jon; Andrews, Kenneth S.

    2011-01-01

    This software, written separately in C and MATLAB as stand-alone packages with equivalent functionality, implements encoders and decoders for a set of nine error-correcting codes and modulators and demodulators for five modulation types. The software can be used as a single program to simulate the performance of such coded modulation. The error-correcting codes implemented are the nine accumulate repeat-4 jagged accumulate (AR4JA) low-density parity-check (LDPC) codes, which have been approved for international standardization by the Consultative Committee for Space Data Systems, and which are scheduled to fly on a series of NASA missions in the Constellation Program. The software implements the encoder and decoder functions, and contains compressed versions of generator and parity-check matrices used in these operations.
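
    The encode/syndrome machinery common to all linear block codes can be illustrated with a toy (7,4) Hamming code; the AR4JA LDPC codes in the package are far larger but algebraically analogous. A sketch, not the package's code:

        import numpy as np

        # systematic generator G = [I | P] and parity-check H = [P^T | I]
        G = np.array([[1, 0, 0, 0, 1, 1, 0],
                      [0, 1, 0, 0, 1, 0, 1],
                      [0, 0, 1, 0, 0, 1, 1],
                      [0, 0, 0, 1, 1, 1, 1]])
        H = np.array([[1, 1, 0, 1, 1, 0, 0],
                      [1, 0, 1, 1, 0, 1, 0],
                      [0, 1, 1, 1, 0, 0, 1]])

        msg = np.array([1, 0, 1, 1])
        codeword = msg @ G % 2          # encode
        print(H @ codeword % 2)         # all-zero syndrome: valid codeword

        received = codeword.copy()
        received[2] ^= 1                # introduce a single bit error
        print(H @ received % 2)         # nonzero syndrome flags the error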

  18. Program Package for 3d PIC Model of Plasma Fiber

    NASA Astrophysics Data System (ADS)

    Kulhánek, Petr; Břeň, David

    2007-08-01

    A fully three-dimensional particle-in-cell (PIC) model of the plasma fiber has been developed. The code is written in FORTRAN 95, implemented in CVF (Compaq Visual Fortran) under the Microsoft Visual Studio user interface. Five particle solvers and two field solvers are included in the model; the solvers have relativistic and non-relativistic variants. The model can deal with both periodic and non-periodic boundary conditions. The mechanism of surface-turbulence generation in the plasma fiber was successfully simulated with the PIC program package.
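
    The particle update at the heart of many PIC codes is the Boris push; a minimal non-relativistic Python sketch of that step (illustrative only, not the Fortran package's solver):

        import numpy as np

        def boris_push(x, v, E, B, q, m, dt):
            """One step of the non-relativistic Boris particle push."""
            qmdt2 = q * dt / (2.0 * m)
            v_minus = v + qmdt2 * E                      # half electric kick
            t = qmdt2 * B                                # rotation vector
            s = 2.0 * t / (1.0 + np.dot(t, t))
            v_prime = v_minus + np.cross(v_minus, t)
            v_plus = v_minus + np.cross(v_prime, s)      # magnetic rotation
            v_new = v_plus + qmdt2 * E                   # second half electric kick
            return x + v_new * dt, v_new

        x, v = np.zeros(3), np.array([1.0, 0.0, 0.0])
        for _ in range(100):                             # gyration in a uniform B field
            x, v = boris_push(x, v, E=np.zeros(3), B=np.array([0.0, 0.0, 1.0]),
                              q=-1.0, m=1.0, dt=0.05)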

  19. A Tutorial on RxODE: Simulating Differential Equation Pharmacometric Models in R.

    PubMed

    Wang, W; Hallow, K M; James, D A

    2016-01-01

    This tutorial presents the application of an R package, RxODE, that facilitates quick, efficient simulations of ordinary differential equation models completely within R. Its application is illustrated through simulation of design decision effects on an adaptive dosing regimen. The package provides an efficient, versatile way to specify dosing scenarios and to perform simulation with variability with minimal custom coding. Models can be directly translated to R Shiny applications to facilitate interactive, real-time evaluation/iteration on simulation scenarios.
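
    RxODE itself is an R package; the kind of ODE model it specifies can be sketched in Python for illustration, here a one-compartment model with first-order absorption and hypothetical parameter values:

        import numpy as np
        from scipy.integrate import solve_ivp

        ka, CL, V = 1.0, 5.0, 40.0          # 1/h, L/h, L (illustrative values)

        def pk_rhs(t, y):
            gut, central = y
            return [-ka * gut,               # absorption from the gut depot
                    ka * gut - (CL / V) * central]

        sol = solve_ivp(pk_rhs, (0.0, 24.0), [100.0, 0.0], dense_output=True)
        times = np.linspace(0.0, 24.0, 97)
        conc = sol.sol(times)[1] / V         # plasma concentration (mg/L)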

  20. Influence of Interpretation Aids on Attentional Capture, Visual Processing, and Understanding of Front-of-Package Nutrition Labels.

    PubMed

    Antúnez, Lucía; Giménez, Ana; Maiche, Alejandro; Ares, Gastón

    2015-01-01

    To study the influence of 2 interpretational aids of front-of-package (FOP) nutrition labels (color code and text descriptors) on attentional capture and consumers' understanding of nutritional information. A full factorial design was used to assess the influence of color code and text descriptors using visual search and eye tracking. Ten trained assessors participated in the visual search study and 54 consumers completed the eye-tracking study. In the visual search study, assessors were asked to indicate whether there was a label high in fat within sets of mayonnaise labels with different FOP labels. In the eye-tracking study, assessors answered a set of questions about the nutritional content of labels. The researchers used logistic regression to evaluate the influence of interpretational aids of FOP nutrition labels on the percentage of correct answers. Analyses of variance were used to evaluate the influence of the studied variables on attentional measures and participants' response times. Response times were significantly higher for monochromatic FOP labels compared with color-coded ones (3,225 vs 964 ms; P < .001), which suggests that color codes increase attentional capture. The highest number and duration of fixations and visits were recorded on labels that did not include color codes or text descriptors (P < .05). The lowest percentage of incorrect answers was observed when the nutrient level was indicated using color code and text descriptors (P < .05). The combination of color codes and text descriptors seems to be the most effective alternative to increase attentional capture and understanding of nutritional information. Copyright © 2015 Society for Nutrition Education and Behavior. Published by Elsevier Inc. All rights reserved.

  1. spMC: an R-package for 3D lithological reconstructions based on spatial Markov chains

    NASA Astrophysics Data System (ADS)

    Sartore, Luca; Fabbri, Paolo; Gaetan, Carlo

    2016-09-01

    The paper presents the spatial Markov chains (spMC) R-package and a case study of subsoil simulation/prediction at a plain site in Northeastern Italy. spMC is a fairly complete collection of advanced methods for data inspection; in addition, spMC implements Markov chain models to estimate experimental transition probabilities of categorical lithological data. Furthermore, simulation methods based on well-known prediction methods (such as indicator kriging and co-kriging) are implemented in the spMC package. Other, more advanced methods are also available for simulations, e.g. path methods and Bayesian procedures that exploit maximum entropy. Since the spMC package was developed for intensive geostatistical computations, part of the code is implemented for parallel computation via OpenMP constructs. A final analysis of computational efficiency compares the simulation/prediction algorithms using different numbers of CPU cores, considering the example data set of the case study included in the package.
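
    The basic ingredient, estimating one-step transition probabilities from a categorical sequence, can be sketched in a few lines; this is a Python illustration of the general idea, not spMC's R code:

        import numpy as np

        def transition_matrix(sequence, categories):
            """Estimate one-step transition probabilities of a categorical sequence
            (e.g., lithological classes logged down a borehole)."""
            idx = {c: i for i, c in enumerate(categories)}
            counts = np.zeros((len(categories), len(categories)))
            for a, b in zip(sequence, sequence[1:]):
                counts[idx[a], idx[b]] += 1
            rows = counts.sum(axis=1, keepdims=True)
            return np.divide(counts, rows, out=np.zeros_like(counts), where=rows > 0)

        print(transition_matrix(list("AABBBCABB"), ["A", "B", "C"]))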

  2. PAT-1 safety analysis report addendum.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Weiner, Ruth F.; Schmale, David T.; Kalan, Robert J.

    2010-09-01

    The Plutonium Air Transportable Package, Model PAT-1, is certified under Title 10, Code of Federal Regulations Part 71 by the U.S. Nuclear Regulatory Commission (NRC) per Certificate of Compliance (CoC) USA/0361B(U)F-96 (currently Revision 9). The purpose of this SAR Addendum is to incorporate plutonium (Pu) metal as a new payload for the PAT-1 package. The Pu metal is packed in an inner container (designated the T-Ampoule) that replaces the PC-1 inner container. The documentation and analysis results contained in this addendum demonstrate that the replacement of the PC-1 and associated packaging material with the T-Ampoule and associated packaging, together with the addition of the plutonium metal content, is not significant with respect to the design, operating characteristics, or safe performance of the containment system and the prevention of criticality when the package is subjected to the tests specified in 10 CFR 71.71, 71.73 and 71.74.

  3. Evolution of a modular software network

    PubMed Central

    Fortuna, Miguel A.; Bonachela, Juan A.; Levin, Simon A.

    2011-01-01

    “Evolution behaves like a tinkerer” (François Jacob, Science, 1977). Software systems provide a singular opportunity to understand biological processes using concepts from network theory. The Debian GNU/Linux operating system allows us to explore the evolution of a complex network in a unique way. The modular design detected during its growth is based on the reuse of existing code in order to minimize costs during programming. The increase of modularity experienced by the system over time has not counterbalanced the increase in incompatibilities between software packages within modules. This negative effect is far from being a failure of design. A random process of package installation shows that the higher the modularity, the larger the fraction of packages working properly in a local computer. The decrease in the relative number of conflicts between packages from different modules avoids a failure in the functionality of one package spreading throughout the entire system. Some potential analogies with the evolutionary and ecological processes determining the structure of ecological networks of interacting species are discussed. PMID:22106260

  4. Is it Code Imperfection or 'garbage in Garbage Out'? Outline of Experiences from a Comprehensive Adr Code Verification

    NASA Astrophysics Data System (ADS)

    Zamani, K.; Bombardelli, F. A.

    2013-12-01

    The ADR equation describes many physical phenomena of interest in the field of water quality in natural streams and groundwater. In many cases, such as density-driven flow, multiphase reactive transport, and sediment transport, one or more terms in the ADR equation may become nonlinear. For that reason, numerical tools are the only practical choice for solving these PDEs. All numerical solvers developed for the transport equation need to undergo a code verification procedure before they are put into practice. Code verification is a mathematical activity to uncover failures and check for rigorous discretization of PDEs and implementation of initial/boundary conditions. In the computational context, PDE verification is not a well-defined procedure with a clear path. Thus, verification tests should be designed and implemented with in-depth knowledge of the numerical algorithms and the physics of the phenomena, as well as the mathematical behavior of the solution. Even test results need to be mathematically analyzed to distinguish between an inherent limitation of an algorithm and a coding error. It is therefore well known that code verification is a state-of-the-art activity in which innovative methods and case-based tricks are very common. This study presents the full verification of a general transport code. To that end, a complete test suite was designed to probe the ADR solver comprehensively and discover all possible imperfections. In this study we convey our experiences in finding several errors which were not detectable with routine verification techniques. We developed a test suite including hundreds of unit tests and system tests. The test package increases gradually in complexity, such that tests start simple and build up to the most sophisticated level. Appropriate verification metrics are defined for the required capabilities of the solver, as follows: mass conservation, convergence order, capability in handling stiff problems, nonnegative concentration, shape preservation, and absence of spurious wiggles. Thereby, we provide objective, quantitative values as opposed to subjective qualitative descriptions such as 'weak' or 'satisfactory' agreement with those metrics. We start testing from a simple case of unidirectional advection, then bidirectional advection and tidal flow, and build up to nonlinear cases. We design tests to check nonlinearity in velocity, dispersivity and reactions. For all of the mentioned cases we conduct mesh-convergence tests; these tests compare the results' observed order of accuracy against the formal order of accuracy of the discretization. The concealing effect of scales (Peclet and Damkohler numbers) on the mesh-convergence study and appropriate remedies are also discussed. For the cases in which appropriate benchmarks for a mesh-convergence study are not available, we utilize symmetry, complete Richardson extrapolation and the method of false injection to uncover bugs. Detailed discussions of the capabilities of the mentioned code verification techniques are given. Auxiliary subroutines for automation of the test suite and report generation were designed. All in all, the test package is not only a robust tool for code verification but also provides comprehensive insight into the capabilities of ADR solvers. Such information is essential for any rigorous computational modeling of the ADR equation for surface/subsurface pollution transport.
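
    A mesh-convergence test of the kind described compares the observed order of accuracy, p = log(E_coarse/E_fine)/log(r), with the formal order of the scheme. A self-contained sketch for first-order upwind advection, illustrative rather than the verified code itself:

        import numpy as np

        def upwind_error(nx, u=1.0, cfl=0.5, t_final=0.25):
            """Max-norm error of first-order upwind for c_t + u c_x = 0,
            periodic on [0, 1), against the exact translated solution."""
            dx = 1.0 / nx
            dt = cfl * dx / u
            x = np.arange(nx) * dx
            c = np.sin(2 * np.pi * x)
            t = 0.0
            while t < t_final - 1e-12:
                step = min(dt, t_final - t)
                c = c - u * step / dx * (c - np.roll(c, 1))
                t += step
            return np.max(np.abs(c - np.sin(2 * np.pi * (x - u * t_final))))

        errors = {nx: upwind_error(nx) for nx in (50, 100, 200, 400)}
        for coarse, fine in zip((50, 100, 200), (100, 200, 400)):
            p = np.log(errors[coarse] / errors[fine]) / np.log(2.0)
            print(f"nx {coarse}->{fine}: observed order p = {p:.2f}")  # ~1 for upwind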

  5. 49 CFR 178.910 - Marking of Large Packagings.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... stamped or embossed, the capital letters “UN” may be applied instead of the symbol; (ii) The code number...) The country authorizing the allocation of the mark. The letters “USA” indicate that the Large...

  6. 49 CFR 178.910 - Marking of Large Packagings.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... stamped or embossed, the capital letters “UN” may be applied instead of the symbol; (ii) The code number...) The country authorizing the allocation of the mark. The letters “USA” indicate that the Large...

  7. Rapid Harmonic Analysis of Piezoelectric MEMS Resonators.

    PubMed

    Puder, Jonathan M; Pulskamp, Jeffrey S; Rudy, Ryan Q; Cassella, Cristian; Rinaldi, Matteo; Chen, Guofeng; Bhave, Sunil A; Polcawich, Ronald G

    2018-06-01

    This paper reports on a novel simulation method combining the speed of analytical evaluation with the accuracy of finite-element analysis (FEA), known as the rapid analytical-FEA technique (RAFT). We detail the ability of the RAFT to accurately predict frequency response orders of magnitude faster than conventional simulation methods while providing insights into device design not possible with other types of analysis. Simulation results from the RAFT across wide bandwidths are compared to measured results from resonators fabricated with various materials, frequencies, and topologies, with good agreement; these include resonators targeting beam-extension, disk-flexure, and Lamé beam modes. An example scaling analysis is presented, and other applications enabled by the method are discussed as well. The supplemental material includes example code for implementation in ANSYS, although any commonly employed FEA package may be used.

  8. Browndye: A software package for Brownian dynamics

    NASA Astrophysics Data System (ADS)

    Huber, Gary A.; McCammon, J. Andrew

    2010-11-01

    A new software package, Browndye, is presented for simulating the diffusional encounter of two large biological molecules. It can be used to estimate second-order rate constants and encounter probabilities, and to explore reaction trajectories. Browndye builds upon previous knowledge and algorithms from software packages such as UHBD, SDA, and Macrodox, while implementing algorithms that scale to larger systems.

    Program summary
    Program title: Browndye
    Catalogue identifier: AEGT_v1_0
    Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AEGT_v1_0.html
    Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland
    Licensing provisions: MIT license, included in distribution
    No. of lines in distributed program, including test data, etc.: 143 618
    No. of bytes in distributed program, including test data, etc.: 1 067 861
    Distribution format: tar.gz
    Programming language: C++, OCaml (http://caml.inria.fr/)
    Computer: PC, Workstation, Cluster
    Operating system: Linux
    Has the code been vectorised or parallelized?: Yes. Runs on multiple processors with shared memory using pthreads
    RAM: Depends linearly on size of physical system
    Classification: 3
    External routines: uses the output of APBS [1] (http://www.poissonboltzmann.org/apbs/) as input. APBS must be obtained and installed separately. Expat 2.0.1, CLAPACK, ocaml-expat, and the Mersenne Twister are included in the Browndye distribution.
    Nature of problem: Exploration and determination of rate constants of bimolecular interactions involving large biological molecules.
    Solution method: Brownian dynamics with electrostatic, excluded volume, van der Waals, and desolvation forces.
    Running time: Depends linearly on size of physical system and quadratically on precision of results. The included example executes in a few minutes.
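
    The basic propagation step of Brownian dynamics is an overdamped Langevin (Ermak-McCammon type) update; a minimal sketch of that step in Python, not Browndye's C++/OCaml implementation:

        import numpy as np

        def bd_step(x, force, D, kT, dt, rng):
            """One overdamped Langevin step: deterministic drift D*F/kT*dt plus a
            Gaussian random displacement of variance 2*D*dt per axis."""
            return x + (D / kT) * force * dt + rng.normal(0.0, np.sqrt(2.0 * D * dt), size=3)

        rng = np.random.default_rng(0)
        x = np.zeros(3)
        for _ in range(1000):                       # free diffusion (zero force)
            x = bd_step(x, force=np.zeros(3), D=0.1, kT=1.0, dt=1e-3, rng=rng)
        print(x)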

  9. GENIE: a software package for gene-gene interaction analysis in genetic association studies using multiple GPU or CPU cores.

    PubMed

    Chikkagoudar, Satish; Wang, Kai; Li, Mingyao

    2011-05-26

    Gene-gene interaction in genetic association studies is computationally intensive when a large number of SNPs are involved. Most of the latest Central Processing Units (CPUs) have multiple cores, whereas Graphics Processing Units (GPUs) also have hundreds of cores and have been recently used to implement faster scientific software. However, currently there are no genetic analysis software packages that allow users to fully utilize the computing power of these multi-core devices for genetic interaction analysis for binary traits. Here we present a novel software package GENIE, which utilizes the power of multiple GPU or CPU processor cores to parallelize the interaction analysis. GENIE reads an entire genetic association study dataset into memory and partitions the dataset into fragments with non-overlapping sets of SNPs. For each fragment, GENIE analyzes: 1) the interaction of SNPs within it in parallel, and 2) the interaction between the SNPs of the current fragment and other fragments in parallel. We tested GENIE on a large-scale candidate gene study on high-density lipoprotein cholesterol. Using an NVIDIA Tesla C1060 graphics card, the GPU mode of GENIE achieves a speedup of 27 times over its single-core CPU mode run. GENIE is open-source, economical, user-friendly, and scalable. Since the computing power and memory capacity of graphics cards are increasing rapidly while their cost is going down, we anticipate that GENIE will achieve greater speedups with faster GPU cards. Documentation, source code, and precompiled binaries can be downloaded from http://www.cceb.upenn.edu/~mli/software/GENIE/.
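
    The fragment-based parallelization can be sketched with Python's multiprocessing; the pair statistic below is a toy placeholder, as GENIE's actual interaction tests for binary traits differ:

        import itertools
        import numpy as np
        from multiprocessing import Pool

        rng = np.random.default_rng(1)
        genotypes = rng.integers(0, 3, size=(500, 40))      # samples x SNPs (toy data)
        fragments = np.array_split(np.arange(genotypes.shape[1]), 4)

        def interaction_stat(pair):
            """Toy placeholder statistic for a SNP pair."""
            i, j = pair
            return i, j, abs(np.corrcoef(genotypes[:, i], genotypes[:, j])[0, 1])

        pairs = []
        for frag in fragments:                               # within-fragment pairs
            pairs += list(itertools.combinations(frag, 2))
        for fa, fb in itertools.combinations(fragments, 2):  # cross-fragment pairs
            pairs += list(itertools.product(fa, fb))

        if __name__ == "__main__":
            with Pool() as pool:                             # analyze pairs in parallel
                results = pool.map(interaction_stat, pairs)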

  10. GENIE: a software package for gene-gene interaction analysis in genetic association studies using multiple GPU or CPU cores

    PubMed Central

    2011-01-01

    Background: Gene-gene interaction in genetic association studies is computationally intensive when a large number of SNPs are involved. Most of the latest Central Processing Units (CPUs) have multiple cores, whereas Graphics Processing Units (GPUs) also have hundreds of cores and have been recently used to implement faster scientific software. However, currently there are no genetic analysis software packages that allow users to fully utilize the computing power of these multi-core devices for genetic interaction analysis for binary traits. Findings: Here we present a novel software package GENIE, which utilizes the power of multiple GPU or CPU processor cores to parallelize the interaction analysis. GENIE reads an entire genetic association study dataset into memory and partitions the dataset into fragments with non-overlapping sets of SNPs. For each fragment, GENIE analyzes: 1) the interaction of SNPs within it in parallel, and 2) the interaction between the SNPs of the current fragment and other fragments in parallel. We tested GENIE on a large-scale candidate gene study on high-density lipoprotein cholesterol. Using an NVIDIA Tesla C1060 graphics card, the GPU mode of GENIE achieves a speedup of 27 times over its single-core CPU mode run. Conclusions: GENIE is open-source, economical, user-friendly, and scalable. Since the computing power and memory capacity of graphics cards are increasing rapidly while their cost is going down, we anticipate that GENIE will achieve greater speedups with faster GPU cards. Documentation, source code, and precompiled binaries can be downloaded from http://www.cceb.upenn.edu/~mli/software/GENIE/. PMID:21615923

  11. lpNet: a linear programming approach to reconstruct signal transduction networks.

    PubMed

    Matos, Marta R A; Knapp, Bettina; Kaderali, Lars

    2015-10-01

    With the widespread availability of high-throughput experimental technologies it has become possible to study hundreds to thousands of cellular factors simultaneously, such as coding or non-coding RNA or protein concentrations. Still, extracting information about the underlying regulatory or signaling interactions from these data remains a difficult challenge. We present a flexible approach to network inference based on linear programming. Our method reconstructs the interactions of factors from a combination of perturbation/non-perturbation and steady-state/time-series data. We show on both simulated and real data that our methods are able to reconstruct the underlying networks quickly and efficiently, thus shedding new light on biological processes and, in particular, on the mechanisms of action of diseases. We have implemented the approach as an R package available through Bioconductor. This R package is freely available under the GNU Public License (GPL-3) from bioconductor.org (http://bioconductor.org/packages/release/bioc/html/lpNet.html) and is compatible with most operating systems (Windows, Linux, Mac OS) and hardware architectures. bettina.knapp@helmholtz-muenchen.de Supplementary data are available at Bioinformatics online. © The Author 2015. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.
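
    The linear-programming idea can be illustrated with a minimal L1-minimization sketch using SciPy's linprog; the data and formulation below are toy placeholders, not lpNet's actual model:

        import numpy as np
        from scipy.optimize import linprog

        def l1_network_weights(X, y):
            """Find sparse edge weights w solving X w = y (e.g., steady-state
            constraints) by minimizing ||w||_1, via w = w_plus - w_minus >= 0."""
            m, n = X.shape
            c = np.ones(2 * n)                   # objective: sum(w+ + w-) = ||w||_1
            A_eq = np.hstack([X, -X])
            res = linprog(c, A_eq=A_eq, b_eq=y, bounds=(0, None))
            z = res.x
            return z[:n] - z[n:]

        X = np.array([[1.0, 0.0, 1.0],           # toy perturbation-response data
                      [0.0, 1.0, 1.0],
                      [1.0, 1.0, 0.0]])
        y = np.array([0.5, 0.2, 0.7])
        print(l1_network_weights(X, y))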

  12. Modeling Impact-induced Failure of Polysilicon MEMS: A Multi-scale Approach.

    PubMed

    Mariani, Stefano; Ghisi, Aldo; Corigliano, Alberto; Zerbini, Sarah

    2009-01-01

    Failure of packaged polysilicon micro-electro-mechanical systems (MEMS) subjected to impacts involves phenomena occurring at several length-scales. In this paper we present a multi-scale finite element approach to properly allow for: (i) the propagation of stress waves inside the package; (ii) the dynamics of the whole MEMS; (iii) the spreading of micro-cracking in the failing part(s) of the sensor. Through Monte Carlo simulations, some effects of polysilicon micro-structure on the failure mode are elucidated.

  13. Suite of Benchmark Tests to Conduct Mesh-Convergence Analysis of Nonlinear and Non-constant Coefficient Transport Codes

    NASA Astrophysics Data System (ADS)

    Zamani, K.; Bombardelli, F. A.

    2014-12-01

    Verification of geophysics codes is imperative to avoid serious academic as well as practical consequences. In cases where access to a given source code is not possible, the Method of Manufactured Solutions (MMS) cannot be employed in code verification. In contrast, employing the Method of Exact Solutions (MES) has several practical advantages. In this research, we first provide four new one-dimensional analytical solutions designed for code verification; these solutions are able to uncover particular imperfections of advection-diffusion-reaction (ADR) solvers, such as in the treatment of nonlinear advection, diffusion or source terms, as well as of non-constant-coefficient equations. After that, we provide a solution of Burgers' equation in a novel setup. The proposed solutions satisfy continuity of mass for the ambient flow, which is a crucial factor for coupled hydrodynamics-transport solvers. We then use the derived analytical solutions for code verification. To clarify gray-literature issues in the verification of transport codes, we designed a comprehensive test suite to uncover any imperfection in transport solvers via a hierarchical increase in the level of test complexity. The test suite includes hundreds of unit tests and system tests that check the various portions of the code. Tests start from a simple case of unidirectional advection, then bidirectional advection and tidal flow, and build up to nonlinear cases. We design tests to check nonlinearity in velocity, dispersivity and reactions. The concealing effect of scales (Peclet and Damkohler numbers) on the mesh-convergence study and appropriate remedies are also discussed. For the cases in which appropriate benchmarks for a mesh-convergence study are not available, we utilize symmetry. Auxiliary subroutines for automation of the test suite and report generation are designed. All in all, the test package is not only a robust tool for code verification but also provides comprehensive insight into the capabilities of ADR solvers. Such information is essential for any rigorous computational modeling of the ADR equation for surface/subsurface pollution transport. We also convey our experiences in finding several errors which were not detectable with routine verification techniques.

  14. Bar code, good for industry and trade--how does it benefit the dentist?

    PubMed

    Oehlmann, H

    2001-10-01

    Every dentist who attentively follows changes in product labelling can easily see that the HIBC bar code is on the increase. In fact, according to information from FIDE/VDDI and ADE/BVD, the dental industry and trade are firmly resolved to apply the HIBC bar code to all products used internationally in dental practices. Why? Indeed, at first it looks like extra expense to additionally print a bar code on the packages. Good reasons can only lie in the advantages which manufacturers and the trade expect from the HIBC bar code. Indications in dental technician circles are that the HIBC bar code is coming. If there are advantages, what are they, and can the dentist also profit from them? What does the HIBC bar code mean and what items of interest does it include? What does a bar code cost, and does only one code exist? This is explained briefly, concentrating on the benefits the bar code can bring for different users.

  15. SPIKY: a graphical user interface for monitoring spike train synchrony

    PubMed Central

    Mulansky, Mario; Bozanic, Nebojsa

    2015-01-01

    Techniques for recording large-scale neuronal spiking activity are developing very fast. This leads to an increasing demand for algorithms capable of analyzing large amounts of experimental spike train data. One of the most crucial and demanding tasks is the identification of similarity patterns with a very high temporal resolution and across different spatial scales. To address this task, in recent years three time-resolved measures of spike train synchrony have been proposed, the ISI-distance, the SPIKE-distance, and event synchronization. The Matlab source codes for calculating and visualizing these measures have been made publicly available. However, due to the many different possible representations of the results the use of these codes is rather complicated and their application requires some basic knowledge of Matlab. Thus it became desirable to provide a more user-friendly and interactive interface. Here we address this need and present SPIKY, a graphical user interface that facilitates the application of time-resolved measures of spike train synchrony to both simulated and real data. SPIKY includes implementations of the ISI-distance, the SPIKE-distance, and the SPIKE-synchronization (an improved and simplified extension of event synchronization) that have been optimized with respect to computation speed and memory demand. It also comprises a spike train generator and an event detector that makes it capable of analyzing continuous data. Finally, the SPIKY package includes additional complementary programs aimed at the analysis of large numbers of datasets and the estimation of significance levels. PMID:25744888
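
    Of the implemented measures, the ISI-distance is the simplest to sketch: at each time it compares the current interspike intervals of the two trains. A minimal Python illustration; the SPIKY implementation is adaptive rather than grid-sampled, and its edge handling differs:

        import numpy as np

        def isi_distance(spikes1, spikes2, t0, t1, n_samples=1000):
            """Time-resolved ISI-distance on a sampling grid: at each time t,
            compare the current interspike intervals x1(t), x2(t)."""
            def current_isi(spikes, t):
                i = np.searchsorted(spikes, t)
                if i == 0 or i == len(spikes):
                    return np.nan       # undefined before first / after last spike
                return spikes[i] - spikes[i - 1]

            vals = []
            for t in np.linspace(t0, t1, n_samples):
                x1, x2 = current_isi(spikes1, t), current_isi(spikes2, t)
                if not (np.isnan(x1) or np.isnan(x2)):
                    vals.append(abs(x1 - x2) / max(x1, x2))
            return float(np.mean(vals))

        trainA = np.array([0.1, 0.3, 0.5, 0.7, 0.9])
        trainB = np.array([0.12, 0.34, 0.52, 0.74, 0.91])
        print(isi_distance(trainA, trainB, 0.0, 1.0))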

  16. SPIKY: a graphical user interface for monitoring spike train synchrony.

    PubMed

    Kreuz, Thomas; Mulansky, Mario; Bozanic, Nebojsa

    2015-05-01

    Techniques for recording large-scale neuronal spiking activity are developing very fast. This leads to an increasing demand for algorithms capable of analyzing large amounts of experimental spike train data. One of the most crucial and demanding tasks is the identification of similarity patterns with a very high temporal resolution and across different spatial scales. To address this task, in recent years three time-resolved measures of spike train synchrony have been proposed, the ISI-distance, the SPIKE-distance, and event synchronization. The Matlab source codes for calculating and visualizing these measures have been made publicly available. However, due to the many different possible representations of the results the use of these codes is rather complicated and their application requires some basic knowledge of Matlab. Thus it became desirable to provide a more user-friendly and interactive interface. Here we address this need and present SPIKY, a graphical user interface that facilitates the application of time-resolved measures of spike train synchrony to both simulated and real data. SPIKY includes implementations of the ISI-distance, the SPIKE-distance, and the SPIKE-synchronization (an improved and simplified extension of event synchronization) that have been optimized with respect to computation speed and memory demand. It also comprises a spike train generator and an event detector that makes it capable of analyzing continuous data. Finally, the SPIKY package includes additional complementary programs aimed at the analysis of large numbers of datasets and the estimation of significance levels. Copyright © 2015 the American Physiological Society.

  17. Modularized seismic full waveform inversion based on waveform sensitivity kernels - The software package ASKI

    NASA Astrophysics Data System (ADS)

    Schumacher, Florian; Friederich, Wolfgang; Lamara, Samir; Gutt, Phillip; Paffrath, Marcel

    2015-04-01

    We present a seismic full waveform inversion concept for applications ranging from seismological to engineering contexts, based on sensitivity kernels for full waveforms. The kernels are derived from Born scattering theory as the Fréchet derivatives of linearized frequency-domain full waveform data functionals, quantifying the influence of elastic earth model parameters and density on the data values. For a specific source-receiver combination, the kernel is computed from the displacement and strain field spectrum originating from the source, evaluated throughout the inversion domain, as well as the Green function spectrum and its strains originating from the receiver. Because the wavefield spectra of specific sources/receivers are stored, they can be re-used for kernel computation for different source-receiver combinations, minimizing the total number of required forward simulations. In the iterative inversion procedure, the solution of the forward problem, the computation of sensitivity kernels and the derivation of a model update are held completely separate. In particular, the model description for the forward problem and the description of the inverted model update are kept independent. Hence, the resolution of the inverted model as well as the complexity of solving the forward problem can be iteratively increased (with increasing frequency content of the inverted data subset). This may regularize the overall inverse problem and optimizes the computational effort of both solving the forward problem and computing the model update. The required interconnection of arbitrary unstructured volume and point grids is realized by generalized high-order integration rules and 3D-unstructured interpolation methods. The model update is inferred by solving a minimization problem in a least-squares sense, resulting in Gauss-Newton convergence of the overall inversion process. The inversion method was implemented in the modularized software package ASKI (Analysis of Sensitivity and Kernel Inversion), which provides a generalized interface to arbitrary external forward modelling codes. So far, the 3D spectral-element code SPECFEM3D (Tromp, Komatitsch and Liu, 2008) and the 1D semi-analytical code GEMINI (Friederich and Dalkolmo, 1995), in both Cartesian and spherical frameworks, are supported. The creation of interfaces to further forward codes is planned in the near future. ASKI is freely available under the terms of the GPL at www.rub.de/aski. Since the independent modules of ASKI must communicate via file output/input, large storage capacities need to be conveniently accessible. Storing the complete sensitivity matrix to file, however, permits the scientist full manual control over each step in a customized procedure of sensitivity/resolution analysis and full waveform inversion. In the presentation, we will show some aspects of the theory behind the full waveform inversion method and its practical realization by the software package ASKI, as well as synthetic and real-data applications at different scales and geometries.
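
    The model update described, a least-squares step built from the stored sensitivity (kernel) matrix, can be sketched as a damped Gauss-Newton solve; an illustrative NumPy sketch, not ASKI's implementation:

        import numpy as np

        def gauss_newton_update(J, residual, damping=1e-3):
            """Damped Gauss-Newton model update dm from sensitivity matrix J
            (rows: data, columns: model parameters) and residual d_obs - d_syn."""
            n = J.shape[1]
            lhs = J.conj().T @ J + damping * np.eye(n)
            rhs = J.conj().T @ residual
            return np.linalg.solve(lhs, rhs)

        # toy usage: m_new = m_old + gauss_newton_update(J, d_obs - d_syn)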

  18. HydroApps: An R package for statistical simulation to use in regional analysis

    NASA Astrophysics Data System (ADS)

    Ganora, D.

    2013-12-01

    The HydroApps package is a newly developed R extension, initially created to support the use of a recent model for flood frequency estimation developed for applications in Northwestern Italy; it also contains some general tools for regional analyses and can be easily extended to include other statistical models. The package is currently at an experimental level of development. HydroApps is a corollary of the SSEM project for regional flood frequency analysis, although it was developed independently to support various instances of regional analyses. Its aim is to provide a basis for interplay between statistical simulation and practical operational use. In particular, the main module of the package deals with building the confidence bands of flood frequency curves expressed by means of their L-moments. Other functions include pre-processing and visualization of hydrologic time series and analysis of the optimal design flood under uncertainty, as well as tools useful in water resources management for the estimation of flow duration curves and their sensitivity to water withdrawals. Particular attention is devoted to code granularity, i.e. the level of detail and aggregation of the code: greater detail means more low-level functions, which provides more flexibility but reduces ease of use in practice. A balance between detail and simplicity is necessary and can be achieved with appropriate wrapper functions and specific help pages for each working block. From a more general viewpoint, the package does not have a truly user-friendly interface, but it runs on multiple operating systems and is easy to update, like many other open-source projects. The HydroApps functions and their features are reported in order to share ideas and materials and to improve the 'technological' and information transfer between scientific communities and final users such as policy makers.
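
    One of the listed tools, flow duration curve estimation, can be sketched together with simple bootstrap uncertainty bands; this is a Python illustration of the concept, since HydroApps is an R package and its algorithms may differ:

        import numpy as np

        def flow_duration_curve(q):
            """Empirical flow duration curve: sorted flows vs. exceedance probability."""
            q_sorted = np.sort(q)[::-1]
            p_exc = np.arange(1, len(q) + 1) / (len(q) + 1.0)   # Weibull plotting position
            return p_exc, q_sorted

        def fdc_bands(q, n_boot=1000, alpha=0.1, seed=0):
            """Percentile bootstrap bands around the flow duration curve."""
            rng = np.random.default_rng(seed)
            curves = np.empty((n_boot, len(q)))
            for b in range(n_boot):
                curves[b] = np.sort(rng.choice(q, size=len(q), replace=True))[::-1]
            return (np.quantile(curves, alpha / 2, axis=0),
                    np.quantile(curves, 1 - alpha / 2, axis=0))

        q = np.random.default_rng(42).lognormal(mean=1.0, sigma=0.6, size=365)  # toy flows
        p_exc, fdc = flow_duration_curve(q)
        lo, hi = fdc_bands(q)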

  19. Existing Fortran interfaces to Trilinos in preparation for exascale ForTrilinos development

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Evans, Katherine J.; Young, Mitchell T.; Collins, Benjamin S.

    This report summarizes the current state of Fortran interfaces to the Trilinos library within several key applications of the Exascale Computing Program (ECP), with the aim of informing developers about strategies to develop ForTrilinos, an exascale-ready Fortran interface software package within Trilinos. The two software projects assessed within are the DOE Office of Science's Accelerated Climate Model for Energy (ACME) atmosphere component, CAM, and the DOE Office of Nuclear Energy's core-simulator portion of VERA, a nuclear reactor simulation code. Trilinos is an object-oriented, C++ based software project, and spans a collection of algorithms and other enabling technologies such as uncertainty quantification and mesh generation. To date, Trilinos has enabled these codes to achieve large-scale simulation results; however, the simulation needs of CAM and VERA-CS will approach exascale over the next five years. A Fortran interface to Trilinos that enables efficient use of programming models and more advanced algorithms is necessary. Where appropriate, the needs of the CAM and VERA-CS software to achieve their simulation goals are called out specifically. With this report, a design document and execution plan for ForTrilinos development can proceed.

  20. ACME, a GIS tool for Automated Cirque Metric Extraction

    NASA Astrophysics Data System (ADS)

    Spagnolo, Matteo; Pellitero, Ramon; Barr, Iestyn D.; Ely, Jeremy C.; Pellicer, Xavier M.; Rea, Brice R.

    2017-02-01

    Regional-scale studies of glacial cirque metrics provide key insights into the (palaeo) environment related to the formation of these erosional landforms. The growing availability of high-resolution terrain models means that more glacial cirques can be identified and mapped in the future. However, the extraction of their metrics still largely relies on time-consuming manual techniques or on combinations of more-or-less obsolete GIS tools. In this paper, a newly coded toolbox is provided for the automated, and comparatively quick, extraction of 16 key glacial cirque metrics, including length, width, circularity, planar and 3D area, elevation, slope, aspect, plan closure and hypsometry. The set of tools, named ACME (Automated Cirque Metric Extraction), is coded in Python, runs in one of the most commonly used GIS packages (ArcGIS) and has a user-friendly interface. A polygon layer of mapped cirques is required for all metrics, while a Digital Terrain Model and a point layer of cirque threshold midpoints are needed to run some of the tools. Results from ACME are comparable to those from other techniques and can be obtained rapidly, allowing large cirque datasets to be analysed and potentially important regional trends highlighted.
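
    As an example of such a metric, circularity can be computed from a cirque outline with the shoelace formula; the definition below (4*pi*area/perimeter^2, equal to 1 for a circle) is one common choice and may differ from ACME's exact formulation:

        import numpy as np

        def circularity(xy):
            """Circularity of a closed polygon (e.g., a mapped cirque outline)."""
            x, y = np.asarray(xy, dtype=float).T
            x2, y2 = np.roll(x, -1), np.roll(y, -1)
            area = 0.5 * abs(np.sum(x * y2 - x2 * y))        # shoelace formula
            perimeter = np.sum(np.hypot(x2 - x, y2 - y))
            return 4.0 * np.pi * area / perimeter**2

        square = [(0, 0), (1, 0), (1, 1), (0, 1)]
        print(circularity(square))                           # pi/4 ~ 0.785 for a square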

  1. fast_protein_cluster: parallel and optimized clustering of large-scale protein modeling data.

    PubMed

    Hung, Ling-Hong; Samudrala, Ram

    2014-06-15

    fast_protein_cluster is a fast, parallel and memory-efficient package used to cluster 60 000 sets of protein models (with up to 550 000 models per set) generated by the Nutritious Rice for the World project. fast_protein_cluster is an optimized and extensible toolkit that supports Root Mean Square Deviation after optimal superposition (RMSD) and Template Modeling score (TM-score) as metrics. RMSD calculations using a laptop CPU are 60× faster than qcprot and 3× faster than current graphics processing unit (GPU) implementations. New GPU code further increases the speed of RMSD and TM-score calculations. fast_protein_cluster provides novel k-means and hierarchical clustering methods that are up to 250× and 2000× faster, respectively, than Clusco, and identify significantly more accurate models than Spicker and Clusco. fast_protein_cluster is written in C++ using OpenMP for multi-threading support. Custom Streaming SIMD (Single Instruction Multiple Data) Extensions and Advanced Vector Extensions intrinsics code accelerate CPU calculations, and OpenCL kernels support AMD and Nvidia GPUs. fast_protein_cluster is available under the M.I.T. license (http://software.compbio.washington.edu/fast_protein_cluster). © The Author 2014. Published by Oxford University Press.
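
    RMSD after optimal superposition, the package's first metric, is classically computed with the Kabsch algorithm; a compact NumPy sketch of the same mathematics that the package's SIMD/GPU kernels implement far faster:

        import numpy as np

        def kabsch_rmsd(P, Q):
            """RMSD between two N x 3 coordinate sets after optimal superposition."""
            P = P - P.mean(axis=0)                   # remove translation
            Q = Q - Q.mean(axis=0)
            H = P.T @ Q                              # covariance matrix
            U, S, Vt = np.linalg.svd(H)
            d = np.sign(np.linalg.det(Vt.T @ U.T))   # avoid improper rotation
            R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
            P_rot = P @ R.T                          # optimal rotation of P onto Q
            return np.sqrt(np.mean(np.sum((P_rot - Q) ** 2, axis=1)))

        P = np.random.default_rng(0).normal(size=(10, 3))
        theta = 0.3
        Rz = np.array([[np.cos(theta), -np.sin(theta), 0.0],
                       [np.sin(theta),  np.cos(theta), 0.0],
                       [0.0, 0.0, 1.0]])
        print(kabsch_rmsd(P @ Rz, P))                # ~0: pure rotation superposes away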

  2. Propagation of radio frequency waves through density fluctuations

    NASA Astrophysics Data System (ADS)

    Valvis, S. I.; Papagiannis, P.; Papadopoulos, A.; Hizanidis, K.; Glytsis, E.; Bairaktaris, F.; Zisis, A.; Tigelis, I.; Ram, A. K.

    2017-10-01

    On their way to the core of a tokamak plasma, radio frequency (RF) waves, excited in the vacuum region, have to propagate through a variety of density fluctuations in the edge region. These fluctuations include coherent structures, like blobs that can be field-aligned or not, as well as turbulent and filamentary structures. We have been studying the effect of fluctuations on RF propagation using both theoretical (analytical) and computational models. The theoretical results are being compared with those obtained by two different numerical codes: a finite-difference frequency-domain code and the commercial COMSOL package. For plasmas with an arbitrary distribution of coherent and turbulent fluctuations, we have formulated an effective dielectric permittivity of the edge plasma. This permittivity tensor is then used in numerical simulations to study the effect of multi-scale turbulence on RF waves. We consider not only plane waves but also Gaussian beams in the electron cyclotron and lower hybrid range of frequencies. The analytical theory and results from simulations of the propagation of RF waves will be presented. Supported in part by the Hellenic National Programme on Controlled Thermonuclear Fusion, associated with the EUROfusion Consortium, and by DoE Grant DE-FG02-91ER-54109.

  3. Fast parametric relationships for the large-scale reservoir simulation of mixed CH 4-CO 2 gas hydrate systems

    DOE PAGES

    Reagan, Matthew T.; Moridis, George J.; Seim, Katie S.

    2017-03-27

    A recent Department of Energy field test on the Alaska North Slope has increased interest in the ability to simulate systems of mixed CO 2-CH 4 hydrates. However, the physically realistic simulation of mixed hydrates is not yet a fully solved problem. Limited quantitative laboratory data leads to the use of various ab initio, statistical mechanical, or other mathematical representations of mixed-hydrate phase behavior. Few of these methods are suitable for inclusion in reservoir simulations, particularly for systems with large numbers of grid elements, 3D systems, or systems with complex geometric configurations. In this paper, we present a set of fast parametric relationships describing the thermodynamic properties and phase behavior of a mixed methane-carbon dioxide hydrate system. We use well-known, off-the-shelf hydrate physical properties packages to generate a sufficiently large dataset, select the most convenient and efficient mathematical forms, and fit the data to those forms to create a physical properties package suitable for inclusion in the TOUGH+ family of codes. Finally, the mapping of the phase and thermodynamic space reveals the complexity of the mixed-hydrate system and allows understanding of the thermodynamics at a level beyond what much of the existing laboratory data and literature currently offer.
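
    The fitting step can be illustrated with SciPy's curve_fit and a simple Clausius-Clapeyron-type form, ln P = a + b/T; both the functional form and the data points below are hypothetical placeholders, not the paper's fitted relationships:

        import numpy as np
        from scipy.optimize import curve_fit

        T = np.array([274.0, 278.0, 282.0, 286.0])   # K (illustrative values)
        P = np.array([2.9, 4.3, 6.8, 10.6])          # MPa (illustrative values)

        def ln_p_model(T, a, b):
            # simple parametric shape; the actual fits use forms chosen by the authors
            return a + b / T

        popt, _ = curve_fit(ln_p_model, T, np.log(P))
        print(popt)                                  # fitted (a, b)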

  4. Fast parametric relationships for the large-scale reservoir simulation of mixed CH4-CO2 gas hydrate systems

    NASA Astrophysics Data System (ADS)

    Reagan, Matthew T.; Moridis, George J.; Seim, Katie S.

    2017-06-01

    A recent Department of Energy field test on the Alaska North Slope has increased interest in the ability to simulate systems of mixed CO2-CH4 hydrates. However, the physically realistic simulation of mixed-hydrate simulation is not yet a fully solved problem. Limited quantitative laboratory data leads to the use of various ab initio, statistical mechanical, or other mathematic representations of mixed-hydrate phase behavior. Few of these methods are suitable for inclusion in reservoir simulations, particularly for systems with large number of grid elements, 3D systems, or systems with complex geometric configurations. In this work, we present a set of fast parametric relationships describing the thermodynamic properties and phase behavior of a mixed methane-carbon dioxide hydrate system. We use well-known, off-the-shelf hydrate physical properties packages to generate a sufficiently large dataset, select the most convenient and efficient mathematical forms, and fit the data to those forms to create a physical properties package suitable for inclusion in the TOUGH+ family of codes. The mapping of the phase and thermodynamic space reveals the complexity of the mixed-hydrate system and allows understanding of the thermodynamics at a level beyond what much of the existing laboratory data and literature currently offer.

  5. Fast parametric relationships for the large-scale reservoir simulation of mixed CH 4-CO 2 gas hydrate systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Reagan, Matthew T.; Moridis, George J.; Seim, Katie S.

    A recent Department of Energy field test on the Alaska North Slope has increased interest in the ability to simulate systems of mixed CO 2-CH 4 hydrates. However, the physically realistic simulation of mixed hydrates is not yet a fully solved problem. Limited quantitative laboratory data leads to the use of various ab initio, statistical mechanical, or other mathematical representations of mixed-hydrate phase behavior. Few of these methods are suitable for inclusion in reservoir simulations, particularly for systems with large numbers of grid elements, 3D systems, or systems with complex geometric configurations. In this paper, we present a set of fast parametric relationships describing the thermodynamic properties and phase behavior of a mixed methane-carbon dioxide hydrate system. We use well-known, off-the-shelf hydrate physical properties packages to generate a sufficiently large dataset, select the most convenient and efficient mathematical forms, and fit the data to those forms to create a physical properties package suitable for inclusion in the TOUGH+ family of codes. Finally, the mapping of the phase and thermodynamic space reveals the complexity of the mixed-hydrate system and allows understanding of the thermodynamics at a level beyond what much of the existing laboratory data and literature currently offer.

  6. Extension of the Bgl Broad Group Cross Section Library

    NASA Astrophysics Data System (ADS)

    Kirilova, Desislava; Belousov, Sergey; Ilieva, Krassimira

    2009-08-01

    The broad-group cross-section libraries BUGLE and BGL are applied for reactor shielding calculations using the DOORS package, based on the discrete ordinates method and a multigroup approximation of the neutron cross-sections. The BUGLE and BGL libraries are problem-oriented for the PWR and VVER types of reactors, respectively. They had been generated by collapsing the problem-independent fine-group library VITAMIN-B6, applying PWR and VVER one-dimensional radial models of the reactor middle plane using the SCALE software package. The surveillance assemblies (SA) of the VVER-1000/320 are located on the baffle above the reactor core upper edge, in a region where geometry and materials differ from those of the middle plane and the neutron field gradient is very high, which results in a different neutron spectrum. That is why the application of the aforementioned libraries for neutron fluence calculation in the region of the SA could lead to additional inaccuracy. This was the main reason to study the necessity of an extension of the BGL library with cross-sections appropriate for the SA region. A comparative analysis of the neutron spectra of the SA region calculated with the VITAMIN-B6 and BGL libraries using the two-dimensional code DORT has been performed, with the purpose of evaluating the applicability of BGL for SA calculations.

  7. Advanced Test Reactor Core Modeling Update Project Annual Report for Fiscal Year 2012

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    David W. Nigg, Principal Investigator; Kevin A. Steuhm, Project Manager

    Legacy computational reactor physics software tools and protocols currently used for support of Advanced Test Reactor (ATR) core fuel management and safety assurance, and to some extent, experiment management, are inconsistent with the state of modern nuclear engineering practice, and are difficult, if not impossible, to properly verify and validate (V&V) according to modern standards. Furthermore, the legacy staff knowledge required for application of these tools and protocols from the 1960s and 1970s is rapidly being lost due to staff turnover and retirements. In late 2009, the Idaho National Laboratory (INL) initiated a focused effort, the ATR Core Modeling Update Project, to address this situation through the introduction of modern high-fidelity computational software and protocols. This aggressive computational and experimental campaign will have a broad strategic impact on the operation of the ATR, both in terms of improved computational efficiency and accuracy for support of ongoing DOE programs as well as in terms of national and international recognition of the ATR National Scientific User Facility (NSUF). The ATR Core Modeling Update Project, targeted for full implementation in phase with the next anticipated ATR Core Internals Changeout (CIC) in the 2014-2015 time frame, began during the last quarter of Fiscal Year 2009, and has just completed its third full year. Key accomplishments so far have encompassed both computational as well as experimental work. A new suite of stochastic and deterministic transport theory based reactor physics codes and their supporting nuclear data libraries (HELIOS, KENO6/SCALE, NEWT/SCALE, ATTILA, and an extended implementation of MCNP5) has been installed at the INL under various licensing arrangements. Corresponding models of the ATR and ATRC are now operational with all five codes, demonstrating the basic feasibility of the new code packages for their intended purpose. Of particular importance, a set of as-run core depletion HELIOS calculations for all ATR cycles since August 2009, Cycle 145A through Cycle 151B, was successfully completed during 2012. This major effort supported a decision late in the year to proceed with the phased incorporation of the HELIOS methodology into the ATR Core Safety Analysis Package (CSAP) preparation process, in parallel with the established PDQ-based methodology, beginning late in Fiscal Year 2012. Acquisition of the advanced SERPENT (VTT-Finland) and MC21 (DOE-NR) Monte Carlo stochastic neutronics simulation codes was also initiated during the year, and some initial applications of SERPENT to ATRC experiment analysis were demonstrated. These two new codes will offer significant additional capability, including the possibility of full-3D Monte Carlo fuel management support capabilities for the ATR at some point in the future. Finally, a capability for rigorous sensitivity analysis and uncertainty quantification based on the TSUNAMI system has been implemented and initial computational results have been obtained. This capability will have many applications as a tool for understanding the margins of uncertainty in the new models as well as for validation experiment design and interpretation.

  8. Packaging of silicon photonic devices: from prototypes to production

    NASA Astrophysics Data System (ADS)

    Morrissey, Padraic E.; Gradkowski, Kamil; Carroll, Lee; O'Brien, Peter

    2018-02-01

    The challenges associated with the photonic packaging of silicon devices are often underestimated, and packaging remains technically challenging. In this paper, we review some key enabling technologies that will allow us to overcome the current bottleneck in silicon photonic packaging, while also describing recent developments in standardisation, including the establishment of PIXAPP as the world's first open-access PIC packaging and assembly Pilot Line. These developments will allow the community to move from low-volume prototype packaged photonic devices to large-scale volume manufacturing, where the full commercialisation of PIC technology can be realised.

  9. Reproducible Research in the Geosciences at Scale: Achievable Goal or Elusive Dream?

    NASA Astrophysics Data System (ADS)

    Wyborn, L. A.; Evans, B. J. K.

    2016-12-01

    Reproducibility is a fundamental tenet of the scientific method: it implies that any researcher, or a third party working independently, can duplicate any experiment or investigation and produce the same results. Historically, computationally based research involved an individual using their own data and processing it in their own private area, often using software they wrote or inherited from close collaborators. Today, a researcher is likely to be part of a large team that will use a subset of data from an external repository and then process the data on a public or private cloud or on a large centralised supercomputer, using a mixture of their own code, third-party software and libraries, and global community codes. In 'Big Geoscience' research it is common for data inputs to be extracts from externally managed dynamic data collections, where new data are regularly appended, or existing data are revised when errors are detected and/or as processing methods are improved. New workflows increasingly use services to access data dynamically, creating subsets on-the-fly from distributed sources, each of which can have a complex history. At major computational facilities, the underlying systems, libraries, software and services are constantly tuned and optimised, or new or replacement infrastructure is installed. Likewise, code used from a community repository is continually refined, re-packaged and ported to the target platform. To achieve reproducibility, today's researcher increasingly needs to track their workflow, including querying information on the current or historical state of the facilities used. Versioning methods are standard practice for software repositories and packages, but it is not common for data repositories or data services to provide information about their state, or for systems to provide query-able access to changes in the underlying software. While a researcher can achieve transparency and describe the steps in their workflow so that others can repeat them and replicate the processes undertaken, they cannot achieve exact reproducibility, or even full transparency of the results generated. In Big Geoscience, full reproducibility will be an elusive dream until data repositories and compute facilities can provide provenance information in a standards-compliant, machine-queryable way.
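
    One practical step in this direction is for each run to snapshot its own provenance. The sketch below is a minimal Python illustration (the input file name extract.nc and the output layout are hypothetical): it records the interpreter, platform, installed package versions, and a checksum of each input extract, so a later re-run can at least detect that the environment or the upstream data have changed.

```python
import hashlib
import json
import platform
import subprocess
import sys
from datetime import datetime, timezone

def sha256_of(path, chunk=1 << 20):
    """Checksum one input file so a later run can detect upstream revisions."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        while block := f.read(chunk):
            h.update(block)
    return h.hexdigest()

def provenance_record(input_paths):
    """Snapshot the facts a third party would need to audit this run."""
    return {
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "python": sys.version,
        "platform": platform.platform(),
        "installed_packages": subprocess.run(
            [sys.executable, "-m", "pip", "freeze"],
            capture_output=True, text=True, check=True,
        ).stdout.splitlines(),
        "inputs": {p: sha256_of(p) for p in input_paths},
    }

if __name__ == "__main__":
    # "extract.nc" is a hypothetical data extract pulled from a repository
    with open("provenance.json", "w") as f:
        json.dump(provenance_record(["extract.nc"]), f, indent=2)
```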

  10. Infrastructure for Multiphysics Software Integration in High Performance Computing-Aided Science and Engineering

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Campbell, Michael T.; Safdari, Masoud; Kress, Jessica E.

    The project described in this report constructed and exercised an innovative multiphysics coupling toolkit called the Illinois Rocstar MultiPhysics Application Coupling Toolkit (IMPACT). IMPACT is an open source, flexible, natively parallel infrastructure for coupling multiple uniphysics simulation codes into multiphysics computational systems. IMPACT works with codes written in several high-performance-computing (HPC) programming languages, and is designed from the beginning for HPC multiphysics code development. It is designed to be minimally invasive to the individual physics codes being integrated, and places few requirements on those codes for integration. The goal of IMPACT is to provide the support needed to enable coupling existing tools together in unique and innovative ways to produce powerful new multiphysics technologies without extensive modification and rewriting of the physics packages being integrated. There are three major outcomes from this project: 1) construction, testing, application, and open-source release of the IMPACT infrastructure, 2) production of example open-source multiphysics tools using IMPACT, and 3) identification and engagement of organizations interested in the tools and applications resulting from the project. This last outcome represents the incipient development of a user community and application ecosystem being built on IMPACT. Multiphysics coupling standardization can only come from organizations working together to define needs and processes that span the space of necessary multiphysics outcomes, which Illinois Rocstar plans to continue driving toward. The IMPACT system, including source code, documentation, and test problems, is now available through the public GitHub system to anyone interested in multiphysics code coupling. Many of the basic documents explaining the use and architecture of IMPACT are attached as appendices to this document, and online HTML documentation is available through the GitHub site. Over 100 unit tests are provided, which run through the Illinois Rocstar Application Development (IRAD) lightweight testing infrastructure supplied along with IMPACT. The package as a whole provides an excellent base for developing high-quality multiphysics applications using modern software development practices. To facilitate understanding of how to utilize IMPACT effectively, two multiphysics systems have been developed and are available open-source through GitHub. The simpler of the two, named ElmerFoamFSI in the repository, is a multiphysics, fluid-structure-interaction (FSI) coupling of the solid mechanics package Elmer with a fluid dynamics module from OpenFOAM. This coupling illustrates how to take software packages unrelated by either author or architecture and combine them into a robust, parallel multiphysics system. A more complex multiphysics tool is the Illinois Rocstar Rocstar Multiphysics code, which was rebuilt during the project around IMPACT. Rocstar Multiphysics was already an HPC multiphysics tool, but now that it has been rearchitected around IMPACT, it can be readily expanded to capture new and different physics in the future. In fact, during this project, the Elmer and OpenFOAM tools were also coupled into Rocstar Multiphysics and demonstrated. The full Rocstar Multiphysics codebase is also available on GitHub, and licensed for any organization to use as they wish.
    Finally, the new IMPACT product is already being used in several multiphysics code coupling projects for the Air Force, NASA and the Missile Defense Agency, and initial work on expansion of the IMPACT-enabled Rocstar Multiphysics has begun in support of a commercial company. These initiatives promise to expand the interest and reach of IMPACT and Rocstar Multiphysics, ultimately leading to the envisioned standardization and consortium of users that was one of the goals of this project.
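
    The coupling pattern itself is simple to sketch. The toy Python below is illustrative only (it is not IMPACT's API; the solver classes and their physics are invented for the example): two uniphysics "codes" exchange interface data once per time step, the loose-coupling style that an infrastructure like IMPACT orchestrates at scale.

```python
import numpy as np

class FluidSolver:
    """Stand-in fluid code: returns an interface traction given a shape."""
    def step(self, interface_disp, dt):
        return -50.0 * interface_disp + 1.0   # toy pressure load

class SolidSolver:
    """Stand-in structural code: relaxes toward the applied load."""
    def __init__(self, n):
        self.disp = np.zeros(n)
    def step(self, traction, dt):
        self.disp += dt * (traction - 10.0 * self.disp)   # toy dynamics
        return self.disp

def couple(fluid, solid, dt=1e-2, steps=100):
    """Exchange interface data once per time step (loose coupling)."""
    disp = solid.disp
    for _ in range(steps):
        traction = fluid.step(disp, dt)   # fluid sees the current shape
        disp = solid.step(traction, dt)   # solid sees the fluid load
    return disp

print(couple(FluidSolver(), SolidSolver(4)))   # converges near 1/60 per node
```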

  11. Magnetic Tape Recording for the Eighties

    NASA Technical Reports Server (NTRS)

    Kalil, Ford (Editor)

    1982-01-01

    The practical and theoretical aspects of state-of-the-art magnetic tape recording technology are reviewed. Topics covered include the following: (1) analog and digital magnetic tape recording, (2) tape and head wear, (3) wear testing, (4) magnetic tape certification, (5) care, handling, and management of magnetic tape, (6) cleaning, packing, and winding of magnetic tape, (7) tape reels, bands, and packaging, (8) coding techniques for high-density digital recording, and (9) tradeoffs of coding techniques.

  12. 40 CFR 98.9 - Addresses.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... submitted to the following address: (a) For U.S. mail. Director, Climate Change Division, 1200 Pennsylvania Ave., NW., Mail Code: 6207J, Washington, DC 20460. (b) For package deliveries. Director, Climate Change Division, 1310 L St, NW., Washington, DC 20005. ...

  13. Multifunction audio digitizer for communications systems

    NASA Technical Reports Server (NTRS)

    Monford, L. G., Jr.

    1971-01-01

    Digitizer accomplishes both N-bit pulse code modulation (PCM) and delta modulation, and provides modulation indicating variable signal gain and variable sidetone. Other features include low package count, a variable clock rate to optimize bandwidth, and easily expanded PCM output.
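
    The two encoding schemes mentioned are easy to contrast in a few lines. The Python sketch below is illustrative only (step size, bit depth and test tone are arbitrary): uniform N-bit PCM quantizes each sample independently, while 1-bit delta modulation transmits only the sign of the tracking error at each sample.

```python
import numpy as np

def pcm_encode(signal, n_bits):
    """Uniform N-bit PCM: quantize each sample independently."""
    levels = 2 ** n_bits
    # map [-1, 1) onto integer codes 0 .. levels-1
    return np.clip(((signal + 1.0) / 2.0 * levels).astype(int), 0, levels - 1)

def delta_encode(signal, step=0.05):
    """1-bit delta modulation: transmit the sign of the tracking error."""
    estimate, bits = 0.0, []
    for s in signal:
        bit = 1 if s >= estimate else 0
        estimate += step if bit else -step
        bits.append(bit)
    return bits

t = np.linspace(0.0, 1.0, 200)
x = 0.8 * np.sin(2 * np.pi * 5 * t)   # test tone
print(pcm_encode(x, 4)[:10])
print(delta_encode(x)[:10])
```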

  14. 40 CFR 98.9 - Addresses.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... submitted to the following address: (a) For U.S. mail. Director, Climate Change Division, 1200 Pennsylvania Ave., NW., Mail Code: 6207J, Washington, DC 20460. (b) For package deliveries. Director, Climate Change Division, 1310 L St, NW., Washington, DC 20005. ...

  15. Full 3D visualization tool-kit for Monte Carlo and deterministic transport codes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Frambati, S.; Frignani, M.

    2012-07-01

    We propose a package of tools capable of translating the geometric inputs and outputs of many Monte Carlo and deterministic radiation transport codes into open-source file formats. These tools are aimed at bridging the gap between trusted, widely used radiation analysis codes and very powerful, more recent and commonly used visualization software, thus supporting the design process and helping with shielding optimization. Three main lines of development were followed: mesh-based analysis of Monte Carlo codes, mesh-based analysis of deterministic codes, and Monte Carlo surface meshing. The developed kit is considered a powerful and cost-effective computer-aided design tool for radiation transport code users in the nuclear world, in particular in the fields of core design and radiation analysis. (authors)
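
    As a concrete example of such a bridge, the sketch below writes a 3-D scalar field (say, a mesh tally of dose) to the legacy ASCII VTK format that open-source viewers such as ParaView and VisIt read. This is a minimal illustration, not the package described above; the field here is random stand-in data.

```python
import numpy as np

def write_vtk_scalar(path, field, spacing=(1.0, 1.0, 1.0)):
    """Dump a 3-D scalar mesh tally (e.g., dose) as a legacy-format VTK file
    readable by open-source viewers such as ParaView or VisIt."""
    nx, ny, nz = field.shape
    with open(path, "w") as f:
        f.write("# vtk DataFile Version 3.0\n")
        f.write("mesh tally export\nASCII\nDATASET STRUCTURED_POINTS\n")
        f.write(f"DIMENSIONS {nx} {ny} {nz}\n")
        f.write("ORIGIN 0 0 0\n")
        f.write(f"SPACING {spacing[0]} {spacing[1]} {spacing[2]}\n")
        f.write(f"POINT_DATA {field.size}\n")
        f.write("SCALARS dose float 1\nLOOKUP_TABLE default\n")
        # VTK expects x to vary fastest, hence Fortran ravel order
        for v in field.ravel(order="F"):
            f.write(f"{v:.6e}\n")

write_vtk_scalar("dose.vtk", np.random.rand(10, 10, 10))
```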

  16. Integrated Composite Analyzer (ICAN): Users and programmers manual

    NASA Technical Reports Server (NTRS)

    Murthy, P. L. N.; Chamis, C. C.

    1986-01-01

    The use of and relevant equations programmed in a computer code designed to carry out a comprehensive linear analysis of multilayered fiber composites is described. The analysis contains the essential features required to effectively design structural components made from fiber composites. The inputs to the code are constituent material properties, factors reflecting the fabrication process, and composite geometry. The code performs micromechanics, macromechanics, and laminate analysis, including the hygrothermal response of fiber composites. The code outputs are the various ply and composite properties, composite structural response, and composite stress analysis results with details on failure. The code is in Fortran IV and can be used efficiently as a package in complex structural analysis programs. The input-output format is described extensively through the use of a sample problem. The program listing is also included. The code manual consists of two parts.
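
    The laminate-level step of such an analysis is compact enough to sketch. The Python below is a minimal classical-lamination-theory fragment, not ICAN itself: it builds the transformed reduced stiffness of each ply and sums the in-plane stiffness matrix A for a quasi-isotropic layup (the ply properties are hypothetical carbon/epoxy values, for illustration only).

```python
import numpy as np

def qbar(E1, E2, G12, v12, theta_deg):
    """Transformed reduced stiffness of one ply (classical lamination theory)."""
    v21 = v12 * E2 / E1
    d = 1.0 - v12 * v21
    Q11, Q22, Q12, Q66 = E1 / d, E2 / d, v12 * E2 / d, G12
    m, n = np.cos(np.radians(theta_deg)), np.sin(np.radians(theta_deg))
    Qb = np.empty((3, 3))
    Qb[0, 0] = Q11*m**4 + 2*(Q12 + 2*Q66)*m**2*n**2 + Q22*n**4
    Qb[1, 1] = Q11*n**4 + 2*(Q12 + 2*Q66)*m**2*n**2 + Q22*m**4
    Qb[0, 1] = Qb[1, 0] = (Q11 + Q22 - 4*Q66)*m**2*n**2 + Q12*(m**4 + n**4)
    Qb[0, 2] = Qb[2, 0] = (Q11 - Q12 - 2*Q66)*m**3*n + (Q12 - Q22 + 2*Q66)*m*n**3
    Qb[1, 2] = Qb[2, 1] = (Q11 - Q12 - 2*Q66)*m*n**3 + (Q12 - Q22 + 2*Q66)*m**3*n
    Qb[2, 2] = (Q11 + Q22 - 2*Q12 - 2*Q66)*m**2*n**2 + Q66*(m**4 + n**4)
    return Qb

def a_matrix(layup, t_ply, E1, E2, G12, v12):
    """In-plane laminate stiffness: A = sum over plies of Qbar_k * thickness."""
    return sum(qbar(E1, E2, G12, v12, th) * t_ply for th in layup)

# Hypothetical carbon/epoxy ply (moduli in Pa), quasi-isotropic layup
print(a_matrix([0, 45, -45, 90], 0.125e-3, 140e9, 10e9, 5e9, 0.3))
```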

  17. Packaging printed circuit boards: A production application of interactive graphics

    NASA Technical Reports Server (NTRS)

    Perrill, W. A.

    1975-01-01

    The structure and use of an Interactive Graphics Packaging Program (IGPP), conceived to apply computer graphics to the design of packaging electronic circuits onto printed circuit boards (PCBs), are described. The intent was to combine the data storage and manipulative power of the computer with the imaginative, intuitive power of a human designer. The hardware includes a CDC 6400 computer and two CDC 777 terminals with CRT screens, light pens, and keyboards. The program is written in FORTRAN IV Extended, with the exception of a few functions coded in COMPASS (assembly language). The IGPP performs four major functions for the designer: (1) data input and display, (2) component placement (automatic or manual), (3) conductor path routing (automatic or manual), and (4) data output. The most complex PCB packaged to date measured 16.5 cm by 19 cm and contained 380 components, two layers of ground planes, and four layers of conductors mixed with ground planes.

  18. Structural testing of the Los Alamos National Laboratory Heat Source/Radioisotopic Thermoelectric Generator shipping container

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bronowski, D.R.; Madsen, M.M.

    The Heat Source/Radioisotopic Thermoelectric Generator shipping container is a Type B packaging design currently under development by Los Alamos National Laboratory. Type B packaging for transporting radioactive material is required to maintain containment and shielding after being exposed to the normal and hypothetical accident environments defined in Title 10 Code of Federal Regulations Part 71. A combination of testing and analysis is used to verify the adequacy of this package design. This report documents the test program portion of the design verification, using several prototype packages. Four types of testing were performed: 30-foot hypothetical accident condition drop tests in three orientations, 40-inch hypothetical accident condition puncture tests in five orientations, a 21 psi external overpressure test, and a normal conditions of transport test consisting of a water spray and a 4-foot drop test. 18 refs., 104 figs., 13 tabs.

  19. Spin wave Feynman diagram vertex computation package

    NASA Astrophysics Data System (ADS)

    Price, Alexander; Javernick, Philip; Datta, Trinanjan

    Spin wave theory is a well-established theoretical technique that can correctly predict the physical behavior of ordered magnetic states. However, computing the effects of an interacting spin wave theory incorporating magnons involves a laborious by-hand derivation of Feynman diagram vertices. The process is tedious and time consuming. Hence, to improve productivity and to have another means of checking the analytical calculations, we have devised a Feynman Diagram Vertex Computation package. In this talk, we will describe our research group's effort to implement a Mathematica-based symbolic Feynman diagram vertex computation package that computes spin wave vertices. Utilizing the non-commutative algebra package NCAlgebra as an add-on to Mathematica, symbolic expressions for the Feynman diagram vertices of a Heisenberg quantum antiferromagnet are obtained. Our existing code reproduces the well-known expressions of a nearest-neighbor square lattice Heisenberg model. We also discuss the case of a triangular lattice Heisenberg model, where non-collinear terms contribute to the vertex interactions.
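
    The core requirement of any such tool is that the computer algebra system must not reorder operators. The original package is built on Mathematica/NCAlgebra; the fragment below is a loose Python/SymPy analogue of the same idea, using noncommutative symbols to stand in for magnon operators (the symbol names are illustrative, not the package's notation).

```python
import sympy as sp

# Noncommutative symbols stand in for a magnon annihilation/creation pair;
# SymPy distributes products without reordering the operators, which is the
# property a symbolic vertex derivation relies on.
a, ad = sp.symbols("a a_dag", commutative=False)

expr = sp.expand((ad + a) * ad * a * (ad + a))
print(expr)   # quartic terms such as a*a_dag*a*a and a_dag*a_dag*a*a_dag
              # are kept in their original operator order
```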

  20. chimeraviz: a tool for visualizing chimeric RNA.

    PubMed

    Lågstad, Stian; Zhao, Sen; Hoff, Andreas M; Johannessen, Bjarne; Lingjærde, Ole Christian; Skotheim, Rolf I

    2017-09-15

    Advances in high-throughput RNA sequencing have enabled more efficient detection of fusion transcripts, but the technology and associated software used for fusion detection from sequencing data often yield a high false discovery rate. Good prioritization of the results is important, and this can be helped by a visualization framework that automatically integrates RNA data with known genomic features. Here we present chimeraviz, a Bioconductor package that automates the creation of chimeric RNA visualizations. The package supports input from nine different fusion-finder tools: deFuse, EricScript, InFusion, JAFFA, FusionCatcher, FusionMap, PRADA, SOAPfuse and STAR-FUSION. chimeraviz is an R package available via Bioconductor ( https://bioconductor.org/packages/release/bioc/html/chimeraviz.html ) under Artistic-2.0. Source code and support are available at GitHub ( https://github.com/stianlagstad/chimeraviz ). rolf.i.skotheim@rr-research.no. Supplementary data are available at Bioinformatics online. © The Author(s) 2017. Published by Oxford University Press.

  1. Software design for analysis of multichannel intracardial and body surface electrocardiograms.

    PubMed

    Potse, Mark; Linnenbank, André C; Grimbergen, Cornelis A

    2002-11-01

    Analysis of multichannel ECG recordings (body surface maps (BSMs) and intracardial maps) requires special software. We created a software package and a user interface on top of a commercial data analysis package (MATLAB) by a combination of high-level and low-level programming. Our software was created to satisfy the needs of a diverse group of researchers. It can handle a large variety of recording configurations. It allows for interactive usage through a fast and robust user interface, and batch processing for the analysis of large amounts of data. The package is user-extensible, includes routines for both common and experimental data processing tasks, and works on several computer platforms. The source code is made intelligible using software for structured documentation and is available to the users. The package is currently used by more than ten research groups analysing ECG data worldwide.

  2. regioneR: an R/Bioconductor package for the association analysis of genomic regions based on permutation tests.

    PubMed

    Gel, Bernat; Díez-Villanueva, Anna; Serra, Eduard; Buschbeck, Marcus; Peinado, Miguel A; Malinverni, Roberto

    2016-01-15

    Statistically assessing the relation between a set of genomic regions and other genomic features is a common and challenging task in genomic and epigenomic analyses. Randomization-based approaches implicitly take into account the complexity of the genome without the need to assume an underlying statistical model. regioneR is an R package that implements a permutation test framework specifically designed to work with genomic regions. In addition to the predefined randomization and evaluation strategies, regioneR is fully customizable, allowing the use of custom strategies to adapt it to specific questions. Finally, it also implements a novel function to evaluate the local specificity of the detected association. regioneR is an R package released under the Artistic-2.0 License. The source code and documents are freely available through Bioconductor (http://www.bioconductor.org/packages/regioneR). rmalinverni@carrerasresearch.org. © The Author 2015. Published by Oxford University Press.
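
    The permutation-test idea generalizes beyond R. The Python sketch below is illustrative, not regioneR's API (genome size, region counts and lengths are arbitrary): it compares the observed overlap count between two region sets against a null distribution built by repeatedly randomizing region positions, yielding an empirical p-value.

```python
import numpy as np

rng = np.random.default_rng(0)
GENOME = 1_000_000   # single toy chromosome

def random_regions(n, length):
    starts = rng.integers(0, GENOME - length, n)
    return np.column_stack([starts, starts + length])

def n_overlaps(a, b):
    """Count regions in `a` overlapping any region in `b` (brute force)."""
    return sum(bool(((r[0] < b[:, 1]) & (r[1] > b[:, 0])).any()) for r in a)

regions = random_regions(200, 500)    # the region set under test
features = random_regions(300, 500)   # the genomic features

observed = n_overlaps(regions, features)
null = [n_overlaps(random_regions(200, 500), features) for _ in range(1000)]
p = (1 + sum(v >= observed for v in null)) / (1 + len(null))
print(f"observed overlaps = {observed}, empirical p ~ {p:.3f}")
```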

  3. Semi-automated Anatomical Labeling and Inter-subject Warping of High-Density Intracranial Recording Electrodes in Electrocorticography.

    PubMed

    Hamilton, Liberty S; Chang, David L; Lee, Morgan B; Chang, Edward F

    2017-01-01

    In this article, we introduce img_pipe, our open source python package for preprocessing of imaging data for use in intracranial electrocorticography (ECoG) and intracranial stereo-EEG analyses. The process of electrode localization, labeling, and warping for use in ECoG currently varies widely across laboratories, and it is usually performed with custom, lab-specific code. This python package aims to provide a standardized interface for these procedures, as well as code to plot and display results on 3D cortical surface meshes. It gives the user an easy interface to create anatomically labeled electrodes that can also be warped to an atlas brain, starting with only a preoperative T1 MRI scan and a postoperative CT scan. We describe the full capabilities of our imaging pipeline and present a step-by-step protocol for users.

  4. A Shifted Block Lanczos Algorithm 1: The Block Recurrence

    NASA Technical Reports Server (NTRS)

    Grimes, Roger G.; Lewis, John G.; Simon, Horst D.

    1990-01-01

    In this paper we describe a block Lanczos algorithm that is used as the key building block of a software package for the extraction of eigenvalues and eigenvectors of large sparse symmetric generalized eigenproblems. The software package comprises: a version of the block Lanczos algorithm specialized for spectrally transformed eigenproblems; an adaptive strategy for choosing shifts; and efficient codes for factoring large sparse symmetric indefinite matrices. This paper describes the algorithmic details of our block Lanczos recurrence, which uses a novel combination of block generalizations of several features that have previously been investigated only independently. In particular, new forms of partial reorthogonalization, selective reorthogonalization and local reorthogonalization are used, as is a new algorithm for obtaining the M-orthogonal factorization of a matrix. The heuristic shifting strategy, the integration with sparse linear equation solvers and numerical experience with the code are described in a companion paper.
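
    The bare recurrence is short to write down. The NumPy sketch below is a skeleton only, under simplifying assumptions the paper's package does not make: a standard (not generalized) symmetric eigenproblem, no shifting, and no reorthogonalization of any kind. It illustrates the block three-term recurrence, not the production algorithm.

```python
import numpy as np

def block_lanczos(A, Q0, steps):
    """Bare block Lanczos recurrence for a symmetric matrix A. Returns the
    diagonal blocks alpha_j and subdiagonal blocks beta_j of the block
    tridiagonal projection; no reorthogonalization is attempted."""
    n, p = Q0.shape
    Q, _ = np.linalg.qr(Q0)             # orthonormal starting block
    Q_prev, B_prev = np.zeros((n, p)), np.zeros((p, p))
    alphas, betas = [], []
    for _ in range(steps):
        W = A @ Q - Q_prev @ B_prev.T   # three-term block recurrence
        Aj = Q.T @ W                    # alpha_j = Q_j^T A Q_j
        W -= Q @ Aj
        Q_next, Bj = np.linalg.qr(W)    # W = Q_{j+1} beta_j
        alphas.append(Aj)
        betas.append(Bj)
        Q_prev, Q, B_prev = Q, Q_next, Bj
    return alphas, betas

# Ritz values from the assembled block tridiagonal matrix T
rng = np.random.default_rng(1)
A = np.diag(np.arange(1.0, 101.0))      # toy symmetric matrix, spectrum 1..100
p, s = 3, 10
alphas, betas = block_lanczos(A, rng.standard_normal((100, p)), s)
T = np.zeros((p * s, p * s))
for j in range(s):
    T[j*p:(j+1)*p, j*p:(j+1)*p] = alphas[j]
for j in range(s - 1):
    T[(j+1)*p:(j+2)*p, j*p:(j+1)*p] = betas[j]
    T[j*p:(j+1)*p, (j+1)*p:(j+2)*p] = betas[j].T
print(np.linalg.eigvalsh(T)[-3:])       # approximations to the largest eigenvalues
```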

  5. Coronal Magnetism and FORWARD SolarSoft IDL Package

    NASA Astrophysics Data System (ADS)

    Gibson, S. E.

    2014-12-01

    The FORWARD suite of SolarSoft IDL codes is a community resource for model-data comparison, with a particular emphasis on analyzing coronal magnetic fields. FORWARD may be used both to synthesize a broad range of coronal observables and to access and compare to existing data. FORWARD works with numerical model datacubes, interfaces with the web-served Predictive Science Inc. MAS simulation datacubes and the SolarSoft IDL Potential Field Source Surface (PFSS) package, and also includes several analytic models (more can be added). It connects to the Virtual Solar Observatory and other web-served observations to download data in a format directly comparable to model predictions. It utilizes the CHIANTI database in modeling UV/EUV lines, and links to the CLE polarimetry synthesis code for forbidden coronal lines. FORWARD enables "forward-fitting" of specific observations, and helps to build intuition into how the physical properties of coronal magnetic structures translate to observable properties.

  6. Semi-automated Anatomical Labeling and Inter-subject Warping of High-Density Intracranial Recording Electrodes in Electrocorticography

    PubMed Central

    Hamilton, Liberty S.; Chang, David L.; Lee, Morgan B.; Chang, Edward F.

    2017-01-01

    In this article, we introduce img_pipe, our open source python package for preprocessing of imaging data for use in intracranial electrocorticography (ECoG) and intracranial stereo-EEG analyses. The process of electrode localization, labeling, and warping for use in ECoG currently varies widely across laboratories, and it is usually performed with custom, lab-specific code. This python package aims to provide a standardized interface for these procedures, as well as code to plot and display results on 3D cortical surface meshes. It gives the user an easy interface to create anatomically labeled electrodes that can also be warped to an atlas brain, starting with only a preoperative T1 MRI scan and a postoperative CT scan. We describe the full capabilities of our imaging pipeline and present a step-by-step protocol for users. PMID:29163118

  7. Qualification of Simulation Software for Safety Assessment of Sodium Cooled Fast Reactors. Requirements and Recommendations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Brown, Nicholas R.; Pointer, William David; Sieger, Matt

    2016-04-01

    The goal of this review is to enable the application of codes or software packages to safety assessment of advanced sodium-cooled fast reactor (SFR) designs. To address near-term programmatic needs, the authors focused on two objectives. First, they identified the software quality assurance (QA) requirements that must be satisfied to enable the application of software to future safety analyses. Second, they collected best practices applied by other code development teams to minimize the cost and time of initial code qualification activities, and recommend a path to the stated goal.

  8. PCP METHODOLOGY FOR DETERMINING DOSE RATES FOR SMALL GRAM QUANTITIES IN SHIPPING PACKAGINGS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nathan, S.

    The Small Gram Quantity (SGQ) concept is based on the understanding that small amounts of hazardous materials, in this case radioactive materials, are significantly less hazardous than large amounts of the same materials. This study describes a methodology designed to estimate an SGQ for several neutron- and gamma-emitting isotopes that can be shipped in a package compliant with the 10 CFR Part 71 external radiation level limits. These regulations require that packaging for the shipment of radioactive materials perform, under both normal and accident conditions, the essential functions of material containment and subcriticality, and maintain external radiation levels within regulatory limits. 10 CFR 71.33(b)(1), (2) and (3) state that radioactive and fissile materials must be identified and their maximum quantity and chemical and physical forms be included in an application. Furthermore, the U.S. Federal Regulations require that an application contain an evaluation demonstrating that the package (i.e., the packaging and its contents) satisfies the external radiation standards for all packages (10 CFR 71.31(2), 71.35(a), and 71.47). By placing the contents in a He leak-tight containment vessel, and limiting the mass to ensure subcriticality, the first two essential functions are readily met. Some isotopes emit sufficiently strong photon radiation that small amounts of material can yield a large external dose rate. Quantifying the dose rate for a proposed content is a challenging issue for the SGQ approach. It is essential to quantify the external radiation levels from several common gamma and neutron sources that can be safely placed in a specific packaging, to ensure compliance with federal regulations. The Packaging Certification Program (PCP) Methodology for Determining Dose Rate for Small Gram Quantities in Shipping Packagings described in this report provides bounding mass limits for a set of proposed SGQ isotopes. Methodology calculations were performed to estimate external radiation levels for the 9977 shipping package using the MCNP radiation transport code to develop a set of response multipliers (Green's functions) for 'dose per particle' for each neutron and photon spectral group. The source spectrum for each isotope, generated using the ORIGEN-S and RASTA computer codes, was folded with the response multipliers to generate the dose rate per gram of each isotope in the 9977 shipping package and its associated shielded containers. The maximum amount of a single isotope that can be shipped within the regulatory limits contained in 10 CFR 71.47 for dose rate at the surface of the package is thus determined. If a package contains a mixture of isotopes, acceptability for shipment can be determined by a sum-of-fractions approach. Furthermore, the results of this analysis can easily be extended to additional radioisotopes by simply evaluating the neutron and/or photon spectra of those isotopes and folding the spectral data with the Green's functions provided.
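
    Numerically, the folding step and the sum-of-fractions check are both one-liners. The Python sketch below illustrates the arithmetic with entirely hypothetical numbers: the group structure, response multipliers, emission rates and per-isotope limits are all invented for the example, and only the 200 mrem/h surface dose-rate limit comes from 10 CFR 71.47.

```python
import numpy as np

# response[g]: surface dose rate per unit source strength in spectral group g,
# i.e. the Green's-function multipliers precomputed with a transport code.
response = np.array([2.0e-9, 5.0e-9, 1.2e-8])        # (mrem/h) per (particles/s)
spectrum_per_gram = np.array([1.0e9, 3.0e9, 5.0e8])  # particles/s per gram, by group

dose_per_gram = float(response @ spectrum_per_gram)  # fold spectrum with response
limit = 200.0   # 10 CFR 71.47 non-exclusive-use surface limit, mrem/h
print(f"{dose_per_gram:.1f} mrem/h per gram -> mass limit {limit / dose_per_gram:.2f} g")

# Mixture acceptability by the sum-of-fractions rule: sum_i m_i / limit_i <= 1
masses = {"isotope_A": 0.5, "isotope_B": 1.0}   # proposed contents, grams
limits = {"isotope_A": 2.0, "isotope_B": 8.0}   # single-isotope mass limits, grams
fractions = sum(masses[k] / limits[k] for k in masses)
print(f"sum of fractions = {fractions:.3f} -> acceptable: {fractions <= 1.0}")
```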

  9. PyForecastTools

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Morley, Steven

    The PyForecastTools package provides Python routines for calculating metrics for model validation, forecast verification and model comparison. For continuous predictands, the package provides functions for calculating bias (mean error, mean percentage error, median log accuracy, symmetric signed bias) and accuracy (mean squared error, mean absolute error, mean absolute scaled error, normalized RMSE, median symmetric accuracy). Convenience routines to calculate the component parts (e.g., forecast error, scaled error) of each metric are also provided. To compare models, the package provides a generic skill score and a percent-better measure. Robust measures of scale, including the median absolute deviation, robust standard deviation, robust coefficient of variation and the Sn estimator, are all provided by the package. Finally, the package implements Python classes for NxN contingency tables. In the case of a multi-class prediction, accuracy and skill metrics such as proportion correct and the Heidke and Peirce skill scores are provided as object methods. The special case of a 2x2 contingency table inherits from the NxN class and provides many additional metrics for binary classification: probability of detection, probability of false detection, false alarm ratio, threat score, equitable threat score, and bias. Confidence intervals for many of these quantities can be calculated using either the Wald method or Agresti-Coull intervals.
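
    The 2x2 metrics named above reduce to a handful of formulas. The Python below computes several of them directly from the four cell counts; it illustrates the standard definitions, not the PyForecastTools API, and the example counts are invented.

```python
def binary_metrics(hits, false_alarms, misses, correct_negatives):
    """A few standard 2x2 contingency-table verification metrics."""
    a, b, c, d = hits, false_alarms, misses, correct_negatives
    n = a + b + c + d
    pod = a / (a + c)     # probability of detection
    pofd = b / (b + d)    # probability of false detection
    far = b / (a + b)     # false alarm ratio
    ts = a / (a + b + c)  # threat score
    # Heidke skill score: accuracy relative to chance agreement
    expected = ((a + b) * (a + c) + (c + d) * (b + d)) / n
    hss = ((a + d) - expected) / (n - expected)
    return dict(POD=pod, POFD=pofd, FAR=far, TS=ts, HSS=hss)

print(binary_metrics(hits=42, false_alarms=14, misses=8, correct_negatives=136))
```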

  10. MsSpec-1.0: A multiple scattering package for electron spectroscopies in material science

    NASA Astrophysics Data System (ADS)

    Sébilleau, Didier; Natoli, Calogero; Gavaza, George M.; Zhao, Haifeng; Da Pieve, Fabiana; Hatada, Keisuke

    2011-12-01

    We present a multiple scattering package to calculate the cross-section of various spectroscopies, namely photoelectron diffraction (PED), Auger electron diffraction (AED), X-ray absorption (XAS), low-energy electron diffraction (LEED) and Auger photoelectron coincidence spectroscopy (APECS). This package is composed of three main codes, computing respectively the cluster, the potential and the cross-section. In the latter case, in order to cover a range of energies as wide as possible, three different algorithms are provided to perform the multiple scattering calculation: full matrix inversion, series expansion, or correlation expansion of the multiple scattering matrix. Numerous other small Fortran codes and bash/csh shell scripts are also provided to perform specific tasks. The cross-section code is built by the user from a library of subroutines using a makefile. Program summary: Program title: MsSpec-1.0. Catalogue identifier: AEJT_v1_0. Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AEJT_v1_0.html. Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland. Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html. No. of lines in distributed program, including test data, etc.: 504 438. No. of bytes in distributed program, including test data, etc.: 14 448 180. Distribution format: tar.gz. Programming language: Fortran 77. Computer: Any. Operating system: Linux, MacOs. Classification: 7.2. External routines: Lapack (http://www.netlib.org/lapack/). Nature of problem: Calculation of the cross-section of various spectroscopies. Solution method: Multiple scattering. Running time: The test runs provided only take a few seconds to run.

  11. Pyteomics: a Python framework for exploratory data analysis and rapid software prototyping in proteomics.

    PubMed

    Goloborodko, Anton A; Levitsky, Lev I; Ivanov, Mark V; Gorshkov, Mikhail V

    2013-02-01

    Pyteomics is a cross-platform, open-source Python library providing a rich set of tools for MS-based proteomics. It provides modules for reading LC-MS/MS data, search engine output, and protein sequence databases, for theoretical prediction of retention times and electrochemical properties of polypeptides, for mass and m/z calculations, and for sequence parsing. Pyteomics is available under the Apache license; release versions are available at the Python Package Index http://pypi.python.org/pyteomics, the source code repository at http://hg.theorchromo.ru/pyteomics, and documentation at http://packages.python.org/pyteomics. Pyteomics.biolccc documentation is available at http://packages.python.org/pyteomics.biolccc/. Questions on installation and usage can be addressed to the pyteomics mailing list: pyteomics@googlegroups.com.
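
    A flavor of the library's intended interactive use: digest a sequence in silico, then compute each peptide's monoisotopic mass and doubly charged m/z. This is a sketch assuming a current Pyteomics install; the calls shown (parser.cleave, mass.calculate_mass) are from the library's documented API, but treat the exact signatures as an assumption, and the protein fragment is arbitrary.

```python
from pyteomics import mass, parser

# In-silico tryptic digest of a toy protein fragment (no missed cleavages)
peptides = parser.cleave("MKWVTFISLLLLFSSAYSR", parser.expasy_rules["trypsin"])

# Monoisotopic mass and doubly protonated m/z for each resulting peptide
for pep in sorted(peptides):
    m = mass.calculate_mass(sequence=pep)
    mz = mass.calculate_mass(sequence=pep, charge=2)
    print(f"{pep:20s} M = {m:10.4f}   [M+2H]2+ m/z = {mz:9.4f}")
```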

  12. Study of the TRAC Airfoil Table Computational System

    NASA Technical Reports Server (NTRS)

    Hu, Hong

    1999-01-01

    The report documents the study of the application of the TRAC airfoil table computational package (TRACFOIL) to the prediction of 2D airfoil force and moment data over a wide range of angle of attack and Mach number. TRACFOIL generates the standard C-81 airfoil table for input into rotorcraft comprehensive codes such as CAMRAD. The existing TRACFOIL computer package was successfully modified to run on Digital Alpha workstations and on Cray C90 supercomputers. Step-by-step instructions for using the package on both computer platforms are provided. The newer version of TRACFOIL is applied to two airfoil sections. The C-81 data obtained using the TRACFOIL method are compared with wind-tunnel data, and the results are presented.

  13. Explosive Model Tarantula V1/JWL++ Calibration of LX-17: #2

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Souers, P C; Vitello, P

    2009-05-01

    Tarantula V1 is a kinetic package for reactive flow codes that seeks to describe initiation, failure, dead zones and detonation simultaneously. The most important parameter is P1, the pressure between the initiation and failure regions. Both dead-zone formation and failure can be largely controlled with this knob. However, V1 produces failure at low settings and dead zones at higher settings, so it cannot fulfill its purpose in the current format. To this end, V2 is under test. The derivation of the initiation threshold P0 is discussed. The derivation of the initiation pressure-tau curve as an output of Tarantula shows that the initiation package is sound. A desensitization package is also considered.
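
    The role of the two pressure thresholds can be illustrated with a toy rate law. The Python below is loosely in the spirit of JWL++-style pressure-dependent kinetics and is invented for illustration only; none of the names or constants are Tarantula's calibrated values, and the real package's functional forms differ.

```python
def burn_rate(p, lam, p0=0.5, p1=2.0, g_ignition=0.1, g_growth=1.5, b=2.0):
    """Toy pressure-switched burn rate d(lambda)/dt: inert below an initiation
    threshold p0, a slow ignition branch between p0 and p1, and fast growth
    above p1. Every constant here is hypothetical."""
    if p < p0 or lam >= 1.0:
        return 0.0                        # below threshold, or fully burned
    g = g_ignition if p < p1 else g_growth
    return g * p**b * (1.0 - lam)

# Burn-fraction history at a constant pressure above p1 (explicit Euler)
lam, dt = 0.0, 1e-3
for _ in range(2000):
    lam += dt * burn_rate(3.0, lam)
print(f"burn fraction after 2000 steps: {lam:.3f}")
```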

  14. CMIP: a software package capable of reconstructing genome-wide regulatory networks using gene expression data.

    PubMed

    Zheng, Guangyong; Xu, Yaochen; Zhang, Xiujun; Liu, Zhi-Ping; Wang, Zhuo; Chen, Luonan; Zhu, Xin-Guang

    2016-12-23

    A gene regulatory network (GRN) represents the interactions of genes inside a cell or tissue, in which vertices and edges stand for genes and their regulatory interactions, respectively. Reconstruction of gene regulatory networks, in particular genome-scale networks, is essential for comparative exploration of different species and mechanistic investigation of biological processes. Currently, most network inference methods are computationally intensive; they are usually effective for small-scale tasks (e.g., networks with a few hundred genes) but struggle to construct GRNs at genome scale. Here, we present a software package for gene regulatory network reconstruction at a genomic level, in which gene interaction is measured by a conditional mutual information measurement using a parallel computing framework (hence the package is named CMIP). The package is a greatly improved implementation of our previous PCA-CMI algorithm. In CMIP, we provide not only an automatic threshold determination method but also an effective parallel computing framework for network inference. Performance tests on benchmark datasets show that the accuracy of CMIP is comparable to most current network inference methods. Moreover, running tests on synthetic datasets demonstrate that CMIP can handle large datasets, especially genome-wide datasets, within an acceptable time period. In addition, successful application to a real genomic dataset confirms the practical applicability of the package. This new software package provides a powerful tool for genomic network reconstruction to the biological community. The software can be accessed at http://www.picb.ac.cn/CMIP/ .
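
    The edge test underlying the PCA-CMI family is easy to state: an edge between genes X and Y survives only if their conditional mutual information given a set of neighboring genes Z stays above a threshold. Under a Gaussian assumption, CMI has a closed form in covariance determinants, sketched below in Python (illustrative, not the CMIP code; the synthetic data are constructed so that X and Y are linked only through Z).

```python
import numpy as np

def gaussian_cmi(x, y, z=None):
    """Conditional mutual information I(X;Y|Z) under a Gaussian assumption,
    the edge-test quantity in PCA-CMI-style network inference. x, y are 1-D
    sample vectors; z is an (n, k) array of conditioning genes, or None."""
    if z is None or z.size == 0:
        r = np.corrcoef(x, y)[0, 1]
        return -0.5 * np.log(1.0 - r**2)
    def logdet(*cols):
        c = np.cov(np.column_stack(cols), rowvar=False)
        return np.linalg.slogdet(np.atleast_2d(c))[1]
    return 0.5 * (logdet(x, z) + logdet(y, z) - logdet(z) - logdet(x, y, z))

# X and Y are linked only through Z: the marginal MI is large, but
# conditioning on Z drives the CMI toward zero (the edge is indirect).
rng = np.random.default_rng(0)
n = 500
z = rng.standard_normal(n)
x = z + 0.1 * rng.standard_normal(n)
y = z + 0.1 * rng.standard_normal(n)
print("I(X;Y)   =", gaussian_cmi(x, y))
print("I(X;Y|Z) =", gaussian_cmi(x, y, z[:, None]))
```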

  15. A new Bayesian Earthquake Analysis Tool (BEAT)

    NASA Astrophysics Data System (ADS)

    Vasyura-Bathke, Hannes; Dutta, Rishabh; Jónsson, Sigurjón; Mai, Martin

    2017-04-01

    Modern earthquake source estimation studies increasingly use non-linear optimization strategies to estimate kinematic rupture parameters, often considering geodetic and seismic data jointly. However, the optimization process is complex and consists of several steps that need to be followed in the earthquake parameter estimation procedure. These include pre-describing or modeling the fault geometry, calculating the Green's functions (often assuming a layered elastic half-space), and estimating the distributed final slip and possibly other kinematic source parameters. Recently, Bayesian inference has become popular for estimating posterior distributions of earthquake source model parameters given measured/estimated/assumed data and model uncertainties. For instance, some research groups consider uncertainties of the layered medium and propagate these to the source parameter uncertainties. Other groups make use of informative priors to reduce the model parameter space. In addition, innovative sampling algorithms have been developed that efficiently explore the often high-dimensional parameter spaces. Compared to earlier studies, these improvements have resulted in overall more robust source model parameter estimates that include uncertainties. However, the computational demands of these methods are high and estimation codes are rarely distributed along with the published results. Even if codes are made available, it is often difficult to assemble them into a single optimization framework, as they are typically coded in different programming languages. Therefore, further progress and future applications of these methods/codes are hampered, while reproducibility and validation of results have become essentially impossible. In the spirit of providing open-access and modular codes to facilitate progress and reproducible research in earthquake source estimation, we undertook the effort of producing BEAT, a Python package that comprises all the above-mentioned features in a single programming environment. The package is built on top of the pyrocko seismological toolbox (www.pyrocko.org) and makes use of the pymc3 module for Bayesian statistical model fitting. BEAT is an open-source package (https://github.com/hvasbath/beat) and we encourage and solicit contributions to the project. In this contribution, we present our strategy for developing BEAT, show application examples, and discuss future developments.
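
    To make the Bayesian step concrete, the sketch below is a toy linear slip inversion in plain PyMC3 (assuming version >= 3.7 for the sigma keyword and >= 3.10 for return_inferencedata). It is not BEAT's interface: the Green's-function matrix is random stand-in data, and the model simply infers patch slips and a noise scale from synthetic observations d = G s + noise.

```python
import numpy as np
import pymc3 as pm

# Synthetic problem: G stands in for precomputed Green's functions relating
# slip on 5 fault patches to 40 surface observations.
rng = np.random.default_rng(2)
n_obs, n_patch = 40, 5
G = rng.standard_normal((n_obs, n_patch))
s_true = np.array([0.0, 0.5, 1.0, 0.5, 0.0])
d = G @ s_true + 0.05 * rng.standard_normal(n_obs)

with pm.Model():
    slip = pm.Normal("slip", mu=0.0, sigma=1.0, shape=n_patch)   # prior per patch
    noise = pm.HalfNormal("noise", sigma=0.1)                    # data noise scale
    pm.Normal("obs", mu=pm.math.dot(G, slip), sigma=noise, observed=d)
    trace = pm.sample(1000, tune=1000, chains=2,
                      progressbar=False, return_inferencedata=False)

print(trace["slip"].mean(axis=0))   # posterior mean slip, close to s_true
```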

  16. 49 CFR 178.502 - Identification codes for packagings.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... construction, as follows: (i) “A” means steel (all types and surface treatments). (ii) “B” means aluminum. (iii) “C” means natural wood. (iv) “D” means plywood. (v) “F” means reconstituted wood. (vi) “G” means...

  17. CH-TRU Waste Content Codes (CH-TRUCON)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Washington TRU Solutions LLC

    2007-08-15

    The CH-TRU Waste Content Codes (CH-TRUCON) document describes the inventory of the U.S. Department of Energy (DOE) CH-TRU waste within the transportation parameters specified by the Contact-Handled Transuranic Waste Authorized Methods for Payload Control (CH-TRAMPAC). The CH-TRAMPAC defines the allowable payload for the Transuranic Package Transporter-II (TRUPACT-II) and HalfPACT packagings. This document is a catalog of TRUPACT-II and HalfPACT authorized contents and a description of the methods utilized to demonstrate compliance with the CH-TRAMPAC. A summary of currently approved content codes by site is presented in Table 1. The CH-TRAMPAC describes "shipping categories" that are assigned to each payload container. Multiple shipping categories may be assigned to a single content code. A summary of approved content codes and corresponding shipping categories is provided in Table 2, which consists of Tables 2A, 2B, and 2C. Table 2A provides a summary of approved content codes and corresponding shipping categories for the "General Case," which reflects the assumption of a 60-day shipping period as described in the CH-TRAMPAC and Appendix 3.4 of the CH-TRU Payload Appendices. For shipments to be completed within an approximately 1,000-mile radius, a shorter shipping period of 20 days is applicable as described in the CH-TRAMPAC and Appendix 3.5 of the CH-TRU Payload Appendices. For shipments to WIPP from Los Alamos National Laboratory (LANL), Nevada Test Site, and Rocky Flats Environmental Technology Site, a 20-day shipping period is applicable. Table 2B provides a summary of approved content codes and corresponding shipping categories for "Close-Proximity Shipments" (20-day shipping period). For shipments implementing the controls specified in the CH-TRAMPAC and Appendix 3.6 of the CH-TRU Payload Appendices, a 10-day shipping period is applicable. Table 2C provides a summary of approved content codes and corresponding shipping categories for "Controlled Shipments" (10-day shipping period).

  18. CH-TRU Waste Content Codes (CH-TRUCON)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Washington TRU Solutions LLC

    2007-06-15

    The CH-TRU Waste Content Codes (CH-TRUCON) document describes the inventory of the U.S. Department of Energy (DOE) CH-TRU waste within the transportation parameters specified by the Contact-Handled Transuranic Waste Authorized Methods for Payload Control (CH-TRAMPAC). The CH-TRAMPAC defines the allowable payload for the Transuranic Package Transporter-II (TRUPACT-II) and HalfPACT packagings. This document is a catalog of TRUPACT-II and HalfPACT authorized contents and a description of the methods utilized to demonstrate compliance with the CH-TRAMPAC. A summary of currently approved content codes by site is presented in Table 1. The CH-TRAMPAC describes "shipping categories" that are assigned to each payload container. Multiple shipping categories may be assigned to a single content code. A summary of approved content codes and corresponding shipping categories is provided in Table 2, which consists of Tables 2A, 2B, and 2C. Table 2A provides a summary of approved content codes and corresponding shipping categories for the "General Case," which reflects the assumption of a 60-day shipping period as described in the CH-TRAMPAC and Appendix 3.4 of the CH-TRU Payload Appendices. For shipments to be completed within an approximately 1,000-mile radius, a shorter shipping period of 20 days is applicable as described in the CH-TRAMPAC and Appendix 3.5 of the CH-TRU Payload Appendices. For shipments to WIPP from Los Alamos National Laboratory (LANL), Nevada Test Site, and Rocky Flats Environmental Technology Site, a 20-day shipping period is applicable. Table 2B provides a summary of approved content codes and corresponding shipping categories for "Close-Proximity Shipments" (20-day shipping period). For shipments implementing the controls specified in the CH-TRAMPAC and Appendix 3.6 of the CH-TRU Payload Appendices, a 10-day shipping period is applicable. Table 2C provides a summary of approved content codes and corresponding shipping categories for "Controlled Shipments" (10-day shipping period).

  19. CH-TRU Waste Content Codes (CH-TRUCON)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Washington TRU Solutions LLC

    2007-09-20

    The CH-TRU Waste Content Codes (CH-TRUCON) document describes the inventory of the U.S. Department of Energy (DOE) CH-TRU waste within the transportation parameters specified by the Contact-Handled Transuranic Waste Authorized Methods for Payload Control (CH-TRAMPAC). The CH-TRAMPAC defines the allowable payload for the Transuranic Package Transporter-II (TRUPACT-II) and HalfPACT packagings. This document is a catalog of TRUPACT-II and HalfPACT authorized contents and a description of the methods utilized to demonstrate compliance with the CH-TRAMPAC. A summary of currently approved content codes by site is presented in Table 1. The CH-TRAMPAC describes "shipping categories" that are assigned to each payload container. Multiple shipping categories may be assigned to a single content code. A summary of approved content codes and corresponding shipping categories is provided in Table 2, which consists of Tables 2A, 2B, and 2C. Table 2A provides a summary of approved content codes and corresponding shipping categories for the "General Case," which reflects the assumption of a 60-day shipping period as described in the CH-TRAMPAC and Appendix 3.4 of the CH-TRU Payload Appendices. For shipments to be completed within an approximately 1,000-mile radius, a shorter shipping period of 20 days is applicable as described in the CH-TRAMPAC and Appendix 3.5 of the CH-TRU Payload Appendices. For shipments to WIPP from Los Alamos National Laboratory (LANL), Nevada Test Site, and Rocky Flats Environmental Technology Site, a 20-day shipping period is applicable. Table 2B provides a summary of approved content codes and corresponding shipping categories for "Close-Proximity Shipments" (20-day shipping period). For shipments implementing the controls specified in the CH-TRAMPAC and Appendix 3.6 of the CH-TRU Payload Appendices, a 10-day shipping period is applicable. Table 2C provides a summary of approved content codes and corresponding shipping categories for "Controlled Shipments" (10-day shipping period).

  20. CH-TRU Waste Content Codes (CH-TRUCON)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Washington TRU Solutions LLC

    2006-06-20

    The CH-TRU Waste Content Codes (CH-TRUCON) document describes the inventory of the U.S. Department of Energy (DOE) CH-TRU waste within the transportation parameters specified by the Contact-Handled Transuranic Waste Authorized Methods for Payload Control (CH-TRAMPAC). The CH-TRAMPAC defines the allowable payload for the Transuranic Package Transporter-II (TRUPACT-II) and HalfPACT packagings. This document is a catalog of TRUPACT-II and HalfPACT authorized contents and a description of the methods utilized to demonstrate compliance with the CH-TRAMPAC. A summary of currently approved content codes by site is presented in Table 1. The CH-TRAMPAC describes "shipping categories" that are assigned to each payload container. Multiple shipping categories may be assigned to a single content code. A summary of approved content codes and corresponding shipping categories is provided in Table 2, which consists of Tables 2A, 2B, and 2C. Table 2A provides a summary of approved content codes and corresponding shipping categories for the "General Case," which reflects the assumption of a 60-day shipping period as described in the CH-TRAMPAC and Appendix 3.4 of the CH-TRU Payload Appendices. For shipments to be completed within an approximately 1,000-mile radius, a shorter shipping period of 20 days is applicable as described in the CH-TRAMPAC and Appendix 3.5 of the CH-TRU Payload Appendices. For shipments to WIPP from Los Alamos National Laboratory (LANL), Nevada Test Site, and Rocky Flats Environmental Technology Site, a 20-day shipping period is applicable. Table 2B provides a summary of approved content codes and corresponding shipping categories for "Close-Proximity Shipments" (20-day shipping period). For shipments implementing the controls specified in the CH-TRAMPAC and Appendix 3.6 of the CH-TRU Payload Appendices, a 10-day shipping period is applicable. Table 2C provides a summary of approved content codes and corresponding shipping categories for "Controlled Shipments" (10-day shipping period).

  1. CH-TRU Waste Content Codes (CH-TRUCON)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Washington TRU Solutions LLC

    2006-01-18

    The CH-TRU Waste Content Codes (CH-TRUCON) document describes the inventory of the U.S. Department of Energy (DOE) CH-TRU waste within the transportation parameters specified by the Contact-Handled Transuranic Waste Authorized Methods for Payload Control (CH-TRAMPAC). The CH-TRAMPAC defines the allowable payload for the Transuranic Package Transporter-II (TRUPACT-II) and HalfPACT packagings. This document is a catalog of TRUPACT-II and HalfPACT authorized contents and a description of the methods utilized to demonstrate compliance with the CH-TRAMPAC. A summary of currently approved content codes by site is presented in Table 1. The CH-TRAMPAC describes "shipping categories" that are assigned to each payload container. Multiple shipping categories may be assigned to a single content code. A summary of approved content codes and corresponding shipping categories is provided in Table 2, which consists of Tables 2A, 2B, and 2C. Table 2A provides a summary of approved content codes and corresponding shipping categories for the "General Case," which reflects the assumption of a 60-day shipping period as described in the CH-TRAMPAC and Appendix 3.4 of the CH-TRU Payload Appendices. For shipments to be completed within an approximately 1,000-mile radius, a shorter shipping period of 20 days is applicable as described in the CH-TRAMPAC and Appendix 3.5 of the CH-TRU Payload Appendices. For shipments to WIPP from Los Alamos National Laboratory (LANL), Nevada Test Site, and Rocky Flats Environmental Technology Site, a 20-day shipping period is applicable. Table 2B provides a summary of approved content codes and corresponding shipping categories for "Close-Proximity Shipments" (20-day shipping period). For shipments implementing the controls specified in the CH-TRAMPAC and Appendix 3.6 of the CH-TRU Payload Appendices, a 10-day shipping period is applicable. Table 2C provides a summary of approved content codes and corresponding shipping categories for "Controlled Shipments" (10-day shipping period).

  2. CH-TRU Waste Content Codes (CH-TRUCON)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Washington TRU Solutions LLC

    2006-08-15

    The CH-TRU Waste Content Codes (CH-TRUCON) document describes the inventory of the U.S. Department of Energy (DOE) CH-TRU waste within the transportation parameters specified by the Contact-Handled Transuranic Waste Authorized Methods for Payload Control (CH-TRAMPAC). The CH-TRAMPAC defines the allowable payload for the Transuranic Package Transporter-II (TRUPACT-II) and HalfPACT packagings. This document is a catalog of TRUPACT-II and HalfPACT authorized contents and a description of the methods utilized to demonstrate compliance with the CH-TRAMPAC. A summary of currently approved content codes by site is presented in Table 1. The CH-TRAMPAC describes "shipping categories" that are assigned to each payload container. Multiple shipping categories may be assigned to a single content code. A summary of approved content codes and corresponding shipping categories is provided in Table 2, which consists of Tables 2A, 2B, and 2C. Table 2A provides a summary of approved content codes and corresponding shipping categories for the "General Case," which reflects the assumption of a 60-day shipping period as described in the CH-TRAMPAC and Appendix 3.4 of the CH-TRU Payload Appendices. For shipments to be completed within an approximately 1,000-mile radius, a shorter shipping period of 20 days is applicable as described in the CH-TRAMPAC and Appendix 3.5 of the CH-TRU Payload Appendices. For shipments to WIPP from Los Alamos National Laboratory (LANL), Nevada Test Site, and Rocky Flats Environmental Technology Site, a 20-day shipping period is applicable. Table 2B provides a summary of approved content codes and corresponding shipping categories for "Close-Proximity Shipments" (20-day shipping period). For shipments implementing the controls specified in the CH-TRAMPAC and Appendix 3.6 of the CH-TRU Payload Appendices, a 10-day shipping period is applicable. Table 2C provides a summary of approved content codes and corresponding shipping categories for "Controlled Shipments" (10-day shipping period).

  3. CH-TRU Waste Content Codes (CH-TRUCON)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Washington TRU Solutions LLC

    2006-12-20

    The CH-TRU Waste Content Codes (CH-TRUCON) document describes the inventory of the U.S. Department of Energy (DOE) CH-TRU waste within the transportation parameters specified by the Contact-Handled Transuranic Waste Authorized Methods for Payload Control (CH-TRAMPAC). The CH-TRAMPAC defines the allowable payload for the Transuranic Package Transporter-II (TRUPACT-II) and HalfPACT packagings. This document is a catalog of TRUPACT-II and HalfPACT authorized contents and a description of the methods utilized to demonstrate compliance with the CH-TRAMPAC. A summary of currently approved content codes by site is presented in Table 1. The CH-TRAMPAC describes "shipping categories" that are assigned to each payload container. Multiple shipping categories may be assigned to a single content code. A summary of approved content codes and corresponding shipping categories is provided in Table 2, which consists of Tables 2A, 2B, and 2C. Table 2A provides a summary of approved content codes and corresponding shipping categories for the "General Case," which reflects the assumption of a 60-day shipping period as described in the CH-TRAMPAC and Appendix 3.4 of the CH-TRU Payload Appendices. For shipments to be completed within an approximately 1,000-mile radius, a shorter shipping period of 20 days is applicable as described in the CH-TRAMPAC and Appendix 3.5 of the CH-TRU Payload Appendices. For shipments to WIPP from Los Alamos National Laboratory (LANL), Nevada Test Site, and Rocky Flats Environmental Technology Site, a 20-day shipping period is applicable. Table 2B provides a summary of approved content codes and corresponding shipping categories for "Close-Proximity Shipments" (20-day shipping period). For shipments implementing the controls specified in the CH-TRAMPAC and Appendix 3.6 of the CH-TRU Payload Appendices, a 10-day shipping period is applicable. Table 2C provides a summary of approved content codes and corresponding shipping categories for "Controlled Shipments" (10-day shipping period).

  4. CH-TRU Waste Content Codes (CH-TRUCON)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Washington TRU Solutions LLC

    2007-02-15

    The CH-TRU Waste Content Codes (CH-TRUCON) document describes the inventory of the U.S. Department of Energy (DOE) CH-TRU waste within the transportation parameters specified by the Contact-Handled Transuranic Waste Authorized Methods for Payload Control (CH-TRAMPAC). The CH-TRAMPAC defines the allowable payload for the Transuranic Package Transporter-II (TRUPACT-II) and HalfPACT packagings. This document is a catalog of TRUPACT-II and HalfPACT authorized contents and a description of the methods utilized to demonstrate compliance with the CH-TRAMPAC. A summary of currently approved content codes by site is presented in Table 1. The CH-TRAMPAC describes "shipping categories" that are assigned to each payload container. Multiple shipping categories may be assigned to a single content code. A summary of approved content codes and corresponding shipping categories is provided in Table 2, which consists of Tables 2A, 2B, and 2C. Table 2A provides a summary of approved content codes and corresponding shipping categories for the "General Case," which reflects the assumption of a 60-day shipping period as described in the CH-TRAMPAC and Appendix 3.4 of the CH-TRU Payload Appendices. For shipments to be completed within an approximately 1,000-mile radius, a shorter shipping period of 20 days is applicable as described in the CH-TRAMPAC and Appendix 3.5 of the CH-TRU Payload Appendices. For shipments to WIPP from Los Alamos National Laboratory (LANL), Nevada Test Site, and Rocky Flats Environmental Technology Site, a 20-day shipping period is applicable. Table 2B provides a summary of approved content codes and corresponding shipping categories for "Close-Proximity Shipments" (20-day shipping period). For shipments implementing the controls specified in the CH-TRAMPAC and Appendix 3.6 of the CH-TRU Payload Appendices, a 10-day shipping period is applicable. Table 2C provides a summary of approved content codes and corresponding shipping categories for "Controlled Shipments" (10-day shipping period).

  5. CH-TRU Waste Content Codes (CH-TRUCON)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Washington TRU Solutions LLC

    2006-09-15

    The CH-TRU Waste Content Codes (CH-TRUCON) document describes the inventory of the U.S. Department of Energy (DOE) CH-TRU waste within the transportation parameters specified by the Contact-Handled Transuranic Waste Authorized Methods for Payload Control (CH-TRAMPAC). The CH-TRAMPAC defines the allowable payload for the Transuranic Package Transporter-II (TRUPACT-II) and HalfPACT packagings. This document is a catalog of TRUPACT-II and HalfPACT authorized contents and a description of the methods utilized to demonstrate compliance with the CH-TRAMPAC. A summary of currently approved content codes by site is presented in Table 1. The CH-TRAMPAC describes "shipping categories" that are assigned to each payload container. Multiple shipping categories may be assigned to a single content code. A summary of approved content codes and corresponding shipping categories is provided in Table 2, which consists of Tables 2A, 2B, and 2C. Table 2A provides a summary of approved content codes and corresponding shipping categories for the "General Case," which reflects the assumption of a 60-day shipping period as described in the CH-TRAMPAC and Appendix 3.4 of the CH-TRU Payload Appendices. For shipments to be completed within an approximately 1,000-mile radius, a shorter shipping period of 20 days is applicable as described in the CH-TRAMPAC and Appendix 3.5 of the CH-TRU Payload Appendices. For shipments to WIPP from Los Alamos National Laboratory (LANL), Nevada Test Site, and Rocky Flats Environmental Technology Site, a 20-day shipping period is applicable. Table 2B provides a summary of approved content codes and corresponding shipping categories for "Close-Proximity Shipments" (20-day shipping period). For shipments implementing the controls specified in the CH-TRAMPAC and Appendix 3.6 of the CH-TRU Payload Appendices, a 10-day shipping period is applicable. Table 2C provides a summary of approved content codes and corresponding shipping categories for "Controlled Shipments" (10-day shipping period).

  6. CH-TRU Waste Content Codes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Washington TRU Solutions LLC

    2008-01-16

    The CH-TRU Waste Content Codes (CH-TRUCON) document describes the inventory of the U.S. Department of Energy (DOE) CH-TRU waste within the transportation parameters specified by the Contact-Handled Transuranic Waste Authorized Methods for Payload Control (CH-TRAMPAC). The CH-TRAMPAC defines the allowable payload for the Transuranic Package Transporter-II (TRUPACT-II) and HalfPACT packagings. This document is a catalog of TRUPACT-II and HalfPACT authorized contents and a description of the methods utilized to demonstrate compliance with the CH-TRAMPAC. A summary of currently approved content codes by site is presented in Table 1. The CH-TRAMPAC describes "shipping categories" that are assigned to each payload container. Multiple shipping categories may be assigned to a single content code. A summary of approved content codes and corresponding shipping categories is provided in Table 2, which consists of Tables 2A, 2B, and 2C. Table 2A provides a summary of approved content codes and corresponding shipping categories for the "General Case," which reflects the assumption of a 60-day shipping period as described in the CH-TRAMPAC and Appendix 3.4 of the CH-TRU Payload Appendices. For shipments to be completed within an approximately 1,000-mile radius, a shorter shipping period of 20 days is applicable as described in the CH-TRAMPAC and Appendix 3.5 of the CH-TRU Payload Appendices. For shipments to WIPP from Los Alamos National Laboratory (LANL), Nevada Test Site, and Rocky Flats Environmental Technology Site, a 20-day shipping period is applicable. Table 2B provides a summary of approved content codes and corresponding shipping categories for "Close-Proximity Shipments" (20-day shipping period). For shipments implementing the controls specified in the CH-TRAMPAC and Appendix 3.6 of the CH-TRU Payload Appendices, a 10-day shipping period is applicable. Table 2C provides a summary of approved content codes and corresponding shipping categories for "Controlled Shipments" (10-day shipping period).

  7. iGC - an integrated analysis package of gene expression and copy number alteration.

    PubMed

    Lai, Yi-Pin; Wang, Liang-Bo; Wang, Wei-An; Lai, Liang-Chuan; Tsai, Mong-Hsun; Lu, Tzu-Pin; Chuang, Eric Y

    2017-01-14

    With the advancement in high-throughput technologies, researchers can simultaneously investigate gene expression and copy number alteration (CNA) data from individual patients at a lower cost. Traditional analysis methods analyze each type of data individually and integrate their results using Venn diagrams. Challenges arise, however, when the results are irreproducible and inconsistent across multiple platforms. To address these issues, one possible approach is to concurrently analyze both gene expression profiling and CNAs in the same individual. We have developed an open-source R/Bioconductor package (iGC). Multiple input formats are supported and users can define their own criteria for identifying differentially expressed genes driven by CNAs. The analysis of two real microarray datasets demonstrated that the CNA-driven genes identified by the iGC package showed significantly higher Pearson correlation coefficients with their gene expression levels and copy numbers than those genes located in a genomic region with CNA. Compared with the Venn diagram approach, the iGC package showed better performance. The iGC package is effective and useful for identifying CNA-driven genes. By simultaneously considering both comparative genomic and transcriptomic data, it can provide better understanding of biological and medical questions. The iGC package's source code and manual are freely available at https://www.bioconductor.org/packages/release/bioc/html/iGC.html .

  8. PharmacoGx: an R package for analysis of large pharmacogenomic datasets.

    PubMed

    Smirnov, Petr; Safikhani, Zhaleh; El-Hachem, Nehme; Wang, Dong; She, Adrian; Olsen, Catharina; Freeman, Mark; Selby, Heather; Gendoo, Deena M A; Grossmann, Patrick; Beck, Andrew H; Aerts, Hugo J W L; Lupien, Mathieu; Goldenberg, Anna; Haibe-Kains, Benjamin

    2016-04-15

    Pharmacogenomics holds great promise for the development of biomarkers of drug response and the design of new therapeutic options, which are key challenges in precision medicine. However, such data are scattered and lack standards for efficient access and analysis, consequently preventing the realization of the full potential of pharmacogenomics. To address these issues, we implemented PharmacoGx, an easy-to-use, open source package for integrative analysis of multiple pharmacogenomic datasets. We demonstrate the utility of our package in comparing large drug sensitivity datasets, such as the Genomics of Drug Sensitivity in Cancer and the Cancer Cell Line Encyclopedia. Moreover, we show how to use our package to easily perform Connectivity Map analysis. With increasing availability of drug-related data, our package will open new avenues of research for meta-analysis of pharmacogenomic data. PharmacoGx is implemented in R and can be easily installed on any system. The package is available from CRAN and its source code is available from GitHub. bhaibeka@uhnresearch.ca or benjamin.haibe.kains@utoronto.ca Supplementary data are available at Bioinformatics online. © The Author 2015. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.

  9. Microscale synthesis and characterization of polystyrene: NSF-POLYED scholars project

    NASA Technical Reports Server (NTRS)

    Quaal, Karen S.; Wu, Chang-Ning

    1994-01-01

    Polystyrene is a familiar polymer with many commercial uses. Its applications range from the clear, high index of refraction, brittle plastic used to form audio cassette and CD cases to the foamed material used in insulated drink cups and packaging material. Polystyrene constitutes 11 percent of the plastics used in packaging, with only High Density Polyethylene (HDPE) and Low Density Polyethylene (LDPE) contributing a larger share. So much polystyrene is used today that it is one of six common plastics to which manufacturers have assigned an identification code to aid recycling efforts; polystyrene's code is 6 (PS). During the summer and fall of 1992 several new polymeric experiments were developed by the NSF POLYED Scholars for introduction into the chemistry core curriculum. In this presentation, one such project will be discussed. This laboratory project is recommended for a first or second year laboratory course, allowing the introduction of polymer science to undergraduates at the earliest opportunity. The reliability of the experiments that make up this project, together with the recognition factor of polystyrene, a material we come in contact with every day, makes the synthesis and characterization of polystyrene a good choice for introducing polymerization to undergraduates. This laboratory project appeals to the varied interests of students enrolled in the typical first year chemistry course and is an ideal way to introduce polymers to a wide variety of science and engineering students.

  10. Python-Assisted MODFLOW Application and Code Development

    NASA Astrophysics Data System (ADS)

    Langevin, C.

    2013-12-01

    The U.S. Geological Survey (USGS) has a long history of developing and maintaining free, open-source software for hydrological investigations. The MODFLOW program is one of the most popular hydrologic simulation programs released by the USGS, and it is considered to be the most widely used groundwater flow simulation code. MODFLOW was written using a modular design and a procedural FORTRAN style, which resulted in code that could be understood, modified, and enhanced by many hydrologists. The code is fast, and because it uses standard FORTRAN it can be run on most operating systems. Most MODFLOW users rely on proprietary graphical user interfaces for constructing models and viewing model results. Some recent efforts, however, have focused on construction of MODFLOW models using open-source Python scripts. Customizable Python packages, such as FloPy (https://code.google.com/p/flopy), can be used to generate input files, read simulation results, and visualize results in two and three dimensions. Automating this sequence of steps leads to models that can be reproduced directly from original data and rediscretized in space and time. Python is also being used in the development and testing of new MODFLOW functionality. New packages and numerical formulations can be quickly prototyped and tested first with Python programs before implementation in MODFLOW. This is made possible by the flexible object-oriented design capabilities available in Python, the ability to call FORTRAN code from Python, and the ease with which linear systems of equations can be solved using SciPy, for example. Once new features are added to MODFLOW, Python can then be used to automate comprehensive regression testing and ensure reliability and accuracy of new versions prior to release.
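    As a concrete illustration of this scripted workflow, the following minimal sketch builds and writes a small MODFLOW-2005 model with FloPy's classic interface; the grid dimensions, property values, and model name are illustrative only, and running the model requires a MODFLOW executable on the system path.

        import flopy

        # Illustrative sketch: a 1-layer, 10 x 10 steady-state model.
        m = flopy.modflow.Modflow(modelname="demo", exe_name="mf2005")
        dis = flopy.modflow.ModflowDis(m, nlay=1, nrow=10, ncol=10,
                                       delr=100.0, delc=100.0, top=10.0, botm=0.0)
        bas = flopy.modflow.ModflowBas(m, ibound=1, strt=10.0)   # all cells active
        lpf = flopy.modflow.ModflowLpf(m, hk=5.0)                # hydraulic conductivity
        pcg = flopy.modflow.ModflowPcg(m)                        # PCG solver
        oc = flopy.modflow.ModflowOc(m)                          # output control
        m.write_input()                     # input files regenerated from the script
        # success, buff = m.run_model()     # uncomment if mf2005 is installed

    Because every input file is regenerated from the script, the model can be rebuilt directly from original data or rediscretized by editing a few arguments, which is the reproducibility benefit described above.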

  11. EFTofPNG: a package for high precision computation with the effective field theory of post-Newtonian gravity

    NASA Astrophysics Data System (ADS)

    Levi, Michele; Steinhoff, Jan

    2017-12-01

    We present a novel public package ‘EFTofPNG’ for high precision computation in the effective field theory of post-Newtonian (PN) gravity, including spins. We created this package in view of the timely need to publicly share automated computation tools that integrate the various types of physics manifested in the expected increasing influx of gravitational wave (GW) data. Hence, we created a free and open source package that is self-contained, modular, all-inclusive, and accessible to the classical gravity community. The ‘EFTofPNG’ Mathematica package also uses the power of the ‘xTensor’ package, suited for complicated tensor computation; our coding also strategically approaches the generic generation of Feynman contractions, which is universal to all perturbation theories in physics, by efficiently treating n-point functions as tensors of rank n. The package currently contains four independent units, which serve as subsidiaries to the main one. Its final unit serves as a pipeline for obtaining the final GW templates, and provides the full computation of derivatives and physical observables of interest. The upcoming ‘EFTofPNG’ package version 1.0 should cover the point mass sector, and all the spin sectors, up to the fourth PN order and the two-loop level. We expect and strongly encourage public development of the package to improve its efficiency and to extend it to further PN sectors and observables useful for waveform modelling.

  12. Draco, Version 6.x.x

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Thompson, Kelly; Budge, Kent; Lowrie, Rob

    2016-03-03

    Draco is an object-oriented component library geared towards numerically intensive, radiation (particle) transport applications built for parallel computing hardware. It consists of semi-independent packages and a robust build system. The packages in Draco provide a set of components that can be used by multiple clients to build transport codes. The build system can also be extracted for use in clients. Software includes smart pointers, Design-by-Contract assertions, unit test framework, wrapped MPI functions, a file parser, unstructured mesh data structures, a random number generator, root finders and an angular quadrature component.

  13. FREQ: A computational package for multivariable system loop-shaping procedures

    NASA Technical Reports Server (NTRS)

    Giesy, Daniel P.; Armstrong, Ernest S.

    1989-01-01

    Many approaches in the field of linear, multivariable time-invariant systems analysis and controller synthesis employ loop-shaping procedures wherein design parameters are chosen to shape frequency-response singular value plots of selected transfer matrices. A software package, FREQ, is documented for computing within one unified framework many of the most used multivariable transfer matrices for both continuous and discrete systems. The matrices are evaluated at user-selected frequency values, and their singular values are computed and plotted against frequency. Example computations are presented to demonstrate the use of the FREQ code.
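    To make the loop-shaping quantities concrete, the short NumPy sketch below (an illustration, not the FORTRAN FREQ code) evaluates the transfer matrix G(jw) = C(jwI - A)^(-1)B + D of a toy two-input, two-output state-space system over a user-selected frequency grid and collects its singular values, which are the curves shaped in such design procedures; the state-space matrices are invented for illustration.

        import numpy as np

        A = np.array([[-1.0,  0.5],
                      [ 0.0, -2.0]])       # toy state-space data
        B, C, D = np.eye(2), np.eye(2), np.zeros((2, 2))

        omegas = np.logspace(-2, 2, 200)   # rad/s, user-selected grid
        sv = np.empty((omegas.size, 2))
        for k, w in enumerate(omegas):
            # G(jw) = C (jwI - A)^-1 B + D, evaluated by a complex linear solve
            G = C @ np.linalg.solve(1j * w * np.eye(2) - A, B) + D
            sv[k] = np.linalg.svd(G, compute_uv=False)
        # sv[:, 0] and sv[:, 1]: largest and smallest singular values vs. frequency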

  14. The challenges of packaging combination devices.

    PubMed

    Mankel, George

    2008-01-01

    This article focuses on the development of a packaging format for drug eluting stents where the package not only has to meet the needs of the stent, but also the needs of the drug incorporated into its polymer coating. The package has to allow the transfer of ethylene oxide gas for sterilisation, but when in storage, must provide a barrier to keep out moisture and oxygen. A pouch and commercial scale manufacturing process were developed to incorporate this dual function into one item.

  15. Multiscale Modelling of the 2011 Tohoku Tsunami with Fluidity: Coastal Inundation and Run-up.

    NASA Astrophysics Data System (ADS)

    Hill, J.; Martin-Short, R.; Piggott, M. D.; Candy, A. S.

    2014-12-01

    Tsunami-induced flooding represents one of the most dangerous natural hazards to coastal communities around the world, as exemplified by the Tohoku tsunami of March 2011. In order to further understand this hazard and to design appropriate mitigation, it is necessary to develop versatile, accurate software capable of simulating large scale tsunami propagation and interaction with coastal geomorphology on a local scale. One such software package is Fluidity, an open source, finite element, multiscale code that is capable of solving the fully three dimensional Navier-Stokes equations on unstructured meshes. Such meshes are significantly better at representing complex coastline shapes than structured meshes and have the advantage of allowing variation in element size across a domain. Furthermore, Fluidity incorporates a novel wetting and drying algorithm, which enables accurate, efficient simulation of tsunami run-up over complex, multiscale topography. Fluidity has previously been demonstrated to accurately simulate the 2011 Tohoku tsunami (Oishi et al. 2013), but its wetting and drying facility has not yet been tested on a geographical scale. This study makes use of Fluidity to simulate the 2011 Tohoku tsunami and its interaction with Japan's eastern shoreline, including coastal flooding. The results are validated against observations made by survey teams, aerial photographs and previous modelling efforts in order to evaluate Fluidity's current capabilities and suggest methods of future improvement. The code is shown to perform well at simulating flooding along the topographically complex Tohoku coast of Japan, with major deviations between model and observation arising mainly from limitations imposed by bathymetry resolution, which could be improved in future. In theory, Fluidity is capable of full multiscale tsunami modelling, enabling researchers to understand both wave propagation across ocean basins and flooding of coastal landscapes down to interaction with individual defence structures. This makes the code an exciting candidate for use in future studies aiming to investigate tsunami risk elsewhere in the world. Oishi, Y. et al. Three-dimensional tsunami propagation simulations using an unstructured mesh finite element model. J. Geophys. Res. [Solid Earth] 118, 2998-3018 (2013).

  16. Browndye: A Software Package for Brownian Dynamics

    PubMed Central

    McCammon, J. Andrew

    2010-01-01

    A new software package, Browndye, is presented for simulating the diffusional encounter of two large biological molecules. It can be used to estimate second-order rate constants and encounter probabilities, and to explore reaction trajectories. Browndye builds upon previous knowledge and algorithms from software packages such as UHBD, SDA, and Macrodox, while implementing algorithms that scale to larger systems. PMID:21132109

  17. Visual and x-ray inspection characteristics of eutectic and lead free assemblies

    NASA Technical Reports Server (NTRS)

    Ghaffarian, R.

    2003-01-01

    For high reliability applications, visual inspection has been the key technique for most conventional electronic package assemblies. The use of x-ray techniques has now become an additional inspection requirement for quality control and for detection of defects unique to the manufacturing of advanced electronic array packages such as ball grid arrays (BGAs) and chip scale packages (CSPs).

  18. Waste Generator Instructions: Key to Successful Implementation of the US DOE's 435.1 for Transuranic Waste Packaging Instructions (LA-UR-12-24155) - 13218

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    French, David M.; Hayes, Timothy A.; Pope, Howard L.

    In times of continuing fiscal constraints, a management and operation tool that is straightforward to implement, works as advertised, and virtually ensures compliant waste packaging should be carefully considered and employed wherever practicable. In the near future, the Department of Energy (DOE) will issue the first major update to DOE Order 435.1, Radioactive Waste Management. This update will contain a requirement for sites that do not have a Waste Isolation Pilot Plant (WIPP) waste certification program to use two newly developed technical standards: Contact-Handled Defense Transuranic Waste Packaging Instructions and Remote-Handled Defense Transuranic Waste Packaging Instructions. The technical standards are being developed from the DOE O 435.1 Notice, Contact-Handled and Remote-Handled Transuranic Waste Packaging, approved August 2011. The packaging instructions will provide detailed information and instruction for packaging almost every conceivable type of transuranic (TRU) waste for disposal at WIPP. While providing specificity, the packaging instructions leave to each site's own discretion the actual mechanics of how those instructions will be functionally implemented at the floor level. While the technical standards are designed to provide precise information for compliant packaging, the density of the information in the packaging instructions necessitates a type of Rosetta Stone that translates the requirements into concise, clear, easy to use and operationally practical recipes that are waste stream and facility specific, for use by both first line management and hands-on operations personnel. The Waste Generator Instructions provide the operator with step-by-step instructions that integrate the site's various operational requirements (e.g., health and safety limits, radiological limits or dose limits) and result in a WIPP-certifiable waste package that can be transported to and emplaced at WIPP. These little known but widely productive Waste Generator Instructions (WGIs) have been used occasionally in the past at large sites for treatment and packaging of TRU waste. The WGIs have resulted in highly efficient waste treatment, packaging and certification for disposal of TRU waste at WIPP. For example, a single WGI at LANL, combined with an increase in gram loading, resulted in a mind-boggling 6,400% increase in waste loading for {sup 238}Pu heat source waste. The WGI, combined with a new Contact Handled (CH) TRU Waste Content (TRUCON) Code, provided a massive increase in shippable wattage per Transuranic Package Transporter-II (TRUPACT-II) over the more restrictive TRUCON Code previously used for the heat source waste. The use of the WGI process at LANL's TA-55 facility reduced non-compliant drums for WIPP certification and disposal from a 13% failure rate down to a 0.5% failure rate and is expected to further reduce the failure rate to zero drums per year. The inherent value of the WGI is that it can be implemented in a site's current procedure issuance process and it provides documented proof of what actions were taken for each waste stream packaged. The WGI protocol provides a key floor-level operational component to achieve goal alignment between actual site operations, the WIPP TRU waste packaging instructions, and DOE O 435.1. (authors)

  19. Monte Carlo Particle Lists: MCPL

    NASA Astrophysics Data System (ADS)

    Kittelmann, T.; Klinkby, E.; Knudsen, E. B.; Willendrup, P.; Cai, X. X.; Kanaki, K.

    2017-09-01

    A binary format with lists of particle state information, for interchanging particles between various Monte Carlo simulation applications, is presented. Portable C code for file manipulation is made available to the scientific community, along with converters and plugins for several popular simulation packages.
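    For orientation, the MCPL distribution also provides a small Python module, mcpl, for inspecting such files; the sketch below assumes that module's documented MCPLFile reader and per-particle fields, and the file name is hypothetical.

        import mcpl

        f = mcpl.MCPLFile("output.mcpl")        # hypothetical file name
        for p in f.particles:
            # particle type (PDG code), kinetic energy, and statistical weight
            print(p.pdgcode, p.ekin, p.weight)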

  20. Effect of girder spacing on bridge deck response.

    DOT National Transportation Integrated Search

    2000-12-01

    The purpose of this investigation was to evaluate the use of the commercial finite element code ABAQUS for analysis of reinforced concrete bridge decks and to employ this analysis package to determine the effect of girder spacing on deck response. A ...

  1. Cine: Line excitation by infrared fluorescence in cometary atmospheres

    NASA Astrophysics Data System (ADS)

    de Val-Borro, Miguel; Cordiner, Martin A.; Milam, Stefanie N.; Charnley, Steven B.

    2017-03-01

    CINE is a Python module for calculating infrared pumping efficiencies that can be applied to the most common molecules found in cometary comae such as water, hydrogen cyanide or methanol. Excitation by solar radiation of vibrational bands followed by radiative decay to the ground vibrational state is one of the main mechanisms for molecular excitation in comets. This code calculates the effective pumping rates for rotational levels in the ground vibrational state scaled by the heliocentric distance of the comet. Line transitions are queried from the latest version of the HITRAN spectroscopic repository using the astroquery affiliated package of astropy. Molecular data are obtained from the LAMDA database. These coefficients are useful for modeling rotational emission lines observed in cometary spectra at sub-millimeter wavelengths. Combined with computational methods to solve the radiative transfer equations based, e.g., on the Monte Carlo algorithm, this model can retrieve production rates and rotational temperatures from the observed emission spectrum.
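    The heliocentric scaling mentioned above is ordinary inverse-square dilution of the solar flux, as the trivial sketch below shows (an illustration of the scaling only, not CINE's API; the rate value is invented).

        def scale_pumping_rate(g_1au, r_h):
            # Solar flux, and hence the effective pumping rate, falls off as
            # the inverse square of heliocentric distance r_h (in au).
            return g_1au / r_h ** 2

        print(scale_pumping_rate(g_1au=1.0e-4, r_h=2.0))   # 2.5e-05 s^-1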

  2. A Study of Upgraded Phenolic Curing for RSRM Nozzle Rings

    NASA Technical Reports Server (NTRS)

    Smartt, Ziba

    2000-01-01

    A thermochemical cure model for predicting temperature and degree of cure profiles in curing phenolic parts was developed, validated and refined over several years. The model supports optimization of cure cycles and allows input of properties based upon the types of material and the process by which these materials are used to make nozzle components. The model has been refined to use sophisticated computer graphics to demonstrate the changes in temperature and degree of cure during the curing process. The effort discussed in the paper will be the conversion from an outdated solid modeling input program and SINDA analysis code to an integrated solid modeling and analysis package (I-DEAS solid model and TMG). Also discussed will be the incorporation of updated material properties obtained during full scale curing tests into the cure models and the results for all the Reusable Solid Rocket Motor (RSRM) nozzle rings.

  3. Automotive Gas Turbine Power System-Performance Analysis Code

    NASA Technical Reports Server (NTRS)

    Juhasz, Albert J.

    1997-01-01

    An open cycle gas turbine numerical modelling code suitable for thermodynamic performance analysis (i.e. thermal efficiency, specific fuel consumption, cycle state points, working fluid flowrates, etc.) of automotive and aircraft powerplant applications has been generated at the NASA Lewis Research Center's Power Technology Division. This code can be made available to automotive gas turbine preliminary design efforts, either in its present version or, assuming that resources can be obtained to incorporate empirical models for component weight and packaging volume, in a later version that includes the weight-volume estimator feature. The paper contains a brief discussion of the capabilities of the presently operational version of the code, including a listing of input and output parameters and actual sample output listings.
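    As a flavor of the cycle arithmetic such a code automates, the sketch below evaluates the textbook ideal Brayton-cycle thermal efficiency as a function of compressor pressure ratio; it is an illustration of the underlying thermodynamics, not an excerpt from the NASA code.

        # Ideal (air-standard) Brayton cycle: thermal efficiency depends only on
        # the compressor pressure ratio r and the ratio of specific heats gamma.
        gamma = 1.4                     # approximate value for air
        for r in (8, 12, 16):
            eta = 1.0 - r ** (-(gamma - 1.0) / gamma)
            print(f"pressure ratio {r:2d}: thermal efficiency = {eta:.3f}")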

  4. Warthog: Coupling Status Update

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hart, Shane W. D.; Reardon, Bradley T.

    The Warthog code was developed to couple codes that are developed in both the Multi-Physics Object-Oriented Simulation Environment (MOOSE) from Idaho National Laboratory (INL) and SHARP from Argonne National Laboratory (ANL). The initial phase of this work focused on coupling the neutronics code PROTEUS with the fuel performance code BISON. The main technical challenge involves mapping the power density solution determined by PROTEUS to the fuel in BISON. This presents a challenge since PROTEUS uses the MOAB mesh format, but BISON, like all other MOOSE codes, uses the libMesh format. When coupling the different codes, one must consider that Warthog is a light-weight MOOSE-based program that uses the Data Transfer Kit (DTK) to transfer data between the various mesh types. Users set up inputs for the codes they want to run, and then Warthog transfers the data between them. Currently Warthog supports XSProc from SCALE or the Sub-Group Application Programming Interface (SGAPI) in PROTEUS for generating cross sections. It supports arbitrary geometries using PROTEUS and BISON. DTK will transfer power densities and temperatures between the codes where the domains overlap. In the past fiscal year (FY), much work has gone into demonstrating two-way coupling for simple pin cells of various materials. XSProc was used to calculate the cross sections, which were then passed to PROTEUS in an external file. PROTEUS calculates the fission/power density, and Warthog uses DTK to pass this information to BISON, where it is used as the heat source. BISON then calculates the temperature profile of the pin cell and sends it back to XSProc to obtain the temperature-corrected cross sections. This process is repeated until the convergence criteria (tolerance on the BISON solve, or number of time steps) are reached. Models have been constructed and run for both uranium oxide and uranium silicide fuels. These models demonstrate a clear difference in power shape that is not accounted for in a stand-alone BISON run. Future work involves improving the user interface (UI), likely through integration with the Nuclear Energy Advanced Modeling and Simulation (NEAMS) Workbench. Furthermore, automating the input creation would ease the user experience. The next priority is to continue coupling work with other codes in the SHARP package. Efforts on other projects include work to couple the Nek5000 thermal-hydraulics code to MOOSE, but this is in the preliminary stages.
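    The repeated exchange described above is a Picard (fixed-point) iteration. The sketch below shows only the control flow, with toy surrogate functions standing in for the PROTEUS and BISON solves; the function names, feedback coefficients, and tolerance are invented, and in Warthog the actual field transfer between meshes is performed by DTK.

        def solve_neutronics(temperature):
            # toy surrogate: power density falls as the fuel heats up
            return 200.0 - 0.1 * temperature

        def solve_fuel_performance(power_density):
            # toy surrogate: fuel temperature rises with power density
            return 300.0 + 2.0 * power_density

        temperature = 600.0
        for iteration in range(50):
            power = solve_neutronics(temperature)
            new_temperature = solve_fuel_performance(power)
            if abs(new_temperature - temperature) < 1e-6:   # convergence check
                break
            temperature = new_temperature
        print(iteration, temperature, power)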

  5. R-HyMOD: an R-package for the hydrological model HyMOD

    NASA Astrophysics Data System (ADS)

    Baratti, Emanuele; Montanari, Alberto

    2015-04-01

    A software code for the implementation of the HyMOD hydrological model [1] is presented. HyMOD is a conceptual lumped rainfall-runoff model that is based on the probability-distributed soil storage capacity principle introduced by R. J. Moore in 1985 [2]. The general idea behind this model is to describe the spatial variability of some process parameters, such as the soil structure or the water storage capacities, through probability distribution functions. In HyMOD, the rainfall-runoff process is represented through a nonlinear tank connected with three identical linear tanks in parallel representing the surface flow and a slow-flow tank representing groundwater flow. The model requires the optimization of five parameters: Cmax (the maximum storage capacity within the watershed), β (the degree of spatial variability of the soil moisture capacity within the watershed), α (a factor partitioning the flow between the two series of tanks) and the two residence time parameters of the quick-flow and slow-flow tanks, kquick and kslow respectively. Given its relative simplicity and robustness, the model is widely used in the literature. The input data consist of precipitation and potential evapotranspiration at the given time scale. The R-HyMOD package is composed of a 'canonical' R function implementing HyMOD and a fast FORTRAN implementation. The first can be easily modified and used, for instance, for educational purposes; the second combines the user-friendly R interface with a fast processing core. [1] Boyle D.P. (2000), Multicriteria calibration of hydrological models, Ph.D. dissertation, Dep. of Hydrol. and Water Resour., Univ of Arizona, Tucson. [2] Moore, R.J., (1985), The probability-distributed principle and runoff production at point and basin scale, Hydrol. Sci. J., 30(2), 273-297.
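    For readers unfamiliar with the structure, the Python sketch below re-implements one common HyMOD variant (an illustration, not the R-HyMOD code): probability-distributed soil moisture accounting feeding a slow linear reservoir and a cascade of three quick-flow reservoirs, as in many published implementations, with a deliberately simplified evaporation term.

        import numpy as np

        def hymod(P, PET, Cmax, beta, alpha, kq, ks):
            # P, PET: equal-length rainfall and potential-ET arrays; returns flow.
            S, Ss = 0.0, 0.0                  # soil moisture store, slow tank
            Sq = np.zeros(3)                  # cascade of three quick-flow tanks
            Q = np.zeros(len(P))
            for t in range(len(P)):
                # Probability-distributed storage: F(c) = 1 - (1 - c/Cmax)**beta
                c = Cmax * (1.0 - (1.0 - (beta + 1.0) * S / Cmax) ** (1.0 / (beta + 1.0)))
                er1 = max(P[t] - Cmax + c, 0.0)           # whole basin saturated
                p = P[t] - er1
                c2 = min(c + p, Cmax)
                S_new = (Cmax / (beta + 1.0)) * (1.0 - (1.0 - c2 / Cmax) ** (beta + 1.0))
                er2 = max(p - (S_new - S), 0.0)           # partial-area excess
                evap = PET[t] * S_new / (Cmax / (beta + 1.0))   # simplified ET
                S = max(S_new - evap, 0.0)
                # alpha splits effective rainfall between quick and slow pathways
                uq, us = alpha * (er1 + er2), (1.0 - alpha) * (er1 + er2)
                Ss += us; qs = ks * Ss; Ss -= qs          # slow linear reservoir
                for i in range(3):                        # quick-flow cascade
                    Sq[i] += uq; uq = kq * Sq[i]; Sq[i] -= uq
                Q[t] = qs + uq
            return Q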

  6. NullSeq: A Tool for Generating Random Coding Sequences with Desired Amino Acid and GC Contents.

    PubMed

    Liu, Sophia S; Hockenberry, Adam J; Lancichinetti, Andrea; Jewett, Michael C; Amaral, Luís A N

    2016-11-01

    The existence of over- and under-represented sequence motifs in genomes provides evidence of selective evolutionary pressures on biological mechanisms such as transcription, translation, ligand-substrate binding, and host immunity. In order to accurately identify motifs and other genome-scale patterns of interest, it is essential to be able to generate accurate null models that are appropriate for the sequences under study. While many tools have been developed to create random nucleotide sequences, protein coding sequences are subject to a unique set of constraints that complicates the process of generating appropriate null models. There are currently no tools available that allow users to create random coding sequences with specified amino acid composition and GC content for the purpose of hypothesis testing. Using the principle of maximum entropy, we developed a method that generates unbiased random sequences with pre-specified amino acid and GC content, which we have developed into a python package. Our method is the simplest way to obtain maximally unbiased random sequences that are subject to GC usage and primary amino acid sequence constraints. Furthermore, this approach can easily be expanded to create unbiased random sequences that incorporate more complicated constraints such as individual nucleotide usage or even di-nucleotide frequencies. The ability to generate correctly specified null models will allow researchers to accurately identify sequence motifs which will lead to a better understanding of biological processes as well as more effective engineering of biological systems.
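    The maximum-entropy construction admits a compact sketch: among synonymous codons, the least-biased distribution with a shifted expected GC content is an exponentially tilted one. The Python fragment below illustrates the principle (it is not the NullSeq code); the codon table is truncated for brevity, and in practice the tilt parameter lam would be tuned, e.g. by bisection, until the target GC content is reached.

        import math, random

        # Truncated codon table for illustration; a real tool uses all 61 sense codons.
        CODONS = {
            "M": ["ATG"],
            "F": ["TTT", "TTC"],
            "G": ["GGT", "GGC", "GGA", "GGG"],
            "L": ["TTA", "TTG", "CTT", "CTC", "CTA", "CTG"],
        }

        def gc(codon):
            return sum(base in "GC" for base in codon)

        def sample_codon(aa, lam):
            # exponential tilt: lam > 0 favors GC-rich synonymous codons
            weights = [math.exp(lam * gc(c)) for c in CODONS[aa]]
            return random.choices(CODONS[aa], weights=weights)[0]

        def random_cds(protein, lam):
            return "".join(sample_codon(aa, lam) for aa in protein)

        print(random_cds("MGFL", lam=1.0))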

  7. Deterministic Local Sensitivity Analysis of Augmented Systems - II: Applications to the QUENCH-04 Experiment Using the RELAP5/MOD3.2 Code System

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ionescu-Bujor, Mihaela; Jin Xuezhou; Cacuci, Dan G.

    2005-09-15

    The adjoint sensitivity analysis procedure for augmented systems for application to the RELAP5/MOD3.2 code system is illustrated. Specifically, the adjoint sensitivity model corresponding to the heat structure models in RELAP5/MOD3.2 is derived and subsequently augmented to the two-fluid adjoint sensitivity model (ASM-REL/TF). The end product, called ASM-REL/TFH, comprises the complete adjoint sensitivity model for the coupled fluid dynamics/heat structure packages of the large-scale simulation code RELAP5/MOD3.2. The ASM-REL/TFH model is validated by computing sensitivities to the initial conditions for various time-dependent temperatures in the test bundle of the Quench-04 reactor safety experiment. This experiment simulates the reflooding with water of uncovered, degraded fuel rods, clad with material (Zircaloy-4) that has the same composition and size as that used in typical pressurized water reactors. The most important response for the Quench-04 experiment is the time evolution of the cladding temperature of heated fuel rods. The ASM-REL/TFH model is subsequently used to perform an illustrative sensitivity analysis of this and other time-dependent temperatures within the bundle. The results computed by using the augmented adjoint sensitivity system, ASM-REL/TFH, highlight the reliability, efficiency, and usefulness of the adjoint sensitivity analysis procedure for computing time-dependent sensitivities.

  8. Xyce™ Parallel Electronic Simulator Users' Guide, Version 6.5.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Keiter, Eric R.; Aadithya, Karthik V.; Mei, Ting

    This manual describes the use of the Xyce Parallel Electronic Simulator. Xyce has been designed as a SPICE-compatible, high-performance analog circuit simulator, and has been written to support the simulation needs of the Sandia National Laboratories electrical designers. This development has focused on improving capability over the current state-of-the-art in the following areas: capability to solve extremely large circuit problems by supporting large-scale parallel computing platforms (up to thousands of processors), including support for most popular parallel and serial computers; a differential-algebraic-equation (DAE) formulation, which better isolates the device model package from solver algorithms and allows one to develop new types of analysis without requiring the implementation of analysis-specific device models; device models that are specifically tailored to meet Sandia's needs, including some radiation-aware devices (for Sandia users only); and object-oriented code design and implementation using modern coding practices. Xyce is a parallel code in the most general sense of the phrase -- a message passing parallel implementation -- which allows it to run efficiently on a wide range of computing platforms, including serial, shared-memory and distributed-memory parallel platforms. Attention has been paid to the specific nature of circuit-simulation problems to ensure that optimal parallel efficiency is achieved as the number of processors grows. The information herein is subject to change without notice. Copyright © 2002-2016 Sandia Corporation. All rights reserved.
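    The value of a DAE formulation can be seen in miniature: each device only contributes terms to a global residual F(x, dx/dt) = 0, and a generic implicit integrator drives that residual to zero without knowing anything device-specific. The sketch below applies backward Euler to a single-node RC circuit driven by a current source; all values are invented, and this illustrates the formulation, not Xyce's internals.

        # Node equation in residual form: C*dv/dt + v/R - I = 0.
        # Backward Euler: C*(v_new - v_old)/h + v_new/R - I = 0, linear in v_new.
        R, C, I, h = 1.0e3, 1.0e-6, 1.0e-3, 1.0e-5   # ohms, farads, amps, seconds
        v = 0.0
        for step in range(100):                      # simulate one time constant (1 ms)
            v = (C * v / h + I) / (C / h + 1.0 / R)
        print(f"v after 1.0 ms = {v:.3f} V (steady-state limit: {I * R:.1f} V)")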

  9. Fourteen Years of R/qtl: Just Barely Sustainable

    PubMed Central

    Broman, Karl W.

    2014-01-01

    R/qtl is an R package for mapping quantitative trait loci (genetic loci that contribute to variation in quantitative traits) in experimental crosses. Its development began in 2000. There have been 38 software releases since 2001. The latest release contains 35k lines of R code and 24k lines of C code, plus 15k lines of code for the documentation. Challenges in the development and maintenance of the software are discussed. A key to the success of R/qtl is that it remains a central tool for the chief developer's own research work, and so its maintenance is of selfish importance. PMID:25364504

  10. Resilience Among Students at the Basic Enlisted Submarine School

    DTIC Science & Technology

    2016-12-01

    Self-reported resilience was examined among students at the Basic Enlisted Submarine School. The Hayes macro in the Statistical Package for the Social Sciences (SPSS) was used to uncover factors relevant to mediation analysis. Findings suggest that the encouragement of... Acronyms used in the source: RTC (Recruit Training Command), SPSS (Statistical Package for the Social Sciences), SS (Social Support), SWB (Subjective Well-Being), and the Response to Stressful Experiences Scale.

  11. A Zeus++ Code Tool, a Method for Implementing Same, and Storage Medium Storing Computer Readable Instructions for Instantiating the Zeus++ Code Tool

    DTIC Science & Technology

    1999-12-01

    ...applications, it should be understood that the invention is not limited thereto. Those having ordinary skill in the art and access... processing. It should also be mentioned that Tecplot is a commercial plotting software package produced by Amtec Engineering, Inc. The following... conditions) 7. Ch (based on edge conditions) 8. Ch (based on reference conditions) 9. Momentum thickness 10. Displacement

  12. REBURNING APPLICATION TO FIRETUBE PACKAGE BOILERS

    EPA Science Inventory

    The report gives results of pilot-scale experimental research that examined the physical and chemical phenomena associated with the NOx control technology of reburning applied to gas- and liquid-fired firetube package boilers. Reburning (staged fuel combustion) diverts some of th...

  13. Software for Computing, Archiving, and Querying Semisimple Braided Monoidal Category Data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    This software package collects various open source and freely available codes and algorithms to compute and archive the categorical data for certain semisimple braided monoidal categories. In particular, it computes the data of group-theoretical categories for academic research.

  14. GENERAL PURPOSE ADA PACKAGES

    NASA Technical Reports Server (NTRS)

    Klumpp, A. R.

    1994-01-01

    Ten families of subprograms are bundled together for the General-Purpose Ada Packages. The families bring to Ada many features from HAL/S, PL/I, FORTRAN, and other languages. These families are: string subprograms (INDEX, TRIM, LOAD, etc.); scalar subprograms (MAX, MIN, REM, etc.); array subprograms (MAX, MIN, PROD, SUM, GET, and PUT); numerical subprograms (EXP, CUBIC, etc.); service subprograms (DATE_TIME function, etc.); Linear Algebra II; Runge-Kutta integrators; and three text I/O families of packages. In two cases, a family consists of a single non-generic package. In all other cases, a family comprises a generic package and its instances for a selected group of scalar types. All generic packages are designed to be easily instantiated for the types declared in the user facility. The linear algebra package is LINRAG2. This package includes subprograms supplementing those in NPO-17985, An Ada Linear Algebra Package Modeled After HAL/S (LINRAG). Please note that LINRAG2 cannot be compiled without LINRAG. Most packages have widespread applicability, although some are oriented for avionics applications. All are designed to facilitate writing new software in Ada. Several of the packages use conventions introduced by other programming languages. A package of string subprograms is based on HAL/S (a language designed for the avionics software in the Space Shuttle) and PL/I. Packages of scalar and array subprograms are taken from HAL/S or generalized current Ada subprograms. A package of Runge-Kutta integrators is patterned after a built-in MAC (MIT Algebraic Compiler) integrator. Those packages modeled after HAL/S make it easy to translate existing HAL/S software to Ada. The General-Purpose Ada Packages program source code is available on two 360K 5.25" MS-DOS format diskettes. The software was developed using VAX Ada v1.5 under DEC VMS v4.5. It should be portable to any validated Ada compiler and it should execute either interactively or in batch. The largest package requires 205K of main memory on a DEC VAX running VMS. The software was developed in 1989, and is a copyrighted work with all copyright vested in NASA.

  15. MWASTools: an R/bioconductor package for metabolome-wide association studies.

    PubMed

    Rodriguez-Martinez, Andrea; Posma, Joram M; Ayala, Rafael; Neves, Ana L; Anwar, Maryam; Petretto, Enrico; Emanueli, Costanza; Gauguier, Dominique; Nicholson, Jeremy K; Dumas, Marc-Emmanuel

    2018-03-01

    MWASTools is an R package designed to provide an integrated pipeline to analyse metabonomic data in large-scale epidemiological studies. Key functionalities of our package include: quality control analysis; metabolome-wide association analysis using various models (partial correlations, generalized linear models); visualization of statistical outcomes; metabolite assignment using statistical total correlation spectroscopy (STOCSY); and biological interpretation of metabolome-wide association studies results. The MWASTools R package is implemented in R (version >= 3.4) and is available from Bioconductor: https://bioconductor.org/packages/MWASTools/. m.dumas@imperial.ac.uk. Supplementary data are available at Bioinformatics online. © The Author(s) 2017. Published by Oxford University Press.

  16. Design Optimization Toolkit: Users' Manual

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Aguilo Valentin, Miguel Alejandro

    The Design Optimization Toolkit (DOTk) is a stand-alone C++ software package intended to solve complex design optimization problems. The DOTk software package provides a range of solution methods that are suited for gradient/nongradient-based optimization, large scale constrained optimization, and topology optimization. DOTk was designed to have a flexible user interface to allow easy access to DOTk solution methods from external engineering software packages. This inherent flexibility makes DOTk minimally intrusive to other engineering software packages. As part of this inherent flexibility, the DOTk software package provides an easy-to-use MATLAB interface that enables users to call DOTk solution methods directly from the MATLAB command window.

  17. Edible packaging materials.

    PubMed

    Janjarasskul, Theeranun; Krochta, John M

    2010-01-01

    Research groups and the food and pharmaceutical industries recognize edible packaging as a useful alternative or addition to conventional packaging to reduce waste and to create novel applications for improving product stability, quality, safety, variety, and convenience for consumers. Recent studies have explored the ability of biopolymer-based food packaging materials to carry and control-release active compounds. As diverse edible packaging materials derived from various by-products or waste from food industry are being developed, the dry thermoplastic process is advancing rapidly as a feasible commercial edible packaging manufacturing process. The employment of nanocomposite concepts to edible packaging materials promises to improve barrier and mechanical properties and facilitate effective incorporation of bioactive ingredients and other designed functions. In addition to the need for a more fundamental understanding to enable design to desired specifications, edible packaging has to overcome challenges such as regulatory requirements, consumer acceptance, and scaling-up research concepts to commercial applications.

  18. Light element opacities of astrophysical interest from ATOMIC

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Colgan, J.; Kilcrease, D. P.; Magee, N. H. Jr.

    We present new calculations of local-thermodynamic-equilibrium (LTE) light element opacities from the Los Alamos ATOMIC code for systems of astrophysical interest. ATOMIC is a multi-purpose code that can generate LTE or non-LTE quantities of interest at various levels of approximation. Our calculations, which include fine-structure detail, represent a systematic improvement over previous Los Alamos opacity calculations using the LEDCOP legacy code. The ATOMIC code uses ab-initio atomic structure data computed from the CATS code, which is based on Cowan's atomic structure codes, and photoionization cross section data computed from the Los Alamos ionization code GIPPER. ATOMIC also incorporates a new equation-of-state (EOS) model based on the chemical picture. ATOMIC incorporates some physics packages from LEDCOP and also includes additional physical processes, such as improved free-free cross sections and additional scattering mechanisms. Our new calculations are made for elements of astrophysical interest and for a wide range of temperatures and densities.

  19. WDEC: A Code for Modeling White Dwarf Structure and Pulsations

    NASA Astrophysics Data System (ADS)

    Bischoff-Kim, Agnès; Montgomery, Michael H.

    2018-05-01

    The White Dwarf Evolution Code (WDEC), written in Fortran, makes models of white dwarf stars. It is fast, versatile, and includes the latest physics. The code evolves hot (∼100,000 K) input models down to a chosen effective temperature by relaxing the models to be solutions of the equations of stellar structure. The code can also be used to obtain g-mode oscillation modes for the models. WDEC has a long history going back to the late 1960s. Over the years, it has been updated and re-packaged for modern computer architectures and has specifically been used in computationally intensive asteroseismic fitting. Generations of white dwarf astronomers and dozens of publications have made use of the WDEC, although the last true instrument paper is the original one, published in 1975. This paper discusses the history of the code, necessary to understand why it works the way it does, details the physics and features in the code today, and points the reader to where to find the code and a user guide.

  20. Extreme scale multi-physics simulations of the tsunamigenic 2004 Sumatra megathrust earthquake

    NASA Astrophysics Data System (ADS)

    Ulrich, T.; Gabriel, A. A.; Madden, E. H.; Wollherr, S.; Uphoff, C.; Rettenberger, S.; Bader, M.

    2017-12-01

    SeisSol (www.seissol.org) is an open-source software package based on an arbitrary high-order derivative Discontinuous Galerkin method (ADER-DG). It solves spontaneous dynamic rupture propagation on pre-existing fault interfaces according to non-linear friction laws, coupled to seismic wave propagation with high-order accuracy in space and time (minimal dispersion errors). SeisSol exploits unstructured meshes to account for complex geometries, e.g. high resolution topography and bathymetry, 3D subsurface structure, and fault networks. We present the largest (1500 km of faults) and longest (500 s) dynamic rupture simulation to date, modeling the 2004 Sumatra-Andaman earthquake. We demonstrate the need for end-to-end optimization and petascale performance of scientific software to realize realistic simulations on the extreme scales of subduction zone earthquakes: considering the full complexity of subduction zone geometries leads inevitably to huge differences in element sizes. The main code improvements include a cache-aware wave propagation scheme and optimizations of the dynamic rupture kernels using code generation. In addition, a novel clustered local-time-stepping scheme for dynamic rupture has been established. Finally, asynchronous output has been implemented to overlap I/O and compute time. We resolve the frictional sliding process on the curved mega-thrust and a system of splay faults, as well as the seismic wave field and seafloor displacement with frequency content up to 2.2 Hz. We validate the scenario against geodetic, seismological and tsunami observations. The resulting rupture dynamics shed new light on the activation and importance of splay faults.
