Sample records for general purpose codes

  1. Interactive Finite Elements for General Engine Dynamics Analysis

    NASA Technical Reports Server (NTRS)

    Adams, M. L.; Padovan, J.; Fertis, D. G.

    1984-01-01

    General nonlinear finite element codes were adapted for the purpose of analyzing the dynamics of gas turbine engines. In particular, this adaptation required the development of a squeeze-film damper element software package and its implementation into a representative current generation code. The ADINA code was selected because of prior use of it and familiarity with its internal structure and logic. This objective was met, and the results indicate that such use of general purpose codes is a viable alternative to specialized codes for general dynamics analysis of engines.

  2. 48 CFR 1501.105-1 - Publication and code arrangement.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 48 Federal Acquisition Regulations System 6 2011-10-01 2011-10-01 false Publication and code arrangement. 1501.105-1 Section 1501.105-1 Federal Acquisition Regulations System ENVIRONMENTAL PROTECTION AGENCY GENERAL GENERAL Purpose, Authority, Issuance 1501.105-1 Publication and code arrangement. The...

  3. 48 CFR 501.105-1 - Publication and code arrangement.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 48 Federal Acquisition Regulations System 4 2010-10-01 2010-10-01 false Publication and code arrangement. 501.105-1 Section 501.105-1 Federal Acquisition Regulations System GENERAL SERVICES ADMINISTRATION GENERAL GENERAL SERVICES ADMINISTRATION ACQUISITION REGULATION SYSTEM Purpose, Authority, Issuance...

  4. General-Purpose Serial Interface For Remote Control

    NASA Technical Reports Server (NTRS)

    Busquets, Anthony M.; Gupton, Lawrence E.

    1990-01-01

    Computer controls remote television camera. General-purpose controller developed to serve as interface between host computer and pan/tilt/zoom/focus functions on series of automated video cameras. Interface port based on 8251 programmable communications-interface circuit configured for tristated outputs, and connects controller system to any host computer with RS-232 input/output (I/O) port. Accepts byte-coded data from host, compares them with prestored codes in read-only memory (ROM), and closes or opens appropriate switches. Six output ports control opening and closing of as many as 48 switches. Operator controls remote television camera by speaking commands, in system including general-purpose controller.
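    The ROM-lookup dispatch described in record 4 can be sketched as follows. All byte codes, switch names, and the table layout below are invented for illustration; the abstract does not give them.

```python
# Hypothetical sketch of the controller's dispatch loop: each received
# command byte is compared against a prestored table (standing in for the
# ROM) and mapped to a switch open/close action.

COMMAND_ROM = {
    0x10: ("pan", "left", "close"),
    0x11: ("pan", "right", "close"),
    0x20: ("tilt", "up", "close"),
    0x21: ("tilt", "down", "close"),
    0x00: ("all", "stop", "open"),
}

def dispatch(byte, switches):
    """Compare a received byte against the prestored table and update switches."""
    entry = COMMAND_ROM.get(byte)
    if entry is None:
        return switches  # unknown codes are ignored
    function, direction, action = entry
    switches[(function, direction)] = (action == "close")
    return switches

switches = {}
for b in (0x10, 0x20, 0x00):
    switches = dispatch(b, switches)
```

    In the real device the comparison and switch closures happen in hardware via the 8251 interface; the dictionary here only models the byte-to-action mapping.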

  5. NSWC Library of Mathematics Subroutines

    DTIC Science & Technology

    1993-01-01

    standards concerning in-line documentation and the style of code cannot be imposed. In general, all supportive subroutines not intended for direct use are...proprietary or otherwise restricted codes have been permitted in the library. Only general purpose mathematical subroutines for use by the entire NSWCDD...where the source codes are frequently of prime importance), and for general use in applications. Since expertise is so widely scattered, reliable

  6. Interfacing a General Purpose Fluid Network Flow Program with the SINDA/G Thermal Analysis Program

    NASA Technical Reports Server (NTRS)

    Schallhorn, Paul; Popok, Daniel

    1999-01-01

    A general purpose, one dimensional fluid flow code is currently being interfaced with the thermal analysis program Systems Improved Numerical Differencing Analyzer/Gaski (SINDA/G). The flow code, Generalized Fluid System Simulation Program (GFSSP), is capable of analyzing steady state and transient flow in a complex network. The flow code is capable of modeling several physical phenomena including compressibility effects, phase changes, body forces (such as gravity and centrifugal) and mixture thermodynamics for multiple species. The addition of GFSSP to SINDA/G provides a significant improvement in convective heat transfer modeling for SINDA/G. The interface development is conducted in multiple phases. This paper describes the first phase of the interface which allows for steady and quasi-steady (unsteady solid, steady fluid) conjugate heat transfer modeling.
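    The quasi-steady scheme of record 6 (unsteady solid, steady fluid) can be illustrated with a toy coupling loop. This is an assumed sketch, not the actual GFSSP/SINDA interface; every coefficient and function below is invented.

```python
# Quasi-steady conjugate heat transfer: the solid is marched in time
# while the fluid is re-solved to steady state at every time step.

def steady_fluid_temp(T_wall, T_inlet=300.0, effectiveness=0.5):
    # Steady fluid sub-solve: outlet temperature relaxes toward the wall.
    return T_inlet + effectiveness * (T_wall - T_inlet)

def march(T_wall=400.0, steps=50, dt=0.1, hA_over_C=0.2):
    # Unsteady solid: lumped-capacitance wall node cooled by the fluid.
    for _ in range(steps):
        T_fluid = steady_fluid_temp(T_wall)            # steady fluid pass
        T_wall += dt * hA_over_C * (T_fluid - T_wall)  # transient solid step
    return T_wall

T_final = march()  # wall cools toward the 300 K inlet temperature
```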

  7. Comparison of Einstein-Boltzmann solvers for testing general relativity

    NASA Astrophysics Data System (ADS)

    Bellini, E.; Barreira, A.; Frusciante, N.; Hu, B.; Peirone, S.; Raveri, M.; Zumalacárregui, M.; Avilez-Lopez, A.; Ballardini, M.; Battye, R. A.; Bolliet, B.; Calabrese, E.; Dirian, Y.; Ferreira, P. G.; Finelli, F.; Huang, Z.; Ivanov, M. M.; Lesgourgues, J.; Li, B.; Lima, N. A.; Pace, F.; Paoletti, D.; Sawicki, I.; Silvestri, A.; Skordis, C.; Umiltà, C.; Vernizzi, F.

    2018-01-01

    We compare Einstein-Boltzmann solvers that include modifications to general relativity and find that, for a wide range of models and parameters, they agree to a high level of precision. We look at three general purpose codes that primarily model general scalar-tensor theories, three codes that model Jordan-Brans-Dicke (JBD) gravity, a code that models f (R ) gravity, a code that models covariant Galileons, a code that models Hořava-Lifschitz gravity, and two codes that model nonlocal models of gravity. Comparing predictions of the angular power spectrum of the cosmic microwave background and the power spectrum of dark matter for a suite of different models, we find agreement at the subpercent level. This means that this suite of Einstein-Boltzmann solvers is now sufficiently accurate for precision constraints on cosmological and gravitational parameters.
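    The "subpercent agreement" criterion in record 7 amounts to bounding the maximum fractional difference between two codes' spectra. The spectra below are synthetic stand-ins, not outputs of any of the solvers named above.

```python
import numpy as np

# Two codes agree at the subpercent level if the largest fractional
# difference across the spectrum is below 1e-2.

def max_fractional_diff(p1, p2):
    return np.max(np.abs(p1 - p2) / np.abs(p1))

k = np.logspace(-3, 0, 100)
p_code_a = 1.0e4 * k**-1.5
p_code_b = p_code_a * (1.0 + 5e-3 * np.sin(10 * k))  # 0.5% wiggle

subpercent = max_fractional_diff(p_code_a, p_code_b) < 1e-2
```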

  8. Engine dynamic analysis with general nonlinear finite element codes. II - Bearing element implementation, overall numerical characteristics and benchmarking

    NASA Technical Reports Server (NTRS)

    Padovan, J.; Adams, M.; Lam, P.; Fertis, D.; Zeid, I.

    1982-01-01

    Second-year efforts within a three-year study to develop and extend finite element (FE) methodology to efficiently handle the transient/steady state response of rotor-bearing-stator structure associated with gas turbine engines are outlined. The two main areas aim at (1) implementing the squeeze film damper element into a general purpose FE code for testing and evaluation; and (2) determining the numerical characteristics of the FE-generated rotor-bearing-stator simulation scheme. The governing FE field equations are set out and the solution methodology is presented. The choice of ADINA as the general-purpose FE code is explained, and the numerical operational characteristics of the direct integration approach of FE-generated rotor-bearing-stator simulations are determined, including benchmarking, comparison of explicit vs. implicit methodologies of direct integration, and demonstration problems.

  9. 32 CFR 935.2 - Purpose.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... CODE General § 935.2 Purpose. The purpose of this part is to provide— (a) For the civil administration of Wake Island; (b) Civil laws for Wake Island not otherwise provided for; (c) Criminal laws for Wake...

  10. 32 CFR 935.2 - Purpose.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... CODE General § 935.2 Purpose. The purpose of this part is to provide— (a) For the civil administration of Wake Island; (b) Civil laws for Wake Island not otherwise provided for; (c) Criminal laws for Wake...

  11. 32 CFR 935.2 - Purpose.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... CODE General § 935.2 Purpose. The purpose of this part is to provide— (a) For the civil administration of Wake Island; (b) Civil laws for Wake Island not otherwise provided for; (c) Criminal laws for Wake...

  12. 32 CFR 935.2 - Purpose.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... CODE General § 935.2 Purpose. The purpose of this part is to provide— (a) For the civil administration of Wake Island; (b) Civil laws for Wake Island not otherwise provided for; (c) Criminal laws for Wake...

  13. 32 CFR 935.2 - Purpose.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... CODE General § 935.2 Purpose. The purpose of this part is to provide— (a) For the civil administration of Wake Island; (b) Civil laws for Wake Island not otherwise provided for; (c) Criminal laws for Wake...

  14. 48 CFR 1.105-1 - Publication and code arrangement.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 48 Federal Acquisition Regulations System 1 2010-10-01 2010-10-01 false Publication and code arrangement. 1.105-1 Section 1.105-1 Federal Acquisition Regulations System FEDERAL ACQUISITION REGULATION GENERAL FEDERAL ACQUISITION REGULATIONS SYSTEM Purpose, Authority, Issuance 1.105-1 Publication and code...

  15. South Carolina TEC Student Code.

    ERIC Educational Resources Information Center

    Edwards, C. A., Ed.; Kiser, J. A., Ed.

    This student code has statewide application to South Carolina Technical Colleges and Technical Education Centers (TEC). Provisions are divided into eight articles: (1) General Provisions, including the purpose of a student code, the precept of internal solution of problems, and definitions; (2) Student Rights, including Bill of Rights protections;…

  16. 48 CFR 901.105-1 - Publication and code arrangement.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... 48 Federal Acquisition Regulations System 5 2013-10-01 2013-10-01 false Publication and code arrangement. 901.105-1 Section 901.105-1 Federal Acquisition Regulations System DEPARTMENT OF ENERGY GENERAL FEDERAL ACQUISITION REGULATIONS SYSTEM Purpose, Authority, Issuance 901.105-1 Publication and code...

  17. 48 CFR 1.105-1 - Publication and code arrangement.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... 48 Federal Acquisition Regulations System 1 2012-10-01 2012-10-01 false Publication and code arrangement. 1.105-1 Section 1.105-1 Federal Acquisition Regulations System FEDERAL ACQUISITION REGULATION GENERAL FEDERAL ACQUISITION REGULATIONS SYSTEM Purpose, Authority, Issuance 1.105-1 Publication and code...

  18. 48 CFR 901.105-1 - Publication and code arrangement.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... 48 Federal Acquisition Regulations System 5 2012-10-01 2012-10-01 false Publication and code arrangement. 901.105-1 Section 901.105-1 Federal Acquisition Regulations System DEPARTMENT OF ENERGY GENERAL FEDERAL ACQUISITION REGULATIONS SYSTEM Purpose, Authority, Issuance 901.105-1 Publication and code...

  19. 48 CFR 1.105-1 - Publication and code arrangement.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 48 Federal Acquisition Regulations System 1 2014-10-01 2014-10-01 false Publication and code arrangement. 1.105-1 Section 1.105-1 Federal Acquisition Regulations System FEDERAL ACQUISITION REGULATION GENERAL FEDERAL ACQUISITION REGULATIONS SYSTEM Purpose, Authority, Issuance 1.105-1 Publication and code...

  20. Numerical Analysis of 2-D and 3-D MHD Flows Relevant to Fusion Applications

    DOE PAGES

    Khodak, Andrei

    2017-08-21

    Here, the analysis of many fusion applications such as liquid-metal blankets requires application of computational fluid dynamics (CFD) methods for electrically conductive liquids in geometrically complex regions and in the presence of a strong magnetic field. A current state-of-the-art general purpose CFD code allows modeling of the flow in complex geometric regions, with simultaneous conjugate heat transfer analysis in the liquid and surrounding solid parts. Together with a magnetohydrodynamics (MHD) capability, the general purpose CFD code will be a valuable tool for the design and optimization of fusion devices. This paper describes an introduction of MHD capability into the general purpose CFD code CFX, part of the ANSYS Workbench. The code was adapted for MHD problems using a magnetic induction approach. CFX allows introduction of user-defined variables using transport or Poisson equations. For MHD adaptation of the code, three additional transport equations were introduced for the components of the magnetic field, in addition to the Poisson equation for the electric potential. The Lorentz force is included in the momentum transport equation as a source term. Fusion applications usually involve very strong magnetic fields, with values of the Hartmann number of up to tens of thousands. In this situation the system of MHD equations becomes very stiff, with very large source terms and very strong variable gradients. To increase robustness, special measures were introduced during the iterative convergence process, such as linearization using a source coefficient for the momentum equations. The MHD implementation in the general purpose CFD code was tested against benchmarks specifically selected for liquid-metal blanket applications. Results of numerical simulations using the present implementation closely match analytical solutions for Hartmann numbers of up to 1500 for a 2-D laminar flow in a duct of square cross section, with conducting and nonconducting walls. Results for a 3-D test case are also included.
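    The analytical benchmarks mentioned in record 20 are classical MHD duct solutions. As a simpler relative of the square-duct case, the 1-D laminar Hartmann (plane-channel) profile can be written in closed form, with walls at y = ±1 and velocity normalized to 1 at the centerline; this sketch is illustrative and is not taken from the paper.

```python
import math

# Laminar Hartmann channel flow: flat core with boundary layers that
# thin as ~1/Ha at high Hartmann number.

def hartmann_profile(y, Ha):
    """Normalized streamwise velocity at Hartmann number Ha."""
    return (math.cosh(Ha) - math.cosh(Ha * y)) / (math.cosh(Ha) - 1.0)

core = hartmann_profile(0.0, 50.0)  # flat core at the centerline
wall = hartmann_profile(1.0, 50.0)  # no-slip wall
```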

  2. Engine dynamic analysis with general nonlinear finite element codes. Part 2: Bearing element implementation, overall numerical characteristics and benchmarking

    NASA Technical Reports Server (NTRS)

    Padovan, J.; Adams, M.; Fertis, J.; Zeid, I.; Lam, P.

    1982-01-01

    Finite element codes are used in modelling the rotor-bearing-stator structure common to the turbine industry. Strategies are developed which enable engine dynamic simulation with available finite element codes; the elements developed are benchmarked by incorporation into a general purpose code (ADINA); the numerical characteristics of finite element rotor-bearing-stator simulations are evaluated through the use of various types of explicit/implicit numerical integration operators; and the overall numerical efficiency of the procedure is improved.

  3. Signal Processing Expert Code (SPEC)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ames, H.S.

    1985-12-01

    The purpose of this paper is to describe a prototype expert system called SPEC, developed to demonstrate the utility of providing an intelligent interface for users of SIG, a general purpose signal processing code. The expert system is written in NIL, runs on a VAX 11/750, and consists of a backward chaining inference engine and an English-like parser. The inference engine uses knowledge encoded as rules about the formats of SIG commands and about how to perform frequency analyses using SIG. The system demonstrated that an expert system can be used to control existing codes.

  4. A Multiple Sphere T-Matrix Fortran Code for Use on Parallel Computer Clusters

    NASA Technical Reports Server (NTRS)

    Mackowski, D. W.; Mishchenko, M. I.

    2011-01-01

    A general-purpose Fortran-90 code for calculation of the electromagnetic scattering and absorption properties of multiple sphere clusters is described. The code can calculate the efficiency factors and scattering matrix elements of the cluster for either fixed or random orientation with respect to the incident beam and for plane wave or localized-approximation Gaussian incident fields. In addition, the code can calculate maps of the electric field both interior and exterior to the spheres. The code is written with message passing interface instructions to enable use on distributed-memory compute clusters, and for such platforms the code can make feasible the calculation of absorption, scattering, and general EM characteristics of systems containing several thousand spheres.

  5. CRUNCH_PARALLEL

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shumaker, Dana E.; Steefel, Carl I.

    The code CRUNCH_PARALLEL is a parallel version of the CRUNCH code. CRUNCH code version 2.0 was previously released by LLNL (UCRL-CODE-200063). CRUNCH is a general purpose reactive transport code developed by Carl Steefel and Yabusaki (Steefel and Yabusaki, 1996). The code handles non-isothermal transport and reaction in one, two, and three dimensions. The reaction algorithm is generic in form, handling an arbitrary number of aqueous and surface complexation reactions as well as mineral dissolution/precipitation. A standardized database is used containing thermodynamic and kinetic data. The code includes advective, dispersive, and diffusive transport.

  6. Coding of procedures documented by general practitioners in Swedish primary care-an explorative study using two procedure coding systems

    PubMed Central

    2012-01-01

    Background Procedures documented by general practitioners in primary care have not been studied in relation to procedure coding systems. We aimed to describe procedures documented by Swedish general practitioners in electronic patient records and to compare them to the Swedish Classification of Health Interventions (KVÅ) and SNOMED CT. Methods Procedures in 200 record entries were identified, coded, assessed in relation to two procedure coding systems and analysed. Results 417 procedures found in the 200 electronic patient record entries were coded with 36 different Classification of Health Interventions categories and 148 different SNOMED CT concepts. 22.8% of the procedures could not be coded with any Classification of Health Interventions category and 4.3% could not be coded with any SNOMED CT concept. 206 procedure-concept/category pairs were assessed as a complete match in SNOMED CT compared to 10 in the Classification of Health Interventions. Conclusions Procedures documented by general practitioners were present in nearly all electronic patient record entries. Almost all procedures could be coded using SNOMED CT. Classification of Health Interventions covered the procedures to a lesser extent and with a much lower degree of concordance. SNOMED CT is a more flexible terminology system that can be used for different purposes for procedure coding in primary care. PMID:22230095
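    The coverage percentages reported in record 6 can be reproduced from the raw counts. The uncodable counts themselves are back-computed here (and rounded), since the abstract reports only percentages of the 417 procedures.

```python
# Reproducing the reported coverage figures from the totals.

total = 417
uncodable_kva = round(0.228 * total)     # 95 procedures with no KVÅ category
uncodable_snomed = round(0.043 * total)  # 18 procedures with no SNOMED CT concept

pct_kva = round(100 * uncodable_kva / total, 1)
pct_snomed = round(100 * uncodable_snomed / total, 1)
```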

  7. 48 CFR 1301.105-1 - Publication and code arrangement.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 48 Federal Acquisition Regulations System 5 2010-10-01 2010-10-01 false Publication and code arrangement. 1301.105-1 Section 1301.105-1 Federal Acquisition Regulations System DEPARTMENT OF COMMERCE GENERAL DEPARTMENT OF COMMERCE ACQUISITION REGULATIONS SYSTEM Purpose, Authority, Issuance 1301.105-1...

  8. 48 CFR 401.105-1 - Publication and code arrangement.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 48 Federal Acquisition Regulations System 4 2010-10-01 2010-10-01 false Publication and code arrangement. 401.105-1 Section 401.105-1 Federal Acquisition Regulations System DEPARTMENT OF AGRICULTURE GENERAL AGRICULTURE ACQUISITION REGULATION SYSTEM Purpose, Authority, Issuance 401.105-1 Publication and...

  9. 48 CFR 2501.104-1 - Publication and code arrangement.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 48 Federal Acquisition Regulations System 6 2010-10-01 2010-10-01 true Publication and code arrangement. 2501.104-1 Section 2501.104-1 Federal Acquisition Regulations System NATIONAL SCIENCE FOUNDATION GENERAL FEDERAL ACQUISITION REGULATIONS SYSTEM Purpose, Authority, Issuance 2501.104-1 Publication and...

  10. 48 CFR 1201.105-1 - Publication and code arrangement.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... 48 Federal Acquisition Regulations System 5 2013-10-01 2013-10-01 false Publication and code arrangement. 1201.105-1 Section 1201.105-1 Federal Acquisition Regulations System DEPARTMENT OF TRANSPORTATION GENERAL FEDERAL ACQUISITION REGULATIONS SYSTEM Purpose, Authority, Issuance 1201.105-1 Publication and...

  11. 48 CFR 401.105-1 - Publication and code arrangement.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 48 Federal Acquisition Regulations System 4 2014-10-01 2014-10-01 false Publication and code arrangement. 401.105-1 Section 401.105-1 Federal Acquisition Regulations System DEPARTMENT OF AGRICULTURE GENERAL AGRICULTURE ACQUISITION REGULATION SYSTEM Purpose, Authority, Issuance 401.105-1 Publication and...

  12. 48 CFR 1201.105-1 - Publication and code arrangement.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... 48 Federal Acquisition Regulations System 5 2012-10-01 2012-10-01 false Publication and code arrangement. 1201.105-1 Section 1201.105-1 Federal Acquisition Regulations System DEPARTMENT OF TRANSPORTATION GENERAL FEDERAL ACQUISITION REGULATIONS SYSTEM Purpose, Authority, Issuance 1201.105-1 Publication and...

  13. 48 CFR 401.105-1 - Publication and code arrangement.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... 48 Federal Acquisition Regulations System 4 2013-10-01 2013-10-01 false Publication and code arrangement. 401.105-1 Section 401.105-1 Federal Acquisition Regulations System DEPARTMENT OF AGRICULTURE GENERAL AGRICULTURE ACQUISITION REGULATION SYSTEM Purpose, Authority, Issuance 401.105-1 Publication and...

  14. 48 CFR 401.105-1 - Publication and code arrangement.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... 48 Federal Acquisition Regulations System 4 2012-10-01 2012-10-01 false Publication and code arrangement. 401.105-1 Section 401.105-1 Federal Acquisition Regulations System DEPARTMENT OF AGRICULTURE GENERAL AGRICULTURE ACQUISITION REGULATION SYSTEM Purpose, Authority, Issuance 401.105-1 Publication and...

  15. 48 CFR 2501.104-1 - Publication and code arrangement.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 48 Federal Acquisition Regulations System 6 2011-10-01 2011-10-01 false Publication and code arrangement. 2501.104-1 Section 2501.104-1 Federal Acquisition Regulations System NATIONAL SCIENCE FOUNDATION GENERAL FEDERAL ACQUISITION REGULATIONS SYSTEM Purpose, Authority, Issuance 2501.104-1 Publication and...

  16. 48 CFR 2301.105-1 - Publication and code ar-rangement.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 48 Federal Acquisition Regulations System 6 2010-10-01 2010-10-01 true Publication and code ar-rangement. 2301.105-1 Section 2301.105-1 Federal Acquisition Regulations System SOCIAL SECURITY ADMINISTRATION GENERAL SOCIAL SECURITY ACQUISITION REGULATION SYSTEM Purpose, Authority, Issuance 2301.105-1...

  17. Monte Carlo Modeling of the Initial Radiation Emitted by a Nuclear Device in the National Capital Region

    DTIC Science & Technology

    2013-07-01

    also simulated in the models. Data were derived from calculations using the three-dimensional Monte Carlo radiation transport code MCNP (Monte Carlo N...MCNP PHYSICS OPTIONS...HAZUS...input deck') for the MCNP, Monte Carlo N-Particle, radiation transport code. MCNP is a general-purpose code designed to simulate neutron, photon

  18. DoD Electronic Data Interchange (EDI) Convention: ASC X12 Transaction Set 858 Freight Government Bill of Lading Shipment Information (Version 003010)

    DTIC Science & Technology

    1993-02-01

    Segment: BX General Shipment Information Level: A Sequence: 30 Usage: M Max Use: 1 Loop: Purpose: To transmit identification numbers and other basic ...official code assigned to a city or point (for ratemaking purposes) within a city...development group as the official code assigned to a city or point (for ratemaking purposes) within a city.

  19. Strong scaling of general-purpose molecular dynamics simulations on GPUs

    NASA Astrophysics Data System (ADS)

    Glaser, Jens; Nguyen, Trung Dac; Anderson, Joshua A.; Lui, Pak; Spiga, Filippo; Millan, Jaime A.; Morse, David C.; Glotzer, Sharon C.

    2015-07-01

    We describe a highly optimized implementation of MPI domain decomposition in a GPU-enabled, general-purpose molecular dynamics code, HOOMD-blue (Anderson and Glotzer, 2013). Our approach is inspired by a traditional CPU-based code, LAMMPS (Plimpton, 1995), but is implemented within a code that was designed for execution on GPUs from the start (Anderson et al., 2008). The software supports short-ranged pair force and bond force fields and achieves optimal GPU performance using an autotuning algorithm. We are able to demonstrate equivalent or superior scaling on up to 3375 GPUs in Lennard-Jones and dissipative particle dynamics (DPD) simulations of up to 108 million particles. GPUDirect RDMA capabilities in recent GPU generations provide better performance in full double precision calculations. For a representative polymer physics application, HOOMD-blue 1.0 provides an effective GPU vs. CPU node speed-up of 12.5×.
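    The core idea of the domain decomposition in record 19 is spatial: the simulation box is split into a 3-D grid of domains, and each particle is owned by the rank whose domain contains it. The sketch below shows only that general mapping, not the HOOMD-blue implementation.

```python
import numpy as np

# Assign particles in a unit box to a 3x3x3 grid of domains (rank ids).

def assign_ranks(positions, box=1.0, grid=(3, 3, 3)):
    """Map positions in [0, box)^3 to domain (rank) indices."""
    cells = np.floor(positions / box * np.array(grid)).astype(int)
    nx, ny, nz = grid
    return cells[:, 0] * ny * nz + cells[:, 1] * nz + cells[:, 2]

rng = np.random.default_rng(0)
pos = rng.random((1000, 3))
ranks = assign_ranks(pos)  # one of 27 domains per particle
```

    In a real MD code each rank would then build ghost layers and communicate particles that migrate across domain boundaries every step.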

  20. 22 CFR 2.2 - Purpose.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... 22 Foreign Relations 1 2012-04-01 2012-04-01 false Purpose. 2.2 Section 2.2 Foreign Relations DEPARTMENT OF STATE GENERAL PROTECTION OF FOREIGN DIGNITARIES AND OTHER OFFICIAL PERSONNEL § 2.2 Purpose. Section 1116(b)(2) of title 18 of the United States Code, as added by Pub. L. 92-539, An Act for the...

  1. 22 CFR 2.2 - Purpose.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... 22 Foreign Relations 1 2013-04-01 2013-04-01 false Purpose. 2.2 Section 2.2 Foreign Relations DEPARTMENT OF STATE GENERAL PROTECTION OF FOREIGN DIGNITARIES AND OTHER OFFICIAL PERSONNEL § 2.2 Purpose. Section 1116(b)(2) of title 18 of the United States Code, as added by Pub. L. 92-539, An Act for the...

  2. 22 CFR 2.2 - Purpose.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 22 Foreign Relations 1 2010-04-01 2010-04-01 false Purpose. 2.2 Section 2.2 Foreign Relations DEPARTMENT OF STATE GENERAL PROTECTION OF FOREIGN DIGNITARIES AND OTHER OFFICIAL PERSONNEL § 2.2 Purpose. Section 1116(b)(2) of title 18 of the United States Code, as added by Pub. L. 92-539, An Act for the...

  3. 22 CFR 2.2 - Purpose.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... 22 Foreign Relations 1 2011-04-01 2011-04-01 false Purpose. 2.2 Section 2.2 Foreign Relations DEPARTMENT OF STATE GENERAL PROTECTION OF FOREIGN DIGNITARIES AND OTHER OFFICIAL PERSONNEL § 2.2 Purpose. Section 1116(b)(2) of title 18 of the United States Code, as added by Pub. L. 92-539, An Act for the...

  4. 22 CFR 2.2 - Purpose.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... 22 Foreign Relations 1 2014-04-01 2014-04-01 false Purpose. 2.2 Section 2.2 Foreign Relations DEPARTMENT OF STATE GENERAL PROTECTION OF FOREIGN DIGNITARIES AND OTHER OFFICIAL PERSONNEL § 2.2 Purpose. Section 1116(b)(2) of title 18 of the United States Code, as added by Pub. L. 92-539, An Act for the...

  5. 26 CFR 1.441-1 - Period for computation of taxable income.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... Internal Revenue Code, and the regulations thereunder. (2) Length of taxable year. Except as otherwise provided in the Internal Revenue Code and the regulations thereunder (e.g., § 1.441-2 regarding 52-53-week... and definitions. The general rules and definitions in this paragraph (b) apply for purposes of...

  6. 26 CFR 1.441-1 - Period for computation of taxable income.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... Internal Revenue Code, and the regulations thereunder. (2) Length of taxable year. Except as otherwise provided in the Internal Revenue Code and the regulations thereunder (e.g., § 1.441-2 regarding 52-53-week... and definitions. The general rules and definitions in this paragraph (b) apply for purposes of...

  7. 26 CFR 1.441-1 - Period for computation of taxable income.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... Internal Revenue Code, and the regulations thereunder. (2) Length of taxable year. Except as otherwise provided in the Internal Revenue Code and the regulations thereunder (e.g., § 1.441-2 regarding 52-53-week... and definitions. The general rules and definitions in this paragraph (b) apply for purposes of...

  8. Practical Implementation of Prestack Kirchhoff Time Migration on a General Purpose Graphics Processing Unit

    NASA Astrophysics Data System (ADS)

    Liu, Guofeng; Li, Chun

    2016-08-01

    In this study, we present a practical implementation of prestack Kirchhoff time migration (PSTM) on a general purpose graphics processing unit. First, we consider the three main optimizations of the PSTM GPU code, i.e., designing a reasonable execution configuration, using the texture memory for velocity interpolation, and the application of an intrinsic function in device code. This approach can achieve a speedup of nearly 45 times on an NVIDIA GTX 680 GPU compared with CPU code when a larger imaging space is used, where the PSTM output is a common reflection point gathered as I[nx][ny][nh][nt] in matrix format. However, this method requires more memory space, so the limited imaging space cannot fully exploit the GPU resources. To overcome this problem, we designed a PSTM scheme with multiple GPUs that images different seismic data on different GPUs using an offset value. This process achieves the peak speedup of the GPU PSTM code and greatly increases the efficiency of the calculations without changing the imaging result.
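    The inner loop of Kirchhoff time migration evaluates a double-square-root traveltime per image point and trace; it is this per-sample arithmetic that the GPU kernels in record 8 parallelize. The function and numbers below are an illustrative sketch, not code from the paper.

```python
import math

# Double-square-root traveltime: source leg plus receiver leg for a
# zero-dip image point at two-way time t0, with offsets xs, xr and
# migration velocity v.

def dsr_traveltime(t0, xs, xr, v):
    """Two-way time via the source and receiver legs (zero-dip PSTM)."""
    half = t0 / 2.0
    return (math.sqrt(half**2 + (xs / v)**2) +
            math.sqrt(half**2 + (xr / v)**2))

t = dsr_traveltime(t0=2.0, xs=500.0, xr=500.0, v=2000.0)
```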

  9. 26 CFR 305.7871-1 - Indian tribal governments treated as States for certain purposes.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... for taxes); (3) Section 511(a)(2)(B) (relating to the taxation of colleges and universities which are... Code (relating to communications excise tax); and (4) Subchapter D of chapter 36 of the Code (relating...) Effective dates—(1) In general. Except as provided in paragraph (f)(2) of this section, the provisions of...

  10. Processor-in-memory-and-storage architecture

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    DeBenedictis, Erik

    A method and apparatus for performing reliable general-purpose computing. Each sub-core of a plurality of sub-cores of a processor core processes a same instruction at a same time. A code analyzer receives, from the plurality of sub-cores, a plurality of residues that represents a code word corresponding to the same instruction, together with an indication of whether the code word is a memory address code or a data code. The code analyzer determines whether the plurality of residues are consistent or inconsistent. The code analyzer and the plurality of sub-cores perform a set of operations based on whether the code word is a memory address code or a data code and on the determination of whether the plurality of residues are consistent or inconsistent.
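
    The consistency check described above can be illustrated with a redundant residue number system: a value is carried as residues modulo several pairwise-coprime moduli plus one redundant modulus, and a corrupted residue makes the word inconsistent. This is a hedged Python sketch of the general idea only, not the patented circuit; the moduli are arbitrary illustrative choices:

```python
from functools import reduce

def crt(residues, moduli):
    """Chinese Remainder Theorem reconstruction for pairwise-coprime moduli."""
    M = reduce(lambda a, b: a * b, moduli)
    x = 0
    for r, m in zip(residues, moduli):
        Mi = M // m
        x += r * Mi * pow(Mi, -1, m)  # modular inverse (Python 3.8+)
    return x % M

MODULI = (3, 5, 7)   # information moduli (illustrative)
REDUNDANT = 11       # redundant modulus used only for checking

def encode(x):
    return [x % m for m in MODULI] + [x % REDUNDANT]

def consistent(residues):
    """Reconstruct from the information residues; check the redundant one."""
    x = crt(residues[:-1], MODULI)
    return x % REDUNDANT == residues[-1]

word = encode(42)
ok = consistent(word)          # uncorrupted word: consistent
word[1] = (word[1] + 1) % 5    # inject a single-residue fault
bad = consistent(word)         # fault detected: inconsistent
```

    A real residue-checked processor would perform this comparison in hardware on every instruction word; the sketch only shows why a single corrupted residue is detectable.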

  11. Addition of equilibrium air to an upwind Navier-Stokes code and other first steps toward a more generalized flow solver

    NASA Technical Reports Server (NTRS)

    Rosen, Bruce S.

    1991-01-01

    An upwind three-dimensional volume Navier-Stokes code is modified to facilitate modeling of complex geometries and flow fields represented by proposed National Aerospace Plane concepts. Code enhancements include an equilibrium air model, a generalized equilibrium gas model and several schemes to simplify treatment of complex geometric configurations. The code is also restructured for inclusion of an arbitrary number of independent and dependent variables. This latter capability is intended for eventual use to incorporate nonequilibrium/chemistry gas models, more sophisticated turbulence and transition models, or other physical phenomena which will require inclusion of additional variables and/or governing equations. Comparisons of computed results with experimental data and results obtained using other methods are presented for code validation purposes. Good correlation is obtained for all of the test cases considered, indicating the success of the current effort.

  12. [Medical Applications of the PHITS Code I: Recent Improvements and Biological Dose Estimation Model].

    PubMed

    Sato, Tatsuhiko; Furuta, Takuya; Hashimoto, Shintaro; Kuga, Naoya

    2015-01-01

    PHITS is a general purpose Monte Carlo particle transport simulation code developed through the collaboration of several institutes, mainly in Japan. It can analyze the motion of nearly all radiation types over wide energy ranges in three-dimensional matter. It has been used for various applications, including medical physics. This paper reviews the recent improvements of the code, together with the biological dose estimation method developed on the basis of the microdosimetric function implemented in PHITS.

  13. The development of an intelligent interface to a computational fluid dynamics flow-solver code

    NASA Technical Reports Server (NTRS)

    Williams, Anthony D.

    1988-01-01

    Researchers at NASA Lewis are currently developing an 'intelligent' interface to aid in the development and use of large, computational fluid dynamics flow-solver codes for studying the internal fluid behavior of aerospace propulsion systems. This paper discusses the requirements, design, and implementation of an intelligent interface to Proteus, a general purpose, 3-D, Navier-Stokes flow solver. The interface is called PROTAIS to denote its introduction of artificial intelligence (AI) concepts to the Proteus code.

  14. The development of an intelligent interface to a computational fluid dynamics flow-solver code

    NASA Technical Reports Server (NTRS)

    Williams, Anthony D.

    1988-01-01

    Researchers at NASA Lewis are currently developing an 'intelligent' interface to aid in the development and use of large, computational fluid dynamics flow-solver codes for studying the internal fluid behavior of aerospace propulsion systems. This paper discusses the requirements, design, and implementation of an intelligent interface to Proteus, a general purpose, three-dimensional, Navier-Stokes flow solver. The interface is called PROTAIS to denote its introduction of artificial intelligence (AI) concepts to the Proteus code.

  15. PerSEUS: Ultra-Low-Power High Performance Computing for Plasma Simulations

    NASA Astrophysics Data System (ADS)

    Doxas, I.; Andreou, A.; Lyon, J.; Angelopoulos, V.; Lu, S.; Pritchett, P. L.

    2017-12-01

    The Peta-op SupErcomputing Unconventional System (PerSEUS) project aims to explore the use of ultra-low-power mixed-signal unconventional computational elements, developed by Johns Hopkins University (JHU), for High Performance Scientific Computing (HPC), and to demonstrate that capability on both fluid and particle plasma codes. We will describe the JHU Mixed-signal Unconventional Supercomputing Elements (MUSE) and report initial results for the Lyon-Fedder-Mobarry (LFM) global magnetospheric MHD code and a UCLA general-purpose relativistic Particle-In-Cell (PIC) code.

  16. Combustion: Structural interaction in a viscoelastic material

    NASA Technical Reports Server (NTRS)

    Chang, T. Y.; Chang, J. P.; Kumar, M.; Kuo, K. K.

    1980-01-01

    The effect of interaction between combustion processes and structural deformation of solid propellant was considered. The combustion analysis was performed on the basis of deformed crack geometry, which was determined from the structural analysis. On the other hand, input data for the structural analysis, such as pressure distribution along the crack boundary and ablation velocity of the crack, were determined from the combustion analysis. The interaction analysis was conducted by combining two computer codes, a combustion analysis code and a general purpose finite element structural analysis code.

  17. Computational methods for fracture analysis of heavy-section steel technology (HSST) pressure vessel experiments

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bass, B.R.; Bryan, R.H.; Bryson, J.W.

    This paper summarizes the capabilities and applications of the general-purpose and special-purpose computer programs that have been developed for use in fracture mechanics analyses of HSST pressure vessel experiments. Emphasis is placed on the OCA/USA code, which is designed for analysis of pressurized-thermal-shock (PTS) conditions, and on the ORMGEN/ADINA/ORVIRT system which is used for more general analysis. Fundamental features of these programs are discussed, along with applications to pressure vessel experiments.

  18. General purpose molecular dynamics simulations fully implemented on graphics processing units

    NASA Astrophysics Data System (ADS)

    Anderson, Joshua A.; Lorenz, Chris D.; Travesset, A.

    2008-05-01

    Graphics processing units (GPUs), originally developed for rendering real-time effects in computer games, now provide unprecedented computational power for scientific applications. In this paper, we develop a general purpose molecular dynamics code that runs entirely on a single GPU. It is shown that our GPU implementation provides performance equivalent to that of a fast 30-processor-core distributed-memory cluster. Our results show that GPUs already provide an inexpensive alternative to such clusters, and we discuss the implications for the future.
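
    The inner loop of such a molecular dynamics code, pair forces plus velocity-Verlet integration, can be sketched in a few lines of Python. This is a two-particle Lennard-Jones toy in reduced units, illustrative only and not the paper's GPU implementation:

```python
def lj_force(r, eps=1.0, sigma=1.0):
    """Signed Lennard-Jones force along the separation r (positive = repulsive)."""
    sr6 = (sigma / r) ** 6
    return 24.0 * eps * (2.0 * sr6**2 - sr6) / r

def lj_energy(r, eps=1.0, sigma=1.0):
    sr6 = (sigma / r) ** 6
    return 4.0 * eps * (sr6**2 - sr6)

def verlet_step(r, v, dt=1e-3, m=0.5):
    """One velocity-Verlet step for the relative coordinate of two unit-mass
    particles (reduced mass m = 0.5)."""
    v_half = v + 0.5 * dt * lj_force(r) / m
    r_new = r + dt * v_half
    v_new = v_half + 0.5 * dt * lj_force(r_new) / m
    return r_new, v_new

r, v = 1.5, 0.0
e0 = lj_energy(r)                       # total energy (starts at rest)
for _ in range(2000):
    r, v = verlet_step(r, v)
e1 = lj_energy(r) + 0.5 * 0.5 * v**2    # potential + kinetic energy
```

    Velocity Verlet is symplectic, so the total energy e1 stays close to e0 over many steps; a GPU implementation parallelizes the force evaluation over all particle pairs, which dominates the cost for realistic system sizes.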

  19. Touching ethics: assessing the applicability of ethical rules for safe touch in CAM--outcomes of a CAM (complementary and alternative medicine) practitioner survey in Israel.

    PubMed

    Schiff, Elad; Ben-Arye, Eran; Shilo, Margalit; Levy, Moti; Schachter, Leora; Weitchner, Na'ama; Golan, Ofra; Stone, Julie

    2011-02-01

    Recently, ethical guidelines regarding safe touch in CAM were developed in Israel. Publishing ethical codes does not imply that they will actually help practitioners to meet ethical care standards. The effectiveness of ethical rules depends on familiarity with the code and its content. In addition, critical self-examination of the code by individual members of the profession is required to reflect on the moral commitments encompassed in the code. For the purpose of dynamic self-appraisal, we devised a survey to assess how CAM practitioners view the suggested ethical guidelines for safe touch. We surveyed 781 CAM practitioners regarding their perspectives on the safe-touch code. There was a high level of agreement with general statements regarding ethics pertaining to safe touch, with a mean rate of agreement of 4.61 out of a maximum of 5. Practitioners concurred substantially with practice guidelines for appropriate touch, with a mean rate of agreement of 4.16 out of a maximum of 5. Attitudes toward the necessity of touching intimate areas for treatment purposes varied: 78.6% of respondents strongly disagreed with any notion of a need to touch intimate areas during treatment, 7.9% neither disagreed nor agreed, 7.9% slightly agreed, and 7.6% strongly agreed. There was a direct correlation between disagreement with touching intimate areas for therapeutic purposes and agreement with general statements regarding the ethics of safe touch (Spearman r=0.177, p<0.0001) and with practice guidelines for appropriate touch (r=0.092, p=0.012). A substantial number of practitioners agreed with the code, although some findings regarding the need to touch intimate areas during treatment were disturbing. Our findings can serve as a basis for ethical code development and implementation, as well as for educating CAM practitioners on the ethics of touch. Copyright © 2010 Elsevier Ltd. All rights reserved.

  20. A Steady State and Quasi-Steady Interface Between the Generalized Fluid System Simulation Program and the SINDA/G Thermal Analysis Program

    NASA Technical Reports Server (NTRS)

    Schallhorn, Paul; Majumdar, Alok; Tiller, Bruce

    2001-01-01

    A general purpose, one-dimensional fluid flow code is currently being interfaced with the thermal analysis program SINDA/G. The flow code, GFSSP, is capable of analyzing steady state and transient flow in a complex network. The flow code is capable of modeling several physical phenomena, including compressibility effects, phase changes, body forces (such as gravity and centrifugal forces), and mixture thermodynamics for multiple species. The addition of GFSSP to SINDA/G provides a significant improvement in convective heat transfer modeling for SINDA/G. The interface development is being conducted in multiple phases. This paper describes the first phase of the interface, which allows for steady and quasi-steady (unsteady solid, steady fluid) conjugate heat transfer modeling.

  1. The algebraic decoding of the (41, 21, 9) quadratic residue code

    NASA Technical Reports Server (NTRS)

    Reed, Irving S.; Truong, T. K.; Chen, Xuemin; Yin, Xiaowei

    1992-01-01

    A new algebraic approach for decoding the quadratic residue (QR) codes, in particular the (41, 21, 9) QR code, is presented. The key ideas behind this decoding technique are a systematic application of the Sylvester resultant method to the Newton identities associated with the code syndromes to find the error-locator polynomial, and then a method for determining error locations by solving certain quadratic, cubic, and quartic equations over GF(2^m) in a new way which uses Zech's logarithms for the arithmetic. The algorithms developed here are suitable for implementation in a programmable microprocessor or special-purpose VLSI chip. It is expected that the algebraic methods developed here can apply generally to other codes such as the BCH and Reed-Solomon codes.
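
    Zech's logarithms let field additions be carried out directly in the discrete-log domain, which is what makes this arithmetic convenient on a microprocessor or VLSI chip. A small Python sketch for GF(2^4) (an illustrative field, not the one used by the (41, 21, 9) decoder):

```python
# Build GF(2^4) with the primitive polynomial x^4 + x + 1 (0b10011).
m, poly = 4, 0b10011
q = (1 << m) - 1  # 15 nonzero elements

# Discrete-log tables: alpha_pow[i] = alpha^i as a bit pattern.
alpha_pow, dlog = [0] * q, {}
x = 1
for i in range(q):
    alpha_pow[i] = x
    dlog[x] = i
    x <<= 1
    if x & (1 << m):
        x ^= poly  # reduce modulo the primitive polynomial

# Zech logarithm Z(n): 1 + alpha^n = alpha^Z(n) (undefined only for n = 0,
# where the sum is zero in characteristic 2).
zech = {n: dlog[1 ^ alpha_pow[n]] for n in range(1, q)}

def add_in_log_domain(i, j):
    """Return log(alpha^i + alpha^j), or None if the sum is zero,
    using alpha^i + alpha^j = alpha^(i + Z(j - i))."""
    if i == j:
        return None  # characteristic 2: alpha^i + alpha^i = 0
    if i > j:
        i, j = j, i
    return (i + zech[j - i]) % q
```

    The point of the technique is that a decoder holding elements as exponents never needs to convert back to bit patterns to add: one table lookup replaces the conversion.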

  2. Real-time software receiver

    NASA Technical Reports Server (NTRS)

    Psiaki, Mark L. (Inventor); Kintner, Jr., Paul M. (Inventor); Ledvina, Brent M. (Inventor); Powell, Steven P. (Inventor)

    2007-01-01

    A real-time software receiver that executes on a general purpose processor. The software receiver includes data acquisition and correlator modules that perform, in place of hardware correlation, baseband mixing and PRN code correlation using bit-wise parallelism.
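
    Bit-wise parallel correlation packs one PRN chip per bit so that an entire correlation reduces to an XOR and a population count. A minimal Python sketch of the idea (illustrative only; the patented receiver applies this to sampled RF data words):

```python
import random

def pack(chips):
    """Pack a +/-1 chip sequence into an integer, one bit per chip
    (+1 -> 0 bit, -1 -> 1 bit)."""
    word = 0
    for k, c in enumerate(chips):
        if c < 0:
            word |= 1 << k
    return word

def correlate(word_a, word_b, n):
    """Correlation of two packed +/-1 sequences of length n: matching chips
    contribute +1, differing chips -1, so corr = n - 2 * popcount(a XOR b)."""
    disagreements = bin(word_a ^ word_b).count("1")
    return n - 2 * disagreements

n = 1023                                     # e.g., a GPS C/A-code length
random.seed(1)
code = [random.choice((-1, 1)) for _ in range(n)]
w = pack(code)
full = correlate(w, w, n)                    # perfect alignment: equals n
```

    On a general purpose processor the XOR and popcount each operate on 32 or 64 chips per machine instruction, which is what makes a software correlator feasible in real time.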

  3. Real-time software receiver

    NASA Technical Reports Server (NTRS)

    Psiaki, Mark L. (Inventor); Ledvina, Brent M. (Inventor); Powell, Steven P. (Inventor); Kintner, Jr., Paul M. (Inventor)

    2006-01-01

    A real-time software receiver that executes on a general purpose processor. The software receiver includes data acquisition and correlator modules that perform, in place of hardware correlation, baseband mixing and PRN code correlation using bit-wise parallelism.

  4. MCNP capabilities for nuclear well logging calculations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Forster, R.A.; Little, R.C.; Briesmeister, J.F.

    The Los Alamos Radiation Transport Code System (LARTCS) consists of state-of-the-art Monte Carlo and discrete ordinates transport codes and data libraries. This paper discusses how the general-purpose continuous-energy Monte Carlo code MCNP (Monte Carlo neutron photon), part of the LARTCS, provides a computational predictive capability for many applications of interest to the nuclear well logging community. The generalized three-dimensional geometry of MCNP is well suited for borehole-tool models. SABRINA, another component of the LARTCS, is a graphics code that can be used to interactively create a complex MCNP geometry. Users can define many source and tally characteristics with standard MCNP features. The time-dependent capability of the code is essential when modeling pulsed sources. Problems with neutrons, photons, and electrons, as either single particles or coupled particles, can be calculated with MCNP. The physics of neutron and photon transport and interactions is modeled in detail using the latest available cross-section data.

  5. Instrument Systems Analysis and Verification Facility (ISAVF) users guide

    NASA Technical Reports Server (NTRS)

    Davis, J. F.; Thomason, J. O.; Wolfgang, J. L.

    1985-01-01

    The ISAVF facility is primarily an interconnected system of computers, special purpose real-time hardware, and associated generalized software systems, which will permit instrument system analysts, design engineers, and instrument scientists to perform trade-off studies, specification development, instrument modeling, and verification of instrument hardware performance. It is not the intent of the ISAVF to duplicate or replace existing special purpose facilities such as the Code 710 Optical Laboratories or the Code 750 Test and Evaluation facilities. The ISAVF will provide data acquisition and control services for these facilities, as needed, using remote computer stations attached to the main ISAVF computers via dedicated communication lines.

  6. 24 CFR 200.925a - Multifamily and care-type minimum property standards.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... COMMISSIONER, DEPARTMENT OF HOUSING AND URBAN DEVELOPMENT GENERAL INTRODUCTION TO FHA PROGRAMS Minimum Property..., electrical, and elevators. (3) For purposes of this paragraph, a state or local code regulates an area if it...

  7. Eleventh NASTRAN User's Colloquium

    NASA Technical Reports Server (NTRS)

    1983-01-01

    NASTRAN (NASA Structural Analysis) is a large, comprehensive, nonproprietary, general purpose finite element computer code for structural analysis which was developed under NASA sponsorship. The Eleventh Colloquium provides some comprehensive general papers on the application of finite element methods in engineering, comparisons with other approaches, unique applications, pre- and post-processing or auxiliary programs, and new methods of analysis with NASTRAN.

  8. Planning U.S. General Purpose Forces: The Theater Nuclear Forces

    DTIC Science & Technology

    1977-01-01

    usefulness in combat. All U.S. nuclear weapons deployed in Europe are fitted with Permissive Action Links (PAL), coded devices designed to impede...may be proposed. The Standard Missile 2, the Harpoon missile, the Mk48 torpedo, and the SUBROC anti-submarine rocket are all being considered for...Permissive Action Link. A coded device attached to nuclear weapons deployed abroad that impedes the unauthorized arming or firing of the weapon. Pershing

  9. Purpose-Driven Communities in Multiplex Networks: Thresholding User-Engaged Layer Aggregation

    DTIC Science & Technology

    2016-06-01

    dark networks is a non-trivial yet useful task. Because terrorists work hard to hide their relationships/network, analysts have an incomplete picture...them identify meaningful terrorist communities. This thesis introduces a general-purpose algorithm for community detection in multiplex dark networks...aggregation, dark networks, conductance, cluster adequacy, modularity, Louvain method, shortest path interdiction

  10. Screamer version 4.1

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Spielman, Rick; Struve, Kenneth W.; Kiefer, Mark L.

    2017-02-16

    Screamer is a special purpose circuit code developed for the design of Pulsed Power systems. It models electrical circuits which have a restricted topology in order to provide a fast-running tool, while still allowing configurations general enough for most Pulsed Power system designs.

  11. Use of Generalized Fluid System Simulation Program (GFSSP) for Teaching and Performing Senior Design Projects at the Educational Institutions

    NASA Technical Reports Server (NTRS)

    Majumdar, A. K.; Hedayat, A.

    2015-01-01

    This paper describes the experience of the authors in using the Generalized Fluid System Simulation Program (GFSSP) in teaching Design of Thermal Systems class at University of Alabama in Huntsville. GFSSP is a finite volume based thermo-fluid system network analysis code, developed at NASA/Marshall Space Flight Center, and is extensively used in NASA, Department of Defense, and aerospace industries for propulsion system design, analysis, and performance evaluation. The educational version of GFSSP is freely available to all US higher education institutions. The main purpose of the paper is to illustrate the utilization of this user-friendly code for the thermal systems design and fluid engineering courses and to encourage the instructors to utilize the code for the class assignments as well as senior design projects.

  12. Monte Carlo simulations in Nuclear Medicine

    NASA Astrophysics Data System (ADS)

    Loudos, George K.

    2007-11-01

    Molecular imaging technologies provide unique abilities to localise signs of disease before symptoms appear, assist in drug testing, optimize and personalize therapy, and assess the efficacy of treatment regimes for different types of cancer. Monte Carlo simulation packages are used as an important tool for the optimal design of detector systems. In addition, they have demonstrated potential to improve image quality and acquisition protocols. Many general purpose (MCNP, Geant4, etc.) or dedicated (SimSET, etc.) codes have been developed, aiming to provide accurate and fast results. Special emphasis will be given to the GATE toolkit. The GATE code, currently under development by the OpenGATE collaboration, is the most accurate and promising code for performing realistic simulations. The purpose of this article is to introduce the non-expert reader to the current status of MC simulations in nuclear medicine, briefly provide examples of currently simulated systems, and present future challenges, which include the simulation of clinical studies and dosimetry applications.
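
    The essence of such Monte Carlo transport codes can be illustrated with the simplest possible example: photons crossing a uniform slab, with free paths sampled from an exponential distribution. This Python sketch is a toy model with illustrative parameters, not GATE or MCNP:

```python
import math
import random

def transmission_mc(mu, thickness, n_particles, seed=0):
    """Estimate the fraction of photons crossing a slab with no interaction.
    Free paths are sampled from the exponential distribution with attenuation
    coefficient mu; the analytic answer is exp(-mu * thickness)."""
    rng = random.Random(seed)
    transmitted = 0
    for _ in range(n_particles):
        # Inverse-CDF sampling; 1 - U lies in (0, 1], avoiding log(0).
        path = -math.log(1.0 - rng.random()) / mu
        if path > thickness:
            transmitted += 1
    return transmitted / n_particles

est = transmission_mc(mu=0.5, thickness=2.0, n_particles=100_000)
exact = math.exp(-0.5 * 2.0)  # analytic uncollided transmission
```

    Production codes add scattering angles, energy-dependent cross sections, detector geometry, and variance reduction on top of exactly this sampling loop, which is why validation against analytic benchmarks like the one above is standard practice.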

  13. JavaGenes Molecular Evolution

    NASA Technical Reports Server (NTRS)

    Lohn, Jason; Smith, David; Frank, Jeremy; Globus, Al; Crawford, James

    2007-01-01

    JavaGenes is a general-purpose, evolutionary software system written in Java. It implements several versions of a genetic algorithm, simulated annealing, stochastic hill climbing, and other search techniques. This software has been used to evolve molecules, atomic force field parameters, digital circuits, Earth Observing Satellite schedules, and antennas. This version differs from version 0.7.28 in that it includes the molecule evolution code and other improvements. Except for the antenna code, JavaGenes is available for NASA Open Source distribution.

  14. 5 CFR 890.103 - Correction of errors.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... (CONTINUED) FEDERAL EMPLOYEES HEALTH BENEFITS PROGRAM Administration and General Provisions § 890.103... States Code, and permit the individual to enroll in another health benefits plan for purposes of this... health care providers. (e) Retroactive corrections are subject to withholdings and contributions under...

  15. Development of BEM for ceramic composites

    NASA Technical Reports Server (NTRS)

    Henry, D. P.; Banerjee, P. K.; Dargush, G. F.

    1990-01-01

    Details on the progress made during the first three years of a five-year program towards the development of a boundary element code are presented. This code was designed for the micromechanical studies of advanced ceramic composites. Additional effort was made in generalizing the implementation to allow the program to be applicable to real problems in the aerospace industry. The ceramic composite formulations developed were implemented in the three-dimensional boundary element computer code BEST3D. BEST3D was adopted as the base for the ceramic composite program so that many of the enhanced features of this general purpose boundary element code could be utilized. Some of these facilities include sophisticated numerical integration, the capability of local definition of boundary conditions, and the use of quadratic shape functions for modeling geometry and field variables on the boundary. The multi-region implementation permits a body to be modeled in substructural parts, thus dramatically reducing the cost of the analysis. Furthermore, it allows a body consisting of regions of different ceramic matrices and inserts to be studied.

  16. Participation as an outcome measure in psychosocial oncology: content of cancer-specific health-related quality of life instruments.

    PubMed

    van der Mei, Sijrike F; Dijkers, Marcel P J M; Heerkens, Yvonne F

    2011-12-01

    To examine to what extent the concept and the domains of participation as defined in the International Classification of Functioning, Disability and Health (ICF) are represented in general cancer-specific health-related quality of life (HRQOL) instruments. Using the ICF linking rules, two coders independently extracted the meaningful concepts of ten instruments and linked these to ICF codes. The proportion of concepts that could be linked to ICF codes ranged from 68 to 95%. Although all instruments contained concepts linked to Participation (Chapters d7-d9 of the classification of 'Activities and Participation'), the instruments covered only a small part of all available ICF codes. The proportion of ICF codes in the instruments that were participation related ranged from 3 to 35%. 'Major life areas' (d8) was the most frequently used Participation Chapter, with d850 'remunerative employment' as the most used ICF code. The number of participation-related ICF codes covered in the instruments is limited. General cancer-specific HRQOL instruments only assess social life of cancer patients to a limited degree. This study's information on the content of these instruments may guide researchers in selecting the appropriate instrument for a specific research purpose.

  17. Computational Simulation of a Water-Cooled Heat Pump

    NASA Technical Reports Server (NTRS)

    Bozarth, Duane

    2008-01-01

    A Fortran-language computer program for simulating the operation of a water-cooled vapor-compression heat pump in any orientation with respect to gravity has been developed by modifying a prior general-purpose heat-pump design code used at Oak Ridge National Laboratory (ORNL).

  18. 14 CFR 294.1 - Applicability and purpose.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... PROCEEDINGS) ECONOMIC REGULATIONS CANADIAN CHARTER AIR TAXI OPERATORS General § 294.1 Applicability and... taxi operators,” and establishes registration procedures for these carriers operating or seeking to... air taxi operators from certain provisions of the Subtitle VII of Title 49 of the United States Code...

  19. 14 CFR 294.1 - Applicability and purpose.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... PROCEEDINGS) ECONOMIC REGULATIONS CANADIAN CHARTER AIR TAXI OPERATORS General § 294.1 Applicability and... taxi operators,” and establishes registration procedures for these carriers operating or seeking to... air taxi operators from certain provisions of the Subtitle VII of Title 49 of the United States Code...

  20. Planning perception and action for cognitive mobile manipulators

    NASA Astrophysics Data System (ADS)

    Gaschler, Andre; Nogina, Svetlana; Petrick, Ronald P. A.; Knoll, Alois

    2013-12-01

    We present a general approach to perception and manipulation planning for cognitive mobile manipulators. Rather than hard-coding single-purpose robot applications, a robot should be able to reason about its basic skills in order to solve complex problems autonomously. Humans intuitively solve tasks in real-world scenarios by breaking down abstract problems into smaller sub-tasks and using heuristics based on their previous experience. We apply a similar idea for planning perception and manipulation to cognitive mobile robots. Our approach is based on contingent planning and run-time sensing, integrated in our "knowledge of volumes" planning framework, called KVP. Using the general-purpose PKS planner, we model information-gathering actions at plan time that have multiple possible outcomes at run time. As a result, perception and sensing arise as necessary preconditions for manipulation, rather than being hard-coded as tasks themselves. We demonstrate the effectiveness of our approach on two scenarios covering visual and force sensing on a real mobile manipulator.

  1. Neoclassical toroidal viscosity in perturbed equilibria with general tokamak geometry

    NASA Astrophysics Data System (ADS)

    Logan, Nikolas C.; Park, Jong-Kyu; Kim, Kimin; Wang, Zhirui; Berkery, John W.

    2013-12-01

    This paper presents a calculation of neoclassical toroidal viscous torque independent of large-aspect-ratio expansions across kinetic regimes. The Perturbed Equilibrium Nonambipolar Transport (PENT) code was developed for this purpose, and is compared to previous combined regime models as well as regime specific limits and a drift kinetic δf guiding center code. It is shown that retaining general expressions, without circular large-aspect-ratio or other orbit approximations, can be important at experimentally relevant aspect ratio and shaping. The superbanana plateau, a kinetic resonance effect recently recognized for its relevance to ITER, is recovered by the PENT calculations and shown to require highly accurate treatment of geometric effects.

  2. libvdwxc: a library for exchange-correlation functionals in the vdW-DF family

    NASA Astrophysics Data System (ADS)

    Hjorth Larsen, Ask; Kuisma, Mikael; Löfgren, Joakim; Pouillon, Yann; Erhart, Paul; Hyldgaard, Per

    2017-09-01

    We present libvdwxc, a general library for evaluating the energy and potential for the family of vdW-DF exchange-correlation functionals. libvdwxc is written in C, provides an efficient implementation of the vdW-DF method, and can be interfaced with various general-purpose DFT codes. Currently, the Gpaw and Octopus codes implement interfaces to libvdwxc. The present implementation emphasizes scalability and parallel performance, and thereby enables ab initio calculations of nanometer-scale complexes. The numerical accuracy is benchmarked on the S22 test set, whereas parallel performance is benchmarked on ligand-protected gold nanoparticles (Au144(SC11NH25)60) of up to 9696 atoms.

  3. Requirements for migration of NSSD code systems from LTSS to NLTSS

    NASA Technical Reports Server (NTRS)

    Pratt, M.

    1984-01-01

    The purpose of this document is to address the requirements necessary for a successful conversion of the Nuclear Design (ND) application code systems to the NLTSS environment. The ND application code system community can be characterized as large-scale scientific computation carried out on supercomputers. NLTSS is a distributed operating system being developed at LLNL to replace the LTSS system currently in use. The implications of change are examined including a description of the computational environment and users in ND. The discussion then turns to requirements, first in a general way, followed by specific requirements, including a proposal for managing the transition.

  4. Electroconvulsive therapy: administrative codes, legislation, and professional recommendations.

    PubMed

    Harris, Victoria

    2006-01-01

    Government regulatory involvement in electroconvulsive therapy (ECT) is due to several factors, including patient advocate groups, prior abuse by psychiatrists, and a general trend of state authority to move into areas traditionally governed by medical authorities. Regardless of the specific reasons, ECT is both highly effective in the treatment of many psychiatric disorders and heavily regulated by state administrative codes and legislation. The purpose of this article is to conduct a systematic review of the state administrative codes and legislation for the 50 states, the District of Columbia, and Puerto Rico and to compare the findings with professional recommendations for the administration of ECT.

  5. Dusty Plasmas in Planetary Magnetospheres Award

    NASA Technical Reports Server (NTRS)

    Horanyi, Mihaly

    2005-01-01

    This is my final report for the grant Dusty Plasmas in Planetary Magnetospheres. The funding from this grant supported our research on dusty plasmas to study: a) dust-plasma interactions in general plasma environments, and b) dusty plasma processes in planetary magnetospheres (Earth, Jupiter, and Saturn). We have developed a general purpose transport code in order to follow the spatial and temporal evolution of dust density distributions in magnetized plasma environments. The code allows the central body to be represented by a multipole expansion of its gravitational and magnetic fields. The density and the temperature of the possibly many-component plasma environment can be pre-defined as a function of the coordinates and, if necessary, of time as well. The code simultaneously integrates the equations of motion with the equations describing the charging processes. The charging currents depend not only on the instantaneous plasma parameters but also on the velocity, as well as on the previous charging history of the dust grains.
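
    The simultaneous integration of the equations of motion with a charging equation can be sketched with a toy model: a grain gyrating in a uniform magnetic field while its charge relaxes toward an equilibrium value. This Python/RK4 sketch uses a linearized charging law and arbitrary units for illustration only; it is not the grant's transport code:

```python
import math

def derivs(state, q_eq, tau, inv_mass, B):
    """Right-hand side for a dust grain in 2D: motion in a uniform magnetic
    field B (along z) coupled to a linearized charging equation
    dq/dt = (q_eq - q) / tau (illustrative model)."""
    x, y, vx, vy, q = state
    # Lorentz force for B along z: F = q (v x B)
    ax = inv_mass * q * vy * B
    ay = -inv_mass * q * vx * B
    return [vx, vy, ax, ay, (q_eq - q) / tau]

def rk4_step(state, dt, *args):
    """Classical fourth-order Runge-Kutta step for the coupled system."""
    k1 = derivs(state, *args)
    k2 = derivs([s + 0.5 * dt * k for s, k in zip(state, k1)], *args)
    k3 = derivs([s + 0.5 * dt * k for s, k in zip(state, k2)], *args)
    k4 = derivs([s + dt * k for s, k in zip(state, k3)], *args)
    return [s + dt / 6.0 * (a + 2 * b + 2 * c + d)
            for s, a, b, c, d in zip(state, k1, k2, k3, k4)]

# Grain starts uncharged and moving in +x; as the charge relaxes toward q_eq
# the magnetic force bends the orbit while leaving the speed unchanged.
state = [0.0, 0.0, 1.0, 0.0, 0.0]   # x, y, vx, vy, q
for _ in range(2000):
    state = rk4_step(state, 1e-3, -1.0, 0.1, 1.0, 1.0)
speed = math.hypot(state[2], state[3])
```

    A production code replaces the linearized charging law with the actual velocity- and history-dependent charging currents, but the coupled-integration structure is the same.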

  6. Radar transponder antenna pattern analysis for the space shuttle

    NASA Technical Reports Server (NTRS)

    Radcliff, Roger

    1989-01-01

    In order to improve tracking capability, radar transponder antennas will soon be mounted on the Shuttle solid rocket boosters (SRBs). These four antennas, each an identical cavity-backed helix operating at 5.765 GHz, will be mounted near the top of the SRBs, adjacent to the intertank portion of the external tank. The purpose is to calculate the roll-plane pattern (the plane perpendicular to the SRB axes and containing the antennas) in the presence of this complex electromagnetic environment. The large electrical size of this problem mandates an optical (asymptotic) approach. Development of a specific code for this application is beyond the scope of a summer fellowship; thus a general purpose code, the Numerical Electromagnetics Code - Basic Scattering Code, was chosen as the computational tool. This code is based on the modern Geometrical Theory of Diffraction and allows computation of scattering from bodies composed of canonical shapes such as plates and elliptic cylinders. Apertures mounted on a curved surface (the SRB) cannot be modeled by the code, so an antenna model consisting of wires excited by a method-of-moments current input was devised that approximated the actual performance of the antennas. The improvised antenna model matched well with measurements taken at the MSFC range. The SRBs, the external tank, and the shuttle nose were modeled as circular cylinders, and the code was able to produce what is thought to be a reasonable roll-plane pattern.

  7. Implementation of generalized quantum measurements: Superadditive quantum coding, accessible information extraction, and classical capacity limit

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Takeoka, Masahiro; Fujiwara, Mikio; Mizuno, Jun

    2004-05-01

    Quantum-information theory predicts that when the transmission resource is doubled in quantum channels, the amount of information transmitted can be more than doubled by quantum-channel coding techniques, whereas the increase is at most a factor of two in classical information theory. This remarkable feature, the superadditive quantum-coding gain, can be implemented by appropriate choices of code words and corresponding quantum decoding, which requires a collective quantum measurement. Recently, an experimental demonstration was reported [M. Fujiwara et al., Phys. Rev. Lett. 90, 167906 (2003)]. The purpose of this paper is to describe our experiment in detail. In particular, a design strategy for quantum-collective decoding in physical quantum circuits is emphasized. We also address the practical implication of the gain for communication performance by introducing the quantum-classical hybrid coding scheme. We show how the superadditive quantum-coding gain, even at a small code length, can boost the communication performance of conventional coding techniques.

  8. Reliability of Next Generation Power Electronics Packaging Under Concurrent Vibration, Thermal and High Power Loads

    DTIC Science & Technology

    2008-02-01

    combined thermal g effect and initial current field. The model is implemented using Abaqus user element subroutine and verified against the experimental...Finite Element Formulation The proposed model is implemented with ABAQUS general purpose finite element program using thermal -displacement analysis...option. ABAQUS and other commercially available finite element codes do not have the capability to solve general electromigration problem directly. Thermal

  9. Regulations: Guaranteed Student Loan Program.

    ERIC Educational Resources Information Center

    1979

    The text is given of the amendments to part 177 of Title 45 of the Code of Federal Regulations, concerning the federal Guaranteed Student Loan program. Subpart A concerns the program's purpose and scope. Subpart B concerns general provisions: definitions, eligibility, permissible charges, refunds, and prohibited transactions. Subpart C addresses…

  10. 29 CFR 4043.1 - Purpose and scope.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... the Code. Subpart A contains definitions and general rules. Subpart B contains rules for post-event... Relating to Labor (Continued) PENSION BENEFIT GUARANTY CORPORATION PLAN TERMINATIONS REPORTABLE EVENTS AND... prescribes the requirements for notifying the PBGC of a reportable event under section 4043 of ERISA or of a...

  11. 29 CFR 4043.1 - Purpose and scope.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... the Code. Subpart A contains definitions and general rules. Subpart B contains rules for post-event... Relating to Labor (Continued) PENSION BENEFIT GUARANTY CORPORATION PLAN TERMINATIONS REPORTABLE EVENTS AND... prescribes the requirements for notifying the PBGC of a reportable event under section 4043 of ERISA or of a...

  12. 29 CFR 4043.1 - Purpose and scope.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... the Code. Subpart A contains definitions and general rules. Subpart B contains rules for post-event... Relating to Labor (Continued) PENSION BENEFIT GUARANTY CORPORATION PLAN TERMINATIONS REPORTABLE EVENTS AND... prescribes the requirements for notifying the PBGC of a reportable event under section 4043 of ERISA or of a...

  13. 29 CFR 4043.1 - Purpose and scope.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... the Code. Subpart A contains definitions and general rules. Subpart B contains rules for post-event... Relating to Labor (Continued) PENSION BENEFIT GUARANTY CORPORATION PLAN TERMINATIONS REPORTABLE EVENTS AND... prescribes the requirements for notifying the PBGC of a reportable event under section 4043 of ERISA or of a...

  14. 29 CFR 4043.1 - Purpose and scope.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... the Code. Subpart A contains definitions and general rules. Subpart B contains rules for post-event... Relating to Labor (Continued) PENSION BENEFIT GUARANTY CORPORATION PLAN TERMINATIONS REPORTABLE EVENTS AND... prescribes the requirements for notifying the PBGC of a reportable event under section 4043 of ERISA or of a...

  15. 17 CFR 4.24 - General disclosures required.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... STATEMENT CANNOT DISCLOSE ALL THE RISKS AND OTHER FACTORS NECESSARY TO EVALUATE YOUR PARTICIPATION IN THIS... TREATED AS A COMMODITY CUSTOMER CLAIM FOR PURPOSES OF SUBCHAPTER IV OF CHAPTER 7 OF THE BANKRUPTCY CODE.../or swap dealer by an introducing broker (such as payment for order flow or soft dollar arrangements...

  16. 17 CFR 4.24 - General disclosures required.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... STATEMENT CANNOT DISCLOSE ALL THE RISKS AND OTHER FACTORS NECESSARY TO EVALUATE YOUR PARTICIPATION IN THIS... TREATED AS A COMMODITY CUSTOMER CLAIM FOR PURPOSES OF SUBCHAPTER IV OF CHAPTER 7 OF THE BANKRUPTCY CODE.../or swap dealer by an introducing broker (such as payment for order flow or soft dollar arrangements...

  17. 17 CFR 4.24 - General disclosures required.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... STATEMENT CANNOT DISCLOSE ALL THE RISKS AND OTHER FACTORS NECESSARY TO EVALUATE YOUR PARTICIPATION IN THIS... TREATED AS A COMMODITY CUSTOMER CLAIM FOR PURPOSES OF SUBCHAPTER IV OF CHAPTER 7 OF THE BANKRUPTCY CODE... and/or retail foreign exchange dealer by an introducing broker (such as payment for order flow or soft...

  18. 17 CFR 4.24 - General disclosures required.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... STATEMENT CANNOT DISCLOSE ALL THE RISKS AND OTHER FACTORS NECESSARY TO EVALUATE YOUR PARTICIPATION IN THIS... TREATED AS A COMMODITY CUSTOMER CLAIM FOR PURPOSES OF SUBCHAPTER IV OF CHAPTER 7 OF THE BANKRUPTCY CODE... and/or retail foreign exchange dealer by an introducing broker (such as payment for order flow or soft...

  19. 23 CFR 710.105 - Definitions.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... 23 Highways 1 2012-04-01 2012-04-01 false Definitions. 710.105 Section 710.105 Highways FEDERAL... ESTATE General § 710.105 Definitions. (a) Terms defined in 49 CFR part 24, and 23 CFR part 1 have the... person acquiring real property for title 23 of the United States Code purposes. Acquisition means...

  20. 29 CFR 1926.407 - Hazardous (classified) locations.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... Electrical Code, lists or defines hazardous gases, vapors, and dusts by “Groups” characterized by their... the class, group, and operating temperature or temperature range, based on operation in a 40-degree C... be marked to indicate the group. (C) Fixed general-purpose equipment in Class I locations, other than...

  1. 29 CFR 1926.407 - Hazardous (classified) locations.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... Electrical Code, lists or defines hazardous gases, vapors, and dusts by “Groups” characterized by their... the class, group, and operating temperature or temperature range, based on operation in a 40-degree C... be marked to indicate the group. (C) Fixed general-purpose equipment in Class I locations, other than...

  2. Twelfth NASTRAN (R) Users' Colloquium

    NASA Technical Reports Server (NTRS)

    1984-01-01

    NASTRAN is a large, comprehensive, nonproprietary, general-purpose finite element computer code for structural analysis. The Twelfth Users' Colloquium presents comprehensive papers on the application of finite element methods in engineering, comparisons with other approaches, unique applications, pre- and post-processing or auxiliary programs, and new methods of analysis with NASTRAN.

  3. 26 CFR 1.368-1 - Purpose and scope of exception of reorganization exchanges.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... under the Internal Revenue Code are a continuity of the business enterprise through the issuing... binding on January 28, 1998, and at all times thereafter. The continuity of business enterprise.... (d) Continuity of business enterprise—(1) General rule. Continuity of business enterprise (COBE...

  4. Albany v. 3.0

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Salinger, Andrew; Phipps, Eric; Ostien, Jakob

    2016-01-13

    The Albany code is a general-purpose finite element code for solving partial differential equations (PDEs). Albany is a research code that demonstrates how a PDE code can be built by interfacing many of the open-source software libraries that are released under Sandia's Trilinos project. Part of the mission of Albany is to be a testbed for new Trilinos libraries, to refine their methods, usability, and interfaces. Albany includes hooks to optimization and uncertainty quantification algorithms, including those in Trilinos as well as those in the Dakota toolkit. Because of this, Albany is a desirable starting point for new code development efforts that wish to make heavy use of Trilinos. Albany is both a framework and the host for specific finite element applications. These applications have project names and can be enabled by configuration options when the code is compiled, but all are developed and released as part of the single Albany code base. They include the LCM, QCAD, FELIX, Aeras, and ATO applications.

  5. WIND Flow Solver Released

    NASA Technical Reports Server (NTRS)

    Towne, Charles E.

    1999-01-01

    The WIND code is a general-purpose, structured, multizone, compressible flow solver that can be used to analyze steady or unsteady flow for a wide range of geometric configurations and over a wide range of flow conditions. WIND is the latest product of the NPARC Alliance, a formal partnership between the NASA Lewis Research Center and the Air Force Arnold Engineering Development Center (AEDC). WIND Version 1.0 was released in February 1998, and Version 2.0 will be released in February 1999. The WIND code represents a merger of the capabilities of three existing computational fluid dynamics codes--NPARC (the original NPARC Alliance flow solver), NXAIR (an Air Force code used primarily for unsteady store separation problems), and NASTD (the primary flow solver at McDonnell Douglas, now part of Boeing).

  6. On the effective implementation of a boundary element code on graphics processing units using an out-of-core LU algorithm

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    D'Azevedo, Ed F; Nintcheu Fata, Sylvain

    2012-01-01

    A collocation boundary element code for solving the three-dimensional Laplace equation, publicly available from http://www.intetec.org, has been adapted to run on an Nvidia Tesla general-purpose graphics processing unit (GPU). Global matrix assembly and LU factorization of the resulting dense matrix were performed on the GPU. Out-of-core techniques were used to solve problems larger than the available GPU memory. The code achieved over eight times speedup in matrix assembly and about 56 Gflops/sec in the LU factorization using only 512 Mbytes of GPU memory. Details of the GPU implementation and comparisons with the standard sequential algorithm are included to illustrate the performance of the GPU code.
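The approach above rests on standard dense LU factorization applied panel by panel, so that only the active portion of the matrix needs to reside in GPU memory at any time. As a minimal sketch under that framing, here is the generic unblocked right-looking LU kernel (without pivoting) that an out-of-core scheme would apply to each panel; it is a textbook kernel, not the code from the paper.

```python
def lu_inplace(A):
    """Right-looking LU without pivoting. Overwrites A with L (strictly
    below the diagonal, unit diagonal implied) and U (on and above the
    diagonal). An out-of-core scheme applies this kernel one column panel
    at a time, streaming panels between host and GPU memory."""
    n = len(A)
    for k in range(n):
        for i in range(k + 1, n):
            A[i][k] /= A[k][k]                    # multiplier l_ik
            for j in range(k + 1, n):
                A[i][j] -= A[i][k] * A[k][j]      # trailing-submatrix update
    return A
```

The trailing-submatrix update is the dominant cost and is the part that maps onto dense GPU kernels; reconstructing L·U and comparing to the original matrix is the usual correctness check for such kernels.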

  7. MCNP Version 6.2 Release Notes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Werner, Christopher John; Bull, Jeffrey S.; Solomon, C. J.

    Monte Carlo N-Particle or MCNP® is a general-purpose Monte Carlo radiation-transport code designed to track many particle types over broad ranges of energies. MCNP Version 6.2 follows the MCNP6.1.1 beta version and has been released to provide the radiation transport community with the latest feature developments and bug fixes for MCNP. Since the last release of MCNP, major work has been conducted to improve the code base, add features, and provide tools that facilitate ease of use of MCNP version 6.2 as well as the analysis of results. These release notes serve as a general guide for the new and improved physics, source, data, tallies, unstructured mesh, code enhancements, and tools. For more detailed information on each of the topics, please refer to the appropriate references or the user manual, which can be found at http://mcnp.lanl.gov. This release of MCNP version 6.2 contains 39 new features in addition to 172 bug fixes and code enhancements. There remain some 33 known issues with which users should familiarize themselves (see Appendix).

  8. SU-D-BRD-03: A Gateway for GPU Computing in Cancer Radiotherapy Research

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jia, X; Folkerts, M; Shi, F

    Purpose: The Graphics Processing Unit (GPU) has become increasingly important in radiotherapy. However, it is still difficult for general clinical researchers to access GPU codes developed by other researchers, and for developers to objectively benchmark their codes. Moreover, repeated effort is often spent developing low-quality GPU codes. The goal of this project is to establish an infrastructure for testing GPU codes, cross-comparing them, and facilitating code distribution in the radiotherapy community. Methods: We developed a system called Gateway for GPU Computing in Cancer Radiotherapy Research (GCR2). A number of GPU codes developed by our group and other developers can be accessed via a web interface. To use the services, researchers first upload their test data or use the standard data provided by our system. They then select the GPU device on which the code will be executed. Our system offers all mainstream GPU hardware for code-benchmarking purposes. After a run completes, the system automatically summarizes and displays the computing results. We also released an SDK that allows developers to build their own algorithm implementations and submit their binary codes to the system. Submitted code is then systematically benchmarked using a variety of GPU hardware and representative data provided by our system. Developers can also compare their codes with others and generate benchmarking reports. Results: The developed system is fully functional. Through a user-friendly web interface, researchers are able to test various GPU codes. Developers also benefit from this platform by comprehensively benchmarking their codes on various GPU platforms and representative clinical data sets. Conclusion: We have developed an open platform that allows clinical researchers and developers to access GPUs and GPU codes. This development will facilitate the utilization of GPUs in the radiation therapy field.

  9. PILOT: A Precision Intercoastal Loran Translocator. Volume 3. Software.

    DTIC Science & Technology

    1982-03-01

    includes a second loran receiver (for cross chain operation), an interface or modem for remotely entering TD bias values, and a printer. b. The nucleus...developing an interface board to connect to the ship’s gyro, and a TD bias modem or box, replacing the large general purpose keyboard with a small predefined...The PILOT program has divided this memory into 8K of RAM and 56K of EPROM. Of the 56K bytes of EPROM, 40K are HP code and 16K are PILOT code (see Fig. 3

  10. NORTICA—a new code for cyclotron analysis

    NASA Astrophysics Data System (ADS)

    Gorelov, D.; Johnson, D.; Marti, F.

    2001-12-01

    The new package NORTICA (Numerical ORbit Tracking In Cyclotrons with Analysis) of computer codes for beam dynamics simulations is under development at NSCL. The package was started as a replacement for the code MONSTER [1] developed in the laboratory in the past. The new codes are capable of beam dynamics simulations in both CCF (Coupled Cyclotron Facility) accelerators, the K500 and K1200 superconducting cyclotrons. The general purpose of this package is to assist in setting up and tuning the cyclotrons, taking into account main-field and extraction-channel imperfections. The computing platform for the package is an Alpha Station running the UNIX operating system with an X-Windows graphical interface. A multiple-programming-language approach was used to combine the reliability of the numerical algorithms developed in the laboratory over a long period of time with the friendliness of a modern-style user interface. This paper describes the capabilities and features of the codes in their present state.

  11. The search for person-related information in general practice: a qualitative study.

    PubMed

    Schrans, Diego; Avonts, Dirk; Christiaens, Thierry; Willems, Sara; de Smet, Kaat; van Boven, Kees; Boeckxstaens, Pauline; Kühlein, Thomas

    2016-02-01

    General practice is person-focused. Contextual information influences the clinical decision-making process in primary care. Currently, person-related information (PeRI) is neither recorded in a systematic way nor coded in the electronic medical record (EMR), and is therefore not usable for scientific use. The aim was to search for classes of PeRI influencing the process of care. GPs, from nine countries worldwide, were asked to write down narrative case histories where personal factors played a role in decision-making. In an inductive process, the case histories were consecutively coded according to classes of PeRI. The classes found were deductively applied to the following cases and refined, until saturation was reached. The classes were then grouped into code-families and further clustered into domains. The inductive analysis of 32 case histories resulted in 33 defined PeRI codes, classifying all person-related information in the cases. The 33 codes were grouped into the following seven mutually exclusive code-families: 'aspects between patient and formal care provider', 'social environment and family', 'functioning/behaviour', 'life history/non-medical experiences', 'personal medical information', 'socio-demographics' and 'work-/employment-related information'. The code-families were clustered into four domains: 'social environment and extended family', 'medicine', 'individual' and 'work and employment'. As PeRI is used in the process of decision-making, it should be part of the EMR. The PeRI classes we identified might form the basis of a new contextual classification, mainly for research purposes. This might help to create evidence of the person-centredness of general practice. © The Author 2016. Published by Oxford University Press. All rights reserved.

  12. 26 CFR 1.851-7 - Certain unit investment trusts.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... 26 Internal Revenue 9 2011-04-01 2011-04-01 false Certain unit investment trusts. 1.851-7 Section... (CONTINUED) INCOME TAXES (CONTINUED) Regulated Investment Companies and Real Estate Investment Trusts § 1.851-7 Certain unit investment trusts. (a) In general. For purposes of the Internal Revenue Code, a unit...

  13. 4 CFR 2.2 - References.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 4 Accounts 1 2010-01-01 2010-01-01 false References. 2.2 Section 2.2 Accounts GOVERNMENT ACCOUNTABILITY OFFICE PERSONNEL SYSTEM PURPOSE AND GENERAL PROVISION § 2.2 References. (a) Subchapters III and IV of Chapter 7 of Title 31 U.S.C. (b) Title 5, United States Code. [45 FR 68375, Oct. 15, 1980, as...

  14. 4 CFR 2.2 - References.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 4 Accounts 1 2014-01-01 2013-01-01 true References. 2.2 Section 2.2 Accounts GOVERNMENT ACCOUNTABILITY OFFICE PERSONNEL SYSTEM PURPOSE AND GENERAL PROVISION § 2.2 References. (a) Subchapters III and IV of Chapter 7 of Title 31 U.S.C. (b) Title 5, United States Code. [45 FR 68375, Oct. 15, 1980, as...

  15. 4 CFR 2.2 - References.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 4 Accounts 1 2011-01-01 2011-01-01 false References. 2.2 Section 2.2 Accounts GOVERNMENT ACCOUNTABILITY OFFICE PERSONNEL SYSTEM PURPOSE AND GENERAL PROVISION § 2.2 References. (a) Subchapters III and IV of Chapter 7 of Title 31 U.S.C. (b) Title 5, United States Code. [45 FR 68375, Oct. 15, 1980, as...

  16. 4 CFR 2.2 - References.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 4 Accounts 1 2012-01-01 2012-01-01 false References. 2.2 Section 2.2 Accounts GOVERNMENT ACCOUNTABILITY OFFICE PERSONNEL SYSTEM PURPOSE AND GENERAL PROVISION § 2.2 References. (a) Subchapters III and IV of Chapter 7 of Title 31 U.S.C. (b) Title 5, United States Code. [45 FR 68375, Oct. 15, 1980, as...

  17. 4 CFR 2.2 - References.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 4 Accounts 1 2013-01-01 2013-01-01 false References. 2.2 Section 2.2 Accounts GOVERNMENT ACCOUNTABILITY OFFICE PERSONNEL SYSTEM PURPOSE AND GENERAL PROVISION § 2.2 References. (a) Subchapters III and IV of Chapter 7 of Title 31 U.S.C. (b) Title 5, United States Code. [45 FR 68375, Oct. 15, 1980, as...

  18. Composeable Chat over Low-Bandwidth Intermittent Communication Links

    DTIC Science & Technology

    2007-04-01

    Compression (STC), introduced in this report, is a data compression algorithm intended to compress alphanumeric... Ziv-Lempel coding, the grandfather of most modern general-purpose file compression programs, watches for input symbol sequences that have previously... data. This section applies these techniques to create a new compression algorithm called Small Text Compression. Various sequence compression
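The Ziv-Lempel idea in the excerpt, watching for input sequences that have been seen before, can be illustrated with a minimal LZ78-style coder. This is a generic textbook sketch, not the Small Text Compression algorithm of the report:

```python
def lz78_compress(text):
    """Emit (dictionary_index, next_char) pairs; index 0 is the empty phrase."""
    dictionary = {"": 0}
    phrase = ""
    out = []
    for ch in text:
        if phrase + ch in dictionary:
            phrase += ch                      # keep extending a known phrase
        else:
            out.append((dictionary[phrase], ch))
            dictionary[phrase + ch] = len(dictionary)
            phrase = ""
    if phrase:                                # flush a trailing known phrase
        out.append((dictionary[phrase[:-1]], phrase[-1]))
    return out

def lz78_decompress(pairs):
    phrases = [""]
    out = []
    for idx, ch in pairs:
        phrase = phrases[idx] + ch
        phrases.append(phrase)
        out.append(phrase)
    return "".join(out)
```

Each emitted pair references a previously seen phrase plus one new symbol, so repeated sequences cost progressively less to encode, which is exactly the behavior the snippet alludes to.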

  19. 26 CFR 1.851-7 - Certain unit investment trusts.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... (CONTINUED) INCOME TAXES (CONTINUED) Regulated Investment Companies and Real Estate Investment Trusts § 1.851-7 Certain unit investment trusts. (a) In general. For purposes of the Internal Revenue Code, a unit... 26 Internal Revenue 9 2013-04-01 2013-04-01 false Certain unit investment trusts. 1.851-7 Section...

  20. 26 CFR 1.851-7 - Certain unit investment trusts.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... (CONTINUED) INCOME TAXES (CONTINUED) Regulated Investment Companies and Real Estate Investment Trusts § 1.851-7 Certain unit investment trusts. (a) In general. For purposes of the Internal Revenue Code, a unit... 26 Internal Revenue 9 2014-04-01 2014-04-01 false Certain unit investment trusts. 1.851-7 Section...

  1. 26 CFR 1.851-7 - Certain unit investment trusts.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... (CONTINUED) INCOME TAXES (CONTINUED) Regulated Investment Companies and Real Estate Investment Trusts § 1.851-7 Certain unit investment trusts. (a) In general. For purposes of the Internal Revenue Code, a unit... 26 Internal Revenue 9 2012-04-01 2012-04-01 false Certain unit investment trusts. 1.851-7 Section...

  2. 26 CFR 1.851-7 - Certain unit investment trusts.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 26 Internal Revenue 9 2010-04-01 2010-04-01 false Certain unit investment trusts. 1.851-7 Section... (CONTINUED) INCOME TAXES Regulated Investment Companies and Real Estate Investment Trusts § 1.851-7 Certain unit investment trusts. (a) In general. For purposes of the Internal Revenue Code, a unit investment...

  3. 26 CFR 1.168(i)-1 - General asset accounts.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... under paragraph (k) of this section. (b) Definitions. For purposes of this section, the following.... If a taxpayer makes the election under paragraph (k) of this section, assets that are subject to the... the Code that treat gain on a disposition as subject to section 1245 or 1250). (iii) Effect of...

  4. 26 CFR 1.168(i)-1 - General asset accounts.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... under paragraph (k) of this section. (b) Definitions. For purposes of this section, the following.... If a taxpayer makes the election under paragraph (k) of this section, assets that are subject to the... the Code that treat gain on a disposition as subject to section 1245 or 1250). (iii) Effect of...

  5. Combining analysis with optimization at Langley Research Center. An evolutionary process

    NASA Technical Reports Server (NTRS)

    Rogers, J. L., Jr.

    1982-01-01

    The evolutionary process of combining analysis and optimization codes is traced with a view toward providing insight into the long-term goal of developing the methodology for an integrated, multidisciplinary software system for the concurrent analysis and optimization of aerospace structures. The process is traced along the lines of strength sizing, concurrent strength and flutter sizing, and general optimization to define a near-term goal for combining analysis and optimization codes. Development of a modular software system combining general-purpose, state-of-the-art, production-level analysis computer programs for structures, aerodynamics, and aeroelasticity with a state-of-the-art optimization program is required. Incorporating a modular and flexible structural optimization software system into a state-of-the-art finite element analysis computer program will facilitate this effort. The resulting software system is controlled with a special-purpose language, communicates with a data management system, and is easily modified to add new programs and capabilities. A 337-degree-of-freedom finite element model is used in verifying the accuracy of the system.

  6. VoxelMages: a general-purpose graphical interface for designing geometries and processing DICOM images for PENELOPE.

    PubMed

    Giménez-Alventosa, V; Ballester, F; Vijande, J

    2016-12-01

    The design and construction of geometries for Monte Carlo calculations is an error-prone, time-consuming, and complex step in simulations describing particle interactions and transport in the field of medical physics. The software VoxelMages has been developed to help the user in this task. It allows the user to design complex geometries and to process DICOM image files for simulations with the general-purpose Monte Carlo code PENELOPE in an easy and straightforward way. VoxelMages also allows importing DICOM-RT structure contour information as delivered by a treatment planning system. Its main characteristics, usage, and performance benchmarking are described in detail. Copyright © 2016 Elsevier Ltd. All rights reserved.

  7. General purpose graphics-processing-unit implementation of cosmological domain wall network evolution.

    PubMed

    Correia, J R C C C; Martins, C J A P

    2017-10-01

    Topological defects unavoidably form at symmetry breaking phase transitions in the early universe. To probe the parameter space of theoretical models and set tighter experimental constraints (exploiting the recent advances in astrophysical observations), one requires more and more demanding simulations, and therefore more hardware resources and computation time. Improving the speed and efficiency of existing codes is essential. Here we present a general purpose graphics-processing-unit implementation of the canonical Press-Ryden-Spergel algorithm for the evolution of cosmological domain wall networks. This is ported to the Open Computing Language standard, and as a consequence significant speedups are achieved both in two-dimensional (2D) and 3D simulations.

  8. High-speed architecture for the decoding of trellis-coded modulation

    NASA Technical Reports Server (NTRS)

    Osborne, William P.

    1992-01-01

    Since 1971, when the Viterbi Algorithm was introduced as the optimal method of decoding convolutional codes, improvements in circuit technology, especially VLSI, have steadily increased its speed and practicality. Trellis-Coded Modulation (TCM) combines convolutional coding with higher-level modulation (non-binary source alphabet) to provide forward error correction and spectral efficiency. For binary codes, the current state-of-the-art is a 64-state Viterbi decoder on a single CMOS chip, operating at a data rate of 25 Mbps. Recently, there has been interest in increasing the speed of the Viterbi Algorithm by improving the decoder architecture or by simplifying the algorithm itself. Designs employing new architectural techniques are now in existence; however, these techniques are currently applied to simpler binary codes, not to TCM. The purpose of this report is to discuss TCM architectural considerations in general, and to present the design, at the logic gate level, of a specific TCM decoder which applies these considerations to achieve high-speed decoding.
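The Viterbi decoding that TCM builds on can be illustrated for a plain binary convolutional code; a TCM decoder replaces the Hamming branch metric below with Euclidean distances between received and candidate modulation symbols. The rate-1/2, constraint-length-3 code (generators 7 and 5 octal) is a standard textbook example, not the decoder designed in this report.

```python
def conv_encode(bits, g=(0b111, 0b101)):
    """Rate-1/2, constraint-length-3 convolutional encoder."""
    state = 0
    out = []
    for b in bits:
        reg = (b << 2) | state                      # input bit plus 2-bit state
        out.extend(bin(reg & gi).count("1") % 2 for gi in g)
        state = reg >> 1
    return out

def viterbi_decode(received, nbits, g=(0b111, 0b101)):
    """Hard-decision Viterbi decoding over the 4-state trellis."""
    INF = float("inf")
    metrics = [0 if s == 0 else INF for s in range(4)]   # start in state 0
    paths = [[] for _ in range(4)]
    for t in range(nbits):
        r = received[2 * t:2 * t + 2]
        new_metrics = [INF] * 4
        new_paths = [None] * 4
        for s in range(4):
            if metrics[s] == INF:
                continue
            for b in (0, 1):
                reg = (b << 2) | s
                expected = [bin(reg & gi).count("1") % 2 for gi in g]
                dist = sum(x != y for x, y in zip(expected, r))
                ns = reg >> 1
                m = metrics[s] + dist
                if m < new_metrics[ns]:                  # survivor selection
                    new_metrics[ns] = m
                    new_paths[ns] = paths[s] + [b]
        metrics, paths = new_metrics, new_paths
    best = min(range(4), key=lambda s: metrics[s])
    return paths[best]
```

Because this code has free distance 5, the decoder recovers the transmitted bits even when a coded bit is flipped in the channel, which is the error-correcting behavior the architectural work aims to accelerate.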

  9. Implementation of Finite Volume based Navier Stokes Algorithm Within General Purpose Flow Network Code

    NASA Technical Reports Server (NTRS)

    Schallhorn, Paul; Majumdar, Alok

    2012-01-01

    This paper describes a finite volume based numerical algorithm that allows multi-dimensional computation of fluid flow within a system-level network flow analysis. There are several thermo-fluid engineering problems where higher-fidelity solutions are needed that are not within the capacity of system-level codes. The proposed algorithm will allow NASA's Generalized Fluid System Simulation Program (GFSSP) to perform multi-dimensional flow calculations within the framework of GFSSP's typical system-level flow network consisting of fluid nodes and branches. The paper presents several classical two-dimensional fluid dynamics problems that have been solved by GFSSP's multi-dimensional flow solver. The numerical solutions are compared with analytical and benchmark solutions for Poiseuille flow, Couette flow, and flow in a driven cavity.
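One of the classical checks mentioned, Poiseuille flow, fits in a few lines of finite-volume code. The sketch below solves fully developed channel flow, mu d²u/dy² = dp/dx, on n cells with no-slip walls via a tridiagonal (Thomas) solve; it is a generic illustration, not GFSSP's solver.

```python
def poiseuille_fv(n=50, height=1.0, dpdx=-1.0, mu=1.0):
    """Finite-volume solution of mu * d2u/dy2 = dp/dx between no-slip walls.
    Cell centers sit at (i + 0.5) * dy; the wall flux uses the half-cell
    distance dy/2, which gives the -3 diagonal entries in the end cells."""
    dy = height / n
    S = dpdx * dy * dy / mu
    a = [1.0] * n            # sub-diagonal
    b = [-2.0] * n           # diagonal
    c = [1.0] * n            # super-diagonal
    d = [S] * n              # right-hand side
    b[0] = b[-1] = -3.0      # no-slip walls at half-cell distance
    # Thomas algorithm: forward elimination, then back substitution
    for i in range(1, n):
        w = a[i] / b[i - 1]
        b[i] -= w * c[i - 1]
        d[i] -= w * d[i - 1]
    u = [0.0] * n
    u[-1] = d[-1] / b[-1]
    for i in range(n - 2, -1, -1):
        u[i] = (d[i] - c[i] * u[i + 1]) / b[i]
    return u
```

For height 1, mu = 1, and dp/dx = -1, the analytic centerline velocity is -dp/dx · H² / (8 mu) = 0.125, which the discrete maximum approaches as n grows; that is the style of analytic comparison the paper reports.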

  10. Energy levels and radiative rates for transitions in Cr-like Co IV and Ni V

    NASA Astrophysics Data System (ADS)

    Aggarwal, K. M.; Bogdanovich, P.; Karpuškienė, R.; Keenan, F. P.; Kisielius, R.; Stancalie, V.

    2016-01-01

    We report calculations of energy levels and radiative rates (A-values) for transitions in Cr-like Co IV and Ni V. The quasi-relativistic Hartree-Fock (QRHF) code is adopted for calculating the data, although GRASP (general-purpose relativistic atomic structure package) and the flexible atomic code (FAC) have also been employed for comparison purposes. No radiative rates are available in the literature to compare with our results, but our calculated energies are in close agreement with those compiled by NIST for a majority of the levels. However, there are discrepancies of up to 3% for a few levels. The A-values are listed for all significantly contributing E1, E2, and M1 transitions, and the corresponding lifetimes are reported, although unfortunately no previous theoretical or experimental results exist to compare with our data.

  11. Computational techniques for solar wind flows past terrestrial planets: Theory and computer programs

    NASA Technical Reports Server (NTRS)

    Stahara, S. S.; Chaussee, D. S.; Trudinger, B. C.; Spreiter, J. R.

    1977-01-01

    The interaction of the solar wind with terrestrial planets can be predicted using a computer program based on a single fluid, steady, dissipationless, magnetohydrodynamic model to calculate the axisymmetric, supersonic, super-Alfvenic solar wind flow past both magnetic and nonmagnetic planets. The actual calculations are implemented by an assemblage of computer codes organized into one program. These include finite difference codes which determine the gas-dynamic solution, together with a variety of special purpose output codes for determining and automatically plotting both flow field and magnetic field results. Comparisons are made with previous results, and results are presented for a number of solar wind flows. The computational programs developed are documented and are presented in a general user's manual which is included.

  12. ForTrilinos

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Evans, Katherine J; Johnson, Seth R; Prokopenko, Andrey V

'ForTrilinos' is related to The Trilinos Project, which contains a large and growing collection of solver capabilities that can utilize next-generation platforms, in particular scalable multicore, manycore, accelerator and heterogeneous systems. Trilinos is primarily written in C++, including its user interfaces. While C++ is advantageous for gaining access to the latest programming environments, it limits Trilinos usage from Fortran. Several ad hoc translation interfaces exist to enable Fortran usage of Trilinos, but none of these interfaces is general-purpose or written for reusable and sustainable external use. 'ForTrilinos' provides a seamless pathway for large and complex Fortran-based codes to access Trilinos without C/C++ interface code. This access includes Fortran versions of Kokkos abstractions for code execution and data management.

  13. Synthesizing Certified Code

    NASA Technical Reports Server (NTRS)

    Whalen, Michael; Schumann, Johann; Fischer, Bernd

    2002-01-01

Code certification is a lightweight approach to demonstrate software quality on a formal level. Its basic idea is to require producers to provide formal proofs that their code satisfies certain quality properties. These proofs serve as certificates which can be checked independently. Since code certification uses the same underlying technology as program verification, it also requires many detailed annotations (e.g., loop invariants) to make the proofs possible. However, manually adding these annotations to the code is time-consuming and error-prone. We address this problem by combining code certification with automatic program synthesis. We propose an approach that generates, from a high-level specification, both the code and all annotations required to certify it. Here, we describe a certification extension of AUTOBAYES, a synthesis tool which automatically generates complex data analysis programs from compact specifications. AUTOBAYES contains sufficient high-level domain knowledge to generate detailed annotations. This allows us to use a general-purpose verification condition generator to produce a set of proof obligations in first-order logic. The obligations are then discharged using the automated theorem prover E-SETHEO. We demonstrate our approach by certifying operator safety for a generated iterative data classification program without manual annotation of the code.

  14. To amend title 40, United States Code, to direct the Inspector General of the Department of Transportation to conduct an annual independent financial audit of the Union Station Redevelopment Corporation, and for other purposes.

    THOMAS, 112th Congress

    Rep. Norton, Eleanor Holmes [D-DC-At Large

    2011-07-28

House - 07/29/2011 Referred to the Subcommittee on Economic Development, Public Buildings and Emergency Management. Tracker: This bill has the status Introduced.

  15. 26 CFR 1.614-1 - Definition of property.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... 26 Internal Revenue 7 2014-04-01 2013-04-01 true Definition of property. 1.614-1 Section 1.614-1...) INCOME TAXES (CONTINUED) Natural Resources § 1.614-1 Definition of property. (a) General rule. (1) For purposes of subtitle A of the Code, in the case of mines, wells, and other natural deposits, the term...

  16. 26 CFR 1.614-1 - Definition of property.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... 26 Internal Revenue 7 2011-04-01 2009-04-01 true Definition of property. 1.614-1 Section 1.614-1...) INCOME TAXES (CONTINUED) Natural Resources § 1.614-1 Definition of property. (a) General rule. (1) For purposes of subtitle A of the Code, in the case of mines, wells, and other natural deposits, the term...

  17. 26 CFR 1.614-1 - Definition of property.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... 26 Internal Revenue 7 2013-04-01 2013-04-01 false Definition of property. 1.614-1 Section 1.614-1...) INCOME TAXES (CONTINUED) Natural Resources § 1.614-1 Definition of property. (a) General rule. (1) For purposes of subtitle A of the Code, in the case of mines, wells, and other natural deposits, the term...

  18. 26 CFR 1.614-1 - Definition of property.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... 26 Internal Revenue 7 2012-04-01 2012-04-01 false Definition of property. 1.614-1 Section 1.614-1...) INCOME TAXES (CONTINUED) Natural Resources § 1.614-1 Definition of property. (a) General rule. (1) For purposes of subtitle A of the Code, in the case of mines, wells, and other natural deposits, the term...

  19. 26 CFR 1.614-1 - Definition of property.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 26 Internal Revenue 7 2010-04-01 2010-04-01 true Definition of property. 1.614-1 Section 1.614-1...) INCOME TAXES (CONTINUED) Natural Resources § 1.614-1 Definition of property. (a) General rule. (1) For purposes of subtitle A of the Code, in the case of mines, wells, and other natural deposits, the term...

  20. A bill to amend the Internal Revenue Code of 1986 with respect to the proper tax treatment of certain indebtedness discharged in 2009 or 2010, and for other purposes.

    THOMAS, 111th Congress

    Sen. Ensign, John [R-NV

    2009-01-06

Senate - 01/07/2009 Read the second time. Placed on Senate Legislative Calendar under General Orders. Calendar No. 11. Tracker: This bill has the status Introduced.

  1. ToonTalk(TM)--An Animated Programming Environment for Children.

    ERIC Educational Resources Information Center

    Kahn, Ken

    This paper describes ToonTalk, a general-purpose concurrent programming system in which the source code is animated and the programming environment is a video game. The design objectives of ToonTalk were to create a self-teaching programming system for children that was also a very powerful and flexible programming tool. A keyboard can be used for…

  2. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Platania, P., E-mail: platania@ifp.cnr.it; Figini, L.; Farina, D.

The purpose of this work is the optical modeling and physical performance evaluation of the JT-60SA ECRF launcher system. The beams have been simulated with the electromagnetic code GRASP® and used as input for ECCD calculations performed with the beam tracing code GRAY, which is capable of modeling propagation, absorption and current drive of an EC Gaussian beam with general astigmatism. Full details of the optical analysis have been taken into account to model the launched beams. Inductive and advanced reference scenarios have been analysed for physical evaluations in the full poloidal and toroidal steering ranges for two slightly different layouts of the launcher system.
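GRAY traces EC Gaussian beams with general astigmatism; for the much simpler stigmatic case, the familiar textbook relation for the beam radius can be sketched as below (the wavelength and waist are made-up illustrative values, not JT-60SA launcher parameters):

```python
import math

def beam_width(z, w0, wavelength):
    """1/e^2 radius of a stigmatic Gaussian beam at distance z from its waist w0."""
    z_rayleigh = math.pi * w0**2 / wavelength  # Rayleigh range
    return w0 * math.sqrt(1.0 + (z / z_rayleigh)**2)

# Illustrative numbers only: ~2.7 mm wavelength (110 GHz range), 25 mm waist.
w_at_1m = beam_width(1.0, 0.025, 2.7e-3)
```

The general-astigmatism case handled by GRAY replaces this single width by two principal widths and curvatures whose axes may rotate along the path.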

  3. Simulation of Fatigue Behavior of High Temperature Metal Matrix Composites

    NASA Technical Reports Server (NTRS)

    Tong, Mike T.; Singhal, Suren N.; Chamis, Christos C.; Murthy, Pappu L. N.

    1996-01-01

A generalized, relatively new approach is described for the computational simulation of fatigue behavior of high temperature metal matrix composites (HT-MMCs). This theory is embedded in a special-purpose computer code. The effectiveness of the computer code in predicting the fatigue behavior of HT-MMCs is demonstrated by applying it to a silicon-fiber/titanium-matrix HT-MMC. Comparative results are shown for mechanical fatigue, thermal fatigue, thermomechanical (in-phase and out-of-phase) fatigue, as well as the effects of oxidizing environments on fatigue life. These results show that the new approach reproduces available experimental data remarkably well.

  4. SU-E-T-254: Optimization of GATE and PHITS Monte Carlo Code Parameters for Uniform Scanning Proton Beam Based On Simulation with FLUKA General-Purpose Code

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kurosu, K; Department of Medical Physics ' Engineering, Osaka University Graduate School of Medicine, Osaka; Takashina, M

Purpose: Monte Carlo codes are becoming important tools for proton beam dosimetry. However, the relationships between the customizing parameters and the percentage depth dose (PDD) of the GATE and PHITS codes have not been reported; here they are studied for PDD and proton range against the FLUKA code and experimental data. Methods: The beam delivery system of the Indiana University Health Proton Therapy Center was modeled for the uniform scanning beam in FLUKA and transferred identically into GATE and PHITS. This computational model was built from the blueprint and validated with the commissioning data. The three parameters evaluated are the maximum step size, the cut-off energy, and the physics and transport model. The dependence of the PDDs on the customizing parameters was compared with the published results of previous studies. Results: The optimal parameters for the simulation of the whole beam delivery system were defined by referring to the calculation results obtained with each parameter. Although the PDDs from FLUKA and the experimental data show a good agreement, those of GATE and PHITS obtained with our optimal parameters show a minor discrepancy. The measured proton range R90 was 269.37 mm, compared to the calculated range of 269.63 mm, 268.96 mm, and 270.85 mm with FLUKA, GATE and PHITS, respectively. Conclusion: We evaluated the dependence of the PDDs obtained with the GATE and PHITS general-purpose Monte Carlo codes on the customizing parameters by using the whole computational model of the treatment nozzle. The optimal parameters for the simulation were then defined by referring to the calculation results. The physical model, particle transport mechanics and the different geometry-based descriptions need accurate customization in the three simulation codes to agree with experimental data for artifact-free Monte Carlo simulation.
This study was supported by Grants-in-Aid for Cancer Research (H22-3rd Term Cancer Control-General-043) from the Ministry of Health, Labor and Welfare of Japan, Grants-in-Aid for Scientific Research (No. 23791419), and the JSPS Core-to-Core program (No. 23003). The authors have no conflict of interest.
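The R90 comparison above depends on extracting the range from a simulated or measured PDD curve: R90 is the depth on the distal falloff where the dose drops to 90% of its maximum. A minimal sketch with made-up depth/dose samples (not the paper's data):

```python
import numpy as np

def r90(depths, pdd):
    """Depth (same units as `depths`) where the distal falloff crosses 90% of max dose."""
    depths = np.asarray(depths, dtype=float)
    pdd = np.asarray(pdd, dtype=float)
    i_max = int(np.argmax(pdd))
    target = 0.9 * pdd.max()
    # Walk the distal edge until the dose drops below 90%, then interpolate linearly.
    for i in range(i_max, len(pdd) - 1):
        if pdd[i] >= target > pdd[i + 1]:
            frac = (pdd[i] - target) / (pdd[i] - pdd[i + 1])
            return depths[i] + frac * (depths[i + 1] - depths[i])
    raise ValueError("no 90% crossing found distal to the peak")

# Toy PDD: buildup to a Bragg-like peak near 250 mm, then a sharp falloff.
d = [0.0, 100.0, 200.0, 250.0, 260.0, 270.0, 280.0]
p = [40.0, 45.0, 55.0, 100.0, 95.0, 80.0, 10.0]
range_90 = r90(d, p)
```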

  5. DFTB+ and lanthanides

    NASA Astrophysics Data System (ADS)

    Hourahine, B.; Aradi, B.; Frauenheim, T.

    2010-07-01

    DFTB+ is a recent general purpose implementation of density-functional based tight binding. One of the early motivators to develop this code was to investigate lanthanide impurities in nitride semiconductors, leading to a series of successful studies into structure and electrical properties of these systems. Here we describe our general framework to treat the physical effects needed for these problematic impurities within a tight-binding formalism, additionally discussing forces and stresses in DFTB. We also present an approach to evaluate the general case of Slater-Koster transforms and all of their derivatives in Cartesian coordinates. These developments are illustrated by simulating isolated Gd impurities in GaN.

  6. Diagnosis - Using automatic test equipment and artificial intelligence expert systems

    NASA Astrophysics Data System (ADS)

    Ramsey, J. E., Jr.

    Three expert systems (ATEOPS, ATEFEXPERS, and ATEFATLAS), which were created to direct automatic test equipment (ATE), are reviewed. The purpose of the project was to develop an expert system to troubleshoot the converter-programmer power supply card for the F-15 aircraft and have that expert system direct the automatic test equipment. Each expert system uses a different knowledge base or inference engine, basing the testing on the circuit schematic, test requirements document, or ATLAS code. Implementing generalized modules allows the expert systems to be used for any different unit under test. Using converted ATLAS to LISP code allows the expert system to direct any ATE using ATLAS. The constraint propagated frame system allows for the expansion of control by creating the ATLAS code, checking the code for good software engineering techniques, directing the ATE, and changing the test sequence as needed (planning).

  7. Object-oriented approach for gas turbine engine simulation

    NASA Technical Reports Server (NTRS)

    Curlett, Brian P.; Felder, James L.

    1995-01-01

An object-oriented gas turbine engine simulation program was developed. This program is a prototype for a more complete, commercial-grade engine performance program now being proposed as part of the Numerical Propulsion System Simulator (NPSS). This report discusses architectural issues of this complex software system and the lessons learned from developing the prototype code. The prototype code is a fully functional, general-purpose engine simulation program; however, only the component models necessary to model a transient compressor test rig have been written. The production system will be capable of steady state and transient modeling of almost any turbine engine configuration. Chief among the architectural considerations for this code was the framework in which the various software modules interact. These modules include the equation solver, simulation code, data model, event handler, and user interface. Also documented in this report are the component-based design of the simulation module and the inter-component communication paradigm. Object class hierarchies for some of the code modules are given.

  8. Accelerated GPU based SPECT Monte Carlo simulations.

    PubMed

    Garcia, Marie-Paule; Bert, Julien; Benoit, Didier; Bardiès, Manuel; Visvikis, Dimitris

    2016-06-07

Monte Carlo (MC) modelling is widely used in the field of single photon emission computed tomography (SPECT) as it is a reliable technique to simulate very high quality scans. This technique provides very accurate modelling of the radiation transport and particle interactions in a heterogeneous medium. Various MC codes exist for nuclear medicine imaging simulations. Recently, new strategies exploiting the computing capabilities of graphical processing units (GPU) have been proposed. This work aims at evaluating the accuracy of such GPU implementation strategies in comparison to standard MC codes in the context of SPECT imaging. GATE was considered the reference MC toolkit and used to evaluate the performance of newly developed GPU Geant4-based Monte Carlo simulation (GGEMS) modules for SPECT imaging. Radioisotopes with different photon energies were used with these various CPU and GPU Geant4-based MC codes in order to assess the best strategy for each configuration. Three different isotopes were considered: 99mTc, 111In and 131I, using a low energy high resolution (LEHR) collimator, a medium energy general purpose (MEGP) collimator and a high energy general purpose (HEGP) collimator, respectively. Point source, uniform source, cylindrical phantom and anthropomorphic phantom acquisitions were simulated using a model of the GE Infinia II 3/8" gamma camera. Both simulation platforms yielded a similar system sensitivity and image statistical quality for the various combinations. The overall acceleration factor between the GATE and GGEMS platforms derived from the same cylindrical phantom acquisition was between 18 and 27 for the different radioisotopes. In addition, a full MC simulation using an anthropomorphic phantom showed the full potential of the GGEMS platform, with a resulting acceleration factor up to 71.
The good agreement with reference codes and the acceleration factors obtained support the use of GPU implementation strategies for improving computational efficiency of SPECT imaging simulations.

  9. Verification of transport equations in a general purpose commercial CFD code.

    NASA Astrophysics Data System (ADS)

    Melot, Matthieu; Nennemann, Bernd; Deschênes, Claire

    2016-11-01

In this paper, the Verification and Validation methodology is presented. This method aims to increase the reliability of, and the trust that can be placed in, complex CFD simulations. The first step of this methodology, code verification, is presented in greater detail. The CFD transport equations in steady state, transient and Arbitrary Lagrangian-Eulerian (ALE, used for transient moving mesh) formulations in Ansys CFX are verified. It is shown that the expected spatial and temporal orders of convergence are achieved for the steady state and the transient formulations. Unfortunately this is not completely the case for the ALE formulation. As for many other commercial and in-house CFD codes, the temporal convergence of the velocity is limited to first order where second order would have been expected.
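The spatial and temporal orders of convergence checked in a code verification study are typically estimated from errors on two systematically refined grids or time steps. A minimal sketch of the standard estimate (the error values are hypothetical):

```python
import math

def observed_order(e_coarse, e_fine, refinement=2.0):
    """Observed order of convergence from errors on two grids (or time steps)
    whose spacings differ by the given refinement ratio."""
    return math.log(e_coarse / e_fine) / math.log(refinement)

# Halving the step size cuts the error by 4x -> observed second order.
order = observed_order(4.0e-3, 1.0e-3)
```

A verified scheme should reproduce its formal order in this estimate as the grid is refined; a drop to first order, as reported for the ALE velocity above, shows up directly in this ratio.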

  10. Fifteenth NASTRAN (R) Users' Colloquium

    NASA Technical Reports Server (NTRS)

    1987-01-01

    Numerous applications of the NASA Structural Analysis (NASTRAN) computer program, a general purpose finite element code, are discussed. Additional features that can be added to NASTRAN, interactive plotting of NASTRAN data on microcomputers, mass modeling for bars, the design of wind tunnel models, the analysis of ship structures subjected to underwater explosions, and buckling analysis of radio antennas are among the topics discussed.

  11. STAR-CCM+ Verification and Validation Plan

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pointer, William David

    2016-09-30

The commercial Computational Fluid Dynamics (CFD) code STAR-CCM+ provides general-purpose finite volume method solutions for fluid dynamics and energy transport. This document defines plans for verification and validation (V&V) of the base code and models implemented within the code by the Consortium for Advanced Simulation of Light water reactors (CASL). The software quality assurance activities described herein are part of the overall software life cycle defined in the CASL Software Quality Assurance (SQA) Plan [Sieger, 2015]. STAR-CCM+ serves as the principal foundation for development of an advanced predictive multi-phase boiling simulation capability within CASL. The CASL Thermal Hydraulics Methods (THM) team develops advanced closure models required to describe the subgrid-resolution behavior of secondary fluids or fluid phases in multiphase boiling flows within the Eulerian-Eulerian framework of the code. These include wall heat partitioning models that describe the formation of vapor on the surface and the forces that define bubble/droplet dynamic motion. The CASL models are implemented as user coding or field functions within the general framework of the code. This report defines procedures and requirements for V&V of the multi-phase CFD capability developed by CASL THM. Results of V&V evaluations will be documented in a separate STAR-CCM+ V&V assessment report. This report is expected to be a living document and will be updated as additional validation cases are identified and adopted as part of the CASL THM V&V suite.

  12. GeoFramework: A Modeling Framework for Solid Earth Geophysics

    NASA Astrophysics Data System (ADS)

    Gurnis, M.; Aivazis, M.; Tromp, J.; Tan, E.; Thoutireddy, P.; Liu, Q.; Choi, E.; Dicaprio, C.; Chen, M.; Simons, M.; Quenette, S.; Appelbe, B.; Aagaard, B.; Williams, C.; Lavier, L.; Moresi, L.; Law, H.

    2003-12-01

As data sets in geophysics become larger and of greater relevance to other earth science disciplines, and as earth science becomes more interdisciplinary in general, modeling tools are being driven in new directions. There is now a greater need to link modeling codes to one another, link modeling codes to multiple datasets, and make modeling software available to non-specialists. Coupled with rapid progress in computer hardware (including the computational speed afforded by massively parallel computers), progress in numerical algorithms, and the introduction of software frameworks, these lofty goals of merging software in geophysics are now possible. The GeoFramework project, a collaboration between computer scientists and geoscientists, is a response to these needs and opportunities. GeoFramework is based on and extends Pyre, a Python-based modeling framework, recently developed to link solid (Lagrangian) and fluid (Eulerian) models, as well as mesh generators, visualization packages, and databases, with one another for engineering applications. The utility and generality of Pyre as a general-purpose framework in science is now being recognized. Besides its use in engineering and geophysics, it is also being used in particle physics and astronomy. Geology and geophysics impose their own unique requirements on software frameworks which are not generally available in existing frameworks, and so there is a need for research in this area. One of the special requirements is the way Lagrangian and Eulerian codes will need to be linked in time and space within a plate tectonics context. GeoFramework has grown beyond its initial goal of linking a limited number of existing codes together.
The following codes are now being reengineered within the context of Pyre: Tecton, a 3-D FE visco-elastic code for lithospheric relaxation; CitcomS, a code for spherical mantle convection; SPECFEM3D, a SEM code for global and regional seismic waves; eqsim, a FE code for dynamic earthquake rupture; SNAC, a developing 3-D code based on the FLAC method for visco-elastoplastic deformation; SNARK, a 3-D FE-PIC method for viscoplastic deformation; and GPlates, an open source paleogeographic/plate tectonics modeling package. We will demonstrate how codes can be linked with themselves, such as a regional and global model of mantle convection and a visco-elastoplastic representation of the crust within viscous mantle flow. Finally, we will describe how http://GeoFramework.org has become a distribution site for a suite of modeling software in geophysics.

  13. A multiphysics and multiscale software environment for modeling astrophysical systems

    NASA Astrophysics Data System (ADS)

    Portegies Zwart, Simon; McMillan, Steve; Harfst, Stefan; Groen, Derek; Fujii, Michiko; Nualláin, Breanndán Ó.; Glebbeek, Evert; Heggie, Douglas; Lombardi, James; Hut, Piet; Angelou, Vangelis; Banerjee, Sambaran; Belkus, Houria; Fragos, Tassos; Fregeau, John; Gaburov, Evghenii; Izzard, Rob; Jurić, Mario; Justham, Stephen; Sottoriva, Andrea; Teuben, Peter; van Bever, Joris; Yaron, Ofer; Zemp, Marcel

    2009-05-01

    We present MUSE, a software framework for combining existing computational tools for different astrophysical domains into a single multiphysics, multiscale application. MUSE facilitates the coupling of existing codes written in different languages by providing inter-language tools and by specifying an interface between each module and the framework that represents a balance between generality and computational efficiency. This approach allows scientists to use combinations of codes to solve highly coupled problems without the need to write new codes for other domains or significantly alter their existing codes. MUSE currently incorporates the domains of stellar dynamics, stellar evolution and stellar hydrodynamics for studying generalized stellar systems. We have now reached a "Noah's Ark" milestone, with (at least) two available numerical solvers for each domain. MUSE can treat multiscale and multiphysics systems in which the time- and size-scales are well separated, like simulating the evolution of planetary systems, small stellar associations, dense stellar clusters, galaxies and galactic nuclei. In this paper we describe three examples calculated using MUSE: the merger of two galaxies, the merger of two evolving stars, and a hybrid N-body simulation. In addition, we demonstrate an implementation of MUSE on a distributed computer which may also include special-purpose hardware, such as GRAPEs or GPUs, to accelerate computations. The current MUSE code base is publicly available as open source at http://muse.li.
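The interface MUSE specifies between each module and the framework can be illustrated, in spirit only, by a minimal Python contract; the class and method names below are hypothetical and are not the actual MUSE API:

```python
from abc import ABC, abstractmethod

class DomainModule(ABC):
    """Hypothetical minimal module contract: each domain solver exposes the
    same small interface so a framework can initialize and advance them."""
    @abstractmethod
    def initialize(self, particles): ...
    @abstractmethod
    def evolve_until(self, time): ...
    @abstractmethod
    def get_state(self): ...

class SimpleDriftSolver(DomainModule):
    """Trivial stellar-dynamics stand-in: free drift in 1-D, no forces."""
    def initialize(self, particles):
        self.t = 0.0
        self.particles = [dict(p) for p in particles]
    def evolve_until(self, time):
        dt = time - self.t
        for p in self.particles:
            p["x"] += p["vx"] * dt
        self.t = time
    def get_state(self):
        return self.particles

solver = SimpleDriftSolver()
solver.initialize([{"x": 0.0, "vx": 1.5}])
solver.evolve_until(2.0)
state = solver.get_state()
```

The point of such a contract is that a second solver for the same domain (the "Noah's Ark" pairing above) can be swapped in without changing framework code.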

  14. Status and future of MUSE

    NASA Astrophysics Data System (ADS)

    Harfst, S.; Portegies Zwart, S.; McMillan, S.

    2008-12-01

    We present MUSE, a software framework for combining existing computational tools from different astrophysical domains into a single multi-physics, multi-scale application. MUSE facilitates the coupling of existing codes written in different languages by providing inter-language tools and by specifying an interface between each module and the framework that represents a balance between generality and computational efficiency. This approach allows scientists to use combinations of codes to solve highly-coupled problems without the need to write new codes for other domains or significantly alter their existing codes. MUSE currently incorporates the domains of stellar dynamics, stellar evolution and stellar hydrodynamics for studying generalized stellar systems. We have now reached a ``Noah's Ark'' milestone, with (at least) two available numerical solvers for each domain. MUSE can treat multi-scale and multi-physics systems in which the time- and size-scales are well separated, like simulating the evolution of planetary systems, small stellar associations, dense stellar clusters, galaxies and galactic nuclei. In this paper we describe two examples calculated using MUSE: the merger of two galaxies and an N-body simulation with live stellar evolution. In addition, we demonstrate an implementation of MUSE on a distributed computer which may also include special-purpose hardware, such as GRAPEs or GPUs, to accelerate computations. The current MUSE code base is publicly available as open source at http://muse.li.

  15. Interfacing the Generalized Fluid System Simulation Program with the SINDA/G Thermal Program

    NASA Technical Reports Server (NTRS)

    Schallhorn, Paul; Palmiter, Christopher; Farmer, Jeffery; Lycans, Randall; Tiller, Bruce

    2000-01-01

A general-purpose, one-dimensional fluid flow code has been interfaced with the thermal analysis program SINDA/G. The flow code, GFSSP, is capable of analyzing steady state and transient flow in a complex network. The flow code can model several physical phenomena, including compressibility effects, phase changes, body forces (such as gravity and centrifugal force) and mixture thermodynamics for multiple species. The addition of GFSSP to SINDA/G provides a significant improvement in convective heat transfer modeling for SINDA/G. The interface development was conducted in two phases. This paper describes the first phase, which allows for steady and quasi-steady (unsteady solid, steady fluid) conjugate heat transfer modeling. The second phase of the interface development, full transient conjugate heat transfer modeling, will be addressed in a later paper. Phase 1 development has been benchmarked to an analytical solution with excellent agreement. Additional test cases for each development phase demonstrate desired features of the interface. The results of the benchmark case, three additional test cases and a practical application are presented herein.
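A flow-network solver of the GFSSP type enforces mass conservation at each node of the network. As a toy illustration only (a linearized, incompressible network, not GFSSP's actual formulation), the nodal pressures of a three-branch chain can be found from a small linear system:

```python
import numpy as np

# Toy linearized flow network: boundary nodes hold fixed pressures, branches
# carry flow C * (p_upstream - p_downstream), and mass conservation at each
# internal node requires the branch flows into it to sum to zero.
p_in, p_out = 200.0, 100.0      # boundary pressures (arbitrary units)
c01, c12, c23 = 2.0, 1.0, 2.0   # branch conductances (arbitrary units)

# Node 1: c01*(p_in - p1) + c12*(p2 - p1) = 0
# Node 2: c12*(p1 - p2) + c23*(p_out - p2) = 0
A = np.array([[c01 + c12, -c12],
              [-c12, c12 + c23]])
b = np.array([c01 * p_in, c23 * p_out])
p1, p2 = np.linalg.solve(A, b)
```

Real network codes iterate a nonlinear version of this balance (pressure-dependent properties, compressibility, phase change), but the conservation-per-node structure is the same.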

  16. General purpose optimization software for engineering design

    NASA Technical Reports Server (NTRS)

    Vanderplaats, G. N.

    1990-01-01

The author has developed several general-purpose optimization programs over the past twenty years. The earlier programs were developed as research codes and served that purpose reasonably well. However, in taking the formal step from research codes to industrial application programs, several important lessons have been learned. Among these are the importance of clear documentation, immediate user support, and consistent maintenance. Most important has been the issue of providing software that gives a good, or at least acceptable, design at minimum computational cost. Here, the basic issues in developing optimization software for industrial applications are outlined, and issues of convergence rate, reliability, and relative minima are discussed. Considerable feedback has been received from users, and new software is being developed to respond to identified needs. The basic capabilities of this software are outlined. A major motivation for the development of commercial-grade software is ease of use and flexibility, and these issues are discussed with reference to general multidisciplinary applications. It is concluded that design productivity can be significantly enhanced by the more widespread use of optimization as an everyday design tool.

  17. Explicit robust schemes for implementation of general principal value-based constitutive models

    NASA Technical Reports Server (NTRS)

    Arnold, S. M.; Saleeb, A. F.; Tan, H. Q.; Zhang, Y.

    1993-01-01

The issue of developing effective and robust schemes to implement general hyperelastic constitutive models is addressed. To this end, special-purpose functions are used to symbolically derive, evaluate, and automatically generate the associated FORTRAN code for the explicit forms of the corresponding stress function and material tangent stiffness tensors. These explicit forms are valid for the entire deformation range. The analytical form of these explicit expressions is given here for the case in which the strain-energy potential is taken as a nonseparable polynomial function of the principal stretches.
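The symbolic derive-then-generate workflow described above can be mimicked with a computer algebra system: differentiate a strain-energy potential with respect to the principal stretches and emit Fortran for the resulting expression. The polynomial potential below is a hypothetical example, not the paper's actual form:

```python
import sympy as sp

l1, l2, l3 = sp.symbols("lambda1 lambda2 lambda3", positive=True)

# Hypothetical nonseparable polynomial strain-energy potential in the
# principal stretches (zero at the undeformed state l1 = l2 = l3 = 1).
W = (l1 + l2 + l3 - 3)**2 + (l1*l2 + l2*l3 + l3*l1 - 3)

# Principal components of the nominal stress, derived symbolically:
# P_i = dW/dlambda_i.
P = [sp.diff(W, li) for li in (l1, l2, l3)]

# SymPy's Fortran printer mirrors the automatic FORTRAN code generation
# step described in the abstract.
fortran_src = sp.fcode(P[0], standard=95)
```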

  18. General Purpose Graphics Processing Unit Based High-Rate Rice Decompression and Reed-Solomon Decoding

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Loughry, Thomas A.

As the volume of data acquired by space-based sensors increases, mission data compression/decompression and forward error correction code processing performance must likewise scale. This competency development effort explored using the General Purpose Graphics Processing Unit (GPGPU) to accomplish high-rate Rice decompression and high-rate Reed-Solomon (RS) decoding at the satellite mission ground station. Each algorithm was implemented and benchmarked on a single GPGPU. Distributed processing across one to four GPGPUs was also investigated. The results show that the GPGPU has considerable potential for performing satellite communication data signal processing, with three times or better performance improvements and up to ten times reduction in cost over custom hardware, at least in the case of Rice decompression and Reed-Solomon decoding.

  19. Probabilistic boundary element method

    NASA Technical Reports Server (NTRS)

    Cruse, T. A.; Raveendra, S. T.

    1989-01-01

    The purpose of the Probabilistic Structural Analysis Method (PSAM) project is to develop structural analysis capabilities for the design analysis of advanced space propulsion system hardware. The boundary element method (BEM) is used as the basis of the Probabilistic Advanced Analysis Methods (PADAM) which is discussed. The probabilistic BEM code (PBEM) is used to obtain the structural response and sensitivity results to a set of random variables. As such, PBEM performs analogous to other structural analysis codes such as finite elements in the PSAM system. For linear problems, unlike the finite element method (FEM), the BEM governing equations are written at the boundary of the body only, thus, the method eliminates the need to model the volume of the body. However, for general body force problems, a direct condensation of the governing equations to the boundary of the body is not possible and therefore volume modeling is generally required.

  20. 26 CFR 1.6038-2 - Information returns required of United States persons with respect to annual accounting periods...

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ...) beginning after December 31, 1962, of each foreign corporation which that person controls (as defined in... defined in section 1504(d) of the Code which makes a consolidated return for the taxable year. The return... determining control as defined in paragraph (b) of this section. (d) U.S. person—(1) In general. For purposes...

  1. Nondiscrimination on the Basis of Handicap in Programs or Activities Receiving Federal Financial Assistance. Title 34, Code of Federal Regulations, Part 104.

    ERIC Educational Resources Information Center

    Office for Civil Rights (ED), Washington, DC.

    This document presents the final regulations of the Office for Civil Rights, Department of Education, concerning nondiscrimination on the basis of handicap in programs or activities receiving federal financial assistance under Section 504 of the Rehabilitation Act of 1973. The first part covers general provisions such as purpose, application,…

  2. A bill to amend title 38, United States Code, to repeal the prohibition on collective bargaining with respect to matters and questions regarding compensation of employees of the Department of Veterans Affairs other than rates of basic pay, and for other purposes.

    THOMAS, 111th Congress

    Sen. Brown, Sherrod [D-OH

    2010-06-15

    Senate - 09/02/2010 Placed on Senate Legislative Calendar under General Orders. Calendar No. 558. (All Actions) Tracker: This bill has the status Introduced.

  3. A bill to amend title 38, United States Code, to repeal the prohibition on collective bargaining with respect to matters and questions regarding compensation of employees of the Department of Veterans Affairs other than rates of basic pay, and for other purposes.

    THOMAS, 112th Congress

    Sen. Brown, Sherrod [D-OH

    2011-03-14

    Senate - 09/06/2011 Placed on Senate Legislative Calendar under General Orders. Calendar No. 148. (All Actions) Tracker: This bill has the status Introduced.

  4. Project DIPOLE WEST - Multiburst Environment (Non-Simultaneous Detonations)

    DTIC Science & Technology

    1976-09-01

    ...Purpose of the series was to obtain... HULL hydrodynamic air blast code show good correlation. ...supervision. Contributions were also made by Dr. John Dewey, University of Victoria; Mr. A. P. R. Lambert, Canadian General Electric; and Mr. Charles Needham.

  5. Extremely high frequency RF effects on electronics.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Loubriel, Guillermo Manuel; Vigliano, David; Coleman, Phillip Dale

    The objective of this work was to understand the fundamental physics of extremely high frequency RF effects on electronics. To accomplish this objective, we produced models, conducted simulations, and performed measurements to identify the mechanisms of effects as frequency increases into the millimeter-wave regime. Our purpose was to answer the questions, 'What are the tradeoffs between coupling, transmission losses, and device responses as frequency increases?', and, 'How high in frequency do effects on electronic systems continue to occur?' Using full wave electromagnetics codes and a transmission-line/circuit code, we investigated how extremely high-frequency RF propagates on wires and printed circuit board traces. We investigated both field-to-wire coupling and direct illumination of printed circuit boards to determine the significant mechanisms for inducing currents at device terminals. We measured coupling to wires and attenuation along wires for comparison to the simulations, looking at plane-wave coupling as it launches modes onto single and multiconductor structures. We simulated the response of discrete and integrated circuit semiconductor devices to those high-frequency currents and voltages, using SGFramework, the open-source General-purpose Semiconductor Simulator (gss), and Sandia's Charon semiconductor device physics codes. This report documents our findings.

  6. WestREN: a description of an Irish academic general practice research network

    PubMed Central

    2010-01-01

    Background Primary care research networks have been established internationally since the 1960s to enable diverse practitioners to engage in and develop research and education and implement research evidence. The newly established Western Research and Education Network (WestREN) is one such network consisting of a collaboration between the Discipline of General Practice at NUI Galway and 71 West of Ireland general practices. In September 2009 all member practices were issued with a questionnaire with two objectives: to describe the structure and characteristics of the member practices and to compare the results to the national profile of Irish general practice. Methods A postal survey was used followed by one written and one email reminder. Results A response rate of 73% (52/71) was achieved after two reminders. Half of practices were in a rural location, one quarter located in an urban setting and another quarter in a mixed location. Ninety-four per cent of general practitioners practice from purpose-built or adapted premises with under 6% of practices being attached to the general practitioner's residence. Over 96% of general practitioners use appointment systems with 58% using appointment only. All practices surveyed were computerised, with 80% describing their practices as 'fully computerised'. Almost 60% of general practitioners are coding chronic diagnoses with 20% coding individual consultations. Twenty-five per cent of general practitioners were single-handed with the majority of practices having at least two general practitioners, and a mean number of general practitioners of 2.4. Ninety-two per cent of practices employed a practice nurse with 30% employing more than one nurse. Compared to the national profile, WestREN practices appear somewhat larger, and more likely to be purpose-built and in rural areas. National trends apparent between 1982 and 1992, such as increasing computerisation and practice nurse availability, appear to be continuing. 
Conclusions WestREN is a new university-affiliated general practice research network in Ireland. Survey of its initial membership confirms WestREN practices to be broadly representative of the national profile and has provided us with valuable information on the current and changing structure of Irish general practice. PMID:20925958

  7. Optimization of GATE and PHITS Monte Carlo code parameters for uniform scanning proton beam based on simulation with FLUKA general-purpose code

    NASA Astrophysics Data System (ADS)

    Kurosu, Keita; Takashina, Masaaki; Koizumi, Masahiko; Das, Indra J.; Moskvin, Vadim P.

    2014-10-01

    Although the three general-purpose Monte Carlo (MC) simulation tools Geant4, FLUKA and PHITS have been used extensively, differences in their calculation results have been reported. The major causes are the implementation of the physical models, the preset value of the ionization potential and the definition of the maximum step size. In order to achieve artifact-free MC simulation, an optimized parameter list for each simulation system is required. Several authors have already proposed such lists, but those studies were performed with simple systems such as a water phantom alone. Since particle beams undergo transport, interaction and electromagnetic processes during beam delivery, establishing an optimized parameter list for the whole beam delivery system is therefore of major importance. The purpose of this study was to determine the optimized parameter lists for GATE and PHITS using a computational model of a proton treatment nozzle. The simulation was performed with the broad scanning proton beam. The influence of the customizable parameters on the percentage depth dose (PDD) profile and the proton range was investigated by comparison with the FLUKA results, and the optimal parameters were then determined. The PDD profile and the proton range obtained from our optimized parameter list showed different characteristics from the results obtained with the simple system. This leads to the conclusion that the physical model, particle transport mechanics and different geometry-based descriptions need accurate customization in planning computational experiments for artifact-free MC simulation.
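    As a hedged illustration of how two codes' depth-dose output might be compared (the function and the sample depth/dose arrays below are our own, not the paper's data), the distal 80% range can be extracted from a PDD curve by linear interpolation:

```python
# Interpolate the distal falloff of a percentage-depth-dose curve for the
# depth at 80% of maximum dose (R80), a common range metric when
# benchmarking one Monte Carlo code against another.

def r80(depths, doses):
    """Distal depth at 80% of maximum dose, by linear interpolation."""
    peak = max(doses)
    i = doses.index(peak)
    target = 0.8 * peak
    for j in range(i, len(doses) - 1):
        if doses[j] >= target >= doses[j + 1]:
            frac = (doses[j] - target) / (doses[j] - doses[j + 1])
            return depths[j] + frac * (depths[j + 1] - depths[j])
    raise ValueError("80% level not crossed distal to the peak")

depths = [0, 5, 10, 15, 16, 17, 18]        # cm, made-up sample grid
doses  = [30, 35, 45, 100, 80, 20, 2]      # relative dose, made-up
print(round(r80(depths, doses), 2))        # 16.0
```

    The same metric computed from each code's PDD output gives a single-number range comparison per parameter set.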

  8. New Parallel computing framework for radiation transport codes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kostin, M.A.; /Michigan State U., NSCL; Mokhov, N.V.

    A new parallel computing framework has been developed to use with general-purpose radiation transport codes. The framework was implemented as a C++ module that uses MPI for message passing. The module is significantly independent of the radiation transport codes it can be used with, and is connected to the codes by means of a number of interface functions. The framework was integrated with the MARS15 code, and an effort is under way to deploy it in PHITS. Besides the parallel computing functionality, the framework offers a checkpoint facility that allows restarting calculations with a saved checkpoint file. The checkpoint facility can be used in single-process calculations as well as in the parallel regime. Several checkpoint files can be merged into one, thus combining the results of several calculations. The framework also corrects some of the known problems with the scheduling and load balancing found in the original implementations of the parallel computing functionality in MARS15 and PHITS. The framework can be used efficiently on homogeneous systems and networks of workstations, where interference from other users is possible.
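    A minimal sketch of the checkpoint idea described above — saving, restarting from, and merging result files. The function names and the histories/tally fields are invented for illustration, not the framework's real interface:

```python
# Toy checkpoint facility: each run saves its particle-history count and
# accumulated tally; independent runs can later be merged by summation.
import os
import pickle
import tempfile

def save_checkpoint(path, histories, tally):
    with open(path, "wb") as f:
        pickle.dump({"histories": histories, "tally": tally}, f)

def load_checkpoint(path):
    with open(path, "rb") as f:
        return pickle.load(f)

def merge_checkpoints(paths):
    """Combine independent runs: sum histories and tallies."""
    total = {"histories": 0, "tally": 0.0}
    for p in paths:
        c = load_checkpoint(p)
        total["histories"] += c["histories"]
        total["tally"] += c["tally"]
    return total

d = tempfile.mkdtemp()
a, b = os.path.join(d, "run1.ckp"), os.path.join(d, "run2.ckp")
save_checkpoint(a, 1000, 12.5)
save_checkpoint(b, 3000, 37.5)
print(merge_checkpoints([a, b]))   # {'histories': 4000, 'tally': 50.0}
```

    Merging by summation works because Monte Carlo tallies from independent runs are additive; a real framework would also carry random-number-stream state.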

  9. Deploying electromagnetic particle-in-cell (EM-PIC) codes on Xeon Phi accelerators boards

    NASA Astrophysics Data System (ADS)

    Fonseca, Ricardo

    2014-10-01

    The complexity of the phenomena involved in several relevant plasma physics scenarios, where highly nonlinear and kinetic processes dominate, makes purely theoretical descriptions impossible. Further understanding of these scenarios requires detailed numerical modeling, but fully relativistic particle-in-cell codes such as OSIRIS are computationally intensive. The quest towards exaflop computer systems has led to the development of HPC systems based on add-on accelerator cards, such as GPGPUs and, more recently, the Xeon Phi accelerators that power the current number 1 system in the world. These cards, also referred to as the Intel Many Integrated Core (MIC) architecture, offer peak theoretical performance of >1 TFlop/s for general-purpose calculations on a single board, and are receiving significant attention as an attractive alternative to CPUs for plasma modeling. In this work we report on our efforts towards the deployment of an EM-PIC code on a Xeon Phi architecture system. We focus on the parallelization and vectorization strategies followed, and present a detailed evaluation of code performance in comparison with the CPU code.
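    The kind of inner loop such codes must vectorize can be illustrated with a toy 1D cloud-in-cell charge deposition. This is our sketch of the general technique, unrelated to the OSIRIS source:

```python
# Linear (cloud-in-cell) charge deposition: each particle's charge is
# scattered onto its two nearest grid points with linear weights. The
# scatter pattern is what makes this loop hard to vectorize naively.

def deposit_charge(positions, q, nx, dx):
    """Scatter charge q per particle onto a periodic 1D grid of nx cells."""
    rho = [0.0] * nx
    for x in positions:
        cell = int(x / dx)
        w = x / dx - cell            # fractional distance into the cell
        rho[cell % nx] += q * (1.0 - w)
        rho[(cell + 1) % nx] += q * w
    return rho

rho = deposit_charge([0.25, 1.5], q=1.0, nx=4, dx=1.0)
print(rho)   # [0.75, 0.75, 0.5, 0.0] -- total charge 2.0 is conserved
```

    On MIC-style hardware the write conflicts in `rho` are typically resolved with per-lane accumulation buffers that are reduced afterwards.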

  10. Gear optimization

    NASA Technical Reports Server (NTRS)

    Vanderplaats, G. N.; Chen, Xiang; Zhang, Ning-Tian

    1988-01-01

    The use of formal numerical optimization methods for the design of gears is investigated. To achieve this, computer codes were developed for the analysis of spur gears and spiral bevel gears. These codes calculate the life, dynamic load, bending strength, surface durability, gear weight and size, and various geometric parameters. It is necessary to calculate all such important responses because they all represent competing requirements in the design process. The codes developed here were written in subroutine form and coupled to the COPES/ADS general-purpose optimization program. This code allows the user to define the optimization problem at the time of program execution. Typical design variables include face width, number of teeth and diametral pitch. The user is free to choose any calculated response as the design objective to minimize or maximize and may impose lower and upper bounds on any calculated responses. Typical examples include life maximization with limits on dynamic load, stress, weight, etc., or minimization of weight subject to limits on life, dynamic load, etc. The research codes were written in modular form for easy expansion and so that they could be combined to create a multiple-reduction optimization capability in the future.
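    A miniature, hypothetical version of the stated design problem — minimize a weight proxy over face width and diametral pitch subject to a bending-stress bound — can be solved by brute force. The Lewis-formula constants and grids below are textbook-style assumptions, not values from the COPES/ADS study:

```python
# Tiny constrained gear design search: objective = weight proxy,
# constraint = Lewis bending stress <= allowable. All numbers are
# illustrative placeholders.

Wt, Y, sigma_allow = 500.0, 0.3, 21000.0   # load (lb), Lewis factor, psi
N = 24                                      # number of teeth (fixed here)

def bending_stress(F, P):
    """Lewis formula: sigma = Wt * P / (F * Y)."""
    return Wt * P / (F * Y)

def weight(F, P):
    """Weight proxy ~ face width * pitch-diameter^2 (d = N / P)."""
    d = N / P
    return F * d * d

best = None
for F in [x / 10 for x in range(5, 21)]:    # face width 0.5..2.0 in
    for P in range(4, 13):                  # diametral pitch 4..12
        if bending_stress(F, P) <= sigma_allow:
            cand = (weight(F, P), F, P)
            if best is None or cand < best:
                best = cand

w, F, P = best
print(F, P)   # 1.0 12
```

    A real run in COPES/ADS would use gradient-based optimizers instead of enumeration, but the problem statement (objective, variables, bounds) is the same shape.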

  11. The NASA Neutron Star Grand Challenge: The coalescences of Neutron Star Binary System

    NASA Astrophysics Data System (ADS)

    Suen, Wai-Mo

    1998-04-01

    NASA funded a Grand Challenge Project (9/1996-1999) for the development of a multi-purpose numerical treatment for relativistic astrophysics and gravitational wave astronomy. The coalescence of binary neutron stars was chosen as the model problem for the code development. The institutions involved are Argonne National Laboratory, Livermore National Laboratory, the Max Planck Institute at Potsdam, Stony Brook, the University of Illinois, and Washington University. We have recently succeeded in constructing a highly optimized parallel code capable of solving the full Einstein equations coupled with relativistic hydrodynamics, running at over 50 GFLOPS on a T3E (the second milestone point of the project). We are presently working on the head-on collisions of two neutron stars, and the inclusion of realistic equations of state into the code. The code will be released to the relativity and astrophysics community in April of 1998. With the full dynamics of the spacetime, relativistic hydrodynamics and microphysics all combined into a unified 3D code for the first time, many interesting large-scale calculations in general relativistic astrophysics can now be carried out on massively parallel computers.

  12. Parallel-vector computation for linear structural analysis and non-linear unconstrained optimization problems

    NASA Technical Reports Server (NTRS)

    Nguyen, D. T.; Al-Nasra, M.; Zhang, Y.; Baddourah, M. A.; Agarwal, T. K.; Storaasli, O. O.; Carmona, E. A.

    1991-01-01

    Several parallel-vector computational improvements to the unconstrained optimization procedure are described which speed up the structural analysis-synthesis process. A fast parallel-vector Choleski-based equation solver, pvsolve, is incorporated into the well-known SAP-4 general-purpose finite-element code. The new code, denoted PV-SAP, is tested for static structural analysis. Initial results on a four processor CRAY 2 show that using pvsolve reduces the equation solution time by a factor of 14-16 over the original SAP-4 code. In addition, parallel-vector procedures for the Golden Block Search technique and the BFGS method are developed and tested for nonlinear unconstrained optimization. A parallel version of an iterative solver and the pvsolve direct solver are incorporated into the BFGS method. Preliminary results on nonlinear unconstrained optimization test problems, using pvsolve in the analysis, show excellent parallel-vector performance indicating that these parallel-vector algorithms can be used in a new generation of finite-element based structural design/analysis-synthesis codes.

  13. Langley Stability and Transition Analysis Code (LASTRAC) Version 1.2 User Manual

    NASA Technical Reports Server (NTRS)

    Chang, Chau-Lyan

    2004-01-01

    LASTRAC is a general-purpose, physics-based transition prediction code released by NASA for Laminar Flow Control studies and transition research. The design and development of the LASTRAC code is aimed at providing an engineering tool that is easy to use and yet capable of dealing with a broad range of transition-related issues. It was written from scratch based on state-of-the-art numerical methods for stability analysis and modern software technologies. At low fidelity, it allows users to perform linear stability analysis and N-factor transition correlation for a broad range of flow regimes and configurations by using either the linear stability theory or the linear parabolized stability equations (PSE) method. At high fidelity, users may use nonlinear PSE to track finite-amplitude disturbances until the skin friction rises. This document describes the governing equations, numerical methods, code development, a detailed description of input/output parameters, and case studies for the current release of LASTRAC.

  14. Performance of Low-Density Parity-Check Coded Modulation

    NASA Astrophysics Data System (ADS)

    Hamkins, J.

    2011-02-01

    This article presents the simulated performance of a family of nine AR4JA low-density parity-check (LDPC) codes when used with each of five modulations. In each case, the decoder inputs are code-bit log-likelihood ratios computed from the received (noisy) modulation symbols using a general formula which applies to arbitrary modulations. Suboptimal soft-decision and hard-decision demodulators are also explored. Bit-interleaving and various mappings of bits to modulation symbols are considered. A number of subtle decoder algorithm details are shown to affect performance, especially in the error floor region. Among these are quantization dynamic range and step size, clipping degree-one variable nodes, "Jones clipping" of variable nodes, approximations of the min* function, and partial hard-limiting of messages from check nodes. Using these decoder optimizations, all coded modulations simulated here are free of error floors down to codeword error rates below 10^{-6}. The purpose of generating this performance data is to aid system engineers in determining an appropriate code and modulation to use under specific power and bandwidth constraints, and to provide information needed to design a variable/adaptive coded modulation (VCM/ACM) system using the AR4JA codes. IPNPR Volume 42-185.
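    The "general formula" for code-bit LLRs referred to above can be sketched as follows: for each bit position, sum the channel likelihoods over constellation points whose label has that bit equal to 0 versus 1. The Gray-labeled QPSK constellation, received sample, and noise level are example assumptions, not the article's configuration:

```python
# Bit LLRs for an arbitrary 2-D constellation over an AWGN channel:
# LLR_i = ln( sum_{s: b_i=0} e^{-|r-s|^2/N0} / sum_{s: b_i=1} e^{-|r-s|^2/N0} )
import cmath
import math

def bit_llrs(r, constellation, labels, nbits, n0):
    llrs = []
    for i in range(nbits):
        num = den = 0.0
        for s, lab in zip(constellation, labels):
            like = math.exp(-abs(r - s) ** 2 / n0)
            if ((lab >> i) & 1) == 0:
                num += like            # points whose bit i is 0
            else:
                den += like            # points whose bit i is 1
        llrs.append(math.log(num / den))
    return llrs

# Gray-labeled QPSK: labels 0,1,3,2 at 45, 135, 225, 315 degrees
qpsk = [cmath.exp(1j * math.pi * (2 * k + 1) / 4) for k in range(4)]
labels = [0, 1, 3, 2]
llrs = bit_llrs(0.9 + 0.8j, qpsk, labels, nbits=2, n0=1.0)
print([round(x, 2) for x in llrs])
```

    A received sample near the label-0 point yields positive LLRs for both bits (positive meaning "0 more likely" in this sign convention), which is what a belief-propagation decoder consumes.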

  15. Design of RISC Processor Using VHDL and Cadence

    NASA Astrophysics Data System (ADS)

    Moslehpour, Saeid; Puliroju, Chandrasekhar; Abu-Aisheh, Akram

    The project deals with the development of a basic RISC processor. The processor is designed with a basic architecture consisting of internal modules such as a clock generator, memory, program counter, instruction register, accumulator, arithmetic and logic unit, and decoder. This processor is mainly intended for simple general-purpose tasks such as arithmetic operations, and it can be developed further into a general-purpose processor by increasing the size of the instruction register. The processor is designed in VHDL using Xilinx 8.1i. The present project also serves as an application of the knowledge gained from past studies of the PSPICE program. The study will show how PSPICE can be used to simplify massive complex circuits designed in VHDL synthesis. The purpose of the project is to explore the designed RISC model piece by piece, examine and understand the input/output pins, and show how the VHDL synthesis code can be converted to a simplified PSPICE model. The project will also serve as a collection of various research materials about the pieces of the circuit.
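    The fetch-decode-execute behaviour of such an accumulator-based processor can be modeled in software. This toy interpreter is our illustration of the architecture described (program counter, instruction register, ALU, accumulator), not the paper's VHDL:

```python
# Minimal accumulator machine: each cycle fetches an instruction into the
# "instruction register", advances the program counter, and executes via
# a decoder dispatch on the opcode.

def run(program, memory):
    acc, pc = 0, 0
    while True:
        opcode, operand = program[pc]   # fetch into instruction register
        pc += 1                         # program counter advances
        if opcode == "LOAD":
            acc = memory[operand]
        elif opcode == "ADD":
            acc += memory[operand]      # ALU operation on the accumulator
        elif opcode == "STORE":
            memory[operand] = acc
        elif opcode == "HALT":
            return memory

mem = {0: 7, 1: 5, 2: 0}
run([("LOAD", 0), ("ADD", 1), ("STORE", 2), ("HALT", None)], mem)
print(mem[2])   # 12
```

    Widening the instruction register in hardware corresponds here to enlarging the opcode/operand fields, which is what allows the richer instruction set the abstract mentions.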

  16. Views of Health Information Management Staff on the Medical Coding Software in Mashhad, Iran.

    PubMed

    Kimiafar, Khalil; Hemmati, Fatemeh; Banaye Yazdipour, Alireza; Sarbaz, Masoumeh

    2018-01-01

    Systematic evaluation of Health Information Technology (HIT) and users' views leads to the modification and development of these technologies in accordance with user needs. The purpose of this study was to investigate the views of Health Information Management (HIM) staff on the quality of medical coding software. A descriptive cross-sectional study was conducted between May and July 2016 in 26 hospitals (academic and non-academic) in Mashhad, north-eastern Iran. The study population consisted of the chairs of HIM departments and medical coders (58 staff). Data were collected through a valid and reliable questionnaire and analyzed using SPSS version 16.0. In the staff's view, among the advantages of coding software, reducing coding time had the highest average score (Mean=3.82) and cost reduction the lowest (Mean=3.20). Meanwhile, concern about losing job opportunities was the least important disadvantage (15.5%) of the use of coding software. In general, the results of this study showed that coding software in some cases has deficiencies. Designers and developers of health information coding software should pay more attention to technical aspects, in-work reminders, help in selecting proper codes through access to coding rules, maintenance services, links to other relevant databases, and the possibility of providing brief and detailed reports in different formats.

  17. Interface requirements to couple thermal-hydraulic codes to severe accident codes: ATHLET-CD

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Trambauer, K.

    1997-07-01

    The system code ATHLET-CD is being developed by GRS in cooperation with IKE and IPSN. Its field of application comprises the whole spectrum of leaks and large breaks, as well as operational and abnormal transients for LWRs and VVERs. At present the analyses cover the in-vessel thermal-hydraulics, the early phases of core degradation, as well as fission product and aerosol release from the core and their transport in the reactor coolant system. The aim of the code development is to extend the simulation of core degradation up to failure of the reactor pressure vessel and to cover all physically reasonable accident sequences for western and eastern LWRs including RBMKs. The ATHLET-CD structure is highly modular in order to include a manifold spectrum of models and to offer an optimum basis for further development. The code consists of four general modules to describe the reactor coolant system thermal-hydraulics, the core degradation, the fission product core release, and fission product and aerosol transport. Each general module consists of basic modules which correspond to the process to be simulated or to its specific purpose. Besides the code structure based on the physical modelling, the code follows four strictly separated steps during the course of a calculation: (1) input of structure, geometrical data, initial and boundary conditions, (2) initialization of derived quantities, (3) steady-state calculation or input of restart data, and (4) transient calculation. In this paper, the transient solution method is briefly presented and the coupling methods are discussed. Three aspects have to be considered for the coupling of different modules in one code system. The first is the conservation of mass and energy in the different subsystems (fluid, structures, and fission products and aerosols). The second is the convergence of the numerical solution and the stability of the calculation. The third is related to code performance and running time.

  18. Meshing of a Spiral Bevel Gearset with 3D Finite Element Analysis

    NASA Technical Reports Server (NTRS)

    Bibel, George D.; Handschuh, Robert

    1996-01-01

    Recent advances in spiral bevel gear geometry and finite element technology make it practical to conduct a structural analysis and analytically roll the gearset through mesh. With the advent of user-specific programming linked to 3D solid modelers and mesh generators, model generation has become greatly automated. Contact algorithms available in general-purpose finite element codes eliminate the need for the use and alignment of gap elements. Once the gearset is placed in mesh, user subroutines attached to the FE code easily roll the gearset through mesh. The method is described in detail. Preliminary results for a gearset segment showing the progression of the contact line load are given as the gears roll through mesh.

  19. Design optimization studies using COSMIC NASTRAN

    NASA Technical Reports Server (NTRS)

    Pitrof, Stephen M.; Bharatram, G.; Venkayya, Vipperla B.

    1993-01-01

    The purpose of this study is to create, test and document a procedure to integrate mathematical optimization algorithms with COSMIC NASTRAN. This procedure is very important to structural design engineers who wish to capitalize on optimization methods to ensure that their design is optimized for its intended application. The OPTNAST computer program was created to link NASTRAN and design optimization codes into one package. This implementation was tested using two truss structure models and optimizing their designs for minimum weight, subject to multiple loading conditions and displacement and stress constraints. However, the process is generalized so that an engineer could design other types of elements by adding to or modifying some parts of the code.

  20. Red River Waterway Thermal Studies. Report 2. Thermal Stress Analyses

    DTIC Science & Technology

    1991-12-01

    ...stress relaxation, shrinkage of the concrete, and thermal properties of the concrete including coefficient of thermal expansion and specific heat... Finite-Element Code. The thermal stress analyses in this investigation were performed using ABAQUS, a general-purpose heat-transfer and structural... model (the UMAT subroutine discussed below) may be incorporated as an external subroutine linked to the ABAQUS library. In order to model the

  1. NASA Requirements for Ground-Based Pressure Vessels and Pressurized Systems (PVS). Revision C

    NASA Technical Reports Server (NTRS)

    Greulich, Owen Rudolf

    2017-01-01

    The purpose of this document is to ensure the structural integrity of PVS through implementation of a minimum set of requirements for ground-based PVS in accordance with this document, NASA Policy Directive (NPD) 8710.5, NASA Safety Policy for Pressure Vessels and Pressurized Systems, NASA Procedural Requirements (NPR) 8715.3, NASA General Safety Program Requirements, applicable Federal Regulations, and national consensus codes and standards (NCS).

  2. A bill to amend title 38, United States Code, to protect certain veterans who would otherwise be subject to a reduction in educational assistance benefits, and for other purposes.

    THOMAS, 112th Congress

    Sen. Schumer, Charles E. [D-NY

    2011-04-06

    Senate - 07/19/2011 Placed on Senate Legislative Calendar under General Orders. Calendar No. 105. (All Actions) Notes: For further action, see H.R.1383, which became Public Law 112-26 on 8/3/2011. Tracker: This bill has the status Introduced.

  3. Effects of Video Weather Training Products, Web-Based Preflight Weather Briefing, and Local Versus Non-Local Pilots on General Aviation Pilot Weather Knowledge and Flight Behavior. Phase 1

    DTIC Science & Technology

    2010-01-01

    Seemingly not. Repeated measures analysis of variance (ANOVA) for the posttest-pretest score gain x training product interaction yielded a non-significant... Work was accomplished under approved task AM-A-07-HRR-521... This research has two main... Purpose of This Research

  4. An original bill to amend title 38, United States Code, to improve Servicemembers' Group Life Insurance and Veterans' Group Life Insurance and to modify the provision of compensation and pension to surviving spouses of veterans in months of the deaths of the veterans, and for other purposes.

    THOMAS, 111th Congress

    Sen. Akaka, Daniel K. [D-HI

    2010-09-02

    Senate - 09/02/2010 Placed on Senate Legislative Calendar under General Orders. Calendar No. 553. (All Actions) Tracker: This bill has the status Introduced.

  5. Development of an integrated BEM approach for hot fluid structure interaction

    NASA Technical Reports Server (NTRS)

    Dargush, Gary F.; Banerjee, Prasanta K.; Dunn, Michael G.

    1988-01-01

    Significant progress was made toward the goal of developing a general-purpose boundary element method for hot fluid-structure interaction. For the solid phase, a boundary-only formulation was developed and implemented for uncoupled transient thermoelasticity in two dimensions. The elimination of volume discretization not only drastically reduces required modeling effort, but also permits unconstrained variation of the through-the-thickness temperature distribution. Meanwhile, for the fluids, fundamental solutions were derived for transient incompressible and compressible flow in the absence of the convective terms. Boundary element formulations were developed and described. For the incompressible case, the necessary kernel functions, under transient and steady-state conditions, were derived and fully implemented into a general-purpose, multi-region boundary element code. Several examples were examined to study the suitability and convergence characteristics of the various algorithms.

  6. Water Hammer Simulations of MMH Propellant - New Capability Demonstration of the Generalized Fluid Flow Simulation Program

    NASA Technical Reports Server (NTRS)

    Burkhardt, Z.; Ramachandran, N.; Majumdar, A.

    2017-01-01

    Fluid transient analysis is important for the design of spacecraft propulsion systems to ensure structural stability of the system in the event of sudden closing or opening of a valve. The Generalized Fluid System Simulation Program (GFSSP), a general-purpose flow network code developed at NASA/MSFC, is capable of simulating the pressure surge due to sudden opening or closing of a valve when thermodynamic properties of the real fluid are available for the entire range of simulation. Specifically, GFSSP needs an accurate representation of the pressure-density relationship in order to predict the pressure surge during a fluid transient. Unfortunately, the available thermodynamic property programs such as REFPROP, GASP or GASPAK do not provide the thermodynamic properties of Monomethylhydrazine (MMH). This paper illustrates the process used for building a customized table of properties of state variables from available properties and the speed of sound, as required by GFSSP for simulation. Good agreement was found between the simulations and measured data. This method can be adopted for modeling flow networks and systems with other fluids whose properties are not known in detail in order to obtain general technical insight. Rigorous code validation of this approach will be done and reported at a future date.
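    The table-building idea can be sketched under a simplifying assumption of constant speed of sound, for which drho/dp = 1/a^2. The MMH-like reference values below are rough placeholders of our own, not GFSSP's actual property table:

```python
# Build a linearized rho(p) lookup table from a reference density and an
# assumed constant speed of sound, then interpolate between entries --
# the kind of pressure-density data a water-hammer solver needs.

RHO_REF = 874.0      # kg/m^3 at reference pressure (assumed)
P_REF   = 101325.0   # Pa
A       = 1550.0     # m/s, assumed constant speed of sound

def density(p):
    """Linearized rho(p) consistent with the given speed of sound."""
    return RHO_REF + (p - P_REF) / A**2

# Small lookup table over 1-10 bar, then linear interpolation within it
table = [(p, density(p)) for p in range(100000, 1100000, 100000)]

def lookup(p):
    for (p0, r0), (p1, r1) in zip(table, table[1:]):
        if p0 <= p <= p1:
            return r0 + (r1 - r0) * (p - p0) / (p1 - p0)
    raise ValueError("pressure outside table")

print(round(lookup(550000.0), 3))
```

    Because pressure surge amplitude scales with rho * a (the Joukowsky relation), getting the slope dp/drho right is exactly what makes the surge prediction credible.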

  7. Development of Parallel Computing Framework to Enhance Radiation Transport Code Capabilities for Rare Isotope Beam Facility Design

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kostin, Mikhail; Mokhov, Nikolai; Niita, Koji

    A parallel computing framework has been developed to use with general-purpose radiation transport codes. The framework was implemented as a C++ module that uses MPI for message passing. It is intended to be used with older radiation transport codes implemented in Fortran 77, Fortran 90 or C. The module is significantly independent of the radiation transport codes it can be used with, and is connected to the codes by means of a number of interface functions. The framework was developed and tested in conjunction with the MARS15 code. It is possible to use it with other codes such as PHITS, FLUKA and MCNP after certain adjustments. Besides the parallel computing functionality, the framework offers a checkpoint facility that allows restarting calculations with a saved checkpoint file. The checkpoint facility can be used in single-process calculations as well as in the parallel regime. The framework corrects some of the known problems with the scheduling and load balancing found in the original implementations of the parallel computing functionality in MARS15 and PHITS. The framework can be used efficiently on homogeneous systems and networks of workstations, where interference from other users is possible.

  8. Finite element for rotor/stator interactive forces in general engine dynamic simulation. Part 1: Development of bearing damper element

    NASA Technical Reports Server (NTRS)

    Adams, M. L.; Padovan, J.; Fertis, D. G.

    1980-01-01

    A general purpose squeeze-film damper interactive force element was developed, coded into a software package (module), and debugged. This software package was applied to nonlinear dynamic analyses of some simple rotor systems. Results for pressure distributions show that, as expected, the long (end-sealed) bearing is a stronger bearing than the short bearing. Results of the nonlinear dynamic analysis, using a four-degree-of-freedom simulation model, showed that the orbit of the rotating shaft grows nonlinearly to fill the bearing clearance as the unbalanced weight increases.

  9. On the Representation of the Porosity-Pressure Relationship in General Subsurface Flow Codes

    DOE PAGES

    Birdsell, Daniel Traver; Karra, Satish; Rajaram, Harihar

    2018-01-11

    The governing equations for subsurface flow codes in a deformable porous medium are derived from the balance of fluid mass and Darcy's equation. One class of these codes, which we call general subsurface flow codes (GSFs), allows for more general constitutive relations for material properties such as porosity, permeability, and density. Examples of GSFs include PFLOTRAN, FEHM, TOUGH2, STOMP, and some reservoir simulators such as BOAST. Depending on the constitutive relations used in GSFs, an inconsistency arises between the standard groundwater flow equation and the governing equation of GSFs, and we clarify that the reason for this inconsistency is that the Darcy's equation used in the GSFs should account for the velocity of the fluid with respect to the solid. Due to lack of awareness of this inconsistency, users of the GSFs tend to use a porosity-pressure relationship that comes from the standard groundwater flow equation and assumes that the relative velocity is already accounted for. For the Theis problem, we show that using this traditional relationship in the GSFs leads to significant errors. We propose an alternate porosity-pressure relationship that is consistent with the derivation of the governing equations in the GSFs, where the solid velocity is not tracked, and show that, with this relationship, the results are more accurate for the Theis problem. In conclusion, the purpose of this note is to make the users and developers of these GSFs aware of this inconsistency and to advocate that the alternate porosity model derived here be incorporated in GSFs.
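
    The inconsistency discussed above can be seen in the textbook forms of the governing equations (standard notation; these are generic forms, not reproduced from the paper):

```latex
% Fluid mass balance in a deformable porous medium
% (porosity \varphi, fluid density \rho, source term Q):
\frac{\partial(\varphi\rho)}{\partial t}
  + \nabla\cdot(\varphi\rho\,\mathbf{v}_f) = Q
% Darcy's law gives the flux of fluid *relative to the solid*:
\mathbf{q} = \varphi\,(\mathbf{v}_f - \mathbf{v}_s)
           = -\frac{k}{\mu}\,(\nabla p - \rho\,\mathbf{g})
% The standard groundwater flow equation folds matrix deformation into a
% specific storage coefficient, S_s = \rho g\,(\alpha + \varphi\beta).
```

    A porosity-pressure relation derived from the specific storage already contains the solid-velocity (matrix deformation) contribution; reusing it in a code whose Darcy flux is relative to the solid counts that contribution twice, which is the double-counting the note warns against.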

  10. On the Representation of the Porosity-Pressure Relationship in General Subsurface Flow Codes

    NASA Astrophysics Data System (ADS)

    Birdsell, Daniel T.; Karra, Satish; Rajaram, Harihar

    2018-02-01

    The governing equations for subsurface flow codes in a deformable porous medium are derived from the balance of fluid mass and Darcy's equation. One class of these codes, which we call general subsurface flow codes (GSFs), allows for more general constitutive relations for material properties such as porosity, permeability, and density. Examples of GSFs include PFLOTRAN, FEHM, TOUGH2, STOMP, and some reservoir simulators such as BOAST. Depending on the constitutive relations used in GSFs, an inconsistency arises between the standard groundwater flow equation and the governing equation of GSFs, and we clarify that the reason for this inconsistency is that the Darcy's equation used in the GSFs should account for the velocity of the fluid with respect to the solid. Due to lack of awareness of this inconsistency, users of the GSFs tend to use a porosity-pressure relationship that comes from the standard groundwater flow equation and assumes that the relative velocity is already accounted for. For the Theis problem, we show that using this traditional relationship in the GSFs leads to significant errors. We propose an alternate porosity-pressure relationship that is consistent with the derivation of the governing equations in the GSFs, where the solid velocity is not tracked, and show that, with this relationship, the results are more accurate for the Theis problem. The purpose of this note is to make the users and developers of these GSFs aware of this inconsistency and to advocate that the alternate porosity model derived here be incorporated in GSFs.

  11. On the Representation of the Porosity-Pressure Relationship in General Subsurface Flow Codes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Birdsell, Daniel Traver; Karra, Satish; Rajaram, Harihar

    The governing equations for subsurface flow codes in a deformable porous medium are derived from the balance of fluid mass and Darcy's equation. One class of these codes, which we call general subsurface flow codes (GSFs), allows for more general constitutive relations for material properties such as porosity, permeability, and density. Examples of GSFs include PFLOTRAN, FEHM, TOUGH2, STOMP, and some reservoir simulators such as BOAST. Depending on the constitutive relations used in GSFs, an inconsistency arises between the standard groundwater flow equation and the governing equation of GSFs, and we clarify that the reason for this inconsistency is that the Darcy's equation used in the GSFs should account for the velocity of the fluid with respect to the solid. Due to lack of awareness of this inconsistency, users of the GSFs tend to use a porosity-pressure relationship that comes from the standard groundwater flow equation and assumes that the relative velocity is already accounted for. For the Theis problem, we show that using this traditional relationship in the GSFs leads to significant errors. We propose an alternate porosity-pressure relationship that is consistent with the derivation of the governing equations in the GSFs, where the solid velocity is not tracked, and show that, with this relationship, the results are more accurate for the Theis problem. In conclusion, the purpose of this note is to make the users and developers of these GSFs aware of this inconsistency and to advocate that the alternate porosity model derived here be incorporated in GSFs.

  12. Acceleration of Monte Carlo simulation of photon migration in complex heterogeneous media using Intel many-integrated core architecture.

    PubMed

    Gorshkov, Anton V; Kirillin, Mikhail Yu

    2015-08-01

    Over the past two decades, the Monte Carlo technique has become a gold standard in the simulation of light propagation in turbid media, including biotissues. Technological solutions provide further advances of this technique. The Intel Xeon Phi coprocessor is a new type of accelerator for highly parallel general purpose computing, which allows execution of a wide range of applications without substantial code modification. We present a technical approach for porting our previously developed Monte Carlo (MC) code for simulation of light transport in tissues to the Intel Xeon Phi coprocessor. We show that employing the accelerator allows reducing the computational time of MC simulation and obtaining a simulation speed-up comparable to that of a GPU. We demonstrate the performance of the developed code for simulation of light transport in the human head and determination of the measurement volume in near-infrared spectroscopy brain sensing.
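
    A minimal sketch of the kind of photon-transport kernel being accelerated (pure Python, absorption only, no scattering or heterogeneity, so far simpler than the code described): each photon's free path is drawn from an exponential distribution, and the surviving fraction through a slab approaches the Beer-Lambert value.

```python
import math
import random

def transmitted_fraction(mu, thickness, n_photons=20000, seed=1):
    """Pencil beam through a purely absorbing slab.  Free paths are sampled
    from Exp(mu); the fraction of photons whose first interaction lies beyond
    the slab should approach exp(-mu * thickness)."""
    rng = random.Random(seed)
    survived = 0
    for _ in range(n_photons):
        path = -math.log(1.0 - rng.random()) / mu  # inverse-CDF sampling
        if path >= thickness:
            survived += 1
    return survived / n_photons
```

    Loops like this one, independent per photon, are exactly the workload that maps well onto many-core accelerators such as the Xeon Phi or a GPU.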

  13. Statistical evaluation of PACSTAT random number generation capabilities

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Piepel, G.F.; Toland, M.R.; Harty, H.

    1988-05-01

    This report summarizes the work performed in verifying the general purpose Monte Carlo driver-program PACSTAT. The main objective of the work was to verify the performance of PACSTAT's random number generation capabilities. Secondary objectives were to document (using controlled configuration management procedures) changes made in PACSTAT at Pacific Northwest Laboratory, and to assure that PACSTAT input and output files satisfy quality assurance traceability constraints. Upon receipt of the PRIME version of the PACSTAT code from the Basalt Waste Isolation Project, Pacific Northwest Laboratory staff converted the code to run on Digital Equipment Corporation (DEC) VAXes. The modifications to PACSTAT were implemented using the WITNESS configuration management system, with the modifications themselves intended to make the code as portable as possible. Certain modifications were made to make the PACSTAT input and output files conform to quality assurance traceability constraints. 10 refs., 17 figs., 6 tabs.
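
    A sketch of the sort of statistical check involved in verifying a random number generator (a Pearson chi-square test for uniformity; illustrative only, not PACSTAT's actual test suite):

```python
import random

def chi_square_uniform(samples, bins=10):
    """Pearson chi-square statistic for uniformity of samples in [0, 1).
    Under the null hypothesis it follows a chi-square distribution with
    bins - 1 degrees of freedom; large values indicate a biased generator."""
    counts = [0] * bins
    for x in samples:
        counts[min(int(x * bins), bins - 1)] += 1
    expected = len(samples) / bins
    return sum((c - expected) ** 2 / expected for c in counts)
```

    A healthy uniform generator yields a statistic near the degrees of freedom (here 9), while a generator confined to half the interval produces a statistic orders of magnitude larger.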

  14. Requirements to Design to Code: Towards a Fully Formal Approach to Automatic Code Generation

    NASA Technical Reports Server (NTRS)

    Hinchey, Michael G.; Rash, James L.; Rouff, Christopher A.

    2004-01-01

    A general-purpose method to mechanically transform system requirements into a provably equivalent model has yet to appear. Such a method represents a necessary step toward high-dependability system engineering for numerous possible application domains, including sensor networks and autonomous systems. Currently available tools and methods that start with a formal model of a system and mechanically produce a provably equivalent implementation are valuable but not sufficient. The gap that current tools and methods leave unfilled is that their formal models cannot be proven to be equivalent to the system requirements as originated by the customer. For the classes of systems whose behavior can be described as a finite (but significant) set of scenarios, we offer a method for mechanically transforming requirements (expressed in restricted natural language, or in other appropriate graphical notations) into a provably equivalent formal model that can be used as the basis for code generation and other transformations.

  15. Enhancing the ABAQUS Thermomechanics Code to Simulate Steady and Transient Fuel Rod Behavior

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    R. L. Williamson; D. A. Knoll

    2009-09-01

    A powerful multidimensional fuels performance capability, applicable to both steady and transient fuel behavior, is developed based on enhancements to the commercially available ABAQUS general-purpose thermomechanics code. Enhanced capabilities are described, including: UO2 temperature and burnup dependent thermal properties, solid and gaseous fission product swelling, fuel densification, fission gas release, cladding thermal and irradiation creep, cladding irradiation growth, gap heat transfer, and gap/plenum gas behavior during irradiation. The various modeling capabilities are demonstrated using a 2D axisymmetric analysis of the upper section of a simplified multi-pellet fuel rod, during both steady and transient operation. Computational results demonstrate the importance of a multidimensional, fully coupled thermomechanics treatment. Interestingly, many of the inherent deficiencies in existing fuel performance codes (e.g., 1D thermomechanics, loose thermo-mechanical coupling, separate steady and transient analysis, cumbersome pre- and post-processing) are, in fact, ABAQUS strengths.

  16. The EGS4 Code System: Solution of Gamma-ray and Electron Transport Problems

    DOE R&D Accomplishments Database

    Nelson, W. R.; Namito, Yoshihito

    1990-03-01

    In this paper we present an overview of the EGS4 Code System -- a general purpose package for the Monte Carlo simulation of the transport of electrons and photons. During the last 10-15 years EGS has been widely used to design accelerators and detectors for high-energy physics. More recently the code has been found to be of tremendous use in medical radiation physics and dosimetry. The problem-solving capabilities of EGS4 will be demonstrated by means of a variety of practical examples. To facilitate this review, we will take advantage of a new add-on package, called SHOWGRAF, to display particle trajectories in complicated geometries. These are shown as 2-D laser pictures in the written paper and as photographic slides of a 3-D high-resolution color monitor during the oral presentation. 11 refs., 15 figs.

  17. Development of 3D electromagnetic modeling tools for airborne vehicles

    NASA Technical Reports Server (NTRS)

    Volakis, John L.

    1992-01-01

    The main goal of this project is to develop methodologies for scattering by airborne composite vehicles. Although our primary focus continues to be the development of a general purpose code for analyzing the entire structure as a single unit, a number of other tasks are also pursued in parallel with this effort. These tasks are important in testing the overall approach, in developing suitable models for material coatings and junctions and, more generally, in assessing the effectiveness of the various parts comprising the final code. Here, we briefly discuss our progress on the five different tasks which were pursued during this period. Our progress on each of these tasks is described in the detailed reports (listed at the end of this report) and the memoranda included. The first task described below is, of course, the core of this project and deals with the development of the overall code. It is the outcome of the research funded by NASA-Ames and the Navy over the past three years. During this year we developed the first finite element code for scattering by structures of arbitrary shape and composition. The code employs a new absorbing boundary condition which allows termination of the finite element mesh only 0.3 lambda from the outer surface of the target. This leads to a remarkable reduction of the mesh size and is a unique feature of the code. Other unique features of this code include capabilities to model resistive sheets, impedance sheets, and anisotropic materials. This last capability is the latest feature of the code and is still under development. The code has been extensively validated for a number of composite geometries and some examples are given. The validation of the code is still in progress for anisotropic and larger non-metallic geometries and cavities.
The developed finite element code is based on a Galerkin's formulation and employs edge-based tetrahedral elements for discretizing the dielectric sections and the region between the target and the outer mesh termination boundary (ATB). This boundary is placed in conformity with the target's outer surface, thus resulting in additional reduction of the unknown count.

  18. COMMIX-PPC: A three-dimensional transient multicomponent computer program for analyzing performance of power plant condensers

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chien, T.H.; Domanus, H.M.; Sha, W.T.

    1993-02-01

    The COMMIX-PPC computer program is an extended and improved version of earlier COMMIX codes and is specifically designed for evaluating the thermal performance of power plant condensers. The COMMIX codes are general-purpose computer programs for the analysis of fluid flow and heat transfer in complex industrial systems. In COMMIX-PPC, two major features have been added to previously published COMMIX codes. One feature is the incorporation of one-dimensional equations of conservation of mass, momentum, and energy on the tube side, and the proper accounting for the thermal interaction between shell and tube side through the porous-medium approach. The other added feature is the extension of the three-dimensional conservation equations for shell-side flow to treat the flow of a multicomponent medium. COMMIX-PPC is designed to perform steady-state and transient, three-dimensional analysis of fluid flow with heat transfer in a power plant condenser. However, the code is designed in a generalized fashion so that, with some modification, it can be used to analyze processes in any heat exchanger or other single-phase engineering applications. Volume I (Equations and Numerics) of this report describes in detail the basic equations, formulation, solution procedures, and models for the phenomena. Volume II (User's Guide and Manual) contains the input instructions, flow charts, sample problems, and descriptions of available options and boundary conditions.

  19. Circumstances of Trauma and Accidents in Children: A Thesaurus-based Survey

    PubMed

    Séjourné, Claire; Philbois, Olivier; Vercherin, Paul; Patural, Hugues

    2016-11-25

    Introduction: Injuries and accidents are major causes of morbidity and mortality in children in France. Identification and description of the mechanisms of accidents are essential to develop adapted prevention methods. For this purpose, a specific thesaurus of ICD-10 codes relating to the circumstances of trauma and accidents in children was created in the French Loire department. The objective of this study was to evaluate the relevance and acceptability of the thesaurus in the pediatric emergency unit of Saint-Etienne university hospital. Material and Methods: This study was conducted in two phases. The first, longitudinal phase was conducted over three periods between May and October 2014 to compare codings by emergency room physicians before using the thesaurus with those defined on the basis of the thesaurus. The second phase retrospectively compared coding in July and August 2014, before introduction of the thesaurus, with thesaurus-based coding in July and August 2015. Results: The first phase showed a loss of more than half of the information without the thesaurus. The circumstances of trauma can be described by an appropriate code in more than 90% of cases. The second phase showed a 13% increase in coding of the circumstances of trauma, which nevertheless remains insufficient. Discussion: The thesaurus facilitates coding, generally meets the coding physician’s expectations, and should be used in large-scale epidemiological surveys.

  20. Communications and information research: Improved space link performance via concatenated forward error correction coding

    NASA Technical Reports Server (NTRS)

    Rao, T. R. N.; Seetharaman, G.; Feng, G. L.

    1996-01-01

    With the development of new advanced instruments for remote sensing applications, sensor data will be generated at a rate that not only requires increased onboard processing and storage capability, but also imposes demands on the space-to-ground communication link and the ground data management-communication system. Data compression and error control codes provide viable means to alleviate these demands. Two types of data compression have been studied by many researchers in the area of information theory: a lossless technique that guarantees full reconstruction of the data, and a lossy technique which generally gives a higher data compaction ratio but incurs some distortion in the reconstructed data. To satisfy the many science disciplines which NASA supports, lossless data compression becomes a primary focus for the technology development. While transmitting the data obtained by any lossless data compression, it is very important to use an error-control code. For a long time, convolutional codes have been widely used in satellite telecommunications. To more efficiently transmit the data obtained by the Rice algorithm, it is necessary to compute the a posteriori probability (APP) for each decoded bit. A relevant algorithm for this purpose has been proposed which minimizes the bit error probability in decoding linear block and convolutional codes and provides the APP for each decoded bit. However, recent results on iterative decoding of 'Turbo codes' turn conventional wisdom on its head and suggest fundamentally new techniques.
During the past several months of this research, the following approaches have been developed: (1) a new lossless data compression algorithm, which is much better than the extended Rice algorithm for various types of sensor data, (2) a new approach to determine the generalized Hamming weights of the algebraic-geometric codes defined by a large class of curves in high-dimensional spaces, (3) some efficient improved geometric Goppa codes for disk memory systems and high-speed mass memory systems, and (4) a tree-based approach for data compression using dynamic programming.
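
    The Rice algorithm referenced above is built on Golomb-Rice codes. As background, here is a minimal sketch of the basic power-of-two Rice mapping (illustrative only; this is neither the extended Rice algorithm nor the new algorithm developed in this research):

```python
def rice_encode(n, k):
    """Rice code (Golomb code with m = 2**k) of a nonnegative integer n:
    the quotient n >> k in unary ('1' bits), a '0' terminator, then the
    remainder in k binary bits."""
    q, r = n >> k, n & ((1 << k) - 1)
    rem = format(r, "b").zfill(k) if k > 0 else ""
    return "1" * q + "0" + rem

def rice_decode(bits, k):
    """Inverse of rice_encode; returns (value, number_of_bits_consumed)."""
    q = 0
    while bits[q] == "1":
        q += 1
    pos = q + 1  # skip the '0' terminator
    r = int(bits[pos:pos + k], 2) if k > 0 else 0
    return (q << k) | r, pos + k
```

    Small residuals get short codewords, which is why Rice coding works well on decorrelated sensor data; the parameter k is chosen to match the data's typical magnitude.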

  1. Comparison between calculation and measured data on secondary neutron energy spectra by heavy ion reactions from different thick targets.

    PubMed

    Iwase, H; Wiegel, B; Fehrenbacher, G; Schardt, D; Nakamura, T; Niita, K; Radon, T

    2005-01-01

    Measured neutron energy fluences from high-energy heavy ion reactions in targets from several centimeters to several hundred centimeters thick were compared with calculations made using the recently developed general-purpose particle and heavy ion transport code system (PHITS). It was confirmed that PHITS reproduces neutron production by heavy ion reactions and neutron transport in thick shielding with good overall accuracy.

  2. The Spatiotemporal Characteristics of Visual Motion Priming

    DTIC Science & Technology

    1994-07-01

    859. Barden, W. (1982, June). A general-purpose I/O board for the Color Computer. BYTE Magazine, pp. 260-281. … & Levick, W. (1965). [Front-matter residue: DTIC distribution/availability-codes stamp.] Figure captions: bistable diamond apparent-motion figure (after Ramachandran & Anstis, 1983); "streaming" and "bouncing" percepts of apparent-motion dots.

  3. EZQUERY

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Holcomb, F.; Kroes, J.; Jessen, T.

    1973-10-18

    EZQUERY is a generalized information retrieval and reporting system developed by the Data Processing Services Department to provide a method of accessing and displaying information from common types of data-base files. By eliminating the costs and delays associated with coding and debugging special-purpose programs, it makes simple reports easy to produce. It was designed with the user in mind, and may be used by programmers and nonprogrammers alike to access data-base files and obtain reports in a reasonably brief period of time. (auth)

  4. SINDA/FLUINT Stratified Tank Modeling

    NASA Technical Reports Server (NTRS)

    Sakowski, Barbara A.

    2014-01-01

    A general purpose SINDA/FLUINT (S/F) stratified tank model was created and used to simulate the Ksite1 LH2 liquid self-pressurization tests as well as axial jet mixing within the liquid region of the tank. The S/F model employed the use of stratified layers, i.e. S/F lumps, in the vapor ullage as well as in the liquid region. The model was constructed to analyze a general purpose stratified tank that could incorporate the following features: multiple or singular lumps in the liquid and vapor regions of the tank; real gases (including mixtures) and compressible liquids; venting, pressurizing, and draining; condensation and evaporation/boiling; wall heat transfer; and elliptical, cylindrical, and spherical tank geometries. Extensive user logic was used to allow tailoring of the above features to specific cases. Most of the code input for a specific case could be done through the Registers Data Block.
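
    A single-lump toy version of the self-pressurization physics (constant-volume ideal gas heated at a fixed rate; the gas constants are placeholders roughly appropriate for helium, and this is far simpler than the multi-lump S/F model with real-fluid properties):

```python
def ullage_pressure(m, T0, V, Qdot, t, cv=3120.0, R=2077.0):
    """Single-lump ullage at constant volume V: a steady heat leak Qdot
    raises the gas temperature at dT/dt = Qdot / (m * cv), and the ideal-gas
    law p = m * R * T / V then gives the pressure.  cv and R are placeholder
    helium-like values in J/(kg K)."""
    T = T0 + Qdot * t / (m * cv)
    return m * R * T / V
```

    Stratification matters precisely because this lumped picture is too optimistic: heat concentrated in a warm layer near the interface drives the pressure up faster than a uniformly mixed ullage would, which is what the multi-lump model captures.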

  5. Generalized Bezout's Theorem and its applications in coding theory

    NASA Technical Reports Server (NTRS)

    Berg, Gene A.; Feng, Gui-Liang; Rao, T. R. N.

    1996-01-01

    This paper presents a generalized Bezout theorem which can be used to determine a tighter lower bound of the number of distinct points of intersection of two or more curves for a large class of plane curves. A new approach to determine a lower bound on the minimum distance (and also the generalized Hamming weights) for algebraic-geometric codes defined from a class of plane curves is introduced, based on the generalized Bezout theorem. Examples of more efficient linear codes are constructed using the generalized Bezout theorem and the new approach. For d = 4, the linear codes constructed by the new construction are better than or equal to the known linear codes. For d greater than 5, these new codes are better than the known codes. The Klein code over GF(2(sup 3)) is also constructed.
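
    For small codes, the minimum distance d (and hence the quality of the bounds discussed above) can be checked by brute force: by linearity, the minimum distance equals the minimum Hamming weight over all nonzero codewords. A sketch:

```python
from itertools import product

def min_distance(G):
    """Minimum Hamming distance of the binary linear code generated by the
    rows of G, computed as the minimum weight over all nonzero codewords."""
    n = len(G[0])
    best = n
    for msg in product((0, 1), repeat=len(G)):
        if any(msg):
            cw = [0] * n
            for bit, row in zip(msg, G):
                if bit:
                    cw = [a ^ b for a, b in zip(cw, row)]  # XOR = GF(2) addition
            best = min(best, sum(cw))
    return best
```

    This enumeration is exponential in the dimension k, which is exactly why lower bounds such as the generalized Bezout theorem are valuable for the larger algebraic-geometric codes.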

  6. Implementing Shared Memory Parallelism in MCBEND

    NASA Astrophysics Data System (ADS)

    Bird, Adam; Long, David; Dobson, Geoff

    2017-09-01

    MCBEND is a general purpose radiation transport Monte Carlo code from AMEC Foster Wheeler's ANSWERS® Software Service. MCBEND is well established in the UK shielding community for radiation shielding and dosimetry assessments. The existing MCBEND parallel capability effectively involves running the same calculation on many processors. This works very well except when the memory requirements of a model restrict the number of instances of a calculation that will fit on a machine. To more effectively utilise parallel hardware, OpenMP has been used to implement shared memory parallelism in MCBEND. This paper describes the reasoning behind the choice of OpenMP, notes some of the challenges of multi-threading an established code such as MCBEND, and assesses the performance of the parallel method implemented in MCBEND.
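
    The core OpenMP pattern involved, a parallel loop with a reduction over per-thread partial sums, can be sketched as follows. This is a generic C example (a midpoint-rule estimate of pi), not MCBEND code; compiled without -fopenmp the pragma is ignored and the loop runs serially, one reason OpenMP suits incremental parallelisation of an established code.

```c
#include <math.h>

#ifndef M_PI
#define M_PI 3.14159265358979323846
#endif

/* Midpoint-rule estimate of pi = integral of 4/(1+x^2) over [0,1].
   The `reduction(+:sum)` clause gives every thread a private partial
   sum and combines them after the loop, so there is no data race on
   the shared accumulator -- the same pattern used for Monte Carlo
   tallies in shared memory. */
double estimate_pi(long n) {
    double sum = 0.0;
    #pragma omp parallel for reduction(+:sum)
    for (long i = 0; i < n; i++) {
        double x = (i + 0.5) / n;
        sum += 4.0 / (1.0 + x * x);
    }
    return sum / n;
}
```

    Unlike the process-level replication described above, all threads here share one copy of the problem data, which is what relieves the per-machine memory restriction.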

  7. Shielding calculation and criticality safety analysis of spent fuel transportation cask in research reactors.

    PubMed

    Mohammadi, A; Hassanzadeh, M; Gharib, M

    2016-02-01

    In this study, shielding calculation and criticality safety analysis were carried out for interim storage of general material testing reactor (MTR) research reactor spent fuel and the relevant transportation cask. During these processes, three major terms were considered: source term, shielding, and criticality calculations. The Monte Carlo transport code MCNP5 was used for the shielding calculation and criticality safety analysis, and the ORIGEN2.1 code for the source term calculation. According to the results obtained, a cylindrical cask with body, top, and bottom thicknesses of 18, 13, and 13 cm, respectively, was accepted as the dual-purpose cask. Furthermore, it is shown that the total dose rates are below the normal transport criteria specified in the standards. Copyright © 2015 Elsevier Ltd. All rights reserved.

  8. Introducing DeBRa: a detailed breast model for radiological studies

    NASA Astrophysics Data System (ADS)

    Ma, Andy K. W.; Gunn, Spencer; Darambara, Dimitra G.

    2009-07-01

    Currently, x-ray mammography is the method of choice in breast cancer screening programmes. As the mammography technology moves from 2D imaging modalities to 3D, conventional computational phantoms do not have sufficient detail to support the studies of these advanced imaging systems. Studies of these 3D imaging systems call for a realistic and sophisticated computational model of the breast. DeBRa (Detailed Breast model for Radiological studies) is the most advanced, detailed, 3D computational model of the breast developed recently for breast imaging studies. A DeBRa phantom can be constructed to model a compressed breast, as in film/screen, digital mammography and digital breast tomosynthesis studies, or a non-compressed breast as in positron emission mammography and breast CT studies. Both the cranial-caudal and mediolateral oblique views can be modelled. The anatomical details inside the phantom include the lactiferous duct system, the Cooper ligaments and the pectoral muscle. The fibroglandular tissues are also modelled realistically. In addition, abnormalities such as microcalcifications, irregular tumours and spiculated tumours are inserted into the phantom. Existing sophisticated breast models require specialized simulation codes. Unlike its predecessors, DeBRa has elemental compositions and densities incorporated into its voxels including those of the explicitly modelled anatomical structures and the noise-like fibroglandular tissues. The voxel dimensions are specified as needed by any study and the microcalcifications are embedded into the voxels so that the microcalcification sizes are not limited by the voxel dimensions. Therefore, DeBRa works with general-purpose Monte Carlo codes. Furthermore, general-purpose Monte Carlo codes allow different types of imaging modalities and detector characteristics to be simulated with ease. DeBRa is a versatile and multipurpose model specifically designed for both x-ray and γ-ray imaging studies.

  9. Three-Dimensional Nacelle Aeroacoustics Code With Application to Impedance Eduction

    NASA Technical Reports Server (NTRS)

    Watson, Willie R.

    2000-01-01

    A three-dimensional nacelle acoustics code that accounts for uniform mean flow and variable surface impedance liners is developed. The code is linked to a commercial version of the NASA-developed General Purpose Solver (for solution of linear systems of equations) in order to obtain the capability to study high-frequency waves that may require millions of grid points for resolution. Detailed, single-processor statistics for the performance of the solver in rigid- and soft-wall ducts are presented. Over the range of frequencies of current interest in nacelle liner research, noise attenuation levels predicted by the code were in excellent agreement with those predicted by mode theory. The equation solver is memory efficient, requiring only a small fraction of the memory available on modern computers. As an application, the code is combined with an optimization algorithm and used to educe the impedance spectrum of a ceramic liner. The primary problem with using the code to perform optimization studies at frequencies above 1 kHz is the excessive CPU time (a major portion of which is matrix assembly). It is recommended that research be directed toward development of a rapid sparse assembler and exploitation of the multiprocessor capability of the solver to further reduce CPU time.

  10. Simulation the spatial resolution of an X-ray imager based on zinc oxide nanowires in anodic aluminium oxide membrane by using MCNP and OPTICS Codes

    NASA Astrophysics Data System (ADS)

    Samarin, S. N.; Saramad, S.

    2018-05-01

    The spatial resolution of a detector is a very important parameter for x-ray imaging. A bulk scintillation detector does not have good spatial resolution because of the spreading of light inside the scintillator. Nanowire scintillators, because of their waveguiding behavior, can prevent the spreading of light and can improve the spatial resolution of traditional scintillation detectors. The zinc oxide (ZnO) scintillator nanowire, with its simple construction by electrochemical deposition into the regular hexagonal structure of an aluminium oxide membrane, has many advantages. The three-dimensional absorption of x-ray energy in the ZnO scintillator is simulated with a Monte Carlo transport code (MCNP). The transport, attenuation, and scattering of the generated photons are simulated with a general-purpose scintillator light response simulation code (OPTICS). The results are compared with a previous publication that used a simulation code for the passage of particles through matter (Geant4). The results verify that this scintillator nanowire structure has a spatial resolution of less than one micrometer.

  11. Comparison of numerical techniques for integration of stiff ordinary differential equations arising in combustion chemistry

    NASA Technical Reports Server (NTRS)

    Radhakrishnan, K.

    1984-01-01

    The efficiency and accuracy of several algorithms recently developed for the numerical integration of stiff ordinary differential equations are compared. The methods examined include two general-purpose codes, EPISODE and LSODE, and three codes (CHEMEQ, CREK1D, and GCKP84) developed specifically to integrate chemical kinetic rate equations. The codes are applied to two test problems drawn from combustion kinetics. The comparisons show that LSODE is the fastest code currently available for the integration of combustion kinetic rate equations. An important finding is that an iterative solution of the algebraic energy conservation equation to compute the temperature does not result in significant errors. In addition, this method is more efficient than evaluating the temperature by integrating its time derivative. Significant reductions in computational work are realized by updating the rate constants (k = AT**N exp(-E/RT)) only when the temperature change exceeds an amount delta T that is problem dependent. An approximate expression for the automatic evaluation of delta T is derived and is shown to result in increased efficiency.
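
    The stiffness that motivates these specialized integrators can be illustrated with the scalar test equation y' = lambda*y: explicit (forward) Euler diverges once the step size exceeds its stability limit, while implicit (backward) Euler, the kind of update underlying stiff solvers such as LSODE's BDF methods, remains stable at the same step size. A toy sketch:

```python
def explicit_euler(lam, y0, h, steps):
    """Forward Euler on y' = lam*y; the update multiplies y by (1 + h*lam),
    so it is unstable whenever |1 + h*lam| > 1."""
    y = y0
    for _ in range(steps):
        y += h * lam * y
    return y

def implicit_euler(lam, y0, h, steps):
    """Backward Euler on y' = lam*y; solving y_new = y + h*lam*y_new gives
    y_new = y / (1 - h*lam), stable for any h > 0 when lam < 0."""
    y = y0
    for _ in range(steps):
        y /= 1.0 - h * lam
    return y
```

    With lam = -50 and h = 0.1, the explicit amplification factor is -4, so the iterates blow up, while the implicit factor is 1/6 and the solution decays toward zero like the true exponential.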

  12. The NEST Dry-Run Mode: Efficient Dynamic Analysis of Neuronal Network Simulation Code.

    PubMed

    Kunkel, Susanne; Schenck, Wolfram

    2017-01-01

    NEST is a simulator for spiking neuronal networks that commits to a general-purpose approach: It allows for high flexibility in the design of network models, and its applications range from small-scale simulations on laptops to brain-scale simulations on supercomputers. Hence, developers need to test their code for various use cases and ensure that changes to code do not impair scalability. However, running a full set of benchmarks on a supercomputer takes up precious compute-time resources and can entail long queuing times. Here, we present the NEST dry-run mode, which enables comprehensive dynamic code analysis without requiring access to high-performance computing facilities. A dry-run simulation is carried out by a single process, which performs all simulation steps except communication as if it were part of a parallel environment with many processes. We show that measurements of memory usage and runtime of neuronal network simulations closely match the corresponding dry-run data. Furthermore, we demonstrate the successful application of the dry-run mode in the areas of profiling and performance modeling.
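The dry-run idea, one process doing the per-rank work while skipping communication, can be caricatured in a few lines; this is a conceptual sketch with invented function and parameter names, not NEST's actual API:

```python
def simulate(num_neurons, num_procs, rank, steps, dry_run=True):
    """Do the work of rank `rank` out of `num_procs` ranks for `steps` steps.
    With dry_run=True a single process mimics one rank of a large parallel job."""
    # Round-robin distribution: this rank owns every num_procs-th neuron.
    local = [n for n in range(num_neurons) if n % num_procs == rank]
    state = {n: 0.0 for n in local}          # toy membrane-state per neuron
    for _ in range(steps):
        for n in local:                      # update phase: same work as a real rank
            state[n] += 1.0
        if not dry_run:
            pass                             # a real run would exchange spikes here (e.g. MPI)
    return state

# Memory use and runtime of this one process approximate a single rank of a
# 1024-process run, without touching a cluster.
local_state = simulate(num_neurons=10_000, num_procs=1024, rank=0, steps=5)
```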

  13. The NEST Dry-Run Mode: Efficient Dynamic Analysis of Neuronal Network Simulation Code

    PubMed Central

    Kunkel, Susanne; Schenck, Wolfram

    2017-01-01

    NEST is a simulator for spiking neuronal networks that commits to a general-purpose approach: It allows for high flexibility in the design of network models, and its applications range from small-scale simulations on laptops to brain-scale simulations on supercomputers. Hence, developers need to test their code for various use cases and ensure that changes to code do not impair scalability. However, running a full set of benchmarks on a supercomputer takes up precious compute-time resources and can entail long queuing times. Here, we present the NEST dry-run mode, which enables comprehensive dynamic code analysis without requiring access to high-performance computing facilities. A dry-run simulation is carried out by a single process, which performs all simulation steps except communication as if it were part of a parallel environment with many processes. We show that measurements of memory usage and runtime of neuronal network simulations closely match the corresponding dry-run data. Furthermore, we demonstrate the successful application of the dry-run mode in the areas of profiling and performance modeling. PMID:28701946

  14. Evaluation of candidate working fluid formulations for the electrothermal-chemical wind tunnel

    NASA Technical Reports Server (NTRS)

    Akyurtlu, Jale F.; Akyurtlu, Ates

    1993-01-01

    A new hypersonic test facility which can simulate conditions typical of atmospheric flight at Mach numbers up to 20 is currently under study at the NASA/LaRC Hypersonic Propulsion Branch. In the proposed research, it was suggested that a combustion-augmented electrothermal wind tunnel concept may be applied to the planned hypersonic testing facility. The purpose of the current investigation is to evaluate some candidate working fluid formulations which may be used in the electrothermal-chemical wind tunnel. The efforts in the initial phase of this research were concentrated on acquiring the code used by GASL to model the electrothermal wind tunnel and testing it using the conditions of the GASL simulation. The early version of the general chemical kinetics code (GCKP84) was obtained from NASA, and the latest updated version of the code (LSENS) was obtained from its author, Dr. Bittker. Both codes are installed on a personal computer with a 25 MHz 486 processor and 16 Mbytes of RAM. Since the available memory was not sufficient to debug LSENS, GCKP84 was used for the current work.

  15. Optimisation of a parallel ocean general circulation model

    NASA Astrophysics Data System (ADS)

    Beare, M. I.; Stevens, D. P.

    1997-10-01

    This paper presents the development of a general-purpose parallel ocean circulation model, for use on a wide range of computer platforms, from traditional scalar machines to workstation clusters and massively parallel processors. Parallelism is provided, as a modular option, via high-level message-passing routines, thus hiding the technical intricacies from the user. An initial implementation highlights that the parallel efficiency of the model is adversely affected by a number of factors, for which optimisations are discussed and implemented. The resulting ocean code is portable and, in particular, allows science to be achieved on local workstations that could otherwise only be undertaken on state-of-the-art supercomputers.

  16. Multidisciplinary optimization of a controlled space structure using 150 design variables

    NASA Technical Reports Server (NTRS)

    James, Benjamin B.

    1992-01-01

    A general optimization-based method for the design of large space platforms through integration of the disciplines of structural dynamics and control is presented. The method uses the global sensitivity equations approach and is especially appropriate for preliminary design problems in which the structural and control analyses are tightly coupled. The method is capable of coordinating general purpose structural analysis, multivariable control, and optimization codes, and thus, can be adapted to a variety of controls-structures integrated design projects. The method is used to minimize the total weight of a space platform while maintaining a specified vibration decay rate after slewing maneuvers.

  17. VINE-A NUMERICAL CODE FOR SIMULATING ASTROPHYSICAL SYSTEMS USING PARTICLES. I. DESCRIPTION OF THE PHYSICS AND THE NUMERICAL METHODS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wetzstein, M.; Nelson, Andrew F.; Naab, T.

    2009-10-01

    We present a numerical code for simulating the evolution of astrophysical systems using particles to represent the underlying fluid flow. The code is written in Fortran 95 and is designed to be versatile, flexible, and extensible, with modular options that can be selected either at the time the code is compiled or at run time through a text input file. We include a number of general purpose modules describing a variety of physical processes commonly required in the astrophysical community and we expect that the effort required to integrate additional or alternate modules into the code will be small. In its simplest form the code can evolve the dynamical trajectories of a set of particles in two or three dimensions using a module which implements either a Leapfrog or Runge-Kutta-Fehlberg integrator, selected by the user at compile time. The user may choose to allow the integrator to evolve the system using individual time steps for each particle or with a single, global time step for all. Particles may interact gravitationally as N-body particles, and all or any subset may also interact hydrodynamically, using the smoothed particle hydrodynamic (SPH) method by selecting the SPH module. A third particle species can be included with a module to model massive point particles which may accrete nearby SPH or N-body particles. Such particles may be used to model, e.g., stars in a molecular cloud. Free boundary conditions are implemented by default, and a module may be selected to include periodic boundary conditions. We use a binary 'Press' tree to organize particles for rapid access in gravity and SPH calculations. Modules implementing an interface with special purpose 'GRAPE' hardware may also be selected to accelerate the gravity calculations. If available, forces obtained from the GRAPE coprocessors may be transparently substituted for those obtained from the tree, or both tree and GRAPE may be used as a combination GRAPE/tree code. The code may be run without modification on single processors or in parallel using OpenMP compiler directives on large-scale, shared memory parallel machines. We present simulations of several test problems, including a merger simulation of two elliptical galaxies with 800,000 particles. In comparison to the Gadget-2 code of Springel, the gravitational force calculation, which is the most costly part of any simulation including self-gravity, is ~4.6-4.9 times faster with VINE when tested on different snapshots of the elliptical galaxy merger simulation when run on an Itanium 2 processor in an SGI Altix. A full simulation of the same setup with eight processors is a factor of 2.91 faster with VINE. The code is available to the public under the terms of the GNU General Public License.
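The Leapfrog integrator option mentioned above can be illustrated with a minimal kick-drift-kick step for direct-sum gravity (G = 1 units, a single global time step); this is a didactic sketch, not VINE's production implementation:

```python
import numpy as np

def accelerations(pos, mass, eps=1e-3):
    """Direct-sum gravitational accelerations with Plummer softening eps (G = 1)."""
    n = len(mass)
    acc = np.zeros_like(pos)
    for i in range(n):
        d = pos - pos[i]                       # vectors from body i to all bodies
        r2 = (d**2).sum(axis=1) + eps**2
        r2[i] = np.inf                         # exclude self-force
        acc[i] = (mass[:, None] * d / r2[:, None]**1.5).sum(axis=0)
    return acc

def leapfrog_step(pos, vel, mass, dt):
    """One kick-drift-kick leapfrog step; second-order and symplectic."""
    vel = vel + 0.5 * dt * accelerations(pos, mass)   # half kick
    pos = pos + dt * vel                              # full drift
    vel = vel + 0.5 * dt * accelerations(pos, mass)   # half kick
    return pos, vel

# Two equal masses starting at rest: they should fall toward each other
# while total momentum stays zero.
pos = np.array([[1.0, 0.0, 0.0], [-1.0, 0.0, 0.0]])
vel = np.zeros((2, 3))
mass = np.array([1.0, 1.0])
pos, vel = leapfrog_step(pos, vel, mass, dt=0.01)
```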

  18. Vine—A Numerical Code for Simulating Astrophysical Systems Using Particles. I. Description of the Physics and the Numerical Methods

    NASA Astrophysics Data System (ADS)

    Wetzstein, M.; Nelson, Andrew F.; Naab, T.; Burkert, A.

    2009-10-01

    We present a numerical code for simulating the evolution of astrophysical systems using particles to represent the underlying fluid flow. The code is written in Fortran 95 and is designed to be versatile, flexible, and extensible, with modular options that can be selected either at the time the code is compiled or at run time through a text input file. We include a number of general purpose modules describing a variety of physical processes commonly required in the astrophysical community and we expect that the effort required to integrate additional or alternate modules into the code will be small. In its simplest form the code can evolve the dynamical trajectories of a set of particles in two or three dimensions using a module which implements either a Leapfrog or Runge-Kutta-Fehlberg integrator, selected by the user at compile time. The user may choose to allow the integrator to evolve the system using individual time steps for each particle or with a single, global time step for all. Particles may interact gravitationally as N-body particles, and all or any subset may also interact hydrodynamically, using the smoothed particle hydrodynamic (SPH) method by selecting the SPH module. A third particle species can be included with a module to model massive point particles which may accrete nearby SPH or N-body particles. Such particles may be used to model, e.g., stars in a molecular cloud. Free boundary conditions are implemented by default, and a module may be selected to include periodic boundary conditions. We use a binary "Press" tree to organize particles for rapid access in gravity and SPH calculations. Modules implementing an interface with special purpose "GRAPE" hardware may also be selected to accelerate the gravity calculations. If available, forces obtained from the GRAPE coprocessors may be transparently substituted for those obtained from the tree, or both tree and GRAPE may be used as a combination GRAPE/tree code. 
The code may be run without modification on single processors or in parallel using OpenMP compiler directives on large-scale, shared memory parallel machines. We present simulations of several test problems, including a merger simulation of two elliptical galaxies with 800,000 particles. In comparison to the Gadget-2 code of Springel, the gravitational force calculation, which is the most costly part of any simulation including self-gravity, is ~4.6-4.9 times faster with VINE when tested on different snapshots of the elliptical galaxy merger simulation when run on an Itanium 2 processor in an SGI Altix. A full simulation of the same setup with eight processors is a factor of 2.91 faster with VINE. The code is available to the public under the terms of the Gnu General Public License.

  19. Effect of the multiple scattering of electrons in Monte Carlo simulation of LINACS.

    PubMed

    Vilches, Manuel; García-Pareja, Salvador; Guerrero, Rafael; Anguiano, Marta; Lallena, Antonio M

    2008-01-01

    Results obtained from Monte Carlo simulations of the transport of electrons in thin slabs of dense material media and air slabs with different widths are analyzed. Various general purpose Monte Carlo codes have been used: PENELOPE, GEANT3, GEANT4, EGSNRC, MCNPX. Non-negligible differences between the angular and radial distributions after the slabs have been found. The effects of these differences on the depth doses measured in water are also discussed.

  20. Flowfield predictions for multiple body launch vehicles

    NASA Technical Reports Server (NTRS)

    Deese, Jerry E.; Pavish, D. L.; Johnson, Jerry G.; Agarwal, Ramesh K.; Soni, Bharat K.

    1992-01-01

    A method is developed for simulating inviscid and viscous flow around multicomponent launch vehicles. Grids are generated by the GENIE general-purpose grid-generation code, and the flow solver is a finite-volume Runge-Kutta time-stepping method. Turbulence effects are simulated using the Baldwin and Lomax (1978) turbulence model. Calculations are presented for three multibody launch vehicle configurations: one with two small-diameter solid motors, one with nine small-diameter solid motors, and one with three large-diameter solid motors.

  1. A bill to provide that certain provisions of subchapter I of chapter 35 of title 44, United States Code, relating to Federal information policy shall not apply to the collection of information during any investigation, audit, inspection, evaluation, or other review conducted by any Federal office of Inspector General, and for other purposes.

    THOMAS, 111th Congress

    Sen. Grassley, Chuck [R-IA]

    2009-05-05

    Senate - 05/05/2009 Read twice and referred to the Committee on Homeland Security and Governmental Affairs.

  2. Design and Implementation of a CMOS Chip for a Prolog

    DTIC Science & Technology

    1988-03-01

    generation scheme. We use the P-circuit [9] with pre-conditioning and post-conditioning [2,3] circuits to generate the carry. The implementation of ... system generates vertical microcode for a general purpose processor, the NCR 9300 system, from W-code [7]. Three significant pieces of software are ... calculation block generating the propagate (P) and generate (G) signals needed for carry calculation, and a sum block supplying the final result. The top

  3. Public Law 94-553-Oct. 19, 1976. An Act For the General Revision of the Copyright Law, Title 17 of the United States Code, and for Other Purposes. Title 17-Copyrights. Ninety-Fourth Congress.

    ERIC Educational Resources Information Center

    Congress of the U.S., Washington, DC.

    The copyright law of the United States is amended in its entirety by this act that takes effect in 1978. Literary works; musical works; dramatic works; pantomimes and choreographic works; pictorial, graphic, and sculptural works; motion pictures and other audiovisual works; and sound recordings are included in the subject matter of copyright.…

  4. An application of interactive computer graphics technology to the design of dispersal mechanisms

    NASA Technical Reports Server (NTRS)

    Richter, B. J.; Welch, B. H.

    1977-01-01

    Interactive computer graphics technology is combined with a general purpose mechanisms computer code to study the operational behavior of three guided bomb dispersal mechanism designs. These studies illustrate the use of computer graphics techniques to discover operational anomalies, to assess the effectiveness of design improvements, to reduce the time and cost of the modeling effort, and to provide the mechanism designer with a visual understanding of the physical operation of such systems.

  5. Probabilistic structural analysis methods and applications

    NASA Technical Reports Server (NTRS)

    Cruse, T. A.; Wu, Y.-T.; Dias, B.; Rajagopal, K. R.

    1988-01-01

    An advanced algorithm for simulating the probabilistic distribution of structural responses due to statistical uncertainties in loads, geometry, material properties, and boundary conditions is reported. The method effectively combines an advanced algorithm for calculating probability levels for multivariate problems (fast probability integration) together with a general-purpose finite-element code for stress, vibration, and buckling analysis. Application is made to a space propulsion system turbine blade for which the geometry and material properties are treated as random variables.
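The underlying idea, propagating statistical uncertainty in the inputs through a response model to get a probability level, can be illustrated with plain Monte Carlo sampling; the fast-probability-integration algorithm itself is more sophisticated, and the response function, distributions, and threshold here are invented for illustration:

```python
import random

def response(load, stiffness):
    """Toy structural response, e.g. a displacement-like quantity."""
    return load / stiffness

random.seed(0)  # reproducible sampling
samples = [response(random.gauss(100.0, 10.0),   # load: mean 100, sd 10
                    random.gauss(50.0, 2.0))     # stiffness: mean 50, sd 2
           for _ in range(100_000)]

# Estimated probability that the response exceeds an (illustrative) limit of 2.5.
p_exceed = sum(s > 2.5 for s in samples) / len(samples)
```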

  6. Expert system validation in prolog

    NASA Technical Reports Server (NTRS)

    Stock, Todd; Stachowitz, Rolf; Chang, Chin-Liang; Combs, Jacqueline

    1988-01-01

    An overview is given of the Expert System Validation Assistant (EVA), which is being implemented in Prolog at the Lockheed AI Center. Prolog was chosen to facilitate rapid prototyping of the structure and logic checkers, and since February 1987 we have implemented code to check for irrelevance, subsumption, duplication, dead ends, unreachability, and cycles. The architecture chosen is extremely flexible and expansible, yet concise and complementary with the normal interactive style of Prolog. The foundation of the system is the connection graph representation. Rules and facts are modeled as nodes in the graph, and arcs indicate common patterns between rules. The basic activity of the validation system is then a traversal of the connection graph, searching for various patterns the system recognizes as erroneous. To aid in specifying these patterns, a metalanguage is developed, providing the user with the basic facilities required to reason about the expert system. Using the metalanguage, the user can, for example, give the Prolog inference engine the goal of finding inconsistent conclusions among the rules, and Prolog will search the graph for instantiations which match the definition of inconsistency. Examples of code for some of the checkers are provided and the algorithms explained. Technical highlights include automatic construction of a connection graph, demonstration of the use of the metalanguage, the A* algorithm modified to detect all unique cycles, general-purpose stacks in Prolog, and a general-purpose database browser with pattern completion.
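The cycle check over the connection graph can be sketched in a few lines. This toy version uses Python rather than Prolog, models rules simply as graph nodes, and deduplicates cycles by node set, so it only caricatures EVA's actual checker:

```python
def find_cycles(graph):
    """Return cycles in a rule graph {node: [successor, ...]}, one per node set."""
    seen, cycles, path, on_path = set(), [], [], set()

    def dfs(node):
        if node in on_path:                     # closed a loop back into the path
            cyc = path[path.index(node):]
            key = frozenset(cyc)
            if key not in seen:                 # report each node set once
                seen.add(key)
                cycles.append(cyc + [node])
            return
        path.append(node); on_path.add(node)
        for nxt in graph.get(node, []):
            dfs(nxt)
        path.pop(); on_path.remove(node)

    for start in graph:
        dfs(start)
    return cycles

# Hypothetical rule graph: r1's conclusion feeds r2, r2 feeds r3, r3 feeds r1.
rules = {"r1": ["r2"], "r2": ["r3"], "r3": ["r1"], "r4": []}
cycles = find_cycles(rules)
```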

  7. Variable Complexity Structural Optimization of Shells

    NASA Technical Reports Server (NTRS)

    Haftka, Raphael T.; Venkataraman, Satchi

    1999-01-01

    Structural designers today face both opportunities and challenges in a vast array of available analysis and optimization programs. Some programs, such as NASTRAN, are very general, permitting the designer to model any structure to any degree of accuracy, but often at a higher computational cost. Additionally, such general procedures often do not allow easy implementation of all constraints of interest to the designer. Other programs, based on algebraic expressions used by designers one generation ago, have limited applicability for general structures with modern materials. However, when applicable, they provide easy understanding of design decision trade-offs. Finally, designers can also use specialized programs suitable for efficiently designing a subset of structural problems. For example, PASCO and PANDA2 are panel design codes, which calculate response and estimate failure much more efficiently than general-purpose codes, but are narrowly applicable in terms of geometry and loading. Therefore, the problem of optimizing structures based on simultaneous use of several models and computer programs is a subject of considerable interest. The problem of using several levels of models in optimization has been dubbed variable complexity modeling. Work under NASA grant NAG1-2110 has been concerned with the development of variable complexity modeling strategies with special emphasis on response surface techniques. In addition, several modeling issues for the design of shells of revolution were studied.

  8. Variable Complexity Structural Optimization of Shells

    NASA Technical Reports Server (NTRS)

    Haftka, Raphael T.; Venkataraman, Satchi

    1998-01-01

    Structural designers today face both opportunities and challenges in a vast array of available analysis and optimization programs. Some programs, such as NASTRAN, are very general, permitting the designer to model any structure to any degree of accuracy, but often at a higher computational cost. Additionally, such general procedures often do not allow easy implementation of all constraints of interest to the designer. Other programs, based on algebraic expressions used by designers one generation ago, have limited applicability for general structures with modern materials. However, when applicable, they provide easy understanding of design decision trade-offs. Finally, designers can also use specialized programs suitable for efficiently designing a subset of structural problems. For example, PASCO and PANDA2 are panel design codes, which calculate response and estimate failure much more efficiently than general-purpose codes, but are narrowly applicable in terms of geometry and loading. Therefore, the problem of optimizing structures based on simultaneous use of several models and computer programs is a subject of considerable interest. The problem of using several levels of models in optimization has been dubbed variable complexity modeling. Work under NASA grant NAG1-1808 has been concerned with the development of variable complexity modeling strategies with special emphasis on response surface techniques. In addition, several modeling issues for the design of shells of revolution were studied.

  9. A Generalized Fluid Formulation for Turbomachinery Computations

    NASA Technical Reports Server (NTRS)

    Merkle, Charles L.; Sankaran, Venkateswaran; Dorney, Daniel J.; Sondak, Douglas L.

    2003-01-01

    A generalized formulation of the equations of motion of an arbitrary fluid is developed for the purpose of defining a common iterative algorithm for computational procedures. The method makes use of the equations of motion in conservation form, with separate pseudo-time derivatives used for defining the numerical flux for a Riemann solver and the convergence algorithm. The partial differential equations are complemented by thermodynamic and caloric equations of state of a complexity necessary for describing the fluid. Representative solutions with a new code based on this general equation formulation are provided for three turbomachinery problems. The first uses air as the working fluid, while the second uses gaseous oxygen in a regime in which real gas effects are of little importance. These nearly perfect gas computations provide a basis for comparison with existing perfect gas code computations. The third case is the flow of liquid oxygen through a turbine, where real gas effects are significant. Vortex shedding predictions with the LOX formulation reduce the discrepancy between perfect gas computations and experiment by approximately an order of magnitude, thereby verifying the real gas formulation as well as providing an effective case where its capabilities are necessary.

  10. Extension of Generalized Fluid System Simulation Program's Fluid Property Database

    NASA Technical Reports Server (NTRS)

    Patel, Kishan

    2011-01-01

    This internship focused on the development of additional capabilities for the Generalized Fluid System Simulation Program (GFSSP). GFSSP is a thermo-fluid code used to evaluate system performance by a finite volume-based network analysis method. The program was developed primarily to analyze the complex internal flow of propulsion systems and is capable of solving many problems related to thermodynamics and fluid mechanics. GFSSP is integrated with thermodynamic programs that provide fluid properties for sub-cooled, superheated, and saturation states. For fluids that are not included in the thermodynamic property program, look-up property tables can be provided. The look-up property tables of the current release version can only handle sub-cooled and superheated states. The primary purpose of the internship was to extend the look-up tables to handle saturated states. This involves (a) generation of a property table using REFPROP, a widely used thermodynamic property program, and (b) modification of the Fortran source code to read in an additional property table containing saturation data for both saturated liquid and saturated vapor states. Also, a method was implemented to calculate the thermodynamic properties of user fluids within the saturation region, given values of pressure and enthalpy. These additions required new code to be written, and older code had to be adjusted to accommodate the new capabilities. Ultimately, the changes will lead to the incorporation of this new capability in future versions of GFSSP. This paper describes the development and validation of the new capability.
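The saturation-region lookup described above amounts to interpolating saturated-liquid and saturated-vapor enthalpies at the given pressure and solving for quality. A hedged sketch with invented table values (not REFPROP data), in Python rather than the code's Fortran:

```python
import bisect

# Illustrative saturation table: (P [kPa], hf [kJ/kg], hg [kJ/kg]),
# sorted by pressure. Real data would come from REFPROP.
sat_table = [
    (100.0, 417.4, 2675.0),
    (200.0, 504.7, 2706.0),
    (300.0, 561.4, 2724.0),
]

def quality(P, h):
    """Thermodynamic quality x at pressure P and enthalpy h, by linear
    interpolation of hf and hg; x < 0 or x > 1 means outside the dome."""
    ps = [row[0] for row in sat_table]
    i = bisect.bisect_left(ps, P)
    i = min(max(i, 1), len(ps) - 1)           # clamp to a valid bracketing pair
    (p0, hf0, hg0), (p1, hf1, hg1) = sat_table[i - 1], sat_table[i]
    t = (P - p0) / (p1 - p0)
    hf = hf0 + t * (hf1 - hf0)                # saturated-liquid enthalpy at P
    hg = hg0 + t * (hg1 - hg0)                # saturated-vapor enthalpy at P
    return (h - hf) / (hg - hf)
```

Any other property inside the dome can then be mixed as prop = propf + x * (propg - propf) from the same table.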

  11. COMMIX-PPC: A three-dimensional transient multicomponent computer program for analyzing performance of power plant condensers. Volume 1, Equations and numerics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chien, T.H.; Domanus, H.M.; Sha, W.T.

    1993-02-01

    The COMMIX-PPC computer program is an extended and improved version of earlier COMMIX codes and is specifically designed for evaluating the thermal performance of power plant condensers. The COMMIX codes are general-purpose computer programs for the analysis of fluid flow and heat transfer in complex industrial systems. In COMMIX-PPC, two major features have been added to previously published COMMIX codes. One feature is the incorporation of one-dimensional equations of conservation of mass, momentum, and energy on the tube side and the proper accounting for the thermal interaction between shell and tube side through the porous-medium approach. The other added feature is the extension of the three-dimensional conservation equations for shell-side flow to treat the flow of a multicomponent medium. COMMIX-PPC is designed to perform steady-state and transient, three-dimensional analysis of fluid flow with heat transfer in a power plant condenser. However, the code is designed in a generalized fashion so that, with some modification, it can be used to analyze processes in any heat exchanger or other single-phase engineering applications. Volume I (Equations and Numerics) of this report describes in detail the basic equations, formulation, solution procedures, and models for the phenomena. Volume II (User's Guide and Manual) contains the input instructions, flow charts, sample problems, and descriptions of available options and boundary conditions.

  12. The detection and extraction of interleaved code segments

    NASA Technical Reports Server (NTRS)

    Rugaber, Spencer; Stirewalt, Kurt; Wills, Linda M.

    1995-01-01

    This project is concerned with a specific difficulty that arises when trying to understand and modify computer programs. In particular, it is concerned with the phenomenon of 'interleaving', in which one section of a program accomplishes several purposes, and disentangling the code responsible for each purpose is difficult. Unraveling interleaved code involves discovering the purpose of each strand of computation, as well as understanding why the programmer decided to interleave the strands. Increased understanding improves the productivity and quality of software maintenance, enhancement, and documentation activities. The goal of the project is to characterize the phenomenon of interleaving as a prerequisite for building tools to detect and extract interleaved code fragments.

  13. Quantum Kronecker sum-product low-density parity-check codes with finite rate

    NASA Astrophysics Data System (ADS)

    Kovalev, Alexey A.; Pryadko, Leonid P.

    2013-07-01

    We introduce an ansatz for quantum codes which gives the hypergraph-product (generalized toric) codes by Tillich and Zémor and generalized bicycle codes by MacKay as limiting cases. The construction allows for both the lower and the upper bounds on the minimum distance; they scale as a square root of the block length. Many thus defined codes have a finite rate and limited-weight stabilizer generators, an analog of classical low-density parity-check (LDPC) codes. Compared to the hypergraph-product codes, hyperbicycle codes generally have a wider range of parameters; in particular, they can have a higher rate while preserving the estimated error threshold.
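The hypergraph-product construction that this work generalizes can be written down directly with Kronecker products over GF(2). A minimal sketch, using the [3,1] repetition code for both factors (which is known to yield a small surface-code-type quantum code):

```python
import numpy as np

def hypergraph_product(H1, H2):
    """X- and Z-stabilizer matrices of the hypergraph-product (CSS) code of
    two classical parity-check matrices H1 (r1 x n1), H2 (r2 x n2) over GF(2)."""
    r1, n1 = H1.shape
    r2, n2 = H2.shape
    HX = np.hstack([np.kron(H1, np.eye(n2, dtype=int)),
                    np.kron(np.eye(r1, dtype=int), H2.T)]) % 2
    HZ = np.hstack([np.kron(np.eye(n1, dtype=int), H2),
                    np.kron(H1.T, np.eye(r2, dtype=int))]) % 2
    # CSS condition: HX @ HZ.T = 2 * kron(H1, H2.T) = 0 mod 2, i.e. every
    # X stabilizer commutes with every Z stabilizer.
    assert not ((HX @ HZ.T) % 2).any()
    return HX, HZ

# Parity-check matrix of the classical [3,1] repetition code.
H = np.array([[1, 1, 0],
              [0, 1, 1]])
HX, HZ = hypergraph_product(H, H)   # a 13-qubit code (n1*n2 + r1*r2 = 9 + 4)
```

The generators stay sparse when H1 and H2 are sparse, which is the LDPC property the abstract refers to.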

  14. Survey of computer programs for heat transfer analysis

    NASA Astrophysics Data System (ADS)

    Noor, A. K.

    An overview is presented of the current capabilities of thirty-eight computer programs that can be used for solution of heat transfer problems. These programs range from the large, general-purpose codes with a broad spectrum of capabilities, a large user community, and comprehensive user support (e.g., ANSYS, MARC, MITAS II, MSC/NASTRAN, SESAM-69/NV-615) to the small, special-purpose codes with a limited user community, such as ANDES, NNTB, SAHARA, SSPTA, TACO, TEPSA, and TRUMP. The capabilities of the programs surveyed are listed in tabular form, followed by a summary of the major features of each program. As with any survey of computer programs, the present one has the following limitations: (1) It is useful only in the initial selection of the programs which are most suitable for a particular application. The final selection of the program to be used should, however, be based on a detailed examination of the documentation and the literature about the program; (2) Since computer software continually changes, often at a rapid rate, some means must be found for updating this survey and maintaining some degree of currency.

  15. Survey of computer programs for heat transfer analysis

    NASA Technical Reports Server (NTRS)

    Noor, A. K.

    1982-01-01

    An overview is presented of the current capabilities of thirty-eight computer programs that can be used for solution of heat transfer problems. These programs range from the large, general-purpose codes with a broad spectrum of capabilities, a large user community, and comprehensive user support (e.g., ANSYS, MARC, MITAS II, MSC/NASTRAN, SESAM-69/NV-615) to the small, special-purpose codes with a limited user community, such as ANDES, NNTB, SAHARA, SSPTA, TACO, TEPSA, and TRUMP. The capabilities of the programs surveyed are listed in tabular form, followed by a summary of the major features of each program. As with any survey of computer programs, the present one has the following limitations: (1) It is useful only in the initial selection of the programs which are most suitable for a particular application. The final selection of the program to be used should, however, be based on a detailed examination of the documentation and the literature about the program; (2) Since computer software continually changes, often at a rapid rate, some means must be found for updating this survey and maintaining some degree of currency.

  16. Rubus: A compiler for seamless and extensible parallelism.

    PubMed

    Adnan, Muhammad; Aslam, Faisal; Nawaz, Zubair; Sarwar, Syed Mansoor

    2017-01-01

    Nowadays, a typical processor may have multiple processing cores on a single chip. Furthermore, a special-purpose processing unit called the Graphics Processing Unit (GPU), originally designed for 2D/3D games, is now available for general-purpose use in computers and mobile devices. However, the traditional programming languages, which were designed to work with machines having single-core CPUs, cannot utilize the parallelism available on multi-core processors efficiently. Therefore, to exploit the extraordinary processing power of multi-core processors, researchers are working on new tools and techniques to facilitate parallel programming. To this end, languages like CUDA and OpenCL have been introduced, which can be used to write code with parallelism. The main shortcoming of these languages is that the programmer needs to specify all the complex details manually in order to parallelize the code across multiple cores. Therefore, the code written in these languages is difficult to understand, debug, and maintain. Furthermore, parallelizing legacy code can require rewriting a significant portion of it in CUDA or OpenCL, which can consume significant time and resources. Thus, the amount of parallelism achieved is proportional to the skills of the programmer and the time spent on code optimizations. This paper proposes a new open-source compiler, Rubus, to achieve seamless parallelism. The Rubus compiler relieves the programmer from manually specifying the low-level details. It analyses and transforms a sequential program into a parallel program automatically, without any user intervention. This achieves massive speedup and better utilization of the underlying hardware without a programmer's expertise in parallel programming. For five different benchmarks, an average speedup of 34.54 times has been achieved by Rubus as compared to Java on a basic GPU having only 96 cores. For a matrix multiplication benchmark, an average execution speedup of 84 times has been achieved by Rubus on the same GPU. Moreover, Rubus achieves this performance without drastically increasing the memory footprint of a program.

  17. Rubus: A compiler for seamless and extensible parallelism

    PubMed Central

    Adnan, Muhammad; Aslam, Faisal; Sarwar, Syed Mansoor

    2017-01-01

    Nowadays, a typical processor may have multiple processing cores on a single chip. Furthermore, a special-purpose processing unit called the Graphics Processing Unit (GPU), originally designed for 2D/3D games, is now available for general-purpose use in computers and mobile devices. However, traditional programming languages, which were designed for machines with single-core CPUs, cannot efficiently utilize the parallelism available on multi-core processors. Therefore, to exploit the extraordinary processing power of multi-core processors, researchers are working on new tools and techniques to facilitate parallel programming. To this end, languages such as CUDA and OpenCL have been introduced for writing parallel code. Their main shortcoming is that the programmer must manually specify all the complex details needed to parallelize code across multiple cores. Code written in these languages is therefore difficult to understand, debug, and maintain. Furthermore, parallelizing legacy code can require rewriting a significant portion of it in CUDA or OpenCL, which can consume considerable time and resources. Thus, the amount of parallelism achieved is proportional to the programmer's skill and the time spent on code optimization. This paper proposes a new open-source compiler, Rubus, to achieve seamless parallelism. The Rubus compiler relieves the programmer from manually specifying low-level details: it analyzes and transforms a sequential program into a parallel program automatically, without any user intervention. This yields massive speedup and better utilization of the underlying hardware without requiring expertise in parallel programming. Across five different benchmarks, Rubus achieved an average speedup of 34.54 times over Java on a basic GPU with only 96 cores. For a matrix multiplication benchmark, Rubus achieved an average execution speedup of 84 times on the same GPU. Moreover, Rubus achieves this performance without drastically increasing the memory footprint of a program. PMID:29211758
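    To make concrete the kind of rewriting an auto-parallelizing compiler performs, the sketch below contrasts a sequential loop with an equivalent data-parallel mapping. This is purely illustrative and is not Rubus output; Rubus operates on Java programs and targets GPU cores, whereas here a Python thread pool stands in for the parallel hardware.

```python
from concurrent.futures import ThreadPoolExecutor

def scale(row, factor=2):
    # Independent per-row work with no loop-carried dependence --
    # exactly the property a parallelizing compiler must verify
    # before mapping the loop across cores.
    return [x * factor for x in row]

matrix = [[1, 2, 3], [4, 5, 6], [7, 8, 9]]

# Sequential formulation (what the programmer writes).
seq = [scale(row) for row in matrix]

# Data-parallel formulation (the shape of what an auto-parallelizing
# compiler emits; a thread pool stands in for GPU cores here).
with ThreadPoolExecutor(max_workers=3) as pool:
    par = list(pool.map(scale, matrix))

assert seq == par
```

The legality of the transformation rests entirely on the absence of dependences between iterations; detecting that automatically is the hard part of the compiler's job.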

  18. An Integrated Solution for Performing Thermo-fluid Conjugate Analysis

    NASA Technical Reports Server (NTRS)

    Kornberg, Oren

    2009-01-01

    A method has been developed that integrates a fluid flow analyzer and a thermal analyzer to produce both steady-state and transient results for 1-D, 2-D, and 3-D analysis models. The Generalized Fluid System Simulation Program (GFSSP) is a one-dimensional, general-purpose fluid analysis code which computes pressures and flow distributions in complex fluid networks. The MSC Systems Improved Numerical Differencing Analyzer (MSC.SINDA) is a one-dimensional, general-purpose thermal analyzer that solves network representations of thermal systems. Both GFSSP and MSC.SINDA have graphical user interfaces which are used to build the respective model and prepare it for analysis. The SINDA/GFSSP Conjugate Integrator (SGCI) is a form-based graphical integration program used to set input parameters for the conjugate analyses and run the models. This paper describes SGCI and its thermo-fluid conjugate analysis techniques and capabilities by presenting results from example models, including the cryogenic chilldown of a copper pipe, a bar between two walls in a fluid stream, and a solid plate creating a phase change in a flowing fluid.

  19. Requirements to Design to Code: Towards a Fully Formal Approach to Automatic Code Generation

    NASA Technical Reports Server (NTRS)

    Hinchey, Michael G.; Rash, James L.; Rouff, Christopher A.

    2005-01-01

    A general-purpose method to mechanically transform system requirements into a provably equivalent model has yet to appear. Such a method represents a necessary step toward high-dependability system engineering for numerous possible application domains, including distributed software systems, sensor networks, robot operation, complex scripts for spacecraft integration and testing, and autonomous systems. Currently available tools and methods that start with a formal model of a system and mechanically produce a provably equivalent implementation are valuable but not sufficient. The gap that current tools and methods leave unfilled is that their formal models cannot be proven to be equivalent to the system requirements as originated by the customer. For the classes of systems whose behavior can be described as a finite (but significant) set of scenarios, we offer a method for mechanically transforming requirements (expressed in restricted natural language, or in other appropriate graphical notations) into a provably equivalent formal model that can be used as the basis for code generation and other transformations.

  1. EBQ code: Transport of space-charge beams in axially symmetric devices

    NASA Astrophysics Data System (ADS)

    Paul, A. C.

    1982-11-01

    Such general-purpose space charge codes as EGUN, BATES, WODF, and TRANSPORT do not gracefully accommodate the simulation of relativistic space-charge beams propagating a long distance in axially symmetric devices where a high degree of cancellation has occurred between the self-magnetic and self-electric forces of the beam. The EBQ code was written specifically to follow high-current beam particles, where space charge is important, in long-distance flight through axially symmetric machines possessing external electric and magnetic fields. EBQ simultaneously tracks all trajectories so as to allow procedures for charge deposition based on inter-ray separations. The orbits are treated in Cartesian geometry (position and momentum) with z as the independent variable. Poisson's equation is solved in cylindrical geometry on an orthogonal rectangular mesh. EBQ can also handle problems involving multiple ion species where the space charge from each must be included. Such problems arise in the design of ion sources where different charge and mass states are present.

  2. Heat transfer, thermal stress analysis and the dynamic behaviour of high power RF structures. [MARC and SUPERFISH codes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    McKeown, J.; Labrie, J.P.

    1983-08-01

    A general purpose finite element computer code called MARC is used to calculate the temperature distribution and dimensional changes in linear accelerator rf structures. Both steady state and transient behaviour are examined with the computer model. Combining results from MARC with the cavity evaluation computer code SUPERFISH, the static and dynamic behaviour of a structure under power is investigated. Structure cooling is studied to minimize loss in shunt impedance and frequency shifts during high power operation. Results are compared with an experimental test carried out on a cw 805 MHz on-axis coupled structure at an energy gradient of 1.8 MeV/m. The model has also been used to compare the performance of on-axis and coaxial structures and has guided the mechanical design of structures suitable for average gradients in excess of 2.0 MeV/m at 2.45 GHz.

  3. The Julia programming language: the future of scientific computing

    NASA Astrophysics Data System (ADS)

    Gibson, John

    2017-11-01

    Julia is an innovative new open-source programming language for high-level, high-performance numerical computing. Julia combines the general-purpose breadth and extensibility of Python, the ease-of-use and numeric focus of Matlab, the speed of C and Fortran, and the metaprogramming power of Lisp. Julia uses type inference and just-in-time compilation to compile high-level user code to machine code on the fly. A rich set of numeric types and extensive numerical libraries are built-in. As a result, Julia is competitive with Matlab for interactive graphical exploration and with C and Fortran for high-performance computing. This talk interactively demonstrates Julia's numerical features and benchmarks Julia against C, C++, Fortran, Matlab, and Python on a spectral time-stepping algorithm for a 1d nonlinear partial differential equation. The Julia code is nearly as compact as Matlab and nearly as fast as Fortran. This material is based upon work supported by the National Science Foundation under Grant No. 1554149.
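    The cross-language comparison described in this talk rests on timing equivalent implementations of the same kernel. Below is a minimal sketch of that methodology (illustrative only; the talk's actual benchmark is a spectral time-stepper for a 1D nonlinear PDE):

```python
import timeit

def sum_loop(n):
    # Interpreted Python loop: each iteration pays interpreter overhead.
    s = 0.0
    for i in range(n):
        s += i * 0.5
    return s

def sum_builtin(n):
    # Pushes the loop into the C runtime -- loosely analogous to how
    # Julia's JIT compiles high-level code down to machine code.
    return sum(i * 0.5 for i in range(n))

t_loop = timeit.timeit(lambda: sum_loop(10_000), number=100)
t_builtin = timeit.timeit(lambda: sum_builtin(10_000), number=100)
print(f"loop: {t_loop:.4f}s  builtin: {t_builtin:.4f}s")
```

A fair cross-language benchmark times identical algorithms on identical inputs and checks that all implementations produce the same result, as the assertion-style comparison here does.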

  4. Enhancing the ABAQUS thermomechanics code to simulate multipellet steady and transient LWR fuel rod behavior

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    R. L. Williamson

    A powerful multidimensional fuels performance analysis capability, applicable to both steady and transient fuel behavior, is developed based on enhancements to the commercially available ABAQUS general-purpose thermomechanics code. Enhanced capabilities are described, including: UO2 temperature and burnup dependent thermal properties, solid and gaseous fission product swelling, fuel densification, fission gas release, cladding thermal and irradiation creep, cladding irradiation growth, gap heat transfer, and gap/plenum gas behavior during irradiation. This new capability is demonstrated using a 2D axisymmetric analysis of the upper section of a simplified multipellet fuel rod, during both steady and transient operation. Comparisons are made between discrete and smeared-pellet simulations. Computational results demonstrate the importance of a multidimensional, multipellet, fully-coupled thermomechanical approach. Interestingly, many of the inherent deficiencies in existing fuel performance codes (e.g., 1D thermomechanics, loose thermomechanical coupling, separate steady and transient analysis, cumbersome pre- and post-processing) are, in fact, ABAQUS strengths.

  5. Applying Standard Interfaces to a Process-Control Language

    NASA Technical Reports Server (NTRS)

    Berthold, Richard T.

    2005-01-01

    A method of applying open-operating-system standard interfaces to the NASA User Interface Language (UIL) has been devised. UIL is a computing language that can be used in monitoring and controlling automated processes: for example, the Timeliner computer program, written in UIL, is a general-purpose software system for monitoring and controlling sequences of automated tasks in a target system. In providing the major elements of connectivity between UIL and the target system, the present method offers advantages over the prior method. Most notably, unlike in the prior method, the software description of the target system can be made independent of the applicable compiler software and need not be linked to the applicable executable compiler image. Also unlike in the prior method, it is not necessary to recompile the source code and relink the source code to a new executable compiler image. Abstraction of the description of the target system to a data file can be defined easily, with intuitive syntax, and knowledge of the source-code language is not needed for the definition.

  6. Do marketing and alcohol treatment/public health experts think televised alcohol advertisements abide by regulatory guidelines?

    PubMed

    Lloyd, Kelly; Cameron, Elaine; Williams, Hannah; Banister, Emma; Donmall, Michael; Higgins, Alan; French, David P

    2018-04-01

    Televised alcohol advertisements in the United Kingdom must abide by the Broadcast Committee of Advertising Practice Code, which provides guidelines concerning advertisements not implying, condoning or encouraging immoderate, irresponsible or antisocial drinking. Previously, 75 per cent of 373 general public respondents, each shown one of seven advertisements, rated it as breaching at least one guideline. This study assessed whether experts in marketing (n = 25) and alcohol treatment/public health (n = 25) perceived the same seven television alcohol advertisements as complying with the Broadcast Committee of Advertising Practice Code. Overall, 83 per cent of advertisements were rated as breaching at least one guideline. This provides further evidence that self-regulatory alcohol guidelines are not fit for purpose.

  7. A look at scalable dense linear algebra libraries

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dongarra, J.J.; Van de Geijn, R.A.; Walker, D.W.

    1992-01-01

    We discuss the essential design features of a library of scalable software for performing dense linear algebra computations on distributed memory concurrent computers. The square block scattered decomposition is proposed as a flexible and general-purpose way of decomposing most, if not all, dense matrix problems. An object-oriented interface to the library permits more portable applications to be written, and is easy to learn and use, since details of the parallel implementation are hidden from the user. Experiments on the Intel Touchstone Delta system with a prototype code that uses the square block scattered decomposition to perform LU factorization are presented and analyzed. It was found that the code was both scalable and efficient, performing at about 14 GFLOPS (double precision) for the largest problem considered.
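    The square block scattered (block-cyclic) decomposition assigns block (i, j) of the matrix to process (i mod P, j mod Q) on a P x Q process grid, so every process receives blocks from all regions of the matrix and load stays balanced as a factorization proceeds. A minimal sketch of that mapping (illustrative, not the library's actual code):

```python
def owner(block_row, block_col, grid_rows, grid_cols):
    # Block-cyclic ("square block scattered") mapping: block (i, j)
    # belongs to process (i mod P, j mod Q) on a P x Q process grid.
    return (block_row % grid_rows, block_col % grid_cols)

# Distribute a 4x4 grid of matrix blocks over a 2x2 process grid.
layout = [[owner(i, j, 2, 2) for j in range(4)] for i in range(4)]
for row in layout:
    print(row)
```

With this layout each of the four processes owns exactly four blocks, and the blocks eliminated early in an LU factorization are spread across all processes rather than concentrated on a few.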

  9. A complexity-scalable software-based MPEG-2 video encoder.

    PubMed

    Chen, Guo-bin; Lu, Xin-ning; Wang, Xing-guo; Liu, Ji-lin

    2004-05-01

    With the development of general-purpose processors (GPP) and video signal processing algorithms, it is possible to implement a software-based real-time video encoder on a GPP, and its low cost and easy upgrade path attract developers to transfer video encoding from specialized hardware to more flexible software. In this paper, the encoding structure is first set up to support complexity scalability; then high-performance algorithms are applied to the key time-consuming modules in the coding process; finally, at the programming level, processor characteristics are exploited to improve data-access efficiency and processing parallelism. Other programming methods, such as lookup tables, are adopted to reduce the computational complexity. Simulation results showed that these ideas could not only improve the global performance of video coding, but also provide great flexibility in complexity regulation.
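    The lookup-table optimization mentioned above trades a small precomputed array for per-sample branching in the inner loop. A minimal sketch for output-pixel clamping (the input range below is an illustrative assumption, not a value taken from the paper):

```python
# Precompute an 8-bit clamp table once, then index it in the inner
# loop instead of calling min/max per pixel -- the classic
# lookup-table optimization for software video encoders.
CLAMP = [min(max(v, 0), 255) for v in range(-256, 512)]

def clamp_pixel(v):
    # v is assumed to lie in [-256, 511], an illustrative range for
    # values reachable after an integer IDCT with rounding error.
    return CLAMP[v + 256]

assert clamp_pixel(-5) == 0
assert clamp_pixel(300) == 255
assert clamp_pixel(128) == 128
```

The table costs 768 entries of memory; in exchange, the per-pixel work becomes a single indexed load, which also avoids branch mispredictions in the hot loop.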

  10. Recent developments of the NESSUS probabilistic structural analysis computer program

    NASA Technical Reports Server (NTRS)

    Millwater, H.; Wu, Y.-T.; Torng, T.; Thacker, B.; Riha, D.; Leung, C. P.

    1992-01-01

    The NESSUS probabilistic structural analysis computer program combines state-of-the-art probabilistic algorithms with general purpose structural analysis methods to compute the probabilistic response and the reliability of engineering structures. Uncertainty in loading, material properties, geometry, boundary conditions and initial conditions can be simulated. The structural analysis methods include nonlinear finite element and boundary element methods. Several probabilistic algorithms are available such as the advanced mean value method and the adaptive importance sampling method. The scope of the code has recently been expanded to include probabilistic life and fatigue prediction of structures in terms of component and system reliability and risk analysis of structures considering cost of failure. The code is currently being extended to structural reliability considering progressive crack propagation. Several examples are presented to demonstrate the new capabilities.
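    The basic quantity such a code computes, the probability that a limit-state function goes negative, can be sketched with direct Monte Carlo sampling. NESSUS itself uses far more efficient methods (advanced mean value, adaptive importance sampling); the limit state and distributions below are illustrative assumptions:

```python
import random

def failure_probability(limit_state, sampler, n=20_000, seed=1):
    # Direct Monte Carlo estimate of P[g(X) < 0]. The estimated
    # quantity is the same one NESSUS targets with faster algorithms.
    rng = random.Random(seed)
    failures = sum(1 for _ in range(n) if limit_state(sampler(rng)) < 0)
    return failures / n

# Illustrative limit state: capacity R ~ N(10, 1) minus load S ~ N(7, 1);
# failure occurs when load exceeds capacity.
def g(x):
    r, s = x
    return r - s

def draw(rng):
    return (rng.gauss(10, 1), rng.gauss(7, 1))

pf = failure_probability(g, draw)
print(f"estimated failure probability: {pf:.4f}")
```

For this normal capacity-minus-load case the exact answer is about 0.017; direct sampling needs many samples for small probabilities, which is why importance-sampling methods matter in practice.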

  11. United States Air Force Graduate Student Summer Support Program (1987). Program Technical Report. Volume 1.

    DTIC Science & Technology

    1987-12-01

    developed for a large percentage of the participants in the Summer Faculty Research Program in the 1979-1983 period through an AFOSR Minigrant Program. On 1...Analysis of a Bimodal Nuclear Rocket Core by Dav,, C. Carpenter. ABSTRACT: The framework for a general purpose finite element analysis code was developed to study the 2-D temperature distribution in a hot-channel hexagonal fuel element in the core of a bimodal nuclear rocket. Preliminary thermal

  12. Computational Performance of Intel MIC, Sandy Bridge, and GPU Architectures: Implementation of a 1D c++/OpenMP Electrostatic Particle-In-Cell Code

    DTIC Science & Technology

    2014-05-01

    fusion, space and astrophysical plasmas, but still the general picture can be presented quite well with the fluid approach [6, 7]. The microscopic...purpose computing CPU for algorithms where processing of large blocks of data is done in parallel. The reason for that is the GPU's highly effective parallel structure. Most of the image and video processing computations involve heavy matrix and vector operations over large amounts of data and

  13. Nonlinear predictive control of a LEGO mobile robot

    NASA Astrophysics Data System (ADS)

    Merabti, H.; Bouchemal, B.; Belarbi, K.; Boucherma, D.; Amouri, A.

    2014-10-01

    Metaheuristics are general-purpose heuristics that have shown great potential for solving difficult optimization problems. In this work, we apply one such metaheuristic, particle swarm optimization (PSO), to the optimization problem arising in nonlinear model predictive control (NLMPC). The algorithm is easy to code and may be considered an alternative to the more classical solution procedures. The PSO-NLMPC approach is applied to control a mobile robot for trajectory tracking and obstacle avoidance. Experimental results show the strength of this approach.
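    A minimal particle swarm optimizer of the kind used here fits in a few dozen lines. Each particle remembers its personal best position while the swarm shares a global best; the inertia and acceleration coefficients below are common textbook choices, not the paper's settings:

```python
import random

def pso(f, dim, n_particles=30, iters=200, seed=0):
    # Minimal PSO: velocities blend inertia, attraction to the
    # particle's own best, and attraction to the swarm's best.
    rng = random.Random(seed)
    w, c1, c2 = 0.7, 1.5, 1.5
    pos = [[rng.uniform(-5, 5) for _ in range(dim)] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]
    pbest_val = [f(p) for p in pos]
    g = min(range(n_particles), key=lambda i: pbest_val[i])
    gbest, gbest_val = pbest[g][:], pbest_val[g]
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                vel[i][d] = (w * vel[i][d]
                             + c1 * rng.random() * (pbest[i][d] - pos[i][d])
                             + c2 * rng.random() * (gbest[d] - pos[i][d]))
                pos[i][d] += vel[i][d]
            v = f(pos[i])
            if v < pbest_val[i]:
                pbest[i], pbest_val[i] = pos[i][:], v
                if v < gbest_val:
                    gbest, gbest_val = pos[i][:], v
    return gbest, gbest_val

# Minimize the 2-D sphere function as a stand-in for an NMPC cost.
best, val = pso(lambda x: sum(xi * xi for xi in x), dim=2)
print(best, val)
```

In an NLMPC setting, f would be the receding-horizon cost evaluated by simulating the robot model over the prediction horizon, and the returned position would be the optimized control sequence.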

  14. Finite element modelling of non-linear magnetic circuits using Cosmic NASTRAN

    NASA Technical Reports Server (NTRS)

    Sheerer, T. J.

    1986-01-01

    The general purpose finite element program COSMIC NASTRAN currently has the ability to model magnetic circuits with constant permeabilities. An approach was developed which, through small modifications to the program, allows modelling of non-linear magnetic devices including soft magnetic materials, permanent magnets and coils. Use of the NASTRAN code resulted in output which can be used for subsequent mechanical analysis using a variation of the same computer model. Test problems were found to produce theoretically verifiable results.

  15. MCNP (Monte Carlo Neutron Photon) capabilities for nuclear well logging calculations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Forster, R.A.; Little, R.C.; Briesmeister, J.F.

    The Los Alamos Radiation Transport Code System (LARTCS) consists of state-of-the-art Monte Carlo and discrete ordinates transport codes and data libraries. The general-purpose continuous-energy Monte Carlo code MCNP (Monte Carlo Neutron Photon), part of the LARTCS, provides a computational predictive capability for many applications of interest to the nuclear well logging community. The generalized three-dimensional geometry of MCNP is well suited for borehole-tool models. SABRINA, another component of the LARTCS, is a graphics code that can be used to interactively create a complex MCNP geometry. Users can define many source and tally characteristics with standard MCNP features. The time-dependent capability of the code is essential when modeling pulsed sources. Problems with neutrons, photons, and electrons as either single particles or coupled particles can be calculated with MCNP. The physics of neutron and photon transport and interactions is modeled in detail using the latest available cross-section data. A rich collection of variance reduction features can greatly increase the efficiency of a calculation. MCNP is written in FORTRAN 77 and has been run on a variety of computer systems from scientific workstations to supercomputers. The next production version of MCNP will include features such as continuous-energy electron transport and a multitasking option. Areas of ongoing research of interest to the well logging community include angle biasing, adaptive Monte Carlo, improved discrete ordinates capabilities, and discrete ordinates/Monte Carlo hybrid development. Los Alamos has requested approval by the Department of Energy to create a Radiation Transport Computational Facility under their User Facility Program to increase external interactions with industry, universities, and other government organizations. 21 refs.
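    The flavor of a Monte Carlo transport estimate can be conveyed by a one-dimensional sketch: sample exponential path lengths through a purely absorbing slab and compare the transmitted fraction with the analytic answer. Real MCNP calculations track scattering, energy dependence, and full 3-D geometry; this toy model is only illustrative:

```python
import math
import random

def transmission(sigma_t, thickness, n=100_000, seed=42):
    # Analog Monte Carlo for uncollided transmission through a purely
    # absorbing slab: sample a free path -ln(u)/sigma_t per particle
    # and count those that exceed the slab thickness.
    rng = random.Random(seed)
    passed = sum(1 for _ in range(n)
                 if -math.log(rng.random()) / sigma_t > thickness)
    return passed / n

est = transmission(sigma_t=1.0, thickness=2.0)
print(est, math.exp(-2.0))  # estimate vs analytic exp(-sigma_t * x)
```

The statistical error shrinks as 1/sqrt(n), which is why variance reduction features like those in MCNP matter for deep-penetration problems.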

  16. Assessing primary care data quality.

    PubMed

    Lim, Yvonne Mei Fong; Yusof, Maryati; Sivasampu, Sheamini

    2018-04-16

    Purpose: The purpose of this paper is to assess National Medical Care Survey data quality. Design/methodology/approach: Data completeness and representativeness were computed for all observations, while other data quality measures were assessed using a 10 per cent sample from the National Medical Care Survey database; i.e., 12,569 primary care records from 189 public and private practices were included in the analysis. Findings: Data field completion ranged from 69 to 100 per cent. Error rates for data transfer from paper to the web-based application varied between 0.5 and 6.1 per cent. Error rates arising from diagnosis and clinical process coding were higher than for medication coding. Data fields that involved free-text entry were more prone to errors than those involving selection from menus. The authors found that completeness, accuracy, coding reliability and representativeness were generally good, while data timeliness needs to be improved. Research limitations/implications: Only data entered into the web-based application were examined. Data omissions and errors in the original questionnaires were not covered. Practical implications: Results from this study provide informative and practicable approaches to improving primary health care data completeness and accuracy, especially in developing nations where resources are limited. Originality/value: Primary care data quality studies in developing nations are limited. Understanding errors and missing data enables researchers and health service administrators to prevent quality-related problems in primary care data.

  17. Application of FUN3D and CFL3D to the Third Workshop on CFD Uncertainty Analysis

    NASA Technical Reports Server (NTRS)

    Rumsey, C. L.; Thomas, J. L.

    2008-01-01

    Two Reynolds-averaged Navier-Stokes computer codes - one unstructured and one structured - are applied to two workshop cases (for the 3rd Workshop on CFD Uncertainty Analysis, held at Instituto Superior Tecnico, Lisbon, in October 2008) for the purpose of uncertainty analysis. The Spalart-Allmaras turbulence model is employed. The first case uses the method of manufactured solutions and is intended as a verification case. In other words, the CFD solution is expected to approach the exact solution as the grid is refined. The second case is a validation case (comparison against experiment), for which modeling errors inherent in the turbulence model and errors/uncertainty in the experiment may prevent close agreement. The results from the two computer codes are also compared. This exercise verifies that the codes are consistent both with the exact manufactured solution and with each other. In terms of order property, both codes behave as expected for the manufactured solution. For the backward-facing step, CFD uncertainty on the finest grid is computed and is generally very low for both codes (whose results are nearly identical). Agreement with experiment is good at some locations for particular variables, but there are also many areas where the CFD and experimental uncertainties do not overlap.
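    The verification step described here hinges on the observed order of accuracy: given errors measured against the manufactured solution on two grids related by a refinement ratio r, the observed order is p = log(e_coarse/e_fine)/log(r), which should approach the scheme's formal order under refinement. A minimal sketch with illustrative error values:

```python
import math

def observed_order(e_coarse, e_fine, refinement=2.0):
    # Observed order of accuracy from discretization errors on two
    # grids related by a refinement ratio r:
    #   p = log(e_coarse / e_fine) / log(r)
    return math.log(e_coarse / e_fine) / math.log(refinement)

# Illustrative errors from a nominally second-order scheme under
# grid halving: the error drops by a factor of four.
p = observed_order(4.0e-3, 1.0e-3)
print(p)  # 2.0
```

If p falls short of the formal order, the grids may be outside the asymptotic range or the implementation may contain an error, which is exactly what this kind of workshop exercise is designed to expose.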

  18. Towers of generalized divisible quantum codes

    NASA Astrophysics Data System (ADS)

    Haah, Jeongwan

    2018-04-01

    A divisible binary classical code is one in which every code word has weight divisible by a fixed integer. If the divisor is 2^ν for a positive integer ν, then one can construct a Calderbank-Shor-Steane (CSS) code, where the X-stabilizer space is the divisible classical code, that admits a transversal gate in the ν-th level of the Clifford hierarchy. We consider a generalization of the divisibility by allowing a coefficient vector of odd integers with which every code word has zero dot product modulo the divisor. In this generalized sense, we construct a CSS code with divisor 2^(ν+1) and code distance d from any CSS code of code distance d and divisor 2^ν where the transversal X is a nontrivial logical operator. The encoding rate of the new code is approximately d times smaller than that of the old code. In particular, for large d and ν ≥ 2, our construction yields a CSS code of parameters [[O(d^(ν-1)), Ω(d), d]] admitting a transversal gate at the ν-th level of the Clifford hierarchy. For our construction we introduce a conversion from magic state distillation protocols based on Clifford measurements to those based on codes with transversal T gates. Our tower contains, as a subclass, generalized triply even CSS codes that have appeared in so-called gauge fixing or code switching methods.
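    The classical notion of divisibility underlying this construction is easy to check by brute force for a small code. The sketch below enumerates the codewords of the first-order Reed-Muller code RM(1,3), a standard example whose nonzero weights are 4 and 8 and which is therefore divisible by 2^2 (the example is illustrative and not taken from the paper):

```python
from itertools import product

# Generator matrix of the first-order Reed-Muller code RM(1,3),
# an [8,4] binary code: the all-ones row plus the three coordinate
# functions on the 3-cube.
G = [
    [1, 1, 1, 1, 1, 1, 1, 1],
    [0, 0, 0, 0, 1, 1, 1, 1],
    [0, 0, 1, 1, 0, 0, 1, 1],
    [0, 1, 0, 1, 0, 1, 0, 1],
]

def codewords(G):
    # All GF(2) linear combinations of the generator rows.
    n = len(G[0])
    for coeffs in product([0, 1], repeat=len(G)):
        yield [sum(c * row[j] for c, row in zip(coeffs, G)) % 2
               for j in range(n)]

weights = sorted({sum(w) for w in codewords(G)})
print(weights)  # [0, 4, 8]
assert all(w % 4 == 0 for w in weights)
```

Every weight is divisible by 4 = 2^2, so RM(1,3) is divisible in the classical sense the abstract starts from; the paper's contribution is the generalized, coefficient-weighted version of this condition.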

  19. Generalized type II hybrid ARQ scheme using punctured convolutional coding

    NASA Astrophysics Data System (ADS)

    Kallel, Samir; Haccoun, David

    1990-11-01

    A method is presented to construct rate-compatible convolutional (RCC) codes from known high-rate punctured convolutional codes, obtained from best-rate 1/2 codes. The construction method is rather simple and straightforward, and still yields good codes. Moreover, low-rate codes can be obtained without any limit on the lowest achievable code rate. Based on the RCC codes, a generalized type-II hybrid ARQ scheme, which combines the benefits of the modified type-II hybrid ARQ strategy of Hagenauer (1988) with the code-combining ARQ strategy of Chase (1985), is proposed and analyzed. With the proposed generalized type-II hybrid ARQ strategy, the throughput increases as the starting coding rate increases, and as the channel degrades, it tends to merge with the throughput of rate 1/2 type-II hybrid ARQ schemes with code combining, thus allowing the system to be flexible and adaptive to channel conditions, even under wide noise variations and severe degradations.
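    Puncturing derives a higher-rate code from a rate-1/2 mother code by deleting coded bits according to a periodic pattern; rate compatibility means each higher-rate pattern keeps a subset of the bits kept at lower rates, so a retransmission need only supply the missing bits. A minimal sketch with a classic rate-2/3 pattern (the bit values are illustrative):

```python
def puncture(bits, pattern):
    # Keep only the coded bits where the puncturing pattern has a 1.
    # The pattern is read column by column across the encoder's two
    # output streams, cycling with the pattern period.
    kept = []
    for i in range(0, len(bits), 2):
        col = (i // 2) % len(pattern[0])
        for stream in (0, 1):
            if pattern[stream][col]:
                kept.append(bits[i + stream])
    return kept

# Rate-1/2 mother code output for 4 information bits: 8 coded bits.
coded = [1, 0, 1, 1, 0, 0, 1, 0]

# Rate-2/3 puncturing pattern: keep both bits in the first column,
# only the first stream's bit in the second column.
P = [[1, 1],
     [1, 0]]

sent = puncture(coded, P)
print(len(coded), "->", len(sent))  # 8 -> 6
```

Four information bits now cost six transmitted bits, giving rate 2/3; in a type-II hybrid ARQ scheme the two punctured bits would be sent on request, stepping the effective rate back toward 1/2.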

  20. The weight hierarchies and chain condition of a class of codes from varieties over finite fields

    NASA Technical Reports Server (NTRS)

    Wu, Xinen; Feng, Gui-Liang; Rao, T. R. N.

    1996-01-01

    The generalized Hamming weights of linear codes were first introduced by Wei. These are fundamental parameters related to the minimal overlap structures of the subcodes and very useful in several fields. It was found that the chain condition of a linear code is convenient in studying the generalized Hamming weights of the product codes. In this paper we consider a class of codes defined over some varieties in projective spaces over finite fields, whose generalized Hamming weights can be determined by studying the orbits of subspaces of the projective spaces under the actions of classical groups over finite fields, i.e., the symplectic groups, the unitary groups and orthogonal groups. We give the weight hierarchies and generalized weight spectra of the codes from Hermitian varieties and prove that the codes satisfy the chain condition.
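    For a tiny code, the generalized Hamming weight hierarchy d_1 <= d_2 <= ... <= d_k can be computed directly from the definition: d_r is the minimum support size over all r-dimensional subcodes. A brute-force sketch, feasible only for toy parameters and unrelated to the varieties studied in the paper:

```python
from itertools import combinations, product

def span(vectors):
    # All GF(2) linear combinations of the given vectors.
    n = len(vectors[0])
    out = set()
    for coeffs in product([0, 1], repeat=len(vectors)):
        out.add(tuple(sum(c * v[j] for c, v in zip(coeffs, vectors)) % 2
                      for j in range(n)))
    return out

def support_size(words):
    # Number of coordinate positions where some word is nonzero.
    n = len(next(iter(words)))
    return sum(1 for j in range(n) if any(w[j] for w in words))

def generalized_weights(G):
    # d_r = min support size over all r-dimensional subcodes,
    # found by brute force over subsets of nonzero codewords.
    k = len(G)
    all_words = [list(w) for w in span(G) if any(w)]
    d = []
    for r in range(1, k + 1):
        best = None
        for subset in combinations(all_words, r):
            sub = span(list(subset))
            if len(sub) == 2 ** r:  # the subset is linearly independent
                s = support_size(sub)
                best = s if best is None else min(best, s)
        d.append(best)
    return d

# A [4,2] binary code: d_1 is the minimum distance, d_k the support
# of the whole code.
G = [[1, 1, 0, 0],
     [0, 0, 1, 1]]
print(generalized_weights(G))  # [2, 4]
```

Here d_1 = 2 recovers the minimum distance and d_2 = 4 is the support of the whole code; the chain condition discussed in the paper asks whether subcodes achieving each d_r can be nested into a single chain.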

  1. seismo-live: Training in Seismology using Jupyter Notebooks

    NASA Astrophysics Data System (ADS)

    Igel, Heiner; Krischer, Lion; van Driel, Martin; Tape, Carl

    2017-04-01

    Practical training in computational methodologies is still underrepresented in Earth science curricula, despite the increasing use of sometimes highly sophisticated simulation and data processing technologies in research projects. At the same time, well-engineered community codes make it easy to obtain results, with the attendant danger that the inherent traps of black-box solutions are not well understood. For this purpose we have initiated a community platform (www.seismo-live.org) where Python-based Jupyter notebooks can be accessed and run without any downloads or local software installation. The increasingly popular Jupyter notebooks allow combining markup text, graphics, and equations with interactive, executable Python code. The platform already includes general Python training, an introduction to the ObsPy library for seismology, as well as seismic data processing, noise analysis, and a variety of forward solvers for seismic wave propagation. In addition, an example is shown of how Jupyter notebooks can be used to increase the reproducibility of published results. Submission of Jupyter notebooks for general seismology is encouraged. The platform can be used for complementary teaching in Earth Science courses on compute-intensive research areas. We present recent developments and new features.

  2. A General Tool for Evaluating High-Contrast Coronagraphic Telescope Performance Error Budgets

    NASA Technical Reports Server (NTRS)

    Marchen, Luis F.; Shaklan, Stuart B.

    2009-01-01

    This paper describes a general-purpose Coronagraph Performance Error Budget (CPEB) tool that we have developed under the NASA Exoplanet Exploration Program. The CPEB automates many of the key steps required to evaluate the scattered-starlight contrast in the dark hole of a space-based coronagraph. It operates in three steps: First, a CodeV or Zemax prescription is converted into a MACOS optical prescription. Second, a Matlab program calls ray-trace code that generates linear beam-walk and aberration sensitivity matrices for motions of the optical elements and line-of-sight pointing, with and without controlled coarse- and fine-steering mirrors. Third, the sensitivity matrices are imported by macros into Excel 2007, where the error budget is created. Once created, the user specifies the quality of each optic from a predefined set of PSDs. The spreadsheet creates a nominal set of thermal and jitter motions and combines them with the sensitivity matrices to generate an error budget for the system. The user can easily modify the motion allocations to perform trade studies.
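    The final roll-up step of such an error budget multiplies sensitivity matrices by allocated motions and combines the resulting contrast terms, typically by root-sum-square. A minimal sketch with invented sensitivity values (the numbers are illustrative assumptions, not CPEB data):

```python
import math

# Hypothetical sensitivity matrix: contrast change per nm of motion
# for two optics across two degrees of freedom (illustrative only).
S = [[2.0e-12, 5.0e-13],
     [1.0e-12, 3.0e-13]]

# Allocated RMS motions (nm) for each degree of freedom.
motions = [0.1, 0.5]

# Per-optic contrast terms, combined by root-sum-square as in a
# typical error-budget roll-up.
terms = [sum(S[i][j] * motions[j] for j in range(len(motions)))
         for i in range(len(S))]
total = math.sqrt(sum(t * t for t in terms))
print(total)
```

Trade studies then amount to re-running this roll-up with modified motion allocations, which is exactly what the spreadsheet front end makes cheap.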

  3. A critical analysis of the accuracy of several numerical techniques for combustion kinetic rate equations

    NASA Technical Reports Server (NTRS)

    Radhakrishnan, Krishnan

    1993-01-01

    A detailed analysis of the accuracy of several techniques recently developed for integrating stiff ordinary differential equations is presented. The techniques include two general-purpose codes, EPISODE and LSODE, developed for arbitrary systems of ordinary differential equations, and three specialized codes, CHEMEQ, CREK1D, and GCKP4, developed specifically to solve chemical kinetic rate equations. The accuracy study is made by applying these codes to two practical combustion kinetics problems. Both problems describe adiabatic, homogeneous, gas-phase chemical reactions at constant pressure, and include all three combustion regimes: induction, heat release, and equilibration. To illustrate the error variation in the different combustion regimes, the species are divided into three types (reactants, intermediates, and products), and error-versus-time plots are presented for each species type and for the temperature. These plots show that CHEMEQ is the most accurate code during induction and early heat release. During late heat release and equilibration, however, the other codes are more accurate. A single global quantity, a mean integrated root-mean-square error, which measures the average error incurred in solving the complete problem, is used to compare the accuracy of the codes. Among the codes examined, LSODE is the most accurate for solving chemical kinetics problems. It is also the most efficient code, in the sense that it requires the least computational work to attain a specified accuracy level. An important finding is that use of the algebraic enthalpy conservation equation to compute the temperature can be more accurate and efficient than integrating the temperature differential equation.
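
A mean integrated root-mean-square error of the kind used for the comparison can be sketched as below (an illustrative reconstruction with hypothetical solution data; the paper's exact normalization may differ):

```python
import math

def mean_integrated_rms_error(approx, exact):
    """Average over output times of the root-mean-square relative error
    across all solution components (species and temperature)."""
    total = 0.0
    for y_approx, y_exact in zip(approx, exact):
        sq = [((a - e) / e) ** 2 for a, e in zip(y_approx, y_exact)]
        total += math.sqrt(sum(sq) / len(sq))
    return total / len(approx)

# Two hypothetical output times, two components each, checked
# against a reference solution.
reference = [[1.0, 2.0], [0.5, 3.0]]
solver_a  = [[1.01, 2.02], [0.51, 3.03]]
err = mean_integrated_rms_error(solver_a, reference)
```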

  4. Human factors in equipment development for the Space Shuttle - A study of the general purpose work station

    NASA Technical Reports Server (NTRS)

    Junge, M. K.; Giacomi, M. J.

    1981-01-01

    The results of a human factors test to assess the suitability of a prototype general purpose work station (GPWS) for biosciences experiments on the fourth Spacelab mission are reported. The evaluation was performed to verify that users of the GPWS would interact optimally with the GPWS configuration and instrumentation. Six male subjects sat on stools positioned to approximate the zero-g body posture. Trials were run concerning the operator viewing angles facing the console, the console color, procedures for injecting rats with dye, a rat blood cell count, mouse dissection, squirrel monkey transfer, and plant fixation. The trials were run for several days in order to gauge improvement or poor performance conditions. Better access to the work surface was found necessary, together with more distinct and better-located LEDs, better access window latches, clearer sequences on control buttons, color-coded sequential buttons, and provision for an intercom system when operators of the GPWS work in tandem.

  5. The re-identification risk of Canadians from longitudinal demographics

    PubMed Central

    2011-01-01

    Background The public is less willing to allow their personal health information to be disclosed for research purposes if they do not trust researchers and how researchers manage their data. However, the public is more comfortable with their data being used for research if the risk of re-identification is low. There are few studies on the risk of re-identification of Canadians from their basic demographics, and no studies on their risk from their longitudinal data. Our objective was to estimate the risk of re-identification from the basic cross-sectional and longitudinal demographics of Canadians. Methods Uniqueness is a common measure of re-identification risk. Demographic data on a 25% random sample of the population of Montreal were analyzed to estimate population uniqueness on postal code, date of birth, and gender as well as their generalizations, for periods ranging from 1 year to 11 years. Results Almost 98% of the population was unique on full postal code, date of birth and gender: these three variables are effectively a unique identifier for Montrealers. Uniqueness increased for longitudinal data. Considerable generalization was required to reach acceptably low uniqueness levels, especially for longitudinal data. Detailed guidelines and disclosure policies on how to ensure that the re-identification risk is low are provided. Conclusions A large percentage of Montreal residents are unique on basic demographics. For non-longitudinal data sets, the three character postal code, gender, and month/year of birth represent sufficiently low re-identification risk. Data custodians need to generalize their demographic information further for longitudinal data sets. PMID:21696636
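
Population uniqueness on a quasi-identifier of the kind studied here can be estimated with a few lines of Python (the records below are hypothetical; the study's data are not reproduced):

```python
from collections import Counter

# Hypothetical records: (3-character postal code prefix,
#                        year-month of birth, gender)
records = [
    ("H2X", "1980-01", "M"),
    ("H2X", "1980-01", "M"),   # shares its quasi-identifier with the above
    ("H3Z", "1975-06", "F"),
    ("H4A", "1990-11", "F"),
]

counts = Counter(records)
unique = sum(1 for r in records if counts[r] == 1)
uniqueness = unique / len(records)   # fraction unique on this key
```

Generalizing a variable (e.g. truncating the postal code or coarsening date of birth) merges groups and lowers `uniqueness`, which is the mitigation the paper discusses.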

  6. NHEXAS PHASE I ARIZONA STUDY--STANDARD OPERATING PROCEDURE FOR CODING AND CODING VERIFICATION (HAND ENTRY) (UA-D-14.0)

    EPA Science Inventory

    The purpose of this SOP is to define the coding strategy for coding and coding verification of hand-entered data. It applies to the coding of all physical forms, especially those coded by hand. The strategy was developed for use in the Arizona NHEXAS project and the "Border" st...

  7. Evaluation of Life Sciences Glovebox (LSG) and Multi-Purpose Crew Restraint Concepts

    NASA Technical Reports Server (NTRS)

    Whitmore, Mihriban

    2005-01-01

    Within the scope of the Multi-Purpose Crew Restraints for Long Duration Spaceflights project, funded by Code U, it was proposed to conduct a series of evaluations on the ground and on the KC-135 to investigate the human factors issues concerning confined/unique workstations, such as the design of crew restraints. The usability of multiple crew restraints was evaluated for use with the Life Sciences Glovebox (LSG) and for performing general purpose tasks. The purposes of the KC-135 microgravity evaluation were to: (1) investigate the usability and effectiveness of the concepts developed, (2) gather recommendations for further development of the concepts, and (3) verify the validity of the existing requirements. Some designs had already been tested during a March KC-135 evaluation, and testing revealed the need for modifications/enhancements. This flight was designed to test the new iterations, as well as some new concepts. This flight also involved higher-fidelity tasks in the LSG, and the addition of load cells on the gloveports.

  8. The recruitment of new members to existing PBSGL small groups: a qualitative study.

    PubMed

    Park, Julia; Cunningham, David E

    2018-04-23

    Introduction Practice-Based Small Group Learning (PBSGL) is a learning programme widely adopted by primary healthcare professions (general practitioners, general practice nurses and pharmacists) in Scotland and other countries in the UK. PBSGL groups recruit members and decide on meeting dates and venues. Study aims To determine how groups recruit new members and discern the important attributes of the new members. Method A grounded theory approach was used with purposive sampling to recruit PBSGL groups to the study. Focus groups drawn from established PBSGL groups were conducted by two researchers following an iterative process, with interviews audio-recorded and transcribed, and codes and themes constructed. Data saturation was achieved. Results and conclusions Four themes were identified that affected group recruitment: group formation and purpose; group culture and ethos; experience and seniority range of group members; professional socialisation and cross-fertilisation. Groups whose main purpose was learning encouraged diverse membership, while groups that were stricter with recruitment often prioritised friendship, group safety, trust and peer support over learning. The variation in groups' openness to recruitment may make it difficult for potential members to find a group, and this may affect the development and expansion of the PBSGL programme.

  9. Optimization of atmospheric transport models on HPC platforms

    NASA Astrophysics Data System (ADS)

    de la Cruz, Raúl; Folch, Arnau; Farré, Pau; Cabezas, Javier; Navarro, Nacho; Cela, José María

    2016-12-01

    The performance and scalability of atmospheric transport models on high-performance computing environments is often far from optimal, for multiple reasons including sequential input and output, synchronous communications, work imbalance, memory access latency, and lack of task overlapping. We investigate how different software optimizations and porting to non-general-purpose hardware architectures improve code scalability and execution times, considering the FALL3D volcanic ash transport model as an example. To this end, we implement the FALL3D model equations in the WARIS framework, a software platform designed from scratch to solve different geoscience problems in a parallel and efficient way on a wide variety of architectures. In addition, we consider further improvements in WARIS such as hybrid MPI-OpenMP parallelization, spatial blocking, auto-tuning and thread affinity. Considering all these aspects together, the FALL3D execution times for a realistic test case running on general-purpose cluster architectures (Intel Sandy Bridge) decrease by a factor of between 7 and 40 depending on the grid resolution. Finally, we port the application to Intel Xeon Phi (MIC) and NVIDIA GPU (CUDA) accelerator-based architectures and compare performance, cost and power consumption across all the architectures. Implications for time-constrained operational model configurations are discussed.
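
Spatial blocking, one of the optimizations mentioned, restructures loops so that the traversal works on small tiles that fit in cache; a minimal sketch on a matrix transpose (illustrative only, not WARIS code):

```python
def blocked_transpose(a, n, tile=4):
    """Transpose an n x n matrix (stored as a flat list) by visiting it in
    tile x tile blocks, the spatial-blocking pattern that keeps the working
    set cache-resident on real hardware."""
    out = [0.0] * (n * n)
    for ii in range(0, n, tile):
        for jj in range(0, n, tile):
            for i in range(ii, min(ii + tile, n)):
                for j in range(jj, min(jj + tile, n)):
                    out[j * n + i] = a[i * n + j]
    return out

n = 6
a = [float(k) for k in range(n * n)]
t = blocked_transpose(a, n, tile=4)
```

The result is identical to the untiled loop; only the memory access order (and hence locality) changes.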

  10. A qualitative study of patient experiences of Type 2 Diabetes care delivered comparatively by General Practice Nurses and Medical Practitioners.

    PubMed

    Boyle, Eileen; Saunders, Rosemary; Drury, Vicki

    2016-07-01

    To explore patient experiences of type 2 diabetes mellitus care delivered by general practice nurses in collaboration with the general practitioner. Australian general practice nurses are expanding their role in multidisciplinary type 2 diabetes care, with limited research on patient perceptions of care provision within this collaborative model. Qualitative interpretive. Purposeful sampling was used to invite the patients (n = 10). Data were collected from semi-structured face-to-face interviews. Braun and Clarke's inductive coding thematic analysis process was used to interpret the data. All participants experienced their general practice nurse consultation as a clinical assessment for their general practitioner. While they appreciated the extra time with the general practice nurse, they were unsure of the purpose of the consultation beyond clinical assessment. They described the ongoing challenge of living with T2DM and identified a need for additional information and advice. The results suggest that the model of general practice nurse type 2 diabetes care has an important role to play in the delivery of effective ongoing care of patients. However, this role requires further development to ensure that patients understand it as a role that not only conducts clinical assessments but also provides relevant education and self-management support as part of a collaborative approach to care delivery with general practitioners. The findings are relevant to primary health care clinicians providing diabetes care, informing more relevant supportive care by general practice nurses. © 2016 John Wiley & Sons Ltd.

  11. Orthorectification by Using Gpgpu Method

    NASA Astrophysics Data System (ADS)

    Sahin, H.; Kulur, S.

    2012-07-01

    Thanks to the nature of graphics processing, newly released products offer highly parallel processing units with high memory bandwidth and computational power of more than a teraflop per second. Modern GPUs are not only powerful graphics engines but also highly parallel programmable processors, with very fast computing capability and much higher memory bandwidth than central processing units (CPUs). Data-parallel computation can be briefly described as mapping data elements to parallel processing threads. The rapid development of GPU programmability and capability has attracted the attention of researchers dealing with complex problems that require intensive calculation. This interest gave rise to the concepts of "General Purpose Computation on Graphics Processing Units (GPGPU)" and "stream processing". Graphics processors are powerful yet cheap and affordable hardware, and so have become an alternative to conventional processors. Graphics chips, once fixed-function application hardware, have been transformed into modern, powerful and programmable processors. The main difficulty is that graphics processing units use a programming model unlike current programming methods: efficient GPU programming requires re-coding the algorithm with the limitations and structure of the graphics hardware in mind, and traditional event-procedure programming methods cannot be used for these many-core processors. GPUs are especially effective at repeating the same computing steps over many data elements when high accuracy is needed, performing the computation quickly without loss of accuracy; by comparison, CPUs, which execute one computation at a time according to flow control, are slower for such workloads. This study covers how the general-purpose parallel programming and computational power of GPUs can be used in photogrammetric applications, especially direct georeferencing. The direct georeferencing algorithm was coded using the GPGPU method and the CUDA (Compute Unified Device Architecture) programming language, and the results were compared with a traditional CPU implementation. In a second application, projective rectification was coded using the GPGPU method and CUDA; sample images of various sizes were processed and the results evaluated. The GPGPU method is especially useful for repeating the same computations on highly dense data, finding the solution quickly.
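
The per-pixel kernel of a projective rectification is a natural fit for GPGPU because every output pixel is independent; below is a CPU-side sketch of the kernel body (hypothetical illustration, not the study's CUDA code):

```python
def warp_point(H, x, y):
    """Per-pixel body of a projective-rectification kernel: map a pixel
    through a 3x3 homography H (given row-major) and dehomogenize.
    On a GPU this function would run as one thread per pixel."""
    xs = H[0] * x + H[1] * y + H[2]
    ys = H[3] * x + H[4] * y + H[5]
    w  = H[6] * x + H[7] * y + H[8]
    return xs / w, ys / w

# Sanity check with the identity homography, which must leave
# coordinates unchanged.
H_id = [1, 0, 0,
        0, 1, 0,
        0, 0, 1]
coords = [warp_point(H_id, x, y) for y in range(2) for x in range(2)]
```

In CUDA the same body would be wrapped in a `__global__` kernel indexed by `threadIdx`/`blockIdx`, one thread per output pixel.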

  12. Features of MCNP6 Relevant to Medical Radiation Physics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hughes, H. Grady III; Goorley, John T.

    2012-08-29

    MCNP (Monte Carlo N-Particle) is a general-purpose Monte Carlo code for simulating the transport of neutrons, photons, electrons, positrons, and more recently other fundamental particles and heavy ions. Over many years MCNP has found a wide range of applications in many different fields, including medical radiation physics. In this presentation we will describe and illustrate a number of significant recently-developed features in the current version of the code, MCNP6, having particular utility for medical physics. Among these are major extensions of the ability to simulate large, complex geometries, improvements in memory requirements and speed for large lattices, introduction of mesh-based isotopic reaction tallies, advances in radiography simulation, expanded variance-reduction capabilities, especially for pulse-height tallies, and a large number of enhancements in photon/electron transport.

  13. MCNP4A: Features and philosophy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hendricks, J.S.

    This paper describes MCNP, states its philosophy, introduces a number of new features becoming available with version MCNP4A, and answers a number of questions asked by participants in the workshop. MCNP is a general-purpose three-dimensional neutron, photon and electron transport code. Its philosophy is "Quality, Value and New Features." Quality is exemplified by new software quality assurance practices and a program of benchmarking against experiments. Value includes a strong emphasis on documentation and code portability. New features are the third priority. MCNP4A is now available at Los Alamos. New features in MCNP4A include enhanced statistical analysis, distributed-processor multitasking, new photon libraries, ENDF/B-VI capabilities, X-Windows graphics, dynamic memory allocation, expanded criticality output, periodic boundaries, plotting of particle tracks via SABRINA, and many other improvements. 23 refs.

  14. Variation in clinical coding lists in UK general practice: a barrier to consistent data entry?

    PubMed

    Tai, Tracy Waize; Anandarajah, Sobanna; Dhoul, Neil; de Lusignan, Simon

    2007-01-01

    Routinely collected general practice computer data are used for quality improvement; poor data quality including inconsistent coding can reduce their usefulness. To document the diversity of data entry systems currently in use in UK general practice and highlight possible implications for data quality. General practice volunteers provided screen shots of the clinical coding screen they would use to code a diagnosis or problem title in the clinical consultation. The six clinical conditions examined were: depression, cystitis, type 2 diabetes mellitus, sore throat, tired all the time, and myocardial infarction. We looked at the picking lists generated for these problem titles in EMIS, IPS, GPASS and iSOFT general practice clinical computer systems, using the Triset browser as a gold standard for comparison. A mean of 19.3 codes is offered in the picking list after entering a diagnosis or problem title. EMIS produced the longest picking lists and GPASS the shortest, with a mean number of choices of 35.2 and 12.7, respectively. Approximately three-quarters (73.5%) of codes are diagnoses, one-eighth (12.5%) symptom codes, and the remainder come from a range of Read chapters. There was no readily detectable consistent order in which codes were displayed. Velocity coding, whereby commonly-used codes are placed higher in the picking list, results in variation between practices even where they have the same brand of computer system. Current systems for clinical coding promote diversity rather than consistency of clinical coding. As the UK moves towards an integrated health IT system consistency of coding will become more important. A standardised, limited list of codes for primary care might help address this need.
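
Velocity coding as described, promoting frequently used codes up the picking list, can be sketched in a few lines (the code strings below are hypothetical placeholders, not actual Read codes):

```python
from collections import Counter

# Hypothetical usage history for one practice: velocity coding reorders
# the picking list so that frequently chosen codes appear first.
usage = ["E112.", "E112.", "E112.", "Eu32.", "E2B..", "Eu32."]
picking_list = ["E2B..", "E112.", "Eu32."]   # static baseline ordering

freq = Counter(usage)
velocity_ordered = sorted(picking_list, key=lambda c: -freq[c])
```

Because `freq` differs between practices, the same clinical system can present the same codes in different orders, which is exactly the between-practice inconsistency the paper reports.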

  15. 17 CFR 229.406 - (Item 406) Code of ethics.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 17 Commodity and Securities Exchanges 2 2010-04-01 2010-04-01 false (Item 406) Code of ethics. 229... 406) Code of ethics. (a) Disclose whether the registrant has adopted a code of ethics that applies to... code of ethics, explain why it has not done so. (b) For purposes of this Item 406, the term code of...

  16. Computer program for calculation of complex chemical equilibrium compositions and applications. Part 1: Analysis

    NASA Technical Reports Server (NTRS)

    Gordon, Sanford; Mcbride, Bonnie J.

    1994-01-01

    This report presents the latest in a number of versions of chemical equilibrium and applications programs developed at the NASA Lewis Research Center over more than 40 years. These programs have changed over the years to include additional features and improved calculation techniques and to take advantage of constantly improving computer capabilities. The minimization-of-free-energy approach to chemical equilibrium calculations has been used in all versions of the program since 1967. The two principal purposes of this report are presented in two parts. The first purpose, which is accomplished here in part 1, is to present in detail a number of topics of general interest in complex equilibrium calculations. These topics include mathematical analyses and techniques for obtaining chemical equilibrium; formulas for obtaining thermodynamic and transport mixture properties and thermodynamic derivatives; criteria for inclusion of condensed phases; calculations at a triple point; inclusion of ionized species; and various applications, such as constant-pressure or constant-volume combustion, rocket performance based on either a finite- or infinite-chamber-area model, shock wave calculations, and Chapman-Jouguet detonations. The second purpose of this report, to facilitate the use of the computer code, is accomplished in part 2, entitled 'Users Manual and Program Description'. Various aspects of the computer code are discussed, and a number of examples are given to illustrate its versatility.
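
For a single reaction, the free-energy minimum coincides with satisfying the equilibrium-constant relation, which a short numerical sketch can solve directly (a toy ideal-gas dissociation example at unit pressure, not the NASA program's multi-species minimization method):

```python
def dissociation_extent(K, tol=1e-12):
    """Solve 4*a^2 / (1 - a^2) = K for the dissociation extent a of the
    ideal-gas reaction A2 <-> 2A at unit pressure, by bisection.
    The left-hand side is monotone increasing on (0, 1)."""
    lo, hi = 0.0, 1.0 - 1e-15
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if 4 * mid * mid / (1 - mid * mid) < K:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

alpha = dissociation_extent(4.0)   # K = 4 gives a^2 = 1 - a^2
```

With many simultaneous species and condensed phases this root-finding view becomes unwieldy, which is why the report's programs minimize the Gibbs free energy directly.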

  17. User's manual for Axisymmetric Diffuser Duct (ADD) code. Volume 1: General ADD code description

    NASA Technical Reports Server (NTRS)

    Anderson, O. L.; Hankins, G. B., Jr.; Edwards, D. E.

    1982-01-01

    This User's Manual contains a complete description of the computer codes known as the AXISYMMETRIC DIFFUSER DUCT code or ADD code. It includes a list of references which describe the formulation of the ADD code and comparisons of calculations with experimental flows. The input/output and general use of the code are described in the first volume. The second volume contains a detailed description of the code, including the global structure of the code, a list of FORTRAN variables, and descriptions of the subroutines. The third volume contains a detailed description of the CODUCT code, which generates coordinate systems for arbitrary axisymmetric ducts.

  18. 1 CFR 5.5 - Supplement to the Code of Federal Regulations.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 1 General Provisions 1 2010-01-01 2010-01-01 false Supplement to the Code of Federal Regulations. 5.5 Section 5.5 General Provisions ADMINISTRATIVE COMMITTEE OF THE FEDERAL REGISTER THE FEDERAL REGISTER GENERAL § 5.5 Supplement to the Code of Federal Regulations. The Federal Register serves as a...

  19. 1 CFR 5.5 - Supplement to the Code of Federal Regulations.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 1 General Provisions 1 2011-01-01 2011-01-01 false Supplement to the Code of Federal Regulations. 5.5 Section 5.5 General Provisions ADMINISTRATIVE COMMITTEE OF THE FEDERAL REGISTER THE FEDERAL REGISTER GENERAL § 5.5 Supplement to the Code of Federal Regulations. The Federal Register serves as a...

  20. 1 CFR 5.5 - Supplement to the Code of Federal Regulations.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 1 General Provisions 1 2014-01-01 2012-01-01 true Supplement to the Code of Federal Regulations. 5.5 Section 5.5 General Provisions ADMINISTRATIVE COMMITTEE OF THE FEDERAL REGISTER THE FEDERAL REGISTER GENERAL § 5.5 Supplement to the Code of Federal Regulations. The Federal Register serves as a...

  1. 1 CFR 5.5 - Supplement to the Code of Federal Regulations.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 1 General Provisions 1 2013-01-01 2012-01-01 true Supplement to the Code of Federal Regulations. 5.5 Section 5.5 General Provisions ADMINISTRATIVE COMMITTEE OF THE FEDERAL REGISTER THE FEDERAL REGISTER GENERAL § 5.5 Supplement to the Code of Federal Regulations. The Federal Register serves as a...

  2. 1 CFR 5.5 - Supplement to the Code of Federal Regulations.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 1 General Provisions 1 2012-01-01 2012-01-01 false Supplement to the Code of Federal Regulations. 5.5 Section 5.5 General Provisions ADMINISTRATIVE COMMITTEE OF THE FEDERAL REGISTER THE FEDERAL REGISTER GENERAL § 5.5 Supplement to the Code of Federal Regulations. The Federal Register serves as a...

  3. Survey of computer programs for heat transfer analysis

    NASA Technical Reports Server (NTRS)

    Noor, Ahmed K.

    1986-01-01

    An overview is given of the current capabilities of thirty-three computer programs that are used to solve heat transfer problems. The programs considered range from large general-purpose codes with a broad spectrum of capabilities, a large user community, and comprehensive user support (e.g., ABAQUS, ANSYS, EAL, MARC, MITAS II, MSC/NASTRAN, and SAMCEF) to small, special-purpose codes with limited user communities such as ANDES, NTEMP, TAC2D, TAC3D, TEPSA and TRUMP. The majority of the programs use either finite elements or finite differences for the spatial discretization. The capabilities of the programs are listed in tabular form, followed by a summary of the major features of each program. The information presented herein is based on a questionnaire sent to the developers of each program. This information is preceded by brief background material needed for effective evaluation and use of computer programs for heat transfer analysis. The present survey is useful in the initial selection of the programs which are most suitable for a particular application. The final selection of the program to be used should, however, be based on a detailed examination of the documentation and the literature about the program.

  4. Toward a Probabilistic Automata Model of Some Aspects of Code-Switching.

    ERIC Educational Resources Information Center

    Dearholt, D. W.; Valdes-Fallis, G.

    1978-01-01

    The purpose of the model is to select either Spanish or English as the language to be used; its goals at this stage of development include modeling code-switching for lexical need, apparently random code-switching, dependency of code-switching upon sociolinguistic context, and code-switching within syntactic constraints. (EJS)

  5. Multibody dynamics model building using graphical interfaces

    NASA Technical Reports Server (NTRS)

    Macala, Glenn A.

    1989-01-01

    In recent years, the extremely laborious task of manually deriving equations of motion for the simulation of multibody spacecraft dynamics has largely been eliminated. Instead, the dynamicist now works with commonly available general purpose dynamics simulation programs which generate the equations of motion either explicitly or implicitly via computer codes. The user interface to these programs has predominantly been via input data files, each with its own required format and peculiarities, causing errors and frustration during program setup. Recent progress in a more natural method of data input for dynamics programs, the graphical interface, is described.

  6. [Features of PHITS and its application to medical physics].

    PubMed

    Hashimoto, Shintaro; Niita, Koji; Matsuda, Norihiro; Iwamoto, Yosuke; Iwase, Hiroshi; Sato, Tatsuhiko; Noda, Shusaku; Ogawa, Tatsuhiko; Nakashima, Hiroshi; Fukahori, Tokio; Furuta, Takuya; Chiba, Satoshi

    2013-01-01

    PHITS is a general-purpose Monte Carlo particle transport simulation code that analyzes the transport in three-dimensional phase space and collisions of nearly all particles, including heavy ions, over a wide energy range up to 100 GeV/u. Various quantities, such as particle fluence and deposited energies in materials, can be deduced using estimator functions called "tallies". Recently, a microdosimetric tally function was also developed to apply PHITS to medical physics. Owing to these features, PHITS has been used for medical applications such as radiation therapy and protection.

  7. General Reevaluation and Supplement to Environmental Impact Statement for Flood Control and Related Purposes. Red and Red Lake Rivers at East Grand Forks, Minnesota.

    DTIC Science & Technology

    1984-11-01

    The city has also entered the regular phase of the National Flood Insurance Program, adopted 23 September 1977. Possible release sites during periods of low flow lie outside the area of city control/responsibility; the Red Lake Watershed District has a current program.

  8. Evaluation of a Progressive Failure Analysis Methodology for Laminated Composite Structures

    NASA Technical Reports Server (NTRS)

    Sleight, David W.; Knight, Norman F., Jr.; Wang, John T.

    1997-01-01

    A progressive failure analysis methodology has been developed for predicting the nonlinear response and failure of laminated composite structures. The progressive failure analysis uses C plate and shell elements based on classical lamination theory to calculate the in-plane stresses. Several failure criteria, including the maximum strain criterion, Hashin's criterion, and Christensen's criterion, are used to predict the failure mechanisms. The progressive failure analysis model is implemented into a general purpose finite element code and can predict the damage and response of laminated composite structures from initial loading to final failure.
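
The maximum strain criterion mentioned above reduces to componentwise comparisons of ply strains against allowables; a minimal sketch (all strain and allowable values are hypothetical):

```python
def max_strain_failed(strains, allowables):
    """Maximum strain criterion: a ply fails when any strain component
    exceeds its allowable in tension or compression; in-plane shear is
    checked by magnitude. Returns the list of failed components."""
    failed = []
    for name, tens_key, comp_key in (("e1", "e1t", "e1c"),
                                     ("e2", "e2t", "e2c")):
        e = strains[name]
        if e > allowables[tens_key] or e < -allowables[comp_key]:
            failed.append(name)
    if abs(strains["g12"]) > allowables["g12"]:
        failed.append("g12")
    return failed

allow = {"e1t": 0.010, "e1c": 0.008, "e2t": 0.004, "e2c": 0.015,
         "g12": 0.012}
ply = {"e1": 0.002, "e2": 0.005, "g12": 0.003}   # transverse strain too high
modes = max_strain_failed(ply, allow)
```

In a progressive failure analysis this check runs ply by ply after each load step, and failed components trigger stiffness degradation before the load is increased.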

  9. Evolvix BEST Names for semantic reproducibility across code2brain interfaces.

    PubMed

    Loewe, Laurence; Scheuer, Katherine S; Keel, Seth A; Vyas, Vaibhav; Liblit, Ben; Hanlon, Bret; Ferris, Michael C; Yin, John; Dutra, Inês; Pietsch, Anthony; Javid, Christine G; Moog, Cecilia L; Meyer, Jocelyn; Dresel, Jerdon; McLoone, Brian; Loberger, Sonya; Movaghar, Arezoo; Gilchrist-Scott, Morgaine; Sabri, Yazeed; Sescleifer, Dave; Pereda-Zorrilla, Ivan; Zietlow, Andrew; Smith, Rodrigo; Pietenpol, Samantha; Goldfinger, Jacob; Atzen, Sarah L; Freiberg, Erika; Waters, Noah P; Nusbaum, Claire; Nolan, Erik; Hotz, Alyssa; Kliman, Richard M; Mentewab, Ayalew; Fregien, Nathan; Loewe, Martha

    2017-01-01

    Names in programming are vital for understanding the meaning of code and big data. We define code2brain (C2B) interfaces as maps in compilers and brains between meaning and naming syntax, which help to understand executable code. While working toward an Evolvix syntax for general-purpose programming that makes accurate modeling easy for biologists, we observed how names affect C2B quality. To protect learning and coding investments, C2B interfaces require long-term backward compatibility and semantic reproducibility (accurate reproduction of computational meaning from coder-brains to reader-brains by code alone). Semantic reproducibility is often assumed until confusing synonyms degrade modeling in biology to deciphering exercises. We highlight empirical naming priorities from diverse individuals and roles of names in different modes of computing to show how naming easily becomes impossibly difficult. We present the Evolvix BEST (Brief, Explicit, Summarizing, Technical) Names concept for reducing naming priority conflicts, test it on a real challenge by naming subfolders for the Project Organization Stabilizing Tool system, and provide naming questionnaires designed to facilitate C2B debugging by improving names used as keywords in a stabilizing programming language. Our experiences inspired us to develop Evolvix using a flipped programming language design approach with some unexpected features and BEST Names at its core. © 2016 The Authors. Annals of the New York Academy of Sciences published by Wiley Periodicals, Inc. on behalf of New York Academy of Sciences.

  10. PAL: an object-oriented programming library for molecular evolution and phylogenetics.

    PubMed

    Drummond, A; Strimmer, K

    2001-07-01

    Phylogenetic Analysis Library (PAL) is a collection of Java classes for use in molecular evolution and phylogenetics. PAL provides a modular environment for the rapid construction of both special-purpose and general analysis programs. PAL version 1.1 consists of 145 public classes or interfaces in 13 packages, including classes for models of character evolution, maximum-likelihood estimation, and the coalescent, with a total of more than 27,000 lines of code. The PAL project is set up as a collaborative project to facilitate contributions from other researchers. AVAILABILITY: The program is free and is available at http://www.pal-project.org. It requires Java 1.1 or later. PAL is licensed under the GNU General Public License.

  11. CoCoNuT: General relativistic hydrodynamics code with dynamical space-time evolution

    NASA Astrophysics Data System (ADS)

    Dimmelmeier, Harald; Novak, Jérôme; Cerdá-Durán, Pablo

    2012-02-01

    CoCoNuT is a general relativistic hydrodynamics code with dynamical space-time evolution. The main aim of this numerical code is the study of several astrophysical scenarios in which general relativity can play an important role, namely the collapse of rapidly rotating stellar cores and the evolution of isolated neutron stars. The code has two flavors: CoCoA, the axisymmetric (2D) magnetized version, and CoCoNuT, the 3D non-magnetized version.

  12. Syndrome source coding and its universal generalization

    NASA Technical Reports Server (NTRS)

    Ancheta, T. C., Jr.

    1975-01-01

A method of using error-correcting codes to obtain data compression, called syndrome-source-coding, is described in which the source sequence is treated as an error pattern whose syndrome forms the compressed data. It is shown that syndrome-source-coding can achieve arbitrarily small distortion with the number of compressed digits per source digit arbitrarily close to the entropy of a binary memoryless source. A universal generalization of syndrome-source-coding is formulated which provides robustly effective, distortionless coding of source ensembles.
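The construction can be illustrated with a small hypothetical example: using the parity-check matrix of a Hamming(7,4) code, each 7-bit source block is compressed to its 3-bit syndrome, and decompression returns the minimum-weight (coset-leader) pattern with that syndrome. This is a sketch of the general idea only, not the paper's scheme; recovery is exact exactly when the source block is a coset leader, i.e., sufficiently sparse.

```python
# Sketch of syndrome source coding with a Hamming(7,4) code: compress a
# 7-bit source block x to its 3-bit syndrome s = H x (mod 2); decompress
# by returning the minimum-weight (coset-leader) pattern with syndrome s.
from itertools import product

# Parity-check matrix of Hamming(7,4): column j is the binary expansion of j.
H = [[0, 0, 0, 1, 1, 1, 1],
     [0, 1, 1, 0, 0, 1, 1],
     [1, 0, 1, 0, 1, 0, 1]]

def syndrome(block):
    """Compress a 7-bit block to its 3-bit syndrome."""
    return tuple(sum(h * b for h, b in zip(row, block)) % 2 for row in H)

# Coset leaders: the first (hence minimum-weight) pattern seen per syndrome.
leaders = {}
for pattern in sorted(product([0, 1], repeat=7), key=sum):
    leaders.setdefault(syndrome(pattern), list(pattern))

def decompress(s):
    return leaders[s]

# A sparse source block is recovered exactly: it is its own coset leader.
x = [0, 0, 0, 0, 1, 0, 0]
assert decompress(syndrome(x)) == x
```

For a sparse binary source, the 7-to-3 compression is lossless on all blocks of weight 0 or 1, which is where most of the probability mass of a low-entropy memoryless source lies.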

  13. Tuning iteration space slicing based tiled multi-core code implementing Nussinov's RNA folding.

    PubMed

    Palkowski, Marek; Bielecki, Wlodzimierz

    2018-01-15

RNA folding is an ongoing compute-intensive task of bioinformatics. Parallelization and improving code locality for this kind of algorithm are among the most relevant areas in computational biology. Fortunately, RNA secondary structure approaches, such as Nussinov's recurrence, involve mathematical operations over affine control loops whose iteration space can be represented by the polyhedral model. This allows us to apply powerful polyhedral compilation techniques based on the transitive closure of dependence graphs to generate parallel tiled code implementing Nussinov's RNA folding. Such techniques fall within the iteration space slicing framework: the transitive dependences are applied to the statement instances of interest to produce valid tiles. The main problem in generating parallel tiled code is defining a proper tile size and tile dimension, which impact the degree of parallelism and code locality. To choose the best tile size and tile dimension, we first construct parallel parametric tiled code (the parameters are variables defining tile size). For this purpose, we generate two nonparametric tiled codes with different fixed tile sizes but the same code structure, and then derive a general affine model describing all integer factors available in the expressions of those codes. Using this model and the known integer factors present in those expressions (they define the left-hand side of the model), we solve for the unknown integers for each integer factor occurring at the same position in the fixed tiled code, and replace expressions involving integer factors with expressions involving parameters. We then use this parallel parametric tiled code to implement the well-known tile size selection (TSS) technique, which allows us to discover, within a given search space, the best tile size and tile dimension maximizing target code performance.
For a given search space, the presented approach allows us to choose the best tile size and tile dimension in parallel tiled code implementing Nussinov's RNA folding. Experimental results, obtained on modern Intel multi-core processors, demonstrate that this code outperforms known closely related implementations when the length of the RNA strands exceeds 2500.
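For reference, the serial, untiled baseline that such polyhedral transformations start from can be sketched as follows (a minimal textbook form of Nussinov's recurrence; names and the pairing rule are illustrative):

```python
# Nussinov's recurrence: N[i][j] is the maximum number of non-crossing base
# pairs in subsequence i..j. Either j is left unpaired, or j pairs with some
# k in i..j-1, splitting the problem into two independent subproblems.

def pairs(a, b):
    """1 if the two bases can pair (Watson-Crick or G-U wobble), else 0."""
    return 1 if {a, b} in ({'A', 'U'}, {'C', 'G'}, {'G', 'U'}) else 0

def nussinov(seq):
    n = len(seq)
    N = [[0] * n for _ in range(n)]
    for length in range(1, n):            # distance between i and j
        for i in range(n - length):
            j = i + length
            best = N[i][j - 1]            # j unpaired
            for k in range(i, j):         # j paired with k
                left = N[i][k - 1] if k > i else 0
                best = max(best, left + pairs(seq[k], seq[j]) + N[k + 1][j - 1])
            N[i][j] = best
    return N[0][n - 1] if n else 0

assert nussinov("GCGC") == 2  # pairs (G,C) nested as (0,3) and (1,2)
```

The triply nested affine loops over `length`, `i`, and `k` are exactly the iteration space the abstract's polyhedral tiling techniques operate on.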

  14. Empirically evaluating the WHO global code of practice on the international recruitment of health personnel's impact on four high-income countries four years after adoption.

    PubMed

    Tam, Vivian; Edge, Jennifer S; Hoffman, Steven J

    2016-10-12

Shortages of health workers in low-income countries are exacerbated by the international migration of health workers to more affluent countries. This problem is compounded by the active recruitment of health workers by destination countries, particularly Australia, Canada, UK and USA. The World Health Organization (WHO) adopted a voluntary Code of Practice in May 2010 to mitigate tensions between health workers' right to migrate and the shortage of health workers in source countries. The first empirical impact evaluation of this Code was conducted 11 months after its adoption and demonstrated a lack of impact on health workforce recruitment policy and practice in the short term. This second empirical impact evaluation was conducted 4 years post-adoption using the same methodology to determine whether there have been any changes in the perceived utility, applicability, and implementation of the Code in the medium term. Forty-four respondents representing government, civil society and the private sector from Australia, Canada, UK and USA completed an email-based survey evaluating their awareness of the Code, perceived impact, changes to policy or recruitment practices resulting from the Code, and the effectiveness of non-binding Codes generally. The same survey instrument from the original study was used to facilitate direct comparability of responses. Key lessons were identified through thematic analysis. The main findings of the initial impact evaluation and the current one are unchanged. Both sets of key informants reported no significant policy or regulatory changes to health worker recruitment in their countries as a direct result of the Code, due to its lack of incentives, institutional mechanisms and interest mobilizers. Participants emphasized the existence of previous bilateral and regional Codes, the WHO Code's non-binding nature, and the primacy of competing domestic healthcare priorities in explaining this perceived lack of impact.
The Code has probably still not produced the tangible improvements in health worker flows it aspired to achieve. Several actions, including a focus on developing bilateral codes, linking the Code to topical global priorities, and reframing the Code's purpose to emphasize health system sustainability, are proposed to improve the Code's uptake and impact.

  15. Measurement Requirements for Improved Modeling of Arcjet Facility Flows

    NASA Technical Reports Server (NTRS)

    Fletcher, Douglas G.

    2000-01-01

Current efforts to develop new reusable launch vehicles and to pursue low-cost robotic planetary missions have led to a renewed interest in understanding arc-jet flows. Part of this renewed interest is concerned with improving the understanding of arc-jet test results and the potential use of available computational-fluid-dynamic (CFD) codes to aid in this effort. These CFD codes have been extensively developed and tested for application to nonequilibrium, hypersonic flow modeling. It is envisioned, perhaps naively, that the application of these CFD codes to the simulation of arc-jet flows would serve two purposes: first, the codes would help to characterize the nonequilibrium nature of the arc-jet flows; and second, arc-jet experiments could potentially be used to validate the flow models. These two objectives are, to some extent, mutually exclusive. However, the purpose of the present discussion is to address what role CFD codes can play in the current arc-jet flow characterization effort, and whether or not the simulation of arc-jet facility tests can be used to evaluate some of the modeling that is used to formulate these codes. This presentation is organized into several sections. In the introductory section, the development of large-scale, constricted-arc test facilities within NASA is reviewed, and the current state of flow diagnostics using conventional instrumentation is summarized. The motivation for using CFD to simulate arc-jet flows is addressed in the next section, and the basic requirements for CFD models that would be used for these simulations are briefly discussed. This section is followed by a more detailed description of experimental measurements that are needed to initiate credible simulations and to evaluate their fidelity in the different flow regions of an arc-jet facility. 
Observations from a recent combined computational and experimental investigation of shock-layer flows in a large-scale arc-jet facility are then used to illustrate the current state of development of diagnostic instrumentation, CFD simulations, and general knowledge in the field of arc-jet characterization. Finally, the main points are summarized and recommendations for future efforts are given.

  16. Documentation of the GLAS fourth order general circulation model. Volume 2: Scalar code

    NASA Technical Reports Server (NTRS)

    Kalnay, E.; Balgovind, R.; Chao, W.; Edelmann, D.; Pfaendtner, J.; Takacs, L.; Takano, K.

    1983-01-01

Volume 2 of a three-volume technical memorandum contains detailed documentation of the GLAS fourth-order general circulation model. It presents the CYBER 205 scalar and vector codes of the model, a list of variables, and cross references. A variable-name dictionary for the scalar code and the code listings are also included.

  17. A Monte Carlo study on {sup 223}Ra imaging for unsealed radionuclide therapy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Takahashi, Akihiko, E-mail: takahsr@hs.med.kyushu-u.ac.jp; Miwa, Kenta; Sasaki, Masayuki

Purpose: Radium-223 ({sup 223}Ra), an α-emitting radionuclide, is used in unsealed radionuclide therapy for metastatic bone tumors. The demand for qualitative {sup 223}Ra imaging is growing to optimize dosimetry. The authors simulated {sup 223}Ra imaging using an in-house Monte Carlo simulation code and investigated the feasibility and utility of {sup 223}Ra imaging. Methods: The Monte Carlo code comprises two modules, HEXAGON and NAI. The HEXAGON code simulates the photon and electron interactions in the tissues and collimator, and the NAI code simulates the response of the NaI detector system. A 3D numeric phantom created using computed tomography images of a chest phantom was installed in the HEXAGON code. {sup 223}Ra accumulated in a part of the spine, and three x-rays and 19 γ rays between 80 and 450 keV were selected as the emitted photons. To evaluate the quality of the {sup 223}Ra imaging, the authors also simulated technetium-99m ({sup 99m}Tc) imaging under the same conditions and compared the results. Results: The sensitivities of the three photopeaks were 147 counts per unit of source activity (cps MBq{sup −1}; photopeak: 84 keV, full width of energy window: 20%), 166 cps MBq{sup −1} (154 keV, 15%), and 158 cps MBq{sup −1} (270 keV, 10%) for a low-energy general-purpose (LEGP) collimator, and those for the medium-energy general-purpose (MEGP) collimator were 33, 13, and 8.0 cps MBq{sup −1}, respectively. In the case of {sup 99m}Tc, the sensitivity was 55 cps MBq{sup −1} (141 keV, 20%) for LEGP and 52 cps MBq{sup −1} for MEGP. The fractions of unscattered photons of the total photons reflecting the image quality were 0.09 (84 keV), 0.03 (154 keV), and 0.02 (270 keV) for the LEGP collimator and 0.41, 0.25, and 0.50 for the MEGP collimator, respectively. Conversely, this fraction was approximately 0.65 for the simulated {sup 99m}Tc imaging. The sensitivity with the LEGP collimator appeared very high. 
However, almost all of the counts were due to photons that penetrated or were scattered in the collimator; therefore, the proportions of unscattered photons were small. Conclusions: This simulation study revealed that the most promising scheme for {sup 223}Ra imaging is an 84-keV window using an MEGP collimator. The sensitivity of the photopeaks above 100 keV is too low for {sup 223}Ra imaging. A comparison of the fractions of unscattered photons reveals that the sensitivity and image quality are approximately two-thirds of those for {sup 99m}Tc imaging.

  18. Calculation of Dose for Skyshine Radiation From a 45 MeV Electron LINAC

    NASA Astrophysics Data System (ADS)

    Hori, M.; Hikoji, M.; Takahashi, H.; Takahashi, K.; Kitaichi, M.; Sawamura, S.; Nojiri, I.

    1996-11-01

Dose estimation for skyshine plays an important role in the evaluation of the environment around nuclear facilities. We performed calculations of the skyshine radiation from the Hokkaido University 45 MeV linear accelerator using a general-purpose user's version of the EGS4 Monte Carlo code. To verify the accuracy of the code, the simulation results were compared with our experimental results, in which a gated counting method was used to measure low-level pulsed leakage radiation. In the experiment, measurements were carried out up to 600 m away from the LINAC. The simulation results are consistent with the experimental values at distances between 100 and 400 m from the LINAC. However, the agreement within 100 m of the LINAC is not as good because of the simplified geometrical modeling in the simulation. These results suggest that this version of the code is useful for skyshine calculations.

  19. Use of Spacecraft Command Language for Advanced Command and Control Applications

    NASA Technical Reports Server (NTRS)

    Mims, Tikiela L.

    2008-01-01

The purpose of this work is to evaluate the use of SCL in building and monitoring command and control applications in order to determine its fitness for space operations. Approximately 24,325 lines of PCG2 code were converted to SCL, yielding a 90% reduction in the number of lines of code, as many of the functions and scripts utilized in SCL could be ported and reused. Automated standalone testing, simulating the actual production environment, was performed in order to gauge the relative time it takes for SCL to update and write a given display. The use of SCL rules, functions, and scripts allowed the creation of several test cases measuring the time it takes to update a given set of measurements when a globally existing CUI changes. The SCL system took an average of 926.09 ticks to update the entire display of 323 measurements.

  20. Numerical simulation of flow path in the oxidizer side hot gas manifold of the Space Shuttle main engine

    NASA Technical Reports Server (NTRS)

    Lin, S. J.; Yang, R. J.; Chang, James L. C.; Kwak, D.

    1987-01-01

The purpose of this study is to examine in detail incompressible laminar and turbulent flows inside the oxidizer-side Hot Gas Manifold of the Space Shuttle Main Engine. To perform this study, an implicit finite difference code cast in general curvilinear coordinates was further developed. The code is based on the method of pseudo-compressibility and utilizes an ADI (implicit approximate factorization) algorithm to achieve computational efficiency. A multiple-zone method was developed to overcome the complexity of the geometry. In the present study, the laminar and turbulent flows in the oxidizer-side Hot Gas Manifold were computed. The study reveals that: (1) large recirculation zones exist inside the bowl if no vanes are present; (2) strong secondary flows are observed in the transfer tube; and (3) properly shaped and positioned guide vanes are effective in eliminating flow separation.

  1. Nonlinear analysis for high-temperature multilayered fiber composite structures. M.S. Thesis; [turbine blades

    NASA Technical Reports Server (NTRS)

    Hopkins, D. A.

    1984-01-01

A unique upward-integrated, top-down-structured approach is presented for nonlinear analysis of high-temperature multilayered fiber composite structures. Based on this approach, a special-purpose computer code (nonlinear COBSTRAN) was developed that is specifically tailored for the nonlinear analysis of tungsten-fiber-reinforced superalloy (TFRS) composite turbine blade/vane components of gas turbine engines. Special features of this computational capability include accounting for micro- and macro-heterogeneity, nonlinear (stress-temperature-time dependent) and anisotropic material behavior, and fiber degradation. A demonstration problem is presented to manifest the utility of the upward-integrated, top-down-structured approach in general, and to illustrate the present capability represented by the nonlinear COBSTRAN code. Preliminary results indicate that nonlinear COBSTRAN provides the means for relating the local nonlinear and anisotropic material behavior of the composite constituents to the global response of the turbine blade/vane structure.

  2. Revised catalog of types of CODES applications implemented using linked state data : crash outcome data evaluation system (CODES)

    DOT National Transportation Integrated Search

    2000-06-01

    The purpose of the Revised Catalog of Types of CODES Applications Implemented Using Linked : State Data (CODES) is to inspire the development of new applications for linked data that support : efforts to reduce death, disability, severity, and health...

  3. Syndrome-source-coding and its universal generalization. [error correcting codes for data compression

    NASA Technical Reports Server (NTRS)

    Ancheta, T. C., Jr.

    1976-01-01

    A method of using error-correcting codes to obtain data compression, called syndrome-source-coding, is described in which the source sequence is treated as an error pattern whose syndrome forms the compressed data. It is shown that syndrome-source-coding can achieve arbitrarily small distortion with the number of compressed digits per source digit arbitrarily close to the entropy of a binary memoryless source. A 'universal' generalization of syndrome-source-coding is formulated which provides robustly effective distortionless coding of source ensembles. Two examples are given, comparing the performance of noiseless universal syndrome-source-coding to (1) run-length coding and (2) Lynch-Davisson-Schalkwijk-Cover universal coding for an ensemble of binary memoryless sources.

  4. A General-Purpose Optimization Engine for Multi-Disciplinary Design Applications

    NASA Technical Reports Server (NTRS)

    Patnaik, Surya N.; Hopkins, Dale A.; Berke, Laszlo

    1996-01-01

    A general purpose optimization tool for multidisciplinary applications, which in the literature is known as COMETBOARDS, is being developed at NASA Lewis Research Center. The modular organization of COMETBOARDS includes several analyzers and state-of-the-art optimization algorithms along with their cascading strategy. The code structure allows quick integration of new analyzers and optimizers. The COMETBOARDS code reads input information from a number of data files, formulates a design as a set of multidisciplinary nonlinear programming problems, and then solves the resulting problems. COMETBOARDS can be used to solve a large problem which can be defined through multiple disciplines, each of which can be further broken down into several subproblems. Alternatively, a small portion of a large problem can be optimized in an effort to improve an existing system. Some of the other unique features of COMETBOARDS include design variable formulation, constraint formulation, subproblem coupling strategy, global scaling technique, analysis approximation, use of either sequential or parallel computational modes, and so forth. The special features and unique strengths of COMETBOARDS assist convergence and reduce the amount of CPU time used to solve the difficult optimization problems of aerospace industries. COMETBOARDS has been successfully used to solve a number of problems, including structural design of space station components, design of nozzle components of an air-breathing engine, configuration design of subsonic and supersonic aircraft, mixed flow turbofan engines, wave rotor topped engines, and so forth. This paper introduces the COMETBOARDS design tool and its versatility, which is illustrated by citing examples from structures, aircraft design, and air-breathing propulsion engine design.

  5. Development of a GPU-Accelerated 3-D Full-Wave Code for Electromagnetic Wave Propagation in a Cold Plasma

    NASA Astrophysics Data System (ADS)

    Woodbury, D.; Kubota, S.; Johnson, I.

    2014-10-01

    Computer simulations of electromagnetic wave propagation in magnetized plasmas are an important tool for both plasma heating and diagnostics. For active millimeter-wave and microwave diagnostics, accurately modeling the evolution of the beam parameters for launched, reflected or scattered waves in a toroidal plasma requires that calculations be done using the full 3-D geometry. Previously, we reported on the application of GPGPU (General-Purpose computing on Graphics Processing Units) to a 3-D vacuum Maxwell code using the FDTD (Finite-Difference Time-Domain) method. Tests were done for Gaussian beam propagation with a hard source antenna, utilizing the parallel processing capabilities of the NVIDIA K20M. In the current study, we have modified the 3-D code to include a soft source antenna and an induced current density based on the cold plasma approximation. Results from Gaussian beam propagation in an inhomogeneous anisotropic plasma, along with comparisons to ray- and beam-tracing calculations will be presented. Additional enhancements, such as advanced coding techniques for improved speedup, will also be investigated. Supported by U.S. DoE Grant DE-FG02-99-ER54527 and in part by the U.S. DoE, Office of Science, WDTS under the Science Undergraduate Laboratory Internship program.

  6. Projected impact of the ICD-10-CM/PCS conversion on longitudinal data and the Joint Commission Core Measures.

    PubMed

    Fenton, Susan H; Benigni, Mary Sue

    2014-01-01

The transition from ICD-9-CM to ICD-10-CM/PCS is expected to result in longitudinal data discontinuities, as occurred with cause-of-death coding in 1999. The General Equivalence Maps (GEMs), while useful for suggesting potential maps, do not provide guidance regarding the frequency of any matches. Longitudinal data comparisons can only be reliable if they use comparability ratios or factors that have been calculated using records coded in both classification systems. This study utilized 3,969 de-identified dually coded records to examine raw comparability ratios, as well as the comparability ratios between the Joint Commission Core Measures. The raw comparability factor results range from 16.216 for Nicotine dependence, unspecified, uncomplicated to 118.009 for Chronic obstructive pulmonary disease, unspecified. The Joint Commission Core Measure comparability factor results range from 27.15 for Acute Respiratory Failure to 130.16 for Acute Myocardial Infarction. These results indicate significant differences in comparability between ICD-9-CM and ICD-10-CM code assignment, including when the codes are used for external reporting such as the Joint Commission Core Measures. To prevent errors in decision-making and reporting, all stakeholders relying on longitudinal data for measure reporting and other purposes should investigate the impact of the conversion on their data.
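The comparability-factor computation described above can be sketched as follows. The records and the example codes here are toy data for illustration only, not the study's 3,969 dually coded records; the factor is simply the ICD-10 count divided by the ICD-9 count for the same condition, scaled by 100.

```python
# Sketch of a comparability factor from dually coded records: each record
# carries the codes assigned under both classification systems for the same
# encounter, and the factor compares how often the condition is captured.

def comparability_factor(records, icd9_code, icd10_code):
    """100 * (records assigned the ICD-10 code) / (records assigned the ICD-9 code)."""
    icd9_hits = sum(icd9_code in r["icd9"] for r in records)
    icd10_hits = sum(icd10_code in r["icd10"] for r in records)
    if icd9_hits == 0:
        raise ValueError("no ICD-9-CM baseline for this condition")
    return 100.0 * icd10_hits / icd9_hits

# Toy dually coded records (code sets per record; COPD codes for illustration).
records = [
    {"icd9": {"496"}, "icd10": {"J44.9"}},  # captured in both systems
    {"icd9": {"496"}, "icd10": {"J44.9"}},
    {"icd9": {"496"}, "icd10": set()},      # dropped at ICD-10 coding
    {"icd9": set(),   "icd10": {"J44.9"}},  # picked up only in ICD-10
]
assert comparability_factor(records, "496", "J44.9") == 100.0  # 3 / 3 * 100
```

A factor well above or below 100, as in the study's range of 16.216 to 118.009, signals that the two systems do not capture the condition at comparable rates.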

  7. Coordination analysis of players' distribution in football using cross-correlation and vector coding techniques.

    PubMed

    Moura, Felipe Arruda; van Emmerik, Richard E A; Santana, Juliana Exel; Martins, Luiz Eduardo Barreto; Barros, Ricardo Machado Leite de; Cunha, Sergio Augusto

    2016-12-01

The purpose of this study was to investigate the coordination between the two teams' spread during football matches using cross-correlation and vector coding techniques. Using a video-based tracking system, we obtained the trajectories of 257 players during 10 matches. Team spread was calculated as a function of time. For a general description of coordination, we calculated the cross-correlation between the signals. Vector coding was used to identify the coordination patterns between teams during offensive sequences that ended in shots on goal or defensive tackles. Cross-correlation showed that opponent teams have a tendency to present in-phase coordination, with a short time lag. During offensive sequences, vector coding results showed that, although in-phase coordination dominated, other patterns were observed. We verified that during the early stages, offensive sequences ending in shots on goal present greater anti-phase and attacking-team-phase periods than sequences ending in tackles. The results suggest that the attacking team may seek to behave contrary to its opponent (or may lead the adversary's behaviour) at the beginning of the attacking play, with regard to distribution strategy, to increase the chances of a shot on goal. The techniques allowed detecting the coordination patterns between teams, providing additional information about football dynamics and players' interaction.

  8. 76 FR 19971 - Notice of Proposed Changes to the National Handbook of Conservation Practices for the Natural...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-04-11

    ... 344), Silvopasture Establishment (Code 381), Tree/Shrub Establishment (Code 612), Waste Recycling... Criteria were added. Tree/Shrub Establishment (Code 612)--A new Purpose of ``Develop Renewable Energy...

  9. Non-White, No More: Effect Coding as an Alternative to Dummy Coding with Implications for Higher Education Researchers

    ERIC Educational Resources Information Center

    Mayhew, Matthew J.; Simonoff, Jeffrey S.

    2015-01-01

    The purpose of this article is to describe effect coding as an alternative quantitative practice for analyzing and interpreting categorical, race-based independent variables in higher education research. Unlike indicator (dummy) codes that imply that one group will be a reference group, effect codes use average responses as a means for…
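The contrast between the two schemes can be sketched with hypothetical group labels (the article concerns race-based variables; the categories below are placeholders):

```python
# Dummy (indicator) coding vs. effect coding for a categorical variable.

def dummy_code(levels, reference):
    """Indicator coding: the reference group is all zeros, so every
    regression coefficient is a contrast against that reference group."""
    others = [g for g in levels if g != reference]
    return {g: [1 if g == o else 0 for o in others] for g in levels}

def effect_code(levels, holdout):
    """Effect coding: the held-out group is coded -1 on every column, so
    coefficients are deviations from the (unweighted) grand mean and no
    single group serves as an implicit baseline."""
    others = [g for g in levels if g != holdout]
    return {g: ([-1] * len(others) if g == holdout
                else [1 if g == o else 0 for o in others])
            for g in levels}

groups = ["GroupA", "GroupB", "GroupC"]
assert dummy_code(groups, reference="GroupC")["GroupC"] == [0, 0]
assert effect_code(groups, holdout="GroupC")["GroupC"] == [-1, -1]
```

The rows differ only for the last group: all-zero under dummy coding (making it the reference) versus all minus-one under effect coding, which is what shifts the interpretation of the intercept from a reference-group mean to the grand mean.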

  10. Technical Note: Defining cyclotron-based clinical scanning proton machines in a FLUKA Monte Carlo system.

    PubMed

    Fiorini, Francesca; Schreuder, Niek; Van den Heuvel, Frank

    2018-02-01

Cyclotron-based pencil beam scanning (PBS) proton machines represent nowadays the majority and most affordable choice for proton therapy facilities; however, their representation in Monte Carlo (MC) codes is more complex than for passively scattered systems or synchrotron-based PBS machines. This is because degraders are used to decrease the energy from the cyclotron maximum energy to the desired energy, resulting in a unique spot size, divergence, and energy spread depending on the amount of degradation. This manuscript outlines a generalized methodology to characterize a cyclotron-based PBS machine in a general-purpose MC code. The code can then be used to generate clinically relevant plans starting from commercial TPS plans. The described beam is produced at the Provision Proton Therapy Center (Knoxville, TN, USA) using a cyclotron-based IBA Proteus Plus equipment. We characterized the Provision beam in the MC code FLUKA using the experimental commissioning data. The code was then validated using experimental data in water phantoms for single pencil beams and larger irregular fields. Comparisons with RayStation TPS plans are also presented. Comparisons of experimental, simulated, and planned dose depositions in water plans show that the same doses are calculated by both programs inside the target areas, while penumbra differences are found at the field edges. These differences are lower for the MC, with a γ(3%-3 mm) index never below 95%. Extensive explanations of how MC codes can be adapted to simulate cyclotron-based scanning proton machines are given with the aim of using the MC as a TPS verification tool to check and improve clinical plans. For all the tested cases, we showed that dose differences with experimental data are lower for the MC than the TPS, implying that the created FLUKA beam model is better able to describe the experimental beam. © 2017 The Authors. Medical Physics published by Wiley Periodicals, Inc. on behalf of the American Association of Physicists in Medicine.

  11. High Order Modulation Protograph Codes

    NASA Technical Reports Server (NTRS)

    Nguyen, Thuy V. (Inventor); Nosratinia, Aria (Inventor); Divsalar, Dariush (Inventor)

    2014-01-01

Digital communication coding methods are presented for designing protograph-based bit-interleaved coded modulation that is general and applies to any modulation. The general coding framework can support not only multiple rates but also adaptive modulation. The method is a two-stage lifting approach. In the first stage, an original protograph is lifted to a slightly larger intermediate protograph. The intermediate protograph is then lifted via a circulant matrix to the expected codeword length to form a protograph-based low-density parity-check code.
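The circulant lifting stage can be sketched as follows. The protomatrix and shift values below are arbitrary illustrations, not the patented designs: each 1 in the base matrix is replaced by a Z x Z circulant permutation matrix and each 0 by the Z x Z zero matrix, producing the full parity-check matrix of the lifted LDPC code.

```python
# Sketch of lifting a protograph via circulant permutation matrices.

def circulant(Z, shift):
    """Z x Z identity matrix cyclically shifted right by `shift` columns."""
    return [[1 if (r + shift) % Z == c else 0 for c in range(Z)]
            for r in range(Z)]

def lift(proto, shifts, Z):
    """Expand a binary protomatrix into the lifted parity-check matrix H:
    each 1 becomes circulant(Z, shift), each 0 becomes the zero block."""
    zero = [[0] * Z for _ in range(Z)]
    H = []
    for i, row in enumerate(proto):
        blocks = [circulant(Z, shifts[i][j]) if v else zero
                  for j, v in enumerate(row)]
        for r in range(Z):
            H.append([b[r][c] for b in blocks for c in range(Z)])
    return H

proto = [[1, 1, 0],   # 2 x 3 base protograph (illustrative)
         [0, 1, 1]]
shifts = [[0, 1, 0],  # per-entry circulant shifts (illustrative)
          [0, 2, 3]]
H = lift(proto, shifts, Z=4)
assert len(H) == 8 and len(H[0]) == 12  # (2*Z) x (3*Z)
```

Because each nonzero block is a permutation, every lifted row keeps the weight of its proto row, preserving the degree distribution of the protograph.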

  12. DXRaySMCS: a user-friendly interface developed for prediction of diagnostic radiology X-ray spectra produced by Monte Carlo (MCNP-4C) simulation.

    PubMed

    Bahreyni Toossi, M T; Moradi, H; Zare, H

    2008-01-01

In this work, the general-purpose Monte Carlo N-particle radiation transport computer code (MCNP-4C) was used for the simulation of X-ray spectra in diagnostic radiology. The electron's path in the target was followed until its energy was reduced to 10 keV. A user-friendly interface named 'diagnostic X-ray spectra by Monte Carlo simulation (DXRaySMCS)' was developed to facilitate the application of the MCNP-4C code for diagnostic radiology spectrum prediction. The program provides a user-friendly interface for: (i) modifying the MCNP input file, (ii) launching the MCNP program to simulate electron and photon transport and (iii) processing the MCNP output file to yield a summary of the results (relative photon number per energy bin). In this article, the development and characteristics of DXRaySMCS are outlined. As part of the validation process, output spectra for 46 diagnostic radiology system settings produced by DXRaySMCS were compared with the corresponding IPEM78 spectra. Generally, there is good agreement between the two sets of spectra. No statistically significant differences have been observed between the IPEM78 reported spectra and the simulated spectra generated in this study.

  13. DRA/NASA/ONERA Collaboration on Icing Research. Part 2; Prediction of Airfoil Ice Accretion

    NASA Technical Reports Server (NTRS)

    Wright, William B.; Gent, R. W.; Guffond, Didier

    1997-01-01

    This report presents results from a joint study by DRA, NASA, and ONERA for the purpose of comparing, improving, and validating the aircraft icing computer codes developed by each agency. These codes are of three kinds: (1) water droplet trajectory prediction, (2) ice accretion modeling, and (3) transient electrothermal deicer analysis. In this joint study, the agencies compared their code predictions with each other and with experimental results. These comparison exercises were published in three technical reports, each with joint authorship. DRA published and had first authorship of Part 1 - Droplet Trajectory Calculations, NASA of Part 2 - Ice Accretion Prediction, and ONERA of Part 3 - Electrothermal Deicer Analysis. The results cover work done during the period from August 1986 to late 1991. As a result, all of the information in this report is dated. Where necessary, current information is provided to show the direction of current research. In this present report on ice accretion, each agency predicted ice shapes on two dimensional airfoils under icing conditions for which experimental ice shapes were available. In general, all three codes did a reasonable job of predicting the measured ice shapes. For any given experimental condition, one of the three codes predicted the general ice features (i.e., shape, impingement limits, mass of ice) somewhat better than did the other two. However, no single code consistently did better than the other two over the full range of conditions examined, which included rime, mixed, and glaze ice conditions. In several of the cases, DRA showed that the user's knowledge of icing can significantly improve the accuracy of the code prediction. Rime ice predictions were reasonably accurate and consistent among the codes, because droplets freeze on impact and the freezing model is simple. 
Glaze ice predictions were less accurate and less consistent among the codes, because the freezing model is more complex and is critically dependent upon unsubstantiated heat transfer and surface roughness models. Thus, heat transfer prediction methods used in the codes became the subject of a separate study in this report to compare predicted heat transfer coefficients with a limited experimental database of heat transfer coefficients for cylinders with simulated glaze and rime ice shapes. The codes did a good job of predicting heat transfer coefficients near the stagnation region of the ice shapes. But in the region of the ice horns, all three codes predicted heat transfer coefficients considerably higher than the measured values. An important conclusion of this study is that further research is needed to understand the finer details of the glaze ice accretion process and to develop improved glaze ice accretion models.

  14. PD5: a general purpose library for primer design software.

    PubMed

    Riley, Michael C; Aubrey, Wayne; Young, Michael; Clare, Amanda

    2013-01-01

Complex PCR applications for large genome-scale projects require fast, reliable and often highly sophisticated primer design software applications. Presently, such applications use pipelining methods to utilise many third party applications and this involves file parsing, interfacing and data conversion, which is slow and prone to error. A fully integrated suite of software tools for primer design would considerably improve the development time, the processing speed, and the reliability of bespoke primer design software applications. The PD5 software library is an open-source collection of classes and utilities, providing a complete collection of software building blocks for primer design and analysis. It is written in object-oriented C++ with an emphasis on classes suitable for efficient and rapid development of bespoke primer design programs. The modular design of the software library simplifies the development of specific applications and also integration with existing third party software where necessary. We demonstrate several applications created using this software library that have already proved to be effective, but we view the project as a dynamic environment for building primer design software and it is open for future development by the bioinformatics community. Therefore, the PD5 software library is published under the terms of the GNU General Public License, which guarantees access to the source code and allows redistribution and modification. The PD5 software library is downloadable from Google Code and the accompanying Wiki includes instructions and examples: http://code.google.com/p/primer-design.
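
    PD5 itself is written in C++ and its classes are not reproduced here. Purely as an illustration of the kind of small, self-contained building block a primer-design library provides, the following Python sketch estimates a primer melting temperature with the well-known Wallace rule; the function name and interface are hypothetical, not PD5's API.

```python
def wallace_tm(primer):
    """Rough melting temperature (deg C) of a short primer via the
    Wallace rule: Tm = 2*(A+T) + 4*(G+C). A first approximation,
    reasonable only for primers of roughly 14-20 bases."""
    primer = primer.upper()
    at = primer.count("A") + primer.count("T")
    gc = primer.count("G") + primer.count("C")
    return 2 * at + 4 * gc
```

    Production primer-design libraries typically replace this rule with nearest-neighbor thermodynamic models, but the pattern is the same: small, composable analysis functions rather than calls out to external tools.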

  15. GLSENS: A Generalized Extension of LSENS Including Global Reactions and Added Sensitivity Analysis for the Perfectly Stirred Reactor

    NASA Technical Reports Server (NTRS)

    Bittker, David A.

    1996-01-01

    A generalized version of the NASA Lewis general kinetics code, LSENS, is described. The new code allows the use of global reactions as well as molecular processes in a chemical mechanism. The code also incorporates the capability of performing sensitivity analysis calculations for a perfectly stirred reactor rapidly and conveniently at the same time that the main kinetics calculations are being done. The GLSENS code has been extensively tested and has been found to be accurate and efficient. Nine example problems are presented and complete user instructions are given for the new capabilities. This report is to be used in conjunction with the documentation for the original LSENS code.

  16. Multiple component codes based generalized LDPC codes for high-speed optical transport.

    PubMed

    Djordjevic, Ivan B; Wang, Ting

    2014-07-14

A class of generalized low-density parity-check (GLDPC) codes suitable for optical communications is proposed, which consists of multiple local codes. It is shown that Hamming, BCH, and Reed-Muller codes can be used as local codes, and that the maximum a posteriori probability (MAP) decoding of these local codes by the Ashikhmin-Lytsin algorithm is feasible in terms of complexity and performance. We demonstrate that record coding gains can be obtained from properly designed GLDPC codes, derived from multiple component codes. We then show that several recently proposed classes of LDPC codes such as convolutional and spatially-coupled codes can be described using the concept of GLDPC coding, which indicates that GLDPC coding can be used as a unified platform for advanced FEC enabling ultra-high-speed optical transport. The proposed class of GLDPC codes is also suitable for code-rate adaptation, to adjust the error correction strength depending on the optical channel conditions.
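
    The construction described above can be sketched in miniature. The following snippet is an illustrative toy, not the authors' construction: it builds a two-layer GLDPC-style global parity-check matrix whose constraint nodes are (7,4) Hamming codes, with a random column interleaver coupling the two layers.

```python
import numpy as np

# Parity-check matrix of the (7,4) Hamming code: column j is the binary
# representation of j+1, so every nonzero syndrome points at one column.
H_HAMMING = np.array([[int(b) for b in format(c, "03b")] for c in range(1, 8)]).T

def gldpc_parity_matrix(num_groups, rng):
    """Toy two-layer GLDPC-style global parity-check matrix.

    Layer 1 places block-diagonal copies of the Hamming check matrix
    (each constraint node checks a local Hamming code); layer 2 repeats
    the structure after a random column interleaver, coupling the local
    codes into one global code.
    """
    block = np.kron(np.eye(num_groups, dtype=int), H_HAMMING)
    perm = rng.permutation(block.shape[1])  # random interleaver
    return np.vstack([block, block[:, perm]])
```

    In a real design the interleaver and the choice of component codes (Hamming, BCH, Reed-Muller) are optimized jointly for girth and decoding performance; the sketch only shows the structure.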

  17. Universal Noiseless Coding Subroutines

    NASA Technical Reports Server (NTRS)

    Schlutsmeyer, A. P.; Rice, R. F.

    1986-01-01

Software package consists of FORTRAN subroutines that perform universal noiseless coding and decoding of integer and binary data strings. The purpose of this type of coding is to achieve data compression in the sense that the coded data represent the original data perfectly (noiselessly) while taking fewer bits to do so. The routines are universal because they apply to virtually any "real-world" data source.
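
    As a hedged illustration of the flavor of such coding (the actual package is a FORTRAN implementation of Rice's universal coding algorithms, and the sketch below is not its interface), here is a basic Rice code: the quotient of an integer by 2^k is sent in unary, the remainder in k bits.

```python
def rice_encode(n, k):
    """Rice code of a nonnegative integer: unary quotient, k-bit remainder."""
    q, r = n >> k, n & ((1 << k) - 1)
    rem = format(r, "0{}b".format(k)) if k > 0 else ""
    return "1" * q + "0" + rem

def rice_decode(bits, k):
    """Inverse of rice_encode; returns (value, bits_consumed)."""
    q = 0
    while bits[q] == "1":  # count the unary quotient
        q += 1
    r = int(bits[q + 1 : q + 1 + k], 2) if k > 0 else 0
    return (q << k) | r, q + 1 + k
```

    Decoding perfectly reconstructs the input, which is what "noiseless" means here; compression comes from choosing k to match the statistics of the source.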

  18. [Prevalence of Cardiovascular Risk Factors at The Population Level: A Comparison of Ambulatory Physician-Coded Claims Data With Clinical Data From A Population-Based Study].

    PubMed

    Angelow, Aniela; Reber, Katrin Christiane; Schmidt, Carsten Oliver; Baumeister, Sebastian Edgar; Chenot, Jean-Francois

    2018-06-04

    The study assesses the validity of ICD-10 coded cardiovascular risk factors in claims data using gold-standard measurements from a population-based study for arterial hypertension, diabetes, dyslipidemia, smoking and obesity as a reference. Data of 1941 participants (46 % male, mean age 58±13 years) of the Study of Health in Pomerania (SHIP) were linked to electronic medical records from the regional association of statutory health insurance physicians from 2008 to 2012 used for billing purposes. Clinical data from SHIP was used as a gold standard to assess the agreement with claims data for ICD-10 codes I10.- (arterial hypertension), E10.- to E14.- (diabetes mellitus), E78.- (dyslipidemia), F17.- (smoking) and E65.- to E68.- (obesity). A higher agreement between ICD-coded and clinical diagnosis was found for diabetes (sensitivity (sens) 84%, specificity (spec) 95%, positive predictive value (ppv) 80%) and hypertension (sens 72%, spec 93%, ppv 97%) and a low level of agreement for smoking (sens 18%, spec 99%, ppv 89%), obesity (sens 22%, spec 99%, ppv 99%) and dyslipidemia (sens 40%, spec 60%, ppv 70%). Depending on the investigated cardiovascular risk factor, medication, documented additional cardiovascular co-morbidities, age, sex and clinical severity were associated with the ICD-coded cardiovascular risk factor. The quality of ICD-coding in ambulatory care is highly variable for different cardiovascular risk factors and outcomes. Diagnoses were generally undercoded, but those relevant for billing were coded more frequently. Our results can be used to quantify errors in population-based estimates of prevalence based on claims data for the investigated cardiovascular risk factors. © Georg Thieme Verlag KG Stuttgart · New York.
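
    The agreement statistics quoted above follow directly from a 2x2 table of coded versus gold-standard diagnoses. A minimal sketch; the counts below are invented for illustration and do not come from the study.

```python
def diagnostic_metrics(tp, fp, fn, tn):
    """Sensitivity, specificity and positive predictive value from a 2x2
    table comparing coded diagnoses against a clinical gold standard."""
    return {
        "sensitivity": tp / (tp + fn),  # coded positive among truly positive
        "specificity": tn / (tn + fp),  # coded negative among truly negative
        "ppv": tp / (tp + fp),          # truly positive among coded positive
    }
```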

  19. Next-generation acceleration and code optimization for light transport in turbid media using GPUs

    PubMed Central

    Alerstam, Erik; Lo, William Chun Yip; Han, Tianyi David; Rose, Jonathan; Andersson-Engels, Stefan; Lilge, Lothar

    2010-01-01

A highly optimized Monte Carlo (MC) code package for simulating light transport is developed on the latest graphics processing unit (GPU) built for general-purpose computing from NVIDIA - the Fermi GPU. In biomedical optics, the MC method is the gold standard approach for simulating light transport in biological tissue, both due to its accuracy and its flexibility in modelling realistic, heterogeneous tissue geometry in 3-D. However, the widespread use of MC simulations in inverse problems, such as treatment planning for PDT, is limited by their long computation time. Despite its parallel nature, optimizing MC code on the GPU has been shown to be a challenge, particularly when the sharing of simulation result matrices among many parallel threads demands the frequent use of atomic instructions to access the slow GPU global memory. This paper proposes an optimization scheme that utilizes the fast shared memory to resolve the performance bottleneck caused by atomic access, and discusses numerous other optimization techniques needed to harness the full potential of the GPU. Using these techniques, a widely accepted MC code package in biophotonics, called MCML, was successfully accelerated on a Fermi GPU by approximately 600x compared to a state-of-the-art Intel Core i7 CPU. A skin model consisting of 7 layers was used as the standard simulation geometry. To demonstrate the possibility of GPU cluster computing, the same GPU code was executed on four GPUs, showing a linear improvement in performance with an increasing number of GPUs. The GPU-based MCML code package, named GPU-MCML, is compatible with a wide range of graphics cards and is released as open-source software in two versions: an optimized version tuned for high performance and a simplified version for beginners (http://code.google.com/p/gpumcml). PMID:21258498

  20. Constructions for finite-state codes

    NASA Technical Reports Server (NTRS)

    Pollara, F.; Mceliece, R. J.; Abdel-Ghaffar, K.

    1987-01-01

A class of codes called finite-state (FS) codes is defined and investigated. These codes, which generalize both block and convolutional codes, are defined by their encoders, which are finite-state machines with parallel inputs and outputs. A family of upper bounds on the free distance of a given FS code is derived from known upper bounds on the minimum distance of block codes. A general construction for FS codes is then given, based on the idea of partitioning a given linear block code into cosets of one of its subcodes, and it is shown that in many cases the FS codes constructed in this way have a free distance which is as large as possible. These codes are found without the need for lengthy computer searches, and have potential applications for future deep-space coding systems. The issue of catastrophic error propagation (CEP) for FS codes is also investigated.

  1. Extensions of the MCNP5 and TRIPOLI4 Monte Carlo Codes for Transient Reactor Analysis

    NASA Astrophysics Data System (ADS)

    Hoogenboom, J. Eduard; Sjenitzer, Bart L.

    2014-06-01

To simulate reactor transients for safety analysis with the Monte Carlo method, the generation and decay of delayed neutron precursors is implemented in the MCNP5 and TRIPOLI4 general purpose Monte Carlo codes. Important new variance reduction techniques, like forced decay of precursors in each time interval and the branchless collision method, are included to obtain reasonable statistics for the power production per time interval. For simulation of practical reactor transients, the feedback effect from the thermal-hydraulics must also be included. This requires coupling of the Monte Carlo code with a thermal-hydraulics (TH) code, providing the temperature distribution in the reactor, which affects the neutron transport via the cross section data. The TH code also provides the coolant density distribution in the reactor, directly influencing the neutron transport. Different techniques for this coupling are discussed. As a demonstration, a 3x3 mini fuel assembly with a moving control rod is considered for MCNP5, and a mini core consisting of 3x3 PWR fuel assemblies with control rods and burnable poisons for TRIPOLI4. Results are shown for reactor transients due to control rod movement or withdrawal. The TRIPOLI4 transient calculation is started at low power and includes thermal-hydraulic feedback. The power rises by about 10 decades and finally stabilises at a much higher level than the initial one. The examples demonstrate that the modified Monte Carlo codes are capable of performing correct transient calculations, taking into account all geometrical and cross section detail.

  2. Atomicrex—a general purpose tool for the construction of atomic interaction models

    NASA Astrophysics Data System (ADS)

    Stukowski, Alexander; Fransson, Erik; Mock, Markus; Erhart, Paul

    2017-07-01

We introduce atomicrex, an open-source code for constructing interatomic potentials as well as more general types of atomic-scale models. Such effective models are required to simulate extended materials structures comprising many thousands of atoms or more, because electronic structure methods become computationally too expensive at this scale. atomicrex covers a wide range of interatomic potential types and fulfills many needs in atomistic model development. As inputs, it supports experimental property values as well as ab initio energies and forces, to which models can be fitted using various optimization algorithms. The open architecture of atomicrex allows it to be used in custom model development scenarios beyond classical interatomic potentials, while, thanks to its Python interface, it can be readily integrated with, e.g., electronic structure calculations or machine learning algorithms.

  3. Code conversion from signed-digit to complement representation based on look-ahead optical logic operations

    NASA Astrophysics Data System (ADS)

    Li, Guoqiang; Qian, Feng

    2001-11-01

We present, for the first time to our knowledge, a generalized look-ahead logic algorithm for number conversion from signed-digit to complement representation. By properly encoding the signed digits, all the operations are performed by binary logic, and unified logical expressions can be obtained for conversion from modified signed-digit (MSD) to 2's complement, trinary signed-digit (TSD) to 3's complement, and quaternary signed-digit (QSD) to 4's complement. For optical implementation, a parallel logical array module using an electron-trapping device is employed and experimental results are shown. This optical module is suitable for implementing complex logic functions in sum-of-products form. The algorithm and architecture are compatible with a general-purpose optoelectronic computing system.
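
    The underlying number conversion can be stated arithmetically. The sketch below is a plain serial reference implementation useful for checking results; it is not the parallel look-ahead logic that the paper contributes.

```python
def signed_digit_to_complement(digits, radix, width):
    """Convert a signed-digit number (most-significant digit first; digits
    may be negative, e.g. {-1, 0, 1} for MSD) to its radix-complement
    representation with `width` digits. Serial arithmetic reference, not
    the paper's parallel look-ahead circuit."""
    value = 0
    for d in digits:           # evaluate the signed-digit polynomial
        value = value * radix + d
    value %= radix ** width    # radix-complement wraparound for negatives
    out = []
    for _ in range(width):     # re-expand into ordinary radix digits
        out.append(value % radix)
        value //= radix
    return out[::-1]
```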

  4. Analysis Software

    NASA Technical Reports Server (NTRS)

    1994-01-01

    General Purpose Boundary Element Solution Technology (GPBEST) software employs the boundary element method of mechanical engineering analysis, as opposed to finite element. It is, according to one of its developers, 10 times faster in data preparation and more accurate than other methods. Its use results in less expensive products because the time between design and manufacturing is shortened. A commercial derivative of a NASA-developed computer code, it is marketed by Best Corporation to solve problems in stress analysis, heat transfer, fluid analysis and yielding and cracking of solids. Other applications include designing tractor and auto parts, household appliances and acoustic analysis.

  5. Design fabrication and test of graphite/polyimide composite joints and attachments for advanced aerospace vehicles

    NASA Technical Reports Server (NTRS)

    1979-01-01

    Graphite/polyimide (Gr/PI) bolted and bonded joints were investigated. Possible failure modes and the design loads for the four generic joint types are discussed. Preliminary sizing of a type 1 joint, bonded and bolted configuration is described, including assumptions regarding material properties and sizing methodology. A general purpose finite element computer code is described that was formulated to analyze single and double lap joints, with and without tapered adherends, and with user-controlled variable element size arrangements. An initial order of Celion 6000/PMR-15 prepreg was received and characterized.

  6. An implementation of a reference symbol approach to generic modulation in fading channels

    NASA Technical Reports Server (NTRS)

    Young, R. J.; Lodge, J. H.; Pacola, L. C.

    1990-01-01

As mobile satellite communications systems evolve over the next decade, they will have to adapt to a changing tradeoff between bandwidth and power. This paper presents a flexible approach to digital modulation and coding that will accommodate both wideband and narrowband schemes. This architecture could be the basis for a family of modems, each satisfying a specific power and bandwidth constraint, yet all having a large number of common signal processing blocks. The implementation of this generic approach, with general-purpose digital processors for transmission of 4.8-kilobit-per-second digitally encoded speech, is described.

  7. Do-it-yourself networks: a novel method of generating weighted networks.

    PubMed

    Shanafelt, D W; Salau, K R; Baggio, J A

    2017-11-01

Network theory is finding applications in the life and social sciences, including ecology, epidemiology, finance and social-ecological systems. While there are methods to generate specific types of networks, the broad literature is focused on generating unweighted networks. In this paper, we present a framework for generating weighted networks that satisfy user-defined criteria. Each criterion hierarchically defines a feature of the network and, in doing so, complements existing algorithms in the literature. We use a general example of ecological species dispersal to illustrate the method and provide open-source code for academic purposes.
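
    The paper's hierarchical-criteria algorithm is not reproduced here. As a baseline illustration of what "generating a weighted network" means in the simplest case, the following sketch draws a symmetric weighted adjacency matrix with a user-chosen edge density and weight distribution.

```python
import random

def random_weighted_network(n, density, weight_fn, seed=None):
    """Symmetric weighted adjacency matrix: each undirected edge is
    present with probability `density` and, if present, gets a weight
    drawn from the user-supplied `weight_fn`."""
    rng = random.Random(seed)
    adj = [[0.0] * n for _ in range(n)]
    for i in range(n):
        for j in range(i + 1, n):          # upper triangle only
            if rng.random() < density:
                adj[i][j] = adj[j][i] = weight_fn(rng)
    return adj
```

    The framework described in the abstract goes further by layering user-defined criteria (e.g., degree or weight constraints) on top of such a generator.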

  8. ATLAS, an integrated structural analysis and design system. Volume 2: System design document

    NASA Technical Reports Server (NTRS)

    Erickson, W. J. (Editor)

    1979-01-01

    ATLAS is a structural analysis and design system, operational on the Control Data Corporation 6600/CYBER computers. The overall system design, the design of the individual program modules, and the routines in the ATLAS system library are described. The overall design is discussed in terms of system architecture, executive function, data base structure, user program interfaces and operational procedures. The program module sections include detailed code description, common block usage and random access file usage. The description of the ATLAS program library includes all information needed to use these general purpose routines.

  9. General Electromagnetic Model for the Analysis of Complex Systems (GEMACS) Computer Code Documentation (Version 3). Volume 3. Part 2.

    DTIC Science & Technology

    1983-09-01

    Fragment (text garbled in the source scan): the excerpt documents common-block usage (/AMPZIJ/) and the GTD routine PLAINT, whose stated purpose is to determine whether a ray traveling from a given source location, or from the source image location in the reflected-ray direction, undergoes reflection from plate MP; the calling routine is FLDDRV.

  10. Numerical methods for stiff systems of two-point boundary value problems

    NASA Technical Reports Server (NTRS)

    Flaherty, J. E.; Omalley, R. E., Jr.

    1983-01-01

Numerical procedures are developed for constructing asymptotic solutions of certain nonlinear singularly perturbed vector two-point boundary value problems having boundary layers at one or both endpoints. The asymptotic approximations are generated numerically and can either be used as is or to furnish a general purpose two-point boundary value code with an initial approximation and the nonuniform computational mesh needed for such problems. The procedures are applied to a model problem that has multiple solutions and to problems describing the deformation of a thin nonlinear elastic beam resting on an elastic foundation.
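
    To illustrate why such problems need a nonuniform computational mesh, here is a minimal finite-difference sketch (not one of the paper's procedures) for the model problem eps*y'' = y, y(0)=0, y(1)=1, which has a boundary layer of width sqrt(eps) at x = 1; the mesh is clustered toward the layer.

```python
import numpy as np

def solve_layer_problem(eps=1e-3, n=401):
    """Solve eps*y'' = y, y(0)=0, y(1)=1 on a mesh clustered near x=1.
    Exact solution: sinh(x/sqrt(eps)) / sinh(1/sqrt(eps))."""
    s = np.linspace(0.0, 1.0, n)
    x = 1.0 - (1.0 - s) ** 3            # cubic clustering toward x = 1
    y = np.zeros(n)
    y[-1] = 1.0                          # boundary conditions
    A = np.zeros((n - 2, n - 2))
    b = np.zeros(n - 2)
    for i in range(1, n - 1):
        hl, hr = x[i] - x[i - 1], x[i + 1] - x[i]
        cl = 2.0 / (hl * (hl + hr))      # nonuniform 3-point second difference:
        cr = 2.0 / (hr * (hl + hr))      # y'' ~ cl*y[i-1] - (cl+cr)*y[i] + cr*y[i+1]
        row = i - 1
        A[row, row] = -eps * (cl + cr) - 1.0
        if row > 0:
            A[row, row - 1] = eps * cl
        else:
            b[row] -= eps * cl * y[0]    # fold left boundary value into RHS
        if row < n - 3:
            A[row, row + 1] = eps * cr
        else:
            b[row] -= eps * cr * y[-1]   # fold right boundary value into RHS
    y[1:-1] = np.linalg.solve(A, b)
    return x, y
```

    With a uniform mesh of the same size the layer at x = 1 would be badly under-resolved; clustering the points where the solution varies rapidly is exactly the role the asymptotic approximation plays for the general-purpose code described above.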

  11. Comparison of a 3-D GPU-Assisted Maxwell Code and Ray Tracing for Reflectometry on ITER

    NASA Astrophysics Data System (ADS)

    Gady, Sarah; Kubota, Shigeyuki; Johnson, Irena

    2015-11-01

Electromagnetic wave propagation and scattering in magnetized plasmas are important diagnostics for high temperature plasmas. 1-D and 2-D full-wave codes are standard tools for measurements of the electron density profile and fluctuations; however, ray tracing results have shown that beam propagation in tokamak plasmas is inherently a 3-D problem. The GPU-Assisted Maxwell Code utilizes the FDTD (Finite-Difference Time-Domain) method for solving the Maxwell equations with the cold plasma approximation in a 3-D geometry. Parallel processing with GPGPU (General-Purpose computing on Graphics Processing Units) is used to accelerate the computation. Previously, we reported on initial comparisons of the code results to 1-D numerical and analytical solutions, where the size of the computational grid was limited by the on-board memory of the GPU. In the current study, this limitation is overcome by using domain decomposition and an additional GPU. As a practical application, this code is used to study the current design of the ITER Low Field Side Reflectometer (LSFR) for the Equatorial Port Plug 11 (EPP11). A detailed examination of Gaussian beam propagation in the ITER edge plasma will be presented, as well as comparisons with ray tracing. This work was made possible by funding from the Department of Energy for the Summer Undergraduate Laboratory Internship (SULI) program. This work is supported by the US DOE Contract No. DE-AC02-09CH11466 and DE-FG02-99-ER54527.

  12. MO-A-213-00: Economics Update

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    NONE

    2015-06-15

The purpose of this session is to introduce attendees to the healthcare reimbursement system and how it applies to the clinical work of a Medical Physicist. This will include general information about the different categories of payers and payees, how work is described by CPT© codes, and how various payers set values for this work in different clinical settings. 2015 is a year of significant changes to the payment system. Many CPT© codes have been deleted and replaced with new CPT© codes. These codes define some of the most common work performed in our clinics, including treatment planning and delivery. This presentation will describe what work is encompassed in these codes and will give attendees an overview of the changes for 2015 as they apply to radiation oncology. Finally, some insight into what can be expected during 2016 will be presented. This includes what information is typically released by the Centers for Medicare & Medicaid Services (CMS) during the year and how we as an organization respond. This will include ways members can interact with the AAPM professional economics committee and other resources members may find helpful. Learning Objectives: Basics of how Medicare is structured and how reimbursement rates are set. Basic understanding of proposed changes to the 2016 Medicare rules. What resources are available from the AAPM and how to interact with the professional economics committee. Disclosure: Ownership in pxAlpha, LLC, a medical device start-up company.

  13. MO-A-213-01: 2015 Economics Update Part 1

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dirksen, B.

    2015-06-15

The purpose of this session is to introduce attendees to the healthcare reimbursement system and how it applies to the clinical work of a Medical Physicist. This will include general information about the different categories of payers and payees, how work is described by CPT© codes, and how various payers set values for this work in different clinical settings. 2015 is a year of significant changes to the payment system. Many CPT© codes have been deleted and replaced with new CPT© codes. These codes define some of the most common work performed in our clinics, including treatment planning and delivery. This presentation will describe what work is encompassed in these codes and will give attendees an overview of the changes for 2015 as they apply to radiation oncology. Finally, some insight into what can be expected during 2016 will be presented. This includes what information is typically released by the Centers for Medicare & Medicaid Services (CMS) during the year and how we as an organization respond. This will include ways members can interact with the AAPM professional economics committee and other resources members may find helpful. Learning Objectives: Basics of how Medicare is structured and how reimbursement rates are set. Basic understanding of proposed changes to the 2016 Medicare rules. What resources are available from the AAPM and how to interact with the professional economics committee. Disclosure: Ownership in pxAlpha, LLC, a medical device start-up company.

  14. MO-A-213-02: 2015 Economics Update Part 2

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fontenot, J.

    2015-06-15

The purpose of this session is to introduce attendees to the healthcare reimbursement system and how it applies to the clinical work of a Medical Physicist. This will include general information about the different categories of payers and payees, how work is described by CPT© codes, and how various payers set values for this work in different clinical settings. 2015 is a year of significant changes to the payment system. Many CPT© codes have been deleted and replaced with new CPT© codes. These codes define some of the most common work performed in our clinics, including treatment planning and delivery. This presentation will describe what work is encompassed in these codes and will give attendees an overview of the changes for 2015 as they apply to radiation oncology. Finally, some insight into what can be expected during 2016 will be presented. This includes what information is typically released by the Centers for Medicare & Medicaid Services (CMS) during the year and how we as an organization respond. This will include ways members can interact with the AAPM professional economics committee and other resources members may find helpful. Learning Objectives: Basics of how Medicare is structured and how reimbursement rates are set. Basic understanding of proposed changes to the 2016 Medicare rules. What resources are available from the AAPM and how to interact with the professional economics committee. Disclosure: Ownership in pxAlpha, LLC, a medical device start-up company.

  15. Reliability of a rating procedure to monitor industry self-regulation codes governing alcohol advertising content.

    PubMed

    Babor, Thomas F; Xuan, Ziming; Proctor, Dwayne

    2008-03-01

    The purposes of this study were to develop reliable procedures to monitor the content of alcohol advertisements broadcast on television and in other media, and to detect violations of the content guidelines of the alcohol industry's self-regulation codes. A set of rating-scale items was developed to measure the content guidelines of the 1997 version of the U.S. Beer Institute Code. Six focus groups were conducted with 60 college students to evaluate the face validity of the items and the feasibility of the procedure. A test-retest reliability study was then conducted with 74 participants, who rated five alcohol advertisements on two occasions separated by 1 week. Average correlations across all advertisements using three reliability statistics (r, rho, and kappa) were almost all statistically significant and the kappas were good for most items, which indicated high test-retest agreement. We also found high interrater reliabilities (intraclass correlations) among raters for item-level and guideline-level violations, indicating that regardless of the specific item, raters were consistent in their general evaluations of the advertisements. Naïve (untrained) raters can provide consistent (reliable) ratings of the main content guidelines proposed in the U.S. Beer Institute Code. The rating procedure may have future applications for monitoring compliance with industry self-regulation codes and for conducting research on the ways in which alcohol advertisements are perceived by young adults and other vulnerable populations.
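
    The chance-corrected agreement statistic used above (kappa) is straightforward to compute. A minimal sketch, with invented ratings for illustration rather than data from the study:

```python
def cohens_kappa(rater1, rater2):
    """Cohen's kappa: agreement between two raters over categorical items,
    corrected for the agreement expected by chance."""
    assert len(rater1) == len(rater2)
    n = len(rater1)
    po = sum(a == b for a, b in zip(rater1, rater2)) / n      # observed agreement
    cats = set(rater1) | set(rater2)
    pe = sum((rater1.count(c) / n) * (rater2.count(c) / n)    # chance agreement
             for c in cats)
    return (po - pe) / (1 - pe)
```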

  16. General linear codes for fault-tolerant matrix operations on processor arrays

    NASA Technical Reports Server (NTRS)

    Nair, V. S. S.; Abraham, J. A.

    1988-01-01

Various checksum codes have been suggested for fault-tolerant matrix computations on processor arrays. Use of these codes is limited due to potential roundoff and overflow errors. Numerical errors may also be misconstrued as errors due to physical faults in the system. In this paper, a set of linear codes is identified which can be used for fault-tolerant matrix operations such as matrix addition, multiplication, transposition, and LU-decomposition, with minimum numerical error. Encoding schemes are given for some of the example codes which fall under the general set of codes. With the help of experiments, a rule of thumb for the selection of a particular code for a given application is derived.
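
    The classic example of such codes is the row/column checksum encoding for matrix multiplication: the checksum structure is preserved by the product, and a single corrupted entry is located at the intersection of the failing row and column checks. A minimal sketch of that idea (illustrative only, not the paper's general set of linear codes):

```python
import numpy as np

def column_checksum(A):
    """Append a row of column sums (the column-checksum encoding)."""
    return np.vstack([A, A.sum(axis=0)])

def row_checksum(B):
    """Append a column of row sums (the row-checksum encoding)."""
    return np.hstack([B, B.sum(axis=1, keepdims=True)])

def checksum_errors(C_full):
    """Indices of data rows/columns whose checksum no longer matches."""
    data = C_full[:-1, :-1]
    bad_rows = np.nonzero(~np.isclose(data.sum(axis=1), C_full[:-1, -1]))[0]
    bad_cols = np.nonzero(~np.isclose(data.sum(axis=0), C_full[-1, :-1]))[0]
    return bad_rows, bad_cols
```

    The key property is multiplicative: column_checksum(A) @ row_checksum(B) is the full-checksum encoding of A @ B, so the product can be verified without recomputing it. Roundoff makes the equality approximate in floating point, which is exactly the limitation the abstract raises.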

  17. On the Application of Time-Reversed Space-Time Block Code to Aeronautical Telemetry

    DTIC Science & Technology

    2014-06-01

    Fragment (text garbled in the source scan): the recoverable keywords are Shaped Offset Quadrature Phase Shift Keying (SOQPSK), bit error rate (BER), Orthogonal Frequency Division Multiplexing (OFDM), and generalized time-reversed space-time block codes (GTR-STBC). The fragment notes that the Alamouti code is optimum in the setting considered, that OFDM is generally applied on a per-subcarrier basis in frequency-selective fading but is described as not viable here, and it cites work on finite-length MIMO decision-feedback equalization for space-time block-coded signals over multipath-fading channels.

  18. 1 CFR 11.3 - Code of Federal Regulations.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 1 General Provisions 1 2010-01-01 2010-01-01 false Code of Federal Regulations. 11.3 Section 11.3 General Provisions ADMINISTRATIVE COMMITTEE OF THE FEDERAL REGISTER AVAILABILITY OF OFFICE OF THE FEDERAL... complete set of the Code of Federal Regulations is $1,019 per year for the bound, paper edition, or $247...

  19. 1 CFR 11.3 - Code of Federal Regulations.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 1 General Provisions 1 2011-01-01 2011-01-01 false Code of Federal Regulations. 11.3 Section 11.3 General Provisions ADMINISTRATIVE COMMITTEE OF THE FEDERAL REGISTER AVAILABILITY OF OFFICE OF THE FEDERAL... complete set of the Code of Federal Regulations is $1,019 per year for the bound, paper edition, or $247...

  20. 1 CFR 21.14 - Deviations from standard organization of the Code of Federal Regulations.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 1 General Provisions 1 2010-01-01 2010-01-01 false Deviations from standard organization of the... CODIFICATION General Numbering § 21.14 Deviations from standard organization of the Code of Federal Regulations. (a) Any deviation from standard Code of Federal Regulations designations must be approved in advance...

  1. 1 CFR 11.3 - Code of Federal Regulations.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 1 General Provisions 1 2012-01-01 2012-01-01 false Code of Federal Regulations. 11.3 Section 11.3 General Provisions ADMINISTRATIVE COMMITTEE OF THE FEDERAL REGISTER AVAILABILITY OF OFFICE OF THE FEDERAL... complete set of the Code of Federal Regulations is $1,019 per year for the bound, paper edition, or $247...

  2. Establishing ethics in an organization by using principles.

    PubMed

    Hawks, Val D; Benzley, Steven E; Terry, Ronald E

    2004-04-01

Laws, codes, and rules are essential for any community, public or private, to operate in an orderly and productive fashion. Without laws and codes, anarchy and chaos abound and the purpose and role of the organization is lost. However, the danger is significant, and the damage serious and far-reaching, when individuals or organizations become so focused on rules, laws, and specifications that basic principles are ignored. This paper discusses the purpose of laws, rules, and codes in order to illuminate the basic principles behind them. With such an understanding, an increase in the level of ethical and moral behavior can be obtained without imposing detailed rules.

  3. Contextual Constraint Treatment for coarse coding deficit in adults with right hemisphere brain damage: Generalization to narrative discourse comprehension

    PubMed Central

    Blake, Margaret Lehman; Tompkins, Connie A.; Scharp, Victoria L.; Meigh, Kimberly M.; Wambaugh, Julie

    2014-01-01

    Coarse coding is the activation of broad semantic fields that can include multiple word meanings and a variety of features, including those peripheral to a word’s core meaning. It is a partially domain-general process related to general discourse comprehension and contributes to both literal and non-literal language processing. Adults with damage to the right cerebral hemisphere (RHD) and a coarse coding deficit are particularly slow to activate features of words that are relatively distant or peripheral. This manuscript reports a pre-efficacy study of Contextual Constraint Treatment (CCT), a novel, implicit treatment designed to increase the efficiency of coarse coding with the goal of improving narrative comprehension and other language performance that relies on coarse coding. Participants were four adults with RHD. The study used a single-subject controlled experimental design across subjects and behaviors. The treatment involves pre-stimulation, using a hierarchy of strong- and moderately-biased contexts, to prime the intended distantly-related features of critical stimulus words. Three of the four participants exhibited gains in auditory narrative discourse comprehension, the primary outcome measure. All participants exhibited generalization to untreated items. No strong generalization to processing nonliteral language was evident. The results indicate that CCT yields both improved efficiency of the coarse coding process and generalization to narrative comprehension. PMID:24983133

  4. Physicochemical analog for modeling superimposed and coded memories

    NASA Astrophysics Data System (ADS)

    Ensanian, Minas

    1992-07-01

    The mammalian brain is distinguished by a lifetime of memories being stored within the same general region of physicochemical space, with two extraordinary features. First, memories are, to varying degrees, superimposed as well as coded. Second, instantaneous recall of past events can often be effected by relatively simple and seemingly unrelated sensory cues. For the purpose of mathematically modeling such complex behavior, and of gaining additional insights, it would be highly advantageous to simulate or mimic similar behavior in a nonbiological entity in which some analogical parameters of interest can reasonably be controlled. It has recently been discovered that in nonlinear accumulative metal fatigue, memories (related to mechanical deformation) can be superimposed and coded in the crystal lattice, and that the memory, that is, the total number of stress cycles, can be recalled (determined) by scanning not the surfaces but the 'edges' of the objects. The new scanning technique, known as electrotopography (ETG), now makes state-space modeling of metallic networks possible. The author provides an overview of the new field and outlines the areas of immediate interest to the science of artificial neural networks.

  5. Overview of Particle and Heavy Ion Transport Code System PHITS

    NASA Astrophysics Data System (ADS)

    Sato, Tatsuhiko; Niita, Koji; Matsuda, Norihiro; Hashimoto, Shintaro; Iwamoto, Yosuke; Furuta, Takuya; Noda, Shusaku; Ogawa, Tatsuhiko; Iwase, Hiroshi; Nakashima, Hiroshi; Fukahori, Tokio; Okumura, Keisuke; Kai, Tetsuya; Chiba, Satoshi; Sihver, Lembit

    2014-06-01

    A general purpose Monte Carlo Particle and Heavy Ion Transport code System, PHITS, is being developed through the collaboration of several institutes in Japan and Europe. The Japan Atomic Energy Agency is responsible for managing the entire project. PHITS can deal with the transport of nearly all particles, including neutrons, protons, heavy ions, photons, and electrons, over wide energy ranges using various nuclear reaction models and data libraries. It is written in Fortran and can be executed on almost all computers. All components of PHITS, such as its source, executable, and data-library files, are assembled in one package and distributed to many countries via the Research Organization for Information Science and Technology, the Data Bank of the Organization for Economic Co-operation and Development's Nuclear Energy Agency, and the Radiation Safety Information Computational Center. More than 1,000 researchers have been registered as PHITS users, and they apply the code to various research and development fields such as nuclear technology, accelerator design, medical physics, and cosmic-ray research. This paper briefly summarizes the physics models implemented in PHITS and introduces some important functions useful for specific applications, such as an event generator mode and beam transport functions.
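    The core mechanism of a Monte Carlo transport calculation can be illustrated with a short sketch. This is a deliberately simplified, absorption-only toy model, not PHITS itself (which is a full-physics Fortran package); the function name and parameters are illustrative only.

    ```python
    import math
    import random

    def transmission_fraction(mu, thickness, n_histories=100_000, seed=42):
        """Estimate the fraction of particles crossing a homogeneous slab
        of the given thickness for a total attenuation coefficient mu.
        Absorption-only toy model: scattering is ignored, so the analytic
        answer is exp(-mu * thickness)."""
        rng = random.Random(seed)
        transmitted = 0
        for _ in range(n_histories):
            # Sample a free-flight distance from p(s) = mu * exp(-mu * s).
            s = -math.log(1.0 - rng.random()) / mu
            if s > thickness:
                transmitted += 1
        return transmitted / n_histories

    # mu * x = 1, so the estimate should be close to exp(-1), about 0.368.
    estimate = transmission_fraction(mu=0.2, thickness=5.0)
    ```

    Real transport codes add scattering, secondary-particle production, and geometry tracking on top of this same sample-and-tally loop.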

  6. Structural Sizing of a Horizontal Take-Off Launch Vehicle with an Air Collection and Enrichment System

    NASA Technical Reports Server (NTRS)

    McCurdy, David R.; Roche, Joseph M.

    2004-01-01

    In support of NASA's Next Generation Launch Technology (NGLT) program, the Andrews Gryphon booster was studied. The Andrews Gryphon concept is a horizontal lift-off, two-stage-to-orbit, reusable launch vehicle that uses an air collection and enrichment system (ACES). The purpose of the ACES is to collect atmospheric oxygen during a subsonic flight loiter phase and cool it to cryogenic temperature, ultimately resulting in a reduced initial take-off weight. To study the performance and size of an air-collection-based booster, an initial airplane-like shape was established as a baseline and modeled in a vehicle sizing code. The code, SIZER, contains a general series of volume, surface area, and fuel fraction relationships that tie engine and ACES performance with propellant requirements and volumetric constraints in order to establish vehicle closure for the given mission. A key element of system-level weight optimization is the use of the SIZER program, which provides rapid convergence and a great deal of flexibility for different tank architectures and material suites in order to study their impact on gross lift-off weight. This paper discusses important elements of the sizing code architecture, followed by highlights of the baseline booster study.

  7. LDPC decoder with a limited-precision FPGA-based floating-point multiplication coprocessor

    NASA Astrophysics Data System (ADS)

    Moberly, Raymond; O'Sullivan, Michael; Waheed, Khurram

    2007-09-01

    Implementing the sum-product algorithm in an FPGA with an embedded processor invites us to consider a tradeoff between computational precision and computational speed. The algorithm, known outside of the signal processing community as Pearl's belief propagation, is used for iterative soft-decision decoding of LDPC codes. We determined the feasibility of a coprocessor that will perform product computations. Our FPGA-based coprocessor design performs computer algebra with significantly less precision than the standard (e.g., integer, floating-point) operations of general purpose processors. Using synthesis targeting a 3,168-LUT Xilinx FPGA, we show that key components of a decoder are feasible and that the full single-precision decoder could be constructed using a larger part. Soft-decision decoding by the iterative belief propagation algorithm is impacted both positively and negatively by a reduction in the precision of the computation: reducing precision reduces the coding gain, but the limited-precision computation can operate faster. A proposed solution offers custom logic to perform computations with less precision, yet uses the floating-point format to interface with the software. Simulation results show the achievable coding gain. Synthesis results help estimate the full capacity and performance of an FPGA-based coprocessor.
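    The precision tradeoff studied here arises in the check-node update of the sum-product algorithm. A minimal sketch follows; the `frac_bits` rounding is a generic stand-in for fixed-point arithmetic, not the authors' actual FPGA number format.

    ```python
    import math

    def check_node_llr(llrs, frac_bits=None):
        """Sum-product check-node update via the tanh rule:
        L_out = 2 * atanh( product of tanh(L_i / 2) ).
        If frac_bits is given, each tanh value is rounded to that many
        fractional bits, a crude model of limited-precision hardware."""
        prod = 1.0
        for L in llrs:
            t = math.tanh(L / 2.0)
            if frac_bits is not None:
                scale = 2.0 ** frac_bits
                t = round(t * scale) / scale
            prod *= t
        prod = max(min(prod, 1.0 - 1e-12), -1.0 + 1e-12)  # keep atanh finite
        return 2.0 * math.atanh(prod)

    full_precision = check_node_llr([1.2, -0.8, 2.5])
    quantized = check_node_llr([1.2, -0.8, 2.5], frac_bits=4)
    ```

    The quantized result stays close to the full-precision one but differs slightly; accumulated over many decoding iterations, such differences are the source of the coding-gain loss the paper quantifies.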

  8. Merits and limitations of optimality criteria method for structural optimization

    NASA Technical Reports Server (NTRS)

    Patnaik, Surya N.; Guptill, James D.; Berke, Laszlo

    1993-01-01

    The merits and limitations of the optimality criteria (OC) method for the minimum weight design of structures subjected to multiple load conditions under stress, displacement, and frequency constraints were investigated by examining several numerical examples. The examples were solved utilizing the Optimality Criteria Design Code that was developed for this purpose at NASA Lewis Research Center. This OC code incorporates OC methods available in the literature with generalizations for stress constraints, fully utilized design concepts, and hybrid methods that combine both techniques. Salient features of the code include multiple choices for Lagrange multiplier and design variable update methods, design strategies for several constraint types, variable linking, displacement and integrated force method analyzers, and analytical and numerical sensitivities. The performance of the OC method, on the basis of the examples solved, was found to be satisfactory for problems with few active constraints or with small numbers of design variables. For problems with large numbers of behavior constraints and design variables, the OC method appears to follow a subset of active constraints that can result in a heavier design. The computational efficiency of OC methods appears to be similar to some mathematical programming techniques.
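    The simplest member of the OC family, the stress-ratio rule underlying the fully utilized design concept mentioned above, can be sketched as follows. This is a textbook toy for a statically determinate truss (member forces independent of areas), not the NASA Lewis OC code.

    ```python
    def fully_stressed_design(member_forces, sigma_allow, area0=1.0, n_iter=5):
        """Stress-ratio resizing: scale each member area by
        |sigma| / sigma_allow so every member tends toward its allowable
        stress. For a statically determinate truss the member forces do
        not change with the areas, so the iteration converges quickly."""
        areas = [area0] * len(member_forces)
        for _ in range(n_iter):
            stresses = [abs(N) / A for N, A in zip(member_forces, areas)]
            areas = [A * s / sigma_allow for A, s in zip(areas, stresses)]
        return areas

    # Two members carrying 1000 and -500 force units, allowable stress 250:
    areas = fully_stressed_design([1000.0, -500.0], 250.0)  # -> [4.0, 2.0]
    ```

    Displacement and frequency constraints require the Lagrange-multiplier machinery the abstract describes; the stress-ratio rule alone can miss the true optimum, which is exactly the limitation the paper documents.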

  9. Low Density Parity Check Codes Based on Finite Geometries: A Rediscovery and More

    NASA Technical Reports Server (NTRS)

    Kou, Yu; Lin, Shu; Fossorier, Marc

    1999-01-01

    Low density parity check (LDPC) codes with iterative decoding based on belief propagation achieve astonishing error performance close to the Shannon limit. No algebraic or geometric method for constructing these codes has been reported, and they are largely generated by computer search. As a result, encoding of long LDPC codes is in general very complex. This paper presents two classes of high-rate LDPC codes whose constructions are based on finite Euclidean and projective geometries, respectively. These classes of codes are cyclic and have good constraint parameters and minimum distances. The cyclic structure allows the use of linear feedback shift registers for encoding. These finite geometry LDPC codes achieve very good error performance with either soft-decision iterative decoding based on belief propagation or Gallager's hard-decision bit flipping algorithm. These codes can be punctured or extended to obtain other good LDPC codes. A generalization of these codes is also presented.
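    The shift-register encoding that the cyclic structure enables can be sketched with a tiny example. The (7,4) Hamming code below stands in for the much longer finite-geometry LDPC codes of the paper; the principle, polynomial division by the generator, is the same and is what an LFSR computes one bit per clock.

    ```python
    def cyclic_encode(msg_bits, gen):
        """Systematically encode a cyclic code: the parity bits are the
        remainder of x^r * m(x) divided by the generator polynomial g(x)
        over GF(2), where r = deg g. This long division is exactly the
        computation a linear feedback shift register performs in hardware."""
        r = len(gen) - 1                  # number of parity bits
        reg = list(msg_bits) + [0] * r    # coefficients of x^r * m(x), MSB first
        for i in range(len(msg_bits)):    # long division over GF(2)
            if reg[i]:
                for j, g in enumerate(gen):
                    reg[i + j] ^= g
        return list(msg_bits) + reg[-r:]  # message followed by parity

    # (7,4) cyclic Hamming code, g(x) = x^3 + x + 1, MSB-first coefficients:
    codeword = cyclic_encode([1, 0, 0, 0], [1, 0, 1, 1])  # -> [1, 0, 0, 0, 1, 0, 1]
    ```

    Because encoding is a fixed division circuit rather than a multiplication by a dense generator matrix, it scales linearly with block length, which is the practical advantage the abstract highlights.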

  10. Coding conventions and principles for a National Land-Change Modeling Framework

    USGS Publications Warehouse

    Donato, David I.

    2017-07-14

    This report establishes specific rules for writing computer source code for use with the National Land-Change Modeling Framework (NLCMF). These specific rules consist of conventions and principles for writing code primarily in the C and C++ programming languages. Collectively, these coding conventions and coding principles create an NLCMF programming style. In addition to detailed naming conventions, this report provides general coding conventions and principles intended to facilitate the development of high-performance software implemented with code that is extensible, flexible, and interoperable. Conventions for developing modular code are explained in general terms and also enabled and demonstrated through the appended templates for C++ base source-code and header files. The NLCMF limited-extern approach to module structure, code inclusion, and cross-module access to data is both explained in the text and then illustrated through the module templates. Advice on the use of global variables is provided.

  11. A model of a code of ethics for tissue banks operating in developing countries.

    PubMed

    Morales Pedraza, Jorge

    2012-12-01

    Ethical practice in the field of tissue banking requires the setting of principles, the identification of possible deviations and the establishment of mechanisms that will detect and hinder abuses that may occur during the procurement, processing and distribution of tissues for transplantation. This model Code of Ethics has been prepared to be used in elaborating a Code of Ethics for tissue banks operating in the Latin American and Caribbean, Asian and Pacific, and African regions, in order to guide the day-to-day operation of these banks. The purpose of this model Code of Ethics is to assist interested tissue banks in the preparation of their own Code of Ethics, toward ensuring that the tissue bank staff support with their actions the mission and values associated with tissue banking.

  12. A systematic literature review of automated clinical coding and classification systems

    PubMed Central

    Williams, Margaret; Fenton, Susan H; Jenders, Robert A; Hersh, William R

    2010-01-01

    Clinical coding and classification processes transform natural language descriptions in clinical text into data that can subsequently be used for clinical care, research, and other purposes. This systematic literature review examined studies that evaluated all types of automated coding and classification systems to determine the performance of such systems. Studies indexed in Medline or other relevant databases prior to March 2009 were considered. The 113 studies included in this review show that automated tools exist for a variety of coding and classification purposes, focus on various healthcare specialties, and handle a wide variety of clinical document types. Automated coding and classification systems themselves are not generalizable, nor are the results of the studies evaluating them. Published research shows these systems hold promise, but these data must be considered in context, with performance relative to the complexity of the task and the desired outcome. PMID:20962126

  13. A systematic literature review of automated clinical coding and classification systems.

    PubMed

    Stanfill, Mary H; Williams, Margaret; Fenton, Susan H; Jenders, Robert A; Hersh, William R

    2010-01-01

    Clinical coding and classification processes transform natural language descriptions in clinical text into data that can subsequently be used for clinical care, research, and other purposes. This systematic literature review examined studies that evaluated all types of automated coding and classification systems to determine the performance of such systems. Studies indexed in Medline or other relevant databases prior to March 2009 were considered. The 113 studies included in this review show that automated tools exist for a variety of coding and classification purposes, focus on various healthcare specialties, and handle a wide variety of clinical document types. Automated coding and classification systems themselves are not generalizable, nor are the results of the studies evaluating them. Published research shows these systems hold promise, but these data must be considered in context, with performance relative to the complexity of the task and the desired outcome.

  14. The specific purpose Monte Carlo code McENL for simulating the response of epithermal neutron lifetime well logging tools

    NASA Astrophysics Data System (ADS)

    Prettyman, T. H.; Gardner, R. P.; Verghese, K.

    1993-08-01

    A new specific purpose Monte Carlo code called McENL for modeling the time response of epithermal neutron lifetime tools is described. The weight windows technique, employing splitting and Russian roulette, is used with an automated importance function based on the solution of an adjoint diffusion model to improve the code efficiency. Complete composition and density correlated sampling is also included in the code, and can be used to study the effect on tool response of small variations in the formation, borehole, or logging tool composition and density. An illustration of the latter application is given for the density of a thermal neutron filter. McENL was benchmarked against test-pit data for the Mobil pulsed neutron porosity tool and was found to be very accurate. Results of the experimental validation and details of code performance are presented.
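    One of the variance-reduction devices mentioned above, Russian roulette within a weight window, can be sketched generically. This is a textbook illustration with arbitrary thresholds, not McENL's implementation.

    ```python
    import random

    def russian_roulette(weight, w_low, w_survive, rng):
        """Russian roulette, the particle-killing half of the weight-window
        technique: a history whose statistical weight drops below w_low
        survives with probability weight / w_survive (its weight raised to
        w_survive) and is killed otherwise. Expected weight is conserved,
        so tallies stay unbiased while low-weight histories are culled."""
        if weight >= w_low:
            return weight              # inside the window: unchanged
        if rng.random() < weight / w_survive:
            return w_survive           # survivor, weight boosted
        return 0.0                     # killed

    rng = random.Random(42)
    # The average returned weight over many trials stays near the input 0.1:
    mean_weight = sum(russian_roulette(0.1, 0.5, 1.0, rng)
                      for _ in range(200_000)) / 200_000
    ```

    Splitting is the mirror image: a history whose weight exceeds the window's upper bound is divided into several lower-weight copies, again conserving expected weight.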

  15. 12 CFR 573.12 - Limits on sharing account number information for marketing purposes.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... access number or access code, does not include a number or code in an encrypted form, as long as you do... account number or similar form of access number or access code for a consumer's credit card account... or access code: (1) To your agent or service provider solely in order to perform marketing for your...

  16. 17 CFR 248.12 - Limits on sharing account number information for marketing purposes.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... number or access code, does not include a number or code in an encrypted form, as long as you do not... agency, an account number or similar form of access number or access code for a consumer's credit card... number or access code: (1) To your agent or service provider solely in order to perform marketing for...

  17. 12 CFR 573.12 - Limits on sharing account number information for marketing purposes.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... access number or access code, does not include a number or code in an encrypted form, as long as you do... reporting agency, an account number or similar form of access number or access code for a consumer's credit... number or access code: (1) To your agent or service provider solely in order to perform marketing for...

  18. 12 CFR 573.12 - Limits on sharing account number information for marketing purposes.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... access number or access code, does not include a number or code in an encrypted form, as long as you do... reporting agency, an account number or similar form of access number or access code for a consumer's credit... number or access code: (1) To your agent or service provider solely in order to perform marketing for...

  19. 12 CFR 40.12 - Limits on sharing account number information for marketing purposes.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... similar form of access number or access code, does not include a number or code in an encrypted form, as... reporting agency, an account number or similar form of access number or access code for a consumer's credit... number or access code: (1) To the bank's agent or service provider solely in order to perform marketing...

  20. 17 CFR 248.12 - Limits on sharing account number information for marketing purposes.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... number or access code, does not include a number or code in an encrypted form, as long as you do not... agency, an account number or similar form of access number or access code for a consumer's credit card... number or access code: (1) To your agent or service provider solely in order to perform marketing for...

  1. 12 CFR 40.12 - Limits on sharing account number information for marketing purposes.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... similar form of access number or access code, does not include a number or code in an encrypted form, as... reporting agency, an account number or similar form of access number or access code for a consumer's credit... number or access code: (1) To the bank's agent or service provider solely in order to perform marketing...

  2. 12 CFR 40.12 - Limits on sharing account number information for marketing purposes.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... similar form of access number or access code, does not include a number or code in an encrypted form, as... reporting agency, an account number or similar form of access number or access code for a consumer's credit... number or access code: (1) To the bank's agent or service provider solely in order to perform marketing...

  3. 17 CFR 160.12 - Limits on sharing account number information for marketing purposes.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... form of access number or access code, does not include a number or code in an encrypted form, as long... consumer reporting agency, an account number or similar form of access number or access code for a consumer... similar form of access number or access code: (1) To your agent or service provider solely in order to...

  4. 12 CFR 716.12 - Limits on sharing of account number information for marketing purposes.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... form of access number or access code, does not include a number or code in an encrypted form, as long... consumer reporting agency, an account number or similar form of access number or access code for a consumer... similar form of access number or access code: (1) To your agent or service provider solely in order to...

  5. 17 CFR 248.12 - Limits on sharing account number information for marketing purposes.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ..., or similar form of access number or access code, does not include a number or code in an encrypted... consumer reporting agency, an account number or similar form of access number or access code for a consumer... similar form of access number or access code: (1) To your agent or service provider solely in order to...

  6. 12 CFR 716.12 - Limits on sharing of account number information for marketing purposes.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... form of access number or access code, does not include a number or code in an encrypted form, as long... consumer reporting agency, an account number or similar form of access number or access code for a consumer... similar form of access number or access code: (1) To your agent or service provider solely in order to...

  7. 17 CFR 248.12 - Limits on sharing account number information for marketing purposes.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... number or access code, does not include a number or code in an encrypted form, as long as you do not... agency, an account number or similar form of access number or access code for a consumer's credit card... number or access code: (1) To your agent or service provider solely in order to perform marketing for...

  8. 12 CFR 40.12 - Limits on sharing account number information for marketing purposes.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... similar form of access number or access code, does not include a number or code in an encrypted form, as... reporting agency, an account number or similar form of access number or access code for a consumer's credit... number or access code: (1) To the bank's agent or service provider solely in order to perform marketing...

  9. 12 CFR 573.12 - Limits on sharing account number information for marketing purposes.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... access number or access code, does not include a number or code in an encrypted form, as long as you do... account number or similar form of access number or access code for a consumer's credit card account... or access code: (1) To your agent or service provider solely in order to perform marketing for your...

  10. 12 CFR 716.12 - Limits on sharing of account number information for marketing purposes.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... form of access number or access code, does not include a number or code in an encrypted form, as long... consumer reporting agency, an account number or similar form of access number or access code for a consumer... similar form of access number or access code: (1) To your agent or service provider solely in order to...

  11. NED and SIMBAD Conventions for Bibliographic Reference Coding

    NASA Technical Reports Server (NTRS)

    Schmitz, M.; Helou, G.; Dubois, P.; LaGue, C.; Madore, B.; Jr., H. G. Corwin; Lesteven, S.

    1995-01-01

    The primary purpose of the 'reference code' is to provide a unique and traceable representation of a bibliographic reference within the structure of each database. The code is used frequently in the interfaces as a succinct abbreviation of a full bibliographic reference. Since its inception, it has become a standard code not only for NED and SIMBAD, but also for other bibliographic services.

  12. Creating Semantic Waves: Using Legitimation Code Theory as a Tool to Aid the Teaching of Chemistry

    ERIC Educational Resources Information Center

    Blackie, Margaret A. L.

    2014-01-01

    This is a conceptual paper aimed at chemistry educators. The purpose of this paper is to illustrate the use of the semantic code of Legitimation Code Theory in chemistry teaching. Chemistry is an abstract subject which many students struggle to grasp. Legitimation Code Theory provides a way of separating out abstraction from complexity both of…

  13. QR Codes as Mobile Learning Tools for Labor Room Nurses at the San Pablo Colleges Medical Center

    ERIC Educational Resources Information Center

    Del Rosario-Raymundo, Maria Rowena

    2017-01-01

    Purpose: The purpose of this paper is to explore the use of QR codes as mobile learning tools and examine factors that impact on their usefulness, acceptability and feasibility in assisting the nurses' learning. Design/Methodology/Approach: Study participants consisted of 14 regular, full-time, board-certified LR nurses. Over a two-week period,…

  14. An accurate evaluation of the performance of asynchronous DS-CDMA systems with zero-correlation-zone coding in Rayleigh fading

    NASA Astrophysics Data System (ADS)

    Walker, Ernest; Chen, Xinjia; Cooper, Reginald L.

    2010-04-01

    An arbitrarily accurate approach is used to determine the bit-error rate (BER) performance of generalized asynchronous DS-CDMA systems in Gaussian noise with Rayleigh fading. In this paper and its sequel, new theoretical work is contributed that substantially enhances existing performance analysis formulations. Major contributions include substantial computational complexity reduction, including a priori BER accuracy bounding, and an analytical approach that facilitates performance evaluation for systems with arbitrary spectral spreading distributions and non-uniform transmission delay distributions. Using prior results, augmented by these enhancements, a generalized DS-CDMA system model is constructed and used to evaluate the BER performance in a variety of scenarios. In this paper, the generalized system model is used to evaluate the performance of both Walsh-Hadamard (WH) and Walsh-Hadamard-seeded zero-correlation-zone (WH-ZCZ) coding. The selection of these codes was informed by the observation that WH codes contain N spectral spreading values (0 to N - 1), one for each code sequence, while WH-ZCZ codes contain only two spectral spreading values (N/2 - 1, N/2), where N is the sequence length in chips. Since these codes span the spectral spreading range for DS-CDMA coding, by invoking an induction argument, the generalization of the system model is sufficiently supported. The results in this paper and its sequel support the claim that an arbitrarily accurate performance analysis for DS-CDMA systems can be evaluated over the full range of binary coding with minimal computational complexity.
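    The WH spreading codes referred to above are straightforward to construct; a minimal sketch of the standard Sylvester recursion follows (the ZCZ seeding of the paper is not shown).

    ```python
    def walsh_hadamard(n):
        """Build an n x n Walsh-Hadamard matrix (n a power of two) by the
        Sylvester construction: H_2n = [[H_n, H_n], [H_n, -H_n]].
        Distinct rows are mutually orthogonal, which is what gives
        synchronous CDMA users zero cross-correlation."""
        H = [[1]]
        while len(H) < n:
            H = ([row + row for row in H] +
                 [row + [-x for x in row] for row in H])
        return H

    H = walsh_hadamard(8)
    cross = sum(a * b for a, b in zip(H[2], H[5]))   # distinct rows -> 0
    auto = sum(a * b for a, b in zip(H[3], H[3]))    # same row -> n
    ```

    Under the asynchronous, multipath conditions the paper analyzes, this ideal orthogonality no longer holds at nonzero delays, which is precisely why BER analysis over delay distributions is needed.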

  15. Computation of Reacting Flows in Combustion Processes

    NASA Technical Reports Server (NTRS)

    Keith, Theo G., Jr.; Chen, Kuo-Huey

    1997-01-01

    The main objective of this research was to develop an efficient three-dimensional computer code for chemically reacting flows. The main computer code developed is ALLSPD-3D, a program for the calculation of three-dimensional, chemically reacting flows with sprays. The ALLSPD code employs a coupled, strongly implicit solution procedure for turbulent spray combustion flows. A stochastic droplet model and an efficient method for the treatment of the spray source terms in the gas-phase equations are used to calculate the evaporating liquid sprays. The chemistry treatment in the code is general enough that an arbitrary number of reactions and species can be defined by the user. The code is written in generalized curvilinear coordinates with both multi-block and flexible internal blockage capabilities to handle complex geometries. In addition, for general industrial combustion applications, the code provides both dilution and transpiration cooling capabilities. The ALLSPD algorithm, which employs preconditioning and eigenvalue rescaling techniques, is capable of providing efficient solutions for flows with a wide range of Mach numbers. Although written for three-dimensional flows in general, the code can be used for two-dimensional and axisymmetric flow computations as well. The code can be run on various computer platforms (supercomputers, workstations, and parallel processors), and its GUI (graphical user interface) should provide a user-friendly tool for setting up and running the code.

  16. GARLIC - A general purpose atmospheric radiative transfer line-by-line infrared-microwave code: Implementation and evaluation

    NASA Astrophysics Data System (ADS)

    Schreier, Franz; Gimeno García, Sebastián; Hedelt, Pascal; Hess, Michael; Mendrok, Jana; Vasquez, Mayte; Xu, Jian

    2014-04-01

    A suite of programs for high resolution infrared-microwave atmospheric radiative transfer modeling has been developed with emphasis on efficient and reliable numerical algorithms and a modular approach appropriate for simulation and/or retrieval in a variety of applications. The Generic Atmospheric Radiation Line-by-line Infrared Code - GARLIC - is suitable for arbitrary observation geometry, instrumental field-of-view, and line shape. The core of GARLIC's subroutines constitutes the basis of forward models used to implement inversion codes to retrieve atmospheric state parameters from limb and nadir sounding instruments. This paper briefly introduces the physical and mathematical basics of GARLIC and its descendants and continues with an in-depth presentation of various implementation aspects: An optimized Voigt function algorithm combined with a two-grid approach is used to accelerate the line-by-line modeling of molecular cross sections; various quadrature methods are implemented to evaluate the Schwarzschild and Beer integrals; and Jacobians, i.e. derivatives with respect to the unknowns of the atmospheric inverse problem, are implemented by means of automatic differentiation. For an assessment of GARLIC's performance, a comparison of the quadrature methods for solution of the path integral is provided. Verification and validation are demonstrated using intercomparisons with other line-by-line codes and comparisons of synthetic spectra with spectra observed on Earth and from Venus.
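    The quadrature step mentioned above can be illustrated in miniature for the Beer integral. This is a generic trapezoidal-rule sketch, not GARLIC's actual routines; the function name and test absorber are illustrative only.

    ```python
    import math

    def beer_transmission(k, path_length, n=1000):
        """Evaluate the Beer integral T = exp(-integral of k(s) ds) along a
        line of sight by composite trapezoidal quadrature, for a position-
        dependent absorption coefficient k(s). A line-by-line code performs
        a step like this for every wavenumber on its spectral grid."""
        h = path_length / n
        samples = [k(i * h) for i in range(n + 1)]
        tau = h * (sum(samples) - 0.5 * (samples[0] + samples[-1]))  # optical depth
        return math.exp(-tau)

    # Exponentially decaying absorber k(s) = exp(-s) over a 5-unit path:
    # tau = 1 - exp(-5), so T = exp(-(1 - exp(-5))), about 0.370.
    T = beer_transmission(lambda s: math.exp(-s), path_length=5.0)
    ```

    The Schwarzschild integral adds a thermal emission source term along the path; the choice of quadrature rule for both integrals is exactly what the paper's comparison evaluates.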

  17. Voxel-Based Lesion Symptom Mapping of Coarse Coding and Suppression Deficits in Patients With Right Hemisphere Damage

    PubMed Central

    Tompkins, Connie A.; Meigh, Kimberly M.; Prat, Chantel S.

    2015-01-01

    Purpose This study examined right hemisphere (RH) neuroanatomical correlates of lexical–semantic deficits that predict narrative comprehension in adults with RH brain damage. Coarse semantic coding and suppression deficits were related to lesions by voxel-based lesion symptom mapping. Method Participants were 20 adults with RH cerebrovascular accidents. Measures of coarse coding and suppression deficits were computed from lexical decision reaction times at short (175 ms) and long (1000 ms) prime-target intervals. Lesions were drawn on magnetic resonance imaging images and through normalization were registered on an age-matched brain template. Voxel-based lesion symptom mapping analysis was applied to build a general linear model at each voxel. Z score maps were generated for each deficit, and results were interpreted using automated anatomical labeling procedures. Results A deficit in coarse semantic activation was associated with lesions to the RH posterior middle temporal gyrus, dorsolateral prefrontal cortex, and lenticular nuclei. A maintenance deficit for coarsely coded representations involved the RH temporal pole and dorsolateral prefrontal cortex more medially. Ineffective suppression implicated lesions to the RH inferior frontal gyrus and subcortical regions, as hypothesized, along with the rostral temporal pole. Conclusion Beyond their scientific implications, these lesion–deficit correspondences may help inform the clinical diagnosis and enhance decisions about candidacy for deficit-focused treatment to improve narrative comprehension in individuals with RH damage. PMID:26425785

  18. 21 CFR 201.2 - Drugs and devices; National Drug Code numbers.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... 21 Food and Drugs 4 2014-04-01 2014-04-01 false Drugs and devices; National Drug Code numbers. 201.2 Section 201.2 Food and Drugs FOOD AND DRUG ADMINISTRATION, DEPARTMENT OF HEALTH AND HUMAN SERVICES (CONTINUED) DRUGS: GENERAL LABELING General Labeling Provisions § 201.2 Drugs and devices; National Drug Code...

  19. 21 CFR 201.2 - Drugs and devices; National Drug Code numbers.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... 21 Food and Drugs 4 2012-04-01 2012-04-01 false Drugs and devices; National Drug Code numbers. 201.2 Section 201.2 Food and Drugs FOOD AND DRUG ADMINISTRATION, DEPARTMENT OF HEALTH AND HUMAN SERVICES (CONTINUED) DRUGS: GENERAL LABELING General Labeling Provisions § 201.2 Drugs and devices; National Drug Code...

  20. 21 CFR 201.2 - Drugs and devices; National Drug Code numbers.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... 21 Food and Drugs 4 2011-04-01 2011-04-01 false Drugs and devices; National Drug Code numbers. 201.2 Section 201.2 Food and Drugs FOOD AND DRUG ADMINISTRATION, DEPARTMENT OF HEALTH AND HUMAN SERVICES (CONTINUED) DRUGS: GENERAL LABELING General Labeling Provisions § 201.2 Drugs and devices; National Drug Code...

  1. 21 CFR 201.2 - Drugs and devices; National Drug Code numbers.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... 21 Food and Drugs 4 2013-04-01 2013-04-01 false Drugs and devices; National Drug Code numbers. 201.2 Section 201.2 Food and Drugs FOOD AND DRUG ADMINISTRATION, DEPARTMENT OF HEALTH AND HUMAN SERVICES (CONTINUED) DRUGS: GENERAL LABELING General Labeling Provisions § 201.2 Drugs and devices; National Drug Code...

  2. 21 CFR 201.2 - Drugs and devices; National Drug Code numbers.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 21 Food and Drugs 4 2010-04-01 2010-04-01 false Drugs and devices; National Drug Code numbers. 201.2 Section 201.2 Food and Drugs FOOD AND DRUG ADMINISTRATION, DEPARTMENT OF HEALTH AND HUMAN SERVICES (CONTINUED) DRUGS: GENERAL LABELING General Labeling Provisions § 201.2 Drugs and devices; National Drug Code...

  3. 21 CFR 19.6 - Code of ethics for government service.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... 21 Food and Drugs 1 2011-04-01 2011-04-01 false Code of ethics for government service. 19.6 Section 19.6 Food and Drugs FOOD AND DRUG ADMINISTRATION, DEPARTMENT OF HEALTH AND HUMAN SERVICES GENERAL STANDARDS OF CONDUCT AND CONFLICTS OF INTEREST General Provisions § 19.6 Code of ethics for government...

  4. 21 CFR 19.6 - Code of ethics for government service.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... 21 Food and Drugs 1 2012-04-01 2012-04-01 false Code of ethics for government service. 19.6 Section 19.6 Food and Drugs FOOD AND DRUG ADMINISTRATION, DEPARTMENT OF HEALTH AND HUMAN SERVICES GENERAL STANDARDS OF CONDUCT AND CONFLICTS OF INTEREST General Provisions § 19.6 Code of ethics for government...

  5. 21 CFR 19.6 - Code of ethics for government service.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... 21 Food and Drugs 1 2014-04-01 2014-04-01 false Code of ethics for government service. 19.6 Section 19.6 Food and Drugs FOOD AND DRUG ADMINISTRATION, DEPARTMENT OF HEALTH AND HUMAN SERVICES GENERAL STANDARDS OF CONDUCT AND CONFLICTS OF INTEREST General Provisions § 19.6 Code of ethics for government...

  6. 21 CFR 19.6 - Code of ethics for government service.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... 21 Food and Drugs 1 2013-04-01 2013-04-01 false Code of ethics for government service. 19.6 Section 19.6 Food and Drugs FOOD AND DRUG ADMINISTRATION, DEPARTMENT OF HEALTH AND HUMAN SERVICES GENERAL STANDARDS OF CONDUCT AND CONFLICTS OF INTEREST General Provisions § 19.6 Code of ethics for government...

  7. General phase spaces: from discrete variables to rotor and continuum limits

    NASA Astrophysics Data System (ADS)

    Albert, Victor V.; Pascazio, Saverio; Devoret, Michel H.

    2017-12-01

    We provide a basic introduction to discrete-variable, rotor, and continuous-variable quantum phase spaces, explaining how the latter two can be understood as limiting cases of the first. We extend the limit-taking procedures used to travel between phase spaces to a general class of Hamiltonians (including many local stabilizer codes) and provide six examples: the Harper equation, the Baxter parafermionic spin chain, the Rabi model, the Kitaev toric code, the Haah cubic code (which we generalize to qudits), and the Kitaev honeycomb model. We obtain continuous-variable generalizations of all models, some of which are novel. The Baxter model is mapped to a chain of coupled oscillators and the Rabi model to the optomechanical radiation pressure Hamiltonian. The procedures also yield rotor versions of all models, five of which are novel many-body extensions of the almost Mathieu equation. The toric and cubic codes are mapped to lattice models of rotors, with the toric code case related to U(1) lattice gauge theory.

  8. 48 CFR 3001.105-1 - Publication and code arrangement.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 48 Federal Acquisition Regulations System 7 2010-10-01 2010-10-01 false Publication and code arrangement. 3001.105-1 Section 3001.105-1 Federal Acquisition Regulations System DEPARTMENT OF HOMELAND... Purpose, Authority, Issuance 3001.105-1 Publication and code arrangement. (a) The HSAR is published in: (1...

  9. NHEXAS PHASE I ARIZONA STUDY--STANDARD OPERATING PROCEDURE FOR CODING: DESCRIPTIVE QUESTIONNAIRE (UA-D-6.0)

    EPA Science Inventory

The purpose of this SOP is to define the coding strategy for the Descriptive Questionnaire. This questionnaire was developed for use in the Arizona NHEXAS project and the "Border" study. Keywords: data; coding; descriptive questionnaire.

    The National Human Exposure Assessment...

  10. 48 CFR 1601.104-1 - Publication and code arrangement.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 48 Federal Acquisition Regulations System 6 2011-10-01 2011-10-01 false Publication and code arrangement. 1601.104-1 Section 1601.104-1 Federal Acquisition Regulations System OFFICE OF PERSONNEL... SYSTEM Purpose, Authority, Issuance 1601.104-1 Publication and code arrangement. (a) The FEHBAR and its...

  11. 48 CFR 2101.104-1 - Publication and code arrangement.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 48 Federal Acquisition Regulations System 6 2011-10-01 2011-10-01 false Publication and code arrangement. 2101.104-1 Section 2101.104-1 Federal Acquisition Regulations System OFFICE OF PERSONNEL... REGULATIONS SYSTEM Purpose, Authority, Issuance 2101.104-1 Publication and code arrangement. (a) The LIFAR and...

  12. 48 CFR 3001.105-1 - Publication and code arrangement.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 48 Federal Acquisition Regulations System 7 2014-10-01 2014-10-01 false Publication and code arrangement. 3001.105-1 Section 3001.105-1 Federal Acquisition Regulations System DEPARTMENT OF HOMELAND... Purpose, Authority, Issuance 3001.105-1 Publication and code arrangement. (a) The HSAR is published in: (1...

  13. 48 CFR 3001.105-1 - Publication and code arrangement.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 48 Federal Acquisition Regulations System 7 2011-10-01 2011-10-01 false Publication and code arrangement. 3001.105-1 Section 3001.105-1 Federal Acquisition Regulations System DEPARTMENT OF HOMELAND... Purpose, Authority, Issuance 3001.105-1 Publication and code arrangement. (a) The HSAR is published in: (1...

  14. 48 CFR 3001.105-1 - Publication and code arrangement.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... 48 Federal Acquisition Regulations System 7 2012-10-01 2012-10-01 false Publication and code arrangement. 3001.105-1 Section 3001.105-1 Federal Acquisition Regulations System DEPARTMENT OF HOMELAND... Purpose, Authority, Issuance 3001.105-1 Publication and code arrangement. (a) The HSAR is published in: (1...

  15. Moving from Batch to Field Using the RT3D Reactive Transport Modeling System

    NASA Astrophysics Data System (ADS)

    Clement, T. P.; Gautam, T. R.

    2002-12-01

The public domain reactive transport code RT3D (Clement, 1997) is a general-purpose numerical code for solving coupled, multi-species reactive transport in saturated groundwater systems. The code uses MODFLOW to simulate flow and several modules of MT3DMS to simulate the advection and dispersion processes. RT3D employs an operator-split strategy that allows the code to solve the coupled reactive transport problem in a modular fashion. The coupling between reaction and transport is defined through a separate module in which the reaction equations are specified. The code supports a versatile user-defined reaction option that allows users to define their own reaction system through a Fortran-90 subroutine, known as the RT3D-reaction package. Further, a utility code known as BATCHRXN allows users to independently test and debug their reaction package. To analyze a new reaction system at the batch scale, users should first run BATCHRXN to test the ability of their reaction package to model the batch data. After testing, the reaction package can simply be ported to the RT3D environment to study the model response under 1-, 2-, or 3-dimensional transport conditions. This paper presents example problems that demonstrate the methods for moving from batch- to field-scale simulations using the BATCHRXN and RT3D codes. The first example describes a simple first-order reaction system for simulating the sequential degradation of tetrachloroethene (PCE) and its daughter products. The second example uses a relatively complex reaction system for describing the multiple degradation pathways of tetrachloroethane (PCA) and its daughter products. References: 1) Clement, T.P., RT3D - A modular computer code for simulating reactive multi-species transport in 3-dimensional groundwater aquifers, Battelle Pacific Northwest National Laboratory Research Report, PNNL-SA-28967, September 1997. Available at: http://bioprocess.pnl.gov/rt3d.htm.
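The operator-split coupling described above can be illustrated with a minimal 1-D sketch: each time step first advects the species, then applies a sequential first-order decay step (parent to daughter, as in PCE degradation) independently at every cell. This is an illustrative toy, not RT3D itself; all names and parameter values are hypothetical:

```python
import numpy as np

def advect(c, courant):
    # First-order upwind advection on a 1-D grid with periodic
    # boundaries (via np.roll); requires Courant number <= 1.
    return c - courant * (c - np.roll(c, 1))

def react(c_parent, c_daughter, k, dt):
    # Reaction step: sequential first-order decay, parent -> daughter
    # (e.g. PCE -> TCE), applied pointwise as in an operator split.
    decayed = c_parent * np.exp(-k * dt)
    produced = c_parent - decayed          # mass transferred to daughter
    return decayed, c_daughter + produced

pce = np.zeros(50); pce[0:5] = 1.0         # initial parent slug
tce = np.zeros(50)                          # daughter starts at zero
k, dt = 0.05, 1.0
for _ in range(20):                         # split time stepping
    pce = advect(pce, courant=0.5)
    tce = advect(tce, courant=0.5)
    pce, tce = react(pce, tce, k, dt)
```

Because the transport and reaction operators are applied in separate sub-steps, the reaction module can be swapped out without touching the transport solver, which is the modularity the abstract describes.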

  16. New Tool Released for Engine-Airframe Blade-Out Structural Simulations

    NASA Technical Reports Server (NTRS)

    Lawrence, Charles

    2004-01-01

Researchers at the NASA Glenn Research Center have enhanced a general-purpose finite element code, NASTRAN, for engine-airframe structural simulations during steady-state and transient operating conditions. For steady-state simulations, the code can predict critical operating speeds, natural modes of vibration, and forced response (e.g., cabin noise and component fatigue). The code can be used to perform static analysis to predict engine-airframe response and component stresses due to maneuver loads. For transient response, the simulation code can be used to predict response due to blade-out events and subsequent engine shutdown and windmilling conditions. In addition, the code can be used as a pretest analysis tool to predict the results of the blade-out test required for FAA certification of new and derivative aircraft engines. Before the present analysis code was developed, all the major aircraft engine and airframe manufacturers in the United States and overseas were performing similar types of analyses to ensure the structural integrity of engine-airframe systems. Although there were many similarities among the analysis procedures, each manufacturer was developing and maintaining its own structural analysis capabilities independently. This situation led to high software development and maintenance costs, complications when manufacturers exchanged models and results, and limitations in predicting the structural response to the desired degree of accuracy. An industry-NASA team was formed to overcome these problems by developing a common analysis tool that would satisfy all the structural analysis needs of the industry and that would be available and supported by a commercial software vendor, relieving the team members of maintenance and development responsibilities. Input from all the team members was used to ensure that everyone's requirements were satisfied and that the best technology was incorporated into the code.
Furthermore, because the code would be distributed by a commercial software vendor, it would be more readily available to engine and airframe manufacturers, as well as to nonaircraft companies that did not previously have access to this capability.

  17. Residus de 2-formes differentielles sur les surfaces algebriques et applications aux codes correcteurs d'erreurs

    NASA Astrophysics Data System (ADS)

    Couvreur, A.

    2009-05-01

The theory of algebraic-geometric codes was developed in the early 1980s following a paper by V. D. Goppa. Given a smooth projective algebraic curve X over a finite field, there are two different constructions of error-correcting codes. The first, called "functional", uses rational functions on X; the second, called "differential", involves rational 1-forms on this curve. Hundreds of papers are devoted to the study of such codes. In addition, a generalization of the functional construction to algebraic varieties of arbitrary dimension was given by Y. Manin in a 1984 article. A few papers about such codes have been published, but nothing has been done concerning a generalization of the differential construction to the higher-dimensional case. In this thesis, we propose a differential construction of codes on algebraic surfaces. We then study the properties of these codes, particularly their relations with functional codes. A rather surprising fact is that a major difference with the case of curves appears: whereas for curves a differential code is always the orthogonal of a functional one, this assertion generally fails for surfaces. This last observation motivates the study of codes which are the orthogonal of some functional code on a surface. We prove that, under some condition on the surface, these codes can be realized as sums of differential codes. Moreover, we show that answers to some open problems "a la Bertini" could give very interesting information on the parameters of these codes.

  18. An FPGA design of generalized low-density parity-check codes for rate-adaptive optical transport networks

    NASA Astrophysics Data System (ADS)

    Zou, Ding; Djordjevic, Ivan B.

    2016-02-01

Forward error correction (FEC) is one of the key technologies enabling the next generation of high-speed fiber-optic communications. In this paper, we propose a rate-adaptive scheme using a class of generalized low-density parity-check (GLDPC) codes with a Hamming code as the local code. We show that, with the proposed unified GLDPC decoder architecture, variable net coding gains (NCGs) can be achieved with no error floor at a BER down to 10^-15, making it a viable solution for next-generation high-speed fiber-optic communications.
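As a small illustration of the local-code idea, the sketch below performs syndrome decoding for a Hamming(7,4)-style component code, the kind of local constraint a GLDPC decoder enforces on subsets of bits; it is not the paper's FPGA architecture:

```python
import numpy as np

# Parity-check matrix H for Hamming(7,4): column i is the binary
# expansion of i+1, so a nonzero syndrome directly encodes the
# position of a single flipped bit.
H = np.array([[(i >> b) & 1 for i in range(1, 8)] for b in range(3)])

def correct(word):
    """Correct at most one bit flip in a 7-bit word via its syndrome."""
    s = H.dot(word) % 2
    pos = s[0] + 2 * s[1] + 4 * s[2]     # 0 means no error detected
    if pos:
        word = word.copy()
        word[pos - 1] ^= 1               # flip the indicated bit back
    return word

codeword = np.zeros(7, dtype=int)        # the all-zero word is a codeword
received = codeword.copy()
received[3] ^= 1                         # channel flips one bit
```

In a GLDPC code, many such local decoders run over overlapping bit subsets and exchange information through the global sparse graph; the rate adaptation in the paper comes from varying that construction.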

  19. Speech coding and compression using wavelets and lateral inhibitory networks

    NASA Astrophysics Data System (ADS)

    Ricart, Richard

    1990-12-01

The purpose of this thesis is to introduce the concept of lateral inhibition as a generalized technique for compressing time/frequency representations of electromagnetic and acoustical signals, particularly speech. This requires at least a rudimentary treatment of the theory of frames (which generalizes most commonly known time/frequency distributions), the biology of hearing, and digital signal processing. As such, this material, along with the interrelationships of these disparate subjects, is presented in a tutorial style. This may leave the mathematician longing for more rigor, the neurophysiological psychologist longing for more substantive support of the hypotheses presented, and the engineer longing for a reprieve from the theoretical barrage. Despite the problems that arise when trying to appeal to too wide an audience, this thesis should be a cogent analysis of the compression of time/frequency distributions via lateral inhibitory networks.

  20. Development and Validation of Various Phenotyping Algorithms for Diabetes Mellitus Using Data from Electronic Health Records.

    PubMed

    Esteban, Santiago; Rodríguez Tablado, Manuel; Peper, Francisco; Mahumud, Yamila S; Ricci, Ricardo I; Kopitowski, Karin; Terrasa, Sergio

    2017-01-01

Precision medicine requires extremely large samples. Electronic health records (EHR) are thought to be a cost-effective source of data for that purpose. Phenotyping algorithms help reduce classification errors, making EHR a more reliable source of information for research. Four algorithm development strategies for classifying patients according to their diabetes status (diabetic; non-diabetic; inconclusive) were tested: one codes-only algorithm; one Boolean algorithm; four statistical learning algorithms; and six stacked-generalization meta-learners. The best-performing algorithms within each strategy were tested on the validation set. The stacked-generalization algorithm yielded the highest Kappa coefficient in the validation set (0.95; 95% CI 0.91 to 0.98). Implementing these algorithms allows data from thousands of patients to be exploited accurately, greatly reducing the cost of constructing retrospective cohorts for research.
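A stacked-generalization phenotyper can be sketched in miniature: base classifiers (here a codes-only rule and a Boolean lab-value rule) produce predictions, and a meta-learner is fit on their outputs. Everything below, field names included, is hypothetical illustration rather than the paper's models:

```python
# Toy EHR-like records; has_dm_code and glucose are made-up fields.
records = [
    {"has_dm_code": True,  "glucose": 180, "diabetic": True},
    {"has_dm_code": False, "glucose": 90,  "diabetic": False},
    {"has_dm_code": True,  "glucose": 130, "diabetic": True},
    {"has_dm_code": False, "glucose": 210, "diabetic": True},
]

def base_codes(r):     # codes-only strategy: trust the diagnosis code
    return r["has_dm_code"]

def base_rule(r):      # Boolean strategy: threshold on a lab value
    return r["glucose"] >= 126

def fit_meta(rows):
    # Stacked generalization in miniature: weight each base learner by
    # its training accuracy, then predict by weighted vote.
    learners = [base_codes, base_rule]
    weights = [sum(f(r) == r["diabetic"] for r in rows) / len(rows)
               for f in learners]
    def predict(r):
        score = sum(w * (1 if f(r) else -1)
                    for f, w in zip(learners, weights))
        return score > 0
    return predict

meta = fit_meta(records)
```

The point the abstract makes survives even in this toy: record 4 is misclassified by the codes-only rule, but the meta-learner recovers it by leaning on the better-performing base learner.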

  1. PAN AIR modeling studies. [higher order panel method for aircraft design

    NASA Technical Reports Server (NTRS)

    Towne, M. C.; Strande, S. M.; Erickson, L. L.; Kroo, I. M.; Enomoto, F. Y.; Carmichael, R. L.; Mcpherson, K. F.

    1983-01-01

    PAN AIR is a computer program that predicts subsonic or supersonic linear potential flow about arbitrary configurations. The code's versatility and generality afford numerous possibilities for modeling flow problems. Although this generality provides great flexibility, it also means that studies are required to establish the dos and don'ts of modeling. The purpose of this paper is to describe and evaluate a variety of methods for modeling flows with PAN AIR. The areas discussed are effects of panel density, internal flow modeling, forebody modeling in subsonic flow, propeller slipstream modeling, effect of wake length, wing-tail-wake interaction, effect of trailing-edge paneling on the Kutta condition, well- and ill-posed boundary-value problems, and induced-drag calculations. These nine topics address problems that are of practical interest to the users of PAN AIR.

  2. Impact of School Uniforms on Student Discipline and the Learning Climate: A Comparative Case Study of Two Middle Schools with Uniform Dress Codes and Two Middle Schools without Uniform Dress Codes

    ERIC Educational Resources Information Center

    Dulin, Charles Dewitt

    2016-01-01

    The purpose of this research is to evaluate the impact of uniform dress codes on a school's climate for student behavior and learning in four middle schools in North Carolina. The research will compare the perceptions of parents, teachers, and administrators in schools with uniform dress codes against schools without uniform dress codes. This…

  3. Assessment of the prevailing physics codes: LEOPARD, LASER, and EPRI-CELL

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lan, J.S.

    1981-01-01

To analyze core performance and fuel management, it is necessary to verify reactor physics codes in great detail. This kind of work not only serves to clarify and control the characteristics of each code, but also ensures reliability as codes continually change through modifications and machine transfers. This paper presents the results of a comprehensive verification of three code packages: LEOPARD, LASER, and EPRI-CELL.

  4. Regulatory changes that affect coding for immunotherapy.

    PubMed

    Atwater, J Spencer

    2006-02-01

During the past decade, a variety of federal regulations have had a significant impact on the way allergen immunotherapy is reimbursed and how Current Procedural Terminology (CPT) codes are used for this purpose. As mandated by the US Congress, the Centers for Medicare and Medicaid Services (CMS), through the Office of the Inspector General (OIG), targeted immunotherapy codes for scrutiny because they are some of the most frequently used codes. To examine how federal regulations have affected reimbursement for allergy immunotherapy and other allergy services, a review was performed of the OIG survey of allergy immunotherapy and the OIG recommendations on CPT coding compliance guidelines. A preliminary survey found problems with the medical appropriateness of allergen immunotherapy. For this reason, the OIG performed a more comprehensive study of 301 physicians using code 95165 to analyze, by medical record and billing data, whether the new billing rules were being correctly used, and found that only 44% of physicians were following the new definition of a billable dose. In the early 1990s, the federal government served notice of its intent to more aggressively identify and prosecute health care providers who improperly billed and collected for medical services. Through the adoption of the 1991 US Sentencing Commission Guidelines, the government sought to enhance compliance by mandating lesser criminal penalties for violating organizations that nevertheless maintained and operated "effective compliance plans." In 2002, the OIG audited health care providers and recouped $14.4 billion in improper payments by Medicare. Between January and June 2003, Medicare excluded 1,241 individual providers and health care entities due to fraudulent billing practices. Federal regulations have significantly affected reimbursement for allergy immunotherapy and other allergy services.
Allergists need to be aware of these changes and implement the new recommendations into their practices.

  5. Assigning clinical codes with data-driven concept representation on Dutch clinical free text.

    PubMed

    Scheurwegs, Elyne; Luyckx, Kim; Luyten, Léon; Goethals, Bart; Daelemans, Walter

    2017-05-01

Clinical codes are used for public reporting purposes, are fundamental to determining public financing for hospitals, and form the basis for reimbursement claims to insurance providers. They are assigned to a patient stay to reflect the diagnosis and the procedures performed during that stay. This paper aims to enrich algorithms for automated clinical coding by taking a data-driven approach and by using unsupervised and semi-supervised techniques for the extraction of multi-word expressions that convey a generalisable medical meaning (referred to as concepts). Several methods for extracting concepts from text are compared, two of which are constructed from a large unannotated corpus of clinical free text. A distributional semantic model (in this case, the word2vec skip-gram model) is used to generalize over concepts and retrieve relations between them. These methods are validated on three sets of patient-stay data in the disease areas of urology, cardiology, and gastroenterology. The datasets are in Dutch, which limits the available concept definitions from expert-based ontologies (e.g., UMLS). The results show that when expert-based knowledge in ontologies is unavailable, concepts derived from raw clinical texts are a reliable alternative. Both concepts derived from raw clinical texts and concepts derived from expert-created dictionaries outperform a bag-of-words approach in clinical code assignment. Adding features based on tokens that appear in a semantically similar context has a positive influence on predicting diagnostic codes. Furthermore, the experiments indicate that a distributional semantics model can find relations between semantically related concepts in texts but also introduces erroneous and redundant relations, which can undermine clinical coding performance.
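The first step behind a word2vec-style skip-gram model is extracting (target, context) training pairs from running text. A minimal sketch, with a made-up token list standing in for the Dutch clinical corpus:

```python
def skipgram_pairs(tokens, window=2):
    """Yield (target, context) pairs within a symmetric window, the
    raw training material for a skip-gram embedding model."""
    pairs = []
    for i, target in enumerate(tokens):
        lo, hi = max(0, i - window), min(len(tokens), i + window + 1)
        for j in range(lo, hi):
            if j != i:                      # skip the target itself
                pairs.append((target, tokens[j]))
    return pairs

# Hypothetical clinical token stream (English stand-in).
toy = ["acute", "cystitis", "treated", "with", "antibiotics"]
pairs = skipgram_pairs(toy, window=1)
```

A trained model then learns embeddings so that words sharing many context pairs land near each other, which is how the paper's approach finds semantically related concepts (and, as noted, occasionally spurious ones).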

  6. Introduction of the ASP3D Computer Program for Unsteady Aerodynamic and Aeroelastic Analyses

    NASA Technical Reports Server (NTRS)

    Batina, John T.

    2005-01-01

    A new computer program has been developed called ASP3D (Advanced Small Perturbation 3D), which solves the small perturbation potential flow equation in an advanced form including mass-consistent surface and trailing wake boundary conditions, and entropy, vorticity, and viscous effects. The purpose of the program is for unsteady aerodynamic and aeroelastic analyses, especially in the nonlinear transonic flight regime. The program exploits the simplicity of stationary Cartesian meshes with the movement or deformation of the configuration under consideration incorporated into the solution algorithm through a planar surface boundary condition. The new ASP3D code is the result of a decade of developmental work on improvements to the small perturbation formulation, performed while the author was employed as a Senior Research Scientist in the Configuration Aerodynamics Branch at the NASA Langley Research Center. The ASP3D code is a significant improvement to the state-of-the-art for transonic aeroelastic analyses over the CAP-TSD code (Computational Aeroelasticity Program Transonic Small Disturbance), which was developed principally by the author in the mid-1980s. The author is in a unique position as the developer of both computer programs to compare, contrast, and ultimately make conclusions regarding the underlying formulations and utility of each code. The paper describes the salient features of the ASP3D code including the rationale for improvements in comparison with CAP-TSD. Numerous results are presented to demonstrate the ASP3D capability. The general conclusion is that the new ASP3D capability is superior to the older CAP-TSD code because of the myriad improvements developed and incorporated.

  7. Implementation and Evaluation of Microcomputer Systems for the Republic of Turkey’s Naval Ships.

    DTIC Science & Technology

    1986-03-01

    important database design tool for both logical and physical database design, such as flowcharts or pseudocodes are used for program design. Logical...string manipulation in FORTRAN is difficult but not impossible. BASIC ( Beginners All-Purpose Symbolic Instruction Code): Basic is currently the most...63 APPENDIX B GLOSSARY/ACRONYM LIST AC Alternating Current AP Application Program BASIC Beginners All-purpose Symbolic Instruction Code CCP

  8. IRIG Serial Time Code Formats

    DTIC Science & Technology

    2016-08-01

codes contain control functions (CFs) that are reserved for encoding various controls, identification, and other special-purpose functions. Time...set of CF bits for the encoding of various control, identification, and other special-purpose functions. The control bits may be programmed in any... recycles yearly. • There are 18 CFs that occur between position identifiers P6 and P8. Any CF bit or combination of bits can be programmed to read a

  9. 28 CFR 36.602 - General rule.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... ACCOMMODATIONS AND IN COMMERCIAL FACILITIES Certification of State Laws or Local Building Codes § 36.602 General... that a code meets or exceeds the minimum requirements of the Act for the accessibility and usability of...

  10. 28 CFR 36.602 - General rule.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... ACCOMMODATIONS AND IN COMMERCIAL FACILITIES Certification of State Laws or Local Building Codes § 36.602 General... that a code meets or exceeds the minimum requirements of the Act for the accessibility and usability of...

  11. 28 CFR 36.602 - General rule.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... ACCOMMODATIONS AND IN COMMERCIAL FACILITIES Certification of State Laws or Local Building Codes § 36.602 General... that a code meets or exceeds the minimum requirements of the Act for the accessibility and usability of...

  12. 28 CFR 36.602 - General rule.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... ACCOMMODATIONS AND IN COMMERCIAL FACILITIES Certification of State Laws or Local Building Codes § 36.602 General... that a code meets or exceeds the minimum requirements of the Act for the accessibility and usability of...

  13. 28 CFR 36.602 - General rule.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... ACCOMMODATIONS AND IN COMMERCIAL FACILITIES Certification of State Laws or Local Building Codes § 36.602 General... that a code meets or exceeds the minimum requirements of the Act for the accessibility and usability of...

  14. seismo-live: Training in Computational Seismology using Jupyter Notebooks

    NASA Astrophysics Data System (ADS)

    Igel, H.; Krischer, L.; van Driel, M.; Tape, C.

    2016-12-01

Practical training in computational methodologies is still underrepresented in Earth science curricula despite the increasing use of sometimes highly sophisticated simulation technologies in research projects. At the same time, well-engineered community codes make it easy to produce simulation-based results, with the danger that the inherent traps of numerical solutions are not well understood. It is our belief that training with highly simplified numerical solutions (here, to the equations describing elastic wave propagation) built from carefully chosen elementary ingredients of simulation technology (e.g., finite differencing, function interpolation, spectral derivatives, numerical integration) could substantially improve this situation. For this purpose we have initiated a community platform (www.seismo-live.org) where Python-based Jupyter notebooks can be accessed and run without any downloads or local software installation. The increasingly popular Jupyter notebooks allow combining markup text, graphics, and equations with interactive, executable Python code. We demonstrate the potential with training notebooks for the finite-difference method, pseudospectral methods, finite/spectral element methods, the finite-volume method, and the discontinuous Galerkin method. The platform already includes general Python training, an introduction to the ObsPy library for seismology, and seismic data processing and noise analysis. Submission of Jupyter notebooks for general seismology is encouraged. The platform can be used for complementary teaching in Earth science courses in compute-intensive research areas.
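The kind of elementary ingredient taught on the platform can be sketched in a few lines, e.g. a 1-D wave equation advanced with second-order finite differences; grid sizes and the CFL choice below are illustrative only:

```python
import numpy as np

# 1-D acoustic wave equation u_tt = c^2 u_xx with fixed ends,
# second-order centered differences in space and time.
nx, nt = 200, 400
dx, c = 1.0, 1.0
dt = 0.5 * dx / c                      # CFL-stable time step (CFL = 0.5)
u_prev = np.zeros(nx)
u = np.zeros(nx)
u[nx // 2] = 1.0                       # initial point displacement pulse
for _ in range(nt):
    lap = np.zeros(nx)
    lap[1:-1] = u[2:] - 2 * u[1:-1] + u[:-2]   # discrete Laplacian
    u_next = 2 * u - u_prev + (c * dt / dx) ** 2 * lap
    u_prev, u = u, u_next
```

This is exactly the type of exercise where the "inherent traps" mentioned above become tangible: raising the CFL number above 1 makes the same loop blow up, a point the notebooks let students discover interactively.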

  15. File compression and encryption based on LLS and arithmetic coding

    NASA Astrophysics Data System (ADS)

    Yu, Changzhi; Li, Hengjian; Wang, Xiyu

    2018-03-01

We propose a file compression model based on arithmetic coding. First, the original symbols to be encoded are input to the encoder one by one; a set of chaotic sequences is produced using the logistic and sine chaos system (LLS), and the values of these chaotic sequences randomly modify the upper and lower limits of the current symbol's probability interval. To achieve encryption, the upper and lower limits of all character probabilities are modified as each symbol is encoded. Experimental results show that the proposed model achieves data encryption while attaining almost the same compression efficiency as plain arithmetic coding.
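The chaotic driver alone is easy to sketch: a logistic-map iteration whose values could perturb the symbol-probability bounds. The combined logistic-sine (LLS) system and the arithmetic coder itself are not reproduced; the function below is a hedged stand-in:

```python
def logistic_sequence(x0, r=3.99, n=24):
    """Generate n values of the logistic map x -> r*x*(1-x).
    For r near 4 and x0 in (0, 1) the orbit is chaotic, so a tiny
    change in the seed (the key) yields a different value stream."""
    seq, x = [], x0
    for _ in range(n):
        x = r * x * (1.0 - x)
        seq.append(x)
    return seq

key_a = logistic_sequence(0.3141592)
key_b = logistic_sequence(0.3141593)   # slightly different key
```

Key sensitivity is the property the scheme relies on: without the exact seed, a decoder cannot reconstruct the perturbed probability intervals and the arithmetic decoding stream diverges.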

  16. Performance Analysis and Optimization on the UCLA Parallel Atmospheric General Circulation Model Code

    NASA Technical Reports Server (NTRS)

    Lou, John; Ferraro, Robert; Farrara, John; Mechoso, Carlos

    1996-01-01

An analysis is presented of several factors influencing the performance of a parallel implementation of the UCLA atmospheric general circulation model (AGCM) on massively parallel computer systems. Several modifications to the original parallel AGCM code aimed at improving its numerical efficiency, interprocessor communication cost, load balance, and single-node code performance are discussed.

  17. [Prevalence of chronic fatigue syndrome in 4 family practices in Leiden].

    PubMed

    Versluis, R G; de Waal, M W; Opmeer, C; Petri, H; Springer, M P

    1997-08-02

    To determine the prevalence of chronic fatigue syndrome (CFS) in general practice. Descriptive. General practice and primary health care centres in the Leiden region, the Netherlands. RNUH-LEO is a computerized database which contains the anonymous patient information of one general practice (with two practitioners) and four primary health care centres. The fourteen participating general practitioners were asked what International Classification of Primary Care (ICPC) code they used to indicate a patient with chronic fatigue or with CFS. With these codes, and with the code for depression, patients were selected from the database. It was then determined whether these patients met the CFS criteria of Holmes et al. The general practitioners used 10 codes. Including the code for depression, a total of 601 patients were preselected from a total of 23,000 patients in the database. Based on the information from the patients' records in the database, 42 of the preselected patients were selected who might fulfil the Holmes criteria of CFS. According to the patients' own general practitioners, 25 of the 42 patients would fulfil the Holmes criteria. The men:women ratio was 1:5. The prevalence of CFS in the population surveyed was estimated to be at least 1.1 per 1,000 patients.

  18. Fisher Matrix Preloaded — FISHER4CAST

    NASA Astrophysics Data System (ADS)

    Bassett, Bruce A.; Fantaye, Yabebal; Hlozek, Renée; Kotze, Jacques

    The Fisher matrix is the backbone of modern cosmological forecasting. We describe the Fisher4Cast software: a general-purpose, easy-to-use Fisher matrix framework. It is open source, rigorously designed and tested, and includes a Graphical User Interface (GUI) with automated LaTeX file creation capability and point-and-click Fisher ellipse generation. Fisher4Cast was designed for ease of extension and, although written in Matlab, is easily portable to open-source alternatives such as Octave and Scilab. Here we use Fisher4Cast to present new 3D and 4D visualizations of the forecasting landscape and to investigate the effects of growth and curvature on future cosmological surveys. Early releases have been available since mid-2008. The current release of the code is Version 2.2, which is described here. For ease of reference a Quick Start guide and the code used to produce the figures in this paper are included, in the hope that they will be useful to the cosmology and wider scientific communities.
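    The core computation behind point-and-click ellipse generation can be sketched in a few lines (illustrative Python with an invented Fisher matrix, not Fisher4Cast's Matlab code): invert the Fisher matrix to obtain the parameter covariance, then eigen-decompose it for the 1-sigma error ellipse.

```python
import numpy as np

# Hypothetical 2x2 Fisher matrix for two parameters; values are invented.
F = np.array([[4.0, 1.0],
              [1.0, 2.0]])
C = np.linalg.inv(F)              # parameter covariance matrix (Cramer-Rao bound)
vals, vecs = np.linalg.eigh(C)    # eigenvalues/vectors of the covariance
semi_axes = np.sqrt(vals)         # 1-sigma semi-axis lengths of the ellipse
angle = np.degrees(np.arctan2(vecs[1, 1], vecs[0, 1]))  # orientation of major axis

print(semi_axes, angle)
```

    Larger Fisher information (e.g., from a deeper survey) shrinks the covariance eigenvalues and hence the forecast ellipse, which is the quantity Fisher4Cast visualizes.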

  19. 2-3D nonlocal transport model in magnetized laser plasmas.

    NASA Astrophysics Data System (ADS)

    Nicolaï, Philippe; Feugeas, Jean-Luc; Schurtz, Guy

    2004-11-01

    We present a model of nonlocal transport for multidimensional radiation magneto-hydrodynamics codes. This model, based on simplified Fokker-Planck equations, aims at extending the formulae of G. Schurtz, Ph. Nicolaï and M. Busquet [Phys. Plasmas 7, 4238 (2000)] to magnetized plasmas. The improvements concern various points, such as the electric field effects on nonlocal transport or, conversely, the kinetic effects on the E field. However, the main purpose of this work is to generalize the previous model by including magnetic field effects. A complete system of nonlocal equations is derived from kinetic equations with self-consistent E and B fields. These equations are analyzed and simplified in order to be implemented into large laser fusion codes and coupled to other relevant physics. Finally, our model allows us to obtain the deformation of the electron distribution function due to nonlocal effects. This deformation leads to a non-Maxwellian function which could be used to compute its influence on other physical processes.

  20. PyVCI: A flexible open-source code for calculating accurate molecular infrared spectra

    NASA Astrophysics Data System (ADS)

    Sibaev, Marat; Crittenden, Deborah L.

    2016-06-01

    The PyVCI program package is a general purpose open-source code for simulating accurate molecular spectra, based upon force field expansions of the potential energy surface in normal mode coordinates. It includes harmonic normal coordinate analysis and vibrational configuration interaction (VCI) algorithms, implemented primarily in Python for accessibility but with time-consuming routines written in C. Coriolis coupling terms may be optionally included in the vibrational Hamiltonian. Non-negligible VCI matrix elements are stored in sparse matrix format to alleviate the diagonalization problem. CPU and memory requirements may be further controlled by algorithmic choices and/or numerical screening procedures, and recommended values are established by benchmarking using a test set of 44 molecules for which accurate analytical potential energy surfaces are available. Force fields in normal mode coordinates are obtained from the PyPES library of high quality analytical potential energy surfaces (to 6th order) or by numerical differentiation of analytic second derivatives generated using the GAMESS quantum chemical program package (to 4th order).

  1. PyMercury: Interactive Python for the Mercury Monte Carlo Particle Transport Code

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Iandola, F N; O'Brien, M J; Procassini, R J

    2010-11-29

    Monte Carlo particle transport applications are often written in low-level languages (C/C++) for optimal performance on clusters and supercomputers. However, this development approach often sacrifices straightforward usability and testing in the interest of fast application performance. To improve usability, some high-performance computing applications employ mixed-language programming with high-level and low-level languages. In this study, we consider the benefits of incorporating an interactive Python interface into a Monte Carlo application. With PyMercury, a new Python extension to the Mercury general-purpose Monte Carlo particle transport code, we improve application usability without diminishing performance. In two case studies, we illustrate how PyMercury improves usability and simplifies testing and validation in a Monte Carlo application. In short, PyMercury demonstrates the value of interactive Python for Monte Carlo particle transport applications. In the future, we expect interactive Python to play an increasingly significant role in Monte Carlo usage and testing.
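    The mixed-language pattern described here can be illustrated in miniature (hedged: PyMercury's actual binding machinery is not described in this abstract; the sketch below simply drives a compiled C library from interactive Python via ctypes, assuming a standard libm is locatable on the system).

```python
import ctypes
import ctypes.util

# Load a compiled low-level library (the C math library stands in for a
# performance-critical transport kernel in this illustration).
libm = ctypes.CDLL(ctypes.util.find_library("m"))
libm.sqrt.restype = ctypes.c_double
libm.sqrt.argtypes = [ctypes.c_double]

# Interactive-style check of the fast compiled routine from Python: this is
# the usability/testing benefit the mixed-language approach provides.
print(libm.sqrt(2.0))
```

    The compute-heavy code stays in the compiled language, while validation, scripting, and exploratory testing happen at the Python prompt.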

  2. Optimization Issues with Complex Rotorcraft Comprehensive Analysis

    NASA Technical Reports Server (NTRS)

    Walsh, Joanne L.; Young, Katherine C.; Tarzanin, Frank J.; Hirsh, Joel E.; Young, Darrell K.

    1998-01-01

    This paper investigates the use of the general purpose automatic differentiation (AD) tool called Automatic Differentiation of FORTRAN (ADIFOR) as a means of generating sensitivity derivatives for use in Boeing Helicopter's proprietary comprehensive rotor analysis code (VII). ADIFOR transforms an existing computer program into a new program that performs a sensitivity analysis in addition to the original analysis. In this study both the pros (exact derivatives, no step-size problems) and cons (more CPU, more memory) of ADIFOR are discussed. The size (based on the number of lines) of the VII code after ADIFOR processing increased by 70 percent and resulted in substantial computer memory requirements at execution. The ADIFOR derivatives took about 75 percent longer to compute than the finite-difference derivatives. However, the ADIFOR derivatives are exact and are not functions of step-size. The VII sensitivity derivatives generated by ADIFOR are compared with finite-difference derivatives. The ADIFOR and finite-difference derivatives are used in three optimization schemes to solve a low vibration rotor design problem.
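    The principle that makes the ADIFOR derivatives exact can be shown with a toy forward-mode AD implementation (hedged: ADIFOR itself performs source-to-source transformation of Fortran, not operator overloading; this Python sketch only demonstrates why AD has no step-size error while finite differences do).

```python
import math

class Dual:
    """Number carrying a value `v` and a derivative `d` via the chain rule."""
    def __init__(self, v, d=0.0):
        self.v, self.d = v, d
    def __mul__(self, o):
        # Product rule: (fg)' = f'g + fg'
        return Dual(self.v * o.v, self.v * o.d + self.d * o.v)
    def sin(self):
        return Dual(math.sin(self.v), math.cos(self.v) * self.d)

def f(x):                 # example analysis function: f(x) = x * sin(x)
    return x * x.sin()

x0 = 1.2
ad = f(Dual(x0, 1.0)).d                       # exact derivative via forward AD
h = 1e-6                                      # finite-difference step size
fd = (f(Dual(x0 + h)).v - f(Dual(x0 - h)).v) / (2 * h)
exact = math.sin(x0) + x0 * math.cos(x0)      # analytic derivative for comparison

print(ad - exact, fd - exact)   # AD error is at machine precision; FD is not
```

    The AD result is independent of any step size, which is precisely the advantage the paper reports over finite-difference derivatives, at the cost of extra CPU and memory.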

  3. High Temperature Composite Analyzer (HITCAN) demonstration manual, version 1.0

    NASA Technical Reports Server (NTRS)

    Singhal, S. N.; Lackney, J. J.; Murthy, P. L. N.

    1993-01-01

    This manual comprises a variety of demonstration cases for the HITCAN (HIgh Temperature Composite ANalyzer) code. HITCAN is a general purpose computer program for predicting nonlinear global structural and local stress-strain response of arbitrarily oriented, multilayered high temperature metal matrix composite structures. HITCAN is written in FORTRAN 77 computer language and has been configured and executed on the NASA Lewis Research Center CRAY XMP and YMP computers. Detailed description of all program variables and terms used in this manual may be found in the User's Manual. The demonstration includes various cases to illustrate the features and analysis capabilities of the HITCAN computer code. These cases include: (1) static analysis, (2) nonlinear quasi-static (incremental) analysis, (3) modal analysis, (4) buckling analysis, (5) fiber degradation effects, (6) fabrication-induced stresses for a variety of structures; namely, beam, plate, ring, shell, and built-up structures. A brief discussion of each demonstration case with the associated input data file is provided. Sample results taken from the actual computer output are also included.

  4. Astrophysical Supercomputing with GPUs: Critical Decisions for Early Adopters

    NASA Astrophysics Data System (ADS)

    Fluke, Christopher J.; Barnes, David G.; Barsdell, Benjamin R.; Hassan, Amr H.

    2011-01-01

    General-purpose computing on graphics processing units (GPGPU) is dramatically changing the landscape of high performance computing in astronomy. In this paper, we identify and investigate several key decision areas, with a goal of simplifying the early adoption of GPGPU in astronomy. We consider the merits of OpenCL as an open standard in order to reduce risks associated with coding in a native, vendor-specific programming environment, and present a GPU programming philosophy based on using brute force solutions. We assert that effective use of new GPU-based supercomputing facilities will require a change in approach from astronomers. This will likely include improved programming training, an increased need for software development best practice through the use of profiling and related optimisation tools, and a greater reliance on third-party code libraries. As with any new technology, those willing to take the risks and make the investment of time and effort to become early adopters of GPGPU in astronomy, stand to reap great benefits.

  5. HPGMG 1.0: A Benchmark for Ranking High Performance Computing Systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Adams, Mark; Brown, Jed; Shalf, John

    2014-05-05

    This document provides an overview of the benchmark, HPGMG, for ranking large scale general purpose computers for use on the Top500 list [8]. We provide a rationale for the need for a replacement for the current metric HPL, some background of the Top500 list, and the challenges of developing such a metric; we discuss our design philosophy and methodology, and give an overview of the specification of the benchmark. The primary documentation with maintained details on the specification can be found at hpgmg.org, and the Wiki and benchmark code itself can be found in the repository https://bitbucket.org/hpgmg/hpgmg.

  6. Development and Implementation of Kumamoto Technopolis Regional Database T-KIND

    NASA Astrophysics Data System (ADS)

    Onoue, Noriaki

    T-KIND (Techno-Kumamoto Information Network for Data-Base) is a system for effectively searching information on technology, human resources and industries which are necessary to realize Kumamoto Technopolis. It is composed of a coded database, an image database and a LAN inside the techno-research park which is the center of R & D in the Technopolis. It constructs an on-line system by networking general-purpose computers, minicomputers, optical disk file systems and so on, and provides the service through the public telephone line. Two databases are now available, on enterprise information and human resource information. The former covers about 4,000 enterprises, and the latter about 2,000 persons.

  7. To amend the Internal Revenue Code of 1986 to repeal the new carryover basis rules in order to prevent tax increases and the imposition of compliance burdens on many more estates than would benefit from repeal, to retain the estate tax with a $3,500,000 exemption, to reinstitute and update the Pay-As-You-Go requirement of budget neutrality on new tax and mandatory spending legislation, enforced by the threat of annual, automatic sequestration, and for other purposes.

    THOMAS, 111th Congress

    Rep. Pomeroy, Earl [D-ND-At Large]

    2009-11-19

    Senate - 01/20/2010 Read the second time. Placed on Senate Legislative Calendar under General Orders. Calendar No. 253. Latest status: Passed House.

  8. Cable-Porn and Dial-A-Porn Control Act. Hearing before the Subcommittee on Criminal Law of the Committee on the Judiciary. United States Senate, Ninety-Ninth Congress, First Session on S. 1090. A Bill to Amend Section 1464 of Title 18, United States Code, Relating to Broadcasting Obscene Language, and for Other Purposes (July 31, 1985).

    ERIC Educational Resources Information Center

    Congress of the U.S., Washington, DC. Senate Committee on the Judiciary.

    A Senate hearing on the cable porn and dial-a-porn control bill is presented in this document. Opening statements by Senators Jeremiah Denton, Arlen Specter, and Jesse Helms discuss the need for this bill and its content. The text of the bill itself is included. Jack D. Smith, General Counsel of the Federal Communications Commission (FCC)…

  9. Energy Levels and Radiative Rates for Transitions in F-like Sc XIII and Ne-like Sc XII and Y XXX

    NASA Astrophysics Data System (ADS)

    Aggarwal, Kanti

    2018-05-01

    Energy levels, radiative rates and lifetimes are reported for F-like Sc XIII and Ne-like Sc XII and Y XXX, for which the general-purpose relativistic atomic structure package (GRASP) has been adopted. For all three ions limited data exist in the literature, but comparisons have been made wherever possible to assess the accuracy of the calculations. In the present work the lowest 102, 125 and 139 levels have been considered for the respective ions. Additionally, calculations have also been performed with the flexible atomic code (FAC) to (particularly) confirm the accuracy of energy levels.

  10. Space shuttle main engine high pressure fuel pump aft platform seal cavity flow analysis

    NASA Technical Reports Server (NTRS)

    Lowry, S. A.; Keeton, L. W.

    1987-01-01

    A general purpose, three-dimensional computational fluid dynamics code named PHOENICS, developed by CHAM Inc., is used to model the flow in the aft-platform seal cavity in the high pressure fuel pump of the space shuttle main engine. The model is used to predict the temperatures, velocities, and pressures in the cavity for six different sets of boundary conditions. The results are presented as input for further analysis of two known problems in the region, specifically: erratic pressures and temperatures in the adjacent coolant liner cavity and cracks in the blade shanks near the outer diameter of the aft-platform seal.

  11. Construction safety program for the National Ignition Facility, July 30, 1999 (NIF-0001374-OC)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Benjamin, D W

    1999-07-30

    These rules apply to all LLNL employees, non-LLNL employees (including contract labor, supplemental labor, vendors, personnel matrixed/assigned from other National Laboratories, participating guests, visitors and students) and contractors/subcontractors. The General Rules-Code of Safe Practices shall be used by management to promote accident prevention through indoctrination, safety and health training and on-the-job application. As a condition for contract award, all contractors and subcontractors and their employees must certify on Form S and H A-l that they have read and understand, or have been briefed and understand, the National Ignition Facility OCIP Project General Rules-Code of Safe Practices. (An interpreter must brief those employees who do not speak or read English fluently.) In addition, all contractors and subcontractors shall adopt a written General Rules-Code of Safe Practices that relates to their operations. The General Rules-Code of Safe Practices must be posted at a conspicuous location at the job site office or be provided to each supervisory employee who shall have it readily available. Copies of the General Rules-Code of Safe Practices can also be included in employee safety pamphlets.

  12. The revised APTA code of ethics for the physical therapist and standards of ethical conduct for the physical therapist assistant: theory, purpose, process, and significance.

    PubMed

    Swisher, Laura Lee; Hiller, Peggy

    2010-05-01

    In June 2009, the House of Delegates (HOD) of the American Physical Therapy Association (APTA) passed a major revision of the APTA Code of Ethics for physical therapists and the Standards of Ethical Conduct for the Physical Therapist Assistant. The revised documents will be effective July 1, 2010. The purposes of this article are: (1) to provide a historical, professional, and theoretical context for this important revision; (2) to describe the 4-year revision process; (3) to examine major features of the documents; and (4) to discuss the significance of the revisions from the perspective of the maturation of physical therapy as a doctoring profession. PROCESS OF REVISION: The process for revision is delineated within the context of history and the Bylaws of APTA. FORMAT, STRUCTURE, AND CONTENT OF REVISED CORE ETHICS DOCUMENTS: The revised documents represent a significant change in format, level of detail, and scope of application. Previous APTA Codes of Ethics and Standards of Ethical Conduct for the Physical Therapist Assistant have delineated very broad general principles, with specific obligations spelled out in the Ethics and Judicial Committee's Guide for Professional Conduct and Guide for Conduct of the Physical Therapist Assistant. In contrast to the current documents, the revised documents address all 5 roles of the physical therapist, delineate ethical obligations in organizational and business contexts, and align with the tenets of Vision 2020. The significance of this revision is discussed within historical parameters, the implications for physical therapists and physical therapist assistants, the maturation of the profession, societal accountability and moral community, potential regulatory implications, and the inclusive and deliberative process of moral dialogue by which changes were developed, revised, and approved.

  13. QR codes: next level of social media.

    PubMed

    Gottesman, Wesley; Baum, Neil

    2013-01-01

    The QR code (short for quick response code) system was invented in Japan for the auto industry. Its purpose was to track vehicles during manufacture; it was designed to allow high-speed component scanning. Now the scanning can be easily accomplished via cell phone, making the technology useful and within reach of your patients. There are numerous applications for QR codes in the contemporary medical practice. This article describes QR codes and how they might be applied for marketing and practice management.

  14. Bayesian Atmospheric Radiative Transfer (BART): Model, Statistics Driver, and Application to HD 209458b

    NASA Astrophysics Data System (ADS)

    Cubillos, Patricio; Harrington, Joseph; Blecic, Jasmina; Stemm, Madison M.; Lust, Nate B.; Foster, Andrew S.; Rojo, Patricio M.; Loredo, Thomas J.

    2014-11-01

    Multi-wavelength secondary-eclipse and transit depths probe the thermo-chemical properties of exoplanets. In recent years, several research groups have developed retrieval codes to analyze the existing data and study the prospects of future facilities. However, the scientific community has limited access to these packages. Here we premiere the open-source Bayesian Atmospheric Radiative Transfer (BART) code. We discuss the key aspects of the radiative-transfer algorithm and the statistical package. The radiation code includes line databases for all HITRAN molecules, high-temperature H2O, TiO, and VO, and includes a preprocessor for adding additional line databases without recompiling the radiation code. Collision-induced absorption lines are available for H2-H2 and H2-He. The parameterized thermal and molecular abundance profiles can be modified arbitrarily without recompilation. The generated spectra are integrated over arbitrary bandpasses for comparison to data. BART's statistical package, Multi-core Markov-chain Monte Carlo (MC3), is a general-purpose MCMC module. MC3 implements the Differential-Evolution Markov-chain Monte Carlo algorithm (ter Braak 2006, 2009). MC3 converges 20-400 times faster than the usual Metropolis-Hastings MCMC algorithm, and in addition uses the Message Passing Interface (MPI) to parallelize the MCMC chains. We apply the BART retrieval code to the HD 209458b data set to estimate the planet's temperature profile and molecular abundances. This work was supported by NASA Planetary Atmospheres grant NNX12AI69G and NASA Astrophysics Data Analysis Program grant NNX13AF38G. JB holds a NASA Earth and Space Science Fellowship.
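    The differential-evolution proposal behind MC3's speed-up can be sketched on a toy target (hedged: this is a minimal illustration of ter Braak's scheme, not MC3's implementation; the target, chain count, and tuning values are invented for the example).

```python
import numpy as np

rng = np.random.default_rng(1)

def log_post(x):
    return -0.5 * x ** 2          # toy target: standard normal, up to a constant

n_chains, n_steps = 8, 2000
chains = rng.normal(0.0, 3.0, n_chains)   # over-dispersed starting points
samples = []

gamma = 2.38 / np.sqrt(2 * 1)     # recommended scale factor for d = 1 parameter
for _ in range(n_steps):
    for i in range(n_chains):
        others = [j for j in range(n_chains) if j != i]
        a, b = rng.choice(others, size=2, replace=False)
        # Proposal: current state plus a scaled difference of two other chains,
        # plus a small jitter; the chains adapt the step to the target's scale.
        prop = chains[i] + gamma * (chains[a] - chains[b]) + rng.normal(0.0, 1e-4)
        if np.log(rng.random()) < log_post(prop) - log_post(chains[i]):
            chains[i] = prop      # Metropolis accept/reject step
        samples.append(chains[i])

post = np.array(samples[n_chains * 500:])  # discard burn-in
print(post.mean(), post.std())             # should approach 0 and 1 here
```

    Because the proposal scale is learned from the inter-chain differences rather than hand-tuned, the sampler adapts automatically, which is one reason for the faster convergence reported over plain Metropolis-Hastings.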

  15. ActiWiz 3 – an overview of the latest developments and their application

    NASA Astrophysics Data System (ADS)

    Vincke, H.; Theis, C.

    2018-06-01

    In 2011 the ActiWiz code was developed at CERN in order to optimize the choice of materials for accelerator equipment from a radiological point of view. Since then the code has been extended to allow for calculating complete nuclide inventories and provide evaluations with respect to radiotoxicity, inhalation doses, etc. Until now the software included only pre-defined radiation environments for CERN's high-energy proton accelerators which were based on FLUKA Monte Carlo calculations. Eventually the decision was taken to invest into a major revamping of the code. Starting with version 3 the software is not limited anymore to pre-defined radiation fields but within a few seconds it can also treat arbitrary environments of which fluence spectra are available. This has become possible due to the use of ~100 CPU years' worth of FLUKA Monte Carlo simulations as well as the JEFF cross-section library for neutrons < 20 MeV. Eventually the latest code version allowed for the efficient inclusion of 42 additional radiation environments of the LHC experiments as well as considerably more flexibility in view of characterizing also waste from CERN's Large Electron Positron collider (LEP). New fully integrated analysis functionalities like automatic evaluation of difficult-to-measure nuclides, rapid assessment of the temporal evolution of quantities like radiotoxicity or dose-rates, etc. make the software a powerful tool for characterization complementary to general purpose MC codes like FLUKA. In this paper an overview of the capabilities will be given using recent examples from the domain of waste characterization as well as operational radiation protection.

  16. EASY-II Renaissance: n, p, d, α, γ-induced Inventory Code System

    NASA Astrophysics Data System (ADS)

    Sublet, J.-Ch.; Eastwood, J. W.; Morgan, J. G.

    2014-04-01

    The European Activation SYstem has been re-engineered and re-written in modern programming languages so as to answer today's and tomorrow's needs in terms of activation, transmutation, depletion, decay and processing of radioactive materials. The new FISPACT-II inventory code development project has allowed us to embed many more features in terms of energy range: up to GeV; incident particles: alpha, gamma, proton, deuteron and neutron; and neutron physics: self-shielding effects, temperature dependence and covariance, so as to cover all anticipated application needs: nuclear fission and fusion, accelerator physics, isotope production, stockpile and fuel cycle stewardship, materials characterization and life, and storage cycle management. In parallel, the maturity of modern, truly general purpose libraries encompassing thousands of target isotopes such as TENDL-2012, the evolution of the ENDF-6 format and the capabilities of the latest generation of processing codes PREPRO, NJOY and CALENDF have allowed the activation code to be fed with more robust, complete and appropriate data: cross sections with covariance, probability tables in the resonance ranges, kerma, dpa, gas and radionuclide production and 24 decay types. All such data for the five most important incident particles (n, p, d, α, γ), are placed in evaluated data files up to an incident energy of 200 MeV. The resulting code system, EASY-II is designed as a functional replacement for the previous European Activation System, EASY-2010. It includes many new features and enhancements, but also benefits already from the feedback from extensive validation and verification activities performed with its predecessor.

  17. 77 FR 43103 - Policy on the 2009 Revision of the International Maritime Organization Code for the Construction...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-07-23

    ... Offshore Drilling Units AGENCY: Coast Guard, DHS. ACTION: Notice of availability. SUMMARY: The Coast Guard...), Code for the Construction and Equipment of Mobile Offshore Drilling Units, 2009 (2009 MODU Code). CG...: Background and Purpose Foreign documented MODUs engaged in any offshore activity associated with the...

  18. Support for Debugging Automatically Parallelized Programs

    NASA Technical Reports Server (NTRS)

    Hood, Robert; Jost, Gabriele

    2001-01-01

    This viewgraph presentation provides information on support sources available for the automatic parallelization of computer programs. CAPTools, a support tool developed at the University of Greenwich, transforms, with user guidance, existing sequential Fortran code into parallel message passing code. Comparison routines are then run for debugging purposes, in essence, ensuring that the code transformation was accurate.

  19. A Critical Reflection on Codes of Conduct in Vocational Education

    ERIC Educational Resources Information Center

    Bagnall, Richard G.; Nakar, Sonal

    2018-01-01

    The contemporary cultural context may be seen as presenting a moral void in vocational education, sanctioning the ascendency of instrumental epistemology and a proliferation of codes of conduct, to which workplace actions are expected to conform. Important among the purposes of such codes is that of encouraging ethical conduct, but, true to their…

  20. Highway Safety Program Manual: Volume 6: Codes and Laws.

    ERIC Educational Resources Information Center

    National Highway Traffic Safety Administration (DOT), Washington, DC.

    Volume 6 of the 19-volume Highway Safety Program Manual (which provides guidance to State and local governments on preferred safety practices) concentrates on codes and laws. The purpose and specific objectives of the Codes and Laws Program, Federal authority in the area of highway safety, and policies regarding traffic regulation are described.…

  1. 48 CFR Appendix D to Chapter 7 - Direct USAID Contracts With a U.S. Citizen or a U.S. Resident Alien for Personal Services Abroad

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... Government employees for purposes of the Internal Revenue Code (Title 26 of the United States Code) and are... of the Foreign Assistance Act of 1961, as amended, and the Internal Revenue Code (Title 26 of the...

  2. 48 CFR Appendix D to Chapter 7 - Direct USAID Contracts With a U.S. Citizen or a U.S. Resident Alien for Personal Services Abroad

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... Government employees for purposes of the Internal Revenue Code (Title 26 of the United States Code) and are... of the Foreign Assistance Act of 1961, as amended, and the Internal Revenue Code (Title 26 of the...

  3. 48 CFR Appendix D to Chapter 7 - Direct USAID Contracts With a U.S. Citizen or a U.S. Resident Alien for Personal Services Abroad

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... Government employees for purposes of the Internal Revenue Code (Title 26 of the United States Code) and are... of the Foreign Assistance Act of 1961, as amended, and the Internal Revenue Code (Title 26 of the...

  4. Development of Code-Switching: A Case Study on a Turkish/English/Arabic Multilingual Child

    ERIC Educational Resources Information Center

    Tunaz, Mehmet

    2016-01-01

    The purpose of this research was to investigate the early code switching patterns of a simultaneous multilingual subject (Aris) in accordance with Muysken's (2000) code switching typology: insertion and alternation. Firstly, the records of naturalistic spontaneous conversations were obtained from the parents via e-mail, phone calls and…

  5. NHEXAS PHASE I ARIZONA STUDY--STANDARD OPERATING PROCEDURE FOR CODING: QUESTIONNAIRE FEEDBACK FORM (UA-D-46.0)

    EPA Science Inventory

    The purpose of this SOP is to describe the coding strategy for the Questionnaire Feedback form. This Questionnaire Feedback form was developed for use during the Arizona NHEXAS project and the "Border" study. Keywords: data; coding; questionnaire feedback form.

    The National Hu...

  6. NHEXAS PHASE I ARIZONA STUDY--STANDARD OPERATING PROCEDURE FOR CODING: DIET DIARY QUESTIONNAIRE (UA-D-43.0)

    EPA Science Inventory

    The purpose of this SOP is to define the coding strategy for the Diet Diary Questionnaire. This questionnaire was developed for use during the Arizona NHEXAS project and the "Border" study. Keywords: data; coding; diet diary questionnaire.

    The National Human Exposure Assessme...

  7. NHEXAS PHASE I ARIZONA STUDY--STANDARD OPERATING PROCEDURE FOR CODING: TECHNICIAN WALK-THROUGH QUESTIONNAIRE (UA-D-35.0)

    EPA Science Inventory

    The purpose of this SOP is to define the coding strategy for the Technician Walk-Through Questionnaire. This questionnaire was developed for use during the Arizona NHEXAS project and the "Border" study. Keywords: data; coding; technician walk-through questionnaire.

    The Nationa...

  8. NHEXAS PHASE I ARIZONA STUDY--STANDARD OPERATING PROCEDURE FOR GLOBAL CODING FOR SCANNED FORMS (UA-D-31.1)

    EPA Science Inventory

    The purpose of this SOP is to define the strategy for the Global Coding of Scanned Forms. This procedure applies to the Arizona NHEXAS project and the "Border" study. Keywords: Coding; scannable forms.

    The National Human Exposure Assessment Survey (NHEXAS) is a federal interag...

  9. U.S.-MEXICO BORDER PROGRAM ARIZONA BORDER STUDY--STANDARD OPERATING PROCEDURE FOR CODING: DESCRIPTIVE QUESTIONNAIRE (UA-D-6.0)

    EPA Science Inventory

    This purpose of this SOP is to define the coding strategy for the Descriptive Questionnaire. This questionnaire was developed for use in the Arizona NHEXAS project and the Border study. Keywords: data; coding; descriptive questionnaire.

    The U.S.-Mexico Border Program is sponso...

  10. Content Analysis Coding Schemes for Online Asynchronous Discussion

    ERIC Educational Resources Information Center

    Weltzer-Ward, Lisa

    2011-01-01

    Purpose: Researchers commonly utilize coding-based analysis of classroom asynchronous discussion contributions as part of studies of online learning and instruction. However, this analysis is inconsistent from study to study with over 50 coding schemes and procedures applied in the last eight years. The aim of this article is to provide a basis…

  11. Utilizing codes of ethics in health professions education.

    PubMed

    Dahnke, Michael D

    2014-10-01

    Codes of ethics abound in health care, the aims and purposes of which are multiple and varied, from operating as a decision making tool to acting as a standard of practice that can be operational in a legal context to providing a sense of elevated seriousness and professionalism within a field of practice. There is some doubt and controversy, however, regarding the value and use of these codes both in professional practice and in the education of healthcare professionals. I intend to review and analyze the various aims and purposes of ethics codes particularly within the study and practice of healthcare in light of various criticisms of codes of ethics. After weighing the strength and import of these criticisms, I plan to explore effective means for utilizing such codes as part of the ethics education of healthcare professionals. While noting significant limitations of this tool, both in practice and in education, I plan to demonstrate its potential usefulness as well, in both generating critical thinking within the study of ethics and as a guide for practice for the professional.

  12. Application of physics engines in virtual worlds

    NASA Astrophysics Data System (ADS)

    Norman, Mark; Taylor, Tim

    2002-03-01

Dynamic virtual worlds potentially can provide a much richer and more enjoyable experience than static ones. To realize such worlds, three approaches are commonly used. The first of these, and still widely applied, involves importing traditional animations from a modeling system such as 3D Studio Max. This approach is therefore limited to predefined animation scripts or combinations/blends thereof. The second approach involves the integration of some specific-purpose simulation code, such as car dynamics, and is thus generally limited to one (class of) application(s). The third approach involves the use of general-purpose physics engines, which promise to enable a range of compelling dynamic virtual worlds and to considerably speed up development. By far the largest market today for real-time simulation is computer games, with revenues exceeding those of the movie industry. Traditionally, the simulation is produced by game developers in-house for specific titles. However, off-the-shelf middleware physics engines are now available for use in games and related domains. In this paper, we report on our experiences of using middleware physics engines to create a virtual world as an interactive experience, and an advanced scenario where artificial life techniques generate controllers for physically modeled characters.

  13. Cloud immersion building shielding factors for US residential structures.

    PubMed

    Dickson, E D; Hamby, D M

    2014-12-01

This paper presents validated building shielding factors designed for contemporary US housing-stock under an idealized, yet realistic, exposure scenario within a semi-infinite cloud of radioactive material. The building shielding factors are intended for use in emergency planning and level three probabilistic risk assessments for a variety of postulated radiological events in which a realistic assessment is necessary to better understand the potential risks for accident mitigation and emergency response planning. Factors are calculated from detailed computational housing-unit models using the general-purpose Monte Carlo N-Particle computational code, MCNP5, and are benchmarked against a series of narrow- and broad-beam measurements analyzing the shielding effectiveness of ten common general-purpose construction materials and ten shielding models representing the primary weather barriers (walls and roofs) of likely US housing-stock. Each model was designed to scale based on common residential construction practices and includes, to the extent practical, all structurally significant components important for shielding against ionizing radiation. Calculations were performed for floor-specific locations as well as for computing a weighted-average representative building shielding factor for single- and multi-story detached homes, both with and without basements, as well as for single-wide manufactured housing-units.

  14. Efficient Simulation of Secondary Fluorescence Via NIST DTSA-II Monte Carlo.

    PubMed

    Ritchie, Nicholas W M

    2017-06-01

Secondary fluorescence, the final term in the familiar matrix correction triumvirate Z·A·F, is the most challenging for Monte Carlo models to simulate. In fact, only two implementations of Monte Carlo models commonly used to simulate electron probe X-ray spectra can calculate secondary fluorescence: PENEPMA and NIST DTSA-II (DTSA-II is discussed herein). These two models share many physical models, but there are some important differences in the way each implements X-ray emission, including secondary fluorescence. PENEPMA is based on PENELOPE, a general-purpose software package for simulation of both relativistic and subrelativistic electron/positron interactions with matter. On the other hand, NIST DTSA-II was designed exclusively for simulation of X-ray spectra generated by subrelativistic electrons. NIST DTSA-II uses variance reduction techniques unsuited to a general-purpose code. These optimizations help NIST DTSA-II to be orders of magnitude more computationally efficient while retaining detector position sensitivity. Simulations execute in minutes rather than hours and can model differences that result from detector position. Both PENEPMA and NIST DTSA-II are capable of handling complex sample geometries, and we will demonstrate that both are of similar accuracy when modeling experimental secondary fluorescence data from the literature.

  15. Results from the Veterans Health Administration ICD-10-CM/PCS Coding Pilot Study.

    PubMed

    Weems, Shelley; Heller, Pamela; Fenton, Susan H

    2015-01-01

    The Veterans Health Administration (VHA) of the US Department of Veterans Affairs has been preparing for the October 1, 2015, conversion to the International Classification of Diseases, Tenth Revision, Clinical Modification and Procedural Coding System (ICD-10-CM/PCS) for more than four years. The VHA's Office of Informatics and Analytics ICD-10 Program Management Office established an ICD-10 Learning Lab to explore expected operational challenges. This study was conducted to determine the effects of the classification system conversion on coding productivity. ICD codes are integral to VHA business processes and are used for purposes such as clinical studies, performance measurement, workload capture, cost determination, Veterans Equitable Resource Allocation (VERA) determination, morbidity and mortality classification, indexing of hospital records by disease and operations, data storage and retrieval, research purposes, and reimbursement. The data collection for this study occurred in multiple VHA sites across several months using standardized methods. It is commonly accepted that coding productivity will decrease with the implementation of ICD-10-CM/PCS. The findings of this study suggest that the decrease will be more significant for inpatient coding productivity (64.5 percent productivity decrease) than for ambulatory care coding productivity (6.7 percent productivity decrease). This study reveals the following important points regarding ICD-10-CM/PCS coding productivity: 1. Ambulatory care ICD-10-CM coding productivity is not expected to decrease as significantly as inpatient ICD-10-CM/PCS coding productivity. 2. Coder training and type of record (inpatient versus outpatient) affect coding productivity. 3. Inpatient coding productivity is decreased when a procedure requiring ICD-10-PCS coding is present. It is highly recommended that organizations perform their own analyses to determine the effects of ICD-10-CM/PCS implementation on coding productivity.
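The productivity decreases quoted above are simple percent changes from a baseline throughput. A minimal sketch, using hypothetical records-per-hour figures chosen only for illustration (the abstract reports the resulting percentages, not the raw throughput):

```python
def pct_decrease(before, after):
    """Percent decrease relative to a baseline throughput."""
    return 100.0 * (before - after) / before

# Hypothetical throughput chosen to reproduce the reported figures:
# inpatient drops from 2.00 to 0.71 records/hour,
# ambulatory from 15.0 to 14.0 records/hour.
print(round(pct_decrease(2.0, 0.71), 1))   # → 64.5 (inpatient)
print(round(pct_decrease(15.0, 14.0), 1))  # → 6.7 (ambulatory)
```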

  16. Results from the Veterans Health Administration ICD-10-CM/PCS Coding Pilot Study

    PubMed Central

    Weems, Shelley; Heller, Pamela; Fenton, Susan H.

    2015-01-01

The Veterans Health Administration (VHA) of the US Department of Veterans Affairs has been preparing for the October 1, 2015, conversion to the International Classification of Diseases, Tenth Revision, Clinical Modification and Procedural Coding System (ICD-10-CM/PCS) for more than four years. The VHA's Office of Informatics and Analytics ICD-10 Program Management Office established an ICD-10 Learning Lab to explore expected operational challenges. This study was conducted to determine the effects of the classification system conversion on coding productivity. ICD codes are integral to VHA business processes and are used for purposes such as clinical studies, performance measurement, workload capture, cost determination, Veterans Equitable Resource Allocation (VERA) determination, morbidity and mortality classification, indexing of hospital records by disease and operations, data storage and retrieval, research purposes, and reimbursement. The data collection for this study occurred in multiple VHA sites across several months using standardized methods. It is commonly accepted that coding productivity will decrease with the implementation of ICD-10-CM/PCS. The findings of this study suggest that the decrease will be more significant for inpatient coding productivity (64.5 percent productivity decrease) than for ambulatory care coding productivity (6.7 percent productivity decrease). This study reveals the following important points regarding ICD-10-CM/PCS coding productivity: 1. Ambulatory care ICD-10-CM coding productivity is not expected to decrease as significantly as inpatient ICD-10-CM/PCS coding productivity. 2. Coder training and type of record (inpatient versus outpatient) affect coding productivity. 3. Inpatient coding productivity is decreased when a procedure requiring ICD-10-PCS coding is present. It is highly recommended that organizations perform their own analyses to determine the effects of ICD-10-CM/PCS implementation on coding productivity. PMID:26396553

  17. 24 CFR 200.925c - Model codes.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... DEVELOPMENT GENERAL INTRODUCTION TO FHA PROGRAMS Minimum Property Standards § 200.925c Model codes. (a... Plumbing Code, 1993 Edition, and the BOCA National Mechanical Code, 1993 Edition, excluding Chapter I, Administration, for the Building, Plumbing and Mechanical Codes and the references to fire retardant treated wood...

  18. 24 CFR 200.925c - Model codes.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... DEVELOPMENT GENERAL INTRODUCTION TO FHA PROGRAMS Minimum Property Standards § 200.925c Model codes. (a... Plumbing Code, 1993 Edition, and the BOCA National Mechanical Code, 1993 Edition, excluding Chapter I, Administration, for the Building, Plumbing and Mechanical Codes and the references to fire retardant treated wood...

  19. Analytical modeling of operating characteristics of premixing-prevaporizing fuel-air mixing passages. Volume 2: User's manual

    NASA Technical Reports Server (NTRS)

    Anderson, O. L.; Chiappetta, L. M.; Edwards, D. E.; Mcvey, J. B.

    1982-01-01

    A user's manual describing the operation of three computer codes (ADD code, PTRAK code, and VAPDIF code) is presented. The general features of the computer codes, the input/output formats, run streams, and sample input cases are described.

  20. 24 CFR 578.75 - General operations.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... assistance under this part must meet State or local building codes, and in the absence of State or local building codes, the International Residential Code or International Building Code (as applicable to the type of structure) of the International Code Council. (2) Services provided with assistance under this...

  1. 24 CFR 578.75 - General operations.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... assistance under this part must meet State or local building codes, and in the absence of State or local building codes, the International Residential Code or International Building Code (as applicable to the type of structure) of the International Code Council. (2) Services provided with assistance under this...

  2. Biosemiotics: a new understanding of life.

    PubMed

    Barbieri, Marcello

    2008-07-01

    Biosemiotics is the idea that life is based on semiosis, i.e., on signs and codes. This idea has been strongly suggested by the discovery of the genetic code, but so far it has made little impact in the scientific world and is largely regarded as a philosophy rather than a science. The main reason for this is that modern biology assumes that signs and meanings do not exist at the molecular level, and that the genetic code was not followed by any other organic code for almost four billion years, which implies that it was an utterly isolated exception in the history of life. These ideas have effectively ruled out the existence of semiosis in the organic world, and yet there are experimental facts against all of them. If we look at the evidence of life without the preconditions of the present paradigm, we discover that semiosis is there, in every single cell, and that it has been there since the very beginning. This is what biosemiotics is really about. It is not a philosophy. It is a new scientific paradigm that is rigorously based on experimental facts. Biosemiotics claims that the genetic code (1) is a real code and (2) has been the first of a long series of organic codes that have shaped the history of life on our planet. The reality of the genetic code and the existence of other organic codes imply that life is based on two fundamental processes--copying and coding--and this in turn implies that evolution took place by two distinct mechanisms, i.e., by natural selection (based on copying) and by natural conventions (based on coding). It also implies that the copying of genes works on individual molecules, whereas the coding of proteins operates on collections of molecules, which means that different mechanisms of evolution exist at different levels of organization. 
This review intends to underline the scientific nature of biosemiotics, and to this purpose, it aims to prove (1) that the cell is a real semiotic system, (2) that the genetic code is a real code, (3) that evolution took place by natural selection and by natural conventions, and (4) that it was natural conventions, i.e., organic codes, that gave origin to the great novelties of macroevolution. Biological semiosis, in other words, is a scientific reality because the codes of life are experimental realities. The time has come, therefore, to acknowledge this fact of life, even if that means abandoning the present theoretical framework in favor of a more general one where biology and semiotics finally come together and become biosemiotics.

  3. GCKP84-general chemical kinetics code for gas-phase flow and batch processes including heat transfer effects

    NASA Technical Reports Server (NTRS)

    Bittker, D. A.; Scullin, V. J.

    1984-01-01

A general chemical kinetics code is described for complex, homogeneous ideal gas reactions in any chemical system. The main features of the GCKP84 code are flexibility, convenience, and speed of computation for many different reaction conditions. The code, which replaces the GCKP code published previously, solves numerically the differential equations for complex reactions in a batch system or one-dimensional inviscid flow. It also solves numerically the nonlinear algebraic equations describing the well-stirred reactor. A new state-of-the-art numerical integration method is used for greatly increased speed in handling systems of stiff differential equations. The theory and the computer program, including details of input preparation and a guide to using the code, are given.

  4. SU-E-T-212: Comparison of TG-43 Dosimetric Parameters of Low and High Energy Brachytherapy Sources Obtained by MCNP Code Versions of 4C, X and 5

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zehtabian, M; Zaker, N; Sina, S

    2015-06-15

Purpose: Different versions of the MCNP code are widely used for dosimetry purposes. The purpose of this study is to compare different versions of the MCNP codes in dosimetric evaluation of different brachytherapy sources. Methods: The TG-43 parameters, such as dose rate constant, radial dose function, and anisotropy function, of different brachytherapy sources, i.e., Pd-103, I-125, Ir-192, and Cs-137, were calculated in a water phantom. The results obtained by three versions of Monte Carlo codes (MCNP4C, MCNPX, MCNP5) were compared for low and high energy brachytherapy sources. Then the cross section library of the MCNP4C code was changed to ENDF/B-VI release 8, which is used in the MCNP5 and MCNPX codes. Finally, the TG-43 parameters obtained using the MCNP4C-revised code were compared with the other codes. Results: The results of these investigations indicate that for high energy sources, the differences in TG-43 parameters between the codes are less than 1% for Ir-192 and less than 0.5% for Cs-137. However, for low energy sources like I-125 and Pd-103, large discrepancies are observed in the g(r) values obtained by MCNP4C and the two other codes. The differences between g(r) values calculated using MCNP4C and MCNP5 at a distance of 6 cm were found to be about 17% and 28% for I-125 and Pd-103, respectively. The results obtained with MCNP4C-revised and MCNPX were similar. However, the maximum difference between the results obtained with the MCNP5 and MCNP4C-revised codes was 2% at 6 cm. Conclusion: The results indicate that using the MCNP4C code for dosimetry of low energy brachytherapy sources can cause large errors in the results. Therefore it is recommended not to use this code for low energy sources, unless its cross section library is changed. Since the results obtained with MCNP4C-revised and MCNPX were similar, it is concluded that the difference between MCNP4C and MCNPX is their cross section libraries.
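The code-to-code discrepancies in the radial dose function g(r) discussed above reduce to relative differences between tabulated values. A minimal sketch with hypothetical g(r) tables (illustrative numbers only, not the study's data):

```python
def percent_diff(ref, other):
    """Relative difference of each value against a reference, in percent."""
    return [100.0 * (o - r) / r for r, o in zip(ref, other)]

# Hypothetical g(r) values for a low-energy source at r = 1..6 cm,
# as tabulated by two Monte Carlo codes (illustrative numbers only).
g_mcnp5  = [1.00, 0.87, 0.69, 0.55, 0.43, 0.33]
g_mcnp4c = [1.00, 0.85, 0.66, 0.50, 0.37, 0.27]
for r_cm, d in zip(range(1, 7), percent_diff(g_mcnp5, g_mcnp4c)):
    print(f"r = {r_cm} cm: {d:+.1f}%")
```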

  5. The Role of Spatial Disorientation in Fatal General Aviation Accidents

    NASA Technical Reports Server (NTRS)

    Scheuring, Richard

    2005-01-01

In-flight Spatial Disorientation (SD) in pilots is a serious threat to aviation safety. Indeed, SD may play a much larger role in aviation accidents than the approximate 6-8% reported by the National Transportation Safety Board (NTSB) each year, because some accidents coded by the NTSB as aircraft control-not maintained (ACNM) may actually result from SD. The purpose of this study is to determine whether SD is underestimated as a cause of fatal general aviation (GA) accidents in the NTSB database. Fatal GA airplane accidents occurring between January 1995 and December 1999 were reviewed from the NTSB aviation accident database. Cases coded as ACNM or SD as the probable cause were selected for review by a panel of aerospace medicine specialists. Using a rating scale, each rater was instructed to determine if SD was the probable cause of the accident. Agreement between the raters and agreement between the raters and the NTSB were evaluated by Kappa statistics. The raters agreed that 11 out of 20 (55%) accidents coded by the NTSB as ACNM were probably caused by SD (p less than 0.05). Agreement between the raters and the NTSB did not reach significance (p greater than 0.05). The 95% C.I. for the sampling population estimated that between 33-77% of cases that the NTSB identified as ACNM could be identified by aerospace medicine experts as SD. Aerospace medicine specialists agreed that some cases coded by the NTSB as ACNM were probably caused by SD. Consequently, a larger number of accidents may be caused by the pilot succumbing to SD than indicated in the NTSB database. This new information should encourage regulating agencies to ensure that pilots receive SD recognition training, enabling them to take appropriate corrective actions during flight. This could lead to new training standards, ultimately saving lives among GA airplane pilots.
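Inter-rater agreement of the kind evaluated above is commonly measured with Cohen's kappa. A minimal sketch with hypothetical SD/ACNM judgments for two raters (the panel's actual ratings are not given in the abstract):

```python
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa for two raters labeling the same items."""
    assert len(rater_a) == len(rater_b)
    n = len(rater_a)
    observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # Chance agreement expected if the two raters were independent.
    freq_a = Counter(rater_a)
    freq_b = Counter(rater_b)
    expected = sum(freq_a[c] * freq_b[c] for c in freq_a) / (n * n)
    return (observed - expected) / (1 - expected)

# Hypothetical SD-vs-ACNM judgments for 10 accidents.
a = ["SD", "SD", "ACNM", "SD", "ACNM", "SD", "SD", "ACNM", "SD", "ACNM"]
b = ["SD", "SD", "ACNM", "ACNM", "ACNM", "SD", "SD", "ACNM", "SD", "SD"]
print(round(cohens_kappa(a, b), 3))  # → 0.583
```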

  6. Transonic Drag Prediction on a DLR-F6 Transport Configuration Using Unstructured Grid Solvers

    NASA Technical Reports Server (NTRS)

    Lee-Rausch, E. M.; Frink, N. T.; Mavriplis, D. J.; Rausch, R. D.; Milholen, W. E.

    2004-01-01

A second international AIAA Drag Prediction Workshop (DPW-II) was organized and held in Orlando, Florida, on June 21-22, 2003. The primary purpose was to investigate the code-to-code uncertainty, address the sensitivity of the drag prediction to grid size, and quantify the uncertainty in predicting nacelle/pylon drag increments at a transonic cruise condition. This paper presents an in-depth analysis of the DPW-II computational results from three state-of-the-art unstructured grid Navier-Stokes flow solvers exercised on similar families of tetrahedral grids. The flow solvers are USM3D, a tetrahedral cell-centered upwind solver; FUN3D, a tetrahedral node-centered upwind solver; and NSU3D, a general element node-centered central-differenced solver. For the wing/body, the total drag predicted for a constant-lift transonic cruise condition showed a decrease in code-to-code variation with grid refinement, as expected. For the same flight condition, the wing/body/nacelle/pylon total drag and the nacelle/pylon drag increment predicted showed an increase in code-to-code variation with grid refinement. Although the range in total drag for the wing/body fine grids was only 5 counts, a code-to-code comparison of surface pressures and surface restricted streamlines indicated that the three solvers were not all converging to the same flow solutions: different shock locations and separation patterns were evident. Similarly, the wing/body/nacelle/pylon solutions did not appear to be converging to the same flow solutions. Overall, grid refinement did not consistently improve the correlation with experimental data for either the wing/body or the wing/body/nacelle/pylon configuration. Although the absolute values of total drag predicted by two of the solvers for the medium and fine grids did not compare well with the experiment, the incremental drag predictions were within plus or minus 3 counts of the experimental data.
The correlation with experimental incremental drag was not significantly changed by specifying transition. Although the sources of code-to-code variation in force and moment predictions for the three unstructured grid codes have not yet been identified, the current study reinforces the necessity of applying multiple codes to the same application to assess uncertainty.

  7. Large calculation of the flow over a hypersonic vehicle using a GPU

    NASA Astrophysics Data System (ADS)

    Elsen, Erich; LeGresley, Patrick; Darve, Eric

    2008-12-01

Graphics processing units are capable of impressive computing performance, up to 518 Gflops peak. Various groups have been using these processors for general purpose computing; most efforts have focused on demonstrating relatively basic calculations, e.g. numerical linear algebra, or physical simulations for visualization purposes with limited accuracy. This paper describes the simulation of a hypersonic vehicle configuration with detailed geometry and accurate boundary conditions using the compressible Euler equations. To the authors' knowledge, this is the most sophisticated calculation of this kind in terms of complexity of the geometry, the physical model, the numerical methods employed, and the accuracy of the solution. The Navier-Stokes Stanford University Solver (NSSUS) was used for this purpose. NSSUS is a multi-block structured code with a provably stable and accurate numerical discretization which uses a vertex-based finite-difference method. A multi-grid scheme is used to accelerate the solution of the system. Based on a comparison of the Intel Core 2 Duo and NVIDIA 8800GTX, speed-ups of over 40× were demonstrated for simple test geometries and 20× for complex geometries.

  8. Computer-aided design of antenna structures and components

    NASA Technical Reports Server (NTRS)

    Levy, R.

    1976-01-01

    This paper discusses computer-aided design procedures for antenna reflector structures and related components. The primary design aid is a computer program that establishes cross sectional sizes of the structural members by an optimality criterion. Alternative types of deflection-dependent objectives can be selected for designs subject to constraints on structure weight. The computer program has a special-purpose formulation to design structures of the type frequently used for antenna construction. These structures, in common with many in other areas of application, are represented by analytical models that employ only the three translational degrees of freedom at each node. The special-purpose construction of the program, however, permits coding and data management simplifications that provide advantages in problem size and execution speed. Size and speed are essentially governed by the requirements of structural analysis and are relatively unaffected by the added requirements of design. Computation times to execute several design/analysis cycles are comparable to the times required by general-purpose programs for a single analysis cycle. Examples in the paper illustrate effective design improvement for structures with several thousand degrees of freedom and within reasonable computing times.

  9. A simple clinical coding strategy to improve recording of child maltreatment concerns: an audit study.

    PubMed

    McGovern, Andrew Peter; Woodman, Jenny; Allister, Janice; van Vlymen, Jeremy; Liyanage, Harshana; Jones, Simon; Rafi, Imran; de Lusignan, Simon; Gilbert, Ruth

    2015-01-14

Recording concerns about child maltreatment, including minor concerns, is recommended by the General Medical Council (GMC) and National Institute for Health and Clinical Excellence (NICE), but there is evidence of substantial under-recording. To determine whether a simple coding strategy improved recording of maltreatment-related concerns in electronic primary care records. Clinical audit of rates of maltreatment-related coding before (January 2010-December 2011) and after (January-December 2012) implementation of a simple coding strategy in 11 English family practices. The strategy included encouraging general practitioners to use, always and as a minimum, the Read code 'Child is cause for concern'. A total of 25,106 children aged 0-18 years were registered with these practices. We also undertook a qualitative service evaluation to investigate barriers to recording. Outcomes were recording of 1) any maltreatment-related codes, 2) child protection proceedings and 3) the child being a cause for concern. We found increased recording of any maltreatment-related code (rate ratio 1.4; 95% CI 1.1-1.6), child protection procedures (RR 1.4; 95% CI 1.1-1.6) and cause for concern (RR 2.5; 95% CI 1.8-3.4) after implementation of the coding strategy. Clinicians cited the simplicity of the coding strategy as the most important factor assisting implementation. This simple coding strategy improved clinicians' recording of maltreatment-related concerns in a small sample of practices with some 'buy-in'. Further research should investigate how recording can best support the doctor-patient relationship. HOW THIS FITS IN: Recording concerns about child maltreatment, including minor concerns, is recommended by the General Medical Council (GMC) and National Institute for Health and Clinical Excellence (NICE), but there is evidence of substantial under-recording.
We describe a simple clinical coding strategy that helped general practitioners to improve recording of maltreatment-related concerns. These improvements could improve case finding of children at risk and information sharing.
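Rate ratios with 95% confidence intervals, as reported above, are conventionally computed on the log scale with a normal approximation. A minimal sketch, assuming hypothetical event counts and person-time denominators (the study's raw counts are not reproduced in the abstract):

```python
import math

def rate_ratio_ci(events_after, time_after, events_before, time_before, z=1.96):
    """Rate ratio and normal-approximation 95% CI computed on the log scale."""
    rr = (events_after / time_after) / (events_before / time_before)
    se_log = math.sqrt(1 / events_after + 1 / events_before)
    lo = rr * math.exp(-z * se_log)
    hi = rr * math.exp(z * se_log)
    return rr, lo, hi

# Hypothetical counts: 140 coded records over 25,000 child-years after,
# 100 over 25,000 child-years before the coding strategy.
rr, lo, hi = rate_ratio_ci(140, 25000, 100, 25000)
print(f"RR {rr:.2f} (95% CI {lo:.2f}-{hi:.2f})")  # → RR 1.40 (95% CI 1.08-1.81)
```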

  10. Optical Surface Analysis Code (OSAC). 7.0

    NASA Technical Reports Server (NTRS)

    Glenn, P.

    1998-01-01

    The purpose of this modification to the Optical Surface Analysis Code (OSAC) is to upgrade the PSF program to allow the user to get proper diffracted energy normalization even when deliberately obscuring rays with internal obscurations.

  11. 41 CFR 101-6.500 - Scope of subpart.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... Federal Government for the purpose of performing official business, at least one copy of the Code shall be... display the Code. Display shall be consistent with the decor and architecture of the building space...

  12. Uplink Coding

    NASA Technical Reports Server (NTRS)

    Pollara, Fabrizio; Hamkins, Jon; Dolinar, Sam; Andrews, Ken; Divsalar, Dariush

    2006-01-01

This viewgraph presentation reviews uplink coding. The purpose and goals of the briefing are (1) Show a plan for using uplink coding and describe benefits (2) Define possible solutions and their applicability to different types of uplink, including emergency uplink (3) Concur with our conclusions so we can embark on a plan to use the proposed uplink system (4) Identify the need for the development of appropriate technology and infusion in the DSN (5) Gain advocacy to implement uplink coding in flight projects Action Item EMB04-1-14 -- Show a plan for using uplink coding, including showing where it is useful or not (include discussion of emergency uplink coding).

  13. Biosemiotics: a new understanding of life

    NASA Astrophysics Data System (ADS)

    Barbieri, Marcello

    2008-07-01

    Biosemiotics is the idea that life is based on semiosis, i.e., on signs and codes. This idea has been strongly suggested by the discovery of the genetic code, but so far it has made little impact in the scientific world and is largely regarded as a philosophy rather than a science. The main reason for this is that modern biology assumes that signs and meanings do not exist at the molecular level, and that the genetic code was not followed by any other organic code for almost four billion years, which implies that it was an utterly isolated exception in the history of life. These ideas have effectively ruled out the existence of semiosis in the organic world, and yet there are experimental facts against all of them. If we look at the evidence of life without the preconditions of the present paradigm, we discover that semiosis is there, in every single cell, and that it has been there since the very beginning. This is what biosemiotics is really about. It is not a philosophy. It is a new scientific paradigm that is rigorously based on experimental facts. Biosemiotics claims that the genetic code (1) is a real code and (2) has been the first of a long series of organic codes that have shaped the history of life on our planet. The reality of the genetic code and the existence of other organic codes imply that life is based on two fundamental processes—copying and coding—and this in turn implies that evolution took place by two distinct mechanisms, i.e., by natural selection (based on copying) and by natural conventions (based on coding). It also implies that the copying of genes works on individual molecules, whereas the coding of proteins operates on collections of molecules, which means that different mechanisms of evolution exist at different levels of organization. 
This review intends to underline the scientific nature of biosemiotics, and to this purpose, it aims to prove (1) that the cell is a real semiotic system, (2) that the genetic code is a real code, (3) that evolution took place by natural selection and by natural conventions, and (4) that it was natural conventions, i.e., organic codes, that gave origin to the great novelties of macroevolution. Biological semiosis, in other words, is a scientific reality because the codes of life are experimental realities. The time has come, therefore, to acknowledge this fact of life, even if that means abandoning the present theoretical framework in favor of a more general one where biology and semiotics finally come together and become biosemiotics.

  14. U.S.-MEXICO BORDER PROGRAM ARIZONA BORDER STUDY--STANDARD OPERATING PROCEDURE FOR CODING: TECHNICIAN WALK-THROUGH QUESTIONNAIRE (UA-D-35.0)

    EPA Science Inventory

    The purpose of this SOP is to define the coding strategy for the Technician Walk-Through Questionnaire. This questionnaire was developed for use during the Arizona NHEXAS project and the Border study. Keywords: data; coding; technician walk-through questionnaire.

    The U.S.-Mexi...

  15. NHEXAS PHASE I ARIZONA STUDY--STANDARD OPERATING PROCEDURE FOR CODING: FOOD DIARY FOLLOW UP (UA-D-10.0)

    EPA Science Inventory

    The purpose of this SOP is to define the coding strategy for the Food Diary Follow Up Questionnaire. This questionnaire was developed for use in the Arizona NHEXAS project and the "Border" study. Keywords: data; coding; food diary follow up questionnaire.

    The National Human Ex...

  16. U.S.-MEXICO BORDER PROGRAM ARIZONA BORDER STUDY--STANDARD OPERATING PROCEDURE FOR CODING: DIET DIARY QUESTIONNAIRE (UA-D-43.0)

    EPA Science Inventory

    The purpose of this SOP is to define the coding strategy for the Diet Diary Questionnaire. This questionnaire was developed for use during the Arizona NHEXAS project and the Border study. Keywords: data; coding; diet diary questionnaire.

    The U.S.-Mexico Border Program is spon...

  17. NHEXAS PHASE I ARIZONA STUDY--STANDARD OPERATING PROCEDURE FOR CODING: TIME DIARY AND ACTIVITY QUESTIONNAIRE (UA-D-9.0)

    EPA Science Inventory

    The purpose of this SOP is to define the coding strategy for the Time Diary and Activity Questionnaire. This questionnaire was developed for use in the Arizona NHEXAS project and the "Border" study. Keywords: Data; Coding; Time Diary and Activity Questionnaire.

    The National Hu...

  18. U.S.-MEXICO BORDER PROGRAM ARIZONA BORDER STUDY--STANDARD OPERATING PROCEDURE FOR CODING: QUESTIONNAIRE FEEDBACK FORM (UA-D-46.0)

    EPA Science Inventory

    The purpose of this SOP is to describe the coding strategy for the Questionnaire Feedback form. This Questionnaire Feedback form was developed for use during the Arizona NHEXAS project and the Border study. Keywords: data; coding; questionnaire feedback form.

    The U.S.-Mexico B...

  19. NHEXAS PHASE I ARIZONA STUDY--STANDARD OPERATING PROCEDURE FOR CODING: ARIZONA LAB DATA (UA-D-13.0)

    EPA Science Inventory

    The purpose of this SOP is to define the coding strategy for Arizona Lab Data. This strategy was developed for use in the Arizona NHEXAS project and the "Border" study. Keywords: data; coding; lab data forms.

    The National Human Exposure Assessment Survey (NHEXAS) is a federal ...

  20. Advanced Spectral Modeling Development

    DTIC Science & Technology

    1992-09-14

    above, the AFGL line-by-line code already possesses many of the attributes desired of a generally applicable transmittance/radiance simulation code, it...transmittance calculations, (b) perform generalized multiple scattering calculations, (c) calculate both heating and dissociative fluxes, (d) provide...This report is subdivided into task specific subsections. The following section describes our general approach to address these technical issues (Section

  1. Rapid Prediction of Unsteady Three-Dimensional Viscous Flows in Turbopump Geometries

    NASA Technical Reports Server (NTRS)

    Dorney, Daniel J.

    1998-01-01

    A program is underway to improve the efficiency of a three-dimensional Navier-Stokes code and generalize it for nozzle and turbopump geometries. Code modifications have included the implementation of parallel processing software, incorporation of new physical models and generalization of the multiblock capability. The final report contains details of code modifications, numerical results for several nozzle and turbopump geometries, and the implementation of the parallelization software.

  2. Honoring Native American Code Talkers: The Road to the Code Talkers Recognition Act of 2008 (Public Law 110-420)

    ERIC Educational Resources Information Center

    Meadows, William C.

    2011-01-01

    Interest in North American Indian code talkers continues to increase. In addition to numerous works about the Navajo code talkers, several publications on other groups of Native American code talkers--including the Choctaw, Comanche, Hopi, Meskwaki, Canadian Cree--and about code talkers in general have appeared. This article chronicles recent…

  3. General practitioners learning qualitative research: A case study of postgraduate education.

    PubMed

    Hepworth, Julie; Kay, Margaret

    2015-10-01

    Qualitative research is increasingly being recognised as a vital aspect of primary healthcare research. Teaching and learning how to conduct qualitative research is especially important for general practitioners and other clinicians in the professional educational setting. This article examines a case study of postgraduate professional education in qualitative research for clinicians, for the purpose of enabling a robust discussion around teaching and learning in medicine and the health sciences. A series of three workshops was delivered for primary healthcare academics. The workshops were evaluated using a quantitative survey and qualitative free-text responses to enable descriptive analyses. Participants found qualitative philosophy and theory the most difficult areas to engage with, while qualitative coding and analysis were considered the easiest to learn. Key elements for successful teaching were identified, including the use of adult learning principles, the value of an experienced facilitator and an awareness of the impact of clinical subcultures on learning.

  4. Advanced Architectures for Astrophysical Supercomputing

    NASA Astrophysics Data System (ADS)

    Barsdell, B. R.; Barnes, D. G.; Fluke, C. J.

    2010-12-01

    Astronomers have come to rely on the increasing performance of computers to reduce, analyze, simulate and visualize their data. In this environment, faster computation can mean more science outcomes or the opening up of new parameter spaces for investigation. If we are to avoid major issues when implementing codes on advanced architectures, it is important that we have a solid understanding of our algorithms. A recent addition to the high-performance computing scene that highlights this point is the graphics processing unit (GPU). The hardware originally designed for speeding up graphics rendering in video games is now achieving speed-ups of O(100×) in general-purpose computation - performance that cannot be ignored. We are using a generalized approach, based on the analysis of astronomy algorithms, to identify the optimal problem-types and techniques for taking advantage of both current GPU hardware and future developments in computing architectures.

  5. Creating Synthetic Coronal Observational Data From MHD Models: The Forward Technique

    NASA Technical Reports Server (NTRS)

    Rachmeler, Laurel A.; Gibson, Sarah E.; Dove, James; Kucera, Therese Ann

    2010-01-01

    We present a generalized forward code for creating simulated coronal observables off the limb from numerical and analytical MHD models. This generalized forward model is capable of creating emission maps in various wavelengths for instruments such as SXT, EIT, EIS, and coronagraphs, as well as spectropolarimetric images and line profiles. The inputs to our code can be analytic models (of which four come with the code) or 2.5D and 3D numerical datacubes. We present some examples of the observable data created with our code as well as its functional capabilities. This code is currently available for beta-testing (contact authors), with the ultimate goal of release as a SolarSoft package.

  6. Data Parallel Line Relaxation (DPLR) Code User Manual: Acadia - Version 4.01.1

    NASA Technical Reports Server (NTRS)

    Wright, Michael J.; White, Todd; Mangini, Nancy

    2009-01-01

    Data-Parallel Line Relaxation (DPLR) code is a computational fluid dynamic (CFD) solver that was developed at NASA Ames Research Center to help mission support teams generate high-value predictive solutions for hypersonic flow field problems. The DPLR Code Package is an MPI-based, parallel, full three-dimensional Navier-Stokes CFD solver with generalized models for finite-rate reaction kinetics, thermal and chemical non-equilibrium, accurate high-temperature transport coefficients, and ionized flow physics incorporated into the code. DPLR also includes a large selection of generalized realistic surface boundary conditions and links to enable loose coupling with external thermal protection system (TPS) material response and shock layer radiation codes.

  7. Second order gyrokinetic theory for particle-in-cell codes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tronko, Natalia; Bottino, Alberto; Sonnendrücker, Eric

    2016-08-15

    The main idea of the gyrokinetic dynamical reduction consists in a systematic removal of the fast-scale motion (the gyromotion) from the dynamics of the plasma, resulting in a considerable simplification and a significant gain of computational time. The gyrokinetic Maxwell–Vlasov equations are nowadays implemented in numerical codes for modeling (both laboratory and astrophysical) strongly magnetized plasmas. Different versions of the reduced set of equations exist, depending on the construction of the gyrokinetic reduction procedure and the approximations performed in the derivation. The purpose of this article is to explicitly show the connection between the general second order gyrokinetic Maxwell–Vlasov system derived from the modern gyrokinetic theory and the model currently implemented in the global electromagnetic Particle-in-Cell code ORB5. Necessary information about the modern gyrokinetic formalism is given together with the consistent derivation of the gyrokinetic Maxwell–Vlasov equations from first principles. The variational formulation of the dynamics is used to obtain the corresponding energy conservation law, which in turn is used for the verification of the energy conservation diagnostics currently implemented in ORB5. This work fits within the context of the code verification project VeriGyro, currently run at the IPP Max Planck Institute in collaboration with other European institutions.

  8. Comparison of PHITS, GEANT4, and HIBRAC simulations of depth-dependent yields of β(+)-emitting nuclei during therapeutic particle irradiation to measured data.

    PubMed

    Rohling, Heide; Sihver, Lembit; Priegnitz, Marlen; Enghardt, Wolfgang; Fiedler, Fine

    2013-09-21

    For quality assurance in particle therapy, a non-invasive, in vivo range verification is highly desired. Particle therapy positron-emission-tomography (PT-PET) is the only clinically proven method up to now for this purpose. It makes use of the β(+)-activity produced during the irradiation by the nuclear fragmentation processes between the therapeutic beam and the irradiated tissue. Since a direct comparison of β(+)-activity and dose is not feasible, a simulation of the expected β(+)-activity distribution is required. For this reason it is essential to have a quantitatively reliable code for the simulation of the yields of the β(+)-emitting nuclei at every position of the beam path. In this paper results of the three-dimensional Monte-Carlo simulation codes PHITS, GEANT4, and the one-dimensional deterministic simulation code HIBRAC are compared to measurements of the yields of the most abundant β(+)-emitting nuclei for carbon, lithium, helium, and proton beams. In general, PHITS underestimates the yields of positron-emitters. With GEANT4 the overall most accurate results are obtained. HIBRAC and GEANT4 provide comparable results for carbon and proton beams. HIBRAC is considered as a good candidate for the implementation to clinical routine PT-PET.

  9. Comparison of PHITS, GEANT4, and HIBRAC simulations of depth-dependent yields of β+-emitting nuclei during therapeutic particle irradiation to measured data

    NASA Astrophysics Data System (ADS)

    Rohling, Heide; Sihver, Lembit; Priegnitz, Marlen; Enghardt, Wolfgang; Fiedler, Fine

    2013-09-01

    For quality assurance in particle therapy, a non-invasive, in vivo range verification is highly desired. Particle therapy positron-emission-tomography (PT-PET) is the only clinically proven method up to now for this purpose. It makes use of the β+-activity produced during the irradiation by the nuclear fragmentation processes between the therapeutic beam and the irradiated tissue. Since a direct comparison of β+-activity and dose is not feasible, a simulation of the expected β+-activity distribution is required. For this reason it is essential to have a quantitatively reliable code for the simulation of the yields of the β+-emitting nuclei at every position of the beam path. In this paper results of the three-dimensional Monte-Carlo simulation codes PHITS, GEANT4, and the one-dimensional deterministic simulation code HIBRAC are compared to measurements of the yields of the most abundant β+-emitting nuclei for carbon, lithium, helium, and proton beams. In general, PHITS underestimates the yields of positron-emitters. With GEANT4 the overall most accurate results are obtained. HIBRAC and GEANT4 provide comparable results for carbon and proton beams. HIBRAC is considered as a good candidate for the implementation to clinical routine PT-PET.

  10. Satellite attitude motion models for capture and retrieval investigations

    NASA Technical Reports Server (NTRS)

    Cochran, John E., Jr.; Lahr, Brian S.

    1986-01-01

    The primary purpose of this research is to provide mathematical models which may be used in the investigation of various aspects of the remote capture and retrieval of uncontrolled satellites. Emphasis has been placed on analytical models; however, to verify analytical solutions, numerical integration must be used. Also, for satellites of certain types, numerical integration may be the only practical or perhaps the only possible method of solution. First, to provide a basis for analytical and numerical work, uncontrolled satellites were categorized using criteria based on: (1) orbital motions, (2) external angular momenta, (3) internal angular momenta, (4) physical characteristics, and (5) the stability of their equilibrium states. Several analytical solutions for the attitude motions of satellite models were compiled, checked, corrected in some minor respects and their short-term prediction capabilities were investigated. Single-rigid-body, dual-spin and multi-rotor configurations are treated. To verify the analytical models and to see how the true motion of a satellite which is acted upon by environmental torques differs from its corresponding torque-free motion, a numerical simulation code was developed. This code contains a relatively general satellite model and models for gravity-gradient and aerodynamic torques. The spacecraft physical model for the code and the equations of motion are given. The two environmental torque models are described.

  11. Comparison of memory thresholds for planar qudit geometries

    NASA Astrophysics Data System (ADS)

    Marks, Jacob; Jochym-O'Connor, Tomas; Gheorghiu, Vlad

    2017-11-01

    We introduce and analyze a new type of decoding algorithm called general color clustering, based on renormalization group methods, to be used in qudit color codes. The performance of this decoder is analyzed under a generalized bit-flip error model, and is used to obtain the first memory threshold estimates for qudit 6-6-6 color codes. The proposed decoder is compared with similar decoding schemes for qudit surface codes as well as the current leading qubit decoders for both sets of codes. We find that, as with surface codes, clustering performs sub-optimally for qubit color codes, giving a threshold of 5.6% compared to the 8.0% obtained through surface projection decoding methods. However, the threshold rate increases by up to 112% for large qudit dimensions, plateauing around 11.9%. All the analysis is performed using QTop, a new open-source software for simulating and visualizing topological quantum error correcting codes.
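The generalized bit-flip error model mentioned above can be made concrete with a small sketch. In this illustrative model (a minimal assumption-laden sketch, not QTop's actual implementation), each qudit of dimension d is, with probability p, shifted by a uniformly random nonzero amount k, i.e., hit by a generalized Pauli X^k:

```python
import random

def generalized_bit_flip(state, d, p, rng=None):
    """Apply generalized bit-flip noise to a list of qudit values in Z_d:
    with probability p, each qudit is shifted by a uniformly random
    nonzero k (the generalized Pauli X^k); otherwise it is left alone."""
    rng = rng or random.Random()
    return [(q + rng.randrange(1, d)) % d if rng.random() < p else q
            for q in state]
```

Sampling many such error patterns and checking whether a decoder corrects them, while sweeping p and the lattice size, is the standard Monte Carlo route to the threshold estimates quoted in the abstract.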

  12. Development and Implementation of Non-Newtonian Rheology Into the Generalized Fluid System Simulation Program (GFSSP)

    NASA Technical Reports Server (NTRS)

    DiSalvo, Roberto; Deaconu, Stelu; Majumdar, Alok

    2006-01-01

    One of the goals of this program was to develop the experimental and analytical/computational tools required to predict the flow of non-Newtonian fluids through the various components of a propulsion system: pipes, valves, pumps, etc. To achieve this goal we chose to augment the capabilities of NASA's Generalized Fluid System Simulation Program (GFSSP) software. GFSSP is a general-purpose computer program designed to calculate steady-state and transient pressure and flow distributions in a complex fluid network. While the current version of the GFSSP code is able to handle various system components, the implicit assumption in the code is that the fluids in the system are Newtonian. To extend the capability of the code to non-Newtonian fluids, such as silica gelled fuels and oxidizers, modifications to the momentum equations of the code have been performed. We have successfully implemented in GFSSP flow equations for fluids with power-law behavior. The implementation of the power-law fluid behavior into the GFSSP code depends on knowledge of the two fluid coefficients, n and K. The determination of these parameters for the silica gels used in this program was performed experimentally. The n and K parameters for silica-water gels were determined experimentally at CFDRC's Special Projects Laboratory with a constant-shear-rate capillary viscometer. Batches of 8:1 (by weight) water-silica gel were mixed using CFDRC's 10-gallon gelled-propellant mixer. Prior to testing, the gel was allowed to rest in the rheometer tank for at least twelve hours to ensure that the delicate structure of the gel had sufficient time to reform. During the tests silica gel was pressure-fed and discharged through stainless steel pipes ranging from 1" to 36" in length, with three diameters: 0.0237", 0.032", and 0.047". The data collected in these tests included pressure at the tube entrance and volumetric flow rate. 
From these data the uncorrected shear rate, shear stress, residence time, and viscosity were evaluated using formulae for non-Newtonian, power-law fluids. The maximum shear rates (corrected for entrance effects) obtained in the rheometer with the current setup were in the 150,000 to 170,000 s⁻¹ range. GFSSP simulations were performed with a flow circuit simulating the capillary rheometer, using power-law gel viscosity coefficients from the experimental data. The agreement between the experimental data and the simulated flow curves was within +/-4% given quality entrance-effect data.
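The power-law constitutive relation on which the record turns is simple enough to state concretely: the shear stress is τ = K·γ̇ⁿ, so the apparent viscosity is μ = K·γ̇ⁿ⁻¹, and the uncorrected (Newtonian) wall shear rate in a capillary of radius R at volumetric flow rate Q is 4Q/(πR³). A minimal sketch with hypothetical helper names (the actual GFSSP implementation differs):

```python
import math

def apparent_viscosity(shear_rate, K, n):
    """Power-law fluid: tau = K * gamma_dot**n, so mu = K * gamma_dot**(n - 1)."""
    return K * shear_rate ** (n - 1)

def wall_shear_rate(Q, R):
    """Uncorrected (Newtonian) wall shear rate in a capillary: 4 Q / (pi R^3)."""
    return 4.0 * Q / (math.pi * R ** 3)
```

For n < 1 (shear-thinning, as for silica gels) the apparent viscosity falls as the shear rate rises; n = 1 recovers a Newtonian fluid with constant viscosity K.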

  13. National Underground Mines Inventory

    DTIC Science & Technology

    1983-10-01

    system is well designed to minimize water accumulation on the drift levels. In many areas, sufficient water has accumulated to make the use of boots a...four characters designate field office. 17-18 State Code Pic 99 FIPS code for state in which mine is located. 19-21 County Code Pic 999 FIPS code for... Designates a general product class based on SIC code. 28-29 Mine Type Pic 99 Metal/Nonmetal mine type code. Based on subunit operations code and canvass code

  14. Multi-Zone Liquid Thrust Chamber Performance Code with Domain Decomposition for Parallel Processing

    NASA Technical Reports Server (NTRS)

    Navaz, Homayun K.

    2002-01-01

    Computational Fluid Dynamics (CFD) has evolved considerably in the last decade. There are many computer programs that can perform computations on viscous internal or external flows with chemical reactions. CFD has become a commonly used tool in the design and analysis of gas turbines, ramjet combustors, turbo-machinery, inlet ducts, rocket engines, jet interaction, missiles, and ramjet nozzles. One of the problems of interest to NASA has always been the performance prediction for rocket and air-breathing engines. Due to the complexity of the flow in these engines, it is necessary to resolve the flowfield on a fine mesh to capture quantities like turbulence and heat transfer. However, calculation on a high-resolution grid carries a prohibitively large computational cost that can diminish the value of CFD for practical engineering calculations. The Liquid Thrust Chamber Performance (LTCP) code was developed for NASA/MSFC (Marshall Space Flight Center) to perform liquid rocket engine performance calculations. This code is a 2D/axisymmetric full Navier-Stokes (NS) solver with fully coupled finite-rate chemistry and Eulerian treatment of liquid fuel and/or oxidizer droplets. One of the advantages of this code has been the resemblance of its input file to that of the JANNAF (Joint Army Navy NASA Air Force Interagency Propulsion Committee) standard TDK code, and its automatic grid generation for JANNAF-defined combustion chamber wall geometry. These options minimize the learning effort for TDK users and make the code a good candidate for performing engineering calculations. Although the LTCP code was developed for liquid rocket engines, it is a general-purpose code and has been used for solving many engineering problems. However, the single-zone formulation of the LTCP has limited the code's applicability to problems with complex geometry. 
Furthermore, the computational time becomes prohibitively large for high-resolution problems with chemistry, a two-equation turbulence model, and two-phase flow. To overcome these limitations, the LTCP code was rewritten to include a multi-zone capability with domain decomposition, making it suitable for parallel processing, i.e., enabling the code to run every zone or sub-domain on a separate processor. This can reduce the run time by a factor of 6 to 8, depending on the problem.
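The reported factor-of-6-to-8 reduction is roughly what a simple Amdahl-style estimate predicts when each zone runs on its own processor and only a small serial fraction (zone-boundary exchange, I/O) remains. A back-of-the-envelope sketch, not the report's own analysis:

```python
def amdahl_speedup(serial_fraction, n_zones):
    """Ideal parallel speedup when a serial_fraction of the work cannot be
    distributed across n_zones processors (Amdahl's law)."""
    return 1.0 / (serial_fraction + (1.0 - serial_fraction) / n_zones)
```

For example, with 8 zones and an assumed 3% serial fraction, `amdahl_speedup(0.03, 8)` gives about 6.6, inside the quoted 6-to-8 range.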

  15. 41 CFR 105-72.502 - Codes of conduct.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 41 Public Contracts and Property Management 3 2011-01-01 2011-01-01 false Codes of conduct. 105-72.502 Section 105-72.502 Public Contracts and Property Management Federal Property Management Regulations System (Continued) GENERAL SERVICES ADMINISTRATION Regional Offices-General Services...

  16. A FORTRAN code for the calculation of probe volume geometry changes in a laser anemometry system caused by window refraction

    NASA Technical Reports Server (NTRS)

    Owen, Albert K.

    1987-01-01

    A computer code was written which utilizes ray-tracing techniques to predict the changes in position and geometry of a laser Doppler velocimeter probe volume resulting from refraction effects. The code predicts the position change, the change in beam crossing angle, and the amount of beam uncrossing that occurs when the beams traverse a region with a changed index of refraction, such as a glass window. The code calculates these changes for flat-plate, cylindrical, general axisymmetric, and general-surface windows and is currently operational on a VAX 8600 computer system.
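The flat-plate case admits a closed-form check via Snell's law: a beam incident at angle θ on a window of thickness t and refractive index n exits parallel to its original direction but laterally displaced by d = t·sin(θ − θ_t)/cos(θ_t), where θ_t is the refracted angle. A minimal illustrative sketch (the report's code also handles cylinders and general surfaces, which require full ray tracing):

```python
import math

def refracted_angle(theta_i, n1, n2):
    """Snell's law: n1 * sin(theta_i) = n2 * sin(theta_t)."""
    return math.asin(n1 * math.sin(theta_i) / n2)

def flat_window_shift(theta_i, thickness, n_glass, n_air=1.0):
    """Lateral displacement of a beam crossing a flat window; the beam
    emerges parallel to its original direction, offset by this amount."""
    theta_t = refracted_angle(theta_i, n_air, n_glass)
    return thickness * math.sin(theta_i - theta_t) / math.cos(theta_t)
```

At normal incidence the shift vanishes, which is why probe-volume distortion matters most for beams crossing the window obliquely.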

  17. Effect Coding as a Mechanism for Improving the Accuracy of Measuring Students Who Self-Identify with More than One Race

    ERIC Educational Resources Information Center

    Mayhew, Matthew J.; Simonoff, Jeffrey S.

    2015-01-01

    The purpose of this paper is to describe effect coding as an alternative quantitative practice for analyzing and interpreting categorical, multi-raced independent variables in higher education research. Not only may effect coding enable researchers to get closer to respondents' original intentions, it allows for more accurate analyses of all race…
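Effect coding differs from dummy coding in that the reference level is coded −1 in every indicator column, so regression coefficients are read as deviations from the grand mean rather than from the reference group. A minimal sketch with a hypothetical helper (not the authors' procedure):

```python
def effect_code(values, levels):
    """Sum-to-zero (effect) coding: one column per non-reference level;
    the last entry of `levels` is the reference and is coded -1 throughout."""
    reference = levels[-1]
    columns = levels[:-1]
    return [[-1 if v == reference else (1 if v == lvl else 0)
             for lvl in columns]
            for v in values]
```

In a balanced sample each column sums to zero, which is what centers the fitted coefficients on the grand mean.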

  18. 12 CFR 1016.12 - Limits on sharing account number information for marketing purposes.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... encrypted form, as long as you do not provide the recipient with a means to decode the number or code. (2... reporting agency, an account number or similar form of access number or access code for a consumer's credit... similar form of access number or access code: (1) To your agent or service provider solely in order to...

  19. 12 CFR 1016.12 - Limits on sharing account number information for marketing purposes.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... encrypted form, as long as you do not provide the recipient with a means to decode the number or code. (2... reporting agency, an account number or similar form of access number or access code for a consumer's credit... similar form of access number or access code: (1) To your agent or service provider solely in order to...

  20. 12 CFR 216.12 - Limits on sharing account number information for marketing purposes.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... code in an encrypted form, as long as you do not provide the recipient with a means to decode the... than to a consumer reporting agency, an account number or similar form of access number or access code... account number or similar form of access number or access code: (1) To your agent or service provider...
