Sample records for code development description

  1. Computer-based coding of free-text job descriptions to efficiently identify occupations in epidemiological studies

    PubMed Central

    Russ, Daniel E.; Ho, Kwan-Yuet; Colt, Joanne S.; Armenti, Karla R.; Baris, Dalsu; Chow, Wong-Ho; Davis, Faith; Johnson, Alison; Purdue, Mark P.; Karagas, Margaret R.; Schwartz, Kendra; Schwenn, Molly; Silverman, Debra T.; Johnson, Calvin A.; Friesen, Melissa C.

    2016-01-01

    Background: Mapping job titles to standardized occupation classification (SOC) codes is an important step in identifying occupational risk factors in epidemiologic studies. Because manual coding is time-consuming and has moderate reliability, we developed an algorithm called SOCcer (Standardized Occupation Coding for Computer-assisted Epidemiologic Research) to assign SOC-2010 codes based on free-text job description components. Methods: Job title and task-based classifiers were developed by comparing job descriptions to multiple sources linking job and task descriptions to SOC codes. An industry-based classifier was developed based on the SOC prevalence within an industry. These classifiers were used in a logistic model trained using 14,983 jobs with expert-assigned SOC codes to obtain empirical weights for an algorithm that scored each SOC/job description. We assigned the highest scoring SOC code to each job. SOCcer was validated in two occupational data sources by comparing SOC codes obtained from SOCcer to expert assigned SOC codes and lead exposure estimates obtained by linking SOC codes to a job-exposure matrix. Results: For 11,991 case-control study jobs, SOCcer-assigned codes agreed with 44.5% and 76.3% of manually assigned codes at the 6- and 2-digit level, respectively. Agreement increased with the score, providing a mechanism to identify assignments needing review. Good agreement was observed between lead estimates based on SOCcer and manual SOC assignments (kappa: 0.6–0.8). Poorer performance was observed for inspection job descriptions, which included abbreviations and worksite-specific terminology. Conclusions: Although some manual coding will remain necessary, using SOCcer may improve the efficiency of incorporating occupation into large-scale epidemiologic studies. PMID:27102331
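
    The scoring-and-selection step described above lends itself to a compact illustration: per-classifier scores are combined through logistic weights and the highest-scoring SOC code is assigned. The sketch below is a minimal, hypothetical rendering of that idea; the weights, feature scores, and candidate codes are invented placeholders, not values from the published SOCcer model. Since agreement rose with the score, a threshold on the returned score could flag low-confidence assignments for manual review.

```python
import math

# Hypothetical logistic-model weights; the published SOCcer weights are not reproduced here.
WEIGHTS = {"intercept": -4.0, "title": 3.5, "task": 2.0, "industry": 1.5}

def soc_score(features):
    """Combine job-title, task, and industry classifier scores into one logistic score."""
    z = WEIGHTS["intercept"] + sum(WEIGHTS[k] * features[k] for k in ("title", "task", "industry"))
    return 1.0 / (1.0 + math.exp(-z))

def assign_soc(candidates):
    """Score every candidate SOC code for one job and return the best code with its score."""
    return max(((code, soc_score(f)) for code, f in candidates.items()), key=lambda p: p[1])

# Candidate SOC codes for one free-text job description (scores are made up).
candidates = {
    "47-2061": {"title": 0.9, "task": 0.7, "industry": 0.8},   # construction laborers
    "53-7062": {"title": 0.4, "task": 0.5, "industry": 0.6},   # freight/stock laborers
}
best_code, score = assign_soc(candidates)
print(best_code, round(score, 3))   # a low score would be routed to expert review
```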

  2. NHEXAS PHASE I ARIZONA STUDY--STANDARD OPERATING PROCEDURE FOR CODING: DESCRIPTIVE QUESTIONNAIRE (UA-D-6.0)

    EPA Science Inventory

    The purpose of this SOP is to define the coding strategy for the Descriptive Questionnaire. This questionnaire was developed for use in the Arizona NHEXAS project and the "Border" study. Keywords: data; coding; descriptive questionnaire.

    The National Human Exposure Assessment...

  3. U.S.-MEXICO BORDER PROGRAM ARIZONA BORDER STUDY--STANDARD OPERATING PROCEDURE FOR CODING: DESCRIPTIVE QUESTIONNAIRE (UA-D-6.0)

    EPA Science Inventory

    The purpose of this SOP is to define the coding strategy for the Descriptive Questionnaire. This questionnaire was developed for use in the Arizona NHEXAS project and the Border study. Keywords: data; coding; descriptive questionnaire.

    The U.S.-Mexico Border Program is sponso...

  4. Computer-based coding of free-text job descriptions to efficiently identify occupations in epidemiological studies.

    PubMed

    Russ, Daniel E; Ho, Kwan-Yuet; Colt, Joanne S; Armenti, Karla R; Baris, Dalsu; Chow, Wong-Ho; Davis, Faith; Johnson, Alison; Purdue, Mark P; Karagas, Margaret R; Schwartz, Kendra; Schwenn, Molly; Silverman, Debra T; Johnson, Calvin A; Friesen, Melissa C

    2016-06-01

    Mapping job titles to standardised occupation classification (SOC) codes is an important step in identifying occupational risk factors in epidemiological studies. Because manual coding is time-consuming and has moderate reliability, we developed an algorithm called SOCcer (Standardized Occupation Coding for Computer-assisted Epidemiologic Research) to assign SOC-2010 codes based on free-text job description components. Job title and task-based classifiers were developed by comparing job descriptions to multiple sources linking job and task descriptions to SOC codes. An industry-based classifier was developed based on the SOC prevalence within an industry. These classifiers were used in a logistic model trained using 14 983 jobs with expert-assigned SOC codes to obtain empirical weights for an algorithm that scored each SOC/job description. We assigned the highest scoring SOC code to each job. SOCcer was validated in 2 occupational data sources by comparing SOC codes obtained from SOCcer to expert assigned SOC codes and lead exposure estimates obtained by linking SOC codes to a job-exposure matrix. For 11 991 case-control study jobs, SOCcer-assigned codes agreed with 44.5% and 76.3% of manually assigned codes at the 6-digit and 2-digit level, respectively. Agreement increased with the score, providing a mechanism to identify assignments needing review. Good agreement was observed between lead estimates based on SOCcer and manual SOC assignments (κ 0.6-0.8). Poorer performance was observed for inspection job descriptions, which included abbreviations and worksite-specific terminology. Although some manual coding will remain necessary, using SOCcer may improve the efficiency of incorporating occupation into large-scale epidemiological studies.

  5. The language parallel Pascal and other aspects of the massively parallel processor

    NASA Technical Reports Server (NTRS)

    Reeves, A. P.; Bruner, J. D.

    1982-01-01

    A high level language for the Massively Parallel Processor (MPP) was designed. This language, called Parallel Pascal, is described in detail. A description of the language design, a description of the intermediate language, Parallel P-Code, and details for the MPP implementation are included. Formal descriptions of Parallel Pascal and Parallel P-Code are given. A compiler was developed which converts programs in Parallel Pascal into the intermediate Parallel P-Code language. The code generator to complete the compiler for the MPP is being developed independently. A Parallel Pascal to Pascal translator was also developed. The architecture design for a VLSI version of the MPP was completed with a description of fault tolerant interconnection networks. The memory arrangement aspects of the MPP are discussed and a survey of other high level languages is given.

  6. Optimal bit allocation for hybrid scalable/multiple-description video transmission over wireless channels

    NASA Astrophysics Data System (ADS)

    Jubran, Mohammad K.; Bansal, Manu; Kondi, Lisimachos P.

    2006-01-01

    In this paper, we consider the problem of optimal bit allocation for wireless video transmission over fading channels. We use a newly developed hybrid scalable/multiple-description codec that combines the functionality of both scalable and multiple-description codecs. It produces a base layer and multiple-description enhancement layers. Any of the enhancement layers can be decoded (in a non-hierarchical manner) with the base layer to improve the reconstructed video quality. Two different channel coding schemes (Rate-Compatible Punctured Convolutional (RCPC)/Cyclic Redundancy Check (CRC) coding, and product code Reed-Solomon (RS)+RCPC/CRC coding) are used for unequal error protection of the layered bitstream. Optimal allocation of the bitrate between source and channel coding is performed for discrete sets of source coding rates and channel coding rates. Experimental results are presented for a wide range of channel conditions. Also, comparisons with classical scalable coding show the effectiveness of using hybrid scalable/multiple-description coding for wireless transmission.
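
    The optimization described (choosing from discrete sets of source and channel coding rates for given channel conditions) can be pictured as a small brute-force search. The sketch below only illustrates that search pattern under a toy distortion and loss model; the rate sets, budget, and formulas are hypothetical and unrelated to the paper's codec or channel models.

```python
import itertools

SOURCE_RATES = [0.25, 0.5, 1.0]     # hypothetical source rates (bits/pixel)
CHANNEL_RATES = [1/3, 1/2, 2/3]     # hypothetical RCPC code rates
TOTAL_BUDGET = 1.2                  # total transmitted-rate budget (same toy units)

def expected_distortion(src_rate, chan_rate, channel_snr):
    """Toy model: distortion falls with source rate; residual loss falls with protection."""
    source_term = 1.0 / (1.0 + 4.0 * src_rate)
    residual_loss = max(0.0, 0.5 - channel_snr * chan_rate)
    return source_term + residual_loss

def best_allocation(channel_snr):
    feasible = [(s, c) for s, c in itertools.product(SOURCE_RATES, CHANNEL_RATES)
                if s / c <= TOTAL_BUDGET]   # transmitted rate = source rate / code rate
    return min(feasible, key=lambda sc: expected_distortion(sc[0], sc[1], channel_snr))

print(best_allocation(channel_snr=0.6))   # noisier channel
print(best_allocation(channel_snr=1.2))   # cleaner channel
```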

  7. Automated Coding Software: Development and Use to Enhance Anti-Fraud Activities*

    PubMed Central

    Garvin, Jennifer H.; Watzlaf, Valerie; Moeini, Sohrab

    2006-01-01

    This descriptive research project identified characteristics of automated coding systems that have the potential to detect improper coding and to minimize improper or fraudulent coding practices in the setting of automated coding used with the electronic health record (EHR). Recommendations were also developed for software developers and users of coding products to maximize anti-fraud practices. PMID:17238546

  8. Open-source framework for documentation of scientific software written on MATLAB-compatible programming languages

    NASA Astrophysics Data System (ADS)

    Konnik, Mikhail V.; Welsh, James

    2012-09-01

    Numerical simulators for adaptive optics systems have become an essential tool for the research and development of future advanced astronomical instruments. However, the growing software code of a numerical simulator makes it difficult to continue to support the code itself. The problem of adequate documentation of astronomical software for adaptive optics simulators may complicate development, since the documentation must contain up-to-date schemes and mathematical descriptions implemented in the software code. Although most modern programming environments like MATLAB or Octave have built-in documentation abilities, they are often insufficient for the description of a typical adaptive optics simulator code. This paper describes a general cross-platform framework for the documentation of scientific software using open-source tools such as LaTeX, Mercurial, Doxygen, and Perl. Using a Perl script that translates MATLAB M-file comments into C-like comments, one can use Doxygen to generate and update the documentation for the scientific source code. The documentation generated by this framework contains the current code description with mathematical formulas, images, and bibliographical references. A detailed description of the framework components is presented, as well as guidelines for the framework deployment. Examples of the code documentation for the scripts and functions of a MATLAB-based adaptive optics simulator are provided.
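
    The key step in the framework described above is rewriting MATLAB comment blocks into a C-like form that Doxygen can parse; the authors do this with a Perl script. The Python snippet below is only a rough sketch of the same idea on a trivial subset of the syntax, not the framework's actual converter.

```python
def matlab_comments_to_c(source: str) -> str:
    """Rewrite leading '%' comment lines as '//' so a C-mode documentation parser accepts them."""
    out = []
    for line in source.splitlines():
        stripped = line.lstrip()
        if stripped.startswith("%"):
            indent = line[: len(line) - len(stripped)]
            out.append(indent + "//" + stripped[1:])
        else:
            out.append(line)
    return "\n".join(out)

# A made-up M-file fragment for demonstration only.
m_file = """% WFS_FRAME  Simulate one wavefront-sensor frame.
%   frame = wfs_frame(phase) returns the detector image.
function frame = wfs_frame(phase)
frame = abs(fft2(exp(1i*phase))).^2;
end
"""
print(matlab_comments_to_c(m_file))
```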

  9. Probabilistic Structural Analysis Methods (PSAM) for select space propulsion systems components

    NASA Technical Reports Server (NTRS)

    1991-01-01

    Summarized here is the technical effort and computer code developed during the five year duration of the program for probabilistic structural analysis methods. The summary includes a brief description of the computer code manuals and a detailed description of code validation demonstration cases for random vibrations of a discharge duct, probabilistic material nonlinearities of a liquid oxygen post, and probabilistic buckling of a transfer tube liner.

  10. Downtown Waterfront Form-Based Code Workshop

    EPA Pesticide Factsheets

    This document is a description of a Smart Growth Implementation Assistance for Coastal Communities project in Marquette, Michigan, to develop a form-based code that would attract and support vibrant development.

  11. Software quality and process improvement in scientific simulation codes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ambrosiano, J.; Webster, R.

    1997-11-01

    This report contains viewgraphs on the quest to develop better simulation code quality through process modeling and improvement. This study is based on the experience of the authors and interviews with ten subjects chosen from simulation code development teams at LANL. This study is descriptive rather than scientific.

  12. Utilization of recently developed codes for high power Brayton and Rankine cycle power systems

    NASA Technical Reports Server (NTRS)

    Doherty, Michael P.

    1993-01-01

    Two recently developed FORTRAN computer codes for high power Brayton and Rankine thermodynamic cycle analysis for space power applications are presented. The codes were written in support of an effort to develop a series of subsystem models for multimegawatt Nuclear Electric Propulsion, but their use is not limited just to nuclear heat sources or to electric propulsion. Code development background, a description of the codes, some sample input/output from one of the codes, and state future plans/implications for the use of these codes by NASA's Lewis Research Center are provided.

  13. Transport and equilibrium in field-reversed mirrors

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Boyd, J.K.

    Two plasma models relevant to compact torus research have been developed to study transport and equilibrium in field reversed mirrors. In the first model for small Larmor radius and large collision frequency, the plasma is described as an adiabatic hydromagnetic fluid. In the second model for large Larmor radius and small collision frequency, a kinetic theory description has been developed. Various aspects of the two models have been studied in five computer codes: ADB, AV, NEO, OHK, and RES. The ADB code computes two dimensional equilibrium and one dimensional transport in a flux coordinate. The AV code calculates orbit average integrals in a harmonic oscillator potential. The NEO code follows particle trajectories in a Hill's vortex magnetic field to study stochasticity, invariants of the motion, and orbit average formulas. The OHK code displays analytic ψ(r), B_Z(r), φ(r), E_r(r) formulas developed for the kinetic theory description. The RES code calculates resonance curves to consider overlap regions relevant to stochastic orbit behavior.

  14. Development of a MELCOR Sodium Chemistry (NAC) Package - FY17 Progress.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Louie, David; Humphries, Larry L.

    This report describes the status of the development of the MELCOR Sodium Chemistry (NAC) package. This development is based on the CONTAIN-LMR sodium physics and chemistry models to be implemented in MELCOR. In the past three years, the sodium equation of state as a working fluid from the nuclear fusion safety research and from the SIMMER code has been implemented into MELCOR. The chemistry models from the CONTAIN-LMR code, such as the spray and pool fire models, have also been implemented into MELCOR. This report describes the implemented models and the issues encountered. Model descriptions and input descriptions are provided. Development testing of the spray and pool fire models is described, including the code-to-code comparison with CONTAIN-LMR. The report ends with an expected timeline for the remaining models to be implemented, such as the atmosphere chemistry, sodium-concrete interactions, and experimental validation tests.

  15. Reactive transport codes for subsurface environmental simulation

    DOE PAGES

    Steefel, C. I.; Appelo, C. A. J.; Arora, B.; ...

    2014-09-26

    A general description of the mathematical and numerical formulations used in modern numerical reactive transport codes relevant for subsurface environmental simulations is presented. The formulations are followed by short descriptions of commonly used and available subsurface simulators that consider continuum representations of flow, transport, and reactions in porous media. These formulations are applicable to most of the subsurface environmental benchmark problems included in this special issue. The list of codes described briefly here includes PHREEQC, HPx, PHT3D, OpenGeoSys (OGS), HYTEC, ORCHESTRA, TOUGHREACT, eSTOMP, HYDROGEOCHEM, CrunchFlow, MIN3P, and PFLOTRAN. The descriptions include a high-level list of capabilities for each of the codes, along with a selective list of applications that highlight their capabilities and historical development.
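
    The continuum flow-transport-reaction formulations surveyed here are commonly advanced by operator splitting: a transport sub-step followed by a reaction sub-step. The fragment below is a generic one-dimensional illustration of that pattern (explicit advection-dispersion plus first-order decay); it is not drawn from any of the simulators listed above, and the grid sizes and coefficients are arbitrary.

```python
import numpy as np

nx, dx, dt = 100, 1.0, 0.5
v, D, k = 0.4, 0.8, 0.01        # pore velocity, dispersion coefficient, decay rate (arbitrary)
c = np.zeros(nx)                # concentration profile

def transport_step(c):
    """Explicit upwind advection plus central-difference dispersion."""
    cn = c.copy()
    adv = -v * (c[1:-1] - c[:-2]) / dx
    disp = D * (c[2:] - 2.0 * c[1:-1] + c[:-2]) / dx**2
    cn[1:-1] = c[1:-1] + dt * (adv + disp)
    cn[-1] = cn[-2]             # simple outflow boundary
    return cn

def reaction_step(c):
    """First-order decay integrated exactly over one time step."""
    return c * np.exp(-k * dt)

for _ in range(400):            # operator splitting: transport, then react
    c[0] = 1.0                  # inflow boundary held at unit concentration
    c = reaction_step(transport_step(c))

print(c[:10].round(3))
```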

  16. User's manual for Axisymmetric Diffuser Duct (ADD) code. Volume 1: General ADD code description

    NASA Technical Reports Server (NTRS)

    Anderson, O. L.; Hankins, G. B., Jr.; Edwards, D. E.

    1982-01-01

    This User's Manual contains a complete description of the computer codes known as the AXISYMMETRIC DIFFUSER DUCT code or ADD code. It includes a list of references which describe the formulation of the ADD code and comparisons of calculation with experimental flows. The input/output and general use of the code is described in the first volume. The second volume contains a detailed description of the code including the global structure of the code, list of FORTRAN variables, and descriptions of the subroutines. The third volume contains a detailed description of the CODUCT code which generates coordinate systems for arbitrary axisymmetric ducts.

  17. 50 CFR Table 15 to Part 679 - Gear Codes, Descriptions, and Use

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 50 CFR Wildlife and Fisheries, ALASKA, Pt. 679, Table 15 to Part 679—Gear Codes, Descriptions, and Use (X indicates where this code is used). Name of gear; use alphabetic code to complete the...

  18. The Social Interactive Coding System (SICS): An On-Line, Clinically Relevant Descriptive Tool.

    ERIC Educational Resources Information Center

    Rice, Mabel L.; And Others

    1990-01-01

    The Social Interactive Coding System (SICS) assesses the continuous verbal interactions of preschool children as a function of play areas, addressees, script codes, and play levels. This paper describes the 26 subjects and the setting involved in SICS development, coding definitions and procedures, training procedures, reliability, sample…

  19. CRAC2 model description

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ritchie, L.T.; Alpert, D.J.; Burke, R.P.

    1984-03-01

    The CRAC2 computer code is a revised version of CRAC (Calculation of Reactor Accident Consequences) which was developed for the Reactor Safety Study. This document provides an overview of the CRAC2 code and a description of each of the models used. Significant improvements incorporated into CRAC2 include an improved weather sequence sampling technique, a new evacuation model, and new output capabilities. In addition, refinements have been made to the atmospheric transport and deposition model. Details of the modeling differences between CRAC2 and CRAC are emphasized in the model descriptions.

  20. Coupled-cluster based R-matrix codes (CCRM): Recent developments

    NASA Astrophysics Data System (ADS)

    Sur, Chiranjib; Pradhan, Anil K.

    2008-05-01

    We report the ongoing development of the new coupled-cluster R-matrix codes (CCRM) for treating electron-ion scattering and radiative processes within the framework of the relativistic coupled-cluster method (RCC), interfaced with the standard R-matrix methodology. The RCC method is size consistent and in principle equivalent to an all-order many-body perturbation theory. The RCC method is one of the most accurate many-body theories, and has been applied to several systems. This project should enable the study of electron interactions with heavy atoms/ions, utilizing not only high speed computing platforms but also an improved theoretical description of the relativistic and correlation effects for the target atoms/ions as treated extensively within the RCC method. Here we present a comprehensive outline of the newly developed theoretical method and a schematic representation of the new suite of CCRM codes. We begin with the flowchart and description of various stages involved in this development. We retain the notations and nomenclature of different stages as analogous to the standard R-matrix codes.

  1. Computer codes developed and under development at Lewis

    NASA Technical Reports Server (NTRS)

    Chamis, Christos C.

    1992-01-01

    The objective of this summary is to provide a brief description of: (1) codes developed or under development at LeRC; and (2) the development status of IPACS with some typical early results. The computer codes that have been developed and/or are under development at LeRC are listed in the accompanying charts. This list includes: (1) the code acronym; (2) select physics descriptors; (3) current enhancements; and (4) present (9/91) code status with respect to its availability and documentation. The computer codes list is grouped by related functions such as: (1) composite mechanics; (2) composite structures; (3) integrated and 3-D analysis; (4) structural tailoring; and (5) probabilistic structural analysis. These codes provide a broad computational simulation infrastructure (technology base-readiness) for assessing the structural integrity/durability/reliability of propulsion systems. These codes serve two other very important functions: they provide an effective means of technology transfer; and they constitute a depository of corporate memory.

  2. User's manual: Subsonic/supersonic advanced panel pilot code

    NASA Technical Reports Server (NTRS)

    Moran, J.; Tinoco, E. N.; Johnson, F. T.

    1978-01-01

    Sufficient instructions for running the subsonic/supersonic advanced panel pilot code were developed. This software was developed as a vehicle for numerical experimentation and it should not be construed to represent a finished production program. The pilot code is based on a higher order panel method using linearly varying source and quadratically varying doublet distributions for computing both linearized supersonic and subsonic flow over arbitrary wings and bodies. This user's manual contains complete input and output descriptions. A brief description of the method is given as well as practical instructions for proper configuration modeling. Computed results are also included to demonstrate some of the capabilities of the pilot code. The computer program is written in FORTRAN IV for the SCOPE 3.4.4 operating system of the Ames CDC 7600 computer. The program uses overlay structure and thirteen disk files, and it requires approximately 132000 (octal) central memory words.

  3. Multiple description distributed image coding with side information for mobile wireless transmission

    NASA Astrophysics Data System (ADS)

    Wu, Min; Song, Daewon; Chen, Chang Wen

    2005-03-01

    Multiple description coding (MDC) is a source coding technique that involves coding the source information into multiple descriptions and then transmitting them over different channels in a packet network or error-prone wireless environment, so that graceful degradation is achieved if parts of the descriptions are lost at the receiver. In this paper, we propose a multiple description distributed wavelet zero-tree image coding system for mobile wireless transmission. We provide two innovations to achieve an excellent error resilient capability. First, when MDC is applied to wavelet subband based image coding, it is possible to introduce correlation between the descriptions in each subband. We consider using such correlation, as well as a potentially error-corrupted description, as side information in the decoding, formulating MDC decoding as a Wyner-Ziv decoding problem. If only part of the descriptions is lost, their correlation information is still available, and the proposed Wyner-Ziv decoder can recover the lost description by using the correlation information and the error-corrupted description as side information. Secondly, in each description, single-bitstream wavelet zero-tree coding is very vulnerable to channel errors: the first bit error may cause the decoder to discard all subsequent bits, whether or not those bits are correctly received. Therefore, we integrate multiple description scalar quantization (MDSQ) with the multiple wavelet tree image coding method to reduce error propagation. We first group wavelet coefficients into multiple trees according to the parent-child relationship and then code them separately by the SPIHT algorithm to form multiple bitstreams. Such decomposition reduces error propagation and therefore improves the error correcting capability of the Wyner-Ziv decoder. Experimental results show that the proposed scheme not only exhibits excellent error resilient performance but also demonstrates graceful degradation over the packet loss rate.
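
    The error-containment idea in the second innovation (splitting wavelet coefficients into parent-child trees and coding each group as its own bitstream) can be sketched with a toy grouping routine. The snippet below only illustrates the grouping step for a one-dimensional dyadic layout with two descriptions; it is not the paper's MDSQ/SPIHT implementation, and the coefficient layout is a simplification.

```python
def tree(root, n_coeffs):
    """Indices in the parent-child tree rooted at `root` (children of index k are 2k and 2k+1)."""
    nodes, frontier = [root], [root]
    while frontier:
        frontier = [c for p in frontier for c in (2 * p, 2 * p + 1) if c < n_coeffs]
        nodes.extend(frontier)
    return nodes

def split_into_descriptions(n_coeffs, n_roots, n_desc=2):
    """Assign alternate parent-child trees to alternate descriptions."""
    groups = [[] for _ in range(n_desc)]
    for root in range(n_roots, 2 * n_roots):    # roots sit in the coarsest detail band
        groups[root % n_desc].append(tree(root, n_coeffs))
    return groups                               # each group is then coded independently (e.g. SPIHT)

desc0, desc1 = split_into_descriptions(n_coeffs=64, n_roots=8)
print(desc0[0])                 # one whole tree: losing a packet corrupts only the trees it carries
print(len(desc0), len(desc1))
```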

  4. Guide to AERO2S and WINGDES Computer Codes for Prediction and Minimization of Drag Due to Lift

    NASA Technical Reports Server (NTRS)

    Carlson, Harry W.; Chu, Julio; Ozoroski, Lori P.; McCullers, L. Arnold

    1997-01-01

    The computer codes, AERO2S and WINGDES, are now widely used for the analysis and design of airplane lifting surfaces under conditions that tend to induce flow separation. These codes have undergone continued development to provide additional capabilities since the introduction of the original versions over a decade ago. This code development has been reported in a variety of publications (NASA technical papers, NASA contractor reports, and society journals). Some modifications have not been publicized at all. Users of these codes have suggested the desirability of combining in a single document the descriptions of the code development, an outline of the features of each code, and suggestions for effective code usage. This report is intended to supply that need.

  5. Utilization of Patch/Triangular Target Description Data in BRL Parallel Ray Vulnerability Assessment Codes

    DTIC Science & Technology

    1979-09-01

    Keywords: Target Descriptions; GIFT Code; COMGEOM Descriptions; FASTGEN Code. ...which accepts the COMGEOM target description and produces the shotline data is the GIFT code. The GIFT code evolved from and has...the COMGEOM/GIFT methodology, while the Navy and Air Force use the PATCH/SHOTGEN-FASTGEN methodology. Lawrence W. Bain, Mathew J. Heisinger

  6. User's manual for three dimensional FDTD version B code for scattering from frequency-dependent dielectric materials

    NASA Technical Reports Server (NTRS)

    Beggs, John H.; Luebbers, Raymond J.; Kunz, Karl S.

    1992-01-01

    The Penn State Finite Difference Time Domain Electromagnetic Code Version B is a three dimensional numerical electromagnetic scattering code based upon the Finite Difference Time Domain Technique (FDTD). The supplied version of the code is one version of our current three dimensional FDTD code set. This manual provides a description of the code and corresponding results for several scattering problems. The manual is organized into 14 sections: introduction, description of the FDTD method, operation, resource requirements, Version B code capabilities, a brief description of the default scattering geometry, a brief description of each subroutine, a description of the include file, a discussion of radar cross section computations, a discussion of some scattering results, a sample problem setup section, a new problem checklist, references and figure titles.
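
    For readers unfamiliar with the technique this manual (and the related manuals below) documents, FDTD advances electric and magnetic fields in a leapfrog fashion on a staggered grid. The loop below is a generic one-dimensional, normalized-units illustration of that update pattern with perfectly conducting ends; it has no connection to the Penn State code's implementation.

```python
import numpy as np

nz, nsteps = 200, 300
ez = np.zeros(nz)       # electric field samples
hy = np.zeros(nz - 1)   # magnetic field samples, staggered half a cell from ez

for n in range(nsteps):
    hy += ez[1:] - ez[:-1]              # H update (Courant number 1, normalized units)
    ez[1:-1] += hy[1:] - hy[:-1]        # E update; ez[0] and ez[-1] stay 0 (PEC walls)
    ez[nz // 2] += np.exp(-0.5 * ((n - 30) / 8.0) ** 2)   # soft Gaussian source mid-grid

print(float(np.abs(ez).max()))
```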

  7. The Marshall Engineering Thermosphere (MET) Model. Volume 1; Technical Description

    NASA Technical Reports Server (NTRS)

    Smith, R. E.

    1998-01-01

    Volume 1 presents a technical description of the Marshall Engineering Thermosphere (MET) model atmosphere and a summary of its historical development. Various programs developed to augment the original capability of the model are discussed in detail. The report also describes each of the individual subroutines developed to enhance the model. Computer codes for these subroutines are contained in four appendices.

  8. Development and validation of a complementary map to enhance the existing 1998 to 2008 Abbreviated Injury Scale map

    PubMed Central

    2011-01-01

    Introduction: Many trauma registries have used the Abbreviated Injury Scale 1990 Revision Update 98 (AIS98) to classify injuries. In the current AIS version (Abbreviated Injury Scale 2005 Update 2008 - AIS08), injury classification and specificity differ substantially from AIS98, and the mapping tools provided in the AIS08 dictionary are incomplete. As a result, data from different AIS versions cannot currently be compared. The aim of this study was to develop an additional AIS98 to AIS08 mapping tool to complement the current AIS dictionary map, and then to evaluate the completed map (produced by combining these two maps) using double-coded data. The value of additional information provided by free text descriptions accompanying assigned codes was also assessed. Methods: Using a modified Delphi process, a panel of expert AIS coders established plausible AIS08 equivalents for the 153 AIS98 codes which currently have no AIS08 map. A series of major trauma patients whose injuries had been double-coded in AIS98 and AIS08 was used to assess the maps; both of the AIS datasets had already been mapped to another AIS version using the AIS dictionary maps. Following application of the completed (enhanced) map with or without free text evaluation, up to six AIS codes were available for each injury. Datasets were assessed for agreement in injury severity measures, and the relative performances of the maps in accurately describing the trauma population were evaluated. Results: The double-coded injuries sustained by 109 patients were used to assess the maps. For data conversion from AIS98, both the enhanced map and the enhanced map with free text description resulted in higher levels of accuracy and agreement with directly coded AIS08 data than the currently available dictionary map. Paired comparisons demonstrated significant differences between direct coding and the dictionary maps, but not with either of the enhanced maps. Conclusions: The newly-developed AIS98 to AIS08 complementary map enabled transformation of the trauma population description given by AIS98 into an AIS08 estimate which was statistically indistinguishable from directly coded AIS08 data. It is recommended that the enhanced map should be adopted for dataset conversion, using free text descriptions if available. PMID:21548991
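
    Operationally, the completed map amounts to consulting the dictionary map first and the complementary map for the 153 codes it omits. The sketch below shows that combined lookup; the code pairs are invented placeholders, not entries from either map, and the free-text argument merely stands in for the descriptions assessed in the study.

```python
# Hypothetical map fragments (AIS98 code -> AIS08 code); real entries differ.
DICTIONARY_MAP = {"450210.3": "450203.3"}
COMPLEMENTARY_MAP = {"854751.2": "854461.2"}    # would cover codes the dictionary map omits

def map_ais98_to_ais08(ais98_code, free_text=None):
    """Convert one AIS98 code, trying the dictionary map first, then the complementary map.

    `free_text` stands in for the injury description a coder could consult when the
    mapped code is ambiguous; it is unused in this toy version.
    """
    if ais98_code in DICTIONARY_MAP:
        return DICTIONARY_MAP[ais98_code], "dictionary map"
    if ais98_code in COMPLEMENTARY_MAP:
        return COMPLEMENTARY_MAP[ais98_code], "complementary map"
    return None, "unmapped"

for code in ("450210.3", "854751.2", "999999.9"):
    print(code, "->", map_ais98_to_ais08(code))
```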

  9. Development and validation of a complementary map to enhance the existing 1998 to 2008 Abbreviated Injury Scale map.

    PubMed

    Palmer, Cameron S; Franklyn, Melanie; Read-Allsopp, Christine; McLellan, Susan; Niggemeyer, Louise E

    2011-05-08

    Many trauma registries have used the Abbreviated Injury Scale 1990 Revision Update 98 (AIS98) to classify injuries. In the current AIS version (Abbreviated Injury Scale 2005 Update 2008 - AIS08), injury classification and specificity differ substantially from AIS98, and the mapping tools provided in the AIS08 dictionary are incomplete. As a result, data from different AIS versions cannot currently be compared. The aim of this study was to develop an additional AIS98 to AIS08 mapping tool to complement the current AIS dictionary map, and then to evaluate the completed map (produced by combining these two maps) using double-coded data. The value of additional information provided by free text descriptions accompanying assigned codes was also assessed. Using a modified Delphi process, a panel of expert AIS coders established plausible AIS08 equivalents for the 153 AIS98 codes which currently have no AIS08 map. A series of major trauma patients whose injuries had been double-coded in AIS98 and AIS08 was used to assess the maps; both of the AIS datasets had already been mapped to another AIS version using the AIS dictionary maps. Following application of the completed (enhanced) map with or without free text evaluation, up to six AIS codes were available for each injury. Datasets were assessed for agreement in injury severity measures, and the relative performances of the maps in accurately describing the trauma population were evaluated. The double-coded injuries sustained by 109 patients were used to assess the maps. For data conversion from AIS98, both the enhanced map and the enhanced map with free text description resulted in higher levels of accuracy and agreement with directly coded AIS08 data than the currently available dictionary map. Paired comparisons demonstrated significant differences between direct coding and the dictionary maps, but not with either of the enhanced maps. The newly-developed AIS98 to AIS08 complementary map enabled transformation of the trauma population description given by AIS98 into an AIS08 estimate which was statistically indistinguishable from directly coded AIS08 data. It is recommended that the enhanced map should be adopted for dataset conversion, using free text descriptions if available.

  10. System Design Description for the TMAD Code

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Finfrock, S.H.

    This document serves as the System Design Description (SDD) for the TMAD Code System, which includes the TMAD code and the LIBMAKR code. The SDD provides a detailed description of the theory behind the code, and the implementation of that theory. It is essential for anyone who is attempting to review or modify the code or who otherwise needs to understand the internal workings of the code. In addition, this document includes, in Appendix A, the System Requirements Specification for the TMAD System.

  11. Micro PAVER, Version 1.0, User’s Guide, Airport Pavement Management System,

    DTIC Science & Technology

    1986-10-01

    repair data have been entered for the policy, the following prompts will appear on your screen. Policy Number: 1 Policy Description: PRIMARY RUNWAYS AND... Material Codes (those material codes entered by the Micro PAVER developers) cannot be modified or deleted. New material codes can be added, modified, or

  12. User's manual for three dimensional FDTD version C code for scattering from frequency-independent dielectric and magnetic materials

    NASA Technical Reports Server (NTRS)

    Beggs, John H.; Luebbers, Raymond J.; Kunz, Karl S.

    1991-01-01

    The Penn State Finite Difference Time Domain Electromagnetic Scattering Code Version C is a three dimensional numerical electromagnetic scattering code based upon the Finite Difference Time Domain Technique (FDTD). The supplied version of the code is one version of our current three dimensional FDTD code set. This manual provides a description of the code and corresponding results for several scattering problems. The manual is organized into fourteen sections: introduction, description of the FDTD method, operation, resource requirements, Version C code capabilities, a brief description of the default scattering geometry, a brief description of each subroutine, a description of the include file (COMMONC.FOR), a section briefly discussing Radar Cross Section (RCS) computations, a section discussing some scattering results, a sample problem setup section, a new problem checklist, references and figure titles.

  13. User's manual for three dimensional FDTD version D code for scattering from frequency-dependent dielectric and magnetic materials

    NASA Technical Reports Server (NTRS)

    Beggs, John H.; Luebbers, Raymond J.; Kunz, Karl S.

    1991-01-01

    The Penn State Finite Difference Time Domain Electromagnetic Scattering Code Version D is a three dimensional numerical electromagnetic scattering code based upon the Finite Difference Time Domain Technique (FDTD). The supplied version of the code is one version of our current three dimensional FDTD code set. This manual provides a description of the code and corresponding results for several scattering problems. The manual is organized into fourteen sections: introduction, description of the FDTD method, operation, resource requirements, Version D code capabilities, a brief description of the default scattering geometry, a brief description of each subroutine, a description of the include file (COMMOND.FOR), a section briefly discussing Radar Cross Section (RCS) computations, a section discussing some scattering results, a sample problem setup section, a new problem checklist, references and figure titles.

  14. User's manual for three dimensional FDTD version A code for scattering from frequency-independent dielectric materials

    NASA Technical Reports Server (NTRS)

    Beggs, John H.; Luebbers, Raymond J.; Kunz, Karl S.

    1992-01-01

    The Penn State Finite Difference Time Domain (FDTD) Electromagnetic Scattering Code Version A is a three dimensional numerical electromagnetic scattering code based on the Finite Difference Time Domain technique. The supplied version of the code is one version of our current three dimensional FDTD code set. The manual provides a description of the code and the corresponding results for the default scattering problem. The manual is organized into 14 sections: introduction, description of the FDTD method, operation, resource requirements, Version A code capabilities, a brief description of the default scattering geometry, a brief description of each subroutine, a description of the include file (COMMONA.FOR), a section briefly discussing radar cross section (RCS) computations, a section discussing the scattering results, a sample problem setup section, a new problem checklist, references, and figure titles.

  15. User's manual for three dimensional FDTD version C code for scattering from frequency-independent dielectric and magnetic materials

    NASA Technical Reports Server (NTRS)

    Beggs, John H.; Luebbers, Raymond J.; Kunz, Karl S.

    1992-01-01

    The Penn State Finite Difference Time Domain Electromagnetic Scattering Code Version C is a three-dimensional numerical electromagnetic scattering code based on the Finite Difference Time Domain (FDTD) technique. The supplied version of the code is one version of our current three-dimensional FDTD code set. The manual given here provides a description of the code and corresponding results for several scattering problems. The manual is organized into 14 sections: introduction, description of the FDTD method, operation, resource requirements, Version C code capabilities, a brief description of the default scattering geometry, a brief description of each subroutine, a description of the include file (COMMONC.FOR), a section briefly discussing radar cross section computations, a section discussing some scattering results, a new problem checklist, references, and figure titles.

  16. User's manual for three dimensional FDTD version B code for scattering from frequency-dependent dielectric materials

    NASA Technical Reports Server (NTRS)

    Beggs, John H.; Luebbers, Raymond J.; Kunz, Karl S.

    1991-01-01

    The Penn State Finite Difference Time Domain Electromagnetic Scattering Code Version B is a three dimensional numerical electromagnetic scattering code based upon the Finite Difference Time Domain Technique (FDTD). The supplied version of the code is one version of our current three dimensional FDTD code set. This manual provides a description of the code and corresponding results for several scattering problems. The manual is organized into fourteen sections: introduction, description of the FDTD method, operation, resource requirements, Version B code capabilities, a brief description of the default scattering geometry, a brief description of each subroutine, a description of the include file (COMMONB.FOR), a section briefly discussing Radar Cross Section (RCS) computations, a section discussing some scattering results, a sample problem setup section, a new problem checklist, references and figure titles.

  17. Liquid rocket combustor computer code development

    NASA Technical Reports Server (NTRS)

    Liang, P. Y.

    1985-01-01

    The Advanced Rocket Injector/Combustor Code (ARICC), which has been developed to model the complete chemical/fluid/thermal processes occurring inside rocket combustion chambers, is highlighted. The code, derived from the CONCHAS-SPRAY code originally developed at Los Alamos National Laboratory, incorporates powerful features such as the ability to model complex injector combustion chamber geometries, Lagrangian tracking of droplets, full chemical equilibrium and kinetic reactions for multiple species, a fractional volume of fluid (VOF) description of liquid jet injection in addition to the gaseous phase fluid dynamics, and turbulent mass, energy, and momentum transport. Atomization and droplet dynamic models from earlier generation codes are transplanted into the present code. Currently, ARICC is specialized for liquid oxygen/hydrogen propellants, although other fuel/oxidizer pairs can be easily substituted.

  18. First principles numerical model of avalanche-induced arc discharges in electron-irradiated dielectrics

    NASA Technical Reports Server (NTRS)

    Beers, B. L.; Pine, V. W.; Hwang, H. C.; Bloomberg, H. W.; Lin, D. L.; Schmidt, M. J.; Strickland, D. J.

    1979-01-01

    The model consists of four phases: single electron dynamics, single electron avalanche, negative streamer development, and tree formation. Numerical algorithms and computer code implementations are presented for the first three phases. An approach to developing a code description of the fourth phase is discussed. Numerical results are presented for a crude material model of Teflon.

  19. TFaNS Tone Fan Noise Design/Prediction System. Volume 1; System Description, CUP3D Technical Documentation and Manual for Code Developers

    NASA Technical Reports Server (NTRS)

    Topol, David A.

    1999-01-01

    TFaNS is the Tone Fan Noise Design/Prediction System developed by Pratt & Whitney under contract to NASA Lewis (presently NASA Glenn). The purpose of this system is to predict tone noise emanating from a fan stage including the effects of reflection and transmission by the rotor and stator and by the duct inlet and nozzle. These effects have been added to an existing annular duct/isolated stator noise prediction capability. TFaNS consists of: The codes that compute the acoustic properties (reflection and transmission coefficients) of the various elements and write them to files. Cup3D: Fan Noise Coupling Code that reads these files, solves the coupling problem, and outputs the desired noise predictions. AWAKEN: CFD/Measured Wake Postprocessor which reformats CFD wake predictions and/or measured wake data so it can be used by the system. This volume of the report provides technical background for TFaNS including the organization of the system and CUP3D technical documentation. This document also provides information for code developers who must write Acoustic Property Files in the CUP3D format. This report is divided into three volumes: Volume I: System Description, CUP3D Technical Documentation, and Manual for Code Developers; Volume II: User's Manual, TFaNS Vers. 1.4; Volume III: Evaluation of System Codes.

  20. Progress toward the development of an aircraft icing analysis capability

    NASA Technical Reports Server (NTRS)

    Shaw, R. J.

    1984-01-01

    An overview of the NASA efforts to develop an aircraft icing analysis capability is presented. Discussions are included of the overall and long-term objectives of the program as well as current capabilities and limitations of the various computer codes being developed. Descriptions are given of codes being developed to analyze two and three dimensional trajectories of water droplets, airfoil ice accretion, aerodynamic performance degradation of components and complete aircraft configurations, electrothermal deicer, and fluid freezing point depressant deicer. The need for benchmark and verification data to support the code development is also discussed.

  1. The EDIT-COMGEOM Code

    DTIC Science & Technology

    1975-09-01

    This report assumes a familiarity with the GIFT and MAGIC computer codes. The EDIT-COMGEOM code is a FORTRAN computer code. The EDIT-COMGEOM code...converts the target description data which was used in the MAGIC computer code to the target description data which can be used in the GIFT computer code

  2. Infrastructure for Rapid Development of Java GUI Programs

    NASA Technical Reports Server (NTRS)

    Jones, Jeremy; Hostetter, Carl F.; Wheeler, Philip

    2006-01-01

    The Java Application Shell (JAS) is a software framework that accelerates the development of Java graphical-user-interface (GUI) application programs by enabling the reuse of common, proven GUI elements, as distinguished from writing custom code for GUI elements. JAS is a software infrastructure upon which Java interactive application programs and graphical user interfaces (GUIs) for those programs can be built as sets of plug-ins. JAS provides an application programming interface that is extensible by application-specific plug-ins that describe and encapsulate both specifications of a GUI and application-specific functionality tied to the specified GUI elements. The desired GUI elements are specified in Extensible Markup Language (XML) descriptions instead of in compiled code. JAS reads and interprets these descriptions, then creates and configures a corresponding GUI from a standard set of generic, reusable GUI elements. These elements are then attached (again, according to the XML descriptions) to application-specific compiled code and scripts. An application program constructed by use of JAS as its core can be extended by writing new plug-ins and replacing existing plug-ins. Thus, JAS solves many problems that Java programmers generally solve anew for each project, thereby reducing development and testing time.
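
    The central idea (GUI structure read from XML and wired to application code at run time rather than compiled in) can be shown with a small stand-in. The sketch below parses an invented XML layout and builds Tkinter widgets from it; the element names, attributes, and action binding bear no relation to JAS's actual schema or API.

```python
import tkinter as tk
import xml.etree.ElementTree as ET

LAYOUT = """
<window title="Demo">
  <label text="Target name:"/>
  <entry name="target"/>
  <button text="Run" action="run"/>
</window>
"""

# Application-specific behavior is bound to the GUI by name, not compiled into the layout.
ACTIONS = {"run": lambda: print("run requested")}

def build_gui(xml_text):
    root_el = ET.fromstring(xml_text)
    win = tk.Tk()
    win.title(root_el.get("title", ""))
    fields = {}
    for el in root_el:
        if el.tag == "label":
            tk.Label(win, text=el.get("text", "")).pack(anchor="w")
        elif el.tag == "entry":
            fields[el.get("name")] = entry = tk.Entry(win)
            entry.pack(fill="x")
        elif el.tag == "button":
            tk.Button(win, text=el.get("text", ""), command=ACTIONS[el.get("action")]).pack()
    return win, fields

if __name__ == "__main__":
    window, fields = build_gui(LAYOUT)
    window.mainloop()
```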

  3. Math Description Engine Software Development Kit

    NASA Technical Reports Server (NTRS)

    Shelton, Robert O.; Smith, Stephanie L.; Dexter, Dan E.; Hodgson, Terry R.

    2010-01-01

    The Math Description Engine Software Development Kit (MDE SDK) can be used by software developers to make computer-rendered graphs more accessible to blind and visually-impaired users. The MDE SDK generates alternative graph descriptions in two forms: textual descriptions and non-verbal sound renderings, or sonification. It also enables display of an animated trace of a graph sonification on a visual graph component, with color and line-thickness options for users having low vision or color-related impairments. A set of accessible graphical user interface widgets is provided for operation by end users and for control of accessible graph displays. Version 1.0 of the MDE SDK generates text descriptions for 2D graphs commonly seen in math and science curriculum (and practice). The mathematically rich text descriptions can also serve as a virtual math and science assistant for blind and sighted users, making graphs more accessible for everyone. The MDE SDK has a simple application programming interface (API) that makes it easy for programmers and Web-site developers to make graphs accessible with just a few lines of code. The source code is written in Java for cross-platform compatibility and to take advantage of Java's built-in support for building accessible software application interfaces. Compiled-library and NASA Open Source versions are available with API documentation and Programmer's Guide at http://prime.jsc.nasa.gov.

  4. Computer programs to predict induced effects of jets exhausting into a crossflow

    NASA Technical Reports Server (NTRS)

    Perkins, S. C., Jr.; Mendenhall, M. R.

    1984-01-01

    A user's manual is presented for two computer programs developed to predict the induced effects of jets exhausting into a crossflow. Program JETPLT predicts pressures induced on an infinite flat plate by a jet exhausting at angles to the plate, and Program JETBOD, in conjunction with a panel code, predicts pressures induced on a body of revolution by a jet exhausting normal to the surface. Both codes use a potential model of the jet and adjacent surface with empirical corrections for the viscous or nonpotential effects. This program manual contains a description of the use of both programs, instructions for preparation of input, descriptions of the output, limitations of the codes, and sample cases. In addition, procedures to extend both codes to include additional empirical correlations are described.

  5. User's manual for three-dimensional analysis of propeller flow fields

    NASA Technical Reports Server (NTRS)

    Chaussee, D. S.; Kutler, P.

    1983-01-01

    A detailed operating manual is presented for the prop-fan computer code (in addition to supporting programs) recently developed by Kutler, Chaussee, Sorenson, and Pulliam while at NASA's Ames Research Center. This code solves the inviscid Euler equations using an implicit numerical procedure developed by Beam and Warming of Ames. A description of the underlying theory, numerical techniques, and boundary conditions with equations, formulas, and methods for the mesh generation program (MGP), three dimensional prop-fan flow field program (3DPFP), and data reduction program (DRP) is provided, together with complete operating instructions. In addition, a programmer's manual is also provided to assist the user interested in modifying the codes. Included in the programmer's manual for each program is a description of the input and output variables, flow charts, program listings, sample input and output data, and operating hints.

  6. PubMed Central

    Leduc, Y.; Cauchon, M.; Emond, J. G.; Ouellet, J.

    1995-01-01

    OBJECTIVE: To develop and implement a computerized version of the International Classification of Primary Care. To create a data bank and to conduct a descriptive study of our clinic's clientele. DESIGN: Testing a software program and creating a data bank. SETTING: Family Medicine Unit at Enfant-Jésus Hospital, Quebec City. PARTICIPANTS: All Family Medicine Unit doctors and patients seen between July 1, 1990, and June 30, 1993. MAIN OUTCOME MEASURE: Description of our clientele's health problems using the ICPC. RESULTS: During the study, 48,415 diagnostic codes for 33,033 visits were entered into the bank. For close to 50% of these visits, two or more health problems were coded. There was good correlation between the description of our clientele and descriptions in other studies in the literature. CONCLUSION: This article describes the development of a data bank in a family medicine unit using a software program based on the ICPC. Our 3-year experiment demonstrated that the method works well in family physicians' daily practice. A descriptive study of our clientele is presented, as well as a few examples of the many applications of such a data bank. PMID:7580382

  7. REXOR 2 rotorcraft simulation model. Volume 1: Engineering documentation

    NASA Technical Reports Server (NTRS)

    Reaser, J. S.; Kretsinger, P. H.

    1978-01-01

    A rotorcraft nonlinear simulation called REXOR II, divided into three volumes, is described. The first volume is a development of rotorcraft mechanics and aerodynamics. The second is a development and explanation of the computer code required to implement the equations of motion. The third volume is a user's manual, and contains a description of code input/output as well as operating instructions.

  8. 42 CFR Appendix A to Part 81 - Glossary of ICD-9 Codes and Their Cancer Descriptions 1

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 42 CFR Public Health, Pt. 81, Appendix A to Part 81—Glossary of ICD-9 Codes and Their Cancer Descriptions. ICD-9 code and cancer description: 140, Malignant neoplasm of lip; 141, Malignant neoplasm of tongue; 142, Malignant...
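
    The appendix is essentially a lookup table from ICD-9 code to cancer description, which maps naturally onto a dictionary. The sketch below includes only the two entries fully visible in the excerpt above; the entry for code 142 is truncated there and is therefore omitted.

```python
# Only the entries fully visible in the excerpt; the full appendix lists many more codes.
ICD9_CANCER_GLOSSARY = {
    "140": "Malignant neoplasm of lip",
    "141": "Malignant neoplasm of tongue",
}

def describe(icd9_code: str) -> str:
    return ICD9_CANCER_GLOSSARY.get(icd9_code, "code not in this excerpt")

print(describe("141"))
print(describe("150"))
```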

  9. Computer Description of Black Hawk Helicopter

    DTIC Science & Technology

    1979-06-01

    Keywords: combinatorial geometry models; Black Hawk helicopter; GIFT computer code; geometric description of targets. ABSTRACT: ...description was made using the technique of combinatorial geometry (COM-GEOM) and will be used as input to the GIFT computer code which generates... The data used by the COVART computer code was generated by the Geometric Information for Targets (GIFT) computer code. This report documents

  10. The "Motherese" of Mr. Rogers: A Description of the Dialogue of Educational Television Programs.

    ERIC Educational Resources Information Center

    Rice, Mabel L.; Haight, Patti L.

    Dialogue from 30-minute samples from "Sesame Street" and "Mr. Rogers' Neighborhood" was coded for grammar, content, and discourse. Grammatical analysis used the LINGQUEST computer-assisted language assessment program (Mordecai, Palen, and Palmer 1982). Content coding was based on categories developed by Rice (1984) and…

  11. User's manual for three dimensional FDTD version A code for scattering from frequency-independent dielectric materials

    NASA Technical Reports Server (NTRS)

    Beggs, John H.; Luebbers, Raymond J.; Kunz, Karl S.

    1991-01-01

    The Finite Difference Time Domain Electromagnetic Scattering Code Version A is a three dimensional numerical electromagnetic scattering code based upon the Finite Difference Time Domain Technique (FDTD). This manual provides a description of the code and corresponding results for the default scattering problem. In addition to that description, the manual covers the operation, resource requirements, Version A code capabilities, a description of each subroutine, a brief discussion of the radar cross section computations, and a discussion of the scattering results.

  12. Sandia National Laboratories environmental fluid dynamics code. Marine Hydrokinetic Module User's Manual

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    James, Scott Carlton; Roberts, Jesse D.

    2014-03-01

    This document describes the marine hydrokinetic (MHK) input file and subroutines for the Sandia National Laboratories Environmental Fluid Dynamics Code (SNL-EFDC), which is a combined hydrodynamic, sediment transport, and water quality model based on the Environmental Fluid Dynamics Code (EFDC) developed by John Hamrick [1], formerly sponsored by the U.S. Environmental Protection Agency, and now maintained by Tetra Tech, Inc. SNL-EFDC has been previously enhanced with the incorporation of the SEDZLJ sediment dynamics model developed by Ziegler, Lick, and Jones [2-4]. SNL-EFDC has also been upgraded to more accurately simulate algae growth with specific application to optimizing biomass in an open-channel raceway for biofuels production [5]. A detailed description of the input file containing data describing the MHK device/array is provided, along with a description of the MHK FORTRAN routine. Both a theoretical description of the MHK dynamics as incorporated into SNL-EFDC and an explanation of the source code are provided. This user manual is meant to be used in conjunction with the original EFDC [6] and sediment dynamics SNL-EFDC manuals [7]. Through this document, the authors provide information for users who wish to model the effects of an MHK device (or array of devices) on a flow system with EFDC and who also seek a clear understanding of the source code, which is available from staff in the Water Power Technologies Department at Sandia National Laboratories, Albuquerque, New Mexico.

  13. User's manual for two dimensional FDTD version TEA and TMA codes for scattering from frequency-independent dielectric materials

    NASA Technical Reports Server (NTRS)

    Beggs, John H.; Luebbers, Raymond J.; Kunz, Karl S.

    1991-01-01

    The Penn State Finite Difference Time Domain Electromagnetic Scattering Code Versions TEA and TMA are two dimensional numerical electromagnetic scattering codes based upon the Finite Difference Time Domain Technique (FDTD) first proposed by Yee in 1966. The supplied versions of the codes are two versions of our current two dimensional FDTD code set. This manual provides a description of the codes and corresponding results for the default scattering problem. The manual is organized into eleven sections: introduction, Version TEA and TMA code capabilities, a brief description of the default scattering geometry, a brief description of each subroutine, a description of the include files (TEACOM.FOR TMACOM.FOR), a section briefly discussing scattering width computations, a section discussing the scattering results, a sample problem set section, a new problem checklist, references and figure titles.

  14. User's manual for two dimensional FDTD version TEA and TMA codes for scattering from frequency-independent dielectric materials

    NASA Technical Reports Server (NTRS)

    Beggs, John H.; Luebbers, Raymond J.; Kunz, Karl S.

    1991-01-01

    The Penn State Finite Difference Time Domain Electromagnetic Scattering Code Versions TEA and TMA are two dimensional electromagnetic scattering codes based on the Finite Difference Time Domain Technique (FDTD) first proposed by Yee in 1966. The supplied versions of the codes are two versions of our current FDTD code set. This manual provides a description of the codes and corresponding results for the default scattering problem. The manual is organized into eleven sections: introduction, Version TEA and TMA code capabilities, a brief description of the default scattering geometry, a brief description of each subroutine, a description of the include files (TEACOM.FOR TMACOM.FOR), a section briefly discussing scattering width computations, a section discussing the scattering results, a sample problem setup section, a new problem checklist, references, and figure titles.

  15. Large Eddy Simulation of Flow in Turbine Cascades Using LESTool and UNCLE Codes

    NASA Technical Reports Server (NTRS)

    Huang, P. G.

    2004-01-01

    During the period December 23, 1997 through August 31, 2004, we accomplished the development of two CFD codes for DNS/LES/RANS simulation of turbine cascade flows, namely LESTool and UNCLE. LESTool is a structured code making use of a fifth-order upwind differencing scheme, and UNCLE is a second-order-accurate unstructured code. LESTool has both Dynamic SGS and Spalart's DES models, and UNCLE makes use of URANS and DES models. The current report provides a description of the methodologies used in the codes.

  16. Large Eddy Simulation of Flow in Turbine Cascades Using LESTool and UNCLE Codes

    NASA Technical Reports Server (NTRS)

    Ashpis, David (Technical Monitor); Huang, P. G.

    2004-01-01

    During the period December 23, 1997 through August 31, 2004, we accomplished the development of two CFD codes for DNS/LES/RANS simulation of turbine cascade flows, namely LESTool and UNCLE. LESTool is a structured code making use of a fifth-order upwind differencing scheme, and UNCLE is a second-order-accurate unstructured code. LESTool has both Dynamic SGS and Spalart's DES models, and UNCLE makes use of URANS and DES models. The current report provides a description of the methodologies used in the codes.

  17. Turbulence modeling

    NASA Technical Reports Server (NTRS)

    Bardina, Jorge E.

    1995-01-01

    The objective of this work is to develop, verify, and incorporate the baseline two-equation turbulence models which account for the effects of compressibility into the three-dimensional Reynolds averaged Navier-Stokes (RANS) code and to provide documented descriptions of the models and their numerical procedures so that they can be implemented into 3-D CFD codes for engineering applications.

  18. User's guide to the SEPHIS computer code for calculating the Thorex solvent extraction system

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Watson, S.B.; Rainey, R.H.

    1979-05-01

    The SEPHIS computer program was developed to simulate the countercurrent solvent extraction process. The code has now been adapted to model the Acid Thorex flow sheet. This report presents a practical user's guide to SEPHIS-Thorex, containing a program description, user information, a program listing, and sample input and output.

  19. Functional Requirements of a Target Description System for Vulnerability Analysis

    DTIC Science & Technology

    1979-11-01

    Together, the COM-GEOM description model and the GIFT code make up the BRL's target description system. The significance of a target description system and the procedures for creating and modifying target descriptions are described. References include Lawrence W. Bain, Jr. and Mathew J. Reisinger, "The GIFT Code User Manual; Volume I," and "The GIFT Code User Manual; Volume II, The Output Options," an unpublished draft BRL report.

  20. A Combinatorial Geometry Computer Description of the M9 ACE (Armored Combat Earthmover) Vehicle

    DTIC Science & Technology

    1984-12-01

    The program requires as input the M9 target description as processed by the Geometric Information for Targets (GIFT) computer code. The first step is the construction of a COM-GEOM model of the target; this COM-GEOM target description is then used as input to the GIFT code. Among other things, the GIFT code traces shotlines through a COM-GEOM description from any specified aspect, listing pertinent information about each component hit.

  1. A simple approach to improve recording of concerns about child maltreatment in primary care records: developing a quality improvement intervention

    PubMed Central

    Woodman, Jenny; Allister, Janice; Rafi, Imran; de Lusignan, Simon; Belsey, Jonathan; Petersen, Irene; Gilbert, Ruth

    2012-01-01

    Background Information is lacking on how concerns about child maltreatment are recorded in primary care records. Aim To determine how the recording of child maltreatment concerns can be improved. Design and setting Development of a quality improvement intervention involving: clinical audit, a descriptive survey, telephone interviews, a workshop, database analyses, and consensus development in UK general practice. Method Descriptive analyses and incidence estimates were carried out based on 11 study practices and 442 practices in The Health Improvement Network (THIN). Telephone interviews, a workshop, and a consensus development meeting were conducted with lead GPs from 11 study practices. Results The rate of children with at least one maltreatment-related code was 8.4/1000 child years (11 study practices, 2009–2010), and 8.0/1000 child years (THIN, 2009–2010). Of 25 patients with known maltreatment, six had no maltreatment-related codes recorded, but all had relevant free text, scanned documents, or codes. When stating their reasons for undercoding maltreatment concerns, GPs cited damage to the patient relationship, uncertainty about which codes to use, and having concerns about recording information on other family members in the child’s records. Consensus recommendations are to record the code ‘child is cause for concern’ as a red flag whenever maltreatment is considered, and to use a list of codes arranged around four clinical concepts, with an option for a templated short data entry form. Conclusion GPs under-record maltreatment-related concerns in children’s electronic medical records. As failure to use codes makes it impossible to search or audit these cases, an approach designed to be simple and feasible to implement in UK general practice was recommended. PMID:22781996

  2. Biomass Economy

    DTIC Science & Technology

    1985-11-01

    The applicable codes cited include the ASME Boiler and Pressure Vessel Code, HEI (Heat Exchanger Institute) standards, and the ANSI B31.1 Power Piping Code. The system descriptions cover, among other items, heat and material balances, the condenser, and the deaerator, a direct-contact feedwater heater, together with the associated vent and drain piping.

  3. Propellant Chemistry for CFD Applications

    NASA Technical Reports Server (NTRS)

    Farmer, R. C.; Anderson, P. G.; Cheng, Gary C.

    1996-01-01

    Current concepts for reusable launch vehicle design have created renewed interest in the use of RP-1 fuels for high pressure and tri-propellant propulsion systems. Such designs require the use of an analytical technology that accurately accounts for the effects of real fluid properties, combustion of large hydrocarbon fuel modules, and the possibility of soot formation. These effects are inadequately treated in current computational fluid dynamic (CFD) codes used for propulsion system analyses. The objective of this investigation is to provide an accurate analytical description of hydrocarbon combustion thermodynamics and kinetics that is sufficiently computationally efficient to be a practical design tool when used with CFD codes such as the FDNS code. A rigorous description of real fluid properties for RP-1 and its combustion products will be derived from the literature and from experiments conducted in this investigation. Upon the establishment of such a description, the fluid description will be simplified by using the minimum of empiricism necessary to maintain accurate combustion analyses and including such empirical models into an appropriate CFD code. An additional benefit of this approach is that the real fluid properties analysis simplifies the introduction of the effects of droplet sprays into the combustion model. Typical species compositions of RP-1 have been identified, surrogate fuels have been established for analyses, and combustion and sooting reaction kinetics models have been developed. Methods for predicting the necessary real fluid properties have been developed and essential experiments have been designed. Verification studies are in progress, and preliminary results from these studies will be presented. The approach has been determined to be feasible, and upon its completion the required methodology for accurate performance and heat transfer CFD analyses for high pressure, tri-propellant propulsion systems will be available.

  4. Practices in Code Discoverability: Astrophysics Source Code Library

    NASA Astrophysics Data System (ADS)

    Allen, A.; Teuben, P.; Nemiroff, R. J.; Shamir, L.

    2012-09-01

    Here we describe the Astrophysics Source Code Library (ASCL), which takes an active approach to sharing astrophysics source code. ASCL's editor seeks out both new and old peer-reviewed papers that describe methods or experiments that involve the development or use of source code, and adds entries for the found codes to the library. This approach ensures that source codes are added without requiring authors to actively submit them, resulting in a comprehensive listing that covers a significant number of the astrophysics source codes used in peer-reviewed studies. The ASCL now contains over 340 codes and continues to grow; in 2011 it added an average of 19 codes per month. An advisory committee has been established to provide input and guide the development and expansion of the new site, and a marketing plan has been developed and is being executed. All ASCL source codes have been used to generate results published in or submitted to a refereed journal and are freely available either via a download site or from an identified source. This paper provides the history and description of the ASCL. It lists the requirements for including codes, examines the advantages of the ASCL, and outlines some of its future plans.

  5. Design and implementation in VHDL code of the two-dimensional fast Fourier transform for frequency filtering, convolution and correlation operations

    NASA Astrophysics Data System (ADS)

    Vilardy, Juan M.; Giacometto, F.; Torres, C. O.; Mattos, L.

    2011-01-01

    The two-dimensional Fast Fourier Transform (FFT 2D) is an essential tool in the analysis and processing of two-dimensional discrete signals and enables a large number of applications. This article presents the description and synthesis in VHDL code of the FFT 2D with fixed-point binary representation using the Simulink HDL Coder tool of Matlab, showing a quick and easy way to handle overflow and underflow, to create registers, adders, and multipliers for complex data in VHDL, and to generate test benches for verifying the generated code in the ModelSim tool. The main objective in developing the hardware architecture of the FFT 2D is the subsequent completion of the following operations applied to images: frequency filtering, convolution, and correlation. The hardware architecture is described and synthesized for the Xilinx Spartan-3E XC3S1200E FPGA.
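
    As an aside for readers less familiar with the frequency-domain operations this architecture targets, the following NumPy sketch shows 2D-FFT-based low-pass filtering of an image in floating point; it only illustrates the mathematics, not the fixed-point VHDL pipeline described above, and the cutoff parameter is an arbitrary assumption.

        import numpy as np

        def lowpass_filter_2d(image, cutoff_fraction=0.1):
            # Forward 2D FFT, centre the spectrum, zero everything outside a
            # circular pass band, then invert. Floating-point illustration only.
            spectrum = np.fft.fftshift(np.fft.fft2(image))
            rows, cols = image.shape
            rr, cc = np.meshgrid(np.arange(rows) - rows / 2,
                                 np.arange(cols) - cols / 2, indexing="ij")
            mask = np.hypot(rr, cc) <= cutoff_fraction * min(rows, cols)
            return np.real(np.fft.ifft2(np.fft.ifftshift(spectrum * mask)))

        smoothed = lowpass_filter_2d(np.random.rand(64, 64), cutoff_fraction=0.05)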

  6. Transient dynamics capability at Sandia National Laboratories

    NASA Technical Reports Server (NTRS)

    Attaway, Steven W.; Biffle, Johnny H.; Sjaardema, G. D.; Heinstein, M. W.; Schoof, L. A.

    1993-01-01

    A brief overview of the transient dynamics capabilities at Sandia National Laboratories, with an emphasis on recent developments and current research, is presented. In addition, the Sandia National Laboratories (SNL) Engineering Analysis Code Access System (SEACAS), which is a collection of structural and thermal codes and utilities used by analysts at SNL, is described. The SEACAS system includes pre- and post-processing codes, analysis codes, database translation codes, support libraries, Unix shell scripts for execution, and an installation system. SEACAS is used at SNL on a daily basis as a production, research, and development system for the engineering analysts and code developers. Over the past year, approximately 190 days of CPU time were used by SEACAS codes on jobs running from a few seconds up to two and one-half days of CPU time. SEACAS is running on several different systems at SNL including Cray Unicos, Hewlett Packard HP-UX, Digital Equipment Ultrix, and Sun SunOS. An overview of SEACAS, including a short description of the codes in the system, is presented. Abstracts and references for the codes are listed at the end of the report.

  7. Image Transmission via Spread Spectrum Techniques. Part A

    DTIC Science & Technology

    1976-01-01

    The progress report appears in two parts. Part A is a summary of work done in support of this program at the Naval Undersea Center, San Diego, California. Part B contains a technical description of the bandwidth compression system developed at the Naval Undersea Center, excerpted from the specifications.

  8. CFD Modeling of Free-Piston Stirling Engines

    NASA Technical Reports Server (NTRS)

    Ibrahim, Mounir B.; Zhang, Zhi-Guo; Tew, Roy C., Jr.; Gedeon, David; Simon, Terrence W.

    2001-01-01

    NASA Glenn Research Center (GRC) is funding Cleveland State University (CSU) to develop a reliable Computational Fluid Dynamics (CFD) code that can predict engine performance with the goal of significant improvements in accuracy when compared to one-dimensional (1-D) design code predictions. The funding also includes conducting code validation experiments at both the University of Minnesota (UMN) and CSU. In this paper a brief description of the work-in-progress is provided in the two areas (CFD and Experiments). Also, previous test results are compared with computational data obtained using (1) a 2-D CFD code obtained from Dr. Georg Scheuerer and further developed at CSU and (2) a multidimensional commercial code CFD-ACE+. The test data and computational results are for (1) a gas spring and (2) a single piston/cylinder with attached annular heat exchanger. The comparisons among the codes are discussed. The paper also discusses plans for conducting code validation experiments at CSU and UMN.

  9. Development of cost-effective surfactant flooding technology. First annual report for the period, September 30, 1992--September 29, 1993

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pope, G.A.; Sepehrnoori, K.

    1994-08-01

    This research consists of the parallel development of a new chemical flooding simulator and the application of the existing UTCHEM simulation code to model surfactant flooding. The new code is based upon a completely new numerical method that combines for the first time higher order finite difference methods, flux limiters, and implicit algorithms. Early results indicate that this approach has significant advantages in some problems and will likely enable simulation of much larger and more realistic chemical floods once it is fully developed. Additional improvements have also been made to the UTCHEM code and it has been applied for the first time to the study of stochastic reservoirs with and without horizontal wells to evaluate methods to reduce the cost and risk of surfactant flooding. During the first year of this contract, significant progress has been made on both of these tasks. The authors have found that there are indeed significant differences between the performance predictions based upon the traditional layered reservoir description and the more realistic and flexible descriptions using geostatistics. These preliminary studies of surfactant flooding using horizontal wells show that although they have significant potential to greatly reduce project life and thus improve the economics of the process, their use requires accurate reservoir descriptions and simulations to be effective. Much more needs to be done to fully understand and optimize their use and develop reliable design criteria.

  10. Iterative channel decoding of FEC-based multiple-description codes.

    PubMed

    Chang, Seok-Ho; Cosman, Pamela C; Milstein, Laurence B

    2012-03-01

    Multiple description coding has been receiving attention as a robust transmission framework for multimedia services. This paper studies the iterative decoding of FEC-based multiple description codes. The proposed decoding algorithms take advantage of the error detection capability of Reed-Solomon (RS) erasure codes. The information of correctly decoded RS codewords is exploited to enhance the error correction capability of the Viterbi algorithm at the next iteration of decoding. In the proposed algorithm, an intradescription interleaver is synergistically combined with the iterative decoder. The interleaver does not affect the performance of noniterative decoding but greatly enhances the performance when the system is iteratively decoded. We also address the optimal allocation of RS parity symbols for unequal error protection. For the optimal allocation in iterative decoding, we derive mathematical equations from which the probability distributions of description erasures can be generated in a simple way. The performance of the algorithm is evaluated over an orthogonal frequency-division multiplexing system. The results show that the performance of the multiple description codes is significantly enhanced.
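
    To make the erasure-recovery idea concrete, here is a deliberately simplified Python sketch that protects several descriptions with a single XOR parity block instead of the Reed-Solomon codes the paper actually uses; a description lost by the channel is treated as an erasure and rebuilt from the survivors.

        import numpy as np

        def encode_descriptions(data_blocks):
            # Append one XOR parity block (toy stand-in for RS erasure coding).
            parity = np.bitwise_xor.reduce(data_blocks, axis=0)
            return list(data_blocks) + [parity]

        def recover_single_erasure(received):
            # 'received' holds the descriptions, with exactly one entry set to None.
            missing = received.index(None)
            survivors = [blk for blk in received if blk is not None]
            received[missing] = np.bitwise_xor.reduce(survivors, axis=0)
            return received[:-1]                      # drop the parity block

        blocks = [np.random.randint(0, 256, 8, dtype=np.uint8) for _ in range(3)]
        sent = encode_descriptions(blocks)
        sent[1] = None                                # description 1 erased in transit
        assert np.array_equal(recover_single_erasure(sent)[1], blocks[1])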

  11. User's manual for three dimensional FDTD version D code for scattering from frequency-dependent dielectric and magnetic materials

    NASA Technical Reports Server (NTRS)

    Beggs, John H.; Luebbers, Raymond J.; Kunz, Karl S.

    1992-01-01

    The Penn State Finite Difference Time Domain Electromagnetic Scattering Code version D is a 3-D numerical electromagnetic scattering code based upon the finite difference time domain technique (FDTD). The manual provides a description of the code and corresponding results for several scattering problems. The manual is organized into 14 sections: introduction; description of the FDTD method; operation; resource requirements; version D code capabilities; a brief description of the default scattering geometry; a brief description of each subroutine; a description of the include file; a section briefly discussing Radar Cross Section computations; a section discussing some scattering results; a sample problem setup section; a new problem checklist; references and figure titles. The FDTD technique models transient electromagnetic scattering and interactions with objects of arbitrary shape and/or material composition. In the FDTD method, Maxwell's curl equations are discretized in time-space and all derivatives (temporal and spatial) are approximated by central differences.
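
    The central-difference discretization mentioned above is easiest to see in one dimension; the short normalized-units sketch below updates staggered E and H fields in the standard Yee leapfrog fashion and is a generic textbook illustration in Python, not an excerpt from the Penn State code.

        import numpy as np

        nz, nt = 200, 300
        ez = np.zeros(nz)            # electric field samples
        hy = np.zeros(nz - 1)        # magnetic field, staggered half a cell
        courant = 0.5                # normalized dt*c/dz (<= 1 for stability)

        for n in range(nt):
            hy += courant * np.diff(ez)                       # curl of E updates H
            ez[1:-1] += courant * np.diff(hy)                 # curl of H updates E
            ez[nz // 4] += np.exp(-((n - 30) / 10.0) ** 2)    # soft Gaussian source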

  12. Mathematical Formulation used by MATLAB Code to Convert FTIR Interferograms to Calibrated Spectra

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Armstrong, Derek Elswick

    This report discusses the mathematical procedures used to convert raw interferograms from Fourier transform infrared (FTIR) sensors to calibrated spectra. The work discussed in this report was completed as part of the Helios project at Los Alamos National Laboratory. MATLAB code was developed to convert the raw interferograms to calibrated spectra. The report summarizes the developed MATLAB scripts and functions, along with a description of the mathematical methods used by the code. The first step in working with raw interferograms is to convert them to uncalibrated spectra by applying an apodization function to the raw data and then performing a Fourier transform. The developed MATLAB code also addresses phase error correction by applying the Mertz method. This report provides documentation for the MATLAB scripts.
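
    The first processing step described above, apodization followed by a Fourier transform, is sketched below in NumPy rather than MATLAB; the Hann window and the synthetic interferogram are illustrative assumptions, and the Mertz phase correction and radiometric calibration steps are omitted.

        import numpy as np

        def interferogram_to_uncalibrated_spectrum(interferogram):
            # Apodize the raw interferogram, then FFT to a magnitude spectrum.
            window = np.hanning(len(interferogram))   # one common apodization choice
            return np.abs(np.fft.rfft(interferogram * window))

        # Synthetic two-line interferogram, for demonstration only
        x = np.arange(-512, 512)
        igram = np.cos(2 * np.pi * 0.05 * x) + 0.5 * np.cos(2 * np.pi * 0.12 * x)
        spectrum = interferogram_to_uncalibrated_spectrum(igram)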

  13. Development of structured ICD-10 and its application to computer-assisted ICD coding.

    PubMed

    Imai, Takeshi; Kajino, Masayuki; Sato, Megumi; Ohe, Kazuhiko

    2010-01-01

    This paper presents: (1) a framework of formal representation of ICD10, which functions as a bridge between ontological information and natural language expressions; and (2) a methodology to use formally described ICD10 for computer-assisted ICD coding. First, we analyzed and structured the meanings of categories in 15 chapters of ICD10. Then we expanded the structured ICD10 (S-ICD10) by adding subordinate concepts and labels derived from Japanese Standard Disease Names. The information model to describe formal representation was refined repeatedly. The resultant model includes 74 types of semantic links. We also developed an ICD coding module based on S-ICD10 and a 'Coding Principle,' which achieved high accuracy (>70%) for four chapters. These results not only demonstrate the basic feasibility of our coding framework but might also inform the development of the information model for the formal description framework in the ICD11 revision.

  14. Turbulence modeling for hypersonic flight

    NASA Technical Reports Server (NTRS)

    Bardina, Jorge E.

    1993-01-01

    The objective of the proposed work is to continue to develop, verify, and incorporate the baseline two-equation turbulence models, which account for the effects of compressibility at high speeds, into a three-dimensional Reynolds averaged Navier-Stokes (RANS) code. Additionally, we plan to provide documented descriptions of the models and their numerical procedures so that they can be implemented into the NASP CFD codes.

  15. PROTEUS two-dimensional Navier-Stokes computer code, version 1.0. Volume 1: Analysis description

    NASA Technical Reports Server (NTRS)

    Towne, Charles E.; Schwab, John R.; Benson, Thomas J.; Suresh, Ambady

    1990-01-01

    A new computer code was developed to solve the two-dimensional or axisymmetric, Reynolds averaged, unsteady compressible Navier-Stokes equations in strong conservation law form. The thin-layer or Euler equations may also be solved. Turbulence is modeled using an algebraic eddy viscosity model. The objective was to develop a code for aerospace applications that is easy to use and easy to modify. Code readability, modularity, and documentation were emphasized. The equations are written in nonorthogonal body-fitted coordinates, and solved by marching in time using a fully-coupled alternating direction-implicit procedure with generalized first- or second-order time differencing. All terms are linearized using second-order Taylor series. The boundary conditions are treated implicitly, and may be steady, unsteady, or spatially periodic. Simple Cartesian or polar grids may be generated internally by the program. More complex geometries require an externally generated computational coordinate system. The documentation is divided into three volumes. Volume 1 is the Analysis Description, and describes in detail the governing equations, the turbulence model, the linearization of the equations and boundary conditions, the time and space differencing formulas, the ADI solution procedure, and the artificial viscosity models.

  16. Transient Ejector Analysis (TEA) code user's guide

    NASA Technical Reports Server (NTRS)

    Drummond, Colin K.

    1993-01-01

    A FORTRAN computer program for the semi-analytic prediction of unsteady thrust augmenting ejector performance has been developed, based on a theoretical analysis for ejectors. That analysis blends classic self-similar turbulent jet descriptions with control-volume mixing region elements. Division of the ejector into an inlet, diffuser, and mixing region allowed flexibility in the modeling of the physics for each region. In particular, the inlet and diffuser analyses are simplified by a quasi-steady analysis, justified by the assumption that pressure is the forcing function in those regions. Only the mixing region is assumed to be dominated by viscous effects. The present work provides an overview of the code structure, a description of the required input and output data file formats, and the results for a test case. Since there are limitations to the code for applications outside the bounds of the test case, the user should consider TEA as a research code (not as a production code), designed specifically as an implementation of the proposed ejector theory. Program error flags are discussed, and some diagnostic routines are presented.

  17. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ritchie, L.T.; Johnson, J.D.; Blond, R.M.

    The CRAC2 computer code is a revision of the Calculation of Reactor Accident Consequences computer code, CRAC, developed for the Reactor Safety Study. The CRAC2 computer code incorporates significant modeling improvements in the areas of weather sequence sampling and emergency response, and refinements to the plume rise, atmospheric dispersion, and wet deposition models. New output capabilities have also been added. This guide is to facilitate the informed and intelligent use of CRAC2. It includes descriptions of the input data, the output results, the file structures, control information, and five sample problems.

  18. Verification of the proteus two-dimensional Navier-Stokes code for flat plate and pipe flows

    NASA Technical Reports Server (NTRS)

    Conley, Julianne M.; Zeman, Patrick L.

    1991-01-01

    The Proteus Navier-Stokes Code is evaluated for 2-D/axisymmetric, viscous, incompressible, internal, and external flows. The particular cases to be discussed are laminar and turbulent flows over a flat plate, laminar and turbulent developing pipe flows, and turbulent pipe flow with swirl. Results are compared with exact solutions, empirical correlations, and experimental data. A detailed description of the code set-up, including boundary conditions, initial conditions, grid size, and grid packing is given for each case.

  19. Computer Description of the Field Artillery Ammunition Supply Vehicle

    DTIC Science & Technology

    1983-04-01

    Keywords: Combinatorial Geometry (COM-GEOM); GIFT computer code; computer target description. The report documents a COM-GEOM model of the Field Artillery Ammunition Supply Vehicle. This COM-GEOM target description is used as input to the Geometric Information for Targets (GIFT) computer code to generate target vulnerability data; among other things, the GIFT code traces shotlines through a COM-GEOM description from any specified aspect, listing pertinent information about each component hit.

  20. A Combinatorial Geometry Computer Description of the MEP-021A Generator Set

    DTIC Science & Technology

    1979-02-01

    Keywords: gasoline generator; computer description; GIFT; MEP-021A. This report documents a Combinatorial Geometry (COM-GEOM) description of the MEP-021A generator set for use with the Geometric Information for Targets (GIFT) computer code. The GIFT code traces shotlines through a COM-GEOM description from any specified attack aspect, and its shotline output is stored on magnetic tape for future vulnerability analysis.

  1. CELCAP: A Computer Model for Cogeneration System Analysis

    NASA Technical Reports Server (NTRS)

    1985-01-01

    A description of the CELCAP cogeneration analysis program is presented. A detailed description of the methodology used by the Naval Civil Engineering Laboratory in developing the CELCAP code and the procedures for analyzing cogeneration systems for a given user are given. The four engines modeled in CELCAP are: gas turbine with exhaust heat boiler, diesel engine with waste heat boiler, single automatic-extraction steam turbine, and back-pressure steam turbine. Both the design point and part-load performances are taken into account in the engine models. The load model describes how the hourly electric and steam demand of the user is represented by 24 hourly profiles. The economic model describes how the annual and life-cycle operating costs, which include the costs of fuel, purchased electricity, and operation and maintenance of engines and boilers, are calculated. The CELCAP code structure and principal functions are described to show how the various components of the code are related to each other. Three examples of the application of the CELCAP code are given to illustrate the versatility of the code. The examples shown represent cases of system selection, system modification, and system optimization.

  2. Designing and maintaining an effective chargemaster.

    PubMed

    Abbey, D C

    2001-03-01

    The chargemaster is the central repository of charges and associated coding information used to develop claims. But this simple description belies the chargemaster's true complexity. The chargemaster's role in the coding process differs from department to department, and not all codes provided on a claim form are necessarily included in the chargemaster, as codes for complex services may need to be developed and reviewed by coding staff. In addition, with the rise of managed care, the chargemaster increasingly is being used to track utilization of supplies and services. To ensure that the chargemaster performs all of its functions effectively, hospitals should appoint a chargemaster coordinator, supported by a chargemaster review team, to oversee the design and maintenance of the chargemaster. Important design issues that should be considered include the principle of "form follows function," static versus dynamic coding, how modifiers should be treated, how charges should be developed, how to incorporate physician fee schedules into the chargemaster, the interface between the chargemaster and cost reports, and how to include statistical information for tracking utilization.

  3. Exploring Trilingual Code-Switching: The Case of "Hokaglish"

    ERIC Educational Resources Information Center

    Gonzales, Wilkinson Daniel Wong

    2016-01-01

    This paper presents findings of an initial study on a trilingual code-switching (CS) phenomenon called "Hokaglish" in Binondo, Manila, The Philippines. Beginning with descriptions of multiculturalism and multilingualism in the Philippines, the discussion eventually leads to the description and survey of the code-switching phenomenon…

  4. A Combinatorial Geometry Computer Description of the XR311 Vehicle

    DTIC Science & Technology

    1978-04-01

    This report documents the COM-GEOM target description of the XR311 vehicle as processed by the Geometric Information for Targets (GIFT) computer code. The shotline output of the GRID subroutine of the GIFT code is stored on cards or magnetic tape for future vulnerability analysis. References include L.W. Bains and M.J. Reisinger, "The GIFT Code User Manual, Vol. 1, Introduction and Input Requirements," Ballistic Research Laboratories report, March 1974.

  5. MARS15

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mokhov, Nikolai

    MARS is a Monte Carlo code for inclusive and exclusive simulation of three-dimensional hadronic and electromagnetic cascades, muon, heavy-ion and low-energy neutron transport in accelerator, detector, spacecraft and shielding components in the energy range from a fraction of an electronvolt up to 100 TeV. Recent developments in the MARS15 physical models of hadron, heavy-ion and lepton interactions with nuclei and atoms include a new nuclear cross section library, a model for soft pion production, the cascade-exciton model, the quark gluon string models, deuteron-nucleus and neutrino-nucleus interaction models, detailed description of negative hadron and muon absorption and a unified treatment of muon, charged hadron and heavy-ion electromagnetic interactions with matter. New algorithms are implemented into the code and thoroughly benchmarked against experimental data. The code capabilities to simulate cascades and generate a variety of results in complex media have been also enhanced. Other changes in the current version concern the improved photo- and electro-production of hadrons and muons, improved algorithms for the 3-body decays, particle tracking in magnetic fields, synchrotron radiation by electrons and muons, significantly extended histograming capabilities and material description, and improved computational performance. In addition to direct energy deposition calculations, a new set of fluence-to-dose conversion factors for all particles including neutrino are built into the code. The code includes new modules for calculation of Displacement-per-Atom and nuclide inventory. The powerful ROOT geometry and visualization model implemented in MARS15 provides a large set of geometrical elements with a possibility of producing composite shapes and assemblies and their 3D visualization along with a possible import/export of geometry descriptions created by other codes (via the GDML format) and CAD systems (via the STEP format). The built-in MARS-MAD Beamline Builder (MMBLB) was redesigned for use with the ROOT geometry package that allows a very efficient and highly-accurate description, modeling and visualization of beam loss induced effects in arbitrary beamlines and accelerator lattices. The MARS15 code includes links to the MCNP-family codes for neutron and photon production and transport below 20 MeV, to the ANSYS code for thermal and stress analyses and to the STRUCT code for multi-turn particle tracking in large synchrotrons and collider rings.

  6. Rocketdyne Safety Algorithm: Space Shuttle Main Engine Fault Detection

    NASA Technical Reports Server (NTRS)

    Norman, Arnold M., Jr.

    1994-01-01

    The Rocketdyne Safety Algorithm (RSA) has been developed to the point of use on the TTBE at MSFC on Task 4 of LeRC contract NAS3-25884. This document contains a description of the work performed, the results of the nominal test of the major anomaly test cases and a table of the resulting cutoff times, a plot of the RSA value vs. time for each anomaly case, a logic flow description of the algorithm, the algorithm code, and a development plan for future efforts.

  7. Airport-Noise Levels and Annoyance Model (ALAMO) system's reference manual

    NASA Technical Reports Server (NTRS)

    Deloach, R.; Donaldson, J. L.; Johnson, M. J.

    1986-01-01

    The Airport-Noise Levels and Annoyance Model (ALAMO) is described in terms of its constituent modules, the ALAMO procedure files necessary for system execution, and the source code documentation associated with code development at Langley Research Center. The modules constituting ALAMO are presented both in flow-graph form and through a description of the subroutines and functions that comprise them.

  8. Compressive Sampling based Image Coding for Resource-deficient Visual Communication.

    PubMed

    Liu, Xianming; Zhai, Deming; Zhou, Jiantao; Zhang, Xinfeng; Zhao, Debin; Gao, Wen

    2016-04-14

    In this paper, a new compressive sampling based image coding scheme is developed to achieve competitive coding efficiency at lower encoder computational complexity, while supporting error resilience. This technique is particularly suitable for visual communication with resource-deficient devices. At the encoder, a compact image representation is produced, which is a polyphase down-sampled version of the input image; but the conventional low-pass filter prior to down-sampling is replaced by a local random binary convolution kernel. The pixels of the resulting down-sampled pre-filtered image are local random measurements, placed in the original spatial configuration. The advantages of local random measurements are twofold: 1) they preserve high-frequency image features that are otherwise discarded by low-pass filtering; and 2) the result remains a conventional image and can therefore be coded by any standardized codec to remove statistical redundancy at larger scales. Moreover, measurements generated by different kernels can be considered as multiple descriptions of the original image and therefore the proposed scheme has the advantage of multiple description coding. At the decoder, a unified sparsity-based soft-decoding technique is developed to recover the original image from received measurements in a framework of compressive sensing. Experimental results demonstrate that the proposed scheme is competitive compared with existing methods, with a unique strength of recovering fine details and sharp edges at low bit-rates.
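
    A rough NumPy sketch of the encoder front end as described, with the anti-alias low-pass filter replaced by a local random binary convolution before polyphase down-sampling, is shown below; the kernel size and down-sampling factor are illustrative assumptions rather than the paper's settings.

        import numpy as np
        from scipy.signal import convolve2d

        rng = np.random.default_rng(0)

        def cs_downsample(image, kernel_size=3, factor=2):
            # Local random +/-1 kernel instead of a low-pass filter, then keep one
            # polyphase component; the kernel must also be known at the decoder.
            kernel = rng.choice([-1.0, 1.0], size=(kernel_size, kernel_size))
            measured = convolve2d(image, kernel, mode="same", boundary="symm")
            return measured[::factor, ::factor], kernel

        compact, kernel = cs_downsample(rng.random((64, 64)))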

  9. Unified aeroacoustics analysis for high speed turboprop aerodynamics and noise. Volume 4: Computer user's manual for UAAP turboprop aeroacoustic code

    NASA Astrophysics Data System (ADS)

    Menthe, R. W.; McColgan, C. J.; Ladden, R. M.

    1991-05-01

    The Unified AeroAcoustic Program (UAAP) code calculates the airloads on a single rotation prop-fan, or propeller, and couples these airloads with an acoustic radiation theory, to provide estimates of near-field or far-field noise levels. The steady airloads can also be used to calculate the nonuniform velocity components in the propeller wake. The airloads are calculated using a three dimensional compressible panel method which considers the effects of thin, cambered, multiple blades which may be highly swept. These airloads may be either steady or unsteady. The acoustic model uses the blade thickness distribution and the steady or unsteady aerodynamic loads to calculate the acoustic radiation. The users manual for the UAAP code is divided into five sections: general code description; input description; output description; system description; and error codes. The user must have access to IMSL10 libraries (MATH and SFUN) for numerous calls made for Bessel functions and matrix inversion. For plotted output users must modify the dummy calls to plotting routines included in the code to system-specific calls appropriate to the user's installation.

  10. Unified aeroacoustics analysis for high speed turboprop aerodynamics and noise. Volume 4: Computer user's manual for UAAP turboprop aeroacoustic code

    NASA Technical Reports Server (NTRS)

    Menthe, R. W.; Mccolgan, C. J.; Ladden, R. M.

    1991-01-01

    The Unified AeroAcoustic Program (UAAP) code calculates the airloads on a single rotation prop-fan, or propeller, and couples these airloads with an acoustic radiation theory, to provide estimates of near-field or far-field noise levels. The steady airloads can also be used to calculate the nonuniform velocity components in the propeller wake. The airloads are calculated using a three dimensional compressible panel method which considers the effects of thin, cambered, multiple blades which may be highly swept. These airloads may be either steady or unsteady. The acoustic model uses the blade thickness distribution and the steady or unsteady aerodynamic loads to calculate the acoustic radiation. The users manual for the UAAP code is divided into five sections: general code description; input description; output description; system description; and error codes. The user must have access to IMSL10 libraries (MATH and SFUN) for numerous calls made for Bessel functions and matrix inversion. For plotted output users must modify the dummy calls to plotting routines included in the code to system-specific calls appropriate to the user's installation.

  11. Animations Need Narrations: An Experimental Test of a Dual-Coding Hypothesis.

    ERIC Educational Resources Information Center

    Mayer, Richard E.; Anderson, Richard B.

    1991-01-01

    In two experiments, 102 mechanically naive college students viewed an animation on bicycle tire pump operation with a verbal description before or during the animation or without description. Improved performance of those receiving description during the animation supports a dual-coding hypothesis of connections between visual and verbal stimuli.…

  12. Aero/fluids database system

    NASA Technical Reports Server (NTRS)

    Reardon, John E.; Violett, Duane L., Jr.

    1991-01-01

    The AFAS Database System was developed to provide the basic structure of a comprehensive database system for the Marshall Space Flight Center (MSFC) Structures and Dynamics Laboratory Aerophysics Division. The system is intended to handle all of the Aerophysics Division Test Facilities as well as data from other sources. The system was written for the DEC VAX family of computers in FORTRAN-77 and utilizes the VMS indexed file system and screen management routines. Various aspects of the system are covered, including a description of the user interface, lists of all code structure elements, descriptions of the file structures, a description of the security system operation, a detailed description of the data retrieval tasks, a description of the session log, and a description of the archival system.

  13. Proteus two-dimensional Navier-Stokes computer code, version 2.0. Volume 1: Analysis description

    NASA Technical Reports Server (NTRS)

    Towne, Charles E.; Schwab, John R.; Bui, Trong T.

    1993-01-01

    A computer code called Proteus 2D was developed to solve the two-dimensional planar or axisymmetric, Reynolds-averaged, unsteady compressible Navier-Stokes equations in strong conservation law form. The objective in this effort was to develop a code for aerospace propulsion applications that is easy to use and easy to modify. Code readability, modularity, and documentation were emphasized. The governing equations are solved in generalized nonorthogonal body-fitted coordinates, by marching in time using a fully-coupled ADI solution procedure. The boundary conditions are treated implicitly. All terms, including the diffusion terms, are linearized using second-order Taylor series expansions. Turbulence is modeled using either an algebraic or two-equation eddy viscosity model. The thin-layer or Euler equations may also be solved. The energy equation may be eliminated by the assumption of constant total enthalpy. Explicit and implicit artificial viscosity may be used. Several time step options are available for convergence acceleration. The documentation is divided into three volumes. This is the Analysis Description, and presents the equations and solution procedure. The governing equations, the turbulence model, the linearization of the equations and boundary conditions, the time and space differencing formulas, the ADI solution procedure, and the artificial viscosity models are described in detail.

  14. Proteus three-dimensional Navier-Stokes computer code, version 1.0. Volume 1: Analysis description

    NASA Technical Reports Server (NTRS)

    Towne, Charles E.; Schwab, John R.; Bui, Trong T.

    1993-01-01

    A computer code called Proteus 3D has been developed to solve the three dimensional, Reynolds averaged, unsteady compressible Navier-Stokes equations in strong conservation law form. The objective in this effort has been to develop a code for aerospace propulsion applications that is easy to use and easy to modify. Code readability, modularity, and documentation have been emphasized. The governing equations are solved in generalized non-orthogonal body-fitted coordinates by marching in time using a fully-coupled ADI solution procedure. The boundary conditions are treated implicitly. All terms, including the diffusion terms, are linearized using second-order Taylor series expansions. Turbulence is modeled using either an algebraic or two-equation eddy viscosity model. The thin-layer or Euler equations may also be solved. The energy equation may be eliminated by the assumption of constant total enthalpy. Explicit and implicit artificial viscosity may be used. Several time step options are available for convergence acceleration. The documentation is divided into three volumes. This is the Analysis Description, and presents the equations and solution procedure. It describes in detail the governing equations, the turbulence model, the linearization of the equations and boundary conditions, the time and space differencing formulas, the ADI solution procedure, and the artificial viscosity models.
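
    Each sweep of the ADI procedure used by the Proteus codes reduces the problem to tridiagonal-type solves along grid lines; the generic Thomas-algorithm sketch below shows that kind of scalar tridiagonal solve in Python. It is only a simplified illustration, not an excerpt from Proteus, whose fully coupled equations lead to block rather than scalar systems.

        import numpy as np

        def thomas_solve(a, b, c, d):
            # Solve a tridiagonal system with sub-, main- and super-diagonals
            # a, b, c and right-hand side d (a[0] and c[-1] are unused).
            n = len(d)
            cp, dp = np.empty(n), np.empty(n)
            cp[0], dp[0] = c[0] / b[0], d[0] / b[0]
            for i in range(1, n):
                denom = b[i] - a[i] * cp[i - 1]
                cp[i] = c[i] / denom
                dp[i] = (d[i] - a[i] * dp[i - 1]) / denom
            x = np.empty(n)
            x[-1] = dp[-1]
            for i in range(n - 2, -1, -1):
                x[i] = dp[i] - cp[i] * x[i + 1]
            return x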

  15. TAS: A Transonic Aircraft/Store flow field prediction code

    NASA Technical Reports Server (NTRS)

    Thompson, D. S.

    1983-01-01

    A numerical procedure has been developed that has the capability to predict the transonic flow field around an aircraft with an arbitrarily located, separated store. The TAS code, the product of a joint General Dynamics/NASA ARC/AFWAL research and development program, will serve as the basis for a comprehensive predictive method for aircraft with arbitrary store loadings. This report describes the numerical procedures employed to simulate the flow field around a configuration of this type. The validity of TAS code predictions is established by comparison with existing experimental data. In addition, future areas of development of the code are outlined. A brief description of code utilization is also given in the Appendix. The aircraft/store configuration is simulated using a mesh embedding approach. The computational domain is discretized by three meshes: (1) a planform-oriented wing/body fine mesh, (2) a cylindrical store mesh, and (3) a global Cartesian crude mesh. This embedded mesh scheme enables simulation of stores with fins of arbitrary angular orientation.

  16. Financial constraints in capacity planning: a national utility regulatory model (NUREG). Volume III of III: software description. Final report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    None

    1981-10-29

    This volume is the software description for the National Utility Regulatory Model (NUREG). This is the third of three volumes provided by ICF under contract number DEAC-01-79EI-10579. These three volumes are: a manual describing the NUREG methodology; a users guide; and a description of the software. This manual describes the software which has been developed for NUREG. This includes a listing of the source modules. All computer code has been written in FORTRAN.

  17. NARMER-1: a photon point-kernel code with build-up factors

    NASA Astrophysics Data System (ADS)

    Visonneau, Thierry; Pangault, Laurence; Malouch, Fadhel; Malvagi, Fausto; Dolci, Florence

    2017-09-01

    This paper presents an overview of NARMER-1, the new generation of photon point-kernel code developed by the Reactor Studies and Applied Mathematics Unit (SERMA) at CEA Saclay Center. After a short introduction giving some historical background and the current context of the code's development, the paper presents the principles implemented in the calculation and the physical quantities computed, and surveys the generic features: programming language, computer platforms, geometry package, source description, etc. Moreover, specific and recent features are also detailed: exclusion sphere, tetrahedral meshes, and parallel operations. Some points about verification and validation are then presented. Finally, we present some tools that can help the user with operations such as visualization and pre-processing.
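
    For orientation, a point-kernel estimate has the form phi = S * B(mu*r) * exp(-mu*r) / (4*pi*r^2), an uncollided attenuation term corrected by a build-up factor. The Python sketch below uses a crude linear build-up model purely for illustration, whereas a code such as NARMER-1 relies on tabulated build-up factors and detailed geometry handling.

        import numpy as np

        def point_kernel_flux(source_strength, distance_cm, mu_cm_inv, buildup_a=1.0):
            # Uncollided attenuation times a simple linear build-up factor
            # B = 1 + a * mu * r (illustrative stand-in for tabulated factors).
            mfp = mu_cm_inv * distance_cm
            buildup = 1.0 + buildup_a * mfp
            return source_strength * buildup * np.exp(-mfp) / (4.0 * np.pi * distance_cm ** 2)

        flux = point_kernel_flux(1e9, 50.0, 0.07)   # example numbers, not a benchmark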

  18. MPEG4: coding for content, interactivity, and universal accessibility

    NASA Astrophysics Data System (ADS)

    Reader, Cliff

    1996-01-01

    MPEG4 is a natural extension of audiovisual coding, and yet from many perspectives breaks new ground as a standard. New coding techniques are being introduced, of course, but they will work on new data structures. The standard itself has a new architecture, and will use a new operational model when implemented on equipment that is likely to have innovative system architecture. The author introduces the background developments in technology and applications that are driving or enabling the standard, introduces the focus of MPEG4, and enumerates the new functionalities to be supported. Key applications in interactive TV and heterogeneous environments are discussed. The architecture of MPEG4 is described, followed by a discussion of the multiphase MPEG4 communication scenarios, and issues of practical implementation of MPEG4 terminals. The paper concludes with a description of the MPEG4 workplan. In summary, MPEG4 has two fundamental attributes. First, it is the coding of audiovisual objects, which may be natural or synthetic data in two or three dimensions. Second, the heart of MPEG4 is its syntax: the MPEG4 Syntactic Descriptive Language -- MSDL.

  19. FY16 ASME High Temperature Code Activities

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Swindeman, M. J.; Jetter, R. I.; Sham, T. -L.

    2016-09-01

    One of the objectives of the ASME high temperature Code activities is to develop and validate both improvements and the basic features of Section III, Division 5, Subsection HB, Subpart B (HBB). The overall scope of this task is to develop a computer program to be used to assess whether or not a specific component under specified loading conditions will satisfy the elevated temperature design requirements for Class A components in Section III, Division 5, Subsection HB, Subpart B (HBB). There are many features and alternative paths of varying complexity in HBB. The initial focus of this task is a basic path through the various options for a single reference material, 316H stainless steel. However, the program will be structured for eventual incorporation of all the features and permitted materials of HBB. Since this task has recently been initiated, this report focuses on the description of the initial path forward and an overall description of the approach to computer program development.

  20. A software framework for pipelined arithmetic algorithms in field programmable gate arrays

    NASA Astrophysics Data System (ADS)

    Kim, J. B.; Won, E.

    2018-03-01

    Pipelined algorithms implemented in field programmable gate arrays are extensively used for hardware triggers in the modern experimental high energy physics field and the complexity of such algorithms increases rapidly. For development of such hardware triggers, algorithms are developed in C++, ported to hardware description language for synthesizing firmware, and then ported back to C++ for simulating the firmware response down to the single bit level. We present a C++ software framework which automatically simulates and generates hardware description language code for pipelined arithmetic algorithms.

  1. Temporal Code-Driven Stimulation: Definition and Application to Electric Fish Signaling

    PubMed Central

    Lareo, Angel; Forlim, Caroline G.; Pinto, Reynaldo D.; Varona, Pablo; Rodriguez, Francisco de Borja

    2016-01-01

    Closed-loop activity-dependent stimulation is a powerful methodology to assess information processing in biological systems. In this context, the development of novel protocols, their implementation in bioinformatics toolboxes and their application to different description levels open up a wide range of possibilities in the study of biological systems. We developed a methodology for studying biological signals by representing them as temporal sequences of binary events. A specific sequence of these events (code) is chosen to deliver a predefined stimulation in a closed-loop manner. The response to this code-driven stimulation can be used to characterize the system. This methodology was implemented in a real-time toolbox and tested in the context of electric fish signaling. We show that while there are codes that evoke a response that cannot be distinguished from a control recording without stimulation, other codes evoke a characteristic distinct response. We also compare the code-driven response to open-loop stimulation. The discussed experiments validate the proposed methodology and the software toolbox. PMID:27766078

  2. Temporal Code-Driven Stimulation: Definition and Application to Electric Fish Signaling.

    PubMed

    Lareo, Angel; Forlim, Caroline G; Pinto, Reynaldo D; Varona, Pablo; Rodriguez, Francisco de Borja

    2016-01-01

    Closed-loop activity-dependent stimulation is a powerful methodology to assess information processing in biological systems. In this context, the development of novel protocols, their implementation in bioinformatics toolboxes and their application to different description levels open up a wide range of possibilities in the study of biological systems. We developed a methodology for studying biological signals by representing them as temporal sequences of binary events. A specific sequence of these events (code) is chosen to deliver a predefined stimulation in a closed-loop manner. The response to this code-driven stimulation can be used to characterize the system. This methodology was implemented in a real-time toolbox and tested in the context of electric fish signaling. We show that while there are codes that evoke a response that cannot be distinguished from a control recording without stimulation, other codes evoke a characteristic distinct response. We also compare the code-driven response to open-loop stimulation. The discussed experiments validate the proposed methodology and the software toolbox.
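
    The core loop of such a protocol can be sketched in a few lines of Python: binarize the recorded signal into events, watch the most recent bins for the chosen code, and trigger stimulation when it appears. The threshold, binning, and target code below are illustrative assumptions and are not taken from the published toolbox.

        import numpy as np

        def detect_code(signal, threshold, target_code):
            # 1 = event in a bin, 0 = no event; return the times at which the last
            # len(target_code) bins match the chosen code (stimulus trigger points).
            events = (np.asarray(signal) > threshold).astype(int)
            code = list(target_code)
            triggers = []
            for t in range(len(code), len(events) + 1):
                if list(events[t - len(code):t]) == code:
                    triggers.append(t)   # in a closed loop: deliver the stimulus here
            return triggers

        trigger_times = detect_code(np.random.rand(1000), threshold=0.95,
                                    target_code=[1, 0, 1])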

  3. Formally specifying the logic of an automatic guidance controller

    NASA Technical Reports Server (NTRS)

    Guaspari, David

    1990-01-01

    The following topics are covered in viewgraph form: (1) the Penelope Project; (2) the logic of an experimental automatic guidance control system for a 737; (3) Larch/Ada specification; (4) some failures of informal description; (5) description of mode changes caused by switches; (6) intuitive description of window status (chosen vs. current); (7) design of the code; and (8) specification of the code.

  4. Translating expert system rules into Ada code with validation and verification

    NASA Technical Reports Server (NTRS)

    Becker, Lee; Duckworth, R. James; Green, Peter; Michalson, Bill; Gosselin, Dave; Nainani, Krishan; Pease, Adam

    1991-01-01

    The purpose of this ongoing research and development program is to develop software tools which enable the rapid development, upgrading, and maintenance of embedded real-time artificial intelligence systems. The goals of this phase of the research were to investigate the feasibility of developing software tools which automatically translate expert system rules into Ada code and to develop methods for performing validation and verification testing of the resultant expert system. A prototype system was demonstrated which automatically translated rules from an Air Force expert system into Ada code and detected errors in the execution of the resultant system. The method and prototype tools for converting AI representations into Ada code, by converting the rules into Ada code modules and then linking them with an Activation Framework based run-time environment to form an executable load module, are discussed. This method is based upon the use of Evidence Flow Graphs, which are a data flow representation for intelligent systems. The development of prototype test generation and evaluation software which was used to test the resultant code is discussed. This testing was performed automatically using Monte-Carlo techniques based upon a constraint based description of the required performance for the system.

  5. The 1985 Army Experience Survey: Tabular Descriptions of First-Term Attritees. Volume 2

    DTIC Science & Technology

    1986-01-01

    Survey receipt control and sample management systems were used. Data were keyed, edited, coded, and weighted, and coding schemes were developed to classify written responses; for example, the state in which the respondent was living when joining the Army was recoded to region of residence.

  6. The 1985 Army Experience Survey: Tabular Descriptions of Enlisted Retirees. Volume 2

    DTIC Science & Technology

    1986-01-01

    Survey receipt control and sample management systems were used. Data were keyed, edited, coded, and weighted, and coding schemes were developed to classify written responses to open-ended items, including region of residence when the respondent joined the Army and attitudes toward pay, benefits, and promotion practices.

  7. Development of an expert based ICD-9-CM and ICD-10-CM map to AIS 2005 update 2008.

    PubMed

    Loftis, Kathryn L; Price, Janet P; Gillich, Patrick J; Cookman, Kathy J; Brammer, Amy L; St Germain, Trish; Barnes, Jo; Graymire, Vickie; Nayduch, Donna A; Read-Allsopp, Christine; Baus, Katherine; Stanley, Patsye A; Brennan, Maureen

    2016-09-01

    This article describes how maps were developed from the clinical modifications of the 9th and 10th revisions of the International Classification of Diseases (ICD) to the Abbreviated Injury Scale 2005 Update 2008 (AIS08). The development of the mapping methodology is described, with discussion of the major assumptions used in the process to map ICD codes to AIS severities. There were many intricacies to developing the maps, because the 2 coding systems, ICD and AIS, were developed for different purposes and contain unique classification structures to meet these purposes. Experts in ICD and AIS analyzed the rules and coding guidelines of both injury coding schemes to develop rules for mapping ICD injury codes to the AIS08. This involved subject-matter expertise, detailed knowledge of anatomy, and an in-depth understanding of injury terms and definitions as applied in both taxonomies. The official ICD-9-CM and ICD-10-CM versions (injury sections) were mapped to the AIS08 codes and severities, following the rules outlined in each coding manual. The panel of experts was composed of coders certified in ICD and/or AIS from around the world. In the process of developing the map from ICD to AIS, the experts created rules to address issues with the differences in coding guidelines between the 2 schemas and assure a consistent approach to all codes. Over 19,000 ICD codes were analyzed and maps were generated for each code to AIS08 chapters, AIS08 severities, and Injury Severity Score (ISS) body regions. After completion of the maps, 14,101 (74%) of the eligible 19,012 injury-related ICD-9-CM and ICD-10-CM codes were assigned valid AIS08 severity scores between 1 and 6. The remaining 4,911 codes were assigned an AIS08 of 9 (unknown) or were determined to be nonmappable because the ICD description lacked sufficient qualifying information for determining severity according to AIS rules. There were also 15,214 (80%) ICD codes mapped to AIS08 chapter and ISS body region, which allow for ISS calculations for patient data sets. This mapping between ICD and AIS provides a comprehensive, expert-designed solution for analysts to bridge the data gap between the injury descriptions provided in hospital codes (ICD-9-CM, ICD-10-CM) and injury severity codes (AIS08). By applying consistent rules from both the ICD and AIS taxonomies, the expert panel created these definitive maps, which are the only ones endorsed by the Association for the Advancement of Automotive Medicine (AAAM). Initial validation upheld the quality of these maps for the estimation of AIS severity, but future work should include verification of these maps for MAIS and ISS estimations with large data sets. These ICD-AIS maps will support data analysis from databases with injury information classified in these 2 different systems and open new doors for the investigation of injury from traumatic events using large injury data sets.
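
    A toy sketch of how such a map is used downstream is given below: look up an AIS severity and ISS body region for each ICD code, then compute the Injury Severity Score as the sum of squares of the highest severities in the three most severely injured body regions. The two map entries shown are hypothetical placeholders, not values from the published expert maps.

        # Hypothetical entries: ICD code -> (AIS severity, ISS body region).
        ICD_TO_AIS = {
            "S06.0X0A": (2, "head_neck"),     # placeholder severity/region
            "S72.001A": (3, "extremities"),   # placeholder severity/region
        }

        def injury_severity_score(icd_codes):
            # ISS = sum of squares of the highest AIS severity in each of the three
            # most severely injured body regions; any AIS 6 injury caps ISS at 75.
            worst_by_region = {}
            for code in icd_codes:
                if code not in ICD_TO_AIS:
                    continue                  # unmapped or unknown severity (AIS 9)
                severity, region = ICD_TO_AIS[code]
                worst_by_region[region] = max(worst_by_region.get(region, 0), severity)
            if 6 in worst_by_region.values():
                return 75
            top3 = sorted(worst_by_region.values(), reverse=True)[:3]
            return sum(s ** 2 for s in top3)

        print(injury_severity_score(["S06.0X0A", "S72.001A"]))   # 4 + 9 = 13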

  8. PEGASUS User's Guide. 5.1c

    NASA Technical Reports Server (NTRS)

    Suhs, Norman E.; Dietz, William E.; Rogers, Stuart E.; Nash, Steven M.; Onufer, Jeffrey T.

    2000-01-01

    PEGASUS 5.1 is the latest version of the PEGASUS series of mesh interpolation codes. It is a fully three-dimensional code. The main purpose for the development of this latest version was to significantly decrease the number of user inputs required and to allow for easier operation of the code. This guide is to be used with the user's manual for version 4 of PEGASUS. A basic description of the methods used in both versions is given in the Version 4 manual. A complete list of all user inputs used in version 5.1 is given in this guide.

  9. Advances in Computational Capabilities for Hypersonic Flows

    NASA Technical Reports Server (NTRS)

    Kumar, Ajay; Gnoffo, Peter A.; Moss, James N.; Drummond, J. Philip

    1997-01-01

    The paper reviews the growth and advances in computational capabilities for hypersonic applications over the period from the mid-1980's to the present day. The current status of the code development issues such as surface and field grid generation, algorithms, physical and chemical modeling, and validation is provided. A brief description of some of the major codes being used at NASA Langley Research Center for hypersonic continuum and rarefied flows is provided, along with their capabilities and deficiencies. A number of application examples are presented, and future areas of research to enhance accuracy, reliability, efficiency, and robustness of computational codes are discussed.

  10. 48 CFR Appendix F to Chapter 2 - Material Inspection and Receiving Report

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ..., research, development, training, and testing. The “Ship To” code and the “Unit” will have to be filled out... in parentheses or slashes. Show the descriptive noun of the item nomenclature and if provided, the..., rehabilitation, engineering, research, development, training, and testing. Do not complete Blocks 4, 13, and 14...

  11. 48 CFR Appendix F to Chapter 2 - Material Inspection and Receiving Report

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ..., research, development, training, and testing. The “Ship To” code and the “Unit” will have to be filled out... in parentheses or slashes. Show the descriptive noun of the item nomenclature and if provided, the..., rehabilitation, engineering, research, development, training, and testing. Do not complete Blocks 4, 13, and 14...

  12. 48 CFR Appendix F to Chapter 2 - Material Inspection and Receiving Report

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ..., research, development, training, and testing. The “Ship To” code and the “Unit” will have to be filled out... in parentheses or slashes. Show the descriptive noun of the item nomenclature and if provided, the..., rehabilitation, engineering, research, development, training, and testing. Do not complete Blocks 4, 13, and 14...

  13. Numerical studies of the deposition of material released from fixed and rotary wing aircraft

    NASA Technical Reports Server (NTRS)

    Bilanin, A. J.; Teske, M. E.

    1984-01-01

    The computer code AGDISP (AGricultural DISPersal) has been developed to predict the deposition of material released from fixed and rotary wing aircraft in a single-pass, computationally efficient manner. The formulation of the code is novel in that the mean particle trajectory and the variance about the mean resulting from turbulent fluid fluctuations are simultaneously predicted. The code presently includes the capability of assessing the influence of neutral atmospheric conditions, inviscid wake vortices, particle evaporation, plant canopy, and terrain on the deposition pattern. In this report, the equations governing the motion of aerially released particles are developed, including a description of the evaporation model used. A series of case studies using AGDISP is included.
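
    As a rough illustration of the "mean plus variance" idea described above (not AGDISP's actual formulation), the toy Python sketch below marches a released particle's mean height downward at a settling velocity while growing the positional variance from an assumed turbulent diffusivity; all constants are invented.

      # Toy sketch: march a released particle's mean height and the variance about the
      # mean due to turbulent fluctuations; constants are invented, not AGDISP's model.
      def settle(z0, v_settle, diffusivity, dt, t_end):
          """Advance mean height and position variance until the mean reaches the ground."""
          z_mean, variance, t = z0, 0.0, 0.0
          while z_mean > 0.0 and t < t_end:
              z_mean -= v_settle * dt             # mean trajectory: steady settling
              variance += 2.0 * diffusivity * dt  # variance growth from turbulent diffusion
              t += dt
          return t, z_mean, variance

      t, z, var = settle(z0=3.0, v_settle=0.5, diffusivity=0.05, dt=0.1, t_end=60.0)
      print(f"touchdown after ~{t:.1f} s, spread (one std dev) ~ {var ** 0.5:.2f} m")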

  14. Automatic NEPHIS Coding of Descriptive Titles for Permuted Index Generation.

    ERIC Educational Resources Information Center

    Craven, Timothy C.

    1982-01-01

    Describes a system for the automatic coding of most descriptive titles which generates Nested Phrase Indexing System (NEPHIS) input strings of sufficient quality for permuted index production. A series of examples and an 11-item reference list accompany the text. (JL)

  15. Overview of the Graphical User Interface for the GERM Code (GCR Event-Based Risk Model

    NASA Technical Reports Server (NTRS)

    Kim, Myung-Hee; Cucinotta, Francis A.

    2010-01-01

    The descriptions of biophysical events from heavy ions are of interest in radiobiology, cancer therapy, and space exploration. The biophysical description of the passage of heavy ions in tissue and shielding materials is best described by a stochastic approach that includes both ion track structure and nuclear interactions. A new computer model called the GCR Event-based Risk Model (GERM) code was developed for the description of biophysical events from heavy ion beams at the NASA Space Radiation Laboratory (NSRL). The GERM code calculates basic physical and biophysical quantities of high-energy protons and heavy ions that have been studied at NSRL for the purpose of simulating space radiobiological effects. For mono-energetic beams, the code evaluates the linear-energy transfer (LET), range (R), and absorption in tissue equivalent material for a given Charge (Z), Mass Number (A) and kinetic energy (E) of an ion. In addition, a set of biophysical properties is evaluated, such as the Poisson distribution of ion or delta-ray hits for a specified cellular area, cell survival curves, and mutation and tumor probabilities. The GERM code also calculates the radiation transport of the beam line for either a fixed number of user-specified depths or at multiple positions along the Bragg curve of the particle. The contributions from primary ion and nuclear secondaries are evaluated. The GERM code accounts for the major nuclear interaction processes of importance for describing heavy ion beams, including nuclear fragmentation, elastic scattering, and knockout-cascade processes by using the quantum multiple scattering fragmentation (QMSFRG) model. The QMSFRG model has been shown to be in excellent agreement with available experimental data for nuclear fragmentation cross sections, and has been used by the GERM code for application to thick target experiments. The GERM code provides scientists participating in NSRL experiments with the data needed for the interpretation of their experiments, including the ability to model the beam line, the shielding of samples and sample holders, and the estimates of basic physical and biological outputs of the designed experiments. We present an overview of the GERM code GUI and provide training applications.
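
    One of the basic biophysical quantities mentioned above, the Poisson distribution of ion hits over a specified cellular area, follows directly from the beam fluence. The short Python sketch below illustrates that calculation with invented fluence and cell-area values; it is not GERM code output.

      import math

      def hit_probabilities(fluence_per_cm2, cell_area_um2, max_hits=5):
          """Poisson probabilities of k ion traversals of a cell at a given beam fluence."""
          area_cm2 = cell_area_um2 * 1e-8            # 1 square micron = 1e-8 cm^2
          mean_hits = fluence_per_cm2 * area_cm2     # expected traversals per cell
          return [math.exp(-mean_hits) * mean_hits ** k / math.factorial(k)
                  for k in range(max_hits + 1)]

      # Illustrative values: 1e6 ions/cm^2 over a 100 um^2 nuclear area -> mean of 1 hit.
      print([round(p, 3) for p in hit_probabilities(1e6, 100.0)])  # P(0), P(1), P(2), ...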

  16. Turbulence modeling for hypersonic flight

    NASA Technical Reports Server (NTRS)

    Bardina, Jorge E.

    1992-01-01

    The objective of the present work is to develop, verify, and incorporate two-equation turbulence models which account for the effect of compressibility at high speeds into a three-dimensional Reynolds-averaged Navier-Stokes code, and to provide documented model descriptions and numerical procedures so that they can be implemented into the National Aerospace Plane (NASP) codes. A summary of accomplishments is listed: (1) four codes have been tested and evaluated against a flat-plate boundary-layer flow and an external supersonic flow; (2) a code named RANS was chosen because of its speed, accuracy, and versatility; (3) the code was extended from thin boundary layer to full Navier-Stokes; (4) the k-omega two-equation turbulence model has been implemented into the base code; (5) a 24-degree laminar compression-corner flow has been simulated and compared to other numerical simulations; and (6) work is in progress on writing up the numerical method of the base code, including the turbulence model.

  17. The 1985 Army Experience Survey: Tabular Descriptions of First-Term Separatees. Volume 2

    DTIC Science & Technology

    1986-01-01

    through survey receipt control and sample management systems. Data were also keyed, edited, coded, and weighted. The coding schemes developed to classify written responses are documented in the report; the remainder of this excerpt is table-of-contents residue covering survey items such as region of residence when joining the Army and number of terms of active enlistment.

  18. The integration of system specifications and program coding

    NASA Technical Reports Server (NTRS)

    Luebke, W. R.

    1970-01-01

    Experience in maintaining up-to-date documentation for one module of the large-scale Medical Literature Analysis and Retrieval System 2 (MEDLARS 2) is described. Several innovative techniques were explored in the development of this system's data management environment, particularly those that use PL/I as an automatic documenter. The PL/I data description section can provide automatic documentation by means of a master description of data elements that has long and highly meaningful mnemonic names and a formalized technique for the production of descriptive commentary. The techniques discussed are practical methods that employ the computer during system development in a manner that assists system implementation, provides interim documentation for customer review, and satisfies some of the deliverable documentation requirements.

  19. Measurement of neutron spectra in the AWE workplace using a Bonner sphere spectrometer.

    PubMed

    Danyluk, Peter

    2010-12-01

    A Bonner sphere spectrometer has been used to measure the neutron spectra in eight different workplace areas at AWE (Atomic Weapons Establishment). The spectra were analysed by the National Physical Laboratory using their principal unfolding code STAY'SL and the results were also analysed by AWE using a bespoke parametrised unfolding code. The bespoke code was designed specifically for the AWE workplace and is very simple to use. Both codes gave results in good agreement. It was found that the measured fluence rate varied from 2 to 70 neutrons cm⁻² s⁻¹ (± 10%) and the ambient dose equivalent H*(10) varied from 0.5 to 57 µSv h⁻¹ (± 20%). A detailed description of the development and use of the bespoke code is presented.
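
    As a minimal illustration of the unfolding idea behind both codes (measured sphere counts equal a response matrix times the neutron spectrum, solved under a non-negativity constraint), the Python sketch below uses a made-up 3x3 response matrix and scipy's non-negative least squares; it is not STAY'SL or the AWE parametrised code.

      import numpy as np
      from scipy.optimize import nnls

      # Toy response matrix: rows = Bonner spheres, columns = energy groups (made-up values).
      R = np.array([[0.9, 0.4, 0.1],
                    [0.5, 0.8, 0.3],
                    [0.2, 0.6, 0.9]])

      true_spectrum = np.array([10.0, 5.0, 2.0])  # group fluences used to fake the data
      counts = R @ true_spectrum                  # what the spheres would measure

      # Unfold: find a non-negative spectrum whose predicted counts best match the data.
      unfolded, residual = nnls(R, counts)
      print(unfolded, residual)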

  20. The Helicopter Antenna Radiation Prediction Code (HARP)

    NASA Technical Reports Server (NTRS)

    Klevenow, F. T.; Lynch, B. G.; Newman, E. H.; Rojas, R. G.; Scheick, J. T.; Shamansky, H. T.; Sze, K. Y.

    1990-01-01

    The first nine months effort in the development of a user oriented computer code, referred to as the HARP code, for analyzing the radiation from helicopter antennas is described. The HARP code uses modern computer graphics to aid in the description and display of the helicopter geometry. At low frequencies the helicopter is modeled by polygonal plates, and the method of moments is used to compute the desired patterns. At high frequencies the helicopter is modeled by a composite ellipsoid and flat plates, and computations are made using the geometrical theory of diffraction. The HARP code will provide a user friendly interface, employing modern computer graphics, to aid the user to describe the helicopter geometry, select the method of computation, construct the desired high or low frequency model, and display the results.

  1. Complexity Measure for the Prototype System Description Language (PSDL)

    DTIC Science & Technology

    2002-06-01

    Albrecht, A. and Gaffney, J., Software Function, Source Lines of Code and Development Effort Prediction, IEEE Transactions on Software Engineering...Through Measurement"; Proceedings of the IEEE, Vol. 77, No. 4, April 1989. Schach, Stephen R., Software Engineering, Second Edition, IRWIN, Burr Ridge

  2. Nucleon interaction data bases for background estimates

    NASA Technical Reports Server (NTRS)

    Wilson, John W.; Townsend, Lawrence W.

    1989-01-01

    Nucleon interaction data bases available in the open literature are examined for potential use in a recently developed nucleon transport code. Particular attention is given to secondary particle penetration and the multiple charged ion products. A brief description of the transport algorithm is given.

  3. Operational manual for two-dimensional transonic code TSFOIL

    NASA Technical Reports Server (NTRS)

    Stahara, S. S.

    1978-01-01

    This code solves the two-dimensional, transonic, small-disturbance equations for flow past lifting airfoils in both free air and various wind-tunnel environments by using a variant of the finite-difference method. A description of the theoretical and numerical basis of the code is provided, together with complete operating instructions and sample cases for the general user. In addition, a programmer's manual is also presented to assist the user interested in modifying the code. Included in the programmer's manual are a dictionary of subroutine variables in common and a detailed description of each subroutine.

  4. Numerical, Analytical, Experimental Study of Fluid Dynamic Forces in Seals Volume 6: Description of Scientific CFD Code SCISEAL

    NASA Technical Reports Server (NTRS)

    Athavale, Mahesh; Przekwas, Andrzej

    2004-01-01

    The objectives of the program were to develop computational fluid dynamics (CFD) codes and simpler industrial codes for analyzing and designing advanced seals for air-breathing and space propulsion engines. The CFD code SCISEAL is capable of producing full three-dimensional flow field information for a variety of cylindrical configurations. An implicit multidomain capability allows the division of complex flow domains for optimum use of computational cells. SCISEAL also has the unique capability to produce cross-coupled stiffness and damping coefficients for rotordynamic computations. The industrial codes consist of a series of separate stand-alone modules designed for expeditious parametric analyses and optimization of a wide variety of cylindrical and face seals. Coupled through a Knowledge-Based System (KBS) that provides a user-friendly Graphical User Interface (GUI), the industrial codes are PC based using an OS/2 operating system. These codes were designed to treat film seals where a clearance exists between the rotating and stationary components. Leakage is inhibited by surface roughness, small but stiff clearance films, and viscous pumping devices. The codes have proven to be a valuable resource for seal development of future air-breathing and space propulsion engines.

  5. Numerical Methodology for Coupled Time-Accurate Simulations of Primary and Secondary Flowpaths in Gas Turbines

    NASA Technical Reports Server (NTRS)

    Przekwas, A. J.; Athavale, M. M.; Hendricks, R. C.; Steinetz, B. M.

    2006-01-01

    Detailed information of the flow-fields in the secondary flowpaths and their interaction with the primary flows in gas turbine engines is necessary for successful designs with optimized secondary flow streams. Present work is focused on the development of a simulation methodology for coupled time-accurate solutions of the two flowpaths. The secondary flowstream is treated using SCISEAL, an unstructured adaptive Cartesian grid code developed for secondary flows and seals, while the mainpath flow is solved using TURBO, a density based code with capability of resolving rotor-stator interaction in multi-stage machines. An interface is being tested that links the two codes at the rim seal to allow data exchange between the two codes for parallel, coupled execution. A description of the coupling methodology and the current status of the interface development is presented. Representative steady-state solutions of the secondary flow in the UTRC HP Rig disc cavity are also presented.

  6. Development and Implementation of Dynamic Scripts to Execute Cycled WRF/GSI Forecasts

    NASA Technical Reports Server (NTRS)

    Zavodsky, Bradley; Srikishen, Jayanthi; Berndt, Emily; Li, Quanli; Watson, Leela

    2014-01-01

    Automating the coupling of data assimilation (DA) and modeling systems is a unique challenge in the numerical weather prediction (NWP) research community. In recent years, the Development Testbed Center (DTC) has released well-documented tools such as the Weather Research and Forecasting (WRF) model and the Gridpoint Statistical Interpolation (GSI) DA system that can be easily downloaded, installed, and run by researchers on their local systems. However, developing a coupled system in which the various preprocessing, DA, model, and postprocessing capabilities are all integrated can be labor-intensive if one has little experience with any of these individual systems. Additionally, operational modeling entities generally have specific coupling methodologies that can take time to understand and develop code to implement properly. To better enable collaborating researchers to perform modeling and DA experiments with GSI, the Short-term Prediction Research and Transition (SPoRT) Center has developed a set of Perl scripts that couple GSI and WRF in a cycling methodology consistent with the use of real-time, regional observation data from the National Centers for Environmental Prediction (NCEP)/Environmental Modeling Center (EMC). Because Perl is open source, the code can be easily downloaded and executed regardless of the user's native shell environment. This paper will provide a description of this open-source code and descriptions of a number of the use cases that have been performed by SPoRT collaborators using the scripts on different computing systems.
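
    The cycling logic such scripts implement alternates a GSI analysis with a WRF forecast each cycle, with the forecast serving as the next cycle's background. The Python sketch below is only a schematic of that loop, shown as a dry run; the shell-script names and six-hour step are placeholders, not SPoRT's Perl implementation.

      # Schematic of a DA/forecast cycling loop; the command names are placeholders
      # standing in for GSI and WRF driver scripts, not SPoRT's actual Perl scripts.
      from datetime import datetime, timedelta

      def run(cmd):
          # In a real driver this would be subprocess.run(cmd, check=True).
          print("would run:", " ".join(cmd))

      def cycle(start, end, step_hours=6):
          t = start
          while t <= end:
              stamp = t.strftime("%Y%m%d%H")
              run(["./run_gsi.sh", stamp])      # assimilate observations into the prior forecast
              run(["./run_wrf.sh", stamp])      # launch the forecast from the new analysis
              t += timedelta(hours=step_hours)  # this forecast becomes the next cycle's background

      cycle(datetime(2014, 1, 1, 0), datetime(2014, 1, 2, 0))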

  7. PCEMCAN - Probabilistic Ceramic Matrix Composites Analyzer: User's Guide, Version 1.0

    NASA Technical Reports Server (NTRS)

    Shah, Ashwin R.; Mital, Subodh K.; Murthy, Pappu L. N.

    1998-01-01

    PCEMCAN (Probabilistic CEramic Matrix Composites ANalyzer) is an integrated computer code developed at NASA Lewis Research Center that simulates uncertainties associated with the constituent properties, manufacturing process, and geometric parameters of fiber reinforced ceramic matrix composites and quantifies their random thermomechanical behavior. The PCEMCAN code can perform deterministic as well as probabilistic analyses to predict thermomechanical properties. This user's guide details the step-by-step procedure to create an input file and update/modify the material properties database required to run the PCEMCAN computer code. An overview of the geometric conventions, micromechanical unit cell, nonlinear constitutive relationship and probabilistic simulation methodology is also provided in the manual. Fast probability integration as well as Monte-Carlo simulation methods are available for the uncertainty simulation. Various options available in the code to simulate probabilistic material properties and quantify sensitivity of the primitive random variables have been described. Deterministic as well as probabilistic results are described using demonstration problems. For detailed theoretical description of deterministic and probabilistic analyses, the user is referred to the companion documents "Computational Simulation of Continuous Fiber-Reinforced Ceramic Matrix Composite Behavior," NASA TP-3602, 1996 and "Probabilistic Micromechanics and Macromechanics for Ceramic Matrix Composites", NASA TM 4766, June 1997.

  8. Test Driven Development of Scientific Models

    NASA Technical Reports Server (NTRS)

    Clune, Thomas L.

    2012-01-01

    Test-Driven Development (TDD) is a software development process that promises many advantages for developer productivity and has become widely accepted among professional software engineers. As the name suggests, TDD practitioners alternate between writing short automated tests and producing code that passes those tests. Although this overly simplified description will undoubtedly sound prohibitively burdensome to many uninitiated developers, the advent of powerful unit-testing frameworks greatly reduces the effort required to produce and routinely execute suites of tests. By testimony, many developers find TDD to be addicting after only a few days of exposure, and find it unthinkable to return to previous practices. Of course, scientific/technical software differs from other software categories in a number of important respects, but I nonetheless believe that TDD is quite applicable to the development of such software and has the potential to significantly improve programmer productivity and code quality within the scientific community. After a detailed introduction to TDD, I will present the experience within the Software Systems Support Office (SSSO) in applying the technique to various scientific applications. This discussion will emphasize the various direct and indirect benefits as well as some of the difficulties and limitations of the methodology. I will conclude with a brief description of pFUnit, a unit testing framework I co-developed to support test-driven development of parallel Fortran applications.
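
    For readers unfamiliar with the rhythm of TDD, the short Python unittest example below shows the kind of small, automated test written before (and run continuously after) the code it exercises; pFUnit plays the analogous role for parallel Fortran. The routine under test is invented for illustration.

      import unittest

      def saturation_mixing_ratio(pressure_hpa, vapor_pressure_hpa):
          """Invented example routine: water-vapor mass mixing ratio (kg/kg)."""
          return 0.622 * vapor_pressure_hpa / (pressure_hpa - vapor_pressure_hpa)

      class TestMixingRatio(unittest.TestCase):
          # In TDD these tests are written (and fail) before the function body exists.
          def test_typical_surface_values(self):
              self.assertAlmostEqual(saturation_mixing_ratio(1000.0, 20.0),
                                     0.622 * 20.0 / 980.0)

          def test_zero_vapor_pressure_gives_zero(self):
              self.assertEqual(saturation_mixing_ratio(900.0, 0.0), 0.0)

      if __name__ == "__main__":
          unittest.main()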

  9. Qualitative Data Analysis for Health Services Research: Developing Taxonomy, Themes, and Theory

    PubMed Central

    Bradley, Elizabeth H; Curry, Leslie A; Devers, Kelly J

    2007-01-01

    Objective To provide practical strategies for conducting and evaluating analyses of qualitative data applicable for health services researchers. Data Sources and Design We draw on extant qualitative methodological literature to describe practical approaches to qualitative data analysis. Approaches to data analysis vary by discipline and analytic tradition; however, we focus on qualitative data analysis that has as a goal the generation of taxonomy, themes, and theory germane to health services research. Principal Findings We describe an approach to qualitative data analysis that applies the principles of inductive reasoning while also employing predetermined code types to guide data analysis and interpretation. These code types (conceptual, relationship, perspective, participant characteristics, and setting codes) define a structure that is appropriate for generation of taxonomy, themes, and theory. Conceptual codes and subcodes facilitate the development of taxonomies. Relationship and perspective codes facilitate the development of themes and theory. Intersectional analyses with data coded for participant characteristics and setting codes can facilitate comparative analyses. Conclusions Qualitative inquiry can improve the description and explanation of complex, real-world phenomena pertinent to health services research. Greater understanding of the processes of qualitative data analysis can be helpful for health services researchers as they use these methods themselves or collaborate with qualitative researchers from a wide range of disciplines. PMID:17286625
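
    A minimal sketch of how the predetermined code types described above might be represented when tagging transcript segments is shown below in Python; the codes and excerpts are invented, and the grouping step illustrates how conceptual codes feed taxonomy building.

      # Sketch of tagging transcript segments with predetermined code types
      # (conceptual, relationship, perspective, participant characteristics, setting);
      # all codes and excerpts are invented for illustration.
      from collections import defaultdict

      segments = [
          {"text": "The nurses coordinate discharge planning with families.",
           "codes": {"conceptual": ["care coordination"],
                     "relationship": ["nurse-family"],
                     "setting": ["inpatient ward"]}},
          {"text": "As a rural patient I wait weeks to see a specialist.",
           "codes": {"conceptual": ["access to care"],
                     "perspective": ["patient"],
                     "participant_characteristics": ["rural resident"]}},
      ]

      # Conceptual codes feed taxonomy building: group excerpts under each concept.
      taxonomy = defaultdict(list)
      for segment in segments:
          for concept in segment["codes"].get("conceptual", []):
              taxonomy[concept].append(segment["text"])

      for concept, excerpts in taxonomy.items():
          print(concept, "->", len(excerpts), "excerpt(s)")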

  10. Qualitative data analysis for health services research: developing taxonomy, themes, and theory.

    PubMed

    Bradley, Elizabeth H; Curry, Leslie A; Devers, Kelly J

    2007-08-01

    To provide practical strategies for conducting and evaluating analyses of qualitative data applicable for health services researchers. DATA SOURCES AND DESIGN: We draw on extant qualitative methodological literature to describe practical approaches to qualitative data analysis. Approaches to data analysis vary by discipline and analytic tradition; however, we focus on qualitative data analysis that has as a goal the generation of taxonomy, themes, and theory germane to health services research. We describe an approach to qualitative data analysis that applies the principles of inductive reasoning while also employing predetermined code types to guide data analysis and interpretation. These code types (conceptual, relationship, perspective, participant characteristics, and setting codes) define a structure that is appropriate for generation of taxonomy, themes, and theory. Conceptual codes and subcodes facilitate the development of taxonomies. Relationship and perspective codes facilitate the development of themes and theory. Intersectional analyses with data coded for participant characteristics and setting codes can facilitate comparative analyses. Qualitative inquiry can improve the description and explanation of complex, real-world phenomena pertinent to health services research. Greater understanding of the processes of qualitative data analysis can be helpful for health services researchers as they use these methods themselves or collaborate with qualitative researchers from a wide range of disciplines.

  11. Analysis of a Radiation Model of the Shuttle Space Suit

    NASA Technical Reports Server (NTRS)

    Anderson, Brooke M.; Nealy, John E.; Kim, Myung-Hee; Qualls, Garry D.; Wilson, John W.

    2003-01-01

    The extravehicular activity (EVA) required to assemble the International Space Station (ISS) will take approximately 1500 hours with 400 hours of EVA per year in operations and maintenance. With the Space Station at an inclination of 51.6 deg the radiation environment is highly variable with solar activity being of great concern. Thus, it is important to study the dose gradients about the body during an EVA to help determine the cancer risk associated with the different environments the ISS will encounter. In this paper we are concerned only with the trapped radiation (electrons and protons). Two different scenarios are looked at: the first is the quiet geomagnetic periods in low Earth orbit (LEO) and the second is during a large solar particle event in the deep space environment. This study includes a description of how the space suit's computer aided design (CAD) model was developed along with a description of the human model. Also included is a brief description of the transport codes used to determine the total integrated dose at several locations within the body. Finally, the results of the transport codes when applied to the space suit and human model and a brief description of the results are presented.

  12. Modeling of impulsive propellant reorientation

    NASA Technical Reports Server (NTRS)

    Hochstein, John I.; Patag, Alfredo E.; Chato, David J.

    1988-01-01

    The impulsive propellant reorientation process is modeled using the Energy Calculations for Liquid Propellants in a Space Environment (ECLIPSE) code. A brief description of the process and the computational model is presented. Code validation is documented via comparison to experimentally derived data for small scale tanks. Predictions of reorientation performance are presented for two tanks designed for use in flight experiments and for a proposed full scale OTV tank. A new dimensionless parameter is developed to correlate reorientation performance in geometrically similar tanks. Its success is demonstrated.

  13. Chemical reactivity and spectroscopy explored from QM/MM molecular dynamics simulations using the LIO code

    NASA Astrophysics Data System (ADS)

    Marcolongo, Juan P.; Zeida, Ari; Semelak, Jonathan A.; Foglia, Nicolás O.; Morzan, Uriel N.; Estrin, Dario A.; González Lebrero, Mariano C.; Scherlis, Damián A.

    2018-03-01

    In this work we present the current advances in the development and the applications of LIO, a lab-made code designed for density functional theory calculations in graphical processing units (GPU), that can be coupled with different classical molecular dynamics engines. This code has been thoroughly optimized to perform efficient molecular dynamics simulations at the QM/MM DFT level, allowing for an exhaustive sampling of the configurational space. Selected examples are presented for the description of chemical reactivity in terms of free energy profiles, and also for the computation of optical properties, such as vibrational and electronic spectra in solvent and protein environments.

  14. Object-oriented code SUR for plasma kinetic simulation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Levchenko, V.D.; Sigov, Y.S.

    1995-12-31

    We have developed a self-consistent simulation code based on an object-oriented model of plasma (OOMP) for solving the Vlasov/Poisson (V/P), Vlasov/Maxwell (V/M), Bhatnagar-Gross-Krook (BGK), and Fokker-Planck (FP) kinetic equations. Applying an object-oriented approach (OOA) to the simulation of plasmas and plasma-like media by means of splitting methods permits a uniform description and solution of a wide range of plasma kinetics problems, including very complicated ones: multidimensional, relativistic, collisional, or with specific boundary conditions. This paper gives a brief description of the capabilities of the SUR code as a concrete realization of OOMP.

  15. Odor Coding by a Mammalian Receptor Repertoire

    PubMed Central

    Saito, Harumi; Chi, Qiuyi; Zhuang, Hanyi; Matsunami, Hiro; Mainland, Joel D.

    2009-01-01

    Deciphering olfactory encoding requires a thorough description of the ligands that activate each odorant receptor (OR). In mammalian systems, however, ligands are known for fewer than 50 of over 1400 human and mouse ORs, greatly limiting our understanding of olfactory coding. We performed high-throughput screening of 93 odorants against 464 ORs expressed in heterologous cells and identified agonists for 52 mouse and 10 human ORs. We used the resulting interaction profiles to develop a predictive model relating physicochemical odorant properties, OR sequences, and their interactions. Our results provide a basis for translating odorants into receptor neuron responses and unraveling mammalian odor coding. PMID:19261596

  16. Requirements for migration of NSSD code systems from LTSS to NLTSS

    NASA Technical Reports Server (NTRS)

    Pratt, M.

    1984-01-01

    The purpose of this document is to address the requirements necessary for a successful conversion of the Nuclear Design (ND) application code systems to the NLTSS environment. The ND application code system community can be characterized as large-scale scientific computation carried out on supercomputers. NLTSS is a distributed operating system being developed at LLNL to replace the LTSS system currently in use. The implications of change are examined including a description of the computational environment and users in ND. The discussion then turns to requirements, first in a general way, followed by specific requirements, including a proposal for managing the transition.

  17. DCU@TRECMed 2012: Using Ad-Hoc Baselines for Domain-Specific Retrieval

    DTIC Science & Technology

    2012-11-01

    description to extend the query, for example: Patients with complicated GERD who receive endoscopy will be extended with Gastroesophageal reflux disease ... Diseases and Related Health Problems, version 9) for the patient’s admission or discharge status [1, 5]; treating negation (e.g. negative test results or...codes were mapped to a description of the code, usually a short phrase/sentence. For instance, the ICD9 code 253.5 corresponds to the disease Diabetes

  18. CSTEM User Manual

    NASA Technical Reports Server (NTRS)

    Hartle, M.; McKnight, R. L.

    2000-01-01

    This manual is a combination of a user manual, theory manual, and programmer manual. The reader is assumed to have some previous exposure to the finite element method. This manual is written with the idea that the CSTEM (Coupled Structural Thermal Electromagnetic-Computer Code) user needs to have a basic understanding of what the code is actually doing in order to properly use the code. For that reason, the underlying theory and methods used in the code are described to a basic level of detail. The manual gives an overview of the CSTEM code: how the code came into existence, a basic description of what the code does, and the order in which it happens (a flowchart). Appendices provide a listing and very brief description of every file used by the CSTEM code, including the type of file it is, what routine regularly accesses the file, and what routine opens the file, as well as special features included in CSTEM.

  19. The CRONOS Code for Astrophysical Magnetohydrodynamics

    NASA Astrophysics Data System (ADS)

    Kissmann, R.; Kleimann, J.; Krebl, B.; Wiengarten, T.

    2018-06-01

    We describe the magnetohydrodynamics (MHD) code CRONOS, which has been used in astrophysics and space-physics studies in recent years. CRONOS has been designed to be easily adaptable to the problem in hand, where the user can expand or exchange core modules or add new functionality to the code. This modularity comes about through its implementation using a C++ class structure. The core components of the code include solvers for both hydrodynamical (HD) and MHD problems. These problems are solved on different rectangular grids, which currently support Cartesian, spherical, and cylindrical coordinates. CRONOS uses a finite-volume description with different approximate Riemann solvers that can be chosen at runtime. Here, we describe the implementation of the code with a view toward its ongoing development. We illustrate the code’s potential through several (M)HD test problems and some astrophysical applications.
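
    CRONOS itself is C++, but the runtime-selectable solver design can be sketched compactly in Python. In the toy example below, two approximate numerical fluxes for 1-D linear advection are registered by name and chosen at runtime, mimicking how a Riemann solver might be swapped without touching the finite-volume update loop; the flux functions and grid are illustrative only.

      # Toy Python analogue of a runtime-selectable flux/Riemann-solver design; the
      # numerical fluxes below are for 1-D linear advection on a periodic grid only.
      def upwind_flux(uL, uR, speed):
          return speed * (uL if speed >= 0 else uR)

      def lax_friedrichs_flux(uL, uR, speed):
          return 0.5 * (speed * uL + speed * uR) - 0.5 * abs(speed) * (uR - uL)

      SOLVERS = {"upwind": upwind_flux, "llf": lax_friedrichs_flux}

      def advect(u, speed, dx, dt, steps, solver="upwind"):
          flux = SOLVERS[solver]  # chosen at runtime by name
          n = len(u)
          for _ in range(steps):
              f = [flux(u[i], u[(i + 1) % n], speed) for i in range(n)]   # right-face fluxes
              u = [u[i] - dt / dx * (f[i] - f[i - 1]) for i in range(n)]  # finite-volume update
          return u

      u0 = [1.0 if 3 <= i <= 6 else 0.0 for i in range(20)]
      print(advect(u0, speed=1.0, dx=1.0, dt=0.5, steps=10, solver="llf"))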

  20. 50 CFR Table 15 to Part 679 - Gear Codes, Descriptions, and Use

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... following: Alpha gear code NMFS logbooks Electronic check-in/ check-out Use numeric code to complete the following: Numeric gear code IERS eLandings ADF&G COAR NMFS AND ADF&G GEAR CODES Hook-and-line HAL X X 61 X...

  1. 50 CFR Table 15 to Part 679 - Gear Codes, Descriptions, and Use

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... following: Alpha gear code NMFS logbooks Electronic check-in/ check-out Use numeric code to complete the following: Numeric gear code IERS eLandings ADF&G COAR NMFS AND ADF&G GEAR CODES Hook-and-line HAL X X 61 X...

  2. 50 CFR Table 15 to Part 679 - Gear Codes, Descriptions, and Use

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... following: Alpha gear code NMFS logbooks Electronic check-in/ check-out Use numeric code to complete the following: Numeric gear code IERS eLandings ADF&G COAR NMFS AND ADF&G GEAR CODES Hook-and-line HAL X X 61 X...

  3. Molecular Dynamic Studies of Particle Wake Potentials in Plasmas

    NASA Astrophysics Data System (ADS)

    Ellis, Ian; Graziani, Frank; Glosli, James; Strozzi, David; Surh, Michael; Richards, David; Decyk, Viktor; Mori, Warren

    2010-11-01

    Fast Ignition studies require a detailed understanding of electron scattering, stopping, and energy deposition in plasmas with variable values for the number of particles within a Debye sphere. Presently there is disagreement in the literature concerning the proper description of these processes. Developing and validating proper descriptions requires studying the processes using first-principle electrostatic simulations and possibly including magnetic fields. We are using the particle-particle particle-mesh (P^3M) code ddcMD to perform these simulations. As a starting point in our study, we examined the wake of a particle passing through a plasma. In this poster, we compare the wake observed in 3D ddcMD simulations with that predicted by Vlasov theory and those observed in the electrostatic PIC code BEPS where the cell size was reduced to .03λD.

  4. Continued development and correlation of analytically based weight estimation codes for wings and fuselages

    NASA Technical Reports Server (NTRS)

    Mullen, J., Jr.

    1978-01-01

    The implementation of the changes to the program for Wing Aeroelastic Design and the development of a program to estimate aircraft fuselage weights are described. The equations to implement the modified planform description, the stiffened panel skin representation, the trim loads calculation, and the flutter constraint approximation are presented. A comparison of the wing model with the actual F-5A weight material distributions and loads is given. The equations and program techniques used for the estimation of aircraft fuselage weights are described. These equations were incorporated as a computer code. The weight predictions of this program are compared with data from the C-141.

  5. Application of the verona coding definitions of emotional sequences (VR-CoDES) on a pediatric data set.

    PubMed

    Vatne, Torun M; Finset, Arnstein; Ørnes, Knut; Ruland, Cornelia M

    2010-09-01

    Adult patients present concerns as defined in the Verona Coding Definitions of Emotional Sequences (VR-CoDES), but we do not know how children express their concerns during medical consultations. This study aimed to evaluate the applicability of VR-CoDES to pediatric oncology consultations. Twenty-eight pediatric consultations were coded with the Verona Coding Definitions of Emotional Sequences (VR-CoDES), and the material was also qualitatively analyzed for descriptive purposes. Five consultations were randomly selected for reliability testing and descriptive statistics were computed. Perfect inter-rater reliability for concerns and moderate reliability for cues were obtained. Cues and/or concerns were present in over half of the consultations. Cues were more frequent than concerns, with the majority of cues being verbal hints to hidden concerns or non-verbal cues. Intensity of expressions, limitations in vocabulary, commonality of statements, and complexity of the setting complicated the use of VR-CoDES. Child-specific cues were observed: use of the imperative, cues about past experiences, and use of onomatopoeia. Children with cancer express concerns during medical consultations. VR-CoDES is a reliable tool for coding concerns in pediatric data sets. For future applications in pediatric settings an appendix should be developed to incorporate the child-specific traits. Copyright (c) 2010 Elsevier Ireland Ltd. All rights reserved.

  6. A Combinatorial Geometry Target Description of the High Mobility Multipurpose Wheeled Vehicle (HMMWV)

    DTIC Science & Technology

    1985-10-01

    The target is described using the technique of combinatorial geometry (Com-Geom). The Com-Geom data is used as input to the Geometric Information for Targets (GIFT) computer code. This report documents the combinatorial geometry (Com-Geom) target description data which is the input data for the GIFT code.

  7. The SIFT hardware/software systems. Volume 2: Software listings

    NASA Technical Reports Server (NTRS)

    Palumbo, Daniel L.

    1985-01-01

    This document contains software listings of the SIFT operating system and application software. The software is coded for the most part in a variant of the Pascal language, Pascal*. Pascal* is a cross-compiler running on the VAX and Eclipse computers. The output of Pascal* is BDX-390 assembler code. When necessary, modules are written directly in BDX-390 assembler code. The listings in this document supplement the description of the SIFT system found in Volume 1 of this report, A Detailed Description.

  8. Beyond crosswalks: reliability of exposure assessment following automated coding of free-text job descriptions for occupational epidemiology.

    PubMed

    Burstyn, Igor; Slutsky, Anton; Lee, Derrick G; Singer, Alison B; An, Yuan; Michael, Yvonne L

    2014-05-01

    Epidemiologists typically collect narrative descriptions of occupational histories because these are less prone than self-reported exposures to recall bias of exposure to a specific hazard. However, the task of coding these narratives can be daunting and prohibitively time-consuming in some settings. The aim of this manuscript is to evaluate the performance of a computer algorithm to translate the narrative description of occupational codes into standard classification of jobs (2010 Standard Occupational Classification) in an epidemiological context. The fundamental question we address is whether exposure assignment resulting from manual (presumed gold standard) coding of the narratives is materially different from that arising from the application of automated coding. We pursued our work through three motivating examples: assessment of physical demands in Women's Health Initiative observational study, evaluation of predictors of exposure to coal tar pitch volatiles in the US Occupational Safety and Health Administration's (OSHA) Integrated Management Information System, and assessment of exposure to agents known to cause occupational asthma in a pregnancy cohort. In these diverse settings, we demonstrate that automated coding of occupations results in assignment of exposures that are in reasonable agreement with results that can be obtained through manual coding. The correlation between physical demand scores based on manual and automated job classification schemes was reasonable (r = 0.5). The agreement between predictive probability of exceeding the OSHA's permissible exposure level for polycyclic aromatic hydrocarbons, using coal tar pitch volatiles as a surrogate, based on manual and automated coding of jobs was modest (Kendall rank correlation = 0.29). In the case of binary assignment of exposure to asthmagens, we observed that fair to excellent agreement in classifications can be reached, depending on presence of ambiguity in assigned job classification (κ = 0.5-0.8). Thus, the success of automated coding appears to depend on the setting and type of exposure that is being assessed. Our overall recommendation is that automated translation of short narrative descriptions of jobs for exposure assessment is feasible in some settings and essential for large cohorts, especially if combined with manual coding to both assess reliability of coding and to further refine the coding algorithm.
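
    The agreement measures quoted above are standard and easy to reproduce. The Python sketch below computes Cohen's kappa for a binary exposure assignment and Kendall's rank correlation for ordered scores on tiny invented arrays; it is not the study's data or code.

      import numpy as np
      from scipy.stats import kendalltau

      def cohens_kappa(a, b):
          """Cohen's kappa for two binary (0/1) raters of the same jobs."""
          a, b = np.asarray(a), np.asarray(b)
          observed = np.mean(a == b)  # observed agreement
          chance = np.mean(a) * np.mean(b) + (1 - np.mean(a)) * (1 - np.mean(b))
          return (observed - chance) / (1 - chance)

      # Invented toy data: manual vs automated exposure assignments.
      manual_binary = [1, 0, 1, 1, 0, 0, 1, 0]
      auto_binary = [1, 0, 1, 0, 0, 0, 1, 1]
      print("kappa:", round(cohens_kappa(manual_binary, auto_binary), 2))

      manual_scores = [2.0, 3.5, 1.0, 4.0, 2.5]
      auto_scores = [2.2, 3.0, 1.5, 3.8, 2.0]
      tau, p_value = kendalltau(manual_scores, auto_scores)
      print("Kendall tau:", round(tau, 2))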

  9. Morse Monte Carlo Radiation Transport Code System

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Emmett, M.B.

    1975-02-01

    The report contains sections with descriptions of the MORSE and PICTURE codes, input descriptions, sample problems, derivations of the physical equations, and explanations of the various error messages. The MORSE code is a multipurpose neutron and gamma-ray transport Monte Carlo code. Time dependence for both shielding and criticality problems is provided. General three-dimensional geometry may be used with an albedo option available at any material surface. The PICTURE code provides aid in preparing correct input data for the combinatorial geometry package CG. It provides a printed view of arbitrary two-dimensional slices through the geometry. By inspecting these pictures one may determine if the geometry specified by the input cards is indeed the desired geometry. 23 refs. (WRF)

  10. Version 2.0 Visual Sample Plan (VSP): UXO Module Code Description and Verification

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gilbert, Richard O.; Wilson, John E.; O'Brien, Robert F.

    2003-05-06

    The Pacific Northwest National Laboratory (PNNL) is developing statistical methods for determining the amount of geophysical surveys conducted along transects (swaths) that are needed to achieve specified levels of confidence of finding target areas (TAs) of anomalous readings and possibly unexploded ordnance (UXO) at closed, transferring and transferred (CTT) Department of Defense (DoD) ranges and other sites. The statistical methods developed by PNNL have been coded into the UXO module of the Visual Sample Plan (VSP) software code that is being developed by PNNL with support from the DoD, the U.S. Department of Energy (DOE), and the U.S. Environmental Protection Agency (EPA). (The VSP software and VSP Users Guide (Hassig et al, 2002) may be downloaded from http://dqo.pnl.gov/vsp.) This report describes and documents the statistical methods developed and the calculations and verification testing that have been conducted to verify that VSP's implementation of these methods is correct and accurate.
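
    A simplified geometric calculation conveys the flavor of the transect-spacing problem: for parallel transects of spacing s and swath width w, a circular target area of diameter d placed at random is traversed by at least one swath with probability roughly min(1, (d + w)/s). The Python sketch below uses that textbook approximation with invented numbers; it is not the VSP UXO module's method.

      # Simplified geometric sketch: probability that parallel survey transects
      # (spacing s, swath width w) traverse a circular target area of diameter d,
      # assuming a random target location. Not the VSP UXO module itself.
      def traversal_probability(target_diameter, swath_width, transect_spacing):
          if transect_spacing <= swath_width:
              return 1.0  # swaths overlap: complete coverage
          return min(1.0, (target_diameter + swath_width) / transect_spacing)

      for spacing in (25.0, 50.0, 100.0, 200.0):
          p = traversal_probability(target_diameter=50.0, swath_width=5.0,
                                    transect_spacing=spacing)
          print(f"spacing {spacing:5.0f} m -> P(traverse target) = {p:.2f}")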

  11. Development of a new lattice physics code robin for PWR application

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhang, S.; Chen, G.

    2013-07-01

    This paper presents a description of methodologies and preliminary verification results of a new lattice physics code ROBIN, being developed for PWR application at Shanghai NuStar Nuclear Power Technology Co., Ltd. The methods used in ROBIN to fulfill various tasks of lattice physics analysis are an integration of historical methods and new methods that came into being very recently. Established methods such as equivalence theory for resonance treatment and the method of characteristics for neutron transport calculation are adopted, as they are applied in many of today's production-level LWR lattice codes, and very useful new methods such as the enhanced neutron current method for Dancoff correction in large and complicated geometry and the log-linear rate constant power depletion method for Gd-bearing fuel are also implemented in the code. A small sample of verification results is provided to illustrate the type of accuracy achievable using ROBIN. It is demonstrated that ROBIN is capable of satisfying most of the needs for PWR lattice analysis and has the potential to become a production quality code in the future. (authors)

  12. Los Alamos and Lawrence Livermore National Laboratories Code-to-Code Comparison of Inter Lab Test Problem 1 for Asteroid Impact Hazard Mitigation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Weaver, Robert P.; Miller, Paul; Howley, Kirsten

    The NNSA Laboratories have entered into an interagency collaboration with the National Aeronautics and Space Administration (NASA) to explore strategies for prevention of Earth impacts by asteroids. Assessment of such strategies relies upon use of sophisticated multi-physics simulation codes. This document describes the task of verifying and cross-validating, between Lawrence Livermore National Laboratory (LLNL) and Los Alamos National Laboratory (LANL), modeling capabilities and methods to be employed as part of the NNSA-NASA collaboration. The approach has been to develop a set of test problems and then to compare and contrast results obtained by use of a suite of codes, including MCNP, RAGE, Mercury, Ares, and Spheral. This document provides a short description of the codes, an overview of the idealized test problems, and discussion of the results for deflection by kinetic impactors and stand-off nuclear explosions.

  13. An update of input instructions to TEMOD

    NASA Technical Reports Server (NTRS)

    1973-01-01

    The theory and operation of a FORTRAN 4 computer code, designated as TEMOD, used to calculate tubular thermoelectric generator performance is described in WANL-TME-1906. The original version of TEMOD was developed in 1969. A description is given of additions to the mathematical model and an update of the input instructions to the code. Although the basic mathematical model described in WANL-TME-1906 has remained unchanged, a substantial number of input/output options were added to allow completion of module performance parametrics as required in support of the compact thermoelectric converter system technology program.

  14. Development of Clinical Vignettes to Describe Alzheimer's Disease Health States: A Qualitative Study

    PubMed Central

    Oremus, Mark; Xie, Feng; Gaebel, Kathryn

    2016-01-01

    Aims To develop clinical descriptions (vignettes) of life with Alzheimer’s disease (AD), we conducted focus groups of persons with AD (n = 14), family caregivers of persons with AD (n = 20), and clinicians who see persons with AD in their practices (n = 5). Methods Group participants read existing descriptions of AD and commented on the realism and comprehensibility of the descriptions. We used thematic framework analysis to code the comments into themes and develop three new vignettes to describe mild, moderate, and severe AD. Results Themes included the types of symptoms to mention in the new vignettes, plus the manner in which the vignettes should be written. Since the vignette descriptions were based on focus group participants’ first-hand knowledge of AD, the descriptions can be said to demonstrate content validity. Conclusion Members of the general public can read the vignettes and estimate their health-related quality-of-life (HRQoL) as if they had AD based on the vignette descriptions. This is especially important for economic evaluations of new AD medications, which require HRQoL to be assessed in a manner that persons with AD often find difficult to undertake. The vignettes will allow the general public to serve as a proxy and provide HRQoL estimates in place of persons with AD. PMID:27589604

  15. PFLOTRAN Verification: Development of a Testing Suite to Ensure Software Quality

    NASA Astrophysics Data System (ADS)

    Hammond, G. E.; Frederick, J. M.

    2016-12-01

    In scientific computing, code verification ensures the reliability and numerical accuracy of a model simulation by comparing the simulation results to experimental data or known analytical solutions. The model is typically defined by a set of partial differential equations with initial and boundary conditions, and verification ensures whether the mathematical model is solved correctly by the software. Code verification is especially important if the software is used to model high-consequence systems which cannot be physically tested in a fully representative environment [Oberkampf and Trucano (2007)]. Justified confidence in a particular computational tool requires clarity in the exercised physics and transparency in its verification process with proper documentation. We present a quality assurance (QA) testing suite developed by Sandia National Laboratories that performs code verification for PFLOTRAN, an open source, massively-parallel subsurface simulator. PFLOTRAN solves systems of generally nonlinear partial differential equations describing multiphase, multicomponent and multiscale reactive flow and transport processes in porous media. PFLOTRAN's QA test suite compares the numerical solutions of benchmark problems in heat and mass transport against known, closed-form, analytical solutions, including documentation of the exercised physical process models implemented in each PFLOTRAN benchmark simulation. The QA test suite development strives to follow the recommendations given by Oberkampf and Trucano (2007), which describes four essential elements in high-quality verification benchmark construction: (1) conceptual description, (2) mathematical description, (3) accuracy assessment, and (4) additional documentation and user information. Several QA tests within the suite will be presented, including details of the benchmark problems and their closed-form analytical solutions, implementation of benchmark problems in PFLOTRAN simulations, and the criteria used to assess PFLOTRAN's performance in the code verification procedure. References Oberkampf, W. L., and T. G. Trucano (2007), Verification and Validation Benchmarks, SAND2007-0853, 67 pgs., Sandia National Laboratories, Albuquerque, NM.
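
    A miniature example of the verification pattern described above is sketched in Python below: an explicit finite-difference solution of 1-D diffusion is compared against its closed-form solution, with a maximum-error tolerance as the acceptance criterion. The grid sizes and tolerance are arbitrary, and the problem is far simpler than PFLOTRAN's benchmarks.

      import numpy as np

      # Miniature code-verification benchmark: explicit finite differences for 1-D
      # diffusion u_t = D u_xx with u(x,0) = sin(pi x) and u(0,t) = u(1,t) = 0,
      # compared against the exact solution u = exp(-D pi^2 t) sin(pi x).
      D, nx, t_end = 1.0, 101, 0.05
      x = np.linspace(0.0, 1.0, nx)
      dx = x[1] - x[0]
      dt = 0.25 * dx**2 / D                  # satisfies the explicit stability limit
      steps = int(round(t_end / dt))

      u = np.sin(np.pi * x)
      for _ in range(steps):
          u[1:-1] += D * dt / dx**2 * (u[2:] - 2 * u[1:-1] + u[:-2])

      exact = np.exp(-D * np.pi**2 * steps * dt) * np.sin(np.pi * x)
      err = np.max(np.abs(u - exact))
      print(f"max error = {err:.2e}")
      assert err < 1e-3, "verification benchmark failed"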

  16. Maneuvering Rotorcraft Noise Prediction: A New Code for a New Problem

    NASA Technical Reports Server (NTRS)

    Brentner, Kenneth S.; Bres, Guillaume A.; Perez, Guillaume; Jones, Henry E.

    2002-01-01

    This paper presents the unique aspects of the development of an entirely new maneuver noise prediction code called PSU-WOPWOP. The main focus of the code is the aeroacoustic aspects of the maneuver noise problem, when the aeromechanical input data are provided (namely aircraft and blade motion, blade airloads). The PSU-WOPWOP noise prediction capability was developed for rotors in steady and transient maneuvering flight. Featuring an object-oriented design, the code allows great flexibility for complex rotor configuration and motion (including multiple rotors and full aircraft motion). The relative locations and number of hinges, flexures, and body motions can be arbitrarily specified to match any specific rotorcraft. An analysis of algorithm efficiency is performed for maneuver noise prediction along with a description of the tradeoffs made specifically for the maneuvering noise problem. Noise predictions for the main rotor of a rotorcraft in steady descent, transient (arrested) descent, hover and a mild "pop-up" maneuver are demonstrated.

  17. AutoBayes Program Synthesis System Users Manual

    NASA Technical Reports Server (NTRS)

    Schumann, Johann; Jafari, Hamed; Pressburger, Tom; Denney, Ewen; Buntine, Wray; Fischer, Bernd

    2008-01-01

    Program synthesis is the systematic, automatic construction of efficient executable code from high-level declarative specifications. AutoBayes is a fully automatic program synthesis system for the statistical data analysis domain; in particular, it solves parameter estimation problems. It has seen many successful applications at NASA and is currently being used, for example, to analyze simulation results for Orion. The input to AutoBayes is a concise description of a data analysis problem composed of a parameterized statistical model and a goal that is a probability term involving parameters and input data. The output is optimized and fully documented C/C++ code computing the values for those parameters that maximize the probability term. AutoBayes can solve many subproblems symbolically rather than having to rely on numeric approximation algorithms, thus yielding effective, efficient, and compact code. Statistical analysis is faster and more reliable, because effort can be focused on model development and validation rather than manual development of solution algorithms and code.
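
    A tiny analogue of the kind of problem AutoBayes targets, stated as "data are Gaussian with unknown parameters; find the parameter values maximizing the likelihood," is sketched below in Python using a numerical optimizer; AutoBayes would instead synthesize documented C/C++ code and often exploit closed-form solutions symbolically. The data and model here are invented.

      import numpy as np
      from scipy.optimize import minimize

      # Toy maximum-likelihood problem: Gaussian data with unknown mean and standard
      # deviation, solved numerically rather than by synthesized code.
      rng = np.random.default_rng(0)
      data = rng.normal(loc=3.0, scale=0.5, size=200)

      def neg_log_likelihood(params):
          mu, log_sigma = params
          sigma = np.exp(log_sigma)  # parameterize by log(sigma) to keep sigma positive
          return 0.5 * np.sum(((data - mu) / sigma) ** 2) + data.size * log_sigma

      result = minimize(neg_log_likelihood, x0=[0.0, 0.0])
      mu_hat, sigma_hat = result.x[0], np.exp(result.x[1])
      print(f"mu = {mu_hat:.3f}, sigma = {sigma_hat:.3f}")  # close to the closed-form MLE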

  18. Langley Stability and Transition Analysis Code (LASTRAC) Version 1.2 User Manual

    NASA Technical Reports Server (NTRS)

    Chang, Chau-Lyan

    2004-01-01

    LASTRAC is a general-purpose, physics-based transition prediction code released by NASA for Laminar Flow Control studies and transition research. The design and development of the LASTRAC code is aimed at providing an engineering tool that is easy to use and yet capable of dealing with a broad range of transition-related issues. It was written from scratch based on state-of-the-art numerical methods for stability analysis and modern software technologies. At low fidelity, it allows users to perform linear stability analysis and N-factor transition correlation for a broad range of flow regimes and configurations by using either the linear stability theory or linear parabolized stability equations method. At high fidelity, users may use nonlinear PSE to track finite-amplitude disturbances until the skin-friction rise. This document describes the governing equations, numerical methods, code development, detailed description of input/output parameters, and case studies for the current release of LASTRAC.

  19. Development of a computer code for calculating the steady super/hypersonic inviscid flow around real configurations. Volume 2: Code description

    NASA Technical Reports Server (NTRS)

    Marconi, F.; Yaeger, L.

    1976-01-01

    A numerical procedure was developed to compute the inviscid super/hypersonic flow field about complex vehicle geometries accurately and efficiently. A second-order accurate finite difference scheme is used to integrate the three-dimensional Euler equations in regions of continuous flow, while all shock waves are computed as discontinuities via the Rankine-Hugoniot jump conditions. Conformal mappings are used to develop a computational grid. The effects of blunt nose entropy layers are computed in detail. Real gas effects for equilibrium air are included using curve fits of Mollier charts. Typical calculated results for shuttle orbiter, hypersonic transport, and supersonic aircraft configurations are included to demonstrate the usefulness of this tool.

  20. Views of Health Information Management Staff on the Medical Coding Software in Mashhad, Iran.

    PubMed

    Kimiafar, Khalil; Hemmati, Fatemeh; Banaye Yazdipour, Alireza; Sarbaz, Masoumeh

    2018-01-01

    Systematic evaluation of Health Information Technology (HIT) and users' views leads to the modification and development of these technologies in accordance with their needs. The purpose of this study was to investigate the views of Health Information Management (HIM) staff on the quality of medical coding software. A descriptive cross-sectional study was conducted between May and July 2016 in 26 hospitals (academic and non-academic) in Mashhad, north-eastern Iran. The study population consisted of the chairs of HIM departments and medical coders (58 staff). Data were collected through a valid and reliable questionnaire. The data were analyzed using SPSS version 16.0. In the staff's view, among the advantages of coding software, reducing coding time had the highest average rating (mean = 3.82) and cost reduction had the lowest (mean = 3.20). Meanwhile, concern about losing job opportunities was the least important disadvantage (15.5%) to the use of coding software. In general, the results of this study showed that coding software in some cases has deficiencies. Designers and developers of health information coding software should pay more attention to technical aspects, in-work reminders, help in selecting proper codes through access to coding rules, maintenance services, links to other relevant databases, and the possibility of providing brief and detailed reports in different formats.

  1. Integrated Electronic Warfare System Advanced Development Model (ADM); Appendix 1 - Functional Requirement Specification.

    DTIC Science & Technology

    1977-10-01

    [Scanned-document excerpt; only fragments are recoverable.] The legible portions are revision-control header fields, part of a table of contents listing deliverable documents in section 5 (a computer data base design document (CDBDD), computer program package (CPP), computer program operator's manual (CPOM), and computer program test plan (CPTPL), all on page 45), and a list of figures naming a simplified block diagram (labeled "JEWS" in the scan) and a system controller architecture diagram.

  2. Studies of particle wake potentials in plasmas

    NASA Astrophysics Data System (ADS)

    Ellis, Ian N.; Graziani, Frank R.; Glosli, James N.; Strozzi, David J.; Surh, Michael P.; Richards, David F.; Decyk, Viktor K.; Mori, Warren B.

    2011-09-01

    A detailed understanding of electron stopping and scattering in plasmas with variable values for the number of particles within a Debye sphere is still not at hand. Presently, there is some disagreement in the literature concerning the proper description of these processes. Theoretical models assume electrostatic (Coulomb force) interactions between particles and neglect magnetic effects. Developing and validating proper descriptions requires studying the processes using first-principle plasma simulations. We are using the particle-particle particle-mesh (PPPM) code ddcMD and the particle-in-cell (PIC) code BEPS to perform these simulations. As a starting point in our study, we examine the wake of a particle passing through a plasma in 3D electrostatic simulations performed with ddcMD and BEPS. In this paper, we compare the wakes observed in these simulations with each other and predictions from collisionless kinetic theory. The relevance of the work to Fast Ignition is discussed.

  3. Development of Methodology for Programming Autonomous Agents

    NASA Technical Reports Server (NTRS)

    Erol, Kutluhan; Levy, Renato; Lang, Lun

    2004-01-01

    A brief report discusses the rationale for, and the development of, a methodology for generating computer code for autonomous-agent-based systems. The methodology is characterized as enabling an increase in the reusability of the generated code among and within such systems, thereby making it possible to reduce the time and cost of development of the systems. The methodology is also characterized as enabling reduction of the incidence of those software errors that are attributable to the human failure to anticipate distributed behaviors caused by the software. A major conceptual problem said to be addressed in the development of the methodology was that of how to efficiently describe the interfaces between several layers of agent composition by use of a language that is both familiar to engineers and descriptive enough to describe such interfaces unambiguously.

  4. Development of an Ultra-Safe Rechargeable Lithium-Ion Battery.

    DTIC Science & Technology

    1994-11-15

    [Scanned-document excerpt; only fragments are recoverable.] The record is an R&D status report (1931-1001/0) on the development of an ultra-safe rechargeable lithium-ion battery under contract N00014-94-C-0141 (ARPA order no. 9332004arp01/13APR1994/313ES). The title of work is given as "Lithium-ion Battery Development" and the reporting period as August 15, 1994 to November 15, 1994; the description of progress is cut off in the scan.

  5. What Is FRBR? A Conceptual Model for the Bibliographic Universe

    ERIC Educational Resources Information Center

    Tillett, Barbara

    2005-01-01

    From 1992 to 1995 the IFLA Study Group on Functional Requirements for Bibliographic Records (FRBR) developed an entity relationship model as a generalised view of the bibliographic universe, intended to be independent of any cataloguing code or implementation. The FRBR report itself includes a description of the conceptual model (the entities,…

  6. The Recurring Author: William Shakespeare, a Case Study through Content Analysis.

    ERIC Educational Resources Information Center

    Harrison, Robert L., Jr.

    The "recurring author" is one whose works appear many times at different levels in instructional units found in literature textbook series. A descriptive case study discussed the treatment of a recurring author, William Shakespeare, using units in a sample of six literature textbook series. Developed to describe, to code, and to analyze…

  7. Guideline for the Comprehensive Campus Master Plan System.

    ERIC Educational Resources Information Center

    State Univ. System of Florida, Tallahassee.

    This document is a guideline for institutions in the Florida State University System to use as they comply with state mandates requiring them to develop campus master plans and land management plans. It supplements the minimum criteria in the state's Administrative Code. For each element the guide offers description of its purpose, data…

  8. Generalized Advanced Propeller Analysis System (GAPAS). Volume 2: Computer program user manual

    NASA Technical Reports Server (NTRS)

    Glatt, L.; Crawford, D. R.; Kosmatka, J. B.; Swigart, R. J.; Wong, E. W.

    1986-01-01

    The Generalized Advanced Propeller Analysis System (GAPAS) computer code is described. GAPAS was developed to analyze advanced-technology multi-bladed propellers which operate on aircraft with speeds up to Mach 0.8 and altitudes up to 40,000 feet. GAPAS includes technology for analyzing aerodynamic, structural, and acoustic performance of propellers. The computer code was developed for the CDC 7600 computer and is currently available for industrial use on the NASA Langley computer. A description of all the analytical models incorporated in GAPAS is included. Sample calculations are also described, as well as user requirements for modifying the analysis system. Computer system core requirements and running times are also discussed.

  9. Level-2 Milestone 5588: Deliver Strategic Plan and Initial Scalability Assessment by Advanced Architecture and Portability Specialists Team

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Draeger, Erik W.

    This report documents completion of the work of creating a strategic plan and beginning customer engagements. The description of the milestone is: The newly formed advanced architecture and portability specialists (AAPS) team will develop a strategic plan to meet the goals of 1) sharing knowledge and experience with code teams to ensure that ASC codes run well on new architectures, and 2) supplying skilled computational scientists to put the strategy into practice. The plan will be delivered to ASC management in the first quarter. By the fourth quarter, the team will identify their first customers within PEM and IC, perform an initial assessment of scalability and performance bottlenecks for next-generation architectures, and embed AAPS team members with customer code teams to assist with initial portability development within standalone kernels or proxy applications.

  10. Developing an observing attitude: A qualitative analysis of meditation diaries in a MBSR clinical trial

    PubMed Central

    Kerr, Catherine E.; Josyula, Krishnapriya; Littenberg, Ronnie

    2011-01-01

    Mindfulness-based stress reduction (MBSR) is an 8-week training that is designed to teach participants mindful awareness of the present moment. In randomized clinical trials (RCTs), MBSR has demonstrated efficacy in various conditions, including reducing chronic pain-related distress and improving quality of life in healthy individuals. There have, however, been no qualitative studies investigating participants’ descriptions of changes experienced over multiple time-points during the course of the program. This qualitative study of an MBSR cohort (N=8 healthy individuals) in a larger RCT examined participants’ daily diary descriptions of their home-practice experiences. The study used a two-part method, combining grounded theory with a closed-ended coding approach. The grounded theory analysis revealed that during the trial, all participants, to varying degrees, described moments of distress related to practice; at the end of the course, all participants who completed the training demonstrated greater detail and clarity in their descriptions, improved affect, and the emergence of an observing self. The closed-ended coding schema, carried out to shed light on the development of an observing self, revealed that the emergence of an observing self was not related to the valence of participants’ experiential descriptions: even participants whose diaries contained predominantly negative characterizations of their experience throughout the trial were able, by the end of the trial, to demonstrate an observing, witnessing attitude towards their own distress. Conclusion: Progress in MBSR may rely less on the valence of participants’ experiences and more on the way participants describe and relate to their own inner experience. PMID:21226129

  11. Turbomachinery Forced Response Prediction System (FREPS): User's Manual

    NASA Technical Reports Server (NTRS)

    Morel, M. R.; Murthy, D. V.

    1994-01-01

    The turbomachinery forced response prediction system (FREPS), version 1.2, is capable of predicting the aeroelastic behavior of axial-flow turbomachinery blades. This document is meant to serve as a guide in the use of the FREPS code with specific emphasis on its use at NASA Lewis Research Center (LeRC). A detailed explanation of the aeroelastic analysis and its development is beyond the scope of this document, and may be found in the references. FREPS has been developed by the NASA LeRC Structural Dynamics Branch. The manual is divided into three major parts: an introduction, the preparation of input, and the procedure to execute FREPS. Part 1 includes a brief background on the necessity of FREPS, a description of the FREPS system, the steps needed to be taken before FREPS is executed, an example input file with instructions, presentation of the geometric conventions used, and the input/output files employed and produced by FREPS. Part 2 contains a detailed description of the command names needed to create the primary input file that is required to execute the FREPS code. Also, Part 2 has an example data file to aid the user in creating their own input files. Part 3 explains the procedures required to execute the FREPS code on the Cray Y-MP, a computer system available at the NASA LeRC.

  12. Numerical and Physical Aspects of Aerodynamic Flows

    DTIC Science & Technology

    1992-01-15

    [Scanned-document excerpt; only fragments are recoverable.] The legible text refers to ice-accretion measurements, a detailed description of the IRT (Icing Research Tunnel) given in reference 4 of the report, a new database provided by the test program, slat-deflection lift flows, the development of a validation database with practical geometries and conditions for emerging computational methods, and the role of a quality database at realistic conditions for a practical airfoil in enabling developers to substantially improve such methods.

  13. Manual for obscuration code with space station applications

    NASA Technical Reports Server (NTRS)

    Marhefka, R. J.; Takacs, L.

    1986-01-01

    The Obscuration Code, referred to as SHADOW, is a user-oriented computer code to determine the cast shadow of an antenna in a complex environment onto the far zone sphere. The surrounding structure can be composed of multiple composite cone frustums and multi-sided flat plates. These structural pieces are ideal for modeling space station configurations. The means of describing the geometry input is compatible with the NEC-BASIC Scattering Code. In addition, an interactive mode of operation has been provided for DEC VAX computers. The first part of this document is a user's manual designed to give a description of the method used to obtain the shadow map, to provide an overall view of the operation of the computer code, to instruct a user in how to model structures, and to give examples of inputs and outputs. The second part is a code manual that details how to set up the interactive and non-interactive modes of the code and provides a listing and brief description of each of the subroutines.
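
    As a rough illustration of the kind of geometric occlusion test that underlies such a shadow map (this is not SHADOW's actual algorithm, and every name below is invented), one can ask whether the ray from the antenna toward a given far-zone direction is intercepted by a flat plate:

        import numpy as np

        def plate_blocks_direction(antenna, direction, plate_vertices, eps=1e-9):
            """True if the ray antenna + t*direction (t > 0) hits a planar convex plate.

            Illustrative sketch only: a real shadowing code must also handle cone
            frustums, shadow-boundary curves, and assembly of the far-zone map.
            """
            a = np.asarray(antenna, dtype=float)
            d = np.asarray(direction, dtype=float)
            d = d / np.linalg.norm(d)
            v = np.asarray(plate_vertices, dtype=float)
            normal = np.cross(v[1] - v[0], v[2] - v[0])
            denom = np.dot(normal, d)
            if abs(denom) < eps:          # ray parallel to the plate's plane
                return False
            t = np.dot(normal, v[0] - a) / denom
            if t <= eps:                  # plate lies behind the antenna
                return False
            hit = a + t * d
            # The hit point must fall on the same side of every edge of the
            # convex polygon when the vertices are walked in order.
            signs = [np.dot(np.cross(v[(i + 1) % len(v)] - v[i], hit - v[i]), normal)
                     for i in range(len(v))]
            return all(s >= -eps for s in signs) or all(s <= eps for s in signs)

    Marking a far-zone direction as shadowed whenever any structural piece blocks it, and sweeping over a grid of directions, yields the kind of shadow map the manual describes.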

  14. PACER -- A fast running computer code for the calculation of short-term containment/confinement loads following coolant boundary failure. Volume 2: User information

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sienicki, J.J.

    A fast-running and simple computer code has been developed to calculate pressure loadings inside light water reactor containments/confinements under loss-of-coolant accident conditions. PACER was originally developed to calculate containment/confinement pressure and temperature time histories for loss-of-coolant accidents in Soviet-designed VVER reactors and is relevant to the activities of the US International Nuclear Safety Center. The code employs a multicompartment representation of the containment volume and is focused upon application to early-time containment phenomena during and immediately following blowdown. PACER has been developed for FORTRAN 77 and earlier versions of FORTRAN. The code has been successfully compiled and executed on SUN SPARC and Hewlett-Packard HP-735 workstations provided that appropriate compiler options are specified. The code incorporates both capabilities built around a hardwired default generic VVER-440 Model V230 design and fairly general user-defined input. However, array dimensions are hardwired and must be changed by modifying the source code if the number of compartments/cells differs from the default number of nine. Detailed input instructions are provided as well as a description of outputs. Input files and selected output are presented for two sample problems run on both HP-735 and SUN SPARC workstations.

  15. 7 CFR 1485.13 - Application process and strategic plan.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... affiliated organizations; (D) A description of management and administrative capability; (E) A description of... code and the percentage of U.S. origin content by weight, exclusive of added water; (B) A description... and the percentage of U.S. origin content by weight, exclusive of added water; (C) A description of...

  16. Energy Savings Analysis of the Proposed NYStretch-Energy Code 2018

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Liu, Bing; Zhang, Jian; Chen, Yan

    This study was conducted by the Pacific Northwest National Laboratory (PNNL) in support of the stretch energy code development led by the New York State Energy Research and Development Authority (NYSERDA). In 2017 NYSERDA developed its 2016 Stretch Code Supplement to the 2016 New York State Energy Conservation Construction Code (hereinafter referred to as “NYStretch-Energy”). NYStretch-Energy is intended as a model energy code for statewide voluntary adoption that anticipates other code advancements culminating in the goal of a statewide Net Zero Energy Code by 2028. Since then, NYSERDA continues to develop the NYStretch-Energy Code 2018 edition. To support the effort, PNNL conducted energy simulation analysis to quantify the energy savings of proposed commercial provisions of the NYStretch-Energy Code (2018) in New York. The focus of this project is the 20% improvement over existing commercial model energy codes. A key requirement of the proposed stretch code is that it be ‘adoptable’ as an energy code, meaning that it must align with current code scope and limitations, and primarily impact building components that are currently regulated by local building departments. It is largely limited to prescriptive measures, which are what most building departments and design projects are most familiar with. This report describes a set of energy-efficiency measures (EEMs) that demonstrate 20% energy savings over ANSI/ASHRAE/IES Standard 90.1-2013 (ASHRAE 2013) across a broad range of commercial building types and all three climate zones in New York. In collaboration with New Building Institute, the EEMs were developed from national model codes and standards, high-performance building codes and standards, regional energy codes, and measures being proposed as part of the on-going code development process. PNNL analyzed these measures using whole building energy models for selected prototype commercial buildings and multifamily buildings representing buildings in New York. Section 2 of this report describes the analysis methodology, including the building types and construction area weights update for this analysis, the baseline, and the method to conduct the energy saving analysis. Section 3 provides detailed specifications of the EEMs and bundles. Section 4 summarizes the results of individual EEMs and EEM bundles by building type, energy end-use and climate zone. Appendix A documents detailed descriptions of the selected prototype buildings. Appendix B provides energy end-use breakdown results by building type for both the baseline code and stretch code in all climate zones.

  17. Fundamentals, current state of the development of, and prospects for further improvement of the new-generation thermal-hydraulic computational HYDRA-IBRAE/LM code for simulation of fast reactor systems

    NASA Astrophysics Data System (ADS)

    Alipchenkov, V. M.; Anfimov, A. M.; Afremov, D. A.; Gorbunov, V. S.; Zeigarnik, Yu. A.; Kudryavtsev, A. V.; Osipov, S. L.; Mosunova, N. A.; Strizhov, V. F.; Usov, E. V.

    2016-02-01

    The conceptual fundamentals of the development of the new-generation system thermal-hydraulic computational code HYDRA-IBRAE/LM are presented. The code is intended to simulate the thermal-hydraulic processes that take place in the loops and the heat-exchange equipment of liquid-metal cooled fast reactor systems under normal operation, anticipated operational occurrences, and accidents. The paper provides a brief overview of Russian and foreign system thermal-hydraulic codes for modeling liquid-metal coolants and explains why a new-generation HYDRA-IBRAE/LM code is needed. Considering the specific engineering features of the nuclear power plants (NPPs) equipped with the BN-1200 and BREST-OD-300 reactors, the processes and phenomena are singled out that require detailed analysis and development of models so that they can be correctly described by the system thermal-hydraulic code in question. Information on the functionality of the code is provided, viz., the thermal-hydraulic two-phase model, the properties of the sodium and lead coolants, the closure equations for simulation of the heat-mass exchange processes, the models describing the processes that take place during a steam-generator tube rupture, etc. The article gives a brief overview of the usability of the code, including a description of the support documentation and the supply package, as well as the possibilities of taking advantage of modern computing technologies, such as parallel computation. The paper shows the current state of verification and validation of the code; it also presents the principles of constructing and populating the verification matrices for the BREST-OD-300 and BN-1200 reactor systems. The prospects are outlined for further development of the HYDRA-IBRAE/LM code, introduction of new models into it, and enhancement of its usability. It is shown that the program of development and practical application of the code will make it possible in the near future to carry out computations to analyze the safety of prospective NPP projects at a qualitatively higher level.

  18. Calculation of Water Drop Trajectories to and About Arbitrary Three-Dimensional Bodies in Potential Airflow

    NASA Technical Reports Server (NTRS)

    Norment, H. G.

    1980-01-01

    Calculations can be performed for any atmospheric conditions and for all water drop sizes, from the smallest cloud droplet to large raindrops. Any subsonic, external, non-lifting flow can be accommodated; flow into, but not through, inlets also can be simulated. Experimental water drop drag relations are used in the water drop equations of motion and effects of gravity settling are included. Seven codes are described: (1) a code used to debug and plot body surface description data; (2) a code that processes the body surface data to yield the potential flow field; (3) a code that computes flow velocities at arrays of points in space; (4) a code that computes water drop trajectories from an array of points in space; (5) a code that computes water drop trajectories and fluxes to arbitrary target points; (6) a code that computes water drop trajectories tangent to the body; and (7) a code that produces stereo pair plots which include both the body and trajectories. Code descriptions include operating instructions, card inputs and printouts for example problems, and listing of the FORTRAN codes. Accuracy of the calculations is discussed, and trajectory calculation results are compared with prior calculations and with experimental data.
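
    A minimal sketch of the type of drop equation of motion described above (drag toward the local air velocity plus gravitational settling). The drag correlation and all names here are assumptions made for illustration; the report uses its own experimental water-drop drag relations and FORTRAN implementations.

        import numpy as np

        def drag_coefficient(re):
            # Assumed Schiller-Naumann-type correlation, for illustration only.
            if re < 1e-8:
                return 0.0
            return 24.0 / re * (1.0 + 0.15 * re**0.687)

        def step_drop(x, v, air_velocity, d, rho_w, rho_a, mu_a, g, dt):
            """Advance one explicit Euler step of a spherical drop's trajectory."""
            u = air_velocity(x)                   # local potential-flow velocity
            vrel = u - v
            re = rho_a * np.linalg.norm(vrel) * d / mu_a
            m = rho_w * np.pi * d**3 / 6.0
            area = np.pi * d**2 / 4.0
            f_drag = 0.5 * rho_a * drag_coefficient(re) * area * np.linalg.norm(vrel) * vrel
            a = f_drag / m + g                    # aerodynamic drag plus gravity
            return x + v * dt, v + a * dt

    Integrating such steps from an upstream array of starting points, with the flow velocities supplied by the potential-flow codes, is the basic operation behind the trajectory and flux calculations listed above.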

  19. Interactomes to Biological Phase Space: a call to begin thinking at a new level in computational biology.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Davidson, George S.; Brown, William Michael

    2007-09-01

    Techniques for high-throughput determination of interactomes, together with high-resolution protein colocalization maps within organelles and through membranes, will soon create a vast resource. With these data, biological descriptions, akin to the high-dimensional phase spaces familiar to physicists, will become possible. These descriptions will capture sufficient information to make possible realistic, system-level models of cells. The descriptions and the computational models they enable will require powerful computing techniques. This report is offered as a call to the computational biology community to begin thinking at this scale and as a challenge to develop the required algorithms and codes to make use of the new data.

  20. 40 CFR 51.50 - What definitions apply to this subpart?

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... accuracy description (MAD) codes means a set of six codes used to define the accuracy of latitude/longitude data for point sources. The six codes and their definitions are: (1) Coordinate Data Source Code: The... physical piece of or a closely related set of equipment. The EPA's reporting format for a given inventory...

  1. 76 FR 32085 - Medicare Program; Inpatient Psychiatric Facilities Prospective Payment System-Update for Rate...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-06-03

    ... description of comorbidity for chronic renal failure. In addition, we inadvertently omitted from Table 11 the comorbidity code ``V4511'' for chronic renal failure. These changes are not substantive changes to the... heading ``Diagnoses codes,'' for the renal failure, chronic diagnoses codes, replace code ``V451'' with...

  2. 76 FR 53912 - FDA's Public Database of Products With Orphan-Drug Designation: Replacing Non-Informative Code...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-08-30

    ...] FDA's Public Database of Products With Orphan-Drug Designation: Replacing Non-Informative Code Names... replaced non- informative code names with descriptive identifiers on its public database of products that... on our public database with non-informative code names. After careful consideration of this matter...

  3. An approach for coupled-code multiphysics core simulations from a common input

    DOE PAGES

    Schmidt, Rodney; Belcourt, Kenneth; Hooper, Russell; ...

    2014-12-10

    This study describes an approach for coupled-code multiphysics reactor core simulations that is being developed by the Virtual Environment for Reactor Applications (VERA) project in the Consortium for Advanced Simulation of Light-Water Reactors (CASL). In this approach a user creates a single problem description, called the “VERAIn” common input file, to define and set up the desired coupled-code reactor core simulation. A preprocessing step accepts the VERAIn file and generates a set of fully consistent input files for the different physics codes being coupled. The problem is then solved using a single-executable coupled-code simulation tool applicable to the problem, which is built using VERA infrastructure software tools and the set of physics codes required for the problem of interest. The approach is demonstrated by performing an eigenvalue and power distribution calculation of a typical three-dimensional 17 × 17 assembly with thermal–hydraulic and fuel temperature feedback. All neutronics aspects of the problem (cross-section calculation, neutron transport, power release) are solved using the Insilico code suite and are fully coupled to a thermal–hydraulic analysis calculated by the Cobra-TF (CTF) code. The single-executable coupled-code (Insilico-CTF) simulation tool is created using several VERA tools, including LIME (Lightweight Integrating Multiphysics Environment for coupling codes), DTK (Data Transfer Kit), Trilinos, and TriBITS. Parallel calculations are performed on the Titan supercomputer at Oak Ridge National Laboratory using 1156 cores, and a synopsis of the solution results and code performance is presented. Finally, ongoing development of this approach is also briefly described.
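
    A minimal sketch of the single-common-input idea described above. The file format, keys, and output dictionaries are invented for illustration; the actual VERAIn syntax and preprocessor are not reproduced here.

        import json

        def preprocess(common_input_path):
            """Read one common problem description and emit consistent,
            code-specific inputs, so the coupled physics codes cannot drift apart."""
            with open(common_input_path) as f:
                problem = json.load(f)            # stand-in for parsing a common input file

            neutronics_input = {
                "lattice": problem["assembly"]["lattice"],
                "enrichment": problem["assembly"]["enrichment"],
                "boron_ppm": problem["state"]["boron_ppm"],
            }
            thermal_hydraulics_input = {
                "lattice": problem["assembly"]["lattice"],   # same geometry source
                "inlet_temperature_K": problem["state"]["inlet_temperature_K"],
                "system_pressure_MPa": problem["state"]["system_pressure_MPa"],
                "mass_flux": problem["state"]["mass_flux"],
            }
            return neutronics_input, thermal_hydraulics_input

    The point of the pattern is that both physics inputs are derived from one description, which is what "fully consistent input files" refers to.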

  4. FLUSH: A tool for the design of slush hydrogen flow systems

    NASA Technical Reports Server (NTRS)

    Hardy, Terry L.

    1990-01-01

    As part of the National Aerospace Plane Project, an analytical model was developed to perform calculations for in-line transfer of solid-liquid mixtures of hydrogen. This code, called FLUSH, calculates pressure drop and solid fraction loss for the flow of slush hydrogen through pipe systems. The model solves the steady-state, one-dimensional equation of energy to obtain slush loss estimates. A description of the code is provided, as well as a guide for users of the program. Preliminary results are also presented showing the anticipated degradation of slush hydrogen solid content for various piping systems.

  5. Dust-wall and dust-plasma interaction in the MIGRAINe code

    NASA Astrophysics Data System (ADS)

    Vignitchouk, L.; Tolias, P.; Ratynskaia, S.

    2014-09-01

    The physical models implemented in the recently developed dust dynamics code MIGRAINe are described. A major update of the treatment of secondary electron emission, stemming from models adapted to typical scrape-off layer temperatures, is reported. Sputtering and plasma species backscattering are introduced from fits of available experimental data and their relative importance to dust charging and heating is assessed in fusion-relevant scenarios. Moreover, the description of collisions between dust particles and plasma-facing components, based on the approximation of elastic-perfectly plastic adhesive spheres, has been upgraded to take into account the effects of particle size and temperature.

  6. Determination of coronal magnetic fields from vector magnetograms

    NASA Technical Reports Server (NTRS)

    Mikic, Zoran

    1992-01-01

    The determination of coronal magnetic fields from vector magnetograms, including the development and application of algorithms to determine force-free coronal fields above selected observations of active regions, is studied. Two additional active regions were selected and analyzed. The restriction of periodicity in the 3-D code used to determine the coronal field was removed; the new code has variable mesh spacing and is thus able to provide a more realistic description of coronal fields. The NOAA active region AR5747 of 20 Oct. 1989 was studied. A brief account of progress during the research performed is reported.

  7. The kinetics of aerosol particle formation and removal in NPP severe accidents

    NASA Astrophysics Data System (ADS)

    Zatevakhin, Mikhail A.; Arefiev, Valentin K.; Semashko, Sergey E.; Dolganov, Rostislav A.

    2016-06-01

    Severe Nuclear Power Plant (NPP) accidents are accompanied by release of a massive amount of energy, radioactive products and hydrogen into the atmosphere of the NPP containment. A valid estimation of consequences of such accidents can only be carried out through the use of the integrated codes comprising a description of the basic processes which determine the consequences. A brief description of a coupled aerosol and thermal-hydraulic code to be used for the calculation of the aerosol kinetics within the NPP containment in case of a severe accident is given. The code comprises a KIN aerosol unit integrated into the KUPOL-M thermal-hydraulic code. Some features of aerosol behavior in severe NPP accidents are briefly described.

  8. The kinetics of aerosol particle formation and removal in NPP severe accidents

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zatevakhin, Mikhail A.; Arefiev, Valentin K.; Semashko, Sergey E.

    2016-06-08

    Severe Nuclear Power Plant (NPP) accidents are accompanied by release of a massive amount of energy, radioactive products and hydrogen into the atmosphere of the NPP containment. A valid estimation of consequences of such accidents can only be carried out through the use of the integrated codes comprising a description of the basic processes which determine the consequences. A brief description of a coupled aerosol and thermal–hydraulic code to be used for the calculation of the aerosol kinetics within the NPP containment in case of a severe accident is given. The code comprises a KIN aerosol unit integrated into the KUPOL-M thermal–hydraulic code. Some features of aerosol behavior in severe NPP accidents are briefly described.

  9. NASTRAN maintenance and enhancement experiences

    NASA Technical Reports Server (NTRS)

    Schmitz, R. P.

    1975-01-01

    The current capability, which includes isoparametric elements, optimization of grid point sequencing, and an eigenvalue routine, is described. Overlay and coding errors were corrected for the cyclic symmetry, transient response, and differential stiffness rigid formats. Error corrections and program enhancements are discussed, along with developments scheduled for the current year and a brief description of analyses being performed using the program.

  10. TFaNS Tone Fan Noise Design/Prediction System. Volume 2; User's Manual; 1.4

    NASA Technical Reports Server (NTRS)

    Topol, David A.; Eversman, Walter

    1999-01-01

    TFaNS is the Tone Fan Noise Design/Prediction System developed by Pratt & Whitney under contract to NASA Lewis (presently NASA Glenn). The purpose of this system is to predict tone noise emanating from a fan stage including the effects of reflection and transmission by the rotor and stator and by the duct inlet and nozzle. These effects have been added to an existing annular duct/isolated stator noise prediction capability. TFaNS consists of: the codes that compute the acoustic properties (reflection and transmission coefficients) of the various elements and write them to files. CUP3D: Fan Noise Coupling Code that reads these files, solves the coupling problem, and outputs the desired noise predictions. AWAKEN: CFD/Measured Wake Postprocessor which reformats CFD wake predictions and/or measured wake data so it can be used by the system. This volume of the report provides information on code input and file structure essential for potential users of TFANS. This report is divided into three volumes: Volume 1. System Description, CUP3D Technical Documentation, and Manual for Code Developers; Volume 2. User's Manual, TFANS Vers. 1.4; Volume 3. Evaluation of System Codes.

  11. TFaNS Tone Fan Noise Design/Prediction System. Volume 3; Evaluation of System Codes

    NASA Technical Reports Server (NTRS)

    Topol, David A.

    1999-01-01

    TFANS is the Tone Fan Noise Design/Prediction System developed by Pratt & Whitney under contract to NASA Lewis (presently NASA Glenn). The purpose of this system is to predict tone noise emanating from a fan stage including the effects of reflection and transmission by the rotor and stator and by the duct inlet and nozzle. These effects have been added to an existing annular duct/isolated stator noise prediction capability. TFANS consists of: The codes that compute the acoustic properties (reflection and transmission coefficients) of the various elements and write them to files. Cup3D: Fan Noise Coupling Code that reads these files, solves the coupling problem, and outputs the desired noise predictions. AWAKEN: CFD/Measured Wake Postprocessor which reformats CFD wake predictions and/or measured wake data so it can be used by the system. This volume of the report evaluates TFANS versus full-scale and ADP 22" rig data using the semi-empirical wake modelling in the system. This report is divided into three volumes: Volume I: System Description, CUP3D Technical Documentation, and Manual for Code Developers; Volume II: User's Manual, TFANS Version 1.4; Volume III: Evaluation of System Codes.

  12. Development of a 3D numerical code to calculate the trajectories of the blow off electrons emitted by a vacuum surface discharge: Application to the study of the electromagnetic interference induced on a spacecraft

    NASA Astrophysics Data System (ADS)

    Froger, Etienne

    1993-05-01

    A description of the electromagnetic behavior of a satellite subjected to an electric discharge is given using a specially developed numerical code. One of the particularities of vacuum discharges, obtained by irradiation of polymers, is the intense emission of electrons into the spacecraft environment. Electromagnetic radiation, associated with the trajectories of the particles around the spacecraft, is considered as the main source of the interference observed. In the absence of accurate orbital data and realistic ground tests, the assessment of these effects requires numerical simulation of the interaction between this electron source and the spacecraft. This is done by the GEODE particle code which is applied to characteristic configurations in order to estimate the spacecraft response to a discharge, which is simulated from a vacuum discharge model designed in laboratory. The spacecraft response to a current injection is simulated by the ALICE numerical three dimensional code. The comparison between discharge and injection effects, from the results given by the two codes, illustrates the representativity of electromagnetic susceptibility tests and the main parameters for their definition.

  13. CEM2k and LAQGSM Codes as Event-Generators for Space Radiation Shield and Cosmic Rays Propagation Applications

    NASA Technical Reports Server (NTRS)

    Mashnik, S. G.; Gudima, K. K.; Sierk, A. J.; Moskalenko, I. V.

    2002-01-01

    Space radiation shield applications and studies of cosmic ray propagation in the Galaxy require reliable cross sections to calculate spectra of secondary particles and yields of the isotopes produced in nuclear reactions induced both by particles and nuclei at energies from threshold to hundreds of GeV per nucleon. Since the data often exist in a very limited energy range or sometimes not at all, the only way to obtain an estimate of the production cross sections is to use theoretical models and codes. Recently, we have developed improved versions of the Cascade-Exciton Model (CEM) of nuclear reactions: the codes CEM97 and CEM2k for description of particle-nucleus reactions at energies up to about 5 GeV. In addition, we have developed a LANL version of the Quark-Gluon String Model (LAQGSM) to describe reactions induced both by particles and nuclei at energies up to hundreds of GeV/nucleon. We have tested and benchmarked the CEM and LAQGSM codes against a large variety of experimental data and have compared their results with predictions by other currently available models and codes. Our benchmarks show that CEM and LAQGSM codes have predictive powers no worse than other currently used codes and describe many reactions better than other codes; therefore both our codes can be used as reliable event-generators for space radiation shield and cosmic ray propagation applications. The CEM2k code is being incorporated into the transport code MCNPX (and several other transport codes), and we plan to incorporate LAQGSM into MCNPX in the near future. Here, we present the current status of the CEM2k and LAQGSM codes, and show results and applications to studies of cosmic ray propagation in the Galaxy.

  14. Spacecraft-plasma interaction codes: NASCAP/GEO, NASCAP/LEO, POLAR, DynaPAC, and EPSAT

    NASA Technical Reports Server (NTRS)

    Mandell, M. J.; Jongeward, G. A.; Cooke, D. L.

    1992-01-01

    Development of a computer code to simulate interactions between the surfaces of a geometrically complex spacecraft and the space plasma environment involves: (1) defining the relevant physical phenomena and formulating them in appropriate levels of approximation; (2) defining a representation for the 3-D space external to the spacecraft and a means for defining the spacecraft surface geometry and embedding it in the surrounding space; (3) packaging the code so that it is easy and practical to use, interpret, and present the results; and (4) validating the code by continual comparison with theoretical models, ground test data, and spaceflight experiments. The physical content, geometrical capabilities, and application of five S-CUBED developed spacecraft plasma interaction codes are discussed. The NASA Charging Analyzer Program/geosynchronous earth orbit (NASCAP/GEO) is used to illustrate the role of electrostatic barrier formation in daylight spacecraft charging. NASCAP/low Earth orbit (LEO) applications to the CHARGE-2 and Space Power Experiment Aboard Rockets (SPEAR)-1 rocket payloads are shown. DynaPAC application to the SPEAR-2 rocket payloads is described. Environment Power System Analysis Tool (EPSAT) is illustrated by application to Tethered Satellite System 1 (TSS-1), SPEAR-3, and Sundance. A detailed description and application of the Potentials of Large Objects in the Auroral Region (POLAR) Code are presented.

  15. Reconstructing past occupational exposures: how reliable are women's reports of their partner's occupation?

    PubMed

    Tagiyeva, Nara; Semple, Sean; Devereux, Graham; Sherriff, Andrea; Henderson, John; Elias, Peter; Ayres, Jon G

    2011-06-01

    Most of the evidence on agreement between self- and proxy-reported occupational data comes from interview-based studies. The authors aimed to examine agreement between women's reports of their partner's occupation and their partner's own description using questionnaire-based data collected as a part of the prospective, population-based Avon Longitudinal Study of Parents and Children. Information on present occupation was self-reported by women's partners and proxy-reported by women through questionnaires administered at 8 and 21 months after the birth of a child. Job titles were coded to the Standard Occupational Classification (SOC2000) using software developed by the University of Warwick (Computer-Assisted Structured Coding Tool). The accuracy of proxy reports was expressed as percentage agreement and kappa coefficients for four-, three- and two-digit SOC2000 codes obtained in automatic and semiautomatic (manually improved) coding modes. Data from 6016 couples at 8 months and 5232 couples at 21 months postnatally were included in the analyses. The agreement between men's self-reported occupation and women's report of their partner's occupation in fully automatic coding mode at four-, three- and two-digit code level was 65%, 71% and 77% at 8 months and 68%, 73% and 76% at 21 months. The accuracy of agreement was slightly improved by semiautomatic coding of occupations: 73%/73%, 78%/77% and 83%/80% at 8/21 months respectively. While this suggests that women's descriptions of their partners' occupation can be used as a valuable tool in epidemiological research where data from partners are not available, this study found disagreement between these young women and their partners at the two-digit level of SOC2000 coding in approximately one in five cases. Proxy reporting of occupation introduces a statistically significant degree of error in classification. The effects of occupational misclassification by proxy reporting in retrospective occupational epidemiological studies based on questionnaire data should be considered.
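
    The agreement statistics quoted above (percentage agreement and kappa) can be illustrated with a minimal calculation; the function below is a generic sketch with hypothetical inputs, not the study's analysis code.

        from collections import Counter

        def agreement_stats(self_codes, proxy_codes, digits=4):
            """Percent agreement and Cohen's kappa at a given SOC2000 digit level."""
            a = [c[:digits] for c in self_codes]
            b = [c[:digits] for c in proxy_codes]
            n = len(a)
            p_obs = sum(x == y for x, y in zip(a, b)) / n
            # Chance agreement from each rater's marginal code frequencies.
            ca, cb = Counter(a), Counter(b)
            p_exp = sum(ca[k] * cb.get(k, 0) for k in ca) / (n * n)
            kappa = (p_obs - p_exp) / (1 - p_exp) if p_exp < 1 else 1.0
            return p_obs, kappa

    Truncating the codes to fewer digits (the digits argument) reproduces the pattern in which agreement rises as the code level becomes coarser.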

  16. Status Report on NEAMS PROTEUS/ORIGEN Integration

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wieselquist, William A

    2016-02-18

    The US Department of Energy’s Nuclear Energy Advanced Modeling and Simulation (NEAMS) Program has contributed significantly to the development of the PROTEUS neutron transport code at Argonne National Laboratory and to the Oak Ridge Isotope Generation and Depletion Code (ORIGEN) depletion/decay code at Oak Ridge National Laboratory. PROTEUS’s key capability is the efficient and scalable (up to hundreds of thousands of cores) neutron transport solver on general, unstructured, three-dimensional finite-element-type meshes. The scalability and mesh generality enable the transfer of neutron and power distributions to other codes in the NEAMS toolkit for advanced multiphysics analysis. Recently, ORIGEN has received considerable modernization to provide the high-performance depletion/decay capability within the NEAMS toolkit. This work presents a description of the initial integration of ORIGEN in PROTEUS, mainly performed during FY 2015, with minor updates in FY 2016.

  17. User's manual for the time-dependent INERTIA code

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bailey, A.W.; Bennett, R.B.

    1985-01-01

    The time-dependent INERTIA code is described. This code models the effects of neutral beam momentum input in tokamaks as predicted by the time-dependent formulation of the Stacey-Sigmar formalism. The operation and architecture of the code are described, as are the supplementary plotting and impurity line radiation routines. A short description of the steady-state version of the INERTIA code is also provided.

  18. Description of Panel Method Code ANTARES

    NASA Technical Reports Server (NTRS)

    Ulbrich, Norbert; George, Mike (Technical Monitor)

    2000-01-01

    Panel method code ANTARES was developed to compute wall interference corrections in a rectangular wind tunnel. The code uses point doublets to represent blockage effects and line doublets to represent lifting effects of a wind tunnel model. Subsonic compressibility effects are modeled by applying the Prandtl-Glauert transformation. The closed wall, open jet, or perforated wall boundary condition may be assigned to a wall panel centroid. The tunnel walls can be represented by using up to 8000 panels. The accuracy of panel method code ANTARES was successfully investigated by comparing solutions for the closed wall and open jet boundary condition with corresponding Method of Images solutions. Fourier transform solutions of a two-dimensional wind tunnel flow field were used to check the application of the perforated wall boundary condition. Studies showed that the accuracy of panel method code ANTARES can be improved by increasing the total number of wall panels in the circumferential direction. It was also shown that the accuracy decreases with increasing free-stream Mach number of the wind tunnel flow field.
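
    For context, the Prandtl-Glauert treatment mentioned above is the standard linearized-subsonic scaling, stated here in our notation (not necessarily as implemented in ANTARES): the perturbation potential satisfies

        (1 - M_\infty^2)\,\phi_{xx} + \phi_{yy} + \phi_{zz} = 0 , \qquad \beta = \sqrt{1 - M_\infty^2} ,

    so stretching the cross-flow coordinates by \beta reduces the problem to an equivalent incompressible (Laplace) one, and compressibility effects grow as \beta shrinks with increasing free-stream Mach number.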

  19. Software Management Environment (SME) release 9.4 user reference material

    NASA Technical Reports Server (NTRS)

    Hendrick, R.; Kistler, D.; Manter, K.

    1992-01-01

    This document contains user reference material for the Software Management Environment (SME) prototype, developed for the Systems Development Branch (Code 552) of the Flight Dynamics Division (FDD) of Goddard Space Flight Center (GSFC). The SME provides an integrated set of management tools that can be used by software development managers in their day-to-day management and planning activities. This document provides an overview of the SME, a description of all functions, and detailed instructions concerning the software's installation and use.

  20. REGIONAL-SCALE ATMOSPHERIC MERCURY MODELING

    EPA Science Inventory

    This PowerPoint presentation gives a short synopsis of the state of the science of atmospheric mercury modeling, including a description of recent publications of model codes by EPA, a description of a recent mercury model intercomparison study, and a description of a synthesis p...

  1. [Orthopedic and trauma surgery in the German DRG system. Recent developments].

    PubMed

    Franz, D; Schemmann, F; Selter, D D; Wirtz, D C; Roeder, N; Siebert, H; Mahlke, L

    2012-07-01

    Orthopedics and trauma surgery are subject to continuous medical advancement. The correct and performance-based case allocation by German diagnosis-related groups (G-DRG) is a major challenge. This article analyzes and assesses current developments in orthopedics and trauma surgery in the areas of coding of diagnoses and medical procedures and the development of the 2012 G-DRG system. The relevant diagnoses, medical procedures and G-DRGs in the versions 2011 and 2012 were analyzed based on the publications of the German DRG Institute (InEK) and the German Institute of Medical Documentation and Information (DIMDI). Changes were made to the International Classification of Diseases (ICD) coding of complex cases with medical complications and to the procedure coding for spinal surgery and for hand and foot surgery. The G-DRG structures were modified for endoprosthetic surgery on ankle, shoulder and elbow joints. The definition of modular structured endoprostheses was clarified. The G-DRG system for orthopedic and trauma surgery appears to be largely consolidated. The current phase of the evolution of the G-DRG system is primarily aimed at developing descriptions and definitions of the content and mutual delimitation of operation and procedure codes (OPS) that are as exact as possible. This is an essential prerequisite for a correct and performance-based case allocation in the G-DRG system.

  2. PHITS Overview

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Niita, K.; Matsuda, N.; Iwamoto, Y.

    The paper presents a brief description of the models incorporated in PHITS and the present status of the code, showing some benchmarking tests of the PHITS code for accelerator facilities and space radiation.

  3. Telidon Videotex presentation level protocol: Augmented picture description instructions

    NASA Astrophysics Data System (ADS)

    Obrien, C. D.; Brown, H. G.; Smirle, J. C.; Lum, Y. F.; Kukulka, J. Z.; Kwan, A.

    1982-02-01

    The Telidon Videotex system is a method by which graphic and textual information and transactional services can be accessed from information sources by the general public. In order to transmit information to a Telidon terminal at a minimum bandwidth, and in a manner independent of the type of communications channel, a coding scheme was devised which permits the encoding of a picture into the geometric drawing elements which compose it. These picture description instructions form an alphageometric coding model based on the primitives POINT, LINE, ARC, RECTANGLE, POLYGON, and INCREMENT. Text is encoded as ASCII characters along with a supplementary table of accents and special characters. A mosaic shape table is included for compatibility. A detailed specification of the coding scheme and a description of the principles which make it independent of communications channel and display hardware are provided.
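
    As a toy illustration of the alphageometric idea (the data layout below is invented; the real picture description instruction syntax is defined at the byte level by the presentation level protocol, not here), a picture is transmitted as a short list of drawing primitives rather than a pixel raster, and the terminal redraws it at whatever resolution its display hardware offers:

        # Illustrative only: a toy alphageometric-style picture as a list of
        # primitives mirroring POINT, LINE, ARC, RECTANGLE, and POLYGON.
        picture = [
            ("POINT",     {"xy": (0.10, 0.20)}),
            ("LINE",      {"start": (0.10, 0.20), "end": (0.80, 0.20)}),
            ("ARC",       {"center": (0.50, 0.50), "radius": 0.25,
                           "start_deg": 0, "end_deg": 180}),
            ("RECTANGLE", {"corner": (0.15, 0.60), "width": 0.30, "height": 0.15}),
            ("POLYGON",   {"vertices": [(0.60, 0.60), (0.90, 0.60), (0.75, 0.90)],
                           "filled": True}),
        ]

        def render(picture, device):
            """A terminal walks the instruction list and draws each primitive with
            its own hardware, which is what makes the encoding independent of the
            display and far more compact than sending pixels."""
            for opcode, args in picture:
                getattr(device, opcode.lower())(**args)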

  4. Description of movement quality in patients with low back pain: A qualitative study as a first step to a practical definition.

    PubMed

    van Dijk, Margriet J H; Smorenburg, Nienke T A; Visser, Bart; Nijhuis-van der Sanden, Maria W G; Heerkens, Yvonne F

    2017-03-01

    As a first step to formulate a practical definition for movement quality (MQ), this study aims to explore how Dutch allied health care professionals (AHCPs) describe MQ of daily life activities in patients with low back pain (LBP). In this qualitative cross-sectional digital survey study, Dutch AHCPs (n = 91) described MQ in open text (n = 91) and with three keywords (n = 90). After exploratory qualitative content analysis, the ICF linking rules (International Classification of Functioning, Disability and Health) were applied to classify MQ descriptions and keywords. The identified meaningful concepts (MCs) of the descriptions (274) and keywords (239) were linked to ICF codes (87.5% and 80.3%, respectively), Personal factors (5.8% and 5.9%, respectively), and supplementary codes (6.6% and 13.8%, respectively). The MCs were linked to a total of 31 ICF codes, especially to b760 'control of voluntary movement functions', b7602 'coordination of voluntary movements', d4 'Mobility', and d230 'carry out daily routine'. Negative and positive formulated descriptions elucidated different MQ interpretations. Descriptions of MQ given by Dutch AHCPs in patients with LBP cover all ICF components. Coordination and functional movements are seen as the most elementary concepts of MQ. Variation in MQ descriptions and interpretations hinders defining MQ and indicates the necessity of additional steps.

  5. [Standardization of terminology in laboratory medicine I].

    PubMed

    Yoon, Soo Young; Yoon, Jong Hyun; Min, Won Ki; Lim, Hwan Sub; Song, Junghan; Chae, Seok Lae; Lee, Chang Kyu; Kwon, Jung Ah; Lee, Kap No

    2007-04-01

    Standardization of medical terminology is essential for data transmission between health-care institutions or clinical laboratories and for maximizing the benefits of information technology. The purpose of our study was to standardize the medical terms used in the clinical laboratory, such as test names, units, and terms used in result descriptions. During the first year of the study, we developed a standard database of concept names for laboratory terms, which covered the terms used in government health care centers, their branch offices, and primary health care units. Laboratory terms were collected from the electronic data interchange (EDI) codes of the National Health Insurance Corporation (NHIC), the Logical Observation Identifier Names and Codes (LOINC) database, community health centers and their branch offices, and clinical laboratories of representative university medical centers. For standard expression, we referred to the English-Korean/Korean-English medical dictionary of the Korean Medical Association and the rules for foreign language translation. Programs for mapping between the LOINC database and EDI codes and for translating English to Korean were developed. A Korean standard laboratory terminology database containing six axial concept names (component, property, time aspect, system (specimen), scale type, and method type) was established for 7,508 test observations. Short names and a mapping table for EDI codes and the Unified Medical Language System (UMLS) were added. Synonym tables for concept names, words used in the database, and the six axial terms were prepared to make it easier to find the standard terminology from common terms used in the field of laboratory medicine. Here we report for the first time a Korean standard laboratory terminology database for test names, result description terms, and result units covering most laboratory tests in primary healthcare centers.
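
    A hypothetical example of the six-axis structure described above (the concrete values, field names, and mappings below are illustrative, not entries from the Korean database):

        # Hypothetical six-axis record for a serum glucose measurement.
        term = {
            "component":   "Glucose",
            "property":    "Mass concentration",
            "time_aspect": "Point in time",
            "system":      "Serum/Plasma",      # the specimen axis
            "scale_type":  "Quantitative",
            "method_type": "",                  # may be left blank when unspecified
            "edi_code":    None,                # mapped to the NHIC EDI code table
            "loinc":       None,                # mapped to the corresponding LOINC entry
            "short_name":  "Glucose, serum",
        }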

  6. Use of data description languages in the interchange of data

    NASA Technical Reports Server (NTRS)

    Pignede, M.; Real-Planells, B.; Smith, S. R.

    1994-01-01

    The Consultative Committee for Space Data Systems (CCSDS) is developing Standards for the interchange of information between systems, including those operating under different environments. The objective is to perform the interchange automatically, i.e. in a computer interpretable manner. One aspect of the concept developed by CCSDS is the use of a separate data description to specify the data being transferred. Using the description, data can then be automatically parsed by the receiving computer. With a suitably expressive Data Description Language (DDL), data formats of arbitrary complexity can be handled. The advantages of this approach are: (1) that the description need only be written and distributed once to all users, and (2) new software does not need to be written for each new format, provided generic tools are available to support writing and interpretation of descriptions and the associated data instances. Consequently, the effort of 'hard coding' each new format is avoided and problems of integrating multiple implementations of a given format by different users are avoided. The approach is applicable in any context where computer parsable description of data could enhance efficiency (e.g. within a spacecraft control system, a data delivery system or an archive). The CCSDS have identified several candidate DDL's: EAST (Extended Ada Subset), TSDN (Transfer Syntax Data Notation) and MADEL (Modified ASN.1 as a Data Description Language -- a DDL based on the Abstract Syntax Notation One - ASN.1 - specified in the ISO/IEC 8824). This paper concentrates on ESA's development of MADEL. ESA have also developed a 'proof of concept' prototype of the required support tools, implemented on a PC under MS-DOS, which has successfully demonstrated the feasibility of the approach, including the capability within an application of retrieving and displaying particular data elements, given its MADEL description (i.e. a data description written in MADEL). This paper outlines the work done to date and assesses the applicability of this modified ASN.1 as a DDL. The feasibility of the approach is illustrated with several examples.
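
    A minimal sketch of the description-driven interchange idea discussed above: a data description is distributed once, and a single generic parser interprets any data instance against it, so no format-specific software has to be written. The mini-description format, field names, and byte layout below are invented; they are not EAST, TSDN, or MADEL syntax.

        import struct

        # Invented description: an ordered list of (field name, struct format code).
        description = [
            ("sync_marker", ">I"),
            ("sequence",    ">H"),
            ("temperature", ">f"),
            ("voltage",     ">f"),
        ]

        def parse(description, blob):
            """Generic parser: walks the description and unpacks the matching
            bytes, returning a dictionary of named fields."""
            record, offset = {}, 0
            for name, fmt in description:
                (record[name],) = struct.unpack_from(fmt, blob, offset)
                offset += struct.calcsize(fmt)
            return record

    Only the description changes when a new data format is introduced; the parser and any tools built on it stay the same, which is the economy the CCSDS approach aims for.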

  7. Two way time transfer results at NRL and USNO

    NASA Technical Reports Server (NTRS)

    Galysh, Ivan J.; Landis, G. Paul

    1993-01-01

    The Naval Research Laboratory (NRL) has developed a two-way time transfer modem system for the United States Naval Observatory (USNO). Two modems, in conjunction with a pair of Very Small Aperture Terminals (VSATs) and a communication satellite, can achieve sub-nanosecond time transfer. This performance is demonstrated by the results of testing at and between NRL and USNO. The modems use Code Division Multiple Access (CDMA) methods to separate their signals through a single path in the satellite. Each modem transmitted a different Pseudo-Random Noise (PRN) code and received the other's PRN code. High-precision time transfer is possible with two-way methods because many of the path and hardware delay terms between the two modems are reciprocal and cancel. The hardware description was given in a previous paper.
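
    The key relation behind two-way time transfer (standard textbook form, in our notation rather than anything taken from the paper): if each modem records the transmit time of its own PRN code and the receive time of the other's on its local clock, the clock offset of station B relative to station A is

        \Delta t_{B-A} = \frac{(R_B - T_A) - (R_A - T_B)}{2} ,

    where T_A and T_B are the local transmit readings, R_A and R_B the local receive readings, and the one-way path and hardware delays cancel to the extent that the two directions are reciprocal; non-reciprocal delay terms are what limit the achievable accuracy.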

  8. FPCAS3D User's guide: A three dimensional full potential aeroelastic program, version 1

    NASA Technical Reports Server (NTRS)

    Bakhle, Milind A.

    1995-01-01

    The FPCAS3D computer code has been developed for aeroelastic stability analysis of bladed disks such as those in fans, compressors, turbines, propellers, or propfans. The aerodynamic analysis used in this code is based on the unsteady three-dimensional full potential equation which is solved for a blade row. The structural analysis is based on a finite-element model for each blade. Detailed explanations of the aerodynamic analysis, the numerical algorithms, and the aeroelastic analysis are not given in this report. This guide can be used to assist in the preparation of the input data required by the FPCAS3D code. A complete description of the input data is provided in this report. In addition, six examples, including inputs and outputs, are provided.

  9. FPCAS2D user's guide, version 1.0

    NASA Technical Reports Server (NTRS)

    Bakhle, Milind A.

    1994-01-01

    The FPCAS2D computer code has been developed for aeroelastic stability analysis of bladed disks such as those in fans, compressors, turbines, propellers, or propfans. The aerodynamic analysis used in this code is based on the unsteady two-dimensional full potential equation which is solved for a cascade of blades. The structural analysis is based on a two degree-of-freedom rigid typical section model for each blade. Detailed explanations of the aerodynamic analysis, the numerical algorithms, and the aeroelastic analysis are not given in this report. This guide can be used to assist in the preparation of the input data required by the FPCAS2D code. A complete description of the input data is provided in this report. In addition, four test cases, including inputs and outputs, are provided.

  10. Studies of Particle Wake Potentials in Plasmas

    NASA Astrophysics Data System (ADS)

    Ellis, Ian; Graziani, Frank; Glosli, James; Strozzi, David; Surh, Michael; Richards, David; Decyk, Viktor; Mori, Warren

    2011-10-01

    Fast Ignition studies require a detailed understanding of electron scattering, stopping, and energy deposition in plasmas with variable values for the number of particles within a Debye sphere. Presently there is disagreement in the literature concerning the proper description of these processes. Developing and validating proper descriptions requires studying the processes using first-principle electrostatic simulations and possibly including magnetic fields. We are using the particle-particle particle-mesh (PPPM) code ddcMD and the particle-in-cell (PIC) code BEPS to perform these simulations. As a starting point in our study, we examine the wake of a particle passing through a plasma in 3D electrostatic simulations performed with ddcMD and with BEPS using various cell sizes. In this poster, we compare the wakes we observe in these simulations with each other and predictions from Vlasov theory. Prepared by LLNL under Contract DE-AC52-07NA27344 and by UCLA under Grant DE-FG52-09NA29552.

  11. RAMONA-4B a computer code with three-dimensional neutron kinetics for BWR and SBWR system transient - user`s manual

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rohatgi, U.S.; Cheng, H.S.; Khan, H.J.

    This document is the User's Manual for the Boiling Water Reactor (BWR) and Simplified Boiling Water Reactor (SBWR) systems transient code RAMONA-4B. The code uses a three-dimensional neutron-kinetics model coupled with a multichannel, nonequilibrium, drift-flux, phase-flow model of the thermal hydraulics of the reactor vessel. The code is designed to analyze a wide spectrum of BWR core and system transients. Chapter 1 gives an overview of the code's capabilities and limitations; Chapter 2 describes the code's structure, lists major subroutines, and discusses the computer requirements. Chapter 3 covers the code, auxiliary codes, and instructions for running RAMONA-4B on Sun SPARC and IBM workstations. Chapter 4 contains component descriptions and detailed card-by-card input instructions. Chapter 5 provides samples of the tabulated output for the steady-state and transient calculations and discusses the plotting procedures for the steady-state and transient calculations. Three appendices contain important user and programmer information: lists of plot variables (Appendix A), listings of the input deck for the sample problem (Appendix B), and a description of the plotting program PAD (Appendix C). 24 refs., 18 figs., 11 tabs.

  12. User's manual for the BNW-II optimization code for dry/wet-cooled power plants

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Braun, D.J.; Bamberger, J.A.; Braun, D.J.

    1978-05-01

    This volume provides a listing of the BNW-II dry/wet ammonia heat rejection optimization code and is an appendix to Volume I which gives a narrative description of the code's algorithms as well as logic, input and output information.

  13. Towards a Local Integration of Theories: Codes and Praxeologies in the Case of Computer-Based Instruction

    ERIC Educational Resources Information Center

    Gellert, Uwe; Barbe, Joaquim; Espinoza, Lorena

    2013-01-01

    We report on the development of a "language of description" that facilitates an integrated analysis of classroom video data in terms of the quality of the teaching-learning process and the students' access to valued forms of mathematical knowledge. Our research setting is the introduction of software for teachers for improving the mathematical…

  14. Laser identification system based on acousto-optical barcode scanner principles

    NASA Astrophysics Data System (ADS)

    Khansuvarov, Ruslan A.; Korol, Georgy I.; Preslenev, Leonid N.; Bestugin, Aleksandr R.; Paraskun, Arthur S.

    2016-09-01

    The main purpose of a bar code in the modern world is the unique identification of a product, a service, or any of their features, which is why personal and stationary barcode scanners are so widely used. Important parameters of bar code scanners include reliability, accuracy of barcode recognition, response time, and performance. Most popular personal barcode scanners today contain a mechanical part, which severely degrades their reliability. A group of SUAI engineers has proposed a bar code scanner based on the acoustic deflection of a laser beam in crystals [RU patent No 156009, issued 4/16/2015]. Through the use of an acousto-optic deflector element, the scanner described by the SUAI engineers can be implemented in both a handheld and a stationary form factor. Being a wave electronic device, the acousto-optic element within the acousto-optic barcode scanner makes it possible to establish a clear mathematical link between the encoded bar code function and the intensity function received at the photodetector, which allows the bar code to be identified with high probability. This paper provides a description of the issued patent, the principles of operation based on mathematical analysis, and a description of the layout of the implemented scanner.

  15. Beam Instrument Development System

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    DOOLITTLE, LAWRENCE; HUANG, GANG; DU, QIANG

    Beam Instrumentation Development System (BIDS) is a collection of common support libraries and modules developed during a series of Low-Level Radio Frequency (LLRF) control and timing/synchronization projects. BIDS includes a collection of Hardware Description Language (HDL) libraries and software libraries. The BIDS can be used for the development of any FPGA-based system, such as LLRF controllers. HDL code in this library is generic and supports common Digital Signal Processing (DSP) functions, FPGA-specific drivers (high-speed serial link wrappers, clock generation, etc.), ADC/DAC drivers, Ethernet MAC implementation, etc.

  16. Real-time transmission of digital video using variable-length coding

    NASA Technical Reports Server (NTRS)

    Bizon, Thomas P.; Shalkhauser, Mary Jo; Whyte, Wayne A., Jr.

    1993-01-01

    Huffman coding is a variable-length lossless compression technique where data with a high probability of occurrence is represented with short codewords, while 'not-so-likely' data is assigned longer codewords. Compression is achieved when the high-probability levels occur so frequently that their benefit outweighs any penalty paid when a less likely input occurs. One instance where Huffman coding is extremely effective occurs when data is highly predictable and differential coding can be applied (as with a digital video signal). For that reason, it is desirable to apply this compression technique to digital video transmission; however, special care must be taken in order to implement a communication protocol utilizing Huffman coding. This paper addresses several of the issues relating to the real-time transmission of Huffman-coded digital video over a constant-rate serial channel. Topics discussed include data rate conversion (from variable to a fixed rate), efficient data buffering, channel coding, recovery from communication errors, decoder synchronization, and decoder architectures. A description of the hardware developed to execute Huffman coding and serial transmission is also included. Although this paper focuses on matters relating to Huffman-coded digital video, the techniques discussed can easily be generalized for a variety of applications which require transmission of variable-length data.
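
    A compact sketch of the Huffman construction described above, independent of the paper's hardware implementation: the least likely symbols are repeatedly merged, so high-probability symbols end up with short codewords. The symbol alphabet and probabilities below are invented; a real video coder would use measured statistics of the differentially coded signal.

        # Build a Huffman code for a small symbol alphabet using a priority queue.
        import heapq

        def huffman_code(probabilities):
            """Return {symbol: bitstring} for a dict of symbol probabilities."""
            # Heap entries: (probability, tiebreak, {symbol: partial codeword})
            heap = [(p, i, {sym: ""}) for i, (sym, p) in enumerate(probabilities.items())]
            heapq.heapify(heap)
            tiebreak = len(heap)
            while len(heap) > 1:
                p0, _, code0 = heapq.heappop(heap)   # the two least probable groups
                p1, _, code1 = heapq.heappop(heap)   # each get one more bit prepended
                merged = {s: "0" + c for s, c in code0.items()}
                merged.update({s: "1" + c for s, c in code1.items()})
                heapq.heappush(heap, (p0 + p1, tiebreak, merged))
                tiebreak += 1
            return heap[0][2]

        if __name__ == "__main__":
            probs = {"0": 0.55, "+1": 0.20, "-1": 0.15, "+2": 0.06, "-2": 0.04}
            for sym, word in sorted(huffman_code(probs).items(), key=lambda kv: len(kv[1])):
                print(f"{sym:>3} -> {word}")

    Decoding such a variable-length stream requires the synchronization and buffering measures the paper discusses, since a single bit error can desynchronize the decoder.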

  17. RADTRAD: A simplified model for RADionuclide Transport and Removal And Dose estimation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Humphreys, S.L.; Miller, L.A.; Monroe, D.K.

    1998-04-01

    This report documents the RADTRAD computer code developed for the U.S. Nuclear Regulatory Commission (NRC) Office of Nuclear Reactor Regulation (NRR) to estimate transport and removal of radionuclides and dose at selected receptors. The document includes a users' guide to the code, a description of the technical basis for the code, the quality assurance and code acceptance testing documentation, and a programmers' guide. The RADTRAD code can be used to estimate the containment release using either the NRC TID-14844 or NUREG-1465 source terms and assumptions, or a user-specified table. In addition, the code can account for a reduction in the quantity of radioactive material due to containment sprays, natural deposition, filters, and other natural and engineered safety features. The RADTRAD code uses a combination of tables and/or numerical models of source term reduction phenomena to determine the time-dependent dose at user-specified locations for a given accident scenario. The code system also provides the inventory, decay chain, and dose conversion factor tables needed for the dose calculation. The RADTRAD code can be used to assess occupational radiation exposures, typically in the control room; to estimate site boundary doses; and to estimate dose attenuation due to modification of a facility or accident sequence.

  18. 50 CFR Table 1c to Part 679 - Product Type Codes

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    50 Wildlife and Fisheries 11, 2011-10-01. Table 1c to Part 679 — Product Type Codes, pairing each product type code with a description. Fragments: Ancillary product — a product, such as ... the highest recovery rate. P — Reprocessed or rehandled product — a product, such as meal, that results ...

  19. acme: The Amendable Coal-Fire Modeling Exercise. A C++ Class Library for the Numerical Simulation of Coal-Fires

    NASA Astrophysics Data System (ADS)

    Wuttke, Manfred W.

    2017-04-01

    At LIAG, we use numerical models to develop and enhance understanding of coupled transport processes and to predict the dynamics of the system under consideration. Topics include geothermal heat utilization, subrosion processes, and spontaneous underground coal fires. Although the details make it inconvenient, if not impossible, to apply a single code implementation to all systems, their investigations go along similar paths: they all depend on the solution of coupled transport equations. We thus saw a need for a modular code system with open access for the various communities to maximize the shared synergistic effects. To this purpose we develop the oops! (open object-oriented parallel solutions) toolkit, a C++ class library for the numerical solution of mathematical models of coupled thermal, hydraulic and chemical processes. This is used to develop problem-specific libraries like acme (amendable coal-fire modeling exercise), a class library for the numerical simulation of coal fires, and applications like kobra (Kohlebrand, German for coal fire), a numerical simulation code for standard coal-fire models. The basic principle of the oops! code system is the provision of data types for the description of space- and time-dependent data fields, the description of terms of partial differential equations (PDEs), and their discretisation and solution methods. Coupling of different processes, each described by its particular PDE, is modeled by an automatic timescale-ordered operator-splitting technique. acme is a derived coal-fire-specific application library that depends on oops!. Specific functionalities of general interest that have been implemented and tested are assimilated into the main oops! library. Interfaces to external pre- and post-processing tools are easily implemented. The result is a construction kit which can be arbitrarily amended. With the kobra application, built with acme, we study the processes and propagation of shallow coal-seam fires, in particular in Xinjiang, China, and analyze and interpret results from lab experiments.
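
    A minimal sketch of operator splitting for two coupled processes, in the spirit of the coupling strategy described above. This is not oops!/acme code: the model (a 1-D heat equation with a crude temperature-dependent source) and all parameters are invented for illustration.

        # Operator-splitting sketch: advance a 1-D temperature field by treating
        # diffusion and a (coal-fire-like) reaction source as separate operators,
        # applied one after the other within each time step.
        import numpy as np

        nx, dx, dt, steps = 50, 1.0, 0.1, 200
        kappa, rate = 1.0, 0.02
        T = np.full(nx, 300.0)
        T[nx // 2] = 800.0          # hot spot

        def diffusion_step(T):
            lap = np.zeros_like(T)
            lap[1:-1] = (T[2:] - 2 * T[1:-1] + T[:-2]) / dx**2
            return T + dt * kappa * lap

        def reaction_step(T):
            # crude exothermic source, active only above an ignition temperature
            return T + dt * rate * np.where(T > 500.0, T - 500.0, 0.0)

        for _ in range(steps):
            T = reaction_step(diffusion_step(T))   # split operators, fixed order

        print(f"peak temperature after {steps} steps: {T.max():.1f} K")

    In the library described above, the order of such sub-steps would be chosen automatically from the timescales of the individual processes rather than fixed as here.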

  20. Program Descriptions for Interactive Signal and Pattern Analysis and Recognition System (ISPARS).

    DTIC Science & Technology

    1984-03-01

    Procedures for the ISPARS components developed at the David Taylor Naval Ship Research and Development Center (DTNSRDC), which are not documented in other ... an alphabetic character. Some commands may consist of a letter and one or two two-digit numbers, separated by a space as specified in the table. ... papers intended for internal use; they carry an identifying number which indicates their type and the numerical code of the originating department.

  1. Federal Logistics Information System (FLIS) Procedures Manual, Volume 4. Item Identification.

    DTIC Science & Technology

    1995-01-01

    Fragments of the item identification tables and transaction instructions: DRMS — Defense Reutilization and Marketing Service; FDM — Full Descriptive Method (Item Identification); DPSC — Defense Personnel Support Center ... under DIC KRE, return code AU ... segment mix of FLIS data. For international cataloging, only one Output Data Request Code may be used per ... KMR (Matching Reference-Screening) with either KFC (File Data Screening Results) ... NATO Maintenance and Supply Agency (NAMSA), the custodian for control ...

  2. Social Workers and the NASW "Code of Ethics": Belief, Behavior, Disjuncture

    ERIC Educational Resources Information Center

    DiFranks, Nikki Nelson

    2008-01-01

    A quantitative descriptive survey of a national sample of social workers (N = 206) examined discrepancies between belief in the NASW "Code of Ethics" and behavior in implementing the code and social workers' disjunctive distress (disjuncture) when belief and behavior are discordant. Relationships between setting and disjuncture and ethics…

  3. DRG coding practice: a nationwide hospital survey in Thailand.

    PubMed

    Pongpirul, Krit; Walker, Damian G; Rahman, Hafizur; Robinson, Courtland

    2011-10-31

    Diagnosis Related Group (DRG) payment is preferred by healthcare reform in various countries, but its implementation in resource-limited countries has not been fully explored. This study aimed (1) to compare the characteristics of hospitals in Thailand that were audited with those that were not and (2) to develop a simplified scale to measure hospital coding practice. A questionnaire survey was conducted of 920 hospitals in the Summary and Coding Audit Database (SCAD hospitals, all of which were audited in 2008 because of suspicious reports of possible DRG miscoding); the survey also included 390 non-SCAD hospitals. The questionnaire asked about general demographics of the hospitals and hospital coding structure and process, and also included a set of 63 opinion-oriented items on current hospital coding practice. Descriptive statistics and exploratory factor analysis (EFA) were used for data analysis. SCAD and non-SCAD hospitals differed in many aspects, especially the number of medical statisticians, the experience of medical statisticians and physicians, and the number of certified coders. Factor analysis revealed a simplified 3-factor, 20-item model to assess hospital coding practice and classify hospital intention. Hospital providers should not be assumed capable of producing high quality DRG codes, especially in resource-limited settings.

  4. FRAC-IN-THE-BOX utilization

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Collins, D.G.; West, J.T.

    FRAC-IN-THE-BOX is a computer code developed to calculate the fractions of rectangular parallelepiped mesh cell volumes that are intersected by combinatorial geometry type zones. The geometry description used in the code is a subset of the combinatorial geometry used in SABRINA. The input file may be read into SABRINA and three dimensional plots made of the input geometry. The volume fractions for those portions of the geometry that are too complicated to describe with the geometry routines provided in FRAC-IN-THE-BOX may be calculated in SABRINA and merged with the volume fractions computed for the remainder of the geometry. 21 figs., 1 tab.
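
    A sketch of the quantity FRAC-IN-THE-BOX computes, estimated here by Monte Carlo sampling for a single mesh cell intersected by a spherical zone. FRAC-IN-THE-BOX itself uses combinatorial-geometry routines rather than sampling, and the cell and sphere parameters below are invented.

        # Estimate the fraction of a rectangular mesh cell's volume that lies inside
        # a spherical combinatorial-geometry zone, by uniform random sampling.
        import numpy as np

        rng = np.random.default_rng(0)

        cell_lo = np.array([0.0, 0.0, 0.0])   # cell bounds (arbitrary units)
        cell_hi = np.array([2.0, 2.0, 2.0])
        sphere_center = np.array([1.0, 1.0, 1.0])
        sphere_radius = 1.5

        n = 200_000
        points = rng.uniform(cell_lo, cell_hi, size=(n, 3))
        inside = np.sum((points - sphere_center) ** 2, axis=1) <= sphere_radius**2
        print(f"volume fraction inside zone: {inside.mean():.4f}")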

  5. Designing a Common Interchange Format for Unit Data Using the Command and Control Information Exchange Data Model (C2IEDM) and XSLT

    DTIC Science & Technology

    2004-09-01

    ... </Equipment> <Equipment code="L44680"> <Description>LAUNCHER GRENADE SMOKE: SCREENING RP M250</Description> <Required ... </EquipmentPiecesOnHand> </UnitEquipment> <UnitEquipment> <EquipmentDescription>LAUNCHER GRENADE SMOKE: SCREENING RP M250</EquipmentDescription> ...

  6. A Conference on Spacecraft Charging Technology - 1978, held at U.S. Air Force Academy, Colorado Springs, Colorado, October 31 - November 2, 1978.

    DTIC Science & Technology

    1978-01-01

    ... applications of the code. NASCAP CODE DESCRIPTION: The NASCAP code is a finite-element spacecraft-charging simulation that is written in FORTRAN ... the transport code POEM (ref. 1) is applicable to arbitrary dielectrics, source spectra, and current time histories. The code calculations are illustrated by ...

  7. Interface requirements to couple thermal hydraulics codes to severe accident codes: ICARE/CATHARE

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Camous, F.; Jacq, F.; Chatelard, P.

    1997-07-01

    In order to describe with the same code the whole sequence of severe LWR accidents, up to vessel failure, the Institute of Protection and Nuclear Safety has performed a coupling of the severe accident code ICARE2 to the thermal-hydraulics code CATHARE2. The resulting code, ICARE/CATHARE, is designed to be as pertinent as possible in all the phases of the accident. This paper is mainly devoted to the description of the ICARE2-CATHARE2 coupling.

  8. Implementation of the FDTD method in cylindrical coordinates for dispersive materials: Modal study of C-shaped nano-waveguides

    NASA Astrophysics Data System (ADS)

    Kebci, Zahia; Belkhir, Abderrahmane; Mezeghrane, Abdelaziz; Lamrous, Omar; Baida, Fadi Issam

    2018-03-01

    The objective of this work is to develop a code based on the finite difference time domain method in cylindrical coordinates (CC-FDTD) that integrates the Drude Critical Points (DCP) model and to apply it to the study of a metallic C-shaped waveguide (CSWG). The integrated dispersion model allows an accurate description of noble metals in the optical range, and working in cylindrical coordinates is necessary to bypass the staircase effect induced by a Cartesian mesh, especially in the case of curved geometries. The CC-FDTD code developed as a part of this work is more general than the Body-Of-Revolution FDTD algorithm, which can only handle structures exhibiting complete cylindrical symmetry. An N-order CC-FDTD code is then derived and used to perform a parametric study of an infinitely long CSWG for nano-optic applications. Propagation losses and dispersion diagrams are given for different geometrical parameters.
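
    For reference, the Drude Critical Points permittivity that such a dispersion model implements is commonly written (in the form popularized by Vial and Laroche) as below; the abstract does not give the exact variant or parameter values used by the authors, so this should be read as the generic DCP form rather than the one coded in CC-FDTD.

        \[
        \varepsilon(\omega) = \varepsilon_{\infty}
          - \frac{\omega_{D}^{2}}{\omega^{2} + i\gamma\omega}
          + \sum_{p=1}^{2} A_{p}\,\Omega_{p}
            \left(
              \frac{e^{i\phi_{p}}}{\Omega_{p} - \omega - i\Gamma_{p}}
              + \frac{e^{-i\phi_{p}}}{\Omega_{p} + \omega + i\Gamma_{p}}
            \right)
        \]

    Here \(\omega_{D}\) and \(\gamma\) are the Drude plasma and damping frequencies, and \(A_{p}\), \(\Omega_{p}\), \(\phi_{p}\), \(\Gamma_{p}\) parameterize the two critical-point transitions.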

  9. Design oriented structural analysis

    NASA Technical Reports Server (NTRS)

    Giles, Gary L.

    1994-01-01

    Desirable characteristics and benefits of design oriented analysis methods are described and illustrated by presenting a synoptic description of the development and uses of the Equivalent Laminated Plate Solution (ELAPS) computer code. ELAPS is a design oriented structural analysis method which is intended for use in the early design of aircraft wing structures. Model preparation is minimized by using a few large plate segments to model the wing box structure. Computational efficiency is achieved by using a limited number of global displacement functions that encompass all segments over the wing planform. Coupling with other codes is facilitated since the output quantities such as deflections and stresses are calculated as continuous functions over the plate segments. Various aspects of the ELAPS development are discussed including the analytical formulation, verification of results by comparison with finite element analysis results, coupling with other codes, and calculation of sensitivity derivatives. The effectiveness of ELAPS for multidisciplinary design application is illustrated by describing its use in design studies of high speed civil transport wing structures.

  10. An Integrated Approach to Swept Wing Icing Simulation

    NASA Technical Reports Server (NTRS)

    Potapczuk, Mark G.; Broeren, Andy P.

    2017-01-01

    This paper describes the various elements of a simulation approach used to develop a database of ice shape geometries and the resulting aerodynamic performance data for a representative commercial transport wing model exposed to a variety of icing conditions. This effort included testing in the NASA Icing Research Tunnel, the Wichita State University Walter H. Beech Wind Tunnel, and the ONERA F1 Subsonic Wind Tunnel as well as the use of ice accretion codes, an inviscid design code, and computational fluid dynamics codes. Additionally, methods for capturing full three-dimensional ice shape geometries, geometry interpolation along the span of the wing, and creation of artificial ice shapes based upon that geometric data were developed for this effort. The icing conditions used for this effort were representative of actual ice shape encounter scenarios and run the gamut from ice roughness to full three-dimensional scalloped ice shapes. The effort is still underway so this paper is a status report of work accomplished to date and a description of the remaining elements of the effort.

  11. Users' Manual for Computer Code SPIRALI Incompressible, Turbulent Spiral Grooved Cylindrical and Face Seals

    NASA Technical Reports Server (NTRS)

    Walowit, Jed A.; Shapiro, Wilbur

    2005-01-01

    The SPIRALI code predicts the performance characteristics of incompressible cylindrical and face seals with or without the inclusion of spiral grooves. Performance characteristics include load capacity (for face seals), leakage flow, power requirements and dynamic characteristics in the form of stiffness, damping and apparent mass coefficients in 4 degrees of freedom for cylindrical seals and 3 degrees of freedom for face seals. These performance characteristics are computed as functions of seal and groove geometry, load or film thickness, running and disturbance speeds, fluid viscosity, and boundary pressures. A derivation of the equations governing the performance of turbulent, incompressible, spiral groove cylindrical and face seals along with a description of their solution is given. The computer codes are described, including an input description, sample cases, and comparisons with results of other codes.

  12. User's manual for the Heat Pipe Space Radiator design and analysis Code (HEPSPARC)

    NASA Technical Reports Server (NTRS)

    Hainley, Donald C.

    1991-01-01

    A heat pipe space radiator code (HEPSPARC) was written for the NASA Lewis Research Center and is used for the design and analysis of a radiator that is constructed from a pumped fluid loop that transfers heat to the evaporative section of heat pipes. This manual is designed to familiarize the user with this new code and to serve as a reference for its use. This manual documents the completed work and is intended to be the first step towards verification of the HEPSPARC code. Details are furnished to provide a description of all the requirements and variables used in the design and analysis of a combined pumped loop/heat pipe radiator system. A description of the subroutines used in the program is furnished for those interested in understanding its detailed workings.

  13. Communication About Maternal Breast Cancer With Children: A Qualitative Study.

    PubMed

    Huang, Xiaoyan; O'Connor, Margaret; Hu, Yan; Gao, Hongyun; Lee, Susan

    Communication with children is a major concern for mothers with breast cancer. Chinese people have specific understanding of cancer and death, which may affect their way of communication. The aim of this study is to explore how Chinese mothers with breast cancer communicate about their illness with their children. An interpretive description study was conducted. Forty mothers with nonterminal breast cancer in mainland China were interviewed individually. The data were analyzed using 3 steps of coding: free coding, descriptive coding, and interpretive coding. Four themes were identified: breaking the news, explaining to children, disclosing versus concealing, and information needs. Most Chinese mothers disclosed their diagnosis of breast cancer to their children mainly because it was impossible to conceal the truth. They explained illness in a factual manner; however, they tended to allow children to observe their physical changes and overhear conversations between adults. This was because they did not know how to communicate appropriately with their children, and they preferred to allow children to understand the event in a natural way. The communication about maternal breast cancer between mothers and children was influenced by traditional culture. Quantitative studies with large sample sizes should be conducted to compare the opinions of mothers of different characteristics and to investigate the factors predicting communication. Resources should be developed to help mothers with breast cancer communicate appropriately with their children about their illness. Healthcare professionals, especially nurses, need education to provide consultation services to these mothers and children.

  14. DAGON: a 3D Maxwell-Bloch code

    NASA Astrophysics Data System (ADS)

    Oliva, Eduardo; Cotelo, Manuel; Escudero, Juan Carlos; González-Fernández, Agustín.; Sanchís, Alberto; Vera, Javier; Vicéns, Sergio; Velarde, Pedro

    2017-05-01

    The amplification of UV radiation and high order harmonics (HOH) in plasmas is a subject of rising interest due to its potential applications in several fields, such as environment and security (detection at a distance), biology, materials science and industry (3D imaging), and atomic and plasma physics (pump-probe experiments). In order to develop these sources, it is necessary to properly understand the amplification process. Since the plasma is an inhomogeneous medium that changes with time, it is desirable to have a full time-dependent 3D description of the interaction of UV and XUV radiation with plasmas. For these reasons, at the Instituto de Fusión Nuclear we have developed DAGON, a 3D Maxwell-Bloch code capable of studying the full spatiotemporal structure of the amplification process described above.

  15. A novel multiple description scalable coding scheme for mobile wireless video transmission

    NASA Astrophysics Data System (ADS)

    Zheng, Haifeng; Yu, Lun; Chen, Chang Wen

    2005-03-01

    We propose in this paper a novel multiple description scalable coding (MDSC) scheme based on the in-band motion compensated temporal filtering (IBMCTF) technique in order to achieve high video coding performance and robust video transmission. The input video sequence is first split into equal-sized groups of frames (GOFs). Within a GOF, each frame is hierarchically decomposed by the discrete wavelet transform. Since there is a direct relationship between wavelet coefficients and what they represent in the image content after wavelet decomposition, we are able to reorganize the spatial orientation trees to generate multiple bit-streams and employ the SPIHT algorithm to achieve high coding efficiency. We have shown that multiple bit-stream transmission is very effective in combating error propagation in both Internet video streaming and mobile wireless video. Furthermore, we adopt the IBMCTF scheme to remove the redundancy between frames along the temporal direction using motion compensated temporal filtering; thus high coding performance and flexible scalability can be provided in this scheme. In order to make compressed video resilient to channel errors and to guarantee robust video transmission over mobile wireless channels, we add redundancy to each bit-stream and apply an error concealment strategy for lost motion vectors. Unlike traditional multiple description schemes, the integration of these techniques enables us to generate more than two bit-streams that may be more appropriate for multiple antenna transmission of compressed video. Simulation results on standard video sequences have shown that the proposed scheme provides a flexible tradeoff between coding efficiency and error resilience.
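
    A toy sketch of the multiple-description idea underlying the scheme above: a coefficient sequence is split into two descriptions, and a lost description is concealed by interpolating from the one that arrives. The wavelet/SPIHT and motion-compensation machinery of the actual MDSC scheme is omitted, and the data are invented.

        # Multiple-description sketch: polyphase (even/odd) split of a 1-D signal
        # into two descriptions; if one description is lost, conceal it by
        # interpolating neighbouring samples from the surviving description.
        import numpy as np

        signal = np.sin(np.linspace(0, 4 * np.pi, 64))        # stand-in for coefficients
        desc0, desc1 = signal[0::2], signal[1::2]              # two descriptions

        def reconstruct(d0, d1=None):
            out = np.empty(2 * len(d0))
            out[0::2] = d0
            if d1 is not None:                                  # both descriptions arrive
                out[1::2] = d1
            else:                                               # description 1 lost:
                out[1:-1:2] = 0.5 * (d0[:-1] + d0[1:])          # interpolate neighbours
                out[-1] = d0[-1]
            return out

        full = reconstruct(desc0, desc1)
        concealed = reconstruct(desc0, None)
        print("exact reconstruction error:", np.max(np.abs(full - signal)))
        print("single-description RMS error:", np.sqrt(np.mean((concealed - signal) ** 2)))

    The point of the trade-off discussed above is that each description alone gives acceptable quality, while receiving both restores the full-rate reconstruction.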

  16. An overview of data acquisition, signal coding and data analysis techniques for MST radars

    NASA Technical Reports Server (NTRS)

    Rastogi, P. K.

    1986-01-01

    An overview is given of the data acquisition, signal processing, and data analysis techniques that are currently in use with high power MST/ST (mesosphere stratosphere troposphere/stratosphere troposphere) radars. This review supplements the works of Rastogi (1983) and Farley (1984) presented at previous MAP workshops. A general description is given of data acquisition and signal processing operations, which are characterized on the basis of their disparate time scales. Signal coding is then discussed, with a brief description of frequently used codes and their limitations. Finally, several aspects of statistical data processing, such as signal statistics, power spectrum and autocovariance analysis, and outlier removal techniques, are discussed.
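
    As a small illustration of the kind of pulse coding used on such radars, the sketch below correlates a received series against a 13-bit Barker code to recover a weak echo; the code choice, noise level and delay are illustrative, and practical MST systems often use complementary code pairs, as the review notes in discussing code limitations.

        # Pulse-compression sketch: a 13-bit Barker-coded pulse is detected in noise
        # by correlating the received series with the transmitted code.
        import numpy as np

        rng = np.random.default_rng(1)
        barker13 = np.array([1, 1, 1, 1, 1, -1, -1, 1, 1, -1, 1, -1, 1], dtype=float)

        received = rng.normal(0.0, 0.5, size=200)            # noise-only background
        delay = 80
        received[delay:delay + len(barker13)] += barker13    # echo buried at sample 80

        # Matched-filter (correlation) output; the peak marks the echo range gate.
        compressed = np.correlate(received, barker13, mode="valid")
        peak = int(np.argmax(compressed))
        sidelobes = np.abs(np.delete(compressed, peak))
        print("detected delay:", peak)                                  # expected: 80
        print("peak-to-sidelobe ratio:", compressed[peak] / sidelobes.max())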

  17. Numerical Electromagnetics Code (NEC)-Method of Moments. A User-Oriented Computer Code for Analysis of the Electromagnetic Response of Antennas and Other Metal Structures. Part 1: Program Description-Theory. Part 2: Program Description-Code. Volume 1. Revised

    DTIC Science & Technology

    1981-01-01

    ... Reference Direction ... Network Ports. In either case, the port voltage may be related to the applied field on the segment by the constant ...

  18. Computation of the Genetic Code

    NASA Astrophysics Data System (ADS)

    Kozlov, Nicolay N.; Kozlova, Olga N.

    2018-03-01

    One of the problems in the development of the mathematical theory of the genetic code (summarized in [1], detailed in [2]) is the problem of the calculation of the genetic code. Similar problems were previously unknown and could be posed only in the 21st century. This work is devoted to one approach to solving this problem. For the first time, a detailed description is provided of the method of calculation of the genetic code, the idea of which was first published earlier [3]; the choice of one of the most important sets for the calculation was based on an earlier article [4]. Such a set of amino acids corresponds to a complete set of representations of the plurality of overlapping gene triples belonging to the same DNA strand. A separate issue was the initial point that triggers the iterative search over all codes represented by the initial data. Mathematical analysis has shown that the said set contains some ambiguities, which were found because of our proposed compressed representation of the set. As a result, the developed method of calculation was limited to two main stages of research, where in the first stage only part of the area was used in the calculations. The proposed approach will significantly reduce the amount of computation at each step in this complex discrete structure.

  19. The equation of state package FEOS for high energy density matter

    NASA Astrophysics Data System (ADS)

    Faik, Steffen; Tauschwitz, Anna; Iosilevskiy, Igor

    2018-06-01

    Adequate equation of state (EOS) data is of high interest in the growing field of high energy density physics and is especially essential for hydrodynamic simulation codes. The semi-analytical method used in the newly developed Frankfurt equation of state (FEOS) package provides easy and fast access to the EOS of - in principle - arbitrary materials. The code is based on the well known QEOS model (More et al., 1988; Young and Corey, 1995) and is a further development of the MPQeos code (Kemp and Meyer-ter Vehn, 1988; Kemp and Meyer-ter Vehn, 1998) from the Max-Planck-Institut für Quantenoptik (MPQ) in Garching, Germany. The list of features includes the calculation of homogeneous mixtures of chemical elements and the description of the liquid-vapor two-phase region with or without a Maxwell construction. Full flexibility of the package is assured by its structure: a program library provides the EOS with an interface designed for Fortran or C/C++ codes. Two additional software tools allow for the generation of EOS tables in different file output formats and for the calculation and visualization of isolines and Hugoniot shock adiabats. As an example, the EOS of fused silica (SiO2) is calculated and compared to experimental data and other EOS codes.

  20. Modeling transonic aerodynamic response using nonlinear systems theory for use with modern control theory

    NASA Technical Reports Server (NTRS)

    Silva, Walter A.

    1993-01-01

    The presentation begins with a brief description of the motivation and approach that has been taken for this research. This is followed by a description of the Volterra theory of nonlinear systems and of the CAP-TSD code, which is an aeroelastic, transonic CFD (Computational Fluid Dynamics) code. The application of the Volterra theory to a CFD model and, more specifically, to a CAP-TSD model of a rectangular wing with a NACA 0012 airfoil section is then presented.
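
    For context, the Volterra representation referred to above expresses the system response as a series of convolution integrals over the input; truncated to second order (weak nonlinearity) it takes the standard textbook form below, given here for orientation rather than quoted from the presentation itself.

        \[
        y(t) = h_{0}
          + \int_{0}^{t} h_{1}(t-\tau)\,u(\tau)\,d\tau
          + \int_{0}^{t}\!\int_{0}^{t} h_{2}(t-\tau_{1},\,t-\tau_{2})\,
              u(\tau_{1})\,u(\tau_{2})\,d\tau_{1}\,d\tau_{2} + \cdots
        \]

    Here \(u\) is the input (e.g. a prescribed pitching motion), \(y\) the unsteady aerodynamic response, and \(h_{0}, h_{1}, h_{2}\) the kernels identified from the CFD model.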

  1. Using MaxCompiler for the high level synthesis of trigger algorithms

    NASA Astrophysics Data System (ADS)

    Summers, S.; Rose, A.; Sanders, P.

    2017-02-01

    Firmware for FPGA trigger applications at the CMS experiment is conventionally written using hardware description languages such as Verilog and VHDL. MaxCompiler is an alternative, Java based, tool for developing FPGA applications which uses a higher level of abstraction from the hardware than a hardware description language. An implementation of the jet and energy sum algorithms for the CMS Level-1 calorimeter trigger has been written using MaxCompiler to benchmark against the VHDL implementation in terms of accuracy, latency, resource usage, and code size. A Kalman Filter track fitting algorithm has been developed using MaxCompiler for a proposed CMS Level-1 track trigger for the High-Luminosity LHC upgrade. The design achieves a low resource usage, and has a latency of 187.5 ns per iteration.

  2. A computer-based information system for epilepsy and electroencephalography.

    PubMed

    Finnerup, N B; Fuglsang-Frederiksen, A; Røssel, P; Jennum, P

    1999-08-01

    This paper describes a standardised computer-based information system for electroencephalography (EEG) focusing on epilepsy. The system was developed using a prototyping approach. It is based on international recommendations for EEG examination, interpretation and terminology, international guidelines for epidemiological studies on epilepsy and classification of epileptic seizures and syndromes and international classification of diseases. It is divided into: (1) clinical information and epilepsy relevant data; and (2) EEG data, which is hierarchically structured including description and interpretation of EEG. Data is coded but is supplemented with unrestricted text. The resulting patient database can be integrated with other clinical databases and with the patient record system and may facilitate clinical and epidemiological research and development of standards and guidelines for EEG description and interpretation. The system is currently used for teleconsultation between Gentofte and Lisbon.

  3. Moving Controlled Vocabularies into the Semantic Web

    NASA Astrophysics Data System (ADS)

    Thomas, R.; Lowry, R. K.; Kokkinaki, A.

    2015-12-01

    One of the issues with legacy oceanographic data formats is that the only tool available for describing what a measurement is and how it was made is a single metadata tag known as the parameter code. The British Oceanographic Data Centre (BODC) has been helping the international oceanographic community gain maximum benefit from this through a controlled vocabulary known as the BODC Parameter Usage Vocabulary (PUV). Over time this has grown to over 34,000 entries, some of which have preferred labels with over 400 bytes of descriptive information detailing what was measured and how. A decade ago the BODC pioneered making this information available in a more useful form with the implementation of a prototype vocabulary server (NVS) that referenced each 'parameter code' as a URL. This developed into the current server (NVS V2), in which the parameter URL resolves into an RDF document based on the SKOS data model which includes a list of resource URLs mapped to the 'parameter'. For example, the parameter code for a contaminant in biota, such as 'cadmium in Mytilus edulis', carries RDF triples leading to the entry for Mytilus edulis in the WoRMS ontology and for cadmium in the ChEBI ontology. By providing links into these external ontologies, the information captured in a 1980s parameter code now conforms to the Linked Data paradigm of the Semantic Web, vastly increasing the descriptive information accessible to a user. This presentation will describe the next steps along the road to the Semantic Web with the development of a SPARQL end point1 to expose the PUV plus the 190 other controlled vocabularies held in NVS. Whilst this is ideal for those fluent in SPARQL, most users require something a little more user-friendly, and so the NVS browser2 was developed over the end point to allow less technical users to query the vocabularies and navigate the NVS ontology. This tool integrates into an editor that allows vocabulary content to be manipulated by authorised users outside BODC. Having placed Linked Data tooling over a single SPARQL end point, the obvious future development for this system is to support semantic interoperability outside NVS by the incorporation of federated SPARQL end points in the USA and Australia during the ODIP II project. 1https://vocab.nerc.ac.uk/sparql 2 https://www.bodc.ac.uk/data/codes_and_formats/vocabulary_search/
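
    As an illustration of how such an endpoint can be queried programmatically, the snippet below asks the NVS SPARQL endpoint (URL as given in the abstract) for a few SKOS concepts and their preferred labels. The specific query is illustrative and simply assumes the vocabularies are published as SKOS concepts, per the description above.

        # Query the NERC Vocabulary Server SPARQL endpoint for a handful of SKOS
        # concepts and their preferred labels. Requires the SPARQLWrapper package.
        from SPARQLWrapper import SPARQLWrapper, JSON

        ENDPOINT = "https://vocab.nerc.ac.uk/sparql"   # endpoint cited in the abstract

        query = """
        PREFIX skos: <http://www.w3.org/2004/02/skos/core#>
        SELECT ?concept ?label
        WHERE {
          ?concept a skos:Concept ;
                   skos:prefLabel ?label .
        }
        LIMIT 10
        """

        sparql = SPARQLWrapper(ENDPOINT)
        sparql.setQuery(query)
        sparql.setReturnFormat(JSON)

        results = sparql.query().convert()
        for row in results["results"]["bindings"]:
            print(row["concept"]["value"], "-", row["label"]["value"])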

  4. A computational model for the prediction of jet entrainment in the vicinity of nozzle boattails (The BOAT code)

    NASA Technical Reports Server (NTRS)

    Dash, S. M.; Pergament, H. S.

    1978-01-01

    The basic code structure is discussed, including the overall program flow and a brief description of all subroutines. Instructions on the preparation of input data, definitions of key FORTRAN variables, sample input and output, and a complete listing of the code are presented.

  5. Integrating Bar-Code Medication Administration Competencies in the Curriculum: Implications for Nursing Education and Interprofessional Collaboration.

    PubMed

    Angel, Vini M; Friedman, Marvin H; Friedman, Andrea L

    This article describes an innovative project involving the integration of bar-code medication administration technology competencies in the nursing curriculum through interprofessional collaboration among nursing, pharmacy, and computer science disciplines. A description of the bar-code medication administration technology project and lessons learned are presented.

  6. Coding and transmission of subband coded images on the Internet

    NASA Astrophysics Data System (ADS)

    Wah, Benjamin W.; Su, Xiao

    2001-09-01

    Subband-coded images can be transmitted over the Internet using either the TCP or the UDP protocol. Delivery by TCP gives superior decoding quality but with very long delays when the network is unreliable, whereas delivery by UDP has negligible delays but degraded quality when packets are lost. Although images are currently delivered over the Internet by TCP, we study in this paper the use of UDP to deliver multi-description reconstruction-based subband-coded images. First, in order to facilitate recovery from UDP packet losses, we propose a joint sender-receiver approach for designing optimized reconstruction-based subband transforms (ORB-ST) in multi-description coding (MDC). Second, we carefully evaluate the delay-quality trade-offs between the TCP delivery of SDC images and the UDP and combined TCP/UDP delivery of MDC images. Experimental results show that our proposed ORB-ST performs well in real Internet tests, and UDP and combined TCP/UDP delivery of MDC images provide a range of attractive alternatives to TCP delivery.

  7. High-Performance First-Principles Molecular Dynamics for Predictive Theory and Modeling

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gygi, Francois; Galli, Giulia; Schwegler, Eric

    This project focused on developing high-performance software tools for First-Principles Molecular Dynamics (FPMD) simulations, and applying them in investigations of materials relevant to energy conversion processes. FPMD is an atomistic simulation method that combines a quantum-mechanical description of electronic structure with the statistical description provided by molecular dynamics (MD) simulations. This reliance on fundamental principles allows FPMD simulations to provide a consistent description of structural, dynamical and electronic properties of a material. This is particularly useful in systems for which reliable empirical models are lacking. FPMD simulations are increasingly used as a predictive tool for applications such as batteries, solar energy conversion, light-emitting devices, electro-chemical energy conversion devices and other materials. During the course of the project, several new features were developed and added to the open-source Qbox FPMD code. The code was further optimized for scalable operation of large-scale, Leadership-Class DOE computers. When combined with Many-Body Perturbation Theory (MBPT) calculations, this infrastructure was used to investigate structural and electronic properties of liquid water, ice, aqueous solutions, nanoparticles and solid-liquid interfaces. Computing both ionic trajectories and electronic structure in a consistent manner enabled the simulation of several spectroscopic properties, such as Raman spectra, infrared spectra, and sum-frequency generation spectra. The accuracy of the approximations used allowed for direct comparisons of results with experimental data such as optical spectra, X-ray and neutron diffraction spectra. The software infrastructure developed in this project, as applied to various investigations of solids, liquids and interfaces, demonstrates that FPMD simulations can provide a detailed, atomic-scale picture of structural, vibrational and electronic properties of complex systems relevant to energy conversion devices.

  8. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dobromir Panayotov; Andrew Grief; Brad J. Merrill

    'Fusion for Energy' (F4E) develops designs and implements the European Test Blanket Systems (TBS) in ITER - Helium-Cooled Lithium-Lead (HCLL) and Helium-Cooled Pebble-Bed (HCPB). Safety demonstration is an essential element for the integration of TBS in ITER, and accident analyses are one of its critical segments. A systematic approach to the accident analyses was developed under the F4E contract on TBS safety analyses. F4E technical requirements and AMEC and INL efforts resulted in the development of a comprehensive methodology for fusion breeding blanket accident analyses. It addresses the specificity of the breeding blanket designs, materials and phenomena, and at the same time is consistent with the methodology already applied to ITER accident analyses. The methodology consists of several phases. First, the reference scenarios are selected on the basis of FMEA studies. Second, in elaborating the accident analysis specifications, we use phenomena identification and ranking tables to identify the requirements to be met by the code(s) and the TBS models. In this way the limitations of the codes are identified and possible solutions to be built into the models are proposed. These include, among others, the loose coupling of different codes or code versions in order to simulate multi-fluid flows and phenomena. The code selection and the issue of the accident analysis specifications conclude this second step. The breeding blanket and ancillary system models are then built. In this work, challenges met and solutions used in the development of both MELCOR and RELAP5 code models of the HCLL and HCPB TBSs will be shared. Next, the developed models are qualified by comparison with finite element analyses, by code-to-code comparison and by sensitivity studies. Finally, the qualified models are used for the execution of the accident analyses of specific scenarios. Where possible, the methodology phases will be illustrated in the paper by a limited number of tables and figures. Descriptions of each phase and its results in detail, as well as the methodology's application to the EU HCLL and HCPB TBSs, will be published in separate papers. The developed methodology is applicable to accident analyses of other TBSs to be tested in ITER as well as to DEMO breeding blankets.

  9. The VATES-Diamond as a Verifier's Best Friend

    NASA Astrophysics Data System (ADS)

    Glesner, Sabine; Bartels, Björn; Göthel, Thomas; Kleine, Moritz

    Within a model-based software engineering process it needs to be ensured that properties of abstract specifications are preserved by transformations down to executable code. This is even more important in the area of safety-critical real-time systems where additionally non-functional properties are crucial. In the VATES project, we develop formal methods for the construction and verification of embedded systems. We follow a novel approach that allows us to formally relate abstract process algebraic specifications to their implementation in a compiler intermediate representation. The idea is to extract a low-level process algebraic description from the intermediate code and to formally relate it to previously developed abstract specifications. We apply this approach to a case study from the area of real-time operating systems and show that this approach has the potential to seamlessly integrate modeling, implementation, transformation and verification stages of embedded system development.

  10. A description of the new 3D electron gun and collector modeling tool: MICHELLE

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Petillo, J.; Mondelli, A.; Krueger, W.

    1999-07-01

    A new 3D finite element gun and collector modeling code is under development at SAIC in collaboration with industrial partners and national laboratories. This development program has been designed specifically to address the shortcomings of current simulation and modeling tools. In particular, although there are 3D gun codes that exist today, their ability to address fine scale features is somewhat limited in 3D due to disparate length scales of certain classes of devices. Additionally, features like advanced emission rules, including thermionic Child's law and comprehensive secondary emission models, also need attention. The program specifically targets problem classes including gridded guns, sheet-beam guns, multi-beam devices, and anisotropic collectors. The presentation will provide an overview of the program objectives, the approach to be taken by the development team, and a status of the project.

  11. U.S. Naval Research Laboratory Final Analysis Report to NATO Above Water Warfare Capabilities Group 2016 Naval Electromagnetic Operations Trials

    DTIC Science & Technology

    2017-05-23

    Systems and the NRL Code 5763 Radio Frequency (RF) Stimulator. It includes and covers system descriptions, setup, data collection, and test goals that ... 4. Test Asset Descriptions ... 4.1. Description of FOXTROT Anti-ship Missile (ASM) Simulator ...

  12. A program code generator for multiphysics biological simulation using markup languages.

    PubMed

    Amano, Akira; Kawabata, Masanari; Yamashita, Yoshiharu; Rusty Punzalan, Florencio; Shimayoshi, Takao; Kuwabara, Hiroaki; Kunieda, Yoshitoshi

    2012-01-01

    To cope with the complexity of biological function simulation models, model representation with description languages is becoming popular. However, the simulation software itself becomes complex in such environments, and thus it is difficult to modify the simulation conditions, target computation resources, or calculation methods. A complex biological function simulation involves (1) model equations, (2) boundary conditions, and (3) calculation schemes. Use of a model description file is helpful for the first point and partly for the second; the third point, however, is difficult to handle, because various calculation schemes are required for simulation models constructed from two or more elementary models. We introduce a simulation software generation system which uses a description-language-based specification of the coupling calculation scheme together with the cell model description file. Using this software, we can easily generate biological simulation code with a variety of coupling calculation schemes. To show the efficiency of our system, an example of a coupling calculation scheme with three elementary models is shown.
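
    A toy sketch of the description-driven generation idea: a coupled model and its calculation scheme, described as data, are turned into runnable simulation code by a generator. The 'description' format and the Euler-stepping template below are invented for illustration and are far simpler than the markup languages handled by the actual system.

        # Code-generator sketch: emit a Python time-stepping function from a simple,
        # data-structure "description" of coupled model equations.
        # The description format is invented; real systems use CellML/SBML-like markup.
        description = {
            "state": {"u": 1.0, "v": 0.0},
            "equations": {                      # right-hand sides as expressions
                "u": "-0.5 * u + 0.1 * v",
                "v": "0.5 * u - 0.2 * v",
            },
            "scheme": {"method": "explicit_euler", "dt": 0.01, "steps": 100},
        }

        def generate_source(desc):
            dt, steps = desc["scheme"]["dt"], desc["scheme"]["steps"]
            names = list(desc["state"])
            lines = [f"def simulate({', '.join(names)}):"]
            lines.append(f"    for _ in range({steps}):")
            for n in names:
                lines.append(f"        d_{n} = {desc['equations'][n]}")
            for n in names:
                lines.append(f"        {n} = {n} + {dt} * d_{n}")
            lines.append(f"    return {', '.join(names)}")
            return "\n".join(lines)

        source = generate_source(description)
        namespace = {}
        exec(source, namespace)                              # compile the generated code
        print(source)
        print(namespace["simulate"](**description["state"]))

    Changing the scheme or the equations in the description regenerates the simulation code without touching the generator, which is the flexibility argued for above.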

  13. Deploying electromagnetic particle-in-cell (EM-PIC) codes on Xeon Phi accelerators boards

    NASA Astrophysics Data System (ADS)

    Fonseca, Ricardo

    2014-10-01

    The complexity of the phenomena involved in several relevant plasma physics scenarios, where highly nonlinear and kinetic processes dominate, makes purely theoretical descriptions impossible. Further understanding of these scenarios requires detailed numerical modeling, but fully relativistic particle-in-cell codes such as OSIRIS are computationally intensive. The quest towards Exaflop computer systems has led to the development of HPC systems based on add-on accelerator cards, such as GPGPUs and, more recently, the Xeon Phi accelerators that power the current number 1 system in the world. These cards, also referred to as the Intel Many Integrated Core (MIC) architecture, offer peak theoretical performances of >1 TFlop/s for general purpose calculations in a single board, and are receiving significant attention as an attractive alternative to CPUs for plasma modeling. In this work we report on our efforts towards the deployment of an EM-PIC code on a Xeon Phi architecture system. We will focus on the parallelization and vectorization strategies followed, and present a detailed evaluation of code performance in comparison with the CPU code.

  14. Using the Astrophysics Source Code Library

    NASA Astrophysics Data System (ADS)

    Allen, Alice; Teuben, P. J.; Berriman, G. B.; DuPrie, K.; Hanisch, R. J.; Mink, J. D.; Nemiroff, R. J.; Shamir, L.; Wallin, J. F.

    2013-01-01

    The Astrophysics Source Code Library (ASCL) is a free on-line registry of source codes that are of interest to astrophysicists; with over 500 codes, it is the largest collection of scientist-written astrophysics programs in existence. All ASCL source codes have been used to generate results published in or submitted to a refereed journal and are available either via a download site or from an identified source. An advisory committee formed in 2011 provides input and guides the development and expansion of the ASCL, and since January 2012, all accepted ASCL entries are indexed by ADS. Though software is increasingly important for the advancement of science in astrophysics, these methods are still often hidden from view or difficult to find. The ASCL (ascl.net/) seeks to improve the transparency and reproducibility of research by making these vital methods discoverable, and to provide recognition and incentive to those who write and release programs useful for astrophysics research. This poster provides a description of the ASCL, an update on recent additions, and the changes in the astrophysics community we are starting to see because of the ASCL.

  15. Instrument Remote Control Application Framework

    NASA Technical Reports Server (NTRS)

    Ames, Troy; Hostetter, Carl F.

    2006-01-01

    The Instrument Remote Control (IRC) architecture is a flexible, platform-independent application framework that is well suited for the control and monitoring of remote devices and sensors. IRC enables significant savings in development costs by utilizing Extensible Markup Language (XML) descriptions to configure the framework for a specific application. The Instrument Markup Language (IML) is used to describe the commands used by an instrument, the data streams produced, the rules for formatting commands and parsing the data, and the method of communication. Often no custom code is needed to communicate with a new instrument or device. An IRC instance can advertise and publish a description about a device or subscribe to another device's description on a network. This simple capability of dynamically publishing and subscribing to interfaces enables a very flexible, self-adapting architecture for monitoring and control of complex instruments in diverse environments.
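
    To illustrate the style of XML-driven configuration described above, the sketch below parses a small instrument description and builds a command formatter from it. The element and attribute names are hypothetical stand-ins invented for this sketch; they are not the real IML schema.

        # Parse a hypothetical IML-like instrument description and use it to format
        # a command string. Element/attribute names are invented for illustration.
        import xml.etree.ElementTree as ET

        IML_SNIPPET = """
        <instrument name="demo-spectrometer">
          <command name="setGain" format="GAIN {level:d}">
            <argument name="level" type="int" min="0" max="7"/>
          </command>
        </instrument>
        """

        def build_formatter(xml_text, command_name):
            root = ET.fromstring(xml_text)
            cmd = root.find(f"./command[@name='{command_name}']")
            fmt = cmd.get("format")
            limits = {a.get("name"): (int(a.get("min")), int(a.get("max")))
                      for a in cmd.findall("argument")}
            def format_command(**kwargs):
                for name, value in kwargs.items():
                    lo, hi = limits[name]
                    if not lo <= value <= hi:
                        raise ValueError(f"{name}={value} outside [{lo}, {hi}]")
                return fmt.format(**kwargs)
            return format_command

        set_gain = build_formatter(IML_SNIPPET, "setGain")
        print(set_gain(level=5))     # -> GAIN 5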

  16. Regional Atmospheric Transport Code for Hanford Emission Tracking (RATCHET). Hanford Environmental Dose Reconstruction Project

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ramsdell, J.V. Jr.; Simonen, C.A.; Burk, K.W.

    1994-02-01

    The purpose of the Hanford Environmental Dose Reconstruction (HEDR) Project is to estimate radiation doses that individuals may have received from operations at the Hanford Site since 1944. This report deals specifically with the atmospheric transport model, Regional Atmospheric Transport Code for Hanford Emission Tracking (RATCHET). RATCHET is a major rework of the MESOILT2 model used in the first phase of the HEDR Project; only the bookkeeping framework escaped major changes. Changes to the code include (1) significant changes in the representation of atmospheric processes and (2) incorporation of Monte Carlo methods for representing uncertainty in input data, model parameters, and coefficients. To a large extent, the revisions to the model are based on recommendations of a peer working group that met in March 1991. Technical bases for other portions of the atmospheric transport model are addressed in two other documents. This report has three major sections: a description of the model, a user's guide, and a programmer's guide. These sections discuss RATCHET from three different perspectives. The first provides a technical description of the code with emphasis on details such as the representation of the model domain, the data required by the model, and the equations used to make the model calculations. The technical description is followed by a user's guide to the model with emphasis on running the code. The user's guide contains information about the model input and output. The third section is a programmer's guide to the code. It discusses the hardware and software required to run the code. The programmer's guide also discusses program structure and each of the program elements.

  17. Solar Proton Transport Within an ICRU Sphere Surrounded by a Complex Shield: Ray-trace Geometry

    NASA Technical Reports Server (NTRS)

    Slaba, Tony C.; Wilson, John W.; Badavi, Francis F.; Reddell, Brandon D.; Bahadori, Amir A.

    2015-01-01

    A computationally efficient 3DHZETRN code with enhanced neutron and light ion (Z is less than or equal to 2) propagation was recently developed for complex, inhomogeneous shield geometry described by combinatorial objects. Comparisons were made between 3DHZETRN results and Monte Carlo (MC) simulations at locations within the combinatorial geometry, and it was shown that 3DHZETRN agrees with the MC codes to the extent they agree with each other. In the present report, the 3DHZETRN code is extended to enable analysis in ray-trace geometry. This latest extension enables the code to be used within current engineering design practices utilizing fully detailed vehicle and habitat geometries. Through convergence testing, it is shown that fidelity in an actual shield geometry can be maintained in the discrete ray-trace description by systematically increasing the number of discrete rays used. It is also shown that this fidelity is carried into transport procedures and resulting exposure quantities without sacrificing computational efficiency.
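
    A small sketch of the convergence idea described above: a directional shield property seen from a target point is averaged over an increasing number of discrete rays until the value settles. The shield here is an invented analytic shell rather than a real vehicle geometry, and random ray directions stand in for the systematic ray sets used by the actual code.

        # Ray-trace convergence sketch: average areal thickness seen from a point
        # inside a simple shield (a shell whose thickness varies with direction),
        # computed over increasing numbers of sampled ray directions.
        import numpy as np

        rng = np.random.default_rng(2)

        def shell_thickness(directions):
            # Invented shield: thickness varies smoothly with direction (g/cm^2).
            x, y, z = directions.T
            return 5.0 + 2.0 * z**2 + 1.0 * np.sin(3.0 * np.arctan2(y, x))

        def mean_thickness(n_rays):
            v = rng.normal(size=(n_rays, 3))
            v /= np.linalg.norm(v, axis=1, keepdims=True)   # random unit vectors
            return shell_thickness(v).mean()

        for n in (10, 100, 1000, 10000, 100000):
            print(f"{n:>7d} rays -> mean thickness {mean_thickness(n):.3f}")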

  18. Solar proton exposure of an ICRU sphere within a complex structure part II: Ray-trace geometry.

    PubMed

    Slaba, Tony C; Wilson, John W; Badavi, Francis F; Reddell, Brandon D; Bahadori, Amir A

    2016-06-01

    A computationally efficient 3DHZETRN code with enhanced neutron and light ion (Z ≤ 2) propagation was recently developed for complex, inhomogeneous shield geometry described by combinatorial objects. Comparisons were made between 3DHZETRN results and Monte Carlo (MC) simulations at locations within the combinatorial geometry, and it was shown that 3DHZETRN agrees with the MC codes to the extent they agree with each other. In the present report, the 3DHZETRN code is extended to enable analysis in ray-trace geometry. This latest extension enables the code to be used within current engineering design practices utilizing fully detailed vehicle and habitat geometries. Through convergence testing, it is shown that fidelity in an actual shield geometry can be maintained in the discrete ray-trace description by systematically increasing the number of discrete rays used. It is also shown that this fidelity is carried into transport procedures and resulting exposure quantities without sacrificing computational efficiency. Published by Elsevier Ltd.

  19. Optimized aerodynamic design process for subsonic transport wing fitted with winglets. [wind tunnel model

    NASA Technical Reports Server (NTRS)

    Kuhlman, J. M.

    1979-01-01

    The aerodynamic design of a wind-tunnel model of a wing representative of that of a subsonic jet transport aircraft, fitted with winglets, was performed using two recently developed optimal wing-design computer programs. Both potential flow codes use a vortex lattice representation of the near-field of the aerodynamic surfaces for determination of the required mean camber surfaces for minimum induced drag, and both codes use far-field induced drag minimization procedures to obtain the required spanloads. One code uses a discrete vortex wake model for this far-field drag computation, while the second uses a 2-D advanced panel wake model. Wing camber shapes for the two codes are very similar, but the resulting winglet camber shapes differ widely. Design techniques and considerations for these two wind-tunnel models are detailed, including a description of the necessary modifications of the design geometry to format it for use by a numerically controlled machine for the actual model construction.

  20. Description and use of LSODE, the Livermore Solver for Ordinary Differential Equations

    NASA Technical Reports Server (NTRS)

    Radhakrishnan, Krishnan; Hindmarsh, Alan C.

    1993-01-01

    LSODE, the Livermore Solver for Ordinary Differential Equations, is a package of FORTRAN subroutines designed for the numerical solution of the initial value problem for a system of ordinary differential equations. It is particularly well suited for 'stiff' differential systems, for which the backward differentiation formula method of orders 1 to 5 is provided. The code includes the Adams-Moulton method of orders 1 to 12, so it can be used for nonstiff problems as well. In addition, the user can easily switch methods to increase computational efficiency for problems that change character. For both methods a variety of corrector iteration techniques is included in the code. Also, to minimize computational work, both the step size and method order are varied dynamically. This report presents complete descriptions of the code and integration methods, including their implementation. It also provides a detailed guide to the use of the code, as well as an illustrative example problem.
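
    LSODE itself is distributed as Fortran; as a hedged modern stand-in, the sketch below exercises the same method choices (BDF for stiff problems, automatic Adams/BDF switching) through SciPy's solve_ivp on the classic stiff Robertson kinetics problem. SciPy's integrators descend from the same ODEPACK lineage, but this is not the LSODE interface described in the report.

      from scipy.integrate import solve_ivp

      def robertson(t, y):
          # Classic stiff chemical-kinetics test problem.
          y1, y2, y3 = y
          return [-0.04 * y1 + 1.0e4 * y2 * y3,
                  0.04 * y1 - 1.0e4 * y2 * y3 - 3.0e7 * y2 ** 2,
                  3.0e7 * y2 ** 2]

      y0 = [1.0, 0.0, 0.0]
      span = (0.0, 1.0e5)

      # BDF plays the role of the stiff (backward differentiation formula) option.
      stiff = solve_ivp(robertson, span, y0, method="BDF", rtol=1e-6, atol=1e-10)
      # LSODA switches between Adams and BDF as the problem changes character.
      auto = solve_ivp(robertson, span, y0, method="LSODA", rtol=1e-6, atol=1e-10)

      print("BDF   steps:", stiff.t.size, "final state:", stiff.y[:, -1])
      print("LSODA steps:", auto.t.size, "final state:", auto.y[:, -1])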

  1. Physical Model for the Evolution of the Genetic Code

    NASA Astrophysics Data System (ADS)

    Yamashita, Tatsuro; Narikiyo, Osamu

    2011-12-01

    Using the shape space of codons and tRNAs we give a physical description of the genetic code evolution on the basis of the codon capture and ambiguous intermediate scenarios in a consistent manner. In the lowest dimensional version of our description, a physical quantity, the codon level, is introduced. In terms of the codon levels the two scenarios are classified into two different routes of the evolutionary process. In the case of the ambiguous intermediate scenario we perform an evolutionary simulation that implements cost-based selection of amino acids and confirm a rapid transition of the code change. Such rapidity mitigates the problem of non-unique translation of the code at the intermediate state, which is the weakness of the scenario. In the case of the codon capture scenario the survival against mutations under a mutational pressure minimizing GC content in genomes is simulated, and it is demonstrated that cells which experience only neutral mutations survive.

  2. Users manual and modeling improvements for axial turbine design and performance computer code TD2-2

    NASA Technical Reports Server (NTRS)

    Glassman, Arthur J.

    1992-01-01

    Computer code TD2 computes design point velocity diagrams and performance for multistage, multishaft, cooled or uncooled, axial flow turbines. This streamline analysis code was recently modified to upgrade modeling related to turbine cooling and to the internal loss correlation. These modifications are presented in this report along with descriptions of the code's expanded input and output. This report serves as the users manual for the upgraded code, which is named TD2-2.

  3. Description of Transport Codes for Space Radiation Shielding

    NASA Technical Reports Server (NTRS)

    Kim, Myung-Hee Y.; Wilson, John W.; Cucinotta, Francis A.

    2011-01-01

    This slide presentation describes transport codes and their use for studying and designing space radiation shielding. When combined with risk projection models, radiation transport codes serve as the main tool for studying radiation and designing shielding. There are three criteria for assessing the accuracy of transport codes: (1) ground-based studies with defined beams and material layouts, (2) inter-comparison of transport code results for matched boundary conditions, and (3) comparisons to flight measurements. NASA's HZETRN/QMSFRG codes meet these three criteria to a very high degree.

  4. Verification of Gyrokinetic codes: Theoretical background and applications

    NASA Astrophysics Data System (ADS)

    Tronko, Natalia; Bottino, Alberto; Görler, Tobias; Sonnendrücker, Eric; Told, Daniel; Villard, Laurent

    2017-05-01

    In fusion plasmas, the strong magnetic field allows the fast gyro-motion to be systematically removed from the description of the dynamics, resulting in a considerable model simplification and gain of computational time. Nowadays, the gyrokinetic (GK) codes play a major role in the understanding of the development and the saturation of turbulence and in the prediction of the subsequent transport. Naturally, these codes require thorough verification and validation. Here, we present a new and generic theoretical framework and specific numerical applications to test the faithfulness of the implemented models to theory and to verify the domain of applicability of existing GK codes. For a sound verification process, the underlying theoretical GK model and the numerical scheme must be considered at the same time, which has rarely been done and therefore makes this approach pioneering. At the analytical level, the main novelty consists in using advanced mathematical tools such as variational formulation of dynamics for systematization of basic GK code's equations to access the limits of their applicability. The verification of the numerical scheme is proposed via the benchmark effort. In this work, specific examples of code verification are presented for two GK codes: the multi-species electromagnetic ORB5 (PIC) and the radially global version of GENE (Eulerian). The proposed methodology can be applied to any existing GK code. We establish a hierarchy of reduced GK Vlasov-Maxwell equations implemented in the ORB5 and GENE codes using the Lagrangian variational formulation. At the computational level, detailed verifications of global electromagnetic test cases developed from the CYCLONE Base Case are considered, including a parametric β-scan covering the transition from ITG to KBM and the spectral properties at the nominal β value.

  5. Biaxial experiments supporting the development of constitutive theories for advanced high-temperature materials

    NASA Technical Reports Server (NTRS)

    Ellis, J. R.

    1988-01-01

    Complex states of stress and strain are introduced into components during service in engineering applications. It follows that analysis of such components requires material descriptions, or constitutive theories, which reflect the tensorial nature of stress and strain. For applications involving stress levels above yield, the situation is more complex in that material response is both nonlinear and history dependent. This has led to the development of viscoplastic constitutive theories which introduce time by expressing the flow and evolutionary equations in the form of time derivatives. Models were developed here which can be used to analyze high temperature components manufactured from advanced composite materials. In parallel with these studies, effort was directed at developing multiaxial testing techniques to verify the various theories. Recent progress in the development of constitutive theories from both the theoretical and experimental viewpoints is outlined. One important aspect is the verification of material descriptions for advanced composite materials which can be implemented in general-purpose finite element codes and used for practical design.

  6. NASTRAN hydroelastic modal studies. Volume 2: Programmer documentation

    NASA Technical Reports Server (NTRS)

    1977-01-01

    The operational steps, data descriptions, and program code for the new NASTRAN hydroelastic analysis system are described. The overall flow of the system is described, followed by the descriptions of the individual modules and their subroutines.

  7. Multiple descriptions based on multirate coding for JPEG 2000 and H.264/AVC.

    PubMed

    Tillo, Tammam; Baccaglini, Enrico; Olmo, Gabriella

    2010-07-01

    Multiple description coding (MDC) makes use of redundant representations of multimedia data to achieve resiliency. Descriptions should be generated so that the quality obtained when decoding a subset of them only depends on their number and not on the particular received subset. In this paper, we propose a method based on the principle of encoding the source at several rates, and properly blending the data encoded at different rates to generate the descriptions. The aim is to achieve efficient redundancy exploitation, and easy adaptation to different network scenarios by means of fine tuning of the encoder parameters. We apply this principle to both JPEG 2000 images and H.264/AVC video data. We consider as the reference scenario the distribution of contents on application-layer overlays with multiple-tree topology. The experimental results reveal that our method favorably compares with state-of-art MDC techniques.
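
    A minimal sketch of the blending principle, with coarse/fine scalar quantization standing in for the two encoding rates and two descriptions; the block layout, step sizes, and parity-based blending rule are assumptions for illustration, not the JPEG 2000 / H.264 implementation evaluated in the paper.

      import numpy as np

      def quantize(x, step):
          return np.round(x / step) * step

      def make_descriptions(blocks, fine=0.05, coarse=0.4):
          # Blend data coded at two rates: each description carries a finely coded
          # copy of every other block and a coarsely coded copy of the rest.
          d0, d1 = [], []
          for i, b in enumerate(blocks):
              hi, lo = quantize(b, fine), quantize(b, coarse)
              d0.append(hi if i % 2 == 0 else lo)
              d1.append(lo if i % 2 == 0 else hi)
          return d0, d1

      def decode(d0=None, d1=None):
          # Use the finer copy when both descriptions arrive; use whatever arrived otherwise.
          n = len(d0 if d0 is not None else d1)
          out = []
          for i in range(n):
              if d0 is not None and d1 is not None:
                  out.append(d0[i] if i % 2 == 0 else d1[i])   # pick the finely coded copy
              else:
                  out.append(d0[i] if d0 is not None else d1[i])
          return np.array(out)

      rng = np.random.default_rng(0)
      source = rng.normal(size=16)
      d0, d1 = make_descriptions(source)
      for name, rec in [("both", decode(d0, d1)), ("only d0", decode(d0=d0)), ("only d1", decode(d1=d1))]:
          print(f"{name:8s} MSE = {np.mean((rec - source) ** 2):.4f}")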

  8. Coding "Corrective Recasts": The Maintenance of Meaning and More Fundamental Problems

    ERIC Educational Resources Information Center

    Hauser, Eric

    2005-01-01

    A fair amount of descriptive research in the field of second language acquisition has looked at the presence of what have been labeled corrective recasts. This research has relied on the methodological practice of coding to identify particular turns as "corrective recasts." Often, the coding criteria make use of the notion of the maintenance of…

  9. Prior-to-Secondary School Course Classification System: School Codes for the Exchange of Data (SCED). NFES 2011-801

    ERIC Educational Resources Information Center

    National Forum on Education Statistics, 2011

    2011-01-01

    In this handbook, "Prior-to-Secondary School Course Classification System: School Codes for the Exchange of Data" (SCED), the National Center for Education Statistics (NCES) and the National Forum on Education Statistics have extended the existing secondary course classification system with codes and descriptions for courses offered at…

  10. Application of ATHLET/DYN3D coupled codes system for fast liquid metal cooled reactor steady state simulation

    NASA Astrophysics Data System (ADS)

    Ivanov, V.; Samokhin, A.; Danicheva, I.; Khrennikov, N.; Bouscuet, J.; Velkov, K.; Pasichnyk, I.

    2017-01-01

    In this paper the approaches used to develop the BN-800 reactor test model and to validate coupled neutron-physics and thermohydraulic calculations are described. The coupled codes ATHLET 3.0 (a code for thermohydraulic calculations of reactor transients) and DYN3D (a 3-dimensional neutron kinetics code) are used for the calculations. The main calculation results for the reactor steady-state condition are provided. The 3-D model used for the neutron calculations was developed for the initial BN-800 core load. A homogeneous approach is used to describe the reactor assemblies. Along with the main simplifications, the main BN-800 core zones are described (LEZ, MEZ, HEZ, MOX, blankets). The 3-D neutron physics calculations were performed with a 28-group library based on the ENDF/B-7.0 evaluated nuclear data. The SCALE code was used to prepare the group constants. The nodalized hydraulic model applies boundary conditions on coolant mass flow rate at the core inlet and on pressure and enthalpy at the core outlet, which can be chosen depending on the reactor state. Core inlet and outlet temperatures were chosen according to the reactor nominal state. The profiling of coolant mass flow rate through the core is based on the reactor power distribution. Test thermohydraulic calculations made with the developed model showed acceptable results for the coolant mass flow distribution through the reactor core and for the axial temperature and pressure distributions. The developed model will be upgraded in the future for analysis of different transients in metal-cooled fast reactors of the BN type, including reactivity transients (control rod withdrawal, stop of the main circulation pump, etc.).

  11. Sierra Toolkit Manual Version 4.48.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sierra Toolkit Team

    This report provides documentation for the SIERRA Toolkit (STK) modules. STK modules are intended to provide infrastructure that assists the development of computational engineering software such as finite-element analysis applications. STK includes modules for unstructured-mesh data structures, reading/writing mesh files, geometric proximity search, and various utilities. This document contains a chapter for each module, and each chapter contains overview descriptions and usage examples. Usage examples are primarily code listings which are generated from working test programs that are included in the STK code-base. A goal of this approach is to ensure that the usage examples will not fall out of date.

  12. Geometry creation for MCNP by Sabrina and XSM

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Van Riper, K.A.

    The Monte Carlo N-Particle transport code MCNP is based on a surface description of 3-dimensional geometry. Cells are defined in terms of boolean operations on signed quadratic surfaces. MCNP geometry is entered as a card image file containing coefficients of the surface equations and a list of surfaces and operators describing cells. Several programs are available to assist in creation of the geometry specification, among them Sabrina and the new "Smart Editor" code XSM. We briefly describe geometry creation in Sabrina and then discuss XSM in detail. XSM is under development; our discussion is based on the state of XSM as of January 1, 1994.
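
    A toy sketch of the surface-sense idea that underlies this geometry style: a cell is a Boolean combination of half-spaces given by the sign of quadric surface equations. The coefficients and the example cell (a finite cylinder) are illustrative; this is neither MCNP's card format nor code from Sabrina or XSM.

      def quadric(coeff, p):
          # Evaluate A x^2 + B y^2 + C z^2 + D x + E y + F z + G at point p.
          A, B, C, D, E, F, G = coeff
          x, y, z = p
          return A*x*x + B*y*y + C*z*z + D*x + E*y + F*z + G

      # Surfaces: an infinite cylinder of radius 2 about the z axis and planes z = 0, z = 5.
      cyl  = (1, 1, 0, 0, 0, 0, -4.0)   # x^2 + y^2 - 4 = 0
      z_lo = (0, 0, 0, 0, 0, 1, 0.0)    # z = 0
      z_hi = (0, 0, 0, 0, 0, 1, -5.0)   # z = 5

      def in_cell(p):
          # Cell = negative sense of the cylinder AND above z_lo AND below z_hi.
          return quadric(cyl, p) < 0 and quadric(z_lo, p) > 0 and quadric(z_hi, p) < 0

      print(in_cell((1.0, 1.0, 2.5)))   # True: inside the finite cylinder
      print(in_cell((3.0, 0.0, 2.5)))   # False: outside radially
      print(in_cell((0.0, 0.0, 6.0)))   # False: above the top plane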

  13. 50 CFR Table 2d to Part 679 - Species Codes-Non-FMP Species

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... description Code Arctic char, anadromous 521 Dolly varden, anadromous 531 Eels or eel-like fish 210 Eel, wolf... Arctic surf 812 Cockle 820 Eastern softshell 842 Pacific geoduck 815 Pacific littleneck 840 Pacific razor...

  14. Coding update of the SMFM definition of low risk for cesarean delivery from ICD-9-CM to ICD-10-CM.

    PubMed

    Armstrong, Joanne; McDermott, Patricia; Saade, George R; Srinivas, Sindhu K

    2017-07-01

    In 2015, the Society for Maternal-Fetal Medicine developed a low risk for cesarean delivery definition based on administrative claims-based diagnosis codes described by the International Classification of Diseases, Ninth Revision, Clinical Modification. The Society for Maternal-Fetal Medicine definition is a clinical enrichment of 2 available measures from the Joint Commission and the Agency for Healthcare Research and Quality measures. The Society for Maternal-Fetal Medicine measure excludes diagnosis codes that represent clinically relevant risk factors that are absolute or relative contraindications to vaginal birth while retaining diagnosis codes such as labor disorders that are discretionary risk factors for cesarean delivery. The introduction of the International Statistical Classification of Diseases, 10th Revision, Clinical Modification in October 2015 expanded the number of available diagnosis codes and enabled a greater depth and breadth of clinical description. These coding improvements further enhance the clinical validity of the Society for Maternal-Fetal Medicine definition and its potential utility in tracking progress toward the goal of safely lowering the US cesarean delivery rate. This report updates the Society for Maternal-Fetal Medicine definition of low risk for cesarean delivery using International Statistical Classification of Diseases, 10th Revision, Clinical Modification coding. Copyright © 2017. Published by Elsevier Inc.
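
    In practice such a claims-based definition is applied by screening each delivery's diagnosis codes against exclusion and inclusion lists. The sketch below shows that screening step only; the ICD-10-CM codes are placeholders, not the Society for Maternal-Fetal Medicine's published lists.

      # Placeholder, abbreviated code set for illustration only; this is NOT the
      # SMFM definition's actual exclusion list.
      EXCLUSION_CODES = {"O32.1XX0", "O44.03", "O34.21"}   # contraindication examples

      def low_risk_for_cesarean(diagnosis_codes):
          # Low risk if no excluded (contraindication) diagnosis is present;
          # discretionary labor-disorder codes are retained, as in the definition.
          return not (set(diagnosis_codes) & EXCLUSION_CODES)

      print(low_risk_for_cesarean(["O62.0"]))             # True: discretionary code only
      print(low_risk_for_cesarean(["O62.0", "O44.03"]))   # False: exclusion present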

  15. DRG coding practice: a nationwide hospital survey in Thailand

    PubMed Central

    2011-01-01

    Background Diagnosis Related Group (DRG) payment is preferred by healthcare reform in various countries but its implementation in resource-limited countries has not been fully explored. Objectives This study was aimed (1) to compare the characteristics of hospitals in Thailand that were audited with those that were not and (2) to develop a simplified scale to measure hospital coding practice. Methods A questionnaire survey was conducted of 920 hospitals in the Summary and Coding Audit Database (SCAD hospitals, all of which were audited in 2008 because of suspicious reports of possible DRG miscoding); the questionnaire also included 390 non-SCAD hospitals. The questionnaire asked about general demographics of the hospitals, hospital coding structure and process, and also included a set of 63 opinion-oriented items on the current hospital coding practice. Descriptive statistics and exploratory factor analysis (EFA) were used for data analysis. Results SCAD and Non-SCAD hospitals were different in many aspects, especially the number of medical statisticians, experience of medical statisticians and physicians, as well as number of certified coders. Factor analysis revealed a simplified 3-factor, 20-item model to assess hospital coding practice and classify hospital intention. Conclusion Hospital providers should not be assumed capable of producing high quality DRG codes, especially in resource-limited settings. PMID:22040256

  16. DESCRIPTION OF THE SAN RAFAEL PROGRAM FOR MORE ABLE LEARNERS AS PRESCRIBED IN THE CALIFORNIA ADMINISTRATIVE CODE, ARTICLE 23, SPECIALLY EDUCATIONAL PROGRAMS FOR MENTALLY GIFTED MINORS.

    ERIC Educational Resources Information Center

    SHORE, ROBERT E.

    THE SAN RAFAEL MORE ABLE LEARNER CURRICULUM WAS GEARED TO A SELECT GROUP OF ELEMENTARY SCHOOL STUDENTS. IT ATTEMPTED "TO DEEPEN APPRECIATIONS, ATTITUDES, AND UNDERSTANDINGS THROUGH INCREASED KNOWLEDGE OF THE ARTS AND SCIENCES, AND TO DEVELOP PROFICIENCIES AND SKILLS IN SELECTED AREAS IN THE ARTS AND SCIENCES." THE CURRICULUM OFFERED A…

  17. Descriptive Summaries of the Research, Development, Test and Evaluation, Army Appropriation. Supporting Data FY 1994, Budget Estimates Submitted to Congress, April 1993

    DTIC Science & Technology

    1993-04-01

    determining effective group functioning, leader-group interaction, and decision making; (2) factors that determine effective, low error human performance...infectious disease and biological defense vaccines and drugs, vision, neurotoxins, neurochemistry, molecular neurobiology, neurodegenerative diseases...Potential Rotor/Comprehensive Analysis Model for Rotor Aerodynamics-Johnson Aeronautics (FPR/CAMRAD-JA) code to predict Blade Vortex Interaction (BVI

  18. Analysis of Compton continuum measurements

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gold, R.; Olson, I. K.

    1970-01-01

    Five computer programs: COMPSCAT, FEND, GABCO, DOSE, and COMPLOT, have been developed and used for the analysis and subsequent reduction of measured energy distributions of Compton recoil electrons to continuous gamma spectra. In addition to detailed descriptions of these computer programs, the relationship amongst these codes is stressed. The manner in which these programs function is illustrated by tracing a sample measurement through a complete cycle of the data-reduction process.

  19. Microcode Verification Project.

    DTIC Science & Technology

    1980-05-01

    numerical constant. The internal syntax for these minimum and maximum values is REALMIN and REALMAX. ISPSSIMP ISPSSIMP is the file simplifying bitstring...To be fair, it is quite clear that much of the labor in the verification task can be reduced if verification and code development are carried out...the language we have chosen for both encoding our descriptions of machines and reasoning about the course of computations. Internally, our

  20. An expanded framework to define and measure shared decision-making in dialogue: A 'top-down' and 'bottom-up' approach.

    PubMed

    Callon, Wynne; Beach, Mary Catherine; Links, Anne R; Wasserman, Carly; Boss, Emily F

    2018-03-11

    We aimed to develop a comprehensive, descriptive framework to measure shared decision making (SDM) in clinical encounters. We combined a top-down (theoretical) approach with a bottom-up approach based on audio-recorded dialogue to identify all communication processes related to decision making. We coded 55 pediatric otolaryngology visits using the framework and report interrater reliability. We identified 14 clinician behaviors and 5 patient behaviors that have not been previously described, and developed a new SDM framework that is descriptive (what does happen) rather than normative (what should happen). Through the bottom-up approach we identified three broad domains not present in other SDM frameworks: socioemotional support, understandability of clinician dialogue, and recommendation-giving. We also specify the ways in which decision-making roles are assumed implicitly rather than discussed explicitly. Interrater reliability was >75% for 92% of the coded behaviors. This SDM framework allows for a more expansive understanding and analysis of how decision making takes place in clinical encounters, including new domains and behaviors not present in existing measures. We hope that this new framework will bring attention to a broader conception of SDM and allow researchers to further explore the new domains and behaviors identified. Copyright © 2018. Published by Elsevier B.V.
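
    The interrater reliability reported above is, per coded behavior, a simple percent agreement between two coders; the sketch below computes it for made-up presence/absence ratings and adds Cohen's kappa as a chance-corrected companion. The ratings are invented for illustration.

      from collections import Counter

      def percent_agreement(rater_a, rater_b):
          return sum(a == b for a, b in zip(rater_a, rater_b)) / len(rater_a)

      def cohens_kappa(rater_a, rater_b):
          # Chance-corrected agreement for two raters on categorical codes.
          n = len(rater_a)
          p_o = percent_agreement(rater_a, rater_b)
          ca, cb = Counter(rater_a), Counter(rater_b)
          p_e = sum((ca[k] / n) * (cb[k] / n) for k in set(ca) | set(cb))
          return (p_o - p_e) / (1 - p_e)

      # Hypothetical presence/absence codes for one behavior across 10 visits.
      a = [1, 0, 1, 1, 0, 1, 0, 0, 1, 1]
      b = [1, 0, 1, 0, 0, 1, 0, 1, 1, 1]
      print(f"agreement = {percent_agreement(a, b):.0%}, kappa = {cohens_kappa(a, b):.2f}")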

  1. Computationally efficient description of relativistic electron beam transport in dense plasma

    NASA Astrophysics Data System (ADS)

    Polomarov, Oleg; Sefkov, Adam; Kaganovich, Igor; Shvets, Gennady

    2006-10-01

    A reduced model of the Weibel instability and electron beam transport in dense plasma is developed. Beam electrons are modeled by macro-particles and the background plasma is represented by an electron fluid. Conservation of generalized vorticity and quasineutrality of the plasma-beam system are used to simplify the governing equations. Our approach is motivated by the conditions of the FI scenario, where the beam density is likely to be much smaller than the plasma density and the beam energy is likely to be very high. For this case the growth rate of the Weibel instability is small, making its simulation with conventional PIC codes exceedingly time consuming. The present approach does not require resolving the plasma period, resolves only the plasma collisionless skin depth, and is suitable for modeling the long-time behavior of the beam-plasma interaction. An efficient code based on this reduced description is developed and benchmarked against the LSP PIC code. The dynamics of low- and high-current electron beams in dense plasma is simulated. Special emphasis is placed on the peculiarities of the nonlinear stages, such as filament formation and merging, saturation, and post-saturation field and energy oscillations. *Supported by DOE Fusion Science through grant DE-FG02-05ER54840.

  2. [Changes for rheumatology in the G-DRG system 2005].

    PubMed

    Fiori, W; Roeder, N; Lakomek, H-J; Liman, W; Köneke, N; Hülsemann, J L; Lehmann, H; Wenke, A

    2005-02-01

    The German prospective payment system G-DRG has recently been adapted and recalculated. Apart from the adjustments of the G-DRG classification system itself, changes in the legal framework, such as the extension of the "convergence period" or the limitation of budget losses due to DRG introduction, have to be considered. Especially the introduction of new procedure codes (OPS) describing the specialized and complex rheumatologic treatment of inpatients might be of significant importance. Even though these procedures will not yet influence the grouping process in 2005, they will enable a more accurate description of the efforts of acute-rheumatologic treatment, which can be used for further adaptations of the DRG algorithm. Numerous newly introduced additive payment components (ZE) result in a more adequate description of the "DRG products". Although not increasing the individual hospital budget, these additive payments contribute to more transparency of high-cost services and can be addressed separately from the DRG budget. Furthermore, a number of other relevant changes to the G-DRG catalogue, the classification systems ICD-10-GM and OPS-301, and the German Coding Standards (DKR) are presented.

  3. A lifting surface computer code with jet-in-crossflow interference effects. Volume 1: Theoretical description

    NASA Technical Reports Server (NTRS)

    Furlong, K. L.; Fearn, R. L.

    1983-01-01

    A method is proposed to combine a numerical description of a jet in a crossflow with a lifting surface panel code to calculate the jet/aerodynamic-surface interference effects on a V/STOL aircraft. An iterative technique is suggested that starts with a model for the properties of a jet/flat plate configuration and modifies these properties based on the flow field calculated for the configuration of interest. The method would estimate the pressures, forces, and moments on an aircraft out of ground effect. A first-order approximation to the method suggested is developed and applied to two simple configurations. The first-order approximation is a noniterative procedure which does not allow for interactions between multiple jets in a crossflow and also does not account for the influence of lifting surfaces on the jet properties. The jet/flat plate model utilized in the examples presented is restricted to a uniform round jet injected perpendicularly into a uniform crossflow for a range of jet-to-crossflow velocity ratios from three to ten.

  4. Description and Codification of Miscanthus × giganteus Growth Stages for Phenological Assessment

    PubMed Central

    Tejera, Mauricio D.; Heaton, Emily A.

    2017-01-01

    Triploid Miscanthus × giganteus (Greef et Deu. ex Hodkinson et Renvoize) is a sterile, perennial grass used for biomass production in temperate environments. While M. × giganteus has been intensively researched, a scale standardizing description of M. × giganteus morphological stages has not been developed. Here we provide such a scale by adapting the widely-used Biologische Bundesanstalt, Bundessortenamt, CHemische Industrie (BBCH) scale and its corresponding numerical code to describe stages of morphological development in M. × giganteus using observations of the “Freedom” and “Illinois” clone in Iowa, USA. Descriptive keys with images are also presented. Because M. × giganteus plants overlap in the field, the scale was first applied to individual stems and then scaled up to assess plants or communities. Of the 10 principal growth stages in the BBCH system, eight were observed in M. × giganteus. Each principal stage was subdivided into secondary stages to enable a detailed description of developmental progression. While M. × giganteus does not have seed development stages, descriptions of those stages are provided to extend the scale to other Miscanthus genotypes. We present methods to use morphological development data to assess phenology by calculating the onset, duration, and abundance of each developmental stage. This scale has potential to harmonize previously described study-specific scales and standardize results across studies. Use of the precise staging presented here should more tightly constrain estimates of developmental parameters in crop models and increase the efficacy of timing-sensitive crop management practices like pest control and harvest. PMID:29062320
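
    A rough sketch of the stem-to-plot scale-up and the onset/duration bookkeeping described above: stem-level stage codes per survey date are turned into a plot-level abundance per principal stage, and a stage's onset is the first date its abundance clears a threshold. The stage codes, dates, and threshold are invented for illustration and do not come from the published scale.

      # Hypothetical survey: for each date, the BBCH-style stage code of each observed stem.
      observations = {
          "2016-05-15": [12, 12, 13, 14],
          "2016-06-15": [14, 15, 15, 16],
          "2016-07-15": [30, 31, 31, 32],
      }

      def stage_abundance(stems, principal):
          # Fraction of stems whose principal stage (tens digit of the code) matches.
          return sum(1 for s in stems if s // 10 == principal) / len(stems)

      def onset_and_duration(principal, threshold=0.25):
          dates = [d for d, stems in observations.items()
                   if stage_abundance(stems, principal) >= threshold]
          if not dates:
              return None, 0
          return min(dates), len(dates)   # onset date and number of survey dates observed

      for stage in (1, 3):
          print(stage, onset_and_duration(stage))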

  5. Study of Two-Dimensional Compressible Non-Acoustic Modeling of Stirling Machine Type Components

    NASA Technical Reports Server (NTRS)

    Tew, Roy C., Jr.; Ibrahim, Mounir B.

    2001-01-01

    A two-dimensional (2-D) computer code was developed for modeling enclosed volumes of gas with oscillating boundaries, such as Stirling machine components. An existing 2-D incompressible flow computer code, CAST, was used as the starting point for the project. CAST was modified to use the compressible non-acoustic Navier-Stokes equations to model an enclosed volume including an oscillating piston. The devices modeled have low Mach numbers and are sufficiently small that the time required for acoustics to propagate across them is negligible. Therefore, acoustics were excluded to enable more time efficient computation. Background information about the project is presented. The compressible non-acoustic flow assumptions are discussed. The governing equations used in the model are presented in transport equation format. A brief description is given of the numerical methods used. Comparisons of code predictions with experimental data are then discussed.

  6. Implementation of non-condensable gases condensation suppression model into the WCOBRA/TRAC-TF2 LOCA safety evaluation code

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Liao, J.; Cao, L.; Ohkawa, K.

    2012-07-01

    The non-condensable gases condensation suppression model is important for a realistic LOCA safety analysis code. A condensation suppression model for direct contact condensation was previously developed by Westinghouse using first principles. The model is believed to be an accurate description of the direct contact condensation process in the presence of non-condensable gases. The Westinghouse condensation suppression model is further revised by applying a more physical model. The revised condensation suppression model is thus implemented into the WCOBRA/TRAC-TF2 LOCA safety evaluation code for both the 3-D module (COBRA-TF) and the 1-D module (TRAC-PF1). A parametric study using the revised Westinghouse condensation suppression model is conducted. Additionally, the performance of the non-condensable gases condensation suppression model is examined in the ACHILLES (ISP-25) separate effects test and the LOFT L2-5 (ISP-13) integral effects test. (authors)

  7. Preconditioning for Numerical Simulation of Low Mach Number Three-Dimensional Viscous Turbomachinery Flows

    NASA Technical Reports Server (NTRS)

    Tweedt, Daniel L.; Chima, Rodrick V.; Turkel, Eli

    1997-01-01

    A preconditioning scheme has been implemented into a three-dimensional viscous computational fluid dynamics code for turbomachine blade rows. The preconditioning allows the code, originally developed for simulating compressible flow fields, to be applied to nearly-incompressible, low Mach number flows. A brief description is given of the compressible Navier-Stokes equations for a rotating coordinate system, along with the preconditioning method employed. Details about the conservative formulation of artificial dissipation are provided, and different artificial dissipation schemes are discussed and compared. The preconditioned code was applied to a well-documented case involving the NASA large low-speed centrifugal compressor for which detailed experimental data are available for comparison. Performance and flow field data are compared for the near-design operating point of the compressor, with generally good agreement between computation and experiment. Further, significant differences between computational results for the different numerical implementations, revealing different levels of solution accuracy, are discussed.

  8. CUDA Fortran acceleration for the finite-difference time-domain method

    NASA Astrophysics Data System (ADS)

    Hadi, Mohammed F.; Esmaeili, Seyed A.

    2013-05-01

    A detailed description of programming the three-dimensional finite-difference time-domain (FDTD) method to run on graphical processing units (GPUs) using CUDA Fortran is presented. Two FDTD-to-CUDA thread-block mapping designs are investigated and their performances compared. Comparative assessment of trade-offs between GPU's shared memory and L1 cache is also discussed. This presentation is for the benefit of FDTD programmers who work exclusively with Fortran and are reluctant to port their codes to C in order to utilize GPU computing. The derived CUDA Fortran code is compared with an optimized CPU version that runs on a workstation-class CPU to present a realistic GPU to CPU run time comparison and thus help in making better informed investment decisions on FDTD code redesigns and equipment upgrades. All analyses are mirrored with CUDA C simulations to put in perspective the present state of CUDA Fortran development.

  9. Single-channel voice-response-system program documentation volume I : system description

    DOT National Transportation Integrated Search

    1977-01-01

    This report documents the design and implementation of a Voice Response System (VRS) using Adaptive Differential Pulse Code Modulation (ADPCM) voice coding. Implemented on a Digital Equipment Corporation PDP-11/20, this VRS system supports a single ...

  10. UFO: A THREE-DIMENSIONAL NEUTRON DIFFUSION CODE FOR THE IBM 704

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Auerbach, E.H.; Jewett, J.P.; Ketchum, M.A.

    A description of UFO, a code for the solution of the few-group neutron diffusion equation in three-dimensional Cartesian coordinates on the IBM 704, is given. An accelerated Liebmann flux iteration scheme is used, and optimum parameters can be calculated by the code whenever they are required. The theory and operation of the program are discussed. (auth)
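
    The "accelerated Liebmann" iteration is what is now usually called successive over-relaxation (SOR). As a hedged illustration, the sketch below applies SOR to a one-group, two-dimensional fixed-source diffusion problem on a square with zero-flux boundaries; the grid, cross sections, and relaxation factor are arbitrary, and this is a much-reduced stand-in for the three-dimensional few-group solver described.

      import numpy as np

      # One-group 2-D diffusion  -D lap(phi) + sig_a phi = S  with phi = 0 on the boundary.
      n, h = 40, 1.0
      D, sig_a, S = 1.0, 0.05, 1.0
      phi = np.zeros((n, n))
      omega = 1.7                     # over-relaxation factor ("accelerated Liebmann" parameter)
      diag = 4.0 * D / h**2 + sig_a   # diagonal of the 5-point stencil

      for sweep in range(2000):
          max_change = 0.0
          for i in range(1, n - 1):
              for j in range(1, n - 1):
                  nbr = D / h**2 * (phi[i+1, j] + phi[i-1, j] + phi[i, j+1] + phi[i, j-1])
                  gauss_seidel = (S + nbr) / diag
                  new = phi[i, j] + omega * (gauss_seidel - phi[i, j])   # SOR update
                  max_change = max(max_change, abs(new - phi[i, j]))
                  phi[i, j] = new
          if max_change < 1e-6:
              break

      print(f"stopped after {sweep + 1} sweeps, peak flux = {phi.max():.3f}")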

  11. 50 CFR Table 2c to Part 679 - Species Codes: FMP Forage Fish Species (all species of the following families)

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 50 Wildlife and Fisheries 13 2014-10-01 2014-10-01 false Species Codes: FMP Forage Fish Species (all species of the following families) 2c Table 2c to Part 679 Wildlife and Fisheries FISHERY...: FMP Forage Fish Species (all species of the following families) Species Description Code Bristlemouths...

  12. 50 CFR Table 2c to Part 679 - Species Codes: FMP Forage Fish Species (all species of the following families)

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... 50 Wildlife and Fisheries 13 2013-10-01 2013-10-01 false Species Codes: FMP Forage Fish Species (all species of the following families) 2c Table 2c to Part 679 Wildlife and Fisheries FISHERY...: FMP Forage Fish Species (all species of the following families) Species Description Code Bristlemouths...

  13. 50 CFR Table 2c to Part 679 - Species Codes: FMP Forage Fish Species (all species of the following families)

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... 50 Wildlife and Fisheries 13 2012-10-01 2012-10-01 false Species Codes: FMP Forage Fish Species (all species of the following families) 2c Table 2c to Part 679 Wildlife and Fisheries FISHERY...: FMP Forage Fish Species (all species of the following families) Species Description Code Bristlemouths...

  14. 50 CFR Table 2c to Part 679 - Species Codes: FMP Forage Fish Species (all species of the following families)

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 50 Wildlife and Fisheries 11 2011-10-01 2011-10-01 false Species Codes: FMP Forage Fish Species (all species of the following families) 2c Table 2c to Part 679 Wildlife and Fisheries FISHERY...: FMP Forage Fish Species (all species of the following families) Species Description Code Bristlemouths...

  15. 50 CFR Table 2c to Part 679 - Species Codes: FMP Forage Fish Species (all species of the following families)

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 50 Wildlife and Fisheries 9 2010-10-01 2010-10-01 false Species Codes: FMP Forage Fish Species (all species of the following families) 2c Table 2c to Part 679 Wildlife and Fisheries FISHERY...: FMP Forage Fish Species (all species of the following families) Species Description Code Bristlemouths...

  16. Improvements to a method for the geometrically nonlinear analysis of compressively loaded stiffened composite panels

    NASA Technical Reports Server (NTRS)

    Stoll, Frederick

    1993-01-01

    The NLPAN computer code uses a finite-strip approach to the analysis of thin-walled prismatic composite structures such as stiffened panels. The code can model in-plane axial loading, transverse pressure loading, and constant through-the-thickness thermal loading, and can account for shape imperfections. The NLPAN code represents an attempt to extend the buckling analysis of the VIPASA computer code into the geometrically nonlinear regime. Buckling mode shapes generated using VIPASA are used in NLPAN as global functions for representing displacements in the nonlinear regime. While the NLPAN analysis is approximate in nature, it is computationally economical in comparison with finite-element analysis, and is thus suitable for use in preliminary design and design optimization. A comprehensive description of the theoretical approach of NLPAN is provided. A discussion of some operational considerations for the NLPAN code is included. NLPAN is applied to several test problems in order to demonstrate new program capabilities, and to assess the accuracy of the code in modeling various types of loading and response. User instructions for the NLPAN computer program are provided, including a detailed description of the input requirements and example input files for two stiffened-panel configurations.

  17. Patient or treatment centre? Where are efforts invested to improve cancer patients' psychosocial outcomes?

    PubMed Central

    Carey, ML; Clinton-McHarg, T; Sanson-Fisher, RW; Campbell, S; Douglas, HE

    2011-01-01

    The psychosocial outcomes of cancer patients may be influenced by individual-level, social and treatment centre predictors. This paper aimed to examine the extent to which individual, social and treatment centre variables have been examined as predictors or targets of intervention for psychosocial outcomes of cancer patients. Medline was searched to find studies in which the psychological outcomes of cancer patients were primary variables. Papers published in English between 1999 and 2009 that reported primary data relevant to psychosocial outcomes for cancer patients were included, with 20% randomly selected for further coding. Descriptive studies were coded for inclusion of individual, social or treatment centre variables. Intervention studies were coded to determine if the unit of intervention was the individual patient, social unit or treatment centre. After random sampling, 412 publications meeting the inclusion criteria were identified; 169 were descriptive and 243 interventions. Of the descriptive papers, 95.0% included individual predictors and 5.0% social predictors. None of the descriptive papers examined treatment centre variables as predictors of psychosocial outcomes. Similarly, none of the interventions evaluated the effectiveness of treatment centre interventions for improving psychosocial outcomes. Potential reasons for the overwhelming dominance of individual predictors and individual-focused interventions in the psychosocial literature are discussed. PMID:20646035

  18. Circumstances of Trauma and Accidents in Children: A Thesaurus-based Survey

    PubMed

    Séjourné, Claire; Philbois, Olivier; Vercherin, Paul; Patural, Hugues

    2016-11-25

    Introduction: Injuries and accidents are major causes of morbidity and mortality in children in France. Identification and description of the mechanisms of accidents are essential for developing appropriate prevention methods. For this purpose, a specific thesaurus of ICD-10 codes relating to the circumstances of trauma and accidents in children was created in the French Loire department. The objective of this study was to evaluate the relevance and acceptability of the thesaurus in the pediatric emergency unit of Saint-Etienne university hospital. Material and Methods: This study was conducted in two phases. The first, longitudinal phase was conducted over three periods between May and October 2014 to compare codings by emergency room physicians before using the thesaurus with those defined on the basis of the thesaurus. The second phase retrospectively compared coding in July and August 2014, before introduction of the thesaurus, with thesaurus-based coding in July and August 2015. Results: The first phase showed a loss of more than half of the information without the thesaurus. The circumstances of trauma can be described by an appropriate code in more than 90% of cases. The second phase showed a 13% increase in coding of the circumstances of trauma, which nevertheless remains insufficient. Discussion: The thesaurus facilitates coding, generally meets the coding physician's expectations, and should be used in large-scale epidemiological surveys.

  19. An algorithm to identify rheumatoid arthritis in primary care: a Clinical Practice Research Datalink study

    PubMed Central

    Muller, Sara; Hider, Samantha L; Raza, Karim; Stack, Rebecca J; Hayward, Richard A; Mallen, Christian D

    2015-01-01

    Objective Rheumatoid arthritis (RA) is a multisystem, inflammatory disorder associated with increased levels of morbidity and mortality. While much research into the condition is conducted in the secondary care setting, routinely collected primary care databases provide an important source of research data. This study aimed to update an algorithm to define RA that was previously developed and validated in the General Practice Research Database (GPRD). Methods The original algorithm consisted of two criteria. Individuals meeting at least one were considered to have RA. Criterion 1: ≥1 RA Read code and a disease modifying antirheumatic drug (DMARD) without an alternative indication. Criterion 2: ≥2 RA Read codes, with at least one ‘strong’ code and no alternative diagnoses. Lists of codes for consultations and prescriptions were obtained from the authors of the original algorithm where these were available, or compiled based on the original description and clinical knowledge. 4161 people with a first Read code for RA between 1 January 2010 and 31 December 2012 were selected from the Clinical Practice Research Datalink (CPRD, successor to the GPRD), and the criteria applied. Results Code lists were updated for the introduction of new Read codes and biological DMARDs. 3577/4161 (86%) of people met the updated algorithm for RA, compared to 61% in the original development study. 62.8% of people fulfilled both Criterion 1 and Criterion 2. Conclusions Those wishing to define RA in the CPRD, should consider using this updated algorithm, rather than a single RA code, if they wish to identify only those who are most likely to have RA. PMID:26700281
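
    A compact sketch of the two-criterion logic summarized above; the Read codes, DMARD names, and alternative-indication lists are placeholders for illustration and are not the published code lists that accompany the algorithm.

      # Illustrative placeholders only; the published algorithm uses full Read code
      # and DMARD lists compiled by the study authors.
      RA_CODES        = {"N040.", "N0400", "N0401"}     # hypothetical RA Read codes
      STRONG_RA_CODES = {"N040."}                       # hypothetical "strong" subset
      DMARDS          = {"methotrexate", "sulfasalazine", "leflunomide"}
      DMARD_ALT_INDICATIONS = {"psoriasis", "crohns disease"}
      ALT_DIAGNOSES   = {"psoriatic arthritis", "systemic lupus erythematosus"}

      def meets_algorithm(ra_codes, prescriptions, other_diagnoses):
          ra_codes = set(ra_codes)
          # Criterion 1: >=1 RA code plus a DMARD without an alternative indication.
          dmard_no_alt = (any(p in DMARDS for p in prescriptions)
                          and not (set(other_diagnoses) & DMARD_ALT_INDICATIONS))
          crit1 = bool(ra_codes & RA_CODES) and dmard_no_alt
          # Criterion 2: >=2 RA codes, at least one "strong", and no alternative diagnosis.
          crit2 = (len(ra_codes & RA_CODES) >= 2
                   and bool(ra_codes & STRONG_RA_CODES)
                   and not (set(other_diagnoses) & ALT_DIAGNOSES))
          return crit1 or crit2

      print(meets_algorithm({"N0400"}, ["methotrexate"], []))          # True via criterion 1
      print(meets_algorithm({"N040.", "N0401"}, [], []))               # True via criterion 2
      print(meets_algorithm({"N0401"}, [], ["psoriatic arthritis"]))   # False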

  20. Integrated modelling framework for short pulse high energy density physics experiments

    NASA Astrophysics Data System (ADS)

    Sircombe, N. J.; Hughes, S. J.; Ramsay, M. G.

    2016-03-01

    Modelling experimental campaigns on the Orion laser at AWE, and developing a viable point-design for fast ignition (FI), calls for a multi-scale approach; a complete description of the problem would require an extensive range of physics which cannot realistically be included in a single code. For modelling the laser-plasma interaction (LPI) we need a fine mesh which can capture the dispersion of electromagnetic waves, and a kinetic model for each plasma species. In the dense material of the bulk target, away from the LPI region, collisional physics dominates. The transport of hot particles generated by the action of the laser is dependent on their slowing and stopping in the dense material and their need to draw a return current. These effects will heat the target, which in turn influences transport. On longer timescales, the hydrodynamic response of the target will begin to play a role as the pressure generated from isochoric heating begins to take effect. Recent effort at AWE [1] has focussed on the development of an integrated code suite based on: the particle in cell code EPOCH, to model LPI; the Monte-Carlo electron transport code THOR, to model the onward transport of hot electrons; and the radiation hydrodynamics code CORVUS, to model the hydrodynamic response of the target. We outline the methodology adopted, elucidate on the advantages of a robustly integrated code suite compared to a single code approach, demonstrate the integrated code suite's application to modelling the heating of buried layers on Orion, and assess the potential of such experiments for the validation of modelling capability in advance of more ambitious HEDP experiments, as a step towards a predictive modelling capability for FI.

  1. From novice to master surgeon: improving feedback with a descriptive approach to intraoperative assessment.

    PubMed

    Huang, Emily; Wyles, Susannah M; Chern, Hueylan; Kim, Edward; O'Sullivan, Patricia

    2016-07-01

    A developmental and descriptive approach to assessing trainee intraoperative performance was explored. Semistructured interviews with 20 surgeon educators were recorded, transcribed, deidentified, and analyzed using a grounded theory approach to identify emergent themes. Two researchers independently coded the transcripts. Emergent themes were also compared to existing theories of skill acquisition. Surgeon educators characterized intraoperative surgical performance as an integrated practice of multiple skill categories and included anticipating, planning for contingencies, monitoring progress, self-efficacy, and "working knowledge." Comments concerning progression through stages, broadly characterized as "technician," "anatomist," "anticipator," "strategist," and "executive," formed a narrative about each stage of development. The developmental trajectory with narrative, descriptive profiles of surgeons working toward mastery provide a standardized vocabulary for communicating feedback, while fostering reflection on trainee progress. Viewing surgical performance as integrated practice rather than the conglomerate of isolated skills enhances authentic assessment. Copyright © 2015 Elsevier Inc. All rights reserved.

  2. A Monte Carlo Code for Relativistic Radiation Transport Around Kerr Black Holes

    NASA Technical Reports Server (NTRS)

    Schnittman, Jeremy David; Krolik, Julian H.

    2013-01-01

    We present a new code for radiation transport around Kerr black holes, including arbitrary emission and absorption mechanisms, as well as electron scattering and polarization. The code is particularly useful for analyzing accretion flows made up of optically thick disks and optically thin coronae. We give a detailed description of the methods employed in the code and also present results from a number of numerical tests to assess its accuracy and convergence.

  3. Determination of tire cross-sectional geometric characteristics from a digitally scanned image

    NASA Astrophysics Data System (ADS)

    Danielson, Kent T.

    1995-08-01

    A semi-automated procedure is described for the accurate determination of geometrical characteristics using a scanned image of the tire cross-section. The procedure can be useful for cases when CAD drawings are not available or when a description of the actual cured tire is desired. Curves representing the perimeter of the tire cross-section are determined by an edge tracing scheme, and the plyline and cord-end positions are determined by locations of color intensities. The procedure provides an accurate description of the perimeter of the tire cross-section and the locations of plylines and cord-ends. The position, normals, and curvatures of the cross-sectional surface are included in this description. The locations of the plylines provide the necessary information for determining the ply thicknesses and relative position to a reference surface. Finally, the locations of the cord-ends provide a means to calculate the cord-ends per inch (epi). Menu driven software has been developed to facilitate the procedure using the commercial code, PV-Wave by Visual Numerics, Inc., to display the images. From a single user interface, separate modules are executed for image enhancement, curve fitting the edge trace of the cross-sectional perimeter, and determining the plyline and cord-end locations. The code can run on SUN or SGI workstations and requires the use of a mouse to specify options or identify items on the scanned image.
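
    As a small numerical footnote to the cord-end step, the ends-per-inch figure follows directly from the spacing of the detected cord-end centers; the coordinates below are invented, and a real cross-section would measure spacing along the ply arc rather than along one axis.

      import numpy as np

      # Hypothetical cord-end centers (inches) found along one plyline by the
      # color-intensity step of the image-processing procedure.
      cord_x = np.array([0.000, 0.047, 0.095, 0.142, 0.189, 0.236, 0.284])

      spacing = np.diff(cord_x).mean()   # mean center-to-center spacing, inches
      epi = 1.0 / spacing                # cord ends per inch
      print(f"mean spacing = {spacing:.3f} in, epi = {epi:.1f}")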

  4. Determination of tire cross-sectional geometric characteristics from a digitally scanned image

    NASA Technical Reports Server (NTRS)

    Danielson, Kent T.

    1995-01-01

    A semi-automated procedure is described for the accurate determination of geometrical characteristics using a scanned image of the tire cross-section. The procedure can be useful for cases when CAD drawings are not available or when a description of the actual cured tire is desired. Curves representing the perimeter of the tire cross-section are determined by an edge tracing scheme, and the plyline and cord-end positions are determined by locations of color intensities. The procedure provides an accurate description of the perimeter of the tire cross-section and the locations of plylines and cord-ends. The position, normals, and curvatures of the cross-sectional surface are included in this description. The locations of the plylines provide the necessary information for determining the ply thicknesses and relative position to a reference surface. Finally, the locations of the cord-ends provide a means to calculate the cord-ends per inch (epi). Menu driven software has been developed to facilitate the procedure using the commercial code, PV-Wave by Visual Numerics, Inc., to display the images. From a single user interface, separate modules are executed for image enhancement, curve fitting the edge trace of the cross-sectional perimeter, and determining the plyline and cord-end locations. The code can run on SUN or SGI workstations and requires the use of a mouse to specify options or identify items on the scanned image.

  5. The navigation toolkit

    NASA Technical Reports Server (NTRS)

    Rich, William F.; Strom, Stephen W.

    1994-01-01

    This report summarizes the experience of the authors in managing, designing, and implementing an object-oriented applications framework for orbital navigation analysis for the Flight Design and Dynamics Department of the Rockwell Space Operations Company in Houston, in support of the Mission Operations Directorate of NASA's Johnson Space Center. The 8 person-year project spanned 1.5 years and produced 30,000 lines of C++ code, replacing 150,000 lines of Fortran/C. We believe that our experience is important because it represents a 'second project' experience and generated real production-quality code - it was not a pilot. The project successfully demonstrated the use of 'continuous development' or rapid prototyping techniques. Use of formal methods and executable models contributed to the quality of the code. Keys to the success of the project were a strong architectural vision and highly skilled workers. This report focuses on process and methodology, and not on a detailed design description of the product. But the true importance of the object-oriented paradigm is its liberation of the developer to focus on the problem rather than the means used to solve the problem.

  6. Refined lateral energy correction functions for the KASCADE-Grande experiment based on Geant4 simulations

    NASA Astrophysics Data System (ADS)

    Gherghel-Lascu, A.; Apel, W. D.; Arteaga-Velázquez, J. C.; Bekk, K.; Bertaina, M.; Blümer, J.; Bozdog, H.; Brancus, I. M.; Cantoni, E.; Chiavassa, A.; Cossavella, F.; Daumiller, K.; de Souza, V.; Di Pierro, F.; Doll, P.; Engel, R.; Engler, J.; Fuchs, B.; Fuhrmann, D.; Gils, H. J.; Glasstetter, R.; Grupen, C.; Haungs, A.; Heck, D.; Hörandel, J. R.; Huber, D.; Huege, T.; Kampert, K.-H.; Kang, D.; Klages, H. O.; Link, K.; Łuczak, P.; Mathes, H. J.; Mayer, H. J.; Milke, J.; Mitrica, B.; Morello, C.; Oehlschläger, J.; Ostapchenko, S.; Palmieri, N.; Petcu, M.; Pierog, T.; Rebel, H.; Roth, M.; Schieler, H.; Schoo, S.; Schröder, F. G.; Sima, O.; Toma, G.; Trinchero, G. C.; Ulrich, H.; Weindl, A.; Wochele, J.; Zabierowski, J.

    2015-02-01

    In previous studies of KASCADE-Grande data, a Monte Carlo simulation code based on the GEANT3 program has been developed to describe the energy deposited by EAS particles in the detector stations. In an attempt to decrease the simulation time and ensure compatibility with the geometry description in standard KASCADE-Grande analysis software, several structural elements have been neglected in the implementation of the Grande station geometry. To improve the agreement between experimental and simulated data, a more accurate simulation of the response of the KASCADE-Grande detector is necessary. A new simulation code has been developed based on the GEANT4 program, including a realistic geometry of the detector station with structural elements that have not been considered in previous studies. The new code is used to study the influence of a realistic detector geometry on the energy deposited in the Grande detector stations by particles from EAS events simulated by CORSIKA. Lateral Energy Correction Functions are determined and compared with previous results based on GEANT3.

  7. A Concept for Run-Time Support of the Chapel Language

    NASA Technical Reports Server (NTRS)

    James, Mark

    2006-01-01

    A document presents a concept for run-time implementation of other concepts embodied in the Chapel programming language. (Now undergoing development, Chapel is intended to become a standard language for parallel computing that would surpass older such languages both in computational performance and in the efficiency with which pre-existing code can be reused and new code written.) The aforementioned other concepts are those of distributions, domains, allocations, and access, as defined in a separate document called "A Semantic Framework for Domains and Distributions in Chapel" and linked to a language specification defined in another separate document called "Chapel Specification 0.3." The concept presented in the instant report is recognition that a data domain that was invented for Chapel offers a novel approach to distributing and processing data in a massively parallel environment. The concept is offered as a starting point for development of working descriptions of functions and data structures that would be necessary to implement interfaces to a compiler for transforming the aforementioned other concepts from their representations in Chapel source code to their run-time implementations.

  8. A predictive transport modeling code for ICRF-heated tokamaks

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Phillips, C.K.; Hwang, D.Q.; Houlberg, W.

    In this report, a detailed description of the physics included in the WHIST/RAZE package as well as a few illustrative examples of the capabilities of the package will be presented. An in-depth analysis of ICRF heating experiments using WHIST/RAZE will be discussed in a forthcoming report. A general overview of the philosophy behind the structure of the WHIST/RAZE package, a summary of the features of the WHIST code, and a description of the interface to the RAZE subroutines are presented in section 2 of this report. Details of the physics contained in the RAZE code are examined in section 3. Sample results from the package follow in section 4, with concluding remarks and a discussion of possible improvements to the package discussed in section 5.

  9. A predictive transport modeling code for ICRF-heated tokamaks

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Phillips, C.K.; Hwang, D.Q.; Houlberg, W.

    1992-02-01

    In this report, a detailed description of the physics included in the WHIST/RAZE package as well as a few illustrative examples of the capabilities of the package will be presented. An in-depth analysis of ICRF heating experiments using WHIST/RAZE will be discussed in a forthcoming report. A general overview of the philosophy behind the structure of the WHIST/RAZE package, a summary of the features of the WHIST code, and a description of the interface to the RAZE subroutines are presented in section 2 of this report. Details of the physics contained in the RAZE code are examined in section 3. Sample results from the package follow in section 4, with concluding remarks and a discussion of possible improvements to the package discussed in section 5.

  10. 41 CFR Appendix C to Chapter 301 - Standard Data Elements for Federal Travel [Traveler Identification

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... education, in scientific, professional, technical, mechanical, trade, clerical, fiscal, administrative, or... Data Elements for Federal Travel [Accounting & Certification] Group name Data elements Description Accounting Classification Accounting Code Agency accounting code. Non-Federal Source Indicator Per Diem...

  11. Cognitive Architectures for Multimedia Learning

    ERIC Educational Resources Information Center

    Reed, Stephen K.

    2006-01-01

    This article provides a tutorial overview of cognitive architectures that can form a theoretical foundation for designing multimedia instruction. Cognitive architectures include a description of memory stores, memory codes, and cognitive operations. Architectures that are relevant to multimedia learning include Paivio's dual coding theory,…

  12. Comparison of a Simple Patched Conic Trajectory Code to Commercially Available Software

    NASA Technical Reports Server (NTRS)

    AndersonPark, Brooke M.; Wright, Henry S.

    2007-01-01

    Often in spaceflight proposal development, mission designers must evaluate numerous trajectories as different design factors are investigated. Although there are numerous commercial software packages available to help develop and analyze trajectories, most take a significant amount of time to develop the trajectory itself, which isn't effective when working on proposals. Thus a new code, PatCon, which is both quick and easy to use, was developed to aid mission designers in conducting trade studies on launch and arrival times for any given target planet. The code is able to run quick analyses, due to the incorporation of the patched conic approximation, to determine the trajectory. PatCon provides a simple but accurate approximation of the four-body motion problem that would be needed to solve any planetary trajectory. PatCon has been compared to a patched conic test case for verification, with limited validation or comparison with other COTS software. This paper describes the patched conic technique and its implementation in PatCon. A description of the results and a comparison of PatCon to other more evolved codes, such as AGI's Satellite Tool Kit and JAQAR Astrodynamics' Swingby Calculator, are provided. The results include percent differences in values such as C3 and V-infinity at arrival, and other more subjective results such as the time it takes to build the simulation and the actual calculation time.
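
    A minimal sketch of the patched-conic arithmetic behind quick-look tools of this kind (not PatCon itself): departure v-infinity and C3 for a coplanar Hohmann transfer from Earth to Mars, with illustrative constants and circular-orbit assumptions.

        import math

        MU_SUN = 1.32712e11      # km^3/s^2, solar gravitational parameter
        R_EARTH = 1.496e8        # km, mean heliocentric radius of Earth
        R_MARS = 2.279e8         # km, mean heliocentric radius of Mars

        def hohmann_departure_c3(r_dep, r_arr, mu=MU_SUN):
            """Departure v-infinity and C3 for a coplanar Hohmann transfer."""
            a_transfer = 0.5 * (r_dep + r_arr)                 # transfer-ellipse semi-major axis
            v_planet = math.sqrt(mu / r_dep)                   # circular speed of departure planet
            v_transfer = math.sqrt(mu * (2.0 / r_dep - 1.0 / a_transfer))  # vis-viva at departure
            v_inf = v_transfer - v_planet                      # hyperbolic excess speed
            return v_inf, v_inf ** 2                           # C3 = v_inf^2

        v_inf, c3 = hohmann_departure_c3(R_EARTH, R_MARS)
        print(f"v_infinity = {v_inf:.2f} km/s, C3 = {c3:.2f} km^2/s^2")   # about 2.9 km/s and 8.7 km^2/s^2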

  13. Development of a Spacecraft Materials Selector Expert System

    NASA Technical Reports Server (NTRS)

    Pippin, G.; Kauffman, W. (Technical Monitor)

    2002-01-01

    This report contains a description of the knowledge base tool and examples of its use. A downloadable version of the Spacecraft Materials Selector (SMS) knowledge base is available through the NASA Space Environments and Effects Program. The "Spacecraft Materials Selector" knowledge base is part of an electronic expert system. The expert system consists of an inference engine that contains the "decision-making" code and the knowledge base that contains the selected body of information. The inference engine is a software package previously developed at Boeing, called the Boeing Expert System Tool (BEST) kit.

  14. Wake curvature and trailing edge interaction effects in viscous flow over airfoils

    NASA Technical Reports Server (NTRS)

    Melnik, R. E.

    1979-01-01

    A theory developed for analyzing viscous flows over airfoils at high Reynolds numbers is described. The theory includes a complete treatment of viscous interaction effects induced by the curved wake behind the airfoil and accounts for normal pressure gradients across the boundary layer in the trailing edge region. A brief description of a computer code that was developed to solve the extended viscous interaction equations is given. Comparisons of the theoretical results with wind tunnel data for two rear loaded airfoils at supercritical conditions are presented.

  15. Investigation of the performance characteristics of Doppler radar technique for aircraft collision hazard warning, phase 3

    NASA Technical Reports Server (NTRS)

    1972-01-01

    System studies, equipment simulation, hardware development and flight tests which were conducted during the development of aircraft collision hazard warning system are discussed. The system uses a cooperative, continuous wave Doppler radar principle with pseudo-random frequency modulation. The report presents a description of the system operation and deals at length with the use of pseudo-random coding techniques. In addition, the use of mathematical modeling and computer simulation to determine the alarm statistics and system saturation characteristics in terminal area traffic of variable density is discussed.

  16. Computer program system for dynamic simulation and stability analysis of passive and actively controlled spacecraft. Volume 1. Theory

    NASA Technical Reports Server (NTRS)

    Bodley, C. S.; Devers, D. A.; Park, C. A.

    1975-01-01

    A theoretical development and associated digital computer program system is presented. The dynamic system (spacecraft) is modeled as an assembly of rigid and/or flexible bodies not necessarily in a topological tree configuration. The computer program system may be used to investigate total system dynamic characteristics including interaction effects between rigid and/or flexible bodies, control systems, and a wide range of environmental loadings. Additionally, the program system may be used for design of attitude control systems and for evaluation of total dynamic system performance including time domain response and frequency domain stability analyses. Volume 1 presents the theoretical developments including a description of the physical system, the equations of dynamic equilibrium, discussion of kinematics and system topology, a complete treatment of momentum wheel coupling, and a discussion of gravity gradient and environmental effects. Volume 2 is a program users' guide and includes a description of the overall digital program code, individual subroutines and a description of required program input and generated program output. Volume 3 presents the results of selected demonstration problems that illustrate all program system capabilities.

  17. Space Physics Data Facility Web Services

    NASA Technical Reports Server (NTRS)

    Candey, Robert M.; Harris, Bernard T.; Chimiak, Reine A.

    2005-01-01

    The Space Physics Data Facility (SPDF) Web services provides a distributed programming interface to a portion of the SPDF software. (A general description of Web services is available at http://www.w3.org/ and in many current software-engineering texts and articles focused on distributed programming.) The SPDF Web services distributed programming interface enables additional collaboration and integration of the SPDF software system with other software systems, in furtherance of the SPDF mission to lead collaborative efforts in the collection and utilization of space physics data and mathematical models. This programming interface conforms to all applicable Web services specifications of the World Wide Web Consortium. The interface is specified by a Web Services Description Language (WSDL) file. The SPDF Web services software consists of the following components: 1) A server program for implementation of the Web services; and 2) A software developer's kit that consists of a WSDL file, a less formal description of the interface, a Java class library (which further eases development of Java-based client software), and Java source code for an example client program that illustrates the use of the interface.

  18. LIGKA: A linear gyrokinetic code for the description of background kinetic and fast particle effects on the MHD stability in tokamaks

    NASA Astrophysics Data System (ADS)

    Lauber, Ph.; Günter, S.; Könies, A.; Pinches, S. D.

    2007-09-01

    In a plasma with a population of super-thermal particles generated by heating or fusion processes, kinetic effects can lead to the additional destabilisation of MHD modes or even to additional energetic particle modes. In order to describe these modes, a new linear gyrokinetic MHD code has been developed and tested, LIGKA (linear gyrokinetic shear Alfvén physics) [Ph. Lauber, Linear gyrokinetic description of fast particle effects on the MHD stability in tokamaks, Ph.D. Thesis, TU München, 2003; Ph. Lauber, S. Günter, S.D. Pinches, Phys. Plasmas 12 (2005) 122501], based on a gyrokinetic model [H. Qin, Gyrokinetic theory and computational methods for electromagnetic perturbations in tokamaks, Ph.D. Thesis, Princeton University, 1998]. A finite Larmor radius expansion together with the construction of some fluid moments and specification to the shear Alfvén regime results in a self-consistent, electromagnetic, non-perturbative model, that allows not only for growing or damped eigenvalues but also for a change in mode-structure of the magnetic perturbation due to the energetic particles and background kinetic effects. Compared to previous implementations [H. Qin, mentioned above], this model is coded in a more general and comprehensive way. LIGKA uses a Fourier decomposition in the poloidal coordinate and a finite element discretisation in the radial direction. Both analytical and numerical equilibria can be treated. Integration over the unperturbed particle orbits is performed with the drift-kinetic HAGIS code [S.D. Pinches, Ph.D. Thesis, The University of Nottingham, 1996; S.D. Pinches et al., CPC 111 (1998) 131] which accurately describes the particles' trajectories. This allows finite-banana-width effects to be implemented in a rigorous way since the linear formulation of the model allows the exchange of the unperturbed orbit integration and the discretisation of the perturbed potentials in the radial direction. Successful benchmarks for toroidal Alfvén eigenmodes (TAEs) and kinetic Alfvén waves (KAWs) with analytical results, ideal MHD codes, drift-kinetic codes and other codes based on kinetic models are reported.

  19. Bibliography of Joint Aircraft Survivability Reports and Related Documents

    DTIC Science & Technology

    1994-07-01

    report are: synthetic and preparative procedures for new materials developed; a new concept of fire-control by dry chemical agents; descriptions of... 5001 Author: James T. Sweeten, Jr. Abstract: (U) This report provides information for users on the implementation of the MJU-7A/B, MJU-8A/B, MJU-10, MJU... John O. Bennett, Code 4072, Crane. Performing Organization: ARC Professional Services Group, Information Systems Division. Author: James T. Sweeten, Jr.

  20. Global Modeling and Data Assimilation. Volume 11; Documentation of the Tangent Linear and Adjoint Models of the Relaxed Arakawa-Schubert Moisture Parameterization of the NASA GEOS-1 GCM; 5.2

    NASA Technical Reports Server (NTRS)

    Suarez, Max J. (Editor); Yang, Wei-Yu; Todling, Ricardo; Navon, I. Michael

    1997-01-01

    A detailed description of the development of the tangent linear model (TLM) and its adjoint model of the Relaxed Arakawa-Schubert moisture parameterization package used in the NASA GEOS-1 C-Grid GCM (Version 5.2) is presented. The notational conventions used in the TLM and its adjoint codes are described in detail.
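
    The tangent linear/adjoint pairing can be illustrated on a toy operator; the sketch below (not GEOS-1 code) builds the TLM of a simple nonlinear map and its adjoint, and runs the standard inner-product consistency check used when validating such code.

        import numpy as np

        def nonlinear_model(x):
            # Toy "parameterization": quadratic coupling of neighbouring points.
            return x * np.roll(x, 1)

        def tangent_linear(x, dx):
            # Linearization of nonlinear_model about the state x.
            return dx * np.roll(x, 1) + x * np.roll(dx, 1)

        def adjoint(x, dy):
            # Adjoint of tangent_linear with respect to the Euclidean inner product.
            return dy * np.roll(x, 1) + np.roll(x * dy, -1)

        rng = np.random.default_rng(0)
        x, dx, dy = rng.normal(size=5), rng.normal(size=5), rng.normal(size=5)
        lhs = np.dot(tangent_linear(x, dx), dy)   # <M dx, dy>
        rhs = np.dot(dx, adjoint(x, dy))          # <dx, M^T dy>
        print(lhs, rhs)                           # the two values agree to machine precision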

  1. CFL3D User's Manual (Version 5.0)

    NASA Technical Reports Server (NTRS)

    Krist, Sherrie L.; Biedron, Robert T.; Rumsey, Christopher L.

    1998-01-01

    This document is the User's Manual for the CFL3D computer code, a thin-layer Reynolds-averaged Navier-Stokes flow solver for structured multiple-zone grids. Descriptions of the code's input parameters, non-dimensionalizations, file formats, boundary conditions, and equations are included. Sample 2-D and 3-D test cases are also described, and many helpful hints for using the code are provided.

  2. FPGA Boot Loader and Scrubber

    NASA Technical Reports Server (NTRS)

    Wade, Randall S.; Jones, Bailey

    2009-01-01

    A computer program loads configuration code into a Xilinx field-programmable gate array (FPGA), reads back and verifies that code, reloads the code if an error is detected, and monitors the performance of the FPGA for errors in the presence of radiation. The program consists mainly of a set of VHDL files (wherein "VHDL" signifies "VHSIC Hardware Description Language" and "VHSIC" signifies "very-high-speed integrated circuit").

  3. Study of coherent synchrotron radiation effects by means of a new simulation code based on the non-linear extension of the operator splitting method

    NASA Astrophysics Data System (ADS)

    Dattoli, G.; Migliorati, M.; Schiavi, A.

    2007-05-01

    Coherent synchrotron radiation (CSR) is one of the main problems limiting the performance of high-intensity electron accelerators. The complexity of the physical mechanisms underlying the onset of instabilities due to CSR demands accurate descriptions, capable of including the large number of features of an actual accelerating device. A code devoted to the analysis of these types of problems should be fast and reliable, conditions that are rarely achieved at the same time. In the past, codes based on Lie algebraic techniques have been very efficient in treating transport problems in accelerators. The extension of these methods to the non-linear case is ideally suited to treating CSR instability problems. We report on the development of a numerical code, based on the solution of the Vlasov equation, with the inclusion of the non-linear contribution due to wake field effects. The proposed solution method exploits an algebraic technique based on exponential operators. We show that the integration procedure is capable of reproducing the onset of instability and the effects associated with bunching mechanisms leading to the growth of the instability itself. In addition, considerations on the threshold of the instability are also developed.
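
    The exponential-operator splitting idea can be shown on a much simpler system; the following hedged sketch advances a harmonic oscillator with the symmetric drift-kick-drift factorization exp((A+B)dt) ~ exp(A dt/2) exp(B dt) exp(A dt/2). It is illustrative only and is not the Vlasov solver described above.

        import math

        def strang_step(q, p, dt, omega=1.0):
            q += 0.5 * dt * p            # half drift:  exp(A dt/2)
            p -= dt * omega ** 2 * q     # full kick:   exp(B dt)
            q += 0.5 * dt * p            # half drift:  exp(A dt/2)
            return q, p

        q, p, dt = 1.0, 0.0, 0.01
        energy0 = 0.5 * (p * p + q * q)
        for _ in range(int(2 * math.pi / dt)):        # roughly one oscillation period
            q, p = strang_step(q, p, dt)
        print(q, p, 0.5 * (p * p + q * q) - energy0)  # q near 1, p near 0, small bounded energy error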

  4. Positron follow-up in liquid water: I. A new Monte Carlo track-structure code.

    PubMed

    Champion, C; Le Loirec, C

    2006-04-07

    When biological matter is irradiated by charged particles, a wide variety of interactions occur, which lead to a deep modification of the cellular environment. To understand the fine structure of the microscopic distribution of energy deposits, Monte Carlo event-by-event simulations are particularly suitable. However, the development of these track-structure codes needs accurate interaction cross sections for all the electronic processes: ionization, excitation, positronium formation and even elastic scattering. Under these conditions, we have recently developed a Monte Carlo code for positrons in water, the latter being commonly used to simulate the biological medium. All the processes are studied in detail via theoretical differential and total cross-section calculations performed by using partial wave methods. Comparisons with existing theoretical and experimental data in terms of stopping powers, mean energy transfers and ranges show very good agreements. Moreover, thanks to the theoretical description of positronium formation, we have access, for the first time, to the complete kinematics of the electron capture process. Then, the present Monte Carlo code is able to describe the detailed positronium history, which will provide useful information for medical imaging (like positron emission tomography) where improvements are needed to define with the best accuracy the tumoural volumes.

  5. ARCADIA(R) - A New Generation of Coupled Neutronics/Core Thermal-Hydraulics Code System at AREVA NP

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Curca-Tivig, Florin; Merk, Stephan; Pautz, Andreas

    2007-07-01

    Anticipating future needs of our customers and wishing to concentrate the synergies and competences existing in the company for the benefit of our customers, AREVA NP decided in 2002 to develop the next generation of coupled neutronics/core thermal-hydraulic (TH) code systems for fuel assembly and core design calculations for both PWR and BWR applications. The global CONVERGENCE project was born: after a feasibility study of one year (2002) and a conceptual phase of another year (2003), development was started at the beginning of 2004. The present paper introduces the CONVERGENCE project, presents the main features of the new code system ARCADIA(R) and concludes on customer benefits. ARCADIA(R) is designed to meet AREVA NP market and customers' requirements worldwide. Besides state-of-the-art physical modeling, numerical performance and industrial functionality, the ARCADIA(R) system features state-of-the-art software engineering. The new code system will bring a series of benefits for our customers: e.g. improved accuracy for heterogeneous cores (MOX/UOX, Gd...), better description of nuclide chains, and access to local neutronics/thermal-hydraulics and possibly thermal-mechanical information (3D pin-by-pin full core modeling). ARCADIA is a registered trademark of AREVA NP. (authors)

  6. THE RETC CODE FOR QUANTIFYING THE HYDRAULIC FUNCTIONS OF UNSATURATED SOILS

    EPA Science Inventory

    This report describes the RETC computer code for analyzing the soil water retention and hydraulic conductivity functions of unsaturated soils. These hydraulic properties are key parameters in any quantitative description of water flow into and through the unsaturated zone of soil...
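
    For orientation, the retention and conductivity functions that RETC-type analyses commonly fit are the van Genuchten-Mualem expressions; the sketch below evaluates them with illustrative loam-like parameters, which are assumptions rather than values from the report.

        import numpy as np

        def van_genuchten_theta(h, theta_r, theta_s, alpha, n):
            """Water content theta(h) for suction head h >= 0 (units of 1/alpha)."""
            m = 1.0 - 1.0 / n
            se = (1.0 + (alpha * h) ** n) ** (-m)      # effective saturation
            return theta_r + (theta_s - theta_r) * se

        def mualem_conductivity(h, k_s, alpha, n, l=0.5):
            """Unsaturated hydraulic conductivity K(h) from the Mualem model."""
            m = 1.0 - 1.0 / n
            se = (1.0 + (alpha * h) ** n) ** (-m)
            return k_s * se ** l * (1.0 - (1.0 - se ** (1.0 / m)) ** m) ** 2

        h = np.linspace(0.0, 500.0, 6)                          # suction head, cm
        print(van_genuchten_theta(h, 0.05, 0.43, 0.036, 1.56))  # loam-like parameters
        print(mualem_conductivity(h, 25.0, 0.036, 1.56))        # K_s in cm/day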

  7. Practices and Standards in the Construction of BRL-CAD Target Descriptions

    DTIC Science & Technology

    1993-09-01


  8. Knowledge and Performance about Nursing Ethic Codes from Nurses' and Patients' Perspective in Tabriz Teaching Hospitals, Iran.

    PubMed

    Mohajjel-Aghdam, Alireza; Hassankhani, Hadi; Zamanzadeh, Vahid; Khameneh, Saied; Moghaddam, Sara

    2013-09-01

    The nursing profession requires knowledge of ethics to guide performance. The nature of this profession necessitates ethical care more than routine care. Today, a worldwide definition of professional ethic codes has been developed based on human and ethical issues in the communication between nurse and patient. To improve all dimensions of nursing, we need to respect ethic codes. The aim of this study is to assess knowledge and performance about nursing ethic codes from nurses' and patients' perspectives. A descriptive study was conducted on 345 nurses and 500 inpatients in six teaching hospitals of Tabriz in 2012. To investigate nurses' knowledge and performance, data were collected using structured questionnaires. Statistical analysis was done using descriptive and analytic statistics, independent t-test, ANOVA and Pearson correlation coefficient, in SPSS 13. Most of the nurses were female, married and educated at the BS degree level; 86.4% of them were aware of the ethic codes, and 91.9% of nurses and 41.8% of patients reported that nurses respect ethic codes. Nurses' and patients' perspectives about ethic codes differed significantly. A significant relationship was found between nurses' knowledge of ethic codes and job satisfaction and complaints about ethical performance. According to the results, attention to teaching ethic codes in the nursing curriculum for students and continuous education for staff is proposed; on the other hand, recognizing failures of the health system, optimizing nursing care, attempting to inform patients about nursing ethic codes, promoting patient rights and achieving patient satisfaction can minimize the differences between the two perspectives.

  9. Knowledge and Performance about Nursing Ethic Codes from Nurses' and Patients' Perspective in Tabriz Teaching Hospitals, Iran

    PubMed Central

    Mohajjel-Aghdam, Alireza; Hassankhani, Hadi; Zamanzadeh, Vahid; Khameneh, Saied; Moghaddam, Sara

    2013-01-01

    Introduction: The nursing profession requires knowledge of ethics to guide performance. The nature of this profession necessitates ethical care more than routine care. Today, a worldwide definition of professional ethic codes has been developed based on human and ethical issues in the communication between nurse and patient. To improve all dimensions of nursing, we need to respect ethic codes. The aim of this study is to assess knowledge and performance about nursing ethic codes from nurses' and patients' perspectives. Methods: A descriptive study was conducted on 345 nurses and 500 inpatients in six teaching hospitals of Tabriz in 2012. To investigate nurses' knowledge and performance, data were collected using structured questionnaires. Statistical analysis was done using descriptive and analytic statistics, independent t-test, ANOVA and Pearson correlation coefficient, in SPSS 13. Results: Most of the nurses were female, married and educated at the BS degree level; 86.4% of them were aware of the ethic codes, and 91.9% of nurses and 41.8% of patients reported that nurses respect ethic codes. Nurses' and patients' perspectives about ethic codes differed significantly. A significant relationship was found between nurses' knowledge of ethic codes and job satisfaction and complaints about ethical performance. Conclusion: According to the results, attention to teaching ethic codes in the nursing curriculum for students and continuous education for staff is proposed; on the other hand, recognizing failures of the health system, optimizing nursing care, attempting to inform patients about nursing ethic codes, promoting patient rights and achieving patient satisfaction can minimize the differences between the two perspectives. PMID:25276730

  10. Ethics, culture and nursing practice in Ghana.

    PubMed

    Donkor, N T; Andrews, L D

    2011-03-01

    This paper describes how nurses in Ghana approach ethical problems. The International Council of Nurses' (ICN) Code for Nurses (2006) that serves as the model for professional code of ethics worldwide also acknowledges respect for healthy cultural values. Using the ICN's Code and universal ethical principles as a benchmark, a survey was conducted in 2009 to ascertain how nurses in Ghana respond to ethical and cultural issues in their practice. The study was qualitative with 200 participant nurses. Data were obtained through anonymous self-administered questionnaires. Descriptive statistics were used to analyze the data. Nurses' approaches to ethical problems in Ghana do not always meet expectations of the ICN Code for Nurses. They are also informed by local ethical practices related to the institutional setting and cultural environment in the country. While some cultural values complemented the ICN's Code and universal ethical principles, others conflicted with them. These data can assist nurses to provide culturally competent solutions to ethical dilemmas in their practice. Dynamic communication between nurses and patients/clients, intentional study of local cultural beliefs, and the development of ethics education will improve the conformity between universal ethical standards and local cultural values. © 2011 The Authors. International Nursing Review © 2011 International Council of Nurses.

  11. Opinion survey on proposals for improving code stroke in Murcia Health District V, 2014.

    PubMed

    González-Navarro, M; Martínez-Sánchez, M A; Morales-Camacho, V; Valera-Albert, M; Atienza-Ayala, S V; Limiñana-Alcaraz, G

    2017-05-01

    Stroke is a time-dependent neurological disease. Health District V in the Murcia Health System has certain demographic and geographical characteristics that make it necessary to create specific improvement strategies to ensure proper functioning of code stroke (CS). The study objectives were to assess local professionals' opinions about code stroke activation and procedure, and to share these suggestions with the regional multidisciplinary group for code stroke. This cross-sectional and descriptive study used the Delphi technique to develop a questionnaire for doctors and nurses working at all care levels in Area V. An anonymous electronic survey was sent to 154 professionals. The analysis was performed using the SWOT method (Strengths, Weaknesses, Opportunities, and Threats). Researchers collected 51 questionnaires. The main proposals were providing training, promoting communication with the neurologist, overcoming physical distances, using diagnostic imaging tests, motivating professionals, and raising awareness in the general population. Most of the interventions proposed by the participants have been listed in published literature. These improvement proposals were forwarded to the Regional Code Stroke Improvement Group. Copyright © 2015 Sociedad Española de Neurología. Publicado por Elsevier España, S.L.U. All rights reserved.

  12. Validation and evaluation of the advanced aeronautical CFD system SAUNA: A method developer's view

    NASA Astrophysics Data System (ADS)

    Shaw, J. A.; Peace, A. J.; Georgala, J. M.; Childs, P. N.

    1993-09-01

    This paper is concerned with a detailed validation and evaluation of the SAUNA CFD system for complex aircraft configurations. The methodology of the complete system is described in brief, including its unique use of differing grid generation strategies (structured, unstructured or both) depending on the geometric complexity of the configuration. A wide range of configurations and flow conditions is chosen in the validation and evaluation exercise to demonstrate the scope of SAUNA. A detailed description of the results from the method is preceded by a discussion of the philosophy behind the strategy followed in the exercise, in terms of quality assessment and the differing roles of the code developer and the code user. It is considered that SAUNA has grown into a highly usable tool for the aircraft designer, combining flexibility and accuracy in an efficient manner.

  13. Validation of a multi-layer Green's function code for ion beam transport

    NASA Astrophysics Data System (ADS)

    Walker, Steven; Tweed, John; Tripathi, Ram; Badavi, Francis F.; Miller, Jack; Zeitlin, Cary; Heilbronn, Lawrence

    To meet the challenge of future deep space programs, an accurate and efficient engineering code for analyzing the shielding requirements against high-energy galactic heavy radiations is needed. In consequence, a new version of the HZETRN code capable of simulating high charge and energy (HZE) ions with either laboratory or space boundary conditions is currently under development. The new code, GRNTRN, is based on a Green's function approach to the solution of Boltzmann's transport equation and like its predecessor is deterministic in nature. The computational model consists of the lowest order asymptotic approximation followed by a Neumann series expansion with non-perturbative corrections. The physical description includes energy loss with straggling, nuclear attenuation, nuclear fragmentation with energy dispersion and down shift. Code validation in the laboratory environment is addressed by showing that GRNTRN accurately predicts energy loss spectra as measured by solid-state detectors in ion beam experiments with multi-layer targets. In order to validate the code with space boundary conditions, measured particle fluences are propagated through several thicknesses of shielding using both GRNTRN and the current version of HZETRN. The excellent agreement obtained indicates that GRNTRN accurately models the propagation of HZE ions in the space environment as well as in laboratory settings and also provides verification of the HZETRN propagator.
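
    The Neumann-series step can be illustrated generically: successive "collision" terms x = b + Kb + K^2 b + ... for an operator with norm below one. The small matrix below is a stand-in, not the transport kernel used in GRNTRN.

        import numpy as np

        rng = np.random.default_rng(0)
        K = 0.06 * rng.random((5, 5))          # small "scattering" kernel, norm well below 1
        b = rng.random(5)                      # lowest-order (uncollided) term

        x = b.copy()
        term = b.copy()
        for _ in range(50):                    # add successive collision orders K^n b
            term = K @ term
            x += term

        exact = np.linalg.solve(np.eye(5) - K, b)
        print(np.max(np.abs(x - exact)))       # difference is negligible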

  14. Home energy management (HEM) database: A list with coded attributes of 308 devices commercially available in the US.

    PubMed

    Pritoni, Marco; Ford, Rebecca; Karlin, Beth; Sanguinetti, Angela

    2018-02-01

    Policymakers worldwide are currently discussing whether to include home energy management (HEM) products in their portfolio of technologies to reduce carbon emissions and improve grid reliability. However, very little data is available about these products. Here we present the results of an extensive review including 308 HEM products available on the US market in 2015-2016. We gathered these data from publicly available sources such as vendor websites, online marketplaces and other vendor documents. A coding guide was developed iteratively during the data collection and utilized to classify the devices. Each product was coded based on 96 distinct attributes, grouped into 11 categories: Identifying information, Product components, Hardware, Communication, Software, Information - feedback, Information - feedforward, Control, Utility interaction, Additional benefits and Usability. The codes describe product features and functionalities, user interaction and interoperability with other devices. A mix of binary attributes and more descriptive codes allows the data to be sorted and grouped without losing important qualitative information. The information is stored in a large spreadsheet included with this article, along with an explanatory coding guide. This dataset is analyzed and described in a research article entitled "Categories and functionality of smart home technology for energy management" (Ford et al., 2017) [1].
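
    A hedged sketch of how such a coded-attribute table can be queried: the records and attribute names below are invented for illustration, but the pattern of filtering on binary codes and grouping by category mirrors the dataset's intended use.

        products = [
            {"name": "Thermostat A", "category": "Control", "wifi": 1, "feedback": 1, "utility_dr": 0},
            {"name": "Smart plug B", "category": "Control", "wifi": 1, "feedback": 0, "utility_dr": 1},
            {"name": "In-home display C", "category": "Information", "wifi": 0, "feedback": 1, "utility_dr": 0},
        ]

        # Select products carrying a given binary attribute, then count per category.
        with_feedback = [p for p in products if p["feedback"] == 1]

        counts = {}
        for p in with_feedback:
            counts[p["category"]] = counts.get(p["category"], 0) + 1

        print(counts)   # {'Control': 1, 'Information': 1}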

  15. The Initial Atmospheric Transport (IAT) Code: Description and Validation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Morrow, Charles W.; Bartel, Timothy James

    The Initial Atmospheric Transport (IAT) computer code was developed at Sandia National Laboratories as part of their nuclear launch accident consequences analysis suite of computer codes. The purpose of IAT is to predict the initial puff/plume rise resulting from either a solid rocket propellant or liquid rocket fuel fire. The code generates initial conditions for subsequent atmospheric transport calculations. The IAT code has been compared to two data sets which are appropriate to the design space of space launch accident analyses. The primary model uncertainties are the entrainment coefficients for the extended Taylor model. The Titan 34D accident (1986) was used to calibrate these entrainment settings for a prototypic liquid propellant accident, while the recent Johns Hopkins University Applied Physics Laboratory (JHU/APL, or simply APL) large propellant block tests (2012) were used to calibrate the entrainment settings for prototypic solid propellant accidents. North American Meteorology (NAM) formatted weather data profiles are used by IAT to determine the local buoyancy force balance. The IAT comparisons for the APL solid propellant tests illustrate the sensitivity of the plume elevation to the weather profiles; that is, the weather profile is a dominant factor in determining the plume elevation. The IAT code performed remarkably well and is considered validated for neutral weather conditions.

  16. Computation of transonic potential flow about 3 dimensional inlets, ducts, and bodies

    NASA Technical Reports Server (NTRS)

    Reyhner, T. A.

    1982-01-01

    An analysis was developed and a computer code, P465 Version A, written for the prediction of transonic potential flow about three dimensional objects including inlet, duct, and body geometries. Finite differences and line relaxation are used to solve the complete potential flow equation. The coordinate system used for the calculations is independent of body geometry. Cylindrical coordinates are used for the computer code. The analysis is programmed in extended FORTRAN 4 for the CYBER 203 vector computer. The programming of the analysis is oriented toward taking advantage of the vector processing capabilities of this computer. Comparisons of computed results with experimental measurements are presented to verify the analysis. Descriptions of program input and output formats are also presented.
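
    The line-relaxation idea can be sketched on the simpler 2-D Laplace equation: each sweep solves a tridiagonal system along one grid line with the Thomas algorithm. This is a hedged illustration only; the full potential equation solved by P465 is considerably more involved.

        import numpy as np

        def thomas(a, b, c, d):
            """Solve a tridiagonal system (a: sub-, b: main-, c: super-diagonal)."""
            n = len(d)
            cp, dp = np.empty(n), np.empty(n)
            cp[0], dp[0] = c[0] / b[0], d[0] / b[0]
            for i in range(1, n):
                denom = b[i] - a[i] * cp[i - 1]
                cp[i] = c[i] / denom
                dp[i] = (d[i] - a[i] * dp[i - 1]) / denom
            x = np.empty(n)
            x[-1] = dp[-1]
            for i in range(n - 2, -1, -1):
                x[i] = dp[i] - cp[i] * x[i + 1]
            return x

        def line_relax_laplace(phi, sweeps=200):
            """Column-by-column line relaxation for Laplace's equation, Dirichlet boundaries."""
            ny, nx = phi.shape
            for _ in range(sweeps):
                for j in range(1, nx - 1):
                    n = ny - 2
                    a = np.full(n, 1.0)
                    b = np.full(n, -4.0)
                    c = np.full(n, 1.0)
                    a[0], c[-1] = 0.0, 0.0
                    # Neighbours in the other grid direction go to the right-hand side.
                    d = -(phi[1:-1, j - 1] + phi[1:-1, j + 1])
                    d[0] -= phi[0, j]
                    d[-1] -= phi[-1, j]
                    phi[1:-1, j] = thomas(a, b, c, d)
            return phi

        phi = np.zeros((20, 20))
        phi[0, :] = 1.0                          # one boundary held at unit potential
        print(line_relax_laplace(phi)[10, 10])   # interior value approaches the analytic solution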

  17. Particle-in-cell/accelerator code for space-charge dominated beam simulation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    2012-05-08

    Warp is a multidimensional discrete-particle beam simulation program designed to be applicable where the beam space-charge is non-negligible or dominant. It is being developed in a collaboration among LLNL, LBNL and the University of Maryland. It was originally designed and optimized for heavy ion fusion accelerator physics studies, but has received use in a broader range of applications, including for example laser wakefield accelerators, e-cloud studies in high energy accelerators, particle traps and other areas. At present it incorporates 3-D, axisymmetric (r,z), planar (x-z) and transverse slice (x,y) descriptions, with both electrostatic and electromagnetic fields, and a beam envelope model. The code is built atop the Python interpreter language.

  18. Analysis and Description of HOLTIN Service Provision for AECG monitoring in Complex Indoor Environments

    PubMed Central

    Led, Santiago; Azpilicueta, Leire; Aguirre, Erik; de Espronceda, Miguel Martínez; Serrano, Luis; Falcone, Francisco

    2013-01-01

    In this work, a novel ambulatory ECG monitoring device developed in-house called HOLTIN is analyzed when operating in complex indoor scenarios. The HOLTIN system is described, from the technological platform level to its functional model. In addition, by using in-house 3D ray launching simulation code, the wireless channel behavior, which enables ubiquitous operation, is performed. The effect of human body presence is taken into account by a novel simplified model embedded within the 3D Ray Launching code. Simulation as well as measurement results are presented, showing good agreement. These results may aid in the adequate deployment of this novel device to automate conventional medical processes, increasing the coverage radius and optimizing energy consumption. PMID:23584122

  19. Numerical Simulation of the ``Fluid Mechanical Sewing Machine''

    NASA Astrophysics Data System (ADS)

    Brun, Pierre-Thomas; Audoly, Basile; Ribe, Neil

    2011-11-01

    A thin thread of viscous fluid falling onto a moving conveyor belt generates a wealth of complex ``stitch'' patterns depending on the belt speed and the fall height. To understand the rich nonlinear dynamics of this system, we have developed a new numerical code for simulating unsteady viscous threads, based on a discrete description of the geometry and a variational formulation for the viscous stresses. The code successfully reproduces all major features of the experimental state diagram of Morris et al. (Phys. Rev. E 2008). Fourier analysis of the motion of the thread's contact point with the belt suggests a new classification of the observed patterns, and reveals that the system behaves as a nonlinear oscillator coupling the pendulum modes of the thread.
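
    The Fourier-analysis step lends itself to a small illustration: the sketch below builds a synthetic contact-point trace (invented frequencies, not data from the simulations) and recovers its dominant frequencies with an FFT.

        import numpy as np

        dt = 0.01
        t = np.arange(0.0, 20.0, dt)
        # Synthetic trace: two frequencies standing in for a "stitch" pattern.
        x = np.cos(2 * np.pi * 1.0 * t) + 0.4 * np.cos(2 * np.pi * 2.6 * t)

        spectrum = np.abs(np.fft.rfft(x))
        freqs = np.fft.rfftfreq(len(x), dt)
        dominant = freqs[np.argsort(spectrum)[-2:]]
        print(sorted(dominant))   # recovers 1.0 and 2.6 Hz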

  20. How to describe a new fungal species

    USDA-ARS?s Scientific Manuscript database

    The formal and informal requirements for the publication of descriptions of new fungal species are discussed. This involves following the rules of the International Code of Botanical Nomenclature as well as meeting the standards set by the editorial board of the journals in which these descriptions ...

  1. Prediction of plasma-facing ICRH antenna behavior via a Finite-Element solution of coupled Integral Equations

    NASA Astrophysics Data System (ADS)

    Lancellotti, V.; Milanesio, D.; Maggiora, R.; Vecchi, G.; Kyrytsya, V.

    2005-09-01

    The demand for a predictive tool to help designing ICRH antennas for fusion experiments has driven the development of codes like ICANT, RANT3D, and the early developments and further upgrades of TOPICA code. Currently, TOPICA handles the actual geometry of ICRH antennas (with their housing, etc.) as well as a realistic plasma model, including density and temperature profiles and FLR effects. Both goals have been attained by formally splitting the problem into two parts: the vacuum region around the antenna, and the plasma region inside the toroidal chamber. Field continuity and boundary conditions allow writing a set of coupled integral equations for the unknown equivalent (current) sources; finite elements are used on a triangular-cell mesh and a linear system is obtained on application of the weighted-residual solution scheme. In the vacuum region calculations are done in the spatial domain, whereas in the plasma region a spectral (wavenumber) representation of fields and currents is adopted, thus allowing a description of the plasma by a surface impedance matrix. Thanks to this approach, any plasma model can be used in principle, and at present Brambilla's FELICE code has been employed. The natural outputs of TOPICA are the induced currents on the conductors and the electric field in front of the plasma, whence the antenna circuit parameters (impedance/scattering matrices), the radiated power and the fields (at locations other than the chamber aperture) are then obtained. An accurate model of the feeding coaxial lines is also included. This paper is precisely devoted to the description of TOPICA, whereas examples of results for real-life antennas are reported in a companion paper [1] in this proceedings.

  2. Prediction of plasma-facing ICRH antenna behavior via a Finite-Element solution of coupled Integral Equations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lancellotti, V.; Milanesio, D.; Maggiora, R.

    2005-09-26

    The demand for a predictive tool to help designing ICRH antennas for fusion experiments has driven the development of codes like ICANT, RANT3D, and the early developments and further upgrades of TOPICA code. Currently, TOPICA handles the actual geometry of ICRH antennas (with their housing, etc.) as well as a realistic plasma model, including density and temperature profiles and FLR effects. Both goals have been attained by formally splitting the problem into two parts: the vacuum region around the antenna, and the plasma region inside the toroidal chamber. Field continuity and boundary conditions allow writing a set of coupled integral equations for the unknown equivalent (current) sources; finite elements are used on a triangular-cell mesh and a linear system is obtained on application of the weighted-residual solution scheme. In the vacuum region calculations are done in the spatial domain, whereas in the plasma region a spectral (wavenumber) representation of fields and currents is adopted, thus allowing a description of the plasma by a surface impedance matrix. Thanks to this approach, any plasma model can be used in principle, and at present Brambilla's FELICE code has been employed. The natural outputs of TOPICA are the induced currents on the conductors and the electric field in front of the plasma, whence the antenna circuit parameters (impedance/scattering matrices), the radiated power and the fields (at locations other than the chamber aperture) are then obtained. An accurate model of the feeding coaxial lines is also included. This paper is precisely devoted to the description of TOPICA, whereas examples of results for real-life antennas are reported in a companion paper in this proceedings.

  3. Description and Simulation of a Fast Packet Switch Architecture for Communication Satellites

    NASA Technical Reports Server (NTRS)

    Quintana, Jorge A.; Lizanich, Paul J.

    1995-01-01

    The NASA Lewis Research Center has been developing the architecture for a multichannel communications signal processing satellite (MCSPS) as part of a flexible, low-cost meshed-VSAT (very small aperture terminal) network. The MCSPS architecture is based on a multifrequency, time-division-multiple-access (MF-TDMA) uplink and a time-division multiplex (TDM) downlink. There are eight uplink MF-TDMA beams, and eight downlink TDM beams, with eight downlink dwells per beam. The information-switching processor, which decodes, stores, and transmits each packet of user data to the appropriate downlink dwell onboard the satellite, has been fully described by using VHSIC (Very High Speed Integrated-Circuit) Hardware Description Language (VHDL). This VHDL code, which was developed in-house to simulate the information switching processor, showed that the architecture is both feasible and viable. This paper describes a shared-memory-per-beam architecture, its VHDL implementation, and the simulation efforts.

  4. Development and application of CATIA-GDML geometry builder

    NASA Astrophysics Data System (ADS)

    Belogurov, S.; Berchun, Yu; Chernogorov, A.; Malzacher, P.; Ovcharenko, E.; Schetinin, V.

    2014-06-01

    Due to the conceptual difference between geometry descriptions in Computer-Aided Design (CAD) systems and particle transport Monte Carlo (MC) codes, direct conversion of detector geometry in either direction is not feasible. The paper presents an update on the functionality and application practice of the CATIA-GDML geometry builder first introduced at CHEP2010. This set of CATIAv5 tools has been developed for building an MC-optimized GEANT4/ROOT compatible geometry based on an existing CAD model. The model can be exported via the Geometry Description Markup Language (GDML). The builder also allows import and visualization of GEANT4/ROOT geometries in CATIA. The structure of a GDML file, including replicated volumes, volume assemblies and variables, is mapped into a part specification tree. A dedicated file template, a wide range of primitives, tools for measurement and implicit calculation of parameters, different types of multiple volume instantiation, mirroring, positioning and quality check have been implemented. Several use cases are discussed.

  5. SED-ED, a workflow editor for computational biology experiments written in SED-ML.

    PubMed

    Adams, Richard R

    2012-04-15

    The simulation experiment description markup language (SED-ML) is a new community data standard to encode computational biology experiments in a computer-readable XML format. Its widespread adoption will require the development of software support to work with SED-ML files. Here, we describe a software tool, SED-ED, to view, edit, validate and annotate SED-ML documents while shielding end-users from the underlying XML representation. SED-ED supports modellers who wish to create, understand and further develop a simulation description provided in SED-ML format. SED-ED is available as a standalone Java application, as an Eclipse plug-in and as an SBSI (www.sbsi.ed.ac.uk) plug-in, all under an MIT open-source license. Source code is at https://sed-ed-sedmleditor.googlecode.com/svn. The application itself is available from https://sourceforge.net/projects/jlibsedml/files/SED-ED/.

  6. Adult Competency Education Kit. Basic Skills in Speaking, Math, and Reading for Employment. Part M. ACE Competency Based Job Descriptions: #60--Food Assembler; #61--Injection Molder--Machine Operator; #62--Data Entry Typist; #63--Institutional Cook; Office Core Job Description; #64--Clerk Typist.

    ERIC Educational Resources Information Center

    San Mateo County Office of Education, Redwood City, CA. Career Preparation Centers.

    This tenth of fifteen sets of Adult Competency Education (ACE) Competency Based Job Descriptions in the ACE kit contains job descriptions for Food Assembler, Injection Molder-Machine Operator, Data Entry Typist, Institutional Cook, and Clerk Typist. Each begins with a fact sheet that includes this information: occupational title, D.O.T. code, ACE…

  7. Design of Excess 3 to BCD code converter using electro-optic effect of Mach-Zehnder Interferometers for efficient data transmission

    NASA Astrophysics Data System (ADS)

    Kumar, Santosh; Chanderkanta; Amphawan, Angela

    2016-04-01

    Excess-3 code is one of the most important codes used for efficient data storage and transmission. It is a non-weighted code and is also known as a self-complementing code. In this paper, a four-bit optical Excess-3 to BCD code converter is proposed using the electro-optic effect in lithium-niobate-based Mach-Zehnder interferometers (MZIs). The MZI structures have a powerful capability to switch an optical input signal to a desired output port. The paper presents a mathematical description of the proposed device followed by simulation in MATLAB. The study is verified using the beam propagation method (BPM).
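
    Since Excess-3 is BCD offset by three, the conversion itself is a four-bit binary subtraction; a software analogue of the optical converter (illustrative only, not the device described above) is:

        def excess3_to_bcd(bits):
            """bits: 4-character string, MSB first, e.g. '0011' (Excess-3 for 0)."""
            value = int(bits, 2) - 3
            if not 0 <= value <= 9:
                raise ValueError("not a valid Excess-3 digit")
            return format(value, "04b")

        for code in ["0011", "1000", "1100"]:          # Excess-3 for 0, 5 and 9
            print(code, "->", excess3_to_bcd(code))    # 0000, 0101, 1001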

  8. Programmable Logic Device (PLD) Design Description for the Integrated Power, Avionics, and Software (iPAS) Space Telecommunications Radio System (STRS) Radio

    NASA Technical Reports Server (NTRS)

    Shalkhauser, Mary Jo W.

    2017-01-01

    The Space Telecommunications Radio System (STRS) provides a common, consistent framework for software defined radios (SDRs) to abstract the application software from the radio platform hardware. The STRS standard aims to reduce the cost and risk of using complex, configurable and reprogrammable radio systems across NASA missions. To promote the use of the STRS architecture for future NASA advanced exploration missions, NASA Glenn Research Center (GRC) developed an STRS compliant SDR on a radio platform used by the Advance Exploration System program at the Johnson Space Center (JSC) in their Integrated Power, Avionics, and Software (iPAS) laboratory. At the conclusion of the development, the software and hardware description language (HDL) code was delivered to JSC for their use in their iPAS test bed to get hands-on experience with the STRS standard, and for development of their own STRS waveforms on the now STRS compliant platform. The iPAS STRS Radio was implemented on the Reconfigurable, Intelligently-Adaptive Communication System (RIACS) platform, currently being used for radio development at JSC. The platform consists of a Xilinx ML605 Virtex-6 FPGA board, an Analog Devices FMCOMMS1-EBZ RF transceiver board, and an Embedded PC (Axiomtek eBox 620-110-FL) running the Ubuntu 12.4 operating system. Figure 1 shows the RIACS platform hardware. The result of this development is a very low cost STRS compliant platform that can be used for waveform developments for multiple applications. The purpose of this document is to describe the design of the HDL code for the FPGA portion of the iPAS STRS Radio, particularly the design of the FPGA wrapper and the test waveform.

  9. Finite Element Estimation of Protein-Ligand Association Rates with Post-Encounter Effects: Applications to Calcium binding in Troponin C and SERCA

    PubMed Central

    Kekenes-Huskey, P. M.; Gillette, A.; Hake, J.; McCammon, J. A.

    2012-01-01

    We introduce a computational pipeline and suite of software tools for the approximation of diffusion-limited binding based on a recently developed theoretical framework. Our approach handles molecular geometries generated from high-resolution structural data and can account for active sites buried within the protein or behind gating mechanisms. Using tools from the FEniCS library and the APBS solver, we implement a numerical code for our method and study two Ca2+-binding proteins: Troponin C and the Sarcoplasmic Reticulum Ca2+ ATPase (SERCA). We find that a combination of diffusional encounter and internal ‘buried channel’ descriptions provide superior descriptions of association rates, improving estimates by orders of magnitude. PMID:23293662
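
    As a hedged point of reference (not the paper's method), the classical Smoluchowski diffusion-limited rate k = 4*pi*D*R for a fully absorbing spherical site can be evaluated directly; the finite-element approach above refines this limit with realistic geometry and buried channels. The diffusion coefficient and capture radius below are illustrative assumptions.

        import math

        def smoluchowski_rate(d_cm2_s, r_angstrom):
            """Diffusion-limited association rate in M^-1 s^-1."""
            n_avogadro = 6.022e23
            d = d_cm2_s * 1e-4                       # cm^2/s -> m^2/s
            r = r_angstrom * 1e-10                   # Angstrom -> m
            k_per_molecule = 4.0 * math.pi * d * r   # m^3/s per molecule
            return k_per_molecule * 1000.0 * n_avogadro   # -> L mol^-1 s^-1

        # Assumed inputs: Ca2+ diffusivity ~7.9e-6 cm^2/s and a ~5 Angstrom capture radius.
        print(f"{smoluchowski_rate(7.9e-6, 5.0):.2e} M^-1 s^-1")   # on the order of 3e9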

  10. Finite Element Estimation of Protein-Ligand Association Rates with Post-Encounter Effects: Applications to Calcium binding in Troponin C and SERCA.

    PubMed

    Kekenes-Huskey, P M; Gillette, A; Hake, J; McCammon, J A

    2012-10-31

    We introduce a computational pipeline and suite of software tools for the approximation of diffusion-limited binding based on a recently developed theoretical framework. Our approach handles molecular geometries generated from high-resolution structural data and can account for active sites buried within the protein or behind gating mechanisms. Using tools from the FEniCS library and the APBS solver, we implement a numerical code for our method and study two Ca(2+)-binding proteins: Troponin C and the Sarcoplasmic Reticulum Ca(2+) ATPase (SERCA). We find that a combination of diffusional encounter and internal 'buried channel' descriptions provide superior descriptions of association rates, improving estimates by orders of magnitude.

  11. Finite-element estimation of protein-ligand association rates with post-encounter effects: applications to calcium binding in troponin C and SERCA

    NASA Astrophysics Data System (ADS)

    Kekenes-Huskey, P. M.; Gillette, A.; Hake, J.; McCammon, J. A.

    2012-01-01

    We introduce a computational pipeline and suite of software tools for the approximation of diffusion-limited binding based on a recently developed theoretical framework. Our approach handles molecular geometries generated from high-resolution structural data and can account for active sites buried within the protein or behind gating mechanisms. Using tools from the FEniCS library and the APBS solver, we implement a numerical code for our method and study two Ca2+-binding proteins: troponin C and the sarcoplasmic reticulum Ca2+ ATPase. We find that a combination of diffusional encounter and internal ‘buried channel’ descriptions provides superior descriptions of association rates, improving estimates by orders of magnitude.

  12. HDL Based FPGA Interface Library for Data Acquisition and Multipurpose Real Time Algorithms

    NASA Astrophysics Data System (ADS)

    Fernandes, Ana M.; Pereira, R. C.; Sousa, J.; Batista, A. J. N.; Combo, A.; Carvalho, B. B.; Correia, C. M. B. A.; Varandas, C. A. F.

    2011-08-01

    The inherent parallelism of the logic resources, the flexibility in its configuration and the performance at high processing frequencies makes the field programmable gate array (FPGA) the most suitable device to be used both for real time algorithm processing and data transfer in instrumentation modules. Moreover, the reconfigurability of these FPGA based modules enables exploiting different applications on the same module. When using a reconfigurable module for various applications, the availability of a common interface library for easier implementation of the algorithms on the FPGA leads to more efficient development. The FPGA configuration is usually specified in a hardware description language (HDL) or other higher level descriptive language. The critical paths, such as the management of internal hardware clocks that require deep knowledge of the module behavior shall be implemented in HDL to optimize the timing constraints. The common interface library should include these critical paths, freeing the application designer from hardware complexity and able to choose any of the available high-level abstraction languages for the algorithm implementation. With this purpose a modular Verilog code was developed for the Virtex 4 FPGA of the in-house Transient Recorder and Processor (TRP) hardware module, based on the Advanced Telecommunications Computing Architecture (ATCA), with eight channels sampling at up to 400 MSamples/s (MSPS). The TRP was designed to perform real time Pulse Height Analysis (PHA), Pulse Shape Discrimination (PSD) and Pile-Up Rejection (PUR) algorithms at a high count rate (few Mevent/s). A brief description of this modular code is presented and examples of its use as an interface with end user algorithms, including a PHA with PUR, are described.
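
    A hedged software model of the pulse-height analysis (PHA) and pile-up rejection (PUR) functions mentioned above; the threshold, dead time and synthetic pulses are invented for illustration and do not reflect the TRP firmware.

        import numpy as np

        def pha_with_pur(signal, threshold=0.2, min_separation=50):
            """Return pulse heights, rejecting pulses closer than min_separation samples."""
            above = signal > threshold
            starts = np.flatnonzero(above[1:] & ~above[:-1]) + 1   # rising edges
            heights, accepted = [], []
            for s in starts:
                end = s
                while end < len(signal) and signal[end] > threshold:
                    end += 1
                if accepted and s - accepted[-1] < min_separation:
                    heights.pop()        # pile-up: discard the previous pulse as well
                    accepted.pop()
                    continue
                accepted.append(s)
                heights.append(signal[s:end].max())
            return np.array(heights)

        # Two clean pulses plus a piled-up pair that should be rejected.
        sig = np.zeros(1000)
        for pos, amp in [(100, 1.0), (400, 0.7), (700, 0.9), (740, 0.5)]:
            sig[pos:pos + 30] += amp * np.hanning(30)
        print(pha_with_pur(sig))   # only the two clean pulse heights remain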

  13. Application of a Java-based, univel geometry, neutral particle Monte Carlo code to the searchlight problem

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Charles A. Wemple; Joshua J. Cogliati

    2005-04-01

    A univel geometry, neutral particle Monte Carlo transport code, written entirely in the Java programming language, is under development for medical radiotherapy applications. The code uses ENDF-VI based continuous energy cross section data in a flexible XML format. Full neutron-photon coupling, including detailed photon production and photonuclear reactions, is included. Charged particle equilibrium is assumed within the patient model so that detailed transport of electrons produced by photon interactions may be neglected. External beam and internal distributed source descriptions for mixed neutron-photon sources are allowed. Flux and dose tallies are performed on a univel basis. A four-tap, shift-register-sequence random number generator is used. Initial verification and validation testing of the basic neutron transport routines is underway. The searchlight problem was chosen as a suitable first application because of the simplicity of the physical model. Results show excellent agreement with analytic solutions. Computation times for similar numbers of histories are comparable to other neutron MC codes written in C and FORTRAN.
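
    A hedged sketch of an analog Monte Carlo calculation in the same spirit: a pencil beam of neutral particles entering a homogeneous half-space with isotropic scattering, tallying the reflected and absorbed fractions. It is illustrative only and unrelated to the Java code described above.

        import math
        import random

        def halfspace_albedo_mc(n_hist=100_000, sigma_t=1.0, c=0.8, seed=1):
            """c is the scattering probability per collision (sigma_s / sigma_t)."""
            rng = random.Random(seed)
            reflected = absorbed = 0
            for _ in range(n_hist):
                z, mu = 0.0, 1.0                              # start at the surface, heading inward
                while True:
                    z += mu * (-math.log(rng.random()) / sigma_t)   # sample a free path
                    if z < 0.0:
                        reflected += 1                        # escaped back through the surface
                        break
                    if rng.random() > c:
                        absorbed += 1                         # collision ends in absorption
                        break
                    mu = 2.0 * rng.random() - 1.0             # isotropic scatter: new direction cosine
            return reflected / n_hist, absorbed / n_hist

        print(halfspace_albedo_mc())   # reflected and absorbed fractions sum to 1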

  14. Models and Frameworks: A Synergistic Association for Developing Component-Based Applications

    PubMed Central

    Sánchez-Ledesma, Francisco; Sánchez, Pedro; Pastor, Juan A.; Álvarez, Bárbara

    2014-01-01

    The use of frameworks and components has been shown to be effective in improving software productivity and quality. However, the results in terms of reuse and standardization show a dearth of portability either of designs or of component-based implementations. This paper, which is based on the model driven software development paradigm, presents an approach that separates the description of component-based applications from their possible implementations for different platforms. This separation is supported by automatic integration of the code obtained from the input models into frameworks implemented using object-oriented technology. Thus, the approach combines the benefits of modeling applications from a higher level of abstraction than objects, with the higher levels of code reuse provided by frameworks. In order to illustrate the benefits of the proposed approach, two representative case studies that use both an existing framework and an ad hoc framework, are described. Finally, our approach is compared with other alternatives in terms of the cost of software development. PMID:25147858

  15. Models and frameworks: a synergistic association for developing component-based applications.

    PubMed

    Alonso, Diego; Sánchez-Ledesma, Francisco; Sánchez, Pedro; Pastor, Juan A; Álvarez, Bárbara

    2014-01-01

    The use of frameworks and components has been shown to be effective in improving software productivity and quality. However, the results in terms of reuse and standardization show a dearth of portability, either of designs or of component-based implementations. This paper, which is based on the model-driven software development paradigm, presents an approach that separates the description of component-based applications from their possible implementations for different platforms. This separation is supported by the automatic integration of the code obtained from the input models into frameworks implemented using object-oriented technology. Thus, the approach combines the benefits of modeling applications at a higher level of abstraction than objects with the higher levels of code reuse provided by frameworks. To illustrate the benefits of the proposed approach, two representative case studies that use both an existing framework and an ad hoc framework are described. Finally, our approach is compared with other alternatives in terms of the cost of software development.

  16. Noise suppression methods for robust speech processing

    NASA Astrophysics Data System (ADS)

    Boll, S. F.; Ravindra, H.; Randall, G.; Armantrout, R.; Power, R.

    1980-05-01

    Robust speech processing in practical operating environments requires effective environmental and processor noise suppression. This report describes the technical findings and accomplishments during this reporting period for the research program funded to develop real-time, compressed speech analysis-synthesis algorithms whose performance is invariant under signal contamination. Fulfillment of this requirement is necessary to ensure reliable, secure, compressed speech transmission within realistic military command and control environments. Overall contributions resulting from this research program include an understanding of how environmental noise degrades narrow-band coded speech, the development of appropriate real-time noise suppression algorithms, and the development of speech parameter identification methods that treat signal contamination as a fundamental element in the estimation process. This report describes the current research and results in the areas of noise suppression using dual-input adaptive noise cancellation and short-time Fourier transform algorithms, articulation rate change techniques, and an experiment which demonstrated that the spectral subtraction noise suppression algorithm can improve the intelligibility of 2400 bps, LPC-10-coded helicopter speech by 10.6 points.
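
    The spectral subtraction algorithm mentioned above can be summarized in a few lines; the Python sketch below is a generic magnitude-subtraction version with overlap-add resynthesis, using an assumed frame length, hop size, and spectral floor rather than the parameters of the original LPC-10 experiments.

        import numpy as np

        def spectral_subtraction(noisy, noise, frame=256, hop=128, floor=0.02):
            """Generic magnitude spectral subtraction with overlap-add (illustrative)."""
            window = np.hanning(frame)
            # Average noise magnitude spectrum over noise-only frames.
            n_noise = (len(noise) - frame) // hop + 1
            noise_mag = np.mean([np.abs(np.fft.rfft(noise[i*hop:i*hop+frame] * window))
                                 for i in range(n_noise)], axis=0)
            out = np.zeros(len(noisy))
            norm = np.zeros(len(noisy))
            n_frames = (len(noisy) - frame) // hop + 1
            for i in range(n_frames):
                seg = noisy[i*hop:i*hop+frame] * window
                spec = np.fft.rfft(seg)
                mag = np.abs(spec) - noise_mag                 # subtract the noise estimate
                mag = np.maximum(mag, floor * noise_mag)       # spectral floor limits musical noise
                clean = np.fft.irfft(mag * np.exp(1j * np.angle(spec)), n=frame)
                out[i*hop:i*hop+frame] += clean
                norm[i*hop:i*hop+frame] += window
            return out / np.maximum(norm, 1e-8)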

  17. Analysis of typical WWER-1000 severe accident scenarios

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sorokin, Yu.S.; Shchekoldin, V.V.; Borisov, L.N.

    2004-07-01

    EDO 'Gidropress' has accumulated considerable experience in analysing severe accidents of WWER reactor plants using both domestic and foreign codes. Important data were also obtained from calculational modelling of integrated experiments on fuel-assembly melting with real fuel. Systematizing and accounting for these data in the development and validation of codes is extremely important, given the large uncertainty that still exists in the understanding and adequate description of severe-accident phenomenology. The present report compares analysis results for severe accidents of a WWER-1000 reactor plant for two typical scenarios, obtained with the American MELCOR code and the Russian RATEG/SVECHA/HEFEST code. The results of calculational modelling with these codes are also compared with data from the FPT1 experiment on melting of a fuel assembly containing real fuel, carried out at the Phebus facility (France). The results are considered from the viewpoint of: the adequacy of the calculational modelling of individual phenomena during severe accidents of WWER reactor plants with these codes; the influence of uncertainties (the level of detail of the calculational models, the choice of model parameters, etc.); the choice of particular setup variables (options) in the codes used; and the need for detailed modelling of processes and phenomena as applied to the design justification of WWER reactor plant safety. (authors)

  18. Project Summary. THE RETC CODE FOR QUANTIFYING THE HYDRAULIC FUNCTIONS OF UNSATURATED SOILS

    EPA Science Inventory

    This summary describes the RETC computer code for analyzing the soil water retention and hydraulic conductivity functions of unsaturated soils. These hydraulic properties are key parameters in any quantitative description of water flow into and through the unsaturated zone of soi...

  19. The FORTRAN static source code analyzer program (SAP) system description

    NASA Technical Reports Server (NTRS)

    Decker, W.; Taylor, W.; Merwarth, P.; Oneill, M.; Goorevich, C.; Waligora, S.

    1982-01-01

    A source code analyzer program (SAP) designed to assist personnel in conducting studies of FORTRAN programs is described. The SAP scans FORTRAN source code and produces reports that present statistics and measures of the statements and structures that make up a module. The processing performed by SAP and the routines, COMMON blocks, and files used by SAP are described. The system generation procedure for SAP is also presented.

  20. Evaluating training of screening, brief intervention, and referral to treatment (SBIRT) for substance use: Reliability of the MD3 SBIRT Coding Scale.

    PubMed

    DiClemente, Carlo C; Crouch, Taylor Berens; Norwood, Amber E Q; Delahanty, Janine; Welsh, Christopher

    2015-03-01

    Screening, brief intervention, and referral to treatment (SBIRT) has become an empirically supported and widely implemented approach in primary and specialty care for addressing substance misuse. Accordingly, training of providers in SBIRT has increased exponentially in recent years. However, the quality and fidelity of training programs and subsequent interventions are largely unknown because of the lack of SBIRT-specific evaluation tools. The purpose of this study was to create a coding scale to assess quality and fidelity of SBIRT interactions addressing alcohol, tobacco, illicit drugs, and prescription medication misuse. The scale was developed to evaluate performance in an SBIRT residency training program. Scale development was based on training protocol and competencies with consultation from Motivational Interviewing coding experts. Trained medical residents practiced SBIRT with standardized patients during 10- to 15-min videotaped interactions. This study included 25 tapes from the Family Medicine program coded by 3 unique coder pairs with varying levels of coding experience. Interrater reliability was assessed for overall scale components and individual items via intraclass correlation coefficients. Coder pair-specific reliability was also assessed. Interrater reliability was excellent overall for the scale components (>.85) and nearly all items. Reliability was higher for more experienced coders, though still adequate for the trained coder pair. Descriptive data demonstrated a broad range of adherence and skills. Subscale correlations supported concurrent and discriminant validity. Data provide evidence that the MD3 SBIRT Coding Scale is a psychometrically reliable coding system for evaluating SBIRT interactions and can be used to evaluate implementation skills for fidelity, training, assessment, and research. Recommendations for refinement and further testing of the measure are discussed. (PsycINFO Database Record (c) 2015 APA, all rights reserved).

  1. Capabilities overview of the MORET 5 Monte Carlo code

    NASA Astrophysics Data System (ADS)

    Cochet, B.; Jinaphanh, A.; Heulers, L.; Jacquet, O.

    2014-06-01

    The MORET code is a simulation tool that solves the transport equation for neutrons using the Monte Carlo method. It allows users to model complex three-dimensional geometrical configurations, describe the materials, and define their own tallies in order to analyse the results. The MORET code was initially designed to perform calculations for criticality safety assessments. New features have been introduced in the MORET 5 code to expand its use for reactor applications. This paper presents an overview of the MORET 5 code capabilities, going through the description of materials, the geometry modelling, the transport simulation and the definition of the outputs.

  2. DOE Office of Scientific and Technical Information (OSTI.GOV)

    O'Malley, Kathleen; Lopez, Hugo; Cairns, Julie

    An overview of the main North American codes and standards associated with hydrogen safety sensors is provided. The distinction between a code and a standard is defined, and the relationship between standards and codes is clarified, especially for those circumstances where a standard or a certification requirement is explicitly referenced within a code. The report identifies three main types of standards commonly applied to hydrogen sensors (interface and controls standards, shock and hazard standards, and performance-based standards). The certification process and a list and description of the main standards and model codes associated with the use of hydrogen safety sensors in hydrogen infrastructure are presented.

  3. Advances in ultrasonic testing of austenitic stainless steel welds. Towards a 3D description of the material including attenuation and optimisation by inversion

    NASA Astrophysics Data System (ADS)

    Moysan, J.; Gueudré, C.; Ploix, M.-A.; Corneloup, G.; Guy, Ph.; Guerjouma, R. El; Chassignole, B.

    In the case of multi-pass welds, the material is very difficult to describe due to its anisotropic and heterogeneous properties. Anisotropy results from the metal solidification and is correlated with the grain orientation. A precise description of the material is one of the key points for obtaining reliable results with wave propagation codes. A first advance is the MINA model, which predicts the grain orientations in multi-pass 316-L steel welds. For flat-position welding, good predictions of the grain orientations were obtained using 2D modelling. For positional welding, the resulting grain structure may be oriented in 3D. We indicate how the MINA model can be improved to give a 3D description. A second advance is a good quantification of the attenuation. Precise measurements are obtained using the plane-wave angular spectrum method together with the computation of the transmission coefficients for triclinic material. With these first two advances, the third one becomes possible: developing an inverse method to obtain the material description from ultrasonic measurements at different positions.

  4. VHDL Descriptions for the FPGA Implementation of PWL-Function-Based Multi-Scroll Chaotic Oscillators

    PubMed Central

    2016-01-01

    Nowadays, chaos generators are an attractive field for research, and the challenge is their realization for the development of engineering applications. For more than three decades, chaotic oscillators have been designed using discrete electronic devices, and very few with integrated-circuit technology; in this work we propose the use of field-programmable gate arrays (FPGAs) for fast prototyping. FPGA-based applications require expertise in programming with the very-high-speed integrated circuit hardware description language (VHDL). Accordingly, we detail the VHDL descriptions of chaos generators for fast prototyping from high-level programming using Python. The case studies are three kinds of chaos generators based on piecewise-linear (PWL) functions that can be systematically augmented to generate even and odd numbers of scrolls. We introduce new algorithms for the VHDL description of PWL functions such as saturated function series, negative slopes and sawtooth functions. The generated VHDL code is portable, reusable and open source, ready to be synthesized in an FPGA. Finally, we show experimental results for observing 2-, 10- and 30-scroll attractors. PMID:27997930
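
    As a hint of what the high-level Python layer evaluates before it is translated to VHDL, the sketch below computes a simplified saturated-function series of the kind used to produce multi-scroll attractors; the slope, plateau spacing, and break-point placement are illustrative assumptions rather than the paper's exact parameterization.

        import numpy as np

        def saturated_series(x, k=10.0, h=2.0, scrolls=4):
            """Sum of shifted saturated (clipped linear) segments, one per scroll level."""
            x = np.asarray(x, dtype=float)
            centres = (np.arange(scrolls - 1) - (scrolls - 2) / 2.0) * h
            y = np.zeros_like(x)
            for m in centres:
                y += np.clip(k * (x - m), -1.0, 1.0)   # one saturated segment per centre
            return y

    A fixed-point version of such a function, expressed with comparators and adders, is the sort of structure the generated VHDL ultimately implements on the FPGA.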

  5. VHDL Descriptions for the FPGA Implementation of PWL-Function-Based Multi-Scroll Chaotic Oscillators.

    PubMed

    Tlelo-Cuautle, Esteban; Quintas-Valles, Antonio de Jesus; de la Fraga, Luis Gerardo; Rangel-Magdaleno, Jose de Jesus

    2016-01-01

    Nowadays, chaos generators are an attractive field for research, and the challenge is their realization for the development of engineering applications. For more than three decades, chaotic oscillators have been designed using discrete electronic devices, and very few with integrated-circuit technology; in this work we propose the use of field-programmable gate arrays (FPGAs) for fast prototyping. FPGA-based applications require expertise in programming with the very-high-speed integrated circuit hardware description language (VHDL). Accordingly, we detail the VHDL descriptions of chaos generators for fast prototyping from high-level programming using Python. The case studies are three kinds of chaos generators based on piecewise-linear (PWL) functions that can be systematically augmented to generate even and odd numbers of scrolls. We introduce new algorithms for the VHDL description of PWL functions such as saturated function series, negative slopes and sawtooth functions. The generated VHDL code is portable, reusable and open source, ready to be synthesized in an FPGA. Finally, we show experimental results for observing 2-, 10- and 30-scroll attractors.

  6. Meta-analyses and Forest plots using a microsoft excel spreadsheet: step-by-step guide focusing on descriptive data analysis.

    PubMed

    Neyeloff, Jeruza L; Fuchs, Sandra C; Moreira, Leila B

    2012-01-20

    Meta-analyses are necessary to synthesize data obtained from primary research, and in many situations reviews of observational studies are the only available alternative. General-purpose statistical packages can meta-analyze data, but usually require external macros or coding. Commercial specialist software is available, but may be expensive and focused on a particular type of primary data. Most available software packages have limitations in dealing with descriptive data, and the graphical display of summary statistics such as incidence and prevalence is unsatisfactory. Analyses can be conducted using Microsoft Excel, but no previous guide was available. We constructed a step-by-step guide to performing a meta-analysis in a Microsoft Excel spreadsheet, using either fixed-effect or random-effects models. We have also developed a second spreadsheet capable of producing customized forest plots. It is possible to conduct a meta-analysis using only Microsoft Excel. More importantly, to our knowledge this is the first description of a method for producing a statistically adequate but graphically appealing forest plot summarizing descriptive data, using widely available software.
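
    The inverse-variance pooling that such a spreadsheet performs can be written in a few lines; the Python sketch below shows the fixed-effect case, and a random-effects version can be obtained by adding a between-study variance estimate (e.g. DerSimonian-Laird) to the denominator of each weight. Function and variable names are illustrative, not those of the spreadsheet.

        import math

        def fixed_effect_meta(estimates, std_errors):
            """Inverse-variance fixed-effect pooled estimate with a 95% CI."""
            weights = [1.0 / se ** 2 for se in std_errors]
            pooled = sum(w * y for w, y in zip(weights, estimates)) / sum(weights)
            se_pooled = math.sqrt(1.0 / sum(weights))
            return pooled, (pooled - 1.96 * se_pooled, pooled + 1.96 * se_pooled)

        # Example: three studies reporting prevalence estimates with standard errors.
        print(fixed_effect_meta([0.12, 0.18, 0.15], [0.02, 0.03, 0.025]))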

  7. Meta-analyses and Forest plots using a microsoft excel spreadsheet: step-by-step guide focusing on descriptive data analysis

    PubMed Central

    2012-01-01

    Background Meta-analyses are necessary to synthesize data obtained from primary research, and in many situations reviews of observational studies are the only available alternative. General-purpose statistical packages can meta-analyze data, but usually require external macros or coding. Commercial specialist software is available, but may be expensive and focused on a particular type of primary data. Most available software packages have limitations in dealing with descriptive data, and the graphical display of summary statistics such as incidence and prevalence is unsatisfactory. Analyses can be conducted using Microsoft Excel, but no previous guide was available. Findings We constructed a step-by-step guide to performing a meta-analysis in a Microsoft Excel spreadsheet, using either fixed-effect or random-effects models. We have also developed a second spreadsheet capable of producing customized forest plots. Conclusions It is possible to conduct a meta-analysis using only Microsoft Excel. More importantly, to our knowledge this is the first description of a method for producing a statistically adequate but graphically appealing forest plot summarizing descriptive data, using widely available software. PMID:22264277

  8. Laser-sodium interaction for the polychromatic laser guide star project

    NASA Astrophysics Data System (ADS)

    Bellanger, Veronique; Petit, Alain D.

    2002-02-01

    We developed a code, BEACON, aimed at determining the laser parameters leading to the maximum return flux of photons at 0.33 micrometers for a polychromatic sodium Laser Guide Star. This software relies upon a full 48-level, collisionless and magnetic-field-free density-matrix description of the hyperfine structure of Na and includes Doppler broadening and Zeeman degeneracy. Experimental validation of BEACON was conducted on the SILVA facilities and is also discussed in this paper.

  9. Adult Competency Education Kit. Basic Skills in Speaking, Math, and Reading for Employment. Part G. ACE Competency Based Job Descriptions: #22--Refrigerator Mechanic; #24--Motorcycle Repairperson.

    ERIC Educational Resources Information Center

    San Mateo County Office of Education, Redwood City, CA. Career Preparation Centers.

    This fourth of fifteen sets of Adult Competency Education (ACE) Competency Based Job Descriptions in the ACE kit contains job descriptions for Refrigerator Mechanic and Motorcycle Repairperson. Each begins with a fact sheet that includes this information: occupational title, D.O.T. code, ACE number, career ladder, D.O.T. general educational…

  10. 23 CFR 710.201 - State responsibilities.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... 23 Highways 1 2012-04-01 2012-04-01 false State responsibilities. 710.201 Section 710.201 Highways... interest acquired for all Federal-aid projects funded pursuant to title 23 of the United States Code shall... or acquisitions advanced under title 23 of the United States Code with a written description of its...

  11. 75 FR 6769 - Self-Regulatory Organizations; Financial Industry Regulatory Authority, Inc.; Order Approving...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-02-10

    ... To Amend the Hearing Location Rules of the Codes of Arbitration Procedure for Customer and Industry... expand the criteria for selecting a hearing location for an arbitration proceeding. The proposed rule..., 2010. II. Description of the Proposed Rule Change Hearing Location Selection Under the Customer Code...

  12. 40 CFR 51.50 - What definitions apply to this subpart?

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... organization transmitting the data set, including first name, middle name or initial, and surname. Contact... unit's nameplate by the manufacturer. The data element is reported in megawatts or kilowatts. Method accuracy description (MAD) codes means a set of six codes used to define the accuracy of latitude/longitude...

  13. 40 CFR 51.50 - What definitions apply to this subpart?

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... organization transmitting the data set, including first name, middle name or initial, and surname. Contact... unit's nameplate by the manufacturer. The data element is reported in megawatts or kilowatts. Method accuracy description (MAD) codes means a set of six codes used to define the accuracy of latitude/longitude...

  14. 40 CFR 51.50 - What definitions apply to this subpart?

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... organization transmitting the data set, including first name, middle name or initial, and surname. Contact... unit's nameplate by the manufacturer. The data element is reported in megawatts or kilowatts. Method accuracy description (MAD) codes means a set of six codes used to define the accuracy of latitude/longitude...

  15. 40 CFR 51.50 - What definitions apply to this subpart?

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... organization transmitting the data set, including first name, middle name or initial, and surname. Contact... unit's nameplate by the manufacturer. The data element is reported in megawatts or kilowatts. Method accuracy description (MAD) codes means a set of six codes used to define the accuracy of latitude/longitude...

  16. 50 CFR 679.94 - Economic data report (EDR) for the Amendment 80 sector.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ...: NMFS, Alaska Fisheries Science Center, Economic Data Reports, 7600 Sand Point Way NE, F/AKC2, Seattle... Operation Description of code Code NMFS Alaska region ADF&G FCP Catcher/processor Floating catcher processor. FLD Mothership Floating domestic mothership. IFP Stationary Floating Processor Inshore floating...

  17. GCS component development cycle

    NASA Astrophysics Data System (ADS)

    Rodríguez, Jose A.; Macias, Rosa; Molgo, Jordi; Guerra, Dailos; Pi, Marti

    2012-09-01

    The GTC is an optical-infrared 10-meter segmented-mirror telescope at the ORM observatory in the Canary Islands (Spain). First light was on 13/07/2007 and the telescope has been in the operation phase since then. The GTC control system (GCS) is a distributed object- and component-oriented system based on RT-CORBA and is responsible for the management and operation of the telescope, including its instrumentation. GCS has used the Rational Unified Process (RUP) in its development. RUP is an iterative software development process framework. After analysing (use cases) and designing (UML) any of the GCS subsystems, an initial component description of its interface is obtained, and from that information a component specification is written. In order to improve code productivity, GCS has adopted code generation to transform this component specification into the skeleton of component classes based on a software framework, called the Device Component Framework. Using the GCS development tools, based on javadoc and gcc, the component is generated, compiled and deployed in a single step, to be tested for the first time through our GUI inspector. The main advantages of this approach are the following: it reduces the learning curve for new developers and the development error rate, allows a systematic use of design patterns and software reuse, speeds up delivery of the software product, improves design consistency and design quality, and eliminates the future refactoring process that would otherwise be required for the code.

  18. 106-17 Telemetry Standards Metadata Configuration Chapter 23

    DTIC Science & Technology

    2017-07-01

    23.2 Metadata Description Language. Acronyms: HTML, Hypertext Markup Language; MDL, Metadata Description Language; PCM, pulse code modulation; TMATS, Telemetry Attributes Transfer Standard; W3C, World Wide Web Consortium; XML, eXtensible Markup Language; XSD, XML schema document. (Telemetry Network Standard, Chapter 23, July 2017.)

  19. 48 CFR 47.207-3 - Description of shipment, origin, and destination.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... contracting officer shall include in solicitations full details regarding the location from which the freight is to be shipped. For example, if a single location is shown, furnish the shipper's name, street..., including boundaries and ZIP codes. (c) Description of the freight. The contracting officer shall include in...

  20. 48 CFR 47.207-3 - Description of shipment, origin, and destination.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... contracting officer shall include in solicitations full details regarding the location from which the freight is to be shipped. For example, if a single location is shown, furnish the shipper's name, street..., including boundaries and ZIP codes. (c) Description of the freight. The contracting officer shall include in...

  1. 48 CFR 47.207-3 - Description of shipment, origin, and destination.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... contracting officer shall include in solicitations full details regarding the location from which the freight is to be shipped. For example, if a single location is shown, furnish the shipper's name, street..., including boundaries and ZIP codes. (c) Description of the freight. The contracting officer shall include in...

  2. 48 CFR 47.207-3 - Description of shipment, origin, and destination.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... contracting officer shall include in solicitations full details regarding the location from which the freight is to be shipped. For example, if a single location is shown, furnish the shipper's name, street..., including boundaries and ZIP codes. (c) Description of the freight. The contracting officer shall include in...

  3. 48 CFR 47.207-3 - Description of shipment, origin, and destination.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... contracting officer shall include in solicitations full details regarding the location from which the freight is to be shipped. For example, if a single location is shown, furnish the shipper's name, street..., including boundaries and ZIP codes. (c) Description of the freight. The contracting officer shall include in...

  4. Maternal Label and Gesture Use Affects Acquisition of Specific Object Names

    ERIC Educational Resources Information Center

    Zammit, Maria; Schafer, Graham

    2011-01-01

    Ten mothers were observed prospectively, interacting with their infants aged 0;10 in two contexts (picture description and noun description). Maternal communicative behaviours were coded for volubility, gestural production and labelling style. Verbal labelling events were categorized into three exclusive categories: label only; label plus…

  5. HZETRN: Description of a free-space ion and nucleon transport and shielding computer program

    NASA Technical Reports Server (NTRS)

    Wilson, John W.; Badavi, Francis F.; Cucinotta, Francis A.; Shinn, Judy L.; Badhwar, Gautam D.; Silberberg, R.; Tsao, C. H.; Townsend, Lawrence W.; Tripathi, Ram K.

    1995-01-01

    The high-charge-and-energy (HZE) transport computer program HZETRN was developed to address the problems of free-space radiation transport and shielding. The HZETRN program is intended specifically for the design engineer who is interested in obtaining fast and accurate dosimetric information for the design and construction of space modules and devices. The program is based on a one-dimensional, space-marching formulation of the Boltzmann transport equation with a straight-ahead approximation. The effect of the long-range Coulomb force and electron interaction is treated as a continuous slowing-down process. Atomic (electronic) stopping power coefficients for energies above a few MeV/nucleon are calculated using Bethe's theory, including Bragg's rule, Ziegler's shell corrections, and effective charge. Nuclear absorption cross sections are obtained from fits to quantum calculations, and total cross sections are obtained with a Ramsauer formalism. Nuclear fragmentation cross sections are calculated with a semiempirical abrasion-ablation fragmentation model. The relation of the final computer code to the Boltzmann equation is discussed in the context of simplifying assumptions. A detailed description of the flow of the computer code, input requirements, sample output, and compatibility requirements for non-VAX platforms are provided.
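
    The continuous slowing-down treatment mentioned above amounts to integrating the reciprocal of the stopping power over energy; the following Python sketch computes a continuous-slowing-down-approximation range for any user-supplied stopping-power function. It illustrates the approximation only and is not HZETRN code; the numerical parameters are arbitrary.

        def csda_range(stopping_power, e_initial, e_final=0.1, n_steps=10_000):
            """CSDA range R = integral from e_final to e_initial of dE / S(E)."""
            de = (e_initial - e_final) / n_steps
            total = 0.0
            for i in range(n_steps):
                e = e_final + (i + 0.5) * de        # midpoint rule over energy
                total += de / stopping_power(e)
            return total                            # path length, in units set by S(E)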

  6. The diffusion of the distance Entomology Master's Degree Program at the University of Nebraska Lincoln: A descriptive case study

    NASA Astrophysics Data System (ADS)

    Hubbell, Jody M.

    This study explored three selected phases of Rogers' (1995) Diffusion of Innovations Theory to examine the diffusion process of the distance Entomology Master's Degree program at the University of Nebraska, Lincoln. A qualitative descriptive case study approach incorporated semi-structured interviews with individuals involved in one or more of the three stages: Development, Implementation, and Institutionalization. Documents and archival evidence were used to triangulate findings. This research analyzed descriptions of the program as it moved through the Development, Implementation, and, finally, Institutionalization stages of diffusion. Each stage was examined through open and axial coding. Process coding identified themes common to two or more diffusion stages and explored the evolution of themes from one diffusion stage to the next. At a time of significant budget constraints, many departments were faced with the possibility of merger or dissolution. The Entomology Master's Degree Program evolved from being an entrepreneurial means of preventing departmental dissolution to eventually being viewed as a model for the development of similar programs across this university and other institutions of higher education. During this evolution, the program was reinvented to meet the broader needs of industry and a global student market. One finding not consistent with Rogers' model was that smaller, rather than larger, departmental size contributed to the success of the program. Within this small department, faculty members were able to share their experiences and knowledge with each other on a regular basis, which promoted greater acceptance of the distance program. How quality and rigor may be defined and measured was a key issue in each stage. In this specific case, the assessment of quality and rigor initially centered on comparing on-campus and distance course content and later moved to program-based assessment and measures of student outcomes such as job placement rates.

  7. DbMap: improving database interoperability issues in medical software using a simple, Java-Xml based solution.

    PubMed Central

    Karadimas, H.; Hemery, F.; Roland, P.; Lepage, E.

    2000-01-01

    In medical software development, the use of databases plays a central role. However, most databases have heterogeneous encodings and data models. Dealing with these variations directly in the application code is error-prone and reduces the potential reuse of the produced software. Several approaches to overcoming these limitations have been proposed in the medical database literature and are presented here. We present a simple solution, based on a Java library and a central metadata description file in XML. This development approach presents several benefits in software design and development cycles, the main one being simplicity of maintenance. PMID:11079915
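
    The DbMap library itself is written in Java; the Python sketch below only illustrates the underlying idea of a central XML metadata description that maps logical field names onto table and column names, so that application code never hard-codes a particular database layout. The element and attribute names are invented for the example and are not DbMap's actual schema.

        import xml.etree.ElementTree as ET

        METADATA = """
        <mappings>
          <field logical="patient_id" table="ADMISSIONS" column="PAT_ID" type="int"/>
          <field logical="admit_date" table="ADMISSIONS" column="ADM_DT" type="date"/>
        </mappings>
        """

        def load_field_map(xml_text):
            """Build {logical name: (table, column, type)} from the XML description."""
            root = ET.fromstring(xml_text)
            return {f.get("logical"): (f.get("table"), f.get("column"), f.get("type"))
                    for f in root.findall("field")}

        def build_select(logical_fields, field_map):
            """Compose a SELECT statement for fields that live in a single table."""
            cols = [field_map[name] for name in logical_fields]
            tables = {table for table, _, _ in cols}
            assert len(tables) == 1, "sketch handles a single table only"
            return "SELECT {} FROM {}".format(
                ", ".join(column for _, column, _ in cols), tables.pop())

        print(build_select(["patient_id", "admit_date"], load_field_map(METADATA)))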

  8. Developing and refining the methods for a 'one-stop shop' for research evidence about health systems.

    PubMed

    Lavis, John N; Wilson, Michael G; Moat, Kaelan A; Hammill, Amanda C; Boyko, Jennifer A; Grimshaw, Jeremy M; Flottorp, Signe

    2015-02-25

    Policymakers, stakeholders and researchers have not been able to find research evidence about health systems using an easily understood taxonomy of topics, know when they have conducted a comprehensive search of the many types of research evidence relevant to them, or rapidly identify decision-relevant information in their search results. To address these gaps, we developed an approach to building a 'one-stop shop' for research evidence about health systems. We developed a taxonomy of health system topics and iteratively refined it by drawing on existing categorization schemes and by using it to categorize progressively larger bundles of research evidence. We identified systematic reviews, systematic review protocols, and review-derived products through searches of Medline, hand searches of several databases indexing systematic reviews, hand searches of journals, and continuous scanning of listservs and websites. We developed an approach to providing 'added value' to existing content (e.g., coding systematic reviews according to the countries in which included studies were conducted) and to expanding the types of evidence eligible for inclusion (e.g., economic evaluations and health system descriptions). Lastly, we developed an approach to continuously updating the online one-stop shop in seven supported languages. The taxonomy is organized by governance, financial, and delivery arrangements and by implementation strategies. The 'one-stop shop', called Health Systems Evidence, contains a comprehensive inventory of evidence briefs, overviews of systematic reviews, systematic reviews, systematic review protocols, registered systematic review titles, economic evaluations and costing studies, health reform descriptions and health system descriptions, and many types of added-value coding. It is continuously updated and new content is regularly translated into Arabic, Chinese, English, French, Portuguese, Russian, and Spanish. Policymakers and stakeholders can now easily access and use a wide variety of types of research evidence about health systems to inform decision-making and advocacy. Researchers and research funding agencies can use Health Systems Evidence to identify gaps in the current stock of research evidence and domains that could benefit from primary research, systematic reviews, and review overviews.

  9. The descriptive epidemiology of sports/leisure-related heat illness hospitalisations in New South Wales, Australia.

    PubMed

    Finch, Caroline F; Boufous, Soufiane

    2008-01-01

    Sport-related heat illness has not been commonly studied from an epidemiological perspective. This study presents the descriptive epidemiology of sports/leisure-related heat illness hospitalisations in New South Wales, Australia. All in-patient separations from all acute hospitals in NSW during 2001-2004 with an International Classification of Diseases external cause of injury code indicating "exposure to excessive natural heat (X30)", or any ICD-10 diagnosis code in the range "effects of heat and light (T67.0-T67.9)", were analysed. The sport/leisure relatedness of cases was defined by ICD-10-AM activity codes indicating involvement in sport/leisure activities. Cases of exposure to heat while engaged in sport/leisure were described by gender, year, age, principal diagnosis, type of activity/sport and length of stay. There were 109 hospital separations for exposure to heat while engaging in sport/leisure activity, with the majority occurring during the hottest months. The number of male cases increased significantly over the 4-year period, and people aged 45 years and over had the largest number of cases. Heat exhaustion was the leading cause of hospital separation (40% of cases). Marathon running, cricket and golf were the activities most commonly associated with heat-related hospitalisation. Ongoing development and refinement of expert position statements regarding heat illnesses need to draw on both epidemiological and physiological evidence to ensure their relevance to all levels of risk in real-world sport training and competition contexts.
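
    Case selection of this kind reduces to a simple filter over the coded fields of each hospital separation; the Python sketch below applies the external-cause and diagnosis ranges quoted above. The record structure, field names, and activity code are invented for illustration and are not those of the NSW data set.

        def is_heat_related(record):
            """True if a separation carries X30 or any T67.0-T67.9 code."""
            if "X30" in record.get("external_causes", []):
                return True
            return any(code.startswith("T67") for code in record.get("diagnoses", []))

        def is_sport_leisure(record, activity_codes):
            """True if any ICD-10-AM activity code marks sport/leisure involvement."""
            return any(code in activity_codes for code in record.get("activities", []))

        # Example: a heat exhaustion admission with a sport activity code (illustrative codes).
        example = {"external_causes": ["X30"], "diagnoses": ["T67.3"], "activities": ["U50.1"]}
        print(is_heat_related(example) and is_sport_leisure(example, {"U50.1"}))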

  10. Simulation studies using multibody dynamics code DART

    NASA Technical Reports Server (NTRS)

    Keat, James E.

    1989-01-01

    DART is a multibody dynamics code developed by Photon Research Associates for the Air Force Astronautics Laboratory (AFAL). The code is intended primarily to simulate the dynamics of large space structures, particularly during the deployment phase of their missions. DART integrates nonlinear equations of motion numerically. The number of bodies in the system being simulated is arbitrary. The bodies' interconnection joints can have an arbitrary number of degrees of freedom between 0 and 6. Motions across the joints can be large. Provision for simulating on-board control systems is provided. Conservation of energy and momentum, when applicable, are used to evaluate DART's performance. After a brief description of DART, studies made to test the program prior to its delivery to AFAL are described. The first is a large-angle reorientation of a flexible spacecraft consisting of a rigid central hub and four flexible booms. Reorientation was accomplished by a single-cycle sine-wave torque input. In the second study, an appendage mounted on a spacecraft was slewed through a large angle. Four closed-loop control systems provided control of this appendage and of the spacecraft's attitude. The third study simulated the deployment of the rim of a bicycle-wheel-configuration large space structure. This system contained 18 bodies. An interesting and unexpected feature of the dynamics was a pulsing phenomenon experienced by the stays whose payout was used to control the deployment. A short description of the current status of DART is given.

  11. Mathematical Description of Complex Chemical Kinetics and Application to CFD Modeling Codes

    NASA Technical Reports Server (NTRS)

    Bittker, D. A.

    1993-01-01

    A major effort in combustion research at the present time is devoted to the theoretical modeling of practical combustion systems. These include turbojet and ramjet air-breathing engines as well as ground-based gas-turbine power generating systems. The ability to use computational modeling extensively in designing these products not only saves time and money, but also helps designers meet the quite rigorous environmental standards that have been imposed on all combustion devices. The goal is to combine the very complex solution of the Navier-Stokes flow equations with realistic turbulence and heat-release models into a single computer code. Such a computational fluid-dynamic (CFD) code simulates the coupling of fluid mechanics with the chemistry of combustion to describe the practical devices. This paper will focus on the task of developing a simplified chemical model which can predict realistic heat-release rates as well as species composition profiles, and is also computationally rapid. We first discuss the mathematical techniques used to describe a complex, multistep fuel oxidation chemical reaction and develop a detailed mechanism for the process. We then show how this mechanism may be reduced and simplified to give an approximate model which adequately predicts heat release rates and a limited number of species composition profiles, but is computationally much faster than the original one. Only such a model can be incorporated into a CFD code without adding significantly to long computation times. Finally, we present some of the recent advances in the development of these simplified chemical mechanisms.
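
    To make the notion of a reduced mechanism concrete, the sketch below integrates a single global Arrhenius reaction step in Python; the pre-exponential factor, activation energy, and time step are arbitrary illustrative values, not a calibrated model from the paper.

        import math

        def arrhenius(a, e_act, temperature, r_gas=8.314):
            """Arrhenius rate constant k = A * exp(-Ea / (R*T))."""
            return a * math.exp(-e_act / (r_gas * temperature))

        def one_step_burn(fuel0, oxid0, a, e_act, temperature, dt=1e-6, t_end=1e-3):
            """Explicit-Euler integration of a one-step global reaction F + O -> products."""
            fuel, oxid, t = fuel0, oxid0, 0.0
            k = arrhenius(a, e_act, temperature)
            while t < t_end and fuel > 0.0 and oxid > 0.0:
                rate = k * fuel * oxid              # bimolecular global rate
                fuel = max(fuel - rate * dt, 0.0)
                oxid = max(oxid - rate * dt, 0.0)
                t += dt
            return fuel, oxid

    A CFD solver would evaluate such a rate expression in every cell at every time step, which is why the paper emphasizes keeping the reduced mechanism computationally cheap.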

  12. Mathematical description of complex chemical kinetics and application to CFD modeling codes

    NASA Technical Reports Server (NTRS)

    Bittker, D. A.

    1993-01-01

    A major effort in combustion research at the present time is devoted to the theoretical modeling of practical combustion systems. These include turbojet and ramjet air-breathing engines as well as ground-based gas-turbine power generating systems. The ability to use computational modeling extensively in designing these products not only saves time and money, but also helps designers meet the quite rigorous environmental standards that have been imposed on all combustion devices. The goal is to combine the very complex solution of the Navier-Stokes flow equations with realistic turbulence and heat-release models into a single computer code. Such a computational fluid-dynamic (CFD) code simulates the coupling of fluid mechanics with the chemistry of combustion to describe the practical devices. This paper will focus on the task of developing a simplified chemical model which can predict realistic heat-release rates as well as species composition profiles, and is also computationally rapid. We first discuss the mathematical techniques used to describe a complex, multistep fuel oxidation chemical reaction and develop a detailed mechanism for the process. We then show how this mechanism may be reduced and simplified to give an approximate model which adequately predicts heat release rates and a limited number of species composition profiles, but is computationally much faster than the original one. Only such a model can be incorporated into a CFD code without adding significantly to long computation times. Finally, we present some of the recent advances in the development of these simplified chemical mechanisms.

  13. Photoionization and High Density Gas

    NASA Technical Reports Server (NTRS)

    Kallman, T.; Bautista, M.; White, Nicholas E. (Technical Monitor)

    2002-01-01

    We present results of calculations using the XSTAR version 2 computer code. This code is loosely based on the XSTAR v.1 code which has been available for public use for some time. However it represents an improvement and update in several major respects, including atomic data, code structure, user interface, and improved physical description of ionization/excitation. In particular, it now is applicable to high density situations in which significant excited atomic level populations are likely to occur. We describe the computational techniques and assumptions, and present sample runs with particular emphasis on high density situations.

  14. Extensions and Adjuncts to the BRL-COMGEOM Program

    DTIC Science & Technology

    1974-08-01

    Keywords: MAGIC Code, GIFT Code, Computer Simulation, Target Description, Geometric Modeling Techniques, Vulnerability Analysis. The report describes extensions and adjuncts to the BRL GIFT code, including the addition of the arbitrary quadric surface to the list of available body types and BRITL, a geometry preprocessor program for input to the GIFT system. The tasks completed under this contract are described in the report.

  15. Neutrino-induced reactions on nuclei

    NASA Astrophysics Data System (ADS)

    Gallmeister, K.; Mosel, U.; Weil, J.

    2016-09-01

    Background: Long-baseline experiments such as the planned Deep Underground Neutrino Experiment (DUNE) require theoretical descriptions of the complete event in a neutrino-nucleus reaction. Since nuclear targets are used, this requires a good understanding of neutrino-nucleus interactions. Purpose: Develop a consistent theory and code framework for the description of lepton-nucleus interactions that can be used to describe not only inclusive cross sections but also the complete final state of the reaction. Methods: The Giessen-Boltzmann-Uehling-Uhlenbeck (GiBUU) implementation of quantum-kinetic transport theory is used, with improvements in its treatment of the nuclear ground state and of 2p2h interactions. For the latter, an empirical structure function from electron scattering data is used as a basis. Results: Results for electron-induced inclusive cross sections are given as a necessary check for the overall quality of this approach. The calculated neutrino-induced inclusive double-differential cross sections show good agreement with data from neutrino and antineutrino reactions for different neutrino flavors at MiniBooNE and T2K. Inclusive double-differential cross sections for MicroBooNE, NOvA, MINERvA, and LBNF/DUNE are given. Conclusions: Based on the GiBUU model of lepton-nucleus interactions, a good theoretical description of inclusive electron-, neutrino-, and antineutrino-nucleus data over a wide range of energies, different neutrino flavors, and different experiments is now possible. Since no tuning is involved, this theory and code should be reliable also for new energy regimes and target masses.

  16. Systematic text condensation: a strategy for qualitative analysis.

    PubMed

    Malterud, Kirsti

    2012-12-01

    To present background, principles, and procedures for a strategy for qualitative analysis called systematic text condensation and discuss this approach compared with related strategies. Giorgi's psychological phenomenological analysis is the point of departure and inspiration for systematic text condensation. The basic elements of Giorgi's method and the elaboration of these in systematic text condensation are presented, followed by a detailed description of procedures for analysis according to systematic text condensation. Finally, similarities and differences compared with other frequently applied methods for qualitative analysis are identified, as the foundation of a discussion of strengths and limitations of systematic text condensation. Systematic text condensation is a descriptive and explorative method for thematic cross-case analysis of different types of qualitative data, such as interview studies, observational studies, and analysis of written texts. The method represents a pragmatic approach, although inspired by phenomenological ideas, and various theoretical frameworks can be applied. The procedure consists of the following steps: 1) total impression - from chaos to themes; 2) identifying and sorting meaning units - from themes to codes; 3) condensation - from code to meaning; 4) synthesizing - from condensation to descriptions and concepts. Similarities and differences comparing systematic text condensation with other frequently applied qualitative methods regarding thematic analysis, theoretical methodological framework, analysis procedures, and taxonomy are discussed. Systematic text condensation is a strategy for analysis developed from traditions shared by most of the methods for analysis of qualitative data. The method offers the novice researcher a process of intersubjectivity, reflexivity, and feasibility, while maintaining a responsible level of methodological rigour.

  17. Process modelling for materials preparation experiments

    NASA Technical Reports Server (NTRS)

    Rosenberger, Franz; Alexander, J. Iwan D.

    1992-01-01

    The development is examined of mathematical tools and measurement of transport properties necessary for high fidelity modeling of crystal growth from the melt and solution, in particular for the Bridgman-Stockbarger growth of mercury cadmium telluride (MCT) and the solution growth of triglycine sulphate (TGS). The tasks include development of a spectral code for moving boundary problems, kinematic viscosity measurements on liquid MCT at temperatures close to the melting point, and diffusivity measurements on concentrated and supersaturated TGS solutions. A detailed description is given of the work performed for these tasks, together with a summary of the resulting publications and presentations.

  18. Integrated Power, Avionics, and Software (iPAS) Space Telecommunications Radio System (STRS) Radio User's Guide -- Advanced Exploration Systems (AES)

    NASA Technical Reports Server (NTRS)

    Roche, Rigoberto; Shalkhauser, Mary Jo Windmille

    2017-01-01

    The Integrated Power, Avionics and Software (IPAS) software defined radio (SDR) was implemented on the Reconfigurable, Intelligently-Adaptive Communication System (RAICS) platform, for radio development at NASA Johnson Space Center. Software and hardware description language (HDL) code were delivered by NASA Glenn Research Center for use in the IPAS test bed and for development of their own Space Telecommunications Radio System (STRS) waveforms on the RAICS platform. The purpose of this document is to describe how to setup and operate the IPAS STRS Radio platform with its delivered test waveform.

  19. Real-time operating system for selected Intel processors

    NASA Technical Reports Server (NTRS)

    Pool, W. R.

    1980-01-01

    The rationale for system development is given along with reasons for not using vendor supplied operating systems. Although many system design and performance goals were dictated by problems with vendor supplied systems, other goals surfaced as a result of a design for a custom system able to span multiple projects. System development and management problems and areas that required redesign or major code changes for system implementation are examined as well as the relative successes of the initial projects. A generic description of the actual project is provided and the ongoing support requirements and future plans are discussed.

  20. SIRU development. Volume 3: Software description and program documentation

    NASA Technical Reports Server (NTRS)

    Oehrle, J.

    1973-01-01

    The development and initial evaluation of a strapdown inertial reference unit (SIRU) system are discussed. The SIRU configuration is a modular inertial subsystem with hardware and software features that achieve fault tolerant operational capabilities. The SIRU redundant hardware design is formulated about a six gyro and six accelerometer instrument module package. The six axes array provides redundant independent sensing and the symmetry enables the formulation of an optimal software redundant data processing structure with self-contained fault detection and isolation (FDI) capabilities. The basic SIRU software coding system used in the DDP-516 computer is documented.
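
    A generic flavour of the redundant-data-processing idea, not the SIRU FDI algorithm itself, can be sketched in Python: with six skewed sensing axes the body rate is over-determined, and an unusually large least-squares residual on one instrument points to a fault. The geometry matrix, threshold, and function name are illustrative assumptions.

        import numpy as np

        def detect_faulty_axis(measurements, geometry, threshold=0.01):
            """Flag the instrument with the largest residual if it exceeds the threshold."""
            m = np.asarray(measurements, dtype=float)      # six instrument outputs
            h = np.asarray(geometry, dtype=float)          # 6x3 matrix of sensing-axis unit vectors
            omega, *_ = np.linalg.lstsq(h, m, rcond=None)  # best-fit body rate
            residuals = m - h @ omega
            worst = int(np.argmax(np.abs(residuals)))
            if abs(residuals[worst]) > threshold:
                return worst, float(residuals[worst])      # suspected faulty instrument
            return None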

  1. The CAD triad hypothesis: a mapping between three moral emotions (contempt, anger, disgust) and three moral codes (community, autonomy, divinity).

    PubMed

    Rozin, P; Lowery, L; Imada, S; Haidt, J

    1999-04-01

    It is proposed that 3 emotions--contempt, anger, and disgust--are typically elicited, across cultures, by violations of 3 moral codes proposed by R. A. Shweder and his colleagues (R. A. Shweder, N. C. Much, M. Mahapatra, & L. Park, 1997). The proposed alignment links anger to autonomy (individual rights violations), contempt to community (violation of communal codes including hierarchy), and disgust to divinity (violations of purity-sanctity). This is the CAD triad hypothesis. Students in the United States and Japan were presented with descriptions of situations that involve 1 of the types of moral violations and asked to assign either an appropriate facial expression (from a set of 6) or an appropriate word (contempt, anger, disgust, or their translations). Results generally supported the CAD triad hypothesis. Results were further confirmed by analysis of facial expressions actually made by Americans to the descriptions of these situations.

  2. LTCP 2D Graphical User Interface. Application Description and User's Guide

    NASA Technical Reports Server (NTRS)

    Ball, Robert; Navaz, Homayun K.

    1996-01-01

    A graphical user interface (GUI) written for NASA's LTCP (Liquid Thrust Chamber Performance) two-dimensional computational fluid dynamics code is described. The GUI is written in C++ for a desktop personal computer running under a Microsoft Windows operating environment. Through the use of common and familiar dialog boxes, features, and tools, the user can easily and quickly create and modify input files for the LTCP code. In addition, old input files used with the LTCP code can be opened and modified using the GUI. The program and its capabilities are presented, followed by a detailed description of each menu selection and the method of creating an input file for LTCP. A cross-reference is included to help experienced users quickly find the variables which commonly need changes. Finally, the system requirements and installation instructions are provided.

  3. Experimental and theoretical investigations on the warm-up of a high-pressure mercury discharge lamp

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zalach, J.; Franke, St.; Schoepp, H.

    2011-03-15

    Modern high-pressure discharge lamps are expected to provide instant light and hot-relight capabilities, if possible at lower power ratings. A detailed understanding of the warm-up of high-pressure discharge lamps is therefore required. Complex fluid model codes have been developed over the past years, including more and more processes such as a two-dimensional treatment of convection, in an attempt to provide a more comprehensive and consistent description of high-pressure discharge lamps. However, there is a lack of experimental data against which to examine the performance of these models. This work provides a very complete set of geometrical, electrical, spectroscopic, and thermographic data on the warm-up of a high-pressure mercury discharge lamp, which is compared to the results of a state-of-the-art fluid code. Quantitative agreement is achieved for single parameters such as wall temperatures. But the paper also reveals the need for further investigations and improvements of the code.

  4. Supercomputer description of human lung morphology for imaging analysis.

    PubMed

    Martonen, T B; Hwang, D; Guan, X; Fleming, J S

    1998-04-01

    A supercomputer code that describes the three-dimensional branching structure of the human lung has been developed. The algorithm was written for the Cray C94. In our simulations, the human lung was divided into a matrix containing discrete volumes (voxels) so as to be compatible with analyses of SPECT images. The matrix has 3840 voxels. The matrix can be segmented into transverse, sagittal and coronal layers analogous to human subject examinations. The compositions of individual voxels were identified by the type and respective number of airways present. The code provides a mapping of the spatial positions of the almost 17 million airways in human lungs and unambiguously assigns each airway to a voxel. Thus, the clinician and research scientist in the medical arena have a powerful new tool to be used in imaging analyses. The code was designed to be integrated into diverse applications, including the interpretation of SPECT images, the design of inhalation exposure experiments and the targeted delivery of inhaled pharmacologic drugs.
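
    The assignment of airways to voxels described above is essentially a rounding of spatial coordinates onto a regular grid; the short Python sketch below shows one way to do it. The origin, voxel size, and grid shape are placeholders, not the dimensions of the 3840-voxel matrix used by the code.

        import numpy as np

        def assign_to_voxel(position, origin, voxel_size, grid_shape):
            """Map an airway's (x, y, z) position to integer voxel indices."""
            idx = np.floor((np.asarray(position, dtype=float) - origin) / voxel_size)
            idx = idx.astype(int)
            if np.any(idx < 0) or np.any(idx >= np.asarray(grid_shape)):
                raise ValueError("airway lies outside the voxel matrix")
            return tuple(idx)

        # Example: a 1 cm voxel grid anchored at the matrix corner (illustrative values).
        print(assign_to_voxel((3.2, 7.9, 12.4), origin=(0, 0, 0),
                              voxel_size=(1.0, 1.0, 1.0), grid_shape=(16, 16, 15)))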

  5. Modernizing the ATLAS simulation infrastructure

    NASA Astrophysics Data System (ADS)

    Di Simone, A. (ATLAS Collaboration; Physikalisches Institut, Albert-Ludwigs-Universität Freiburg, 79104 Freiburg i. Br., Germany)

    2017-10-01

    The ATLAS Simulation infrastructure has been used to produce upwards of 50 billion proton-proton collision events for analyses ranging from detailed Standard Model measurements to searches for exotic new phenomena. In the last several years, the infrastructure has been heavily revised to allow intuitive multithreading and significantly improved maintainability. Such a massive update of a legacy code base requires careful choices about what pieces of code to completely rewrite and what to wrap or revise. The initialization of the complex geometry was generalized to allow new tools and geometry description languages, popular in some detector groups. The addition of multithreading requires Geant4-MT and GaudiHive, two frameworks with fundamentally different approaches to multithreading, to work together. It also required enforcing thread safety throughout a large code base, which required the redesign of several aspects of the simulation, including truth, the record of particle interactions with the detector during the simulation. These advances were possible thanks to close interactions with the Geant4 developers.

  6. Prototype Mixed Finite Element Hydrodynamics Capability in ARES

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rieben, R N

    This document describes work on a prototype Mixed Finite Element Method (MFEM) hydrodynamics algorithm in the ARES code, and its application to a set of standard test problems. This work is motivated by the need for improvements to the algorithms used in the Lagrange hydrodynamics step to make them more robust. We begin by identifying the outstanding issues with traditional numerical hydrodynamics algorithms, followed by a description of the proposed method and how it may address several of these longstanding issues. We give a theoretical overview of the proposed MFEM algorithm as well as a summary of the coding additions and modifications that were made to add this capability to the ARES code. We present results obtained with the new method on a set of canonical hydrodynamics test problems and demonstrate significant improvement in comparison to results obtained with traditional methods. We conclude with a summary of the issues still at hand and motivate the need for continued research to develop the proposed method into maturity.

  7. A three-dimensional viscous/potential flow interaction analysis method for multi-element wings: Modifications to the potential flow code to allow part-span, high-lift devices and close-interference calculations

    NASA Technical Reports Server (NTRS)

    Maskew, B.

    1979-01-01

    The description of the modified code includes details of a doublet subpanel technique in which panels that are close to a velocity calculation point are replaced by a subpanel set. This treatment gives the effect of a higher panel density without increasing the number of unknowns. In particular, the technique removes the close approach problem of the earlier singularity model in which distortions occur in the detailed pressure calculation near panel corners. Removal of this problem allowed a complete wake relaxation and roll-up iterative procedure to be installed in the code. The geometry package developed for the new technique and also for the more general configurations is based on a multiple patch scheme. Each patch has a regular array of panels, but arbitrary relationships are allowed between neighboring panels at the edges of adjacent patches. This provides great versatility for treating general configurations.

  8. Adult Competency Education Kit. Basic Skills in Speaking, Math, and Reading for Employment. Part D. ACE Competency Based Job Descriptions: #2--Child Care Attendent; #4--Guard; #8--Medical Assistant.

    ERIC Educational Resources Information Center

    San Mateo County Office of Education, Redwood City, CA. Career Preparation Centers.

    This first of fifteen sets of Adult Competency Education (ACE) Based Job Descriptions in the ACE kit contains job descriptions for Child Care Attendent, Guard, and Medical Assistant. Each begins with a fact sheet that includes this information: occupational title, D.O.T. code, ACE number, career ladder, D.O.T. general educational developmental…

  9. Description and Evaluation of GDEM-V 3.0

    DTIC Science & Technology

    2009-02-06

    Description and Evaluation of GDEM-V 3.0. Michael R. Carnes, Ocean Sciences Branch, Oceanography Division, February 6, 2009; 24 pages; distribution unlimited. The GDEM (Generalized Digital Environment Model) has served as ...

  10. Task Descriptions in Diagnostic Radiology. Research Report No. 7. Volume 4, Index of Tasks by Code Number and Extended Name.

    ERIC Educational Resources Information Center

    Gilpatrick, Eleanor

    The fourth of four volumes in Research Report No. 7 of the Health Services Mobility Study (HSMS), this book contains the extended task names of all the tasks whose descriptions can be found in the three prior volumes. It serves as an index to all the tasks by listing the volume in which each task description appears. Chapter 1 of this volume…

  11. Description and availability of the SMARTS spectral model for photovoltaic applications

    NASA Astrophysics Data System (ADS)

    Myers, Daryl R.; Gueymard, Christian A.

    2004-11-01

    The limited spectral response range of photovoltaic (PV) devices requires that device performance be characterized with respect to widely varying terrestrial solar spectra. The FORTRAN code "Simple Model for Atmospheric Transmission of Sunshine" (SMARTS) was developed for various clear-sky solar renewable energy applications. The model is partly based on parameterizations of transmittance functions in the MODTRAN/LOWTRAN band model family of radiative transfer codes. SMARTS computes spectra with a resolution of 0.5 nanometers (nm) below 400 nm, 1.0 nm from 400 nm to 1700 nm, and 5 nm from 1700 nm to 4000 nm. Fewer than 20 input parameters are required to compute spectral irradiance distributions including spectral direct beam, total, and diffuse hemispherical radiation, and up to 30 other spectral parameters. A spreadsheet-based graphical user interface can be used to simplify the construction of input files for the model. The model is the basis for new terrestrial reference spectra developed by the American Society for Testing and Materials (ASTM) for photovoltaic and materials degradation applications. We describe the model accuracy, functionality, and the availability of source and executable code. Applications to PV rating and efficiency and the combined effects of spectral selectivity and varying atmospheric conditions are briefly discussed.
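    The stated output resolutions translate directly into the spectral grid the model reports on; a short sketch of that grid follows (the 280 nm lower limit is an assumption, since the record only gives the band boundaries at 400, 1700, and 4000 nm).

      # Sketch of the SMARTS output wavelength grid implied by the stated resolutions:
      # 0.5 nm steps below 400 nm, 1.0 nm from 400-1700 nm, 5 nm from 1700-4000 nm.
      # The 280 nm starting point is an assumption, not taken from the record.
      import numpy as np

      grid = np.concatenate([
          np.arange(280.0, 400.0, 0.5),            # UV region, 0.5 nm resolution
          np.arange(400.0, 1700.0, 1.0),           # visible/near-IR, 1.0 nm resolution
          np.arange(1700.0, 4000.0 + 5.0, 5.0),    # shortwave IR, 5 nm resolution
      ])
      print(len(grid), grid[0], grid[-1])          # number of spectral points and range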

  12. The Development of the Ducted Fan Noise Propagation and Radiation Code CDUCT-LaRC

    NASA Technical Reports Server (NTRS)

    Nark, Douglas M.; Farassat, F.; Pope, D. Stuart; Vatsa, Veer

    2003-01-01

    The development of the ducted fan noise propagation and radiation code CDUCT-LaRC at NASA Langley Research Center is described. This code calculates the propagation and radiation of given acoustic modes ahead of the fan face or aft of the exhaust guide vanes in the inlet or exhaust ducts, respectively. This paper gives a description of the modules comprising CDUCT-LaRC. The grid generation module provides automatic creation of numerical grids for complex (non-axisymmetric) geometries that include single or multiple pylons. Files for performing automatic inviscid mean flow calculations are also generated within this module. The duct propagation is based on the parabolic approximation theory of R. P. Dougherty. This theory allows the handling of complex internal geometries and the ability to study the effect of non-uniform (i.e. circumferentially and axially segmented) liners. Finally, the duct radiation module is based on the Ffowcs Williams-Hawkings (FW-H) equation with a penetrable data surface. Refraction of sound through the shear layer between the external flow and bypass duct flow is included. Results for benchmark annular ducts, as well as other geometries with pylons, are presented and compared with available analytical data.

  13. A framework for streamlining research workflow in neuroscience and psychology

    PubMed Central

    Kubilius, Jonas

    2014-01-01

    Successful accumulation of knowledge is critically dependent on the ability to verify and replicate every part of scientific conduct. However, such principles are difficult to enact when researchers continue to rely on ad hoc workflows and poorly maintained code bases. In this paper I examine the needs of the neuroscience and psychology community, and introduce psychopy_ext, a unifying framework that seamlessly integrates popular experiment building, analysis and manuscript preparation tools by choosing reasonable defaults and implementing relatively rigid patterns of workflow. This structure allows for automation of multiple tasks, such as generating user interfaces, unit testing, control analyses of stimuli, single-command access to descriptive statistics, and publication-quality plotting. Taken together, psychopy_ext opens an exciting possibility for faster, more robust code development and collaboration among researchers. PMID:24478691

  14. Online Tools for Astronomy and Cosmochemistry

    NASA Technical Reports Server (NTRS)

    Meyer, B. S.

    2005-01-01

    Over the past year, the Webnucleo Group at Clemson University has been developing a web site with a number of interactive online tools for astronomy and cosmochemistry applications. The site uses SHP (Simplified Hypertext Preprocessor), which, because of its flexibility, allows us to embed almost any computer language into our web pages. For a description of SHP, please see http://www.joeldenny.com/ At our web site, an internet user may mine large and complex data sets, such as our stellar evolution models, and make graphs or tables of the results. The user may also run some of our detailed nuclear physics and astrophysics codes, such as our nuclear statistical equilibrium code, which is written in fortran and C. Again, the user may make graphs and tables and download the results.

  15. SOL - SIZING AND OPTIMIZATION LANGUAGE COMPILER

    NASA Technical Reports Server (NTRS)

    Scotti, S. J.

    1994-01-01

    SOL is a computer language which is geared to solving design problems. SOL includes the mathematical modeling and logical capabilities of a computer language like FORTRAN but also includes the additional power of non-linear mathematical programming methods (i.e. numerical optimization) at the language level (as opposed to the subroutine level). The language-level use of optimization has several advantages over the traditional, subroutine-calling method of using an optimizer: first, the optimization problem is described in a concise and clear manner which closely parallels the mathematical description of optimization; second, a seamless interface is automatically established between the optimizer subroutines and the mathematical model of the system being optimized; third, the results of an optimization (objective, design variables, constraints, termination criteria, and some or all of the optimization history) are output in a form directly related to the optimization description; and finally, automatic error checking and recovery from an ill-defined system model or optimization description is facilitated by the language-level specification of the optimization problem. Thus, SOL enables rapid generation of models and solutions for optimum design problems with greater confidence that the problem is posed correctly. The SOL compiler takes SOL-language statements and generates the equivalent FORTRAN code and system calls. Because of this approach, the modeling capabilities of SOL are extended by the ability to incorporate existing FORTRAN code into a SOL program. In addition, SOL has a powerful MACRO capability. The MACRO capability of the SOL compiler effectively gives the user the ability to extend the SOL language and can be used to develop easy-to-use shorthand methods of generating complex models and solution strategies. The SOL compiler provides syntactic and semantic error-checking, error recovery, and detailed reports containing cross-references to show where each variable was used. The listings summarize all optimizations, listing the objective functions, design variables, and constraints. The compiler offers error-checking specific to optimization problems, so that simple mistakes will not cost hours of debugging time. The optimization engine used by and included with the SOL compiler is a version of Vanderplaats' ADS system (Version 1.1) modified specifically to work with the SOL compiler. SOL allows the use of the over 100 ADS optimization choices, such as Sequential Quadratic Programming, Modified Feasible Directions, interior and exterior penalty function, and variable metric methods. Default choices of the many control parameters of ADS are made for the user; however, the user can override any of the ADS control parameters desired for each individual optimization. The SOL language and compiler were developed with an advanced compiler-generation system to ensure correctness and simplify program maintenance. Thus, SOL's syntax was defined precisely by a LALR(1) grammar and the SOL compiler's parser was generated automatically from the LALR(1) grammar with a parser-generator. Hence, unlike ad hoc, manually coded interfaces, the SOL compiler's lexical analysis ensures that the SOL compiler recognizes all legal SOL programs, can recover from and correct for many errors, and report the location of errors to the user. This version of the SOL compiler has been implemented on VAX/VMS computer systems and requires 204 KB of virtual memory to execute. Since the SOL compiler produces FORTRAN code, it requires the VAX FORTRAN compiler to produce an executable program. The SOL compiler consists of 13,000 lines of Pascal code. It was developed in 1986 and last updated in 1988. The ADS and other utility subroutines amount to 14,000 lines of FORTRAN code and were also updated in 1988.

  16. Relative efficiency and accuracy of two Navier-Stokes codes for simulating attached transonic flow over wings

    NASA Technical Reports Server (NTRS)

    Bonhaus, Daryl L.; Wornom, Stephen F.

    1991-01-01

    Two codes which solve the 3-D Thin Layer Navier-Stokes (TLNS) equations are used to compute the steady state flow for two test cases representing typical finite wings at transonic conditions. Several grids of C-O topology and varying point densities are used to determine the effects of grid refinement. After a description of each code and test case, standards for determining code efficiency and accuracy are defined and applied to determine the relative performance of the two codes in predicting turbulent transonic wing flows. Comparisons of computed surface pressure distributions with experimental data are made.

  17. INDEX TO 16MM EDUCATIONAL FILMS.

    ERIC Educational Resources Information Center

    University of Southern California, Los Angeles. National Information Center for Educational Media.

    SIXTEEN MILLIMETER EDUCATIONAL FILMS ARE LISTED WITH TITLE, DESCRIPTION, TIME, COLOR/BLACK AND WHITE, PRODUCER CODE NAME, DISTRIBUTOR CODE NAME, AND DATE OF PRODUCTION. FILMS ARE LISTED IN TWO WAYS--WITH TITLE ONLY BY SUBJECT IN A SUBJECT MATTER SECTION WHICH HAS AN OUTLINE AND INDEX, AND WITH ALL DATA IN A SECTION WHICH LISTS ALL FILMS…

  18. Occupational Titles Including Job Descriptions for Health Occupations Education.

    ERIC Educational Resources Information Center

    East Texas State Univ., Commerce. Occupational Curriculum Lab.

    This alphabetical compilation of 80 occupational titles for health occupations education is taken from the Dictionary of Occupational Titles, (DOT), 4th edition, 1977. An index shows the arrangement of the occupational titles (together with instructional program and DOT code) according to the United States Office of Education code numbers. For…

  19. 50 CFR Table 15 to Part 679 - Gear Codes, Descriptions, and Use

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    Excerpt from the gear codes table (gear type, code): ..., power gurdy troll (TROLL, 15); all other gear types (OTH); ADF&G gear codes: diving (11), dredge (22), dredge, hydro/mechanical (23), fish ladder/raceway (77), fish wheel (08), gillnet, drift (03), ...

  20. 78 FR 72576 - Criteria for a Catastrophically Disabled Determination for Purposes of Enrollment

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-12-03

    ... Procedural Terminology (CPT®) codes. The revisions ensure that the regulation is not out of date when ... trademark of the American Medical Association. CPT codes and descriptions are copyrighted by the American Medical Association. All rights reserved.) This approach will soon be outdated; the ICD-9-CM and CPT ...

  1. Reporting of occupational injury and illness in the semiconductor manufacturing industry.

    PubMed

    McCurdy, S A; Schenker, M B; Samuels, S J

    1991-01-01

    In the United States, occupational illness and injury cases meeting specific reporting criteria are recorded on company Occupational Safety and Health Administration (OSHA) 200 logs; case description data are submitted to participating state agencies for coding and entry in the national Supplementary Data System (SDS). We evaluated completeness of reporting (the percentage of reportable cases that were recorded in the company OSHA 200 log) in the semiconductor manufacturing industry by reviewing company health clinic records for 1984 of 10 manufacturing sites of member companies of a national semiconductor manufacturing industry trade association. Of 416 randomly selected work-related cases, 101 met OSHA reporting criteria. Reporting completeness was 60 percent and was lowest for occupational illnesses (44 percent). Case-description data from 150 reported cases were submitted twice to state coding personnel to evaluate coding reliability. Reliability was high (kappa 0.82-0.93) for "nature," "affected body part," "source," and "type" variables. Coding for the SDS appears reliable; reporting completeness may be improved by use of a stepwise approach by company personnel responsible for reporting decisions.
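    For reference, the reliability figures quoted are inter-rater kappa statistics; Cohen's kappa compares the observed agreement between the two coding rounds with the agreement expected by chance (a standard definition, not spelled out in the record):

      \kappa = \frac{p_o - p_e}{1 - p_e}

    where p_o is the observed proportion of agreement and p_e the chance-expected proportion; values of 0.82-0.93 are conventionally read as almost perfect agreement.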

  2. Defining acute aortic syndrome after trauma: Are Abbreviated Injury Scale codes a useful surrogate descriptor?

    PubMed

    Leach, R; McNally, Donal; Bashir, Mohamad; Sastry, Priya; Cuerden, Richard; Richens, David; Field, Mark

    2012-10-01

    The severity and location of injuries resulting from vehicular collisions are normally recorded in Abbreviated Injury Scale (AIS) code; we propose a system to link AIS code to a description of acute aortic syndrome (AAS), thus allowing the hypothesis that aortic injury is progressive with collision kinematics to be tested. Standard AIS codes were matched with a clinical description of AAS. A total of 199 collisions that resulted in aortic injury were extracted from a national automotive collision database and the outcomes mapped onto AAS descriptions. The severity of aortic injury (AIS severity score) and stage of AAS progression were compared with collision kinematics and occupant demographics. Post hoc power analyses were used to estimate maximum effect size. The general demographic distribution of the sample represented that of the UK population in regard to sex and age. No significant relationship was observed between estimated test speed, collision direction, occupant location or seat belt use and clinical progression of aortic injury (once initiated). Power analysis confirmed that a suitable sample size was used to observe a medium effect in most of the cases. Similarly, no association was observed between injury severity and collision kinematics. There is sufficient information on AIS severity and location codes to map onto the clinical AAS spectrum. It was not possible, with this data set, to consider the influence of collision kinematics on aortic injury initiation. However, it was demonstrated that after initiation, further progression along the AAS pathway was not influenced by collision kinematics. This might be because the injury is not progressive, because the vehicle kinematics studied do not fully represent the kinematics of the occupants, or because an unknown factor, such as stage of cardiac cycle, dominates. Epidemiologic/prognostic study, level IV.

  3. Optimization of the propulsion for multistage solid rocket motor launchers

    NASA Astrophysics Data System (ADS)

    Calabro, M.; Dufour, A.; Macaire, A.

    2002-02-01

    Some tools focused on a rapid multidisciplinary optimization capability for multistage launch vehicle design were developed at EADS-LV. These tools may be broken down into two categories: those related to propulsion design optimization, and a computer code devoted to trajectory optimization under constraints. Both are linked in order to obtain an optimal vehicle design after an iterative process. After a description of the two categories of tools, an example application to a small space launcher is given.

  4. Air Force Operational Medicine: Using the Enterprise Estimating Supplies Program to Develop Materiel Solutions for the Thoracic/Vascular Surgery Team (FFGKT)

    DTIC Science & Technology

    2010-11-10

    ... asset, including combat wounds, non-battle injuries, and illnesses. International Classification of Diseases, Ninth Revision (ICD-9) coded patient ... patient conditions and the frequency at which they would present. The resulting illness and injury frequencies characterize the expected patient ... The scenario is shown in Table 1 (Thoracic/Vascular Scenario; excerpt): ICD-9 903.9, injury arm vessel NOS, 2 patients; ICD-9 904.8, ...

  5. An open-source textbook for teaching climate-related risk analysis using the R computing environment

    NASA Astrophysics Data System (ADS)

    Applegate, P. J.; Keller, K.

    2015-12-01

    Greenhouse gas emissions lead to increased surface air temperatures and sea level rise. In turn, sea level rise increases the risks of flooding for people living near the world's coastlines. Our own research on assessing sea level rise-related risks emphasizes both Earth science and statistics. At the same time, the free, open-source computing environment R is growing in popularity among statisticians and scientists due to its flexibility and graphics capabilities, as well as its large library of existing functions. We have developed a set of laboratory exercises that introduce students to the Earth science and statistical concepts needed for assessing the risks presented by climate change, particularly sea-level rise. These exercises will be published as a free, open-source textbook on the Web. Each exercise begins with a description of the Earth science and/or statistical concepts that the exercise teaches, with references to key journal articles where appropriate. Next, students are asked to examine in detail a piece of existing R code, and the exercise text provides a clear explanation of how the code works. Finally, students are asked to modify the existing code to produce a well-defined outcome. We discuss our experiences in developing the exercises over two separate semesters at Penn State, plus using R Markdown to interweave explanatory text with sample code and figures in the textbook.

  6. JASMIN: Japanese-American study of muon interactions and neutron detection

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nakashima, Hiroshi; /JAEA, Ibaraki; Mokhov, N.V.

    Experimental studies of shielding and radiation effects at Fermi National Accelerator Laboratory (FNAL) have been carried out under collaboration between FNAL and Japan, aiming at benchmarking of simulation codes and study of irradiation effects for upgrade and design of new high-energy accelerator facilities. The purposes of this collaboration are (1) acquisition of shielding data in a proton beam energy domain above 100 GeV; (2) further evaluation of predictive accuracy of the PHITS and MARS codes; (3) modification of physics models and data in these codes if needed; (4) establishment of irradiation fields for radiation effect tests; and (5) development of a code module for improved description of radiation effects. A series of experiments has been performed at the Pbar target station and NuMI facility, using irradiation of targets with 120 GeV protons for antiproton and neutrino production, as well as the M-test beam line (M-test) for measuring nuclear data and detector responses. Various nuclear and shielding data have been measured by activation methods with chemical separation techniques as well as by other detectors such as a Bonner ball counter. Analyses with the experimental data are in progress for benchmarking the PHITS and MARS15 codes. In this presentation recent activities and results are reviewed.

  7. HINCOF-1: a Code for Hail Ingestion in Engine Inlets

    NASA Technical Reports Server (NTRS)

    Gopalaswamy, N.; Murthy, S. N. B.

    1995-01-01

    One of the major concerns during hail ingestion into an engine is the resulting amount and space- and time-wise distribution of hail at the engine face for a given inlet geometry and set of atmospheric and flight conditions. The appearance of hail in the capture streamtube is invariably random in space and time, with respect to size and momentum. During its motion through an inlet, a hailstone undergoes several processes: impact with other hailstones and with material surfaces of the inlet and spinner; rolling and rebound following impact; heat and mass transfer; phase change; and shattering, the latter three due to friction and impact. Taking all of these factors into account, a numerical code, designated HINCOF-I, has been developed for determining the motion of hailstones from the atmosphere, through an inlet, and up to the engine face. The numerical procedure is based on the Monte-Carlo method. The report presents a description of the code, along with several illustrative cases. The code can be utilized to relate the spinner geometry (conical or, more effectively, elliptical) to the possible diversion of hail at the engine face into the bypass stream. The code is also useful for assessing the influence of various hail characteristics on the ingestion and distribution of hailstones over the engine face.
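    As an illustration of the Monte-Carlo flavour of such a calculation (this is not HINCOF-I; the distributions, dimensions, and straight-line transport rule below are invented for the sketch), one can sample hailstone sizes and entry positions at random and tabulate their radial distribution over the engine face.

      # Toy Monte Carlo sketch in the spirit of HINCOF-I (not the actual code):
      # sample hailstone sizes and entry positions and tabulate their radial
      # distribution at the engine face.  All distributions, dimensions, and the
      # outward-drift transport rule are hypothetical illustrations only.
      import numpy as np

      rng = np.random.default_rng(0)
      n_stones = 10_000
      r_inlet = 1.0                                  # capture streamtube radius (assumed), m

      # Random entry position, uniform over the streamtube cross-section
      r_in = r_inlet * np.sqrt(rng.uniform(size=n_stones))

      # Random hailstone diameter in metres (lognormal is an assumption)
      diam = rng.lognormal(mean=np.log(0.01), sigma=0.4, size=n_stones)

      # Toy transport: stones drift slightly outward by an amount scaling with diameter
      r_face = np.clip(r_in + 2.0 * diam, 0.0, r_inlet)

      # Space-wise distribution of hail over the engine face, in radial bins
      counts, edges = np.histogram(r_face, bins=10, range=(0.0, r_inlet))
      for lo, hi, c in zip(edges[:-1], edges[1:], counts):
          print(f"r = {lo:.2f}-{hi:.2f}: {c} stones")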

  8. Test case for VVER-1000 complex modeling using MCU and ATHLET

    NASA Astrophysics Data System (ADS)

    Bahdanovich, R. B.; Bogdanova, E. V.; Gamtsemlidze, I. D.; Nikonov, S. P.; Tikhomirov, G. V.

    2017-01-01

    The correct modeling of processes occurring in the core of a reactor is very important. In the design and operation of nuclear reactors it is necessary to cover the entire range of reactor physics. Very often the calculations are carried out within the framework of only one domain, for example structural analysis, neutronics (NT) or thermal hydraulics (TH). However, this is not always adequate, as the impact of related physical processes occurring simultaneously can be significant. It is therefore recommended to perform coupled calculations. The paper provides a test case for the coupled neutronics-thermal hydraulics calculation of a VVER-1000 using the precise neutron code MCU and the system engineering code ATHLET. The model is based on the fuel assembly (type 2M). A test case for the calculation of power distribution, fuel and coolant temperature, coolant density, etc. has been developed. It is assumed that the test case will be used for simulation of the VVER-1000 reactor and in calculations with other programs, for example for code cross-verification. The detailed description of the codes (MCU, ATHLET), the geometry and material composition of the model, and an iterative calculation scheme is given in the paper. A script in the Perl language was written to couple the codes.
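    The iterative scheme exchanges the power distribution from the neutronics code with the temperatures and coolant density from the thermal-hydraulics code until neither changes. A minimal fixed-point (Picard) sketch with stand-in solver functions is shown below; the real scheme couples MCU and ATHLET through the Perl script and uses entirely different physical models.

      # Minimal sketch of a fixed-point (Picard) coupling loop between a neutronics
      # solver and a thermal-hydraulics solver.  Both solver functions are stand-ins
      # with made-up feedback coefficients, not MCU or ATHLET.

      def neutronics_solve(fuel_temp, coolant_temp, coolant_density):
          """Stand-in for the neutronics step: return an axial power profile."""
          return [1.0 + 0.001 * (600.0 - t) for t in fuel_temp]

      def thermal_hydraulics_solve(power):
          """Stand-in for the TH step: return fuel/coolant temperatures and density."""
          fuel_temp = [550.0 + 80.0 * p for p in power]
          coolant_temp = [280.0 + 20.0 * p for p in power]
          coolant_density = [0.75 - 0.01 * p for p in power]
          return fuel_temp, coolant_temp, coolant_density

      # Initial guess, then iterate until the power distribution stops changing
      power = [1.0] * 10
      for it in range(50):
          fuel_t, cool_t, cool_rho = thermal_hydraulics_solve(power)
          new_power = neutronics_solve(fuel_t, cool_t, cool_rho)
          change = max(abs(a - b) for a, b in zip(new_power, power))
          power = new_power
          if change < 1e-6:      # converged coupled NT/TH solution
              break
      print(f"converged after {it + 1} iterations")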

  9. Enhanced capabilities and modified users manual for axial-flow compressor conceptual design code CSPAN

    NASA Technical Reports Server (NTRS)

    Glassman, Arthur J.; Lavelle, Thomas M.

    1995-01-01

    Modifications made to the axial-flow compressor conceptual design code CSPAN are documented in this report. Endwall blockage and stall margin predictions were added. The loss-coefficient model was upgraded. Default correlations for rotor and stator solidity and aspect-ratio inputs and for stator-exit tangential velocity inputs were included in the code along with defaults for aerodynamic design limits. A complete description of input and output along with sample cases are included.

  10. Computer Description of the M561 Utility Truck

    DTIC Science & Technology

    1984-10-01

    GIFT Computer Code; Sustainability Predictions for Army Spare Components Requirements for Combat (SPARC). ... used as input to the GIFT computer code to generate target vulnerability data. ... analysis requires input from the Geometric Information for Targets (GIFT) computer code. This report documents the combinatorial geometry (Com-Geom

  11. Cyclotron resonant scattering feature simulations. II. Description of the CRSF simulation process

    NASA Astrophysics Data System (ADS)

    Schwarm, F.-W.; Ballhausen, R.; Falkner, S.; Schönherr, G.; Pottschmidt, K.; Wolff, M. T.; Becker, P. A.; Fürst, F.; Marcu-Cheatham, D. M.; Hemphill, P. B.; Sokolova-Lapa, E.; Dauser, T.; Klochkov, D.; Ferrigno, C.; Wilms, J.

    2017-05-01

    Context. Cyclotron resonant scattering features (CRSFs) are formed by scattering of X-ray photons off quantized plasma electrons in the strong magnetic field (of the order of 10^12 G) close to the surface of an accreting X-ray pulsar. Due to the complex scattering cross-sections, the line profiles of CRSFs cannot be described by an analytic expression. Numerical methods, such as Monte Carlo (MC) simulations of the scattering processes, are required in order to predict precise line shapes for a given physical setup, which can be compared to observations to gain information about the underlying physics in these systems. Aims: A versatile simulation code is needed for the generation of synthetic cyclotron lines, making it possible for the first time to investigate sophisticated geometries. Methods: The simulation utilizes the mean free path tables described in the first paper of this series for the fast interpolation of propagation lengths. The code is parallelized to make the very time-consuming simulations possible on convenient time scales. Furthermore, it can generate responses to monoenergetic photon injections, producing Green's functions, which can be used later to generate spectra for arbitrary continua. Results: We develop a new simulation code to generate synthetic cyclotron lines for complex scenarios, allowing for unprecedented physical interpretation of the observed data. An associated XSPEC model implementation is used to fit synthetic line profiles to NuSTAR data of Cep X-4. The code has been developed with the main goal of overcoming previous geometrical constraints in MC simulations of CRSFs. By applying this code also to simpler, classic geometries used in previous works, we furthermore address issues of code verification and cross-comparison of various models. The XSPEC model and the Green's function tables are available online (see link in footnote, page 1).
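    The Green's function approach described above separates the expensive scattering simulation from the choice of continuum: each monoenergetic injection yields one response, and an arbitrary continuum is folded with those responses afterwards. A toy sketch of the folding step follows; the Green's matrix, line energy, and continuum shape are invented for illustration and are not taken from the paper.

      # Sketch of folding Green's functions with a continuum: each column of G is a
      # (made-up) response to a monoenergetic injection; the synthetic spectrum is
      # the continuum-weighted sum of those columns.
      import numpy as np

      n = 200
      energy = np.linspace(10.0, 60.0, n)              # keV grid (illustrative)

      # Hypothetical Green's matrix: mostly pass-through, with photons near an
      # assumed 30 keV cyclotron energy redistributed to neighbouring energies.
      G = np.eye(n)
      line = np.exp(-0.5 * ((energy - 30.0) / 2.0) ** 2)
      for j in range(n):
          G[:, j] = (1.0 - 0.6 * line[j]) * G[:, j] + 0.6 * line[j] * line / line.sum()

      continuum = energy ** -1.5                       # injected power-law continuum
      spectrum = G @ continuum                         # synthetic spectrum with the CRSF

      i30 = np.argmin(np.abs(energy - 30.0))
      print(f"continuum at 30 keV: {continuum[i30]:.4f}, with CRSF: {spectrum[i30]:.4f}")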

  12. GPU-accelerated atmospheric chemical kinetics in the ECHAM/MESSy (EMAC) Earth system model (version 2.52)

    NASA Astrophysics Data System (ADS)

    Alvanos, Michail; Christoudias, Theodoros

    2017-10-01

    This paper presents an application of GPU accelerators in Earth system modeling. We focus on atmospheric chemical kinetics, one of the most computationally intensive tasks in climate-chemistry model simulations. We developed a software package that automatically generates CUDA kernels to numerically integrate atmospheric chemical kinetics in the global climate model ECHAM/MESSy Atmospheric Chemistry (EMAC), used to study climate change and air quality scenarios. A source-to-source compiler outputs a CUDA-compatible kernel by parsing the FORTRAN code generated by the Kinetic PreProcessor (KPP) general analysis tool. All Rosenbrock methods that are available in the KPP numerical library are supported. Performance evaluation, using Fermi and Pascal CUDA-enabled GPU accelerators, shows achieved speed-ups of 4.5× and 20.4×, respectively, of the kernel execution time. A node-to-node real-world production performance comparison shows a 1.75× speed-up over the non-accelerated application using the KPP three-stage Rosenbrock solver. We provide a detailed description of the code optimizations used to improve the performance, including memory optimizations, control code simplification, and reduction of idle time. The accuracy and correctness of the accelerated implementation are evaluated by comparing to the CPU-only code of the application. The median relative difference is found to be less than 0.000000001% when comparing the output of the accelerated kernel to that of the CPU-only code. The approach followed, including the computational workload division, and the developed GPU solver code can potentially be used as the basis for hardware acceleration of numerous geoscientific models that rely on KPP for atmospheric chemical kinetics applications.
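    The quoted accuracy check is straightforward to reproduce. A sketch of the metric follows, with placeholder arrays standing in for the CPU-only and GPU-accelerated species concentrations.

      # Sketch of the accuracy metric quoted above: the median relative difference
      # between GPU-accelerated and CPU-only output.  The arrays are placeholders
      # for the actual model concentrations.
      import numpy as np

      cpu = np.array([1.0e-9, 3.2e-7, 5.5e-8, 2.1e-10])   # CPU-only concentrations
      gpu = cpu * (1.0 + 1.0e-12)                          # GPU output with a tiny deviation

      rel_diff = np.abs(gpu - cpu) / np.abs(cpu)           # element-wise relative difference
      median_percent = 100.0 * np.median(rel_diff)         # expressed in percent, as in the text
      print(f"median relative difference: {median_percent:.2e} %")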

  13. Adult Competency Education Kit. Basic Skills in Speaking, Math, and Reading for Employment. Part R. ACE Competency Based Job Descriptions: #95--Bus Driver; #98--General Loader; #99--Forklift Operator; #100--Material Handler.

    ERIC Educational Resources Information Center

    San Mateo County Office of Education, Redwood City, CA. Career Preparation Centers.

    This fifteenth of fifteen sets of Adult Competency Education (ACE) Competency Based Job Descriptions in the ACE kit contains job descriptions for Bus Driver, General Loader, Forklift Operator, and Material Handler. Each begins with a fact sheet that includes this information: occupational title, D.O.T. code, ACE number, career ladder, D.O.T.…

  14. Adult Competency Education Kit. Basic Skills in Speaking, Math, and Reading for Employment. Part F. ACE Competency Based Job Descriptions: #20--Body Fender Mechanic; #21--New Car Get-Ready Person.

    ERIC Educational Resources Information Center

    San Mateo County Office of Education, Redwood City, CA. Career Preparation Centers.

    This third of sixteen sets of Adult Competency Education (ACE) Based Job Descriptions in the ACE kit contains job descriptions for Body Fender Mechanic and New Car Get-Ready Person. Each begins with a fact sheet that includes this information: occupational title, D.O.T. code, ACE number, career ladder, D.O.T. general educational developmental…

  15. Adult Competency Education Kit. Basic Skills in Speaking, Math, and Reading for Employment. Part H. ACE Competency Based Job Descriptions: #25--Household Appliance Mechanic; #26--Lineworker; #27--Painter Helper, Spray; #28--Painter, Brush; #29--Carpenter Apprentice.

    ERIC Educational Resources Information Center

    San Mateo County Office of Education, Redwood City, CA. Career Preparation Centers.

    This fifth of fifteen sets of Adult Competency Education (ACE) Competency Based Job Descriptions in the ACE kit contains job descriptions for Household Appliance Mechanic; Lineworker; Painter Helper, Spray; Painter, Brush; and Carpenter Apprentice. Each begins with a fact sheet that includes this information: occupational title, D.O.T. code, ACE…

  16. CERENA: ChEmical REaction Network Analyzer--A Toolbox for the Simulation and Analysis of Stochastic Chemical Kinetics.

    PubMed

    Kazeroonian, Atefeh; Fröhlich, Fabian; Raue, Andreas; Theis, Fabian J; Hasenauer, Jan

    2016-01-01

    Gene expression, signal transduction and many other cellular processes are subject to stochastic fluctuations. The analysis of these stochastic chemical kinetics is important for understanding cell-to-cell variability and its functional implications, but it is also challenging. A multitude of exact and approximate descriptions of stochastic chemical kinetics have been developed, however, tools to automatically generate the descriptions and compare their accuracy and computational efficiency are missing. In this manuscript we introduced CERENA, a toolbox for the analysis of stochastic chemical kinetics using Approximations of the Chemical Master Equation solution statistics. CERENA implements stochastic simulation algorithms and the finite state projection for microscopic descriptions of processes, the system size expansion and moment equations for meso- and macroscopic descriptions, as well as the novel conditional moment equations for a hybrid description. This unique collection of descriptions in a single toolbox facilitates the selection of appropriate modeling approaches. Unlike other software packages, the implementation of CERENA is completely general and allows, e.g., for time-dependent propensities and non-mass action kinetics. By providing SBML import, symbolic model generation and simulation using MEX-files, CERENA is user-friendly and computationally efficient. The availability of forward and adjoint sensitivity analyses allows for further studies such as parameter estimation and uncertainty analysis. The MATLAB code implementing CERENA is freely available from http://cerenadevelopers.github.io/CERENA/.
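    CERENA itself is a MATLAB toolbox, but the stochastic simulation algorithms it implements can be illustrated with a minimal Gillespie direct-method sketch for a hypothetical birth-death model of gene expression; the reactions and rates below are invented for the example, not taken from the paper.

      # Gillespie direct-method sketch for a hypothetical birth-death model
      # (0 -> mRNA at rate k_prod, mRNA -> 0 at rate k_deg * mRNA).  This is only
      # an illustration of a stochastic simulation algorithm of the kind CERENA
      # implements, not CERENA code.
      import random

      def gillespie_birth_death(k_prod=10.0, k_deg=0.5, x0=0, t_end=20.0, seed=1):
          random.seed(seed)
          t, x = 0.0, x0
          times, counts = [t], [x]
          while t < t_end:
              a1 = k_prod                      # propensity of production
              a2 = k_deg * x                   # propensity of degradation
              a0 = a1 + a2
              t += random.expovariate(a0)      # time to the next reaction
              if random.random() * a0 < a1:    # choose which reaction fires
                  x += 1
              else:
                  x -= 1
              times.append(t)
              counts.append(x)
          return times, counts

      times, counts = gillespie_birth_death()
      print(f"final time {times[-1]:.2f}, final copy number {counts[-1]}")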

  17. Description of interventions is under-reported in physical therapy clinical trials.

    PubMed

    Hariohm, K; Jeyanthi, S; Kumar, J Saravan; Prakash, V

    Amongst several barriers to the application of quality clinical evidence and clinical guidelines in routine daily practice, poor description of interventions reported in clinical trials has received less attention. Although some studies have investigated the completeness of descriptions of non-pharmacological interventions in randomized trials, studies that exclusively analyzed physical therapy interventions reported in published trials are scarce. To evaluate the quality of descriptions of interventions in both experimental and control groups in randomized controlled trials published in four core physical therapy journals. We included all randomized controlled trials published in the Physical Therapy Journal, Journal of Physiotherapy, Clinical Rehabilitation, and Archives of Physical Medicine and Rehabilitation between June 2012 and December 2013. Each randomized controlled trial (RCT) was analyzed and coded for description of interventions using the checklist developed by Schroter et al. Out of 100 RCTs selected, only 35 RCTs (35%) fully described the interventions in both the intervention and control groups. Control group interventions were poorly described in the remaining RCTs (65%). Interventions, especially in the control group, are poorly described in the clinical trials published in leading physical therapy journals. A complete description of the intervention in a published report is crucial for physical therapists to be able to use the intervention in clinical practice. Copyright © 2017 Associação Brasileira de Pesquisa e Pós-Graduação em Fisioterapia. Published by Elsevier Editora Ltda. All rights reserved.

  18. CERENA: ChEmical REaction Network Analyzer—A Toolbox for the Simulation and Analysis of Stochastic Chemical Kinetics

    PubMed Central

    Kazeroonian, Atefeh; Fröhlich, Fabian; Raue, Andreas; Theis, Fabian J.; Hasenauer, Jan

    2016-01-01

    Gene expression, signal transduction and many other cellular processes are subject to stochastic fluctuations. The analysis of these stochastic chemical kinetics is important for understanding cell-to-cell variability and its functional implications, but it is also challenging. A multitude of exact and approximate descriptions of stochastic chemical kinetics have been developed, however, tools to automatically generate the descriptions and compare their accuracy and computational efficiency are missing. In this manuscript we introduced CERENA, a toolbox for the analysis of stochastic chemical kinetics using Approximations of the Chemical Master Equation solution statistics. CERENA implements stochastic simulation algorithms and the finite state projection for microscopic descriptions of processes, the system size expansion and moment equations for meso- and macroscopic descriptions, as well as the novel conditional moment equations for a hybrid description. This unique collection of descriptions in a single toolbox facilitates the selection of appropriate modeling approaches. Unlike other software packages, the implementation of CERENA is completely general and allows, e.g., for time-dependent propensities and non-mass action kinetics. By providing SBML import, symbolic model generation and simulation using MEX-files, CERENA is user-friendly and computationally efficient. The availability of forward and adjoint sensitivity analyses allows for further studies such as parameter estimation and uncertainty analysis. The MATLAB code implementing CERENA is freely available from http://cerenadevelopers.github.io/CERENA/. PMID:26807911

  19. Energy loss of argon in a laser-generated carbon plasma.

    PubMed

    Frank, A; Blazević, A; Grande, P L; Harres, K; Hessling, T; Hoffmann, D H H; Knobloch-Maas, R; Kuznetsov, P G; Nürnberg, F; Pelka, A; Schaumann, G; Schiwietz, G; Schökel, A; Schollmeier, M; Schumacher, D; Schütrumpf, J; Vatulin, V V; Vinokurov, O A; Roth, M

    2010-02-01

    The experimental data presented in this paper address the energy loss determination for argon at 4 MeV/u projectile energy in laser-generated carbon plasma covering a huge parameter range in density and temperature. Furthermore, a consistent theoretical description of the projectile charge state evolution via a Monte Carlo code is combined with an improved version of the CasP code that allows us to calculate the contributions to the stopping power of bound and free electrons for each projectile charge state. This approach gets rid of any effective charge description of the stopping power. Comparison of experimental data and theoretical results allows us to judge the influence of different plasma parameters.
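    Schematically (with notation introduced here, not taken from the paper), the charge-state-resolved combination described above amounts to weighting the bound- and free-electron stopping contributions by the Monte Carlo charge-state fractions:

      \frac{dE}{dx} = \sum_{q} f_q \left[ S_q^{\mathrm{bound}} + S_q^{\mathrm{free}} \right]

    where f_q is the fraction of projectiles in charge state q along the target and S_q^bound, S_q^free are the CasP stopping contributions of bound and free plasma electrons for that charge state, so that no effective-charge average is required.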

  20. A Manual for Coding Descriptions, Interpretations, and Evaluations of Visual Art Forms.

    ERIC Educational Resources Information Center

    Acuff, Bette C.; Sieber-Suppes, Joan

    This manual presents a system for categorizing stated esthetic responses to paintings. It is primarily a training manual for coders, but it may also be used for teaching reflective thinking skills and for evaluating programs of art education. The coding system contains 33 subdivisions of esthetic responses under three major categories: Cue…

  1. 50 CFR Table 2b to Part 679 - Species Codes: FMP Prohibited Species and CR Crab

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    Excerpt from Table 2b to Part 679 (Wildlife and Fisheries, Fishery Conservation and Management), Species Codes: FMP Prohibited Species and CR Crab. Columns: species description, code, CR crab, groundfish PSC. Crab entries include: box (Lopholithodes mandtii), 900; ... aequispinus, 923; king, red (Paralithodes camtshaticus), 921; king, scarlet (deepsea) (Lithodes couesi), 924; ...

  2. The Teaching of the Code of Ethics and Standard Practices for Texas Educator Preparation Programs

    ERIC Educational Resources Information Center

    Davenport, Marvin; Thompson, J. Ray; Templeton, Nathan R.

    2015-01-01

    The purpose of this descriptive quantitative research study was to answer three basic informational questions: (1) To what extent ethics training, as stipulated in Texas Administrative Code Chapter 247, was included in the EPP curriculum; (2) To what extent Texas public universities with approved EPP programs provided faculty opportunities for…

  3. Dependents' Educational Assistance Program (DEA), Chapter 35 of Title 38, U.S. Code

    ERIC Educational Resources Information Center

    US Department of Veterans Affairs, 2005

    2005-01-01

    This pamphlet provides a general description of the Dependents' Educational Assistance program, or DEA (chapter 35 of title 38, U. S. Code). The DEA program provides education and training opportunities to eligible dependents and survivors of certain veterans. It covers the main questions prospective participants may have about DEA benefits,…

  4. The Impact of Bar Code Medication Administration Technology on Reported Medication Errors

    ERIC Educational Resources Information Center

    Holecek, Andrea

    2011-01-01

    The use of bar-code medication administration technology is on the rise in acute care facilities in the United States. The technology is purported to decrease medication errors that occur at the point of administration. How significantly this technology affects actual rate and severity of error is unknown. This descriptive, longitudinal research…

  5. Infections in Combat Casualties During Operations Iraqi and Enduring Freedom

    DTIC Science & Technology

    2009-04-01

    Excerpt from the infection coding tables (ICD-9 code, description, number of cases): bacteria, other, 39; 112.1 vulva/vaginal candidiasis, 1; 381.4 nonsuppurative otitis media, 1; 451.82 superficial phlebitis, arm, 2; 451.83 deep phlebitis, arm, 1; ... Coding by pathogen (pathogen; code; code description; number): Fungus: 112.1 vulva/vaginal candidiasis, 1; 112.3 candidiasis of skin/nails, 1; 112.5 disseminated ... candidiasis, 3; 112.89 candidiasis, site not available, 6; 112.9 candidiasis, site unspecified, 13; 117.3 Aspergillus, 5; 117.9 mycoses, 14; Gram-negative: 003.8 ...

  6. Computer Aided Design of Polyhedron Solids to Model Air in Com-Geom Descriptions

    DTIC Science & Technology

    1983-08-01

    "The GIFT Code User Manual, Volume I, Introduction and Input Requirements," BRL Report No. 1802, July 1975 (Unclassified). ... Kuehl, L. Bain and M. Reisinger, "The GIFT Code User Manual, Volume II, The Output Options," BRL Report ARBRL-TR-02189, September 1979. ... is generated from the GIFT code under option XSECT. This option produces plot files which define cross-sectional views of the COM-GEOM

  7. Hot zero power reactor calculations using the Insilico code

    DOE PAGES

    Hamilton, Steven P.; Evans, Thomas M.; Davidson, Gregory G.; ...

    2016-03-18

    In this paper we describe the reactor physics simulation capabilities of the Insilico code. A description of the various capabilities of the code is provided, including detailed discussion of the geometry, meshing, cross section processing, and neutron transport options. Numerical results demonstrate that the Insilico SPN solver with pin-homogenized cross section generation is capable of delivering highly accurate full-core simulation of various PWR problems. Comparison to both Monte Carlo calculations and measured plant data is provided.

  8. Instruct coders' manual

    NASA Technical Reports Server (NTRS)

    Friend, J.

    1971-01-01

    A manual designed both as an instructional manual for beginning coders and as a reference manual for the coding language INSTRUCT is presented. The manual includes the major programs necessary to implement the teaching system and lists the limitations of the current implementation. A detailed description is given of how to code a lesson, what buttons to push, and what utility programs to use. Suggestions for debugging coded lessons are given, along with the error messages that may be received during assembly or while running a lesson.

  9. Validation of NASA Thermal Ice Protection Computer Codes. Part 1; Program Overview

    NASA Technical Reports Server (NTRS)

    Miller, Dean; Bond, Thomas; Sheldon, David; Wright, William; Langhals, Tammy; Al-Khalil, Kamel; Broughton, Howard

    1996-01-01

    The Icing Technology Branch at NASA Lewis has been involved in an effort to validate two thermal ice protection codes developed at the NASA Lewis Research Center: LEWICE/Thermal (electrothermal de-icing and anti-icing) and ANTICE (hot-gas and electrothermal anti-icing). The Thermal Code Validation effort was designated as a priority during a 1994 'peer review' of the NASA Lewis Icing program, and was implemented as a cooperative effort with industry. During April 1996, the first of a series of experimental validation tests was conducted in the NASA Lewis Icing Research Tunnel (IRT). The purpose of the April 1996 test was to validate the electrothermal predictive capabilities of both LEWICE/Thermal and ANTICE. A heavily instrumented test article was designed and fabricated for this test, with the capability of simulating electrothermal de-icing and anti-icing modes of operation. Thermal measurements were then obtained over a range of test conditions, for comparison with analytical predictions. This paper will present an overview of the test, including a detailed description of: (1) the validation process; (2) test article design; (3) test matrix development; and (4) test procedures. Selected experimental results will be presented for de-icing and anti-icing modes of operation. Finally, the status of the validation effort at this point will be summarized. Detailed comparisons between analytical predictions and experimental results are contained in the following two papers: 'Validation of NASA Thermal Ice Protection Computer Codes: Part 2 - The Validation of LEWICE/Thermal' and 'Validation of NASA Thermal Ice Protection Computer Codes: Part 3 - The Validation of ANTICE'.

  10. A Loader for Executing Multi-Binary Applications on the Thinking Machines CM-5: It's Not Just for SPMD Anymore

    NASA Technical Reports Server (NTRS)

    Becker, Jeffrey C.

    1995-01-01

    The Thinking Machines CM-5 platform was designed to run single program, multiple data (SPMD) applications, i.e., to run a single binary across all nodes of a partition, with each node possibly operating on different data. Certain classes of applications, such as multi-disciplinary computational fluid dynamics codes, are facilitated by the ability to have subsets of the partition nodes running different binaries. In order to extend the CM-5 system software to permit such applications, a multi-program loader was developed. This system is based on the dld loader which was originally developed for workstations. This paper provides a high level description of dld, and describes how it was ported to the CM-5 to provide support for multi-binary applications. Finally, it elaborates how the loader has been used to implement the CM-5 version of MPIRUN, a portable facility for running multi-disciplinary/multi-zonal MPI (Message-Passing Interface Standard) codes.

  11. Transient flow analysis linked to fast pressure disturbance monitored in pipe systems

    NASA Astrophysics Data System (ADS)

    Kueny, J. L.; Lourenco, M.; Ballester, J. L.

    2012-11-01

    EDF Hydro Division has launched the RENOUVEAU program in order to increase performance and improve plant availability through anticipation. Under this program, a large fleet of penstocks is equipped with pressure transducers linked to a special monitoring system. Any significant disturbance of the pressure is captured in a snapshot, and the waveform of the signal is stored and analyzed. During these transient states, the variations in flow are unknown. In order to determine the structural impact of such overpressures occurring during complex transient conditions over the entire circuit, EDF DTG has asked ENSE3 GRENOBLE to develop a code called ACHYL CF. The input data of ACHYL CF are the circuit topology and the pressure boundary conditions. This article provides a description of the computer code developed for modeling the transient flow in a pipe network using the signals from pressure transducers as boundary conditions. Different test cases will be presented, simulating real hydro power plants for which measured pressure signals are available.
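    As background only (the record does not state the governing equations of ACHYL CF), the classical Joukowsky water-hammer relation links a sudden flow change to the pressure surge monitored by the transducers:

      \Delta p = \rho \, a \, \Delta v

    where \rho is the water density, a the pressure-wave speed in the penstock, and \Delta v the change in flow velocity; a network transient code solves the full one-dimensional mass and momentum balances along the circuit rather than this limiting estimate.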

  12. VizieR Online Data Catalog: Habitable zones around main-sequence stars (Kopparapu+, 2014)

    NASA Astrophysics Data System (ADS)

    Kopparapu, R. K.; Ramirez, R. M.; Schottelkotte, J.; Kasting, J. F.; Domagal-Goldman, S.; Eymet, V.

    2017-08-01

    Language: Fortran 90. Code tested under the following compilers/operating systems: ifort/CentOS Linux. Description of input data: no input necessary. Description of output data: output files HZs.dat and HZ_coefficients.dat. System requirements: no major system requirement; a Fortran compiler is necessary. Calls to external routines: none. Additional comments: none (1 data file).
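    The catalog description does not restate the fitting relation. For context, and subject to checking against the source paper, the Kopparapu et al. habitable-zone parameterization is usually written as a polynomial in the stellar effective temperature, with the coefficients tabulated in HZ_coefficients.dat:

      S_{\mathrm{eff}} = S_{\mathrm{eff},\odot} + a\,T_\star + b\,T_\star^{2} + c\,T_\star^{3} + d\,T_\star^{4}, \qquad T_\star = T_{\mathrm{eff}} - 5780\ \mathrm{K}

    with the corresponding habitable-zone distance r = \sqrt{(L/L_\odot)/S_{\mathrm{eff}}} in AU.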

  13. When Do Siblings Compromise? Associations with Children's Descriptions of Conflict Issues, Culpability, and Emotions

    ERIC Educational Resources Information Center

    Recchia, Holly E.; Howe, Nina

    2010-01-01

    This study examined associations between children's descriptions of sibling conflicts and their resolutions during a structured negotiation task. A sample of 58 sibling dyads (older sibling M age = 8.39 years, younger sibling M = 6.06 years) were privately interviewed about an actual conflict. Each child provided a narrative that was coded for…

  14. P80 SRM low torque flex-seal development - thermal and chemical modeling of molding process

    NASA Astrophysics Data System (ADS)

    Descamps, C.; Gautronneau, E.; Rousseau, G.; Daurat, M.

    2009-09-01

    The development of the flex-seal component of the P80 nozzle gave the opportunity to set up new design and manufacturing process methods. Due to the short development lead time required by the VEGA program, the usual iterative manufacturing test workflow, which is time consuming, had to be enhanced in order to use a more predictive approach. A newly refined rubber vulcanization description was built up and identified on laboratory samples. This chemical model was implemented in a thermal analysis code. The complete model successfully supports the manufacturing processes. These activities were conducted with the support of ESA/CNES Research & Technologies and DGA (General Delegation for Armament).

  15. Equivalent Longitudinal Area Distributions of the B-58 and XB-70-1 Airplanes for Use in Wave Drag and Sonic Boom Calculations

    NASA Technical Reports Server (NTRS)

    Tinetti, Ana F.; Maglieri, Domenic J.; Driver, Cornelius; Bobbitt, Percy J.

    2011-01-01

    A detailed geometric description, in wave drag format, has been developed for the Convair B-58 and North American XB-70-1 delta wing airplanes. These descriptions have been placed on electronic files, the contents of which are described in this paper. They are intended for use in wave drag and sonic boom calculations. Included in the electronic files and in the present paper are photographs and 3-view drawings of the two airplanes, tabulated geometric descriptions of each vehicle and its components, and comparisons of the electronic file outputs with existing data. The comparisons include a pictorial of the two airplanes based on the present geometric descriptions, and cross-sectional area distributions for both the normal Mach cuts and oblique Mach cuts above and below the vehicles. Good correlation exists between the area distributions generated in the late 1950s and 1960s and the present files. The availability of these electronic files facilitates further validation of sonic boom prediction codes through the use of two existing data bases on these airplanes, which were acquired in the 1960s and have not been fully exploited.

  16. Simulating Coupling Complexity in Space Plasmas: First Results from a new code

    NASA Astrophysics Data System (ADS)

    Kryukov, I.; Zank, G. P.; Pogorelov, N. V.; Raeder, J.; Ciardo, G.; Florinski, V. A.; Heerikhuisen, J.; Li, G.; Petrini, F.; Shematovich, V. I.; Winske, D.; Shaikh, D.; Webb, G. M.; Yee, H. M.

    2005-12-01

    The development of codes that embrace 'coupling complexity' via the self-consistent incorporation of multiple physical scales and multiple physical processes in models has been identified by the NRC Decadal Survey in Solar and Space Physics as a crucial development in simulation/modeling technology for the coming decade. The National Science Foundation, through its Information Technology Research (ITR) Program, is supporting our efforts to develop a new class of computational code for plasmas and neutral gases that integrates multiple scales and multiple physical processes and descriptions. We are developing a highly modular, parallelized, scalable code that incorporates multiple scales by synthesizing 3 simulation technologies: 1) computational fluid dynamics (hydrodynamics or magnetohydrodynamics, MHD) for the large-scale plasma; 2) direct Monte Carlo simulation of atoms/neutral gas; and 3) transport code solvers to model highly energetic particle distributions. We are constructing the code so that a fourth simulation technology, hybrid simulations for microscale structures and particle distributions, can be incorporated in future work, but for the present this aspect will be addressed at a test-particle level. This synthesis will provide a computational tool that will greatly advance our understanding of the physics of neutral and charged gases. Besides making major advances in basic plasma physics and neutral gas problems, this project will address 3 Grand Challenge space physics problems that reflect our research interests: 1) to develop a temporal global heliospheric model which includes the interaction of solar and interstellar plasma with neutral populations (hydrogen, helium, etc., and dust), test-particle kinetic pickup ion acceleration at the termination shock, anomalous cosmic ray production, and interaction with galactic cosmic rays, while incorporating the time variability of the solar wind and the solar cycle; 2) to develop a coronal mass ejection and interplanetary shock propagation model for the inner and outer heliosphere, including, at a test-particle level, wave-particle interactions and particle acceleration at traveling shock waves and compression regions; and 3) to develop an advanced Geospace General Circulation Model (GGCM) capable of realistically modeling space weather events, in particular the interaction with CMEs and geomagnetic storms. Furthermore, by implementing scalable run-time support and sophisticated off- and on-line prediction algorithms, we anticipate important advances in the development of automatic and intelligent system software to optimize a wide variety of 'embedded' computations on parallel computers. Finally, public domain MHD and hydrodynamic codes had a transforming effect on space and astrophysics. We expect that our new generation, open source, public domain multi-scale code will have a similar transformational effect in a variety of disciplines, opening up new classes of problems to physicists and engineers alike.

  17. Adult Competency Education Kit. Basic Skills in Speaking, Math, and Reading for Employment. Part N. ACE Competency Based Job Descriptions: #65--Typist I; #67--Grocery Checker; #68--File Clerk; #69--Receptionist; #70--Bank Teller; #71--Clerk, General Office.

    ERIC Educational Resources Information Center

    San Mateo County Office of Education, Redwood City, CA. Career Preparation Centers.

    This eleventh of fifteen sets of Adult Competency Education (ACE) Competency Based Job Descriptions in the ACE kit contains job descriptions for Typist I; Grocery Checker; File Clerk; Receptionist; Bank Teller; and Clerk, General Office. Each begins with a fact sheet that includes this information: occupational title, D.O.T. code, ACE number,…

  18. Adult Competency Education Kit. Basic Skills in Speaking, Math, and Reading for Employment. Part O. ACE Competency Based Job Descriptions: #72--Ward Clerk; #73--Account Clerk; #74--Mail Handler (Messenger); #75--Payroll Clerk.

    ERIC Educational Resources Information Center

    San Mateo County Office of Education, Redwood City, CA. Career Preparation Centers.

    This twelfth of fifteen sets of Adult Competency Education (ACE) Competency Based Job Descriptions in the ACE kit contains job descriptions for Ward Clerk, Account Clerk, Mail Handler (Messenger), and Payroll Clerk. Each begins with a fact sheet that includes this information: occupational title, D.O.T. code, ACE number, career ladder, D.O.T. general…

  19. Adult Competency Education Kit. Basic Skills in Speaking, Math, and Reading for Employment. Part Q. ACE Competency Based Job Descriptions: #91--Meat Cutter; #92--Shipping Clerk; #93--Long Haul Truck Driver; #94--Truck Driver--Light.

    ERIC Educational Resources Information Center

    San Mateo County Office of Education, Redwood City, CA. Career Preparation Centers.

    This fourteenth of fifteen sets of Adult Competency Education (ACE) Competency Based Job Descriptions in the ACE kit contains job descriptions for Meat Cutter, Shipping Clerk, Long Haul Truck Driver, and Truck Driver--Light. Each begins with a fact sheet that includes this information: occupational title, D.O.T. code, ACE number, career ladder, D.O.T.…

  20. Adult Competency Education Kit. Basic Skills in Speaking, Math, and Reading for Employment. Part K. ACE Competency Based Job Descriptions: #40--Salesperson, Automobile; #41--Salesperson, Men's Wear; #42--Waiter/Waitress; #45--Janitor; #46--Porter; #48--Pressing Machine Operator.

    ERIC Educational Resources Information Center

    San Mateo County Office of Education, Redwood City, CA. Career Preparation Centers.

    This eighth of fifteen sets of Adult Competency Education (ACE) Competency Based Job Descriptions in the ACE kit contains job descriptions for Salesperson, Automotive; Salesperson, Men's Wear; Waiter/Waitress; Janitor; Porter; and Pressing Machine Operator. Each begins with a fact sheet that includes this information: occupational title, D.O.T. code, ACE…

  1. Adult Competency Education Kit. Basic Skills in Speaking, Math, and Reading for Employment. Part I. ACE Competency Based Job Descriptions: #30--Roofer Apprentice; #31--Pipefitter; #32--Medical Supply Clerk; #33--Stock Clerk; #35--Warehouseman (Laborer, Stores).

    ERIC Educational Resources Information Center

    San Mateo County Office of Education, Redwood City, CA. Career Preparation Centers.

    This sixth of fifteen sets of Adult Competency Education (ACE) Competency Based Job Descriptions in the ACE kit contains job descriptions for Roofer Apprentice, Pipefitter, Medical Supply Clerk, Stock Clerk, and Warehouseperson. Each begins with a fact sheet that includes this information: occupational title, D.O.T. code, ACE number, career ladder, D.O.T.…

  2. Adult Competency Education Kit. Basic Skills in Speaking, Math, and Reading for Employment. Part E. ACE Competency Based Job Descriptions: #11--Groundskeeper; #12--Animal Keeper; #15--Tire Repairperson; #16--Muffler Installer; #17--Garage Mechanic.

    ERIC Educational Resources Information Center

    San Mateo County Office of Education, Redwood City, CA. Career Preparation Centers.

    This second of fifteen sets of Adult Competency Education (ACE) Competency Based Job Descriptions in the ACE kit contains job descriptions for Groundskeeper, Animal Keeper, Tire Repairperson, Muffler Installer, and Garage Mechanic. Each begins with a fact sheet that includes this information: occupational title, D.O.T. code, ACE number, career ladder, D.O.T.…

  3. Adult Competency Education Kit. Basic Skills in Speaking, Math, and Reading for Employment. Part L. ACE Competency Based Job Descriptions: #52--Maid; #54--Ticket Agent; #55--Cosmetologist; #57--Counterperson; #58--Cook's Helper; #59--Kitchen Helper.

    ERIC Educational Resources Information Center

    San Mateo County Office of Education, Redwood City, CA. Career Preparation Centers.

    This ninth of fifteen sets of Adult Competency Education (ACE) Competency Based Job Descriptions in the ACE kit contains job descriptions for Maid, Ticket Agent, Cosmetologist, Counterperson, Cook's Helper, and Kitchen Helper. Each begins with a fact sheet that includes this information: occupational title, D.O.T. code, ACE number, career ladder,…

  4. Scalable and expressive medical terminologies.

    PubMed

    Mays, E; Weida, R; Dionne, R; Laker, M; White, B; Liang, C; Oles, F J

    1996-01-01

    The K-Rep system, based on description logic, is used to represent and reason with large and expressive controlled medical terminologies. Expressive concept descriptions incorporate semantically precise definitions composed using logical operators, together with important non-semantic information such as synonyms and codes. Examples are drawn from our experience with K-Rep in modeling the InterMed laboratory terminology and also developing a large clinical terminology now in production use at Kaiser-Permanente. System-level scalability of performance is achieved through an object-oriented database system which efficiently maps persistent memory to virtual memory. Equally important is conceptual scalability: the ability to support collaborative development, organization, and visualization of a substantial terminology as it evolves over time. K-Rep addresses this need by logically completing concept definitions and automatically classifying concepts in a taxonomy via subsumption inferences. The K-Rep system includes a general-purpose GUI environment for terminology development and browsing, a custom interface for formulary term maintenance, a C++ application program interface, and a distributed client-server mode which provides lightweight clients with efficient run-time access to K-Rep by means of a scripting language.
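
    As a rough illustration of the subsumption reasoning described above, the sketch below reduces concepts to sets of primitive ancestors and classifies a concept by finding its most specific subsumers. The concept names and the set-based encoding are invented for illustration only and do not reflect K-Rep's actual concept language or API.

```python
# Toy subsumption-based classification in the spirit of description-logic terminology
# servers; real systems also handle roles, logical operators, synonyms, and codes.
definitions = {
    "Finding":           set(),
    "LabFinding":        {"Finding"},
    "ChemistryFinding":  {"Finding", "LabFinding"},
    "SerumGlucoseLevel": {"Finding", "LabFinding", "ChemistryFinding"},
}

def subsumes(general, specific):
    """`general` subsumes `specific` when every primitive defining `general` also defines `specific`."""
    return general != specific and definitions[general] <= definitions[specific]

def classify(concept):
    """Place a concept in the taxonomy by returning its most specific subsumers (direct parents)."""
    ancestors = [c for c in definitions if subsumes(c, concept)]
    return [a for a in ancestors if not any(subsumes(a, other) for other in ancestors)]

print(classify("SerumGlucoseLevel"))  # -> ['ChemistryFinding']
```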

  5. A Dynamic/Anisotropic Low Earth Orbit (LEO) Ionizing Radiation Model

    NASA Technical Reports Server (NTRS)

    Badavi, Francis F.; West, Katie J.; Nealy, John E.; Wilson, John W.; Abrahms, Briana L.; Luetke, Nathan J.

    2006-01-01

    The International Space Station (ISS) provides the proving ground for future long duration human activities in space. Ionizing radiation measurements in ISS form the ideal tool for the experimental validation of ionizing radiation environmental models, nuclear transport code algorithms, and nuclear reaction cross sections. Indeed, prior measurements on the Space Transportation System (STS; Shuttle) have provided vital information impacting both the environmental models and the nuclear transport code development by requiring dynamic models of the Low Earth Orbit (LEO) environment. Previous studies using Computer Aided Design (CAD) models of the evolving ISS configurations with Thermoluminescent Detector (TLD) area monitors demonstrated that computational dosimetry requires environmental models with accurate non-isotropic as well as dynamic behavior, detailed information on rack loading, and an accurate 6 degree of freedom (DOF) description of ISS trajectory and orientation.

  6. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ivanov, A. A., E-mail: aai@a5.kiam.ru; Martynov, A. A., E-mail: martynov@a5.kiam.ru; Medvedev, S. Yu., E-mail: medvedev@a5.kiam.ru

    In MHD tokamak plasma theory, the plasma pressure is usually assumed to be isotropic. However, plasma heating by neutral beam injection and RF heating can lead to a strong anisotropy of plasma parameters and rotation of the plasma. The development of MHD equilibrium theory taking into account the plasma inertia and anisotropic pressure began a long time ago, but until now it has not been consistently applied in computational codes for engineering calculations of the plasma equilibrium and evolution in tokamaks. This paper contains a detailed derivation of the axisymmetric plasma equilibrium equation in the most general form (with arbitrary rotation and anisotropic pressure) and a description of the specialized version of the SPIDER code. An original method of calculation of the equilibrium with an anisotropic pressure and a prescribed rotational transform profile is proposed. Examples of calculations and a discussion of the results are also presented.
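
    For orientation only, the static, isotropic limit that such equilibrium solvers generalize is the familiar Grad-Shafranov equation; the notation below is the standard one and is assumed here, since the abstract itself does not reproduce the equations, and the rotating, anisotropic-pressure generalization derived in the paper adds further terms to the right-hand side.

```latex
% Axisymmetric, static, isotropic-pressure equilibrium (Grad-Shafranov); psi is the poloidal flux
\Delta^{*}\psi \;\equiv\;
  R\,\frac{\partial}{\partial R}\!\left(\frac{1}{R}\,\frac{\partial\psi}{\partial R}\right)
  + \frac{\partial^{2}\psi}{\partial Z^{2}}
  \;=\; -\,\mu_{0} R^{2}\,\frac{dp}{d\psi} \;-\; F(\psi)\,\frac{dF}{d\psi},
\qquad F(\psi) = R\,B_{\phi}.
```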

  7. Smoking Cessation among Low-Socioeconomic Status and Disadvantaged Population Groups: A Systematic Review of Research Output.

    PubMed

    Courtney, Ryan J; Naicker, Sundresan; Shakeshaft, Anthony; Clare, Philip; Martire, Kristy A; Mattick, Richard P

    2015-06-08

    Smoking cessation research output should move beyond descriptive research of the health problem to testing interventions that can provide causal data and effective evidence-based solutions. This review examined the number and type of published smoking cessation studies conducted in low-socioeconomic status (low-SES) and disadvantaged population groups. A systematic database search was conducted for two time periods: 2000-2004 (TP1) and 2008-2012 (TP2). Publications that examined smoking cessation in a low-SES or disadvantaged population were coded by: population of interest; study type (reviews, non-data based publications, data-based publications (descriptive, measurement and intervention research)); and country. Intervention studies were coded in accordance with the Cochrane Effective Practice and Organisation of Care data collection checklist and use of biochemical verification of self-reported abstinence was assessed. 278 citations were included. Research output (i.e., all study types) had increased from TP1 27% to TP2 73% (χ²=73.13, p<0.001), however, the proportion of data-based research had not significantly increased from TP1 and TP2: descriptive (TP1=23% vs. TP2=33%) or intervention (TP1=77% vs. TP2=67%). The proportion of intervention studies adopting biochemical verification of self-reported abstinence had significantly decreased from TP1 to TP2 with an increased reliance on self-reported abstinence (TP1=12% vs. TP2=36%). The current research output is not ideal or optimal to decrease smoking rates. Research institutions, scholars and funding organisations should take heed to review findings when developing future research and policy.

  8. Moment Tensor Descriptions for Simulated Explosions of the Source Physics Experiment (SPE)

    NASA Astrophysics Data System (ADS)

    Yang, X.; Rougier, E.; Knight, E. E.; Patton, H. J.

    2014-12-01

    In this research we seek to understand damage mechanisms governing the behavior of geo-materials in the explosion source region, and the role they play in seismic-wave generation. Numerical modeling tools can be used to describe these mechanisms through the development and implementation of appropriate material models. Researchers at Los Alamos National Laboratory (LANL) have been working on a novel continuum-based-viscoplastic strain-rate-dependent fracture material model, AZ_Frac, in an effort to improve the description of these damage sources. AZ_Frac has the ability to describe continuum fracture processes, and at the same time, to handle pre-existing anisotropic material characteristics. The introduction of fractures within the material generates further anisotropic behavior that is also accounted for within the model. The material model has been calibrated to a granitic medium and has been applied in a number of modeling efforts under the SPE project. In our modeling, we use a 2D, axisymmetric layered earth model of the SPE site consisting of a weathered layer on top of a half-space. We couple the hydrodynamic simulation code with a seismic simulation code and propagate the signals to distances of up to 2 km. The signals are inverted for time-dependent moment tensors using a modified inversion scheme that accounts for multiple sources at different depths. The inversion scheme is evaluated for its resolving power to determine a centroid depth and a moment tensor description of the damage source. The capabilities of the inversion method to retrieve such information from waveforms recorded on three SPE tests conducted to date are also being assessed.

  9. COGNATE: comparative gene annotation characterizer.

    PubMed

    Wilbrandt, Jeanne; Misof, Bernhard; Niehuis, Oliver

    2017-07-17

    The comparison of gene and genome structures across species has the potential to reveal major trends of genome evolution. However, such a comparative approach is currently hampered by a lack of standardization (e.g., Elliott TA, Gregory TR, Philos Trans Royal Soc B: Biol Sci 370:20140331, 2015). For example, testing the hypothesis that the total amount of coding sequences is a reliable measure of potential proteome diversity (Wang M, Kurland CG, Caetano-Anollés G, PNAS 108:11954, 2011) requires the application of standardized definitions of coding sequence and genes to create both comparable and comprehensive data sets and corresponding summary statistics. However, such standard definitions either do not exist or are not consistently applied. These circumstances call for a standard at the descriptive level using a minimum of parameters as well as an undeviating use of standardized terms, and for software that infers the required data under these strict definitions. The acquisition of a comprehensive, descriptive, and standardized set of parameters and summary statistics for genome publications and further analyses can thus greatly benefit from the availability of an easy-to-use standard tool. We developed a new open-source command-line tool, COGNATE (Comparative Gene Annotation Characterizer), which uses a given genome assembly and its annotation of protein-coding genes for a detailed description of the respective gene and genome structure parameters. Additionally, we revised the standard definitions of gene and genome structures and provide the definitions used by COGNATE as a working draft suggestion for further reference. Complete parameter lists and summary statistics are inferred using this set of definitions to allow down-stream analyses and to provide an overview of the genome and gene repertoire characteristics. COGNATE is written in Perl and freely available at the ZFMK homepage ( https://www.zfmk.de/en/COGNATE ) and on github ( https://github.com/ZFMK/COGNATE ). The tool COGNATE allows comparing genome assemblies and structural elements on multiple levels (e.g., scaffold or contig sequence, gene). It clearly enhances comparability between analyses. Thus, COGNATE can provide the important standardization of both genome and gene structure parameter disclosure as well as data acquisition for future comparative analyses. With the establishment of comprehensive descriptive standards and the extensive availability of genomes, an encompassing database will become possible.

  10. Toward a standard reference database for computer-aided mammography

    NASA Astrophysics Data System (ADS)

    Oliveira, Júlia E. E.; Gueld, Mark O.; de A. Araújo, Arnaldo; Ott, Bastian; Deserno, Thomas M.

    2008-03-01

    Because of the lack of mammography databases with a large amount of codified images and identified characteristics like pathology, type of breast tissue, and abnormality, there is a problem for the development of robust systems for computer-aided diagnosis. Integrated to the Image Retrieval in Medical Applications (IRMA) project, we present an available mammography database developed from the union of: The Mammographic Image Analysis Society Digital Mammogram Database (MIAS), The Digital Database for Screening Mammography (DDSM), the Lawrence Livermore National Laboratory (LLNL), and routine images from the Rheinisch-Westfälische Technische Hochschule (RWTH) Aachen. Using the IRMA code, standardized coding of tissue type, tumor staging, and lesion description was developed according to the American College of Radiology (ACR) tissue codes and the ACR breast imaging reporting and data system (BI-RADS). The import was done automatically using scripts for image download, file format conversion, file name, web page and information file browsing. Disregarding the resolution, this resulted in a total of 10,509 reference images, and 6,767 images are associated with an IRMA contour information feature file. In accordance to the respective license agreements, the database will be made freely available for research purposes, and may be used for image based evaluation campaigns such as the Cross Language Evaluation Forum (CLEF). We have also shown that it can be extended easily with further cases imported from a picture archiving and communication system (PACS).

  11. Free-Form Region Description with Second-Order Pooling.

    PubMed

    Carreira, João; Caseiro, Rui; Batista, Jorge; Sminchisescu, Cristian

    2015-06-01

    Semantic segmentation and object detection are nowadays dominated by methods operating on regions obtained as a result of a bottom-up grouping process (segmentation) but use feature extractors developed for recognition on fixed-form (e.g. rectangular) patches, with full images as a special case. This is most likely suboptimal. In this paper we focus on feature extraction and description over free-form regions and study the relationship with their fixed-form counterparts. Our main contributions are novel pooling techniques that capture the second-order statistics of local descriptors inside such free-form regions. We introduce second-order generalizations of average and max-pooling that together with appropriate non-linearities, derived from the mathematical structure of their embedding space, lead to state-of-the-art recognition performance in semantic segmentation experiments without any type of local feature coding. In contrast, we show that codebook-based local feature coding is more important when feature extraction is constrained to operate over regions that include both foreground and large portions of the background, as typical in image classification settings, whereas for high-accuracy localization setups, second-order pooling over free-form regions produces results superior to those of the winning systems in the contemporary semantic segmentation challenges, with models that are much faster in both training and testing.
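
    A minimal numpy sketch of second-order average pooling over one free-form region, assuming generic local descriptors: outer products are averaged, mapped through the matrix logarithm (the log-Euclidean tangent space), and power-normalized. The array shapes and the regularization constant are assumptions; the published method also defines a second-order max-pooling variant and other details omitted here.

```python
import numpy as np
from scipy.linalg import logm

def second_order_avg_pool(descriptors, eps=1e-6):
    """Second-order average pooling of (n, d) local descriptors from one region:
    mean of outer products -> matrix log (log-Euclidean mapping) -> signed square root."""
    outer = np.einsum('ni,nj->nij', descriptors, descriptors)         # (n, d, d) outer products
    pooled = outer.mean(axis=0) + eps * np.eye(descriptors.shape[1])  # regularized SPD matrix
    tangent = np.real(logm(pooled))                                   # map to the tangent space
    vec = tangent[np.triu_indices_from(tangent)]                      # symmetric matrix -> vector
    return np.sign(vec) * np.sqrt(np.abs(vec))                        # power normalization

region_descriptors = np.random.rand(200, 16)            # e.g. 200 local features of dimension 16
print(second_order_avg_pool(region_descriptors).shape)  # -> (136,)
```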

  12. Physical Models for Particle Tracking Simulations in the RF Gap

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shishlo, Andrei P.; Holmes, Jeffrey A.

    2015-06-01

    This document describes the algorithms that are used in the PyORBIT code to track the particles accelerated in the Radio-Frequency cavities. It gives the mathematical description of the algorithms and the assumptions made in each case. The derived formulas have been implemented in the PyORBIT code. The necessary data for each algorithm are described in detail.
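
    As a generic illustration of the kind of algorithm such a document specifies, the sketch below applies a textbook thin-gap energy kick, dW = E0*T*L*cos(phi), followed by a phase advance over an assumed drift. The function name, parameter values, and drift length are hypothetical and are not the formulas documented for PyORBIT.

```python
import math

C_LIGHT = 2.99792458e8  # speed of light, m/s

def thin_rf_gap_kick(w_mev, phi_rad, e0tl_mv=0.5, freq_hz=402.5e6,
                     mass_mev=938.272, drift_m=0.1):
    """Textbook thin RF-gap model for a singly charged particle: energy kick
    dW = E0*T*L*cos(phi), then phase advance over an assumed drift length."""
    w_new = w_mev + e0tl_mv * math.cos(phi_rad)        # kinetic energy gained at the gap (MeV)
    gamma = 1.0 + w_new / mass_mev
    beta = math.sqrt(1.0 - 1.0 / gamma ** 2)
    phi_new = phi_rad + 2.0 * math.pi * freq_hz * drift_m / (beta * C_LIGHT)
    return w_new, phi_new

print(thin_rf_gap_kick(2.5, math.radians(-30.0)))  # particle arriving 30 degrees ahead of crest
```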

  13. Patterns of Revision in Online Writing: A Study of Wikipedia's Featured Articles

    ERIC Educational Resources Information Center

    Jones, John

    2008-01-01

    This study examines the revision histories of 10 Wikipedia articles nominated for the site's Featured Article Class (FAC), its highest quality rating, 5 of which achieved FAC and 5 of which did not. The revisions to each article were coded, and the coding results were combined with a descriptive analysis of two representative articles in order to…

  14. Common Day Care Safety Renovations: Descriptions, Explanations and Cost Estimates.

    ERIC Educational Resources Information Center

    Spack, Stan

    This booklet explains some of the day care safety features specified by the new Massachusetts State Building Code (January 1, 1975) which must be met before a new day care center can be licensed. The safety features described are those which most often require renovation to meet the building code standards. Best estimates of the costs involved in…

  15. COMPLETE DETERMINATION OF POLARIZATION FOR A HIGH-ENERGY DEUTERON BEAM (thesis)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Button, J

    1959-05-01

    The P1 multigroup code was written for the IBM-704 in order to determine the accuracy of the few-group diffusion scheme with various imposed conditions and also to provide an alternate computational method when this scheme fails to be sufficiently accurate. The code solves for the spatially dependent multigroup flux, taking into account such nuclear phenomena as the slowing down of neutrons resulting from elastic and inelastic scattering, the removal of neutrons resulting from epithermal capture and fission resonances, and the regeneration of fast neutrons resulting from fissioning, which may occur in any of as many as 80 fast multigroups or in the one thermal group. The code will accept as input a physical description of the reactor (that is: slab, cylindrical, or spherical geometry; number of points and regions; composition description; group-dependent boundary condition; transverse buckling; and mesh sizes) and a prepared library of nuclear properties of all the isotopes in each composition. The code will produce as output multigroup fluxes, currents, and isotopic slowing-down densities, in addition to pointwise and regionwise few-group macroscopic cross sections. (auth)
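
    The sketch below is only a stripped-down, infinite-medium illustration of the multigroup balance that codes of this kind solve, using fission-source power iteration for k-infinity; it omits the spatial P1/diffusion treatment, geometry options, and resonance physics described above, and the two-group cross sections in the example are invented round numbers.

```python
import numpy as np

def k_infinity_spectrum(removal, scatter, chi, nu_sigma_f, tol=1e-8, max_iter=1000):
    """Infinite-medium multigroup balance
        (diag(removal) - scatter) @ phi = (1/k) * chi * (nu_sigma_f @ phi),
    solved by fission-source power iteration. removal[g] is absorption plus
    out-scatter; scatter[g, g'] transfers neutrons from group g' into group g."""
    loss = np.diag(removal) - scatter
    phi, k = np.ones(len(removal)), 1.0
    for _ in range(max_iter):
        fission = nu_sigma_f @ phi
        phi_new = np.linalg.solve(loss, chi * fission / k)
        k_new = k * (nu_sigma_f @ phi_new) / fission
        if abs(k_new - k) < tol:
            break
        phi, k = phi_new, k_new
    return phi_new / phi_new.sum(), k_new

# Invented two-group example: fast group 1 slows down into thermal group 2.
removal    = np.array([0.030, 0.080])    # absorption + out-scatter (1/cm)
scatter    = np.array([[0.000, 0.000],
                       [0.020, 0.000]])  # group 1 -> group 2 transfer
chi        = np.array([1.0, 0.0])        # fission neutrons born fast
nu_sigma_f = np.array([0.005, 0.125])
print(k_infinity_spectrum(removal, scatter, chi, nu_sigma_f))  # k_inf close to 1.21
```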

  16. Investigation of Fluctuation-Induced Electron Transport in Hall Thrusters with a 2D Hybrid Code in the Azimuthal and Axial Coordinates

    NASA Astrophysics Data System (ADS)

    Fernandez, Eduardo; Borelli, Noah; Cappelli, Mark; Gascon, Nicolas

    2003-10-01

    Most current Hall thruster simulation efforts employ either 1D (axial), or 2D (axial and radial) codes. These descriptions crucially depend on the use of an ad-hoc perpendicular electron mobility. Several models for the mobility are typically invoked: classical, Bohm, empirically based, wall-induced, as well as combinations of the above. Experimentally, it is observed that fluctuations and electron transport depend on axial distance and operating parameters. Theoretically, linear stability analyses have predicted a number of unstable modes; yet the nonlinear character of the fluctuations and/or their contribution to electron transport remains poorly understood. Motivated by these observations, a 2D code in the azimuthal and axial coordinates has been written. In particular, the simulation self-consistently calculates the azimuthal disturbances resulting in fluctuating drifts, which in turn (if properly correlated with plasma density disturbances) result in fluctuation-driven electron transport. The characterization of the turbulence at various operating parameters and across the channel length is also the object of this study. A description of the hybrid code used in the simulation as well as the initial results will be presented.
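
    For reference, the two mobility closures most often quoted in this context are the classical collisional cross-field mobility and the empirical Bohm value; the short sketch below evaluates both for assumed, merely illustrative field and collision-frequency values and is not the closure actually used in the 2D hybrid code.

```python
def cross_field_mobilities(b_tesla, collision_freq_hz,
                           electron_mass=9.109e-31, charge=1.602e-19):
    """Classical cross-field mobility mu_perp = (e/(m*nu)) / (1 + (omega_ce/nu)^2)
    versus the empirical Bohm mobility mu_B = 1/(16*B), both in m^2/(V*s)."""
    omega_ce = charge * b_tesla / electron_mass           # electron cyclotron frequency (rad/s)
    mu_classical = (charge / (electron_mass * collision_freq_hz)) \
        / (1.0 + (omega_ce / collision_freq_hz) ** 2)
    mu_bohm = 1.0 / (16.0 * b_tesla)
    return mu_classical, mu_bohm

# Illustrative Hall-thruster-like values: 150 G magnetic field, 10 MHz collision frequency.
print(cross_field_mobilities(b_tesla=0.015, collision_freq_hz=1.0e7))
```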

  17. THR-TH: a high-temperature gas-cooled nuclear reactor core thermal hydraulics code

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Vondy, D.R.

    1984-07-01

    The ORNL version of PEBBLE, the (RZ) pebble bed thermal hydraulics code, has been extended for application to a prismatic gas cooled reactor core. The supplemental treatment is of one-dimensional coolant flow in up to a three-dimensional core description. Power density data from a neutronics and exposure calculation are used as the basic information for the thermal hydraulics calculation of heat removal. Two-dimensional neutronics results may be expanded for a three-dimensional hydraulics calculation. The geometric description for the hydraulics problem is the same as used by the neutronics code. A two-dimensional thermal cell model is used to predict temperatures in the fuel channel. The capability is available in the local BOLD VENTURE computation system for reactor core analysis with capability to account for the effect of temperature feedback by nuclear cross section correlation. Some enhancements have also been added to the original code to add pebble bed modeling flexibility and to generate useful auxiliary results. For example, an estimate is made of the distribution of fuel temperatures based on average and extreme conditions regularly calculated at a number of locations.
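
    A minimal sketch of the one-dimensional coolant energy balance that such a treatment performs along a single channel, where each axial cell adds q''' * A * dz of heat to the coolant stream. The geometry, flow rate, power shape, and helium heat capacity below are assumed illustrative values, not PEBBLE/THR-TH data.

```python
import numpy as np

def coolant_axial_temperatures(power_density_w_m3, flow_area_m2, dz_m,
                               mass_flow_kg_s, cp_j_per_kg_k, t_inlet_k):
    """1D channel energy balance: cumulative heat pickup divided by (m_dot * cp)."""
    heat_per_cell_w = power_density_w_m3 * flow_area_m2 * dz_m
    return t_inlet_k + np.cumsum(heat_per_cell_w) / (mass_flow_kg_s * cp_j_per_kg_k)

# Assumed 4 m channel, 20 axial cells, chopped-cosine power shape, helium coolant.
z = np.linspace(0.0, 4.0, 20)
q3 = 5.0e6 * np.sin(np.pi * (z + 0.1) / 4.2)   # W/m^3, illustrative axial power profile
print(coolant_axial_temperatures(q3, 1.0e-3, 4.0 / 20, 0.05, 5193.0, 600.0))
```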

  18. Measurement Requirements for Improved Modeling of Arcjet Facility Flows

    NASA Technical Reports Server (NTRS)

    Fletcher, Douglas G.

    2000-01-01

    Current efforts to develop new reusable launch vehicles and to pursue low-cost robotic planetary missions have led to a renewed interest in understanding arc-jet flows. Part of this renewed interest is concerned with improving the understanding of arc-jet test results and the potential use of available computational-fluid-dynamic (CFD) codes to aid in this effort. These CFD codes have been extensively developed and tested for application to nonequilibrium, hypersonic flow modeling. It is envisioned, perhaps naively, that the application of these CFD codes to the simulation of arc-jet flows would serve two purposes: first, the codes would help to characterize the nonequilibrium nature of the arc-jet flows; and second, arc-jet experiments could potentially be used to validate the flow models. These two objectives are, to some extent, mutually exclusive. However, the purpose of the present discussion is to address what role CFD codes can play in the current arc-jet flow characterization effort, and whether or not the simulation of arc-jet facility tests can be used to evaluate some of the modeling that is used to formulate these codes. This presentation is organized into several sections. In the introductory section, the development of large-scale, constricted-arc test facilities within NASA is reviewed, and the current state of flow diagnostics using conventional instrumentation is summarized. The motivation for using CFD to simulate arc-jet flows is addressed in the next section, and the basic requirements for CFD models that would be used for these simulations are briefly discussed. This section is followed by a more detailed description of experimental measurements that are needed to initiate credible simulations and to evaluate their fidelity in the different flow regions of an arc-jet facility. Observations from a recent combined computational and experimental investigation of shock-layer flows in a large-scale arc-jet facility are then used to illustrate the current state of development of diagnostic instrumentation, CFD simulations, and general knowledge in the field of arc-jet characterization. Finally, the main points are summarized and recommendations for future efforts are given.

  19. Applying Standard Interfaces to a Process-Control Language

    NASA Technical Reports Server (NTRS)

    Berthold, Richard T.

    2005-01-01

    A method of applying open-operating-system standard interfaces to the NASA User Interface Language (UIL) has been devised. UIL is a computing language that can be used in monitoring and controlling automated processes: for example, the Timeliner computer program, written in UIL, is a general-purpose software system for monitoring and controlling sequences of automated tasks in a target system. In providing the major elements of connectivity between UIL and the target system, the present method offers advantages over the prior method. Most notably, unlike in the prior method, the software description of the target system can be made independent of the applicable compiler software and need not be linked to the applicable executable compiler image. Also unlike in the prior method, it is not necessary to recompile the source code and relink the source code to a new executable compiler image. Abstraction of the description of the target system to a data file can be defined easily, with intuitive syntax, and knowledge of the source-code language is not needed for the definition.

  20. Resurrecting Legacy Code Using Ontosoft Knowledge-Sharing and Digital Object Management to Revitalize and Reproduce Software for Groundwater Management Research

    NASA Astrophysics Data System (ADS)

    Kwon, N.; Gentle, J.; Pierce, S. A.

    2015-12-01

    Software code developed for research is often used for a relatively short period of time before it is abandoned, lost, or becomes outdated. This unintentional abandonment of code is a valid problem in the 21st century scientific process, hindering widespread reusability and increasing the effort needed to develop research software. Potentially important assets, these legacy codes may be resurrected and documented digitally for long-term reuse, often with modest effort. Furthermore, the revived code may be openly accessible in a public repository for researchers to reuse or improve. For this study, the research team has begun to revive the codebase for Groundwater Decision Support System (GWDSS), originally developed for participatory decision making to aid urban planning and groundwater management, though it may serve multiple use cases beyond those originally envisioned. GWDSS was designed as a java-based wrapper with loosely federated commercial and open source components. If successfully revitalized, GWDSS will be useful for both practical applications as a teaching tool and case study for groundwater management, as well as informing theoretical research. Using the knowledge-sharing approaches documented by the NSF-funded Ontosoft project, digital documentation of GWDSS is underway, from conception to development, deployment, characterization, integration, composition, and dissemination through open source communities and geosciences modeling frameworks. Information assets, documentation, and examples are shared using open platforms for data sharing and assigned digital object identifiers. Two instances of GWDSS version 3.0 are being created: 1) a virtual machine instance for the original case study to serve as a live demonstration of the decision support tool, assuring the original version is usable, and 2) an open version of the codebase, executable installation files, and developer guide available via an open repository, assuring the source for the application is accessible with version control and potential for new branch developments. Finally, metadata about the software has been completed within the OntoSoft portal to provide descriptive curation, make GWDSS searchable, and complete documentation of the scientific software lifecycle.

  1. The Lake Tahoe Basin Land Use Simulation Model

    USGS Publications Warehouse

    Forney, William M.; Oldham, I. Benson

    2011-01-01

    This U.S. Geological Survey Open-File Report describes the final modeling product for the Tahoe Decision Support System project for the Lake Tahoe Basin funded by the Southern Nevada Public Land Management Act and the U.S. Geological Survey's Geographic Analysis and Monitoring Program. This research was conducted by the U.S. Geological Survey Western Geographic Science Center. The purpose of this report is to describe the basic elements of the novel Lake Tahoe Basin Land Use Simulation Model, publish samples of the data inputs, basic outputs of the model, and the details of the Python code. The results of this report include a basic description of the Land Use Simulation Model, descriptions and summary statistics of model inputs, two figures showing the graphical user interface from the web-based tool, samples of the two input files, seven tables of basic output results from the web-based tool and descriptions of their parameters, and the fully functional Python code.

  2. An evaluation of medical knowledge contained in Wikipedia and its use in the LOINC database.

    PubMed

    Friedlin, Jeff; McDonald, Clement J

    2010-01-01

    The Logical Observation Identifiers Names and Codes (LOINC) database contains 55,000 terms consisting of more atomic components called parts. LOINC carries more than 18,000 distinct parts. It is necessary to have definitions/descriptions for each of these parts to assist users in mapping local laboratory codes to LOINC. It is believed that much of this information can be obtained from the internet; the first effort was with Wikipedia. This project focused on 1705 laboratory analytes (the first part in the LOINC laboratory name). Of the 1705 parts queried, 1314 matching articles were found in Wikipedia. Of these, 1299 (98.9%) were perfect matches that exactly described the LOINC part, 15 (1.14%) were partial matches (the description in Wikipedia was related to the LOINC part, but did not describe it fully), and 102 (7.76%) were mis-matches. The current release of RELMA and LOINC includes Wikipedia descriptions of LOINC parts obtained as a direct result of this project.

  3. TORBEAM 2.0, a paraxial beam tracing code for electron-cyclotron beams in fusion plasmas for extended physics applications

    NASA Astrophysics Data System (ADS)

    Poli, E.; Bock, A.; Lochbrunner, M.; Maj, O.; Reich, M.; Snicker, A.; Stegmeir, A.; Volpe, F.; Bertelli, N.; Bilato, R.; Conway, G. D.; Farina, D.; Felici, F.; Figini, L.; Fischer, R.; Galperti, C.; Happel, T.; Lin-Liu, Y. R.; Marushchenko, N. B.; Mszanowski, U.; Poli, F. M.; Stober, J.; Westerhof, E.; Zille, R.; Peeters, A. G.; Pereverzev, G. V.

    2018-04-01

    The paraxial WKB code TORBEAM (Poli, 2001) is widely used for the description of electron-cyclotron waves in fusion plasmas, retaining diffraction effects through the solution of a set of ordinary differential equations. With respect to its original form, the code has undergone significant transformations and extensions, in terms of both the physical model and the spectrum of applications. The code has been rewritten in Fortran 90 and transformed into a library, which can be called from within different (not necessarily Fortran-based) workflows. The models for both absorption and current drive have been extended, including e.g. fully-relativistic calculation of the absorption coefficient, momentum conservation in electron-electron collisions and the contribution of more than one harmonic to current drive. The code can be run also for reflectometry applications, with relativistic corrections for the electron mass. Formulas that provide the coupling between the reflected beam and the receiver have been developed. Accelerated versions of the code are available, with the reduced physics goal of inferring the location of maximum absorption (including or not the total driven current) for a given setting of the launcher mirrors. Optionally, plasma volumes within given flux surfaces and corresponding values of minimum and maximum magnetic field can be provided externally to speed up the calculation of full driven-current profiles. These can be employed in real-time control algorithms or for fast data analysis.

  4. Etiology of work-related electrical injuries: a narrative analysis of workers' compensation claims.

    PubMed

    Lombardi, David A; Matz, Simon; Brennan, Melanye J; Smith, Gordon S; Courtney, Theodore K

    2009-10-01

    The purpose of this study was to provide new insight into the etiology of primarily nonfatal, work-related electrical injuries. We developed a multistage, case-selection algorithm to identify electrical-related injuries from workers' compensation claims and a customized coding taxonomy to identify pre-injury circumstances. Workers' compensation claims routinely collected over a 1-year period from a large U.S. insurance provider were used to identify electrical-related injuries using an algorithm that evaluated: coded injury cause information, nature of injury, "accident" description, and injury description narratives. Concurrently, a customized coding taxonomy for these narratives was developed to abstract the activity, source, initiating process, mechanism, vector, and voltage. Among the 586,567 reported claims during 2002, electrical-related injuries accounted for 1283 (0.22%) of nonfatal claims and 15 fatalities (1.2% of electrical). Most (72.3%) were male, average age of 36, working in services (33.4%), manufacturing (24.7%), retail trade (17.3%), and construction (7.2%). Body part(s) injured most often were the hands, fingers, or wrist (34.9%); multiple body parts/systems (25.0%); and lower/upper arm, elbow, shoulder, and upper extremities (19.2%). The leading activities were conducting manual tasks (55.1%); working with machinery, appliances, or equipment; working with electrical wire; and operating powered or nonpowered hand tools. Primary injury sources were appliances and office equipment (24.4%); wires, cables/cords (18.0%); machines and other equipment (11.8%); fixtures, bulbs, and switches (10.4%); and lightning (4.3%). No vector was identified in 85% of cases, and the work process was initiated by others in less than 1% of cases. Injury narratives provide valuable information to overcome some of the limitations of precoded data, especially for identifying additional injury cases and in supplementing traditional epidemiologic data for further understanding the etiology of work-related electrical injuries that may lead to further prevention opportunities.

  5. Effects of Deployment on the Mental Health of Service Members at Fort Hood

    DTIC Science & Technology

    2006-07-06

    [Extraction residue; recoverable fragments follow.] Table: deployment frequency groups Once (n=1,498) and More Than Once (n=566), with gender breakdowns of roughly 72-78% male and 22-28% female in each group. Items 4, 5, 6, 7, 9, 10, 13, 14, 15, and 16, plus item 2 in the provider section, were used. Gender was coded as "0" for female and "1" for male; the remainder of the nominal-level variables are listed in Appendix C (Variables, Measures, and Coding of Data), which gives each variable's description and SPSS data code (e.g., gender: FEMALE=0, MALE=1).

  6. GSE, data management system programmers/User' manual

    NASA Technical Reports Server (NTRS)

    Schlagheck, R. A.; Dolerhie, B. D., Jr.; Ghiglieri, F. J.

    1974-01-01

    The GSE data management system is a computerized program which provides for a central storage source for key data associated with the mechanical ground support equipment (MGSE). Eight major sort modes can be requested by the user. Attributes that are printed automatically with each sort include the GSE end item number, description, class code, functional code, fluid media, use location, design responsibility, weight, cost, quantity, dimensions, and applicable documents. Multiple subsorts are available for the class code, functional code, fluid media, use location, design responsibility, and applicable document categories. These sorts and how to use them are described. The program and GSE data bank may be easily updated and expanded.

  7. Aerospace Toolbox---a flight vehicle design, analysis, simulation ,and software development environment: I. An introduction and tutorial

    NASA Astrophysics Data System (ADS)

    Christian, Paul M.; Wells, Randy

    2001-09-01

    This paper presents a demonstrated approach to significantly reduce the cost and schedule of non-real-time modeling and simulation, real-time HWIL simulation, and embedded code development. The tool and the methodology presented capitalize on a paradigm that has become a standard operating procedure in the automotive industry. The tool described is known as the Aerospace Toolbox, and it is based on the MathWorks Matlab/Simulink framework, which is a COTS application. Extrapolation of automotive industry data and initial applications in the aerospace industry show that the use of the Aerospace Toolbox can make significant contributions in the quest by NASA and other government agencies to meet aggressive cost reduction goals in development programs. Part I of this paper provides a detailed description of the GUI-based Aerospace Toolbox and how it is used in every step of a development program, from quick prototyping of concept developments that leverage built-in point-of-departure simulations through to detailed design, analysis, and testing. Some of the attributes addressed include its versatility in modeling 3 to 6 degrees of freedom, its library of flight-test-validated models (including physics, environments, hardware, and error sources), and its built-in Monte Carlo capability. Other topics to be covered in this part include flight vehicle models and algorithms, and the covariance analysis package, Navigation System Covariance Analysis Tools (NavSCAT). Part II of this paper, to be published at a later date, will conclude with a description of how the Aerospace Toolbox is an integral part of developing embedded code directly from the simulation models by using the Mathworks Real Time Workshop and optimization tools. It will also address how the Toolbox can be used as a design hub for Internet-based collaborative engineering tools such as NASA's Intelligent Synthesis Environment (ISE) and Lockheed Martin's Interactive Missile Design Environment (IMD).

  8. The EGS5 Code System

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hirayama, Hideo; Namito, Yoshihito; /KEK, Tsukuba

    2005-12-20

    In the nineteen years since EGS4 was released, it has been used in a wide variety of applications, particularly in medical physics, radiation measurement studies, and industrial development. Every new user and every new application brings new challenges for Monte Carlo code designers, and code refinements and bug fixes eventually result in a code that becomes difficult to maintain. Several of the code modifications represented significant advances in electron and photon transport physics, and required a more substantial invocation than code patching. Moreover, the arcane MORTRAN3 [48] computer language of EGS4 was highest on the complaint list of the users of EGS4. The size of the EGS4 user base is difficult to measure, as there never existed a formal user registration process. However, some idea of the numbers may be gleaned from the number of EGS4 manuals that were produced and distributed at SLAC: almost three thousand. Consequently, the EGS5 project was undertaken. It was decided to employ the FORTRAN 77 compiler, yet include, as much as possible, the structural beauty and power of MORTRAN3. This report consists of four chapters and several appendices. Chapter 1 is an introduction to EGS5 and to this report in general. We suggest that you read it. Chapter 2 is a major update of similar chapters in the old EGS4 report [126] (SLAC-265) and the old EGS3 report [61] (SLAC-210), in which all the details of the old physics (i.e., models which were carried over from EGS4) and the new physics are gathered together. The descriptions of the new physics are extensive, and not for the faint of heart. Detailed knowledge of the contents of Chapter 2 is not essential in order to use EGS, but sophisticated users should be aware of its contents. In particular, details of the restrictions on the range of applicability of EGS are dispersed throughout the chapter. First-time users of EGS should skip Chapter 2 and come back to it later if necessary. With the release of the EGS4 version, a deliberate attempt was made to present example problems in order to help the user "get started", and we follow that spirit in this report. A series of elementary tutorial user codes are presented in Chapter 3, with more sophisticated sample user codes described in Chapter 4. Novice EGS users will find it helpful to read through the initial sections of the EGS5 User Manual (provided in Appendix B of this report), proceeding then to work through the tutorials in Chapter 3. The User Manuals and other materials found in the appendices contain detailed flow charts, variable lists, and subprogram descriptions of EGS5 and PEGS. Included are step-by-step instructions for developing basic EGS5 user codes and for accessing all of the physics options available in EGS5 and PEGS. Once acquainted with the basic structure of EGS5, users should find the appendices the most frequently consulted sections of this report.

  9. Verification of Gyrokinetic codes: theoretical background and applications

    NASA Astrophysics Data System (ADS)

    Tronko, Natalia

    2016-10-01

    In fusion plasmas the strong magnetic field allows the fast gyro motion to be systematically removed from the description of the dynamics, resulting in a considerable model simplification and gain of computational time. Nowadays, gyrokinetic (GK) codes play a major role in the understanding of the development and the saturation of turbulence and in the prediction of the consequent transport. We present a new and generic theoretical framework and specific numerical applications to test the validity and the domain of applicability of existing GK codes. For a sound verification process, the underlying theoretical GK model and the numerical scheme must be considered at the same time, which makes this approach pioneering. At the analytical level, the main novelty consists in using advanced mathematical tools such as the variational formulation of dynamics for the systematization of the basic GK code equations to access the limits of their applicability. Indirect verification of the numerical scheme is proposed via the benchmark process. In this work, specific examples of code verification are presented for two GK codes: the multi-species electromagnetic ORB5 (PIC) and the radially global version of GENE (Eulerian). The proposed methodology can be applied to any existing GK code. We establish a hierarchy of reduced GK Vlasov-Maxwell equations using the generic variational formulation. Then, we derive and include the models implemented in ORB5 and GENE inside this hierarchy. At the computational level, detailed verification of global electromagnetic test cases based on the CYCLONE is considered, including a parametric β-scan covering the transition from ITG to KBM and the spectral properties at the nominal β value.

  10. CIFOG: Cosmological Ionization Fields frOm Galaxies

    NASA Astrophysics Data System (ADS)

    Hutter, Anne

    2018-03-01

    CIFOG is a versatile MPI-parallelised semi-numerical tool to perform simulations of the Epoch of Reionization. From a set of evolving cosmological gas density and ionizing emissivity fields, it computes the time and spatially dependent ionization of neutral hydrogen (HI), neutral (HeI) and singly ionized helium (HeII) in the intergalactic medium (IGM). The code accounts for HII, HeII, HeIII recombinations, and provides different descriptions for the photoionization rate that are used to calculate the residual HI fraction in ionized regions. This tool has been designed to be coupled to semi-analytic galaxy formation models or hydrodynamical simulations. The modular fashion of the code allows the user to easily introduce new descriptions for recombinations and the photoionization rate.
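
    The core flagging step of semi-numerical reionization codes of this kind can be sketched as an excursion-set style comparison of smoothed photon and atom counts. The filter choice, recombination treatment, and function below are assumptions for illustration and do not reproduce CIFOG's actual scheme.

```python
import numpy as np
from scipy.ndimage import uniform_filter

def flag_ionized_cells(n_ion, n_h, n_rec, filter_sizes_cells):
    """Excursion-set style criterion: a cell is flagged ionized if, on any smoothing
    scale, the filtered ionizing-photon count exceeds the filtered number of hydrogen
    atoms corrected for recombinations."""
    ionized = np.zeros(n_h.shape, dtype=bool)
    for size in sorted(filter_sizes_cells, reverse=True):   # sweep from large to small scales
        photons = uniform_filter(n_ion, size=size, mode='wrap')
        targets = uniform_filter(n_h * (1.0 + n_rec), size=size, mode='wrap')
        ionized |= photons >= targets
    return ionized

# Illustrative 64^3 toy fields in arbitrary units.
rng = np.random.default_rng(0)
n_h = rng.lognormal(mean=0.0, sigma=0.5, size=(64, 64, 64))
n_ion = 0.6 * n_h * rng.lognormal(mean=0.0, sigma=1.0, size=n_h.shape)
print(flag_ionized_cells(n_ion, n_h, n_rec=0.5, filter_sizes_cells=[32, 16, 8, 4, 1]).mean())
```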

  11. Scalable Implementation of Finite Elements by NASA - Implicit (ScIFEi)

    NASA Technical Reports Server (NTRS)

    Warner, James E.; Bomarito, Geoffrey F.; Heber, Gerd; Hochhalter, Jacob D.

    2016-01-01

    Scalable Implementation of Finite Elements by NASA (ScIFEN) is a parallel finite element analysis code written in C++. ScIFEN is designed to provide scalable solutions to computational mechanics problems. It supports a variety of finite element types, nonlinear material models, and boundary conditions. This report provides an overview of ScIFEi ("Sci-Fi"), the implicit solid mechanics driver within ScIFEN. A description of ScIFEi's capabilities is provided, including an overview of the tools and features that accompany the software as well as a description of the input and output file formats. Results from several problems are included, demonstrating the efficiency and scalability of ScIFEi by comparing to finite element analysis using a commercial code.

  12. The WINCOF-I code: Detailed description

    NASA Technical Reports Server (NTRS)

    Murthy, S. N. B.; Mullican, A.

    1993-01-01

    The performance of an axial-flow fan-compressor unit is basically unsteady when there is ingestion of water along with the gas phase. The gas phase is a mixture of air and water vapor in the case of a bypass fan engine that provides thrust power to an aircraft. The liquid water may be in the form of droplets and film at entry to the fan. The unsteadiness is then associated with the relative motion between the gas phase and water, at entry and within the machine, while the water undergoes impact on material surfaces, centrifuging, heat and mass transfer processes, and reingestion in blade wakes, following peel-off from blade surfaces. The unsteadiness may be caused by changes in atmospheric conditions and at entry into and exit from rain storms while the aircraft is in flight. In a multi-stage machine, with an uneven distribution of blade tip clearance, the combined effect of various processes in the presence of steady or time-dependent ingestion is such as to make the performance of a fan and a compressor unit time-dependent from the start of ingestion up to a short time following termination of ingestion. The original WINCOF code was developed without accounting for the relative motion between gas and liquid phases in the ingested fluid. A modification of the WINCOF code was developed and named WINCOF-1. The WINCOF-1 code can provide the transient performance of a fan-compressor unit under a variety of input conditions.

  13. RELAP5-3D Results for Phase I (Exercise 2) of the OECD/NEA MHTGR-350 MW Benchmark

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gerhard Strydom

    2012-06-01

    The coupling of the PHISICS code suite to the thermal hydraulics system code RELAP5-3D has recently been initiated at the Idaho National Laboratory (INL) to provide a fully coupled prismatic Very High Temperature Reactor (VHTR) system modeling capability as part of the NGNP methods development program. The PHISICS code consists of three modules: INSTANT (performing 3D nodal transport core calculations), MRTAU (depletion and decay heat generation) and a perturbation/mixer module. As part of the verification and validation activities, steady state results have been obtained for Exercise 2 of Phase I of the newly-defined OECD/NEA MHTGR-350 MW Benchmark. This exercise requires participants to calculate a steady-state solution for an End of Equilibrium Cycle 350 MW Modular High Temperature Reactor (MHTGR), using the provided geometry, material, and coolant bypass flow description. The paper provides an overview of the MHTGR Benchmark and presents typical steady state results (e.g. solid and gas temperatures, thermal conductivities) for Phase I Exercise 2. Preliminary results are also provided for the early test phase of Exercise 3 using a two-group cross-section library and the Relap5-3D model developed for Exercise 2.

  14. RELAP5-3D results for phase I (Exercise 2) of the OECD/NEA MHTGR-350 MW benchmark

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Strydom, G.; Epiney, A. S.

    2012-07-01

    The coupling of the PHISICS code suite to the thermal hydraulics system code RELAP5-3D has recently been initiated at the Idaho National Laboratory (INL) to provide a fully coupled prismatic Very High Temperature Reactor (VHTR) system modeling capability as part of the NGNP methods development program. The PHISICS code consists of three modules: INSTANT (performing 3D nodal transport core calculations), MRTAU (depletion and decay heat generation) and a perturbation/mixer module. As part of the verification and validation activities, steady state results have been obtained for Exercise 2 of Phase I of the newly-defined OECD/NEA MHTGR-350 MW Benchmark. This exercise requires participants to calculate a steady-state solution for an End of Equilibrium Cycle 350 MW Modular High Temperature Reactor (MHTGR), using the provided geometry, material, and coolant bypass flow description. The paper provides an overview of the MHTGR Benchmark and presents typical steady state results (e.g. solid and gas temperatures, thermal conductivities) for Phase I Exercise 2. Preliminary results are also provided for the early test phase of Exercise 3 using a two-group cross-section library and the Relap5-3D model developed for Exercise 2. (authors)

  15. Development of an Acoustic Signal Analysis Tool “Auto-F” Based on the Temperament Scale

    NASA Astrophysics Data System (ADS)

    Modegi, Toshio

    The MIDI interface was originally designed for electronic musical instruments, but we consider that this music-note-based coding concept can be extended to general acoustic signal description. We proposed applying the MIDI technology to the coding of bio-medical auscultation sound signals such as heart sounds for retrieving medical records and performing telemedicine. We have then tried to extend our encoding targets to include vocal sounds, natural sounds, and electronic bio-signals such as ECG, using the Generalized Harmonic Analysis method. Currently, we are trying to separate vocal sounds included in popular songs and encode both vocal sounds and background instrumental sounds into separate MIDI channels. We are also trying to extract articulation parameters such as MIDI pitch-bend parameters in order to reproduce natural acoustic sounds using a GM-standard MIDI tone generator. In this paper, we present an overall algorithm of our developed acoustic signal analysis tool, based on those research works, which can analyze given time-based signals on the musical temperament scale. The prominent feature of this tool is that it produces high-precision MIDI codes, which reproduce signals similar to the given source signal on a GM-standard MIDI tone generator, and that it also provides the analysis results as text in XML format.
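
    A minimal sketch of the kind of mapping such a tool needs when encoding arbitrary acoustic signals on the temperament scale: a detected frequency is converted to the nearest equal-temperament MIDI note, and the residual detuning is expressed as a 14-bit pitch-bend value. The +/-2 semitone bend range and the function name are assumptions, not details of the Auto-F implementation.

```python
import math

def freq_to_midi_note_and_bend(freq_hz, bend_range_semitones=2.0):
    """Map a frequency to (MIDI note number, pitch-bend value centred at 8192)."""
    semitones = 69.0 + 12.0 * math.log2(freq_hz / 440.0)   # position on the equal-temperament scale
    note = int(round(semitones))
    bend = 8192 + int(round((semitones - note) / bend_range_semitones * 8192))
    return note, max(0, min(16383, bend))

print(freq_to_midi_note_and_bend(446.0))  # slightly sharp A4 -> note 69, bend just above centre
```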

  16. ICF-CY code set for infants with early delay and disabilities (EDD Code Set) for interdisciplinary assessment: a global experts survey.

    PubMed

    Pan, Yi-Ling; Hwang, Ai-Wen; Simeonsson, Rune J; Lu, Lu; Liao, Hua-Fang

    2015-01-01

    Comprehensive description of functioning is important in providing early intervention services for infants with developmental delay/disabilities (DD). A code set of the International Classification of Functioning, Disability and Health: Children and Youth Version (ICF-CY) could facilitate the practical use of the ICF-CY in team evaluation. The purpose of this study was to derive an ICF-CY code set for infants under three years of age with early delay and disabilities (EDD Code Set) for initial team evaluation. The EDD Code Set based on the ICF-CY was developed on the basis of a Delphi survey of international professionals experienced in implementing the ICF-CY and professionals in early intervention service system in Taiwan. Twenty-five professionals completed the Delphi survey. A total of 82 ICF-CY second-level categories were identified for the EDD Code Set, including 28 categories from the domain Activities and Participation, 29 from body functions, 10 from body structures and 15 from environmental factors. The EDD Code Set of 82 ICF-CY categories could be useful in multidisciplinary team evaluations to describe functioning of infants younger than three years of age with DD, in a holistic manner. Future validation of the EDD Code Set and examination of its clinical utility are needed. The EDD Code Set with 82 essential ICF-CY categories could be useful in the initial team evaluation as a common language to describe functioning of infants less than three years of age with developmental delay/disabilities, with a more holistic view. The EDD Code Set including essential categories in activities and participation, body functions, body structures and environmental factors could be used to create a functional profile for each infant with special needs and to clarify the interaction of child and environment accounting for the child's functioning.

  17. Modification and Validation of Conceptual Design Aerodynamic Prediction Method HASC95 With VTXCHN

    NASA Technical Reports Server (NTRS)

    Albright, Alan E.; Dixon, Charles J.; Hegedus, Martin C.

    1996-01-01

    A conceptual/preliminary design level subsonic aerodynamic prediction code HASC (High Angle of Attack Stability and Control) has been improved in several areas, validated, and documented. The improved code includes improved methodologies for increased accuracy and robustness, and simplified input/output files. An engineering method called VTXCHN (Vortex Chine) for predicting nose vortex shedding from circular and non-circular forebodies with sharp chine edges has been improved and integrated into the HASC code. This report contains a summary of modifications, description of the code, user's guide, and validation of HASC. Appendices include discussion of a new HASC utility code, listings of sample input and output files, and a discussion of the application of HASC to buffet analysis.

  18. Portable LQCD Monte Carlo code using OpenACC

    NASA Astrophysics Data System (ADS)

    Bonati, Claudio; Calore, Enrico; Coscetti, Simone; D'Elia, Massimo; Mesiti, Michele; Negro, Francesco; Fabio Schifano, Sebastiano; Silvi, Giorgio; Tripiccione, Raffaele

    2018-03-01

    Ranging from multi-core CPUs to many-core GPUs, the present landscape of HPC architectures is extremely heterogeneous. In this context, code portability is increasingly important for the easy maintainability of applications; this is particularly relevant in scientific computing, where code changes are numerous and frequent. In this talk we present the design and optimization of a state-of-the-art, production-level LQCD Monte Carlo application using the OpenACC directives model. OpenACC aims to abstract parallel programming to a descriptive level, where programmers do not need to specify how the code is mapped onto the target machine. We describe the OpenACC implementation and show that the same code can target different architectures, including state-of-the-art CPUs and GPUs.
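
    As a minimal sketch of the descriptive, directive-based style referred to above (illustrative only, not code from the LQCD application), the fragment below annotates a simple vector update with OpenACC directives; the data clauses describe what the device needs, and the compiler and runtime decide how the loop is mapped onto the target architecture.

        #include <stdio.h>

        #define N 1000000

        int main(void)
        {
            static float x[N], y[N];
            const float a = 2.0f;

            for (int i = 0; i < N; ++i) { x[i] = 1.0f; y[i] = 2.0f; }

            /* The directive states the available parallelism and the data
               movement; it does not prescribe how the loop is scheduled on
               the target (multi-core CPU or GPU). Without OpenACC support
               the pragma is ignored and the loop runs sequentially. */
            #pragma acc parallel loop copyin(x[0:N]) copy(y[0:N])
            for (int i = 0; i < N; ++i) {
                y[i] = a * x[i] + y[i];
            }

            printf("y[0] = %f\n", y[0]);
            return 0;
        }

    The same source can be compiled for different targets, for example with the -acc option of compilers that support OpenACC, which is the portability property the abstract emphasizes.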

  19. The Comprehensive AOCMF Classification: Skull Base and Cranial Vault Fractures – Level 2 and 3 Tutorial

    PubMed Central

    Ieva, Antonio Di; Audigé, Laurent; Kellman, Robert M.; Shumrick, Kevin A.; Ringl, Helmut; Prein, Joachim; Matula, Christian

    2014-01-01

    The AOCMF Classification Group developed a hierarchical three-level craniomaxillofacial classification system with increasing levels of complexity and detail. The top-level (level 1) system distinguishes four major anatomical units: the mandible (code 91), midface (code 92), skull base (code 93), and cranial vault (code 94). This tutorial presents the level 2 and more detailed level 3 systems for the skull base and cranial vault units. The level 2 system describes fracture location by outlining the topographic boundaries of the anatomic regions, considering in particular the endocranial and exocranial skull base surfaces. The endocranial skull base is divided into nine regions: a central skull base and adjoining left and right sides, each divided into the anterior, middle, and posterior skull base. The exocranial skull base surface and the cranial vault are divided into regions defined by the names of the bones involved: the frontal, parietal, temporal, sphenoid, and occipital bones. The level 3 system allows assessment of fracture morphology, described by the presence of fracture fragmentation, displacement, and bone loss. Documentation of associated intracranial diagnostic features is also proposed. This tutorial is organized as a sequence of sections covering the description of the classification system, illustrations of the topographic skull base and cranial vault regions together with rules for fracture location and coding, a series of case examples with clinical imaging, and a general discussion of the design of this classification. PMID:25489394

  20. A qualitative content analysis of global health engagements in Peacekeeping and Stability Operations Institute's stability operations lessons learned and information management system.

    PubMed

    Nang, Roberto N; Monahan, Felicia; Diehl, Glendon B; French, Daniel

    2015-04-01

    Many institutions collect reports in databases to make important lessons learned available to their members. The Uniformed Services University of the Health Sciences collaborated with the Peacekeeping and Stability Operations Institute to conduct a descriptive and qualitative analysis of global health engagements (GHEs) contained in the Stability Operations Lessons Learned and Information Management System (SOLLIMS). This study used a summative qualitative content analysis approach involving six steps: (1) a comprehensive search; (2) a two-stage reading and screening process to identify first-hand, health-related records; (3) qualitative and quantitative data analysis using the MAXQDA software; (4) a word cloud to illustrate word frequencies and interrelationships; (5) coding of individual themes and validation of the coding scheme; and (6) identification of relationships in the data and overarching lessons learned. The individual codes with the greatest number of coded text segments included planning, personnel, interorganizational coordination, communication/information sharing, and resources/supplies. When compared with the Department of Defense's (DoD's) evolving GHE principles and capabilities, the SOLLIMS coding scheme appeared to align well with the list of GHE capabilities developed by the DoD Global Health Working Group. The results of this study will inform global health practitioners and encourage additional qualitative analysis of other lessons-learned databases. Reprint & Copyright © 2015 Association of Military Surgeons of the U.S.
