Sample records for documentation program code

  1. Software for Better Documentation of Other Software

    NASA Technical Reports Server (NTRS)

    Pinedo, John

    2003-01-01

    The Literate Programming Extraction Engine is a Practical Extraction and Reporting Language- (PERL-)based computer program that facilitates and simplifies the implementation of a concept of self-documented literate programming in a fashion tailored to the typical needs of scientists. The advantage for the programmer is that documentation and source code are written side-by-side in the same file, reducing the likelihood that the documentation will be inconsistent with the code and improving the verification that the code performs its intended functions. The advantage for the user is the knowledge that the documentation matches the software because they come from the same file. This program unifies the documentation process for a variety of programming languages, including C, C++, and several versions of FORTRAN. This program can process the documentation in any markup language, and incorporates the LaTeX typesetting software. The program includes sample Makefile scripts for automating both the code-compilation (when appropriate) and documentation-generation processes into a single command-line statement. Also included are macro instructions for the Emacs display-editor software, making it easy for a programmer to toggle between editing in a code or a documentation mode.
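
    The extraction mechanism itself is simple enough to sketch. Below is a minimal Python illustration (the engine itself is Perl-based) of splitting one combined file into LaTeX documentation and source code; the @doc/@code markers are invented for illustration and are not the engine's actual syntax.

    ```python
    # Minimal sketch of literate-programming extraction, assuming hypothetical
    # @doc/@code markers; the real engine's syntax and features differ.
    import sys
    from pathlib import Path

    def extract(path):
        """Split a combined literate file into documentation and code chunks."""
        doc, code, current = [], [], None
        for line in Path(path).read_text().splitlines(keepends=True):
            if line.startswith("@doc"):        # hypothetical marker
                current = doc
            elif line.startswith("@code"):     # hypothetical marker
                current = code
            elif current is not None:
                current.append(line)
        return "".join(doc), "".join(code)

    if __name__ == "__main__":
        doc_text, code_text = extract(sys.argv[1])
        Path("program.tex").write_text(doc_text)   # typeset with LaTeX
        Path("program.f").write_text(code_text)    # compile separately
    ```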

  2. Doclet To Synthesize UML

    NASA Technical Reports Server (NTRS)

    Barry, Matthew R.; Osborne, Richard N.

    2005-01-01

    The RoseDoclet computer program extends the capability of Java doclet software to automatically synthesize Unified Modeling Language (UML) content from Java language source code. [Doclets are Java-language programs that use the doclet application programming interface (API) to specify the content and format of the output of Javadoc. Javadoc is a program, originally designed to generate API documentation from Java source code, now also useful as an extensible engine for processing Java source code.] RoseDoclet takes advantage of Javadoc comments and tags already in the source code to produce a UML model of that code. RoseDoclet applies the doclet API to create a doclet passed to Javadoc. The Javadoc engine applies the doclet to the source code, emitting the output format specified by the doclet. RoseDoclet emits a Rose model file and populates it with fully documented packages, classes, methods, variables, and class diagrams identified in the source code. The way in which UML models are generated can be controlled by use of new Javadoc comment tags that RoseDoclet provides. The advantage of using RoseDoclet is that Javadoc documentation becomes leveraged for two purposes: documenting the as-built API and keeping the design documentation up to date.
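
    A real doclet is written in Java against the doclet API, so the fragment below is only a conceptual analogue in Python: it shows the kind of extraction RoseDoclet performs, pairing each Javadoc comment with the declaration it documents before emitting a model.

    ```python
    # Conceptual analogue only: pair Javadoc comments with declarations.
    # A real doclet would be Java code invoked by the Javadoc engine.
    import re

    JAVADOC = re.compile(r"/\*\*(.*?)\*/\s*([^;{]+)", re.DOTALL)

    def synthesize_model(java_source: str):
        entries = []
        for comment, decl in JAVADOC.findall(java_source):
            text = " ".join(l.strip(" *") for l in comment.splitlines()).strip()
            entries.append((decl.strip(), text))
        return entries

    src = """
    /** Computes vehicle attitude. */
    public class Attitude {
        /** Returns the roll angle in degrees. */
        public double roll() { return 0.0; }
    }
    """
    for decl, doc in synthesize_model(src):
        print(f"{decl}: {doc}")
    ```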

  3. Guidelines for development of structured FORTRAN programs

    NASA Technical Reports Server (NTRS)

    Earnest, B. M.

    1984-01-01

    Computer programming and coding standards were compiled to serve as guidelines for the uniform writing of FORTRAN 77 programs at NASA Langley. Software development philosophy, documentation, general coding conventions, and specific FORTRAN coding constraints are discussed.

  4. Open-source framework for documentation of scientific software written on MATLAB-compatible programming languages

    NASA Astrophysics Data System (ADS)

    Konnik, Mikhail V.; Welsh, James

    2012-09-01

    Numerical simulators for adaptive optics systems have become an essential tool for the research and development of future advanced astronomical instruments. However, the growing software code of a numerical simulator makes it difficult to continue to support the code itself. The problem of adequate documentation of astronomical software for adaptive optics simulators may complicate development, since the documentation must contain up-to-date schemes and mathematical descriptions of what is implemented in the software code. Although most modern programming environments like MATLAB or Octave have built-in documentation abilities, they are often insufficient for the description of a typical adaptive optics simulator code. This paper describes a general cross-platform framework for the documentation of scientific software using open-source tools such as LaTeX, Mercurial, Doxygen, and Perl. Using a Perl script that translates the comments of MATLAB M-files into C-like comments, one can use Doxygen to generate and update the documentation for the scientific source code. The documentation generated by this framework contains the current code description with mathematical formulas, images, and bibliographical references. A detailed description of the framework components is presented, as well as guidelines for the framework deployment. Examples of the code documentation for the scripts and functions of a MATLAB-based adaptive optics simulator are provided.
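
    The pivotal translation step is mechanical enough to sketch. A Python stand-in for the paper's Perl translator might rewrite MATLAB "%" comment lines as C++-style comment lines that Doxygen can parse; the "//!" marker and the sample M-code below are assumptions for illustration.

    ```python
    # Sketch of the comment-translation step (the paper's tool is a Perl
    # script): turn MATLAB "%" comments into Doxygen-readable "//!" lines.
    def matlab_to_doxygen(lines):
        out = []
        for line in lines:
            stripped = line.lstrip()
            if stripped.startswith("%"):
                indent = line[: len(line) - len(stripped)]
                out.append(indent + "//!" + stripped[1:])
            else:
                out.append(line)
        return out

    m_source = """% Computes Shack-Hartmann centroids (hypothetical example).
    % Input: detector frame; output: centroid vector.
    function c = centroids(frame)
    c = sum(frame);
    """
    print("".join(matlab_to_doxygen(m_source.splitlines(keepends=True))), end="")
    ```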

  5. The FORTRAN static source code analyzer program (SAP) user's guide, revision 1

    NASA Technical Reports Server (NTRS)

    Decker, W.; Taylor, W.; Eslinger, S.

    1982-01-01

    The FORTRAN Static Source Code Analyzer Program (SAP) User's Guide (Revision 1) is presented. SAP is a software tool designed to assist Software Engineering Laboratory (SEL) personnel in conducting studies of FORTRAN programs. SAP scans FORTRAN source code and produces reports that present statistics and measures of statements and structures that make up a module. This document is a revision of the previous SAP user's guide, Computer Sciences Corporation document CSC/TM-78/6045. SAP Revision 1 is the result of program modifications to provide several new reports, additional complexity analysis, and recognition of all statements described in the FORTRAN 77 standard. This document provides instructions for operating SAP and contains information useful in interpreting SAP output.
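
    A toy version of this kind of static counting, assuming fixed-form FORTRAN 77 source; SAP's actual statement recognition and report formats are far more complete.

    ```python
    # Toy SAP-style tally: count comment cards and statement keywords in
    # fixed-form FORTRAN 77 (columns 7-72 hold the statement field).
    from collections import Counter

    def statement_stats(source: str) -> Counter:
        stats = Counter()
        for line in source.splitlines():
            if not line or line[0] in "Cc*":       # full-line comment card
                stats["comment"] += 1
                continue
            body = line[6:72].strip().upper()       # statement field
            if not body:
                continue
            keyword = body.split()[0].split("(")[0]
            if "=" in body and keyword not in ("IF", "DO"):
                stats["assignment"] += 1
            else:
                stats[keyword] += 1
        return stats

    code = """C     SIMPLE EXAMPLE
          DO 10 I = 1, N
          A(I) = 0.0
       10 CONTINUE
          END
    """
    print(statement_stats(code))
    ```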

  6. Coding and Billing in Surgical Education: A Systems-Based Practice Education Program.

    PubMed

    Ghaderi, Kimeya F; Schmidt, Scott T; Drolet, Brian C

    Despite increased emphasis on systems-based practice through the Accreditation Council for Graduate Medical Education core competencies, few studies have examined what surgical residents know about coding and billing. We sought to create and measure the effectiveness of a multifaceted approach to improving resident knowledge and performance of documenting and coding outpatient encounters. We identified knowledge gaps and barriers to documentation and coding in the outpatient setting. We implemented a series of educational and workflow interventions with a group of 12 residents in a surgical clinic at a tertiary care center. To measure the effect of this program, we compared billing codes for 1 year before intervention (FY2012) to prospectively collected data from the postintervention period (FY2013). All related documentation and coding were verified by study-blinded auditors. Interventions took place at the outpatient surgical clinic at Rhode Island Hospital, a tertiary-care center. A cohort of 12 plastic surgery residents ranging from postgraduate year 2 through postgraduate year 6 participated in the interventional sequence. A total of 1285 patient encounters in the preintervention group were compared with 1170 encounters in the postintervention group. Using evaluation and management codes (E&M) as a measure of documentation and coding, we demonstrated a significant and durable increase in billing with supporting clinical documentation after the intervention. For established patient visits, the monthly average E&M code level increased from 2.14 to 3.05 (p < 0.01); for new patients the monthly average E&M level increased from 2.61 to 3.19 (p < 0.01). This study describes a series of educational and workflow interventions, which improved resident coding and billing of outpatient clinic encounters. Using externally audited coding data, we demonstrate significantly increased rates of higher complexity E&M coding in a stable patient population based on improved documentation and billing awareness by the residents. Copyright © 2017 Association of Program Directors in Surgery. Published by Elsevier Inc. All rights reserved.

  7. Methods, media, and systems for detecting attack on a digital processing device

    DOEpatents

    Stolfo, Salvatore J.; Li, Wei-Jen; Keromytis, Angelos D.; Androulaki, Elli

    2014-07-22

    Methods, media, and systems for detecting attack are provided. In some embodiments, the methods include: comparing at least part of a document to a static detection model; determining whether attacking code is included in the document based on the comparison of the document to the static detection model; executing at least part of the document; determining whether attacking code is included in the document based on the execution of the at least part of the document; and if attacking code is determined to be included in the document based on at least one of the comparison of the document to the static detection model and the execution of the at least part of the document, reporting the presence of an attack. In some embodiments, the methods include: selecting a data segment in at least one portion of an electronic document; determining whether the arbitrarily selected data segment can be altered without causing the electronic document to result in an error when processed by a corresponding program; in response to determining that the arbitrarily selected data segment can be altered, arbitrarily altering the data segment in the at least one portion of the electronic document to produce an altered electronic document; and determining whether the corresponding program produces an error state when the altered electronic document is processed by the corresponding program.
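
    The second embodiment reads like a targeted fuzzing loop. A minimal sketch, with a JSON parser standing in for the "corresponding program" (the parser choice and single-byte alteration are assumptions):

    ```python
    # Sketch: arbitrarily alter a data segment, reprocess the document, and
    # observe whether the consuming program reaches an error state.
    import json
    import random

    def survives_alteration(doc_bytes: bytes, parse) -> bool:
        data = bytearray(doc_bytes)
        i = random.randrange(len(data))
        data[i] ^= 0xFF                  # arbitrarily alter one data segment
        try:
            parse(bytes(data))
            return True                  # processed without error
        except Exception:
            return False                 # altered document produced an error

    doc = json.dumps({"title": "report", "pages": 12}).encode()
    print(survives_alteration(doc, lambda b: json.loads(b.decode())))
    ```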

  8. Methods, media, and systems for detecting attack on a digital processing device

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Stolfo, Salvatore J.; Li, Wei-Jen; Keromytis, Angelos D.

    Methods, media, and systems for detecting attack are provided. In some embodiments, the methods include: comparing at least part of a document to a static detection model; determining whether attacking code is included in the document based on the comparison of the document to the static detection model; executing at least part of the document; determining whether attacking code is included in the document based on the execution of the at least part of the document; and if attacking code is determined to be included in the document based on at least one of the comparison of the document to the static detection model and the execution of the at least part of the document, reporting the presence of an attack. In some embodiments, the methods include: selecting a data segment in at least one portion of an electronic document; determining whether the arbitrarily selected data segment can be altered without causing the electronic document to result in an error when processed by a corresponding program; in response to determining that the arbitrarily selected data segment can be altered, arbitrarily altering the data segment in the at least one portion of the electronic document to produce an altered electronic document; and determining whether the corresponding program produces an error state when the altered electronic document is processed by the corresponding program.

  9. Nevada Administrative Code for Special Education Programs.

    ERIC Educational Resources Information Center

    Nevada State Dept. of Education, Carson City. Special Education Branch.

    This document presents excerpts from Chapter 388 of the Nevada Administrative Code, which concerns definitions, eligibility, and programs for students who are disabled or gifted/talented. The first section gathers together 36 relevant definitions from the Code for such concepts as "adaptive behavior," "autism," "gifted and…

  10. Single-channel voice-response-system program documentation volume I : system description

    DOT National Transportation Integrated Search

    1977-01-01

    This report documents the design and implementation of a Voice Response System (VRS) using Adaptive Differential Pulse Code Modulation (ADPCM) voice coding. Implemented on a Digital Equipment Corporation PDP-11/20, this VRS supports a single ...

  11. The National Transport Code Collaboration Module Library

    NASA Astrophysics Data System (ADS)

    Kritz, A. H.; Bateman, G.; Kinsey, J.; Pankin, A.; Onjun, T.; Redd, A.; McCune, D.; Ludescher, C.; Pletzer, A.; Andre, R.; Zakharov, L.; Lodestro, L.; Pearlstein, L. D.; Jong, R.; Houlberg, W.; Strand, P.; Wiley, J.; Valanju, P.; John, H. St.; Waltz, R.; Mandrekas, J.; Mau, T. K.; Carlsson, J.; Braams, B.

    2004-12-01

    This paper reports on the progress in developing a library of code modules under the auspices of the National Transport Code Collaboration (NTCC). Code modules are high quality, fully documented software packages with a clearly defined interface. The modules provide a variety of functions, such as implementing numerical physics models; performing ancillary functions such as I/O or graphics; or providing tools for dealing with common issues in scientific programming such as portability of Fortran codes. Researchers in the plasma community submit code modules, and a review procedure is followed to ensure adherence to programming and documentation standards. The review process is designed to provide added confidence with regard to the use of the modules and to allow users and independent reviewers to validate the claims of the modules' authors. All modules include source code; clear instructions for compilation of binaries on a variety of target architectures; and test cases with well-documented input and output. All the NTCC modules and ancillary information, such as current standards and documentation, are available from the NTCC Module Library Website http://w3.pppl.gov/NTCC. The goal of the project is to develop a resource of value to builders of integrated modeling codes and to plasma physics researchers generally. Currently, there are more than 40 modules in the module library.

  12. GSE data management system programmer's/user's manual

    NASA Technical Reports Server (NTRS)

    Schlagheck, R. A.; Dolerhie, B. D., Jr.; Ghiglieri, F. J.

    1974-01-01

    The GSE data management system is a computerized program which provides for a central storage source for key data associated with the mechanical ground support equipment (MGSE). Eight major sort modes can be requested by the user. Attributes that are printed automatically with each sort include the GSE end item number, description, class code, functional code, fluid media, use location, design responsibility, weight, cost, quantity, dimensions, and applicable documents. Multiple subsorts are available for the class code, functional code, fluid media, use location, design responsibility, and applicable document categories. These sorts and how to use them are described. The program and GSE data bank may be easily updated and expanded.

  13. Programming (Tips) for Physicists & Engineers

    ScienceCinema

    Ozcan, Erkcan

    2018-02-19

    Programming for today's physicists and engineers. Work environment: today's astroparticle and accelerator experiments and the information industry rely on large collaborations. Needed more than ever: code sharing/reuse, code building and framework integration, documentation and good visualization, working remotely, and not reinventing the wheel.

  14. Programming (Tips) for Physicists & Engineers

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ozcan, Erkcan

    2010-07-13

    Programming for today's physicists and engineers. Work environment: today's astroparticle and accelerator experiments and the information industry rely on large collaborations. Needed more than ever: code sharing/reuse, code building and framework integration, documentation and good visualization, working remotely, and not reinventing the wheel.

  15. DataRocket: Interactive Visualisation of Data Structures

    NASA Astrophysics Data System (ADS)

    Parkes, Steve; Ramsay, Craig

    2010-08-01

    CodeRocket is a software engineering tool that provides cognitive support to the software engineer for reasoning about a method or procedure and for documenting the resulting code [1]. DataRocket is a software engineering tool designed to support visualisation of, and reasoning about, program data structures. DataRocket is part of the CodeRocket family of software tools developed by Rapid Quality Systems [2], a spin-out company from the Space Technology Centre at the University of Dundee. CodeRocket and DataRocket integrate seamlessly with existing architectural design and coding tools and provide extensive documentation with little or no effort on the part of the software engineer. Comprehensive, abstract, detailed design documentation is available early in a project so that it can be used for design reviews with project managers and non-expert stakeholders. Code and documentation remain fully synchronised even when changes are implemented in the code without reference to the existing documentation. At the end of a project, the press of a button suffices to produce the detailed design document. Existing legacy code can be easily imported into CodeRocket and DataRocket to reverse-engineer detailed design documentation, making legacy code more manageable and adding substantially to its value. This paper introduces CodeRocket. It then explains the rationale for DataRocket and describes the key features of this new tool. Finally, the major benefits of DataRocket for different stakeholders are considered.

  16. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dr. George L Mesina

    Our ultimate goal is to create and maintain RELAP5-3D as the best software tool available to analyze nuclear power plants. This begins with excellent programming and requires thorough testing. This document covers development of RELAP5-3D software, the behavior of the RELAP5-3D program that must be maintained, and code testing. RELAP5-3D must perform in a manner consistent with previous code versions, with backward compatibility for the sake of the users. Thus file operations, code termination, input and output must remain consistent in form and content while adding appropriate new files, input and output as new features are developed. As computer hardware, operating systems, and other software change, RELAP5-3D must adapt and maintain performance. The code must be thoroughly tested to ensure that it continues to perform robustly on the supported platforms. The coding must be written in a consistent manner that makes the program easy to read to reduce the time and cost of development, maintenance and error resolution. The programming guidelines presented here are intended to institutionalize a consistent way of writing FORTRAN code for the RELAP5-3D computer program that will minimize errors and rework. A common format and organization of program units creates a unifying look and feel to the code. This in turn increases readability and reduces the time required for maintenance, development and debugging. It also aids new programmers in reading and understanding the program. Therefore, when undertaking development of the RELAP5-3D computer program, the programmer must write computer code that follows these guidelines. This set of programming guidelines creates a framework of good programming practices, such as initialization, structured programming, and vector-friendly coding. It sets out formatting rules for lines of code, such as indentation, capitalization, spacing, etc. It creates limits on program units, such as subprograms, functions, and modules. It establishes documentation guidance on internal comments. The guidelines apply to both existing and new subprograms. They are written for both FORTRAN 77 and FORTRAN 95. The guidelines are not so rigorous as to inhibit a programmer’s unique style, but they do restrict the variations in acceptable coding to create sufficient commonality that new readers will find the coding in each new subroutine familiar. It is recognized that this is a “living” document and must be updated as languages, compilers, and computer hardware and software evolve.

  17. Thrust Chamber Modeling Using Navier-Stokes Equations: Code Documentation and Listings. Volume 2

    NASA Technical Reports Server (NTRS)

    Daley, P. L.; Owens, S. F.

    1988-01-01

    A copy of the PHOENICS input files and FORTRAN code developed for the modeling of thrust chambers is given. These copies are contained in the Appendices. The listings are contained in Appendices A through E. Appendix A describes the input statements relevant to thrust chamber modeling as well as the FORTRAN code developed for the Satellite program. Appendix B describes the FORTRAN code developed for the Ground program. Appendices C through E contain copies of the Q1 (input) file, the Satellite program, and the Ground program respectively.

  18. Language and Program for Documenting Software Design

    NASA Technical Reports Server (NTRS)

    Kleine, H.; Zepko, T. M.

    1986-01-01

    Software Design and Documentation Language (SDDL) provides effective communication medium to support design and documentation of complex software applications. SDDL supports communication among all members of software design team and provides for production of informative documentation on design effort. Use of SDDL-generated document to analyze design makes it possible to eliminate many errors not detected until coding and testing attempted. SDDL processor program translates designer's creative thinking into effective document for communication. Processor performs as many automatic functions as possible, freeing designer's energy for creative effort. SDDL processor program written in PASCAL.

  19. Documentation of a numerical code for the simulation of variable density ground-water flow in three dimensions

    USGS Publications Warehouse

    Kuiper, L.K.

    1985-01-01

    A numerical code is documented for the simulation of variable density time dependent groundwater flow in three dimensions. The groundwater density, although variable with distance, is assumed to be constant in time. The Integrated Finite Difference grid elements in the code follow the geologic strata in the modeled area. If appropriate, the determination of hydraulic head in confining beds can be deleted to decrease computation time. The strongly implicit procedure (SIP), successive over-relaxation (SOR), and eight different preconditioned conjugate gradient (PCG) methods are used to solve the approximating equations. The use of the computer program that performs the calculations in the numerical code is emphasized. Detailed instructions are given for using the computer program, including input data formats. An example simulation and the Fortran listing of the program are included. (USGS)
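
    As a worked illustration of one of the listed solvers, here is a minimal successive over-relaxation (SOR) sweep for a steady two-dimensional Laplace problem. The documented code's variable-density equations are more involved, so treat this purely as a sketch of the iteration.

    ```python
    # Point SOR for Laplace's equation on a square grid with fixed-head
    # boundaries; omega between 1 and 2 over-relaxes the Gauss-Seidel update.
    import numpy as np

    def sor(h, omega=1.7, tol=1e-6, max_sweeps=5000):
        for _ in range(max_sweeps):
            max_change = 0.0
            for i in range(1, h.shape[0] - 1):
                for j in range(1, h.shape[1] - 1):
                    new = (1 - omega) * h[i, j] + omega * 0.25 * (
                        h[i - 1, j] + h[i + 1, j] + h[i, j - 1] + h[i, j + 1])
                    max_change = max(max_change, abs(new - h[i, j]))
                    h[i, j] = new
            if max_change < tol:
                break
        return h

    head = np.zeros((20, 20))
    head[0, :] = 100.0                   # fixed-head boundary condition
    print(sor(head)[10, 10])
    ```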

  20. BASIC Language Flow Charting Program (BASCHART). Technical Note 3-82.

    ERIC Educational Resources Information Center

    Johnson, Charles C.; And Others

    This document describes BASCHART, a computer aid designed to decipher and automatically flow chart computer program logic; it also provides the computer code necessary for this process. Developed to reduce the labor-intensive manual process of producing a flow chart for an undocumented or inadequately documented program, BASCHART will…
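
    A first step for any such aid is recovering control-flow edges from the line-numbered source. The fragment below is a hypothetical miniature of that step, not BASCHART's actual algorithm.

    ```python
    # Recover (from_line, to_line) control-flow edges from line-numbered
    # BASIC: explicit GOTO/THEN targets plus implicit fall-through.
    import re

    def flow_edges(program: str):
        lines = [l.split(None, 1) for l in program.strip().splitlines()]
        numbers = [int(n) for n, _ in lines]
        edges = []
        for idx, (num, stmt) in enumerate(lines):
            for t in re.findall(r"(?:GOTO|THEN)\s+(\d+)", stmt):
                edges.append((int(num), int(t)))
            if not re.match(r"GOTO", stmt) and idx + 1 < len(lines):
                edges.append((int(num), numbers[idx + 1]))  # fall-through
        return edges

    prog = """10 INPUT X
    20 IF X < 0 THEN 50
    30 PRINT "OK"
    40 GOTO 60
    50 PRINT "NEGATIVE"
    60 END"""
    print(flow_edges(prog))
    ```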

  1. Rotor Wake/Stator Interaction Noise Prediction Code Technical Documentation and User's Manual

    NASA Technical Reports Server (NTRS)

    Topol, David A.; Mathews, Douglas C.

    2010-01-01

    This report documents the improvements and enhancements made by Pratt & Whitney to two NASA programs which together will calculate noise from a rotor wake/stator interaction. The code is a combination of subroutines from two NASA programs with many new features added by Pratt & Whitney. To do a calculation, V072 first uses a semi-empirical wake prediction to calculate the rotor wake characteristics at the stator leading edge. Results from the wake model are then automatically input into a rotor wake/stator interaction analytical noise prediction routine which calculates inlet and aft sound power levels for the blade-passage-frequency tones and their harmonics, along with the complex radial mode amplitudes. The code allows for a noise calculation to be performed for a compressor rotor wake/stator interaction, a fan wake/FEGV interaction, or a fan wake/core stator interaction. This report is split into two parts: the first part discusses the technical documentation of the program as improved by Pratt & Whitney; the second part is a user's manual which describes how input files are created and how the code is run.

  2. Volume I: fluidized-bed code documentation, for the period February 28, 1983-March 18, 1983

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Piperopoulou, H.; Finson, M.; Bloomfield, D.

    1983-03-01

    This documentation supersedes the previous documentation of the Fluidized-Bed Gasifier code. Volume I documents a simulation program of a Fluidized-Bed Gasifier (FBG), and Volume II documents a systems model of the FBG. The FBG simulation program is an updated version of the PSI/FLUBED code which is capable of modeling slugging beds and variable bed diameter. In its present form the code is set up to model a Westinghouse commercial-scale gasifier. The fluidized-bed gasifier model combines the classical bubbling-bed description for the transport and mixing processes with PSI-generated models for coal chemistry. At the distributor plate, the bubble composition is that of the inlet gas and the initial bubble size is set by the details of the distributor plate. Bubbles grow by coalescence as they rise. The bubble composition and temperature change with height due to transport to and from the cloud as well as homogeneous reactions within the bubble. The cloud composition also varies with height due to cloud/bubble exchange, cloud/emulsion exchange, and heterogeneous coal char reactions. The emulsion phase is considered to be well mixed.

  3. Medical decision making: guide to improved CPT coding.

    PubMed

    Holt, Jim; Warsy, Ambreen; Wright, Paula

    2010-04-01

    The Current Procedural Terminology (CPT) coding system for office visits, which has been in use since 1995, has not been well studied, but it is generally agreed that the system contains much room for error. In fact, the available literature suggests that only slightly more than half of physicians will agree on the same CPT code for a given visit, and only 60% of professional coders will agree on the same code for a particular visit. In addition, the criteria used to assign a code are often related to the amount of written documentation. The goal of this study was to evaluate two novel methods to assess if the most appropriate CPT code is used: the level of medical decision making, or the sum of all problems mentioned by the patient during the visit. The authors (a professional coder, a residency faculty member, and a PGY-3 family medicine resident) reviewed 351 randomly selected visit notes from two residency programs in the Northeast Tennessee region for the level of documentation, the level of medical decision making, and the total number of problems addressed. The authors assigned appropriate CPT codes at each of those three levels. Substantial undercoding occurred at each of the three levels. Approximately 33% of visits were undercoded based on the written documentation. Approximately 50% of the visits were undercoded based on the level of documented medical decision making. Approximately 80% of the visits were undercoded based on the total number of problems which the patient presented during the visit. Interrater agreement was fair, and similar to that noted in other coding studies. Undercoding is not only common in a family medicine residency program but also occurs at levels that would not be evident from a simple audit of the documentation on the visit note. Undercoding also occurs from not exploring problems mentioned by the patient and not documenting additional work that was performed. Family physicians may benefit from minor alterations in their documentation of office visit notes.

  4. ANOPP programming and documentation standards document

    NASA Technical Reports Server (NTRS)

    1976-01-01

    Standards defining the requirements for preparing software for the Aircraft Noise Prediction Program (ANOPP) were given. It is the intent of these standards to provide definition, design, coding, and documentation criteria for the achievement of a unity among ANOPP products. These standards apply to all of ANOPP's standard software system. The standards encompass philosophy as well as techniques and conventions.

  5. Software Engineering Laboratory (SEL) compendium of tools, revision 1

    NASA Technical Reports Server (NTRS)

    1982-01-01

    A set of programs used to aid software product development is listed. Known as software tools, such programs include requirements analyzers, design languages, precompilers, code auditors, code analyzers, and software librarians. Abstracts, resource requirements, documentation, processing summaries, and availability are indicated for most tools.

  6. Malnutrition coding 101: financial impact and more.

    PubMed

    Giannopoulos, Georgia A; Merriman, Louise R; Rumsey, Alissa; Zwiebel, Douglas S

    2013-12-01

    Recent articles have addressed the characteristics associated with adult malnutrition as published by the Academy of Nutrition and Dietetics (the Academy) and the American Society for Parenteral and Enteral Nutrition (A.S.P.E.N.). This article describes a successful interdisciplinary program developed by the Department of Food and Nutrition at New York-Presbyterian Hospital to maintain and monitor clinical documentation, ensure accurate International Classification of Diseases 9th Edition (ICD-9) coding, and identify subsequent incremental revenue resulting from the early identification, documentation, and treatment of malnutrition in an adult inpatient population. The first step in the process requires registered dietitians to identify patients with malnutrition; then clear and specifically worded diagnostic statements that include the type and severity of malnutrition are documented in the medical record by the physician, nurse practitioner, or physician's assistant. This protocol allows the Health Information Management/Coding department to accurately assign ICD-9 codes associated with protein-energy malnutrition. Once clinical coding is complete, a final diagnosis related group (DRG) is generated to ensure appropriate hospital reimbursement. Successful interdisciplinary programs such as this can drive optimal care and ensure appropriate reimbursement.

  7. Guide for Training Medical Laboratory Technicians. Fourth Edition.

    ERIC Educational Resources Information Center

    American Medical Technologists, Park Ridge, IL.

    This document is intended to assist educators in the development of medical laboratory technician training programs. The following elements are included in the document: (1) an introduction; (2) the American Medical Technologists' Code of Ethics; (3) suggested curricula for medical laboratory technician programs for a 12-month course and an…

  8. Methodology, status, and plans for development and assessment of the RELAP5 code

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Johnson, G.W.; Riemke, R.A.

    1997-07-01

    RELAP5/MOD3 is a computer code used for the simulation of transients and accidents in light-water nuclear power plants. The objective of the program to develop and maintain RELAP5 was and is to provide the U.S. Nuclear Regulatory Commission with an independent tool for assessing reactor safety. This paper describes code requirements, models, solution scheme, language and structure, user interface, validation, and documentation. The paper also describes the current and near-term development program and provides an assessment of the code's strengths and limitations.

  9. An Experiment in Scientific Code Semantic Analysis

    NASA Technical Reports Server (NTRS)

    Stewart, Mark E. M.

    1998-01-01

    This paper concerns a procedure that analyzes aspects of the meaning or semantics of scientific and engineering code. This procedure involves taking a user's existing code, adding semantic declarations for some primitive variables, and parsing this annotated code using multiple, distributed expert parsers. These semantic parsers are designed to recognize formulae in different disciplines including physical and mathematical formulae and geometrical position in a numerical scheme. The parsers will automatically recognize and document some static, semantic concepts and locate some program semantic errors. Results are shown for a subroutine test case and a collection of combustion code routines. This ability to locate some semantic errors and document semantic concepts in scientific and engineering code should reduce the time, risk, and effort of developing and using these codes.
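
    One concrete kind of semantic check this approach enables is dimensional consistency. The sketch below is not the paper's parser; it simply tracks physical dimensions as exponent vectors and flags a formula whose dimensions fail to balance.

    ```python
    # Dimensions as (mass, length, time) exponent vectors; a product's
    # dimension is the sum of the factors' exponents.
    KG_PER_M3 = (1, -3, 0)   # density rho
    M_PER_S = (0, 1, -1)     # velocity u
    PA = (1, -1, -2)         # pressure

    def mul(a, b):
        return tuple(x + y for x, y in zip(a, b))

    # Dynamic pressure q = 0.5 * rho * u**2; the constant is dimensionless.
    q_dim = mul(KG_PER_M3, mul(M_PER_S, M_PER_S))
    assert q_dim == PA, "semantic error: formula is dimensionally inconsistent"
    print("dynamic-pressure formula is consistent:", q_dim)
    ```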

  10. Computer Code for Transportation Network Design and Analysis

    DOT National Transportation Integrated Search

    1977-01-01

    This document describes the results of research into the application of the mathematical programming technique of decomposition to practical transportation network problems. A computer code called Catnap (for Control Analysis Transportation Network A...

  11. ACCESS 1: Approximation Concepts Code for Efficient Structural Synthesis program documentation and user's guide

    NASA Technical Reports Server (NTRS)

    Miura, H.; Schmit, L. A., Jr.

    1976-01-01

    The program documentation and user's guide for the ACCESS-1 computer program is presented. ACCESS-1 is a research oriented program which implements a collection of approximation concepts to achieve excellent efficiency in structural synthesis. The finite element method is used for structural analysis and general mathematical programming algorithms are applied in the design optimization procedure. Implementation of the computer program, preparation of input data and basic program structure are described, and three illustrative examples are given.

  12. Computer software documentation

    NASA Technical Reports Server (NTRS)

    Comella, P. A.

    1973-01-01

    A tutorial in the documentation of computer software is presented. It presents a methodology for achieving an adequate level of documentation as a natural outgrowth of the total programming effort commencing with the initial problem statement and definition and terminating with the final verification of code. It discusses the content of adequate documentation, the necessity for such documentation and the problems impeding achievement of adequate documentation.

  13. Summary Report for ASC L2 Milestone #4782: Assess Newly Emerging Programming and Memory Models for Advanced Architectures on Integrated Codes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Neely, J. R.; Hornung, R.; Black, A.

    This document serves as a detailed companion to the PowerPoint slides presented as part of the ASC L2 milestone review for Integrated Codes milestone #4782, titled “Assess Newly Emerging Programming and Memory Models for Advanced Architectures on Integrated Codes”, due on 9/30/2014, and presented for formal program review on 9/12/2014. The program review committee is represented by Mike Zika (A Program Project Lead for Kull), Brian Pudliner (B Program Project Lead for Ares), Scott Futral (DEG Group Lead in LC), and Mike Glass (Sierra Project Lead at Sandia). This document, along with the presentation materials and a letter of completion signed by the review committee, will act as proof of completion for this milestone.

  14. Alternative Education for Disruptive Youth: Recommended Parameters and Best Practices for Effective Programs

    ERIC Educational Resources Information Center

    Pennsylvania Department of Education, 2005

    2005-01-01

    This document was prepared to serve as an aid in the planning, design, implementation, and ongoing evaluation of Alternative Education for Disruptive Youth Programs in Pennsylvania. School Codes and Public Laws are cited for guidance on mandatory issues. The suggestions and implementation strategies provided in this document are not requirements…

  15. 75 FR 28594 - Ready-to-Learn Television Program

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-05-21

    ... Federal Register. Free Internet access to the official edition of the Federal Register and the Code of... Access to This Document: You can view this document, as well as all other documents of this Department published in the Federal Register, in text or Adobe Portable Document Format (PDF) on the Internet at the...

  16. TORO II: A finite element computer program for nonlinear quasi-static problems in electromagnetics: Part 2, User's manual

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gartling, D.K.

    User instructions are given for the finite element electromagnetics program TORO II. The theoretical background and numerical methods used in the program are documented in SAND95-2472. The present document also describes a number of example problems that have been analyzed with the code and provides sample input files for typical simulations. 20 refs., 34 figs., 3 tabs.

  17. Tools & Services - SEER Registrars

    Cancer.gov

    View glossary for registrars. Access ICD conversion programs, SEER Abstracting Tool, SEER Data Viewer, SEER interactive drug database for coding oncology drugs, data documentation, variable recodes, and SEER Application Programming Interface for developers.

  18. EA Shuttle Document Retention Effort

    NASA Technical Reports Server (NTRS)

    Wagner, Howard A.

    2010-01-01

    This slide presentation reviews the effort of code EA at Johnson Space Center (JSC) to identify and acquire databases and documents from the space shuttle program that are adjudged important for retention after the retirement of the space shuttle.

  19. Communications and Information: Compendium of Communications and Information Terminology

    DTIC Science & Technology

    2002-02-01

    Basic Access Module; BASIC—Beginners All-Purpose Symbolic Instruction Code; BBP—Baseband Processor; BBS—Bulletin Board Service (System); BBTC—Broadband... media, formats and labels, programming language, computer documentation, flowcharts and terminology, character codes, data communications and input

  20. Calculation of two-dimensional inlet flow fields in a supersonic free stream: Program documentation and test cases

    NASA Technical Reports Server (NTRS)

    Biringen, S. H.; Mcmillan, O. J.

    1980-01-01

    The use of a computer code for the calculation of two dimensional inlet flow fields in a supersonic free stream and a nonorthogonal mesh-generation code are illustrated by specific examples. Input, output, and program operation and use are given and explained for the case of supercritical inlet operation at a subdesign Mach number (free-stream Mach number M = 2.09) for an isentropic-compression, drooped-cowl inlet. Source listings of the computer codes are also provided.

  1. ISSYS: An integrated synergistic Synthesis System

    NASA Technical Reports Server (NTRS)

    Dovi, A. R.

    1980-01-01

    Integrated Synergistic Synthesis System (ISSYS), an integrated system of computer codes in which the sequence of program execution and data flow is controlled by the user, is discussed. The commands available to exert such control, the ISSYS major function and rules, and the computer codes currently available in the system are described. Computational sequences frequently used in the aircraft structural analysis and synthesis are defined. External computer codes utilized by the ISSYS system are documented. A bibliography on the programs is included.

  2. FORTRAN Automated Code Evaluation System (faces) system documentation, version 2, mod 0. [error detection codes/user manuals (computer programs)

    NASA Technical Reports Server (NTRS)

    1975-01-01

    A system is presented which processes FORTRAN based software systems to surface potential problems before they become execution malfunctions. The system complements the diagnostic capabilities of compilers, loaders, and execution monitors rather than duplicating these functions. Also, it emphasizes frequent sources of FORTRAN problems which require inordinate manual effort to identify. The principal value of the system is extracting small sections of unusual code from the bulk of normal sequences. Code structures likely to cause immediate or future problems are brought to the user's attention. These messages stimulate timely corrective action of solid errors and promote identification of 'tricky' code. Corrective action may require recoding or simply extending software documentation to explain the unusual technique.
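
    The "unusual code" extraction can be pictured as a pattern scan over the source. The rules below are illustrative stand-ins, not the system's actual rule set.

    ```python
    # Toy scan for FORTRAN constructs that compile cleanly but often repay
    # manual review; the rule choice here is illustrative only.
    import re

    RULES = {
        "floating-point equality test": re.compile(r"\.EQ\.", re.IGNORECASE),
        "computed GOTO": re.compile(r"GO\s*TO\s*\(", re.IGNORECASE),
        "EQUIVALENCE aliasing": re.compile(r"^\s*EQUIVALENCE", re.IGNORECASE),
    }

    def scan(source: str):
        for lineno, line in enumerate(source.splitlines(), 1):
            for label, pattern in RULES.items():
                if pattern.search(line):
                    print(f"line {lineno}: {label}: {line.strip()}")

    scan("""      IF (X .EQ. 0.1) GO TO 20
          EQUIVALENCE (A, B)
          GO TO (10, 20, 30), K
    """)
    ```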

  3. Computer program CDCID: an automated quality control program using CDC update

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Singer, G.L.; Aguilar, F.

    1984-04-01

    A computer program, CDCID, has been developed in coordination with a quality control program to provide a highly automated method of documenting changes to computer codes at EG and G Idaho, Inc. The method uses the standard CDC UPDATE program in such a manner that updates and their associated documentation are easily made and retrieved in various formats. The method allows each card image of a source program to point to the document which describes it, who created the card, and when it was created. The method described is applicable to the quality control of computer programs in general. The computer program described is executable only on CDC computing systems, but the program could be modified and applied to any computing system with an adequate updating program.

  4. Planning for Pre-Exascale Platform Environment (Fiscal Year 2015 Level 2 Milestone 5216)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Springmeyer, R.; Lang, M.; Noe, J.

    This Plan for ASC Pre-Exascale Platform Environments document constitutes the deliverable for the fiscal year 2015 (FY15) Advanced Simulation and Computing (ASC) Program Level 2 milestone Planning for Pre-Exascale Platform Environment. It acknowledges and quantifies challenges and recognized gaps for moving the ASC Program towards effective use of exascale platforms and recommends strategies to address these gaps. This document also presents an update to the concerns, strategies, and plans presented in the FY08 predecessor document that dealt with the upcoming (at the time) petascale high performance computing (HPC) platforms. With the looming push towards exascale systems, a review of the earlier document was appropriate in light of the myriad architectural choices currently under consideration. The ASC Program believes the platforms to be fielded in the 2020s will be fundamentally different systems that stress ASC’s ability to modify codes to take full advantage of new or unique features. In addition, the scale of components will increase the difficulty of maintaining an error-free system, thus driving new approaches to resilience and error detection/correction. The code revamps of the past, from serial- to vector-centric code to distributed memory to threaded implementations, will be revisited as codes adapt to a new message passing interface (MPI) plus “x” or more advanced and dynamic programming models based on architectural specifics. Development efforts are already underway in some cases, and more difficult or uncertain aspects of the new architectures will require research and analysis that may inform future directions for program choices. In addition, the potential diversity of system architectures may require parallel if not duplicative efforts to analyze and modify environments, codes, subsystems, libraries, debugging tools, and performance analysis techniques as well as exploring new monitoring methodologies. It is difficult if not impossible to selectively eliminate some of these activities until more information is available through simulations of potential architectures, analysis of systems designs, and informed study of commodity technologies that will be the constituent parts of future platforms.

  5. Praxis language reference manual

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Walker, J.H.

    1981-01-01

    This document is a language reference manual for the programming language Praxis. The document contains the specifications that must be met by any compiler for the language. The Praxis language was designed for systems programming in real-time process applications. Goals for the language and its implementations are: (1) highly efficient code generated by the compiler; (2) program portability; (3) completeness, that is, all programming requirements can be met by the language without needing an assembler; and (4) separate compilation to aid in design and management of large systems. The language does not provide any facilities for input/output, stack and queue handling, string operations, parallel processing, or coroutine processing. These features can be implemented as routines in the language, using machine-dependent code to take advantage of facilities in the control environment on different machines.

  6. Additional extensions to the NASCAP computer code, volume 1

    NASA Technical Reports Server (NTRS)

    Mandell, M. J.; Katz, I.; Stannard, P. R.

    1981-01-01

    Extensions and revisions to a computer code that comprehensively analyzes problems of spacecraft charging (NASCAP) are documented. Using a fully three dimensional approach, it can accurately predict spacecraft potentials under a variety of conditions. Among the extensions are a multiple electron/ion gun test tank capability, and the ability to model anisotropic and time dependent space environments. Also documented are a greatly extended MATCHG program and the preliminary version of NASCAP/LEO. The interactive MATCHG code was developed into an extremely powerful tool for the study of material-environment interactions. The NASCAP/LEO, a three dimensional code to study current collection under conditions of high voltages and short Debye lengths, was distributed for preliminary testing.

  7. TableSim--A program for analysis of small-sample categorical data.

    Treesearch

    David J. Rugg

    2003-01-01

    Documents a computer program for calculating correct P-values of 1-way and 2-way tables when sample sizes are small. The program is written in Fortran 90; the executable code runs in 32-bit Microsoft command-line environments.
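
    For a small 2x2 table the analogous exact computation is available off the shelf; the SciPy call below is a stand-in for, not a port of, the Fortran 90 program.

    ```python
    # Exact P-value for a small-sample 2x2 table via Fisher's exact test.
    from scipy.stats import fisher_exact

    table = [[3, 1],    # counts for outcome A / not-A in group 1
             [1, 9]]    # counts for outcome A / not-A in group 2
    oddsratio, p = fisher_exact(table, alternative="two-sided")
    print(f"odds ratio = {oddsratio:.2f}, exact P = {p:.4f}")
    ```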

  8. A Semantic Analysis Method for Scientific and Engineering Code

    NASA Technical Reports Server (NTRS)

    Stewart, Mark E. M.

    1998-01-01

    This paper develops a procedure to statically analyze aspects of the meaning or semantics of scientific and engineering code. The analysis involves adding semantic declarations to a user's code and parsing this semantic knowledge with the original code using multiple expert parsers. These semantic parsers are designed to recognize formulae in different disciplines including physical and mathematical formulae and geometrical position in a numerical scheme. In practice, a user would submit code with semantic declarations of primitive variables to the analysis procedure, and its semantic parsers would automatically recognize and document some static, semantic concepts and locate some program semantic errors. A prototype implementation of this analysis procedure is demonstrated. Further, the relationship between the fundamental algebraic manipulations of equations and the parsing of expressions is explained. This ability to locate some semantic errors and document semantic concepts in scientific and engineering code should reduce the time, risk, and effort of developing and using these codes.

  9. CASL Verification and Validation Plan

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mousseau, Vincent Andrew; Dinh, Nam

    2016-06-30

    This report documents the Consortium for Advanced Simulation of LWRs (CASL) verification and validation plan. The document builds upon input from CASL subject matter experts, most notably the CASL Challenge Problem Product Integrators, CASL Focus Area leaders, and CASL code development and assessment teams. This will be a living document that tracks CASL's progress on verification and validation for both the CASL codes (including MPACT, CTF, BISON, MAMBA) and the CASL challenge problems (CIPS, PCI, DNB). The CASL codes and the CASL challenge problems are at differing levels of maturity with respect to validation and verification. The gap analysis will summarize additional work that needs to be done. Additional VVUQ work will be done as resources permit. This report is prepared for the Department of Energy’s (DOE’s) CASL program in support of milestone CASL.P13.02.

  10. Computational strategy for the solution of large strain nonlinear problems using the Wilkins explicit finite-difference approach

    NASA Technical Reports Server (NTRS)

    Hofmann, R.

    1980-01-01

    The STEALTH code system, which solves large strain, nonlinear continuum mechanics problems, was rigorously structured in both overall design and programming standards. The design is based on the theoretical elements of analysis while the programming standards attempt to establish a parallelism between physical theory, programming structure, and documentation. These features have made it easy to maintain, modify, and transport the codes. It has also guaranteed users a high level of quality control and quality assurance.

  11. MODTRAN6: a major upgrade of the MODTRAN radiative transfer code

    NASA Astrophysics Data System (ADS)

    Berk, Alexander; Conforti, Patrick; Kennett, Rosemary; Perkins, Timothy; Hawes, Frederick; van den Bosch, Jeannette

    2014-06-01

    The MODTRAN6 radiative transfer (RT) code is a major advancement over earlier versions of the MODTRAN atmospheric transmittance and radiance model. This version of the code incorporates modern software architecture, including an application programming interface, enhanced physics features including a line-by-line algorithm, a supplementary physics toolkit, and new documentation. The application programming interface has been developed for ease of integration into user applications. The MODTRAN code has been restructured towards a modular, object-oriented architecture to simplify upgrades as well as facilitate integration with other developers' codes. MODTRAN now includes a line-by-line algorithm for high resolution RT calculations as well as coupling to optical scattering codes for easy implementation of custom aerosols and clouds.

  12. Programming in HAL/S

    NASA Technical Reports Server (NTRS)

    Ryer, M. J.

    1978-01-01

    HAL/S is a computer programming language; it is a representation for algorithms which can be interpreted by either a person or a computer. HAL/S compilers transform blocks of HAL/S code into machine language which can then be directly executed by a computer. When the machine language is executed, the algorithm specified by the HAL/S code (source) is performed. This document describes how to read and write HAL/S source.

  13. An Experiment in Scientific Program Understanding

    NASA Technical Reports Server (NTRS)

    Stewart, Mark E. M.; Owen, Karl (Technical Monitor)

    2000-01-01

    This paper concerns a procedure that analyzes aspects of the meaning or semantics of scientific and engineering code. This procedure involves taking a user's existing code, adding semantic declarations for some primitive variables, and parsing this annotated code using multiple, independent expert parsers. These semantic parsers encode domain knowledge and recognize formulae in different disciplines including physics, numerical methods, mathematics, and geometry. The parsers will automatically recognize and document some static, semantic concepts and help locate some program semantic errors. Results are shown for three intensively studied codes and seven blind test cases; all test cases are state of the art scientific codes. These techniques may apply to a wider range of scientific codes. If so, the techniques could reduce the time, risk, and effort required to develop and modify scientific codes.

  14. Documentation of a graphical display program for the saturated- unsaturated transport (SUTRA) finite-element simulation model

    USGS Publications Warehouse

    Souza, W.R.

    1987-01-01

    This report documents a graphical display program for the U. S. Geological Survey finite-element groundwater flow and solute transport model. Graphic features of the program, SUTRA-PLOT (SUTRA = saturated/unsaturated transport), include: (1) plots of the finite-element mesh, (2) velocity vector plots, (3) contour plots of pressure, solute concentration, temperature, or saturation, and (4) a finite-element interpolator for gridding data prior to contouring. SUTRA-PLOT is written in FORTRAN 77 on a PRIME 750 computer system, and requires Version 9.0 or higher of the DISSPLA graphics library. The program requires two input files: the SUTRA input data list and the SUTRA simulation output listing. The program is menu driven, and specifications for individual types of plots are entered and may be edited interactively. Installation instructions, a source code listing, and a description of the computer code are given. Six examples of plotting applications are used to demonstrate various features of the plotting program. (Author's abstract)
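
    A rough modern analogue of the gridding-then-contouring pipeline (the original is FORTRAN 77 against the DISSPLA library) can be put together with NumPy, SciPy, and Matplotlib; all data below are synthetic.

    ```python
    # Grid scattered nodal values, then draw contours -- the same two steps
    # SUTRA-PLOT performs with its finite-element interpolator and DISSPLA.
    import matplotlib.pyplot as plt
    import numpy as np
    from scipy.interpolate import griddata

    rng = np.random.default_rng(1)
    x, y = rng.uniform(0, 1, 50), rng.uniform(0, 1, 50)
    pressure = np.exp(-8 * ((x - 0.5) ** 2 + (y - 0.5) ** 2))  # synthetic

    xi, yi = np.meshgrid(np.linspace(0, 1, 100), np.linspace(0, 1, 100))
    zi = griddata((x, y), pressure, (xi, yi), method="linear")

    plt.contour(xi, yi, zi, levels=10)
    plt.title("Pressure contours (synthetic example)")
    plt.savefig("contours.png")
    ```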

  15. [Quality of documentation of intraoperative and postoperative complications : improvement of documentation for a nationwide quality assurance program and comparison with routine data].

    PubMed

    Jakob, J; Marenda, D; Sold, M; Schlüter, M; Post, S; Kienle, P

    2014-08-01

    Complications after cholecystectomy are continuously documented in a nationwide database in Germany. Recent studies demonstrated a lack of reliability of these data. The aim of the study was to evaluate the impact of a control algorithm on documentation quality and the use of routine diagnosis coding as an additional validation instrument. Completeness and correctness of the documentation of complications after cholecystectomy was compared over a time interval of 12 months before and after implementation of an algorithm for faster and more accurate documentation. Furthermore, the coding of all diagnoses was screened to identify intraoperative and postoperative complications. The sensitivity of the documentation for complications improved from 46 % to 70 % (p = 0.05, specificity 98 % in both time intervals). A prolonged time interval of more than 6 weeks between patient discharge and documentation was associated with inferior data quality (incorrect documentation in 1.5 % versus 15 %, p < 0.05). The rate of case documentation within the 6 weeks after hospital discharge was clearly improved after implementation of the control algorithm. Sensitivity and specificity of screening for complications by evaluating routine diagnoses coding were 70 % and 85 %, respectively. The quality of documentation was improved by implementation of a simple memory algorithm.

  16. An ultraviolet-visible spectrophotometer automation system. Part 3: Program documentation

    NASA Astrophysics Data System (ADS)

    Roth, G. S.; Teuschler, J. M.; Budde, W. L.

    1982-07-01

    The Ultraviolet-Visible Spectrophotometer (UVVIS) automation system accomplishes 'on-line' spectrophotometric quality assurance determinations, report generations, plot generations and data reduction for chlorophyll or color analysis. This system also has the capability to process manually entered data for the analysis of chlorophyll or color. For each program of the UVVIS system, this document contains a program description, flowchart, variable dictionary, code listing, and symbol cross-reference table. Also included are descriptions of file structures and of routines common to all automated analyses. The programs are written in Data General extended BASIC, Revision 4.3, under the RDOS operating system, Revision 6.2. The BASIC code has been enhanced for real-time data acquisition, which is accomplished by CALLS to assembly language subroutines. Two other related publications are 'An Ultraviolet-Visible Spectrophotometer Automation System - Part I Functional Specifications,' and 'An Ultraviolet-Visible Spectrophotometer Automation System - Part II User's Guide.'

  17. PROTEUS two-dimensional Navier-Stokes computer code, version 1.0. Volume 2: User's guide

    NASA Technical Reports Server (NTRS)

    Towne, Charles E.; Schwab, John R.; Benson, Thomas J.; Suresh, Ambady

    1990-01-01

    A new computer code was developed to solve the two-dimensional or axisymmetric, Reynolds averaged, unsteady compressible Navier-Stokes equations in strong conservation law form. The thin-layer or Euler equations may also be solved. Turbulence is modeled using an algebraic eddy viscosity model. The objective was to develop a code for aerospace applications that is easy to use and easy to modify. Code readability, modularity, and documentation were emphasized. The equations are written in nonorthogonal body-fitted coordinates, and solved by marching in time using a fully-coupled alternating direction-implicit procedure with generalized first- or second-order time differencing. All terms are linearized using second-order Taylor series. The boundary conditions are treated implicitly, and may be steady, unsteady, or spatially periodic. Simple Cartesian or polar grids may be generated internally by the program. More complex geometries require an externally generated computational coordinate system. The documentation is divided into three volumes. Volume 2 is the User's Guide, and describes the program's general features, the input and output, the procedure for setting up initial conditions, the computer resource requirements, the diagnostic messages that may be generated, the job control language used to run the program, and several test cases.
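    The solution procedure named above can be stated compactly. The following schematic is paraphrased, not transcribed from the PROTEUS manuals, and suppresses metric terms and boundary treatment:

```latex
% Schematic only; notation paraphrased, not quoted from the PROTEUS documentation.
% Second-order Taylor-series linearization of a flux vector about time level n:
F(Q^{n+1}) = F(Q^n) + \left(\frac{\partial F}{\partial Q}\right)^{\!n} \Delta Q^n + O(\Delta t^2),
\qquad \Delta Q^n \equiv Q^{n+1} - Q^n .

% Alternating-direction-implicit factorization into sequential 1-D implicit sweeps:
\left(I + \Delta t\,\delta_\xi A^n\right)\left(I + \Delta t\,\delta_\eta B^n\right)\Delta Q^n = \mathrm{RHS}^n,
\qquad A = \frac{\partial F}{\partial Q}, \quad B = \frac{\partial G}{\partial Q}.
```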

  18. Metal Matrix Laminate Tailoring (MMLT) code: User's manual

    NASA Technical Reports Server (NTRS)

    Murthy, P. L. N.; Morel, M. R.; Saravanos, D. A.

    1993-01-01

    The User's Manual for the Metal Matrix Laminate Tailoring (MMLT) program is presented. The code is capable of tailoring the fabrication process, constituent characteristics, and laminate parameters (individually or concurrently) for a wide variety of metal matrix composite (MMC) materials, to improve the performance and identify trends or behavior of MMC's under different thermo-mechanical loading conditions. This document is meant to serve as a guide in the use of the MMLT code. Detailed explanations of the composite mechanics and tailoring analysis are beyond the scope of this document, and may be found in the references. MMLT was developed by the Structural Mechanics Branch at NASA Lewis Research Center (LeRC).

  19. DOE Hydrogen Program 2004 Annual Merit Review and Peer Evaluation Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    This document summarizes the project evaluations and comments from the DOE Hydrogen Program 2004 Annual Program Review. Hydrogen production, delivery and storage; fuel cells; technology validation; safety, codes and standards; and education R&D projects funded by DOE in FY2004 are reviewed.

  20. Computational techniques for solar wind flows past terrestrial planets: Theory and computer programs

    NASA Technical Reports Server (NTRS)

    Stahara, S. S.; Chaussee, D. S.; Trudinger, B. C.; Spreiter, J. R.

    1977-01-01

    The interaction of the solar wind with terrestrial planets can be predicted using a computer program based on a single fluid, steady, dissipationless, magnetohydrodynamic model to calculate the axisymmetric, supersonic, super-Alfvenic solar wind flow past both magnetic and nonmagnetic planets. The actual calculations are implemented by an assemblage of computer codes organized into one program. These include finite difference codes which determine the gas-dynamic solution, together with a variety of special purpose output codes for determining and automatically plotting both flow field and magnetic field results. Comparisons are made with previous results, and results are presented for a number of solar wind flows. The computational programs developed are documented and are presented in a general user's manual which is included.

  1. FORTRAN tools

    NASA Technical Reports Server (NTRS)

    Presser, L.

    1978-01-01

    An integrated set of FORTRAN tools that are commercially available is described. The basic purpose of various tools is summarized and their economic impact highlighted. The areas addressed by these tools include: code auditing, error detection, program portability, program instrumentation, documentation, clerical aids, and quality assurance.

  2. AGARD Flight Test Instrumentation Series. Volume 18. Microprocessor Applications in Airborne Flight Test Instrumentation

    DTIC Science & Technology

    1987-02-01

    flowcharting. 3. Program Coding in HLL. This stage consists of transcribing the previously designed program into a form that can be translated into the machine...specified conditions 7. Documentation. Program documentation is necessary for user information, for maintenance, and for future applications. Flowcharts...particular CPU. Asynchronous. Operating without reference to an overall timing source. BASIC. Beginners' All-purpose Symbolic Instruction Code; a widely

  3. Object-oriented approach for gas turbine engine simulation

    NASA Technical Reports Server (NTRS)

    Curlett, Brian P.; Felder, James L.

    1995-01-01

    An object-oriented gas turbine engine simulation program was developed. This program is a prototype for a more complete, commercial-grade engine performance program now being proposed as part of the Numerical Propulsion System Simulator (NPSS). This report discusses architectural issues of this complex software system and the lessons learned from developing the prototype code. The prototype code is a fully functional, general-purpose engine simulation program; however, only the component models necessary to model a transient compressor test rig have been written. The production system will be capable of steady-state and transient modeling of almost any turbine engine configuration. Chief among the architectural considerations for this code was the framework in which the various software modules interact. These modules include the equation solver, simulation code, data model, event handler, and user interface. Also documented in this report is the component-based design of the simulation module and the inter-component communication paradigm. Object class hierarchies for some of the code modules are given.

  4. Residential Building Energy Code Field Study

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    R. Bartlett, M. Halverson, V. Mendon, J. Hathaway, Y. Xie

    This document presents a methodology for assessing baseline energy efficiency in new single-family residential buildings and quantifying related savings potential. The approach was developed by Pacific Northwest National Laboratory (PNNL) for the U.S. Department of Energy (DOE) Building Energy Codes Program with the objective of assisting states as they assess energy efficiency in residential buildings and implementation of their building energy codes, as well as to target areas for improvement through energy codes and broader energy-efficiency programs. It is also intended to facilitate a consistent and replicable approach to research studies of this type and establish a transparent data set to represent baseline construction practices across U.S. states.

  5. The detection and extraction of interleaved code segments

    NASA Technical Reports Server (NTRS)

    Rugaber, Spencer; Stirewalt, Kurt; Wills, Linda M.

    1995-01-01

    This project is concerned with a specific difficulty that arises when trying to understand and modify computer programs. In particular, it is concerned with the phenomenon of 'interleaving', in which one section of a program accomplishes several purposes, and disentangling the code responsible for each purpose is difficult. Unraveling interleaved code involves discovering the purpose of each strand of computation, as well as understanding why the programmer decided to interleave the strands. Increased understanding improves the productivity and quality of software maintenance, enhancement, and documentation activities. It is the goal of the project to characterize the phenomenon of interleaving as a prerequisite for building tools to detect and extract interleaved code fragments.

  6. 40 CFR 80.158 - Product transfer documents (PTDs).

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... exempt base gasoline to be used for research, development, or test purposes only, the following warning must also be stated on the PTD: “For use in research, development, and test programs only.” (6) The...) Use of product codes and other non-regulatory language. (1) Product codes and other non-regulatory...

  7. PROTEUS two-dimensional Navier-Stokes computer code, version 1.0. Volume 3: Programmer's reference

    NASA Technical Reports Server (NTRS)

    Towne, Charles E.; Schwab, John R.; Benson, Thomas J.; Suresh, Ambady

    1990-01-01

    A new computer code was developed to solve the 2-D or axisymmetric, Reynolds-averaged, unsteady compressible Navier-Stokes equations in strong conservation law form. The thin-layer or Euler equations may also be solved. Turbulence is modeled using an algebraic eddy viscosity model. The objective was to develop a code for aerospace applications that is easy to use and easy to modify. Code readability, modularity, and documentation were emphasized. The equations are written in nonorthogonal body-fitted coordinates, and solved by marching in time using a fully-coupled alternating-direction-implicit procedure with generalized first- or second-order time differencing. All terms are linearized using second-order Taylor series. The boundary conditions are treated implicitly, and may be steady, unsteady, or spatially periodic. Simple Cartesian or polar grids may be generated internally by the program. More complex geometries require an externally generated computational coordinate system. The documentation is divided into three volumes. Volume 3 is the Programmer's Reference, and describes the program structure, the FORTRAN variables stored in common blocks, and the details of each subprogram.

  8. High Speed Research Noise Prediction Code (HSRNOISE) User's and Theoretical Manual

    NASA Technical Reports Server (NTRS)

    Golub, Robert (Technical Monitor); Rawls, John W., Jr.; Yeager, Jessie C.

    2004-01-01

    This report describes a computer program, HSRNOISE, that predicts noise levels for a supersonic aircraft powered by mixed flow turbofan engines with rectangular mixer-ejector nozzles. It fully documents the noise prediction algorithms, provides instructions for executing the HSRNOISE code, and provides predicted noise levels for the High Speed Research (HSR) program Technology Concept (TC) aircraft. The component source noise prediction algorithms were developed jointly by Boeing, General Electric Aircraft Engines (GEAE), NASA and Pratt & Whitney during the course of the NASA HSR program. Modern Technologies Corporation developed an alternative mixer ejector jet noise prediction method under contract to GEAE that has also been incorporated into the HSRNOISE prediction code. Algorithms for determining propagation effects and calculating noise metrics were taken from the NASA Aircraft Noise Prediction Program.

  9. A supersonic three-dimensional code for flow over blunt bodies: Program documentation and test cases

    NASA Technical Reports Server (NTRS)

    Chaussee, D. S.; Mcmillan, O. J.

    1980-01-01

    The use of a computer code for the calculation of steady, supersonic, three-dimensional, inviscid flow over blunt bodies is illustrated. Input and output are given and explained for two cases: a pointed cone of 20 deg half-angle at 15 deg angle of attack in a free stream with M∞ = 7, and a cone-ogive-cylinder at 10 deg angle of attack with M∞ = 2.86. A source listing of the computer code is provided.

  10. High-level waste storage tank farms/242-A evaporator standards/requirements identification document (S/RID), Vol. 7

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    1994-04-01

    This Requirements Identification Document (RID) describes an Occupational Health and Safety Program as defined through the Relevant DOE Orders, regulations, industry codes/standards, industry guidance documents and, as appropriate, good industry practice. The definition of an Occupational Health and Safety Program as specified by this document is intended to address Defense Nuclear Facilities Safety Board Recommendations 90-2 and 91-1, which call for the strengthening of DOE complex activities through the identification and application of relevant standards which supplement or exceed requirements mandated by DOE Orders. This RID applies to the activities, personnel, structures, systems, components, and programs involved in maintaining the facility and executing the mission of the High-Level Waste Storage Tank Farms.

  11. The NATA code; theory and analysis. Volume 3: Programmer's manual. [for calculating flow in arc-heated wind tunnels and conditions on models tested in reentry simulation

    NASA Technical Reports Server (NTRS)

    Bade, W. L.; Yos, J. M.

    1975-01-01

    The present, third volume of the final report is a programmer's manual for the code. It provides a listing of the FORTRAN 4 source program; a complete glossary of FORTRAN symbols; a discussion of the purpose and method of operation of each subroutine (including mathematical analyses of special algorithms); and a discussion of the operation of the code on IBM/360 and UNIVAC 1108 systems, including required control cards and the overlay structure used to accommodate the code to the limited core size of the 1108. In addition, similar information is provided to document the programming of the NOZFIT code, which is employed to set up nozzle profile curvefits for use in NATA.

  12. Using Administrative Mental Health Indicators in Heart Failure Outcomes Research: Comparison of Clinical Records and International Classification of Disease Coding.

    PubMed

    Bender, Miriam; Smith, Tyler C

    2016-01-01

    Use of mental health indicators in health outcomes research is of growing interest to researchers. This study, as part of a larger research program, quantified agreement between administrative International Classification of Disease (ICD-9) coding for, and "gold standard" clinician documentation of, mental health issues (MHIs) in hospitalized heart failure (HF) patients to determine the validity of mental health administrative data for use in HF outcomes research. A 13% random sample (n = 504) was selected from all unique patients (n = 3,769) hospitalized with a primary HF diagnosis at 4 San Diego County community hospitals during 2009-2012. MHI was defined as ICD-9 discharge diagnostic coding 290-319. Records were audited for clinician documentation of MHI. A total of 43% (n = 216) had mental health clinician documentation; 33% (n = 164) had ICD-9 coding for MHI. ICD-9 code bundle 290-319 had 0.70 sensitivity, 0.97 specificity, and kappa 0.69 (95% confidence interval 0.61-0.79). More specific ICD-9 MHI code bundles had kappas ranging from 0.44 to 0.82 and sensitivities ranging from 42% to 82%. Agreement between ICD-9 coding and clinician documentation for a broadly defined MHI is substantial, and can validly "rule in" MHI for hospitalized patients with heart failure. More specific MHI code bundles had fair to almost perfect agreement, with a wide range of sensitivities for identifying patients with an MHI. Copyright © 2016 Elsevier Inc. All rights reserved.
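    For reference, the sketch below shows how the reported agreement statistics are computed from a 2x2 table. The cell counts are invented to be roughly consistent with the sample size and rates quoted above; they are not the study data.

```python
# Illustrative agreement statistics for ICD-9 coding vs. chart documentation.
def cohens_kappa(a, b, c, d):
    """Cohen's kappa for a 2x2 table:
       a = both positive, b = ICD-9 only, c = chart only, d = both negative."""
    n = a + b + c + d
    po = (a + d) / n                                       # observed agreement
    pe = ((a + b) * (a + c) + (c + d) * (b + d)) / n**2    # chance agreement
    return (po - pe) / (1 - pe)

# Invented counts roughly matching n = 504, 43% chart MHI, 33% ICD-9 MHI:
a, b, c, d = 151, 13, 65, 275
print(round(cohens_kappa(a, b, c, d), 2))   # ~0.67, near the reported 0.69
print(round(a / (a + c), 2))                # sensitivity ~0.70
```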

  13. Transient Heat Transfer in Coated Superconductors.

    DTIC Science & Technology

    1982-10-29

    of the use of the SCEPTRE code are contained in the instruction manual and the book on the code. 30 An example of an actual SCEPTRE program is given in...22. O. Tsukomoto and S. Kobayashi, J. of Appl. Physics, 46, 1359, (1975) 23. Y. Iwasa and B.A. Apgar, Cryogenics 18, 267, (1978) 24. D.E. Baynham, V.W...Computer program for circuit and Systems Analysis. Prentice Hall 1971 and J.C. Bowers et al., Users Manual for Super-Sceptre, Government Document AD/A-011

  14. FPT- FORTRAN PROGRAMMING TOOLS FOR THE DEC VAX

    NASA Technical Reports Server (NTRS)

    Ragosta, A. E.

    1994-01-01

    The FORTRAN Programming Tools (FPT) are a series of tools used to support the development and maintenance of FORTRAN 77 source codes. Included are a debugging aid, a CPU time monitoring program, source code maintenance aids, print utilities, and a library of useful, well-documented programs. These tools assist in reducing development time and encouraging high quality programming. Although intended primarily for FORTRAN programmers, some of the tools can be used on data files and other programming languages. BUGOUT is a series of FPT programs that have proven very useful in debugging a particular kind of error and in optimizing CPU-intensive codes. The particular type of error is the illegal addressing of data or code as a result of subtle FORTRAN errors that are not caught by the compiler or at run time. A TRACE option also allows the programmer to verify the execution path of a program. The TIME option assists the programmer in identifying the CPU-intensive routines in a program to aid in optimization studies. Program coding, maintenance, and print aids available in FPT include: routines for building standard format subprogram stubs; cleaning up common blocks and NAMELISTs; removing all characters after column 72; displaying two files side by side on a VT-100 terminal; creating a neat listing of a FORTRAN source code including a Table of Contents, an Index, and Page Headings; converting files between VMS internal format and standard carriage control format; changing text strings in a file without using EDT; and replacing tab characters with spaces. The library of useful, documented programs includes the following: time and date routines; a string categorization routine; routines for converting between decimal, hex, and octal; routines to delay process execution for a specified time; a Gaussian elimination routine for solving a set of simultaneous linear equations; a curve fitting routine for least squares fit to polynomial, exponential, and sinusoidal forms (with a screen-oriented editor); a cubic spline fit routine; a screen-oriented array editor; routines to support parsing; and various terminal support routines. These FORTRAN programming tools are written in FORTRAN 77 and ASSEMBLER for interactive and batch execution. FPT is intended for implementation on DEC VAX series computers operating under VMS. This collection of tools was developed in 1985.
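    Among the library routines named above is a simultaneous-linear-equation solver. The following is only a compact Python sketch of that standard technique (Gaussian elimination with partial pivoting), not FPT's FORTRAN 77 implementation.

```python
# Gaussian elimination with partial pivoting and back substitution.
import numpy as np

def gauss_solve(A, b):
    A = A.astype(float).copy()
    b = b.astype(float).copy()
    n = len(b)
    for k in range(n - 1):
        p = k + np.argmax(np.abs(A[k:, k]))          # pivot row
        A[[k, p]], b[[k, p]] = A[[p, k]], b[[p, k]]  # swap rows k and p
        for i in range(k + 1, n):
            m = A[i, k] / A[k, k]
            A[i, k:] -= m * A[k, k:]
            b[i] -= m * b[k]
    x = np.zeros(n)
    for i in range(n - 1, -1, -1):                   # back substitution
        x[i] = (b[i] - A[i, i + 1:] @ x[i + 1:]) / A[i, i]
    return x

A = np.array([[3.0, 2.0], [1.0, 4.0]])
print(gauss_solve(A, np.array([5.0, 6.0])))          # [0.8  1.3]
```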

  15. Integral flange design program. [procedure for computing stresses

    NASA Technical Reports Server (NTRS)

    Wilson, J. F.

    1974-01-01

    An automated interactive flange design program utilizing an electronic desk top calculator is presented. The program calculates the operating and seating stresses for circular flanges of the integral or optional type subjected to internal pressure. The required input information is documented. The program provides an automated procedure for computing stresses in selected flange geometries for comparison to the allowable code values.

  16. Staggered-grid finite-difference acoustic modeling with the Time-Domain Atmospheric Acoustic Propagation Suite (TDAAPS).

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Aldridge, David Franklin; Collier, Sandra L.; Marlin, David H.

    2005-05-01

    This document is intended to serve as a user's guide for the time-domain atmospheric acoustic propagation suite (TDAAPS) program developed as part of the Department of Defense High Performance Computing Modernization Program Office (HPCMP) Common High-Performance Computing Scalable Software Initiative (CHSSI). TDAAPS performs staggered-grid finite-difference modeling of the acoustic velocity-pressure system with the incorporation of spatially inhomogeneous winds. Wherever practical, the control structure of the code is written in C++ using an object-oriented design. Sections of code where a large number of calculations are required are written in C or F77 in order to enable better compiler optimization of these sections. The TDAAPS program conforms to a UNIX-style calling interface. Most of the actions of the codes are controlled by adding flags to the invoking command line. This document presents a large number of examples and provides new users with the necessary background to perform acoustic modeling with TDAAPS.
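    As background for the method named above, here is a deliberately tiny one-dimensional sketch of staggered-grid finite-difference acoustics; TDAAPS itself is a 3-D C++/C/F77 code with inhomogeneous winds, and everything below (grid, medium, boundaries) is invented for illustration.

```python
# 1-D staggered-grid acoustics: p at integer nodes, v at half nodes,
# leapfrogged in time. Homogeneous medium, pressure-release ends (p = 0).
import numpy as np

nx, nt = 200, 400
dx = 1.0                        # m
c, rho = 340.0, 1.2             # sound speed (m/s), density (kg/m^3)
dt = 0.5 * dx / c               # CFL-stable time step
K = rho * c**2                  # bulk modulus

p = np.zeros(nx)                # pressure at x_i, time n
v = np.zeros(nx - 1)            # velocity at x_{i+1/2}, time n+1/2
p[nx // 2] = 1.0                # initial pressure pulse

for _ in range(nt):
    v -= (dt / (rho * dx)) * (p[1:] - p[:-1])      # momentum equation update
    p[1:-1] -= (dt * K / dx) * (v[1:] - v[:-1])    # continuity equation update

print("max |p| after", nt, "steps:", np.abs(p).max())
```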

  17. SLS-SPEC-159 Cross-Program Design Specification for Natural Environments (DSNE) Revision E

    NASA Technical Reports Server (NTRS)

    Roberts, Barry C.

    2017-01-01

    The DSNE completes environment-related specifications for architecture, system-level, and lower-tier documents by specifying the ranges of environmental conditions that must be accounted for by NASA ESD Programs. To assure clarity and consistency, and to prevent requirements documents from becoming cluttered with extensive amounts of technical material, natural environment specifications have been compiled into this document. The intent is to keep a unified specification for natural environments that each Program calls out for appropriate application. This document defines the natural environments parameter limits (maximum and minimum values, energy spectra, or precise model inputs, assumptions, model options, etc.), for all ESD Programs. These environments are developed by the NASA Marshall Space Flight Center (MSFC) Natural Environments Branch (MSFC organization code: EV44). Many of the parameter limits are based on experience with previous programs, such as the Space Shuttle Program. The parameter limits contain no margin and are meant to be evaluated individually to ensure they are reasonable (i.e., do not apply unrealistic extreme-on-extreme conditions). The natural environments specifications in this document should be accounted for by robust design of the flight vehicle and support systems. However, it is understood that in some cases the Programs will find it more effective to account for portions of the environment ranges by operational mitigation or acceptance of risk in accordance with an appropriate program risk management plan and/or hazard analysis process. The DSNE is not intended as a definition of operational models or operational constraints, nor is it adequate, alone, for ground facilities which may have additional requirements (for example, building codes and local environmental constraints). "Natural environments," as the term is used here, refers to the environments that are not the result of intended human activity or intervention. It consists of a variety of external environmental factors (most of natural origin and a few of human origin) which impose restrictions or otherwise impact the development or operation of flight vehicles and destination surface systems.

  18. Cognitive and Neural Sciences Division, 1991 Programs.

    ERIC Educational Resources Information Center

    Vaughan, Willard S., Ed.

    This report documents research and development performed under the sponsorship of the Cognitive and Neural Sciences Division of the Office of Naval Research in fiscal year 1991. It provides abstracts (title, principal investigator, project code, objective, approach, progress, and related reports) of projects of three program divisions (cognitive…

  19. An Annotated Bibliography of Current Literature Dealing with the Effective Teaching of Computer Programming in High Schools.

    ERIC Educational Resources Information Center

    Taylor, Karen A.

    This review of the literature and annotated bibliography summarizes the available research relating to teaching programming to high school students. It is noted that, while the process of programming a computer could be broken down into five steps--problem definition, algorithm design, code writing, debugging, and documentation--current research…

  20. Implementation of a Clinical Documentation Improvement Curriculum Improves Quality Metrics and Hospital Charges in an Academic Surgery Department.

    PubMed

    Reyes, Cynthia; Greenbaum, Alissa; Porto, Catherine; Russell, John C

    2017-03-01

    Accurate clinical documentation (CD) is necessary for many aspects of modern health care, including excellent communication, quality metrics reporting, and legal documentation. New requirements have mandated adoption of ICD-10-CM coding systems, adding another layer of complexity to CD. A clinical documentation improvement (CDI) and ICD-10 training program was created for health care providers in our academic surgery department. We aimed to assess the impact of our CDI curriculum by comparing quality metrics, coding, and reimbursement before and after implementation of our CDI program. A CDI/ICD-10 training curriculum was instituted in September 2014 for all members of our university surgery department. The curriculum consisted of didactic lectures, 1-on-1 provider training, case reviews, e-learning modules, and CD queries from nurse CDI staff and hospital coders. Outcomes parameters included monthly documentation completion rates, severity of illness (SOI), risk of mortality (ROM), case-mix index (CMI), all-payer refined diagnosis-related groups (APR-DRG), and Surgical Care Improvement Program (SCIP) metrics. Financial gain from responses to CDI queries was determined retrospectively. Surgery department delinquent documentation decreased by 85% after CDI implementation. Compliance with SCIP measures improved from 85% to 97%. Significant increases in surgical SOI, ROM, CMI, and APR-DRG (all p < 0.01) were found after CDI/ICD-10 training implementation. Provider responses to CDI queries resulted in an estimated $4,672,786 increase in charges. Clinical documentation improvement/ICD-10 training in an academic surgery department is an effective method to improve documentation rates, increase the hospital estimated reimbursement based on more accurate CD, and provide better compliance with surgical quality measures. Copyright © 2016 American College of Surgeons. All rights reserved.

  1. User's guide for FRMOD, a zero dimensional FRM burn code

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Driemeryer, D.; Miley, G.H.

    1979-10-15

    The zero-dimensional FRM plasma burn code, FRMOD is written in the FORTRAN language and is currently available on the Control Data Corporation (CDC) 7600 computer at the Magnetic Fusion Energy Computer Center (MFECC), sponsored by the US Department of Energy, in Livermore, CA. This guide assumes that the user is familiar with the system architecture and some of the utility programs available on the MFE-7600 machine, since online documentation is available for system routines through the use of the DOCUMENT utility. Users may therefore refer to it for answers to system related questions.

  2. Ohio Business Management. Technical Competency Profile (TCP).

    ERIC Educational Resources Information Center

    Ray, Gayl M.; Wilson, Nick; Mangini, Rick

    This document describes the essential competencies from secondary through post-secondary associate degree programs for a career in business management. Ohio College Tech Prep Program standards are described, and a key to profile codes is provided. Sample occupations in this career area, such as management trainee, product manager, and advertising…

  3. The Development and Validation of the Inquiry Science Observation Coding Sheet

    ERIC Educational Resources Information Center

    Brandon, P. R.; Taum, A. K. H.; Young, D. B.; Pottenger, F. M., III

    2008-01-01

    Evaluation reports increasingly document the degree of program implementation, particularly the extent to which programs adhere to prescribed steps and procedures. Many reports are cursory, however, and few, if any, fully portray the long and winding path taken when developing evaluation instruments, particularly observation instruments. In this…

  4. TEMPEST: A three-dimensional time-dependent computer program for hydrothermal analysis: Volume 2, Assessment and verification results

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Eyler, L L; Trent, D S; Budden, M J

    During the course of the TEMPEST computer code development a concurrent effort was conducted to assess the code's performance and the validity of computed results. The results of this work are presented in this document. The principal objective of this effort was to assure the code's computational correctness for a wide range of hydrothermal phenomena typical of fast breeder reactor application. 47 refs., 94 figs., 6 tabs.

  5. Base heating methodology improvements, volume 1

    NASA Technical Reports Server (NTRS)

    Bender, Robert L.; Reardon, John E.; Somers, Richard E.; Fulton, Michael S.; Smith, Sheldon D.; Pergament, Harold

    1992-01-01

    This document is the final report for NASA MSFC Contract NAS8-38141. The contracted effort had the broad objective of improving launch vehicle ascent base heating methodology in order to improve and simplify the determination of that environment for Advanced Launch System (ALS) concepts. It was pursued as an Advanced Development Plan (ADP) for the Joint DoD/NASA ALS program office, with project management assigned to NASA/MSFC. The original study was to be completed in 26 months beginning Sep. 1989. Because of several program changes and emphasis on evolving launch vehicle concepts, the period of performance was extended to the current completion date of Nov. 1992. A computer code incorporating the methodology improvements into a quick prediction tool was developed and is operational for basic configuration and propulsion concepts. The code and its user's guide are also provided as part of the contract documentation. Background information describing the specific objectives, limitations, and goals of the contract is summarized. A brief chronology of the ALS/NLS program history is also presented to provide the reader with an overview of the many variables influencing the development of the code over the past three years.

  6. Program Synthesizes UML Sequence Diagrams

    NASA Technical Reports Server (NTRS)

    Barry, Matthew R.; Osborne, Richard N.

    2006-01-01

    A computer program called "Rational Sequence" generates Unified Modeling Language (UML) sequence diagrams of a target Java program running on a Java virtual machine (JVM). Rational Sequence thereby performs a reverse-engineering function that aids in the design documentation of the target Java program. Whereas previously the construction of sequence diagrams was a tedious manual process, Rational Sequence generates UML sequence diagrams automatically from the running Java code.
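    Rational Sequence instruments a running JVM; as a language-neutral illustration of the underlying idea (record runtime call events, then print them as sequence-diagram arrows), here is a hypothetical Python analogue using the standard sys.settrace hook.

```python
# Record caller -> callee messages from a running program (toy example).
import sys

calls = []

def tracer(frame, event, arg):
    if event == "call":
        caller = frame.f_back.f_code.co_name if frame.f_back else "<top>"
        calls.append((caller, frame.f_code.co_name))
    return tracer

def save(order):        # toy target program
    validate(order)

def validate(order):
    pass

sys.settrace(tracer)
save("order-42")
sys.settrace(None)

for caller, callee in calls:    # one sequence-diagram arrow per line
    print(f"{caller} -> {callee}")
```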

  7. Progress in The Semantic Analysis of Scientific Code

    NASA Technical Reports Server (NTRS)

    Stewart, Mark

    2000-01-01

    This paper concerns a procedure that analyzes aspects of the meaning or semantics of scientific and engineering code. This procedure involves taking a user's existing code, adding semantic declarations for some primitive variables, and parsing this annotated code using multiple, independent expert parsers. These semantic parsers encode domain knowledge and recognize formulae in different disciplines including physics, numerical methods, mathematics, and geometry. The parsers will automatically recognize and document some static, semantic concepts and help locate some program semantic errors. These techniques may apply to a wider range of scientific codes. If so, the techniques could reduce the time, risk, and effort required to develop and modify scientific codes.
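    A toy sketch of the recognition step described above: given user-supplied semantic declarations for primitive variables, an expression tree is matched against one known formula. The declarations, the single rule, and the example are all invented here; the paper's parsers are far more general.

```python
# Hypothetical formula recognizer: labels 0.5*<density>*<velocity>**2
# as dynamic pressure, using declared variable semantics.
import ast

semantics = {"rho": "density", "u": "velocity"}   # user-supplied declarations

def recognize(expr):
    node = ast.parse(expr, mode="eval").body
    if (isinstance(node, ast.BinOp) and isinstance(node.op, ast.Mult)
            and isinstance(node.left, ast.BinOp) and isinstance(node.left.op, ast.Mult)
            and isinstance(node.left.left, ast.Constant) and node.left.left.value == 0.5
            and semantics.get(getattr(node.left.right, "id", None)) == "density"
            and isinstance(node.right, ast.BinOp) and isinstance(node.right.op, ast.Pow)
            and semantics.get(getattr(node.right.left, "id", None)) == "velocity"
            and getattr(node.right.right, "value", None) == 2):
        return "dynamic pressure q = 1/2 * rho * u**2"
    return "unrecognized"

print(recognize("0.5 * rho * u ** 2"))   # dynamic pressure q = 1/2 * rho * u**2
```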

  8. Proteus two-dimensional Navier-Stokes computer code, version 2.0. Volume 3: Programmer's reference

    NASA Technical Reports Server (NTRS)

    Towne, Charles E.; Schwab, John R.; Bui, Trong T.

    1993-01-01

    A computer code called Proteus 2D was developed to solve the two-dimensional planar or axisymmetric, Reynolds-averaged, unsteady compressible Navier-Stokes equations in strong conservation law form. The objective in this effort was to develop a code for aerospace propulsion applications that is easy to use and easy to modify. Code readability, modularity, and documentation were emphasized. The governing equations are solved in generalized nonorthogonal body-fitted coordinates, by marching in time using a fully-coupled ADI solution procedure. The boundary conditions are treated implicitly. All terms, including the diffusion terms, are linearized using second-order Taylor series expansions. Turbulence is modeled using either an algebraic or two-equation eddy viscosity model. The thin-layer or Euler equations may also be solved. The energy equation may be eliminated by the assumption of constant total enthalpy. Explicit and implicit artificial viscosity may be used. Several time step options are available for convergence acceleration. The documentation is divided into three volumes. The Programmer's Reference contains detailed information useful when modifying the program. The program structure, the Fortran variables stored in common blocks, and the details of each subprogram are described.

  9. Proteus three-dimensional Navier-Stokes computer code, version 1.0. Volume 3: Programmer's reference

    NASA Technical Reports Server (NTRS)

    Towne, Charles E.; Schwab, John R.; Bui, Trong T.

    1993-01-01

    A computer code called Proteus 3D was developed to solve the three-dimensional, Reynolds-averaged, unsteady compressible Navier-Stokes equations in strong conservation law form. The objective in this effort was to develop a code for aerospace propulsion applications that is easy to use and easy to modify. Code readability, modularity, and documentation were emphasized. The governing equations are solved in generalized nonorthogonal body fitted coordinates, by marching in time using a fully-coupled ADI solution procedure. The boundary conditions are treated implicitly. All terms, including the diffusion terms, are linearized using second-order Taylor series expansions. Turbulence is modeled using either an algebraic or two-equation eddy viscosity model. The thin-layer or Euler equations may also be solved. The energy equation may be eliminated by the assumption of constant total enthalpy. Explicit and implicit artificial viscosity may be used. Several time step options are available for convergence acceleration. The documentation is divided into three volumes. The Programmer's Reference contains detailed information useful when modifying the program. The program structure, the Fortran variables stored in common blocks, and the details of each subprogram are described.

  10. Proteus three-dimensional Navier-Stokes computer code, version 1.0. Volume 2: User's guide

    NASA Technical Reports Server (NTRS)

    Towne, Charles E.; Schwab, John R.; Bui, Trong T.

    1993-01-01

    A computer code called Proteus 3D was developed to solve the three-dimensional, Reynolds-averaged, unsteady compressible Navier-Stokes equations in strong conservation law form. The objective in this effort was to develop a code for aerospace propulsion applications that is easy to use and easy to modify. Code readability, modularity, and documentation were emphasized. The governing equations are solved in generalized nonorthogonal body-fitted coordinates, by marching in time using a fully-coupled ADI solution procedure. The boundary conditions are treated implicitly. All terms, including the diffusion terms, are linearized using second-order Taylor series expansions. Turbulence is modeled using either an algebraic or two-equation eddy viscosity model. The thin-layer or Euler equations may also be solved. The energy equation may be eliminated by the assumption of constant total enthalpy. Explicit and implicit artificial viscosity may be used. Several time step options are available for convergence acceleration. The documentation is divided into three volumes. This User's Guide describes the program's features, the input and output, the procedure for setting up initial conditions, the computer resource requirements, the diagnostic messages that may be generated, the job control language used to run the program, and several test cases.

  11. Faunus: An object oriented framework for molecular simulation

    PubMed Central

    Lund, Mikael; Trulsson, Martin; Persson, Björn

    2008-01-01

    Background We present a C++ class library for Monte Carlo simulation of molecular systems, including proteins in solution. The design is generic and highly modular, enabling multiple developers to easily implement additional features. The statistical mechanical methods are documented by extensive use of code comments that – subsequently – are collected to automatically build a web-based manual. Results We show how an object oriented design can be used to create an intuitively appealing coding framework for molecular simulation. This is exemplified in a minimalistic C++ program that can calculate protein protonation states. We further discuss performance issues related to high level coding abstraction. Conclusion C++ and the Standard Template Library (STL) provide a high-performance platform for generic molecular modeling. Automatic generation of code documentation from inline comments has proven particularly useful in that no separate manual needs to be maintained. PMID:18241331
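    Faunus is a C++ library; purely to illustrate the Metropolis Monte Carlo loop at the heart of such simulations, here is a one-dimensional toy sampler in Python with an arbitrary harmonic potential. Nothing below comes from Faunus itself.

```python
# Toy Metropolis Monte Carlo for U(x) = x^2 at inverse temperature beta.
import math
import random

def metropolis(steps=100_000, beta=1.0, delta=0.5):
    x, energy = 0.0, 0.0
    samples = []
    for _ in range(steps):
        x_new = x + random.uniform(-delta, delta)    # trial displacement
        e_new = x_new * x_new
        # Accept with probability min(1, exp(-beta * (e_new - energy))).
        if e_new <= energy or random.random() < math.exp(-beta * (e_new - energy)):
            x, energy = x_new, e_new
        samples.append(x)
    return samples

s = metropolis()
print("mean U =", sum(v * v for v in s) / len(s))    # ~0.5, i.e. kT/2 for beta = 1
```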

  12. TRIAD VIII: Nationwide Multicenter Evaluation to Determine Whether Patient Video Testimonials Can Safely Help Ensure Appropriate Critical Versus End-of-Life Care.

    PubMed

    Mirarchi, Ferdinando L; Cooney, Timothy E; Venkat, Arvind; Wang, David; Pope, Thaddeus M; Fant, Abra L; Terman, Stanley A; Klauer, Kevin M; Williams-Murphy, Monica; Gisondi, Michael A; Clemency, Brian; Doshi, Ankur A; Siegel, Mari; Kraemer, Mary S; Aberger, Kate; Harman, Stephanie; Ahuja, Neera; Carlson, Jestin N; Milliron, Melody L; Hart, Kristopher K; Gilbertson, Chelsey D; Wilson, Jason W; Mueller, Larissa; Brown, Lori; Gordon, Bradley D

    2017-06-01

    End-of-life interventions should be predicated on consensus understanding of patient wishes. Written documents are not always understood; adding a video testimonial/message (VM) might improve clarity. Goals of this study were to (1) determine baseline rates of consensus in assigning code status and resuscitation decisions in critically ill scenarios and (2) determine whether adding a VM increases consensus. We randomly assigned 2 web-based survey links to 1366 faculty and resident physicians at institutions with graduate medical education programs in emergency medicine, family practice, and internal medicine. Each survey asked for code status interpretation of stand-alone Physician Orders for Life-Sustaining Treatment (POLST) and living will (LW) documents in 9 scenarios. Respondents assigned code status and resuscitation decisions to each scenario. For 1 of 2 surveys, a VM was included to help clarify patient wishes. Response rate was 54%, and most were male emergency physicians who lacked formal advanced planning document interpretation training. Consensus was not achievable for stand-alone POLST or LW documents (68%-78% noted "DNR"). Two of 9 scenarios attained consensus for code status (97%-98% responses) and treatment decisions (96%-99%). Adding a VM significantly changed code status responses by 9% to 62% (P ≤ 0.026) in 7 of 9 scenarios with 4 achieving consensus. Resuscitation responses changed by 7% to 57% (P ≤ 0.005) with 4 of 9 achieving consensus with VMs. For most scenarios, consensus was not attained for code status and resuscitation decisions with stand-alone LW and POLST documents. Adding VMs produced significant impacts toward achieving interpretive consensus.

  13. USL/DBMS NASA/PC R and D project C programming standards

    NASA Technical Reports Server (NTRS)

    Dominick, Wayne D. (Editor); Moreau, Dennis R.

    1984-01-01

    A set of programming standards intended to promote reliability, readability, and portability of C programs written for PC research and development projects is established. These standards must be adhered to except where reasons for deviation are clearly identified and approved by the PC team. Any approved deviation from these standards must also be clearly documented in the pertinent source code.

  14. Development of Learning Management in Moral Ethics and Code of Ethics of the Teaching Profession Course

    NASA Astrophysics Data System (ADS)

    Boonsong, S.; Siharak, S.; Srikanok, V.

    2018-02-01

    The purpose of this research was to develop learning management aimed at enhancing students' moral ethics and code of ethics of the teaching profession at Rajamangala University of Technology Thanyaburi (RMUTT). The contextual study and the ideas for developing the learning management were obtained through document study, the focus-group method, and content analysis of documents on moral ethics and code of ethics of the teaching profession in the Graduate Diploma for Teaching Profession Program. The main tools of this research were summary papers and analysis papers. The results showed that the learning management developed for Graduate Diploma for Teaching Profession students could promote the desired moral ethics and code of ethics of the teaching profession through integrated learning techniques consisting of Service Learning, the Contract System, Value Clarification, Role Playing, and Concept Mapping. The learning management was presented in 3 steps.

  15. Use, Assessment, and Improvement of the Loci-CHEM CFD Code for Simulation of Combustion in a Single Element GO2/GH2 Injector and Chamber

    NASA Technical Reports Server (NTRS)

    Westra, Douglas G.; Lin, Jeff; West, Jeff; Tucker, Kevin

    2006-01-01

    This document is a viewgraph presentation of a paper that documents a continuing effort at Marshall Space Flight Center (MSFC) to use, assess, and continually improve CFD codes to the point of material utility in the design of rocket engine combustion devices. This paper describes how the code is presently being used to simulate combustion in a single element combustion chamber with shear coaxial injectors using gaseous oxygen and gaseous hydrogen propellants. The ultimate purpose of the efforts documented is to assess and further improve the Loci-CHEM code and the implementation of it. Single element shear coaxial injectors were tested as part of the Staged Combustion Injector Technology (SCIT) program, where detailed chamber wall heat fluxes were measured. Data was taken over a range of chamber pressures for propellants injected at both ambient and elevated temperatures. Several test cases are simulated as part of the effort to demonstrate use of the Loci-CHEM CFD code and to enable us to make improvements in the code as needed. The simulations presented also include a grid independence study on hybrid grids. Several two-equation eddy viscosity low Reynolds number turbulence models are also evaluated as part of the study. All calculations are presented with a comparison to the experimental data. Weaknesses of the code relative to test data are discussed and continuing efforts to improve the code are presented.

  16. Evaluation of radiological dispersion/consequence codes supporting DOE nuclear facility SARs

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    O'Kula, K.R.; Paik, I.K.; Chung, D.Y.

    1996-12-31

    Since the early 1990s, the authorization basis documentation of many U.S. Department of Energy (DOE) nuclear facilities has been upgraded to comply with DOE orders and standards. In this process, many safety analyses have been revised. Unfortunately, there has been nonuniform application of software, and the most appropriate computer and engineering methodologies often are not applied. A DOE Accident Phenomenology and Consequence (APAC) Methodology Evaluation Program was originated at the request of DOE Defense Programs to evaluate the safety analysis methodologies used in nuclear facility authorization basis documentation and to define future cost-effective support and development initiatives. Six areas, including source term development (fire, spills, and explosion analysis), in-facility transport, and dispersion/consequence analysis (chemical and radiological), are contained in the APAC program. The evaluation process, codes considered, key results, and recommendations for future model and software development of the Radiological Dispersion/Consequence Working Group are summarized in this paper.

  17. Software to model AXAF image quality

    NASA Technical Reports Server (NTRS)

    Ahmad, Anees

    1993-01-01

    This draft final report describes the work performed under this delivery order from May 1992 through June 1993. The purpose of this contract was to enhance and develop integrated optical performance modeling software for complex x-ray optical systems such as AXAF. The GRAZTRACE program developed by the MSFC Optical Systems Branch for modeling VETA-I was used as the starting baseline program. The original program was a large single-file program and, therefore, could not be modified very efficiently. The original source code has been reorganized, and a 'Make Utility' has been written to update the original program. The new version of the source code consists of 36 small source files to make it easier for the code developer to manage and modify the program. A user library has also been built, and a 'Makelib' utility has been furnished to update the library. With the user library, users can easily access the GRAZTRACE source files and build a custom library. A user manual for the new version of GRAZTRACE has been compiled. Plotting capability for 3-D point spread functions and contour plots has been provided in GRAZTRACE using the graphics package DISPLAY. A graphics emulator over the network has been set up for programming the graphics routines. The point spread function and contour plot routines have also been modified to display the plot centroid, and to allow the user to specify the plot range and the viewing angle options. A Command Mode version of GRAZTRACE has also been developed. More than 60 commands have been implemented in a Code-V-like format. The functions covered in this version include data manipulation, performance evaluation, and inquiry and setting of internal parameters. The user manual for these commands has been formatted as in Code-V, showing the command syntax, synopsis, and options. An interactive on-line help system for the command mode has also been implemented to allow the user to find valid commands, command syntax, and command function. A translation program has been written to convert FEA output from structural analysis to a GRAZTRACE surface deformation (.dfm) file. The program can accept standard output files and list files from the COSMOS/M and NASTRAN finite-element analysis programs. Some interactive options are also provided, such as Cartesian or cylindrical coordinate transformation, coordinate shift and scale, and axial length change. A computerized database for technical documents relating to the AXAF project has been established. Over 5000 technical documents have been entered into the master database, and a user can now rapidly retrieve the desired documents relating to the AXAF project. A summary of the work performed under this contract is presented.

  18. Minnesota Department of Human Services audit of medication therapy management programs.

    PubMed

    Smith, Stephanie; Cell, Penny; Anderson, Lowell; Larson, Tom

    2013-01-01

    To inform medication therapy management (MTM) providers of findings of the Minnesota Department of Human Services review of claims submitted to Minnesota Health Care Programs (MHCP) for patients receiving MTM services, and to discuss the impact of the audit on widespread MTM services and future audits. A retrospective review was completed on MTM claims submitted to MHCP from 2008 to 2010. The auditor verified that the Current Procedural Terminology codes billed matched the actual number of medications, conditions, and drug therapy problems assessed during an encounter. 190 claims were reviewed for 57 distinct pharmacies that billed for MTM services from 2008 to 2010, representing 4.5% of all claims submitted. The auditor reported that, generally, documentation within the electronic medical record had the least "up-coding" of all documentation systems. A total of 18 claims were coded at a higher level than appropriate, but only 10 notices were sent out to recover money because the others did not meet the minimum $50 threshold. The auditor expressed concern that a number of claims billed at the highest complexity level were only 15 minutes long. Providers will need to be cautious about the conditions they bill as complex and how they define drug therapy problems. Everything being billed for must be clearly assessed or rationalized in the documentation note. The auditor expressed that, overall, documentation was well done; however, many MTM providers are now asking how to internally prepare for future audits.

  19. NASIS data base management system: IBM 360 TSS implementation. Volume 1: Installation standards

    NASA Technical Reports Server (NTRS)

    1973-01-01

    The installation standards for the NASIS data base management system are presented. The standard approach to preparing systems documentation and the program design and coding rules and conventions are outlined. Included are instructions for preparing all major specifications and suggestions for improving the quality and efficiency of the programming task.

  20. Ohio Financial Services and Risk Management. Technical Competency Profile (TCP).

    ERIC Educational Resources Information Center

    Ray, Gayl M.; Wilson, Nick; Mangini, Rick

    This document describes the essential competencies from secondary through post-secondary associate degree programs for a career in financial services and risk management. Ohio College Tech Prep Program standards are described, and a key to profile codes is provided. Sample occupations in this career area, such as financial accountant, loan…

  1. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Singleton, Jr., Robert; Israel, Daniel M.; Doebling, Scott William

    For code verification, one compares the code output against known exact solutions. There are many standard test problems used in this capacity, such as the Noh and Sedov problems. ExactPack is a utility that integrates many of these exact solution codes into a common API (application program interface), and can be used as a stand-alone code or as a Python package. ExactPack consists of Python driver scripts that access a library of exact solutions written in Fortran or Python. The spatial profiles of the relevant physical quantities, such as the density, fluid velocity, sound speed, or internal energy, are returned at a time specified by the user. The solution profiles can be viewed and examined by a command line interface or a graphical user interface, and a number of analysis tools and unit tests are also provided. We have documented the physics of each problem in the solution library, and provided complete documentation on how to extend the library to include additional exact solutions. ExactPack's code architecture makes it easy to extend the solution-code library to include additional exact solutions in a robust, reliable, and maintainable manner.
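    A hedged sketch of the usage pattern the abstract describes: instantiate a solver from the library, then evaluate it on user-chosen points and time. The module path, constructor arguments, and field names below follow ExactPack's general style but are written from memory and may not match any particular release.

```python
# Hedged ExactPack-style usage sketch; names may differ across versions.
import numpy as np
from exactpack.solvers.noh import Noh    # one exact-solution solver

solver = Noh(gamma=5.0 / 3.0, geometry=3)     # spherical Noh problem
r = np.linspace(0.0, 0.5, 200)                # spatial points requested
solution = solver(r, t=0.6)                   # profiles at a user-specified time

# The returned object carries named physical fields for verification plots.
for name in ("density", "velocity", "pressure"):
    print(name, solution[name][:3])
```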

  2. Automated verification of flight software. User's manual

    NASA Technical Reports Server (NTRS)

    Saib, S. H.

    1982-01-01

    AVFS (Automated Verification of Flight Software), a collection of tools for analyzing source programs written in FORTRAN and AED, is documented. The quality and the reliability of flight software are improved by: (1) indented listings of source programs, (2) static analysis to detect inconsistencies in the use of variables and parameters, (3) automated documentation, (4) instrumentation of source code, (5) retesting guidance, (6) analysis of assertions, (7) symbolic execution, (8) generation of verification conditions, and (9) simplification of verification conditions. Use of AVFS in the verification of flight software is described.

  3. Earth Observatory Satellite system definition study. Report no. 3: Design/cost tradeoff studies. Appendix C: EOS program requirements document

    NASA Technical Reports Server (NTRS)

    1974-01-01

    An analysis of the requirements for the Earth Observatory Satellite (EOS) system specifications is presented. The analysis consists of requirements obtained from existing documentation and those derived from functional analysis. The requirements follow the hierarchy of program, mission, system, and subsystem. The code for designating specific requirements is explained. Among the subjects considered are the following: (1) the traffic model, (2) space shuttle related performance, (3) booster related performance, (4) the data collection system, (5) spacecraft structural tests, and (6) the ground support requirements.

  4. Virtual Doors to Brick and Mortar Learning

    NASA Astrophysics Data System (ADS)

    Shaw, M. S.; Gay, P. L.; Meyer, D. T.; Zamfirescu, J. D.; Smith, J. E.; MIT Educational Studies Program Team

    2005-12-01

    The MIT Educational Studies Program (ESP) has spent the past year developing an online gateway for outreach programs. The website has a five-fold purpose: to introduce the organization to potential students, teachers, volunteers, and collaborators; to allow teachers to create, design, and interact with classes; to allow students to register for and dialogue with these classes; to provide an online forum for continuing dialogue; and to provide organizers a wiki for documenting program administration. What makes our site unique is the free and flexible nature of our easily edited and expanded code. In its standard installation, teachers set up classes, and administrators can approve/edit classes and make classes visible in an online catalogue. Student registration is completely customizable - students can register for self-selected classes, or they can register for a program and later get placed into teacher-selected classes. Free wiki software allows users to interactively create and edit documentation and knowledgebases. This allows administrators to track online what has been done while at the same time creating instant documentation for future programs. The online forum is a place where students can go after our programs end to learn more, interact with their classmates, and continue dialogues started in our classrooms. We also use the forum to get feedback on past and future programs. The ease with which the software handles program creation, registration, communications, and more allows programs for roughly 3000 students per year to be handled by about 20 volunteering undergraduates. By combining all these elements - promotion, class creation, program registration, an organizational wiki, and student forums - we create a one-stop virtual entryway into face-to-face learning that allows students to continue their experience after they leave the classroom. The code for this site is available for free upon request to other organizations.

  5. A users' guide to the trace contaminant control simulation computer program

    NASA Technical Reports Server (NTRS)

    Perry, J. L.

    1994-01-01

    The Trace Contaminant Control Simulation computer program is a tool for assessing the performance of various trace contaminant control technologies for removing trace chemical contamination from a spacecraft cabin atmosphere. The results obtained from the program can be useful in assessing different technology combinations, system sizing, system location with respect to other life support systems, and the overall life cycle economics of a trace contaminant control system. The user's manual is extracted in its entirety from NASA TM-108409 to provide a stand-alone reference for using any version of the program. The first publication of the manual as part of TM-108409 also included a detailed listing of version 8.0 of the program. As changes to the code were necessary, it became apparent that the user's manual should be separate from the computer code documentation and be general enough to provide guidance in using any version of the program. Provided in the guide are tips for input file preparation, general program execution, and output file manipulation. Information concerning source code listings of the latest version of the computer program may be obtained by contacting the author.

  6. Modeling of rolling element bearing mechanics. Theoretical manual

    NASA Technical Reports Server (NTRS)

    Merchant, David H.; Greenhill, Lyn M.

    1994-01-01

    This report documents the theoretical basis for the Rolling Element Bearing Analysis System (REBANS) analysis code, which determines the quasistatic response to external loads or displacements of three types of high-speed rolling element bearings: angular contact ball bearings, duplex angular contact ball bearings, and cylindrical roller bearings. The model includes the effects of bearing ring and support structure flexibility. It comprises two main programs: the Preprocessor for Bearing Analysis (PREBAN), which creates the input files for the main analysis program, and the Flexibility Enhanced Rolling Element Bearing Analysis (FEREBA), the main analysis program. A companion report addresses the input instructions for and features of the computer codes. REBANS extends the capabilities of the SHABERTH (Shaft and Bearing Thermal Analysis) code to include race and housing flexibility, including such effects as dead band and preload springs.

  7. SEGY to ASCII: Conversion and Plotting Program

    USGS Publications Warehouse

    Goldman, Mark R.

    1999-01-01

    This report documents a computer program that converts standard 4-byte, IBM floating-point SEGY files to ASCII xyz format and then, optionally, plots the seismic data using the GMT plotting package. The material for this publication is contained in a standard tar file (of99-126.tar) that is uncompressed and 726 K in size. It can be downloaded by any Unix machine. Move the tar file to the directory you wish to use it in, then type 'tar xvf of99-126.tar'. The archive files (and diskette) contain a NOTE file, a README file, a version-history file, source code, a makefile for easy compilation, and an ASCII version of the documentation. The archive files (and diskette) also contain example test files, including a typical SEGY file along with the resulting ASCII xyz and postscript files. Compiling the source code into an executable requires a C++ compiler. The program has been successfully compiled using Gnu's g++ version 2.8.1; use of other compilers may require modifications to the existing source code. The g++ compiler is a free, high-quality C++ compiler and may be downloaded from the ftp site: ftp://ftp.gnu.org/gnu. Plotting the seismic data requires the GMT plotting package, which may be downloaded from the web site: http://www.soest.hawaii.edu/gmt/
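
    The heart of such a conversion is decoding the 4-byte IBM/360 floating-point format, which stores a sign bit, a 7-bit base-16 exponent biased by 64, and a 24-bit fraction. A minimal Python sketch of that decoding follows; it illustrates the format only and is not taken from the program's C++ source.

        import struct

        def ibm32_to_float(b: bytes) -> float:
            """Decode one big-endian IBM/360 32-bit float:
            value = (-1)^sign * 0.fraction * 16^(exponent - 64)."""
            (word,) = struct.unpack(">I", b)                 # 4 bytes -> one unsigned 32-bit int
            sign = -1.0 if word >> 31 else 1.0
            exponent = (word >> 24) & 0x7F                   # 7-bit base-16 exponent, bias 64
            fraction = (word & 0x00FFFFFF) / float(1 << 24)  # 24-bit fraction in [0, 1)
            return sign * fraction * 16.0 ** (exponent - 64)

        # 0x42640000: exponent 0x42 = 66, fraction 0x640000 / 2^24 = 0.390625
        print(ibm32_to_float(bytes.fromhex("42640000")))  # 100.0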

  8. The Navy/NASA Engine Program (NNEP89): A user's manual

    NASA Technical Reports Server (NTRS)

    Plencner, Robert M.; Snyder, Christopher A.

    1991-01-01

    An engine simulation computer code called NNEP89 was written to perform one-dimensional, steady-state thermodynamic analysis of turbine engine cycles. Using a very flexible method of input, a set of standard components is connected at execution time to simulate almost any turbine engine configuration the user could imagine. The code has been used to simulate a wide range of engine cycles, from turboshafts and turboprops to air turborockets and supersonic cruise variable cycle engines. Off-design performance is calculated through the use of component performance maps. A chemical equilibrium model is incorporated to adequately predict chemical dissociation as well as to model virtually any fuel. NNEP89 is written in standard FORTRAN77 with clear structured programming and extensive internal documentation. The standard FORTRAN77 programming allows it to be installed on most mainframe computers and workstations without modification. The NNEP89 code was derived from the Navy/NASA Engine Program (NNEP). NNEP89 provides many improvements and enhancements over the original NNEP code and incorporates features that make it easier for novice users. This is a comprehensive user's guide for the NNEP89 code.

  9. MPS Solidification Model. Volume 2: Operating guide and software documentation for the unsteady model

    NASA Technical Reports Server (NTRS)

    Maples, A. L.

    1981-01-01

    The operation of solidification Model 2 is described and documentation of the software associated with the model is provided. Model 2 calculates the macrosegregation in a rectangular ingot of a binary alloy as a result of unsteady horizontal axisymmetric bidirectional solidification. The solidification program allows interactive modification of calculation parameters as well as selection of graphical and tabular output. In batch mode, parameter values are input in card image form and output consists of printed tables of solidification functions. The operational aspects of Model 2 that differ substantially from Model 1 are described. The global flow diagrams and data structures of Model 2 are included. The primary program documentation is the code itself.

  10. NASTRAN hydroelastic modal studies. Volume 2: Programmer documentation

    NASA Technical Reports Server (NTRS)

    1977-01-01

    The operational steps, data descriptions, and program code for the new NASTRAN hydroelastic analysis system are described. The overall flow of the system is described, followed by the descriptions of the individual modules and their subroutines.

  11. Advanced Fuel Properties; A Computer Program for Estimating Property Values

    DTIC Science & Technology

    1993-05-01

    Report documentation page only; the abstract in this record is garbled beyond recovery. Recoverable subject terms: fuel properties, physical properties, thermodynamics, predictions. 175 pages. Unclassified.

  12. An improved panel method for the solution of three-dimensional leading edge vortex flows Volume 2: User's guide and programmer's document

    NASA Technical Reports Server (NTRS)

    Tinoco, E. N.; Lu, P.; Johnson, F. T.

    1980-01-01

    A computer program developed for solving the subsonic, three dimensional flow over wing-body configurations with leading edge vortex separation is presented. Instructions are given for the proper set up and input of a problem into the computer code. Program input formats and output are described, as well as the overlay structure of the program. The program is written in FORTRAN.

  13. AutoBayes Program Synthesis System Users Manual

    NASA Technical Reports Server (NTRS)

    Schumann, Johann; Jafari, Hamed; Pressburger, Tom; Denney, Ewen; Buntine, Wray; Fischer, Bernd

    2008-01-01

    Program synthesis is the systematic, automatic construction of efficient executable code from high-level declarative specifications. AutoBayes is a fully automatic program synthesis system for the statistical data analysis domain; in particular, it solves parameter estimation problems. It has seen many successful applications at NASA and is currently being used, for example, to analyze simulation results for Orion. The input to AutoBayes is a concise description of a data analysis problem composed of a parameterized statistical model and a goal that is a probability term involving parameters and input data. The output is optimized and fully documented C/C++ code computing the values for those parameters that maximize the probability term. AutoBayes can solve many subproblems symbolically rather than having to rely on numeric approximation algorithms, thus yielding effective, efficient, and compact code. Statistical analysis is faster and more reliable, because effort can be focused on model development and validation rather than manual development of solution algorithms and code.
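
    The report's point about symbolic solutions can be made concrete with a small example: for a Gaussian model, the parameter estimates that maximize the likelihood have a closed form, so a synthesis system can emit direct assignments instead of an iterative numeric optimizer. The Python sketch below illustrates that idea only; it is not AutoBayes output, and its names are hypothetical.

        import math

        def gaussian_mle(data):
            """Closed-form maximum-likelihood estimates for a Gaussian:
            mu = sample mean, var = mean squared deviation (the MLE, biased)."""
            n = len(data)
            mu = sum(data) / n
            var = sum((x - mu) ** 2 for x in data) / n
            return mu, var

        mu, var = gaussian_mle([4.8, 5.1, 5.0, 4.9, 5.2])
        print(f"mu = {mu:.3f}, sigma = {math.sqrt(var):.3f}")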

  14. Asynchronous Training in Pharmaceutical Manufacturing: A Model for University and Industrial Collaboration

    ERIC Educational Resources Information Center

    Elliot, Norbert; Haggerty, Blake; Foster, Mary; Spak, Gale

    2008-01-01

    The present study documents the results of a 17-month program to train Cardinal Health Pharmaceutical Technology Services (PTS) employees in an innovative model that combines investigative and writing techniques. Designed to address the Code of Federal Regulations (CFR) for the United States Food and Drug Administration (FDA), the program is a…

  15. Idaho National Engineering Laboratory code assessment of the Rocky Flats transuranic waste

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    NONE

    1995-07-01

    This report is an assessment of the content codes associated with transuranic waste shipped from the Rocky Flats Plant in Golden, Colorado, to INEL. The primary objective of this document is to characterize and describe the transuranic wastes shipped to INEL from Rocky Flats by item description code (IDC). This information will aid INEL in determining if the waste meets the waste acceptance criteria (WAC) of the Waste Isolation Pilot Plant (WIPP). The waste covered by this content code assessment was shipped from Rocky Flats between 1985 and 1989. These years coincide with the dates for information available in the Rocky Flats Solid Waste Information Management System (SWIMS). The majority of waste shipped during this time was certified to the existing WIPP WAC. This waste is referred to as precertified waste. Reassessment of these precertified waste containers is necessary because of changes in the WIPP WAC. To accomplish this assessment, the analytical and process knowledge available on the various IDCs used at Rocky Flats was evaluated. Rocky Flats sources for this information include employee interviews, SWIMS, the Transuranic Waste Certification Program, the Transuranic Waste Inspection Procedure, Backlog Waste Baseline Books, the WIPP Experimental Waste Characterization Program (headspace analysis), and other related documents, procedures, and programs. Summaries are provided of: (a) certification information, (b) waste description, (c) generation source, (d) recovery method, (e) waste packaging and handling information, (f) container preparation information, (g) assay information, (h) inspection information, (i) analytical data, and (j) RCRA characterization.

  16. Proteus two-dimensional Navier-Stokes computer code, version 2.0. Volume 2: User's guide

    NASA Technical Reports Server (NTRS)

    Towne, Charles E.; Schwab, John R.; Bui, Trong T.

    1993-01-01

    A computer code called Proteus 2D was developed to solve the two-dimensional planar or axisymmetric, Reynolds-averaged, unsteady compressible Navier-Stokes equations in strong conservation law form. The objective in this effort was to develop a code for aerospace propulsion applications that is easy to use and easy to modify. Code readability, modularity, and documentation were emphasized. The governing equations are solved in generalized nonorthogonal body-fitted coordinates, by marching in time using a fully-coupled ADI solution procedure. The boundary conditions are treated implicitly. All terms, including the diffusion terms, are linearized using second-order Taylor series expansions. Turbulence is modeled using either an algebraic or two-equation eddy viscosity model. The thin-layer or Euler equations may also be solved. The energy equation may be eliminated by the assumption of constant total enthalpy. Explicit and implicit artificial viscosity may be used. Several time step options are available for convergence acceleration. The documentation is divided into three volumes. This is the User's Guide, and describes the program's features, the input and output, the procedure for setting up initial conditions, the computer resource requirements, the diagnostic messages that may be generated, the job control language used to run the program, and several test cases.

  17. TORO II: A finite element computer program for nonlinear quasi-static problems in electromagnetics: Part 1, Theoretical background

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gartling, D.K.

    The theoretical and numerical background for the finite element computer program, TORO II, is presented in detail. TORO II is designed for the multi-dimensional analysis of nonlinear, electromagnetic field problems described by the quasi-static form of Maxwell's equations. A general description of the boundary value problems treated by the program is presented. The finite element formulation and the associated numerical methods used in TORO II are also outlined. Instructions for the use of the code are documented in SAND96-0903; examples of problems analyzed with the code are also provided in the user's manual. 24 refs., 8 figs.

  18. Medicaid provider reimbursement policy for adult immunizations

    PubMed Central

    Stewart, Alexandra M.; Lindley, Megan C.; Cox, Marisa A.

    2015-01-01

    Background: State Medicaid programs establish provider reimbursement policy for adult immunizations based on: costs, private insurance payments, and percentage of Medicare payments for equivalent services. Each program determines provider eligibility, payment amount, and permissible settings for administration. Total reimbursement consists of different combinations of Current Procedural Terminology codes: vaccine, vaccine administration, and visit. Objective: Determine how Medicaid programs in the 50 states and the District of Columbia approach provider reimbursement for adult immunizations. Design: Observational analysis using document review and a survey. Setting and participants: Medicaid administrators in 50 states and the District of Columbia. Measurements: Whether fee-for-service programs reimburse providers for: vaccines; their administration; and/or office visits when provided to adult enrollees. We assessed whether adult vaccination services are reimbursed when administered by a wide range of providers in a wide range of settings. Results: Medicaid programs use one of 4 payment methods for adults: (1) a vaccine and an administration code; (2) a vaccine and visit code; (3) a vaccine code; and (4) a vaccine, visit, and administration code. Limitations: Study results do not reflect any changes related to implementation of national health reform. Nine of fifty-one programs did not respond to the survey or declined to participate, limiting the information available to researchers. Conclusions: Medicaid reimbursement policy for adult vaccines impacts provider participation and enrollee access and uptake. While programs have generally increased reimbursement levels since 2003, each program could assess whether current policies reflect the most effective approach to encourage providers to increase vaccination services. PMID:26403369

  19. Medicaid provider reimbursement policy for adult immunizations.

    PubMed

    Stewart, Alexandra M; Lindley, Megan C; Cox, Marisa A

    2015-10-26

    State Medicaid programs establish provider reimbursement policy for adult immunizations based on: costs, private insurance payments, and percentage of Medicare payments for equivalent services. Each program determines provider eligibility, payment amount, and permissible settings for administration. Total reimbursement consists of different combinations of Current Procedural Terminology codes: vaccine, vaccine administration, and visit. Determine how Medicaid programs in the 50 states and the District of Columbia approach provider reimbursement for adult immunizations. Observational analysis using document review and a survey. Medicaid administrators in 50 states and the District of Columbia. Whether fee-for-service programs reimburse providers for: vaccines; their administration; and/or office visits when provided to adult enrollees. We assessed whether adult vaccination services are reimbursed when administered by a wide range of providers in a wide range of settings. Medicaid programs use one of 4 payment methods for adults: (1) a vaccine and an administration code; (2) a vaccine and visit code; (3) a vaccine code; and (4) a vaccine, visit, and administration code. Study results do not reflect any changes related to implementation of national health reform. Nine of fifty one programs did not respond to the survey or declined to participate, limiting the information available to researchers. Medicaid reimbursement policy for adult vaccines impacts provider participation and enrollee access and uptake. While programs have generally increased reimbursement levels since 2003, each program could assess whether current policies reflect the most effective approach to encourage providers to increase vaccination services. Copyright © 2015 Elsevier Ltd. All rights reserved.

  20. Discovering DISCOVER

    DTIC Science & Technology

    1997-10-01

    used to establish associations between source code and Adobe FrameMaker documents. The associations are represented as links that facilitate... possible (such as that provided with FrameMaker). There is no scripting interface that would enable end-user programming of its modules. The suite of...

  1. Generating Code Review Documentation for Auto-Generated Mission-Critical Software

    NASA Technical Reports Server (NTRS)

    Denney, Ewen; Fischer, Bernd

    2009-01-01

    Model-based design and automated code generation are increasingly used at NASA to produce actual flight code, particularly in the Guidance, Navigation, and Control domain. However, since code generators are typically not qualified, there is no guarantee that their output is correct, and consequently auto-generated code still needs to be fully tested and certified. We have thus developed AUTOCERT, a generator-independent plug-in that supports the certification of auto-generated code. AUTOCERT takes a set of mission safety requirements, and formally verifies that the autogenerated code satisfies these requirements. It generates a natural language report that explains why and how the code complies with the specified requirements. The report is hyper-linked to both the program and the verification conditions and thus provides a high-level structured argument containing tracing information for use in code reviews.

  2. NASA Requirements for Ground-Based Pressure Vessels and Pressurized Systems (PVS). Revision C

    NASA Technical Reports Server (NTRS)

    Greulich, Owen Rudolf

    2017-01-01

    The purpose of this document is to ensure the structural integrity of PVS through implementation of a minimum set of requirements for ground-based PVS in accordance with this document, NASA Policy Directive (NPD) 8710.5, NASA Safety Policy for Pressure Vessels and Pressurized Systems, NASA Procedural Requirements (NPR) 8715.3, NASA General Safety Program Requirements, applicable Federal Regulations, and national consensus codes and standards (NCS).

  3. Sierra Toolkit Manual Version 4.48.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sierra Toolkit Team

    This report provides documentation for the SIERRA Toolkit (STK) modules. STK modules are intended to provide infrastructure that assists the development of computational engineering software such as finite-element analysis applications. STK includes modules for unstructured-mesh data structures, reading/writing mesh files, geometric proximity search, and various utilities. This document contains a chapter for each module, and each chapter contains overview descriptions and usage examples. Usage examples are primarily code listings which are generated from working test programs that are included in the STK code-base. A goal of this approach is to ensure that the usage examples will not fall out of date.

  4. Conjunctive programming: An interactive approach to software system synthesis

    NASA Technical Reports Server (NTRS)

    Tausworthe, Robert C.

    1992-01-01

    This report introduces a technique of software documentation called conjunctive programming and discusses its role in the development and maintenance of software systems. The report also describes the conjoin tool, an adjunct to assist practitioners. Aimed at supporting software reuse while conforming with conventional development practices, conjunctive programming is defined as the extraction, integration, and embellishment of pertinent information obtained directly from an existing database of software artifacts, such as specifications, source code, configuration data, link-edit scripts, utility files, and other relevant information, into a product that achieves desired levels of detail, content, and production quality. Conjunctive programs typically include automatically generated tables of contents, indexes, cross references, bibliographic citations, tables, and figures (including graphics and illustrations). This report presents an example of conjunctive programming by documenting the use and implementation of the conjoin program.

  5. Minnesota Department of Transportation Research Services : 2009 annual report.

    DOT National Transportation Integrated Search

    2010-01-01

    The purpose of this report is to meet the requirements set forth by the Code of Federal Regulations, Part 420 (Planning and Research Program Administration), Section 420.117(e): Suitable reports that document the results of activities performed with...

  6. A Comprehensive review on the open source hackable text editor-ATOM

    NASA Astrophysics Data System (ADS)

    Sumangali, K.; Borra, Lokesh; Suraj Mishra, Amol

    2017-11-01

    This document presents a comprehensive study of “Atom”, one of the best open-source code editors available, with many built-in features that support a multitude of programming environments and provide a more productive toolset for developers.

  7. 40 CFR 131.44 - Florida.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) WATER PROGRAMS WATER QUALITY STANDARDS Federally Promulgated Water Quality Standards § 131.44 Florida. (a) Phosphorus Rule. (1) The document entitled “Florida Administrative Code, Chapter 62-302, Surface Water Quality Standards, Section 62-302.540...

  8. 78 FR 14467 - Energy Conservation Program: Availability of the Preliminary Technical Support Document for...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-03-06

    .... EERE-2011-BT-STD-0006] RIN 1904-AC43 Energy Conservation Program: Availability of the Preliminary... paragraph, on the first and second lines, "GSFL-IRL_2011-STD[email protected]" should read "GSFL-IRL_2011-STD[email protected]". [FR Doc. C1-2013-04711 Filed 3-5-13; 8:45 am] BILLING CODE 1505-01-D

  9. NASIS data base management system - IBM 360/370 OS MVT implementation. 1: Installation standards

    NASA Technical Reports Server (NTRS)

    1973-01-01

    The installation standards for the NASA Aerospace Safety Information System (NASIS) data base management system are presented. The standard approach to preparing systems documentation and the program design and coding rules and conventions are outlined. Included are instructions for preparing all major specifications and suggestions for improving the quality and efficiency of the programming task.

  10. A Concept for Run-Time Support of the Chapel Language

    NASA Technical Reports Server (NTRS)

    James, Mark

    2006-01-01

    A document presents a concept for run-time implementation of other concepts embodied in the Chapel programming language. (Now undergoing development, Chapel is intended to become a standard language for parallel computing that would surpass older such languages both in computational performance and in the efficiency with which pre-existing code can be reused and new code written.) The aforementioned other concepts are those of distributions, domains, allocations, and access, as defined in a separate document called "A Semantic Framework for Domains and Distributions in Chapel" and linked to a language specification defined in another separate document called "Chapel Specification 0.3." The concept presented in this report is the recognition that a data domain invented for Chapel offers a novel approach to distributing and processing data in a massively parallel environment. The concept is offered as a starting point for development of working descriptions of functions and data structures that would be necessary to implement interfaces to a compiler for transforming the aforementioned other concepts from their representations in Chapel source code to their run-time implementations.

  11. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kelly, D L

    The US Department of Energy (DOE) has been conducting, through several of its operating contractors, an evaluation and testing program to qualify Type A radioactive material packagings per US Department of Transportation (DOT) Specification 7A (DOT-7A) of the Code of Federal Regulations (CFR), Title 49, Part 178 (49 CFR 178). This document summarizes the evaluation and testing performed for all of the packagings successfully qualified in this program. This document supersedes the DOE Evaluation Document for DOT-7A Type A Packaging (Edling 1987), originally issued in 1987 by Monsanto Research Corporation Mound Laboratory (MLM), Miamisburg, Ohio, for the Department of Energy Security Evaluation Program (I)P-4. Mound Laboratory issued four revisions to the document between November 1988 and December 1989. In September 1989, the program was transferred to Westinghouse Hanford Company (Westinghouse Hanford) in Richland, Washington. One additional revision was issued in March 1990 by Westinghouse Hanford. This revision reflects the earlier material and incorporates a number of changes. Evaluation and testing activities on three DOT-7A Program Dockets resulted in the qualification of three new packaging configurations, which are incorporated herein and summarized. This document presents approximately 300 different packagings that have been determined to meet the requirements for a DOT-7A Type A packaging per 49 CFR 178.350.

  12. [Quality management and strategic consequences of assessing documentation and coding under the German Diagnostic Related Groups system].

    PubMed

    Schnabel, M; Mann, D; Efe, T; Schrappe, M; V Garrel, T; Gotzen, L; Schaeg, M

    2004-10-01

    The introduction of the German Diagnostic Related Groups (D-DRG) system requires redesigning administrative patient management strategies. Wrong coding leads to inaccurate grouping and endangers the reimbursement of treatment costs. This situation emphasizes the roles of documentation and coding as factors of economical success. The aims of this study were to assess the quantity and quality of initial documentation and coding (ICD-10 and OPS-301) and find operative strategies to improve efficiency and strategic means to ensure optimal documentation and coding quality. In a prospective study, documentation and coding quality were evaluated in a standardized way by weekly assessment. Clinical data from 1385 inpatients were processed for initial correctness and quality of documentation and coding. Principal diagnoses were found to be accurate in 82.7% of cases, inexact in 7.1%, and wrong in 10.1%. Effects on financial returns occurred in 16%. Based on these findings, an optimized, interdisciplinary, and multiprofessional workflow on medical documentation, coding, and data control was developed. Workflow incorporating regular assessment of documentation and coding quality is required by the DRG system to ensure efficient accounting of hospital services. Interdisciplinary and multiprofessional cooperation is recognized to be an important factor in establishing an efficient workflow in medical documentation and coding.

  13. Analyzing and modeling gravity and magnetic anomalies using the SPHERE program and Magsat data

    NASA Technical Reports Server (NTRS)

    Braile, L. W.; Hinze, W. J.; Vonfrese, R. R. B. (Principal Investigator)

    1981-01-01

    Computer codes were completed, tested, and documented for analyzing magnetic anomaly vector components by equivalent point dipole inversion. The codes are intended for use in inverting the magnetic anomaly due to a spherical prism in a horizontal geomagnetic field and for recomputing the anomaly in a vertical geomagnetic field. Modeling of potential fields at satellite elevations that are derived from three dimensional sources by program SPHERE was made significantly more efficient by improving the input routines. A preliminary model of the Andean subduction zone was used to compute the anomaly at satellite elevations using both actual geomagnetic parameters and vertical polarization. Program SPHERE is also being used to calculate satellite level magnetic and gravity anomalies from the Amazon River Aulacogen.

  14. Clean Energy in City Codes: A Baseline Analysis of Municipal Codification across the United States

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cook, Jeffrey J.; Aznar, Alexandra; Dane, Alexander

    Municipal governments in the United States are well positioned to influence clean energy (energy efficiency and alternative energy) and transportation technology and strategy implementation within their jurisdictions through planning, programs, and codification. Municipal governments are leveraging planning processes and programs to shape their energy futures. There is limited understanding in the literature related to codification, the primary way that municipal governments enact enforceable policies. The authors fill the gap in the literature by documenting the status of municipal codification of clean energy and transportation across the United States. More directly, we leverage online databases of municipal codes to develop national and state-specific representative samples of municipal governments by population size. Our analysis finds that municipal governments with the authority to set residential building energy codes within their jurisdictions frequently do so. In some cases, communities set codes higher than their respective state governments. Examination of codes across the nation indicates that municipal governments are employing their code as a policy mechanism to address clean energy and transportation.

  15. NEQAIR96,Nonequilibrium and Equilibrium Radiative Transport and Spectra Program: User's Manual

    NASA Technical Reports Server (NTRS)

    Whiting, Ellis E.; Park, Chul; Liu, Yen; Arnold, James O.; Paterson, John A.

    1996-01-01

    This document is the User's Manual for a new version of the NEQAIR computer program, NEQAIR96. The program is a line-by-line and line-of-sight code. It calculates the emission and absorption spectra for atomic and diatomic molecules and the transport of radiation through a nonuniform gas mixture to a surface. The program has been rewritten to make it easy to use, to run faster, and to include many run-time options that tailor a calculation to the user's requirements. The accuracy and capability have also been improved by including the rotational Hamiltonian matrix formalism for calculating rotational energy levels and Hoenl-London factors for dipole and spin-allowed singlet, doublet, triplet, and quartet transitions. Three sample cases are included to help the user become familiar with the steps taken to produce a spectrum. A new user interface uses check locations to select run-time options and to enter selected run data, making NEQAIR96 easier to use than older versions of the code. The ease of its use and the speed of its algorithms make NEQAIR96 a valuable educational code as well as a practical spectroscopic prediction and diagnostic code.

  16. PyORBIT: A Python Shell For ORBIT

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jean-Francois Ostiguy; Jeffrey Holmes

    2003-07-01

    ORBIT is a code developed at SNS to simulate beam dynamics in accumulation rings and synchrotrons. The code is structured as a collection of external C++ modules for SuperCode, a high-level interpreter shell developed at LLNL in the early 1990s. SuperCode is no longer actively supported, and there has for some time been interest in replacing it with a modern scripting language while preserving the feel of the original ORBIT program. In this paper, we describe a new version of ORBIT where the role of SuperCode is assumed by Python, a free, well-documented, and widely supported object-oriented scripting language. We also compare PyORBIT to ORBIT from the standpoint of features, performance, and future expandability.

  17. GCS programmer's manual

    NASA Technical Reports Server (NTRS)

    Lowman, Douglas S.; Withers, B. Edward; Shagnea, Anita M.; Dent, Leslie A.; Hayhurst, Kelly J.

    1990-01-01

    This document describes a variety of instructions to be used in developing implementations of software for the Guidance and Control Software (GCS) project. It fulfills the Radio Technical Commission for Aeronautics RTCA/DO-178A guidelines, 'Software Considerations in Airborne Systems and Equipment Certification', requirements for document No. 4, which specifies the information necessary for understanding and programming the host computer, and document No. 12, which specifies the software design and implementation standards applicable to the software development and testing process. Information is provided on the following subjects: activity recording, communication protocol, coding standards, change management, error handling, design standards, problem reporting, module testing logs, documentation formats, accuracy requirements, and programmer responsibilities.

  18. Stonix, Version 0.x

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    2015-05-13

    STONIX is a program for configuring UNIX and Linux computer operating systems. It applies configurations based on the guidance from publicly accessible resources such as: NSA Guides, DISA STIGs, the Center for Internet Security (CIS), USGCB and vendor security documentation. STONIX is written in the Python programming language using the QT4 and PyQT4 libraries to provide a GUI. The code is designed to be easily extensible and customizable.

  19. Nondiscrimination on the Basis of Handicap in Programs or Activities Receiving Federal Financial Assistance. Title 34, Code of Federal Regulations, Part 104.

    ERIC Educational Resources Information Center

    Office for Civil Rights (ED), Washington, DC.

    This document presents the final regulations of the Office for Civil Rights, Department of Education, concerning nondiscrimination on the basis of handicap in programs or activities receiving federal financial assistance under Section 504 of the Rehabilitation Act of 1973. The first part covers general provisions such as purpose, application,…

  20. C style guide

    NASA Technical Reports Server (NTRS)

    Doland, Jerry; Valett, Jon

    1994-01-01

    This document discusses recommended practices and style for programmers using the C language in the Flight Dynamics Division environment. Guidelines are based on generally recommended software engineering techniques, industry resources, and local convention. The Guide offers preferred solutions to common C programming issues and illustrates them through examples of C code.

  1. Document image retrieval through word shape coding.

    PubMed

    Lu, Shijian; Li, Linlin; Tan, Chew Lim

    2008-11-01

    This paper presents a document retrieval technique that is capable of searching document images without OCR (optical character recognition). The proposed technique retrieves document images by a new word shape coding scheme, which captures the document content through annotating each word image by a word shape code. In particular, we annotate word images by using a set of topological shape features including character ascenders/descenders, character holes, and character water reservoirs. With the annotated word shape codes, document images can be retrieved by either query keywords or a query document image. Experimental results show that the proposed document image retrieval technique is fast, efficient, and tolerant to various types of document degradation.
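
    As a rough illustration of the word shape coding idea (illustrative only: the paper computes these features from word images, whereas this sketch works on character strings), each character can be mapped to a coarse shape class and the classes concatenated into a code that two occurrences of the same word will share.

        ASCENDERS = set("bdfhklt")   # strokes rising above the x-height
        DESCENDERS = set("gjpqy")    # strokes dropping below the baseline
        HOLES = set("aeo")           # enclosed holes only (no ascender/descender)

        def word_shape_code(word: str) -> str:
            """One symbol per character: a = ascender, d = descender,
            o = hole only, x = none of these."""
            def classify(ch):
                if ch in ASCENDERS:
                    return "a"
                if ch in DESCENDERS:
                    return "d"
                if ch in HOLES:
                    return "o"
                return "x"
            return "".join(classify(ch) for ch in word.lower())

        # Matching codes lets a query keyword find word images without OCR.
        print(word_shape_code("document"))  # aoxxxoxa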

  2. 17 CFR 232.106 - Prohibition against electronic submissions containing executable code.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... executable code will be suspended, unless the executable code is contained only in one or more PDF documents, in which case the submission will be accepted but the PDF document(s) containing executable code will...

  3. Code Parallelization with CAPO: A User Manual

    NASA Technical Reports Server (NTRS)

    Jin, Hao-Qiang; Frumkin, Michael; Yan, Jerry; Biegel, Bryan (Technical Monitor)

    2001-01-01

    A software tool has been developed to assist the parallelization of scientific codes. This tool, CAPO, extends an existing parallelization toolkit, CAPTools, developed at the University of Greenwich, to generate OpenMP parallel codes for shared memory architectures. This is an interactive toolkit for transforming a serial Fortran application code into an equivalent parallel version of the software - in a small fraction of the time normally required for a manual parallelization. We first discuss the way in which loop types are categorized and how efficient OpenMP directives can be defined and inserted into the existing code using the in-depth interprocedural analysis. The use of the toolkit on a number of application codes, ranging from benchmarks to real-world applications, is presented; this demonstrates the great potential of using the toolkit to quickly parallelize serial programs as well as the good performance achievable on a large number of processors. The second part of the document gives references to the parameters and the graphic user interface implemented in the toolkit. Finally, a set of tutorials is included for hands-on experience with this toolkit.

  4. Upgrades of Two Computer Codes for Analysis of Turbomachinery

    NASA Technical Reports Server (NTRS)

    Chima, Rodrick V.; Liou, Meng-Sing

    2005-01-01

    Major upgrades have been made in two of the programs reported in "Five Computer Codes for Analysis of Turbomachinery". The affected programs are: Swift -- a code for three-dimensional (3D) multiblock analysis; and TCGRID, which generates a 3D grid used with Swift. Originally utilizing only a central-differencing scheme for numerical solution, Swift was augmented by addition of two upwind schemes that give greater accuracy but take more computing time. Other improvements in Swift include addition of a shear-stress-transport turbulence model for better prediction of adverse pressure gradients, addition of an H-grid capability for flexibility in modeling flows in pumps and ducts, and modification to enable simultaneous modeling of hub and tip clearances. Improvements in TCGRID include modifications to enable generation of grids for more complicated flow paths and addition of an option to generate grids compatible with the ADPAC code used at NASA and in industry. For both codes, new test cases were developed and documentation was updated. Both codes were converted to Fortran 90, with dynamic memory allocation. Both codes were also modified for ease of use in both UNIX and Windows operating systems.

  5. OFF, Open source Finite volume Fluid dynamics code: A free, high-order solver based on parallel, modular, object-oriented Fortran API

    NASA Astrophysics Data System (ADS)

    Zaghi, S.

    2014-07-01

    OFF, an open source (free software) code for performing fluid dynamics simulations, is presented. The aim of OFF is to solve, numerically, the unsteady (and steady) compressible Navier-Stokes equations of fluid dynamics by means of finite volume techniques: the research background is mainly focused on high-order (WENO) schemes for multi-fluids, multi-phase flows over complex geometries. To this purpose a highly modular, object-oriented application program interface (API) has been developed. In particular, the concepts of data encapsulation and inheritance available within the Fortran language (from standard 2003) have been stressed in order to represent each fluid dynamics "entity" (e.g. the conservative variables of a finite volume, its geometry, etc…) by a single object, so that a large variety of computational libraries can be easily (and efficiently) developed upon these objects. The main features of OFF can be summarized as follows. Programming language: OFF is written in standard (compliant) Fortran 2003; its design is highly modular in order to enhance simplicity of use and maintenance without compromising efficiency. Parallel frameworks supported: the development of OFF has also been targeted to maximize computational efficiency; the code is designed to run on shared-memory multi-core workstations and distributed-memory clusters of shared-memory nodes (supercomputers); the code's parallelization is based on the Open Multiprocessing (OpenMP) and Message Passing Interface (MPI) paradigms. Usability, maintenance and enhancement: in order to improve the usability, maintenance and enhancement of the code, the documentation has been carefully taken into account; the documentation is built upon comprehensive comments placed directly into the source files (no external documentation files needed); these comments are parsed by means of the doxygen free software, producing high-quality html and latex documentation pages; the distributed versioning system referred to as git has been adopted in order to facilitate collaborative maintenance and improvement of the code. Copyrights: OFF is free software that anyone can use, copy, distribute, study, change and improve under the GNU Public License version 3. The present paper is a manifesto of the OFF code and presents the currently implemented features and ongoing developments. This work is focused on the computational techniques adopted, and a detailed description of the main API characteristics is reported. OFF capabilities are demonstrated by means of one- and two-dimensional examples and a three-dimensional real application.

  6. Clean air program : design guidelines for bus transit systems using compressed natural gas as an alternative fuel

    DOT National Transportation Integrated Search

    1996-06-01

    This report documents design guidelines for the safe use of Compressed Natural Gas (CNG). The report is designed to provide guidance, information on safe industry practices, applicable national codes and standards, and reference data that transit agencies...

  7. Cognitive and Neural Sciences Division, 1988 Programs.

    ERIC Educational Resources Information Center

    Vaughan, Willard S., Ed.

    The research and development efforts performed by principal investigators under sponsorship of the Office of Naval Research Cognitive and Neural Sciences Division during 1988 are documented. The title, name and affiliation of the principal investigator, project code, contract number, current end date, technical objective, approach, and progress of…

  8. Cryptography based on the absorption/emission features of multicolor semiconductor nanocrystal quantum dots.

    PubMed

    Zhou, Ming; Chang, Shoude; Grover, Chander

    2004-06-28

    Further to the optical coding based on fluorescent semiconductor quantum dots (QDs), a concept of using mixtures of multiple single-color QDs for creating highly secret cryptograms based on their absorption/emission properties was demonstrated. The key to readout of the optical codes is a group of excitation lights with the predetermined wavelengths programmed in a secret manner. The cryptograms can be printed on the surfaces of different objects such as valuable documents for security purposes.

  9. A computer program for processing impedance cardiographic data: Improving accuracy through user-interactive software

    NASA Technical Reports Server (NTRS)

    Cowings, Patricia S.; Naifeh, Karen; Thrasher, Chet

    1988-01-01

    This report contains the source code and documentation for a computer program used to process impedance cardiography data. The cardiodynamic measures derived from impedance cardiography are ventricular stroke volume, cardiac output, cardiac index, and Heather index. The program digitizes data collected from the Minnesota Impedance Cardiograph, electrocardiography (ECG), and respiratory cycles and then stores these data on hard disk. It computes the cardiodynamic functions using interactive graphics and stores the means and standard deviations of each 15-sec data epoch on floppy disk. This software was designed on a Digital PRO380 microcomputer and used version 2.0 of P/OS, with (minimally) a 4-channel 16-bit analog/digital (A/D) converter. Applications software is written in FORTRAN 77 and uses Digital's Pro-Tool Kit Real Time Interface Library, CORE Graphic Library, and laboratory routines. Source code can be readily modified to accommodate alternative detection, A/D conversion, and interactive graphics. The object code utilizing overlays and multitasking has a maximum of 50 Kbytes.
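
    For orientation, the cardiodynamic measures named above are conventionally derived from a few standard formulas: the Kubicek stroke-volume estimate, cardiac output as stroke volume times heart rate, cardiac index as output normalized by body surface area, and the Heather index as (dZ/dt)max divided by the Q-to-peak time. The Python sketch below shows these textbook relations with illustrative values; it is not the FORTRAN 77 source documented in the report.

        def stroke_volume_kubicek(rho, L, Z0, dZdt_max, lvet):
            """Kubicek estimate: SV = rho * (L/Z0)^2 * LVET * (dZ/dt)max.
            rho: blood resistivity (ohm*cm), L: electrode spacing (cm),
            Z0: basal thoracic impedance (ohm), lvet: LV ejection time (s)."""
            return rho * (L / Z0) ** 2 * lvet * dZdt_max

        def derived_measures(sv_ml, hr_bpm, bsa_m2, dZdt_max, q_to_peak_s):
            cardiac_output = sv_ml * hr_bpm / 1000.0  # L/min
            cardiac_index = cardiac_output / bsa_m2   # L/min per m^2
            heather_index = dZdt_max / q_to_peak_s    # contractility proxy
            return cardiac_output, cardiac_index, heather_index

        sv = stroke_volume_kubicek(rho=135.0, L=30.0, Z0=25.0, dZdt_max=1.2, lvet=0.30)
        print(derived_measures(sv, hr_bpm=70, bsa_m2=1.9, dZdt_max=1.2, q_to_peak_s=0.18))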

  10. Documentation of the Fourth Order Band Model

    NASA Technical Reports Server (NTRS)

    Kalnay-Rivas, E.; Hoitsma, D.

    1979-01-01

    A general circulation model is presented which uses quadratically conservative, fourth-order horizontal space differences on an unstaggered grid and second-order vertical space differences with a forward-backward or a smooth leap-frog time scheme to solve the primitive equations of motion. The dynamic equations of motion, the finite difference equations, a discussion of the structure and flow chart of the program code, a program listing, and three relevant papers are given.
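
    The fourth-order differences in question are the standard five-point centered stencil, f'_i ≈ [8(f_{i+1} - f_{i-1}) - (f_{i+2} - f_{i-2})] / (12 Δx), whose truncation error falls off as Δx^4. A minimal Python sketch on a periodic grid (not the model's own code) follows.

        import math

        def ddx4(f, dx):
            """Fourth-order centered first derivative on a periodic grid."""
            n = len(f)
            return [(8.0 * (f[(i + 1) % n] - f[(i - 1) % n])
                     - (f[(i + 2) % n] - f[(i - 2) % n])) / (12.0 * dx)
                    for i in range(n)]

        # d/dx sin x = cos x; the maximum error is O(dx^4), tiny for n = 64.
        n = 64
        dx = 2.0 * math.pi / n
        f = [math.sin(i * dx) for i in range(n)]
        err = max(abs(d - math.cos(i * dx)) for i, d in enumerate(ddx4(f, dx)))
        print(err)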

  11. Aero/structural tailoring of engine blades (AERO/STAEBL)

    NASA Technical Reports Server (NTRS)

    Brown, K. W.

    1988-01-01

    This report describes the Aero/Structural Tailoring of Engine Blades (AERO/STAEBL) program, which is a computer code used to perform engine fan and compressor blade aero/structural numerical optimizations. These optimizations seek a blade design of minimum operating cost that satisfies realistic blade design constraints. This report documents the overall program (i.e., input, optimization procedures, approximate analyses) and also provides a detailed description of the validation test cases.

  12. Text Exchange System

    NASA Technical Reports Server (NTRS)

    Snyder, W. V.; Hanson, R. J.

    1986-01-01

    Text Exchange System (TES) exchanges and maintains organized textual information including source code, documentation, data, and listings. System consists of two computer programs and definition of format for information storage. Comprehensive program used to create, read, and maintain TES files. TES developed to meet three goals: First, easy and efficient exchange of programs and other textual data between similar and dissimilar computer systems via magnetic tape. Second, provide transportable management system for textual information. Third, provide common user interface, over wide variety of computing systems, for all activities associated with text exchange.

  13. The integration of system specifications and program coding

    NASA Technical Reports Server (NTRS)

    Luebke, W. R.

    1970-01-01

    Experience in maintaining up-to-date documentation for one module of the large-scale Medical Literature Analysis and Retrieval System 2 (MEDLARS 2) is described. Several innovative techniques were explored in the development of this system's data management environment, particularly those that use PL/I as an automatic documenter. The PL/I data description section can provide automatic documentation by means of a master description of data elements that has long and highly meaningful mnemonic names and a formalized technique for the production of descriptive commentary. The techniques discussed are practical methods that employ the computer during system development in a manner that assists system implementation, provides interim documentation for customer review, and satisfies some of the deliverable documentation requirements.

  14. Liquid rocket combustion computer model with distributed energy release. DER computer program documentation and user's guide, volume 1

    NASA Technical Reports Server (NTRS)

    Combs, L. P.

    1974-01-01

    A computer program for analyzing rocket engine performance was developed. The program is concerned with the formation, distribution, flow, and combustion of liquid sprays and combustion product gases in conventional rocket combustion chambers. The capabilities of the program to determine the combustion characteristics of the rocket engine are described. Sample data code sheets show the correct sequence and formats for variable values and include notes concerning options to bypass the input of certain data. A separate list defines the variables and indicates their required dimensions.

  15. Use of General-purpose Negation Detection to Augment Concept Indexing of Medical Documents

    PubMed Central

    Mutalik, Pradeep G.; Deshpande, Aniruddha; Nadkarni, Prakash M.

    2001-01-01

    Objectives: To test the hypothesis that most instances of negated concepts in dictated medical documents can be detected by a strategy that relies on tools developed for the parsing of formal (computer) languages—specifically, a lexical scanner (“lexer”) that uses regular expressions to generate a finite state machine, and a parser that relies on a restricted subset of context-free grammars, known as LALR(1) grammars. Methods: A diverse training set of 40 medical documents from a variety of specialties was manually inspected and used to develop a program (Negfinder) that contained rules to recognize a large set of negated patterns occurring in the text. Negfinder's lexer and parser were developed using tools normally used to generate programming language compilers. The input to Negfinder consisted of medical narrative that was preprocessed to recognize UMLS concepts: the text of a recognized concept had been replaced with a coded representation that included its UMLS concept ID. The program generated an index with one entry per instance of a concept in the document, where the presence or absence of negation of that concept was recorded. This information was used to mark up the text of each document by color-coding it to make it easier to inspect. The parser was then evaluated in two ways: 1) a test set of 60 documents (30 discharge summaries, 30 surgical notes) marked-up by Negfinder was inspected visually to quantify false-positive and false-negative results; and 2) a different test set of 10 documents was independently examined for negatives by a human observer and by Negfinder, and the results were compared. Results: In the first evaluation using marked-up documents, 8,358 instances of UMLS concepts were detected in the 60 documents, of which 544 were negations detected by the program and verified by human observation (true-positive results, or TPs). Thirteen instances were wrongly flagged as negated (false-positive results, or FPs), and the program missed 27 instances of negation (false-negative results, or FNs), yielding a sensitivity of 95.3 percent and a specificity of 97.7 percent. In the second evaluation using independent negation detection, 1,869 concepts were detected in 10 documents, with 135 TPs, 12 FPs, and 6 FNs, yielding a sensitivity of 95.7 percent and a specificity of 91.8 percent. One of the words “no,” “denies/denied,” “not,” or “without” was present in 92.5 percent of all negations. Conclusions: Negation of most concepts in medical narrative can be reliably detected by a simple strategy. The reliability of detection depends on several factors, the most important being the accuracy of concept matching. PMID:11687566
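
    The closing observation, that a handful of trigger words account for 92.5 percent of negations, suggests why even a simple scanner can do well. The Python sketch below caricatures that strategy with a fixed token window after each trigger; Negfinder itself uses a regular-expression lexer and an LALR(1) grammar, and the concept list here is a hypothetical stand-in for UMLS concept matching.

        import re

        TRIGGER = re.compile(r"^(no|not|denies|denied|without)$", re.IGNORECASE)
        CONCEPTS = {"fever", "cough", "nausea", "edema"}  # stand-in for UMLS matches
        WINDOW = 6  # tokens after a trigger treated as negated

        def negated_concepts(sentence: str):
            """Return (concept, negated?) pairs found in the sentence."""
            tokens = [t.strip(".,;:").lower() for t in sentence.split()]
            negated = set()
            for i, tok in enumerate(tokens):
                if TRIGGER.match(tok):
                    negated.update(range(i + 1, i + 1 + WINDOW))
            return [(tok, i in negated) for i, tok in enumerate(tokens) if tok in CONCEPTS]

        print(negated_concepts("Patient denies fever and cough but reports nausea."))
        # [('fever', True), ('cough', True), ('nausea', True)]
        # 'nausea' is a false positive: the fixed window ignores the 'but' clause,
        # exactly the kind of case a real parser is needed for.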

  16. Ada Structure Design Language (ASDL)

    NASA Technical Reports Server (NTRS)

    Chedrawi, Lutfi

    1986-01-01

    An artist acquires all the necessary tools before painting a scene. By the same analogy, a software engineer needs the necessary tools to provide a design with the proper means for implementation. Ada provides these tools. Yet, as an artist's painting needs a brochure to accompany it for further explanation of the scene, an Ada design also needs a document along with it to show the design in its detailed structure and hierarchical order. Ada can be self-explanatory in small programs not exceeding fifty lines of code, but in a large environment, ranging from thousands of lines upward, Ada programs need to be well documented to be preserved and maintained. The language used to specify an Ada document is called the Ada Structure Design Language (ASDL). This language sets some rules to help derive a well-formatted Ada detailed design document. The rules are defined to meet the needs of a project manager, a maintenance team, a programmer, and a systems designer. The design document templates, the document extractor, and the rules set forth by the ASDL are explained in detail.

  17. Wind turbine design codes: A preliminary comparison of the aerodynamics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Buhl, M.L. Jr.; Wright, A.D.; Tangler, J.L.

    1997-12-01

    The National Wind Technology Center of the National Renewable Energy Laboratory is comparing several computer codes used to design and analyze wind turbines. The first part of this comparison is to determine how well the programs predict the aerodynamic behavior of turbines with no structural degrees of freedom. Without general agreement on the aerodynamics, it is futile to try to compare the structural response due to the aerodynamic input. In this paper, the authors compare the aerodynamic loads for three programs: Garrad Hassan's BLADED, their own WT-PERF, and the University of Utah's YawDyn. This report documents a work in progress and compares only two-bladed, downwind turbines.

  18. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sharma, Vishal C.; Gopalakrishnan, Ganesh; Krishnamoorthy, Sriram

    The systems resilience research community has developed methods to manually insert additional source-program level assertions to trap errors, and also devised tools to conduct fault injection studies for scalar program codes. In this work, we contribute the first vector oriented LLVM-level fault injector VULFI to help study the effects of faults in vector architectures that are of growing importance, especially for vectorizing loops. Using VULFI, we conduct a resiliency study of nine real-world vector benchmarks using Intel’s AVX and SSE extensions as the target vector instruction sets, and offer the first reported understanding of how faults affect vector instruction sets.more » We take this work further toward automating the insertion of resilience assertions during compilation. This is based on our observation that during intermediate (e.g., LLVM-level) code generation to handle full and partial vectorization, modern compilers exploit (and explicate in their code-documentation) critical invariants. These invariants are turned into error-checking code. We confirm the efficacy of these automatically inserted low-overhead error detectors for vectorized for-loops.« less

  19. Exshall: A Turkel-Zwas explicit large time-step FORTRAN program for solving the shallow-water equations in spherical coordinates

    NASA Astrophysics Data System (ADS)

    Navon, I. M.; Yu, Jian

    A FORTRAN computer program is presented and documented that applies the Turkel-Zwas explicit large time-step scheme to a hemispheric barotropic model with constraint restoration of the integral invariants of the shallow-water equations. We then detail the algorithms embodied in the EXSHALL code, particularly algorithms related to the efficiency and stability of the T-Z scheme and the quadratic constraint restoration method, which is based on a variational approach. In particular, we provide details about the high-latitude filtering, Shapiro filtering, and Robert filtering algorithms used in the code. We explain in detail the various subroutines in the EXSHALL code, with emphasis on the algorithms implemented, and present flowcharts of some major subroutines. Finally, we provide a visual example illustrating a 4-day run using real initial data, along with a sample printout and graphic isoline contours of the height field and velocity fields.
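
    Of the smoothing steps mentioned, the Shapiro filter is the simplest to exhibit: a first-order (1-2-1) pass replaces each point by (u_{i-1} + 2u_i + u_{i+1})/4, which completely removes the shortest resolvable (two-grid-interval) wave while only weakly damping long waves. A minimal Python sketch on a periodic field (illustrative only; EXSHALL applies its filters in FORTRAN along the model grid) follows.

        def shapiro1(u):
            """One pass of a first-order Shapiro (1-2-1) filter, periodic boundaries."""
            n = len(u)
            return [(u[(i - 1) % n] + 2.0 * u[i] + u[(i + 1) % n]) / 4.0
                    for i in range(n)]

        # The 2*dx wave (+1, -1, +1, ...) is annihilated in a single pass.
        print(shapiro1([1.0, -1.0] * 4))  # [0.0, 0.0, ..., 0.0]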

  20. Semi-automated Modular Program Constructor for physiological modeling: Building cell and organ models.

    PubMed

    Jardine, Bartholomew; Raymond, Gary M; Bassingthwaighte, James B

    2015-01-01

    The Modular Program Constructor (MPC) is an open-source, Java-based modeling utility, built upon JSim's Mathematical Modeling Language (MML) (http://www.physiome.org/jsim/), that uses directives embedded in model code to construct larger, more complicated models quickly and with less error than manually combining models. A major obstacle in writing complex models of physiological processes is the large amount of time it takes to model the myriad processes taking place simultaneously in cells, tissues, and organs. MPC replaces this task with code-generating algorithms that take model code from several different existing models and produce model code for a new JSim model. This is particularly useful during multi-scale model development, where many variants are to be configured and tested against data. MPC encodes and preserves information about how a model is built from its simpler model modules, allowing the researcher to quickly substitute or update modules for hypothesis testing. MPC is implemented in Java and requires JSim to use its output. MPC source code and documentation are available at http://www.physiome.org/software/MPC/.
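
    To make the directive-driven assembly idea concrete, here is a hedged Python sketch of the general technique MPC automates: splicing module code into a template according to embedded directives. The directive syntax, file names, and helper below are invented for illustration and are not MPC's actual MML directives.

        import re

        # Hypothetical directive, e.g. a line reading "//%INCLUDE cell";
        # the real MPC defines its own directives for JSim MML code.
        INCLUDE = re.compile(r"^//%INCLUDE\s+(\S+)\s*$")

        def assemble(template_path, modules):
            # Expand include directives by splicing in code from the
            # named module files, yielding one combined model file.
            out = []
            with open(template_path) as fh:
                for line in fh:
                    match = INCLUDE.match(line.strip())
                    if match:
                        with open(modules[match.group(1)]) as mod:
                            out.append(mod.read())
                    else:
                        out.append(line)
            return "".join(out)

        # Usage (hypothetical file names):
        # code = assemble("organ_template.mod", {"cell": "cell_model.mod"})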

  1. Computer-Assisted Instruction: Authoring Languages. ERIC Digest.

    ERIC Educational Resources Information Center

    Reeves, Thomas C.

    One of the most perplexing tasks in producing computer-assisted instruction (CAI) is the authoring process. Authoring is generally defined as the process of turning the flowcharts, control algorithms, format sheets, and other documentation of a CAI program's design into computer code that will operationalize the simulation on the delivery system.…

  2. Automatic mathematical modeling for real time simulation program (AI application)

    NASA Technical Reports Server (NTRS)

    Wang, Caroline; Purinton, Steve

    1989-01-01

    A methodology is described for automatic mathematical modeling and generating simulation models. The major objective was to create a user friendly environment for engineers to design, maintain, and verify their models; to automatically convert the mathematical models into conventional code for computation; and finally, to document the model automatically.

  3. Advisory on Relocatable and Renovated Classrooms. IAQ Info Sheet.

    ERIC Educational Resources Information Center

    California State Dept. of Health Services, Berkeley.

    Many California school districts, in complying with the Class Size Reduction Program, will obtain relocatable classrooms directly from manufacturers who are under no specific guidelines or codes relative to indoor air quality (IAQ). This document, designed to aid school facility managers in minimizing potential IAQ problems, summarizes the indoor…

  4. World Wide Web Page Design: A Structured Approach.

    ERIC Educational Resources Information Center

    Gregory, Gwen; Brown, M. Marlo

    1997-01-01

    Describes how to develop a World Wide Web site based on structured programming concepts. Highlights include flowcharting, first page design, evaluation, page titles, documenting source code, text, graphics, and browsers. Includes a template for HTML writers, tips for using graphics, a sample homepage, guidelines for authoring structured HTML, and…

  5. "Intelligent" Computer Assisted Instruction (CAI) Applications. Interim Report.

    ERIC Educational Resources Information Center

    Brown, John Seely; And Others

    Interim work is documented describing efforts to modify computer techniques used to recognize and process English-language requests to an instructional simulator. The conversion from a hand-coded to a table-driven technique is described in detail. Other modifications to a simulation-based computer-assisted instruction program to allow a gaming…

  6. MODPATH-LGR; documentation of a computer program for particle tracking in shared-node locally refined grids by using MODFLOW-LGR

    USGS Publications Warehouse

    Dickinson, Jesse; Hanson, R.T.; Mehl, Steffen W.; Hill, Mary C.

    2011-01-01

    The computer program described in this report, MODPATH-LGR, is designed to allow simulation of particle tracking in locally refined grids. The locally refined grids are simulated by using MODFLOW-LGR, which is based on MODFLOW-2005, the three-dimensional groundwater-flow model published by the U.S. Geological Survey. The documentation includes brief descriptions of the methods used and detailed descriptions of the required input files and how the output files are typically used. The code for this model is available for downloading from the World Wide Web from a U.S. Geological Survey software repository. The repository is accessible from the U.S. Geological Survey Water Resources Information Web page at http://water.usgs.gov/software/ground_water.html. The performance of the MODPATH-LGR program has been tested in a variety of applications. Future applications, however, might reveal errors that were not detected in the test simulations. Users are requested to notify the U.S. Geological Survey of any errors found in this document or the computer program by using the email address available on the Web site. Updates might occasionally be made to this document and to the MODPATH-LGR program, and users should check the Web site periodically.

  7. River Protection Project (RPP) Dangerous Waste Training Plan

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    POHTO, R.E.

    2000-03-09

    This supporting document contains the training plan for dangerous waste management at River Protection Project TSD Units. This document outlines the dangerous waste training program developed and implemented for all Treatment, Storage, and Disposal (TSD) Units operated by River Protection Project (RPP) in the Hanford 200 East, 200 West and 600 Areas and the <90 Day Accumulation Area at 209E. Operating TSD Units managed by RPP are: the Double-Shell Tank (DST) System, 204-AR Waste Unloading Facility, Grout, and the Single-Shell Tank (SST) System. The program is designed in compliance with the requirements of Washington Administrative Code (WAC) 173-303-330 and Title 40 Code of Federal Regulations (CFR) 265.16 for the development of a written dangerous waste training program and the Hanford Facility Permit. Training requirements were determined by an assessment of employee duties and responsibilities. The RPP training program is designed to prepare employees to operate and maintain the Tank Farms in a safe, effective, efficient, and environmentally sound manner. In addition to preparing employees to operate and maintain the Tank Farms under normal conditions, the training program ensures that employees are prepared to respond in a prompt and effective manner should abnormal or emergency conditions occur. Emergency response training is consistent with emergency responses outlined in the following Building Emergency Plans: HNF-IP-0263-TF and HNF-IP-0263-209E.

  8. Coupled 2-dimensional cascade theory for noise and unsteady aerodynamics of blade row interaction in turbofans. Volume 2: Documentation for computer code CUP2D

    NASA Technical Reports Server (NTRS)

    Hanson, Donald B.

    1994-01-01

    A two-dimensional linear aeroacoustic theory for rotor/stator interaction with unsteady coupling was derived and explored in Volume 1 of this report. Computer program CUP2D has been written in FORTRAN, embodying the theoretical equations. This volume (Volume 2) describes the structure of the code, installation and running, preparation of the input file, and interpretation of the output. A sample case is provided with printouts of the input and output. The source code is included with comments linking it closely to the theoretical equations in Volume 1.

  9. User's guide to SEAWAT; a computer program for simulation of three-dimensional variable-density ground-water flow

    USGS Publications Warehouse

    Guo, Weixing; Langevin, C.D.

    2002-01-01

    This report documents a computer program (SEAWAT) that simulates variable-density, transient, ground-water flow in three dimensions. The source code for SEAWAT was developed by combining MODFLOW and MT3DMS into a single program that solves the coupled flow and solute-transport equations. The SEAWAT code follows a modular structure, and thus, new capabilities can be added with only minor modifications to the main program. SEAWAT reads and writes standard MODFLOW and MT3DMS data sets, although some extra input may be required for some SEAWAT simulations. This means that many of the existing pre- and post-processors can be used to create input data sets and analyze simulation results. Users familiar with MODFLOW and MT3DMS should have little difficulty applying SEAWAT to problems of variable-density ground-water flow.

  10. Solid waste projection model: Model version 1. 0 technical reference manual

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wilkins, M.L.; Crow, V.L.; Buska, D.E.

    1990-11-01

    The Solid Waste Projection Model (SWPM) system is an analytical tool developed by Pacific Northwest Laboratory (PNL) for Westinghouse Hanford Company (WHC). The SWPM system provides a modeling and analysis environment that supports decisions in the process of evaluating various solid waste management alternatives. This document, one of a series describing the SWPM system, contains detailed information regarding the software utilized in developing Version 1.0 of the modeling unit of SWPM. This document is intended for use by experienced software engineers and supports programming, code maintenance, and model enhancement. Those interested in using SWPM should refer to the SWPM Model User's Guide. This document is available from either the PNL project manager (D. L. Stiles, 509-376-4154) or the WHC program monitor (B. C. Anderson, 509-373-2796). 8 figs.

  11. Space Communications Emulation Facility

    NASA Technical Reports Server (NTRS)

    Hill, Chante A.

    2004-01-01

    Establishing space communication between ground facilities and other satellites is a painstaking task that requires many precise calculations dealing with relay time, atmospheric conditions, and satellite positions, to name a few. The Space Communications Emulation Facility (SCEF) team here at NASA is developing a facility that will approximately emulate the conditions in space that impact space communication. The emulation facility is comprised of a 32-node distributed cluster of computers, each node representing a satellite or ground station. The objective of the satellites is to observe the topography of the Earth (water, vegetation, land, and ice) and relay this information back to the ground stations. Software originally designed by the University of Kansas, called the Emulation Manager, controls the interaction of the satellites and ground stations, as well as the recording of data. The Emulation Manager is installed on a Linux operating system and employs both Java and C++ code. The emulation scenarios are written in the eXtensible Markup Language (XML). XML documents are designed to store, carry, and exchange data. With XML documents, data can be exchanged between incompatible systems, which makes XML ideal for this project because Linux, Mac, and Windows operating systems are all used. Unfortunately, XML documents cannot display data like HTML documents. Therefore, the SCEF team uses an XML Schema Definition (XSD), or simply a schema, to describe the structure of an XML document. Schemas are very important because they can validate the correctness of data, define restrictions on data, define data formats, and convert data between different data types, among other things. At this time, in order for the Emulation Manager to open and run an XML emulation scenario file, the user must first establish a link between the schema file and the directory under which the XML scenario files are saved. This procedure takes place on the command line of the Linux operating system. Once this link has been established, the Emulation Manager validates all the XML files in that directory against the schema file before the actual scenario is run. Using sophisticated commercial software called the Satellite Tool Kit (STK) installed on the Linux box, the Emulation Manager is able to display the data and graphics generated by the execution of an XML emulation scenario file. The Emulation Manager software is written in Java. Since the SCEF project is in the developmental stage, the source code for this software is being modified to better fit the requirements of the project. Some parameters for the emulation are hard-coded, set at fixed values. Members of the SCEF team are altering the code to allow the user to choose the values of these hard-coded parameters by inserting a toolbar onto the preexisting GUI.
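
    The pre-run check described above (validating every XML scenario file in a directory against one schema) can be sketched in a few lines of Python with the third-party lxml package. This is only an analogue for illustration; the SCEF's Emulation Manager performs the equivalent step in Java from the Linux command line, and the file names here are placeholders.

        from pathlib import Path
        from lxml import etree  # third-party: pip install lxml

        def validate_scenarios(schema_file, scenario_dir):
            # Validate each XML scenario file against the XSD schema,
            # returning a {file name: is_valid} map.
            schema = etree.XMLSchema(etree.parse(schema_file))
            results = {}
            for xml_path in sorted(Path(scenario_dir).glob("*.xml")):
                results[xml_path.name] = schema.validate(etree.parse(str(xml_path)))
            return results

        # Example: report scenario files that fail schema validation.
        # for name, ok in validate_scenarios("scenario.xsd", "scenarios").items():
        #     print(name, "OK" if ok else "INVALID")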

  12. Survey of computer programs for heat transfer analysis

    NASA Astrophysics Data System (ADS)

    Noor, A. K.

    An overview is presented of the current capabilities of thirty-eight computer programs that can be used for solution of heat transfer problems. These programs range from the large, general-purpose codes with a broad spectrum of capabilities, large user community, and comprehensive user support (e.g., ANSYS, MARC, MITAS II, MSC/NASTRAN, SESAM-69/NV-615) to the small, special-purpose codes with limited user community, such as ANDES, NNTB, SAHARA, SSPTA, TACO, TEPSA, and TRUMP. The capabilities of the programs surveyed are listed in tabular form, followed by a summary of the major features of each program. As with any survey of computer programs, the present one has the following limitations: (1) it is useful only in the initial selection of the programs which are most suitable for a particular application; the final selection of the program to be used should, however, be based on a detailed examination of the documentation and the literature about the program; (2) since computer software continually changes, often at a rapid rate, some means must be found for updating this survey and maintaining some degree of currency.

  13. Survey of computer programs for heat transfer analysis

    NASA Technical Reports Server (NTRS)

    Noor, A. K.

    1982-01-01

    An overview is presented of the current capabilities of thirty-eight computer programs that can be used for solution of heat transfer problems. These programs range from the large, general-purpose codes with a broad spectrum of capabilities, large user community, and comprehensive user support (e.g., ANSYS, MARC, MITAS II, MSC/NASTRAN, SESAM-69/NV-615) to the small, special-purpose codes with limited user community, such as ANDES, NNTB, SAHARA, SSPTA, TACO, TEPSA, and TRUMP. The capabilities of the programs surveyed are listed in tabular form, followed by a summary of the major features of each program. As with any survey of computer programs, the present one has the following limitations: (1) it is useful only in the initial selection of the programs which are most suitable for a particular application; the final selection of the program to be used should, however, be based on a detailed examination of the documentation and the literature about the program; (2) since computer software continually changes, often at a rapid rate, some means must be found for updating this survey and maintaining some degree of currency.

  14. Ground Operations Aerospace Language (GOAL). Volume 4: Interpretive code translator

    NASA Technical Reports Server (NTRS)

    1973-01-01

    This specification identifies and describes the principal functions and elements of the Interpretive Code Translator which has been developed for use with the GOAL Compiler. This translator enables the user to convert a compiled GOAL program to a highly general binary format which is designed to enable interpretive execution. The translator program provides user controls which are designed to enable the selection of various output types and formats. These controls provide a means for accommodating many of the implementation options which are discussed in the Interpretive Code Guideline document. The technical design approach is given. The relationship between the translator and the GOAL compiler is explained, and the principal functions performed by the Translator are described. Specific constraints regarding the use of the Translator are discussed. The control options are described. These options enable the user to select outputs to be generated by the translator and to control various aspects of the translation processing.

  15. Australian Aerodynamic Design Codes for Aerial Tow Bodies.

    DTIC Science & Technology

    1987-08-27

    …HTP-1, which deals with aerial targets, it was recognised that there was a need for a complete and well documented approach for their aerodynamic and… circular cables cannot be assessed with the programs in their present form… none of the programs are well documented and user's manuals are not…

  16. The Intelligent Program Editor: A Knowledge Based System for Supporting Program and Documentation Maintenance.

    DTIC Science & Technology

    1983-03-01

    As an illustration of this fact, the collection of tools designed to alleviate these maintenance costs for large systems typically… provide knowledge which other systems (such as the IP) can employ; … analyzes or modifies code; and a collection of semantic analysis and manipulation tools that… > (index *loops syntax-database) … In the second approach, the user identifies a collection of items… -> LOOPset1: [length 2]

  17. A computer simulation of the transient response of a 4 cylinder Stirling engine with burner and air preheater in a vehicle

    NASA Technical Reports Server (NTRS)

    Martini, W. R.

    1981-01-01

    A series of computer programs is presented, with full documentation, that simulates the transient behavior of a modern 4-cylinder Siemens-arrangement Stirling engine with burner and air preheater. Cold start, cranking, idling, acceleration through 3 gear changes, and steady-speed operation are simulated. Sample results and complete operating instructions are given. A full source code listing of all programs is included.

  18. Services provided by community pharmacies in Wayne County, Michigan: a comparison by ZIP code characteristics.

    PubMed

    Erickson, Steven R; Workman, Paul

    2014-01-01

    To document the availability of selected pharmacy services and out-of-pocket costs of medication throughout a diverse county in Michigan, and to assess possible associations between the availability of services and price of medication and the characteristics of residents of the ZIP codes in which the pharmacies were located. Cross-sectional telephone survey of pharmacies coupled with ZIP code-level census data. 503 pharmacies throughout the 63 ZIP codes of Wayne County, MI. The out-of-pocket cost for a 30 days' supply of levothyroxine 50 mcg and brand-name atorvastatin (Lipitor-Pfizer) 20 mg, availability of discount generic drug programs, home delivery of medications, hours of pharmacy operation, and availability of pharmacy-based immunization services. Census data aggregated at the ZIP code level included race, annual household income, age, and number of residents per pharmacy. The overall results per ZIP code showed that the average cost was $10.01 ± $2.29 for levothyroxine and $140.45 ± $14.70 for Lipitor. Per ZIP code, the mean (± SD) percentage of pharmacies offering discount generic drug programs was 66.9% ± 15.0%; home delivery of medications, 44.5% ± 22.7%; and immunization for influenza, 46.7% ± 24.3%. The mean (± SD) hours of operation per pharmacy per ZIP code was 67.0 ± 25.2. ZIP codes with higher household income, as well as a higher percentage of white residents, had lower levothyroxine prices, a greater percentage of pharmacies offering discount generic drug programs, more hours of operation per week, and more pharmacy-based immunization services. The cost of Lipitor was not associated with any ZIP code characteristic. Disparities in the cost of generic levothyroxine, the availability of services such as discount generic drug programs, hours of operation, and pharmacy-based immunization services are evident based on race and household income within this diverse metropolitan county.

  19. Documenting AUTOGEN and APGEN Model Files

    NASA Technical Reports Server (NTRS)

    Gladden, Roy E.; Khanampompan, Teerapat; Fisher, Forest W.; DelGuericio, Chris C.

    2008-01-01

    A computer program called "autogen hypertext map generator" satisfies a need for documenting, and assisting in visualization of and navigation through, model files used in the AUTOGEN and APGEN software mentioned in the two immediately preceding articles. This program parses autogen script files, autogen model files, PERL scripts, and apgen activity-definition files and produces a hypertext map of the files to aid in navigation of the model. The program also provides a facility for adding notes and descriptions beyond what is in the source model represented by the hypertext map. Further, it provides access to a summary of the model through variable, function, subroutine, activity, and resource declarations, as well as full access to the source model and source code. The tool enables easy access to the declarations and the ability to traverse routines and calls while analyzing the model.
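
    As a rough illustration of what such a hypertext map involves, the Python sketch below scans a source tree for declarations with a regular expression and emits an HTML index linking each name to its file. The declaration pattern and file handling are simplified assumptions; the actual tool parses autogen scripts, PERL scripts, and apgen activity definitions in their own syntaxes.

        import html
        import re
        from pathlib import Path

        # Simplified declaration pattern (an assumption, not the real grammar).
        DECL = re.compile(r"^\s*(sub|function|activity)\s+(\w+)", re.MULTILINE)

        def hypertext_map(src_dir, out_file="map.html"):
            # Emit a one-page HTML index of declarations found in the
            # source tree, each linked to the file that declares it.
            rows = []
            for path in sorted(Path(src_dir).rglob("*")):
                if not path.is_file():
                    continue
                text = path.read_text(errors="ignore")
                for kind, name in DECL.findall(text):
                    rows.append('<li>%s <a href="%s">%s</a></li>'
                                % (kind, html.escape(str(path)), html.escape(name)))
            Path(out_file).write_text("<ul>\n" + "\n".join(rows) + "\n</ul>\n")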

  20. SLS-SPEC-159 Cross-Program Design Specification for Natural Environments (DSNE) Revision D

    NASA Technical Reports Server (NTRS)

    Roberts, Barry C.

    2015-01-01

    This document is derived from the former National Aeronautics and Space Administration (NASA) Constellation Program (CxP) document CxP 70023, titled "The Design Specification for Natural Environments (DSNE), Revision C." The original document has been modified to represent updated Design Reference Missions (DRMs) for the NASA Exploration Systems Development (ESD) Programs. The DSNE completes environment-related specifications for architecture, system-level, and lower-tier documents by specifying the ranges of environmental conditions that must be accounted for by NASA ESD Programs. To assure clarity and consistency, and to prevent requirements documents from becoming cluttered with extensive amounts of technical material, natural environment specifications have been compiled into this document. The intent is to keep a unified specification for natural environments that each Program calls out for appropriate application. This document defines the natural environments parameter limits (maximum and minimum values, energy spectra, or precise model inputs, assumptions, model options, etc.), for all ESD Programs. These environments are developed by the NASA Marshall Space Flight Center (MSFC) Natural Environments Branch (MSFC organization code: EV44). Many of the parameter limits are based on experience with previous programs, such as the Space Shuttle Program. The parameter limits contain no margin and are meant to be evaluated individually to ensure they are reasonable (i.e., do not apply unrealistic extreme-on-extreme conditions). The natural environments specifications in this document should be accounted for by robust design of the flight vehicle and support systems. However, it is understood that in some cases the Programs will find it more effective to account for portions of the environment ranges by operational mitigation or acceptance of risk in accordance with an appropriate program risk management plan and/or hazard analysis process. The DSNE is not intended as a definition of operational models or operational constraints, nor is it adequate, alone, for ground facilities which may have additional requirements (for example, building codes and local environmental constraints). "Natural environments," as the term is used here, refers to the environments that are not the result of intended human activity or intervention. It consists of a variety of external environmental factors (most of natural origin and a few of human origin) which impose restrictions or otherwise impact the development or operation of flight vehicles and destination surface systems. These natural environments include the following types of environments: Terrestrial environments at launch, abort, and normal landing sites (winds, temperatures, pressures, surface roughness, sea conditions, etc.); Space environments (ionizing radiation, orbital debris, meteoroids, thermosphere density, plasma, solar, Earth, and lunar-emitted thermal radiation, etc.); Destination environments (Lunar surface and orbital, Mars atmosphere and surface, near Earth asteroids, etc.). Many of the environmental specifications in this document are based on models, data, and environment descriptions contained in the CxP 70044, Constellation Program Natural Environment Definition for Design (NEDD). The NEDD provides additional detailed environment data and model descriptions to support analytical studies for ESD Programs. 
For background information on specific environments and their effects on spacecraft design and operations, the environment models, and the data used to generate the specifications contained in the DSNE, the reader is referred to the NEDD paragraphs listed in each section of the DSNE. Also, most of the environmental specifications in this document are tied specifically to the ESD DRMs in ESD-10012, Revision B, Exploration Systems Development Concept of Operations (ConOps). Coordination between these environment specifications and the DRMs must be maintained. This document should be compatible with the current ESD DRMs, but updates to the mission definitions and variations in interpretation may require adjustments to the environment specifications.

  1. Investigating the Simulink Auto-Coding Process

    NASA Technical Reports Server (NTRS)

    Gualdoni, Matthew J.

    2016-01-01

    Model-based program design is the most clear and direct way to develop algorithms and programs for interfacing with hardware. While coding "by hand" results in a more tailored product, the ever-growing size and complexity of modern-day applications can cause the project workload to quickly become unreasonable for one programmer. This has generally been addressed by splitting the product into separate modules to allow multiple developers to work in parallel on the same project; however, this introduces new potential for errors in the process. The fluidity, reliability, and robustness of the code rely on the abilities of the programmers to communicate their methods to one another; furthermore, having multiple programmers invites multiple, potentially differing coding styles into the same product, which can cause a loss of readability or even module incompatibility. Fortunately, MathWorks has implemented an auto-coding feature that allows programmers to design their algorithms through the use of models and diagrams in the graphical programming environment Simulink, allowing the designer to visually determine what the hardware is to do. From here, the auto-coding feature handles converting the project into another programming language. This type of approach allows the designer to clearly see how the software will be directing the hardware without the need to interpret large amounts of code. In addition, it speeds up the programming process, minimizing the number of man-hours spent on a single project and thus reducing the chance of human error as well as project turnover time. One such project that has benefited from the auto-coding procedure is Ramses, a portion of the GNC flight software on board Orion that has been implemented primarily in Simulink. Currently, however, auto-coding Ramses into C++ requires 5 hours of code generation time. This causes issues if the tool ever needs to be debugged, as this code generation will need to occur with each edit to any part of the program; additionally, this is lost time that could be spent testing and analyzing the code. This is one of the more prominent issues with the auto-coding process, and while much information is available with regard to optimizing Simulink designs to produce efficient and reliable C++ code, not much research has been made public on how to reduce the code generation time. It is of interest to develop some insight as to what causes code generation times to be so significant, and to determine if there are architecture guidelines or a desirable auto-coding configuration set to assist in streamlining this step of the design process for particular applications. To address the issue at hand, the Simulink coder was studied at a foundational level. For each different component type made available by the software, the features, auto-code generation time, and format of the generated code were analyzed and documented. Tools were developed and documented to expedite these studies, particularly in the area of automating sequential builds to ensure accurate data were obtained. Next, the Ramses model was examined in an attempt to determine its composition and the types of technologies used in the model. This enabled the development of a model that uses similar technologies but takes a fraction of the time to auto-code, reducing the turnaround time for experimentation. Lastly, the model was used to run a wide array of experiments and collect data to obtain knowledge about where to search for bottlenecks in the Ramses model.
The resulting contributions of the overall effort consist of an experimental model for further investigation into the subject, several automation tools to assist in analyzing the model, and a reference document offering insight into the auto-coding process, including documentation of the tools used in the model analysis, data illustrating some potential problem areas in the auto-coding process, and recommendations on areas or practices in the current Ramses model that should be further investigated. Several skills had to be built up over the course of the internship project. First and foremost, my Simulink skills have improved drastically, as much of my experience had been modeling electronic circuits as opposed to software models. Furthermore, I am now comfortable working with the Simulink auto-coder, a tool I had never used until this summer; this tool also tested my critical thinking and C++ knowledge, as I had to interpret the C++ code it was generating and attempt to understand how the Simulink model affected the generated code. I had come into the internship with a solid understanding of Matlab code, but had done very little in using it to automate tasks, particularly Simulink tasks; along the same lines, I had rarely used shell scripts to automate and interface with programs, which I gained a fair amount of experience with this summer, including how to use regular expressions. Lastly, soft skills are an area everyone can continuously improve on; having never worked with NASA engineers, who to me seem to be a completely different breed than what I am used to (commercial electronics engineers), I learned to utilize the wealth of knowledge present at JSC. I wish I had come into the internship knowing exactly how helpful everyone in my branch would be, as I would have picked up on this sooner. I hope that having gained such a strong foundation in Simulink over this summer will open the opportunity to return to work on this project, or potentially other opportunities within the division. The idea of leaving a project I devoted ten weeks to is a hard one to cope with, so having the chance to pick up where I left off sounds appealing; alternatively, I am interested to see if there are any openings in the future that would allow me to work on a project that is more in line with my research in estimation algorithms. Regardless, this summer has been a milestone in my professional career, and I hope it has started a long-term relationship between JSC and myself. I really enjoy the thought of building on my experience here over future summers while I work to complete my PhD at Missouri University of Science and Technology.
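
    One piece of the tooling described above, automating sequential builds and timing each code generation, can be sketched as follows in Python. The sketch assumes a MATLAB installation (R2019a or later, for the -batch flag) with Simulink Coder on the PATH and uses the standard slbuild command; the model name is a placeholder, not the actual Ramses model.

        import subprocess
        import time

        def time_codegen(model, runs=3):
            # Invoke Simulink code generation repeatedly in batch mode
            # and record the wall-clock time of each build.
            times = []
            for _ in range(runs):
                start = time.perf_counter()
                subprocess.run(["matlab", "-batch", "slbuild('%s')" % model],
                               check=True)
                times.append(time.perf_counter() - start)
            return times

        # Example (hypothetical model name):
        # import statistics
        # t = time_codegen("ramses_like_model")
        # print("mean build time: %.1f s" % statistics.mean(t))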

  2. A Guide to Axial-Flow Turbine Off-Design Computer Program AXOD2

    NASA Technical Reports Server (NTRS)

    Chen, Shu-Cheng S.

    2014-01-01

    A User's Guide for the axial-flow turbine off-design computer program AXOD2 is presented in this paper. This User's Guide is supplementary to the original User's Manual of AXOD. Three notable contributions of AXOD2 relative to its predecessor AXOD, both in the context of the Guide and in the functionality of the code, are described and discussed at length. These are: 1) a rational representation of the mathematical principles applied, with concise descriptions of the formulas implemented in the actual coding and a discussion of their physical implications; 2) the creation and documentation of an addendum listing of input namelist parameters unique to AXOD2, which differ from or are in addition to the original input namelists given in the Manual of AXOD, together with their usage; and 3) the institution of proper stoppages of code execution, adding termination and error messages to AXOD2. These measures safeguard the integrity of the code execution, so that a failure mode encountered during a case study will not plunge the code execution into an indefinite loop or cause a blow-out of the program execution. Details of these are discussed and illustrated in this paper. Moreover, this computer program has since been substantially reconstructed. Standard FORTRAN language was instituted, and the code was formatted in double precision (REAL*8). As a result, the code is now suited for use in a local desktop computer environment, is perfectly portable to any operating system, and can be executed by any compiler equivalent to a FORTRAN 90/95 compiler. AXOD2 will be available through the NASA Glenn Research Center (GRC) Software Repository.

  3. Technical Support Document for Version 3.6.1 of the COMcheck Software

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bartlett, Rosemarie; Connell, Linda M.; Gowri, Krishnan

    2009-09-29

    This technical support document (TSD) is designed to explain the technical basis for the COMcheck software as originally developed based on the ANSI/ASHRAE/IES Standard 90.1-1989 (Standard 90.1-1989). Documentation for other national model codes and standards and specific state energy codes supported in COMcheck has been added to this report as appendices. These appendices are intended to provide technical documentation for features specific to the supported codes and for any changes made for state-specific codes that differ from the standard features that support compliance with the national model codes and standards.

  4. Patient safety principles in family medicine residency accreditation standards and curriculum objectives

    PubMed Central

    Kassam, Aliya; Sharma, Nishan; Harvie, Margot; O’Beirne, Maeve; Topps, Maureen

    2016-01-01

    Abstract Objective To conduct a thematic analysis of the College of Family Physicians of Canada’s (CFPC’s) Red Book accreditation standards and the Triple C Competency-based Curriculum objectives with respect to patient safety principles. Design Thematic content analysis of the CFPC’s Red Book accreditation standards and the Triple C curriculum. Setting Canada. Main outcome measures Coding frequency of the patient safety principles (ie, patient engagement; respectful, transparent relationships; complex systems; a just and trusting culture; responsibility and accountability for actions; and continuous learning and improvement) found in the analyzed CFPC documents. Results Within the analyzed CFPC documents, the most commonly found patient safety principle was patient engagement (n = 51 coding references); the least commonly found patient safety principles were a just and trusting culture (n = 5 coding references) and complex systems (n = 5 coding references). Other patient safety principles that were uncommon included responsibility and accountability for actions (n = 7 coding references) and continuous learning and improvement (n = 12 coding references). Conclusion Explicit inclusion of patient safety content such as the use of patient safety principles is needed for residency training programs across Canada to ensure the full spectrum of care is addressed, from community-based care to acute hospital-based care. This will ensure a patient safety culture can be cultivated from residency and sustained into primary care practice. PMID:27965349

  5. An arbitrary grid CFD algorithm for configuration aerodynamics analysis. Volume 1: Theory and validations

    NASA Technical Reports Server (NTRS)

    Baker, A. J.; Iannelli, G. S.; Manhardt, Paul D.; Orzechowski, J. A.

    1993-01-01

    This report documents the user input and output data requirements for the FEMNAS finite element Navier-Stokes code for real-gas simulations of external aerodynamics flowfields. This code was developed for the configuration aerodynamics branch of NASA ARC, under SBIR Phase 2 contract NAS2-124568 by Computational Mechanics Corporation (COMCO). This report is in two volumes. Volume 1 contains the theory for the derived finite element algorithm and describes the test cases used to validate the computer program described in the Volume 2 user guide.

  6. FEMFLOW3D; a finite-element program for the simulation of three-dimensional aquifers; version 1.0

    USGS Publications Warehouse

    Durbin, Timothy J.; Bond, Linda D.

    1998-01-01

    This document also includes model validation, source code, and example input and output files. Model validation was performed using four test problems. For each test problem, the results of a model simulation with FEMFLOW3D were compared with either an analytic solution or the results of an independent numerical approach. The source code, written in the ANSI x3.9-1978 FORTRAN standard, and the complete input and output of an example problem are listed in the appendixes.
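
    The validation step described above, comparing a simulation against an analytic solution, reduces in practice to an error metric over the two solution fields. A generic Python sketch of such a check (with placeholder arrays, not FEMFLOW3D output) might look like this:

        import numpy as np

        def max_relative_error(simulated, analytic, floor=1e-12):
            # Worst-case relative deviation of simulated values (e.g.,
            # heads) from an analytic solution; 'floor' guards against
            # division by near-zero analytic values.
            simulated = np.asarray(simulated, dtype=float)
            analytic = np.asarray(analytic, dtype=float)
            denom = np.maximum(np.abs(analytic), floor)
            return float(np.max(np.abs(simulated - analytic) / denom))

        # Example with placeholder data:
        # err = max_relative_error(model_heads, analytic_heads)
        # print("max relative error: %.2e" % err)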

  7. An arbitrary grid CFD algorithm for configuration aerodynamics analysis. Volume 2: FEMNAS user guide

    NASA Technical Reports Server (NTRS)

    Manhardt, Paul D.; Orzechowski, J. A.; Baker, A. J.

    1992-01-01

    This report documents the user input and output data requirements for the FEMNAS finite element Navier-Stokes code for real-gas simulations of external aerodynamics flowfields. This code was developed for the configuration aerodynamics branch of NASA ARC, under SBIR Phase 2 contract NAS2-124568 by Computational Mechanics Corporation (COMCO). This report is in two volumes. Volume 1 contains the theory for the derived finite element algorithm and describes the test cases used to validate the computer program described in the Volume 2 user guide.

  8. Construction of Hierarchical Models for Fluid Dynamics in Earth and Planetary Sciences : DCMODEL project

    NASA Astrophysics Data System (ADS)

    Takahashi, Y. O.; Takehiro, S.; Sugiyama, K.; Odaka, M.; Ishiwatari, M.; Sasaki, Y.; Nishizawa, S.; Ishioka, K.; Nakajima, K.; Hayashi, Y.

    2012-12-01

    Toward understanding the fluid motions of planetary atmospheres and planetary interiors by performing multiple numerical experiments with multiple models, we are now proceeding with the "dcmodel project", in which a series of hierarchical numerical models of various complexity is developed and maintained. In the dcmodel project, the numerical models are developed with attention to the following points: 1) a common "style" of program code assuring readability of the software, 2) open-sourcing of the model codes to the public, 3) scalability of the models assuring execution on various scales of computational resources, and 4) stress on the importance of documentation, with a method for writing reference manuals. The lineup of the models and utility programs of the project is as follows: Gtool5, ISPACK/SPML, SPMODEL, Deepconv, Dcpam, and Rdoc-f95. In the following, the features of each component are briefly described. Gtool5 (Ishiwatari et al., 2012) is a Fortran90 library which provides data input/output interfaces and various utilities commonly used in the models of the dcmodel project. The self-descriptive data format netCDF is adopted as the IO format of Gtool5. The interfaces of the gtool5 library can reduce the number of operation steps for data IO in the program code of the models compared with the interfaces of the raw netCDF library. Further, by use of the gtool5 library, procedures for data IO and the addition of metadata for post-processing can be easily implemented in the program codes in a consolidated form, independent of the size and complexity of the models. ISPACK is a spectral transformation library, and SPML (the SPMODEL library) (Takehiro et al., 2006) is its wrapper library. The most prominent feature of SPML is a series of array-handling functions with systematic function-naming rules, which enables us to write code in a form easily deduced from the mathematical expressions of the governing equations. SPMODEL (Takehiro et al., 2006) is a collection of various sample programs using SPML. These sample programs provide the base kit for simple numerical experiments in geophysical fluid dynamics. For example, SPMODEL includes a 1-dimensional KdV equation model; 2-dimensional barotropic, shallow-water, and Boussinesq models; and 3-dimensional MHD dynamo models in rotating spherical shells. These models are written in a common style in harmony with the SPML functions. Deepconv (Sugiyama et al., 2010) and Dcpam are a cloud-resolving model and a general circulation model, respectively, for application to planetary atmospheres. Deepconv includes several physical processes appropriate for simulations of the Jupiter and Mars atmospheres, while Dcpam does the same for simulations of Earth, Mars, and Venus-like atmospheres. Rdoc-f95 is an automatic generator of reference manuals for Fortran90/95 programs, an extension of the ruby documentation toolkit "rdoc". It analyzes the dependencies of modules, functions, and subroutines across multiple program source files. At the same time, it can list the namelist variables in the programs.
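
    The SPML style described above, code whose lines read like the governing equations, can be suggested with a numpy analogue. The sketch below writes the tendency of a 1-D KdV-type equation, du/dt = -u du/dx - d3u/dx3 (one of the SPMODEL sample problems, coefficients illustrative), using a spectral derivative operator; it is plain Python/numpy for illustration, not SPML's Fortran90 syntax.

        import numpy as np

        # Periodic grid on [0, 2*pi) with integer wavenumbers.
        N = 128
        x = np.linspace(0.0, 2.0 * np.pi, N, endpoint=False)
        ik = 1j * np.fft.fftfreq(N, d=1.0 / N)  # spectral d/dx operator

        def dx(u, order=1):
            # n-th spectral derivative of a periodic field.
            return np.real(np.fft.ifft((ik ** order) * np.fft.fft(u)))

        # Tendency written to mirror the equation itself:
        #   du/dt = -u du/dx - d3u/dx3
        u = np.sin(x)
        dudt = -u * dx(u) - dx(u, order=3)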

  9. ATLAS, an integrated structural analysis and design system. Volume 2: System design document

    NASA Technical Reports Server (NTRS)

    Erickson, W. J. (Editor)

    1979-01-01

    ATLAS is a structural analysis and design system, operational on the Control Data Corporation 6600/CYBER computers. The overall system design, the design of the individual program modules, and the routines in the ATLAS system library are described. The overall design is discussed in terms of system architecture, executive function, data base structure, user program interfaces and operational procedures. The program module sections include detailed code description, common block usage and random access file usage. The description of the ATLAS program library includes all information needed to use these general purpose routines.

  10. Let's go: Early universe 2. Primordial nucleosynthesis the computer way

    NASA Technical Reports Server (NTRS)

    Kawano, Lawrence

    1992-01-01

    This is a revised description and manual for the primordial nucleosynthesis program, NUC123, an updated and modified version of the code of R.V. Wagoner. NUC123 has undergone a number of changes, further enhancing its documentation and ease of use. Presented here is a guide to its use, followed by a series of appendices containing specific details such as a summary of the basic structure of the program, a description of the computational algorithm, and a presentation of the theory incorporated into the program.

  11. Integrated Electronic Warfare System Advanced Development Model (ADM); Appendix 1 - Functional Requirement Specification.

    DTIC Science & Technology

    1977-10-01

    Only fragments of this scanned specification are recoverable: a revision block (writer J. Kolanek, 2/6/76), a contents listing of documentation deliverables including a design document (CDBDD), a Computer Program Package (CPP), a Computer Program Operator's Manual (CPOM), and a Computer Program Test Plan (CPTPL), and a list of figures including an IEWS simplified block diagram and a system controller architecture diagram.

  12. The use of NUREGs 1199 and 1200 in the Illinois LLW licensing program

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Klinger, J.G.; Harmon, D.F.

    1991-12-31

    This paper will describe how the LLW licensing staff of the Illinois Department of Nuclear Safety used NRC's NUREG 1199, NUREG 1200, NUREG 1300 and Regulatory Guide 4.18 in its licensing program for reviewing and evaluating a LLW disposal facility license application. The paper will discuss how Illinois guidance documents were prepared based on modifications made to these NRC documents which were necessary to take into account site and facility specific considerations, as well as changes required by Illinois statutes and regulatory codes. The paper will review the recent revisions (January 1991) to NUREG 1199 and 1200 and the importance of these revisions. The paper will also discuss recommended modifications to these NRC documents and provide an update on the status of the Department's review and evaluation of an application for a license to site, construct and operate a LLW disposal facility in Illinois.

  13. PROTEUS two-dimensional Navier-Stokes computer code, version 1.0. Volume 1: Analysis description

    NASA Technical Reports Server (NTRS)

    Towne, Charles E.; Schwab, John R.; Benson, Thomas J.; Suresh, Ambady

    1990-01-01

    A new computer code was developed to solve the two-dimensional or axisymmetric, Reynolds averaged, unsteady compressible Navier-Stokes equations in strong conservation law form. The thin-layer or Euler equations may also be solved. Turbulence is modeled using an algebraic eddy viscosity model. The objective was to develop a code for aerospace applications that is easy to use and easy to modify. Code readability, modularity, and documentation were emphasized. The equations are written in nonorthogonal body-fitted coordinates, and solved by marching in time using a fully-coupled alternating direction-implicit procedure with generalized first- or second-order time differencing. All terms are linearized using second-order Taylor series. The boundary conditions are treated implicitly, and may be steady, unsteady, or spatially periodic. Simple Cartesian or polar grids may be generated internally by the program. More complex geometries require an externally generated computational coordinate system. The documentation is divided into three volumes. Volume 1 is the Analysis Description, and describes in detail the governing equations, the turbulence model, the linearization of the equations and boundary conditions, the time and space differencing formulas, the ADI solution procedure, and the artificial viscosity models.
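
    For reference, the second-order time linearization named in the abstract is the standard device of implicit ADI schemes: expand each flux about the known time level so the unknown appears linearly. In generic notation (the abstract's exact symbols are defined in Volume 1):

        % Linearization of a flux vector F(Q) about time level n:
        F^{n+1} \approx F^{n}
          + \left(\frac{\partial F}{\partial Q}\right)^{\!n} \Delta Q^{n}
          + O(\Delta t^{2}),
        \qquad \Delta Q^{n} = Q^{n+1} - Q^{n}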

  14. Aeroacoustic Analysis of Turbofan Noise Generation

    NASA Technical Reports Server (NTRS)

    Meyer, Harold D.; Envia, Edmane

    1996-01-01

    This report provides an updated version of the analytical documentation for the V072 Rotor Wake/Stator Interaction Code. It presents the theoretical derivation of the equations used in the code and, where necessary, documents the enhancements and changes made to the original code since its first release. V072 is a package of FORTRAN computer programs which calculate the in-duct acoustic modes excited by a fan/stator stage operating in a subsonic mean flow. Sound is generated by the stator vanes interacting with the mean wakes of the rotor blades. In this updated version, only the tonal noise produced at the blade passing frequency and its harmonics is described. The broadband noise component analysis, which was part of the original report, is not included here. The code provides outputs of modal pressure and power amplitudes generated by the rotor-wake/stator interaction. The rotor/stator stage is modeled as an ensemble of blades and vanes of zero camber and thickness enclosed within an infinite hard-walled annular duct. The amplitude of each propagating mode is computed and summed to obtain the harmonics of sound power flux within the duct for both upstream and downstream propagating modes.

  15. Survey of computer programs for heat transfer analysis

    NASA Technical Reports Server (NTRS)

    Noor, Ahmed K.

    1986-01-01

    An overview is given of the current capabilities of thirty-three computer programs that are used to solve heat transfer problems. The programs considered range from large general-purpose codes with broad spectrum of capabilities, large user community, and comprehensive user support (e.g., ABAQUS, ANSYS, EAL, MARC, MITAS II, MSC/NASTRAN, and SAMCEF) to the small, special-purpose codes with limited user community such as ANDES, NTEMP, TAC2D, TAC3D, TEPSA and TRUMP. The majority of the programs use either finite elements or finite differences for the spatial discretization. The capabilities of the programs are listed in tabular form followed by a summary of the major features of each program. The information presented herein is based on a questionnaire sent to the developers of each program. This information is preceded by a brief background material needed for effective evaluation and use of computer programs for heat transfer analysis. The present survey is useful in the initial selection of the programs which are most suitable for a particular application. The final selection of the program to be used should, however, be based on a detailed examination of the documentation and the literature about the program.

  16. Terminal Area Simulation System User's Guide - Version 10.0

    NASA Technical Reports Server (NTRS)

    Switzer, George F.; Proctor, Fred H.

    2014-01-01

    The Terminal Area Simulation System (TASS) is a three-dimensional, time-dependent, large eddy simulation model that has been developed for studies of wake vortex and weather hazards to aviation, along with other atmospheric turbulence and cloud-scale weather phenomenology. This document describes the source code for TASS version 10.0 and provides users with the documentation needed to run the model. The source code is programmed in the Fortran language and is formulated to take advantage of vectorization and efficient multi-processor scaling for execution on massively parallel supercomputer clusters. The code contains different initialization modules allowing the study of aircraft wake vortex interaction with the atmosphere and ground, atmospheric turbulence, atmospheric boundary layers, precipitating convective clouds, hail storms, gust fronts, microburst windshear, supercell and mesoscale convective systems, tornadic storms, and ring vortices. The model is able to operate in either two or three dimensions, with equations numerically formulated on a Cartesian grid. The primary output from TASS is time-dependent domain fields generated by the prognostic equations and diagnosed variables. This document will enable a user to understand the general logic of TASS and will show how to configure and initialize the model domain. Also described are the formats of the input and output files, as well as the parameters that control the input and output.

  17. GSC configuration management plan

    NASA Technical Reports Server (NTRS)

    Withers, B. Edward

    1990-01-01

    The tools and methods used for the configuration management of the artifacts (including software and documentation) associated with the Guidance and Control Software (GCS) project are described. The GCS project is part of a software error studies research program. Three implementations of GCS are being produced in order to study the fundamental characteristics of the software failure process. The Code Management System (CMS) is used to track and retrieve versions of the documentation and software. Application of the CMS for this project is described and the numbering scheme is delineated for the versions of the project artifacts.

  18. Integrated protocol for reliable and fast quantification and documentation of electrophoresis gels.

    PubMed

    Rehbein, Peter; Schwalbe, Harald

    2015-06-01

    Quantitative analysis of electrophoresis gels is an important part in molecular cloning, as well as in protein expression and purification. Parallel quantifications in yield and purity can be most conveniently obtained from densitometric analysis. This communication reports a comprehensive, reliable and simple protocol for gel quantification and documentation, applicable for single samples and with special features for protein expression screens. As major component of the protocol, the fully annotated code of a proprietary open source computer program for semi-automatic densitometric quantification of digitized electrophoresis gels is disclosed. The program ("GelQuant") is implemented for the C-based macro-language of the widespread integrated development environment of IGOR Pro. Copyright © 2014 Elsevier Inc. All rights reserved.
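
    Densitometric quantification of the kind GelQuant performs amounts to integrating background-subtracted pixel intensity over a band region of the digitized gel. A minimal numpy analogue is sketched below; GelQuant itself is implemented in IGOR Pro's macro language, and the band coordinates here are placeholders.

        import numpy as np

        def band_volume(gel, rows, cols):
            # Integrated, background-subtracted intensity of one band in
            # a 2-D gel image (higher value = darker). Background is
            # estimated from the median of the band's border pixels.
            band = gel[rows[0]:rows[1], cols[0]:cols[1]].astype(float)
            border = np.concatenate([band[0, :], band[-1, :],
                                     band[:, 0], band[:, -1]])
            return float(np.sum(band - np.median(border)))

        # Relative yield of two bands in a lane (placeholder coordinates):
        # v1 = band_volume(img, (100, 140), (30, 60))
        # v2 = band_volume(img, (210, 250), (30, 60))
        # print("band 1 fraction: %.2f" % (v1 / (v1 + v2)))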

  19. Code Verification Capabilities and Assessments in Support of ASC V&V Level 2 Milestone #6035

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Doebling, Scott William; Budzien, Joanne Louise; Ferguson, Jim Michael

    This document provides a summary of the code verification activities supporting the FY17 Level 2 V&V milestone entitled “Deliver a Capability for V&V Assessments of Code Implementations of Physics Models and Numerical Algorithms in Support of Future Predictive Capability Framework Pegposts.” The physics validation activities supporting this milestone are documented separately. The objectives of this portion of the milestone are: 1) Develop software tools to support code verification analysis; 2) Document standard definitions of code verification test problems; and 3) Perform code verification assessments (focusing on error behavior of algorithms). This report and a set of additional standalone documents serve as the compilation of results demonstrating accomplishment of these objectives.

  20. Pressure Safety Program Implementation at ORNL

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lower, Mark; Etheridge, Tom; Oland, C. Barry

    2013-01-01

    The Oak Ridge National Laboratory (ORNL) is a US Department of Energy (DOE) facility that is managed by UT-Battelle, LLC. In February 2006, DOE promulgated worker safety and health regulations to govern contractor activities at DOE sites. These regulations, which are provided in 10 CFR 851, Worker Safety and Health Program, establish requirements for worker safety and health programs that reduce or prevent occupational injuries, illnesses, and accidental losses by providing DOE contractors and their workers with safe and healthful workplaces at DOE sites. The regulations state that contractors must achieve compliance no later than May 25, 2007. According to 10 CFR 851, Subpart C, Specific Program Requirements, contractors must have a structured approach to their worker safety and health programs that at a minimum includes provisions for pressure safety. In implementing the structured approach for pressure safety, contractors must establish safety policies and procedures to ensure that pressure systems are designed, fabricated, tested, inspected, maintained, repaired, and operated by trained, qualified personnel in accordance with applicable sound engineering principles. In addition, contractors must ensure that all pressure vessels, boilers, air receivers, and supporting piping systems conform to (1) applicable American Society of Mechanical Engineers (ASME) Boiler and Pressure Vessel Code (2004) Sections I through XII, including applicable code cases; (2) applicable ASME B31 piping codes; and (3) the strictest applicable state and local codes. When national consensus codes are not applicable because of pressure range, vessel geometry, use of special materials, etc., contractors must implement measures to provide equivalent protection and ensure a level of safety greater than or equal to the level of protection afforded by the ASME or applicable state or local codes. This report documents the work performed to address legacy pressure vessel deficiencies and comply with pressure safety requirements in 10 CFR 851. It also describes actions taken to develop and implement ORNL's Pressure Safety Program.

  1. NASA Lewis Steady-State Heat Pipe Code Architecture

    NASA Technical Reports Server (NTRS)

    Mi, Ye; Tower, Leonard K.

    2013-01-01

    NASA Glenn Research Center (GRC) has developed the LERCHP code, a PC-based program that predicts the steady-state performance of heat pipes, including the determination of the operating temperature and the operating limits which might be encountered under specified conditions. The code contains a vapor flow algorithm which incorporates vapor compressibility and axially varying heat input. For the liquid flow in the wick, Darcy's formula is employed. Thermal boundary conditions and geometric structures can be defined through an interactive input interface. A variety of fluid and material options, as well as user-defined options, can be chosen for the working fluid, wick, and pipe materials. This report documents the current effort at GRC to update the LERCHP code for operation in a Microsoft Windows (Microsoft Corporation) environment. A detailed analysis of the model is presented. The programming architecture for the numerical calculations is explained, and flowcharts of the key subroutines are given.
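
    Darcy's formula mentioned above relates the liquid pressure drop in the porous wick to the flow rate. A minimal Python sketch of that single relation (LERCHP itself is not written in Python; the function name and argument list are illustrative):

        def darcy_pressure_drop(mu, length, m_dot, rho, permeability, area):
            # Darcy's law for liquid flow through a porous wick:
            # dP = mu * L * Q / (K * A), with volumetric flow Q = m_dot / rho.
            q = m_dot / rho
            return mu * length * q / (permeability * area)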

  2. 43 CFR 11.64 - Injury determination phase-testing and sampling methods.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    .... In developing these objectives, the availability of information from response actions relating to the...), test cases proving the code works, and any alteration of previously documented code made to adapt the... computer code (if any), test cases proving the code works, and any alteration of previously documented code...

  3. 43 CFR 11.64 - Injury determination phase-testing and sampling methods.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    .... In developing these objectives, the availability of information from response actions relating to the...), test cases proving the code works, and any alteration of previously documented code made to adapt the... computer code (if any), test cases proving the code works, and any alteration of previously documented code...

  4. 43 CFR 11.64 - Injury determination phase-testing and sampling methods.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    .... In developing these objectives, the availability of information from response actions relating to the...), test cases proving the code works, and any alteration of previously documented code made to adapt the... computer code (if any), test cases proving the code works, and any alteration of previously documented code...

  5. 2DRMP: A suite of two-dimensional R-matrix propagation codes

    NASA Astrophysics Data System (ADS)

    Scott, N. S.; Scott, M. P.; Burke, P. G.; Stitt, T.; Faro-Maza, V.; Denis, C.; Maniopoulou, A.

    2009-12-01

    The R-matrix method has proved to be a remarkably stable, robust and efficient technique for solving the close-coupling equations that arise in electron and photon collisions with atoms, ions and molecules. During the last thirty-four years a series of related R-matrix program packages have been published periodically in CPC. These packages are primarily concerned with low-energy scattering where the incident energy is insufficient to ionise the target. In this paper we describe 2DRMP, a suite of two-dimensional R-matrix propagation programs aimed at creating virtual experiments on high performance and grid architectures to enable the study of electron scattering from H-like atoms and ions at intermediate energies.
    Program summary
    Program title: 2DRMP
    Catalogue identifier: AEEA_v1_0
    Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AEEA_v1_0.html
    Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland
    Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html
    No. of lines in distributed program, including test data, etc.: 196 717
    No. of bytes in distributed program, including test data, etc.: 3 819 727
    Distribution format: tar.gz
    Programming language: Fortran 95, MPI
    Computer: Tested on CRAY XT4 [1]; IBM eServer 575 [2]; Itanium II cluster [3]
    Operating system: Tested on UNICOS/lc [1]; IBM AIX [2]; Red Hat Linux Enterprise AS [3]
    Has the code been vectorised or parallelised?: Yes. 16 cores were used for the small test run
    Classification: 2.4
    External routines: BLAS, LAPACK, PBLAS, ScaLAPACK
    Subprograms used: ADAZ_v1_1
    Nature of problem: 2DRMP is a suite of programs aimed at creating virtual experiments on high performance architectures to enable the study of electron scattering from H-like atoms and ions at intermediate energies.
    Solution method: Two-dimensional R-matrix propagation theory. The (r1,r2) space of the internal region is subdivided into a number of subregions. Local R-matrices are constructed within each subregion and used to propagate a global R-matrix, ℜ, across the internal region. On the boundary of the internal region ℜ is transformed onto the IERM target state basis. Thus, the two-dimensional R-matrix propagation technique transforms an intractable problem into a series of tractable problems, enabling the internal region to be extended far beyond what is possible with the standard one-sector codes. A distinctive feature of the method is that both electrons are treated identically and the R-matrix basis states are constructed to allow both electrons to be in the continuum. The subregion size is flexible and can be adjusted to accommodate the number of cores available.
    Restrictions: The implementation is currently restricted to electron scattering from H-like atoms and ions.
    Additional comments: The programs have been designed to operate on serial computers and to exploit the distributed memory parallelism found on tightly coupled high performance clusters and supercomputers. 2DRMP has been systematically and comprehensively documented using ROBODoc [4], an API documentation tool that works by extracting specially formatted headers from the program source code and writing them to documentation files.
    Running time: The wall clock running time for the small test run using 16 cores and performed on [3] is as follows: bp (7 s); rint2 (34 s); newrd (32 s); diag (21 s); amps (11 s); prop (24 s).
    References:
    [1] HECToR, CRAY XT4 running UNICOS/lc, http://www.hector.ac.uk/, accessed 22 July, 2009.
    [2] HPCx, IBM eServer 575 running IBM AIX, http://www.hpcx.ac.uk/, accessed 22 July, 2009.
    [3] HP Cluster, Itanium II cluster running Red Hat Linux Enterprise AS, Queen's University Belfast, http://www.qub.ac.uk/directorates/InformationServices/Research/HighPerformanceComputing/Services/Hardware/HPResearch/, accessed 22 July, 2009.
    [4] Automating Software Documentation with ROBODoc, http://www.xs4all.nl/~rfsber/Robo/, accessed 22 July, 2009.

  6. 75 FR 3212 - Office of Elementary and Secondary Education; College Assistance Migrant Program (CAMP)

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-01-20

    ... Federal Register. Free Internet access to the official edition of the Federal Register and the Code of... (FRS), toll free, at 1-800-877-8339. Accessible Format: Individuals with disabilities can obtain this... published in the Federal Register, in text or Adobe Portable Document Format (PDF) on the Internet at the...

  7. 77 FR 27756 - Applications for New Awards; Promise Neighborhoods Program-Planning Grant Competition, Correction

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-05-11

    .... Free Internet access to the official edition of the Federal Register and the Code of Federal...) or a text telephone (TTY), call the Federal Relay Service (FRS), toll free, at 1- 800-877-8339... free at the site. You may also access documents of the Department published in the Federal Register by...

  8. 78 FR 22869 - Credit Enhancement for Charter School Facilities Program

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-04-17

    ... Register. Free Internet access to the official edition of the Federal Register and the Code of Federal... deaf (TDD) or a text telephone (TTY), you may call the Federal Relay Service (FRS), toll free, at 1-800... free at the site. You may also access documents of the Department published in the Federal Register by...

  9. 75 FR 3216 - Office of Elementary and Secondary Education; High School Equivalency Program (HEP)

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-01-20

    ... Federal Register. Free Internet access to the official edition of the Federal Register and the Code of... (FRS), toll free, at 1-800-877-8339. Accessible Format: Individuals with disabilities can obtain this... published in the Federal Register, in text or Adobe Portable Document Format (PDF) on the Internet at the...

  10. International Standard Classification of Education (ISCED) Three Stage Classification System: 1973; Part 2 - Definitions.

    ERIC Educational Resources Information Center

    United Nations Educational, Scientific, and Cultural Organization, Paris (France).

    The seven levels of education, as classified numerically by International Standard Classification of Education (ISCED), are defined along with courses, programs, and fields of education listed under each level. Also contained is an alphabetical subject index indicating appropriate code numbers. For related documents see TM003535 and TM003536. (RC)

  11. 19 CFR 122.49b - Electronic manifest requirement for crew members and non-crew members onboard commercial aircraft...

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... Name Record locator, if available; (xvi) International Air Transport Association (IATA) code of foreign... HOMELAND SECURITY; DEPARTMENT OF THE TREASURY AIR COMMERCE REGULATIONS Aircraft Entry and Entry Documents...” includes each entity that is an “aircraft operator” or “foreign air carrier” with a security program under...

  12. Strength and Comprehensiveness of District School Wellness Policies Predict Policy Implementation at the School Level

    ERIC Educational Resources Information Center

    Schwartz, Marlene B.; Henderson, Kathryn E.; Falbe, Jennifer; Novak, Sarah A.; Wharton, Christopher M.; Long, Michael W.; O'Connell, Meghan L.; Fiore, Susan S.

    2012-01-01

    Background: In 2006, all local education agencies in the United States participating in federal school meal programs were required to establish school wellness policies. This study documented the strength and comprehensiveness of 1 state's written district policies using a coding tool, and tested whether these traits predicted school-level…

  13. Users manual for flight control design programs

    NASA Technical Reports Server (NTRS)

    Nalbandian, J. Y.

    1975-01-01

    Computer programs for the design of analog and digital flight control systems are documented. The program DIGADAPT uses linear-quadratic-Gaussian synthesis algorithms in the design of command response controllers and state estimators, and it applies covariance propagation analysis to the selection of sampling intervals for digital systems. The program SCHED executes correlation and regression analyses for the development of gain and trim schedules to be used in open-loop explicit-adaptive control laws. A linear time-varying simulation of aircraft motions is provided by the program TVHIS, which includes guidance and control logic, as well as models for control actuator dynamics. The programs are coded in FORTRAN and are compiled and executed on both IBM and CDC computers.

  14. Development of Innovative Design Processor

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Park, Y.S.; Park, C.O.

    2004-07-01

    The nuclear design analysis requires time-consuming and error-prone model-input preparation, code runs, output analysis, and quality assurance processes. To reduce human effort and improve design quality and productivity, the Innovative Design Processor (IDP) is being developed. The two basic principles of IDP are document-oriented design and web-based design. Document-oriented design means that if the designer writes a design document, called an active document, and feeds it to a special program, the final document with the complete analysis, tables and plots is produced automatically. The active documents can be written with ordinary HTML editors or created automatically on the web, which is another framework of IDP. Using a proper mix of server-side and client-side programming under the LAMP (Linux/Apache/MySQL/PHP) environment, the design process on the web is modeled in a design-wizard style so that even a novice designer can produce the design document easily. This automation using the IDP is now being implemented for all reload designs of Korea Standard Nuclear Power Plant (KSNP) type PWRs. The introduction of this process will allow a large reduction in all KSNP reload design efforts and provide a platform for design and R and D tasks of KNFC. (authors)
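
    The "active document" idea can be made concrete with a small sketch. IDP itself runs on a LAMP stack; the Python below, and the <calc:NAME/> placeholder syntax, are illustrative assumptions about how computed results could be substituted into an HTML design document.

        import re

        def render_active_document(template, results):
            # Replace <calc:NAME/> placeholders with computed design results.
            def substitute(match):
                name = match.group(1)
                return str(results.get(name, "[missing: " + name + "]"))
            return re.sub(r"<calc:(\w+)/>", substitute, template)

    For example, render_active_document("<p>Cycle length: <calc:cycle_length/> EFPD</p>", {"cycle_length": 493}) yields "<p>Cycle length: 493 EFPD</p>".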

  15. Novel Method of Noninvasive Detection and Assessment of Gas Emboli and DCS

    DTIC Science & Technology

    2010-12-01

    Young Investigator Program, Undersea Medicine Research Program (Code: 34; PO: CDR Matthew Swiergosz). [The remainder of this abstract is unrecoverable OCR residue from a figure describing optical response measurements through glass cuvettes of varying thickness.] Cited reference: "Arterial CO2 embolism occurring after laparoscopic surgery: A case report," Undersea & Hyperbaric Medicine, vol. 33, pp. 317-320, Sep-Oct 2006.

  16. Computer program documentation for a subcritical wing design code using higher order far-field drag minimization

    NASA Technical Reports Server (NTRS)

    Kuhlman, J. M.; Shu, J. Y.

    1981-01-01

    A subsonic, linearized aerodynamic theory, wing design program for one or two planforms was developed which uses a vortex lattice near field model and a higher order panel method in the far field. The theoretical development of the wake model and its implementation in the vortex lattice design code are summarized and sample results are given. Detailed program usage instructions, sample input and output data, and a program listing are presented in the Appendixes. The far field wake model assumes a wake vortex sheet whose strength varies piecewise linearly in the spanwise direction. From this model analytical expressions for lift coefficient, induced drag coefficient, pitching moment coefficient, and bending moment coefficient were developed. From these relationships a direct optimization scheme is used to determine the optimum wake vorticity distribution for minimum induced drag, subject to constraints on lift, and pitching or bending moment. Integration spanwise yields the bound circulation, which is interpolated in the near field vortex lattice to obtain the design camber surface(s).
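
    The optimization step admits a compact numerical sketch. Assuming the induced drag is a quadratic form x^T A x in the discretized wake vorticity x, and keeping only the lift constraint c^T x = CL (the report also supports pitching- and bending-moment constraints), the optimum solves a small KKT system. The NumPy code below is an illustration, not the report's program:

        import numpy as np

        def optimal_circulation(drag_matrix, lift_vector, lift_target):
            # Minimize x^T A x subject to c^T x = CL by solving the KKT system
            # [2A, c; c^T, 0] [x; lam] = [0; CL].
            n = lift_vector.size
            kkt = np.zeros((n + 1, n + 1))
            kkt[:n, :n] = 2.0 * drag_matrix
            kkt[:n, n] = lift_vector
            kkt[n, :n] = lift_vector
            rhs = np.zeros(n + 1)
            rhs[n] = lift_target
            return np.linalg.solve(kkt, rhs)[:n]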

  17. RELAP5-3D Resolution of Known Restart/Backup Issues

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mesina, George L.; Anderson, Nolan A.

    2014-12-01

    The state-of-the-art nuclear reactor system safety analysis computer program developed at the Idaho National Laboratory (INL), RELAP5-3D, continues to adapt to changes in computer hardware and software and to develop to meet the ever-expanding needs of the nuclear industry. To remain at the forefront, code testing must evolve with both code and industry developments, and it must work correctly. To best ensure this, the processes of Software Verification and Validation (V&V) are applied. Verification compares coding against its documented algorithms and equations and compares its calculations against analytical solutions and the method of manufactured solutions. A form of this, sequential verification, checks code specifications against coding only when originally written, and then applies regression testing, which compares code calculations between consecutive updates or versions on a set of test cases to check that the performance does not change. A sequential verification testing system was specially constructed for RELAP5-3D to both detect errors with extreme accuracy and cover all nuclear-plant-relevant code features. Detection is provided through a “verification file” that records double-precision sums of key variables. Coverage is provided by a test suite of input decks that exercise code features and capabilities necessary to model a nuclear power plant. A matrix of test features and short-running cases that exercise them is presented. This testing system is used to test base cases (called null testing) as well as restart and backup cases. It can test RELAP5-3D performance in both standalone and coupled (through PVM to other codes) runs. Application of verification testing revealed numerous restart and backup issues in both standalone and coupled modes. This document reports the resolution of these issues.
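
    The "verification file" mechanism (double-precision sums of key variables, compared between consecutive code versions) can be sketched in a few lines of Python; the function names, dictionary layout, and tolerance here are assumptions, not RELAP5-3D's implementation:

        import numpy as np

        def verification_sums(key_variables):
            # Double-precision sums of key solution arrays form a compact fingerprint.
            return {name: float(np.asarray(values, dtype=np.float64).sum())
                    for name, values in key_variables.items()}

        def regression_differences(current, baseline, rtol=1e-12):
            # Report variables whose fingerprints drifted between code versions.
            return {name: (baseline[name], value)
                    for name, value in current.items()
                    if not np.isclose(value, baseline[name], rtol=rtol, atol=0.0)}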

  18. HKHC Community Dashboard: design, development, and function of a Web-based performance monitoring system.

    PubMed

    Bors, Philip A; Kemner, Allison; Fulton, John; Stachecki, Jessica; Brennan, Laura K

    2015-01-01

    As part of Robert Wood Johnson Foundation's Healthy Kids, Healthy Communities (HKHC) national grant program, a technical assistance team designed the HKHC Community Dashboard, an online progress documentation and networking system. The Dashboard was central to HKHC's multimethod program evaluation and became a communication interface for grantees and technical assistance providers. The Dashboard was designed through an iterative process of identifying needs and priorities; designing the user experience, technical development, and usability testing; and applying visual design. The system was created with an open-source content management system and support for building an online community of users. The site developer trained technical assistance providers at the national program office and evaluators, who subsequently trained all 49 grantees. Evaluators provided support for Dashboard users and populated the site with the bulk of its uploaded tools and resource documents. The system tracked progress through an interactive work plan template, regular documentation by local staff and partners, and data coding and analysis by the evaluation team. Other features included the ability to broadcast information to Dashboard users via e-mail, event calendars, discussion forums, private messaging, a resource clearinghouse, a technical assistance diary, and real-time progress reports. The average number of Dashboard posts was 694 per grantee during the grant period. Technical assistance providers and grantees uploaded a total of 1304 resource documents. The Dashboard functions with the highest grantee satisfaction were its interfaces for sharing and progress documentation. A majority of Dashboard users (69%) indicated a preference for continued access to the Dashboard's uploaded resource documents. The Dashboard was a useful and innovative tool for participatory evaluation of a large national grant program. While progress documentation added some burden to local project staff, the system proved to be a useful resource-sharing technology.

  19. User's manual for the Heat Pipe Space Radiator design and analysis Code (HEPSPARC)

    NASA Technical Reports Server (NTRS)

    Hainley, Donald C.

    1991-01-01

    A heat pipe space radiator code (HEPSPARC) was written for the NASA Lewis Research Center and is used for the design and analysis of a radiator that is constructed from a pumped fluid loop that transfers heat to the evaporative section of heat pipes. This manual is designed to familiarize the user with this new code and to serve as a reference for its use. It documents the completed work and is intended to be the first step towards verification of the HEPSPARC code. Details are furnished to provide a description of all the requirements and variables used in the design and analysis of a combined pumped loop/heat pipe radiator system. A description of the subroutines used in the program is furnished for those interested in understanding its detailed workings.

  20. cloudPEST - A python module for cloud-computing deployment of PEST, a program for parameter estimation

    USGS Publications Warehouse

    Fienen, Michael N.; Kunicki, Thomas C.; Kester, Daniel E.

    2011-01-01

    This report documents cloudPEST, a Python module with functions to facilitate deployment of the model-independent parameter estimation code PEST in a cloud-computing environment. cloudPEST makes use of low-level, freely available command-line tools that interface with the Amazon Elastic Compute Cloud (EC2) and that are unlikely to change dramatically. This report describes the preliminary setup for both the Python and EC2 tools and subsequently describes the functions themselves. The code and guidelines have been tested primarily on the Windows operating system but are extensible to Linux.
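
    cloudPEST itself wraps the EC2 command-line tools, but the deployment step it automates can be illustrated with the modern boto3 library (a swapped-in stand-in, not cloudPEST's API; the function name, instance type, and the assumption of a machine image pre-loaded with PEST are all illustrative):

        import boto3

        def launch_pest_workers(ami_id, count, instance_type="t3.medium"):
            # Start EC2 instances, each intended to run one PEST worker.
            ec2 = boto3.client("ec2")
            response = ec2.run_instances(ImageId=ami_id, MinCount=count,
                                         MaxCount=count, InstanceType=instance_type)
            return [instance["InstanceId"] for instance in response["Instances"]]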

  1. User's Manual for the Langley Aerothermodynamic Upwind Relaxation Algorithm (LAURA)

    NASA Technical Reports Server (NTRS)

    Gnoffo, Peter A.; Cheatwood, F. McNeil

    1996-01-01

    This user's manual provides detailed instructions for the installation and application of version 4.1 of the Langley Aerothermodynamic Upwind Relaxation Algorithm (LAURA). LAURA provides simulation of the flow field in thermochemical nonequilibrium around vehicles traveling at hypersonic velocities through the atmosphere. Earlier versions of LAURA were predominantly research codes, and they had minimal (or no) documentation. This manual describes UNIX-based utilities for customizing the code for special applications that also minimize system resource requirements. The algorithm is reviewed, and the various program options are related to specific equations and variables in the theoretical development.

  2. Report number codes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nelson, R.N.

    This publication lists all report number codes processed by the Office of Scientific and Technical Information. The report codes are substantially based on the American National Standards Institute, Standard Technical Report Number (STRN)-Format and Creation Z39.23-1983. The Standard Technical Report Number (STRN) provides one of the primary methods of identifying a specific technical report. The STRN consists of two parts: the report code and the sequential number. The report code identifies the issuing organization, a specific program, or a type of document. The sequential number, which is assigned in sequence by each report issuing entity, is not included in this publication. Part I of this compilation is alphabetized by report codes followed by issuing installations. Part II lists the issuing organization followed by the assigned report code(s). In both Parts I and II, the names of issuing organizations appear for the most part in the form used at the time the reports were issued. However, for some of the more prolific installations which have had name changes, all entries have been merged under the current name.
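
    The two-part STRN structure lends itself to mechanical parsing. A Python sketch under the simplifying assumption that the final hyphen separates the report code from the sequential number (real STRNs carry more formatting rules than this):

        def split_strn(strn):
            # Split an STRN into report code and sequential number at the last hyphen.
            code, _, sequential = strn.rpartition("-")
            return code, sequential

    For example, split_strn("NASA-TM-83177") returns ("NASA-TM", "83177"); the example number is hypothetical.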

  3. GEN-IV Benchmarking of Triso Fuel Performance Models under accident conditions modeling input data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Collin, Blaise Paul

    This document presents the benchmark plan for the calculation of particle fuel performance on safety testing experiments that are representative of operational accidental transients. The benchmark is dedicated to the modeling of fission product release under accident conditions by fuel performance codes from around the world, and the subsequent comparison to post-irradiation experiment (PIE) data from the modeled heating tests. The accident condition benchmark is divided into three parts:
    • The modeling of a simplified benchmark problem to assess potential numerical calculation issues at low fission product release.
    • The modeling of the AGR-1 and HFR-EU1bis safety testing experiments.
    • The comparison of the AGR-1 and HFR-EU1bis modeling results with PIE data.
    The simplified benchmark case, thereafter named NCC (Numerical Calculation Case), is derived from “Case 5” of the International Atomic Energy Agency (IAEA) Coordinated Research Program (CRP) on coated particle fuel technology [IAEA 2012]. It is included so participants can evaluate their codes at low fission product release. “Case 5” of the IAEA CRP-6 showed large code-to-code discrepancies in the release of fission products, which were attributed to “effects of the numerical calculation method rather than the physical model” [IAEA 2012]. The NCC is therefore intended to check if these numerical effects subsist. The first two steps imply the involvement of the benchmark participants with a modeling effort following the guidelines and recommendations provided by this document. The third step involves the collection of the modeling results by Idaho National Laboratory (INL) and the comparison of these results with the available PIE data. The objective of this document is to provide all necessary input data to model the benchmark cases, and to give some methodology guidelines and recommendations in order to make all results suitable for comparison with each other. The participants should read this document thoroughly to make sure all the data needed for their calculations is provided in the document. Missing data will be added to a revision of the document if necessary. 09/2016: Tables 6 and 8 updated; AGR-2 input data added.

  4. Generation IV benchmarking of TRISO fuel performance models under accident conditions: Modeling input data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Collin, Blaise P.

    2014-09-01

    This document presents the benchmark plan for the calculation of particle fuel performance on safety testing experiments that are representative of operational accidental transients. The benchmark is dedicated to the modeling of fission product release under accident conditions by fuel performance codes from around the world, and the subsequent comparison to post-irradiation experiment (PIE) data from the modeled heating tests. The accident condition benchmark is divided into three parts: the modeling of a simplified benchmark problem to assess potential numerical calculation issues at low fission product release; the modeling of the AGR-1 and HFR-EU1bis safety testing experiments; and the comparison of the AGR-1 and HFR-EU1bis modeling results with PIE data. The simplified benchmark case, thereafter named NCC (Numerical Calculation Case), is derived from "Case 5" of the International Atomic Energy Agency (IAEA) Coordinated Research Program (CRP) on coated particle fuel technology [IAEA 2012]. It is included so participants can evaluate their codes at low fission product release. "Case 5" of the IAEA CRP-6 showed large code-to-code discrepancies in the release of fission products, which were attributed to "effects of the numerical calculation method rather than the physical model" [IAEA 2012]. The NCC is therefore intended to check if these numerical effects subsist. The first two steps imply the involvement of the benchmark participants with a modeling effort following the guidelines and recommendations provided by this document. The third step involves the collection of the modeling results by Idaho National Laboratory (INL) and the comparison of these results with the available PIE data. The objective of this document is to provide all necessary input data to model the benchmark cases, and to give some methodology guidelines and recommendations in order to make all results suitable for comparison with each other. The participants should read this document thoroughly to make sure all the data needed for their calculations is provided in the document. Missing data will be added to a revision of the document if necessary.

  5. Evaluation of automated decisionmaking methodologies and development of an integrated robotic system simulation. Appendix B: ROBSIM programmer's guide

    NASA Technical Reports Server (NTRS)

    Haley, D. C.; Almand, B. J.; Thomas, M. M.; Krauze, L. D.; Gremban, K. D.; Sanborn, J. C.; Kelly, J. H.; Depkovich, T. M.; Wolfe, W. J.; Nguyen, T.

    1986-01-01

    The purpose of the Robotic Simulation (ROBSIM) program is to provide a broad range of computer capabilities to assist in the design, verification, simulation, and study of robotic systems. ROBSIM is programmed in FORTRAN 77 and implemented on a VAX 11/750 computer using the VMS operating system. The programmer's guide describes the ROBSIM implementation and program logic flow, and the functions and structures of the different subroutines. With the manual and the in-code documentation, an experienced programmer can incorporate additional routines and modify existing ones to add desired capabilities.

  6. Thermal/Structural Tailoring of Engine Blades (T/STAEBL) User's manual

    NASA Technical Reports Server (NTRS)

    Brown, K. W.

    1994-01-01

    The Thermal/Structural Tailoring of Engine Blades (T/STAEBL) system is a computer code that is able to perform numerical optimizations of cooled jet engine turbine blades and vanes. These optimizations seek an airfoil design of minimum operating cost that satisfies realistic design constraints. This report documents the organization of the T/STAEBL computer program, its design and analysis procedure, its optimization procedure, and provides an overview of the input required to run the program, as well as the computer resources required for its effective use. Additionally, usage of the program is demonstrated through a validation test case.

  7. Thermal/Structural Tailoring of Engine Blades (T/STAEBL): User's manual

    NASA Astrophysics Data System (ADS)

    Brown, K. W.

    1994-03-01

    The Thermal/Structural Tailoring of Engine Blades (T/STAEBL) system is a computer code that is able to perform numerical optimizations of cooled jet engine turbine blades and vanes. These optimizations seek an airfoil design of minimum operating cost that satisfies realistic design constraints. This report documents the organization of the T/STAEBL computer program, its design and analysis procedure, its optimization procedure, and provides an overview of the input required to run the program, as well as the computer resources required for its effective use. Additionally, usage of the program is demonstrated through a validation test case.

  8. Evaluation of automated decisionmaking methodologies and development of an integrated robotic system simulation, appendix B

    NASA Technical Reports Server (NTRS)

    Haley, D. C.; Almand, B. J.; Thomas, M. M.; Krauze, L. D.; Gremban, K. D.; Sanborn, J. C.; Kelly, J. H.; Depkovich, T. M.

    1984-01-01

    The purpose of the Robotics Simulation (ROBSIM) program is to provide a broad range of computer capabilities to assist in the design, verification, simulation, and study of robotic systems. ROBSIM is programmed in FORTRAN 77 and implemented on a VAX 11/750 computer using the VMS operating system. This programmer's guide describes the ROBSIM implementation and program logic flow, and the functions and structures of the different subroutines. With this manual and the in-code documentation, an experienced programmer can incorporate additional routines and modify existing ones to add desired capabilities.

  9. SU-F-P-10: A Web-Based Radiation Safety Relational Database Module for Regulatory Compliance

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rosen, C; Ramsay, B; Konerth, S

    Purpose: Maintaining compliance with Radioactive Materials Licenses is inherently a time-consuming task requiring focus and attention to detail. Staff tasked with these responsibilities, such as the Radiation Safety Officer and associated personnel, must retain disparate records for eventual placement into one or more annual reports. Entering results and records in a relational database using a web browser as the interface, and storing that data in a cloud-based storage site, removes procedural barriers. The data becomes more adaptable for mining and sharing. Methods: Web-based code was written utilizing the web framework Django, written in Python. Additionally, the application utilizes JavaScript for front-end interaction, SQL, HTML and CSS. Quality assurance code testing is performed in a sequential style, and new code is only added after the successful testing of the previous goals. Separate sections of the module include data entry and analysis for audits, surveys, quality management, and continuous quality improvement. Data elements can be adapted for quarterly and annual reporting, and for immediate notification based on user-determined alarm settings. Results: Current advances are focusing on user interface issues and on determining the simplest manner by which to teach the user to build query forms. One solution has been to prepare library documents that a user can select or edit in place of creating a new document. Forms are being developed based upon Nuclear Regulatory Commission federal code, and will be expanded to include state regulations. Conclusion: Establishing a secure website to act as the portal for data entry, storage and manipulation can lead to added efficiencies for a Radiation Safety Program. Access to multiple databases can enable big-data mining and the identification of safety issues before they occur. Web programming challenges, a category that includes mathematical handling, are being overcome.
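
    As a minimal sketch of what one record in such a relational module might look like, here is a hypothetical Django model (the class and field names are illustrative assumptions, not the module's actual schema):

        from django.db import models

        class AuditRecord(models.Model):
            # One radiation-safety audit entry.
            area = models.CharField(max_length=100)
            performed_on = models.DateField()
            finding = models.TextField(blank=True)
            corrective_action = models.TextField(blank=True)
            resolved = models.BooleanField(default=False)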

  10. Code-Switching in Judaeo-Arabic Documents from the Cairo Geniza

    ERIC Educational Resources Information Center

    Wagner, Esther-Miriam; Connolly, Magdalen

    2018-01-01

    This paper investigates code-switching and script-switching in medieval documents from the Cairo Geniza, written in Judaeo-Arabic (Arabic in Hebrew script), Hebrew, Arabic and Aramaic. Legal documents regularly show a macaronic style of Judaeo-Arabic, Aramaic and Hebrew, while in letters code-switching from Judaeo-Arabic to Hebrew is tied in with…

  11. Raptor: An Enterprise Knowledge Discovery Engine Version 2.0

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    2011-08-31

    The Raptor Version 2.0 computer code uses a set of documents as seed documents to recommend documents of interest from a large, target set of documents. The computer code provides results that show the recommended documents with the highest similarity to the seed documents. Version 2.0 was specifically developed to work with SharePoint 2007 and MS SQL server.

  12. Design and Implementation of a Distributed Version of the NASA Engine Performance Program

    NASA Technical Reports Server (NTRS)

    Cours, Jeffrey T.

    1994-01-01

    Distributed NEPP is a new version of the NASA Engine Performance Program that runs in parallel on a collection of Unix workstations connected through a network. The program is fault-tolerant, efficient, and shows significant speed-up in a multi-user, heterogeneous environment. This report describes the issues involved in designing distributed NEPP, the algorithms the program uses, and the performance distributed NEPP achieves. It develops an analytical model to predict and measure the performance of the simple distribution, multiple distribution, and fault-tolerant distribution algorithms that distributed NEPP incorporates. Finally, the appendices explain how to use distributed NEPP and document the organization of the program's source code.
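
    The report's analytical performance model is not reproduced here, but its flavor can be suggested with a generic bag-of-tasks sketch in Python: independent engine cases are farmed out to workers of different relative speeds, and speed-up is the throughput ratio against a single unit-rate worker. The formula is an illustrative assumption, not distributed NEPP's actual model.

        def predicted_speedup(case_time, worker_rates, overhead=0.0):
            # A worker of relative rate r finishes one case in case_time / r + overhead.
            times = [case_time / rate + overhead for rate in worker_rates]
            parallel_throughput = sum(1.0 / t for t in times)
            return case_time * parallel_throughput

    For example, predicted_speedup(60.0, [1.0, 1.0, 0.5], overhead=2.0) models two full-speed workstations and one half-speed machine with a two-unit dispatch overhead per case.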

  13. 75 FR 39220 - Charter Schools Program (CSP) Grants for Replication and Expansion of High-Quality Charter Schools

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-07-08

    ... Federal Register. Free Internet access to the official edition of the Federal Register and the Code of... telecommunications device for the deaf, call the Federal Relay Service, toll free, at 1-800-877-8339. Electronic... published in the Federal Register, in text or Adobe Portable Document Format (PDF) on the Internet at the...

  14. 77 FR 43277 - Applications for New Awards; Innovative Approaches to Literacy Program (CFDA 84.215G); Correction

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-07-24

    ... Register. Free Internet access to the official edition of the Federal Register and the Code of Federal... deaf (TDD) or a text telephone (TTY), call the Federal Relay Service (FRS), toll free, at 1- 800-877... free at the site. You may also access documents of the Department published in the Federal Register by...

  15. 76 FR 25211 - Energy Conservation Program: Test Procedures for Fluorescent Lamp Ballasts

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-05-04

    ... 44 U.S.C. 1510. The Code of Federal Regulations is sold by the Superintendent of Documents.... the Regulatory Flexibility Act C. Review Under the Paperwork Reduction Act of 1995 D. Review Under the... Conservation Act (42 U.S.C. 6291, et seq.; ``EPCA'' or, ``the Act'') sets forth a variety of provisions...

  16. Lean and Efficient Software: Whole-Program Optimization of Executables

    DTIC Science & Technology

    2015-06-30

    Approved for public release; distribution is unlimited. [Administrative contact details and Report Documentation Page boilerplate omitted; only a fragment of the abstract is recoverable:] ...library subroutines, removing redundant argument checking and interface layers, eliminating dead code, and improving computational efficiency.

  17. Technical Support Document for Version 3.9.0 of the COMcheck Software

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bartlett, Rosemarie; Connell, Linda M.; Gowri, Krishnan

    2011-09-01

    COMcheck provides an optional way to demonstrate compliance with commercial and high-rise residential building energy codes. Commercial buildings include all use groups except single family and multifamily not over three stories in height. COMcheck was originally based on ANSI/ASHRAE/IES Standard 90.1-1989 (Standard 90.1-1989) requirements and is intended for use with various codes based on Standard 90.1, including the Codification of ASHRAE/IES Standard 90.1-1989 (90.1-1989 Code) (ASHRAE 1989a, 1993b) and ASHRAE/IESNA Standard 90.1-1999 (Standard 90.1-1999). This includes jurisdictions that have adopted the 90.1-1989 Code, Standard 90.1-1989, Standard 90.1-1999, or their own code based on one of these. We view Standard 90.1-1989 and the 90.1-1989 Code as having equivalent technical content and have used both as source documents in developing COMcheck. This technical support document (TSD) is designed to explain the technical basis for the COMcheck software as originally developed based on the ANSI/ASHRAE/IES Standard 90.1-1989 (Standard 90.1-1989). Documentation for other national model codes and standards and specific state energy codes supported in COMcheck has been added to this report as appendices. These appendices are intended to provide technical documentation for features specific to the supported codes and for any changes made for state-specific codes that differ from the standard features that support compliance with the national model codes and standards. Beginning with COMcheck version 3.8.0, support for 90.1-1989, 90.1-1999, and the 1998 IECC is no longer included, but those sections remain in this document for reference purposes.

  18. Technical Support Document for Version 3.9.1 of the COMcheck Software

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bartlett, Rosemarie; Connell, Linda M.; Gowri, Krishnan

    2012-09-01

    COMcheck provides an optional way to demonstrate compliance with commercial and high-rise residential building energy codes. Commercial buildings include all use groups except single family and multifamily not over three stories in height. COMcheck was originally based on ANSI/ASHRAE/IES Standard 90.1-1989 (Standard 90.1-1989) requirements and is intended for use with various codes based on Standard 90.1, including the Codification of ASHRAE/IES Standard 90.1-1989 (90.1-1989 Code) (ASHRAE 1989a, 1993b) and ASHRAE/IESNA Standard 90.1-1999 (Standard 90.1-1999). This includes jurisdictions that have adopted the 90.1-1989 Code, Standard 90.1-1989, Standard 90.1-1999, or their own code based on one of these. We view Standard 90.1-1989 and the 90.1-1989 Code as having equivalent technical content and have used both as source documents in developing COMcheck. This technical support document (TSD) is designed to explain the technical basis for the COMcheck software as originally developed based on the ANSI/ASHRAE/IES Standard 90.1-1989 (Standard 90.1-1989). Documentation for other national model codes and standards and specific state energy codes supported in COMcheck has been added to this report as appendices. These appendices are intended to provide technical documentation for features specific to the supported codes and for any changes made for state-specific codes that differ from the standard features that support compliance with the national model codes and standards. Beginning with COMcheck version 3.8.0, support for 90.1-1989, 90.1-1999, and the 1998 IECC is no longer included and, beginning with version 3.9.0, support for the 2000 and 2001 IECC is no longer included; those sections remain in this document for reference purposes.

  19. [Disease management programs in Germany: Validity of the medical documentation].

    PubMed

    Linder, R; Horenkamp-Sonntag, D; Engel, S; Köppel, D; Heilmann, T; Verheyen, F

    2014-01-01

    The specific documentation for disease management programs (DMP) in Germany with respect to § 137 Social Code Book V is the basis for evaluating the DMP. DMPs incur costs on the order of a billion euros, yet their evidence-based benefit has so far not been assessed. The aim of this study was to examine whether, and to what extent, this documentation in its present form may be suitable for reliable quality assurance. Data from nearly 300,000 insured persons of a German statutory health insurance fund (Techniker Krankenkasse, TK) who were continuously enrolled in a DMP from July 1st, 2009 until December 31st, 2010 were analyzed. We analyzed how well items that appear both in claims data and in the DMP documentation were matched. With regard to prescriptions, there were some considerable differences. Prescription of glibenclamid was documented twice as frequently in the DMP documentation as in prescriptions filled in pharmacies. Only a fraction of the emergency hospitalizations documented in the claims data were found in the DMP documentation. Investigations of the fundus oculi for diabetics are mentioned three times more frequently in the DMP documentation than they are billed by ophthalmologists. There are considerable differences between claims data and DMP-specific documentation. The latter shows clearly reduced validity for the investigated fields in the documentation forms. The reasons for this are manifold. Earlier evaluations of DMPs carried out solely on the basis of DMP documentation are thus highly questionable. Therefore, the DMPs themselves and their documentation have to be reformed.

  20. Photovoltaic module certification and laboratory accreditation criteria development

    NASA Astrophysics Data System (ADS)

    Osterwald, Carl R.; Zerlaut, Gene; Hammond, Robert; D'Aiello, Robert

    1996-01-01

    This paper provides an overview of a model product certification and test laboratory accreditation program for photovoltaic (PV) modules that was recently developed by the National Renewable Energy Laboratory and Arizona State University. The specific objective of this project was to produce a document that details the equipment, facilities, quality assurance procedures, and technical expertise an accredited laboratory needs for performance and qualification testing of PV modules, along with the specific tests needed for a module design to be certified. The document was developed in conjunction with a criteria development committee consisting of representatives from 30 U.S. PV manufacturers, end users, standards and codes organizations, and testing laboratories. The intent is to lay the groundwork for a future U.S. PV certification and accreditation program that will be beneficial to the PV industry as a whole.

  1. A retrospective review of the Honduras AIN-C program guided by a community health worker performance logic model.

    PubMed

    Rodríguez, Daniela C; Peterson, Lauren A

    2016-05-06

    Factors that influence performance of community health workers (CHWs) delivering health services are not well understood. A recent logic model proposed categories of support from both the health sector and communities that influence CHW performance and program outcomes. This logic model has been used to review a growth monitoring program delivered by CHWs in Honduras, known as Atención Integral a la Niñez en la Comunidad (AIN-C). A retrospective review of AIN-C was conducted through a document desk review and supplemented with in-depth interviews. Documents were systematically coded using the categories from the logic model, and gaps were addressed through interviews. The authors reviewed coded data for each category to analyze program details and outcomes as well as identify potential issues and gaps in the logic model. Categories from the logic model were inconsistently represented, with more information available for the health sector than for the community. Context and input activities were not well documented. Information on health sector systems-level activities was available for governance but limited for other categories, while not much was found for community systems-level activities. Most available information focused on program-level activities, with substantial data on technical support. Output, outcome, and impact data were drawn from various resources and suggest mixed results of AIN-C on indicators of interest. Assessing CHW performance through a desk review left gaps about the relationship between activities and performance that could not be addressed. There were critical characteristics of program design that made it contextually appropriate; however, it was difficult to identify clear links between AIN-C and malnutrition indicators. Regarding the logic model, several categories were too broad (e.g., technical support, context) and some aspects of AIN-C did not fit neatly in logic model categories (e.g., political commitment, equity, flexibility in implementation). The CHW performance logic model has potential as a tool for program planning and evaluation but would benefit from additional supporting tools and materials to facilitate and operationalize its use.

  2. Source Methodology for Turbofan Noise Prediction (SOURCE3D Technical Documentation)

    NASA Technical Reports Server (NTRS)

    Meyer, Harold D.

    1999-01-01

    This report provides the analytical documentation for the SOURCE3D Rotor Wake/Stator Interaction Code. It derives the equations for the rotor scattering coefficients and the stator source vector and scattering coefficients that are needed for use in TFANS (Theoretical Fan Noise Design/Prediction System). SOURCE3D treats the rotor and stator as isolated source elements. TFANS uses this information, along with scattering coefficients for inlet and exit elements, and provides complete noise solutions for turbofan engines. SOURCE3D is composed of a collection of FORTRAN programs that have been obtained by extending the approach of the earlier V072 Rotor Wake/Stator Interaction Code. Like V072, it treats the rotor and stator as a collection of blades and vanes having zero thickness and camber contained in an infinite, hardwall annular duct. SOURCE3D adds important features to the V072 capability: a rotor element, swirl flow and vorticity waves, actuator disks for flow turning, and combined rotor/actuator disk and stator/actuator disk elements. These items allow reflections from the rotor, frequency scattering, and mode trapping, thus providing more complete noise predictions than previously possible. The code has been thoroughly verified through comparison with D.B. Hanson's CUP2D two-dimensional code using a narrow annulus test case.

  3. Addressing medical coding and billing part II: a strategy for achieving compliance. A risk management approach for reducing coding and billing errors.

    PubMed Central

    Adams, Diane L.; Norman, Helen; Burroughs, Valentine J.

    2002-01-01

    Medical practice today, more than ever before, places greater demands on physicians to see more patients, provide more complex medical services and adhere to stricter regulatory rules, leaving little time for coding and billing. Yet, the need to adequately document medical records, appropriately apply billing codes and accurately charge insurers for medical services is essential to the medical practice's financial condition. Many physicians rely on office staff and billing companies to process their medical bills without ever reviewing the bills before they are submitted for payment. Some physicians may not be receiving the payment they deserve when they do not sufficiently oversee the medical practice's coding and billing patterns. This article emphasizes the importance of monitoring and auditing medical record documentation and coding application as a strategy for achieving compliance and reducing billing errors. When medical bills are submitted with missing and incorrect information, they may result in unpaid claims and loss of revenue to physicians. Addressing Medical Audits, Part I--A Strategy for Achieving Compliance--CMS, JCAHO, NCQA, published January 2002 in the Journal of the National Medical Association, stressed the importance of preparing the medical practice for audits. The article highlighted steps the medical practice can take to prepare for audits and presented examples of guidelines used by regulatory agencies to conduct both medical and financial audits. The Medicare Integrity Program was cited as an example of guidelines used by regulators to identify coding errors during an audit and deny payment to providers when improper billing occurs. For each denied claim, payments owed to the medical practice are also denied. Health care is, no doubt, a costly endeavor for health care providers, consumers and insurers. The potential risk to physicians for improper billing may include loss of revenue, fraud investigations, financial sanction, disciplinary action and exclusion from participation in government programs. Part II of this article recommends an approach for assessing potential risk, preventing improper billing, and improving financial management of the medical practice. PMID: 12078924

  4. SIRU development. Volume 3: Software description and program documentation

    NASA Technical Reports Server (NTRS)

    Oehrle, J.

    1973-01-01

    The development and initial evaluation of a strapdown inertial reference unit (SIRU) system are discussed. The SIRU configuration is a modular inertial subsystem with hardware and software features that achieve fault-tolerant operational capabilities. The SIRU redundant hardware design is formulated around a six gyro and six accelerometer instrument module package. The six-axis array provides redundant independent sensing, and the symmetry enables the formulation of an optimal software redundant data processing structure with self-contained fault detection and isolation (FDI) capabilities. The basic SIRU software coding system used in the DDP-516 computer is documented.

  5. Certifying Auto-Generated Flight Code

    NASA Technical Reports Server (NTRS)

    Denney, Ewen

    2008-01-01

    Model-based design and automated code generation are being used increasingly at NASA. Many NASA projects now use MathWorks Simulink and Real-Time Workshop for at least some of their modeling and code development. However, there are substantial obstacles to more widespread adoption of code generators in safety-critical domains. Since code generators are typically not qualified, there is no guarantee that their output is correct, and consequently the generated code still needs to be fully tested and certified. Moreover, the regeneration of code can require complete recertification, which offsets many of the advantages of using a generator. Indeed, manual review of autocode can be more challenging than for hand-written code. Since the direct V&V of code generators is too laborious and complicated due to their complex (and often proprietary) nature, we have developed a generator plug-in to support the certification of the auto-generated code. Specifically, the AutoCert tool supports certification by formally verifying that the generated code is free of different safety violations, by constructing an independently verifiable certificate, and by explaining its analysis in a textual form suitable for code reviews. The generated documentation also contains substantial tracing information, allowing users to trace between model, code, documentation, and V&V artifacts. This enables missions to obtain assurance about the safety and reliability of the code without excessive manual V&V effort and, as a consequence, eases the acceptance of code generators in safety-critical contexts. The generation of explicit certificates and textual reports is particularly well-suited to supporting independent V&V. The primary contribution of this approach is the combination of human-friendly documentation with formal analysis. The key technical idea is to exploit the idiomatic nature of auto-generated code in order to automatically infer logical annotations. The annotation inference algorithm itself is generic, and parametrized with respect to a library of coding patterns that depend on the safety policies and the code generator. The patterns characterize the notions of definitions and uses that are specific to the given safety property. For example, for initialization safety, definitions correspond to variable initializations while uses are statements which read a variable, whereas for array bounds safety, definitions are the array declarations, while uses are statements which access an array variable. The inferred annotations are thus highly dependent on the actual program and the properties being proven. The annotations, themselves, need not be trusted, but are crucial to obtain the automatic formal verification of the safety properties without requiring access to the internals of the code generator. The approach has been applied to both in-house and commercial code generators, but is independent of the particular generator used. It is currently being adapted to flight code generated using MathWorks Real-Time Workshop, an automatic code generator that translates from Simulink/Stateflow models into embedded C code.
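
    The definitions-versus-uses idea behind the annotation inference can be shown in miniature. The Python sketch below checks initialization safety over a toy straight-line program in which each statement is a (defs, uses) pair of variable-name sets; this is an illustration of the concept, not AutoCert's representation or algorithm.

        def initialization_violations(statements):
            # Flag any read of a variable with no prior definition.
            defined, violations = set(), []
            for index, (defs, uses) in enumerate(statements):
                violations.extend((index, var) for var in sorted(uses - defined))
                defined |= defs
            return violations

    For example, initialization_violations([({"x"}, set()), (set(), {"x", "y"})]) reports that statement 1 reads "y" before any definition.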

  6. MASTRE trajectory code update to automate flight trajectory design, performance predictions, and vehicle sizing for support of shuttle and shuttle derived vehicles: Programmers manual

    NASA Technical Reports Server (NTRS)

    1993-01-01

    The information required by a programmer using the Minimum Hamiltonian AScent Trajectory Evaluation (MASTRE) Program is provided. This document enables the programmer to either modify the program or convert it to computers other than the VAX. Documentation for each subroutine or function, based on definitions of the variables and a source listing, is included. Questions concerning the equations, techniques, or input requirements should be answered by either the Engineering or User's manuals. Three appendices are also included, which provide a listing of the Root-Sum-Square (RSS) program, a listing of subroutine names and definitions used in the MASTRE User Friendly Interface (UFI) Program, and a listing of the subroutine names and definitions used in the Mass Properties Program. The RSS Program is used to aid in the performance of dispersion analyses; it reads a file generated by the MASTRE Program, calculates dispersion parameters, and generates output tables and output plot files. The UFI Program provides a screen-based user interface to aid the user in providing input to the model. The Mass Properties Program defines the mass properties data for the MASTRE program through the use of user interface software.
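
    The root-sum-square combination at the heart of the RSS program is compact enough to sketch directly; the underlying assumption is that the individual dispersion contributions are independent (the Python below is an illustration, not the RSS program itself):

        import math

        def root_sum_square(dispersions):
            # Combine independent dispersion contributions into a single RSS value.
            return math.sqrt(sum(d * d for d in dispersions))

    For example, root_sum_square([120.0, 45.0, 80.0]) combines three hypothetical trajectory dispersion contributions.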

  7. A Component-Centered Meta-Analysis of Family-Based Prevention Programs for Adolescent Substance Use

    PubMed Central

    Roseth, Cary J.; Fosco, Gregory M.; Lee, You-kyung; Chen, I-Chien

    2016-01-01

    Although research has documented the positive effects of family-based prevention programs, the field lacks specific information regarding why these programs are effective. The current study summarized the effects of family-based programs on adolescent substance use using a component-based approach to meta-analysis in which we decomposed programs into a set of key topics or components that were specifically addressed by program curricula (e.g., parental monitoring/behavior management, problem solving, positive family relations, etc.). Components were coded according to the amount of time spent on program services that targeted youth, parents, and the whole family; we also coded effect sizes across studies for each substance-related outcome. Given the nested nature of the data, we used hierarchical linear modeling to link program components (Level 2) with effect sizes (Level 1). The overall effect size across programs was .31, which did not differ by type of substance. Youth-focused components designed to encourage more positive family relationships and a positive orientation toward the future emerged as key factors predicting larger than average effect sizes. Our results suggest that, within the universe of family-based prevention, where components such as parental monitoring/behavior management are almost universal, adding or expanding certain youth-focused components may be able to enhance program efficacy. PMID:27064553

  8. The RANDOM computer program: A linear congruential random number generator

    NASA Technical Reports Server (NTRS)

    Miles, R. F., Jr.

    1986-01-01

    The RANDOM Computer Program is a FORTRAN program for generating random number sequences and testing linear congruential random number generators (LCGs). The linear congruential form of random number generator is discussed, and the selection of parameters of an LCG for a microcomputer is described. This document describes the following: (1) the RANDOM Computer Program; (2) RANDOM.MOD, the computer code needed to implement an LCG in a FORTRAN program; and (3) the RANCYCLE and ARITH Computer Programs that provide computational assistance in the selection of parameters for an LCG. The RANDOM, RANCYCLE, and ARITH Computer Programs are written in Microsoft FORTRAN for the IBM PC microcomputer and its compatibles. With only minor modifications, the RANDOM Computer Program and its LCG can be run on most microcomputers or mainframe computers.
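    For reference, the linear congruential recurrence has the form X(n+1) = (a*X(n) + c) mod m. The minimal C sketch below uses the well-known Park-Miller parameters (a = 16807, c = 0, m = 2^31 - 1) purely for illustration; the parameters actually selected for the RANDOM program are not given in this abstract.

        #include <stdio.h>
        #include <stdint.h>

        /* Generic LCG: X(n+1) = (a*X(n) + c) mod m, with illustrative
           Park-Miller parameters, not those chosen by RANDOM. */
        static uint64_t lcg_state = 1;  /* seed must lie in [1, m-1] */

        static double lcg_next(void) {
            const uint64_t a = 16807u, c = 0u, m = 2147483647u; /* 2^31 - 1 */
            lcg_state = (a * lcg_state + c) % m;
            return (double)lcg_state / (double)m;  /* uniform in (0,1) */
        }

        int main(void) {
            for (int i = 0; i < 5; i++)
                printf("%.6f\n", lcg_next());
            return 0;
        }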

  9. Impact of documentation errors on accuracy of cause of death coding in an educational hospital in Southern Iran.

    PubMed

    Haghighi, Mohammad Hosein Hayavi; Dehghani, Mohammad; Teshnizi, Saeid Hoseini; Mahmoodi, Hamid

    2014-01-01

    Accurate cause of death coding leads to organised and usable death information but there are some factors that influence documentation on death certificates and therefore affect the coding. We reviewed the role of documentation errors on the accuracy of death coding at Shahid Mohammadi Hospital (SMH), Bandar Abbas, Iran. We studied the death certificates of all deceased patients in SMH from October 2010 to March 2011. Researchers determined and coded the underlying cause of death on the death certificates according to the guidelines issued by the World Health Organization in Volume 2 of the International Statistical Classification of Diseases and Health Related Problems-10th revision (ICD-10). Necessary ICD coding rules (such as the General Principle, Rules 1-3, the modification rules and other instructions about death coding) were applied to select the underlying cause of death on each certificate. Demographic details and documentation errors were then extracted. Data were analysed with descriptive statistics and chi square tests. The accuracy rate of causes of death coding was 51.7%, demonstrating a statistically significant relationship (p=.001) with major errors but not such a relationship with minor errors. Factors that result in poor quality of Cause of Death coding in SMH are lack of coder training, documentation errors and the undesirable structure of death certificates.

  10. Aeroservoelastic and Flight Dynamics Analysis Using Computational Fluid Dynamics

    NASA Technical Reports Server (NTRS)

    Arena, Andrew S., Jr.

    1999-01-01

    This document is based in large part on the Master's Thesis of Cole Stephens. The document encompasses a variety of technical and practical issues involved in using the STARS codes for aeroservoelastic analysis of vehicles. It covers in detail a number of technical issues and the step-by-step procedure for simulating a system in which aerodynamics, structures, and controls are tightly coupled. Comparisons are made to a benchmark experimental program conducted at NASA Langley. One significant advantage of the methodology detailed here is that, as a result of the technique used to accelerate the CFD-based simulation, a systems model is produced which is very useful for developing the control-law strategy and subsequent high-speed simulations.

  11. Modifications of the U.S. Geological Survey modular, finite-difference, ground-water flow model to read and write geographic information system files

    USGS Publications Warehouse

    Orzol, Leonard L.; McGrath, Timothy S.

    1992-01-01

    This report documents modifications to the U.S. Geological Survey modular, three-dimensional, finite-difference, ground-water flow model, commonly called MODFLOW, so that it can read and write files used by a geographic information system (GIS). The modified model program is called MODFLOWARC. Simulation programs such as MODFLOW generally require large amounts of input data and produce large amounts of output data. Viewing data graphically, generating head contours, and creating or editing model data arrays such as hydraulic conductivity are examples of tasks that currently are performed either by the use of independent software packages or by tedious manual editing, manipulating, and transferring of data. GIS programs are commonly used to facilitate preparation of the model input data and to analyze model output data; however, auxiliary programs are frequently required to translate data between programs. Data translations are required when different programs use different data formats. Thus, the user might use GIS techniques to create model input data, run a translation program to convert input data into a format compatible with the ground-water flow model, run the model, run a translation program to convert the model output into the correct format for GIS, and use GIS to display and analyze this output. MODFLOWARC avoids the two translation steps and transfers data directly to and from the ground-water flow model. This report documents the design and use of MODFLOWARC and includes instructions for data input/output of the Basic, Block-centered flow, River, Recharge, Well, Drain, Evapotranspiration, General-head boundary, and Streamflow-routing packages. Modifications to MODFLOW and the Streamflow-routing package were kept to a minimum. Flow charts and computer-program code describe the modifications to the original computer codes for each of these packages. Appendix A contains a discussion of the operation of MODFLOWARC using a sample problem.

  12. System Design Description for the TMAD Code

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Finfrock, S.H.

    This document serves as the System Design Description (SDD) for the TMAD Code System, which includes the TMAD code and the LIBMAKR code. The SDD provides a detailed description of the theory behind the code, and the implementation of that theory. It is essential for anyone who is attempting to review or modify the code or who otherwise needs to understand the internal workings of the code. In addition, this document includes, in Appendix A, the System Requirements Specification for the TMAD System.

  13. The Joint Polar Satellite System (JPSS) Program's Algorithm Change Process (ACP): Past, Present and Future

    NASA Technical Reports Server (NTRS)

    Griffin, Ashley

    2017-01-01

    The Joint Polar Satellite System (JPSS) Program Office is the supporting organization for the Suomi National Polar-orbiting Partnership (S-NPP) and JPSS-1 satellites. S-NPP carries the following sensors: VIIRS, CrIS, ATMS, OMPS, and CERES, instruments that ultimately produce over 25 data products covering the Earth's weather, oceans, and atmosphere. A team of scientists and engineers from all over the United States documents, monitors, and fixes errors in operational software code or documentation through the algorithm change process (ACP) to ensure the success of the S-NPP and JPSS-1 missions by maintaining the quality and accuracy of the data products the scientific community relies on. This poster will outline the program's algorithm change process (ACP), identify the various users and scientific applications of our operational data products, and highlight changes that have been made to the ACP to accommodate operating system upgrades to the JPSS program's Interface Data Processing Segment (IDPS), so that the program is ready for the transition to the 2017 JPSS-1 satellite mission and beyond.

  14. Massively parallel data processing for quantitative total flow imaging with optical coherence microscopy and tomography

    NASA Astrophysics Data System (ADS)

    Sylwestrzak, Marcin; Szlag, Daniel; Marchand, Paul J.; Kumar, Ashwin S.; Lasser, Theo

    2017-08-01

    We present an application of massively parallel processing to quantitative flow measurement data acquired using spectral optical coherence microscopy (SOCM). The need for massive signal processing of these particular datasets has been a major hurdle for many applications based on SOCM. In view of this difficulty, we implemented and adapted quantitative total flow estimation algorithms on graphics processing units (GPU) and achieved a 150-fold reduction in processing time when compared to a former CPU implementation. As SOCM constitutes the microscopy counterpart to spectral optical coherence tomography (SOCT), the developed processing procedure can be applied to both imaging modalities. We present the developed DLL library integrated in MATLAB (with an example) and have included the source code for adaptations and future improvements. Catalogue identifier: AFBT_v1_0 Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AFBT_v1_0.html Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland Licensing provisions: GNU GPLv3 No. of lines in distributed program, including test data, etc.: 913552 No. of bytes in distributed program, including test data, etc.: 270876249 Distribution format: tar.gz Programming language: CUDA/C, MATLAB. Computer: Intel x64 CPU, GPU supporting CUDA technology. Operating system: 64-bit Windows 7 Professional. Has the code been vectorized or parallelized?: Yes, CPU code has been vectorized in MATLAB, CUDA code has been parallelized. RAM: Dependent on the user's parameters, typically between several gigabytes and several tens of gigabytes. Classification: 6.5, 18. Nature of problem: Speed-up of data processing in optical coherence microscopy. Solution method: Utilization of GPU for massively parallel data processing. Additional comments: Compiled DLL library with source code and documentation, example of utilization (MATLAB script with raw data). Running time: 1.8 s for one B-scan (150× faster than the CPU data processing time).

  15. Monte Carlo tests of the ELIPGRID-PC algorithm

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Davidson, J.R.

    1995-04-01

    The standard tool for calculating the probability of detecting pockets of contamination called hot spots has been the ELIPGRID computer code of Singer and Wickman. The ELIPGRID-PC program has recently made this algorithm available for an IBM® PC. However, no known independent validation of the ELIPGRID algorithm exists. This document describes a Monte Carlo simulation-based validation of a modified version of the ELIPGRID-PC code. The modified ELIPGRID-PC code is shown to match Monte Carlo-calculated hot-spot detection probabilities to within ±0.5% for 319 out of 320 test cases. The one exception, a very thin elliptical hot spot located within a rectangular sampling grid, differed from the Monte Carlo-calculated probability by about 1%. These results provide confidence in the ability of the modified ELIPGRID-PC code to accurately predict hot-spot detection probabilities within an acceptable range of error.
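    The validation idea can be sketched in a few lines of C. This is an illustration of the Monte Carlo check, not the ELIPGRID algorithm, and it uses a circular hot spot for simplicity: place the hot-spot centre uniformly at random within one cell of a square grid of spacing G, and count how often the nearest grid node falls inside the hot spot of radius R.

        #include <stdio.h>
        #include <stdlib.h>
        #include <math.h>

        int main(void) {
            const double G = 10.0;   /* grid spacing (arbitrary units) */
            const double R = 4.0;    /* hot-spot radius */
            const long   N = 1000000;/* Monte Carlo trials */
            long hits = 0;
            srand(12345);
            for (long i = 0; i < N; i++) {
                /* random hot-spot centre inside one grid cell */
                double cx = G * rand() / ((double)RAND_MAX + 1.0);
                double cy = G * rand() / ((double)RAND_MAX + 1.0);
                /* distance from the centre to the nearest cell-corner node */
                double dx = fmin(cx, G - cx), dy = fmin(cy, G - cy);
                if (dx * dx + dy * dy <= R * R) hits++;
            }
            printf("estimated detection probability: %.4f\n",
                   (double)hits / (double)N);
            return 0;
        }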

  16. CARES/LIFE Software Commercialization

    NASA Technical Reports Server (NTRS)

    1995-01-01

    The NASA Lewis Research Center has entered into a letter agreement with BIOSYM Technologies Inc. (now merged with Molecular Simulations Inc. (MSI)). Under this agreement, NASA will provide a developmental copy of the CARES/LIFE computer program to BIOSYM for evaluation. This computer code predicts the time-dependent reliability of a thermomechanically loaded component. BIOSYM will become familiar with CARES/LIFE, provide results of computations useful in validating the code, evaluate it for potential commercialization, and submit suggestions for improvements or extensions to the code or its documentation. If BIOSYM/Molecular Simulations reaches a favorable evaluation of CARES/LIFE, NASA will enter into negotiations for a cooperative agreement with BIOSYM/Molecular Simulations to further develop the code--adding features such as a user-friendly interface and other improvements. This agreement would give BIOSYM intellectual property rights in the modified codes, which they could protect and then commercialize. NASA would provide BIOSYM with the NASA-developed source codes and would agree to cooperate with BIOSYM in further developing the code. In return, NASA would receive certain use rights in the modified CARES/LIFE program. Presently BIOSYM Technologies Inc. has been involved with integration issues concerning its merger with Molecular Simulations Inc., since both companies used to compete in the computational chemistry market, and to some degree, in the materials market. Consequently, evaluation of the CARES/LIFE software is on hold for a month or two while the merger is finalized. Their interest in CARES continues, however, and they expect to get back to the evaluation by early November 1995.

  17. Flexible high-speed CODEC

    NASA Technical Reports Server (NTRS)

    Segallis, Greg P.; Wernlund, Jim V.; Corry, Glen

    1993-01-01

    This report is prepared by Harris Government Communication Systems Division for NASA Lewis Research Center under contract NAS3-25087. It is written in accordance with SOW section 4.0 (d) as detailed in section 2.6. The purpose of this document is to provide a summary of the program, performance results and analysis, and a technical assessment. The purpose of this program was to develop a flexible, high-speed CODEC that provides substantial coding gain while maintaining bandwidth efficiency for use in both continuous and bursted data environments for a variety of applications.

  18. NASCAP programmer's reference manual

    NASA Astrophysics Data System (ADS)

    Mandell, M. J.; Stannard, P. R.; Katz, I.

    1993-05-01

    The NASA Charging Analyzer Program (NASCAP) is a computer program designed to model the electrostatic charging of complicated three-dimensional objects, both in a test tank and at geosynchronous altitudes. This document is a programmer's reference manual and user's guide. It is designed as a reference to experienced users of the code, as well as an introduction to its use for beginners. All of the many capabilities of NASCAP are covered in detail, together with examples of their use. These include the definition of objects, plasma environments, potential calculations, particle emission and detection simulations, and charging analysis.

  19. Research in computer science

    NASA Technical Reports Server (NTRS)

    Ortega, J. M.

    1984-01-01

    Several short summaries of the work performed during this reporting period are presented. Topics discussed in this document include: (1) resilient seeded errors via simple techniques; (2) knowledge representation for engineering design; (3) analysis of faults in a multiversion software experiment; (4) implementation of a parallel programming environment; (5) symbolic execution of concurrent programs; (6) two computer graphics systems for visualization of pressure distribution and convective density particles; (7) design of a source code management system; (8) vectorizing incomplete conjugate gradient on the Cyber 203/205; (9) extensions of domain testing theory; and (10) a performance analyzer for the PISCES system.

  20. Implementing a bar-coded bedside medication administration system.

    PubMed

    Yates, Cindy

    2007-01-01

    Hospitals across the nation are struggling with implementing electronic medication administration and reporting (eMAR) systems as part of patient safety programs. St Luke's Hospital in Chesterfield, Mo, began its eMAR initiative in June 2003 and started the program in September 2004. This case study documents how the project was approached, its overall success, and what was learned along the way. Also included is a recent update highlighting the expansion of St Luke's patient safety initiative, adapting eMAR to two specialty units: dialysis and laboratory processes.

  1. NASCAP programmer's reference manual

    NASA Technical Reports Server (NTRS)

    Mandell, M. J.; Stannard, P. R.; Katz, I.

    1993-01-01

    The NASA Charging Analyzer Program (NASCAP) is a computer program designed to model the electrostatic charging of complicated three-dimensional objects, both in a test tank and at geosynchronous altitudes. This document is a programmer's reference manual and user's guide. It is designed as a reference to experienced users of the code, as well as an introduction to its use for beginners. All of the many capabilities of NASCAP are covered in detail, together with examples of their use. These include the definition of objects, plasma environments, potential calculations, particle emission and detection simulations, and charging analysis.

  2. History of DCAS 1961. Volume V. Origins of the USAF Space Program 1945-1956

    DTIC Science & Technology

    1961-01-01

    ...information as it appears in the narrative. It is to be hoped that additional information bearing on the formative years of the space program will appear as a...

  3. Technical Support Document for Version 3.4.0 of the COMcheck Software

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bartlett, Rosemarie; Connell, Linda M.; Gowri, Krishnan

    2007-09-14

    COMcheck provides an optional way to demonstrate compliance with commercial and high-rise residential building energy codes. Commercial buildings include all use groups except single family and multifamily not over three stories in height. COMcheck was originally based on ANSI/ASHRAE/IES Standard 90.1-1989 (Standard 90.1-1989) requirements and is intended for use with various codes based on Standard 90.1, including the Codification of ASHRAE/IES Standard 90.1-1989 (90.1-1989 Code) (ASHRAE 1989a, 1993b) and ASHRAE/IESNA Standard 90.1-1999 (Standard 90.1-1999). This includes jurisdictions that have adopted the 90.1-1989 Code, Standard 90.1-1989, Standard 90.1-1999, or their own code based on one of these. We view Standard 90.1-1989 and the 90.1-1989 Code as having equivalent technical content and have used both as source documents in developing COMcheck. This technical support document (TSD) is designed to explain the technical basis for the COMcheck software as originally developed based on the ANSI/ASHRAE/IES Standard 90.1-1989 (Standard 90.1-1989). Documentation for other national model codes and standards and specific state energy codes supported in COMcheck has been added to this report as appendices. These appendices are intended to provide technical documentation for features specific to the supported codes and for any changes made for state-specific codes that differ from the standard features that support compliance with the national model codes and standards.

  4. Using Inspections to Improve the Quality of Product Documentation and Code.

    ERIC Educational Resources Information Center

    Zuchero, John

    1995-01-01

    Describes how, by adapting software inspections to assess documentation and code, technical writers can collaborate with development personnel, editors, and customers to dramatically improve both the quality of documentation and the very process of inspecting that documentation. Notes that the five steps involved in the inspection process are:…

  5. User's manual for COAST 4: a code for costing and sizing tokamaks

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sink, D. A.; Iwinski, E. M.

    1979-09-01

    The purpose of this report is to document the computer program COAST 4 for the user/analyst. COAST (COst And Size Tokamak reactors) provides complete and self-consistent size models for the engineering features of D-T burning tokamak reactors and associated facilities, covering a continuum of performance from highly beam-driven through ignited plasma devices. TNS (The Next Step) devices with no tritium breeding or electrical power production are handled, as well as power-producing and fissile-producing fusion-fission hybrid reactors. The code has been normalized with a TFTR calculation which is consistent with cost, size, and performance data published in the conceptual design report for that device. Information on code development, computer implementation, and detailed user instructions are included in the text.

  6. Guidelines for VCCT-Based Interlaminar Fatigue and Progressive Failure Finite Element Analysis

    NASA Technical Reports Server (NTRS)

    Deobald, Lyle R.; Mabson, Gerald E.; Engelstad, Steve; Prabhakar, M.; Gurvich, Mark; Seneviratne, Waruna; Perera, Shenal; O'Brien, T. Kevin; Murri, Gretchen; Ratcliffe, James

    2017-01-01

    This document is intended to detail the theoretical basis, equations, references and data that are necessary to enhance the functionality of commercially available Finite Element codes, with the objective of having functionality better suited for the aerospace industry in the area of composite structural analysis. The specific area of focus will be improvements to composite interlaminar fatigue and progressive interlaminar failure. Suggestions are biased towards codes that perform interlaminar Linear Elastic Fracture Mechanics (LEFM) using Virtual Crack Closure Technique (VCCT)-based algorithms [1,2]. All aspects of the science associated with composite interlaminar crack growth are not fully developed and the codes developed to predict this mode of failure must be programmed with sufficient flexibility to accommodate new functional relationships as the science matures.
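    As an example of the kind of relationship such codes implement, the classic one-step VCCT estimate of the mode I energy release rate at a crack-tip node pair is G_I = (Fy * dv) / (2 * da * b), where Fy is the nodal force holding the crack faces closed, dv the relative opening displacement one element behind the tip, da the element length, and b the element width. A minimal C sketch of this textbook Rybicki-Kanninen form, not code from the guidelines document:

        #include <stdio.h>

        /* One-step VCCT, mode I (textbook Rybicki-Kanninen form). */
        static double vcct_mode_I(double Fy, double dv, double da, double b) {
            return (Fy * dv) / (2.0 * da * b);
        }

        int main(void) {
            /* Hypothetical nodal results in consistent units (N, mm). */
            double GI = vcct_mode_I(120.0, 0.004, 0.5, 25.0);
            printf("G_I = %.6f N/mm\n", GI);
            return 0;
        }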

  7. Evaluation of Recent Upgrades to the NESS (Nuclear Engine System Simulation) Code

    NASA Technical Reports Server (NTRS)

    Fittje, James E.; Schnitzler, Bruce G.

    2008-01-01

    The Nuclear Thermal Rocket (NTR) concept is being evaluated as a potential propulsion technology for exploratory expeditions to the moon, Mars, and beyond. The need for exceptional propulsion system performance in these missions has been documented in numerous studies, and was the primary focus of a considerable effort undertaken during the Rover/NERVA program from 1955 to 1973. The NASA Glenn Research Center is leveraging this past NTR investment in their vehicle concepts and mission analysis studies with the aid of the Nuclear Engine System Simulation (NESS) code. This paper presents the additional capabilities and upgrades made to this code in order to perform higher fidelity NTR propulsion system analysis and design, and a comparison of its results to the Small Nuclear Rocket Engine (SNRE) design.

  8. Top 10 Tips for Using Advance Care Planning Codes in Palliative Medicine and Beyond.

    PubMed

    Jones, Christopher A; Acevedo, Jean; Bull, Janet; Kamal, Arif H

    2016-12-01

    Although recommended for all persons with serious illness, advance care planning (ACP) has historically been a charitable clinical service. Inadequate or unreliable provisions for reimbursement, among other barriers, have spurred a gap between the evidence demonstrating the importance of timely ACP and recognition by payers for its delivery. For the first time, healthcare is experiencing a dramatic shift in billing codes that support increased care management and care coordination. ACP, chronic care management, and transitional care management codes are examples of this newer recognition of the value of these types of services. ACP discussions are an integral component of comprehensive, high-quality palliative care delivery. The advent of reimbursement mechanisms to recognize these services has an enormous potential to impact palliative care program sustainability and growth. In this article, we highlight 10 tips for effectively using the new ACP codes reimbursable under Medicare. The importance of documentation, proper billing, and nuances regarding coding is addressed.

  9. Top 10 Tips for Using Advance Care Planning Codes in Palliative Medicine and Beyond

    PubMed Central

    Acevedo, Jean; Bull, Janet; Kamal, Arif H.

    2016-01-01

    Although recommended for all persons with serious illness, advance care planning (ACP) has historically been a charitable clinical service. Inadequate or unreliable provisions for reimbursement, among other barriers, have spurred a gap between the evidence demonstrating the importance of timely ACP and recognition by payers for its delivery. For the first time, healthcare is experiencing a dramatic shift in billing codes that support increased care management and care coordination. ACP, chronic care management, and transitional care management codes are examples of this newer recognition of the value of these types of services. ACP discussions are an integral component of comprehensive, high-quality palliative care delivery. The advent of reimbursement mechanisms to recognize these services has an enormous potential to impact palliative care program sustainability and growth. In this article, we highlight 10 tips for effectively using the new ACP codes reimbursable under Medicare. The importance of documentation, proper billing, and nuances regarding coding is addressed. PMID:27682147

  10. Documentation of the GLAS fourth order general circulation model. Volume 2: Scalar code

    NASA Technical Reports Server (NTRS)

    Kalnay, E.; Balgovind, R.; Chao, W.; Edelmann, D.; Pfaendtner, J.; Takacs, L.; Takano, K.

    1983-01-01

    Volume 2 of a three-volume technical memorandum contains detailed documentation of the GLAS fourth-order general circulation model. It contains the CYBER 205 scalar and vector codes of the model, a list of variables, and cross references. A variable-name dictionary for the scalar code and code listings are also provided.

  11. Coding of procedures documented by general practitioners in Swedish primary care-an explorative study using two procedure coding systems

    PubMed Central

    2012-01-01

    Background Procedures documented by general practitioners in primary care have not been studied in relation to procedure coding systems. We aimed to describe procedures documented by Swedish general practitioners in electronic patient records and to compare them to the Swedish Classification of Health Interventions (KVÅ) and SNOMED CT. Methods Procedures in 200 record entries were identified, coded, assessed in relation to two procedure coding systems and analysed. Results 417 procedures found in the 200 electronic patient record entries were coded with 36 different Classification of Health Interventions categories and 148 different SNOMED CT concepts. 22.8% of the procedures could not be coded with any Classification of Health Interventions category and 4.3% could not be coded with any SNOMED CT concept. 206 procedure-concept/category pairs were assessed as a complete match in SNOMED CT compared to 10 in the Classification of Health Interventions. Conclusions Procedures documented by general practitioners were present in nearly all electronic patient record entries. Almost all procedures could be coded using SNOMED CT. Classification of Health Interventions covered the procedures to a lesser extent and with a much lower degree of concordance. SNOMED CT is a more flexible terminology system that can be used for different purposes for procedure coding in primary care. PMID:22230095

  12. HYDRA-II: A hydrothermal analysis computer code: Volume 3, Verification/validation assessments

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    McCann, R.A.; Lowery, P.S.

    1987-10-01

    HYDRA-II is a hydrothermal computer code capable of three-dimensional analysis of coupled conduction, convection, and thermal radiation problems. This code is especially appropriate for simulating the steady-state performance of spent fuel storage systems. The code has been evaluated for this application for the US Department of Energy's Commercial Spent Fuel Management Program. HYDRA-II provides a finite difference solution in cartesian coordinates to the equations governing the conservation of mass, momentum, and energy. A cylindrical coordinate system may also be used to enclose the cartesian coordinate system. This exterior coordinate system is useful for modeling cylindrical cask bodies. The difference equations for conservation of momentum are enhanced by the incorporation of directional porosities and permeabilities that aid in modeling solid structures whose dimensions may be smaller than the computational mesh. The equation for conservation of energy permits modeling of orthotropic physical properties and film resistances. Several automated procedures are available to model radiation transfer within enclosures and from fuel rod to fuel rod. The documentation of HYDRA-II is presented in three separate volumes. Volume I - Equations and Numerics describes the basic differential equations, illustrates how the difference equations are formulated, and gives the solution procedures employed. Volume II - User's Manual contains code flow charts, discusses the code structure, provides detailed instructions for preparing an input file, and illustrates the operation of the code by means of a model problem. This volume, Volume III - Verification/Validation Assessments, provides a comparison between the analytical solution and the numerical simulation for problems with a known solution. This volume also documents comparisons between the results of simulations of single- and multiassembly storage systems and actual experimental data. 11 refs., 55 figs., 13 tabs.

  13. CET93 and CETPC: An interim updated version of the NASA Lewis computer program for calculating complex chemical equilibria with applications

    NASA Technical Reports Server (NTRS)

    Mcbride, Bonnie J.; Reno, Martin A.; Gordon, Sanford

    1994-01-01

    The NASA Lewis chemical equilibrium program with applications continues to be improved and updated. The latest version is CET93. This code, with smaller arrays, has been compiled for use on an IBM or IBM-compatible personal computer and is called CETPC. This report is intended primarily as a user's manual for CET93 and CETPC. It does not repeat the more complete documentation of earlier reports on the equilibrium program. Most of the discussion covers input and output files, two new options (ONLY and comments), example problems, and implementation of CETPC.

  14. A computer program for calculating relative-transmissivity input arrays to aid model calibration

    USGS Publications Warehouse

    Weiss, Emanuel

    1982-01-01

    A program is documented that calculates a transmissivity distribution for input to a digital ground-water flow model. Factors that are taken into account in the calculation are: aquifer thickness, ground-water viscosity and its dependence on temperature and dissolved solids, and permeability and its dependence on overburden pressure. Other factors affecting ground-water flow are indicated. With small changes in the program code, leakance also could be calculated. The purpose of these calculations is to provide a physical basis for efficient calibration, and to extend rational transmissivity trends into areas where model calibration is insensitive to transmissivity values.
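    A hedged sketch of the physical relationship such a calculation rests on (the program's actual functional forms for viscosity and permeability are not given in the abstract): transmissivity T = K*b, with hydraulic conductivity K = k*rho*g/mu, so that greater thickness, higher permeability, and lower viscosity all raise T.

        #include <stdio.h>

        /* T = (k * rho * g / mu) * b; illustrative only. */
        static double transmissivity(double k,   /* permeability, m^2 */
                                     double b,   /* thickness, m      */
                                     double rho, /* density, kg/m^3   */
                                     double mu)  /* viscosity, Pa*s   */
        {
            const double g = 9.81;          /* m/s^2 */
            return (k * rho * g / mu) * b;  /* m^2/s */
        }

        int main(void) {
            /* Example values: k = 1e-12 m^2, b = 50 m, cold fresh water. */
            printf("T = %.3e m^2/s\n",
                   transmissivity(1e-12, 50.0, 1000.0, 1.0e-3));
            return 0;
        }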

  15. An Engine Research Program Focused on Low Pressure Turbine Aerodynamic Performance

    NASA Technical Reports Server (NTRS)

    Castner, Raymond; Wyzykowski, John; Chiapetta, Santo; Adamczyk, John

    2002-01-01

    A comprehensive test program was performed in the Propulsion Systems Laboratory at the NASA Glenn Research Center, Cleveland, Ohio, using a highly instrumented Pratt and Whitney Canada PW 545 turbofan engine. A key objective of this program was the development of a high-altitude database on small, high-bypass-ratio engine performance and operability. In particular, the program documents the impact of altitude (Reynolds number) on the aero-performance of the low-pressure turbine (fan turbine). A second objective was to assess the ability of a state-of-the-art CFD code to predict the effect of Reynolds number on the efficiency of the low-pressure turbine. CFD simulations performed before and after the engine tests are presented and discussed. A key finding is the ability of a state-of-the-art CFD code to accurately predict the impact of Reynolds number on the efficiency and flow capacity of the low-pressure turbine. In addition, the CFD simulations showed the turbulent intensity exiting the low-pressure turbine to be high (9%). The level is consistent with measurements taken within an engine.

  16. Correct coding for laboratory procedures during assisted reproductive technology cycles.

    PubMed

    2016-04-01

    This document provides updated coding information for services related to assisted reproductive technology procedures. This document replaces the 2012 ASRM document of the same name. Copyright © 2016 American Society for Reproductive Medicine. Published by Elsevier Inc. All rights reserved.

  17. FERRET data analysis code

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Schmittroth, F.

    1979-09-01

    A documentation of the FERRET data analysis code is given. The code provides a way to combine related measurements and calculations in a consistent evaluation. Basically a very general least-squares code, it is oriented towards problems frequently encountered in nuclear data and reactor physics. A strong emphasis is on the proper treatment of uncertainties and correlations and in providing quantitative uncertainty estimates. Documentation includes a review of the method, structure of the code, input formats, and examples.
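    The abstract does not reproduce FERRET's equations; for orientation, a generic least-squares evaluation that combines measurements y having covariance V under a linear model A x takes the generalized least-squares form (FERRET's nuclear-data-specific formulation may differ):

        \hat{x} = \left(A^{\top} V^{-1} A\right)^{-1} A^{\top} V^{-1} y,
        \qquad
        \operatorname{Cov}(\hat{x}) = \left(A^{\top} V^{-1} A\right)^{-1}

    The covariance of the estimate is exactly the kind of quantitative uncertainty output the abstract emphasizes.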

  18. RAMONA-4B a computer code with three-dimensional neutron kinetics for BWR and SBWR system transient - user's manual

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rohatgi, U.S.; Cheng, H.S.; Khan, H.J.

    This document is the User's Manual for the Boiling Water Reactor (BWR) and Simplified Boiling Water Reactor (SBWR) systems transient code RAMONA-4B. The code uses a three-dimensional neutron-kinetics model coupled with a multichannel, nonequilibrium, drift-flux, phase-flow model of the thermal hydraulics of the reactor vessel. The code is designed to analyze a wide spectrum of BWR core and system transients. Chapter 1 gives an overview of the code's capabilities and limitations; Chapter 2 describes the code's structure, lists major subroutines, and discusses the computer requirements. Chapter 3 covers the code, auxiliary codes, and instructions for running RAMONA-4B on Sun SPARC and IBM workstations. Chapter 4 contains component descriptions and detailed card-by-card input instructions. Chapter 5 provides samples of the tabulated output for the steady-state and transient calculations and discusses the plotting procedures for the steady-state and transient calculations. Three appendices contain important user and programmer information: lists of plot variables (Appendix A), listings of the input deck for the sample problem (Appendix B), and a description of the plotting program PAD (Appendix C). 24 refs., 18 figs., 11 tabs.

  19. Empirical transfer functions for stations in the Central California seismological network

    USGS Publications Warehouse

    Bakun, W.H.; Dratler, Jay

    1976-01-01

    A sequence of calibration signals composed of a station identification code, a transient from the release of the seismometer mass at rest from a known displacement from the equilibrium position, and a transient from a known step in voltage to the amplifier input are generated by the automatic daily calibration system (ADCS) now operational in the U.S. Geological Survey central California seismographic network. Documentation of a sequence of interactive programs to compute, from the calibration data, the complex transfer functions for the seismographic system (ground motion through digitizer), the electronics (amplifier through digitizer), and the seismometer alone is presented. The analysis utilizes the Fourier transform technique originally suggested by Espinosa et al. (1962). Section I is a general description of seismographic calibration. Section II contrasts the 'Fourier transform' and the 'least-squares' techniques for analyzing transient calibration signals. Theoretical considerations for the Fourier transform technique used here are described in Section III. Section IV is a detailed description of the sequence of calibration signals generated by the ADCS. Section V is a brief 'cookbook description' of the calibration programs; Section VI contains a detailed sample program execution. Section VII suggests uses of the resultant empirical transfer functions. Supplemental interactive programs that produce smooth response functions, suitable for reducing seismic data to ground motion, are also documented in Section VII. Appendices A and B contain complete listings of the FORTRAN source codes, while Appendix C is an update containing preliminary results obtained from an analysis of some of the calibration signals from stations in the seismographic network near Oroville, California.
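    The Fourier-transform relation underlying the technique is simple to state (this is a generic statement, not the programs' specific processing steps): for a recorded calibration response y(t) to a known input x(t), the empirical transfer function is the ratio of their transforms,

        H(f) = \frac{Y(f)}{X(f)}, \qquad
        Y(f) = \int y(t)\, e^{-2\pi i f t}\, dt,

    and a component-only response (for example, the seismometer alone) can then be isolated by forming the appropriate ratio of the measured whole-system and electronics-only functions.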

  20. Computer-assisted coding and clinical documentation: first things first.

    PubMed

    Tully, Melinda; Carmichael, Angela

    2012-10-01

    Computer-assisted coding tools have the potential to drive improvements in seven areas: transparency of coding; productivity (generally by 20 to 25 percent for inpatient claims); accuracy (by improving specificity of documentation); cost containment (by reducing overtime expenses, audit fees, and denials); compliance; efficiency; and consistency.

  1. PAN AIR: A computer program for predicting subsonic or supersonic linear potential flows about arbitrary configurations using a higher order panel method. Volume 4: Maintenance document (version 1.1)

    NASA Technical Reports Server (NTRS)

    Baruah, P. K.; Bussoletti, J. E.; Chiang, D. T.; Massena, W. A.; Nelson, F. D.; Furdon, D. J.; Tsurusaki, K.

    1981-01-01

    The Maintenance Document is a guide to the PAN AIR software system, a system which computes the subsonic or supersonic linear potential flow about a body of nearly arbitrary shape, using a higher order panel method. The document describes the over-all system and each program module of the system. Sufficient detail is given for program maintenance, updating and modification. It is assumed that the reader is familiar with programming and CDC (Control Data Corporation) computer systems. The PAN AIR system was written in FORTRAN 4 language except for a few COMPASS language subroutines which exist in the PAN AIR library. Structured programming techniques were used to provide code documentation and maintainability. The operating systems accommodated are NOS 1.2, NOS/BE and SCOPE 2.1.3 on the CDC 6600, 7600 and Cyber 175 computing systems. The system is comprised of a data management system, a program library, an execution control module and nine separate FORTRAN technical modules. Each module calculates part of the posed PAN AIR problem. The data base manager is used to communicate between modules and within modules. The technical modules must be run in a prescribed fashion for each PAN AIR problem. In order to ease the problem of supplying the many JCL cards required to execute the modules, a separate module called MEC (Module Execution Control) was created to automatically supply most of the JCL cards. In addition to the MEC generated JCL, there is an additional set of user supplied JCL cards to initiate the JCL sequence stored on the system.

  2. Assessment of MSIV full closure for Santa Maria de Garona Nuclear Power Plant using TRAC-BF1 (G1J1)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Crespo, J.L.; Fernandez, R.A.

    1993-06-01

    This document presents a spurious Main Steam Isolation Valve (MSIV) closure analysis for the Santa Maria de Garona Nuclear Power Plant, describing the problems found when comparing calculated and real data. The plant is a General Electric Boiling Water Reactor 3, containment type Mark 1. It is operated by NUCLENOR, S.A. and was connected to the grid in 1971. The analysis has been performed by the Applied Physics Department of the University of Cantabria and the Analysis and Operation Section of NUCLENOR, S.A. as part of an agreement for developing an engineering simulator of operational transients and accidents for the Santa Maria de Garona Power Plant. The analysis was performed using the frozen version of the TRAC-BF1 (G1J1) code and is the second of two NUCLENOR contributions to the International Code Applications and Assessment Program (ICAP). The code was run on a Cyber 932 with operating system NOS/VE, property of NUCLENOR, S.A. A programming effort was carried out in order to provide suitable graphics from the output file.

  3. Comparisons of ANS, ASME, AWS, and NFPA standards cited in the NRC standard review plan, NUREG-0800, and related documents

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ankrum, A.R.; Bohlander, K.L.; Gilbert, E.R.

    This report provides the results of comparisons of the cited and latest versions of ANS, ASME, AWS and NFPA standards cited in the NRC Standard Review Plan for the Review of Safety Analysis Reports for Nuclear Power Plants (NUREG-0800) and related documents. The comparisons were performed by Battelle Pacific Northwest Laboratories in support of the NRC's Standard Review Plan Update and Development Program. Significant changes to the standards, from the cited version to the latest version, are described and discussed in a tabular format for each standard. Recommendations for updating each citation in the Standard Review Plan are presented. Technical considerations and suggested changes are included for related regulatory documents (i.e., Regulatory Guides and the Code of Federal Regulations) citing the standard. The results and recommendations presented in this document have not been subjected to NRC staff review.

  4. Documentation for the machine-readable character coded version of the SKYMAP catalogue

    NASA Technical Reports Server (NTRS)

    Warren, W. H., Jr.

    1981-01-01

    The SKYMAP catalogue is a compilation of astronomical data prepared primarily for purposes of attitude guidance for satellites. In addition to the SKYMAP Master Catalogue data base, a software package of data base management and utility programs is available. The tape version of the SKYMAP Catalogue, as received by the Astronomical Data Center (ADC), contains logical records consisting of a combination of binary and EBCDIC data. Certain character-coded data in each record are redundant in that the same data are present in binary form. In order to facilitate wider use of all SKYMAP data by the astronomical community, a formatted (character) version was prepared by eliminating all redundant character data and converting all binary data to character form. The character version of the catalogue is described. The document is intended to fully describe the formatted tape so that users can process the data without problems and guesswork; it should be distributed with any character version of the catalogue.

  5. Integrated Mission Simulation (IMSim): Multiphase Initialization Design with Late Joiners, Rejoiners and Federation Save & Restore

    NASA Technical Reports Server (NTRS)

    Dexter, Daniel E.; Varesic, Tony E.

    2015-01-01

    This document describes the design of the Integrated Mission Simulation (IMSim) federate multiphase initialization process. The main goal of multiphase initialization is to allow for data interdependencies during the federate initialization process. IMSim uses the High Level Architecture (HLA) IEEE 1516 [1] to provide the communication and coordination between the distributed parts of the simulation. They are implemented using the Runtime Infrastructure (RTI) from Pitch Technologies AB. This document assumes a basic understanding of IEEE 1516 HLA, and C++ programming. In addition, there are several subtle points in working with IEEE 1516 and the Pitch RTI that need to be understood, which are covered in Appendix A. Please note the C++ code samples shown in this document are for the IEEE 1516-2000 standard.

  6. HEPLIB '91: International users meeting on the support and environments of high energy physics computing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Johnstad, H.

    The purpose of this meeting is to discuss the current and future HEP computing support and environments from the perspective of new horizons in accelerator, physics, and computing technologies. Topics of interest to the Meeting include (but are not limited to): the forming of the HEPLIB world user group for High Energy Physics computing; mandate, desirables, coordination, organization, funding; user experience, international collaboration; the roles of national labs, universities, and industry; range of software, Monte Carlo, mathematics, physics, interactive analysis, text processors, editors, graphics, data base systems, code management tools; program libraries, frequency of updates, distribution; distributed and interactive computing, data base systems, user interface, UNIX operating systems, networking, compilers, Xlib, X-Graphics; documentation, updates, availability, distribution; code management in large collaborations, keeping track of program versions; and quality assurance, testing, conventions, standards.

  7. HEPLIB 91: International users meeting on the support and environments of high energy physics computing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Johnstad, H.

    The purpose of this meeting is to discuss the current and future HEP computing support and environments from the perspective of new horizons in accelerator, physics, and computing technologies. Topics of interest to the Meeting include (but are not limited to): the forming of the HEPLIB world user group for High Energy Physics computing; mandate, desirables, coordination, organization, funding; user experience, international collaboration; the roles of national labs, universities, and industry; range of software, Monte Carlo, mathematics, physics, interactive analysis, text processors, editors, graphics, data base systems, code management tools; program libraries, frequency of updates, distribution; distributed and interactive computing, data base systems, user interface, UNIX operating systems, networking, compilers, Xlib, X-Graphics; documentation, updates, availability, distribution; code management in large collaborations, keeping track of program versions; and quality assurance, testing, conventions, standards.

  8. Radiation and scattering from bodies of translation. Volume 2: User's manual, computer program documentation

    NASA Astrophysics Data System (ADS)

    Medgyesi-Mitschang, L. N.; Putnam, J. M.

    1980-04-01

    A hierarchy of computer programs implementing the method of moments for bodies of translation (MM/BOT) is described. The algorithm treats the far-field radiation and scattering from finite-length open cylinders of arbitrary cross section as well as the near fields and aperture-coupled fields for rectangular apertures on such bodies. The theoretical development underlying the algorithm is described in Volume 1. The structure of the computer algorithm is such that no a priori knowledge of the method of moments technique or detailed FORTRAN experience are presupposed for the user. A set of carefully drawn example problems illustrates all the options of the algorithm. For more detailed understanding of the workings of the codes, special cross referencing to the equations in Volume 1 is provided. For additional clarity, comment statements are liberally interspersed in the code listings, summarized in the present volume.

  9. Computer Program for Point Location And Calculation of ERror (PLACER)

    USGS Publications Warehouse

    Granato, Gregory E.

    1999-01-01

    A program designed for point location and calculation of error (PLACER) was developed as part of the Quality Assurance Program of the Federal Highway Administration/U.S. Geological Survey (USGS) National Data and Methodology Synthesis (NDAMS) review process. The program provides a standard method to derive study-site locations from site maps in highway-runoff, urban-runoff, and other research reports. This report provides a guide for using PLACER, documents the methods used to estimate study-site locations, documents the NDAMS Study-Site Locator Form, and documents the FORTRAN code used to implement the method. PLACER is a simple program that calculates the latitude and longitude coordinates of one or more study sites plotted on a published map and estimates the uncertainty of these calculated coordinates. PLACER calculates the latitude and longitude of each study site by interpolating between the coordinates of known features and the locations of study sites using any consistent, linear, user-defined coordinate system. This program will read data entered from the computer keyboard and(or) from a formatted text file, and will write the results to the computer screen and to a text file. PLACER is readily transferable to different computers and operating systems with few (if any) modifications because it is written in standard FORTRAN. PLACER can be used to calculate study-site locations in latitude and longitude, using known map coordinates or features that are identifiable in geographic information data bases such as the USGS Geographic Names Information System, which is available on the World Wide Web.
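    A hedged sketch of the interpolation idea (not PLACER's FORTRAN source): given two reference features with known map (x, y) and geographic coordinates in an axis-aligned linear coordinate system, a site's latitude and longitude follow by independent linear interpolation along each axis.

        #include <stdio.h>

        typedef struct { double x, y, lat, lon; } RefPoint;

        /* Linear interpolation between two reference points; assumes the
           user-defined map coordinate system is linear and axis-aligned. */
        static void locate(const RefPoint *a, const RefPoint *b,
                           double x, double y, double *lat, double *lon) {
            *lon = a->lon + (x - a->x) * (b->lon - a->lon) / (b->x - a->x);
            *lat = a->lat + (y - a->y) * (b->lat - a->lat) / (b->y - a->y);
        }

        int main(void) {
            /* Hypothetical map corners and a plotted study site. */
            RefPoint sw = {   0.0,   0.0, 42.000, -71.500 };
            RefPoint ne = { 100.0, 100.0, 42.250, -71.250 };
            double lat, lon;
            locate(&sw, &ne, 37.5, 62.0, &lat, &lon);
            printf("site at lat %.4f, lon %.4f\n", lat, lon);
            return 0;
        }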

  10. Critical Infrastructure: Control Systems and the Terrorist Threat

    DTIC Science & Technology

    2004-01-20

    CRS Report for Congress, received through the CRS Web. Order Code RL31534. Cites Rolf Carlson, "Sandia SCADA Program High-Security SCADA LDRD Final Report," Sandia Report SAND2002..., available online at [http://www.pnl.gov/main/sectors/homeland.html].

  11. Critical Infrastructure: Control Systems and the Terrorist Threat

    DTIC Science & Technology

    2003-07-14

    CRS Report for Congress, received through the CRS Web. Order Code RL31534. Cites Rolf Carlson, "Sandia SCADA Program High-Security SCADA LDRD Final Report," Sandia Report SAND2002..., available online at [http://www.pnl.gov/main/sectors/homeland.html].

  12. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Burge, S.W.

    This report describes the theory and structure of the FORCE2 flow program. The manual describes the governing model equations, the solution procedure, and their implementation in the computer program. FORCE2 is an extension of an existing B&V multidimensional, two-phase flow program. FORCE2 was developed for application to fluid beds by implementing a gas-solids modeling technology derived, in part, during a joint government-industry research program, "Erosion of FBC Heat Transfer Tubes," coordinated by Argonne National Laboratory. The development of FORCE2 was sponsored by ASEA-Babcock, an industry participant in this program. This manual is the principal documentation for the program theory and organization. Program usage and post-processing of code predictions with the FORCE2 post-processor are described in a companion report, FORCE2 -- A Multidimensional Flow Program for Fluid Beds, User's Guide. This manual is segmented into sections to facilitate its usage. In section 2.0, the mass and momentum conservation principles, the basis for the code, are presented. In section 3.0, the constitutive relations used in modeling gas-solids hydrodynamics are given. The finite-difference model equations are derived in section 4.0 and the solution procedures described in sections 5.0 and 6.0. Finally, the implementation of the model equations and solution procedure in FORCE2 is described in section 7.0.

  13. PAN AIR: A computer program for predicting subsonic or supersonic linear potential flows about arbitrary configurations using a higher order panel method. Volume 4: Maintenance document (version 3.0)

    NASA Technical Reports Server (NTRS)

    Purdon, David J.; Baruah, Pranab K.; Bussoletti, John E.; Epton, Michael A.; Massena, William A.; Nelson, Franklin D.; Tsurusaki, Kiyoharu

    1990-01-01

    The Maintenance Document Version 3.0 is a guide to the PAN AIR software system, a system which computes the subsonic or supersonic linear potential flow about a body of nearly arbitrary shape, using a higher order panel method. The document describes the overall system and each program module of the system. Sufficient detail is given for program maintenance, updating, and modification. It is assumed that the reader is familiar with programming and CRAY computer systems. The PAN AIR system was written in FORTRAN 4 language except for a few CAL language subroutines which exist in the PAN AIR library. Structured programming techniques were used to provide code documentation and maintainability. The operating systems accommodated are COS 1.11, COS 1.12, COS 1.13, and COS 1.14 on the CRAY 1S, 1M, and X-MP computing systems. The system is comprised of a data base management system, a program library, an execution control module, and nine separate FORTRAN technical modules. Each module calculates part of the posed PAN AIR problem. The data base manager is used to communicate between modules and within modules. The technical modules must be run in a prescribed fashion for each PAN AIR problem. In order to ease the problem of supplying the many JCL cards required to execute the modules, a set of CRAY procedures (PAPROCS) was created to automatically supply most of the JCL cards. Most of this document has not changed for Version 3.0. It now, however, strictly applies only to PAN AIR version 3.0. The major changes are: (1) additional sections covering the new FDP module (which calculates streamlines and offbody points); (2) a complete rewrite of the section on the MAG module; and (3) strict applicability to CRAY computing systems.

  14. A component-centered meta-analysis of family-based prevention programs for adolescent substance use.

    PubMed

    Van Ryzin, Mark J; Roseth, Cary J; Fosco, Gregory M; Lee, You-Kyung; Chen, I-Chien

    2016-04-01

    Although research has documented the positive effects of family-based prevention programs, the field lacks specific information regarding why these programs are effective. The current study summarized the effects of family-based programs on adolescent substance use using a component-based approach to meta-analysis in which we decomposed programs into a set of key topics or components that were specifically addressed by program curricula (e.g., parental monitoring/behavior management, problem solving, positive family relations, etc.). Components were coded according to the amount of time spent on program services that targeted youth, parents, and the whole family; we also coded effect sizes across studies for each substance-related outcome. Given the nested nature of the data, we used hierarchical linear modeling to link program components (Level 2) with effect sizes (Level 1). The overall effect size across programs was .31, which did not differ by type of substance. Youth-focused components designed to encourage more positive family relationships and a positive orientation toward the future emerged as key factors predicting larger than average effect sizes. Our results suggest that, within the universe of family-based prevention, where components such as parental monitoring/behavior management are almost universal, adding or expanding certain youth-focused components may be able to enhance program efficacy. Copyright © 2016 Elsevier Ltd. All rights reserved.

  15. A Verification-Driven Approach to Traceability and Documentation for Auto-Generated Mathematical Software

    NASA Technical Reports Server (NTRS)

    Denney, Ewen W.; Fischer, Bernd

    2009-01-01

    Model-based development and automated code generation are increasingly used for production code in safety-critical applications, but since code generators are typically not qualified, the generated code must still be fully tested, reviewed, and certified. This is particularly arduous for mathematical and control engineering software which requires reviewers to trace subtle details of textbook formulas and algorithms to the code, and to match requirements (e.g., physical units or coordinate frames) not represented explicitly in models or code. Both tasks are complicated by the often opaque nature of auto-generated code. We address these problems by developing a verification-driven approach to traceability and documentation. We apply the AUTOCERT verification system to identify and then verify mathematical concepts in the code, based on a mathematical domain theory, and then use these verified traceability links between concepts, code, and verification conditions to construct a natural language report that provides a high-level structured argument explaining why and how the code uses the assumptions and complies with the requirements. We have applied our approach to generate review documents for several sub-systems of NASA's Project Constellation.

  16. A user's guide to Sandia's latin hypercube sampling software: LHS UNIX library/standalone version.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Swiler, Laura Painton; Wyss, Gregory Dane

    2004-07-01

    This document is a reference guide for the UNIX Library/Standalone version of the Latin Hypercube Sampling Software. This software has been developed to generate Latin hypercube multivariate samples. This version runs on Linux or UNIX platforms. This manual covers the use of the LHS code in a UNIX environment, run either as a standalone program or as a callable library. The underlying code in the UNIX Library/Standalone version of LHS is almost identical to the updated Windows version of LHS released in 1998 (SAND98-0210). However, some modifications were made to customize it for a UNIX environment and as a library that is called from the DAKOTA environment. This manual covers the use of the LHS code as a library and in the standalone mode under UNIX.
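
    As a rough illustration of what Latin hypercube sampling produces, the following Python sketch draws a small sample for independent uniform variables. It is a minimal sketch of the general technique, not code from the Sandia LHS distribution.

        import numpy as np

        def latin_hypercube(n_samples, n_vars, rng=None):
            """Draw a Latin hypercube sample on [0, 1)^n_vars.

            Each variable's range is split into n_samples equal strata;
            exactly one point falls in each stratum, and strata are paired
            across variables by independent random permutations.
            """
            rng = np.random.default_rng(rng)
            # One uniform draw inside each stratum, for each variable.
            u = rng.random((n_samples, n_vars))
            strata = (np.arange(n_samples)[:, None] + u) / n_samples
            # Shuffle the stratum order independently per variable.
            for j in range(n_vars):
                rng.shuffle(strata[:, j])
            return strata

        print(latin_hypercube(10, 3, rng=42))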

  17. Aeroacoustic Codes for Rotor Harmonic and BVI Noise. CAMRAD.Mod1/HIRES: Methodology and Users' Manual

    NASA Technical Reports Server (NTRS)

    Boyd, D. Douglas, Jr.; Brooks, Thomas F.; Burley, Casey L.; Jolly, J. Ralph, Jr.

    1998-01-01

    This document details the methodology and use of the CAMRAD.Mod1/HIRES codes, which were developed at NASA Langley Research Center for the prediction of helicopter harmonic and Blade-Vortex Interaction (BVI) noise. CAMRAD.Mod1 is a substantially modified version of the performance/trim/wake code CAMRAD. High-resolution blade loading is determined in post-processing by HIRES and an associated indicial aerodynamics code. Extensive capabilities of importance to noise prediction accuracy are documented, including a new multi-core tip vortex roll-up wake model, higher harmonic and individual blade control, tunnel and fuselage correction input, diagnostic blade motion input, and interfaces for acoustic and CFD aerodynamics codes. Modifications and new code capabilities are documented with examples. A users' job preparation guide and listings of variables and namelists are given.

  18. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Peck, T; Sparkman, D; Storch, N

    ''The LLNL Site-Specific Advanced Simulation and Computing (ASCI) Software Quality Engineering Recommended Practices VI.I'' document describes a set of recommended software quality engineering (SQE) practices for ASCI code projects at Lawrence Livermore National Laboratory (LLNL). In this context, SQE is defined as the process of building quality into software products by applying the appropriate guiding principles and management practices. Continual code improvement and ongoing process improvement are expected benefits. Certain practices are recommended, although projects may select the specific activities they wish to improve, and the appropriate time lines for such actions. Additionally, projects can rely on the guidance of this document when generating ASCI Verification and Validation (V&V) deliverables. ASCI program managers will gather information about their software engineering practices and improvement. This information can be shared to leverage the best SQE practices among development organizations. It will further be used to ensure the currency and vitality of the recommended practices. This Overview is intended to provide basic information to the LLNL ASCI software management and development staff from the ''LLNL Site-Specific ASCI Software Quality Engineering Recommended Practices VI.I'' document. Additionally the Overview provides steps to using the ''LLNL Site-Specific ASCI Software Quality Engineering Recommended Practices VI.I'' document. For definitions of terminology and acronyms, refer to the Glossary and Acronyms sections in the ''LLNL Site-Specific ASCI Software Quality Engineering Recommended Practices VI.I''.

  19. Language-Agnostic Reproducible Data Analysis Using Literate Programming.

    PubMed

    Vassilev, Boris; Louhimo, Riku; Ikonen, Elina; Hautaniemi, Sampsa

    2016-01-01

    A modern biomedical research project can easily contain hundreds of analysis steps, and lack of reproducibility of the analyses has been recognized as a severe issue. While thorough documentation enables reproducibility, the number of analysis programs used can be so large that in reality reproducibility cannot be easily achieved. Literate programming is an approach to present computer programs to human readers. The code is rearranged to follow the logic of the program, and to explain that logic in a natural language. The code executed by the computer is extracted from the literate source code. As such, literate programming is an ideal formalism for systematizing analysis steps in biomedical research. We have developed the reproducible computing tool Lir (literate, reproducible computing) that allows a tool-agnostic approach to biomedical data analysis. We demonstrate the utility of Lir by applying it to a case study. Our aim was to investigate the role of endosomal trafficking regulators in the progression of breast cancer. In this analysis, a variety of tools were combined to interpret the available data: a relational database, standard command-line tools, and a statistical computing environment. The analysis revealed that the lipid transport related genes LAPTM4B and NDRG1 are coamplified in breast cancer patients, and identified genes potentially cooperating with LAPTM4B in breast cancer progression. Our case study demonstrates that with Lir, an array of tools can be combined in the same data analysis to improve efficiency, reproducibility, and ease of understanding. Lir is an open-source software available at github.com/borisvassilev/lir.
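
    The core mechanic of literate programming, extracting the executable code from a document that interleaves prose and code, can be sketched in a few lines. The Python below is an illustration of the general idea, not Lir's implementation; the chunk delimiters ('<<code>>=' and '@', a noweb-like convention) are assumptions.

        import re

        def tangle(literate_source):
            """Extract code chunks from a literate document.

            Chunks are assumed to be delimited by lines containing only
            '<<code>>=' and '@'; everything between the delimiters is
            emitted verbatim, in order.
            """
            chunks = re.findall(r"^<<code>>=\n(.*?)^@\s*$", literate_source,
                                flags=re.DOTALL | re.MULTILINE)
            return "\n".join(chunks)

        doc = """Compute a mean.
        <<code>>=
        values = [1, 2, 3]
        @
        Then report it.
        <<code>>=
        print(sum(values) / len(values))
        @
        """
        # Dedent the embedded example before tangling it.
        print(tangle("\n".join(line.strip() for line in doc.splitlines())))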

  1. 77 FR 67340 - National Fire Codes: Request for Comments on NFPA's Codes and Standards

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-11-09

    ... the process. The Code Revision Process contains four basic steps that are followed for developing new documents as well as revising existing documents. Step 1: Public Input Stage, which results in the First Draft Report (formerly ROP); Step 2: Comment Stage, which results in the Second Draft Report (formerly...

  2. Standardized development of computer software. Part 1: Methods

    NASA Technical Reports Server (NTRS)

    Tausworthe, R. C.

    1976-01-01

    This work is a two-volume set on standards for modern software engineering methodology. This volume presents a tutorial and practical guide to the efficient development of reliable computer software, a unified and coordinated discipline for design, coding, testing, documentation, and project organization and management. The aim of the monograph is to provide formal disciplines for increasing the probability of securing software that is characterized by high degrees of initial correctness, readability, and maintainability, and to promote practices which aid in the consistent and orderly development of a total software system within schedule and budgetary constraints. These disciplines are set forth as a set of rules to be applied during software development to drastically reduce the time traditionally spent in debugging, to increase documentation quality, to foster understandability of the software among those who must come in contact with it, and to facilitate operations and alterations of the program as requirements on the program environment change.

  3. Clinical integration of billing for a pediatric nephrology and transplant program.

    PubMed

    Tietjen, Andrea L; Orsini, Jenoveva; Mulgaonkar, Shamkant; Morgan, Debbie

    2003-09-01

    To develop and implement a billing process that fully integrates all activities of a pediatric nephrology and transplant program, by facilitating and coordinating data from patients, physicians, hospitals, and third-party billing services to maximize revenues. Financial operations were analyzed via a randomized audit of patient charts that focused on office procedures and revenue collection. Results were based on monthly reports documenting revenue received and outstanding, procedures billed, and patient registration accuracy. The combination of improvements in patient registration, chart documentation, new billing sheets with procedure and diagnosis codes, physician in-service education, upgraded charges, and the recredentialing of all practice physicians realized an increase in revenue collections from 18% in 2000 to 89% in 2001. The need to integrate and coordinate information is vital for both billing accuracy and revenue collections. Integration of clinical services and billing procedures has maximized performance, profitability, and accuracy while decreasing administrative time and costs.

  4. Validation of Framework Code Approach to a Life Prediction System for Fiber Reinforced Composites

    NASA Technical Reports Server (NTRS)

    Gravett, Phillip

    1997-01-01

    The grant was conducted by the MMC Life Prediction Cooperative, an industry/government collaborative team; Ohio Aerospace Institute (OAI) acted as the prime contractor on behalf of the Cooperative for this grant effort. See Figure I for the organization and responsibilities of team members. The technical effort was conducted during the period August 7, 1995 to June 30, 1996 in cooperation with Erwin Zaretsky, the LeRC Program Monitor. Phil Gravett of Pratt & Whitney was the principal technical investigator. Table I documents all meeting-related coordination memos during this period. The effort under this grant was closely coordinated with an existing USAF-sponsored program focused on putting into practice a life prediction system for turbine engine components made of metal matrix composites (MMC). The overall architecture of the MMC life prediction system was defined in the USAF-sponsored program (prior to this grant). The efforts of this grant were focused on implementing and tailoring the life prediction system, the framework code within it, and the damage modules within it to meet the specific requirements of the Cooperative. The tailoring of the life prediction system provides the basis for pervasive and continued use of this capability by the industry/government cooperative. The outputs of this grant are: (1) definition of the framework code to analysis modules interfaces, (2) definition of the interface between the materials database and the finite element model, and (3) definition of the integration of the framework code into an FEM design tool.

  5. Intercomparison of Radiation Codes in Climate Models (ICRCCM) Infrared (Clear-Sky) Line-by-Line Radiative Fluxes (DB1002)

    DOE Data Explorer

    Arking, A.; Ridgeway, B.; Clough, T.; Iacono, M.; Fomin, B.; Trotsenko, A.; Freidenreich, S.; Schwarzkopf, D.

    1994-01-01

    The Intercomparison of Radiation Codes in Climate Models (ICRCCM) study was launched under the auspices of the World Meteorological Organization and with the support of the U.S. Department of Energy to document differences in results obtained with various radiation codes and radiation parameterizations in general circulation models (GCMs). ICRCCM produced benchmark, longwave, line-by-line (LBL) fluxes that may be compared against each other and against models of lower spectral resolution. During ICRCCM, infrared fluxes and cooling rates for several standard model atmospheres with varying concentrations of water vapor, carbon dioxide, and ozone were calculated with LBL methods at resolutions of 0.01 cm-1 or higher. For comparison with other models, values were summed for the IR spectrum and given at intervals of 5 or 10 cm-1. This archive contains fluxes for ICRCCM-prescribed clear-sky cases. Radiative flux and cooling-rate profiles are given for specified atmospheric profiles for temperature, water vapor, and ozone-mixing ratios. The archive contains 328 files, including spectral summaries, formatted data files, and a variety of programs (i.e., C-shell scripts, FORTRAN codes, and IDL programs) to read, reformat, and display data. Collectively, these files require approximately 59 MB of disk space.
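
    Reducing line-by-line output to coarse spectral intervals amounts to summing high-resolution flux contributions into 5 or 10 cm-1 bins. A minimal Python sketch of such binning follows; the toy spectrum is invented, and the archive's own file formats are not modeled here.

        import numpy as np

        def bin_fluxes(wavenumber, flux, width=10.0):
            """Sum line-by-line spectral flux contributions into intervals
            of `width` cm^-1, returning interval centers and sums."""
            edges = np.arange(wavenumber.min(), wavenumber.max() + width, width)
            centers = edges[:-1] + width / 2.0
            idx = np.clip(np.digitize(wavenumber, edges) - 1, 0, len(centers) - 1)
            sums = np.bincount(idx, weights=flux, minlength=len(centers))
            return centers, sums

        # Toy spectrum: 0.01 cm^-1 grid over 0-100 cm^-1, constant flux density.
        wn = np.arange(0.0, 100.0, 0.01)
        fx = np.full(wn.size, 1e-4)
        print(bin_fluxes(wn, fx)[1])  # ten sums of ~0.1 each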

  6. Euler Technology Assessment program for preliminary aircraft design employing SPLITFLOW code with Cartesian unstructured grid method

    NASA Technical Reports Server (NTRS)

    Finley, Dennis B.

    1995-01-01

    This report documents results from the Euler Technology Assessment program. The objective was to evaluate the efficacy of Euler computational fluid dynamics (CFD) codes for use in preliminary aircraft design. Both the accuracy of the predictions and the rapidity of calculations were to be assessed. This portion of the study was conducted by Lockheed Fort Worth Company, using a recently developed in-house Cartesian-grid code called SPLITFLOW. The Cartesian grid technique offers several advantages for this study, including ease of volume grid generation and reduced number of cells compared to other grid schemes. SPLITFLOW also includes grid adaptation of the volume grid during the solution convergence to resolve high-gradient flow regions. This proved beneficial in resolving the large vortical structures in the flow for several configurations examined in the present study. The SPLITFLOW code predictions of the configuration forces and moments are shown to be adequate for preliminary design analysis, including predictions of sideslip effects and the effects of geometry variations at low and high angles of attack. The time required to generate the results from initial surface definition is on the order of several hours, including grid generation, which is compatible with the needs of the design environment.

  7. Design optimization studies using COSMIC NASTRAN

    NASA Technical Reports Server (NTRS)

    Pitrof, Stephen M.; Bharatram, G.; Venkayya, Vipperla B.

    1993-01-01

    The purpose of this study is to create, test and document a procedure to integrate mathematical optimization algorithms with COSMIC NASTRAN. This procedure is very important to structural design engineers who wish to capitalize on optimization methods to ensure that their design is optimized for its intended application. The OPTNAST computer program was created to link NASTRAN and design optimization codes into one package. This implementation was tested using two truss structure models and optimizing their designs for minimum weight, subject to multiple loading conditions and displacement and stress constraints. However, the process is generalized so that an engineer could design other types of elements by adding to or modifying some parts of the code.

  8. Alloy 740H Component Manufacturing Development

    NASA Astrophysics Data System (ADS)

    de Barbadillo, J. J.; Baker, B. A.; Gollihue, R. D.; Patel, S. J.

    Alloy 740H was developed specifically for use in A-USC power plants. This alloy has been intensively evaluated in collaborative programs throughout the world, and the key properties have been verified and documented. In 2011 the alloy was approved for use in welded construction under ASME Code Case 2702. At present, alloy 740H is the only age-hardened nickel-base alloy that is ASME code approved. The emphasis for A-USC materials development is now on verification of the metalworking industry's capability to make the full range of mill product forms and sizes and to produce fittings and fabrications required for construction of a power plant. This paper presents the results of recent developments in component manufacture and evaluation.

  9. Documents Pertaining to Resource Conservation and Recovery Act Corrective Action Event Codes

    EPA Pesticide Factsheets

    Document containing RCRA Corrective Action event codes and definitions, including national requirements, initiating sources, dates, and guidance, from the first facility assessment until the Corrective Action is terminated.

  10. QTCM Software Documentation. Volume 1. Programmer's Manual

    DTIC Science & Technology

    1990-11-01

    technologies are implemented. Since the PSNs are comprised of private companies, the OMNCS has little direct control over how they operate and the... control equipment at each node to prevent any congestion at the switches. Therefore, all call blocking is modeled to occur at the trunks rather than at the... main program is linked with separately compiled subroutines that can be grouped into seven distinct functional areas as follows: Control code, File...

  11. C++ Coding Standards for the AMP Project

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Evans, Thomas M; Clarno, Kevin T

    2009-09-01

    This document provides an initial starting point to define the C++ coding standards used by the AMP nuclear fuel performance integrated code project, as part of AMP's software development process. This document draws from the experiences, and documentation [1], of the developers of the Marmot Project at Los Alamos National Laboratory. Much of the software in AMP will be written in C++. The power of C++ can be abused easily, resulting in code that is difficult to understand and maintain. This document gives the practices that should be followed on the AMP project for all new code that is written. The intent is not to be onerous but to ensure that the code can be readily understood by the entire code team and serve as a basis for collectively defining a set of coding standards for use in future development efforts. At the end of the AMP development in fiscal year (FY) 2010, all developers will have experience with the benefits, restrictions, and limitations of the standards described and will collectively define a set of standards for future software development. External libraries that AMP uses do not have to meet these requirements, although we encourage external developers to follow these practices. For any code of which AMP takes ownership, the project will decide on any changes on a case-by-case basis. The practices that we are using in the AMP project have been in use in the Denovo project [2] for several years. The practices build on those given in References [3-5]; the practices given in these references should also be followed. Some of the practices given in this document can also be found in [6].

  12. Approaches in highly parameterized inversion - PEST++, a Parameter ESTimation code optimized for large environmental models

    USGS Publications Warehouse

    Welter, David E.; Doherty, John E.; Hunt, Randall J.; Muffels, Christopher T.; Tonkin, Matthew J.; Schreuder, Willem A.

    2012-01-01

    An object-oriented parameter estimation code was developed to incorporate benefits of object-oriented programming techniques for solving large parameter estimation modeling problems. The code is written in C++ and is a formulation and expansion of the algorithms included in PEST, a widely used parameter estimation code written in Fortran. The new code is called PEST++ and is designed to lower the barriers of entry for users and developers while providing efficient algorithms that can accommodate large, highly parameterized problems. This effort has focused on (1) implementing the most popular features of PEST in a fashion that is easy for novice or experienced modelers to use and (2) creating a software design that is easy to extend; that is, this effort provides a documented object-oriented framework designed from the ground up to be modular and extensible. In addition, all PEST++ source code and its associated libraries, as well as the general run manager source code, have been integrated in the Microsoft Visual Studio® 2010 integrated development environment. The PEST++ code is designed to provide a foundation for an open-source development environment capable of producing robust and efficient parameter estimation tools for the environmental modeling community into the future.
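
    PEST++ itself is a large C++ code base, but the class of problem it addresses, nonlinear least-squares matching of model outputs to observations, can be illustrated compactly. The Python sketch below uses scipy and an invented two-parameter forward model as a stand-in for an environmental model.

        import numpy as np
        from scipy.optimize import least_squares

        # Hypothetical forward model: two parameters, exponential decay.
        def model(params, t):
            a, k = params
            return a * np.exp(-k * t)

        t_obs = np.linspace(0.0, 10.0, 25)
        rng = np.random.default_rng(0)
        observed = model([2.0, 0.3], t_obs) + rng.normal(0.0, 0.02, t_obs.size)

        # Residual vector: the quantity a parameter-estimation code minimizes.
        def residuals(params):
            return model(params, t_obs) - observed

        fit = least_squares(residuals, x0=[1.0, 1.0])
        print("estimated parameters:", fit.x)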

  13. Does incorporation of a clinical support template in the electronic medical record improve capture of wound care data in a cohort of veterans with diabetic foot ulcers?

    PubMed

    Lowe, Jeanne R; Raugi, Gregory J; Reiber, Gayle E; Whitney, Joanne D

    2013-01-01

    The purpose of this cohort study was to evaluate the effect of a 1-year intervention of an electronic medical record wound care template on the completeness of wound care documentation and medical coding compared to a similar time interval for the fiscal year preceding the intervention. From October 1, 2006, to September 30, 2007, a "good wound care" intervention was implemented at a rural Veterans Affairs facility to prevent amputations in veterans with diabetes and foot ulcers. The study protocol included a template with foot ulcer variables embedded in the electronic medical record to facilitate data collection, support clinical decision making, and improve ordering and medical coding. Compared to the historic control group, the intervention group showed significant differences in complete documentation of good wound care (χ² = 15.99, P < .001), complete documentation of coding for diagnoses and procedures (χ² = 30.23, P < .001), and complete documentation of both good wound care and coding for diagnoses and procedures (χ² = 14.96, P < .001). An electronic wound care template improved documentation of evidence-based interventions and facilitated coding for wound complexity and procedures.
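
    The chi-square statistics reported above compare the proportion of complete documentation between the intervention and historic control groups. A generic sketch of such a test in Python follows; the counts are invented placeholders, not the study's data.

        from scipy.stats import chi2_contingency

        # Rows: intervention vs. historic control; columns: complete vs.
        # incomplete documentation. Counts are illustrative only.
        table = [[90, 30],
                 [55, 65]]
        chi2, p, dof, expected = chi2_contingency(table)
        print(f"chi-square = {chi2:.2f}, p = {p:.4g}, dof = {dof}")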

  14. User's manual for the National Water Information Systemof the U.S. Geological Survey Water-Quality System

    USGS Publications Warehouse

    Gellenbeck, Dorinda J.; Oblinger, Carolyn J.; Runkle, Donna L.; Schertz, Terry L.; Scott, Jonathon C.; Stoker, Yvonne E.; Taylor, Robert L.

    2006-01-01

    This user documentation is designed to be a reference for the Water-Quality System (QWDATA) within the National Water Information System (NWIS). For the new user, the 'Introduction' and 'Getting Started' sections are the recommended places to begin. The experienced user may want to go straight to the details provided in the program section (section 3). Code lists and some miscellaneous reference materials are provided in the Appendices. The last section, 'Tip Sheets,' is a collection of suggestions for accomplishing selected tasks, some of which are basic and some of which are advanced. Where appropriate, these Tip Sheets are referenced in the main text of the documentation.

  15. User's manual for the National Water Information System of the U.S. Geological Survey (USGS), Water-quality System (QWDATA)

    USGS Publications Warehouse

    Gellenbeck, Dorinda; Oblinger, Carolyn J.; Runkle, Donna L.; Schertz, Terry L.; Scott, Jonathon C.; Taylor, Robert L.

    2005-01-01

    This user documentation is designed to be a reference for the Water-Quality System (QWDATA) within the National Water Information System (NWIS). For the new user, the 'Introduction' and 'Getting Started' sections are the recommended places to begin. The experienced user may want to go straight to the details provided in the program section (section 3). Code lists and some miscellaneous reference materials are provided in the Appendices. The last section, 'Tip Sheets,' is a collection of suggestions for accomplishing selected tasks, some of which are basic and some of which are advanced. Where appropriate, these Tip Sheets are referenced in the main text of the documentation.

  16. Documentation of computer program VS2D to solve the equations of fluid flow in variably saturated porous media

    USGS Publications Warehouse

    Lappala, E.G.; Healy, R.W.; Weeks, E.P.

    1987-01-01

    This report documents FORTRAN computer code for solving problems involving variably saturated single-phase flow in porous media. The flow equation is written with total hydraulic potential as the dependent variable, which allows straightforward treatment of both saturated and unsaturated conditions. The spatial derivatives in the flow equation are approximated by central differences, and time derivatives are approximated either by a fully implicit backward or by a centered-difference scheme. Nonlinear conductance and storage terms may be linearized using either an explicit method or an implicit Newton-Raphson method. Relative hydraulic conductivity is evaluated at cell boundaries by using either full upstream weighting, the arithmetic mean, or the geometric mean of values from adjacent cells. Nonlinear boundary conditions treated by the code include infiltration, evaporation, and seepage faces. Extraction by plant roots that is caused by atmospheric demand is included as a nonlinear sink term. These nonlinear boundary and sink terms are linearized implicitly. The code has been verified for several one-dimensional linear problems for which analytical solutions exist and against two nonlinear problems that have been simulated with other numerical models. A complete listing of data-entry requirements and data entry and results for three example problems are provided. (USGS)
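
    The discretization choices described above (central differences in space, fully implicit backward differences in time) can be illustrated on the simplest relevant case, a linear one-dimensional diffusion problem with fixed-head boundaries; the variably saturated equations add nonlinear conductance and storage terms on top of this skeleton. A minimal Python sketch, not a transcription of VS2D:

        import numpy as np
        from scipy.linalg import solve_banded

        def backward_euler_diffusion(h0, D, dx, dt, steps):
            """Fully implicit (backward-in-time, central-in-space) solve of
            dh/dt = D * d2h/dx2 with fixed-head (Dirichlet) boundaries."""
            n = len(h0)
            r = D * dt / dx**2
            # Assemble the tridiagonal system (banded storage for solve_banded).
            ab = np.zeros((3, n))
            ab[0, 2:] = -r              # superdiagonal
            ab[1, :] = 1.0 + 2.0 * r    # diagonal
            ab[2, :-2] = -r             # subdiagonal
            ab[1, 0] = ab[1, -1] = 1.0  # boundary rows: head stays fixed
            ab[0, 1] = ab[2, -2] = 0.0
            h = h0.copy()
            for _ in range(steps):
                h = solve_banded((1, 1), ab, h)
            return h

        h0 = np.zeros(51)
        h0[0] = 1.0  # unit head at the left boundary
        print(backward_euler_diffusion(h0, D=1e-3, dx=0.02, dt=10.0, steps=100)[:5])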

  17. DOE Office of Scientific and Technical Information (OSTI.GOV)

    NONE

    This document contains the State Building Energy Codes Status prepared by Pacific Northwest National Laboratory for the U.S. Department of Energy under Contract DE-AC06-76RL01830 and dated September 1996. The U.S. Department of Energy's Office of Codes and Standards has developed this document to provide an information resource for individuals interested in energy efficiency of buildings and the relevant building energy codes in each state and U.S. territory. This is considered to be an evolving document and will be updated twice a year. In addition, special state updates will be issued as warranted.

  18. General Electromagnetic Model for the Analysis of Complex Systems (GEMACS) Computer Code Documentation (Version 3). Volume 3, Part 4.

    DTIC Science & Technology

    1983-09-01

    General Electromagnetic Model for the Analysis of Complex Systems (GEMACS) Computer Code Documentation (Version 3), prepared by the BDM Corporation; final technical report, February 1981 - July 1983. METHOD: currents are resolved in the t1 and t2 directions on the source patch, and the electric field at a segment observation point is computed from the contribution of each source patch j.

  19. PlasmaPy: beginning a community developed Python package for plasma physics

    NASA Astrophysics Data System (ADS)

    Murphy, Nicholas A.; Huang, Yi-Min; PlasmaPy Collaboration

    2016-10-01

    In recent years, researchers in several disciplines have collaborated on community-developed open source Python packages such as Astropy, SunPy, and SpacePy. These packages provide core functionality, common frameworks for data analysis and visualization, and educational tools. We propose that our community begin the development of PlasmaPy: a new open source core Python package for plasma physics. PlasmaPy could include commonly used functions in plasma physics, easy-to-use plasma simulation codes, Grad-Shafranov solvers, eigenmode solvers, and tools to analyze both simulations and experiments. The development will include modern programming practices such as version control, embedding documentation in the code, unit tests, and avoiding premature optimization. We will describe early code development on PlasmaPy, and discuss plans moving forward. The success of PlasmaPy depends on active community involvement and a welcoming and inclusive environment, so anyone interested in joining this collaboration should contact the authors.
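
    As an example of the "commonly used functions in plasma physics" such a package might collect, here is a sketch of an electron Debye length calculation in plain Python with scipy.constants. It illustrates the physics only; PlasmaPy's eventual interface may differ.

        import math
        from scipy.constants import epsilon_0, k as k_B, e

        def debye_length(T_e, n_e):
            """Electron Debye length in meters.

            T_e : electron temperature in kelvin
            n_e : electron number density in m^-3
            lambda_D = sqrt(epsilon_0 * k_B * T_e / (n_e * e**2))
            """
            return math.sqrt(epsilon_0 * k_B * T_e / (n_e * e * e))

        # A tokamak-like plasma: ~1 keV (about 1.2e7 K) at 1e19 m^-3.
        print(debye_length(1.2e7, 1e19))  # on the order of 1e-5 m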

  20. Statistical evaluation of PACSTAT random number generation capabilities

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Piepel, G.F.; Toland, M.R.; Harty, H.

    1988-05-01

    This report summarizes the work performed in verifying the general-purpose Monte Carlo driver program PACSTAT. The main objective of the work was to verify the performance of PACSTAT's random number generation capabilities. Secondary objectives were to document (using controlled configuration management procedures) changes made in PACSTAT at Pacific Northwest Laboratory, and to assure that PACSTAT input and output files satisfy quality assurance traceability constraints. Upon receipt of the PRIME version of the PACSTAT code from the Basalt Waste Isolation Project, Pacific Northwest Laboratory staff converted the code to run on Digital Equipment Corporation (DEC) VAXs. The modifications to PACSTAT were implemented using the WITNESS configuration management system, with the modifications themselves intended to make the code as portable as possible. Certain modifications were made to make the PACSTAT input and output files conform to quality assurance traceability constraints. 10 refs., 17 figs., 6 tabs.
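
    Verifying random number generation capabilities typically means feeding the generator's output through standard statistical tests. The report's actual test battery is not reproduced here; the Python sketch below shows one representative check, a chi-square goodness-of-fit test of uniformity.

        import numpy as np
        from scipy.stats import chisquare

        def uniformity_test(samples, bins=20):
            """Chi-square goodness-of-fit test: are samples uniform on [0, 1)?"""
            counts, _ = np.histogram(samples, bins=bins, range=(0.0, 1.0))
            expected = np.full(bins, len(samples) / bins)
            return chisquare(counts, f_exp=expected)

        rng = np.random.default_rng(1)
        stat, p = uniformity_test(rng.random(100_000))
        print(f"chi-square = {stat:.2f}, p = {p:.3f}")  # large p: no evidence of bias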

  1. RELAP-7 Software Verification and Validation Plan

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Smith, Curtis L.; Choi, Yong-Joon; Zou, Ling

    This INL plan comprehensively describes the software for RELAP-7 and documents the software, interface, and software design requirements for the application. The plan also describes the testing-based software verification and validation (SV&V) process—a set of specially designed software models used to test RELAP-7. The RELAP-7 (Reactor Excursion and Leak Analysis Program) code is a nuclear reactor system safety analysis code being developed at Idaho National Laboratory (INL). The code is based on the INL's modern scientific software development framework – MOOSE (Multi-Physics Object-Oriented Simulation Environment). The overall design goal of RELAP-7 is to take advantage of the previous thirty years of advancements in computer architecture, software design, numerical integration methods, and physical models. The end result will be a reactor systems analysis capability that retains and improves upon RELAP5's capability and extends the analysis capability for all reactor system simulation scenarios.

  2. Use of the Physician Orders for Life-Sustaining Treatment program for patients being discharged from the hospital to the nursing facility.

    PubMed

    Hickman, Susan E; Nelson, Christine A; Smith-Howell, Esther; Hammes, Bernard J

    2014-01-01

    The Physician Orders for Life-Sustaining Treatment (POLST) documents patient preferences as medical orders that transfer across settings with patients. The objectives were to pilot test methods and gather preliminary data about POLST including (1) use at time of hospital discharge, (2) transfers across settings, and (3) consistency with prior decisions. Descriptive with chart abstraction and interviews. Participants were hospitalized patients discharged to a nursing facility and/or their surrogates in La Crosse County, Wisconsin. POLST forms were abstracted from hospital records for 151 patients. Hospital and nursing facility chart data were abstracted and interviews were conducted with an additional 39 patients/surrogates. Overall, 176 patients had valid POLST forms at the time of discharge from the hospital, and many (38.6%; 68/176) only documented code status. When the whole POLST was completed, orders were more often marked as based on a discussion with the patient and/or surrogate than when the form was used just for code status (95.1% versus 13.8%, p<.001). In the follow-up and interview sample, a majority (90.6%; 29/32) of POLST forms written in the hospital were unchanged up to three weeks after nursing facility admission. Most (71.9%; 23/32) appeared consistent with patient or surrogate recall of prior treatment decisions. POLST forms generated in the hospital do transfer with patients across settings, but are often used only to document code status. POLST orders appeared largely consistent with prior treatment decisions. Further research is needed to assess the quality of POLST decisions.

  3. Regional Atmospheric Transport Code for Hanford Emission Tracking (RATCHET). Hanford Environmental Dose Reconstruction Project

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ramsdell, J.V. Jr.; Simonen, C.A.; Burk, K.W.

    1994-02-01

    The purpose of the Hanford Environmental Dose Reconstruction (HEDR) Project is to estimate radiation doses that individuals may have received from operations at the Hanford Site since 1944. This report deals specifically with the atmospheric transport model, Regional Atmospheric Transport Code for Hanford Emission Tracking (RATCHET). RATCHET is a major rework of the MESOILT2 model used in the first phase of the HEDR Project; only the bookkeeping framework escaped major changes. Changes to the code include (1) significant changes in the representation of atmospheric processes and (2) incorporation of Monte Carlo methods for representing uncertainty in input data, model parameters, and coefficients. To a large extent, the revisions to the model are based on recommendations of a peer working group that met in March 1991. Technical bases for other portions of the atmospheric transport model are addressed in two other documents. This report has three major sections: a description of the model, a user's guide, and a programmer's guide. These sections discuss RATCHET from three different perspectives. The first provides a technical description of the code with emphasis on details such as the representation of the model domain, the data required by the model, and the equations used to make the model calculations. The technical description is followed by a user's guide to the model with emphasis on running the code. The user's guide contains information about the model input and output. The third section is a programmer's guide to the code. It discusses the hardware and software required to run the code. The programmer's guide also discusses program structure and each of the program elements.

  4. The development and validation of The Inquiry Science Observation Coding Sheet.

    PubMed

    Brandon, P R; Taum, A K H; Young, D B; Pottenger, F M

    2008-08-01

    Evaluation reports increasingly document the degree of program implementation, particularly the extent to which programs adhere to prescribed steps and procedures. Many reports are cursory, however, and few, if any, fully portray the long and winding path taken when developing evaluation instruments, particularly observation instruments. In this article, we describe the development of an observational method for evaluating the degree to which K-12 inquiry science programs are implemented, including the many steps and decisions that occurred during the development, and present evidence for the reliability and validity of the data that we collected with the instrument. The article introduces a method for measuring the adherence of inquiry science implementation and gives evaluators a full picture of what they might expect when developing observation instruments for assessing the degree of program implementation.

  5. Analytical solutions for one-, two-, and three-dimensional solute transport in ground-water systems with uniform flow

    USGS Publications Warehouse

    Wexler, Eliezer J.

    1992-01-01

    Analytical solutions to the advective-dispersive solute-transport equation are useful in predicting the fate of solutes in ground water. Analytical solutions compiled from available literature or derived by the author are presented for a variety of boundary condition types and solute-source configurations in one-, two-, and three-dimensional systems having uniform ground-water flow. A set of user-oriented computer programs was created to evaluate these solutions and to display the results in tabular and computer-graphics format. These programs incorporate many features that enhance their accuracy, ease of use, and versatility. Documentation for the programs describes their operation and required input data, and presents the results of sample problems. Derivations of selected solutions, source codes for the computer programs, and samples of program input and output also are included.
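
    One of the classic closed-form results in this family is the Ogata-Banks solution for one-dimensional advective-dispersive transport from a constant-concentration boundary. The Python sketch below evaluates that textbook formula; it is offered as an illustration of the kind of solution compiled in the report, not a transcription of its programs.

        import numpy as np
        from scipy.special import erfc

        def ogata_banks(x, t, v, D, c0=1.0):
            """Concentration for 1-D advection-dispersion with a
            constant-concentration inlet at x = 0 (Ogata & Banks, 1961):
              C/C0 = 1/2 * [erfc((x - v t)/(2 sqrt(D t)))
                            + exp(v x / D) * erfc((x + v t)/(2 sqrt(D t)))]
            """
            s = 2.0 * np.sqrt(D * t)
            return 0.5 * c0 * (erfc((x - v * t) / s)
                               + np.exp(v * x / D) * erfc((x + v * t) / s))

        x = np.linspace(0.0, 10.0, 6)   # distance along the flow path
        print(ogata_banks(x, t=50.0, v=0.1, D=0.05))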

  6. MODFLOW-2000, the U.S. Geological Survey Modular Ground-Water Model--Documentation of the SEAWAT-2000 Version with the Variable-Density Flow Process (VDF) and the Integrated MT3DMS Transport Process (IMT)

    USGS Publications Warehouse

    Langevin, Christian D.; Shoemaker, W. Barclay; Guo, Weixing

    2003-01-01

    SEAWAT-2000 is the latest release of the SEAWAT computer program for simulation of three-dimensional, variable-density, transient ground-water flow in porous media. SEAWAT-2000 was designed by combining a modified version of MODFLOW-2000 and MT3DMS into a single computer program. The code was developed using the MODFLOW-2000 concept of a process, which is defined as "part of the code that solves a fundamental equation by a specified numerical method." SEAWAT-2000 contains all of the processes distributed with MODFLOW-2000 and also includes the Variable-Density Flow Process (as an alternative to the constant-density Ground-Water Flow Process) and the Integrated MT3DMS Transport Process. Processes may be active or inactive, depending on simulation objectives; however, not all processes are compatible. For example, the Sensitivity and Parameter Estimation Processes are not compatible with the Variable-Density Flow and Integrated MT3DMS Transport Processes. The SEAWAT-2000 computer code was tested with the common variable-density benchmark problems and also with problems representing evaporation from a salt lake and rotation of immiscible fluids.

  7. 3DGRAPE/AL User's Manual

    NASA Technical Reports Server (NTRS)

    Sorenson, Reese L.; Alter, Stephen J.

    1995-01-01

    This document is a users' manual for a new three-dimensional structured multiple-block volume grid generator called 3DGRAPE/AL. It is a significantly improved version of the previously released and widely distributed programs 3DGRAPE and 3DMAGGS. It generates volume grids by iteratively solving the Poisson equations in three dimensions. The right-hand-side terms are designed so that user-specified grid cell heights and user-specified grid cell skewness near boundary surfaces result automatically, with little user intervention. The code is written in Fortran-77, and can be installed with or without a simple graphical user interface which allows the user to watch as the grid is generated. An introduction describing the improvements over the antecedent 3DGRAPE code is presented first. Then follows a chapter on the basic grid generator program itself, and comments on installing it. The input is then described in detail. After that is a description of the graphical user interface. Five example cases are shown next, with plots of the results. Following that is a chapter on two input filters which allow use of input data generated elsewhere. Last is a treatment of the theory embodied in the code.

  8. Documentation of probabilistic fracture mechanics codes used for reactor pressure vessels subjected to pressurized thermal shock loading: Parts 1 and 2. Final report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Balkey, K.; Witt, F.J.; Bishop, B.A.

    1995-06-01

    Significant attention has been focused on the issue of reactor vessel pressurized thermal shock (PTS) for many years. Pressurized thermal shock transient events are characterized by a rapid cooldown at potentially high pressure levels that could lead to a reactor vessel integrity concern for some pressurized water reactors. As a result of regulatory and industry efforts in the early 1980s, a probabilistic risk assessment methodology has been established to address this concern. Probabilistic fracture mechanics analyses are performed as part of this methodology to determine conditional probability of significant flaw extension for given pressurized thermal shock events. While recent industry efforts are underway to benchmark probabilistic fracture mechanics computer codes that are currently used by the nuclear industry, Part I of this report describes the comparison of two independent computer codes used at the time of the development of the original U.S. Nuclear Regulatory Commission (NRC) pressurized thermal shock rule. The work that was originally performed in 1982 and 1983 to compare the U.S. NRC VISA and Westinghouse (W) PFM computer codes has been documented and is provided in Part I of this report. Part II of this report describes the results of more recent industry efforts to benchmark PFM computer codes used by the nuclear industry. This study was conducted as part of the USNRC-EPRI Coordinated Research Program for reviewing the technical basis for pressurized thermal shock (PTS) analyses of the reactor pressure vessel. The work focused on the probabilistic fracture mechanics (PFM) analysis codes and methods used to perform the PTS calculations. An in-depth review of the methodologies was performed to verify the accuracy and adequacy of the various different codes. The review was structured around a series of benchmark sample problems to provide a specific context for discussion and examination of the fracture mechanics methodology.

  9. Program Descriptions for Interactive Signal and Pattern Analysis and Recognition System (ISPARS).

    DTIC Science & Technology

    1984-03-01

    procedures for the ISPARS components developed at the David Taylor Naval Ship Research and Development Center (DTNSRDC), which are not documented in other... an alphabetic character. Some commands may consist of a letter and one or two two-digit numbers, separated by a space as specified in the table.

  10. Studies of Millimeter-Wave Diffraction Devices and Materials

    DTIC Science & Technology

    1984-12-28

    References include: Andrenko, S. D., Devyatkov, N. D., and Shestopalov, V. P., "Millimeter Field Band Antenna Arrays," Dokl. Akad. Nauk SSSR.

  11. VESUVIO Data Analysis Goes MANTID

    NASA Astrophysics Data System (ADS)

    Jackson, S.; Krzystyniak, M.; Seel, A. G.; Gigg, M.; Richards, S. E.; Fernandez-Alonso, F.

    2014-12-01

    This paper describes ongoing efforts to implement the reduction and analysis of neutron Compton scattering data within the MANTID framework. Recently, extensive work has been carried out to integrate the bespoke data reduction and analysis routines written for VESUVIO with the MANTID framework. While the programs described in this document are designed to replicate the functionality of the Fortran and Genie routines already in use, most of them have been written from scratch and are not based on the original code base.

  12. Rapid Prototyping of Application Specific Signal Processors (RASSP)

    DTIC Science & Technology

    1993-12-23

    The tool survey covers Cadre Teamwork, CodeCenter (CenterLine), dbx/dbxtool (UNIX C debuggers), Falcon (Mentor ECAD framework), FrameMaker (Frame Tech word processing), gcc (GNU C/C++ compiler), and gprof (GNU software profiling tool). An organization can put its own documentation on-line using the BOLD Composer for FrameMaker. The AMPLE programming language is a C-like language used for...

  13. Overview of MSFC's Applied Fluid Dynamics Analysis Group Activities

    NASA Technical Reports Server (NTRS)

    Garcia, Roberto; Wang, Tee-See; Griffin, Lisa; Turner, James E. (Technical Monitor)

    2001-01-01

    This document is a presentation graphic which reviews the activities of the Applied Fluid Dynamics Analysis Group at Marshall Space Flight Center (Code TD64). The group's work focuses on supporting the space transportation programs through computational fluid dynamics (CFD) tool development, which is driven by hardware design needs. The major applications for the design and analysis tools are turbines, pumps, propulsion-to-airframe integration, and combustion devices.

  14. Identifying and Mitigating Risks in Security Sector Assistance for Africa’s Fragile States

    DTIC Science & Technology

    2015-01-01

    References include The Logframe Handbook: A Logical Framework Approach to Project Cycle Management, Washington, D.C., 2005. The Project Unique Identification Code (PUIC) for the project that produced this document is HQD126409.

  15. Integrated Support for Manipulation and Display of 3D Objects for the Command and Control Workstation of the Future

    DTIC Science & Technology

    1989-06-01

  16. Software engineering project management - A state-of-the-art report

    NASA Technical Reports Server (NTRS)

    Thayer, R. H.; Lehman, J. H.

    1977-01-01

    The management of software engineering projects in the aerospace industry was investigated. The survey assessed such features as contract type, specification preparation techniques, software documentation required by customers, planning and cost-estimating, quality control, the use of advanced program practices, software tools and test procedures, the education levels of project managers, programmers and analysts, work assignment, automatic software monitoring capabilities, design and coding reviews, production times, success rates, and organizational structure of the projects.

  17. The process of implementing a rural VA wound care program for diabetic foot ulcer patients.

    PubMed

    Reiber, Gayle E; Raugi, Gregory J; Rowberg, Donald

    2007-10-01

    Delivering and documenting evidence-based treatment to all Department of Veterans Affairs (VA) foot ulcer patients has wide appeal. However, primary and secondary care medical centers, where 52% of these patients receive care, are at a disadvantage given the frequent absence of trained specialists to manage diabetic foot ulcers. A retrospective review of diabetic foot ulcer patient records and a provider survey were conducted to document the foot ulcer problem and to assess practitioner needs. Results showed that of the 125 persons with foot ulcers identified through administrative data, only 21% were correctly coded. Chronic Care and Microsystem models were used to prepare a tailored intervention in a VA primary care medical center. The site Principal Investigators, a multidisciplinary site wound care team, and study investigators jointly implemented a diabetic foot ulcer program. Intervention components include wound care team education and training, standardized good wound care practices based on strong scientific evidence, and a wound care template embedded in the electronic medical record to facilitate data collection, clinical decision making, patient ordering, and coding. A strategy for delivering offloading pressure devices, regular case management support, and 24/7 emergency assistance also was developed. It took 9 months to implement the model. Patients were enrolled and followed for 1 year. Process and outcome evaluations are ongoing.

  18. Miscoding and other user errors: importance of ongoing education for proper blood glucose monitoring procedures.

    PubMed

    Schrock, Linda E

    2008-07-01

    This article reviews the literature to date and reports on a new study that documented the frequency of manual code-requiring blood glucose (BG) meters that were miscoded at the time of the patient's initial appointment in a hospital-based outpatient diabetes education program. Between January 1 and May 31, 2007, the type of BG meter and the accuracy of the patient's meter code (if required) and procedure for checking BG were checked during the initial appointment with the outpatient diabetes educator. If indicated, reeducation regarding the procedure for the BG meter code entry and/or BG test was provided. Of the 65 patients who brought their meter requiring manual entry of a code number or code chip to the initial appointment, 16 (25%) were miscoded at the time of the appointment. Two additional problems, one of dead batteries and one of improperly stored test strips, were identified and corrected at the first appointment. These findings underscore the importance of checking the patient's BG meter code (if required) and procedure for testing BG at each encounter with a health care professional or providing the patient with a meter that does not require manual entry of a code number or chip to match the container of test strips (i.e., an autocode meter).

  19. Crowdsourcing the Measurement of Interstate Conflict

    PubMed Central

    2016-01-01

    Much of the data used to measure conflict is extracted from news reports. This is typically accomplished using either expert coders to quantify the relevant information or machine coders to automatically extract data from documents. Although expert coding is costly, it produces quality data. Machine coding is fast and inexpensive, but the data are noisy. To diminish the severity of this tradeoff, we introduce a method for analyzing news documents that uses crowdsourcing, supplemented with computational approaches. The new method is tested on documents about Militarized Interstate Disputes, and its accuracy ranges between about 68 and 76 percent. This is shown to be a considerable improvement over automated coding, and to cost less and be much faster than expert coding. PMID:27310427
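
    A core step in any crowdsourced coding pipeline is aggregating several noisy judgments per document into one label. The paper supplements crowd answers with computational post-processing; the sketch below shows only the simplest aggregation rule, majority voting, with invented judgments.

        from collections import Counter

        def majority_vote(labels):
            """Return the modal label among crowd judgments for one document,
            plus the fraction of coders who agreed with it."""
            counts = Counter(labels)
            label, votes = counts.most_common(1)[0]
            return label, votes / len(labels)

        # Five hypothetical coders judging whether a news item describes a
        # militarized interstate dispute.
        judgments = ["dispute", "dispute", "no dispute", "dispute", "no dispute"]
        print(majority_vote(judgments))  # ('dispute', 0.6)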

  20. RELAP-7 Theory Manual

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Berry, Ray Alden; Zou, Ling; Zhao, Haihua

    This document summarizes the physical models and mathematical formulations used in the RELAP-7 code. The MOOSE-based RELAP-7 code development is an ongoing effort, and the MOOSE framework enables rapid development of the code. The developmental efforts and results demonstrate that the RELAP-7 project is on a path to success. This theory manual documents the main features implemented into the RELAP-7 code. Because the code is an ongoing development effort, this RELAP-7 Theory Manual will evolve with periodic updates to keep it current with the state of the development, implementation, and model additions/revisions.

  1. A singularity free analytical solution of artificial satellite motion with drag

    NASA Technical Reports Server (NTRS)

    Scheifele, G.; Mueller, A. C.; Starke, S. E.

    1977-01-01

    The connection between the existing Delaunay-Similar and Poincare-Similar satellite theories in the true anomaly version is outlined for the J(2) perturbation and the new drag approach. An overall description of the concept of the approach is given, while the necessary expansions and the procedure to arrive at the computer program for the canonical forces are delineated. The procedure for the analytical integration of these developed equations is described. In addition, some numerical results are given. The computer program for the algebraic multiplication of the Fourier series, which creates the FORTRAN coding in an automatic manner, is described and documented.
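
    In complex-exponential form, the algebraic multiplication of two truncated Fourier series reduces to a convolution of their coefficient arrays. The Python sketch below illustrates that piece of the mathematics; the original program emitted FORTRAN and is not reproduced here.

        import numpy as np

        def multiply_fourier(a, b):
            """Multiply two truncated Fourier series.

            a, b : complex coefficient arrays for exp(i*k*theta),
                   harmonics k = -N..N stored in order [-N, ..., 0, ..., N].
            The product series spans k = -(Na+Nb)..(Na+Nb), and its
            coefficients are the discrete convolution of a and b.
            """
            return np.convolve(a, b)

        # (1 + cos(theta)) * cos(theta) = 1/2 + cos(theta) + 1/2 cos(2 theta)
        one_plus_cos = np.array([0.5, 1.0, 0.5])    # k = -1, 0, 1
        cos = np.array([0.5, 0.0, 0.5])             # k = -1, 0, 1
        print(multiply_fourier(one_plus_cos, cos))  # k = -2..2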

  2. Thermo-msf-parser: an open source Java library to parse and visualize Thermo Proteome Discoverer msf files.

    PubMed

    Colaert, Niklaas; Barsnes, Harald; Vaudel, Marc; Helsens, Kenny; Timmerman, Evy; Sickmann, Albert; Gevaert, Kris; Martens, Lennart

    2011-08-05

    The Thermo Proteome Discoverer program integrates both peptide identification and quantification into a single workflow for peptide-centric proteomics. Furthermore, its close integration with Thermo mass spectrometers has made it increasingly popular in the field. Here, we present a Java library to parse the msf files that constitute the output of Proteome Discoverer. The parser is also implemented as a graphical user interface allowing convenient access to the information found in the msf files, and in Rover, a program to analyze and validate quantitative proteomics information. All code, binaries, and documentation are freely available at http://thermo-msf-parser.googlecode.com.
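
    Proteome Discoverer msf files are SQLite databases underneath, which is what makes a third-party parser feasible. A first-pass inspection can therefore be sketched with Python's standard sqlite3 module; the table names inside an msf file vary by Proteome Discoverer version, so any query beyond the listing below should be treated as version-specific.

        import sqlite3

        def list_msf_tables(path):
            """Open a Thermo .msf file as SQLite and list its tables."""
            con = sqlite3.connect(path)
            try:
                rows = con.execute(
                    "SELECT name FROM sqlite_master WHERE type='table' ORDER BY name"
                ).fetchall()
                return [name for (name,) in rows]
            finally:
                con.close()

        # print(list_msf_tables("example.msf"))  # path is a placeholder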

  3. 76 FR 77593 - Proposed Exemptions From Certain Prohibited Transaction Restrictions

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-12-13

    ...This document contains notices of pendency before the Department of Labor (the Department) of proposed exemptions from certain of the prohibited transaction restrictions of the Employee Retirement Income Security Act of 1974 (ERISA or the Act) and/or the Internal Revenue Code of 1986 (the Code). This notice includes the following proposed exemptions: D-11517, JPMorgan Chase & Co. and its Current and Future Affiliates and Subsidiaries (JPMorgan Chase); D-11579, Delaware Charter Guarantee & Trust Co. d/b/a Principle Trust Company (Principle Trust); D-11628, Aztec Well Servicing Company and Related Companies Medical Plan Trust Fund (the Plan); D-11669, Genzyme Corporation 401(k) Plan (the Plan or the Applicant); and Retirement Program for Employees of EnPro Industries (the Plan), D-11662 et al.

  4. 3 CFR - Delegation of Reporting Functions Specified in Section 491 of Title 10, United States Code

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 3 The President 1 2014-01-01 2014-01-01 false Delegation of Reporting Functions Specified in Section 491 of Title 10, United States Code Presidential Documents Other Presidential Documents Memorandum of June 19, 2013 Delegation of Reporting Functions Specified in Section 491 of Title 10, United States Code Memorandum for the Secretary of Defense B...

  5. 3 CFR - Delegation of Functions and Authority Under Sections 315 and 325 of Title 32, United States Code

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 3 The President 1 2012-01-01 2012-01-01 false Delegation of Functions and Authority Under Sections 315 and 325 of Title 32, United States Code Presidential Documents Other Presidential Documents Memorandum of April 14, 2011 Delegation of Functions and Authority Under Sections 315 and 325 of Title 32, United States Code Memorandum for the...

  6. Guide to Using Onionskin Analysis Code (U)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fugate, Michael Lynn; Morzinski, Jerome Arthur

    2016-09-15

    This document is a guide to using R-code written for the purpose of analyzing onionskin experiments. We expect the user to be very familiar with statistical methods and the R programming language. For more details about onionskin experiments and the statistical methods mentioned in this document see Storlie, Fugate, et al. (2013). Engineers at LANL experiment with detonators and high explosives to assess performance. The experimental unit, called an onionskin, is a hemisphere consisting of a detonator and a booster pellet surrounded by explosive material. When the detonator explodes, a streak camera mounted above the pole of the hemisphere records when the shock wave arrives at the surface. The output from the camera is a two-dimensional image that is transformed into a curve that shows the arrival time as a function of polar angle. The statistical challenge is to characterize a baseline population of arrival time curves and to compare the baseline curves to curves from a new, so-called test series. The hope is that the new test series of curves is statistically similar to the baseline population.
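
    As an illustration of this comparison task (a sketch only; the LANL R-code and the methods of Storlie, Fugate, et al. (2013) are not reproduced here, and all numbers below are synthetic), one simple approach builds pointwise bands from the baseline curves and counts how often a test curve leaves them:

        # Sketch: pointwise 3-sigma bands from synthetic baseline curves.
        import numpy as np

        rng = np.random.default_rng(0)
        angle = np.linspace(0.0, 90.0, 91)      # polar angle, degrees
        shape = 10.0 + 0.01 * angle**1.5        # invented mean arrival-time curve
        baseline = shape + rng.normal(0.0, 0.05, size=(30, angle.size))

        mean = baseline.mean(axis=0)
        sd = baseline.std(axis=0, ddof=1)
        lower, upper = mean - 3.0 * sd, mean + 3.0 * sd

        test_curve = shape + rng.normal(0.0, 0.05, size=angle.size)
        outside = np.sum((test_curve < lower) | (test_curve > upper))
        print(f"{outside} of {angle.size} angles fall outside the baseline band")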

  7. Sierra/Aria 4.48 Verification Manual.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sierra Thermal Fluid Development Team

    Presented in this document is a portion of the tests that exist in the Sierra Thermal/Fluids verification test suite. Each of these tests is run nightly with the Sierra/TF code suite, and the results are checked under mesh refinement against the correct analytic result. For each of the tests presented in this document, the test setup, derivation of the analytic solution, and comparison of the code results to the analytic solution are provided. This document can be used to confirm that a given code capability is verified, or it can be referenced as a compilation of example problems.

  8. MOCAT: A Metagenomics Assembly and Gene Prediction Toolkit

    PubMed Central

    Li, Junhua; Chen, Weineng; Chen, Hua; Mende, Daniel R.; Arumugam, Manimozhiyan; Pan, Qi; Liu, Binghang; Qin, Junjie; Wang, Jun; Bork, Peer

    2012-01-01

    MOCAT is a highly configurable, modular pipeline for fast, standardized processing of single or paired-end sequencing data generated by the Illumina platform. The pipeline uses state-of-the-art programs to quality control, map, and assemble reads from metagenomic samples sequenced at a depth of several billion base pairs, and predict protein-coding genes on assembled metagenomes. Mapping against reference databases allows for read extraction or removal, as well as abundance calculations. Relevant statistics for each processing step can be summarized into multi-sheet Excel documents and queryable SQL databases. MOCAT runs on UNIX machines and integrates seamlessly with the SGE and PBS queuing systems, commonly used to process large datasets. The open source code and modular architecture allow users to modify or exchange the programs that are utilized in the various processing steps. Individual processing steps and parameters were benchmarked and tested on artificial, real, and simulated metagenomes resulting in an improvement of selected quality metrics. MOCAT can be freely downloaded at http://www.bork.embl.de/mocat/. PMID:23082188

  9. Mars Global Reference Atmospheric Model 2001 Version (Mars-GRAM 2001): Users Guide

    NASA Technical Reports Server (NTRS)

    Justus, C. G.; Johnson, D. L.

    2001-01-01

    This document presents Mars Global Reference Atmospheric Model 2001 Version (Mars-GRAM 2001) and its new features. As with the previous version (Mars-GRAM 2000), all parameterizations for temperature, pressure, density, and winds versus height, latitude, longitude, time of day, and season (Ls) use input data tables from NASA Ames Mars General Circulation Model (MGCM) for the surface through 80-km altitude and the University of Arizona Mars Thermospheric General Circulation Model (MTGCM) for 80 to 170 km. Mars-GRAM 2001 is based on topography from the Mars Orbiter Laser Altimeter (MOLA) and includes new MGCM data at the topographic surface. A new auxiliary program allows Mars-GRAM output to be used to compute shortwave (solar) and longwave (thermal) radiation at the surface and top of atmosphere. This memorandum includes instructions for obtaining Mars-GRAM source code and data files and for running the program. It also provides sample input and output and an example for incorporating Mars-GRAM as an atmospheric subroutine in a trajectory code.
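
    The "atmospheric subroutine" pattern the memorandum refers to can be sketched as a function that a trajectory integrator queries once per step. The code below is only an invented placeholder (an exponential atmosphere with rough Mars-like constants), not Mars-GRAM's parameterization or interface:

        # Hypothetical atmosphere-subroutine interface; all constants invented.
        import math

        def atmosphere(height_km, lat_deg, lon_deg, ls_deg):
            """Return (density kg/m^3, temperature K) at the query point."""
            rho0, scale_height_km = 0.020, 11.1         # rough Mars-like values
            rho = rho0 * math.exp(-height_km / scale_height_km)
            temp = max(210.0 - 1.0 * height_km, 130.0)  # crude lapse placeholder
            return rho, temp

        # A trajectory code would use the density in its drag acceleration:
        rho, temp = atmosphere(40.0, 0.0, 0.0, 90.0)
        v, cd, area, mass = 3500.0, 1.5, 10.0, 1000.0   # all assumed
        drag_accel = 0.5 * rho * v**2 * cd * area / mass
        print(rho, temp, drag_accel)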

  10. Software analysis handbook: Software complexity analysis and software reliability estimation and prediction

    NASA Technical Reports Server (NTRS)

    Lee, Alice T.; Gunn, Todd; Pham, Tuan; Ricaldi, Ron

    1994-01-01

    This handbook documents the three software analysis processes the Space Station Software Analysis team uses to assess space station software, including their backgrounds, theories, tools, and analysis procedures. Potential applications of these analysis results are also presented. The first section describes how software complexity analysis provides quantitative information on code, such as code structure and risk areas, throughout the software life cycle. Software complexity analysis allows an analyst to understand the software structure, identify critical software components, assess risk areas within a software system, identify testing deficiencies, and recommend program improvements. Performing this type of analysis during the early design phases of software development can positively affect the process, and may prevent later, much larger, difficulties. The second section describes how software reliability estimation and prediction analysis, or software reliability, provides a quantitative means to measure the probability of failure-free operation of a computer program, and describes the two tools used by JSC to determine failure rates and design tradeoffs between reliability, costs, performance, and schedule.
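
    One widely used complexity metric of this kind is McCabe's cyclomatic complexity, which counts independent paths through the control flow. As a sketch only (the handbook's own tools and metrics are not reproduced here), it can be approximated for Python code by counting decision points in the abstract syntax tree:

        # Sketch: approximate cyclomatic complexity = 1 + number of decision points.
        import ast

        def cyclomatic_complexity(source: str) -> int:
            count = 1
            for node in ast.walk(ast.parse(source)):
                if isinstance(node, (ast.If, ast.IfExp, ast.For, ast.While,
                                     ast.ExceptHandler)):
                    count += 1
                elif isinstance(node, ast.BoolOp):   # each and/or adds a branch
                    count += len(node.values) - 1
            return count

        print(cyclomatic_complexity("if x:\n    y = 1\nelse:\n    y = 2\n"))  # 2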

  11. MOCAT: a metagenomics assembly and gene prediction toolkit.

    PubMed

    Kultima, Jens Roat; Sunagawa, Shinichi; Li, Junhua; Chen, Weineng; Chen, Hua; Mende, Daniel R; Arumugam, Manimozhiyan; Pan, Qi; Liu, Binghang; Qin, Junjie; Wang, Jun; Bork, Peer

    2012-01-01

    MOCAT is a highly configurable, modular pipeline for fast, standardized processing of single or paired-end sequencing data generated by the Illumina platform. The pipeline uses state-of-the-art programs to quality control, map, and assemble reads from metagenomic samples sequenced at a depth of several billion base pairs, and predict protein-coding genes on assembled metagenomes. Mapping against reference databases allows for read extraction or removal, as well as abundance calculations. Relevant statistics for each processing step can be summarized into multi-sheet Excel documents and queryable SQL databases. MOCAT runs on UNIX machines and integrates seamlessly with the SGE and PBS queuing systems, commonly used to process large datasets. The open source code and modular architecture allow users to modify or exchange the programs that are utilized in the various processing steps. Individual processing steps and parameters were benchmarked and tested on artificial, real, and simulated metagenomes resulting in an improvement of selected quality metrics. MOCAT can be freely downloaded at http://www.bork.embl.de/mocat/.

  12. REVEAL: Software Documentation and Platform Migration

    NASA Technical Reports Server (NTRS)

    Wilson, Michael A.; Veibell, Victoir T.; Freudinger, Lawrence C.

    2008-01-01

    The Research Environment for Vehicle Embedded Analysis on Linux (REVEAL) is reconfigurable data acquisition software designed for network-distributed test and measurement applications. In development since 2001, it has been successfully demonstrated in support of a number of actual missions within NASA's Suborbital Science Program. Improvements to software configuration control were needed to properly support both an ongoing transition to operational status and continued evolution of REVEAL capabilities. For this reason the project described in this report targets REVEAL software source documentation and deployment of the software on a small set of hardware platforms different from what is currently used in the baseline system implementation. This report specifically describes the actions taken over a ten-week period by two undergraduate student interns and serves as a final report for that internship. The topics discussed include: the documentation of REVEAL source code; the migration of REVEAL to other platforms; and an end-to-end field test that successfully validates the efforts.

  13. LTSS compendium: an introduction to the CDC 7600 and the Livermore Timesharing System

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fong, K. W.

    1977-08-15

    This report is an introduction to the CDC 7600 computer and to the Livermore Timesharing System (LTSS) used by the National Magnetic Fusion Energy Computer Center (NMFECC) and the Lawrence Livermore Laboratory Computer Center (LLLCC or Octopus network) on their 7600's. This report is based on a document originally written specifically about the system as it is implemented at NMFECC but has been broadened to point out differences in implementation at LLLCC. It also contains information about LLLCC not relevant to NMFECC. This report is written for computational physicists who want to prepare large production codes to run under LTSS on the 7600's. The generalized discussion of the operating system focuses on creating and executing controllees. This document and its companion, UCID-17557, CDC 7600 LTSS Programming Stratagems, provide a basis for understanding more specialized documents about individual parts of the system.

  14. Development and Validation of a Natural Language Processing Tool to Identify Patients Treated for Pneumonia across VA Emergency Departments.

    PubMed

    Jones, B E; South, B R; Shao, Y; Lu, C C; Leng, J; Sauer, B C; Gundlapalli, A V; Samore, M H; Zeng, Q

    2018-01-01

    Identifying pneumonia using diagnosis codes alone may be insufficient for research on clinical decision making. Natural language processing (NLP) may enable the inclusion of cases missed by diagnosis codes. This article (1) develops an NLP tool that identifies the clinical assertion of pneumonia from physician emergency department (ED) notes, and (2) compares classification methods using diagnosis codes versus NLP against a gold standard of manual chart review to identify patients initially treated for pneumonia. Among a national population of ED visits occurring between 2006 and 2012 across the Veterans Affairs health system, we extracted 811 physician documents containing search terms for pneumonia for training, and 100 random documents for validation. Two reviewers annotated span- and document-level classifications of the clinical assertion of pneumonia. An NLP tool using a support vector machine was trained on the enriched documents. We extracted diagnosis codes assigned in the ED and upon hospital discharge and calculated performance characteristics for diagnosis codes, NLP, and NLP plus diagnosis codes against manual review in training and validation sets. Among the training documents, 51% contained clinical assertions of pneumonia; in the validation set, 9% were classified with pneumonia, of which 100% contained pneumonia search terms. After enriching with search terms, the NLP system alone demonstrated a recall/sensitivity of 0.72 (training) and 0.55 (validation), and a precision/positive predictive value (PPV) of 0.89 (training) and 0.71 (validation). ED-assigned diagnostic codes demonstrated lower recall/sensitivity (0.48 and 0.44) but higher precision/PPV (0.95 in training, 1.0 in validation); the NLP system identified more "possible-treated" cases than diagnostic coding. An approach combining NLP and ED-assigned diagnostic coding classification achieved the best performance (sensitivity 0.89 and PPV 0.80). System-wide application of NLP to clinical text can increase capture of initial diagnostic hypotheses, an important inclusion when studying diagnosis and clinical decision-making under uncertainty.
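
    The document-level classifier described (a support vector machine over clinical text) follows a standard pattern. The sketch below shows that pattern with invented toy notes, since the VA documents are not public; it is not the study's actual tool or feature set:

        # Sketch: TF-IDF features + linear SVM for document classification.
        from sklearn.feature_extraction.text import TfidfVectorizer
        from sklearn.svm import LinearSVC

        train_notes = [
            "consolidation on cxr, starting antibiotics for pneumonia",
            "no infiltrate seen, cough likely viral bronchitis",
            "rll pneumonia, levofloxacin initiated",
            "chf exacerbation, no evidence of pneumonia",
        ]
        train_labels = [1, 0, 1, 0]   # 1 = clinical assertion of pneumonia

        vectorizer = TfidfVectorizer(ngram_range=(1, 2))
        clf = LinearSVC().fit(vectorizer.fit_transform(train_notes), train_labels)

        test_notes = ["new infiltrate, treating empirically for pneumonia"]
        print(clf.predict(vectorizer.transform(test_notes)))  # toy prediction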

  15. Computer programs for producing single-event aircraft noise data for specific engine power and meteorological conditions for use with USAF (United States Air Force) community noise model (NOISEMAP)

    NASA Astrophysics Data System (ADS)

    Mohlman, H. T.

    1983-04-01

    The Air Force community noise prediction model (NOISEMAP) is used to describe the aircraft noise exposure around airbases and thereby aid airbase planners to minimize exposure and prevent community encroachment which could limit mission effectiveness of the installation. This report documents two computer programs (OMEGA 10 and OMEGA 11) which were developed to prepare aircraft flight and ground runup noise data for input to NOISEMAP. OMEGA 10 is for flight operations and OMEGA 11 is for aircraft ground runups. All routines in each program are documented at a level useful to a programmer working with the code or a reader interested in a general overview of what happens within a specific subroutine. Both programs input normalized, reference aircraft noise data; i.e., data at a standard reference distance from the aircraft, for several fixed engine power settings, a reference airspeed and standard day meteorological conditions. Both programs operate on these normalized, reference data in accordance with user-defined, non-reference conditions to derive single-event noise data for 22 distances (200 to 25,000 feet) in a variety of physical and psycho-acoustic metrics. These outputs are in formats ready for input to NOISEMAP.
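
    The distance adjustment underlying such tables can be sketched with the standard spherical-spreading relation plus a linear atmospheric-absorption term; OMEGA 10 and OMEGA 11 apply more detailed, metric-specific corrections, so the formula and coefficient below are illustrative only:

        # Sketch: SPL(d) = SPL_ref - 20*log10(d/d_ref) - alpha*(d - d_ref)
        import math

        def spl_at_distance(spl_ref_db, d_ref_ft, d_ft, alpha_db_per_1000ft=1.0):
            spreading = 20.0 * math.log10(d_ft / d_ref_ft)   # inverse-square law
            absorption = alpha_db_per_1000ft * (d_ft - d_ref_ft) / 1000.0
            return spl_ref_db - spreading - absorption

        # NOISEMAP's 22 distances run from 200 to 25,000 ft:
        for d in (200.0, 1000.0, 25000.0):
            print(d, round(spl_at_distance(110.0, 1000.0, d), 1))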

  16. Status of SFR Codes and Methods QA Implementation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Brunett, Acacia J.; Briggs, Laural L.; Fanning, Thomas H.

    2017-01-31

    This report details development of the SAS4A/SASSYS-1 SQA Program and describes the initial stages of Program implementation planning. The provisional Program structure, which is largely focused on the establishment of compliant SQA documentation, is outlined in detail, and Program compliance with the appropriate SQA requirements is highlighted. Additional program activities, such as improvements to testing methods and Program surveillance, are also described in this report. Given that the programmatic resources currently granted to development of the SAS4A/SASSYS-1 SQA Program framework are not sufficient to adequately address all SQA requirements (e.g., NQA-1, NUREG/BR-0167), this report also provides an overview of the gaps that remain in the SQA program, and highlights recommendations on a path forward to resolution of these issues. One key finding of this effort is the identification of the need for an SQA program sustainable over multiple years within DOE annual R&D funding constraints.

  17. How does culture affect experiential training feedback in exported Canadian health professional curricula?

    PubMed

    Wilbur, Kerry; Mousa Bacha, Rasha; Abdelaziz, Somaia

    2017-03-17

    To explore feedback processes of Western-based health professional student training curricula conducted in an Arab clinical teaching setting. This qualitative study employed document analysis of in-training evaluation reports (ITERs) used by Canadian nursing, pharmacy, respiratory therapy, paramedic, dental hygiene, and pharmacy technician programs established in Qatar. Six experiential training program coordinators were interviewed between February and May 2016 to explore how national cultural differences are perceived to affect feedback processes between students and clinical supervisors. Interviews were recorded, transcribed, and coded according to a priori cultural themes. Document analysis found all programs' ITERs outlined competency items for students to achieve. Clinical supervisors choose a response option corresponding to their judgment of student performance and may provide additional written feedback in spaces provided. Only one program required formal face-to-face feedback exchange between students and clinical supervisors. Experiential training program coordinators identified that no ITER was expressly culturally adapted, although in some instances, modifications were made for differences in scopes of practice between Canada and Qatar. Power distance was recognized by all coordinators, who also identified both student and supervisor reluctance to document potentially negative feedback in ITERs. Instances of collectivism were described as more lenient student assessment by clinical supervisors of the same cultural background. Uncertainty avoidance did not appear to impact feedback processes. Our findings suggest that differences in specific cultural dimensions between Qatar and Canada have implications for the feedback process in experiential training, which may be addressed through simple measures to accommodate communication preferences.

  18. The Numerical Electromagnetics Code (NEC) - A Brief History

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Burke, G J; Miller, E K; Poggio, A J

    The Numerical Electromagnetics Code, NEC as it is commonly known, continues to be one of the more widely used antenna modeling codes in existence. With several versions in use that reflect different levels of capability and availability, there are now 450 copies of NEC4 and 250 copies of NEC3 that have been distributed by Lawrence Livermore National Laboratory to a limited class of qualified recipients, and several hundred copies of NEC2 that had a recorded distribution by LLNL. These numbers do not account for numerous copies (perhaps 1000s) that were acquired through other means capitalizing on the open source code, the absence of distribution controls prior to NEC3, and the availability of versions on the Internet. In this paper we briefly review the history of the code that is concisely displayed in Figure 1. We will show how it capitalized on the research of prominent contributors in the early days of computational electromagnetics, how a combination of events led to the tri-service-supported code development program that ultimately led to NEC, and how it evolved to the present day product. The authors apologize that space limitations do not allow us to provide a list of references or to acknowledge the numerous contributors to the code, both of which can be found in the code documents.

  19. 78 FR 51139 - Notice of Proposed Changes to the National Handbook of Conservation Practices for the Natural...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-08-20

    ... (Code 324), Field Border (Code 386), Filter Strip (Code 393), Land Smoothing (Code 466), Livestock... the implementation requirement document to the specifications and plans. Filter Strip (Code 393)--The...

  20. RATFOR user's guide version 2.0

    NASA Technical Reports Server (NTRS)

    Helmle, L. C.

    1985-01-01

    This document is a user's guide for RATFOR at Ames Research Center. The main part of the document is a general description of RATFOR, and the appendix is devoted to a machine-specific implementation for the Cray X-MP. The general stylistic features of RATFOR are discussed, including the block structure, keywords, source code format, and the notion of tokens. There is a section on the basic control structures (IF-ELSE, ELSE IF, WHILE, FOR, DO, REPEAT-UNTIL, BREAK, NEXT), and there is a section on the statements that extend FORTRAN's capabilities (DEFINE, MACRO, INCLUDE, STRING). The appendix discusses everything needed to compile and run a basic job, the preprocessor options, the supported character sets, the generated listings, fatal errors, program limitations, and the differences from standard FORTRAN.

  1. RADTRAD: A simplified model for RADionuclide Transport and Removal And Dose estimation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Humphreys, S.L.; Miller, L.A.; Monroe, D.K.

    1998-04-01

    This report documents the RADTRAD computer code developed for the U.S. Nuclear Regulatory Commission (NRC) Office of Nuclear Reactor Regulation (NRR) to estimate transport and removal of radionuclides and dose at selected receptors. The document includes a users' guide to the code, a description of the technical basis for the code, the quality assurance and code acceptance testing documentation, and a programmers' guide. The RADTRAD code can be used to estimate the containment release using either the NRC TID-14844 or NUREG-1465 source terms and assumptions, or a user-specified table. In addition, the code can account for a reduction in the quantity of radioactive material due to containment sprays, natural deposition, filters, and other natural and engineered safety features. The RADTRAD code uses a combination of tables and/or numerical models of source term reduction phenomena to determine the time-dependent dose at user-specified locations for a given accident scenario. The code system also provides the inventory, decay chain, and dose conversion factor tables needed for the dose calculation. The RADTRAD code can be used to assess occupational radiation exposures, typically in the control room; to estimate site boundary doses; and to estimate dose attenuation due to modification of a facility or accident sequence.
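
    The removal mechanisms listed (sprays, deposition, filtration) combine, in the simplest first-order view, as additive removal rates acting on the airborne inventory. The sketch below illustrates only that textbook balance with invented coefficients; RADTRAD's actual models and tables are more detailed:

        # Sketch: airborne activity under parallel first-order removal processes,
        # A(t) = A0 * exp(-(sum of lambdas) * t). All rate constants invented.
        import math

        def airborne_activity(a0, t_hr, lambdas_per_hr):
            return a0 * math.exp(-sum(lambdas_per_hr) * t_hr)

        decay, sprays, deposition, filtration = 0.01, 2.0, 0.1, 0.5  # 1/hr, assumed
        print(airborne_activity(1.0e6, 2.0, [decay, sprays, deposition, filtration]))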

  2. ANSI/ASHRAE/IES Standard 90.1-2010 Performance Rating Method Reference Manual

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Goel, Supriya; Rosenberg, Michael I.

    This document is intended to be a reference manual for the Appendix G Performance Rating Method (PRM) of ANSI/ASHRAE/IES Standard 90.1-2010 (Standard 90.1-2010). The PRM is used for rating the energy efficiency of commercial and high-rise residential buildings with designs that exceed the requirements of Standard 90.1. The procedures and processes described in this manual are designed to provide consistency and accuracy by filling in gaps and providing additional details needed by users of the PRM. It should be noted that this document is created independently from ASHRAE and SSPC 90.1 and is neither sanctioned nor approved by either of those entities. Potential users of this manual include energy modelers, software developers, and implementers of "beyond code" energy programs. Energy modelers using ASHRAE Standard 90.1-2010 for beyond code programs can use this document as a reference manual for interpreting requirements of the Performance Rating Method. Software developers developing tools for automated creation of the baseline model can use this reference manual as a guideline for developing the rules for the baseline model.

  3. ICAM (Conceptual Design for Computer-Integrated Manufacturing. Volume 2. Part 6. Task B - Establishment of the Factory of the Future Conceptual Framework Conceptual Framework Document, (MMR)

    DTIC Science & Technology

    1984-06-29

    effort that requires hard copy documentation. As a result, there are generally numerous delays in providing current quality information. In the FoF...process have had fixed controls or were based on "hard-coded" information. A template, for example, is hard-coded information defining the shape of a...represents soft-coded control information. (Although manual handling of punch tapes still possesses some of the limitations of "hard-coded" controls

  4. Research Methods to Develop Measures of Effectiveness of the United States Coast Guard’s Vessel Inspection and Boarding Program. Executive Summary - Volume I

    DTIC Science & Technology

    1995-05-01

    Prepared for the U.S. Coast Guard Research and Development Center, 1082 Shennecossett Road, Groton, CT 06340-6096.

  5. HECWRC, Flood Flow Frequency Analysis Computer Program 723-X6-L7550

    DTIC Science & Technology

    1989-02-14

    Price includes documentation (price code: DO1, $50.00). Minimum memory required is 256 K. The software requires a floppy disk drive (360 KB or 1.2 MB); a 10 MB or larger hard disk is recommended. A math coprocessor (8087/80287/80387) is highly recommended but not required.

  6. The ADAMS interactive interpreter

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rietscha, E.R.

    1990-12-17

    The ADAMS (Advanced DAta Management System) project is exploring next generation database technology. Database management does not follow the usual programming paradigm. Instead, the database dictionary provides an additional name space environment that should be interactively created and tested before writing application code. This document describes the implementation and operation of the ADAMS Interpreter, an interactive interface to the ADAMS data dictionary and runtime system. The Interpreter executes individual statements of the ADAMS Interface Language, providing a fast, interactive mechanism to define and access persistent databases. 5 refs.

  7. Marine Corps Transition Team Program in Iraq: Mission Accomplished

    DTIC Science & Technology

    2011-03-02

  8. Automatic Certification of Kalman Filters for Reliable Code Generation

    NASA Technical Reports Server (NTRS)

    Denney, Ewen; Fischer, Bernd; Schumann, Johann; Richardson, Julian

    2005-01-01

    AUTOFILTER is a tool for automatically deriving Kalman filter code from high-level declarative specifications of state estimation problems. It can generate code with a range of algorithmic characteristics and for several target platforms. The tool has been designed with reliability of the generated code in mind and is able to automatically certify that the code it generates is free from various error classes. Since documentation is an important part of software assurance, AUTOFILTER can also automatically generate various human-readable documents, containing both design and safety related information. We discuss how these features address software assurance standards such as DO-178B.
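
    The state-estimation code such a generator emits is built around the Kalman predict/update recursion. A scalar version is sketched below purely to show that recursion; AUTOFILTER's generated code, target platforms, and certification machinery are not reproduced here:

        # Sketch: one scalar Kalman filter predict/update step.
        def kalman_step(x, p, z, q, r, f=1.0, h=1.0):
            x_pred = f * x                          # state prediction
            p_pred = f * p * f + q                  # covariance prediction
            k = p_pred * h / (h * p_pred * h + r)   # Kalman gain
            x_new = x_pred + k * (z - h * x_pred)   # correct with innovation
            p_new = (1.0 - k * h) * p_pred
            return x_new, p_new

        x, p = 0.0, 1.0
        for z in (1.1, 0.9, 1.05):                  # invented measurements
            x, p = kalman_step(x, p, z, q=1e-4, r=0.04)
        print(round(x, 3), round(p, 4))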

  9. A Comparison of Three Navier-Stokes Solvers for Exhaust Nozzle Flowfields

    NASA Technical Reports Server (NTRS)

    Georgiadis, Nicholas J.; Yoder, Dennis A.; Debonis, James R.

    1999-01-01

    A comparison of the NPARC, PAB, and WIND (previously known as NASTD) Navier-Stokes solvers is made for two flow cases with turbulent mixing as the dominant flow characteristic, a two-dimensional ejector nozzle and a Mach 1.5 elliptic jet. The objective of the work is to determine if comparable predictions of nozzle flows can be obtained from different Navier-Stokes codes employed in a multiple-site research program. A single computational grid was constructed for each of the two flows and used for all of the Navier-Stokes solvers. In addition, similar k-ε based turbulence models were employed in each code, and boundary conditions were specified as similarly as possible across the codes. Comparisons of mass flow rates, velocity profiles, and turbulence model quantities are made between the computations and experimental data. The computational cost of obtaining converged solutions with each of the codes is also documented. Results indicate that all of the codes provided similar predictions for the two nozzle flows. Agreement of the Navier-Stokes calculations with experimental data was good for the ejector nozzle. However, for the Mach 1.5 elliptic jet, the calculations were unable to accurately capture the development of the three-dimensional elliptic mixing layer.

  10. The Proteus Navier-Stokes code

    NASA Technical Reports Server (NTRS)

    Towne, Charles E.; Bui, Trong T.; Cavicchi, Richard H.; Conley, Julianne M.; Molls, Frank B.; Schwab, John R.

    1992-01-01

    An effort is currently underway at NASA Lewis to develop two- and three-dimensional Navier-Stokes codes, called Proteus, for aerospace propulsion applications. The emphasis in the development of Proteus is not algorithm development or research on numerical methods, but rather the development of the code itself. The objective is to develop codes that are user-oriented, easily-modified, and well-documented. Well-proven, state-of-the-art solution algorithms are being used. Code readability, documentation (both internal and external), and validation are being emphasized. This paper is a status report on the Proteus development effort. The analysis and solution procedure are described briefly, and the various features in the code are summarized. The results from some of the validation cases that have been run are presented for both the two- and three-dimensional codes.

  11. Analytical solutions for one-, two-, and three-dimensional solute transport in ground-water systems with uniform flow

    USGS Publications Warehouse

    Wexler, Eliezer J.

    1989-01-01

    Analytical solutions to the advective-dispersive solute-transport equation are useful in predicting the fate of solutes in ground water. Analytical solutions compiled from available literature or derived by the author are presented in this report for a variety of boundary condition types and solute-source configurations in one-, two-, and three-dimensional systems with uniform ground-water flow. A set of user-oriented computer programs was created to evaluate these solutions and to display the results in tabular and computer-graphics format. These programs incorporate many features that enhance their accuracy, ease of use, and versatility. Documentation for the programs describes their operation and required input data, and presents the results of sample problems. Derivations of select solutions, source codes for the computer programs, and samples of program input and output also are included.
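
    One of the classic closed forms compiled in such reports is the Ogata-Banks solution for one-dimensional transport from a constant-concentration boundary. The sketch below evaluates that well-known formula with arbitrary parameter values; it is not the report's own program:

        # Sketch: Ogata-Banks solution,
        # C(x,t) = (C0/2) [erfc((x - v t)/(2 sqrt(D t)))
        #                  + exp(v x / D) erfc((x + v t)/(2 sqrt(D t)))]
        import math
        from scipy.special import erfc

        def ogata_banks(c0, x, t, v, d):
            s = 2.0 * math.sqrt(d * t)
            return 0.5 * c0 * (erfc((x - v * t) / s)
                               + math.exp(v * x / d) * erfc((x + v * t) / s))

        # c0 = 100 mg/L, v = 0.5 m/d, D = 0.1 m^2/d, x = 20 m, t = 50 d:
        print(round(ogata_banks(100.0, 20.0, 50.0, 0.5, 0.1), 2))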

  12. How to Link to Official Documents from the Government Publishing Office (GPO)

    EPA Pesticide Factsheets

    The most consistent way to present up-to-date content from the Federal Register, US Code, Code of Federal Regulations (CFR), and so on is to link to the official version of the document on the GPO's Federal Digital System (FDSys) website.

  13. Health Physics Code System for Evaluating Accidents Involving Radioactive Materials.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    2014-10-01

    Version 03 The HOTSPOT Health Physics codes were created to provide Health Physics personnel with a fast, field-portable calculational tool for evaluating accidents involving radioactive materials. HOTSPOT codes provide a first-order approximation of the radiation effects associated with the atmospheric release of radioactive materials. The developer's website is: http://www.llnl.gov/nhi/hotspot/. Four general programs, PLUME, EXPLOSION, FIRE, and RESUSPENSION, calculate a downwind assessment following the release of radioactive material resulting from a continuous or puff release, explosive release, fuel fire, or an area contamination event. Additional programs deal specifically with the release of plutonium, uranium, and tritium to expedite an initial assessment of accidents involving nuclear weapons. The FIDLER program can calibrate radiation survey instruments for ground survey measurements and initial screening of personnel for possible plutonium uptake in the lung. The HOTSPOT codes are fast, portable, easy to use, and fully documented in electronic help files. HOTSPOT supports color high-resolution monitors and printers for concentration plots and contours. The codes have been extensively used by the DOE community since 1985. Tables and graphical output can be directed to the computer screen, printer, or a disk file. The graphical output consists of dose and ground contamination as a function of plume centerline downwind distance, and radiation dose and ground contamination contours. Users have the option of displaying scenario text on the plots. HOTSPOT 3.0.1 fixes three significant Windows 7 issues: the executable is installed properly under "Program Files/HotSpot 3.0"; the installation package is now smaller, having removed a dependency on older Windows DLL files that were previously needed; and forms now scale properly based on DPI instead of font for users who change their screen resolution to something other than 100%, a more common practice in Windows 7. The Windows installer was starting every time most users started the program, even after HotSpot was already installed; now, after the program is installed, the installer may come up once for each new user, but only the first time they run HotSpot on a particular machine, so no user should see the installer come up more than once over many uses. GPS capability has been updated to directly use a serial port through a USB connection; non-USB connections should still work. Table output inconsistencies for fire scenarios have been fixed.

  14. The Library Systems Act and Rules for Administering the Library Systems Act.

    ERIC Educational Resources Information Center

    Texas State Library, Austin. Library Development Div.

    This document contains the Texas Library Systems Act and rules for administering the Library Systems Act. Specifically, it includes the following documents: Texas Library Systems Act; Summary of Codes; Texas Administrative Code: Service Complaints and Protest Procedure; Criteria For Texas Library System Membership; and Certification Requirements…

  15. TranAir: A full-potential, solution-adaptive, rectangular grid code for predicting subsonic, transonic, and supersonic flows about arbitrary configurations. Theory document

    NASA Technical Reports Server (NTRS)

    Johnson, F. T.; Samant, S. S.; Bieterman, M. B.; Melvin, R. G.; Young, D. P.; Bussoletti, J. E.; Hilmes, C. L.

    1992-01-01

    A new computer program, called TranAir, for analyzing complex configurations in transonic flow (with subsonic or supersonic freestream) was developed. This program provides accurate and efficient simulations of nonlinear aerodynamic flows about arbitrary geometries with the ease and flexibility of a typical panel method program. The numerical method implemented in TranAir is described. The method solves the full potential equation subject to a set of general boundary conditions and can handle regions with differing total pressure and temperature. The boundary value problem is discretized using the finite element method on a locally refined rectangular grid. The grid is automatically constructed by the code and is superimposed on the boundary described by networks of panels; thus no surface fitted grid generation is required. The nonlinear discrete system arising from the finite element method is solved using a preconditioned Krylov subspace method embedded in an inexact Newton method. The solution is obtained on a sequence of successively refined grids which are either constructed adaptively based on estimated solution errors or are predetermined based on user inputs. Many results obtained by using TranAir to analyze aerodynamic configurations are presented.
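
    The governing model TranAir solves can be written, in its standard textbook form (stated here from the usual isentropic relations rather than quoted from the theory document), as the full potential equation with density recovered from the velocity potential:

        \nabla \cdot (\rho \nabla \phi) = 0, \qquad
        \frac{\rho}{\rho_\infty} =
        \left[ 1 + \frac{\gamma - 1}{2} M_\infty^2
        \left( 1 - \frac{|\nabla \phi|^2}{V_\infty^2} \right) \right]^{1/(\gamma - 1)}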

  16. Data Quality Objectives for Tank Farms Waste Compatibility Program

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    BANNING, D.L.

    1999-07-02

    There are 177 waste storage tanks containing over 210,000 m³ (55 million gal) of mixed waste at the Hanford Site. The River Protection Project (RPP) has adopted the data quality objective (DQO) process used by the U.S. Environmental Protection Agency (EPA) (EPA 1994a) and implemented by RPP internal procedure (Banning 1999a) to identify the information and data needed to address safety issues. This DQO document is based on several documents that provide the technical basis for inputs and decision/action levels used to develop the decision rules that evaluate the transfer of wastes. A number of these documents are presently in the process of being revised. This document will need to be revised if there are changes to the technical criteria in these supporting documents. This DQO process supports various documents, such as sampling and analysis plans and double-shell tank (DST) waste analysis plans. This document identifies the type, quality, and quantity of data needed to determine whether transfer of supernatant can be performed safely. The requirements in this document are designed to prevent the mixing of incompatible waste as defined in Washington Administrative Code (WAC) 173-303-040. Waste transfers which meet the requirements contained in this document and the Double-Shell Tank Waste Analysis Plan (Mulkey 1998) are considered to be compatible, and prevent the mixing of incompatible waste.

  17. Mendel-GPU: haplotyping and genotype imputation on graphics processing units

    PubMed Central

    Chen, Gary K.; Wang, Kai; Stram, Alex H.; Sobel, Eric M.; Lange, Kenneth

    2012-01-01

    Motivation: In modern sequencing studies, one can improve the confidence of genotype calls by phasing haplotypes using information from an external reference panel of fully typed unrelated individuals. However, the computational demands are so high that they prohibit researchers with limited computational resources from haplotyping large-scale sequence data. Results: Our graphics processing unit based software delivers haplotyping and imputation accuracies comparable to competing programs at a fraction of the computational cost and peak memory demand. Availability: Mendel-GPU, our OpenCL software, runs on Linux platforms and is portable across AMD and nVidia GPUs. Users can download both code and documentation at http://code.google.com/p/mendel-gpu/. Contact: gary.k.chen@usc.edu Supplementary information: Supplementary data are available at Bioinformatics online. PMID:22954633

  18. Aircraft noise prediction program theoretical manual: Rotorcraft System Noise Prediction System (ROTONET), part 4

    NASA Technical Reports Server (NTRS)

    Weir, Donald S.; Jumper, Stephen J.; Burley, Casey L.; Golub, Robert A.

    1995-01-01

    This document describes the theoretical methods used in the rotorcraft noise prediction system (ROTONET), which is a part of the NASA Aircraft Noise Prediction Program (ANOPP). The ANOPP code consists of an executive, database manager, and prediction modules for jet engine, propeller, and rotor noise. The ROTONET subsystem contains modules for the prediction of rotor airloads and performance with momentum theory and prescribed wake aerodynamics, rotor tone noise with compact chordwise and full-surface solutions to the Ffowcs-Williams-Hawkings equations, semiempirical airfoil broadband noise, and turbulence ingestion broadband noise. Flight dynamics, atmosphere propagation, and noise metric calculations are covered in NASA TM-83199, Parts 1, 2, and 3.

  19. Reducing software security risk through an integrated approach research initiative model based verification of the Secure Socket Layer (SSL) Protocol

    NASA Technical Reports Server (NTRS)

    Powell, John D.

    2003-01-01

    This document discusses the verification of the Secure Socket Layer (SSL) communication protocol as a demonstration of the Model Based Verification (MBV) portion of the verification instrument set being developed under the Reducing Software Security Risk (RSSR) Through an Integrated Approach research initiative. Code Q of the National Aeronautics and Space Administration (NASA) funds this project. The NASA Goddard Independent Verification and Validation (IV&V) facility manages this research program at the NASA agency level, and the Assurance Technology Program Office (ATPO) manages the research locally at the Jet Propulsion Laboratory (California Institute of Technology), where the research is being carried out.

  20. United States Air Force Summer Research Program 1991. Summer Faculty Research Program (SFRP) Reports. Volume 3. Phillips Laboratory, Civil Engineering Laboratory

    DTIC Science & Technology

    1992-01-09

  1. High Speed Research Program Structural Acoustics Multi-Year Summary Report

    NASA Technical Reports Server (NTRS)

    Beier, Theodor H.; Bhat, Waman V.; Rizzi, Stephen A.; Silcox, Richard J.; Simpson, Myles A.

    2005-01-01

    This report summarizes the work conducted by the Structural Acoustics Integrated Technology Development (ITD) Team under NASA's High Speed Research (HSR) Phase II program from 1993 to 1999. It is intended to serve as a reference for future researchers by documenting the results of the interior noise and sonic fatigue technology development activities conducted during this period. For interior noise, these activities included excitation modeling, structural acoustic response modeling, development of passive treatments and active controls, and prediction of interior noise. For sonic fatigue, these activities included loads prediction, materials characterization, sonic fatigue code development, development of response reduction techniques, and generation of sonic fatigue design requirements. Also included are lessons learned and recommendations for future work.

  2. Pteros: fast and easy to use open-source C++ library for molecular analysis.

    PubMed

    Yesylevskyy, Semen O

    2012-07-15

    An open-source Pteros library for molecular modeling and analysis of molecular dynamics trajectories for C++ programming language is introduced. Pteros provides a number of routine analysis operations ranging from reading and writing trajectory files and geometry transformations to structural alignment and computation of nonbonded interaction energies. The library features asynchronous trajectory reading and parallel execution of several analysis routines, which greatly simplifies development of computationally intensive trajectory analysis algorithms. Pteros programming interface is very simple and intuitive while the source code is well documented and easily extendible. Pteros is available for free under open-source Artistic License from http://sourceforge.net/projects/pteros/. Copyright © 2012 Wiley Periodicals, Inc.

  3. User Manual for the NASA Glenn Ice Accretion Code LEWICE: Version 2.0

    NASA Technical Reports Server (NTRS)

    Wright, William B.

    1999-01-01

    A research project is underway at NASA Glenn to produce a computer code which can accurately predict ice growth under a wide range of meteorological conditions for any aircraft surface. This report will present a description of the code inputs and outputs from version 2.0 of this code, which is called LEWICE. This version differs from previous releases due to its robustness and its ability to reproduce results accurately for different spacing and time step criteria across computing platforms. It also differs in the extensive effort undertaken to compare the results against the database of ice shapes which have been generated in the NASA Glenn Icing Research Tunnel (IRT). This report will only describe the features of the code related to the use of the program. The report will not describe the inner workings of the code or the physical models used. This information is available in the form of several unpublished documents which will be collectively referred to as a Programmers Manual for LEWICE in this report. These reports are intended as an update/replacement for all previous user manuals of LEWICE. In addition to describing the changes and improvements made for this version, information from previous manuals may be duplicated so that the user will not need to consult previous manuals to use this code.

  4. Development and Validation of a Primary Care-Based Family Health History and Decision Support Program (MeTree)

    PubMed Central

    Orlando, Lori A.; Buchanan, Adam H.; Hahn, Susan E.; Christianson, Carol A.; Powell, Karen P.; Skinner, Celette Sugg; Chesnut, Blair; Blach, Colette; Due, Barbara; Ginsburg, Geoffrey S.; Henrich, Vincent C.

    2016-01-01

    INTRODUCTION Family health history is a strong predictor of disease risk. To reduce the morbidity and mortality of many chronic diseases, risk-stratified evidence-based guidelines strongly encourage the collection and synthesis of family health history to guide selection of primary prevention strategies. However, the collection and synthesis of such information is not well integrated into clinical practice. To address barriers to collection and use of family health histories, the Genomedical Connection developed and validated MeTree, a Web-based, patient-facing family health history collection and clinical decision support tool. MeTree is designed for integration into primary care practices as part of the genomic medicine model for primary care. METHODS We describe the guiding principles, operational characteristics, algorithm development, and coding used to develop MeTree. Validation was performed through stakeholder cognitive interviewing, a genetic counseling pilot program, and clinical practice pilot programs in 2 community-based primary care clinics. RESULTS Stakeholder feedback resulted in changes to MeTree’s interface and changes to the phrasing of clinical decision support documents. The pilot studies resulted in the identification and correction of coding errors and the reformatting of clinical decision support documents. MeTree’s strengths in comparison with other tools are its seamless integration into clinical practice and its provision of action-oriented recommendations guided by providers’ needs. LIMITATIONS The tool was validated in a small cohort. CONCLUSION MeTree can be integrated into primary care practices to help providers collect and synthesize family health history information from patients with the goal of improving adherence to risk-stratified evidence-based guidelines. PMID:24044145

  5. TFaNS Tone Fan Noise Design/Prediction System. Volume 1; System Description, CUP3D Technical Documentation and Manual for Code Developers

    NASA Technical Reports Server (NTRS)

    Topol, David A.

    1999-01-01

    TFaNS is the Tone Fan Noise Design/Prediction System developed by Pratt & Whitney under contract to NASA Lewis (presently NASA Glenn). The purpose of this system is to predict tone noise emanating from a fan stage including the effects of reflection and transmission by the rotor and stator and by the duct inlet and nozzle. These effects have been added to an existing annular duct/isolated stator noise prediction capability. TFaNS consists of: The codes that compute the acoustic properties (reflection and transmission coefficients) of the various elements and write them to files. Cup3D: Fan Noise Coupling Code that reads these files, solves the coupling problem, and outputs the desired noise predictions. AWAKEN: CFD/Measured Wake Postprocessor which reformats CFD wake predictions and/or measured wake data so it can be used by the system. This volume of the report provides technical background for TFaNS including the organization of the system and CUP3D technical documentation. This document also provides information for code developers who must write Acoustic Property Files in the CUP3D format. This report is divided into three volumes: Volume I: System Description, CUP3D Technical Documentation, and Manual for Code Developers; Volume II: User's Manual, TFaNS Vers. 1.4; Volume III: Evaluation of System Codes.

  6. Automating approximate Bayesian computation by local linear regression.

    PubMed

    Thornton, Kevin R

    2009-07-07

    In several biological contexts, parameter inference often relies on computationally-intensive techniques. "Approximate Bayesian Computation", or ABC, methods based on summary statistics have become increasingly popular. A particular flavor of ABC based on using a linear regression to approximate the posterior distribution of the parameters, conditional on the summary statistics, is computationally appealing, yet no standalone tool exists to automate the procedure. Here, I describe a program to implement the method. The software package ABCreg implements the local linear-regression approach to ABC. The advantages are: 1. The code is standalone, and fully-documented. 2. The program will automatically process multiple data sets, and create unique output files for each (which may be processed immediately in R), facilitating the testing of inference procedures on simulated data, or the analysis of multiple data sets. 3. The program implements two different transformation methods for the regression step. 4. Analysis options are controlled on the command line by the user, and the program is designed to output warnings for cases where the regression fails. 5. The program does not depend on any particular simulation machinery (coalescent, forward-time, etc.), and therefore is a general tool for processing the results from any simulation. 6. The code is open-source, and modular. Examples of applying the software to empirical data from Drosophila melanogaster, and testing the procedure on simulated data, are shown. In practice, ABCreg simplifies implementing ABC based on local-linear regression.
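
    The regression adjustment at the heart of this method (Beaumont-style local linear regression) can be sketched in a few lines. The code below is an invented Python illustration, not ABCreg itself, and omits the smooth (Epanechnikov) weighting the package applies:

        # Sketch: ABC with a local linear-regression adjustment.
        import numpy as np

        def abc_reg(obs_stats, sim_stats, sim_params, tolerance=0.1):
            dist = np.linalg.norm(sim_stats - obs_stats, axis=1)
            keep = dist <= np.quantile(dist, tolerance)      # closest fraction
            s, theta = sim_stats[keep], sim_params[keep]
            design = np.hstack([np.ones((s.shape[0], 1)), s - obs_stats])
            beta, *_ = np.linalg.lstsq(design, theta, rcond=None)
            return theta - (s - obs_stats) @ beta[1:]        # adjusted draws

        rng = np.random.default_rng(1)
        theta = rng.uniform(0, 10, size=(20000, 1))          # prior draws
        stats = theta + rng.normal(0, 1, size=(20000, 1))    # toy simulator
        posterior = abc_reg(np.array([5.0]), stats, theta)
        print(posterior.mean(), posterior.std())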

  7. SU-E-T-103: Development and Implementation of Web Based Quality Control Software

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Studinski, R; Taylor, R; Angers, C

    Purpose: Historically many radiation medicine programs have maintained their Quality Control (QC) test results in paper records or Microsoft Excel worksheets. Both these approaches represent significant logistical challenges, and are not predisposed to data review and approval. It has been our group's aim to develop and implement web based software designed not just to record and store QC data in a centralized database, but to provide scheduling and data review tools to help manage a radiation therapy clinic's equipment quality control program. Methods: The software was written in the Python programming language using the Django web framework. In order to promote collaboration and validation from other centres the code was made open source and is freely available to the public via an online source code repository. The code was written to provide a common user interface for data entry, formalize the review and approval process, and offer automated data trending and process control analysis of test results. Results: As of February 2014, our installation of QAtrack+ has 180 tests defined in its database and has collected approximately 22,000 test results, all of which have been reviewed and approved by a physicist via QATrack+'s review tools. These results include records for quality control of Elekta accelerators, CT simulators, our brachytherapy programme, TomoTherapy and Cyberknife units. Currently at least 5 other centres are known to be running QAtrack+ clinically, forming the start of an international user community. Conclusion: QAtrack+ has proven to be an effective tool for collecting radiation therapy QC data, allowing for rapid review and trending of data for a wide variety of treatment units. As free and open source software, all source code, documentation and a bug tracker are available to the public at https://bitbucket.org/tohccmedphys/qatrackplus/.
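
    The "process control analysis" such a database enables can be as simple as flagging new results against control limits derived from history. The sketch below shows that idea with invented numbers; QATrack+'s actual rules and schema are not reproduced here:

        # Sketch: flag a QC result outside mean +/- 3 sigma control limits.
        import statistics

        history = [100.2, 99.8, 100.1, 99.9, 100.0, 100.3, 99.7, 100.1]  # invented
        mean = statistics.fmean(history)
        sigma = statistics.stdev(history)
        lower, upper = mean - 3 * sigma, mean + 3 * sigma

        new_result = 101.2
        status = "in control" if lower <= new_result <= upper else "out of control"
        print(f"{new_result}: {status} (limits {lower:.2f} to {upper:.2f})")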

  8. The Nuremberg Code-A critique.

    PubMed

    Ghooi, Ravindra B

    2011-04-01

    The Nuremberg Code, drafted at the end of the Doctors' Trial in Nuremberg in 1947, has been hailed as a landmark document in medical and research ethics. Close examination of this code reveals that it was based on the Guidelines for Human Experimentation of 1931. The resemblance between these documents is uncanny. It is unfortunate that the authors of the Nuremberg Code passed it off as their original work. There is evidence that the defendants at the trial did request that their actions be judged on the basis of the 1931 Guidelines, in force in Germany. The prosecutors, however, ignored the request and tried the defendants for crimes against humanity, and the judges included the Nuremberg Code as a part of the judgment. Six of the ten principles in the Nuremberg Code are derived from the 1931 Guidelines, and two of four newly inserted principles are open to misinterpretation. There is little doubt that the Code was prepared after studying the Guidelines, but no reference was made to the Guidelines, for reasons that are not known. Using the Guidelines as a base document without giving due credit is plagiarism; as per our understanding of ethics today, this would be considered unethical. The Nuremberg Code has fallen by the wayside because, unlike the Declaration of Helsinki, it is not regularly reviewed and updated. The regular updating of some ethics codes is evidence of the evolving nature of human ethics.

  9. Color-Coded Labels Cued Nurses to Adhere to Central Line Connector Change.

    PubMed

    Morrison, Theresa Lynch; Laney, Christina; Foglesong, Jan; Brennaman, Laura

    2016-01-01

    This study examined nurses' adherence to policies regarding needleless connector changes using a novel, day-of-the-week, color-coded label compared with usual care that relied on electronic medical record (EMR) documentation. This was a prospective, comparative study. The study was performed on 4 medical-surgical units in a seasonally fluctuating, 715-bed healthcare system composed of 2 community hospitals. The convenience sample was composed of adults with central lines hospitalized for 4 or more days. At 4-day intervals, investigators observed bedside label use and EMR needleless connector change documentation. Control patients received standard care: needleless connector change with associated documentation in the EMR. Intervention patients, in addition to standard care, had a day-of-the-week, color-coded label placed on each needleless connector. To account for clustering within units, multinomial logistic regression models using survey sampling methodology were used to conduct Wald χ² tests. A multinomial odds ratio and 95% confidence interval (CI) provided an estimate of the effect of providing labels on units relative to usual-care documentation of needleless connector change in the EMR. In 335 central line observations, the units with labels (n = 205) had a 321% increase in the rate of documentation of needleless connector change in the EMR (odds ratio, 4.21; 95% CI, 1.76-10.10; P = .003) compared with the usual care control patients. For units with labels, when labels were present, placement of labels on needleless connectors increased the odds that nurses documented connector changes per policy (odds ratio, 4.72; 95% CI, 2.02-10.98; P = .003). Day-of-the-week, color-coded labels cued nurses to document central line needleless connector change in the EMR, which increased adherence to the needleless connector change policy. Providing day-of-the-week, color-coded needleless connector labels increased EMR documentation of timely needleless connector changes. Timely needleless connector changes may lower the incidence of central line-associated bloodstream infection.
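
    The unadjusted version of the reported effect measure is the familiar 2x2 odds ratio with a Wald confidence interval. The sketch below computes that simpler quantity from invented counts; the study's survey-weighted multinomial models are more involved:

        # Sketch: odds ratio and Wald 95% CI from an invented 2x2 table.
        import math

        a, b = 150, 55   # label units: documented / not documented (invented)
        c, d = 60, 70    # control units: documented / not documented (invented)

        or_hat = (a * d) / (b * c)
        se_log_or = math.sqrt(1/a + 1/b + 1/c + 1/d)
        lo = math.exp(math.log(or_hat) - 1.96 * se_log_or)
        hi = math.exp(math.log(or_hat) + 1.96 * se_log_or)
        print(f"OR = {or_hat:.2f}, 95% CI ({lo:.2f}, {hi:.2f})")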

  10. Provisional Coding Practices: Are They Really a Waste of Time?

    PubMed

    Krypuy, Matthew; McCormack, Lena

    2006-11-01

    In order to facilitate effective clinical coding and hence the precise financial reimbursement of acute services, in 2005 Western District Health Service (WDHS) (located in regional Victoria, Australia) undertook a provisional coding trial for inpatient medical episodes to determine the magnitude and accuracy of clinical documentation. Utilising clinical coding software installed on a laptop computer, provisional coding was undertaken for all current overnight inpatient episodes under each physician one day prior to attending their daily ward round. The provisionally coded episodes were re-coded upon the completion of the discharge summary, and the final Diagnosis Related Group (DRG) allocation and weight were compared to the provisional DRG assignment. A total of 54 out of 220 inpatient medical episodes were provisionally coded, representing approximately a 25% cross-section of the population selected for observation. Approximately 67.6% of the provisionally allocated DRGs were accurate, in contrast to the 32.4% that were subject to change once the discharge summary was completed. The DRG changes were primarily due to: disease progression of a patient during their care episode, which could not be identified by clinical coding staff because the patient was discharged prior to the following scheduled ward round; the discharge destination of particular patients; and the accuracy of clinical documentation on the discharge summary. The information gathered from the provisional coding trial supported the hypothesis that clinical documentation standards were sufficient and adequate to support precise clinical coding and DRG assignment at WDHS. The trial further highlighted the importance of a complete and accurate discharge summary being available during the coding process of acute inpatient episodes.
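
    The trial's bookkeeping reduces to comparing the provisional and final code assignments episode by episode. A minimal sketch with hypothetical DRG codes (invented, purely to show the comparison):

        # Hypothetical provisional vs. final DRG assignments per episode.
        provisional = {"ep1": "E62B", "ep2": "F62A", "ep3": "G67B", "ep4": "B78B"}
        final       = {"ep1": "E62B", "ep2": "F62B", "ep3": "G67B", "ep4": "B78A"}

        unchanged = sum(provisional[ep] == final[ep] for ep in provisional)
        rate = 100 * unchanged / len(provisional)
        print(f"{unchanged}/{len(provisional)} provisional DRGs held ({rate:.1f}% accurate)")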

  11. Acoustic Prediction State of the Art Assessment

    NASA Technical Reports Server (NTRS)

    Dahl, Milo D.

    2007-01-01

    The acoustic assessment task for both the Subsonic Fixed Wing and the Supersonic projects under NASA's Fundamental Aeronautics Program was designed to assess the current state-of-the-art in noise prediction capability and to establish baselines for gauging future progress. The documentation of our current capabilities included quantifying the differences between predictions of noise from computer codes and measurements of noise from experimental tests. Quantifying the accuracy of both the computed and experimental results further enhanced the credibility of the assessment. This presentation gives sample results from codes representative of NASA's capabilities in aircraft noise prediction both for systems and components. These include semi-empirical, statistical, analytical, and numerical codes. System level results are shown for both aircraft and engines. Component level results are shown for a landing gear prototype, for fan broadband noise, for jet noise from a subsonic round nozzle, and for propulsion airframe aeroacoustic interactions. Additional results are shown for modeling of the acoustic behavior of duct acoustic lining and the attenuation of sound in lined ducts with flow.

  12. Generating Safety-Critical PLC Code From a High-Level Application Software Specification

    NASA Technical Reports Server (NTRS)

    2008-01-01

    The benefits of automatic-application code generation are widely accepted within the software engineering community. These benefits include raised abstraction level of application programming, shorter product development time, lower maintenance costs, and increased code quality and consistency. Surprisingly, code generation concepts have not yet found wide acceptance and use in the field of programmable logic controller (PLC) software development. Software engineers at Kennedy Space Center recognized the need for PLC code generation while developing the new ground checkout and launch processing system, called the Launch Control System (LCS). Engineers developed a process and a prototype software tool that automatically translates a high-level representation or specification of application software into ladder logic that executes on a PLC. All the computer hardware in the LCS is planned to be commercial off the shelf (COTS), including industrial controllers or PLCs that are connected to the sensors and end items out in the field. Most of the software in LCS is also planned to be COTS, with only small adapter software modules that must be developed in order to interface between the various COTS software products. A domain-specific language (DSL) is a programming language designed to perform tasks and to solve problems in a particular domain, such as ground processing of launch vehicles. The LCS engineers created a DSL for developing test sequences of ground checkout and launch operations of future launch vehicle and spacecraft elements, and they are developing a tabular specification format that uses the DSL keywords and functions familiar to the ground and flight system users. The tabular specification format, or tabular spec, allows most ground and flight system users to document how the application software is intended to function and requires little or no software programming knowledge or experience. A small sample from a prototype tabular spec application is shown.
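
    The generation step is easy to picture: each row of a tabular spec names an input condition and an output action, and the generator emits the corresponding rung. The sketch below is purely illustrative; the row format, tag names, and the textual rung notation are invented stand-ins, not the LCS DSL or its ladder output.

        # Hypothetical tabular-spec rows: (rung name, input tag, output tag).
        SPEC = [
            ("purge_valve_open", "GN2_PRESS_OK", "PURGE_VALVE_CMD"),
            ("igniter_arm", "LAUNCH_ENABLE", "IGNITER_ARM_CMD"),
        ]

        def to_rung(name, contact, coil):
            """Emit one ladder rung in an invented textual notation."""
            return f"RUNG {name}: --[ {contact} ]----( {coil} )--"

        for row in SPEC:
            print(to_rung(*row))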

  13. SCALE: A modular code system for performing standardized computer analyses for licensing evaluation. Functional modules F1--F8 -- Volume 2, Part 1, Revision 4

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Greene, N.M.; Petrie, L.M.; Westfall, R.M.

    SCALE--a modular code system for Standardized Computer Analyses Licensing Evaluation--has been developed by Oak Ridge National Laboratory at the request of the US Nuclear Regulatory Commission. The SCALE system utilizes well-established computer codes and methods within standard analysis sequences that (1) allow an input format designed for the occasional user and/or novice, (2) automate the data processing and coupling between modules, and (3) provide accurate and reliable results. System development has been directed at problem-dependent cross-section processing and analysis of criticality safety, shielding, heat transfer, and depletion/decay problems. Since the initial release of SCALE in 1980, the code system has been heavily used for evaluation of nuclear fuel facility and package designs. This revision documents Version 4.2 of the system. The manual is divided into three volumes: Volume 1--for the control module documentation; Volume 2--for functional module documentation; and Volume 3--for documentation of the data libraries and subroutine libraries.

  14. A Strategy for Reusing the Data of Electronic Medical Record Systems for Clinical Research.

    PubMed

    Matsumura, Yasushi; Hattori, Atsushi; Manabe, Shiro; Tsuda, Tsutomu; Takeda, Toshihiro; Okada, Katsuki; Murata, Taizo; Mihara, Naoki

    2016-01-01

    There is a great need to reuse data stored in electronic medical records (EMR) databases for clinical research. We previously reported the development of a system in which progress notes and case report forms (CRFs) were simultaneously recorded using a template in the EMR in order to exclude redundant data entry. To make the data collection process more efficient, we are developing a system in which the data originally stored in the EMR database can be populated within a frame in a template. We developed interface plugin modules that retrieve data from the databases of other EMR applications. A universal keyword written in a template master is converted to a local code using a data conversion table, and then the objective data is retrieved from the corresponding database. The template element data, which are entered by a template, are stored in the template element database. To retrieve the data entered by other templates, the objective data is designated by the template element code with the template code, or by the concept code if one is written for the element. When the application systems in the EMR generate documents, they also generate a PDF file and a corresponding document profile XML, which includes important data, and send them to the document archive server and the data sharing server, respectively. In the data sharing server, the data are represented as items, each with an item code, a document class code, and a value. By linking a concept code to an item identifier, the objective data can be retrieved by designating the concept code. We employed a flexible strategy in which a unique identifier for a hospital is initially attached to all of the data that the hospital generates. The identifier is secondarily linked with concept codes. The data that are not linked with a concept code can also be retrieved using the unique identifier of the hospital. This strategy makes it possible to reuse any of a hospital's data.
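
    The core of the strategy is a two-step lookup: a universal keyword (or concept code) is translated to a hospital-local code through the conversion table, and the local code then keys the retrieval from the hospital's database. A minimal sketch, with invented codes and in-memory dictionaries standing in for the conversion table and EMR store:

        # Invented stand-ins for the conversion table and the EMR data store.
        CONVERSION = {"serum_creatinine": "LAB-0123"}   # universal keyword -> local code
        EMR_DB = {("patient42", "LAB-0123"): 0.9}       # (patient, local code) -> value

        def fetch(patient_id, keyword):
            """Resolve a universal keyword to a local code, then retrieve the value."""
            local_code = CONVERSION.get(keyword)
            if local_code is None:
                return None  # keyword not yet linked for this hospital
            return EMR_DB.get((patient_id, local_code))

        print(fetch("patient42", "serum_creatinine"))  # -> 0.9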

  15. Integrating automated structured analysis and design with Ada programming support environments

    NASA Technical Reports Server (NTRS)

    Hecht, Alan; Simmons, Andy

    1986-01-01

    Ada Programming Support Environments (APSE) include many powerful tools that address the implementation of Ada code. These tools do not address the entire software development process. Structured analysis is a methodology that addresses the creation of complete and accurate system specifications. Structured design takes a specification, derives a plan for decomposing the system into subcomponents, and provides heuristics for optimizing the software design to minimize errors and maintenance; it can also guide the creation of reusable modules. Studies have shown that most software errors result from poor system specifications, and that these errors become more expensive to fix as the development process continues. Structured analysis and design help to uncover errors in the early stages of development. The APSE tools help to ensure that the code produced is correct, and aid in finding obscure coding errors. However, they do not have the capability to detect errors in specifications or to detect poor designs. Teamwork, an automated system for structured analysis and design that can be integrated with an APSE to support software systems development from specification through implementation, is described. These tools complement each other to help developers improve quality and productivity, as well as to reduce development and maintenance costs. Complete system documentation and reusable code also result from the use of these tools. Integrating an APSE with automated tools for structured analysis and design provides capabilities and advantages beyond those realized with any of these systems used by itself.

  16. User’s manual for the National Water Information System of the U.S. Geological Survey: Water-Quality System

    USGS Publications Warehouse

    Dupré, David H.; Scott, Jonathon C.; Clark, Melanie L.; Canova, Michael G.; Stoker, Yvonne E.

    2013-01-01

    This user documentation is designed to be a reference for the quality of water (QW) programs within the National Water Information System (NWIS). If you are a new user, the “Introduction” and “Getting Started” sections may be the right place for you to start. If you are an experienced user, you may want to go straight to the details provided in the “Program” section (section 3). Code lists and some miscellaneous reference materials are provided in the Appendices. The last section, “Tip Sheets,” is a collection of suggestions for accomplishing selected tasks, some of which are basic and some are advanced. These tip sheets are referenced in the main text of the documentation where appropriate.

  17. X-ray simulation algorithms used in ISP

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sullivan, John P.

    ISP is a simulation code which is sometimes used in the USNDS program. ISP is maintained by Sandia National Lab. However, the X-ray simulation algorithm used by ISP was written by scientists at LANL – mainly by Ed Fenimore with some contributions from John Sullivan and George Neuschaefer and probably others. In email to John Sullivan on July 25, 2016, Jill Rivera, ISP project lead, said “ISP uses the function xdosemeters_sim from the xgen library.” This is a Fortran subroutine which is also used to simulate the X-ray response in consim (a descendant of xgen). Therefore, no separate documentation of the X-ray simulation algorithms in ISP has been written – the documentation for the consim simulation can be used.

  18. A realist review of family-based interventions for children of substance abusing parents.

    PubMed

    Usher, Amelia M; McShane, Kelly E; Dwyer, Candice

    2015-12-18

    Millions of children across North America and Europe live in families with alcohol or drug abusing parents. These children are at risk for a number of negative social, emotional and developmental outcomes, including an increased likelihood of developing a substance use disorder later in life. Family-based intervention programs for children with substance abusing parents can yield positive outcomes. This study is a realist review of evaluations of family-based interventions aimed at improving psychosocial outcomes for children of substance abusing parents (COSAPs). The primary objectives were to uncover patterns of contextual factors and mechanisms that generate program outcomes, and advance program theory in this field. Realist review methodology was chosen as the most appropriate method of systematic review because it is a theory-driven approach that seeks to explore mechanisms underlying program effectiveness (or lack thereof). A systematic and comprehensive search of academic and grey literature uncovered 32 documents spanning 7 different intervention programs. Data was extracted from the included documents using abstraction templates designed to code for contexts, mechanisms and outcomes of each program. Two candidate program theories of family addiction were used to guide data analysis: the family disease model and the family prevention model. Data analysis was undertaken by a research team using an iterative process of comparison and checking with original documents to determine patterns within the data. Programs originating in both the family disease model and the family prevention model were uncovered, along with hybrid programs that successfully included components from each candidate program theory. Four demi-regularities were found to account for the effectiveness of programs included in this review: (1) opportunities for positive parent-child interactions, (2) supportive peer-to-peer relationships, (3) the power of knowledge, and (4) engaging hard to reach families using strategies that are responsive to socio-economic needs and matching services to client lived experience. This review yielded new findings that had not otherwise been explored in COSAP program research and are discussed in order to help expand program theory. Implications for practice and evaluation are further discussed.

  19. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yoo, Jun Soo; Choi, Yong Joon

    The RELAP-7 code verification and validation activities are ongoing under the code assessment plan proposed in the previous document (INL-EXT-16-40015). Among the list of V&V test problems in the ‘RELAP-7 code V&V RTM (Requirements Traceability Matrix)’, the RELAP-7 7-equation model has been tested with additional demonstration problems and the results of these tests are reported in this document. In this report, we describe the testing process, the test cases that were conducted, and the results of the evaluation.

  20. The SIFT hardware/software systems. Volume 2: Software listings

    NASA Technical Reports Server (NTRS)

    Palumbo, Daniel L.

    1985-01-01

    This document contains software listings of the SIFT operating system and application software. The software is coded for the most part in Pascal*, a variant of the Pascal language; the Pascal* cross-compiler runs on the VAX and Eclipse computers. The output of Pascal* is BDX-390 assembler code. When necessary, modules are written directly in BDX-390 assembler code. The listings in this document supplement the description of the SIFT system found in Volume 1 of this report, A Detailed Description.

  1. Documentation of the GLAS fourth order general calculation model. Volume 3: Vectorized code for the Cyber 205

    NASA Technical Reports Server (NTRS)

    Kalnay, E.; Balgovind, R.; Chao, W.; Edelmann, D.; Pfaendtner, J.; Takacs, L.; Takano, K.

    1983-01-01

    Volume 3 of a 3-volume technical memorandum which contains documentation of the GLAS fourth order general circulation model is presented. The volume contains the CYBER 205 scalar and vector codes of the model, a list of variables, and cross references. A dictionary of FORTRAN variables used in the Scalar Version, and listings of the FORTRAN Code compiled with the C-option, are included. Cross reference maps of local variables are included for each subroutine.

  2. Lean coding machine. Facilities target productivity and job satisfaction with coding automation.

    PubMed

    Rollins, Genna

    2010-07-01

    Facilities are turning to coding automation to help manage the volume of electronic documentation, streamlining workflow, boosting productivity, and increasing job satisfaction. As EHR adoption increases, computer-assisted coding may become a necessity, not an option.

  3. 76 FR 11191 - Hazardous Materials: Adoption of ASME Code Section XII and the National Board Inspection Code

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-03-01

    ... parts of the National Board Inspection Code at http://www.nationalboard.org. DATES: The comment period... edition of the National Board Inspection Code for public review at www.nationalboard.org. Both documents...

  4. Mads.jl

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Vesselinov, Velimir; O'Malley, Daniel; Lin, Youzuo

    2016-07-01

    Mads.jl (Model analysis and decision support in Julia) is a code that streamlines the process of using data and models for analysis and decision support. It is based on another open-source code developed at LANL and written in C/C++ (MADS; http://mads.lanl.gov; LA-CC-11-035). Mads.jl can work with external models of arbitrary complexity as well as built-in models of flow and transport in porous media. It enables a number of data- and model-based analyses including model calibration, sensitivity analysis, uncertainty quantification, and decision analysis. The code also can use a series of alternative adaptive computational techniques for Bayesian sampling, Monte Carlo, and Bayesian Information-Gap Decision Theory. The code is implemented in the Julia programming language, and has high-performance (parallel) and memory-management capabilities. The code uses a series of third-party modules developed by others. The code development will also include contributions to the existing third-party modules written in Julia; these contributions will be important for the efficient implementation of the algorithms used by Mads.jl. The code also uses a series of LANL-developed modules written by Dan O'Malley; these modules will also be a part of the Mads.jl release. Mads.jl will be released under the GPL v3 license. The code will be distributed as a Git repo at gitlab.com and github.com. The Mads.jl manual and documentation will be posted at madsjulia.lanl.gov.

  5. A Compilation of MATLAB Scripts and Functions for MACGMC Analyses

    NASA Technical Reports Server (NTRS)

    Murthy, Pappu L. N.; Bednarcyk, Brett A.; Mital, Subodh K.

    2017-01-01

    The primary aim of the current effort is to provide scripts that automate many of the repetitive pre- and post-processing tasks associated with composite materials analyses using the Micromechanics Analysis Code with the Generalized Method of Cells (MACGMC). This document consists of a compilation of hundreds of scripts that were developed in the MATLAB (The MathWorks, Inc., Natick, MA) programming language and consolidated into 16 MATLAB functions. MACGMC is a composite material and laminate analysis software code developed at NASA Glenn Research Center. The software package has been built around the generalized method of cells (GMC) family of micromechanics theories. The computer code is developed with a user-friendly framework, along with a library of local inelastic, damage, and failure models. Further, application of simulated thermo-mechanical loading, generation of output results, and selection of architectures to represent the composite material have been automated to increase the user friendliness, as well as to make the code more robust in terms of input preparation and execution. Finally, classical lamination theory has been implemented within the software, wherein GMC is used to model the composite material response of each ply. Thus, the full range of GMC composite material capabilities is available for analysis of arbitrary laminate configurations as well. The pre-processing tasks include generation of a multitude of different repeating unit cells (RUCs) for CMCs and PMCs, visualization of RUCs from MACGMC input and output files, and generation of the RUC section of a MACGMC input file. The post-processing tasks include visualization of the predicted composite response, such as local stress and strain contours, damage initiation and progression, stress-strain behavior, and fatigue response. In addition to the above, several miscellaneous scripts have been developed that can be used to perform repeated Monte-Carlo simulations to enable probabilistic simulations with minimal manual intervention. This document is formatted to provide MATLAB source files and descriptions of how to utilize them. It is assumed that the user has a basic understanding of how MATLAB scripts work and some MATLAB programming experience.

  6. Integrated verification and testing system (IVTS) for HAL/S programs

    NASA Technical Reports Server (NTRS)

    Senn, E. H.; Ames, K. R.; Smith, K. A.

    1983-01-01

    The IVTS is a large software system designed to support user-controlled verification analysis and testing activities for programs written in the HAL/S language. The system is composed of a user interface and user command language, analysis tools and an organized data base of host system files. The analysis tools are of four major types: (1) static analysis, (2) symbolic execution, (3) dynamic analysis (testing), and (4) documentation enhancement. The IVTS requires a split HAL/S compiler, divided at the natural separation point between the parser/lexical analyzer phase and the target machine code generator phase. The IVTS uses the internal program form (HALMAT) between these two phases as primary input for the analysis tools. The dynamic analysis component requires some way to 'execute' the object HAL/S program. The execution medium may be an interpretive simulation or an actual host or target machine.
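
    The pattern described here, in which several analysis tools consume one intermediate program form emitted by the compiler front end, can be sketched in miniature. The toy instruction list, static check, and interpreter below are invented for illustration; they are not HALMAT or IVTS components.

        # Toy intermediate form: (opcode, ...) tuples produced by a front end.
        IR = [("load", "x", 2), ("load", "y", 0), ("div", "z", "x", "y")]

        def static_analysis(ir):
            """Static pass over the shared IR: flag divisions by a zero constant."""
            consts = {op[1]: op[2] for op in ir if op[0] == "load"}
            return [op for op in ir if op[0] == "div" and consts.get(op[3]) == 0]

        def execute(ir):
            """Dynamic pass over the same IR: interpret it (testing)."""
            env = {}
            for op in ir:
                if op[0] == "load":
                    env[op[1]] = op[2]
                elif op[0] == "div":
                    env[op[1]] = env[op[2]] / env[op[3]]
            return env

        print(static_analysis(IR))  # flags ('div', 'z', 'x', 'y')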

  7. How does culture affect experiential training feedback in exported Canadian health professional curricula?

    PubMed Central

    Mousa Bacha, Rasha; Abdelaziz, Somaia

    2017-01-01

    Objectives To explore feedback processes of Western-based health professional student training curricula conducted in an Arab clinical teaching setting. Methods This qualitative study employed document analysis of in-training evaluation reports (ITERs) used by Canadian nursing, pharmacy, respiratory therapy, paramedic, dental hygiene, and pharmacy technician programs established in Qatar. Six experiential training program coordinators were interviewed between February and May 2016 to explore how national cultural differences are perceived to affect feedback processes between students and clinical supervisors. Interviews were recorded, transcribed, and coded according to a priori cultural themes. Results Document analysis found all programs’ ITERs outlined competency items for students to achieve. Clinical supervisors choose a response option corresponding to their judgment of student performance and may provide additional written feedback in spaces provided. Only one program required formal face-to-face feedback exchange between students and clinical supervisors. Experiential training program coordinators identified that no ITER was expressly culturally adapted, although in some instances, modifications were made for differences in scopes of practice between Canada and Qatar.  Power distance was recognized by all coordinators who also identified both student and supervisor reluctance to document potentially negative feedback in ITERs. Instances of collectivism were described as more lenient student assessment by clinical supervisors of the same cultural background. Uncertainty avoidance did not appear to impact feedback processes. Conclusions Our findings suggest that differences in specific cultural dimensions between Qatar and Canada have implications on the feedback process in experiential training which may be addressed through simple measures to accommodate communication preferences. PMID:28315858

  8. The positive financial impact of using an Intensive Care Information System in a tertiary Intensive Care Unit.

    PubMed

    Levesque, Eric; Hoti, Emir; de La Serna, Sofia; Habouchi, Houssam; Ichai, Philippe; Saliba, Faouzi; Samuel, Didier; Azoulay, Daniel

    2013-03-01

    In the French healthcare system, the intensive care budget allocated is directly dependent on the activity level of the center. To evaluate this activity level, it is necessary to code the medical diagnoses and procedures performed on Intensive Care Unit (ICU) patients. The aim of this study was to evaluate the effects of using an Intensive Care Information System (ICIS) on the incidence of coding errors and its impact on the ICU budget allocated. Since 2005, the documentation on and monitoring of every patient admitted to our ICU has been carried out using an ICIS. However, the coding process was performed manually until 2008. This study focused on two periods: the period of manual coding (year 2007) and the period of computerized coding (year 2008), which covered a total of 1403 ICU patients. The time spent on the coding process, the rate of coding errors (defined as patients missed/not coded or wrongly identified as undergoing major procedure/s), and the financial impact were evaluated for these two periods. With computerized coding, the time per admission decreased significantly (from 6.8 ± 2.8 min in 2007 to 3.6 ± 1.9 min in 2008, p<0.001). Similarly, a reduction in coding errors was observed (7.9% vs. 2.2%, p<0.001). This decrease in coding errors resulted in a reduced difference between the potential and real ICU financial supplements obtained in the respective years (a €194,139 loss in 2007 vs. a €1628 loss in 2008). Using specific computer programs improves the labor-intensive process of manual coding by shortening the time required as well as reducing errors, which in turn positively impacts the ICU budget allocation. Copyright © 2012 Elsevier Ireland Ltd. All rights reserved.

  9. Stochastic Analysis of Orbital Lifetimes of Spacecraft

    NASA Technical Reports Server (NTRS)

    Sasamoto, Washito; Goodliff, Kandyce; Cornelius, David

    2008-01-01

    A document discusses (1) a Monte-Carlo-based methodology for probabilistic prediction and analysis of orbital lifetimes of spacecraft and (2) Orbital Lifetime Monte Carlo (OLMC)--a Fortran computer program, consisting of a previously developed long-term orbit-propagator integrated with a Monte Carlo engine. OLMC enables modeling of variances of key physical parameters that affect orbital lifetimes through the use of probability distributions. These parameters include altitude, speed, and flight-path angle at insertion into orbit; solar flux; and launch delays. The products of OLMC are predicted lifetimes (durations above specified minimum altitudes) for the number of user-specified cases. Histograms generated from such predictions can be used to determine the probabilities that spacecraft will satisfy lifetime requirements. The document discusses uncertainties that affect modeling of orbital lifetimes. Issues of repeatability, smoothness of distributions, and code run time are considered for the purpose of establishing values of code-specific parameters and number of Monte Carlo runs. Results from test cases are interpreted as demonstrating that solar-flux predictions are primary sources of variations in predicted lifetimes. Therefore, it is concluded, multiple sets of predictions should be utilized to fully characterize the lifetime range of a spacecraft.
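
    The workflow just described (draw key parameters from probability distributions, propagate each case, and histogram the resulting lifetimes) can be sketched compactly. Everything below is invented for illustration: the distributions are assumptions, and a crude decay formula stands in for the long-term orbit propagator.

        import random

        random.seed(1)

        def lifetime(alt_km, flux):
            """Stand-in for the orbit propagator: a crude, illustrative decay model."""
            return max(0.0, (alt_km - 300.0) / (0.05 * flux))  # years

        # Sample insertion altitude and solar flux from assumed distributions.
        runs = [lifetime(random.gauss(420, 10), random.uniform(80, 220))
                for _ in range(10_000)]

        # Estimate P(lifetime >= requirement) from the Monte Carlo sample.
        required = 15.0  # years
        p = sum(L >= required for L in runs) / len(runs)
        print(f"P(lifetime >= {required} yr) = {p:.2%}")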

  10. Application Reuse Library for Software, Requirements, and Guidelines

    NASA Technical Reports Server (NTRS)

    Malin, Jane T.; Thronesbery, Carroll

    1994-01-01

    Better designs are needed for expert systems and other operations automation software, for more reliable, usable and effective human support. A prototype computer-aided Application Reuse Library shows feasibility of supporting concurrent development and improvement of advanced software by users, analysts, software developers, and human-computer interaction experts. Such a library expedites development of quality software, by providing working, documented examples, which support understanding, modification and reuse of requirements as well as code. It explicitly documents and implicitly embodies design guidelines, standards and conventions. The Application Reuse Library provides application modules with Demo-and-Tester elements. Developers and users can evaluate applicability of a library module and test modifications, by running it interactively. Sub-modules provide application code and displays and controls. The library supports software modification and reuse, by providing alternative versions of application and display functionality. Information about human support and display requirements is provided, so that modifications will conform to guidelines. The library supports entry of new application modules from developers throughout an organization. Example library modules include a timer, some buttons and special fonts, and a real-time data interface program. The library prototype is implemented in the object-oriented G2 environment for developing real-time expert systems.

  11. A proto-code of ethics and conduct for European nurse directors.

    PubMed

    Stievano, Alessandro; De Marinis, Maria Grazia; Kelly, Denise; Filkins, Jacqueline; Meyenburg-Altwarg, Iris; Petrangeli, Mauro; Tschudin, Verena

    2012-03-01

    The proto-code of ethics and conduct for European nurse directors was developed as a strategic and dynamic document for nurse managers in Europe. It invites critical dialogue, reflective thinking about different situations, and the development of specific codes of ethics and conduct by nursing associations in different countries. The term proto-code is used for this document so that specifically country-orientated or organization-based and practical codes can be developed from it to guide professionals in more particular or situation-explicit reflection and values. The proto-code of ethics and conduct for European nurse directors was designed and developed by the European Nurse Directors Association's (ENDA) advisory team. This article gives short explanations of the code's preamble and two main parts: Nurse directors' ethical basis, and Principles of professional practice, which is divided into six specific points: competence, care, safety, staff, life-long learning and multi-sectorial working.

  12. SINDA'85/FLUINT - SYSTEMS IMPROVED NUMERICAL DIFFERENCING ANALYZER AND FLUID INTEGRATOR (CONVEX VERSION)

    NASA Technical Reports Server (NTRS)

    Cullimore, B.

    1994-01-01

    SINDA, the Systems Improved Numerical Differencing Analyzer, is a software system for solving lumped parameter representations of physical problems governed by diffusion-type equations. SINDA was originally designed for analyzing thermal systems represented in electrical analog, lumped parameter form, although its use may be extended to include other classes of physical systems which can be modeled in this form. As a thermal analyzer, SINDA can handle such interrelated phenomena as sublimation, diffuse radiation within enclosures, transport delay effects, and sensitivity analysis. FLUINT, the FLUid INTegrator, is an advanced one-dimensional fluid analysis program that solves arbitrary fluid flow networks. The working fluids can be single phase vapor, single phase liquid, or two phase. The SINDA'85/FLUINT system permits the mutual influences of thermal and fluid problems to be analyzed. The SINDA system consists of a programming language, a preprocessor, and a subroutine library. The SINDA language is designed for working with lumped parameter representations and finite difference solution techniques. The preprocessor accepts programs written in the SINDA language and converts them into standard FORTRAN. The SINDA library consists of a large number of FORTRAN subroutines that perform a variety of commonly needed actions. The use of these subroutines can greatly reduce the programming effort required to solve many problems. A complete run of a SINDA'85/FLUINT model is a four-step process. First, the user's desired model is run through the preprocessor, which writes out data files for the processor to read and translates the user's program code. Second, the translated code is compiled. The third step requires linking the user's code with the processor library. Finally, the processor is executed. SINDA'85/FLUINT supports up to 20,000 nodes, 100,000 conductors, 100 thermal submodels, and 10 fluid submodels. SINDA'85/FLUINT can also model two phase flow, capillary devices, user defined fluids, gravity and acceleration body forces on a fluid, and variable volumes. SINDA'85/FLUINT offers the following numerical solution techniques: the finite-difference formulation of the explicit method is the forward-difference approximation, and the formulation of the implicit method is the Crank-Nicolson approximation. The program allows simulation of non-uniform heating and facilitates modeling thin-walled heat exchangers. The ability to model non-equilibrium behavior within two-phase volumes is included. Recent improvements to the program were made in modeling real evaporator-pumps and other capillary-assist evaporators. SINDA'85/FLUINT is available by license for a period of ten (10) years to approved licensees. The licensed program product includes the source code and one copy of the supporting documentation. Additional copies of the documentation may be purchased separately at any time. SINDA'85/FLUINT is written in FORTRAN 77. Version 2.3 has been implemented on Cray series computers running UNICOS, CONVEX computers running CONVEX OS, and DEC RISC computers running ULTRIX. Binaries are included with the Cray version only. The Cray version of SINDA'85/FLUINT also contains SINGE, an additional graphics program developed at Johnson Space Center. Both source and executable code are provided for SINGE. Users wishing to create their own SINGE executable will also need the NASA Device Independent Graphics Library (NASADIG, previously known as SMDDIG; UNIX version, MSC-22001).
The Cray and CONVEX versions of SINDA'85/FLUINT are available on 9-track 1600 BPI UNIX tar format magnetic tapes. The CONVEX version is also available on a .25 inch streaming magnetic tape cartridge in UNIX tar format. The DEC RISC ULTRIX version is available on a TK50 magnetic tape cartridge in UNIX tar format. SINDA was developed in 1971, and first had fluid capability added in 1975. SINDA'85/FLUINT version 2.3 was released in 1990.
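
    For a single lumped node of capacitance C tied by conductance G to a sink at temperature Ts, the two solution techniques named above differ only in where the heat flux is evaluated. A minimal sketch (one node with invented constant properties, far simpler than a real SINDA network):

        # One lumped node: C * dT/dt = G * (Ts - T).
        C, G, Ts = 100.0, 5.0, 300.0   # illustrative values
        dt, T0, steps = 1.0, 400.0, 5
        a = dt * G / C                 # dimensionless step parameter

        T_exp = T_cn = T0
        for _ in range(steps):
            # Forward-difference explicit: flux evaluated at the old temperature.
            T_exp = T_exp + a * (Ts - T_exp)
            # Crank-Nicolson implicit: average of old and new fluxes, solved for T_new.
            T_cn = ((1 - a / 2) * T_cn + a * Ts) / (1 + a / 2)

        print(round(T_exp, 2), round(T_cn, 2))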

  13. User Instructions for the Systems Assessment Capability, Rev. 1, Computer Codes Volume 3: Utility Codes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Eslinger, Paul W.; Aaberg, Rosanne L.; Lopresti, Charles A.

    2004-09-14

    This document contains detailed user instructions for a suite of utility codes developed for Rev. 1 of the Systems Assessment Capability; the suite performs many functions in support of the capability.

  14. The Nuremberg Code–A critique

    PubMed Central

    Ghooi, Ravindra B.

    2011-01-01

    The Nuremberg Code, drafted at the end of the Doctors’ trial in Nuremberg in 1947, has been hailed as a landmark document in medical and research ethics. Close examination of this code reveals that it was based on the Guidelines for Human Experimentation of 1931. The resemblance between these documents is uncanny. It is unfortunate that the authors of the Nuremberg Code passed it off as their original work. There is evidence that the defendants at the trial did request that their actions be judged on the basis of the 1931 Guidelines, then in force in Germany. The prosecutors, however, ignored the request and tried the defendants for crimes against humanity, and the judges included the Nuremberg Code as a part of the judgment. Six of the ten principles in the Nuremberg Code are derived from the 1931 Guidelines, and two of the four newly inserted principles are open to misinterpretation. There is little doubt that the Code was prepared after studying the Guidelines, but no reference was made to the Guidelines, for reasons that are not known. Using the Guidelines as a base document without giving due credit is plagiarism; as per our understanding of ethics today, this would be considered unethical. The Nuremberg Code has fallen by the wayside since, unlike the Declaration of Helsinki, it is not regularly reviewed and updated. The regular updating of some ethics codes is evidence of the evolving nature of human ethics. PMID:21731859

  15. RAVEN Theory Manual

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Alfonsi, Andrea; Rabiti, Cristian; Mandelli, Diego

    2016-06-01

    RAVEN is a software framework able to perform parametric and stochastic analysis based on the response of complex system codes. The initial development was aimed at providing dynamic risk analysis capabilities to the thermohydraulic code RELAP-7, currently under development at Idaho National Laboratory (INL). Although the initial goal has been fully accomplished, RAVEN is now a multi-purpose stochastic and uncertainty quantification platform, capable of communicating with any system code. In fact, the provided Application Programming Interfaces (APIs) allow RAVEN to interact with any code as long as all the parameters that need to be perturbed are accessible by input files or via Python interfaces. RAVEN is capable of investigating system response and exploring the input space using various sampling schemes such as Monte Carlo, grid, or Latin hypercube. However, RAVEN's strength lies in its system feature discovery capabilities, such as constructing limit surfaces, separating regions of the input space leading to system failure, and using dynamic supervised learning techniques. The development of RAVEN started in 2012 when, within the Nuclear Energy Advanced Modeling and Simulation (NEAMS) program, the need to provide a modern risk evaluation framework arose. RAVEN's principal assignment is to provide the necessary software and algorithms in order to employ the concepts developed by the Risk Informed Safety Margin Characterization (RISMC) program. RISMC is one of the pathways defined within the Light Water Reactor Sustainability (LWRS) program. In the RISMC approach, the goal is not just to identify the frequency of an event potentially leading to a system failure, but the proximity (or lack thereof) to key safety-related events. Hence, the approach is interested in identifying and increasing the safety margins related to those events. A safety margin is a numerical value quantifying the probability that a safety metric (e.g. peak pressure in a pipe) is exceeded under certain conditions. Most of the capabilities, implemented having RELAP-7 as a principal focus, are easily deployable to other system codes. For this reason, several side activities have been carried out (e.g. RELAP5-3D, any MOOSE-based app) or are currently ongoing to couple RAVEN with several different software packages. The aim of this document is to provide a set of commented examples that can help the user to become familiar with the RAVEN code usage.
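
    Of the sampling schemes listed, Latin hypercube is the least self-explanatory: each input dimension is cut into N equal-probability strata, one sample is drawn inside each stratum, and the strata are shuffled independently per dimension. A minimal standalone sketch of the idea (not RAVEN's implementation):

        import random

        def latin_hypercube(n_samples, n_dims, rng=random.Random(0)):
            """Draw an n_samples x n_dims Latin hypercube design on [0, 1)."""
            columns = []
            for _ in range(n_dims):
                # One draw inside each of the n equal-width strata, then shuffle.
                col = [(i + rng.random()) / n_samples for i in range(n_samples)]
                rng.shuffle(col)
                columns.append(col)
            return list(zip(*columns))  # rows are sample points

        for point in latin_hypercube(5, 2):
            print(tuple(round(x, 3) for x in point))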

  16. Functional Equivalence Acceptance Testing of FUN3D for Entry Descent and Landing Applications

    NASA Technical Reports Server (NTRS)

    Gnoffo, Peter A.; Wood, William A.; Kleb, William L.; Alter, Stephen J.; Glass, Christopher E.; Padilla, Jose F.; Hammond, Dana P.; White, Jeffery A.

    2013-01-01

    The functional equivalence of the unstructured grid code FUN3D to the structured grid code LAURA (Langley Aerothermodynamic Upwind Relaxation Algorithm) is documented for applications of interest to the Entry, Descent, and Landing (EDL) community. Examples from an existing suite of regression tests are used to demonstrate the functional equivalence, encompassing various thermochemical models and vehicle configurations. Algorithm modifications required for the node-based unstructured grid code (FUN3D) to reproduce functionality of the cell-centered structured code (LAURA) are also documented. Challenges associated with computation on tetrahedral grids versus computation on structured-grid derived hexahedral systems are discussed.

  17. Medicare program: changes to the hospital outpatient prospective payment system and CY 2008 payment rates, the ambulatory surgical center payment system and CY 2008 payment rates, the hospital inpatient prospective payment system and FY 2008 payment rates; and payments for graduate medical education for affiliated teaching hospitals in certain emergency situations Medicare and Medicaid programs: hospital conditions of participation; necessary provider designations of critical access hospitals. Interim and final rule with comment period.

    PubMed

    2007-11-27

    This final rule with comment period revises the Medicare hospital outpatient prospective payment system to implement applicable statutory requirements and changes arising from our continuing experience with this system. We describe the changes to the amounts and factors used to determine the payment rates for Medicare hospital outpatient services paid under the prospective payment system. These changes are applicable to services furnished on or after January 1, 2008. In addition, the rule sets forth the applicable relative payment weights and amounts for services furnished in ASCs, specific HCPCS codes to which the final policies of the ASC payment system apply, and other pertinent rate setting information for the CY 2008 ASC payment system. Furthermore, this final rule with comment period will make changes to the policies relating to the necessary provider designations of critical access hospitals and changes to several of the current conditions of participation requirements. The attached document also incorporates the changes to the FY 2008 hospital inpatient prospective payment system (IPPS) payment rates made as a result of the enactment of the TMA, Abstinence Education, and QI Programs Extension Act of 2007, Public Law 110-90. In addition, we are changing the provisions in our previously issued FY 2008 IPPS final rule and are establishing a new policy, retroactive to October 1, 2007, of not applying the documentation and coding adjustment to the FY 2008 hospital-specific rates for Medicare-dependent, small rural hospitals (MDHs) and sole community hospitals (SCHs). In the interim final rule with comment period in this document, we are modifying our regulations relating to graduate medical education (GME) payments made to teaching hospitals that have Medicare affiliation agreements for certain emergency situations.

  18. [Differentiation of coding quality in orthopaedics by special, illustration-oriented case group analysis in the G-DRG System 2005].

    PubMed

    Schütz, U; Reichel, H; Dreinhöfer, K

    2007-01-01

    We introduce a grouping system for clinical practice which allows the separation of DRG coding into specific orthopaedic groups based on anatomic regions, operative procedures, therapeutic interventions, and morbidity-equivalent diagnosis groups. With this, a differentiated, aim-oriented analysis of illustrated internal DRG data becomes possible. The group-specific difference in coding quality between the DRG groups after primary coding by the orthopaedic surgeon and final coding by the medical controlling department is analysed. In a consecutive series of 1600 patients, parallel documentation and group-specific comparison of the relevant DRG parameters were carried out in every case after primary and final coding. In the analysis of the group-specific share of additional CaseMix coding, the group "spine surgery" dominated, closely followed by the groups "arthroplasty" and "surgery due to infection, tumours, diabetes". Altogether, additional cost-weight-relevant coding was necessary most frequently in the latter group (84%), followed by the group "spine surgery" (65%). In DRGs representing conservative orthopaedic treatment, documented procedures had nearly no influence on the cost weight. The introduced system of case-group analysis in internal DRG documentation can lead to the detection of specific problems in primary coding and of cost-weight-relevant changes in the case mix. As an instrument for internal process control in orthopaedics, it can serve as a communicative interface between an economically oriented classification of hospital performance and specific problem-solving by the medical staff involved in department management.

  19. 77 FR 72913 - Defining Larger Participants of the Consumer Debt Collection Market; Correction

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-12-07

    ... to and codified in the Code of Federal Regulations, which is published under 50 titles pursuant to 44 U.S.C. 1510. The Code of Federal Regulations is sold by the Superintendent of Documents. ... collection. The final rule contained four typographical errors, which this document corrects. Three of these...

  20. FDA Procedures for Standardization and Certification of Retail Food Inspection/Training Officers, 2000.

    ERIC Educational Resources Information Center

    Food and Drug Administration (DHHS/PHS), Rockville, MD.

    This document provides information, standards, and behavioral objectives for standardization and certification of retail food inspection personnel in the Food and Drug Administration (FDA). The procedures described in the document are based on the FDA Food Code, updated to reflect current Food Code provisions and to include a more refined focus on…

  1. The importance of documenting code, and how you might make yourself do it

    NASA Astrophysics Data System (ADS)

    Tollerud, Erik Jon; Astropy Project

    2016-01-01

    Your science code is awesome. It reduces data, performs some statistical analysis, or models a physical process better than anyone has done before. You wisely decide that it is worth sharing with your student/advisor, research collaboration, or the whole world. But when you send it out, no one seems willing to use it. Why? Most of the time, it's your documentation. You wrote the code for yourself, so you know what every function, procedure, or class is supposed to do. Unfortunately, your users (sometimes including you 6 months later) do not. In this talk, I will describe some of the tools, both technical and psychological, to make that documentation happen (particularly for the Python ecosystem).
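
    In the Python ecosystem the talk targets, the lowest-friction habit is the docstring, which tools such as Sphinx can harvest into rendered pages. A small example in the numpydoc convention (the function and its parameters are invented for illustration):

        from math import log10

        def normalize_flux(flux, zero_point=25.0):
            """Convert an instrumental flux to a calibrated magnitude.

            Parameters
            ----------
            flux : float
                Instrumental flux in counts per second; must be positive.
            zero_point : float, optional
                Photometric zero point (default 25.0).

            Returns
            -------
            float
                Calibrated magnitude.
            """
            return zero_point - 2.5 * log10(flux)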

  2. Automated software development workstation

    NASA Technical Reports Server (NTRS)

    1986-01-01

    Engineering software development was automated using an expert system (rule-based) approach. The use of this technology offers benefits not available from current software development and maintenance methodologies. A workstation was built with a library or program data base with methods for browsing the designs stored; a system for graphical specification of designs including a capability for hierarchical refinement and definition in a graphical design system; and an automated code generation capability in FORTRAN. The workstation was then used in a demonstration with examples from an attitude control subsystem design for the space station. Documentation and recommendations are presented.

  3. Practical uses of SPFIT

    NASA Astrophysics Data System (ADS)

    Drouin, Brian J.

    2017-10-01

    Over twenty-five years ago, Herb Pickett introduced his quantum-mechanical fitting programs to the spectroscopic community. The utility and flexibility of the software has enabled a whole generation of spectroscopists to analyze both simple and complex spectra without having to write and compile their own code. Last year Stewart Novick provided a primer for the coming generation of users. This follow-on work will serve as a guide to intermediate and advanced usage of the software. It is meant to be used in concert with the online documentation as well as the spectral line catalog archive.

  4. Level-2 Milestone 6007: Sierra Early Delivery System Deployed to Secret Restricted Network

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bertsch, A. D.

    This report documents the delivery and installation of Shark, a CORAL Sierra early delivery system deployed on the LLNL SRD network. Early ASC program users have run codes on the machine in support of application porting for the final Sierra system which will be deployed at LLNL in CY2018. In addition to the SRD resource, Shark, unclassified resources, Rzmanta and Ray, have been deployed on the LLNL Restricted Zone and Collaboration Zone networks in support of application readiness for the Sierra platform.

  5. Synthesis: Intertwining product and process

    NASA Technical Reports Server (NTRS)

    Weiss, David M.

    1990-01-01

    Synthesis is a proposed systematic process for rapidly creating different members of a program family. Family members are described by variations in their requirements. Requirements variations are mapped to variations on a standard design to generate production quality code and documentation. The approach is made feasible by using principles underlying design for change. Synthesis incorporates ideas from rapid prototyping, application generators, and domain analysis. The goals of Synthesis and the Synthesis process are discussed. The technology needed and the feasibility of the approach are also briefly discussed. The status of current efforts to implement Synthesis methodologies is presented.

  6. VizieR Online Data Catalog: FARGO_THORIN 1.0 hydrodynamic code (Chrenko+, 2017)

    NASA Astrophysics Data System (ADS)

    Chrenko, O.; Broz, M.; Lambrechts, M.

    2017-07-01

    This archive contains the source files, documentation, and example simulation setups of the FARGO_THORIN 1.0 hydrodynamic code. The program was introduced, described, and used for simulations in the paper. It is built on top of the FARGO code (Masset, 2000A&AS..141..165M; Baruteau & Masset, 2008ApJ...672.1054B) and it is also interfaced with the REBOUND integrator package (Rein & Liu, 2012A&A...537A.128R). THORIN stands for Two-fluid HydrOdynamics, the Rebound integrator Interface and Non-isothermal gas physics. The program is designed for self-consistent investigations of protoplanetary systems consisting of a gas disk, a disk of small solid particles (pebbles), and embedded protoplanets. Code features: I) Non-isothermal gas disk with implicit numerical solution of the energy equation. The implemented energy source terms are: compressional heating, viscous heating, stellar irradiation, vertical escape of radiation, radiative diffusion in the midplane, and radiative feedback to accretion heating of protoplanets. II) Planets evolved in 3D, with close encounters allowed. The orbits are integrated using the IAS15 integrator (Rein & Spiegel, 2015MNRAS.446.1424R). The code detects collisions among planets and resolves them as mergers. III) Refined treatment of the planet-disk gravitational interaction. The code uses a vertical averaging of the gravitational potential, as outlined in Muller & Kley (2012A&A...539A..18M). IV) Pebble disk represented by an Eulerian, pressureless and inviscid fluid. The pebble dynamics is affected by the Epstein gas drag and optionally by diffusive effects. We also implemented the drag back-reaction term in the Navier-Stokes equation for the gas. Archive summary:
      /in_relax - Contains setup of the first example simulation
      /in_wplanet - Contains setup of the second example simulation
      /srcmain - Contains the source files of FARGO_THORIN
      /src_reb - Contains the source files of the REBOUND integrator package, to be linked with THORIN
      GUNGPL3 - GNU General Public License, version 3
      LICENSE - License agreement
      README - Simple user's guide
      UserGuide.pdf - Extended user's guide
      refman.pdf - Programmer's guide
    (1 data file).

  7. CaveMan Enterprise version 1.0 Software Validation and Verification.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hart, David

    The U.S. Department of Energy Strategic Petroleum Reserve stores crude oil in caverns solution-mined in salt domes along the Gulf Coast of Louisiana and Texas. The CaveMan software program has been used since the late 1990s as one tool to analyze pressure measurements monitored at each cavern. The purpose of this monitoring is to catch potential cavern integrity issues as soon as possible. The CaveMan software was written in Microsoft Visual Basic, and embedded in a Microsoft Excel workbook; this method of running the CaveMan software is no longer sustainable. As such, a new version called CaveMan Enterprise has been developed. CaveMan Enterprise version 1.0 does not have any changes to the CaveMan numerical models. CaveMan Enterprise represents, instead, a change from desktop-managed workbooks to an enterprise framework, moving data management into coordinated databases and porting the numerical modeling codes into the Python programming language. This document provides a report of the code validation and verification testing.

  8. PuffinPlot: A versatile, user-friendly program for paleomagnetic analysis

    NASA Astrophysics Data System (ADS)

    Lurcock, P. C.; Wilson, G. S.

    2012-06-01

    PuffinPlot is a user-friendly desktop application for analysis of paleomagnetic data, offering a unique combination of features. It runs on several operating systems, including Windows, Mac OS X, and Linux; supports both discrete and long core data; and facilitates analysis of very weakly magnetic samples. As well as interactive graphical operation, PuffinPlot offers batch analysis for large volumes of data, and a Python scripting interface for programmatic control of its features. Available data displays include demagnetization/intensity, Zijderveld, equal-area (for sample, site, and suite level demagnetization data, and for magnetic susceptibility anisotropy data), a demagnetization data table, and a natural remanent magnetization intensity histogram. Analysis types include principal component analysis, Fisherian statistics, and great-circle path intersections. The results of calculations can be exported as CSV (comma-separated value) files; graphs can be printed, and can also be saved as publication-quality vector files in SVG or PDF format. PuffinPlot is free, and the program, user manual, and fully documented source code may be downloaded from http://code.google.com/p/puffinplot/.
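
    Among the analysis types listed, principal component analysis of demagnetization data reduces to an eigendecomposition of the scatter of the measured remanence vectors. A toy sketch with invented measurements (this shows the general technique, not PuffinPlot's own code):

        import numpy as np

        # Invented demagnetization steps: remanence vectors (x, y, z) decaying
        # roughly along one direction, as for a single-component magnetization.
        steps = np.array([[9.8, 4.9, 2.1],
                          [7.9, 4.1, 1.6],
                          [6.1, 3.0, 1.2],
                          [3.9, 2.0, 0.9],
                          [2.0, 1.1, 0.4]])

        # Principal component: the eigenvector of the scatter matrix with the
        # largest eigenvalue gives the best-fit demagnetization direction.
        centered = steps - steps.mean(axis=0)
        w, v = np.linalg.eigh(centered.T @ centered)
        print(np.round(v[:, -1], 3))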

  9. The 2006 NESCent Phyloinformatics Hackathon: A Field Report

    PubMed Central

    Lapp, Hilmar; Bala, Sendu; Balhoff, James P.; Bouck, Amy; Goto, Naohisa; Holder, Mark; Holland, Richard; Holloway, Alisha; Katayama, Toshiaki; Lewis, Paul O.; Mackey, Aaron J.; Osborne, Brian I.; Piel, William H.; Kosakovsky Pond, Sergei L.; Poon, Art F.Y.; Qiu, Wei-Gang; Stajich, Jason E.; Stoltzfus, Arlin; Thierer, Tobias; Vilella, Albert J.; Vos, Rutger A.; Zmasek, Christian M.; Zwickl, Derrick J.; Vision, Todd J.

    2007-01-01

    In December, 2006, a group of 26 software developers from some of the most widely used life science programming toolkits and phylogenetic software projects converged on Durham, North Carolina, for a Phyloinformatics Hackathon, an intense five-day collaborative software coding event sponsored by the National Evolutionary Synthesis Center (NESCent). The goal was to help researchers to integrate multiple phylogenetic software tools into automated workflows. Participants addressed deficiencies in interoperability between programs by implementing “glue code” and improving support for phylogenetic data exchange standards (particularly NEXUS) across the toolkits. The work was guided by use-cases compiled in advance by both developers and users, and the code was documented as it was developed. The resulting software is freely available for both users and developers through incorporation into the distributions of several widely-used open-source toolkits. We explain the motivation for the hackathon, how it was organized, and discuss some of the outcomes and lessons learned. We conclude that hackathons are an effective mode of solving problems in software interoperability and usability, and are underutilized in scientific software development.

  10. Hybrid MPI/OpenMP Implementation of the ORAC Molecular Dynamics Program for Generalized Ensemble and Fast Switching Alchemical Simulations.

    PubMed

    Procacci, Piero

    2016-06-27

    We present a new release (6.0β) of the ORAC program [Marsili et al. J. Comput. Chem. 2010, 31, 1106-1116] with a hybrid OpenMP/MPI (open multiprocessing/message passing interface) multilevel parallelism tailored for generalized ensemble (GE) and fast switching double annihilation (FS-DAM) nonequilibrium technology, aimed at evaluating the binding free energy in drug-receptor systems on high performance computing platforms. The production of the GE or FS-DAM trajectories is handled using a weak scaling parallel approach on the MPI level only, while a strong scaling force decomposition scheme is implemented for intranode computations with shared memory access at the OpenMP level. The efficiency, simplicity, and inherent parallel nature of the ORAC implementation of the FS-DAM algorithm make the code a potentially effective tool for second-generation high-throughput virtual screening in drug discovery and design. The code, along with documentation, testing, and ancillary tools, is distributed under the provisions of the General Public License and can be freely downloaded at www.chim.unifi.it/orac.
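
    The two-level decomposition can be sketched schematically (a toy Python/mpi4py analogue of the scheme described above; names and numbers are hypothetical, and this is not ORAC source code):

        from mpi4py import MPI
        import numpy as np

        comm = MPI.COMM_WORLD
        rank = comm.Get_rank()     # weak scaling: one GE/FS-DAM replica per MPI rank

        n_atoms = 1000
        rng = np.random.default_rng(seed=rank)
        positions = rng.random((n_atoms, 3))

        def forces(pos):
            """Placeholder intranode force evaluation; in ORAC this level is
            parallelized with OpenMP threads via strong-scaling force decomposition."""
            diff = pos[:, None, :] - pos[None, :, :]
            r2 = (diff**2).sum(-1) + np.eye(len(pos))   # pad diagonal: no self-force
            return (diff / r2[..., None]**1.5).sum(axis=1)

        f = forces(positions)
        # Inter-replica communication (e.g., exchange attempts) stays at the MPI level
        totals = comm.gather(float((f**2).sum()), root=0)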

  11. Field estimates of gravity terrain corrections and Y2K-compatible method to convert from gravity readings with multiple base stations to tide- and long-term drift-corrected observations

    USGS Publications Warehouse

    Plouff, Donald

    2000-01-01

    Gravity observations are directly made or are obtained from other sources by the U.S. Geological Survey in order to prepare maps of the anomalous gravity field and consequently to interpret the subsurface distribution of rock densities and associated lithologic or geologic units. Observations are made in the field with gravity meters at new locations and at reoccupations of previously established gravity "stations." This report illustrates an interactively-prompted series of steps needed to convert gravity "readings" to values that are tied to established gravity datums and includes computer programs to implement those steps. Inasmuch as individual gravity readings have small variations, gravity-meter (instrument) drift may not be smoothly variable, and accommodations may be needed for ties to previously established stations, the reduction process is iterative. Decision-making by the program user is prompted by lists of best values and graphical displays. Notes about irregularities of topography, which affect the value of observed gravity but are not shown in sufficient detail on topographic maps, must be recorded in the field. This report illustrates ways to record field notes (distances, heights, and slope angles) and includes computer programs to convert field notes to gravity terrain corrections. This report includes approaches that may serve as models for other applications, for example: portrayal of system flow; style of quality control to document and validate computer applications; lack of dependence on proprietary software except source code compilation; method of file-searching with a dwindling list; interactive prompting; computer code to write directly in the PostScript (Adobe Systems Incorporated) printer language; and highlighting the four-digit year on the first line of time-dependent data sets for assured Y2K compatibility. Computer source codes provided are written in the Fortran scientific language. In order for the programs to operate, they first must be converted (compiled) into an executable form on the user's computer. Although program testing was done in a UNIX (tradename of American Telephone and Telegraph Company) computer environment, it is anticipated that only a system-dependent date-and-time function may need to be changed for adaptation to other computer platforms that accept standard Fortran code.
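
    The reduction from readings to drift-corrected values can be illustrated with a toy Python sketch (hypothetical numbers and a simple linear drift model between two base-station occupations; the report's actual programs are Fortran and iterate over ties to multiple bases):

        import numpy as np

        # Hypothetical field loop: base station read at start and end.
        times = np.array([0.0, 1.2, 2.5, 3.8, 5.0])         # hours since base tie
        readings = np.array([4321.10, 4325.62, 4319.48,
                             4322.95, 4321.34])              # meter readings, mGal
        tide = np.array([0.03, -0.01, -0.04, 0.00, 0.02])    # Earth-tide correction

        detided = readings - tide
        # Apparent change at the reoccupied base station is instrument drift
        drift_rate = (detided[-1] - detided[0]) / (times[-1] - times[0])
        corrected = detided - drift_rate * (times - times[0])

        # Tie to the established datum at the base station (hypothetical value)
        base_absolute = 979123.45                            # mGal
        observed_gravity = base_absolute + (corrected - corrected[0])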

  12. AVIRIS Reflectance Retrievals: UCSB Users Manual. Appendix 1

    NASA Technical Reports Server (NTRS)

    Roberts, Dar A.; Prentiss, Dylan

    2001-01-01

    The following write-up is designed to help students and researchers take Airborne Visible/Infrared Imaging Spectrometer (AVIRIS) radiance data and retrieve surface reflectance. In the event that the software is not available, but a user has access to a reflectance product, this document is designed to provide a better understanding of how AVIRIS reflectance was retrieved. This guide assumes that the reader has a basic understanding of both the UNIX computing environment and spectroscopy. Knowledge of the Interactive Data Language (IDL) and the Environment for Visualizing Images (ENVI) is helpful. This is a working document, and many of the fine details described in the following pages have been previously undocumented. After having read this document the reader should be able to process AVIRIS data to reflectance, provided access to all of the code is possible. The AVIRIS radiance data itself is pre-processed at the Jet Propulsion Laboratory (JPL) in Pasadena, California. The first section of this paper describes how to read data from tape and byte-swap the data. Section 2 describes the procedure for preparing support files before running the 'h2o' suite of programs. Section 3 describes the four programs used in the process: h2olut9.f, h2ospl9.f, vlsfit9.f and rfl9.f.
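
    The byte-swapping step exists because the tape data are written in big-endian order; a minimal NumPy sketch (hypothetical file name, and a cube shape assumed for illustration; the UCSB processing itself uses Fortran and IDL programs):

        import numpy as np

        # Read big-endian 16-bit radiance values and convert to native byte order.
        samples, lines, bands = 614, 512, 224                # assumed dimensions
        raw = np.fromfile("aviris_scene.raw", dtype=">i2")   # big-endian int16
        cube = raw.reshape(lines, bands, samples)            # BIL interleave
        native = cube.astype(np.int16)                       # byte-swapped to native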

  13. Summary of Pressure Gain Combustion Research at NASA

    NASA Technical Reports Server (NTRS)

    Perkins, H. Douglas; Paxson, Daniel E.

    2018-01-01

    NASA has undertaken a systematic exploration of many different facets of pressure gain combustion over the last 25 years in an effort to exploit the inherent thermodynamic advantage of pressure gain combustion over the constant pressure combustion process used in most aerospace propulsion systems. Applications as varied as small-scale UAVs, rotorcraft, subsonic transports, hypersonics and launch vehicles have been considered. In addition to studying pressure gain combustor concepts such as wave rotors, pulse detonation engines, pulsejets, and rotating detonation engines, NASA has studied inlets, nozzles, ejectors and turbines which must also process unsteady flow in an integrated propulsion system. Other design considerations such as acoustic signature, combustor material life and heat transfer that are unique to pressure gain combustors have also been addressed in NASA research projects. In addition to a wide range of experimental studies, a number of computer codes, from 0-D up through 3-D, have been developed or modified to specifically address the analysis of unsteady flow fields. Loss models have also been developed and incorporated into these codes that improve the accuracy of performance predictions and decrease computational time. These codes have been validated numerous times across a broad range of operating conditions, and it has been found that once validated for one particular pressure gain combustion configuration, these codes are readily adaptable to the others. All in all, the documentation of this work has encompassed approximately 170 NASA technical reports, conference papers and journal articles to date. These publications are very briefly summarized herein, providing a single point of reference for all of NASA's pressure gain combustion research efforts. This documentation does not include the significant contributions made by NASA research staff to the programs of other agencies, universities, industrial partners and professional society committees through serving as technical advisors, technical reviewers and research consultants.

  14. Infrastructure for Multiphysics Software Integration in High Performance Computing-Aided Science and Engineering

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Campbell, Michael T.; Safdari, Masoud; Kress, Jessica E.

    The project described in this report constructed and exercised an innovative multiphysics coupling toolkit called the Illinois Rocstar MultiPhysics Application Coupling Toolkit (IMPACT). IMPACT is an open source, flexible, natively parallel infrastructure for coupling multiple uniphysics simulation codes into multiphysics computational systems. IMPACT works with codes written in several high-performance-computing (HPC) programming languages, and is designed from the beginning for HPC multiphysics code development. It is designed to be minimally invasive to the individual physics codes being integrated, and has few requirements on those physics codes for integration. The goal of IMPACT is to provide the support needed to enable coupling existing tools together in unique and innovative ways to produce powerful new multiphysics technologies without extensive modification and rewrite of the physics packages being integrated. There are three major outcomes from this project: 1) construction, testing, application, and open-source release of the IMPACT infrastructure, 2) production of example open-source multiphysics tools using IMPACT, and 3) identification and engagement of interested organizations in the tools and applications resulting from the project. This last outcome represents the incipient development of a user community and application ecosystem being built using IMPACT. Multiphysics coupling standardization can only come from organizations working together to define needs and processes that span the space of necessary multiphysics outcomes, which Illinois Rocstar plans to continue driving toward. The IMPACT system, including source code, documentation, and test problems, is now available through the public GitHub system to anyone interested in multiphysics code coupling. Many of the basic documents explaining use and architecture of IMPACT are also attached as appendices to this document. Online HTML documentation is available through the GitHub site. There are over 100 unit tests provided that run through the Illinois Rocstar Application Development (IRAD) lightweight testing infrastructure that is also supplied along with IMPACT. The package as a whole provides an excellent base for developing high-quality multiphysics applications using modern software development practices. To facilitate understanding how to utilize IMPACT effectively, two multiphysics systems have been developed and are available open-source through GitHub. The simpler of the two systems, named ElmerFoamFSI in the repository, is a multiphysics, fluid-structure-interaction (FSI) coupling of the solid mechanics package Elmer with a fluid dynamics module from OpenFOAM. This coupling illustrates how to combine software packages that are unrelated by either author or architecture and combine them into a robust, parallel multiphysics system. A more complex multiphysics tool is the Illinois Rocstar Rocstar Multiphysics code that was rebuilt during the project around IMPACT. Rocstar Multiphysics was already an HPC multiphysics tool, but now that it has been rearchitected around IMPACT, it can be readily expanded to capture new and different physics in the future. In fact, during this project, the Elmer and OpenFOAM tools were also coupled into Rocstar Multiphysics and demonstrated. The full Rocstar Multiphysics codebase is also available on GitHub, and licensed for any organization to use as they wish.
Finally, the new IMPACT product is already being used in several multiphysics code coupling projects for the Air Force, NASA and the Missile Defense Agency, and initial work on expansion of the IMPACT-enabled Rocstar Multiphysics has begun in support of a commercial company. These initiatives promise to expand the interest and reach of IMPACT and Rocstar Multiphysics, ultimately leading to the envisioned standardization and consortium of users that was one of the goals of this project.

  15. Description of the AILS Alerting Algorithm

    NASA Technical Reports Server (NTRS)

    Samanant, Paul; Jackson, Mike

    2000-01-01

    This document provides a complete description of the Airborne Information for Lateral Spacing (AILS) alerting algorithms. The purpose of AILS is to provide separation assurance between aircraft during simultaneous approaches to closely spaced parallel runways. AILS will allow independent approaches to be flown in such situations where dependent approaches were previously required (typically under Instrument Meteorological Conditions (IMC)). This is achieved by providing multiple levels of alerting for pairs of aircraft that are in parallel approach situations. This document's scope is comprehensive and covers everything from general overviews, definitions, and concepts down to algorithmic elements and equations. The entire algorithm is presented in complete and detailed pseudo-code format. This can be used by software programmers to program AILS into a software language. Additional supporting information is provided in the form of coordinate frame definitions, data requirements, calling requirements as well as all necessary pre-processing and post-processing requirements. This is important and required information for the implementation of AILS into an analysis, a simulation, or a real-time system.

  16. Grid Stiffened Structure Analysis Tool

    NASA Technical Reports Server (NTRS)

    1999-01-01

    The Grid Stiffened Analysis Tool contract is a contract performed by Boeing under NASA purchase order H30249D. The contract calls for a "best effort" study comprised of two tasks: (1) create documentation for a composite grid-stiffened structure analysis tool, in the form of a Microsoft EXCEL spreadsheet, that was originally developed at Stanford University and later further developed by the Air Force, and (2) write a program that functions as a NASTRAN pre-processor to generate an FEM code for grid-stiffened structure. In performing this contract, Task 1 was given higher priority because it enables NASA to make efficient use of a unique tool it already has; Task 2 was proposed by Boeing because it also would be beneficial to the analysis of composite grid-stiffened structures, specifically in generating models for preliminary design studies. The contract is now complete; this package includes copies of the user's documentation for Task 1 and a CD ROM & diskette with an electronic copy of the user's documentation and an updated version of the "GRID 99" spreadsheet.

  17. The NPG 7120.5A Electronic Review Process

    NASA Technical Reports Server (NTRS)

    McBrayer, Robert; Ives, Mark

    1998-01-01

    The use of electronics to review a document is well within the technical realm of today's state-of-the-art workplace. File servers and web site interaction are common tools for many NASA employees. The electronic comment processing described here was developed for the NPG 7120.5A review to augment the existing NASA Online Directives Information System (NODIS). The NODIS system is NASA's official system for formal review, approval and storage of NASA Directives. The electronic review process worked so well that NASA and other agencies may want to consider it as one of our "best practices." It was participatory decision making at its very best, a process that attracted dozens of very good ideas for improving both the document and the way the Agency manages projects. The revision of NPG 7120.5A has significant implications for the way all elements of the Agency accomplish program and project management. Therefore, the review of NPG 7120.5A was an Agencywide effort with high visibility, heavy participation and a short schedule. The level of involvement created interest in supplementing the formal NODIS system with a system to collect comments efficiently and to allow the Centers and Codes to review and consolidate their comments into the official system in a short period of time. In addition, the Program Management Council Working Group (PMCWG), responsible for the revision of the document and the disposition of official comments, needed an electronic system to manage the disposition of comments, obtain PMCWG consensus on each disposition, and coordinate the disposition with the appropriate Headquarters Code that had submitted the official comment. The combined NASA and contractor talents and resources provided a system that supplemented the NODIS system and its operating personnel to produce a thorough review and approval of NPG 7120.5A on April 3, 1998, 7.5 months from the start of the process. The original schedule called for six months. All milestones occurred on time, except for completion of comment disposition, which required an additional 30 days. Approval of the document occurred sixteen days after completion of the "Purple Package."

  18. Orbital Maneuvering Engine Feed System Coupled Stability Investigation, Computer User's Manual

    NASA Technical Reports Server (NTRS)

    Schuman, M. D.; Fertig, K. W.; Hunting, J. K.; Kahn, D. R.

    1975-01-01

    An operating manual for the feed system coupled stability model is presented, in partial fulfillment of a program designed to develop, verify, and document a digital computer model that can be used to analyze and predict engine/feed system coupled instabilities in pressure-fed storable propellant propulsion systems over a frequency range of 10 to 1,000 Hz. The first section describes the analytical approach to modelling the feed system hydrodynamics, combustion dynamics, chamber dynamics, and overall engineering model structure, and presents the governing equations in each of the technical areas. This is followed by the program user's guide, which is a complete description of the structure and operation of the computerized model. Last, appendices provide an alphabetized FORTRAN symbol table, detailed program logic diagrams, computer code listings, and sample case input and output data listings.

  19. CDC 7600 LTSS programming stratagens: preparing your first production code for the Livermore Timesharing System

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fong, K. W.

    1977-08-15

    This report deals with some techniques in applied programming using the Livermore Timesharing System (LTSS) on the CDC 7600 computers at the National Magnetic Fusion Energy Computer Center (NMFECC) and the Lawrence Livermore Laboratory Computer Center (LLLCC or Octopus network). This report is based on a document originally written specifically about the system as it is implemented at NMFECC but has been revised to accommodate differences between LLLCC and NMFECC implementations. Topics include: maintaining programs, debugging, recovering from system crashes, and using the central processing unit, memory, and input/output devices efficiently and economically. Routines that aid in these procedures are mentioned. The companion report, UCID-17556, An LTSS Compendium, discusses the hardware and operating system and should be read before reading this report.

  20. Connecting Architecture and Implementation

    NASA Astrophysics Data System (ADS)

    Buchgeher, Georg; Weinreich, Rainer

    Software architectures are still typically defined and described independently from implementation. To avoid architectural erosion and drift, architectural representation needs to be continuously updated and synchronized with system implementation. Existing approaches for architecture representation like informal architecture documentation, UML diagrams, and Architecture Description Languages (ADLs) provide only limited support for connecting architecture descriptions and implementations. Architecture management tools like Lattix, SonarJ, and Sotoarc and UML-tools tackle this problem by extracting architecture information directly from code. This approach works for low-level architectural abstractions like classes and interfaces in object-oriented systems but fails to support architectural abstractions not found in programming languages. In this paper we present an approach for linking and continuously synchronizing a formalized architecture representation to an implementation. The approach is a synthesis of functionality provided by code-centric architecture management and UML tools and higher-level architecture analysis approaches like ADLs.

  1. MCNP4A: Features and philosophy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hendricks, J.S.

    This paper describes MCNP, states its philosophy, introduces a number of new features becoming available with version MCNP4A, and answers a number of questions asked by participants in the workshop. MCNP is a general-purpose three-dimensional neutron, photon and electron transport code. Its philosophy is "Quality, Value and New Features." Quality is exemplified by new software quality assurance practices and a program of benchmarking against experiments. Value includes a strong emphasis on documentation and code portability. New features are the third priority. MCNP4A is now available at Los Alamos. New features in MCNP4A include enhanced statistical analysis, distributed processor multitasking, new photon libraries, ENDF/B-VI capabilities, X-Windows graphics, dynamic memory allocation, expanded criticality output, periodic boundaries, plotting of particle tracks via SABRINA, and many other improvements. 23 refs.

  2. Rosetta3: An Object-Oriented Software Suite for the Simulation and Design of Macromolecules

    PubMed Central

    Leaver-Fay, Andrew; Tyka, Michael; Lewis, Steven M.; Lange, Oliver F.; Thompson, James; Jacak, Ron; Kaufman, Kristian; Renfrew, P. Douglas; Smith, Colin A.; Sheffler, Will; Davis, Ian W.; Cooper, Seth; Treuille, Adrien; Mandell, Daniel J.; Richter, Florian; Ban, Yih-En Andrew; Fleishman, Sarel J.; Corn, Jacob E.; Kim, David E.; Lyskov, Sergey; Berrondo, Monica; Mentzer, Stuart; Popović, Zoran; Havranek, James J.; Karanicolas, John; Das, Rhiju; Meiler, Jens; Kortemme, Tanja; Gray, Jeffrey J.; Kuhlman, Brian; Baker, David; Bradley, Philip

    2013-01-01

    We have recently completed a full re-architecturing of the Rosetta molecular modeling program, generalizing and expanding its existing functionality. The new architecture enables the rapid prototyping of novel protocols by providing easy to use interfaces to powerful tools for molecular modeling. The source code of this rearchitecturing has been released as Rosetta3 and is freely available for academic use. At the time of its release, it contained 470,000 lines of code. Counting currently unpublished protocols at the time of this writing, the source includes 1,285,000 lines. Its rapid growth is a testament to its ease of use. This document describes the requirements for our new architecture, justifies the design decisions, sketches out central classes, and highlights a few of the common tasks that the new software can perform. PMID:21187238

  3. Improvements to the fastex flutter analysis computer code

    NASA Technical Reports Server (NTRS)

    Taylor, Ronald F.

    1987-01-01

    Modifications to the FASTEX flutter analysis computer code (UDFASTEX) are described. The objectives were to increase the problem size capacity of FASTEX, reduce run times by modification of the modal interpolation procedure, and to add new user features. All modifications to the program are operable on the VAX 11/700 series computers under the VMS operating system. Interfaces were provided to aid in the inclusion of alternate aerodynamic and flutter eigenvalue calculations. Plots can be made of the flutter velocity, damping, and frequency data. A preliminary capability was also developed to plot contours of unsteady pressure amplitude and phase. The relevant equations of motion, modal interpolation procedures, and control system considerations are described and software developments are summarized. Additional information documenting input instructions, procedures, and details of the plate spline algorithm is found in the appendices.

  4. Noise Measurements of the VAIIPR Fan

    NASA Technical Reports Server (NTRS)

    Mendoza, Jeff; Weir, Don

    2012-01-01

    This final report has been prepared by Honeywell Aerospace, Phoenix, Arizona, a unit of Honeywell International, Inc., documenting work performed during the period September 2004 through November 2005 for the National Aeronautics and Space Administration (NASA) Glenn Research Center, Cleveland, Ohio, under the Revolutionary Aero-Space Engine Research (RASER) Program, Contract No. NAS3-01136, Task Order 6, Noise Measurements of the VAIIPR Fan. The NASA Task Manager was Dr. Joe Grady, NASA Glenn Research Center, Mail Code 60-6, Cleveland, Ohio 44135. The NASA Contract Officer was Mr. Albert Spence, NASA Glenn Research Center, Mail Code 60-6, Cleveland, Ohio 44135. This report focuses on the evaluation of internal fan noise as generated from various inflow disturbances based on measurements made from a circumferential array of sensors located near the fan and sensors upstream of a serpentine inlet.

  5. Enhanced Verification Test Suite for Physics Simulation Codes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kamm, J R; Brock, J S; Brandon, S T

    2008-10-10

    This document discusses problems with which to augment, in quantity and in quality, the existing tri-laboratory suite of verification problems used by Los Alamos National Laboratory (LANL), Lawrence Livermore National Laboratory (LLNL), and Sandia National Laboratories (SNL). The purpose of verification analysis is to demonstrate whether the numerical results of the discretization algorithms in physics and engineering simulation codes provide correct solutions of the corresponding continuum equations. The key points of this document are: (1) Verification deals with mathematical correctness of the numerical algorithms in a code, while validation deals with physical correctness of a simulation in a regime of interest. This document is about verification. (2) The current seven-problem Tri-Laboratory Verification Test Suite, which has been used for approximately five years at the DOE WP laboratories, is limited. (3) Both the methodology for and technology used in verification analysis have evolved and been improved since the original test suite was proposed. (4) The proposed test problems are in three basic areas: (a) hydrodynamics; (b) transport processes; and (c) dynamic strength-of-materials. (5) For several of the proposed problems we provide a 'strong sense verification benchmark', consisting of (i) a clear mathematical statement of the problem with sufficient information to run a computer simulation, (ii) an explanation of how the code result and benchmark solution are to be evaluated, and (iii) a description of the acceptance criterion for simulation code results. (6) It is proposed that the set of verification test problems with which any particular code is evaluated include some of the problems described in this document. Analysis of the proposed verification test problems constitutes part of a necessary--but not sufficient--step that builds confidence in physics and engineering simulation codes. More complicated test cases, including physics models of greater sophistication or other physics regimes (e.g., energetic material response, magneto-hydrodynamics), would represent a scientifically desirable complement to the fundamental test cases discussed in this report. The authors believe that this document can be used to enhance the verification analyses undertaken at the DOE WP Laboratories and, thus, to improve the quality, credibility, and usefulness of the simulation codes that are analyzed with these problems.

  6. A resistive magnetohydrodynamics solver using modern C++ and the Boost library

    NASA Astrophysics Data System (ADS)

    Einkemmer, Lukas

    2016-09-01

    In this paper we describe the implementation of our C++ resistive magnetohydrodynamics solver. The framework developed facilitates the separation of the code implementing the specific numerical method and the physical model from the handling of boundary conditions and the management of the computational domain. In particular, this will allow us to use finite difference stencils which are only defined in the interior of the domain (the boundary conditions are handled automatically). We will discuss this and other design considerations and their impact on performance in some detail. In addition, we provide documentation of the code developed and demonstrate that a performance comparable to Fortran can be achieved, while still maintaining a maximum of code readability and extensibility. Program summary:

    Catalogue identifier: AFAH_v1_0
    Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AFAH_v1_0.html
    Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland
    Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html
    No. of lines in distributed program, including test data, etc.: 592774
    No. of bytes in distributed program, including test data, etc.: 43771395
    Distribution format: tar.gz
    Programming language: C++03
    Computer: PC, HPC systems
    Operating system: POSIX compatible (extensively tested on various Linux systems). In fact only the timing class requires POSIX routines; all other parts of the program can be run on any system where a C++ compiler, Boost, CVODE, and an implementation of BLAS are available.
    RAM: Hundreds of kilobytes to gigabytes (depending on the problem size)
    Classification: 19.10, 4.3
    External routines: Boost, CVODE, either a BLAS library or Intel MKL
    Nature of problem: An approximate solution to the equations of resistive magnetohydrodynamics for a given initial value and given boundary conditions is computed.
    Solution method: The discretization is performed using a finite difference approximation in space and the CVODE library in time (which employs a scheme based on the backward differentiation formulas).
    Restrictions: We consider the 2.5 dimensional case; that is, the magnetic field and the velocity field are three dimensional but all quantities depend only on x and y (but not z).
    Unusual features: We provide an implementation in C++ using the Boost library that combines high level techniques (which greatly increases code maintainability and extensibility) with performance that is comparable to Fortran implementations.
    Running time: From seconds to weeks (depending on the problem size)
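
    The solution method (spatial finite differences plus stiff BDF time integration through CVODE) can be mimicked in a toy method-of-lines example (illustrative only; SciPy's BDF stands in for CVODE, and the problem is reduced to 1D diffusion):

        import numpy as np
        from scipy.integrate import solve_ivp

        nx = 200
        x = np.linspace(0.0, 1.0, nx)
        dx = x[1] - x[0]
        eta = 1.0e-3                    # resistivity-like diffusion coefficient

        def rhs(t, b):
            dbdt = np.zeros_like(b)
            # second-order centered stencil in the interior; fixed boundary values
            dbdt[1:-1] = eta * (b[2:] - 2.0 * b[1:-1] + b[:-2]) / dx**2
            return dbdt

        b0 = np.exp(-((x - 0.5) / 0.1) ** 2)
        sol = solve_ivp(rhs, (0.0, 10.0), b0, method="BDF", rtol=1e-8)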

  7. The Role of Ontologies in Schema-based Program Synthesis

    NASA Technical Reports Server (NTRS)

    Bures, Tomas; Denney, Ewen; Fischer, Bernd; Nistor, Eugen C.

    2004-01-01

    Program synthesis is the process of automatically deriving executable code from (non-executable) high-level specifications. It is more flexible and powerful than conventional code generation techniques that simply translate algorithmic specifications into lower-level code or only create code skeletons from structural specifications (such as UML class diagrams). Key to building a successful synthesis system is specializing to an appropriate application domain. The AUTOBAYES and AUTOFILTER systems, under development at NASA Ames, operate in the two domains of data analysis and state estimation, respectively. The central concept of both systems is the schema, a representation of reusable computational knowledge. This can take various forms, including high-level algorithm templates, code optimizations, datatype refinements, or architectural information. A schema also contains applicability conditions that are used to determine when it can be applied safely. These conditions can refer to the initial specification, to intermediate results, or to elements of the partially-instantiated code. Schema-based synthesis uses AI technology to recursively apply schemas to gradually refine a specification into executable code. This process proceeds in two main phases. A front-end gradually transforms the problem specification into a program represented in an abstract intermediate code. A backend then compiles this further down into a concrete target programming language of choice. A core engine applies schemas to the initial problem specification, then uses the output of those schemas as the input for other schemas, until the full implementation is generated. Since there might be different schemas that implement different solutions to the same problem, this process can generate an entire solution tree. AUTOBAYES and AUTOFILTER have reached the level of maturity where they enable users to solve interesting application problems, e.g., the analysis of Hubble Space Telescope images. They are large (in total around 100kLoC Prolog), knowledge-intensive systems that employ complex symbolic reasoning to generate a wide range of non-trivial programs for complex application domains. Their schemas can have complex interactions, which make it hard to change them in isolation or even understand what an existing schema actually does. Adding more capabilities by increasing the number of schemas will only worsen this situation, ultimately leading to the entropy death of the synthesis system. The root cause of this problem is that the domain knowledge is scattered throughout the entire system and only represented implicitly in the schema implementations. In our current work, we are addressing this problem by making explicit the knowledge from different parts of the synthesis system. Here we discuss how Gruber's definition of an ontology as an explicit specification of a conceptualization matches our efforts in identifying and explicating the domain-specific concepts. We outline the roles ontologies play in schema-based synthesis and argue that they address different audiences and serve different purposes. Their first role is descriptive: they serve as explicit documentation, and help to understand the internal structure of the system. Their second role is prescriptive: they provide the formal basis against which the other parts of the system (e.g., schemas) can be checked.
Their final role is referential: ontologies also provide semantically meaningful "hooks" which allow schemas and tools to access the internal state of the program derivation process (e.g., fragments of the generated code) in domain-specific rather than language-specific terms, and thus to modify it in a controlled fashion. For discussion purposes we use AUTOLINEAR, a small synthesis system we are currently experimenting with, which can generate code for solving a system of linear equations, Az = b.

  8. Patient health record on a smart card.

    PubMed

    Naszlady, A; Naszlady, J

    1998-02-01

    A validated health questionnaire has been used for the documentation of a patient's history (826 items) and of the findings from physical examination (591 items) in our clinical ward for 25 years. This computerized patient record has been completed in EUCLIDES code (CEN TC/251) for laboratory tests and an ATC and EAN code listing for the names of the drugs permanently required by the patient. In addition, emergency data were also included on an EEPROM chipcard with a 24 kb capacity. The program is written in FOX-PRO language. A group of 5000 chronically ill in-patients received these cards which contain their health data. For security reasons the contents of the smart card are only accessible by a doctor's PIN coded key card. The personalization of each card was carried out in our health center and the depersonalized alphanumeric data were collected for further statistical evaluation. This information served as a basis for a real need assessment of health care and for the calculation of its cost. Combined with an optical card, a completely paperless electronic patient record system has been developed containing all three information carriers in medicine: texts, curves and pictures.

  9. The Cloud Feedback Model Intercomparison Project (CFMIP) Diagnostic Codes Catalogue – metrics, diagnostics and methodologies to evaluate, understand and improve the representation of clouds and cloud feedbacks in climate models

    DOE PAGES

    Tsushima, Yoko; Brient, Florent; Klein, Stephen A.; ...

    2017-11-27

    The CFMIP Diagnostic Codes Catalogue assembles cloud metrics, diagnostics and methodologies, together with programs to diagnose them from general circulation model (GCM) outputs, written by various members of the CFMIP community. This aims to facilitate use of the diagnostics by the wider community studying climate and climate change. This paper describes the diagnostics and metrics which are currently in the catalogue, together with examples of their application to model evaluation studies and a summary of some of the insights these diagnostics have provided into the main shortcomings in current GCMs. Analysis of outputs from CFMIP and CMIP6 experiments will also be facilitated by the sharing of diagnostic codes via this catalogue. Any code which implements diagnostics relevant to analysing clouds – including cloud–circulation interactions and the contribution of clouds to estimates of climate sensitivity in models – and which is documented in peer-reviewed studies, can be included in the catalogue. We very much welcome additional contributions to further support community analysis of CMIP6 outputs.

  10. The Cloud Feedback Model Intercomparison Project (CFMIP) Diagnostic Codes Catalogue – metrics, diagnostics and methodologies to evaluate, understand and improve the representation of clouds and cloud feedbacks in climate models

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tsushima, Yoko; Brient, Florent; Klein, Stephen A.

    The CFMIP Diagnostic Codes Catalogue assembles cloud metrics, diagnostics and methodologies, together with programs to diagnose them from general circulation model (GCM) outputs, written by various members of the CFMIP community. This aims to facilitate use of the diagnostics by the wider community studying climate and climate change. This paper describes the diagnostics and metrics which are currently in the catalogue, together with examples of their application to model evaluation studies and a summary of some of the insights these diagnostics have provided into the main shortcomings in current GCMs. Analysis of outputs from CFMIP and CMIP6 experiments will also be facilitated by the sharing of diagnostic codes via this catalogue. Any code which implements diagnostics relevant to analysing clouds – including cloud–circulation interactions and the contribution of clouds to estimates of climate sensitivity in models – and which is documented in peer-reviewed studies, can be included in the catalogue. We very much welcome additional contributions to further support community analysis of CMIP6 outputs.

  11. Computer-Aided Software Engineering - An approach to real-time software development

    NASA Technical Reports Server (NTRS)

    Walker, Carrie K.; Turkovich, John J.

    1989-01-01

    A new software engineering discipline is Computer-Aided Software Engineering (CASE), a technology aimed at automating the software development process. This paper explores the development of CASE technology, particularly in the area of real-time/scientific/engineering software, and a history of CASE is given. The proposed software development environment for the Advanced Launch System (ALS CASE) is described as an example of an advanced software development system for real-time/scientific/engineering (RT/SE) software. The Automated Programming Subsystem of ALS CASE automatically generates executable code and corresponding documentation from a suitably formatted specification of the software requirements. Software requirements are interactively specified in the form of engineering block diagrams. Several demonstrations of the Automated Programming Subsystem are discussed.

  12. CIM explorer--intelligent tool for exploring the ICD Romanian version.

    PubMed

    Filip, F; Haras, C

    2000-01-01

    The CIM Explorer software provides an intelligent interface for exploring the Romanian version of the International Classification of Diseases (in Romanian, Clasificarea Internationala a Maladiilor-CIM). The ICD was transposed from its initial appearance as a printed document into a database. The classification can be accessed in two modes, "Navigation" and "Code", and queried in the "Key words" mode. In the last mode, the CIM Explorer program searches for the right content of the ICD records starting from naturally written expressions which it "understands". As a result, it returns all the records containing the key words regardless of their grammatical form. This program accommodates the specificity of the Romanian language, where words are made up of a root and a flexional termination.
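
    A toy sketch of this kind of root-based matching (the suffix list below is hypothetical and not the program's actual linguistic rules):

        # Strip flexional terminations so a query matches any grammatical form.
        SUFFIXES = ("urilor", "ului", "ilor", "elor", "uri", "ele", "ei", "ul",
                    "a", "e", "i")

        def root(word):
            w = word.lower()
            for s in SUFFIXES:
                if w.endswith(s) and len(w) - len(s) >= 3:
                    return w[: -len(s)]
            return w

        def matches(query, record_text):
            wanted = {root(w) for w in query.split()}
            present = {root(w) for w in record_text.split()}
            return wanted <= present    # every key word present, in any form

        matches("hepatita cronica", "hepatitei cronice virale")   # True (toy data)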

  13. Inventory of Safety-related Codes and Standards for Energy Storage Systems with some Experiences related to Approval and Acceptance

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Conover, David R.

    The purpose of this document is to identify laws, rules, model codes, codes, standards, regulations, and specifications (CSR) related to safety that could apply to stationary energy storage systems (ESS), along with experiences to date in securing approval of ESS in relation to CSR. This information is intended to assist in securing approval of ESS under current CSR and in identifying new CSR, or revisions to existing CSR, and the necessary supporting research and documentation that can foster the deployment of safe ESS.

  14. Parallel implementation of an adaptive and parameter-free N-body integrator

    NASA Astrophysics Data System (ADS)

    Pruett, C. David; Ingham, William H.; Herman, Ralph D.

    2011-05-01

    Previously, Pruett et al. (2003) [3] described an N-body integrator of arbitrarily high order M with an asymptotic operation count of O(MN). The algorithm's structure lends itself readily to data parallelization, which we document and demonstrate here in the integration of point-mass systems subject to Newtonian gravitation. High order is shown to benefit parallel efficiency. The resulting N-body integrator is robust, parameter-free, highly accurate, and adaptive in both time-step and order. Moreover, it exhibits linear speedup on distributed parallel processors, provided that each processor is assigned at least a handful of bodies. Program summary:

    Program title: PNB.f90
    Catalogue identifier: AEIK_v1_0
    Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AEIK_v1_0.html
    Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland
    Licensing provisions: Standard CPC license, http://cpc.cs.qub.ac.uk/licence/licence.html
    No. of lines in distributed program, including test data, etc.: 3052
    No. of bytes in distributed program, including test data, etc.: 68600
    Distribution format: tar.gz
    Programming language: Fortran 90 and OpenMPI
    Computer: All shared or distributed memory parallel processors
    Operating system: Unix/Linux
    Has the code been vectorized or parallelized?: The code has been parallelized but has not been explicitly vectorized.
    RAM: Dependent upon N
    Classification: 4.3, 4.12, 6.5
    Nature of problem: High accuracy numerical evaluation of trajectories of N point masses each subject to Newtonian gravitation.
    Solution method: Parallel and adaptive extrapolation in time via power series of arbitrary degree.
    Running time: 5.1 s for the demo program supplied with the package.

  15. APOLLO: A computer program for the calculation of chemical equilibrium and reaction kinetics of chemical systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nguyen, H.D.

    1991-11-01

    Several of the technologies being evaluated for the treatment of waste material involve chemical reactions. Our example is the in situ vitrification (ISV) process where electrical energy is used to melt soil and waste into a "glass-like" material that immobilizes and encapsulates any residual waste. During the ISV process, various chemical reactions may occur that produce significant amounts of products which must be contained and treated. The APOLLO program was developed to assist in predicting the composition of the gases that are formed. Although the development of this program was directed toward ISV applications, it should be applicable to other technologies where chemical reactions are of interest. This document presents the mathematical methodology of the APOLLO computer code. APOLLO is a computer code that calculates the products of both equilibrium and kinetic chemical reactions. The current version, written in FORTRAN, is readily adaptable to existing transport programs designed for the analysis of chemically reacting flow systems. Separate subroutines, EQREACT and KIREACT, for equilibrium and kinetic chemistry respectively, have been developed. A detailed description of the numerical techniques used, which include both Lagrange multipliers and a third-order integration scheme, is presented. Sample test problems are presented and the results are in excellent agreement with those reported in the literature.
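
    For illustration, equilibrium compositions of the kind APOLLO computes can be obtained by minimizing the Gibbs energy subject to element-balance constraints, which is where the Lagrange multipliers enter; a hedged SciPy sketch with a made-up three-species system (not APOLLO's FORTRAN):

        import numpy as np
        from scipy.optimize import minimize

        # Toy H2/O2/H2O system; the mu0/RT values are hypothetical.
        mu0 = np.array([-10.0, -12.0, -50.0])
        A = np.array([[2, 0, 2],           # H atoms per species [H2, O2, H2O]
                      [0, 2, 1]])          # O atoms per species
        b = A @ np.array([1.0, 0.6, 0.0])  # element totals from the initial mixture

        def gibbs(n):
            n = np.clip(n, 1e-12, None)
            return float(n @ (mu0 + np.log(n / n.sum())))

        res = minimize(gibbs, x0=np.array([0.5, 0.3, 0.4]), method="SLSQP",
                       bounds=[(1e-12, None)] * 3,
                       constraints={"type": "eq", "fun": lambda n: A @ n - b})
        moles = res.x                      # equilibrium composition (toy)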

  16. APOLLO: A computer program for the calculation of chemical equilibrium and reaction kinetics of chemical systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nguyen, H.D.

    1991-11-01

    Several of the technologies being evaluated for the treatment of waste material involve chemical reactions. Our example is the in situ vitrification (ISV) process where electrical energy is used to melt soil and waste into a "glass-like" material that immobilizes and encapsulates any residual waste. During the ISV process, various chemical reactions may occur that produce significant amounts of products which must be contained and treated. The APOLLO program was developed to assist in predicting the composition of the gases that are formed. Although the development of this program was directed toward ISV applications, it should be applicable to other technologies where chemical reactions are of interest. This document presents the mathematical methodology of the APOLLO computer code. APOLLO is a computer code that calculates the products of both equilibrium and kinetic chemical reactions. The current version, written in FORTRAN, is readily adaptable to existing transport programs designed for the analysis of chemically reacting flow systems. Separate subroutines, EQREACT and KIREACT, for equilibrium and kinetic chemistry respectively, have been developed. A detailed description of the numerical techniques used, which include both Lagrange multipliers and a third-order integration scheme, is presented. Sample test problems are presented and the results are in excellent agreement with those reported in the literature.

  17. Technical Basis for PNNL Beryllium Inventory

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Johnson, Michelle Lynn

    2014-07-09

    The Department of Energy (DOE) issued Title 10 of the Code of Federal Regulations Part 850, “Chronic Beryllium Disease Prevention Program” (the Beryllium Rule) in 1999 and required full compliance by no later than January 7, 2002. The Beryllium Rule requires the development of a baseline beryllium inventory of the locations of beryllium operations and other locations of potential beryllium contamination at DOE facilities. The baseline beryllium inventory is also required to identify workers exposed or potentially exposed to beryllium at those locations. Prior to DOE issuing 10 CFR 850, Pacific Northwest National Laboratory (PNNL) had documented the beryllium characterization and worker exposure potential for multiple facilities in compliance with DOE’s 1997 Notice 440.1, “Interim Chronic Beryllium Disease.” After DOE’s issuance of 10 CFR 850, PNNL developed an implementation plan to be compliant by 2002. In 2014, an internal self-assessment (ITS #E-00748) of PNNL’s Chronic Beryllium Disease Prevention Program (CBDPP) identified several deficiencies. One deficiency is that the technical basis for establishing the baseline beryllium inventory when the Beryllium Rule was implemented was either not documented or not retrievable. In addition, the beryllium inventory itself had not been adequately documented and maintained since PNNL established its own CBDPP, separate from the Hanford Site’s program. This document reconstructs PNNL’s baseline beryllium inventory as it would have existed when it achieved compliance with the Beryllium Rule in 2001 and provides the technical basis for the baseline beryllium inventory.

  18. Accelerator System Model (ASM) user manual with physics and engineering model documentation. ASM version 1.0

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    NONE

    1993-07-01

    The Accelerator System Model (ASM) is a computer program developed to model proton radiofrequency accelerators and to carry out system level trade studies. The ASM FORTRAN subroutines are incorporated into an intuitive graphical user interface which provides for the "construction" of the accelerator in a window on the computer screen. The interface is based on the Shell for Particle Accelerator Related Codes (SPARC) software technology written for the Macintosh operating system in the C programming language. This User Manual describes the operation and use of the ASM application within the SPARC interface. The Appendix provides a detailed description of the physics and engineering models used in ASM. ASM Version 1.0 is a joint project of G. H. Gillespie Associates, Inc. and the Accelerator Technology (AT) Division of the Los Alamos National Laboratory. Neither the ASM Version 1.0 software nor this ASM Documentation may be reproduced without the expressed written consent of both the Los Alamos National Laboratory and G. H. Gillespie Associates, Inc.

  19. Nemo: an evolutionary and population genetics programming framework.

    PubMed

    Guillaume, Frédéric; Rougemont, Jacques

    2006-10-15

    Nemo is an individual-based, genetically explicit and stochastic population computer program for the simulation of population genetics and life-history trait evolution in a metapopulation context. It comes as both a C++ programming framework and an executable program file. Its object-oriented programming design gives it the flexibility and extensibility needed to implement a large variety of forward-time evolutionary models. It provides developers with abstract models allowing them to implement their own life-history traits and life-cycle events. Nemo offers a large panel of population models, from the Island model to lattice models with demographic or environmental stochasticity and a variety of already implemented traits (deleterious mutations, neutral markers and more), life-cycle events (mating, dispersal, aging, selection, etc.) and output operators for saving data and statistics. It runs on all major computer platforms including parallel computing environments. The source code, binaries and documentation are available under the GNU General Public License at http://nemo2.sourceforge.net.

  20. Evaluation of Factors Influencing Accuracy of Principal Procedure Coding Based on ICD-9-CM: An Iranian Study

    PubMed Central

    Farzandipour, Mehrdad; Sheikhtaheri, Abbas

    2009-01-01

    To evaluate the accuracy of procedural coding and the factors that influence it, 246 records were randomly selected from four teaching hospitals in Kashan, Iran. “Recodes” were assigned blindly and then compared to the original codes. Furthermore, the coders' professional behaviors were carefully observed during the coding process. Coding errors were classified as major or minor. The relations between coding accuracy and potentially influential factors were analyzed by χ2 or Fisher exact tests as well as the odds ratio (OR) and the 95 percent confidence interval for the OR. The results showed that using a tabular index for rechecking codes reduces errors (83 percent vs. 72 percent accuracy). Further, more thorough documentation by the clinician positively affected coding accuracy, though this relation was not significant. Readability of records decreased errors overall (p = .003), including major ones (p = .012). Moreover, records with no abbreviations had fewer major errors (p = .021). In conclusion, not using abbreviations, ensuring more readable documentation, and paying more attention to available information increased coding accuracy and the quality of procedure databases. PMID:19471647
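
    The odds-ratio interval reported in such studies follows the standard log-OR (Woolf) formula; a small illustrative Python computation with hypothetical 2x2 counts (not the study's data):

        import math

        # Rows: readable vs. less readable records; columns: accurate vs. erroneous codes
        a, b = 40, 10
        c, d = 25, 20

        odds_ratio = (a * d) / (b * c)
        se = math.sqrt(1/a + 1/b + 1/c + 1/d)            # SE of log(OR)
        lo = math.exp(math.log(odds_ratio) - 1.96 * se)
        hi = math.exp(math.log(odds_ratio) + 1.96 * se)
        print(f"OR = {odds_ratio:.2f}, 95% CI = ({lo:.2f}, {hi:.2f})")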

  1. SDDL- SOFTWARE DESIGN AND DOCUMENTATION LANGUAGE

    NASA Technical Reports Server (NTRS)

    Kleine, H.

    1994-01-01

    Effective, efficient communication is an essential element of the software development process. The Software Design and Documentation Language (SDDL) provides an effective communication medium to support the design and documentation of complex software applications. SDDL supports communication between all the members of a software design team and provides for the production of informative documentation on the design effort. Even when an entire development task is performed by a single individual, it is important to explicitly express and document communication between the various aspects of the design effort including concept development, program specification, program development, and program maintenance. SDDL ensures that accurate documentation will be available throughout the entire software life cycle. SDDL offers an extremely valuable capability for the design and documentation of complex programming efforts ranging from scientific and engineering applications to data management and business systems. Throughout the development of a software design, the SDDL generated Software Design Document always represents the definitive word on the current status of the ongoing, dynamic design development process. The document is easily updated and readily accessible in a familiar, informative form to all members of the development team. This makes the Software Design Document an effective instrument for reconciling misunderstandings and disagreements in the development of design specifications, engineering support concepts, and the software design itself. Using the SDDL generated document to analyze the design makes it possible to eliminate many errors that might not be detected until coding and testing is attempted. As a project management aid, the Software Design Document is useful for monitoring progress and for recording task responsibilities. SDDL is a combination of language, processor, and methodology. The SDDL syntax consists of keywords to invoke design structures and a collection of directives which control processor actions. The designer has complete control over the choice of keywords, commanding the capabilities of the processor in a way which is best suited to communicating the intent of the design. The SDDL processor translates the designer's creative thinking into an effective document for communication. The processor performs as many automatic functions as possible, thereby freeing the designer's energy for the creative effort. Document formatting includes graphical highlighting of structure logic, accentuation of structure escapes and module invocations, logic error detection, and special handling of title pages and text segments. The SDDL generated document contains software design summary information including module invocation hierarchy, module cross reference, and cross reference tables of user selected words or phrases appearing in the document. The basic forms of the methodology are module and block structures and the module invocation statement. A design is stated in terms of modules that represent problem abstractions which are complete and independent enough to be treated as separate problem entities. Blocks are lower-level structures used to build the modules. Both kinds of structures may have an initiator part, a terminator part, an escape segment, or a substructure. The SDDL processor is written in PASCAL for batch execution on a DEC VAX series computer under VMS. SDDL was developed in 1981 and last updated in 1984.

  2. Semantic enrichment of medical forms - semi-automated coding of ODM-elements via web services.

    PubMed

    Breil, Bernhard; Watermann, Andreas; Haas, Peter; Dziuballe, Philipp; Dugas, Martin

    2012-01-01

    Semantic interoperability is an unsolved problem that arises when working with medical forms from different information systems or institutions. Standards like ODM or CDA ensure structural homogenization, but in order to compare elements from different data models it is necessary to use semantic concepts and codes at the item level of those structures. We developed and implemented a web-based tool which enables a domain expert to perform semi-automated coding of ODM files. For each item it is possible to query web services which return unique concept codes without leaving the context of the document. Although fully automated coding was not feasible, we implemented a dialog-based method for efficient coding of all data elements in the context of the whole document. The proportion of codable items was comparable to results from previous studies.
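
    As a rough illustration of the inquiry step, the sketch below sends one item label to a terminology service and collects candidate codes for expert review. The endpoint URL and the JSON layout are hypothetical; the paper does not specify the service interface.

        import requests

        def suggest_codes(item_label,
                          endpoint="https://terminology.example.org/search"):
            """Ask a (hypothetical) terminology web service for concept codes."""
            resp = requests.get(endpoint, params={"term": item_label}, timeout=10)
            resp.raise_for_status()
            # Assumed response shape: [{"code": ..., "display": ...}, ...]
            return [(hit["code"], hit["display"]) for hit in resp.json()]

        # The domain expert reviews the candidates in the context of the form:
        for code, display in suggest_codes("systolic blood pressure"):
            print(code, display)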

  3. A Summary of Important Documents in the Field of Research Ethics

    PubMed Central

    Fischer, Bernard A

    2006-01-01

    Today's researchers are obligated to conduct their studies ethically. However, it often seems a daunting task to become familiar with the important ethical codes required to do so. The purpose of this article is to examine the content of those ethical documents most relevant to the biomedical researcher. Documents examined include the Nuremberg Code, the Declaration of Helsinki, Henry Beecher's landmark paper, the Belmont Report, the U.S. Common Rule, the Guideline for Good Clinical Practice, and the National Bioethics Advisory Commission's report on research protections for the mentally ill. PMID:16192409

  4. CTF Theory Manual

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Avramova, Maria N.; Salko, Robert K.

    Coolant-Boiling in Rod Arrays-Two Fluids (COBRA-TF) is a thermal-hydraulic (T/H) simulation code designed for light water reactor (LWR) vessel analysis. It uses a two-fluid, three-field (i.e., fluid film, fluid drops, and vapor) modeling approach. Both sub-channel and 3D Cartesian forms of the nine conservation equations are available for LWR modeling. The code was originally developed by Pacific Northwest Laboratory in 1980 and has been used and modified by several institutions over the last few decades. COBRA-TF also found use at the Pennsylvania State University (PSU) in the Reactor Dynamics and Fuel Management Group (RDFMG) and has been improved, updated, and subsequently re-branded as CTF. As part of the improvement process, it was necessary to generate sufficient documentation for the open-source code, which had lacked such material upon being adopted by RDFMG. This document serves mainly as a theory manual for CTF, detailing the many two-phase heat transfer, drag, and important accident scenario models contained in the code as well as the numerical solution process utilized. Coding of the models is also discussed, all with consideration for updates that have been made in transitioning from COBRA-TF to CTF. Further documentation beyond this manual is also available at RDFMG, covering code input deck generation and source code global variable and module listings.
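
    For orientation, the field equations in a two-fluid, three-field code share the generic conservation-law shape sketched below; this is the general form only, not CTF's exact equations, which the manual derives in full.

        \[
        \frac{\partial}{\partial t}\left(\alpha_k \rho_k\right)
        + \nabla \cdot \left(\alpha_k \rho_k \mathbf{u}_k\right)
        = \Gamma_k
        \]

    Here \(\alpha_k\), \(\rho_k\), and \(\mathbf{u}_k\) are the volume fraction, density, and velocity of field \(k\) (vapor, liquid film, or drops), and \(\Gamma_k\) is the net mass transfer into the field from processes such as evaporation, condensation, entrainment, and de-entrainment.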

  5. Bibliography of groundwater resources of the glacial aquifer systems in Washington, Idaho, and northwestern Montana, 1905-2011

    USGS Publications Warehouse

    Kahle, Sue C.; Futornick, Zoe O.

    2012-01-01

    The U.S. Geological Survey Groundwater Resources Program is undertaking a series of regional groundwater availability studies to improve our understanding of groundwater availability in major aquifers across the Nation. One of the objectives of the Glacial Principal Aquifers study (proposed) is to provide information on the occurrence of groundwater in glacial aquifers in the United States, an area that includes parts of the northern continental States and much of Alaska. Toward this effort, a literature search was conducted to identify readily available documents that describe the occurrence of groundwater in glacial aquifers in the United States. This bibliography provides citations for documents, as well as codes indicating types of information available in each, for Washington, Idaho, and northwestern Montana—an area corresponding approximately to the southern extent of the Cordilleran ice sheet.

  6. Constitutive relations in TRAC-P1A

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rohatgi, U.S.; Saha, P.

    1980-08-01

    The purpose of this document is to describe the basic thermal-hydraulic models and correlations that are in the TRAC-P1A code, as released in March 1979. It is divided into two parts, A and B. Part A describes the models in the three-dimensional vessel module of TRAC, whereas Part B focuses on the loop components that are treated by one-dimensional formulations. The report follows the format of the questions prepared by the Analysis Development Branch of USNRC, and the questionnaire has been attached to this document for completeness. Concerted efforts have been made in understanding the present models in TRAC-P1A by going through the FORTRAN listing of the code. Some discrepancies between the code and the TRAC-P1A manual have been found. These are pointed out in this document. Efforts have also been made to check the TRAC references for the range of applicability of the models and correlations used in the code. 26 refs., 5 figs., 1 tab.

  7. Identifying Falls Risk Screenings Not Documented with Administrative Codes Using Natural Language Processing

    PubMed Central

    Zhu, Vivienne J; Walker, Tina D; Warren, Robert W; Jenny, Peggy B; Meystre, Stephane; Lenert, Leslie A

    2017-01-01

    Quality reporting that relies on coded administrative data alone may not completely and accurately depict providers’ performance. To assess this concern with a test case, we developed and evaluated a natural language processing (NLP) approach to identify falls risk screenings documented in clinical notes of patients without coded falls risk screening data. Extracting information from 1,558 clinical notes (mainly progress notes) from 144 eligible patients, we generated a lexicon of 38 keywords relevant to falls risk screening, 26 terms for pre-negation, and 35 terms for post-negation. The NLP algorithm identified 62 (out of the 144) patients whose falls risk screening was documented only in clinical notes and not coded. Manual review confirmed 59 patients as true positives and 77 patients as true negatives. Our NLP approach scored 0.92 for precision, 0.95 for recall, and 0.93 for F-measure. These results support the concept of utilizing NLP to enhance healthcare quality reporting. PMID:29854264
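
    The sketch below shows the general shape of such lexicon-plus-negation matching and the evaluation metrics; the keyword and negation terms are invented examples, not the study's actual lexicon.

        KEYWORDS = {"fall risk", "morse fall scale", "get up and go"}
        PRE_NEGATION = {"not assessed", "denies", "unable to"}

        def screened(note_text, window=4):
            """Crude check: a keyword is present and not negated just before it."""
            text = " ".join(note_text.lower().split())
            for kw in KEYWORDS:
                idx = text.find(kw)
                if idx == -1:
                    continue
                preceding = " ".join(text[:idx].split()[-window:])
                # Substring test is a simplification; real systems match tokens.
                if not any(neg in preceding for neg in PRE_NEGATION):
                    return True
            return False

        def precision_recall_f1(tp, fp, fn):
            """Standard definitions used to report the study's 0.92/0.95/0.93."""
            p, r = tp / (tp + fp), tp / (tp + fn)
            return p, r, 2 * p * r / (p + r)

        print(screened("morse fall scale completed on admission"))   # True
        print(screened("patient not assessed for fall risk today"))  # False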

  8. Medical surveillance and programs on industrial hygiene at RCRA facilities

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Murphy, T.E.

    1994-12-31

    Some special areas where much progress in industrial hygiene and safety has been made in the past few years are training, personal protective equipment, uniforms, personal monitoring, area monitoring, and medical surveillance. Before one can begin to construct programs for worker protection, some knowledge of potential exposures must be gained. The best place to start is the Waste Analysis Plan and the list of wastes that a particular site is authorized to receive. Waste codes are listed within a facility's Part A and Part B permits. Actual facility receipt of wastes is well documented within load records and other documentation. A facility's training program forms the heart of a health and safety program. Every TSD facility should have developed a matrix of job titles and required training. Every facility must also make a commitment to providing a wide range of personal protective equipment, including a wide array of disposables. Some facilities will benefit from the occasional use of the newer respirator quantitative fit-testing devices. All facilities are urged to rent or borrow this type of equipment periodically. Quantitative respirator fit-testers are capable of revealing important deficiencies in a respirator program. Providing uniforms is a newer means of protecting workers. The use of uniforms is an effective means of addressing the problem of carry-home waste. The use of disposables, including boots, must be integrated into a uniform program if the program is to be effective. In addition, employees must strictly understand that uniforms must not leave the facility at any time, including lunch time.

  9. Advanced information processing system: Hosting of advanced guidance, navigation and control algorithms on AIPS using ASTER

    NASA Technical Reports Server (NTRS)

    Brenner, Richard; Lala, Jaynarayan H.; Nagle, Gail A.; Schor, Andrei; Turkovich, John

    1994-01-01

    This program demonstrated the integration of a number of technologies that can increase the availability and reliability of launch vehicles while lowering costs. Availability is increased with an advanced guidance algorithm that adapts trajectories in real time. Reliability is increased with fault-tolerant computers and communication protocols. Costs are reduced by automatically generating code and documentation. This program was realized through the cooperative efforts of academia, industry, and government. NASA-LaRC coordinated the effort, while Draper performed the integration. Georgia Institute of Technology supplied a weak Hamiltonian finite element method for optimal control problems. Martin Marietta used MATLAB to apply this method to a launch vehicle (FENOC). Draper supplied the fault-tolerant computing and software automation technology. The fault-tolerant technology includes sequential and parallel fault-tolerant processors (FTP & FTPP) and authentication protocols (AP) for communication. Fault-tolerant technology was incrementally incorporated. Development culminated in a heterogeneous network of workstations and fault-tolerant computers using AP. Draper's software automation system, ASTER, was used to specify a static guidance system based on FENOC, navigation, flight control (GN&C), models, and the interface to a mission control user interface. ASTER generated Ada code for GN&C and C code for models. An algebraic transform engine (ATE) was developed to automatically translate MATLAB scripts into ASTER.

  10. Residential Demand Module - NEMS Documentation

    EIA Publications

    2017-01-01

    Model Documentation - Documents the objectives, analytical approach, and development of the National Energy Modeling System (NEMS) Residential Sector Demand Module. The report catalogues and describes the model assumptions, computational methodology, parameter estimation techniques, and FORTRAN source code.

  11. Experimental annotation of post-translational features and translated coding regions in the pathogen Salmonella Typhimurium

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ansong, Charles; Tolic, Nikola; Purvine, Samuel O.

    Complete and accurate genome annotation is crucial for comprehensive and systematic studies of biological systems. For example, systems biology-oriented genome scale modeling efforts greatly benefit from accurate annotation of protein-coding genes to develop properly functioning models. However, determining protein-coding genes for most new genomes is almost completely performed by inference, using computational predictions with significant documented error rates (> 15%). Furthermore, gene prediction programs provide no information on biologically important post-translational processing events critical for protein function. With the ability to directly measure peptides arising from expressed proteins, mass spectrometry-based proteomics approaches can be used to augment and verify coding regions of a genomic sequence and, importantly, detect post-translational processing events. In this study we utilized “shotgun” proteomics to guide accurate primary genome annotation of the bacterial pathogen Salmonella Typhimurium 14028 to facilitate a systems-level understanding of Salmonella biology. The data provide protein-level experimental confirmation for 44% of predicted protein-coding genes, suggest revisions to 48 genes assigned incorrect translational start sites, and uncover 13 non-annotated genes missed by gene prediction programs. We also present a comprehensive analysis of post-translational processing events in Salmonella, revealing a wide range of complex chemical modifications (70 distinct modifications) and confirming more than 130 signal peptide and N-terminal methionine cleavage events in Salmonella. This study highlights several ways in which proteomics data applied during the primary stages of annotation can improve the quality of genome annotations, especially with regard to the annotation of mature protein products.
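
    The confirmation logic at the heart of such a study can be sketched in a few lines: a peptide that falls inside a predicted protein supports that gene, and a peptide matching no prediction flags a possible missed gene. The data below are invented, and a real pipeline searches spectra against a six-frame translation of the genome rather than simple substrings.

        predicted = {                       # hypothetical predicted proteome
            "geneA": "MSKQLTAAGVDRWQK",
            "geneB": "MTTIDGKLLVAAH",
        }
        observed_peptides = ["QLTAAGVDR", "FYLSPEDAR"]

        def classify(peptides, proteins):
            """Split peptides into gene-confirming hits and unexplained orphans."""
            confirmed, orphans = set(), []
            for pep in peptides:
                hits = [name for name, seq in proteins.items() if pep in seq]
                if hits:
                    confirmed.update(hits)
                else:
                    orphans.append(pep)     # candidate non-annotated gene
            return confirmed, orphans

        print(classify(observed_peptides, predicted))
        # ({'geneA'}, ['FYLSPEDAR'])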

  12. Tablet-based cardiac arrest documentation: a pilot study.

    PubMed

    Peace, Jack M; Yuen, Trevor C; Borak, Meredith H; Edelson, Dana P

    2014-02-01

    Conventional paper-based resuscitation transcripts are notoriously inaccurate, often lacking the precision that is necessary for recording a fast-paced resuscitation. The aim of this study was to evaluate whether a tablet computer-based application could improve upon conventional practices for resuscitation documentation. Nurses used either the conventional paper code sheet or a tablet application during simulated resuscitation events. Recorded events were compared to a gold standard record generated from video recordings of the simulations and a CPR-sensing defibrillator/monitor. Events compared included defibrillations, medication deliveries, and other interventions. During the study period, 199 unique interventions were observed in the gold standard record. Of these, 102 occurred during simulations recorded by the tablet application, 78 by the paper code sheet, and 19 during scenarios captured simultaneously by both documentation methods. These occurred over 18 simulated resuscitation scenarios, in which 9 nurses participated. The tablet application had a mean sensitivity of 88.0% for all interventions, compared to 67.9% for the paper code sheet (P=0.001). The median time discrepancy was 3 s for the tablet and 77 s for the paper code sheet when compared to the gold standard (P<0.001). Similar to prior studies, we found that conventional paper-based documentation practices are inaccurate, often misreporting intervention delivery times or missing their delivery entirely. However, our study also demonstrated that a tablet-based documentation method may represent a means to substantially improve resuscitation documentation quality, which could have implications for resuscitation quality improvement and research. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.
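
    Scoring a transcript against a gold-standard timeline reduces to matching events and measuring time gaps, along the lines of the sketch below; the field names and matching window are illustrative assumptions, not the study's protocol.

        from statistics import median

        def score(gold_events, recorded_events, window_s=300):
            """Return (sensitivity, median time discrepancy in seconds)."""
            captured, gaps = 0, []
            for g in gold_events:
                matches = [abs(r["t"] - g["t"]) for r in recorded_events
                           if r["kind"] == g["kind"]
                           and abs(r["t"] - g["t"]) <= window_s]
                if matches:
                    captured += 1
                    gaps.append(min(matches))
            return captured / len(gold_events), median(gaps)

        gold = [{"kind": "defibrillation", "t": 120},
                {"kind": "epinephrine", "t": 185}]
        tablet = [{"kind": "defibrillation", "t": 123},
                  {"kind": "epinephrine", "t": 188}]
        print(score(gold, tablet))   # (1.0, 3)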

  13. A Peer Helpers Code of Behavior.

    ERIC Educational Resources Information Center

    de Rosenroll, David A.

    This document presents a guide for developing a peer helpers code of behavior. The first section discusses issues relevant to the trainers. These issues include whether to give a model directly to the group or whether to engender "ownership" of the code by the group; timing of introduction of the code; and addressing the issue of…

  14. Roanoke College Student Conduct Code 1990-91.

    ERIC Educational Resources Information Center

    Roanoke Coll., VA.

    This Roanoke College (Virginia) 1990-91 conduct code manual is intended for distribution to students. A reproduction of the Academic Integrity and Student Conduct Code Form which all students must sign leads off the document. A section detailing the student conduct code explains the delegation of authority within the institution and describes the…

  15. SSR_pipeline--computer software for the identification of microsatellite sequences from paired-end Illumina high-throughput DNA sequence data

    USGS Publications Warehouse

    Miller, Mark P.; Knaus, Brian J.; Mullins, Thomas D.; Haig, Susan M.

    2013-01-01

    SSR_pipeline is a flexible set of programs designed to efficiently identify simple sequence repeats (SSRs; for example, microsatellites) from paired-end high-throughput Illumina DNA sequencing data. The program suite contains three analysis modules along with a fourth control module that can be used to automate analyses of large volumes of data. The modules are used to (1) identify the subset of paired-end sequences that pass quality standards, (2) align paired-end reads into a single composite DNA sequence, and (3) identify sequences that possess microsatellites conforming to user-specified parameters. Each of the three separate analysis modules also can be used independently to provide greater flexibility or to work with FASTQ or FASTA files generated from other sequencing platforms (Roche 454, Ion Torrent, etc.). All modules are implemented in the Python programming language and can therefore be used from nearly any computer operating system (Linux, Macintosh, Windows). The program suite relies on a compiled Python extension module to perform paired-end alignments. Instructions for compiling the extension from source code are provided in the documentation. Users who do not have Python installed on their computers or who do not have the ability to compile software also may choose to download packaged executable files. These files include all Python scripts, a copy of the compiled extension module, and a minimal installation of Python in a single binary executable. See program documentation for more information.
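
    The heart of module (3) can be pictured with a short, generic sketch: scan a read for perfect tandem repeats whose motif length and repeat count meet user-specified thresholds. This illustrates the idea only and is not SSR_pipeline's actual implementation.

        import re

        def find_ssrs(seq, min_motif=2, max_motif=6, min_repeats=5):
            """Yield (start, motif, repeat_count) for perfect tandem repeats."""
            seq = seq.upper()
            for motif_len in range(min_motif, max_motif + 1):
                pattern = re.compile(
                    r"(([ACGT]{%d})\2{%d,})" % (motif_len, min_repeats - 1))
                for m in pattern.finditer(seq):
                    motif = m.group(2)
                    if len(set(motif)) > 1:            # skip homopolymer runs
                        yield m.start(), motif, len(m.group(1)) // motif_len

        for hit in find_ssrs("AATTACACACACACACGGT"):
            print(hit)    # (4, 'AC', 6)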

  16. Flowgen: Flowchart-based documentation for C++ codes

    NASA Astrophysics Data System (ADS)

    Kosower, David A.; Lopez-Villarejo, J. J.

    2015-11-01

    We present the Flowgen tool, which generates flowcharts from annotated C++ source code. The tool generates a set of interconnected high-level UML activity diagrams, one for each function or method in the C++ sources. It provides a simple and visual overview of complex implementations of numerical algorithms. Flowgen is complementary to the widely used Doxygen documentation tool. The ultimate aim is to render complex C++ computer codes accessible, and to enhance collaboration between programmers and algorithm or science specialists. We describe the tool and a proof-of-concept application to the VINCIA plug-in for simulating collisions at CERN's Large Hadron Collider.
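
    The first pass of any such tool is pulling the annotation comments out of the source. The sketch below assumes a "//$" comment marker for illustration; consult Flowgen's documentation for its actual annotation syntax.

        def extract_annotations(cpp_source):
            """Collect (line number, annotation text) pairs from C++ source."""
            notes = []
            for lineno, line in enumerate(cpp_source.splitlines(), start=1):
                _, marker, comment = line.partition("//$")
                if marker:
                    notes.append((lineno, comment.strip()))
            return notes

        src = """\
        void evolve() {
            //$ pick the winning branching
            select_winner();
            //$ update the shower state
            update_state();
        }
        """
        print(extract_annotations(src))
        # [(2, 'pick the winning branching'), (4, 'update the shower state')]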

  17. An installed nacelle design code using a multiblock Euler solver. Volume 1: Theory document

    NASA Technical Reports Server (NTRS)

    Chen, H. C.

    1992-01-01

    An efficient multiblock Euler design code was developed for designing a nacelle installed on geometrically complex airplane configurations. This approach employed a design driver based on a direct iterative surface curvature method developed at LaRC. A general multiblock Euler flow solver was used for computing flow around complex geometries. The flow solver used a finite-volume formulation with explicit time-stepping to solve the Euler equations. It used a multiblock version of the multigrid method to accelerate the convergence of the calculations. The design driver successively updated the surface geometry to reduce the difference between the computed and target pressure distributions. In the flow solver, the change in surface geometry was simulated by applying surface transpiration boundary conditions to avoid repeated grid generation during design iterations. Smoothness of the designed surface was ensured by alternate application of streamwise and circumferential smoothings. The capability and efficiency of the code were demonstrated through the design of both an isolated nacelle and an installed nacelle at various flow conditions. Information on the execution of the computer program is provided in Volume 2.
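
    A generic residual-driven design iteration of the kind described here nudges each surface ordinate by the local pressure mismatch and then smooths the result. The sketch below is a schematic stand-in with an invented relaxation factor, not the LaRC direct iterative surface curvature method itself.

        def design_step(surface, cp_computed, cp_target, relax=0.05):
            """One inverse-design update: push toward the target pressures."""
            updated = [z + relax * (ct - cc)
                       for z, cc, ct in zip(surface, cp_computed, cp_target)]
            # A simple streamwise smoothing pass keeps the surface fair.
            smoothed = updated[:1]
            smoothed += [(a + 2 * b + c) / 4
                         for a, b, c in zip(updated, updated[1:], updated[2:])]
            smoothed += updated[-1:]
            return smoothed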

  18. The Continuous Intercomparison of Radiation Codes (CIRC): Phase I Cases

    NASA Technical Reports Server (NTRS)

    Oreopoulos, Lazaros; Mlawer, Eli; Delamere, Jennifer; Shippert, Timothy; Turner, David D.; Miller, Mark A.; Minnis, Patrick; Clough, Shepard; Barker, Howard; Ellingson, Robert

    2007-01-01

    CIRC aspires to be the successor to ICRCCM (Intercomparison of Radiation Codes in Climate Models). It is envisioned as an evolving and regularly updated reference source for GCM-type radiative transfer (RT) code evaluation, with the principal goal of contributing to the improvement of RT parameterizations. CIRC is jointly endorsed by DOE's Atmospheric Radiation Measurement (ARM) program and the GEWEX Radiation Panel (GRP). CIRC's goal is to provide test cases for which GCM RT algorithms should be performing at their best, i.e., well characterized clear-sky and homogeneous, overcast cloudy cases. What distinguishes CIRC from previous intercomparisons is that its pool of cases is based on observed datasets. The bulk of atmospheric and surface input as well as radiative fluxes come from ARM observations as documented in the Broadband Heating Rate Profile (BBHRP) product. BBHRP also provides reference calculations from AER's RRTM RT algorithms that can be used to select an optimal set of cases and to provide a first-order estimate of our ability to achieve radiative flux closure given the limitations in our knowledge of the atmospheric state.

  19. The Earth System Documentation (ES-DOC) Software Process

    NASA Astrophysics Data System (ADS)

    Greenslade, M. A.; Murphy, S.; Treshansky, A.; DeLuca, C.; Guilyardi, E.; Denvil, S.

    2013-12-01

    Earth System Documentation (ES-DOC) is an international project supplying high-quality tools and services in support of earth system documentation creation, analysis and dissemination. It is nurturing a sustainable standards-based documentation ecosystem that aims to become an integral part of the next generation of exa-scale dataset archives. ES-DOC leverages open source software, and applies a software development methodology that places end-user narratives at the heart of all it does. ES-DOC has initially focused upon nurturing the Earth System Model (ESM) documentation ecosystem and currently supports the following projects: * Coupled Model Inter-comparison Project Phase 5 (CMIP5); * Dynamical Core Model Inter-comparison Project (DCMIP); * National Climate Predictions and Projections Platforms Quantitative Evaluation of Downscaling Workshop. This talk will demonstrate that ES-DOC implements a relatively mature software development process. Taking a pragmatic Agile process as inspiration, ES-DOC: * iteratively develops and releases working software; * captures user requirements via a narrative-based approach; * uses online collaboration tools (e.g. Earth System CoG) to manage progress; * prototypes applications to validate their feasibility; * leverages meta-programming techniques where appropriate; * automates testing whenever sensibly feasible; * streamlines complex deployments to a single command; * extensively leverages GitHub and Pivotal Tracker; * enforces strict separation of the UI from underlying APIs; * conducts code reviews.

  20. A Generic Software Safety Document Generator

    NASA Technical Reports Server (NTRS)

    Denney, Ewen; Venkatesan, Ram Prasad

    2004-01-01

    Formal certification is based on the idea that a mathematical proof of some property of a piece of software can be regarded as a certificate of correctness which, in principle, can be subjected to external scrutiny. In practice, however, proofs themselves are unlikely to be of much interest to engineers. Nevertheless, it is possible to use the information obtained from a mathematical analysis of software to produce a detailed textual justification of correctness. In this paper, we describe an approach to generating textual explanations from automatically generated proofs of program safety, where the proofs are of compliance with an explicit safety policy that can be varied. Key to this is tracing proof obligations back to the program, and we describe a tool which implements this to certify code auto-generated by AutoBayes and AutoFilter, program synthesis systems under development at the NASA Ames Research Center. Our approach is a step towards combining formal certification with traditional certification methods.

  1. Advanced composites structural concepts and materials technologies for primary aircraft structures. Structural response and failure analysis: ISPAN modules users manual

    NASA Technical Reports Server (NTRS)

    Hairr, John W.; Huang, Jui-Ten; Ingram, J. Edward; Shah, Bharat M.

    1992-01-01

    The ISPAN Program (Interactive Stiffened Panel Analysis) is an interactive design tool that is intended to provide a means of performing simple and self-contained preliminary analysis of aircraft primary structures made of composite materials. The program combines a series of modules with the finite element code DIAL as its backbone. Four ISPAN modules were developed and are documented. These include: (1) flat stiffened panel; (2) curved stiffened panel; (3) flat tubular panel; and (4) curved geodesic panel. Users input geometric and material properties, load information, and the type of analysis (linear, bifurcation buckling, or post-buckling) interactively. Using this information, the program generates a finite element mesh and performs the analysis. The output, in the form of summary tables of stress or margins of safety, contour plots of loads or stress, and deflected-shape plots, may be generated and used to evaluate a specific design.

  2. A Process and Programming Design to Develop Virtual Patients for Medical Education

    PubMed Central

    McGee, James B.; Wu, Martha

    1999-01-01

    Changes in the financing and delivery of healthcare in our nation's teaching hospitals have diminished the variety and quality of a medical student's clinical training. The Virtual Patient Project is a series of computer-based, multimedia, clinical simulations designed to fill this gap. After the development of a successful prototype and obtaining funding for a series of 16 cases, a method to write and produce many virtual patients was created. Case authors now meet with our production team to write and edit a movie-like script. This script is converted into a design document which specifies the clinical aspects, teaching points, media production, and interactivity of each case. The program's code was modularized, using object-oriented techniques, to allow for the variations in cases and for team programming. All of the clinical and teaching content is stored in a database, which allows for faster and easier editing by many persons simultaneously.

  3. RINGMesh: A programming library for developing mesh-based geomodeling applications

    NASA Astrophysics Data System (ADS)

    Pellerin, Jeanne; Botella, Arnaud; Bonneau, François; Mazuyer, Antoine; Chauvin, Benjamin; Lévy, Bruno; Caumon, Guillaume

    2017-07-01

    RINGMesh is a C++ open-source programming library for manipulating discretized geological models. It is designed to ease the development of applications and workflows that use discretized 3D models. It is neither a geomodeler nor a meshing software package. RINGMesh implements functionalities to read discretized surface-based or volumetric structural models and to check their validity. The models can then be exported in various file formats. RINGMesh provides data structures to represent geological structural models, defined by their discretized boundary surfaces and/or by discretized volumes. A programming interface allows the development of new geomodeling methods and the integration of external software. The goal of RINGMesh is to help researchers focus on the implementation of their specific method rather than on tedious tasks common to many applications. The documented code is open source and distributed under the modified BSD license. It is available at https://www.ring-team.org/index.php/software/ringmesh.

  4. BioRuby: bioinformatics software for the Ruby programming language.

    PubMed

    Goto, Naohisa; Prins, Pjotr; Nakao, Mitsuteru; Bonnal, Raoul; Aerts, Jan; Katayama, Toshiaki

    2010-10-15

    The BioRuby software toolkit contains a comprehensive set of free development tools and libraries for bioinformatics and molecular biology, written in the Ruby programming language. BioRuby has components for sequence analysis, pathway analysis, protein modelling and phylogenetic analysis; it supports many widely used data formats and provides easy access to databases, external programs and public web services, including BLAST, KEGG, GenBank, MEDLINE and GO. BioRuby comes with a tutorial, documentation and an interactive environment, which can be used in the shell, and in the web browser. BioRuby is free and open source software, made available under the Ruby license. BioRuby runs on all platforms that support Ruby, including Linux, Mac OS X and Windows. And, with JRuby, BioRuby runs on the Java Virtual Machine. The source code is available from http://www.bioruby.org/. katayama@bioruby.org

  5. The NASA computer aided design and test system

    NASA Technical Reports Server (NTRS)

    Gould, J. M.; Juergensen, K.

    1973-01-01

    A family of computer programs facilitating the design, layout, evaluation, and testing of digital electronic circuitry is described. CADAT (computer aided design and test system) is intended for use by NASA and its contractors and is aimed predominantly at providing cost effective microelectronic subsystems based on custom designed metal oxide semiconductor (MOS) large scale integrated circuits (LSIC's). CADAT software can be easily adopted by installations with a wide variety of computer hardware configurations. Its structure permits ease of update to more powerful component programs and to newly emerging LSIC technologies. The components of the CADAT system are described stressing the interaction of programs rather than detail of coding or algorithms. The CADAT system provides computer aids to derive and document the design intent, includes powerful automatic layout software, permits detailed geometry checks and performance simulation based on mask data, and furnishes test pattern sequences for hardware testing.

  6. Benchmarking of neutron production of heavy-ion transport codes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Remec, I.; Ronningen, R. M.; Heilbronn, L.

    Document available in abstract form only; full text of document follows: Accurate prediction of radiation fields generated by heavy ion interactions is important in medical applications, space missions, and in design and operation of rare isotope research facilities. In recent years, several well-established computer codes in widespread use for particle and radiation transport calculations have been equipped with the capability to simulate heavy ion transport and interactions. To assess and validate these capabilities, we performed simulations of a series of benchmark-quality heavy ion experiments with the computer codes FLUKA, MARS15, MCNPX, and PHITS. We focus on the comparisons of secondary neutron production. Results are encouraging; however, further improvements in models and codes and additional benchmarking are required. (authors)

  7. ImgLib2--generic image processing in Java.

    PubMed

    Pietzsch, Tobias; Preibisch, Stephan; Tomancák, Pavel; Saalfeld, Stephan

    2012-11-15

    ImgLib2 is an open-source Java library for n-dimensional data representation and manipulation with a focus on image processing. It aims at minimizing code duplication by cleanly separating pixel-algebra, data access and data representation in memory. Algorithms can be implemented for classes of pixel types and generic access patterns by which they become independent of the specific dimensionality, pixel type and data representation. ImgLib2 illustrates that an elegant high-level programming interface can be achieved without sacrificing performance. It provides efficient implementations of common data types, storage layouts and algorithms. It is the data model underlying ImageJ2, the KNIME Image Processing toolbox and an increasing number of Fiji plugins. ImgLib2 is licensed under BSD. Documentation and source code are available at http://imglib2.net and in a public repository at https://github.com/imagej/imglib. Supplementary data are available at Bioinformatics online. saalfeld@mpi-cbg.de

  8. Assessment of Literature Related to Combustion Appliance Venting Systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rapp, V. H.; Less, B. D.; Singer, B. C.

    In many residential building retrofit programs, air tightening to increase energy efficiency is often constrained by safety concerns with naturally vented combustion appliances. Tighter residential buildings more readily depressurize when exhaust equipment is operated, making combustion appliances more prone to backdrafting or to spilling combustion exhaust into the living space. Several measures, such as installation guidelines, vent sizing codes, and combustion safety diagnostics, are in place with the intent to prevent backdrafting and combustion spillage, but the diagnostics conflict and the risk mitigation objective is inconsistent. This literature review summarizes the metrics and diagnostics used to assess combustion safety, documents their technical basis, and investigates their risk mitigations. It compiles information from the following: codes for combustion appliance venting and installation; standards and guidelines for combustion safety diagnostics; research evaluating combustion safety diagnostics; research investigating wind effects on building depressurization and venting; and software for simulating vent system performance.

  9. Earth-moon system: Dynamics and parameter estimation; numerical considerations and program documentation

    NASA Technical Reports Server (NTRS)

    Breedlove, W. J., Jr.

    1976-01-01

    Major activities included coding and verifying equations of motion for the earth-moon system. Some attention was also given to numerical integration methods and parameter estimation methods. Existing analytical theories such as Brown's lunar theory, Eckhardt's theory for lunar rotation, and Newcomb's theory for the rotation of the earth were coded and verified. These theories serve as checks for the numerical integration. Laser ranging data for the period January 1969 - December 1975 were collected and stored on tape. The main goal of this research is the development of software to enable physical parameters of the earth-moon system to be estimated making use of data available from the Lunar Laser Ranging Experiment and the Very Long Baseline Interferometry experiment of project Apollo. A more specific goal is to develop software for the estimation of certain physical parameters of the moon such as inertia ratios and the third and fourth harmonic gravity coefficients.

  10. Inside the Hotline: A compilation of 1991 monthly Hotline reports. Annual report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    1992-03-01

    The Resource Conservation and Recovery Act (RCRA)/Superfund (SF)/Office of Underground Storage Tanks (OUST) and Emergency Planning and Community Right-to-Know (EPCRA) Hotlines were established to respond to inquiries from the regulated community and the public concerning waste management and disposal regulations. The Hotline also serves as a referral point on the availability and distribution of program-related documents and published materials. The document is a compilation of questions and answers and Federal Register summaries from individual Monthly Hotline Reports for the period of January to December 1991. It also contains user-friendly indices which are arranged according to subject matter and regulatory and statutory citations. The document can be used by its reader to explore the application of the regulations in different scenarios or to shed light on complex issues. Neither the questions nor the FR summaries are intended to fully represent or be used in place of the regulations. For an understanding of the actual regulatory requirements in any given situation, the reader must consult the appropriate sections of Title 40 of the Code of Federal Regulations, pertinent FR and EPA guidance documents, as well as relevant State regulations.

  11. FLY MPI-2: a parallel tree code for LSS

    NASA Astrophysics Data System (ADS)

    Becciani, U.; Comparato, M.; Antonuccio-Delogu, V.

    2006-04-01

    New version program summary
    Program title: FLY 3.1
    Catalogue identifier: ADSC_v2_0
    Licensing provisions: yes
    Program summary URL: http://cpc.cs.qub.ac.uk/summaries/ADSC_v2_0
    Program obtainable from: CPC Program Library, Queen's University of Belfast, N. Ireland
    No. of lines in distributed program, including test data, etc.: 158 172
    No. of bytes in distributed program, including test data, etc.: 4 719 953
    Distribution format: tar.gz
    Programming language: Fortran 90, C
    Computer: Beowulf cluster, PC, MPP systems
    Operating system: Linux, AIX
    RAM: 100M words
    Catalogue identifier of previous version: ADSC_v1_0
    Journal reference of previous version: Comput. Phys. Comm. 155 (2003) 159
    Does the new version supersede the previous version?: Yes
    Nature of problem: FLY is a parallel collisionless N-body code for the calculation of the gravitational force.
    Solution method: FLY is based on the hierarchical oct-tree domain decomposition introduced by Barnes and Hut (1986).
    Reasons for the new version: The new version of FLY is implemented using the MPI-2 standard; the distributed version 3.1 was developed with the MPICH2 library on a PC Linux cluster. Today the performance of FLY places it among the most powerful parallel codes for tree N-body simulations. Another important new feature is an interface with hydrodynamical Paramesh-based codes. Simulations must follow a box large enough to accurately represent the power spectrum of fluctuations on very large scales, so that results may be meaningfully compared with real data. The number of particles then sets the mass resolution of the simulation, which should be made as fine as possible. An interface between two codes with different and complementary cosmological tasks allows complex cosmological simulations to be executed with FLY, which is specialized for dark-matter evolution, together with a code specialized for hydrodynamical components that uses a Paramesh block structure.
    Summary of revisions: The parallel communication schema was totally changed. The new version adopts the MPICH2 library, so FLY can now be executed on any Unix system with an MPI-2 standard library. The main data structure is declared in a module procedure of FLY (the fly_h.F90 routine). FLY creates an MPI window object for one-sided communication for each of the shared arrays, with a call like the following:

        CALL MPI_WIN_CREATE(POS, SIZE, REAL8, MPI_INFO_NULL, MPI_COMM_WORLD, WIN_POS, IERR)

    The following main window objects are created: win_pos, win_vel and win_acc (particle positions, velocities and accelerations); win_pos_cell, win_mass_cell, win_quad, win_subp and win_grouping (cell positions, masses, quadrupole momenta, tree structure and grouping cells). Other windows are created for dynamic load balance and global counters.
    Restrictions: The program uses the leapfrog integrator schema, but this could be changed by the user.
    Unusual features: FLY uses the MPI-2 standard; the MPICH2 library was adopted on Linux systems. To run this version of FLY, the working directory must be shared among all the processors that execute FLY.
    Additional comments: Full documentation for the program is included in the distribution in the form of a README file, a User Guide and a Reference manuscript.
    Running time: An IBM Linux Cluster 1350 at Cineca (512 nodes, each with 2 processors and 2 GB RAM per processor) was used for performance tests. Processor type: Intel Xeon Pentium IV 3.0 GHz with 512 KB cache (128 nodes have Nocona processors). Internal network: Myricom LAN card, "C" and "D" versions. Operating system: Linux SuSE SLES 8. The code was compiled with the mpif90 compiler version 8.1 and basic optimization options, so that the measured performance can usefully be compared with other generic clusters.

  12. GOMA 6.0

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Schunk, Peter Randall; Rao, Rekha Ranjana; Chen, Ken S

    Goma 6.0 is a finite element program which excels in analyses of multiphysics processes, particularly those involving the major branches of mechanics (viz. fluid/solid mechanics, energy transport and chemical species transport). Goma is based on a full-Newton coupled algorithm which allows for simultaneous solution of the governing principles, making the code ideally suited for problems involving closely coupled bulk mechanics and interfacial phenomena. Example applications include, but are not limited to, coating and polymer processing flows, super-alloy processing, welding/soldering, electrochemical processes, and solid-network or solution film drying. This document serves as a user's guide and reference.
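
    For readers unfamiliar with the term, a full-Newton coupled algorithm assembles the residuals of all governing equations into one nonlinear system and updates every unknown simultaneously; schematically (a generic sketch, not Goma's specific discretization):

        \[
        J\big(u^{(n)}\big)\,\delta u = -R\big(u^{(n)}\big),
        \qquad u^{(n+1)} = u^{(n)} + \delta u
        \]

    where the vector \(u\) stacks the fluid/solid, energy, and species unknowns, \(R\) collects the coupled residuals, and \(J = \partial R / \partial u\) is their Jacobian.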

  13. MILSTAMP TACs: Military Standard Transportation and Movement Procedures Transportation Account Codes. Volume 2

    DTIC Science & Technology

    1987-02-15

    [Scanned document; the OCR of the abstract is largely illegible. A source note reads: "This document contains missing page/s that are unavailable in the original document."]

  14. Space station interior noise analysis program

    NASA Technical Reports Server (NTRS)

    Stusnick, E.; Burn, M.

    1987-01-01

    Documentation is provided for a microcomputer program which was developed to evaluate the effect of the vibroacoustic environment on speech communication inside a space station. The program, entitled Space Station Interior Noise Analysis Program (SSINAP), combines a Statistical Energy Analysis (SEA) prediction of sound and vibration levels within the space station with a speech intelligibility model based on the Modulation Transfer Function and the Speech Transmission Index (MTF/STI). The SEA model provides an effective analysis tool for predicting the acoustic environment based on a proposed space station design. The MTF/STI model provides a method for evaluating speech communication in the relatively reverberant and potentially noisy environments that are likely to occur in space stations. The combination of these two models provides a powerful analysis tool for optimizing the acoustic design of space stations from the point of view of speech communication. The mathematical algorithms used in SSINAP to implement the SEA and MTF/STI models are presented. An appendix provides an explanation of the operation of the program along with details of the program structure and code.
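
    The MTF/STI portion follows a well-known recipe: each modulation transfer value m is converted to an apparent signal-to-noise ratio, clipped, mapped to a 0-1 transmission index, and averaged. The sketch below is a simplified version that omits the octave-band weightings and the full modulation-frequency matrix of the standard method.

        import math

        def sti(modulation_values):
            """Simplified Speech Transmission Index from MTF values (0 < m < 1)."""
            indices = []
            for m in modulation_values:
                snr = 10 * math.log10(m / (1 - m))   # apparent SNR in dB
                snr = max(-15.0, min(15.0, snr))     # clip to +/-15 dB
                indices.append((snr + 15.0) / 30.0)  # map to a 0..1 index
            return sum(indices) / len(indices)

        # m near 1 means the speech envelope survives the room and the noise:
        print(round(sti([0.9, 0.7, 0.5, 0.3]), 2))   # 0.58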

  15. Web Application Software for Ground Operations Planning Database (GOPDb) Management

    NASA Technical Reports Server (NTRS)

    Lanham, Clifton; Kallner, Shawn; Gernand, Jeffrey

    2013-01-01

    A Web application facilitates collaborative development of the ground operations planning document. This will reduce costs and development time for new programs by incorporating the data governance, access control, and revision tracking of the ground operations planning data. Ground Operations Planning requires the creation and maintenance of detailed timelines and documentation. The GOPDb Web application was created using state-of-the-art Web 2.0 technologies, and was deployed as SaaS (Software as a Service), with an emphasis on data governance and security needs. Application access is managed using two-factor authentication, with data write permissions tied to user roles and responsibilities. Multiple instances of the application can be deployed on a Web server to meet the robust needs for multiple, future programs with minimal additional cost. This innovation features high availability and scalability, with no additional software that needs to be bought or installed. For data governance and security (data quality, management, business process management, and risk management for data handling), the software uses NAMS. No local copy/cloning of data is permitted. Data change log/tracking is addressed, as well as collaboration, work flow, and process standardization. The software provides on-line documentation and detailed Web-based help. There are multiple ways that this software can be deployed on a Web server to meet ground operations planning needs for future programs. The software could be used to support commercial crew ground operations planning, as well as commercial payload/satellite ground operations planning. The application source code and database schema are owned by NASA.

  16. Semantic Interoperability for Computational Mineralogy: Experiences of the eMinerals Consortium

    NASA Astrophysics Data System (ADS)

    Walker, A. M.; White, T. O.; Dove, M. T.; Bruin, R. P.; Couch, P. A.; Tyer, R. P.

    2006-12-01

    The use of atomic scale computer simulation of minerals to obtain information for geophysics and environmental science has grown enormously over the past couple of decades. It is now routine to probe mineral behavior in the Earth's deep interior and in the surface environment by borrowing methods and simulation codes from computational chemistry and physics. It is becoming increasingly important to use methods embodied in more than one of these codes to solve any single scientific problem. However, scientific codes are rarely designed for easy interoperability and data exchange; data formats are often code-specific, poorly documented and fragile, liable to frequent change between software versions, and even compiler versions. This means that the scientist's simple desire to use the methodological approaches offered by multiple codes is frustrated, and even the sharing of data between collaborators becomes fraught with difficulties. The eMinerals consortium was formed in the early stages of the UK eScience program with the aim of developing the tools needed to apply atomic scale simulation to environmental problems in a grid-enabled world, and to harness the computational power offered by grid technologies to address some outstanding mineralogical problems. One example of the kind of problem we can tackle is the origin of the compressibility anomaly in silica glass. By passing data directly between simulation and analysis tools we were able to probe this effect in more detail than has previously been possible and have shown how the anomaly is related to the details of the amorphous structure. In order to approach this kind of problem we have constructed a mini-grid, a small-scale and extensible combined compute- and data-grid that allows the execution of many calculations in parallel, and the transparent storage of semantically rich marked-up result data. Importantly, we automatically capture multiple kinds of metadata and key results from each calculation. We believe that the lessons learned and tools developed will be useful in many areas of science beyond computational mineralogy. Key tools that will be described include: a pure Fortran XML library (FoX) that presents XPath, SAX and DOM interfaces as well as permitting the easy production of valid XML from legacy Fortran programs; a job submission framework that automatically schedules calculations to remote grid resources and handles data staging and metadata capture; and a tool (AgentX) that maps concepts from an ontology onto locations in documents of various formats, which we use to enable data exchange.

  17. User Manual for the NASA Glenn Ice Accretion Code LEWICE. Version 2.2.2

    NASA Technical Reports Server (NTRS)

    Wright, William B.

    2002-01-01

    A research project is underway at NASA Glenn to produce a computer code which can accurately predict ice growth under a wide range of meteorological conditions for any aircraft surface. This report presents a description of the code inputs and outputs for version 2.2.2 of this code, which is called LEWICE. This version differs from release 2.0 in the addition of advanced thermal analysis capabilities for de-icing and anti-icing applications using electrothermal heaters or bleed air. An extensive effort was also undertaken to compare the results against the database of electrothermal results generated in the NASA Glenn Icing Research Tunnel (IRT), as was done for the version 2.0 validation effort. This report primarily describes the features of the software related to the use of the program. Appendix A of this report lists some of the inner workings of the software and the physical models used. This information is also available in the form of several unpublished documents internal to NASA. This report is intended as a replacement for all previous user manuals for LEWICE. In addition to describing the changes and improvements made for this version, information from previous manuals may be duplicated so that the user will not need to consult previous manuals to use this code.

  18. Aether: leveraging linear programming for optimal cloud computing in genomics.

    PubMed

    Luber, Jacob M; Tierney, Braden T; Cofer, Evan M; Patel, Chirag J; Kostic, Aleksandar D

    2018-05-01

    Across biology, we are seeing rapid developments in scale of data production without a corresponding increase in data analysis capabilities. Here, we present Aether (http://aether.kosticlab.org), an intuitive, easy-to-use, cost-effective and scalable framework that uses linear programming to optimally bid on and deploy combinations of underutilized cloud computing resources. Our approach simultaneously minimizes the cost of data analysis and provides an easy transition from users' existing HPC pipelines. Data utilized are available at https://pubs.broadinstitute.org/diabimmune and with EBI SRA accession ERP005989. Source code is available at https://github.com/kosticlab/aether. Examples, documentation and a tutorial are available at http://aether.kosticlab.org. chirag_patel@hms.harvard.edu or aleksandar.kostic@joslin.harvard.edu. Supplementary data are available at Bioinformatics online.
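
    The core idea, provisioning the cheapest mix of machines that still covers a workload, is a textbook linear program. The sketch below is illustrative only: the instance data are invented, and Aether's actual model also handles spot-market bidding and job placement.

        from scipy.optimize import linprog

        cost = [0.10, 0.17, 0.38]   # $/hour for three hypothetical instance types
        cpus = [4, 8, 16]           # vCPUs per instance
        ram = [16, 32, 64]          # GB of memory per instance
        need_cpus, need_ram = 64, 192

        res = linprog(
            c=cost,                                        # minimize hourly cost
            A_ub=[[-c for c in cpus], [-r for r in ram]],  # -(supply) <= -(demand)
            b_ub=[-need_cpus, -need_ram],
            bounds=[(0, None)] * 3,
            integrality=[1, 1, 1],                         # whole instances (SciPy >= 1.9)
        )
        print(res.x, res.fun)       # instance counts per type and total $/hour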

  19. Dynamic Analysis of Spur Gear Transmissions (DANST). PC Version 3.00 User Manual

    NASA Technical Reports Server (NTRS)

    Oswald, Fred B.; Lin, Hsiang Hsi; Delgado, Irebert R.

    1996-01-01

    DANST is a FORTRAN computer program for static and dynamic analysis of spur gear systems. The program can be used for parametric studies to predict the static transmission error, dynamic load, tooth bending stress and other properties of spur gears as they are influenced by operating speed, torque, stiffness, damping, inertia, and tooth profile. DANST performs geometric modeling and dynamic analysis for low- or high-contact-ratio spur gears. DANST can simulate gear systems with contact ratios ranging from one to three. It was designed to be easy to use and it is extensively documented in several previous reports and by comments in the source code. This report describes installing and using a new PC version of DANST, covers input data requirements and presents examples.

  20. NGNP Data Management and Analysis System Modeling Capabilities

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cynthia D. Gentillon

    2009-09-01

    Projects for the very-high-temperature reactor (VHTR) program provide data in support of Nuclear Regulatory Commission licensing of the VHTR. Fuel and materials to be used in the reactor are tested and characterized to quantify performance in high temperature and high fluence environments. In addition, thermal-hydraulic experiments are conducted to validate codes used to assess reactor safety. The VHTR Program has established the NGNP Data Management and Analysis System (NDMAS) to ensure that VHTR data are (1) qualified for use, (2) stored in a readily accessible electronic form, and (3) analyzed to extract useful results. This document focuses on the third NDMAS objective. It describes capabilities for displaying the data in meaningful ways and identifying relationships among the measured quantities that contribute to their understanding.
