Sample records for code specifically developed

  1. A proto-code of ethics and conduct for European nurse directors.

    PubMed

    Stievano, Alessandro; De Marinis, Maria Grazia; Kelly, Denise; Filkins, Jacqueline; Meyenburg-Altwarg, Iris; Petrangeli, Mauro; Tschudin, Verena

    2012-03-01

    The proto-code of ethics and conduct for European nurse directors was developed as a strategic and dynamic document for nurse managers in Europe. It invites critical dialogue, reflective thinking about different situations, and the development of specific codes of ethics and conduct by nursing associations in different countries. The term proto-code is used for this document so that country-orientated or organization-based practical codes can be developed from it to guide professionals in more particular or situation-specific reflection and values. The proto-code of ethics and conduct for European nurse directors was designed and developed by the European Nurse Directors Association's (ENDA) advisory team. This article gives short explanations of the code's preamble and two main parts: Nurse directors' ethical basis, and Principles of professional practice, which is divided into six specific points: competence, care, safety, staff, life-long learning and multi-sectorial working.

  2. Methodology, status and plans for development and assessment of TUF and CATHENA codes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Luxat, J.C.; Liu, W.S.; Leung, R.K.

    1997-07-01

    An overview is presented of the Canadian two-fluid computer codes TUF and CATHENA with specific focus on the constraints imposed during development of these codes and the areas of application for which they are intended. Additionally, a process for systematic assessment of these codes is described which is part of a broader, industry-based initiative for validation of computer codes used in all major disciplines of safety analysis. This is intended to provide both the licensee and the regulator in Canada with an objective basis for assessing the adequacy of codes for use in specific applications. Although focused specifically on CANDU reactors, Canadian experience in developing advanced two-fluid codes to meet wide-ranging application needs while maintaining past investment in plant modelling provides a useful contribution to international efforts in this area.

  3. Micromagnetic Code Development of Advanced Magnetic Structures Final Report CRADA No. TC-1561-98

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cerjan, Charles J.; Shi, Xizeng

    The specific goals of this project were to: Further develop the previously written micromagnetic code DADIMAG (DOE code release number 980017); Validate the code. The resulting code was expected to be more realistic and useful for simulations of magnetic structures of specific interest to Read-Rite programs. We also planned to further develop the code for use in internal LLNL programs. This project complemented LLNL CRADA TC-840-94 between LLNL and Read-Rite, which allowed for simulations of the advanced magnetic head development completed under the CRADA. TC-1561-98 was effective concurrently with LLNL non-exclusive copyright license (TL-1552-98) to Read-Rite for DADIMAG Version 2 executable code.

  4. Guidelines for development of structured FORTRAN programs

    NASA Technical Reports Server (NTRS)

    Earnest, B. M.

    1984-01-01

    Computer programming and coding standards were compiled to serve as guidelines for the uniform writing of FORTRAN 77 programs at NASA Langley. Software development philosophy, documentation, general coding conventions, and specific FORTRAN coding constraints are discussed.

  5. Coding conventions and principles for a National Land-Change Modeling Framework

    USGS Publications Warehouse

    Donato, David I.

    2017-07-14

    This report establishes specific rules for writing computer source code for use with the National Land-Change Modeling Framework (NLCMF). These specific rules consist of conventions and principles for writing code primarily in the C and C++ programming languages. Collectively, these coding conventions and coding principles create an NLCMF programming style. In addition to detailed naming conventions, this report provides general coding conventions and principles intended to facilitate the development of high-performance software implemented with code that is extensible, flexible, and interoperable. Conventions for developing modular code are explained in general terms and also enabled and demonstrated through the appended templates for C++ base source-code and header files. The NLCMF limited-extern approach to module structure, code inclusion, and cross-module access to data is both explained in the text and then illustrated through the module templates. Advice on the use of global variables is provided.
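
    Naming conventions of this kind lend themselves to mechanical checking. As a minimal sketch (the two rules below are invented for illustration and are not the NLCMF conventions themselves), a small script can flag identifiers in C/C++ source that break a declared style:

        # Toy convention checker; the rules here are hypothetical, not NLCMF's.
        import re

        RULES = {
            "function": re.compile(r"^[a-z][A-Za-z0-9]*$"),   # e.g. computeRunoff
            "constant": re.compile(r"^[A-Z][A-Z0-9_]*$"),     # e.g. MAX_CELLS
        }

        def check(kind, name):
            ok = bool(RULES[kind].match(name))
            return f"{kind} '{name}': {'ok' if ok else 'violates convention'}"

        print(check("function", "computeRunoff"))   # ok
        print(check("constant", "maxCells"))        # violates convention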

  6. Technical Support Document for Version 3.6.1 of the COMcheck Software

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bartlett, Rosemarie; Connell, Linda M.; Gowri, Krishnan

    2009-09-29

    This technical support document (TSD) is designed to explain the technical basis for the COMcheck software as originally developed based on the ANSI/ASHRAE/IES Standard 90.1-1989 (Standard 90.1-1989). Documentation for other national model codes and standards and specific state energy codes supported in COMcheck has been added to this report as appendices. These appendices are intended to provide technical documentation for features specific to the supported codes and for any changes made for state-specific codes that differ from the standard features that support compliance with the national model codes and standards.

  7. Regulation of mammalian cell differentiation by long non-coding RNAs

    PubMed Central

    Hu, Wenqian; Alvarez-Dominguez, Juan R; Lodish, Harvey F

    2012-01-01

    Differentiation of specialized cell types from stem and progenitor cells is tightly regulated at several levels, both during development and during somatic tissue homeostasis. Many long non-coding RNAs have been recognized as an additional layer of regulation in the specification of cellular identities; these non-coding species can modulate gene-expression programmes in various biological contexts through diverse mechanisms at the transcriptional, translational or messenger RNA stability levels. Here, we summarize findings that implicate long non-coding RNAs in the control of mammalian cell differentiation. We focus on several representative differentiation systems and discuss how specific long non-coding RNAs contribute to the regulation of mammalian development. PMID:23070366

  8. A graphically oriented specification language for automatic code generation. GRASP/Ada: A Graphical Representation of Algorithms, Structure, and Processes for Ada, phase 1

    NASA Technical Reports Server (NTRS)

    Cross, James H., II; Morrison, Kelly I.; May, Charles H., Jr.; Waddel, Kathryn C.

    1989-01-01

    The first phase of a three-phase effort to develop a new graphically oriented specification language which will facilitate the reverse engineering of Ada source code into graphical representations (GRs) as well as the automatic generation of Ada source code is described. A simplified view of the three phases of Graphical Representations for Algorithms, Structure, and Processes for Ada (GRASP/Ada) with respect to three basic classes of GRs is presented. Phase 1 concentrated on the derivation of an algorithmic diagram, the control structure diagram (CSD) (CRO88a) from Ada source code or Ada PDL. Phase 2 includes the generation of architectural and system level diagrams such as structure charts and data flow diagrams and should result in a requirements specification for a graphically oriented language able to support automatic code generation. Phase 3 will concentrate on the development of a prototype to demonstrate the feasibility of this new specification language.

  9. Pre- and Post-Processing Tools to Streamline the CFD Process

    NASA Technical Reports Server (NTRS)

    Dorney, Suzanne Miller

    2002-01-01

    This viewgraph presentation provides information on software development tools to facilitate the use of CFD (Computational Fluid Dynamics) codes. The specific CFD codes FDNS and CORSAIR are profiled, and uses for software development tools with these codes during pre-processing, interim-processing, and post-processing are explained.

  10. Non-coding cancer driver candidates identified with a sample- and position-specific model of the somatic mutation rate

    PubMed Central

    Juul, Malene; Bertl, Johanna; Guo, Qianyun; Nielsen, Morten Muhlig; Świtnicki, Michał; Hornshøj, Henrik; Madsen, Tobias; Hobolth, Asger; Pedersen, Jakob Skou

    2017-01-01

    Non-coding mutations may drive cancer development. Statistical detection of non-coding driver regions is challenged by a varying mutation rate and uncertainty of functional impact. Here, we develop a statistically founded non-coding driver-detection method, ncdDetect, which includes sample-specific mutational signatures, long-range mutation rate variation, and position-specific impact measures. Using ncdDetect, we screened non-coding regulatory regions of protein-coding genes across a pan-cancer set of whole-genomes (n = 505), which top-ranked known drivers and identified new candidates. For individual candidates, presence of non-coding mutations associates with altered expression or decreased patient survival across an independent pan-cancer sample set (n = 5454). This includes an antigen-presenting gene (CD1A), where 5’UTR mutations correlate significantly with decreased survival in melanoma. Additionally, mutations in a base-excision-repair gene (SMUG1) correlate with a C-to-T mutational-signature. Overall, we find that a rich model of mutational heterogeneity facilitates non-coding driver identification and integrative analysis points to candidates of potential clinical relevance. DOI: http://dx.doi.org/10.7554/eLife.21778.001 PMID:28362259
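
    A minimal sketch of the kind of region-level test this implies (a toy with a plain Poisson null and made-up rates; the authors' ncdDetect model additionally layers in mutational signatures and positional impact):

        from scipy.stats import poisson

        def region_pvalue(observed, position_rates, n_samples):
            """P(X >= observed) when the null mutation count is Poisson with
            mean = sum of per-position rates across all samples."""
            expected = n_samples * sum(position_rates)
            return poisson.sf(observed - 1, expected)   # sf(k-1) = P(X >= k)

        # Hypothetical 10-position regulatory region across 505 genomes:
        rates = [2e-6, 5e-6, 1e-6, 8e-6, 2e-6, 3e-6, 1e-6, 4e-6, 2e-6, 6e-6]
        print(region_pvalue(4, rates, 505))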

  11. Formal specification and verification of Ada software

    NASA Technical Reports Server (NTRS)

    Hird, Geoffrey R.

    1991-01-01

    The use of formal methods in software development achieves levels of quality assurance unobtainable by other means. The Larch approach to specification is described, and the specification of avionics software designed to implement the logic of a flight control system is given as an example. Penelope, an Ada verification environment, is described. The Penelope user inputs mathematical definitions, Larch-style specifications and Ada code, and performs machine-assisted proofs that the code obeys its specifications. As an example, the verification of a binary search function is considered. Emphasis is given to techniques assisting the reuse of a verification effort on modified code.
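
    Penelope proves such properties statically; the following is only a lightweight runtime analogue of the same idea (a sketch in Python, not the Larch/Penelope notation): specification clauses checked as assertions around a binary search.

        def binary_search(xs, key):
            # Precondition: xs is sorted in nondecreasing order.
            assert all(xs[i] <= xs[i + 1] for i in range(len(xs) - 1))
            lo, hi = 0, len(xs)
            while lo < hi:
                mid = (lo + hi) // 2
                if xs[mid] < key:
                    lo = mid + 1
                else:
                    hi = mid
            found = lo < len(xs) and xs[lo] == key
            # Postcondition: membership is reported correctly.
            assert found == (key in xs)
            return lo if found else -1

        print(binary_search([1, 3, 5, 7], 5))   # -> 2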

  12. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dobromir Panayotov; Andrew Grief; Brad J. Merrill

    'Fusion for Energy' (F4E) designs, develops and implements the European Test Blanket Systems (TBS) in ITER - Helium-Cooled Lithium-Lead (HCLL) and Helium-Cooled Pebble-Bed (HCPB). Safety demonstration is an essential element for the integration of TBS in ITER, and accident analyses are one of its critical segments. A systematic approach to the accident analyses was acquired under the F4E contract on TBS safety analyses. F4E technical requirements and AMEC and INL efforts resulted in the development of a comprehensive methodology for fusion breeding blanket accident analyses. It addresses the specificity of the breeding blanket designs, materials and phenomena and at the same time is consistent with the methodology already applied to ITER accident analyses. The methodology consists of several phases. First, the reference scenarios are selected on the basis of FMEA studies. Second, in elaborating the accident-analysis specifications, phenomena identification and ranking tables are used to identify the requirements to be met by the code(s) and TBS models. In this way the limitations of the codes are identified and possible solutions to be built into the models are proposed; these include, among others, the loose coupling of different codes or code versions in order to simulate multi-fluid flows and phenomena. Code selection and the issue of the accident-analysis specifications conclude this second step. Next, the breeding blanket and ancillary-system models are built. This paper shares the challenges met and the solutions used in developing both MELCOR and RELAP5 models of the HCLL and HCPB TBSs. The developed models are then qualified by comparison with finite-element analyses, by code-to-code comparison and by sensitivity studies. Finally, the qualified models are used to execute the accident analyses of specific scenarios. Where possible, the methodology phases are illustrated in the paper by a limited number of tables and figures. Detailed descriptions of each phase and its results, as well as applications of the methodology to the EU HCLL and HCPB TBSs, will be published in separate papers. The developed methodology is applicable to accident analyses of other TBSs to be tested in ITER, as well as to DEMO breeding blankets.

  13. FEDEF: A High Level Architecture Federate Development Framework

    DTIC Science & Technology

    2010-09-01

    require code changes for operability between HLA specifications. Configuration of federate requirements such as publications, subscriptions, time ... management, and management protocol should occur outside of federate source code, allowing for federate reusability without code modification and re

  14. Integrating automated structured analysis and design with Ada programming support environments

    NASA Technical Reports Server (NTRS)

    Hecht, Alan; Simmons, Andy

    1986-01-01

    Ada Programming Support Environments (APSE) include many powerful tools that address the implementation of Ada code. These tools do not address the entire software development process. Structured analysis is a methodology that addresses the creation of complete and accurate system specifications. Structured design takes a specification, derives a plan to decompose the system into subcomponents, and provides heuristics to optimize the software design to minimize errors and maintenance; it can also promote the creation of reusable modules. Studies have shown that most software errors result from poor system specifications, and that these errors become more expensive to fix as the development process continues. Structured analysis and design help to uncover errors in the early stages of development. The APSE tools help to ensure that the code produced is correct, and aid in finding obscure coding errors; however, they do not have the capability to detect errors in specifications or to detect poor designs. TEAMWORK, an automated system for structured analysis and design that can be integrated with an APSE to support software systems development from specification through implementation, is described. These tools complement each other to help developers improve quality and productivity, as well as to reduce development and maintenance costs. Complete system documentation and reusable code also result from the use of these tools. Integrating an APSE with automated tools for structured analysis and design provides capabilities and advantages beyond those realized with any of these systems used by themselves.

  15. Automated Translation of Safety Critical Application Software Specifications into PLC Ladder Logic

    NASA Technical Reports Server (NTRS)

    Leucht, Kurt W.; Semmel, Glenn S.

    2008-01-01

    The numerous benefits of automatic application code generation are widely accepted within the software engineering community. A few of these benefits include raising the abstraction level of application programming, shorter product development time, lower maintenance costs, and increased code quality and consistency. Surprisingly, code generation concepts have not yet found wide acceptance and use in the field of programmable logic controller (PLC) software development. Software engineers at the NASA Kennedy Space Center (KSC) recognized the need for PLC code generation while developing their new ground checkout and launch processing system. They developed a process and a prototype software tool that automatically translates a high-level representation or specification of safety critical application software into ladder logic that executes on a PLC. This process and tool are expected to increase the reliability of the PLC code over that which is written manually, and may even lower life-cycle costs and shorten the development schedule of the new control system at KSC. This paper examines the problem domain and discusses the process and software tool that were prototyped by the KSC software engineers.

  16. Wasatch: An architecture-proof multiphysics development environment using a Domain Specific Language and graph theory

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Saad, Tony; Sutherland, James C.

    To address the coding and software challenges of modern hybrid architectures, we propose an approach to multiphysics code development for high-performance computing. This approach is based on using a Domain Specific Language (DSL) in tandem with a directed acyclic graph (DAG) representation of the problem to be solved that allows runtime algorithm generation. When coupled with a large-scale parallel framework, the result is a portable development framework capable of executing on hybrid platforms and handling the challenges of multiphysics applications. In addition, we share our experience developing a code in such an environment – an effort that spans an interdisciplinary team of engineers and computer scientists.

  17. Wasatch: An architecture-proof multiphysics development environment using a Domain Specific Language and graph theory

    DOE PAGES

    Saad, Tony; Sutherland, James C.

    2016-05-04

    To address the coding and software challenges of modern hybrid architectures, we propose an approach to multiphysics code development for high-performance computing. This approach is based on using a Domain Specific Language (DSL) in tandem with a directed acyclic graph (DAG) representation of the problem to be solved that allows runtime algorithm generation. When coupled with a large-scale parallel framework, the result is a portable development framework capable of executing on hybrid platforms and handling the challenges of multiphysics applications. In addition, we share our experience developing a code in such an environment – an effort that spans an interdisciplinary team of engineers and computer scientists.
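
    The core DAG idea can be sketched in a few lines (ours, not Wasatch code): each quantity declares what it needs, and a topological sort derives a valid execution order at runtime instead of it being hand-coded.

        from graphlib import TopologicalSorter

        # Each entry: name -> (dependencies, expression over computed values).
        tasks = {
            "density":  (set(),                    lambda v: 1.2),
            "velocity": (set(),                    lambda v: 3.0),
            "momentum": ({"density", "velocity"},  lambda v: v["density"] * v["velocity"]),
            "kinetic":  ({"momentum", "velocity"}, lambda v: 0.5 * v["momentum"] * v["velocity"]),
        }

        values = {}
        graph = {name: deps for name, (deps, _) in tasks.items()}
        for name in TopologicalSorter(graph).static_order():
            values[name] = tasks[name][1](values)
        print(values["kinetic"])   # 0.5 * (1.2 * 3.0) * 3.0 = 5.4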

  18. Guide to Permitting Hydrogen Motor Fuel Dispensing Facilities

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rivkin, Carl; Buttner, William; Burgess, Robert

    2016-03-28

    The purpose of this guide is to assist project developers, permitting officials, code enforcement officials, and other parties involved in developing permit applications and approving the implementation of hydrogen motor fuel dispensing facilities. The guide facilitates the identification of the elements to be addressed in the permitting of a project as it progresses through the approval process; the specific requirements associated with those elements; and the applicable (or potentially applicable) codes and standards by which to determine whether the specific requirements have been met. The guide attempts to identify all applicable codes and standards relevant to the permitting requirements.

  19. Design implications for task-specific search utilities for retrieval and re-engineering of code

    NASA Astrophysics Data System (ADS)

    Iqbal, Rahat; Grzywaczewski, Adam; Halloran, John; Doctor, Faiyaz; Iqbal, Kashif

    2017-05-01

    The importance of information retrieval systems is unquestionable in the modern society and both individuals as well as enterprises recognise the benefits of being able to find information effectively. Current code-focused information retrieval systems such as Google Code Search, Codeplex or Koders produce results based on specific keywords. However, these systems do not take into account developers' context such as development language, technology framework, goal of the project, project complexity and developer's domain expertise. They also impose additional cognitive burden on users in switching between different interfaces and clicking through to find the relevant code. Hence, they are not used by software developers. In this paper, we discuss how software engineers interact with information and general-purpose information retrieval systems (e.g. Google, Yahoo!) and investigate to what extent domain-specific search and recommendation utilities can be developed in order to support their work-related activities. In order to investigate this, we conducted a user study and found that software engineers followed many identifiable and repeatable work tasks and behaviours. These behaviours can be used to develop implicit relevance feedback-based systems based on the observed retention actions. Moreover, we discuss the implications for the development of task-specific search and collaborative recommendation utilities embedded with the Google standard search engine and Microsoft IntelliSense for retrieval and re-engineering of code. Based on implicit relevance feedback, we have implemented a prototype of the proposed collaborative recommendation system, which was evaluated in a controlled environment simulating the real-world situation of professional software engineers. The evaluation has achieved promising initial results on the precision and recall performance of the system.
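
    One common way to operationalize such implicit feedback is a Rocchio-style update (a toy sketch under that assumption; the paper does not publish its exact formula): retained results pull the query vector toward themselves before re-scoring.

        import numpy as np

        def rerank(query, docs, retained, alpha=1.0, beta=0.75):
            # Retained (e.g. copied or saved) documents act as positive feedback.
            centroid = np.mean([docs[i] for i in retained], axis=0)
            adjusted = alpha * query + beta * centroid        # Rocchio update
            def cosine(v):
                return float(np.dot(v, adjusted) /
                             (np.linalg.norm(v) * np.linalg.norm(adjusted)))
            return sorted(docs, key=lambda i: cosine(docs[i]), reverse=True)

        docs = {0: np.array([1.0, 0.0]), 1: np.array([0.6, 0.8]), 2: np.array([0.0, 1.0])}
        print(rerank(np.array([1.0, 0.2]), docs, retained=[2]))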

  20. Generating Safety-Critical PLC Code From a High-Level Application Software Specification

    NASA Technical Reports Server (NTRS)

    2008-01-01

    The benefits of automatic-application code generation are widely accepted within the software engineering community. These benefits include raised abstraction level of application programming, shorter product development time, lower maintenance costs, and increased code quality and consistency. Surprisingly, code generation concepts have not yet found wide acceptance and use in the field of programmable logic controller (PLC) software development. Software engineers at Kennedy Space Center recognized the need for PLC code generation while developing the new ground checkout and launch processing system, called the Launch Control System (LCS). Engineers developed a process and a prototype software tool that automatically translates a high-level representation or specification of application software into ladder logic that executes on a PLC. All the computer hardware in the LCS is planned to be commercial off the shelf (COTS), including industrial controllers or PLCs that are connected to the sensors and end items out in the field. Most of the software in LCS is also planned to be COTS, with only small adapter software modules that must be developed in order to interface between the various COTS software products. A domain-specific language (DSL) is a programming language designed to perform tasks and to solve problems in a particular domain, such as ground processing of launch vehicles. The LCS engineers created a DSL for developing test sequences of ground checkout and launch operations of future launch vehicle and spacecraft elements, and they are developing a tabular specification format that uses the DSL keywords and functions familiar to the ground and flight system users. The tabular specification format, or tabular spec, allows most ground and flight system users to document how the application software is intended to function and requires little or no software programming knowledge or experience. A small sample from a prototype tabular spec application is shown.
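
    In spirit, the translation step can be as simple as the following sketch (the signal names and table format are invented for illustration and are not the KSC tool's): each tabular-spec row contributes one rung of contacts driving an output coil.

        # Toy spec-to-ladder translator; row fields and signal names are hypothetical.
        spec_rows = [
            {"when": "TANK_PRESSURE_OK", "and_": "VALVE_CMD_OPEN", "then": "OPEN_VENT_VALVE"},
            {"when": "E_STOP",           "and_": None,             "then": "CLOSE_ALL_VALVES"},
        ]

        def generate_rungs(rows):
            for n, row in enumerate(rows, 1):
                contacts = f"--[ {row['when']} ]--"
                if row["and_"]:
                    contacts += f"[ {row['and_']} ]--"
                yield f"Rung {n}: {contacts}( {row['then']} )"

        print("\n".join(generate_rungs(spec_rows)))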

  1. Turbine Internal and Film Cooling Modeling For 3D Navier-Stokes Codes

    NASA Technical Reports Server (NTRS)

    DeWitt, Kenneth; Garg, Vijay; Ameri, Ali

    2005-01-01

    The aim of this research project is to make use of NASA Glenn on-site computational facilities in order to develop, validate and apply aerodynamic, heat transfer, and turbine cooling models for use in advanced 3D Navier-Stokes Computational Fluid Dynamics (CFD) codes such as the Glenn-HT code. Specific areas of effort include: application of the Glenn-HT code to specific configurations made available under the Turbine Based Combined Cycle (TBCC) and Ultra Efficient Engine Technology (UEET) projects; and validating the use of a multi-block code for the time-accurate computation of the detailed flow and heat transfer of cooled turbine airfoils. The goal of the current research is to improve the predictive ability of the Glenn-HT code. This will enable one to design more efficient turbine components for both aviation and power generation. The models will be tested against specific configurations provided by NASA Glenn.

  2. Technical Support Document for Version 3.4.0 of the COMcheck Software

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bartlett, Rosemarie; Connell, Linda M.; Gowri, Krishnan

    2007-09-14

    COMcheck provides an optional way to demonstrate compliance with commercial and high-rise residential building energy codes. Commercial buildings include all use groups except single family and multifamily not over three stories in height. COMcheck was originally based on ANSI/ASHRAE/IES Standard 90.1-1989 (Standard 90.1-1989) requirements and is intended for use with various codes based on Standard 90.1, including the Codification of ASHRAE/IES Standard 90.1-1989 (90.1-1989 Code) (ASHRAE 1989a, 1993b) and ASHRAE/IESNA Standard 90.1-1999 (Standard 90.1-1999). This includes jurisdictions that have adopted the 90.1-1989 Code, Standard 90.1-1989, Standard 90.1-1999, or their own code based on one of these. We view Standard 90.1-1989 and the 90.1-1989 Code as having equivalent technical content and have used both as source documents in developing COMcheck. This technical support document (TSD) is designed to explain the technical basis for the COMcheck software as originally developed based on the ANSI/ASHRAE/IES Standard 90.1-1989 (Standard 90.1-1989). Documentation for other national model codes and standards and specific state energy codes supported in COMcheck has been added to this report as appendices. These appendices are intended to provide technical documentation for features specific to the supported codes and for any changes made for state-specific codes that differ from the standard features that support compliance with the national model codes and standards.

  3. Development and validation of a registry-based definition of eosinophilic esophagitis in Denmark

    PubMed Central

    Dellon, Evan S; Erichsen, Rune; Pedersen, Lars; Shaheen, Nicholas J; Baron, John A; Sørensen, Henrik T; Vyberg, Mogens

    2013-01-01

    AIM: To develop and validate a case definition of eosinophilic esophagitis (EoE) in the linked Danish health registries. METHODS: For case definition development, we queried the Danish medical registries from 2006-2007 to identify candidate cases of EoE in Northern Denmark. All International Classification of Diseases-10 (ICD-10) and prescription codes were obtained, and archived pathology slides were obtained and re-reviewed to determine case status. We used an iterative process to select inclusion/exclusion codes, refine the case definition, and optimize sensitivity and specificity. We then re-queried the registries from 2008-2009 to yield a validation set. The case definition algorithm was applied, and sensitivity and specificity were calculated. RESULTS: Of the 51 and 49 candidate cases identified in both the development and validation sets, 21 and 24 had EoE, respectively. Characteristics of EoE cases in the development set [mean age 35 years; 76% male; 86% dysphagia; 103 eosinophils per high-power field (eos/hpf)] were similar to those in the validation set (mean age 42 years; 83% male; 67% dysphagia; 77 eos/hpf). Re-review of archived slides confirmed that the pathology coding for esophageal eosinophilia was correct in greater than 90% of cases. Two registry-based case algorithms based on pathology, ICD-10, and pharmacy codes were successfully generated in the development set, one that was sensitive (90%) and one that was specific (97%). When these algorithms were applied to the validation set, they remained sensitive (88%) and specific (96%). CONCLUSION: Two registry-based definitions, one highly sensitive and one highly specific, were developed and validated for the linked Danish national health databases, making future population-based studies feasible. PMID:23382628
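
    The validation arithmetic is the standard 2x2 computation; the counts below are illustrative values consistent with the reported 88%/96%, not the paper's published table.

        def sensitivity(tp, fn):
            return tp / (tp + fn)      # fraction of true cases the algorithm catches

        def specificity(tn, fp):
            return tn / (tn + fp)      # fraction of non-cases it correctly excludes

        # e.g. 24 true EoE cases and 25 non-cases in a validation set:
        print(round(sensitivity(tp=21, fn=3), 2))    # 0.88
        print(round(specificity(tn=24, fp=1), 2))    # 0.96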

  4. Deductive Glue Code Synthesis for Embedded Software Systems Based on Code Patterns

    NASA Technical Reports Server (NTRS)

    Liu, Jian; Fu, Jicheng; Zhang, Yansheng; Bastani, Farokh; Yen, I-Ling; Tai, Ann; Chau, Savio N.

    2006-01-01

    Automated code synthesis is a constructive process that can be used to generate programs from specifications. It can, thus, greatly reduce the software development cost and time. The use of formal code synthesis approach for software generation further increases the dependability of the system. Though code synthesis has many potential benefits, the synthesis techniques are still limited. Meanwhile, components are widely used in embedded system development. Applying code synthesis to component based software development (CBSD) process can greatly enhance the capability of code synthesis while reducing the component composition efforts. In this paper, we discuss the issues and techniques for applying deductive code synthesis techniques to CBSD. For deductive synthesis in CBSD, a rule base is the key for inferring appropriate component composition. We use the code patterns to guide the development of rules. Code patterns have been proposed to capture the typical usages of the components. Several general composition operations have been identified to facilitate systematic composition. We present the technique for rule development and automated generation of new patterns from existing code patterns. A case study of using this method in building a real-time control system is also presented.

  5. Improving the accuracy of operation coding in surgical discharge summaries

    PubMed Central

    Martinou, Eirini; Shouls, Genevieve; Betambeau, Nadine

    2014-01-01

    Procedural coding in surgical discharge summaries is extremely important; as well as communicating to healthcare staff which procedures have been performed, it also provides information that is used by the hospital's coding department. The OPCS code (Office of Population, Censuses and Surveys Classification of Surgical Operations and Procedures) is used to generate the tariff that allows the hospital to be reimbursed for the procedure. We felt that the OPCS coding on discharge summaries was often incorrect within our breast and endocrine surgery department. A baseline measurement over two months demonstrated that 32% of operations had been incorrectly coded, resulting in an incorrect tariff being applied and an estimated loss to the Trust of £17,000. We developed a simple but specific OPCS coding table in collaboration with the clinical coding team and breast surgeons that summarised all operations performed within our department. This table was disseminated across the team, specifically to the junior doctors who most frequently complete the discharge summaries. Re-audit showed 100% of operations were accurately coded, demonstrating the effectiveness of the coding table. We suggest that specifically designed coding tables be introduced across each surgical department to ensure accurate OPCS codes are used to produce better quality surgical discharge summaries and to ensure correct reimbursement to the Trust. PMID:26734286

  6. 48 CFR 252.227-7013 - Rights in technical data-Noncommercial items.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... causing a computer to perform a specific operation or series of operations. (3) Computer software means computer programs, source code, source code listings, object code listings, design details, algorithms... or will be developed exclusively with Government funds; (ii) Studies, analyses, test data, or similar...

  7. 48 CFR 252.227-7013 - Rights in technical data-Noncommercial items.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... causing a computer to perform a specific operation or series of operations. (3) Computer software means computer programs, source code, source code listings, object code listings, design details, algorithms... or will be developed exclusively with Government funds; (ii) Studies, analyses, test data, or similar...

  8. 48 CFR 252.227-7013 - Rights in technical data-Noncommercial items.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... causing a computer to perform a specific operation or series of operations. (3) Computer software means computer programs, source code, source code listings, object code listings, design details, algorithms... or will be developed exclusively with Government funds; (ii) Studies, analyses, test data, or similar...

  9. 48 CFR 252.227-7013 - Rights in technical data-Noncommercial items.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... causing a computer to perform a specific operation or series of operations. (3) Computer software means computer programs, source code, source code listings, object code listings, design details, algorithms... developed exclusively with Government funds; (ii) Studies, analyses, test data, or similar data produced for...

  10. On transform coding tools under development for VP10

    NASA Astrophysics Data System (ADS)

    Parker, Sarah; Chen, Yue; Han, Jingning; Liu, Zoe; Mukherjee, Debargha; Su, Hui; Wang, Yongzhe; Bankoski, Jim; Li, Shunyao

    2016-09-01

    Google started the WebM Project in 2010 to develop open source, royalty-free video codecs designed specifically for media on the Web. The second generation codec released by the WebM project, VP9, is currently served by YouTube, and enjoys billions of views per day. Realizing the need for even greater compression efficiency to cope with the growing demand for video on the web, the WebM team embarked on an ambitious project to develop a next-edition codec, VP10, that achieves at least a generational improvement in coding efficiency over VP9. Starting from VP9, a set of new experimental coding tools have already been added to VP10 to achieve decent coding gains. Subsequently, Google joined a consortium of major tech companies called the Alliance for Open Media to jointly develop a new codec, AV1. As a result, the VP10 effort is largely expected to merge with AV1. In this paper, we focus primarily on new tools in VP10 that improve coding of the prediction residue using transform coding techniques. Specifically, we describe tools that increase the flexibility of available transforms, allowing the codec to handle a more diverse range of residue structures. Results are presented on a standard test set.
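
    A toy version of per-block transform flexibility (illustrative only; VP10's actual transform set and rate-distortion search are more elaborate): pick whichever 1-D transform packs the residual's energy into the fewest significant coefficients.

        import numpy as np
        from scipy.fft import dct, dst

        def best_transform(residual, threshold=0.1):
            candidates = {"DCT-II": dct(residual, norm="ortho"),
                          "DST-II": dst(residual, norm="ortho")}
            # Cost: number of coefficients above threshold (a crude rate proxy).
            costs = {k: int(np.sum(np.abs(c) > threshold)) for k, c in candidates.items()}
            return min(costs, key=costs.get), costs

        ramp = np.linspace(0.0, 1.0, 8)   # a smooth synthetic residual block
        print(best_transform(ramp))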

  11. Technical Support Document for Version 3.9.0 of the COMcheck Software

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bartlett, Rosemarie; Connell, Linda M.; Gowri, Krishnan

    2011-09-01

    COMcheck provides an optional way to demonstrate compliance with commercial and high-rise residential building energy codes. Commercial buildings include all use groups except single family and multifamily not over three stories in height. COMcheck was originally based on ANSI/ASHRAE/IES Standard 90.1-1989 (Standard 90.1-1989) requirements and is intended for use with various codes based on Standard 90.1, including the Codification of ASHRAE/IES Standard 90.1-1989 (90.1-1989 Code) (ASHRAE 1989a, 1993b) and ASHRAE/IESNA Standard 90.1-1999 (Standard 90.1-1999). This includes jurisdictions that have adopted the 90.1-1989 Code, Standard 90.1-1989, Standard 90.1-1999, or their own code based on one of these. We view Standard 90.1-1989 and the 90.1-1989 Code as having equivalent technical content and have used both as source documents in developing COMcheck. This technical support document (TSD) is designed to explain the technical basis for the COMcheck software as originally developed based on the ANSI/ASHRAE/IES Standard 90.1-1989 (Standard 90.1-1989). Documentation for other national model codes and standards and specific state energy codes supported in COMcheck has been added to this report as appendices. These appendices are intended to provide technical documentation for features specific to the supported codes and for any changes made for state-specific codes that differ from the standard features that support compliance with the national model codes and standards. Beginning with COMcheck version 3.8.0, support for 90.1-1989, 90.1-1999, and the 1998 IECC is no longer included, but those sections remain in this document for reference purposes.

  12. Technical Support Document for Version 3.9.1 of the COMcheck Software

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bartlett, Rosemarie; Connell, Linda M.; Gowri, Krishnan

    2012-09-01

    COMcheck provides an optional way to demonstrate compliance with commercial and high-rise residential building energy codes. Commercial buildings include all use groups except single family and multifamily not over three stories in height. COMcheck was originally based on ANSI/ASHRAE/IES Standard 90.1-1989 (Standard 90.1-1989) requirements and is intended for use with various codes based on Standard 90.1, including the Codification of ASHRAE/IES Standard 90.1-1989 (90.1-1989 Code) (ASHRAE 1989a, 1993b) and ASHRAE/IESNA Standard 90.1-1999 (Standard 90.1-1999). This includes jurisdictions that have adopted the 90.1-1989 Code, Standard 90.1-1989, Standard 90.1-1999, or their own code based on one of these. We view Standard 90.1-1989 and the 90.1-1989 Code as having equivalent technical content and have used both as source documents in developing COMcheck. This technical support document (TSD) is designed to explain the technical basis for the COMcheck software as originally developed based on the ANSI/ASHRAE/IES Standard 90.1-1989 (Standard 90.1-1989). Documentation for other national model codes and standards and specific state energy codes supported in COMcheck has been added to this report as appendices. These appendices are intended to provide technical documentation for features specific to the supported codes and for any changes made for state-specific codes that differ from the standard features that support compliance with the national model codes and standards. Beginning with COMcheck version 3.8.0, support for 90.1-1989, 90.1-1999, and the 1998 IECC is no longer included, and beginning with version 3.9.0, support for the 2000 and 2001 IECC is no longer included; those sections remain in this document for reference purposes.

  13. Software Certification - Coding, Code, and Coders

    NASA Technical Reports Server (NTRS)

    Havelund, Klaus; Holzmann, Gerard J.

    2011-01-01

    We describe a certification approach for software development that has been adopted at our organization. JPL develops robotic spacecraft for the exploration of the solar system. The flight software that controls these spacecraft is considered to be mission critical. We argue that the goal of a software certification process cannot be the development of "perfect" software, i.e., software that can be formally proven to be correct under all imaginable and unimaginable circumstances. More realistically, the goal is to guarantee a software development process that is conducted by knowledgeable engineers, who follow generally accepted procedures to control known risks, while meeting agreed upon standards of workmanship. We target three specific issues that must be addressed in such a certification procedure: the coding process, the code that is developed, and the skills of the coders. The coding process is driven by standards (e.g., a coding standard) and tools. The code is mechanically checked against the standard with the help of state-of-the-art static source code analyzers. The coders, finally, are certified in on-site training courses that include formal exams.

  14. Overview of FAR-TECH's magnetic fusion energy research

    NASA Astrophysics Data System (ADS)

    Kim, Jin-Soo; Bogatu, I. N.; Galkin, S. A.; Spencer, J. Andrew; Svidzinski, V. A.; Zhao, L.

    2017-10-01

    FAR-TECH, Inc. has been working on magnetic fusion energy research for over two decades. Over the years, we have developed unique approaches to help understand the physics and resolve issues in magnetic fusion energy. The specific areas of work have been modeling RF waves in plasmas, MHD modeling and mode identification, and the nano-particle plasma jet and its application to disruption mitigation. Our research highlights of recent years will be presented with examples, specifically the development of FullWave (Full Wave RF code), PMARS (Parallelized MARS code), and HEM (Hybrid ElectroMagnetic code). In addition, the nano-particle plasma jet (NPPJ) and its application to disruption mitigation will be presented. Work is supported by the U.S. DOE SBIR program.

  15. Operational rate-distortion performance for joint source and channel coding of images.

    PubMed

    Ruf, M J; Modestino, J W

    1999-01-01

    This paper describes a methodology for evaluating the operational rate-distortion behavior of combined source and channel coding schemes, with particular application to images. In particular, we demonstrate use of the operational rate-distortion function to obtain the optimum tradeoff between source coding accuracy and channel error protection under the constraint of a fixed transmission bandwidth for the investigated transmission schemes. Furthermore, we develop information-theoretic bounds on performance for specific source and channel coding systems and demonstrate that our combined source-channel coding methodology, applied to different schemes, results in operational rate-distortion performance which closely approaches these theoretical limits. We concentrate specifically on a wavelet-based subband source coding scheme and the use of binary rate-compatible punctured convolutional (RCPC) codes for transmission over the additive white Gaussian noise (AWGN) channel. Explicit results for real-world images demonstrate the efficacy of this approach.
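
    In symbols (notation ours, not the paper's), the operational tradeoff described here is: for a total budget R_tot fixed by the channel bandwidth, split the rate between source bits and channel-code redundancy so as to minimize expected end-to-end distortion,

        \min_{R_s + R_c \le R_{\mathrm{tot}}} \; \mathbb{E}\left[ D(R_s, R_c) \right],

    where R_s is the source-coding rate (the wavelet subband coder), R_c the RCPC protection redundancy, and the expectation is taken over AWGN channel errors at the operating signal-to-noise ratio.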

  16. Improving the sensitivity and specificity of the abbreviated injury scale coding system.

    PubMed Central

    Kramer, C F; Barancik, J I; Thode, H C

    1990-01-01

    The Abbreviated Injury Scale with Epidemiologic Modifications (AIS 85-EM) was developed to make it possible to code information about anatomic injury types and locations that, although generally available from medical records, is not codable under the standard Abbreviated Injury Scale, published by the American Association for Automotive Medicine in 1985 (AIS 85). In a population-based sample of 3,223 motor vehicle trauma cases, 68 percent of the patients had one or more injuries that were coded to the AIS 85 body region nonspecific category external. When the same patients' injuries were coded using the AIS 85-EM coding procedure, only 15 percent of the patients had injuries that could not be coded to a specific body region. With AIS 85-EM, the proportion of codable head injury cases increased from 16 percent to 37 percent, thereby improving the potential for identifying cases with head and threshold brain injury. The data suggest that body region coding of all injuries is necessary to draw valid and reliable conclusions about changes in injury patterns and their sequelae. The increased specificity of body region coding improves assessments of the efficacy of injury intervention strategies and countermeasure programs using epidemiologic methodology. PMID:2116633

  17. The proposed coding standard at GSFC

    NASA Technical Reports Server (NTRS)

    Morakis, J. C.; Helgert, H. J.

    1977-01-01

    As part of the continuing effort to introduce standardization of spacecraft and ground equipment in satellite systems, NASA's Goddard Space Flight Center and other NASA facilities have supported the development of a set of standards for the use of error control coding in telemetry subsystems. These standards are intended to ensure compatibility between spacecraft and ground encoding equipment, while allowing sufficient flexibility to meet all anticipated mission requirements. The standards which have been developed to date cover the application of block codes in error detection and error correction modes, as well as short and long constraint length convolutional codes decoded via the Viterbi and sequential decoding algorithms, respectively. Included are detailed specifications of the codes, and their implementation. Current effort is directed toward the development of standards covering channels with burst noise characteristics, channels with feedback, and code concatenation.

  18. Seismology software: state of the practice

    NASA Astrophysics Data System (ADS)

    Smith, W. Spencer; Zeng, Zheng; Carette, Jacques

    2018-05-01

    We analyzed the state of practice for software development in the seismology domain by comparing 30 software packages on four aspects: product, implementation, design, and process. We found room for improvement in most seismology software packages. The principal areas of concern include a lack of adequate requirements and design specification documents, a lack of test data to assess reliability, a lack of examples to get new users started, and a lack of technological tools to assist with managing the development process. To assist going forward, we provide recommendations for a document-driven development process that includes a problem statement, development plan, requirement specification, verification and validation (V&V) plan, design specification, code, V&V report, and a user manual. We also provide advice on tool use, including issue tracking, version control, code documentation, and testing tools.

  19. Seismology software: state of the practice

    NASA Astrophysics Data System (ADS)

    Smith, W. Spencer; Zeng, Zheng; Carette, Jacques

    2018-02-01

    We analyzed the state of practice for software development in the seismology domain by comparing 30 software packages on four aspects: product, implementation, design, and process. We found room for improvement in most seismology software packages. The principal areas of concern include a lack of adequate requirements and design specification documents, a lack of test data to assess reliability, a lack of examples to get new users started, and a lack of technological tools to assist with managing the development process. To assist going forward, we provide recommendations for a document-driven development process that includes a problem statement, development plan, requirement specification, verification and validation (V&V) plan, design specification, code, V&V report, and a user manual. We also provide advice on tool use, including issue tracking, version control, code documentation, and testing tools.

  20. The MINERVA Software Development Process

    NASA Technical Reports Server (NTRS)

    Narkawicz, Anthony; Munoz, Cesar A.; Dutle, Aaron M.

    2017-01-01

    This paper presents a software development process for safety-critical software components of cyber-physical systems. The process is called MINERVA, which stands for Mirrored Implementation Numerically Evaluated against Rigorously Verified Algorithms. The process relies on formal methods for rigorously validating code against its requirements. The software development process uses: (1) a formal specification language for describing the algorithms and their functional requirements, (2) an interactive theorem prover for formally verifying the correctness of the algorithms, (3) test cases that stress the code, and (4) numerical evaluation on these test cases of both the algorithm specifications and their implementations in code. The MINERVA process is illustrated in this paper with an application to geo-containment algorithms for unmanned aircraft systems. These algorithms ensure that the position of an aircraft never leaves a predetermined polygon region and provide recovery maneuvers when the region is inadvertently exited.
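
    A minimal sketch of the "mirrored implementation numerically evaluated" idea (ours, not the MINERVA artifacts): run the floating-point implementation against an exact-arithmetic mirror of the specification on stress-test inputs and compare outputs.

        from fractions import Fraction

        def left_of_spec(ax, ay, bx, by, px, py):
            # Exact-rational mirror of the spec: sign of the 2-D cross product.
            ax, ay, bx, by, px, py = map(Fraction, (ax, ay, bx, by, px, py))
            return (bx - ax) * (py - ay) - (by - ay) * (px - ax) > 0

        def left_of_impl(ax, ay, bx, by, px, py):
            # Floating-point implementation under evaluation.
            return (bx - ax) * (py - ay) - (by - ay) * (px - ax) > 0.0

        # Stress cases: points barely off a polygon edge.
        cases = [(0, 0, 1, 0, 0.5, 1e-12), (0, 0, 1, 0, 0.5, -1e-12)]
        for c in cases:
            assert left_of_spec(*c) == left_of_impl(*c), f"divergence on {c}"
        print("implementation agrees with the spec on all stress cases")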

  1. Development of N-version software samples for an experiment in software fault tolerance

    NASA Technical Reports Server (NTRS)

    Lauterbach, L.

    1987-01-01

    The report documents the task planning and software development phases of an effort to obtain twenty versions of code independently designed and developed from a common specification. These versions were created for use in future experiments in software fault tolerance, in continuation of the experimental series underway at the Systems Validation Methods Branch (SVMB) at NASA Langley Research Center. The 20 versions were developed under controlled conditions at four U.S. universities, by 20 teams of two researchers each. The versions process raw data from a modified Redundant Strapped Down Inertial Measurement Unit (RSDIMU). The specifications, and over 200 questions submitted by the developers concerning the specifications, are included as appendices to this report. Design documents, and design and code walkthrough reports for each version, were also obtained in this task for use in future studies.

  2. Some partial-unit-memory convolutional codes

    NASA Technical Reports Server (NTRS)

    Abdel-Ghaffar, K.; Mceliece, R. J.; Solomon, G.

    1991-01-01

    The results of a study on a class of error correcting codes called partial unit memory (PUM) codes are presented. This class of codes, though not entirely new, has until now remained relatively unexplored. The possibility of using the well developed theory of block codes to construct a large family of promising PUM codes is shown. The performance of several specific PUM codes are compared with that of the Voyager standard (2, 1, 6) convolutional code. It was found that these codes can outperform the Voyager code with little or no increase in decoder complexity. This suggests that there may very well be PUM codes that can be used for deep space telemetry that offer both increased performance and decreased implementational complexity over current coding systems.
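
    For reference, the encoder structure that defines this class (standard in the unit-memory literature; notation ours): the n-bit output block at time t depends only on the current and previous k-bit input blocks,

        v_t = u_t G_0 + u_{t-1} G_1,

    with k x n binary generator matrices G_0 and G_1. A unit-memory code has rank(G_1) = k, while a partial-unit-memory code has rank(G_1) < k, so only part of the previous input block is remembered; this block-wise structure is what lets block-code constructions carry over.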

  3. Empirical impact evaluation of the WHO Global Code of Practice on the International Recruitment of Health Personnel in Australia, Canada, UK and USA

    PubMed Central

    2013-01-01

    Background: The active recruitment of health workers from developing countries to developed countries has become a major threat to global health. In an effort to manage this migration, the 63rd World Health Assembly adopted the World Health Organization (WHO) Global Code of Practice on the International Recruitment of Health Personnel in May 2010. While the Code has been lauded as the first globally-applicable regulatory framework for health worker recruitment, its impact has yet to be evaluated. We offer the first empirical evaluation of the Code's impact on national and sub-national actors in Australia, Canada, United Kingdom and United States of America, which are the English-speaking developed countries with the greatest number of migrant health workers. Methods: 42 key informants from across government, civil society and private sectors were surveyed to measure their awareness of the Code, knowledge of specific changes resulting from it, overall opinion on the effectiveness of non-binding codes, and suggestions to improve this Code's implementation. Results: 60% of respondents believed their colleagues were not aware of the Code, and 93% reported that no specific changes had been observed in their work as a result of the Code. 86% reported that the Code has not had any meaningful impact on policies, practices or regulations in their countries. Conclusions: This suggests a gap between awareness of the Code among stakeholders at global forums and the awareness and behaviour of national and sub-national actors. Advocacy and technical guidance for implementing the Code are needed to improve its impact on national decision-makers. PMID:24228827

  4. Infrastructure for Rapid Development of Java GUI Programs

    NASA Technical Reports Server (NTRS)

    Jones, Jeremy; Hostetter, Carl F.; Wheeler, Philip

    2006-01-01

    The Java Application Shell (JAS) is a software framework that accelerates the development of Java graphical-user-interface (GUI) application programs by enabling the reuse of common, proven GUI elements, as distinguished from writing custom code for GUI elements. JAS is a software infrastructure upon which Java interactive application programs and graphical user interfaces (GUIs) for those programs can be built as sets of plug-ins. JAS provides an application programming interface that is extensible by application-specific plug-ins that describe and encapsulate both specifications of a GUI and application-specific functionality tied to the specified GUI elements. The desired GUI elements are specified in Extensible Markup Language (XML) descriptions instead of in compiled code. JAS reads and interprets these descriptions, then creates and configures a corresponding GUI from a standard set of generic, reusable GUI elements. These elements are then attached (again, according to the XML descriptions) to application-specific compiled code and scripts. An application program constructed by use of JAS as its core can be extended by writing new plug-ins and replacing existing plug-ins. Thus, JAS solves many problems that Java programmers generally solve anew for each project, thereby reducing development and testing time.

  5. Multi-level bandwidth efficient block modulation codes

    NASA Technical Reports Server (NTRS)

    Lin, Shu

    1989-01-01

    The multilevel technique is investigated for combining block coding and modulation. There are four parts. In the first part, a formulation is presented for signal sets on which modulation codes are to be constructed. Distance measures on a signal set are defined and their properties are developed. In the second part, a general formulation is presented for multilevel modulation codes in terms of component codes with appropriate Euclidean distances. The distance properties, Euclidean weight distribution and linear structure of multilevel modulation codes are investigated. In the third part, several specific methods for constructing multilevel block modulation codes with interdependency among component codes are proposed. Given a multilevel block modulation code C with no interdependency among the binary component codes, the proposed methods give a multilevel block modulation code C' which has the same rate as C, a minimum squared Euclidean distance not less than that of C, a trellis diagram with the same number of states as that of C, and a smaller number of nearest-neighbor codewords than that of C. In the last part, error performance of block modulation codes is analyzed for an AWGN channel based on soft-decision maximum likelihood decoding. Error probabilities of some specific codes are evaluated based on their Euclidean weight distributions and simulation results.
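
    The distance behavior that drives such constructions is captured by the standard multilevel bound (notation ours): if level i of an L-level construction uses a binary component code of minimum Hamming distance d_i, and \Delta_i^2 is the minimum squared Euclidean distance within the signal subsets at partition level i, then

        d_E^2(C) \ge \min_{1 \le i \le L} d_i \, \Delta_i^2,

    which is why the level-by-level choice of component codes controls the overall minimum distance of the modulation code.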

  6. NASA software specification and evaluation system design, part 1

    NASA Technical Reports Server (NTRS)

    1976-01-01

    Research to develop methods for reducing the effort expended in software specification and verification is reported. The development of a formal software requirements methodology, a formal specifications language, a programming language, a language preprocessor, and code analysis tools are discussed.

  7. Is phonology bypassed in normal or dyslexic development?

    PubMed

    Pennington, B F; Lefly, D L; Van Orden, G C; Bookman, M O; Smith, S D

    1987-01-01

    A pervasive assumption in most accounts of normal reading and spelling development is that phonological coding is important early in development but is subsequently superseded by faster, orthographic coding which bypasses phonology. We call this assumption, which derives from dual-process theory, the developmental bypass hypothesis. The present study tests four specific predictions of the developmental bypass hypothesis by comparing dyslexics and nondyslexics from the same families in a cross-sectional design. The four predictions are: 1) that phonological coding skill develops early in normal readers and soon reaches asymptote, whereas orthographic coding skill has a protracted course of development; 2) that the correlation of adult reading or spelling performance with phonological coding skill is considerably less than the correlation with orthographic coding skill; 3) that dyslexics who are mainly deficient in phonological coding skill should be able to bypass this deficit and eventually close the gap in reading and spelling performance; and 4) that the greatest differences between dyslexics and developmental controls on measures of phonological coding skill should be observed early rather than late in development. None of the four predictions of the developmental bypass hypothesis were upheld. Phonological coding skill continued to develop in nondyslexics until adulthood. It accounted for a substantial (32-53 percent) portion of the variance in reading and spelling performance in adult nondyslexics, whereas orthographic coding skill did not account for a statistically reliable portion of this variance. The dyslexics differed little across age in phonological coding skill, but made linear progress in orthographic coding skill, surpassing spelling-age (SA) controls by adulthood. Nonetheless, they did not close the gap in reading and spelling performance. Finally, dyslexics were significantly worse than SA (and reading-age [RA]) controls in phonological coding skill only in adulthood.

  8. Software ``Best'' Practices: Agile Deconstructed

    NASA Astrophysics Data System (ADS)

    Fraser, Steven

    This workshop will explore the intersection of agility and software development in a world of legacy code-bases and large teams. Organizations with hundreds of developers and code-bases exceeding a million or tens of millions of lines of code are seeking new ways to expedite development while retaining and attracting staff who desire to apply “agile” methods. This is a situation where specific agile practices may be embraced outside of their usual zone of applicability. Here is where practitioners must understand both what “best practices” already exist in the organization - and how they might be improved or modified by applying “agile” approaches.

  9. Applang - A DSL for specification of mobile applications for android platform based on textX

    NASA Astrophysics Data System (ADS)

    Kosanović, Milan; Dejanović, Igor; Milosavljević, Gordana

    2016-06-01

    Mobile platforms have become a ubiquitous part of our daily lives, putting more pressure on software developers to build more applications, faster, and with support for different mobile operating systems. To foster faster development of mobile services and applications and to support various mobile operating systems, new software development approaches must be adopted. Domain-Specific Languages (DSLs) are a viable approach that promises to solve the problem of target-platform diversity as well as to facilitate rapid application development and shorter time-to-market. This paper presents Applang, a DSL for the specification of mobile applications for the Android platform, based on the textX meta-language. The application is described using the Applang DSL and the source code for a target platform is automatically generated by the provided code generator. The same application defined in a single Applang source can be transformed to various targets with little or no manual modification.
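
    To give a flavor of what a textX-based DSL looks like (a hypothetical toy grammar, not Applang's actual syntax), the sketch below defines a minimal screen/widget language and parses a model from it; a code generator would then walk the resulting model objects to emit platform-specific source:

        from textx import metamodel_from_str

        # Toy grammar: an app is a list of screens, each holding widgets.
        GRAMMAR = """
        App: 'app' name=ID screens+=Screen;
        Screen: 'screen' name=ID '{' widgets*=Widget '}';
        Widget: kind=WidgetKind name=ID;
        WidgetKind: 'button' | 'label' | 'input';
        """

        mm = metamodel_from_str(GRAMMAR)
        model = mm.model_from_str("""
        app Demo
        screen Main {
            label title
            input userName
            button submit
        }
        """)

        # A generator would iterate the model like this to emit target code.
        for screen in model.screens:
            print(screen.name, [(w.kind, w.name) for w in screen.widgets])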

  10. Recipes for Reading: A Teacher's Handbook for Diagnostic and Prescriptive Teaching, or the Reading Teacher's Cookbook.

    ERIC Educational Resources Information Center

    Moody, Barbara J., Ed.; And Others

    A coding system for categorizing reading skills was developed in order to provide manuals for each grade level (preprimer through 6) that would aid teachers in locating materials on a particular skill by page number in a specific text. A skill code key of the skills usually taught at a given reading grade level is based on specific basal test…

  11. A domain specific language for performance portable molecular dynamics algorithms

    NASA Astrophysics Data System (ADS)

    Saunders, William Robert; Grant, James; Müller, Eike Hermann

    2018-03-01

    Developers of Molecular Dynamics (MD) codes face significant challenges when adapting existing simulation packages to new hardware. In a continuously diversifying hardware landscape it becomes increasingly difficult for scientists to be experts both in their own domain (physics/chemistry/biology) and specialists in the low level parallelisation and optimisation of their codes. To address this challenge, we describe a "Separation of Concerns" approach for the development of parallel and optimised MD codes: the science specialist writes code at a high abstraction level in a domain specific language (DSL), which is then translated into efficient computer code by a scientific programmer. In a related context, an abstraction for the solution of partial differential equations with grid based methods has recently been implemented in the (Py)OP2 library. Inspired by this approach, we develop a Python code generation system for molecular dynamics simulations on different parallel architectures, including massively parallel distributed memory systems and GPUs. We demonstrate the efficiency of the auto-generated code by studying its performance and scalability on different hardware and compare it to other state-of-the-art simulation packages. With growing data volumes the extraction of physically meaningful information from the simulation becomes increasingly challenging and requires equally efficient implementations. A particular advantage of our approach is the easy expression of such analysis algorithms. We consider two popular methods for deducing the crystalline structure of a material from the local environment of each atom, show how they can be expressed in our abstraction and implement them in the code generation framework.
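
    The separation of concerns can be illustrated with a small sketch (our own illustration, not the authors' API): the domain scientist supplies only a per-pair kernel, while the framework owns the pair loop, the part that a production system would translate into optimised OpenMP or CUDA code:

        import numpy as np

        def pairwise_loop(positions, kernel, cutoff):
            """Framework side: apply a user-supplied kernel to all particle
            pairs within a cutoff. A real code generator would emit an
            optimised parallel version of this loop instead of running it
            in Python."""
            forces = np.zeros_like(positions)
            n = len(positions)
            for i in range(n):
                for j in range(i + 1, n):
                    r = positions[j] - positions[i]
                    if r @ r < cutoff * cutoff:
                        f = kernel(r)       # force on particle j
                        forces[j] += f
                        forces[i] -= f
            return forces

        def lennard_jones(r, eps=1.0, sigma=1.0):
            """Science side: the specialist writes only the per-pair physics."""
            r2 = r @ r
            s6 = (sigma * sigma / r2) ** 3
            return (24.0 * eps / r2) * (2.0 * s6 * s6 - s6) * r

        positions = np.random.default_rng(0).uniform(0.0, 5.0, size=(64, 3))
        print(pairwise_loop(positions, lennard_jones, cutoff=2.5).shape)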

  12. Generating Customized Verifiers for Automatically Generated Code

    NASA Technical Reports Server (NTRS)

    Denney, Ewen; Fischer, Bernd

    2008-01-01

    Program verification using Hoare-style techniques requires many logical annotations. We have previously developed a generic annotation inference algorithm that weaves in all annotations required to certify safety properties for automatically generated code. It uses patterns to capture generator- and property-specific code idioms and property-specific meta-program fragments to construct the annotations. The algorithm is customized by specifying the code patterns and integrating them with the meta-program fragments for annotation construction. However, this is difficult since it involves tedious and error-prone low-level term manipulations. Here, we describe an annotation schema compiler that largely automates this customization task using generative techniques. It takes a collection of high-level declarative annotation schemas tailored towards a specific code generator and safety property, and generates all customized analysis functions and glue code required for interfacing with the generic algorithm core, thus effectively creating a customized annotation inference algorithm. The compiler raises the level of abstraction and simplifies schema development and maintenance. It also takes care of some more routine aspects of formulating patterns and schemas, in particular handling of irrelevant program fragments and irrelevant variance in the program structure, which reduces the size, complexity, and number of different patterns and annotation schemas that are required. The improvements described here make it easier and faster to customize the system to a new safety property or a new generator, and we demonstrate this by customizing it to certify frame safety of space flight navigation code that was automatically generated from Simulink models by MathWorks' Real-Time Workshop.

  13. Binary weight distributions of some Reed-Solomon codes

    NASA Technical Reports Server (NTRS)

    Pollara, F.; Arnold, S.

    1992-01-01

    The binary weight distributions of the (7,5) and (15,9) Reed-Solomon (RS) codes and their duals are computed using the MacWilliams identities. Several mappings of symbols to bits are considered and those offering the largest binary minimum distance are found. These results are then used to compute bounds on the soft-decoding performance of these codes in the presence of additive Gaussian noise. These bounds are useful for finding large binary block codes with good performance and for verifying the performance obtained by specific soft-decoding algorithms presently under development.
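
    For reference, the MacWilliams identity relates the weight enumerator of a linear [n, k] code C over GF(q) to that of its dual:

        W_{C^\perp}(x, y) = \frac{1}{|C|} \, W_C\bigl(x + (q-1)y,\; x - y\bigr),
        \qquad
        W_C(x, y) = \sum_{c \in C} x^{\,n - \mathrm{wt}(c)} \, y^{\,\mathrm{wt}(c)},

    so the weight distribution of a dual code follows directly once the enumerator of the code itself is known.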

  14. Standardized verification of fuel cycle modeling

    DOE PAGES

    Feng, B.; Dixon, B.; Sunny, E.; ...

    2016-04-05

    A nuclear fuel cycle systems modeling and code-to-code comparison effort was coordinated across multiple national laboratories to verify the tools needed to perform fuel cycle analyses of the transition from a once-through nuclear fuel cycle to a sustainable potential future fuel cycle. For this verification study, a simplified example transition scenario was developed to serve as a test case for the four systems codes involved (DYMOND, VISION, ORION, and MARKAL), each used by a different laboratory participant. In addition, all participants produced spreadsheet solutions for the test case to check all the mass flows and reactor/facility profiles on a year-by-year basis throughout the simulation period. The test case specifications describe a transition from the current US fleet of light water reactors to a future fleet of sodium-cooled fast reactors that continuously recycle transuranic elements as fuel. After several initial coordinated modeling and calculation attempts, it was revealed that most of the differences in code results were not due to different code algorithms or calculation approaches, but due to different interpretations of the input specifications among the analysts. Therefore, the specifications for the test case itself were iteratively updated to remove ambiguity and to help calibrate interpretations. In addition, a few corrections and modifications were made to the codes as well, which led to excellent agreement between all codes and spreadsheets for this test case. Although no fuel cycle transition analysis codes matched the spreadsheet results exactly, all remaining differences in the results were due to fundamental differences in code structure and/or were thoroughly explained. As a result, the specifications and example results are provided so that they can be used to verify additional codes in the future for such fuel cycle transition scenarios.

  15. Specifications and programs for computer software validation

    NASA Technical Reports Server (NTRS)

    Browne, J. C.; Kleir, R.; Davis, T.; Henneman, M.; Haller, A.; Lasseter, G. L.

    1973-01-01

    Three software products developed during the study are reported and include: (1) FORTRAN Automatic Code Evaluation System, (2) the Specification Language System, and (3) the Array Index Validation System.

  16. Generic Kalman Filter Software

    NASA Technical Reports Server (NTRS)

    Lisano, Michael E., II; Crues, Edwin Z.

    2005-01-01

    The Generic Kalman Filter (GKF) software provides a standard basis for the development of application-specific Kalman-filter programs. Historically, Kalman filters have been implemented by customized programs that must be written, coded, and debugged anew for each unique application, then tested and tuned with simulated or actual measurement data. Total development times for typical Kalman-filter application programs have ranged from weeks to months. The GKF software can simplify the development process and reduce the development time by eliminating the need to re-create the fundamental implementation of the Kalman filter for each new application. The GKF software is written in the ANSI C programming language. It contains a generic Kalman-filter-development directory that, in turn, contains code for a generic Kalman-filter function; more specifically, it contains a generically designed and generically coded implementation of linear, linearized, and extended Kalman filtering algorithms, including algorithms for state- and covariance-update and -propagation functions. The mathematical theory that underlies the algorithms is well known and has been reported extensively in the open technical literature. Also contained in the directory are a header file that defines generic Kalman-filter data structures and prototype functions, and template versions of application-specific subfunction and calling navigation/estimation routine code and headers. Once the user has provided a calling routine and the required application-specific subfunctions, the application-specific Kalman-filter software can be compiled and executed immediately. During execution, the generic Kalman-filter function is called from a higher-level navigation or estimation routine that preprocesses measurement data and post-processes output data. The generic Kalman-filter function uses the aforementioned data structures and five implementation-specific subfunctions, which have been developed by the user on the basis of the aforementioned templates. The GKF software can be used to develop many different types of unfactorized Kalman filters. A developer can choose to implement either a linearized or an extended Kalman filter algorithm, without having to modify the GKF software. Control dynamics can be taken into account or neglected in the filter-dynamics model. Filter programs developed by use of the GKF software can be made to propagate equations of motion for linear or nonlinear dynamical systems that are deterministic or stochastic. In addition, filter programs can be made to operate in user-selectable "covariance analysis" and "propagation-only" modes that are useful in design and development stages.
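
    The propagate/update split that such a generic filter standardizes is compact enough to sketch (a minimal numpy illustration of linear Kalman filtering, not the GKF code itself):

        import numpy as np

        def kf_propagate(x, P, F, Q):
            """Propagate state x and covariance P through dynamics F
            with process-noise covariance Q."""
            return F @ x, F @ P @ F.T + Q

        def kf_update(x, P, z, H, R):
            """Correct the estimate with measurement z, measurement
            matrix H, and measurement-noise covariance R."""
            S = H @ P @ H.T + R                 # innovation covariance
            K = P @ H.T @ np.linalg.inv(S)      # Kalman gain
            x = x + K @ (z - H @ x)
            P = (np.eye(len(x)) - K @ H) @ P
            return x, P

        # 1-D constant-velocity example: state is [position, velocity].
        dt = 0.1
        F = np.array([[1.0, dt], [0.0, 1.0]])
        Q = 1e-4 * np.eye(2)
        H = np.array([[1.0, 0.0]])              # position-only measurements
        R = np.array([[0.05]])

        x, P = np.zeros(2), np.eye(2)
        for z in (0.11, 0.22, 0.28, 0.41):      # simulated measurements
            x, P = kf_propagate(x, P, F, Q)
            x, P = kf_update(x, P, np.array([z]), H, R)
        print(x)                                # estimated [position, velocity]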

  17. A Software Development Platform for Wearable Medical Applications.

    PubMed

    Zhang, Ruikai; Lin, Wei

    2015-10-01

    Wearable medical devices have become a leading trend in the healthcare industry. Microcontrollers are computers on a chip with sufficient processing power and are the preferred embedded computing units in those devices. We have developed a software platform specifically for the design of wearable medical applications with a small code footprint on the microcontrollers. It is supported by the open-source real-time operating system FreeRTOS and supplemented with a set of standard APIs for the architecture-specific hardware interfaces on the microcontrollers for data acquisition and wireless communication. We modified the tick counter routine in FreeRTOS to include a real-time soft clock. When combined with the multitasking features of FreeRTOS, the platform offers quick development of wearable applications and easy porting of the application code to different microprocessors. Test results have demonstrated that application software developed using this platform is highly efficient in CPU usage while maintaining a small code footprint to accommodate the limited memory space in microcontrollers.

  18. New French Regulation for NPPs and Code Consequences

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Faidy, Claude

    2006-07-01

    In December 2005, the French regulator issued a new regulation for French nuclear power plants, in particular for pressure equipment (PE). This regulation first needs to be consistent with the non-nuclear PE regulation, adding some specific requirements, in particular radiation protection requirements. The proposal has several advantages: it is more qualitatively risk-oriented, and it establishes an important link with the non-nuclear industry. Only a few components are nuclear-specific. However, the general philosophy of the existing codes (RCC-M [15], KTA [16] or ASME [17]) has to be improved. For foreign codes, it is planned to define the differences in the user specifications. In parallel, a new safety classification has been developed by the French utility. The consequence is the need to cross-check all these specifications to define a minimum quality level for each component or system. At the same time, a new concept has been developed to replace the well-known 'Leak Before Break' methodology: the 'Break Exclusion' methodology. This paper summarizes the key aspects of these different topics. (authors)

  19. The development of non-coding RNA ontology.

    PubMed

    Huang, Jingshan; Eilbeck, Karen; Smith, Barry; Blake, Judith A; Dou, Dejing; Huang, Weili; Natale, Darren A; Ruttenberg, Alan; Huan, Jun; Zimmermann, Michael T; Jiang, Guoqian; Lin, Yu; Wu, Bin; Strachan, Harrison J; de Silva, Nisansa; Kasukurthi, Mohan Vamsi; Jha, Vikash Kumar; He, Yongqun; Zhang, Shaojie; Wang, Xiaowei; Liu, Zixing; Borchert, Glen M; Tan, Ming

    2016-01-01

    Identification of non-coding RNAs (ncRNAs) has been significantly improved over the past decade. On the other hand, semantic annotation of ncRNA data is facing critical challenges due to the lack of a comprehensive ontology to serve as common data elements and data exchange standards in the field. We developed the Non-Coding RNA Ontology (NCRO) to handle this situation. By providing a formally defined ncRNA controlled vocabulary, the NCRO aims to fill a specific and highly needed niche in semantic annotation of large amounts of ncRNA biological and clinical data.

  1. Python Radiative Transfer Emission code (PyRaTE): non-LTE spectral lines simulations

    NASA Astrophysics Data System (ADS)

    Tritsis, A.; Yorke, H.; Tassis, K.

    2018-05-01

    We describe PyRaTE, a new, non-local thermodynamic equilibrium (non-LTE) line radiative transfer code developed specifically for post-processing astrochemical simulations. Population densities are estimated using the escape probability method. When computing the escape probability, the optical depth is calculated towards all directions, with density, molecular abundance, temperature and velocity variations all taken into account. A very easy-to-use interface, capable of importing data from simulation outputs produced by all major astrophysical codes, is also provided. The code is written in PYTHON using an "embarrassingly parallel" strategy and can handle all geometries and projection angles. We benchmark the code by comparing our results with those from RADEX (van der Tak et al. 2007) and against analytical solutions, and present case studies using hydrochemical simulations. The code will be released for public use.
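
    The escape-probability step is simple to illustrate. In the Sobolev (large velocity gradient) approximation the probability that a line photon emitted at optical depth tau leaves the cloud is beta = (1 - exp(-tau))/tau; below is a minimal sketch of that formula (our illustration, not PyRaTE's implementation, which computes tau along all directions):

        import numpy as np

        def escape_probability(tau):
            """Sobolev/LVG escape probability beta = (1 - exp(-tau)) / tau,
            with a series fallback near tau = 0 for numerical safety."""
            tau = np.asarray(tau, dtype=float)
            safe = np.where(tau > 1e-6, tau, 1.0)
            return np.where(tau > 1e-6, -np.expm1(-safe) / safe, 1.0 - tau / 2.0)

        print(escape_probability([0.0, 0.1, 1.0, 10.0]))  # ~1.0, 0.95, 0.63, 0.10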

  2. Maneuvering Rotorcraft Noise Prediction: A New Code for a New Problem

    NASA Technical Reports Server (NTRS)

    Brentner, Kenneth S.; Bres, Guillaume A.; Perez, Guillaume; Jones, Henry E.

    2002-01-01

    This paper presents the unique aspects of the development of an entirely new maneuver noise prediction code called PSU-WOPWOP. The main focus of the code is the aeroacoustic aspects of the maneuver noise problem, when the aeromechanical input data are provided (namely aircraft and blade motion, blade airloads). The PSU-WOPWOP noise prediction capability was developed for rotors in steady and transient maneuvering flight. Featuring an object-oriented design, the code allows great flexibility for complex rotor configurations and motion (including multiple rotors and full aircraft motion). The relative locations and number of hinges, flexures, and body motions can be arbitrarily specified to match any specific rotorcraft. An analysis of algorithm efficiency is performed for maneuver noise prediction along with a description of the tradeoffs made specifically for the maneuvering noise problem. Noise predictions for the main rotor of a rotorcraft in steady descent, transient (arrested) descent, hover and a mild "pop-up" maneuver are demonstrated.

  3. Identifying Psoriasis and Psoriatic Arthritis Patients in Retrospective Databases When Diagnosis Codes Are Not Available: A Validation Study Comparing Medication/Prescriber Visit-Based Algorithms with Diagnosis Codes.

    PubMed

    Dobson-Belaire, Wendy; Goodfield, Jason; Borrelli, Richard; Liu, Fei Fei; Khan, Zeba M

    2018-01-01

    Using diagnosis code-based algorithms is the primary method of identifying patient cohorts for retrospective studies; nevertheless, many databases lack reliable diagnosis code information. Our objective was to develop precise algorithms based on medication claims/prescriber visits (MCs/PVs) to identify psoriasis (PsO) patients and psoriatic patients with arthritic conditions (PsO-AC), a proxy for psoriatic arthritis, in Canadian databases lacking diagnosis codes. Algorithms were developed using medications with narrow indication profiles in combination with prescriber specialty to define PsO and PsO-AC. For a 3-year study period from July 1, 2009, the algorithms were validated using the PharMetrics Plus database, which contains both adjudicated medication claims and diagnosis codes. Positive predictive value (PPV), negative predictive value (NPV), sensitivity, and specificity of the developed algorithms were assessed using diagnosis codes as the reference standard. The chosen algorithms were then applied to Canadian drug databases to profile the algorithm-identified PsO and PsO-AC cohorts. In the selected database, 183,328 patients were identified for validation. The highest PPVs for PsO (85%) and PsO-AC (65%) occurred when a predictive algorithm of two or more MCs/PVs was compared with the reference standard of one or more diagnosis codes. NPV and specificity were high (99%-100%), whereas sensitivity was low (≤30%). Reducing the number of MCs/PVs or increasing diagnosis claims decreased the algorithms' PPVs. We have developed an MC/PV-based algorithm to identify PsO patients with a high degree of accuracy, but accuracy for PsO-AC requires further investigation. Such methods allow researchers to conduct retrospective studies in databases in which diagnosis codes are absent.
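
    The validation metrics involved are the standard 2x2 contingency quantities; a small sketch with hypothetical counts (not the study's data) shows how a test can combine high PPV and specificity with low sensitivity:

        def diagnostic_metrics(tp, fp, fn, tn):
            """Sensitivity, specificity, PPV and NPV from a 2x2 table
            comparing an algorithm against a reference standard."""
            return {
                "sensitivity": tp / (tp + fn),
                "specificity": tn / (tn + fp),
                "ppv": tp / (tp + fp),
                "npv": tn / (tn + fn),
            }

        # Hypothetical counts: most true cases are missed (low sensitivity),
        # but flagged patients are usually true cases (high PPV).
        print(diagnostic_metrics(tp=300, fp=100, fn=700, tn=9900))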

  4. Digital Controller For Emergency Beacon

    NASA Technical Reports Server (NTRS)

    Ivancic, William D.

    1990-01-01

    A prototype digital controller is intended for use in a 406-MHz emergency beacon. Under development according to international specifications, the 406-MHz emergency beacon system includes satellites providing worldwide monitoring of beacons, with Doppler tracking to locate each beacon within 5 km. The controller turns the beacon on and off and generates binary codes identifying the source (e.g., ship, aircraft, person, or vehicle on land). Codes are transmitted by phase modulation. Knowing the code, the monitor attempts to communicate with the user and uses the code information to dispatch a rescue team appropriate to the type and location of the carrier.

  5. Ada Integrated Environment III Computer Program Development Specification. Volume III. Ada Optimizing Compiler.

    DTIC Science & Technology

    1981-12-01

    library-file.library-unit(.subunit).SYMAP Statement Map: library-file.library-unit(.subunit).SMAP Type Map: library-file.library-unit(.subunit).TMAP The library... generator SYMAP Symbol Map code generator SMAP Updated Statement Map code generator TMAP Type Map code generator A.3.5 The PUNIT Command The P UNIT... Core.Stmtmap) NAME Tmap (Core.Typemap) END Example A-3 Compiler Command Stream for the Code Generator Texas Instruments A-5 Ada Optimizing Compiler

  6. Collaborative Software Development in Support of Fast Adaptive AeroSpace Tools (FAAST)

    NASA Technical Reports Server (NTRS)

    Kleb, William L.; Nielsen, Eric J.; Gnoffo, Peter A.; Park, Michael A.; Wood, William A.

    2003-01-01

    A collaborative software development approach is described. The software product is an adaptation of proven computational capabilities combined with new capabilities to form the Agency's next generation aerothermodynamic and aerodynamic analysis and design tools. To efficiently produce a cohesive, robust, and extensible software suite, the approach uses agile software development techniques; specifically, project retrospectives, the Scrum status meeting format, and a subset of Extreme Programming's coding practices are employed. Examples are provided which demonstrate the substantial benefits derived from employing these practices. Also included is a discussion of issues encountered when porting legacy Fortran 77 code to Fortran 95 and a Fortran 95 coding standard.

  7. Learning Discriminative Binary Codes for Large-scale Cross-modal Retrieval.

    PubMed

    Xu, Xing; Shen, Fumin; Yang, Yang; Shen, Heng Tao; Li, Xuelong

    2017-05-01

    Hashing based methods have attracted considerable attention for efficient cross-modal retrieval on large-scale multimedia data. The core problem of cross-modal hashing is how to learn compact binary codes that construct the underlying correlations between heterogeneous features from different modalities. A majority of recent approaches aim at learning hash functions to preserve the pairwise similarities defined by given class labels. However, these methods fail to explicitly explore the discriminative property of class labels during hash function learning. In addition, they usually discard the discrete constraints imposed on the to-be-learned binary codes, and compromise to solve a relaxed problem with quantization to obtain the approximate binary solution. Therefore, the binary codes generated by these methods are suboptimal and less discriminative to different classes. To overcome these drawbacks, we propose a novel cross-modal hashing method, termed discrete cross-modal hashing (DCH), which directly learns discriminative binary codes while retaining the discrete constraints. Specifically, DCH learns modality-specific hash functions for generating unified binary codes, and these binary codes are viewed as representative features for discriminative classification with class labels. An effective discrete optimization algorithm is developed for DCH to jointly learn the modality-specific hash function and the unified binary codes. Extensive experiments on three benchmark data sets highlight the superiority of DCH under various cross-modal scenarios and show its state-of-the-art performance.
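
    The basic mechanics, modality-specific hash functions mapping heterogeneous features into one Hamming space, can be sketched as follows (random projections stand in here for the discriminatively learned functions of DCH):

        import numpy as np

        rng = np.random.default_rng(1)

        # Toy features for two modalities (e.g., image and text) of 100 items.
        X_img = rng.normal(size=(100, 64))
        X_txt = rng.normal(size=(100, 32))

        # Modality-specific linear hash functions into a shared 16-bit space;
        # DCH learns these from class labels under discrete constraints.
        W_img = rng.normal(size=(64, 16))
        W_txt = rng.normal(size=(32, 16))

        B_img = np.sign(X_img @ W_img)   # unified binary codes in {-1, +1}
        B_txt = np.sign(X_txt @ W_txt)

        # Cross-modal retrieval: rank text items by Hamming distance
        # to an image query's binary code.
        query = B_img[0]
        hamming = np.count_nonzero(B_txt != query, axis=1)
        print(np.argsort(hamming)[:5])   # indices of the 5 nearest text items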

  8. [ENT and head and neck surgery in the German DRG system 2007].

    PubMed

    Franz, D; Roeder, N; Hörmann, K; Alberty, J

    2007-07-01

    The German DRG system has been further developed into version 2007. For ENT and head and neck surgery, significant changes in the coding of diagnoses and medical operations as well as in the DRG structure have been made. New ICD codes for sleep apnoea and acquired tracheal stenosis have been implemented. Surgery on the acoustic meatus, removal of auricle hyaline cartilage for transplantation (e.g. rhinosurgery) and tonsillotomy can be coded in the 2007 version. In addition, the DRG structure has been improved. Case allocation for cases with more than one significant operation has been established. The G-DRG system has gained in complexity. High demands are made on the coding of complex cases, whereas standard cases mostly require only one specific diagnosis and one specific OPS code. The quality of case allocation for ENT patients within the G-DRG system has been improved. Nevertheless, further adjustments of the G-DRG system are necessary.

  9. Systems, methods and apparatus for modeling, specifying and deploying policies in autonomous and autonomic systems using agent-oriented software engineering

    NASA Technical Reports Server (NTRS)

    Sterritt, Roy (Inventor); Hinchey, Michael G. (Inventor); Penn, Joaquin (Inventor)

    2011-01-01

    Systems, methods and apparatus are provided through which, in some embodiments, an agent-oriented specification modeled with MaCMAS is analyzed, flaws in the agent-oriented specification modeled with MaCMAS are corrected, and an implementation is derived from the corrected agent-oriented specification. Described herein are systems, methods and apparatus that produce fully (mathematically) tractable development of agent-oriented specification(s) modeled with the methodology fragment for analyzing complex multiagent systems (MaCMAS) and policies for autonomic systems, from requirements through to code generation. The systems, methods and apparatus described herein are illustrated through an example showing how user-formulated policies can be translated into a formal model which can then be converted to code. The requirements-based programming systems, methods and apparatus described herein may provide faster, higher-quality development and maintenance of autonomic systems based on user formulation of policies.

  10. Verification and Validation: High Charge and Energy (HZE) Transport Codes and Future Development

    NASA Technical Reports Server (NTRS)

    Wilson, John W.; Tripathi, Ram K.; Mertens, Christopher J.; Blattnig, Steve R.; Clowdsley, Martha S.; Cucinotta, Francis A.; Tweed, John; Heinbockel, John H.; Walker, Steven A.; Nealy, John E.

    2005-01-01

    In the present paper, we give the formalism for further developing a fully three-dimensional HZETRN code using marching procedures, but development of a new Green's function code is also discussed. The final Green's function code is capable of validation not only in the space environment but also in ground-based laboratories with directed beams of ions of specific energy, characterized with detailed diagnostic particle spectrometer devices. Special emphasis is given to verification of the computational procedures and validation of the resultant computational model using laboratory and spaceflight measurements. Due to historical requirements, two parallel development paths for computational model implementation using marching procedures and Green's function techniques are followed. A new version of the HZETRN code capable of simulating HZE ions with either laboratory or space boundary conditions is under development. Validation of computational models at this time is particularly important for President Bush's Initiative to develop infrastructure for human exploration, with the first target demonstration of the Crew Exploration Vehicle (CEV) in low Earth orbit in 2008.

  11. Strategies for the Legal Implementation of the International Code of Marketing of Breast-Milk Substitutes: Report on a WHO Meeting (Copenhagen, Denmark, November 10-12, 1982).

    ERIC Educational Resources Information Center

    World Health Organization, Copenhagen (Denmark). Regional Office for Europe.

    For various reasons, several countries have had difficulty implementing the International Code of Marketing of Breast-milk Substitutes. To address those problems, a meeting was convened under the auspices of the World Health Organization. Specific purposes of the meeting were to inform member states about the Code and to develop national…

  12. A Deterministic Transport Code for Space Environment Electrons

    NASA Technical Reports Server (NTRS)

    Nealy, John E.; Chang, C. K.; Norman, Ryan B.; Blattnig, Steve R.; Badavi, Francis F.; Adamczyk, Anne M.

    2010-01-01

    A deterministic computational procedure has been developed to describe transport of space environment electrons in various shield media. This code is an upgrade and extension of an earlier electron code. Whereas the former code was formulated on the basis of parametric functions derived from limited laboratory data, the present code utilizes well established theoretical representations to describe the relevant interactions and transport processes. The shield material specification has been made more general, as have the pertinent cross sections. A combined mean free path and average trajectory approach has been used in the transport formalism. Comparisons with Monte Carlo calculations are presented.

  13. Highly selective BSA imprinted polyacrylamide hydrogels facilitated by a metal-coding MIP approach.

    PubMed

    El-Sharif, H F; Yapati, H; Kalluru, S; Reddy, S M

    2015-12-01

    We report the fabrication of metal-coded molecularly imprinted polymers (MIPs) using hydrogel-based protein imprinting techniques. A Co(II) complex was prepared using (E)-2-((2 hydrazide-(4-vinylbenzyl)hydrazono)methyl)phenol; along with iron(III) chloroprotoporphyrin (Hemin), vinylferrocene (VFc), zinc(II) protoporphyrin (ZnPP) and protoporphyrin (PP), these complexes were introduced into the MIPs as co-monomers for metal-coding of non-metalloprotein imprints. Results indicate a 66% enhancement of bovine serum albumin (BSA) protein binding capacities (Q, mg/g) via metal-ion/ligand exchange properties within the metal-coded MIPs. Specifically, Co(II)-complex-based MIPs exhibited 92 ± 1% specific binding with Q values of 5.7 ± 0.45 mg BSA/g polymer and imprinting factors (IF) of 14.8 ± 1.9 (MIP/non-imprinted (NIP) control). The selectivity of our Co(II)-coded BSA MIPs was also tested using bovine haemoglobin (BHb), lysozyme (Lyz), and trypsin (Tryp). By evaluating imprinting factors (K), each of the latter proteins was found to have lower affinity in comparison to the cognate BSA template. The hydrogels were further characterised by thermal analysis and differential scanning calorimetry (DSC) to assess optimum polymer composition. The development of hydrogel-based molecularly imprinted polymer (HydroMIP) technology for the memory imprinting of proteins and for protein biosensor development presents many possibilities, including uses in bio-sample clean-up or selective extraction, replacement of biological antibodies in immunoassays, and biosensors for medicine and the environment. Biosensors for proteins and viruses are currently expensive to develop because they require the use of expensive antibodies. Because of their biomimicry capabilities (and their potential to act as synthetic antibodies), HydroMIPs potentially offer a route to the development of new low-cost biosensors. Herein, a metal-ion-mediated imprinting approach was employed to metal-code our hydrogel-based MIPs for the selective recognition of bovine serum albumin (BSA). Specifically, Co(II)-complex-based MIPs exhibited a 66% enhancement (in comparison to our normal MIPs), exhibiting 92 ± 1% specific binding with Q values of 5.7 ± 0.45 mg BSA/g polymer and imprinting factors (IF) of 14.8 ± 1.9 (MIP/non-imprinted (NIP) control). The proposed metal-coded MIPs for protein recognition are intended to lead to unprecedented improvement in MIP selectivity and to future biosensor development relying on electrochemical redox processes.

  14. More Questions than Answers: A Response to Stephens, Reeder, and Elder.

    ERIC Educational Resources Information Center

    Bhaerman, Robert D.

    1992-01-01

    Responds to the three main articles in this issue with questions concerning the development and use of policy-impact codes (rural-urban classification systems) for specific purposes in policymaking, research, and practice. Questions the necessity for policy-impact codes to ensure equity, adequacy, responsiveness, and appropriateness of rural…

  15. Author Correction: Single-nucleus analysis of accessible chromatin in developing mouse forebrain reveals cell-type-specific transcriptional regulation.

    PubMed

    Preissl, Sebastian; Fang, Rongxin; Huang, Hui; Zhao, Yuan; Raviram, Ramya; Gorkin, David U; Zhang, Yanxiao; Sos, Brandon C; Afzal, Veena; Dickel, Diane E; Kuan, Samantha; Visel, Axel; Pennacchio, Len A; Zhang, Kun; Ren, Bing

    2018-03-01

    In the version of this article initially published online, the accession code was given as GSE1000333. The correct code is GSE100033. The error has been corrected in the print, HTML and PDF versions of the article.

  16. Transforming Aggregate Object-Oriented Formal Specifications to Code

    DTIC Science & Technology

    1999-03-01

    integration issues associated with a formal-based software transformation system, such as the source specification, the problem space architecture, design architecture... design transforms, and target software transforms. Software is critical in today’s Air Force, yet its specification, design, and development

  17. Genomic analysis of organismal complexity in the multicellular green alga Volvox carteri

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Prochnik, Simon E.; Umen, James; Nedelcu, Aurora

    2010-07-01

    Analysis of the Volvox carteri genome reveals that this green alga's increased organismal complexity and multicellularity are associated with modifications in protein families shared with its unicellular ancestor, and not with large-scale innovations in protein-coding capacity. The multicellular green alga Volvox carteri and its morphologically diverse close relatives (the volvocine algae) are uniquely suited for investigating the evolution of multicellularity and development. We sequenced the 138 Mb genome of V. carteri and compared its ~14,500 predicted proteins to those of its unicellular relative, Chlamydomonas reinhardtii. Despite fundamental differences in organismal complexity and life history, the two species have similar protein-coding potentials, and few species-specific protein-coding gene predictions. Interestingly, volvocine algal-specific proteins are enriched in Volvox, including those associated with an expanded and highly compartmentalized extracellular matrix. Our analysis shows that increases in organismal complexity can be associated with modifications of lineage-specific proteins rather than large-scale invention of protein-coding capacity.

  18. The Use of a Code-generating System for the Derivation of the Equations for Wind Turbine Dynamics

    NASA Astrophysics Data System (ADS)

    Ganander, Hans

    2003-10-01

    For many reasons the size of wind turbines on the rapidly growing wind energy market is increasing. Relations between the aeroelastic properties of these new large turbines change. Modifications of turbine designs and control concepts are also influenced by growing size. All these trends require the development of computer codes for design and certification. Moreover, there is a strong desire for design optimization procedures, which require fast codes. General codes, e.g. finite element codes, normally allow such modifications and improvements of existing wind turbine models. This is done relatively easily. However, the calculation times of such codes are unfavourably long, certainly for optimization use. The use of an automatic code-generating system is an alternative that addresses both key issues: the code and the design optimization. This technique can be used for rapid generation of codes for particular wind turbine simulation models. These ideas have been followed in the development of new versions of the wind turbine simulation code VIDYN. The equations of the simulation model were derived according to the Lagrange equation and using Mathematica®, which was directed to output the results in Fortran code format. In this way the simulation code is automatically adapted to an actual turbine model, in terms of subroutines containing the equations of motion, definitions of parameters and degrees of freedom. Since the start in 1997, these methods, constituting a systematic way of working, have been used to develop specific efficient calculation codes. The experience with this technique has been very encouraging, inspiring the continued development of new versions of the simulation code as the need has arisen, and the interest in design optimization is growing.
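
    The workflow, deriving equations of motion symbolically and then emitting Fortran, can be reproduced in miniature with SymPy (a sketch of the general technique applied to a single pendulum, not the VIDYN derivation):

        import sympy as sp

        t = sp.symbols('t')
        m, g, l = sp.symbols('m g l', positive=True)
        theta = sp.Function('theta')(t)

        # Lagrangian of a simple pendulum: L = T - V.
        T = sp.Rational(1, 2) * m * (l * theta.diff(t)) ** 2
        V = -m * g * l * sp.cos(theta)
        L = T - V

        # Euler-Lagrange equation d/dt(dL/dq') - dL/dq = 0,
        # solved for the angular acceleration.
        eom = sp.diff(L.diff(theta.diff(t)), t) - L.diff(theta)
        theta_ddot = sp.solve(sp.Eq(eom, 0), theta.diff(t, 2))[0]

        # Emit Fortran for the result: thddot = -g*sin(th)/l.
        th = sp.symbols('th')
        print(sp.fcode(sp.simplify(theta_ddot).subs(theta, th), assign_to='thddot'))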

  19. Transversal Clifford gates on folded surface codes

    DOE PAGES

    Moussa, Jonathan E.

    2016-10-12

    Surface and color codes are two forms of topological quantum error correction in two spatial dimensions with complementary properties. Surface codes have lower-depth error detection circuits and well-developed decoders to interpret and correct errors, while color codes have transversal Clifford gates and better code efficiency in the number of physical qubits needed to achieve a given code distance. A formal equivalence exists between color codes and folded surface codes, but it does not guarantee the transferability of any of these favorable properties. However, the equivalence does imply the existence of constant-depth circuit implementations of logical Clifford gates on folded surface codes. We achieve and improve this result by constructing two families of folded surface codes with transversal Clifford gates. This construction is presented generally for qudits of any dimension. Lastly, the specific application of these codes to universal quantum computation based on qubit fusion is also discussed.

  20. Automated real-time software development

    NASA Technical Reports Server (NTRS)

    Jones, Denise R.; Walker, Carrie K.; Turkovich, John J.

    1993-01-01

    A Computer-Aided Software Engineering (CASE) system has been developed at the Charles Stark Draper Laboratory (CSDL) under the direction of the NASA Langley Research Center. The CSDL CASE tool provides an automated method of generating source code and hard copy documentation from functional application engineering specifications. The goal is to significantly reduce the cost of developing and maintaining real-time scientific and engineering software while increasing system reliability. This paper describes CSDL CASE and discusses demonstrations that used the tool to automatically generate real-time application code.

  1. Black box multigrid

    NASA Technical Reports Server (NTRS)

    Dendy, J. E., Jr.

    1981-01-01

    The black box multigrid (BOXMG) code, which needs only specification of the matrix problem for application of the multigrid method, was investigated. It is contended that a major problem with the multigrid method is that each new grid configuration requires a major programming effort to develop a code that specifically handles that grid configuration. The SOR and ICCG methods, by contrast, require only specification of the matrix problem, no matter what the grid configuration. It is concluded that BOXMG does everything else necessary to set up the auxiliary coarser problems to achieve a multigrid solution.

  2. Tutorial on Reed-Solomon error correction coding

    NASA Technical Reports Server (NTRS)

    Geisel, William A.

    1990-01-01

    This tutorial attempts to provide a frank, step-by-step approach to Reed-Solomon (RS) error correction coding. RS encoding and RS decoding both with and without erasing code symbols are emphasized. There is no need to present rigorous proofs and extreme mathematical detail. Rather, the simple concepts of groups and fields, specifically Galois fields, are presented with a minimum of complexity. Before RS codes are presented, other block codes are presented as a technical introduction into coding. A primitive (15, 9) RS coding example is then completely developed from start to finish, demonstrating the encoding and decoding calculations and a derivation of the famous error-locator polynomial. The objective is to present practical information about Reed-Solomon coding in a manner such that it can be easily understood.
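
    In the same spirit, the Galois-field arithmetic that underpins RS codes fits in a few lines; this sketch (ours, not the tutorial's) builds log/antilog tables for GF(2^4) with primitive polynomial x^4 + x + 1, the field used by a (15, 9) RS code:

        # Antilog and log tables for GF(16), primitive polynomial 0x13.
        EXP = [0] * 30
        LOG = [0] * 16
        x = 1
        for i in range(15):
            EXP[i] = x
            LOG[x] = i
            x <<= 1              # multiply by alpha
            if x & 0x10:         # reduce modulo x^4 + x + 1
                x ^= 0x13
        for i in range(15, 30):  # duplicate so indexing never needs mod 15
            EXP[i] = EXP[i - 15]

        def gf_mul(a, b):
            """Multiply two GF(16) elements via the log/antilog tables."""
            if a == 0 or b == 0:
                return 0
            return EXP[LOG[a] + LOG[b]]

        assert gf_mul(2, 9) == 1   # alpha * alpha^14 = alpha^15 = 1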

  3. The VATES-Diamond as a Verifier's Best Friend

    NASA Astrophysics Data System (ADS)

    Glesner, Sabine; Bartels, Björn; Göthel, Thomas; Kleine, Moritz

    Within a model-based software engineering process it needs to be ensured that properties of abstract specifications are preserved by transformations down to executable code. This is even more important in the area of safety-critical real-time systems where additionally non-functional properties are crucial. In the VATES project, we develop formal methods for the construction and verification of embedded systems. We follow a novel approach that allows us to formally relate abstract process algebraic specifications to their implementation in a compiler intermediate representation. The idea is to extract a low-level process algebraic description from the intermediate code and to formally relate it to previously developed abstract specifications. We apply this approach to a case study from the area of real-time operating systems and show that this approach has the potential to seamlessly integrate modeling, implementation, transformation and verification stages of embedded system development.

  4. BcMF11, a novel non-coding RNA gene from Brassica campestris, is required for pollen development and male fertility.

    PubMed

    Song, Jiang-Hua; Cao, Jia-Shu; Wang, Cheng-Gang

    2013-01-01

    KEY MESSAGE: BcMF11, a non-coding RNA gene, has an essential role in pollen development and might be useful for regulating the pollen fertility of crops by antisense RNA technology. We previously identified an 828-bp full-length cDNA of BcMF11, a novel pollen-specific non-coding mRNA-like gene from Chinese cabbage (Brassica campestris L. ssp. chinensis Makino). However, little is known about the function of BcMF11 in pollen development. To investigate its exact biological roles in pollen development, the BcMF11 cDNA was antisense-inhibited in transgenic Chinese cabbage under the control of a tapetum-specific promoter, BcA9, and a constitutive promoter, CaMV 35S. Antisense RNA transgenic plants displayed decreased expression of BcMF11 and showed distinct morphological defects. Pollen germination tests in vitro and in vivo of the transgenic plants suggested that inhibition of BcMF11 decreased pollen germination efficiency and delayed the extension of pollen tubes in the style. Under scanning electron microscopy, many shrunken and collapsed pollen grains were detected in the antisense BcMF11 transgenic Chinese cabbage. Further cytological observation revealed an abnormal pollen development process in transgenic plants, including delayed degradation of the tapetum, asynchronous separation of microspores, and aborted development of pollen grains. These results suggest that BcMF11, as a non-coding RNA, plays an essential role in pollen development and male fertility.

  5. Identifying personal microbiomes using metagenomic codes

    PubMed Central

    Franzosa, Eric A.; Huang, Katherine; Meadow, James F.; Gevers, Dirk; Lemon, Katherine P.; Bohannan, Brendan J. M.; Huttenhower, Curtis

    2015-01-01

    Community composition within the human microbiome varies across individuals, but it remains unknown if this variation is sufficient to uniquely identify individuals within large populations or stable enough to identify them over time. We investigated this by developing a hitting set-based coding algorithm and applying it to the Human Microbiome Project population. Our approach defined body site-specific metagenomic codes: sets of microbial taxa or genes prioritized to uniquely and stably identify individuals. Codes capturing strain variation in clade-specific marker genes were able to distinguish among 100s of individuals at an initial sampling time point. In comparisons with follow-up samples collected 30–300 d later, ∼30% of individuals could still be uniquely pinpointed using metagenomic codes from a typical body site; coincidental (false positive) matches were rare. Codes based on the gut microbiome were exceptionally stable and pinpointed >80% of individuals. The failure of a code to match its owner at a later time point was largely explained by the loss of specific microbial strains (at current limits of detection) and was only weakly associated with the length of the sampling interval. In addition to highlighting patterns of temporal variation in the ecology of the human microbiome, this work demonstrates the feasibility of microbiome-based identifiability—a result with important ethical implications for microbiome study design. The datasets and code used in this work are available for download from huttenhower.sph.harvard.edu/idability. PMID:25964341
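
    The hitting-set idea behind such codes is straightforward to sketch: greedily pick features present in the target individual until every other individual lacks at least one of them (a toy illustration of the approach, not the published algorithm):

        def metagenomic_code(target, others):
            """Greedy hitting set: choose features of `target` (e.g., taxa
            or marker genes present in one microbiome) until each other
            individual is excluded by at least one chosen feature."""
            code, unresolved = [], list(others)
            while unresolved:
                best = max(target, key=lambda f: sum(f not in o for o in unresolved))
                if all(best in o for o in unresolved):
                    raise ValueError("no distinguishing code exists")
                code.append(best)
                unresolved = [o for o in unresolved if best in o]
            return code

        # Hypothetical presence/absence sets for four individuals.
        alice = {"t1", "t2", "t5", "t9"}
        others = [{"t1", "t2", "t3"}, {"t2", "t5", "t7"}, {"t1", "t5", "t9"}]
        print(metagenomic_code(alice, others))   # e.g., ['t9', 't2']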

  6. Utilization of genetic tests: analysis of gene-specific billing in Medicare claims data.

    PubMed

    Lynch, Julie A; Berse, Brygida; Dotson, W David; Khoury, Muin J; Coomer, Nicole; Kautter, John

    2017-08-01

    We examined the utilization of precision medicine tests among Medicare beneficiaries through analysis of gene-specific tier 1 and 2 billing codes developed by the American Medical Association in 2012. We conducted a retrospective cross-sectional study. The primary source of data was 2013 Medicare 100% fee-for-service claims. We identified claims billed for each laboratory test, the number of patients tested, expenditures, and the diagnostic codes indicated for testing. We analyzed variations in testing by patient demographics and region of the country. Pharmacogenetic tests were billed most frequently, accounting for 48% of the expenditures for new codes. The most common indications for testing were breast cancer, long-term use of medications, and disorders of lipid metabolism. There was underutilization of guideline-recommended tumor mutation tests (e.g., epidermal growth factor receptor) and substantial overutilization of a test discouraged by guidelines (methylenetetrahydrofolate reductase). Methodology-based tier 2 codes represented 15% of all claims billed with the new codes. The highest rate of testing per beneficiary was in Mississippi and the lowest rate was in Alaska. Gene-specific billing codes significantly improved our ability to conduct population-level research of precision medicine. Analysis of these data in conjunction with clinical records should be conducted to validate findings. Genet Med advance online publication, 26 January 2017.

  7. Aging and the Baseline Code: An Alternative to the "Normless Elderly."

    ERIC Educational Resources Information Center

    Offenbacher, Deborah I.; Poster, Constance H.

    1985-01-01

    A projective test administered to 120 older persons revealed a "baseline normative code" to which respondents held themselves and their contemporaries. Findings suggest that, in the absence of age-specific norms, the elderly do not become "normless" but develop their own normative prescriptions to fit their past socialization and present…

  8. TFIIS-Dependent Non-coding Transcription Regulates Developmental Genome Rearrangements

    PubMed Central

    Maliszewska-Olejniczak, Kamila; Gruchota, Julita; Gromadka, Robert; Denby Wilkes, Cyril; Arnaiz, Olivier; Mathy, Nathalie; Duharcourt, Sandra; Bétermier, Mireille; Nowak, Jacek K.

    2015-01-01

    Because of their nuclear dimorphism, ciliates provide a unique opportunity to study the role of non-coding RNAs (ncRNAs) in the communication between germline and somatic lineages. In these unicellular eukaryotes, a new somatic nucleus develops at each sexual cycle from a copy of the zygotic (germline) nucleus, while the old somatic nucleus degenerates. In the ciliate Paramecium tetraurelia, the genome is massively rearranged during this process through the reproducible elimination of repeated sequences and the precise excision of over 45,000 short, single-copy Internal Eliminated Sequences (IESs). Different types of ncRNAs resulting from genome-wide transcription were shown to be involved in the epigenetic regulation of genome rearrangements. To understand how ncRNAs are produced from the entire genome, we have focused on a homolog of the TFIIS elongation factor, which regulates RNA polymerase II transcriptional pausing. Six TFIIS-paralogs, representing four distinct families, can be found in the P. tetraurelia genome. Using RNA interference, we showed that TFIIS4, which encodes a development-specific TFIIS protein, is essential for the formation of a functional somatic genome. Molecular analyses and high-throughput DNA sequencing upon TFIIS4 RNAi demonstrated that TFIIS4 is involved in all kinds of genome rearrangements, including excision of ~48% of IESs. Localization of a GFP-TFIIS4 fusion revealed that TFIIS4 appears specifically in the new somatic nucleus at an early developmental stage, before IES excision. RT-PCR experiments showed that TFIIS4 is necessary for the synthesis of IES-containing non-coding transcripts. We propose that these IES+ transcripts originate from the developing somatic nucleus and serve as pairing substrates for germline-specific short RNAs that target elimination of their homologous sequences. Our study, therefore, connects the onset of zygotic non-coding transcription to the control of genome plasticity in Paramecium, and establishes for the first time a specific role of TFIIS in non-coding transcription in eukaryotes. PMID:26177014

  9. Case-finding for common mental disorders of anxiety and depression in primary care: an external validation of routinely collected data.

    PubMed

    John, Ann; McGregor, Joanne; Fone, David; Dunstan, Frank; Cornish, Rosie; Lyons, Ronan A; Lloyd, Keith R

    2016-03-15

    The robustness of epidemiological research using routinely collected primary care electronic data to support policy and practice for common mental disorders (CMD) anxiety and depression would be greatly enhanced by appropriate validation of diagnostic codes and algorithms for data extraction. We aimed to create a robust research platform for CMD using population-based, routinely collected primary care electronic data. We developed a set of Read code lists (diagnosis, symptoms, treatments) for the identification of anxiety and depression in the General Practice Database (GPD) within the Secure Anonymised Information Linkage Databank at Swansea University, and assessed 12 algorithms for Read codes to define cases according to various criteria. Annual incidence rates were calculated per 1000 person years at risk (PYAR) to assess recording practice for these CMD between January 1st 2000 and December 31st 2009. We anonymously linked the 2799 MHI-5 Caerphilly Health and Social Needs Survey (CHSNS) respondents aged 18 to 74 years to their routinely collected GP data in SAIL. We estimated the sensitivity, specificity and positive predictive value of the various algorithms using the MHI-5 as the gold standard. The incidence of combined depression/anxiety diagnoses remained stable over the ten-year period in a population of over 500,000 but symptoms increased from 6.5 to 20.7 per 1000 PYAR. A 'historical' GP diagnosis for depression/anxiety currently treated plus a current diagnosis (treated or untreated) resulted in a specificity of 0.96, sensitivity 0.29 and PPV 0.76. Adding current symptom codes improved sensitivity (0.32) with a marginal effect on specificity (0.95) and PPV (0.74). We have developed an algorithm with a high specificity and PPV of detecting cases of anxiety and depression from routine GP data that incorporates symptom codes to reflect GP coding behaviour. We have demonstrated that using diagnosis and current treatment alone to identify cases for depression and anxiety using routinely collected primary care data will miss a number of true cases given changes in GP recording behaviour. The Read code lists plus the developed algorithms will be applicable to other routinely collected primary care datasets, creating a platform for future e-cohort research into these conditions.

  10. Image Transmission via Spread Spectrum Techniques. Part A

    DTIC Science & Technology

    1976-01-01

    Code 408 DR. EDWIN H. WRENCH (714-225-6871), Code 408, and HARPER J. WHITEHOUSE (714-225-6315), Code 4002, Naval Undersea Center, San Diego, California... progress report appears in two parts. Part A is a summary of work done in support of this program at the Naval Undersea Center. Part B contains final... a technical description of the bandwidth compression system developed at the Naval Undersea Center. This paper is an excerpt from the specifications

  11. About the necessity to manage events coded with MedDRA prior to statistical analysis: proposal of a strategy with application to a randomized clinical trial, ANRS 099 ALIZE.

    PubMed

    Journot, Valérie; Tabuteau, Sophie; Collin, Fidéline; Molina, Jean-Michel; Chene, Geneviève; Rancinan, Corinne

    2008-03-01

    Since 2003, the Medical Dictionary for Regulatory Activities (MedDRA) has been the regulatory standard for safety reporting in clinical trials in the European Community. Yet we found no published example of practical experience with a scientifically oriented statistical analysis of events coded with MedDRA. We took advantage of a randomized trial in HIV-infected patients with MedDRA-coded events to explain the difficulties encountered during the analysis of events and the strategy developed to report events consistently with trial-specific objectives. MedDRA has a rich hierarchical structure, which allows the grouping of coded terms into 5 levels, the highest being "System Organ Class" (SOC). Each coded term may be related to several SOCs, among which one primary SOC is defined. We developed a new general 5-step strategy to select a SOC as the trial primary SOC, consistently with trial-specific objectives for this analysis. We applied it to the ANRS 099 ALIZE trial, where all events were coded with MedDRA version 3.0. We compared the MedDRA and the ALIZE primary SOCs. In the ANRS 099 ALIZE trial, 355 patients were recruited, and 3,722 events were reported and documented, among which 35% had multiple SOCs (2 to 4). We applied the proposed 5-step strategy. Altogether, 23% of MedDRA primary SOCs were modified, mainly from the MedDRA primary SOCs "Investigations" (69%) and "Ear and labyrinth disorders" (6%), to the ALIZE primary SOCs "Hepatobiliary disorders" (35%), "Musculoskeletal and connective tissue disorders" (21%), and "Gastrointestinal disorders" (15%). MedDRA has grown considerably in size and complexity through versioning and the development of Standardized MedDRA Queries. Yet statisticians should not systematically rely on the primary SOCs proposed by MedDRA to report events. A simple general 5-step strategy to re-classify events consistently with trial-specific objectives might be useful in HIV trials as well as in other fields.

  12. Verification of Gyrokinetic codes: Theoretical background and applications

    NASA Astrophysics Data System (ADS)

    Tronko, Natalia; Bottino, Alberto; Görler, Tobias; Sonnendrücker, Eric; Told, Daniel; Villard, Laurent

    2017-05-01

    In fusion plasmas, the strong magnetic field allows the fast gyro-motion to be systematically removed from the description of the dynamics, resulting in a considerable model simplification and gain of computational time. Nowadays, gyrokinetic (GK) codes play a major role in the understanding of the development and the saturation of turbulence and in the prediction of the subsequent transport. Naturally, these codes require thorough verification and validation. Here, we present a new and generic theoretical framework and specific numerical applications to test the faithfulness of the implemented models to theory and to verify the domain of applicability of existing GK codes. For a sound verification process, the underlying theoretical GK model and the numerical scheme must be considered at the same time, which has rarely been done and therefore makes this approach pioneering. At the analytical level, the main novelty consists in using advanced mathematical tools, such as the variational formulation of dynamics, for the systematization of the basic equations of GK codes, in order to assess the limits of their applicability. The verification of the numerical scheme is proposed via a benchmark effort. In this work, specific examples of code verification are presented for two GK codes: the multi-species electromagnetic ORB5 (PIC) and the radially global version of GENE (Eulerian). The proposed methodology can be applied to any existing GK code. We establish a hierarchy of reduced GK Vlasov-Maxwell equations implemented in the ORB5 and GENE codes using the Lagrangian variational formulation. At the computational level, detailed verifications of global electromagnetic test cases developed from the CYCLONE Base Case are considered, including a parametric β-scan covering the transition from ITG to KBM and the spectral properties at the nominal β value.

  13. Development of the Off-line Analysis Code for GODDESS

    NASA Astrophysics Data System (ADS)

    Garland, Heather; Cizewski, Jolie; Lepailleur, Alex; Walters, David; Pain, Steve; Smith, Karl

    2016-09-01

    Determining (n,γ) cross sections on unstable nuclei is important for understanding the r-process that is theorized to occur in supernovae and neutron-star mergers. However, (n,γ) reactions are difficult to measure directly because of the short lifetimes of the neutron-rich nuclei involved. A possible surrogate for the (n,γ) reaction is the (d,pγ) reaction; the measurement of these reactions in inverse kinematics is part of the scope of GODDESS - Gammasphere ORRUBA (Oak Ridge Rutgers University Barrel Array): Dual Detectors for Experimental Structure Studies. The development of an accurate and efficient off-line analysis code for GODDESS experiments is not only essential, but also provides a unique opportunity to create an analysis code designed specifically for transfer reaction experiments. The off-line analysis code has been developed to produce histograms from the binary data file in order to determine how best to sort events. Recent developments in the off-line analysis code will be presented, as well as details on the energy and position calibrations for the ORRUBA detectors. This work is supported in part by the U.S. Department of Energy and the National Science Foundation.
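
    The first stage of such an off-line code is typically unpacking fixed-size binary event records into per-channel histograms. A minimal sketch (the record layout and linear calibration are assumptions for illustration, not the GODDESS data format):

        import struct
        import collections

        RECORD = struct.Struct("<HH")  # assumed layout: (channel, adc), little-endian

        def fill_histograms(path, gain=1.0, offset=0.0, nbins=4096):
            """Read binary event records and fill a raw-energy histogram per channel."""
            hists = collections.defaultdict(lambda: [0] * nbins)
            with open(path, "rb") as f:
                while (chunk := f.read(RECORD.size)) and len(chunk) == RECORD.size:
                    channel, adc = RECORD.unpack(chunk)
                    b = int(gain * adc + offset)   # linear energy calibration
                    if 0 <= b < nbins:
                        hists[channel][b] += 1
            return hists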

  14. Data compression using adaptive transform coding. Appendix 1: Item 1. Ph.D. Thesis

    NASA Technical Reports Server (NTRS)

    Rost, Martin Christopher

    1988-01-01

    Adaptive low-rate source coders are described in this dissertation. These coders adapt by adjusting the complexity of the coder to match the local coding difficulty of the image. This is accomplished by using a threshold-driven maximum-distortion criterion to select the specific coder used. The different coders are built using variable-blocksize transform techniques, and the threshold criterion selects small transform blocks to code the more difficult regions and larger blocks to code the less complex regions. A theoretical framework is constructed from which the study of these coders can be explored. An algorithm for selecting the optimal bit allocation for the quantization of transform coefficients is developed; it can be used to achieve more accurate bit assignments than the algorithms currently used in the literature. Some upper and lower bounds for the bit-allocation distortion-rate function are developed. An obtainable distortion-rate function is developed for a particular scalar quantizer mixing method that can be used to code transform coefficients at any rate.
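
    A standard way to realize bit allocation is greedy marginal analysis under the high-rate model D_i(b) = var_i * 2^(-2b), where each extra bit quarters a coefficient's distortion. This is a generic textbook sketch under that assumption, not the dissertation's specific algorithm:

        import heapq

        def allocate_bits(variances, total_bits):
            """Greedily grant bits to the coefficient with the largest
            marginal distortion reduction, assuming D(b) = var * 2**(-2b)."""
            bits = [0] * len(variances)

            def gain(i):                       # D(b) - D(b+1) = 0.75 * D(b)
                return 0.75 * variances[i] * 2.0 ** (-2 * bits[i])

            heap = [(-gain(i), i) for i in range(len(variances))]
            heapq.heapify(heap)
            for _ in range(total_bits):
                _, i = heapq.heappop(heap)
                bits[i] += 1
                heapq.heappush(heap, (-gain(i), i))
            return bits

        print(allocate_bits([16.0, 4.0, 1.0, 0.25], total_bits=8))  # -> [4, 3, 1, 0]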

  15. Developing a Multi-Dimensional Hydrodynamics Code with Astrochemical Reactions

    NASA Astrophysics Data System (ADS)

    Kwak, Kyujin; Yang, Seungwon

    2015-08-01

    The Atacama Large Millimeter/submillimeter Array (ALMA) has revealed high-resolution molecular lines, some of which remain unidentified. Because the formation of these astrochemical molecules has seldom been studied in traditional chemistry, observations of new molecular lines drew a lot of attention not only from astronomers but also from experimental and theoretical chemists. Theoretical calculations of the formation of these astrochemical molecules have been carried out, providing reaction rates for some important molecules, and some of the theoretical predictions have been measured in laboratories. The reaction rates for the astronomically important molecules are now collected in databases, some of which are publicly available. By utilizing these databases, we develop a multi-dimensional hydrodynamics code that includes the reaction rates of astrochemical molecules. Because this type of hydrodynamics code is able to trace molecular formation in a non-equilibrium fashion, it is useful for studying the formation history of these molecules, which affects the spatial distribution of some specific molecules. We present the development procedure of this code and some test problems in order to verify and validate the developed code.
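
    Coupling chemistry to hydrodynamics amounts to advancing a reaction network inside each cell every hydro step. A hedged sketch with a toy two-body reaction (the species, rate coefficients and units are invented; production codes draw their rates from databases such as UMIST or KIDA):

        def advance_chemistry(n, T, dt, nsub=100):
            """Advance number densities n (dict) through a toy A + B -> AB
            network with a temperature-dependent rate k(T)."""
            alpha, beta = 1.0e-10, -0.5            # hypothetical rate coefficients
            k = alpha * (T / 300.0) ** beta        # cm^3 s^-1, modified-Arrhenius form
            h = dt / nsub
            for _ in range(nsub):                  # explicit sub-steps for stiffness
                rate = k * n["A"] * n["B"]
                n["A"] -= rate * h
                n["B"] -= rate * h
                n["AB"] += rate * h
            return n

        cell = {"A": 1.0e4, "B": 1.0e4, "AB": 0.0}   # number densities, cm^-3
        advance_chemistry(cell, T=50.0, dt=3.15e7)    # one year, for one hydro cell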

  16. SPIN: An Inversion Code for the Photospheric Spectral Line

    NASA Astrophysics Data System (ADS)

    Yadav, Rahul; Mathew, Shibu K.; Tiwary, Alok Ranjan

    2017-08-01

    Inversion codes are the most useful tools to infer the physical properties of the solar atmosphere from the interpretation of Stokes profiles. In this paper, we present the details of a new Stokes Profile INversion code (SPIN) developed specifically to invert the spectro-polarimetric data of the Multi-Application Solar Telescope (MAST) at the Udaipur Solar Observatory. The SPIN code adopts the Milne-Eddington approximation to solve the polarized radiative transfer equation (RTE), and a modified Levenberg-Marquardt algorithm has been employed for the fitting. We describe the details and use of the SPIN code to invert spectro-polarimetric data. We also present the details of tests performed to validate the inversion code by comparing its results with those from other widely used inversion codes (VFISV and SIR). The inverted results of the SPIN code after its application to Hinode/SP data have been compared with the inverted results from other inversion codes.
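
    The fitting loop of such a code is ordinary nonlinear least squares: adjust model parameters until the synthetic profile matches the observed one. A minimal sketch using scipy (the Gaussian forward model below is a stand-in for the full Milne-Eddington solution of the polarized RTE, and all parameter values are illustrative):

        import numpy as np
        from scipy.optimize import least_squares

        wav = np.linspace(-0.5, 0.5, 101)        # wavelength offset (angstrom)

        def forward(params, wav):                # placeholder for the ME synthesis
            depth, shift, width = params
            return 1.0 - depth * np.exp(-((wav - shift) / width) ** 2)

        def residuals(params, wav, observed):
            return forward(params, wav) - observed

        true = (0.6, 0.05, 0.08)
        observed = forward(true, wav) + np.random.normal(0.0, 0.01, wav.size)

        fit = least_squares(residuals, x0=(0.5, 0.0, 0.1), args=(wav, observed))
        print(fit.x)   # recovered (depth, shift, width), close to `true`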

  17. Final Technical Report for GO17004 Regulatory Logic: Codes and Standards for the Hydrogen Economy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nakarado, Gary L.

    The objectives of this project are to: develop a robust supporting research and development program to provide critical hydrogen behavior data and a detailed understanding of hydrogen combustion and safety across a range of scenarios, needed to establish setback distances in building codes and minimize the overall data gaps in code development; support and facilitate the completion of technical specifications by the International Organization for Standardization (ISO) for gaseous hydrogen refueling (TS 20012) and standards for on-board liquid (ISO 13985) and gaseous or gaseous blend (ISO 15869) hydrogen storage by 2007; support and facilitate the effort, led by the NFPA, to complete the draft Hydrogen Technologies Code (NFPA 2) by 2008; with experimental data and input from Technology Validation Program element activities, support and facilitate the completion of standards for bulk hydrogen storage (e.g., NFPA 55) by 2008; facilitate the adoption of the most recently available model codes (e.g., from the International Code Council [ICC]) in key regions; complete preliminary research and development on hydrogen release scenarios to support the establishment of setback distances in building codes and provide a sound basis for model code development and adoption; support and facilitate the development of Global Technical Regulations (GTRs) by 2010 for hydrogen vehicle systems under the United Nations Economic Commission for Europe, World Forum for Harmonization of Vehicle Regulations and Working Party on Pollution and Energy Program (ECE-WP29/GRPE); and to support and facilitate the completion by 2012 of necessary codes and standards needed for the early commercialization and market entry of hydrogen energy technologies.

  18. [Orthopedic and trauma surgery in the German DRG System 2007].

    PubMed

    Franz, D; Kaufmann, M; Siebert, C H; Windolf, J; Roeder, N

    2007-03-01

    The German Diagnosis-Related Groups (DRG) system was further developed into its 2007 version. For orthopedic and trauma surgery, significant changes were made in terms of the coding of diagnoses and medical procedures, as well as in the DRG structure itself. The German Societies for Trauma Surgery and for Orthopedics and Orthopedic Surgery (Deutsche Gesellschaft für Unfallchirurgie, DGU; and Deutsche Gesellschaft für Orthopädie und Orthopädische Chirurgie, DGOOC) once again cooperated constructively with the German DRG Institute InEK. Among other innovations, new International Classification of Diseases (ICD) codes for second-degree burns were implemented. Procedure codes for joint operations, endoprosthetic surgery and spine surgery were restructured. Furthermore, a specific code for septic surgery was introduced in 2007. In addition, the DRG structure was improved. Case allocation of patients with more than one significant operation was established. Further DRG subdivisions were established according to the patient's age and the Patient Clinical Complexity Level (PCCL). The DRG developments for 2007 have improved appropriate case allocation, but once again increased the system's complexity. Clinicians need an ever-growing amount of specific coding know-how. Still, further adjustments to the German DRG system are required to allow for a correct allocation of cases and funds.

  19. Applications and error correction for adiabatic quantum optimization

    NASA Astrophysics Data System (ADS)

    Pudenz, Kristen

    Adiabatic quantum optimization (AQO) is a fast-developing subfield of quantum information processing which holds great promise in the relatively near future. Here we develop an application, quantum anomaly detection, and an error correction code, Quantum Annealing Correction (QAC), for use with AQO. The motivation for the anomaly detection algorithm is the problematic nature of classical software verification and validation (V&V). The number of lines of code written for safety-critical applications such as cars and aircraft increases each year, and with it the cost of finding errors grows exponentially (the cost of overlooking errors, which can be measured in human safety, is arguably even higher). We approach the V&V problem by using a quantum machine learning algorithm to identify characteristics of software operations that are implemented outside of specifications, and then define an AQO problem that returns these anomalous operations as its result. Our error correction work is the first large-scale experimental demonstration of quantum error correcting codes. We develop QAC and apply it to the first and second generations of commercially available D-Wave AQO processors operated at USC. We first show comprehensive experimental results for the code's performance on antiferromagnetic chains, scaling the problem size up to 86 logical qubits (344 physical qubits) and recovering significant encoded success rates even when the unencoded success rates drop to almost nothing. A broader set of randomized benchmarking problems is then introduced, for which we observe similar behavior to the antiferromagnetic chain, specifically that the use of QAC is almost always advantageous for problems of sufficient size and difficulty. Along the way, we develop problem-specific optimizations for the code and gain insight into the various on-chip error mechanisms (most prominently thermal noise, since the hardware operates at finite temperature) and the ways QAC counteracts them. We finish by showing that the scheme is robust to qubit loss on-chip, a significant benefit when considering an implemented system.

  20. Context influences on TALE–DNA binding revealed by quantitative profiling

    PubMed Central

    Rogers, Julia M.; Barrera, Luis A.; Reyon, Deepak; Sander, Jeffry D.; Kellis, Manolis; Joung, J Keith; Bulyk, Martha L.

    2015-01-01

    Transcription activator-like effector (TALE) proteins recognize DNA using a seemingly simple DNA-binding code, which makes them attractive for use in genome engineering technologies that require precise targeting. Although this code is used successfully to design TALEs to target specific sequences, off-target binding has been observed and is difficult to predict. Here we explore TALE–DNA interactions comprehensively by quantitatively assaying the DNA-binding specificities of 21 representative TALEs to ∼5,000–20,000 unique DNA sequences per protein using custom-designed protein-binding microarrays (PBMs). We find that protein context features exert significant influences on binding. Thus, the canonical recognition code does not fully capture the complexity of TALE–DNA binding. We used the PBM data to develop a computational model, Specificity Inference For TAL-Effector Design (SIFTED), to predict the DNA-binding specificity of any TALE. We provide SIFTED as a publicly available web tool that predicts potential genomic off-target sites for improved TALE design. PMID:26067805

  1. Context influences on TALE-DNA binding revealed by quantitative profiling.

    PubMed

    Rogers, Julia M; Barrera, Luis A; Reyon, Deepak; Sander, Jeffry D; Kellis, Manolis; Joung, J Keith; Bulyk, Martha L

    2015-06-11

    Transcription activator-like effector (TALE) proteins recognize DNA using a seemingly simple DNA-binding code, which makes them attractive for use in genome engineering technologies that require precise targeting. Although this code is used successfully to design TALEs to target specific sequences, off-target binding has been observed and is difficult to predict. Here we explore TALE-DNA interactions comprehensively by quantitatively assaying the DNA-binding specificities of 21 representative TALEs to ∼5,000-20,000 unique DNA sequences per protein using custom-designed protein-binding microarrays (PBMs). We find that protein context features exert significant influences on binding. Thus, the canonical recognition code does not fully capture the complexity of TALE-DNA binding. We used the PBM data to develop a computational model, Specificity Inference For TAL-Effector Design (SIFTED), to predict the DNA-binding specificity of any TALE. We provide SIFTED as a publicly available web tool that predicts potential genomic off-target sites for improved TALE design.
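
    The canonical TALE code treats each repeat's RVD as an independent base preference, which is exactly the kind of position-independent model the paper shows does not fully capture binding. A toy scoring sketch (the preference weights are invented; the RVD-base associations are the textbook ones, and context-aware models such as SIFTED add protein-context features on top of this):

        import math

        RVD_PREFS = {   # canonical RVD-base associations, invented weights
            "NI": {"A": 0.90, "C": 0.03, "G": 0.04, "T": 0.03},
            "HD": {"A": 0.03, "C": 0.90, "G": 0.03, "T": 0.04},
            "NG": {"A": 0.04, "C": 0.03, "G": 0.03, "T": 0.90},
            "NN": {"A": 0.35, "C": 0.05, "G": 0.55, "T": 0.05},
        }

        def tale_score(rvds, site):
            """Log-likelihood of a DNA site under independent per-repeat preferences."""
            return sum(math.log(RVD_PREFS[r][b]) for r, b in zip(rvds, site))

        print(tale_score(["NI", "HD", "NG", "NN"], "ACTG"))  # on-target site
        print(tale_score(["NI", "HD", "NG", "NN"], "ACTT"))  # one mismatch scores lower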

  2. Energy Cost Impact of Non-Residential Energy Code Requirements

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhang, Jian; Hart, Philip R.; Rosenberg, Michael I.

    2016-08-22

    The 2012 International Energy Conservation Code contains 396 separate requirements applicable to non-residential buildings; however, there is no systematic analysis of the energy cost impact of each requirement. Consequently, limited code department budgets for plan review, inspection, and training cannot be focused on the most impactful items. An inventory and ranking of code requirements based on their potential energy cost impact is under development. The initial phase focuses on office buildings with simple HVAC systems in climate zone 4C. Prototype building simulations were used to estimate the energy cost impact of varying levels of non-compliance. A preliminary estimate of the probability of occurrence of each level of non-compliance was combined with the estimated lost savings for each level to rank the requirements according to expected savings impact. The methodology to develop and refine further energy cost impacts, specific to building type, system type, and climate location, is demonstrated. As results are developed, an innovative alternative method for compliance verification can focus efforts so only the most impactful requirements from an energy cost perspective are verified for every building and a subset of the less impactful requirements are verified on a random basis across a building population. The results can be further applied in prioritizing training material development and specific areas of building official training.

  3. Temporal Code-Driven Stimulation: Definition and Application to Electric Fish Signaling

    PubMed Central

    Lareo, Angel; Forlim, Caroline G.; Pinto, Reynaldo D.; Varona, Pablo; Rodriguez, Francisco de Borja

    2016-01-01

    Closed-loop activity-dependent stimulation is a powerful methodology to assess information processing in biological systems. In this context, the development of novel protocols, their implementation in bioinformatics toolboxes and their application to different description levels open up a wide range of possibilities in the study of biological systems. We developed a methodology for studying biological signals representing them as temporal sequences of binary events. A specific sequence of these events (code) is chosen to deliver a predefined stimulation in a closed-loop manner. The response to this code-driven stimulation can be used to characterize the system. This methodology was implemented in a real time toolbox and tested in the context of electric fish signaling. We show that while there are codes that evoke a response that cannot be distinguished from a control recording without stimulation, other codes evoke a characteristic distinct response. We also compare the code-driven response to open-loop stimulation. The discussed experiments validate the proposed methodology and the software toolbox. PMID:27766078

  4. Temporal Code-Driven Stimulation: Definition and Application to Electric Fish Signaling.

    PubMed

    Lareo, Angel; Forlim, Caroline G; Pinto, Reynaldo D; Varona, Pablo; Rodriguez, Francisco de Borja

    2016-01-01

    Closed-loop activity-dependent stimulation is a powerful methodology to assess information processing in biological systems. In this context, the development of novel protocols, their implementation in bioinformatics toolboxes and their application to different description levels open up a wide range of possibilities in the study of biological systems. We developed a methodology for studying biological signals representing them as temporal sequences of binary events. A specific sequence of these events (code) is chosen to deliver a predefined stimulation in a closed-loop manner. The response to this code-driven stimulation can be used to characterize the system. This methodology was implemented in a real time toolbox and tested in the context of electric fish signaling. We show that while there are codes that evoke a response that cannot be distinguished from a control recording without stimulation, other codes evoke a characteristic distinct response. We also compare the code-driven response to open-loop stimulation. The discussed experiments validate the proposed methodology and the software toolbox.
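
    In this scheme the recorded signal is binarized into a sequence of events, and a stimulus fires whenever a chosen code appears. A minimal closed-loop sketch (the code word and the `stimulate` callback are illustrative placeholders for the real-time toolbox):

        from collections import deque

        def code_driven_stimulation(event_stream, code=(1, 0, 1, 1), stimulate=print):
            """Deliver a stimulus each time the chosen bit pattern appears."""
            window = deque(maxlen=len(code))
            for t, bit in enumerate(event_stream):
                window.append(bit)
                if tuple(window) == code:
                    stimulate(f"stimulus at event {t}")  # closed-loop trigger

        code_driven_stimulation([1, 0, 1, 1, 0, 1, 0, 1, 1])
        # -> stimulus at event 3, stimulus at event 8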

  5. Advanced Spectral Modeling Development

    DTIC Science & Technology

    1992-09-14

    above, the AFGL line-by-line code already possesses many of the attributes desired of a generally applicable transmittance/radiance simulation code, it...transmittance calculations, (b) perform generalized multiple scattering calculations, (c) calculate both heating and dissociative fluxes, (d) provide...This report is subdivided into task specific subsections. The following section describes our general approach to address these technical issues (Section

  6. Global transcriptome analysis reveals extensive gene remodeling, alternative splicing and differential transcription profiles in non-seed vascular plant Selaginella moellendorffii.

    PubMed

    Zhu, Yan; Chen, Longxian; Zhang, Chengjun; Hao, Pei; Jing, Xinyun; Li, Xuan

    2017-01-25

    Selaginella moellendorffii, a lycophyte, is a model plant for studying the early evolution and development of vascular plants. As the first and only sequenced lycophyte to date, the genome of S. moellendorffii revealed many conserved genes and pathways, as well as specialized genes that differ from flowering plants. Despite the progress made, little is known about long noncoding RNAs (lncRNAs) and the alternative splicing (AS) of coding genes in S. moellendorffii. Its coding gene models have not been fully validated with transcriptome data. Furthermore, it remains important to understand whether regulatory mechanisms similar to those of flowering plants are used, and how they operate, in a non-seed primitive vascular plant. RNA-sequencing (RNA-seq) was performed for three S. moellendorffii tissues, root, stem, and leaf, by constructing strand-specific RNA-seq libraries from RNA purified using the RiboMinus isolation protocol. A total of 176 million reads (44 Gbp) were obtained from the three tissue types and mapped to the S. moellendorffii genome. By comparing with the 22,285 existing gene models of S. moellendorffii, we identified 7930 high-confidence novel coding genes (a 35.6% increase), and for the first time report 4422 lncRNAs in a lycophyte. Further, we refined 2461 (11.0%) of the existing gene models, and identified 11,030 AS events (for 5957 coding genes), revealed for the first time for lycophytes. Tissue-specific gene expression with functional implications was analyzed, and 1031, 554, and 269 coding genes, and 174, 39, and 17 lncRNAs were identified in root, stem, and leaf tissues, respectively. The expression of critical genes for vascular development stages, i.e. formation of provascular cells, xylem specification and differentiation, and phloem specification and differentiation, was compared in S. moellendorffii tissues, indicating a less complex regulatory mechanism in lycophytes than in flowering plants. The results were further strengthened by the evolutionary trend of seven transcription factor families related to vascular development, observed among four representative species of seed and non-seed vascular plants, and nonvascular land and aquatic plants. The deep RNA-seq study of S. moellendorffii discovered extensive new gene content, including novel coding genes, lncRNAs, AS events, and refined gene models. Compared to flowering vascular plants, S. moellendorffii displayed less complexity in gene structure, alternative splicing, and the regulatory elements of vascular development. The study offers important insight into the evolution of vascular plants and the regulatory mechanism of vascular development in a non-seed plant.

  7. Validation of Case Finding Algorithms for Hepatocellular Cancer from Administrative Data and Electronic Health Records using Natural Language Processing

    PubMed Central

    Sada, Yvonne; Hou, Jason; Richardson, Peter; El-Serag, Hashem; Davila, Jessica

    2013-01-01

    Background Accurate identification of hepatocellular cancer (HCC) cases from automated data is needed for efficient and valid quality improvement initiatives and research. We validated HCC ICD-9 codes, and evaluated whether natural language processing (NLP) by the Automated Retrieval Console (ARC) for document classification improves HCC identification. Methods We identified a cohort of patients with ICD-9 codes for HCC during 2005–2010 from Veterans Affairs administrative data. Pathology and radiology reports were reviewed to confirm HCC. The positive predictive value (PPV), sensitivity, and specificity of ICD-9 codes were calculated. A split validation study of pathology and radiology reports was performed to develop and validate ARC algorithms. Reports were manually classified as diagnostic of HCC or not. ARC generated document classification algorithms using the Clinical Text Analysis and Knowledge Extraction System. ARC performance was compared to manual classification. PPV, sensitivity, and specificity of ARC were calculated. Results 1138 patients with HCC were identified by ICD-9 codes. Based on manual review, 773 had HCC. The HCC ICD-9 code algorithm had a PPV of 0.67, sensitivity of 0.95, and specificity of 0.93. For a random subset of 619 patients, we identified 471 pathology reports for 323 patients and 943 radiology reports for 557 patients. The pathology ARC algorithm had PPV of 0.96, sensitivity of 0.96, and specificity of 0.97. The radiology ARC algorithm had PPV of 0.75, sensitivity of 0.94, and specificity of 0.68. Conclusion A combined approach of ICD-9 codes and NLP of pathology and radiology reports improves HCC case identification in automated data. PMID:23929403

  8. Validation of Case Finding Algorithms for Hepatocellular Cancer From Administrative Data and Electronic Health Records Using Natural Language Processing.

    PubMed

    Sada, Yvonne; Hou, Jason; Richardson, Peter; El-Serag, Hashem; Davila, Jessica

    2016-02-01

    Accurate identification of hepatocellular cancer (HCC) cases from automated data is needed for efficient and valid quality improvement initiatives and research. We validated HCC International Classification of Diseases, 9th Revision (ICD-9) codes, and evaluated whether natural language processing by the Automated Retrieval Console (ARC) for document classification improves HCC identification. We identified a cohort of patients with ICD-9 codes for HCC during 2005-2010 from Veterans Affairs administrative data. Pathology and radiology reports were reviewed to confirm HCC. The positive predictive value (PPV), sensitivity, and specificity of ICD-9 codes were calculated. A split validation study of pathology and radiology reports was performed to develop and validate ARC algorithms. Reports were manually classified as diagnostic of HCC or not. ARC generated document classification algorithms using the Clinical Text Analysis and Knowledge Extraction System. ARC performance was compared with manual classification. PPV, sensitivity, and specificity of ARC were calculated. A total of 1138 patients with HCC were identified by ICD-9 codes. On the basis of manual review, 773 had HCC. The HCC ICD-9 code algorithm had a PPV of 0.67, sensitivity of 0.95, and specificity of 0.93. For a random subset of 619 patients, we identified 471 pathology reports for 323 patients and 943 radiology reports for 557 patients. The pathology ARC algorithm had PPV of 0.96, sensitivity of 0.96, and specificity of 0.97. The radiology ARC algorithm had PPV of 0.75, sensitivity of 0.94, and specificity of 0.68. A combined approach of ICD-9 codes and natural language processing of pathology and radiology reports improves HCC case identification in automated data.
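
    The combined approach is a two-stage rule: a sensitive ICD-9 screen followed by NLP confirmation on pathology and radiology reports. A hedged sketch (the keyword classifier is a trivial stand-in for the ARC/cTAKES pipeline, and the code set is illustrative):

        HCC_ICD9 = {"155.0"}   # illustrative; primary liver-cell carcinoma code

        def classify_report(text):
            """Trivial keyword stand-in for the NLP document classifier."""
            return "HCC" if "hepatocellular" in text.lower() else "other"

        def is_hcc_case(patient):
            if not set(patient["icd9"]) & HCC_ICD9:
                return False                        # ICD-9 screen keeps sensitivity high
            reports = patient["pathology"] + patient["radiology"]
            return any(classify_report(r) == "HCC" for r in reports)  # raises PPV

        print(is_hcc_case({"icd9": ["155.0"],
                           "pathology": ["Biopsy consistent with hepatocellular carcinoma."],
                           "radiology": []}))   # -> True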

  9. Research in Computational Aeroscience Applications Implemented on Advanced Parallel Computing Systems

    NASA Technical Reports Server (NTRS)

    Wigton, Larry

    1996-01-01

    Improvements to the numerical linear algebra routines for use in new Navier-Stokes codes, specifically Tim Barth's unstructured-grid code, with spin-offs to TRANAIR, are reported. A fast distance-calculation routine for Navier-Stokes codes using the new one-equation turbulence models was written. The primary focus of this work was devoted to improving matrix-iterative methods. New algorithms have been developed which activate the full potential of classical Cray-class computers as well as distributed-memory parallel computers.

  10. INL Results for Phases I and III of the OECD/NEA MHTGR-350 Benchmark

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gerhard Strydom; Javier Ortensi; Sonat Sen

    2013-09-01

    The Idaho National Laboratory (INL) Very High Temperature Reactor (VHTR) Technology Development Office (TDO) Methods Core Simulation group led the construction of the Organization for Economic Cooperation and Development (OECD) Modular High Temperature Reactor (MHTGR) 350 MW benchmark for comparing and evaluating prismatic VHTR analysis codes. The benchmark is sponsored by the OECD's Nuclear Energy Agency (NEA), and the project will yield a set of reference steady-state, transient, and lattice depletion problems that can be used by the Department of Energy (DOE), the Nuclear Regulatory Commission (NRC), and vendors to assess their code suites. The Methods group is responsible for defining the benchmark specifications, leading the data collection and comparison activities, and chairing the annual technical workshops. This report summarizes the latest INL results for Phase I (steady state) and Phase III (lattice depletion) of the benchmark. The INSTANT, Pronghorn and RattleSnake codes were used for the standalone core neutronics modeling of Exercise 1, and the results obtained from these codes are compared in Section 4. Exercise 2 of Phase I requires the standalone steady-state thermal fluids modeling of the MHTGR-350 design, and the results for the systems code RELAP5-3D are discussed in Section 5. The coupled neutronics and thermal fluids steady-state solution for Exercise 3 is reported in Section 6, utilizing the newly developed Parallel and Highly Innovative Simulation for INL Code System (PHISICS)/RELAP5-3D code suite. Finally, the lattice depletion models and results obtained for Phase III are compared in Section 7. The MHTGR-350 benchmark proved to be a challenging set of problems to model accurately, and even with the simplifications introduced in the benchmark specification this activity is an important step in the code-to-code verification of modern prismatic VHTR codes. A final OECD/NEA comparison report will compare the Phase I and III results of all other international participants in 2014, while the remaining Phase II transient case results will be reported in 2015.

  11. Computer-based coding of free-text job descriptions to efficiently identify occupations in epidemiological studies

    PubMed Central

    Russ, Daniel E.; Ho, Kwan-Yuet; Colt, Joanne S.; Armenti, Karla R.; Baris, Dalsu; Chow, Wong-Ho; Davis, Faith; Johnson, Alison; Purdue, Mark P.; Karagas, Margaret R.; Schwartz, Kendra; Schwenn, Molly; Silverman, Debra T.; Johnson, Calvin A.; Friesen, Melissa C.

    2016-01-01

    Background Mapping job titles to standardized occupation classification (SOC) codes is an important step in identifying occupational risk factors in epidemiologic studies. Because manual coding is time-consuming and has moderate reliability, we developed an algorithm called SOCcer (Standardized Occupation Coding for Computer-assisted Epidemiologic Research) to assign SOC-2010 codes based on free-text job description components. Methods Job title and task-based classifiers were developed by comparing job descriptions to multiple sources linking job and task descriptions to SOC codes. An industry-based classifier was developed based on the SOC prevalence within an industry. These classifiers were used in a logistic model trained using 14,983 jobs with expert-assigned SOC codes to obtain empirical weights for an algorithm that scored each SOC/job description. We assigned the highest scoring SOC code to each job. SOCcer was validated in two occupational data sources by comparing SOC codes obtained from SOCcer to expert assigned SOC codes and lead exposure estimates obtained by linking SOC codes to a job-exposure matrix. Results For 11,991 case-control study jobs, SOCcer-assigned codes agreed with 44.5% and 76.3% of manually assigned codes at the 6- and 2-digit level, respectively. Agreement increased with the score, providing a mechanism to identify assignments needing review. Good agreement was observed between lead estimates based on SOCcer and manual SOC assignments (kappa: 0.6–0.8). Poorer performance was observed for inspection job descriptions, which included abbreviations and worksite-specific terminology. Conclusions Although some manual coding will remain necessary, using SOCcer may improve the efficiency of incorporating occupation into large-scale epidemiologic studies. PMID:27102331
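
    At assignment time, SOCcer-style scoring reduces to combining classifier outputs in a logistic model and taking the best-scoring SOC. A sketch with invented weights and feature values (the real empirical weights come from the logistic model trained on the 14,983 expert-coded jobs):

        import math

        # Hypothetical logistic weights for the three classifier scores.
        WEIGHTS = {"intercept": -4.0, "title": 5.0, "task": 2.0, "industry": 1.0}

        def soc_score(features):
            z = WEIGHTS["intercept"] + sum(WEIGHTS[k] * v for k, v in features.items())
            return 1.0 / (1.0 + math.exp(-z))   # logistic combination

        def assign_soc(candidates):
            """candidates: {soc_code: {"title": s1, "task": s2, "industry": s3}}"""
            return max(candidates, key=lambda soc: soc_score(candidates[soc]))

        job = {"47-2061": {"title": 0.9, "task": 0.7, "industry": 0.4},
               "53-7062": {"title": 0.3, "task": 0.5, "industry": 0.6}}
        print(assign_soc(job))   # -> "47-2061"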

  12. Determination of photovoltaic concentrator optical design specifications using performance modeling

    NASA Astrophysics Data System (ADS)

    Kerschen, Kevin A.; Levy, Sheldon L.

    The strategy used to develop an optical design specification for a 500X concentration photovoltaic module to be used with a 28-percent-efficient concentrator photovoltaic cell is reported. The computer modeling code (PVOPTICS) developed for this purpose, a Fresnel lens design strategy, and optical component specification procedures are described. Comparisons are made between the predicted performance and the measured performance of components fabricated to those specifications. An acrylic lens and a reflective secondary optical element have been tested, showing efficiencies exceeding 88 percent.

  13. The study on dynamic cadastral coding rules based on kinship relationship

    NASA Astrophysics Data System (ADS)

    Xu, Huan; Liu, Nan; Liu, Renyi; Lu, Jingfeng

    2007-06-01

    Cadastral coding rules are an important supplement to the existing national and local standard specifications for building a cadastral database. After analyzing the course of cadastral change, especially parcel change, with the method of object-oriented analysis, a set of dynamic cadastral coding rules based on kinship relationships corresponding to cadastral change is put forward, and a coding format composed of street code, block code, father-parcel code, child-parcel code and grandchild-parcel code is worked out within the county administrative area. The coding rules have been applied in the development of an urban cadastral information system called "ReGIS", which is not only able to work out the cadastral code automatically according to both the type of parcel change and the coding rules, but is also capable of checking whether the code is spatiotemporally unique before the parcel is stored in the database. The system has been used in several cities of Zhejiang Province and has received a favorable response, which verifies the feasibility and effectiveness of the coding rules to some extent.
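
    The kinship-based code concatenates the five segments into one identifier, so a parcel split simply extends the parent's code. A sketch with assumed segment widths (the actual widths come from the local specification, not this record):

        def make_parcel_code(street, block, father, child=0, grandchild=0):
            """Concatenate street/block/father/child/grandchild segments
            (assumed widths 3/3/4/3/3 digits) into one cadastral code."""
            return f"{street:03d}{block:03d}{father:04d}{child:03d}{grandchild:03d}"

        def split_parcel(parent_code, n_children):
            """On a parcel split, children inherit the parent code and receive
            new child numbers, preserving the kinship relationship."""
            street, block, father = parent_code[:3], parent_code[3:6], parent_code[6:10]
            return [street + block + father + f"{i:03d}" + "000"
                    for i in range(1, n_children + 1)]

        parent = make_parcel_code(street=12, block=5, father=38)
        print(split_parcel(parent, 2))  # ['0120050038001000', '0120050038002000']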

  14. Ground Operations Aerospace Language (GOAL). Volume 1: Study overview

    NASA Technical Reports Server (NTRS)

    1973-01-01

    A series of NASA and Contractor studies sponsored by NASA/KSC resulted in a specification for the Ground Operations Aerospace Language (GOAL). The Cape Kennedy Facility of the IBM Corporation was given the responsibility, under existing contracts, to perform an analysis of the Language Specification, to design and develop a GOAL Compiler, to provide a specification for a data bank, to design and develop an interpretive code translator, and to perform associated application studies.

  15. Highly conserved elements discovered in vertebrates are present in non-syntenic loci of tunicates, act as enhancers and can be transcribed during development

    PubMed Central

    Sanges, Remo; Hadzhiev, Yavor; Gueroult-Bellone, Marion; Roure, Agnes; Ferg, Marco; Meola, Nicola; Amore, Gabriele; Basu, Swaraj; Brown, Euan R.; De Simone, Marco; Petrera, Francesca; Licastro, Danilo; Strähle, Uwe; Banfi, Sandro; Lemaire, Patrick; Birney, Ewan; Müller, Ferenc; Stupka, Elia

    2013-01-01

    Co-option of cis-regulatory modules has been suggested as a mechanism for the evolution of expression sites during development. However, the extent and mechanisms involved in mobilization of cis-regulatory modules remains elusive. To trace the history of non-coding elements, which may represent candidate ancestral cis-regulatory modules affirmed during chordate evolution, we have searched for conserved elements in tunicate and vertebrate (Olfactores) genomes. We identified, for the first time, 183 non-coding sequences that are highly conserved between the two groups. Our results show that all but one element are conserved in non-syntenic regions between vertebrate and tunicate genomes, while being syntenic among vertebrates. Nevertheless, in all the groups, they are significantly associated with transcription factors showing specific functions fundamental to animal development, such as multicellular organism development and sequence-specific DNA binding. The majority of these regions map onto ultraconserved elements and we demonstrate that they can act as functional enhancers within the organism of origin, as well as in cross-transgenesis experiments, and that they are transcribed in extant species of Olfactores. We refer to the elements as ‘Olfactores conserved non-coding elements’. PMID:23393190

  16. Los Alamos radiation transport code system on desktop computing platforms

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Briesmeister, J.F.; Brinkley, F.W.; Clark, B.A.

    The Los Alamos Radiation Transport Code System (LARTCS) consists of state-of-the-art Monte Carlo and discrete ordinates transport codes and data libraries. These codes were originally developed many years ago and have undergone continual improvement. With a large initial effort and continued vigilance, the codes are easily portable from one type of hardware to another. The performance of scientific workstations (SWS) has evolved to the point that such platforms can be used routinely to perform sophisticated radiation transport calculations. As personal computer (PC) performance approaches that of the SWS, the hardware options for desktop radiation transport calculations expand considerably. The current status of the radiation transport codes within the LARTCS is described: MCNP, SABRINA, LAHET, ONEDANT, TWODANT, TWOHEX, and ONELD. Specifically, the authors discuss the hardware systems on which the codes run and present code performance comparisons for various machines.

  17. Standardized development of computer software. Part 2: Standards

    NASA Technical Reports Server (NTRS)

    Tausworthe, R. C.

    1978-01-01

    This monograph contains standards for software development and engineering. The book sets forth rules for design, specification, coding, testing, documentation, and quality assurance audits of software; it also contains detailed outlines for the documentation to be produced.

  18. A high order approach to flight software development and testing

    NASA Technical Reports Server (NTRS)

    Steinbacher, J.

    1981-01-01

    The use of a software development facility is discussed as a means of producing a reliable and maintainable ECS software system, and as a means of providing efficient use of the ECS hardware test facility. Principles applied to software design are given, including modularity, abstraction, hiding, and uniformity. The general objectives of each phase of the software life cycle are also given, including testing, maintenance, code development, and requirement specifications. Software development facility tools are summarized, and tool deficiencies recognized in the code development and testing phases are considered. Due to limited lab resources, the functional simulation capabilities may be indispensable in the testing phase.

  19. Increasing Flexibility in Energy Code Compliance: Performance Packages

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hart, Philip R.; Rosenberg, Michael I.

    Energy codes and standards have provided significant increases in building efficiency over the last 38 years, since the first national energy code was published in late 1975. The most commonly used path in energy codes, the prescriptive path, appears to be reaching a point of diminishing returns. As the code matures, the prescriptive path becomes more complicated, and also more restrictive. It is likely that an approach that considers the building as an integrated system will be necessary to achieve the next real gains in building efficiency. Performance code paths are increasing in popularity; however, there remains a significant design team overhead in following the performance path, especially for smaller buildings. This paper focuses on the development of one alternative format, prescriptive packages. A method to develop building-specific prescriptive packages is reviewed, based on multiple runs of prototypical building models used in a parametric decision analysis that determines a set of packages with equivalent energy performance. The approach is designed to be cost-effective and flexible for the design team while achieving a desired level of energy efficiency performance. A demonstration of the approach based on mid-sized office buildings with two HVAC system types is shown, along with a discussion of potential applicability in the energy code process.

  20. Novel inter and intra prediction tools under consideration for the emerging AV1 video codec

    NASA Astrophysics Data System (ADS)

    Joshi, Urvang; Mukherjee, Debargha; Han, Jingning; Chen, Yue; Parker, Sarah; Su, Hui; Chiang, Angie; Xu, Yaowu; Liu, Zoe; Wang, Yunqing; Bankoski, Jim; Wang, Chen; Keyder, Emil

    2017-09-01

    Google started the WebM Project in 2010 to develop open source, royalty-free video codecs designed specifically for media on the Web. The second-generation codec released by the WebM project, VP9, is currently served by YouTube and enjoys billions of views per day. Realizing the need for even greater compression efficiency to cope with the growing demand for video on the web, the WebM team embarked on an ambitious project to develop a next-edition codec, AV1, in a consortium of major tech companies called the Alliance for Open Media, that achieves at least a generational improvement in coding efficiency over VP9. In this paper, we focus primarily on new tools in AV1 that improve the prediction of pixel blocks before transforms, quantization and entropy coding are invoked. Specifically, we describe tools and coding modes that improve intra, inter and combined inter-intra prediction. Results are presented on standard test sets.

  1. Raptor: An Enterprise Knowledge Discovery Engine Version 2.0

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    2011-08-31

    The Raptor Version 2.0 computer code uses a set of documents as seed documents to recommend documents of interest from a large target set of documents. The computer code provides results that show the recommended documents with the highest similarity to the seed documents. Version 2.0 was specifically developed to work with SharePoint 2007 and MS SQL Server.
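
    Seed-based recommendation of this kind is commonly implemented as a similarity ranking against the seed set. A hedged sketch (TF-IDF plus cosine similarity is an illustrative choice; the record does not describe Raptor's actual method):

        import numpy as np
        from sklearn.feature_extraction.text import TfidfVectorizer
        from sklearn.metrics.pairwise import cosine_similarity

        def recommend(seed_docs, target_docs, top_n=5):
            """Rank target documents by similarity to the centroid of the seeds."""
            vec = TfidfVectorizer(stop_words="english")
            matrix = vec.fit_transform(seed_docs + target_docs)
            centroid = np.asarray(matrix[:len(seed_docs)].mean(axis=0))
            sims = cosine_similarity(matrix[len(seed_docs):], centroid).ravel()
            order = sims.argsort()[::-1][:top_n]
            return [target_docs[i] for i in order]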

  2. Channel coding/decoding alternatives for compressed TV data on advanced planetary missions.

    NASA Technical Reports Server (NTRS)

    Rice, R. F.

    1972-01-01

    The compatibility of channel coding/decoding schemes with a specific TV compressor developed for advanced planetary missions is considered. Under certain conditions, it is shown that compressed data can be transmitted at approximately the same rate as uncompressed data without any loss in quality. Thus, the full gains of data compression can be achieved in real-time transmission.

  3. NARMER-1: a photon point-kernel code with build-up factors

    NASA Astrophysics Data System (ADS)

    Visonneau, Thierry; Pangault, Laurence; Malouch, Fadhel; Malvagi, Fausto; Dolci, Florence

    2017-09-01

    This paper presents an overview of NARMER-1, the new generation of photon point-kernel code developed by the Reactor Studies and Applied Mathematics Unit (SERMA) at the CEA Saclay Center. After a short introduction giving some historical background and the current development context of the code, the paper presents the principles implemented in the calculation and the physical quantities computed, and surveys the generic features: programming language, computer platforms, geometry package, description of sources, etc. Moreover, specific and recent features are also detailed: exclusion sphere, tetrahedral meshes, parallel operations. Then some points about verification and validation are presented. Finally, we present some tools that can help the user with operations like visualization and pre-processing.
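
    The point-kernel method itself is compact: the uncollided flux from a point source, attenuated exponentially, is multiplied by a build-up factor to account for scattered photons. A minimal sketch (the Berger-form build-up coefficients are illustrative textbook values, not NARMER-1 data):

        import math

        def photon_flux(source_strength, mu, r, a=1.0, b=0.05):
            """Flux at distance r (cm) from an isotropic point source in a medium
            with attenuation coefficient mu (1/cm), using the Berger build-up
            form B(mu*r) = 1 + a*mu*r*exp(b*mu*r) with assumed coefficients."""
            mur = mu * r
            buildup = 1.0 + a * mur * math.exp(b * mur)
            uncollided = source_strength * math.exp(-mur) / (4.0 * math.pi * r * r)
            return buildup * uncollided

        print(photon_flux(1.0e9, mu=0.06, r=100.0))   # photons / cm^2 / s, illustrative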

  4. An Idealized, Single Radial Swirler, Lean-Direct-Injection (LDI) Concept Meshing Script

    NASA Technical Reports Server (NTRS)

    Iannetti, Anthony C.; Thompson, Daniel

    2008-01-01

    To easily study combustor design parameters using computational fluid dynamics codes (CFD), a Gridgen Glyph-based macro (based on the Tcl scripting language) dubbed BladeMaker has been developed for the meshing of an idealized, single radial swirler, lean-direct-injection (LDI) combustor. BladeMaker is capable of taking in a number of parameters, such as blade width, blade tilt with respect to the perpendicular, swirler cup radius, and grid densities, and producing a three-dimensional meshed radial swirler with a can-annular (canned) combustor. This complex script produces a data format suitable for but not specific to the National Combustion Code (NCC), a state-of-the-art CFD code developed for reacting flow processes.

  5. Computational Infrastructure for Engine Structural Performance Simulation

    NASA Technical Reports Server (NTRS)

    Chamis, Christos C.

    1997-01-01

    Select computer codes developed over the years to simulate specific aspects of engine structures are described. These codes include blade-impact integrated multidisciplinary analysis and optimization, progressive structural fracture, quantification of uncertainties for structural reliability and risk, benefits estimation of new technology insertion, and hierarchical simulation of engine structures made from metal matrix and ceramic matrix composites. Collectively these codes constitute a unique infrastructure readiness to credibly evaluate new and future engine structural concepts throughout the development cycle: from initial concept, to design and fabrication, to service performance and maintenance and repairs, to retirement for cause, and even to possible recycling. Stated differently, they provide 'virtual' concurrent engineering for the total life cycle of engine structures.

  6. Shared Memory Parallelization of an Implicit ADI-type CFD Code

    NASA Technical Reports Server (NTRS)

    Hauser, Th.; Huang, P. G.

    1999-01-01

    A parallelization study designed for ADI-type algorithms is presented using the OpenMP specification for shared-memory multiprocessor programming. Details of optimizations specifically addressed to cache-based computer architectures are described, and performance measurements for the single- and multiprocessor implementations are summarized. The paper demonstrates that optimization of memory access on a cache-based computer architecture controls the performance of the computational algorithm. A hybrid MPI/OpenMP approach is proposed for clusters of shared-memory machines to further enhance the parallel performance. The method is applied to develop a new LES/DNS code, named LESTool. A preliminary DNS calculation of a fully developed channel flow at a friction Reynolds number Re_tau = 180 has shown good agreement with existing data.

  7. Fan Noise Prediction System Development: Source/Radiation Field Coupling and Workstation Conversion for the Acoustic Radiation Code

    NASA Technical Reports Server (NTRS)

    Meyer, H. D.

    1993-01-01

    The Acoustic Radiation Code (ARC) is a finite element program used on the IBM mainframe to predict far-field acoustic radiation from a turbofan engine inlet. In this report, requirements for developers of internal aerodynamic codes regarding use of their program output as input for the ARC are discussed. More specifically, the particular input needed from the Bolt, Beranek and Newman/Pratt and Whitney (turbofan source noise generation) Code (BBN/PWC) is described. In a separate analysis, a method of coupling the source and radiation models that recognizes waves crossing the interface in both directions has been derived. A preliminary version of the coupled code has been developed and used for initial evaluation of coupling issues. Results thus far have shown that reflection from the inlet is sufficient to indicate that full coupling of the source and radiation fields is needed for accurate noise predictions. Also, for this contract, the ARC has been modified for use on the Sun and Silicon Graphics Iris UNIX workstations. Changes and additions involved in this effort are described in an appendix.

  8. Energy Savings Analysis of the Proposed NYStretch-Energy Code 2018

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Liu, Bing; Zhang, Jian; Chen, Yan

    This study was conducted by the Pacific Northwest National Laboratory (PNNL) in support of the stretch energy code development led by the New York State Energy Research and Development Authority (NYSERDA). In 2017 NYSERDA developed its 2016 Stretch Code Supplement to the 2016 New York State Energy Conservation Construction Code (hereinafter referred to as “NYStretch-Energy”). NYStretch-Energy is intended as a model energy code for statewide voluntary adoption that anticipates other code advancements culminating in the goal of a statewide Net Zero Energy Code by 2028. Since then, NYSERDA continues to develop the NYStretch-Energy Code 2018 edition. To support the effort, PNNL conducted energy simulation analysis to quantify the energy savings of proposed commercial provisions of the NYStretch-Energy Code (2018) in New York. The focus of this project is the 20% improvement over existing commercial model energy codes. A key requirement of the proposed stretch code is that it be ‘adoptable’ as an energy code, meaning that it must align with current code scope and limitations, and primarily impact building components that are currently regulated by local building departments. It is largely limited to prescriptive measures, which are what most building departments and design projects are most familiar with. This report describes a set of energy-efficiency measures (EEMs) that demonstrate 20% energy savings over ANSI/ASHRAE/IES Standard 90.1-2013 (ASHRAE 2013) across a broad range of commercial building types and all three climate zones in New York. In collaboration with New Building Institute, the EEMs were developed from national model codes and standards, high-performance building codes and standards, regional energy codes, and measures being proposed as part of the on-going code development process. PNNL analyzed these measures using whole building energy models for selected prototype commercial buildings and multifamily buildings representing buildings in New York. Section 2 of this report describes the analysis methodology, including the building types and construction area weights update for this analysis, the baseline, and the method to conduct the energy saving analysis. Section 3 provides detailed specifications of the EEMs and bundles. Section 4 summarizes the results of individual EEMs and EEM bundles by building type, energy end-use and climate zone. Appendix A documents detailed descriptions of the selected prototype buildings. Appendix B provides energy end-use breakdown results by building type for both the baseline code and stretch code in all climate zones.

  9. Energy Storage System Safety: Plan Review and Inspection Checklist

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cole, Pam C.; Conover, David R.

    Codes, standards, and regulations (CSR) governing the design, construction, installation, commissioning, and operation of the built environment are intended to protect the public health, safety, and welfare. While these documents change over time to address new technology and new safety challenges, there is generally some lag time between the introduction of a technology into the market and the time it is specifically covered in model codes and standards developed in the voluntary sector. After their development, there is also a timeframe of at least a year or two until the codes and standards are adopted. Until existing model codes and standards are updated or new ones are developed and then adopted, one seeking to deploy energy storage technologies or needing to verify the safety of an installation may be challenged in trying to apply currently implemented CSRs to an energy storage system (ESS). The Energy Storage System Guide for Compliance with Safety Codes and Standards (CG), developed in June 2016, is intended to help address the acceptability of the design and construction of stationary ESSs, their component parts, and the siting, installation, commissioning, operations, maintenance, and repair/renovation of ESS within the built environment.

  10. Complete analysis of steady and transient missile aerodynamic/propulsive/plume flowfield interactions

    NASA Astrophysics Data System (ADS)

    York, B. J.; Sinha, N.; Dash, S. M.; Hosangadi, A.; Kenzakowski, D. C.; Lee, R. A.

    1992-07-01

    The analysis of steady and transient aerodynamic/propulsive/plume flowfield interactions utilizing several state-of-the-art computer codes (PARCH, CRAFT, and SCHAFT) is discussed. These codes have been extended to include advanced turbulence models, generalized thermochemistry, and multiphase nonequilibrium capabilities. Several specialized versions of these codes have been developed for specific applications. This paper presents a brief overview of these codes followed by selected cases demonstrating steady and transient analyses of conventional as well as advanced missile systems. Areas requiring upgrades include turbulence modeling in a highly compressible environment and the treatment of particulates in general. Recent progress in these areas is highlighted.

  11. Changes in the Coding and Non-coding Transcriptome and DNA Methylome that Define the Schwann Cell Repair Phenotype after Nerve Injury.

    PubMed

    Arthur-Farraj, Peter J; Morgan, Claire C; Adamowicz, Martyna; Gomez-Sanchez, Jose A; Fazal, Shaline V; Beucher, Anthony; Razzaghi, Bonnie; Mirsky, Rhona; Jessen, Kristjan R; Aitman, Timothy J

    2017-09-12

    Repair Schwann cells play a critical role in orchestrating nerve repair after injury, but the cellular and molecular processes that generate them are poorly understood. Here, we perform a combined whole-genome, coding and non-coding RNA and CpG methylation study following nerve injury. We show that genes involved in the epithelial-mesenchymal transition are enriched in repair cells, and we identify several long non-coding RNAs in Schwann cells. We demonstrate that the AP-1 transcription factor C-JUN regulates the expression of certain microRNAs in repair Schwann cells, in particular miR-21 and miR-34. Surprisingly, unlike during development, changes in CpG methylation are limited in injury, restricted to specific locations, such as enhancer regions of Schwann cell-specific genes (e.g., Nedd4l), and close to local enrichment of AP-1 motifs. These genetic and epigenomic changes broaden our mechanistic understanding of the formation of repair Schwann cells during peripheral nervous system tissue repair.

  12. Sustaining Open Source Communities through Hackathons - An Example from the ASPECT Community

    NASA Astrophysics Data System (ADS)

    Heister, T.; Hwang, L.; Bangerth, W.; Kellogg, L. H.

    2016-12-01

    The ecosystem surrounding a successful scientific open source software package combines both social and technical aspects. Much thought has been given to the technology side of writing sustainable software for large infrastructure projects and software libraries, but less to building the human capacity to perpetuate scientific software used in computational modeling. One effective format for building capacity is the regular multi-day hackathon. Scientific hackathons bring together a group of science domain users and scientific software contributors to make progress on a specific software package. Innovation comes through the chance to work with established and new collaborations. Especially in domain sciences with small communities, hackathons give geographically distributed scientists an opportunity to connect face-to-face. They foster lively discussions amongst scientists with different expertise, promote new collaborations, and increase transparency in both the technical and scientific aspects of code development. ASPECT is an open source, parallel, extensible finite element code to simulate thermal convection, which began development in 2011 under the Computational Infrastructure for Geodynamics. ASPECT hackathons over the past 3 years have grown the number of authors to >50, training new code maintainers in the process. Hackathons begin with leaders establishing project-specific conventions for development, demonstrating the workflow for code contributions, and reviewing relevant technical skills. Each hackathon expands the developer community: over 20 scientists add >6,000 lines of code during the >1 week event. Participants grow comfortable contributing to the repository, and over half continue to contribute afterwards. A high return rate of participants ensures continuity and stability of the group as well as mentoring for novice members. We hope to build other software communities on this model, but anticipate each will bring its own unique challenges.

  13. TOUGH+ v1.5 Core Code

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Moridis, George J.

    TOUGH+ v1.5 is a numerical code for the simulation of multi-phase, multi-component flow and transport of mass and heat through porous and fractured media, and represents the third update of the code since its first release [Moridis et al., 2008]. TOUGH+ is a successor to the TOUGH2 [Pruess et al., 1991; 2012] family of codes for multi-component, multiphase fluid and heat flow developed at the Lawrence Berkeley National Laboratory. It is written in standard FORTRAN 95/2003, and can be run on any computational platform (workstations, PC, Macintosh). TOUGH+ v1.5 employs dynamic memory allocation, thus minimizing storage requirements. It has a completely modular structure, follows the tenets of Object-Oriented Programming (OOP), and involves the advanced features of FORTRAN 95/2003, i.e., modules, derived data types, the use of pointers, lists and trees, data encapsulation, defined operators and assignments, operator extension and overloading, use of generic procedures, and maximum use of the powerful intrinsic vector and matrix processing operations. TOUGH+ v1.5 is the core code for its family of applications, i.e., the part of the code that is common to all its applications. It provides a description of the underlying physics and thermodynamics of non-isothermal flow, of the mathematical and numerical approaches, as well as a detailed explanation of the general (common to all applications) input requirements, options, capabilities and output specifications. The core code cannot run by itself: it needs to be coupled with the code for the specific TOUGH+ application option that describes a particular type of problem. The additional input requirements specific to a particular TOUGH+ application option and related illustrative examples can be found in the corresponding User's Manual.

  14. Identification of Hospitalizations for Intentional Self-Harm when E-Codes are Incompletely Recorded

    PubMed Central

    Patrick, Amanda R.; Miller, Matthew; Barber, Catherine W.; Wang, Philip S.; Canning, Claire F.; Schneeweiss, Sebastian

    2010-01-01

    Context Suicidal behavior has gained attention as an adverse outcome of prescription drug use. Hospitalizations for intentional self-harm, including suicide, can be identified in administrative claims databases using external cause of injury codes (E-codes). However, rates of E-code completeness in US government and commercial claims databases are low due to issues with hospital billing software. Objective To develop an algorithm to identify intentional self-harm hospitalizations using recorded injury and psychiatric diagnosis codes in the absence of E-code reporting. Methods We sampled hospitalizations with an injury diagnosis (ICD-9 800–995) from 2 databases with high rates of E-coding completeness: 1999–2001 British Columbia, Canada data and the 2004 U.S. Nationwide Inpatient Sample. Our gold standard for intentional self-harm was a diagnosis of E950-E958. We constructed algorithms to identify these hospitalizations using information on type of injury and presence of specific psychiatric diagnoses. Results The algorithm that identified intentional self-harm hospitalizations with high sensitivity and specificity was a diagnosis of poisoning; toxic effects; open wound to elbow, wrist, or forearm; or asphyxiation; plus a diagnosis of depression, mania, personality disorder, psychotic disorder, or adjustment reaction. This had a sensitivity of 63%, specificity of 99%, and positive predictive value (PPV) of 86% in the Canadian database. Values in the US data were 74%, 98%, and 73%. PPV was highest (80%) in patients under 25 and lowest in those over 65 (44%). Conclusions The proposed algorithm may be useful for researchers attempting to study intentional self-harm in claims databases with incomplete E-code reporting, especially among younger populations. PMID:20922709
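
    The published classification rule lends itself to a compact implementation. Below is a minimal sketch of the logic, assuming ICD-9-CM codes are available as strings; the prefix lists are illustrative stand-ins for the paper's full code sets (poisoning 960-979, toxic effects 980-989, open wound of the elbow/forearm/wrist 881, asphyxiation 994.7, plus the qualifying psychiatric diagnoses):

    ```python
    # Illustrative ICD-9-CM prefix lists; the paper's exact code sets differ.
    INJURY_PREFIXES = tuple(str(n) for n in range(960, 990)) + ("881", "994.7")
    PSYCH_PREFIXES = ("296", "295", "301", "309")  # depression/mania, psychotic,
                                                   # personality, adjustment reaction

    def likely_self_harm(diagnoses):
        """Flag a hospitalization (a list of ICD-9-CM code strings) as probable
        intentional self-harm: qualifying injury plus qualifying psychiatric code."""
        has_injury = any(d.startswith(INJURY_PREFIXES) for d in diagnoses)
        has_psych = any(d.startswith(PSYCH_PREFIXES) for d in diagnoses)
        return has_injury and has_psych

    print(likely_self_harm(["965.4", "296.2"]))  # poisoning + depression -> True
    print(likely_self_harm(["965.4"]))           # injury alone -> False
    ```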

  15. A common class of transcripts with 5'-intron depletion, distinct early coding sequence features, and N1-methyladenosine modification.

    PubMed

    Cenik, Can; Chua, Hon Nian; Singh, Guramrit; Akef, Abdalla; Snyder, Michael P; Palazzo, Alexander F; Moore, Melissa J; Roth, Frederick P

    2017-03-01

    Introns are found in 5' untranslated regions (5'UTRs) for 35% of all human transcripts. These 5'UTR introns are not randomly distributed: Genes that encode secreted, membrane-bound and mitochondrial proteins are less likely to have them. Curiously, transcripts lacking 5'UTR introns tend to harbor specific RNA sequence elements in their early coding regions. To model and understand the connection between coding-region sequence and 5'UTR intron status, we developed a classifier that can predict 5'UTR intron status with >80% accuracy using only sequence features in the early coding region. Thus, the classifier identifies transcripts with 5' proximal-intron-minus-like coding regions ("5IM" transcripts). Unexpectedly, we found that the early coding sequence features defining 5IM transcripts are widespread, appearing in 21% of all human RefSeq transcripts. The 5IM class of transcripts is enriched for non-AUG start codons, more extensive secondary structure both preceding the start codon and near the 5' cap, greater dependence on eIF4E for translation, and association with ER-proximal ribosomes. 5IM transcripts are bound by the exon junction complex (EJC) at noncanonical 5' proximal positions. Finally, N1-methyladenosines are specifically enriched in the early coding regions of 5IM transcripts. Taken together, our analyses point to the existence of a distinct 5IM class comprising ∼20% of human transcripts. This class is defined by depletion of 5' proximal introns, presence of specific RNA sequence features associated with low translation efficiency, N1-methyladenosines in the early coding region, and enrichment for noncanonical binding by the EJC. © 2017 Cenik et al.; Published by Cold Spring Harbor Laboratory Press for the RNA Society.
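
    As a rough illustration of such a sequence-based classifier (not the authors' feature set or model), one could score character 3-mers of the early coding region with a logistic model; the sequences and labels below are invented:

    ```python
    from sklearn.feature_extraction.text import CountVectorizer
    from sklearn.linear_model import LogisticRegression
    from sklearn.pipeline import make_pipeline

    # Toy stand-ins for real transcripts: early coding sequences labeled by
    # 5'UTR intron status (1 = intron-minus-like). Data are invented.
    seqs = ["ATGGCGGCGGCGCTG", "ATGAAAGAATTAGAT", "ATGGCCGCTCTGGTG", "ATGTCTAAAGGTGAA"]
    labels = [1, 0, 1, 0]

    # Character 3-mers as one plausible encoding of "early coding sequence
    # features"; the paper's actual features differ.
    model = make_pipeline(
        CountVectorizer(analyzer="char", ngram_range=(3, 3)),
        LogisticRegression(),
    )
    model.fit(seqs, labels)
    print(model.predict(["ATGGCGGCTCTGCTG"]))
    ```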

  16. Specification Improvement Through Analysis of Proof Structure (SITAPS): High Assurance Software Development

    DTIC Science & Technology

    2016-02-01

    from the tools being used. For example, while Coq proves properties it does not dump an explanation of the proofs in any currently supported form. ... Hotel room locks and card keys use a simple protocol to manage the transition of rooms from one guest to the next. The lock retains that guest key's code. A new guest checks in and gets a card with a new current code, with the previous code set to the previous guest's current code
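
    The recoding scheme sketched in the excerpt can be captured in a few lines. This is a minimal model of the protocol as described, with illustrative names (Card, Lock, try_open):

    ```python
    from dataclasses import dataclass

    @dataclass
    class Card:
        new_code: int   # current code issued at check-in
        prev_code: int  # the previous guest's current code

    class Lock:
        def __init__(self, code: int):
            self.current = code

        def try_open(self, card: Card) -> bool:
            if card.new_code == self.current:   # current guest's card
                return True
            if card.prev_code == self.current:  # first use by the new guest:
                self.current = card.new_code    # the lock recodes itself
                return True
            return False

    lock = Lock(code=7)
    alice = Card(new_code=7, prev_code=3)
    bob = Card(new_code=9, prev_code=7)         # issued when Bob checks in
    assert lock.try_open(alice) and lock.try_open(bob) and not lock.try_open(alice)
    ```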

  17. DSP code optimization based on cache

    NASA Astrophysics Data System (ADS)

    Xu, Chengfa; Li, Chengcheng; Tang, Bin

    2013-03-01

    A DSP program often runs less efficiently on the target board than in software simulation during development, mainly because of improper use and incomplete understanding of the cache-based memory. This paper takes the TI TMS320C6455 DSP as an example, analyzes its two-level internal cache, and summarizes methods of code optimization. The processor can achieve its best performance when these code optimization methods are applied. Finally, a specific algorithm application in radar signal processing is presented. Experimental results show that these optimizations are effective.

  18. Child Injury Deaths: Comparing Prevention Information from Two Coding Systems

    PubMed Central

    Schnitzer, Patricia G.; Ewigman, Bernard G.

    2006-01-01

    Objectives The International Classification of Diseases (ICD) external cause of injury E-codes do not sufficiently identify injury circumstances amenable to prevention. The researchers developed an alternative classification system (B-codes) that incorporates behavioral and environmental factors, for use in childhood injury research, and the two coding systems are compared in this paper. Methods All fatal injuries among children less than age five that occurred between January 1, 1992, and December 31, 1994, were classified using both B-codes and E-codes. Results E-codes identified the most common causes of injury death: homicide (24%), fires (21%), motor vehicle incidents (21%), drowning (10%), and suffocation (9%). The B-codes further revealed that homicides (51%) resulted from the child being shaken or struck by another person; many fire deaths (42%) resulted from children playing with matches or lighters; drownings (46%) usually occurred in natural bodies of water; and most suffocation deaths (68%) occurred in unsafe sleeping arrangements. Conclusions B-codes identify additional information with specific relevance for prevention of childhood injuries. PMID:15944169

  19. [Long non-coding RNAs in the pathophysiology of atherosclerosis].

    PubMed

    Novak, Jan; Vašků, Julie Bienertová; Souček, Miroslav

    2018-01-01

    The human genome contains about 22 000 protein-coding genes that are transcribed into an even larger number of messenger RNAs (mRNA). Interestingly, the results of the ENCODE project from 2012 show that although up to 90 % of our genome is actively transcribed, protein-coding mRNAs make up only 2-3 % of the total amount of transcribed RNA. The rest of the RNA transcripts are not translated into proteins, which is why they are referred to as "non-coding RNAs". Non-coding RNA was once considered "the dark matter of the genome", or "junk" that had accumulated in our DNA during the course of evolution. Today we know that non-coding RNAs fulfil a variety of regulatory functions in our body: they intervene in epigenetic processes from chromatin remodelling to histone methylation, in transcription itself, and in post-transcriptional processes. Long non-coding RNAs (lncRNA) are the class of non-coding RNAs that are more than 200 nucleotides in length (non-coding RNAs of less than 200 nucleotides are called small non-coding RNAs). lncRNAs represent a large and widely varied group of molecules with diverse regulatory functions. They can be identified in virtually all cell types and tissues, and even in the extracellular space, including blood and specifically plasma. Their levels change during the course of organogenesis, they are specific to different tissues, and they also change with the development of different illnesses, including atherosclerosis. This review article presents lncRNA biology in general and then focuses on specific representatives in relation to the process of atherosclerosis (i.e., we describe lncRNA involvement in the biology of endothelial cells, vascular smooth muscle cells and immune cells), and we further describe the possible clinical potential of lncRNAs, whether in the diagnostics or the therapy of atherosclerosis and its clinical manifestations. Key words: atherosclerosis - lincRNA - lncRNA - MALAT - MIAT.

  20. Simulation of spacecraft attitude dynamics using TREETOPS and model-specific computer Codes

    NASA Technical Reports Server (NTRS)

    Cochran, John E.; No, T. S.; Fitz-Coy, Norman G.

    1989-01-01

    The simulation of spacecraft attitude dynamics and control using the generic, multi-body code called TREETOPS and other codes written especially to simulate particular systems is discussed. Differences in the methods used to derive equations of motion--Kane's method for TREETOPS and the Lagrangian and Newton-Euler methods, respectively, for the other two codes--are considered. Simulation results from the TREETOPS code are compared with those from the other two codes for two example systems. One system is a chain of rigid bodies; the other consists of two rigid bodies attached to a flexible base body. Since the computer codes were developed independently, consistent results serve as a verification of the correctness of all the programs. Differences in the results are discussed. Results for the two-rigid-body, one-flexible-body system are useful also as information on multi-body, flexible, pointing payload dynamics.

  1. Development of an Efficient Entire-Capsid-Coding-Region Amplification Method for Direct Detection of Poliovirus from Stool Extracts

    PubMed Central

    Kilpatrick, David R.; Nakamura, Tomofumi; Burns, Cara C.; Bukbuk, David; Oderinde, Soji B.; Oberste, M. Steven; Kew, Olen M.; Pallansch, Mark A.; Shimizu, Hiroyuki

    2014-01-01

    Laboratory diagnosis has played a critical role in the Global Polio Eradication Initiative since 1988 by isolating and identifying poliovirus (PV) from stool specimens, using cell culture as a highly sensitive system to detect PV. In the present study, we aimed to develop a molecular method to detect PV directly from stool extracts, with a high efficiency comparable to that of cell culture. We developed a method to efficiently amplify the entire capsid coding region of human enteroviruses (EVs) including PV. cDNAs of the entire capsid coding region (3.9 kb) were obtained from as few as 50 copies of PV genomes. PV was detected from the cDNAs with an improved PV-specific real-time reverse transcription-PCR system and nucleotide sequence analysis of the VP1 coding region. For assay validation, we analyzed 84 stool extracts that were positive for PV in cell culture and detected PV genomes from 100% of the extracts (84/84 samples) with this method in combination with a PV-specific extraction method. PV could be detected in 2/4 stool extract samples that were negative for PV in cell culture. In PV-positive samples, EV species C viruses were also detected with high frequency (27% [23/86 samples]). This method would be useful for direct detection of PV from stool extracts without using cell culture. PMID:25339406

  2. Thermodynamic properties of UF sub 6 measured with a ballistic piston compressor

    NASA Technical Reports Server (NTRS)

    Sterritt, D. E.; Lalos, G. T.; Schneider, R. T.

    1973-01-01

    From experiments performed with a ballistic piston compressor, certain thermodynamic properties of uranium hexafluoride were investigated. Difficulties presented by the nonideal processes encountered in ballistic compressors are discussed, and a computer code, BCCC (Ballistic Compressor Computer Code), was developed to analyze the experimental data. The BCCC unfolds the thermodynamic properties of uranium hexafluoride from the helium-uranium hexafluoride mixture used as the test gas in the ballistic compressor. The thermodynamic properties deduced include the specific heat at constant volume, the ratio of specific heats for UF6, and the viscous coupling constant of helium-uranium hexafluoride mixtures.
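
    The unfolding step rests on simple mixture relations. A minimal sketch, assuming ideal-gas mixing by mole fraction; the numerical values are illustrative, not results from the report:

    ```python
    R = 8.314  # molar gas constant, J/(mol*K)

    def cv_uf6_from_mixture(cv_mix, x_he, cv_he):
        """Unfold the molar specific heat of UF6 from a He/UF6 mixture,
        assuming ideal mixing: cv_mix = x_He*cv_He + x_UF6*cv_UF6."""
        x_uf6 = 1.0 - x_he
        return (cv_mix - x_he * cv_he) / x_uf6

    def gamma(cv):
        """Ratio of specific heats for an ideal gas: gamma = (cv + R) / cv."""
        return (cv + R) / cv

    # Illustrative numbers: cv_He = (3/2)R for a monatomic gas; cv_mix as measured.
    cv = cv_uf6_from_mixture(cv_mix=23.2, x_he=0.9, cv_he=1.5 * R)
    print(round(cv, 1), round(gamma(cv), 3))  # ~120 J/(mol*K), gamma ~ 1.07
    ```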

  3. Video data compression using artificial neural network differential vector quantization

    NASA Technical Reports Server (NTRS)

    Krishnamurthy, Ashok K.; Bibyk, Steven B.; Ahalt, Stanley C.

    1991-01-01

    An artificial neural network vector quantizer is developed for use in data compression applications such as Digital Video. Differential Vector Quantization is used to preserve edge features, and a new adaptive algorithm, known as Frequency-Sensitive Competitive Learning, is used to develop the vector quantizer codebook. To achieve real-time performance, a custom Very Large Scale Integration Application Specific Integrated Circuit (VLSI ASIC) is being developed to realize the associative memory functions needed in the vector quantization algorithm. By using vector quantization, the need for Huffman coding can be eliminated, resulting in better robustness to channel bit errors than methods that use variable-length codes.
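
    The frequency-sensitive update can be sketched compactly. Below is a generic version of the idea (winner selection with win-count-scaled distances), not the paper's exact rule or its VLSI realization; all names and parameters are illustrative:

    ```python
    import numpy as np

    def fscl_codebook(data, k, epochs=10, lr=0.1, seed=0):
        """Frequency-sensitive competitive learning (sketch): each code vector's
        distance is scaled by its win count, so rarely used codes stay
        competitive and the codebook covers the data more evenly."""
        rng = np.random.default_rng(seed)
        codebook = data[rng.choice(len(data), size=k, replace=False)].astype(float)
        wins = np.ones(k)
        for _ in range(epochs):
            for x in data:
                scaled = np.linalg.norm(codebook - x, axis=1) * wins
                w = int(np.argmin(scaled))
                codebook[w] += lr * (x - codebook[w])  # move the winner toward x
                wins[w] += 1
        return codebook

    vectors = np.random.default_rng(1).normal(size=(256, 4))  # stand-in image blocks
    book = fscl_codebook(vectors, k=8)
    index = np.argmin(np.linalg.norm(book - vectors[0], axis=1))  # quantize one vector
    ```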

  4. Common spaceborne multicomputer operating system and development environment

    NASA Technical Reports Server (NTRS)

    Craymer, L. G.; Lewis, B. F.; Hayes, P. J.; Jones, R. L.

    1994-01-01

    A preliminary technical specification for a multicomputer operating system is developed. The operating system is targeted for spaceborne flight missions and provides a broad range of real-time functionality, dynamic remote code-patching capability, and system fault tolerance and long-term survivability features. Dataflow concepts are used for representing application algorithms. Functional features are included to ensure real-time predictability for a class of algorithms which require data-driven execution on an iterative steady state basis. The development environment supports the development of algorithm code, design of control parameters, performance analysis, simulation of real-time dataflow applications, and compiling and downloading of the resulting application.

  5. Development of a Pebble-Bed Liquid-Nitrogen Evaporator/Superheater for the BRL 1/6th Scale Large Blast/Thermal Simulator Test Bed. Phase 1. Prototype Design and Analysis

    DTIC Science & Technology

    1991-08-01

    specifications are taken primarily from the 1983 version of the ASME Boiler and Pressure Vessel Code. Other design requirements were developed from standard safe...rules and practices of the American Society of Mechanical Engineers (ASME) Boiler and Pressure Vessel Code to provide a safe and reliable system

  6. Micro- and meso-scale simulations of magnetospheric processes related to the aurora and substorm morphology

    NASA Technical Reports Server (NTRS)

    Swift, Daniel W.

    1991-01-01

    The primary methodology during the grant period has been the use of micro or meso-scale simulations to address specific questions concerning magnetospheric processes related to the aurora and substorm morphology. This approach, while useful in providing some answers, has its limitations. Many of the problems relating to the magnetosphere are inherently global and kinetic. Effort during the last year of the grant period has increasingly focused on development of a global-scale hybrid code to model the entire, coupled magnetosheath - magnetosphere - ionosphere system. In particular, numerical procedures for curvilinear coordinate generation and exactly conservative differencing schemes for hybrid codes in curvilinear coordinates have been developed. The new computer algorithms and the massively parallel computer architectures now make this global code a feasible proposition. Support provided by this project has played an important role in laying the groundwork for the eventual development of a global-scale code to model and forecast magnetospheric weather.

  7. Analyses in support of risk-informed natural gas vehicle maintenance facility codes and standards :

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ekoto, Isaac W.; Blaylock, Myra L.; LaFleur, Angela Christine

    2014-03-01

    Safety standards development for maintenance facilities of liquid and compressed gas fueled large-scale vehicles is required to ensure proper facility design and operation envelopes. Standard development organizations are utilizing risk-informed concepts to develop natural gas vehicle (NGV) codes and standards so that maintenance facilities meet acceptable risk levels. The present report summarizes Phase I work for existing NGV repair facility code requirements and highlights inconsistencies that need quantitative analysis into their effectiveness. A Hazardous and Operability study was performed to identify key scenarios of interest. Finally, scenario analyses were performed using detailed simulations and modeling to estimate the overpressure hazards from HAZOP defined scenarios. The results from Phase I will be used to identify significant risk contributors at NGV maintenance facilities, and are expected to form the basis for follow-on quantitative risk analysis work to address specific code requirements and identify effective accident prevention and mitigation strategies.

  8. Evaluating training of screening, brief intervention, and referral to treatment (SBIRT) for substance use: Reliability of the MD3 SBIRT Coding Scale.

    PubMed

    DiClemente, Carlo C; Crouch, Taylor Berens; Norwood, Amber E Q; Delahanty, Janine; Welsh, Christopher

    2015-03-01

    Screening, brief intervention, and referral to treatment (SBIRT) has become an empirically supported and widely implemented approach in primary and specialty care for addressing substance misuse. Accordingly, training of providers in SBIRT has increased exponentially in recent years. However, the quality and fidelity of training programs and subsequent interventions are largely unknown because of the lack of SBIRT-specific evaluation tools. The purpose of this study was to create a coding scale to assess quality and fidelity of SBIRT interactions addressing alcohol, tobacco, illicit drugs, and prescription medication misuse. The scale was developed to evaluate performance in an SBIRT residency training program. Scale development was based on training protocol and competencies with consultation from Motivational Interviewing coding experts. Trained medical residents practiced SBIRT with standardized patients during 10- to 15-min videotaped interactions. This study included 25 tapes from the Family Medicine program coded by 3 unique coder pairs with varying levels of coding experience. Interrater reliability was assessed for overall scale components and individual items via intraclass correlation coefficients. Coder pair-specific reliability was also assessed. Interrater reliability was excellent overall for the scale components (>.85) and nearly all items. Reliability was higher for more experienced coders, though still adequate for the trained coder pair. Descriptive data demonstrated a broad range of adherence and skills. Subscale correlations supported concurrent and discriminant validity. Data provide evidence that the MD3 SBIRT Coding Scale is a psychometrically reliable coding system for evaluating SBIRT interactions and can be used to evaluate implementation skills for fidelity, training, assessment, and research. Recommendations for refinement and further testing of the measure are discussed. (PsycINFO Database Record (c) 2015 APA, all rights reserved).

  9. The effect of multiple internal representations on context-rich instruction

    NASA Astrophysics Data System (ADS)

    Lasry, Nathaniel; Aulls, Mark W.

    2007-11-01

    We discuss n-coding, a theoretical model of multiple internal mental representations. The n-coding construct is developed from a review of cognitive and imaging data that demonstrates the independence of information processed along different modalities such as verbal, visual, kinesthetic, logico-mathematic, and social modalities. A study testing the effectiveness of the n-coding construct in classrooms is presented. Four sections differing in the level of n-coding opportunities were compared. Besides a traditional-instruction section used as a control group, each of the remaining three sections was given context-rich problems, which differed by the level of n-coding opportunities designed into their laboratory environment. To measure the effectiveness of the construct, problem-solving skills were assessed as conceptual learning using the force concept inventory. We also developed several new measures that take students' confidence in concepts into account. Our results show that the n-coding construct is useful in designing context-rich environments and can be used to increase learning gains in problem solving, conceptual knowledge, and concept confidence. Specifically, when using props in designing context-rich problems, we find n-coding to be a useful construct in guiding which additional dimensions need to be attended to.

  10. Fundamentals, current state of the development of, and prospects for further improvement of the new-generation thermal-hydraulic computational HYDRA-IBRAE/LM code for simulation of fast reactor systems

    NASA Astrophysics Data System (ADS)

    Alipchenkov, V. M.; Anfimov, A. M.; Afremov, D. A.; Gorbunov, V. S.; Zeigarnik, Yu. A.; Kudryavtsev, A. V.; Osipov, S. L.; Mosunova, N. A.; Strizhov, V. F.; Usov, E. V.

    2016-02-01

    The conceptual fundamentals of the development of the new-generation system thermal-hydraulic computational HYDRA-IBRAE/LM code are presented. The code is intended to simulate the thermal-hydraulic processes that take place in the loops and the heat-exchange equipment of liquid-metal cooled fast reactor systems under normal operation and anticipated operational occurrences and during accidents. The paper provides a brief overview of Russian and foreign system thermal-hydraulic codes for modeling liquid-metal coolants and justifies the need for the development of a new-generation HYDRA-IBRAE/LM code. Considering the specific engineering features of the nuclear power plants (NPPs) equipped with the BN-1200 and the BREST-OD-300 reactors, the processes and the phenomena are singled out that require a detailed analysis and the development of models to be correctly described by the system thermal-hydraulic code in question. Information on the functionality of the computational code is provided, viz., the thermal-hydraulic two-phase model, the properties of the sodium and the lead coolants, the closing equations for simulation of the heat-mass exchange processes, the models describing the processes that take place during a steam-generator tube rupture, etc. The article gives a brief overview of the usability of the computational code, including a description of the support documentation and the supply package, as well as the possibilities of taking advantage of modern computer technologies, such as parallel computations. The paper shows the current state of verification and validation of the computational code; it also presents information on the principles of constructing and populating the verification matrices for the BREST-OD-300 and the BN-1200 reactor systems. The prospects are outlined for further development of the HYDRA-IBRAE/LM code, the introduction of new models into it, and the enhancement of its usability. It is shown that the program of development and practical application of the code will make it possible, in the near future, to analyze the safety of prospective NPP projects at a qualitatively higher level.

  11. Engineering for Sustainable Development and the Common Good

    ERIC Educational Resources Information Center

    Kelly, William E.

    2006-01-01

    In 1994, the American Society of Civil Engineers (ASCE) updated its Code of Ethics to include specific statements on sustainable development and, at about the same time, adopted its Policy 418 on sustainable development. Sustainable development as defined by ASCE "is the challenge of meeting human needs for natural resources, industrial…

  12. Predictive information processing is a fundamental learning mechanism present in early development: evidence from infants.

    PubMed

    Trainor, Laurel J

    2012-02-01

    Evidence is presented that predictive coding is fundamental to brain function and present in early infancy. Indeed, mismatch responses to unexpected auditory stimuli are among the earliest robust cortical event-related potential responses, and have been measured in young infants in response to many types of deviation, including in pitch, timing, and melodic pattern. Furthermore, mismatch responses change quickly with specific experience, suggesting that predictive coding reflects a powerful, early-developing learning mechanism. Copyright © 2011 Elsevier B.V. All rights reserved.

  13. Manual of phosphoric acid fuel cell power plant optimization model and computer program

    NASA Technical Reports Server (NTRS)

    Lu, C. Y.; Alkasab, K. A.

    1984-01-01

    An optimized cost and performance model for a phosphoric acid fuel cell power plant system was derived and developed into a modular FORTRAN computer code. Cost, energy, mass, and electrochemical analyses were combined to develop a mathematical model for optimizing the steam to methane ratio in the reformer, the hydrogen utilization in the PAFC, and the number of plates per stack. The nonlinear programming code, COMPUTE, was used to solve this model, in which the method of mixed penalty function combined with Hooke and Jeeves pattern search was chosen to evaluate this specific optimization problem.
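
    Neither technique is exotic; a generic sketch of the two pieces the abstract names (a quadratic penalty wrapped around an objective, minimized by Hooke and Jeeves pattern search) might look as follows. This is not the COMPUTE code; the objective and constraint below are placeholders:

    ```python
    import numpy as np

    def hooke_jeeves(f, x0, step=0.5, shrink=0.5, tol=1e-6):
        """Hooke & Jeeves pattern search: exploratory moves along each axis,
        followed by a pattern move through the improved point."""
        x = np.asarray(x0, dtype=float)
        fx = f(x)
        while step > tol:
            base, fbase = x.copy(), fx
            for i in range(len(x)):              # exploratory moves
                for d in (step, -step):
                    trial = x.copy()
                    trial[i] += d
                    if f(trial) < fx:
                        x, fx = trial, f(trial)
                        break
            if fx < fbase:                       # pattern move
                pattern = x + (x - base)
                if f(pattern) < fx:
                    x, fx = pattern, f(pattern)
            else:
                step *= shrink                   # no improvement: refine mesh
        return x, fx

    def penalized(obj, constraints, mu=1e3):
        """Mixed penalty: objective plus mu times squared constraint violations
        (each g is feasible when g(x) <= 0)."""
        return lambda x: obj(x) + mu * sum(max(0.0, g(x)) ** 2 for g in constraints)

    # Placeholder problem: minimize a quadratic subject to x0 + x1 >= 1.
    f = penalized(lambda x: (x[0] - 2) ** 2 + (x[1] - 1) ** 2,
                  [lambda x: 1 - x[0] - x[1]])
    print(hooke_jeeves(f, [0.0, 0.0]))
    ```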

  14. A description of the new 3D electron gun and collector modeling tool: MICHELLE

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Petillo, J.; Mondelli, A.; Krueger, W.

    1999-07-01

    A new 3D finite element gun and collector modeling code is under development at SAIC in collaboration with industrial partners and national laboratories. This development program has been designed specifically to address the shortcomings of current simulation and modeling tools. In particular, although 3D gun codes exist today, their ability to address fine scale features is somewhat limited in 3D due to the disparate length scales of certain classes of devices. Additionally, features like advanced emission rules, including thermionic Child's law and comprehensive secondary emission models, also need attention. The program specifically targets problem classes including gridded guns, sheet-beam guns, multi-beam devices, and anisotropic collectors. The presentation will provide an overview of the program objectives, the approach to be taken by the development team, and the status of the project.

  15. Technology Development for Human Exploration Beyond LEO in the New Millennium IAA-13-3 Strategies and Plans for Human Mars Missions

    NASA Technical Reports Server (NTRS)

    Larson, William E.; Lueck, Dale E.; Parrish, Clyde F.; Sanders, Gerald B.; Trevathan, Joseph R.; Baird, R. Scott; Simon, Tom; Peters, T.; Delgado, H. (Technical Monitor)

    2001-01-01

    As we look forward into the new millennium, the extension of human presence beyond Low-Earth Orbit (LEO) looms large in the plans of NASA. The Agency's Strategic Plan specifically calls out the need to identify and develop technologies for 100- and 1000-day class missions beyond LEO. To meet the challenge of these extended duration missions, it is important that we learn how to utilize the indigenous resources available to us on extraterrestrial bodies. This concept, known as In-Situ Resource Utilization (ISRU), can greatly reduce the launch mass & cost of human missions while reducing the risk. These technologies may also pave the way for the commercial development of space. While no specific target beyond LEO is identified in NASA's Strategic Plan, mission architecture studies have been on-going for the Moon, Mars, Near-Earth Asteroids and Earth/Moon & Earth/Sun Libration Points. As a result of these studies, the NASA Office of Space Flight (Code M), through the Johnson and Kennedy Space Centers, is leading the effort to develop ISRU technologies and systems to meet the current and future needs of human missions beyond LEO and on to Mars. This effort also receives support from the NASA Office of Biological and Physical Research (Code U), the Office of Space Science (Code S), and the Office of Aerospace Technology (Code R). This paper will present unique developments in the area of fuel and oxidizer production, breathing air production, water production, CO2 collection, separation of atmospheric gases, and gas liquefaction and storage. A technology overview will be provided for each topic along with the results achieved to date, future development plans, and the mission architectures that these technologies support.

  16. Countermeasures for Time-Cheat Detection in Multiplayer Online Games

    NASA Astrophysics Data System (ADS)

    Ferretti, Stefano

    Cheating is an important issue in games. Depending on the system over which the game is deployed, several types of malicious actions may be accomplished so as to take an unfair and unexpected advantage over the game and over the (digital, human) adversaries. When the game is a standalone application, cheats typically just relate to the specific software code being developed to build the application. It is not a surprise to find (on the Web and in specialized magazines) people who explain cheats on specific games, stating, for instance, which configuration files can be altered (and how to do it) to automatically gain some bonus during the game. To avoid this, game developers are hence motivated to build stable code, with related data that should be securely managed and made difficult to alter.

  17. Automatic Testcase Generation for Flight Software

    NASA Technical Reports Server (NTRS)

    Bushnell, David Henry; Pasareanu, Corina; Mackey, Ryan M.

    2008-01-01

    The TacSat3 project is applying Integrated Systems Health Management (ISHM) technologies to an Air Force spacecraft for operational evaluation in space. The experiment will demonstrate the effectiveness and cost of ISHM and vehicle systems management (VSM) technologies through onboard operation for extended periods. We present two approaches to automatic testcase generation for ISHM: 1) A blackbox approach that views the system as a blackbox, and uses a grammar-based specification of the system's inputs to automatically generate *all* inputs that satisfy the specifications (up to prespecified limits); these inputs are then used to exercise the system. 2) A whitebox approach that performs analysis and testcase generation directly on a representation of the internal behaviour of the system under test. The enabling technologies for both these approaches are model checking and symbolic execution, as implemented in the Ames' Java PathFinder (JPF) tool suite. Model checking is an automated technique for software verification. Unlike simulation and testing which check only some of the system executions and therefore may miss errors, model checking exhaustively explores all possible executions. Symbolic execution evaluates programs with symbolic rather than concrete values and represents variable values as symbolic expressions. We are applying the blackbox approach to generating input scripts for the Spacecraft Command Language (SCL) from Interface and Control Systems. SCL is an embedded interpreter for controlling spacecraft systems. TacSat3 will be using SCL as the controller for its ISHM systems. We translated the SCL grammar into a program that outputs scripts conforming to the grammars. Running JPF on this program generates all legal input scripts up to a prespecified size. Script generation can also be targeted to specific parts of the grammar of interest to the developers. These scripts are then fed to the SCL Executive. ICS's in-house coverage tools will be run to measure code coverage. Because the scripts exercise all parts of the grammar, we expect them to provide high code coverage. This blackbox approach is suitable for systems for which we do not have access to the source code. We are applying whitebox test generation to the Spacecraft Health INference Engine (SHINE) that is part of the ISHM system. In TacSat3, SHINE will execute an on-board knowledge base for fault detection and diagnosis. SHINE converts its knowledge base into optimized C code which runs onboard TacSat3. SHINE can translate its rules into an intermediate representation (Java) suitable for analysis with JPF. JPF will analyze SHINE's Java output using symbolic execution, producing testcases that can provide either complete or directed coverage of the code. Automatically generated test suites can provide full code coverage and be quickly regenerated when code changes. Because our tools analyze executable code, they fully cover the delivered code, not just models of the code. This approach also provides a way to generate tests that exercise specific sections of code under specific preconditions. This capability gives us more focused testing of specific sections of code.
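
    As a rough illustration of the blackbox idea, the following toy sketch enumerates every script a small grammar admits up to a fixed expansion depth. The grammar is invented for illustration and is not the SCL grammar, and JPF's actual state-space exploration works quite differently:

    ```python
    from itertools import product

    # Invented toy grammar: nonterminals map to alternatives; each alternative
    # is a sequence of symbols. Symbols not in the table are terminals.
    GRAMMAR = {
        "script": [["cmd"], ["cmd", ";", "script"]],
        "cmd": [["set", "var", "num"], ["send", "var"]],
        "var": [["x"], ["y"]],
        "num": [["0"], ["1"]],
    }

    def expand(symbol, depth):
        """Yield every token sequence derivable from `symbol` in <= depth expansions."""
        if symbol not in GRAMMAR:   # terminal symbol
            yield [symbol]
            return
        if depth == 0:              # expansion budget exhausted
            return
        for alternative in GRAMMAR[symbol]:
            for parts in product(*(list(expand(s, depth - 1)) for s in alternative)):
                yield [tok for part in parts for tok in part]

    scripts = list(expand("script", 4))
    print(len(scripts), "scripts, e.g.:", " ".join(scripts[-1]))
    ```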

  18. Identification and substrate prediction of new Fragaria x ananassa aquaporins and expression in different tissues and during strawberry fruit development.

    PubMed

    Merlaen, Britt; De Keyser, Ellen; Van Labeke, Marie-Christine

    2018-01-01

    The newly identified aquaporin coding sequences presented here pave the way for further insights into the plant-water relations in the commercial strawberry (Fragaria x ananassa). Aquaporins are water channel proteins that allow water to cross (intra)cellular membranes. In Fragaria x ananassa, few of them have been identified hitherto, hampering the exploration of the water transport regulation at cellular level. Here, we present new aquaporin coding sequences belonging to different subclasses: plasma membrane intrinsic proteins subtype 1 and subtype 2 (PIP1 and PIP2) and tonoplast intrinsic proteins (TIP). The classification is based on phylogenetic analysis and is confirmed by the presence of conserved residues. Substrate-specific signature sequences (SSSSs) and specificity-determining positions (SDPs) predict the substrate specificity of each new aquaporin. Expression profiling in leaves, petioles and developing fruits reveals distinct patterns, even within the same (sub)class. Expression profiles range from leaf-specific expression over constitutive expression to fruit-specific expression. Both upregulation and downregulation during fruit ripening occur. Substrate specificity and expression profiles suggest that functional specialization exists among aquaporins belonging to a different but also to the same (sub)class.

  19. Exome sequencing and arrayCGH detection of gene sequence and copy number variation between ILS and ISS mouse strains.

    PubMed

    Dumas, Laura; Dickens, C Michael; Anderson, Nathan; Davis, Jonathan; Bennett, Beth; Radcliffe, Richard A; Sikela, James M

    2014-06-01

    It has been well documented that genetic factors can influence predisposition to develop alcoholism. While the underlying genomic changes may be of several types, two of the most common and disease associated are copy number variations (CNVs) and sequence alterations of protein coding regions. The goal of this study was to identify CNVs and single-nucleotide polymorphisms that occur in gene coding regions that may play a role in influencing the risk of an individual developing alcoholism. Toward this end, two mouse strains were used that have been selectively bred based on their differential sensitivity to alcohol: the Inbred long sleep (ILS) and Inbred short sleep (ISS) mouse strains. Differences in initial response to alcohol have been linked to risk for alcoholism, and the ILS/ISS strains are used to investigate the genetics of initial sensitivity to alcohol. Array comparative genomic hybridization (arrayCGH) and exome sequencing were conducted to identify CNVs and gene coding sequence differences, respectively, between ILS and ISS mice. Mouse arrayCGH was performed using catalog Agilent 1 × 244 k mouse arrays. Subsequently, exome sequencing was carried out using an Illumina HiSeq 2000 instrument. ArrayCGH detected 74 CNVs that were strain-specific (38 ILS/36 ISS), including several ISS-specific deletions that contained genes implicated in brain function and neurotransmitter release. Among several interesting coding variations detected by exome sequencing was the gain of a premature stop codon in the alpha-amylase 2B (AMY2B) gene specifically in the ILS strain. In total, exome sequencing detected 2,597 and 1,768 strain-specific exonic gene variants in the ILS and ISS mice, respectively. This study represents the most comprehensive and detailed genomic comparison of ILS and ISS mouse strains to date. The two complementary genome-wide approaches identified strain-specific CNVs and gene coding sequence variations that should provide strong candidates to contribute to the alcohol-related phenotypic differences associated with these strains.

  20. Automatic programming of simulation models

    NASA Technical Reports Server (NTRS)

    Schroer, Bernard J.; Tseng, Fan T.; Zhang, Shou X.; Dwan, Wen S.

    1990-01-01

    The concepts of software engineering were used to improve the simulation modeling environment. Emphasis was placed on the application of an element of rapid prototyping, or automatic programming, to assist the modeler in defining the problem specification. Then, once the problem specification has been defined, an automatic code generator is used to write the simulation code. The following two domains were selected for evaluating the concepts of software engineering for discrete event simulation: the manufacturing domain and a spacecraft countdown network sequence. The specific tasks were to: (1) define the software requirements for a graphical user interface to the Automatic Manufacturing Programming System (AMPS); (2) develop a graphical user interface for AMPS; and (3) compare the AMPS graphical interface with the AMPS interactive user interface.
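
    As a toy illustration of the specify-then-generate idea (not AMPS itself: the spec fields, template, and queue model below are all invented), a generator can emit runnable simulation source from a declarative problem specification:

    ```python
    # Invented problem specification; field names are illustrative, not AMPS syntax.
    SPEC = {"station": "mill", "arrival_mean": 4.0, "service_mean": 3.0, "horizon": 480}

    TEMPLATE = '''import random
    # auto-generated single-server queue model for station "{station}"
    random.seed(1)
    t, queue_len, served = 0.0, 0, 0
    next_arrival = random.expovariate(1 / {arrival_mean})
    next_departure = float("inf")
    while t < {horizon}:
        if next_arrival <= next_departure:
            t = next_arrival
            queue_len += 1
            next_arrival = t + random.expovariate(1 / {arrival_mean})
            if queue_len == 1:
                next_departure = t + random.expovariate(1 / {service_mean})
        else:
            t = next_departure
            queue_len -= 1
            served += 1
            next_departure = t + random.expovariate(1 / {service_mean}) if queue_len else float("inf")
    print("station {station}:", served, "jobs served in {horizon} minutes")
    '''

    def generate(spec):
        """Fill the code template from the declarative specification."""
        return TEMPLATE.format(**spec)

    exec(generate(SPEC))  # run the generated model
    ```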

  1. A Code of Ethics and Standards for Outer-Space Commerce

    NASA Astrophysics Data System (ADS)

    Livingston, David M.

    2002-01-01

    Now is the time to put forth an effective code of ethics for businesses in outer space. A successful code would be voluntary and would actually promote the growth of individual companies, not hinder their efforts to provide products and services. A properly designed code of ethics would ensure the development of space commerce unfettered by government-created barriers. Indeed, if the commercial space industry does not develop its own professional code of ethics, government- imposed regulations would probably be instituted. Should this occur, there is a risk that the development of off-Earth commerce would become more restricted. The code presented in this paper seeks to avoid the imposition of new barriers to space commerce as well as make new commercial space ventures easier to develop. The proposed code consists of a preamble, which underscores basic values, followed by a number of specific principles. For the most part, these principles set forth broad commitments to fairness and integrity with respect to employees, consumers, business transactions, political contributions, natural resources, off-Earth development, designated environmental protection zones, as well as relevant national and international laws. As acceptance of this code of ethics grows within the industry, general modifications will be necessary to accommodate the different types of businesses entering space commerce. This uniform applicability will help to assure that the code will not be perceived as foreign in nature, potentially restrictive, or threatening. Companies adopting this code of ethics will find less resistance to their space development plans, not only in the United States but also from nonspacefaring nations. Commercial space companies accepting and refining this code would demonstrate industry leadership and an understanding that will serve future generations living, working, and playing in space. Implementation of the code would also provide an off-Earth precedent for a modified free-market economy. With the code as a backdrop, a colonial or Wild West mentality would become less likely. Off-Earth resources would not be as susceptible to plunder and certain areas could be designated as environmental reserves for the benefit of all. Companies would find it advantageous to balance the goal of wealth maximization with ethical principles if such a strategy enhances the long-term prospects for success.

  2. The Role of Ontologies in Schema-based Program Synthesis

    NASA Technical Reports Server (NTRS)

    Bures, Tomas; Denney, Ewen; Fischer, Bernd; Nistor, Eugen C.

    2004-01-01

    Program synthesis is the process of automatically deriving executable code from (non-executable) high-level specifications. It is more flexible and powerful than conventional code generation techniques that simply translate algorithmic specifications into lower-level code or only create code skeletons from structural specifications (such as UML class diagrams). Key to building a successful synthesis system is specializing to an appropriate application domain. The AUTOBAYES and AUTOFILTER systems, under development at NASA Ames, operate in the two domains of data analysis and state estimation, respectively. The central concept of both systems is the schema, a representation of reusable computational knowledge. This can take various forms, including high-level algorithm templates, code optimizations, datatype refinements, or architectural information. A schema also contains applicability conditions that are used to determine when it can be applied safely. These conditions can refer to the initial specification, to intermediate results, or to elements of the partially-instantiated code. Schema-based synthesis uses AI technology to recursively apply schemas to gradually refine a specification into executable code. This process proceeds in two main phases. A front-end gradually transforms the problem specification into a program represented in an abstract intermediate code. A backend then compiles this further down into a concrete target programming language of choice. A core engine applies schemas on the initial problem specification, then uses the output of those schemas as the input for other schemas, until the full implementation is generated. Since there might be different schemas that implement different solutions to the same problem, this process can generate an entire solution tree. AUTOBAYES and AUTOFILTER have reached the level of maturity where they enable users to solve interesting application problems, e.g., the analysis of Hubble Space Telescope images. They are large (in total around 100kLoC Prolog), knowledge intensive systems that employ complex symbolic reasoning to generate a wide range of non-trivial programs for complex application domains. Their schemas can have complex interactions, which make it hard to change them in isolation or even understand what an existing schema actually does. Adding more capabilities by increasing the number of schemas will only worsen this situation, ultimately leading to the entropy death of the synthesis system. The root cause of this problem is that the domain knowledge is scattered throughout the entire system and only represented implicitly in the schema implementations. In our current work, we are addressing this problem by making explicit the knowledge from different parts of the synthesis system. Here, we discuss how Gruber's definition of an ontology as an explicit specification of a conceptualization matches our efforts in identifying and explicating the domain-specific concepts. We outline the roles ontologies play in schema-based synthesis and argue that they address different audiences and serve different purposes. Their first role is descriptive: they serve as explicit documentation, and help to understand the internal structure of the system. Their second role is prescriptive: they provide the formal basis against which the other parts of the system (e.g., schemas) can be checked.
Their final role is referential: ontologies also provide semantically meaningful "hooks" which allow schemas and tools to access the internal state of the program derivation process (e.g., fragments of the generated code) in domain-specific rather than language-specific terms, and thus to modify it in a controlled fashion. For discussion purposes we use AUTOLINEAR, a small synthesis system we are currently experimenting with, which can generate code for solving a system of linear equations, Az = b.
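
    To make the schema idea concrete, here is a toy sketch (not the AUTOBAYES/AUTOFILTER machinery): each schema pairs an applicability condition with a refinement step, and synthesis applies schemas recursively until only code remains. Task kinds and field names are invented:

    ```python
    # A toy schema library: (applicability condition, refinement step) pairs.
    SCHEMAS = [
        # solve x = b/a for scalar equations a*x = b
        (lambda task: task["kind"] == "scalar_linear",
         lambda task: f"x = {task['b']} / {task['a']}"),
        # reduce a diagonal system to one scalar task per equation
        (lambda task: task["kind"] == "diag_system",
         lambda task: [{"kind": "scalar_linear", "a": a, "b": b}
                       for a, b in zip(task["diag"], task["rhs"])]),
    ]

    def synthesize(task):
        """Recursively apply the first applicable schema until code remains."""
        for applies, refine in SCHEMAS:
            if applies(task):
                result = refine(task)
                if isinstance(result, str):      # fully refined: emitted code
                    return [result]
                return [line for sub in result for line in synthesize(sub)]
        raise ValueError(f"no schema applies to {task}")

    print("\n".join(synthesize({"kind": "diag_system", "diag": [2, 4], "rhs": [6, 8]})))
    ```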

  3. Computer-based coding of free-text job descriptions to efficiently identify occupations in epidemiological studies.

    PubMed

    Russ, Daniel E; Ho, Kwan-Yuet; Colt, Joanne S; Armenti, Karla R; Baris, Dalsu; Chow, Wong-Ho; Davis, Faith; Johnson, Alison; Purdue, Mark P; Karagas, Margaret R; Schwartz, Kendra; Schwenn, Molly; Silverman, Debra T; Johnson, Calvin A; Friesen, Melissa C

    2016-06-01

    Mapping job titles to standardised occupation classification (SOC) codes is an important step in identifying occupational risk factors in epidemiological studies. Because manual coding is time-consuming and has moderate reliability, we developed an algorithm called SOCcer (Standardized Occupation Coding for Computer-assisted Epidemiologic Research) to assign SOC-2010 codes based on free-text job description components. Job title and task-based classifiers were developed by comparing job descriptions to multiple sources linking job and task descriptions to SOC codes. An industry-based classifier was developed based on the SOC prevalence within an industry. These classifiers were used in a logistic model trained using 14 983 jobs with expert-assigned SOC codes to obtain empirical weights for an algorithm that scored each SOC/job description. We assigned the highest scoring SOC code to each job. SOCcer was validated in 2 occupational data sources by comparing SOC codes obtained from SOCcer to expert assigned SOC codes and lead exposure estimates obtained by linking SOC codes to a job-exposure matrix. For 11 991 case-control study jobs, SOCcer-assigned codes agreed with 44.5% and 76.3% of manually assigned codes at the 6-digit and 2-digit level, respectively. Agreement increased with the score, providing a mechanism to identify assignments needing review. Good agreement was observed between lead estimates based on SOCcer and manual SOC assignments (κ 0.6-0.8). Poorer performance was observed for inspection job descriptions, which included abbreviations and worksite-specific terminology. Although some manual coding will remain necessary, using SOCcer may improve the efficiency of incorporating occupation into large-scale epidemiological studies. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://www.bmj.com/company/products-services/rights-and-licensing/
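
    Schematically, the scoring stage combines the three component classifiers in a logistic model and assigns the highest-scoring code. The weights, scores, and SOC codes below are invented placeholders, not SOCcer's fitted values:

    ```python
    import math

    # Invented component scores and weights: each classifier scores a job
    # description against a candidate SOC code in [0, 1]; the weights stand in
    # for coefficients learned from the expert-coded training jobs.
    WEIGHTS = {"title": 2.1, "task": 1.3, "industry": 0.7}
    BIAS = -3.0

    def soc_score(features):
        """Logistic combination of the job-title, task, and industry scores."""
        z = BIAS + sum(WEIGHTS[name] * value for name, value in features.items())
        return 1.0 / (1.0 + math.exp(-z))

    candidates = {
        "47-2061": {"title": 0.9, "task": 0.7, "industry": 0.8},
        "53-7062": {"title": 0.4, "task": 0.5, "industry": 0.6},
    }
    scored = {soc: soc_score(f) for soc, f in candidates.items()}
    best = max(scored, key=scored.get)
    print(best, round(scored[best], 3))  # assign the highest-scoring SOC code
    ```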

  4. Identification of coding and non-coding mutational hotspots in cancer genomes.

    PubMed

    Piraino, Scott W; Furney, Simon J

    2017-01-05

    The identification of mutations that play a causal role in tumour development, so-called "driver" mutations, is of critical importance for understanding how cancers form and how they might be treated. Several large cancer sequencing projects have identified genes that are recurrently mutated in cancer patients, suggesting a role in tumourigenesis. While the landscape of coding drivers has been extensively studied and many of the most prominent driver genes are well characterised, comparatively less is known about the role of mutations in the non-coding regions of the genome in cancer development. The continuing fall in genome sequencing costs has resulted in a concomitant increase in the number of cancer whole genome sequences being produced, facilitating systematic interrogation of both the coding and non-coding regions of cancer genomes. To examine the mutational landscapes of tumour genomes we have developed a novel method to identify mutational hotspots in tumour genomes using both mutational data and information on evolutionary conservation. We have applied our methodology to over 1300 whole cancer genomes and show that it identifies prominent coding and non-coding regions that are known or highly suspected to play a role in cancer. Importantly, we applied our method to the entire genome, rather than relying on predefined annotations (e.g. promoter regions) and we highlight recurrently mutated regions that may have resulted from increased exposure to mutational processes rather than selection, some of which have been identified previously as targets of selection. Finally, we implicate several pan-cancer and cancer-specific candidate non-coding regions, which could be involved in tumourigenesis. We have developed a framework to identify mutational hotspots in cancer genomes, which is applicable to the entire genome. This framework identifies known and novel coding and non-coding mutational hotspots and can be used to differentiate candidate driver regions from likely passenger regions susceptible to somatic mutation.
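
    A bare-bones version of hotspot detection (window counts tested against a uniform Poisson background) could look like the sketch below. The real method also uses evolutionary conservation, which is omitted here; the window size, threshold, and data are illustrative:

    ```python
    from collections import Counter
    from scipy.stats import poisson

    WINDOW = 50  # base pairs per window (illustrative)

    def hotspots(mutation_positions, genome_length, alpha=1e-6):
        """Flag windows whose mutation count is improbable under a uniform
        background rate; conservation weighting is omitted for brevity."""
        counts = Counter(pos // WINDOW for pos in mutation_positions)
        n_windows = max(1, genome_length // WINDOW)
        background = len(mutation_positions) / n_windows  # expected count/window
        return [(w * WINDOW, (w + 1) * WINDOW)
                for w, c in sorted(counts.items())
                if poisson.sf(c - 1, background) < alpha]  # P(X >= c)

    # 200 scattered mutations plus a 30-mutation pileup around position 1000
    positions = list(range(0, 100000, 500)) + [1000 + i for i in range(30)]
    print(hotspots(positions, genome_length=100_000))  # -> [(1000, 1050)]
    ```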

  5. Crew interface specifications preparation for in-flight maintenance and stowage functions

    NASA Technical Reports Server (NTRS)

    Parker, F. W.; Carlton, B. E.

    1972-01-01

    The findings and data products developed during the Phase 2 crew interface specification study are presented. Five new NASA general specifications were prepared: operations location coding system for crew interfaces; loose equipment and stowage management requirements; loose equipment and stowage data base information requirements; spacecraft loose equipment stowage drawing requirements; and inflight stowage management data requirements. Additional data was developed defining inflight maintenance processes and related data concepts for inflight troubleshooting, remove/repair/replace and scheduled maintenance activities. The process of maintenance task and equipment definition during spacecraft design and development was also defined and related data concepts were identified for further development into formal NASA specifications during future follow-on study phases of the contract.

  6. Evaluating a Control System Architecture Based on a Formally Derived AOCS Model

    NASA Astrophysics Data System (ADS)

    Ilic, Dubravka; Latvala, Timo; Varpaaniemi, Kimmo; Vaisanen, Pauli; Troubitsyna, Elena; Laibinis, Linas

    2010-08-01

    Attitude & Orbit Control System (AOCS) refers to a wider class of control systems which are used to determine and control the attitude of the spacecraft while in orbit, based on the information obtained from various sensors. In this paper, we propose an approach to evaluate a typical (yet somewhat simplified) AOCS architecture using formal development - based on the Event-B method. As a starting point, an Ada specification of the AOCS is translated into a formal specification and further refined to incorporate all the details of its original source code specification. This way we are able not only to evaluate the Ada specification by expressing and verifying specific system properties in our formal models, but also to determine how well the chosen modelling framework copes with the level of detail required for an actual implementation and code generation from the derived models.

  7. Measurement of neutron spectra in the AWE workplace using a Bonner sphere spectrometer.

    PubMed

    Danyluk, Peter

    2010-12-01

    A Bonner sphere spectrometer has been used to measure the neutron spectra in eight different workplace areas at AWE (Atomic Weapons Establishment). The spectra were analysed by the National Physical Laboratory using their principal unfolding code STAY'SL and the results were also analysed by AWE using a bespoke parametrised unfolding code. The bespoke code was designed specifically for the AWE workplace and is very simple to use. Both codes gave results in good agreement. It was found that the measured fluence rate varied from 2 to 70 neutrons cm⁻² s⁻¹ (± 10%) and the ambient dose equivalent H*(10) varied from 0.5 to 57 µSv h⁻¹ (± 20%). A detailed description of the development and use of the bespoke code is presented.
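
    The unfolding problem itself is compact: measured counts are approximately R @ flux for a sphere-response matrix R, and the spectrum is recovered by inverting that relation. The sketch below uses non-negative least squares as a simplified stand-in for STAY'SL or the bespoke parametrised code, with synthetic numbers throughout:

    ```python
    import numpy as np
    from scipy.optimize import nnls

    # Synthetic response matrix: 8 sphere sizes x 5 energy groups. In practice
    # R comes from calibration or transport calculations.
    rng = np.random.default_rng(0)
    R = rng.uniform(0.1, 1.0, size=(8, 5))
    true_flux = np.array([5.0, 2.0, 1.0, 0.5, 0.2])
    counts = R @ true_flux + rng.normal(0, 0.05, 8)  # noisy measured counts

    flux, residual = nnls(R, counts)                 # non-negative least squares
    print(np.round(flux, 2), round(float(residual), 3))
    ```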

  8. SCISEAL: A CFD Code for Analysis of Fluid Dynamic Forces in Seals

    NASA Technical Reports Server (NTRS)

    Althavale, Mahesh M.; Ho, Yin-Hsing; Przekwas, Andre J.

    1996-01-01

    A 3D CFD code, SCISEAL, has been developed and validated. Its capabilities include cylindrical seals, and it is employed on labyrinth seals, rim seals, and disc cavities. State-of-the-art numerical methods include colocated grids, high-order differencing, and turbulence models which account for wall roughness. SCISEAL computes efficient solutions for complicated flow geometries and seal-specific capabilities (rotor loads, torques, etc.).

  9. Operations analysis (study 2.6). Volume 4: Computer specification; logistics of orbiting vehicle servicing (LOVES)

    NASA Technical Reports Server (NTRS)

    1973-01-01

    The computer specification for the logistics of orbiting vehicle servicing (LOVES) was developed, and a number of alternatives to improve utilization of the space shuttle and the tug were investigated. Preliminary results indicate that space servicing offers a potential for reducing future operational and program costs over ground refurbishment of satellites. A computer code which could be developed to simulate space servicing is presented.

  10. Tailoring a software production environment for a large project

    NASA Technical Reports Server (NTRS)

    Levine, D. R.

    1984-01-01

    A software production environment was constructed to meet the specific goals of a particular large programming project. These goals, the specific solutions as implemented, and the experiences on a project of over 100,000 lines of source code are discussed. The base development environment for this project was an ordinary PWB Unix (tm) system. Several important aspects of the development process required support not available in the existing tool set.

  11. Viewpoint: a comparison of cause-of-injury coding in U.S. military and civilian hospitals.

    PubMed

    Amoroso, P J; Bell, N S; Smith, G S; Senier, L; Pickett, D

    2000-04-01

    Complete and accurate coding of injury causes is essential to the understanding of injury etiology and to the development and evaluation of injury-prevention strategies. While civilian hospitals use ICD-9-CM external cause-of-injury codes, military hospitals use codes derived from the NATO Standardization Agreement (STANAG) 2050. The STANAG uses two separate variables to code injury cause. The Trauma code uses a single digit with 10 possible values to identify the general class of injury as battle injury, intentionally inflicted nonbattle injury, or unintentional injury. The Injury code is used to identify cause or activity at the time of the injury. For a subset of the Injury codes, the last digit is modified to indicate place of occurrence. This simple system contains fewer than 300 basic codes, including many that are specific to battle- and sports-related injuries not coded well by either the ICD-9-CM or the draft ICD-10-CM. However, while falls, poisonings, and injuries due to machinery and tools are common causes of injury hospitalizations in the military, few STANAG codes correspond to these events. Intentional injuries in general and sexual assaults in particular are also not well represented in the STANAG. Because the STANAG does not map directly to the ICD-9-CM system, quantitative comparisons between military and civilian data are difficult. The ICD-10-CM, which will be implemented in the United States sometime after 2001, expands considerably on its predecessor, ICD-9-CM, and provides more specificity and detail than the STANAG. With slight modification, it might become a suitable replacement for the STANAG.

  12. Auto Code Generation for Simulink-Based Attitude Determination Control System

    NASA Technical Reports Server (NTRS)

    MolinaFraticelli, Jose Carlos

    2012-01-01

    This paper details the work done to auto generate C code from a Simulink-based Attitude Determination Control System (ADCS) to be used on target platforms. NASA Marshall engineers have developed an ADCS Simulink simulation to be used as a component of the flight software of a satellite. This generated code can be used for carrying out hardware-in-the-loop testing of components for a satellite in a convenient manner, with easily tunable parameters. Due to the nature of embedded hardware components such as microcontrollers, this simulation code cannot be used directly, as it is, on the target platform and must first be converted into C code; this process is known as auto code generation. In order to generate C code from this simulation, it must be modified to follow specific standards set in place by the auto code generation process. Some of these modifications include changing certain simulation models into their atomic representations, which can bring new complications into the simulation. The execution order of these models can change based on these modifications. Great care must be taken in order to maintain a working simulation that can also be used for auto code generation. After modifying the ADCS simulation for the auto code generation process, it is shown that the difference between the output data of the former and that of the latter is within acceptable bounds. Thus, it can be said that the process is a success, since all the output requirements are met. Based on these results, it can be argued that this generated C code can be effectively used by any desired platform as long as it follows the specific memory requirements established in the Simulink model.

  13. Domain Specific Language Support for Exascale

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mellor-Crummey, John

    A multi-institutional project known as D-TEC (short for “Domain-specific Technology for Exascale Computing”) set out to explore technologies to support the construction of Domain Specific Languages (DSLs) to map application programs to exascale architectures. DSLs employ automated code transformation to shift the burden of delivering portable performance from application programmers to compilers. Two chief properties contribute: DSLs permit expression at a high level of abstraction so that a programmer’s intent is clear to a compiler, and DSL implementations encapsulate human domain-specific optimization knowledge so that a compiler can be smart enough to achieve good results on specific hardware. Domain specificity is what makes these properties possible in a programming language. If leveraging domain specificity is the key to keeping exascale software tractable, a corollary is that many different DSLs will be needed to encompass the full range of exascale computing applications; moreover, a single application may well need to use several different DSLs in conjunction. As a result, developing a general toolkit for building domain-specific languages was a key goal for the D-TEC project. Different aspects of the D-TEC research portfolio were the focus of work at each of the partner institutions in the multi-institutional project. D-TEC research and development work at Rice University focused on three principal topics: understanding how to automate the tuning of code for complex architectures, research and development of the Rosebud DSL engine, and compiler technology to support complex execution platforms. This report provides a summary of the research and development work on the D-TEC project at Rice University.
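
    To make the code-transformation idea concrete, the toy sketch below declares a domain-specific abstraction (a 1-D stencil) at a high level and mechanically lowers it to loop code. This is an illustrative, assumption-level example, not part of the D-TEC or Rosebud tooling.

        # Toy embedded DSL: a stencil is declared declaratively, then "lowered"
        # to an explicit C-like loop, standing in for the kind of automated code
        # transformation a DSL engine performs. All names are illustrative.
        from dataclasses import dataclass

        @dataclass
        class Stencil:
            name: str
            weights: dict          # offset -> coefficient

            def lower_to_c(self, n: str = "n") -> str:
                terms = " + ".join(
                    f"{c}*in[i{off:+d}]" if off else f"{c}*in[i]"
                    for off, c in sorted(self.weights.items()))
                halo = max(abs(o) for o in self.weights)
                return (f"for (int i = {halo}; i < {n} - {halo}; ++i)\n"
                        f"    out[i] = {terms};")

        laplace1d = Stencil("laplace1d", {-1: 1.0, 0: -2.0, 1: 1.0})
        print(laplace1d.lower_to_c())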

  14. Legacy Code Modernization

    NASA Technical Reports Server (NTRS)

    Hribar, Michelle R.; Frumkin, Michael; Jin, Haoqiang; Waheed, Abdul; Yan, Jerry; Saini, Subhash (Technical Monitor)

    1998-01-01

    Over the past decade, high performance computing has evolved rapidly; systems based on commodity microprocessors have been introduced in quick succession from at least seven vendors/families. Porting codes to every new architecture is a difficult problem; in particular, here at NASA, there are many large CFD applications that are very costly to port to new machines by hand. The LCM ("Legacy Code Modernization") Project is the development of an integrated parallelization environment (IPE) which performs the automated mapping of legacy CFD (Fortran) applications to state-of-the-art high performance computers. While most projects to port codes focus on the parallelization of the code, we consider porting to be an iterative process consisting of several steps: 1) code cleanup, 2) serial optimization, 3) parallelization, 4) performance monitoring and visualization, 5) intelligent tools for automated tuning using performance prediction, and 6) machine-specific optimization. The approach for building this parallelization environment is to build the components for each of the steps simultaneously and then integrate them together. The demonstration will exhibit our latest research in building this environment: 1. Parallelizing tools and compiler evaluation. 2. Code cleanup and serial optimization using automated scripts. 3. Development of a code generator for performance prediction. 4. Automated partitioning. 5. Automated insertion of directives. These demonstrations will exhibit the effectiveness of an automated approach for all the steps involved with porting and tuning a legacy code application for a new architecture.

  15. A Concept for Run-Time Support of the Chapel Language

    NASA Technical Reports Server (NTRS)

    James, Mark

    2006-01-01

    A document presents a concept for run-time implementation of other concepts embodied in the Chapel programming language. (Now undergoing development, Chapel is intended to become a standard language for parallel computing that would surpass older such languages both in computational performance and in the efficiency with which pre-existing code can be reused and new code written.) The aforementioned other concepts are those of distributions, domains, allocations, and access, as defined in a separate document called "A Semantic Framework for Domains and Distributions in Chapel" and linked to a language specification defined in another separate document called "Chapel Specification 0.3." The concept presented in the instant report is recognition that a data domain that was invented for Chapel offers a novel approach to distributing and processing data in a massively parallel environment. The concept is offered as a starting point for development of working descriptions of functions and data structures that would be necessary to implement interfaces to a compiler for transforming the aforementioned other concepts from their representations in Chapel source code to their run-time implementations.

  16. Calibration and comparison of the NASA Lewis free-piston Stirling engine model predictions with RE-1000 test data

    NASA Technical Reports Server (NTRS)

    Geng, Steven M.

    1987-01-01

    A free-piston Stirling engine performance code is being upgraded and validated at the NASA Lewis Research Center under an interagency agreement between the Department of Energy's Oak Ridge National Laboratory and NASA Lewis. Many modifications were made to the free-piston code in an attempt to decrease the calibration effort. A procedure was developed that made the code calibration process more systematic. Engine-specific calibration parameters are often used to bring predictions and experimental data into better agreement. The code was calibrated to a matrix of six experimental data points. Predictions of the calibrated free-piston code are compared with RE-1000 free-piston Stirling engine sensitivity test data taken at NASA Lewis. Reasonable agreement was obtained between the code predictions and the experimental data over a wide range of engine operating conditions.

  17. WEC3: Wave Energy Converter Code Comparison Project: Preprint

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Combourieu, Adrien; Lawson, Michael; Babarit, Aurelien

    This paper describes the recently launched Wave Energy Converter Code Comparison (WEC3) project and presents preliminary results from this effort. The objectives of WEC3 are to verify and validate numerical modelling tools that have been developed specifically to simulate wave energy conversion devices and to inform the upcoming IEA OES Annex VI Ocean Energy Modelling Verification and Validation project. WEC3 is divided into two phases. Phase I consists of a code-to-code verification and Phase II entails code-to-experiment validation. WEC3 focuses on mid-fidelity codes that simulate WECs using time-domain multibody dynamics methods to model device motions and hydrodynamic coefficients to model hydrodynamic forces. Consequently, high-fidelity numerical modelling tools, such as Navier-Stokes computational fluid dynamics simulation, and simple frequency-domain modelling tools were not included in the WEC3 project.

  18. Alloy 740H Component Manufacturing Development

    NASA Astrophysics Data System (ADS)

    de Barbadillo, J. J.; Baker, B. A.; Gollihue, R. D.; Patel, S. J.

    Alloy 740H was developed specifically for use in A-USC power plants. This alloy has been intensively evaluated in collaborative programs throughout the world, and the key properties have been verified and documented. In 2011 the alloy was approved for use in welded construction under ASME Code Case 2702. At present, alloy 740H is the only age-hardened nickel-base alloy that is ASME code approved. The emphasis for A-USC materials development is now on verification of the metalworking industry's capability to make the full range of mill product forms and sizes and to produce fittings and fabrications required for construction of a power plant. This paper presents the results of recent developments in component manufacture and evaluation.

  19. Development of an LSI maximum-likelihood convolutional decoder for advanced forward error correction capability on the NASA 30/20 GHz program

    NASA Technical Reports Server (NTRS)

    Clark, R. T.; Mccallister, R. D.

    1982-01-01

    The particular coding option identified as providing the best level of coding gain performance in an LSI-efficient implementation was the optimal constraint length five, rate one-half convolutional code. To determine the specific set of design parameters which optimally matches this decoder to the LSI constraints, a breadboard MCD (maximum-likelihood convolutional decoder) was fabricated and used to generate detailed performance trade-off data. The extensive performance testing data gathered during this design tradeoff study are summarized, and the functional and physical MCD chip characteristics are presented.
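
    For illustration, a rate one-half, constraint-length-five convolutional encoder fits in a few lines. The generator pair (23, 35) octal is a commonly cited optimal choice for K = 5; whether it matches the MCD's exact generators is an assumption.

        # K = 5, rate-1/2 convolutional encoder with generators (23, 35) octal.
        G = (0o23, 0o35)

        def encode(bits, K=5):
            state, out = 0, []
            for b in bits:
                state = ((state << 1) | b) & ((1 << K) - 1)   # shift in newest bit
                for g in G:                                    # one output bit per generator
                    out.append(bin(state & g).count("1") & 1)  # parity of tapped bits
            return out

        print(encode([1, 0, 1, 1, 0, 0, 0, 0]))  # trailing zeros flush the register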

  1. GCS component development cycle

    NASA Astrophysics Data System (ADS)

    Rodríguez, Jose A.; Macias, Rosa; Molgo, Jordi; Guerra, Dailos; Pi, Marti

    2012-09-01

    The GTC is an optical-infrared 10-meter segmented-mirror telescope at the ORM observatory in the Canary Islands (Spain). First light was on 13/07/2007 and since then it has been in the operation phase. The GTC control system (GCS) is a distributed object- and component-oriented system based on RT-CORBA, and it is responsible for the management and operation of the telescope, including its instrumentation. GCS has used the Rational Unified Process (RUP) in its development. RUP is an iterative software development process framework. After analysing (use cases) and designing (UML) any of the GCS subsystems, an initial component description of its interface is obtained, and from that information a component specification is written. In order to improve code productivity, GCS has adopted code generation to transform this component specification into the skeleton of component classes based on a software framework called the Device Component Framework. Using the GCS development tools, based on javadoc and gcc, in only one step the component is generated, compiled and deployed, to be tested for the first time through our GUI inspector. The main advantages of this approach are the following: it reduces the learning curve of new developers and the development error rate, allows a systematic use of design patterns in the development and software reuse, speeds up the delivery of the software product, improves design consistency and design quality, and eliminates the future refactoring process otherwise required for the code.
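
    The specification-to-skeleton step can be sketched as follows. The spec format and the emitted skeleton are illustrative assumptions; the real GCS generator emits classes over its Device Component Framework rather than this toy output.

        # Toy component-spec-to-skeleton generator in the spirit of the GCS
        # approach; the spec layout and emitted class shape are illustrative.
        spec = {
            "component": "M1SegmentController",
            "properties": {"position": "float", "locked": "bool"},
            "commands": ["park", "track"],
        }

        def generate_skeleton(spec: dict) -> str:
            lines = [f"class {spec['component']}(DeviceComponent):"]
            for name, typ in spec["properties"].items():
                lines.append(f"    {name}: {typ}  # monitored property")
            for cmd in spec["commands"]:
                lines.append(f"    def {cmd}(self):")
                lines.append(f"        raise NotImplementedError('fill in {cmd}')")
            return "\n".join(lines)

        print(generate_skeleton(spec))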

  2. MINIMUM CHECK LIST FOR MECHANICAL PLANS AND SPECIFICATIONS.

    ERIC Educational Resources Information Center

    PIERCE, J.L.

    THIS BULLETIN HAS BEEN PREPARED FOR USE AS A MINIMUM CHECK LIST IN THE DEVELOPMENT AND REVIEW OF MECHANICAL AND ELECTRICAL PLANS AND SPECIFICATIONS BY ENGINEERS, ARCHITECTS, AND SUPERINTENDENTS IN PLANNING PUBLIC SCHOOL FACILITIES. THREE LEVELS OF GUIDELINES ARE MENTIONED--(1) MANDATORY BECAUSE OF LAW, CODE, OR REGULATION, (2) RECOMMENDED AS MOST…

  3. Development of an efficient entire-capsid-coding-region amplification method for direct detection of poliovirus from stool extracts.

    PubMed

    Arita, Minetaro; Kilpatrick, David R; Nakamura, Tomofumi; Burns, Cara C; Bukbuk, David; Oderinde, Soji B; Oberste, M Steven; Kew, Olen M; Pallansch, Mark A; Shimizu, Hiroyuki

    2015-01-01

    Laboratory diagnosis has played a critical role in the Global Polio Eradication Initiative since 1988, by isolating and identifying poliovirus (PV) from stool specimens by using cell culture as a highly sensitive system to detect PV. In the present study, we aimed to develop a molecular method to detect PV directly from stool extracts, with a high efficiency comparable to that of cell culture. We developed a method to efficiently amplify the entire capsid coding region of human enteroviruses (EVs) including PV. cDNAs of the entire capsid coding region (3.9 kb) were obtained from as few as 50 copies of PV genomes. PV was detected from the cDNAs with an improved PV-specific real-time reverse transcription-PCR system and nucleotide sequence analysis of the VP1 coding region. For assay validation, we analyzed 84 stool extracts that were positive for PV in cell culture and detected PV genomes from 100% of the extracts (84/84 samples) with this method in combination with a PV-specific extraction method. PV could be detected in 2/4 stool extract samples that were negative for PV in cell culture. In PV-positive samples, EV species C viruses were also detected with high frequency (27% [23/86 samples]). This method would be useful for direct detection of PV from stool extracts without using cell culture. Copyright © 2015, American Society for Microbiology. All Rights Reserved.

  4. Divergent evolutionary rates in vertebrate and mammalian specific conserved non-coding elements (CNEs) in echolocating mammals.

    PubMed

    Davies, Kalina T J; Tsagkogeorga, Georgia; Rossiter, Stephen J

    2014-12-19

    The majority of DNA contained within vertebrate genomes is non-coding, with a certain proportion of this thought to play regulatory roles during development. Conserved Non-coding Elements (CNEs) are an abundant group of putative regulatory sequences that are highly conserved across divergent groups and thus assumed to be under strong selective constraint. Many CNEs may contain regulatory factor binding sites, and their frequent spatial association with key developmental genes - such as those regulating sensory system development - suggests crucial roles in regulating gene expression and cellular patterning. Yet surprisingly little is known about the molecular evolution of CNEs across diverse mammalian taxa or their role in specific phenotypic adaptations. We examined 3,110 vertebrate-specific and ~82,000 mammalian-specific CNEs across 19 and 9 mammalian orders respectively, and tested for changes in the rate of evolution of CNEs located in the proximity of genes underlying the development or functioning of auditory systems. As we focused on CNEs putatively associated with genes underlying the development/functioning of auditory systems, we incorporated echolocating taxa in our dataset because of their highly specialised and derived auditory systems. Phylogenetic reconstructions of concatenated CNEs broadly recovered accepted mammal relationships despite high levels of sequence conservation. We found that CNE substitution rates were highest in rodents and lowest in primates, consistent with previous findings. Comparisons of CNE substitution rates from several genomic regions containing genes linked to auditory system development and hearing revealed differences between echolocating and non-echolocating taxa. Wider taxonomic sampling of four CNEs associated with the homeobox genes Hmx2 and Hmx3 - which are required for inner ear development - revealed family-wise variation across diverse bat species. Specifically, within one family of echolocating bats that utilise frequency-modulated echolocation calls varying widely in frequency and intensity, high levels of sequence divergence were found. Levels of selective constraint acting on CNEs differed both across genomic locations and taxa, with observed variation in substitution rates of CNEs among bat species. More work is needed to determine whether this variation can be linked to echolocation, and wider taxonomic sampling is necessary to fully document levels of conservation in CNEs across diverse taxa.

  5. A Three-Phase Decision Model of Computer-Aided Coding for the Iranian Classification of Health Interventions (IRCHI).

    PubMed

    Azadmanjir, Zahra; Safdari, Reza; Ghazisaeedi, Marjan; Mokhtaran, Mehrshad; Kameli, Mohammad Esmail

    2017-06-01

    Accurate coded data in healthcare are critical. Computer-Assisted Coding (CAC) is an effective tool for improving clinical coding, in particular when a new classification is being developed and implemented. But determining the appropriate development method requires considering the specifications of existing CAC systems, the requirements of each type, the available infrastructure and also the classification scheme. The aim of the study was the development of a decision model for determining the accurate code of each medical intervention in the Iranian Classification of Health Interventions (IRCHI) that can be implemented as a suitable CAC system. First, a sample of existing CAC systems was reviewed. Then the feasibility of each type of CAC was examined with regard to the prerequisites for its implementation. In the next step, a proper model was proposed according to the structure of the classification scheme and was implemented as an interactive system. There is a significant relationship between the level of assistance of a CAC system and its integration with electronic medical documents. Implementation of fully automated CAC systems is impossible at present due to the immature development of electronic medical records and problems in using language for medical documenting. So, a model was proposed to develop a semi-automated CAC system based on hierarchical relationships between entities in the classification scheme, and also on the logic of decision making to specify the characters of a code step by step through a web-based interactive user interface. It is composed of three phases, to select the Target, Action and Means, respectively, for an intervention. The proposed model was suitable for the current status of clinical documentation and coding in Iran and also for the structure of the new classification scheme. Our results show it is practical. However, the model needs to be evaluated in the next stage of the research.
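
    The three-phase selection logic can be sketched over a toy hierarchy, with one interactive step per phase; the hierarchy and codes below are illustrative, not the real IRCHI tables.

        # Three-phase stepwise code assignment: Target -> Action -> Means.
        hierarchy = {
            "eye":   {"excision": {"laser": "A01.1", "scalpel": "A01.2"}},
            "heart": {"bypass":   {"open": "B02.1"}},
        }

        def choose(options, phase):
            print(f"{phase} options: {', '.join(options)}")
            return input(f"select {phase}> ")     # interactive step, as in the model

        def assign_code():
            target = choose(hierarchy, "Target")                  # phase 1
            action = choose(hierarchy[target], "Action")          # phase 2
            means = choose(hierarchy[target][action], "Means")    # phase 3
            return hierarchy[target][action][means]

        if __name__ == "__main__":
            print("intervention code:", assign_code())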

  6. Guidelines for VCCT-Based Interlaminar Fatigue and Progressive Failure Finite Element Analysis

    NASA Technical Reports Server (NTRS)

    Deobald, Lyle R.; Mabson, Gerald E.; Engelstad, Steve; Prabhakar, M.; Gurvich, Mark; Seneviratne, Waruna; Perera, Shenal; O'Brien, T. Kevin; Murri, Gretchen; Ratcliffe, James

    2017-01-01

    This document is intended to detail the theoretical basis, equations, references and data that are necessary to enhance the functionality of commercially available Finite Element codes, with the objective of having functionality better suited for the aerospace industry in the area of composite structural analysis. The specific area of focus will be improvements to composite interlaminar fatigue and progressive interlaminar failure. Suggestions are biased towards codes that perform interlaminar Linear Elastic Fracture Mechanics (LEFM) using Virtual Crack Closure Technique (VCCT)-based algorithms [1,2]. Not all aspects of the science associated with composite interlaminar crack growth are fully developed, and the codes developed to predict this mode of failure must be programmed with sufficient flexibility to accommodate new functional relationships as the science matures.
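
    As background, the classic two-dimensional form of VCCT (a standard result from the fracture-mechanics literature, quoted here for orientation rather than taken from this document's references) computes the mode I and mode II energy release rates at a crack-tip node from the tip nodal forces and the relative displacements of the node pair one element behind the tip:

        G_{I} = \frac{Z_i \, \Delta w}{2 \, \Delta a \, b}, \qquad
        G_{II} = \frac{X_i \, \Delta u}{2 \, \Delta a \, b}

    where Z_i and X_i are the normal and shear forces at crack-tip node i, \Delta w and \Delta u are the relative opening and sliding displacements of the node pair behind the tip, \Delta a is the length of the released element, and b is the width; sign conventions vary between implementations.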

  7. Certifying Auto-Generated Flight Code

    NASA Technical Reports Server (NTRS)

    Denney, Ewen

    2008-01-01

    Model-based design and automated code generation are being used increasingly at NASA. Many NASA projects now use MathWorks Simulink and Real-Time Workshop for at least some of their modeling and code development. However, there are substantial obstacles to more widespread adoption of code generators in safety-critical domains. Since code generators are typically not qualified, there is no guarantee that their output is correct, and consequently the generated code still needs to be fully tested and certified. Moreover, the regeneration of code can require complete recertification, which offsets many of the advantages of using a generator. Indeed, manual review of autocode can be more challenging than for hand-written code. Since the direct V&V of code generators is too laborious and complicated due to their complex (and often proprietary) nature, we have developed a generator plug-in to support the certification of the auto-generated code. Specifically, the AutoCert tool supports certification by formally verifying that the generated code is free of different safety violations, by constructing an independently verifiable certificate, and by explaining its analysis in a textual form suitable for code reviews. The generated documentation also contains substantial tracing information, allowing users to trace between model, code, documentation, and V&V artifacts. This enables missions to obtain assurance about the safety and reliability of the code without excessive manual V&V effort and, as a consequence, eases the acceptance of code generators in safety-critical contexts. The generation of explicit certificates and textual reports is particularly well-suited to supporting independent V&V. The primary contribution of this approach is the combination of human-friendly documentation with formal analysis. The key technical idea is to exploit the idiomatic nature of auto-generated code in order to automatically infer logical annotations. The annotation inference algorithm itself is generic, and parametrized with respect to a library of coding patterns that depend on the safety policies and the code generator. The patterns characterize the notions of definitions and uses that are specific to the given safety property. For example, for initialization safety, definitions correspond to variable initializations while uses are statements which read a variable, whereas for array bounds safety, definitions are the array declarations, while uses are statements which access an array variable. The inferred annotations are thus highly dependent on the actual program and the properties being proven. The annotations, themselves, need not be trusted, but are crucial to obtain the automatic formal verification of the safety properties without requiring access to the internals of the code generator. The approach has been applied to both in-house and commercial code generators, but is independent of the particular generator used. It is currently being adapted to flight code generated using MathWorks Real-Time Workshop, an automatic code generator that translates from Simulink/Stateflow models into embedded C code.
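
    The definition/use idea behind the initialization-safety policy can be sketched on straight-line code: flag any read of a variable before an assignment defines it. This toy scan illustrates the concept only and is not AutoCert's annotation-inference algorithm.

        # Flag uses of variables that occur before any defining assignment.
        import ast

        def uses_before_init(src: str):
            defined, violations = set(), []
            for stmt in ast.parse(src).body:
                for node in ast.walk(stmt):
                    if isinstance(node, ast.Name) and isinstance(node.ctx, ast.Load):
                        if node.id not in defined:
                            violations.append((node.lineno, node.id))
                if isinstance(stmt, ast.Assign):        # definitions take effect
                    for tgt in stmt.targets:            # after the statement's uses
                        if isinstance(tgt, ast.Name):
                            defined.add(tgt.id)
            return violations

        print(uses_before_init("x = 1\ny = x + z\nz = 2\n"))   # -> [(2, 'z')]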

  8. Albany v. 3.0

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Salinger, Andrew; Phipps, Eric; Ostien, Jakob

    2016-01-13

    The Albany code is a general-purpose finite element code for solving partial differential equations (PDEs). Albany is a research code that demonstrates how a PDE code can be built by interfacing many of the open-source software libraries that are released under Sandia's Trilinos project. Part of the mission of Albany is to be a testbed for new Trilinos libraries, to refine their methods, usability, and interfaces. Albany includes hooks to optimization and uncertainty quantification algorithms, including those in Trilinos as well as those in the Dakota toolkit. Because of this, Albany is a desirable starting point for new code development efforts that wish to make heavy use of Trilinos. Albany is both a framework and the host for specific finite element applications. These applications have project names and can be controlled by configuration options when the code is compiled, but are all developed and released as part of the single Albany code base. These include the LCM, QCAD, FELIX, Aeras, and ATO applications.

  9. Concurrent electromagnetic scattering analysis

    NASA Technical Reports Server (NTRS)

    Patterson, Jean E.; Cwik, Tom; Ferraro, Robert D.; Jacobi, Nathan; Liewer, Paulett C.; Lockhart, Thomas G.; Lyzenga, Gregory A.; Parker, Jay

    1989-01-01

    The computational power of the hypercube parallel computing architecture is applied to the solution of large-scale electromagnetic scattering and radiation problems. Three analysis codes have been implemented. A Hypercube Electromagnetic Interactive Analysis Workstation was developed to aid in the design and analysis of metallic structures such as antennas and to facilitate the use of these analysis codes. The workstation provides a general user environment for specification of the structure to be analyzed and graphical representations of the results.

  10. Editing of EIA coded, numerically controlled, machine tool tapes

    NASA Technical Reports Server (NTRS)

    Weiner, J. M.

    1975-01-01

    Editing of numerically controlled (N/C) machine tool tapes (8-level paper tape) using an interactive graphic display processor is described. A rapid technique required for correcting production errors in N/C tapes was developed using the interactive text editor on the IMLAC PDS-ID graphic display system and two special programs resident on disk. The correction technique and special programs for processing N/C tapes coded to EIA specifications are discussed.

  11. NTRFACE for MAGIC

    DTIC Science & Technology

    Gladd, N. T.

    1989-07-31

    The NTRFACE system was developed and made concrete by applying it to a specific application: a mature, highly complex plasma physics particle-in-cell simulation code named MAGIC.

  12. Agriculture. Dairy Livestock.

    ERIC Educational Resources Information Center

    Michigan State Univ., East Lansing. Coll. of Agriculture and Natural Resources Education Inst.

    This task-based curriculum guide for agricultural production, specifically for dairy livestock, is intended to help the teacher develop a classroom management system where students learn by doing. Introductory materials include a Dictionary of Occupational Titles job code and title sheet, a task sheet for developing leadership skills, and a task…

  13. Agriculture. Sheep Livestock.

    ERIC Educational Resources Information Center

    Michigan State Univ., East Lansing. Coll. of Agriculture and Natural Resources Education Inst.

    This task-based curriculum guide for agricultural production, specifically for sheep, is intended to help the teacher develop a classroom management system where students learn by doing. Introductory materials include a Dictionary of Occupational Titles job code and title sheet, a task sheet for developing leadership skills, and a task list. Each…

  14. Agriculture. Beef Livestock.

    ERIC Educational Resources Information Center

    Michigan State Univ., East Lansing. Coll. of Agriculture and Natural Resources Education Inst.

    This task-based curriculum guide for agricultural production, specifically for beef livestock, is intended to help the teacher develop a classroom management system where students learn by doing. Introductory materials include a Dictionary of Occupational Titles job code and title sheet, a task sheet for developing leadership skills, and a task…

  15. Agriculture. Poultry Livestock.

    ERIC Educational Resources Information Center

    Michigan State Univ., East Lansing. Coll. of Agriculture and Natural Resources Education Inst.

    This task-based curriculum guide for agricultural production, specifically for poultry, is intended to help the teacher develop a classroom management system where students learn by doing. Introductory materials include a Dictionary of Occupational Titles job code and title sheet, a task sheet for developing leadership skills, and a task list.…

  16. Agriculture. Swine Livestock.

    ERIC Educational Resources Information Center

    Michigan State Univ., East Lansing. Coll. of Agriculture and Natural Resources Education Inst.

    This task-based curriculum guide for agricultural production, specifically for swine, is intended to help the teacher develop a classroom management system where students learn by doing. Introductory materials include a Dictionary of Occupational Titles job code and title sheet, a task sheet for developing leadership skills, and a task list. Each…

  17. VICTORIA: A mechanistic model for radionuclide behavior in the reactor coolant system

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Schaperow, J.H.; Bixler, N.E.

    1996-12-31

    VICTORIA is the U.S. Nuclear Regulatory Commission's (NRC's) mechanistic, best-estimate code for analysis of fission product release from the core and subsequent transport in the reactor vessel and reactor coolant system. VICTORIA requires thermal-hydraulic data (i.e., temperatures, pressures, and velocities) as input. In the past, these data have been taken from the results of calculations from thermal-hydraulic codes such as SCDAP/RELAP5, MELCOR, and MAAP. Validation and assessment of VICTORIA 1.0 have been completed. An independent peer review of VICTORIA, directed by Brookhaven National Laboratory and supported by experts in the areas of fuel release, fission product chemistry, and aerosol physics, has been undertaken. This peer review, which will independently assess the code's capabilities, is nearing completion, with the peer review committee's final report expected in Dec 1996. A limited amount of additional development is expected as a result of the peer review. Following this additional development, the NRC plans to release VICTORIA 1.1 and an updated and improved code manual. Future plans mainly involve use of the code for plant calculations to investigate specific safety issues as they arise. Also, the code will continue to be used in support of the Phebus experiments.

  18. Extremely accurate sequential verification of RELAP5-3D

    DOE PAGES

    Mesina, George L.; Aumiller, David L.; Buschman, Francis X.

    2015-11-19

    Large computer programs like RELAP5-3D solve complex systems of governing, closure and special process equations to model the underlying physics of nuclear power plants. Further, these programs incorporate many other features for physics, input, output, data management, user-interaction, and post-processing. For software quality assurance, the code must be verified and validated before being released to users. For RELAP5-3D, verification and validation are restricted to nuclear power plant applications. Verification means ensuring that the program is built right by checking that it meets its design specifications, comparing coding to algorithms and equations, and comparing calculations against analytical solutions and the method of manufactured solutions. Sequential verification performs these comparisons initially, but thereafter only compares code calculations between consecutive code versions to demonstrate that no unintended changes have been introduced. Recently, an automated, highly accurate sequential verification method has been developed for RELAP5-3D. The method also tests that no unintended consequences result from code development in the following code capabilities: repeating a timestep advancement, continuing a run from a restart file, multiple cases in a single code execution, and modes of coupled/uncoupled operation. In conclusion, mathematical analyses of the adequacy of the checks used in the comparisons are provided.
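
    The consecutive-version comparison can be sketched as follows, assuming an illustrative verification-file format of one named double-precision sum per line; the actual RELAP5-3D file layout and tolerances are not reproduced here.

        # Compare "verification files" of key-variable sums between two versions.
        def read_sums(path):
            sums = {}
            with open(path) as f:
                for line in f:                  # e.g. "pressure_sum 1.234567890123e+05"
                    name, value = line.split()
                    sums[name] = float(value)
            return sums

        def compare(old_path, new_path, rtol=1e-14):
            old, new = read_sums(old_path), read_sums(new_path)
            return [k for k in old
                    if abs(new[k] - old[k]) > rtol * max(abs(old[k]), 1.0)]

        # An empty list means no unintended change between consecutive versions:
        # print(compare("v4.3.4/case1.vrf", "v4.3.5/case1.vrf"))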

  19. A computer-aided design system geared toward conceptual design in a research environment. [for hypersonic vehicles]

    NASA Technical Reports Server (NTRS)

    Stack, S. H.

    1981-01-01

    A computer-aided design system has recently been developed specifically for the small research group environment. The system is implemented on a Prime 400 minicomputer linked with a CDC 6600 computer. The goal was to assign the minicomputer specific tasks, such as data input and graphics, thereby reserving the large mainframe computer for time-consuming analysis codes. The basic structure of the design system consists of GEMPAK, a computer code that generates detailed configuration geometry from a minimum of input; interface programs that reformat GEMPAK geometry for input to the analysis codes; and utility programs that simplify computer access and data interpretation. The working system has had a large positive impact on the quantity and quality of research performed by the originating group. This paper describes the system, the major factors that contributed to its particular form, and presents examples of its application.

  20. Ground Operations Aerospace Language (GOAL). Volume 4: Interpretive code translator

    NASA Technical Reports Server (NTRS)

    1973-01-01

    This specification identifies and describes the principal functions and elements of the Interpretive Code Translator which has been developed for use with the GOAL Compiler. This translator enables the user to convert a compiled GOAL program to a highly general binary format which is designed to enable interpretive execution. The translator program provides user controls which are designed to enable the selection of various output types and formats. These controls provide a means for accommodating many of the implementation options which are discussed in the Interpretive Code Guideline document. The technical design approach is given. The relationship between the translator and the GOAL compiler is explained and the principal functions performed by the Translator are described. Specific constraints regarding the use of the Translator are discussed. The control options are described. These options enable the user to select outputs to be generated by the translator and to control various aspects of the translation processing.

  1. Governing sexual behaviour through humanitarian codes of conduct.

    PubMed

    Matti, Stephanie

    2015-10-01

    Since 2001, there has been a growing consensus that sexual exploitation and abuse of intended beneficiaries by humanitarian workers is a real and widespread problem that requires governance. Codes of conduct have been promoted as a key mechanism for governing the sexual behaviour of humanitarian workers and, ultimately, preventing sexual exploitation and abuse (PSEA). This article presents a systematic study of PSEA codes of conduct adopted by humanitarian non-governmental organisations (NGOs) and how they govern the sexual behaviour of humanitarian workers. It draws on Foucault's analytics of governance and speech act theory to examine the findings of a survey of references to codes of conduct made on the websites of 100 humanitarian NGOs, and to analyse some features of the organisation-specific PSEA codes identified. © 2015 The Author(s). Disasters © Overseas Development Institute, 2015.

  2. Optimising Use of Electronic Health Records to Describe the Presentation of Rheumatoid Arthritis in Primary Care: A Strategy for Developing Code Lists

    PubMed Central

    Nicholson, Amanda; Ford, Elizabeth; Davies, Kevin A.; Smith, Helen E.; Rait, Greta; Tate, A. Rosemary; Petersen, Irene; Cassell, Jackie

    2013-01-01

    Background Research using electronic health records (EHRs) relies heavily on coded clinical data. Due to variation in coding practices, it can be difficult to aggregate the codes for a condition in order to define cases. This paper describes a methodology to develop ‘indicator markers’ found in patients with early rheumatoid arthritis (RA); these are a broader range of codes which may allow a probabilistic case definition to use in cases where no diagnostic code is yet recorded. Methods We examined EHRs of 5,843 patients in the General Practice Research Database, aged ≥30y, with a first coded diagnosis of RA between 2005 and 2008. Lists of indicator markers for RA were developed initially by panels of clinicians drawing up code-lists and then modified based on scrutiny of available data. The prevalence of indicator markers, and their temporal relationship to RA codes, was examined in patients from 3y before to 14d after recorded RA diagnosis. Findings Indicator markers were common throughout EHRs of RA patients, with 83.5% having 2 or more markers. 34% of patients received a disease-specific prescription before RA was coded; 42% had a referral to rheumatology, and 63% had a test for rheumatoid factor. 65% had at least one joint symptom or sign recorded and in 44% this was at least 6-months before recorded RA diagnosis. Conclusion Indicator markers of RA may be valuable for case definition in cases which do not yet have a diagnostic code. The clinical diagnosis of RA is likely to occur some months before it is coded, shown by markers frequently occurring ≥6 months before recorded diagnosis. It is difficult to differentiate delay in diagnosis from delay in recording. Information concealed in free text may be required for the accurate identification of patients and to assess the quality of care in general practice. PMID:23451024
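
    The marker-prevalence and timing analysis can be sketched on toy patient records; the code set, event format and six-month window below are illustrative assumptions, not the study's actual code-lists.

        # Count indicator markers and those recorded >= 6 months before first RA code.
        from datetime import date, timedelta

        RA_CODES = {"N040."}                                     # illustrative
        MARKERS = {"referral_rheum", "rf_test", "joint_pain", "dmard_rx"}

        def marker_summary(events):
            """events: list of (code, date) pairs for one patient."""
            ra_date = min(d for c, d in events if c in RA_CODES)
            seen = [(c, d) for c, d in events if c in MARKERS]
            early = [c for c, d in seen if ra_date - d >= timedelta(days=182)]
            return {"n_markers": len(seen), "early_markers": early}

        patient = [("joint_pain", date(2006, 1, 10)),
                   ("rf_test", date(2006, 11, 2)),
                   ("N040.", date(2007, 3, 5))]
        print(marker_summary(patient))   # joint_pain precedes diagnosis by >6 months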

  3. Revision, uptake and coding issues related to the open access Orchard Sports Injury Classification System (OSICS) versions 8, 9 and 10.1

    PubMed Central

    Orchard, John; Rae, Katherine; Brooks, John; Hägglund, Martin; Til, Lluis; Wales, David; Wood, Tim

    2010-01-01

    The Orchard Sports Injury Classification System (OSICS) is one of the world’s most commonly used systems for coding injury diagnoses in sports injury surveillance systems. Its major strengths are that it has wide usage, has codes specific to sports medicine and that it is free to use. Literature searches and stakeholder consultations were made to assess the uptake of OSICS and to develop new versions. OSICS was commonly used in the sports of football (soccer), Australian football, rugby union, cricket and tennis. It is referenced in international papers in three sports and used in four commercially available computerised injury management systems. Suggested injury categories for the major sports are presented. New versions OSICS 9 (three digit codes) and OSICS 10.1 (four digit codes) are presented. OSICS is a potentially helpful component of a comprehensive sports injury surveillance system, but many other components are required. Choices made in developing these components should ideally be agreed upon by groups of researchers in consensus statements. PMID:24198559

  4. Flexible Generation of Kalman Filter Code

    NASA Technical Reports Server (NTRS)

    Richardson, Julian; Wilson, Edward

    2006-01-01

    Domain-specific program synthesis can automatically generate high quality code in complex domains from succinct specifications, but the range of programs which can be generated by a given synthesis system is typically narrow. Obtaining code which falls outside this narrow scope necessitates either 1) extension of the code generator, which is usually very expensive, or 2) manual modification of the generated code, which is often difficult and which must be redone whenever changes are made to the program specification. In this paper, we describe adaptations and extensions of the AUTOFILTER Kalman filter synthesis system which greatly extend the range of programs which can be generated. Users augment the input specification with a specification of code fragments and how those fragments should interleave with or replace parts of the synthesized filter. This allows users to generate a much wider range of programs without needing to modify the synthesis system or edit generated code. We demonstrate the usefulness of the approach by applying it to the synthesis of a complex state estimator which combines code from several Kalman filters with user-specified code. The work described in this paper allows the complex design decisions necessary for real-world applications to be reflected in the synthesized code. When executed on simulated input data, the generated state estimator was found to produce estimates comparable to those produced by a hand-coded estimator.
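
    As an illustration of the kind of routine such a synthesis system emits, a textbook linear Kalman measurement update is sketched below; this is hand-written for exposition, not AUTOFILTER output.

        # Standard Kalman measurement update for a linear-Gaussian model.
        import numpy as np

        def kf_update(x, P, z, H, R):
            """x, P: prior state and covariance; z: measurement; H, R: model."""
            S = H @ P @ H.T + R                   # innovation covariance
            K = P @ H.T @ np.linalg.inv(S)        # Kalman gain
            x = x + K @ (z - H @ x)               # corrected state estimate
            P = (np.eye(len(x)) - K @ H) @ P      # corrected covariance
            return x, P

        x, P = np.zeros(2), np.eye(2)             # toy 2-state prior
        H, R = np.array([[1.0, 0.0]]), np.array([[0.5]])
        print(kf_update(x, P, np.array([1.2]), H, R))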

  5. Village power options

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lilienthal, P.

    1997-12-01

    This paper describes three different computer codes which have been written to model village power applications. The reasons which have driven the development of these codes include: the existence of limited field data; diverse applications can be modeled; models allow cost and performance comparisons; and simulations generate insights into cost structures. The models which are discussed are: Hybrid2, a public code which provides detailed engineering simulations to analyze the performance of a particular configuration; HOMER, the hybrid optimization model for electric renewables, which provides economic screening for sensitivity analyses; and VIPOR, the village power model, which is a network optimization model for comparing mini-grids to individual systems. Examples of the output of these codes are presented for specific applications.

  6. Analysis of Intelligent Transportation Systems Using Model-Driven Simulations.

    PubMed

    Fernández-Isabel, Alberto; Fuentes-Fernández, Rubén

    2015-06-15

    Intelligent Transportation Systems (ITSs) integrate information, sensor, control, and communication technologies to provide transport related services. Their users range from everyday commuters to policy makers and urban planners. Given the complexity of these systems and their environment, their study in real settings is frequently unfeasible. Simulations help to address this problem, but present their own issues: there can be unintended mistakes in the transition from models to code; their platforms frequently bias modeling; and it is difficult to compare works that use different models and tools. In order to overcome these problems, this paper proposes a framework for a model-driven development of these simulations. It is based on a specific modeling language that supports the integrated specification of the multiple facets of an ITS: people, their vehicles, and the external environment; and a network of sensors and actuators conveniently arranged and distributed that operates over them. The framework works with a model editor to generate specifications compliant with that language, and a code generator to produce code from them using platform specifications. There are also guidelines to help researchers in the application of this infrastructure. A case study on advanced management of traffic lights with cameras illustrates its use.

  7. Utilization of an agility assessment module in analysis and optimization of preliminary fighter configuration

    NASA Technical Reports Server (NTRS)

    Ngan, Angelen; Biezad, Daniel

    1996-01-01

    A study has been conducted to develop and to analyze a FORTRAN computer code for performing agility analysis on fighter aircraft configurations. This program is one of the modules of the NASA Ames ACSYNT (AirCraft SYNThesis) design code. The background of the agility research in the aircraft industry and a survey of a few agility metrics are discussed. The methodology, techniques, and models developed for the code are presented. The validity of the existing code was evaluated by comparing with existing flight test data. A FORTRAN program was developed for a specific metric, PM (Pointing Margin), as part of the agility module. Example trade studies using the agility module along with ACSYNT were conducted using a McDonnell Douglas F/A-18 Hornet aircraft model. The sensitivity of thrust loading, wing loading, and thrust vectoring on agility criteria was investigated. The module can compare the agility potential between different configurations and has the capability to optimize agility performance in the preliminary design process. This research provides a new and useful design tool for analyzing fighter performance during air combat engagements in the preliminary design process.

  8. Development of an agility assessment module for preliminary fighter design

    NASA Technical Reports Server (NTRS)

    Ngan, Angelen; Bauer, Brent; Biezad, Daniel; Hahn, Andrew

    1996-01-01

    A FORTRAN computer program is presented to perform agility analysis on fighter aircraft configurations. This code is one of the modules of the NASA Ames ACSYNT (AirCraft SYNThesis) design code. The background of the agility research in the aircraft industry and a survey of a few agility metrics are discussed. The methodology, techniques, and models developed for the code are presented. FORTRAN programs were developed for two specific metrics, CCT (Combat Cycle Time) and PM (Pointing Margin), as part of the agility module. The validity of the code was evaluated by comparing with existing flight test data. Example trade studies using the agility module along with ACSYNT were conducted using Northrop F-20 Tigershark and McDonnell Douglas F/A-18 Hornet aircraft models. The sensitivity of thrust loading and wing loading on agility criteria were investigated. The module can compare the agility potential between different configurations and has the capability to optimize agility performance in the preliminary design process. This research provides a new and useful design tool for analyzing fighter performance during air combat engagements.

  9. Use of Systematic Methods to Improve Disease Identification in Administrative Data: The Case of Severe Sepsis.

    PubMed

    Shahraz, Saeid; Lagu, Tara; Ritter, Grant A; Liu, Xiadong; Tompkins, Christopher

    2017-03-01

    Selection of International Classification of Diseases (ICD)-based coded information for complex conditions such as severe sepsis is a subjective process and the results are sensitive to the codes selected. We use an innovative data exploration method to guide ICD-based case selection for severe sepsis. Using the Nationwide Inpatient Sample, we applied Latent Class Analysis (LCA) to determine if medical coders follow any uniform and sensible coding for observations with severe sepsis. We examined whether ICD-9 codes specific to sepsis (038.xx for septicemia, a subset of 995.9 codes representing Systemic Inflammatory Response syndrome, and 785.52 for septic shock) could all be members of the same latent class. Hospitalizations coded with sepsis-specific codes could be assigned to a latent class of their own. This class constituted 22.8% of all potential sepsis observations. The probability of an observation with any sepsis-specific codes being assigned to the residual class was near 0. The chance of an observation in the residual class having a sepsis-specific code as the principal diagnosis was close to 0. Validity of sepsis class assignment is supported by empirical results, which indicated that in-hospital deaths in the sepsis-specific class were around 4 times as likely as that in the residual class. The conventional methods of defining severe sepsis cases in observational data substantially misclassify sepsis cases. We suggest a methodology that helps reliable selection of ICD codes for conditions that require complex coding.
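
    Latent class analysis over binary code indicators amounts to fitting a mixture of Bernoulli profiles, which a short EM loop can illustrate; the simulated data and two-class setup below are illustrative assumptions, not the paper's Nationwide Inpatient Sample analysis.

        # Two-class LCA (Bernoulli mixture) fitted by EM on simulated indicators.
        import numpy as np

        rng = np.random.default_rng(0)
        profile = np.where(rng.random(500)[:, None] < 0.3,
                           [0.9, 0.8, 0.7],      # "sepsis-like" code profile
                           [0.05, 0.1, 0.1])     # residual profile
        X = (rng.random((500, 3)) < profile).astype(float)

        pi = np.array([0.5, 0.5])                    # class weights
        theta = rng.random((2, 3)) * 0.5 + 0.25      # P(code | class)

        for _ in range(100):
            # E-step: per-record class responsibilities
            like = np.prod(theta[None] ** X[:, None]
                           * (1 - theta[None]) ** (1 - X[:, None]), axis=2)
            r = like * pi
            r /= r.sum(axis=1, keepdims=True)
            # M-step: re-estimate weights and code probabilities
            pi = r.mean(axis=0)
            theta = (r.T @ X) / r.sum(axis=0)[:, None]

        print("class weights:", pi.round(2))
        print("P(code | class):", theta.round(2))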

  10. A critical analysis of the accuracy of several numerical techniques for combustion kinetic rate equations

    NASA Technical Reports Server (NTRS)

    Radhakrishnan, Krishnan

    1993-01-01

    A detailed analysis of the accuracy of several techniques recently developed for integrating stiff ordinary differential equations is presented. The techniques include two general-purpose codes EPISODE and LSODE developed for an arbitrary system of ordinary differential equations, and three specialized codes CHEMEQ, CREK1D, and GCKP4 developed specifically to solve chemical kinetic rate equations. The accuracy study is made by application of these codes to two practical combustion kinetics problems. Both problems describe adiabatic, homogeneous, gas-phase chemical reactions at constant pressure, and include all three combustion regimes: induction, heat release, and equilibration. To illustrate the error variation in the different combustion regimes the species are divided into three types (reactants, intermediates, and products), and error versus time plots are presented for each species type and the temperature. These plots show that CHEMEQ is the most accurate code during induction and early heat release. During late heat release and equilibration, however, the other codes are more accurate. A single global quantity, a mean integrated root-mean-square error, that measures the average error incurred in solving the complete problem is used to compare the accuracy of the codes. Among the codes examined, LSODE is the most accurate for solving chemical kinetics problems. It is also the most efficient code, in the sense that it requires the least computational work to attain a specified accuracy level. An important finding is that use of the algebraic enthalpy conservation equation to compute the temperature can be more accurate and efficient than integrating the temperature differential equation.
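
    The LSODE lineage survives in widely available solvers; for example, SciPy's LSODA method (a descendant of LSODE) integrates the classic stiff Robertson kinetics problem below. This is an illustration of solving stiff kinetics, not one of the paper's two test problems.

        # Robertson problem: three species, rate constants spanning 9 orders of magnitude.
        from scipy.integrate import solve_ivp

        def robertson(t, y):
            y1, y2, y3 = y
            return [-0.04 * y1 + 1.0e4 * y2 * y3,
                    0.04 * y1 - 1.0e4 * y2 * y3 - 3.0e7 * y2 ** 2,
                    3.0e7 * y2 ** 2]

        sol = solve_ivp(robertson, (0.0, 1.0e5), [1.0, 0.0, 0.0],
                        method="LSODA", rtol=1e-8, atol=1e-10)
        print(sol.y[:, -1])    # species fractions at t = 1e5 s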

  11. Multiphysics Code Demonstrated for Propulsion Applications

    NASA Technical Reports Server (NTRS)

    Lawrence, Charles; Melis, Matthew E.

    1998-01-01

    The utility of multidisciplinary analysis tools for aeropropulsion applications is being investigated at the NASA Lewis Research Center. The goal of this project is to apply Spectrum, a multiphysics code developed by Centric Engineering Systems, Inc., to simulate multidisciplinary effects in turbomachinery components. Many engineering problems today involve detailed computer analyses to predict the thermal, aerodynamic, and structural response of a mechanical system as it undergoes service loading. Analysis of aerospace structures generally requires attention in all three disciplinary areas to adequately predict component service behavior, and in many cases, the results from one discipline substantially affect the outcome of the other two. There are numerous computer codes currently available in the engineering community to perform such analyses in each of these disciplines. Many of these codes are developed and used in-house by a given organization, and many are commercially available. However, few, if any, of these codes are designed specifically for multidisciplinary analyses. The Spectrum code has been developed for performing fully coupled fluid, thermal, and structural analyses on a mechanical system with a single simulation that accounts for all simultaneous interactions, thus eliminating the requirement for running a large number of sequential, separate, disciplinary analyses. The Spectrum code has a true multiphysics analysis capability, which improves analysis efficiency as well as accuracy. Centric Engineering, Inc., working with a team of Lewis and AlliedSignal Engines engineers, has been evaluating Spectrum for a variety of propulsion applications including disk quenching, drum cavity flow, aeromechanical simulations, and a centrifugal compressor flow simulation.

  13. RELAP5-3D Resolution of Known Restart/Backup Issues

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mesina, George L.; Anderson, Nolan A.

    2014-12-01

    The state-of-the-art nuclear reactor system safety analysis computer program developed at the Idaho National Laboratory (INL), RELAP5-3D, continues to adapt to changes in computer hardware and software and to develop to meet the ever-expanding needs of the nuclear industry. To remain at the forefront, code testing must evolve with both code and industry developments, and it must work correctly. To best ensure this, the processes of Software Verification and Validation (V&V) are applied. Verification compares coding against its documented algorithms and equations and compares its calculations against analytical solutions and the method of manufactured solutions. One form of this, sequential verification, checks code specifications against coding only when originally written, then applies regression testing, which compares code calculations between consecutive updates or versions on a set of test cases to check that performance does not change. A sequential verification testing system was specially constructed for RELAP5-3D to both detect errors with extreme accuracy and cover all nuclear-plant-relevant code features. Detection is provided through a "verification file" that records double-precision sums of key variables. Coverage is provided by a test suite of input decks that exercise the code features and capabilities necessary to model a nuclear power plant. A matrix of test features and short-running cases that exercise them is presented. This testing system is used to test base cases (called null testing) as well as restart and backup cases. It can test RELAP5-3D performance in both standalone and coupled (through PVM to other codes) runs. Application of verification testing revealed numerous restart and backup issues in both standalone and coupled modes. This document reports the resolution of these issues.
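
    A minimal sketch of the verification-file bookkeeping described above, assuming a simple JSON file format and a zero tolerance for bit-identical regression checks; both are assumptions for illustration, and the actual RELAP5-3D file layout is not reproduced here.

        # Each run writes double-precision sums of key variables to a
        # "verification file"; regression testing compares the sums from
        # consecutive code versions. Format and tolerance are assumptions.
        import json

        def write_verification_file(path, variables):
            """variables: dict mapping variable name -> iterable of values."""
            sums = {name: float(sum(values)) for name, values in variables.items()}
            with open(path, "w") as f:
                json.dump(sums, f, indent=2)

        def compare_verification_files(old_path, new_path, tol=0.0):
            """tol=0.0 demands bit-identical sums, the strictest regression check."""
            with open(old_path) as f_old, open(new_path) as f_new:
                old, new = json.load(f_old), json.load(f_new)
            # A variable fails if it vanished or its sum drifted beyond tol.
            return [k for k in old if k not in new or abs(old[k] - new[k]) > tol]

        # Example: compare the baseline and updated runs of one test case.
        write_verification_file("base.vrf", {"pressure": [1.0, 2.0], "voidf": [0.1, 0.2]})
        write_verification_file("new.vrf",  {"pressure": [1.0, 2.0], "voidf": [0.1, 0.2]})
        print(compare_verification_files("base.vrf", "new.vrf"))   # -> [] means no regression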

  14. Optimization of a Turboprop UAV for Maximum Loiter and Specific Power Using Genetic Algorithm

    NASA Astrophysics Data System (ADS)

    Dinc, Ali

    2016-09-01

    In this study, an original code was developed for the optimization of selected parameters of a turboprop engine for an unmanned aerial vehicle (UAV) by employing an elitist genetic algorithm. First, preliminary sizing of a UAV and its turboprop engine was performed by the code for a given mission profile. Second, single- and multi-objective optimizations were performed for selected engine parameters to maximize the loiter duration of the UAV, the specific power of the engine, or both. In the first single-objective case, UAV loiter time was improved by 17.5% over the baseline within the given constraints on compressor pressure ratio and burner exit temperature. In the second case, specific power was enhanced by 12.3% over the baseline. In the multi-objective case, where the previous two objectives were considered together, loiter time and specific power were increased by 14.2% and 9.7% over the baseline, respectively, for the same constraints.
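
    The elitist genetic algorithm itself is straightforward to sketch. Below is a minimal, illustrative version over the two engine parameters named in the abstract (compressor pressure ratio and burner exit temperature); the bounds and the objective are placeholder stand-ins for the engine and mission model of the study.

        # Minimal elitist genetic algorithm over two engine parameters.
        import random

        BOUNDS = [(5.0, 20.0), (1100.0, 1600.0)]   # compressor PR, burner exit temp (K); placeholder bounds

        def objective(x):
            """Toy surrogate for the loiter/specific-power model (illustrative only)."""
            pr, t4 = x
            return -(pr - 14.0) ** 2 - ((t4 - 1450.0) / 50.0) ** 2

        def clip(v, lo, hi):
            return max(lo, min(hi, v))

        def evolve(pop_size=30, gens=60, elite=2, mut=0.1):
            pop = [[random.uniform(lo, hi) for lo, hi in BOUNDS] for _ in range(pop_size)]
            for _ in range(gens):
                pop.sort(key=objective, reverse=True)
                nxt = pop[:elite]                                   # elitism: best survive intact
                while len(nxt) < pop_size:
                    p1, p2 = random.sample(pop[:pop_size // 2], 2)  # parents from the fitter half
                    child = [(a + b) / 2 for a, b in zip(p1, p2)]   # arithmetic crossover
                    child = [clip(c + random.gauss(0, mut * (hi - lo)), lo, hi)
                             for c, (lo, hi) in zip(child, BOUNDS)] # Gaussian mutation, kept in bounds
                    nxt.append(child)
                pop = nxt
            return max(pop, key=objective)

        print(evolve())   # converges near (14.0, 1450.0) for the toy surrogate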

  15. Non-coding RNAs: new biomarkers and therapeutic targets for esophageal cancer

    PubMed Central

    Ren, Zhipeng; Zhang, Guoliang

    2017-01-01

    Esophageal cancer is one of the most common gastrointestinal malignant diseases, and there is still no effective treatment. The worldwide incidence of esophageal cancer is relatively high and rising year by year. Thus, elucidating the carcinogenesis of esophageal cancer and identifying new biomarkers and therapeutic targets would be of great benefit in optimizing the current therapeutic regimen for this deadly disease. Growing evidence has shown that non-coding RNAs play an important role in the development and progression of multiple human cancers, including esophageal cancer. microRNAs (miRNAs) and long non-coding RNAs (lncRNAs) are two functional kinds of non-coding RNAs that have been well investigated. They exert tumor-suppressive or tumor-promoting effects by specifically regulating the expression of certain downstream target genes in a tumor-specific manner. It has also been shown that miRNA and lncRNA levels in tissue and plasma from esophageal cancer patients correlate closely with survival and disease progression, and could therefore be used as prognostic factors and therapeutic targets for esophageal cancer. PMID:28388588

  16. Non-coding RNAs: new biomarkers and therapeutic targets for esophageal cancer.

    PubMed

    Hou, Xiaobin; Wen, Jiaxin; Ren, Zhipeng; Zhang, Guoliang

    2017-06-27

    Esophageal cancer is one of the most common gastrointestinal malignant diseases, and there is still no effective treatment. The worldwide incidence of esophageal cancer is relatively high and rising year by year. Thus, elucidating the carcinogenesis of esophageal cancer and identifying new biomarkers and therapeutic targets would be of great benefit in optimizing the current therapeutic regimen for this deadly disease. Growing evidence has shown that non-coding RNAs play an important role in the development and progression of multiple human cancers, including esophageal cancer. microRNAs (miRNAs) and long non-coding RNAs (lncRNAs) are two functional kinds of non-coding RNAs that have been well investigated. They exert tumor-suppressive or tumor-promoting effects by specifically regulating the expression of certain downstream target genes in a tumor-specific manner. It has also been shown that miRNA and lncRNA levels in tissue and plasma from esophageal cancer patients correlate closely with survival and disease progression, and could therefore be used as prognostic factors and therapeutic targets for esophageal cancer.

  17. Regional and temporal variations in coding of hospital diagnoses referring to upper gastrointestinal and oesophageal bleeding in Germany.

    PubMed

    Langner, Ingo; Mikolajczyk, Rafael; Garbe, Edeltraut

    2011-08-17

    Health insurance claims data are increasingly used for health services research in Germany. Hospital diagnoses in these data are coded according to the International Classification of Diseases, German modification (ICD-10-GM). Due to the historical division into West and East Germany, different coding practices might persist in the two former parts. Additionally, the introduction of Diagnosis Related Groups (DRGs) in Germany in 2003/2004 might have changed coding practice. The aim of this study was to investigate regional and temporal variations in the coding of hospitalisation diagnoses in Germany. We analysed hospitalisation diagnoses for oesophageal bleeding (OB) and upper gastrointestinal bleeding (UGIB) from the official German Hospital Statistics provided by the Federal Statistical Office. Bleeding diagnoses were classified as "specific" (origin of bleeding provided) or "unspecific" (origin of bleeding not provided) coding. We studied regional (former East versus West Germany) differences in the incidence of hospitalisations with specific or unspecific coding for OB and UGIB, and temporal variations between 2000 and 2005. For each year, incidence ratios of hospitalisations for former East versus West Germany were estimated with log-linear regression models adjusting for age, gender and population density. Significant differences in specific and unspecific coding between East and West Germany and over time were found for both OB and UGIB hospitalisation diagnoses. For example, in 2002, incidence ratios of hospitalisations for East versus West Germany were 1.24 (95% CI 1.16-1.32) for specific and 0.67 (95% CI 0.60-0.74) for unspecific OB diagnoses, and 1.43 (95% CI 1.36-1.51) for specific and 0.83 (95% CI 0.80-0.87) for unspecific UGIB diagnoses. Regional differences nearly disappeared and time trends were less marked when specific and unspecific diagnoses of OB or UGIB were combined. During the study period, there were substantial regional and temporal variations in the coding of OB and UGIB diagnoses in hospitalised patients. Possible explanations for the observed regional variations are different coding preferences, further influenced by changes in coding and reimbursement rules. Analysing groups of diagnoses including both specific and unspecific codes reduces the influence of varying coding practices.
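
    For readers unfamiliar with the method, the following is a small sketch of how such adjusted incidence ratios can be estimated with a log-linear (Poisson) model using person-years as an offset. The data and covariates are invented, and the statsmodels formula API is assumed; the study's actual covariate coding is not reproduced.

        # Log-linear (Poisson) regression of hospitalisation counts with a
        # person-years offset; exp(coefficient) gives the adjusted incidence
        # ratio for East vs West. All numbers are fabricated for illustration.
        import numpy as np
        import pandas as pd
        import statsmodels.api as sm
        import statsmodels.formula.api as smf

        df = pd.DataFrame({
            "cases": [120, 95, 80, 60, 210, 180, 150, 110],
            "py":    [1.1e6, 0.9e6, 0.8e6, 0.6e6, 2.3e6, 2.1e6, 1.9e6, 1.5e6],
            "east":  [1, 1, 1, 1, 0, 0, 0, 0],      # former East vs West Germany
            "age65": [0, 1, 0, 1, 0, 1, 0, 1],      # indicator: aged 65+ (stand-in covariate)
            "dense": [0, 0, 1, 1, 0, 0, 1, 1],      # high population density (stand-in covariate)
        })
        model = smf.glm("cases ~ east + age65 + dense", data=df,
                        family=sm.families.Poisson(),
                        offset=np.log(df["py"])).fit()
        print(round(float(np.exp(model.params["east"])), 2))   # adjusted incidence ratio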

  18. JDFTx: Software for joint density-functional theory

    DOE PAGES

    Sundararaman, Ravishankar; Letchworth-Weaver, Kendra; Schwarz, Kathleen A.; ...

    2017-11-14

    Density-functional theory (DFT) has revolutionized computational prediction of atomic-scale properties from first principles in physics, chemistry and materials science. Continuing development of new methods is necessary for accurate predictions of new classes of materials and properties, and for connecting to nano- and mesoscale properties using coarse-grained theories. JDFTx is a fully-featured open-source electronic DFT software designed specifically to facilitate rapid development of new theories, models and algorithms. Using an algebraic formulation as an abstraction layer, compact C++11 code automatically performs well on diverse hardware including GPUs (Graphics Processing Units). This code hosts the development of joint density-functional theory (JDFT) that combines electronic DFT with classical DFT and continuum models of liquids for first-principles calculations of solvated and electrochemical systems. In addition, the modular nature of the code makes it easy to extend and interface with, facilitating the development of multi-scale toolkits that connect to ab initio calculations, e.g. photo-excited carrier dynamics combining electron and phonon calculations with electromagnetic simulations.

  19. JDFTx: Software for joint density-functional theory

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sundararaman, Ravishankar; Letchworth-Weaver, Kendra; Schwarz, Kathleen A.

    Density-functional theory (DFT) has revolutionized computational prediction of atomic-scale properties from first principles in physics, chemistry and materials science. Continuing development of new methods is necessary for accurate predictions of new classes of materials and properties, and for connecting to nano- and mesoscale properties using coarse-grained theories. JDFTx is a fully-featured open-source electronic DFT software designed specifically to facilitate rapid development of new theories, models and algorithms. Using an algebraic formulation as an abstraction layer, compact C++11 code automatically performs well on diverse hardware including GPUs (Graphics Processing Units). This code hosts the development of joint density-functional theory (JDFT) that combines electronic DFT with classical DFT and continuum models of liquids for first-principles calculations of solvated and electrochemical systems. In addition, the modular nature of the code makes it easy to extend and interface with, facilitating the development of multi-scale toolkits that connect to ab initio calculations, e.g. photo-excited carrier dynamics combining electron and phonon calculations with electromagnetic simulations.

  20. [Differentiation of coding quality in orthopaedics by special, illustration-oriented case group analysis in the G-DRG System 2005].

    PubMed

    Schütz, U; Reichel, H; Dreinhöfer, K

    2007-01-01

    We introduce a grouping system for clinical practice which allows the separation of DRG coding into specific orthopaedic groups based on anatomical regions, operative procedures, therapeutic interventions, and morbidity-equivalent diagnosis groups. With this, a differentiated, goal-oriented analysis of internally mapped DRG data becomes possible. The group-specific difference in coding quality between primary coding by the orthopaedic surgeon and final coding by medical controlling is analysed for these DRG groups. In a consecutive series of 1600 patients, parallel documentation and group-specific comparison of the relevant DRG parameters were carried out in every case after primary and final coding. In the analysis of the group-specific share of additional case-mix coding, the group "spine surgery" dominated, closely followed by the groups "arthroplasty" and "surgery due to infection, tumours, diabetes". Altogether, additional cost-weight-relevant coding was necessary most frequently in the latter group (84%), followed by the group "spine surgery" (65%). In DRGs representing conservative orthopaedic treatment, documented procedures had almost no influence on the cost weight. The introduced system of case-group analysis in internal DRG documentation can lead to the detection of specific problems in primary coding and of cost-weight-relevant changes in the case mix. As an instrument for internal process control in the orthopaedic field, it can serve as a communicative interface between the economically oriented classification of hospital performance and specific problem-solving by the medical staff involved in department management.

  1. Monte Carlo MCNP-4B-based absorbed dose distribution estimates for patient-specific dosimetry.

    PubMed

    Yoriyaz, H; Stabin, M G; dos Santos, A

    2001-04-01

    This study was intended to verify the capability of the Monte Carlo MCNP-4B code to evaluate spatial dose distributions based on information gathered from CT or SPECT. A new three-dimensional (3D) dose calculation approach for internal emitter use in radioimmunotherapy (RIT) was developed using the Monte Carlo MCNP-4B code as the photon and electron transport engine. It was shown that the MCNP-4B computer code can be used with voxel-based anatomic and physiologic data to provide 3D dose distributions. This study showed that the MCNP-4B code can be used to develop a treatment planning system that will provide such information in a timely manner, if dose reporting is suitably optimized. If each organ is divided into small regions where the average energy deposition is calculated, with a typical volume of 0.4 cm(3), regional dose distributions can be provided with reasonable central processing unit times (on the order of 12-24 h on a 200-MHz personal computer or modest workstation). Further efforts to provide semiautomated region identification (segmentation) and to improve marrow dose calculations are needed to supply a complete system for RIT. It is envisioned that all such efforts will continue to develop and that internal dose calculations may soon be brought to a level of accuracy, detail, and robustness similar to that commonly expected in external dose treatment planning. For this study we developed a code with a user-friendly interface that works on several nuclear medicine imaging platforms and provides timely patient-specific dose information to the physician and medical physicist. Future therapy with internal emitters should use a 3D dose calculation approach, which represents a significant advance over the dose information provided by the standard geometric phantoms used for more than 20 y (which permit reporting of only average organ doses for certain standardized individuals).
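
    The regional dose-reporting idea is easy to sketch: per-voxel energy depositions are averaged over small blocks rather than reported individually. The array shapes and block size below are illustrative, not those of the study.

        # Regional dose reporting: average voxel energy depositions over small
        # blocks (the abstract cites a typical region volume of ~0.4 cm^3).
        import numpy as np

        rng = np.random.default_rng(0)
        edep = rng.random((64, 64, 64))          # energy deposited per voxel (arbitrary units)

        def regional_mean_dose(edep, region=4):
            """Average deposition over region x region x region voxel blocks."""
            nx, ny, nz = (s // region for s in edep.shape)
            blocks = edep[:nx*region, :ny*region, :nz*region].reshape(
                nx, region, ny, region, nz, region)
            return blocks.mean(axis=(1, 3, 5))

        # With 1 mm cubic voxels, 4x4x4 blocks give 0.064 cm^3 regions; larger
        # blocks approach the 0.4 cm^3 reporting volume quoted above.
        print(regional_mean_dose(edep).shape)    # -> (16, 16, 16)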

  2. Code of Conduct on Biosecurity for Biological Resource Centres: procedural implementation.

    PubMed

    Rohde, Christine; Smith, David; Martin, Dunja; Fritze, Dagmar; Stalpers, Joost

    2013-07-01

    A globally applicable code of conduct specifically dedicated to biosecurity has been developed, together with guidance for its procedural implementation. This is to address the regulations governing the potential dual-use of biological materials, associated information and technologies, and to reduce the potential for their malicious use. Scientists researching and exchanging micro-organisms have a responsibility to prevent misuse of the inherently dangerous ones, that is, those possessing characteristics such as pathogenicity or toxin production. The code of conduct presented here is based on best-practice principles for scientists and their institutions working with biological resources, with a specific focus on micro-organisms. It aims to raise awareness of regulatory needs and to protect researchers, their facilities and stakeholders. It reflects global activities in this area in response to legislation such as the USA PATRIOT Act of 2001 (Uniting and Strengthening America by Providing Appropriate Tools Required to Intercept and Obstruct Terrorism Act of 2001); the Anti-Terrorism, Crime and Security Act 2001 and subsequent amendments in the UK; the EU Dual-Use Regulation; and the recommendations of the Organization for Economic Co-operation and Development (OECD) under their Biological Resource Centre (BRC) Initiative at the beginning of the millennium (OECD, 2001). Two project consortia with international partners came together with experts in the field to draw up a Code of Conduct on Biosecurity for BRCs to ensure that culture collections and microbiologists in general work in a way that meets the requirements of such legislation. A BRC is the modern-day culture collection that adds value to its holdings and implements common best practice in the collection and supply of strains for research and development. This code of conduct specifically addresses the work of public service culture collections and describes the issues of importance and the controls or practices that should be in place. However, these best practices are equally applicable to all other microbiology laboratories holding, using and sharing microbial resources. The code was introduced at the Seventh Review Conference of the Biological and Toxin Weapons Convention (BTWC), United Nations, Geneva, 2011; the delegates of the States Parties recommended that this code of conduct be broadly applied in the life sciences and disseminated amongst microbiologists, hence its publication here along with practical implementation guidance. This paper considers the regulatory and working environment for microbiology, defines responsibilities and provides practical advice on the implementation of best practice in handling the organisms themselves, associated data and technical know-how.

  3. Accuracy of external cause-of-injury coding in VA polytrauma patient discharge records.

    PubMed

    Carlson, Kathleen F; Nugent, Sean M; Grill, Joseph; Sayer, Nina A

    2010-01-01

    Valid and efficient methods of identifying the etiology of treated injuries are critical for characterizing patient populations and developing prevention and rehabilitation strategies. We examined the accuracy of external cause-of-injury codes (E-codes) in Veterans Health Administration (VHA) administrative data for a population of injured patients. Chart notes and E-codes were extracted for 566 patients treated at any one of four VHA Polytrauma Rehabilitation Center sites between 2001 and 2006. Two expert coders, blinded to VHA E-codes, used chart notes to assign "gold standard" E-codes to injured patients. The accuracy of VHA E-coding was examined based on these gold standard E-codes. Only 382 of 517 (74%) injured patients were assigned E-codes in VHA records. Sensitivity of VHA E-codes varied significantly by site (range: 59%-91%, p < 0.001). Sensitivity was highest for combat-related injuries (81%) and lowest for fall-related injuries (60%). Overall specificity of E-codes was high (92%). E-coding accuracy was markedly higher when we restricted analyses to records that had been assigned VHA E-codes. E-codes may not be valid for ascertaining source-of-injury data for all injuries among VHA rehabilitation inpatients at this time. Enhanced training and policies may ensure more widespread, standardized use and accuracy of E-codes for injured veterans treated in the VHA.
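
    The accuracy measures used above follow directly from a 2x2 comparison of administrative E-codes against the expert-assigned gold standard. A small sketch, with invented counts:

        # Sensitivity, specificity, PPV and NPV from a 2x2 comparison of
        # administrative E-codes against gold-standard codes. Counts are
        # fabricated for illustration.
        def accuracy_measures(tp, fp, fn, tn):
            sensitivity = tp / (tp + fn)   # gold-standard cases correctly E-coded
            specificity = tn / (tn + fp)   # non-cases correctly left uncoded
            ppv = tp / (tp + fp)           # E-coded cases that are true cases
            npv = tn / (tn + fn)           # uncoded records that are true non-cases
            return sensitivity, specificity, ppv, npv

        sens, spec, ppv, npv = accuracy_measures(tp=310, fp=25, fn=72, tn=160)
        print(f"sensitivity={sens:.2f} specificity={spec:.2f} PPV={ppv:.2f} NPV={npv:.2f}")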

  4. The accuracy of burn diagnosis codes in health administrative data: A validation study.

    PubMed

    Mason, Stephanie A; Nathens, Avery B; Byrne, James P; Fowler, Rob; Gonzalez, Alejandro; Karanicolas, Paul J; Moineddin, Rahim; Jeschke, Marc G

    2017-03-01

    Health administrative databases may provide rich sources of data for the study of outcomes following burn. We aimed to determine the accuracy of International Classification of Diseases diagnosis codes for burn in a population-based administrative database. Data from a regional burn center's clinical registry of patients admitted between 2006 and 2013 were linked to administrative databases. Burn total body surface area (TBSA), depth, mechanism, and inhalation injury were compared between the registry and administrative records. The sensitivity, specificity, and positive and negative predictive values were determined, and coding agreement was assessed with the kappa statistic. 1215 burn center patients were linked to administrative records. TBSA codes were highly sensitive and specific for ≥10 and ≥20% TBSA (89/93% sensitive and 95/97% specific), with excellent agreement (κ, 0.85/κ, 0.88). Codes were weakly sensitive (68%) in identifying ≥10% TBSA full-thickness burn, though highly specific (86%), with moderate agreement (κ, 0.46). Codes for inhalation injury had limited sensitivity (43%) but high specificity (99%), with moderate agreement (κ, 0.54). Burn mechanism had excellent coding agreement (κ, 0.84). Administrative data diagnosis codes accurately identify burns by size and mechanism, while identification of inhalation injury or full-thickness burns is less sensitive but highly specific.
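
    The kappa statistic quoted throughout corrects observed agreement for the agreement expected by chance. A minimal sketch from a 2x2 table, with illustrative counts:

        # Cohen's kappa from a 2x2 table of registry (rows) versus
        # administrative (columns) classifications. Counts are invented.
        def cohens_kappa(a, b, c, d):
            """a=both yes, b=registry yes/admin no, c=registry no/admin yes, d=both no."""
            n = a + b + c + d
            p_obs = (a + d) / n                        # observed agreement
            p_yes = ((a + b) / n) * ((a + c) / n)      # chance agreement on "yes"
            p_no = ((c + d) / n) * ((b + d) / n)       # chance agreement on "no"
            p_exp = p_yes + p_no
            return (p_obs - p_exp) / (1 - p_exp)

        print(round(cohens_kappa(a=120, b=14, c=9, d=1072), 2))   # ~0.9, excellent agreement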

  5. Overview of the relevant CFD work at Thiokol Corporation

    NASA Technical Reports Server (NTRS)

    Chwalowski, Pawel; Loh, Hai-Tien

    1992-01-01

    An in-house-developed, proprietary, advanced computational fluid dynamics code called SHARP (trademark) is the primary tool for many flow simulations and design analyses. The SHARP code is a time-dependent, two-dimensional (2-D), axisymmetric numerical solution technique for the compressible Navier-Stokes equations. The solution technique in SHARP uses a vectorizable, implicit, second-order accurate (in time and space) finite volume scheme based on an upwind flux-difference splitting of a Roe-type approximate Riemann solver, Van Leer's flux vector splitting, and a fourth-order artificial dissipation scheme with preconditioning to accelerate the flow solution. Turbulence is simulated by an algebraic model and, ultimately, the kappa-epsilon model. Other capabilities of the code include 2-D two-phase Lagrangian particle tracking and cell blockages. Extensive development and testing have been conducted on the 3-D version of the code with flow, combustion, and turbulence interactions. The emphasis here is on the specific applications of SHARP in solid rocket motor design. Information is given in viewgraph form.

  6. Mesh-based Monte Carlo code for fluorescence modeling in complex tissues with irregular boundaries

    NASA Astrophysics Data System (ADS)

    Wilson, Robert H.; Chen, Leng-Chun; Lloyd, William; Kuo, Shiuhyang; Marcelo, Cynthia; Feinberg, Stephen E.; Mycek, Mary-Ann

    2011-07-01

    There is a growing need for the development of computational models that can account for complex tissue morphology in simulations of photon propagation. We describe the development and validation of a user-friendly, MATLAB-based Monte Carlo code that uses analytically-defined surface meshes to model heterogeneous tissue geometry. The code can use information from non-linear optical microscopy images to discriminate the fluorescence photons (from endogenous or exogenous fluorophores) detected from different layers of complex turbid media. We present a specific application of modeling a layered human tissue-engineered construct (Ex Vivo Produced Oral Mucosa Equivalent, EVPOME) designed for use in repair of oral tissue following surgery. Second-harmonic generation microscopic imaging of an EVPOME construct (oral keratinocytes atop a scaffold coated with human type IV collagen) was employed to determine an approximate analytical expression for the complex shape of the interface between the two layers. This expression can then be inserted into the code to correct the simulated fluorescence for the effect of the irregular tissue geometry.
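
    The core trick, an analytically defined interface surface deciding which layer an emission event belongs to, can be sketched compactly. The sinusoidal interface below is a stand-in for the expression fitted from the second-harmonic-generation images, and all dimensions are illustrative.

        # Attribute fluorescence emission events to tissue layers using an
        # analytic interface surface z = f(x, y). Stand-in geometry only.
        import numpy as np

        def interface_depth(x, y):
            """Analytic model of the layer interface (illustrative), depths in mm."""
            return 0.10 + 0.02 * np.sin(40.0 * x) * np.cos(40.0 * y)

        def layer_of(points):
            """points: (N, 3) emission positions; returns 0=top layer, 1=bottom layer."""
            x, y, z = points.T
            return (z > interface_depth(x, y)).astype(int)   # z measured downward

        rng = np.random.default_rng(1)
        pts = rng.random((1000, 3)) * [1.0, 1.0, 0.3]        # mm, within the construct
        counts = np.bincount(layer_of(pts), minlength=2)
        print(f"photons from top layer: {counts[0]}, bottom layer: {counts[1]}")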

  7. MODELLING OF FUEL BEHAVIOUR DURING LOSS-OF-COOLANT ACCIDENTS USING THE BISON CODE

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pastore, G.; Novascone, S. R.; Williamson, R. L.

    2015-09-01

    This work presents recent developments that extend the BISON code to enable fuel performance analysis during LOCAs. The newly developed capability accounts for the main physical phenomena involved, as well as the interactions among them and with the global fuel-rod thermo-mechanical analysis. Specifically, new multiphysics models are incorporated in the code to describe (1) transient fission gas behaviour, (2) rapid steam-cladding oxidation, (3) the Zircaloy solid-solid phase transition, (4) hydrogen generation and transport through the cladding, and (5) Zircaloy high-temperature non-linear mechanical behaviour and failure. Basic model characteristics are described, and a demonstration BISON analysis of an LWR fuel rod undergoing a LOCA is presented. Also, as a first step of validation, the code with the new capability is applied to the simulation of experiments investigating cladding behaviour under LOCA conditions. A comparison of the results with the available experimental data on cladding failure due to burst is presented.

  8. The View Behind and Ahead: Implications of Certification

    PubMed Central

    Darling, Louise

    1973-01-01

    The Medical Library Association's certification plan, never of real significance in employment and promotion practices in health sciences librarianship, does not reflect the many changes which have occurred in swift progression since adoption of the code in 1949. Solutions to the problems which have accumulated since then are sought in a brief examination of trends in credentialing and certification in the health professions and in the library field, both general and special. Emphasis is given to the historical development of provisions in the MLA Code for the Training and Certification of Medical Librarians, the limited opportunity for practical implementation of most of the provisions, the importance of the code in stimulating the Association's educational programs, the impact of the Medical Library Assistance Act, Regional Medical Programs, and increases in demand for health information on manpower requirements for health science libraries, the specific dissatisfactions MLA members have expressed over certification, and the role of the Ad Hoc Committee to Develop a New Certification Code. PMID:4744343

  9. AutoBayes Program Synthesis System Users Manual

    NASA Technical Reports Server (NTRS)

    Schumann, Johann; Jafari, Hamed; Pressburger, Tom; Denney, Ewen; Buntine, Wray; Fischer, Bernd

    2008-01-01

    Program synthesis is the systematic, automatic construction of efficient executable code from high-level declarative specifications. AutoBayes is a fully automatic program synthesis system for the statistical data analysis domain; in particular, it solves parameter estimation problems. It has seen many successful applications at NASA and is currently being used, for example, to analyze simulation results for Orion. The input to AutoBayes is a concise description of a data analysis problem composed of a parameterized statistical model and a goal that is a probability term involving parameters and input data. The output is optimized and fully documented C/C++ code computing the values for those parameters that maximize the probability term. AutoBayes can solve many subproblems symbolically rather than having to rely on numeric approximation algorithms, thus yielding effective, efficient, and compact code. Statistical analysis is faster and more reliable, because effort can be focused on model development and validation rather than manual development of solution algorithms and code.
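
    As an illustration of what solving subproblems symbolically buys: for an i.i.d. Gaussian model, the likelihood equations have the closed-form solution below, so the generated code needs no iterative optimizer. This is a Python rendering of the idea, not actual AutoBayes output (which is documented C/C++).

        # Closed-form maximum-likelihood estimates for an i.i.d. Gaussian model:
        # setting the derivatives of the log-likelihood to zero yields analytic
        # formulas, the kind of symbolic solution AutoBayes seeks automatically.
        import math

        def gaussian_mle(data):
            n = len(data)
            mu = sum(data) / n                               # from d/dmu log L = 0
            sigma2 = sum((x - mu) ** 2 for x in data) / n    # from d/dsigma2 log L = 0
            return mu, math.sqrt(sigma2)

        print(gaussian_mle([4.9, 5.1, 5.0, 4.8, 5.2]))       # -> (5.0, ~0.14)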

  10. Decoding the Emerging Patterns Exhibited in Non-coding RNAs Characteristic of Lung Cancer with Regard to their Clinical Significance.

    PubMed

    Sonea, Laura; Buse, Mihail; Gulei, Diana; Onaciu, Anca; Simon, Ioan; Braicu, Cornelia; Berindan-Neagoe, Ioana

    2018-05-01

    Lung cancer continues to be the leading contributor to global cancer mortality; it needs to be investigated further so that these dramatic, unfavorable statistics can be reduced. Non-coding RNAs (ncRNAs) have been shown to be important cellular regulatory factors, and alteration of their expression levels has been correlated with an extensive number of pathologies. Specifically, their expression profiles are correlated with the development and progression of lung cancer, generating great interest in further investigation. This review focuses on the complex roles of non-coding RNAs, namely miRNAs, piwi-interacting RNAs, small nucleolar RNAs, long non-coding RNAs and circular RNAs, in the development of novel diagnostic and prognostic biomarkers that can then be utilized for personalized therapies against this devastating disease. To support the concept of personalized medicine, we focus on the roles of miRNAs in lung cancer tumorigenesis, their use as diagnostic and prognostic biomarkers, and their application in patient therapy.

  11. Genome-wide analysis of alternative splicing during human heart development

    NASA Astrophysics Data System (ADS)

    Wang, He; Chen, Yanmei; Li, Xinzhong; Chen, Guojun; Zhong, Lintao; Chen, Gangbing; Liao, Yulin; Liao, Wangjun; Bin, Jianping

    2016-10-01

    Alternative splicing (AS) drives determinative changes during mouse heart development. Recent high-throughput technological advancements have facilitated genome-wide AS analysis, but such analysis of the transition from human foetal to adult heart has not been reported. Here, we present a high-resolution global analysis of AS transitions between human foetal and adult hearts. RNA-sequencing data showed that extensive AS transitions occurred between human foetal and adult hearts, and that AS events occurred more frequently in protein-coding genes than in long non-coding RNA (lncRNA) genes. A significant difference in AS patterns was found between foetal and adult hearts. The predicted differences in AS events were further confirmed using quantitative reverse transcription-polymerase chain reaction analysis of human heart samples. Functional analysis of foetal-specific AS events showed enrichment in cell proliferation-related pathways including the cell cycle, whereas adult-specific AS events were associated with protein synthesis. Furthermore, 42.6% of foetal-specific AS events showed significant changes in gene expression levels between foetal and adult hearts. Genes exhibiting both foetal-specific AS and differential expression were highly enriched in cell cycle-associated functions. In conclusion, we provide genome-wide profiling of AS transitions between foetal and adult hearts and propose that AS transitions and differential gene expression may play determinative roles in human heart development.

  12. DOE Office of Scientific and Technical Information (OSTI.GOV)

    The Profile Interface Generator (PIG) is a tool for loosely coupling applications and performance tools. It enables applications to write code that looks like standard C and Fortran function calls, without requiring that applications link to specific implementations of those function calls. Performance tools can register with PIG in order to listen to only the calls that give information they care about. This interface reduces the build and configuration burden on application developers and allows semantic instrumentation to live in production codes without interfering with production runs.
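
    The pattern is essentially an event/listener registry. A minimal sketch, with all names invented for illustration rather than taken from the actual PIG API:

        # Loose coupling of application annotations and tools: the application
        # emits events unconditionally; tools subscribe only to the events they
        # care about, and unsubscribed events are cheap no-ops.
        _listeners = {}

        def register(event, callback):
            _listeners.setdefault(event, []).append(callback)

        def emit(event, **info):
            for cb in _listeners.get(event, ()):     # no listeners -> nothing happens
                cb(**info)

        # A profiling tool subscribes only to timestep-end events:
        register("timestep_end", lambda step, dt: print(f"step {step} took {dt:.3f}s"))

        # Application code annotates itself without knowing which tools are attached:
        for step in range(3):
            emit("timestep_begin", step=step)        # ignored: no listener registered
            emit("timestep_end", step=step, dt=0.042)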

  13. Implementation and evaluation of a simulation curriculum for paediatric residency programs including just-in-time in situ mock codes.

    PubMed

    Sam, Jonathan; Pierse, Michael; Al-Qahtani, Abdullah; Cheng, Adam

    2012-02-01

    To develop, implement and evaluate a simulation-based acute care curriculum in a paediatric residency program using an integrated and longitudinal approach. Curriculum framework consisting of three modular, year-specific courses and longitudinal just-in-time, in situ mock codes. Paediatric residency program at BC Children's Hospital, Vancouver, British Columbia. The three year-specific courses focused on the critical first 5 min, complex medical management and crisis resource management, respectively. The just-in-time in situ mock codes simulated the acute deterioration of an existing ward patient, prepared the actual multidisciplinary code team, and primed the surrounding crisis support systems. Each curriculum component was evaluated with surveys using a five-point Likert scale. A total of 40 resident surveys were completed after each of the modular courses, and an additional 28 surveys were completed for the overall simulation curriculum. The highest Likert scores were for hands-on skill stations, immersive simulation environment and crisis resource management teaching. Survey results also suggested that just-in-time mock codes were realistic, reinforced learning, and prepared ward teams for patient deterioration. A simulation-based acute care curriculum was successfully integrated into a paediatric residency program. It provides a model for integrating simulation-based learning into other training programs, as well as a model for any hospital that wishes to improve paediatric resuscitation outcomes using just-in-time in situ mock codes.

  14. Object-oriented code SUR for plasma kinetic simulation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Levchenko, V.D.; Sigov, Y.S.

    1995-12-31

    We have developed a self-consistent simulation code based on an object-oriented model of plasma (OOMP) for solving the Vlasov/Poisson (V/P), Vlasov/Maxwell (V/M), Bhatnagar-Gross-Krook (BGK) and Fokker-Planck (FP) kinetic equations. The application of an object-oriented approach (OOA) to the simulation of plasmas and plasma-like media by means of splitting methods permits a uniform description and solution of a wide range of plasma kinetics problems, including very complicated ones: multi-dimensional, relativistic, collisional, with specific boundary conditions, etc. This paper gives a brief description of the capabilities of the SUR code as a concrete realization of OOMP.

  15. Requirements for migration of NSSD code systems from LTSS to NLTSS

    NASA Technical Reports Server (NTRS)

    Pratt, M.

    1984-01-01

    The purpose of this document is to address the requirements necessary for a successful conversion of the Nuclear Design (ND) application code systems to the NLTSS environment. The ND application code system community can be characterized by large-scale scientific computation carried out on supercomputers. NLTSS is a distributed operating system being developed at LLNL to replace the LTSS system currently in use. The implications of the change are examined, including a description of the computational environment and users in ND. The discussion then turns to requirements, first in a general way, followed by specific requirements, including a proposal for managing the transition.

  16. Analyses in Support of Risk-Informed Natural Gas Vehicle Maintenance Facility Codes and Standards: Phase II.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Blaylock, Myra L.; LaFleur, Chris Bensdotter; Muna, Alice Baca

    Safety standards development for maintenance facilities of liquid and compressed natural gas fueled vehicles is required to ensure proper facility design and operating procedures. Standards development organizations are utilizing risk-informed concepts to develop natural gas vehicle (NGV) codes and standards so that maintenance facilities meet acceptable risk levels. The present report summarizes Phase II work on existing NGV repair facility code requirements and highlights inconsistencies that need quantitative analysis of their effectiveness. A Hazard and Operability study was performed to identify key scenarios of interest using risk ranking. Detailed simulations and modeling were performed to estimate the location and behavior of natural gas releases based on these scenarios. Specific code conflicts were identified, ineffective code requirements were highlighted, and resolutions were proposed. These include the basis of the ventilation rate on area or volume, as well as a ceiling offset which seems ineffective at protecting against flammable gas concentrations.

  17. Population-based drug-related anaphylaxis in children and adolescents captured by South Carolina Emergency Room Hospital Discharge Database (SCERHDD) (2000-2002).

    PubMed

    West, Suzanne L; D'Aloisio, Aimee A; Ringel-Kulka, Tamar; Waller, Anna E; Clayton Bordley, W

    2007-12-01

    Anaphylaxis is a life-threatening condition; drug-related anaphylaxis represents approximately 10% of all cases. We assessed the utility of a statewide emergency department (ED) database for identifying drug-related anaphylaxis in children by developing and validating an algorithm composed of ICD-9-CM codes. There were 1,314,760 visits to South Carolina (SC) emergency departments (EDs) for patients <19 years in 2000-2002. We used ICD-9-CM disease or external-cause-of-injury codes (E-codes) that suggested drug-related anaphylaxis or a severe drug-related allergic reaction. We found 50 cases classifiable as probable or possible drug-related anaphylaxis and 13 as drug-related allergic reactions. We used clinical evaluation by two pediatricians as the 'alloyed gold standard' for estimating the sensitivity, specificity, and positive predictive value (PPV) of our algorithm. ED-treated drug-related anaphylaxis in the SC pediatric population was 1.56/100,000 person-years based on the algorithm and 0.50/100,000 person-years based on clinical evaluation. Assuming the disease codes we used identified all potential anaphylaxis cases in the database, the sensitivity was 1.00 (95%CI: 0.79, 1.00), specificity was 0.28 (95%CI: 0.16, 0.43), and the PPV was 0.32 (95%CI: 0.20, 0.47) for the algorithm. Sensitivity analyses improved the measurement properties of the algorithm. E-codes were invaluable for developing an anaphylaxis algorithm, although the frequently used code E947.9 was often incorrectly applied. We believe that our algorithm may have over-ascertained drug-related anaphylaxis patients seen in an ED, whereas the clinical evaluation may have under-represented this diagnosis due to limited information on the offending agent in the abstracted ED records. Post-marketing drug surveillance using ED records may be viable if clinicians document drug-related anaphylaxis in the charts so that billing codes can be assigned properly.

  18. ALPHACAL: A new user-friendly tool for the calibration of alpha-particle sources.

    PubMed

    Timón, A Fernández; Vargas, M Jurado; Gallardo, P Álvarez; Sánchez-Oro, J; Peralta, L

    2018-05-01

    In this work, we present and describe the program ALPHACAL, specifically developed for the calibration of alpha-particle sources. It is therefore more user-friendly and less time-consuming than multipurpose codes developed for a wide range of applications. The program is based on the recently developed code AlfaMC, which specifically simulates the transport of alpha particles. Both cylindrical and point sources mounted on the surface of polished backings can be simulated, as is the convention in experimental measurements of alpha-particle sources. In addition to the efficiency calculation and the determination of the backscattering coefficient, some additional tools are available to the user, such as visualization of the energy spectrum and the use of energy cut-offs or low-energy tail corrections. ALPHACAL has been implemented in C++ using the Qt library, so it is available for Windows, macOS and Linux platforms. It is free and can be provided upon request to the authors.
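
    The two headline quantities, counting efficiency and the backscattering coefficient, fall out of a Monte Carlo tally as simple fractions. The sketch below replaces real alpha transport (as done by AlfaMC) with a trivial random stand-in so that only the bookkeeping is visible; the geometry and probabilities are invented.

        # Monte Carlo bookkeeping for source calibration: efficiency is the
        # fraction of emitted alphas reaching the detector; the backscattering
        # coefficient here is the fraction re-emerging from the backing.
        # Transport is a placeholder, not a physical model.
        import random

        random.seed(2)
        N = 100_000
        detected = backscattered = 0
        for _ in range(N):
            if random.random() < 0.5:            # isotropic source: half emitted upward
                detected += 1                    # idealized 2-pi detector above the source
            elif random.random() < 0.03:         # placeholder backscatter probability
                detected += 1
                backscattered += 1

        print(f"efficiency = {detected / N:.4f} (2-pi geometry gives ~0.5 + backscatter)")
        print(f"backscatter coefficient = {backscattered / N:.4f}")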

  19. The openEHR Java reference implementation project.

    PubMed

    Chen, Rong; Klein, Gunnar

    2007-01-01

    The openEHR foundation has developed an innovative design for interoperable and future-proof Electronic Health Record (EHR) systems, based on a dual-model approach with a stable reference information model complemented by archetypes for specific clinical purposes. A team from Sweden has implemented all the stable specifications in the Java programming language and donated the source code to the openEHR foundation. It was adopted as the openEHR Java Reference Implementation in March 2005 and released under open source licenses. This encourages early EHR implementation projects around the world, and a number of groups have already started to use this code. The early Java implementation experience has also led to the publication of the openEHR Java Implementation Technology Specification. A number of design changes to the specifications and important minor corrections have been directly initiated by the implementation project over the last two years. The Java implementation has been important for the validation and improvement of the openEHR design specifications and provides building blocks for future EHR systems.

  20. Xyce Parallel Electronic Simulator Users' Guide Version 6.8

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Keiter, Eric R.; Aadithya, Karthik Venkatraman; Mei, Ting

    This manual describes the use of the Xyce Parallel Electronic Simulator. Xyce has been designed as a SPICE-compatible, high-performance analog circuit simulator, and has been written to support the simulation needs of Sandia National Laboratories' electrical designers. Development has focused on improving capability over the current state of the art in the following areas: the capability to solve extremely large circuit problems by supporting large-scale parallel computing platforms (up to thousands of processors), including most popular parallel and serial computers; a differential-algebraic-equation (DAE) formulation, which better isolates the device model package from the solver algorithms and allows new types of analysis to be developed without requiring the implementation of analysis-specific device models; device models that are specifically tailored to meet Sandia's needs, including some radiation-aware devices (for Sandia users only); and object-oriented code design and implementation using modern coding practices. Xyce is a parallel code in the most general sense of the phrase: a message-passing parallel implementation, which allows it to run efficiently on a wide range of computing platforms, including serial, shared-memory and distributed-memory parallel platforms. Attention has been paid to the specific nature of circuit-simulation problems to ensure that optimal parallel efficiency is achieved as the number of processors grows.

  1. Circular codes revisited: a statistical approach.

    PubMed

    Gonzalez, D L; Giannerini, S; Rosa, R

    2011-04-21

    In 1996 Arquès and Michel [1996. A complementary circular code in the protein coding genes. J. Theor. Biol. 182, 45-58] discovered the existence of a common circular code in eukaryote and prokaryote genomes. Since then, circular code theory has provoked great interest and undergone rapid development. In this paper we discuss some theoretical issues related to the synchronization properties of coding sequences and circular codes, with particular emphasis on the problem of retrieval and maintenance of the reading frame. Motivated by the theoretical discussion, we adopt a rigorous statistical approach to address several questions. First, we investigate the covering capability of the whole class of 216 self-complementary, C(3) maximal codes with respect to a large set of coding sequences. The results indicate that, on average, the code proposed by Arquès and Michel has the best covering capability, but that there is still great variability among sequences. Second, we focus on this code and explore the role played by the proportions of the bases by means of a hierarchy of permutation tests. The results show the existence of a sort of optimization mechanism such that coding sequences are tailored to maximize or minimize the coverage of circular codes on specific reading frames. Such optimization clearly relates the function of circular codes to reading frame synchronization.
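
    The covering-capability measurement can be sketched simply: for each of the three reading frames of a sequence, count the fraction of trinucleotides that belong to a given codon set; for a circular code, membership should peak strongly in the true frame. The four-codon set below is a toy stand-in for the 20-codon Arquès-Michel code.

        # Frame coverage of a codon set over a coding sequence: membership in a
        # circular code should concentrate in the correct reading frame.
        def frame_coverage(seq, code):
            cov = []
            for frame in range(3):
                codons = [seq[i:i+3] for i in range(frame, len(seq) - 2, 3)]
                cov.append(sum(c in code for c in codons) / len(codons))
            return cov

        toy_code = {"GCC", "GAC", "CTG", "AAC"}      # illustrative subset only
        seq = "GCCGACCTGAACGCCGACCTGAACGCCGAC"       # toy coding sequence, frame 0
        print([round(c, 2) for c in frame_coverage(seq, toy_code)])  # -> [1.0, 0.0, 0.0]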

  2. ETF system code: composition and applications

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Reid, R.L.; Wu, K.F.

    1980-01-01

    A computer code has been developed for application to ETF tokamak system and conceptual design studies. The code determines cost, performance, configuration, and technology requirements as a function of tokamak parameters. The ETF code is structured in a modular fashion in order to allow independent modeling of each major tokamak component. The primary benefit of modularization is that it allows updating of a component module, such as the TF coil module, without disturbing the remainder of the system code as long as the input/output to the modules remains unchanged. The modules may be run independently to perform specific design studies, such as determining the effect of allowable strain on TF coil structural requirements, or the modules may be executed together as a system to determine global effects, such as defining the impact of aspect ratio on the entire tokamak system.

  3. Comparison Between Simulated and Experimentally Measured Performance of a Four Port Wave Rotor

    NASA Technical Reports Server (NTRS)

    Paxson, Daniel E.; Wilson, Jack; Welch, Gerard E.

    2007-01-01

    Performance and operability testing has been completed on a laboratory-scale, four-port wave rotor of the type suitable for use as a topping cycle on a gas turbine engine. Many design aspects and performance estimates for the wave rotor were determined using a time-accurate, one-dimensional, computational-fluid-dynamics-based simulation code developed specifically for wave rotors. The code follows a single rotor passage as it moves past the various ports, which in this reference frame become boundary conditions. This paper compares wave rotor performance predicted with the code to that measured during laboratory testing. Both on- and off-design operating conditions were examined. Overall, the match between code and rig was found to be quite good. At operating points where there were disparities, the assumption of larger-than-expected internal leakage rates successfully realigned code predictions and laboratory measurements. Possible mechanisms for such leakage rates are discussed.

  4. Reading the Second Code: Mapping Epigenomes to Understand Plant Growth, Development, and Adaptation to the Environment

    PubMed Central

    2012-01-01

    We have entered a new era in agricultural and biomedical science made possible by remarkable advances in DNA sequencing technologies. The complete sequence of an individual’s set of chromosomes (collectively, its genome) provides a primary genetic code for what makes that individual unique, just as the contents of every personal computer reflect the unique attributes of its owner. But a second code, composed of “epigenetic” layers of information, affects the accessibility of the stored information and the execution of specific tasks. Nature’s second code is enigmatic and must be deciphered if we are to fully understand and optimize the genetic potential of crop plants. The goal of the Epigenomics of Plants International Consortium is to crack this second code, and ultimately master its control, to help catalyze a new green revolution. PMID:22751210

  5. The feasibility of adapting a population-based asthma-specific job exposure matrix (JEM) to NHANES.

    PubMed

    McHugh, Michelle K; Symanski, Elaine; Pompeii, Lisa A; Delclos, George L

    2010-12-01

    To determine the feasibility of applying a job exposure matrix (JEM) for classifying exposures to 18 asthmagens in the National Health and Nutrition Examination Survey (NHANES), 1999-2004, we cross-referenced 490 National Center for Health Statistics job codes used to develop the 40 NHANES occupation groups with 506 JEM job titles and assessed the homogeneity of asthmagen exposure across job codes within each occupation group. In total, 399 job codes corresponded to one JEM job title, 32 to more than one job title, and 59 were not in the JEM. Three occupation groups had the same asthmagen exposure across job codes, 11 had no asthmagen exposure, and 26 groups had heterogeneous exposures across job codes. The NHANES classification of occupations limits the use of the JEM to evaluate the association between workplace exposures and asthma, and more refined occupational data are needed to enhance work-related injury/illness surveillance efforts.
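
    The cross-referencing and homogeneity check reduces to a simple mapping exercise. A sketch with fabricated codes and exposures:

        # Map job codes to JEM exposure profiles, then flag occupation groups
        # whose member codes disagree. All codes and exposures are fabricated.
        jem = {                      # job code -> set of asthmagen exposures
            "451": {"cleaning agents"},
            "452": {"cleaning agents"},
            "472": {"flour dust", "enzymes"},
            "473": set(),            # in the JEM, but no asthmagen exposure
        }
        occupation_groups = {"cleaning": ["451", "452"], "baking": ["472", "473"]}

        for group, codes in occupation_groups.items():
            profiles = [jem.get(c) for c in codes]
            if any(p is None for p in profiles):
                status = "not in JEM"
            elif all(p == profiles[0] for p in profiles):
                status = f"homogeneous: {sorted(profiles[0]) or 'unexposed'}"
            else:
                status = "heterogeneous -- group cannot be classified"
            print(f"{group}: {status}")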

  6. Code development for ships -- A demonstration

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ayyub, B.; Mansour, A.E.; White, G.

    1996-12-31

    A demonstration summary of a reliability-based structural design code for ships is presented for two ship types, a cruiser and a tanker. For both ship types, the code requirements cover four failure modes: hull girder buckling, unstiffened plate yielding and buckling, stiffened plate buckling, and fatigue of critical details. Both serviceability and ultimate limit states are considered. Because of length limitations, only the hull girder modes are presented in this paper; code requirements for the other modes will be presented in future publications. A specific provision of the code is the safety check expression. The design variables are to be taken at their nominal values, typically values on the safe side of the respective distributions. Other safety check expressions for hull girder failure that include load combination factors, as well as consequence-of-failure factors, are also considered. This paper provides a summary of the safety check expressions for the hull girder modes.
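
    A generic rendering of such a safety check expression is phi * R_n >= sum_i gamma_i * L_i, with all variables at nominal values. The sketch below uses invented factors and loads for illustration, not values from the ship code itself.

        # Generic reliability-based safety check: factored nominal resistance
        # must exceed the sum of load-factored nominal load effects.
        def hull_girder_check(R_n, loads, phi=0.85):
            """R_n: nominal capacity; loads: list of (nominal load effect, load factor)."""
            demand = sum(L * gamma for L, gamma in loads)
            return phi * R_n >= demand, phi * R_n / demand   # pass/fail and margin

        ok, margin = hull_girder_check(
            R_n=5.2e9,                                   # nominal bending capacity, N*m (invented)
            loads=[(1.8e9, 1.2), (1.5e9, 1.3)])          # stillwater and wave moments (invented)
        print(f"check {'passes' if ok else 'fails'}, margin = {margin:.2f}")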

  7. Computer code for analyzing the performance of aquifer thermal energy storage systems

    NASA Astrophysics Data System (ADS)

    Vail, L. W.; Kincaid, C. T.; Kannberg, L. D.

    1985-05-01

    A code called the Aquifer Thermal Energy Storage System Simulator (ATESSS) has been developed to analyze the operational performance of ATES systems. The ATESSS code provides the ability to examine the interrelationships among design specifications, general operational strategies, and unpredictable variations in the demand for energy. Users of the code can vary the well field layout, heat exchanger size, and pumping/injection schedule. Unpredictable aspects of supply and demand may also be examined through the use of a stochastic model of selected system parameters. While employing a relatively simple model of the aquifer, the ATESSS code plays an important role in the design and operation of ATES facilities by augmenting the experience provided by the relatively few field experiments and demonstration projects. ATESSS has been used to characterize the effect of different pumping/injection schedules on a hypothetical ATES system and to estimate the recovery at the St. Paul, Minnesota, field experiment.

  8. A Thermal Management Systems Model for the NASA GTX RBCC Concept

    NASA Technical Reports Server (NTRS)

    Traci, Richard M.; Farr, John L., Jr.; Laganelli, Tony; Walker, James (Technical Monitor)

    2002-01-01

    The Vehicle Integrated Thermal Management Analysis Code (VITMAC) was further developed to aid the analysis, design, and optimization of propellant and thermal management concepts for advanced propulsion systems. The computational tool is based on engineering-level principles and models. A graphical user interface (GUI) provides a simple and straightforward method to assess and evaluate multiple concepts before undertaking more rigorous analysis of candidate systems. The tool incorporates the Chemical Equilibrium and Applications (CEA) program and the RJPA code to permit heat transfer analysis of both rocket and air-breathing propulsion systems. Key parts of the code have been validated with experimental data. The tool was specifically tailored to analyze rocket-based combined-cycle (RBCC) propulsion systems being considered for space transportation applications. This report describes the computational tool and its development and verification for NASA GTX RBCC propulsion system applications.

  9. Multi-optimization Criteria-based Robot Behavioral Adaptability and Motion Planning

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pin, Francois G.

    2002-06-01

    Robotic tasks are typically defined in Task Space (e.g., the 3-D world), whereas robots are controlled in Joint Space (motors). The transformation from Task Space to Joint Space must consider the task objectives (e.g., high precision, strength optimization, torque optimization), the task constraints (e.g., obstacles, joint limits, non-holonomic constraints, contact or tool task constraints), and the robot kinematics configuration (e.g., tools, type of joints, mobile platform, manipulator, modular additions, locked joints). Commercially available robots are optimized for a specific set of tasks, objectives and constraints and, therefore, their control codes are extremely specific to a particular set of conditions. Thus, there exists a multiplicity of codes, each handling a particular set of conditions, but none suitable for use on robots with widely varying tasks, objectives, constraints, or environments. On the other hand, most DOE missions and tasks are typically "batches of one". Attempting to use commercial codes for such work requires significant personnel and schedule costs for re-programming or adding code to the robots whenever a change in task objective, robot configuration, number and type of constraints, etc., occurs. The objective of our project is to develop a "generic code" to implement this Task-Space to Joint-Space transformation that allows robot behavior adaptation, in real time (at loop rate), to changes in task objectives, number and type of constraints, modes of control, and kinematics configuration (e.g., new tools, added modules). Our specific goal is to develop a single code for the general solution of under-specified systems of algebraic equations that is suitable for solving the inverse kinematics of robots; is usable for all types of robots (mobile robots, manipulators, mobile manipulators, etc.) with no limitation on the number of joints and the number of controlled Task-Space variables; can adapt to real-time changes in the number and type of constraints and in task objectives; and can adapt to changes in kinematics configuration (change of module, change of tool, joint-failure adaptation, etc.).
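
    One standard resolution of such under-specified task-to-joint mappings is damped least-squares differential inverse kinematics, which is indifferent to the number of joints and task variables. The sketch below shows this generic technique for illustration; it is not necessarily the algorithm implemented in the project's code.

        # Damped least-squares differential IK: dq = J^T (J J^T + lambda^2 I)^-1 dx.
        # Works for any (m, n) Jacobian; damping keeps steps bounded near
        # singular configurations. Dimensions and values are illustrative.
        import numpy as np

        def dls_step(J, dx, damping=0.05):
            """J: (m, n) task Jacobian; dx: (m,) task-space displacement."""
            m = J.shape[0]
            JJt = J @ J.T + (damping ** 2) * np.eye(m)
            return J.T @ np.linalg.solve(JJt, dx)      # (n,) joint-space step

        # Redundant arm: 7 joints, 3 task coordinates -> infinitely many solutions.
        rng = np.random.default_rng(3)
        J = rng.standard_normal((3, 7))
        dq = dls_step(J, dx=np.array([0.01, 0.0, -0.02]))
        print("joint step:", np.round(dq, 4))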

  10. [Standardization of hospital feeding].

    PubMed

    Caracuel García, Ángel Manuel

    2015-05-07

    Normalization can be understood as the establishment of measures for repetitive situations through the development, dissemination, and application of technical design documents called standards. In Andalusia there are 45 public hospitals with 14,606 beds, in which 11,700 full-board meal services per day are provided. The Working Group on Hospital Food Standardization of the Andalusian Society for Clinical Nutrition and Dietetics began working in 2010 on the certification of suppliers, product specifications, and technical cards for meals. The objectives were:
    - To develop a specific tool to help improve food safety through the certification of suppliers.
    - To develop standardized technical specifications for the foodstuffs needed to prepare the menus of the diet codes established in Andalusian hospitals.
    - To develop a catalogue of technical cards for hospital meal dishes, in order to homogenize menus, respecting local customs and unifying qualitative and quantitative criteria for ingredients.
    The method consisted of collecting and studying documentation from several public hospitals in Andalusia:
    • Product specifications and certification of suppliers.
    • International standards for certification and distribution companies.
    • Legislation.
    • Data sheets for the menu items.
    • Specifications from different product procurement procedures.
    This was followed by the development of the draft standard HOSPIFOOD® and approval of version "0.0", a training course for auditors of this standard, the development of a raw-materials catalogue as technical cards, and the review of meal technical cards with selection of those to form part of the document. After nearly three years of work, the following products have been achieved:
    - A standardized database of technical specifications for the production of food for the diet codes, covering: fish, seafood, meat and meat products, cold meats and pates, ready meals, bread and pastries, preserves, milk and dairy products, oils, cereals, legumes, vegetables, fruits, fresh and frozen vegetables, condiments and spices.
    - A standardized database of technical cards for meals containing the following data: SAS code, province, hospital, dish name, ingredients (g), edible ingredients (g), kcal, proteins, carbohydrates, fat and fibre.
    - HOSPIFOOD® standard certification for food providers in hospitals, school cafeterias and other institutions of social catering.
    Patients expect the food offered during a hospital stay to meet basic standards of quality and safety; it is therefore necessary to design and develop control systems covering the award and/or acquisition of foods (raw materials and finished products) that subsequently become part of the menu offered as part of their treatment. To avoid fraudulent practices affecting public health, the quality and safety of food must be ensured from the origin, and standards must be established for its acquisition and subsequent use.

  11. Automated Concurrent Blackboard System Generation in C++

    NASA Technical Reports Server (NTRS)

    Kaplan, J. A.; McManus, J. W.; Bynum, W. L.

    1999-01-01

    In his 1992 Ph.D. thesis, "Design and Analysis Techniques for Concurrent Blackboard Systems", John McManus defined several performance metrics for concurrent blackboard systems and developed a suite of tools for creating and analyzing such systems. These tools allow a user to analyze a concurrent blackboard system design and predict the performance of the system before any code is written. The design can be modified until simulated performance is satisfactory. Then, the code generator can be invoked to generate automatically all of the code required for the concurrent blackboard system except for the code implementing the functionality of each knowledge source. We have completed the port of the source code generator and a simulator for a concurrent blackboard system. The source code generator generates the necessary C++ source code to implement the concurrent blackboard system using Parallel Virtual Machine (PVM) running on a heterogeneous network of UNIX workstations. The concurrent blackboard simulator uses the blackboard specification file to predict the performance of the concurrent blackboard design. The only part of the source code for the concurrent blackboard system that the user must supply is the code implementing the functionality of the knowledge sources.
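
    As a rough, language-shifted sketch of the blackboard pattern that such a generator targets (the system above emits C++/PVM; the Python below is illustrative only, and every name in it is hypothetical), knowledge sources watch a shared blackboard and contribute partial results until no source can fire. Only the knowledge-source bodies would be user-supplied, mirroring the division of labor described in the abstract.

      class Blackboard:
          """Shared data store that all knowledge sources read and update."""
          def __init__(self):
              self.data = {}

      class KnowledgeSource:
          def is_applicable(self, bb):   # condition part
              raise NotImplementedError
          def execute(self, bb):         # action part (user-supplied in the real system)
              raise NotImplementedError

      class UpperCaser(KnowledgeSource):
          def is_applicable(self, bb):
              return "text" in bb.data and "upper" not in bb.data
          def execute(self, bb):
              bb.data["upper"] = bb.data["text"].upper()

      def control_loop(bb, sources):
          """Generated controller: fire applicable sources until quiescence."""
          fired = True
          while fired:
              fired = False
              for ks in sources:
                  if ks.is_applicable(bb):
                      ks.execute(bb)
                      fired = True

      bb = Blackboard()
      bb.data["text"] = "hello"
      control_loop(bb, [UpperCaser()])
      print(bb.data["upper"])  # HELLO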

  12. Understanding the Role of Non-Coding RNAs in Bladder Cancer: From Dark Matter to Valuable Therapeutic Targets

    PubMed Central

    Pop-Bica, Cecilia; Gulei, Diana; Cojocneanu-Petric, Roxana; Braicu, Cornelia; Petrut, Bogdan; Berindan-Neagoe, Ioana

    2017-01-01

    The mortality and morbidity that characterize bladder cancer make this malignancy a hot topic in biomolecular research. A better understanding of the specific molecular mechanisms that underlie the development and progression of bladder cancer is therefore needed. Tumor heterogeneity among patients with a similar diagnosis, as well as intratumor heterogeneity, creates difficulties for targeted therapy. Furthermore, late diagnosis remains an ongoing issue, significantly reducing the response to therapy and, inevitably, overall survival. The role of non-coding RNAs in bladder cancer emerged in the last decade, revealing that microRNAs (miRNAs) may act as tumor suppressor genes or oncogenes, and also as biomarkers for early diagnosis. Regarding other types of non-coding RNAs, especially long non-coding RNAs (lncRNAs), which are extensively reviewed in this article, their exact roles in tumorigenesis are, for the time being, not as evident as in the case of miRNAs, but are still clearly suggested. Therefore, this review covers the non-coding RNA expression profile of bladder cancer patients and their validated target genes in bladder cancer cell lines, with repercussions on processes such as proliferation, invasiveness, apoptosis, cell cycle arrest, and other molecular pathways that are specific to the malignant transformation of cells. PMID:28703782

  13. Understanding the Role of Non-Coding RNAs in Bladder Cancer: From Dark Matter to Valuable Therapeutic Targets.

    PubMed

    Pop-Bica, Cecilia; Gulei, Diana; Cojocneanu-Petric, Roxana; Braicu, Cornelia; Petrut, Bogdan; Berindan-Neagoe, Ioana

    2017-07-13

    The mortality and morbidity that characterize bladder cancer make this malignancy a hot topic in biomolecular research. A better understanding of the specific molecular mechanisms that underlie the development and progression of bladder cancer is therefore needed. Tumor heterogeneity among patients with a similar diagnosis, as well as intratumor heterogeneity, creates difficulties for targeted therapy. Furthermore, late diagnosis remains an ongoing issue, significantly reducing the response to therapy and, inevitably, overall survival. The role of non-coding RNAs in bladder cancer emerged in the last decade, revealing that microRNAs (miRNAs) may act as tumor suppressor genes or oncogenes, and also as biomarkers for early diagnosis. Regarding other types of non-coding RNAs, especially long non-coding RNAs (lncRNAs), which are extensively reviewed in this article, their exact roles in tumorigenesis are, for the time being, not as evident as in the case of miRNAs, but are still clearly suggested. Therefore, this review covers the non-coding RNA expression profile of bladder cancer patients and their validated target genes in bladder cancer cell lines, with repercussions on processes such as proliferation, invasiveness, apoptosis, cell cycle arrest, and other molecular pathways that are specific to the malignant transformation of cells.

  14. Epigenetics: a new frontier in dentistry.

    PubMed

    Williams, S D; Hughes, T E; Adler, C J; Brook, A H; Townsend, G C

    2014-06-01

    In 2007, only four years after the completion of the Human Genome Project, the journal Science announced that epigenetics was the 'breakthrough of the year'. Time magazine placed it second in the top 10 discoveries of 2009. While our genetic code (i.e. our DNA) contains all of the information to produce the elements we require to function, our epigenetic code determines when and where genes in the genetic code are expressed. Without the epigenetic code, the genetic code is like an orchestra without a conductor. Although there is now a substantial amount of published research on epigenetics in medicine and biology, epigenetics in dental research is in its infancy. However, epigenetics promises to become increasingly relevant to dentistry because of the role it plays in gene expression during development and subsequently potentially influencing oral disease susceptibility. This paper provides a review of the field of epigenetics aimed specifically at oral health professionals. It defines epigenetics, addresses the underlying concepts and provides details about specific epigenetic molecular mechanisms. Further, we discuss some of the key areas where epigenetics is implicated, and review the literature on epigenetics research in dentistry, including its relevance to clinical disciplines. This review considers some implications of epigenetics for the future of dental practice, including a 'personalized medicine' approach to the management of common oral diseases. © 2014 Australian Dental Association.

  15. Adapting the coping in deliberation (CODE) framework: a multi-method approach in the context of familial ovarian cancer risk management.

    PubMed

    Witt, Jana; Elwyn, Glyn; Wood, Fiona; Rogers, Mark T; Menon, Usha; Brain, Kate

    2014-11-01

    To test whether the coping in deliberation (CODE) framework can be adapted to a specific preference-sensitive medical decision: risk-reducing bilateral salpingo-oophorectomy (RRSO) in women at increased risk of ovarian cancer. We performed a systematic literature search to identify issues important to women during deliberations about RRSO. Three focus groups with patients (most were pre-menopausal and untested for genetic mutations) and 11 interviews with health professionals were conducted to determine which issues mattered in the UK context. Data were used to adapt the generic CODE framework. The literature search yielded 49 relevant studies, which highlighted various issues and coping options important during deliberations, including mutation status, risks of surgery, family obligations, physician recommendation, peer support and reliable information sources. Consultations with UK stakeholders confirmed most of these factors as pertinent influences on deliberations. Questions in the generic framework were adapted to reflect the issues and coping options identified. The generic CODE framework was readily adapted to a specific preference-sensitive medical decision, showing that coping and deliberation are linked during decision making about RRSO. Adapted versions of the CODE framework may be used to develop tailored decision support methods and materials in order to improve patient-centred care. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.

  16. CBT Specific Process in Exposure-Based Treatments: Initial Examination in a Pediatric OCD Sample

    PubMed Central

    Benito, Kristen Grabill; Conelea, Christine; Garcia, Abbe M.; Freeman, Jennifer B.

    2012-01-01

    Cognitive-Behavioral theory and empirical support suggest that optimal activation of fear is a critical component for successful exposure treatment. Using this theory, we developed coding methodology for measuring CBT-specific process during exposure. We piloted this methodology in a sample of young children (N = 18) who previously received CBT as part of a randomized controlled trial. Results supported the preliminary reliability and predictive validity of coding variables with 12 week and 3 month treatment outcome data, generally showing results consistent with CBT theory. However, given our limited and restricted sample, additional testing is warranted. Measurement of CBT-specific process using this methodology may have implications for understanding mechanism of change in exposure-based treatments and for improving dissemination efforts through identification of therapist behaviors associated with improved outcome. PMID:22523609

  17. An expert system based software sizing tool, phase 2

    NASA Technical Reports Server (NTRS)

    Friedlander, David

    1990-01-01

    A software tool was developed for predicting the size of a future computer program at an early stage in its development. The system is intended to enable a user who is not expert in Software Engineering to estimate software size in lines of source code with an accuracy similar to that of an expert, based on the program's functional specifications. The project was planned as a knowledge based system with a field prototype as the goal of Phase 2 and a commercial system planned for Phase 3. The researchers used techniques from Artificial Intelligence and knowledge from human experts and existing software from NASA's COSMIC database. They devised a classification scheme for the software specifications, and a small set of generic software components that represent complexity and apply to large classes of programs. The specifications are converted to generic components by a set of rules and the generic components are input to a nonlinear sizing function which makes the final prediction. The system developed for this project predicted code sizes from the database with a bias factor of 1.06 and a fluctuation factor of 1.77, an accuracy similar to that of human experts but without their significant optimistic bias.
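
    The two-stage scheme described above (rules map functional specifications to generic components; a nonlinear function maps component complexity to size) can be sketched as follows. The rules, weights, and the exponent are invented placeholders, not the calibrated values of the Phase 2 system.

      # Hypothetical rule base: specification keywords -> generic components
      RULES = {"database": "data_store", "report": "output_formatter",
               "sensor": "input_handler", "schedule": "control_logic"}

      # Hypothetical relative complexity weight per generic component
      WEIGHTS = {"data_store": 8.0, "output_formatter": 3.0,
                 "input_handler": 4.0, "control_logic": 6.0}

      def classify(spec_items):
          """Map each functional-specification item to generic components."""
          return [RULES[k] for item in spec_items
                  for k in RULES if k in item.lower()]

      def predicted_sloc(components, a=120.0, b=1.15):
          """Nonlinear sizing function: size = a * (total weight) ** b."""
          total = sum(WEIGHTS[c] for c in components)
          return a * total ** b

      spec = ["Store results in a database", "Generate a summary report",
              "Poll each sensor", "Schedule nightly runs"]
      print(round(predicted_sloc(classify(spec))))  # predicted lines of code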

  18. Development of real-time software environments for NASA's modern telemetry systems

    NASA Technical Reports Server (NTRS)

    Horner, Ward; Sabia, Steve

    1989-01-01

    An effort has been made to maintain maximum performance and flexibility for NASA-Goddard's VLSI telemetry system elements through the development of two real-time systems: (1) the Base System Environment, which supports generic system integration and furnishes the basic porting of various manufacturers' cards, and (2) the Modular Environment for Data Systems, which supports application-specific developments and furnishes designers with a set of tested generic library functions that can be employed to speed up the development of such application-specific real-time codes. The performance goals and design rationale for these two systems are discussed.

  19. Support for Systematic Code Reviews with the SCRUB Tool

    NASA Technical Reports Server (NTRS)

    Holzmann, Gerald J.

    2010-01-01

    SCRUB is a code review tool that supports both large, team-based software development efforts (e.g., for mission software) as well as individual tasks. The tool was developed at JPL to support a new, streamlined code review process that combines human-generated review reports with program-generated review reports from a customizable range of state-of-the-art source code analyzers. The leading commercial tools include Codesonar, Coverity, and Klocwork, each of which can achieve a reasonably low rate of false positives in the warnings that they generate. The time required to analyze code with these tools can vary greatly. In each case, however, the tools produce results that would be difficult to realize with human code inspections alone. There is little overlap in the results produced by the different analyzers, and each analyzer used generally increases the effectiveness of the overall effort. The SCRUB tool allows all reports to be accessed through a single, uniform interface (see figure) that facilitates browsing code and reports. Improvements over existing software include significant simplification and leveraging of a range of commercial, static source code analyzers in a single, uniform framework. The tool runs as a small stand-alone application, avoiding the security problems related to tools based on Web browsers. A developer or reviewer, for instance, must have already obtained access rights to a code base before that code can be browsed and reviewed with the SCRUB tool. The tool cannot open any files or folders to which the user does not already have access. This means that the tool does not need to enforce or administer any additional security policies. The analysis results presented through the SCRUB tool's user interface are always computed off-line, given that, especially for larger projects, this computation can take longer than appropriate for interactive tool use. The recommended code review process that is supported by the SCRUB tool consists of three phases: Code Review, Developer Response, and Closeout Resolution. In the Code Review phase, all tool-based analysis reports are generated, and specific comments from expert code reviewers are entered into the SCRUB tool. In the second phase, Developer Response, the developer is asked to respond to each comment and tool report that was produced, either agreeing or disagreeing to provide a fix that addresses the issue that was raised. In the third phase, Closeout Resolution, all disagreements are discussed in a meeting of all parties involved, and a resolution is made for all disagreements. The first two phases generally take one week each, and the third phase is concluded in a single closeout meeting.

  20. acme: The Amendable Coal-Fire Modeling Exercise. A C++ Class Library for the Numerical Simulation of Coal-Fires

    NASA Astrophysics Data System (ADS)

    Wuttke, Manfred W.

    2017-04-01

    At LIAG, we use numerical models to develop and enhance understanding of coupled transport processes and to predict the dynamics of the system under consideration. Topics include geothermal heat utilization, subrosion processes, and spontaneous underground coal fires. Although the details make it inconvenient if not impossible to apply a single code implementation to all systems, their investigations go along similar paths: they all depend on the solution of coupled transport equations. We thus saw a need for a modular code system with open access for the various communities to maximize the shared synergistic effects. To this purpose we develop the oops! (open object-oriented parallel solutions) toolkit, a C++ class library for the numerical solution of mathematical models of coupled thermal, hydraulic and chemical processes. This is used to develop problem-specific libraries like acme (amendable coal-fire modeling exercise), a class library for the numerical simulation of coal fires, and applications like kobra (Kohlebrand, German for coal fire), a numerical simulation code for standard coal-fire models. The basic principle of the oops! code system is the provision of data types for describing space- and time-dependent data fields, terms of partial differential equations (PDEs), their discretisation, and solving methods. Coupling of different processes, each described by its particular PDE, is modeled by an automatic timescale-ordered operator-splitting technique. acme is a derived coal-fire-specific application library, depending on oops!. If specific functionalities of general interest are implemented and have been tested, they will be assimilated into the main oops! library. Interfaces to external pre- and post-processing tools are easily implemented. Thus a construction kit which can be arbitrarily amended is formed. With the kobra application constructed with acme we study the processes and propagation of shallow coal seam fires, in particular in Xinjiang, China, as well as analyze and interpret results from lab experiments.
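
    Operator splitting, the coupling mechanism named above, can be sketched independently of the C++ library (whose internals are not shown in the abstract). A minimal Python illustration for du/dt = A(u) + B(u), with two hypothetical sub-operators and first-order (Lie) splitting:

      import numpy as np

      def step_decay(u, dt, k=1.0):
          """Sub-step for du/dt = -k*u (e.g., a reaction or cooling term)."""
          return u * np.exp(-k * dt)

      def step_diffuse(u, dt, d=0.1):
          """Sub-step for explicit 1-D diffusion, unit grid spacing, periodic BC."""
          return u + d * dt * (np.roll(u, 1) - 2 * u + np.roll(u, -1))

      def split_step(u, dt):
          """Lie splitting: apply the sub-operators in sequence each time step."""
          return step_decay(step_diffuse(u, dt), dt)

      u = np.zeros(50)
      u[25] = 1.0                 # initial heat/concentration spike
      for _ in range(200):
          u = split_step(u, dt=0.1)

    A timescale-ordered variant, as mentioned in the abstract, would additionally choose the sub-step order and size from the stiffness of each process; that logic is omitted here.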

  1. Coverage of Developed and Developing Nations in American Wire Services to Asia.

    ERIC Educational Resources Information Center

    Giffard, C. Anthony

    A study was conducted to contrast the news coverage of developed and developing nations, and of the United States specifically, as transmitted to Asia by the Associated Press (AP) and United Press International (UPI). A total of 556 AP reports and 453 UPI reports drawn from a 6-week period were coded for more than 100 variables and 47 topics. The…

  2. DEVELOPMENT OF PERMANENT MECHANICAL REPAIR SLEEVE FOR PLASTIC PIPE

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hitesh Patadia

    2004-09-30

    The report presents a comprehensive summary of the project status related to the development of a permanent mechanical repair fitting intended to be installed on damaged PE mains under blowing gas conditions. Specifically, the product definition has been developed taking into account relevant codes and standards and industry input. A conceptual design for the mechanical repair sleeve has been developed which meets the product definition.

  3. Creating and Testing Simulation Software

    NASA Technical Reports Server (NTRS)

    Heinich, Christina M.

    2013-01-01

    The goal of this project is to learn about the software development process, specifically the process of testing and fixing components of the software. The paper covers techniques for testing code and the benefits of using one style of testing over another. It also discusses the overall software design and development lifecycle, and how code testing plays an integral role in it. Coding is notorious for always needing to be debugged due to coding errors or faulty program design. Writing tests, either before or during program creation, that cover all aspects of the code provides a relatively easy way to locate and fix errors, which in turn decreases the need to fix a program after it is released for common use. The backdrop for this paper is the Spaceport Command and Control System (SCCS) Simulation Computer Software Configuration Item (CSCI), a project whose goal is to simulate a launch using simulated models of the ground systems and the connections between them and the control room. The simulations will be used for training and to ensure that all possible outcomes and complications are prepared for before the actual launch day. The code being tested is the Programmable Logic Controller Interface (PLCIF) code, the component responsible for transferring information from the models to the model Programmable Logic Controllers (PLCs), basic computers that are used for very simple tasks.
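
    As a generic illustration of the test-first style discussed above (not actual SCCS/PLCIF code, none of which is shown in the abstract), a unit test written with pytest might look like this; the function under test is a hypothetical value-scaling helper of the kind an interface layer could contain.

      import pytest

      def scale_reading(raw, lo, hi):
          """Map a raw 12-bit count (0..4095) onto the engineering range [lo, hi]."""
          if not 0 <= raw <= 4095:
              raise ValueError("raw count out of range")
          return lo + (hi - lo) * raw / 4095.0

      def test_scale_endpoints():
          assert scale_reading(0, 0.0, 10.0) == 0.0
          assert scale_reading(4095, 0.0, 10.0) == 10.0

      def test_scale_rejects_bad_input():
          with pytest.raises(ValueError):
              scale_reading(5000, 0.0, 10.0)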

  4. Prediction task guided representation learning of medical codes in EHR.

    PubMed

    Cui, Liwen; Xie, Xiaolei; Shen, Zuojun

    2018-06-18

    There have been rapidly growing applications using machine learning models for predictive analytics in Electronic Health Records (EHR) to improve the quality of hospital services and the efficiency of healthcare resource utilization. A fundamental and crucial step in developing such models is to convert medical codes in EHR to feature vectors. These medical codes are used to represent diagnoses or procedures. Their vector representations have a tremendous impact on the performance of machine learning models. Recently, some researchers have utilized representation learning methods from Natural Language Processing (NLP) to learn vector representations of medical codes. However, most previous approaches are unsupervised, i.e., the generation of medical code vectors is independent of prediction tasks. Thus, the obtained feature vectors may be inappropriate for a specific prediction task. Moreover, unsupervised methods often require large numbers of samples to obtain reliable results, but most practical problems have very limited patient samples. In this paper, we develop a new method called Prediction Task Guided Health Record Aggregation (PTGHRA), which aggregates health records guided by prediction tasks, to construct training corpora for various representation learning models. Compared with unsupervised approaches, representation learning models integrated with PTGHRA yield a significant improvement in the predictive capability of the generated medical code vectors, especially for limited training samples. Copyright © 2018. Published by Elsevier Inc.
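
    The contrast drawn above, unsupervised embeddings versus task-guided corpus construction, can be sketched as follows. This is an illustrative guess at the flavor of PTGHRA rather than the paper's exact algorithm: records are grouped by the prediction label before training a standard skip-gram model (gensim's Word2Vec), so codes that co-occur within the same outcome group share context.

      from gensim.models import Word2Vec

      # Hypothetical visit records: (list of medical codes, outcome label)
      records = [(["E11", "I10", "N18"], 1), (["E11", "I10"], 1),
                 (["J45", "J30"], 0), (["J45", "R05"], 0)]

      # Unsupervised baseline: each visit is its own training sentence
      baseline = Word2Vec([codes for codes, _ in records],
                          vector_size=16, window=5, min_count=1, seed=7)

      # Task-guided aggregation: pool the codes of visits sharing a label
      grouped = {}
      for codes, label in records:
          grouped.setdefault(label, []).extend(codes)
      guided = Word2Vec(list(grouped.values()),
                        vector_size=16, window=5, min_count=1, seed=7)

      print(guided.wv.similarity("E11", "N18"))  # codes linked via one outcome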

  5. Inquiry-Based Learning Case Studies for Computing and Computing Forensic Students

    ERIC Educational Resources Information Center

    Campbell, Jackie

    2012-01-01

    Purpose: The purpose of this paper is to describe and discuss the use of specifically-developed, inquiry-based learning materials for Computing and Forensic Computing students. Small applications have been developed which require investigation in order to de-bug code, analyse data issues and discover "illegal" behaviour. The applications…

  6. Advanced software development workstation project: Engineering scripting language. Graphical editor

    NASA Technical Reports Server (NTRS)

    1992-01-01

    Software development is widely considered to be a bottleneck in the development of complex systems, both in terms of development and in terms of maintenance of deployed systems. Cost of software development and maintenance can also be very high. One approach to reducing costs and relieving this bottleneck is increasing the reuse of software designs and software components. A method for achieving such reuse is a software parts composition system. Such a system consists of a language for modeling software parts and their interfaces, a catalog of existing parts, an editor for combining parts, and a code generator that takes a specification and generates code for that application in the target language. The Advanced Software Development Workstation is intended to be an expert system shell designed to provide the capabilities of a software part composition system.

  7. 44 CFR 206.400 - General.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... standards of safety, decency, and sanitation and in conformity with applicable codes, specifications and standards. (b) Applicable codes, specifications, and standards shall include any disaster resistant building code that meets the minimum requirements of the National Flood Insurance Program (NFIP) as well as...

  8. 44 CFR 206.400 - General.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... standards of safety, decency, and sanitation and in conformity with applicable codes, specifications and standards. (b) Applicable codes, specifications, and standards shall include any disaster resistant building code that meets the minimum requirements of the National Flood Insurance Program (NFIP) as well as...

  9. 44 CFR 206.400 - General.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... standards of safety, decency, and sanitation and in conformity with applicable codes, specifications and standards. (b) Applicable codes, specifications, and standards shall include any disaster resistant building code that meets the minimum requirements of the National Flood Insurance Program (NFIP) as well as...

  10. 44 CFR 206.400 - General.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... standards of safety, decency, and sanitation and in conformity with applicable codes, specifications and standards. (b) Applicable codes, specifications, and standards shall include any disaster resistant building code that meets the minimum requirements of the National Flood Insurance Program (NFIP) as well as...

  11. 44 CFR 206.400 - General.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... standards of safety, decency, and sanitation and in conformity with applicable codes, specifications and standards. (b) Applicable codes, specifications, and standards shall include any disaster resistant building code that meets the minimum requirements of the National Flood Insurance Program (NFIP) as well as...

  12. Resistance and Seakeeping Database for USCG 157 FT WLM

    DTIC Science & Technology

    1991-07-01

    [Report documentation page fragments: prepared for the U.S. Coast Guard Research and Development Center, 1082 Shennecossett Road, Groton, Connecticut 06340-6096; Samuel F. Powel, III, Technical Director; sponsored by the Office of Engineering, Logistics, and Development, Washington, D.C.; distributed by the National Technical Information Service, Springfield, Virginia 22161.]

  13. Fast Model Generalized Pseudopotential Theory Interatomic Potential Routine

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    2015-03-18

    MGPT is an unclassified source code for the fast evaluation and application of quantum-based MGPT interatomic potentials for metals. The present version of MGPT has been developed entirely at LLNL, but is specifically designed for implementation in the open-source molecular-dynamics code LAMMPS maintained by Sandia National Laboratories. Using MGPT in LAMMPS, with separate input potential data, one can perform large-scale atomistic simulations of the structural, thermodynamic, defect and mechanical properties of transition metals with quantum-mechanical realism.

  14. Wavelet-based compression of M-FISH images.

    PubMed

    Hua, Jianping; Xiong, Zixiang; Wu, Qiang; Castleman, Kenneth R

    2005-05-01

    Multiplex fluorescence in situ hybridization (M-FISH) is a recently developed technology that enables multi-color chromosome karyotyping for molecular cytogenetic analysis. Each M-FISH image set consists of a number of aligned images of the same chromosome specimen, each captured at a different optical wavelength. This paper presents embedded M-FISH image coding (EMIC), where the foreground objects/chromosomes and the background objects/images are coded separately. We first apply critically sampled integer wavelet transforms to both the foreground and the background. We then use object-based bit-plane coding to compress each object and generate separate embedded bitstreams that allow continuous lossy-to-lossless compression of the foreground and the background. For efficient arithmetic coding of bit planes, we propose a method of designing an optimal context model that specifically exploits the statistical characteristics of M-FISH images in the wavelet domain. Our experiments show that EMIC achieves nearly twice as much compression as Lempel-Ziv-Welch coding. EMIC also performs much better than JPEG-LS and JPEG-2000 for lossless coding. The lossy performance of EMIC is significantly better than that of coding each M-FISH image with JPEG-2000.
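
    The key ingredient named above, a critically sampled integer wavelet transform with exact reconstruction, can be illustrated with its simplest case, the integer Haar (S) transform implemented by lifting; this sketch is generic and is not the EMIC codec itself.

      import numpy as np

      def haar_forward(x):
          """Integer Haar via lifting; x has even length; exactly invertible."""
          a, b = x[0::2].astype(np.int64), x[1::2].astype(np.int64)
          d = b - a                      # detail coefficients
          s = a + (d >> 1)               # approximation (floored average)
          return s, d

      def haar_inverse(s, d):
          a = s - (d >> 1)
          b = d + a
          out = np.empty(2 * len(s), dtype=np.int64)
          out[0::2], out[1::2] = a, b
          return out

      x = np.array([12, 15, 200, 198, 7, 9, 64, 64])
      s, d = haar_forward(x)
      assert np.array_equal(haar_inverse(s, d), x)   # lossless round trip

    Because every step is integer arithmetic, the transform supports the lossy-to-lossless embedding described above when combined with bit-plane coding of s and d.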

  15. Differentiation of ileostomy from colostomy procedures: assessing the accuracy of current procedural terminology codes and the utility of natural language processing.

    PubMed

    Vo, Elaine; Davila, Jessica A; Hou, Jason; Hodge, Krystle; Li, Linda T; Suliburk, James W; Kao, Lillian S; Berger, David H; Liang, Mike K

    2013-08-01

    Large databases provide a wealth of information for researchers, but identifying patient cohorts often relies on the use of current procedural terminology (CPT) codes. In particular, studies of stoma surgery have been limited by the accuracy of CPT codes in identifying and differentiating ileostomy procedures from colostomy procedures. It is important to make this distinction because the prevalence of complications associated with stoma formation and reversal differ dramatically between types of stoma. Natural language processing (NLP) is a process that allows text-based searching. The Automated Retrieval Console is an NLP-based software that allows investigators to design and perform NLP-assisted document classification. In this study, we evaluated the role of CPT codes and NLP in differentiating ileostomy from colostomy procedures. Using CPT codes, we conducted a retrospective study that identified all patients undergoing a stoma-related procedure at a single institution between January 2005 and December 2011. All operative reports during this time were reviewed manually to abstract the following variables: formation or reversal and ileostomy or colostomy. Sensitivity and specificity for validation of the CPT codes against the master surgery schedule were calculated. Operative reports were evaluated by use of NLP to differentiate ileostomy- from colostomy-related procedures. Sensitivity and specificity for identifying patients with ileostomy or colostomy procedures were calculated for CPT codes and NLP for the entire cohort. CPT codes performed well in identifying stoma procedures (sensitivity 87.4%, specificity 97.5%). A total of 664 stoma procedures were identified by CPT codes between 2005 and 2011. The CPT codes were adequate in identifying stoma formation (sensitivity 97.7%, specificity 72.4%) and stoma reversal (sensitivity 74.1%, specificity 98.7%), but they were inadequate in identifying ileostomy (sensitivity 35.0%, specificity 88.1%) and colostomy (sensitivity 75.2%, specificity 80.9%). NLP performed with greater sensitivity, specificity, and accuracy than CPT codes in identifying stoma procedures and stoma types. Major differences where NLP outperformed CPT included identifying ileostomy (specificity 95.8%, sensitivity 88.3%, and accuracy 91.5%) and colostomy (97.6%, 90.5%, and 92.8%, respectively). CPT codes can effectively identify patients who have had stoma procedures and are adequate in distinguishing between formation and reversal; however, CPT codes cannot differentiate ileostomy from colostomy. NLP can be used to differentiate between ileostomy- and colostomy-related procedures. The role of NLP in conjunction with electronic medical records in data retrieval warrants further investigation. Published by Mosby, Inc.
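
    A minimal sketch of the kind of text classification that NLP enables here (not the Automated Retrieval Console itself; the report fragments and labels below are invented) could pair TF-IDF features with logistic regression:

      from sklearn.feature_extraction.text import TfidfVectorizer
      from sklearn.linear_model import LogisticRegression
      from sklearn.pipeline import make_pipeline

      # Invented fragments of operative reports, labeled by stoma type
      reports = ["loop ileostomy matured in the right lower quadrant",
                 "end colostomy created from the descending colon",
                 "ileostomy takedown with small bowel anastomosis",
                 "Hartmann colostomy reversal performed"]
      labels = ["ileostomy", "colostomy", "ileostomy", "colostomy"]

      clf = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)),
                          LogisticRegression())
      clf.fit(reports, labels)
      print(clf.predict(["diverting loop ileostomy was created"]))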

  16. W-026, Waste Receiving and Processing Facility data management system validation and verification report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Palmer, M.E.

    1997-12-05

    This V and V Report includes analysis of two revisions of the DMS [data management system] System Requirements Specification (SRS) and the Preliminary System Design Document (PSDD); the source code for the DMS Communication Module (DMSCOM) messages; the source code for selected DMS screens; and the code for the BWAS Simulator. BDM Federal analysts used a series of matrices to: compare the requirements in the System Requirements Specification (SRS) to the specifications found in the System Design Document (SDD), to ensure the design supports the business functions; compare the discrete parts of the SDD with each other, to ensure that the design is consistent and cohesive; compare the source code of the DMS Communication Module with the specifications, to ensure that the resultant messages will support the design; compare the source code of selected screens to the specifications, to ensure that the resultant system screens will support the design; and compare the source code of the BWAS simulator with the requirements for interfacing with DMS messages and data transfers relating to BWAS operations.

  17. Risk of preterm birth by subtype among Medi-Cal participants with mental illness.

    PubMed

    Baer, Rebecca J; Chambers, Christina D; Bandoli, Gretchen; Jelliffe-Pawlowski, Laura L

    2016-10-01

    Previous studies have demonstrated an association between mental illness and preterm birth (before 37 weeks). However, these investigations have not simultaneously considered the gestation of the preterm birth, the indication (e.g., spontaneous or medically indicated), and specific mental illness classifications. The objective of the study was to examine the likelihood of preterm birth across gestational lengths and indications among Medi-Cal (California's Medicaid program) participants with a diagnostic code for mental illness. Mental illnesses were studied by specific illness classification. The study population was drawn from singleton live births in California from 2007 through 2011 in the birth cohort file maintained by the California Office of Statewide Health Planning and Development, which includes birth certificate and hospital discharge records. The sample was restricted to women with Medi-Cal coverage for prenatal care. Women with mental illness were identified using International Classification of Diseases, ninth revision, codes from their hospital discharge record. Women without a mental illness International Classification of Diseases, ninth revision, code were randomly selected at a 4:1 ratio. Adjusting for maternal characteristics and obstetric complications, relative risks and 95% confidence intervals were calculated for preterm birth, comparing women with a mental illness diagnostic code with women without such a code. We identified 6198 women with a mental illness diagnostic code and selected 24,792 women with no such code. The risk of preterm birth in women with a mental illness was 1.2 times higher than in women without a mental illness (adjusted relative risk, 1.2; 95% confidence interval, 1.1-1.3). Among the specific mental illnesses, schizophrenia, major depression, and personality disorders had the strongest associations with preterm birth (adjusted relative risks, 2.0, 2.0 and 3.3, respectively). Women receiving prenatal care through California's low-income health insurance who had at least 1 mental illness diagnostic code were 1.2-3.3 times more likely to have a preterm birth than women without a mental illness, and these risks persisted across most illness classifications. Although it cannot be determined from these data whether specific treatments for mental illness contribute to the observed associations, elevated risk across different diagnoses suggests that some aspects of mental illness itself may confer risk. Copyright © 2016 Elsevier Inc. All rights reserved.
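
    For readers unfamiliar with the metric, an unadjusted version of the relative risks reported above can be computed directly from a 2x2 table; the preterm counts below are invented for illustration (only the group sizes come from the abstract).

      import math

      def relative_risk(a, n_exposed, c, n_unexposed):
          """RR with a 95% CI computed on the log scale (a, c = preterm births)."""
          rr = (a / n_exposed) / (c / n_unexposed)
          se = math.sqrt(1/a - 1/n_exposed + 1/c - 1/n_unexposed)
          lo, hi = (math.exp(math.log(rr) + z * se) for z in (-1.96, 1.96))
          return rr, lo, hi

      # Hypothetical: 620 preterm of 6198 exposed, 2065 of 24792 unexposed
      print(relative_risk(620, 6198, 2065, 24792))

    The published estimates are additionally adjusted for maternal characteristics and obstetric complications, which requires a regression model rather than this raw ratio.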

  18. Analysis and specification tools in relation to the APSE

    NASA Technical Reports Server (NTRS)

    Hendricks, John W.

    1986-01-01

    Ada and the Ada Programming Support Environment (APSE) specifically address the phases of the system/software life cycle which follow after the user's problem has been translated into system and software development specifications. The waterfall model of the life cycle identifies the analysis and requirements definition phases as preceding program design and coding. Since Ada is a programming language and the APSE is a programming support environment, they are primarily targeted to support program (code) development, testing, and maintenance. The use of Ada-based or Ada-related specification languages (SLs) and program design languages (PDLs) can extend the use of Ada back into the software design phases of the life cycle. Recall that the standardization of the APSE as a programming support environment is only now happening after many years of evolutionary experience with diverse sets of programming support tools. Restricting consideration to one, or even a few, chosen specification and design tools could be a real mistake for an organization or a major project such as the Space Station, which will need to deal with an increasingly complex level of system problems. To require that everything be Ada-like, be implemented in Ada, run directly under the APSE, and fit into a rigid waterfall model of the life cycle would turn a promising support environment into a straitjacket for progress.

  19. Framework GRASP: routine library for optimize processing of aerosol remote sensing observation

    NASA Astrophysics Data System (ADS)

    Fuertes, David; Torres, Benjamin; Dubovik, Oleg; Litvinov, Pavel; Lapyonok, Tatyana; Ducos, Fabrice; Aspetsberger, Michael; Federspiel, Christian

    This paper presents the development of a framework for the Generalized Retrieval of Aerosol and Surface Properties (GRASP) developed by Dubovik et al. (2011). The framework is a source-code project that strengthens the value of the GRASP inversion algorithm by transforming it into a library that can then be used by a group of customized application modules. The functions of the independent modules include managing the configuration of code execution, as well as preparing the input and output. The framework provides a number of advantages in the utilization of the code. First, it loads data into the core of the scientific code directly from memory, without passing through intermediary files on disk. Second, the framework allows consecutive use of the inversion code without re-initiating the core routine when new input is received. These features are essential for optimizing the performance of data production when processing large observation sets, such as satellite images, with GRASP. Furthermore, the framework is a very convenient tool for further development, because this open-source platform is easily extended with new features. For example, it could accommodate loading raw data directly into the inversion code from a specific instrument not included in the default settings of the software. Finally, it will be demonstrated that, from the user's point of view, the framework provides a flexible, powerful and informative configuration system.

  20. Duct flow nonuniformities for Space Shuttle Main Engine (SSME)

    NASA Technical Reports Server (NTRS)

    1987-01-01

    A three-duct Space Shuttle Main Engine (SSME) Hot Gas Manifold geometry code was developed for use. The methodology of the program is described, recommendations on its implementation made, and an input guide, input deck listing, and a source code listing provided. The code listing is strewn with an abundance of comments to assist the user in following its development and logic. A working source deck will be provided. A thorough analysis was made of the proper boundary conditions and chemistry kinetics necessary for an accurate computational analysis of the flow environment in the SSME fuel side preburner chamber during the initial startup transient. Pertinent results were presented to facilitate incorporation of these findings into an appropriate CFD code. The computation must be a turbulent computation, since the flow field turbulent mixing will have a profound effect on the chemistry. Because of the additional equations demanded by the chemistry model it is recommended that for expediency a simple algebraic mixing length model be adopted. Performing this computation for all or selected time intervals of the startup time will require an abundance of computer CPU time regardless of the specific CFD code selected.

  1. [Representation of knowledge in respiratory medicine: ontology should help the coding process].

    PubMed

    Blanc, F-X; Baneyx, A; Charlet, J; Housset, B

    2010-09-01

    Access to medical knowledge is a major issue for health professionals and requires the development of terminologies. The objective of the reported work was to construct an ontology of respiratory medicine, i.e., an organized and formalized terminology composed of specific knowledge. The purpose is to help the medico-economic coding process and to represent the relevant knowledge about the patient. Our research covers the whole life cycle of an ontology, from the development of a methodology, to building the ontology from texts, to its use in an operational system. A computerized tool, based on the ontology, supports both medico-economic coding and graphical medical coding; the latter will be used to index hospital reports. Our ontology counts 1913 concepts and contains all the knowledge included in the PMSI part of the SPLF thesaurus. Our tool has been evaluated and showed a recall of 80% and an accuracy of 85% for medico-economic coding. The work presented in this paper justifies the approach that has been used. It must be continued on a larger scale to validate our coding principles and the possibility of querying patient reports for clinical research. Copyright © 2010. Published by Elsevier Masson SAS.

  2. Using fault tree analysis to identify causes of non-compliance: enhancing violation outcome data for the purposes of education and prevention.

    PubMed

    Emery, R J; Charlton, M A; Orders, A B; Hernandez, M

    2001-02-01

    An enhanced coding system for the characterization of notices of violation (NOV's) issued to radiation permit holders in the State of Texas was developed based on a series of fault tree analyses serving to identify a set of common causes. The coding system enhancement was retroactively applied to a representative sample (n = 185) of NOV's issued to specific licensees of radioactive materials in Texas during calendar year 1999. The results obtained were then compared to the currently available summary NOV information for the same year. In addition to identifying the most common NOV's, the enhanced coding system revealed that approximately 70% of the sampled NOV's were issued for non-compliance with a specific regulation as opposed to a permit condition. Furthermore, an underlying cause of 94% of the NOV's was the failure on the part of the licensee to execute a specific task. The findings suggest that opportunities exist to improve permit holder compliance through various means, including the creation of summaries which detail specific tasks to be completed, and revising training programs with more focus on the identification and scheduling of permit-related requirements. Broad application of these results is cautioned due to the bias associated with the restricted scope of the project.

  3. Verification of Gyrokinetic codes: theoretical background and applications

    NASA Astrophysics Data System (ADS)

    Tronko, Natalia

    2016-10-01

    In fusion plasmas the strong magnetic field allows the fast gyro motion to be systematically removed from the description of the dynamics, resulting in a considerable model simplification and gain of computational time. Nowadays, gyrokinetic (GK) codes play a major role in the understanding of the development and saturation of turbulence and in the prediction of the consequent transport. We present a new and generic theoretical framework and specific numerical applications to test the validity and the domain of applicability of existing GK codes. For a sound verification process, the underlying theoretical GK model and the numerical scheme must be considered at the same time, which makes this approach pioneering. At the analytical level, the main novelty consists in using advanced mathematical tools, such as the variational formulation of dynamics, to systematize the basic equations of GK codes and access the limits of their applicability. Indirect verification of the numerical scheme is proposed via a benchmark process. In this work, specific examples of code verification are presented for two GK codes: the multi-species electromagnetic ORB5 (PIC), and the radially global version of GENE (Eulerian). The proposed methodology can be applied to any existing GK code. We establish a hierarchy of reduced GK Vlasov-Maxwell equations using the generic variational formulation. Then, we derive the models implemented in ORB5 and GENE and place them within this hierarchy. At the computational level, detailed verification of global electromagnetic test cases based on the CYCLONE benchmark is considered, including a parametric β-scan covering the transition from ITG to KBM and the spectral properties at the nominal β value.

  4. Applications and development of communication models for the touchstone GAMMA and DELTA prototypes

    NASA Technical Reports Server (NTRS)

    Seidel, Steven R.

    1993-01-01

    The goal of this project was to develop models of the interconnection networks of the Intel iPSC/860 and DELTA multicomputers to guide the design of efficient algorithms for interprocessor communication in problems that commonly occur in CFD codes and other applications. Interprocessor communication costs of codes for message-passing architectures such as the iPSC/860 and DELTA significantly affect the level of performance that can be obtained from those machines. This project addressed several specific problems in the achievement of efficient communication on the Intel iPSC/860 hypercube and DELTA mesh. In particular, an efficient global processor synchronization algorithm was developed for the iPSC/860 and numerous broadcast algorithms were designed for the DELTA.
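
    One classic pattern such communication models are built around is recursive-doubling broadcast on a hypercube, which reaches all 2**d nodes in d message steps. The sketch below simulates only the communication schedule (no PVM or NX calls) and is not the project's model itself.

      def hypercube_broadcast(dim, root=0):
          """Return per-step (sender, receiver) pairs for a d-cube broadcast."""
          have = {root}                        # nodes holding the message
          schedule = []
          for k in range(dim):                 # one cube dimension per step
              step = [(p, p ^ (1 << k)) for p in sorted(have)]
              schedule.append(step)
              have |= {r for _, r in step}
          return schedule

      for step, pairs in enumerate(hypercube_broadcast(3)):
          print("step", step, pairs)
      # step 0 [(0, 1)]
      # step 1 [(0, 2), (1, 3)]
      # step 2 [(0, 4), (1, 5), (2, 6), (3, 7)]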

  5. Introduction to study and simulation of low rate video coding schemes

    NASA Technical Reports Server (NTRS)

    1992-01-01

    During this period, simulators for the various HDTV systems proposed to the FCC were developed. These simulators will be tested using test sequences from the MPEG committee, and the results will be extrapolated to HDTV video sequences. Currently, the simulator for the compression aspects of the Advanced Digital Television (ADTV) system has been completed; other HDTV proposals are at various stages of development. A brief overview of the ADTV system is given, and some coding results obtained using the simulator are discussed. These results are compared to those obtained using the CCITT H.261 standard and evaluated in the context of the CCSDS specifications, and some suggestions are made as to how the ADTV system could be implemented in the NASA network.

  6. Developing a Vocational Index for Adults with Autism Spectrum Disorders

    PubMed Central

    Seltzer, Marsha Mailick

    2012-01-01

    Existing methods of indexing the vocational activities of adults with autism spectrum disorders (ASD) have made significant contributions to research. Nonetheless, they are limited by problems with sensitivity and reliability. We developed an index of vocational and educational outcomes that captures the full range of activities experienced by adults with ASD, and that can be reliably coded across studies using specific decision rules. To develop this index, we used employment, vocational, and educational data collected from nearly 350 adults with ASD at 6 times over 12 years, as part of a larger longitudinal study. The resulting index consists of 11 categories coded on a 9-point scale, ranging from competitive employment and/or postsecondary educational program to no vocational/educational activities. PMID:22466690

  7. A Three-Phase Decision Model of Computer-Aided Coding for the Iranian Classification of Health Interventions (IRCHI)

    PubMed Central

    Azadmanjir, Zahra; Safdari, Reza; Ghazisaeedi, Marjan; Mokhtaran, Mehrshad; Kameli, Mohammad Esmail

    2017-01-01

    Introduction: Accurate coded data in healthcare are critical. Computer-Assisted Coding (CAC) is an effective tool for improving clinical coding, in particular when a new classification is to be developed and implemented. But determining the appropriate development method requires considering the specifications of existing CAC systems, the requirements of each type, the available infrastructure, and the classification scheme. Aim: The aim of the study was the development of a decision model for determining the accurate code of each medical intervention in the Iranian Classification of Health Interventions (IRCHI) that can be implemented as a suitable CAC system. Methods: First, a sample of existing CAC systems was reviewed. Then the feasibility of each type of CAC was examined with regard to the prerequisites for its implementation. Next, a suitable model was proposed according to the structure of the classification scheme and implemented as an interactive system. Results: There is a significant relationship between the level of assistance of a CAC system and its integration with electronic medical documents. Implementation of fully automated CAC systems is currently impossible due to the immature development of electronic medical records and problems in the language used for medical documentation. A model was therefore proposed to develop a semi-automated CAC system based on the hierarchical relationships between entities in the classification scheme and the logic of decision making, specifying the characters of the code step by step through a web-based interactive user interface. It is composed of three phases, selecting the Target, Action, and Means of an intervention, respectively. Conclusion: The proposed model suited the current status of clinical documentation and coding in Iran, as well as the structure of the new classification scheme. Our results show it is practical; however, the model needs to be evaluated in the next stage of the research. PMID:28883671
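
    A toy version of the three-phase selection described above (Target, then Action, then Means) can be written as a walk down a small hierarchy; the categories and codes below are invented stand-ins, not IRCHI content.

      # Invented mini-hierarchy: Target -> Action -> Means -> partial code
      HIERARCHY = {
          "eye": {"excision": {"laser": "A01.1", "scalpel": "A01.2"},
                  "repair":   {"suture": "A02.1"}},
          "knee": {"replacement": {"prosthesis": "B10.1"}},
      }

      def build_code(target, action, means):
          """Resolve one axis per phase; reject any invalid choice."""
          try:
              return HIERARCHY[target][action][means]
          except KeyError as missing:
              raise ValueError(f"no option {missing} at this phase") from None

      print(build_code("eye", "excision", "laser"))  # A01.1

    An interactive interface of the kind described would present only the valid options for each successive phase, which is exactly what the nested structure provides.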

  8. A lncRNA Perspective into (Re)Building the Heart.

    PubMed

    Frank, Stefan; Aguirre, Aitor; Hescheler, Juergen; Kurian, Leo

    2016-01-01

    Our conception of the human genome, long focused on the 2% that codes for proteins, has profoundly changed since its first draft assembly in 2001. Since then, an unanticipatedly expansive functionality and convolution has been attributed to the majority of the genome that is transcribed in a cell-type/context-specific manner into transcripts with no apparent protein coding ability. While the majority of these transcripts, currently annotated as long non-coding RNAs (lncRNAs), are functionally uncharacterized, their prominent role in embryonic development and tissue homeostasis, especially in the context of the heart, is emerging. In this review, we summarize and discuss the latest advances in understanding the relevance of lncRNAs in (re)building the heart.

  9. Testing and Performance Analysis of the Multichannel Error Correction Code Decoder

    NASA Technical Reports Server (NTRS)

    Soni, Nitin J.

    1996-01-01

    This report provides the test results and performance analysis of the multichannel error correction code decoder (MED) system for a regenerative satellite with asynchronous, frequency-division multiple access (FDMA) uplink channels. It discusses the system performance relative to various critical parameters: the coding length, data pattern, unique word value, unique word threshold, and adjacent-channel interference. Testing was performed under laboratory conditions and used a computer control interface with specifically developed control software to vary these parameters. Needed technologies - the high-speed Bose Chaudhuri-Hocquenghem (BCH) codec from Harris Corporation and the TRW multichannel demultiplexer/demodulator (MCDD) - were fully integrated into the mesh very small aperture terminal (VSAT) onboard processing architecture and were demonstrated.

  10. Capabilities needed for the next generation of thermo-hydraulic codes for use in real time applications

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Arndt, S.A.

    1997-07-01

    The real-time reactor simulation field is currently at a crossroads in terms of the capability to perform real-time analysis using the most sophisticated computer codes. Current generation safety analysis codes are being modified to replace simplified codes that were specifically designed to meet the competing requirement for real-time applications. The next generation of thermo-hydraulic codes will need to have included in their specifications the specific requirement for use in a real-time environment. Use of the codes in real-time applications imposes much stricter requirements on robustness, reliability and repeatability than do design and analysis applications. In addition, the need for code use by a variety of users is a critical issue for real-time users, trainers and emergency planners who currently use real-time simulation, and PRA practitioners who will increasingly use real-time simulation for evaluating PRA success criteria in near real-time to validate PRA results for specific configurations and plant system unavailabilities.

  11. A generalized chemistry version of SPARK

    NASA Technical Reports Server (NTRS)

    Carpenter, Mark H.

    1988-01-01

    An extension of the reacting H2-air computer code SPARK is presented, which enables the code to be used on any reacting flow problem. Routines are developed that calculate, in a general fashion, the reaction rates and chemical Jacobians of any reacting system. In addition, an equilibrium routine is added so that the code has frozen, finite-rate, and equilibrium capabilities. The reaction rate for each species is determined from the law of mass action using Arrhenius expressions for the rate constants. The Jacobian routines are determined by numerically or analytically differentiating the law of mass action for each species. The equilibrium routine is based on a Gibbs free energy minimization routine. The routines are written in FORTRAN 77, with special consideration given to vectorization. Run times for the generalized routines are generally 20 percent longer than for reaction-specific routines. The numerical efficiency of the generalized analytical Jacobian, however, is nearly 300 percent better than that of the reaction-specific numerical Jacobian used in SPARK.
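
    The two ingredients named above can be made concrete in a few lines: a modified Arrhenius rate constant k = A * T**b * exp(-Ea/(R*T)) and a law-of-mass-action production rate. The one-step reaction and its coefficients below are a generic hypothetical, not SPARK's H2-air mechanism.

      import math

      R = 8.314  # universal gas constant, J/(mol K)

      def arrhenius(A, b, Ea, T):
          """Modified Arrhenius rate constant k = A * T**b * exp(-Ea/(R*T))."""
          return A * T**b * math.exp(-Ea / (R * T))

      def mass_action_rate(k, conc, orders):
          """Forward rate = k * prod(conc_i ** nu_i) over the reactants."""
          rate = k
          for species, nu in orders.items():
              rate *= conc[species] ** nu
          return rate

      # Hypothetical one-step reaction F + 2 O -> products at 1500 K
      k = arrhenius(A=1.0e9, b=0.0, Ea=1.2e5, T=1500.0)
      rate = mass_action_rate(k, {"F": 0.5, "O": 2.0}, {"F": 1, "O": 2})
      print(k, rate)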

  12. LHCb migration from Subversion to Git

    NASA Astrophysics Data System (ADS)

    Clemencic, M.; Couturier, B.; Closier, J.; Cattaneo, M.

    2017-10-01

    Due to user demand and to support new development workflows based on code review and multiple development streams, LHCb decided to port its source code management from Subversion to Git, using the CERN GitLab hosting service. Although tools exist for this kind of migration, LHCb-specific requirements and development models required careful planning of the migration, development of migration tools, changes to the development model, and redefinition of the release procedures. Moreover, we had to support a hybrid situation with some software projects hosted in Git and others still in Subversion, or even branches of one project hosted in different systems. We present the way we addressed the special LHCb requirements, the technical details of migrating large non-standard Subversion repositories, and how we managed to smoothly migrate the software projects following the schedule of each project manager.

  13. Automating FEA programming

    NASA Technical Reports Server (NTRS)

    Sharma, Naveen

    1992-01-01

    In this paper we briefly describe a combined symbolic and numeric approach for solving mathematical models on parallel computers. An experimental software system, PIER, is being developed in Common Lisp to synthesize computationally intensive and domain formulation dependent phases of finite element analysis (FEA) solution methods. Quantities for domain formulation like shape functions, element stiffness matrices, etc., are automatically derived using symbolic mathematical computations. The problem specific information and derived formulae are then used to generate (parallel) numerical code for FEA solution steps. A constructive approach to specify a numerical program design is taken. The code generator compiles application oriented input specifications into (parallel) FORTRAN77 routines with the help of built-in knowledge of the particular problem, numerical solution methods and the target computer.
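
    The symbolic step described above, deriving domain-formulation quantities before emitting numeric code, can be sketched with SymPy for the textbook case of a two-node 1-D bar element (this is a generic illustration, not PIER itself).

      import sympy as sp

      x, L, E, A = sp.symbols("x L E A", positive=True)

      # Linear shape functions on the element [0, L]
      N = sp.Matrix([1 - x / L, x / L])
      B = N.diff(x)                      # strain-displacement vector

      # Element stiffness: K = integral_0^L of E*A * B * B^T dx
      K = sp.integrate(E * A * (B * B.T), (x, 0, L))
      print(sp.simplify(K))              # E*A/L * [[1, -1], [-1, 1]]

    A generator in the spirit of PIER would then pretty-print such closed-form matrices into (parallel) FORTRAN 77 routines instead of evaluating them numerically at run time.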

  14. Lunar module voice recorder

    NASA Technical Reports Server (NTRS)

    1974-01-01

    A feasibility unit suitable for use as a voice recorder on the space shuttle was developed. A modification, development, and test program is described. A LM-DSEA recorder was modified to achieve the following goals: (1) redesign case to allow in-flight cartridge change; (2) time code change from LM code to IRIG-B 100 pps code; (3) delete cold plate requirements (also requires deletion of long-term thermal vacuum operation at 0.00001 MMHg); (4) implement track sequence reset during cartridge change; (5) reduce record time per cartridge because of unavailability of LM thin-base tape; and (6) add an internal Vox key circuit to turn on/off transport and electronics with voice data input signal. The recorder was tested at both the LM and shuttle vibration levels. The modified recorder achieved the same level of flutter during vibration as the DSEA recorder prior to modification. Several improvements were made over the specification requirements. The high manufacturing cost is discussed.

  15. Technological Developments in lncRNA Biology.

    PubMed

    Jathar, Sonali; Kumar, Vikram; Srivastava, Juhi; Tripathi, Vidisha

    2017-01-01

    It is estimated that more than 90% of the mammalian genome is transcribed as non-coding RNA. Recent evidence has established that these non-coding transcripts are not junk or mere transcriptional noise, but serve important biological purposes. One of the rapidly expanding fields within this class of transcripts is that of the regulatory lncRNAs, whose molecular functions and mechanisms of action have been a major challenge to establish. The emergence of high-throughput technologies and developments in various conventional approaches have led to the expansion of the lncRNA world. The combination of multidisciplinary approaches has proven essential to unravel the complexity of their regulatory networks and has helped establish the importance of their existence. Here, we review the current methodologies available for discovering and investigating functions of long non-coding RNAs (lncRNAs) and focus on the powerful technological advances available to specifically address their functional importance.

  16. Momentary Patterns of Covariation between Specific Affects and Interpersonal Behavior: Linking Relationship Science and Personality Assessment

    PubMed Central

    Ross, Jaclyn M.; Girard, Jeffrey M.; Wright, Aidan G.C.; Beeney, Joseph E.; Scott, Lori N.; Hallquist, Michael N.; Lazarus, Sophie A.; Stepp, Stephanie D.; Pilkonis, Paul A.

    2016-01-01

    Relationships are among the most salient factors affecting happiness and wellbeing for individuals and families. Relationship science has identified the study of dyadic behavioral patterns between couple members during conflict as an important window into relational functioning, with both short-term and long-term consequences. Several methods have been developed for the momentary assessment of behavior during interpersonal transactions. Among these, the most popular is the Specific Affect Coding System (SPAFF), which organizes social behavior into a set of discrete behavioral constructs. This study examines the interpersonal meaning of the SPAFF codes through the lens of interpersonal theory, which uses the fundamental dimensions of Dominance and Affiliation to organize interpersonal behavior. A sample of 67 couples completed a conflict task, which was video recorded and coded using SPAFF and a method for rating momentary interpersonal behavior, the Continuous Assessment of Interpersonal Dynamics (CAID). Actor-partner interdependence models in a multilevel structural equation modeling framework were used to study the covariation of SPAFF codes and CAID ratings. Results showed that a number of SPAFF codes had clear interpersonal signatures, but many did not. Additionally, actor and partner effects for the same codes were strongly consistent with interpersonal theory’s principle of complementarity. Thus, findings reveal points of convergence and divergence in the two systems and provide support for central tenets of interpersonal theory. Future directions based on these initial findings are discussed. PMID:27148786

  17. Implementation and evaluation of a simulation curriculum for paediatric residency programs including just-in-time in situ mock codes

    PubMed Central

    Sam, Jonathan; Pierse, Michael; Al-Qahtani, Abdullah; Cheng, Adam

    2012-01-01

    OBJECTIVE: To develop, implement and evaluate a simulation-based acute care curriculum in a paediatric residency program using an integrated and longitudinal approach. DESIGN: Curriculum framework consisting of three modular, year-specific courses and longitudinal just-in-time, in situ mock codes. SETTING: Paediatric residency program at BC Children’s Hospital, Vancouver, British Columbia. INTERVENTIONS: The three year-specific courses focused on the critical first 5 min, complex medical management and crisis resource management, respectively. The just-in-time in situ mock codes simulated the acute deterioration of an existing ward patient, prepared the actual multidisciplinary code team, and primed the surrounding crisis support systems. Each curriculum component was evaluated with surveys using a five-point Likert scale. RESULTS: A total of 40 resident surveys were completed after each of the modular courses, and an additional 28 surveys were completed for the overall simulation curriculum. The highest Likert scores were for hands-on skill stations, immersive simulation environment and crisis resource management teaching. Survey results also suggested that just-in-time mock codes were realistic, reinforced learning, and prepared ward teams for patient deterioration. CONCLUSIONS: A simulation-based acute care curriculum was successfully integrated into a paediatric residency program. It provides a model for integrating simulation-based learning into other training programs, as well as a model for any hospital that wishes to improve paediatric resuscitation outcomes using just-in-time in situ mock codes. PMID:23372405

  18. Momentary patterns of covariation between specific affects and interpersonal behavior: Linking relationship science and personality assessment.

    PubMed

    Ross, Jaclyn M; Girard, Jeffrey M; Wright, Aidan G C; Beeney, Joseph E; Scott, Lori N; Hallquist, Michael N; Lazarus, Sophie A; Stepp, Stephanie D; Pilkonis, Paul A

    2017-02-01

    Relationships are among the most salient factors affecting happiness and wellbeing for individuals and families. Relationship science has identified the study of dyadic behavioral patterns between couple members during conflict as an important window into relational functioning, with both short-term and long-term consequences. Several methods have been developed for the momentary assessment of behavior during interpersonal transactions. Among these, the most popular is the Specific Affect Coding System (SPAFF), which organizes social behavior into a set of discrete behavioral constructs. This study examines the interpersonal meaning of the SPAFF codes through the lens of interpersonal theory, which uses the fundamental dimensions of Dominance and Affiliation to organize interpersonal behavior. A sample of 67 couples completed a conflict task, which was video recorded and coded using SPAFF and a method for rating momentary interpersonal behavior, the Continuous Assessment of Interpersonal Dynamics (CAID). Actor-partner interdependence models in a multilevel structural equation modeling framework were used to study the covariation of SPAFF codes and CAID ratings. Results showed that a number of SPAFF codes had clear interpersonal signatures, but many did not. Additionally, actor and partner effects for the same codes were strongly consistent with interpersonal theory's principle of complementarity. Thus, findings reveal points of convergence and divergence in the two systems and provide support for central tenets of interpersonal theory. Future directions based on these initial findings are discussed. (PsycINFO Database Record (c) 2017 APA, all rights reserved).

  19. The Nuclear Energy Knowledge and Validation Center Summary of Activities Conducted in FY16

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gougar, Hans David

    The Nuclear Energy Knowledge and Validation Center (NEKVaC) is a new initiative by the Department of Energy (DOE) and Idaho National Laboratory (INL) to coordinate and focus the resources and expertise that exist within the DOE toward solving issues in modern nuclear code validation and knowledge management. In time, code owners, users, and developers will view the NEKVaC as a partner and essential resource for acquiring the best practices and latest techniques for validating codes, providing guidance in planning and executing experiments, facilitating access to and maximizing the usefulness of existing data, and preserving knowledge for continual use by nuclear professionals and organizations for their own validation needs. The scope of the NEKVaC covers many interrelated activities that will need to be cultivated carefully in the near term and managed properly once the NEKVaC is fully functional. Three areas comprise the principal mission: (1) identify and prioritize projects that extend the field of validation science and its application to modern codes, (2) develop and disseminate best practices and guidelines for high-fidelity multiphysics/multiscale analysis code development and associated experiment design, and (3) define protocols for data acquisition and knowledge preservation and provide a portal for access to databases currently scattered among numerous organizations. These mission areas, while each having a unique focus, are interdependent and complementary. Likewise, all activities supported by the NEKVaC, both near term and long term, must possess elements supporting all three areas. This cross-cutting nature is essential to ensuring that activities and supporting personnel do not become “stove piped” (i.e., so focused on a specific function that the activity itself becomes the objective rather than achievement of the larger vision). This report begins with a description of the mission areas; specifically, the role played by each major committee and the types of activities for which they are responsible. It then lists and describes the proposed near-term tasks upon which future efforts can build.

  20. PCG: A prototype incremental compilation facility for the SAGA environment, appendix F

    NASA Technical Reports Server (NTRS)

    Kimball, Joseph John

    1985-01-01

    A programming environment supports the activity of developing and maintaining software. New environments provide language-oriented tools such as syntax-directed editors, whose usefulness is enhanced because they embody language-specific knowledge. When syntactic and semantic analysis occur early in the cycle of program production, that is, during editing, the use of a standard compiler is inefficient, for it must re-analyze the program before generating code. Likewise, it is inefficient to recompile an entire file when the editor can determine that only portions of it need updating. The pcg, or Pascal code generation, facility described here generates code directly from the syntax trees produced by the SAGA syntax-directed Pascal editor. By preserving the intermediate code used in the previous compilation, it can limit recompilation to the routines actually modified by editing.
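
    The core idea, regenerating code only for routines whose syntax subtrees have changed, can be sketched in a few lines of Python; the cache keying and code generator below are hypothetical stand-ins for pcg's actual mechanism.

      import hashlib

      code_cache = {}  # routine name -> (subtree hash, generated code)

      def subtree_hash(tree):
          # 'tree' is any printable syntax-subtree representation
          return hashlib.sha256(repr(tree).encode()).hexdigest()

      def generate_code(tree):
          return f"; object code for {tree}"  # stand-in code generator

      def compile_routine(name, tree):
          h = subtree_hash(tree)
          cached = code_cache.get(name)
          if cached and cached[0] == h:
              return cached[1]            # unchanged routine: reuse old code
          code = generate_code(tree)      # modified routine: regenerate it only
          code_cache[name] = (h, code)
          return code

      # Editing only 'bar' triggers one regeneration, not a file-wide recompile
      compile_routine("foo", ("proc", "foo", "body-v1"))
      compile_routine("bar", ("proc", "bar", "body-v1"))
      compile_routine("bar", ("proc", "bar", "body-v2"))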

  1. The use of the SRIM code for calculation of radiation damage induced by neutrons

    NASA Astrophysics Data System (ADS)

    Mohammadi, A.; Hamidi, S.; Asadabad, Mohsen Asadi

    2017-12-01

    Materials subjected to neutron irradiation undergo structural changes driven by the displacement cascades initiated by nuclear reactions. This study discusses a methodology to compute the primary knock-on atom (PKA) information that leads to radiation damage. A program, AMTRACK, has been developed for assessing PKA information. This software determines the specifications of recoil atoms (using the PTRAC card of the MCNPX code) as well as the kinematics of the interactions. A deterministic method was used to verify the results of (MCNPX+AMTRACK). The SRIM (formerly TRIM) code is capable of computing neutron radiation damage. The PKA information extracted by AMTRACK can be used as input to the SRIM code for systematic analysis of primary radiation damage. Radiation damage to the reactor pressure vessel of the Bushehr Nuclear Power Plant (BNPP) is then calculated.
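
    Once PKA damage energies are available, a standard post-processing step (shown here as an illustration, not necessarily AMTRACK's own algorithm) is the Norgett-Robinson-Torrens (NRT) displacement estimate, sketched below with a typical threshold energy for iron.

      # NRT estimate of stable displacements produced by one PKA
      def nrt_displacements(damage_energy_eV, E_d_eV=40.0):
          # E_d is the displacement threshold energy (~40 eV for iron)
          if damage_energy_eV < E_d_eV:
              return 0.0
          if damage_energy_eV < 2.5 * E_d_eV:
              return 1.0
          return 0.8 * damage_energy_eV / (2.0 * E_d_eV)

      # A PKA depositing 10 keV of damage energy yields ~100 displacements
      print(nrt_displacements(1.0e4))  # -> 100.0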

  2. Beyond Molecular Codes: Simple Rules to Wire Complex Brains

    PubMed Central

    Hassan, Bassem A.; Hiesinger, P. Robin

    2015-01-01

    Molecular codes, like postal zip codes, are generally considered a robust way to ensure the specificity of neuronal target selection. However, a code capable of unambiguously generating complex neural circuits is difficult to conceive. Here, we re-examine the notion of molecular codes in the light of developmental algorithms. We explore how molecules and mechanisms that have been considered part of a code may alternatively implement simple pattern formation rules sufficient to ensure wiring specificity in neural circuits. This analysis delineates a pattern-based framework for circuit construction that may contribute to our understanding of brain wiring. PMID:26451480

  3. AMPS/PC - AUTOMATIC MANUFACTURING PROGRAMMING SYSTEM

    NASA Technical Reports Server (NTRS)

    Schroer, B. J.

    1994-01-01

    The AMPS/PC system is a simulation tool designed to aid the user in defining the specifications of a manufacturing environment and then automatically writing code for the target simulation language, GPSS/PC. The domain of problems that AMPS/PC can simulate comprises manufacturing assembly lines with subassembly lines and manufacturing cells. The user defines the problem domain by responding to questions from the interface program. Based on the responses, the interface program creates an internal problem specification file. This file includes the manufacturing process network flow and the attributes of all stations, cells, and stock points. AMPS then uses the problem specification file as input to the automatic code generator program to produce a simulation program in the target language, GPSS. The output of the generator program is the source code of the corresponding GPSS/PC simulation program. The system runs entirely on an IBM PC running PC DOS Version 2.0 or higher and is written in Turbo Pascal Version 4, requiring 640K of memory and one 360K disk drive. To execute the GPSS program, the PC must have the GPSS/PC System Version 2.0 from Minuteman Software resident. The AMPS/PC program was developed in 1988.

  4. Modular adeno-associated virus (rAAV) vectors used for cellular virus-directed enzyme prodrug therapy

    PubMed Central

    Hagen, Sven; Baumann, Tobias; Wagner, Hanna J.; Morath, Volker; Kaufmann, Beate; Fischer, Adrian; Bergmann, Stefan; Schindler, Patrick; Arndt, Katja M.; Müller, Kristian M.

    2014-01-01

    The pre-clinical and clinical development of viral vehicles for gene transfer increased in recent years, and a recombinant adeno-associated virus (rAAV) drug took center stage upon approval in the European Union. However, lack of standardization, inefficient purification methods and complicated retargeting limit general usability. We address these obstacles by fusing rAAV-2 capsids with two modular targeting molecules (DARPin or Affibody) specific for a cancer cell-surface marker (EGFR) while simultaneously including an affinity tag (His-tag) in a surface-exposed loop. Equipping these particles with genes coding for prodrug converting enzymes (thymidine kinase or cytosine deaminase) we demonstrate tumor marker specific transduction and prodrug-dependent apoptosis of cancer cells. Coding terminal and loop modifications in one gene enabled specific and scalable purification. Our genetic parts for viral production adhere to a standardized cloning strategy facilitating rapid prototyping of virus directed enzyme prodrug therapy (VDEPT). PMID:24457557

  5. Laminar fMRI and computational theories of brain function.

    PubMed

    Stephan, K E; Petzschner, F H; Kasper, L; Bayer, J; Wellstein, K V; Stefanics, G; Pruessmann, K P; Heinzle, J

    2017-11-02

    Recently developed methods for functional MRI at the resolution of cortical layers (laminar fMRI) offer a novel window into neurophysiological mechanisms of cortical activity. Beyond physiology, laminar fMRI also offers an unprecedented opportunity to test influential theories of brain function. Specifically, hierarchical Bayesian theories of brain function, such as predictive coding, assign specific computational roles to different cortical layers. Combined with computational models, laminar fMRI offers a unique opportunity to test these proposals noninvasively in humans. This review provides a brief overview of predictive coding and related hierarchical Bayesian theories, summarises their predictions with regard to layered cortical computations, examines how these predictions could be tested by laminar fMRI, and considers methodological challenges. We conclude by discussing the potential of laminar fMRI for clinically useful computational assays of layer-specific information processing. Copyright © 2017 Elsevier Inc. All rights reserved.

  6. The development, evolution, and modifications of ICD-10: challenges to the international comparability of morbidity data.

    PubMed

    Jetté, Nathalie; Quan, Hude; Hemmelgarn, Brenda; Drosler, Saskia; Maass, Christina; Moskal, Lori; Paoin, Wansa; Sundararajan, Vijaya; Gao, Song; Jakob, Robert; Ustün, Bedihran; Ghali, William A

    2010-12-01

    The United States is about to make a major nationwide transition from ICD-9-CM coding of hospital discharges to ICD-10-CM, a country-specific modification of the World Health Organization's ICD-10. As this transition occurs, the WHO is already in the midst of developing ICD-11. Given this context, we undertook this review to discuss: (1) the history of the International Classification of Diseases (a core information "building block" for health systems everywhere) from its introduction to the current era of ICD-11 development; (2) differences across country-specific ICD-10 clinical modifications and the challenges that these differences pose to the international comparability of morbidity data; (3) potential strategic approaches to achieving better international ICD-11 comparability. A literature review and stakeholder consultation was carried out. The various ICD-10 clinical modifications (ICD-10-AM [Australia], ICD-10-CA [Canada], ICD-10-GM [Germany], ICD-10-TM [Thailand], ICD-10-CM [United States]) were compared. These ICD-10 modifications differ in their number of codes, chapters, and subcategories. Specific conditions are present in some but not all of the modifications. ICD-11, with a similar structure to ICD-10, will function in an electronic health records environment and also provide disease descriptive characteristics (eg, causal properties, functional impact, and treatment). The threat to the comparability of international clinical morbidity is growing with the development of many country-specific ICD-10 versions. One solution to this threat is to develop a meta-database including all country-specific modifications to ensure more efficient use of people and resources, decrease omissions and errors but most importantly provide a platform for future ICD updates.

  7. Neuronal cell fate specification by the molecular convergence of different spatio-temporal cues on a common initiator terminal selector gene

    PubMed Central

    Stratmann, Johannes

    2017-01-01

    The extensive genetic regulatory flows underlying specification of different neuronal subtypes are not well understood at the molecular level. The Nplp1 neuropeptide neurons in the developing Drosophila nerve cord belong to two sub-classes; Tv1 and dAp neurons, generated by two distinct progenitors. Nplp1 neurons are specified by spatial cues; the Hox homeotic network and GATA factor grn, and temporal cues; the hb -> Kr -> Pdm -> cas -> grh temporal cascade. These spatio-temporal cues combine into two distinct codes; one for Tv1 and one for dAp neurons that activate a common terminal selector feedforward cascade of col -> ap/eya -> dimm -> Nplp1. Here, we molecularly decode the specification of Nplp1 neurons, and find that the cis-regulatory organization of col functions as an integratory node for the different spatio-temporal combinatorial codes. These findings may provide a logical framework for addressing spatio-temporal control of neuronal sub-type specification in other systems. PMID:28414802

  8. Domain Specific Language Support for Exascale. Final Project Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Baden, Scott

    The project developed a domain-specific translator that enables legacy MPI source code to tolerate communication delays, which are increasing over time due to technological factors. The translator performs source-to-source translation that incorporates semantic information into the translation process. Its output is a C program that runs in a data-driven fashion and uses an existing runtime to overlap communication automatically.
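
    The overlap pattern the generated code targets can be sketched by hand; the fragment below uses Python's mpi4py (rather than the translator's C output) to post a nonblocking halo exchange, compute on the interior while messages are in flight, and wait only before touching boundary data. Array sizes and the ring topology are illustrative.

      from mpi4py import MPI
      import numpy as np

      comm = MPI.COMM_WORLD
      rank, size = comm.Get_rank(), comm.Get_size()
      left, right = (rank - 1) % size, (rank + 1) % size

      u = np.random.rand(1000)
      halo_left, halo_right = np.empty(1), np.empty(1)

      # Post nonblocking sends/receives for the boundary cells
      reqs = [comm.Isend(u[:1], dest=left),
              comm.Isend(u[-1:], dest=right),
              comm.Irecv(halo_left, source=left),
              comm.Irecv(halo_right, source=right)]

      interior = 0.5 * (u[2:] + u[:-2])  # interior work overlaps communication

      MPI.Request.Waitall(reqs)          # block only before boundary work
      u[0], u[-1] = halo_left[0], halo_right[0]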

  9. Host computer software specifications for a zero-g payload manhandling simulator

    NASA Technical Reports Server (NTRS)

    Wilson, S. W.

    1986-01-01

    The HP PASCAL source code was developed for the Mission Planning and Analysis Division (MPAD) of NASA/JSC and takes the place of detailed flow charts in defining the host computer software specifications for MANHANDLE, a digital/graphical simulator that can be used to analyze the dynamics of on-orbit (zero-g) payload manhandling operations. Input and output data for representative test cases are included.

  10. General object-oriented software development

    NASA Technical Reports Server (NTRS)

    Seidewitz, Edwin V.; Stark, Mike

    1986-01-01

    Object-oriented design techniques are gaining increasing popularity for use with the Ada programming language. A general approach to object-oriented design is presented which synthesizes the principles of previous object-oriented methods into the overall software life cycle, providing transitions from specification to design and from design to code. It therefore provides the basis for a general object-oriented development methodology.

  11. Flight Experiment Demonstration System (FEDS): Mathematical specification

    NASA Technical Reports Server (NTRS)

    Shank, D. E.

    1984-01-01

    Computational models for the flight experiment demonstration system (FEDS) code 580 were developed. The FEDS is a modification of the automated orbit determination system which was developed during 1981 and 1982. The purpose of FEDS is to demonstrate, in a simulated spacecraft environment, the feasibility of using microprocessors to perform onboard orbit determination with limited ground support.

  12. From Verified Models to Verifiable Code

    NASA Technical Reports Server (NTRS)

    Lensink, Leonard; Munoz, Cesar A.; Goodloe, Alwyn E.

    2009-01-01

    Declarative specifications of digital systems often contain parts that can be automatically translated into executable code. Automated code generation may reduce or eliminate the kinds of errors typically introduced through manual code writing. For this approach to be effective, the generated code should be reasonably efficient and, more importantly, verifiable. This paper presents a prototype code generator for the Prototype Verification System (PVS) that translates a subset of PVS functional specifications into an intermediate language and subsequently to multiple target programming languages. Several case studies are presented to illustrate the tool's functionality. The generated code can be analyzed by software verification tools such as verification condition generators, static analyzers, and software model-checkers to increase the confidence that the generated code is correct.

  13. Conversion of the agent-oriented domain-specific language ALAS into JavaScript

    NASA Astrophysics Data System (ADS)

    Sredojević, Dejan; Vidaković, Milan; Okanović, Dušan; Mitrović, Dejan; Ivanović, Mirjana

    2016-06-01

    This paper describes the generation of JavaScript code from code written in the agent-oriented domain-specific language ALAS. ALAS is an agent-oriented domain-specific language for writing software agents that are executed within the XJAF middleware. Since the agents can be executed on various platforms, they must be converted into a language of the target platform. We also utilize existing tools and technologies to make the whole conversion process as simple, fast, and efficient as possible. We use the Xtext framework, which is compatible with Java, to implement the ALAS infrastructure - editor and code generator. Since Xtext supports Java, generation of Java code from ALAS code is straightforward. To generate JavaScript code that will be executed within the target JavaScript XJAF implementation, the Google Web Toolkit (GWT) is used.

  14. The semantic specificity of gestures when verbal communication is not possible: the case of emergency evacuation.

    PubMed

    Prati, Gabriele; Pietrantoni, Luca

    2013-01-01

    The aim of the present study was to examine the comprehension of gesture in a situation in which the communicator cannot (or can only with difficulty) use verbal communication. Based on theoretical considerations, we expected to obtain higher semantic comprehension for emblems (gestures with a direct verbal definition or translation that is well known by all members of a group, or culture) compared to illustrators (gestures regarded as spontaneous and idiosyncratic and that do not have a conventional definition). Based on the extant literature, we predicted higher semantic specificity associated with arbitrarily coded and iconically coded emblems compared to intrinsically coded illustrators. Using a scenario of emergency evacuation, we tested the difference in semantic specificity between different categories of gestures. 138 participants saw 10 videos each illustrating a gesture performed by a firefighter. They were requested to imagine themselves in a dangerous situation and to report the meaning associated with each gesture. The results showed that intrinsically coded illustrators were more successfully understood than arbitrarily coded emblems, probably because the meaning of intrinsically coded illustrators is immediately comprehensible without recourse to symbolic interpretation. Furthermore, there was no significant difference between the comprehension of iconically coded emblems and that of both arbitrarily coded emblems and intrinsically coded illustrators. It seems that the difference between the latter two types of gestures was supported by their difference in semantic specificity, although in a direction opposite to that predicted. These results are in line with those of Hadar and Pinchas-Zamir (2004), which showed that iconic gestures have higher semantic specificity than conventional gestures.

  15. Final Report A Multi-Language Environment For Programmable Code Optimization and Empirical Tuning

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yi, Qing; Whaley, Richard Clint; Qasem, Apan

    This report summarizes our effort and results in building an integrated optimization environment to effectively combine the programmable control and the empirical tuning of source-to-source compiler optimizations within the framework of multiple existing languages, specifically C, C++, and Fortran. The environment contains two main components: the ROSE analysis engine, which is based on the ROSE C/C++/Fortran2003 source-to-source compiler developed by Co-PI Dr. Quinlan et al. at DOE/LLNL, and the POET transformation engine, which is based on an interpreted program transformation language developed by Dr. Yi at the University of Texas at San Antonio (UTSA). The ROSE analysis engine performs advanced compiler analysis, identifies profitable code transformations, and then produces output in POET, a language designed to provide programmable control of compiler optimizations to application developers and to support the parameterization of architecture-sensitive optimizations so that their configurations can be empirically tuned later. This POET output can then be ported to different machines together with the user application, where a POET-based search engine empirically reconfigures the parameterized optimizations until satisfactory performance is found. Computational specialists can write POET scripts to directly control the optimization of their code. Application developers can interact with ROSE to obtain optimization feedback as well as provide domain-specific knowledge and high-level optimization strategies. The optimization environment is expected to support different levels of automation and programmer intervention, from fully automated tuning to semi-automated development to manual programmable control.
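
    The empirical-tuning half of this workflow reduces to a search loop over parameterized builds. The Python sketch below shows the generic pattern with hypothetical make targets and parameter names; in the actual environment, the search engine reconfigures POET scripts instead.

      import itertools, subprocess, time

      TILES, UNROLLS = [16, 32, 64], [1, 2, 4]  # hypothetical tuning parameters

      def build_and_time(tile, unroll):
          # Rebuild the parameterized kernel, then time one run of it
          subprocess.run(["make", "kernel", f"TILE={tile}", f"UNROLL={unroll}"],
                         check=True, capture_output=True)
          start = time.perf_counter()
          subprocess.run(["./kernel"], check=True, capture_output=True)
          return time.perf_counter() - start

      best = min(itertools.product(TILES, UNROLLS),
                 key=lambda cfg: build_and_time(*cfg))
      print("best (tile, unroll):", best)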

  16. Interface requirements to couple thermal-hydraulic codes to severe accident codes: ATHLET-CD

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Trambauer, K.

    1997-07-01

    The system code ATHLET-CD is being developed by GRS in cooperation with IKE and IPSN. Its field of application comprises the whole spectrum of leaks and large breaks, as well as operational and abnormal transients for LWRs and VVERs. At present the analyses cover the in-vessel thermal-hydraulics, the early phases of core degradation, and fission product and aerosol release from the core and their transport in the reactor coolant system. The aim of the code development is to extend the simulation of core degradation up to failure of the reactor pressure vessel and to cover all physically reasonable accident sequences for western and eastern LWRs, including RBMKs. The ATHLET-CD structure is highly modular in order to include a manifold spectrum of models and to offer an optimum basis for further development. The code consists of four general modules describing the reactor coolant system thermal-hydraulics, the core degradation, the fission product release from the core, and fission product and aerosol transport. Each general module consists of basic modules corresponding to the process to be simulated or to its specific purpose. Besides the code structure based on the physical modelling, the code follows four strictly separated steps during the course of a calculation: (1) input of structure, geometrical data, and initial and boundary conditions; (2) initialization of derived quantities; (3) steady-state calculation or input of restart data; and (4) transient calculation. In this paper, the transient solution method is briefly presented and the coupling methods are discussed. Three aspects have to be considered for the coupling of different modules in one code system. The first is the conservation of mass and energy in the different subsystems (fluid, structures, and fission products and aerosols). The second is the convergence of the numerical solution and the stability of the calculation. The third aspect is related to code performance and running time.

  17. Comparison of numerical techniques for integration of stiff ordinary differential equations arising in combustion chemistry

    NASA Technical Reports Server (NTRS)

    Radhakrishnan, K.

    1984-01-01

    The efficiency and accuracy of several algorithms recently developed for the efficient numerical integration of stiff ordinary differential equations are compared. The methods examined include two general-purpose codes, EPISODE and LSODE, and three codes (CHEMEQ, CREK1D, and GCKP84) developed specifically to integrate chemical kinetic rate equations. The codes are applied to two test problems drawn from combustion kinetics. The comparisons show that LSODE is the fastest code currently available for the integration of combustion kinetic rate equations. An important finding is that an iterative solution of the algebraic energy conservation equation to compute the temperature does not result in significant errors. In addition, this method is more efficient than evaluating the temperature by integrating its time derivative. Significant reductions in computational work are realized by updating the rate constants (k = A T^N exp(-E/RT)) only when the temperature change exceeds an amount delta T that is problem dependent. An approximate expression for the automatic evaluation of delta T is derived and shown to result in increased efficiency.
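
    The practical difference between stiff and non-stiff integrators is easy to demonstrate with SciPy's BDF method, a descendant of the same backward-differentiation family as LSODE, applied to Robertson's classic stiff kinetics problem (an illustration, not one of the paper's combustion test problems).

      import numpy as np
      from scipy.integrate import solve_ivp

      def robertson(t, y):
          # Three-species stiff kinetics; rate constants span nine decades
          y1, y2, y3 = y
          return [-0.04*y1 + 1.0e4*y2*y3,
                   0.04*y1 - 1.0e4*y2*y3 - 3.0e7*y2**2,
                   3.0e7*y2**2]

      sol = solve_ivp(robertson, (0.0, 1.0e5), [1.0, 0.0, 0.0],
                      method="BDF", rtol=1e-6, atol=1e-10)
      print(sol.y[:, -1])  # mass shifts almost entirely into the third species
      # An explicit method such as 'RK45' needs vastly more steps here.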

  18. A comparison between EGS4 and MCNP computer modeling of an in vivo X-ray fluorescence system.

    PubMed

    Al-Ghorabie, F H; Natto, S S; Al-Lyhiani, S H

    2001-03-01

    The Monte Carlo computer codes EGS4 and MCNP were used to develop a theoretical model of a 180 degree geometry in vivo X-ray fluorescence system for the measurement of platinum concentration in head and neck tumors. The model included specification of the photon source, collimators, phantoms, and detector. Theoretical results were compared and evaluated against X-ray fluorescence data obtained experimentally from an existing system developed by the Swansea In Vivo Analysis and Cancer Research Group. The EGS4 results agreed well with the MCNP results. However, agreement between the measured spectral shape obtained using the experimental X-ray fluorescence system and the simulated spectral shape obtained using the two Monte Carlo codes was relatively poor. The main reason for the disagreement arises from a basic assumption shared by the two codes: both assume a "free" electron model for Compton interactions. This assumption underestimates the results and invalidates direct comparison between predicted and experimental spectra.

  19. Aspect-Oriented Programming

    NASA Technical Reports Server (NTRS)

    Elrad, Tzilla (Editor); Filman, Robert E. (Editor); Bader, Atef (Editor)

    2001-01-01

    Computer science has experienced an evolution in programming languages and systems from the crude assembly and machine codes of the earliest computers through concepts such as formula translation, procedural programming, structured programming, functional programming, logic programming, and programming with abstract data types. Each of these steps in programming technology has advanced our ability to achieve clear separation of concerns at the source code level. Currently, the dominant programming paradigm is object-oriented programming - the idea that one builds a software system by decomposing a problem into objects and then writing the code of those objects. Such objects abstract together behavior and data into a single conceptual and physical entity. Object-orientation is reflected in the entire spectrum of current software development methodologies and tools - we have OO methodologies, analysis and design tools, and OO programming languages. Writing complex applications such as graphical user interfaces, operating systems, and distributed applications while maintaining comprehensible source code has been made possible with OOP. Success at developing simpler systems leads to aspirations for greater complexity. Object orientation is a clever idea, but has certain limitations. We are now seeing that many requirements do not decompose neatly into behavior centered on a single locus. Object technology has difficulty localizing concerns involving global constraints and pandemic behaviors, appropriately segregating concerns, and applying domain-specific knowledge. Post-object programming (POP) mechanisms that look to increase the expressiveness of the OO paradigm are a fertile arena for current research. Examples of POP technologies include domain-specific languages, generative programming, generic programming, constraint languages, reflection and metaprogramming, feature-oriented development, views/viewpoints, and asynchronous message brokering. (Czarnecki and Eisenecker's book includes a good survey of many of these technologies.)

  20. Administrative database code accuracy did not vary notably with changes in disease prevalence.

    PubMed

    van Walraven, Carl; English, Shane; Austin, Peter C

    2016-11-01

    Previous mathematical analyses of diagnostic tests based on the categorization of a continuous measure have found that test sensitivity and specificity vary significantly with disease prevalence. This study determined whether the accuracy of diagnostic codes varied with disease prevalence. We used data from two previous studies in which the true status of renal disease and primary subarachnoid hemorrhage, respectively, had been determined. In multiple stratified random samples from the two previous studies with varying disease prevalence, we measured the accuracy of diagnostic codes for each disease using sensitivity, specificity, and positive and negative predictive value. Diagnostic code sensitivity and specificity did not change notably within clinically sensible ranges of disease prevalence. In contrast, positive and negative predictive values changed significantly with disease prevalence. Disease prevalence had no important influence on the sensitivity and specificity of diagnostic codes in administrative databases. Copyright © 2016 Elsevier Inc. All rights reserved.
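
    The contrast follows directly from Bayes' theorem: sensitivity and specificity condition on true disease status, while the predictive values condition on the test result and therefore absorb the prevalence. The short sketch below uses illustrative accuracy values, not figures from the study.

      def predictive_values(sens, spec, prev):
          # Bayes' theorem applied to a dichotomous diagnostic code
          ppv = sens*prev / (sens*prev + (1 - spec)*(1 - prev))
          npv = spec*(1 - prev) / (spec*(1 - prev) + (1 - sens)*prev)
          return ppv, npv

      for prev in (0.01, 0.05, 0.20, 0.50):
          ppv, npv = predictive_values(0.85, 0.95, prev)
          print(f"prevalence {prev:4.0%}: PPV {ppv:.2f}, NPV {npv:.2f}")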

  1. Current Development Status of an Integrated Tool for Modeling Quasi-static Deformation in the Solid Earth

    NASA Astrophysics Data System (ADS)

    Williams, C. A.; Dicaprio, C.; Simons, M.

    2003-12-01

    With the advent of projects such as the Plate Boundary Observatory and future InSAR missions, spatially dense geodetic data of high quality will provide an increasingly detailed picture of the movement of the earth's surface. To interpret such information, powerful and easily accessible modeling tools are required. We are presently developing such a tool that we feel will meet many of the needs for evaluating quasi-static earth deformation. As a starting point, we begin with a modified version of the finite element code TECTON, which has been specifically designed to solve tectonic problems involving faulting and viscoelastic/plastic earth behavior. As our first priority, we are integrating the code into the GeoFramework, which is an extension of the Python-based Pyre modeling framework. The goal of this framework is to provide simplified user interfaces for powerful modeling codes, to provide easy access to utilities such as meshers and visualization tools, and to provide a tight integration between different modeling tools so they can interact with each other. The initial integration of the code into this framework is essentially complete, and a more thorough integration, where Python-based drivers control the entire solution, will be completed in the near future. We have an evolving set of priorities that we expect to solidify as we receive more input from the modeling community. Current priorities include the development of linear and quadratic tetrahedral elements, the development of a parallelized version of the code using the PETSc libraries, the addition of more complex rheologies, realistic fault friction models, adaptive time stepping, and spherical geometries. In this presentation we describe current progress toward our various priorities, briefly describe the structure of the code within the GeoFramework, and demonstrate some sample applications.

  2. A Comparison of Athletic Movement Among Talent-Identified Juniors From Different Football Codes in Australia: Implications for Talent Development.

    PubMed

    Woods, Carl T; Keller, Brad S; McKeown, Ian; Robertson, Sam

    2016-09-01

    Woods, CT, Keller, BS, McKeown, I, and Robertson, S. A comparison of athletic movement among talent-identified juniors from different football codes in Australia: implications for talent development. J Strength Cond Res 30(9): 2440-2445, 2016-This study aimed to compare the athletic movement skill of talent-identified (TID) junior Australian Rules football (ARF) and soccer players. The athletic movement skill of 17 TID junior ARF players (17.5-18.3 years) was compared against 17 TID junior soccer players (17.9-18.7 years). Players in both groups were members of an elite junior talent development program within their respective football codes. All players performed an athletic movement assessment that included an overhead squat, double lunge, single-leg Romanian deadlift (both movements performed on right and left legs), a push-up, and a chin-up. Each movement was scored across 3 essential assessment criteria using a 3-point scale. The total score for each movement (maximum of 9) and the overall total score (maximum of 63) were used as the criterion variables for analysis. A multivariate analysis of variance tested the main effect of football code (2 levels) on the criterion variables, whereas a 1-way analysis of variance identified where differences occurred. A significant effect was noted, with the TID junior ARF players outscoring their soccer counterparts when performing the overhead squat and push-up. No other criterions significantly differed according to the main effect. Practitioners should be aware that specific sporting requirements may incur slight differences in athletic movement skill among TID juniors from different football codes. However, given the low athletic movement skill noted in both football codes, developmental coaches should address the underlying movement skill capabilities of juniors when prescribing physical training in both codes.

  3. An expanding universe of the non-coding genome in cancer biology.

    PubMed

    Xue, Bin; He, Lin

    2014-06-01

    Neoplastic transformation is caused by accumulation of genetic and epigenetic alterations that ultimately convert normal cells into tumor cells with uncontrolled proliferation and survival, unlimited replicative potential and invasive growth [Hanahan,D. et al. (2011) Hallmarks of cancer: the next generation. Cell, 144, 646-674]. Although the majority of the cancer studies have focused on the functions of protein-coding genes, emerging evidence has started to reveal the importance of the vast non-coding genome, which constitutes more than 98% of the human genome. A number of non-coding RNAs (ncRNAs) derived from the 'dark matter' of the human genome exhibit cancer-specific differential expression and/or genomic alterations, and it is increasingly clear that ncRNAs, including small ncRNAs and long ncRNAs (lncRNAs), play an important role in cancer development by regulating protein-coding gene expression through diverse mechanisms. In addition to ncRNAs, nearly half of the mammalian genomes consist of transposable elements, particularly retrotransposons. Once depicted as selfish genomic parasites that propagate at the expense of host fitness, retrotransposon elements could also confer regulatory complexity to the host genomes during development and disease. Reactivation of retrotransposons in cancer, while capable of causing insertional mutagenesis and genome rearrangements to promote oncogenesis, could also alter host gene expression networks to favor tumor development. Taken together, the functional significance of non-coding genome in tumorigenesis has been previously underestimated, and diverse transcripts derived from the non-coding genome could act as integral functional components of the oncogene and tumor suppressor network. © The Author 2014. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.

  4. Peridigm summary report : lessons learned in development with agile components.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Salinger, Andrew Gerhard; Mitchell, John Anthony; Littlewood, David John

    2011-09-01

    This report details efforts to deploy Agile Components for rapid development of a peridynamics code, Peridigm. The goal of Agile Components is to enable the efficient development of production-quality software by providing a well-defined, unifying interface to a powerful set of component-based software. Specifically, Agile Components facilitate interoperability among packages within the Trilinos Project, including data management, time integration, uncertainty quantification, and optimization. Development of the Peridigm code served as a testbed for Agile Components and resulted in a number of recommendations for future development. Agile Components successfully enabled rapid integration of Trilinos packages into Peridigm. A cost of this approach, however, was a set of restrictions on Peridigm's architecture which impacted the ability to track history-dependent material data, dynamically modify the model discretization, and interject user-defined routines into the time integration algorithm. These restrictions resulted in modifications to the Agile Components approach, as implemented in Peridigm, and in a set of recommendations for future Agile Components development. Specific recommendations include improved handling of material states, a more flexible flow control model, and improved documentation. A demonstration mini-application, SimpleODE, was developed at the onset of this project and is offered as a potential supplement to Agile Components documentation.

  5. Xyce parallel electronic simulator : users' guide.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mei, Ting; Rankin, Eric Lamont; Thornquist, Heidi K.

    2011-05-01

    This manual describes the use of the Xyce Parallel Electronic Simulator. Xyce has been designed as a SPICE-compatible, high-performance analog circuit simulator, and has been written to support the simulation needs of the Sandia National Laboratories electrical designers. This development has focused on improving capability over the current state-of-the-art in the following areas: (1) Capability to solve extremely large circuit problems by supporting large-scale parallel computing platforms (up to thousands of processors). Note that this includes support for most popular parallel and serial computers; (2) Improved performance for all numerical kernels (e.g., time integrator, nonlinear and linear solvers) through state-of-the-art algorithms and novel techniques; (3) Device models which are specifically tailored to meet Sandia's needs, including some radiation-aware devices (for Sandia users only); and (4) Object-oriented code design and implementation using modern coding practices that ensure that the Xyce Parallel Electronic Simulator will be maintainable and extensible far into the future. Xyce is a parallel code in the most general sense of the phrase - a message passing parallel implementation - which allows it to run efficiently on the widest possible number of computing platforms. These include serial, shared-memory and distributed-memory parallel as well as heterogeneous platforms. Careful attention has been paid to the specific nature of circuit-simulation problems to ensure that optimal parallel efficiency is achieved as the number of processors grows. The development of Xyce provides a platform for computational research and development aimed specifically at the needs of the Laboratory. With Xyce, Sandia has an 'in-house' capability with which both new electrical (e.g., device model development) and algorithmic (e.g., faster time-integration methods, parallel solver algorithms) research and development can be performed. As a result, Xyce is a unique electrical simulation capability, designed to meet the unique needs of the laboratory.

  6. Application of the verona coding definitions of emotional sequences (VR-CoDES) on a pediatric data set.

    PubMed

    Vatne, Torun M; Finset, Arnstein; Ørnes, Knut; Ruland, Cornelia M

    2010-09-01

    Adult patients present concerns as defined in the Verona Coding Definitions of Emotional Sequences (VR-CoDES), but we do not know how children express their concerns during medical consultations. This study aimed to evaluate the applicability of VR-CoDES to pediatric oncology consultations. Twenty-eight pediatric consultations were coded with the VR-CoDES, and the material was also qualitatively analyzed for descriptive purposes. Five consultations were randomly selected for reliability testing, and descriptive statistics were computed. Perfect inter-rater reliability for concerns and moderate reliability for cues were obtained. Cues and/or concerns were present in over half of the consultations. Cues were more frequent than concerns, with the majority of cues being verbal hints at hidden concerns or non-verbal cues. Intensity of expression, limitations in vocabulary, commonality of statements, and complexity of the setting complicated the use of VR-CoDES. Child-specific cues were observed: use of the imperative, cues about past experiences, and use of onomatopoeia. Children with cancer express concerns during medical consultations. VR-CoDES is a reliable tool for coding concerns in pediatric data sets. For future applications in pediatric settings, an appendix should be developed to incorporate child-specific traits. Copyright (c) 2010 Elsevier Ireland Ltd. All rights reserved.

  7. Expert system for maintenance management of a boiling water reactor power plant

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hong Shen; Liou, L.W.; Levine, S.

    1992-01-01

    An expert system code has been developed for the maintenance of two boiling water reactor units in Berwick, Pennsylvania, that are operated by the Pennsylvania Power and Light Company (PP&L). The objective of this expert system code, in which the knowledge of experienced operators and engineers is captured and implemented, is to support decisions regarding which components can be safely and reliably removed from service for maintenance. It can also serve as a query-answering facility for checking plant system status and for training purposes. The operating and maintenance information of a large number of support systems, which must be available for emergencies and/or in the event of an accident, is stored in the database of the code. It identifies the relevant technical specifications and management rules for shutting down any one of the systems or removing a component from service to support maintenance. Because of the complexity and time needed to incorporate a large number of systems and their components, the first phase of the expert system development produced a prototype code, which includes only the reactor core isolation coolant system, the high-pressure core injection system, the instrument air system, the service water system, and the plant electrical system. The next phase is scheduled to expand the code to include all other systems. This paper summarizes the prototype code and the design concept of the complete expert system code for maintenance management of all plant systems and components.

  8. A genome-wide survey of maternal and embryonic transcripts during Xenopus tropicalis development.

    PubMed

    Paranjpe, Sarita S; Jacobi, Ulrike G; van Heeringen, Simon J; Veenstra, Gert Jan C

    2013-11-06

    Dynamics of polyadenylation vs. deadenylation determine the fate of several developmentally regulated genes. Decay of a subset of maternal mRNAs and new transcription define the maternal-to-zygotic transition, but the full complement of polyadenylated and deadenylated coding and non-coding transcripts has not yet been assessed in Xenopus embryos. To analyze the dynamics and diversity of coding and non-coding transcripts during development, both polyadenylated mRNA and ribosomal RNA-depleted total RNA were harvested across six developmental stages and subjected to high throughput sequencing. The maternally loaded transcriptome is highly diverse and consists of both polyadenylated and deadenylated transcripts. Many maternal genes show peak expression in the oocyte and include genes which are known to be the key regulators of events like oocyte maturation and fertilization. Of all the transcripts that increase in abundance between early blastula and larval stages, about 30% of the embryonic genes are induced by fourfold or more by the late blastula stage and another 35% by late gastrulation. Using a gene model validation and discovery pipeline, we identified novel transcripts and putative long non-coding RNAs (lncRNA). These lncRNA transcripts were stringently selected as spliced transcripts generated from independent promoters, with limited coding potential and a codon bias characteristic of noncoding sequences. Many lncRNAs are conserved and expressed in a developmental stage-specific fashion. These data reveal dynamics of transcriptome polyadenylation and abundance and provides a high-confidence catalogue of novel and long non-coding RNAs.

  9. Modeling radiation belt dynamics using a 3-D layer method code

    NASA Astrophysics Data System (ADS)

    Wang, C.; Ma, Q.; Tao, X.; Zhang, Y.; Teng, S.; Albert, J. M.; Chan, A. A.; Li, W.; Ni, B.; Lu, Q.; Wang, S.

    2017-08-01

    A new 3-D diffusion code using a recently published layer method has been developed to analyze radiation belt electron dynamics. The code guarantees the positivity of the solution even when mixed diffusion terms are included. Unlike most of the previous codes, our 3-D code is developed directly in equatorial pitch angle (α0), momentum (p), and L shell coordinates; this eliminates the need to transform back and forth between (α0,p) coordinates and adiabatic invariant coordinates. Using (α0,p,L) is also convenient for direct comparison with satellite data. The new code has been validated by various numerical tests, and we apply the 3-D code to model the rapid electron flux enhancement following the geomagnetic storm on 17 March 2013, which is one of the Geospace Environment Modeling Focus Group challenge events. An event-specific global chorus wave model, an AL-dependent statistical plasmaspheric hiss wave model, and a recently published radial diffusion coefficient formula from Time History of Events and Macroscale Interactions during Substorms (THEMIS) statistics are used. The simulation results show good agreement with satellite observations, in general, supporting the scenario that the rapid enhancement of radiation belt electron flux for this event results from an increased level of the seed population by radial diffusion, with subsequent acceleration by chorus waves. Our results prove that the layer method can be readily used to model global radiation belt dynamics in three dimensions.

  10. Statistical computation of tolerance limits

    NASA Technical Reports Server (NTRS)

    Wheeler, J. T.

    1993-01-01

    Based on a new theory, two computer codes were developed specifically to calculate exact statistical tolerance limits for normal distributions with unknown means and variances, for both the one-sided and two-sided cases of the tolerance factor k. The quantity k is defined equivalently in terms of the noncentral t-distribution by the probability equation. Two of the four mathematical methods employ the theory developed for the numerical simulation. Several algorithms for numerically integrating and iteratively root-solving the working equations were written to augment the program simulation. The program codes generate tables of k values associated with varying values of the proportion and sample size for each given probability, showing the accuracy obtained for small sample sizes.
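
    For the one-sided case, the probability equation has a standard solution in terms of a noncentral t quantile; the SciPy sketch below illustrates the relation and reproduces published table values, but it is a generic illustration rather than the report's own codes.

      import numpy as np
      from scipy.stats import nct, norm

      def k_one_sided(n, proportion, confidence):
          # k = t'_{confidence}(df = n-1, nc = z_p * sqrt(n)) / sqrt(n)
          delta = norm.ppf(proportion) * np.sqrt(n)
          return nct.ppf(confidence, df=n - 1, nc=delta) / np.sqrt(n)

      # 95% coverage with 95% confidence, n = 10: tabulated k is about 2.911
      print(round(k_one_sided(10, 0.95, 0.95), 3))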

  11. What to do with a Dead Research Code

    NASA Astrophysics Data System (ADS)

    Nemiroff, Robert J.

    2016-01-01

    The project has ended -- should all of the computer codes that enabled the project be deleted? No. Like research papers, research codes typically carry valuable information past project end dates. Several possible end states for the life of research codes are reviewed. Historically, codes are typically left dormant on an increasingly obscure local disk directory until forgotten. These codes will likely become any or all of: lost, impossible to compile and run, difficult to decipher, and deleted when the code's proprietor moves on or dies. It is argued here, though, that it would be better for both code authors and astronomy generally if project codes were archived after use in some way. Archiving is advantageous for code authors because archived codes might increase the author's ADS citable publications, while astronomy as a science gains transparency and reproducibility. Paper-specific codes should be included in the publication of the journal papers they support, just like figures and tables. General codes that support multiple papers, possibly written by multiple authors, including their supporting websites, should be registered with a code registry such as the Astrophysics Source Code Library (ASCL). Codes developed on GitHub can be archived with a third-party service such as, currently, BackHub. An important code version might be uploaded to a web archiving service like, currently, Zenodo or Figshare, so that this version receives a Digital Object Identifier (DOI), enabling it to be found at a stable address into the future. Similar archiving services that are not DOI-dependent include perma.cc and the Internet Archive Wayback Machine at archive.org. Perhaps most simply, copies of important codes with lasting value might be kept on a cloud service like, for example, Google Drive, while activating Google's Inactive Account Manager.

  12. Generating Test Templates via Automated Theorem Proving

    NASA Technical Reports Server (NTRS)

    Kancherla, Mani Prasad

    1997-01-01

    Testing can be used during the software development process to maintain fidelity between evolving specifications, program designs, and code implementations. We use a form of specification-based testing that employs an automated theorem prover to generate test templates. A similar approach was developed using a model checker on state-intensive systems. This method applies to systems with functional rather than state-based behaviors. The approach allows the use of incomplete specifications to aid in the generation of tests for potential failure cases. We illustrate the technique on the canonical triangle testing problem and discuss its use in the analysis of a spacecraft scheduling system.

  13. Centrifugal and Axial Pump Design and Off-Design Performance Prediction

    NASA Technical Reports Server (NTRS)

    Veres, Joseph P.

    1995-01-01

    A meanline pump-flow modeling method has been developed to provide a fast capability for modeling pumps of cryogenic rocket engines. Based on this method, a meanline pump-flow code PUMPA was written that can predict the performance of pumps at off-design operating conditions, given the loss of the diffusion system at the design point. The design-point rotor efficiency and slip factors are obtained from empirical correlations to rotor specific speed and geometry. The pump code can model axial, inducer, mixed-flow, and centrifugal pumps and can model multistage pumps in series. The rapid input setup and computer run time for this meanline pump flow code make it an effective analysis and conceptual design tool. The map-generation capabilities of the code provide the information needed for interfacing with a rocket engine system modeling code. The off-design and multistage modeling capabilities of PUMPA permit the user to do parametric design space exploration of candidate pump configurations and to provide head-flow maps for engine system evaluation.
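
    To give a flavor of what a meanline calculation involves, the sketch below evaluates the Euler head of a centrifugal stage with the textbook Wiesner slip correlation. The geometry and speed in the example call are arbitrary, and PUMPA's own empirical efficiency and slip correlations are not reproduced here:

        from math import pi, sin, tan, radians, sqrt

        def euler_head(rpm, d2, b2, beta2_deg, Q, Z, g=9.81):
            """Ideal meanline head [m] at a centrifugal impeller exit.
            rpm: shaft speed; d2: exit diameter [m]; b2: exit width [m];
            beta2_deg: blade angle from tangential [deg]; Q: flow [m^3/s]; Z: blade count."""
            U2 = pi * d2 * rpm / 60.0                                # exit tip speed
            Cm2 = Q / (pi * d2 * b2)                                 # meridional exit velocity
            sigma = 1.0 - sqrt(sin(radians(beta2_deg))) / Z ** 0.7   # Wiesner slip factor
            Ctheta2 = sigma * U2 - Cm2 / tan(radians(beta2_deg))    # slipped swirl velocity
            return U2 * Ctheta2 / g

        print(euler_head(rpm=30000, d2=0.2, b2=0.01, beta2_deg=30.0, Q=0.05, Z=6))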

  14. Unit Testing for the Application Control Language (ACL) Software

    NASA Technical Reports Server (NTRS)

    Heinich, Christina Marie

    2014-01-01

    In the software development process, code needs to be tested before it can be packaged for release, both to make sure the program actually does what it is supposed to do and to check how the program handles errors and edge cases (such as negative or very large numbers). One of the major parts of the testing process is unit testing, where specific units of the code are tested to make sure each individual part works. This project is about unit testing many different components of the ACL software and fixing any errors encountered. To do this, mocks of other objects need to be created and every line of code needs to be exercised to make sure every case is accounted for. Mocks are important because they give direct control of the environment the unit lives in, instead of attempting to work with the entire program. This makes it easier to achieve the second goal of exercising every line of code.
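
    A minimal illustration of the mocking idea in Python's unittest framework (the ACL software is not public, so the unit under test here is a hypothetical stand-in):

        import unittest
        from unittest.mock import MagicMock

        # Hypothetical unit under test: a handler that reports errors
        # through a logger object supplied by its environment.
        def handle_command(value, logger):
            if value < 0:
                logger.error("negative input")
                return None
            return value * 2

        class HandleCommandTest(unittest.TestCase):
            def test_negative_input_logs_error(self):
                logger = MagicMock()              # the mock replaces the real environment
                self.assertIsNone(handle_command(-1, logger))
                logger.error.assert_called_once_with("negative input")

            def test_doubles_positive_input(self):
                self.assertEqual(handle_command(3, MagicMock()), 6)

        if __name__ == "__main__":
            unittest.main()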

  15. CERISE, a French radioprotection code, to assess the radiological impact and acceptance criteria of installations for material handling, and recycling or disposal of very low-level radioactive waste

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Santucci, P.; Guetat, P.

    1993-12-31

    This document describes the code CERISE (Code d'Evaluations Radiologiques Individuelles pour des Situations en Entreprise et dans l'Environnement). The code has been developed in the frame of European studies to establish acceptance criteria for very low-level radioactive waste and materials. It is written in Fortran and runs on PC. It calculates doses received through the different pathways: external exposure, ingestion, inhalation and skin contamination. Twenty basic scenarios, determined from previous studies, are already elaborated. Calculations establish the relation between surface, specific and/or total activities, and doses. Results can be expressed as doses for an average activity unit, or as average activity limits for a set of reference doses (defined for each scenario analyzed). In the latter case, the minimal activity values and the corresponding limiting scenarios are selected and summarized in a final table.
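
    The activity-to-dose relation such a code inverts is, at its core, a sum of pathway contributions. A sketch with made-up dose coefficients (the real values are nuclide- and scenario-specific) is:

        # Illustrative pathway summation; coefficients are hypothetical,
        # not CERISE's nuclide- and scenario-specific values.
        DOSE_COEFF = {                        # Sv per Bq handled, by pathway
            "external exposure": 1.2e-12,
            "inhalation": 4.0e-13,
            "ingestion": 2.5e-13,
            "skin contamination": 6.0e-14,
        }

        def scenario_dose(activity_bq):
            """Dose [Sv] for a given total activity, summed over pathways."""
            return activity_bq * sum(DOSE_COEFF.values())

        # inverse use: activity limit implied by a 10 uSv reference dose
        activity_limit = 1.0e-5 / sum(DOSE_COEFF.values())
        print(scenario_dose(1.0e6), activity_limit)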

  16. The H.264/AVC advanced video coding standard: overview and introduction to the fidelity range extensions

    NASA Astrophysics Data System (ADS)

    Sullivan, Gary J.; Topiwala, Pankaj N.; Luthra, Ajay

    2004-11-01

    H.264/MPEG-4 AVC is the latest international video coding standard. It was jointly developed by the Video Coding Experts Group (VCEG) of the ITU-T and the Moving Picture Experts Group (MPEG) of ISO/IEC. It uses state-of-the-art coding tools and provides enhanced coding efficiency for a wide range of applications, including video telephony, video conferencing, TV, storage (DVD and/or hard disk based, especially high-definition DVD), streaming video, digital video authoring, digital cinema, and many others. The work on a new set of extensions to this standard has recently been completed. These extensions, known as the Fidelity Range Extensions (FRExt), provide a number of enhanced capabilities relative to the base specification as approved in the Spring of 2003. In this paper, an overview of this standard is provided, including the highlights of the capabilities of the new FRExt features. Some comparisons with the existing MPEG-2 and MPEG-4 Part 2 standards are also provided.

  17. New primary renal diagnosis codes for the ERA-EDTA

    PubMed Central

    Venkat-Raman, Gopalakrishnan; Tomson, Charles R.V.; Gao, Yongsheng; Cornet, Ronald; Stengel, Benedicte; Gronhagen-Riska, Carola; Reid, Chris; Jacquelinet, Christian; Schaeffner, Elke; Boeschoten, Els; Casino, Francesco; Collart, Frederic; De Meester, Johan; Zurriaga, Oscar; Kramar, Reinhard; Jager, Kitty J.; Simpson, Keith

    2012-01-01

    The European Renal Association-European Dialysis and Transplant Association (ERA-EDTA) Registry has produced a new set of primary renal diagnosis (PRD) codes that are intended for use by affiliated registries. It is designed specifically for use in renal centres and registries but is aligned with international coding standards supported by the WHO (International Classification of Diseases) and the International Health Terminology Standards Development Organization (SNOMED Clinical Terms). It is available as supplementary material to this paper and free on the internet for non-commercial, clinical, quality improvement and research use, and by agreement with the ERA-EDTA Registry for use by commercial organizations. Conversion between the old and the new PRD codes is possible. The new codes are very flexible and will be actively managed to keep them up-to-date and to ensure that renal medicine can remain at the forefront of the electronic revolution in medicine, epidemiology research and the use of decision support systems to improve the care of patients. PMID:23175621

  18. Psychometric Properties of the System for Coding Couples’ Interactions in Therapy - Alcohol

    PubMed Central

    Owens, Mandy D.; McCrady, Barbara S.; Borders, Adrienne Z.; Brovko, Julie M.; Pearson, Matthew R.

    2014-01-01

    Few systems are available for coding in-session behaviors for couples in therapy. Alcohol Behavior Couples Therapy (ABCT) is an empirically supported treatment, but little is known about its mechanisms of behavior change. In the current study, an adapted version of the Motivational Interviewing for Significant Others coding system was developed into the System for Coding Couples’ Interactions in Therapy – Alcohol (SCCIT-A), which was used to code couples’ interactions and behaviors during ABCT. Results showed good inter-rater reliability of the SCCIT-A and provided evidence that the SCCIT-A may be a promising measure for understanding couples in therapy. A three-factor model of the SCCIT-A (Positive, Negative, and Change Talk/Counter-Change Talk) was examined using a confirmatory factor analysis, but model fit was poor. Due to the poor model fit, ratios were computed for Positive/Negative ratings and for Change Talk/Counter-Change Talk codes based on previous research in the couples and Motivational Interviewing literature. Post-hoc analyses examined correlations between specific SCCIT-A codes and baseline characteristics and indicated some concurrent validity. Correlations were run between ratios and baseline characteristics; ratios may be an alternative to using the factors from the SCCIT-A. Reliability and validity analyses suggest that the SCCIT-A has the potential to be a useful measure for coding in-session behaviors of both partners in couples therapy and could be used to identify mechanisms of behavior change for ABCT. Additional research is needed to improve the reliability of some codes and to further develop the SCCIT-A and other measures of couples’ interactions in therapy. PMID:25528049

  19. Development of a patient-specific dosimetry estimation system in nuclear medicine examination

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lin, H. H.; Dong, S. L.; Yang, H. J.

    2011-07-01

    The purpose of this study is to develop a patient-specific dosimetry estimation system for nuclear medicine examinations using a SimSET-based Monte Carlo code. We added a dose deposition routine to store the energy deposited by photons during their flights in SimSET and developed a user-friendly interface for reading PET and CT images. Dose calculated on the ORNL phantom was used to validate the accuracy of this system. The S values for {sup 99m}Tc, {sup 18}F and {sup 131}I obtained by the system were compared to those from the MCNP4C code and OLINDA. The ratios of S values computed by this system to those obtained with OLINDA for various organs ranged from 0.93 to 1.18, comparable to those obtained from the MCNP4C code (0.94 to 1.20). The average ratios of S values were 0.99{+-}0.04, 1.03{+-}0.05, and 1.00{+-}0.07 for the isotopes {sup 131}I, {sup 18}F, and {sup 99m}Tc, respectively. SimSET ran about twice as fast as MCNP4C for the various isotopes. A 3D dose calculation was also performed on a patient data set from a PET/CT examination using this system. Results from the patient data showed that the estimated S values using this system differed slightly from those of OLINDA for the ORNL phantom. In conclusion, this system can generate patient-specific dose distributions and display the isodose curves on top of the anatomic structure through a friendly graphic user interface. It may also provide a useful tool to establish an appropriate dose-reduction strategy for patients in nuclear medicine environments. (authors)

  20. Main steam line break accident simulation of APR1400 using the model of ATLAS facility

    NASA Astrophysics Data System (ADS)

    Ekariansyah, A. S.; Deswandri; Sunaryo, Geni R.

    2018-02-01

    A main steam line break simulation for the APR1400, an advanced PWR design, has been performed using the RELAP5 code. The simulation was conducted on a model of the thermal-hydraulic test facility ATLAS, which represents a scaled-down facility of the APR1400 design. The main steam line break event is described in an open-access safety report document, whose initial conditions and assumptions for the analysis were utilized in performing the simulation and the analysis of the selected parameters. The objective of this work was to conduct a benchmark activity by comparing the simulation results of the CESEC-III code, a conservative-approach code, with the results of RELAP5 as a best-estimate code. Based on the simulation results, a general similarity in the behavior of selected parameters was observed between the two codes. However, the degree of accuracy still needs further research and analysis by comparison with another best-estimate code. Uncertainties arising from the ATLAS model should be minimized by taking into account much more specific data in developing the APR1400 model.

  1. Clean Energy in City Codes: A Baseline Analysis of Municipal Codification across the United States

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cook, Jeffrey J.; Aznar, Alexandra; Dane, Alexander

    Municipal governments in the United States are well positioned to influence clean energy (energy efficiency and alternative energy) and transportation technology and strategy implementation within their jurisdictions through planning, programs, and codification. Municipal governments are leveraging planning processes and programs to shape their energy futures. There is limited understanding in the literature related to codification, the primary way that municipal governments enact enforceable policies. The authors fill the gap in the literature by documenting the status of municipal codification of clean energy and transportation across the United States. More directly, we leverage online databases of municipal codes to develop national and state-specific representative samples of municipal governments by population size. Our analysis finds that municipal governments with the authority to set residential building energy codes within their jurisdictions frequently do so. In some cases, communities set codes higher than their respective state governments. Examination of codes across the nation indicates that municipal governments are employing their code as a policy mechanism to address clean energy and transportation.

  2. Test Analysis Tools to Ensure Higher Quality of On-Board Real Time Software for Space Applications

    NASA Astrophysics Data System (ADS)

    Boudillet, O.; Mescam, J.-C.; Dalemagne, D.

    2008-08-01

    EADS Astrium Space Transportation, at its Les Mureaux premises, is responsible for the onboard software (SW) of the French M51 nuclear deterrent missile. It has also developed over one million lines of code, mostly in Ada, for the Automated Transfer Vehicle (ATV) onboard SW and for the flight control SW of the ARIANE5 launcher that put the ATV into orbit. As part of the ATV SW, ASTRIUM ST developed the first Category A SW ever qualified for a European space application. To ensure that all this embedded SW is developed with the highest quality and reliability level, specific development tools have been designed to cover the steps of source code verification, automated validation testing, and complete target instruction coverage verification. Three such dedicated tools are presented here.

  3. Standards application and development plan for solar thermal technologies

    NASA Astrophysics Data System (ADS)

    Cobb, H. R. W.

    1981-07-01

    Functional and standards matrices, developed from input from solar thermal (ST) users and from industry, are presented; they will be continually reviewed and updated as commercial aspects develop. The matrices highlight codes, standards, test methods, functions and definitions that need to be developed. They will be submitted through ANSI for development by national consensus bodies. A contingency action is proposed for standards development if specific input is lacking at the committee level or if early development of a standard would hasten commercialization or gain needed jurisdictional acceptance.

  4. Rotorcraft Optimization Tools: Incorporating Rotorcraft Design Codes into Multi-Disciplinary Design, Analysis, and Optimization

    NASA Technical Reports Server (NTRS)

    Meyn, Larry A.

    2018-01-01

    One of the goals of NASA's Revolutionary Vertical Lift Technology Project (RVLT) is to provide validated tools for multidisciplinary design, analysis and optimization (MDAO) of vertical lift vehicles. As part of this effort, the software package RotorCraft Optimization Tools (RCOTOOLS) is being developed to facilitate incorporating key rotorcraft conceptual design codes into optimizations using the OpenMDAO multi-disciplinary optimization framework written in Python. RCOTOOLS, also written in Python, currently supports the incorporation of the NASA Design and Analysis of RotorCraft (NDARC) vehicle sizing tool and the Comprehensive Analytical Model of Rotorcraft Aerodynamics and Dynamics II (CAMRAD II) analysis tool into OpenMDAO-driven optimizations. Both of these tools use detailed, file-based inputs and outputs, so RCOTOOLS provides software wrappers to update input files with new design variable values, execute these codes and then extract specific response variable values from the file outputs. These wrappers are designed to be flexible and easy to use. RCOTOOLS also provides several utilities to aid in optimization model development, including Graphical User Interface (GUI) tools for browsing input and output files in order to identify text strings that designate specific variables as optimization inputs and response variables. This paper provides an overview of RCOTOOLS and its use.
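
    A stripped-down sketch of the wrapper pattern described above (substitute design-variable values into a templated input file, run the code, and pull a response value from its output) follows. The file names, the {placeholder} convention, and the regex are hypothetical; the actual NDARC/CAMRAD II interfaces differ:

        import re
        import subprocess
        from pathlib import Path

        # Generic file-based wrapper sketch (hypothetical file formats, not the
        # actual NDARC/CAMRAD II interfaces wrapped by RCOTOOLS).
        def run_case(template, values, exe, out_file, pattern):
            text = Path(template).read_text()
            for key, val in values.items():
                text = text.replace("{" + key + "}", str(val))   # set design variables
            Path("case.inp").write_text(text)
            subprocess.run([exe, "case.inp"], check=True)        # execute the analysis code
            match = re.search(pattern, Path(out_file).read_text())
            return float(match.group(1))                         # extract a response value

        # hypothetical usage: a response as a function of two design variables
        # gw = run_case("rotor.tpl", {"RADIUS": 8.2, "TIP_SPEED": 198.0},
        #               "ndarc", "rotor.out", r"GROSS WEIGHT\s+([0-9.]+)")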

  5. GEANT4 Tuning For pCT Development

    NASA Astrophysics Data System (ADS)

    Yevseyeva, Olga; de Assis, Joaquim T.; Evseev, Ivan; Schelin, Hugo R.; Paschuk, Sergei A.; Milhoretto, Edney; Setti, João A. P.; Díaz, Katherin S.; Hormaza, Joel M.; Lopes, Ricardo T.

    2011-08-01

    Proton beams in medical applications deal with relatively thick targets like the human head or trunk. Thus, the fidelity of proton computed tomography (pCT) simulations as a tool for proton therapy planning depends, in the general case, on the accuracy of results obtained for proton interaction with thick absorbers. As shown previously, GEANT4 simulations of proton energy spectra after passing through thick absorbers do not agree well with existing experimental data. Moreover, the spectra simulated for the Bethe-Bloch domain showed an unexpected sensitivity to the choice of low-energy electromagnetic models during code execution. These observations were made with GEANT4 version 8.2 during our simulations for pCT. This work describes in more detail the simulations of proton passage through aluminum absorbers of varied thickness. The simulations were done by modifying only the geometry in the Hadrontherapy Example, for all available choices of the electromagnetic physics models. As the most probable reason for these effects is some specific feature of the code or some implicit parameter setting described in the GEANT4 manual, we continued our study with version 9.2 of the code. Some improvements in comparison with our previous results were obtained. The simulations were performed with further applications to pCT development in mind.

  6. Spatial Correlations in Natural Scenes Modulate Response Reliability in Mouse Visual Cortex

    PubMed Central

    Rikhye, Rajeev V.

    2015-01-01

    Intrinsic neuronal variability significantly limits information encoding in the primary visual cortex (V1). Certain stimuli can suppress this intertrial variability to increase the reliability of neuronal responses. In particular, responses to natural scenes, which have broadband spatiotemporal statistics, are more reliable than responses to stimuli such as gratings. However, very little is known about which stimulus statistics modulate reliable coding and how this occurs at the neural ensemble level. Here, we sought to elucidate the role that spatial correlations in natural scenes play in reliable coding. We developed a novel noise-masking method to systematically alter spatial correlations in natural movies, without altering their edge structure. Using high-speed two-photon calcium imaging in vivo, we found that responses in mouse V1 were much less reliable at both the single neuron and population level when spatial correlations were removed from the image. This change in reliability was due to a reorganization of between-neuron correlations. Strongly correlated neurons formed ensembles that reliably and accurately encoded visual stimuli, whereas reducing spatial correlations reduced the activation of these ensembles, leading to an unreliable code. Together with an ensemble-specific normalization model, these results suggest that the coordinated activation of specific subsets of neurons underlies the reliable coding of natural scenes. SIGNIFICANCE STATEMENT The natural environment is rich with information. To process this information with high fidelity, V1 neurons have to be robust to noise and, consequentially, must generate responses that are reliable from trial to trial. While several studies have hinted that both stimulus attributes and population coding may reduce noise, the details remain unclear. Specifically, what features of natural scenes are important and how do they modulate reliability? This study is the first to investigate the role of spatial correlations, which are a fundamental attribute of natural scenes, in shaping stimulus coding by V1 neurons. Our results provide new insights into how stimulus spatial correlations reorganize the correlated activation of specific ensembles of neurons to ensure accurate information processing in V1. PMID:26511254

  7. Methylation of microRNA genes regulates gene expression in bisexual flower development in andromonoecious poplar.

    PubMed

    Song, Yuepeng; Tian, Min; Ci, Dong; Zhang, Deqiang

    2015-04-01

    Previous studies showed sex-specific DNA methylation and expression of candidate genes in bisexual flowers of andromonoecious poplar, but the regulatory relationship between methylation and microRNAs (miRNAs) remains unclear. To investigate whether the methylation of miRNA genes regulates gene expression in bisexual flower development, the methylome, microRNA, and transcriptome were examined in female and male flowers of andromonoecious poplar. 27 636 methylated coding genes and 113 methylated miRNA genes were identified. In the coding genes, 64.5% of the methylated reads mapped to the gene body region; by contrast, 60.7% of methylated reads in miRNA genes mainly mapped in the 5' and 3' flanking regions. CHH methylation showed the highest methylation levels and CHG showed the lowest methylation levels. Correlation analysis showed a significant, negative, strand-specific correlation of methylation and miRNA gene expression (r=0.79, P <0.05). The methylated miRNA genes included eight long miRNAs (lmiRNAs) of 24 nucleotides and 11 miRNAs related to flower development. miRNA172b might play an important role in the regulation of bisexual flower development-related gene expression in andromonoecious poplar, via modification of methylation. Gynomonoecious, female, and male poplars were used to validate the methylation patterns of the miRNA172b gene, implying that hyper-methylation in andromonoecious and gynomonoecious poplar might function as an important regulator in bisexual flower development. Our data provide a useful resource for the study of flower development in poplar and improve our understanding of the effect of epigenetic regulation on genes other than protein-coding genes. © The Author 2015. Published by Oxford University Press on behalf of the Society for Experimental Biology. All rights reserved. For permissions, please email: journals.permissions@oup.com.

  8. Methylation of microRNA genes regulates gene expression in bisexual flower development in andromonoecious poplar

    PubMed Central

    Song, Yuepeng; Tian, Min; Ci, Dong; Zhang, Deqiang

    2015-01-01

    Previous studies showed sex-specific DNA methylation and expression of candidate genes in bisexual flowers of andromonoecious poplar, but the regulatory relationship between methylation and microRNAs (miRNAs) remains unclear. To investigate whether the methylation of miRNA genes regulates gene expression in bisexual flower development, the methylome, microRNA, and transcriptome were examined in female and male flowers of andromonoecious poplar. 27 636 methylated coding genes and 113 methylated miRNA genes were identified. In the coding genes, 64.5% of the methylated reads mapped to the gene body region; by contrast, 60.7% of methylated reads in miRNA genes mainly mapped in the 5′ and 3′ flanking regions. CHH methylation showed the highest methylation levels and CHG showed the lowest methylation levels. Correlation analysis showed a significant, negative, strand-specific correlation of methylation and miRNA gene expression (r=0.79, P <0.05). The methylated miRNA genes included eight long miRNAs (lmiRNAs) of 24 nucleotides and 11 miRNAs related to flower development. miRNA172b might play an important role in the regulation of bisexual flower development-related gene expression in andromonoecious poplar, via modification of methylation. Gynomonoecious, female, and male poplars were used to validate the methylation patterns of the miRNA172b gene, implying that hyper-methylation in andromonoecious and gynomonoecious poplar might function as an important regulator in bisexual flower development. Our data provide a useful resource for the study of flower development in poplar and improve our understanding of the effect of epigenetic regulation on genes other than protein-coding genes. PMID:25617468

  9. Sparse coding can predict primary visual cortex receptive field changes induced by abnormal visual input.

    PubMed

    Hunt, Jonathan J; Dayan, Peter; Goodhill, Geoffrey J

    2013-01-01

    Receptive fields acquired through unsupervised learning of sparse representations of natural scenes have similar properties to primary visual cortex (V1) simple cell receptive fields. However, what drives in vivo development of receptive fields remains controversial. The strongest evidence for the importance of sensory experience in visual development comes from receptive field changes in animals reared with abnormal visual input. However, most sparse coding accounts have considered only normal visual input and the development of monocular receptive fields. Here, we applied three sparse coding models to binocular receptive field development across six abnormal rearing conditions. In every condition, the changes in receptive field properties previously observed experimentally were matched to a similar and highly faithful degree by all the models, suggesting that early sensory development can indeed be understood in terms of an impetus towards sparsity. As previously predicted in the literature, we found that asymmetries in inter-ocular correlation across orientations lead to orientation-specific binocular receptive fields. Finally we used our models to design a novel stimulus that, if present during rearing, is predicted by the sparsity principle to lead robustly to radically abnormal receptive fields.
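
    For readers unfamiliar with the mechanics, inferring a sparse code a for a patch x over a dictionary D usually means minimizing 0.5*||x - Da||^2 + lam*||a||_1. A generic ISTA solver for that problem (a sketch; the paper compares several specific sparse coding models, not this exact one) is:

        import numpy as np

        # Generic ISTA sparse coding: min_a 0.5*||x - D a||^2 + lam*||a||_1
        def sparse_code(D, x, lam=0.1, n_iter=200):
            step = 1.0 / np.linalg.norm(D, 2) ** 2        # 1 / Lipschitz constant
            a = np.zeros(D.shape[1])
            for _ in range(n_iter):
                g = D.T @ (D @ a - x)                     # gradient of the quadratic term
                z = a - step * g
                a = np.sign(z) * np.maximum(np.abs(z) - lam * step, 0.0)  # soft threshold
            return a

        rng = np.random.default_rng(0)
        D = rng.standard_normal((64, 128))                # stand-in dictionary (e.g., learned RFs)
        x = rng.standard_normal(64)                       # an image patch
        print(np.count_nonzero(sparse_code(D, x)))        # only a few active coefficients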

  10. Sparse Coding Can Predict Primary Visual Cortex Receptive Field Changes Induced by Abnormal Visual Input

    PubMed Central

    Hunt, Jonathan J.; Dayan, Peter; Goodhill, Geoffrey J.

    2013-01-01

    Receptive fields acquired through unsupervised learning of sparse representations of natural scenes have similar properties to primary visual cortex (V1) simple cell receptive fields. However, what drives in vivo development of receptive fields remains controversial. The strongest evidence for the importance of sensory experience in visual development comes from receptive field changes in animals reared with abnormal visual input. However, most sparse coding accounts have considered only normal visual input and the development of monocular receptive fields. Here, we applied three sparse coding models to binocular receptive field development across six abnormal rearing conditions. In every condition, the changes in receptive field properties previously observed experimentally were matched to a similar and highly faithful degree by all the models, suggesting that early sensory development can indeed be understood in terms of an impetus towards sparsity. As previously predicted in the literature, we found that asymmetries in inter-ocular correlation across orientations lead to orientation-specific binocular receptive fields. Finally we used our models to design a novel stimulus that, if present during rearing, is predicted by the sparsity principle to lead robustly to radically abnormal receptive fields. PMID:23675290

  11. A fast and low-cost genotyping method for hepatitis B virus based on pattern recognition in point-of-care settings

    PubMed Central

    Qiu, Xianbo; Song, Liuwei; Yang, Shuo; Guo, Meng; Yuan, Quan; Ge, Shengxiang; Min, Xiaoping; Xia, Ningshao

    2016-01-01

    A fast and low-cost method for HBV genotyping, especially for genotypes A, B, C and D, was developed and tested. A classifier was used to detect and analyze a one-step immunoassay lateral flow strip functionalized with genotype-specific monoclonal antibodies (mAbs) on multiple capture lines in the form of pattern recognition for point-of-care (POC) diagnostics. The fluorescent signals from the capture lines and the background of the strip were collected via multiple optical channels in parallel. A digital HBV genotyping model, whose inputs are the fluorescent signals and outputs are a group of genotype-specific digital binary codes (0/1), was developed based on the HBV genotyping strategy. Meanwhile, a companion decoding table was established to cover all possible pairing cases between the states of a group of genotype-specific digital binary codes and the HBV genotyping results. A logical analyzing module was constructed to process the detected signals in parallel without program control, and its outputs were used to drive a set of LED indicators, which determine the HBV genotype. Compared to nucleic acid analysis of HBV, the developed method provides much faster HBV genotyping at significantly lower cost. PMID:27306485
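
    The decoding-table idea is easy to make concrete: each capture line yields one bit, and the tuple of bits indexes the genotype. The patterns and threshold below are hypothetical placeholders; the actual mAb reactivity patterns are defined by the paper's genotyping model:

        # Hypothetical decoding table from capture-line bit patterns to genotypes.
        DECODING_TABLE = {
            (1, 0, 0, 0): "A",
            (0, 1, 0, 0): "B",
            (0, 0, 1, 0): "C",
            (0, 0, 0, 1): "D",
        }

        def genotype(fluorescence, background, factor=3.0):
            """Threshold each capture-line signal against the strip background."""
            bits = tuple(int(s > factor * background) for s in fluorescence)
            return DECODING_TABLE.get(bits, "indeterminate")

        print(genotype(fluorescence=[850, 40, 35, 30], background=20))   # -> "A"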

  12. Modeling of transitional flows

    NASA Technical Reports Server (NTRS)

    Lund, Thomas S.

    1988-01-01

    An effort directed at developing improved transitional models was initiated. This work focused on the critical assessment of a popular existing transitional model developed by McDonald and Fish in 1972. The objective was to identify the shortcomings of the McDonald-Fish model and to use the insights gained to suggest modifications or alterations of the basic model. In order to evaluate the transitional model, a compressible boundary layer code was required. Accordingly, a two-dimensional compressible boundary layer code was developed. The program was based on a three-point fully implicit finite difference algorithm in which the equations were solved in an uncoupled manner, with second-order extrapolation used to evaluate the non-linear coefficients. Iteration was offered as an option if the extrapolation error could not be tolerated. The differencing scheme was arranged to be second order in both spatial directions on an arbitrarily stretched mesh. A variety of boundary condition options were implemented, including specification of an external pressure gradient, specification of a wall temperature distribution, and specification of an external temperature distribution. Overall, the results of the initial phase of this work indicate that the McDonald-Fish model does a poor job of predicting the details of the turbulent flow structure in the transition region.
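
    The workhorse of such a fully implicit three-point scheme is a tridiagonal solve at each marching station. A generic Thomas-algorithm sketch (not the original program's solver) is:

        import numpy as np

        def thomas(a, b, c, d):
            """Solve a tridiagonal system; a: sub-, b: main, c: super-diagonal, d: RHS.
            a[0] and c[-1] are unused. O(n), as produced by three-point implicit schemes."""
            n = len(b)
            cp, dp = np.empty(n), np.empty(n)
            cp[0], dp[0] = c[0] / b[0], d[0] / b[0]
            for i in range(1, n):
                m = b[i] - a[i] * cp[i - 1]
                cp[i] = c[i] / m if i < n - 1 else 0.0
                dp[i] = (d[i] - a[i] * dp[i - 1]) / m
            x = np.empty(n)
            x[-1] = dp[-1]
            for i in range(n - 2, -1, -1):
                x[i] = dp[i] - cp[i] * x[i + 1]
            return x

        # demo: the classic (-1, 2, -1) diffusion-like system
        n = 5
        print(thomas(np.full(n, -1.0), np.full(n, 2.0), np.full(n, -1.0), np.ones(n)))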

  13. Development and validation of a low-frequency modeling code for high-moment transmitter rod antennas

    NASA Astrophysics Data System (ADS)

    Jordan, Jared Williams; Sternberg, Ben K.; Dvorak, Steven L.

    2009-12-01

    The goal of this research is to develop and validate a low-frequency modeling code for high-moment transmitter rod antennas to aid in the design of future low-frequency TX antennas with high magnetic moments. To accomplish this goal, a quasi-static modeling algorithm was developed to simulate finite-length, permeable-core rod antennas. This quasi-static analysis is applicable at low frequencies where eddy currents are negligible, and it can handle solid or hollow cores with winding insulation thickness between the antenna's windings and its core. The theory was programmed in Matlab, and the modeling code can predict the TX antenna's gain, maximum magnetic moment, saturation current, series inductance, and core series loss resistance, provided the user enters the complex permeability corresponding to the desired core magnetic flux density. To use the linear modeling code with nonlinear core materials, it is therefore necessary to supply the correct complex permeability for a specific core magnetic flux density. To test the modeling code, we demonstrated that it can accurately predict changes in the electrical parameters associated with variations in rod length and core thickness for antennas made of low-carbon steel wire. These tests demonstrate that the modeling code successfully predicts the changes in rod antenna characteristics under high-current, nonlinear conditions due to changes in the physical dimensions of the rod, provided that the flux density in the core is held constant so that the complex permeability does not change.
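
    For context, the leading-order quasi-static physics is the reduction of the core's effective permeability by the finite rod's demagnetizing field. A textbook sketch (prolate-spheroid approximation for a long rod; not the paper's algorithm, which also models windings and losses) is:

        from math import pi, log

        def rod_inductance(mu_r, length, radius, turns):
            """Low-frequency inductance [H] of a coil on a finite permeable rod."""
            m = length / (2.0 * radius)                    # aspect ratio (length/diameter)
            Nd = (log(2.0 * m) - 1.0) / m ** 2             # approx. demagnetizing factor, m >> 1
            mu_rod = mu_r / (1.0 + Nd * (mu_r - 1.0))      # apparent (rod) permeability
            mu0 = 4.0e-7 * pi
            area = pi * radius ** 2
            return mu_rod * mu0 * turns ** 2 * area / length   # solenoid formula with core

        # note how the apparent permeability saturates with mu_r for a slender core:
        print(rod_inductance(mu_r=2000, length=1.0, radius=0.01, turns=500))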

  14. Development and feasibility testing of the Pediatric Emergency Discharge Interaction Coding Scheme.

    PubMed

    Curran, Janet A; Taylor, Alexandra; Chorney, Jill; Porter, Stephen; Murphy, Andrea; MacPhee, Shannon; Bishop, Andrea; Haworth, Rebecca

    2017-08-01

    Discharge communication is an important aspect of high-quality emergency care. This study addresses the gap in knowledge on how to describe discharge communication in a paediatric emergency department (ED). The objective of this feasibility study was to develop and test a coding scheme to characterize discharge communication between health-care providers (HCPs) and caregivers who visit the ED with their children. The Pediatric Emergency Discharge Interaction Coding Scheme (PEDICS) and coding manual were developed following a review of the literature and an iterative refinement process involving HCP observations, inter-rater assessments and team consensus. The coding scheme was pilot-tested through observations of HCPs across a range of shifts in one urban paediatric ED. Overall, 329 patient observations were carried out across 50 observational shifts. Inter-rater reliability was evaluated in 16% of the observations. The final version of the PEDICS contained 41 communication elements. Kappa scores were greater than .60 for the majority of communication elements. The most frequently observed communication elements were under the Introduction node and the least frequently observed were under the Social Concerns node. HCPs initiated the majority of the communication. The PEDICS addresses an important gap in the discharge communication literature. The tool is useful for mapping patterns of discharge communication between HCPs and caregivers. Results from our pilot test identified deficits in specific areas of discharge communication that could impact adherence to discharge instructions. The PEDICS would benefit from further testing with a different sample of HCPs. © 2017 The Authors. Health Expectations Published by John Wiley & Sons Ltd.

  15. 78 FR 51139 - Notice of Proposed Changes to the National Handbook of Conservation Practices for the Natural...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-08-20

    ... (Code 324), Field Border (Code 386), Filter Strip (Code 393), Land Smoothing (Code 466), Livestock... the implementation requirement document to the specifications and plans. Filter Strip (Code 393)--The...

  16. Imitation Learning Errors Are Affected by Visual Cues in Both Performance and Observation Phases.

    PubMed

    Mizuguchi, Takashi; Sugimura, Ryoko; Shimada, Hideaki; Hasegawa, Takehiro

    2017-08-01

    Mechanisms of action imitation were examined. Previous studies have suggested that success or failure of imitation is determined at the point of observing an action; in other words, cognitive processing after observation is not related to the success of imitation. Twenty university students participated in each of three experiments in which they observed a series of object manipulations consisting of four elements (hands, tools, object, and end points) and then imitated the manipulations. In Experiment 1, a specific initially observed element was color coded, and the specific manipulated object at the imitation stage was identically color coded; participants accurately imitated the color-coded element. In Experiment 2, a specific element was color coded at the observation but not at the imitation stage, and there were no effects of color coding on imitation. In Experiment 3, participants were verbally instructed to attend to a specific element at the imitation stage, but the verbal instructions had no effect. Thus, the success of imitation may not be determined at the stage of observing an action, and color coding can provide a clue for imitation at the imitation stage.

  17. Nuclear Energy Advanced Modeling and Simulation (NEAMS) waste Integrated Performance and Safety Codes (IPSC) : gap analysis for high fidelity and performance assessment code development.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lee, Joon H.; Siegel, Malcolm Dean; Arguello, Jose Guadalupe, Jr.

    2011-03-01

    This report describes a gap analysis performed in the process of developing the Waste Integrated Performance and Safety Codes (IPSC) in support of the U.S. Department of Energy (DOE) Office of Nuclear Energy Advanced Modeling and Simulation (NEAMS) Campaign. The goal of the Waste IPSC is to develop an integrated suite of computational modeling and simulation capabilities to quantitatively assess the long-term performance of waste forms in the engineered and geologic environments of a radioactive waste storage or disposal system. The Waste IPSC will provide this simulation capability (1) for a range of disposal concepts, waste form types, engineered repository designs, and geologic settings, (2) for a range of time scales and distances, (3) with appropriate consideration of the inherent uncertainties, and (4) in accordance with rigorous verification, validation, and software quality requirements. The gap analyses documented in this report were performed during an initial gap analysis to identify candidate codes and tools to support the development and integration of the Waste IPSC, and during follow-on activities that delved into more detailed assessments of the various codes that were acquired, studied, and tested. The current Waste IPSC strategy is to acquire and integrate the necessary Waste IPSC capabilities wherever feasible, and to develop only those capabilities that cannot be acquired or suitably integrated, verified, or validated. The gap analysis indicates that significant capabilities may already exist in existing THC codes, although there is no single code able to fully account for all physical and chemical processes involved in a waste disposal system. Large gaps exist in modeling chemical processes and their couplings with other processes. The coupling of chemical processes with flow, transport, and mechanical deformation remains challenging. The data for extreme environments (e.g., for elevated temperature and high ionic strength media) that are needed for repository modeling are severely lacking. In addition, most existing reactive transport codes were developed for non-radioactive contaminants, and they need to be adapted to account for radionuclide decay and in-growth. Access to the source codes is generally limited. Because the problems of interest for the Waste IPSC are likely to result in relatively large computational models, a compact memory-usage footprint and a fast, robust solution procedure will be needed. A robust massively parallel processing (MPP) capability will also be required to provide reasonable turnaround times on the analyses that will be performed with the code. A performance assessment (PA) calculation for a waste disposal system generally requires a large number (hundreds to thousands) of model simulations to quantify the effect of model parameter uncertainties on the predicted repository performance. A set of codes for a PA calculation must be sufficiently robust and fast in terms of code execution. A PA system as a whole must be able to provide multiple alternative models for a specific set of physical/chemical processes, so that users can choose various levels of modeling complexity based on their modeling needs. This requires PA codes, preferably, to be highly modularized. Most of the existing codes have difficulties meeting these requirements.
    Based on the gap analysis results, we have made the following recommendations for code selection and code development for the NEAMS Waste IPSC: (1) build fully coupled high-fidelity THCMBR codes using the existing SIERRA codes (e.g., ARIA and ADAGIO) and platform, (2) use DAKOTA to build an enhanced performance assessment system (EPAS), and (3) build a modular code architecture and key code modules for performance assessments. The key chemical calculation modules will be built by expanding the existing CANTERA capabilities as well as by extracting useful components from other existing codes.

  18. Computation of Thermally Perfect Compressible Flow Properties

    NASA Technical Reports Server (NTRS)

    Witte, David W.; Tatum, Kenneth E.; Williams, S. Blake

    1996-01-01

    A set of compressible flow relations for a thermally perfect, calorically imperfect gas is derived for a value of c(sub p) (specific heat at constant pressure) expressed as a polynomial function of temperature, and developed into a computer program referred to as the Thermally Perfect Gas (TPG) code. The code is available free from the NASA Langley Software Server at URL http://www.larc.nasa.gov/LSS. The code produces tables of compressible flow properties similar to those found in NACA Report 1135. Unlike the NACA Report 1135 tables, which are valid only in the calorically perfect temperature regime, the TPG code results are also valid in the thermally perfect, calorically imperfect temperature regime, giving the TPG code a considerably larger range of temperature application. The accuracy of the TPG code in the calorically perfect and in the thermally perfect, calorically imperfect temperature regimes is verified by comparisons with the methods of NACA Report 1135. The advantages of the TPG code over the thermally perfect, calorically imperfect method of NACA Report 1135 are its applicability to any type of gas (monatomic, diatomic, triatomic, or polyatomic) or any specified mixture of gases, its ease of use, and its tabulated results.
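
    The core idea, cp as a polynomial in T so that gamma = cp/(cp - R) varies with temperature, can be sketched in a few lines. The polynomial coefficients below are illustrative stand-ins chosen to look roughly like air, not the TPG code's curve fits:

        R_AIR = 287.05                                  # gas constant of air [J/(kg K)]
        CP_COEFFS = [1005.0, -5.0e-2, 1.0e-4]           # assumed cp(T) = sum c_i * T**i

        def cp(T):
            """Specific heat at constant pressure [J/(kg K)] from the polynomial fit."""
            return sum(c * T ** i for i, c in enumerate(CP_COEFFS))

        def gamma(T):
            """Temperature-dependent ratio of specific heats for a thermally perfect gas."""
            return cp(T) / (cp(T) - R_AIR)

        for T in (300.0, 1000.0, 2000.0):               # gamma falls as T rises
            print(T, round(gamma(T), 4))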

  19. Efficient genome-wide association in biobanks using topic modeling identifies multiple novel disease loci.

    PubMed

    McCoy, Thomas H; Castro, Victor M; Snapper, Leslie A; Hart, Kamber L; Perlis, Roy H

    2017-08-31

    Biobanks and national registries represent a powerful tool for genomic discovery, but rely on diagnostic codes that may be unreliable and fail to capture the relationship between related diagnoses. We developed an efficient means of conducting genome-wide association studies using combinations of diagnostic codes from electronic health records (EHR) for 10,845 participants in a biobanking program at two large academic medical centers. Specifically, we applied latent Dirichlet allocation to fit 50 disease topics based on diagnostic codes, then conducted genome-wide common-variant association for each topic. In sensitivity analysis, these results were contrasted with those obtained from traditional single-diagnosis phenome-wide association analysis, as well as those in which only a subset of diagnostic codes are included per topic. In meta-analysis across three biobank cohorts, we identified 23 disease-associated loci with p<1e-15, including previously associated autoimmune disease loci. In all cases, observed significant associations were of greater magnitude than for single phenome-wide diagnostic codes, and incorporation of less strongly-loading diagnostic codes enhanced association. This strategy provides a more efficient means of phenome-wide association in biobanks with coded clinical data.
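
    The topic-fitting step maps directly onto standard software. A sketch with synthetic count data (the study used real EHR diagnostic codes and its own pipeline) is:

        import numpy as np
        from sklearn.decomposition import LatentDirichletAllocation

        # Rows are patients, columns are diagnostic-code counts (synthetic stand-in data).
        rng = np.random.default_rng(0)
        X = rng.poisson(0.2, size=(500, 300))

        lda = LatentDirichletAllocation(n_components=50, random_state=0)
        theta = lda.fit_transform(X)        # per-patient loadings on 50 "disease topics"

        # each column of theta can then serve as a quantitative phenotype for GWAS
        print(theta.shape)                  # (500, 50)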

  20. Efficient Genome-wide Association in Biobanks Using Topic Modeling Identifies Multiple Novel Disease Loci

    PubMed Central

    McCoy, Thomas H; Castro, Victor M; Snapper, Leslie A; Hart, Kamber L; Perlis, Roy H

    2017-01-01

    Biobanks and national registries represent a powerful tool for genomic discovery, but rely on diagnostic codes that can be unreliable and fail to capture relationships between related diagnoses. We developed an efficient means of conducting genome-wide association studies using combinations of diagnostic codes from electronic health records for 10,845 participants in a biobanking program at two large academic medical centers. Specifically, we applied latent Dirichlet allocation to fit 50 disease topics based on diagnostic codes, then conducted a genome-wide common-variant association for each topic. In sensitivity analysis, these results were contrasted with those obtained from traditional single-diagnosis phenome-wide association analysis, as well as those in which only a subset of diagnostic codes were included per topic. In meta-analysis across three biobank cohorts, we identified 23 disease-associated loci with p < 1e-15, including previously associated autoimmune disease loci. In all cases, observed significant associations were of greater magnitude than single phenome-wide diagnostic codes, and incorporation of less strongly loading diagnostic codes enhanced association. This strategy provides a more efficient means of identifying phenome-wide associations in biobanks with coded clinical data. PMID:28861588

  1. Propel: A Discontinuous-Galerkin Finite Element Code for Solving the Reacting Navier-Stokes Equations

    NASA Astrophysics Data System (ADS)

    Johnson, Ryan; Kercher, Andrew; Schwer, Douglas; Corrigan, Andrew; Kailasanath, Kazhikathra

    2017-11-01

    This presentation focuses on the development of a Discontinuous Galerkin (DG) method for application to chemically reacting flows. The in-house code, called Propel, was developed by the Laboratory of Computational Physics and Fluid Dynamics at the Naval Research Laboratory. It was designed specifically for developing advanced multi-dimensional algorithms to run efficiently on new and innovative architectures such as GPUs. For these results, Propel solves for convection and diffusion simultaneously with detailed transport and thermodynamics. Chemistry is currently solved in a time-split approach using Strang-splitting with finite element DG time integration of chemical source terms. Results presented here show canonical unsteady reacting flow cases, such as co-flow and splitter plate, and we report performance for higher order DG on CPU and GPUs.
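
    The time-split treatment of chemistry can be sketched abstractly: a half step of the stiff source terms, a full step of transport, then another half step, which is second-order accurate in the step size. The operators below are toy stand-ins (linear decay, exact periodic advection), not Propel's DG discretization:

        import numpy as np

        def chemistry(u, dt, k=5.0):
            """Stand-in stiff source term, du/dt = -k u, integrated exactly."""
            return u * np.exp(-k * dt)

        def transport(u, dt, c=1.0, dx=0.01):
            """Stand-in convection: exact advection on a periodic grid."""
            return np.roll(u, int(round(c * dt / dx)))

        def strang_step(u, dt):
            u = chemistry(u, 0.5 * dt)      # reaction source terms, dt/2
            u = transport(u, dt)            # convection-diffusion, dt
            return chemistry(u, 0.5 * dt)   # reaction source terms, dt/2

        x = np.linspace(0.0, 1.0, 100, endpoint=False)
        u = np.exp(-((x - 0.3) / 0.05) ** 2)
        for _ in range(40):
            u = strang_step(u, 0.01)        # the pulse advects while decaying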

  2. Many human accelerated regions are developmental enhancers

    PubMed Central

    Capra, John A.; Erwin, Genevieve D.; McKinsey, Gabriel; Rubenstein, John L. R.; Pollard, Katherine S.

    2013-01-01

    The genetic changes underlying the dramatic differences in form and function between humans and other primates are largely unknown, although it is clear that gene regulatory changes play an important role. To identify regulatory sequences with potentially human-specific functions, we and others used comparative genomics to find non-coding regions conserved across mammals that have acquired many sequence changes in humans since divergence from chimpanzees. These regions are good candidates for performing human-specific regulatory functions. Here, we analysed the DNA sequence, evolutionary history, histone modifications, chromatin state and transcription factor (TF) binding sites of a combined set of 2649 non-coding human accelerated regions (ncHARs) and predicted that at least 30% of them function as developmental enhancers. We prioritized the predicted ncHAR enhancers using analysis of TF binding site gain and loss, along with the functional annotations and expression patterns of nearby genes. We then tested both the human and chimpanzee sequence for 29 ncHARs in transgenic mice, and found 24 novel developmental enhancers active in both species, 17 of which had very consistent patterns of activity in specific embryonic tissues. Of these ncHAR enhancers, five drove expression patterns suggestive of different activity for the human and chimpanzee sequence at embryonic day 11.5. The changes to human non-coding DNA in these ncHAR enhancers may modify the complex patterns of gene expression necessary for proper development in a human-specific manner and are thus promising candidates for understanding the genetic basis of human-specific biology. PMID:24218637

  3. Large liquid rocket engine transient performance simulation system

    NASA Technical Reports Server (NTRS)

    Mason, J. R.; Southwick, R. D.

    1989-01-01

    Phase 1 of the Rocket Engine Transient Simulation (ROCETS) program consists of seven technical tasks: architecture; system requirements; component and submodel requirements; submodel implementation; component implementation; submodel testing and verification; and subsystem testing and verification. These tasks were completed. Phase 2 of ROCETS consists of two technical tasks: Technology Test Bed Engine (TTBE) model data generation; and system testing verification. During this period, specific coding of the system processors was begun and the engineering representations of Phase 1 were expanded to produce a simple model of the TTBE. As the code was completed, some minor modifications to the system architecture, centering on the global variable common, GLOBVAR, were necessary to increase processor efficiency. The engineering modules completed during Phase 2 are: INJTOO - main injector; MCHBOO - main chamber; NOZLOO - nozzle thrust calculations; PBRNOO - preburner; PIPE02 - compressible flow without inertia; PUMPOO - polytropic pump; ROTROO - rotor torque balance/speed derivative; and TURBOO - turbine. Detailed documentation of these modules is in the Appendix. In addition to the engineering modules, several submodules were also completed. These submodules include combustion properties, component performance characteristics (maps), and specific utilities. Specific coding was begun on the system configuration processor. All functions necessary for multiple-module operation were completed, but the SOLVER implementation is still under development. This system, the Verification Checkout Facility (VCF), allows interactive comparison of module results to stored data and provides an intermediate checkout of the processor code. After validation using the VCF, the engineering modules and submodules were used to build a simple TTBE model.

  4. Guidelines for the Design, Fabrication, Testing, Installation and Operation of Srf Cavities

    NASA Astrophysics Data System (ADS)

    Theilacker, J.; Carter, H.; Foley, M.; Hurh, P.; Klebaner, A.; Krempetz, K.; Nicol, T.; Olis, D.; Page, T.; Peterson, T.; Pfund, P.; Pushka, D.; Schmitt, R.; Wands, R.

    2010-04-01

    Superconducting Radio-Frequency (SRF) cavities containing cryogens under pressure pose a potential rupture hazard to equipment and personnel. Generally, pressure vessels fall within the scope of the ASME Boiler and Pressure Vessel Code; however, the use of niobium as a material for SRF cavities is beyond the applicability of the Code. Fermilab developed a guideline to ensure sound engineering practices governing the design, fabrication, testing, installation and operation of SRF cavities. The objective of the guideline is to reduce hazards and to achieve a level of safety equivalent to that afforded by the ASME Code. The guideline addresses concerns specific to SRF cavities in the areas of materials, design and analysis, welding and brazing, pressure-relieving requirements, pressure testing and quality control.

  5. 7 CFR 1924.5 - Planning development work.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... cash to be furnished by the borrower, proceeds from cost sharing programs such as Agricultural...) Drawings, specifications, and estimates will fully describe the work. Technical data, tests, or engineering... building code. (i) Agricultural buildings that are not intended for human habitation are exempt from these...

  6. 7 CFR 1924.5 - Planning development work.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... cash to be furnished by the borrower, proceeds from cost sharing programs such as Agricultural...) Drawings, specifications, and estimates will fully describe the work. Technical data, tests, or engineering... building code. (i) Agricultural buildings that are not intended for human habitation are exempt from these...

  7. 7 CFR 1924.5 - Planning development work.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... cash to be furnished by the borrower, proceeds from cost sharing programs such as Agricultural...) Drawings, specifications, and estimates will fully describe the work. Technical data, tests, or engineering... building code. (i) Agricultural buildings that are not intended for human habitation are exempt from these...

  8. An Overview of the XGAM Code and Related Software for Gamma-ray Analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Younes, W.

    2014-11-13

    The XGAM spectrum-fitting code and associated software were developed specifically to analyze the complex gamma-ray spectra that can result from neutron-induced reactions. The XGAM code is designed to fit a spectrum over the entire available gamma-ray energy range as a single entity, in contrast to the more traditional piecewise approaches. This global-fit philosophy enforces background continuity as well as consistency between local and global behavior throughout the spectrum, and in a natural way. This report presents XGAM and the suite of programs built around it, with an emphasis on how they fit into an overall analysis methodology for complex gamma-ray data. An application to the analysis of time-dependent delayed gamma-ray yields from 235U fission is shown in order to showcase the codes and how they interact.

  9. Auditing Consistency and Usefulness of LOINC Use among Three Large Institutions - Using Version Spaces for Grouping LOINC Codes

    PubMed Central

    Lin, M.C.; Vreeman, D.J.; Huff, S.M.

    2012-01-01

    Objectives We wanted to develop a method for evaluating the consistency and usefulness of LOINC code use across different institutions, and to evaluate the degree of interoperability that can be attained when using LOINC codes for laboratory data exchange. Our specific goals were to: 1) Determine if any contradictory knowledge exists in LOINC. 2) Determine how many LOINC codes were used in a truly interoperable fashion between systems. 3) Provide suggestions for improving the semantic interoperability of LOINC. Methods We collected Extensional Definitions (EDs) of LOINC usage from three institutions. The version space approach was used to divide LOINC codes into small sets, which made auditing of LOINC use across the institutions feasible. We then compared pairings of LOINC codes from the three institutions for consistency and usefulness. Results The numbers of LOINC codes evaluated were 1,917, 1,267 and 1,693, as obtained from ARUP, Intermountain and Regenstrief, respectively. There were 2,022, 2,030, and 2,301 version spaces among ARUP & Intermountain, Intermountain & Regenstrief and ARUP & Regenstrief, respectively. Using the EDs as the gold standard, there were 104, 109 and 112 pairs containing contradictory knowledge, and there were 1,165, 765 and 1,121 semantically interoperable pairs. The interoperable pairs were classified into three levels: 1) Level I – No loss of meaning, complete information was exchanged by identical codes. 2) Level II – No loss of meaning, but processing of data was needed to make the data completely comparable. 3) Level III – Some loss of meaning. For example, tests with a specific ‘method’ could be rolled-up with tests that were ‘methodless’. Conclusions There are variations in the way LOINC is used for data exchange that result in some data not being truly interoperable across different enterprises. To improve its semantic interoperability, we need to detect and correct any contradictory knowledge within LOINC and add computable relationships that can be used for making reliable inferences about the data. The LOINC committee should also provide detailed guidance on best practices for mapping from local codes to LOINC codes and for using LOINC codes in data exchange. PMID:22306382

  10. Method and computer program product for maintenance and modernization backlogging

    DOEpatents

    Mattimore, Bernard G; Reynolds, Paul E; Farrell, Jill M

    2013-02-19

    According to one embodiment, a computer program product for determining future facility conditions includes a computer readable medium having computer readable program code stored therein. The computer readable program code includes computer readable program code for calculating a time period specific maintenance cost, for calculating a time period specific modernization factor, and for calculating a time period specific backlog factor. Future facility conditions equal the time period specific maintenance cost plus the time period specific modernization factor plus the time period specific backlog factor. In another embodiment, a computer-implemented method for calculating future facility conditions includes calculating a time period specific maintenance cost, calculating a time period specific modernization factor, and calculating a time period specific backlog factor. Future facility conditions equal the time period specific maintenance cost plus the time period specific modernization factor plus the time period specific backlog factor. Other embodiments are also presented.
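
    The claimed relation is a straightforward sum, which the following sketch makes concrete; the function and argument names are illustrative, not from the patent text.

    ```python
    # Worked sketch of the stated relation: future facility conditions equal
    # the sum of three time-period-specific terms. Values below are invented.
    def future_facility_condition(maintenance_cost: float,
                                  modernization_factor: float,
                                  backlog_factor: float) -> float:
        """Future conditions = maintenance cost + modernization factor + backlog factor."""
        return maintenance_cost + modernization_factor + backlog_factor

    # One illustrative planning period:
    print(future_facility_condition(1.2e6, 3.5e5, 4.0e5))  # 1950000.0
    ```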

  11. [Global aspects of medical ethics: conditions and possibilities].

    PubMed

    Neitzke, G

    2001-01-01

    A global or universal code of medical ethics seems paradoxical in the era of pluralism and postmodernism. A different conception of globalisation will be developed in terms of a "procedural universality". According to this philosophical concept, a code of medical ethics does not oblige physicians to accept certain specific, preset, universal values and rules. It rather obliges every culture and society to start a culture-sensitive, continuous, and active discourse on specific issues, mentioned in the codex. This procedure might result in regional, intra-cultural consensus, which should be presented to an inter-cultural dialogue. To exemplify this procedure, current topics of medical ethics (spiritual foundations of medicine, autonomy, definitions concerning life and death, physicians' duties, conduct within therapeutic teams) will be discussed from the point of view of western medicine.

  12. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Peck, T; Sparkman, D; Storch, N

    ''The LLNL Site-Specific Advanced Simulation and Computing (ASCI) Software Quality Engineering Recommended Practices VI.I'' document describes a set of recommended software quality engineering (SQE) practices for ASCI code projects at Lawrence Livermore National Laboratory (LLNL). In this context, SQE is defined as the process of building quality into software products by applying the appropriate guiding principles and management practices. Continual code improvement and ongoing process improvement are expected benefits. Certain practices are recommended, although projects may select the specific activities they wish to improve, and the appropriate time lines for such actions. Additionally, projects can rely on the guidance of this document when generating ASCI Verification and Validation (V&V) deliverables. ASCI program managers will gather information about their software engineering practices and improvement. This information can be shared to leverage the best SQE practices among development organizations. It will further be used to ensure the currency and vitality of the recommended practices. This Overview is intended to provide basic information to the LLNL ASCI software management and development staff from the ''LLNL Site-Specific ASCI Software Quality Engineering Recommended Practices VI.I'' document. Additionally, the Overview provides steps to using the ''LLNL Site-Specific ASCI Software Quality Engineering Recommended Practices VI.I'' document. For definitions of terminology and acronyms, refer to the Glossary and Acronyms sections in the ''LLNL Site-Specific ASCI Software Quality Engineering Recommended Practices VI.I''.

  13. Reformation of Regulatory Technical Standards for Nuclear Power Generation Equipments in Japan

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mikio Kurihara; Masahiro Aoki; Yu Maruyama

    2006-07-01

    Comprehensive reformation of the regulatory system has been introduced in Japan in order to apply recent technical progress in a timely manner. 'The Technical Standards for Nuclear Power Generation Equipments', known as the Ordinance No. 62 of the Ministry of International Trade and Industry, which is used for the detailed design, construction and operating stages of Nuclear Power Plants, was modified to performance specifications, with the consensus codes and standards being used as prescriptive specifications, in order to facilitate prompt review of the Ordinance in response to technological innovation. The modification was performed by the Nuclear and Industrial Safety Agency (NISA), the regulatory body in Japan, with support of the Japan Nuclear Energy Safety Organization (JNES), a technical support organization. The revised Ordinance No. 62 was issued on July 1, 2005 and has been enforced since January 1, 2006. During the period from issuance to enforcement, JNES prepared an enforceable regulatory guide that complies with each provision of the Ordinance No. 62, and also made technical assessments to endorse the applicability of consensus codes and standards, in response to NISA's request. Some consensus codes and standards were re-assessed since they were already used in regulatory review of the construction plans submitted by licensees. Other consensus codes and standards were newly assessed for endorsement. In cases where proper consensus codes or standards were not available, details of regulatory requirements were described in the regulatory guide as immediate measures; at the same time, appropriate standards-developing bodies were requested to prepare those consensus codes or standards. A supplementary note providing background information on the modification, applicable examples, etc. was prepared for the convenience of users of the Ordinance No. 62. This paper describes the modification activities and their results, following NISA's presentation at ICONE-13, which introduced the framework of the performance specifications and the modification process of the Ordinance No. 62. (authors)

  14. Identifying Pediatric Severe Sepsis and Septic Shock: Accuracy of Diagnosis Codes.

    PubMed

    Balamuth, Fran; Weiss, Scott L; Hall, Matt; Neuman, Mark I; Scott, Halden; Brady, Patrick W; Paul, Raina; Farris, Reid W D; McClead, Richard; Centkowski, Sierra; Baumer-Mouradian, Shannon; Weiser, Jason; Hayes, Katie; Shah, Samir S; Alpern, Elizabeth R

    2015-12-01

    To evaluate accuracy of 2 established administrative methods of identifying children with sepsis using a medical record review reference standard. Multicenter retrospective study at 6 US children's hospitals. Subjects were children >60 days to <19 years of age and were identified in 4 groups based on International Classification of Diseases, Ninth Revision, Clinical Modification codes: (1) severe sepsis/septic shock (sepsis codes); (2) infection plus organ dysfunction (combination codes); (3) subjects without codes for infection, organ dysfunction, or severe sepsis; and (4) infection but not severe sepsis or organ dysfunction. Combination codes were allowed, but not required, within the sepsis codes group. We determined the presence of reference standard severe sepsis according to consensus criteria. Logistic regression was performed to determine whether addition of codes for sepsis therapies improved case identification. A total of 130 out of 432 subjects met the reference standard for severe sepsis. Sepsis codes had sensitivity 73% (95% CI 70-86), specificity 92% (95% CI 87-95), and positive predictive value 79% (95% CI 70-86). Combination codes had sensitivity 15% (95% CI 9-22), specificity 71% (95% CI 65-76), and positive predictive value 18% (95% CI 11-27). Slight improvements in model characteristics were observed when codes for vasoactive medications and endotracheal intubation were added to sepsis codes (c-statistic 0.83 vs 0.87, P = .008). Sepsis-specific International Classification of Diseases, Ninth Revision, Clinical Modification codes identify pediatric patients with severe sepsis in administrative data more accurately than a combination of codes for infection plus organ dysfunction. Copyright © 2015 Elsevier Inc. All rights reserved.
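
    For readers unfamiliar with the reported test characteristics, the following sketch shows the standard definitions computed from a 2x2 confusion matrix. The counts are illustrative values chosen to roughly reproduce the sepsis-code results above (130 true cases among 432 subjects), not the study's actual tabulation.

    ```python
    # Standard definitions behind the reported metrics, computed from
    # true/false positives and negatives of code-based case identification
    # versus the medical-record reference standard.
    def test_characteristics(tp: int, fp: int, fn: int, tn: int) -> dict:
        return {
            "sensitivity": tp / (tp + fn),  # fraction of true cases flagged by codes
            "specificity": tn / (tn + fp),  # fraction of non-cases correctly unflagged
            "ppv": tp / (tp + fp),          # fraction of flagged cases that are true cases
        }

    # Illustrative counts: 95/130 true cases flagged, 25 false positives.
    print(test_characteristics(tp=95, fp=25, fn=35, tn=277))
    # {'sensitivity': 0.73..., 'specificity': 0.91..., 'ppv': 0.79...}
    ```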

  15. H-division quarterly report, October--December 1977. [Lawrence Livermore Laboratory

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    1978-02-10

    The Theoretical EOS Group develops theoretical techniques for describing material properties under extreme conditions and constructs equation-of-state (EOS) tables for specific applications. Work this quarter concentrated on a Li equation of state, equation of state for equilibrium plasma, improved ion corrections to the Thomas--Fermi--Kirzhnitz theory, and theoretical estimates of high-pressure melting in metals. The Experimental Physics Group investigates properties of materials at extreme conditions of pressure and temperature, and develops new experimental techniques. Effort this quarter concerned the following: parabolic projectile distortion in the two-stage light-gas gun, construction of a ballistic range for long-rod penetrators, thermodynamics and sound velocities in liquid metals, isobaric expansion measurements in Pt, and calculation of the velocity--mass profile of a jet produced by a shaped charge. Code development was concentrated on the PELE code, a multimaterial, multiphase, explicit finite-difference Eulerian code for pool suppression dynamics of a hypothetical loss-of-coolant accident in a nuclear reactor. Activities of the Fluid Dynamics Group were directed toward development of a code to compute the equations of state and transport properties of liquid metals (e.g. Li) and partially ionized dense plasmas, jet stability in the Li reactor system, and the study and problem application of fluid dynamic turbulence theory. 19 figures, 5 tables. (RWR)

  16. Specific and Modular Binding Code for Cytosine Recognition in Pumilio/FBF (PUF) RNA-binding Domains

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dong, Shuyun; Wang, Yang; Cassidy-Amstutz, Caleb

    2011-10-28

    Pumilio/fem-3 mRNA-binding factor (PUF) proteins possess a recognition code for bases A, U, and G, allowing designed RNA sequence specificity of their modular Pumilio (PUM) repeats. However, recognition side chains in a PUM repeat for cytosine are unknown. Here we report identification of a cytosine-recognition code by screening random amino acid combinations at conserved RNA recognition positions using a yeast three-hybrid system. This C-recognition code is specific and modular as specificity can be transferred to different positions in the RNA recognition sequence. A crystal structure of a modified PUF domain reveals specific contacts between an arginine side chain and the cytosine base. We applied the C-recognition code to design PUF domains that recognize targets with multiple cytosines and to generate engineered splicing factors that modulate alternative splicing. Finally, we identified a divergent yeast PUF protein, Nop9p, that may recognize natural target RNAs with cytosine. This work deepens our understanding of natural PUF protein target recognition and expands the ability to engineer PUF domains to recognize any RNA sequence.

  17. Development of a Grid-Based Gyro-Kinetic Simulation Code

    NASA Astrophysics Data System (ADS)

    Lapillonne, Xavier; Brunetti, Maura; Tran, Trach-Minh; Brunner, Stephan

    2006-10-01

    A grid-based semi-Lagrangian code using cubic spline interpolation is being developed at CRPP for solving the electrostatic drift-kinetic equations [M. Brunetti et al., Comp. Phys. Comm. 163, 1 (2004)] in a cylindrical system. This 4-dim code, CYGNE, is part of a project with the long-term aim of studying microturbulence in toroidal fusion devices, in the more general frame of gyro-kinetic equations. As they approach the non-linear phase, simulations from this code are subject to significant overshoot problems, reflected by the development of negative-value regions of the distribution function, which leads to bad energy conservation. This has motivated the study of alternative schemes. On the one hand, new time integration algorithms are considered in the semi-Lagrangian frame. On the other hand, fully Eulerian schemes, which separate time and space discretisation (method of lines), are investigated. In particular, the Essentially Non Oscillatory (ENO) approach, constructed so as to minimize the overshoot problem, has been considered. All these methods have first been tested in the simpler case of the 2-dim guiding-center model for the Kelvin-Helmholtz instability, which makes it possible to address the specific issue of the E x B drift also met in the more complex gyrokinetic-type equations. Based on these preliminary studies, the most promising methods are being implemented and tested in CYGNE.
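
    As a point of reference, the basic scheme the abstract describes, semi-Lagrangian advection with cubic-spline interpolation, can be sketched in one dimension as follows. This is an illustration of the method under simplified assumptions (periodic 1D advection, constant velocity), not code from CYGNE; overshoot near steep gradients is the artifact that motivates ENO-type alternatives.

    ```python
    # One semi-Lagrangian step for 1D periodic advection f_t + u f_x = 0:
    # trace characteristics back to departure points, then interpolate f
    # there with a periodic cubic spline.
    import numpy as np
    from scipy.interpolate import CubicSpline

    def semi_lagrangian_step(f, x, u, dt, L):
        # Periodic spline needs the endpoint value repeated at x = L.
        spline = CubicSpline(np.append(x, L), np.append(f, f[0]), bc_type="periodic")
        x_departure = (x - u * dt) % L   # departure points of the characteristics
        return spline(x_departure)

    L = 2 * np.pi
    x = np.linspace(0.0, L, 128, endpoint=False)
    f = np.exp(-10 * (x - np.pi) ** 2)   # initial Gaussian pulse
    f_new = semi_lagrangian_step(f, x, u=1.0, dt=0.05, L=L)
    ```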

  18. Divergent transcription is associated with promoters of transcriptional regulators

    PubMed Central

    2013-01-01

    Background Divergent transcription is a widespread phenomenon in mammals. For instance, short bidirectional transcripts are a hallmark of active promoters, while longer transcripts can be detected antisense from active genes in conditions where the RNA degradation machinery is inhibited. Moreover, many described long non-coding RNAs (lncRNAs) are transcribed antisense from coding gene promoters. However, the general significance of divergent lncRNA/mRNA gene pair transcription is still poorly understood. Here, we used strand-specific RNA-seq with high sequencing depth to thoroughly identify antisense transcripts from coding gene promoters in primary mouse tissues. Results We found that a substantial fraction of coding-gene promoters sustain divergent transcription of long non-coding RNA (lncRNA)/mRNA gene pairs. Strikingly, upstream antisense transcription is significantly associated with genes related to transcriptional regulation and development. Their promoters share several characteristics with those of transcriptional developmental genes, including very large CpG islands, a high degree of conservation and epigenetic regulation in ES cells. In-depth analysis revealed a unique GC skew profile at these promoter regions, while the associated coding genes were found to have large first exons, two genomic features that might enforce bidirectional transcription. Finally, genes associated with antisense transcription harbor specific H3K79me2 epigenetic marking and RNA polymerase II enrichment profiles linked to an intensified rate of early transcriptional elongation. Conclusions We concluded that promoters of a class of transcription regulators are characterized by a specialized transcriptional control mechanism, which is directly coupled to relaxed bidirectional transcription. PMID:24365181

  19. The Optimizer Topology Characteristics in Seismic Hazards

    NASA Astrophysics Data System (ADS)

    Sengor, T.

    2015-12-01

    The characteristic data of natural phenomena are examined in a topological-space approach to determine whether an algorithm underlies them that brings the physics of the phenomena to optimized states, even when they are hazards. The optimized code designing the hazard on a topological structure meshes the metric of the phenomena. Deviations in the metric of different phenomena push and/or pull the folds of other suitable phenomena; for example, the metric of a specific phenomenon A may fit the metric of another phenomenon B after variation processes generated by the deviation of the metric of phenomenon A. Manifold processes covering the metric characteristics of each and every phenomenon can be defined for all physical events, i.e., natural hazards. Within these manifold groups there are suitable folds such that each subfold fits the metric characteristics of at least one natural-hazard category. Certain variation algorithms on these metric structures produce a gauge effect that yields the long-term stability of the Earth over largely scaled periods. The realization of that stability depends on specific conditions, which are called optimized codes. The analytical basics of processes in topological structures are developed in [1]. The codes are generated according to the structures in [2]. Some optimized codes are derived for the seismicity of the NAF, beginning from the earthquakes of 1999. References: 1. Taner SENGOR, "Topological theory and analytical configuration for a universal community model," Procedia - Social and Behavioral Sciences, Vol. 81, pp. 188-194, 28 June 2013. 2. Taner SENGOR, "Seismic-Climatic-Hazardous Events Estimation Processes via the Coupling Structures in Conserving Energy Topologies of the Earth," The 2014 AGU Fall Meeting, Abstract no. 31374, USA.

  20. Researcher Perceptions of Ethical Guidelines and Codes of Conduct

    PubMed Central

    Giorgini, Vincent; Mecca, Jensen T.; Gibson, Carter; Medeiros, Kelsey; Mumford, Michael D.; Connelly, Shane; Devenport, Lynn D.

    2014-01-01

    Ethical codes of conduct exist in almost every profession. Field-specific codes of conduct have been around for decades, each articulating specific ethical and professional guidelines. However, there has been little empirical research on researchers’ perceptions of these codes of conduct. In the present study, we interviewed faculty members in six research disciplines and identified five themes bearing on the circumstances under which they use ethical guidelines and the underlying reasons for not adhering to such guidelines. We then identify problems with the manner in which codes of conduct in academia are constructed and offer solutions for overcoming these problems. PMID:25635845

  1. The help of simulation codes in designing waste assay systems using neutron measurement methods: Application to the alpha low level waste assay system PROMETHEE 6

    NASA Astrophysics Data System (ADS)

    Mariani, A.; Passard, C.; Jallu, F.; Toubon, H.

    2003-11-01

    The design of a specific nuclear assay system for a dedicated application begins with a development phase, which relies on information from the literature or on knowledge resulting from experience, and on specific experimental verifications. The latter may require experimental devices that can be restrictive in terms of schedule, cost and safety. One way generally chosen to bypass these difficulties is to use simulation codes to study particular aspects. This paper deals with the potentialities offered by simulation in the case of a passive-active neutron (PAN) assay system for alpha low-level waste characterization; this system was developed at the Nuclear Measurements Development Laboratory of the French Atomic Energy Commission. Due to the high number of parameters to be taken into account in its development, it is a particularly sophisticated example. Since the PAN assay system, called PROMETHEE (prompt epithermal and thermal interrogation experiment), must have a detection efficiency of more than 20% and preserve a high level of modularity for various applications, an improved version has been studied using the MCNP4 (Monte Carlo N-Particle) transport code. Parameters such as the dimensions of the assay system, of the cavity and of the detection blocks, and the thicknesses of the nuclear materials of neutronic interest have been optimised. As a result, the number of necessary experiments was reduced.

  2. Growth of multi-component alloy films with controlled graded chemical composition on sub-nanometer scale

    DOEpatents

    Bajt, Sasa; Vernon, Stephen P.

    2005-03-15

    The chemical composition of thin films is modulated during their growth. A computer code has been developed to design specific processes for producing a desired chemical composition for various deposition geometries. Good agreement between theoretical and experimental results was achieved.

  3. Institute for Defense Analysis. Annual Report 1995.

    DTIC Science & Technology

    1995-01-01

    staff have been involved in the community-wide development of MPI as well as in its application to specific NSA problems. Parallel Groebner Basis Code, Symbolic Computing on Parallel Machines: the Groebner basis method is a set of algorithms for reformulating very complex algebraic expressions.

  4. HAL/S-FC compiler system functional specification

    NASA Technical Reports Server (NTRS)

    1974-01-01

    Compiler organization is discussed, including overall compiler structure, internal data transfer, compiler development, and code optimization. The user, system, and SDL interfaces are described, along with compiler system requirements. The run-time software support package, as well as restrictions and dependencies of the HAL/S-FC system, are also considered.

  5. EVA - A Textual Data Processing Tool.

    ERIC Educational Resources Information Center

    Jakopin, Primoz

    EVA, a text processing tool designed to be self-contained and useful for a variety of languages, is described briefly, and its extensive coded character set is illustrated. Features, specifications, and database functions are noted. Its application in development of a Slovenian literary dictionary is also described. (MSE)

  6. Xyce Parallel Electronic Simulator : users' guide, version 2.0.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hoekstra, Robert John; Waters, Lon J.; Rankin, Eric Lamont

    2004-06-01

    This manual describes the use of the Xyce Parallel Electronic Simulator. Xyce has been designed as a SPICE-compatible, high-performance analog circuit simulator capable of simulating electrical circuits at a variety of abstraction levels. Primarily, Xyce has been written to support the simulation needs of the Sandia National Laboratories electrical designers. This development has focused on improving capability over the current state-of-the-art in the following areas: (1) capability to solve extremely large circuit problems by supporting large-scale parallel computing platforms (up to thousands of processors), including support for most popular parallel and serial computers; (2) improved performance for all numerical kernels (e.g., time integrator, nonlinear and linear solvers) through state-of-the-art algorithms and novel techniques; (3) device models which are specifically tailored to meet Sandia's needs, including many radiation-aware devices; (4) a client-server or multi-tiered operating model wherein the numerical kernel can operate independently of the graphical user interface (GUI); and (5) object-oriented code design and implementation using modern coding practices that ensure that the Xyce Parallel Electronic Simulator will be maintainable and extensible far into the future. Xyce is a parallel code in the most general sense of the phrase, a message-passing parallel implementation, which allows it to run efficiently on the widest possible range of computing platforms, including serial, shared-memory and distributed-memory parallel as well as heterogeneous platforms. Careful attention has been paid to the specific nature of circuit-simulation problems to ensure that optimal parallel efficiency is achieved as the number of processors grows. One feature required by designers is the ability to add device models, many specific to the needs of Sandia, to the code. To this end, the device package in the Xyce Parallel Electronic Simulator is designed to support a variety of device model inputs. These input formats include standard analytical models, behavioral models, look-up tables, and mesh-level PDE device models. Combined with this flexible interface is an architectural design that greatly simplifies the addition of circuit models. One of the most important features of Xyce is in providing a platform for computational research and development aimed specifically at the needs of the Laboratory. With Xyce, Sandia now has an 'in-house' capability with which both new electrical (e.g., device model development) and algorithmic (e.g., faster time-integration methods) research and development can be performed. Ultimately, these capabilities are migrated to end users.

  7. LArSoft: toolkit for simulation, reconstruction and analysis of liquid argon TPC neutrino detectors

    NASA Astrophysics Data System (ADS)

    Snider, E. L.; Petrillo, G.

    2017-10-01

    LArSoft is a set of detector-independent software tools for the simulation, reconstruction and analysis of data from liquid argon (LAr) neutrino experiments. The common features of LAr time projection chambers (TPCs) enable sharing of algorithm code across detectors of very different size and configuration. LArSoft is currently used in production simulation and reconstruction by the ArgoNeuT, DUNE, LArIAT, MicroBooNE, and SBND experiments. The software suite offers a wide selection of algorithms and utilities, including those for associated photo-detectors and the handling of auxiliary detectors outside the TPCs. Available algorithms cover the full range of simulation and reconstruction, from raw waveforms to high-level reconstructed objects, event topologies and classification. The common code within LArSoft is contributed by adopting experiments, which also provide detector-specific geometry descriptions, and code for the treatment of electronic signals. LArSoft is also a collaboration of experiments, Fermilab and associated software projects which cooperate in setting requirements, priorities, and schedules. In this talk, we outline the general architecture of the software and the interaction with external libraries and detector-specific code. We also describe the dynamics of LArSoft software development between the contributing experiments, the projects supporting the software infrastructure LArSoft relies on, and the core LArSoft support project.

  8. The Mystery Behind the Code: Differentiated Instruction with Quick Response Codes in Secondary Physical Education

    ERIC Educational Resources Information Center

    Adkins, Megan; Wajciechowski, Misti R.; Scantling, Ed

    2013-01-01

    Quick response codes, better known as QR codes, are small barcodes scanned to receive information about a specific topic. This article explains QR code technology and the utility of QR codes in the delivery of physical education instruction. Consideration is given to how QR codes can be used to accommodate learners of varying ability levels as…

  9. Comprehensive Identification of Long Non-coding RNAs in Purified Cell Types from the Brain Reveals Functional LncRNA in OPC Fate Determination.

    PubMed

    Dong, Xiaomin; Chen, Kenian; Cuevas-Diaz Duran, Raquel; You, Yanan; Sloan, Steven A; Zhang, Ye; Zong, Shan; Cao, Qilin; Barres, Ben A; Wu, Jia Qian

    2015-12-01

    Long non-coding RNAs (lncRNAs) (> 200 bp) play crucial roles in transcriptional regulation during numerous biological processes. However, it is challenging to comprehensively identify lncRNAs, because they are often expressed at low levels and with more cell-type specificity than are protein-coding genes. In the present study, we performed ab initio transcriptome reconstruction using eight purified cell populations from mouse cortex and detected more than 5000 lncRNAs. Predicting the functions of lncRNAs using cell-type specific data revealed their potential functional roles in Central Nervous System (CNS) development. We performed motif searches in ENCODE DNase I digital footprint data and Mouse ENCODE promoters to infer transcription factor (TF) occupancy. By integrating TF binding and cell-type specific transcriptomic data, we constructed a novel framework that is useful for systematically identifying lncRNAs that are potentially essential for brain cell fate determination. Based on this integrative analysis, we identified lncRNAs that are regulated during Oligodendrocyte Precursor Cell (OPC) differentiation from Neural Stem Cells (NSCs) and that are likely to be involved in oligodendrogenesis. The top candidate, lnc-OPC, shows highly specific expression in OPCs and remarkable sequence conservation among placental mammals. Interestingly, lnc-OPC is significantly up-regulated in glial progenitors from experimental autoimmune encephalomyelitis (EAE) mouse models compared to wild-type mice. OLIG2-binding sites in the upstream regulatory region of lnc-OPC were identified by ChIP (chromatin immunoprecipitation)-Sequencing and validated by luciferase assays. Loss-of-function experiments confirmed that lnc-OPC plays a functional role in OPC genesis. Overall, our results substantiated the role of lncRNA in OPC fate determination and provided an unprecedented data source for future functional investigations in CNS cell types. We present our datasets and analysis results via the interactive genome browser at our laboratory website that is freely accessible to the research community. This is the first lncRNA expression database of collective populations of glia, vascular cells, and neurons. We anticipate that these studies will advance the knowledge of this major class of non-coding genes and their potential roles in neurological development and diseases.

  10. Error-correction coding

    NASA Technical Reports Server (NTRS)

    Hinds, Erold W. (Principal Investigator)

    1996-01-01

    This report describes the progress made towards the completion of a specific task on error-correcting coding. The proposed research consisted of investigating the use of modulation block codes as the inner code of a concatenated coding system in order to improve the overall space link communications performance. The study proposed to identify and analyze candidate codes that will complement the performance of the overall coding system which uses the interleaved RS (255,223) code as the outer code.
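
    As a quick orientation to the rate bookkeeping in such a concatenated system: the outer RS(255,223) code has rate 223/255, and the overall rate is the product of the inner and outer rates. The inner-code rate below is an assumed placeholder, since the candidate inner modulation block codes are not specified here.

    ```python
    # Back-of-the-envelope sketch: overall rate of a concatenated system with
    # the interleaved RS(255,223) outer code named in the abstract.
    rs_n, rs_k = 255, 223
    outer_rate = rs_k / rs_n        # ~0.8745 for RS(255,223)
    inner_rate = 0.5                # assumed example inner-code rate (placeholder)
    overall_rate = outer_rate * inner_rate
    print(f"outer={outer_rate:.4f}, overall={overall_rate:.4f}")
    ```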

  11. Xyce parallel electronic simulator users guide, version 6.1

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Keiter, Eric R; Mei, Ting; Russo, Thomas V.

    This manual describes the use of the Xyce Parallel Electronic Simulator. Xyce has been designed as a SPICE-compatible, high-performance analog circuit simulator, and has been written to support the simulation needs of the Sandia National Laboratories electrical designers. This development has focused on improving capability over the current state-of-the-art in the following areas: capability to solve extremely large circuit problems by supporting large-scale parallel computing platforms (up to thousands of processors), including support for most popular parallel and serial computers; a differential-algebraic-equation (DAE) formulation, which better isolates the device model package from solver algorithms and allows one to develop new types of analysis without requiring the implementation of analysis-specific device models; device models that are specifically tailored to meet Sandia's needs, including some radiation-aware devices (for Sandia users only); and object-oriented code design and implementation using modern coding practices. Xyce is a parallel code in the most general sense of the phrase, a message-passing parallel implementation, which allows it to run efficiently on a wide range of computing platforms. These include serial, shared-memory and distributed-memory parallel platforms. Attention has been paid to the specific nature of circuit-simulation problems to ensure that optimal parallel efficiency is achieved as the number of processors grows.

  12. Xyce parallel electronic simulator users' guide, Version 6.0.1.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Keiter, Eric R; Mei, Ting; Russo, Thomas V.

    This manual describes the use of the Xyce Parallel Electronic Simulator. Xyce has been designed as a SPICE-compatible, high-performance analog circuit simulator, and has been written to support the simulation needs of the Sandia National Laboratories electrical designers. This development has focused on improving capability over the current state-of-the-art in the following areas: capability to solve extremely large circuit problems by supporting large-scale parallel computing platforms (up to thousands of processors), including support for most popular parallel and serial computers; a differential-algebraic-equation (DAE) formulation, which better isolates the device model package from solver algorithms and allows one to develop new types of analysis without requiring the implementation of analysis-specific device models; device models that are specifically tailored to meet Sandia's needs, including some radiation-aware devices (for Sandia users only); and object-oriented code design and implementation using modern coding practices. Xyce is a parallel code in the most general sense of the phrase, a message-passing parallel implementation, which allows it to run efficiently on a wide range of computing platforms. These include serial, shared-memory and distributed-memory parallel platforms. Attention has been paid to the specific nature of circuit-simulation problems to ensure that optimal parallel efficiency is achieved as the number of processors grows.

  13. Xyce parallel electronic simulator users guide, version 6.0.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Keiter, Eric R; Mei, Ting; Russo, Thomas V.

    This manual describes the use of the Xyce Parallel Electronic Simulator. Xyce has been designed as a SPICE-compatible, high-performance analog circuit simulator, and has been written to support the simulation needs of the Sandia National Laboratories electrical designers. This development has focused on improving capability over the current state-of-the-art in the following areas: capability to solve extremely large circuit problems by supporting large-scale parallel computing platforms (up to thousands of processors), including support for most popular parallel and serial computers; a differential-algebraic-equation (DAE) formulation, which better isolates the device model package from solver algorithms and allows one to develop new types of analysis without requiring the implementation of analysis-specific device models; device models that are specifically tailored to meet Sandia's needs, including some radiation-aware devices (for Sandia users only); and object-oriented code design and implementation using modern coding practices. Xyce is a parallel code in the most general sense of the phrase, a message-passing parallel implementation, which allows it to run efficiently on a wide range of computing platforms. These include serial, shared-memory and distributed-memory parallel platforms. Attention has been paid to the specific nature of circuit-simulation problems to ensure that optimal parallel efficiency is achieved as the number of processors grows.

  14. Decoding the ubiquitous role of microRNAs in neurogenesis.

    PubMed

    Nampoothiri, Sreekala S; Rajanikant, G K

    2017-04-01

    Neurogenesis generates fledgling neurons that mature to form an intricate neuronal circuitry. The long-standing controversy over adult neurogenesis was largely resolved in the past decade, and the field became one of the most explored domains for identifying multifaceted mechanisms bridging neurodevelopment and neuropathology. Neurogenesis encompasses multiple processes including neural stem cell proliferation, neuronal differentiation, and cell fate determination. Each neurogenic process is specifically governed by manifold signaling pathways, several growth factors, and coding and non-coding RNAs. A class of small non-coding RNAs, microRNAs (miRNAs), is ubiquitously expressed in the brain and has emerged as a potent regulator of neurogenesis. It functions by fine-tuning the expression of specific neurogenic gene targets at the post-transcriptional level and modulates the development of mature neurons from neural progenitor cells. Besides the commonly discussed intrinsic factors, neuronal morphogenesis is also under the control of several extrinsic temporal cues, which in turn are regulated by miRNAs. This review sheds light on the Dicer-controlled switch from neurogenesis to gliogenesis, miRNA regulation of neuronal maturation, and the differential expression of miRNAs in response to various extrinsic cues affecting neurogenesis.

  15. Identification and functional analysis of long non-coding RNAs in human and mouse early embryos based on single-cell transcriptome data

    PubMed Central

    Qiu, Jia-jun; Ren, Zhao-rui; Yan, Jing-bin

    2016-01-01

    Epigenetics regulations have an important role in fertilization and proper embryonic development, and several human diseases are associated with epigenetic modification disorders, such as Rett syndrome, Beckwith-Wiedemann syndrome and Angelman syndrome. However, the dynamics and functions of long non-coding RNAs (lncRNAs), one type of epigenetic regulators, in human pre-implantation development have not yet been demonstrated. In this study, a comprehensive analysis of human and mouse early-stage embryonic lncRNAs was performed based on public single-cell RNA sequencing data. Expression profile analysis revealed that lncRNAs are expressed in a developmental stage–specific manner during human early-stage embryonic development, whereas a more temporal-specific expression pattern was identified in mouse embryos. Weighted gene co-expression network analysis suggested that lncRNAs involved in human early-stage embryonic development are associated with several important functions and processes, such as oocyte maturation, zygotic genome activation and mitochondrial functions. We also found that the network of lncRNAs involved in zygotic genome activation was highly preservative between human and mouse embryos, whereas in other stages no strong correlation between human and mouse embryo was observed. This study provides insight into the molecular mechanism underlying lncRNA involvement in human pre-implantation embryonic development. PMID:27542205

  16. Novel numerical techniques for magma dynamics

    NASA Astrophysics Data System (ADS)

    Rhebergen, S.; Katz, R. F.; Wathen, A.; Alisic, L.; Rudge, J. F.; Wells, G.

    2013-12-01

    We discuss the development of finite element techniques and solvers for magma dynamics computations. These are implemented within the FEniCS framework. This approach allows for user-friendly, expressive, high-level code development, but also provides access to powerful, scalable numerical solvers and a large family of finite element discretisations. With the recent addition of dolfin-adjoint, FEniCS supports automated adjoint and tangent-linear models, enabling the rapid development of Generalised Stability Analysis. The ability to easily scale codes to three dimensions with large meshes, and/or to apply intricate adjoint calculations, means that the efficiency of the numerical algorithms is vital. We therefore describe our development and analysis of preconditioners designed specifically for finite element discretisations of equations governing magma dynamics. The preconditioners are based on Elman-Silvester-Wathen methods for the Stokes equation, and we extend these to flows with compaction. Our simulations are validated by comparison of results with laboratory experiments on partially molten aggregates.
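
    For orientation, the Elman-Silvester-Wathen idea for the Stokes equation is to precondition the saddle-point system with a block-diagonal operator built from the viscous block and the pressure mass matrix. The toy sketch below shows only this structure, with tiny dense stand-in matrices; it is not the authors' FEniCS implementation, and the compaction extension is omitted.

    ```python
    # Block-diagonal preconditioning of a Stokes-type saddle-point system
    # K = [[A, B^T], [B, 0]] with M^{-1} = blockdiag(A^{-1}, Mp^{-1}),
    # solved with MINRES (K is symmetric indefinite, M is SPD).
    import numpy as np
    from scipy.sparse.linalg import LinearOperator, minres

    rng = np.random.default_rng(0)
    n_u, n_p = 8, 4                              # toy velocity / pressure dof counts
    A = np.diag(np.arange(1.0, n_u + 1.0))       # SPD stand-in for the viscous block
    B = rng.standard_normal((n_p, n_u))          # stand-in divergence-constraint block
    Mp = np.eye(n_p)                             # stand-in pressure mass matrix
    K = np.block([[A, B.T], [B, np.zeros((n_p, n_p))]])

    def apply_preconditioner(v):
        u, p = v[:n_u], v[n_u:]
        return np.concatenate([np.linalg.solve(A, u), np.linalg.solve(Mp, p)])

    M = LinearOperator(K.shape, matvec=apply_preconditioner)
    rhs = rng.standard_normal(n_u + n_p)
    x, info = minres(K, rhs, M=M)
    print(info, np.linalg.norm(K @ x - rhs))     # info == 0 on convergence
    ```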

  17. OHD/HL - SHEF: code

    Science.gov Websites

    The Standard Hydrometeorological Exchange Format (SHEF) is a documented set of rules for coding of data in a form suitable for both visual and automated recognition, with information to describe the data. The site provides the current SHEF specification, instructions on how to install and use the software, and a download of the source code (.gz).

  18. A Coding System for Qualitative Studies of the Information-Seeking Process in Computer Science Research

    ERIC Educational Resources Information Center

    Moral, Cristian; de Antonio, Angelica; Ferre, Xavier; Lara, Graciela

    2015-01-01

    Introduction: In this article we propose a qualitative analysis tool--a coding system--that can support the formalisation of the information-seeking process in a specific field: research in computer science. Method: In order to elaborate the coding system, we have conducted a set of qualitative studies, more specifically a focus group and some…

  19. Participation as an outcome measure in psychosocial oncology: content of cancer-specific health-related quality of life instruments.

    PubMed

    van der Mei, Sijrike F; Dijkers, Marcel P J M; Heerkens, Yvonne F

    2011-12-01

    To examine to what extent the concept and the domains of participation as defined in the International Classification of Functioning, Disability and Health (ICF) are represented in general cancer-specific health-related quality of life (HRQOL) instruments. Using the ICF linking rules, two coders independently extracted the meaningful concepts of ten instruments and linked these to ICF codes. The proportion of concepts that could be linked to ICF codes ranged from 68 to 95%. Although all instruments contained concepts linked to Participation (Chapters d7-d9 of the classification of 'Activities and Participation'), the instruments covered only a small part of all available ICF codes. The proportion of ICF codes in the instruments that were participation related ranged from 3 to 35%. 'Major life areas' (d8) was the most frequently used Participation Chapter, with d850 'remunerative employment' as the most used ICF code. The number of participation-related ICF codes covered in the instruments is limited. General cancer-specific HRQOL instruments only assess social life of cancer patients to a limited degree. This study's information on the content of these instruments may guide researchers in selecting the appropriate instrument for a specific research purpose.
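
    A minimal sketch of the counting step, assuming the linked ICF codes of an instrument are available as strings and taking participation-related codes to be those in Activities and Participation Chapters d7-d9, as in the study; the example codes are invented for illustration (d850, remunerative employment, is the code the abstract highlights).

    ```python
    # Share of an instrument's linked ICF codes that fall in Chapters d7-d9.
    def participation_share(icf_codes: list) -> float:
        participation = [c for c in icf_codes if c[:2] in {"d7", "d8", "d9"}]
        return len(participation) / len(icf_codes)

    codes = ["d850", "d760", "b130", "d450", "e310"]  # invented example linkage
    print(f"{participation_share(codes):.0%} participation-related")  # 40%
    ```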

  20. JACOB: an enterprise framework for computational chemistry.

    PubMed

    Waller, Mark P; Dresselhaus, Thomas; Yang, Jack

    2013-06-15

    Here, we present just a collection of beans (JACOB): an integrated batch-based framework designed for the rapid development of computational chemistry applications. The framework expedites developer productivity by handling the generic infrastructure tier, and can be easily extended by user-specific scientific code. Paradigms from enterprise software engineering were rigorously applied to create a scalable, testable, secure, and robust framework. A centralized web application is used to configure and control the operation of the framework. The application-programming interface provides a set of generic tools for processing large-scale noninteractive jobs (e.g., systematic studies), or for coordinating systems integration (e.g., complex workflows). The code for the JACOB framework is open sourced and is available at: www.wallerlab.org/jacob. Copyright © 2013 Wiley Periodicals, Inc.

  1. Development of Low Cost Satellite Communications System for Helicopters and General Aviation

    NASA Technical Reports Server (NTRS)

    Farazian, K.; Abbe, B.; Divsalar, D.; Raphaeli, D.; Tulintseff, A.; Wu, T.; Hinedi, S.

    1994-01-01

    In this paper, the development of a low-cost satellite communications (SATCOM) system for helicopters and General Aviation (GA) aircraft is described. System design and standards analysis have been conducted to meet the low-cost, light-weight, small-size and low-power system requirements of helicopter and GA aircraft environments. Other specific issues investigated include coding schemes, spatial diversity, and antenna arraying techniques. Coding schemes employing Channel State Information (CSI) and interleaving have been studied in order to mitigate severe banking-angle fading and the periodic RF signal blockage due to the helicopter rotor blades. In addition, space diversity and antenna arraying techniques have been investigated to further reduce the fading effects and increase the link margin.
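
    A block interleaver, the classic form of the interleaving mentioned above, can be sketched as follows; the dimensions and data are illustrative, not the system's actual design. Symbols are written row-wise into a matrix and read column-wise, so a burst of fades (such as periodic rotor-blade blockage) is spread across many codewords.

    ```python
    # Block interleaver: write row-wise, read column-wise.
    def interleave(symbols, rows, cols):
        assert len(symbols) == rows * cols
        matrix = [symbols[r * cols:(r + 1) * cols] for r in range(rows)]
        return [matrix[r][c] for c in range(cols) for r in range(rows)]

    def deinterleave(symbols, rows, cols):
        return interleave(symbols, cols, rows)  # the transpose undoes itself

    data = list(range(12))
    assert deinterleave(interleave(data, 3, 4), 3, 4) == data
    ```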

  2. Three-Dimensional Simulation of Traveling-Wave Tube Cold-Test Characteristics Using MAFIA

    NASA Technical Reports Server (NTRS)

    Kory, Carol L.; Wilson, Jeffrey D.

    1995-01-01

    The three-dimensional simulation code MAFIA was used to compute the cold-test parameters - frequency-phase dispersion, beam on-axis interaction impedance, and attenuation - for two types of traveling-wave tube (TWT) slow-wave circuits. The potential for this electromagnetic computer modeling code to reduce the time and cost of TWT development is demonstrated by the high degree of accuracy achieved in calculating these parameters. Generalized input files were developed for ferruled coupled-cavity and TunneLadder slow-wave circuits. These files make it easy to model circuits of arbitrary dimensions. The utility of these files was tested by applying each to a specific TWT slow-wave circuit and comparing the results with experimental data. Excellent agreement was obtained.

  3. Testing Strategies for Model-Based Development

    NASA Technical Reports Server (NTRS)

    Heimdahl, Mats P. E.; Whalen, Mike; Rajan, Ajitha; Miller, Steven P.

    2006-01-01

    This report presents an approach for testing artifacts generated in a model-based development process. This approach divides the traditional testing process into two parts: requirements-based testing (validation testing) which determines whether the model implements the high-level requirements and model-based testing (conformance testing) which determines whether the code generated from a model is behaviorally equivalent to the model. The goals of the two processes differ significantly and this report explores suitable testing metrics and automation strategies for each. To support requirements-based testing, we define novel objective requirements coverage metrics similar to existing specification and code coverage metrics. For model-based testing, we briefly describe automation strategies and examine the fault-finding capability of different structural coverage metrics using tests automatically generated from the model.

  4. Experimental identification of closely spaced modes using NExT-ERA

    NASA Astrophysics Data System (ADS)

    Hosseini Kordkheili, S. A.; Momeni Massouleh, S. H.; Hajirezayi, S.; Bahai, H.

    2018-01-01

    This article presents a study on the capability of the time-domain OMA method, NExT-ERA, to identify closely spaced structural dynamic modes. A survey of the literature reveals that few experimental studies have been conducted on the effectiveness of the NExT-ERA methodology specifically for closely spaced modes. In this paper we present the formulation for NExT-ERA. This formulation is then implemented in an algorithm and a code, developed in-house, to identify the modal parameters of different systems using their generated time history data. Some numerical models are first investigated to validate the code. Two different case studies, involving a plate with closely spaced modes and a pulley ring with a greater extent of closeness in repeated modes, are presented. Both structures are excited by random impulses under laboratory conditions. The resulting time-response acceleration data are then used as input to the developed code to extract the modal parameters of the structures. The accuracy of the results is checked against those obtained from experimental tests.
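
    For context, the NExT step rests on estimating cross-correlation functions between response channels from ambient data; these correlations behave like free-decay responses that ERA can then fit. A rough sketch of that estimate is given below, using an assumed biased estimator and synthetic data; this is not the authors' code.

    ```python
    # NExT-style cross-correlation estimate between a reference channel and
    # an output channel, R[k] ~ E[x_out(n+k) * x_ref(n)].
    import numpy as np

    def next_cross_correlation(x_ref, x_out, max_lag):
        n = len(x_ref)
        return np.array([np.dot(x_out[k:], x_ref[:n - k]) / n for k in range(max_lag)])

    fs = 1000.0                                      # sampling rate, Hz
    t = np.arange(0, 10, 1 / fs)
    rng = np.random.default_rng(1)
    x = rng.standard_normal(t.size)                  # stand-in ambient response
    R = next_cross_correlation(x, x, max_lag=512)    # correlation functions for ERA
    ```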

  5. Insights into inner ear-specific gene regulation: epigenetics and non-coding RNAs in inner ear development and regeneration

    PubMed Central

    Avraham, Karen B.

    2016-01-01

    The vertebrate inner ear houses highly specialized sensory organs, tuned to detect and encode sound, head motion and gravity. Gene expression programs under the control of transcription factors orchestrate the formation and specialization of the non-sensory inner ear labyrinth and its sensory constituents. More recently, epigenetic factors and non-coding RNAs emerged as an additional layer of gene regulation, both in inner ear development and disease. In this review, we provide an overview on how epigenetic modifications and non-coding RNAs, in particular microRNAs (miRNAs), influence gene expression and summarize recent discoveries that highlight their critical role in the proper formation of the inner ear labyrinth and its sensory organs. In contrast to non-mammalian vertebrates, adult mammals lack the ability to regenerate inner ear mechano-sensory hair cells. Finally, we discuss recent insights into how epigenetic factors and miRNAs may facilitate, or in the case of mammals, restrict sensory hair cell regeneration. PMID:27836639

  6. Operational advances in ring current modeling using RAM-SCB

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Welling, Daniel T; Jordanova, Vania K; Zaharia, Sorin G

    The Ring current Atmosphere interaction Model with Self-Consistently calculated 3D Magnetic field (RAM-SCB) combines a kinetic model of the ring current with a force-balanced model of the magnetospheric magnetic field to create an inner magnetospheric model that is magnetically self-consistent. RAM-SCB produces a wealth of outputs that are valuable to space weather applications. For example, the anisotropic particle distribution of the keV-energy population calculated by the code is key for predicting surface charging on spacecraft. Furthermore, radiation belt codes stand to benefit substantially from RAM-SCB calculated magnetic field values and plasma wave growth rates, both important for determining the evolution of relativistic electron populations. RAM-SCB is undergoing development to bring these benefits to the space weather community. Data-model validation efforts are underway to assess the performance of the system. 'Virtual Satellite' capability has been added to yield satellite-specific particle distribution and magnetic field output. The code's outer boundary is being expanded to 10 Earth Radii to encompass previously neglected geosynchronous orbits and allow the code to be driven completely by either empirical or first-principles based inputs. These advances are culminating towards a new, real-time version of the code, rtRAM-SCB, that can monitor the inner magnetosphere conditions on both a global and spacecraft-specific level. This paper summarizes these new features as well as the benefits they provide the space weather community.

  7. Diode-pumped solid state lasers (DPSSLs) for Inertial Fusion Energy (IFE)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Krupke, W.F.

    The status of diode-pumped, transverse-gas-flow-cooled Yb:S-FAP slab lasers is reviewed. Recently acquired experimental performance data are combined with a cost/performance IFE driver design code to define a cost-effective development path for IFE DPSSL drivers. Specific design parameters are described for the Mercury 100 J/10 Hz, 1 kW system (the first in the development scenario).

  8. Model-based software design

    NASA Technical Reports Server (NTRS)

    Iscoe, Neil; Liu, Zheng-Yang; Feng, Guohui; Yenne, Britt; Vansickle, Larry; Ballantyne, Michael

    1992-01-01

    Domain-specific knowledge is required to create specifications, generate code, and understand existing systems. Our approach to automating software design is based on instantiating an application domain model with industry-specific knowledge and then using that model to achieve the operational goals of specification elicitation and verification, reverse engineering, and code generation. Although many different specification models can be created from any particular domain model, each specification model is consistent and correct with respect to the domain model.

  9. Modelling Conditions and Health Care Processes in Electronic Health Records: An Application to Severe Mental Illness with the Clinical Practice Research Datalink.

    PubMed

    Olier, Ivan; Springate, David A; Ashcroft, Darren M; Doran, Tim; Reeves, David; Planner, Claire; Reilly, Siobhan; Kontopantelis, Evangelos

    2016-01-01

    The use of Electronic Health Records databases for medical research has become mainstream. In the UK, increasing use of Primary Care Databases is largely driven by almost complete computerisation and uniform standards within the National Health Service. Electronic Health Records research often begins with the development of a list of clinical codes with which to identify cases with a specific condition. We present a methodology and accompanying Stata and R commands (pcdsearch/Rpcdsearch) to help researchers in this task. We present severe mental illness as an example. We used the Clinical Practice Research Datalink, a UK Primary Care Database in which clinical information is largely organised using Read codes, a hierarchical clinical coding system. Pcdsearch is used to identify potentially relevant clinical codes and/or product codes from word-stubs and code-stubs suggested by clinicians. The returned code-lists are reviewed and codes relevant to the condition of interest are selected. The final code-list is then used to identify patients. We identified 270 Read codes linked to SMI and used them to identify cases in the database. We observed that our approach identified cases that would have been missed with a simpler approach using SMI registers defined within the UK Quality and Outcomes Framework. We described a framework for researchers of Electronic Health Records databases, for identifying patients with a particular condition or matching certain clinical criteria. The method is invariant to coding system or database and can be used with SNOMED CT, ICD or other medical classification code-lists.
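
    A minimal stand-in for the search step, assuming a Read-code dictionary of (code, term) rows; the rows and stubs below are invented, and this illustrates the workflow rather than the pcdsearch/Rpcdsearch commands themselves. Matches are kept as candidates for the manual clinical review the abstract describes.

    ```python
    # Filter a code dictionary by clinician-suggested word-stubs (matched in
    # the term text) and code-stubs (matched as code prefixes).
    import re

    dictionary = [                     # invented example rows
        ("E11..", "Schizophrenic disorders"),
        ("E130.", "Acute schizophrenic episode"),
        ("H33..", "Asthma"),
    ]
    word_stubs = ["schizo"]
    code_stubs = ["E1"]

    candidates = [
        (code, term) for code, term in dictionary
        if any(re.search(stub, term, re.IGNORECASE) for stub in word_stubs)
        or any(code.startswith(stub) for stub in code_stubs)
    ]
    print(candidates)  # reviewed by clinicians before the final code-list is fixed
    ```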

  10. A Monte-Carlo Benchmark of TRIPOLI-4® and MCNP on ITER neutronics

    NASA Astrophysics Data System (ADS)

    Blanchet, David; Pénéliau, Yannick; Eschbach, Romain; Fontaine, Bruno; Cantone, Bruno; Ferlet, Marc; Gauthier, Eric; Guillon, Christophe; Letellier, Laurent; Proust, Maxime; Mota, Fernando; Palermo, Iole; Rios, Luis; Guern, Frédéric Le; Kocan, Martin; Reichle, Roger

    2017-09-01

    Radiation protection and shielding studies are often based on the extensive use of 3D Monte-Carlo neutron and photon transport simulations. The ITER organization hence recommends the use of the MCNP-5 code (version 1.60), in association with the FENDL-2.1 neutron cross section data library, specifically dedicated to fusion applications. The MCNP reference model of the ITER tokamak, the 'C-lite', is being continuously developed and improved. This article proposes to develop an alternative model, equivalent to the 'C-lite', but for the Monte-Carlo code TRIPOLI-4®. A benchmark study is defined to test this new model. Since one of the most critical areas for ITER neutronics analysis concerns the assessment of radiation levels and Shutdown Dose Rates (SDDR) behind the Equatorial Port Plugs (EPP), the benchmark is conducted to compare the neutron flux through the EPP. This problem is quite challenging with regard to the complex geometry and considering the important neutron flux attenuation, ranging from 10¹⁴ down to 10⁸ n·cm⁻²·s⁻¹. Such code-to-code comparison provides independent validation of the Monte-Carlo simulations, improving the confidence in neutronic results.

  11. Gyrofluid Modeling of Turbulent, Kinetic Physics

    NASA Astrophysics Data System (ADS)

    Despain, Kate Marie

    2011-12-01

    Gyrofluid models to describe plasma turbulence combine the advantages of fluid models, such as lower dimensionality and well-developed intuition, with those of gyrokinetics models, such as finite Larmor radius (FLR) effects. This allows gyrofluid models to be more tractable computationally while still capturing much of the physics related to the FLR of the particles. We present a gyrofluid model derived to capture the behavior of slow solar wind turbulence and describe the computer code developed to implement the model. In addition, we describe the modifications we made to a gyrofluid model and code that simulate plasma turbulence in tokamak geometries. Specifically, we describe a nonlinear phase mixing phenomenon, part of the E x B term, that was previously missing from the model. An inherently FLR effect, it plays an important role in predicting turbulent heat flux and diffusivity levels for the plasma. We demonstrate this importance by comparing results from the updated code to studies done previously by gyrofluid and gyrokinetic codes. We further explain what would be necessary to couple the updated gyrofluid code, gryffin, to a turbulent transport code, thus allowing gryffin to play a role in predicting profiles for fusion devices such as ITER and to explore novel fusion configurations. Such a coupling would require the use of Graphical Processing Units (GPUs) to make the modeling process fast enough to be viable. Consequently, we also describe our experience with GPU computing and demonstrate that we are poised to complete a gryffin port to this innovative architecture.

  12. Designing and Assessing Learning

    ERIC Educational Resources Information Center

    Quan, Hong; Liu, Dandan; Cun, Xiangqin; Lu, Yingchun

    2009-01-01

    This paper analyses the design, implementation and assessment of a level 2 module for non-English major students in higher vocational and professional education. 1132001 is the code of a module that uses active methods to teach college English in China. The paper specifically reflects on the module's advantages and defects for developing and improving learning…

  13. Investigating the Advantages of Constructing Multidigit Numeration Understanding through Oneida and Lakota Native Languages.

    ERIC Educational Resources Information Center

    Hankes, Judith Elaine

    This paper documents a culturally specific language strength for developing number sense among Oneida- and Lakota-speaking primary students. Qualitative research methods scaffolded this research study: culture informants were interviewed and interviews were transcribed and coded for analysis; culture documents were selected for analysis; and…

  14. Teaching Quality Object-Oriented Programming

    ERIC Educational Resources Information Center

    Feldman, Yishai A.

    2005-01-01

    Computer science students need to learn how to write high-quality software. An important methodology for achieving quality is design-by-contract, in which code is developed together with its specification, which is given as class invariants and method pre- and postconditions. This paper describes practical experience in teaching design-by-contract…
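
    As a brief illustration of the methodology (not an example taken from the paper), a contract-checked class might look as follows; the class and its assertions are hypothetical.

    ```python
    # Minimal design-by-contract illustration: the specification (invariant,
    # pre- and postconditions) is written with the code and checked at run time.
    class BankAccount:
        def __init__(self, balance: int = 0):
            self.balance = balance
            self._check_invariant()

        def _check_invariant(self):
            # Class invariant: the balance never goes negative.
            assert self.balance >= 0, "invariant violated: negative balance"

        def withdraw(self, amount: int) -> None:
            # Precondition: caller must request a positive, covered amount.
            assert 0 < amount <= self.balance, "precondition violated"
            old_balance = self.balance
            self.balance -= amount
            # Postcondition: balance decreased by exactly `amount`.
            assert self.balance == old_balance - amount, "postcondition violated"
            self._check_invariant()
    ```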

  15. High density arrays of micromirrors

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Folta, J. M.; Decker, J. Y.; Kolman, J.

    We established and achieved our goal to (1) fabricate and evaluate test structures based on the micromirror design optimized for maskless lithography applications, (2) perform system analysis and code development for the maskless lithography concept, and (3) identify specifications for micromirror arrays (MMAs) for LLNL's adaptive optics (AO) applications and conceptualize new devices.

  16. 77 FR 20789 - Work Group on Measuring Systems for Taxis

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-04-06

    .... SUMMARY: The National Institute of Standards and Technology (NIST) is forming a Work Group (WG) to develop proposals to revise the current Taximeters Code in NIST Handbook 44 (HB 44), Specifications, Tolerances, and... CONTACT: Mr. John Barton, NIST, Office of Weights and Measures, 100 Bureau Drive, Stop 2600, Gaithersburg...

  17. 77 FR 41784 - Integrated Risk Information System (IRIS); Announcement of Availability of Literature Searches...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-07-16

    ... health assessment program that evaluates quantitative and qualitative risk information on effects that..., National Center for Environmental Assessment, (mail code: 8601P), Office of Research and Development, U.S... quantitative and qualitative risk information on effects that may result from exposure to specific chemical...

  18. Final Report from The University of Texas at Austin for DEGAS: Dynamic Global Address Space programming environments

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Erez, Mattan; Yelick, Katherine; Sarkar, Vivek

    The Dynamic, Exascale Global Address Space programming environment (DEGAS) project will develop the next generation of programming models and runtime systems to meet the challenges of Exascale computing. Our approach is to provide an efficient and scalable programming model that can be adapted to application needs through the use of dynamic runtime features and domain-specific languages for computational kernels. We address the following technical challenges. Programmability: a rich set of programming constructs based on a Hierarchical Partitioned Global Address Space (HPGAS) model, demonstrated in UPC++. Scalability: hierarchical locality control, lightweight communication (extended GASNet), and efficient synchronization mechanisms (Phasers). Performance portability: just-in-time specialization (SEJITS) for generating hardware-specific code and scheduling libraries for domain-specific adaptive runtimes (Habanero). Energy efficiency: communication-optimal code generation to optimize energy efficiency by reducing data movement. Resilience: Containment Domains for flexible, domain-specific resilience, using state-capture mechanisms and lightweight, asynchronous recovery mechanisms. Interoperability: runtime and language interoperability with MPI and OpenMP to encourage broad adoption.

  19. Trajectories for High Specific Impulse High Specific Power Deep Space Exploration

    NASA Technical Reports Server (NTRS)

    Polsgrove, Tara; Adams, Robert B.; Brady, Hugh J. (Technical Monitor)

    2002-01-01

    Flight times and deliverable masses for electric and fusion propulsion systems are difficult to approximate. Numerical integration is required for these continuous thrust systems. Many scientists are not equipped with the tools and expertise to conduct interplanetary and interstellar trajectory analysis for their concepts. Several charts plotting the results of well-known trajectory simulation codes were developed and are contained in this paper. These charts illustrate the dependence of time of flight and payload ratio on jet power, initial mass, specific impulse and specific power. These charts are intended to be a tool by which people in the propulsion community can explore the possibilities of their propulsion system concepts. Trajectories were simulated using the tools VARITOP and IPOST. VARITOP is a well known trajectory optimization code that involves numerical integration based on calculus of variations. IPOST has several methods of trajectory simulation; the one used in this paper is Cowell's method for full integration of the equations of motion. An analytical method derived in the companion paper was also evaluated. The accuracy of this method is discussed in the paper.
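
    Cowell's method amounts to direct numerical integration of the full equations of motion, thrust included. The sketch below shows the idea for a planar heliocentric case with thrust along the velocity vector; it is illustrative only and is unrelated to the internals of VARITOP or IPOST.

    ```python
    # Sketch of Cowell's method for a continuous-thrust trajectory: the full
    # equations of motion (gravity + thrust) are integrated directly.
    # Planar heliocentric case; constants and initial values are illustrative.
    import numpy as np

    MU = 1.32712440018e20   # Sun GM, m^3/s^2

    def derivs(state, thrust, mdot):
        x, y, vx, vy, m = state
        r = np.hypot(x, y)
        v = np.hypot(vx, vy)
        ax_g, ay_g = -MU * x / r**3, -MU * y / r**3               # gravity
        ax_t, ay_t = thrust * vx / (m * v), thrust * vy / (m * v)  # along velocity
        return np.array([vx, vy, ax_g + ax_t, ay_g + ay_t, -mdot])

    def rk4_step(state, dt, thrust, mdot):
        k1 = derivs(state, thrust, mdot)
        k2 = derivs(state + 0.5 * dt * k1, thrust, mdot)
        k3 = derivs(state + 0.5 * dt * k2, thrust, mdot)
        k4 = derivs(state + dt * k3, thrust, mdot)
        return state + dt / 6.0 * (k1 + 2 * k2 + 2 * k3 + k4)

    # Specific impulse fixes the mass flow: mdot = thrust / (Isp * g0).
    isp, g0, thrust = 3000.0, 9.80665, 0.5                    # s, m/s^2, N
    state = np.array([1.496e11, 0.0, 0.0, 29780.0, 1000.0])   # 1 AU circular, 1000 kg
    for _ in range(10000):
        state = rk4_step(state, 600.0, thrust, thrust / (isp * g0))
    ```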

  20. Contrasting Five Different Theories of Letter Position Coding: Evidence from Orthographic Similarity Effects

    ERIC Educational Resources Information Center

    Davis, Colin J.; Bowers, Jeffrey S.

    2006-01-01

    Five theories of how letter position is coded are contrasted: position-specific slot-coding, Wickelcoding, open-bigram coding (discrete and continuous), and spatial coding. These theories make different predictions regarding the relative similarity of three different types of pairs of letter strings: substitution neighbors,…

  1. Quantifying a rare disease in administrative data: the example of calciphylaxis.

    PubMed

    Nigwekar, Sagar U; Solid, Craig A; Ankers, Elizabeth; Malhotra, Rajeev; Eggert, William; Turchin, Alexander; Thadhani, Ravi I; Herzog, Charles A

    2014-08-01

    Calciphylaxis, a rare disease seen in chronic dialysis patients, is associated with significant morbidity and mortality. As is the case with other rare diseases, the precise epidemiology of calciphylaxis remains unknown. Absence of a unique International Classification of Diseases (ICD) code impedes its identification in large administrative databases such as the United States Renal Data System (USRDS) and hinders patient-oriented research. This study was designed to develop an algorithm to accurately identify cases of calciphylaxis and to examine its incidence and mortality. Along with many other diagnoses, calciphylaxis is included in ICD-9 code 275.49, Other Disorders of Calcium Metabolism. Since calciphylaxis is the only disorder listed under this code that requires a skin biopsy for diagnosis, we theorized that simultaneous application of code 275.49 and skin biopsy procedure codes would accurately identify calciphylaxis cases. This novel algorithm was developed using the Partners Research Patient Data Registry (RPDR) (n = 11,451 chronic hemodialysis patients over study period January 2002 to December 2011) using natural language processing and review of medical and pathology records (the gold-standard strategy). We then applied this algorithm to the USRDS to investigate calciphylaxis incidence and mortality. Comparison of our novel research strategy against the gold standard yielded: sensitivity 89.2%, specificity 99.9%, positive likelihood ratio 3,382.3, negative likelihood ratio 0.11, and area under the curve 0.96. Application of the algorithm to the USRDS identified 649 incident calciphylaxis cases over the study period. Although calciphylaxis is rare, its incidence has been increasing, with a major inflection point during 2006-2007, which corresponded with specific addition of calciphylaxis under code 275.49 in October 2006. Calciphylaxis incidence continued to rise even after limiting the study period to 2007 onwards (from 3.7 to 5.7 per 10,000 chronic hemodialysis patients; r = 0.91, p = 0.02). Mortality rates among calciphylaxis patients were noted to be 2.5-3 times higher than average mortality rates for chronic hemodialysis patients. By developing and successfully applying a novel algorithm, we observed a significant increase in calciphylaxis incidence. Because calciphylaxis is associated with extremely high mortality, our study provides valuable information for future patient-oriented calciphylaxis research, and also serves as a template for investigating other rare diseases.
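
    The core logic of the algorithm is simple enough to sketch. The following hypothetical Python fragment mirrors the two-criterion rule (diagnosis code plus skin-biopsy procedure code); the data layout and the biopsy code list are assumptions, not the study's actual specification.

    ```python
    # Minimal sketch of the two-criterion case-finding algorithm: ICD-9
    # 275.49 plus a skin-biopsy procedure code in the same patient record.
    import pandas as pd

    SKIN_BIOPSY_CODES = {"11100", "11101"}   # illustrative procedure codes

    def find_calciphylaxis(diagnoses: pd.DataFrame,
                           procedures: pd.DataFrame) -> set:
        """diagnoses: columns [patient_id, icd9]; procedures: [patient_id, code]."""
        dx_ids = set(diagnoses.loc[diagnoses["icd9"] == "275.49", "patient_id"])
        px_ids = set(procedures.loc[procedures["code"].isin(SKIN_BIOPSY_CODES),
                                    "patient_id"])
        # A case must satisfy both criteria, mirroring the published algorithm.
        return dx_ids & px_ids
    ```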

  2. Bioinformatics of prokaryotic RNAs

    PubMed Central

    Backofen, Rolf; Amman, Fabian; Costa, Fabrizio; Findeiß, Sven; Richter, Andreas S; Stadler, Peter F

    2014-01-01

    The genomes of most prokaryotes give rise to surprisingly complex transcriptomes, comprising not only protein-coding mRNAs, often organized as operons, but also dozens or even hundreds of highly structured small regulatory RNAs and unexpectedly large levels of anti-sense transcripts. Comprehensive surveys of prokaryotic transcriptomes, and the need to characterize their non-coding components as well, are heavily dependent on computational methods and workflows, many of which have been developed or at least adapted specifically for use with bacterial and archaeal data. This review provides an overview of the state of the art of RNA bioinformatics, focusing on applications to prokaryotes. PMID:24755880

  3. User's Manual for the Langley Aerothermodynamic Upwind Relaxation Algorithm (LAURA)

    NASA Technical Reports Server (NTRS)

    Gnoffo, Peter A.; Cheatwood, F. McNeil

    1996-01-01

    This user's manual provides detailed instructions for the installation and application of version 4.1 of the Langley Aerothermodynamic Upwind Relaxation Algorithm (LAURA). LAURA simulates flow fields in thermochemical nonequilibrium around vehicles traveling at hypersonic velocities through the atmosphere. Earlier versions of LAURA were predominantly research codes and had minimal (or no) documentation. This manual describes UNIX-based utilities for customizing the code for special applications while minimizing system resource requirements. The algorithm is reviewed, and the various program options are related to specific equations and variables in the theoretical development.

  4. DD3MAT - a code for yield criteria anisotropy parameters identification.

    NASA Astrophysics Data System (ADS)

    Barros, P. D.; Carvalho, P. D.; Alves, J. L.; Oliveira, M. C.; Menezes, L. F.

    2016-08-01

    This work presents the main strategies and algorithms adopted in the DD3MAT in-house code, specifically developed for identifying anisotropy parameters. The algorithm is based on the minimization of an error function using a downhill simplex method. The set of experimental values can include yield stresses and r-values obtained from in-plane tension at different angles to the rolling direction (RD), the yield stress and r-value obtained for a biaxial stress state, and yield stresses from shear tests, also performed at different angles to the RD. All these values can be defined for a specific value of plastic work. Moreover, the set can also include yield stresses obtained from in-plane compression tests. The anisotropy parameters are identified for an AA2090-T3 aluminium alloy, highlighting the importance of user intervention in improving the numerical fit.
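
    The identification strategy (minimize an error function over the anisotropy parameters with a downhill simplex search) can be sketched as follows. The yield-surface model below is a deliberate placeholder, not a criterion implemented in DD3MAT, and the experimental values are invented.

    ```python
    # Sketch of the identification strategy: minimize an error function
    # between measured and predicted yield stresses with a downhill simplex
    # (Nelder-Mead) search. The material model here is a placeholder.
    import numpy as np
    from scipy.optimize import minimize

    angles = np.radians([0, 45, 90])          # tensile directions w.r.t. RD
    sigma_exp = np.array([1.00, 0.96, 1.02])  # normalized yield stresses (made up)

    def sigma_model(params, theta):
        a, b, c = params                      # placeholder anisotropy parameters
        return a + b * np.cos(2 * theta) + c * np.cos(4 * theta)

    def error(params):
        # Least-squares gap between model prediction and experiment.
        return np.sum((sigma_model(params, angles) - sigma_exp) ** 2)

    fit = minimize(error, x0=[1.0, 0.0, 0.0], method="Nelder-Mead")
    print(fit.x)
    ```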

  5. C code generation from Petri-net-based logic controller specification

    NASA Astrophysics Data System (ADS)

    Grobelny, Michał; Grobelna, Iwona; Karatkevich, Andrei

    2017-08-01

    The article focuses on the programming of logic controllers. It is important that the program code of a logic controller executes flawlessly according to the primary specification. In the presented approach, we generate C code for an AVR microcontroller from a rule-based logical model of a control process derived from a control-interpreted Petri net. The same logical model is also used for formal verification of the specification by means of the model-checking technique. The proposed rule-based logical model and formal transformation rules ensure that the obtained implementation is consistent with the already verified specification. The approach is validated by practical experiments.
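
    The generation step can be pictured with a toy example: each rule of the logical model becomes a guarded C statement emitted into the controller's scan function. The rule format, identifiers, and output shape below are assumptions, not the paper's actual transformation rules.

    ```python
    # Illustrative generator: each rule of the logical model becomes a C
    # statement in the microcontroller's scan cycle. Rule format and signal
    # names are invented for this sketch.
    rules = [
        # (guard expression, output assignment) in terms of C identifiers
        ("start && !emergency", "motor = 1;"),
        ("emergency",           "motor = 0;"),
    ]

    def generate_c(rules) -> str:
        body = "\n".join(f"    if ({guard}) {{ {action} }}"
                         for guard, action in rules)
        return ("void control_step(void) {\n"
                f"{body}\n"
                "}\n")

    print(generate_c(rules))
    ```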

  6. [Complexity level simulation in the German diagnosis-related groups system: the financial effect of coding of comorbidity diagnostics in urology].

    PubMed

    Wenke, A; Gaber, A; Hertle, L; Roeder, N; Pühse, G

    2012-07-01

    Precise and complete coding of diagnoses and procedures is of value for optimizing revenues within the German diagnosis-related groups (G-DRG) system. However, implementing effective structures for coding is cost-intensive. The aim of this study was to determine whether these higher costs can be recouped through complete acquisition of comorbidities and complications. Calculations were based on DRG data of the Department of Urology, University Hospital of Münster, Germany, covering all patients treated in 2009. The data were regrouped and subjected to a process of simulation (increase and decrease of the patient clinical complexity level, PCCL) with the help of recently developed software. In urology, the PCCL and the resulting profits were found to depend strongly on the quantity and quality of coding of secondary diagnoses. Departmental budgetary procedures can be optimized when coding is effective. The new simulation tool can be a valuable aid to improve profits available for distribution. Nevertheless, calculation of the time and financial resources required by this procedure is subject to specific departmental terms and conditions. Completeness of coding of (secondary) diagnoses must be the ultimate administrative goal of patient case documentation in urology.

  7. Qualitative assessment of cause-of-injury coding in U.S. military hospitals: NATO standardization agreement (STANAG) 2050.

    PubMed

    Amoroso, P J; Smith, G S; Bell, N S

    2000-04-01

    Accurate injury cause data are essential for injury prevention research. U.S. military hospitals, unlike civilian hospitals, use the NATO STANAG system for cause-of-injury coding. Reported deficiencies in civilian injury cause data suggested a need to specifically evaluate the STANAG. The Total Army Injury and Health Outcomes Database (TAIHOD) was used to evaluate worldwide Army injury hospitalizations, especially STANAG Trauma, Injury, and Place of Occurrence coding. We conducted a review of hospital procedures at Tripler Army Medical Center (TAMC) including injury cause and intent coding, potential crossover between acute injuries and musculoskeletal conditions, and data for certain hospital patients who are not true admissions. We also evaluated the use of free-text injury comment fields in three hospitals. Army-wide review of injury records coding revealed full compliance with cause coding, although nonspecific codes appeared to be overused. A small but intensive single hospital records review revealed relatively poor intent coding but good activity and cause coding. Data on specific injury history were present on most acute injury records and 75% of musculoskeletal conditions. Place of Occurrence coding, although inherently nonspecific, was over 80% accurate. Review of text fields produced additional details of the injuries in over 80% of cases. STANAG intent coding specificity was poor, while coding of cause of injury was at least comparable to civilian systems. The strengths of military hospital data systems are an exceptionally high compliance with injury cause coding, the availability of free text, and capture of all population hospital records without regard to work-relatedness. Simple changes in procedures could greatly improve data quality.

  8. Childhood esotropia: child and parent concerns.

    PubMed

    Liebermann, Laura; Leske, David A; Castañeda, Yolanda S; Hatt, Sarah R; Wernimont, Suzanne M; Cheng, Christina S; Birch, Eileen E; Holmes, Jonathan M

    2016-08-01

    To identify specific health-related quality of life (HRQOL) concerns affecting children with esotropia as expressed by children or one of their parents (proxy) and concerns affecting the parents themselves. Sixty children with esotropia (0-17 years of age) and 1 parent for each child were prospectively enrolled. Individual semistructured interviews were conducted with children aged 5-17 years (n = 40) and 1 parent each for child ages 0-17 years. Transcripts of recorded interviews were evaluated using NVivo software. Specific concerns were identified from both child and parent interviews and coded. From these specific codes, broad themes were identified. Frequency of each theme was calculated, along with the frequency of specific codes within each theme. Regarding the child's experience, 6 broad themes were identified: visual function (mentioned by 32 of 40 children [80%] and by 50 of 60 parents [proxy assessment of child, 83%]), treatment (78% and 85%), emotions (65% and 67%), social (58% and 68%), physical (58% and 32%), and worry (45% and 7%). Regarding the parents' own experience, 5 broad themes were identified: treatment (59 of 60 parents, 98%), worry (97%), emotions (82%), compensation for condition (80%), and affects family (23%). A wide range of concerns were identified from interviews of children with esotropia and their parents. Concerns reflect the impact of esotropia in physical, emotional, and social domains, and specific concerns will be used for the development of questionnaires to quantify the effects of esotropia on children's and parents' quality of life.

  9. Combat injury coding: a review and reconfiguration.

    PubMed

    Lawnick, Mary M; Champion, Howard R; Gennarelli, Thomas; Galarneau, Michael R; D'Souza, Edwin; Vickers, Ross R; Wing, Vern; Eastridge, Brian J; Young, Lee Ann; Dye, Judy; Spott, Mary Ann; Jenkins, Donald H; Holcomb, John; Blackbourne, Lorne H; Ficke, James R; Kalin, Ellen J; Flaherty, Stephen

    2013-10-01

    The current civilian Abbreviated Injury Scale (AIS), designed for automobile crash injuries, yields important information about civilian injuries. It has been recognized for some time, however, that both the AIS and AIS-based scores such as the Injury Severity Score (ISS) are inadequate for describing penetrating injuries, especially those sustained in combat. Existing injury coding systems do not adequately describe (they actually exclude) combat injuries such as the devastating multi-mechanistic injuries resulting from attacks with improvised explosive devices (IEDs). After quantifying the inapplicability of current coding systems, the Military Combat Injury Scale (MCIS), which includes injury descriptors that accurately characterize combat anatomic injury, and the Military Functional Incapacity Scale (MFIS), which indicates immediate tactical functional impairment, were developed by a large tri-service military and civilian group of combat trauma subject-matter experts. Assignment of MCIS severity levels was based on urgency, level of care needed, and risk of death from each individual injury. The MFIS was developed based on the casualty's ability to shoot, move, and communicate, and comprises four levels ranging from "Able to continue mission" to "Lost to military." Separate functional impairments were identified for injuries aboard ship. Preliminary evaluation of MCIS discrimination, calibration, and casualty disposition was performed on 992 combat-injured patients using two modeling processes. Based on combat casualty data, the MCIS is a new, simpler, comprehensive severity scale with 269 codes (vs. 1,999 in the AIS) that specifically characterize and distinguish the many unique injuries encountered in combat. The MCIS integrates with the MFIS, which associates immediate combat functional impairment with minor and moderate-severity injuries. Predictive validation on combat datasets shows improved performance over AIS-based tools in addition to improved face, construct, and content validity and coding inter-rater reliability. Thus, the MCIS has greater relevance, accuracy, and precision for many military-specific applications. Over a period of several years, the Military Combat Injury Scale and Military Functional Incapacity Scale were developed, tested and validated by teams of civilian and tri-service military experts. The MCIS shows significant promise in documenting the nature, severity and complexity of modern combat injury.

  10. Development of the Brief Romantic Relationship Interaction Coding Scheme (BRRICS)

    PubMed Central

    Humbad, Mikhila N.; Donnellan, M. Brent; Klump, Kelly L.; Burt, S. Alexandra

    2012-01-01

    Although observational studies of romantic relationships are common, many existing coding schemes require considerable amounts of time and resources to implement. The current study presents a new coding scheme, the Brief Romantic Relationship Interaction Coding Scheme (BRRICS), designed to assess various aspects of romantic relationship both quickly and efficiently. The BRRICS consists of four individual coding dimensions assessing positive and negative affect in each member of the dyad, as well as four codes assessing specific components of the dyadic interaction (i.e., positive reciprocity, demand-withdraw pattern, negative reciprocity, and overall satisfaction). Concurrent associations with measures of marital adjustment and conflict were evaluated in a sample of 118 married couples participating in the Michigan State University Twin Registry. Couples were asked to discuss common conflicts in their marriage while being videotaped. Undergraduate coders used the BRRICS to rate these interactions. The BRRICS scales were correlated in expected directions with self-reports of marital adjustment, as well as children’s perception of the severity and frequency of marital conflict. Based on these results, the BRRICS may be an efficient tool for researchers with large samples of observational data who are interested in coding global aspects of the relationship but do not have the resources to use labor intensive schemes. PMID:21875192

  11. Complexity control algorithm based on adaptive mode selection for interframe coding in high efficiency video coding

    NASA Astrophysics Data System (ADS)

    Chen, Gang; Yang, Bing; Zhang, Xiaoyun; Gao, Zhiyong

    2017-07-01

    The latest high efficiency video coding (HEVC) standard significantly increases encoding complexity in order to improve coding efficiency. Due to the limited computational capability of handheld devices, complexity-constrained video coding has drawn great attention in recent years. A complexity control algorithm based on adaptive mode selection is proposed for interframe coding in HEVC. Given the direct proportionality between encoding time and computational complexity, the computational complexity is measured in terms of encoding time. First, complexity is mapped to a target in terms of prediction modes. Then, an adaptive mode selection algorithm is proposed for the mode decision process. Specifically, the optimal mode combination scheme, chosen through offline statistics, is applied at low complexity. If the complexity budget has not been used up, an adaptive mode sorting method is employed to further improve coding efficiency. The experimental results show that the proposed algorithm achieves a very large complexity control range (as low as 10%) for the HEVC encoder while maintaining good rate-distortion performance. For the low-delay P condition, compared with the direct resource allocation method and the state-of-the-art method, an average gain of 0.63 and 0.17 dB in BD-PSNR is observed for 18 sequences when the target complexity is around 40%.
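
    The budget-driven part of such a mode decision can be caricatured in a few lines: modes are tried in a statistically favourable order until the per-block time budget is spent. Mode names and costs below are invented and do not reflect the paper's actual scheme.

    ```python
    # Toy illustration of budget-driven mode selection: candidate modes are
    # tried in an offline-determined order until the time budget runs out.
    MODE_ORDER = ["merge", "inter_2Nx2N", "inter_NxN", "intra"]   # offline statistics
    MODE_COST_MS = {"merge": 0.2, "inter_2Nx2N": 0.5, "inter_NxN": 0.9, "intra": 0.4}

    def select_modes(budget_ms: float) -> list[str]:
        chosen, spent = [], 0.0
        for mode in MODE_ORDER:
            if spent + MODE_COST_MS[mode] > budget_ms:
                break                     # budget exhausted: stop trying modes
            chosen.append(mode)
            spent += MODE_COST_MS[mode]
        return chosen

    print(select_modes(0.8))   # low budget -> ["merge", "inter_2Nx2N"]
    ```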

  12. Testing Scientific Software: A Systematic Literature Review.

    PubMed

    Kanewala, Upulee; Bieman, James M

    2014-10-01

    Scientific software plays an important role in critical decision making, for example making weather predictions based on climate models, and computation of evidence for research publications. Recently, scientists have had to retract publications due to errors caused by software faults. Systematic testing can identify such faults in code. This study aims to identify specific challenges, proposed solutions, and unsolved problems faced when testing scientific software. We conducted a systematic literature survey to identify and analyze relevant literature. We identified 62 studies that provided relevant information about testing scientific software. We found that challenges faced when testing scientific software fall into two main categories: (1) testing challenges that occur due to characteristics of scientific software such as oracle problems and (2) testing challenges that occur due to cultural differences between scientists and the software engineering community such as viewing the code and the model that it implements as inseparable entities. In addition, we identified methods to potentially overcome these challenges and their limitations. Finally we describe unsolved challenges and how software engineering researchers and practitioners can help to overcome them. Scientific software presents special challenges for testing. Specifically, cultural differences between scientist developers and software engineers, along with the characteristics of the scientific software make testing more difficult. Existing techniques such as code clone detection can help to improve the testing process. Software engineers should consider special challenges posed by scientific software such as oracle problems when developing testing techniques.
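
    One widely used response to the oracle problem mentioned here is metamorphic testing, in which a relation that must hold between runs is checked instead of an exact expected value. A minimal sketch follows, with a toy integrator standing in for the scientific code under test.

    ```python
    # Metamorphic test sketch: no exact oracle is available, so we check a
    # relation that must hold (additivity of an integral over subintervals).
    import math

    def integrate(f, a, b, n=10_000):
        # Simple midpoint rule stands in for the scientific code under test.
        h = (b - a) / n
        return sum(f(a + (i + 0.5) * h) for i in range(n)) * h

    whole = integrate(math.sin, 0.0, 2.0)
    parts = integrate(math.sin, 0.0, 1.2) + integrate(math.sin, 1.2, 2.0)
    assert math.isclose(whole, parts, rel_tol=1e-6), "metamorphic relation violated"
    ```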

  13. The commerce of professional psychology and the new ethics code.

    PubMed

    Koocher, G P

    1994-11-01

    The 1992 version of the American Psychological Association's Ethical Principles of Psychologists and Code of Conduct brings some changes in requirements and new specificity to the practice of psychology. The impact of the new code on therapeutic contracts, informed consent to psychological services, advertising, financial aspects of psychological practice, and other topics related to the commerce of professional psychology are discussed. The genesis of many new thrusts in the code is reviewed from the perspective of psychological service provider. Specific recommendations for improved attention to ethical matters in professional practice are made.

  14. Leap Frog and Time Step Sub-Cycle Scheme for Coupled Neutronics and Thermal-Hydraulic Codes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lu, S.

    2002-07-01

    As a result of advancing TCP/IP-based inter-process communication technology, more and more legacy thermal-hydraulic codes have been coupled with neutronics codes to provide best-estimate capabilities for reactivity-related reactor transient analysis. Most of the coupling schemes are based on closely coupled serial or parallel approaches. The execution of the coupled codes therefore usually requires significant CPU time when a complicated system is analyzed. The Leap Frog scheme has been used to reduce the run time. The extent of the decoupling is usually determined through a trial-and-error process for a specific analysis. It is the intent of this paper to develop a set of general criteria that can be used to invoke the automatic Leap Frog algorithm. The algorithm will not only provide the run-time reduction but also preserve accuracy. The criteria will also serve as the basis of an automatic time step sub-cycle scheme when a sudden reactivity change is introduced and the thermal-hydraulic code is marching with a relatively large time step. (authors)
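
    The idea behind the scheme can be sketched in a few lines: the thermal-hydraulic solver keeps its large step while the neutronics solver is sub-cycled whenever a rapid reactivity change is detected. Both solvers below are toy stand-ins, not coupled production codes, and the threshold is an invented criterion.

    ```python
    # Schematic of leap-frog coupling with time step sub-cycling. The
    # thermal-hydraulics (TH) solver marches with a large step; neutronics
    # is sub-cycled when reactivity changes quickly. All models are toys.
    def th_step(T, power, dt):
        return T + dt * (0.01 * power - 0.05 * (T - 300.0))   # toy heat balance

    def neutronics_step(power, reactivity, dt):
        return power * (1.0 + reactivity * dt)                # toy point kinetics

    T, power, t, DT = 300.0, 1.0, 0.0, 0.5
    while t < 10.0:
        reactivity = 0.02 if 4.0 <= t < 5.0 else 0.001        # sudden insertion
        # Sub-cycle criterion: a large reactivity forces smaller kinetics steps.
        n_sub = 10 if abs(reactivity) > 0.01 else 1
        for _ in range(n_sub):
            power = neutronics_step(power, reactivity, DT / n_sub)
        T = th_step(T, power, DT)    # TH keeps its large step (leap-frog style)
        t += DT
    ```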

  15. ORBIT: A Code for Collective Beam Dynamics in High-Intensity Rings

    NASA Astrophysics Data System (ADS)

    Holmes, J. A.; Danilov, V.; Galambos, J.; Shishlo, A.; Cousineau, S.; Chou, W.; Michelotti, L.; Ostiguy, J.-F.; Wei, J.

    2002-12-01

    We are developing a computer code, ORBIT, specifically for beam dynamics calculations in high-intensity rings. Our approach allows detailed simulation of realistic accelerator problems. ORBIT is a particle-in-cell tracking code that transports bunches of interacting particles through a series of nodes representing elements, effects, or diagnostics that occur in the accelerator lattice. At present, ORBIT contains detailed models for strip-foil injection, including painting and foil scattering; rf focusing and acceleration; transport through various magnetic elements; longitudinal and transverse impedances; longitudinal, transverse, and three-dimensional space charge forces; collimation and limiting apertures; and the calculation of many useful diagnostic quantities. ORBIT is an object-oriented code, written in C++ and utilizing a scripting interface for the convenience of the user. Ongoing improvements include the addition of a library of accelerator maps, BEAMLINE/MXYZPTLK; the introduction of a treatment of magnet errors and fringe fields; the conversion of the scripting interface to the standard scripting language, Python; and the parallelization of the computations using MPI. The ORBIT code is an open source, powerful, and convenient tool for studying beam dynamics in high-intensity rings.

  16. Interactive computer modeling of combustion chemistry and coalescence-dispersion modeling of turbulent combustion

    NASA Technical Reports Server (NTRS)

    Pratt, D. T.

    1984-01-01

    An interactive computer code for simulation of a high-intensity turbulent combustor as a single-point inhomogeneous stirred reactor was developed from an existing batch-processing computer code, CDPSR. The interactive CDPSR code was used as a guide for interpretation and direction of DOE-sponsored companion experiments utilizing a xenon tracer with optical laser diagnostic techniques to experimentally determine the appropriate mixing frequency, and for validation of CDPSR as a mixing-chemistry model for a laboratory jet-stirred reactor. The coalescence-dispersion model for finite-rate mixing was incorporated into an existing interactive code, AVCO-MARK I, to enable simulation of a combustor as a modular array of stirred-flow and plug-flow elements, each having a prescribed finite mixing frequency, or axial distribution of mixing frequency, as appropriate. The speed and reliability of the batch kinetics integrator code CREKID were further increased by rewriting it in vectorized form for execution on a vector or parallel processor, and by incorporating numerical techniques that enhance execution speed by permitting specification of a very low accuracy tolerance.

  17. Malnutrition coding 101: financial impact and more.

    PubMed

    Giannopoulos, Georgia A; Merriman, Louise R; Rumsey, Alissa; Zwiebel, Douglas S

    2013-12-01

    Recent articles have addressed the characteristics associated with adult malnutrition as published by the Academy of Nutrition and Dietetics (the Academy) and the American Society for Parenteral and Enteral Nutrition (A.S.P.E.N.). This article describes a successful interdisciplinary program developed by the Department of Food and Nutrition at New York-Presbyterian Hospital to maintain and monitor clinical documentation, ensure accurate International Classification of Diseases 9th Edition (ICD-9) coding, and identify subsequent incremental revenue resulting from the early identification, documentation, and treatment of malnutrition in an adult inpatient population. The first step in the process requires registered dietitians to identify patients with malnutrition; then clear and specifically worded diagnostic statements that include the type and severity of malnutrition are documented in the medical record by the physician, nurse practitioner, or physician's assistant. This protocol allows the Heath Information Management/Coding department to accurately assign ICD-9 codes associated with protein-energy malnutrition. Once clinical coding is complete, a final diagnosis related group (DRG) is generated to ensure appropriate hospital reimbursement. Successful interdisciplinary programs such as this can drive optimal care and ensure appropriate reimbursement.

  18. Decree No. 2737 issuing the Code of Minors, 27 November 1989.

    PubMed

    1989-01-01

    This document contains major provisions of the 1989 Code of Minors of Colombia. This Code spells out the rights of minors to protection, care, and adequate physical, mental, and social development. These rights go into force from the moment of conception. Minors have a specified right to life; to a defined filiation; to grow up within a family; to receive an education (compulsory to the ninth grade and free of charge); to be protected from abuse; to health care; to freedom of speech and to know their rights; to liberty of thought, conscience, and religion; to rest, recreation, and play; to participate in sports and the arts; and to be protected from labor exploitation. Handicapped minors have the right to care, education, and special training. Minors also have the right to be protected from the use of dependency-creating drugs. Any minor in an "irregular situation" will receive protective services. The Code defines abandoned minors and those in danger and provides specific protective measures which can be taken. Rules and procedures covering adoption are included in the Code, because adoption is viewed as primarily a protective measure.

  19. Parallelized direct execution simulation of message-passing parallel programs

    NASA Technical Reports Server (NTRS)

    Dickens, Phillip M.; Heidelberger, Philip; Nicol, David M.

    1994-01-01

    As massively parallel computers proliferate, there is growing interest in finding ways by which the performance of massively parallel codes can be efficiently predicted. This problem arises in diverse contexts such as parallelizing compilers, parallel performance monitoring, and parallel algorithm development. In this paper we describe one solution where one directly executes the application code, but uses a discrete-event simulator to model details of the presumed parallel machine, such as operating system and communication network behavior. Because this approach is computationally expensive, we are interested in its own parallelization, specifically the parallelization of the discrete-event simulator. We describe methods suitable for parallelized direct execution simulation of message-passing parallel programs, and report on the performance of such a system, the Large Application Parallel Simulation Environment (LAPSE), which we have built on the Intel Paragon. On all codes measured to date, LAPSE predicts performance well, typically within 10 percent relative error. Depending on the nature of the application code, we have observed low slowdowns (relative to natively executing code) and high relative speedups using up to 64 processors.

  20. Transient Ejector Analysis (TEA) code user's guide

    NASA Technical Reports Server (NTRS)

    Drummond, Colin K.

    1993-01-01

    A FORTRAN computer program for the semianalytic prediction of unsteady thrust-augmenting ejector performance has been developed, based on a theoretical analysis of ejectors. That analysis blends classic self-similar turbulent jet descriptions with control-volume mixing region elements. Division of the ejector into an inlet, diffuser, and mixing region allowed flexibility in the modeling of the physics for each region. In particular, the inlet and diffuser analyses are simplified by a quasi-steady analysis, justified by the assumption that pressure is the forcing function in those regions. Only the mixing region is assumed to be dominated by viscous effects. The present work provides an overview of the code structure, a description of the required input and output data file formats, and the results for a test case. Since there are limitations to the code for applications outside the bounds of the test case, the user should consider TEA a research code (not a production code), designed specifically as an implementation of the proposed ejector theory. Program error flags are discussed, and some diagnostic routines are presented.

  1. A numerical similarity approach for using retired Current Procedural Terminology (CPT) codes for electronic phenotyping in the Scalable Collaborative Infrastructure for a Learning Health System (SCILHS).

    PubMed

    Klann, Jeffrey G; Phillips, Lori C; Turchin, Alexander; Weiler, Sarah; Mandl, Kenneth D; Murphy, Shawn N

    2015-12-11

    Interoperable phenotyping algorithms, needed to identify patient cohorts meeting eligibility criteria for observational studies or clinical trials, require medical data in a consistent structured, coded format. Data heterogeneity limits such algorithms' applicability. Existing approaches are often not widely interoperable, or have low sensitivity due to reliance on the lowest common denominator (ICD-9 diagnoses). In the Scalable Collaborative Infrastructure for a Learning Healthcare System (SCILHS) we endeavor to use the widely available Current Procedural Terminology (CPT) procedure codes with ICD-9. Unfortunately, CPT changes drastically from year to year: codes are retired and replaced. Longitudinal analysis requires grouping retired and current codes. BioPortal provides a navigable CPT hierarchy, which we imported into the Informatics for Integrating Biology and the Bedside (i2b2) data warehouse and analytics platform. However, this hierarchy does not include retired codes. We compared BioPortal's 2014AA CPT hierarchy with Partners Healthcare's SCILHS datamart, comprising three million patients' data over 15 years. 573 CPT codes were not present in 2014AA (6.5 million occurrences). No existing terminology provided hierarchical linkages for these missing codes, so we developed a method that automatically places missing codes in the most specific "grouper" category, using the numerical similarity of CPT codes. Two informaticians reviewed the results. We incorporated the final table into our i2b2 SCILHS/PCORnet ontology, deployed it at seven sites, and performed a gap analysis and an evaluation against several phenotyping algorithms. The reviewers found the method placed the code correctly with 97% precision when considering only miscategorizations ("correctness precision") and 52% precision using a gold standard of optimal placement ("optimality precision"). High correctness precision meant that codes were placed in a reasonable hierarchical position that a reviewer could quickly validate. Lower optimality precision meant that codes were often not placed in the optimal hierarchical subfolder. The seven sites encountered few occurrences of codes outside our ontology, 93% of which comprised just four codes. Our hierarchical approach correctly grouped retired and non-retired codes in most cases and extended the temporal reach of several important phenotyping algorithms. We developed a simple, easily validated, automated method to place retired CPT codes into the BioPortal CPT hierarchy. This complements existing hierarchical terminologies, which do not include retired codes. The approach's utility is confirmed by the high correctness precision and successful grouping of retired with non-retired codes.
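
    The numerical-similarity placement can be sketched as follows, with an invented two-level hierarchy standing in for the BioPortal CPT tree; the range boundaries, labels, and placement rule are illustrative, not the published method.

    ```python
    # Sketch of numerical-similarity placement: a retired CPT code is placed
    # under the grouper whose code range contains it, else the nearest range.
    groupers = {
        # grouper label: (low code, high code); contents are invented
        "Surgery/Integumentary (10000-19999)":  (10000, 19999),
        "Surgery/Musculoskeletal (20000-29999)": (20000, 29999),
        "Radiology (70000-79999)":               (70000, 79999),
    }

    def place_retired(code: str) -> str:
        n = int(code)
        # Prefer a grouper whose range contains the code.
        for label, (lo, hi) in groupers.items():
            if lo <= n <= hi:
                return label
        # Otherwise fall back to the numerically nearest range boundary.
        return min(groupers,
                   key=lambda g: min(abs(n - groupers[g][0]),
                                     abs(n - groupers[g][1])))

    print(place_retired("11101"))   # -> Surgery/Integumentary (10000-19999)
    ```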

  2. The effect of cost construction based on either DRG or ICD-9 codes or risk group stratification on the resulting cost-effectiveness ratios.

    PubMed

    Chumney, Elinor C G; Biddle, Andrea K; Simpson, Kit N; Weinberger, Morris; Magruder, Kathryn M; Zelman, William N

    2004-01-01

    As cost-effectiveness analyses (CEAs) are increasingly used to inform policy decisions, there is a need for more information on how different cost determination methods affect cost estimates and the degree to which the resulting cost-effectiveness ratios (CERs) may be affected. The lack of specificity of diagnosis-related groups (DRGs) could mean that they are ill-suited for costing applications in CEAs. Yet, the implications of using International Classification of Diseases-9th edition (ICD-9) codes or a form of disease-specific risk group stratification instead of DRGs has yet to be clearly documented. To demonstrate the implications of different disease coding mechanisms on costs and the magnitude of error that could be introduced in head-to-head comparisons of resulting CERs. We based our analyses on a previously published Markov model for HIV/AIDS therapies. We used the Healthcare Cost and Utilisation Project Nationwide Inpatient Sample (HCUP-NIS) data release 6, which contains all-payer data on hospital inpatient stays from selected states. We added costs for the mean number of hospitalisations, derived from analyses based on either DRG or ICD-9 codes or risk group stratification cost weights, to the standard outpatient and prescription drug costs to yield an estimate of total charges for each AIDS-defining illness (ADI). Finally, we estimated the Markov model three times with the appropriate ADI cost weights to obtain CERs specific to the use of either DRG or ICD-9 codes or risk group. Contrary to expectations, we found that the choice of coding/grouping assumptions that are disease-specific by either DRG codes, ICD-9 codes or risk group resulted in very similar CER estimates for highly active antiretroviral therapy. The large variations in the specific ADI cost weights across the three different coding approaches was especially interesting. However, because no one approach produced consistently higher estimates than the others, the Markov model's weighted cost per event and resulting CERs were remarkably close in value to one another. Although DRG codes are based on broader categories and contain less information than ICD-9 codes, in practice the choice of whether to use DRGs or ICD-9 codes may have little effect on the CEA results in heterogeneous conditions such as HIV/AIDS.

  3. Establishing the Thematic Framework for a Diabetes-Specific Health-Related Quality of Life Item Bank for Use in an English-Speaking Asian Population

    PubMed Central

    Koh, Odelia; Lee, Jeannette; Tan, Maudrene L. S.; Tai, E-Shyong; Foo, Ce Jin; Chong, Kok Joon; Goh, Su-Yen; Bee, Yong Mong; Thumboo, Julian; Cheung, Yin-Bun; Singh, Avjeet; Wee, Hwee-Lin

    2014-01-01

    Aims To establish a thematic framework for a Diabetes Mellitus (DM)-specific health-related quality of life (HRQoL) item bank by identifying important HRQoL themes and content gaps in existing DM-specific HRQoL measures and determining whether Patient-Reported Outcomes Measurement Information System (PROMIS) item banks are useful as a starting point. Methodology English-speaking Type 2 DM patients were recruited from an outpatient specialist clinic in Singapore. Thematic analysis was performed through open coding and axial coding. Items from four existing DM-specific measures and PROMIS Version 1.0 and 2.0 item banks were compared with identified themes and sub-themes. Results 42 patients participated (25 men and 17 women; 28 Chinese, 4 Malay, 8 Indians, 2 other ethnicities). Median age was 53.70 years (IQR 45.82-56.97) and the median disease duration was 11.13 (SD 9.77) years. 10 subthemes (neutral emotions, coping emotions, empowered to help others, support from family, spend more time with family, relationships, financial burden on family, improved relationship, social support and religion/spirituality) were not covered by existing DM-specific measures. PROMIS covered 5 of 6 themes, 15 of 30 subthemes and 19 of 35 codes identified. Emotional distress (frustration, fear and anxiety) was most frequently mentioned (200 times). Conclusions We developed a thematic framework for assessing DM-specific HRQoL in a multi-ethnic Asian population, identified new items that needed to be written and confirmed that PROMIS was a useful starting point. We hope that better understanding and measurement of HRQoL of Asian DM patients will translate to better quality of care for them. PMID:25531429

  4. System and method for deriving a process-based specification

    NASA Technical Reports Server (NTRS)

    Hinchey, Michael Gerard (Inventor); Rouff, Christopher A. (Inventor); Rash, James Larry (Inventor)

    2009-01-01

    A system and method for deriving a process-based specification for a system is disclosed. The process-based specification is mathematically inferred from a trace-based specification. The trace-based specification is derived from a non-empty set of traces or natural language scenarios. The process-based specification is mathematically equivalent to the trace-based specification. Code is generated, if applicable, from the process-based specification. A process, or phases of a process, using the features disclosed can be reversed and repeated to allow for an interactive development and modification of legacy systems. The process is applicable to any class of system, including, but not limited to, biological and physical systems, electrical and electro-mechanical systems in addition to software, hardware and hybrid hardware-software systems.

  5. Co-LncRNA: investigating the lncRNA combinatorial effects in GO annotations and KEGG pathways based on human RNA-Seq data

    PubMed Central

    Zhao, Zheng; Bai, Jing; Wu, Aiwei; Wang, Yuan; Zhang, Jinwen; Wang, Zishan; Li, Yongsheng; Xu, Juan; Li, Xia

    2015-01-01

    Long non-coding RNAs (lncRNAs) are emerging as key regulators of diverse biological processes and diseases. However, the combinatorial effects of these molecules in a specific biological function are poorly understood. Identifying co-expressed protein-coding genes of lncRNAs would provide ample insight into lncRNA functions. To facilitate such an effort, we have developed Co-LncRNA, which is a web-based computational tool that allows users to identify GO annotations and KEGG pathways that may be affected by co-expressed protein-coding genes of a single or multiple lncRNAs. LncRNA co-expressed protein-coding genes were first identified in publicly available human RNA-Seq datasets, including 241 datasets across 6560 total individuals representing 28 tissue types/cell lines. Then, the lncRNA combinatorial effects in a given GO annotations or KEGG pathways are taken into account by the simultaneous analysis of multiple lncRNAs in user-selected individual or multiple datasets, which is realized by enrichment analysis. In addition, this software provides a graphical overview of pathways that are modulated by lncRNAs, as well as a specific tool to display the relevant networks between lncRNAs and their co-expressed protein-coding genes. Co-LncRNA also supports users in uploading their own lncRNA and protein-coding gene expression profiles to investigate the lncRNA combinatorial effects. It will be continuously updated with more human RNA-Seq datasets on an annual basis. Taken together, Co-LncRNA provides a web-based application for investigating lncRNA combinatorial effects, which could shed light on their biological roles and could be a valuable resource for this community. Database URL: http://www.bio-bigdata.com/Co-LncRNA/ PMID:26363020

  6. The emergence of international food safety standards and guidelines: understanding the current landscape through a historical approach.

    PubMed

    Ramsingh, Brigit

    2014-07-01

    Following the Second World War, the Food and Agriculture Organization (FAO) and the World Health Organization (WHO) teamed up to construct an International Codex Alimentarius (or 'food code') which emerged in 1963. The Codex Committee on Food Hygiene (CCFH) was charged with the task of developing microbial hygiene standards, although it found itself embroiled in debate with the WHO over the nature these standards should take. The WHO was increasingly relying upon the input of biometricians and especially the International Commission on Microbial Specifications for Foods (ICMSF) which had developed statistical sampling plans for determining the microbial counts in the final end products. The CCFH, however, was initially more focused on a qualitative approach which looked at the entire food production system and developed codes of practice as well as more descriptive end-product specifications which the WHO argued were 'not scientifically correct'. Drawing upon historical archival material (correspondence and reports) from the WHO and FAO, this article examines this debate over microbial hygiene standards and suggests that there are many lessons from history which could shed light upon current debates and efforts in international food safety management systems and approaches.

  7. Science based integrated approach to advanced nuclear fuel development - integrated multi-scale multi-physics hierarchical modeling and simulation framework Part III: cladding

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tome, Carlos N; Caro, J A; Lebensohn, R A

    2010-01-01

    Advancing the performance of Light Water Reactors, Advanced Nuclear Fuel Cycles, and Advanced Reactors, such as the Next Generation Nuclear Power Plants, requires enhancing our fundamental understanding of fuel and materials behavior under irradiation. The capability to accurately model nuclear fuel systems in order to develop predictive tools is critical. Not only are fabrication and performance models needed to understand specific aspects of the nuclear fuel, fully coupled fuel simulation codes are required to achieve licensing of specific nuclear fuel designs for operation. The backbone of these codes, models, and simulations is a fundamental understanding of, and predictive capability for simulating, the phase and microstructural behavior of the nuclear fuel system materials and matrices. In this paper we review the current status of advanced modeling and simulation of nuclear reactor cladding, with emphasis on what is available and what is to be developed in each scale of the project, how we propose to pass information from one scale to the next, and what experimental information is required for benchmarking and advancing the modeling at each scale level.

  8. Automated encoding of clinical documents based on natural language processing.

    PubMed

    Friedman, Carol; Shagina, Lyudmila; Lussier, Yves; Hripcsak, George

    2004-01-01

    The aim of this study was to develop a method based on natural language processing (NLP) that automatically maps an entire clinical document to codes with modifiers and to quantitatively evaluate the method. An existing NLP system, MedLEE, was adapted to automatically generate codes. The method involves matching of structured output generated by MedLEE consisting of findings and modifiers to obtain the most specific code. Recall and precision applied to Unified Medical Language System (UMLS) coding were evaluated in two separate studies. Recall was measured using a test set of 150 randomly selected sentences, which were processed using MedLEE. Results were compared with a reference standard determined manually by seven experts. Precision was measured using a second test set of 150 randomly selected sentences from which UMLS codes were automatically generated by the method and then validated by experts. Recall of the system for UMLS coding of all terms was .77 (95% CI.72-.81), and for coding terms that had corresponding UMLS codes recall was .83 (.79-.87). Recall of the system for extracting all terms was .84 (.81-.88). Recall of the experts ranged from .69 to .91 for extracting terms. The precision of the system was .89 (.87-.91), and precision of the experts ranged from .61 to .91. Extraction of relevant clinical information and UMLS coding were accomplished using a method based on NLP. The method appeared to be comparable to or better than six experts. The advantage of the method is that it maps text to codes along with other related information, rendering the coded output suitable for effective retrieval.
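
    The matching step (a structured finding plus modifiers mapped to the most specific available code) can be sketched as follows; the code table and terms are invented for illustration and are not UMLS content or MedLEE output.

    ```python
    # Sketch of most-specific-code matching: try the finding with all of its
    # modifiers first, then progressively drop modifiers until a code matches.
    CODE_TABLE = {
        ("pneumonia",): "C0032285",
        ("pneumonia", "left", "lower lobe"): "C0032300",   # more specific entry
    }

    def best_code(finding: str, modifiers: list[str]):
        for k in range(len(modifiers), -1, -1):
            key = (finding, *modifiers[:k])
            if key in CODE_TABLE:
                return CODE_TABLE[key]      # first hit is the most specific
        return None

    print(best_code("pneumonia", ["left", "lower lobe"]))  # -> C0032300
    ```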

  9. Evaluation of seismic design spectrum based on UHS implementing fourth-generation seismic hazard maps of Canada

    NASA Astrophysics Data System (ADS)

    Ahmed, Ali; Hasan, Rafiq; Pekau, Oscar A.

    2016-12-01

    Two recent developments have come to the forefront with reference to updating the seismic design provisions for codes: (1) publication of new seismic hazard maps for Canada by the Geological Survey of Canada (GSC), and (2) emergence of a new spectral format outdating the conventional standardized spectral format. The fourth-generation seismic hazard maps are based on enriched seismic data, enhanced knowledge of regional seismicity and improved seismic hazard modeling techniques. The new maps are therefore more accurate and need to be incorporated into the next edition of the Canadian Highway Bridge Design Code (CHBDC), as was done for its building counterpart, the National Building Code of Canada (NBCC). In fact, the code writers expressed similar intentions in the commentary of CHBDC 2006. During their update processes, the NBCC and the AASHTO Guide Specifications for LRFD Seismic Bridge Design (American Association of State Highway and Transportation Officials, Washington, 2009) lowered the probability level from 10% to 2% and from 10% to 5%, respectively. This study investigates five sets of hazard maps developed by the GSC, corresponding to 2%, 5% and 10% probabilities of exceedance in 50 years. To support sound statistical inference, 389 Canadian cities are selected. This study shows the implications of the new hazard maps for the design process (i.e., the extent of magnification or reduction of the design forces).

  10. Flexible high speed codec

    NASA Technical Reports Server (NTRS)

    Boyd, R. W.; Hartman, W. F.

    1992-01-01

    The project's objective is to develop an advanced high speed coding technology that provides substantial coding gains with limited bandwidth expansion for several common modulation types. The resulting technique is applicable to several continuous and burst communication environments. Decoding provides a significant gain with hard decisions alone and can utilize soft decision information when available from the demodulator to increase the coding gain. The hard decision codec will be implemented using a single application specific integrated circuit (ASIC) chip. It will be capable of coding and decoding as well as some formatting and synchronization functions at data rates up to 300 megabits per second (Mb/s). Code rate is a function of the block length and can vary from 7/8 to 15/16. Length of coded bursts can be any multiple of 32 that is greater than or equal to 256 bits. Coding may be switched in or out on a burst by burst basis with no change in the throughput delay. Reliability information in the form of 3-bit (8-level) soft decisions, can be exploited using applique circuitry around the hard decision codec. This applique circuitry will be discrete logic in the present contract. However, ease of transition to LSI is one of the design guidelines. Discussed here is the selected coding technique. Its application to some communication systems is described. Performance with 4, 8, and 16-ary Phase Shift Keying (PSK) modulation is also presented.

  11. Circumstances of Trauma and Accidents in Children: A Thesaurus-based Survey

    PubMed

    Séjourné, Claire; Philbois, Olivier; Vercherin, Paul; Patural, Hugues

    2016-11-25

    Introduction: Injuries and accidents are major causes of morbidity and mortality in children in France. Identification and description of the mechanisms of accidents are essential to develop adapted prevention methods. For this purpose, a specific thesaurus of ICD-10 codes relating to the circumstances of trauma and accidents in children was created in the French Loire department. The objective of this study was to evaluate the relevance and acceptability of the thesaurus in the pediatric emergency unit of Saint-Etienne university hospital. Material and Methods: This study was conducted in two phases. The first, longitudinal phase was conducted over three periods between May and October 2014 to compare codings by emergency room physicians before using the thesaurus with those defined on the basis of the thesaurus. The second phase retrospectively compared coding in July and August 2014 before introduction of the thesaurus with thesaurus-based coding in July and August 2015. Results: The first phase showed a loss of more than half of the information without the thesaurus. The circumstances of trauma can be described by an appropriate code in more than 90% of cases. The second phase showed a 13% increase in coding of the circumstances of trauma, which nevertheless remains insufficient. Discussion: The thesaurus facilitates coding, generally meets the coding physician's expectations, and should be used in large-scale epidemiological surveys.

  12. Transparent ICD and DRG coding using information technology: linking and associating information sources with the eXtensible Markup Language.

    PubMed

    Hoelzer, Simon; Schweiger, Ralf K; Dudeck, Joachim

    2003-01-01

    With the introduction of ICD-10 as the standard for diagnostics, it becomes necessary to develop an electronic representation of its complete content, inherent semantics, and coding rules. The authors' design relates to the current efforts by the CEN/TC 251 to establish a European standard for hierarchical classification systems in health care. The authors have developed an electronic representation of ICD-10 with the eXtensible Markup Language (XML) that facilitates integration into current information systems and coding software, taking different languages and versions into account. In this context, XML provides a complete processing framework of related technologies and standard tools that helps develop interoperable applications. XML provides semantic markup. It allows domain-specific definition of tags and hierarchical document structure. The idea of linking and thus combining information from different sources is a valuable feature of XML. In addition, XML topic maps are used to describe relationships between different sources, or "semantically associated" parts of these sources. The issue of achieving a standardized medical vocabulary becomes more and more important with the stepwise implementation of diagnostically related groups, for example. The aim of the authors' work is to provide a transparent and open infrastructure that can be used to support clinical coding and to develop further software applications. The authors are assuming that a comprehensive representation of the content, structure, inherent semantics, and layout of medical classification systems can be achieved through a document-oriented approach.
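    As a concrete illustration of the document-oriented approach described above, a few lines of Python build a hierarchical XML fragment for one ICD-10 category with multilingual labels. The element names here are invented for the sketch and do not reproduce the authors' actual schema:

    ```python
    import xml.etree.ElementTree as ET

    # Hypothetical element names; the real ICD-10 chapter IX (I00-I99) covers
    # diseases of the circulatory system, and I21 is acute myocardial infarction.
    chapter = ET.Element("chapter", code="IX", title="Diseases of the circulatory system")
    block = ET.SubElement(chapter, "block", code="I20-I25")
    cat = ET.SubElement(block, "category", code="I21")
    ET.SubElement(cat, "label", lang="en").text = "Acute myocardial infarction"
    ET.SubElement(cat, "label", lang="de").text = "Akuter Myokardinfarkt"

    # Serialize the hierarchy; tags can be validated or linked via topic maps.
    print(ET.tostring(chapter, encoding="unicode"))
    ```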

  13. Transparent ICD and DRG Coding Using Information Technology: Linking and Associating Information Sources with the eXtensible Markup Language

    PubMed Central

    Hoelzer, Simon; Schweiger, Ralf K.; Dudeck, Joachim

    2003-01-01

    With the introduction of ICD-10 as the standard for diagnostics, it becomes necessary to develop an electronic representation of its complete content, inherent semantics, and coding rules. The authors' design relates to the current efforts by the CEN/TC 251 to establish a European standard for hierarchical classification systems in health care. The authors have developed an electronic representation of ICD-10 with the eXtensible Markup Language (XML) that facilitates integration into current information systems and coding software, taking different languages and versions into account. In this context, XML provides a complete processing framework of related technologies and standard tools that helps develop interoperable applications. XML provides semantic markup. It allows domain-specific definition of tags and hierarchical document structure. The idea of linking and thus combining information from different sources is a valuable feature of XML. In addition, XML topic maps are used to describe relationships between different sources, or “semantically associated” parts of these sources. The issue of achieving a standardized medical vocabulary becomes more and more important with the stepwise implementation of diagnostically related groups, for example. The aim of the authors' work is to provide a transparent and open infrastructure that can be used to support clinical coding and to develop further software applications. The authors are assuming that a comprehensive representation of the content, structure, inherent semantics, and layout of medical classification systems can be achieved through a document-oriented approach. PMID:12807813

  14. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gougar, Hans

    This document outlines the development of a high fidelity, best estimate nuclear power plant severe transient simulation capability that will complement or enhance the integral system codes historically used for licensing and analysis of severe accidents. As with other tools in the Risk Informed Safety Margin Characterization (RISMC) Toolkit, the ultimate user of Enhanced Severe Transient Analysis and Prevention (ESTAP) capability is the plant decision-maker; the deliverable to that customer is a modern, simulation-based safety analysis capability, applicable to a much broader class of safety issues than is traditional Light Water Reactor (LWR) licensing analysis. Currently, the RISMC pathway’s major emphasis is placed on developing RELAP-7, a next-generation safety analysis code, and on showing how to use RELAP-7 to analyze margin from a modern point of view: that is, by characterizing margin in terms of the probabilistic spectra of the “loads” applied to systems, structures, and components (SSCs), and the “capacity” of those SSCs to resist those loads without failing. The first objective of the ESTAP task, and the focus of one task of this effort, is to augment RELAP-7 analyses with user-selected multi-dimensional, multi-phase models of specific plant components to simulate complex phenomena that may lead to, or exacerbate, severe transients and core damage. Such phenomena include: coolant crossflow between PWR assemblies during a severe reactivity transient, stratified single or two-phase coolant flow in primary coolant piping, inhomogeneous mixing of emergency coolant water or boric acid with hot primary coolant, and water hammer. These are well-documented phenomena associated with plant transients that are generally not captured in system codes. They are, however, generally limited to specific components, structures, and operating conditions. The second ESTAP task is to similarly augment a severe (post-core damage) accident integral analysis code with high fidelity simulations that would allow investigation of multi-dimensional, multi-phase containment phenomena that are only treated approximately in established codes.

  15. Behavior change techniques used in group-based behavioral support by the English stop-smoking services and preliminary assessment of association with short-term quit outcomes.

    PubMed

    West, Robert; Evans, Adam; Michie, Susan

    2011-12-01

    To develop a reliable coding scheme for components of group-based behavioral support for smoking cessation, to establish the frequency of inclusion in English Stop-Smoking Service (SSS) treatment manuals of specific components, and to investigate the associations between inclusion of behavior change techniques (BCTs) and service success rates. A taxonomy of BCTs specific to group-based behavioral support was developed and reliability of use assessed. All English SSSs (n = 145) were contacted to request their group-support treatment manuals. BCTs included in the manuals were identified using this taxonomy. Associations between inclusion of specific BCTs and short-term (4-week) self-reported quit outcomes were assessed. Fourteen group-support BCTs were identified with >90% agreement between coders. One hundred and seven services responded to the request for group-support manuals, of which 30 had suitable documents. On average, 7 BCTs were included in each manual. Two were positively associated with 4-week quit rates: "communicate group member identities" and a "betting game" (a financial deposit that is lost if a stop-smoking "buddy" relapses). It is possible to reliably code group-specific BCTs for smoking cessation. Fourteen such techniques are present in guideline documents, of which 2 appear to be associated with higher short-term self-reported quit rates when included in treatment manuals of English SSSs.

  16. Chaste: An Open Source C++ Library for Computational Physiology and Biology

    PubMed Central

    Mirams, Gary R.; Arthurs, Christopher J.; Bernabeu, Miguel O.; Bordas, Rafel; Cooper, Jonathan; Corrias, Alberto; Davit, Yohan; Dunn, Sara-Jane; Fletcher, Alexander G.; Harvey, Daniel G.; Marsh, Megan E.; Osborne, James M.; Pathmanathan, Pras; Pitt-Francis, Joe; Southern, James; Zemzemi, Nejib; Gavaghan, David J.

    2013-01-01

    Chaste — Cancer, Heart And Soft Tissue Environment — is an open source C++ library for the computational simulation of mathematical models developed for physiology and biology. Code development has been driven by two initial applications: cardiac electrophysiology and cancer development. A large number of cardiac electrophysiology studies have been enabled and performed, including high-performance computational investigations of defibrillation on realistic human cardiac geometries. New models for the initiation and growth of tumours have been developed. In particular, cell-based simulations have provided novel insight into the role of stem cells in the colorectal crypt. Chaste is constantly evolving and is now being applied to a far wider range of problems. The code provides modules for handling common scientific computing components, such as meshes and solvers for ordinary and partial differential equations (ODEs/PDEs). Re-use of these components avoids the need for researchers to ‘re-invent the wheel’ with each new project, accelerating the rate of progress in new applications. Chaste is developed using industrially-derived techniques, in particular test-driven development, to ensure code quality, re-use and reliability. In this article we provide examples that illustrate the types of problems Chaste can be used to solve, which can be run on a desktop computer. We highlight some scientific studies that have used or are using Chaste, and the insights they have provided. The source code, both for specific releases and the development version, is available to download under an open source Berkeley Software Distribution (BSD) licence at http://www.cs.ox.ac.uk/chaste, together with details of a mailing list and links to documentation and tutorials. PMID:23516352

  17. Performance Measures of Diagnostic Codes for Detecting Opioid Overdose in the Emergency Department.

    PubMed

    Rowe, Christopher; Vittinghoff, Eric; Santos, Glenn-Milo; Behar, Emily; Turner, Caitlin; Coffin, Phillip O

    2017-04-01

    Opioid overdose mortality has tripled in the United States since 2000 and opioids are responsible for more than half of all drug overdose deaths, which reached an all-time high in 2014. Opioid overdoses resulting in death, however, represent only a small fraction of all opioid overdose events and efforts to improve surveillance of this public health problem should include tracking nonfatal overdose events. International Classification of Disease (ICD) diagnosis codes, increasingly used for the surveillance of nonfatal drug overdose events, have not been rigorously assessed for validity in capturing overdose events. The present study aimed to validate the use of ICD, 9th revision, Clinical Modification (ICD-9-CM) codes in identifying opioid overdose events in the emergency department (ED) by examining multiple performance measures, including sensitivity and specificity. Data on ED visits from January 1, 2012, to December 31, 2014, including clinical determination of whether the visit constituted an opioid overdose event, were abstracted from electronic medical records for patients prescribed long-term opioids for pain from any of six safety net primary care clinics in San Francisco, California. Combinations of ICD-9-CM codes were validated in the detection of overdose events as determined by medical chart review. Both sensitivity and specificity of different combinations of ICD-9-CM codes were calculated. Unadjusted logistic regression models with robust standard errors and accounting for clustering by patient were used to explore whether overdose ED visits with certain characteristics were more or less likely to be assigned an opioid poisoning ICD-9-CM code by the documenting physician. Forty-four (1.4%) of 3,203 ED visits among 804 patients were determined to be opioid overdose events. Opioid-poisoning ICD-9-CM codes (E850.2-E850.2, 965.00-965.09) identified overdose ED visits with a sensitivity of 25.0% (95% confidence interval [CI] = 13.6% to 37.8%) and specificity of 99.9% (95% CI = 99.8% to 100.0%). Expanding the ICD-9-CM codes to include both nonspecified and general (i.e., without a decimal modifier) drug poisoning and drug abuse codes identified overdose ED visits with a sensitivity of 56.8% (95% CI = 43.6%-72.7%) and specificity of 96.2% (95% CI = 94.8%-97.2%). Additional ICD-9-CM codes not explicitly relevant to opioid overdose were necessary to further enhance sensitivity. Among the 44 overdose ED visits, neither naloxone administration during the visit, whether the patient responded to the naloxone, nor the specific opioids involved were associated with the assignment of an opioid poisoning ICD-9-CM code (p ≥ 0.05). Tracking opioid overdose ED visits by diagnostic coding is fairly specific but insensitive, and coding was not influenced by administration of naloxone or the specific opioids involved. The reason for the high rate of missed cases is uncertain, although these results suggest that a more clearly defined case definition for overdose may be necessary to ensure effective opioid overdose surveillance. Changes in coding practices under ICD-10 might help to address these deficiencies. © 2016 by the Society for Academic Emergency Medicine.
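    The two performance measures the study reports follow directly from the confusion-matrix counts: sensitivity = TP/(TP+FN) and specificity = TN/(TN+FP). The short Python sketch below uses counts chosen to be consistent with the reported 25.0% sensitivity and 99.9% specificity (11 of 44 true overdose visits flagged, out of 3,203 total visits); these counts are illustrative reconstructions, not figures taken from the paper's tables:

    ```python
    def sens_spec(tp: int, fn: int, tn: int, fp: int) -> tuple[float, float]:
        """Sensitivity = TP / (TP + FN); specificity = TN / (TN + FP)."""
        return tp / (tp + fn), tn / (tn + fp)

    # Illustrative counts consistent with the reported figures (11 + 33 true
    # overdoses = 44; 3156 + 3 non-overdoses = 3159; total = 3203 visits).
    sens, spec = sens_spec(tp=11, fn=33, tn=3156, fp=3)
    print(f"sensitivity={sens:.1%}  specificity={spec:.1%}")
    ```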

  18. Fluid Film Bearing Code Development

    NASA Technical Reports Server (NTRS)

    1995-01-01

    The next generation of rocket engine turbopumps is being developed by industry through Government-directed contracts. These turbopumps will use fluid film bearings because they eliminate the life and shaft-speed limitations of rolling-element bearings, increase turbopump design flexibility, and reduce the need for turbopump overhauls and maintenance. The design of the fluid film bearings for these turbopumps, however, requires sophisticated analysis tools to model the complex physical behavior characteristic of fluid film bearings operating at high speeds with low viscosity fluids. State-of-the-art analysis and design tools are being developed at the Texas A&M University under a grant guided by the NASA Lewis Research Center. The latest version of the code, HYDROFLEXT, is a thermohydrodynamic bulk flow analysis with fluid compressibility, full inertia, and fully developed turbulence models. It can predict the static and dynamic force response of rigid and flexible pad hydrodynamic bearings and of rigid and tilting pad hydrostatic bearings. The Texas A&M code is a comprehensive analysis tool, incorporating key fluid phenomena pertinent to bearings that operate at high speeds with low-viscosity fluids typical of those used in rocket engine turbopumps. Specifically, the energy equation was implemented into the code to enable fluid properties to vary with temperature and pressure. This is particularly important for cryogenic fluids because their properties are sensitive to temperature as well as pressure. As shown in the figure, predicted bearing mass flow rates vary significantly depending on the fluid model used. Because cryogens are semicompressible fluids and the bearing dynamic characteristics are highly sensitive to fluid compressibility, fluid compressibility effects are also modeled. The code contains fluid properties for liquid hydrogen, liquid oxygen, and liquid nitrogen as well as for water and air. Other fluids can be handled by the code provided that the user inputs information that relates the fluid transport properties to the temperature.

  19. A Fast Healthcare Interoperability Resources (FHIR) layer implemented over i2b2.

    PubMed

    Boussadi, Abdelali; Zapletal, Eric

    2017-08-14

    Standards and technical specifications have been developed to define how the information contained in Electronic Health Records (EHRs) should be structured, semantically described, and communicated. Current trends rely on differentiating the representation of data instances from the definition of clinical information models. The dual model approach, which combines a reference model (RM) and a clinical information model (CIM), sets in practice this software design pattern. The most recent initiative, proposed by HL7, is called Fast Health Interoperability Resources (FHIR). The aim of our study was to investigate the feasibility of applying the FHIR standard to modeling and exposing EHR data of the Georges Pompidou European Hospital (HEGP) Integrating Biology and the Bedside (i2b2) clinical data warehouse (CDW). We implemented a FHIR server over i2b2 to expose EHR data in relation with five FHIR resources: DiagnosticReport, MedicationOrder, Patient, Encounter, and Medication. The architecture of the server combines a Data Access Object design pattern and FHIR resource providers, implemented using the Java HAPI FHIR API. Two types of queries were tested: query type #1 requests the server to display DiagnosticReport resources for which the diagnosis code is equal to a given ICD-10 code. A total of 80 DiagnosticReport resources, corresponding to 36 patients, were displayed. Query type #2 requests the server to display MedicationOrder resources for which the FHIR Medication identification code is equal to a given code expressed in a French coding system. A total of 503 MedicationOrder resources, corresponding to 290 patients, were displayed. Results were validated by manually comparing the results of each request to the results displayed by an ad-hoc SQL query. We showed the feasibility of implementing a Java layer over the i2b2 database model to expose data of the CDW as a set of FHIR resources. An important part of this work was the structural and semantic mapping between the i2b2 model and the FHIR RM. To accomplish this, developers must manually browse the specifications of the FHIR standard. Our source code is freely available and can be adapted for use in other i2b2 sites.
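    A hedged sketch of what query type #1 looks like against a generic FHIR REST endpoint, using the standard DiagnosticReport "code" search parameter in system|code token form. The base URL and ICD-10 code are placeholders; the HEGP server and its HAPI-based resource providers described in the paper are not public:

    ```python
    import requests  # third-party HTTP client

    BASE = "http://localhost:8080/fhir"  # placeholder FHIR endpoint

    # Query type #1: DiagnosticReport resources filtered by an ICD-10 code.
    resp = requests.get(
        f"{BASE}/DiagnosticReport",
        params={"code": "http://hl7.org/fhir/sid/icd-10|I21"},
        headers={"Accept": "application/fhir+json"},
        timeout=10,
    )
    bundle = resp.json()  # FHIR search results come back as a Bundle
    for entry in bundle.get("entry", []):
        print(entry["resource"]["id"])
    ```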

  20. Classifying Chinese Questions Related to Health Care Posted by Consumers Via the Internet.

    PubMed

    Guo, Haihong; Na, Xu; Hou, Li; Li, Jiao

    2017-06-20

    In question answering (QA) system development, question classification is crucial for identifying information needs and improving the accuracy of returned answers. Although the questions are domain-specific, they are asked by non-professionals, making the question classification task more challenging. This study aimed to classify health care-related questions posted by the general public (Chinese speakers) on the Internet. A topic-based classification schema for health-related questions was built by manually annotating randomly selected questions. The Kappa statistic was used to measure the interrater reliability of multiple annotation results. Using the above corpus, we developed a machine-learning method to automatically classify these questions into one of the following six classes: Condition Management, Healthy Lifestyle, Diagnosis, Health Provider Choice, Treatment, and Epidemiology. The consumer health question schema was developed with four hierarchical levels of specificity, comprising 48 quaternary categories and 35 annotation rules. The 2000 sample questions were coded with 2000 major codes and 607 minor codes. Using natural language processing techniques, we expressed the Chinese questions as a set of lexical, grammatical, and semantic features. Furthermore, the effective features were selected to improve the question classification performance. From the 6-category classification results, we achieved an average precision of 91.41%, recall of 89.62%, and F1 score of 90.24%. In this study, we developed an automatic method to classify questions related to Chinese health care posted by the general public. It enables Artificial Intelligence (AI) agents to understand Internet users' information needs on health care. ©Haihong Guo, Xu Na, Li Hou, Jiao Li. Originally published in the Journal of Medical Internet Research (http://www.jmir.org), 20.06.2017.
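    A minimal sketch of the classification step using scikit-learn. This is an assumption-laden stand-in: the authors' pipeline used richer lexical, grammatical, and semantic features over segmented Chinese text, whereas the toy below uses plain TF-IDF on English stand-in questions so it runs without a word segmenter:

    ```python
    from sklearn.feature_extraction.text import TfidfVectorizer
    from sklearn.linear_model import LogisticRegression
    from sklearn.metrics import classification_report
    from sklearn.pipeline import make_pipeline

    # Toy examples for four of the paper's six classes (illustrative only).
    questions = ["How do I manage my blood sugar?",
                 "Which hospital treats heart disease best?",
                 "Is this rash contagious?",
                 "What diet keeps my heart healthy?"]
    labels = ["Condition Management", "Health Provider Choice",
              "Epidemiology", "Healthy Lifestyle"]

    clf = make_pipeline(TfidfVectorizer(), LogisticRegression(max_iter=1000))
    clf.fit(questions, labels)
    # Precision/recall/F1 per class, as reported in the study (here on the
    # tiny training set itself, purely to show the evaluation call).
    print(classification_report(labels, clf.predict(questions)))
    ```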

  1. Benchmark Problems of the Geothermal Technologies Office Code Comparison Study

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    White, Mark D.; Podgorney, Robert; Kelkar, Sharad M.

    A diverse suite of numerical simulators is currently being applied to predict or understand the performance of enhanced geothermal systems (EGS). To build confidence and identify critical development needs for these analytical tools, the United States Department of Energy, Geothermal Technologies Office has sponsored a Code Comparison Study (GTO-CCS), with participants from universities, industry, and national laboratories. A principal objective for the study was to create a community forum for improvement and verification of numerical simulators for EGS modeling. Teams participating in the study were those representing U.S. national laboratories, universities, and industries, and each team brought unique numerical simulation capabilities to bear on the problems. Two classes of problems were developed during the study: benchmark problems and challenge problems. The benchmark problems were structured to test the ability of the collection of numerical simulators to solve various combinations of coupled thermal, hydrologic, geomechanical, and geochemical processes. This class of problems was strictly defined in terms of properties, driving forces, initial conditions, and boundary conditions. Study participants submitted solutions to problems for which their simulation tools were deemed capable or nearly capable. Some participating codes were originally developed for EGS applications whereas some others were designed for different applications but can simulate processes similar to those in EGS. Solution submissions from both were encouraged. In some cases, participants made small incremental changes to their numerical simulation codes to address specific elements of the problem, and in other cases participants submitted solutions with existing simulation tools, acknowledging the limitations of the code. The challenge problems were based on the enhanced geothermal systems research conducted at Fenton Hill, near Los Alamos, New Mexico, between 1974 and 1995. The problems involved two phases of research: stimulation, development, and circulation in two separate reservoirs. The challenge problems had specific questions to be answered via numerical simulation in three topical areas: 1) reservoir creation/stimulation, 2) reactive and passive transport, and 3) thermal recovery. Whereas the benchmark class of problems was designed to test capabilities for modeling coupled processes under strictly specified conditions, the stated objective for the challenge class of problems was to demonstrate what new understanding of the Fenton Hill experiments could be realized via the application of modern numerical simulation tools by recognized expert practitioners.

  2. Research into language concepts for the mission control center

    NASA Technical Reports Server (NTRS)

    Dellenback, Steven W.; Barton, Timothy J.; Ratner, Jeremiah M.

    1990-01-01

    A final report is given on research into language concepts for the Mission Control Center (MCC). The Specification Driven Language research is described. The state of the image processing field and how image processing techniques could be applied toward automating the generation of the language known as COmputation Development Environment (CODE or Comp Builder) are discussed. Also described is the development of a flight certified compiler for Comps.

  3. Small Engine Technology (SET) Task 23 ANOPP Noise Prediction for Small Engines, Wing Reflection Code

    NASA Technical Reports Server (NTRS)

    Lieber, Lysbeth; Brown, Daniel; Golub, Robert A. (Technical Monitor)

    2000-01-01

    The work performed under Task 23 consisted of the development and demonstration of improvements for the NASA Aircraft Noise Prediction Program (ANOPP), specifically targeted to the modeling of engine noise enhancement due to wing reflection. This report focuses on development of the model and procedure to predict the effects of wing reflection, and the demonstration of the procedure, using a representative wing/engine configuration.

  4. Poly(A) code analyses reveal key determinants for tissue-specific mRNA alternative polyadenylation

    PubMed Central

    Weng, Lingjie; Li, Yi; Xie, Xiaohui; Shi, Yongsheng

    2016-01-01

    mRNA alternative polyadenylation (APA) is a critical mechanism for post-transcriptional gene regulation and is often regulated in a tissue- and/or developmental stage-specific manner. An ultimate goal for the APA field has been to be able to computationally predict APA profiles under different physiological or pathological conditions. As a first step toward this goal, we have assembled a poly(A) code for predicting tissue-specific poly(A) sites (PASs). Based on a compendium of over 600 features that have known or potential roles in PAS selection, we have generated and refined a machine-learning algorithm using multiple high-throughput sequencing-based data sets of tissue-specific and constitutive PASs. This code can predict tissue-specific PASs with >85% accuracy. Importantly, by analyzing the prediction performance based on different RNA features, we found that PAS context, including the distance between alternative PASs and the relative position of a PAS within the gene, is a key feature for determining the susceptibility of a PAS to tissue-specific regulation. Our poly(A) code provides a useful tool for not only predicting tissue-specific APA regulation, but also for studying its underlying molecular mechanisms. PMID:27095026

  5. Evaluation in industry of a draft code of practice for manual handling.

    PubMed

    Ashby, Liz; Tappin, David; Bentley, Tim

    2004-05-01

    This paper reports findings from a study which evaluated the draft New Zealand Code of Practice for Manual Handling. The evaluation assessed the ease of use, applicability and validity of the Code and in particular the associated manual handling hazard assessment tools, within New Zealand industry. The Code was studied in a sample of eight companies from four sectors of industry. Subjective feedback and objective findings indicated that the Code was useful, applicable and informative. The manual handling hazard assessment tools incorporated in the Code could be adequately applied by most users, with risk assessment outcomes largely consistent with the findings of researchers using more specific ergonomics methodologies. However, some changes were recommended to the risk assessment tools to improve usability and validity. The evaluation concluded that both the Code and the tools within it would benefit from simplification, improved typography and layout, and industry-specific information on manual handling hazards.

  6. Development, dissemination, and applications of a new terminological resource, the Q-Code taxonomy for professional aspects of general practice/family medicine.

    PubMed

    Jamoulle, Marc; Resnick, Melissa; Grosjean, Julien; Ittoo, Ashwin; Cardillo, Elena; Vander Stichele, Robert; Darmoni, Stefan; Vanmeerbeek, Marc

    2018-12-01

    While documentation of the clinical aspects of General Practice/Family Medicine (GP/FM) is assured by the International Classification of Primary Care (ICPC), there is no taxonomy for the professional aspects (context and management) of GP/FM. To present the development, dissemination, applications, and resulting face validity of the Q-Code taxonomy, specifically designed to describe contextual features of GP/FM and proposed as an extension to the ICPC. The Q-Code taxonomy was developed from Lamberts' seminal idea for indexing contextual content (1987) by a multi-disciplinary team of knowledge engineers, linguists, and general practitioners, through a qualitative and iterative analysis of 1702 abstracts from six GP/FM conferences using Atlas.ti software. A total of 182 concepts, called Q-Codes, representing professional aspects of GP/FM were identified and organized in a taxonomy. Dissemination: The taxonomy is published as an online terminological resource using semantic web techniques and the Web Ontology Language (OWL) (http://www.hetop.eu/Q). Each Q-Code is identified with a unique resource identifier (URI) and provided with preferred terms and scope notes in ten languages (Portuguese, Spanish, English, French, Dutch, Korean, Vietnamese, Turkish, Georgian, German), together with search filters for MEDLINE and web searches. This taxonomy has already been used to support queries in bibliographic databases (e.g., MEDLINE), to facilitate indexing of grey literature in GP/FM such as congress abstracts, master theses, and websites, and as an educational tool in vocational teaching. Conclusions: The rapidly growing list of practical applications provides face validity for the usefulness of this freely available new terminological resource.

  7. Automated model integration at source code level: An approach for implementing models into the NASA Land Information System

    NASA Astrophysics Data System (ADS)

    Wang, S.; Peters-Lidard, C. D.; Mocko, D. M.; Kumar, S.; Nearing, G. S.; Arsenault, K. R.; Geiger, J. V.

    2014-12-01

    Model integration bridges the data flow between modeling frameworks and models. However, models usually do not fit directly into a particular modeling environment if they were not designed for it. An example includes implementing different types of models into the NASA Land Information System (LIS), a software framework for land-surface modeling and data assimilation. Model implementation requires scientific knowledge and software expertise, and it may take a developer months to learn LIS and the model software structure. Debugging and testing of the model implementation is also time-consuming without a full understanding of LIS or the model. This time is costly for research and operational projects. To address this issue, an approach has been developed to automate model integration into LIS. A general model interface was designed to retrieve the forcing inputs, parameters, and state variables needed by the model, and to provide state variables and outputs back to LIS. Every model can be wrapped to comply with the interface, usually with a FORTRAN 90 subroutine. Development efforts need only knowledge of the model and basic programming skills. With such wrappers, the logic is the same for implementing all models. Code templates defined for this general model interface can be re-used with any specific model; the model implementation can therefore be done automatically. An automated model implementation toolkit was developed with Microsoft Excel and its built-in VBA language. It allows model specifications in three worksheets and contains FORTRAN 90 code templates in VBA programs. According to the model specification, the toolkit generates data structures and procedures within FORTRAN modules and subroutines, which transfer data between LIS and the model wrapper. Model implementation is standardized, and about 80-90% of the development load is reduced. In this presentation, the automated model implementation approach is described along with LIS programming interfaces, the general model interface, and five case studies, including a regression model, Noah-MP, FASST, SAC-HTET/SNOW-17, and FLake. These models vary in complexity of software structure. We also describe how these complexities were overcome through this approach, and present results of model benchmarks within LIS.
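    A Python sketch of the generic model interface pattern described above. The real interface is a generated FORTRAN 90 subroutine inside LIS; every name below (ModelIO, ToyModel, run_model_step) is illustrative, not part of LIS:

    ```python
    from dataclasses import dataclass, field

    @dataclass
    class ModelIO:
        """Stand-in for the forcing/parameter/state exchange with the framework."""
        forcings: dict = field(default_factory=dict)   # inputs from the framework
        params: dict = field(default_factory=dict)
        states: dict = field(default_factory=dict)     # exchanged both ways

    class ToyModel:
        """Minimal model honoring the three-step wrapper contract."""
        def set_inputs(self, io: ModelIO) -> None:
            self.temp = io.forcings.get("air_temp_K", 288.0)
        def advance_one_timestep(self) -> None:
            self.soil_temp = 0.9 * self.temp          # placeholder physics
        def get_states(self) -> dict:
            return {"soil_temp_K": self.soil_temp}

    def run_model_step(model, io: ModelIO) -> None:
        """The wrapper logic is identical for every model: pull, advance, push."""
        model.set_inputs(io)
        model.advance_one_timestep()
        io.states.update(model.get_states())

    io = ModelIO(forcings={"air_temp_K": 290.0})
    run_model_step(ToyModel(), io)
    print(io.states)
    ```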

  8. Creep and Creep-Fatigue Crack Growth at Structural Discontinuities and Welds

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dr. F. W. Brust; Dr. G. M. Wilkowski; Dr. P. Krishnaswamy

    2010-01-27

    The subsection ASME NH high temperature design procedure does not admit crack-like defects into the structural components. The US NRC identified the lack of treatment of crack growth within NH as a limitation of the code, and thus this effort was undertaken. This effort is broken into two parts. Part 1, summarized here, involved examining all high temperature creep-fatigue crack growth codes being used today and, from these, choosing a methodology that is appropriate for possible implementation within NH. The second part of this task, which has just started, is to develop design rules for possible implementation within NH. This second part is a challenge since all codes require step-by-step analysis procedures to be undertaken in order to assess the crack growth and life of the component. Simple rules for design do not exist in any code at present. The codes examined in this effort included R5, RCC-MR (A16), BS 7910, API 579, and ATK (and some lesser known codes). There are several reasons that the capability for assessing cracks in high temperature nuclear components is desirable. These include: (1) Some components that are part of GEN IV reactors may have geometries with sharp corners, which are essentially cracks. Design of these components within the traditional ASME NH procedure is quite challenging. It is natural to ensure adequate life design by modeling these features as cracks within a creep-fatigue crack growth procedure. (2) Workmanship flaws in welds sometimes occur and are accepted in some ASME code sections. It can be convenient to consider these as flaws when making a design life assessment. (3) Non-destructive Evaluation (NDE) and inspection methods after fabrication are limited in the size of the crack or flaw that can be detected. It is often convenient to perform a life assessment using a flaw of a size that represents the maximum size that can elude detection. (4) Flaws that are observed using in-service detection methods often need to be addressed as plants age. Shutdown inspection intervals can only be designed using creep and creep-fatigue crack growth techniques. (5) The use of crack growth procedures can aid in examining the seriousness of creep damage in structural components. How cracks grow can be used to assess margins on components and lead to further safe operation. After examining the pros and cons of all these methods, the R5 code was chosen as the most up-to-date and validated high temperature creep and creep-fatigue code currently used in the world. R5 is considered the leader because the code: (1) has well-established and validated rules, (2) has a team of experts continually improving and updating it, (3) has software that can be used by designers, (4) has extensive validation with available data from BE resources as well as input from Imperial College's database, and (5) was specifically developed for use in nuclear plants. R5 was specifically developed for the gas-cooled nuclear reactors which operate in the UK, and much of the experience is based on the materials and temperatures found in those reactors. If the next generation of advanced reactors to be built in the US used these same materials within the same temperature ranges, then R5 may be appropriate for consideration of direct implementation within ASME code NH or Section XI.
However, until more verification and validation of these creep/fatigue crack growth rules for the specific materials and temperatures to be used in the GEN IV reactors is complete, ASME should consider delaying this implementation. With this in mind, it is this author's opinion that R5 methods are the best available for code use today. The focus of this work was to examine the literature for creep and creep-fatigue crack growth procedures that are well established in codes in other countries and to choose a procedure to consider for implementation into ASME NH. It is very important to recognize that all creep and creep-fatigue crack growth procedures that are part of high temperature design codes are related and very similar. This effort made no attempt to develop a new creep-fatigue crack growth predictive methodology; rather, examination of current procedures was the only goal. The uncertainties in the R5 crack growth methods and recommendations for further work are also summarized here.

  9. Xyce Parallel Electronic Simulator Users Guide Version 6.2.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Keiter, Eric R.; Mei, Ting; Russo, Thomas V.

    This manual describes the use of the Xyce Parallel Electronic Simulator. Xyce has been designed as a SPICE-compatible, high-performance analog circuit simulator, and has been written to support the simulation needs of the Sandia National Laboratories electrical designers. This development has focused on improving capability over the current state-of-the-art in the following areas: capability to solve extremely large circuit problems by supporting large-scale parallel computing platforms (up to thousands of processors), with support for most popular parallel and serial computers; a differential-algebraic-equation (DAE) formulation, which better isolates the device model package from solver algorithms and allows one to develop new types of analysis without requiring the implementation of analysis-specific device models; device models that are specifically tailored to meet Sandia's needs, including some radiation-aware devices (for Sandia users only); and object-oriented code design and implementation using modern coding practices. Xyce is a parallel code in the most general sense of the phrase -- a message-passing parallel implementation -- which allows it to run efficiently on a wide range of computing platforms, including serial, shared-memory, and distributed-memory parallel platforms. Attention has been paid to the specific nature of circuit-simulation problems to ensure that optimal parallel efficiency is achieved as the number of processors grows.

  10. Xyce Parallel Electronic Simulator Users Guide Version 6.4

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Keiter, Eric R.; Mei, Ting; Russo, Thomas V.

    This manual describes the use of the Xyce Parallel Electronic Simulator. Xyce has been designed as a SPICE-compatible, high-performance analog circuit simulator, and has been written to support the simulation needs of the Sandia National Laboratories electrical designers. This development has focused on improving capability over the current state-of-the-art in the following areas: capability to solve extremely large circuit problems by supporting large-scale parallel computing platforms (up to thousands of processors), with support for most popular parallel and serial computers; a differential-algebraic-equation (DAE) formulation, which better isolates the device model package from solver algorithms and allows one to develop new types of analysis without requiring the implementation of analysis-specific device models; device models that are specifically tailored to meet Sandia's needs, including some radiation-aware devices (for Sandia users only); and object-oriented code design and implementation using modern coding practices. Xyce is a parallel code in the most general sense of the phrase -- a message-passing parallel implementation -- which allows it to run efficiently on a wide range of computing platforms, including serial, shared-memory, and distributed-memory parallel platforms. Attention has been paid to the specific nature of circuit-simulation problems to ensure that optimal parallel efficiency is achieved as the number of processors grows.

  11. Gene-specific cell labeling using MiMIC transposons

    PubMed Central

    Gnerer, Joshua P.; Venken, Koen J. T.; Dierick, Herman A.

    2015-01-01

    Binary expression systems such as GAL4/UAS, LexA/LexAop and QF/QUAS have greatly enhanced the power of Drosophila as a model organism by allowing spatio-temporal manipulation of gene function as well as cell and neural circuit function. Tissue-specific expression of these heterologous transcription factors relies on random transposon integration near enhancers or promoters that drive the binary transcription factor embedded in the transposon. Alternatively, gene-specific promoter elements are directly fused to the binary factor within the transposon followed by random or site-specific integration. However, such insertions do not consistently recapitulate endogenous expression. We used Minos-Mediated Integration Cassette (MiMIC) transposons to convert host loci into reliable gene-specific binary effectors. MiMIC transposons allow recombinase-mediated cassette exchange to modify the transposon content. We developed novel exchange cassettes to convert coding intronic MiMIC insertions into gene-specific binary factor protein-traps. In addition, we expanded the set of binary factor exchange cassettes available for non-coding intronic MiMIC insertions. We show that binary factor conversions of different insertions in the same locus have indistinguishable expression patterns, suggesting that they reliably reflect endogenous gene expression. We show the efficacy and broad applicability of these new tools by dissecting the cellular expression patterns of the Drosophila serotonin receptor gene family. PMID:25712101

  12. Development of a new version of the Vehicle Protection Factor Code (VPF3)

    NASA Astrophysics Data System (ADS)

    Jamieson, Terrance J.

    1990-10-01

    The Vehicle Protection Factor (VPF) Code is an engineering tool for estimating radiation protection afforded by armoured vehicles and other structures exposed to neutron and gamma ray radiation from fission, thermonuclear, and fusion sources. A number of suggestions for modifications have been offered by users of early versions of the code. These include: implementing some of the more advanced features of the air transport rating code, ATR5, used to perform the air over ground radiation transport analyses; allowing the ability to study specific vehicle orientations within the free field; implementing an adjoint transport scheme to reduce the number of transport runs required; investigating the possibility of accelerating the transport scheme; and upgrading the computer automated design (CAD) package used by VPF. The generation of radiation free field fluences for infinite air geometries as required for aircraft analysis can be accomplished by using ATR with the air over ground correction factors disabled. Analysis of the effects of fallout bearing debris clouds on aircraft will require additional modelling of VPF.

  13. Fabrication of Circuit QED Quantum Processors, Part 1: Extensible Footprint for a Superconducting Surface Code

    NASA Astrophysics Data System (ADS)

    Bruno, A.; Michalak, D. J.; Poletto, S.; Clarke, J. S.; Dicarlo, L.

    Large-scale quantum computation hinges on the ability to preserve and process quantum information with higher fidelity by increasing redundancy in a quantum error correction code. We present the realization of a scalable footprint for a superconducting surface code based on planar circuit QED. We developed a tileable unit cell for the surface code with all I/O routed vertically by means of superconducting through-silicon vias (TSVs). We address some of the challenges encountered during the fabrication and assembly of these chips, such as the quality of the TSV etch, the uniformity of the ALD TiN coating conformal to the TSV, and the reliability of the superconducting indium contact between the chips and the PCB. We compare measured performance to a detailed list of specifications required for the realization of quantum fault tolerance. Our demonstration using centimeter-scale chips can accommodate the 50 qubits needed to target the experimental demonstration of small-distance logical qubits. Research funded by Intel Corporation and IARPA.

  14. Employing multi-GPU power for molecular dynamics simulation: an extension of GALAMOST

    NASA Astrophysics Data System (ADS)

    Zhu, You-Liang; Pan, Deng; Li, Zhan-Wei; Liu, Hong; Qian, Hu-Jun; Zhao, Yang; Lu, Zhong-Yuan; Sun, Zhao-Yan

    2018-04-01

    We describe the algorithm for employing multi-GPU power on the basis of Message Passing Interface (MPI) domain decomposition in a molecular dynamics code, GALAMOST, which is designed for the coarse-grained simulation of soft matter. The multi-GPU version is developed from our previous single-GPU version. In multi-GPU runs, one GPU takes charge of one domain and runs the single-GPU code path. The communication between neighbouring domains follows an algorithm similar to that of the CPU-based code LAMMPS, but is optimised specifically for GPUs. We employ a memory-saving design which can enlarge the maximum system size on the same device. An optimisation algorithm is employed to prolong the update period of the neighbour list. We demonstrate good performance of multi-GPU runs on workstation simulations of a Lennard-Jones liquid, a dissipative particle dynamics liquid, a polymer and nanoparticle composite, and two-patch particles. Good scaling across many cluster nodes is presented for two-patch particles.
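    The neighbour-exchange pattern underlying such a domain decomposition can be sketched with mpi4py. This is a CPU-only illustration of the MPI layer under a 1-D periodic decomposition; GALAMOST's actual implementation pairs each domain with a GPU and exchanges particle data rather than scalar halos:

    ```python
    # Run with: mpiexec -n 4 python halo.py
    import numpy as np
    from mpi4py import MPI

    comm = MPI.COMM_WORLD
    rank, size = comm.Get_rank(), comm.Get_size()
    left, right = (rank - 1) % size, (rank + 1) % size   # periodic neighbours

    local = np.full(8, float(rank))        # cells owned by this domain
    ghost_left = np.empty(1)               # halo received from the left
    ghost_right = np.empty(1)              # halo received from the right

    # Deadlock-free paired exchange of boundary cells with both neighbours.
    comm.Sendrecv(sendbuf=local[-1:], dest=right, recvbuf=ghost_left, source=left)
    comm.Sendrecv(sendbuf=local[:1], dest=left, recvbuf=ghost_right, source=right)

    print(f"rank {rank}: left ghost={ghost_left[0]:.0f}, right ghost={ghost_right[0]:.0f}")
    ```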

  15. Convolutional coding combined with continuous phase modulation

    NASA Technical Reports Server (NTRS)

    Pizzi, S. V.; Wilson, S. G.

    1985-01-01

    Background theory and specific coding designs for combined coding/modulation schemes utilizing convolutional codes and continuous-phase modulation (CPM) are presented. In this paper, the case of r = 1/2 coding onto a 4-ary CPM is emphasized, with short constraint-length codes presented for continuous-phase FSK, double-raised-cosine, and triple-raised-cosine modulation. Coding buys several decibels of coding gain over the Gaussian channel, with an attendant increase of bandwidth. Performance comparisons in the power-bandwidth tradeoff with other approaches are made.
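    For readers unfamiliar with the coding half of such schemes, here is a minimal rate-1/2 convolutional encoder in Python. It uses the textbook constraint-length-3 generators (7, 5) in octal as an illustration, not the specific short-constraint-length designs tabulated in the paper:

    ```python
    def conv_encode(bits, g1=0b111, g2=0b101, k=3):
        """Rate-1/2 convolutional encoder. The shift register 'state' holds
        the current input bit plus the k-1 previous bits; each input bit
        produces two output bits, one per generator polynomial."""
        state = 0
        out = []
        for b in bits:
            state = ((state << 1) | b) & ((1 << k) - 1)
            out.append(bin(state & g1).count("1") % 2)   # parity under g1
            out.append(bin(state & g2).count("1") % 2)   # parity under g2
        return out

    print(conv_encode([1, 0, 1, 1]))   # -> [1, 1, 1, 0, 0, 0, 0, 1]
    ```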

  16. The international implications of national and local coordination on building energy codes: Case studies in six cities

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Evans, Meredydd; Yu, Sha; Staniszewski, Aaron

    Building energy efficiency is an important strategy for reducing greenhouse gas emissions globally. In fact, 55 countries have included building energy efficiency in their Nationally Determined Contributions (NDCs) under the Paris Agreement. This research uses building energy code implementation in six cities across different continents as case studies to assess what it may take for countries to implement the ambitions of their energy efficiency goals. Specifically, we look at the cases of Bogota, Colombia; Da Nang, Vietnam; Eskisehir, Turkey; Mexico City, Mexico; Rajkot, India; and Tshwane, South Africa, all of which are “deep dive” cities under the Sustainable Energy for All's Building Efficiency Accelerator. The research focuses on understanding the baseline, including existing gaps in implementation and coordination. The methodology used a combination of surveys on code status and interviews with stakeholders at the local and national level, as well as review of published documents. We looked at code development, implementation, and evaluation. The cities are all working to improve implementation; however, the challenges they currently face include gaps in resources, capacity, tools, and institutions to check for compliance. Better coordination between national and local governments could help improve implementation, but that coordination is not yet well established. For example, all six of the cities reported that there was little to no involvement of local stakeholders in development of the national code; only one city reported that it had access to national funding to support code implementation. More robust coordination could better link cities with capacity building and funding for compliance, and ensure that the code reflects local priorities. By identifying gaps in implementation, this research can also help in designing more targeted interventions to scale up energy savings.

  17. The international implications of national and local coordination on building energy codes: Case studies in six cities

    DOE PAGES

    Evans, Meredydd; Yu, Sha; Staniszewski, Aaron; ...

    2018-04-17

    Building energy efficiency is an important strategy for reducing greenhouse gas emissions globally. In fact, 55 countries have included building energy efficiency in their Nationally Determined Contributions (NDCs) under the Paris Agreement. This research uses building energy code implementation in six cities across different continents as case studies to assess what it may take for countries to implement the ambitions of their energy efficiency goals. Specifically, we look at the cases of Bogota, Colombia; Da Nang, Vietnam; Eskisehir, Turkey; Mexico City, Mexico; Rajkot, India; and Tshwane, South Africa, all of which are “deep dive” cities under the Sustainable Energy for All's Building Efficiency Accelerator. The research focuses on understanding the baseline, including existing gaps in implementation and coordination. The methodology used a combination of surveys on code status and interviews with stakeholders at the local and national level, as well as review of published documents. We looked at code development, implementation, and evaluation. The cities are all working to improve implementation; however, the challenges they currently face include gaps in resources, capacity, tools, and institutions to check for compliance. Better coordination between national and local governments could help improve implementation, but that coordination is not yet well established. For example, all six of the cities reported that there was little to no involvement of local stakeholders in development of the national code; only one city reported that it had access to national funding to support code implementation. More robust coordination could better link cities with capacity building and funding for compliance, and ensure that the code reflects local priorities. By identifying gaps in implementation, this research can also help in designing more targeted interventions to scale up energy savings.

  18. 24 CFR 3280.808 - Wiring methods and materials.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    § 3280.808 Wiring methods and materials. (a) Except as specifically permitted by this part, the wiring methods and materials specified in the National Electrical Code, NFPA No. 70-2005, must be used in...

  19. A Computer-Assisted Nutrition Education Unit for Grades 4-6.

    ERIC Educational Resources Information Center

    Hills, Alvina M.

    1983-01-01

    A computer-assisted instructional unit (written for a 32K Commodore PET microcomputer) was developed to identify the four food groups outlined in Canada's Food Guide, place specific foods in the correct groups, and identify foods not belonging to the four groups. Animated color-coded keys are used to represent the food groups. (JN)

  20. 78 FR 48655 - Multistakeholder Meeting To Develop Consumer Data Privacy Code of Conduct Concerning Mobile...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-08-09

    ... Consumer Privacy Bill of Rights applies in specific business contexts. On July 12, 2012, NTIA convened... most efficiently; how future processes might make stakeholder participation easier and more effective; and how future processes might start with one or more sessions that provide factual background on a...

  1. Recommended Practices for the Safe Design and Operation of Flywheels

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bender, Donald Arthur

    2015-12-01

    Flywheel energy storage systems are in use globally in increasing numbers. No codes pertaining specifically to flywheel energy storage exist. A number of industrial incidents have occurred. This protocol recommends a technical basis for safe flywheel design and operation for consideration by flywheel developers, users of flywheel systems, and standards-setting organizations.

  2. Binary Arithmetic From Hariot (CA, 1600 A.D.) to the Computer Age.

    ERIC Educational Resources Information Center

    Glaser, Anton

    This history of binary arithmetic begins with details of Thomas Hariot's contribution and includes specific references to Hariot's manuscripts kept at the British Museum. A binary code developed by Sir Francis Bacon is discussed. Briefly mentioned are contributions to binary arithmetic made by Leibniz, Fontenelle, Gauss, Euler, Benzout, Barlow,…

  3. Schools as Ethical or Schools as Political? Habermas between Dewey and Rawls

    ERIC Educational Resources Information Center

    Johnston, James Scott

    2012-01-01

    Education is oftentimes understood as a deeply ethical practice for the development of the person. Alternatively, education is construed as a state-enforced apparatus for inculcation of specific codes, conventions, beliefs, and norms about social and political practices. Though holding both of these beliefs about education is not necessarily…

  4. Modelling Conditions and Health Care Processes in Electronic Health Records: An Application to Severe Mental Illness with the Clinical Practice Research Datalink

    PubMed Central

    Olier, Ivan; Springate, David A.; Ashcroft, Darren M.; Doran, Tim; Reeves, David; Planner, Claire; Reilly, Siobhan; Kontopantelis, Evangelos

    2016-01-01

    Background The use of Electronic Health Records databases for medical research has become mainstream. In the UK, increasing use of Primary Care Databases is largely driven by almost complete computerisation and uniform standards within the National Health Service. Electronic Health Records research often begins with the development of a list of clinical codes with which to identify cases with a specific condition. We present a methodology and accompanying Stata and R commands (pcdsearch/Rpcdsearch) to help researchers in this task. We present severe mental illness (SMI) as an example. Methods We used the Clinical Practice Research Datalink, a UK Primary Care Database in which clinical information is largely organised using Read codes, a hierarchical clinical coding system. Pcdsearch is used to identify potentially relevant clinical codes and/or product codes from word-stubs and code-stubs suggested by clinicians. The returned code-lists are reviewed and codes relevant to the condition of interest are selected. The final code-list is then used to identify patients. Results We identified 270 Read codes linked to SMI and used them to identify cases in the database. We observed that our approach identified cases that would have been missed with a simpler approach using SMI registers defined within the UK Quality and Outcomes Framework. Conclusion We described a framework for researchers of Electronic Health Records databases, for identifying patients with a particular condition or matching certain clinical criteria. The method is invariant to coding system or database and can be used with SNOMED CT, ICD or other medical classification code-lists.
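    A toy Python sketch of the stub-search idea behind pcdsearch/Rpcdsearch: match word-stubs against code descriptions and code-stubs against code prefixes, then review the hits. The dictionary entries and stubs below are illustrative, not the SMI code-lists from the paper:

    ```python
    import re

    # Toy Read-code dictionary (code -> description); illustrative only.
    code_dict = {
        "E10..": "Schizophrenic disorders",
        "E11..": "Affective psychoses",
        "Eu20.": "[X]Schizophrenia",
        "H33..": "Asthma",
    }
    word_stubs = ["schizo", "psychos"]   # clinician-suggested word-stubs
    code_stubs = ["Eu2"]                 # clinician-suggested code-stubs

    # A code is a candidate if any word-stub matches its description
    # (case-insensitive) or any code-stub is a prefix of the code itself.
    hits = {c: t for c, t in code_dict.items()
            if any(re.search(s, t, re.I) for s in word_stubs)
            or any(c.startswith(s) for s in code_stubs)}
    print(hits)   # candidate code-list for manual review
    ```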

  5. MELCOR/CONTAIN LMR Implementation Report-Progress FY15

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Humphries, Larry L.; Louie, David L.Y.

    2016-01-01

    This report describes the progress of the CONTAIN-LMR sodium physics and chemistry models to be implemented into MELCOR 2.1. It also describes the progress to implement these models into CONTAIN 2 as well. In the past two years, the implementation included the addition of sodium equations of state and sodium properties from two different sources. The first source is based on previous work done by Idaho National Laboratory, which modified MELCOR to include a liquid lithium equation of state as a working fluid to model nuclear fusion safety research. The second source uses properties generated for the SIMMER code. Testing and results from this implementation of sodium properties are given. In addition, the CONTAIN-LMR code was derived from an early version of the CONTAIN code. Many physical models that were developed since this early version of CONTAIN are not captured by this early code version. Therefore, CONTAIN 2 is being updated with the sodium models in CONTAIN-LMR in order to facilitate verification of these models with the MELCOR code. Although CONTAIN 2, which represents the latest development of CONTAIN, now contains many of the sodium-specific models, this work is not complete due to challenges from the lower cell architecture in CONTAIN 2, which is different from CONTAIN-LMR. This implementation should be completed in the coming year, while sodium models from CONTAIN-LMR are being integrated into MELCOR. For testing, CONTAIN decks have been developed for verification and validation use. In terms of implementing the sodium models into MELCOR, a separate sodium model branch was created for this document. Because of massive development in the mainstream MELCOR 2.1 code and the requirement to merge the latest code version into this branch, the integration of the sodium models was redirected to implement the sodium chemistry models first. This change led to delays in the actual implementation. To aid the future implementation of sodium models, a new sodium chemistry package was created. The implementation of the sodium chemistry is thus discussed in this report.

  6. Methodology, status and plans for development and assessment of Cathare code

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bestion, D.; Barre, F.; Faydide, B.

    1997-07-01

    This paper presents the methodology, status and plans for the development, assessment and uncertainty evaluation of the Cathare code. Cathare is a thermalhydraulic code developed by CEA (DRN), IPSN, EDF and FRAMATOME for PWR safety analysis. First, the status of the code development and assessment is presented, together with the general strategy used for the development and assessment of the code. Analytical experiments with separate effect tests and component tests are used for the development and validation of closure laws. Successive Revisions of constitutive laws are implemented in successive Versions of the code and assessed. System tests or integral tests are used to validate the general consistency of the Revision. Each delivery of a code Version + Revision is fully assessed and documented. A methodology is being developed to determine the uncertainty on all constitutive laws of the code using calculations of many analytical tests and applying the Discrete Adjoint Sensitivity Method (DASM). Finally, the plans for future developments of the code are presented. They concern the optimization of code performance through parallel computing (the code will be used for real-time full-scope plant simulators), the coupling with many other codes (neutronic codes, severe accident codes), and the application of the code to containment thermalhydraulics. Physical improvements are also required in the field of low-pressure transients and in the 3-D model.
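
    For readers unfamiliar with adjoint methods, the sketch below shows the idea behind DASM on a deliberately trivial scalar problem; it is a generic illustration, not CATHARE's implementation.

        # A minimal sketch of adjoint-based sensitivity (the idea behind DASM)
        # on a trivial scalar problem: residual f(u, p) = u**2 - p = 0 with
        # output J(u) = 3*u. Values are illustrative, not CATHARE models.
        import numpy as np

        p = 2.0
        u = np.sqrt(p)               # state solving the residual equation

        dJ_du = 3.0                  # dJ/du
        df_du = 2.0 * u              # df/du at the solution
        df_dp = -1.0                 # df/dp

        lam = dJ_du / df_du          # adjoint solve: (df/du)^T lam = dJ/du
        dJ_dp = -lam * df_dp         # sensitivity of output to parameter

        print(dJ_dp, 3.0 / (2.0 * np.sqrt(p)))  # matches analytic derivative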

  7. ICRF Development for the Variable Specific Impulse Magnetoplasma Rocket

    NASA Astrophysics Data System (ADS)

    Ryan, P. M.; Baity, F. W.; Barber, G. C.; Carter, M. D.; Hoffman, D. J.; Jaeger, E. F.; Taylor, D. J.; Chang-Diaz, F. R.; Squire, J. P.; McCaskill, G.

    1997-11-01

    The feasibility of using magnetically vectored and rf-heated plasmas for space propulsion (F. R. Chang-Diaz, et al., Bull. Am. Phys. Soc., 41, 1541 (1996)) is being investigated experimentally on an asymmetric magnetic mirror device at the Advanced Space Propulsion Laboratory (ASPL), Johnson Space Center, NASA. The antenna interaction with, and the wave propagation through, the dense plasma of the propulsion system are being studied at ORNL (Oak Ridge National Laboratory, managed by Lockheed Martin Energy Research Corp. for the U.S. Department of Energy under contract number DE-AC05-96OR22464), using antenna design codes developed for ICH systems and mirror codes developed for the EBT experiment at ORNL. The present modeling effort is directed toward the ASPL experimental device. Antenna optimization and performance, as well as design considerations for space-qualified rf components and systems (minimizing weight while maximizing reliability), will be presented.

  8. A crystallographic model for nickel base single crystal alloys

    NASA Technical Reports Server (NTRS)

    Dame, L. T.; Stouffer, D. C.

    1988-01-01

    The purpose of this research is to develop a tool for the mechanical analysis of nickel-base single-crystal superalloys, specifically Rene N4, used in gas turbine engine components. This objective is achieved by developing a rate-dependent anisotropic constitutive model and implementing it in a nonlinear three-dimensional finite-element code. The constitutive model is developed from metallurgical concepts utilizing a crystallographic approach. An extension of Schmid's law is combined with the Bodner-Partom equations to model the inelastic tension/compression asymmetry and orientation-dependence in octahedral slip. Schmid's law is used to approximate the inelastic response of the material in cube slip. The constitutive equations model the tensile behavior, creep response and strain-rate sensitivity of the single-crystal superalloys. Methods for deriving the material constants from standard tests are also discussed. The model is implemented in a finite-element code, and the computed and experimental results are compared for several orientations and loading conditions.
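
    The Schmid's-law building block of such a model is simple to state concretely. The sketch below computes the resolved shear stress on one octahedral slip system under uniaxial tension; the stress value and slip system are illustrative assumptions.

        # A minimal sketch of Schmid's law for one octahedral slip system:
        # resolved shear stress tau = m . sigma . n under uniaxial tension.
        # Numbers are illustrative only.
        import numpy as np

        sigma = np.diag([500.0, 0.0, 0.0])             # uniaxial stress (MPa) along x

        n = np.array([1.0, 1.0, 1.0]) / np.sqrt(3.0)   # {111} plane normal
        m = np.array([1.0, -1.0, 0.0]) / np.sqrt(2.0)  # <110> slip direction (m . n = 0)

        tau = m @ sigma @ n                            # resolved shear stress
        print(tau)                                     # 500/sqrt(6), about 204 MPa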

  9. Predicting Regulatory Compliance in Beer Advertising on Facebook.

    PubMed

    Noel, Jonathan K; Babor, Thomas F

    2017-11-01

    The prevalence of alcohol advertising has been growing on social media platforms. The purpose of this study was to evaluate alcohol advertising on Facebook for regulatory compliance and thematic content. A total of 50 Budweiser and Bud Light ads posted on Facebook within 1 month of the 2015 NFL Super Bowl were evaluated for compliance with a self-regulated alcohol advertising code and for thematic content. An exploratory sensitivity/specificity analysis was conducted to determine if thematic content could predict code violations. The code violation rate was 82%, with violations prevalent in guidelines prohibiting the association of alcohol with success (Guideline 5) and health benefits (Guideline 3). Overall, 21 thematic content areas were identified. Displaying the product (62%) and adventure/sensation seeking (52%) were the most prevalent. There was perfect specificity (100%) for 10 content areas for detecting any code violation (animals, negative emotions, positive emotions, games/contests/promotions, female characters, minorities, party, sexuality, night-time, sunrise) and high specificity (>80%) for 10 content areas for detecting violations of guidelines intended to protect minors (animals, negative emotions, famous people, friendship, games/contests/promotions, minorities, responsibility messages, sexuality, sunrise, video games). The high prevalence of code violations indicates a failure of self-regulation to prevent potentially harmful content from appearing in alcohol advertising, including explicit code violations (e.g. sexuality). Routine violations indicate an unwillingness to restrict advertising content for public health purposes, and statutory restrictions may be necessary to sufficiently deter alcohol producers from repeatedly violating marketing codes. Violations of a self-regulated alcohol advertising code are prevalent in a sample of beer ads published on Facebook near the US National Football League's Super Bowl. Overall, 16 thematic content areas demonstrated high specificity for code violations. Alcohol advertising codes should be updated to expressly prohibit the use of such content. © The Author 2017. Medical Council on Alcohol and Oxford University Press. All rights reserved.
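
    The specificity statistic driving the sensitivity/specificity analysis can be computed as follows; the sketch treats presence of a thematic content area as the test and a code violation as the condition, on invented ad records.

        # A minimal sketch of the specificity calculation: theme presence is
        # the "test", a code violation is the "condition". The ad records are
        # invented for illustration.
        ads = [  # (theme_present, code_violation)
            (True, True), (True, True), (False, True),
            (False, False), (False, False), (True, True),
        ]

        tn = sum(1 for theme, violation in ads if not theme and not violation)
        fp = sum(1 for theme, violation in ads if theme and not violation)

        specificity = tn / (tn + fp)  # share of non-violating ads lacking the theme
        print(specificity)            # 1.0: the theme never appears without a violation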

  10. Coding OSICS sports injury diagnoses in epidemiological studies: does the background of the coder matter?

    PubMed

    Finch, Caroline F; Orchard, John W; Twomey, Dara M; Saad Saleem, Muhammad; Ekegren, Christina L; Lloyd, David G; Elliott, Bruce C

    2014-04-01

    To compare Orchard Sports Injury Classification System (OSICS-10) sports medicine diagnoses assigned by a clinical and a non-clinical coder. Assessment of intercoder agreement. Community Australian football. 1082 standardised injury surveillance records. Direct comparison of the four-character hierarchical OSICS-10 codes assigned by two independent coders (a sports physician and an epidemiologist), with adjudication by a third coder (a biomechanist). The coders agreed on the first character 95% of the time and on the first two characters 86% of the time. They assigned the same four-character OSICS-10 code for only 46% of the 1082 injuries. The majority of disagreements occurred at the third character; 85% arose because one coder assigned a non-specific 'X' code. The sports physician's code was deemed correct in 53% of cases and the epidemiologist's in 44%. Reasons for disagreement included the physician not using all of the collected information and the epidemiologist lacking specific anatomical knowledge. Sports injury research requires accurate identification and classification of specific injuries, and this study found an overall high level of agreement in coding according to OSICS-10. That the majority of disagreements occurred at the third OSICS character highlights that increasing complexity and diagnostic specificity in injury coding can result in a loss of reliability and demands a high level of anatomical knowledge. Injury report form details need to reflect this level of complexity, and data management teams need to include a broad range of expertise.
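
    Agreement on a hierarchical code can be scored at each prefix depth, which is how the 95%, 86% and 46% figures above should be read. A minimal sketch with invented four-character codes:

        # A minimal sketch of agreement scoring at each prefix depth of a
        # hierarchical code such as OSICS-10 (first character most general).
        # The code pairs are invented.
        pairs = [("KJXX", "KJSZ"), ("KJSZ", "KJSZ"), ("TMXX", "KMXX"), ("KJTX", "KJTX")]

        for depth in range(1, 5):
            agree = sum(a[:depth] == b[:depth] for a, b in pairs) / len(pairs)
            print(f"first {depth} character(s): {agree:.0%} agreement")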

  11. A Novel Family in Medicago truncatula Consisting of More Than 300 Nodule-Specific Genes Coding for Small, Secreted Polypeptides with Conserved Cysteine Motifs

    PubMed Central

    Mergaert, Peter; Nikovics, Krisztina; Kelemen, Zsolt; Maunoury, Nicolas; Vaubert, Danièle; Kondorosi, Adam; Kondorosi, Eva

    2003-01-01

    Transcriptome analysis of Medicago truncatula nodules has led to the discovery of a gene family named NCR (nodule-specific cysteine rich) with more than 300 members. The encoded polypeptides were short (60–90 amino acids), carried a conserved signal peptide, and, except for a conserved cysteine motif, displayed otherwise extensive sequence divergence. Family members were found in pea (Pisum sativum), broad bean (Vicia faba), white clover (Trifolium repens), and Galega orientalis but not in other plants, including other legumes, suggesting that the family might be specific for galegoid legumes forming indeterminate nodules. Gene expression of all family members was restricted to nodules except for two, also expressed in mycorrhizal roots. NCR genes exhibited distinct temporal and spatial expression patterns in nodules and, thus, were coupled to different stages of development. The signal peptide targeted the polypeptides in the secretory pathway, as shown by green fluorescent protein fusions expressed in onion (Allium cepa) epidermal cells. Coregulation of certain NCR genes with genes coding for a potentially secreted calmodulin-like protein and for a signal peptide peptidase suggests a concerted action in nodule development. Potential functions of the NCR polypeptides in cell-to-cell signaling and creation of a defense system are discussed. PMID:12746522

  12. A review and empirical study of the composite scales of the Das-Naglieri cognitive assessment system.

    PubMed

    McCrea, Simon M

    2009-01-01

    Alexander Luria's model of the working brain consisting of three functional units was formulated through the examination of hundreds of focal brain-injury patients. Several psychometric instruments based on Luria's syndrome analysis and accompanying qualitative tasks have been developed since the 1970s. In the mid-1970s, JP Das and colleagues defined a specific cognitive processes model based directly on Luria's two coding units termed simultaneous and successive by studying diverse cross-cultural, ability, and socioeconomic strata. The cognitive assessment system is based on the PASS model of cognitive processes and consists of four composite scales of Planning-Attention-Simultaneous-Successive (PASS) devised by Naglieri and Das in 1997. Das and colleagues developed the two new scales of planning and attention to more closely model Luria's theory of higher cortical functions. In this paper, a theoretical review of Luria's theory, Das and colleagues' elaboration of Luria's model, and the neural correlates of the PASS composite scales based on extant studies are summarized. A brief empirical study of the neuropsychological specificity of the PASS composite scales in a sample of 33 focal cortical stroke patients using cluster analysis is then discussed. Planning and simultaneous were sensitive to right hemisphere lesions. These findings were integrated with recent functional neuroimaging studies of PASS scales. In sum it was found that simultaneous is strongly dependent on dual bilateral occipitoparietal interhemispheric coordination whereas successive demonstrated left frontotemporal specificity with some evidence of interhemispheric coordination across the prefrontal cortex. Hence, support for the validity of the PASS composite scales was found as well as for the axiom of the independence of code content from code type originally specified in 1994 by Das, Naglieri, and Kirby.

  13. Exceptionally long 5' UTR short tandem repeats specifically linked to primates.

    PubMed

    Namdar-Aligoodarzi, P; Mohammadparast, S; Zaker-Kandjani, B; Talebi Kakroodi, S; Jafari Vesiehsari, M; Ohadi, M

    2015-09-10

    We have previously reported genome-scale short tandem repeats (STRs) in the core promoter interval (i.e. -120 to +1 to the transcription start site) of protein-coding genes that have evolved identically in primates vs. non-primates. Those STRs may function as evolutionary switch codes for primate speciation. In the current study, we used the Ensembl database to analyze the 5' untranslated region (5' UTR) between +1 and +60 of the transcription start site of the entire human protein-coding genes annotated in the GeneCards database, in order to identify "exceptionally long" STRs (≥5-repeats), which may be of selective/adaptive advantage. The importance of this critical interval is its function as core promoter, and its effect on transcription and translation. In order to minimize ascertainment bias, we analyzed the evolutionary status of the human 5' UTR STRs of ≥5-repeats in several species encompassing six major orders and superorders across mammals, including primates, rodents, Scandentia, Laurasiatheria, Afrotheria, and Xenarthra. We introduce primate-specific STRs, and STRs which have expanded from mouse to primates. Identical co-occurrence of the identified STRs of rare average frequency between 0.006 and 0.0001 in primates supports a role for those motifs in processes that diverged primates from other mammals, such as neuronal differentiation (e.g. APOD and FGF4), and craniofacial development (e.g. FILIP1L). A number of the identified STRs of ≥5-repeats may be human-specific (e.g. ZMYM3 and DAZAP1). Future work is warranted to examine the importance of the listed genes in primate/human evolution, development, and disease. Copyright © 2015 Elsevier B.V. All rights reserved.
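
    The underlying scan is a tandem-repeat search. A minimal sketch using a backreference regular expression to flag motifs of 1-6 bp repeated at least five times; the sequence is invented:

        # A minimal sketch of scanning a sequence window for "exceptionally
        # long" STRs: motifs of 1-6 bp repeated at least 5 times in tandem.
        # The sequence is invented; a real scan would also collapse nested hits.
        import re

        def find_strs(seq: str, min_repeats: int = 5):
            hits = []
            for unit in range(1, 7):  # motif lengths 1-6 bp
                pattern = rf"(\w{{{unit}}})\1{{{min_repeats - 1},}}"
                for m in re.finditer(pattern, seq):
                    hits.append((m.start(), m.group(1), len(m.group(0)) // unit))
            return hits

        print(find_strs("ATGCGCGCGCGCGTTTTTTAAC"))  # [(13, 'T', 6), (2, 'GC', 5)]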

  14. A review and empirical study of the composite scales of the Das–Naglieri cognitive assessment system

    PubMed Central

    McCrea, Simon M

    2009-01-01

    Alexander Luria’s model of the working brain consisting of three functional units was formulated through the examination of hundreds of focal brain-injury patients. Several psychometric instruments based on Luria’s syndrome analysis and accompanying qualitative tasks have been developed since the 1970s. In the mid-1970s, JP Das and colleagues defined a specific cognitive processes model based directly on Luria’s two coding units termed simultaneous and successive by studying diverse cross-cultural, ability, and socioeconomic strata. The cognitive assessment system is based on the PASS model of cognitive processes and consists of four composite scales of Planning–Attention–Simultaneous–Successive (PASS) devised by Naglieri and Das in 1997. Das and colleagues developed the two new scales of planning and attention to more closely model Luria’s theory of higher cortical functions. In this paper, a theoretical review of Luria’s theory, Das and colleagues’ elaboration of Luria’s model, and the neural correlates of the PASS composite scales based on extant studies are summarized. A brief empirical study of the neuropsychological specificity of the PASS composite scales in a sample of 33 focal cortical stroke patients using cluster analysis is then discussed. Planning and simultaneous were sensitive to right hemisphere lesions. These findings were integrated with recent functional neuroimaging studies of PASS scales. In sum it was found that simultaneous is strongly dependent on dual bilateral occipitoparietal interhemispheric coordination whereas successive demonstrated left frontotemporal specificity with some evidence of interhemispheric coordination across the prefrontal cortex. Hence, support for the validity of the PASS composite scales was found as well as for the axiom of the independence of code content from code type originally specified in 1994 by Das, Naglieri, and Kirby. PMID:22110322

  15. A Computer Code for Swirling Turbulent Axisymmetric Recirculating Flows in Practical Isothermal Combustor Geometries

    NASA Technical Reports Server (NTRS)

    Lilley, D. G.; Rhode, D. L.

    1982-01-01

    A finite difference computer code using primitive pressure-velocity variables was developed to predict swirling recirculating inert turbulent flows in axisymmetric combustors in general, and for application to a specific idealized combustion chamber with sudden or gradual expansion. The technique involves a staggered grid system for axial and radial velocities, a line relaxation procedure for efficient solution of the equations, a two-equation k-epsilon turbulence model, a stairstep boundary representation of the expansion flow, and realistic accommodation of swirl effects. A user's manual is presented, dealing with the computational problem and showing how the mathematical basis and computational scheme may be translated into a computer program. A flow chart, FORTRAN IV listing, notes about various subroutines, and a user's guide are supplied as an aid to prospective users of the code.
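
    A line relaxation procedure solves the grid one line at a time, which reduces each sweep to a tridiagonal system along that line while neighbouring lines are held fixed. The sketch below shows the tridiagonal (Thomas algorithm) solve such a sweep relies on; it is a generic illustration, not the FORTRAN IV code documented in the report.

        # A minimal sketch of the Thomas algorithm: the tridiagonal solve that a
        # line-relaxation sweep performs along each grid line.
        def thomas(a, b, c, d):
            """Solve a tridiagonal system; a: sub-, b: main, c: super-diagonal, d: rhs."""
            n = len(d)
            cp, dp = [0.0] * n, [0.0] * n
            cp[0], dp[0] = c[0] / b[0], d[0] / b[0]
            for i in range(1, n):
                denom = b[i] - a[i] * cp[i - 1]
                cp[i] = c[i] / denom if i < n - 1 else 0.0
                dp[i] = (d[i] - a[i] * dp[i - 1]) / denom
            x = [0.0] * n
            x[-1] = dp[-1]
            for i in range(n - 2, -1, -1):
                x[i] = dp[i] - cp[i] * x[i + 1]
            return x

        # 1-D Poisson-like test, -u'' = 1 with u = 0 beyond both ends (4 nodes):
        print(thomas([0, -1, -1, -1], [2, 2, 2, 2], [-1, -1, -1, 0], [1, 1, 1, 1]))
        # [2.0, 3.0, 3.0, 2.0]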

  16. Studying Spacecraft Charging via Numerical Simulations

    NASA Astrophysics Data System (ADS)

    Delzanno, G. L.; Moulton, D.; Meierbachtol, C.; Svyatskiy, D.; Vernon, L.

    2015-12-01

    The electrical charging of spacecraft due to bombarding charged particles can affect their performance and operation. We study this charging using CPIC, a particle-in-cell code specifically designed for studying plasma-material interactions [1]. CPIC is based on multi-block curvilinear meshes, resulting in near-optimal computational performance while maintaining geometric accuracy. Relevant plasma parameters are imported from the SHIELDS framework (currently under development at LANL), which simulates geomagnetic storms and substorms in the Earth's magnetosphere. Simulated spacecraft charging results of representative Van Allen Probe geometries using these plasma parameters will be presented, along with an overview of the code. [1] G.L. Delzanno, E. Camporeale, J.D. Moulton, J.E. Borovsky, E.A. MacDonald, and M.F. Thomsen, "CPIC: A Curvilinear Particle-In-Cell Code for Plasma-Material Interaction Studies," IEEE Trans. Plas. Sci., 41 (12), 3577 (2013).
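
    One ingredient common to all particle-in-cell codes is depositing particle charge onto the mesh. The sketch below shows linear (cloud-in-cell) weighting on a uniform 1-D grid; CPIC itself works on multi-block curvilinear meshes, so this flat grid is only to convey the weighting idea.

        # A minimal sketch of one particle-in-cell ingredient: linear
        # (cloud-in-cell) charge deposition on a uniform 1-D grid.
        import numpy as np

        nx, dx = 8, 1.0
        grid_charge = np.zeros(nx)
        particle_x = np.array([2.25, 4.75, 4.9])  # positions in grid units
        q = -1.0                                   # charge per macro-particle

        for x in particle_x:
            i = int(x // dx)                       # left-hand grid node
            w = (x - i * dx) / dx                  # fraction toward node i + 1
            grid_charge[i] += q * (1.0 - w)        # linear weights sum to one
            grid_charge[(i + 1) % nx] += q * w     # periodic wrap for simplicity

        print(grid_charge)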

  17. Multidimensional Fuel Performance Code: BISON

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    BISON is a finite element based nuclear fuel performance code applicable to a variety of fuel forms including light water reactor fuel rods, TRISO fuel particles, and metallic rod and plate fuel (Refs. [a, b, c]). It solves the fully-coupled equations of thermomechanics and species diffusion and includes important fuel physics such as fission gas release and material property degradation with burnup. BISON is based on the MOOSE framework (Ref. [d]) and can therefore efficiently solve problems on 1-, 2- or 3-D meshes using standard workstations or large high performance computers. BISON is also coupled to a MOOSE-based mesoscale phase-field material property simulation capability (Refs. [e, f]). As described here, BISON includes the code library named FOX, which was developed concurrent with BISON. FOX contains material and behavioral models that are specific to oxide fuels.
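
    To make the thermal part of the coupled physics concrete, the sketch below solves a dimensionless 1-D heat-conduction problem with a uniform source by explicit finite differences; BISON itself solves the fully coupled equations implicitly with finite elements, so this is only the building-block idea.

        # A minimal sketch of the heat-conduction building block: dimensionless
        # 1-D conduction with a uniform source, explicit finite differences.
        import numpy as np

        nx = 21
        T = np.zeros(nx)                   # temperature, T = 0 held at both ends
        source = 1.0                       # uniform volumetric heat source
        dx = 1.0 / (nx - 1)
        dt = 1e-4                          # satisfies dt <= dx**2 / 2 for stability

        for _ in range(5000):              # march toward steady state
            lap = (T[:-2] - 2.0 * T[1:-1] + T[2:]) / dx**2
            T[1:-1] += dt * (lap + source)

        print(T.max())                     # ~0.125, peak of steady profile x*(1-x)/2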

  18. Avenue for integrating geoethics into the working world.

    NASA Astrophysics Data System (ADS)

    Boon, Jan

    2017-04-01

    GeoEthics is a young field and is taking shape through events such as the International Declaration on GeoEthics (2011) and conference sessions held since 2013 (http://www.geoethics.org/events). Many of these focused on the geoscience community and had an academic or educational focus. While interest in the subject is growing, the number of participants is still low as compared to that of other geoscience disciplines. The author has found it difficult to generate interest in the formal inclusion of geoethics and the related fields of social and environmental responsibility into university geoscience curricula. This paper proposes to link geoethics to the huge effort that has been under way over the past decade and a half in the extractive industries to develop industry-wide codes that facilitate and structure the introduction of social and environmental responsibility into company operations. All of these codes contain specific references to ethics. The paper builds on two well-developed codes: "e3 Plus - A Framework for Responsible Exploration" (Prospectors and Developers Association of Canada) and "Towards Sustainable Mining" (Mining Association of Canada) to propose an overall practical reference frame for geoethics that may facilitate the transfer of geoethicists' work into daily practice in the extractive industries.

  19. Overview of Particle and Heavy Ion Transport Code System PHITS

    NASA Astrophysics Data System (ADS)

    Sato, Tatsuhiko; Niita, Koji; Matsuda, Norihiro; Hashimoto, Shintaro; Iwamoto, Yosuke; Furuta, Takuya; Noda, Shusaku; Ogawa, Tatsuhiko; Iwase, Hiroshi; Nakashima, Hiroshi; Fukahori, Tokio; Okumura, Keisuke; Kai, Tetsuya; Chiba, Satoshi; Sihver, Lembit

    2014-06-01

    A general purpose Monte Carlo Particle and Heavy Ion Transport code System, PHITS, is being developed through the collaboration of several institutes in Japan and Europe. The Japan Atomic Energy Agency is responsible for managing the entire project. PHITS can deal with the transport of nearly all particles, including neutrons, protons, heavy ions, photons, and electrons, over wide energy ranges using various nuclear reaction models and data libraries. It is written in the Fortran language and can be executed on almost all computers. All components of PHITS such as its source, executable and data-library files are assembled in one package and then distributed to many countries via the Research Organization for Information Science and Technology, the Data Bank of the Organization for Economic Co-operation and Development's Nuclear Energy Agency, and the Radiation Safety Information Computational Center. More than 1,000 researchers have been registered as PHITS users, and they apply the code to various research and development fields such as nuclear technology, accelerator design, medical physics, and cosmic-ray research. This paper briefly summarizes the physics models implemented in PHITS, and introduces some important functions useful for specific applications, such as an event generator mode and beam transport functions.
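
    The Monte Carlo principle behind such transport codes fits in a few lines: sample exponential free paths and tally outcomes. The sketch below estimates uncollided transmission through a slab for one particle type and energy, with an invented cross-section; a real code such as PHITS tracks many particle types, reactions, and geometries.

        # A minimal sketch of the Monte Carlo idea behind transport codes:
        # sample exponential free paths, tally uncollided transmission through
        # a slab. One species, one energy, absorption only; mu is invented.
        import math
        import random

        random.seed(1)
        mu = 0.2            # total macroscopic cross-section, 1/cm
        thickness = 10.0    # slab thickness, cm
        n = 100_000

        transmitted = sum(
            1 for _ in range(n) if random.expovariate(mu) > thickness
        )
        print(transmitted / n, math.exp(-mu * thickness))  # MC estimate vs analytic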

  20. Base heating methodology improvements, volume 1

    NASA Technical Reports Server (NTRS)

    Bender, Robert L.; Reardon, John E.; Somers, Richard E.; Fulton, Michael S.; Smith, Sheldon D.; Pergament, Harold

    1992-01-01

    This document is the final report for NASA MSFC Contract NAS8-38141. The contracted effort had the broad objective of improving the launch vehicle ascent base heating methodology to improve and simplify the determination of that environment for Advanced Launch System (ALS) concepts. It was pursued as an Advanced Development Plan (ADP) for the Joint DoD/NASA ALS program office with project management assigned to NASA/MSFC. The original study was to be completed in 26 months beginning Sep. 1989. Because of several program changes and emphasis on evolving launch vehicle concepts, the period of performance was extended to the current completion date of Nov. 1992. A computer code incorporating the methodology improvements into a quick prediction tool was developed and is operational for basic configuration and propulsion concepts. The code and its user's guide are also provided as part of the contract documentation. Background information describing the specific objectives, limitations, and goals of the contract is summarized. A brief chronology of the ALS/NLS program history is also presented to provide the reader with an overview of the many variables influencing the development of the code over the past three years.
