Sample records for genon concept coding

  1. The gene and the genon concept: a functional and information-theoretic analysis

    PubMed Central

    Scherrer, Klaus; Jost, Jürgen

    2007-01-01

'Gene' has become a vague and ill-defined concept. To set the stage for mathematical analysis of gene storage and expression, we return to the original concept of the gene as a function encoded in the genome, the basis of genetic analysis, that is, a polypeptide or other functional product. The additional information needed to express a gene is contained within each mRNA as an ensemble of signals, added to or superimposed onto the coding sequence. To designate this programme, we introduce the term 'genon'. Individual genons are contained in the pre-mRNA, forming a pre-genon. A genomic domain contains a proto-genon, with the signals of transcription activation in addition to the pre-genon in the transcripts. Some domains contain several mRNAs, and hence genons, to be singled out by RNA processing and differential splicing. The programme in the genon in cis is implemented by corresponding factors of protein or RNA nature contained in the transgenon of the cell or organism. The gene, the cis programme contained in the individual domain and transcript, and the trans programme of factors can be analysed by information theory. PMID:17353929

  2. Gene and genon concept: coding versus regulation

    PubMed Central

    2007-01-01

We analyse here the definition of the gene in order to distinguish, on the basis of modern insights into molecular biology, what the gene is coding for, namely a specific polypeptide, and how its expression is realized and controlled. Before the coding role of the DNA was discovered, a gene was identified with a specific phenotypic trait, from Mendel through Morgan up to Benzer. Subsequently, however, molecular biologists ventured to define a gene at the level of the DNA sequence in terms of coding. As is becoming ever more evident, the relations between information stored at DNA level and functional products are very intricate, and the regulatory aspects are as important and essential as the information coding for products. This approach thus led to a conceptual hybrid that confused coding, regulation and functional aspects. In this essay, we develop a definition of the gene that once again starts from the functional aspect. A cellular function can be represented by a polypeptide or an RNA. In the case of the polypeptide, its biochemical identity is determined by the mRNA prior to translation, and that is where we locate the gene. The steps from specific, but possibly separated, sequence fragments at DNA level to that final mRNA can then be analysed in terms of regulation. For that purpose, we coin the new term “genon”. In that manner, we can clearly separate product and regulative information while keeping the fundamental relation between coding and function, without the need to introduce a conceptual hybrid. In mRNA, the program regulating the expression of a gene is superimposed onto and added to the coding sequence in cis - we call it the genon. The complementary external control of a given mRNA by trans-acting factors is incorporated in its transgenon. A consequence of this definition is that, in eukaryotes, the gene is, in most cases, not yet present at DNA level. Rather, it is assembled by RNA processing, including differential splicing, from various pieces, as steered by the genon. It emerges finally as an uninterrupted nucleic acid sequence at mRNA level just prior to translation, in faithful correspondence with the amino acid sequence to be produced as a polypeptide. After translation, the genon has fulfilled its role and expires. The distinction between the protein coding information as materialised in the final polypeptide and the processing information represented by the genon allows us to set up a new information-theoretic scheme. The standard sequence information determined by the genetic code expresses the relation between coding sequence and product. Backward analysis asks from which coding region in the DNA a given polypeptide originates. The (more interesting) forward analysis asks in how many polypeptides of how many different types a given DNA segment is expressed. This concerns the control of the expression process, for which we have introduced the genon concept. Thus, the information-theoretic analysis can capture the complementary aspects of coding and regulation, of gene and genon. PMID:18087760

  3. NPDES Permit for NRG Energy (Formerly GenOn Potomac River Generating Station)

    EPA Pesticide Factsheets

    Under National Pollutant Discharge Elimination System permit number DC0022004, NRG Energy (Formerly GenOn Potomac River Generating Station) is authorized to discharge from a facility into receiving waters named Potomac River.

  4. Geometric Defects in Quantum Hall States

    NASA Astrophysics Data System (ADS)

    Gromov, Andrey

I will describe a geometric analogue of Laughlin quasiholes in fractional quantum Hall (FQH) states. These "quasiholes" are generated by an insertion of quantized fluxes of curvature - which can be modeled by branch points of a certain Riemann surface - and, consequently, are related to genons. Unlike quasiholes, the genons are not excitations, but extrinsic defects. Fusion of genons describes the response of an FQH state to a process that changes the (effective) topology of the physical space. These defects are abelian for IQH states and non-abelian for FQH states. I will explain how to calculate the electric charge, geometric spin and adiabatic mutual statistics of these defects. Leo Kadanoff Fellowship.

  5. 77 FR 56839 - GenOn Marsh Landing, LLC; Supplemental Notice That Initial Market-Based Rate Filing Includes...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-09-14

    ... DEPARTMENT OF ENERGY Federal Energy Regulatory Commission [Docket No. ER12-2545-000] GenOn Marsh Landing, LLC; Supplemental Notice That Initial Market-Based Rate Filing Includes Request for Blanket... Landing, LLC's application for market-based rate authority, with an accompanying rate schedule, noting...

  6. 76 FR 82293 - Combined Notice of Filings #1

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-12-30

    ..., GenOn Wholesale Generation, LP, RRI Energy Services, LLC. Description: Updated Market Power Analysis...: Windpower Partners 1993, LLC. Description: Windpower Partners 1993, LLC Notice of Succession and Revisions...

  7. 77 FR 26438 - Approval and Promulgation of Air Quality Implementation Plans; Maryland; Approval of 2011 Consent...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-05-04

    ...EPA is taking direct final action to approve State Implementation Plan (SIP) revisions submitted by the Maryland Department of the Environment (MDE) pertaining to the GenOn Chalk Point Generating Station (Chalk Point). These revisions approve specific provisions of a 2011 Consent Decree between MDE and GenOn to reduce particulate matter (PM), sulfur oxides (SOX), and nitrogen oxides (NOX) from Chalk Point. These revisions also remove the 1978 and 1979 Consent Orders for the Chalk Point generating station from the Maryland SIP as those Consent Orders have been superseded by the 2011 Consent Decree. EPA is approving these SIP revisions because the reductions of PM, SOX, and NOX are beneficial for reducing ambient levels of the PM, sulfur dioxide (SO2), nitrogen dioxide (NO2) and ozone. They also reduce visible emissions from Chalk Point. This action is being taken under the Clean Air Act (CAA).

  8. 77 FR 26474 - Approval and Promulgation of Air Quality Implementation Plans; Maryland; Approval of 2011 Consent...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-05-04

    ...EPA proposes to approve State Implementation Plan (SIP) revisions submitted by the Maryland Department of the Environment (MDE). These revisions approve specific provisions of a 2011 Consent Decree between MDE and GenOn to reduce particulate matter (PM), sulfur oxides (SOX), and nitrogen oxides (NOX) from the GenOn Chalk Point generating station (Chalk Point). These revisions also remove the 1978 and 1979 Consent Orders for the Chalk Point generating station from the Maryland SIP as those Consent Orders have been superseded by the 2011 Consent Decree. In the Final Rules section of this Federal Register, EPA is approving the State's SIP submittal as a direct final rule without prior proposal because the Agency views this as a noncontroversial submittal and anticipates no adverse comments. A detailed rationale for the approval is set forth in the direct final rule. If no adverse comments are received in response to this action, no further activity is contemplated. If EPA receives adverse comments, the direct final rule will be withdrawn and all public comments received will be addressed in a subsequent final rule based on this proposed rule. EPA will not institute a second comment period. Any parties interested in commenting on this action should do so at this time.

  9. Petition for the Administrator to Object to Title V Permit for GenOn REMA, LLC's Shawville Generation Station

    EPA Pesticide Factsheets

    This document may be of assistance in applying the Title V air operating permit regulations. This document is part of the Title V Petition Database available at www2.epa.gov/title-v-operating-permits/title-v-petition-database. Some documents in the database are a scanned or retyped version of a paper photocopy of the original. Although we have taken considerable effort to quality assure the documents, some may contain typographical errors. Contact the office that issued the document if you need a copy of the original.

  10. Variation of SNOMED CT coding of clinical research concepts among coding experts.

    PubMed

    Andrews, James E; Richesson, Rachel L; Krischer, Jeffrey

    2007-01-01

To compare consistency of coding among professional SNOMED CT coders representing three commercial providers of coding services when coding clinical research concepts with SNOMED CT. A sample of clinical research questions from case report forms (CRFs) generated by the NIH-funded Rare Disease Clinical Research Network (RDCRN) was sent to three coding companies with instructions to code the core concepts using SNOMED CT. The sample consisted of 319 question/answer pairs from 15 separate studies. The companies were asked to select SNOMED CT concepts (in any form, including post-coordinated) that capture the core concept(s) reflected in the question. Also, they were asked to state their level of certainty, as well as how precise they felt their coding was. Basic frequencies were calculated to determine raw level agreement among the companies and other descriptive information. Krippendorff's alpha was used to determine a statistical measure of agreement among the coding companies for several measures (semantic, certainty, and precision). No significant level of agreement among the experts was found. There is little semantic agreement in coding of clinical research data items across coders from 3 professional coding services, even using a very liberal definition of agreement.
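
    The agreement statistic used in this study can be sketched in a few lines. Below is a minimal implementation of Krippendorff's alpha for nominal data with complete codings (no missing values); the item codings are hypothetical toy data, not the RDCRN sample.

```python
from collections import Counter
from itertools import permutations

def krippendorff_alpha_nominal(units):
    """Krippendorff's alpha for nominal data, complete codings.

    units: list of lists; each inner list holds the codes that the
    coders assigned to one item. Assumes at least two distinct codes
    occur overall (otherwise expected disagreement is zero)."""
    o = Counter()    # coincidence matrix o[(code_a, code_b)]
    n_c = Counter()  # marginal counts per code
    n = 0            # total number of pairable codings
    for codes in units:
        m = len(codes)
        if m < 2:
            continue  # a unit coded once is unpairable; skip it
        n += m
        for c in codes:
            n_c[c] += 1
        # each ordered pair within a unit contributes 1/(m-1)
        for a, b in permutations(codes, 2):
            o[(a, b)] += 1.0 / (m - 1)
    observed = sum(v for (a, b), v in o.items() if a != b)
    expected = (n * n - sum(v * v for v in n_c.values())) / (n - 1)
    return 1.0 - observed / expected
```

With perfect agreement the function returns 1.0; values near or below 0 indicate agreement no better than chance, which is the situation the study reports.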

  11. Comparison of procedure coding systems for level 1 and 2 hospitals in South Africa.

    PubMed

    Montewa, Lebogang; Hanmer, Lyn; Reagon, Gavin

    2013-01-01

    The ability of three procedure coding systems to reflect the procedure concepts extracted from patient records from six hospitals was compared, in order to inform decision making about a procedure coding standard for South Africa. A convenience sample of 126 procedure concepts was extracted from patient records at three level 1 hospitals and three level 2 hospitals. Each procedure concept was coded using ICPC-2, ICD-9-CM, and CCSA-2001. The extent to which each code assigned actually reflected the procedure concept was evaluated (between 'no match' and 'complete match'). For the study sample, CCSA-2001 was found to reflect the procedure concepts most completely, followed by ICD-9-CM and then ICPC-2. In practice, decision making about procedure coding standards would depend on multiple factors in addition to coding accuracy.

  12. Coding and decoding in a point-to-point communication using the polarization of the light beam.

    PubMed

    Kavehvash, Z; Massoumian, F

    2008-05-10

A new technique for coding and decoding of optical signals through the use of polarization is described. In this technique the concept of coding is translated to polarization. In other words, coding is done in such a way that each code represents a unique polarization. This is done by implementing a binary pattern on a spatial light modulator in such a way that the reflected light has the required polarization. Decoding is done by detection of the received beam's polarization. By linking the concept of coding to polarization, we can use each of these concepts in measuring the other, attaining some gains. In this paper the construction of a simple point-to-point communication system, where coding and decoding are done through polarization, is discussed.

  13. Assigning clinical codes with data-driven concept representation on Dutch clinical free text.

    PubMed

    Scheurwegs, Elyne; Luyckx, Kim; Luyten, Léon; Goethals, Bart; Daelemans, Walter

    2017-05-01

Clinical codes are used for public reporting purposes, are fundamental to determining public financing for hospitals, and form the basis for reimbursement claims to insurance providers. They are assigned to a patient stay to reflect the diagnosis and performed procedures during that stay. This paper aims to enrich algorithms for automated clinical coding by taking a data-driven approach and by using unsupervised and semi-supervised techniques for the extraction of multi-word expressions that convey a generalisable medical meaning (referred to as concepts). Several methods for extracting concepts from text are compared, two of which are constructed from a large unannotated corpus of clinical free text. A distributional semantic model (i.e. the word2vec skip-gram model) is used to generalize over concepts and retrieve relations between them. These methods are validated on three sets of patient stay data, in the disease areas of urology, cardiology, and gastroenterology. The datasets are in Dutch, which introduces a limitation on available concept definitions from expert-based ontologies (e.g. UMLS). The results show that when expert-based knowledge in ontologies is unavailable, concepts derived from raw clinical texts are a reliable alternative. Both concepts derived from raw clinical texts and concepts derived from expert-created dictionaries outperform a bag-of-words approach in clinical code assignment. Adding features based on tokens that appear in a semantically similar context has a positive influence for predicting diagnostic codes. Furthermore, the experiments indicate that a distributional semantics model can find relations between semantically related concepts in texts but also introduces erroneous and redundant relations, which can undermine clinical coding performance.
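
    As a toy illustration of unsupervised multi-word-expression extraction from unannotated text, the sketch below scores adjacent word pairs by pointwise mutual information (PMI); high-PMI bigrams co-occur more often than chance and are candidate concepts. This is a deliberately simple stand-in, not the paper's pipeline (which builds on a word2vec skip-gram model), and the tokens are hypothetical.

```python
import math
from collections import Counter

def pmi_bigrams(tokens, min_count=2):
    """Score adjacent word pairs by pointwise mutual information.

    Returns a dict mapping (word_a, word_b) to its PMI score; pairs
    seen fewer than min_count times are dropped, since PMI is
    unstable at low counts."""
    unigrams = Counter(tokens)
    bigrams = Counter(zip(tokens, tokens[1:]))
    n = len(tokens)
    scores = {}
    for (a, b), count in bigrams.items():
        if count < min_count:
            continue
        p_ab = count / (n - 1)                  # bigram probability
        p_a, p_b = unigrams[a] / n, unigrams[b] / n
        scores[(a, b)] = math.log(p_ab / (p_a * p_b))
    return scores
```

In practice such a scorer would run over the large unannotated clinical corpus, and the surviving bigrams would feed the concept vocabulary.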

  14. Some Practical Universal Noiseless Coding Techniques

    NASA Technical Reports Server (NTRS)

    Rice, Robert F.

    1994-01-01

Report discusses noiseless data-compression-coding algorithms, performance characteristics, and practical considerations in implementation of algorithms in coding modules composed of very-large-scale integrated circuits. Report also has value as tutorial document on data-compression-coding concepts. Coding techniques and concepts in question "universal" in sense that, in principle, applicable to streams of data from variety of sources. However, discussion oriented toward compression of high-rate data generated by spaceborne sensors for lower-rate transmission back to earth.
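
    The core building block of the Rice/Golomb family of universal noiseless codes surveyed in the report can be sketched briefly: a nonnegative integer is split into a quotient, sent in unary, and a k-bit binary remainder. This is a minimal illustration of that split, not the flight-module algorithm the report describes.

```python
def rice_encode(n, k):
    """Rice code of nonnegative integer n with parameter k:
    unary-coded quotient (q ones and a terminating zero)
    followed by the k-bit binary remainder."""
    q, r = n >> k, n & ((1 << k) - 1)
    bits = "1" * q + "0"
    if k:
        bits += format(r, "b").zfill(k)
    return bits

def rice_decode(bits, k):
    """Decode one Rice codeword from a bit string.
    Returns (value, number_of_bits_consumed)."""
    q = 0
    i = 0
    while bits[i] == "1":   # count the unary quotient
        q += 1
        i += 1
    i += 1                  # skip the terminating zero
    r = int(bits[i:i + k], 2) if k else 0
    return (q << k) | r, i + k
```

Small k suits sharply peaked value distributions; larger k suits flatter ones, which is what makes the scheme adaptable to data from a variety of sources.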

  15. Automated UMLS-Based Comparison of Medical Forms

    PubMed Central

    Dugas, Martin; Fritz, Fleur; Krumm, Rainer; Breil, Bernhard

    2013-01-01

Medical forms are very heterogeneous: on a European scale there are thousands of data items in several hundred different systems. To enable data exchange for clinical care and research purposes there is a need to develop interoperable documentation systems with harmonized forms for data capture. A prerequisite in this harmonization process is comparison of forms. So far – to our knowledge – an automated method for comparison of medical forms is not available. A form contains a list of data items with corresponding medical concepts. An automatic comparison needs data types, item names and, especially, items with unique concept codes from medical terminologies. The scope of the proposed method is a comparison of these items by comparing their concept codes (coded in UMLS). Each data item is represented by item name, concept code and value domain. Two items are called identical if item name, concept code and value domain are the same. Two items are called matching if only concept code and value domain are the same. Two items are called similar if their concept codes are the same, but the value domains are different. Based on these definitions, an open-source implementation for automated comparison of medical forms in ODM format with UMLS-based semantic annotations was developed. It is available as package compareODM from http://cran.r-project.org. To evaluate this method, it was applied to a set of 7 real medical forms with 285 data items from a large public ODM repository with forms for different medical purposes (research, quality management, routine care). Comparison results were visualized with grid images and dendrograms. Automated comparison of semantically annotated medical forms is feasible. Dendrograms allow a view on clustered similar forms. The approach is scalable for a large set of real medical forms. PMID:23861827
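
    The identical/matching/similar definitions above translate directly into code. The sketch below is a minimal rendering of those definitions, not the compareODM implementation; the item names, concept codes and value domains in the test data are hypothetical.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class FormItem:
    name: str          # item name as shown on the form
    concept: str       # concept code, e.g. a UMLS CUI
    value_domain: str  # data type / permitted values

def compare_items(a: FormItem, b: FormItem) -> str:
    """Classify an item pair per the abstract's definitions."""
    if a.concept != b.concept:
        return "different"
    if a.value_domain != b.value_domain:
        return "similar"    # same concept code, value domains differ
    if a.name != b.name:
        return "matching"   # same concept code and value domain only
    return "identical"      # name, concept code and value domain agree
```

Running this classifier over the cross-product of two forms' item lists yields the pairwise comparison that the grid images and dendrograms then visualize.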

  16. Continuation of research into language concepts for the mission support environment: Source code

    NASA Technical Reports Server (NTRS)

    Barton, Timothy J.; Ratner, Jeremiah M.

    1991-01-01

Research into language concepts for the Mission Control Center is presented, together with computer source code. The file contains the routines that allow source code files to be created and compiled. The build process assumes that all elements and the COMP exist in the current directory, and places as much code generation as possible on the preprocessor. A summary is given of the source files as used and/or manipulated by the build routine.

  17. Participation as an outcome measure in psychosocial oncology: content of cancer-specific health-related quality of life instruments.

    PubMed

    van der Mei, Sijrike F; Dijkers, Marcel P J M; Heerkens, Yvonne F

    2011-12-01

    To examine to what extent the concept and the domains of participation as defined in the International Classification of Functioning, Disability and Health (ICF) are represented in general cancer-specific health-related quality of life (HRQOL) instruments. Using the ICF linking rules, two coders independently extracted the meaningful concepts of ten instruments and linked these to ICF codes. The proportion of concepts that could be linked to ICF codes ranged from 68 to 95%. Although all instruments contained concepts linked to Participation (Chapters d7-d9 of the classification of 'Activities and Participation'), the instruments covered only a small part of all available ICF codes. The proportion of ICF codes in the instruments that were participation related ranged from 3 to 35%. 'Major life areas' (d8) was the most frequently used Participation Chapter, with d850 'remunerative employment' as the most used ICF code. The number of participation-related ICF codes covered in the instruments is limited. General cancer-specific HRQOL instruments only assess social life of cancer patients to a limited degree. This study's information on the content of these instruments may guide researchers in selecting the appropriate instrument for a specific research purpose.

  18. A Strategy for Reusing the Data of Electronic Medical Record Systems for Clinical Research.

    PubMed

    Matsumura, Yasushi; Hattori, Atsushi; Manabe, Shiro; Tsuda, Tsutomu; Takeda, Toshihiro; Okada, Katsuki; Murata, Taizo; Mihara, Naoki

    2016-01-01

There is a great need to reuse data stored in electronic medical records (EMR) databases for clinical research. We previously reported the development of a system in which progress notes and case report forms (CRFs) were simultaneously recorded using a template in the EMR in order to exclude redundant data entry. To make the data collection process more efficient, we are developing a system in which the data originally stored in the EMR database can be populated within a frame in a template. We developed interface plugin modules that retrieve data from the databases of other EMR applications. A universal keyword written in a template master is converted to a local code using a data conversion table, then the objective data is retrieved from the corresponding database. The template element data, which are entered by a template, are stored in the template element database. To retrieve the data entered by other templates, the objective data is designated by the template element code with the template code, or by the concept code if it is written for the element. When the application systems in the EMR generate documents, they also generate a PDF file and a corresponding document profile XML, which includes important data, and send them to the document archive server and the data sharing server, respectively. In the data sharing server, the data are represented as items, each with an item code, a document class code, and a value. By linking a concept code to an item identifier, the objective data can be retrieved by designating a concept code. We employed a flexible strategy in which a unique identifier for a hospital is initially attached to all of the data that the hospital generates. The identifier is secondarily linked with concept codes. The data that are not linked with a concept code can also be retrieved using the unique identifier of the hospital. This strategy makes it possible to reuse any of a hospital's data.
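
    The universal-keyword-to-local-code conversion step described above amounts to a table lookup with a fallback to the hospital's own identifiers. A minimal sketch; the keywords, hospital identifiers and local codes below are hypothetical, not taken from the paper.

```python
# Hypothetical conversion table: universal keyword -> per-hospital local code.
CONVERSION = {
    "body_weight": {"hospital_A": "OBS-0101", "hospital_B": "W12"},
    "hba1c":       {"hospital_A": "LAB-0042", "hospital_B": "L173"},
}

def resolve_local_code(universal_keyword, hospital_id):
    """Convert a template's universal keyword to a hospital's local code,
    so the objective data can be fetched from that hospital's database.
    Returns None when no link exists yet, in which case the caller falls
    back to the hospital-unique identifier."""
    try:
        return CONVERSION[universal_keyword][hospital_id]
    except KeyError:
        return None
```

Keeping the table external to the templates is what lets new hospitals be linked in secondarily, as the flexible strategy in the abstract describes.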

  19. Coding of procedures documented by general practitioners in Swedish primary care-an explorative study using two procedure coding systems

    PubMed Central

    2012-01-01

    Background Procedures documented by general practitioners in primary care have not been studied in relation to procedure coding systems. We aimed to describe procedures documented by Swedish general practitioners in electronic patient records and to compare them to the Swedish Classification of Health Interventions (KVÅ) and SNOMED CT. Methods Procedures in 200 record entries were identified, coded, assessed in relation to two procedure coding systems and analysed. Results 417 procedures found in the 200 electronic patient record entries were coded with 36 different Classification of Health Interventions categories and 148 different SNOMED CT concepts. 22.8% of the procedures could not be coded with any Classification of Health Interventions category and 4.3% could not be coded with any SNOMED CT concept. 206 procedure-concept/category pairs were assessed as a complete match in SNOMED CT compared to 10 in the Classification of Health Interventions. Conclusions Procedures documented by general practitioners were present in nearly all electronic patient record entries. Almost all procedures could be coded using SNOMED CT. Classification of Health Interventions covered the procedures to a lesser extent and with a much lower degree of concordance. SNOMED CT is a more flexible terminology system that can be used for different purposes for procedure coding in primary care. PMID:22230095

  20. Reliability of SNOMED-CT Coding by Three Physicians using Two Terminology Browsers

    PubMed Central

    Chiang, Michael F.; Hwang, John C.; Yu, Alexander C.; Casper, Daniel S.; Cimino, James J.; Starren, Justin

    2006-01-01

    SNOMED-CT has been promoted as a reference terminology for electronic health record (EHR) systems. Many important EHR functions are based on the assumption that medical concepts will be coded consistently by different users. This study is designed to measure agreement among three physicians using two SNOMED-CT terminology browsers to encode 242 concepts from five ophthalmology case presentations in a publicly-available clinical journal. Inter-coder reliability, based on exact coding match by each physician, was 44% using one browser and 53% using the other. Intra-coder reliability testing revealed that a different SNOMED-CT code was obtained up to 55% of the time when the two browsers were used by one user to encode the same concept. These results suggest that the reliability of SNOMED-CT coding is imperfect, and may be a function of browsing methodology. A combination of physician training, terminology refinement, and browser improvement may help increase the reproducibility of SNOMED-CT coding. PMID:17238317

  1. An Experiment in Scientific Code Semantic Analysis

    NASA Technical Reports Server (NTRS)

    Stewart, Mark E. M.

    1998-01-01

This paper concerns a procedure that analyzes aspects of the meaning or semantics of scientific and engineering code. This procedure involves taking a user's existing code, adding semantic declarations for some primitive variables, and parsing this annotated code using multiple, distributed expert parsers. These semantic parsers are designed to recognize formulae in different disciplines, including physical and mathematical formulae and geometrical position in a numerical scheme. The parsers will automatically recognize and document some static semantic concepts and locate some program semantic errors. Results are shown for a subroutine test case and a collection of combustion code routines. This ability to locate some semantic errors and document semantic concepts in scientific and engineering code should reduce the time, risk, and effort of developing and using these codes.

  2. Effects of Selected Filmic Coding Elements of TV on the Development of the Euclidean Concepts of Horizontality and Verticality in Adolescents.

    ERIC Educational Resources Information Center

    Lynch, Beth Eloise

    This study was conducted to determine whether the filmic coding elements of split screen, slow motion, generated line cues, the zoom of a camera, and rotation could aid in the development of the Euclidean space concepts of horizontality and verticality, and to explore presence and development of spatial skills involving these two concepts in…

  3. Seals Flow Code Development

    NASA Technical Reports Server (NTRS)

    1991-01-01

    In recognition of a deficiency in the current modeling capability for seals, an effort was established by NASA to develop verified computational fluid dynamic concepts, codes, and analyses for seals. The objectives were to develop advanced concepts for the design and analysis of seals, to effectively disseminate the information to potential users by way of annual workshops, and to provide experimental verification for the models and codes under a wide range of operating conditions.

  4. An all-digital receiver for satellite audio broadcasting signals using trellis coded quasi-orthogonal code-division multiplexing

    NASA Astrophysics Data System (ADS)

    Braun, Walter; Eglin, Peter; Abello, Ricard

    1993-02-01

Spread spectrum code division multiplex is an attractive scheme for the transmission of multiple signals over a satellite transponder. By using orthogonal or quasi-orthogonal spreading codes, the interference between the users can be virtually eliminated. However, the acquisition and tracking of the spreading code phase cannot take advantage of the code orthogonality, since sequential acquisition and delay-locked loop tracking depend on correlation with code phases other than the optimal despreading phase. Hence, synchronization is a critical issue in such a system. Demonstration hardware for the verification of the orthogonal CDM synchronization and data transmission concept is being designed and implemented. The system concept, the synchronization scheme, and the implementation are described. The performance of the system is discussed based on computer simulations.
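
    The orthogonality that makes this multiplexing work can be illustrated with Walsh-Hadamard spreading codes built by the Sylvester construction: distinct codes correlate to exactly zero at the correct code phase, which is also why acquisition and tracking (which correlate at other phases) cannot rely on it. A minimal sketch, not the demonstration hardware's code set:

```python
def walsh_codes(order):
    """Generate 2**order Walsh-Hadamard spreading codes (chips +/-1)
    via the Sylvester recursion H_2n = [[H_n, H_n], [H_n, -H_n]]."""
    h = [[1]]
    for _ in range(order):
        h = ([row + row for row in h] +
             [row + [-c for c in row] for row in h])
    return h

def correlate(a, b):
    """Full-length correlation of two codes at zero phase offset."""
    return sum(x * y for x, y in zip(a, b))
```

Correlating a received chip sequence against each code at the right phase isolates one user's signal; at any other phase the zero-cross-correlation guarantee disappears, hence the synchronization problem the abstract highlights.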

  5. A Concept for Run-Time Support of the Chapel Language

    NASA Technical Reports Server (NTRS)

    James, Mark

    2006-01-01

A document presents a concept for run-time implementation of other concepts embodied in the Chapel programming language. (Now undergoing development, Chapel is intended to become a standard language for parallel computing that would surpass older such languages both in computational performance and in the efficiency with which pre-existing code can be reused and new code written.) The aforementioned other concepts are those of distributions, domains, allocations, and access, as defined in a separate document called "A Semantic Framework for Domains and Distributions in Chapel" and linked to a language specification defined in another separate document called "Chapel Specification 0.3." The concept presented in the instant report is the recognition that a data domain invented for Chapel offers a novel approach to distributing and processing data in a massively parallel environment. The concept is offered as a starting point for development of working descriptions of functions and data structures that would be necessary to implement interfaces to a compiler for transforming the aforementioned other concepts from their representations in Chapel source code to their run-time implementations.

  6. Inclusion of pressure and flow in a new 3D MHD equilibrium code

    NASA Astrophysics Data System (ADS)

    Raburn, Daniel; Fukuyama, Atsushi

    2012-10-01

Flow and nonsymmetric effects can play a large role in plasma equilibria and energy confinement. A concept for such a 3D equilibrium code was developed and presented in 2011. The code is called the Kyoto ITerative Equilibrium Solver (KITES) [1], and the concept is based largely on the PIES code [2]. More recently, the work-in-progress KITES code was used to calculate force-free equilibria. Here, progress and results on the inclusion of pressure and flow in the code are presented. [1] Daniel Raburn and Atsushi Fukuyama, Plasma and Fusion Research: Regular Articles, 7:240381 (2012). [2] H. S. Greenside, A. H. Reiman, and A. Salas, J. Comput. Phys., 81(1):102-136 (1989).

  7. The effect of multiple internal representations on context-rich instruction

    NASA Astrophysics Data System (ADS)

    Lasry, Nathaniel; Aulls, Mark W.

    2007-11-01

We discuss n-coding, a theoretical model of multiple internal mental representations. The n-coding construct is developed from a review of cognitive and imaging data that demonstrates the independence of information processed along different modalities, such as verbal, visual, kinesthetic, logico-mathematic, and social modalities. A study testing the effectiveness of the n-coding construct in classrooms is presented. Four sections differing in the level of n-coding opportunities were compared. Besides a traditional-instruction section used as a control group, each of the remaining three sections was given context-rich problems, which differed in the level of n-coding opportunities designed into their laboratory environment. To measure the effectiveness of the construct, problem-solving skills were assessed as conceptual learning using the force concept inventory. We also developed several new measures that take students' confidence in concepts into account. Our results show that the n-coding construct is useful in designing context-rich environments and can be used to increase learning gains in problem solving, conceptual knowledge, and concept confidence. Specifically, when using props in designing context-rich problems, we find n-coding to be a useful construct in guiding which additional dimensions need to be attended to.

  8. Code Mixing and Modernization across Cultures.

    ERIC Educational Resources Information Center

    Kamwangamalu, Nkonko M.

    A review of recent studies addressed the functional uses of code mixing across cultures. Expressions of code mixing (CM) are not random; in fact, a number of functions of code mixing can easily be delineated, for example, the concept of "modernization." "Modernization" is viewed with respect to how bilingual code mixers perceive…

  9. Formulation of Policy for Cyber Crime in Criminal Law Revision Concept of Bill Book of Criminal Law (A New Penal Code)

    NASA Astrophysics Data System (ADS)

    Soponyono, Eko; Deva Bernadhi, Brav

    2017-04-01

    Development of national legal systems is aimed at establishing public welfare and the protection of the public. Many attempts have been carried out to renew material criminal law, and those efforts have resulted in the formulation of a draft of a new Criminal Code. The basic ideas in drafting these rules and regulations, grounded in the values of the ideology of Pancasila, are balance among the various norms and rules in society. The design concept of the new Criminal Code is anticipatory and proactive in formulating provisions on crime in cyberspace and crime involving information and electronic transactions. The issues addressed in this paper are whether the policy on the formulation of cyber crime is embodied in the provisions of current legislation, and how cyber crime is formulated in the concept of the new Criminal Code bill.

  10. Wall interference assessment and corrections

    NASA Technical Reports Server (NTRS)

    Newman, P. A.; Kemp, W. B., Jr.; Garriz, J. A.

    1989-01-01

    Wind tunnel wall interference assessment and correction (WIAC) concepts, applications, and typical results are discussed in terms of several nonlinear transonic codes and one panel method code developed for and being implemented at NASA-Langley. Contrasts between 2-D and 3-D transonic testing factors which affect WIAC procedures are illustrated using airfoil data from the 0.3 m Transonic Cryogenic Tunnel and Pathfinder 1 data from the National Transonic Facility. Initial results from the 3-D WIAC codes are encouraging; research on and implementation of WIAC concepts continue.

  11. Using Coding Apps to Support Literacy Instruction and Develop Coding Literacy

    ERIC Educational Resources Information Center

    Hutchison, Amy; Nadolny, Larysa; Estapa, Anne

    2016-01-01

    In this article the authors present the concept of Coding Literacy and describe the ways in which coding apps can support the development of Coding Literacy and disciplinary and digital literacy skills. Through detailed examples, we describe how coding apps can be integrated into literacy instruction to support learning of the Common Core English…

  12. A Semantic Analysis Method for Scientific and Engineering Code

    NASA Technical Reports Server (NTRS)

    Stewart, Mark E. M.

    1998-01-01

    This paper develops a procedure to statically analyze aspects of the meaning or semantics of scientific and engineering code. The analysis involves adding semantic declarations to a user's code and parsing this semantic knowledge with the original code using multiple expert parsers. These semantic parsers are designed to recognize formulae in different disciplines including physical and mathematical formulae and geometrical position in a numerical scheme. In practice, a user would submit code with semantic declarations of primitive variables to the analysis procedure, and its semantic parsers would automatically recognize and document some static, semantic concepts and locate some program semantic errors. A prototype implementation of this analysis procedure is demonstrated. Further, the relationship between the fundamental algebraic manipulations of equations and the parsing of expressions is explained. This ability to locate some semantic errors and document semantic concepts in scientific and engineering code should reduce the time, risk, and effort of developing and using these codes.
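
    The kind of static semantic check described above can be illustrated with a small sketch. This is not the paper's expert-parser implementation; the declarations, variable names, and the reduction of units to exponent vectors over (m, kg, s) are hypothetical simplifications:

```python
# Minimal sketch (not the paper's parsers): static unit checking from
# semantic declarations of primitive variables. Units are exponent vectors
# over (m, kg, s); assignments are checked for dimensional consistency.

# Hypothetical declarations: variable -> unit exponents (m, kg, s)
DECLS = {
    "velocity": (1, 0, -1),   # m/s
    "time":     (0, 0, 1),    # s
    "distance": (1, 0, 0),    # m
}

def mul(u, v):
    """Unit of a product: exponents add."""
    return tuple(a + b for a, b in zip(u, v))

def check_assign(target, operands):
    """True if `target = product(operands)` is dimensionally consistent."""
    result = (0, 0, 0)
    for name in operands:
        result = mul(result, DECLS[name])
    return DECLS[target] == result

print(check_assign("distance", ["velocity", "time"]))      # True: m/s * s = m
print(check_assign("distance", ["velocity", "velocity"]))  # False: m^2/s^2
```

    A real semantic parser would recover such declarations from annotated source and flag the second assignment as a located semantic error.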

  13. A Verification-Driven Approach to Traceability and Documentation for Auto-Generated Mathematical Software

    NASA Technical Reports Server (NTRS)

    Denney, Ewen W.; Fischer, Bernd

    2009-01-01

    Model-based development and automated code generation are increasingly used for production code in safety-critical applications, but since code generators are typically not qualified, the generated code must still be fully tested, reviewed, and certified. This is particularly arduous for mathematical and control engineering software which requires reviewers to trace subtle details of textbook formulas and algorithms to the code, and to match requirements (e.g., physical units or coordinate frames) not represented explicitly in models or code. Both tasks are complicated by the often opaque nature of auto-generated code. We address these problems by developing a verification-driven approach to traceability and documentation. We apply the AUTOCERT verification system to identify and then verify mathematical concepts in the code, based on a mathematical domain theory, and then use these verified traceability links between concepts, code, and verification conditions to construct a natural language report that provides a high-level structured argument explaining why and how the code uses the assumptions and complies with the requirements. We have applied our approach to generate review documents for several sub-systems of NASA s Project Constellation.

  14. The MIMIC Code Repository: enabling reproducibility in critical care research.

    PubMed

    Johnson, Alistair Ew; Stone, David J; Celi, Leo A; Pollard, Tom J

    2018-01-01

    Lack of reproducibility in medical studies is a barrier to the generation of a robust knowledge base to support clinical decision-making. In this paper we outline the Medical Information Mart for Intensive Care (MIMIC) Code Repository, a centralized code base for generating reproducible studies on an openly available critical care dataset. Code is provided to load the data into a relational structure, create extractions of the data, and reproduce entire analysis plans including research studies. Concepts extracted include severity of illness scores, comorbid status, administrative definitions of sepsis, physiologic criteria for sepsis, organ failure scores, treatment administration, and more. Executable documents are used for tutorials and reproduce published studies end-to-end, providing a template for future researchers to replicate. The repository's issue tracker enables community discussion about the data and concepts, allowing users to collaboratively improve the resource. The centralized repository provides a platform for users of the data to interact directly with the data generators, facilitating greater understanding of the data. It also provides a location for the community to collaborate on necessary concepts for research progress and share them with a larger audience. Consistent application of the same code for underlying concepts is a key step in ensuring that research studies on the MIMIC database are comparable and reproducible. By providing open source code alongside the freely accessible MIMIC-III database, we enable end-to-end reproducible analysis of electronic health records. © The Author 2017. Published by Oxford University Press on behalf of the American Medical Informatics Association.

  15. Unsupervised Extraction of Diagnosis Codes from EMRs Using Knowledge-Based and Extractive Text Summarization Techniques

    PubMed Central

    Kavuluru, Ramakanth; Han, Sifei; Harris, Daniel

    2017-01-01

    Diagnosis codes are extracted from medical records for billing and reimbursement and for secondary uses such as quality control and cohort identification. In the US, these codes come from the standard terminology ICD-9-CM derived from the international classification of diseases (ICD). ICD-9 codes are generally extracted by trained human coders by reading all artifacts available in a patient’s medical record following specific coding guidelines. To assist coders in this manual process, this paper proposes an unsupervised ensemble approach to automatically extract ICD-9 diagnosis codes from textual narratives included in electronic medical records (EMRs). Earlier attempts on automatic extraction focused on individual documents such as radiology reports and discharge summaries. Here we use a more realistic dataset and extract ICD-9 codes from EMRs of 1000 inpatient visits at the University of Kentucky Medical Center. Using named entity recognition (NER), graph-based concept-mapping of medical concepts, and extractive text summarization techniques, we achieve an example based average recall of 0.42 with average precision 0.47; compared with a baseline of using only NER, we notice a 12% improvement in recall with the graph-based approach and a 7% improvement in precision using the extractive text summarization approach. Although diagnosis codes are complex concepts often expressed in text with significant long range non-local dependencies, our present work shows the potential of unsupervised methods in extracting a portion of codes. As such, our findings are especially relevant for code extraction tasks where obtaining large amounts of training data is difficult. PMID:28748227
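
    The example-based averaging behind the reported scores can be sketched as follows; the visits and ICD-9 code sets below are made up for illustration:

```python
# Sketch of example-based averaging for code-extraction evaluation:
# precision and recall are computed per patient visit, then averaged.

def example_based_scores(gold_sets, pred_sets):
    precs, recs = [], []
    for gold, pred in zip(gold_sets, pred_sets):
        tp = len(gold & pred)                      # codes found correctly
        precs.append(tp / len(pred) if pred else 0.0)
        recs.append(tp / len(gold) if gold else 0.0)
    n = len(gold_sets)
    return sum(precs) / n, sum(recs) / n

# Toy visits with gold and predicted ICD-9 code sets (codes are made up)
gold = [{"428.0", "584.9"}, {"250.00"}]
pred = [{"428.0"}, {"250.00", "401.9"}]
p, r = example_based_scores(gold, pred)
print(round(p, 2), round(r, 2))  # 0.75 0.75
```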

  16. The World in a Tomato: Revisiting the Use of "Codes" in Freire's Problem-Posing Education.

    ERIC Educational Resources Information Center

    Barndt, Deborah

    1998-01-01

    Gives examples of the use of Freire's notion of codes or generative themes in problem-posing literacy education. Describes how these applications expand Freire's conceptions by involving students in code production, including multicultural perspectives, and rethinking codes as representations. (SK)

  17. Semantic enrichment of medical forms - semi-automated coding of ODM-elements via web services.

    PubMed

    Breil, Bernhard; Watermann, Andreas; Haas, Peter; Dziuballe, Philipp; Dugas, Martin

    2012-01-01

    Semantic interoperability is an unsolved problem which occurs while working with medical forms from different information systems or institutions. Standards like ODM or CDA assure structural homogenization but in order to compare elements from different data models it is necessary to use semantic concepts and codes on an item level of those structures. We developed and implemented a web-based tool which enables a domain expert to perform semi-automated coding of ODM-files. For each item it is possible to inquire web services which result in unique concept codes without leaving the context of the document. Although it was not feasible to perform a totally automated coding we have implemented a dialog based method to perform an efficient coding of all data elements in the context of the whole document. The proportion of codable items was comparable to results from previous studies.

  18. Optical encryption and QR codes: secure and noise-free information retrieval.

    PubMed

    Barrera, John Fredy; Mira, Alejandro; Torroba, Roberto

    2013-03-11

    We introduce for the first time the concept of an information "container" applied before a standard optical encrypting procedure. The "container" selected is a QR code, which offers the main advantage of being tolerant to pollutant speckle noise. Besides, the QR code can be read by smartphones, a massively used device. Additionally, the QR code adds another layer of security to the benefits the optical methods provide. The QR code is generated by means of freely available software. The concept development proves that the speckle noise polluting the outcomes of normal optical encrypting procedures can be avoided, making the adoption of these techniques more attractive. Actual smartphone-collected results are shown to validate our proposal.
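
    The abstract does not name the optical encrypting procedure, but double random phase encoding (DRPE) is the standard choice in this literature; a minimal numerical sketch of encrypting a binary, QR-like "container" under that assumption:

```python
import numpy as np

rng = np.random.default_rng(0)

def drpe_encrypt(img, phi1, phi2):
    """Double random phase encoding: phase masks in input and Fourier planes."""
    return np.fft.ifft2(np.fft.fft2(img * np.exp(1j * phi1)) * np.exp(1j * phi2))

def drpe_decrypt(cipher, phi1, phi2):
    """Invert with conjugate masks; the magnitude recovers the input."""
    return np.abs(
        np.fft.ifft2(np.fft.fft2(cipher) * np.exp(-1j * phi2)) * np.exp(-1j * phi1)
    )

# Toy binary "QR-like" container: the built-in error tolerance of real QR
# codes is what makes such containers robust to residual speckle noise.
qr = rng.integers(0, 2, (16, 16)).astype(float)
phi1 = rng.uniform(0, 2 * np.pi, qr.shape)
phi2 = rng.uniform(0, 2 * np.pi, qr.shape)
cipher = drpe_encrypt(qr, phi1, phi2)
recovered = drpe_decrypt(cipher, phi1, phi2)
print(np.allclose(recovered, qr, atol=1e-8))  # True
```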

  19. Conceptualisations of infinity by primary pre-service teachers

    NASA Astrophysics Data System (ADS)

    Date-Huxtable, Elizabeth; Cavanagh, Michael; Coady, Carmel; Easey, Michael

    2018-05-01

    As part of the Opening Real Science: Authentic Mathematics and Science Education for Australia project, an online mathematics learning module embedding conceptual thinking about infinity in science-based contexts, was designed and trialled with a cohort of 22 pre-service teachers during 1 week of intensive study. This research addressed the question: "How do pre-service teachers conceptualise infinity mathematically?" Participants argued the existence of infinity in a summative reflective task, using mathematical and empirical arguments that were coded according to five themes: definition, examples, application, philosophy and teaching; and 17 codes. Participants' reflections were differentiated as to whether infinity was referred to as an abstract (A) or a real (R) concept or whether both (B) codes were used. Principal component analysis of the reflections, using frequency of codings, revealed that A and R codes occurred at different frequencies in three groups of reflections. Distinct methods of argument were associated with each group of reflections: mathematical numerical examples and empirical measurement comparisons characterised arguments for infinity as an abstract concept, geometric and empirical dynamic examples and belief statements characterised arguments for infinity as a real concept and empirical measurement and mathematical examples and belief statements characterised arguments for infinity as both an abstract and a real concept. An implication of the results is that connections between mathematical and empirical applications of infinity may assist pre-service teachers to contrast finite with infinite models of the world.

  20. Code Switching and Code Superimposition in Music. Working Papers in Sociolinguistics, No. 63.

    ERIC Educational Resources Information Center

    Slobin, Mark

    This paper illustrates how the sociolinguistic concept of code switching applies to the use of different styles of music. The two bases for the analogy are Labov's definition of code-switching as "moving from one consistent set of co-occurring rules to another," and the finding of sociolinguistics that code switching tends to be part of…

  1. Soft-Input Soft-Output Modules for the Construction and Distributed Iterative Decoding of Code Networks

    NASA Technical Reports Server (NTRS)

    Benedetto, S.; Divsalar, D.; Montorsi, G.; Pollara, F.

    1998-01-01

    Soft-input soft-output building blocks (modules) are presented to construct and iteratively decode in a distributed fashion code networks, a new concept that includes, and generalizes, various forms of concatenated coding schemes.

  2. GRC RBCC Concept Multidisciplinary Analysis

    NASA Technical Reports Server (NTRS)

    Suresh, Ambady

    2001-01-01

    This report outlines the GRC RBCC Concept for Multidisciplinary Analysis. The multidisciplinary coupling procedure is presented, along with technique validations and axisymmetric multidisciplinary inlet and structural results. The NPSS (Numerical Propulsion System Simulation) test bed developments and code parallelization are also presented. These include milestones and accomplishments, a discussion of running R4 fan application on the PII cluster as compared to other platforms, and the National Combustor Code speedup.

  3. A Fault-Tolerant Radiation-Robust Mass Storage Concept for Highly Scaled Flash Memory

    NASA Astrophysics Data System (ADS)

    Fuchs, Cristian M.; Trinitis, Carsten; Appel, Nicolas; Langer, Martin

    2015-09-01

    Future space missions will require vast amounts of data to be stored and processed aboard spacecraft. While satisfying operational mission requirements, storage systems must guarantee data integrity and recover damaged data throughout the mission. NAND-flash memories have become popular for space-borne high-performance mass memory scenarios, though future storage concepts will rely upon highly scaled flash or other memory technologies. With modern flash memory, single-bit erasure coding and RAID-based concepts are insufficient. Thus, a fully run-time configurable, high-performance, dependable storage concept is required, implementable with a minimal set of logic or software. The solution is based on composite erasure coding and can be adjusted for altered mission duration or changing environmental conditions.
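
    As a hedged illustration of the baseline erasure-coding idea the paper builds on (not its composite scheme), single-parity XOR across pages recovers any one lost page:

```python
# Minimal erasure-coding illustration (single-parity, RAID-like), not the
# paper's composite scheme: one lost page is recoverable by XOR of the rest.

def xor_pages(pages):
    out = bytes(len(pages[0]))
    for p in pages:
        out = bytes(a ^ b for a, b in zip(out, p))
    return out

data_pages = [b"page0da", b"page1db", b"page2dc"]
parity = xor_pages(data_pages)

# Simulate losing page 1 (e.g., a failed flash block)
survivors = [data_pages[0], data_pages[2], parity]
recovered = xor_pages(survivors)
print(recovered == data_pages[1])  # True
```

    The paper's point is that a single parity symbol cannot handle the multi-bit upsets of highly scaled flash, motivating composite codes.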

  4. Processing Motion: Using Code to Teach Newtonian Physics

    NASA Astrophysics Data System (ADS)

    Massey, M. Ryan

    Prior to instruction, students often possess a common-sense view of motion, which is inconsistent with Newtonian physics. Effective physics lessons therefore involve conceptual change. To provide a theoretical explanation for concepts and how they change, the triangulation model brings together key attributes of prototypes, exemplars, theories, Bayesian learning, ontological categories, and the causal model theory. The triangulation model provides a theoretical rationale for why coding is a viable method for physics instruction. As an experiment, thirty-two adolescent students participated in summer coding academies to learn how to design Newtonian simulations. Conceptual and attitudinal data was collected using the Force Concept Inventory and the Colorado Learning Attitudes about Science Survey. Results suggest that coding is an effective means for teaching Newtonian physics.

  5. Controlling Energy Radiations of Electromagnetic Waves via Frequency Coding Metamaterials.

    PubMed

    Wu, Haotian; Liu, Shuo; Wan, Xiang; Zhang, Lei; Wang, Dan; Li, Lianlin; Cui, Tie Jun

    2017-09-01

    Metamaterials are artificial structures composed of subwavelength unit cells to control electromagnetic (EM) waves. The spatial coding representation of metamaterial has the ability to describe the material in a digital way. The spatial coding metamaterials are typically constructed by unit cells that have similar shapes with fixed functionality. Here, the concept of frequency coding metamaterial is proposed, which achieves different controls of EM energy radiations with a fixed spatial coding pattern when the frequency changes. In this case, not only different phase responses of the unit cells are considered, but different phase sensitivities are also required. Due to different frequency sensitivities of unit cells, two units with the same phase response at the initial frequency may have different phase responses at a higher frequency. To describe the frequency coding property of a unit cell, digitalized frequency sensitivity is proposed, in which the units are encoded with digits "0" and "1" to represent the low and high phase sensitivities, respectively. By this merit, two degrees of freedom, spatial coding and frequency coding, are obtained to control the EM energy radiations by a new class of frequency-spatial coding metamaterials. The above concepts and physical phenomena are confirmed by numerical simulations and experiments.

  6. A Thermal Management Systems Model for the NASA GTX RBCC Concept

    NASA Technical Reports Server (NTRS)

    Traci, Richard M.; Farr, John L., Jr.; Laganelli, Tony; Walker, James (Technical Monitor)

    2002-01-01

    The Vehicle Integrated Thermal Management Analysis Code (VITMAC) was further developed to aid the analysis, design, and optimization of propellant and thermal management concepts for advanced propulsion systems. The computational tool is based on engineering level principles and models. A graphical user interface (GUI) provides a simple and straightforward method to assess and evaluate multiple concepts before undertaking more rigorous analysis of candidate systems. The tool incorporates the Chemical Equilibrium and Applications (CEA) program and the RJPA code to permit heat transfer analysis of both rocket and air breathing propulsion systems. Key parts of the code have been validated with experimental data. The tool was specifically tailored to analyze rocket-based combined-cycle (RBCC) propulsion systems being considered for space transportation applications. This report describes the computational tool and its development and verification for NASA GTX RBCC propulsion system applications.

  7. The general theory of convolutional codes

    NASA Technical Reports Server (NTRS)

    Mceliece, R. J.; Stanley, R. P.

    1993-01-01

    This article presents a self-contained introduction to the algebraic theory of convolutional codes. This introduction is partly a tutorial, but at the same time contains a number of new results which will prove useful for designers of advanced telecommunication systems. Among the new concepts introduced here are the Hilbert series for a convolutional code and the class of compact codes.

  8. Nevada Administrative Code for Special Education Programs.

    ERIC Educational Resources Information Center

    Nevada State Dept. of Education, Carson City. Special Education Branch.

    This document presents excerpts from Chapter 388 of the Nevada Administrative Code, which concerns definitions, eligibility, and programs for students who are disabled or gifted/talented. The first section gathers together 36 relevant definitions from the Code for such concepts as "adaptive behavior," "autism," "gifted and…

  9. The Effects of a Concept Map-Based Support Tool on Simulation-Based Inquiry Learning

    ERIC Educational Resources Information Center

    Hagemans, Mieke G.; van der Meij, Hans; de Jong, Ton

    2013-01-01

    Students often need support to optimize their learning in inquiry learning environments. In 2 studies, we investigated the effects of adding concept-map-based support to a simulation-based inquiry environment on kinematics. The concept map displayed the main domain concepts and their relations, while dynamic color coding of the concepts displayed…

  10. Swept Impact Seismic Technique (SIST)

    USGS Publications Warehouse

    Park, C.B.; Miller, R.D.; Steeples, D.W.; Black, R.A.

    1996-01-01

    A coded seismic technique is developed that can result in a higher signal-to-noise ratio than a conventional single-pulse method does. The technique is cost-effective and time-efficient and therefore well suited for shallow-reflection surveys where high resolution and cost-effectiveness are critical. A low-power impact source transmits a few to several hundred high-frequency broad-band seismic pulses during several seconds of recording time according to a deterministic coding scheme. The coding scheme consists of a time-encoded impact sequence in which the rate of impact (cycles/s) changes linearly with time providing a broad range of impact rates. Impact times used during the decoding process are recorded on one channel of the seismograph. The coding concept combines the vibroseis swept-frequency and the Mini-Sosie random impact concepts. The swept-frequency concept greatly improves the suppression of correlation noise with much fewer impacts than normally used in the Mini-Sosie technique. The impact concept makes the technique simple and efficient in generating high-resolution seismic data especially in the presence of noise. The transfer function of the impact sequence simulates a low-cut filter with the cutoff frequency the same as the lowest impact rate. This property can be used to attenuate low-frequency ground-roll noise without using an analog low-cut filter or a spatial source (or receiver) array as is necessary with a conventional single-pulse method. Because of the discontinuous coding scheme, the decoding process is accomplished by a "shift-and-stacking" method that is much simpler and quicker than cross-correlation. The simplicity of the coding allows the mechanical design of the source to remain simple. Several different types of mechanical systems could be adapted to generate a linear impact sweep. In addition, the simplicity of the coding also allows the technique to be used with conventional acquisition systems, with only minor modifications.

  11. Seals Code Development Workshop

    NASA Technical Reports Server (NTRS)

    Hendricks, Robert C. (Compiler); Liang, Anita D. (Compiler)

    1996-01-01

    The 1995 Seals Workshop industrial code (INDSEAL) release includes ICYL, GCYLT, IFACE, GFACE, SPIRALG, SPIRALI, DYSEAL, and KTK. The scientific code (SCISEAL) release includes conjugate heat transfer and multidomain with rotordynamic capability. Several seals and bearings codes (e.g., HYDROFLEX, HYDROTRAN, HYDROB3D, FLOWCON1, FLOWCON2) are presented and results compared. Current computational and experimental emphasis includes multiple connected cavity flows with goals of reducing parasitic losses and gas ingestion. Labyrinth seals continue to play a significant role in sealing, with face, honeycomb, and new sealing concepts under investigation for advanced engine concepts in view of strict environmental constraints. The clean-sheet approach to engine design is advocated, with program directions and anticipated percentage SFC reductions cited. Future activities center on engine applications with coupled seal/power/secondary flow streams.

  12. Two-Dimensional Parson's Puzzles: The Concept, Tools, and First Observations

    ERIC Educational Resources Information Center

    Ihantola, Petri; Karavirta, Ville

    2011-01-01

    Parson's programming puzzles are a family of code construction assignments where lines of code are given, and the task is to form the solution by sorting and possibly selecting the correct code lines. We introduce a novel family of Parson's puzzles where the lines of code need to be sorted in two dimensions. The vertical dimension is used to order…

  13. A European mobile satellite system concept exploiting CDMA and OBP

    NASA Technical Reports Server (NTRS)

    Vernucci, A.; Craig, A. D.

    1993-01-01

    This paper describes a novel Land Mobile Satellite System (LMSS) concept applicable to networks allowing access to a large number of gateway stations ('Hubs'), utilizing low-cost Very Small Aperture Terminals (VSAT's). Efficient operation of the Forward-Link (FL) repeater can be achieved by adopting a synchronous Code Division Multiple Access (CDMA) technique, whereby inter-code interference (self-noise) is virtually eliminated by synchronizing orthogonal codes. However, with a transparent FL repeater, the requirements imposed by the highly decentralized ground segment can lead to significant efficiency losses. The adoption of a FL On-Board Processing (OBP) repeater is proposed as a means of largely recovering this efficiency impairment. The paper describes the network architecture, the system design and performance, the OBP functions and impact on implementation. The proposed concept, applicable to a future generation of the European LMSS, was developed in the context of a European Space Agency (ESA) study contract.
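
    The self-noise cancellation that synchronous CDMA relies on can be sketched with Walsh-Hadamard codes; the user bits below are hypothetical:

```python
# Sketch of the synchronous-CDMA idea: orthogonal (Walsh-Hadamard) codes,
# chip-aligned on the forward link, give zero inter-code interference.

def hadamard(n):
    """Walsh-Hadamard matrix of order n (n a power of two), entries +/-1."""
    H = [[1]]
    while len(H) < n:
        H = [row + row for row in H] + [row + [-x for x in row] for row in H]
    return H

codes = hadamard(4)
bits = [1, -1, 1, -1]          # one BPSK symbol per user (hypothetical)

# Synchronously superimpose all users' spread signals on the forward link.
composite = [sum(b * c[i] for b, c in zip(bits, codes)) for i in range(4)]

# Despread user 1: correlation recovers its bit exactly (self-noise free).
user = 1
corr = sum(composite[i] * codes[user][i] for i in range(4)) / 4
print(corr)  # -1.0, user 1's bit
```

    The cancellation holds only because the codes stay chip-synchronous; with a transparent repeater and many independent Hubs that synchronism is hard to maintain, which is the efficiency argument for on-board processing.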

  14. Controlling Energy Radiations of Electromagnetic Waves via Frequency Coding Metamaterials

    PubMed Central

    Wu, Haotian; Liu, Shuo; Wan, Xiang; Zhang, Lei; Wang, Dan; Li, Lianlin

    2017-01-01

    Metamaterials are artificial structures composed of subwavelength unit cells to control electromagnetic (EM) waves. The spatial coding representation of metamaterial has the ability to describe the material in a digital way. The spatial coding metamaterials are typically constructed by unit cells that have similar shapes with fixed functionality. Here, the concept of frequency coding metamaterial is proposed, which achieves different controls of EM energy radiations with a fixed spatial coding pattern when the frequency changes. In this case, not only different phase responses of the unit cells are considered, but different phase sensitivities are also required. Due to different frequency sensitivities of unit cells, two units with the same phase response at the initial frequency may have different phase responses at a higher frequency. To describe the frequency coding property of a unit cell, digitalized frequency sensitivity is proposed, in which the units are encoded with digits “0” and “1” to represent the low and high phase sensitivities, respectively. By this merit, two degrees of freedom, spatial coding and frequency coding, are obtained to control the EM energy radiations by a new class of frequency-spatial coding metamaterials. The above concepts and physical phenomena are confirmed by numerical simulations and experiments. PMID:28932671

  15. Sequential Syndrome Decoding of Convolutional Codes

    NASA Technical Reports Server (NTRS)

    Reed, I. S.; Truong, T. K.

    1984-01-01

    The algebraic structure of convolutional codes is reviewed and sequential syndrome decoding is applied to those codes. These concepts are then used to realize, by example, actual sequential decoding using the stack algorithm. The Fano metric for use in sequential decoding is modified so that it can be utilized to sequentially find the minimum-weight error sequence.
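
    The syndrome concept for convolutional codes can be sketched with the standard rate-1/2 (7,5) code; the sequential search for the minimum-weight error sequence itself is omitted:

```python
# Sketch of the syndrome idea for a rate-1/2 convolutional code with
# generators g1 = 1 + D + D^2 and g2 = 1 + D^2 (the standard (7,5) code).
# For an error-free stream the syndrome v1*g2 + v2*g1 is identically zero;
# sequential syndrome decoding searches for the minimum-weight error
# sequence explaining a nonzero syndrome (the stack search is omitted).

def conv(u, g):
    """Binary polynomial multiplication (mod 2), coefficients low to high."""
    out = [0] * (len(u) + len(g) - 1)
    for i, ui in enumerate(u):
        for j, gj in enumerate(g):
            out[i + j] ^= ui & gj
    return out

g1, g2 = [1, 1, 1], [1, 0, 1]
u = [1, 0, 1, 1]                 # information bits
v1, v2 = conv(u, g1), conv(u, g2)

def syndrome(r1, r2):
    s1, s2 = conv(r1, g2), conv(r2, g1)
    return [a ^ b for a, b in zip(s1, s2)]

print(any(syndrome(v1, v2)))     # False: clean codeword, zero syndrome
r1 = v1[:]
r1[2] ^= 1                       # single channel error
print(any(syndrome(r1, v2)))     # True: nonzero syndrome flags the error
```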

  16. Proceduracy: Computer Code Writing in the Continuum of Literacy

    ERIC Educational Resources Information Center

    Vee, Annette

    2010-01-01

    This dissertation looks at computer programming through the lens of literacy studies, building from the concept of code as a written text with expressive and rhetorical power. I focus on the intersecting technological and social factors of computer code writing as a literacy--a practice I call "proceduracy". Like literacy, proceduracy is a human…

  17. Source Code Plagiarism--A Student Perspective

    ERIC Educational Resources Information Center

    Joy, M.; Cosma, G.; Yau, J. Y.-K.; Sinclair, J.

    2011-01-01

    This paper considers the problem of source code plagiarism by students within the computing disciplines and reports the results of a survey of students in Computing departments in 18 institutions in the U.K. This survey was designed to investigate how well students understand the concept of source code plagiarism and to discover what, if any,…

  18. Galen-In-Use: using artificial intelligence terminology tools to improve the linguistic coherence of a national coding system for surgical procedures.

    PubMed

    Rodrigues, J M; Trombert-Paviot, B; Baud, R; Wagner, J; Meusnier-Carriot, F

    1998-01-01

    GALEN has developed a language-independent common reference model based on a medically oriented ontology, together with practical tools and techniques for managing healthcare terminology, including natural language processing. GALEN-IN-USE is the current phase, which applies the modelling and the tools to the development or updating of coding systems for surgical procedures in different national coding centres co-operating within the European Federation of Coding Centres (EFCC), to create a language-independent knowledge repository for multicultural Europe. We used an integrated set of artificial intelligence terminology tools, named the CLAssification Manager workbench, to process French professional medical language rubrics into intermediate dissections and into the Grail reference ontology model representation. From this language-independent concept model representation we generate controlled French natural language. The French national coding centre is then able to retrieve the initial professional rubrics with different categories of concepts, to compare the professional language proposed by expert clinicians with the French generated controlled vocabulary, and to finalize the linguistic labels of the coding system in relation to the meanings of the conceptual system structure.

  19. Computational Infrastructure for Engine Structural Performance Simulation

    NASA Technical Reports Server (NTRS)

    Chamis, Christos C.

    1997-01-01

    Select computer codes developed over the years to simulate specific aspects of engine structures are described. These codes include blade impact integrated multidisciplinary analysis and optimization, progressive structural fracture, quantification of uncertainties for structural reliability and risk, benefits estimation of new technology insertion, and hierarchical simulation of engine structures made from metal matrix and ceramic matrix composites. Collectively, these codes constitute a unique infrastructure ready to credibly evaluate new and future engine structural concepts throughout the development cycle: from initial concept, to design and fabrication, to service performance and maintenance and repairs, to retirement for cause, and even to possible recycling. Stated differently, they provide 'virtual' concurrent engineering for engine structures' total life-cycle cost.

  20. Upgrades to the NESS (Nuclear Engine System Simulation) Code

    NASA Technical Reports Server (NTRS)

    Fittje, James E.

    2007-01-01

    In support of the President's Vision for Space Exploration, the Nuclear Thermal Rocket (NTR) concept is being evaluated as a potential propulsion technology for human expeditions to the moon and Mars. The need for exceptional propulsion system performance in these missions has been documented in numerous studies, and was the primary focus of a considerable effort undertaken during the 1960s and 1970s. The NASA Glenn Research Center is leveraging this past NTR investment in its vehicle concepts and mission analysis studies with the aid of the Nuclear Engine System Simulation (NESS) code. This paper presents the additional capabilities and upgrades made to this code in order to perform higher fidelity NTR propulsion system analysis and design.

  1. Computer Graphics and Metaphorical Elaboration for Learning Science Concepts.

    ERIC Educational Resources Information Center

    ChanLin, Lih-Juan; Chan, Kung-Chi

    This study explores the instructional impact of using computer multimedia to integrate metaphorical verbal information into graphical representations of biotechnology concepts. The combination of text and graphics into a single metaphor makes concepts dual-coded, and therefore more comprehensible and memorable for the student. Visual stimuli help…

  2. Subsumption principles underlying medical concept systems and their formal reconstruction.

    PubMed Central

    Bernauer, J.

    1994-01-01

    Conventional medical concept systems represent generic concept relations by hierarchical coding principles. Often these coding principles constrain the concept system and reduce the potential for automatic derivation of subsumption. Formal reconstruction of medical concept systems is an approach based on the conceptual representation of meanings that allows formal criteria for subsumption to be applied. Those criteria must reflect the intuitive principles of subordination that underlie conventional medical concept systems. In particular, these are: the subordinate concept results (1) from adding a specializing criterion to the superordinate concept, (2) from refining the primary category, or a criterion of the superordinate concept, by a concept that is less general, (3) from adding a partitive criterion to a criterion of the superordinate, (4) from refining a criterion by a concept that is less comprehensive, and finally (5) from coordinating the superordinate concept, or one of its criteria. This paper introduces a formalism called BERNWARD that aims at the formal reconstruction of medical concept systems according to these intuitive principles. The automatic derivation of hierarchical relations is supported primarily by explicit generic and explicit partitive hierarchies of concepts and, secondly, by two formal criteria based on the structure of concept descriptions and explicit hierarchical relations between their elements, namely formal subsumption and part-sensitive subsumption. Formal subsumption takes only generic relations into account; part-sensitive subsumption additionally regards partitive relations between criteria. This approach seems to be flexible enough to cope with unforeseeable effects of partitive criteria on subsumption. PMID:7949907
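
    BERNWARD itself is not specified in this abstract; as a rough illustration of structural subsumption over concept descriptions, the following toy check (hypothetical Python, with an invented is-a table and role names) treats a concept as a category plus a set of role criteria, and derives subordination from category specialization and criterion refinement:

```python
# Toy structural subsumption: a concept is a category plus a set of criteria.
# Concept B is subsumed by A if B's category equals or descends from A's
# category, and every criterion of A is matched by an equal-or-more-specific
# criterion of B. (Illustrative only; this is NOT the BERNWARD formalism.)

ISA = {  # child -> parent, an explicit generic hierarchy (invented example)
    "bacterial-pneumonia": "pneumonia",
    "pneumonia": "inflammation",
    "left-lung": "lung",
}

def is_a(child, parent):
    """True if child equals parent or reaches it by following ISA links."""
    while child is not None:
        if child == parent:
            return True
        child = ISA.get(child)
    return False

def subsumes(general, specific):
    """Each concept is (category, criteria); criteria is a set of (role, filler)."""
    gcat, gcrit = general
    scat, scrit = specific
    if not is_a(scat, gcat):
        return False
    # every criterion of the general concept must be refined in the specific one
    return all(
        any(role == r and is_a(f, filler) for r, f in scrit)
        for role, filler in gcrit
    )

lung_inflammation = ("inflammation", {("location", "lung")})
left_lung_bact_pneumonia = ("bacterial-pneumonia", {("location", "left-lung")})
```

    Here the subordinate concept refines both the primary category (bacterial-pneumonia under inflammation) and a criterion filler (left-lung under lung), corresponding to principles (1) and (2) above.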

  3. SEADYN Analysis of a Tow Line for a High Altitude Towed Glider

    NASA Technical Reports Server (NTRS)

    Colozza, Anthony J.

    1996-01-01

    The concept of using a system consisting of a tow aircraft, glider, and tow line to enable subsonic flight at altitudes above 24 km (78 kft) has previously been investigated. The preliminary results from these studies seem encouraging, and under certain conditions they indicate the concept is feasible. However, the previous studies did not accurately take into account the forces acting on the tow line, so a more detailed analysis was needed to investigate the concept further. The code selected was the SEADYN cable dynamics computer program, a finite element based structural analysis code developed over a period of 10 years at the Naval Facilities Engineering Service Center. Its results have been validated by the Navy both in the laboratory and at sea. The code simulates arbitrarily configured cable structures subjected to excitations encountered in real-world operations. The Navy's interest was mainly in modeling underwater tow lines, but the code is also usable for tow lines in air when the change in fluid properties is taken into account. For underwater applications the fluid properties are essentially constant over the length of the tow line, whereas for the tow aircraft/glider application they vary considerably along its length. The code therefore had to be modified to account for the variation in atmospheric properties encountered in this application: a variable fluid density was added, based on the altitude of the node being calculated. This change in how the code handles fluid density has no effect on the method of calculation or any other factor related to the code's validation.
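
    The density modification described above amounts to evaluating air density at each node's altitude rather than holding it constant. A hypothetical Python sketch of such a per-node density function, using the standard-atmosphere troposphere and lower-stratosphere layers (the actual SEADYN change was made in the code's own source, which is not shown in the abstract):

```python
import math

def air_density(alt_m):
    """Approximate standard-atmosphere air density (kg/m^3) up to ~25 km.

    Illustrates the SEADYN modification described in the abstract: fluid
    density is evaluated per node from its altitude instead of being held
    constant as in underwater tow-line runs.
    """
    g, R, lapse = 9.80665, 287.053, 0.0065
    if alt_m < 11000.0:                       # troposphere: linear temperature lapse
        T = 288.15 - lapse * alt_m
        p = 101325.0 * (T / 288.15) ** (g / (lapse * R))
    else:                                     # lower stratosphere: isothermal layer
        T = 216.65
        p11 = 101325.0 * (216.65 / 288.15) ** (g / (lapse * R))
        p = p11 * math.exp(-g * (alt_m - 11000.0) / (R * T))
    return p / (R * T)                        # ideal gas law
```

    At the glider's 24 km operating altitude the density is roughly 4% of its sea-level value, which is why a constant-density tow-line model is inadequate here.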

  4. A new systems engineering approach to streamlined science and mission operations for the Far Ultraviolet Spectroscopic Explorer (FUSE)

    NASA Technical Reports Server (NTRS)

    Butler, Madeline J.; Sonneborn, George; Perkins, Dorothy C.

    1994-01-01

    The Mission Operations and Data Systems Directorate (MO&DSD, Code 500), the Space Sciences Directorate (Code 600), and the Flight Projects Directorate (Code 400) have developed a new approach to combine the science and mission operations for the FUSE mission. FUSE, the last of the Delta-class Explorer missions, will obtain high resolution far ultraviolet spectra (910-1220 Å) of stellar and extragalactic sources to study the evolution of galaxies and conditions in the early universe. FUSE will be launched in 2000 into a 24-hour highly eccentric orbit. Science operations will be conducted in real time for 16-18 hours per day, in a manner similar to the operations performed today for the International Ultraviolet Explorer. In a radical departure from previous missions, the operations concept combines spacecraft and science operations and data processing functions in a single facility to be housed in the Laboratory for Astronomy and Solar Physics (Code 680). A small mission operations team will provide the spacecraft control, telescope operations, and data handling functions in a facility designated the Science and Mission Operations Center (SMOC). This approach will utilize the Transportable Payload Operations Control Center (TPOCC) architecture for both spacecraft and instrument commanding. Other concepts of integrated operations being developed by the Code 500 Renaissance Project will also be employed for the FUSE SMOC. The primary objective of this approach is to reduce development and mission operations costs: the operations concept, the integration of mission and science operations, and extensive use of existing hardware and software tools will decrease both development and operations costs substantially. This paper describes the FUSE operations concept and discusses the systems engineering approach used for its development, along with the software, hardware, and management tools that will make its implementation feasible.

  5. Spatial transform coding of color images.

    NASA Technical Reports Server (NTRS)

    Pratt, W. K.

    1971-01-01

    The application of the transform-coding concept to the coding of color images represented by three primary color planes of data is discussed. The principles of spatial transform coding are reviewed and the merits of various methods of color-image representation are examined. A performance analysis is presented for the color-image transform-coding system. Results of a computer simulation of the coding system are also given. It is shown that, by transform coding, the chrominance content of a color image can be coded with an average of 1.0 bits per element or less without serious degradation. If luminance coding is also employed, the average rate reduces to about 2.0 bits per element or less.
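
    The chrominance-coding idea can be sketched in miniature (hypothetical Python, not the paper's simulation): transform a block with an orthonormal DCT, quantize the coefficients with a coarse uniform step, and invert. For smooth chrominance-like data almost all coefficients quantize to zero, so few bits are needed, with little reconstruction error:

```python
import math

N = 8

# Orthonormal 8x8 DCT-II matrix: C[k][n] = a_k * cos(pi*(2n+1)*k / (2N))
C = [[(math.sqrt(1.0 / N) if k == 0 else math.sqrt(2.0 / N))
      * math.cos(math.pi * (2 * n + 1) * k / (2 * N))
      for n in range(N)] for k in range(N)]

def matmul(A, B):
    return [[sum(A[i][k] * B[k][j] for k in range(N)) for j in range(N)]
            for i in range(N)]

def transpose(A):
    return [list(row) for row in zip(*A)]

def dct2(block):          # forward 2-D transform: C @ block @ C^T
    return matmul(matmul(C, block), transpose(C))

def idct2(coef):          # inverse transform: C^T @ coef @ C
    return matmul(matmul(transpose(C), coef), C)

def quantize(coef, step):
    """Coarse uniform quantization: round each coefficient to a multiple of step."""
    return [[round(c / step) * step for c in row] for row in coef]

# A smooth chrominance-like ramp block: coarse quantization barely changes it.
block = [[10.0 + i + j for j in range(N)] for i in range(N)]
rec = idct2(quantize(dct2(block), step=4.0))
max_err = max(abs(rec[i][j] - block[i][j]) for i in range(N) for j in range(N))
```

    On this block only three of the 64 quantized coefficients are nonzero, which is the energy-compaction property that lets chrominance be coded at around one bit per element.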

  6. Legislative recognition in France of psychological harassment at work.

    PubMed

    Graser, M; Manaouil, C; Verrier, A; Doutrellot-Phillipon, C; Jardé, O

    2003-01-01

    The recent French Law on Social Modernisation of 17 January 2002 introduced the concept of "moral" (psychological) harassment into the French Labour Code and the French Criminal Code. The definition adopted by this law takes quite a broad view of the notion of psychological harassment. The legislator has established a means for "friendly" settlement of disputes: mediation. When it has not been possible to settle the dispute internally, the courts have a number of sanctions available to them. The French Labour Code provides that any termination of the contract of employment resulting from a situation of psychological harassment is automatically null and void; such nullification should therefore be applicable whatever the nature of the termination: dismissal, resignation, or negotiated departure. The Labour Code punishes psychological harassment at work by imprisonment for one year and a fine of 3,750 euros; the French Criminal Code prescribes penalties of one year's imprisonment and a fine of 15,000 euros.

  7. Integrative and distinctive coding of visual and conceptual object features in the ventral visual stream

    PubMed Central

    Douglas, Danielle; Newsome, Rachel N; Man, Louisa LY

    2018-01-01

    A significant body of research in cognitive neuroscience is aimed at understanding how object concepts are represented in the human brain. However, it remains unknown whether and where the visual and abstract conceptual features that define an object concept are integrated. We addressed this issue by comparing the neural pattern similarities among object-evoked fMRI responses with behavior-based models that independently captured the visual and conceptual similarities among these stimuli. Our results revealed evidence for distinctive coding of visual features in lateral occipital cortex, and conceptual features in the temporal pole and parahippocampal cortex. By contrast, we found evidence for integrative coding of visual and conceptual object features in perirhinal cortex. The neuroanatomical specificity of this effect was highlighted by results from a searchlight analysis. Taken together, our findings suggest that perirhinal cortex uniquely supports the representation of fully specified object concepts through the integration of their visual and conceptual features. PMID:29393853
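
    The comparison between neural pattern similarity and behaviour-based model similarity is a second-order (representational) similarity analysis; a minimal sketch of that comparison step (hypothetical Python, not the study's pipeline or data):

```python
def upper_triangle(mat):
    """Flatten the strict upper triangle of a square similarity matrix."""
    n = len(mat)
    return [mat[i][j] for i in range(n) for j in range(i + 1, n)]

def pearson(x, y):
    """Pearson correlation of two equal-length sequences."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    vx = sum((a - mx) ** 2 for a in x) ** 0.5
    vy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (vx * vy)

def model_fit(neural_sim, model_sim):
    """Correlate neural and model similarity structures (second-order analysis)."""
    return pearson(upper_triangle(neural_sim), upper_triangle(model_sim))
```

    A region shows integrative coding when its neural similarity structure is fit by the combined visual-plus-conceptual model better than by either feature model alone.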

  8. EXPERIENCES FROM THE SOURCE-TERM ANALYSIS OF A LOW AND INTERMEDIATE LEVEL RADWASTE DISPOSAL FACILITY

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Park,Jin Beak; Park, Joo-Wan; Lee, Eun-Young

    2003-02-27

    Enhancement of the computer code SAGE for evaluation of the Korean concept for a LILW waste disposal facility is discussed. Several features of source term analysis are embedded into SAGE to analyze: (1) effects of the degradation mode of an engineered barrier, (2) effects of dispersion phenomena in the unsaturated zone, and (3) effects of a time dependent sorption coefficient in the unsaturated zone. IAEA's Vault Safety Case (VSC) approach is used to demonstrate the ability of this assessment code. Results of MASCOT are used for comparison purposes. These enhancements of the safety assessment code, SAGE, can contribute to realistic evaluation of the Korean concept of the LILW disposal project in the near future.

  9. Error-correction coding for digital communications

    NASA Astrophysics Data System (ADS)

    Clark, G. C., Jr.; Cain, J. B.

    This book is written for the design engineer who must build the coding and decoding equipment and for the communication system engineer who must incorporate this equipment into a system. It is also suitable as a senior-level or first-year graduate text for an introductory one-semester course in coding theory. Fundamental concepts of coding are discussed along with group codes, taking into account basic principles, practical constraints, performance computations, coding bounds, generalized parity check codes, polynomial codes, and important classes of group codes. Other topics explored are related to simple nonalgebraic decoding techniques for group codes, soft decision decoding of block codes, algebraic techniques for multiple error correction, the convolutional code structure and Viterbi decoding, syndrome decoding techniques, and sequential decoding techniques. System applications are also considered, giving attention to concatenated codes, coding for the white Gaussian noise channel, interleaver structures for coded systems, and coding for burst noise channels.

  10. Phase II evaluation of clinical coding schemes: completeness, taxonomy, mapping, definitions, and clarity. CPRI Work Group on Codes and Structures.

    PubMed

    Campbell, J R; Carpenter, P; Sneiderman, C; Cohn, S; Chute, C G; Warren, J

    1997-01-01

    To compare three potential sources of controlled clinical terminology (READ codes version 3.1, SNOMED International, and Unified Medical Language System (UMLS) version 1.6) relative to attributes of completeness, clinical taxonomy, administrative mapping, term definitions and clarity (duplicate coding rate). The authors assembled 1929 source concept records from a variety of clinical information taken from four medical centers across the United States. The source data included medical as well as ample nursing terminology. The source records were coded in each scheme by an investigator and checked by the coding scheme owner. The codings were then scored by an independent panel of clinicians for acceptability. Codes were checked for definitions provided with the scheme. Codes for a random sample of source records were analyzed by an investigator for "parent" and "child" codes within the scheme. Parent and child pairs were scored by an independent panel of medical informatics specialists for clinical acceptability. Administrative and billing code mapping from the published scheme were reviewed for all coded records and analyzed by independent reviewers for accuracy. The investigator for each scheme exhaustively searched a sample of coded records for duplications. SNOMED was judged to be significantly more complete in coding the source material than the other schemes (SNOMED* 70%; READ 57%; UMLS 50%; *p < .00001). SNOMED also had a richer clinical taxonomy judged by the number of acceptable first-degree relatives per coded concept (SNOMED* 4.56, UMLS 3.17; READ 2.14, *p < .005). Only the UMLS provided any definitions; these were found for 49% of records which had a coding assignment. 
READ and UMLS had better administrative mappings (composite score: READ* 40.6%; UMLS* 36.1%; SNOMED 20.7%, *p < .00001), and SNOMED had substantially more duplications of coding assignments (duplication rate: READ 0%; UMLS 4.2%; SNOMED* 13.9%, *p < .004) associated with a loss of clarity. No major terminology source can lay claim to being the ideal resource for a computer-based patient record. However, based upon this analysis of releases for April 1995, SNOMED International is considerably more complete, has a compositional nature and a richer taxonomy. It suffers from less clarity, resulting from a lack of syntax and evolutionary changes in its coding scheme. READ has greater clarity and better mapping to administrative schemes (ICD-10 and OPCS-4), is rapidly changing and is less complete. UMLS is a rich lexical resource, with mappings to many source vocabularies. It provides definitions for many of its terms. However, due to the varying granularities and purposes of its source schemes, it has limitations for representation of clinical concepts within a computer-based patient record.

  11. Semi-Automated Annotation of Biobank Data Using Standard Medical Terminologies in a Graph Database.

    PubMed

    Hofer, Philipp; Neururer, Sabrina; Goebel, Georg

    2016-01-01

    Data describing biobank resources frequently contains unstructured free-text information or insufficient coding standards. (Bio-) medical ontologies like Orphanet Rare Diseases Ontology (ORDO) or the Human Disease Ontology (DOID) provide a high number of concepts, synonyms and entity relationship properties. Such standard terminologies increase quality and granularity of input data by adding comprehensive semantic background knowledge from validated entity relationships. Moreover, cross-references between terminology concepts facilitate data integration across databases using different coding standards. In order to encourage the use of standard terminologies, our aim is to identify and link relevant concepts with free-text diagnosis inputs within a biobank registry. Relevant concepts are selected automatically by lexical matching and SPARQL queries against a RDF triplestore. To ensure correctness of annotations, proposed concepts have to be confirmed by medical data administration experts before they are entered into the registry database. Relevant (bio-) medical terminologies describing diseases and phenotypes were identified and stored in a graph database which was tied to a local biobank registry. Concept recommendations during data input trigger a structured description of medical data and facilitate data linkage between heterogeneous systems.

  12. Multipath search coding of stationary signals with applications to speech

    NASA Astrophysics Data System (ADS)

    Fehn, H. G.; Noll, P.

    1982-04-01

    This paper deals with the application of multipath search coding (MSC) concepts to the coding of stationary memoryless and correlated sources, and of speech signals, at a rate of one bit per sample. Use is made of three MSC classes: (1) codebook coding, or vector quantization, (2) tree coding, and (3) trellis coding. The paper evaluates the performance of these coders and compares them both with conventional coders and with rate-distortion bounds. The potential of MSC coding strategies is demonstrated by illustrations. The paper also reports on results of MSC coding of speech, where both adaptive quantization and adaptive prediction were included in the coder design.
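
    Of the three MSC classes, codebook coding (vector quantization) is the easiest to sketch: blocks of L samples map to the nearest of 2^L codewords, giving exactly one bit per sample. A minimal sketch (hypothetical Python; the 2-sample codebook here is arbitrary, not a trained one):

```python
# Codebook coding at 1 bit/sample: blocks of 2 samples, 4 codewords (2 bits).
CODEBOOK = [(-1.0, -1.0), (-1.0, 1.0), (1.0, -1.0), (1.0, 1.0)]

def encode(samples):
    """Map each 2-sample block to the index of the nearest codeword."""
    indices = []
    for i in range(0, len(samples), 2):
        block = (samples[i], samples[i + 1])
        dists = [(block[0] - cw[0]) ** 2 + (block[1] - cw[1]) ** 2
                 for cw in CODEBOOK]
        indices.append(dists.index(min(dists)))   # minimum squared error
    return indices

def decode(indices):
    """Reconstruct the signal by concatenating the selected codewords."""
    out = []
    for idx in indices:
        out.extend(CODEBOOK[idx])
    return out
```

    Tree and trellis coding replace this exhaustive per-block search with a sequential search over a structured code, which is where the "multipath search" of the title comes in.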

  13. On the Biological Plausibility of Grandmother Cells: Implications for Neural Network Theories in Psychology and Neuroscience

    ERIC Educational Resources Information Center

    Bowers, Jeffrey S.

    2009-01-01

    A fundamental claim associated with parallel distributed processing (PDP) theories of cognition is that knowledge is coded in a distributed manner in mind and brain. This approach rejects the claim that knowledge is coded in a localist fashion, with words, objects, and simple concepts (e.g. "dog"), that is, coded with their own dedicated…

  14. Innovation and Standardization in School Building: A Proposal for the National Code in Italy.

    ERIC Educational Resources Information Center

    Ridolfi, Giuseppe

    This document discusses the University of Florence's experience and concepts as it developed the research to define a proposal for designing a new national school building code. Section 1 examines the current school building code and the Italian Reform Process in Education between 1960 and 2000. Section 2 details and explains the new school…

  15. Electromagnetic reprogrammable coding-metasurface holograms.

    PubMed

    Li, Lianlin; Jun Cui, Tie; Ji, Wei; Liu, Shuo; Ding, Jun; Wan, Xiang; Bo Li, Yun; Jiang, Menghua; Qiu, Cheng-Wei; Zhang, Shuang

    2017-08-04

    Metasurfaces have enabled a plethora of emerging functions within an ultrathin dimension, paving the way towards flat and highly integrated photonic devices. Despite the rapid progress in this area, simultaneous realization of reconfigurability, high efficiency, and full control over the phase and amplitude of scattered light poses a great challenge. Here, we tackle this challenge by introducing the concept of a reprogrammable hologram based on 1-bit coding metasurfaces. The state of each unit cell of the coding metasurface can be switched between '1' and '0' by electrically controlling the loaded diodes. Our proof-of-concept experiments show that multiple desired holographic images can be realized in real time with only a single coding metasurface. The proposed reprogrammable hologram may be a key in enabling future intelligent devices with reconfigurable and programmable functionalities, with potential advances in a variety of applications such as microscopy, display, security, data storage, and information processing. Realizing metasurfaces with reconfigurability, high efficiency, and control over phase and amplitude is a challenge. Here, Li et al. introduce a reprogrammable hologram based on a 1-bit coding metasurface, where the state of each unit cell can be switched electrically.
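
    The operating principle can be sketched with a toy scalar model (hypothetical Python, not the authors' design method): each unit cell contributes phase 0 or π, the far field is approximated by the discrete Fourier transform of the aperture, and switching the bit pattern reshapes the radiated field:

```python
import cmath
import math

def far_field(bits):
    """Scalar far-field magnitudes of a 1-D array of 1-bit phase cells.

    Each cell radiates with phase 0 ('0') or pi ('1'), i.e. amplitude +1 or -1;
    the far-field pattern is modelled as the DFT of the aperture. Toy model only.
    """
    aperture = [cmath.exp(1j * math.pi * b) for b in bits]
    n = len(aperture)
    return [abs(sum(aperture[m] * cmath.exp(-2j * math.pi * k * m / n)
                    for m in range(n)))
            for k in range(n)]
```

    A uniform pattern concentrates energy in the broadside lobe, while an alternating 0/1 pattern moves it to the highest spatial frequency; electrically rewriting the bits is what makes the hologram reprogrammable in real time.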

  16. Resources for comparing the speed and performance of medical autocoders.

    PubMed

    Berman, Jules J

    2004-06-15

    Concept indexing is a popular method for characterizing medical text, and is one of the most important early steps in many data mining efforts. Concept indexing differs from simple word or phrase indexing because concepts are typically represented by a nomenclature code that binds a medical concept to all of its equivalent representations. A concept search on the term renal cell carcinoma would be expected to find occurrences of hypernephroma and renal carcinoma (concept equivalents). The purpose of this study is to provide freely available resources for comparing speed and performance among different autocoders. These tools consist of: 1) a public domain autocoder written in Perl (a free and open source programming language that installs on any operating system); 2) a nomenclature database derived from the unencumbered subset of the publicly available Unified Medical Language System; 3) a large corpus of autocoded output derived from a publicly available medical text. A simple lexical autocoder was written that parses plain text into a listing of all 1-, 2-, 3-, and 4-word strings contained in the text, assigning a nomenclature code to text strings that match terms in the nomenclature. The nomenclature used is the unencumbered subset of the 2003 Unified Medical Language System (UMLS), reduced to exclude homonymous one-word terms and proper names, resulting in a term/code data dictionary containing about half a million medical terms. The Online Mendelian Inheritance in Man (OMIM), a 92+ megabyte publicly available medical opus, was used as sample medical text for the autocoder. The autocoding Perl script is remarkably short, consisting of just 38 command lines. The 92+ megabyte OMIM file was completely autocoded in 869 seconds on a 2.4 GHz processor (less than 10 seconds per megabyte of text). The autocoded output file (9,540,442 bytes) contains 367,963 coded terms from OMIM and is distributed with this manuscript.
A public domain Perl script is provided that can parse through plain-text files of any length, matching concepts against an external nomenclature. The script and associated files can be used freely to compare the speed and performance of autocoding software.

  17. Use of the ETA-1 reactor for the validation of the multi-group APOLLO2-MORET 5 code and the Monte Carlo continuous energy MORET 5 code

    NASA Astrophysics Data System (ADS)

    Leclaire, N.; Cochet, B.; Le Dauphin, F. X.; Haeck, W.; Jacquet, O.

    2014-06-01

    The present paper aims at providing experimental validation for the use of the MORET 5 code for advanced reactor concepts involving thorium and heavy water. It therefore constitutes an opportunity to test and improve the thermal-scattering data of heavy water and also to test the recent implementation of probability tables in the MORET 5 code.

  18. Ascent Aerodynamic Pressure Distributions on WB001

    NASA Technical Reports Server (NTRS)

    Vu, B.; Ruf, J.; Canabal, F.; Brunty, J.

    1996-01-01

    To support the reusable launch vehicle concept study, the aerodynamic data and surface pressures for WB001 were predicted using three computational fluid dynamics (CFD) codes at several flow conditions. The results were compared code to code, against the aerodynamic database, and against available experimental data. A set of particular solutions has been selected and recommended for use in preliminary conceptual design. These CFD results have also been provided to the structures group for wing loading analysis.

  19. Concept Inventory Development Reveals Common Student Misconceptions about Microbiology †

    PubMed Central

    Briggs, Amy G.; Hughes, Lee E.; Brennan, Robert E.; Buchner, John; Horak, Rachel E. A.; Amburn, D. Sue Katz; McDonald, Ann H.; Primm, Todd P.; Smith, Ann C.; Stevens, Ann M.; Yung, Sunny B.; Paustian, Timothy D.

    2017-01-01

    Misconceptions, or alternative conceptions, are incorrect understandings that students have incorporated into their prior knowledge. The goal of this study was the identification of misconceptions in microbiology held by undergraduate students upon entry into an introductory, general microbiology course. This work was the first step in developing a microbiology concept inventory based on the American Society for Microbiology’s Recommended Curriculum Guidelines for Undergraduate Microbiology. Responses to true/false (T/F) questions accompanied by written explanations by undergraduate students at a diverse set of institutions were used to reveal misconceptions for fundamental microbiology concepts. These data were analyzed to identify the most difficult core concepts, misalignment between explanations and answer choices, and the most common misconceptions for each core concept. From across the core concepts, nineteen misconception themes found in at least 5% of the coded answers for a given question were identified. The top five misconceptions, with coded responses ranging from 19% to 43% of the explanations, are described, along with suggested classroom interventions. Identification of student misconceptions in microbiology provides a foundation upon which to understand students’ prior knowledge and to design appropriate tools for improving instruction in microbiology. PMID:29854046

  20. Nuclear Analysis

    NASA Technical Reports Server (NTRS)

    Clement, J. D.; Kirby, K. D.

    1973-01-01

    Exploratory calculations were performed for several gas core breeder reactor configurations. The computational method involved the use of the MACH-1 one dimensional diffusion theory code and the THERMOS integral transport theory code for thermal cross sections. Computations were performed to analyze thermal breeder concepts and nonbreeder concepts. Analysis of breeders was restricted to the (U-233)-Th breeding cycle, and computations were performed to examine a range of parameters. These parameters include U-233 to hydrogen atom ratio in the gaseous cavity, carbon to thorium atom ratio in the breeding blanket, cavity size, and blanket size.
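
    The kind of calculation a one-dimensional diffusion code such as MACH-1 performs can be illustrated in miniature with a one-group, bare-slab finite-difference eigenvalue solve (hypothetical Python with made-up cross sections, unrelated to the gas core configurations studied):

```python
# One-group, 1-D slab diffusion with zero-flux boundaries:
#   -D * phi'' + Sa * phi = (1/k) * nuSf * phi
# discretized by finite differences and solved by power iteration on the
# fission source. Cross sections below are invented for illustration.

def solve_tridiag(a, b, c, d):
    """Thomas algorithm for a constant-coefficient tridiagonal system."""
    n = len(d)
    cp, dp = [0.0] * n, [0.0] * n
    cp[0], dp[0] = c / b, d[0] / b
    for i in range(1, n):
        m = b - a * cp[i - 1]
        cp[i] = c / m
        dp[i] = (d[i] - a * dp[i - 1]) / m
    x = [0.0] * n
    x[-1] = dp[-1]
    for i in range(n - 2, -1, -1):
        x[i] = dp[i] - cp[i] * x[i + 1]
    return x

def k_effective(D, sig_a, nu_sig_f, width, n=200, iters=500):
    """Fundamental-mode k for a bare slab, by source (power) iteration."""
    h = width / (n + 1)
    diag = 2.0 * D / h ** 2 + sig_a      # removal operator diagonal
    off = -D / h ** 2                    # leakage coupling to neighbours
    phi = [1.0] * n
    k = 1.0
    for _ in range(iters):
        src = [nu_sig_f * p / k for p in phi]
        phi = solve_tridiag(off, diag, off, src)
        k_new = sum(nu_sig_f * p for p in phi) / sum(src)
        if abs(k_new - k) < 1e-12:
            return k_new
        k = k_new
    return k
```

    The numerical eigenvalue can be checked against the analytic bare-slab result k = νΣf / (Σa + D·B²) with buckling B = π/width, which the fine mesh reproduces closely.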

  1. Phase II Evaluation of Clinical Coding Schemes

    PubMed Central

    Campbell, James R.; Carpenter, Paul; Sneiderman, Charles; Cohn, Simon; Chute, Christopher G.; Warren, Judith

    1997-01-01

    Abstract Objective: To compare three potential sources of controlled clinical terminology (READ codes version 3.1, SNOMED International, and Unified Medical Language System (UMLS) version 1.6) relative to attributes of completeness, clinical taxonomy, administrative mapping, term definitions and clarity (duplicate coding rate). Methods: The authors assembled 1929 source concept records from a variety of clinical information taken from four medical centers across the United States. The source data included medical as well as ample nursing terminology. The source records were coded in each scheme by an investigator and checked by the coding scheme owner. The codings were then scored by an independent panel of clinicians for acceptability. Codes were checked for definitions provided with the scheme. Codes for a random sample of source records were analyzed by an investigator for “parent” and “child” codes within the scheme. Parent and child pairs were scored by an independent panel of medical informatics specialists for clinical acceptability. Administrative and billing code mapping from the published scheme were reviewed for all coded records and analyzed by independent reviewers for accuracy. The investigator for each scheme exhaustively searched a sample of coded records for duplications. Results: SNOMED was judged to be significantly more complete in coding the source material than the other schemes (SNOMED* 70%; READ 57%; UMLS 50%; *p < .00001). SNOMED also had a richer clinical taxonomy judged by the number of acceptable first-degree relatives per coded concept (SNOMED* 4.56; UMLS 3.17; READ 2.14, *p < .005). Only the UMLS provided any definitions; these were found for 49% of records which had a coding assignment. READ and UMLS had better administrative mappings (composite score: READ* 40.6%; UMLS* 36.1%; SNOMED 20.7%, *p < .00001), and SNOMED had substantially more duplications of coding assignments (duplication rate: READ 0%; UMLS 4.2%; SNOMED* 13.9%, *p < .004) associated with a loss of clarity. Conclusion: No major terminology source can lay claim to being the ideal resource for a computer-based patient record. However, based upon this analysis of releases for April 1995, SNOMED International is considerably more complete, has a compositional nature and a richer taxonomy. It suffers from less clarity, resulting from a lack of syntax and evolutionary changes in its coding scheme. READ has greater clarity and better mapping to administrative schemes (ICD-10 and OPCS-4), is rapidly changing and is less complete. UMLS is a rich lexical resource, with mappings to many source vocabularies. It provides definitions for many of its terms. However, due to the varying granularities and purposes of its source schemes, it has limitations for representation of clinical concepts within a computer-based patient record. PMID:9147343

  2. Proof of Concept Coded Aperture Miniature Mass Spectrometer Using a Cycloidal Sector Mass Analyzer, a Carbon Nanotube (CNT) Field Emission Electron Ionization Source, and an Array Detector.

    PubMed

    Amsden, Jason J; Herr, Philip J; Landry, David M W; Kim, William; Vyas, Raul; Parker, Charles B; Kirley, Matthew P; Keil, Adam D; Gilchrist, Kristin H; Radauscher, Erich J; Hall, Stephen D; Carlson, James B; Baldasaro, Nicholas; Stokes, David; Di Dona, Shane T; Russell, Zachary E; Grego, Sonia; Edwards, Steven J; Sperline, Roger P; Denton, M Bonner; Stoner, Brian R; Gehm, Michael E; Glass, Jeffrey T

    2018-02-01

    Despite many potential applications, miniature mass spectrometers have had limited adoption in the field due to the tradeoff between throughput and resolution that limits their performance relative to laboratory instruments. Recently, a solution to this tradeoff has been demonstrated by using spatially coded apertures in magnetic sector mass spectrometers, enabling throughput and signal-to-background improvements of greater than an order of magnitude with no loss of resolution. This paper describes a proof of concept demonstration of a cycloidal coded aperture miniature mass spectrometer (C-CAMMS) demonstrating use of spatially coded apertures in a cycloidal sector mass analyzer for the first time. C-CAMMS also incorporates a miniature carbon nanotube (CNT) field emission electron ionization source and a capacitive transimpedance amplifier (CTIA) ion array detector. Results confirm the cycloidal mass analyzer's compatibility with aperture coding. A >10× increase in throughput was achieved without loss of resolution compared with a single slit instrument. Several areas where additional improvement can be realized are identified.
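
    The benefit of aperture coding can be illustrated with a one-dimensional toy model (hypothetical Python, unrelated to the C-CAMMS hardware): each spectral line projects a shifted copy of the aperture code onto the detector array, and the spectrum is recovered by inverting the resulting linear system:

```python
# 1-D toy of coded-aperture spectrometry: the detector records the spectrum
# circularly convolved with a binary aperture code; decoding inverts the
# resulting linear system. (Illustrative only; not the C-CAMMS optics.)

CODE = [1, 1, 1, 0, 1, 0, 0]   # binary code from quadratic residues mod 7

def measure(spectrum):
    """Detector reading: circular convolution of the spectrum with the code."""
    n = len(CODE)
    return [sum(CODE[(j - i) % n] * spectrum[i] for i in range(n))
            for j in range(n)]

def recover(measurement):
    """Invert the circulant code matrix by Gauss-Jordan elimination."""
    n = len(CODE)
    A = [[float(CODE[(j - i) % n]) for i in range(n)] + [float(measurement[j])]
         for j in range(n)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(A[r][col]))
        A[col], A[piv] = A[piv], A[col]          # partial pivoting
        for r in range(n):
            if r != col and A[r][col]:
                f = A[r][col] / A[col][col]
                A[r] = [x - f * y for x, y in zip(A[r], A[col])]
    return [A[i][n] / A[i][i] for i in range(n)]
```

    Four of the seven aperture elements are open, versus one for a single slit: the multiplexed measurement collects several times more signal, and the spectrum is then demultiplexed computationally, which is the source of the throughput gain without resolution loss.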

  3. Proof of Concept Coded Aperture Miniature Mass Spectrometer Using a Cycloidal Sector Mass Analyzer, a Carbon Nanotube (CNT) Field Emission Electron Ionization Source, and an Array Detector

    NASA Astrophysics Data System (ADS)

    Amsden, Jason J.; Herr, Philip J.; Landry, David M. W.; Kim, William; Vyas, Raul; Parker, Charles B.; Kirley, Matthew P.; Keil, Adam D.; Gilchrist, Kristin H.; Radauscher, Erich J.; Hall, Stephen D.; Carlson, James B.; Baldasaro, Nicholas; Stokes, David; Di Dona, Shane T.; Russell, Zachary E.; Grego, Sonia; Edwards, Steven J.; Sperline, Roger P.; Denton, M. Bonner; Stoner, Brian R.; Gehm, Michael E.; Glass, Jeffrey T.

    2018-02-01

    Despite many potential applications, miniature mass spectrometers have had limited adoption in the field due to the tradeoff between throughput and resolution that limits their performance relative to laboratory instruments. Recently, a solution to this tradeoff has been demonstrated by using spatially coded apertures in magnetic sector mass spectrometers, enabling throughput and signal-to-background improvements of greater than an order of magnitude with no loss of resolution. This paper describes a proof of concept demonstration of a cycloidal coded aperture miniature mass spectrometer (C-CAMMS) demonstrating use of spatially coded apertures in a cycloidal sector mass analyzer for the first time. C-CAMMS also incorporates a miniature carbon nanotube (CNT) field emission electron ionization source and a capacitive transimpedance amplifier (CTIA) ion array detector. Results confirm the cycloidal mass analyzer's compatibility with aperture coding. A >10× increase in throughput was achieved without loss of resolution compared with a single slit instrument. Several areas where additional improvement can be realized are identified.

  4. Motion-adaptive model-assisted compatible coding with spatiotemporal scalability

    NASA Astrophysics Data System (ADS)

    Lee, JaeBeom; Eleftheriadis, Alexandros

    1997-01-01

    We introduce the concept of motion-adaptive spatio-temporal model-assisted compatible (MA-STMAC) coding, a technique to selectively encode areas of different importance to the human eye, in both space and time, in moving images while taking object motion into account. Previous STMAC was proposed based on the fact that human 'eye contact' and 'lip synchronization' are very important in person-to-person communication. Several areas, including the eyes and lips, need different types of quality, since different areas have different perceptual significance to human observers. The approach provides a better rate-distortion tradeoff than conventional image coding techniques based on MPEG-1, MPEG-2, H.261, as well as H.263. STMAC coding is applied on top of an encoder, taking full advantage of its core design. Model motion tracking in our previous STMAC approach was not automatic. The proposed MA-STMAC coding considers the motion of the human face within the STMAC concept using automatic area detection. Experimental results are given using ITU-T H.263, addressing very low bit-rate compression.

  5. Quality Scalability Aware Watermarking for Visual Content.

    PubMed

    Bhowmik, Deepayan; Abhayaratne, Charith

    2016-11-01

    Scalable coding-based content adaptation poses serious challenges to traditional watermarking algorithms, which do not consider the scalable coding structure and hence cannot guarantee correct watermark extraction in the media consumption chain. In this paper, we propose a novel concept of scalable blind watermarking that ensures more robust watermark extraction at various compression ratios without affecting the visual quality of the host media. The proposed algorithm generates a scalable and robust watermarked image code-stream that allows the user to constrain embedding distortion for target content adaptations. The watermarked image code-stream consists of hierarchically nested joint distortion-robustness coding atoms. The code-stream is generated by a new wavelet domain blind watermarking algorithm guided by a quantization based binary tree. The code-stream can be truncated at any distortion-robustness atom to generate the watermarked image with the desired distortion-robustness requirements. A blind extractor is capable of extracting watermark data from the watermarked images. The algorithm is further extended to incorporate a bit-plane discarding-based quantization model used in scalable coding-based content adaptation, e.g., JPEG2000. This improves the robustness against quality scalability of JPEG2000 compression. The simulation results verify the feasibility of the proposed concept, its applications, and its improved robustness against quality scalable content adaptation. The proposed algorithm also outperforms existing methods, showing a 35% improvement. In terms of robustness to quality scalable video content adaptation using Motion JPEG2000 and wavelet-based scalable video coding, the proposed method shows major improvement for video watermarking.
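
    The bit-plane discarding quantization model referred to above can be illustrated with a minimal sketch (hypothetical helper names, not the authors' algorithm): discarding the n least-significant bit planes behaves like coarse quantization, so an embedder that quantizes a coefficient into parity-coded bins at least that coarse still extracts the watermark bit correctly.

```python
# Minimal sketch of quantization-based embedding surviving bit-plane
# discarding (hypothetical helper names; not the authors' algorithm).

def discard_bitplanes(coeff: int, n: int) -> int:
    """Model bit-plane discarding in scalable coding: zero the n LSB planes."""
    return (coeff >> n) << n

def embed_bit(coeff: int, bit: int, step: int) -> int:
    """Quantize the coefficient to a bin whose parity encodes the bit."""
    q = coeff // step
    if q % 2 != bit:
        q += 1
    return q * step

def extract_bit(coeff: int, step: int) -> int:
    """Blind extraction: read the parity of the quantization bin."""
    return (coeff // step) % 2

marked = embed_bit(100, 1, 8)            # embed bit 1 with step 8
degraded = discard_bitplanes(marked, 3)  # content adaptation drops 3 planes
print(extract_bit(degraded, 8))          # 1: the bit survives
```

    Because the step (8) is no finer than the discarded planes (2^3), rounding down to a multiple of 8 never changes the bin parity, which is the intuition behind matching embedding strength to the expected content adaptation.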

  6. Effective gene prediction by high resolution frequency estimator based on least-norm solution technique

    PubMed Central

    2014-01-01

    The linear-algebraic concept of a subspace plays a significant role in recent spectrum-estimation techniques. In this article, the authors utilize the noise-subspace concept to find hidden periodicities in DNA sequences. With the vast growth of genomic sequences, the demand to accurately identify protein-coding regions in DNA is rising. Several techniques for DNA feature extraction, drawing on various fields, have emerged in the recent past, among which the application of digital signal processing tools is of prime importance. It is known that coding segments have a 3-base periodicity, while non-coding regions do not share this feature. One of the most important subspace-based spectrum analysis techniques is the least-norm method. The least-norm estimator developed in this paper shows sharp period-3 peaks in coding regions while completely eliminating background noise. The proposed method is compared with the existing sliding discrete Fourier transform (SDFT) method, popularly known as the modified periodogram method, on several genes from various organisms, and the results show that it offers a better and more effective approach to gene prediction. Resolution, quality factor, sensitivity, specificity, miss rate, and wrong rate are used to establish the superiority of the least-norm gene prediction method over the existing method. PMID:24386895
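
    The period-3 property that such estimators exploit can be demonstrated with a toy spectral check (a plain DFT evaluation at frequency 1/3, not the paper's least-norm method):

```python
import cmath

def period3_energy(seq: str) -> float:
    """Sum of |DFT at frequency 1/3|^2 over the four base-indicator sequences.

    Coding regions tend to score higher because codon structure gives the
    indicator sequences a 3-base periodicity.
    """
    w = cmath.exp(-2j * cmath.pi / 3)  # DFT kernel at frequency 1/3
    total = 0.0
    for base in "ACGT":
        X = sum((seq[n] == base) * w**n for n in range(len(seq)))
        total += abs(X) ** 2
    return total

# A perfectly period-3 sequence scores high; a constant one scores ~0.
print(period3_energy("ACGACGACGACG") > period3_energy("AAAAAAAAAAAA"))  # True
```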

  7. Evaluation of Recent Upgrades to the NESS (Nuclear Engine System Simulation) Code

    NASA Technical Reports Server (NTRS)

    Fittje, James E.; Schnitzler, Bruce G.

    2008-01-01

    The Nuclear Thermal Rocket (NTR) concept is being evaluated as a potential propulsion technology for exploratory expeditions to the moon, Mars, and beyond. The need for exceptional propulsion system performance in these missions has been documented in numerous studies, and was the primary focus of a considerable effort undertaken during the Rover/NERVA program from 1955 to 1973. The NASA Glenn Research Center is leveraging this past NTR investment in their vehicle concepts and mission analysis studies with the aid of the Nuclear Engine System Simulation (NESS) code. This paper presents the additional capabilities and upgrades made to this code in order to perform higher fidelity NTR propulsion system analysis and design, and a comparison of its results to the Small Nuclear Rocket Engine (SNRE) design.

  8. Building an ontology of pulmonary diseases with natural language processing tools using textual corpora.

    PubMed

    Baneyx, Audrey; Charlet, Jean; Jaulent, Marie-Christine

    2007-01-01

    Pathologies and acts are classified in thesauri to help physicians code their activity. In practice, the use of thesauri is not sufficient to reduce variability in coding, and thesauri are not suitable for computer processing. We think the automation of the coding task requires a conceptual modeling of medical items: an ontology. Our task is to help lung specialists code acts and diagnoses with software that represents the medical knowledge of this specialty as an ontology. The objective of the reported work was to build an ontology of pulmonary diseases dedicated to the coding process. To carry out this objective, we developed a precise methodological process for the knowledge engineer for building various types of medical ontologies. This process is based on the need to express precisely in natural language the meaning of each concept using differential semantics principles. A differential ontology is a hierarchy of concepts and relationships organized according to their similarities and differences. Our main research hypothesis is to apply natural language processing tools to corpora to develop the resources needed to build the ontology. We consider two corpora, one composed of patient discharge summaries and the other a teaching book. We propose to combine two approaches to enrich the ontology building: (i) a method which consists of building terminological resources through distributional analysis and (ii) a method based on the observation of corpus sequences in order to reveal semantic relationships. Our ontology currently includes 1550 concepts, and the software implementing the coding process is still under development. Results show that the proposed approach is operational and indicate that the combination of these methods and the comparison of the resulting terminological structures give interesting clues to a knowledge engineer for the building of an ontology.

  9. The global public good concept: a means of promoting good veterinary governance.

    PubMed

    Eloit, M

    2012-08-01

    At the outset, the concept of a 'public good' was associated with economic policies. However, it has now evolved not only from a national to a global concept (global public good), but also from a concept applying solely to the production of goods to one encompassing societal issues (education, environment, etc.) and fundamental rights, including the right to health and food. Through their actions, Veterinary Services, as defined by the Terrestrial Animal Health Code (Terrestrial Code) of the World Organisation for Animal Health (OIE), help to improve animal health and reduce production losses. In this way they contribute directly and indirectly to food security and to safeguarding human health and economic resources. The organisation and operating procedures of Veterinary Services are therefore key to the efficient governance required to achieve these objectives. The OIE is a major player in global cooperation and governance in the fields of animal and public health through the implementation of its strategic standardisation mission and other programmes for the benefit of Veterinary Services and OIE Member Countries. Thus, the actions of Veterinary Services and the OIE deserve to be recognised as a global public good, backed by public investment to ensure that all Veterinary Services are in a position to apply the principles of good governance and to comply with the international standards for the quality of Veterinary Services set out in the OIE Terrestrial Code (Section 3 on Quality of Veterinary Services) and Aquatic Animal Health Code (Section 3 on Quality of Aquatic Animal Health Services).

  10. Progress in The Semantic Analysis of Scientific Code

    NASA Technical Reports Server (NTRS)

    Stewart, Mark

    2000-01-01

    This paper concerns a procedure that analyzes aspects of the meaning or semantics of scientific and engineering code. This procedure involves taking a user's existing code, adding semantic declarations for some primitive variables, and parsing this annotated code using multiple, independent expert parsers. These semantic parsers encode domain knowledge and recognize formulae in different disciplines including physics, numerical methods, mathematics, and geometry. The parsers will automatically recognize and document some static, semantic concepts and help locate some program semantic errors. These techniques may apply to a wider range of scientific codes. If so, the techniques could reduce the time, risk, and effort required to develop and modify scientific codes.

  11. Standardization of Terminology in Laboratory Medicine II

    PubMed Central

    Lee, Kap No; Yoon, Jong-Hyun; Min, Won Ki; Lim, Hwan Sub; Song, Junghan; Chae, Seok Lae; Jang, Seongsoo; Ki, Chang-Seok; Bae, Sook Young; Kim, Jang Su; Kwon, Jung-Ah; Lee, Chang Kyu

    2008-01-01

    Standardization of medical terminology is essential in data transmission between health care institutes and in maximizing the benefits of information technology. The purpose of this study was to standardize medical terms for laboratory observations. During the second year of the study, a standard database of concept names for laboratory terms that covered those used in tertiary health care institutes and reference laboratories was developed. The laboratory terms in the Logical Observation Identifiers Names and Codes (LOINC) database were adopted and matched with the electronic data interchange (EDI) codes in Korea. A public hearing and a workshop for clinical pathologists were held to collect the opinions of experts. The Korean standard laboratory terminology database, containing six axial concept names (component, property, time aspect, system [specimen], scale type, and method type), was established for 29,340 test observations. Short names and mapping tables for EDI codes and UMLS were added. Synonym tables were prepared to help match concept names to common terms used in the field. We herein describe the Korean standard laboratory terminology database for test names, result description terms, and result units, encompassing most of the laboratory tests in Korea. PMID:18756062
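
    A record in such a six-axis scheme might be modelled as follows (a hypothetical sketch; the field values are illustrative LOINC-style codes, not entries from the Korean database):

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class LabTermConcept:
    """Six-axis laboratory concept name in the style described above."""
    component: str      # what is measured, e.g. "Glucose"
    property: str       # kind of quantity, e.g. "MCnc" (mass concentration)
    time_aspect: str    # e.g. "Pt" (point in time)
    system: str         # specimen, e.g. "Ser/Plas"
    scale_type: str     # e.g. "Qn" (quantitative)
    method_type: str    # may be empty when the method is unspecified

    def fully_specified_name(self) -> str:
        """Join the six axes into a single colon-delimited concept name."""
        return ":".join([self.component, self.property, self.time_aspect,
                         self.system, self.scale_type, self.method_type])

glucose = LabTermConcept("Glucose", "MCnc", "Pt", "Ser/Plas", "Qn", "")
print(glucose.fully_specified_name())  # Glucose:MCnc:Pt:Ser/Plas:Qn:
```

    Keeping the axes as separate fields is what makes cross-system matching possible: two institutes can compare tests axis by axis instead of by free-text name.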

  12. Convergence acceleration of the Proteus computer code with multigrid methods

    NASA Technical Reports Server (NTRS)

    Demuren, A. O.; Ibraheem, S. O.

    1992-01-01

    Presented here is the first part of a study to implement convergence acceleration techniques based on the multigrid concept in the Proteus computer code. A review is given of previous studies on the implementation of multigrid methods in computer codes for compressible flow analysis. Also presented is a detailed stability analysis of upwind and central-difference based numerical schemes for solving the Euler and Navier-Stokes equations. Results are given of a convergence study of the Proteus code on computational grids of different sizes. The results presented here form the foundation for the implementation of multigrid methods in the Proteus code.

  13. Multiple component codes based generalized LDPC codes for high-speed optical transport.

    PubMed

    Djordjevic, Ivan B; Wang, Ting

    2014-07-14

    A class of generalized low-density parity-check (GLDPC) codes suitable for optical communications is proposed, which consists of multiple local codes. It is shown that Hamming, BCH, and Reed-Muller codes can be used as local codes, and that the maximum a posteriori probability (MAP) decoding of these local codes by the Ashikhmin-Lytsin algorithm is feasible in terms of complexity and performance. We demonstrate that record coding gains can be obtained from properly designed GLDPC codes derived from multiple component codes. We then show that several recently proposed classes of LDPC codes, such as convolutional and spatially-coupled codes, can be described using the concept of GLDPC coding, which indicates that GLDPC coding can be used as a unified platform for advanced FEC enabling ultra-high speed optical transport. The proposed class of GLDPC codes is also suitable for code-rate adaptation, to adjust the error correction strength depending on the optical channel conditions.
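
    As an illustration of a local component code of the kind mentioned above, here is a minimal sketch of (7,4) Hamming syndrome decoding (a hard-decision toy; the paper's GLDPC decoder uses MAP decoding of local codes via the Ashikhmin-Lytsin algorithm, not this rule):

```python
# Parity-check matrix of the (7,4) Hamming code: column j (1-based)
# is j written in binary, which makes error location trivial.
H = [[0, 0, 0, 1, 1, 1, 1],
     [0, 1, 1, 0, 0, 1, 1],
     [1, 0, 1, 0, 1, 0, 1]]

def syndrome(word):
    """Compute H·word over GF(2)."""
    return tuple(sum(h * b for h, b in zip(row, word)) % 2 for row in H)

def correct_single_error(word):
    """The syndrome, read as a binary number, is the 1-based position
    of the flipped bit (0 means the word is already a codeword)."""
    s = syndrome(word)
    pos = s[0] * 4 + s[1] * 2 + s[2]
    out = list(word)
    if pos:
        out[pos - 1] ^= 1
    return out

print(correct_single_error([0, 0, 0, 0, 1, 0, 0]))  # [0, 0, 0, 0, 0, 0, 0]
```

    In a GLDPC code, many such short local codes constrain overlapping subsets of the bits, and the global decoder iterates between them.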

  14. Manned systems utilization analysis (study 2.1). Volume 3: LOVES computer simulations, results, and analyses

    NASA Technical Reports Server (NTRS)

    Stricker, L. T.

    1975-01-01

    The LOVES computer program was employed to analyze the geosynchronous portion of NASA's 1973 automated satellite mission model from 1980 to 1990. The objectives of the analyses were: (1) to demonstrate the capability of the LOVES code to provide the depth and accuracy of data required to support the analyses; and (2) to trade off the concept of space servicing automated satellites composed of replaceable modules against the concept of replacing expendable satellites upon failure. The computer code proved to be an invaluable tool in analyzing the logistic requirements of the various test cases in the tradeoff. It is indicated that the concept of space servicing offers the potential for substantial savings in the cost of operating automated satellite systems.

  15. GOATS 2008 Autonomous, Adaptive Multistatic Acoustic Sensing

    DTIC Science & Technology

    2008-09-30

    To develop net-centric, autonomous underwater vehicle sensing concepts for littoral MCM and ASW, exploiting collaborative and environmentally … use of autonomous underwater vehicle networks as platforms for new sonar concepts exploring the full 3-D acoustic environment of shallow water (SW) and …

  16. Nodes and Codes: The Reality of Cyber Warfare

    DTIC Science & Technology

    2012-05-17

    Nodes and Codes explores the reality of cyber warfare through the story of Stuxnet, a string of weaponized code that reached through a domain...nodes. Stuxnet served as a proof-of-concept for cyber weapons and provided a comparative laboratory to study the reality of cyber warfare from the...military powers most often associated with advanced, offensive cyber attack capabilities. The reality of cyber warfare holds significant operational

  17. Simulator platform for fast reactor operation and safety technology demonstration

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Vilim, R. B.; Park, Y. S.; Grandy, C.

    2012-07-30

    A simulator platform for visualization and demonstration of innovative concepts in fast reactor technology is described. The objective is to make more accessible the workings of fast reactor technology innovations and to do so in a human factors environment that uses state-of-the-art visualization technologies. In this work the computer codes in use at Argonne National Laboratory (ANL) for the design of fast reactor systems are being integrated to run on this platform. This includes linking reactor systems codes with mechanical structures codes and using advanced graphics to depict the thermo-hydraulic-structure interactions that give rise to an inherently safe response to upsets. It also includes visualization of mechanical systems operation, including advanced concepts that make use of robotics for operations, in-service inspection, and maintenance.

  18. A Domain-Specific Terminology for Retinopathy of Prematurity and Its Applications in Clinical Settings.

    PubMed

    Zhang, Yinsheng; Zhang, Guoming

    2018-01-01

    A terminology (or coding system) is a formal set of controlled vocabulary in a specific domain. With a well-defined terminology, each concept in the target domain is assigned a unique code, which can be identified and processed across different medical systems in an unambiguous way. Although there are many well-known biomedical terminologies, there is currently no domain-specific terminology for ROP (retinopathy of prematurity). Based on a collection of historical ROP patients' data in the electronic medical record system, we extracted the most frequent terms in the domain and organized them into a hierarchical coding system, the ROP Minimal Standard Terminology, which contains 62 core concepts in 4 categories. This terminology has been successfully used to provide highly structured and semantically rich clinical data in several ROP-related applications.

  19. Evolvability Is an Evolved Ability: The Coding Concept as the Arch-Unit of Natural Selection.

    PubMed

    Janković, Srdja; Ćirković, Milan M

    2016-03-01

    Physical processes that characterize living matter are qualitatively distinct in that they involve encoding and transfer of specific types of information. Such information plays an active part in the control of events that are ultimately linked to the capacity of the system to persist and multiply. This algorithmicity of life is a key prerequisite for its Darwinian evolution, driven by natural selection acting upon stochastically arising variations of the encoded information. The concept of evolvability attempts to define the total capacity of a system to evolve new encoded traits under appropriate conditions, i.e., the accessible section of total morphological space. Since this is dependent on previously evolved regulatory networks that govern information flow in the system, evolvability itself may be regarded as an evolved ability. The way information is physically written, read and modified in living cells (the "coding concept") has not changed substantially during the whole history of the Earth's biosphere. This biosphere, be it alone or one of many, is, accordingly, itself a product of natural selection, since the overall evolvability conferred by its coding concept (nucleic acids as information carriers with the "rulebook of meanings" provided by codons, as well as all the subsystems that regulate various conditional information-reading modes) certainly played a key role in enabling this biosphere to survive up to the present, through alterations of planetary conditions, including at least five catastrophic events linked to major mass extinctions. We submit that, whatever the actual prebiotic physical and chemical processes may have been on our home planet, or may, in principle, occur at some time and place in the Universe, a particular coding concept, with its respective potential to give rise to a biosphere, or class of biospheres, of a certain evolvability, may itself be regarded as a unit (indeed the arch-unit) of natural selection.

  20. Tumor taxonomy for the developmental lineage classification of neoplasms

    PubMed Central

    Berman, Jules J

    2004-01-01

    Background The new "Developmental lineage classification of neoplasms" was described in a prior publication. The classification is simple (the entire hierarchy is described with just 39 classifiers), comprehensive (providing a place for every tumor of man), and consistent with recent attempts to characterize tumors by cytogenetic and molecular features. A taxonomy is a list of the instances that populate a classification. The taxonomy of neoplasia attempts to list every known term for every known tumor of man. Methods The taxonomy provides each concept with a unique code and groups synonymous terms under the same concept. A Perl script validated successive drafts of the taxonomy ensuring that: 1) each term occurs only once in the taxonomy; 2) each term occurs in only one tumor class; 3) each concept code occurs in one and only one hierarchical position in the classification; and 4) the file containing the classification and taxonomy is a well-formed XML (eXtensible Markup Language) document. Results The taxonomy currently contains 122,632 different terms encompassing 5,376 neoplasm concepts. Each concept has, on average, 23 synonyms. The taxonomy populates "The developmental lineage classification of neoplasms," and is available as an XML file, currently 9+ Megabytes in length. A representation of the classification/taxonomy listing each term followed by its code, followed by its full ancestry, is available as a flat-file, 19+ Megabytes in length. The taxonomy is the largest nomenclature of neoplasms, with more than twice the number of neoplasm names found in other medical nomenclatures, including the 2004 version of the Unified Medical Language System, the Systematized Nomenclature of Medicine Clinical Terminology, the National Cancer Institute's Thesaurus, and the International Classification of Diseases for Oncology.
Conclusions This manuscript describes a comprehensive taxonomy of neoplasia that collects synonymous terms under a unique code number and assigns each tumor to a single class within the tumor hierarchy. The entire classification and taxonomy are available as open access files (in XML and flat-file formats) with this article. PMID:15571625
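
    The four validation checks listed in Methods can be sketched in a few lines (a hypothetical Python re-implementation; the original used a Perl script), assuming the taxonomy is flattened into (term, tumor class, concept code) rows:

```python
import xml.etree.ElementTree as ET
from collections import Counter

def validate_taxonomy(rows):
    """rows: iterable of (term, tumor_class, concept_code) tuples.
    Returns violation messages for checks 1-3 described in Methods."""
    errors = []
    term_counts = Counter(term for term, _, _ in rows)           # check 1
    errors += [f"duplicate term: {t}" for t, n in term_counts.items() if n > 1]
    term_classes = {}                                            # check 2
    for term, cls, _ in rows:
        term_classes.setdefault(term, set()).add(cls)
    errors += [f"term in several classes: {t}"
               for t, cs in term_classes.items() if len(cs) > 1]
    code_classes = {}                                            # check 3
    for _, cls, code in rows:
        code_classes.setdefault(code, set()).add(cls)
    errors += [f"code in several positions: {c}"
               for c, cs in code_classes.items() if len(cs) > 1]
    return errors

def is_well_formed(xml_text: str) -> bool:                      # check 4
    try:
        ET.fromstring(xml_text)
        return True
    except ET.ParseError:
        return False
```

    The term and class names below are illustrative only, not entries from the published taxonomy.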

  1. Research Prototype: Automated Analysis of Scientific and Engineering Semantics

    NASA Technical Reports Server (NTRS)

    Stewart, Mark E. M.; Follen, Greg (Technical Monitor)

    2001-01-01

    Physical and mathematical formulae and concepts are fundamental elements of scientific and engineering software. These classical equations and methods are time tested, universally accepted, and relatively unambiguous. The existence of this classical ontology suggests an ideal problem for automated comprehension. The problem is further motivated by the pervasive use of scientific code and high code-development costs. To investigate code comprehension in this classical knowledge domain, a research prototype has been developed. The prototype incorporates scientific domain knowledge to recognize code properties (including units and physical and mathematical quantities). It also implements programming-language semantics to propagate these properties through the code. The prototype's ability to elucidate code and detect errors will be demonstrated with state-of-the-art scientific codes.
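
    The property-propagation idea can be sketched with units represented as dictionaries of dimension exponents (a minimal hypothetical illustration, not the prototype's implementation): arithmetic rules propagate units through expressions, and a mismatch in addition flags a semantic error.

```python
# Units as dimension-exponent dicts, e.g. m/s is {"m": 1, "s": -1}.

def mul(u, v):
    """Unit of a product: add the exponents of each dimension."""
    out = dict(u)
    for dim, e in v.items():
        out[dim] = out.get(dim, 0) + e
        if out[dim] == 0:
            del out[dim]       # drop cancelled dimensions
    return out

def add(u, v):
    """Unit of a sum: operands must agree, else a semantic error."""
    if u != v:
        raise TypeError(f"unit mismatch: {u} vs {v}")
    return u

METER = {"m": 1}
PER_SECOND = {"s": -1}
velocity = mul(METER, PER_SECOND)
print(velocity)  # {'m': 1, 's': -1}
```

    Propagating such annotations from a few declared primitive variables through every assignment is what lets a checker localize errors like adding a length to a time.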

  2. Tutorial on Reed-Solomon error correction coding

    NASA Technical Reports Server (NTRS)

    Geisel, William A.

    1990-01-01

    This tutorial attempts to provide a frank, step-by-step approach to Reed-Solomon (RS) error correction coding. RS encoding and RS decoding both with and without erasing code symbols are emphasized. There is no need to present rigorous proofs and extreme mathematical detail. Rather, the simple concepts of groups and fields, specifically Galois fields, are presented with a minimum of complexity. Before RS codes are presented, other block codes are presented as a technical introduction into coding. A primitive (15, 9) RS coding example is then completely developed from start to finish, demonstrating the encoding and decoding calculations and a derivation of the famous error-locator polynomial. The objective is to present practical information about Reed-Solomon coding in a manner such that it can be easily understood.
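
    The Galois-field arithmetic underlying a (15, 9) RS code can be sketched as follows, assuming the commonly used primitive polynomial x^4 + x + 1 for GF(16) (the tutorial's own choice of polynomial and notation may differ):

```python
# Arithmetic in GF(16) with the primitive polynomial x^4 + x + 1 (0b10011).
# Elements are 4-bit integers; addition is XOR.

def gf16_mul(a: int, b: int) -> int:
    """Carry-less multiply, reducing modulo x^4 + x + 1 as we go."""
    p = 0
    while b:
        if b & 1:
            p ^= a            # add current multiple (XOR = GF(2) addition)
        a <<= 1
        if a & 0b10000:       # degree reached 4: subtract the modulus
            a ^= 0b10011
        b >>= 1
    return p

# alpha = x = 0b0010 generates all 15 nonzero field elements, which is
# what makes the length-15 RS code possible.
alpha, powers, x = 0b0010, [], 1
for _ in range(15):
    powers.append(x)
    x = gf16_mul(x, alpha)
print(len(set(powers)))  # 15: alpha is primitive
```

    RS encoding and the error-locator computation in the tutorial are built entirely from this multiply plus XOR addition.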

  3. The Development of Concepts of Deviance in Children. Volumes I [and] II: The Development of Concepts of Handicaps: An Interview Study. Volume III: Coding Manual for Interviews about Concepts of Handicaps. Final Report.

    ERIC Educational Resources Information Center

    Budoff, Milton; And Others

    This three volume report presents findings from an interview study with 103 children and adults regarding their awareness and conceptions of handicapping conditions and from a followup study of preschool handicapped and nonhandicapped students. Volume I details the design and results of the interview study focusing on Ss in five age groups:…

  4. Assessing Teachers' Science Content Knowledge: A Strategy for Assessing Depth of Understanding

    NASA Astrophysics Data System (ADS)

    McConnell, Tom J.; Parker, Joyce M.; Eberhardt, Jan

    2013-06-01

    One of the characteristics of effective science teachers is a deep understanding of science concepts. The ability to identify, explain and apply concepts is critical in designing, delivering and assessing instruction. Because some teachers have not completed extensive courses in some areas of science, especially in middle and elementary grades, many professional development programs attempt to strengthen teachers' content knowledge. Assessing this content knowledge is challenging. Concept inventories are reliable and efficient, but do not reveal depth of knowledge. Interviews and observations are time-consuming. The Problem Based Learning Project for Teachers implemented a strategy that includes pre-post instruments in eight content strands that permits blind coding of responses and comparison across teachers and groups of teachers. The instruments include two types of open-ended questions that assess both general knowledge and the ability to apply Big Ideas related to specific science topics. The coding scheme is useful in revealing patterns in prior knowledge and learning, and identifying ideas that are challenging or not addressed by learning activities. The strengths and limitations of the scoring scheme are identified through comparison of the findings to case studies of four participating teachers from middle and elementary schools. The cases include examples of coded pre- and post-test responses to illustrate some of the themes seen in teacher learning. The findings raise questions for future investigation that can be conducted using analyses of the coded responses.

  5. Advanced technology development for image gathering, coding, and processing

    NASA Technical Reports Server (NTRS)

    Huck, Friedrich O.

    1990-01-01

    Three overlapping areas of research activities are presented: (1) Information theory and optimal filtering are extended to visual information acquisition and processing. The goal is to provide a comprehensive methodology for quantitatively assessing the end-to-end performance of image gathering, coding, and processing. (2) Focal-plane processing techniques and technology are developed to combine effectively image gathering with coding. The emphasis is on low-level vision processing akin to the retinal processing in human vision. (3) A breadboard adaptive image-coding system is being assembled. This system will be used to develop and evaluate a number of advanced image-coding technologies and techniques as well as research the concept of adaptive image coding.

  6. Personalisation: The Emerging "Revised" Code of Education?

    ERIC Educational Resources Information Center

    Hartley, David

    2007-01-01

    In England, a "revised" educational code appears to be emerging. It centres upon the concept of "personalisation". Its basis is less in educational theory, more in contemporary marketing theory. Personalisation can be regarded in two ways. First, it provides the rationale for a new mode of public-service delivery, one which…

  7. Enhancing Undergraduate Mathematics Curriculum via Coding Theory and Cryptography

    ERIC Educational Resources Information Center

    Aydin, Nuh

    2009-01-01

    The theory of error-correcting codes and cryptography are two relatively recent applications of mathematics to information and communication systems. The mathematical tools used in these fields generally come from algebra, elementary number theory, and combinatorics, including concepts from computational complexity. It is possible to introduce the…

  8. Beacon- and Schema-Based Method for Recognizing Algorithms from Students' Source Code

    ERIC Educational Resources Information Center

    Taherkhani, Ahmad; Malmi, Lauri

    2013-01-01

    In this paper, we present a method for recognizing algorithms from students programming submissions coded in Java. The method is based on the concept of "programming schemas" and "beacons". Schemas are high-level programming knowledge with detailed knowledge abstracted out, and beacons are statements that imply specific…

  9. Supporting Source Code Comprehension during Software Evolution and Maintenance

    ERIC Educational Resources Information Center

    Alhindawi, Nouh

    2013-01-01

    This dissertation addresses the problems of program comprehension to support the evolution of large-scale software systems. The research concerns how software engineers locate features and concepts along with categorizing changes within very large bodies of source code along with their versioned histories. More specifically, advanced Information…

  10. Representation of ophthalmology concepts by electronic systems: adequacy of controlled medical terminologies.

    PubMed

    Chiang, Michael F; Casper, Daniel S; Cimino, James J; Starren, Justin

    2005-02-01

To assess the adequacy of 5 controlled medical terminologies (International Classification of Diseases 9, Clinical Modification [ICD9-CM]; Current Procedural Terminology 4 [CPT-4]; Systematized Nomenclature of Medicine, Clinical Terms [SNOMED-CT]; Logical Observation Identifiers, Names, and Codes [LOINC]; Medical Entities Dictionary [MED]) for representing concepts in ophthalmology. Noncomparative case series. Twenty complete ophthalmology case presentations were sequentially selected from a publicly available ophthalmology journal. Each of the 20 cases was parsed into discrete concepts, and each concept was classified along 2 axes: (1) diagnosis, finding, or procedure and (2) ophthalmic or medical concept. Electronic or paper browsers were used to assign a code for every concept in each of the 5 terminologies. Adequacy of assignment for each concept was scored on a 3-point scale. Findings from all 20 case presentations were combined and compared based on a coverage score, which was the average score for all concepts in that terminology. Adequacy of assignment for concepts in each terminology was based on a 3-point Likert scale (0, no match; 1, partial match; 2, complete match). Cases were parsed into 1603 concepts. SNOMED-CT had the highest mean overall coverage score (1.625 ± 0.667), followed by MED (0.974 ± 0.764), LOINC (0.781 ± 0.929), ICD9-CM (0.280 ± 0.619), and CPT-4 (0.082 ± 0.337). SNOMED-CT also had higher coverage scores than any of the other terminologies for concepts in the diagnosis, finding, and procedure categories. Average coverage scores for ophthalmic concepts were lower than those for medical concepts. Controlled terminologies are required for electronic representation of ophthalmology data. SNOMED-CT had significantly higher content coverage than any other terminology in this study.
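The coverage score described above is just the mean of per-concept ratings on the 0/1/2 scale. A minimal sketch with invented ratings (not the study's data) makes the arithmetic concrete:

```python
# Hypothetical sketch of the coverage-score calculation: each parsed concept
# is rated 0 (no match), 1 (partial), or 2 (complete) per terminology, and
# the coverage score is the mean rating. Ratings below are made up.
from statistics import mean

ratings = {
    "SNOMED-CT": [2, 2, 1, 2, 1],
    "ICD9-CM":   [0, 1, 0, 0, 1],
}

coverage = {term: mean(r) for term, r in ratings.items()}
print(coverage)  # {'SNOMED-CT': 1.6, 'ICD9-CM': 0.4}
```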

  11. Construction of self-dual codes in the Rosenbloom-Tsfasman metric

    NASA Astrophysics Data System (ADS)

    Krisnawati, Vira Hari; Nisa, Anzi Lina Ukhtin

    2017-12-01

Linear codes are among the most basic objects in coding theory; generally, a linear code is a code over a finite field equipped with the Hamming metric. Among the most interesting families of codes, self-dual codes are particularly important because they include some of the best known error-correcting codes. The Hamming metric has been generalized to the Rosenbloom-Tsfasman metric (RT-metric), whose inner product differs from the Euclidean inner product used to define duality in the Hamming metric; consequently, most codes that are self-dual in the Hamming metric are not self-dual in the RT-metric. Because the generator matrix contains a basis of the code, it is central to any construction. In this paper, we give theorems and methods for constructing self-dual codes in the RT-metric by exploiting properties of the inner product and of the generator matrix, and we illustrate each kind of construction with examples.
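One common convention defines the RT dual via the "reversed" pairing ⟨x, y⟩ = Σᵢ xᵢ y₍n+1−i₎; assuming that convention (authors' conventions vary, and this is not necessarily the paper's construction), a brute-force self-duality check over GF(2) can be sketched as:

```python
# Brute-force RT self-duality check over GF(2). Assumes the reversed
# pairing <x,y> = sum_i x_i * y_{n-1-i} (mod 2) as the RT duality pairing.
import itertools

def rt_inner(x, y):
    """Reversed inner product mod 2 (assumed RT duality pairing)."""
    return sum(a * b for a, b in zip(x, reversed(y))) % 2

def codewords(G):
    """All GF(2) linear combinations of the rows of generator matrix G."""
    n = len(G[0])
    words = set()
    for coeffs in itertools.product([0, 1], repeat=len(G)):
        w = tuple(sum(c * row[j] for c, row in zip(coeffs, G)) % 2
                  for j in range(n))
        words.add(w)
    return words

def is_self_dual_rt(G):
    """Self-dual iff every pair of codewords is orthogonal and |C| = 2^(n/2)."""
    C = codewords(G)
    n = len(G[0])
    self_orthogonal = all(rt_inner(x, y) == 0 for x in C for y in C)
    return self_orthogonal and len(C) == 2 ** (n // 2)

print(is_self_dual_rt([[1, 1]]))  # True: {00, 11} is self-dual under this pairing
```

Note that codes which fail self-duality in the Hamming (Euclidean) sense can pass here, illustrating the abstract's point that the two dualities disagree.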

  12. Techniques for the analysis of data from coded-mask X-ray telescopes

    NASA Technical Reports Server (NTRS)

    Skinner, G. K.; Ponman, T. J.; Hammersley, A. P.; Eyles, C. J.

    1987-01-01

    Several techniques useful in the analysis of data from coded-mask telescopes are presented. Methods of handling changes in the instrument pointing direction are reviewed and ways of using FFT techniques to do the deconvolution considered. Emphasis is on techniques for optimally-coded systems, but it is shown that the range of systems included in this class can be extended through the new concept of 'partial cycle averaging'.
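The FFT-based deconvolution that coded-mask analysis relies on can be sketched in one dimension. This toy (random mask, source position, and array length are all invented for illustration, assuming a cyclic, optimally-coded geometry) shows the detector image as a convolution of sky and mask, and the reconstruction as a cross-correlation done with FFTs:

```python
# Toy 1-D coded-mask imaging: detector = sky (cyclically) convolved with the
# mask; reconstruction = cross-correlation with a balanced decoding array.
import numpy as np

rng = np.random.default_rng(0)
n = 128
mask = rng.integers(0, 2, n)             # open (1) / closed (0) mask elements
decoder = 2 * mask - 1                   # balanced +1/-1 decoding array

sky = np.zeros(n)
sky[5] = 1.0                             # a single point source at position 5

# detector counts: cyclic convolution of the sky with the mask pattern
detector = np.real(np.fft.ifft(np.fft.fft(sky) * np.fft.fft(mask)))

# reconstruction: cyclic cross-correlation with the decoding array via FFTs
recon = np.real(np.fft.ifft(np.fft.fft(detector) * np.conj(np.fft.fft(decoder))))
print(int(np.argmax(recon)))             # the reconstruction peaks at the source, 5
```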

  13. Asymmetric Memory Circuit Would Resist Soft Errors

    NASA Technical Reports Server (NTRS)

    Buehler, Martin G.; Perlman, Marvin

    1990-01-01

    Some nonlinear error-correcting codes more efficient in presence of asymmetry. Combination of circuit-design and coding concepts expected to make integrated-circuit random-access memories more resistant to "soft" errors (temporary bit errors, also called "single-event upsets" due to ionizing radiation). Integrated circuit of new type made deliberately more susceptible to one kind of bit error than to other, and associated error-correcting code adapted to exploit this asymmetry in error probabilities.
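A classic, generic illustration of exploiting error asymmetry (not the specific NASA circuit or code) is the Berger code: appending the count of 0-bits detects every unidirectional error pattern, such as the radiation-induced 1→0 upsets described above. A simplified sketch:

```python
# Simplified Berger-code sketch: the check symbol is the binary count of
# 0-bits. Any unidirectional error (all flips 1->0, or all 0->1) changes
# that count monotonically, so it is always detected.
def berger_encode(bits):
    check = bin(bits.count("0"))[2:]
    return bits, check

def berger_check(bits, check):
    return bin(bits.count("0"))[2:] == check

data, chk = berger_encode("1011010")
assert berger_check(data, chk)

corrupted = data.replace("1", "0", 2)   # two 1->0 soft errors
print(berger_check(corrupted, chk))     # False: the upset is caught
```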

  14. Internalism, Externalism and Coding

    ERIC Educational Resources Information Center

    Carr, Philip

    2007-01-01

    I examine some of the issues connected with the internalist/externalist distinction in work on the ontology of language. I note that Chomskyan radical internalism necessarily leads to a passive conception of child language acquisition. I reject that passive conception, and support current versions of constructivism [Tomasello, M., 2001. "The…

  15. Continua or Chimera?

    ERIC Educational Resources Information Center

    Booth, Tony

    1994-01-01

    This article looks at two concepts in the British 1993 draft Code of Practice concerning students with special needs: the concepts of a "continuum of needs" and a "continuum of provision." Issues involved in connecting the two continua are addressed, including whether service delivery decisions should be based on severity of…

  16. Promoting Transfer of Ecosystems Concepts

    ERIC Educational Resources Information Center

    Yu, Yawen; Hmelo-Silver, Cindy E.; Jordan, Rebecca; Eberbach, Catherine; Sinha, Suparna

    2016-01-01

    This study examines to what extent students transferred their knowledge from a familiar aquatic ecosystem to an unfamiliar rainforest ecosystem after participating in a technology-rich inquiry curriculum. We coded students' drawings for components of important ecosystems concepts at pre- and posttest. Our analysis examined the extent to which each…

  17. How a modified approach to dental coding can benefit personal and professional development with improved clinical outcomes.

    PubMed

    Lam, Raymond; Kruger, Estie; Tennant, Marc

    2014-12-01

One disadvantage of the remarkable achievements in dentistry is that treatment options have never been more varied or confusing. This has made the concept of Evidence-Based Dentistry more applicable to modern dental practice. Despite merit in the concept, whereby clinical decisions are guided by scientific evidence, there are problems with establishing a scientific base. Nowhere is this more challenging than in modern dentistry, where the gap between rapidly developing products/procedures and their evidence base is widening. Furthermore, the burden of oral disease continues to remain high at the population level. These problems have prompted new approaches to enhancing research. The aim of this paper is to outline how a modified approach to dental coding may benefit clinical and population level research. Using publicly accessible data obtained from the Australian Chronic Disease Dental Scheme and item codes contained within the Australian Schedule of Dental Services and Glossary, a suggested approach to dental informatics is illustrated. A selection of item codes was expanded with the addition of suffixes, which provide circumstantial information to assist in assessing clinical outcomes such as success rates and prognosis. The use of item codes in administering the CDDS yielded a large database of item codes. These codes are amenable to dental informatics, which has been shown to enhance research at both the clinical and population level. This is a cost effective method to supplement existing research methods. Copyright © 2014 Elsevier Inc. All rights reserved.

  18. An Experiment in Scientific Program Understanding

    NASA Technical Reports Server (NTRS)

    Stewart, Mark E. M.; Owen, Karl (Technical Monitor)

    2000-01-01

    This paper concerns a procedure that analyzes aspects of the meaning or semantics of scientific and engineering code. This procedure involves taking a user's existing code, adding semantic declarations for some primitive variables, and parsing this annotated code using multiple, independent expert parsers. These semantic parsers encode domain knowledge and recognize formulae in different disciplines including physics, numerical methods, mathematics, and geometry. The parsers will automatically recognize and document some static, semantic concepts and help locate some program semantic errors. Results are shown for three intensively studied codes and seven blind test cases; all test cases are state of the art scientific codes. These techniques may apply to a wider range of scientific codes. If so, the techniques could reduce the time, risk, and effort required to develop and modify scientific codes.

  19. 2002 CNA Code of Ethics: some recommendations.

    PubMed

    Kikuchi, June F

    2004-07-01

    The Canadian Nurses Association (CNA) recently revised its 1997 Code of Ethics for Registered Nurses to reflect the context within which nurses practise today. Given the unprecedented changes that have taken place within the profession, healthcare and society, it was timely for the CNA to review and revise its Code. But the revisions were relatively minor; important problematic, substantive aspects of the Code were essentially left untouched and persist in the updated 2002 Code. In this paper, three of those aspects are examined and discussed: the 2002 Code's (a) definition of health and well-being, (b) notion of respect and (c) conception of justice. Recommendations are made. It is hoped that these comments will encourage nurse leaders in Canada to initiate discussion of the Code now, in preparation for its next planned revision in 2007.

  20. With or without you: predictive coding and Bayesian inference in the brain

    PubMed Central

    Aitchison, Laurence; Lengyel, Máté

    2018-01-01

    Two theoretical ideas have emerged recently with the ambition to provide a unifying functional explanation of neural population coding and dynamics: predictive coding and Bayesian inference. Here, we describe the two theories and their combination into a single framework: Bayesian predictive coding. We clarify how the two theories can be distinguished, despite sharing core computational concepts and addressing an overlapping set of empirical phenomena. We argue that predictive coding is an algorithmic / representational motif that can serve several different computational goals of which Bayesian inference is but one. Conversely, while Bayesian inference can utilize predictive coding, it can also be realized by a variety of other representations. We critically evaluate the experimental evidence supporting Bayesian predictive coding and discuss how to test it more directly. PMID:28942084
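The simplest case where a predictive-coding update implements Bayesian inference is a one-dimensional Gaussian: precision-weighted prediction errors drive the estimate to the posterior mean. This toy (all numbers invented, and far simpler than the models the review discusses) is only meant to show that correspondence:

```python
# Toy Gaussian predictive-coding loop: gradient ascent on the log posterior
# using precision-weighted prediction errors converges to the Bayesian
# posterior mean. Illustrative numbers only.
prior_mean, prior_prec = 0.0, 1.0
obs, obs_prec = 2.0, 4.0

mu, lr = prior_mean, 0.05
for _ in range(2000):
    err_obs = obs - mu              # sensory prediction error
    err_prior = mu - prior_mean     # prior prediction error
    mu += lr * (obs_prec * err_obs - prior_prec * err_prior)

posterior_mean = (prior_prec * prior_mean + obs_prec * obs) / (prior_prec + obs_prec)
print(round(mu, 3), round(posterior_mean, 3))  # both 1.6
```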

  1. SYMTRAN - A Time-dependent Symmetric Tandem Mirror Transport Code

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hua, D; Fowler, T

    2004-06-15

A time-dependent version of the steady-state radial transport model in symmetric tandem mirrors in Ref. [1] has been coded up and first tests performed. Our code, named SYMTRAN, is an adaptation of the earlier SPHERE code for spheromaks, now modified for tandem mirror physics. Motivated by Post's new concept of kinetic stabilization of symmetric mirrors, it is an extension of the earlier TAMRAC rate-equation code omitting radial transport [2], which successfully accounted for experimental results in TMX. The SYMTRAN code differs from the earlier tandem mirror radial transport code TMT in that our code is focused on axisymmetric tandem mirrors and classical diffusion, whereas TMT emphasized non-ambipolar transport in TMX and MFTF-B due to yin-yang plugs and non-symmetric transitions between the plugs and axisymmetric center cell. Both codes exhibit interesting but different non-linear behavior.

  2. Base heating methodology improvements, volume 1

    NASA Technical Reports Server (NTRS)

    Bender, Robert L.; Reardon, John E.; Somers, Richard E.; Fulton, Michael S.; Smith, Sheldon D.; Pergament, Harold

    1992-01-01

This document is the final report for NASA MSFC Contract NAS8-38141. The contracted effort had the broad objective of improving launch vehicle ascent base-heating methodology, to simplify the determination of that environment for Advanced Launch System (ALS) concepts. It was pursued as an Advanced Development Plan (ADP) for the Joint DoD/NASA ALS program office, with project management assigned to NASA/MSFC. The original study was to be completed in 26 months beginning Sep. 1989. Because of several program changes and an emphasis on evolving launch vehicle concepts, the period of performance was extended to the current completion date of Nov. 1992. A computer code incorporating the methodology improvements into a quick prediction tool was developed and is operational for basic configuration and propulsion concepts. The code and its user's guide are also provided as part of the contract documentation. Background information describing the specific objectives, limitations, and goals of the contract is summarized. A brief chronology of the ALS/NLS program history is also presented to provide the reader with an overview of the many variables influencing the development of the code over the past three years.

  3. Chemical reacting flows

    NASA Technical Reports Server (NTRS)

    Mularz, Edward J.; Sockol, Peter M.

    1987-01-01

Future aerospace propulsion concepts involve the combustion of liquid or gaseous fuels in a highly turbulent internal air stream. Accurate predictive computer codes which can simulate the fluid mechanics, chemistry, and turbulence-combustion interaction of these chemical reacting flows will be a new tool that is needed in the design of these future propulsion concepts. Experimental and code development research is being performed at Lewis to better understand chemical reacting flows, with the long term goal of establishing these reliable computer codes. The approach to understanding chemical reacting flows is to look at separate simple parts of this complex phenomenon as well as to study the full turbulent reacting flow process. As a result, research on the fluid mechanics associated with chemical reacting flows was initiated. The chemistry of fuel-air combustion is also being studied. Finally, the phenomenon of turbulence-combustion interaction is being investigated. This presentation will highlight research, both experimental and analytical, in each of these three major areas.

  4. Chemical reacting flows

    NASA Technical Reports Server (NTRS)

    Mularz, Edward J.; Sockol, Peter M.

    1990-01-01

    Future aerospace propulsion concepts involve the combustion of liquid or gaseous fuels in a highly turbulent internal airstream. Accurate predictive computer codes which can simulate the fluid mechanics, chemistry, and turbulence-combustion interaction of these chemical reacting flows will be a new tool that is needed in the design of these future propulsion concepts. Experimental and code development research is being performed at LeRC to better understand chemical reacting flows with the long-term goal of establishing these reliable computer codes. Our approach to understand chemical reacting flows is to look at separate, more simple parts of this complex phenomenon as well as to study the full turbulent reacting flow process. As a result, we are engaged in research on the fluid mechanics associated with chemical reacting flows. We are also studying the chemistry of fuel-air combustion. Finally, we are investigating the phenomenon of turbulence-combustion interaction. Research, both experimental and analytical, is highlighted in each of these three major areas.

  5. NetCoDer: A Retransmission Mechanism for WSNs Based on Cooperative Relays and Network Coding

    PubMed Central

    Valle, Odilson T.; Montez, Carlos; Medeiros de Araujo, Gustavo; Vasques, Francisco; Moraes, Ricardo

    2016-01-01

    Some of the most difficult problems to deal with when using Wireless Sensor Networks (WSNs) are related to the unreliable nature of communication channels. In this context, the use of cooperative diversity techniques and the application of network coding concepts may be promising solutions to improve the communication reliability. In this paper, we propose the NetCoDer scheme to address this problem. Its design is based on merging cooperative diversity techniques and network coding concepts. We evaluate the effectiveness of the NetCoDer scheme through both an experimental setup with real WSN nodes and a simulation assessment, comparing NetCoDer performance against state-of-the-art TDMA-based (Time Division Multiple Access) retransmission techniques: BlockACK, Master/Slave and Redundant TDMA. The obtained results highlight that the proposed NetCoDer scheme clearly improves the network performance when compared with other retransmission techniques. PMID:27258280
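The core retransmission trick that network coding contributes (shown generically here, not NetCoDer's full cooperative scheme) is that a relay which overheard packets A and B can retransmit a single XOR-coded packet, letting a node missing A and a node missing B each recover their loss:

```python
# Generic XOR network-coding sketch: one coded retransmission repairs two
# different losses. Packet contents are invented placeholders.
def xor(a: bytes, b: bytes) -> bytes:
    return bytes(x ^ y for x, y in zip(a, b))

pkt_a, pkt_b = b"temp=21C", b"hum=0.40"   # two equal-length sensor packets
coded = xor(pkt_a, pkt_b)                 # the single coded retransmission

# a node that already holds B recovers A, and vice versa
assert xor(coded, pkt_b) == pkt_a
assert xor(coded, pkt_a) == pkt_b
print("both losses repaired by one transmission")
```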

  6. Life is physics and chemistry and communication.

    PubMed

    Witzany, Guenther

    2015-04-01

    Manfred Eigen extended Erwin Schroedinger's concept of "life is physics and chemistry" through the introduction of information theory and cybernetic systems theory into "life is physics and chemistry and information." Based on this assumption, Eigen developed the concepts of quasispecies and hypercycles, which have been dominant in molecular biology and virology ever since. He insisted that the genetic code is not just used metaphorically: it represents a real natural language. However, the basics of scientific knowledge changed dramatically within the second half of the 20th century. Unfortunately, Eigen ignored the results of the philosophy of science discourse on essential features of natural languages and codes: a natural language or code emerges from populations of living agents that communicate. This contribution will look at some of the highlights of this historical development and the results relevant for biological theories about life. © 2014 New York Academy of Sciences.

  7. Semantic Interoperability of Health Risk Assessments

    PubMed Central

    Rajda, Jay; Vreeman, Daniel J.; Wei, Henry G.

    2011-01-01

    The health insurance and benefits industry has administered Health Risk Assessments (HRAs) at an increasing rate. These are used to collect data on modifiable health risk factors for wellness and disease management programs. However, there is significant variability in the semantics of these assessments, making it difficult to compare data sets from the output of 2 different HRAs. There is also an increasing need to exchange this data with Health Information Exchanges and Electronic Medical Records. To standardize the data and concepts from these tools, we outline a process to determine presence of certain common elements of modifiable health risk extracted from these surveys. This information is coded using concept identifiers, which allows cross-survey comparison and analysis. We propose that using LOINC codes or other universal coding schema may allow semantic interoperability of a variety of HRA tools across the industry, research, and clinical settings. PMID:22195174

  8. Getting Started in Classroom Computing.

    ERIC Educational Resources Information Center

    Ahl, David H.

    Written for secondary students, this booklet provides an introduction to several computer-related concepts through a set of six classroom games, most of which can be played with little more than a sheet of paper and a pencil. The games are: 1) SECRET CODES--introduction to binary coding, punched cards, and paper tape; 2) GUESS--efficient methods…

  9. Historical and cultural aspects of the provision of care at an indigenous healthcare service facility.

    PubMed

    Ribeiro, Aridiane Alves; Arantes, Cássia Irene Spinelli; Gualda, Dulce Maria Rosa; Rossi, Lídia Aparecida

    2017-06-01

This case study aimed to interpret the underlying historical and cultural aspects of the provision of care at an indigenous healthcare service facility. This is interpretive, case-study research with a qualitative approach, conducted in 2012 at the Indigenous Health Support Center (CASAI) of the State of Mato Grosso do Sul, Brazil. Data were collected by means of systematic observation, documentary analysis and semi-structured interviews with ten health professionals. Data analysis was performed according to an approach based on social anthropology and health anthropology. The anthropological concepts of social code and ethnocentrism underpinned the interpretation of outcomes. Two categories were identified: CASAI, a space between streets and village; and Ethnocentrism and indigenous health care. Healthcare practice and the current social code influence each other. The street social code prevails in the social environment under study. The institutional organization and professionals' appreciation of the indigenous biological body are decisive to provision of care from the street social code perspective. Professionals' concepts evidence ethnocentrism in healthcare. Workers, however, try to adopt a relativized view vis-à-vis indigenous people at CASAI.

  10. Advanced Modulation and Coding Technology Conference

    NASA Technical Reports Server (NTRS)

    1992-01-01

    The objectives, approach, and status of all current LeRC-sponsored industry contracts and university grants are presented. The following topics are covered: (1) the LeRC Space Communications Program, and Advanced Modulation and Coding Projects; (2) the status of four contracts for development of proof-of-concept modems; (3) modulation and coding work done under three university grants, two small business innovation research contracts, and two demonstration model hardware development contracts; and (4) technology needs and opportunities for future missions.

  11. Music Handbook for Primary Grades.

    ERIC Educational Resources Information Center

    Bowman, Doris; And Others

    GRADES OR AGES: Primary grades (1, 2, and 3). SUBJECT MATTER: Music. ORGANIZATION AND PHYSICAL APPEARANCE: This guide contains a detailed outline of the basic music concepts for elementary grades with suggestions for activities which may develop understanding of the concepts. The pages of activities are color coded by grade level. There are three…

  12. Changing the Latitudes and Attitudes about Content Analysis Research

    ERIC Educational Resources Information Center

    Brank, Eve M.; Fox, Kathleen A.; Youstin, Tasha J.; Boeppler, Lee C.

    2008-01-01

    The current research employs the use of content analysis to teach research methods concepts among students enrolled in an upper division research methods course. Students coded and analyzed Jimmy Buffett song lyrics rather than using a downloadable database or collecting survey data. Students' knowledge of content analysis concepts increased after…

  13. [The concept of mental health deterioration in light of decisions by higher judicial bodies].

    PubMed

    Kaya, Ahsen; Aktaş, Ekin Özgür

    2014-01-01

Important arrangements were made in the Turkish Penal Code to protect individuals' sexual safety. In judgments of sexual crimes, expert medical testimony is usually used for evidence collection and for determining whether the crimes were aggravated. Consequently, judicial authorities frequently request reports from physicians in all fields of medicine in their daily clinical practice. Following implementation of the new Turkish Penal Code, the concept of mental health deterioration was frequently discussed, and it remains a debated topic in both law and medicine in relation to crimes against sexual immunity. It is believed that the subjects discussed in this article will provide important information for adult, child and adolescent mental health professionals, drawing attention to the importance of the medicolegal evaluations frequently requested from psychiatrists in daily clinical practice, and providing an evaluation of the concept of mental health deterioration in light of judicial decisions. Over the period from the code's initial application to the present, precedents have reduced uncertainty about how the concept must be evaluated and what it means. In this study, the decisions of Higher Judicial Bodies were reviewed, and the way the concept of mental health deterioration must be evaluated today, in accordance with those precedents, is presented.

  14. Genomics-Based Security Protocols: From Plaintext to Cipherprotein

    NASA Technical Reports Server (NTRS)

    Shaw, Harry; Hussein, Sayed; Helgert, Hermann

    2011-01-01

    The evolving nature of the internet will require continual advances in authentication and confidentiality protocols. Nature provides some clues as to how this can be accomplished in a distributed manner through molecular biology. Cryptography and molecular biology share certain aspects and operations that allow for a set of unified principles to be applied to problems in either venue. A concept for developing security protocols that can be instantiated at the genomics level is presented. A DNA (Deoxyribonucleic acid) inspired hash code system is presented that utilizes concepts from molecular biology. It is a keyed-Hash Message Authentication Code (HMAC) capable of being used in secure mobile Ad hoc networks. It is targeted for applications without an available public key infrastructure. Mechanics of creating the HMAC are presented as well as a prototype HMAC protocol architecture. Security concepts related to the implementation differences between electronic domain security and genomics domain security are discussed.
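A keyed-HMAC in the standard sense (RFC 2104) can be shown with Python's stdlib; the DNA-sequence-derived keys and hash construction are specific to the paper and not reproduced here, so the key below is an ordinary placeholder:

```python
# Standard keyed-HMAC sketch. In the paper's scheme the key material would be
# derived from genomic sequences; here it is just a placeholder secret.
import hmac, hashlib

key = b"shared-secret"            # placeholder; genomics-derived in the paper
msg = b"ad hoc network frame"

tag = hmac.new(key, msg, hashlib.sha256).hexdigest()

# a receiver holding the same key recomputes and compares in constant time
ok = hmac.compare_digest(tag, hmac.new(key, msg, hashlib.sha256).hexdigest())
print(ok)  # True: the frame authenticates
```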

  15. Link performance optimization for digital satellite broadcasting systems

    NASA Astrophysics Data System (ADS)

    de Gaudenzi, R.; Elia, C.; Viola, R.

The authors introduce the concept of digital direct satellite broadcasting (D-DBS), which allows unprecedented flexibility by providing a large number of audiovisual services. The concept assumes an information rate of 40 Mb/s, which is compatible with practically all present-day transponders. After discussion of the general system concept, the results of transmission system optimization are presented. Channel and interference effects are taken into account. Numerical results show that the scheme with the best performance is trellis-coded 8-PSK (phase shift keying) modulation concatenated with a Reed-Solomon block code. For a net data rate of 40 Mb/s a bit error rate of 10^-10 can be achieved with an equivalent bit energy to noise density of 9.5 dB, including channel, interference, and demodulator impairments. A link budget analysis shows how a medium-power direct-to-home TV satellite can provide multimedia services to users equipped with small (60-cm) dish antennas.
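Using only the two figures quoted in the abstract (9.5 dB Eb/N0 and 40 Mb/s), the implied carrier-to-noise-density requirement follows from the standard relation C/N0 = Eb/N0 + 10·log10(R):

```python
# Back-of-envelope link-budget arithmetic from the quoted figures.
import math

ebn0_db = 9.5          # equivalent bit energy to noise density, dB
rate_bps = 40e6        # 40 Mb/s net data rate

cn0_dbhz = ebn0_db + 10 * math.log10(rate_bps)
print(round(cn0_dbhz, 1))  # 85.5 dB-Hz required C/N0
```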

  16. Three-Dimensional Algebraic Models of the tRNA Code and 12 Graphs for Representing the Amino Acids.

    PubMed

    José, Marco V; Morgado, Eberto R; Guimarães, Romeu Cardoso; Zamudio, Gabriel S; de Farías, Sávio Torres; Bobadilla, Juan R; Sosa, Daniela

    2014-08-11

    Three-dimensional algebraic models, also called Genetic Hotels, are developed to represent the Standard Genetic Code, the Standard tRNA Code (S-tRNA-C), and the Human tRNA code (H-tRNA-C). New algebraic concepts are introduced to be able to describe these models, to wit, the generalization of the 2n-Klein Group and the concept of a subgroup coset with a tail. We found that the H-tRNA-C displayed broken symmetries in regard to the S-tRNA-C, which is highly symmetric. We also show that there are only 12 ways to represent each of the corresponding phenotypic graphs of amino acids. The averages of statistical centrality measures of the 12 graphs for each of the three codes are carried out and they are statistically compared. The phenotypic graphs of the S-tRNA-C display a common triangular prism of amino acids in 10 out of the 12 graphs, whilst the corresponding graphs for the H-tRNA-C display only two triangular prisms. The graphs exhibit disjoint clusters of amino acids when their polar requirement values are used. We contend that the S-tRNA-C is in a frozen-like state, whereas the H-tRNA-C may be in an evolving state.
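One common way to realize the algebraic picture is to identify each base with an element of Z2 × Z2, so a codon lives in (Z2)^6, where the Klein-group property (every element is its own inverse) holds componentwise. The particular base assignment below is an assumption for illustration; the paper fixes its own:

```python
# Illustrative codon -> (Z2)^6 embedding. The base-to-bit-pair assignment is
# one arbitrary choice, not necessarily the one used in the paper.
BASE = {"C": (0, 0), "A": (0, 1), "U": (1, 0), "G": (1, 1)}

def codon_vector(codon):
    """Concatenate the 2-bit images of the three bases: a point of (Z2)^6."""
    return tuple(bit for b in codon for bit in BASE[b])

def add(u, v):
    """Componentwise addition mod 2 in (Z2)^6."""
    return tuple((a + b) % 2 for a, b in zip(u, v))

v = codon_vector("AUG")
print(v)                        # (0, 1, 1, 0, 1, 1)
assert add(v, v) == (0,) * 6    # self-inverse: the Klein-group property
```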

  17. The development of an intelligent interface to a computational fluid dynamics flow-solver code

    NASA Technical Reports Server (NTRS)

    Williams, Anthony D.

    1988-01-01

    Researchers at NASA Lewis are currently developing an 'intelligent' interface to aid in the development and use of large, computational fluid dynamics flow-solver codes for studying the internal fluid behavior of aerospace propulsion systems. This paper discusses the requirements, design, and implementation of an intelligent interface to Proteus, a general purpose, 3-D, Navier-Stokes flow solver. The interface is called PROTAIS to denote its introduction of artificial intelligence (AI) concepts to the Proteus code.

  19. CFD validation needs for advanced concepts at Northrop Corporation

    NASA Technical Reports Server (NTRS)

    George, Michael W.

    1987-01-01

    Information is given in viewgraph form on the Computational Fluid Dynamics (CFD) Workshop held July 14 - 16, 1987. Topics covered include the philosophy of CFD validation, current validation efforts, the wing-body-tail Euler code, F-20 Euler simulated oil flow, and Euler Navier-Stokes code validation for 2D and 3D nozzle afterbody applications.

  20. Functions of Code-Switching among Iranian Advanced and Elementary Teachers and Students

    ERIC Educational Resources Information Center

    Momenian, Mohammad; Samar, Reza Ghafar

    2011-01-01

This paper reports on the findings of a study carried out on the advanced and elementary teachers' and students' functions and patterns of code-switching in Iranian English classrooms. This concept has been examined less in L2 (second language) classroom contexts than in natural contexts outside the classroom. Therefore, besides reporting on the…

  1. Ethics and the Early Childhood Teacher Educator: A Proposed Addendum to the NAEYC Code of Ethical Conduct.

    ERIC Educational Resources Information Center

    Freeman, Nancy; Feeney, Stephanie; Moravcik, Eva

    2003-01-01

    Proposes an addendum to the National Association for the Education of Young Children's Code of Ethical Conduct concerning the unique ethical challenges facing teacher educators. Presents a conception of professional responsibility in six areas: children and families, adult students, programs hosting practicum students and programs' staffs and…

  2. Neurobehavioral Assessment from Fetus to Infant: The NICU Network Neurobehavioral Scale and the Fetal Neurobehavior Coding Scale

    ERIC Educational Resources Information Center

    Salisbury, Amy L.; Fallone, Melissa Duncan; Lester, Barry

    2005-01-01

    This review provides an overview and definition of the concept of neurobehavior in human development. Two neurobehavioral assessments used by the authors in current fetal and infant research are discussed: the NICU Network Neurobehavioral Assessment Scale and the Fetal Neurobehavior Coding System. This review will present how the two assessments…

  3. 78 FR 47028 - Exchange Traded Concepts, LLC, et al.; Notice of Application

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-08-02

    ..., and receive securities from, the series in connection with the purchase and redemption of Creation... similar Inside Information Policy. In accordance with the Code of Ethics \\12\\ and Inside Information... code of ethics pursuant to rule 17j-1 under the Act and Rule 204A-1 under the Advisers Act, which...

  4. Toward Semantic Interoperability in Home Health Care: Formally Representing OASIS Items for Integration into a Concept-oriented Terminology

    PubMed Central

    Choi, Jeungok; Jenkins, Melinda L.; Cimino, James J.; White, Thomas M.; Bakken, Suzanne

    2005-01-01

    Objective: The authors aimed to (1) formally represent OASIS-B1 concepts using the Logical Observation Identifiers, Names, and Codes (LOINC) semantic structure; (2) demonstrate integration of OASIS-B1 concepts into a concept-oriented terminology, the Medical Entities Dictionary (MED); (3) examine potential hierarchical structures within LOINC among OASIS-B1 and other nursing terms; and (4) illustrate a Web-based implementation for OASIS-B1 data entry using Dialogix, a software tool with a set of functions that supports complex data entry. Design and Measurements: Two hundred nine OASIS-B1 items were dissected into the six elements of the LOINC semantic structure and then integrated into the MED hierarchy. Each OASIS-B1 term was matched to LOINC-coded nursing terms, Home Health Care Classification, the Omaha System, and the Sign and Symptom Check-List for Persons with HIV, and the extent of the match was judged based on a scale of 0 (no match) to 4 (exact match). OASIS-B1 terms were implemented as a Web-based survey using Dialogix. Results: Of 209 terms, 204 were successfully dissected into the elements of the LOINC semantic structure and integrated into the MED with minor revisions of MED semantics. One hundred fifty-one OASIS-B1 terms were mapped to one or more of the LOINC-coded nursing terms. Conclusion: The LOINC semantic structure offers a standard way to add home health care data to a comprehensive patient record to facilitate data sharing for monitoring outcomes across sites and to further terminology management, decision support, and accurate information retrieval for evidence-based practice. The cross-mapping results support the possibility of a hierarchical structure of the OASIS-B1 concepts within nursing terminologies in the LOINC database. PMID:15802480
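
    The dissect-and-match procedure described above can be sketched in a few lines. The six element names follow the LOINC semantic structure, but the example terms and the rule converting element-wise agreement into the study's 0 (no match) to 4 (exact match) scale are illustrative assumptions, not the authors' actual algorithm.

```python
# Hypothetical sketch: dissect a term into the six LOINC semantic elements
# and judge how closely two terms match on a 0-4 scale, as in the study.
# The scoring rule below is an assumption for illustration.

LOINC_ELEMENTS = ("component", "property", "timing", "system", "scale", "method")

def dissect(component, prop, timing, system, scale, method):
    return dict(zip(LOINC_ELEMENTS, (component, prop, timing, system, scale, method)))

def match_score(term_a, term_b):
    """Return 0-4: 4 when all six elements agree, scaled down otherwise."""
    agree = sum(term_a[e] == term_b[e] for e in LOINC_ELEMENTS)
    return round(4 * agree / len(LOINC_ELEMENTS))

# invented example terms, not actual OASIS-B1/HHCC entries
oasis = dissect("ability to dress upper body", "finding", "point in time",
                "patient", "ordinal", "observed")
hhcc = dissect("ability to dress upper body", "finding", "point in time",
               "patient", "ordinal", "reported")
print(match_score(oasis, oasis))  # exact match -> 4
print(match_score(oasis, hhcc))   # five of six elements agree -> 3
```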

  5. Toward semantic interoperability in home health care: formally representing OASIS items for integration into a concept-oriented terminology.

    PubMed

    Choi, Jeungok; Jenkins, Melinda L; Cimino, James J; White, Thomas M; Bakken, Suzanne

    2005-01-01

    The authors aimed to (1) formally represent OASIS-B1 concepts using the Logical Observation Identifiers, Names, and Codes (LOINC) semantic structure; (2) demonstrate integration of OASIS-B1 concepts into a concept-oriented terminology, the Medical Entities Dictionary (MED); (3) examine potential hierarchical structures within LOINC among OASIS-B1 and other nursing terms; and (4) illustrate a Web-based implementation for OASIS-B1 data entry using Dialogix, a software tool with a set of functions that supports complex data entry. Two hundred nine OASIS-B1 items were dissected into the six elements of the LOINC semantic structure and then integrated into the MED hierarchy. Each OASIS-B1 term was matched to LOINC-coded nursing terms, Home Health Care Classification, the Omaha System, and the Sign and Symptom Check-List for Persons with HIV, and the extent of the match was judged based on a scale of 0 (no match) to 4 (exact match). OASIS-B1 terms were implemented as a Web-based survey using Dialogix. Of 209 terms, 204 were successfully dissected into the elements of the LOINC semantic structure and integrated into the MED with minor revisions of MED semantics. One hundred fifty-one OASIS-B1 terms were mapped to one or more of the LOINC-coded nursing terms. The LOINC semantic structure offers a standard way to add home health care data to a comprehensive patient record to facilitate data sharing for monitoring outcomes across sites and to further terminology management, decision support, and accurate information retrieval for evidence-based practice. The cross-mapping results support the possibility of a hierarchical structure of the OASIS-B1 concepts within nursing terminologies in the LOINC database.

  6. "The City Snuffs out Nature": Young People's Conceptions of and Relationship with Nature

    ERIC Educational Resources Information Center

    Pointon, Pam

    2014-01-01

    This paper reports a study of 384 13-14-year olds' written responses to open-ended questions about their understanding of and relationship with "nature." Using constant comparative method the responses were coded, categorised and themed. Most students held scientific conceptions of nature (excluding humans) and a utilitarian relationship…

  7. Concepts of Healthful Food among Low-Income African American Women

    ERIC Educational Resources Information Center

    Lynch, Elizabeth B.; Holmes, Shane; Keim, Kathryn; Koneman, Sylvia A.

    2012-01-01

    Objective: Describe beliefs about what makes foods healthful among low-income African American women. Methods: In one-on-one interviews, 28 low-income African American mothers viewed 30 pairs of familiar foods and explained which food in the pair was more healthful and why. Responses were grouped into codes describing concepts of food…

  8. Contagious Ideas: Vulnerability, Epistemic Injustice and Counter-Terrorism in Education

    ERIC Educational Resources Information Center

    O'Donnell, Aislinn

    2018-01-01

    The article addresses the implications of Prevent and Channel for epistemic justice. The first section outlines the background of Prevent. It draws upon Moira Gatens and Genevieve Lloyd's concept of the collective imaginary, alongside Lorraine Code's concept of epistemologies of mastery, in order to outline some of the images and imaginaries that…

  9. The ICF and Postsurgery Occupational Therapy after Traumatic Hand Injury

    ERIC Educational Resources Information Center

    Fitinghoff, Helene; Lindqvist, Birgitta; Nygard, Louise; Ekholm, Jan; Schult, Marie-Louise

    2011-01-01

    Recent studies have examined the effectiveness of hand rehabilitation programmes and have linked the outcomes to the concept of ICF but not to specific ICF category codes. The objective of this study was to gain experience using ICF concepts to describe occupational therapy interventions during postsurgery hand rehabilitation, and to describe…

  10. Effect of sexed semen on conception rate for Holsteins in the United States

    USDA-ARS?s Scientific Manuscript database

    Effect of sexed-semen breedings on conception rate was investigated using US Holstein field data from January 2006 through October 2008. Sexed-semen breeding status was determined by a National Association of Animal Breeders’ 500-series marketing code or by individual breeding information in a cow o...

  11. How to identify up to 30 colors without training: color concept retrieval by free color naming

    NASA Astrophysics Data System (ADS)

    Derefeldt, Gunilla A. M.; Swartling, Tiina

    1994-05-01

    Used as a redundant code, color is shown to be advantageous in visual search tasks. It enhances attention, detection, and recall of information. Neuropsychological and neurophysiological findings have shown color and spatial perception to be interrelated functions. Studies on eye movements show that colored symbols are easier to detect and that eye fixations are more correctly directed to color-coded symbols. Usually between 5 and 15 colors have been found useful in classification tasks, but this number can be increased to between 20 and 30 by careful selection of colors, and by a subject's practice with the identification task and familiarity with the particular colors. Recent neurophysiological findings concerning the language-concept connection in color suggest that color concept retrieval would be enhanced by free color naming or by the use of natural associations between color concepts and color words. To test this hypothesis, we had subjects give their own free associations to a set of 35 colors presented on a display. They were able to identify as many as 30 colors without training.

  12. Parametric Weight Comparison of Current and Proposed Thermal Protection System (TPS) Concepts

    NASA Technical Reports Server (NTRS)

    Myers, David E.; Martin, Carl J.; Blosser, Max L.

    1999-01-01

    A parametric weight assessment of advanced metallic panel, ceramic blanket, and ceramic tile thermal protection systems (TPS) was conducted using an implicit, one-dimensional (1-D) thermal finite element sizing code. This sizing code contained models to account for coatings, fasteners, adhesives, and strain isolation pads. Atmospheric entry heating profiles for two vehicles, the Access to Space (ATS) rocket-powered single-stage-to-orbit (SSTO) vehicle and a proposed Reusable Launch Vehicle (RLV), were used to ensure that the trends were not unique to a particular trajectory. Eight TPS concepts were compared for a range of applied heat loads and substructural heat capacities to identify general trends. This study found the blanket TPS concepts have the lightest weights over the majority of their applicable ranges, and current technology ceramic tiles and metallic TPS concepts have similar weights. A proposed, state-of-the-art metallic system which uses a higher temperature alloy and efficient multilayer insulation was predicted to be significantly lighter than the ceramic tile systems and approaches blanket TPS weights for higher integrated heat loads.
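
    A minimal sketch of the kind of implicit, one-dimensional thermal solver the abstract describes: backward-Euler conduction through a TPS stack, solved with a tridiagonal (Thomas) pass. The material properties, mesh, and applied surface heating are placeholder values, not data from the NASA sizing code.

```python
# Sketch (not the NASA code): implicit 1-D conduction via finite differences.

def thomas(a, b, c, d):
    """Solve a tridiagonal system: a = sub-, b = main, c = super-diagonal."""
    n = len(b)
    cp, dp = [0.0] * n, [0.0] * n
    cp[0], dp[0] = c[0] / b[0], d[0] / b[0]
    for i in range(1, n):
        m = b[i] - a[i] * cp[i - 1]
        cp[i] = c[i] / m
        dp[i] = (d[i] - a[i] * dp[i - 1]) / m
    x = [0.0] * n
    x[-1] = dp[-1]
    for i in range(n - 2, -1, -1):
        x[i] = dp[i] - cp[i] * x[i + 1]
    return x

def step(T, alpha, dx, dt, q_surface, k):
    """Advance one backward-Euler step; heat flux q applied at node 0,
    adiabatic (mirrored) back face at the last node."""
    n = len(T)
    r = alpha * dt / dx**2
    a = [-r] * n; b = [1 + 2 * r] * n; c = [-r] * n
    d = list(T)
    b[0] = 1 + r
    d[0] += dt * q_surface / (dx * (k / alpha))  # k/alpha = rho*cp
    b[-1] = 1 + r
    return thomas(a, b, c, d)

T = [300.0] * 11  # initial temperature, K (placeholder values throughout)
for _ in range(100):
    T = step(T, alpha=1e-6, dx=0.005, dt=1.0, q_surface=5e3, k=0.1)
print(round(T[0], 1), round(T[-1], 1))  # surface heats faster than back face
```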

  13. Vector processing efficiency of plasma MHD codes by use of the FACOM 230-75 APU

    NASA Astrophysics Data System (ADS)

    Matsuura, T.; Tanaka, Y.; Naraoka, K.; Takizuka, T.; Tsunematsu, T.; Tokuda, S.; Azumi, M.; Kurita, G.; Takeda, T.

    1982-06-01

    In the framework of pipelined vector architecture, the efficiency of vector processing is assessed with respect to plasma MHD codes in nuclear fusion research. By using a vector processor, the FACOM 230-75 APU, the limit of the enhancement factor due to parallelism of current vector machines is examined for three numerical codes based on a fluid model. Reasonable speed-up factors of approximately 6, 6, and 4 times faster than the highly optimized scalar version are obtained for ERATO (linear stability code), AEOLUS-R1 (nonlinear stability code) and APOLLO (1-1/2D transport code), respectively. Problems of the pipelined vector processors are discussed from the viewpoint of restructuring, optimization and choice of algorithms. In conclusion, the important concept of "concurrency within pipelined parallelism" is emphasized.
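
    The "limit of the enhancement factor due to parallelism" can be illustrated with Amdahl's law: if only a fraction f of a code vectorizes, no pipeline speed can push the overall speed-up past 1/(1-f). The fractions below are invented for illustration, not measurements from ERATO, AEOLUS-R1, or APOLLO.

```python
# Amdahl's law: overall speed-up when a fraction f of the work runs s times
# faster. The bound 1/(1-f) holds for any s, however deep the pipeline.

def amdahl_speedup(f, s):
    return 1.0 / ((1.0 - f) + f / s)

for f in (0.90, 0.95):  # made-up vectorizable fractions
    print(f, round(amdahl_speedup(f, 20.0), 2), "limit:", round(1 / (1 - f), 1))
```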

  14. Conceptual Underpinnings of the Quality of Life in Neurological Disorders (Neuro-QoL): Comparisons of Core Sets for Stroke, Multiple Sclerosis, Spinal Cord Injury, and Traumatic Brain Injury.

    PubMed

    Wong, Alex W K; Lau, Stephen C L; Fong, Mandy W M; Cella, David; Lai, Jin-Shei; Heinemann, Allen W

    2018-04-03

    To determine the extent to which the content of the Quality of Life in Neurological Disorders (Neuro-QoL) covers the International Classification of Functioning, Disability and Health (ICF) Core Sets for multiple sclerosis (MS), stroke, spinal cord injury (SCI), and traumatic brain injury (TBI) using summary linkage indicators. Content analysis by linking content of the Neuro-QoL to corresponding ICF codes of each Core Set for MS, stroke, SCI, and TBI. Three academic centers. None. None. Four summary linkage indicators proposed by MacDermid et al were estimated to compare the content coverage between Neuro-QoL and the ICF codes of Core Sets for MS, stroke, SCI, and TBI. Neuro-QoL represented 20% to 30% of Core Set codes across conditions; more codes were covered in the Core Sets for MS (29%), stroke (28%), and TBI (28%) than in those for SCI in the long-term (20%) and early postacute (19%) contexts. Neuro-QoL represented nearly half of the unique Activity and Participation codes (43%-49%) and less than one third of the unique Body Function codes (12%-32%). It represented fewer Environmental Factors codes (2%-6%) and no Body Structures codes. Absolute linkage indicators found that at least 60% of Neuro-QoL items were linked to Core Set codes (63%-95%), but many items covered the same codes as revealed by unique linkage indicators (7%-13%), suggesting high concept redundancy among items. The Neuro-QoL links more closely to ICF Core Sets for stroke, MS, and TBI than to those for SCI, and primarily covers activity and participation ICF domains. Other instruments are needed to address concepts not measured by the Neuro-QoL when a comprehensive health assessment is needed. Copyright © 2018 American Congress of Rehabilitation Medicine. Published by Elsevier Inc. All rights reserved.
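
    The absolute and unique summary linkage indicators can be sketched as simple set computations: the absolute indicator asks what share of instrument items link to any Core Set code, while the unique indicator asks what share of distinct Core Set codes are covered. The items and ICF codes below are invented, and the functions are a plausible reading of the indicators rather than the MacDermid et al computation itself.

```python
# Sketch of two summary linkage indicators over invented item-code links.

def linkage_indicators(item_links, core_set):
    """item_links: dict item -> set of ICF codes it links to."""
    linked_items = [i for i, codes in item_links.items() if codes & core_set]
    covered = set().union(*item_links.values()) & core_set
    absolute = 100.0 * len(linked_items) / len(item_links)  # % items linked
    unique = 100.0 * len(covered) / len(core_set)           # % codes covered
    return absolute, unique

core = {"b130", "b152", "d450", "d550", "e310"}  # hypothetical Core Set
items = {
    "fatigue":  {"b130"},
    "mood":     {"b152"},
    "walking":  {"d450"},
    "walk-alt": {"d450"},  # redundant: covers the same code as "walking"
}
absolute, unique = linkage_indicators(items, core)
print(absolute, unique)  # all items link, but the distinct codes overlap
```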

  15. Interoperability and different ways of knowing: How semantics can aid in cross-cultural understanding

    NASA Astrophysics Data System (ADS)

    Pulsifer, P. L.; Parsons, M. A.; Duerr, R. E.; Fox, P. A.; Khalsa, S. S.; McCusker, J. P.; McGuinness, D. L.

    2012-12-01

    To address interoperability, we first need to understand how human perspectives and worldviews influence the way people conceive of and describe geophysical phenomena. There is never a single, unambiguous description of a phenomenon - the terminology used is based on the relationship people have with it and what their interests are. So how can these perspectives be reconciled in a way that is not only clear to different people but also formally described so that information systems can interoperate? In this paper we explore conceptions of Arctic sea ice as a means of exploring these issues. We examine multiple conceptions of sea ice and related processes as fundamental components of the Earth system. Arctic sea ice is undergoing rapid and dramatic decline. This will have huge impact on climate and biological systems as well as on shipping, exploration, human culture, and geopolitics. Local hunters, operational shipping forecasters, global climate researchers, and others have critical needs for sea ice data and information, but they conceive of, and describe sea ice phenomena in very different ways. Our hypothesis is that formally representing these diverse conceptions in a suite of formal ontologies can help facilitate sharing of information across communities and enhance overall Arctic data interoperability. We present initial work to model operational, research, and Indigenous (Iñupiat and Yup'ik) concepts of sea ice phenomena and data. Our results illustrate important and surprising differences in how these communities describe and represent sea ice, and we describe our approach to resolving incongruities and inconsistencies. We begin by exploring an intriguing information artifact, the World Meteorological Organization "egg code". The egg code is a compact, information rich way of illustrating detailed ice conditions that has been used broadly for a century. 
There is much agreement on construction and content encoding, but there are important regional differences in its application. Furthermore, it is an analog encoding scheme whose meaning has evolved over time. By semantically modeling the egg code, its subtle variations, and how it connects to other data, we illustrate a mechanism for translating across data formats and representations. But there are limits to what semantically modeling the egg code can achieve. The egg code and common operational sea ice formats do not address community needs, notably the timing and processes of sea ice freeze-up and break-up, which have a profound impact on local hunting, shipping, oil exploration, and safety. We work with local experts from four very different Indigenous communities and scientific creators of sea ice forecasts to establish an understanding of concepts and terminology related to fall freeze-up and spring break-up from the individually represented regions. This helps expand our conceptions of sea ice while also aiding in understanding across cultures and communities, and in passing knowledge to younger generations. This is an early step to expanding concepts of interoperability to very different ways of knowing to make data truly relevant and locally useful.
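
    A deliberately simplified sketch of what semantically modeling one egg code field can look like: total ice concentration, reported in tenths, is mapped to an explicit numeric interval so that reports written under different conventions can be compared. The interval table is a plausible reading of WMO practice, assumed here for illustration rather than taken from the authors' ontologies.

```python
# Hypothetical mapping of total-concentration codes (in tenths) to intervals.
# Codes and intervals are assumptions for illustration, not an authoritative
# WMO encoding.

CONCENTRATION = {
    "0":  (0.0, 0.0),  # ice free
    "1":  (0.1, 0.1),
    "13": (0.1, 0.3),  # digit pairs denote ranges, e.g. "13" = 1-3 tenths
    "46": (0.4, 0.6),
    "79": (0.7, 0.9),
    "91": (0.9, 1.0),  # 9+ tenths
    "10": (1.0, 1.0),  # consolidated ice
}

def overlaps(code_a, code_b):
    """Could two concentration reports describe the same conditions?"""
    (lo_a, hi_a), (lo_b, hi_b) = CONCENTRATION[code_a], CONCENTRATION[code_b]
    return lo_a <= hi_b and lo_b <= hi_a

print(overlaps("13", "46"))  # False: 1-3 tenths vs 4-6 tenths cannot coincide
print(overlaps("91", "10"))  # True: a 9+ report may describe full cover
```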

  16. National Combustion Code Validated Against Lean Direct Injection Flow Field Data

    NASA Technical Reports Server (NTRS)

    Iannetti, Anthony C.

    2003-01-01

    Most combustion processes have, in some way or another, a recirculating flow field. This recirculation stabilizes the reaction zone, or flame, but an unnecessarily large recirculation zone can result in high nitrogen oxide (NOx) values for combustion systems. The size of this recirculation zone is crucial to the performance of state-of-the-art, low-emissions hardware. If this is a large-scale combustion process, the flow field will probably be turbulent and, therefore, three-dimensional. This research dealt primarily with flow fields resulting from lean direct injection (LDI) concepts, as described in Research & Technology 2001. LDI is a concept that depends heavily on the design of the swirler. The LDI concept has the potential to reduce NOx values from 50 to 70 percent of current values, with good flame stability characteristics. It is cost effective and (hopefully) beneficial to do most of the design work for an LDI swirler using computer-aided design (CAD) and computer-aided engineering (CAE) tools. Computational fluid dynamics (CFD) codes are CAE tools that can calculate three-dimensional flows in complex geometries. However, CFD codes are only beginning to correctly calculate the flow fields for complex devices, and the related combustion models usually remove a large portion of the flow physics.

  17. Galen: a third generation terminology tool to support a multipurpose national coding system for surgical procedures.

    PubMed

    Trombert-Paviot, B; Rodrigues, J M; Rogers, J E; Baud, R; van der Haring, E; Rassinoux, A M; Abrial, V; Clavel, L; Idir, H

    1999-01-01

    GALEN has developed a new generation of terminology tools based on a language-independent concept reference model using a compositional formalism that allows computer processing and multiple reuses. During the 4th framework programme project Galen-In-Use, we applied the modelling and the tools to the development of a new multipurpose coding system for surgical procedures (CCAM) in France. On one hand we contributed to a language-independent knowledge repository for multicultural Europe. On the other hand, we support the traditionally labour-intensive process of creating a new coding system in medicine with artificial intelligence tools that use a medically oriented recursive ontology and natural language processing. We used an integrated software named CLAW to process French professional medical language rubrics produced by the national colleges of surgeons into intermediate dissections and into the Grail reference ontology model representation. From this language-independent concept model representation, on one hand we generate controlled French natural language to support the finalization of the linguistic labels in relation with the meanings of the conceptual system structure. On the other hand, the third-generation classification manager proves to be very powerful for retrieving the initial professional rubrics with different categories of concepts within a semantic network.

  18. Coding and traceability: cells and tissues in North America.

    PubMed

    Brubaker, Scott A; Wilson, Diane

    2010-11-01

    Cell and tissue banking professionals in North America have long understood the value of labeling their allografts with descriptive names that make them easily recognized. They have also understood that advantages exist in possessing the capability to track them internally and externally to better understand tissue handling from donation through distribution. An added insight that can assist with strategic planning is to know who uses them, how many, and for what purpose or application. Uniquely coding allografts naturally aids tracking in event of recall or the rare need to link them if implicated in an adverse outcome report. These values relate to an ability or inability to sufficiently track specific cell/tissue types throughout the allograft's lifetime. These concepts easily fit into the functions of a Quality Program and promote recipient safety. It is management oversight that drives the direction taken and either optimizes this knowledge or limits it. How concepts related to coding and tracing human cells and tissues for transplantation have evolved in North America, and where they may be headed, are described in this manuscript. Many protocols are in place but they exist in numerous operational silos. Quality Management System concepts should drive decision-making and include considerations for future planning beyond our own professional lifetimes.

  19. Overview and Current Status of Analyses of Potential LEU Design Concepts for TREAT

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Connaway, H. M.; Kontogeorgakos, D. C.; Papadias, D. D.

    2015-10-01

    Neutronic and thermal-hydraulic analyses have been performed to evaluate the performance of different low-enriched uranium (LEU) fuel design concepts for the conversion of the Transient Reactor Test Facility (TREAT) from its current high-enriched uranium (HEU) fuel. TREAT is an experimental reactor developed to generate high neutron flux transients for the testing of nuclear fuels. The goal of this work was to identify an LEU design which can maintain the performance of the existing HEU core while continuing to operate safely. A wide variety of design options were considered, with a focus on minimizing peak fuel temperatures and optimizing the power coupling between the TREAT core and test samples. Designs were also evaluated to ensure that they provide sufficient reactivity and shutdown margin for each control rod bank. Analyses were performed using the core loading and experiment configuration of historic M8 Power Calibration experiments (M8CAL). The Monte Carlo code MCNP was utilized for steady-state analyses, and transient calculations were performed with the point kinetics code TREKIN. Thermal analyses were performed with the COMSOL multi-physics code. Using the results of this study, a new LEU Baseline design concept is being established, which will be evaluated in detail in a future report.
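
    The transient side of such an analysis rests on the point-kinetics equations that codes like TREKIN solve. The sketch below integrates a single delayed-neutron group with explicit Euler for brevity; the kinetics parameters and reactivity step are generic textbook values, not TREAT data.

```python
# One-delayed-group point kinetics, explicit Euler (illustrative parameters).

def point_kinetics(rho, beta=0.007, lam=0.08, Lambda=1e-4, dt=1e-5, t_end=0.5):
    """Evolve relative neutron density n after a constant reactivity step rho
    (absolute units, not dollars), starting from equilibrium."""
    n = 1.0
    C = beta * n / (lam * Lambda)  # equilibrium precursor concentration
    for _ in range(int(t_end / dt)):
        dn = ((rho - beta) / Lambda) * n + lam * C
        dC = (beta / Lambda) * n - lam * C
        n += dt * dn
        C += dt * dC
    return n

n_small = point_kinetics(0.002)
n_large = point_kinetics(0.004)
print(round(n_small, 2), round(n_large, 2))  # larger step, larger power rise
```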

  20. The Nuremberg Code and the Nuremberg Trial. A reappraisal.

    PubMed

    Katz, J

    1996-11-27

    The Nuremberg Code includes 10 principles to guide physician-investigators in experiments involving human subjects. These principles, particularly the first principle on "voluntary consent," primarily were based on legal concepts because medical codes of ethics existent at the time of the Nazi atrocities did not address consent and other safeguards for human subjects. The US judges who presided over the proceedings did not intend the Code to apply only to the case before them, to be a response to the atrocities committed by the Nazi physicians, or to be inapplicable to research as it is customarily carried on in medical institutions. Instead, a careful reading of the judgment suggests that they wrote the Code for the practice of human experimentation whenever it is being conducted.

  1. Constructing a Pre-Emptive System Based on a Multidimensional Matrix and Autocompletion to Improve Diagnostic Coding in Acute Care Hospitals.

    PubMed

    Noussa-Yao, Joseph; Heudes, Didier; Escudie, Jean-Baptiste; Degoulet, Patrice

    2016-01-01

    Short-stay MSO (Medicine, Surgery, Obstetrics) hospitalization activities in public and private hospitals providing public services are funded through charges for the services provided (T2A in French). Coding must be well matched to the severity of the patient's condition, to ensure that appropriate funding is provided to the hospital. We propose the use of an autocompletion process and a multidimensional matrix to help physicians improve the expression of information and to optimize clinical coding. With this approach, physicians without knowledge of the encoding rules begin from a rough concept, which is gradually refined through semantic proximity, using information on the associated codes drawn from optimized knowledge bases of diagnosis codes.
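
    The autocompletion idea can be sketched as substring retrieval over a code dictionary, with matches ranked by a crude proximity score and narrowed as the physician refines the concept. The codes and labels below are a tiny invented sample, not the actual French knowledge base behind the proposal.

```python
# Sketch: propose diagnosis codes whose labels contain the typed fragment,
# ranked by match position. Invented sample codes, not a real T2A code base.

CODES = {
    "E11": "type 2 diabetes mellitus",
    "E10": "type 1 diabetes mellitus",
    "I10": "essential (primary) hypertension",
    "J45": "asthma",
}

def autocomplete(fragment, codes=CODES, limit=3):
    frag = fragment.lower()
    hits = [(label.find(frag), code, label)
            for code, label in codes.items() if frag in label]
    return [(code, label) for _, code, label in sorted(hits)][:limit]

print(autocomplete("diabet"))         # both diabetes codes surface
print(autocomplete("type 2 diabet"))  # refining the fragment narrows the list
```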

  2. Flexible digital modulation and coding synthesis for satellite communications

    NASA Technical Reports Server (NTRS)

    Vanderaar, Mark; Budinger, James; Hoerig, Craig; Tague, John

    1991-01-01

    An architecture and a hardware prototype of a flexible trellis modem/codec (FTMC) transmitter are presented. The theory of operation is built upon a pragmatic approach to trellis-coded modulation that emphasizes power and spectral efficiency. The system incorporates programmable modulation formats, variations of trellis-coding, digital baseband pulse-shaping, and digital channel precompensation. The modulation formats examined include (uncoded and coded) binary phase shift keying (BPSK), quaternary phase shift keying (QPSK), octal phase shift keying (8PSK), 16-ary quadrature amplitude modulation (16-QAM), and quadrature quadrature phase shift keying (Q squared PSK) at programmable rates up to 20 megabits per second (Mbps). The FTMC is part of the developing test bed to quantify modulation and coding concepts.
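
    One of the programmable formats listed, QPSK, can be sketched with a Gray-coded symbol table in which adjacent constellation points differ in a single bit, the property pragmatic trellis-coded modulation exploits. Pulse shaping and precompensation are omitted; this is a generic mapper, not the FTMC implementation.

```python
# Gray-coded QPSK mapper: 00 -> 45 deg, 01 -> 135, 11 -> 225, 10 -> 315.

import cmath
import math

QPSK = {(0, 0): 0, (0, 1): 1, (1, 1): 2, (1, 0): 3}

def qpsk_map(bits):
    """Map an even-length bit sequence to unit-energy QPSK symbols."""
    assert len(bits) % 2 == 0
    symbols = []
    for i in range(0, len(bits), 2):
        k = QPSK[(bits[i], bits[i + 1])]
        symbols.append(cmath.exp(1j * (math.pi / 4 + k * math.pi / 2)))
    return symbols

syms = qpsk_map([0, 0, 1, 1])
print([(round(s.real, 3), round(s.imag, 3)) for s in syms])
# first symbol lies in the first quadrant, second diagonally opposite
```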

  3. Genetic Code Analysis Toolkit: A novel tool to explore the coding properties of the genetic code and DNA sequences

    NASA Astrophysics Data System (ADS)

    Kraljić, K.; Strüngmann, L.; Fimmel, E.; Gumbel, M.

    2018-01-01

    The genetic code is degenerate, and it is assumed that redundancy provides error detection and correction mechanisms in the translation process. However, the biological meaning of the code's structure is still under current research. This paper presents a Genetic Code Analysis Toolkit (GCAT) which provides workflows and algorithms for the analysis of the structure of nucleotide sequences. In particular, sets or sequences of codons can be transformed and tested for circularity, comma-freeness, dichotomic partitions and others. GCAT comes with a fertile editor custom-built to work with the genetic code and a batch mode for multi-sequence processing. With the ability to read FASTA files or load sequences from GenBank, the tool can be used for the mathematical and statistical analysis of existing sequence data. GCAT is Java-based and provides a plug-in concept for extensibility. Availability: open source. Homepage: http://www.gcat.bio/
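
    One of the properties GCAT tests, comma-freeness, is compact enough to sketch directly: a codon set X is comma-free if reading any concatenation of two codons of X at a frame shift of 1 or 2 never yields a codon of X, so a shifted reading frame is immediately detectable. This follows the standard definition; it is not GCAT's own code.

```python
# Comma-freeness test for a set of codons (standard definition).

def is_comma_free(codons):
    X = set(codons)
    for x in X:
        for y in X:
            w = x + y                        # concatenate two codons
            if w[1:4] in X or w[2:5] in X:   # frame shifts of 1 and 2
                return False
    return True

print(is_comma_free({"ACG"}))         # shifted reads CGA, GAC are not in X
print(is_comma_free({"AAA"}))         # AAA reappears at every shift
print(is_comma_free({"ACG", "CGA"}))  # ACG+ACG contains CGA at shift 1
```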

  4. Learning Concepts, Language, and Literacy in Hybrid Linguistic Codes: The Multilingual Maze of Urban Grade 1 Classrooms in South Africa

    ERIC Educational Resources Information Center

    Henning, Elizabeth

    2012-01-01

    From the field of developmental psycholinguistics and from conceptual development theory there is evidence that excessive linguistic "code-switching" in early school education may pose some hazards for the learning of young multilingual children. In this article the author addresses the issue, invoking post-Piagetian and neo-Vygotskian…

  5. Using a Serious Game Approach to Teach Secure Coding in Introductory Programming: Development and Initial Findings

    ERIC Educational Resources Information Center

    Adamo-Villani, Nicoletta; Oania, Marcus; Cooper, Stephen

    2013-01-01

    We report the development and initial evaluation of a serious game that, in conjunction with appropriately designed matching laboratory exercises, can be used to teach secure coding and Information Assurance (IA) concepts across a range of introductory computing courses. The IA Game is a role-playing serious game (RPG) in which the student travels…

  6. Playing Music, Playing with Music: A Proposal for Music Coding in Primary School

    ERIC Educational Resources Information Center

    Baratè, Adriano; Ludovico, Luca Andrea; Mangione, Giuseppina Rita; Rosa, Alessia

    2015-01-01

    In this work we will introduce the concept of "music coding," namely a new discipline that employs basic music activities and simplified languages to teach the computational way of thinking to musically-untrained children who attend the primary school. In this context, music represents both a mean and a goal: in fact, from one side…

  7. Observation and Coding Manual for the Longitudinal Study of Reading Comprehension and Science Concept Acquisition (Third Edition). Technical Report No. L-1.

    ERIC Educational Resources Information Center

    Meyer, Linda A.; And Others

    This manual describes the model--specifically the observation procedures and coding systems--used in a longitudinal study of how children learn to comprehend what they read, with particular emphasis on science texts. Included are procedures for the following: identifying students; observing--recording observations and diagraming the room; writing…

  8. Just sustainability? Sustainability and social justice in professional codes of ethics for engineers.

    PubMed

    Brauer, Cletus S

    2013-09-01

    Should environmental, social, and economic sustainability be of primary concern to engineers? Should social justice be among these concerns? Although the deterioration of our natural environment and the increase in social injustices are among today's most pressing and important issues, engineering codes of ethics and their paramountcy clause, which contains those values most important to engineering and to what it means to be an engineer, do not yet put either concept on a par with the safety, health, and welfare of the public. This paper addresses a recent proposal by Michelfelder and Jones (2011) to include sustainability in the paramountcy clause as a way of rectifying the current disregard for social justice issues in the engineering codes. That proposal builds on a certain notion of sustainability that includes social justice as one of its dimensions and claims that social justice is a necessary condition for sustainability, not vice versa. The relationship between these concepts is discussed, and the original proposal is rejected. Drawing on insights developed throughout the paper, some suggestions are made as to how one should address the different requirements that theory and practice demand of the value taxonomy of professional codes of ethics.

  9. [Conflicts between nursing ethics and health care legislation in Spain].

    PubMed

    Gea-Sánchez, Montserrat; Terés-Vidal, Lourdes; Briones-Vozmediano, Erica; Molina, Fidel; Gastaldo, Denise; Otero-García, Laura

    2016-01-01

    To identify the ethical conflicts that may arise between the nursing codes of ethics and the Royal Decree-law 16/2012 modifying Spanish health regulations. We conducted a review and critical analysis of the discourse of five nursing codes of ethics from Barcelona, Catalonia, Spain, Europe and International, and of the discourse of the Spanish legislation in force in 2013. Language structures referring to five different concepts of the theoretical framework of care were identified in the texts: equity, human rights, right to healthcare, access to care, and continuity of care. Codes of ethics define the function of nursing according to equity, acknowledgement of human rights, right to healthcare, access to care and continuity of care, while legal discourse hinges on the concept of beneficiary or being insured. The divergence between the code of ethics and the legal discourse may produce ethical conflicts that negatively affect nursing practice. The application of RDL 16/2012 promotes a framework of action that prevents nursing professionals from providing care to uninsured collectives, which violates human rights and the principles of care ethics. Copyright © 2016 SESPAS. Published by Elsevier Espana. All rights reserved.

  10. Application of a Database System for Korean Military Personnel Management.

    DTIC Science & Technology

    1987-03-01

    [Record text is OCR residue from the report's documentation page and table of contents; the recoverable headings refer to database concepts, tree or hierarchical relationships, relationships between relation and data-processing concepts, and an example of a tree relationship.]

  11. Cobweb/3: A portable implementation

    NASA Technical Reports Server (NTRS)

    Mckusick, Kathleen; Thompson, Kevin

    1990-01-01

    An algorithm is examined for data clustering and incremental concept formation. An overview is given of the Cobweb/3 system and the algorithm on which it is based, as well as the practical details of obtaining and running the system code. The implementation features a flexible user interface which includes a graphical display of the concept hierarchies that the system constructs.
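
    The partition score that guides COBWEB-family concept formation can be sketched compactly. Below is an illustrative Python implementation of category utility for nominal attributes; it is a generic textbook formulation, not code from Cobweb/3 itself, and the data layout is a hypothetical choice.

```python
# Illustrative sketch (not the Cobweb/3 source): category utility for
# nominal attributes, the heuristic COBWEB-family systems use to score
# a partition of instances into concepts.
from collections import Counter

def category_utility(partition):
    """partition: list of clusters; each cluster is a list of
    attribute-value dicts, e.g. {"color": "red", "size": "big"}."""
    instances = [inst for cluster in partition for inst in cluster]
    n = len(instances)
    attrs = list(instances[0].keys())

    def expected_correct(insts):
        # sum over attributes of sum_j P(A_i = v_ij)^2
        total = 0.0
        for a in attrs:
            counts = Counter(inst[a] for inst in insts)
            total += sum((c / len(insts)) ** 2 for c in counts.values())
        return total

    base = expected_correct(instances)
    k = len(partition)
    return sum(len(c) / n * (expected_correct(c) - base)
               for c in partition) / k
```

    A partition that cleanly separates attribute values scores positive; the trivial single-cluster partition scores zero.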

  12. OpenGeoSys: Performance-Oriented Computational Methods for Numerical Modeling of Flow in Large Hydrogeological Systems

    NASA Astrophysics Data System (ADS)

    Naumov, D.; Fischer, T.; Böttcher, N.; Watanabe, N.; Walther, M.; Rink, K.; Bilke, L.; Shao, H.; Kolditz, O.

    2014-12-01

    OpenGeoSys (OGS) is a scientific open-source code for numerical simulation of thermo-hydro-mechanical-chemical processes in porous and fractured media. Its basic concept is to provide a flexible numerical framework for solving multi-field problems in geoscience and hydrology, e.g. CO2 storage, geothermal power plant forecast simulation, saltwater intrusion, and water resources management. Advances in computational mathematics have revolutionized the variety and nature of the problems that environmental scientists and engineers can address, and intensive code development in recent years now enables the solution of much larger numerical problems and applications. However, solving environmental processes along the water cycle at large scales, as for complete catchments or reservoirs, remains a computationally challenging task. Therefore, we started a new OGS code development with a focus on execution speed and parallelization. In the new version, a local data structure concept improves instruction and data cache performance by tightly bundling data with an element-wise numerical integration loop. Dedicated analysis methods enable the investigation of memory-access patterns in the local and global assembler routines, which leads to further data structure optimization for an additional performance gain. The concept is presented together with a technical code analysis of the recent development and a large case study including transient flow simulation in the unsaturated/saturated zone of the Thuringian Syncline, Germany. The analysis is performed on a high-resolution mesh (up to 50M elements) with embedded fault structures.
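
    The element-wise loop with locally bundled data can be illustrated schematically. The following minimal Python sketch is not OpenGeoSys code; the 1-D linear element and all names are hypothetical. It shows the assembly pattern in which each element carries its node indices together with its local data, so the global system is built in one tight loop:

```python
# Schematic 1-D finite-element assembly (not OpenGeoSys code): each
# element bundles its node indices with its local data, and the global
# matrix is built in a single element-wise loop -- the access pattern
# whose cache behaviour the abstract discusses.
def assemble_global(n_nodes, elements):
    """elements: list of (node_i, node_j, conductivity) tuples."""
    K = [[0.0] * n_nodes for _ in range(n_nodes)]
    for i, j, k in elements:
        # local 2x2 stiffness of a linear 1-D element
        local = [[k, -k], [-k, k]]
        for a, ga in enumerate((i, j)):
            for b, gb in enumerate((i, j)):
                K[ga][gb] += local[a][b]
    return K
```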

  13. Billing, coding, and documentation in the critical care environment.

    PubMed

    Fakhry, S M

    2000-06-01

    Optimal conduct of modern-day physician practices involves a thorough understanding and application of the principles of documentation, coding, and billing. Physicians' role in these activities can no longer be secondary. Surgeons practicing critical care must be well versed in these concepts and their effective application to ensure that they are competitive in an increasingly difficult and demanding environment. Health care policies and regulations continue to evolve, mandating constant education of practicing physicians and their staffs and surgical residents who also will have to function in this environment. Close, collaborative relationships between physicians and individuals well versed in the concepts of documentation, coding, and billing are indispensable. Similarly, ongoing educational and review processes (whether internal or consultative from outside sources) not only can decrease the possibility of unfavorable outcomes from audit but also will likely enhance practice efficiency and cash flow. A financially viable practice is certainly a prerequisite for a surgical critical care practice to achieve its primary goal of excellence in patient care.

  14. Inference in the brain: Statistics flowing in redundant population codes

    PubMed Central

    Pitkow, Xaq; Angelaki, Dora E

    2017-01-01

    It is widely believed that the brain performs approximate probabilistic inference to estimate causal variables in the world from ambiguous sensory data. To understand these computations, we need to analyze how information is represented and transformed by the actions of nonlinear recurrent neural networks. We propose that these probabilistic computations function by a message-passing algorithm operating at the level of redundant neural populations. To explain this framework, we review its underlying concepts, including graphical models, sufficient statistics, and message-passing, and then describe how these concepts could be implemented by recurrently connected probabilistic population codes. The relevant information flow in these networks will be most interpretable at the population level, particularly for redundant neural codes. We therefore outline a general approach to identify the essential features of a neural message-passing algorithm. Finally, we argue that to reveal the most important aspects of these neural computations, we must study large-scale activity patterns during moderately complex, naturalistic behaviors. PMID:28595050
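
    The message-passing idea the authors invoke can be made concrete with a toy example. This hedged Python sketch runs sum-product message passing along a three-variable chain of binary variables with hypothetical pairwise potentials; it illustrates the algorithmic concept only, not the proposed neural-population implementation:

```python
# Toy sum-product message passing on a chain x1 - x2 - x3 of binary
# variables. phi12 and phi23 are hypothetical 2x2 potential tables; the
# marginal of x3 is obtained by passing messages forward along the
# chain instead of enumerating all joint states.
def marginal_x3(prior1, phi12, phi23):
    # message from x1 to x2: m12(x2) = sum_x1 prior1(x1) * phi12(x1, x2)
    m12 = [sum(prior1[a] * phi12[a][b] for a in range(2)) for b in range(2)]
    # message from x2 to x3: m23(x3) = sum_x2 m12(x2) * phi23(x2, x3)
    m23 = [sum(m12[b] * phi23[b][c] for b in range(2)) for c in range(2)]
    z = sum(m23)
    return [v / z for v in m23]
```

    For a chain, the forward messages are sufficient statistics of everything upstream, which is the sense in which the abstract links message-passing to sufficient statistics.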

  15. NRA8-21 Cycle 2 RBCC Turbopump Risk Reduction

    NASA Technical Reports Server (NTRS)

    Ferguson, Thomas V.; Williams, Morgan; Marcu, Bogdan

    2004-01-01

    This project was composed of three sub-tasks. The objective of the first task was to use the CFD code INS3D to generate both on- and off-design predictions for the consortium optimized impeller flowfield. The results of the flow simulations are given in the first section. The objective of the second task was to construct a turbomachinery testing database comprised of measurements made on several different impellers, an inducer and a diffuser. The data was in the form of static pressure measurements as well as laser velocimeter measurements of velocities and flow angles within the stated components. Several databases with this information were created for these components. The third subtask objective was two-fold: first, to validate the Enigma CFD code for pump diffuser analysis, and secondly, to perform steady and unsteady analyses on some wide flow range diffuser concepts using Enigma. The code was validated using the consortium optimized impeller database and then applied to two different concepts for wide flow diffusers.

  16. The IHMC CmapTools software in research and education: a multi-level use case in Space Meteorology

    NASA Astrophysics Data System (ADS)

    Messerotti, Mauro

    2010-05-01

    The IHMC (Institute for Human and Machine Cognition, Florida University System, USA) CmapTools software is a powerful multi-platform tool for knowledge modelling in graphical form based on concept maps. In this work we present its application to the high-level development of a set of multi-level concept maps in the framework of Space Meteorology, to act as the kernel of a space meteorology domain ontology. This is an example of a research use case, as a domain ontology coded in machine-readable form, e.g. via OWL (Web Ontology Language), is suitable to be an active layer of any knowledge management system embedded in a Virtual Observatory (VO). Apart from being manageable at machine level, concept maps developed via CmapTools are intrinsically human-readable and can embed hyperlinks and objects of many kinds. They are therefore suitable for publication on the web: the coded knowledge can be exploited for educational purposes by students and the public, as the level of information can be naturally organized among linked concept maps in progressively increasing complexity levels. Hence CmapTools and its advanced version COE (Concept-map Ontology Editor) represent effective and user-friendly software tools for high-level knowledge representation in research and education.

  17. Pragmatic turn in biology: From biological molecules to genetic content operators.

    PubMed

    Witzany, Guenther

    2014-08-26

    Erwin Schrödinger's question "What is life?" was answered for decades with "physics + chemistry". The concepts of Alan Turing and John von Neumann introduced a third term: "information". This led to the understanding of nucleic acid sequences as a natural code. Manfred Eigen adapted Hamming's concept of "sequence space". Similar to Hilbert space, in which every ontological entity can be defined by an unequivocal point in a mathematical axiomatic system, in the abstract "sequence space" concept each point represents a unique syntactic structure, and the value of their separation represents their dissimilarity. In this concept, molecular features of the genetic code evolve by means of self-organisation of matter. Biological selection determines the fittest types among varieties of replication errors of quasi-species. The quasi-species concept dominated evolution theory for many decades. In contrast to this, recent empirical data on the evolution of DNA and its forerunners, the RNA world and viruses, indicate cooperative agent-based interactions. Group behaviour of quasi-species consortia constitutes de novo, and arranges available, genetic content for adaptational purposes within real-life contexts that determine epigenetic markings. This review focuses on some fundamental changes in biology, discarding its traditional status as a subdiscipline of physics and chemistry.

  18. Telemetry: Summary of concept and rationale

    NASA Astrophysics Data System (ADS)

    1987-12-01

    This report presents the concept and supporting rationale for the telemetry system developed by the Consultative Committee for Space Data Systems (CCSDS). The concepts, protocols and data formats developed for the telemetry system are designed for flight and ground data systems supporting conventional, contemporary free-flyer spacecraft. Data formats are designed with efficiency as a primary consideration, i.e., format overhead is minimized. The results reflect the consensus of experts from many space agencies. An overview of the CCSDS telemetry system introduces the notion of architectural layering to achieve transparent and reliable delivery of scientific and engineering sensor data (generated aboard space vehicles) to users located in space or on Earth. The system is broken down into two major conceptual categories: a packet telemetry concept and a telemetry channel coding concept. Packet telemetry facilitates data transmission from source to user in a standardized and highly automated manner. It provides a mechanism for implementing common data structures and protocols which can enhance the development and operation of space mission systems. Telemetry channel coding is a method by which data can be sent from a source to a destination by processing it in such a way that distinct messages are created which are easily distinguishable from one another. This allows reconstruction of the data with low error probability, thus improving performance of the channel.
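
    The channel-coding idea, creating messages that remain distinguishable after channel errors, can be illustrated with the textbook Hamming(7,4) code, chosen here only for brevity; the CCSDS recommendations themselves specify convolutional and Reed-Solomon codes:

```python
# A minimal single-error-correcting Hamming(7,4) code: codewords are
# spaced so that any single bit flip still identifies a unique message.
# (Generic textbook code, not a CCSDS-recommended code.)
def hamming74_encode(d):  # d: list of 4 data bits
    d1, d2, d3, d4 = d
    p1 = d1 ^ d2 ^ d4
    p2 = d1 ^ d3 ^ d4
    p3 = d2 ^ d3 ^ d4
    # bit positions 1..7: p1 p2 d1 p3 d2 d3 d4
    return [p1, p2, d1, p3, d2, d3, d4]

def hamming74_decode(r):  # r: 7 received bits; corrects one flip
    s1 = r[0] ^ r[2] ^ r[4] ^ r[6]   # checks positions 1,3,5,7
    s2 = r[1] ^ r[2] ^ r[5] ^ r[6]   # checks positions 2,3,6,7
    s3 = r[3] ^ r[4] ^ r[5] ^ r[6]   # checks positions 4,5,6,7
    syndrome = s1 + 2 * s2 + 4 * s3  # 1-based position of the error
    if syndrome:
        r = r[:]
        r[syndrome - 1] ^= 1
    return [r[2], r[4], r[5], r[6]]
```

    Every one of the seven possible single-bit corruptions is mapped back to the original four data bits.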

  19. A Bayesian network coding scheme for annotating biomedical information presented to genetic counseling clients.

    PubMed

    Green, Nancy

    2005-04-01

    We developed a Bayesian network coding scheme for annotating biomedical content in layperson-oriented clinical genetics documents. The coding scheme supports the representation of probabilistic and causal relationships among concepts in this domain, at a high enough level of abstraction to capture commonalities among genetic processes and their relationship to health. We are using the coding scheme to annotate a corpus of genetic counseling patient letters as part of the requirements analysis and knowledge acquisition phase of a natural language generation project. This paper describes the coding scheme and presents an evaluation of intercoder reliability for its tag set. In addition to giving examples of use of the coding scheme for analysis of discourse and linguistic features in this genre, we suggest other uses for it in analysis of layperson-oriented text and dialogue in medical communication.
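
    A standard statistic for the kind of intercoder-reliability evaluation mentioned above is Cohen's kappa; the abstract does not say which measure the authors used, so the following Python sketch is illustrative rather than a reproduction of their analysis:

```python
# Intercoder agreement via Cohen's kappa: chance-corrected agreement
# between two coders who assigned one tag per item.
from collections import Counter

def cohens_kappa(coder_a, coder_b):
    n = len(coder_a)
    observed = sum(a == b for a, b in zip(coder_a, coder_b)) / n
    ca, cb = Counter(coder_a), Counter(coder_b)
    # chance agreement from each coder's marginal tag frequencies
    expected = sum(ca[t] * cb[t] for t in ca) / (n * n)
    return (observed - expected) / (1 - expected)
```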

  20. Rate-compatible punctured convolutional codes (RCPC codes) and their applications

    NASA Astrophysics Data System (ADS)

    Hagenauer, Joachim

    1988-04-01

    The concept of punctured convolutional codes is extended by punctuating a low-rate 1/N code periodically with period P to obtain a family of codes with rate P/(P + l), where l can be varied between 1 and (N - 1)P. A rate-compatibility restriction on the puncturing tables ensures that all code bits of high rate codes are used by the lower-rate codes. This allows transmission of incremental redundancy in ARQ/FEC (automatic repeat request/forward error correction) schemes and continuous rate variation to change from low to high error protection within a data frame. Families of RCPC codes with rates between 8/9 and 1/4 are given for memories M from 3 to 6 (8 to 64 trellis states) together with the relevant distance spectra. These codes are almost as good as the best known general convolutional codes of the respective rates. It is shown that the same Viterbi decoder can be used for all RCPC codes of the same M. The application of RCPC codes to hybrid ARQ/FEC schemes is discussed for Gaussian and Rayleigh fading channels using channel-state information to optimize throughput.
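
    Periodic puncturing itself is simple to sketch. The following Python illustration uses a generic rate-1/2 encoder (generators 7, 5 octal, memory 2), an illustrative choice rather than one of Hagenauer's tabulated RCPC families; the rate-compatibility restriction means every bit kept by a high-rate puncturing table is also kept by all lower-rate tables:

```python
# Sketch of puncturing a rate-1/2 convolutional code. A puncturing
# table of 0/1 flags, applied periodically, marks which code bits are
# transmitted; dropping 1 of every 4 bits turns rate 1/2 into 2/3.
def conv_encode_r12(bits):
    s1 = s2 = 0
    out = []
    for b in bits:
        out.append(b ^ s1 ^ s2)  # generator 7 (binary 111)
        out.append(b ^ s2)       # generator 5 (binary 101)
        s1, s2 = b, s1
    return out

def puncture(code_bits, table):
    """table: list of 0/1 flags, applied with period len(table)."""
    return [c for i, c in enumerate(code_bits) if table[i % len(table)]]
```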

  1. Introduction to the Natural Anticipator and the Artificial Anticipator

    NASA Astrophysics Data System (ADS)

    Dubois, Daniel M.

    2010-11-01

    This short communication deals with the introduction of the concept of anticipator, which is one who anticipates, in the framework of computing anticipatory systems. The definition of anticipation deals with the concept of program. Indeed, the word program comes from "pro-gram", meaning "to write before" by anticipation, and means a plan for the programming of a mechanism, or a sequence of coded instructions that can be inserted into a mechanism, or a sequence of coded instructions, such as genes or behavioural responses, that is part of an organism. Any natural or artificial programs are thus related to anticipatory rewriting systems, as shown in this paper. All the cells in the body, and the neurons in the brain, are programmed by the anticipatory genetic code, DNA, in a low-level language with four signs. The programs in computers are also computing anticipatory systems. It will be shown, on the one hand, that the genetic code DNA is a natural anticipator. As demonstrated by Nobel laureate McClintock [8], genomes are programmed. The fundamental program deals with the DNA genetic code. The properties of the DNA consist in self-replication and self-modification. The self-replicating process leads to reproduction of the species, while the self-modifying process leads to new species or evolution and adaptation in existing ones. The genetic code DNA keeps its instructions in memory in the DNA coding molecule. The genetic code DNA is a rewriting system, from the DNA coding molecule to the DNA template molecule. The DNA template molecule is in turn rewritten to the messenger RNA molecule. The information is not destroyed during the execution of the rewriting program. On the other hand, it will be demonstrated that the Turing machine is an artificial anticipator. The Turing machine is a rewriting system. The head reads and writes, modifying the content of the tape. The information is destroyed during the execution of the program. This is an irreversible process. The input data are lost.
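
    The contrast drawn here, a rewriting that preserves its source versus one that overwrites it, can be sketched in a few lines of Python; this is a toy illustration, not code from the paper:

```python
# Toy contrast between the two rewriting systems discussed above. The
# DNA-template-to-mRNA rewrite is invertible -- no information is
# destroyed -- whereas a head that overwrites its tape erases the
# input. (Standard base-pairing rules; everything else illustrative.)
DNA_TO_MRNA = {"A": "U", "T": "A", "G": "C", "C": "G"}
MRNA_TO_DNA = {v: k for k, v in DNA_TO_MRNA.items()}

def transcribe(template):
    """Rewrite a DNA template strand into mRNA (information-preserving)."""
    return "".join(DNA_TO_MRNA[b] for b in template)

def back_transcribe(mrna):
    """The inverse rewrite: recover the template from the mRNA."""
    return "".join(MRNA_TO_DNA[b] for b in mrna)

def overwrite(tape, blank="_"):
    """A destructive rewrite: the original symbols are lost."""
    return blank * len(tape)
```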

  2. Solid Geometric Modeling - The Key to Improved Materiel Acquisition from Concept to Deployment

    DTIC Science & Technology

    1984-09-01

    [Record text is OCR residue; the recoverable references are:] M. J. Reisinger, "The GIFT Code User Manual; Volume I, Introduction and Input Requirements (U)," BRL Report No. 1802, July 1975 (AD# A078364). G. G. Kuehl, L. W. Bain, Jr., and M. J. Reisinger, "The GIFT Code User Manual; Volume II, The Output Options (U)," USA ARRADCOM Report No. 02189, September 1979. Results are plotted by a code called RunShot, written by L. M. Rybak, which takes input from GIFT and plots color shotlines.

  3. Investigation of Advanced Counterrotation Blade Configuration Concepts for High Speed Turboprop Systems. Task 8: Cooling Flow/heat Transfer Analysis User's Manual

    NASA Technical Reports Server (NTRS)

    Hall, Edward J.; Topp, David A.; Heidegger, Nathan J.; Delaney, Robert A.

    1994-01-01

    The focus of this task was to validate the ADPAC code for heat transfer calculations. To accomplish this goal, the ADPAC code was modified to allow for a Cartesian coordinate system capability and to add boundary conditions to handle spanwise periodicity and transpiration boundaries. This user's manual describes how to use the ADPAC code as developed in Task 5, NAS3-25270, including the modifications made to date in Tasks 7 and 8, NAS3-25270.

  4. Three-Dimensional Algebraic Models of the tRNA Code and 12 Graphs for Representing the Amino Acids

    PubMed Central

    José, Marco V.; Morgado, Eberto R.; Guimarães, Romeu Cardoso; Zamudio, Gabriel S.; de Farías, Sávio Torres; Bobadilla, Juan R.; Sosa, Daniela

    2014-01-01

    Three-dimensional algebraic models, also called Genetic Hotels, are developed to represent the Standard Genetic Code, the Standard tRNA Code (S-tRNA-C), and the Human tRNA code (H-tRNA-C). New algebraic concepts are introduced to be able to describe these models, to wit, the generalization of the 2n-Klein Group and the concept of a subgroup coset with a tail. We found that the H-tRNA-C displayed broken symmetries in regard to the S-tRNA-C, which is highly symmetric. We also show that there are only 12 ways to represent each of the corresponding phenotypic graphs of amino acids. The averages of statistical centrality measures of the 12 graphs for each of the three codes are carried out and they are statistically compared. The phenotypic graphs of the S-tRNA-C display a common triangular prism of amino acids in 10 out of the 12 graphs, whilst the corresponding graphs for the H-tRNA-C display only two triangular prisms. The graphs exhibit disjoint clusters of amino acids when their polar requirement values are used. We contend that the S-tRNA-C is in a frozen-like state, whereas the H-tRNA-C may be in an evolving state. PMID:25370377
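
    The Klein-group algebra underlying such models can be shown in miniature. In many algebraic genetic-code models the four bases are identified with elements of the Klein four-group, Z2 x Z2 under component-wise XOR, the building block that the paper's generalized Klein-group construction extends; the particular base-to-element assignment below is one common convention, used here purely for illustration:

```python
# The four RNA bases as elements of the Klein four-group (Z2 x Z2 with
# component-wise XOR). The assignment of bases to pairs is one common
# convention, chosen only for illustration.
BASE = {"C": (0, 0), "U": (0, 1), "G": (1, 0), "A": (1, 1)}

def klein_add(p, q):
    return (p[0] ^ q[0], p[1] ^ q[1])
```

    Every element is its own inverse and the set is closed under the operation, which is what makes the group a natural scaffold for symmetry arguments about codon space.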

  5. Death of a dogma: eukaryotic mRNAs can code for more than one protein

    PubMed Central

    Mouilleron, Hélène; Delcourt, Vivian; Roucou, Xavier

    2016-01-01

    mRNAs carry the genetic information that is translated by ribosomes. The traditional view of a mature eukaryotic mRNA is a molecule with three main regions, the 5′ UTR, the protein coding open reading frame (ORF) or coding sequence (CDS), and the 3′ UTR. This concept assumes that ribosomes translate one ORF only, generally the longest one, and produce one protein. As a result, in the early days of genomics and bioinformatics, one CDS was associated with each protein-coding gene. This fundamental concept of a single CDS is being challenged by increasing experimental evidence indicating that annotated proteins are not the only proteins translated from mRNAs. In particular, mass spectrometry (MS)-based proteomics and ribosome profiling have detected productive translation of alternative open reading frames. In several cases, the alternative and annotated proteins interact. Thus, the expression of two or more proteins translated from the same mRNA may offer a mechanism to ensure the co-expression of proteins which have functional interactions. Translational mechanisms already described in eukaryotic cells indicate that the cellular machinery is able to translate different CDSs from a single viral or cellular mRNA. In addition to summarizing data showing that the protein coding potential of eukaryotic mRNAs has been underestimated, this review aims to challenge the single translated CDS dogma. PMID:26578573

  6. Neural correlates of concreteness in semantic categorization.

    PubMed

    Pexman, Penny M; Hargreaves, Ian S; Edwards, Jodi D; Henry, Luke C; Goodyear, Bradley G

    2007-08-01

    In some contexts, concrete words (CARROT) are recognized and remembered more readily than abstract words (TRUTH). This concreteness effect has historically been explained by two theories of semantic representation: dual-coding [Paivio, A. Dual coding theory: Retrospect and current status. Canadian Journal of Psychology, 45, 255-287, 1991] and context-availability [Schwanenflugel, P. J. Why are abstract concepts hard to understand? In P. J. Schwanenflugel (Ed.), The psychology of word meanings (pp. 223-250). Hillsdale, NJ: Erlbaum, 1991]. Past efforts to adjudicate between these theories using functional magnetic resonance imaging have produced mixed results. Using event-related functional magnetic resonance imaging, we reexamined this issue with a semantic categorization task that allowed for uniform semantic judgments of concrete and abstract words. The participants were 20 healthy adults. Functional analyses contrasted activation associated with concrete and abstract meanings of ambiguous and unambiguous words. Results showed that for both ambiguous and unambiguous words, abstract meanings were associated with more widespread cortical activation than concrete meanings in numerous regions associated with semantic processing, including temporal, parietal, and frontal cortices. These results are inconsistent with both dual-coding and context-availability theories, as these theories propose that the representations of abstract concepts are relatively impoverished. Our results suggest, instead, that semantic retrieval of abstract concepts involves a network of association areas. We argue that this finding is compatible with a theory of semantic representation such as Barsalou's [Barsalou, L. W. Perceptual symbol systems. Behavioral & Brain Sciences, 22, 577-660, 1999] perceptual symbol systems, whereby concrete and abstract concepts are represented by similar mechanisms but with differences in focal content.

  7. Network analysis for the visualization and analysis of qualitative data.

    PubMed

    Pokorny, Jennifer J; Norman, Alex; Zanesco, Anthony P; Bauer-Wu, Susan; Sahdra, Baljinder K; Saron, Clifford D

    2018-03-01

    We present a novel manner in which to visualize the coding of qualitative data that enables representation and analysis of connections between codes using graph theory and network analysis. Network graphs are created from codes applied to a transcript or audio file using the code names and their chronological location. The resulting network is a representation of the coding data that characterizes the interrelations of codes. This approach enables quantification of qualitative codes using network analysis and facilitates examination of associations of network indices with other quantitative variables using common statistical procedures. Here, as a proof of concept, we applied this method to a set of interview transcripts that had been coded in 2 different ways and the resultant network graphs were examined. The creation of network graphs allows researchers an opportunity to view and share their qualitative data in an innovative way that may provide new insights and enhance transparency of the analytical process by which they reach their conclusions. (PsycINFO Database Record (c) 2018 APA, all rights reserved).
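
    The construction described, a network built from code names and their chronological locations, can be sketched as follows; this is a minimal stdlib-only illustration (the authors' actual tooling is not specified, and a graph library such as networkx could hold the same structure):

```python
# Sketch of the idea: a directed, weighted network whose edges count
# transitions between consecutively applied qualitative codes in a
# transcript.
from collections import defaultdict

def code_network(code_sequence):
    edges = defaultdict(int)
    for a, b in zip(code_sequence, code_sequence[1:]):
        edges[(a, b)] += 1
    return dict(edges)
```

    Standard network indices (degree, centrality, density) can then be computed on the result and correlated with other quantitative variables, as the abstract proposes.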

  8. Research on Ajax and Hibernate technology in the development of E-shop system

    NASA Astrophysics Data System (ADS)

    Yin, Luo

    2011-12-01

    Hibernate is an open-source object-relational mapping framework that provides lightweight object encapsulation of JDBC, letting Java programmers manipulate a database freely using object-oriented concepts. The appearance of Ajax (asynchronous JavaScript and XML) opened the era of partial page refresh, so that developers can build web applications with richer interaction. The paper illustrates in detail the concrete application of Ajax and Hibernate to the development of an E-shop, using them to divide the entire program code into relatively independent parts that cooperate with one another. In this way, the entire program becomes easier to maintain and extend.

  9. Automatic programming of simulation models

    NASA Technical Reports Server (NTRS)

    Schroer, Bernard J.; Tseng, Fan T.; Zhang, Shou X.; Dwan, Wen S.

    1990-01-01

    The concepts of software engineering were used to improve the simulation modeling environment. Emphasis was placed on the application of an element of rapid prototyping, or automatic programming, to assist the modeler define the problem specification. Then, once the problem specification has been defined, an automatic code generator is used to write the simulation code. The following two domains were selected for evaluating the concepts of software engineering for discrete event simulation: manufacturing domain and a spacecraft countdown network sequence. The specific tasks were to: (1) define the software requirements for a graphical user interface to the Automatic Manufacturing Programming System (AMPS) system; (2) develop a graphical user interface for AMPS; and (3) compare the AMPS graphical interface with the AMPS interactive user interface.

  10. Concept of a photon-counting camera based on a diffraction-addressed Gray-code mask

    NASA Astrophysics Data System (ADS)

    Morel, Sébastien

    2004-09-01

    A new concept of a photon-counting camera for fast, low-light-level imaging applications is introduced. The possible spectrum covered by this camera ranges from visible light to gamma rays, depending on the device used to transform an incoming photon into a burst of visible photons (a photo-event spot) localized in an (x,y) image plane. It is actually an evolution of the existing "PAPA" (Precision Analog Photon Address) camera that was designed for visible photons; the improvement comes from simplified optics. The new camera transforms, by diffraction, each photo-event spot from an image intensifier or a scintillator into a cross-shaped pattern, which is projected onto a specific Gray-code mask. The photo-event position is then extracted from the signal given by an array of avalanche photodiodes (or, alternatively, photomultiplier tubes) downstream of the mask. After a detailed explanation of this camera concept, which we have called "DIAMICON" (DIffraction Addressed Mask ICONographer), we briefly discuss technical solutions for building such a camera.
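
    The appeal of a Gray-code mask rests on a simple property: consecutive positions differ in exactly one bit, so a photo-event spot straddling a pattern boundary can be mis-read by at most one position. The standard binary-reflected Gray code, shown here as a generic illustration of that property:

```python
# Standard binary-reflected Gray code: to_gray maps a position index to
# its Gray code; from_gray inverts it. Adjacent indices always differ
# in exactly one Gray-code bit.
def to_gray(n):
    return n ^ (n >> 1)

def from_gray(g):
    n = 0
    while g:
        n ^= g
        g >>= 1
    return n
```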

  11. Number theoretical foundations in cryptography

    NASA Astrophysics Data System (ADS)

    Atan, Kamel Ariffin Mohd

    2017-08-01

    In recent times, the hazards in relationships among entities in different establishments worldwide have generated exciting developments in cryptography. Central to this is the theory of numbers. This area of mathematics provides a very rich source of fundamental material for constructing secret codes. Some number-theoretical concepts that have been very actively used in designing cryptosystems will be highlighted in this presentation. This paper will begin with an introduction to basic number-theoretical concepts which for many years were thought to have no practical applications. This will include several theoretical assertions that were discovered much earlier in the historical development of number theory. This will be followed by discussion of the "hidden" properties of these assertions that were later exploited by designers of cryptosystems in their quest to develop secret codes. This paper also highlights some earlier and existing cryptosystems and the role played by number-theoretical concepts in their constructions. The role played by cryptanalysts in detecting weaknesses in the systems developed by cryptographers concludes this presentation.
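
    A canonical example of such a "hidden" property turned into a secret code is textbook RSA, where Euler's theorem makes modular exponentiation invertible only with knowledge of the factorization of n. The toy parameters below are the classic small example; real systems use padding schemes and primes of thousands of bits:

```python
# Textbook RSA with toy primes (requires Python 3.8+ for pow(e, -1, m)).
p, q = 61, 53
n = p * q                  # modulus: 3233
phi = (p - 1) * (q - 1)    # Euler totient: 3120
e = 17                     # public exponent, coprime to phi
d = pow(e, -1, phi)        # private exponent: modular inverse of e

msg = 65
cipher = pow(msg, e, n)            # encrypt with the public key
recovered = pow(cipher, d, n)      # decrypt with the private key
```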

  12. Viscous diffusion of vorticity in unsteady wall layers using the diffusion velocity concept

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Strickland, J.H.; Kempka, S.N.; Wolfe, W.P.

    1995-03-01

    The primary purpose of this paper is to provide a careful evaluation of the diffusion velocity concept with regard to its ability to predict the diffusion of vorticity near a moving wall. A computer code BDIF has been written which simulates the evolution of the vorticity field near a wall of infinite length which is moving in an arbitrary fashion. The simulations generated by this code are found to give excellent results when compared to several exact solutions. We also outline a two-dimensional unsteady viscous boundary layer model which utilizes the diffusion velocity concept and is compatible with vortex methods. A primary goal of this boundary layer model is to minimize the number of vortices generated on the surface at each time step while achieving good resolution of the vorticity field near the wall. Preliminary results have been obtained for simulating a simple two-dimensional laminar boundary layer.

  13. Parametric Weight Comparison of Advanced Metallic, Ceramic Tile, and Ceramic Blanket Thermal Protection Systems

    NASA Technical Reports Server (NTRS)

    Myers, David E.; Martin, Carl J.; Blosser, Max L.

    2000-01-01

    A parametric weight assessment of advanced metallic panel, ceramic blanket, and ceramic tile thermal protection systems (TPS) was conducted using an implicit, one-dimensional (1-D) finite element sizing code. This sizing code contained models to account for coatings, fasteners, adhesives, and strain isolation pads. Atmospheric entry heating profiles for two vehicles, the Access to Space (ATS) vehicle and a proposed Reusable Launch Vehicle (RLV), were used to ensure that the trends were not unique to a certain trajectory. Ten TPS concepts were compared for a range of applied heat loads and substructural heat capacities to identify general trends. This study found the blanket TPS concepts have the lightest weights over the majority of their applicable ranges, and current technology ceramic tiles and metallic TPS concepts have similar weights. A proposed, state-of-the-art metallic system which uses a higher temperature alloy and efficient multilayer insulation was predicted to be significantly lighter than the ceramic tile systems and approaches blanket TPS weights for higher integrated heat loads.

  14. Convolution Operations on Coding Metasurface to Reach Flexible and Continuous Controls of Terahertz Beams.

    PubMed

    Liu, Shuo; Cui, Tie Jun; Zhang, Lei; Xu, Quan; Wang, Qiu; Wan, Xiang; Gu, Jian Qiang; Tang, Wen Xuan; Qing Qi, Mei; Han, Jia Guang; Zhang, Wei Li; Zhou, Xiao Yang; Cheng, Qiang

    2016-10-01

    The concept of the coding metasurface links physical metamaterial particles to digital codes, and hence it is possible to perform digital signal processing on the coding metasurface to realize unusual physical phenomena. Here, this study performs Fourier operations on coding metasurfaces and proposes a principle called scattering-pattern shift using the convolution theorem, which allows steering of the scattering pattern to an arbitrarily predesigned direction. Owing to the constant reflection amplitude of coding particles, the required coding pattern can be simply achieved by the modulus of two coding matrices. This study demonstrates that the scattering patterns directly calculated from the coding pattern using the Fourier transform show excellent agreement with numerical simulations based on realistic coding structures, providing an efficient method for optimizing coding patterns to achieve predesigned scattering beams. The most important advantage of this approach over previous schemes for producing anomalous single-beam scattering is its flexible and continuous control of arbitrary beam directions. This work opens a new route to study metamaterials from a fully digital perspective, predicting the possibility of combining conventional theorems in digital signal processing with the coding metasurface to realize more powerful manipulations of electromagnetic waves.
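
    A one-dimensional toy of the scattering-pattern-shift principle: treating a 2-bit coding sequence as element phases (codes 0..3 mapped to multiples of pi/2), adding a gradient code modulo 4 multiplies the aperture field by a linear phase, and by the discrete Fourier shift theorem this translates the far-field peak. All sizes and code sequences below are illustrative, not taken from the paper:

```python
# 1-D sketch of scattering-pattern shift on a coding "metasurface":
# element n radiates with phase code c_n in {0,1,2,3} (phase c*pi/2).
# Adding a gradient code mod 4 shifts the DFT peak of the array factor.
import cmath

def far_field(codes):
    """Magnitude of the DFT of the element phase pattern."""
    N = len(codes)
    field = [cmath.exp(1j * c * cmath.pi / 2) for c in codes]
    return [abs(sum(f * cmath.exp(-2j * cmath.pi * k * x / N)
                    for x, f in enumerate(field))) for k in range(N)]

def peak(codes):
    spectrum = far_field(codes)
    return spectrum.index(max(spectrum))

uniform = [0] * 8                           # in-phase array: peak at k = 0
gradient = [(2 * x) % 4 for x in range(8)]  # phase step of pi per element
mixed = [(a + b) % 4 for a, b in zip(uniform, gradient)]  # mod-4 addition
```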

  15. Review of finite fields: Applications to discrete Fourier transforms and Reed-Solomon coding

    NASA Technical Reports Server (NTRS)

    Wong, J. S. L.; Truong, T. K.; Benjauthrit, B.; Mulhall, B. D. L.; Reed, I. S.

    1977-01-01

    An attempt is made to provide a step-by-step approach to the subject of finite fields. Rigorous proofs and highly theoretical materials are avoided. The simple concepts of groups, rings, and fields are discussed and developed more or less heuristically. Examples are used liberally to illustrate the meaning of definitions and theories. Applications include discrete Fourier transforms and Reed-Solomon coding.
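
    As a concrete illustration of a discrete Fourier transform over a finite field, the sketch below implements a naive number-theoretic transform. The small parameters (p = 5, N = 4, omega = 2) are chosen for readability and are not taken from the report; such a transform needs a primitive N-th root of unity mod p, i.e. N must divide p - 1.

```python
p = 5
N = 4
omega = 2          # 2 has multiplicative order 4 mod 5: 2, 4, 3, 1

def ntt(x, root, mod):
    """Naive O(N^2) discrete Fourier transform over GF(mod)."""
    n = len(x)
    return [sum(x[j] * pow(root, i * j, mod) for j in range(n)) % mod
            for i in range(n)]

def intt(y, root, mod):
    """Inverse transform: use root^{-1} and divide by N (both mod p)."""
    n = len(y)
    inv_root = pow(root, mod - 2, mod)   # Fermat inverse, mod prime
    inv_n = pow(n, mod - 2, mod)
    return [(v * inv_n) % mod for v in ntt(y, inv_root, mod)]

x = [1, 2, 3, 4]
X = ntt(x, omega, p)
print(X)                         # [0, 4, 3, 2]
assert intt(X, omega, p) == x    # round trip recovers the input
```

    The same arithmetic over GF(2^m) underlies Reed-Solomon encoding, where codewords can be viewed as evaluations of a message polynomial at powers of a field generator.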

  16. Challenges in assessing college students' conception of duality: the case of infinity

    NASA Astrophysics Data System (ADS)

    Babarinsa-Ochiedike, Grace Olutayo

    Interpreting students' views of infinity poses a challenge for researchers due to the dynamic nature of the conception. There is diversity and variation among students' process-object perceptions. The fluctuations between students' views, however, reveal an undeveloped duality conception. This study examined college students' conception of duality in understanding and representing infinity, with the intent to design strategies that could guide researchers in categorizing students' views of infinity into different levels. Data for the study were collected from N=238 college students enrolled in Calculus sequence courses (Pre-Calculus, Calculus I through Calculus III) at one of the southwestern universities in the U.S. using self-report questionnaires and semi-structured individual task-based interviews. Data were triangulated using multiple measures analyzed by three independent experts using self-designed coding sheets to assess students' externalization of the duality conception of infinity. Results of this study reveal that college students' experiences in traditional Calculus sequence courses are not supportive of the development of the duality conception. On the contrary, they strengthen the singularity perspective on fundamental ideas of mathematics such as infinity. The study also found that coding and assessing college students' conception of duality is a challenging and complex process because the conception is dynamic, task-dependent, and context-dependent. The practical significance of the study is that it helps to recognize misconceptions and to start addressing them so that students will have a more comprehensive view of fundamental mathematical ideas as they progress through the Calculus coursework sequence. The duality concept development framework developed here, called Action-Process-Object-Duality (APOD) and adapted from APOS theory, could guide educators and researchers as they engage in assessing students' conception of duality.
The results of this study could serve as a facilitating instrument to further analyze cognitive obstacles in college students' understanding of the infinity concept.

  17. Dual coding theory, word abstractness, and emotion: a critical review of Kousta et al. (2011).

    PubMed

    Paivio, Allan

    2013-02-01

    Kousta, Vigliocco, Del Campo, Vinson, and Andrews (2011) questioned the adequacy of dual coding theory and the context availability model as explanations of representational and processing differences between concrete and abstract words. They proposed an alternative approach that focuses on the role of emotional content in the processing of abstract concepts. Their dual coding critique is, however, based on impoverished and, in some respects, incorrect interpretations of the theory and its implications. This response corrects those gaps and misinterpretations and summarizes research findings that show predicted variations in the effects of dual coding variables in different tasks and contexts. Especially emphasized is an empirically supported dual coding theory of emotion that goes beyond the Kousta et al. emphasis on emotion in abstract semantics.

  18. MO-E-18C-04: Advanced Computer Simulation and Visualization Tools for Enhanced Understanding of Core Medical Physics Concepts

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Naqvi, S

    2014-06-15

    Purpose: Most medical physics programs emphasize proficiency in routine clinical calculations and QA. The formulaic aspect of these calculations and the prescriptive nature of measurement protocols obviate the need to frequently apply basic physical principles, which therefore gradually decay away from memory. For example, few students appreciate the role of electron transport in photon dose, making it difficult to understand key concepts such as dose buildup, electronic disequilibrium effects, and Bragg-Gray theory. These conceptual deficiencies manifest when the physicist encounters a new system requiring knowledge beyond routine activities. Methods: Two interactive computer simulation tools are developed to facilitate deeper learning of physical principles. One is a Monte Carlo code written with a strong educational aspect. The code can “label” regions and interactions to highlight specific aspects of the physics, e.g., certain regions can be designated as “starters” or “crossers,” and any interaction type can be turned on and off. Full 3D tracks with specific portions highlighted further enhance the visualization of radiation transport problems. The second code calculates and displays the trajectories of a collection of electrons under an arbitrary space- and time-dependent Lorentz force using relativistic kinematics. Results: Using the Monte Carlo code, the student can interactively study photon and electron transport through visualization of dose components, particle tracks, and interaction types. The code can, for instance, be used to study the kerma-dose relationship, explore electronic disequilibrium near interfaces, or visualize kernels by using interaction forcing. The electromagnetic simulator enables the student to explore accelerating mechanisms and particle optics in devices such as cyclotrons and linacs. Conclusion: The proposed tools are designed to enhance understanding of abstract concepts by highlighting various aspects of the physics. 
The simulations serve as virtual experiments that give deeper and longer-lasting understanding of core principles. The student can then make sound judgements in novel situations encountered beyond routine clinical activities.
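
    In the spirit of the educational Monte Carlo code described above (labelling each history by its fate, e.g. as a "crosser"), the following is a minimal, hypothetical sketch of photon transport through a homogeneous slab; the attenuation coefficient and thickness are invented values, and the Monte Carlo tally is checked against the analytic exponential attenuation law.

```python
import math
import random

mu = 0.2          # attenuation coefficient, interactions per cm (illustrative)
thickness = 5.0   # slab thickness in cm (illustrative)
n_photons = 100_000

random.seed(1)
tally = {"crossers": 0, "absorbed": 0}
for _ in range(n_photons):
    # sample the free path from an exponential distribution;
    # 1 - random() avoids log(0)
    path = -math.log(1.0 - random.random()) / mu
    tally["crossers" if path > thickness else "absorbed"] += 1

mc_estimate = tally["crossers"] / n_photons
analytic = math.exp(-mu * thickness)   # ~0.368 for these values
print(mc_estimate, analytic)           # the two agree to ~1%
```

    Turning interaction types on and off, as the abstract describes, amounts to editing which branches of such a sampling loop are allowed to fire.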

  19. Pulsed Ejector Wave Propagation Test Program

    NASA Technical Reports Server (NTRS)

    Fernandez, Rene; Slater, John W.; Paxson, Daniel E.

    2003-01-01

    The development of, and initial test data from, a nondetonating Pulse Detonation Engine (PDE) simulator tested in the NASA Glenn 1 x 1 foot Supersonic Wind Tunnel (SWT) is presented in this paper. The concept is a pulsed ejector driven by the simulated exhaust of a PDE. This program is applicable to a PDE embedded in a ramjet flowpath, i.e., a PDE combined-cycle propulsion system. The ejector primary flow is a pulsed, underexpanded, supersonic nozzle simulating the supersonic waves emanating from a PDE, while the ejector secondary flow is the 1 x 1 foot SWT test section operated at subsonic Mach numbers. The objective is not to study the detonation details, but the wave physics of a pulsed ejector, including the starting vortices, the extent of propagation of the wave front, the reflection of the wave from the secondary flowpath walls, and the timing of these events, and to correlate these with Computational Fluid Dynamics (CFD) code predictions. Pulsed ejectors have been shown to yield a 3-to-1 improvement in L/D (length-to-diameter ratio) and nearly a 2-to-1 improvement in thrust augmentation over a steady ejector. This program will also explore the extent of upstream interactions between an inlet and the large, periodically applied backpressures that would be present due to combustion tube detonations in a PDE. These interactions could result in inlet unstart or buzz for a supersonic mixed-compression inlet. The design of the present experiment entailed the use of an x-t diagram characteristics code to study the nozzle filling and purging timescales, as well as a series of CFD analyses conducted using the WIND code. The WIND code is a general-purpose CFD code for solution of the Reynolds-averaged Navier-Stokes equations and can be applied to both steady-state and time-accurate calculations. 
Pressure distributions from the first, proof-of-concept test entry (spring 2001) shown here indicate that the simulation concept was successful and that the experimental approach is therefore sound.

  20. Applying the Landscape Model to Comprehending Discourse from TV News Stories

    ERIC Educational Resources Information Center

    Lee, Mina; Roskos-Ewoldsen, Beverly; Roskos-Ewoldsen, David R.

    2008-01-01

    The Landscape Model of text comprehension was extended to the comprehension of audiovisual discourse from text and video TV news stories. Concepts from the story were coded for activation after each sequence, creating a matrix of activations that was reduced to a vector of the degree of total activation for each concept. In Study 1, the degree…

  1. Concept For Generation Of Long Pseudorandom Sequences

    NASA Technical Reports Server (NTRS)

    Wang, C. C.

    1990-01-01

    Conceptual very-large-scale integrated (VLSI) digital circuit performs exponentiation in finite field. Algorithm that generates unusually long sequences of pseudorandom numbers executed by digital processor that includes such circuits. Concepts particularly advantageous for such applications as spread-spectrum communications, cryptography, and generation of ranging codes, synthetic noise, and test data, where usually desirable to make pseudorandom sequences as long as possible.
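
    The record above does not give Wang's exponentiation algorithm, but a classical, closely related construction is the maximal-length linear feedback shift register (LFSR) over GF(2): with a primitive feedback polynomial, an n-bit register cycles through all 2^n - 1 nonzero states before repeating. A minimal sketch, with taps chosen for x^4 + x + 1 (a primitive polynomial of degree 4):

```python
def lfsr_period(seed, taps, nbits):
    """Run a Fibonacci LFSR until the state recurs; return the period."""
    state = seed
    period = 0
    while True:
        bit = 0
        for t in taps:                  # XOR of tapped bits = sum in GF(2)
            bit ^= (state >> t) & 1
        state = ((state << 1) | bit) & ((1 << nbits) - 1)
        period += 1
        if state == seed:
            return period

print(lfsr_period(seed=0b0001, taps=(3, 0), nbits=4))  # 15 = 2^4 - 1
```

    The same finite-field structure is what makes such sequences attractive for spread-spectrum ranging codes: the period grows exponentially in register length while the hardware stays tiny.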

  2. The Development of the Concept of "Matter": A Cross-Age Study of How Children Describe Materials

    ERIC Educational Resources Information Center

    Krnel, Dusan; Watson, Rod; Glazar, Sasa A.

    2005-01-01

    The development of the concept of matter was explored by interviewing 84 children aged 3-13 in Slovenia. Children were asked to describe objects and substances placed in front of them. Children's responses were coded and explored for patterns indicating development with age. The patterns of responses indicate that by acting on objects and…

  3. Halftoning Algorithms and Systems.

    DTIC Science & Technology

    1996-08-01

    Subject terms: halftoning algorithms; error diffusion; color printing; topographic maps. In the case of error diffusion algorithms, the calibration procedure using the new centering concept manifests itself as a... Novel Centering Concept for Overlapping Correction, Paper/Transparency (Patent Applied 5/94)... Applications: to error diffusion; to dithering (IS&T...
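
    The record is fragmentary, but its subject is halftoning by error diffusion. As background, a standard Floyd-Steinberg error diffusion pass (not the report's centering-concept calibration) can be sketched as follows; the mid-gray input patch is illustrative:

```python
import numpy as np

def floyd_steinberg(img):
    """Binarize a grayscale image, diffusing quantization error with the
    classic Floyd-Steinberg weights 7, 3, 5, 1 (sixteenths)."""
    img = img.astype(float).copy()
    h, w = img.shape
    out = np.zeros((h, w))
    for y in range(h):
        for x in range(w):
            old = img[y, x]
            new = 255.0 if old >= 128 else 0.0   # threshold to black/white
            out[y, x] = new
            err = old - new
            # push the error onto not-yet-processed neighbors
            if x + 1 < w:
                img[y, x + 1] += err * 7 / 16
            if y + 1 < h:
                if x > 0:
                    img[y + 1, x - 1] += err * 3 / 16
                img[y + 1, x] += err * 5 / 16
                if x + 1 < w:
                    img[y + 1, x + 1] += err * 1 / 16
    return out

gray = np.full((8, 8), 128.0)        # uniform mid-gray patch
halftone = floyd_steinberg(gray)
print(halftone.mean())               # average intensity stays near 128
```

    The output is purely black/white, yet its local average tracks the input gray level; calibration schemes such as the report's adjust how the printed dots realize those averages.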

  4. Exploring Students' Conceptions of Science Learning via Drawing: A Cross-Sectional Analysis

    ERIC Educational Resources Information Center

    Hsieh, Wen-Min; Tsai, Chin-Chung

    2017-01-01

    This cross-sectional study explored students' conceptions of science learning via drawing analysis. A total of 906 Taiwanese students in 4th, 6th, 8th, 10th, and 12th grade were asked to use drawing to illustrate how they conceptualise science learning. Students' drawings were analysed using a coding checklist to determine the presence or absence…

  5. Learning Illustrated: An Exploratory Cross-Sectional Drawing Analysis of Students' Conceptions of Learning

    ERIC Educational Resources Information Center

    Hsieh, Wen-Min; Tsai, Chin-Chung

    2018-01-01

    Using the draw-a-picture technique, the authors explored the learning conceptions held by students across grade levels. A total of 1,067 Taiwanese students in Grades 2, 4, 6, 8, 10, and 12 participated in this study. Participants were asked to use drawing to illustrate how they conceptualize learning. A coding checklist was developed to analyze…

  6. Optical image encryption using QR code and multilevel fingerprints in gyrator transform domains

    NASA Astrophysics Data System (ADS)

    Wei, Yang; Yan, Aimin; Dong, Jiabin; Hu, Zhijuan; Zhang, Jingtao

    2017-11-01

    A new optical image encryption method using a quick response (QR) code and multilevel fingerprint keys in gyrator transform (GT) domains is proposed in this paper. In this method, an original image is first transformed into a QR code, which is placed in the input plane of cascaded GTs. Subsequently, the QR code is encrypted into the cipher-text by using multilevel fingerprint keys. The original image can be obtained easily by reading the high-quality retrieved QR code with hand-held devices. The main parameters used as private keys are the GTs' rotation angles and the multilevel fingerprints. Biometrics and cryptography are thus integrated with each other to improve data security. Numerical simulations are performed to demonstrate the validity and feasibility of the proposed encryption scheme. The method of applying QR codes and fingerprints in GT domains possesses much potential for future information security applications.

  7. On the validation of a code and a turbulence model appropriate to circulation control airfoils

    NASA Technical Reports Server (NTRS)

    Viegas, J. R.; Rubesin, M. W.; Maccormack, R. W.

    1988-01-01

    A computer code for calculating flow about a circulation control airfoil within a wind tunnel test section has been developed. This code is being validated for eventual use as an aid to design such airfoils. The concept of code validation being used is explained. The initial stages of the process have been accomplished. The present code has been applied to a low-subsonic, 2-D flow about a circulation control airfoil for which extensive data exist. Two basic turbulence models and variants thereof have been successfully introduced into the algorithm, the Baldwin-Lomax algebraic and the Jones-Launder two-equation models of turbulence. The variants include adding a history of the jet development for the algebraic model and adding streamwise curvature effects for both models. Numerical difficulties and difficulties in the validation process are discussed. Turbulence model and code improvements to proceed with the validation process are also discussed.

  8. [Standardization of terminology in laboratory medicine I].

    PubMed

    Yoon, Soo Young; Yoon, Jong Hyun; Min, Won Ki; Lim, Hwan Sub; Song, Junghan; Chae, Seok Lae; Lee, Chang Kyu; Kwon, Jung Ah; Lee, Kap No

    2007-04-01

    Standardization of medical terminology is essential for data transmission between health-care institutions or clinical laboratories and for maximizing the benefits of information technology. The purpose of our study was to standardize the medical terms used in the clinical laboratory, such as test names, units, and terms used in result descriptions. During the first year of the study, we developed a standard database of concept names for laboratory terms, which covered the terms used in government health care centers, their branch offices, and primary health care units. Laboratory terms were collected from the electronic data interchange (EDI) codes of the National Health Insurance Corporation (NHIC), the Logical Observation Identifier Names and Codes (LOINC) database, community health centers and their branch offices, and the clinical laboratories of representative university medical centers. For standard expression, we referred to the English-Korean/Korean-English medical dictionary of the Korean Medical Association and the rules for foreign language translation. Programs for mapping between the LOINC DB and EDI codes and for translating English to Korean were developed. A Korean standard laboratory terminology database containing six axial concept names, namely component, property, time aspect, system (specimen), scale type, and method type, was established for 7,508 test observations. Short names and a mapping table for EDI codes and the Unified Medical Language System (UMLS) were added. Synonym tables for concept names, words used in the database, and the six axial terms were prepared to make it easier to find the standard terminology using common terms from the field of laboratory medicine. Here we report for the first time a Korean standard laboratory terminology database for test names, result description terms, and result units covering most laboratory tests in primary healthcare centers.

  9. Color inference in visual communication: the meaning of colors in recycling.

    PubMed

    Schloss, Karen B; Lessard, Laurent; Walmsley, Charlotte S; Foley, Kathleen

    2018-01-01

    People interpret abstract meanings from colors, which makes color a useful perceptual feature for visual communication. This process is complicated, however, because there is seldom a one-to-one correspondence between colors and meanings. One color can be associated with many different concepts (one-to-many mapping) and many colors can be associated with the same concept (many-to-one mapping). We propose that to interpret color-coding systems, people perform assignment inference to determine how colors map onto concepts. We studied assignment inference in the domain of recycling. Participants saw images of colored but unlabeled bins and were asked to indicate which bins they would use to discard different kinds of recyclables and trash. In Experiment 1, we tested two hypotheses for how people perform assignment inference. The local assignment hypothesis predicts that people simply match objects with their most strongly associated color. The global assignment hypothesis predicts that people also account for the association strengths between all other objects and colors within the scope of the color-coding system. Participants discarded objects in bins that optimized the color-object associations of the entire set, which is consistent with the global assignment hypothesis. This sometimes resulted in discarding objects in bins whose colors were weakly associated with the object, even when a more strongly associated option was available. In Experiment 2, we tested different methods for encoding color-coding systems and found that people were better at assignment inference when color sets simultaneously maximized the association strength between assigned color-object pairings while minimizing associations between unassigned pairings. Our study provides an approach for designing intuitive color-coding systems that facilitate communication through visual media such as graphs, maps, signs, and artifacts.
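
    The global assignment hypothesis can be made concrete with a toy computation: choose the one-to-one object-to-color assignment that maximizes the summed association strength over the whole set. The objects, colors, and association values below are invented for illustration, not the study's stimuli.

```python
from itertools import permutations

objects = ["paper", "trash"]
colors = ["white", "blue"]
# invented association strengths between objects and colors
assoc = {("paper", "white"): 0.9, ("paper", "blue"): 0.6,
         ("trash", "white"): 0.3, ("trash", "blue"): 0.2}

# global assignment: maximize the total association over the whole set
best = max(permutations(colors),
           key=lambda perm: sum(assoc[o, c] for o, c in zip(objects, perm)))
print(dict(zip(objects, best)))
# {'paper': 'white', 'trash': 'blue'}
```

    Note that trash is assigned blue even though white is its most strongly associated color: locally matching each object to its strongest color would put both objects in the white bin, while the global optimum resolves the conflict, which mirrors the behavior the experiment observed.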

  10. Literature-based concept profiles for gene annotation: the issue of weighting.

    PubMed

    Jelier, Rob; Schuemie, Martijn J; Roes, Peter-Jan; van Mulligen, Erik M; Kors, Jan A

    2008-05-01

    Text-mining has been used to link biomedical concepts, such as genes or biological processes, to each other for annotation purposes or the generation of new hypotheses. To relate two concepts to each other several authors have used the vector space model, as vectors can be compared efficiently and transparently. Using this model, a concept is characterized by a list of associated concepts, together with weights that indicate the strength of the association. The associated concepts in the vectors and their weights are derived from a set of documents linked to the concept of interest. An important issue with this approach is the determination of the weights of the associated concepts. Various schemes have been proposed to determine these weights, but no comparative studies of the different approaches are available. Here we compare several weighting approaches in a large scale classification experiment. Three different techniques were evaluated: (1) weighting based on averaging, an empirical approach; (2) the log likelihood ratio, a test-based measure; (3) the uncertainty coefficient, an information-theory based measure. The weighting schemes were applied in a system that annotates genes with Gene Ontology codes. As the gold standard for our study we used the annotations provided by the Gene Ontology Annotation project. Classification performance was evaluated by means of the receiver operating characteristics (ROC) curve using the area under the curve (AUC) as the measure of performance. All methods performed well with median AUC scores greater than 0.84, and scored considerably higher than a binary approach without any weighting. Especially for the more specific Gene Ontology codes excellent performance was observed. The differences between the methods were small when considering the whole experiment. However, the number of documents that were linked to a concept proved to be an important variable. 
When larger amounts of texts were available for the generation of the concepts' vectors, the performance of the methods diverged considerably, with the uncertainty coefficient then outperforming the two other methods.
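
    The vector space comparison underlying these concept profiles can be sketched as follows. The concept names and weights are invented for illustration, and cosine similarity stands in for whichever comparison measure an implementation might choose; the weighting schemes compared in the study would determine how such weights are derived from the literature.

```python
import math

def cosine(u, v):
    """Cosine similarity of two sparse concept profiles (dict: concept -> weight)."""
    keys = set(u) | set(v)
    dot = sum(u.get(k, 0.0) * v.get(k, 0.0) for k in keys)
    norm_u = math.sqrt(sum(w * w for w in u.values()))
    norm_v = math.sqrt(sum(w * w for w in v.values()))
    return dot / (norm_u * norm_v)

# hypothetical literature-derived profiles for two genes
gene_a = {"apoptosis": 0.8, "kinase": 0.5, "cell cycle": 0.3}
gene_b = {"apoptosis": 0.6, "cell cycle": 0.7, "membrane": 0.2}
print(round(cosine(gene_a, gene_b), 3))  # 0.739
```

    Sparse dictionaries keep the comparison efficient and transparent: only concepts present in both profiles contribute to the dot product, while every weight contributes to the norms.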

  11. Nurse prescribing ethics and medical marketing.

    PubMed

    Adams, J

    This article suggests that nurse prescribers require an awareness of key concepts in ethics, such as deontology and utilitarianism, in order to reflect on current debates and contribute to them. The principles of biomedical ethics have also been influential in the development of professional codes of conduct. Attention is drawn to the importance of the Association of the British Pharmaceutical Industry's code of practice for the pharmaceutical industry in regulating marketing aimed at prescribers.

  12. Relativistic Klystron Amplifiers Driven by Modulated Intense Relativistic Electron Beams

    DTIC Science & Technology

    1990-04-11

    electrical parameters of the cavity were calculated using the SUPERFISH computer code. We found: (1) that the gap voltage, V, was half as high as the... SUPERFISH computer code and experimenting with various cavities we found the best cavity geometry that fulfilled the above conditions. For this cavity... paths. Experiments along this line are being planned (T. Godlove and F. Mako, private communication). A somewhat different concept which also

  13. Hypochondria as withdrawal and comedy as cure in Dr. Willibald's Der Hypochondrist (1824).

    PubMed

    Potter, Edward T

    2012-01-01

    Balthasar von Ammann's comedy Der Hypochondrist, published in 1824 under the pseudonym Dr. Willibald, foregrounds the social, sexual, and political implications of hypochondria. The play engages with early nineteenth-century medical and popular conceptions of hypochondria to co-opt potentially subversive elements and to promote a specific social, sexual, and political agenda. The text promotes literature — specifically comedic drama — as a cure for hypochondria. Hypochondria functions as a code for withdrawal. The hypochondriac withdraws medically from healthy society, gaining exceptional status. He withdraws sexually from society by remaining a bachelor, possibly engaged in non-normative sexual behaviour. Furthermore, the politically disenfranchised protagonist voices his political frustrations via a coded medical metaphor. The hypochondriac poses a threefold challenge to the social, sexual, and political order, and the play engages with contemporary conceptions of the disease to provide the solution: comedy. The text, presented as a cure for hypochondria, replaces the coded questioning of the social order via hypochondria with the less threatening code of heraldry. A comedy-within-the-comedy uses the hypochondriac's love of heraldry to cure him, resulting in the elimination of his medical problems and exceptional status, in the purification of his bachelorhood from non-normative elements, and in the pre-emption of political frustrations.

  14. Water cycle algorithm: A detailed standard code

    NASA Astrophysics Data System (ADS)

    Sadollah, Ali; Eskandar, Hadi; Lee, Ho Min; Yoo, Do Guen; Kim, Joong Hoon

    Inspired by the observation of the water cycle process and movements of rivers and streams toward the sea, a population-based metaheuristic algorithm, the water cycle algorithm (WCA) has recently been proposed. Lately, an increasing number of WCA applications have appeared and the WCA has been utilized in different optimization fields. This paper provides detailed open source code for the WCA, of which the performance and efficiency has been demonstrated for solving optimization problems. The WCA has an interesting and simple concept and this paper aims to use its source code to provide a step-by-step explanation of the process it follows.
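
    The paper's point is that the WCA is simple enough to explain step by step. The following is a heavily simplified, illustrative sketch of the core loop (streams flowing toward the best solution, the "sea", with evaporation and raining acting as random restarts), not the authors' published standard code; all parameters and the sphere test function are invented for illustration.

```python
import random

def sphere(x):
    """Test objective: sum of squares, minimum 0 at the origin."""
    return sum(v * v for v in x)

def wca_sketch(dim=2, n_pop=20, iters=200, d_max=1e-3, seed=0):
    rng = random.Random(seed)
    pop = [[rng.uniform(-5, 5) for _ in range(dim)] for _ in range(n_pop)]
    sea = min(pop, key=sphere)               # best solution so far
    for _ in range(iters):
        for i, stream in enumerate(pop):
            # stream flows a random fraction of the way toward the sea
            c = rng.uniform(0, 2)
            pop[i] = [s + c * (b - s) for s, b in zip(stream, sea)]
            if sphere(pop[i]) < sphere(sea):
                sea = pop[i]                 # stream becomes the new sea
            elif sum(abs(s - b) for s, b in zip(pop[i], sea)) < d_max:
                # evaporation + raining: re-seed streams that reached the sea
                pop[i] = [rng.uniform(-5, 5) for _ in range(dim)]
    return sea

best = wca_sketch()
print(sphere(best))   # small: the population has converged near the optimum
```

    The evaporation/raining step is what distinguishes the water-cycle metaphor from plain hill climbing: converged streams are recycled as fresh random starts, preserving exploration.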

  15. Flowgen: Flowchart-based documentation for C++ codes

    NASA Astrophysics Data System (ADS)

    Kosower, David A.; Lopez-Villarejo, J. J.

    2015-11-01

    We present the Flowgen tool, which generates flowcharts from annotated C++ source code. The tool generates a set of interconnected high-level UML activity diagrams, one for each function or method in the C++ sources. It provides a simple and visual overview of complex implementations of numerical algorithms. Flowgen is complementary to the widely used Doxygen documentation tool. The ultimate aim is to render complex C++ computer codes accessible and to enhance collaboration between programmers and algorithm or science specialists. We describe the tool and a proof-of-concept application to the VINCIA plug-in for simulating collisions at CERN's Large Hadron Collider.

  16. [Global aspects of medical ethics: conditions and possibilities].

    PubMed

    Neitzke, G

    2001-01-01

    A global or universal code of medical ethics seems paradoxical in the era of pluralism and postmodernism. A different conception of globalisation will be developed here in terms of a "procedural universality". According to this philosophical concept, a code of medical ethics does not oblige physicians to accept certain specific, preset, universal values and rules. Rather, it obliges every culture and society to start a culture-sensitive, continuous, and active discourse on the specific issues mentioned in the code. This procedure might result in a regional, intra-cultural consensus, which should then be presented to an inter-cultural dialogue. To exemplify this procedure, current topics of medical ethics (spiritual foundations of medicine, autonomy, definitions concerning life and death, physicians' duties, conduct within therapeutic teams) are discussed from the point of view of western medicine.

  17. The ASSERT Virtual Machine Kernel: Support for Preservation of Temporal Properties

    NASA Astrophysics Data System (ADS)

    Zamorano, J.; de la Puente, J. A.; Pulido, J. A.; Urueña

    2008-08-01

    A new approach to building embedded real-time software has been developed in the ASSERT project. One of its key elements is the concept of a virtual machine preserving the non-functional properties of the system, and especially its real-time properties, all the way from high-level design models down to executable code. The paper describes one instance of the virtual machine concept that provides support for the preservation of temporal properties both at the source code level, by accepting only "legal" entities, i.e., software components with statically analysable real-time behaviour, and at run time, by monitoring the temporal behaviour of the system. The virtual machine has been validated on several pilot projects carried out by aerospace companies in the framework of the ASSERT project.

  18. Applying a Force and Motion Learning Progression over an Extended Time Span Using the Force Concept Inventory

    ERIC Educational Resources Information Center

    Fulmer, Gavin W.; Liang, Ling L.; Liu, Xiufeng

    2014-01-01

    This exploratory study applied a proposed force and motion learning progression (LP) to high-school and university students and to content involving both one- and two-dimensional force and motion situations. The Force Concept Inventory (FCI) was adapted, based on a previous content analysis and coding of the questions in the FCI in terms of the…

  19. Analysis of Physical Science Textbooks for Conceptual Frameworks on Acids, Bases and Neutralization: Implications for Students' Conceptual Understanding.

    ERIC Educational Resources Information Center

    Erduran, Sibel

    Eight physical science textbooks were analyzed for coverage on acids, bases, and neutralization. At the level of the text, clarity and coherence of statements were investigated. The conceptual framework for this topic was represented in a concept map which was used as a coding tool for tracing concepts and links present in textbooks. Cognitive…

  20. Death of a dogma: eukaryotic mRNAs can code for more than one protein.

    PubMed

    Mouilleron, Hélène; Delcourt, Vivian; Roucou, Xavier

    2016-01-08

    mRNAs carry the genetic information that is translated by ribosomes. The traditional view of a mature eukaryotic mRNA is a molecule with three main regions, the 5' UTR, the protein coding open reading frame (ORF) or coding sequence (CDS), and the 3' UTR. This concept assumes that ribosomes translate one ORF only, generally the longest one, and produce one protein. As a result, in the early days of genomics and bioinformatics, one CDS was associated with each protein-coding gene. This fundamental concept of a single CDS is being challenged by increasing experimental evidence indicating that annotated proteins are not the only proteins translated from mRNAs. In particular, mass spectrometry (MS)-based proteomics and ribosome profiling have detected productive translation of alternative open reading frames. In several cases, the alternative and annotated proteins interact. Thus, the expression of two or more proteins translated from the same mRNA may offer a mechanism to ensure the co-expression of proteins which have functional interactions. Translational mechanisms already described in eukaryotic cells indicate that the cellular machinery is able to translate different CDSs from a single viral or cellular mRNA. In addition to summarizing data showing that the protein coding potential of eukaryotic mRNAs has been underestimated, this review aims to challenge the single translated CDS dogma.
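
    The alternative-ORF idea can be made concrete with a toy scanner that reports every ORF (start codon ATG to the first in-frame stop) in all three reading frames of an mRNA-like sequence; the short sequence below is invented for illustration.

```python
STOPS = {"TAA", "TAG", "TGA"}

def find_orfs(seq, min_codons=2):
    """Return every ORF (ATG .. first in-frame stop) in all three frames."""
    orfs = []
    for frame in range(3):
        i = frame
        while i + 3 <= len(seq):
            if seq[i:i + 3] == "ATG":
                # walk codon by codon until an in-frame stop
                for j in range(i + 3, len(seq) - 2, 3):
                    if seq[j:j + 3] in STOPS:
                        if (j - i) // 3 >= min_codons:
                            orfs.append(seq[i:j + 3])
                        break
            i += 3
    return orfs

mrna = "ATGGCATGCTAAATGAAATGA"
print(find_orfs(mrna))
# ['ATGGCATGCTAA', 'ATGAAATGA'] -- one transcript, two complete ORFs
```

    Even this tiny example yields more than one complete ORF from a single transcript, which is exactly the annotation problem the review describes: proteomics and ribosome profiling decide which of these candidates are productively translated.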

  1. NASA. Lewis Research Center Advanced Modulation and Coding Project: Introduction and overview

    NASA Technical Reports Server (NTRS)

    Budinger, James M.

    1992-01-01

    The Advanced Modulation and Coding Project at LeRC is sponsored by the Office of Space Science and Applications, Communications Division, Code EC, at NASA Headquarters and conducted by the Digital Systems Technology Branch of the Space Electronics Division. Advanced Modulation and Coding is one of three focused technology development projects within the branch's overall Processing and Switching Program. The program consists of industry contracts for developing proof-of-concept (POC) and demonstration model hardware, university grants for analyzing advanced techniques, and in-house integration and testing of performance verification and systems evaluation. The Advanced Modulation and Coding Project is broken into five elements: (1) bandwidth- and power-efficient modems; (2) high-speed codecs; (3) digital modems; (4) multichannel demodulators; and (5) very high-data-rate modems. At least one contract and one grant were awarded for each element.

  2. A New Image Encryption Technique Combining Hill Cipher Method, Morse Code and Least Significant Bit Algorithm

    NASA Astrophysics Data System (ADS)

    Nofriansyah, Dicky; Defit, Sarjon; Nurcahyo, Gunadi W.; Ganefri, G.; Ridwan, R.; Saleh Ahmar, Ansari; Rahim, Robbi

    2018-01-01

    Cybercrime is one of the most serious threats to data security. One effort to reduce cybercrime is to find new techniques for securing data, such as combinations of cryptography, steganography, and watermarking. Cryptography and steganography are growing areas of data security science, and combining them is one way to improve data integrity. New techniques can be built by combining several algorithms; one such technique is the incorporation of the Hill cipher method and Morse code. Morse code is one of the communication codes used in the Scouting field and consists of dots and dashes. The result is a concept that joins modern and classical methods to maintain data integrity. The combination of these three methods is expected to yield a new algorithm that improves data security, especially for images.
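
    Of the three combined methods, the Hill cipher step is the easiest to make concrete: plaintext letter pairs are multiplied by a key matrix modulo 26. The sketch below uses the common textbook 2x2 key rather than any key from the paper; the key matrix must be invertible mod 26 for decryption to work, and the plaintext length is assumed even.

```python
import numpy as np

# Textbook Hill cipher key; det = 9, which is coprime to 26, so the
# matrix is invertible mod 26 and the cipher is decryptable.
KEY = np.array([[3, 3],
                [2, 5]])

def hill_encrypt(text, key):
    """Encrypt an even-length uppercase string with a 2x2 Hill cipher."""
    nums = [ord(c) - ord('A') for c in text]
    out = []
    for i in range(0, len(nums), 2):
        pair = np.array(nums[i:i + 2])
        out.extend((key @ pair) % 26)     # matrix-vector product mod 26
    return "".join(chr(int(n) + ord('A')) for n in out)

print(hill_encrypt("HELP", KEY))  # HIAT
```

    In the combined scheme described above, the Hill cipher output would then be re-encoded (e.g., via Morse code) before being hidden in an image with the least significant bit technique.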

  3. Mapping the Content of the Patient Reported Outcomes Measurement Information System (PROMIS®) Using the International Classification of Functioning, Health and Disability

    PubMed Central

    Tucker, Carole A; Escorpizo, Reuben; Cieza, Alarcos; Lai, Jin Shei; Stucki, Gerold; Ustun, T. Bedirhan; Kostanjsek, Nenad; Cella, David; Forrest, Christopher B.

    2014-01-01

    Background The Patient Reported Outcomes Measurement Information System (PROMIS®) is a U.S. National Institutes of Health initiative that has produced self-reported item banks for physical, mental, and social health. Objective To describe the content of PROMIS at the item level using the World Health Organization’s International Classification of Functioning, Disability and Health (ICF). Methods All PROMIS adult items (publicly available as of 2012) were assigned to relevant ICF concepts. The content of the PROMIS adult item banks was then described using the mapped ICF code descriptors. Results The 1006 items in the PROMIS instruments could all be mapped to ICF concepts at the second level of classification, with the exception of 3 items of global or general health that mapped across the first-level classification of the ICF activity and participation component (d categories). Individual PROMIS item banks mapped to 1 to 5 separate ICF codes, indicating one-to-one, one-to-many and many-to-one mappings between PROMIS item banks and ICF second-level classification codes. PROMIS supports measurement of the majority of major concepts in the ICF Body Functions (b) and Activity & Participation (d) components using PROMIS item banks or subsets of PROMIS items that could, with care, be used to develop customized instruments. Given that the focus of PROMIS is on the measurement of person health outcomes, concepts in body structures (s) and some body functions (b), as well as many ICF environmental factors, have minimal coverage in PROMIS. Discussion The PROMIS-ICF mapped items provide a basis for users to evaluate the ICF-related content of specific PROMIS instruments, and to select PROMIS instruments in ICF-based measurement applications. PMID:24760532

  4. Adapting a Clinical Data Repository to ICD-10-CM through the use of a Terminology Repository

    PubMed Central

    Cimino, James J.; Remennick, Lyubov

    2014-01-01

    Clinical data repositories frequently contain patient diagnoses coded with the International Classification of Diseases, Ninth Revision (ICD-9-CM). These repositories now need to accommodate data coded with the Tenth Revision (ICD-10-CM). Database users wish to retrieve relevant data regardless of the system by which they are coded. We demonstrate how a terminology repository (the Research Entities Dictionary or RED) serves as an ontology relating terms of both ICD versions to each other to support seamless version-independent retrieval from the Biomedical Translational Research Information System (BTRIS) at the National Institutes of Health. We make use of the Center for Medicare and Medicaid Services’ General Equivalence Mappings (GEMs) to reduce the modeling effort required to determine whether ICD-10-CM terms should be added to the RED as new concepts or as synonyms of existing concepts. A divide-and-conquer approach is used to develop integration heuristics that offer a satisfactory interim solution and facilitate additional refinement of the integration as time and resources allow. PMID:25954344
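    The version-independent retrieval idea can be sketched as below; the concept identifiers and the tiny code-to-concept mapping are hypothetical stand-ins for RED concepts and GEMs-derived synonymy, not actual BTRIS data.

```python
# Hypothetical sketch: ICD-9-CM and ICD-10-CM codes naming the same disorder
# map to one internal concept ID (stand-ins for RED concepts), so a query
# phrased in either coding system retrieves the same records.
concept_of = {
    "250.00": "concept:diabetes-type2",   # ICD-9-CM
    "E11.9":  "concept:diabetes-type2",   # ICD-10-CM equivalent
    "401.9":  "concept:hypertension",     # ICD-9-CM
    "I10":    "concept:hypertension",     # ICD-10-CM equivalent
}

def retrieve(records, query_code):
    """Return records whose diagnosis shares a concept with the query code."""
    target = concept_of.get(query_code)
    if target is None:
        return []
    return [r for r in records if concept_of.get(r["code"]) == target]

records = [{"patient": 1, "code": "250.00"}, {"patient": 2, "code": "I10"}]
print(retrieve(records, "E11.9"))  # finds the ICD-9-coded record as well
```

    The design point is that version reconciliation happens once, at terminology-integration time, rather than in every query.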

  5. Optical information encryption based on incoherent superposition with the help of the QR code

    NASA Astrophysics Data System (ADS)

    Qin, Yi; Gong, Qiong

    2014-01-01

    In this paper, a novel optical information encryption approach is proposed with the help of the QR code. The method is based on the concept of incoherent superposition, which we introduce for the first time. The information to be encrypted is first transformed into the corresponding QR code, and thereafter the QR code is further encrypted into two phase-only masks analytically, by use of the intensity superposition of two diffraction wave fields. The proposed method has several advantages over the previous interference-based method, such as a higher security level, better robustness against noise attack, and a more relaxed working condition. Numerical simulation results and results collected with an actual smartphone are shown to validate our proposal.

  6. LDPC-coded MIMO optical communication over the atmospheric turbulence channel using Q-ary pulse-position modulation.

    PubMed

    Djordjevic, Ivan B

    2007-08-06

    We describe a coded power-efficient transmission scheme based on the repetition MIMO principle suitable for communication over the atmospheric turbulence channel, and determine its channel capacity. The proposed scheme employs Q-ary pulse-position modulation. We further study how to approach the channel capacity limits using low-density parity-check (LDPC) codes. Component LDPC codes are designed using the concept of pairwise-balanced designs. Contrary to several recent publications, bit-error rates and channel capacities are reported assuming non-ideal photodetection. The atmospheric turbulence channel is modeled using the Gamma-Gamma distribution function due to Al-Habash et al. Excellent bit-error rate performance improvement over the uncoded case is found.

  7. Evaluation of natural language processing from emergency department computerized medical records for intra-hospital syndromic surveillance

    PubMed Central

    2011-01-01

    Background The identification of patients who pose an epidemic hazard when they are admitted to a health facility plays a role in preventing the risk of hospital-acquired infection. An automated clinical decision support system to detect suspected cases, based on the principle of syndromic surveillance, is being developed at the University of Lyon's Hôpital de la Croix-Rousse. This tool will analyse structured data and narrative reports from computerized emergency department (ED) medical records. The first step consists of developing an application (UrgIndex) which automatically extracts and encodes information found in narrative reports. The purpose of the present article is to describe and evaluate this natural language processing system. Methods Narrative reports have to be pre-processed before utilizing the French-language medical multi-terminology indexer (ECMT) for standardized encoding. UrgIndex identifies and excludes syntagmas containing a negation and replaces non-standard terms (abbreviations, acronyms, spelling errors...). Then, the phrases are sent to the ECMT through an Internet connection. The indexer's reply, based on Extensible Markup Language, returns codes and literals corresponding to the concepts found in phrases. UrgIndex filters codes corresponding to suspected infections. Recall is defined as the number of relevant processed medical concepts divided by the number of concepts evaluated (coded manually by the medical epidemiologist). Precision is defined as the number of relevant processed concepts divided by the number of concepts proposed by UrgIndex. Recall and precision were assessed for respiratory and cutaneous syndromes. Results Evaluation of 1,674 processed medical concepts contained in 100 ED medical records (50 for respiratory syndromes and 50 for cutaneous syndromes) showed an overall recall of 85.8% (95% CI: 84.1-87.3). Recall varied from 84.5% for respiratory syndromes to 87.0% for cutaneous syndromes. The most frequent cause of lack of processing was non-recognition of the term by UrgIndex (9.7%). Overall precision was 79.1% (95% CI: 77.3-80.8). It varied from 81.4% for respiratory syndromes to 77.0% for cutaneous syndromes. Conclusions This study demonstrates the feasibility of and interest in developing an automated method for extracting and encoding medical concepts from ED narrative reports, the first step required for the detection of potentially infectious patients at epidemic risk. PMID:21798029
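    The recall and precision definitions used in this evaluation translate directly into code; the counts passed in below are illustrative round numbers, not the study's raw data.

```python
# Metrics as defined in the abstract. The example counts are illustrative.
def recall(relevant_processed: int, concepts_evaluated: int) -> float:
    """Relevant processed concepts / concepts coded manually by the expert."""
    return relevant_processed / concepts_evaluated

def precision(relevant_processed: int, concepts_proposed: int) -> float:
    """Relevant processed concepts / concepts proposed by the system."""
    return relevant_processed / concepts_proposed

print(recall(858, 1000), precision(858, 1085))
```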

  8. Geometric Processing and Its Relational Graphics

    DTIC Science & Technology

    1976-10-01

    Keywords: Graphics; GIFT. ...are typified by defining an object as a series of adjacent triangular or rectangular patches or surfaces (ruled surfaces may also be used). The GIFT code embodies the Patch code concept in one of its solids, the ARS; however, a many-faceted GIFT solid takes longer to process than its

  9. High Performance Object-Oriented Scientific Programming in Fortran 90

    NASA Technical Reports Server (NTRS)

    Norton, Charles D.; Decyk, Viktor K.; Szymanski, Boleslaw K.

    1997-01-01

    We illustrate how Fortran 90 supports object-oriented concepts by example of plasma particle computations on the IBM SP. Our experience shows that Fortran 90 and object-oriented methodology give high performance while providing a bridge from Fortran 77 legacy codes to modern programming principles. All of our object-oriented Fortran 90 codes execute more quickly than the equivalent C++ versions, yet the abstraction modelling capabilities used for scientific programming are comparably powerful.

  10. Thin family: a new barcode concept

    NASA Astrophysics Data System (ADS)

    Allais, David C.

    1991-02-01

    This paper describes a new space-efficient family of thin bar code symbologies which are appropriate for representing small amounts of information. The proposed structure is 30 to 50 percent more compact than the narrowest existing bar code when 12 or fewer bits of information are to be encoded in each symbol. Potential applications for these symbologies include menus, catalogs, automated test and survey scoring, and biological research such as the tracking of honey bees.

  11. Coding Gains for Rank Decoding

    DTIC Science & Technology

    1990-02-01

    U.S. Army Laboratory Command, Ballistic Research Laboratory, Aberdeen Proving Ground, Maryland. Contents: 1. Soft Decision Concepts; 2. Coding Gain; 3. ...

  12. Application of Advanced Concepts and Techniques in Electromagnetic Topology Based Simulations: CRIPTE and Related Codes

    DTIC Science & Technology

    2008-12-01

    multiconductor transmission line theory. The per-unit capacitance, inductance, and characteristic impedance matrices generated from the companion LAPLACE... code based on the Method of Moments application, by meshing different sections of the multiconductor cable for capacitance and inductance matrices [21... conductors held together in four pairs and resided in the cable jacket. Each of eight conductors was also designed with the per-unit-length resistance

  13. Spotted star mapping by light curve inversion: Tests and application to HD 12545

    NASA Astrophysics Data System (ADS)

    Kolbin, A. I.; Shimansky, V. V.

    2013-06-01

    A code for mapping the surfaces of spotted stars is developed. The concept of the code is to analyze rotationally modulated light curves. We simulate the reconstruction of the stellar surface and present the results of the simulation. The reconstruction artifacts caused by the ill-posed nature of the problem are identified. The surface of the spotted component of the system HD 12545 is mapped using this procedure.

  14. Research into language concepts for the mission control center

    NASA Technical Reports Server (NTRS)

    Dellenback, Steven W.; Barton, Timothy J.; Ratner, Jeremiah M.

    1990-01-01

    A final report is given on research into language concepts for the Mission Control Center (MCC). The Specification Driven Language research is described. The state of the image processing field and how image processing techniques could be applied toward automating the generation of the language known as COmputation Development Environment (CODE or Comp Builder) are discussed. Also described is the development of a flight certified compiler for Comps.

  15. A Ceramic Fracture Model for High Velocity Impact

    DTIC Science & Technology

    1993-05-01

    employ damage concepts appear more relevant than crack growth models for this application. This research adopts existing fracture model concepts and... extends them through applications in an existing finite element continuum mechanics code (hydrocode) to the prediction of the damage and fracture processes... to be accurate in the lower velocity range of this work. Mescall and Tracy [15] investigated the selection of ceramic material for application in armors

  16. Developing Trustworthy Commissioned Officers: Transcending the Honor Codes and Concept

    DTIC Science & Technology

    2012-10-01

    extracurricular activities). This developmental concept recognizes that individuals... tangible activities within the developmental programs at each SOC must be designed and implemented... develop simultaneously across and within all domains as they complete the activities

  17. A study of concept-based similarity approaches for recommending program examples

    NASA Astrophysics Data System (ADS)

    Hosseini, Roya; Brusilovsky, Peter

    2017-07-01

    This paper investigates a range of concept-based example recommendation approaches that we developed to provide example-based problem-solving support in the domain of programming. The goal of these approaches is to offer students a set of the most relevant remedial examples when they have trouble solving a code comprehension problem, where students examine a program code to determine its output or the final value of a variable. In this paper, we use the ideas of semantic-level similarity-based linking developed in the area of intelligent hypertext to generate examples for the given problem. To determine the best-performing approach, we explored two groups of similarity approaches for selecting examples: non-structural approaches focusing on examples that are similar to the problem in terms of concept coverage, and structural approaches focusing on examples that are similar to the problem by the structure of the content. We also explored the value of personalized example recommendation based on students' knowledge levels and the learning goal of the exercise. The paper presents the concept-based similarity approaches that we developed, explains the data collection studies and reports the results of the comparative analysis. The results of our analysis showed better ranking performance for the personalized structural variant of the cosine similarity approach.
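    A minimal sketch of the non-structural variant, assuming bag-of-concepts vectors compared by plain cosine similarity; the concept names and counts below are invented for illustration.

```python
import math
from collections import Counter

# Hypothetical sketch: represent a problem and candidate examples as
# concept-count vectors, then rank examples by cosine similarity.
def cosine(u: Counter, v: Counter) -> float:
    dot = sum(u[k] * v[k] for k in u)
    nu = math.sqrt(sum(c * c for c in u.values()))
    nv = math.sqrt(sum(c * c for c in v.values()))
    return dot / (nu * nv) if nu and nv else 0.0

problem = Counter({"for-loop": 2, "array": 1, "assignment": 3})
examples = {
    "ex1": Counter({"for-loop": 1, "array": 2, "assignment": 1}),
    "ex2": Counter({"recursion": 2, "assignment": 1}),
}
ranked = sorted(examples, key=lambda e: cosine(problem, examples[e]),
                reverse=True)
print(ranked)  # ex1 shares more concept coverage, so it ranks first
```

    The personalized structural variant would additionally weight concepts by the student's knowledge levels and compare code structure, which this sketch omits.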

  18. Script, code, information: how to differentiate analogies in the "prehistory" of molecular biology.

    PubMed

    Kogge, Werner

    2012-01-01

    The remarkable fact that twentieth-century molecular biology developed its conceptual system on the basis of sign-like terms has been the object of numerous studies and debates. Throughout these, the assumption is made that this vocabulary's emergence should be seen in the historical context of mathematical communication theory and cybernetics. This paper, in contrast, sets out the need for a more differentiated view: whereas the success of the terms "code" and "information" would probably be unthinkable outside that historical context, general semiotic and especially scriptural concepts arose far earlier in the "prehistory" of molecular biology, and in close association with biological research and phenomena. This distinction, established through a reconstruction of conceptual developments between 1870 and 1950, makes it possible to separate off a critique of the reductive implications of particular information-based concepts from the use of semiotic and scriptural concepts, which is fundamental to molecular biology. Gene-centrism and determinism are not implications of semiotic and scriptural analogies, but arose only when the vocabulary of information was superimposed upon them.

  19. RB-ARD: A proof of concept rule-based abort

    NASA Technical Reports Server (NTRS)

    Smith, Richard; Marinuzzi, John

    1987-01-01

    The Abort Region Determinator (ARD) is a console program in the space shuttle mission control center. During shuttle ascent, the Flight Dynamics Officer (FDO) uses the ARD to determine the possible abort modes and make abort calls for the crew. The goal of the Rule-Based Abort Region Determinator (RB-ARD) project was to test the concept of providing an onboard ARD for the shuttle or an automated ARD for the mission control center (MCC). A proof-of-concept rule-based system was developed on an LMI Lambda computer using PICON, a knowledge-based system shell. Knowledge derived from documented flight rules and ARD operating procedures was coded in PICON rules. These rules, in conjunction with modules of conventional code, enable the RB-ARD to carry out key parts of the ARD task. Current capabilities of the RB-ARD include: continuous updating of the available abort modes, recognition of a limited number of main engine faults, and recommendation of safing actions. Safing actions recommended by the RB-ARD concern the Space Shuttle Main Engine (SSME) limit shutdown system and powerdown of the SSME AC buses.

  20. Discovering Related Clinical Concepts Using Large Amounts of Clinical Notes

    PubMed Central

    Ganesan, Kavita; Lloyd, Shane; Sarkar, Vikren

    2016-01-01

    The ability to find highly related clinical concepts is essential for many applications, such as hypothesis generation, query expansion for medical literature search, search results filtering, ICD-10 code filtering and many others. While manually constructed medical terminologies such as SNOMED CT can surface certain related concepts, these terminologies are inadequate as they depend on the expertise of several subject matter experts, making the terminology curation process open to geographic and language bias. In addition, these terminologies provide no quantifiable evidence of how related the concepts are. In this work, we explore an unsupervised graphical approach to mine related concepts by leveraging the volume of information within large amounts of clinical notes. Our evaluation shows that we are able to use a data-driven approach to discover highly related concepts for various search terms, including medications, symptoms and diseases. PMID:27656096
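    A minimal sketch of the co-occurrence idea behind such graph approaches, assuming each note reduces to a set of extracted concepts and that edge weight is a raw co-occurrence count; the notes and concepts below are invented, and the abstract's actual weighting scheme may differ.

```python
from collections import Counter
from itertools import combinations

# Hypothetical sketch: build a concept graph whose edge weights count how
# often two concepts appear in the same clinical note.
notes = [
    {"metformin", "diabetes", "neuropathy"},
    {"metformin", "diabetes"},
    {"insulin", "diabetes"},
]

edges = Counter()
for note in notes:
    for a, b in combinations(sorted(note), 2):
        edges[(a, b)] += 1  # undirected edge, canonical (sorted) order

# Strongest edges = most related concept pairs, with a quantifiable weight.
related = sorted(((w, pair) for pair, w in edges.items()), reverse=True)
print(related[0])  # ('diabetes', 'metformin') co-occur most often
```

    Unlike a curated terminology, the edge weight gives quantifiable evidence of relatedness, and the graph grows automatically with the note corpus.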

  1. A 3D-CFD code for accurate prediction of fluid flows and fluid forces in seals

    NASA Technical Reports Server (NTRS)

    Athavale, M. M.; Przekwas, A. J.; Hendricks, R. C.

    1994-01-01

    Current and future turbomachinery requires advanced seal configurations to control leakage, inhibit mixing of incompatible fluids and control the rotordynamic response. In recognition of a deficiency in the existing predictive methodology for seals, a seven-year effort was established in 1990 by NASA's Office of Aeronautics, Exploration and Technology, under the Earth-to-Orbit Propulsion program, to develop validated Computational Fluid Dynamics (CFD) concepts, codes and analyses for seals. The effort will provide NASA and the U.S. Aerospace Industry with advanced CFD scientific codes and industrial codes for analyzing and designing turbomachinery seals. An advanced 3D CFD cylindrical seal code has been developed, incorporating state-of-the-art computational methodology for flow analysis in straight, tapered and stepped seals. Relevant computational features of the code include: stationary/rotating coordinates, cylindrical and general Body Fitted Coordinates (BFC) systems, high order differencing schemes, colocated variable arrangement, advanced turbulence models, incompressible/compressible flows, and moving grids. This paper presents the current status of code development, code demonstration for predicting rotordynamic coefficients, a numerical parametric study of entrance loss coefficients for generic annular seals, and plans for code extensions to labyrinth, damping, and other seal configurations.

  2. Profugus

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Evans, Thomas; Hamilton, Steven; Slattery, Stuart

    Profugus is an open-source mini-application (mini-app) for radiation transport and reactor applications. It contains the fundamental computational kernels used in the Exnihilo code suite from Oak Ridge National Laboratory. However, Exnihilo is production code with a substantial user base, and it is export controlled. This makes collaboration with computer scientists and computer engineers difficult. Profugus is designed to bridge that gap. By encapsulating the core numerical algorithms in an abbreviated code base that is open-source, computer scientists can analyze the algorithms and easily make code-architectural changes to test performance without compromising the production code values of Exnihilo. Profugus is not meant to be production software with respect to problem analysis. The computational kernels in Profugus are designed to analyze performance, not correctness. Nonetheless, users of Profugus can set up and run problems with enough real-world features to be useful as proof-of-concept for actual production work.

  3. Addition of equilibrium air to an upwind Navier-Stokes code and other first steps toward a more generalized flow solver

    NASA Technical Reports Server (NTRS)

    Rosen, Bruce S.

    1991-01-01

    An upwind three-dimensional volume Navier-Stokes code is modified to facilitate modeling of complex geometries and flow fields represented by proposed National Aerospace Plane concepts. Code enhancements include an equilibrium air model, a generalized equilibrium gas model and several schemes to simplify treatment of complex geometric configurations. The code is also restructured for inclusion of an arbitrary number of independent and dependent variables. This latter capability is intended for eventual use to incorporate nonequilibrium/chemistry gas models, more sophisticated turbulence and transition models, or other physical phenomena which will require inclusion of additional variables and/or governing equations. Comparisons of computed results with experimental data and results obtained using other methods are presented for code validation purposes. Good correlation is obtained for all of the test cases considered, indicating the success of the current effort.

  4. Development of structured ICD-10 and its application to computer-assisted ICD coding.

    PubMed

    Imai, Takeshi; Kajino, Masayuki; Sato, Megumi; Ohe, Kazuhiko

    2010-01-01

    This paper presents: (1) a framework for the formal representation of ICD-10, which functions as a bridge between ontological information and natural language expressions; and (2) a methodology for using formally described ICD-10 for computer-assisted ICD coding. First, we analyzed and structured the meanings of categories in 15 chapters of ICD-10. Then we expanded the structured ICD-10 (S-ICD10) by adding subordinate concepts and labels derived from Japanese Standard Disease Names. The information model used to describe the formal representation was refined repeatedly; the resultant model includes 74 types of semantic links. We also developed an ICD coding module based on S-ICD10 and a 'Coding Principle,' which achieved high accuracy (>70%) for four chapters. These results not only demonstrate the basic feasibility of our coding framework but may also inform the development of the information model for the formal description framework in the ICD-11 revision.

  5. Extension of analog network coding in wireless information exchange

    NASA Astrophysics Data System (ADS)

    Chen, Cheng; Huang, Jiaqing

    2012-01-01

    Ever since the concept of analog network coding (ANC) was put forward by S. Katti, much attention has been focused on how to utilize analog network coding to take advantage of wireless interference, which used to be considered generally harmful, to improve throughput performance. Previously, only the case of two nodes that need to exchange information had been fully discussed, while the issue of extending analog network coding to three or more nodes remained undeveloped. In this paper, we propose a practical transmission scheme to extend analog network coding to more than two nodes that need to exchange information among themselves. We start with the case of three nodes that need to exchange information and demonstrate that, by utilizing our algorithm, throughput can achieve a 33% and 20% increase compared with that of traditional transmission scheduling and digital network coding, respectively. Then, we generalize the algorithm so that it fits scenarios with any number of nodes. We also discuss some technical issues, throughput analysis, and the bit error rate.
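    For contrast, the digital network coding baseline that the 20% figure is measured against can be sketched in a few lines: a relay XORs the two packets so that a three-slot exchange replaces the four slots of traditional scheduling (the packet values here are arbitrary).

```python
# Digital network coding baseline (not ANC itself): nodes A and B each send
# their packet to the relay (slots 1-2); the relay broadcasts the XOR
# (slot 3), and each node cancels its own packet to recover the other's.
a_pkt, b_pkt = 0b1011, 0b0110
relay_broadcast = a_pkt ^ b_pkt

recovered_at_a = relay_broadcast ^ a_pkt  # A recovers B's packet
recovered_at_b = relay_broadcast ^ b_pkt  # B recovers A's packet
print(recovered_at_a == b_pkt, recovered_at_b == a_pkt)  # True True
```

    ANC goes one step further by letting A and B transmit simultaneously, so the relay captures the superposed analog signal in a single slot instead of decoding two separate packets.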

  6. Soapy: an adaptive optics simulation written purely in Python for rapid concept development

    NASA Astrophysics Data System (ADS)

    Reeves, Andrew

    2016-07-01

    Soapy is a newly developed Adaptive Optics (AO) simulation which aims to be a flexible and fast-to-use tool-kit for many applications in the field of AO. It is written purely in the Python language, adding to and taking advantage of the already rich ecosystem of scientific libraries and programs. The simulation has been designed to be extremely modular, such that each component can be used stand-alone for projects which do not require a full end-to-end simulation. Ease of use, modularity and code clarity have been prioritised at the expense of computational performance. Though this means the code is not yet suitable for large studies of Extremely Large Telescope AO systems, it is well suited to education, exploration of new AO concepts and investigations of current generation telescopes.

  7. Labview Interface Concepts Used in NASA Scientific Investigations and Virtual Instruments

    NASA Technical Reports Server (NTRS)

    Roth, Don J.; Parker, Bradford H.; Rapchun, David A.; Jones, Hollis H.; Cao, Wei

    2001-01-01

    This article provides an overview of several software control applications developed for NASA using LabVIEW. The applications covered here include (1) an Ultrasonic Measurement System for nondestructive evaluation of advanced structural materials, (2) an X-ray Spectral Mapping System for characterizing the quality and uniformity of developing photon detector materials, (3) a Life Testing System for these same materials, and (4) the instrument panel for an aircraft-mounted Cloud Absorption Radiometer that measures the light scattered by clouds in multiple spectral bands. Many of the software interface concepts employed are explained. Panel layout and block diagram (code) strategies for each application are described. In particular, some of the more unique features of the applications' interfaces and source code are highlighted. This article assumes that the reader has a beginner-to-intermediate understanding of LabVIEW methods.

  8. Implications of Information Theory for Computational Modeling of Schizophrenia.

    PubMed

    Silverstein, Steven M; Wibral, Michael; Phillips, William A

    2017-10-01

    Information theory provides a formal framework within which information processing and its disorders can be described. However, information theory has rarely been applied to modeling aspects of the cognitive neuroscience of schizophrenia. The goal of this article is to highlight the benefits of an approach based on information theory, including its recent extensions, for understanding several disrupted neural goal functions as well as related cognitive and symptomatic phenomena in schizophrenia. We begin by demonstrating that foundational concepts from information theory-such as Shannon information, entropy, data compression, block coding, and strategies to increase the signal-to-noise ratio-can be used to provide novel understandings of cognitive impairments in schizophrenia and metrics to evaluate their integrity. We then describe more recent developments in information theory, including the concepts of infomax, coherent infomax, and coding with synergy, to demonstrate how these can be used to develop computational models of schizophrenia-related failures in the tuning of sensory neurons, gain control, perceptual organization, thought organization, selective attention, context processing, predictive coding, and cognitive control. Throughout, we demonstrate how disordered mechanisms may explain both perceptual/cognitive changes and symptom emergence in schizophrenia. Finally, we demonstrate that there is consistency between some information-theoretic concepts and recent discoveries in neurobiology, especially involving the existence of distinct sites for the accumulation of driving input and contextual information prior to their interaction. This convergence can be used to guide future theory, experiment, and treatment development.
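    One of the foundational quantities named above, Shannon entropy, is easy to make concrete; the distributions below are arbitrary examples, not data from the article.

```python
import math

def shannon_entropy(probs):
    """H(X) = -sum of p * log2(p) over outcomes, in bits.
    Zero-probability outcomes contribute nothing and are skipped."""
    return 0.0 - sum(p * math.log2(p) for p in probs if p > 0)

print(shannon_entropy([0.5, 0.5]))   # a fair coin: 1 bit of uncertainty
print(shannon_entropy([0.9, 0.1]))   # a biased coin: less uncertainty
print(shannon_entropy([1.0]))        # a certain outcome: 0 bits
```

    In the article's framing, such metrics let one quantify how much signal a disrupted neural code still carries, rather than describing impairments only qualitatively.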

  9. Implications of Information Theory for Computational Modeling of Schizophrenia

    PubMed Central

    Wibral, Michael; Phillips, William A.

    2017-01-01

    Information theory provides a formal framework within which information processing and its disorders can be described. However, information theory has rarely been applied to modeling aspects of the cognitive neuroscience of schizophrenia. The goal of this article is to highlight the benefits of an approach based on information theory, including its recent extensions, for understanding several disrupted neural goal functions as well as related cognitive and symptomatic phenomena in schizophrenia. We begin by demonstrating that foundational concepts from information theory—such as Shannon information, entropy, data compression, block coding, and strategies to increase the signal-to-noise ratio—can be used to provide novel understandings of cognitive impairments in schizophrenia and metrics to evaluate their integrity. We then describe more recent developments in information theory, including the concepts of infomax, coherent infomax, and coding with synergy, to demonstrate how these can be used to develop computational models of schizophrenia-related failures in the tuning of sensory neurons, gain control, perceptual organization, thought organization, selective attention, context processing, predictive coding, and cognitive control. Throughout, we demonstrate how disordered mechanisms may explain both perceptual/cognitive changes and symptom emergence in schizophrenia. Finally, we demonstrate that there is consistency between some information-theoretic concepts and recent discoveries in neurobiology, especially involving the existence of distinct sites for the accumulation of driving input and contextual information prior to their interaction. This convergence can be used to guide future theory, experiment, and treatment development. PMID:29601053

  10. Design of neurophysiologically motivated structures of time-pulse coded neurons

    NASA Astrophysics Data System (ADS)

    Krasilenko, Vladimir G.; Nikolsky, Alexander I.; Lazarev, Alexander A.; Lobodzinska, Raisa F.

    2009-04-01

    The common methodology of a biologically motivated concept for building sensor processing systems with parallel input, picture-operand processing and time-pulse coding is described in this paper. The advantages of such coding for the creation of parallel programmable 2D-array structures for next-generation digital computers, which require untraditional numerical systems for processing analog, digital, hybrid and neuro-fuzzy operands, are shown. Simulation results for optoelectronic time-pulse coded intelligent neural elements (OETPCINE) and implementation results for a wide set of neuro-fuzzy logic operations are considered. The simulation results confirm the engineering advantages, intellectuality and circuit flexibility of OETPCINE for the creation of advanced 2D-structures. The developed equivalentor-nonequivalentor neural element has a power consumption of 10 mW and a processing time of about 10-100 us.

  11. From model conception to verification and validation, a global approach to multiphase Navier-Stoke models with an emphasis on volcanic explosive phenomenology

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dartevelle, Sebastian

    2007-10-01

    Large-scale volcanic eruptions are hazardous events that cannot be described by detailed and accurate in situ measurements: hence, little to no real-time data exists to rigorously validate current computer models of these events. In addition, such phenomenology involves highly complex, nonlinear, and unsteady physical behaviors upon many spatial and time scales. As a result, volcanic explosive phenomenology is poorly understood in terms of its physics, and inadequately constrained in terms of initial, boundary, and inflow conditions. Nevertheless, code verification and validation become even more critical because more and more volcanologists use numerical data for assessment and mitigation of volcanic hazards. In this report, we evaluate the process of model and code development in the context of geophysical multiphase flows. We describe: (1) the conception of a theoretical, multiphase, Navier-Stokes model, (2) its implementation into a numerical code, (3) the verification of the code, and (4) the validation of such a model within the context of turbulent and underexpanded jet physics. Within the validation framework, we suggest focusing on the key physics that control the volcanic clouds, namely, momentum-driven supersonic jets and buoyancy-driven turbulent plumes. For instance, we propose to compare numerical results against a set of simple and well-constrained analog experiments, each of which uniquely and unambiguously represents one of the key phenomenologies.

  12. An open-source textbook for teaching climate-related risk analysis using the R computing environment

    NASA Astrophysics Data System (ADS)

    Applegate, P. J.; Keller, K.

    2015-12-01

    Greenhouse gas emissions lead to increased surface air temperatures and sea level rise. In turn, sea level rise increases the risks of flooding for people living near the world's coastlines. Our own research on assessing sea level rise-related risks emphasizes both Earth science and statistics. At the same time, the free, open-source computing environment R is growing in popularity among statisticians and scientists due to its flexibility and graphics capabilities, as well as its large library of existing functions. We have developed a set of laboratory exercises that introduce students to the Earth science and statistical concepts needed for assessing the risks presented by climate change, particularly sea-level rise. These exercises will be published as a free, open-source textbook on the Web. Each exercise begins with a description of the Earth science and/or statistical concepts that the exercise teaches, with references to key journal articles where appropriate. Next, students are asked to examine in detail a piece of existing R code, and the exercise text provides a clear explanation of how the code works. Finally, students are asked to modify the existing code to produce a well-defined outcome. We discuss our experiences in developing the exercises over two separate semesters at Penn State, plus using R Markdown to interweave explanatory text with sample code and figures in the textbook.

  13. Color associations among designers and non-designers for common warning and operation concepts.

    PubMed

    Ng, Annie W Y; Chan, Alan H S

    2018-07-01

    This study examined color-concept associations among designers and non-designers for commonly used warning and operation concepts. A total of 199 designers and 175 non-designers were asked to indicate which of nine colors they associated with each of 38 concepts in a color-concept table. The results showed that the designers and non-designers had the same color associations and similar strengths of stereotypes for 17 concepts. The strongest color-concept stereotypes for both groups were red-danger, red-fire, and red-hot. However, the designers and non-designers had different color associations for the concepts of escape (green, red), increase (green, red), potential hazard (red, orange), fatal (black, red), and normal (white, green), while the strengths of the 16 remaining associations for the two groups were not at equivalent levels. These findings provide ergonomists and design practitioners with a better understanding of population stereotypes for color coding, enabling them to use colors effectively in their user-centered designs. Copyright © 2018 Elsevier Ltd. All rights reserved.
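
    Stereotype strength in studies of this kind is commonly summarized as the share of respondents choosing the modal color for a concept. A minimal sketch with made-up numbers (the responses below are illustrative, not the study's data):

```python
from collections import Counter

def stereotype(responses):
    """Return (modal color, strength), where strength is the share picking that color."""
    color, n = Counter(responses).most_common(1)[0]
    return color, n / len(responses)

# Hypothetical responses of 10 participants for the concept "danger":
color, strength = stereotype(["red"] * 8 + ["orange", "black"])
```

    A strength near 1.0 indicates a near-universal association, such as the red-danger stereotype reported above.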

  14. Center of Gravity in the Asymmetric Environment: Applicable or Not

    DTIC Science & Technology

    2006-06-01

    The military concept of a Center of Gravity (COG) in...changed a great deal since the introduction of COG. And in today's asymmetric environment, in which non-state actors use unconventional tactics, it is...becoming extremely difficult to apply the COG concept. The primary reason for this difficulty is that non-state actors do not operate as a unitary

  15. Joint Chiefs of Staff > Directorates > J3 | Operations

    Science.gov Websites


  16. Sharing Resources In Mobile/Satellite Communications

    NASA Technical Reports Server (NTRS)

    Yan, Tsun-Yee; Sue, Miles K.

    1992-01-01

    Report presents preliminary theoretical analysis of several alternative schemes for allocation of satellite resource among terrestrial subscribers of landmobile/satellite communication system. Demand-access and random-access approaches under code-division and frequency-division concepts compared.

  17. Investigation of Near Shannon Limit Coding Schemes

    NASA Technical Reports Server (NTRS)

    Kwatra, S. C.; Kim, J.; Mo, Fan

    1999-01-01

    Turbo codes can deliver performance that is very close to the Shannon limit. This report investigates algorithms for convolutional turbo codes and block turbo codes. Both coding schemes can achieve performance near Shannon limit. The performance of the schemes is obtained using computer simulations. There are three sections in this report. First section is the introduction. The fundamental knowledge about coding, block coding and convolutional coding is discussed. In the second section, the basic concepts of convolutional turbo codes are introduced and the performance of turbo codes, especially high rate turbo codes, is provided from the simulation results. After introducing all the parameters that help turbo codes achieve such a good performance, it is concluded that output weight distribution should be the main consideration in designing turbo codes. Based on the output weight distribution, the performance bounds for turbo codes are given. Then, the relationships between the output weight distribution and the factors like generator polynomial, interleaver and puncturing pattern are examined. The criterion for the best selection of system components is provided. The puncturing pattern algorithm is discussed in detail. Different puncturing patterns are compared for each high rate. For most of the high rate codes, the puncturing pattern does not show any significant effect on the code performance if pseudo - random interleaver is used in the system. For some special rate codes with poor performance, an alternative puncturing algorithm is designed which restores their performance close to the Shannon limit. Finally, in section three, for iterative decoding of block codes, the method of building trellis for block codes, the structure of the iterative decoding system and the calculation of extrinsic values are discussed.
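
    The puncturing idea discussed above can be sketched directly: a high-rate code is obtained from a low-rate mother code by deleting coded bits according to a periodic pattern. A minimal Python sketch, assuming a rate-1/2 mother code thinned to rate 2/3 with a commonly used pattern (not necessarily the report's algorithm):

```python
def puncture(bits, pattern):
    """Drop coded bits where the periodic pattern holds a 0.

    bits    : flat list of coded bits, with the streams emitted round-robin
    pattern : pattern[k][j] == 1 keeps bit j (mod period) of stream k
    """
    streams = len(pattern)
    period = len(pattern[0])
    out = []
    for i, b in enumerate(bits):
        k = i % streams              # which coded stream this bit belongs to
        j = (i // streams) % period  # position within the puncturing period
        if pattern[k][j]:
            out.append(b)
    return out

# Rate-1/2 output (systematic and parity bits interleaved), punctured to rate 2/3:
coded = [1, 0, 1, 1, 0, 1, 1, 0]   # s0 p0 s1 p1 s2 p2 s3 p3
pattern = [[1, 1],                  # keep all systematic bits
           [1, 0]]                  # keep every other parity bit
punctured = puncture(coded, pattern)
```

    Here 8 coded bits from 4 information bits are thinned to 6, giving rate 2/3; the decoder reinserts erasures at the dropped positions before iterative decoding.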

  18. Optimized bit extraction using distortion modeling in the scalable extension of H.264/AVC.

    PubMed

    Maani, Ehsan; Katsaggelos, Aggelos K

    2009-09-01

    The newly adopted scalable extension of H.264/AVC video coding standard (SVC) demonstrates significant improvements in coding efficiency in addition to an increased degree of supported scalability relative to the scalable profiles of prior video coding standards. Due to the complicated hierarchical prediction structure of the SVC and the concept of key pictures, content-aware rate adaptation of SVC bit streams to intermediate bit rates is a nontrivial task. The concept of quality layers has been introduced in the design of the SVC to allow for fast content-aware prioritized rate adaptation. However, existing quality layer assignment methods are suboptimal and do not consider all network abstraction layer (NAL) units from different layers for the optimization. In this paper, we first propose a technique to accurately and efficiently estimate the quality degradation resulting from discarding an arbitrary number of NAL units from multiple layers of a bitstream by properly taking drift into account. Then, we utilize this distortion estimation technique to assign quality layers to NAL units for a more efficient extraction. Experimental results show that a significant gain can be achieved by the proposed scheme.
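
    The prioritized-extraction idea can be caricatured as a greedy ordering of NAL units by estimated distortion reduction per bit. This sketch ignores drift and inter-layer dependencies, which are exactly what the paper's estimator models; all names and numbers are hypothetical:

```python
def assign_quality_layers(nal_units, num_layers):
    """Rank NAL units by distortion reduction per bit and bucket them into layers.

    nal_units : list of (size_bits, delta_distortion) tuples
    Returns a list of layer ids per unit; layer 0 is highest priority.
    """
    order = sorted(range(len(nal_units)),
                   key=lambda i: nal_units[i][1] / nal_units[i][0],
                   reverse=True)
    layers = [0] * len(nal_units)
    per_layer = -(-len(nal_units) // num_layers)  # ceiling division
    for rank, i in enumerate(order):
        layers[i] = rank // per_layer
    return layers

def extract(nal_units, layers, budget_bits):
    """Keep units layer by layer until the bit budget is exhausted."""
    kept, used = [], 0
    for layer in range(max(layers) + 1):
        for i, (size, _) in enumerate(nal_units):
            if layers[i] == layer and used + size <= budget_bits:
                kept.append(i)
                used += size
    return kept
```

    An extractor can then adapt a stream to any intermediate rate by reading the layer ids alone, without re-running the distortion estimation.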

  19. Microgravity Materials Research and Code U ISRU

    NASA Technical Reports Server (NTRS)

    Curreri, Peter A.; Sibille, Laurent

    2004-01-01

    The NASA microgravity research program, simply put, has the goal of doing science (which is essentially finding out something previously unknown about nature) utilizing the unique long-term microgravity environment in Earth orbit. Since 1997 Code U has in addition funded scientific basic research that enables safe and economical capabilities to enable humans to live, work and do science beyond Earth orbit. This research has been integrated with the larger NASA missions (Code M and S). These new exploration research focus areas include Radiation Shielding Materials, Macromolecular Research on Bone and Muscle Loss, In Space Fabrication and Repair, and Low Gravity ISRU. The latter two focus on enabling materials processing in space for use in space. The goal of this program is to provide scientific and technical research resulting in proof-of-concept experiments feeding into the larger NASA program to provide humans in space with an energy rich, resource rich, self sustaining infrastructure at the earliest possible time and with minimum risk, launch mass and program cost. President Bush's Exploration Vision (1/14/04) gives a new urgency for the development of ISRU concepts into the exploration architecture. This will require an accelerated One NASA approach utilizing NASA's partners in academia, and industry.

  20. Social responsibility of nursing: a global perspective.

    PubMed

    Tyer-Viola, Lynda; Nicholas, Patrice K; Corless, Inge B; Barry, Donna M; Hoyt, Pamela; Fitzpatrick, Joyce J; Davis, Sheila M

    2009-05-01

    This study addresses social responsibility in the discipline of nursing and implications for global health. The concept of social responsibility is explicated and its relevance for nursing is examined, grounded in the American Nurses Association Code of Ethics and the International Council of Nurses Code of Ethics. Social justice, human rights, nurse migration, and approaches to nursing education are discussed within the framework of nursing's social responsibility. Strategies for addressing nursing workforce issues and education within a framework of social responsibility are explored.

  1. Cryptography based on the absorption/emission features of multicolor semiconductor nanocrystal quantum dots.

    PubMed

    Zhou, Ming; Chang, Shoude; Grover, Chander

    2004-06-28

    Further to the optical coding based on fluorescent semiconductor quantum dots (QDs), a concept of using mixtures of multiple single-color QDs for creating highly secret cryptograms based on their absorption/emission properties was demonstrated. The key to readout of the optical codes is a group of excitation lights with the predetermined wavelengths programmed in a secret manner. The cryptograms can be printed on the surfaces of different objects such as valuable documents for security purposes.

  2. IEEE International Symposium on Information Theory (ISIT): Abstracts of Papers, Held in Ann Arbor, Michigan on 6-9 October 1986.

    DTIC Science & Technology

    1986-10-01

    BUZO, and FEDERICO KUHLMANN, Universidad Nacional Autónoma de México, Facultad de Ingeniería, División Estudios de Posgrado, P.O. Box 70-256, 04510...unsuccessful in this area for a long time. It was felt, e.g., in the voiceband modem industry, that the coding gains achievable by error-correction coding...without bandwidth expansion or data rate reduction, when compared to uncoded modulation. The concept was quickly adopted by industry, and is now becoming

  3. Multi-level Expression Design Language: Requirement level (MEDL-R) system evaluation

    NASA Technical Reports Server (NTRS)

    1980-01-01

    An evaluation of the Multi-Level Expression Design Language Requirements Level (MEDL-R) system was conducted to determine whether it would be of use in the Goddard Space Flight Center Code 580 software development environment. The evaluation is based upon a study of the MEDL-R concept of requirement languages, the functions performed by MEDL-R, and the MEDL-R language syntax. Recommendations are made for changes to MEDL-R that would make it useful in the Code 580 environment.

  4. CrossTalk: The Journal of Defense Software Engineering. Volume 25, Number 4, July/August 2012

    DTIC Science & Technology

    2012-08-01

    understand the interface between various code components. For example, consider a situation in which handwritten code produced by one team generates an...conclusively say that a division by zero will not occur. The abstract interpretation concept can be generalized as a tool set that can be used to determine...word what makes a good manager, I would say decisiveness. You can use the fanciest computers to gather the numbers, but in the end you have to set

  5. Traffic Pattern Detection Using the Hough Transformation for Anomaly Detection to Improve Maritime Domain Awareness

    DTIC Science & Technology

    2013-12-01

    Programming code in the Python language used in AIS data preprocessing is contained in Appendix A. The MATLAB programming code used to apply the Hough...described in Chapter III is applied to archived AIS data in this chapter. The implementation of the method, including programming techniques used, is...is contained in the second. To provide a proof of concept for the algorithm described in Chapter III, the PYTHON programming language was used for

  6. Modular Track System For Positioning Mobile Robots

    NASA Technical Reports Server (NTRS)

    Miller, Jeff

    1995-01-01

    Conceptual system for positioning mobile robotic manipulators on large main structure includes modular tracks and ancillary structures assembled easily along with main structure. System, called "tracked robotic location system" (TROLS), originally intended for application to platforms in outer space, but TROLS concept might also prove useful on Earth; for example, to position robots in factories and warehouses. T-cross-section rail keeps mobile robot on track. Bar codes mark locations along track. Each robot equipped with bar-code-recognizing circuitry so it quickly finds way to assigned location.

  7. Invocation oriented architecture for agile code and agile data

    NASA Astrophysics Data System (ADS)

    Verma, Dinesh; Chan, Kevin; Leung, Kin; Gkelias, Athanasios

    2017-05-01

    In order to address the unique requirements of sensor information fusion in a tactical coalition environment, we are proposing a new architecture - one based on the concept of invocations. An invocation is a combination of a software code and a piece of data, both managed using techniques from Information Centric networking. This paper will discuss limitations of current approaches, present the architecture for an invocation oriented architecture, illustrate how it works with an example scenario, and provide reasons for its suitability in a coalition environment.

  8. Travelling Wave Concepts for the Modeling and Control of Space Structures

    DTIC Science & Technology

    1988-01-31

    ...at the Jet Propulsion Laboratories, and is writing two further papers for journal publication based on his PhD dissertation. In the winter of 1987

  9. Multidimensional representations: The knowledge domain of germs held by students, teachers and medical professionals

    NASA Astrophysics Data System (ADS)

    Rua, Melissa Jo

    The present study examined the understandings held by 5th, 8th, and 11th-grade students, their teachers and medical professionals about germs. Specifically, this study describes the content and structure of students' and adults' conceptions in the areas of germ contraction, transmission, and treatment of infectious and non-infectious diseases caused by microorganisms. Naturalistic and empirical research methods were used to investigate participants' conceptions. Between- and within-group similarities were found using data from concept maps on the topic "flu," drawings of germs, a 20-word card sort related to germs and illness, and a semi-structured interview. Concept maps were coded according to techniques by Novak and Gowin (1984). Drawings of germs were coded into four main categories (bacteria, viruses, animal cell, other) and five subcategories (disease, caricature, insect, protozoa, unclassified). Cluster patterns for the card sorts of each group were found using multidimensional scaling techniques. Six coding categories emerged from the interview transcripts: (a) transmission, (b) treatment, (c) effect of weather on illness, (d) immune response, (e) location of germs, and (f) similarities and differences between bacteria and viruses. The findings showed students, teachers and medical professionals have different understandings about bacteria and viruses and the structures of those understandings vary. Gaps or holes in the participants' knowledge were found in areas such as: (a) how germs are transmitted, (b) where germs are found, (c) how the body transports and uses medicine, (d) how the immune system functions, (e) the difference between vaccines and non-prescription medicines, (f) differences that exist between bacteria and viruses, and (g) bacterial resistance to medication. The youngest students relied heavily upon personal experiences with germs rather than formal instruction when explaining their conceptions.
As a result, the influence of media was evident in the students' understandings and images of microbes. Students also viewed germs as a human problem rather than seeing microorganisms as an independent member of the ecosystem. Teachers' explanations about germs varied in explicitness based on the grade level they taught while medical professionals based their understandings on formal knowledge and tended to use explicit technical language in their explanations of the phenomena.

  10. Marks of Change in Sequences

    NASA Astrophysics Data System (ADS)

    Jürgensen, H.

    2011-12-01

    Given a sequence of events, how does one recognize that a change has occurred? We explore potential definitions of the concept of change in a sequence and propose that words in relativized solid codes might serve as indicators of change.

  11. Qualitative content analysis in nursing research: concepts, procedures and measures to achieve trustworthiness.

    PubMed

    Graneheim, U H; Lundman, B

    2004-02-01

    Qualitative content analysis as described in published literature shows conflicting opinions and unsolved issues regarding meaning and use of concepts, procedures and interpretation. This paper provides an overview of important concepts (manifest and latent content, unit of analysis, meaning unit, condensation, abstraction, content area, code, category and theme) related to qualitative content analysis; illustrates the use of concepts related to the research procedure; and proposes measures to achieve trustworthiness (credibility, dependability and transferability) throughout the steps of the research procedure. Interpretation in qualitative content analysis is discussed in light of Watzlawick et al.'s [Pragmatics of Human Communication. A Study of Interactional Patterns, Pathologies and Paradoxes. W.W. Norton & Company, New York, London] theory of communication.

  12. Increasing productivity through Total Reuse Management (TRM)

    NASA Technical Reports Server (NTRS)

    Schuler, M. P.

    1991-01-01

    Total Reuse Management (TRM) is a new concept currently being promoted by the NASA Langley Software Engineering and Ada Lab (SEAL). It uses concepts similar to those promoted in Total Quality Management (TQM). Both technical and management personnel are continually encouraged to think in terms of reuse. Reuse is not something that is aimed for after a product is completed, but rather it is built into the product from inception through development. Lowering software development costs, reducing risk, and increasing code reliability are the more prominent goals of TRM. Procedures and methods used to adopt and apply TRM are described. Reuse is frequently thought of as only being applicable to code. However, reuse can apply to all products and all phases of the software life cycle. These products include management and quality assurance plans, designs, and testing procedures. Specific examples of successfully reused products are given and future goals are discussed.

  13. Experimental Evaluation of Journal Bearing Stability and New Gas Bearing Material

    NASA Technical Reports Server (NTRS)

    Keith, Theo G., Jr.; Dimofte, Florin

    2001-01-01

    It has been estimated that the noise levels in aircraft engine transmissions can be reduced by as much as 10 dB through the use of journal bearings. The potential benefits of lower noise levels include reduced wear, longer gear life and enhanced comfort for passengers and crew. Based on this concept the journal-thrust wave bearing was analyzed and its performance was evaluated. Numerical codes, developed over the past 30 years by Dr. Dimofte, were used to predict the performance of the bearing. The wave bearing is a fluid film bearing and therefore was analyzed using the Reynolds pressure equation. The formulation includes turbulent flow concepts and possesses a viscosity-temperature correction. The centrifugal growth of the bearing diameter and the deformation of the bearing under gear loads were also incorporated into the code. An experimental rig was developed to test the journal-thrust wave bearing.

  14. Modeling and Optimization for Morphing Wing Concept Generation II. Part 1; Morphing Wing Modeling and Structural Sizing Techniques

    NASA Technical Reports Server (NTRS)

    Skillen, Michael D.; Crossley, William A.

    2008-01-01

    This report documents a series of investigations to develop an approach for structural sizing of various morphing wing concepts. For the purposes of this report, a morphing wing is one whose planform can make significant shape changes in flight: increasing wing area by 50% or more from the lowest possible area, changing sweep by 30° or more, and/or increasing aspect ratio by as much as 200% from the lowest possible value. These significant changes in geometry mean that the underlying load-bearing structure changes geometry. While most finite element analysis packages provide some sort of structural optimization capability, these codes are not amenable to making the significant changes in the stiffness matrix needed to reflect the large morphing wing planform changes. The investigations presented here use a finite element code capable of aeroelastic analysis in three different optimization approaches: a "simultaneous analysis" approach, a "sequential" approach, and an "aggregate" approach.

  15. [Preliminarily application of content analysis to qualitative nursing data].

    PubMed

    Liang, Shu-Yuan; Chuang, Yeu-Hui; Wu, Shu-Fang

    2012-10-01

    Content analysis is a methodology for objectively and systematically studying the content of communication in various formats. Content analysis in nursing research and nursing education is called qualitative content analysis. Qualitative content analysis is frequently applied to nursing research, as it allows researchers to determine categories inductively and deductively. This article examines qualitative content analysis in nursing research from theoretical and practical perspectives. We first describe how content analysis concepts such as unit of analysis, meaning unit, code, category, and theme are used. Next, we describe the basic steps involved in using content analysis, including data preparation, data familiarization, analysis unit identification, creating tentative coding categories, category refinement, and establishing category integrity. Finally, this paper introduces the concept of content analysis rigor, including dependability, confirmability, credibility, and transferability. This article elucidates the content analysis method in order to help professionals conduct systematic research that generates data that are informative and useful in practical application.

  16. Hard X-ray imaging from Explorer

    NASA Technical Reports Server (NTRS)

    Grindlay, J. E.; Murray, S. S.

    1981-01-01

    Coded aperture X-ray detectors were applied to obtain large increases in sensitivity as well as angular resolution. A hard X-ray coded aperture detector concept is described which enables very high sensitivity studies of persistent hard X-ray sources and gamma-ray bursts. Coded aperture imaging is employed so that source locations of approximately 2 arcmin can be derived within a 3 deg field of view. Gamma-ray bursts were located initially to within approx. 2 deg, and X-ray/hard X-ray spectra and timing, as well as precise locations, were derived for possible burst afterglow emission. It is suggested that hard X-ray imaging should be conducted from an Explorer mission where long exposure times are possible.

  17. Proof Compression and the Mobius PCC Architecture for Embedded Devices

    NASA Technical Reports Server (NTRS)

    Jensen, Thomas

    2009-01-01

    The EU Mobius project has been concerned with the security of Java applications, and of mobile devices such as smart phones that execute such applications. In this talk, I'll give a brief overview of the results obtained on on-device checking of various security-related program properties. I'll then describe in more detail how the concept of certified abstract interpretation and abstraction-carrying code can be applied to polyhedral-based analysis of Java byte code in order to verify properties pertaining to the usage of resources of a down-loaded application. Particular emphasis has been on finding ways of reducing the size of the certificates that accompany a piece of code.

  18. Perception of "no code" and the role of the nurse.

    PubMed

    Honan, S; Helseth, C C; Bakke, J; Karpiuk, K; Krsnak, G; Torkelson, R

    1991-01-01

    CPR is now the rule rather than the exception and death is often viewed as the ultimate failure in modern medicine, rather than the final event of the natural life process (Stevens, 1986). The "No Code" concept has created a major dilemma in health care. An interagency collaborative study was conducted to ascertain the perceptions of nurses, physicians, and laypersons about this issue. This article deals primarily with the nurse's role and perceptions of the "No Code" issue. The comparison of nurses' perceptions with those of physicians and laypersons is unique to this study. Based on this research, suggestions are presented that will assist nursing educators and health care professionals in managing this complex dilemma.

  19. An Idealized, Single Radial Swirler, Lean-Direct-Injection (LDI) Concept Meshing Script

    NASA Technical Reports Server (NTRS)

    Iannetti, Anthony C.; Thompson, Daniel

    2008-01-01

    To easily study combustor design parameters using computational fluid dynamics codes (CFD), a Gridgen Glyph-based macro (based on the Tcl scripting language) dubbed BladeMaker has been developed for the meshing of an idealized, single radial swirler, lean-direct-injection (LDI) combustor. BladeMaker is capable of taking in a number of parameters, such as blade width, blade tilt with respect to the perpendicular, swirler cup radius, and grid densities, and producing a three-dimensional meshed radial swirler with a can-annular (canned) combustor. This complex script produces a data format suitable for but not specific to the National Combustion Code (NCC), a state-of-the-art CFD code developed for reacting flow processes.

  20. Software Development Processes Applied to Computational Icing Simulation

    NASA Technical Reports Server (NTRS)

    Levinson, Laurie H.; Potapezuk, Mark G.; Mellor, Pamela A.

    1999-01-01

    The development of computational icing simulation methods is making the transition from research to commonplace use in design and certification efforts. As such, standards of code management, design validation, and documentation must be adjusted to accommodate the increased expectations of the user community with respect to accuracy, reliability, capability, and usability. This paper discusses these concepts with regard to current and future icing simulation code development efforts as implemented by the Icing Branch of the NASA Lewis Research Center in collaboration with the NASA Lewis Engineering Design and Analysis Division. With the application of the techniques outlined in this paper, the LEWICE ice accretion code has become a more stable and reliable software product.

  1. Extensions and improvements on XTRAN3S

    NASA Technical Reports Server (NTRS)

    Borland, C. J.

    1989-01-01

    Improvements to the XTRAN3S computer program are summarized. Work on this code, for steady and unsteady aerodynamic and aeroelastic analysis in the transonic flow regime has concentrated on the following areas: (1) Maintenance of the XTRAN3S code, including correction of errors, enhancement of operational capability, and installation on the Cray X-MP system; (2) Extension of the vectorization concepts in XTRAN3S to include additional areas of the code for improved execution speed; (3) Modification of the XTRAN3S algorithm for improved numerical stability for swept, tapered wing cases and improved computational efficiency; and (4) Extension of the wing-only version of XTRAN3S to include pylon and nacelle or external store capability.

  2. A lncRNA Perspective into (Re)Building the Heart.

    PubMed

    Frank, Stefan; Aguirre, Aitor; Hescheler, Juergen; Kurian, Leo

    2016-01-01

    Our conception of the human genome, long focused on the 2% that codes for proteins, has profoundly changed since its first draft assembly in 2001. Since then, an unanticipatedly expansive functionality and convolution has been attributed to the majority of the genome that is transcribed in a cell-type/context-specific manner into transcripts with no apparent protein coding ability. While the majority of these transcripts, currently annotated as long non-coding RNAs (lncRNAs), are functionally uncharacterized, their prominent role in embryonic development and tissue homeostasis, especially in the context of the heart, is emerging. In this review, we summarize and discuss the latest advances in understanding the relevance of lncRNAs in (re)building the heart.

  3. Standardized Semantic Markup for Reference Terminologies, Thesauri and Coding Systems: Benefits for distributed E-Health Applications.

    PubMed

    Hoelzer, Simon; Schweiger, Ralf K; Liu, Raymond; Rudolf, Dirk; Rieger, Joerg; Dudeck, Joachim

    2005-01-01

    With the introduction of the ICD-10 as the standard for diagnosis, the development of an electronic representation of its complete content, inherent semantics and coding rules is necessary. Our concept refers to current efforts of the CEN/TC 251 to establish a European standard for hierarchical classification systems in healthcare. We have developed an electronic representation of the ICD-10 with the extensible Markup Language (XML) that facilitates the integration in current information systems or coding software taking into account different languages and versions. In this context, XML offers a complete framework of related technologies and standard tools for processing that helps to develop interoperable applications.
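An electronic, hierarchical representation of a classification such as the ICD-10 can be pictured with a small XML fragment processed by standard tools. The fragment below is a purely illustrative sketch: the element and attribute names are our assumptions, not the actual CEN/TC 251 or WHO schema, and the lookup uses only Python's standard library.

```python
import xml.etree.ElementTree as ET

# Hypothetical fragment; element/attribute names are illustrative only,
# not the real CEN/TC 251 or WHO ICD-10 schema.
ICD10_XML = """
<classification system="ICD-10" version="2005" language="en">
  <chapter code="X" title="Diseases of the respiratory system">
    <category code="J45" title="Asthma">
      <subcategory code="J45.0" title="Predominantly allergic asthma"/>
      <subcategory code="J45.1" title="Nonallergic asthma"/>
    </category>
  </chapter>
</classification>
"""

def lookup(root, code):
    """Return the title for a diagnosis code found anywhere in the hierarchy."""
    for elem in root.iter():
        if elem.get("code") == code:
            return elem.get("title")
    return None

root = ET.fromstring(ICD10_XML)
```

Because the hierarchy, codes and language tags are explicit attributes, the same document can drive coding software in different languages and versions, which is the interoperability benefit the abstract describes.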

  4. A plug-in to Eclipse for VHDL source codes: functionalities

    NASA Astrophysics Data System (ADS)

    Niton, B.; Poźniak, K. T.; Romaniuk, R. S.

    The paper presents an original application, written by the authors, which supports the writing and editing of source code in the VHDL language. It is a step towards fully automatic, augmented code writing for photonic and electronic systems, including systems based on FPGAs and/or DSP processors. An implementation based on VEditor, a free-license program, is described; the work presented in this paper thus supplements and extends that free software. The introduction briefly characterizes the tools available on the market for aiding the design of electronic systems in VHDL. Particular attention was paid to plug-ins for the Eclipse environment and the Emacs program. Detailed properties of the written plug-in are presented, such as the programming extension concept and the results of the activities of the formatter, re-factorizer, code hider, and other new additions to the VEditor program.

  5. Multi-processing on supercomputers for computational aerodynamics

    NASA Technical Reports Server (NTRS)

    Yarrow, Maurice; Mehta, Unmeel B.

    1990-01-01

    The MIMD concept is applied, through multitasking, with relatively minor modifications to an existing code for a single processor. This approach maps the available memory to multiple processors, exploiting the C-FORTRAN-Unix interface. An existing single processor algorithm is mapped without the need for developing a new algorithm. The procedure of designing a code utilizing this approach is automated with the Unix stream editor. A Multiple Processor Multiple Grid (MPMG) code is developed as a demonstration of this approach. This code solves the three-dimensional, Reynolds-averaged, thin-layer and slender-layer Navier-Stokes equations with an implicit, approximately factored and diagonalized method. This solver is applied to a generic, oblique-wing aircraft problem on a four-processor computer using one process for data management and nonparallel computations and three processes for pseudotime advance on three different grid systems.
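The process layout described above, one process for data management and one worker per grid system, can be sketched in modern terms with Python's multiprocessing module. This is an analogy, not the paper's C-FORTRAN-Unix implementation: the grid "solver" below is a trivial stand-in for the pseudotime advance, and all names are illustrative.

```python
from multiprocessing import Process, Queue

def advance_grid(grid_id, steps, results):
    """Stand-in for the pseudotime advance on one grid system.
    The arithmetic is a placeholder, not a Navier-Stokes solver."""
    state = 0.0
    for _ in range(steps):
        state += 1.0 / (grid_id + 1)
    results.put((grid_id, state))

if __name__ == "__main__":
    results = Queue()
    # One worker process per grid system; the parent plays the
    # data-management / nonparallel-computation role.
    workers = [Process(target=advance_grid, args=(g, 100, results))
               for g in range(3)]
    for w in workers:
        w.start()
    merged = dict(results.get() for _ in workers)  # gather partial results
    for w in workers:
        w.join()
```

The key point mirrored from the abstract is that the single-processor algorithm itself is unchanged; only the orchestration around it is new.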

  6. Total Quality Management, DLA Finance Center

    DTIC Science & Technology

    1989-07-01

    Total Quality Management (TQM) is a concept which is based on the work of a variety of people in a variety of fields. It includes continuous process improvement.

  7. Numerical simulation of jet mixing concepts in Tank 241-SY-101

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Trent, D.S.; Michener, T.E.

    The episodic gas release events (GREs) that have characterized the behavior of Tank 241-SY-101 for the past several years are thought to result from gases generated by the waste material in it that become trapped in the layer of settled solids at the bottom of the tank. Several concepts for mitigating the GREs have been proposed. One concept involves mobilizing the solid particles with mixing jets. The rationale behind this idea is to prevent formation of a consolidated layer of settled solids at the bottom of the tank, thus inhibiting the accumulation of gas bubbles in this layer. Numerical simulations were conducted using the TEMPEST computer code to assess the viability and effectiveness of the proposed jet discharge concepts and operating parameters. Before these parametric studies were commenced, a series of turbulent jet studies was conducted that established the adequacy of the TEMPEST code for this application. Configurations studied for Tank 241-SY-101 include centrally located downward discharging jets, draft tubes, and horizontal jets that are either stationary or rotating. Parameter studies included varying the jet discharge velocity, jet diameter, discharge elevation, and material properties. A total of 18 simulations were conducted and are reported in this document. The effect of gas bubbles on the mixing dynamics was not included within the scope of this study.

  8. A comparison of "life threatening injury" concept in the Turkish Penal Code and trauma scoring systems.

    PubMed

    Fedakar, Recep; Aydiner, Ahmet Hüsamettin; Ercan, Ilker

    2007-07-01

    The aim of this study was to compare the accuracy and check the suitability of the Glasgow Coma Scale (GCS), the Revised Trauma Score (RTS), the Injury Severity Score (ISS), the New Injury Severity Score (NISS) and the Trauma and Injury Severity Score (TRISS), the scoring systems widely used in international trauma studies, in the evaluation of the "life threatening injury" concept established by the Turkish Penal Code. The age, sex, type of trauma, type and localization of wounds, GCS, RTS, ISS, NISS and TRISS values, and the life-threatening-injury decisions for 627 trauma patients admitted to the Emergency Department of the Uludag University Medical School Hospital in 2003 were examined. A life-threatening injury was present in 35.2% of the cases examined. GCS, RTS, ISS, NISS and TRISS confirmed the decision of life-threatening injury in 74.8%, 76.9%, 88.7%, 86.6% and 68.6% of cases, respectively. The best cut-off point, 14, was determined in the ISS system, with 79.6% sensitivity and 93.6% specificity. All of the cases with a sole linear skull fracture officially decided as life-threatening injury had an ISS of 5, a NISS of 6 and the best scores of GCS (15), RTS (7.8408) and TRISS (100%). ISS and NISS appeared to be the best trauma scoring systems for the decision of life-threatening injury, compared with GCS, RTS and TRISS. Thus, ISS and NISS can be considered acceptable for evaluating the "life threatening injury" concept established by the Turkish Penal Code.

  9. Ducted-Fan Engine Acoustic Predictions using a Navier-Stokes Code

    NASA Technical Reports Server (NTRS)

    Rumsey, C. L.; Biedron, R. T.; Farassat, F.; Spence, P. L.

    1998-01-01

    A Navier-Stokes computer code is used to predict one of the ducted-fan engine acoustic modes that results from rotor-wake/stator-blade interaction. A patched sliding-zone interface is employed to pass information between the moving rotor row and the stationary stator row. The code produces averaged aerodynamic results downstream of the rotor that agree well with a widely used average-passage code. The acoustic mode of interest is generated successfully by the code and is propagated well upstream of the rotor; temporal and spatial numerical resolution are fine enough such that attenuation of the signal is small. Two acoustic codes are used to find the far-field noise. Near-field propagation is computed by using Eversman's wave envelope code, which is based on a finite-element model. Propagation to the far field is accomplished by using the Kirchhoff formula for moving surfaces with the results of the wave envelope code as input data. Comparison of measured and computed far-field noise levels show fair agreement in the range of directivity angles where the peak radiation lobes from the inlet are observed. Although only a single acoustic mode is targeted in this study, the main conclusion is a proof-of-concept: Navier-Stokes codes can be used both to generate and propagate rotor/stator acoustic modes forward through an engine, where the results can be coupled to other far-field noise prediction codes.

  10. The challenge to unify treatment of high-temperature fatigue - A partisan proposal based on strainrange partitioning

    NASA Technical Reports Server (NTRS)

    Manson, S. S.

    1972-01-01

    The strainrange partitioning concept divides the imposed strain into four basic ranges involving time-dependent and time-independent components. It is shown that some of the results presented at the symposium can be better correlated on the basis of this concept than by alternative methods. It is also suggested that methods of data generation and analysis can be helpfully guided by this approach. Potential applicability of the concept to the treatment of frequency and hold-time effects, environmental influence, crack initiation and growth, thermal fatigue, and code specifications are briefly considered. A required experimental program is outlined.

  11. The Ever-Evolving Concept of the Gene: The Use of RNA/Protein Experimental Techniques to Understand Genome Functions

    PubMed Central

    Cipriano, Andrea; Ballarino, Monica

    2018-01-01

    The completion of the human genome sequence together with advances in sequencing technologies have shifted the paradigm of the genome, as composed of discrete and hereditable coding entities, and have shown the abundance of functional noncoding DNA. This part of the genome, previously dismissed as “junk” DNA, increases proportionally with organismal complexity and contributes to gene regulation beyond the boundaries of known protein-coding genes. Different classes of functionally relevant nonprotein-coding RNAs are transcribed from noncoding DNA sequences. Among them are the long noncoding RNAs (lncRNAs), which are thought to participate in the basal regulation of protein-coding genes at both transcriptional and post-transcriptional levels. Although knowledge of this field is still limited, the ability of lncRNAs to localize in different cellular compartments, to fold into specific secondary structures and to interact with different molecules (RNA or proteins) endows them with multiple regulatory mechanisms. It is becoming evident that lncRNAs may play a crucial role in most biological processes such as the control of development, differentiation and cell growth. This review places the evolution of the concept of the gene in its historical context, from Darwin's hypothetical mechanism of heredity to the post-genomic era. We discuss how the original idea of protein-coding genes as unique determinants of phenotypic traits has been reconsidered in light of the existence of noncoding RNAs. We summarize the technological developments which have been made in the genome-wide identification and study of lncRNAs and emphasize the methodologies that have aided our understanding of the complexity of lncRNA-protein interactions in recent years. PMID:29560353

  12. Culture shock: Improving software quality

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    de Jong, K.; Trauth, S.L.

    1988-01-01

    The concept of software quality can represent a significant shock to an individual who has been developing software for many years and who believes he or she has been doing a high quality job. The very idea that software includes lines of code and associated documentation is foreign and difficult to grasp, at best. Implementation of a software quality program hinges on the concept that software is a product whose quality needs improving. When this idea is introduced into a technical community that is largely "self-taught" and has been producing "good" software for some time, a fundamental understanding of the concepts associated with software is often weak. Software developers can react as if to say, "What are you talking about? What do you mean I'm not doing a good job? I haven't gotten any complaints about my code yet!" Coupling such surprise and resentment with the shock that software really is a product and software quality concepts do exist can fuel the volatility of these emotions. In this paper, we demonstrate that the concept of software quality can indeed pose a culture shock to developers. We also show that a "typical" quality assurance approach, that of imposing a standard and providing inspectors and auditors to assure its adherence, contributes to this shock and detracts from the very goal the approach should achieve. We offer an alternative, adopted through experience, to implement a software quality program: cooperative assistance. We show how cooperation, education, consultation and friendly assistance can overcome this culture shock. 3 refs.

  13. 75 FR 35076 - Division of Program Coordination, Planning, and Strategic Initiatives, Office of the Director...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-06-21

    ... discussion are concept review of the following proposed FY 2011 Common Fund initiatives: (1) NIH-HMO Research... Canada): 866-695-1528. Conference code: 7626802625. Place: National Institutes of Health, Building 1...

  14. Solution of the lossy nonlinear Tricomi equation with application to sonic boom focusing

    NASA Astrophysics Data System (ADS)

    Salamone, Joseph A., III

    Sonic boom focusing theory has been augmented with new terms that account for mean flow effects in the direction of propagation and for atmospheric absorption/dispersion due to molecular relaxation of oxygen and nitrogen. The newly derived model equation was implemented numerically in a computer code. The computer code was numerically validated using a spectral solution for nonlinear propagation of a sinusoid through a lossy homogeneous medium. An additional numerical check was performed to verify the linear diffraction component of the code calculations. The computer code was experimentally validated using measured sonic boom focusing data from the NASA-sponsored Superboom Caustic Analysis and Measurement Program (SCAMP) flight test. The computer code was in good agreement with both the numerical and experimental validation results. The newly developed code was applied to examine the focusing of a NASA low-boom demonstration vehicle concept. The resulting pressure field was calculated for several supersonic climb profiles. The shaping efforts designed into the signatures were still somewhat evident despite the effects of sonic boom focusing.

  15. Development of Learning Management in Moral Ethics and Code of Ethics of the Teaching Profession Course

    NASA Astrophysics Data System (ADS)

    Boonsong, S.; Siharak, S.; Srikanok, V.

    2018-02-01

    The purposes of this research were to develop learning management prepared to enhance the moral ethics and code of ethics of students at Rajamangala University of Technology Thanyaburi (RMUTT). The contextual study and the ideas for learning-management development were carried out through document study, the focus-group method, and content analysis of documents about the moral ethics and code of ethics of the teaching profession in the Graduate Diploma for Teaching Profession Program. The main tools of this research were summary and analysis papers. The results showed that the learning management for developing the moral ethics and code of ethics of the teaching profession for Graduate Diploma for Teaching Profession students could promote the desired character through integrated learning techniques consisting of Service Learning, Contract System, Value Clarification, Role Playing, and Concept Mapping. The learning management was presented in 3 steps.

  16. Establishing confidence in complex physics codes: Art or science?

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Trucano, T.

    1997-12-31

    The ALEGRA shock wave physics code, currently under development at Sandia National Laboratories and partially supported by the US Advanced Strategic Computing Initiative (ASCI), is generic to a certain class of physics codes: large, multi-application, intended to support a broad user community on the latest generation of massively parallel supercomputers, and in a continual state of formal development. To say that the author has "confidence" in the results of ALEGRA is to say something different than that he believes that ALEGRA is "predictive." It is the purpose of this talk to illustrate the distinction between these two concepts. The author elects to perform this task in a somewhat historical manner. He will summarize certain older approaches to code validation. He views these methods as aiming to establish the predictive behavior of the code. These methods are distinguished by their emphasis on local information. He will conclude that these approaches are more art than science.

  17. High-resolution imaging gamma-ray spectroscopy with externally segmented germanium detectors

    NASA Technical Reports Server (NTRS)

    Callas, J. L.; Mahoney, W. A.; Varnell, L. S.; Wheaton, W. A.

    1993-01-01

    Externally segmented germanium detectors promise a breakthrough in gamma-ray imaging capabilities while retaining the superb energy resolution of germanium spectrometers. An angular resolution of 0.2 deg becomes practical by combining position-sensitive germanium detectors having a segment thickness of a few millimeters with a one-dimensional coded aperture located about a meter from the detectors. Correspondingly higher angular resolutions are possible with larger separations between the detectors and the coded aperture. Two-dimensional images can be obtained by rotating the instrument. Although the basic concept is similar to optical or X-ray coded-aperture imaging techniques, several complicating effects arise because of the penetrating nature of gamma rays. The complications include partial transmission through the coded aperture elements, Compton scattering in the germanium detectors, and high background count rates. Extensive electron-photon Monte Carlo modeling of a realistic detector/coded-aperture/collimator system has been performed. Results show that these complicating effects can be characterized and accounted for with no significant loss in instrument sensitivity.
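The quoted 0.2 deg figure follows from simple geometry: the angle subtended by one detector segment at the coded aperture. A minimal sketch of that relation (the segment thickness and separation values are assumptions chosen to match the numbers in the abstract):

```python
import math

def angular_resolution_deg(segment_thickness_m, separation_m):
    """Geometric angular resolution of a position-sensitive detector
    segment viewed from a coded aperture at the given separation."""
    return math.degrees(math.atan2(segment_thickness_m, separation_m))

# A few-millimetre segment at ~1 m gives roughly 0.2 degrees.
res_1m = angular_resolution_deg(0.003, 1.0)
# Doubling the detector/aperture separation roughly halves the angle,
# consistent with "higher angular resolutions with larger separations".
res_2m = angular_resolution_deg(0.003, 2.0)
```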

  18. A genetic scale of reading frame coding.

    PubMed

    Michel, Christian J

    2014-08-21

    The reading frame coding (RFC) of codes (sets) of trinucleotides is a genetic concept which has been largely ignored during the last 50 years. A first objective is the definition of a new and simple statistical parameter PrRFC for analysing the probability (efficiency) of reading frame coding (RFC) of any trinucleotide code. A second objective is to reveal different classes and subclasses of trinucleotide codes involved in reading frame coding: the circular codes of 20 trinucleotides and the bijective genetic codes of 20 trinucleotides coding the 20 amino acids. This approach allows us to propose a genetic scale of reading frame coding which ranges from 1/3 with the random codes (RFC probability identical in the three frames) to 1 with the comma-free circular codes (RFC probability maximal in the reading frame and null in the two shifted frames). This genetic scale shows, in particular, the reading frame coding probabilities of the 12,964,440 circular codes (PrRFC=83.2% in average), the 216 C(3) self-complementary circular codes (PrRFC=84.1% in average) including the code X identified in eukaryotic and prokaryotic genes (PrRFC=81.3%) and the 339,738,624 bijective genetic codes (PrRFC=61.5% in average) including the 52 codes without permuted trinucleotides (PrRFC=66.0% in average). Otherwise, the reading frame coding probabilities of each trinucleotide code coding an amino acid with the universal genetic code are also determined. The four amino acids Gly, Lys, Phe and Pro are coded by codes (not circular) with RFC probabilities equal to 2/3, 1/2, 1/2 and 2/3, respectively. The amino acid Leu is coded by a circular code (not comma-free) with a RFC probability equal to 18/19. The 15 other amino acids are coded by comma-free circular codes, i.e. with RFC probabilities equal to 1. The identification of coding properties in some classes of trinucleotide codes studied here may bring new insights in the origin and evolution of the genetic code. 
Copyright © 2014 Elsevier Ltd. All rights reserved.
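The comma-free property that gives 15 of the amino-acid codes an RFC probability of 1 can be stated operationally: no trinucleotide read in a shifted frame across a concatenation of two codewords may itself be a codeword. A minimal sketch of that check (the function name is ours, not Michel's notation):

```python
def is_comma_free(code):
    """True if no trinucleotide read in a shifted frame across any
    concatenation of two codewords belongs to the code itself."""
    code = set(code)
    for a in code:
        for b in code:
            pair = a + b  # six letters; reading frame starts at position 0
            # The two shifted-frame readings inside the concatenation.
            if pair[1:4] in code or pair[2:5] in code:
                return False
    return True
```

By this test {"AAC"} is comma-free, while {"AAA"} is not: AAA read in either shifted frame of AAAAAA is again AAA, so the reading frame cannot be retrieved.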

  19. Kaiser Permanente's Convergent Medical Terminology.

    PubMed

    Dolin, Robert H; Mattison, John E; Cohn, Simon; Campbell, Keith E; Wiesenthal, Andrew M; Hochhalter, Brad; LaBerge, Diane; Barsoum, Rita; Shalaby, James; Abilla, Alan; Clements, Robert J; Correia, Carol M; Esteva, Diane; Fedack, John M; Goldberg, Bruce J; Gopalarao, Sridhar; Hafeza, Eza; Hendler, Peter; Hernandez, Enrique; Kamangar, Ron; Kahn, Rafique A; Kurtovich, Georgina; Lazzareschi, Gerry; Lee, Moon H; Lee, Tracy; Levy, David; Lukoff, Jonathan Y; Lundberg, Cyndie; Madden, Michael P; Ngo, Trongtu L; Nguyen, Ben T; Patel, Nikhilkumar P; Resneck, Jim; Ross, David E; Schwarz, Kathleen M; Selhorst, Charles C; Snyder, Aaron; Umarji, Mohamed I; Vilner, Max; Zer-Chen, Roy; Zingo, Chris

    2004-01-01

    This paper describes Kaiser Permanente's (KP) enterprise-wide medical terminology solution, referred to as our Convergent Medical Terminology (CMT). Initially developed to serve the needs of a regional electronic health record, CMT has evolved into a core KP asset, serving as the common terminology across all applications. CMT serves as the definitive source of concept definitions for the organization, provides a consistent structure and access method for all codes used by the organization, and is KP's language of interoperability, with cross-mappings to regional ancillary systems and administrative billing codes. The core of CMT comprises SNOMED CT, laboratory LOINC, and First DataBank drug terminology. These are integrated into a single poly-hierarchically structured knowledge base. Cross-map sets provide bi-directional translations between CMT and ancillary applications and administrative billing codes. Context sets provide subsets of CMT for use in specific contexts. Our experience with CMT has led us to conclude that a successful terminology solution requires that: (1) usability considerations are an organizational priority; (2) "interface" terminology is differentiated from "reference" terminology; (3) it be easy for clinicians to find the concepts they need; (4) the immediate value of coded data be apparent to the clinician user; (5) there be a well-defined approach to terminology extensions. Over the past several years, there has been substantial progress in the domain coverage and standardization of medical terminology. KP has learned to exploit that terminology in ways that are clinician-acceptable and that provide powerful options for data analysis and reporting.

  20. A parallel and modular deformable cell Car-Parrinello code

    NASA Astrophysics Data System (ADS)

    Cavazzoni, Carlo; Chiarotti, Guido L.

    1999-12-01

    We have developed a modular parallel code implementing the Car-Parrinello [Phys. Rev. Lett. 55 (1985) 2471] algorithm, including variable-cell dynamics [Europhys. Lett. 36 (1994) 345; J. Phys. Chem. Solids 56 (1995) 510]. Our code is written in Fortran 90 and makes use of programming concepts such as encapsulation, data abstraction and data hiding. The code has a multi-layer hierarchical structure with tree-like dependences among modules. The modules include not only the variables but also the methods acting on them, in an object-oriented fashion. The modular structure allows easier code maintenance, development and debugging, and is suitable for a developer team. The layer structure permits high portability. The code displays an almost linear speed-up over a wide range of numbers of processors, independently of the architecture. Super-linear speed-up is obtained with a "smart" Fast Fourier Transform (FFT) that uses the available memory on the single node (which, for a fixed problem, increases with the number of processing elements) as a temporary buffer to store wave-function transforms. This code has been used to simulate water and ammonia at giant-planet conditions for systems as large as 64 molecules for ~50 ps.

  1. The CCONE Code System and its Application to Nuclear Data Evaluation for Fission and Other Reactions

    NASA Astrophysics Data System (ADS)

    Iwamoto, O.; Iwamoto, N.; Kunieda, S.; Minato, F.; Shibata, K.

    2016-01-01

    A computer code system, CCONE, was developed for nuclear data evaluation within the JENDL project. The CCONE code system integrates the various nuclear reaction models needed to describe reactions induced by nucleons, light charged nuclei up to alpha particles, and photons. The code is written in the C++ programming language using object-oriented technology. It was first applied to neutron-induced reaction data on actinides, which were compiled into the JENDL Actinide File 2008 and JENDL-4.0. It has since been used extensively in nuclear data evaluations for both actinide and non-actinide nuclei. The CCONE code has been upgraded for nuclear data evaluation at higher incident energies for neutron-, proton-, and photon-induced reactions. It was also used for estimating β-delayed neutron emission. This paper describes the CCONE code system, indicating the concept and design of the coding and inputs. Details of the formulation for modeling the direct, pre-equilibrium and compound reactions are presented. Applications to nuclear data evaluations such as neutron-induced reactions on actinides and medium-heavy nuclei, high-energy nucleon-induced reactions, photonuclear reactions and β-delayed neutron emission are discussed.

  2. A systems neurophysiology approach to voluntary event coding.

    PubMed

    Petruo, Vanessa A; Stock, Ann-Kathrin; Münchau, Alexander; Beste, Christian

    2016-07-15

    Mechanisms responsible for the integration of perceptual events and appropriate actions (sensorimotor processes) have been subject to intense research. Different theoretical frameworks have been put forward, with the "Theory of Event Coding" (TEC) being one of the most influential. In the current study, we focus on the concept of 'event files' within TEC and examine which sub-processes, dissociable by means of cognitive-neurophysiological methods, are involved in voluntary event coding. This was combined with EEG source localization. We also introduce reward manipulations to delineate the neurophysiological sub-processes most relevant for performance variations during event coding. The results show that the processes involved in voluntary event coding predominantly included stimulus categorization, feature unbinding and response selection, which were reflected by distinct neurophysiological processes (the P1, N2 and P3 ERPs). On a systems-neurophysiological level, voluntary event-file coding is thus related to widely distributed parietal-medial frontal networks. Attentional selection processes (N1 ERP) turned out to be less important. Reward modulated stimulus categorization in parietal regions, likely reflecting aspects of perceptual decision making, but not the other processes. The perceptual categorization stage appears central for voluntary event-file coding. Copyright © 2016 Elsevier Inc. All rights reserved.

  3. Working research codes into fluid dynamics education: a science gateway approach

    NASA Astrophysics Data System (ADS)

    Mason, Lachlan; Hetherington, James; O'Reilly, Martin; Yong, May; Jersakova, Radka; Grieve, Stuart; Perez-Suarez, David; Klapaukh, Roman; Craster, Richard V.; Matar, Omar K.

    2017-11-01

    Research codes are effective for illustrating complex concepts in educational fluid dynamics courses: compared to textbook examples, an interactive three-dimensional visualisation can bring a problem to life! Various barriers, however, prevent the adoption of research codes in teaching: codes are typically created for highly specific 'once-off' calculations and, as such, have no user interface and a steep learning curve. Moreover, a code may require access to high-performance computing resources that are not readily available in the classroom. This project allows academics to rapidly work research codes into their teaching via a minimalist 'science gateway' framework. The gateway is a simple, yet flexible, web interface allowing students to construct and run simulations, as well as view and share their output. Behind the scenes, the common operations of job configuration, submission, monitoring and post-processing are customisable at the level of shell scripting. In this talk, we demonstrate the creation of an example teaching gateway connected to the Code BLUE fluid dynamics software. Student simulations can be run via a third-party cloud computing provider or a local high-performance cluster. EPSRC, UK, MEMPHIS program Grant (EP/K003976/1), RAEng Research Chair (OKM).

  4. Second-order statistics of colour codes modulate transformations that effectuate varying degrees of scene invariance and illumination invariance.

    PubMed

    Mausfeld, Rainer; Andres, Johannes

    2002-01-01

    We argue, from an ethology-inspired perspective, that the internal concepts 'surface colours' and 'illumination colours' are part of the data format of two different representational primitives. Thus, the internal concept of 'colour' is not a unitary one but rather refers to two different types of 'data structure', each with its own proprietary types of parameters and relations. The relation of these representational structures is modulated by a class of parameterised transformations whose effects are mirrored in the idealised computational achievements of illumination invariance of colour codes, on the one hand, and scene invariance, on the other hand. Because the same characteristics of a light array reaching the eye can be physically produced in many different ways, the visual system, then, has to make an 'inference' whether a chromatic deviation of the space-averaged colour codes from the neutral point is due to a 'non-normal', ie chromatic, illumination or due to an imbalanced spectral reflectance composition. We provide evidence that the visual system uses second-order statistics of chromatic codes of a single view of a scene in order to modulate corresponding transformations. In our experiments we used centre surround configurations with inhomogeneous surrounds given by a random structure of overlapping circles, referred to as Seurat configurations. Each family of surrounds has a fixed space-average of colour codes, but differs with respect to the covariance matrix of colour codes of pixels that defines the chromatic variance along some chromatic axis and the covariance between luminance and chromatic channels. We found that dominant wavelengths of red-green equilibrium settings of the infield exhibited a stable and strong dependence on the chromatic variance of the surround. High variances resulted in a tendency towards 'scene invariance', low variances in a tendency towards 'illumination invariance' of the infield.
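The experimental manipulation above, surround families sharing a fixed space-average of colour codes but differing in chromatic variance, can be sketched numerically. The values below are invented for illustration; only the statistical contrast between the two families is the point.

```python
import statistics

def second_order_stats(chromatic_codes):
    """Space-average (first-order) and variance (second-order statistic)
    of one chromatic coordinate over the pixels of a surround."""
    mean = statistics.fmean(chromatic_codes)
    variance = statistics.pvariance(chromatic_codes, mu=mean)
    return mean, variance

# Two hypothetical surround families ("Seurat configurations"):
# identical space-average, different variance along a chromatic axis.
low_variance_surround = [0.49, 0.50, 0.51, 0.50]
high_variance_surround = [0.30, 0.70, 0.40, 0.60]
```

On the paper's account, the high-variance family should push settings towards 'scene invariance' and the low-variance family towards 'illumination invariance', even though a first-order (space-average) model cannot distinguish the two.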

  5. Scalable L-infinite coding of meshes.

    PubMed

    Munteanu, Adrian; Cernea, Dan C; Alecu, Alin; Cornelis, Jan; Schelkens, Peter

    2010-01-01

    The paper investigates the novel concept of local-error control in mesh geometry encoding. In contrast to traditional mesh-coding systems that use the mean-square error as target distortion metric, this paper proposes a new L-infinite mesh-coding approach, for which the target distortion metric is the L-infinite distortion. In this context, a novel wavelet-based L-infinite-constrained coding approach for meshes is proposed, which ensures that the maximum error between the vertex positions in the original and decoded meshes is lower than a given upper bound. Furthermore, the proposed system achieves scalability in L-infinite sense, that is, any decoding of the input stream will correspond to a perfectly predictable L-infinite distortion upper bound. An instantiation of the proposed L-infinite-coding approach is demonstrated for MESHGRID, which is a scalable 3D object encoding system, part of MPEG-4 AFX. In this context, the advantages of scalable L-infinite coding over L-2-oriented coding are experimentally demonstrated. One concludes that the proposed L-infinite mesh-coding approach guarantees an upper bound on the local error in the decoded mesh, it enables a fast real-time implementation of the rate allocation, and it preserves all the scalability features and animation capabilities of the employed scalable mesh codec.
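The difference between the two target metrics can be stated in a few lines. A sketch with synthetic vertex data (not the MESHGRID codec itself): the L-infinite metric bounds the worst single-vertex displacement, while the mean-square metric bounds only the average.

```python
import numpy as np

# Synthetic stand-in for original and decoded vertex positions (N x 3).
rng = np.random.default_rng(1)
original = rng.random((500, 3))
decoded = original + rng.normal(scale=0.001, size=(500, 3))

per_vertex_err = np.linalg.norm(original - decoded, axis=1)
l2_distortion = float(np.mean(per_vertex_err ** 2))  # mean-square (L2) metric
linf_distortion = float(per_vertex_err.max())        # L-infinite metric

# An L-infinite-constrained coder guarantees linf_distortion stays below a
# known bound for every decoded layer of the scalable stream; an MSE target
# bounds only the mean, so individual vertices may still be badly displaced.
print(linf_distortion <= 0.01)
```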

  6. Navy’s Advanced Aircraft Armament System Program Concept Objectives

    DTIC Science & Technology

    1983-10-01

T. M. Leese and J. F. Haney, Naval Weapons Center Code 31403, China... [The remainder of this record is unrecoverable OCR of a briefing chart; the only legible caption is "Figure 1. Carrier aircraft".]

  7. Threshold quantum cryptography

    NASA Astrophysics Data System (ADS)

    Tokunaga, Yuuki; Okamoto, Tatsuaki; Imoto, Nobuyuki

    2005-01-01

    We present the concept of threshold collaborative unitary transformation or threshold quantum cryptography, which is a kind of quantum version of threshold cryptography. Threshold quantum cryptography states that classical shared secrets are distributed to several parties and a subset of them, whose number is greater than a threshold, collaborates to compute a quantum cryptographic function, while keeping each share secretly inside each party. The shared secrets are reusable if no cheating is detected. As a concrete example of this concept, we show a distributed protocol (with threshold) of conjugate coding.
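The classical ingredient, shared secrets that any subset of at least t parties can reconstruct, can be illustrated with Shamir's threshold scheme. This sketch covers only the classical sharing step, not the quantum collaborative computation of the paper:

```python
import random

P = 2**61 - 1  # a Mersenne prime; all arithmetic is carried out mod P

def share(secret, t, n, rng=random.Random(42)):
    """Split `secret` into n shares; any t of them reconstruct it."""
    coeffs = [secret] + [rng.randrange(P) for _ in range(t - 1)]
    def f(x):  # evaluate the degree-(t-1) polynomial at x (Horner's rule)
        acc = 0
        for c in reversed(coeffs):
            acc = (acc * x + c) % P
        return acc
    return [(x, f(x)) for x in range(1, n + 1)]

def reconstruct(shares):
    """Lagrange interpolation at x = 0 over GF(P) recovers the secret."""
    secret = 0
    for i, (xi, yi) in enumerate(shares):
        num, den = 1, 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = (num * -xj) % P
                den = (den * (xi - xj)) % P
        secret = (secret + yi * num * pow(den, P - 2, P)) % P
    return secret

shares = share(123456789, t=3, n=5)
assert reconstruct(shares[:3]) == 123456789   # any 3 of the 5 shares suffice
assert reconstruct(shares[2:]) == 123456789
```

In the threshold quantum protocol, each party would keep its share secret while the subset collaborates to apply the shared-secret-dependent unitary, rather than pooling shares as the classical reconstruction above does.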

  8. ACCESS 1: Approximation Concepts Code for Efficient Structural Synthesis program documentation and user's guide

    NASA Technical Reports Server (NTRS)

    Miura, H.; Schmit, L. A., Jr.

    1976-01-01

    The program documentation and user's guide for the ACCESS-1 computer program is presented. ACCESS-1 is a research oriented program which implements a collection of approximation concepts to achieve excellent efficiency in structural synthesis. The finite element method is used for structural analysis and general mathematical programming algorithms are applied in the design optimization procedure. Implementation of the computer program, preparation of input data and basic program structure are described, and three illustrative examples are given.

  9. The fully programmable spacecraft: procedural sequencing for JPL deep space missions using VML (Virtual Machine Language)

    NASA Technical Reports Server (NTRS)

    Grasso, C. A.

    2002-01-01

    This paper lays out language constructs and capabilities, code features, and VML operations development concepts. The ability to migrate to the spacecraft functionality which is more traditionally implemented on the ground is examined.

  10. Symbolic Speech

    ERIC Educational Resources Information Center

    Podgor, Ellen S.

    1976-01-01

    The concept of symbolic speech emanates from the 1967 case of United States v. O'Brien. These discussions of flag desecration, grooming and dress codes, nude entertainment, buttons and badges, and musical expression show that the courts place symbolic speech in different strata from verbal communication. (LBH)

  11. Spontaneous self-descriptions and ethnic identities in individualistic and collectivistic cultures.

    PubMed

    Rhee, E; Uleman, J S; Lee, H K; Roman, R J

    1995-07-01

The Twenty Statements Test (TST) was administered in Seoul and New York, to 454 students from 2 cultures that emphasize collectivism and individualism, respectively. Responses, coded into 33 categories, were classified as either abstract or specific and as either autonomous or social. These 2 dichotomies were more independent in Seoul than in New York. The New York sample included Asian Americans whose spontaneous social identities differed. They either never listed ethnicity-nationality on the TST, or listed it once or twice. Unidentified Asian Americans' self-concepts resembled Euro-Americans' self-concepts, and twice-identified Asian Americans' self-concepts resembled Koreans' self-concepts, in both abstractness-specificity and autonomy-sociality. Differential acculturation did not account for these results. Implications for social identity, self-categorization, and acculturation theory are discussed.

  12. Information coding with frequency of oscillations in Belousov-Zhabotinsky encapsulated disks

    NASA Astrophysics Data System (ADS)

    Gorecki, J.; Gorecka, J. N.; Adamatzky, Andrew

    2014-04-01

Information processing with an excitable chemical medium, like the Belousov-Zhabotinsky (BZ) reaction, is typically based on information coding in the presence or absence of excitation pulses. Here we present a new concept of Boolean coding that can be applied to an oscillatory medium. A medium represents the logical TRUE state if a selected region oscillates with a high frequency. If the frequency falls below a specified value, it represents the logical FALSE state. We consider a medium composed of disks encapsulating an oscillatory mixture of reagents, as related to our recent experiments with lipid-coated BZ droplets. We demonstrate that by using specific geometrical arrangements of disks containing the oscillatory medium one can perform logical operations on variables coded in oscillation frequency. Realizations of a chemical signal diode and of a single-bit memory with oscillatory disks are also discussed.
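On the read-out side, the proposed Boolean coding reduces to estimating a region's oscillation frequency and comparing it with a threshold. A minimal sketch with synthetic signals (hypothetical sampling parameters, not the BZ chemistry):

```python
import math

def boolean_from_frequency(signal, dt, threshold_hz):
    """Count upward zero-crossings to estimate oscillation frequency,
    then map it to a Boolean: high frequency -> TRUE, low -> FALSE."""
    crossings = sum(1 for a, b in zip(signal, signal[1:]) if a < 0 <= b)
    duration = dt * (len(signal) - 1)
    return crossings / duration >= threshold_hz

dt = 0.001
t = [i * dt for i in range(5000)]                     # 5 s of samples
fast = [math.sin(2 * math.pi * 4.0 * x) for x in t]   # ~4 Hz oscillator
slow = [math.sin(2 * math.pi * 0.5 * x) for x in t]   # ~0.5 Hz oscillator

print(boolean_from_frequency(fast, dt, threshold_hz=2.0))  # True  (logical TRUE)
print(boolean_from_frequency(slow, dt, threshold_hz=2.0))  # False (logical FALSE)
```

In the paper the frequency itself is set chemically, by the geometry and coupling of the oscillatory disks; the snippet only illustrates the coding convention.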

  13. Crucial steps to life: From chemical reactions to code using agents.

    PubMed

    Witzany, Guenther

    2016-02-01

The concepts of the origin of the genetic code and the definitions of life changed dramatically after the RNA world hypothesis. Main narratives in molecular biology and genetics such as the "central dogma," "one gene one protein" and "non-coding DNA is junk" have meanwhile been falsified. RNA moved from the transition intermediate molecule into centre stage. Additionally, the abundance of empirical data concerning non-random genetic change operators such as the variety of mobile genetic elements, persistent viruses and defectives does not fit with the dominant narrative of error replication events (mutations) as being the main driving forces creating genetic novelty and diversity. The reductionistic and mechanistic views on physico-chemical properties of the genetic code are no longer convincing as appropriate descriptions of the abundance of non-random genetic content operators which are active in natural genetic engineering and natural genome editing. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.

  14. The Development of Bimodal Bilingualism: Implications for Linguistic Theory.

    PubMed

    Lillo-Martin, Diane; de Quadros, Ronice Müller; Pichler, Deborah Chen

    2016-01-01

    A wide range of linguistic phenomena contribute to our understanding of the architecture of the human linguistic system. In this paper we present a proposal dubbed Language Synthesis to capture bilingual phenomena including code-switching and 'transfer' as automatic consequences of the addition of a second language, using basic concepts of Minimalism and Distributed Morphology. Bimodal bilinguals, who use a sign language and a spoken language, provide a new type of evidence regarding possible bilingual phenomena, namely code-blending, the simultaneous production of (aspects of) a message in both speech and sign. We argue that code-blending also follows naturally once a second articulatory interface is added to the model. Several different types of code-blending are discussed in connection to the predictions of the Synthesis model. Our primary data come from children developing as bimodal bilinguals, but our proposal is intended to capture a wide range of bilingual effects across any language pair.

  15. Computational fluid dynamics of airfoils and wings

    NASA Technical Reports Server (NTRS)

    Garabedian, P.; Mcfadden, G.

    1982-01-01

    It is pointed out that transonic flow is one of the fields where computational fluid dynamics turns out to be most effective. Codes for the design and analysis of supercritical airfoils and wings have become standard tools of the aircraft industry. The present investigation is concerned with mathematical models and theorems which account for some of the progress that has been made. The most successful aerodynamics codes are those for the analysis of flow at off-design conditions where weak shock waves appear. A major breakthrough was achieved by Murman and Cole (1971), who conceived of a retarded difference scheme which incorporates artificial viscosity to capture shocks in the supersonic zone. This concept has been used to develop codes for the analysis of transonic flow past a swept wing. Attention is given to the trailing edge and the boundary layer, entropy inequalities and wave drag, shockless airfoils, and the inverse swept wing code.

  16. ASME Code Efforts Supporting HTGRs

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    D.K. Morton

    2010-09-01

In 1999, an international collaborative initiative for the development of advanced (Generation IV) reactors was started. The idea behind this effort was to bring nuclear energy closer to the needs of sustainability, to increase proliferation resistance, and to support concepts able to produce energy (both electricity and process heat) at competitive costs. The U.S. Department of Energy has supported this effort by pursuing the development of the Next Generation Nuclear Plant, a high temperature gas-cooled reactor. This support has included research and development of pertinent data, initial regulatory discussions, and engineering support of various codes and standards development. This report discusses the various applicable American Society of Mechanical Engineers (ASME) codes and standards that are being developed to support these high temperature gas-cooled reactors during construction and operation. ASME is aggressively pursuing these codes and standards to support an international effort to build the next generation of advanced reactors so that all can benefit.

  17. ASME Code Efforts Supporting HTGRs

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    D.K. Morton

    2011-09-01

In 1999, an international collaborative initiative for the development of advanced (Generation IV) reactors was started. The idea behind this effort was to bring nuclear energy closer to the needs of sustainability, to increase proliferation resistance, and to support concepts able to produce energy (both electricity and process heat) at competitive costs. The U.S. Department of Energy has supported this effort by pursuing the development of the Next Generation Nuclear Plant, a high temperature gas-cooled reactor. This support has included research and development of pertinent data, initial regulatory discussions, and engineering support of various codes and standards development. This report discusses the various applicable American Society of Mechanical Engineers (ASME) codes and standards that are being developed to support these high temperature gas-cooled reactors during construction and operation. ASME is aggressively pursuing these codes and standards to support an international effort to build the next generation of advanced reactors so that all can benefit.

  18. The Development of Bimodal Bilingualism: Implications for Linguistic Theory

    PubMed Central

    Lillo-Martin, Diane; de Quadros, Ronice Müller; Pichler, Deborah Chen

    2017-01-01

    A wide range of linguistic phenomena contribute to our understanding of the architecture of the human linguistic system. In this paper we present a proposal dubbed Language Synthesis to capture bilingual phenomena including code-switching and ‘transfer’ as automatic consequences of the addition of a second language, using basic concepts of Minimalism and Distributed Morphology. Bimodal bilinguals, who use a sign language and a spoken language, provide a new type of evidence regarding possible bilingual phenomena, namely code-blending, the simultaneous production of (aspects of) a message in both speech and sign. We argue that code-blending also follows naturally once a second articulatory interface is added to the model. Several different types of code-blending are discussed in connection to the predictions of the Synthesis model. Our primary data come from children developing as bimodal bilinguals, but our proposal is intended to capture a wide range of bilingual effects across any language pair. PMID:28603576

  19. ASME Code Efforts Supporting HTGRs

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    D.K. Morton

    2012-09-01

In 1999, an international collaborative initiative for the development of advanced (Generation IV) reactors was started. The idea behind this effort was to bring nuclear energy closer to the needs of sustainability, to increase proliferation resistance, and to support concepts able to produce energy (both electricity and process heat) at competitive costs. The U.S. Department of Energy has supported this effort by pursuing the development of the Next Generation Nuclear Plant, a high temperature gas-cooled reactor. This support has included research and development of pertinent data, initial regulatory discussions, and engineering support of various codes and standards development. This report discusses the various applicable American Society of Mechanical Engineers (ASME) codes and standards that are being developed to support these high temperature gas-cooled reactors during construction and operation. ASME is aggressively pursuing these codes and standards to support an international effort to build the next generation of advanced reactors so that all can benefit.

  20. Code query by example

    NASA Astrophysics Data System (ADS)

    Vaucouleur, Sebastien

    2011-02-01

    We introduce code query by example for customisation of evolvable software products in general and of enterprise resource planning systems (ERPs) in particular. The concept is based on an initial empirical study on practices around ERP systems. We motivate our design choices based on those empirical results, and we show how the proposed solution helps with respect to the infamous upgrade problem: the conflict between the need for customisation and the need for upgrade of ERP systems. We further show how code query by example can be used as a form of lightweight static analysis, to detect automatically potential defects in large software products. Code query by example as a form of lightweight static analysis is particularly interesting in the context of ERP systems: it is often the case that programmers working in this field are not computer science specialists but more of domain experts. Hence, they require a simple language to express custom rules.
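One way to realise query-by-example over code is to wildcard identifiers and literals in the example and match on the remaining syntactic shape, which keeps the query language as simple as writing ordinary code. A simplified AST-based sketch (my own illustration, not the paper's system):

```python
import ast

def shape(node):
    """Structural fingerprint of an AST node: identifiers and literal
    values are wildcarded, so only the code's shape remains."""
    if isinstance(node, ast.Name):
        return ("Name",)
    if isinstance(node, ast.Constant):
        return ("Constant",)
    return (type(node).__name__,) + tuple(
        (field, tuple(shape(v)
                      for v in (value if isinstance(value, list) else [value])
                      if isinstance(v, ast.AST)))
        for field, value in ast.iter_fields(node)
    )

def query_by_example(example, source):
    """Return line numbers of expressions sharing the example's shape."""
    pattern = shape(ast.parse(example, mode="eval").body)
    return sorted(node.lineno for node in ast.walk(ast.parse(source))
                  if isinstance(node, ast.expr) and shape(node) == pattern)

src = "total = price * (1 + rate)\nname = first + last\nx = a * (1 + b)\n"
print(query_by_example("p * (1 + q)", src))  # [1, 3]
```

A domain expert writes an example fragment and gets back every structurally similar site, which is the lightweight-static-analysis use case the abstract describes.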

  1. Decoding the genome beyond sequencing: the new phase of genomic research.

    PubMed

    Heng, Henry H Q; Liu, Guo; Stevens, Joshua B; Bremer, Steven W; Ye, Karen J; Abdallah, Batoul Y; Horne, Steven D; Ye, Christine J

    2011-10-01

While our understanding of gene-based biology has greatly improved, it is clear that the function of the genome and most diseases cannot be fully explained by genes and other regulatory elements. Genes and the genome represent distinct levels of genetic organization with their own coding systems: genes code parts like proteins and RNA, but the genome codes the structure of genetic networks, which are defined by the whole set of genes, chromosomes and their topological interactions within a cell. Accordingly, the genetic code of DNA offers limited understanding of genome functions. In this perspective, we introduce the genome theory which calls for a departure from gene-centric genomic research. To make this transition for the next phase of genomic research, it is essential to acknowledge the importance of new genome-based biological concepts and to establish new technology platforms to decode the genome beyond sequencing. Copyright © 2011 Elsevier Inc. All rights reserved.

  2. Superwide-angle coverage code-multiplexed optical scanner.

    PubMed

    Riza, Nabeel A; Arain, Muzammil A

    2004-05-01

A superwide-angle coverage code-multiplexed optical scanner is presented that has the potential to provide 4π sr coverage. As a proof-of-concept experiment, an angular scan range of 288 degrees for six randomly distributed beams is demonstrated. The proposed scanner achieves its superwide coverage by exploiting a combination of phase-encoded transmission and reflection holography within an in-line hologram recording-retrieval geometry. The basic scanner unit consists of one phase-only digital mode spatial light modulator for code entry (i.e., beam scan control) and a holographic material from which we obtained what we believe is a first-of-its-kind extremely wide coverage, low component count, high speed (e.g., microsecond domain), and large aperture (e.g., > 1-cm diameter) scanner.

  3. Chemical reacting flows

    NASA Astrophysics Data System (ADS)

    Lezberg, Erwin A.; Mularz, Edward J.; Liou, Meng-Sing

    1991-03-01

The objectives and accomplishments of research in chemical reacting flows, including both experimental and computational problems, are described. The experimental research emphasizes the acquisition of reliable reacting-flow data for code validation, the development of chemical kinetics mechanisms, and the understanding of two-phase flow dynamics. Typical results from two nonreacting spray studies are presented. The computational fluid dynamics (CFD) research emphasizes the development of efficient and accurate algorithms and codes, as well as validation of methods and modeling (turbulence and kinetics) for reacting flows. Major developments of the RPLUS code and its application to mixing concepts, the General Electric combustor, and the Government baseline engine for the National Aerospace Plane are detailed. Finally, the turbulence research in the newly established Center for Modeling of Turbulence and Transition (CMOTT) is described.

4. The origins of informed consent: the International Scientific Commission on Medical War Crimes, and the Nuremberg Code.

    PubMed

    Weindling, P

    2001-01-01

    The Nuremberg Code has generally been seen as arising from the Nuremberg Medical Trial. This paper examines developments prior to the Trial, involving the physiologist Andrew Conway Ivy and an inter-Allied Scientific Commission on Medical War Crimes. The paper traces the formulation of the concept of a medical war crime by the physiologist John West Thompson, as part of the background to Ivy's code on human experiments of 1 August 1946. It evaluates subsequent responses by the American Medical Association, and by other war crimes experts, notably Leo Alexander, who developed Ivy's conceptual framework. Ivy's interaction with the judges at Nuremberg alerted them to the importance of formulating ethical guidelines for clinical research.

  5. Support of Multidimensional Parallelism in the OpenMP Programming Model

    NASA Technical Reports Server (NTRS)

    Jin, Hao-Qiang; Jost, Gabriele

    2003-01-01

OpenMP is the current standard for shared-memory programming. While providing ease of parallel programming, the OpenMP programming model also has limitations which often affect the scalability of applications. Examples of these limitations are work distribution and point-to-point synchronization among threads. We propose extensions to the OpenMP programming model which allow the user to easily distribute the work in multiple dimensions and synchronize the workflow among the threads. The proposed extensions include four new constructs and the associated runtime library. They do not require changes to the source code and can be implemented based on the existing OpenMP standard. We illustrate the concept in a prototype translator and test with benchmark codes and a cloud modeling code.

  6. Topology-selective jamming of fully-connected, code-division random-access networks

    NASA Technical Reports Server (NTRS)

    Polydoros, Andreas; Cheng, Unjeng

    1990-01-01

    The purpose is to introduce certain models of topology selective stochastic jamming and examine its impact on a class of fully-connected, spread-spectrum, slotted ALOHA-type random access networks. The theory covers dedicated as well as half-duplex units. The dominant role of the spatial duty factor is established, and connections with the dual concept of time selective jamming are discussed. The optimal choices of coding rate and link access parameters (from the users' side) and the jamming spatial fraction are numerically established for DS and FH spreading.

  7. Modeling Electromagnetic Scattering From Complex Inhomogeneous Objects

    NASA Technical Reports Server (NTRS)

    Deshpande, Manohar; Reddy, C. J.

    2011-01-01

    This software innovation is designed to develop a mathematical formulation to estimate the electromagnetic scattering characteristics of complex, inhomogeneous objects using the finite-element-method (FEM) and method-of-moments (MoM) concepts, as well as to develop a FORTRAN code called FEMOM3DS (Finite Element Method and Method of Moments for 3-Dimensional Scattering), which will implement the steps that are described in the mathematical formulation. Very complex objects can be easily modeled, and the operator of the code is not required to know the details of electromagnetic theory to study electromagnetic scattering.

  8. Laboratory automation in a functional programming language.

    PubMed

    Runciman, Colin; Clare, Amanda; Harkness, Rob

    2014-12-01

    After some years of use in academic and research settings, functional languages are starting to enter the mainstream as an alternative to more conventional programming languages. This article explores one way to use Haskell, a functional programming language, in the development of control programs for laboratory automation systems. We give code for an example system, discuss some programming concepts that we need for this example, and demonstrate how the use of functional programming allows us to express and verify properties of the resulting code. © 2014 Society for Laboratory Automation and Screening.

  9. Parameter study of dual-mode space nuclear fission solid core power and propulsion systems, NUROC3A. AMS report No. 1239c

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Smith, W.W.; Layton, J.P.

    1976-09-13

    The three-volume report describes a dual-mode nuclear space power and propulsion system concept that employs an advanced solid-core nuclear fission reactor coupled via heat pipes to one of several electric power conversion systems. The NUROC3A systems analysis code was designed to provide the user with performance characteristics of the dual-mode system. Volume 3 describes utilization of the NUROC3A code to produce a detailed parameter study of the system.

  10. The Use of a Code-generating System for the Derivation of the Equations for Wind Turbine Dynamics

    NASA Astrophysics Data System (ADS)

    Ganander, Hans

    2003-10-01

For many reasons the size of wind turbines on the rapidly growing wind energy market is increasing. Relations between aeroelastic properties of these new large turbines change. Modifications of turbine designs and control concepts are also influenced by growing size. All these trends require development of computer codes for design and certification. Moreover, there is a strong desire for design optimization procedures, which require fast codes. General codes, e.g. finite element codes, normally allow such modifications and improvements of existing wind turbine models. This is done relatively easily. However, the calculation times of such codes are unfavourably long, certainly for optimization use. The use of an automatic code-generating system is an alternative that addresses both of these key issues: the code and the design optimization. This technique can be used for rapid generation of codes of particular wind turbine simulation models. These ideas have been followed in the development of new versions of the wind turbine simulation code VIDYN. The equations of the simulation model were derived according to the Lagrange equation and using Mathematica®, which was directed to output the results in Fortran code format. In this way the simulation code is automatically adapted to an actual turbine model, in terms of subroutines containing the equations of motion, definitions of parameters and degrees of freedom. Since the start in 1997, these methods, constituting a systematic way of working, have been used to develop specific efficient calculation codes. The experience with this technique has been very encouraging, inspiring the continued development of new versions of the simulation code as the need has arisen, and interest in design optimization is growing.
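The workflow described here, derive the model equations symbolically and then emit them as Fortran source for the simulation code, can be miniaturised as a toy. This plain-Python sketch (no simplification step, nothing to do with VIDYN or Mathematica's actual output) differentiates a tiny expression tree and prints it in Fortran syntax:

```python
def diff(expr, var):
    """Differentiate a tiny expression tree: numbers, variable names,
    and ('+', a, b) / ('*', a, b) nodes."""
    if isinstance(expr, (int, float)):
        return 0
    if isinstance(expr, str):
        return 1 if expr == var else 0
    op, a, b = expr
    if op == "+":
        return ("+", diff(a, var), diff(b, var))
    if op == "*":  # product rule
        return ("+", ("*", diff(a, var), b), ("*", a, diff(b, var)))
    raise ValueError(op)

def fortran(expr):
    """Emit the tree as a Fortran expression string."""
    if isinstance(expr, (int, float)):
        return repr(expr)
    if isinstance(expr, str):
        return expr
    op, a, b = expr
    return f"({fortran(a)} {op} {fortran(b)})"

# d/dq of k*q*q (e.g. a potential-energy term); the result is left
# unsimplified, as a real code generator's simplifier is omitted here.
energy = ("*", "k", ("*", "q", "q"))
print("      dEdq = " + fortran(diff(energy, "q")))
```

A real system adds simplification and wraps the emitted expressions in subroutines, but the pipeline, symbolic derivation followed by code emission, is the same idea.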

  11. Decoding the neural representation of fine-grained conceptual categories.

    PubMed

    Ghio, Marta; Vaghi, Matilde Maria Serena; Perani, Daniela; Tettamanti, Marco

    2016-05-15

    Neuroscientific research on conceptual knowledge based on the grounded cognition framework has shed light on the organization of concrete concepts into semantic categories that rely on different types of experiential information. Abstract concepts have traditionally been investigated as an undifferentiated whole, and have only recently been addressed in a grounded cognition perspective. The present fMRI study investigated the involvement of brain systems coding for experiential information in the conceptual processing of fine-grained semantic categories along the abstract-concrete continuum. These categories consisted of mental state-, emotion-, mathematics-, mouth action-, hand action-, and leg action-related meanings. Thirty-five sentences for each category were used as stimuli in a 1-back task performed by 36 healthy participants. A univariate analysis failed to reveal category-specific activations. Multivariate pattern analyses, in turn, revealed that fMRI data contained sufficient information to disentangle all six fine-grained semantic categories across participants. However, the category-specific activity patterns showed no overlap with the regions coding for experiential information. These findings demonstrate the possibility of detecting specific patterns of neural representation associated with the processing of fine-grained conceptual categories, crucially including abstract ones, though bearing no anatomical correspondence with regions coding for experiential information as predicted by the grounded cognition hypothesis. Copyright © 2016 Elsevier Inc. All rights reserved.
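The contrast between the failed univariate analysis and the successful multivariate decoding can be reproduced on synthetic data: a weak signal distributed across many voxels is hard to see voxel by voxel yet decodable as a pattern. A toy nearest-centroid sketch (synthetic data, not the study's classifier or pipeline):

```python
import numpy as np

# Two "semantic categories", each a weak pattern spread over many voxels.
rng = np.random.default_rng(3)
n_trials, n_voxels = 40, 50
pattern_a = rng.normal(0, 1, n_voxels) * 0.4   # category-specific patterns,
pattern_b = rng.normal(0, 1, n_voxels) * 0.4   # weak relative to trial noise
X = np.vstack([pattern_a + rng.normal(0, 1, (n_trials, n_voxels)),
               pattern_b + rng.normal(0, 1, (n_trials, n_voxels))])
y = np.array([0] * n_trials + [1] * n_trials)

# Hold out some trials, then decode them by distance to training centroids.
train = np.r_[0:30, 40:70]
test = np.r_[30:40, 70:80]
centroids = np.stack([X[train][y[train] == c].mean(axis=0) for c in (0, 1)])
pred = np.argmin(((X[test][:, None, :] - centroids[None]) ** 2).sum(axis=2),
                 axis=1)
accuracy = (pred == y[test]).mean()
print(accuracy)  # well above the 0.5 chance level for this toy signal
```

Cross-participant decoding, as in the study, additionally requires that the pattern generalises across subjects; the sketch shows only the within-dataset principle.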

  12. Regulatory and legal review of automated and connected truck platooning technology.

    DOT National Transportation Integrated Search

    2017-05-01

    Commercial truck platooning is a relatively novel concept in Texas and around the country. This white paper : presents the results of a review of state and federal code to identify regulatory and legislative hurdles that : may delay or deter platooni...

  13. Open Science: A Zealot's View

    EPA Science Inventory

    Open science encompasses many concepts, but most agree that for science to be truly open four things must be true. First, all components of the scientific project must be freely available including manuscripts, code, and data. Second, others must be able to repeat your work and ...

  14. A Method for Aircraft Concept Selection Using Multicriteria Interactive Genetic Algorithms

    NASA Technical Reports Server (NTRS)

    Buonanno, Michael; Mavris, Dimitri

    2005-01-01

    The problem of aircraft concept selection has become increasingly difficult in recent years as a result of a change from performance as the primary evaluation criteria of aircraft concepts to the current situation in which environmental effects, economics, and aesthetics must also be evaluated and considered in the earliest stages of the decision-making process. This has prompted a shift from design using historical data regression techniques for metric prediction to the use of physics-based analysis tools that are capable of analyzing designs outside of the historical database. The use of optimization methods with these physics-based tools, however, has proven difficult because of the tendency of optimizers to exploit assumptions present in the models and drive the design towards a solution which, while promising to the computer, may be infeasible due to factors not considered by the computer codes. In addition to this difficulty, the number of discrete options available at this stage may be unmanageable due to the combinatorial nature of the concept selection problem, leading the analyst to arbitrarily choose a sub-optimum baseline vehicle. These concept decisions such as the type of control surface scheme to use, though extremely important, are frequently made without sufficient understanding of their impact on the important system metrics because of a lack of computational resources or analysis tools. This paper describes a hybrid subjective/quantitative optimization method and its application to the concept selection of a Small Supersonic Transport. The method uses Genetic Algorithms to operate on a population of designs and promote improvement by varying more than sixty parameters governing the vehicle geometry, mission, and requirements. 
In addition to using computer codes for evaluation of quantitative criteria such as gross weight, expert input is also considered to account for criteria such as aeroelasticity or manufacturability which may be impossible or too computationally expensive to consider explicitly in the analysis. Results indicate that concepts resulting from the use of this method represent designs which are promising to both the computer and the analyst, and that a mapping between concepts and requirements that would not otherwise be apparent is revealed.
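A genetic algorithm over a mixed discrete/continuous concept space, as used above, can be sketched compactly. In this toy (all names and the scoring function are hypothetical stand-ins for the physics-based codes and expert input), each individual couples a discrete control-surface scheme with two continuous parameters:

```python
import random

rng = random.Random(7)
SCHEMES = ["conventional", "canard", "tailless"]

def fitness(ind):
    scheme, area, sweep = ind
    # Hypothetical score: penalise distance from a sweet spot and give
    # each discrete scheme a fixed bonus (a stand-in for expert ratings).
    score = -abs(area - 120.0) - abs(sweep - 35.0)
    return score + {"conventional": 0.0, "canard": 5.0, "tailless": 2.0}[scheme]

def mutate(ind):
    scheme, area, sweep = ind
    if rng.random() < 0.2:                    # occasionally flip the discrete
        scheme = rng.choice(SCHEMES)          # concept choice
    return (scheme, area + rng.gauss(0, 5), sweep + rng.gauss(0, 2))

pop = [(rng.choice(SCHEMES), rng.uniform(80, 160), rng.uniform(10, 60))
       for _ in range(30)]
for _ in range(60):                           # evolve: keep the fitter half,
    pop.sort(key=fitness, reverse=True)       # refill with mutated copies
    pop = pop[:15] + [mutate(rng.choice(pop[:15])) for _ in range(15)]

best = max(pop, key=fitness)
print(best[0], round(best[1]), round(best[2]))  # converges near canard, 120, 35
```

The paper's method additionally injects interactive expert judgments into selection, which this sketch folds into the fixed scheme bonuses for brevity.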

  15. How do particle physicists learn the programming concepts they need?

    NASA Astrophysics Data System (ADS)

    Kluth, S.; Pia, M. G.; Schoerner-Sadenius, T.; Steinbach, P.

    2015-12-01

    The ability to read, use and develop code efficiently and successfully is a key ingredient in modern particle physics. We report the experience of a training program, identified as “Advanced Programming Concepts”, that introduces software concepts, methods and techniques to work effectively on a daily basis in a HEP experiment or other programming intensive fields. This paper illustrates the principles, motivations and methods that shape the “Advanced Computing Concepts” training program, the knowledge base that it conveys, an analysis of the feedback received so far, and the integration of these concepts in the software development process of the experiments as well as its applicability to a wider audience.

  16. GsTL: the geostatistical template library in C++

    NASA Astrophysics Data System (ADS)

    Remy, Nicolas; Shtuka, Arben; Levy, Bruno; Caers, Jef

    2002-10-01

    The development of geostatistics has been mostly accomplished by application-oriented engineers in the past 20 years. The focus on concrete applications gave birth to many algorithms and computer programs designed to address different issues, such as estimating or simulating a variable while possibly accounting for secondary information such as seismic data, or integrating geological and geometrical data. At the core of any geostatistical data integration methodology is a well-designed algorithm. Yet, despite their obvious differences, all these algorithms share many commonalities, and a geostatistics programming library should be built on them, lest the resulting library be poorly reusable and difficult to expand. Building on this observation, we design a comprehensive, yet flexible and easily reusable library of geostatistics algorithms in C++. The recent advent of the generic programming paradigm allows us to express the commonalities of the geostatistical algorithms elegantly in computer code. Generic programming, also referred to as "programming with concepts", provides a high level of abstraction without loss of efficiency. This last point is a major gain over object-oriented programming, which often trades efficiency for abstraction. It is not enough for a numerical library to be reusable; it also has to be fast. Because generic programming is "programming with concepts", the essential step in the library design is the careful identification and thorough definition of these concepts shared by most geostatistical algorithms. Building on these definitions, a generic and expandable code can be developed. To show the advantages of such a generic library, we use GsTL to build two sequential simulation programs working on two different types of grids—a surface with faults and an unstructured grid—without requiring any change to the GsTL code.
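The library itself is C++, but the "programming with concepts" idea can be illustrated in Python using a structural `Protocol`: the algorithm is written once against a concept (a named set of required operations), and any grid type satisfying it, structured or unstructured, can be plugged in unchanged. All names below are hypothetical, for illustration only.

```python
from typing import Iterable, Protocol

class Neighborhood(Protocol):
    """A 'concept' in the generic-programming sense: any grid type used
    by the algorithm must provide these two operations."""
    def neighbors(self, node: int) -> Iterable[int]: ...
    def value(self, node: int) -> float: ...

def local_mean(grid: Neighborhood, node: int) -> float:
    # Generic estimation step: works on any grid satisfying the concept,
    # without changing this code.
    vals = [grid.value(n) for n in grid.neighbors(node)]
    return sum(vals) / len(vals)

class LineGrid:
    """One concrete model of the concept: a 1-D grid."""
    def __init__(self, data):
        self.data = data
    def neighbors(self, node):
        return [i for i in (node - 1, node + 1) if 0 <= i < len(self.data)]
    def value(self, node):
        return self.data[node]

m = local_mean(LineGrid([1.0, 4.0, 7.0]), 1)  # mean of the two neighbors
```

A faulted surface or an unstructured grid would simply supply its own `neighbors`; `local_mean` stays untouched, which is the reuse argument the abstract makes for GsTL.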

  17. Modulation of the semantic system by word imageability.

    PubMed

    Sabsevitz, D S; Medler, D A; Seidenberg, M; Binder, J R

    2005-08-01

    A prevailing neurobiological theory of semantic memory proposes that part of our knowledge about concrete, highly imageable concepts is stored in the form of sensory-motor representations. While this theory predicts differential activation of the semantic system by concrete and abstract words, previous functional imaging studies employing this contrast have provided relatively little supporting evidence. We acquired event-related functional magnetic resonance imaging (fMRI) data while participants performed a semantic similarity judgment task on a large number of concrete and abstract noun triads. Task difficulty was manipulated by varying the degree to which the words in the triad were similar in meaning. Concrete nouns, relative to abstract nouns, produced greater activation in a bilateral network of multimodal and heteromodal association areas, including ventral and medial temporal, posterior-inferior parietal, dorsal prefrontal, and posterior cingulate cortex. In contrast, abstract nouns produced greater activation almost exclusively in the left hemisphere in superior temporal and inferior frontal cortex. Increasing task difficulty modulated activation mainly in attention, working memory, and response monitoring systems, with almost no effect on areas that were modulated by imageability. These data provide critical support for the hypothesis that concrete, imageable concepts activate perceptually based representations not available to abstract concepts. In contrast, processing abstract concepts makes greater demands on left perisylvian phonological and lexical retrieval systems. The findings are compatible with dual coding theory and less consistent with single-code models of conceptual representation. 
The lack of overlap between imageability and task difficulty effects suggests that once the neural representation of a concept is activated, further maintenance and manipulation of that information in working memory does not further increase neural activation in the conceptual store.

  18. Concept Elicitation Within Patient-Powered Research Networks: A Feasibility Study in Chronic Lymphocytic Leukemia.

    PubMed

    McCarrier, Kelly P; Bull, Scott; Fleming, Sarah; Simacek, Kristina; Wicks, Paul; Cella, David; Pierson, Renee

    2016-01-01

    To explore the feasibility of using social media-based patient networks to gather qualitative data on patient-reported outcome (PRO) concepts relevant to chronic lymphocytic leukemia (CLL). Between August and November 2013, US-residing members of the PatientsLikeMe online CLL patient community completed open-ended web-based surveys designed to elicit descriptions of CLL symptoms, impacts, and treatment-related perceptions. Qualitative telephone follow-up interviews were conducted with a subsample of respondents. Survey responses and interview transcripts were coded for qualitative analysis using Atlas.ti. Fifty survey responses were included in the analyses. Participants were aged 60.5 ± 6.9 years, 54% female, and 96% white. When surveyed, 20% were receiving current treatment, 16% were in remission, and 64% were treatment-naïve. Among respondents, 369 descriptions of CLL symptoms were coded. Fatigue-related symptoms were expressed most frequently, with 54% reporting "fatigue," "tiredness," or both in their responses. These concepts were followed by night sweats (38%), swollen lymph nodes (32%), and frequent infections (28%). Among impacts of CLL, worry and fear (66% of respondents), depressed feelings (52%), and work limitations (50%) were noted most frequently. Survey results identified constitutional symptoms of CLL included in existing PRO instruments and the literature. Although the findings suggest that qualitative data obtained through social media applications can be potentially useful in supporting concept identification for newly developed PRO instruments, they also indicate that online approaches alone may not be sufficient to achieve efficient and exhaustive concept elicitation. Further research is needed to identify whether the results can support content validity in the same way as established qualitative research methods. Copyright © 2016. Published by Elsevier Inc.

  19. Making Semantic Waves: A Key to Cumulative Knowledge-Building

    ERIC Educational Resources Information Center

    Maton, Karl

    2013-01-01

    The paper begins by arguing that knowledge-blindness in educational research represents a serious obstacle to understanding knowledge-building. It then offers sociological concepts from Legitimation Code Theory--"semantic gravity" and "semantic density"--that systematically conceptualize one set of organizing principles underlying knowledge…

  20. "Thanks, Shokran, Gracias": Translingual Practices in a Facebook Group

    ERIC Educational Resources Information Center

    Kulavuz-Onal, Derya; Vásquez, Camilla

    2018-01-01

    The affordances associated with networked multilingualism (Androutsopoulos, 2015) have led social media scholars to replace traditional notions of code-switching with broader concepts such as translingual practices. In an attempt to further our understanding of online multilingual linguistic practices in the context of educational…

  1. CATS Household Travel Survey, Volume One: Documentation for the Chicago Central Business District

    DOT National Transportation Integrated Search

    1989-09-01

    This report contains descriptions of the surveying concepts, the editing and coding logic, the data base structure, several summary tables and the data base for the Chicago Central Business District. Also, because the data at this time are unfa...

  2. Hypermedia Concepts and Research: An Overview.

    ERIC Educational Resources Information Center

    Burton, John K.; And Others

    1995-01-01

    Provides an overview of hypermedia, including a history of hypertext and multimedia, and discusses how they have been combined into the term hypermedia; a cognitive overview; dual coding and cue summation; and theories related to learners, including field dependence-independence, memory, and metacognition. Contains 156 references. (LRW)

  3. Translanguaging: Developing Its Conceptualisation and Contextualisation

    ERIC Educational Resources Information Center

    Lewis, Gwyn; Jones, Bryn; Baker, Colin

    2012-01-01

    Following from Lewis, Jones, and Baker (this issue), this article analyses the relationship between the new concept of "translanguaging" particularly in the classroom context and more historic terms such as code-switching and translation, indicating differences in (socio)linguistic and ideological understandings as well as in classroom…

  4. Nuclear Energy Advanced Modeling and Simulation (NEAMS) Waste Integrated Performance and Safety Codes (IPSC) : FY10 development and integration.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Criscenti, Louise Jacqueline; Sassani, David Carl; Arguello, Jose Guadalupe, Jr.

    2011-02-01

    This report describes the progress in fiscal year 2010 in developing the Waste Integrated Performance and Safety Codes (IPSC) in support of the U.S. Department of Energy (DOE) Office of Nuclear Energy Advanced Modeling and Simulation (NEAMS) Campaign. The goal of the Waste IPSC is to develop an integrated suite of computational modeling and simulation capabilities to quantitatively assess the long-term performance of waste forms in the engineered and geologic environments of a radioactive waste storage or disposal system. The Waste IPSC will provide this simulation capability (1) for a range of disposal concepts, waste form types, engineered repository designs, and geologic settings, (2) for a range of time scales and distances, (3) with appropriate consideration of the inherent uncertainties, and (4) in accordance with robust verification, validation, and software quality requirements. Waste IPSC activities in fiscal year 2010 focused on specifying a challenge problem to demonstrate proof of concept, developing a verification and validation plan, and performing an initial gap analysis to identify candidate codes and tools to support the development and integration of the Waste IPSC. The current Waste IPSC strategy is to acquire and integrate the necessary Waste IPSC capabilities wherever feasible, and to develop only those capabilities that cannot be acquired or suitably integrated, verified, or validated. This year-end progress report documents the FY10 status of acquisition, development, and integration of thermal-hydrologic-chemical-mechanical (THCM) code capabilities, frameworks, and enabling tools and infrastructure.

  5. Flowfield Comparisons from Three Navier-Stokes Solvers for an Axisymmetric Separate Flow Jet

    NASA Technical Reports Server (NTRS)

    Koch, L. Danielle; Bridges, James; Khavaran, Abbas

    2002-01-01

    To meet new noise reduction goals, many concepts to enhance mixing in the exhaust jets of turbofan engines are being studied. Accurate steady state flowfield predictions from state-of-the-art computational fluid dynamics (CFD) solvers are needed as input to the latest noise prediction codes. The main intent of this paper was to ascertain that similar Navier-Stokes solvers run at different sites would yield comparable results for an axisymmetric two-stream nozzle case. Predictions from the WIND and the NPARC codes are compared to previously reported experimental data and results from the CRAFT Navier-Stokes solver. Similar k-epsilon turbulence models were employed in each solver, and identical computational grids were used. Agreement between experimental data and predictions from each code was generally good for mean values. All three codes underpredict the maximum value of turbulent kinetic energy. The predicted locations of the maximum turbulent kinetic energy were farther downstream than seen in the data. A grid study was conducted using the WIND code, and comments about convergence criteria and grid requirements for CFD solutions to be used as input for noise prediction computations are given. Additionally, noise predictions from the MGBK code, using the CFD results from the CRAFT code, NPARC, and WIND as input are compared to data.

  6. Performance of a Bounce-Averaged Global Model of Super-Thermal Electron Transport in the Earth's Magnetic Field

    NASA Technical Reports Server (NTRS)

    McGuire, Tim

    1998-01-01

    In this paper, we report the results of our recent research on the application of a multiprocessor Cray T916 supercomputer in modeling super-thermal electron transport in the earth's magnetic field. In general, this mathematical model requires numerical solution of a system of partial differential equations. The code we use for this model is moderately vectorized. By using Amdahl's Law for vector processors, it can be verified that the code is about 60% vectorized on a Cray computer. Speedup factors on the order of 2.5 were obtained compared to the unvectorized code. In the following sections, we discuss the methodology of improving the code. In addition to our goal of optimizing the code for solution on the Cray computer, we had the goal of scalability in mind. Scalability combines the concepts of portability with near-linear speedup. Specifically, a scalable program is one whose performance is portable across many different architectures with differing numbers of processors for many different problem sizes. Though we have access to a Cray at this time, the goal was to also have code which would run well on a variety of architectures.
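The 60% figure and the observed speedup of about 2.5 are mutually consistent under Amdahl's Law: with a fraction f of the runtime vectorized and a vector speedup s, the overall speedup is 1/((1-f) + f/s), which for f = 0.6 is capped at 1/0.4 = 2.5 no matter how fast the vector unit. A quick check:

```python
def amdahl_speedup(vector_fraction, vector_speedup):
    # Overall speedup when only a fraction of the runtime is accelerated.
    return 1.0 / ((1.0 - vector_fraction)
                  + vector_fraction / vector_speedup)

# With 60% of the code vectorized, even an effectively infinite vector
# speedup leaves the serial 40% dominant: the limit is 1 / 0.4 = 2.5x.
limit = amdahl_speedup(0.6, 1e9)
```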

  7. On the biological plausibility of grandmother cells: implications for neural network theories in psychology and neuroscience.

    PubMed

    Bowers, Jeffrey S

    2009-01-01

    A fundamental claim associated with parallel distributed processing (PDP) theories of cognition is that knowledge is coded in a distributed manner in mind and brain. This approach rejects the claim that knowledge is coded in a localist fashion, with words, objects, and simple concepts (e.g. "dog") coded with their own dedicated representations. One of the putative advantages of this approach is that the theories are biologically plausible. Indeed, advocates of the PDP approach often highlight the close parallels between distributed representations learned in connectionist models and neural coding in the brain, and often dismiss localist (grandmother cell) theories as biologically implausible. The author reviews a range of data that strongly challenge this claim and shows that localist models provide a better account of single-cell recording studies. The author also contrasts local and alternative distributed coding schemes (sparse and coarse coding) and argues that the common rejection of grandmother cell theories in neuroscience is due to a misunderstanding about how localist models behave. The author concludes that the localist representations embedded in theories of perception and cognition are consistent with neuroscience; biology only calls into question the distributed representations often learned in PDP models.

  8. Numerical Simulation of the Emergency Condenser of the SWR-1000

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Krepper, Eckhard; Schaffrath, Andreas; Aszodi, Attila

    The SWR-1000 is a new innovative boiling water reactor (BWR) concept, which was developed by Siemens AG. This concept is characterized in particular by passive safety systems (e.g., four emergency condensers, four building condensers, eight passive pressure pulse transmitters, and six gravity-driven core-flooding lines). In the framework of the BWR Physics and Thermohydraulic Complementary Action to the European Union BWR Research and Development Cluster, emergency condenser tests were performed by Forschungszentrum Juelich at the NOKO test facility. Posttest calculations with ATHLET are presented, which aim at the determination of the removable power of the emergency condenser and its operation mode. The one-dimensional thermal-hydraulic code ATHLET was extended by the module KONWAR for the calculation of the heat transfer coefficient during condensation in horizontal tubes. In addition, results of conventional finite difference calculations using the code CFX-4 are presented, which investigate the natural convection during the heatup process at the secondary side of the NOKO test facility.

  9. Object-oriented design and programming in medical decision support.

    PubMed

    Heathfield, H; Armstrong, J; Kirkham, N

    1991-12-01

    The concept of object-oriented design and programming has recently received a great deal of attention from the software engineering community. This paper highlights the realisable benefits of using the object-oriented approach in the design and development of clinical decision support systems. These systems seek to build a computational model of some problem domain and therefore tend to be exploratory in nature. Conventional procedural design techniques do not support either the process of model building or rapid prototyping. The central concepts of the object-oriented paradigm are introduced, namely encapsulation, inheritance and polymorphism, and their use illustrated in a case study, taken from the domain of breast histopathology. In particular, the dual roles of inheritance in object-oriented programming are examined, i.e., inheritance as a conceptual modelling tool and inheritance as a code reuse mechanism. It is argued that the use of the former is not entirely intuitive and may be difficult to incorporate into the design process. However, inheritance as a means of optimising code reuse offers substantial technical benefits.
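The dual roles of inheritance noted above can be shown side by side in a short sketch (Python rather than the paper's setting, with hypothetical class names): `Carcinoma` uses inheritance as an "is-a" conceptual model of the domain, while the `Auditable` mixin uses it purely as a code reuse mechanism, with no modelling meaning at all.

```python
class Lesion:
    """Inheritance as conceptual modelling: an 'is-a' hierarchy loosely
    inspired by the breast-histopathology domain discussed above."""
    def __init__(self, size_mm):
        self.size_mm = size_mm
    def describe(self):
        # Polymorphic: subclasses refine this description.
        return f"lesion, {self.size_mm} mm"

class Carcinoma(Lesion):
    def describe(self):
        return "carcinoma, " + super().describe()

class Auditable:
    """Inheritance as code reuse: a mixin adding note-keeping to any
    class, with no 'is-a' relationship implied."""
    def log(self, note):
        self.notes = getattr(self, "notes", []) + [note]

class AuditedCarcinoma(Carcinoma, Auditable):
    pass

case = AuditedCarcinoma(12)
case.log("reviewed")
```

The first use requires domain judgment (is the hierarchy faithful?), while the second is a purely technical optimization, which is the distinction the abstract draws.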

  10. The Advantages and Limitations of International Classification of Diseases, Injuries and Causes of Death from Aspect of Existing Health Care System of Bosnia and Herzegovina

    PubMed Central

    Kurbasic, Izeta; Pandza, Haris; Masic, Izet; Huseinagic, Senad; Tandir, Salih; Alicajic, Fredi; Toromanovic, Selim

    2008-01-01

    Conflict of interest: none declared. Introduction: The International classification of diseases (ICD) is the most important classification in medicine. It is used by all medical professionals. Concept: The basic concept of ICD is founded on the standardization of the nomenclature for the names of diseases and their basic systematization in a hierarchically structured category. Advantages and disadvantages: Health care provider institutions such as hospitals should facilitate implementation of medical applications that follow the patient's medical condition and the facts connected with it. A definitive diagnosis that can be coded using ICD is usually reached only after several patient visits and rarely during the first visit. Conclusion: The ICD classification is one of the oldest and most important classifications in medicine. Its scope covers all fields of medicine. It is used for statistical purposes and as a coding system in medical databases. PMID:24109155

  11. Coding and decoding libraries of sequence-defined functional copolymers synthesized via photoligation.

    PubMed

    Zydziak, Nicolas; Konrad, Waldemar; Feist, Florian; Afonin, Sergii; Weidner, Steffen; Barner-Kowollik, Christopher

    2016-11-30

    Designing artificial macromolecules with absolute sequence order represents a considerable challenge. Here we report an advanced light-induced avenue to monodisperse sequence-defined functional linear macromolecules up to decamers via a unique photochemical approach. The versatility of the synthetic strategy, combining sequential and modular concepts, enables the synthesis of perfect macromolecules varying in chemical constitution and topology. Specific functions are placed at arbitrary positions along the chain via the successive addition of monomer units and blocks, leading to a library of functional homopolymers, alternating copolymers and block copolymers. The in-depth characterization of each sequence-defined chain confirms the precision nature of the macromolecules. Decoding of the functional information contained in the molecular structure is achieved via tandem mass spectrometry without recourse to their synthetic history, showing that the sequence information can be read. We submit that the presented photochemical strategy is a viable and advanced concept for coding individual monomer units along a macromolecular chain.

  12. Concepts of happiness across time and cultures.

    PubMed

    Oishi, Shigehiro; Graham, Jesse; Kesebir, Selin; Galinha, Iolanda Costa

    2013-05-01

    We explored cultural and historical variations in concepts of happiness. First, we analyzed the definitions of happiness in dictionaries from 30 nations to understand cultural similarities and differences in happiness concepts. Second, we analyzed the definition of happiness in Webster's dictionaries from 1850 to the present day to understand historical changes in American English. Third, we coded the State of the Union addresses given by U.S. presidents from 1790 to 2010. Finally, we investigated the appearance of the phrases happy nation versus happy person in Google's Ngram Viewer from 1800 to 2008. Across cultures and time, happiness was most frequently defined as good luck and favorable external conditions. However, in American English, this definition was replaced by definitions focused on favorable internal feeling states. Our findings highlight the value of a historical perspective in the study of psychological concepts.

  13. An electrophysiological study of task demands on concreteness effects: evidence for dual coding theory.

    PubMed

    Welcome, Suzanne E; Paivio, Allan; McRae, Ken; Joanisse, Marc F

    2011-07-01

    We examined ERP responses during the generation of word associates or mental images in response to concrete and abstract concepts. Of interest were the predictions of dual coding theory (DCT), which proposes that processing lexical concepts depends on functionally independent but interconnected verbal and nonverbal systems. ERP responses were time-locked to either stimulus onset or response to compensate for potential latency differences across conditions. During word associate generation, but not mental imagery, concrete items elicited a greater N400 than abstract items. A concreteness effect emerged at a later time point during the mental imagery task. Data were also analyzed using time-frequency analysis that investigated synchronization of neuronal populations over time during processing. Concrete words elicited an enhanced late going desynchronization of theta-band power (723-938 ms post stimulus onset) during associate generation. During mental imagery, abstract items elicited greater delta-band power from 800 to 1,000 ms following stimulus onset, theta-band power from 350 to 205 ms before response, and alpha-band power from 900 to 800 ms before response. Overall, the findings support DCT in suggesting that lexical concepts are not amodal and that concreteness effects are modulated by tasks that focus participants on verbal versus nonverbal, imagery-based knowledge.

  14. Initial development of the Systems Approach to Home Medication Management (SAHMM) model.

    PubMed

    Doucette, William R; Vinel, Shanrae'l; Pennathur, Priyadarshini

    Adverse drug events and medication nonadherence are two problems associated with prescription medication use for chronic conditions. These issues often develop because patients have difficulty managing their medications at home. To guide patients and providers for achieving safe and effective medication use at home, the Systems Approach to Home Medication Management (SAHMM) model was derived from a systems engineering model for health care workplace safety. To explore how well concepts from the SAHMM model can represent home medication management by using patient descriptions of how they take prescription medications at home. Twelve patients were interviewed about home medication management using an interview guide based on the factors of the SAHMM model. Each interview was audio-taped and then transcribed verbatim. Interviews were coded to identify themes for home medication management using MAXQDA for Windows. SAHMM concepts extracted from the coded interview transcripts included work system components of person, tasks, tools & technology, internal environment, external environment, and household. Concepts also addressed work processes and work outcomes for home medication management. Using the SAHMM model for studying patients' home medication management is a promising approach to improving our understanding of the factors that influence patient adherence to medication and the development of adverse drug events. Copyright © 2016 Elsevier Inc. All rights reserved.

  15. Basic Concepts in Molecular Biology Related to Genetics and Epigenetics.

    PubMed

    Corella, Dolores; Ordovas, Jose M

    2017-09-01

    The observation that "one size does not fit all" for the prevention and treatment of cardiovascular disease, among other diseases, has driven the concept of precision medicine. The goal of precision medicine is to provide the best-targeted interventions tailored to an individual's genome. The human genome is composed of billions of sequence arrangements containing a code that controls how genes are expressed. This code depends on other nonstatic regulators that surround the DNA and constitute the epigenome. Moreover, environmental factors also play an important role in this complex regulation. This review provides a general perspective on the basic concepts of molecular biology related to genetics and epigenetics and a glossary of key terms. Several examples are given of polymorphisms and genetic risk scores related to cardiovascular risk. Likewise, an overview is presented of the main epigenetic regulators, including DNA methylation, methylcytosine-phosphate-guanine-binding proteins, histone modifications, other histone regulations, micro-RNA effects, and additional emerging regulators. One of the greatest challenges is to understand how environmental factors (diet, physical activity, smoking, etc.) could alter the epigenome, resulting in healthy or unhealthy cardiovascular phenotypes. We discuss some gene-environment interactions and provide a methodological overview. Copyright © 2017 Sociedad Española de Cardiología. Published by Elsevier España, S.L.U. All rights reserved.

  16. The Maximal C³ Self-Complementary Trinucleotide Circular Code X in Genes of Bacteria, Archaea, Eukaryotes, Plasmids and Viruses.

    PubMed

    Michel, Christian J

    2017-04-18

    In 1996, a set X of 20 trinucleotides was identified in genes of both prokaryotes and eukaryotes which has on average the highest occurrence in reading frame compared to its two shifted frames. Furthermore, this set X has an interesting mathematical property, as X is a maximal C³ self-complementary trinucleotide circular code. In 2015, by quantifying the inspection approach used in 1996, the circular code X was confirmed in the genes of bacteria and eukaryotes and was also identified in the genes of plasmids and viruses. The method was based on the preferential occurrence of trinucleotides among the three frames at the gene population level. We extend here this definition at the gene level. This new statistical approach considers all the genes, i.e., of large and small lengths, with the same weight for searching the circular code X. As a consequence, the concept of circular code, in particular the reading frame retrieval, is directly associated with each gene. At the gene level, the circular code X is strengthened in the genes of bacteria, eukaryotes, plasmids, and viruses, and is now also identified in the genes of archaea. The genes of mitochondria and chloroplasts contain a subset of the circular code X. Finally, by studying viral genes, the circular code X was found in DNA genomes, RNA genomes, double-stranded genomes, and single-stranded genomes.
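The defining properties can be checked mechanically. The sketch below uses the 20 trinucleotides usually reported for X in the literature (listed here as an assumption, since the abstract does not enumerate them) and tests self-complementarity directly; for circularity it uses a later graph-theoretic characterization (the code is circular iff a certain prefix/suffix digraph is acyclic), not the statistical method of this paper.

```python
COMPLEMENT = {"A": "T", "T": "A", "C": "G", "G": "C"}

# The 20 trinucleotides commonly reported for the circular code X
# (assumed from the literature).
X = {"AAC", "AAT", "ACC", "ATC", "ATT", "CAG", "CTC", "CTG", "GAA", "GAC",
     "GAG", "GAT", "GCC", "GGC", "GGT", "GTA", "GTC", "GTT", "TAC", "TTC"}

def revcomp(t):
    return "".join(COMPLEMENT[b] for b in reversed(t))

def is_self_complementary(code):
    # X must equal the set of reverse complements of its trinucleotides.
    return all(revcomp(t) in code for t in code)

def is_circular(code):
    # Graph test: for each trinucleotide b1b2b3 draw arcs b1 -> b2b3 and
    # b1b2 -> b3; the code is circular iff this digraph has no cycle.
    edges = {}
    for t in code:
        edges.setdefault(t[0], set()).add(t[1:])
        edges.setdefault(t[:2], set()).add(t[2])
    done = set()
    def on_cycle(node, stack):
        if node in stack:
            return True
        if node in done:
            return False
        stack.add(node)
        hit = any(on_cycle(nxt, stack) for nxt in edges.get(node, ()))
        stack.discard(node)
        done.add(node)
        return hit
    return not any(on_cycle(n, set()) for n in list(edges))
```

A negative example: {ACG, CGA} is not circular, since a circular arrangement of those letters can be read in two frames; the digraph test detects this via the cycle A → CG → A.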

  17. LACEwING: A New Moving Group Analysis Code

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Riedel, Adric R.; Blunt, Sarah C.; Faherty, Jacqueline K.

    We present a new nearby young moving group (NYMG) kinematic membership analysis code, LocAting Constituent mEmbers In Nearby Groups (LACEwING), a new Catalog of Suspected Nearby Young Stars, a new list of bona fide members of moving groups, and a kinematic traceback code. LACEwING is a convergence-style algorithm with carefully vetted membership statistics based on a large numerical simulation of the Solar Neighborhood. Given spatial and kinematic information on stars, LACEwING calculates membership probabilities in 13 NYMGs and three open clusters within 100 pc. In addition to describing the inputs, methods, and products of the code, we provide comparisons of LACEwING to other popular kinematic moving group membership identification codes. As a proof of concept, we use LACEwING to reconsider the membership of 930 stellar systems in the Solar Neighborhood (within 100 pc) that have reported measurable lithium equivalent widths. We quantify the evidence in support of a population of young stars not attached to any NYMGs, which is a possible sign of new as-yet-undiscovered groups or of a field population of young stars.

  18. Automated Translation of Safety Critical Application Software Specifications into PLC Ladder Logic

    NASA Technical Reports Server (NTRS)

    Leucht, Kurt W.; Semmel, Glenn S.

    2008-01-01

    The numerous benefits of automatic application code generation are widely accepted within the software engineering community. A few of these benefits include raising the abstraction level of application programming, shorter product development time, lower maintenance costs, and increased code quality and consistency. Surprisingly, code generation concepts have not yet found wide acceptance and use in the field of programmable logic controller (PLC) software development. Software engineers at the NASA Kennedy Space Center (KSC) recognized the need for PLC code generation while developing their new ground checkout and launch processing system. They developed a process and a prototype software tool that automatically translates a high-level representation or specification of safety critical application software into ladder logic that executes on a PLC. This process and tool are expected to increase the reliability of the PLC code over that which is written manually, and may even lower life-cycle costs and shorten the development schedule of the new control system at KSC. This paper examines the problem domain and discusses the process and software tool that were prototyped by the KSC software engineers.
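As a toy illustration of the translation idea only (not the KSC tool, its specification format, or its ladder syntax, all of which are hypothetical here), a single high-level rule can be mapped mechanically to a text-form ladder rung: series contacts for AND, a coil for the output.

```python
import re

def spec_to_ladder(spec):
    # Toy translation of one rule of the form "IF a AND b ... THEN y"
    # into a text-form ladder rung: contacts in series, then a coil.
    m = re.fullmatch(r"IF (\w+(?: AND \w+)*) THEN (\w+)", spec)
    if m is None:
        raise ValueError(f"unsupported specification: {spec!r}")
    inputs, output = m.group(1).split(" AND "), m.group(2)
    contacts = "".join(f"--| {name} |" for name in inputs)
    return f"|{contacts}--( {output} )--|"

rung = spec_to_ladder("IF START AND NOT_ESTOP THEN MOTOR")
```

A real generator must of course handle OR branches, negated contacts, timers, and the PLC vendor's file format, which is where the engineering effort described in the paper lies.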

  19. Decree No. 2737 issuing the Code of Minors, 27 November 1989.

    PubMed

    1989-01-01

    This document contains major provisions of the 1989 Code of Minors of Colombia. This Code spells out the rights of minors to protection, care, and adequate physical, mental, and social development. These rights go into force from the moment of conception. Minors have a specified right to life; to a defined filiation; to grow up within a family; to receive an education (compulsory to the ninth grade and free of charge); to be protected from abuse; to health care; to freedom of speech and to know their rights; to liberty of thought, conscience, and religion; to rest, recreation, and play; to participate in sports and the arts; and to be protected from labor exploitation. Handicapped minors have the right to care, education, and special training. Minors also have the right to be protected from the use of dependency-creating drugs. Any minor in an "irregular situation" will receive protective services. The Code defines abandoned minors and those in danger and provides specific protective measures which can be taken. Rules and procedures covering adoption are included in the Code, because adoption is viewed as primarily a protective measure.

  20. A User's Guide to the Zwikker-Kosten Transmission Line Code (ZKTL)

    NASA Technical Reports Server (NTRS)

    Kelly, J. J.; Abu-Khajeel, H.

    1997-01-01

    This user's guide documents updates to the Zwikker-Kosten Transmission Line Code (ZKTL). This code was developed for analyzing new liner concepts developed to provide increased sound absorption. Contiguous arrays of multi-degree-of-freedom (MDOF) liner elements serve as the model for these liner configurations, and Zwikker and Kosten's theory of sound propagation in channels is used to predict the surface impedance. Transmission matrices for the various liner elements incorporate both analytical and semi-empirical methods. This allows standard matrix techniques to be employed in the code to systematically calculate the composite impedance due to the individual liner elements. The ZKTL code consists of four independent subroutines: 1. Single channel impedance calculation - linear version (SCIC) 2. Single channel impedance calculation - nonlinear version (SCICNL) 3. Multi-channel, multi-segment, multi-layer impedance calculation - linear version (MCMSML) 4. Multi-channel, multi-segment, multi-layer impedance calculation - nonlinear version (MCMSMLNL) Detailed examples, comments, and explanations for each liner impedance computation module are included. Also contained in the guide are depictions of the interactive execution, input files and output files.
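    The matrix bookkeeping behind such a code can be sketched with standard 2x2 transfer-matrix cascading. The helpers below (`cascade`, `surface_impedance`) and the quarter-wave example are illustrative assumptions, not ZKTL's actual routines:

```python
import cmath

def cascade(matrices):
    """Multiply 2x2 transfer matrices (a, b, c, d), source-side first."""
    a, b, c, d = 1, 0, 0, 1  # identity
    for a2, b2, c2, d2 in matrices:
        a, b, c, d = (a * a2 + b * c2, a * b2 + b * d2,
                      c * a2 + d * c2, c * b2 + d * d2)
    return a, b, c, d

def surface_impedance(matrices, z_term):
    """Input impedance of a terminated cascade: Z = (A*Zt + B)/(C*Zt + D)."""
    a, b, c, d = cascade(matrices)
    return (a * z_term + b) / (c * z_term + d)

# A quarter-wave channel (k*L = pi/2) backed by a nearly rigid wall:
# the composite surface impedance collapses toward zero, the classic
# quarter-wave resonance exploited by liner elements.
kL, zc = cmath.pi / 2, 1.0
channel = (cmath.cos(kL), 1j * zc * cmath.sin(kL),
           1j * cmath.sin(kL) / zc, cmath.cos(kL))
print(abs(surface_impedance([channel], 1e9)))  # ~0
```

Because each liner element contributes one matrix, composing a multi-channel, multi-segment, multi-layer liner reduces to multiplying the matrices in order, which is the "standard matrix techniques" the abstract refers to.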

  1. A novel approach of an absolute coding pattern based on Hamiltonian graph

    NASA Astrophysics Data System (ADS)

    Wang, Ya'nan; Wang, Huawei; Hao, Fusheng; Liu, Liqiang

    2017-02-01

    In this paper, a novel coding pattern for an optical absolute rotary encoder is presented. The concept applies the absolute-encoder principle of finding a unique sequence that yields an unambiguous shaft position at any angle. We design a single-ring and an n-by-2 matrix absolute encoder coding pattern using variations of the Hamiltonian graph principle. Twelve encoding bits are read from the single ring by a linear-array CCD to achieve a 1080-position cyclic encoding. In addition, a 2-by-2 matrix is used as the unit cell of a 2-track disk to achieve a 16-bit encoding pattern read by an area-array CCD sensor (as an example). Finally, higher resolution can be gained by electronic subdivision of the signals. Compared with a conventional Gray or binary code pattern (2^n resolution), the new pattern offers higher resolution (2^n * n) with fewer coding tracks, so it can lead to a smaller encoder, which is essential in industrial production.
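    The property such a pattern must satisfy, and which the Hamiltonian-graph construction is used to guarantee, is that every fixed-length window read around the cyclic track is unique. Below is a minimal check of that property on a small de Bruijn-style 8-position track (an illustration, not the paper's 12-bit design):

```python
def has_unique_windows(track, n):
    """Absolute-encoder property: every length-n window read from the
    cyclic code track identifies a unique shaft position."""
    m = len(track)
    windows = {tuple(track[(i + j) % m] for j in range(n))
               for i in range(m)}
    return len(windows) == m  # m positions need m distinct windows

# An 8-position single track whose 3-bit windows are all distinct,
# so reading any 3 consecutive bits pins down the absolute position.
track = [0, 0, 0, 1, 0, 1, 1, 1]
print(has_unique_windows(track, 3))  # True
```

A conventional Gray-code disk needs one track per bit; a single-track pattern like this trades track count for window length, which is the size advantage the abstract claims.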

  2. The representation of abstract words: what matters? Reply to Paivio's (2013) comment on Kousta et al. (2011).

    PubMed

    Vigliocco, Gabriella; Kousta, Stavroula; Vinson, David; Andrews, Mark; Del Campo, Elena

    2013-02-01

    In Kousta, Vigliocco, Vinson, Andrews, and Del Campo (2011), we presented an embodied theory of semantic representation, which crucially included abstract concepts as internally embodied via affective states. Paivio (2013) took issue with our treatment of dual coding theory, our reliance on data from lexical decision, and our theoretical proposal. Here, we address these different issues and clarify how our findings offer a way to move forward in the investigation of how abstract concepts are represented. 2013 APA, all rights reserved

  3. "Zeroing" in on mathematics in the monkey brain.

    PubMed

    Beran, Michael J

    2016-03-01

    A new study documented that monkeys showed selective neuronal responding to the concept of zero during a numerical task, and that there were two distinct classes of neurons that coded the absence of stimuli either through a discrete activation pattern (zero or not zero) or a continuous one for which zero was integrated with other numerosities in the relative rate of activity. These data indicate that monkeys, like humans, have a concept of zero that is part of their analog number line but that also may have unique properties compared to other numerosities.

  4. Analytical and experimental validation of the Oblique Detonation Wave Engine concept

    NASA Technical Reports Server (NTRS)

    Adelman, Henry G.; Cambier, Jean-Luc; Menees, Gene P.; Balboni, John A.

    1988-01-01

    The Oblique Detonation Wave Engine (ODWE) for hypersonic flight has been analytically studied by NASA using the CFD codes which fully couple finite rate chemistry with fluid dynamics. Fuel injector designs investigated included wall and strut injectors, and the in-stream strut injectors were chosen to provide good mixing with minimal stagnation pressure losses. Plans for experimentally validating the ODWE concept in an arc-jet hypersonic wind tunnel are discussed. Measurements of the flow field properties behind the oblique wave will be compared to analytical predictions.

  5. Method for Estimating the Sonic-Boom Characteristics of Lifting Canard-Wing Aircraft Concepts

    NASA Technical Reports Server (NTRS)

    Mack, Robert J.

    2005-01-01

    A method for estimating the sonic-boom overpressures from a conceptual aircraft where the lift is carried by both a canard and a wing during supersonic cruise is presented and discussed. Computer codes used for the prediction of the aerodynamic performance of the wing, the canard-wing interference, the nacelle-wing interference, and the sonic-boom overpressures are identified and discussed as the procedures in the method are discussed. A canard-wing supersonic-cruise concept was used as an example to demonstrate the application of the method.

  6. High Speed Research Noise Prediction Code (HSRNOISE) User's and Theoretical Manual

    NASA Technical Reports Server (NTRS)

    Golub, Robert (Technical Monitor); Rawls, John W., Jr.; Yeager, Jessie C.

    2004-01-01

    This report describes a computer program, HSRNOISE, that predicts noise levels for a supersonic aircraft powered by mixed flow turbofan engines with rectangular mixer-ejector nozzles. It fully documents the noise prediction algorithms, provides instructions for executing the HSRNOISE code, and provides predicted noise levels for the High Speed Research (HSR) program Technology Concept (TC) aircraft. The component source noise prediction algorithms were developed jointly by Boeing, General Electric Aircraft Engines (GEAE), NASA and Pratt & Whitney during the course of the NASA HSR program. Modern Technologies Corporation developed an alternative mixer ejector jet noise prediction method under contract to GEAE that has also been incorporated into the HSRNOISE prediction code. Algorithms for determining propagation effects and calculating noise metrics were taken from the NASA Aircraft Noise Prediction Program.

  7. An Infrastructure for UML-Based Code Generation Tools

    NASA Astrophysics Data System (ADS)

    Wehrmeister, Marco A.; Freitas, Edison P.; Pereira, Carlos E.

    The use of Model-Driven Engineering (MDE) techniques in the domain of distributed embedded real-time systems is gaining importance as a means of coping with the increasing design complexity of such systems. This paper discusses an infrastructure created to build GenERTiCA, a flexible tool that supports an MDE approach and uses aspect-oriented concepts to handle non-functional requirements from the embedded and real-time systems domain. GenERTiCA generates source code from UML models and also performs weaving of aspects that have been specified within the UML model. Additionally, this paper discusses the Distributed Embedded Real-Time Compact Specification (DERCS), a PIM created to support UML-based code generation tools. Some heuristics to transform UML models into DERCS, which have been implemented in GenERTiCA, are also discussed.

  8. HSR combustion analytical research

    NASA Technical Reports Server (NTRS)

    Nguyen, H. Lee

    1992-01-01

    Increasing the pressure and temperature of the engines of a new generation of supersonic airliners increases the emissions of nitrogen oxides (NO(x)) to a level that would have an adverse impact on the Earth's protective ozone layer. In the process of evolving and implementing low emissions combustor technologies, NASA LeRC has pursued a combustion analysis code program to guide combustor design processes, to identify potential concepts of the greatest promise, and to optimize them at low cost, with short turnaround time. The computational analyses are evaluated at actual engine operating conditions. The approach is to upgrade and apply advanced computer programs for gas turbine applications. Efforts were made in further improving the code capabilities for modeling the physics and the numerical methods of solution. Then test cases and measurements from experiments are used for code validation.

  9. ogs6 - a new concept for porous-fractured media simulations

    NASA Astrophysics Data System (ADS)

    Naumov, Dmitri; Bilke, Lars; Fischer, Thomas; Rink, Karsten; Wang, Wenqing; Watanabe, Norihiro; Kolditz, Olaf

    2015-04-01

    OpenGeoSys (OGS) is a scientific open-source initiative for the numerical simulation of thermo-hydro-mechanical/chemical (THMC) processes in porous and fractured media, continuously developed since the mid-eighties. The basic concept is to provide a flexible numerical framework for solving coupled multi-field problems. OGS targets mainly applications in environmental geoscience, e.g. in the fields of contaminant hydrology, water resources management, waste deposits, or geothermal energy systems, but it has also recently been applied successfully to new topics in energy storage. OGS actively participates in several international benchmarking initiatives, e.g. DECOVALEX (waste management), CO2BENCH (CO2 storage and sequestration), SeSBENCH (reactive transport processes) and HM-Intercomp (coupled hydrosystems). Despite the broad applicability of OGS in geo-, hydro- and energy sciences, several shortcomings became obvious: computational efficiency was limited, and the code structure had become too complicated for further efficient development. OGS-5 was designed for object-oriented FEM applications; however, many multi-field problems require a certain flexibility of tailored numerical schemes. Therefore, a new concept was designed to overcome the existing bottlenecks. The paradigms for ogs6 are: flexibility of numerical schemes (FEM/FVM/FDM), computational efficiency (PetaScale-ready), and developer- and user-friendliness. ogs6 has a module-oriented architecture based on thematic libraries (e.g. MeshLib, NumLib) on the large scale and uses an object-oriented approach for the small-scale interfaces. Use of a linear algebra library (Eigen3) for the mathematical operations, together with the ISO C++11 standard, increases the expressiveness of the code and makes it more developer-friendly. The new C++ standard also makes the template meta-programming code used for compile-time optimizations more compact.
We have transitioned the main code development to the GitHub code hosting system (https://github.com/ufz/ogs). The very flexible revision control system Git, in combination with issue tracking, developer feedback and code review options, improves the code quality and the development process in general. The continuous testing procedure of the benchmarks, as established for OGS-5, is maintained. Additionally, unit testing, which is automatically triggered by any code change, is executed by two continuous integration frameworks (Jenkins CI, Travis CI) that build and test the code on different operating systems (Windows, Linux, Mac OS), in multiple configurations and with different compilers (GCC, Clang, Visual Studio). To further improve the testing possibilities, XML-based file input formats are introduced, which help with automatic validation of user-contributed benchmarks. The first ogs6 prototype, version 6.0.1, has been implemented for solving generic elliptic problems. The next steps envisage extension to transient, non-linear and coupled problems. Literature: [1] Kolditz O, Shao H, Wang W, Bauer S (eds) (2014): Thermo-Hydro-Mechanical-Chemical Processes in Fractured Porous Media: Modelling and Benchmarking - Closed Form Solutions. In: Terrestrial Environmental Sciences, Vol. 1, Springer, Heidelberg, ISBN 978-3-319-11893-2, 315pp. http://www.springer.com/earth+sciences+and+geography/geology/book/978-3-319-11893-2 [2] Naumov D (2015): Computational Fluid Dynamics in Unconsolidated Sediments: Model Generation and Discrete Flow Simulations, PhD thesis, Technische Universität Dresden.

  10. Identification of Requirements for Computer-Supported Matching of Food Consumption Data with Food Composition Data

    PubMed Central

    Korošec, Peter; Eftimov, Tome; Ocke, Marga; van der Laan, Jan; Roe, Mark; Berry, Rachel; Turrini, Aida; Krems, Carolin; Slimani, Nadia; Finglas, Paul

    2018-01-01

    This paper identifies the requirements for computer-supported food matching, in order to address current national, European and international needs, and represents an integrated research contribution of the FP7 EuroDISH project. The available classification and coding systems and the specific problems of food matching are summarized, and a new concept for food matching based on optimization methods and machine learning is proposed. To illustrate and test this concept, a study was conducted in four European countries (i.e., Germany, The Netherlands, Italy and the UK) using different classification and coding systems. This real case study enabled us to evaluate the new food matching concept and provide further recommendations for future work. In the first stage of the study, we prepared subsets of food consumption data, described and classified using different systems, that had already been manually matched with national food composition data. Once the food matching algorithm was trained using these data, testing was performed on another subset of food consumption data. Experts from different countries validated food matching between consumption and composition data by selecting best matches from the options given by the matching algorithm, without seeing the result of the previously made manual match. The evaluation of the study results stressed the importance of the role and quality of the food composition database as compared to the selected classification and/or coding systems, and the need to continue compiling national food composition data, as eating habits and national dishes still vary between countries. Although some countries have managed to collect extensive sets of food consumption data, these cannot easily be matched with food composition data if either the food consumption or the food composition data are not properly classified and described using a classification and coding system.
The study also showed that the level of human expertise played an important role, at least in the training stage. Both sets of data require continuous development to improve their quality in dietary assessment. PMID:29601516
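    As a toy stand-in for the matching step, candidate food composition entries can be ranked by textual similarity to a consumption record. The real project used optimization and machine-learning methods on richer features, so `best_matches` and the sample food names below are purely illustrative:

```python
import difflib

def best_matches(consumed, composition_entries, n=3):
    """Rank food-composition entries by textual similarity to a
    consumption record (difflib's SequenceMatcher ratio)."""
    return difflib.get_close_matches(consumed, composition_entries,
                                     n=n, cutoff=0.0)

# Tiny mock food-composition table.
foods = ["bread, wheat, white", "bread, rye", "milk, whole", "rice, boiled"]
print(best_matches("white wheat bread", foods))
```

Even this crude ranker shows why the study's expert-validation step matters: a proposed match list is only useful if a human (or a trained model) can reliably pick the correct entry from it.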

  11. An Object-Oriented Computer Code for Aircraft Engine Weight Estimation

    NASA Technical Reports Server (NTRS)

    Tong, Michael T.; Naylor, Bret A.

    2009-01-01

    Reliable engine-weight estimation at the conceptual design stage is critical to the development of new aircraft engines. It helps to identify the best engine concept amongst several candidates. At NASA Glenn Research Center (GRC), the Weight Analysis of Turbine Engines (WATE) computer code, originally developed by Boeing Aircraft, has been used to estimate the engine weight of various conceptual engine designs. The code, written in FORTRAN, was originally developed for NASA in 1979. Since then, substantial improvements have been made to the code to improve the weight calculations for most of the engine components. Most recently, to improve the maintainability and extensibility of WATE, the FORTRAN code has been converted into an object-oriented version. The conversion was done within the NASA's NPSS (Numerical Propulsion System Simulation) framework. This enables WATE to interact seamlessly with the thermodynamic cycle model which provides component flow data such as airflows, temperatures, and pressures, etc., that are required for sizing the components and weight calculations. The tighter integration between the NPSS and WATE would greatly enhance system-level analysis and optimization capabilities. It also would facilitate the enhancement of the WATE code for next-generation aircraft and space propulsion systems. In this paper, the architecture of the object-oriented WATE code (or WATE++) is described. Both the FORTRAN and object-oriented versions of the code are employed to compute the dimensions and weight of a 300-passenger aircraft engine (GE90 class). Both versions of the code produce essentially identical results as should be the case.

  12. Intellectual Freedom

    ERIC Educational Resources Information Center

    Knox, Emily

    2011-01-01

    Support for intellectual freedom, a concept codified in the American Library Association's Library Bill of Rights and Code of Ethics, is one of the core tenets of modern librarianship. According to the most recent interpretation of the Library Bill of Rights, academic librarians are encouraged to incorporate the principles of intellectual freedom…

  13. Organization of Programming Knowledge of Novices and Experts.

    ERIC Educational Resources Information Center

    Wiedenbeck, Susan

    1986-01-01

    Reports on an experiment where novice and expert programmers made decisions about Fortran code segments. The results show that, although expert programmers are better able to extract and use functional information, they don't differ significantly from novices in their ability to use syntactic concepts. (Author/EM)

  14. A Need for a Theory of Visual Literacy.

    ERIC Educational Resources Information Center

    Hortin, John A.

    1982-01-01

    Examines sources available for developing a theory of visual literacy and attempts to clarify the meaning of the term. Suggests that visual thinking, a concept supported by recent research on mental imagery, visualization, and dual coding, ought to be the emphasis for future theory development. (FL)

  15. Coupled dynamics analysis of wind energy systems

    NASA Technical Reports Server (NTRS)

    Hoffman, J. A.

    1977-01-01

    A qualitative description of all key elements of a complete wind energy system computer analysis code is presented. The analysis system addresses the coupled dynamics characteristics of wind energy systems, including the interactions of the rotor, tower, nacelle, power train, control system, and electrical network. The coupled dynamics are analyzed in both the frequency and time domain to provide the basic motions and loads data required for design, performance verification and operations analysis activities. Elements of the coupled analysis code were used to design and analyze candidate rotor articulation concepts. Fundamental results and conclusions derived from these studies are presented.

  16. Investigation of Advanced Counterrotation Blade Configuration Concepts for High Speed Turboprop Systems. Task 8: Cooling Flow/heat Transfer Analysis

    NASA Technical Reports Server (NTRS)

    Hall, Edward J.; Topp, David A.; Heidegger, Nathan J.; Delaney, Robert A.

    1994-01-01

    The focus of this task was to validate the ADPAC code for heat transfer calculations. To accomplish this goal, the ADPAC code was modified to allow for a Cartesian coordinate system capability and to add boundary conditions that handle spanwise periodicity and transpiration boundaries. The primary validation case was the film-cooled C3X vane. The cooling-hole modeling included both a porous region and a grid in each discrete hole. Predictions for these models, as well as for the smooth wall, compared well with the experimental data.

  17. Konzeption, Entwicklung und Evaluierung eines Messsystems zur sortenreinen Klassifikation von fluoreszenzcodierten Kunststoffen im Rahmen des Kunststoff-Recyclings(Conception, development and evaluation of a measuring system for the classification of fluorescence coded plastics within the framework of plastic recycling)

    DTIC Science & Technology

    2016-06-13

    Translated fragment from the German abstract: … the separation of complex plastic mixtures in the form of typical plastic regrind ("flakes"), and in particular of dark or black plastics …

  18. Integration of Rotor Aerodynamic Optimization with the Conceptual Design of a Large Civil Tiltrotor

    NASA Technical Reports Server (NTRS)

    Acree, C. W., Jr.

    2010-01-01

    Coupling of aeromechanics analysis with vehicle sizing is demonstrated with the CAMRAD II aeromechanics code and NDARC sizing code. The example is optimization of cruise tip speed with rotor/wing interference for the Large Civil Tiltrotor (LCTR2) concept design. Free-wake models were used for both rotors and the wing. This report is part of a NASA effort to develop an integrated analytical capability combining rotorcraft aeromechanics, structures, propulsion, mission analysis, and vehicle sizing. The present paper extends previous efforts by including rotor/wing interference explicitly in the rotor performance optimization and implicitly in the sizing.

  19. Wave journal bearing with compressible lubricant--Part 1: The wave bearing concept and a comparison to the plain circular bearing

    NASA Technical Reports Server (NTRS)

    Dimofte, Florin

    1995-01-01

    To improve hydrodynamic journal bearing steady-state and dynamic performance, a new bearing concept, the wave journal bearing, was developed at the author's lab. This concept features a waved inner bearing diameter. Compared to other alternative bearing geometries used to improve bearing performance such as spiral or herring-bone grooves, steps, etc., the wave bearing's design is relatively simple and allows the shaft to rotate in either direction. A three-wave bearing operating with a compressible lubricant, i.e., gas is analyzed using a numerical code. Its performance is compared to a plain (truly) circular bearing over a broad range of bearing working parameters, e.g., bearing numbers from 0.01 to 100.

  20. Integration of nursing assessment concepts into the medical entities dictionary using the LOINC semantic structure as a terminology model.

    PubMed

    Cieslowski, B J; Wajngurt, D; Cimino, J J; Bakken, S

    2001-01-01

    Recent investigations have tested the applicability of various terminology models for representing nursing concepts, including those related to nursing diagnoses, nursing interventions, and standardized nursing assessments, as a prerequisite for building a reference terminology that supports the nursing domain. We used the semantic structure of Clinical LOINC (Logical Observations, Identifiers, Names, and Codes) as a reference terminology model to support the integration of standardized assessment terms from two nursing terminologies into the Medical Entities Dictionary (MED), the concept-oriented, metadata dictionary at New York Presbyterian Hospital. Although the LOINC semantic structure was used previously to represent laboratory terms in the MED, selected hierarchies and semantic slots required revisions in order to incorporate the nursing assessment concepts. This project was an initial step in integrating nursing assessment concepts into the MED in a manner consistent with evolving standards for reference terminology models. Moreover, the revisions provide the foundation for adding other types of standardized assessments to the MED.

  1. Integration of nursing assessment concepts into the medical entities dictionary using the LOINC semantic structure as a terminology model.

    PubMed Central

    Cieslowski, B. J.; Wajngurt, D.; Cimino, J. J.; Bakken, S.

    2001-01-01

    Recent investigations have tested the applicability of various terminology models for representing nursing concepts, including those related to nursing diagnoses, nursing interventions, and standardized nursing assessments, as a prerequisite for building a reference terminology that supports the nursing domain. We used the semantic structure of Clinical LOINC (Logical Observations, Identifiers, Names, and Codes) as a reference terminology model to support the integration of standardized assessment terms from two nursing terminologies into the Medical Entities Dictionary (MED), the concept-oriented, metadata dictionary at New York Presbyterian Hospital. Although the LOINC semantic structure was used previously to represent laboratory terms in the MED, selected hierarchies and semantic slots required revisions in order to incorporate the nursing assessment concepts. This project was an initial step in integrating nursing assessment concepts into the MED in a manner consistent with evolving standards for reference terminology models. Moreover, the revisions provide the foundation for adding other types of standardized assessments to the MED. PMID:11825165

  2. An alternative resource sharing scheme for land mobile satellite services

    NASA Technical Reports Server (NTRS)

    Yan, Tsun-Yee; Sue, Miles K.

    1990-01-01

    A preliminary comparison between the two competing channelization concepts for the Land Mobile Satellite Services (LMSS), namely frequency division (FD) and code division (CD), is presented. Both random access and demand-assigned approaches are considered under these concepts. The CD concept is compared with the traditional FD concept on the basis of system considerations and a projected traffic model. It is shown that CD is not particularly attractive for the first-generation Mobile Satellite Services because of the spectral occupancy of the network bandwidth. However, the CD concept is a viable alternative for future systems such as the personal access satellite system (PASS) in the Ka-band spectrum, where spectral efficiency is not of prime concern. The effects of power robbing and voice activity factor are incorporated. It is also shown that the traditional rule of thumb of dividing the number of raw channels by the voice activity factor to obtain the effective number of channels is only valid asymptotically, as the aggregated traffic approaches infinity.

  3. An alternative resource sharing scheme for land mobile satellite services

    NASA Astrophysics Data System (ADS)

    Yan, Tsun-Yee; Sue, Miles K.

    A preliminary comparison between the two competing channelization concepts for the Land Mobile Satellite Services (LMSS), namely frequency division (FD) and code division (CD), is presented. Both random access and demand-assigned approaches are considered under these concepts. The CD concept is compared with the traditional FD concept on the basis of system considerations and a projected traffic model. It is shown that CD is not particularly attractive for the first-generation Mobile Satellite Services because of the spectral occupancy of the network bandwidth. However, the CD concept is a viable alternative for future systems such as the personal access satellite system (PASS) in the Ka-band spectrum, where spectral efficiency is not of prime concern. The effects of power robbing and voice activity factor are incorporated. It is also shown that the traditional rule of thumb of dividing the number of raw channels by the voice activity factor to obtain the effective number of channels is only valid asymptotically, as the aggregated traffic approaches infinity.
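    The rule of thumb quoted in this record can be made concrete with a binomial overload criterion; `max_users` and the 1% overload threshold below are illustrative assumptions, not the paper's traffic model:

```python
from math import comb

def max_users(raw_channels, vaf, overload_prob=0.01):
    """Largest number of users m such that, with each user active
    independently with probability vaf, the chance that more than
    raw_channels users talk at once stays below overload_prob."""
    m = raw_channels
    while True:
        candidate = m + 1
        # P(more than raw_channels of `candidate` users active), Binomial.
        p_over = sum(comb(candidate, k) * vaf**k * (1 - vaf)**(candidate - k)
                     for k in range(raw_channels + 1, candidate + 1))
        if p_over >= overload_prob:
            return m
        m = candidate

# Rule of thumb: 20 channels / 0.4 activity = 50 users; the binomial
# criterion admits fewer users at this finite size, approaching the
# 1/VAF gain only as the traffic grows large.
print(max_users(20, 0.4))
```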

  4. Axisymmetric Tandem Mirrors: Stabilization and Confinement Studies

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Post, R.F.; Fowler, T.K.; Bulmer, R.

    2005-01-15

    The 'Kinetic Stabilizer' has been proposed as a means of MHD stabilizing an axisymmetric tandem mirror system. The K-S concept is based on theoretical studies by Ryutov, confirmed experimentally in the Gas Dynamic Trap experiment in Novosibirsk. In the K-S, beams of ions are directed into the end of an 'expander' region outside the outer mirror of a tandem mirror. These ions, slowed, stagnated, and reflected as they move up the magnetic gradient, produce a low-density stabilizing plasma. At the Lawrence Livermore National Laboratory we have been conducting theoretical and computational studies of the K-S Tandem Mirror. These studies have employed a low-beta code written especially to analyze the beam injection/stabilization process, and a new code, SYMTRAN (by Hua and Fowler), that solves the coupled radial and axial particle and energy transport in a K-S T-M. Also, a 'legacy' MHD stability code, FLORA, has been upgraded and employed to benchmark the injection/stabilization code and to extend its results to high beta values. The FLORA code studies so far have confirmed the effectiveness of the K-S in stabilizing high-beta (40%) plasmas with stabilizer plasmas whose peak pressures are several orders of magnitude smaller than those of the confined plasma. Also, the SYMTRAN code has shown D-T plasma ignition from alpha-particle energy deposition in T-M regimes with strong end plugging. Our studies have confirmed the viability of the K-S T-M concept with respect to MHD stability and radial and axial confinement. We are continuing these studies in order to optimize the parameters and to examine means for the stabilization of possible residual instability modes, such as drift modes and 'trapped-particle' modes. These modes may in principle be controlled by tailoring the stabilizer plasma distribution and/or the radial potential distribution. In the paper, the results to date of our studies are summarized and projected to scope out possible fusion-power versions of the K-S T-M.

  5. Axisymmetric Tandem Mirrors: Stabilization and Confinement Studies

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Post, R F; Fowler, T K; Bulmer, R

    2004-07-15

    The 'Kinetic Stabilizer' has been proposed as a means of MHD stabilizing an axisymmetric tandem mirror system. The K-S concept is based on theoretical studies by Ryutov, confirmed experimentally in the Gas Dynamic Trap experiment in Novosibirsk. In the K-S, beams of ions are directed into the end of an 'expander' region outside the outer mirror of a tandem mirror. These ions, slowed, stagnated, and reflected as they move up the magnetic gradient, produce a low-density stabilizing plasma. At the Lawrence Livermore National Laboratory we have been conducting theoretical and computational studies of the K-S Tandem Mirror. These studies have employed a low-beta code written especially to analyze the beam injection/stabilization process, and a new code, SYMTRAN (by Hua and Fowler), that solves the coupled radial and axial particle and energy transport in a K-S T-M. Also, a 'legacy' MHD stability code, FLORA, has been upgraded and employed to benchmark the injection/stabilization code and to extend its results to high beta values. The FLORA code studies so far have confirmed the effectiveness of the K-S in stabilizing high-beta (40%) plasmas with stabilizer plasmas whose peak pressures are several orders of magnitude smaller than those of the confined plasma. Also, the SYMTRAN code has shown D-T plasma ignition from alpha-particle energy deposition in T-M regimes with strong end plugging. Our studies have confirmed the viability of the K-S T-M concept with respect to MHD stability and radial and axial confinement. We are continuing these studies in order to optimize the parameters and to examine means for the stabilization of possible residual instability modes, such as drift modes and 'trapped-particle' modes. These modes may in principle be controlled by tailoring the stabilizer plasma distribution and/or the radial potential distribution. In the paper, the results to date of our studies are summarized and projected to scope out possible fusion-power versions of the K-S T-M.

  6. Identifying Falls Risk Screenings Not Documented with Administrative Codes Using Natural Language Processing

    PubMed Central

    Zhu, Vivienne J; Walker, Tina D; Warren, Robert W; Jenny, Peggy B; Meystre, Stephane; Lenert, Leslie A

    2017-01-01

Quality reporting that relies on coded administrative data alone may not completely and accurately depict providers’ performance. To assess this concern with a test case, we developed and evaluated a natural language processing (NLP) approach to identify falls risk screenings documented in clinical notes of patients without coded falls risk screening data. Extracting information from 1,558 clinical notes (mainly progress notes) from 144 eligible patients, we generated a lexicon of 38 keywords relevant to falls risk screening, 26 terms for pre-negation, and 35 terms for post-negation. The NLP algorithm identified 62 of the 144 patients whose falls risk screening was documented only in clinical notes and not coded. Manual review confirmed 59 patients as true positives and 77 patients as true negatives. Our NLP approach scored 0.92 for precision, 0.95 for recall, and 0.93 for F-measure. These results support the concept of utilizing NLP to enhance healthcare quality reporting. PMID:29854264
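The reported scores follow the standard confusion-matrix definitions of precision, recall, and F-measure; a minimal sketch (the counts below are illustrative, not the study's exact tallies):

```python
# Precision, recall, and F-measure from confusion-matrix counts.
def prf(tp, fp, fn):
    precision = tp / (tp + fp)   # fraction of flagged patients who truly screened
    recall = tp / (tp + fn)      # fraction of truly screened patients who were flagged
    f1 = 2 * precision * recall / (precision + recall)  # harmonic mean of the two
    return precision, recall, f1

# Illustrative counts only (hypothetical, not the paper's raw confusion matrix).
precision, recall, f1 = prf(tp=59, fp=5, fn=3)
```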

  7. Field-programmable beam reconfiguring based on digitally-controlled coding metasurface

    NASA Astrophysics Data System (ADS)

    Wan, Xiang; Qi, Mei Qing; Chen, Tian Yi; Cui, Tie Jun

    2016-02-01

Digital phase shifters have been applied in traditional phased array antennas to realize beam steering. However, the phase shifter deals with the phase of the induced current; hence, it has to be in the path of each element of the antenna array, making phased array antennas very expensive. Metamaterials and/or metasurfaces enable the direct modulation of electromagnetic waves through designed subwavelength structures, which opens a new way to control beam scanning. Here, we present a direct digital mechanism to control the scattered electromagnetic waves using a coding metasurface, in which each unit cell loads a PIN diode to produce binary coding states of “1” and “0”. Through data lines, instant communication is established between the coding metasurface and the internal memory of a field-programmable gate array (FPGA). Thus, we realize the digital modulation of electromagnetic waves, from which we present a field-programmable reflective antenna with good measured performance. The proposed mechanism and functional device have great application potential in new-concept radar and communication systems.
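The binary phase states can be illustrated with a one-dimensional array-factor sketch: each “1” cell adds a pi phase shift, and the coding sequence shapes the scattered beam. All parameters here (cell count, half-wavelength spacing) are hypothetical and not taken from the paper:

```python
import math, cmath

# Toy 1-bit coding metasurface: each unit cell reflects with phase 0 ("0")
# or pi ("1").  For a 1-D row of cells with spacing d (in wavelengths), the
# far-field array factor at angle theta is
#   AF(theta) = | sum_n exp(i * (pi*bit_n + 2*pi*d*n*sin(theta))) |
def array_factor(coding, d, theta):
    return abs(sum(cmath.exp(1j * (math.pi * bit + 2 * math.pi * d * n * math.sin(theta)))
                   for n, bit in enumerate(coding)))

uniform = [0] * 8          # all cells "0": coherent specular beam at broadside
alternating = [0, 1] * 4   # "01010101": broadside contributions cancel pairwise
print(array_factor(uniform, 0.5, 0.0))      # 8.0
print(array_factor(alternating, 0.5, 0.0))  # ~0
```

Reprogramming the coding sequence via the FPGA thus redirects the beam without any per-element phase shifter, which is the cost advantage the abstract highlights.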

  8. A proposed study of multiple scattering through clouds up to 1 THz

    NASA Technical Reports Server (NTRS)

    Gerace, G. C.; Smith, E. K.

    1992-01-01

    A rigorous computation of the electromagnetic field scattered from an atmospheric liquid water cloud is proposed. The recent development of a fast recursive algorithm (Chew algorithm) for computing the fields scattered from numerous scatterers now makes a rigorous computation feasible. A method is presented for adapting this algorithm to a general case where there are an extremely large number of scatterers. It is also proposed to extend a new binary PAM channel coding technique (El-Khamy coding) to multiple levels with non-square pulse shapes. The Chew algorithm can be used to compute the transfer function of a cloud channel. Then the transfer function can be used to design an optimum El-Khamy code. In principle, these concepts can be applied directly to the realistic case of a time-varying cloud (adaptive channel coding and adaptive equalization). A brief review is included of some preliminary work on cloud dispersive effects on digital communication signals and on cloud liquid water spectra and correlations.

  9. Analyses in support of risk-informed natural gas vehicle maintenance facility codes and standards :

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ekoto, Isaac W.; Blaylock, Myra L.; LaFleur, Angela Christine

    2014-03-01

Safety standards development for maintenance facilities of liquid and compressed gas fueled large-scale vehicles is required to ensure proper facility design and operation envelopes. Standard development organizations are utilizing risk-informed concepts to develop natural gas vehicle (NGV) codes and standards so that maintenance facilities meet acceptable risk levels. The present report summarizes Phase I work for existing NGV repair facility code requirements and highlights inconsistencies that need quantitative analysis of their effectiveness. A Hazard and Operability (HAZOP) study was performed to identify key scenarios of interest. Finally, scenario analyses were performed using detailed simulations and modeling to estimate the overpressure hazards from HAZOP-defined scenarios. The results from Phase I will be used to identify significant risk contributors at NGV maintenance facilities, and are expected to form the basis for follow-on quantitative risk analysis work to address specific code requirements and identify effective accident prevention and mitigation strategies.

  10. Hypersonic vehicle simulation model: Winged-cone configuration

    NASA Technical Reports Server (NTRS)

    Shaughnessy, John D.; Pinckney, S. Zane; Mcminn, John D.; Cruz, Christopher I.; Kelley, Marie-Louise

    1990-01-01

    Aerodynamic, propulsion, and mass models for a generic, horizontal-takeoff, single-stage-to-orbit (SSTO) configuration are presented which are suitable for use in point mass as well as batch and real-time six degree-of-freedom simulations. The simulations can be used to investigate ascent performance issues and to allow research, refinement, and evaluation of integrated guidance/flight/propulsion/thermal control systems, design concepts, and methodologies for SSTO missions. Aerodynamic force and moment coefficients are given as functions of angle of attack, Mach number, and control surface deflections. The model data were estimated by using a subsonic/supersonic panel code and a hypersonic local surface inclination code. Thrust coefficient and engine specific impulse were estimated using a two-dimensional forebody, inlet, nozzle code and a one-dimensional combustor code and are given as functions of Mach number, dynamic pressure, and fuel equivalence ratio. Rigid-body mass moments of inertia and center of gravity location are functions of vehicle weight which is in turn a function of fuel flow.

  11. Permutation coding technique for image recognition systems.

    PubMed

    Kussul, Ernst M; Baidyk, Tatiana N; Wunsch, Donald C; Makeyev, Oleksandr; Martín, Anabel

    2006-11-01

A feature extractor and neural classifier for image recognition systems are proposed. The proposed feature extractor is based on the concept of random local descriptors (RLDs). It is followed by an encoder based on the permutation coding technique, which makes it possible to take into account not only the detected features but also the position of each feature in the image, and to make the recognition process invariant to small displacements. The combination of RLDs and permutation coding permits a sufficiently general description of the image to be recognized. The code generated by the encoder is used as input data for the neural classifier. Different types of images were used to test the proposed image recognition system. It was tested on the handwritten digit recognition problem, the face recognition problem, and the microobject shape recognition problem. The results of testing are very promising. The error rate for the Modified National Institute of Standards and Technology (MNIST) database is 0.44% and for the Olivetti Research Laboratory (ORL) database it is 0.1%.
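A toy sketch of the position-encoding idea (a simplification under assumed parameters, not the authors' exact RLD/permutation scheme): each feature type gets a base binary code, and its position is folded in by cyclically permuting that code by an amount derived from coarse-quantized coordinates, so small displacements leave the code unchanged:

```python
import random

CODE_LEN, GRID = 64, 8   # hypothetical code length and position-grid size

def base_code(feature_id):
    # Deterministic pseudo-random binary code for each feature type.
    rng = random.Random(feature_id)
    return [rng.randint(0, 1) for _ in range(CODE_LEN)]

def encode(feature_id, x, y):
    # Coarse-quantized position -> cyclic shift, so displacements smaller
    # than GRID pixels yield the identical code (small-shift invariance).
    shift = ((x // GRID) * 7 + (y // GRID)) % CODE_LEN
    code = base_code(feature_id)
    return code[-shift:] + code[:-shift] if shift else code
```

In a full system the permuted codes of all detected features would be superimposed into one vector that serves as the classifier input; this sketch only shows how position enters the code through a permutation.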

  12. A comparison of data interoperability approaches of fusion codes with application to synthetic diagnostics

    NASA Astrophysics Data System (ADS)

    Kruger, Scott; Shasharina, S.; Vadlamani, S.; McCune, D.; Holland, C.; Jenkins, T. G.; Candy, J.; Cary, J. R.; Hakim, A.; Miah, M.; Pletzer, A.

    2010-11-01

As various efforts to integrate fusion codes proceed worldwide, standards for sharing data have emerged. In the U.S., the SWIM project has pioneered the development of the Plasma State, which has a flat hierarchy and is dominated by its use within 1.5D transport codes. The European Integrated Tokamak Modeling effort has developed a more ambitious data interoperability effort organized around the concept of Consistent Physical Objects (CPOs). CPOs have deep hierarchies, as needed by an effort that seeks to encompass all of fusion computing. Here, we discuss ideas for implementing data interoperability that are complementary to both the Plasma State and CPOs. By making use of attributes within the netCDF and HDF5 binary file formats, the goals of data interoperability can be achieved with a more informal approach. In addition, a file can be interoperable with several standards at once. As an illustration of this approach, we discuss its application to the development of synthetic diagnostics that can be used for multiple codes.

  13. Study of fusion product effects in field-reversed mirrors

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Driemeyer, D.E.

    1980-01-01

The effect of fusion products (fps) on Field-Reversed Mirror (FRM) reactor concepts has been evaluated through the development of two new computer models. The first code (MCFRM) treats fps as test particles in a fixed background plasma, which is represented as a fluid. MCFRM includes a Monte Carlo treatment of Coulomb scattering and thus provides an accurate treatment of fp behavior even at lower energies where pitch-angle scattering becomes important. The second code (FRMOD) is a steady-state, globally averaged, two-fluid (ion and electron), point model of the FRM plasma that incorporates fp heating and ash buildup values which are consistent with the MCFRM calculations. These codes have been used extensively in the development of an advanced-fuel FRM reactor design (SAFFIRE). A Catalyzed-D version of the plant is also discussed along with an investigation of the steady-state energy distribution of fps in the FRM. User guides for the two computer codes are also included.

  14. Decoding the Emerging Patterns Exhibited in Non-coding RNAs Characteristic of Lung Cancer with Regard to their Clinical Significance.

    PubMed

    Sonea, Laura; Buse, Mihail; Gulei, Diana; Onaciu, Anca; Simon, Ioan; Braicu, Cornelia; Berindan-Neagoe, Ioana

    2018-05-01

Lung cancer remains the leading cause of cancer-related mortality worldwide; it needs to be further investigated to reduce these dramatic unfavorable statistics. Non-coding RNAs (ncRNAs) have been shown to be important cellular regulatory factors, and alteration of their expression levels has been correlated with an extensive number of pathologies. Specifically, their expression profiles are correlated with the development and progression of lung cancer, generating great interest for further investigation. This review focuses on the complex role of non-coding RNAs, namely miRNAs, piwi-interacting RNAs, small nucleolar RNAs, long non-coding RNAs and circular RNAs, in the process of developing novel biomarkers for diagnostic and prognostic factors that can then be utilized for personalized therapies toward this devastating disease. To support the concept of personalized medicine, we will focus on the roles of miRNAs in lung cancer tumorigenesis, their use as diagnostic and prognostic biomarkers and their application for patient therapy.

  15. Time synchronized video systems

    NASA Technical Reports Server (NTRS)

    Burnett, Ron

    1994-01-01

    The idea of synchronizing multiple video recordings to some type of 'range' time has been tried to varying degrees of success in the past. Combining this requirement with existing time code standards (SMPTE) and the new innovations in desktop multimedia however, have afforded an opportunity to increase the flexibility and usefulness of such efforts without adding costs over the traditional data recording and reduction systems. The concept described can use IRIG, GPS or a battery backed internal clock as the master time source. By converting that time source to Vertical Interval Time Code or Longitudinal Time Code, both in accordance with the SMPTE standards, the user will obtain a tape that contains machine/computer readable time code suitable for use with editing equipment that is available off-the-shelf. Accuracy on playback is then determined by the playback system chosen by the user. Accuracies of +/- 2 frames are common among inexpensive systems and complete frame accuracy is more a matter of the users' budget than the capability of the recording system.
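The machine-readable timecode arithmetic is straightforward; a minimal sketch assuming a non-drop-frame 30 fps SMPTE format (the frame rate and timecodes below are illustrative):

```python
# Convert an SMPTE-style HH:MM:SS:FF timecode to an absolute frame count.
def timecode_to_frames(tc, fps=30):
    hh, mm, ss, ff = (int(part) for part in tc.split(":"))
    return ((hh * 60 + mm) * 60 + ss) * fps + ff

# The +/-2 frame playback accuracy described above is a bound on this difference
# between the commanded and actually-located frames.
def frame_error(tc_a, tc_b, fps=30):
    return abs(timecode_to_frames(tc_a, fps) - timecode_to_frames(tc_b, fps))
```

Real SMPTE deployments also distinguish drop-frame counting for 29.97 fps NTSC video, which this sketch omits for clarity.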

  16. Moral competence among nurses in Malawi: A concept analysis approach.

    PubMed

    Maluwa, Veronica Mary; Gwaza, Elizabeth; Sakala, Betty; Kapito, Esnath; Mwale, Ruth; Haruzivishe, Clara; Chirwa, Ellen

    2018-01-01

Nurses are expected to provide comprehensive, holistic and ethically accepted care according to their code of ethics and practice. However, in Malawi, this is not always the case. This article analyses the concept of moral competence using Walker and Avant's strategy of concept analysis. The aim is to analyse moral competence in relation to nursing practice and to determine the defining attributes, antecedents and consequences of moral competence in nursing practice. Deductive analysis was used to find the defining attributes of moral competence, which were kindness, compassion, caring, critical thinking, ethical decision-making ability, problem solving, responsibility, discipline, accountability, communication, solidarity, honesty, and respect for human values, dignity and rights. The identified antecedents were personal, cultural and religious values; nursing ethics training; environment; and guidance. The consequences of moral competence are team-work spirit, effective communication, improved performance and positive attitudes in providing nursing care. Moral competence can therefore be used as a tool to improve care in nursing practice, to meet patients' problems and needs, and consequently to increase the public's satisfaction in Malawi.

  17. Plug Into "The Modernizing Machine"! Danish University Reform and Its Transformable Academic Subjectivities

    ERIC Educational Resources Information Center

    Krejsler, John Benedicto

    2013-01-01

    "The modernizing machine" codes individual bodies, things, and symbols with images from New Public Management, neo-liberal, and Knowledge Economy discourses. Drawing on Deleuze and Guattari's concept of machines, this article explores how "the modernizing machine" produces neo-liberal modernization of the public sector. Taking…

  18. Grid Standards and Codes | Grid Modernization | NREL

    Science.gov Websites

Describes NREL grid modernization work on standards and codes, including simulations that take advantage of advanced concepts such as hardware-in-the-loop testing, and a project to develop streamlined and accurate methods for New York utilities.

  19. NASA's Use of Human Behavior Models for Concept Development and Evaluation

    NASA Technical Reports Server (NTRS)

    Gore, Brian F.

    2012-01-01

Overview of NASA's use of computational approaches and methods, in particular human performance models, to support research goals, with a focus on examples of the methods used in Codes TH and TI at NASA Ames, followed by an in-depth review of MIDAS' current FAA work.

  20. Opening our science: Open science and cyanobacterial research at the US EPA

    EPA Science Inventory

    In this blog post we introduce the idea of Open Science and discuss multiple ways we are implementing these concepts in our cyanobacteria research. We give examples of our open access publications, open source code that support our research, and provide open access to our resear...

  1. Bud's World. Grade 3. New York Agriculture in the Classroom.

    ERIC Educational Resources Information Center

    Wolanyk, Betty

    This collection of classroom exercises was designed to maximize teacher time, while creating an awareness of our food and fiber system among New York third graders. The materials are color-coded, falling into four categories: language arts, mathematics, science, and social studies. Each exercise includes background information, concepts, and…

  2. The Trouble with Legal Ethics.

    ERIC Educational Resources Information Center

    Simon, William H.

    1991-01-01

Two conceptions of legal ethics or professional responsibility prevail: (1) as disciplinary rules or codes, and (2) as the personal moralities of individual lawyers. However, it is the application of general norms to specific circumstances through complex, creative judgment that is the ethical component of the ideal of legal professionalism. (MSE)

  3. Elementary School Students' Perceptions of Technology in their Pictorial Representations

    ERIC Educational Resources Information Center

    Eristi, Suzan Duygu; Kurt, Adile Askim

    2011-01-01

    The current study aimed to reveal elementary school students' perceptions of technology through their pictorial representations and their written expressions based on their pictorial representations. Content analysis based on the qualitative research method along with art-based inquiry was applied. The "coding system for the concepts revealed…

  4. Student Resistance to Schooling: Disconnections with Education in Rural Appalachia

    ERIC Educational Resources Information Center

    Hendrickson, Katie A.

    2012-01-01

    This study investigates student reasons for resisting engagement with school in a rural Appalachian area. The concept of student resistance to school is considered within a White, working-class student population. Through classroom observations, students displaying resistant behaviors were selected to participate in interviews. Coding of interview…

  5. New York Agriculture in the Classroom. Grade 4.

    ERIC Educational Resources Information Center

    Wolanyk, Betty

    These classroom exercises have been designed to maximize teacher time, while creating an awareness of our food and fiber system among New York fourth graders. The materials are color-coded, falling into four categories: language arts, mathematics, science, and social studies. Each exercise includes background information, concepts, and objectives…

  6. New York Agriculture in the Classroom. Grade 6.

    ERIC Educational Resources Information Center

    Wolanyk, Betty

    These classroom exercises have been designed to maximize teacher time, while creating an awareness of our food and fiber system among New York sixth graders. The materials are color-coded, falling into four categories: language arts, mathematics, science, and social studies. Each exercise includes background information, concepts, and objectives…

  7. Computational Participation: Understanding Coding as an Extension of Literacy Instruction

    ERIC Educational Resources Information Center

    Burke, Quinn; O'Byrne, W. Ian; Kafai, Yasmin B.

    2016-01-01

    Understanding the computational concepts on which countless digital applications run offers learners the opportunity to no longer simply read such media but also become more discerning end users and potentially innovative "writers" of new media themselves. To think computationally--to solve problems, to design systems, and to process and…

  8. High density arrays of micromirrors

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Folta, J. M.; Decker, J. Y.; Kolman, J.

    We established and achieved our goal to (1) fabricate and evaluate test structures based on the micromirror design optimized for maskless lithography applications, (2) perform system analysis and code development for the maskless lithography concept, and (3) identify specifications for micromirror arrays (MMAs) for LLNL's adaptive optics (AO) applications and conceptualize new devices.

  9. An Inquiry Activity for Genetics Using Chromosome Mapping.

    ERIC Educational Resources Information Center

    Leonard, William H.; Snodgrass, George

    1982-01-01

    Concepts to be developed, objectives, and student instructions are provided for an activity useful as an introduction to or review of Mendelian genetics and sex determination. Universal codes (read by optical scanners at supermarket checkout stands) from soup can labels are used as chromosome maps during the activity. (JN)

  10. World Wide Web Page Design: A Structured Approach.

    ERIC Educational Resources Information Center

    Gregory, Gwen; Brown, M. Marlo

    1997-01-01

    Describes how to develop a World Wide Web site based on structured programming concepts. Highlights include flowcharting, first page design, evaluation, page titles, documenting source code, text, graphics, and browsers. Includes a template for HTML writers, tips for using graphics, a sample homepage, guidelines for authoring structured HTML, and…

  11. Puzzling the Picture Using Grounded Theory

    ERIC Educational Resources Information Center

    Bennett, Elisabeth E.

    2016-01-01

    Since the first publication by Glaser and Strauss in 1967, Grounded Theory has become a highly influential research approach in the social sciences. The approach provides techniques and coding strategies for building theory inductively from the "ground up" as concepts within the data earn relevance into an evolving substantive theory.…

  12. New York Agriculture in the Classroom. Grade 5.

    ERIC Educational Resources Information Center

    Wolanyk, Betty

    These classroom exercises have been designed to maximize teacher time, while creating an awareness of our food and fiber system among New York fifth graders. The materials are color-coded, falling into four categories: language arts, mathematics, science, and social studies. Each exercise includes background information, concepts, and objectives…

  13. N+3 Aircraft Concept Designs and Trade Studies. Volume 2; Appendices-Design Methodologies for Aerodynamics, Structures, Weight, and Thermodynamic Cycles

    NASA Technical Reports Server (NTRS)

    Greitzer, E. M.; Bonnefoy, P. A.; delaRosaBlanco, E.; Dorbian, C. S.; Drela, M.; Hall, D. K.; Hansman, R. J.; Hileman, J. I.; Liebeck, R. H.; Lovegren, J.; hide

    2010-01-01

    Appendices A to F present the theory behind the TASOPT methodology and code. Appendix A describes the bulk of the formulation, while Appendices B to F develop the major sub-models for the engine, fuselage drag, BLI accounting, etc.

  14. High school students' views of learning chemistry concepts with analogies

    NASA Astrophysics Data System (ADS)

    Mathews, Jeffrey A.

    Analogies are often used in teaching abstract chemistry concepts, however few studies are concerned with how students actually view learning with analogies. An eight-member focus group, consisting of high school students, described the process of learning with analogies and how aware they were of their own learning. The students attended four analogy presentations and completed written responses, attended focus groups, and participated in repeated individual interview sessions throughout this eight-week, emic, phenomenological study. This study utilized an interpretive, qualitative methodology using a constant comparative, inductive analysis design. Students from a suburban high school in the southeastern United States were selected by purposeful sampling involving a concepts pre-test and an analogy presentation used to determine an eight member focus group. The focus group meetings were videotaped and emergent, semi-structured individual interviews were audio taped, transcribed and coded. Personal student journals, field notes, and a reflective journal were used to triangulate the study. Open, axial, and selective coding were used for data analysis and interpretation. Students described the process of learning with analogies as being able to visually see connections or picture mental images of familiar and unfamiliar concepts. Students pointed out the significance of investigating analogy breakdowns and described accommodation of new information as either automatic, which according to students resulted in memorization and hard learning, or quite laborious, which resulted in understanding and soft learning. Results indicated that students gave themselves more permission to ask questions and be critical of the teaching they are experiencing when their views were given merit. Implications for teachers include insight on students' views of learning and students' self-awareness.

  15. Analytical Modeling of Herschel-Quincke Concept Applied to Inlet Turbofan Engines

    NASA Technical Reports Server (NTRS)

    Hallez, Raphael F.; Burdisso, Ricardo A.; Gerhold, Carl H. (Technical Monitor)

    2002-01-01

This report summarizes the key results obtained by the Vibration and Acoustics Laboratories at Virginia Tech over the period from January 1999 to December 2000 on the project 'Investigation of an Adaptive Herschel-Quincke Tube Concept for the Reduction of Tonal and Broadband Noise from Turbofan Engines', funded by NASA Langley Research Center. The Herschel-Quincke (HQ) tube concept is a developing technique that consists of circumferential arrays of tubes around the duct. The analytical model is developed to provide prediction and design guidelines for application of the HQ concept to turbofan engine inlets. An infinite-duct model is developed and used to provide insight into attenuation mechanisms and design strategies. Based on this early model, the NASA-developed TBIEM3D code is modified for the HQ system. This model allows for investigation of the HQ system combined with a passive liner.

  16. Initial Noise Assessment of an Embedded-wing-propulsion Concept Vehicle

    NASA Technical Reports Server (NTRS)

    Stone, James R.; Krejsa, Eugene A.

    2008-01-01

    Vehicle acoustic requirements are considered for a Cruise-Efficient Short Take-Off and Landing (CESTOL) vehicle concept using an Embedded-Wing-Propulsion (EWP) system based on a review of the literature. Successful development of such vehicles would enable more efficient use of existing airports in accommodating the anticipated growth in air traffic while at the same time reducing the noise impact on the community around the airport. A noise prediction capability for CESTOL-EWP aircraft is developed, based largely on NASA's FOOTPR code and other published methods, with new relations for high aspect ratio slot nozzles and wing shielding. The predictive model is applied to a preliminary concept developed by Boeing for NASA GRC. Significant noise reduction for such an aircraft relative to the current state-of-the-art is predicted, and technology issues are identified which should be addressed to assure that the potential of this design concept is fully achieved with minimum technical risk.

  17. The ASLOTS concept: An interactive, adaptive decision support concept for Final Approach Spacing of Aircraft (FASA). FAA-NASA Joint University Program

    NASA Technical Reports Server (NTRS)

    Simpson, Robert W.

    1993-01-01

    This presentation outlines a concept for an adaptive, interactive decision support system to assist controllers at a busy airport in achieving efficient use of multiple runways. The concept is being implemented as a computer code called FASA (Final Approach Spacing for Aircraft), and will be tested and demonstrated in ATCSIM, a high fidelity simulation of terminal area airspace and airport surface operations. Objectives are: (1) to provide automated cues to assist controllers in the sequencing and spacing of landing and takeoff aircraft; (2) to provide the controller with a limited ability to modify the sequence and spacings between aircraft, and to insert takeoffs and missed approach aircraft in the landing flows; (3) to increase spacing accuracy using more complex and precise separation criteria while reducing controller workload; and (4) achieve higher operational takeoff and landing rates on multiple runways in poor visibility.

  18. Description of movement quality in patients with low back pain: A qualitative study as a first step to a practical definition.

    PubMed

    van Dijk, Margriet J H; Smorenburg, Nienke T A; Visser, Bart; Nijhuis-van der Sanden, Maria W G; Heerkens, Yvonne F

    2017-03-01

As a first step toward formulating a practical definition of movement quality (MQ), this study aims to explore how Dutch allied health care professionals (AHCPs) describe MQ of daily life activities in patients with low back pain (LBP). In this qualitative cross-sectional digital survey study, Dutch AHCPs (n = 91) described MQ in open text (n = 91) and with three keywords (n = 90). After exploratory qualitative content analysis, the linking rules of the International Classification of Functioning, Disability and Health (ICF) were applied to classify the MQ descriptions and keywords. The identified meaningful concepts (MCs) of the descriptions (274) and keywords (239) were linked to ICF codes (87.5% and 80.3%, respectively), Personal factors (5.8% and 5.9%, respectively), and supplementary codes (6.6% and 13.8%, respectively). The MCs were linked to a total of 31 ICF codes, especially to b760 'control of voluntary movement functions', b7602 'coordination of voluntary movements', d4 'Mobility', and d230 'carry out daily routine'. Negatively and positively formulated descriptions elucidated different MQ interpretations. Descriptions of MQ given by Dutch AHCPs in patients with LBP cover all ICF components. Coordination and functional movements are seen as the most elementary concepts of MQ. Variation in MQ descriptions and interpretations hinders defining MQ and indicates the necessity of additional steps.

  19. Dual neutral particle induced transmutation in CINDER2008

    NASA Astrophysics Data System (ADS)

    Martin, W. J.; de Oliveira, C. R. E.; Hecht, A. A.

    2014-12-01

Although nuclear transmutation methods for fission have existed for decades, the focus has been on neutron-induced reactions. Recent novel concepts have sought to use both neutrons and photons for purposes such as active interrogation of cargo to detect the smuggling of highly enriched uranium, a concept that would require modeling the transmutation caused by both incident particles. As photonuclear transmutation has yet to be modeled alongside neutron-induced transmutation in a production code, new methods need to be developed. The CINDER2008 nuclear transmutation code from Los Alamos National Laboratory is extended from neutron applications to dual neutral particle applications, allowing both neutron- and photon-induced reactions to be modeled, with a focus on fission. Following standard reaction modeling, the induced fission reaction is understood as a two-part reaction, with an entrance channel to the excited compound nucleus, and an exit channel from the excited compound nucleus to the fission fragmentation. Because photofission yield data (the exit channel from the compound nucleus) are sparse, neutron fission yield data are used in this work. With a different compound nucleus and excitation, the translation to the excited compound state is modified as appropriate. A verification and validation of these methods and data has been performed. This has shown that the translation of neutron-induced fission product yield sets, and their use in photonuclear applications, is appropriate, and that the code has been extended correctly.
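The kind of coupled production-and-decay bookkeeping a transmutation code performs can be sketched with the classic two-nuclide Bateman solution (decay constants here are generic placeholders, not CINDER2008's data libraries):

```python
import math

# Two-nuclide chain N1 -> N2 -> (stable): analytic Bateman solution for the
# coupled equations dN1/dt = -lam1*N1, dN2/dt = lam1*N1 - lam2*N2.
def bateman2(n1_0, lam1, lam2, t):
    n1 = n1_0 * math.exp(-lam1 * t)
    n2 = n1_0 * lam1 / (lam2 - lam1) * (math.exp(-lam1 * t) - math.exp(-lam2 * t))
    return n1, n2

# Cross-check against a simple explicit-Euler integration of the same system.
def euler2(n1_0, lam1, lam2, t, steps=100000):
    n1, n2, dt = n1_0, 0.0, t / steps
    for _ in range(steps):
        n1, n2 = n1 - lam1 * n1 * dt, n2 + (lam1 * n1 - lam2 * n2) * dt
    return n1, n2
```

A production code solves the same structure for thousands of nuclides at once, with reaction rates (neutron- and, in this extension, photon-induced) added to the decay terms.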

  20. An Analysis of the Changes in Communication Techniques in the Italian Codes of Medical Deontology.

    PubMed

    Conti, Andrea Alberto

    2017-04-28

    The code of deontology of the Italian National Federation of the Colleges of Physicians, Surgeons and Dentists (FNOMCeO) contains the principles and rules to which the professional medical practitioner must adhere. This work identifies and analyzes the medical-linguistic choices and the expressive techniques present in the different editions of the code, and evaluates their purpose and function, focusing on the first appearance and the subsequent frequency of key terms. Various aspects of the formal and expressive revisions of the eight editions of the Codes of Medical Deontology published after the Second World War (from 1947/48 to 2014) are here presented, starting from a brief comparison with the first edition of 1903. Formal characteristics, choices of medical terminology and the introduction of new concepts and communicative attitudes are here identified and evaluated. This paper, in presenting a quantitative and epistemological analysis of variations, modifications and confirmations in the different editions of the Italian code of medical deontology over the last century, enucleates and demonstrates the dynamic paradigm of changing attitudes in the medical profession. This analysis shows the evolution in medical-scientific communication as embodied in the Italian code of medical deontology. This code, in its adoption, changes and adaptations, as evidenced in its successive editions, bears witness to the expressions and attitudes pertinent to and characteristic of the deontological stance of the medical profession during the twentieth century.

  1. Dynamic state estimation based on Poisson spike trains—towards a theory of optimal encoding

    NASA Astrophysics Data System (ADS)

    Susemihl, Alex; Meir, Ron; Opper, Manfred

    2013-03-01

    Neurons in the nervous system convey information to higher brain regions by the generation of spike trains. An important question in the field of computational neuroscience is how these sensory neurons encode environmental information in a way which may be simply analyzed by subsequent systems. Many aspects of the form and function of the nervous system have been understood using the concepts of optimal population coding. Most studies, however, have neglected the aspect of temporal coding. Here we address this shortcoming through a filtering theory of inhomogeneous Poisson processes. We derive exact relations for the minimal mean squared error of the optimal Bayesian filter and, by optimizing the encoder, obtain optimal codes for populations of neurons. We also show that a class of non-Markovian, smooth stimuli are amenable to the same treatment, and provide results for the filtering and prediction error which hold for a general class of stochastic processes. This sets a sound mathematical framework for a population coding theory that takes temporal aspects into account. It also formalizes a number of studies which discussed temporal aspects of coding using time-window paradigms, by stating them in terms of correlation times and firing rates. We propose that this kind of analysis allows for a systematic study of temporal coding and will bring further insights into the nature of the neural code.
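The population-coding idea in this abstract can be illustrated with a toy rate-decoding sketch. This is not the paper's Bayesian filter for dynamic stimuli; it only shows, under made-up numbers, how averaging Poisson spike counts across a population recovers a stimulus-driven firing rate, with the error shrinking as the population grows.

```python
# Toy sketch of rate decoding from Poisson spike counts (illustrative only;
# the paper's optimal Bayesian filtering of dynamic stimuli is far richer).
import math
import random

random.seed(0)

def poisson_sample(lam):
    # Knuth's method: multiply uniforms until the product drops below exp(-lam).
    threshold, k, p = math.exp(-lam), 0, 1.0
    while True:
        p *= random.random()
        if p <= threshold:
            return k
        k += 1

def decode(rate_hz, n_neurons, window=1.0):
    # Each neuron emits a Poisson spike count in the time window; the
    # population-averaged count per unit time estimates the stimulus rate.
    counts = [poisson_sample(rate_hz * window) for _ in range(n_neurons)]
    return sum(counts) / (n_neurons * window)

# With 10,000 neurons, the estimate of a 10 Hz stimulus has a standard error
# of about sqrt(10 / 10000) ~ 0.03 Hz.
estimate = decode(10.0, 10000)
```

The window length plays the role of the "time-window paradigms" the abstract mentions: longer windows or larger populations both reduce the decoding error.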

  2. [Data protection and data access (I): federal data protection law and the social welfare code with reference to carrying out occupational medicine epidemiologic studies in Germany].

    PubMed

    Weigelt, E; Scherb, H

    1992-11-01

The regulations applicable to research in occupational epidemiology are the federal data protection (confidentiality) law (BDSG), the social welfare code (SGB), medical professional secrecy regulations, and the federal statistics law (BStatG). The SGB, medical professional secrecy, and BStatG codes take precedence over BDSG rulings. This paper discusses the BDSG and SGB. Medical professional secrecy and the BStatG will be the topic of another publication (Datenschutz und Datenzugang II). The BDSG permits processing and utilization of personal data only if 1. this is permitted by the BDSG or a law with higher priority, or 2. the individual concerned has given her or his informed consent. According to the BDSG, private research institutes can have access to personal data collected within non-public institutions without the consent of the individual only via section 28 (2). The "research paragraph", section 40, governs the processing and utilization of personal data by research institutions. As a rule, the SGB permits access to epidemiological data sources only with the informed consent of the individuals concerned. The exception is section 75 SGB X. This paragraph permits disclosure of personal data without the individual's consent by the relevant public institution only if the public interest considerably outweighs the private concerns. To our knowledge, however, this clause has had no practical significance. The concept of "informed consent" is discussed in detail, including the requirements for a legal form of informed consent. The legal codes of the BDSG, professional secrecy, and the BStatG permit the transfer of personal data if the individuals concerned remain anonymous. This paper deals in detail with the concept of "anonymity". (ABSTRACT TRUNCATED AT 250 WORDS)

  3. SoAx: A generic C++ Structure of Arrays for handling particles in HPC codes

    NASA Astrophysics Data System (ADS)

    Homann, Holger; Laenen, Francois

    2018-03-01

The numerical study of physical problems often requires integrating the dynamics of a large number of particles evolving according to a given set of equations. Particles are characterized by the information they are carrying, such as an identity, a position, and other properties. There are, generally speaking, two different possibilities for handling particles in high performance computing (HPC) codes. The concept of an Array of Structures (AoS) is in the spirit of the object-oriented programming (OOP) paradigm in that the particle information is implemented as a structure. Here, an object (realization of the structure) represents one particle and a set of many particles is stored in an array. In contrast, using the concept of a Structure of Arrays (SoA), a single structure holds several arrays, each representing one property (such as the identity) of the whole set of particles. The AoS approach is often implemented in HPC codes due to its handiness and flexibility. For a class of problems, however, it is known that the performance of SoA is much better than that of AoS. We confirm this observation for our particle problem. Using a benchmark we show that on modern Intel Xeon processors the SoA implementation is typically several times faster than the AoS one. On Intel's MIC co-processors the performance gap even attains a factor of ten. The same is true for GPU computing, using both computational and multi-purpose GPUs. Combining performance and handiness, we present the library SoAx, which achieves optimal performance (on CPUs, MICs, and GPUs) while providing the same handiness as AoS. For this, SoAx uses modern C++ design techniques such as template metaprogramming, which allow it to automatically generate code for user-defined heterogeneous data structures.
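The AoS/SoA distinction described in this abstract can be sketched in a few lines. The following is an illustrative Python analogy, not the C++ SoAx library itself; the particle properties (`id`, `x`, `y`) are made up for the example.

```python
# Illustrative sketch: the same particle data laid out as an Array of
# Structures (AoS) versus a Structure of Arrays (SoA).

# AoS: one record per particle (objects in an array).
aos = [{"id": i, "x": float(i), "y": 2.0 * i} for i in range(4)]

# SoA: one array per property, sharing a common index.
soa = {
    "id": list(range(4)),
    "x":  [float(i) for i in range(4)],
    "y":  [2.0 * i for i in range(4)],
}

# A per-property sweep (e.g. advancing all x positions) touches contiguous
# memory in the SoA layout, which is what favours vectorization on CPUs,
# MICs, and GPUs.
def shift_x_aos(particles, dx):
    for p in particles:          # strided access: jumps over id and y
        p["x"] += dx

def shift_x_soa(data, dx):
    data["x"] = [x + dx for x in data["x"]]   # one contiguous array

shift_x_aos(aos, 1.0)
shift_x_soa(soa, 1.0)
assert [p["x"] for p in aos] == soa["x"]
```

In C++, SoAx reportedly hides the SoA layout behind an AoS-like interface via template metaprogramming; the two update functions above show why the layouts differ in memory-access pattern even though they compute the same result.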

  4. SAM Theory Manual

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hu, Rui

The System Analysis Module (SAM) is an advanced and modern system analysis tool being developed at Argonne National Laboratory under the U.S. DOE Office of Nuclear Energy’s Nuclear Energy Advanced Modeling and Simulation (NEAMS) program. SAM development aims for advances in physical modeling, numerical methods, and software engineering to enhance its user experience and usability for reactor transient analyses. To facilitate the code development, SAM utilizes an object-oriented application framework (MOOSE), and its underlying meshing and finite-element library (libMesh) and linear and non-linear solvers (PETSc), to leverage modern advanced software environments and numerical methods. SAM focuses on modeling advanced reactor concepts such as SFRs (sodium fast reactors), LFRs (lead-cooled fast reactors), and FHRs (fluoride-salt-cooled high temperature reactors) or MSRs (molten salt reactors). These advanced concepts are distinguished from light-water reactors in their use of single-phase, low-pressure, high-temperature, and low Prandtl number (sodium and lead) coolants. As a new code development, the initial effort has been focused on modeling and simulation capabilities of heat transfer and single-phase fluid dynamics responses in Sodium-cooled Fast Reactor (SFR) systems. The system-level simulation capabilities of fluid flow and heat transfer in general engineering systems and typical SFRs have been verified and validated. This document provides the theoretical and technical basis of the code to help users understand the underlying physical models (such as governing equations, closure models, and component models), system modeling approaches, numerical discretization and solution methods, and the overall capabilities in SAM. As the code is still under ongoing development, this SAM Theory Manual will be updated periodically to keep it consistent with the state of the development.

  5. The Maximal C3 Self-Complementary Trinucleotide Circular Code X in Genes of Bacteria, Archaea, Eukaryotes, Plasmids and Viruses

    PubMed Central

    Michel, Christian J.

    2017-01-01

    In 1996, a set X of 20 trinucleotides was identified in genes of both prokaryotes and eukaryotes which has on average the highest occurrence in reading frame compared to its two shifted frames. Furthermore, this set X has an interesting mathematical property as X is a maximal C3 self-complementary trinucleotide circular code. In 2015, by quantifying the inspection approach used in 1996, the circular code X was confirmed in the genes of bacteria and eukaryotes and was also identified in the genes of plasmids and viruses. The method was based on the preferential occurrence of trinucleotides among the three frames at the gene population level. We extend here this definition at the gene level. This new statistical approach considers all the genes, i.e., of large and small lengths, with the same weight for searching the circular code X. As a consequence, the concept of circular code, in particular the reading frame retrieval, is directly associated to each gene. At the gene level, the circular code X is strengthened in the genes of bacteria, eukaryotes, plasmids, and viruses, and is now also identified in the genes of archaea. The genes of mitochondria and chloroplasts contain a subset of the circular code X. Finally, by studying viral genes, the circular code X was found in DNA genomes, RNA genomes, double-stranded genomes, and single-stranded genomes. PMID:28420220
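The self-complementarity property mentioned in this abstract, and the frame-counting idea behind the statistical approach, can both be checked mechanically. The 20 trinucleotides below are the set usually reported as X in the literature, reproduced here as illustrative input rather than an authoritative listing, and should be checked against the original paper.

```python
# Sketch: verify self-complementarity of a trinucleotide set, and count
# occurrences of the set in each of the three reading frames of a sequence.
X = {"AAC", "AAT", "ACC", "ATC", "ATT", "CAG", "CTC", "CTG", "GAA", "GAC",
     "GAG", "GAT", "GCC", "GGC", "GGT", "GTA", "GTC", "GTT", "TAC", "TTC"}

COMP = {"A": "T", "T": "A", "C": "G", "G": "C"}

def revcomp(codon):
    # Reverse complement, e.g. AAC -> GTT.
    return "".join(COMP[b] for b in reversed(codon))

# Self-complementary: the reverse complement of every codon in X is in X,
# so X pairs up into 10 complementary couples.
assert all(revcomp(c) in X for c in X)

def frame_hits(seq, frame):
    # Slice the sequence into trinucleotides starting at the given frame
    # offset (0, 1, or 2) and count how many belong to X. In real genes the
    # reading frame (frame 0) scores highest on average.
    codons = [seq[i:i + 3] for i in range(frame, len(seq) - 2, 3)]
    return sum(c in X for c in codons)
```

For a toy sequence such as `"AACAATGTT"`, `frame_hits(seq, 0)` counts three hits while the shifted frames count fewer, which is a miniature version of the preferential-occurrence criterion used to identify X.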

  6. A tactile-output paging communication system for the deaf-blind

    NASA Technical Reports Server (NTRS)

    Baer, J. A.

    1979-01-01

    A radio frequency paging communication system that has coded vibrotactile outputs suitable for use by deaf-blind people was developed. In concept, the system consists of a base station transmitting and receiving unit and many on-body transmitting and receiving units. The completed system has seven operating modes: fire alarm; time signal; repeated single character Morse code; manual Morse code; emergency aid request; operational status test; and message acknowledge. The on-body units can be addressed in three ways: all units; a group of units; or an individual unit. All the functions developed were integrated into a single package that can be worn on the user's wrist. The control portion of the on-body unit is implemented by a microcomputer. The microcomputer is packaged in a custom-designed hybrid circuit to reduce its physical size.
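The Morse-code vibrotactile modes in this abstract lend themselves to a small encoding sketch. The on/off timing units below follow the standard Morse dot:dash ratio and are assumptions for illustration; the actual timings of the NASA unit are not given in the abstract.

```python
# Sketch of a "single character Morse code" vibrotactile mode: map a
# character to dots/dashes, then to alternating vibrator on/off durations.
MORSE = {"A": ".-", "B": "-...", "E": ".", "O": "---", "S": "..."}

DOT, DASH, GAP = 1, 3, 1   # standard Morse ratio: a dash is 3 dot units

def vibration_pattern(char):
    """Alternating on/off durations (in dot units) for one character."""
    pattern = []
    for symbol in MORSE[char.upper()]:
        pattern.append(DOT if symbol == "." else DASH)  # vibrator on
        pattern.append(GAP)                             # vibrator off
    return pattern[:-1]  # drop the trailing inter-symbol gap

# 'S' = three dots: on 1, off 1, on 1, off 1, on 1
assert vibration_pattern("S") == [1, 1, 1, 1, 1]
```

A wrist-worn unit's microcomputer would drive the vibrator from such a duration list; the table here covers only a few characters for brevity.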

  7. Dopamine reward prediction error coding.

    PubMed

    Schultz, Wolfram

    2016-03-01

Reward prediction errors consist of the differences between received and predicted rewards. They are crucial for basic forms of learning about rewards and make us strive for more rewards, an evolutionarily beneficial trait. Most dopamine neurons in the midbrain of humans, monkeys, and rodents signal a reward prediction error; they are activated by more reward than predicted (positive prediction error), remain at baseline activity for fully predicted rewards, and show depressed activity with less reward than predicted (negative prediction error). The dopamine signal increases nonlinearly with reward value and codes formal economic utility. Drugs of addiction generate, hijack, and amplify the dopamine reward signal and induce exaggerated, uncontrolled dopamine effects on neuronal plasticity. The striatum, amygdala, and frontal cortex also show reward prediction error coding, but only in subpopulations of neurons. Thus, the important concept of reward prediction errors is implemented in neuronal hardware.
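The prediction-error scheme this abstract describes can be captured by a minimal delta-rule update. The learning rate and reward magnitude below are made-up values for illustration, not parameters from the paper.

```python
# Minimal delta-rule sketch of reward prediction error coding:
# delta = received reward - predicted reward, used to update the prediction.
def update(prediction, reward, lr=0.5):
    delta = reward - prediction          # + : more reward than predicted
    return prediction + lr * delta, delta

prediction, deltas = 0.0, []
for _ in range(10):                      # an initially unexpected reward of 1.0
    prediction, delta = update(prediction, 1.0)
    deltas.append(delta)

# The first delivery yields a large positive error; once the reward becomes
# fully predicted, the error decays toward zero, mirroring dopamine neurons
# returning to baseline activity for expected rewards.
assert deltas[0] == 1.0 and deltas[-1] < 0.01
```

Omitting a predicted reward (passing `reward=0.0` after training) would produce a negative delta, the analogue of the depressed dopamine activity for less reward than predicted.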

  8. microRNA Therapeutics in Cancer - An Emerging Concept.

    PubMed

    Shah, Maitri Y; Ferrajoli, Alessandra; Sood, Anil K; Lopez-Berestein, Gabriel; Calin, George A

    2016-10-01

MicroRNAs (miRNAs) are an evolutionarily conserved class of small, regulatory non-coding RNAs that negatively regulate the expression of protein-coding genes and other non-coding transcripts. miRNAs have been established as master regulators of cellular processes, and they play a vital role in tumor initiation, progression, and metastasis. Furthermore, widespread deregulation of microRNAs has been reported in several cancers, with several microRNAs playing oncogenic or tumor-suppressive roles. On this basis, miRNAs have emerged as promising therapeutic tools for cancer management. In this review, we focus on the roles of miRNAs in tumorigenesis, the miRNA-based therapeutic strategies currently being evaluated for use in cancer, and the advantages of and current challenges to their use in the clinic. Copyright © 2016 The Authors. Published by Elsevier B.V. All rights reserved.

  9. Numerical Analysis of Convection/Transpiration Cooling

    NASA Technical Reports Server (NTRS)

    Glass, David E.; Dilley, Arthur D.; Kelly, H. Neale

    1999-01-01

An innovative concept utilizing the natural porosity of refractory-composite materials and hydrogen coolant to provide CONvective and TRANspiration (CONTRAN) cooling and oxidation protection has been numerically studied for surfaces exposed to a high heat flux, high temperature environment such as hypersonic vehicle engine combustor walls. A boundary-layer code and a porous-media finite difference code were utilized to analyze the effect of convection and transpiration cooling on surface heat flux and temperature. The boundary-layer code determined that transpiration flow is able to provide blocking of the surface heat flux only if it is above a minimum level, due to heat addition from combustion of the hydrogen transpirant. The porous-media analysis indicated that cooling of the surface is attained with coolant flow rates in the same range as those required for blocking, indicating that a coupled analysis would be beneficial.

  10. Multiprocessing on supercomputers for computational aerodynamics

    NASA Technical Reports Server (NTRS)

    Yarrow, Maurice; Mehta, Unmeel B.

    1991-01-01

    Little use is made of multiple processors available on current supercomputers (computers with a theoretical peak performance capability equal to 100 MFLOPS or more) to improve turnaround time in computational aerodynamics. The productivity of a computer user is directly related to this turnaround time. In a time-sharing environment, such improvement in this speed is achieved when multiple processors are used efficiently to execute an algorithm. The concept of multiple instructions and multiple data (MIMD) is applied through multitasking via a strategy that requires relatively minor modifications to an existing code for a single processor. This approach maps the available memory to multiple processors, exploiting the C-Fortran-Unix interface. The existing code is mapped without the need for developing a new algorithm. The procedure for building a code utilizing this approach is automated with the Unix stream editor.
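The MIMD multitasking strategy in this abstract (the same kernel executed on data partitioned across processors, with results combined afterward) can be sketched in modern terms. This illustrative Python version uses a thread pool rather than the C-Fortran-Unix interface of the original approach; the kernel and partitioning are made up for the example.

```python
# Hedged sketch of MIMD-style multitasking: partition the data among workers,
# let each run the same kernel independently, then reduce the partial results.
from concurrent.futures import ThreadPoolExecutor

def kernel(chunk):
    # Each "processor" executes the same instructions on its own data.
    return sum(x * x for x in chunk)

data = list(range(1000))
chunks = [data[i::4] for i in range(4)]   # map the data across 4 workers

with ThreadPoolExecutor(max_workers=4) as pool:
    partials = list(pool.map(kernel, chunks))

# The combined result equals the single-processor computation.
assert sum(partials) == sum(x * x for x in data)
```

The key point, as in the abstract, is that the algorithm itself is unchanged: only the mapping of data to processors is added around the existing kernel.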

  11. Constructing graph models for software system development and analysis

    NASA Astrophysics Data System (ADS)

    Pogrebnoy, Andrey V.

    2017-01-01

We propose a concept for creating instrumentation for the rationale of functional and structural decisions during software system (SS) development. We propose to develop the SS simultaneously on two models: a functional model (FM) and a structural model (SM). The FM is the source code of the SS. An adequate representation of the FM in the form of a graph model (GM) is generated automatically and called the SM. The problem of creating and visualizing the GM is considered from the standpoint of applying it as a uniform platform for the adequate representation of the SS source code. We propose three levels of GM detailing: GM1 for visual analysis of the source code and for SS version control, GM2 for resource optimization and analysis of connections between SS components, and GM3 for analysis of the SS functioning in dynamics. The paper includes examples of constructing all levels of the GM.

  12. Dopamine reward prediction error coding

    PubMed Central

    Schultz, Wolfram

    2016-01-01

Reward prediction errors consist of the differences between received and predicted rewards. They are crucial for basic forms of learning about rewards and make us strive for more rewards, an evolutionarily beneficial trait. Most dopamine neurons in the midbrain of humans, monkeys, and rodents signal a reward prediction error; they are activated by more reward than predicted (positive prediction error), remain at baseline activity for fully predicted rewards, and show depressed activity with less reward than predicted (negative prediction error). The dopamine signal increases nonlinearly with reward value and codes formal economic utility. Drugs of addiction generate, hijack, and amplify the dopamine reward signal and induce exaggerated, uncontrolled dopamine effects on neuronal plasticity. The striatum, amygdala, and frontal cortex also show reward prediction error coding, but only in subpopulations of neurons. Thus, the important concept of reward prediction errors is implemented in neuronal hardware. PMID:27069377

  13. Evolvix BEST Names for semantic reproducibility across code2brain interfaces

    PubMed Central

    Scheuer, Katherine S.; Keel, Seth A.; Vyas, Vaibhav; Liblit, Ben; Hanlon, Bret; Ferris, Michael C.; Yin, John; Dutra, Inês; Pietsch, Anthony; Javid, Christine G.; Moog, Cecilia L.; Meyer, Jocelyn; Dresel, Jerdon; McLoone, Brian; Loberger, Sonya; Movaghar, Arezoo; Gilchrist‐Scott, Morgaine; Sabri, Yazeed; Sescleifer, Dave; Pereda‐Zorrilla, Ivan; Zietlow, Andrew; Smith, Rodrigo; Pietenpol, Samantha; Goldfinger, Jacob; Atzen, Sarah L.; Freiberg, Erika; Waters, Noah P.; Nusbaum, Claire; Nolan, Erik; Hotz, Alyssa; Kliman, Richard M.; Mentewab, Ayalew; Fregien, Nathan; Loewe, Martha

    2016-01-01

    Names in programming are vital for understanding the meaning of code and big data. We define code2brain (C2B) interfaces as maps in compilers and brains between meaning and naming syntax, which help to understand executable code. While working toward an Evolvix syntax for general‐purpose programming that makes accurate modeling easy for biologists, we observed how names affect C2B quality. To protect learning and coding investments, C2B interfaces require long‐term backward compatibility and semantic reproducibility (accurate reproduction of computational meaning from coder‐brains to reader‐brains by code alone). Semantic reproducibility is often assumed until confusing synonyms degrade modeling in biology to deciphering exercises. We highlight empirical naming priorities from diverse individuals and roles of names in different modes of computing to show how naming easily becomes impossibly difficult. We present the Evolvix BEST (Brief, Explicit, Summarizing, Technical) Names concept for reducing naming priority conflicts, test it on a real challenge by naming subfolders for the Project Organization Stabilizing Tool system, and provide naming questionnaires designed to facilitate C2B debugging by improving names used as keywords in a stabilizing programming language. Our experiences inspired us to develop Evolvix using a flipped programming language design approach with some unexpected features and BEST Names at its core. PMID:27918836

  14. Molecular interplay of pro-inflammatory transcription factors and non-coding RNAs in esophageal squamous cell carcinoma.

    PubMed

    Sundaram, Gopinath M; Veera Bramhachari, Pallaval

    2017-06-01

    Esophageal squamous cell carcinoma is the sixth most common cancer in the developing world. The aggressive nature of esophageal squamous cell carcinoma, its tendency for relapse, and the poor survival prospects of patients diagnosed at advanced stages, represent a pressing need for the development of new therapies for this disease. Chronic inflammation is known to have a causal link to cancer pre-disposition. Nuclear factor kappa B and signal transducer and activator of transcription 3 are transcription factors which regulate immunity and inflammation and are emerging as key regulators of tumor initiation, progression, and metastasis. Although these pro-inflammatory factors in esophageal squamous cell carcinoma have been well-characterized with reference to protein-coding targets, their functional interactions with non-coding RNAs have only recently been gaining attention. Non-coding RNAs, especially microRNAs and long non-coding RNAs demonstrate potential as biomarkers and alternative therapeutic targets. In this review, we summarize the recent literature and concepts on non-coding RNAs that are regulated by/regulate nuclear factor kappa B and signal transducer and activator of transcription 3 in esophageal cancer progression. We also discuss how these recent discoveries can pave way for future therapeutic options to treat esophageal squamous cell carcinoma.

  15. An Object-oriented Computer Code for Aircraft Engine Weight Estimation

    NASA Technical Reports Server (NTRS)

    Tong, Michael T.; Naylor, Bret A.

    2008-01-01

Reliable engine-weight estimation at the conceptual design stage is critical to the development of new aircraft engines. It helps to identify the best engine concept amongst several candidates. At NASA Glenn (GRC), the Weight Analysis of Turbine Engines (WATE) computer code, originally developed by Boeing Aircraft, has been used to estimate the engine weight of various conceptual engine designs. The code, written in FORTRAN, was originally developed for NASA in 1979. Since then, substantial improvements have been made to the code to improve the weight calculations for most of the engine components. Most recently, to improve the maintainability and extensibility of WATE, the FORTRAN code has been converted into an object-oriented version. The conversion was done within NASA's NPSS (Numerical Propulsion System Simulation) framework. This enables WATE to interact seamlessly with the thermodynamic cycle model, which provides component flow data such as airflows, temperatures, and pressures that are required for sizing the components and for the weight calculations. The tighter integration between NPSS and WATE would greatly enhance system-level analysis and optimization capabilities. It would also facilitate the enhancement of the WATE code for next-generation aircraft and space propulsion systems. In this paper, the architecture of the object-oriented WATE code (or WATE++) is described. Both the FORTRAN and object-oriented versions of the code are employed to compute the dimensions and weight of a 300-passenger aircraft engine (GE90 class). Both versions of the code produce essentially identical results, as should be the case. Keywords: NASA, aircraft engine, weight, object-oriented

  16. Exoplanet Yield Estimation for Decadal Study Concepts using EXOSIMS

    NASA Astrophysics Data System (ADS)

    Morgan, Rhonda; Lowrance, Patrick; Savransky, Dmitry; Garrett, Daniel

    2016-01-01

The anticipated upcoming large mission study concepts for the direct imaging of exo-earths present an exciting opportunity for exoplanet discovery and characterization. While these telescope concepts would also be capable of conducting a broad range of astrophysical investigations, the most difficult technology challenges are driven by the requirements for imaging exo-earths. The exoplanet science yield for these mission concepts will drive design trades and mission concept comparisons. To assist in these trade studies, the Exoplanet Exploration Program Office (ExEP) is developing a yield estimation tool that emphasizes transparency and consistent comparison of various design concepts. The tool will provide a parametric estimate of the science yield of various mission concepts using contrast curves from physics-based model codes and Monte Carlo simulations of design reference missions under realistic constraints, such as solar avoidance angles, the observatory orbit, propulsion limitations of star shades, the accessibility of candidate targets, local and background zodiacal light levels, and background confusion by stars and galaxies. The Python tool utilizes Dmitry Savransky's EXOSIMS (Exoplanet Open-Source Imaging Mission Simulator) design reference mission simulator, which is being developed for the WFIRST Preliminary Science program. ExEP is extending and validating the tool for future mission concepts under consideration for the upcoming 2020 decadal review. We present a validation plan and preliminary yield results for a point design.

  17. On the Use of Statistics in Design and the Implications for Deterministic Computer Experiments

    NASA Technical Reports Server (NTRS)

    Simpson, Timothy W.; Peplinski, Jesse; Koch, Patrick N.; Allen, Janet K.

    1997-01-01

    Perhaps the most prevalent use of statistics in engineering design is through Taguchi's parameter and robust design -- using orthogonal arrays to compute signal-to-noise ratios in a process of design improvement. In our view, however, there is an equally exciting use of statistics in design that could become just as prevalent: it is the concept of metamodeling whereby statistical models are built to approximate detailed computer analysis codes. Although computers continue to get faster, analysis codes always seem to keep pace so that their computational time remains non-trivial. Through metamodeling, approximations of these codes are built that are orders of magnitude cheaper to run. These metamodels can then be linked to optimization routines for fast analysis, or they can serve as a bridge for integrating analysis codes across different domains. In this paper we first review metamodeling techniques that encompass design of experiments, response surface methodology, Taguchi methods, neural networks, inductive learning, and kriging. We discuss their existing applications in engineering design and then address the dangers of applying traditional statistical techniques to approximate deterministic computer analysis codes. We conclude with recommendations for the appropriate use of metamodeling techniques in given situations and how common pitfalls can be avoided.
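The metamodeling idea in this abstract (a cheap statistical approximation standing in for an expensive analysis code) can be shown with a toy response surface. The "expensive" function below is a made-up stand-in for a real simulation code, and a three-point interpolating quadratic stands in for the richer response surface or kriging models the paper reviews.

```python
# Toy metamodel: sample an "expensive" analysis code at a few design points
# and build a cheap quadratic surrogate through them.
def expensive_analysis(x):
    # Made-up stand-in for a detailed analysis code; pretend each
    # evaluation takes hours of CPU time.
    return 3.0 * x * x - 2.0 * x + 1.0

def fit_quadratic(samples):
    # Lagrange interpolation through three (x, y) samples; a real study
    # would fit a response surface or kriging model over many design points.
    (x0, y0), (x1, y1), (x2, y2) = samples
    def surrogate(x):
        l0 = (x - x1) * (x - x2) / ((x0 - x1) * (x0 - x2))
        l1 = (x - x0) * (x - x2) / ((x1 - x0) * (x1 - x2))
        l2 = (x - x0) * (x - x1) / ((x2 - x0) * (x2 - x1))
        return y0 * l0 + y1 * l1 + y2 * l2
    return surrogate

design = [0.0, 1.0, 2.0]                 # design of experiments: 3 code runs
model = fit_quadratic([(x, expensive_analysis(x)) for x in design])

# The surrogate matches the code at unsampled points and is orders of
# magnitude cheaper, so it can drive optimization loops or link domains.
assert abs(model(1.5) - expensive_analysis(1.5)) < 1e-9
```

The toy works perfectly only because the underlying function happens to be quadratic; the paper's warning about applying traditional statistical techniques to deterministic codes is precisely about what happens when the surrogate's form does not match the code's behavior.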

  18. DMD-based implementation of patterned optical filter arrays for compressive spectral imaging.

    PubMed

    Rueda, Hoover; Arguello, Henry; Arce, Gonzalo R

    2015-01-01

Compressive spectral imaging (CSI) captures multispectral imagery using fewer measurements than those required by traditional Shannon-Nyquist theory-based sensing procedures. CSI systems acquire coded and dispersed random projections of the scene rather than direct measurements of the voxels. To date, the coding procedure in CSI has been realized through the use of block-unblock coded apertures (CAs), commonly implemented as chrome-on-quartz photomasks. These apertures block or transmit the entire spectrum from the scene at given spatial locations, thus modulating the spatial characteristics of the scene. This paper extends the framework of CSI by replacing the traditional block-unblock photomasks with patterned optical filter arrays, referred to as colored coded apertures (CCAs). These, in turn, allow the source to be modulated not only spatially but spectrally as well, enabling more powerful coding strategies. The proposed CCAs are synthesized through linear combinations of low-pass, high-pass, and bandpass filters, paired with binary pattern ensembles realized by a digital micromirror device. The optical forward model of the proposed CSI architecture is presented along with a proof-of-concept implementation, which achieves noticeable improvements in the quality of the reconstruction.
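The difference between block-unblock and colored coded apertures can be shown numerically on a tiny made-up datacube (2x2 pixels with 3 spectral bands; all values are hypothetical and purely illustrative).

```python
# Toy sketch of the two coding schemes: a block-unblock aperture applies one
# binary mask to every spectral band, while a colored coded aperture (CCA)
# selects a different band pattern at each pixel, modulating the scene
# spectrally as well as spatially.
cube = [[[float(b + 1) for b in range(3)] for _ in range(2)] for _ in range(2)]

block_unblock = [[1, 0],                  # one binary mask shared by all bands
                 [0, 1]]
colored = [[[1, 0, 1], [0, 1, 0]],        # per-pixel, per-band mask
           [[1, 1, 0], [0, 0, 1]]]

def code_block(cube, mask):
    # Each pixel either passes all bands or blocks all bands.
    return [[[v * mask[i][j] for v in cube[i][j]]
             for j in range(2)] for i in range(2)]

def code_colored(cube, mask):
    # Each pixel passes an individual subset of bands.
    return [[[cube[i][j][b] * mask[i][j][b] for b in range(3)]
             for j in range(2)] for i in range(2)]

bu = code_block(cube, block_unblock)
cc = code_colored(cube, colored)
```

In the block-unblock result every coded pixel is all-or-nothing across bands, whereas the colored result mixes spatial and spectral selection, which is what enables the richer sensing matrices the paper describes.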

  19. Software for Better Documentation of Other Software

    NASA Technical Reports Server (NTRS)

    Pinedo, John

    2003-01-01

    The Literate Programming Extraction Engine is a Practical Extraction and Reporting Language- (PERL-)based computer program that facilitates and simplifies the implementation of a concept of self-documented literate programming in a fashion tailored to the typical needs of scientists. The advantage for the programmer is that documentation and source code are written side-by-side in the same file, reducing the likelihood that the documentation will be inconsistent with the code and improving the verification that the code performs its intended functions. The advantage for the user is the knowledge that the documentation matches the software because they come from the same file. This program unifies the documentation process for a variety of programming languages, including C, C++, and several versions of FORTRAN. This program can process the documentation in any markup language, and incorporates the LaTeX typesetting software. The program includes sample Makefile scripts for automating both the code-compilation (when appropriate) and documentation-generation processes into a single command-line statement. Also included are macro instructions for the Emacs display-editor software, making it easy for a programmer to toggle between editing in a code or a documentation mode.
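The side-by-side documentation/code idea in this abstract can be sketched with a toy extraction pass. This is illustrative Python, not the actual PERL engine, and the `%%` documentation marker is a hypothetical syntax invented for the example.

```python
# Toy literate-programming extraction: documentation and source code live in
# one file; a small pass splits them so the docs are generated from the very
# same file as the code, keeping the two consistent.
import re

source = """\
%% Computes the hypotenuse of a right triangle.
def hypot(a, b):
    return (a * a + b * b) ** 0.5
%% Demonstrates the function above.
print(hypot(3, 4))
"""

# Documentation lines are those starting with the (hypothetical) '%%' marker.
docs = [m.group(1) for m in re.finditer(r"^%% (.*)$", source, re.MULTILINE)]

# Everything else is the extracted source code.
code = "\n".join(l for l in source.splitlines() if not l.startswith("%% "))
```

A real engine would additionally route `docs` through a markup processor (the abstract mentions LaTeX) and hand `code` to the compiler, typically from a single Makefile target.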

  20. [Representation of knowledge in respiratory medicine: ontology should help the coding process].

    PubMed

    Blanc, F-X; Baneyx, A; Charlet, J; Housset, B

    2010-09-01

Access to medical knowledge is a major issue for health professionals and requires the development of terminologies. The objective of the reported work was to construct an ontology of respiratory medicine, i.e. an organized and formalized terminology composed of specific knowledge. The purpose is to help the medico-economic coding process and to represent the relevant knowledge about the patient. Our research covers the whole life cycle of an ontology, from the development of a methodology, to building the ontology from texts, to its use in an operational system. A computerized tool, based on the ontology, supports both medico-economic coding and graphical medical coding; the latter will be used to index hospital reports. Our ontology comprises 1913 concepts and contains all the knowledge included in the PMSI part of the SPLF thesaurus. Our tool has been evaluated and showed a recall of 80% and an accuracy of 85% for the medico-economic coding. The work presented in this paper justifies the approach that has been used. It must be continued on a larger scale to validate our coding principles and the possibility of querying patient reports for clinical research. Copyright © 2010. Published by Elsevier Masson SAS.

  1. Development of CFD model for augmented core tripropellant rocket engine

    NASA Astrophysics Data System (ADS)

    Jones, Kenneth M.

    1994-10-01

The Space Shuttle era has made major advances in technology and vehicle design to the point that the concept of a single-stage-to-orbit (SSTO) vehicle appears more feasible. NASA presently is conducting studies into the feasibility of certain advanced-concept rocket engines that could be utilized in an SSTO vehicle. One such concept is a tripropellant system, which burns kerosene and hydrogen initially and switches to hydrogen alone at altitude. This system attains a larger mass fraction because LOX-kerosene engines have a greater average propellant density and a greater thrust-to-weight ratio. This report describes the investigation to model the tripropellant augmented core engine. The physical aspects of the engine, the CFD code employed, and results of the numerical model for a single modular thruster are discussed.

  2. Coherent concepts are computed in the anterior temporal lobes.

    PubMed

    Lambon Ralph, Matthew A; Sage, Karen; Jones, Roy W; Mayberry, Emily J

    2010-02-09

    In his Philosophical Investigations, Wittgenstein famously noted that the formation of semantic representations requires more than a simple combination of verbal and nonverbal features to generate conceptually based similarities and differences. Classical and contemporary neuroscience has tended to focus upon how different neocortical regions contribute to conceptualization through the summation of modality-specific information. The additional yet critical step of computing coherent concepts has received little attention. Some computational models of semantic memory are able to generate such concepts by the addition of modality-invariant information coded in a multidimensional semantic space. By studying patients with semantic dementia, we demonstrate that this aspect of semantic memory becomes compromised following atrophy of the anterior temporal lobes and, as a result, the patients become increasingly influenced by superficial rather than conceptual similarities.

  3. Numerical model for learning concepts of streamflow simulation

    USGS Publications Warehouse

    DeLong, L.L.; ,

    1993-01-01

Numerical models are useful for demonstrating principles of open-channel flow. Such models can allow experimentation with cause-and-effect relations, testing concepts of physics and numerical techniques. Four PT is a numerical model written primarily as a teaching supplement for a course in one-dimensional streamflow modeling. Four PT options particularly useful in training include selection of governing equations, boundary-value perturbation, and user-programmable constraint equations. The model can simulate non-trivial concepts such as flow in complex interconnected channel networks, meandering channels with variable effective flow lengths, hydraulic structures defined by unique three-parameter relations, and density-driven flow. The model is coded in FORTRAN 77, and data encapsulation is used extensively to simplify maintenance and modification and to enhance the use of Four PT modules by other programs and programmers.

  4. The qualitative interview and challenges for clinicians undertaking research: a personal reflection.

    PubMed

    Fisher, Karin

    2011-01-01

Drawing on my doctoral experience, the aim of this article is to present my transition from practitioner to novice researcher and the challenges I encountered when undertaking qualitative in-depth interviews. The contents of my research diary were coded for words, sentences and paragraphs, which were then grouped into themes and subsequently organised into concepts and categories. The analysis identified one core category: 'changing states: learning to become a researcher'. The related categories included 'guessing responses', 'confusing boundaries' and 'revealing hidden concepts'. These concepts describe how I learnt to become a researcher and how I changed in the process. The paper provides practitioners with practical examples of my transition from practitioner to novice researcher, and I offer some tips for practitioners who wish to undertake research in their clinical role.

  5. Toward a New Theory for Selecting Instructional Visuals.

    ERIC Educational Resources Information Center

    Croft, Richard S.; Burton, John K.

    This paper provides a rationale for the selection of illustrations and visual aids for the classroom. The theories that describe the processing of visuals are dual coding theory and cue summation theory. Concept attainment theory offers a basis for selecting which cues are relevant for any learning task which includes a component of identification…

  6. The Representation of Abstract Words: Why Emotion Matters

    ERIC Educational Resources Information Center

    Kousta, Stavroula-Thaleia; Vigliocco, Gabriella; Vinson, David P.; Andrews, Mark; Del Campo, Elena

    2011-01-01

    Although much is known about the representation and processing of concrete concepts, knowledge of what abstract semantics might be is severely limited. In this article we first address the adequacy of the 2 dominant accounts (dual coding theory and the context availability model) put forward in order to explain representation and processing…

  7. Best interests of adults who lack capacity part 2: key considerations.

    PubMed

    Griffith, Richard

    Last month's article discussed the key concepts underpinning the notion of best interests. In this article the author discusses the requirements for determining the best interests of an adult who lacks capacity under the provisions of the Mental Capacity Act 2005 and its code of practice (Department for Constitutional Affairs 2007).

  8. Jere Brophy: An Appreciation

    ERIC Educational Resources Information Center

    Rosenshine, Barak

    2015-01-01

    The first generations of researchers on classroom instruction were the pioneers who developed the term, categories, and concepts that were used to view and code what was happening during classroom lessons. The five pioneers in this first wave were Ned Flanders, Arno Bellack, B.O. Smith, Don Medley, and Harold Mitzel. Each of these pioneers used…

  9. Mobile Konami Codes: Analysis of Android Malware Services Utilizing Sensor and Resource-Based State Changes

    DTIC Science & Technology

    2015-03-01

our focus will remain on Android rather than being all-inclusive of others such as iOS, Blackberry 10, and Windows Phone. The proof-of-concept...the attack surface for malicious applications to compromise vulnerable Services grows. Additionally, Services also have a life cycle with

  10. Overcoming Misconceptions in Neurophysiology Learning: An Approach Using Color-Coded Animations

    ERIC Educational Resources Information Center

    Guy, Richard

    2012-01-01

    Anyone who has taught neurophysiology would be aware of recurring concepts that students find difficult to understand. However, a greater problem is the development of misconceptions that may be difficult to change. For example, one common misconception is that action potentials pass directly across chemical synapses. Difficulties may be…

  11. 77 FR 40338 - Announcing Revised Draft Federal Information Processing Standard (FIPS) 201-2, Personal Identity...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-07-09

    ... may be sent to: Chief, Computer Security Division, Information Technology Laboratory, ATTN: Comments... introduces the concept of a virtual contact interface, over which all functionality of the PIV Card is... Laboratory Programs. [FR Doc. 2012-16725 Filed 7-6-12; 8:45 am] BILLING CODE 3510-13-P ...

  12. Eleven Years of Primary Health Care Delivery in an Academic Nursing Center.

    ERIC Educational Resources Information Center

    Hildebrandt, Eugenie; Baisch, Mary Jo; Lundeen, Sally P.; Bell-Calvin, Jean; Kelber, Sheryl

    2003-01-01

Client visits to an academic community nursing center (n=25,495) were coded and analyzed. Results show expansion of nursing practice and services, strong case management, and management of illness care. The usefulness of a computerized clinical documentation system and of the Lundeen conceptual model of community nursing care was demonstrated.…

  13. Imagination in School Children's Choice of Their Learning Environment: An Australian Study

    ERIC Educational Resources Information Center

    Bland, Derek; Sharma-Brymer, Vinathe

    2012-01-01

    A visual research project addressed school children's concepts of ideal learning environments. Drawings and accompanying narratives were collected from Year 5 and Year 6 children in nine Queensland primary schools. The 133 submissions were analysed and coded to develop themes, identify key features and consider the uses of imagination. The…

  14. Towards Just-In-Time Partial Evaluation of Prolog

    NASA Astrophysics Data System (ADS)

    Bolz, Carl Friedrich; Leuschel, Michael; Rigo, Armin

We introduce a just-in-time specializer for Prolog. Just-in-time specialization attempts to unify the concepts and benefits of partial evaluation (PE) and just-in-time (JIT) compilation. It is a variant of PE that occurs purely at runtime, which lazily generates residual code and is constantly driven by runtime feedback.
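The idea of generating residual code at runtime can be illustrated with a toy specializer, here in Python rather than Prolog: the exponent of a power function is known at specialization time, so the loop is unfolded and only straight-line residual code remains. This is a generic partial-evaluation sketch, not the paper's lazy, feedback-driven specializer.

```python
def specialize_power(n):
    """Generate a residual power function for a fixed exponent n.

    A toy partial evaluation: the loop over n runs at specialization
    time, leaving straight-line residual code that is compiled at
    runtime, as a JIT specializer would.
    """
    # Unfold the loop now, building the residual expression.
    expr = "1"
    for _ in range(n):
        expr = f"({expr} * x)"
    code = f"def power_{n}(x):\n    return {expr}\n"
    namespace = {}
    exec(code, namespace)  # compile the residual code at runtime
    return namespace[f"power_{n}"]

power_3 = specialize_power(3)
```

Calling `power_3(x)` evaluates `((1 * x) * x) * x` with no loop left, which is the classic static/dynamic split partial evaluation exploits.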

  15. Continuity and Change in the Measurement of Infant Attachment: Comment on Fraley and Spieker (2003).

    ERIC Educational Resources Information Center

    Cassidy, Jude

    2003-01-01

    Highlights usefulness of the categorical approach to measuring infant attachment by reviewing some major advances in the field that have been fostered by that approach. Advances include identification of the disorganized attachment group, development of the concept of conditional behavior strategies, creation of systems for coding attachment…

  16. "I Take Engineering with Me": Epistemological Transitions across an Engineering Curriculum

    ERIC Educational Resources Information Center

    Winberg, Christine; Winberg, Simon; Jacobs, Cecilia; Garraway, James; Engel-Hills, Penelope

    2016-01-01

    In this paper we study epistemological transitions across an intended engineering curriculum and recommend strategies to assist students in attaining the increasingly complex concepts and insights that are necessary for transition to advanced levels of study. We draw on Legitimation Code Theory [Maton, Karl. 2014, "Knowledge and Knowers:…

  17. An Exploration and Analysis of the Relationships among Object Oriented Programming, Hypermedia, and Hypertalk.

    ERIC Educational Resources Information Center

    Milet, Lynn K.; Harvey, Francis A.

    Hypermedia and object oriented programming systems (OOPs) represent examples of "open" computer environments that allow the user access to parts of the code or operating system. Both systems share fundamental intellectual concepts (objects, messages, methods, classes, and inheritance), so that an understanding of hypermedia can help in…

  18. Numerical stability of the error diffusion concept

    NASA Astrophysics Data System (ADS)

    Weissbach, Severin; Wyrowski, Frank

    1992-10-01

The error diffusion algorithm is an easily implementable means to handle nonlinearities in signal processing, e.g. in picture binarization and in the coding of diffractive elements. The numerical stability of the algorithm depends on the choice of the diffusion weights. A criterion for the stability of the algorithm is presented and evaluated for some examples.
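A minimal one-dimensional version of the algorithm makes the role of the diffusion weight concrete: each sample's quantization error is scaled by the weight and carried into the next sample. The threshold and weight values below are illustrative; the paper's stability criterion concerns more general weight choices.

```python
def error_diffusion_1d(signal, weight=1.0):
    """Binarize a 1-D signal by diffusing quantization error forward.

    `weight` scales how much of each quantization error is pushed to
    the next sample; weights whose magnitude exceeds 1 can let the
    accumulated error grow without bound (numerical instability).
    """
    out, carried = [], 0.0
    for s in signal:
        corrected = s + carried
        q = 1.0 if corrected >= 0.5 else 0.0   # hard threshold
        carried = weight * (corrected - q)      # diffuse the residual
        out.append(q)
    return out

bits = error_diffusion_1d([0.75, 0.75, 0.75, 0.75])
```

For the constant input 0.75 the output pattern carries ones three-quarters of the time, so the binarized signal preserves the local mean of the original.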

  19. Quantum Steganography and Quantum Error-Correction

    ERIC Educational Resources Information Center

    Shaw, Bilal A.

    2010-01-01

    Quantum error-correcting codes have been the cornerstone of research in quantum information science (QIS) for more than a decade. Without their conception, quantum computers would be a footnote in the history of science. When researchers embraced the idea that we live in a world where the effects of a noisy environment cannot completely be…

  20. The representation of abstract words: why emotion matters.

    PubMed

    Kousta, Stavroula-Thaleia; Vigliocco, Gabriella; Vinson, David P; Andrews, Mark; Del Campo, Elena

    2011-02-01

    Although much is known about the representation and processing of concrete concepts, knowledge of what abstract semantics might be is severely limited. In this article we first address the adequacy of the 2 dominant accounts (dual coding theory and the context availability model) put forward in order to explain representation and processing differences between concrete and abstract words. We find that neither proposal can account for experimental findings and that this is, at least partly, because abstract words are considered to be unrelated to experiential information in both of these accounts. We then address a particular type of experiential information, emotional content, and demonstrate that it plays a crucial role in the processing and representation of abstract concepts: Statistically, abstract words are more emotionally valenced than are concrete words, and this accounts for a residual latency advantage for abstract words, when variables such as imageability (a construct derived from dual coding theory) and rated context availability are held constant. We conclude with a discussion of our novel hypothesis for embodied abstract semantics. (c) 2010 APA, all rights reserved.

  1. Influence of culture on tripartite self-concept development in adolescence: a comparison between Han and Uyghur cultures.

    PubMed

    Abdukeram, Ziwida; Mamat, Marhaba; Luo, Wei; Wu, Yanhong

    2015-02-01

This study investigated the development of cultural variability in interdependent self-construal by comparing differences in the tripartite self-concept of adolescent samples from the Han and Uyghur cultures. Participants (460 males, 522 females; M age = 16.3 yr., SD = 4.8) in the sub-phases of pre-, early-, mid-, late- and post-adolescence were asked to complete the revised Twenty Statements Test, and the items generated by the participants were coded into private, relational, and collective self-statements. The private self-statements were further differentiated by personal and social orientation, and the relational self-statements were further coded into family and friend focus. The relational aspect of an individual's self, or personal relationships, became increasingly important with age in the Han cultural group, whereas the collective aspect of an individual's self, or social identity, became increasingly important with age in the Uyghur cultural group. These findings seem to show the development of differences between relational and collective interdependent self-construals. Furthermore, they emphasize the need for further research into the development of within-culture differences in self-construal.

  2. The medical responsibility: current view from the Council of Physicians side.

    PubMed

    Squifflet, J P

    2003-04-01

Medical responsibility was clearly defined in Royal Decree no. 78 of November 11, 1967 concerning medical practice. Moreover, several articles of the Ethical Code (Code de Déontologie) have clarified certain social and economic responsibilities in medical practice (articles 99 to 103) and the quality of patient care (article 36). The National Council has also published at least 31 advisory opinions addressing daily realities and growing insecurity. That atmosphere stems from the jurisprudence, increasing responsibility-insurance fees, an obligation of results instead of means, and the proposed patient rights law. That proposal is currently dissociated from other projects, such as an update of the rules on medical responsibility and/or a no-fault indemnity. Therefore, there is a current need to develop written patient information and to use informed consent forms for risky surgical procedures. Before recognizing the no-fault concept with indemnity, it is necessary to review the coverage of the responsibility insurance, educate medical doctors in the no-fault concept, study the mode of compensation for therapeutic hazards, and differentiate the objective and subjective parts of the patient's chart.

  3. Trajectories for High Specific Impulse High Specific Power Deep Space Exploration

    NASA Technical Reports Server (NTRS)

    Polsgrove, Tara; Adams, Robert B.; Brady, Hugh J. (Technical Monitor)

    2002-01-01

    Flight times and deliverable masses for electric and fusion propulsion systems are difficult to approximate. Numerical integration is required for these continuous thrust systems. Many scientists are not equipped with the tools and expertise to conduct interplanetary and interstellar trajectory analysis for their concepts. Several charts plotting the results of well-known trajectory simulation codes were developed and are contained in this paper. These charts illustrate the dependence of time of flight and payload ratio on jet power, initial mass, specific impulse and specific power. These charts are intended to be a tool by which people in the propulsion community can explore the possibilities of their propulsion system concepts. Trajectories were simulated using the tools VARITOP and IPOST. VARITOP is a well known trajectory optimization code that involves numerical integration based on calculus of variations. IPOST has several methods of trajectory simulation; the one used in this paper is Cowell's method for full integration of the equations of motion. An analytical method derived in the companion paper was also evaluated. The accuracy of this method is discussed in the paper.

  4. Simulation of Trajectories for High Specific Impulse Deep Space Exploration

    NASA Technical Reports Server (NTRS)

    Polsgrove, Tara; Adams, Robert B.; Brady, Hugh J. (Technical Monitor)

    2002-01-01

    Difficulties in approximating flight times and deliverable masses for continuous thrust propulsion systems have complicated comparison and evaluation of proposed propulsion concepts. These continuous thrust propulsion systems are of interest to many groups, not the least of which are the electric propulsion and fusion communities. Several charts plotting the results of well-known trajectory simulation codes were developed and are contained in this paper. These charts illustrate the dependence of time of flight and payload ratio on jet power, initial mass, specific impulse and specific power. These charts are intended to be a tool by which people in the propulsion community can explore the possibilities of their propulsion system concepts. Trajectories were simulated using the tools VARITOP and IPOST. VARITOP is a well known trajectory optimization code that involves numerical integration based on calculus of variations. IPOST has several methods of trajectory simulation; the one used in this paper is Cowell's method for full integration of the equations of motion. The analytical method derived in the companion paper was also used to simulate the trajectory. The accuracy of this method is discussed in the paper.
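Cowell's method, as used in IPOST, is simply direct numerical integration of the full equations of motion. The sketch below integrates a planar orbit with a small constant tangential thrust in normalized units (mu = 1); all parameter values are illustrative and are not drawn from VARITOP or IPOST.

```python
import math

def cowell_step(state, dt, mu=1.0, accel=0.0):
    """One RK4 step of the full equations of motion (Cowell's method),
    with a constant tangential low-thrust acceleration `accel`.
    Units are normalized (mu = 1); the values are illustrative only."""
    def deriv(s):
        x, y, vx, vy = s
        r = math.hypot(x, y)
        v = math.hypot(vx, vy) or 1.0
        # central-body gravity plus thrust along the velocity vector
        ax = -mu * x / r**3 + accel * vx / v
        ay = -mu * y / r**3 + accel * vy / v
        return (vx, vy, ax, ay)
    k1 = deriv(state)
    k2 = deriv(tuple(s + 0.5 * dt * k for s, k in zip(state, k1)))
    k3 = deriv(tuple(s + 0.5 * dt * k for s, k in zip(state, k2)))
    k4 = deriv(tuple(s + dt * k for s, k in zip(state, k3)))
    return tuple(s + dt / 6 * (a + 2 * b + 2 * c + d)
                 for s, a, b, c, d in zip(state, k1, k2, k3, k4))

# A circular orbit at r = 1 with small continuous thrust spirals outward.
state = (1.0, 0.0, 0.0, 1.0)
for _ in range(2000):
    state = cowell_step(state, dt=0.01, accel=0.01)
radius = math.hypot(state[0], state[1])
```

This is exactly why such trajectories resist closed-form approximation: the flight time and deliverable mass fall out only after integrating the thrust term along the whole arc.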

  5. The Potential of Different Concepts of Fast Breeder Reactor for the French Fleet Renewal

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Massara, Simone; Tetart, Philippe; Lecarpentier, David

    2006-07-01

The performances of different concepts of Fast Breeder Reactor (Na-cooled, He-cooled and Pb-cooled FBR) for the current French fleet renewal are analyzed in the framework of a transition scenario to a 100% FBR fleet at the end of the 21st century. Firstly, the modeling of these three FBR types by means of a semi-analytical approach in TIRELIRE-STRATEGIE, the EDF fuel cycle simulation code, is presented, together with some validation elements against ERANOS, the French reference code system for neutronic FBR analysis (CEA). Afterwards, performance comparisons are made in terms of maximum deployable power, natural uranium consumption and waste production. The results show that the FBR maximum deployable capacity, independently of the FBR technology, is highly sensitive to the fuel cycle options, such as the spent nuclear fuel cooling time or the Minor Actinides management strategy. Thus, some of the key parameters defining the dynamics of FBR deployment are highlighted, to inform the orientation of R&D in the development and optimization of these systems. (authors)

  6. Metamodels for Computer-Based Engineering Design: Survey and Recommendations

    NASA Technical Reports Server (NTRS)

    Simpson, Timothy W.; Peplinski, Jesse; Koch, Patrick N.; Allen, Janet K.

    1997-01-01

The use of statistical techniques to build approximations of expensive computer analysis codes pervades much of today's engineering design. These statistical approximations, or metamodels, are used to replace the actual expensive computer analyses, facilitating multidisciplinary, multiobjective optimization and concept exploration. In this paper we review several of these techniques, including design of experiments, response surface methodology, Taguchi methods, neural networks, inductive learning, and kriging. We survey their existing applications in engineering design and then address the dangers of applying traditional statistical techniques to approximate deterministic computer analysis codes. We conclude with recommendations for the appropriate use of statistical approximation techniques in given situations and for how common pitfalls can be avoided.
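The core idea of a metamodel can be shown in a few lines: sample the expensive analysis code at a small number of points, fit a cheap approximation, and query the approximation instead. The sketch below builds a one-dimensional quadratic surrogate by Lagrange interpolation; real response-surface methods fit regressions over designed experiments in many dimensions.

```python
def quadratic_metamodel(f, x0, x1, x2):
    """Build a quadratic surrogate of an expensive function f from
    three sample evaluations (Lagrange interpolation). A minimal
    stand-in for the response-surface metamodels the survey reviews."""
    y0, y1, y2 = f(x0), f(x1), f(x2)   # the only "expensive" calls
    def surrogate(x):
        l0 = (x - x1) * (x - x2) / ((x0 - x1) * (x0 - x2))
        l1 = (x - x0) * (x - x2) / ((x1 - x0) * (x1 - x2))
        l2 = (x - x0) * (x - x1) / ((x2 - x0) * (x2 - x1))
        return y0 * l0 + y1 * l1 + y2 * l2
    return surrogate

# A surrogate of a quadratic "analysis code" reproduces it exactly;
# for a general code it only approximates, which is the survey's point.
expensive = lambda x: 3 * x * x - 2 * x + 1
model = quadratic_metamodel(expensive, 0.0, 1.0, 2.0)
```

The danger the paper warns about is visible even here: the surrogate is exact only because the underlying code happens to be quadratic, and deterministic codes violate the random-error assumptions behind classical regression diagnostics.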

  7. Visual Information Processing for Television and Telerobotics

    NASA Technical Reports Server (NTRS)

    Huck, Friedrich O. (Editor); Park, Stephen K. (Editor)

    1989-01-01

    This publication is a compilation of the papers presented at the NASA conference on Visual Information Processing for Television and Telerobotics. The conference was held at the Williamsburg Hilton, Williamsburg, Virginia on May 10 to 12, 1989. The conference was sponsored jointly by NASA Offices of Aeronautics and Space Technology (OAST) and Space Science and Applications (OSSA) and the NASA Langley Research Center. The presentations were grouped into three sessions: Image Gathering, Coding, and Advanced Concepts; Systems; and Technologies. The program was organized to provide a forum in which researchers from industry, universities, and government could be brought together to discuss the state of knowledge in image gathering, coding, and processing methods.

  8. Protection of data carriers using secure optical codes

    NASA Astrophysics Data System (ADS)

    Peters, John A.; Schilling, Andreas; Staub, René; Tompkin, Wayne R.

    2006-02-01

    Smartcard technologies, combined with biometric-enabled access control systems, are required for many high-security government ID card programs. However, recent field trials with some of the most secure biometric systems have indicated that smartcards are still vulnerable to well equipped and highly motivated counterfeiters. In this paper, we present the Kinegram Secure Memory Technology which not only provides a first-level visual verification procedure, but also reinforces the existing chip-based security measures. This security concept involves the use of securely-coded data (stored in an optically variable device) which communicates with the encoded hashed information stored in the chip memory via a smartcard reader device.
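The chip-versus-optical-code check described above amounts to comparing a hash of the optically read payload with hashed information stored in chip memory. The sketch below shows that comparison with SHA-256; the hash function, field layout, and names are assumptions, since the abstract does not specify the Kinegram encoding.

```python
import hashlib

def verify_card(ovd_payload: bytes, chip_stored_hash: bytes) -> bool:
    """Check that data read from the optically variable device (OVD)
    matches the hash recorded in the smartcard chip. SHA-256 and the
    payload format are illustrative assumptions, not the actual
    Kinegram Secure Memory scheme."""
    return hashlib.sha256(ovd_payload).digest() == chip_stored_hash

payload = b"ID:12345;NAME:DOE"
stored = hashlib.sha256(payload).digest()      # written at issuance
ok = verify_card(payload, stored)              # genuine card
tampered = verify_card(b"ID:99999;NAME:DOE", stored)
```

Binding the two carriers this way means a counterfeiter must forge both the optically variable device and the chip contents consistently, rather than either one alone.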

  9. A Graphical User-Interface for Propulsion System Analysis

    NASA Technical Reports Server (NTRS)

    Curlett, Brian P.; Ryall, Kathleen

    1992-01-01

    NASA LeRC uses a series of computer codes to calculate installed propulsion system performance and weight. The need to evaluate more advanced engine concepts with a greater degree of accuracy has resulted in an increase in complexity of this analysis system. Therefore, a graphical user interface was developed to allow the analyst to more quickly and easily apply these codes. The development of this interface and the rationale for the approach taken are described. The interface consists of a method of pictorially representing and editing the propulsion system configuration, forms for entering numerical data, on-line help and documentation, post processing of data, and a menu system to control execution.

  10. A graphical user-interface for propulsion system analysis

    NASA Technical Reports Server (NTRS)

    Curlett, Brian P.; Ryall, Kathleen

    1993-01-01

    NASA LeRC uses a series of computer codes to calculate installed propulsion system performance and weight. The need to evaluate more advanced engine concepts with a greater degree of accuracy has resulted in an increase in complexity of this analysis system. Therefore, a graphical user interface was developed to allow the analyst to more quickly and easily apply these codes. The development of this interface and the rationale for the approach taken are described. The interface consists of a method of pictorially representing and editing the propulsion system configuration, forms for entering numerical data, on-line help and documentation, post processing of data, and a menu system to control execution.

  11. Common spaceborne multicomputer operating system and development environment

    NASA Technical Reports Server (NTRS)

    Craymer, L. G.; Lewis, B. F.; Hayes, P. J.; Jones, R. L.

    1994-01-01

    A preliminary technical specification for a multicomputer operating system is developed. The operating system is targeted for spaceborne flight missions and provides a broad range of real-time functionality, dynamic remote code-patching capability, and system fault tolerance and long-term survivability features. Dataflow concepts are used for representing application algorithms. Functional features are included to ensure real-time predictability for a class of algorithms which require data-driven execution on an iterative steady state basis. The development environment supports the development of algorithm code, design of control parameters, performance analysis, simulation of real-time dataflow applications, and compiling and downloading of the resulting application.

  12. Use of General-purpose Negation Detection to Augment Concept Indexing of Medical Documents

    PubMed Central

    Mutalik, Pradeep G.; Deshpande, Aniruddha; Nadkarni, Prakash M.

    2001-01-01

    Objectives: To test the hypothesis that most instances of negated concepts in dictated medical documents can be detected by a strategy that relies on tools developed for the parsing of formal (computer) languages—specifically, a lexical scanner (“lexer”) that uses regular expressions to generate a finite state machine, and a parser that relies on a restricted subset of context-free grammars, known as LALR(1) grammars. Methods: A diverse training set of 40 medical documents from a variety of specialties was manually inspected and used to develop a program (Negfinder) that contained rules to recognize a large set of negated patterns occurring in the text. Negfinder's lexer and parser were developed using tools normally used to generate programming language compilers. The input to Negfinder consisted of medical narrative that was preprocessed to recognize UMLS concepts: the text of a recognized concept had been replaced with a coded representation that included its UMLS concept ID. The program generated an index with one entry per instance of a concept in the document, where the presence or absence of negation of that concept was recorded. This information was used to mark up the text of each document by color-coding it to make it easier to inspect. The parser was then evaluated in two ways: 1) a test set of 60 documents (30 discharge summaries, 30 surgical notes) marked-up by Negfinder was inspected visually to quantify false-positive and false-negative results; and 2) a different test set of 10 documents was independently examined for negatives by a human observer and by Negfinder, and the results were compared. Results: In the first evaluation using marked-up documents, 8,358 instances of UMLS concepts were detected in the 60 documents, of which 544 were negations detected by the program and verified by human observation (true-positive results, or TPs). 
Thirteen instances were wrongly flagged as negated (false-positive results, or FPs), and the program missed 27 instances of negation (false-negative results, or FNs), yielding a sensitivity of 95.3 percent and a specificity of 97.7 percent. In the second evaluation using independent negation detection, 1,869 concepts were detected in 10 documents, with 135 TPs, 12 FPs, and 6 FNs, yielding a sensitivity of 95.7 percent and a specificity of 91.8 percent. One of the words “no,” “denies/denied,” “not,” or “without” was present in 92.5 percent of all negations. Conclusions: Negation of most concepts in medical narrative can be reliably detected by a simple strategy. The reliability of detection depends on several factors, the most important being the accuracy of concept matching. PMID:11687566
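Since the conclusions note that a handful of trigger words account for over 92 percent of negations, a much simpler stand-in than Negfinder's lexer and LALR(1) parser can illustrate the idea: a regular expression that flags a pre-tagged concept appearing shortly after a negation trigger. The `[CUI...]` tagging convention and the 40-character window are assumptions for this sketch, not Negfinder's actual rules.

```python
import re

# Toy stand-in for Negfinder: flag concepts (pre-tagged as [CUI...])
# that follow a common negation trigger within a short window,
# stopping at sentence-like punctuation.
NEGATION = re.compile(
    r"\b(no|not|without|denies|denied)\b[^.;:]{0,40}?\[(CUI\d+)\]",
    re.IGNORECASE)

def negated_concepts(text):
    """Return the set of concept IDs judged to be negated."""
    return {m.group(2) for m in NEGATION.finditer(text)}

note = "Patient denies [CUI001] and reports [CUI002]. No history of [CUI003]."
neg = negated_concepts(note)
```

A real system needs the grammar-based machinery to handle conjunction scope, double negation, and pseudo-negations ("not only..."), which is precisely where the paper's false positives and negatives arise.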

  13. Error-Rate Bounds for Coded PPM on a Poisson Channel

    NASA Technical Reports Server (NTRS)

    Moision, Bruce; Hamkins, Jon

    2009-01-01

    Equations for computing tight bounds on error rates for coded pulse-position modulation (PPM) on a Poisson channel at high signal-to-noise ratio have been derived. These equations and elements of the underlying theory are expected to be especially useful in designing codes for PPM optical communication systems. The equations and the underlying theory apply, more specifically, to a case in which a) At the transmitter, a linear outer code is concatenated with an inner code that includes an accumulator and a bit-to-PPM-symbol mapping (see figure) [this concatenation is known in the art as "accumulate-PPM" (abbreviated "APPM")]; b) The transmitted signal propagates on a memoryless binary-input Poisson channel; and c) At the receiver, near-maximum-likelihood (ML) decoding is effected through an iterative process. Such a coding/modulation/decoding scheme is a variation on the concept of turbo codes, which have complex structures, such that an exact analytical expression for the performance of a particular code is intractable. However, techniques for accurately estimating the performances of turbo codes have been developed. The performance of a typical turbo code includes (1) a "waterfall" region consisting of a steep decrease of error rate with increasing signal-to-noise ratio (SNR) at low to moderate SNR, and (2) an "error floor" region with a less steep decrease of error rate with increasing SNR at moderate to high SNR. The techniques used heretofore for estimating performance in the waterfall region have differed from those used for estimating performance in the error-floor region. For coded PPM, prior to the present derivations, equations for accurate prediction of the performance of coded PPM at high SNR did not exist, so that it was necessary to resort to time-consuming simulations in order to make such predictions. The present derivation makes it unnecessary to perform such time-consuming simulations.
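The analytical bounds replace simulations, but a small Monte Carlo for uncoded PPM on a Poisson channel shows what is being bounded: the signal slot receives a mean of ns + nb photons, the other slots nb, and the detector picks the slot with the largest count. All parameter values are illustrative, and this toy omits the accumulator and outer code of the APPM scheme.

```python
import math
import random

def ppm_symbol_error_rate(M=4, ns=5.0, nb=0.1, trials=20000, seed=1):
    """Monte Carlo symbol-error estimate for uncoded M-ary PPM on a
    memoryless Poisson channel. A toy baseline for comparison, not
    the paper's analytical bound for coded PPM."""
    rng = random.Random(seed)
    def poisson(lam):
        # Knuth's multiplication method; adequate for small means
        limit, k, p = math.exp(-lam), 0, 1.0
        while True:
            p *= rng.random()
            if p <= limit:
                return k
            k += 1
    errors = 0
    for _ in range(trials):
        counts = [poisson(ns + nb)] + [poisson(nb) for _ in range(M - 1)]
        # count a tie with the signal slot as an error (conservative)
        if max(counts[1:]) >= counts[0]:
            errors += 1
    return errors / trials

ser = ppm_symbol_error_rate()
```

Estimating the far smaller error rates of the coded system this way is exactly the time-consuming simulation work that the derived high-SNR bounds are meant to eliminate.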

  14. Concept analysis of moral courage in nursing: A hybrid model.

    PubMed

    Sadooghiasl, Afsaneh; Parvizy, Soroor; Ebadi, Abbas

    2018-02-01

Moral courage is one of the most fundamental virtues in the nursing profession; however, little attention has been paid to it, and no exact and clear definition of moral courage has been available. This study was carried out to define and clarify the concept in the nursing profession. The study used a hybrid model of concept analysis comprising three phases: a theoretical phase, a field work phase, and a final analysis phase. To find relevant literature, electronic searches of established databases were conducted using keywords related to the concept of courage. Field work data were collected over an 11-month period from 2013 to 2014; in the field work phase, in-depth interviews were performed with 10 nurses. Conventional content analysis following the Graneheim and Lundman stages was used in both the theoretical and field work phases, and the results were combined in the final analysis phase. Ethical considerations: Permission for this study was obtained from the ethics committee of Tehran University of Medical Sciences, and oral and written informed consent was received from the participants. Of the 750 titles retrieved in the theoretical phase, 26 texts were analyzed, yielding 494 codes from the text analysis and 226 codes from the interview analysis. The literature review in the theoretical phase revealed the features of being inherent and transcendental and of having a difficult nature. The field work phase added the features of moral self-actualization, rationalism, spiritual beliefs, and scientific-professional qualification. Moral courage is a pure and prominent characteristic of human beings. The antecedents of moral courage include model orientation, model acceptance, rationalism, individual excellence, acquiring academic and professional qualification, spiritual beliefs, organizational support, organizational repression, and internal and external personal barriers. Professional excellence resulting from moral courage can be crystallized in the provision of professional care, the creation of peace of mind, and the nurse's decision making and proper functioning.

  15. Coupling hydrodynamics with comoving frame radiative transfer. I. A unified approach for OB and WR stars

    NASA Astrophysics Data System (ADS)

    Sander, A. A. C.; Hamann, W.-R.; Todt, H.; Hainich, R.; Shenar, T.

    2017-07-01

    Context. For more than two decades, stellar atmosphere codes have been used to derive the stellar and wind parameters of massive stars. Although they have become a powerful tool and sufficiently reproduce the observed spectral appearance, they can hardly be used for more than measuring parameters. One major obstacle is the inconsistency between the calculated radiation field and the wind stratification due to the usage of prescribed mass-loss rates and wind-velocity fields. Aims: We present the concepts for a new generation of hydrodynamically consistent non-local thermodynamic equilibrium (non-LTE) stellar atmosphere models that allow for detailed studies of radiation-driven stellar winds. As a first demonstration, this new kind of model is applied to a massive O star. Methods: Based on earlier works, the PoWR code has been extended with the option to consistently solve the hydrodynamic equation together with the statistical equations and the radiative transfer in order to obtain a hydrodynamically consistent atmosphere stratification. In these models, the whole velocity field is iteratively updated together with an adjustment of the mass-loss rate. Results: The concepts for obtaining hydrodynamically consistent models using a comoving-frame radiative transfer are outlined. To provide a useful benchmark, we present a demonstration model, which was motivated to describe the well-studied O4 supergiant ζ Pup. The obtained stellar and wind parameters are within the current range of literature values. Conclusions: For the first time, the PoWR code has been used to obtain a hydrodynamically consistent model for a massive O star. This has been achieved by a profound revision of earlier concepts used for Wolf-Rayet stars. The velocity field is shaped by various elements contributing to the radiative acceleration, especially in the outer wind. The results further indicate that for denser winds, deviations from a standard β-law occur.
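
    The β-law mentioned in the conclusions is the standard parameterization of hot-star wind velocity fields against which such hydrodynamic models are compared. A minimal sketch (the parameter values are illustrative, not taken from the paper):

    ```python
    import numpy as np

    def beta_law(r, r_star, v_inf, beta):
        """Standard beta-law wind velocity: v(r) = v_inf * (1 - R_star/r)**beta."""
        return v_inf * (1.0 - r_star / r) ** beta

    # Illustrative numbers loosely typical of an O supergiant (assumed)
    r = np.linspace(1.01, 50.0, 500)                      # radius in stellar radii
    v = beta_law(r, r_star=1.0, v_inf=2250.0, beta=0.8)   # v_inf in km/s

    print(round(float(v[-1]), 1))  # velocity approaches v_inf far from the star
    ```

    The hydrodynamically consistent models replace this prescribed profile with one computed from the radiative acceleration, which is where the reported deviations from the β-law appear.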

  16. 'It means everyone should know their status': exploring lay conceptions of sickle cell trait and sickle cell trait screening among African Americans within middle reproductive age.

    PubMed

    Mayo-Gamble, Tilicia L; Barnes, Priscilla A; Cunningham Erves, Jennifer; Middlestadt, Susan E; Lin, Hsien-Chang

    2017-02-21

    This study examined the meaning of sickle cell trait and sickle cell trait screening from the lay perspective of African Americans. African Americans (N = 300), ages 18-35 and unaware of their sickle cell trait status, completed two open-ended questions from a larger survey. One question asked for their understanding of sickle cell trait; the other asked for their understanding of sickle cell trait screening. Content analysis occurred in two phases: (1) In vivo and holistic coding; and (2) focused coding. Four categories emerged illustrating lay conceptions of sickle cell trait: (1) Perceived as an illness; (2) Perceived recognition of the inheritance pattern of sickle cell trait; (3) Perceived lack of knowledge of sickle cell trait; and (4) Perceived importance of sickle cell trait. Five categories emerged illustrating lay conceptions of sickle cell trait screening: (1) Perceived recognition that screening means getting tested for sickle cell trait; (2) Perceived lack of knowledge of sickle cell trait screening; (3) Perceived health benefit of sickle cell trait screening; (4) Perceived importance of sickle cell trait screening; and (5) Perceived barriers to sickle cell trait screening. Sickle cell trait and sickle cell trait screening are concepts that are both regarded as important among this high-risk population. However, there is still misunderstanding concerning the hereditary nature and reproductive implications of sickle cell trait. Interventions seeking to improve communication on the need for sickle cell trait screening should begin by identifying what the population at large understands, knows and/or believes to improve their ability to make informed health decisions.

  17. The 'wayfinding' experience of family carers who learn to manage technical health procedures at home: a grounded theory study.

    PubMed

    McDonald, Janet; McKinlay, Eileen; Keeling, Sally; Levack, William

    2017-12-01

    With more care taking place in the home, family carers play an important role in supporting patients. Some family carers undertake technical health procedures generally managed by health professionals in hospital settings (e.g. managing a tracheostomy or enteral feeding). To explore how family carers learn to manage technical health procedures in order to help health professionals better understand and support this process. A grounded theory study using data from interviews with 26 New Zealand family carers who managed technical health procedures including nasogastric or gastrostomy feeding, stoma care, urinary catheterisation, tracheostomy management, intravenous therapy, diabetes management and complex wound dressings. Most (20 participants) were caring for their child and the remaining six for their spouse, parent or grandparent. Following grounded theory methods, each interview was coded soon after completion. Additional data were compared with existing material, and as analysis proceeded, initial codes were grouped into higher order concepts until a core concept was developed. Interviewing continued until no new ideas emerged and concepts were well defined. The core concept of 'wayfinding' indicates that the learning process for family carers is active, individualised and multi-influenced, developing over time as a response to lived experience. Health professional support was concentrated on the initial phase of carers' training, reducing and becoming more reactive as carers took responsibility for day-to-day management. Wayfinding involves self-navigation by carers, in contrast to patient navigator models which provide continuing professional assistance to patients receiving cancer or chronic care services. Wayfinding by carers raises questions about how carers should be best supported in their initial and ongoing learning as the management of these procedures changes over time. © 2017 Nordic College of Caring Science.

  18. Jupyter Notebooks as tools for interactive learning of Concepts in Structural Geology and efficient grading of exercises.

    NASA Astrophysics Data System (ADS)

    Niederau, Jan; Wellmann, Florian; Maersch, Jannik; Urai, Janos

    2017-04-01

    Programming is increasingly recognised as an important skill for geoscientists; however, the hurdle to jump into programming can be high for students with little or no experience. We present here teaching concepts based on Jupyter notebooks, which combine, in an intuitive way, formatted instruction text with code cells in a single environment. This integration allows for an exposure to programming on several levels: from a completely interactive presentation of content, where students require no or very limited programming experience, to highly complex geoscientific computations. We therefore consider these notebooks an ideal medium for presenting computational content to students in the field of geosciences. We show here how we use these notebooks to develop digital documents in Python for undergraduate students, who can then learn about basic concepts in structural geology via self-assessment. Such notebooks cover concepts such as the stress tensor, the strain ellipse, and the Mohr circle. Students can interactively change parameters, e.g. by using sliders, and immediately see the results. They can further experiment and extend the notebook by writing their own code within it. Jupyter notebooks for teaching purposes can be provided ready-to-use via online services, so students do not need to install additional software on their devices in order to work with them. We also use Jupyter notebooks for automatic grading of programming assignments in multiple lectures. An implemented workflow facilitates the generation and distribution of assignments, as well as the final grading. Compared to previous grading methods with a high percentage of repetitive manual grading, the implemented workflow proves to be much more time efficient.
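
    As an example of the kind of computation such a notebook cell can carry, the 2-D Mohr circle relations for the stress on an inclined plane fit in a few lines (a sketch with assumed principal stresses, not taken from the notebooks described above):

    ```python
    import numpy as np

    def stress_on_plane(sigma1, sigma2, theta):
        """Normal and shear stress on a plane whose normal makes angle theta
        (radians) with the sigma1 axis, from the 2-D Mohr circle relations."""
        center = 0.5 * (sigma1 + sigma2)   # center of the Mohr circle
        radius = 0.5 * (sigma1 - sigma2)   # radius of the Mohr circle
        sigma_n = center + radius * np.cos(2.0 * theta)
        tau = radius * np.sin(2.0 * theta)
        return sigma_n, tau

    # Principal stresses of 100 and 40 (arbitrary units); shear peaks at 45 degrees
    sigma_n, tau = stress_on_plane(100.0, 40.0, np.pi / 4.0)
    print(sigma_n, tau)
    ```

    In a notebook, `theta` would typically be bound to an interactive slider so the stress point travels around the circle as the student drags it.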

  19. An engineer's view on genetic information and biological evolution.

    PubMed

    Battail, Gérard

    2004-01-01

    We develop ideas on genome replication introduced in Battail [Europhys. Lett. 40 (1997) 343]. Starting with the hypothesis that the genome replication process uses error-correcting means, and the auxiliary one that nested codes are used to this end, we first review the concepts of redundancy and error-correcting codes. Then we show that these hypotheses imply that: distinct species exist with a hierarchical taxonomy, there is a trend of evolution towards complexity, and evolution proceeds by discrete jumps. At least the first two features above may be considered as biological facts so, in the absence of direct evidence, they provide an indirect proof in favour of the hypothesized error-correction system. The very high redundancy of genomes makes it possible. In order to explain how it is implemented, we suggest that soft codes and replication decoding, to be briefly described, are plausible candidates. Experimentally proven properties of long-range correlation of the DNA message substantiate this claim.
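
    The replication decoding mentioned above, recovering a message from several noisy copies, can be illustrated with the simplest possible redundancy scheme: a per-symbol majority vote over repeated copies. This toy example is a stand-in for the soft codes the author actually proposes, not a reconstruction of them:

    ```python
    from collections import Counter

    def majority_decode(copies):
        """Decode a message from several noisy copies by per-symbol majority
        vote, the simplest form of replication decoding."""
        return "".join(Counter(column).most_common(1)[0][0]
                       for column in zip(*copies))

    # Three hypothetical noisy replicas of the same sequence fragment
    copies = ["GATTACA", "GATTGCA", "CATTACA"]
    print(majority_decode(copies))  # → GATTACA
    ```

    Redundancy (here, three copies of every symbol) is what makes the correction possible, which is the role the paper assigns to the very high redundancy of genomes.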

  20. Dynamical beam manipulation based on 2-bit digitally-controlled coding metasurface.

    PubMed

    Huang, Cheng; Sun, Bo; Pan, Wenbo; Cui, Jianhua; Wu, Xiaoyu; Luo, Xiangang

    2017-02-08

    Recently, a concept of digital metamaterials has been proposed to manipulate field distribution through proper spatial mixtures of digital metamaterial bits. Here, we present a design of 2-bit digitally-controlled coding metasurface that can effectively modulate the scattered electromagnetic wave and realize different far-field beams. Each meta-atom of this metasurface integrates two pin diodes, and by tuning their operating states, the metasurface has four phase responses of 0, π/2, π, and 3π/2, corresponding to four basic digital elements "00", "01", "10", and "11", respectively. By designing the coding sequence of the above digital element array, the reflected beam can be arbitrarily controlled. The proposed 2-bit digital metasurface has been demonstrated to possess capability of achieving beam deflection, multi-beam and beam diffusion, and the dynamical switching of these different scattering patterns is completed by a programmable electric source.
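
    The code-to-phase mapping described in the abstract, and the way a coding sequence steers the reflected beam, can be sketched with a textbook 1-D array-factor calculation (element spacing and coding sequence are assumed for illustration; this is not the authors' design code):

    ```python
    import numpy as np

    # 2-bit digital elements and their reflection phases, as given in the abstract
    PHASE = {"00": 0.0, "01": np.pi / 2, "10": np.pi, "11": 3 * np.pi / 2}

    def array_factor(codes, d_over_lambda, theta):
        """Magnitude of the far-field array factor of a 1-D coding metasurface:
        sum the element reflections with coded phases and geometric path delays."""
        n = np.arange(len(codes))
        phases = np.array([PHASE[c] for c in codes])
        path = 2 * np.pi * d_over_lambda * n * np.sin(theta)
        return np.abs(np.sum(np.exp(1j * (phases + path))))

    # A repeated gradient sequence 00 01 10 11 deflects the beam off broadside
    seq = ["00", "01", "10", "11"] * 4
    thetas = np.linspace(-np.pi / 2, np.pi / 2, 2001)
    af = [array_factor(seq, 0.5, t) for t in thetas]
    print(round(float(np.degrees(thetas[int(np.argmax(af))])), 1))  # → -30.0
    ```

    With half-wavelength spacing, the π/2-per-element phase gradient satisfies sin θ = −1/2, i.e. a beam deflected to −30°; reprogramming the pin diodes rewrites `seq` and therefore the scattering pattern.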

  1. Tunable wavefront coded imaging system based on detachable phase mask: Mathematical analysis, optimization and underlying applications

    NASA Astrophysics Data System (ADS)

    Zhao, Hui; Wei, Jingxuan

    2014-09-01

    The key to the concept of tunable wavefront coding lies in detachable phase masks. Ojeda-Castaneda et al. (Progress in Electronics Research Symposium Proceedings, Cambridge, USA, July 5-8, 2010) described a typical design in which two components with cosinusoidal phase variation operate together to make the defocus sensitivity tunable. The present study proposes an improved design and makes three contributions: (1) A mathematical derivation based on the stationary phase method explains why the detachable phase mask of Ojeda-Castaneda et al. tunes the defocus sensitivity. (2) The mathematical derivations show that the effective bandwidth of the wavefront coded imaging system is also tunable, by making each component of the detachable phase mask move asymmetrically. An improved Fisher information-based optimization procedure was also designed to ascertain the optimal mask parameters corresponding to a specific bandwidth. (3) Possible applications of the tunable bandwidth are demonstrated by simulated imaging.
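
    Contribution (1) has a simple 1-D analogue via the product-to-sum identity: two identical cosinusoidal phase components displaced by ±s add up to a single cosinusoid whose amplitude, and hence the induced defocus sensitivity, is controlled by s. A sketch under that simplification (not the authors' exact mask):

    ```python
    import numpy as np

    def combined_phase(x, a, s):
        """Sum of two cosinusoidal phase plates shifted by +s and -s; by the
        product-to-sum identity this equals 2*a*cos(2*pi*s)*cos(2*pi*x)."""
        return a * np.cos(2 * np.pi * (x - s)) + a * np.cos(2 * np.pi * (x + s))

    x = np.linspace(-0.5, 0.5, 101)
    for s in (0.0, 0.125, 0.25):
        amplitude = float(np.max(np.abs(combined_phase(x, 1.0, s))))
        print(s, round(amplitude, 3))  # the amplitude shrinks as the shift grows
    ```

    At s = 0.25 the two components cancel entirely; sliding one component therefore tunes the modulation depth continuously, which is the mechanism behind the tunable defocus sensitivity.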

  2. Dynamical beam manipulation based on 2-bit digitally-controlled coding metasurface

    PubMed Central

    Huang, Cheng; Sun, Bo; Pan, Wenbo; Cui, Jianhua; Wu, Xiaoyu; Luo, Xiangang

    2017-01-01

    Recently, a concept of digital metamaterials has been proposed to manipulate field distribution through proper spatial mixtures of digital metamaterial bits. Here, we present a design of 2-bit digitally-controlled coding metasurface that can effectively modulate the scattered electromagnetic wave and realize different far-field beams. Each meta-atom of this metasurface integrates two pin diodes, and by tuning their operating states, the metasurface has four phase responses of 0, π/2, π, and 3π/2, corresponding to four basic digital elements “00”, “01”, “10”, and “11”, respectively. By designing the coding sequence of the above digital element array, the reflected beam can be arbitrarily controlled. The proposed 2-bit digital metasurface has been demonstrated to possess capability of achieving beam deflection, multi-beam and beam diffusion, and the dynamical switching of these different scattering patterns is completed by a programmable electric source. PMID:28176870

  3. The photo-colorimetric space as a medium for the representation of spatial data

    NASA Technical Reports Server (NTRS)

    Kraiss, K. Friedrich; Widdel, Heino

    1989-01-01

    Spatial displays and instruments are usually used in the context of vehicle guidance, but it is hard to find applicable spatial formats in information retrieval and interaction systems. Human interaction with spatial data structures and the applicability of the CIE color space to improve dialogue transparency are discussed. A proposal is made to use the color space to code spatially represented data. The semantic distances of the categories of dialogue structures or, more generally, of database structures are determined empirically. Subsequently the distances are transformed and depicted in the color space. The concept is demonstrated for a car diagnosis system, where the category cooling system could, e.g., be coded in blue and the category ignition system in red. In this way, a correspondence between color and semantic distance is achieved. Subcategories can be coded as luminance differences within the color space.

  4. End-to-end imaging information rate advantages of various alternative communication systems

    NASA Technical Reports Server (NTRS)

    Rice, R. F.

    1982-01-01

    The efficiencies of various deep space communication systems that are required to transmit both imaging data and a typically error-sensitive class of data called general science and engineering (gse) data are compared. The approach jointly treats the imaging and gse transmission problems, allowing comparisons of systems which include various channel coding and data compression alternatives. Actual system comparisons include an advanced imaging communication system (AICS) which exhibits the rather significant advantages of sophisticated data compression coupled with powerful yet practical channel coding. For example, under certain conditions the improved AICS efficiency could provide as much as a two-order-of-magnitude increase in imaging information rate compared to a single-channel uncoded, uncompressed system while maintaining the same gse data rate in both systems. Additional details describing AICS compression and coding concepts, as well as efforts to apply them, are provided in support of the system analysis.

  5. Study of shock-induced combustion using an implicit TVD scheme

    NASA Technical Reports Server (NTRS)

    Yungster, Shayne

    1992-01-01

    The supersonic combustion flowfields associated with various hypersonic propulsion systems, such as the ram accelerator, the oblique detonation wave engine, and the scramjet, are being investigated using a new computational fluid dynamics (CFD) code. The code solves the fully coupled Reynolds-averaged Navier-Stokes equations and species continuity equations in an efficient manner. It employs an iterative method and a second order differencing scheme to improve computational efficiency. The code is currently being applied to study shock wave/boundary layer interactions in premixed combustible gases, and to investigate the ram accelerator concept. Results obtained for a ram accelerator configuration indicate a new combustion mechanism in which a shock wave induces combustion in the boundary layer, which then propagates outward and downstream. The combustion process creates a high pressure region over the back of the projectile resulting in a net positive thrust forward.

  6. A Computational Chemistry Database for Semiconductor Processing

    NASA Technical Reports Server (NTRS)

    Jaffe, R.; Meyyappan, M.; Arnold, J. O. (Technical Monitor)

    1998-01-01

    The concept of a 'virtual reactor' or 'virtual prototyping' has received much attention recently in the semiconductor industry. Commercial codes to simulate thermal CVD and plasma processes have become available to aid in equipment and process design efforts. The virtual prototyping effort would go nowhere if the codes did not come with a reliable database of chemical and physical properties of the gases involved in semiconductor processing. Commercial code vendors have no capabilities to generate such a database and instead leave it to the user to find whatever is needed. While individual investigations of interesting chemical systems continue at universities, there has not been any large-scale effort to create a database. In this presentation, we outline our efforts in this area, which focus on the following five areas: 1. Thermal CVD reaction mechanisms and rate constants. 2. Thermochemical properties. 3. Transport properties. 4. Electron-molecule collision cross sections. 5. Gas-surface interactions.

  7. Computational Thermodynamics of Materials Zi-Kui Liu and Yi Wang

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Devanathan, Ram

    This authoritative volume introduces the reader to computational thermodynamics and the use of this approach to the design of material properties by tailoring the chemical composition. The text covers applications of this approach, introduces the relevant computational codes, and offers exercises at the end of each chapter. The book has nine chapters and two appendices that provide background material on computer codes. Chapter 1 covers the first and second laws of thermodynamics, introduces the spinodal as the limit of stability, and presents the Gibbs-Duhem equation. Chapter 2 focuses on the Gibbs energy function. Starting with a homogeneous system with a single phase, the authors proceed to phases with variable compositions, and polymer blends. The discussion includes the contributions of external electric and magnetic fields to the Gibbs energy. Chapter 3 deals with phase equilibria in heterogeneous systems, the Gibbs phase rule, and phase diagrams. Chapter 4 briefly covers experimental measurements of thermodynamic properties used as input for thermodynamic modeling by Calculation of Phase Diagrams (CALPHAD). Chapter 5 discusses the use of density functional theory to obtain thermochemical data and fill gaps where experimental data is missing. The reader is introduced to the Vienna Ab Initio Simulation Package (VASP) for density functional theory and the YPHON code for phonon calculations. Chapter 6 introduces the modeling of Gibbs energy of phases with the CALPHAD method. Chapter 7 deals with chemical reactions and the Ellingham diagram for metal-oxide systems and presents the calculation of the maximum reaction rate from equilibrium thermodynamics. Chapter 8 is devoted to electrochemical reactions and Pourbaix diagrams with application examples. Chapter 9 concludes this volume with the application of a model of multiple microstates to Ce and Fe3Pt. CALPHAD modeling is briefly discussed in the context of genomics of materials.
The book introduces basic thermodynamic concepts clearly and directs readers to appropriate references for advanced concepts and details of software implementation. The list of references is quite comprehensive. The authors make liberal use of diagrams to illustrate key concepts. The two Appendices at the end discuss software requirements and the file structure, and present templates for special quasi-random structures. There is also a link to download pre-compiled binary files of the YPHON code for Linux or Microsoft Windows systems. The exercises at the end of the chapters assume that the reader has access to VASP, which is not freeware. Readers without access to this code can work on a limited number of exercises. However, results from other first principles codes can be organized in the YPHON format as explained in the Appendix. This book will serve as an excellent reference on computational thermodynamics and the exercises provided at the end of each chapter make it valuable as a graduate level textbook. Reviewer: Ram Devanathan is Acting Director of Earth Systems Science Division, Pacific Northwest National Laboratory, USA.
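
    A worked instance of one concept from the early chapters, the spinodal as the limit of stability: for the textbook regular-solution model, the spinodal compositions are the roots of the vanishing second derivative of the mixing Gibbs energy (the interaction parameter and temperature below are assumed, and the model is generic rather than anything specific to the book's software):

    ```python
    import math

    R = 8.314  # gas constant, J/(mol K)

    def d2G_dx2(x, T, omega):
        """Second composition derivative of the regular-solution mixing Gibbs
        energy G = omega*x*(1-x) + R*T*(x*ln(x) + (1-x)*ln(1-x))."""
        return -2.0 * omega + R * T / (x * (1.0 - x))

    # Spinodal: d2G/dx2 = 0  =>  x*(1-x) = R*T/(2*omega)
    omega, T = 20000.0, 1000.0   # assumed values; a spinodal exists for T < omega/(2R)
    root = math.sqrt(1.0 - 2.0 * R * T / omega)
    x_spinodal = (0.5 * (1.0 - root), 0.5 * (1.0 + root))
    print([round(x, 4) for x in x_spinodal])  # → [0.2947, 0.7053]
    ```

    Between the two roots d²G/dx² < 0, so a homogeneous solution is unstable and decomposes spontaneously, which is the sense in which the spinodal bounds the stability region.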

  8. Visualization: a tool for enhancing students' concept images of basic object-oriented concepts

    NASA Astrophysics Data System (ADS)

    Cetin, Ibrahim

    2013-03-01

    The purpose of this study was twofold: to investigate students' concept images of class, object, and their relationship, and to help them enhance their learning of these notions with a visualization tool. Fifty-six second-year university students participated in the study. To investigate their concept images, the researcher developed a survey including open-ended questions, which was administered to the participants. Follow-up interviews with 12 randomly selected students were conducted to explore their answers to the survey in depth. The results of the first part of the research were utilized to construct visualization scenarios. The students used these scenarios to develop animations using Flash software. The study found that most of the students experienced difficulties in learning object-oriented notions. Overdependence on code-writing practice and examples, and incorrectly learned analogies, were determined to be the sources of their difficulties. Moreover, visualization was found to be a promising approach for facilitating students' concept images of basic object-oriented notions. The results of this study have implications for researchers and practitioners when designing programming instruction.

  9. Research Integrity and Research Ethics in Professional Codes of Ethics: Survey of Terminology Used by Professional Organizations across Research Disciplines.

    PubMed

    Komić, Dubravka; Marušić, Stjepan Ljudevit; Marušić, Ana

    2015-01-01

    Professional codes of ethics are social contracts among members of a professional group, which aim to instigate, encourage and nurture ethical behaviour and prevent professional misconduct, including in research and publication. Despite the existence of codes of ethics, research misconduct remains a serious problem. A survey of codes of ethics from 795 professional organizations from the Illinois Institute of Technology's Codes of Ethics Collection showed that 182 of them (23%) used research integrity and research ethics terminology in their codes, with differences across disciplines: while the terminology was common in professional organizations in the social sciences (82%), mental health (71%) and sciences (61%), other organizations had no statements (construction trades, fraternal social organizations, real estate) or only a few of them (management, media, engineering). A subsample of 158 professional organizations we judged to be directly involved in research significantly more often had statements on research integrity/ethics terminology than the whole sample: an average of 10.4% of organizations with a statement (95% CI = 10.4-23.5%) on any of the 27 research integrity/ethics terms, compared to 3.3% (95% CI = 2.1-4.6%), respectively (P<0.001). Overall, 62% of all statements addressing research integrity/ethics concepts used prescriptive language in describing the standard of practice. Professional organizations should define research integrity and research ethics issues in their ethics codes and collaborate within and across disciplines to adequately address responsible conduct of research and meet the contemporary needs of their communities.

  10. It's time to make management a true profession.

    PubMed

    Khurana, Rakesh; Nohria, Nitin

    2008-10-01

    In the face of the recent institutional breakdown of trust in business, managers are losing legitimacy. To regain public trust, management needs to become a true profession in much the way medicine and law have, argue Khurana and Nohria of Harvard Business School. True professions have codes, and the meaning and consequences of those codes are taught as part of the formal education required of their members. Through these codes, professional institutions forge an implicit social contract with society: Trust us to control and exercise jurisdiction over an important occupational category, and, in return, we will ensure that the members of our profession are worthy of your trust--that they will not only be competent to perform the tasks entrusted to them, but that they will also conduct themselves with high standards and great integrity. The authors believe that enforcing educational standards and a code of ethics is unlikely to choke entrepreneurial creativity. Indeed, if the field of medicine is any indication, a code may even stimulate creativity. The main challenge in writing a code lies in reaching a broad consensus on the aims and social purpose of management. There are two deeply divided schools of thought. One school argues that management's aim should simply be to maximize shareholder wealth; the other argues that management's purpose is to balance the claims of all the firm's stakeholders. Any code will have to steer a middle course in order to accommodate both the value-creating impetus of the shareholder value concept and the accountability inherent in the stakeholder approach.

  11. Towards measuring the semantic capacity of a physical medium demonstrated with elementary cellular automata.

    PubMed

    Dittrich, Peter

    2018-02-01

    The organic code concept and its operationalization by molecular codes have been introduced to study the semiotic nature of living systems. This contribution develops further the idea that the semantic capacity of a physical medium can be measured by assessing its ability to implement a code as a contingent mapping. For demonstration and evaluation, the approach is applied to a formal medium: elementary cellular automata (ECA). The semantic capacity is measured by counting the number of ways codes can be implemented. Additionally, a link to information theory is established by taking multivariate mutual information to quantify contingency. It is shown how ECAs differ in their semantic capacities, how this is related to various ECA classifications, and how this depends on how a meaning is defined. Interestingly, if the meaning should persist for a certain while, the highest semantic capacity is found in CAs with apparently simple behavior, i.e., the fixed-point and two-cycle class. Synergy as a predictor of a CA's ability to implement codes can only be used if contexts implementing codes are common; for large context spaces with sparse coding contexts, synergy is a weak predictor. In conclusion, the approach presented here can distinguish CA-like systems with respect to their ability to implement contingent mappings. Applying it to physical systems appears straightforward and might lead to a novel physical property indicating how suitable a physical medium is for implementing a semiotic system. Copyright © 2017 Elsevier B.V. All rights reserved.
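
    The formal medium used in the paper, elementary cellular automata, fits in a few lines of code; a minimal step function follows (the code-counting and mutual-information analysis itself is beyond this sketch):

    ```python
    def eca_step(state, rule):
        """One synchronous update of an elementary cellular automaton on a ring;
        `rule` is the Wolfram rule number (0-255), `state` a list of 0/1 cells."""
        n = len(state)
        return [
            (rule >> (4 * state[(i - 1) % n] + 2 * state[i] + state[(i + 1) % n])) & 1
            for i in range(n)
        ]

    # Rule 110 on a small ring, three steps from a single live cell
    state = [0, 0, 0, 1, 0, 0, 0, 0]
    for _ in range(3):
        state = eca_step(state, 110)
    print(state)  # → [1, 1, 0, 1, 0, 0, 0, 0]
    ```

    The 256 rules differ sharply in behavior, and the paper's point is that this behavioral classification does not line up simply with the ability to implement codes: the fixed-point and two-cycle rules score highest when meanings must persist.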

  12. Expert system for maintenance management of a boiling water reactor power plant

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hong Shen; Liou, L.W.; Levine, S.

    1992-01-01

    An expert system code has been developed for the maintenance of two boiling water reactor units in Berwick, Pennsylvania, that are operated by the Pennsylvania Power and Light Company (PP and L). The objective of this expert system code, in which the knowledge of experienced operators and engineers is captured and implemented, is to support the decisions regarding which components can be safely and reliably removed from service for maintenance. It can also serve as a query-answering facility for checking the plant system status and for training purposes. The operating and maintenance information of a large number of support systems, which must be available for emergencies and/or in the event of an accident, is stored in the database of the code. It identifies the relevant technical specifications and management rules for shutting down any one of the systems or removing a component from service to support maintenance. Because of the complexity and time needed to incorporate a large number of systems and their components, the first phase of the expert system develops a prototype code, which includes only the reactor core isolation coolant system, the high-pressure core injection system, the instrument air system, the service water system, and the plant electrical system. The next phase is scheduled to expand the code to include all other systems. This paper summarizes the prototype code and the design concept of the complete expert system code for maintenance management of all plant systems and components.

  13. TRIAD IV: Nationwide Survey of Medical Students' Understanding of Living Wills and DNR Orders.

    PubMed

    Mirarchi, Ferdinando L; Ray, Matthew; Cooney, Timothy

    2016-12-01

    Living wills are a form of advance directive that helps to protect patient autonomy, and they are frequently encountered in the conduct of medicine. Because of their impact on care, it is important to understand the adequacy of current medical school training in preparing physicians to interpret these directives. Between April and August 2011, third- and fourth-year medical students participated in an internet survey involving the interpretation of living wills. The survey presented a standard living will as a "stand-alone," a standard living will with the addition of an emergent clinical scenario, and then variations of the standard living will that included a code status designation ("DNR," "Full Code," or "Comfort Care"). For each version/scenario, respondents were asked to assign a code status and choose interventions based on the cases presented. Four hundred twenty-five students from medical schools throughout the country responded. The majority indicated they had received some form of advance directive training and understood the concept of code status and the term "DNR." Based on the stand-alone document, 15% of respondents correctly denoted "full code" as the appropriate code status; adding a clinical scenario yielded negligible improvement. When a code designation was added to the living will, correct code status responses ranged from 68% to 93%, whereas correct treatment decisions ranged from 18% to 78%. Previous training in advance directives had no impact on these results. Our data indicate that the majority of students failed to understand the key elements of a living will; adding a code status designation improved correct responses, with the exception of the term DNR. Misunderstanding of advance directives is a nationwide problem and jeopardizes patient safety. Medical school ethics curricula need to be improved to ensure competency with respect to understanding advance directives.

  14. Genetic hotels for the standard genetic code: evolutionary analysis based upon novel three-dimensional algebraic models.

    PubMed

    José, Marco V; Morgado, Eberto R; Govezensky, Tzipe

    2011-07-01

    Herein, we rigorously develop novel 3-dimensional algebraic models called Genetic Hotels of the Standard Genetic Code (SGC). We start by considering the primeval RNA genetic code, which consists of the 16 codons of type RNY (purine-any base-pyrimidine). Using simple algebraic operations, we show how the RNA code could have evolved toward the current SGC via two different intermediate evolutionary stages called Extended RNA code types I and II. By rotations or translations of the subset RNY, we arrive at the SGC via the former (type I) or via the latter (type II), respectively. Biologically, the Extended RNA code type I consists of all codons of the type RNY plus codons obtained by considering the RNA code in the second (NYR type) and third (YRN type) reading frames. The Extended RNA code type II comprises all codons of the type RNY plus codons that arise from transversions of the RNA code in the first (YNY type) and third (RNR type) nucleotide bases. Since the dimensions of remarkable subsets of the Genetic Hotels are not necessarily integers, we also introduce the concept of algebraic fractal dimension. A general decoding function which maps each codon to its corresponding amino acid or a stop signal is also derived. The Phenotypic Hotel of amino acids is also illustrated. The proposed evolutionary paths are discussed in terms of the existing theories of the evolution of the SGC. The adoption of 3-dimensional models of the Genetic and Phenotypic Hotels will facilitate the understanding of the biological properties of the SGC.
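    The codon subsets named in the abstract (RNY, and the YNY/RNR transversion sets of the Extended RNA code type II) are determined purely by which base classes appear in each position, so they can be enumerated directly. The sketch below is illustrative only, assuming the standard purine/pyrimidine classes; the helper names are not from the paper.

    ```python
    from itertools import product

    R = "AG"    # purines
    Y = "CU"    # pyrimidines (RNA alphabet)
    N = "ACGU"  # any base

    def codons(first, second, third):
        """All codons whose three bases range over the given base classes."""
        return {a + b + c for a, b, c in product(first, second, third)}

    rny = codons(R, N, Y)   # primeval RNA code: 16 codons of type RNY
    yny = codons(Y, N, Y)   # first-base transversions (YNY type)
    rnr = codons(R, N, R)   # third-base transversions (RNR type)

    extended_type_ii = rny | yny | rnr
    print(len(rny), len(extended_type_ii))  # → 16 48
    ```

    The three sets are pairwise disjoint (they differ in the purine/pyrimidine class of the first or third base), so the Extended RNA code type II contains 48 of the 64 codons.
    
    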

  15. Convergence studies of deterministic methods for LWR explicit reflector methodology

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Canepa, S.; Hursin, M.; Ferroukhi, H.

    2013-07-01

    The standard approach in modern 3-D core simulators, employed either for steady-state or transient simulations, is to use Albedo coefficients or explicit reflectors at the core axial and radial boundaries. In the latter approach, few-group homogenized nuclear data are a priori produced with lattice transport codes using 2-D reflector models. Recently, the explicit reflector methodology of the deterministic CASMO-4/SIMULATE-3 code system was identified as potentially constituting one of the main sources of error for core analyses of the Swiss operating LWRs, all of which belong to the GII design. Considering that some of the new GIII designs will rely on very different reflector concepts, a review and assessment of the reflector methodology for various LWR designs appeared relevant. Therefore, the purpose of this paper is to first recall the concepts of the explicit reflector modelling approach as employed by CASMO/SIMULATE. Then, for selected reflector configurations representative of both GII and GIII designs, a benchmarking of the few-group nuclear data produced with the deterministic lattice code CASMO-4 and its successor CASMO-5 is conducted. On this basis, a convergence study with regard to geometrical requirements when using deterministic methods with 2-D homogeneous models is performed, and the effect on the downstream 3-D core analysis accuracy is evaluated for a typical GII reflector design in order to assess the results against available plant measurements. (authors)

  16. Cooperative solutions coupling a geometry engine and adaptive solver codes

    NASA Technical Reports Server (NTRS)

    Dickens, Thomas P.

    1995-01-01

    Follow-on work has progressed in using Aero Grid and Paneling System (AGPS), a geometry and visualization system, as a dynamic real time geometry monitor, manipulator, and interrogator for other codes. In particular, AGPS has been successfully coupled with adaptive flow solvers which iterate, refining the grid in areas of interest, and continuing on to a solution. With the coupling to the geometry engine, the new grids represent the actual geometry much more accurately since they are derived directly from the geometry and do not use refits to the first-cut grids. Additional work has been done with design runs where the geometric shape is modified to achieve a desired result. Various constraints are used to point the solution in a reasonable direction which also more closely satisfies the desired results. Concepts and techniques are presented, as well as examples of sample case studies. Issues such as distributed operation of the cooperative codes versus running all codes locally and pre-calculation for performance are discussed. Future directions are considered which will build on these techniques in light of changing computer environments.

  17. Sonic Boom Propagation Codes Validated by Flight Test

    NASA Technical Reports Server (NTRS)

    Poling, Hugh W.

    1996-01-01

    The sonic boom propagation codes reviewed in this study, SHOCKN and ZEPHYRUS, implement current theory on air absorption using different computational concepts. Review of the codes with a realistic atmosphere model confirms the agreement of propagation results reported by others for idealized propagation conditions. ZEPHYRUS offers greater flexibility in propagation conditions and is thus preferred for practical aircraft analysis. The ZEPHYRUS code was used to propagate sonic boom waveforms, measured approximately 1000 feet away from an SR-71 aircraft flying at Mach 1.25, out to 5000 feet away. These extrapolated signatures were compared to measurements at 5000 feet. Pressure values of the significant shocks (bow, canopy, inlet and tail) in the waveforms are consistent between extrapolation and measurement. Of particular interest is that four independent measurements taken under the aircraft centerline converge to the same extrapolated result despite differences in measurement conditions. Agreement between extrapolated and measured signature duration is prevented by the measured duration of the 5000-foot signatures being either much longer or shorter than would be expected. The duration anomalies may be due to signature probing not sufficiently parallel to the aircraft flight direction.

  18. Self-Powered Forward Error-Correcting Biosensor Based on Integration of Paper-Based Microfluidics and Self-Assembled Quick Response Codes.

    PubMed

    Yuan, Mingquan; Liu, Keng-Ku; Singamaneni, Srikanth; Chakrabartty, Shantanu

    2016-10-01

    This paper extends our previous work on silver-enhancement-based self-assembling structures for designing reliable, self-powered biosensors with forward error correcting (FEC) capability. At the core of the proposed approach is the integration of paper-based microfluidics with quick response (QR) codes that can be optically scanned using a smartphone. The scanned information is first decoded to obtain the location of a web server, which further processes the self-assembled QR image to determine the concentration of target analytes. The integration substrate for the proposed FEC biosensor is polyethylene, and the patterning of the QR code on the substrate has been achieved using a combination of low-cost ink-jet printing and a regular ballpoint dispensing pen. A paper-based microfluidic channel has been integrated underneath the substrate for acquiring, mixing and flowing the sample to areas on the substrate where different parts of the code can self-assemble in the presence of immobilized gold nanorods. In this paper we demonstrate proof-of-concept detection using prototypes of QR-encoded FEC biosensors.

  19. Signal Prediction With Input Identification

    NASA Technical Reports Server (NTRS)

    Juang, Jer-Nan; Chen, Ya-Chin

    1999-01-01

    A novel coding technique is presented for signal prediction with applications including speech coding, system identification, and estimation of input excitation. The approach is based on the blind equalization method for speech signal processing in conjunction with the geometric subspace projection theory to formulate the basic prediction equation. The speech-coding problem is often divided into two parts, a linear prediction model and excitation input. The parameter coefficients of the linear predictor and the input excitation are solved simultaneously and recursively by a conventional recursive least-squares algorithm. The excitation input is computed by coding all possible outcomes into a binary codebook. The coefficients of the linear predictor and excitation, and the index of the codebook can then be used to represent the signal. In addition, a variable-frame concept is proposed to block the same excitation signal in sequence in order to reduce the storage size and increase the transmission rate. The results of this work can be easily extended to the problem of disturbance identification. The basic principles are outlined in this report and differences from other existing methods are discussed. Simulations are included to demonstrate the proposed method.
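    The abstract states that the predictor coefficients are solved recursively by a conventional recursive least-squares (RLS) algorithm. As a minimal sketch of that standard update (not the paper's full method, and with illustrative variable names), the following fits an order-2 linear predictor in pure Python:

    ```python
    # Conventional RLS update for a linear predictor y[n] ~= w0*y[n-1] + w1*y[n-2].
    def rls_predictor(signal, lam=0.99, delta=100.0):
        w = [0.0, 0.0]                    # predictor coefficients
        P = [[delta, 0.0], [0.0, delta]]  # inverse-correlation estimate
        for n in range(2, len(signal)):
            x = [signal[n - 1], signal[n - 2]]     # regressor
            e = signal[n] - (w[0]*x[0] + w[1]*x[1])  # a-priori prediction error
            # Gain vector k = P x / (lam + x' P x)
            Px = [P[0][0]*x[0] + P[0][1]*x[1],
                  P[1][0]*x[0] + P[1][1]*x[1]]
            denom = lam + x[0]*Px[0] + x[1]*Px[1]
            k = [Px[0]/denom, Px[1]/denom]
            w = [w[0] + k[0]*e, w[1] + k[1]*e]     # coefficient update
            # P <- (P - k (x' P)) / lam
            xP = [x[0]*P[0][0] + x[1]*P[1][0],
                  x[0]*P[0][1] + x[1]*P[1][1]]
            P = [[(P[i][j] - k[i]*xP[j]) / lam for j in range(2)]
                 for i in range(2)]
        return w

    # A signal that exactly obeys y[n] = 1.5*y[n-1] - 0.7*y[n-2]
    sig = [1.0, 1.0]
    for _ in range(200):
        sig.append(1.5*sig[-1] - 0.7*sig[-2])
    w = rls_predictor(sig)  # w should be close to (1.5, -0.7)
    ```

    In the speech-coding setting described above, the residual e left after such a predictor is what the excitation codebook would then encode.
    
    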

  20. New t-gap insertion-deletion-like metrics for DNA hybridization thermodynamic modeling.

    PubMed

    D'yachkov, Arkadii G; Macula, Anthony J; Pogozelski, Wendy K; Renz, Thomas E; Rykov, Vyacheslav V; Torney, David C

    2006-05-01

    We discuss the concept of t-gap block isomorphic subsequences and use it to describe new abstract string metrics that are similar to the Levenshtein insertion-deletion metric. Some of the metrics that we define can be used to model a thermodynamic distance function on single-stranded DNA sequences. Our model captures a key aspect of the nearest-neighbor thermodynamic model for hybridized DNA duplexes. One version of our metric gives the maximum number of stacked pairs of hydrogen-bonded nucleotide base pairs that can be present in any secondary structure in a hybridized DNA duplex without pseudoknots. Thermodynamic distance functions are important components in the construction of DNA codes, and DNA codes are important components in biomolecular computing, nanotechnology, and other biotechnical applications that employ DNA hybridization assays. We show how our new distances can be calculated by using a dynamic programming method, and we derive a Varshamov-Gilbert-like lower bound on the size of some of the codes using these distance functions as constraints. We also discuss software implementation of our DNA code design methods.
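    For context, the classical insertion-deletion metric that these t-gap metrics generalize allows only insertions and deletions (no substitutions), so it reduces to a longest-common-subsequence computation: d(a, b) = |a| + |b| - 2·LCS(a, b). A minimal dynamic-programming sketch (not the paper's t-gap variant):

    ```python
    def indel_distance(a, b):
        """Minimum number of insertions/deletions turning string a into b."""
        # lcs[i][j] = length of the longest common subsequence of a[:i], b[:j]
        lcs = [[0] * (len(b) + 1) for _ in range(len(a) + 1)]
        for i, ca in enumerate(a, 1):
            for j, cb in enumerate(b, 1):
                lcs[i][j] = (lcs[i-1][j-1] + 1 if ca == cb
                             else max(lcs[i-1][j], lcs[i][j-1]))
        return len(a) + len(b) - 2 * lcs[len(a)][len(b)]

    print(indel_distance("ACGTAC", "AGTTC"))  # → 3
    ```

    The paper's metrics replace plain common subsequences with t-gap block isomorphic ones to reflect the stacked-pair energetics of DNA duplexes, but the dynamic-programming skeleton is of this general shape.
    
    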
