Sample records for query "abstracting code specific"

  1. Coding the Eggen Cards (Poster abstract)

    NASA Astrophysics Data System (ADS)

    Silvis, G.

    2014-06-01

(Abstract only) A look at the Eggen Portal for accessing the Eggen cards, and a call for volunteers to help code them: 100,000 cards must be examined and their star references identified and coded into the database for this to become a valuable resource.

  2. Dual coding theory, word abstractness, and emotion: a critical review of Kousta et al. (2011).

    PubMed

    Paivio, Allan

    2013-02-01

Kousta, Vigliocco, Del Campo, Vinson, and Andrews (2011) questioned the adequacy of dual coding theory and the context availability model as explanations of representational and processing differences between concrete and abstract words. They proposed an alternative approach that focuses on the role of emotional content in the processing of abstract concepts. Their dual coding critique is, however, based on impoverished and, in some respects, incorrect interpretations of the theory and its implications. This response corrects those gaps and misinterpretations and summarizes research findings that show predicted variations in the effects of dual coding variables in different tasks and contexts. Especially emphasized is an empirically supported dual coding theory of emotion that goes beyond the Kousta et al. emphasis on emotion in abstract semantics.

  3. Validation of International Classification of Diseases coding for bone metastases in electronic health records using technology-enabled abstraction.

    PubMed

    Liede, Alexander; Hernandez, Rohini K; Roth, Maayan; Calkins, Geoffrey; Larrabee, Katherine; Nicacio, Leo

    2015-01-01

The accuracy of bone metastases diagnostic coding based on International Classification of Diseases, ninth revision (ICD-9) is unknown for most large databases used for epidemiologic research in the US. Electronic health records (EHR) are the preferred source of data, but often clinically relevant data occur only as unstructured free text. We examined the validity of bone metastases ICD-9 coding in structured EHR and administrative claims relative to the complete (structured and unstructured) patient chart obtained through technology-enabled chart abstraction. Female patients with breast cancer with ≥1 visit after November 2010 were identified from three community oncology practices in the US. We calculated sensitivity, specificity, positive predictive value (PPV), and negative predictive value (NPV) of bone metastases ICD-9 code 198.5. The technology-enabled abstraction displays portions of the chart to clinically trained abstractors for targeted review, thereby maximizing efficiency. We evaluated effects of misclassification of patients developing skeletal complications or treated with bone-targeting agents (BTAs), and timing of BTA. Among 8,796 patients with breast cancer, 524 had confirmed bone metastases using chart abstraction. Sensitivity was 0.67 (95% confidence interval [CI] =0.63-0.71) based on structured EHR, and specificity was high at 0.98 (95% CI =0.98-0.99) with corresponding PPV of 0.71 (95% CI =0.67-0.75) and NPV of 0.98 (95% CI =0.98-0.98). From claims, sensitivity was 0.78 (95% CI =0.74-0.81), and specificity was 0.98 (95% CI =0.98-0.98) with PPV of 0.72 (95% CI =0.68-0.76) and NPV of 0.99 (95% CI =0.98-0.99). Structured data and claims missed 17% of bone metastases (89 of 524). False negatives were associated with measurable overestimation of the proportion treated with BTA or with a skeletal complication. Median date of diagnosis was delayed in structured data (32 days) and claims (43 days) compared with technology-assisted EHR. Technology…
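The four validity metrics this record reports are standard functions of a 2x2 confusion matrix. A minimal sketch of the calculation follows; the cell counts used are a reconstruction chosen to be consistent with the structured-EHR figures quoted above (8,796 patients, 524 with confirmed bone metastases), not the study's published counts.

```python
# Sketch: sensitivity, specificity, PPV and NPV of a diagnostic code
# from a 2x2 confusion matrix. Counts are illustrative reconstructions,
# not the study's actual cell counts.

def validity_metrics(tp, fp, fn, tn):
    """Return sensitivity, specificity, PPV and NPV."""
    sensitivity = tp / (tp + fn)   # coded positive among true positives
    specificity = tn / (tn + fp)   # coded negative among true negatives
    ppv = tp / (tp + fp)           # true positives among coded positives
    npv = tn / (tn + fn)           # true negatives among coded negatives
    return sensitivity, specificity, ppv, npv

# Counts consistent with the structured-EHR results reported above:
sens, spec, ppv, npv = validity_metrics(tp=351, fp=144, fn=173, tn=8128)
print(f"sensitivity={sens:.2f} specificity={spec:.2f} "
      f"ppv={ppv:.2f} npv={npv:.2f}")
```

With these reconstructed counts the function reproduces the reported 0.67/0.98/0.71/0.98 to two decimal places.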

  4. Coding of obesity in administrative hospital discharge abstract data: accuracy and impact for future research studies.

    PubMed

    Martin, Billie-Jean; Chen, Guanmin; Graham, Michelle; Quan, Hude

    2014-02-13

Obesity is a pervasive problem and a popular subject of academic assessment. The ability to take advantage of existing data, such as administrative databases, to study obesity is appealing. The objective of our study was to assess the validity of obesity coding in an administrative database and compare the association between obesity and outcomes in an administrative database versus a registry. This study was conducted using a coronary catheterization registry and an administrative database (Discharge Abstract Database (DAD)). A Body Mass Index (BMI) ≥30 kg/m2 within the registry defined obesity. In the DAD, obesity was defined by diagnosis codes E65-E68 (ICD-10). The sensitivity, specificity, negative predictive value (NPV) and positive predictive value (PPV) of an obesity diagnosis in the DAD were determined using obesity diagnosis in the registry as the referent. The association between obesity and outcomes was assessed. The study population of 17,380 subjects was largely male (68.8%) with a mean BMI of 27.0 kg/m2. Obesity prevalence was lower in the DAD than in the registry (2.4% vs. 20.3%). A diagnosis of obesity in the DAD had a sensitivity of 7.75%, specificity of 98.98%, NPV of 80.84% and PPV of 65.94%. Obesity was associated with a decreased risk of death or re-hospitalization, though non-significantly within the DAD. Obesity was significantly associated with an increased risk of cardiac procedure in both databases. Overall, obesity was poorly coded in the DAD. However, when coded, it was coded accurately. Administrative databases are not an optimal data source for obesity prevalence and incidence surveillance but could be used to define obese cohorts for follow-up.

  5. Assessment of incidence of severe sepsis in Sweden using different ways of abstracting International Classification of Diseases codes: difficulties with methods and interpretation of results.

    PubMed

    Wilhelms, Susanne B; Huss, Fredrik R; Granath, Göran; Sjöberg, Folke

    2010-06-01

To compare three International Classification of Diseases code abstraction strategies that have previously been reported to mirror severe sepsis by examining retrospective Swedish national data from 1987 to 2005 inclusive. Retrospective cohort study. Swedish hospital discharge database. All hospital admissions during the period 1987 to 2005 were extracted and these patients were screened for severe sepsis using the three International Classification of Diseases code abstraction strategies, which were adapted for the Swedish version of the International Classification of Diseases. Two code abstraction strategies included both International Classification of Diseases, Ninth Revision and International Classification of Diseases, Tenth Revision codes, whereas one included International Classification of Diseases, Tenth Revision codes alone. None. The three International Classification of Diseases code abstraction strategies identified 37,990, 27,655, and 12,512 patients, respectively, with severe sepsis. The incidence increased over the years, reaching 0.35 per 1000, 0.43 per 1000, and 0.13 per 1000 inhabitants, respectively. During the International Classification of Diseases, Ninth Revision period, we found 17,096 unique patients and of these, only 2789 patients (16%) met two of the code abstraction strategy lists and 14,307 (84%) met one list. The International Classification of Diseases, Tenth Revision period included 46,979 unique patients, of whom 8% met the criteria of all three International Classification of Diseases code abstraction strategies, 7% met two, and 84% met one only. The three different International Classification of Diseases code abstraction strategies generated three almost separate cohorts of patients with severe sepsis. Thus, the International Classification of Diseases code abstraction strategies for recording severe sepsis in use today provide an unsatisfactory way of estimating the true incidence of severe sepsis. Further studies relating…

  6. Generating Safety-Critical PLC Code From a High-Level Application Software Specification

    NASA Technical Reports Server (NTRS)

    2008-01-01

The benefits of automatic-application code generation are widely accepted within the software engineering community. These benefits include raised abstraction level of application programming, shorter product development time, lower maintenance costs, and increased code quality and consistency. Surprisingly, code generation concepts have not yet found wide acceptance and use in the field of programmable logic controller (PLC) software development. Software engineers at Kennedy Space Center recognized the need for PLC code generation while developing the new ground checkout and launch processing system, called the Launch Control System (LCS). Engineers developed a process and a prototype software tool that automatically translates a high-level representation or specification of application software into ladder logic that executes on a PLC. All the computer hardware in the LCS is planned to be commercial off the shelf (COTS), including industrial controllers or PLCs that are connected to the sensors and end items out in the field. Most of the software in LCS is also planned to be COTS, with only small adapter software modules that must be developed in order to interface between the various COTS software products. A domain-specific language (DSL) is a programming language designed to perform tasks and to solve problems in a particular domain, such as ground processing of launch vehicles. The LCS engineers created a DSL for developing test sequences of ground checkout and launch operations of future launch vehicle and spacecraft elements, and they are developing a tabular specification format that uses the DSL keywords and functions familiar to the ground and flight system users. The tabular specification format, or tabular spec, allows most ground and flight system users to document how the application software is intended to function and requires little or no software programming knowledge or experience. A small sample from a prototype tabular spec application is…

  7. Dual Coding Theory, Word Abstractness, and Emotion: A Critical Review of Kousta et al. (2011)

    ERIC Educational Resources Information Center

    Paivio, Allan

    2013-01-01

    Kousta, Vigliocco, Del Campo, Vinson, and Andrews (2011) questioned the adequacy of dual coding theory and the context availability model as explanations of representational and processing differences between concrete and abstract words. They proposed an alternative approach that focuses on the role of emotional content in the processing of…

  8. Emergence of an abstract categorical code enabling the discrimination of temporally structured tactile stimuli

    PubMed Central

    Rossi-Pool, Román; Salinas, Emilio; Zainos, Antonio; Alvarez, Manuel; Vergara, José; Parga, Néstor; Romo, Ranulfo

    2016-01-01

    The problem of neural coding in perceptual decision making revolves around two fundamental questions: (i) How are the neural representations of sensory stimuli related to perception, and (ii) what attributes of these neural responses are relevant for downstream networks, and how do they influence decision making? We studied these two questions by recording neurons in primary somatosensory (S1) and dorsal premotor (DPC) cortex while trained monkeys reported whether the temporal pattern structure of two sequential vibrotactile stimuli (of equal mean frequency) was the same or different. We found that S1 neurons coded the temporal patterns in a literal way and only during the stimulation periods and did not reflect the monkeys’ decisions. In contrast, DPC neurons coded the stimulus patterns as broader categories and signaled them during the working memory, comparison, and decision periods. These results show that the initial sensory representation is transformed into an intermediate, more abstract categorical code that combines past and present information to ultimately generate a perceptually informed choice. PMID:27872293

  9. Semantic domain-specific functional integration for action-related vs. abstract concepts.

    PubMed

    Ghio, Marta; Tettamanti, Marco

    2010-03-01

A central topic in cognitive neuroscience concerns the representation of concepts and the specific neural mechanisms that mediate conceptual knowledge. Recently proposed modal theories assert that concepts are grounded on the integration of multimodal, distributed representations. The aim of the present work is to complement the available neuropsychological and neuroimaging evidence suggesting partially segregated anatomo-functional correlates for concrete vs. abstract concepts, by directly testing the semantic domain-specific patterns of functional integration between language and modal semantic brain regions. We report evidence from a functional magnetic resonance imaging study, in which healthy participants listened to sentences with either an action-related (actions involving physical entities) or an abstract (no physical entities involved) content. We measured functional integration using dynamic causal modeling, and found that the left superior temporal gyrus was more strongly connected: (1) for action-related vs. abstract sentences, with the left-hemispheric action representation system, including sensorimotor areas; (2) for abstract vs. action-related sentences, with left infero-ventral frontal, temporal, and retrosplenial cingulate areas. A selective directionality effect was observed, with causal modulatory effects exerted by perisylvian language regions on peripheral modal areas, and not vice versa. The observed condition-specific modulatory effects are consistent with embodied and situated language processing theories, and indicate that linguistic areas promote a semantic content-specific reactivation of modal simulations by top-down mechanisms.

  10. Modelling Metamorphism by Abstract Interpretation

    NASA Astrophysics Data System (ADS)

    Dalla Preda, Mila; Giacobazzi, Roberto; Debray, Saumya; Coogan, Kevin; Townsend, Gregg M.

Metamorphic malware apply semantics-preserving transformations to their own code in order to foil detection systems based on signature matching. In this paper we consider the problem of automatically extracting metamorphic signatures from such malware. We introduce a semantics for self-modifying code, later called phase semantics, and prove its correctness by showing that it is an abstract interpretation of the standard trace semantics. Phase semantics precisely models the metamorphic code behavior by providing a set of traces of programs which correspond to the possible evolutions of the metamorphic code during execution. We show that metamorphic signatures can be automatically extracted by abstract interpretation of the phase semantics, and that regular metamorphism can be modelled as finite state automata abstraction of the phase semantics.
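For readers unfamiliar with the underlying technique, a toy illustration of abstract interpretation in general (not the paper's phase semantics) may help: concrete integers are abstracted to their sign, and multiplication is modelled soundly on the abstract domain.

```python
# Toy abstract interpretation over the sign domain. This illustrates the
# general technique only; the paper's phase semantics is far richer.

NEG, ZERO, POS = "-", "0", "+"

def alpha(n):
    """Abstraction function: map a concrete integer to its sign."""
    return ZERO if n == 0 else (POS if n > 0 else NEG)

def abs_mul(a, b):
    """Abstract multiplication over signs."""
    if ZERO in (a, b):
        return ZERO
    return POS if a == b else NEG

# Soundness check: abstract-then-multiply agrees with multiply-then-abstract.
for x in (-3, 0, 5):
    for y in (-2, 0, 7):
        assert abs_mul(alpha(x), alpha(y)) == alpha(x * y)
print("sign abstraction is sound for multiplication")
```

The correctness proof mentioned in the abstract plays the same role at scale: it shows that phase semantics is a sound abstraction of the concrete trace semantics.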

  11. The Feedback-related Negativity Codes Components of Abstract Inference during Reward-based Decision-making.

    PubMed

    Reiter, Andrea M F; Koch, Stefan P; Schröger, Erich; Hinrichs, Hermann; Heinze, Hans-Jochen; Deserno, Lorenz; Schlagenhauf, Florian

    2016-08-01

    Behavioral control is influenced not only by learning from the choices made and the rewards obtained but also by "what might have happened," that is, inference about unchosen options and their fictive outcomes. Substantial progress has been made in understanding the neural signatures of direct learning from choices that are actually made and their associated rewards via reward prediction errors (RPEs). However, electrophysiological correlates of abstract inference in decision-making are less clear. One seminal theory suggests that the so-called feedback-related negativity (FRN), an ERP peaking 200-300 msec after a feedback stimulus at frontocentral sites of the scalp, codes RPEs. Hitherto, the FRN has been predominantly related to a so-called "model-free" RPE: The difference between the observed outcome and what had been expected. Here, by means of computational modeling of choice behavior, we show that individuals employ abstract, "double-update" inference on the task structure by concurrently tracking values of chosen stimuli (associated with observed outcomes) and unchosen stimuli (linked to fictive outcomes). In a parametric analysis, model-free RPEs as well as their modification because of abstract inference were regressed against single-trial FRN amplitudes. We demonstrate that components related to abstract inference uniquely explain variance in the FRN beyond model-free RPEs. These findings advance our understanding of the FRN and its role in behavioral adaptation. This might further the investigation of disturbed abstract inference, as proposed, for example, for psychiatric disorders, and its underlying neural correlates.

  12. C code generation from Petri-net-based logic controller specification

    NASA Astrophysics Data System (ADS)

    Grobelny, Michał; Grobelna, Iwona; Karatkevich, Andrei

    2017-08-01

    The article focuses on programming of logic controllers. It is important that a programming code of a logic controller is executed flawlessly according to the primary specification. In the presented approach we generate C code for an AVR microcontroller from a rule-based logical model of a control process derived from a control interpreted Petri net. The same logical model is also used for formal verification of the specification by means of the model checking technique. The proposed rule-based logical model and formal rules of transformation ensure that the obtained implementation is consistent with the already verified specification. The approach is validated by practical experiments.
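The translation step this record describes, from a rule-based logical model to C, can be sketched as a template-driven generator. The rule format below is an assumption for illustration; the paper's actual logical model and transformation rules are not reproduced here.

```python
# Hypothetical sketch of rule-to-C translation in the spirit described
# above: each rule (guard over inputs, output assignments) becomes a
# guarded block in a generated C step function. Signal names and the
# rule format are illustrative assumptions.

rules = [
    # (guard over input signals, assignments fired when the guard holds)
    ("x1 && !x2", {"y1": 1}),
    ("x2",        {"y1": 0, "y2": 1}),
]

def emit_c(rules):
    """Emit a C function that evaluates all rules once per scan cycle."""
    lines = ["void controller_step(void) {"]
    for guard, outputs in rules:
        lines.append(f"    if ({guard}) {{")
        for signal, value in outputs.items():
            lines.append(f"        {signal} = {value};")
        lines.append("    }")
    lines.append("}")
    return "\n".join(lines)

print(emit_c(rules))
```

Because the same rule list drives both code emission and (in the paper's approach) model checking, the generated implementation stays consistent with the verified specification by construction.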

  13. Single neurons in prefrontal cortex encode abstract rules.

    PubMed

    Wallis, J D; Anderson, K C; Miller, E K

    2001-06-21

    The ability to abstract principles or rules from direct experience allows behaviour to extend beyond specific circumstances to general situations. For example, we learn the 'rules' for restaurant dining from specific experiences and can then apply them in new restaurants. The use of such rules is thought to depend on the prefrontal cortex (PFC) because its damage often results in difficulty in following rules. Here we explore its neural basis by recording from single neurons in the PFC of monkeys trained to use two abstract rules. They were required to indicate whether two successively presented pictures were the same or different depending on which rule was currently in effect. The monkeys performed this task with new pictures, thus showing that they had learned two general principles that could be applied to stimuli that they had not yet experienced. The most prevalent neuronal activity observed in the PFC reflected the coding of these abstract rules.

  14. An abstract model of rogue code insertion into radio frequency wireless networks. The effects of computer viruses on the Program Management Office

    NASA Astrophysics Data System (ADS)

    Feudo, Christopher V.

    1994-04-01

    This dissertation demonstrates that inadequately protected wireless LANs are more vulnerable to rogue program attack than traditional LANs. Wireless LANs not only run the same risks as traditional LANs, but they also run additional risks associated with an open transmission medium. Intruders can scan radio waves and, given enough time and resources, intercept, analyze, decipher, and reinsert data into the transmission medium. This dissertation describes the development and instantiation of an abstract model of the rogue code insertion process into a DOS-based wireless communications system using radio frequency (RF) atmospheric signal transmission. The model is general enough to be applied to widely used target environments such as UNIX, Macintosh, and DOS operating systems. The methodology and three modules, the prober, activator, and trigger modules, to generate rogue code and insert it into a wireless LAN were developed to illustrate the efficacy of the model. Also incorporated into the model are defense measures against remotely introduced rogue programs and a cost-benefit analysis that determined that such defenses for a specific environment were cost-justified.

  15. Abstraction Techniques for Parameterized Verification

    DTIC Science & Technology

    2006-11-01

…approach for applying model checking to unbounded systems is to extract finite state models from them using conservative abstraction techniques. […] Applying model checking to complex pieces of code like device drivers depends on the use of abstraction methods. An abstraction method extracts a small finite…

  16. Certifying Domain-Specific Policies

    NASA Technical Reports Server (NTRS)

    Lowry, Michael; Pressburger, Thomas; Rosu, Grigore; Koga, Dennis (Technical Monitor)

    2001-01-01

Proof-checking code for compliance to safety policies potentially enables a product-oriented approach to certain aspects of software certification. To date, previous research has focused on generic, low-level programming-language properties such as memory type safety. In this paper we consider proof-checking higher-level domain-specific properties for compliance to safety policies. The paper first describes a framework related to abstract interpretation in which compliance to a class of certification policies can be efficiently calculated. Membership equational logic is shown to provide a rich logic for carrying out such calculations, including partiality, for certification. The architecture for a domain-specific certifier is described, followed by an implemented case study. The case study considers consistency of abstract variable attributes in code that performs geometric calculations in aerospace systems.

  17. Data Abstraction in GLISP.

    ERIC Educational Resources Information Center

    Novak, Gordon S., Jr.

    GLISP is a high-level computer language (based on Lisp and including Lisp as a sublanguage) which is compiled into Lisp. GLISP programs are compiled relative to a knowledge base of object descriptions, a form of abstract datatypes. A primary goal of the use of abstract datatypes in GLISP is to allow program code to be written in terms of objects,…

  18. Improving the sensitivity and specificity of the abbreviated injury scale coding system.

    PubMed Central

    Kramer, C F; Barancik, J I; Thode, H C

    1990-01-01

    The Abbreviated Injury Scale with Epidemiologic Modifications (AIS 85-EM) was developed to make it possible to code information about anatomic injury types and locations that, although generally available from medical records, is not codable under the standard Abbreviated Injury Scale, published by the American Association for Automotive Medicine in 1985 (AIS 85). In a population-based sample of 3,223 motor vehicle trauma cases, 68 percent of the patients had one or more injuries that were coded to the AIS 85 body region nonspecific category external. When the same patients' injuries were coded using the AIS 85-EM coding procedure, only 15 percent of the patients had injuries that could not be coded to a specific body region. With AIS 85-EM, the proportion of codable head injury cases increased from 16 percent to 37 percent, thereby improving the potential for identifying cases with head and threshold brain injury. The data suggest that body region coding of all injuries is necessary to draw valid and reliable conclusions about changes in injury patterns and their sequelae. The increased specificity of body region coding improves assessments of the efficacy of injury intervention strategies and countermeasure programs using epidemiologic methodology. PMID:2116633

  19. Argument structure and the representation of abstract semantics.

    PubMed

    Rodríguez-Ferreiro, Javier; Andreu, Llorenç; Sanz-Torrent, Mònica

    2014-01-01

According to the dual coding theory, differences in the ease of retrieval between concrete and abstract words are related to the exclusive dependence of abstract semantics on linguistic information. Argument structure can be considered a measure of the complexity of the linguistic contexts that accompany a verb. If the retrieval of abstract verbs relies more on the linguistic codes they are associated with, we could expect a larger effect of argument structure for the processing of abstract verbs. In this study, sets of length- and frequency-matched verbs including 40 intransitive verbs, 40 transitive verbs taking simple complements, and 40 transitive verbs taking sentential complements were presented in separate lexical and grammatical decision tasks. Half of the verbs were concrete and half were abstract. Similar results were obtained in the two tasks, with significant effects of imageability and transitivity. However, the interaction between these two variables was not significant. These results conflict with hypotheses assuming a stronger reliance of abstract semantics on linguistic codes. In contrast, our data are in line with theories that link the ease of retrieval with availability and robustness of semantic information.

  20. ERP evidence for hemispheric asymmetries in abstract but not exemplar-specific repetition priming.

    PubMed

    Küper, Kristina; Liesefeld, Anna M; Zimmer, Hubert D

    2015-12-01

    Implicit memory retrieval is thought to be exemplar-specific in the right hemisphere (RH) but abstract in the left hemisphere (LH). Yet, conflicting behavioral priming results illustrate that the level at which asymmetries take effect is difficult to pinpoint. In the present divided visual field experiment, we tried to address this issue by analyzing ERPs in addition to behavioral measures. Participants made a natural/artificial decision on lateralized visual objects that were either new, identical repetitions, or different exemplars of studied items. Hemispheric asymmetries did not emerge in either behavioral or late positive complex (LPC) priming effects, but did affect the process of implicit memory retrieval proper as indexed by an early frontal negativity (N350/(F)N400). Whereas exemplar-specific N350/(F)N400 priming effects emerged irrespective of presentation side, abstract implicit memory retrieval of different exemplars was contingent on right visual field presentation and the ensuing initial stimulus processing by the LH. © 2015 Society for Psychophysiological Research.

  1. The representation of abstract words: why emotion matters.

    PubMed

    Kousta, Stavroula-Thaleia; Vigliocco, Gabriella; Vinson, David P; Andrews, Mark; Del Campo, Elena

    2011-02-01

Although much is known about the representation and processing of concrete concepts, knowledge of what abstract semantics might be is severely limited. In this article we first address the adequacy of the 2 dominant accounts (dual coding theory and the context availability model) put forward in order to explain representation and processing differences between concrete and abstract words. We find that neither proposal can account for experimental findings and that this is, at least partly, because abstract words are considered to be unrelated to experiential information in both of these accounts. We then address a particular type of experiential information, emotional content, and demonstrate that it plays a crucial role in the processing and representation of abstract concepts: Statistically, abstract words are more emotionally valenced than are concrete words, and this accounts for a residual latency advantage for abstract words, when variables such as imageability (a construct derived from dual coding theory) and rated context availability are held constant. We conclude with a discussion of our novel hypothesis for embodied abstract semantics.

  2. Poly(A) code analyses reveal key determinants for tissue-specific mRNA alternative polyadenylation

    PubMed Central

    Weng, Lingjie; Li, Yi; Xie, Xiaohui; Shi, Yongsheng

    2016-01-01

    mRNA alternative polyadenylation (APA) is a critical mechanism for post-transcriptional gene regulation and is often regulated in a tissue- and/or developmental stage-specific manner. An ultimate goal for the APA field has been to be able to computationally predict APA profiles under different physiological or pathological conditions. As a first step toward this goal, we have assembled a poly(A) code for predicting tissue-specific poly(A) sites (PASs). Based on a compendium of over 600 features that have known or potential roles in PAS selection, we have generated and refined a machine-learning algorithm using multiple high-throughput sequencing-based data sets of tissue-specific and constitutive PASs. This code can predict tissue-specific PASs with >85% accuracy. Importantly, by analyzing the prediction performance based on different RNA features, we found that PAS context, including the distance between alternative PASs and the relative position of a PAS within the gene, is a key feature for determining the susceptibility of a PAS to tissue-specific regulation. Our poly(A) code provides a useful tool for not only predicting tissue-specific APA regulation, but also for studying its underlying molecular mechanisms. PMID:27095026

  3. Specific and Modular Binding Code for Cytosine Recognition in Pumilio/FBF (PUF) RNA-binding Domains

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dong, Shuyun; Wang, Yang; Cassidy-Amstutz, Caleb

    2011-10-28

Pumilio/fem-3 mRNA-binding factor (PUF) proteins possess a recognition code for bases A, U, and G, allowing designed RNA sequence specificity of their modular Pumilio (PUM) repeats. However, recognition side chains in a PUM repeat for cytosine are unknown. Here we report identification of a cytosine-recognition code by screening random amino acid combinations at conserved RNA recognition positions using a yeast three-hybrid system. This C-recognition code is specific and modular as specificity can be transferred to different positions in the RNA recognition sequence. A crystal structure of a modified PUF domain reveals specific contacts between an arginine side chain and the cytosine base. We applied the C-recognition code to design PUF domains that recognize targets with multiple cytosines and to generate engineered splicing factors that modulate alternative splicing. Finally, we identified a divergent yeast PUF protein, Nop9p, that may recognize natural target RNAs with cytosine. This work deepens our understanding of natural PUF protein target recognition and expands the ability to engineer PUF domains to recognize any RNA sequence.

  4. Thyra Abstract Interface Package

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bartlett, Roscoe A.

    2005-09-01

Thyra primarily defines a set of abstract C++ class interfaces needed for the development of abstract numerical algorithms (ANAs) such as iterative linear solvers and transient solvers, all the way up to optimization. At the foundation of these interfaces are abstract C++ classes for vectors, vector spaces, linear operators and multi-vectors. Also included in the Thyra package is C++ code for creating concrete vector, vector space, linear operator, and multi-vector subclasses, as well as other utilities to aid in the development of ANAs. Currently, very general and efficient concrete subclass implementations exist for serial and SPMD in-core vectors and multi-vectors. Code also currently exists for testing objects and providing composite objects such as product vectors.
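The design idea behind such abstract interfaces, that numerical algorithms are written against an interface while concrete storage types plug in underneath, can be sketched briefly. The sketch below uses Python for compactness; Thyra itself is C++, and all names here are illustrative, not Thyra's actual API.

```python
# Minimal sketch of abstract vector interfaces for numerical algorithms.
# Python stands in for Thyra's C++ class hierarchy; names are illustrative.

from abc import ABC, abstractmethod

class Vector(ABC):
    """Abstract vector interface an ANA can be written against."""
    @abstractmethod
    def dot(self, other):
        ...
    @abstractmethod
    def axpy(self, alpha, x):
        """In-place update: self += alpha * x."""
        ...

class DenseVector(Vector):
    """One concrete subclass; an SPMD-distributed one could plug in too."""
    def __init__(self, data):
        self.data = list(data)
    def dot(self, other):
        return sum(a * b for a, b in zip(self.data, other.data))
    def axpy(self, alpha, x):
        self.data = [a + alpha * b for a, b in zip(self.data, x.data)]

def norm_sq(v: Vector) -> float:
    # An "abstract numerical algorithm": it only touches the interface,
    # so it works unchanged for any concrete vector implementation.
    return v.dot(v)

print(norm_sq(DenseVector([3.0, 4.0])))  # 25.0
```

Iterative solvers built this way never see the storage layout, which is what lets the same algorithm run on serial and distributed vectors.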

  5. Emergence of Coding and its Specificity as a Physico-Informatic Problem

    NASA Astrophysics Data System (ADS)

    Wills, Peter R.; Nieselt, Kay; McCaskill, John S.

    2015-06-01

    We explore the origin-of-life consequences of the view that biological systems are demarcated from inanimate matter by their possession of referential information, which is processed computationally to control choices of specific physico-chemical events. Cells are cybernetic: they use genetic information in processes of communication and control, subjecting physical events to a system of integrated governance. The genetic code is the most obvious example of how cells use information computationally, but the historical origin of the usefulness of molecular information is not well understood. Genetic coding made information useful because it imposed a modular metric on the evolutionary search and thereby offered a general solution to the problem of finding catalysts of any specificity. We use the term "quasispecies symmetry breaking" to describe the iterated process of self-organisation whereby the alphabets of distinguishable codons and amino acids increased, step by step.

  6. Under-coding of secondary conditions in coded hospital health data: Impact of co-existing conditions, death status and number of codes in a record.

    PubMed

    Peng, Mingkai; Southern, Danielle A; Williamson, Tyler; Quan, Hude

    2017-12-01

    This study examined the coding validity of hypertension, diabetes, obesity and depression in relation to the presence of their co-existing conditions, death status and the number of diagnosis codes in a hospital discharge abstract database. We randomly selected 4007 discharge abstract database records from four teaching hospitals in Alberta, Canada and reviewed their charts to extract 31 conditions listed in the Charlson and Elixhauser comorbidity indices. Conditions associated with the four study conditions were identified through multivariable logistic regression. Coding validity (i.e. sensitivity, positive predictive value) of the four conditions was related to the presence of their associated conditions. Sensitivity increased with an increasing number of diagnosis codes. The impact of death status on coding validity was minimal. The coding validity of a condition is closely related to its clinical importance and the complexity of the patients' case mix. We recommend mandatory coding of certain secondary diagnoses to meet the needs of health research based on administrative health data.
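    The validity measures used in studies like this one (sensitivity, specificity, PPV, NPV) are simple functions of a two-by-two table comparing coded data against chart review. A minimal sketch, with illustrative counts only:

```python
def coding_validity(tp, fp, fn, tn):
    """Validity of a diagnosis code against chart review as gold standard.
    tp: coded and truly present, fp: coded but absent,
    fn: present but missed by coding, tn: correctly uncoded."""
    return {
        "sensitivity": tp / (tp + fn),  # coded, among those truly present
        "specificity": tn / (tn + fp),
        "ppv": tp / (tp + fp),          # truly present, among those coded
        "npv": tn / (tn + fn),
    }
```
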

  7. Use abstracted patient-specific features to assist an information-theoretic measurement to assess similarity between medical cases

    PubMed Central

    Cao, Hui; Melton, Genevieve B.; Markatou, Marianthi; Hripcsak, George

    2008-01-01

    Inter-case similarity metrics can potentially help find similar cases from a case base for evidence-based practice. While several methods to measure similarity between cases have been proposed, developing an effective means for measuring patient case similarity remains a challenging problem. We were interested in examining how abstraction could assist in computing case similarity. In this study, abstracted patient-specific features from medical records were used to improve an existing information-theoretic measurement. The developed metric, using a combination of abstracted disease, finding, procedure and medication features, achieved correlations with expert assessments ranging from 0.6012 to 0.6940. PMID:18487093
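    The abstract does not reproduce the authors' exact formula. One common information-theoretic similarity of this general kind (a Lin-style measure, shown here as an assumed illustration, not the paper's actual metric) weights shared abstracted features by their information content:

```python
import math

def information_content(feature, corpus_freq, total):
    """Rarer abstracted features carry more information."""
    return -math.log(corpus_freq[feature] / total)

def case_similarity(case_a, case_b, corpus_freq, total):
    """Lin-style similarity between two cases, each represented as a set
    of abstracted features (diseases, findings, procedures, medications)."""
    ic = lambda feats: sum(information_content(f, corpus_freq, total)
                           for f in feats)
    denom = ic(case_a) + ic(case_b)
    return 2.0 * ic(case_a & case_b) / denom if denom else 0.0
```

    Two identical cases score 1.0, and sharing a rare finding contributes more to similarity than sharing a common diagnosis.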

  8. Generating Customized Verifiers for Automatically Generated Code

    NASA Technical Reports Server (NTRS)

    Denney, Ewen; Fischer, Bernd

    2008-01-01

    Program verification using Hoare-style techniques requires many logical annotations. We have previously developed a generic annotation inference algorithm that weaves in all annotations required to certify safety properties for automatically generated code. It uses patterns to capture generator- and property-specific code idioms and property-specific meta-program fragments to construct the annotations. The algorithm is customized by specifying the code patterns and integrating them with the meta-program fragments for annotation construction. However, this is difficult since it involves tedious and error-prone low-level term manipulations. Here, we describe an annotation schema compiler that largely automates this customization task using generative techniques. It takes a collection of high-level declarative annotation schemas tailored towards a specific code generator and safety property, and generates all customized analysis functions and glue code required for interfacing with the generic algorithm core, thus effectively creating a customized annotation inference algorithm. The compiler raises the level of abstraction and simplifies schema development and maintenance. It also takes care of some more routine aspects of formulating patterns and schemas, in particular handling of irrelevant program fragments and irrelevant variance in the program structure, which reduces the size, complexity, and number of different patterns and annotation schemas that are required. The improvements described here make it easier and faster to customize the system to a new safety property or a new generator, and we demonstrate this by customizing it to certify frame safety of space flight navigation code that was automatically generated from Simulink models by MathWorks' Real-Time Workshop.

  9. Disease-Specific Trends of Comorbidity Coding and Implications for Risk Adjustment in Hospital Administrative Data.

    PubMed

    Nimptsch, Ulrike

    2016-06-01

    To investigate changes in comorbidity coding after the introduction of diagnosis related group (DRG) based prospective payment, and whether trends differ for specific comorbidities. Nationwide administrative data (DRG statistics) from German acute care hospitals from 2005 to 2012. Observational study to analyze trends in comorbidity coding in patients hospitalized for common primary diseases and the effects on comorbidity-related risk of in-hospital death. Comorbidity coding was operationalized by Elixhauser diagnosis groups. The analyses focused on adult patients hospitalized for the primary diseases of heart failure, stroke, and pneumonia, as well as hip fracture. When examining the total frequency of diagnosis groups per record, an increase in depth of coding was observed. Between-hospital variations in depth of coding were present throughout the observation period. Specific comorbidity increases were observed in 15 of the 31 diagnosis groups, and decreases were observed for 11 groups. In patients hospitalized for heart failure, shifts in comorbidity-related risk of in-hospital death occurred in nine diagnosis groups, eight of which were directed toward the null. Comorbidity-adjusted outcomes in longitudinal administrative data analyses may be biased by nonconstant risk over time, changes in completeness of coding, and between-hospital variations in coding. Accounting for such issues is important when the respective observation period coincides with changes in the reimbursement system or other conditions that are likely to alter clinical coding practice. © Health Research and Educational Trust.

  10. A phase code for memory could arise from circuit mechanisms in entorhinal cortex

    PubMed Central

    Hasselmo, Michael E.; Brandon, Mark P.; Yoshida, Motoharu; Giocomo, Lisa M.; Heys, James G.; Fransen, Erik; Newman, Ehren L.; Zilli, Eric A.

    2009-01-01

    Neurophysiological data reveals intrinsic cellular properties that suggest how entorhinal cortical neurons could code memory by the phase of their firing. Potential cellular mechanisms for this phase coding in models of entorhinal function are reviewed. This mechanism for phase coding provides a substrate for modeling the responses of entorhinal grid cells, as well as the replay of neural spiking activity during waking and sleep. Efforts to implement these abstract models in more detailed biophysical compartmental simulations raise specific issues that could be addressed in larger scale population models incorporating mechanisms of inhibition. PMID:19656654

  11. Domain Specific Language Support for Exascale

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sadayappan, Ponnuswamy

    Domain-Specific Languages (DSLs) offer an attractive path to Exascale software since they provide expressive power through appropriate abstractions and enable domain-specific optimizations. But the advantages of a DSL compete with the difficulties of implementing a DSL, even for a narrowly defined domain. The DTEC project addresses how a variety of DSLs can be easily implemented to leverage existing compiler analysis and transformation capabilities within the ROSE open source compiler as part of a research program focusing on Exascale challenges. The OSU contributions to the DTEC project are in the area of code generation from high-level DSL descriptions, as well as verification of the automatically-generated code.

  12. An automatic method to generate domain-specific investigator networks using PubMed abstracts.

    PubMed

    Yu, Wei; Yesupriya, Ajay; Wulf, Anja; Qu, Junfeng; Gwinn, Marta; Khoury, Muin J

    2007-06-20

    Collaboration among investigators has become critical to scientific research. This includes ad hoc collaboration established through personal contacts as well as formal consortia established by funding agencies. Continued growth in online resources for scientific research and communication has promoted the development of highly networked research communities. Extending these networks globally requires identifying additional investigators in a given domain, profiling their research interests, and collecting current contact information. We present a novel strategy for building investigator networks dynamically and producing detailed investigator profiles using data available in PubMed abstracts. We developed a novel strategy to obtain detailed investigator information by automatically parsing the affiliation string in PubMed records. We illustrated the results by using a published literature database in human genome epidemiology (HuGE Pub Lit) as a test case. Our parsing strategy extracted country information from 92.1% of the affiliation strings in a random sample of PubMed records and in 97.0% of HuGE records, with accuracies of 94.0% and 91.0%, respectively. Institution information was parsed from 91.3% of the general PubMed records (accuracy 86.8%) and from 94.2% of HuGE PubMed records (accuracy 87.0%). We demonstrated the application of our approach to dynamic creation of investigator networks by creating a prototype information system containing a large database of PubMed abstracts relevant to human genome epidemiology (HuGE Pub Lit), indexed using PubMed medical subject headings converted to Unified Medical Language System concepts. Our method was able to identify 70-90% of the investigators/collaborators in three different human genetics fields; it also successfully identified 9 of 10 genetics investigators within the PREBIC network, an existing preterm birth research network. We successfully created a web-based prototype capable of creating domain-specific
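    The paper's actual parsing rules are not reproduced in the abstract. A minimal sketch of the idea, treating the PubMed affiliation string as comma-separated segments with the institution first and the country last (the e-mail address in the test below is a fabricated example, and the real method is considerably more elaborate):

```python
import re

def parse_affiliation(affil):
    """Heuristic parse of a PubMed affiliation string into
    (institution, country); illustrative only."""
    affil = re.sub(r"\S+@\S+", "", affil)             # drop e-mail addresses
    parts = [p.strip(" .") for p in affil.split(",") if p.strip(" .")]
    if not parts:
        return None, None
    institution = parts[0]
    country = parts[-1] if len(parts) > 1 else None
    return institution, country
```
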

  13. An automatic method to generate domain-specific investigator networks using PubMed abstracts

    PubMed Central

    Yu, Wei; Yesupriya, Ajay; Wulf, Anja; Qu, Junfeng; Gwinn, Marta; Khoury, Muin J

    2007-01-01

    capable of creating domain-specific investigator networks based on an application that accurately generates detailed investigator profiles from PubMed abstracts combined with robust standard vocabularies. This approach could be used for other biomedical fields to efficiently establish domain-specific investigator networks. PMID:17584920

  14. A domain specific language for performance portable molecular dynamics algorithms

    NASA Astrophysics Data System (ADS)

    Saunders, William Robert; Grant, James; Müller, Eike Hermann

    2018-03-01

    Developers of Molecular Dynamics (MD) codes face significant challenges when adapting existing simulation packages to new hardware. In a continuously diversifying hardware landscape it becomes increasingly difficult for scientists to be experts both in their own domain (physics/chemistry/biology) and specialists in the low level parallelisation and optimisation of their codes. To address this challenge, we describe a "Separation of Concerns" approach for the development of parallel and optimised MD codes: the science specialist writes code at a high abstraction level in a domain specific language (DSL), which is then translated into efficient computer code by a scientific programmer. In a related context, an abstraction for the solution of partial differential equations with grid based methods has recently been implemented in the (Py)OP2 library. Inspired by this approach, we develop a Python code generation system for molecular dynamics simulations on different parallel architectures, including massively parallel distributed memory systems and GPUs. We demonstrate the efficiency of the auto-generated code by studying its performance and scalability on different hardware and compare it to other state-of-the-art simulation packages. With growing data volumes the extraction of physically meaningful information from the simulation becomes increasingly challenging and requires equally efficient implementations. A particular advantage of our approach is the easy expression of such analysis algorithms. We consider two popular methods for deducing the crystalline structure of a material from the local environment of each atom, show how they can be expressed in our abstraction and implement them in the code generation framework.
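    The "Separation of Concerns" idea can be illustrated with a toy Python analogue (the actual framework generates optimised parallel code; the names and kernel signature below are assumptions): the domain scientist writes only a per-pair kernel, and generated looping machinery applies it over all particle pairs.

```python
def execute_pairwise(kernel, positions, state):
    """Generated looping machinery: applies a per-pair kernel to every
    ordered pair of particles. Parallelisation and optimisation would
    live here, not in the domain scientist's kernel."""
    n = len(positions)
    for i in range(n):
        for j in range(n):
            if i != j:
                kernel(positions[i], positions[j], state)

def lj_energy(ri, rj, state, eps=1.0, sigma=1.0):
    """Domain-level code: accumulate a Lennard-Jones pair energy.
    The 0.5 factor corrects for each pair being visited twice."""
    r2 = sum((a - b) ** 2 for a, b in zip(ri, rj))
    sr6 = (sigma * sigma / r2) ** 3
    state["U"] += 0.5 * 4.0 * eps * (sr6 * sr6 - sr6)
```

    Analysis algorithms such as the local-environment classifiers mentioned above would be expressed the same way: as small per-pair or per-particle kernels handed to the generated looping code.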

  15. Design implications for task-specific search utilities for retrieval and re-engineering of code

    NASA Astrophysics Data System (ADS)

    Iqbal, Rahat; Grzywaczewski, Adam; Halloran, John; Doctor, Faiyaz; Iqbal, Kashif

    2017-05-01

    The importance of information retrieval systems is unquestionable in modern society, and individuals as well as enterprises recognise the benefits of being able to find information effectively. Current code-focused information retrieval systems such as Google Code Search, Codeplex or Koders produce results based on specific keywords. However, these systems do not take into account developers' context, such as the development language, technology framework, goal of the project, project complexity and the developer's domain expertise. They also impose an additional cognitive burden on users in switching between different interfaces and clicking through to find the relevant code. Hence, they are not used by software developers. In this paper, we discuss how software engineers interact with information and general-purpose information retrieval systems (e.g. Google, Yahoo!) and investigate to what extent domain-specific search and recommendation utilities can be developed to support their work-related activities. To investigate this, we conducted a user study and found that software engineers followed many identifiable and repeatable work tasks and behaviours. These behaviours can be used to develop implicit relevance feedback-based systems based on the observed retention actions. Moreover, we discuss the implications for the development of task-specific search and collaborative recommendation utilities embedded within the Google standard search engine and Microsoft IntelliSense for the retrieval and re-engineering of code. Based on implicit relevance feedback, we have implemented a prototype of the proposed collaborative recommendation system, which was evaluated in a controlled environment simulating the real-world situation of professional software engineers. The evaluation achieved promising initial results on the precision and recall performance of the system.

  16. The PP1 binding code: a molecular-lego strategy that governs specificity.

    PubMed

    Heroes, Ewald; Lesage, Bart; Görnemann, Janina; Beullens, Monique; Van Meervelt, Luc; Bollen, Mathieu

    2013-01-01

    Ser/Thr protein phosphatase 1 (PP1) is a single-domain hub protein with nearly 200 validated interactors in vertebrates. PP1-interacting proteins (PIPs) are ubiquitously expressed but show an exceptional diversity in brain, testis and white blood cells. The binding of PIPs is mainly mediated by short motifs that dock to surface grooves of PP1. Although PIPs often contain variants of the same PP1 binding motifs, they differ in the number and combination of docking sites. This molecular-lego strategy for binding to PP1 creates holoenzymes with unique properties. The PP1 binding code can be described as specific, universal, degenerate, nonexclusive and dynamic. PIPs control associated PP1 by interference with substrate recruitment or access to the active site. In addition, some PIPs have a subcellular targeting domain that promotes dephosphorylation by increasing the local concentration of PP1. The diversity of the PP1 interactome and the properties of the PP1 binding code account for the exquisite specificity of PP1 in vivo. © 2012 The Authors Journal compilation © 2012 FEBS.

  17. California State Library: Processing Center Design and Specifications. Volume III, Coding Manual.

    ERIC Educational Resources Information Center

    Sherman, Don; Shoffner, Ralph M.

    As part of the report on the California State Library Processing Center design and specifications, this volume is a coding manual for the conversion of catalog card data to a machine-readable form. The form is compatible with the national MARC system, while at the same time it contains provisions for problems peculiar to the local situation. This…

  18. Non-coding cancer driver candidates identified with a sample- and position-specific model of the somatic mutation rate

    PubMed Central

    Juul, Malene; Bertl, Johanna; Guo, Qianyun; Nielsen, Morten Muhlig; Świtnicki, Michał; Hornshøj, Henrik; Madsen, Tobias; Hobolth, Asger; Pedersen, Jakob Skou

    2017-01-01

    Non-coding mutations may drive cancer development. Statistical detection of non-coding driver regions is challenged by a varying mutation rate and uncertainty about functional impact. Here, we develop a statistically founded non-coding driver-detection method, ncdDetect, which includes sample-specific mutational signatures, long-range mutation rate variation, and position-specific impact measures. Using ncdDetect, we screened non-coding regulatory regions of protein-coding genes across a pan-cancer set of whole genomes (n = 505), which ranked known drivers at the top and identified new candidates. For individual candidates, the presence of non-coding mutations associates with altered expression or decreased patient survival across an independent pan-cancer sample set (n = 5454). This includes an antigen-presenting gene (CD1A), where 5'UTR mutations correlate significantly with decreased survival in melanoma. Additionally, mutations in a base-excision-repair gene (SMUG1) correlate with a C-to-T mutational signature. Overall, we find that a rich model of mutational heterogeneity facilitates non-coding driver identification, and integrative analysis points to candidates of potential clinical relevance. DOI: http://dx.doi.org/10.7554/eLife.21778.001 PMID:28362259

  19. Effects of syntactic structure in the memory of concrete and abstract Chinese sentences.

    PubMed

    Ho, C S; Chen, H C

    1993-09-01

    Smith (1981) found that concrete English sentences were better recognized than abstract sentences and that this concreteness effect was potent only when the concrete sentence was also affirmative; the effect reversed when the concrete sentence was negative. These results were partially replicated in Experiment 1 by using materials from a very different language (i.e., Chinese): concrete-affirmative sentences were better remembered than concrete-negative and abstract sentences, but no reliable difference was found between the latter two types. In Experiment 2, the task was modified by using a visual presentation instead of an oral one as in Experiment 1. Both concrete-affirmative and concrete-negative sentences were better memorized than abstract ones in Experiment 2. The findings in the two experiments are explained by a combination of the dual-coding model and Marschark's (1985) item-specific and relational processing. The differential effects of experience with different language systems on processing verbal materials in memory are also discussed.

  20. Abstract feature codes: The building blocks of the implicit learning system.

    PubMed

    Eberhardt, Katharina; Esser, Sarah; Haider, Hilde

    2017-07-01

    According to the Theory of Event Coding (TEC; Hommel, Müsseler, Aschersleben, & Prinz, 2001), action and perception are represented in a shared format in the cognitive system by means of feature codes. In implicit sequence learning research, it is still common to make a conceptual difference between independent motor and perceptual sequences. This supposedly independent learning takes place in encapsulated modules (Keele, Ivry, Mayr, Hazeltine, & Heuer 2003) that process information along single dimensions. These dimensions have remained underspecified so far. It is especially not clear whether stimulus and response characteristics are processed in separate modules. Here, we suggest that feature dimensions as they are described in the TEC should be viewed as the basic content of modules of implicit learning. This means that the modules process all stimulus and response information related to certain feature dimensions of the perceptual environment. In 3 experiments, we investigated by means of a serial reaction time task the nature of the basic units of implicit learning. As a test case, we used stimulus location sequence learning. The results show that a stimulus location sequence and a response location sequence cannot be learned without interference (Experiment 2) unless one of the sequences can be coded via an alternative, nonspatial dimension (Experiment 3). These results support the notion that spatial location is one module of the implicit learning system and, consequently, that there are no separate processing units for stimulus versus response locations. (PsycINFO Database Record (c) 2017 APA, all rights reserved).

  1. Processing concrete words: fMRI evidence against a specific right-hemisphere involvement.

    PubMed

    Fiebach, Christian J; Friederici, Angela D

    2004-01-01

    Behavioral, patient, and electrophysiological studies have been taken as support for the assumption that processing of abstract words is confined to the left hemisphere, whereas concrete words are processed also by right-hemispheric brain areas. These are thought to provide additional information from an imaginal representational system, as postulated in the dual-coding theory of memory and cognition. Here we report new event-related fMRI data on the processing of concrete and abstract words in a lexical decision task. While abstract words activated a subregion of the left inferior frontal gyrus (BA 45) more strongly than concrete words, specific activity for concrete words was observed in the left basal temporal cortex. These data as well as data from other neuroimaging studies reviewed here are not compatible with the assumption of a specific right-hemispheric involvement for concrete words. The combined findings rather suggest a revised view of the neuroanatomical bases of the imaginal representational system assumed in the dual-coding theory, at least with respect to word recognition.

  2. An abstract specification language for Markov reliability models

    NASA Technical Reports Server (NTRS)

    Butler, R. W.

    1985-01-01

    Markov models can be used to compute the reliability of virtually any fault tolerant system. However, the process of delineating all of the states and transitions in a model of a complex system can be devastatingly tedious and error-prone. An approach to this problem is presented utilizing an abstract model definition language. This high-level language is described informally and illustrated by example.
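    The report's language itself is not shown here, but the underlying idea, stating a model abstractly and expanding it mechanically into explicit states and transitions, can be sketched in Python. The sketch below uses a discrete-time chain for simplicity (reliability models of this kind are typically continuous-time with failure rates); the model and all names are illustrative.

```python
from math import comb

def build_chain(n_units, p_fail, min_working):
    """Expand an abstract spec ('n identical units, each failing with
    probability p_fail per step, system down below min_working') into an
    explicit Markov chain: a state list and a transition-probability table."""
    states = list(range(n_units, min_working - 1, -1)) + ["FAILED"]
    T = {s: {} for s in states}
    for w in range(n_units, min_working - 1, -1):
        for k in range(w + 1):                       # k units fail this step
            prob = comb(w, k) * p_fail ** k * (1 - p_fail) ** (w - k)
            dest = w - k if w - k >= min_working else "FAILED"
            T[w][dest] = T[w].get(dest, 0.0) + prob
    T["FAILED"]["FAILED"] = 1.0                      # absorbing failure state
    return states, T

def reliability(n_units, p_fail, min_working, steps):
    """Probability the system has not failed after the given number of steps."""
    states, T = build_chain(n_units, p_fail, min_working)
    dist = {s: 0.0 for s in states}
    dist[n_units] = 1.0
    for _ in range(steps):
        nxt = {s: 0.0 for s in states}
        for s, row in T.items():
            for d, p in row.items():
                nxt[d] += dist[s] * p
        dist = nxt
    return 1.0 - dist["FAILED"]
```

    The point is that the modeller writes only the few abstract rules; the tool enumerates the states and transitions, which is exactly the tedious, error-prone step the paper seeks to automate.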

  3. Single nucleotide polymorphism-specific regulation of matrix metalloproteinase-9 by multiple miRNAs targeting the coding exon

    PubMed Central

    Duellman, Tyler; Warren, Christopher; Yang, Jay

    2014-01-01

    Microribonucleic acids (miRNAs) work with exquisite specificity and are able to distinguish a target from a non-target based on a single nucleotide mismatch in the core nucleotide domain. We questioned whether miRNA regulation of gene expression could occur in a single nucleotide polymorphism (SNP)-specific manner, manifesting as a post-transcriptional control of expression of genetic polymorphisms. In our recent study of the functional consequences of matrix metalloproteinase (MMP)-9 SNPs, we discovered that expression of a coding exon SNP in the pro-domain of the protein resulted in a profound decrease in the secreted protein. This missense SNP results in the N38S amino acid change and a loss of an N-glycosylation site. A systematic study demonstrated that the loss of secreted protein was due not to the loss of an N-glycosylation site, but rather an SNP-specific targeting by miR-671-3p and miR-657. Bioinformatics analysis identified 41 SNP-specific miRNA targeting MMP-9 SNPs, mostly in the coding exon and an extension of the analysis to chromosome 20, where the MMP-9 gene is located, suggesting that SNP-specific miRNAs targeting the coding exon are prevalent. This selective post-transcriptional regulation of a target messenger RNA harboring genetic polymorphisms by miRNAs offers an SNP-dependent post-transcriptional regulatory mechanism, allowing for polymorphic-specific differential gene regulation. PMID:24627221

  4. Interpreting Abstract Interpretations in Membership Equational Logic

    NASA Technical Reports Server (NTRS)

    Fischer, Bernd; Rosu, Grigore

    2001-01-01

    We present a logical framework in which abstract interpretations can be naturally specified and then verified. Our approach is based on membership equational logic, which extends equational logics by membership axioms asserting that a term has a certain sort. We represent an abstract interpretation as a membership equational logic specification, usually as an overloaded order-sorted signature with membership axioms. It turns out that, for any term, its least sort over this specification corresponds to its most concrete abstract value. Maude implements membership equational logic and provides mechanisms to calculate the least sort of a term efficiently. We first show how Maude can be used to get prototyping of abstract interpretations "for free." Building on the meta-logic facilities of Maude, we further develop a tool that automatically checks an abstract interpretation against a set of user-defined properties. This can be used to select an appropriate abstract interpretation, to characterize the specified loss of information during abstraction, and to compare different abstractions with each other.

  5. Abstraction and reformulation in artificial intelligence.

    PubMed Central

    Holte, Robert C.; Choueiry, Berthe Y.

    2003-01-01

    This paper contributes in two ways to the aims of this special issue on abstraction. The first is to show that there are compelling reasons motivating the use of abstraction in the purely computational realm of artificial intelligence. The second is to contribute to the overall discussion of the nature of abstraction by providing examples of the abstraction processes currently used in artificial intelligence. Although each type of abstraction is specific to a somewhat narrow context, it is hoped that collectively they illustrate the richness and variety of abstraction in its fullest sense. PMID:12903653

  6. Abstraction and reformulation in artificial intelligence.

    PubMed

    Holte, Robert C; Choueiry, Berthe Y

    2003-07-29

    This paper contributes in two ways to the aims of this special issue on abstraction. The first is to show that there are compelling reasons motivating the use of abstraction in the purely computational realm of artificial intelligence. The second is to contribute to the overall discussion of the nature of abstraction by providing examples of the abstraction processes currently used in artificial intelligence. Although each type of abstraction is specific to a somewhat narrow context, it is hoped that collectively they illustrate the richness and variety of abstraction in its fullest sense.

  7. Imaginal, semantic, and surface-level processing of concrete and abstract words: an electrophysiological investigation.

    PubMed

    West, W C; Holcomb, P J

    2000-11-01

    Words representing concrete concepts are processed more quickly and efficiently than words representing abstract concepts. Concreteness effects have also been observed in studies using event-related brain potentials (ERPs). The aim of this study was to examine concrete and abstract words using both reaction time (RT) and ERP measurements to determine (1) at what point in the stream of cognitive processing concreteness effects emerge and (2) how different types of cognitive operations influence these concreteness effects. Three groups of subjects performed a sentence verification task in which the final word of each sentence was concrete or abstract. For each group the truthfulness judgment required either (1) image generation, (2) semantic decision, or (3) evaluation of surface characteristics. Concrete and abstract words produced similar RTs and ERPs in the surface task, suggesting that postlexical semantic processing is necessary to elicit concreteness effects. In both the semantic and imagery tasks, RTs were shorter for concrete than for abstract words. This difference was greatest in the imagery task. Also, in both of these tasks concrete words elicited more negative ERPs than abstract words between 300 and 550 msec (N400). This effect was widespread across the scalp and may reflect activation in a linguistic semantic system common to both concrete and abstract words. ERPs were also more negative for concrete than abstract words between 550 and 800 msec. This effect was more frontally distributed and was most evident in the imagery task. We propose that this later anterior effect represents a distinct ERP component (N700) that is sensitive to the use of mental imagery. The N700 may reflect access to specific characteristics of the imaged item or activation in a working memory system specific to mental imagery.
These results also support the extended dual-coding hypothesis that superior associative connections and the use of mental imagery both contribute

  8. Vague Language in Conference Abstracts

    ERIC Educational Resources Information Center

    Cutting, Joan

    2012-01-01

    This study examined abstracts for a British Association for Applied Linguistics conference and a Sociolinguistics Symposium, to define the genre of conference abstracts in terms of vague language, specifically universal general nouns (e.g. people) and research general nouns (e.g. results), and to discover if the language used reflected the level…

  9. Abstract Datatypes in PVS

    NASA Technical Reports Server (NTRS)

    Owre, Sam; Shankar, Natarajan

    1997-01-01

    PVS (Prototype Verification System) is a general-purpose environment for developing specifications and proofs. This document deals primarily with the abstract datatype mechanism in PVS which generates theories containing axioms and definitions for a class of recursive datatypes. The concepts underlying the abstract datatype mechanism are illustrated using ordered binary trees as an example. Binary trees are described by a PVS abstract datatype that is parametric in its value type. The type of ordered binary trees is then presented as a subtype of binary trees where the ordering relation is also taken as a parameter. We define the operations of inserting an element into, and searching for an element in an ordered binary tree; the bulk of the report is devoted to PVS proofs of some useful properties of these operations. These proofs illustrate various approaches to proving properties of abstract datatype operations. They also describe the built-in capabilities of the PVS proof checker for simplifying abstract datatype expressions.
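    The ordered-binary-tree example has a direct analogue outside PVS. The sketch below mirrors the parametric datatype and its insert/search operations in Python (illustrative only; PVS expresses this as a datatype theory, with the ordering relation as a theory parameter and the ordering invariant as a subtype predicate):

```python
class Leaf:
    """Empty tree (the base constructor of the datatype)."""

class Node:
    """Non-empty tree: left subtree, value, right subtree."""
    def __init__(self, left, value, right):
        self.left, self.value, self.right = left, value, right

LEAF = Leaf()

def insert(t, x, lt=lambda a, b: a < b):
    """Insert x while preserving the ordering invariant; lt plays the
    role of the ordering-relation parameter in the PVS theory."""
    if isinstance(t, Leaf):
        return Node(LEAF, x, LEAF)
    if lt(x, t.value):
        return Node(insert(t.left, x, lt), t.value, t.right)
    if lt(t.value, x):
        return Node(t.left, t.value, insert(t.right, x, lt))
    return t  # x already present

def search(t, x, lt=lambda a, b: a < b):
    """Membership test, exploiting the ordering invariant to descend
    into only one subtree at each node."""
    if isinstance(t, Leaf):
        return False
    if lt(x, t.value):
        return search(t.left, x, lt)
    if lt(t.value, x):
        return search(t.right, x, lt)
    return True
```

    A typical property proved in the report is of the form "search finds what insert put in", i.e. search(insert(t, x), x) holds for every ordered tree t.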

  10. Specific expression of novel long non-coding RNAs in high-hyperdiploid childhood acute lymphoblastic leukemia

    PubMed Central

    Drouin, Simon; Caron, Maxime; St-Onge, Pascal; Gioia, Romain; Richer, Chantal; Oualkacha, Karim; Droit, Arnaud; Sinnett, Daniel

    2017-01-01

    Pre-B cell childhood acute lymphoblastic leukemia (pre-B cALL) is a heterogeneous disease involving many subtypes typically stratified using a combination of cytogenetic and molecular-based assays. These methods, although widely used, rely on the presence of known chromosomal translocations, which is a limiting factor. There is therefore a need for robust, sensitive, and specific molecular biomarkers unaffected by such limitations that would allow better risk stratification and consequently better clinical outcome. In this study we performed a transcriptome analysis of 56 pre-B cALL patients to identify expression signatures in different subtypes. In both protein-coding and long non-coding RNAs (lncRNA), we identified subtype-specific gene signatures distinguishing pre-B cALL subtypes, particularly in t(12;21) and hyperdiploid cases. The genes up-regulated in pre-B cALL subtypes were enriched in bivalent chromatin marks in their promoters. LncRNAs are a new and understudied class of transcripts. The subtype-specific nature of lncRNAs suggests they may be suitable clinical biomarkers to guide risk stratification and targeted therapies in pre-B cALL patients. PMID:28346506

  11. The Representation of Abstract Words: Why Emotion Matters

    ERIC Educational Resources Information Center

    Kousta, Stavroula-Thaleia; Vigliocco, Gabriella; Vinson, David P.; Andrews, Mark; Del Campo, Elena

    2011-01-01

    Although much is known about the representation and processing of concrete concepts, knowledge of what abstract semantics might be is severely limited. In this article we first address the adequacy of the 2 dominant accounts (dual coding theory and the context availability model) put forward in order to explain representation and processing…

  12. Error-correction coding

    NASA Technical Reports Server (NTRS)

    Hinds, Erold W. (Principal Investigator)

    1996-01-01

    This report describes the progress made towards the completion of a specific task on error-correcting coding. The proposed research consisted of investigating the use of modulation block codes as the inner code of a concatenated coding system in order to improve the overall space link communications performance. The study proposed to identify and analyze candidate codes that will complement the performance of the overall coding system which uses the interleaved RS (255,223) code as the outer code.
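
    The interleaved RS (255,223) outer code is far too involved to reproduce here; the toy sketch below, with a hypothetical 3-bit repetition inner code and a single parity bit standing in for the outer code, illustrates only the structure of a concatenated system (outer encode, then inner encode; inner decode, then outer check):

```python
def inner_encode(bits):            # toy inner code: 3x bit repetition
    return [b for bit in bits for b in (bit, bit, bit)]

def inner_decode(bits):            # majority vote over each 3-bit group
    return [1 if sum(bits[i:i + 3]) >= 2 else 0 for i in range(0, len(bits), 3)]

def outer_encode(bits):            # toy outer code: append an even-parity bit
    return bits + [sum(bits) % 2]

def outer_check(bits):             # detect residual errors the inner code missed
    return sum(bits) % 2 == 0

# concatenated transmission: outer encode first, inner encode second
msg = [1, 0, 1, 1]
tx = inner_encode(outer_encode(msg))
tx[4] ^= 1                         # channel flips one bit
rx = inner_decode(tx)              # inner code corrects the flip
assert outer_check(rx) and rx[:-1] == msg
```

    A real concatenated space link replaces the repetition code with a modulation block code and the parity bit with Reed-Solomon symbols, with interleaving between the two stages to spread burst errors.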

  13. Three stages during the evolution of the genetic code. [Abstract only

    NASA Technical Reports Server (NTRS)

    Baumann, U.; Oro, J.

    1994-01-01

    A diversification of the genetic code based on the number of codons available for the proteinous amino acids is established. Three groups of amino acids, corresponding to stages in the evolution of the code, are distinguished. On the basis of chemical complexity and small codon number, the amino acids that emerged later in the translation process are identified. Both criteria indicate that His, Phe, Tyr, Cys and either Lys or Asn were introduced in the second stage, whereas the number of codons alone gives evidence that Trp and Met were introduced in the third stage. The amino acids of stage one use purine-rich codons, and thus purines have been retained in their third codon position. All the amino acids introduced in the second stage, in contrast, use pyrimidines in this codon position. A low abundance of pyrimidines during early translation is derived. This assumption is supported by experiments on non-enzymatic replication and interactions of DNA hairpin loops with a complementary strand. Back-extrapolation suggests a high purine content in the first nucleic acids, which gradually decreased during their evolution. Amino acids independently available from prebiotic synthesis were thus correlated with purine-rich codons. Conclusions on prebiotic replication are discussed also in the light of recent codon usage data.
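
    One of the abstract's claims, that the second-stage amino acids use pyrimidines in the third codon position, can be checked directly against the standard genetic code (taking Asn for the "Lys or Asn" alternative); the codon lists below are from the standard code table, not from the paper:

```python
# Codons (standard genetic code, RNA alphabet) for the amino acids the
# abstract assigns to the second stage of code evolution.
second_stage = {
    "His": ["CAU", "CAC"],
    "Phe": ["UUU", "UUC"],
    "Tyr": ["UAU", "UAC"],
    "Cys": ["UGU", "UGC"],
    "Asn": ["AAU", "AAC"],
}

pyrimidines = {"C", "U"}
third_positions = {codon[2] for codons in second_stage.values() for codon in codons}
# every second-stage codon ends in a pyrimidine, as the abstract states
assert third_positions <= pyrimidines
```

    Note that Lys (AAA, AAG) would break the pattern, which is presumably why the abstract hedges with "either Lys or Asn".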

  14. Abstract Interpreters for Free

    NASA Astrophysics Data System (ADS)

    Might, Matthew

    In small-step abstract interpretations, the concrete and abstract semantics bear an uncanny resemblance. In this work, we present an analysis-design methodology that both explains and exploits that resemblance. Specifically, we present a two-step method to convert a small-step concrete semantics into a family of sound, computable abstract interpretations. The first step re-factors the concrete state-space to eliminate recursive structure; this refactoring of the state-space simultaneously determines a store-passing-style transformation on the underlying concrete semantics. The second step uses inference rules to generate an abstract state-space and a Galois connection simultaneously. The Galois connection allows the calculation of the "optimal" abstract interpretation. The two-step process is unambiguous, but nondeterministic: at each step, analysis designers face choices. Some of these choices ultimately influence properties such as flow-, field- and context-sensitivity. Thus, under the method, we can give the emergence of these properties a graph-theoretic characterization. To illustrate the method, we systematically abstract the continuation-passing style lambda calculus to arrive at two distinct families of analyses. The first is the well-known k-CFA family of analyses. The second consists of novel "environment-centric" abstract interpretations, none of which appear in the literature on static analysis of higher-order programs.
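
    The paper derives abstract interpreters systematically via a Galois connection; as a far smaller illustration of how an abstract semantics resembles the concrete one, here is a hypothetical sign-domain interpreter whose recursive structure is identical to the concrete evaluator it abstracts (expression encoding and names are assumptions, not from the paper):

```python
# Concrete semantics: ordinary integer arithmetic over expression trees
def conc_eval(e):
    op, *args = e
    if op == "lit":
        return args[0]
    l, r = conc_eval(args[0]), conc_eval(args[1])
    return l + r if op == "add" else l * r

# Abstraction side of a Galois connection: int -> sign
def alpha(n):
    return "neg" if n < 0 else ("zero" if n == 0 else "pos")

# Sound abstract operators on the sign domain {neg, zero, pos, top}
def abs_add(a, b):
    if a == "zero": return b
    if b == "zero": return a
    return a if a == b else "top"       # pos + neg could be anything

def abs_mul(a, b):
    if "zero" in (a, b): return "zero"  # 0 * anything = 0
    if "top" in (a, b): return "top"
    return "pos" if a == b else "neg"

# Abstract semantics: same shape as conc_eval, abstract operators swapped in
def abs_eval(e):
    op, *args = e
    if op == "lit":
        return alpha(args[0])
    l, r = abs_eval(args[0]), abs_eval(args[1])
    return abs_add(l, r) if op == "add" else abs_mul(l, r)
```

    Soundness here means abs_eval(e) either equals alpha(conc_eval(e)) or is "top"; the paper's method makes this resemblance between the two evaluators a systematic construction rather than a coincidence.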

  15. Abstraction and Assume-Guarantee Reasoning for Automated Software Verification

    NASA Technical Reports Server (NTRS)

    Chaki, S.; Clarke, E.; Giannakopoulou, D.; Pasareanu, C. S.

    2004-01-01

    Compositional verification and abstraction are the key techniques to address the state explosion problem associated with model checking of concurrent software. A promising compositional approach is to prove properties of a system by checking properties of its components in an assume-guarantee style. This article proposes a framework for performing abstraction and assume-guarantee reasoning of concurrent C code in an incremental and fully automated fashion. The framework uses predicate abstraction to extract and refine finite state models of software and it uses an automata learning algorithm to incrementally construct assumptions for the compositional verification of the abstract models. The framework can be instantiated with different assume-guarantee rules. We have implemented our approach in the COMFORT reasoning framework and we show how COMFORT outperforms several previous software model checking approaches when checking safety properties of non-trivial concurrent programs.

  16. Interfacing modules for integrating discipline specific structural mechanics codes

    NASA Technical Reports Server (NTRS)

    Endres, Ned M.

    1989-01-01

    An outline of the organization and capabilities of the Engine Structures Computational Simulator (Simulator) at NASA Lewis Research Center is given. One of the goals of the research at Lewis is to integrate various discipline-specific structural mechanics codes into a software system which can be brought to bear effectively on a wide range of engineering problems. This system must possess the qualities of being effective and efficient while still remaining user friendly. The simulator was initially designed for the finite element simulation of gas jet engine components. Currently, the simulator is restricted to the analysis of high pressure turbine blades and the accompanying rotor assembly, although the current installation can be expanded for other applications. The simulator presently assists the user throughout its procedures by performing information management tasks, executing external support tasks, organizing analysis modules, and executing these modules in the user-defined order while maintaining processing continuity.

  17. Inheritance-mode specific pathogenicity prioritization (ISPP) for human protein coding genes.

    PubMed

    Hsu, Jacob Shujui; Kwan, Johnny S H; Pan, Zhicheng; Garcia-Barcelo, Maria-Mercè; Sham, Pak Chung; Li, Miaoxin

    2016-10-15

    Exome sequencing studies have facilitated the detection of causal genetic variants in yet-unsolved Mendelian diseases. However, the identification of disease causal genes among a list of candidates in an exome sequencing study is still not fully settled, and it is often difficult to prioritize candidate genes for follow-up studies. The inheritance mode provides crucial information for understanding Mendelian diseases, but none of the existing gene prioritization tools fully utilize this information. We examined the characteristics of Mendelian disease genes under different inheritance modes. The results suggest that Mendelian disease genes with the autosomal dominant (AD) inheritance mode are more sensitive to haploinsufficiency and to de novo mutations, whereas autosomal recessive (AR) genes have significantly more non-synonymous variants and regulatory transcript isoforms. In addition, the X-linked (XL) Mendelian disease genes have fewer non-synonymous and synonymous variants. As a result, we derived a new scoring system for prioritizing candidate genes for Mendelian diseases according to the inheritance mode. Our scoring system assigned to each annotated protein-coding gene (N = 18 859) three pathogenic scores according to the inheritance mode (AD, AR and XL). This inheritance mode-specific framework achieved higher accuracy (area under curve = 0.84) in XL mode. The inheritance-mode specific pathogenicity prioritization (ISPP) outperformed other well-known methods including Haploinsufficiency, Recessive, Network centrality, Genic Intolerance, Gene Damage Index and Gene Constraint scores. This systematic study suggests that genes manifesting disease inheritance modes tend to have unique characteristics. ISPP is included in KGGSeq v1.0 (http://grass.cgs.hku.hk/limx/kggseq/), and source code is available from (https://github.com/jacobhsu35/ISPP.git). Contact: mxli@hku.hk. Supplementary information: Supplementary data are available at Bioinformatics online. © The Author

  18. Representations of temporal information in short-term memory: Are they modality-specific?

    PubMed

    Bratzke, Daniel; Quinn, Katrina R; Ulrich, Rolf; Bausenhart, Karin M

    2016-10-01

    Rattat and Picard (2012) reported that the coding of temporal information in short-term memory is modality-specific, that is, temporal information received via the visual (auditory) modality is stored as a visual (auditory) code. This conclusion was supported by modality-specific interference effects on visual and auditory duration discrimination, which were induced by secondary tasks (visual tracking or articulatory suppression), presented during a retention interval. The present study assessed the stability of these modality-specific interference effects. Our study did not replicate the selective interference pattern but rather indicated that articulatory suppression not only impairs short-term memory for auditory but also for visual durations. This result pattern supports a crossmodal or an abstract view of temporal encoding. Copyright © 2016 Elsevier B.V. All rights reserved.

  19. Knowledge acquisition for temporal abstraction.

    PubMed

    Stein, A; Musen, M A; Shahar, Y

    1996-01-01

    Temporal abstraction is the task of detecting relevant patterns in data over time. The knowledge-based temporal-abstraction method uses knowledge about a clinical domain's contexts, external events, and parameters to create meaningful interval-based abstractions from raw time-stamped clinical data. In this paper, we describe the acquisition and maintenance of domain-specific temporal-abstraction knowledge. Using the PROTEGE-II framework, we have designed a graphical tool for acquiring temporal knowledge directly from expert physicians, maintaining the knowledge in a sharable form, and converting the knowledge into a suitable format for use by an appropriate problem-solving method. In initial tests, the tool offered significant gains in our ability to rapidly acquire temporal knowledge and to use that knowledge to perform automated temporal reasoning.
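
    The core of the temporal-abstraction task, collapsing raw time-stamped data into qualitative interval-based abstractions, can be sketched in a few lines. The glucose thresholds and state names below are hypothetical illustrations, not taken from the PROTEGE-II knowledge base:

```python
def abstract_intervals(samples, classify):
    """Collapse time-stamped raw values into interval-based state abstractions.
    samples: list of (time, value) pairs in time order;
    classify: maps a raw value to a qualitative state label."""
    intervals = []
    for t, v in samples:
        state = classify(v)
        if intervals and intervals[-1][2] == state:
            # extend the open interval while the qualitative state persists
            intervals[-1] = (intervals[-1][0], t, state)
        else:
            intervals.append((t, t, state))
    return intervals

# Hypothetical example: blood-glucose readings (mg/dL) over time
def glucose_state(v):
    return "LOW" if v < 70 else ("HIGH" if v > 180 else "NORMAL")

readings = [(0, 95), (1, 100), (2, 190), (3, 210), (4, 110)]
print(abstract_intervals(readings, glucose_state))
# one NORMAL interval, one HIGH interval, then NORMAL again
```

    The knowledge-based method goes well beyond this sketch, using context and external-event knowledge to decide when two intervals may be joined; the acquisition tool described in the paper exists to capture exactly that domain-specific knowledge from physicians.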

  20. Code Compression for DSP

    DTIC Science & Technology

    1998-12-01

    (Only report-documentation-form residue and citation fragments of this abstract survive, including: [Liao95] S. Liao, S. Devadas, K. Keutzer, "Code Density Optimization for Embedded DSP Processors Using Data Compression".)

  1. Tristan code and its application

    NASA Astrophysics Data System (ADS)

    Nishikawa, K.-I.

    Since TRISTAN: The 3-D Electromagnetic Particle Code was introduced in 1990, it has been used for many applications, including simulations of the global solar wind-magnetosphere interaction. The most essential ingredients of this code have been published in the ISSS-4 book. In this abstract we describe some of the issues and an application of this code for the study of global solar wind-magnetosphere interaction, including a substorm study. The basic code (tristan.f) for the global simulation and a local simulation of reconnection with a Harris model (issrec2.f) are available at http:/www.physics.rutger.edu/~kenichi. For beginners, the code (isssrc2.f) with simpler boundary conditions is a suitable starting point for running simulations. The future of global particle simulations for a geospace general circulation model (GGCM) with predictive capability (for the Space Weather Program) is discussed.

  2. Specificity Protein (Sp) Transcription Factors and Metformin Regulate Expression of the Long Non-coding RNA HULC

    EPA Science Inventory

    There is evidence that specificity protein 1 (Sp1) transcription factor (TF) regulates expression of long non-coding RNAs (lncRNAs) in hepatocellular carcinoma (HCC) cells. RNA interference (RNAi) studies showed that among several lncRNAs expressed in HepG2, SNU-449 and SK-Hep-1...

  3. Creating Semantic Waves: Using Legitimation Code Theory as a Tool to Aid the Teaching of Chemistry

    ERIC Educational Resources Information Center

    Blackie, Margaret A. L.

    2014-01-01

    This is a conceptual paper aimed at chemistry educators. The purpose of this paper is to illustrate the use of the semantic code of Legitimation Code Theory in chemistry teaching. Chemistry is an abstract subject which many students struggle to grasp. Legitimation Code Theory provides a way of separating out abstraction from complexity both of…

  4. A graphically oriented specification language for automatic code generation. GRASP/Ada: A Graphical Representation of Algorithms, Structure, and Processes for Ada, phase 1

    NASA Technical Reports Server (NTRS)

    Cross, James H., II; Morrison, Kelly I.; May, Charles H., Jr.; Waddel, Kathryn C.

    1989-01-01

    The first phase of a three-phase effort to develop a new graphically oriented specification language which will facilitate the reverse engineering of Ada source code into graphical representations (GRs) as well as the automatic generation of Ada source code is described. A simplified view of the three phases of Graphical Representations for Algorithms, Structure, and Processes for Ada (GRASP/Ada) with respect to three basic classes of GRs is presented. Phase 1 concentrated on the derivation of an algorithmic diagram, the control structure diagram (CSD) (CRO88a) from Ada source code or Ada PDL. Phase 2 includes the generation of architectural and system level diagrams such as structure charts and data flow diagrams and should result in a requirements specification for a graphically oriented language able to support automatic code generation. Phase 3 will concentrate on the development of a prototype to demonstrate the feasibility of this new specification language.

  5. Compendium of abstracts on statistical applications in geotechnical engineering

    NASA Astrophysics Data System (ADS)

    Hynes-Griffin, M. E.; Deer, G. W.

    1983-09-01

    The results of a literature search of geotechnical and statistical abstracts are presented in tables listing specific topics, title of the abstract, main author and the file number under which the abstract can be found.

  6. Software Certification - Coding, Code, and Coders

    NASA Technical Reports Server (NTRS)

    Havelund, Klaus; Holzmann, Gerard J.

    2011-01-01

    We describe a certification approach for software development that has been adopted at our organization. JPL develops robotic spacecraft for the exploration of the solar system. The flight software that controls these spacecraft is considered to be mission critical. We argue that the goal of a software certification process cannot be the development of "perfect" software, i.e., software that can be formally proven to be correct under all imaginable and unimaginable circumstances. More realistically, the goal is to guarantee a software development process that is conducted by knowledgeable engineers, who follow generally accepted procedures to control known risks, while meeting agreed upon standards of workmanship. We target three specific issues that must be addressed in such a certification procedure: the coding process, the code that is developed, and the skills of the coders. The coding process is driven by standards (e.g., a coding standard) and tools. The code is mechanically checked against the standard with the help of state-of-the-art static source code analyzers. The coders, finally, are certified in on-site training courses that include formal exams.

  7. Concreteness effects in semantic processing: ERP evidence supporting dual-coding theory.

    PubMed

    Kounios, J; Holcomb, P J

    1994-07-01

    Dual-coding theory argues that processing advantages for concrete over abstract (verbal) stimuli result from the operation of 2 systems (i.e., imaginal and verbal) for concrete stimuli, rather than just 1 (for abstract stimuli). These verbal and imaginal systems have been linked with the left and right hemispheres of the brain, respectively. Context-availability theory argues that concreteness effects result from processing differences in a single system. The merits of these theories were investigated by examining the topographic distribution of event-related brain potentials in 2 experiments (lexical decision and concrete-abstract classification). The results were most consistent with dual-coding theory. In particular, different scalp distributions of an N400-like negativity were elicited by concrete and abstract words.

  8. Utility of QR codes in biological collections

    PubMed Central

    Diazgranados, Mauricio; Funk, Vicki A.

    2013-01-01

    Abstract The popularity of QR codes for encoding information such as URIs has increased exponentially in step with the technological advances and availability of smartphones, digital tablets, and other electronic devices. We propose using QR codes on specimens in biological collections to facilitate linking vouchers’ electronic information with their associated collections. QR codes can efficiently provide such links for connecting collections, photographs, maps, ecosystem notes, citations, and even GenBank sequences. QR codes have numerous advantages over barcodes, including their small size, superior security mechanisms, increased complexity and quantity of information, and low implementation cost. The scope of this paper is to initiate an academic discussion about using QR codes on specimens in biological collections. PMID:24198709

  9. Consensus coding sequence (CCDS) database: a standardized set of human and mouse protein-coding regions supported by expert curation

    PubMed Central

    Pujar, Shashikant; O’Leary, Nuala A; Farrell, Catherine M; Mudge, Jonathan M; Wallin, Craig; Diekhans, Mark; Barnes, If; Bennett, Ruth; Berry, Andrew E; Cox, Eric; Davidson, Claire; Goldfarb, Tamara; Gonzalez, Jose M; Hunt, Toby; Jackson, John; Joardar, Vinita; Kay, Mike P; Kodali, Vamsi K; McAndrews, Monica; McGarvey, Kelly M; Murphy, Michael; Rajput, Bhanu; Rangwala, Sanjida H; Riddick, Lillian D; Seal, Ruth L; Webb, David; Zhu, Sophia; Aken, Bronwen L; Bult, Carol J; Frankish, Adam; Pruitt, Kim D

    2018-01-01

    Abstract The Consensus Coding Sequence (CCDS) project provides a dataset of protein-coding regions that are identically annotated on the human and mouse reference genome assembly in genome annotations produced independently by NCBI and the Ensembl group at EMBL-EBI. This dataset is the product of an international collaboration that includes NCBI, Ensembl, HUGO Gene Nomenclature Committee, Mouse Genome Informatics and University of California, Santa Cruz. Identically annotated coding regions, which are generated using an automated pipeline and pass multiple quality assurance checks, are assigned a stable and tracked identifier (CCDS ID). Additionally, coordinated manual review by expert curators from the CCDS collaboration helps in maintaining the integrity and high quality of the dataset. The CCDS data are available through an interactive web page (https://www.ncbi.nlm.nih.gov/CCDS/CcdsBrowse.cgi) and an FTP site (ftp://ftp.ncbi.nlm.nih.gov/pub/CCDS/). In this paper, we outline the ongoing work, growth and stability of the CCDS dataset and provide updates on new collaboration members and new features added to the CCDS user interface. We also present expert curation scenarios, with specific examples highlighting the importance of an accurate reference genome assembly and the crucial role played by input from the research community. PMID:29126148

  10. Design and optimization of a portable LQCD Monte Carlo code using OpenACC

    NASA Astrophysics Data System (ADS)

    Bonati, Claudio; Coscetti, Simone; D'Elia, Massimo; Mesiti, Michele; Negro, Francesco; Calore, Enrico; Schifano, Sebastiano Fabio; Silvi, Giorgio; Tripiccione, Raffaele

    The present panorama of HPC architectures is extremely heterogeneous, ranging from traditional multi-core CPU processors, supporting a wide class of applications but delivering moderate computing performance, to many-core Graphics Processing Units (GPUs), exploiting aggressive data-parallelism and delivering higher performance for streaming computing applications. In this scenario, code portability (and performance portability) becomes necessary for easy maintainability of applications; this is very relevant in scientific computing, where code changes are very frequent and keeping different code versions aligned is tedious and error-prone. In this work, we present the design and optimization of a state-of-the-art production-level LQCD Monte Carlo application, using the directive-based OpenACC programming model. OpenACC abstracts parallel programming to a descriptive level, relieving programmers from specifying how codes should be mapped onto the target architecture. We describe the implementation of a code fully written in OpenACC, and show that we are able to target several different architectures, including state-of-the-art traditional CPUs and GPUs, with the same code. We also measure performance, evaluating the computing efficiency of our OpenACC code on several architectures, comparing with GPU-specific implementations and showing that a good level of performance portability can be reached.

  11. Report number codes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nelson, R.N.

    This publication lists all report number codes processed by the Office of Scientific and Technical Information. The report codes are substantially based on the American National Standards Institute, Standard Technical Report Number (STRN)-Format and Creation Z39.23-1983. The Standard Technical Report Number (STRN) provides one of the primary methods of identifying a specific technical report. The STRN consists of two parts: the report code and the sequential number. The report code identifies the issuing organization, a specific program, or a type of document. The sequential number, which is assigned in sequence by each report issuing entity, is not included in this publication. Part I of this compilation is alphabetized by report codes followed by issuing installations. Part II lists the issuing organization followed by the assigned report code(s). In both Parts I and II, the names of issuing organizations appear for the most part in the form used at the time the reports were issued. However, for some of the more prolific installations which have had name changes, all entries have been merged under the current name.

  12. ASSIST - THE ABSTRACT SEMI-MARKOV SPECIFICATION INTERFACE TO THE SURE TOOL PROGRAM (SUN VERSION)

    NASA Technical Reports Server (NTRS)

    Johnson, S. C.

    1994-01-01

    ASSIST, the Abstract Semi-Markov Specification Interface to the SURE Tool program, is an interface that will enable reliability engineers to accurately design large semi-Markov models. The user describes the failure behavior of a fault-tolerant computer system in an abstract, high-level language. The ASSIST program then automatically generates a corresponding semi-Markov model. The abstract language allows efficient description of large, complex systems; a one-page ASSIST-language description may result in a semi-Markov model with thousands of states and transitions. The ASSIST program also includes model-reduction techniques to facilitate efficient modeling of large systems. Instead of listing the individual states of the Markov model, reliability engineers can specify the rules governing the behavior of a system, and these are used to automatically generate the model. ASSIST reads an input file describing the failure behavior of a system in an abstract language and generates a Markov model in the format needed for input to SURE, the semi-Markov Unreliability Range Evaluator program, and PAWS/STEM, the Pade Approximation with Scaling program and Scaled Taylor Exponential Matrix. A Markov model consists of a number of system states and transitions between them. Each state in the model represents a possible state of the system in terms of which components have failed, which ones have been removed, etc. Within ASSIST, each state is defined by a state vector, where each element of the vector takes on an integer value within a defined range. An element can represent any meaningful characteristic, such as the number of working components of one type in the system, or the number of faulty components of another type in use. Statements representing transitions between states in the model have three parts: a condition expression, a destination expression, and a rate expression. 
The first expression is a Boolean expression describing the state-space variable values of the states to which the transition applies.
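
    The rule-based state-space generation that ASSIST performs can be sketched as a breadth-first expansion of the initial state under the transition rules. The triplex-of-processors example and the failure rate below are hypothetical, not taken from the ASSIST documentation:

```python
from collections import deque

def generate_model(initial, rules):
    """Expand an ASSIST-style rule set into an explicit Markov model.
    Each rule is (condition, destination, rate): condition and destination
    are functions of the state vector (a tuple of integers)."""
    states, transitions = {initial}, []
    queue = deque([initial])
    while queue:
        s = queue.popleft()
        for condition, destination, rate in rules:
            if condition(s):
                t = destination(s)
                transitions.append((s, t, rate))
                if t not in states:
                    states.add(t)
                    queue.append(t)
    return states, transitions

# Hypothetical one-rule description: a triplex of processors, each of
# which may fail at rate lam; the state vector counts working processors.
lam = 1e-4
rules = [(lambda s: s[0] > 0, lambda s: (s[0] - 1,), lam)]
states, transitions = generate_model((3,), rules)
```

    This is the sense in which a one-page rule description can expand into thousands of states: the rules are applied to every reachable state vector, not listed per state.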

  13. Synthesizing Certified Code

    NASA Technical Reports Server (NTRS)

    Whalen, Michael; Schumann, Johann; Fischer, Bernd

    2002-01-01

    Code certification is a lightweight approach to demonstrate software quality on a formal level. Its basic idea is to require producers to provide formal proofs that their code satisfies certain quality properties. These proofs serve as certificates which can be checked independently. Since code certification uses the same underlying technology as program verification, it also requires many detailed annotations (e.g., loop invariants) to make the proofs possible. However, manually adding these annotations to the code is time-consuming and error-prone. We address this problem by combining code certification with automatic program synthesis. We propose an approach to generate simultaneously, from a high-level specification, code and all annotations required to certify the generated code. Here, we describe a certification extension of AUTOBAYES, a synthesis tool which automatically generates complex data analysis programs from compact specifications. AUTOBAYES contains sufficient high-level domain knowledge to generate detailed annotations. This allows us to use a general-purpose verification condition generator to produce a set of proof obligations in first-order logic. The obligations are then discharged using the automated theorem prover E-SETHEO. We demonstrate our approach by certifying operator safety for a generated iterative data classification program without manual annotation of the code.
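
    In the certification pipeline the annotations become proof obligations discharged statically by a theorem prover. As a loose executable stand-in (a much weaker guarantee than certification), the hypothetical sketch below states the kind of loop-invariant annotation a synthesizer would emit and merely checks it at runtime:

```python
def sum_of_squares(xs):
    """Sum of squares with its loop invariant stated explicitly.
    In a certification setting the invariant would be an annotation
    proved once by a theorem prover; here it is checked dynamically."""
    total = 0
    for i, x in enumerate(xs):
        # invariant: before processing xs[i], total == sum of xs[0..i-1] squared
        assert total == sum(v * v for v in xs[:i])
        total += x * x
    assert total == sum(v * v for v in xs)   # postcondition
    return total
```

    The point of generating such annotations automatically is precisely that writing them by hand, as above, is tedious and error-prone.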

  14. Flexible Generation of Kalman Filter Code

    NASA Technical Reports Server (NTRS)

    Richardson, Julian; Wilson, Edward

    2006-01-01

    Domain-specific program synthesis can automatically generate high quality code in complex domains from succinct specifications, but the range of programs which can be generated by a given synthesis system is typically narrow. Obtaining code which falls outside this narrow scope necessitates either 1) extension of the code generator, which is usually very expensive, or 2) manual modification of the generated code, which is often difficult and which must be redone whenever changes are made to the program specification. In this paper, we describe adaptations and extensions of the AUTOFILTER Kalman filter synthesis system which greatly extend the range of programs which can be generated. Users augment the input specification with a specification of code fragments and how those fragments should interleave with or replace parts of the synthesized filter. This allows users to generate a much wider range of programs without their needing to modify the synthesis system or edit generated code. We demonstrate the usefulness of the approach by applying it to the synthesis of a complex state estimator which combines code from several Kalman filters with user-specified code. The work described in this paper allows the complex design decisions necessary for real-world applications to be reflected in the synthesized code. When executed on simulated input data, the generated state estimator was found to produce comparable estimates to those produced by a hand-coded estimator.
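
    AUTOFILTER synthesizes full multi-state filters from specifications; the hypothetical scalar sketch below shows only the predict/update cycle that any generated Kalman filter code ultimately implements (all parameter values are illustrative):

```python
def kalman_1d(z_seq, x0=0.0, p0=1.0, q=1e-4, r=0.04):
    """Minimal scalar Kalman filter tracking a (nearly) constant hidden state.
    x0/p0: initial estimate and its variance; q: process noise variance;
    r: measurement noise variance."""
    x, p = x0, p0
    for z in z_seq:
        p += q                      # predict: state assumed constant, variance grows
        k = p / (p + r)             # Kalman gain
        x += k * (z - x)            # update estimate with measurement z
        p *= (1.0 - k)              # update estimate variance
    return x, p

# noisy measurements of a true value near 1.0
estimate, variance = kalman_1d([1.1, 0.9, 1.05, 0.95, 1.0])
```

    The synthesis system's value is that the user states the process and measurement models declaratively and the predict/update code, in matrix form and with the required annotations, is generated rather than hand-written as above.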

  15. Metacognition and abstract reasoning.

    PubMed

    Markovits, Henry; Thompson, Valerie A; Brisson, Janie

    2015-05-01

    The nature of people's meta-representations of deductive reasoning is critical to understanding how people control their own reasoning processes. We conducted two studies to examine whether people have a metacognitive representation of abstract validity and whether familiarity alone acts as a separate metacognitive cue. In Study 1, participants were asked to make a series of (1) abstract conditional inferences, (2) concrete conditional inferences with premises having many potential alternative antecedents and thus specifically conducive to the production of responses consistent with conditional logic, or (3) concrete problems with premises having relatively few potential alternative antecedents. Participants gave confidence ratings after each inference. Results show that confidence ratings were positively correlated with logical performance on abstract problems and concrete problems with many potential alternatives, but not with concrete problems with content less conducive to normative responses. Confidence ratings were higher with few alternatives than for abstract content. Study 2 used a generation of contrary-to-fact alternatives task to improve levels of abstract logical performance. The resulting increase in logical performance was mirrored by increases in mean confidence ratings. Results provide evidence for a metacognitive representation based on logical validity, and show that familiarity acts as a separate metacognitive cue.

  16. Finding Feasible Abstract Counter-Examples

    NASA Technical Reports Server (NTRS)

    Pasareanu, Corina S.; Dwyer, Matthew B.; Visser, Willem; Clancy, Daniel (Technical Monitor)

    2002-01-01

    A strength of model checking is its ability to automate the detection of subtle system errors and produce traces that exhibit those errors. Given the high computational cost of model checking most researchers advocate the use of aggressive property-preserving abstractions. Unfortunately, the more aggressively a system is abstracted the more infeasible behavior it will have. Thus, while abstraction enables efficient model checking it also threatens the usefulness of model checking as a defect detection tool, since it may be difficult to determine whether a counter-example is feasible and hence worth developer time to analyze. We have explored several strategies for addressing this problem by extending an explicit-state model checker, Java PathFinder (JPF), to search for and analyze counter-examples in the presence of abstractions. We demonstrate that these techniques effectively preserve the defect detection ability of model checking in the presence of aggressive abstraction by applying them to check properties of several abstracted multi-threaded Java programs. These new capabilities are not specific to JPF and can be easily adapted to other model checking frameworks; we describe how this was done for the Bandera toolset.

  17. Abstract shapes of RNA.

    PubMed

    Giegerich, Robert; Voss, Björn; Rehmsmeier, Marc

    2004-01-01

    The function of a non-protein-coding RNA is often determined by its structure. Since experimental determination of RNA structure is time-consuming and expensive, its computational prediction is of great interest, and efficient solutions based on thermodynamic parameters are known. Frequently, however, the predicted minimum free energy structures are not the native ones, leading to the necessity of generating suboptimal solutions. While this can be accomplished by a number of programs, the user is often confronted with large outputs of similar structures, although he or she is interested in structures with more fundamental differences, or, in other words, with different abstract shapes. Here, we formalize the concept of abstract shapes and introduce their efficient computation. Each shape of an RNA molecule comprises a class of similar structures and has a representative structure of minimal free energy within the class. Shape analysis is implemented in the program RNAshapes. We applied RNAshapes to the prediction of optimal and suboptimal abstract shapes of several RNAs. For a given energy range, the number of shapes is considerably smaller than the number of structures, and in all cases, the native structures were among the top shape representatives. This demonstrates that the researcher can quickly focus on the structures of interest, without processing up to thousands of near-optimal solutions. We complement this study with a large-scale analysis of the growth behaviour of structure and shape spaces. RNAshapes is available for download and as an online version on the Bielefeld Bioinformatics Server.
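
    The shape abstraction can be illustrated with a toy version: drop unpaired positions and collapse each maximal helix run into a single bracket. This only roughly approximates RNAshapes' coarsest shape level; the tool's actual definition is more refined.

```python
import re

def abstract_shape(db):
    """Map a dot-bracket secondary structure to a coarse abstract shape.

    Each maximal run of '(' or ')' collapses to one '[' or ']', and
    unpaired positions ('.') are dropped. Rough approximation only.
    """
    s = re.sub(r'\(+', '[', db)   # helix run of '(' -> one '['
    s = re.sub(r'\)+', ']', s)    # helix run of ')' -> one ']'
    return s.replace('.', '')     # discard unpaired bases last

# A multiloop with two hairpins and a single hairpin, respectively:
shape1 = abstract_shape("((..((...))..((...))..))")
shape2 = abstract_shape("..(((...))).." )
```

    Many near-optimal structures that differ only in loop sizes or helix lengths collapse to the same shape string, which is why the shape space is so much smaller than the structure space.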

  18. Interactive Synthesis of Code Level Security Rules

    DTIC Science & Technology

    2017-04-01

    Interactive Synthesis of Code-Level Security Rules. A thesis presented by Leo St. Amour to the Department of Computer Science in partial fulfillment of the requirements for the degree of Master of Science in Computer Science, Northeastern University, Boston, Massachusetts, April 2017.

  19. An ERP study of recognition memory for concrete and abstract pictures in school-aged children

    PubMed Central

    Boucher, Olivier; Chouinard-Leclaire, Christine; Muckle, Gina; Westerlund, Alissa; Burden, Matthew J.; Jacobson, Sandra W.; Jacobson, Joseph L.

    2016-01-01

    Recognition memory for concrete, nameable pictures is typically faster and more accurate than for abstract pictures. A dual-coding account for these findings suggests that concrete pictures are processed into verbal and image codes, whereas abstract pictures are encoded in image codes only. Recognition memory relies on two successive and distinct processes, namely familiarity and recollection. Whether these two processes are similarly or differently affected by stimulus concreteness remains unknown. This study examined the effect of picture concreteness on visual recognition memory processes using event-related potentials (ERPs). In a sample of children involved in a longitudinal study, participants (N = 96; mean age = 11.3 years) were assessed on a continuous visual recognition memory task in which half the pictures were easily nameable, everyday concrete objects, and the other half were three-dimensional abstract, sculpture-like objects. Behavioral performance and ERP correlates of familiarity and recollection (respectively, the FN400 and P600 repetition effects) were measured. Behavioral results indicated faster and more accurate identification of concrete pictures as “new” or “old” (i.e., previously displayed) compared to abstract pictures. ERPs were characterised by a larger repetition effect, on the P600 amplitude, for concrete than for abstract images, suggesting a graded recollection process dependent on the type of material to be recollected. Topographic differences were observed within the FN400 latency interval, especially over anterior-inferior electrodes, with the repetition effect more pronounced and localized over the left hemisphere for concrete stimuli, potentially reflecting different neural processes underlying early processing of verbal/semantic and visual material in memory. PMID:27329352

  20. A neural mechanism of cognitive control for resolving conflict between abstract task rules

    PubMed Central

    Sheu, Yi-Shin; Courtney, Susan M.

    2016-01-01

    Conflict between multiple sensory stimuli or potential motor responses is thought to be resolved via bias signals from prefrontal cortex. However, population codes in the prefrontal cortex also represent abstract information, such as task rules. How is conflict between active abstract representations resolved? We used functional neuroimaging to investigate the mechanism responsible for resolving conflict between abstract representations of task rules. Participants performed two different tasks based on a cue. We manipulated the degree of conflict at the task-rule level by training participants to associate the color and shape dimensions of the cue with either the same task rule (congruent cues) or different ones (incongruent cues). Phonological and semantic tasks were used in which performance depended on learned, abstract representations of information, rather than sensory features of the target stimulus or on any habituated stimulus-response associations. In addition, these tasks activate distinct regions that allowed us to measure magnitude of conflict between tasks. We found that incongruent cues were associated with increased activity in several cognitive control areas, including the inferior frontal gyrus, inferior parietal lobule, insula, and subcortical regions. Conflict between abstract representations appears to be resolved by rule-specific activity in the inferior frontal gyrus that is correlated with enhanced activity related to the relevant information. Furthermore, multivoxel pattern analysis of the activity in the inferior frontal gyrus was shown to carry information about both the currently relevant rule (semantic/phonological) and the currently relevant cue context (color/shape). Similar to models of attentional selection of conflicting sensory or motor representations, the current findings indicate part of the frontal cortex provides a bias signal, representing task rules, that enhances task-relevant information. However, the frontal cortex can

  1. The specific purpose Monte Carlo code McENL for simulating the response of epithermal neutron lifetime well logging tools

    NASA Astrophysics Data System (ADS)

    Prettyman, T. H.; Gardner, R. P.; Verghese, K.

    1993-08-01

    A new specific purpose Monte Carlo code called McENL for modeling the time response of epithermal neutron lifetime tools is described. The weight windows technique, employing splitting and Russian roulette, is used with an automated importance function based on the solution of an adjoint diffusion model to improve the code efficiency. Complete composition and density correlated sampling is also included in the code, and can be used to study the effect on tool response of small variations in the formation, borehole, or logging tool composition and density. An illustration of the latter application is given for the density of a thermal neutron filter. McENL was benchmarked against test-pit data for the Mobil pulsed neutron porosity tool and was found to be very accurate. Results of the experimental validation and details of code performance are presented.
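
    The splitting and Russian-roulette step of the weight-windows technique can be sketched as follows; the window bounds and survival weight here are illustrative, not McENL's actual adjoint-derived importance function:

```python
import random

# Weight-window variance reduction: heavy particles are split into
# lighter copies, light particles play Russian roulette. Expected
# total weight is preserved on average in both branches.

def apply_weight_window(weight, w_low, w_high, rng=random.random):
    """Return the list of particle weights after the window is applied."""
    if weight > w_high:
        # Split one heavy particle into n equal lighter copies.
        n = int(weight / w_high) + 1
        return [weight / n] * n
    if weight < w_low:
        # Russian roulette: survive with probability weight / w_survive.
        w_survive = (w_low + w_high) / 2.0
        if rng() < weight / w_survive:
            return [w_survive]   # survivor carries the boosted weight
        return []                # particle is killed
    return [weight]              # inside the window: unchanged

split = apply_weight_window(5.0, 0.5, 2.0)                     # splitting
alive = apply_weight_window(0.1, 0.5, 2.0, rng=lambda: 0.0)    # survives
dead = apply_weight_window(0.1, 0.5, 2.0, rng=lambda: 0.99)    # killed
```

    Forcing the random draw in the examples makes both roulette outcomes deterministic, which is convenient for checking that weight is conserved in expectation.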

  2. Architecture-driven reuse of code in KASE

    NASA Technical Reports Server (NTRS)

    Bhansali, Sanjay

    1993-01-01

    In order to support the synthesis of large, complex software systems, we need to focus on issues pertaining to the architectural design of a system in addition to algorithm and data structure design. An approach that is based on abstracting the architectural design of a set of problems in the form of a generic architecture, and providing tools that can be used to instantiate the generic architecture for specific problem instances is presented. Such an approach also facilitates reuse of code between different systems belonging to the same problem class. An application of our approach on a realistic problem is described; the results of the exercise are presented; and how our approach compares to other work in this area is discussed.
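
    The generic-architecture idea can be sketched as a reusable skeleton whose component slots are filled per problem instance; the class and slot names below are illustrative, not from KASE:

```python
from dataclasses import dataclass
from typing import Callable, Iterable, List

# A generic architecture fixes the components and control flow;
# instantiation supplies problem-specific fillers, so the skeleton
# code is reused across systems in the same problem class.

@dataclass
class GenericPipeline:
    """Reusable architecture: sense -> interpret -> act."""
    sense: Callable[[], Iterable]
    interpret: Callable[[object], object]
    act: Callable[[object], str]

    def run(self) -> List[str]:
        return [self.act(self.interpret(x)) for x in self.sense()]

# One problem instance of the generic architecture.
doubler = GenericPipeline(
    sense=lambda: [1, 2, 3],
    interpret=lambda x: x * 2,
    act=lambda y: f"out={y}",
)
results = doubler.run()
```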

  3. Automated Discovery of Machine-Specific Code Improvements

    DTIC Science & Technology

    1984-12-01

    operation of the source language. Additional analysis may reveal special features of the target architecture that may be exploited to generate efficient code. Such analysis is optional... incorporate knowledge of the source language, but do not refer to features of the target machine. These early phases are sometimes referred to as the

  4. Aerodynamic Analysis of the M33 Projectile Using the CFX Code

    DTIC Science & Technology

    2011-12-01

    The M33 projectile has been analyzed using the ANSYS CFX code, which is based on the numerical solution of the full Navier-Stokes equations. Simulation data were obtained using the CFX code. The ANSYS CFX code is a commercial CFD program used to simulate fluid flow in a variety of applications such as gas turbine

  5. Simulation of spacecraft attitude dynamics using TREETOPS and model-specific computer Codes

    NASA Technical Reports Server (NTRS)

    Cochran, John E.; No, T. S.; Fitz-Coy, Norman G.

    1989-01-01

    The simulation of spacecraft attitude dynamics and control using the generic, multi-body code called TREETOPS and other codes written especially to simulate particular systems is discussed. Differences in the methods used to derive equations of motion--Kane's method for TREETOPS and the Lagrangian and Newton-Euler methods, respectively, for the other two codes--are considered. Simulation results from the TREETOPS code are compared with those from the other two codes for two example systems. One system is a chain of rigid bodies; the other consists of two rigid bodies attached to a flexible base body. Since the computer codes were developed independently, consistent results serve as a verification of the correctness of all the programs. Differences in the results are discussed. Results for the two-rigid-body, one-flexible-body system are useful also as information on multi-body, flexible, pointing payload dynamics.

  6. Performance Measures of Diagnostic Codes for Detecting Opioid Overdose in the Emergency Department.

    PubMed

    Rowe, Christopher; Vittinghoff, Eric; Santos, Glenn-Milo; Behar, Emily; Turner, Caitlin; Coffin, Phillip O

    2017-04-01

    Opioid overdose mortality has tripled in the United States since 2000 and opioids are responsible for more than half of all drug overdose deaths, which reached an all-time high in 2014. Opioid overdoses resulting in death, however, represent only a small fraction of all opioid overdose events and efforts to improve surveillance of this public health problem should include tracking nonfatal overdose events. International Classification of Disease (ICD) diagnosis codes, increasingly used for the surveillance of nonfatal drug overdose events, have not been rigorously assessed for validity in capturing overdose events. The present study aimed to validate the use of ICD, 9th revision, Clinical Modification (ICD-9-CM) codes in identifying opioid overdose events in the emergency department (ED) by examining multiple performance measures, including sensitivity and specificity. Data on ED visits from January 1, 2012, to December 31, 2014, including clinical determination of whether the visit constituted an opioid overdose event, were abstracted from electronic medical records for patients prescribed long-term opioids for pain from any of six safety net primary care clinics in San Francisco, California. Combinations of ICD-9-CM codes were validated in the detection of overdose events as determined by medical chart review. Both sensitivity and specificity of different combinations of ICD-9-CM codes were calculated. Unadjusted logistic regression models with robust standard errors and accounting for clustering by patient were used to explore whether overdose ED visits with certain characteristics were more or less likely to be assigned an opioid poisoning ICD-9-CM code by the documenting physician. Forty-four (1.4%) of 3,203 ED visits among 804 patients were determined to be opioid overdose events. Opioid-poisoning ICD-9-CM codes (E850.2-E850.2, 965.00-965.09) identified overdose ED visits with a sensitivity of 25.0% (95% confidence interval [CI] = 13.6% to 37.8%) and
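
    The reported validity measures follow directly from the cross-tabulation of code flags against the chart-review gold standard; a sketch with made-up data (not the study's counts):

```python
# Sensitivity, specificity, PPV, and NPV from (code_positive,
# chart_positive) pairs. The pairs below are fabricated for illustration.

def validity_metrics(pairs):
    tp = sum(1 for c, g in pairs if c and g)
    fp = sum(1 for c, g in pairs if c and not g)
    fn = sum(1 for c, g in pairs if not c and g)
    tn = sum(1 for c, g in pairs if not c and not g)
    return {
        "sensitivity": tp / (tp + fn),  # true overdoses that were coded
        "specificity": tn / (tn + fp),  # non-cases correctly left uncoded
        "ppv": tp / (tp + fp),          # coded visits that are true cases
        "npv": tn / (tn + fn),          # uncoded visits truly non-cases
    }

pairs = ([(True, True)] * 8 + [(False, True)] * 2 +
         [(True, False)] * 1 + [(False, False)] * 89)
m = validity_metrics(pairs)
```

    With a rare outcome, as here, NPV stays high even when sensitivity is modest, which is why the study reports all four measures rather than accuracy alone.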

  7. Test Input Generation for Red-Black Trees using Abstraction

    NASA Technical Reports Server (NTRS)

    Visser, Willem; Pasareanu, Corina S.; Pelanek, Radek

    2005-01-01

    We consider the problem of test input generation for code that manipulates complex data structures. Test inputs are sequences of method calls from the data structure interface. We describe test input generation techniques that rely on state matching to avoid generation of redundant tests. Exhaustive techniques use explicit state model checking to explore all the possible test sequences up to predefined input sizes. Lossy techniques rely on abstraction mappings to compute and store abstract versions of the concrete states; they explore under-approximations of all the possible test sequences. We have implemented the techniques on top of the Java PathFinder model checker and we evaluate them using a Java implementation of red-black trees.
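
    The lossy technique can be sketched as breadth-first exploration of method-call sequences with state matching on abstract states: a sequence is kept as a test only when it reaches an abstract state not seen before. The stand-in structure and abstraction below are deliberately much simpler than red-black trees:

```python
from collections import deque

def generate_tests(ops, abstract, init=frozenset(), max_len=4):
    """BFS over call sequences, pruning on previously matched
    abstract states (an under-approximation of all sequences)."""
    seen = {abstract(init)}
    tests = []
    queue = deque([(init, [])])
    while queue:
        state, trace = queue.popleft()
        if len(trace) == max_len:
            continue
        for name, op in ops:
            new = op(state)
            a = abstract(new)
            if a not in seen:          # state matching on abstractions
                seen.add(a)
                tests.append(trace + [name])
                queue.append((new, trace + [name]))
    return tests

# Stand-in for the data structure under test: a set with add/remove of
# fixed keys; the abstraction keeps only the size (illustrative names).
ops = [(f"add({k})", lambda s, k=k: s | {k}) for k in (1, 2, 3)]
ops += [(f"remove({k})", lambda s, k=k: s - {k}) for k in (1, 2, 3)]
tests = generate_tests(ops, abstract=len)
```

    With this size-only abstraction, only three of the many possible call sequences survive pruning, one per newly reached size; a finer abstraction (for a red-black tree, e.g. tree shape plus node colors) retains correspondingly more tests.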

  8. Fire Technology Abstracts, volume 4, issue 1, August, 1981

    NASA Astrophysics Data System (ADS)

    Holtschlag, L. J.; Kuvshinoff, B. W.; Jernigan, J. B.

    This bibliography contains over 400 citations with abstracts addressing various aspects of fire technology. Subjects cover the dynamics of fire, behavior and properties of materials, fire modeling and test burns, fire protection, fire safety, fire service organization, apparatus and equipment, fire prevention, suppression, planning, human behavior, medical problems, codes and standards, hazard identification, safe handling of materials, insurance, economics of loss and prevention, and more.

  9. An ERP study of recognition memory for concrete and abstract pictures in school-aged children.

    PubMed

    Boucher, Olivier; Chouinard-Leclaire, Christine; Muckle, Gina; Westerlund, Alissa; Burden, Matthew J; Jacobson, Sandra W; Jacobson, Joseph L

    2016-08-01

    Recognition memory for concrete, nameable pictures is typically faster and more accurate than for abstract pictures. A dual-coding account for these findings suggests that concrete pictures are processed into verbal and image codes, whereas abstract pictures are encoded in image codes only. Recognition memory relies on two successive and distinct processes, namely familiarity and recollection. Whether these two processes are similarly or differently affected by stimulus concreteness remains unknown. This study examined the effect of picture concreteness on visual recognition memory processes using event-related potentials (ERPs). In a sample of children involved in a longitudinal study, participants (N = 96; mean age = 11.3 years) were assessed on a continuous visual recognition memory task in which half the pictures were easily nameable, everyday concrete objects, and the other half were three-dimensional abstract, sculpture-like objects. Behavioral performance and ERP correlates of familiarity and recollection (respectively, the FN400 and P600 repetition effects) were measured. Behavioral results indicated faster and more accurate identification of concrete pictures as "new" or "old" (i.e., previously displayed) compared to abstract pictures. ERPs were characterized by a larger repetition effect, on the P600 amplitude, for concrete than for abstract images, suggesting a graded recollection process dependent on the type of material to be recollected. Topographic differences were observed within the FN400 latency interval, especially over anterior-inferior electrodes, with the repetition effect more pronounced and localized over the left hemisphere for concrete stimuli, potentially reflecting different neural processes underlying early processing of verbal/semantic and visual material in memory. Copyright © 2016 Elsevier B.V. All rights reserved.

  10. Identification of Transposable Elements Contributing to Tissue-Specific Expression of Long Non-Coding RNAs

    PubMed Central

    Chishima, Takafumi; Iwakiri, Junichi

    2018-01-01

    It has been recently suggested that transposable elements (TEs) are re-used as functional elements of long non-coding RNAs (lncRNAs). This is supported by some examples such as the human endogenous retrovirus subfamily H (HERVH) elements contained within lncRNAs and expressed specifically in human embryonic stem cells (hESCs), as required to maintain hESC identity. There are at least two unanswered questions about all lncRNAs. How many TEs are re-used within lncRNAs? Are there any other TEs that affect tissue specificity of lncRNA expression? To answer these questions, we comprehensively identify TEs that are significantly related to tissue-specific expression levels of lncRNAs. We downloaded lncRNA expression data corresponding to normal human tissue from the Expression Atlas and transformed the data into tissue specificity estimates. Then, Fisher’s exact tests were performed to verify whether the presence or absence of TE-derived sequences influences the tissue specificity of lncRNA expression. Many TE–tissue pairs associated with tissue-specific expression of lncRNAs were detected, indicating that multiple TE families can be re-used as functional domains or regulatory sequences of lncRNAs. In particular, we found that the antisense promoter region of L1PA2, a LINE-1 subfamily, appears to act as a promoter for lncRNAs with placenta-specific expression. PMID:29315213
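
    The test applied here is standard; a self-contained two-sided Fisher's exact test for a 2x2 table can be sketched as follows (the example table is made up, not data from the study):

```python
from math import comb

# Two-sided Fisher's exact test for a 2x2 contingency table, e.g.
# TE presence/absence versus tissue-specific/non-specific expression.

def fisher_exact_2x2(a, b, c, d):
    """P of observing a table at least as extreme, both tails."""
    n, row1, col1 = a + b + c + d, a + b, a + c

    def hyper(x):
        # Hypergeometric probability of x "successes" in the first row,
        # with all margins held fixed.
        return comb(col1, x) * comb(n - col1, row1 - x) / comb(n, row1)

    p_obs = hyper(a)
    lo, hi = max(0, row1 - (n - col1)), min(row1, col1)
    # Two-sided: sum over all tables no more probable than the observed.
    return sum(hyper(x) for x in range(lo, hi + 1)
               if hyper(x) <= p_obs + 1e-12)

# Made-up table: 2 TE-containing lncRNAs tissue-specific, 0 not,
# versus 0 and 2 for lncRNAs without the TE.
p = fisher_exact_2x2(2, 0, 0, 2)
```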

  11. The Mystery Behind the Code: Differentiated Instruction with Quick Response Codes in Secondary Physical Education

    ERIC Educational Resources Information Center

    Adkins, Megan; Wajciechowski, Misti R.; Scantling, Ed

    2013-01-01

    Quick response codes, better known as QR codes, are small barcodes scanned to receive information about a specific topic. This article explains QR code technology and the utility of QR codes in the delivery of physical education instruction. Consideration is given to how QR codes can be used to accommodate learners of varying ability levels as…

  12. The Emotions of Abstract Words: A Distributional Semantic Analysis.

    PubMed

    Lenci, Alessandro; Lebani, Gianluca E; Passaro, Lucia C

    2018-04-06

    Recent psycholinguistic and neuroscientific research has emphasized the crucial role of emotions for abstract words, which would be grounded by affective experience, instead of a sensorimotor one. The hypothesis of affective embodiment has been proposed as an alternative to the idea that abstract words are linguistically coded and that linguistic processing plays a key role in their acquisition and processing. In this paper, we use distributional semantic models to explore the complex interplay between linguistic and affective information in the representation of abstract words. Distributional analyses on Italian norming data show that abstract words have more affective content and tend to co-occur with contexts with higher emotive values, according to affective statistical indices estimated in terms of distributional similarity with a restricted number of seed words strongly associated with a set of basic emotions. Therefore, the strong affective content of abstract words might just be an indirect byproduct of co-occurrence statistics. This is consistent with a version of representational pluralism in which concepts that are fully embodied either at the sensorimotor or at the affective level live side-by-side with concepts only indirectly embodied via their linguistic associations with other embodied words. Copyright © 2018 Cognitive Science Society, Inc.

  13. Reading Achievement: Characteristics Associated with Success and Failure: Abstracts of Doctoral Dissertations Published in "Dissertation Abstracts International," July through September 1978 (Vol. 39 Nos. 1 through 3).

    ERIC Educational Resources Information Center

    ERIC Clearinghouse on Reading and Communication Skills, Urbana, IL.

    This collection of abstracts is part of a continuing series providing information on recent doctoral dissertations. The 25 titles deal with a variety of topics, including the following: reading achievement as it relates to child dependency, the development of phonological coding, short-term memory and associative learning, variables available in…

  14. On the Power of Abstract Interpretation

    NASA Technical Reports Server (NTRS)

    Reddy, Uday S.; Kamin, Samuel N.

    1991-01-01

    Increasingly sophisticated applications of static analysis place an increased burden on the reliability of the analysis techniques. Often, the failure of the analysis technique to detect some information may mean that the time or space complexity of the generated code would be altered. Thus, it is important to precisely characterize the power of static analysis techniques. We follow the approach of Sekar et al., who studied the power of strictness analysis techniques. Their result can be summarized by saying 'strictness analysis is perfect up to variations in constants.' In other words, strictness analysis is as good as it could be, short of actually distinguishing between concrete values. We use this approach to characterize a broad class of analysis techniques based on abstract interpretation including, but not limited to, strictness analysis. For the first-order case, we consider abstract interpretations where the abstract domain for data values is totally ordered. This condition is satisfied by Mycroft's strictness analysis, that of Sekar et al., and Wadler's analysis of list-strictness. For such abstract interpretations, we show that the analysis is complete in the sense that, short of actually distinguishing between concrete values with the same abstraction, it gives the best possible information. We further generalize these results to typed lambda calculus with pairs and higher-order functions. Note that products and function spaces over totally ordered domains are not totally ordered. In fact, the notion of completeness used in the first-order case fails if product domains or function spaces are added. We formulate a weaker notion of completeness based on observability of values. Two values (including pairs and functions) are considered indistinguishable if their observable components are indistinguishable. We show that abstract interpretation of typed lambda calculus programs is complete up to this notion of indistinguishability. We use denotationally
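
    The kind of analysis being characterized can be illustrated with the classic two-point strictness domain, which is totally ordered as the abstract requires; the mini expression language below is an illustrative stand-in, not the paper's formalism:

```python
# Two-point strictness analysis (Mycroft-style) over the totally
# ordered abstract domain 0 ("definitely diverges") <= 1 ("may yield
# a value"). A function is strict in x if abstractly applying it to
# x = 0 yields 0.

def strict_eval(expr, env):
    op, *args = expr
    if op == "var":
        return env[args[0]]
    if op == "const":
        return 1                      # a constant always yields a value
    if op in ("add", "mul"):          # strict in both operands
        return min(strict_eval(a, env) for a in args)
    if op == "if":                    # needs the condition, then one branch
        c, t, e = args
        return min(strict_eval(c, env),
                   max(strict_eval(t, env), strict_eval(e, env)))
    raise ValueError(op)

def is_strict_in(expr, x):
    return strict_eval(expr, {x: 0}) == 0

# f(x) = if (x + 0) then x else 1  -- strict: the condition needs x
f = ("if", ("add", ("var", "x"), ("const", 0)), ("var", "x"), ("const", 1))
# g(x) = if 1 then 2 else x        -- not strict in x
g = ("if", ("const", 1), ("const", 2), ("var", "x"))
```

    The completeness result says that, over such a totally ordered domain, this style of analysis reports the best answer obtainable without distinguishing concrete values that share an abstraction.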

  15. From Verified Models to Verifiable Code

    NASA Technical Reports Server (NTRS)

    Lensink, Leonard; Munoz, Cesar A.; Goodloe, Alwyn E.

    2009-01-01

    Declarative specifications of digital systems often contain parts that can be automatically translated into executable code. Automated code generation may reduce or eliminate the kinds of errors typically introduced through manual code writing. For this approach to be effective, the generated code should be reasonably efficient and, more importantly, verifiable. This paper presents a prototype code generator for the Prototype Verification System (PVS) that translates a subset of PVS functional specifications into an intermediate language and subsequently to multiple target programming languages. Several case studies are presented to illustrate the tool's functionality. The generated code can be analyzed by software verification tools such as verification condition generators, static analyzers, and software model-checkers to increase the confidence that the generated code is correct.

  16. Exome chip meta-analysis identifies novel loci and East Asian-specific coding variants contributing to lipid levels and coronary artery disease

    PubMed Central

    Lu, Xiangfeng; Peloso, Gina M; Liu, Dajiang J.; Wu, Ying; Zhang, He; Zhou, Wei; Li, Jun; Tang, Clara Sze-man; Dorajoo, Rajkumar; Li, Huaixing; Long, Jirong; Guo, Xiuqing; Xu, Ming; Spracklen, Cassandra N.; Chen, Yang; Liu, Xuezhen; Zhang, Yan; Khor, Chiea Chuen; Liu, Jianjun; Sun, Liang; Wang, Laiyuan; Gao, Yu-Tang; Hu, Yao; Yu, Kuai; Wang, Yiqin; Cheung, Chloe Yu Yan; Wang, Feijie; Huang, Jianfeng; Fan, Qiao; Cai, Qiuyin; Chen, Shufeng; Shi, Jinxiu; Yang, Xueli; Zhao, Wanting; Sheu, Wayne H.-H.; Cherny, Stacey Shawn; He, Meian; Feranil, Alan B.; Adair, Linda S.; Gordon-Larsen, Penny; Du, Shufa; Varma, Rohit; da Chen, Yii-Der I; Shu, XiaoOu; Lam, Karen Siu Ling; Wong, Tien Yin; Ganesh, Santhi K.; Mo, Zengnan; Hveem, Kristian; Fritsche, Lars; Nielsen, Jonas Bille; Tse, Hung-fat; Huo, Yong; Cheng, Ching-Yu; Chen, Y. Eugene; Zheng, Wei; Tai, E Shyong; Gao, Wei; Lin, Xu; Huang, Wei; Abecasis, Goncalo; Consortium, GLGC; Kathiresan, Sekar; Mohlke, Karen L.; Wu, Tangchun; Sham, Pak Chung; Gu, Dongfeng; Willer, Cristen J

    2017-01-01

    Most genome-wide association studies have been conducted in European individuals, even though most genetic variation in humans is seen only in non-European samples. To search for novel loci associated with blood lipid levels and clarify the mechanism of action at previously identified lipid loci, we examined protein-coding genetic variants in 47,532 East Asian individuals using an exome array. We identified 255 variants at 41 loci reaching chip-wide significance, including 3 novel loci and 14 East Asian-specific coding variant associations. After meta-analysis with > 300,000 European samples, we identified an additional 9 novel loci. The same 16 genes were identified by the protein-altering variants in both East Asians and Europeans, likely pointing to the functional genes. Our data demonstrate that most of the low-frequency or rare coding variants associated with lipids are population-specific, and that examining genomic data across diverse ancestries may facilitate the identification of functional genes at associated loci. PMID:29083407

  17. Land Application of Sewage Effluents and Sludges: Selected Abstracts.

    ERIC Educational Resources Information Center

    Environmental Protection Agency, Washington, DC. Office of Research and Development.

    This report contains 568 selected abstracts concerned with the land application of sewage effluents and sludges. The abstracts are arranged in chronological groupings of ten-year periods from the l940's to the mid-l970's. The report also includes an author index and a subject matter index to facilitate reference to specific abstracts or narrower…

  18. Differentiation of ileostomy from colostomy procedures: assessing the accuracy of current procedural terminology codes and the utility of natural language processing.

    PubMed

    Vo, Elaine; Davila, Jessica A; Hou, Jason; Hodge, Krystle; Li, Linda T; Suliburk, James W; Kao, Lillian S; Berger, David H; Liang, Mike K

    2013-08-01

    Large databases provide a wealth of information for researchers, but identifying patient cohorts often relies on the use of current procedural terminology (CPT) codes. In particular, studies of stoma surgery have been limited by the accuracy of CPT codes in identifying and differentiating ileostomy procedures from colostomy procedures. It is important to make this distinction because the prevalence of complications associated with stoma formation and reversal differ dramatically between types of stoma. Natural language processing (NLP) is a process that allows text-based searching. The Automated Retrieval Console is an NLP-based software that allows investigators to design and perform NLP-assisted document classification. In this study, we evaluated the role of CPT codes and NLP in differentiating ileostomy from colostomy procedures. Using CPT codes, we conducted a retrospective study that identified all patients undergoing a stoma-related procedure at a single institution between January 2005 and December 2011. All operative reports during this time were reviewed manually to abstract the following variables: formation or reversal and ileostomy or colostomy. Sensitivity and specificity for validation of the CPT codes against the master surgery schedule were calculated. Operative reports were evaluated by use of NLP to differentiate ileostomy- from colostomy-related procedures. Sensitivity and specificity for identifying patients with ileostomy or colostomy procedures were calculated for CPT codes and NLP for the entire cohort. CPT codes performed well in identifying stoma procedures (sensitivity 87.4%, specificity 97.5%). A total of 664 stoma procedures were identified by CPT codes between 2005 and 2011.
The CPT codes were adequate in identifying stoma formation (sensitivity 97.7%, specificity 72.4%) and stoma reversal (sensitivity 74.1%, specificity 98.7%), but they were inadequate in identifying ileostomy (sensitivity 35.0%, specificity 88.1%) and colostomy (75

  19. ASSIST - THE ABSTRACT SEMI-MARKOV SPECIFICATION INTERFACE TO THE SURE TOOL PROGRAM (VAX VMS VERSION)

    NASA Technical Reports Server (NTRS)

    Johnson, S. C.

    1994-01-01

    ASSIST, the Abstract Semi-Markov Specification Interface to the SURE Tool program, is an interface that will enable reliability engineers to accurately design large semi-Markov models. The user describes the failure behavior of a fault-tolerant computer system in an abstract, high-level language. The ASSIST program then automatically generates a corresponding semi-Markov model. The abstract language allows efficient description of large, complex systems; a one-page ASSIST-language description may result in a semi-Markov model with thousands of states and transitions. The ASSIST program also includes model-reduction techniques to facilitate efficient modeling of large systems. Instead of listing the individual states of the Markov model, reliability engineers can specify the rules governing the behavior of a system, and these are used to automatically generate the model. ASSIST reads an input file describing the failure behavior of a system in an abstract language and generates a Markov model in the format needed for input to SURE, the semi-Markov Unreliability Range Evaluator program, and PAWS/STEM, the Pade Approximation with Scaling program and Scaled Taylor Exponential Matrix. A Markov model consists of a number of system states and transitions between them. Each state in the model represents a possible state of the system in terms of which components have failed, which ones have been removed, etc. Within ASSIST, each state is defined by a state vector, where each element of the vector takes on an integer value within a defined range. An element can represent any meaningful characteristic, such as the number of working components of one type in the system, or the number of faulty components of another type in use. Statements representing transitions between states in the model have three parts: a condition expression, a destination expression, and a rate expression. 
The first expression is a Boolean expression describing the state space variable values of states
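The rule-based generation described above, states as integer vectors and transitions as (condition, destination, rate) triples, can be sketched in a few lines of Python. The rule encoding and rates below are hypothetical illustrations, not actual ASSIST language syntax:

```python
def generate_model(initial_state, rules):
    """Enumerate a Markov state space from behavior rules.

    Each rule mirrors ASSIST's three-part transition statement:
    condition(state) -> bool, destination(state) -> next state,
    and a transition rate.
    """
    states = {initial_state}
    transitions = []
    frontier = [initial_state]
    while frontier:
        s = frontier.pop()
        for condition, destination, rate in rules:
            if condition(s):
                t = destination(s)
                transitions.append((s, t, rate))
                if t not in states:
                    states.add(t)
                    frontier.append(t)
    return states, transitions

# Toy system: state vector (working, faulty_active) for a 3-processor
# system; LAMBDA and DELTA are illustrative failure/recovery rates.
LAMBDA, DELTA = 1e-4, 3.6e3
rules = [
    (lambda s: s[0] > 1 and s[1] == 0,       # a working unit fails
     lambda s: (s[0], 1), LAMBDA),
    (lambda s: s[1] == 1,                    # faulty unit is removed
     lambda s: (s[0] - 1, 0), DELTA),
]
states, transitions = generate_model((3, 0), rules)
print(len(states), len(transitions))  # 5 4
```

As the abstract notes, a short rule description can expand combinatorially: adding one more component type to the state vector multiplies the reachable states rather than adding to them.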

  20. Expression profiles of long non-coding RNAs located in autoimmune disease-associated regions reveal immune cell-type specificity.

    PubMed

    Hrdlickova, Barbara; Kumar, Vinod; Kanduri, Kartiek; Zhernakova, Daria V; Tripathi, Subhash; Karjalainen, Juha; Lund, Riikka J; Li, Yang; Ullah, Ubaid; Modderman, Rutger; Abdulahad, Wayel; Lähdesmäki, Harri; Franke, Lude; Lahesmaa, Riitta; Wijmenga, Cisca; Withoff, Sebo

    2014-01-01

    Although genome-wide association studies (GWAS) have identified hundreds of variants associated with a risk for autoimmune and immune-related disorders (AID), our understanding of the disease mechanisms is still limited. In particular, more than 90% of the risk variants lie in non-coding regions, and almost 10% of these map to long non-coding RNA transcripts (lncRNAs). lncRNAs are known to show more cell-type specificity than protein-coding genes. We aimed to characterize lncRNAs and protein-coding genes located in loci associated with nine AIDs which have been well-defined by Immunochip analysis and by transcriptome analysis across seven populations of peripheral blood leukocytes (granulocytes, monocytes, natural killer (NK) cells, B cells, memory T cells, naive CD4(+) and naive CD8(+) T cells) and four populations of cord blood-derived T-helper cells (precursor, primary, and polarized (Th1, Th2) T-helper cells). We show that lncRNAs mapping to loci shared between AID are significantly enriched in immune cell types compared to lncRNAs from the whole genome (α <0.005). We were not able to prioritize single cell types relevant for specific diseases, but we observed five different cell types enriched (α <0.005) in five AID (NK cells for inflammatory bowel disease, juvenile idiopathic arthritis, primary biliary cirrhosis, and psoriasis; memory T and CD8(+) T cells in juvenile idiopathic arthritis, primary biliary cirrhosis, psoriasis, and rheumatoid arthritis; Th0 and Th2 cells for inflammatory bowel disease, juvenile idiopathic arthritis, primary biliary cirrhosis, psoriasis, and rheumatoid arthritis). Furthermore, we show that co-expression analyses of lncRNAs and protein-coding genes can predict the signaling pathways in which these AID-associated lncRNAs are involved. The observed enrichment of lncRNA transcripts in AID loci implies lncRNAs play an important role in AID etiology and suggests that lncRNA genes should be studied in more detail to interpret GWAS

  1. OHD/HL - SHEF: code

    Science.gov Websites

Standard Hydrometeorological Exchange Format (SHEF) is a documented set of rules for coding of data in a form suitable for both visual and automated processing, together with information to describe the data. The page provides the current SHEF specification, instructions on how to install and use the software, and the source code for download (.gz).

  2. Reliability of reporting nosocomial infections in the discharge abstract and implications for receipt of revenues under prospective reimbursement.

    PubMed Central

    Massanari, R M; Wilkerson, K; Streed, S A; Hierholzer, W J

    1987-01-01

    Proper reporting of discharge diagnoses, including complications of medical care, is essential for maximum recovery of revenues under the prospective reimbursement system. To evaluate the effectiveness of abstracting techniques in identifying nosocomial infections at discharge, discharge abstracts of patients with nosocomial infections were reviewed during September through November of 1984. Patients with nosocomial infections were identified using modified Centers for Disease Control (CDC) definitions and trained surveillance technicians. Records which did not include the diagnosis of nosocomial infections in the discharge abstract were identified, and potential lost revenues were estimated. We identified 631 infections in 498 patients. On average, only 57 per cent of the infections were properly recorded and coded in the discharge abstract. Of the additional monies which might be anticipated by the health care institution to assist in the cost of care of adverse events, approximately one-third would have been lost due to errors in coding in the discharge abstract. Although these lost revenues are substantial, they constitute but a small proportion of the potential costs to the institution when patients acquire nosocomial infections. PMID:3105338

  3. A Documentary Analysis of Abstracts Presented in European Congresses on Adapted Physical Activity.

    PubMed

    Sklenarikova, Jana; Kudlacek, Martin; Baloun, Ladislav; Causgrove Dunn, Janice

    2016-07-01

    The purpose of the study was to identify trends in research abstracts published in the books of abstracts of the European Congress of Adapted Physical Activity from 2004 to 2012. A documentary analysis of the contents of 459 abstracts was completed. Data were coded based on subcategories used in a previous study by Zhang, deLisle, and Chen (2006) and by Porretta and Sherrill (2005): number of authors, data source, sample size, type of disability, data analyses, type of study, and focus of study. Descriptive statistics calculated for each subcategory revealed an overall picture of the state and trends of scientific inquiry in adapted physical activity research in Europe.

  4. The Redox Code

    PubMed Central

    Jones, Dean P.

    2015-01-01

    Abstract Significance: The redox code is a set of principles that defines the positioning of the nicotinamide adenine dinucleotide (NAD, NADP) and thiol/disulfide and other redox systems as well as the thiol redox proteome in space and time in biological systems. The code is richly elaborated in an oxygen-dependent life, where activation/deactivation cycles involving O2 and H2O2 contribute to spatiotemporal organization for differentiation, development, and adaptation to the environment. Disruption of this organizational structure during oxidative stress represents a fundamental mechanism in system failure and disease. Recent Advances: Methodology in assessing components of the redox code under physiological conditions has progressed, permitting insight into spatiotemporal organization and allowing for identification of redox partners in redox proteomics and redox metabolomics. Critical Issues: Complexity of redox networks and redox regulation is being revealed step by step, yet much still needs to be learned. Future Directions: Detailed knowledge of the molecular patterns generated from the principles of the redox code under defined physiological or pathological conditions in cells and organs will contribute to understanding the redox component in health and disease. Ultimately, there will be a scientific basis to a modern redox medicine. Antioxid. Redox Signal. 23, 734–746. PMID:25891126

  5. Color coding of control room displays: the psychocartography of visual layering effects.

    PubMed

    Van Laar, Darren; Deshe, Ofer

    2007-06-01

    To evaluate which of three color coding methods (monochrome, maximally discriminable, and visual layering) used to code four types of control room display format (bars, tables, trend, mimic) was superior in two classes of task (search, compare). It has recently been shown that color coding of visual layers, as used in cartography, may be used to color code any type of information display, but this has yet to be fully evaluated. Twenty-four people took part in a 2 (task) x 3 (coding method) x 4 (format) wholly repeated measures design. The dependent variables assessed were target location reaction time, error rates, workload, and subjective feedback. Overall, the visual layers coding method produced significantly faster reaction times than did the maximally discriminable and the monochrome methods for both the search and compare tasks. No significant difference in errors was observed between conditions for either task type. Significantly less perceived workload was experienced with the visual layers coding method, which was also rated more highly than the other coding methods on a 14-item visual display quality questionnaire. The visual layers coding method is superior to other color coding methods for control room displays when the method supports the user's task. The visual layers color coding method has wide applicability to the design of all complex information displays utilizing color coding, from the most maplike (e.g., air traffic control) to the most abstract (e.g., abstracted ecological display).

  6. Methods for Coding Tobacco-Related Twitter Data: A Systematic Review.

    PubMed

    Lienemann, Brianna A; Unger, Jennifer B; Cruz, Tess Boley; Chu, Kar-Hai

    2017-03-31

    As Twitter has grown in popularity to 313 million monthly active users, researchers have increasingly been using it as a data source for tobacco-related research. The objective of this systematic review was to assess the methodological approaches of categorically coded tobacco Twitter data and make recommendations for future studies. Data sources included PsycINFO, Web of Science, PubMed, ABI/INFORM, Communication Source, and Tobacco Regulatory Science. Searches were limited to peer-reviewed journals and conference proceedings in English from January 2006 to July 2016. The initial search identified 274 articles using a Twitter keyword and a tobacco keyword. One coder reviewed all abstracts and identified 27 articles that met the following inclusion criteria: (1) original research, (2) focused on tobacco or a tobacco product, (3) analyzed Twitter data, and (4) coded Twitter data categorically. One coder extracted data collection and coding methods. E-cigarettes were the most common type of Twitter data analyzed, followed by specific tobacco campaigns. The most prevalent data sources were Gnip and Twitter's Streaming application programming interface (API). The primary methods of coding were hand-coding and machine learning. The studies predominantly coded for relevance, sentiment, theme, user or account, and location of user. Standards for data collection and coding should be developed to be able to more easily compare and replicate tobacco-related Twitter results. Additional recommendations include the following: sample Twitter's databases multiple times, make a distinction between message attitude and emotional tone for sentiment, code images and URLs, and analyze user profiles. Being relatively novel and widely used among adolescents and black and Hispanic individuals, Twitter could provide a rich source of tobacco surveillance data among vulnerable populations. ©Brianna A Lienemann, Jennifer B Unger, Tess Boley Cruz, Kar-Hai Chu. Originally published in the

  7. Methods for Coding Tobacco-Related Twitter Data: A Systematic Review

    PubMed Central

    Unger, Jennifer B; Cruz, Tess Boley; Chu, Kar-Hai

    2017-01-01

    Background As Twitter has grown in popularity to 313 million monthly active users, researchers have increasingly been using it as a data source for tobacco-related research. Objective The objective of this systematic review was to assess the methodological approaches of categorically coded tobacco Twitter data and make recommendations for future studies. Methods Data sources included PsycINFO, Web of Science, PubMed, ABI/INFORM, Communication Source, and Tobacco Regulatory Science. Searches were limited to peer-reviewed journals and conference proceedings in English from January 2006 to July 2016. The initial search identified 274 articles using a Twitter keyword and a tobacco keyword. One coder reviewed all abstracts and identified 27 articles that met the following inclusion criteria: (1) original research, (2) focused on tobacco or a tobacco product, (3) analyzed Twitter data, and (4) coded Twitter data categorically. One coder extracted data collection and coding methods. Results E-cigarettes were the most common type of Twitter data analyzed, followed by specific tobacco campaigns. The most prevalent data sources were Gnip and Twitter’s Streaming application programming interface (API). The primary methods of coding were hand-coding and machine learning. The studies predominantly coded for relevance, sentiment, theme, user or account, and location of user. Conclusions Standards for data collection and coding should be developed to be able to more easily compare and replicate tobacco-related Twitter results. Additional recommendations include the following: sample Twitter’s databases multiple times, make a distinction between message attitude and emotional tone for sentiment, code images and URLs, and analyze user profiles. Being relatively novel and widely used among adolescents and black and Hispanic individuals, Twitter could provide a rich source of tobacco surveillance data among vulnerable populations. PMID:28363883

  8. Domain Specific Language Support for Exascale

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mellor-Crummey, John

A multi-institutional project known as D-TEC (short for “Domain-specific Technology for Exascale Computing”) set out to explore technologies to support the construction of Domain Specific Languages (DSLs) to map application programs to exascale architectures. DSLs employ automated code transformation to shift the burden of delivering portable performance from application programmers to compilers. Two chief properties contribute: DSLs permit expression at a high level of abstraction so that a programmer’s intent is clear to a compiler, and DSL implementations encapsulate human domain-specific optimization knowledge so that a compiler can be smart enough to achieve good results on specific hardware. Domain specificity is what makes these properties possible in a programming language. If leveraging domain specificity is the key to keeping exascale software tractable, a corollary is that many different DSLs will be needed to encompass the full range of exascale computing applications; moreover, a single application may well need to use several different DSLs in conjunction. As a result, developing a general toolkit for building domain-specific languages was a key goal for the D-TEC project. Different aspects of the D-TEC research portfolio were the focus of work at each of the partner institutions. D-TEC research and development work at Rice University focused on three principal topics: understanding how to automate the tuning of code for complex architectures, research and development of the Rosebud DSL engine, and compiler technology to support complex execution platforms. This report provides a summary of the research and development work on the D-TEC project at Rice University.
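The core DSL idea the report describes, stating intent at a high level and letting a generator emit specialized code, fits in a toy sketch. This illustrates the concept only; it is not D-TEC's Rosebud engine or any real exascale tooling:

```python
def compile_axpy(alpha):
    """Generate a function specialized for y[i] + alpha * x[i].

    A toy of the DSL idea: the 'program' states intent at a high
    level (an axpy over arrays) and the generator emits specialized
    code, here by baking the constant alpha into the source text.
    """
    src = (
        "def axpy(x, y):\n"
        f"    return [yi + {alpha} * xi for xi, yi in zip(x, y)]\n"
    )
    namespace = {}
    exec(src, namespace)   # compile the generated source
    return namespace["axpy"]

axpy2 = compile_axpy(2.0)
print(axpy2([1.0, 2.0], [10.0, 20.0]))  # [12.0, 24.0]
```

A real DSL engine would of course target C, CUDA, or vendor intrinsics rather than Python source, but the division of labor is the same: the programmer's intent stays abstract while optimization knowledge lives in the generator.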

  9. Identifying individuals with physician-diagnosed chronic obstructive pulmonary disease in primary care electronic medical records: a retrospective chart abstraction study.

    PubMed

    Lee, Theresa M; Tu, Karen; Wing, Laura L; Gershon, Andrea S

    2017-05-15

Little is known about using electronic medical records to identify patients with chronic obstructive pulmonary disease to improve quality of care. Our objective was to develop electronic medical record algorithms that can accurately identify patients with chronic obstructive pulmonary disease. A retrospective chart abstraction study was conducted on data from the Electronic Medical Record Administrative data Linked Database (EMRALD®) housed at the Institute for Clinical Evaluative Sciences. Abstracted charts provided the reference standard based on available physician diagnoses, chronic obstructive pulmonary disease-specific medications, smoking history, and pulmonary function testing. Chronic obstructive pulmonary disease electronic medical record algorithms using combinations of terminology in the cumulative patient profile (CPP; problem list/past medical history), physician billing codes (chronic bronchitis/emphysema/other chronic obstructive pulmonary disease), and prescriptions were tested against the reference standard. Sensitivity, specificity, and positive/negative predictive values (PPV/NPV) were calculated. There were 364 patients with chronic obstructive pulmonary disease identified in a 5889 randomly sampled cohort aged ≥ 35 years (prevalence = 6.2%). The electronic medical record algorithm consisting of ≥ 3 physician billing codes for chronic obstructive pulmonary disease per year; documentation in the CPP; tiotropium prescription; or ipratropium (or its formulations) prescription and a chronic obstructive pulmonary disease billing code had a sensitivity of 76.9% (95% CI: 72.2-81.2), specificity of 99.7% (99.5-99.8), PPV of 93.6% (90.3-96.1), and NPV of 98.5% (98.1-98.8). Electronic medical record algorithms can accurately identify patients with chronic obstructive pulmonary disease in primary care records. They can be used to enable further studies in practice patterns and chronic obstructive pulmonary disease management in primary care. NOVEL
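The validation arithmetic behind such studies is a 2x2 confusion-matrix computation. A minimal sketch, with cell counts reconstructed to approximate the reported rates (the study's actual 2x2 table is not given here):

```python
def validation_metrics(tp, fp, fn, tn):
    """Sensitivity, specificity, PPV, and NPV from a 2x2 confusion
    matrix, as computed when testing an EMR algorithm against a
    chart-abstraction reference standard."""
    return {
        "sensitivity": tp / (tp + fn),  # true cases the algorithm catches
        "specificity": tn / (tn + fp),  # non-cases correctly excluded
        "ppv": tp / (tp + fp),          # flagged patients who are cases
        "npv": tn / (tn + fn),          # cleared patients who are not
    }

# Illustrative counts only, chosen to reproduce the reported rates
# (364 true cases in a cohort of 5889; sensitivity 76.9%, etc.).
m = validation_metrics(tp=280, fp=19, fn=84, tn=5506)
print(round(m["sensitivity"], 3), round(m["ppv"], 3))  # 0.769 0.936
```

The same four formulas underlie the bone-metastases ICD-9 validation figures quoted earlier in this collection.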

  10. A neural mechanism of cognitive control for resolving conflict between abstract task rules.

    PubMed

    Sheu, Yi-Shin; Courtney, Susan M

    2016-12-01

    Conflict between multiple sensory stimuli or potential motor responses is thought to be resolved via bias signals from prefrontal cortex (PFC). However, population codes in the PFC also represent abstract information, such as task rules. How is conflict between active abstract representations resolved? We used functional neuroimaging to investigate the mechanism responsible for resolving conflict between abstract representations of task rules. Participants performed two different tasks based on a cue. We manipulated the degree of conflict at the task-rule level by training participants to associate the color and shape dimensions of the cue with either the same task rule (congruent cues) or different ones (incongruent cues). Phonological and semantic tasks were used in which performance depended on learned, abstract representations of information, rather than sensory features of the target stimulus or on any habituated stimulus-response associations. In addition, these tasks activate distinct regions that allowed us to measure magnitude of conflict between tasks. We found that incongruent cues were associated with increased activity in several cognitive control areas, including the inferior frontal gyrus, inferior parietal lobule, insula, and subcortical regions. Conflict between abstract representations appears to be resolved by rule-specific activity in the inferior frontal gyrus that is correlated with enhanced activity related to the relevant information. Furthermore, multi-voxel pattern analysis of the activity in the inferior frontal gyrus was shown to carry information about both the currently relevant rule (semantic/phonological) and the currently relevant cue context (color/shape). Similar to models of attentional selection of conflicting sensory or motor representations, the current findings indicate part of the frontal cortex provides a bias signal, representing task rules, that enhances task-relevant information. 
However, the frontal cortex can also be

  11. Automated Translation of Safety Critical Application Software Specifications into PLC Ladder Logic

    NASA Technical Reports Server (NTRS)

    Leucht, Kurt W.; Semmel, Glenn S.

    2008-01-01

    The numerous benefits of automatic application code generation are widely accepted within the software engineering community. A few of these benefits include raising the abstraction level of application programming, shorter product development time, lower maintenance costs, and increased code quality and consistency. Surprisingly, code generation concepts have not yet found wide acceptance and use in the field of programmable logic controller (PLC) software development. Software engineers at the NASA Kennedy Space Center (KSC) recognized the need for PLC code generation while developing their new ground checkout and launch processing system. They developed a process and a prototype software tool that automatically translates a high-level representation or specification of safety critical application software into ladder logic that executes on a PLC. This process and tool are expected to increase the reliability of the PLC code over that which is written manually, and may even lower life-cycle costs and shorten the development schedule of the new control system at KSC. This paper examines the problem domain and discusses the process and software tool that were prototyped by the KSC software engineers.
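A toy flavor of such spec-to-ladder translation: boolean assignments rendered as rungs of series contacts driving an output coil. This is a hypothetical sketch, not the KSC prototype tool, and real generators emit IEC 61131-3 formats rather than ASCII art:

```python
def to_ladder(spec):
    """Translate lines like 'valve = sensor_a AND NOT sensor_b' into
    ASCII ladder rungs: AND terms become series contacts, NOT becomes
    a normally-closed contact, the left-hand name becomes the coil."""
    rungs = []
    for line in spec.strip().splitlines():
        out, expr = (s.strip() for s in line.split("="))
        cells = []
        for term in expr.split("AND"):
            term = term.strip()
            if term.startswith("NOT "):
                cells.append(f"[/{term[4:]}]")   # normally-closed contact
            else:
                cells.append(f"[ {term} ]")      # normally-open contact
        rungs.append("--" + "--".join(cells) + f"--( {out} )--")
    return "\n".join(rungs)

print(to_ladder("valve = sensor_a AND NOT sensor_b"))
```

Even this toy shows why generated rungs are more uniform than hand-drawn ones: every contact and coil follows mechanically from the specification text.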

  12. CHMWTR: A Plasma Chemistry Code for Water Vapor

    DTIC Science & Technology

    2012-02-01

Naval Research Laboratory, Washington, DC 20375-5320. NRL/MR/6790--12-9383. CHMWTR: A Plasma Chemistry Code for Water Vapor. Daniel F. Gordon, Michael H. Helle, Theodore G. Jones, and K... October 2011. *Directed Energy Scholar, Directed Energy Professional Society. Keywords: plasma chemistry, breakdown field, conductivity.

  13. Hide and Seek: Exploiting and Hardening Leakage-Resilient Code Randomization

    DTIC Science & Technology

    2016-05-30

Robert Rudd (MIT Lincoln Laboratory), Thomas Hobson (MIT Lincoln Laboratory), ...Irvine, Ahmad-Reza Sadeghi (TU Darmstadt), Hamed Okhravi (MIT Lincoln Laboratory). Abstract: Information leakage vulnerabilities can allow adversaries to bypass mitigations based on code randomization. This discovery motivates numerous techniques that diminish direct and indirect information leakage: (i...

  14. Hide and Seek: Exploiting and Hardening Leakage-Resilient Code Randomization

    DTIC Science & Technology

    2016-03-30

Robert Rudd (MIT Lincoln Laboratory), Thomas Hobson (MIT Lincoln Laboratory), ...Irvine, Ahmad-Reza Sadeghi (TU Darmstadt), Hamed Okhravi (MIT Lincoln Laboratory). Abstract: Information leakage vulnerabilities can allow adversaries to bypass mitigations based on code randomization. This discovery motivates numerous techniques that diminish direct and indirect information leakage: (i...

  15. Exome chip meta-analysis identifies novel loci and East Asian-specific coding variants that contribute to lipid levels and coronary artery disease.

    PubMed

    Lu, Xiangfeng; Peloso, Gina M; Liu, Dajiang J; Wu, Ying; Zhang, He; Zhou, Wei; Li, Jun; Tang, Clara Sze-Man; Dorajoo, Rajkumar; Li, Huaixing; Long, Jirong; Guo, Xiuqing; Xu, Ming; Spracklen, Cassandra N; Chen, Yang; Liu, Xuezhen; Zhang, Yan; Khor, Chiea Chuen; Liu, Jianjun; Sun, Liang; Wang, Laiyuan; Gao, Yu-Tang; Hu, Yao; Yu, Kuai; Wang, Yiqin; Cheung, Chloe Yu Yan; Wang, Feijie; Huang, Jianfeng; Fan, Qiao; Cai, Qiuyin; Chen, Shufeng; Shi, Jinxiu; Yang, Xueli; Zhao, Wanting; Sheu, Wayne H-H; Cherny, Stacey Shawn; He, Meian; Feranil, Alan B; Adair, Linda S; Gordon-Larsen, Penny; Du, Shufa; Varma, Rohit; Chen, Yii-Der Ida; Shu, Xiao-Ou; Lam, Karen Siu Ling; Wong, Tien Yin; Ganesh, Santhi K; Mo, Zengnan; Hveem, Kristian; Fritsche, Lars G; Nielsen, Jonas Bille; Tse, Hung-Fat; Huo, Yong; Cheng, Ching-Yu; Chen, Y Eugene; Zheng, Wei; Tai, E Shyong; Gao, Wei; Lin, Xu; Huang, Wei; Abecasis, Goncalo; Kathiresan, Sekar; Mohlke, Karen L; Wu, Tangchun; Sham, Pak Chung; Gu, Dongfeng; Willer, Cristen J

    2017-12-01

    Most genome-wide association studies have been of European individuals, even though most genetic variation in humans is seen only in non-European samples. To search for novel loci associated with blood lipid levels and clarify the mechanism of action at previously identified lipid loci, we used an exome array to examine protein-coding genetic variants in 47,532 East Asian individuals. We identified 255 variants at 41 loci that reached chip-wide significance, including 3 novel loci and 14 East Asian-specific coding variant associations. After a meta-analysis including >300,000 European samples, we identified an additional nine novel loci. Sixteen genes were identified by protein-altering variants in both East Asians and Europeans, and thus are likely to be functional genes. Our data demonstrate that most of the low-frequency or rare coding variants associated with lipids are population specific, and that examining genomic data across diverse ancestries may facilitate the identification of functional genes at associated loci.

  16. Interdependence, Reflexivity, Fidelity, Impedance Matching, and the Evolution of Genetic Coding

    PubMed Central

    Carter, Charles W; Wills, Peter R

    2018-01-01

    Abstract Genetic coding is generally thought to have required ribozymes whose functions were taken over by polypeptide aminoacyl-tRNA synthetases (aaRS). Two discoveries about aaRS and their interactions with tRNA substrates now furnish a unifying rationale for the opposite conclusion: that the key processes of the Central Dogma of molecular biology emerged simultaneously and naturally from simple origins in a peptide•RNA partnership, eliminating the epistemological utility of a prior RNA world. First, the two aaRS classes likely arose from opposite strands of the same ancestral gene, implying a simple genetic alphabet. The resulting inversion symmetries in aaRS structural biology would have stabilized the initial and subsequent differentiation of coding specificities, rapidly promoting diversity in the proteome. Second, amino acid physical chemistry maps onto tRNA identity elements, establishing reflexive, nanoenvironmental sensing in protein aaRS. Bootstrapping of increasingly detailed coding is thus intrinsic to polypeptide aaRS, but impossible in an RNA world. These notions underline the following concepts that contradict gradual replacement of ribozymal aaRS by polypeptide aaRS: 1) aaRS enzymes must be interdependent; 2) reflexivity intrinsic to polypeptide aaRS production dynamics promotes bootstrapping; 3) takeover of RNA-catalyzed aminoacylation by enzymes will necessarily degrade specificity; and 4) the Central Dogma’s emergence is most probable when replication and translation error rates remain comparable. These characteristics are necessary and sufficient for the essentially de novo emergence of a coupled gene–replicase–translatase system of genetic coding that would have continuously preserved the functional meaning of genetically encoded protein genes whose phylogenetic relationships match those observed today. PMID:29077934

  17. Portable LQCD Monte Carlo code using OpenACC

    NASA Astrophysics Data System (ADS)

    Bonati, Claudio; Calore, Enrico; Coscetti, Simone; D'Elia, Massimo; Mesiti, Michele; Negro, Francesco; Fabio Schifano, Sebastiano; Silvi, Giorgio; Tripiccione, Raffaele

    2018-03-01

    Varying from multi-core CPU processors to many-core GPUs, the present scenario of HPC architectures is extremely heterogeneous. In this context, code portability is increasingly important for easy maintainability of applications; this is relevant in scientific computing where code changes are numerous and frequent. In this talk we present the design and optimization of a state-of-the-art production level LQCD Monte Carlo application, using the OpenACC directives model. OpenACC aims to abstract parallel programming to a descriptive level, where programmers do not need to specify the mapping of the code on the target machine. We describe the OpenACC implementation and show that the same code is able to target different architectures, including state-of-the-art CPUs and GPUs.

  18. Using Pulsed Power for Hydrodynamic Code Validation

    DTIC Science & Technology

    2001-06-01

...bank at the Air Force Research Laboratory (AFRL). A cylindrical aluminum liner that is magnetically imploded onto a central target by self-induced... James Degnan, George Kiuttu. Air Force Research Laboratory, Albuquerque, NM 87117. Abstract: As part of ongoing hydrodynamic code...

  19. Abstract: Using System Dynamics Analysis for Evaluating the Sustainability of “Complete Streets” Practices

    EPA Science Inventory

    Abstract: Using System Dynamics Analysis for Evaluating the Sustainability of “Complete Streets” Practices Primary Author: Nicholas R. Flanders 109 T.W. Alexander Drive Mail Code: E343-02 Research Triangle Park, NC 27709 919-541-3660 Flanders.nick@Epa.gov Topic categ...

  20. Thinking Big or Small: Does Mental Abstraction Affect Social Network Organization?

    PubMed Central

    Bacev-Giles, Chantal; Peetz, Johanna

    2016-01-01

Four studies examined how mental abstraction affects how people perceive their relationships with other people, specifically, how these relationships may be categorized into social groups. We expected that individuals induced to think abstractly would report fewer, more global social groups, whereas those induced to think concretely would report more numerous, specific groups. However, an induced abstract mindset did not affect how people structured their social groups (Studies 2-4), despite evidence that the mindset manipulation changed the level of abstraction in their thoughts (Study 3) and that it changed how people structured groups in a control condition (household objects, Study 4). Together, these studies suggest that although the way people organize their relationships into groups is malleable, cognitive abstraction does not seem to affect how people categorize their relationships into social groups. PMID:26808086

  1. Learning by Doing: Teaching Decision Making through Building a Code of Ethics.

    ERIC Educational Resources Information Center

    Hawthorne, Mark D.

    2001-01-01

    Notes that applying abstract ethical principles to the practical business of building a code of applied ethics for a technical communication department teaches students that they share certain unarticulated or unconscious values that they can translate into ethical principles. Suggests that combining abstract theory with practical policy writing…

  2. Heat pipe design handbook, part 2. [digital computer code specifications

    NASA Technical Reports Server (NTRS)

    Skrabek, E. A.

    1972-01-01

    The utilization of a digital computer code for heat pipe analysis and design (HPAD) is described which calculates the steady state hydrodynamic heat transport capability of a heat pipe with a particular wick configuration, the working fluid being a function of wick cross-sectional area. Heat load, orientation, operating temperature, and heat pipe geometry are specified. Both one 'g' and zero 'g' environments are considered, and, at the user's option, the code will also perform a weight analysis and will calculate heat pipe temperature drops. The central porous slab, circumferential porous wick, arterial wick, annular wick, and axial rectangular grooves are the wick configurations which HPAD has the capability of analyzing. For Vol. 1, see N74-22569.
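HPAD itself is not reproduced here, but the kind of steady-state hydrodynamic limit it evaluates can be sketched with the textbook capillary-limit balance, which covers both the zero-'g' and one-'g' cases through a gravity-head term. The formula is the standard heat pipe capillary limit; the fluid properties and wick geometry below are illustrative assumptions, not HPAD inputs:

```python
import math

def capillary_limit(sigma, rho, mu, h_fg, K, A_w, r_c, L_eff, L,
                    g=9.81, tilt_deg=0.0):
    """Classic capillary heat-transport limit of a heat pipe: the
    maximum capillary pumping pressure 2*sigma/r_c must overcome the
    viscous liquid pressure drop through the wick and, in one 'g',
    the gravitational head of the tilted pipe."""
    dp_cap = 2.0 * sigma / r_c                       # capillary pumping, Pa
    dp_grav = rho * g * L * math.sin(math.radians(tilt_deg))
    # Darcy flow through the wick relates Q to the remaining pressure.
    return (K * A_w * rho * h_fg / (mu * L_eff)) * (dp_cap - dp_grav)

# Illustrative water properties near 100 C and a made-up wick geometry
# (permeability K, wick area A_w, capillary radius r_c, lengths in m).
props = dict(sigma=0.059, rho=958.0, mu=2.8e-4, h_fg=2.26e6,
             K=1e-10, A_w=1e-5, r_c=5e-5, L_eff=0.5, L=0.5)
print(round(capillary_limit(**props, g=0.0), 1))         # zero-'g' case
print(round(capillary_limit(**props, tilt_deg=5.0), 1))  # one-'g', 5 deg tilt
```

As in HPAD, the zero-'g' case simply drops the gravity term, so the same routine serves both environments; orientation enters only through the tilt angle.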

  3. Functional Abstraction as a Method to Discover Knowledge in Gene Ontologies

    PubMed Central

    Ultsch, Alfred; Lötsch, Jörn

    2014-01-01

Computational analyses of functions of gene sets obtained in microarray analyses or by topical database searches are increasingly important in biology. To understand their functions, the sets are usually mapped to Gene Ontology knowledge bases by means of over-representation analysis (ORA). Its result represents the specific knowledge of the functionality of the gene set. However, the specific ontology typically consists of many terms and relationships, hindering the understanding of the ‘main story’. We developed a methodology to identify a comprehensibly small number of GO terms as “headlines” of the specific ontology, allowing all central aspects of the roles of the involved genes to be understood. The Functional Abstraction method finds a set of headlines that is specific enough to cover all details of a specific ontology and abstract enough for human comprehension. This method exceeds classical approaches to ORA abstraction; by focusing on information content rather than decorrelation of GO terms, it directly targets human comprehension. Functional Abstraction provides, with a maximum of certainty, information value, coverage, and conciseness, a representation of the biological functions in which a gene set plays a role. This is the necessary means to interpret complex Gene Ontology results, thus strengthening the role of functional genomics in biomarker and drug discovery. PMID:24587272
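The ORA step that Functional Abstraction builds on is, classically, a hypergeometric over-representation test. A minimal stdlib sketch with toy numbers (not the paper's data):

```python
from math import comb

def ora_pvalue(N, K, n, k):
    """Over-representation p-value: the probability of drawing at
    least k genes annotated to a GO term (K of N genes genome-wide)
    when sampling a result set of n genes, i.e. the hypergeometric
    upper tail used in classical ORA."""
    total = comb(N, n)
    return sum(comb(K, i) * comb(N - K, n - i)
               for i in range(k, min(K, n) + 1)) / total

# Toy numbers: 20,000 genes genome-wide, a GO term annotating 100 of
# them, and a 50-gene result set containing 5 members of the term.
p = ora_pvalue(N=20000, K=100, n=50, k=5)
print(f"{p:.2e}")
```

Running the test for every GO term and correcting for multiple testing yields the large "specific ontology" that the headline-selection step then compresses.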

  4. Brain network response underlying decisions about abstract reinforcers.

    PubMed

    Mills-Finnerty, Colleen; Hanson, Catherine; Hanson, Stephen Jose

    2014-12-01

    Decision making studies typically use tasks that involve concrete action-outcome contingencies, in which subjects do something and get something. No studies have addressed decision making involving abstract reinforcers, where there are no action-outcome contingencies and choices are entirely hypothetical. The present study examines these kinds of choices, as well as whether the same biases that exist for concrete reinforcer decisions, specifically framing effects, also apply during abstract reinforcer decisions. We use both a General Linear Model analysis and a Bayes network connectivity analysis based on the Independent Multi-sample Greedy Equivalence Search (IMaGES) algorithm to examine the network response underlying choices for abstract reinforcers under positive and negative framing. We find for the first time that abstract reinforcer decisions activate the same network of brain regions as concrete reinforcer decisions, including the striatum, insula, anterior cingulate, and VMPFC, results that are further supported via comparison to a meta-analysis of decision making studies. Positive and negative framing activated different parts of this network, with stronger activation in VMPFC during negative framing and in DLPFC during positive framing, suggesting different decision making pathways depending on frame. These results were further clarified using connectivity analysis, which revealed stronger connections between anterior cingulate, insula, and accumbens during negative framing compared to positive. Taken together, these results suggest not only that abstract reinforcer decisions rely on the same brain substrates as concrete reinforcers, but also that the responses underlying framing effects on abstract reinforcers resemble those for concrete reinforcers, specifically increased limbic system connectivity during negative frames. Copyright © 2014 Elsevier Inc. All rights reserved.

  5. Sparse and Specific Coding during Information Transmission between Co-cultured Dentate Gyrus and CA3 Hippocampal Networks

    PubMed Central

    Poli, Daniele; Thiagarajan, Srikanth; DeMarse, Thomas B.; Wheeler, Bruce C.; Brewer, Gregory J.

    2017-01-01

    To better understand encoding and decoding of stimulus information in two specific hippocampal sub-regions, we isolated and co-cultured rat primary dentate gyrus (DG) and CA3 neurons within a two-chamber device with axonal connectivity via micro-tunnels. We tested the hypothesis that, in these engineered networks, decoding performance of stimulus site information would be more accurate when stimuli and information flow occur in anatomically correct feed-forward DG to CA3 vs. CA3 back to DG. In particular, we characterized the neural code of these sub-regions by measuring sparseness and uniqueness of the responses evoked by specific paired-pulse stimuli. We used the evoked responses in CA3 to decode the stimulation sites in DG (and vice-versa) by means of learning algorithms for classification (support vector machine, SVM). The device was placed over an 8 × 8 grid of extracellular electrodes (micro-electrode array, MEA) in order to provide a platform for monitoring development, self-organization, and improved access to stimulation and recording at multiple sites. The micro-tunnels were designed with dimensions 3 × 10 × 400 μm allowing axonal growth but not migration of cell bodies and long enough to exclude traversal by dendrites. Paired-pulse stimulation (inter-pulse interval 50 ms) was applied at 22 different sites and repeated 25 times in each chamber for each sub-region to evoke time-locked activity. DG-DG and CA3-CA3 networks were used as controls. Stimulation in DG drove signals through the axons in the tunnels to activate a relatively small set of specific electrodes in CA3 (sparse code). CA3-CA3 and DG-DG controls were less sparse in coding than CA3 in DG-CA3 networks. Using all target electrodes with the three highest spike rates (14%), the evoked responses in CA3 specified each stimulation site in DG with optimum uniqueness of 64%. Finally, by SVM learning, these evoked responses in CA3 correctly decoded the stimulation sites in DG for 43% of the
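
    The decoding setup described above (predicting the stimulation site from evoked multi-electrode activity) can be sketched on synthetic data. The study used a support vector machine; the nearest-centroid classifier below is a deliberately simpler stand-in to keep the sketch dependency-free, and all counts and response shapes are invented:

```python
import random

random.seed(0)
N_SITES, N_ELECTRODES, N_TRIALS = 4, 8, 25

def evoked_response(site):
    # Each site drives a characteristic subset of electrodes (a sparse code),
    # plus Gaussian noise.
    base = [10.0 if e % N_SITES == site else 2.0 for e in range(N_ELECTRODES)]
    return [b + random.gauss(0.0, 1.0) for b in base]

# "Training": average the evoked responses per stimulation site.
train = {s: [evoked_response(s) for _ in range(N_TRIALS)] for s in range(N_SITES)}
centroids = {s: [sum(col) / N_TRIALS for col in zip(*trials)]
             for s, trials in train.items()}

def decode(resp):
    # Assign the response to the site with the nearest centroid.
    return min(centroids,
               key=lambda s: sum((r - c) ** 2 for r, c in zip(resp, centroids[s])))

# Held-out trials: accuracy should sit far above the 1/4 chance level.
test_trials = [(s, evoked_response(s)) for s in range(N_SITES) for _ in range(10)]
accuracy = sum(decode(r) == s for s, r in test_trials) / len(test_trials)
```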

  6. PYTHON for Variable Star Astronomy (Abstract)

    NASA Astrophysics Data System (ADS)

    Craig, M.

    2018-06-01

    (Abstract only) Open source PYTHON packages that are useful for data reduction, photometry, and other tasks relevant to variable star astronomy have been developed over the last three to four years as part of the Astropy project. Using this software, it is relatively straightforward to reduce images, automatically detect sources, and match them to catalogs. Over the last year browser-based tools for performing some of those tasks have been developed that minimize or eliminate the need to write any of your own code. After providing an overview of the current state of the software, an application that calculates transformation coefficients on a frame-by-frame basis by matching stars in an image to the APASS catalog will be described.

  7. IKOS: A Framework for Static Analysis based on Abstract Interpretation (Tool Paper)

    NASA Technical Reports Server (NTRS)

    Brat, Guillaume P.; Laserna, Jorge A.; Shi, Nija; Venet, Arnaud Jean

    2014-01-01

    The RTCA standard (DO-178C) for developing avionic software and getting certification credits includes an extension (DO-333) that describes how developers can use static analysis in certification. In this paper, we give an overview of the IKOS static analysis framework, which helps develop static analyses that are both precise and scalable. IKOS harnesses the power of Abstract Interpretation and makes it accessible to a larger class of static analysis developers by separating concerns such as code parsing, model development, abstract domain management, results management, and analysis strategy. The benefits of the approach are demonstrated by a buffer overflow analysis applied to flight control systems.
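
    The style of analysis IKOS supports can be illustrated with the simplest abstract domain, intervals. The toy class below is not IKOS's actual API (IKOS is a C++ framework); it only shows how interval arithmetic over-approximates run-time values so that a buffer-overflow check can be answered conservatively:

```python
class Interval:
    """Toy interval abstract domain: each variable is tracked as [lo, hi]."""
    def __init__(self, lo, hi):
        self.lo, self.hi = lo, hi

    def __add__(self, other):
        # Abstract addition: a sound over-approximation of concrete addition.
        return Interval(self.lo + other.lo, self.hi + other.hi)

    def join(self, other):
        # Least upper bound, used where control-flow paths merge.
        return Interval(min(self.lo, other.lo), max(self.hi, other.hi))

# index in [0, 9] plus an offset in [0, 2], checked against a 10-element buffer
idx = Interval(0, 9) + Interval(0, 2)   # abstract value [0, 11]
may_overflow = idx.hi >= 10             # analysis must report a potential overflow
```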

  8. Software Model Checking Without Source Code

    NASA Technical Reports Server (NTRS)

    Chaki, Sagar; Ivers, James

    2009-01-01

    We present a framework, called AIR, for verifying safety properties of assembly language programs via software model checking. AIR extends the applicability of predicate abstraction and counterexample-guided abstraction refinement to the automated verification of low-level software. By working at the assembly level, AIR allows verification of programs for which source code is unavailable, such as legacy and COTS software, and of programs that use features, such as pointers, structures, and object orientation, that are problematic for source-level software verification tools. In addition, AIR makes no assumptions about the underlying compiler technology. We have implemented a prototype of AIR and present encouraging results on several non-trivial examples.

  9. An Abstract Plan Preparation Language

    NASA Technical Reports Server (NTRS)

    Butler, Ricky W.; Munoz, Cesar A.

    2006-01-01

    This paper presents a new planning language that is more abstract than most existing planning languages such as the Planning Domain Definition Language (PDDL) or the New Domain Description Language (NDDL). The goal of this language is to simplify the formal analysis and specification of planning problems that are intended for safety-critical applications such as power management or automated rendezvous in future manned spacecraft. The new language has been named the Abstract Plan Preparation Language (APPL). A translator from APPL to NDDL has been developed in support of the Spacecraft Autonomy for Vehicles and Habitats Project (SAVH) sponsored by the Explorations Technology Development Program, which is seeking to mature autonomy technology for application to the new Crew Exploration Vehicle (CEV) that will replace the Space Shuttle.

  10. SEER*Educate: Use of Abstracting Quality Index Scores to Monitor Improvement of All Employees.

    PubMed

    Potts, Mary S; Scott, Tim; Hafterson, Jennifer L

    2016-01-01

    Integral parts of the Seattle-Puget Sound's Cancer Surveillance System registry's continuous improvement model include the incorporation of SEER*Educate into its training program for all staff and analyzing assessment results using the Abstracting Quality Index (AQI). The AQI offers a comprehensive measure of overall performance in SEER*Educate, which is a Web-based application used to personalize learning and diagnostically pinpoint each staff member's place on the AQI continuum. The assessment results are tallied from 6 abstracting standards within 2 domains: incidence reporting and coding accuracy. More than 100 data items are aligned to 1 or more of the 6 standards to build an aggregated score that is placed on a continuum for continuous improvement. The AQI score accurately identifies those individuals who have a good understanding of how to apply the 6 abstracting standards to reliably generate high quality abstracts.

  11. Some partial-unit-memory convolutional codes

    NASA Technical Reports Server (NTRS)

    Abdel-Ghaffar, K.; Mceliece, R. J.; Solomon, G.

    1991-01-01

    The results of a study on a class of error correcting codes called partial unit memory (PUM) codes are presented. This class of codes, though not entirely new, has until now remained relatively unexplored. The possibility of using the well developed theory of block codes to construct a large family of promising PUM codes is shown. The performance of several specific PUM codes are compared with that of the Voyager standard (2, 1, 6) convolutional code. It was found that these codes can outperform the Voyager code with little or no increase in decoder complexity. This suggests that there may very well be PUM codes that can be used for deep space telemetry that offer both increased performance and decreased implementational complexity over current coding systems.

  12. Multi-level bandwidth efficient block modulation codes

    NASA Technical Reports Server (NTRS)

    Lin, Shu

    1989-01-01

    The multilevel technique is investigated for combining block coding and modulation. There are four parts. In the first part, a formulation is presented for signal sets on which modulation codes are to be constructed. Distance measures on a signal set are defined and their properties are developed. In the second part, a general formulation is presented for multilevel modulation codes in terms of component codes with appropriate Euclidean distances. The distance properties, Euclidean weight distribution, and linear structure of multilevel modulation codes are investigated. In the third part, several specific methods for constructing multilevel block modulation codes with interdependency among component codes are proposed. Given a multilevel block modulation code C with no interdependency among the binary component codes, the proposed methods give a multilevel block modulation code C' which has the same rate as C, a minimum squared Euclidean distance not less than that of C, a trellis diagram with the same number of states as that of C, and a smaller number of nearest-neighbor codewords than C. In the last part, error performance of block modulation codes is analyzed for an AWGN channel based on soft-decision maximum likelihood decoding. Error probabilities of some specific codes are evaluated based on their Euclidean weight distributions and simulation results.

  13. Deductive Glue Code Synthesis for Embedded Software Systems Based on Code Patterns

    NASA Technical Reports Server (NTRS)

    Liu, Jian; Fu, Jicheng; Zhang, Yansheng; Bastani, Farokh; Yen, I-Ling; Tai, Ann; Chau, Savio N.

    2006-01-01

    Automated code synthesis is a constructive process that can be used to generate programs from specifications. It can, thus, greatly reduce software development cost and time. The use of a formal code synthesis approach for software generation further increases the dependability of the system. Though code synthesis has many potential benefits, the synthesis techniques are still limited. Meanwhile, components are widely used in embedded system development. Applying code synthesis to the component-based software development (CBSD) process can greatly enhance the capability of code synthesis while reducing component composition efforts. In this paper, we discuss the issues and techniques for applying deductive code synthesis techniques to CBSD. For deductive synthesis in CBSD, a rule base is the key to inferring appropriate component compositions. We use code patterns to guide the development of rules. Code patterns have been proposed to capture the typical usages of the components. Several general composition operations have been identified to facilitate systematic composition. We present the technique for rule development and automated generation of new patterns from existing code patterns. A case study of using this method in building a real-time control system is also presented.

  14. The neural representation of abstract words: the role of emotion.

    PubMed

    Vigliocco, Gabriella; Kousta, Stavroula-Thaleia; Della Rosa, Pasquale Anthony; Vinson, David P; Tettamanti, Marco; Devlin, Joseph T; Cappa, Stefano F

    2014-07-01

    It is generally assumed that abstract concepts are linguistically coded, in line with imaging evidence of greater engagement of the left perisylvian language network for abstract than concrete words (Binder JR, Desai RH, Graves WW, Conant LL. 2009. Where is the semantic system? A critical review and meta-analysis of 120 functional neuroimaging studies. Cerebral Cortex. 19:2767-2796; Wang J, Conder JA, Blitzer DN, Shinkareva SV. 2010. Neural representation of abstract and concrete concepts: A meta-analysis of neuroimaging studies. Hum Brain Map. 31:1459-1468). Recent behavioral work, which used tighter matching of items than previous studies, however, suggests that abstract concepts also entail affective processing to a greater extent than concrete concepts (Kousta S-T, Vigliocco G, Vinson DP, Andrews M, Del Campo E. The representation of abstract words: Why emotion matters. J Exp Psychol Gen. 140:14-34). Here we report a functional magnetic resonance imaging experiment that shows greater engagement of the rostral anterior cingulate cortex, an area associated with emotion processing (e.g., Etkin A, Egner T, Peraza DM, Kandel ER, Hirsch J. 2006. Resolving emotional conflict: A role for the rostral anterior cingulate cortex in modulating activity in the amygdala. Neuron. 52:871), in abstract processing. For abstract words, activation in this area was modulated by the hedonic valence (degree of positive or negative affective association) of our items. A correlation analysis of more than 1,400 English words further showed that abstract words, in general, receive higher ratings for affective associations (both valence and arousal) than concrete words, supporting the view that engagement of emotional processing is generally required for processing abstract words. 
We argue that these results support embodiment views of semantic representation, according to which, whereas concrete concepts are grounded in our sensory-motor experience, affective experience is crucial in the

  15. A Formal Specification and Verification Method for the Prevention of Denial of Service in Ada Services

    DTIC Science & Technology

    1988-03-01

    Mechanism; Computer Security. ...denial of service. This paper assumes that the reader is a computer science or engineering professional working in the area of formal specification and...recovery from such events as deadlocks and crashes can be accounted for in the computation of the waiting time for each service in the service hierarchy

  16. Identifying personal microbiomes using metagenomic codes

    PubMed Central

    Franzosa, Eric A.; Huang, Katherine; Meadow, James F.; Gevers, Dirk; Lemon, Katherine P.; Bohannan, Brendan J. M.; Huttenhower, Curtis

    2015-01-01

    Community composition within the human microbiome varies across individuals, but it remains unknown if this variation is sufficient to uniquely identify individuals within large populations or stable enough to identify them over time. We investigated this by developing a hitting set-based coding algorithm and applying it to the Human Microbiome Project population. Our approach defined body site-specific metagenomic codes: sets of microbial taxa or genes prioritized to uniquely and stably identify individuals. Codes capturing strain variation in clade-specific marker genes were able to distinguish among 100s of individuals at an initial sampling time point. In comparisons with follow-up samples collected 30–300 d later, ∼30% of individuals could still be uniquely pinpointed using metagenomic codes from a typical body site; coincidental (false positive) matches were rare. Codes based on the gut microbiome were exceptionally stable and pinpointed >80% of individuals. The failure of a code to match its owner at a later time point was largely explained by the loss of specific microbial strains (at current limits of detection) and was only weakly associated with the length of the sampling interval. In addition to highlighting patterns of temporal variation in the ecology of the human microbiome, this work demonstrates the feasibility of microbiome-based identifiability—a result with important ethical implications for microbiome study design. The datasets and code used in this work are available for download from huttenhower.sph.harvard.edu/idability. PMID:25964341
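
    The hitting-set flavor of the coding algorithm can be sketched as a greedy loop: repeatedly add the feature of the target individual that rules out the most remaining candidates. The function and toy data below are illustrative only, not the authors' published implementation (which is linked above):

```python
def greedy_metagenomic_code(target, population, features):
    """Greedily pick a small set of the target's features (taxa/markers)
    whose combination no other individual in the population carries.
    features: dict mapping each individual to a set of features."""
    code, others = set(), {p for p in population if p != target}
    while others:
        candidates = features[target] - code
        if not candidates:
            return None  # target cannot be uniquely identified
        # The feature absent from the most remaining individuals wins.
        best = max(candidates,
                   key=lambda f: sum(f not in features[p] for p in others))
        code.add(best)
        others = {p for p in others if best in features[p]}
    return code

features = {'A': {'t1', 't2', 't3'}, 'B': {'t1', 't3'}, 'C': {'t2', 't3'}}
code = greedy_metagenomic_code('A', ['A', 'B', 'C'], features)  # {'t1', 't2'}
```

Only individual 'A' carries both t1 and t2, so that pair serves as A's code.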

  17. Performance of Serially Concatenated Convolutional Codes with Binary Modulation in AWGN and Noise Jamming over Rayleigh Fading Channels

    DTIC Science & Technology

    2001-09-01

    Rate-compatible punctured convolutional codes (RCPC codes) and their applications," IEEE...ABSTRACT In this dissertation, the bit error rates for serially concatenated convolutional codes (SCCC) for both BPSK and DPSK modulation with...EXECUTIVE SUMMARY In this dissertation, the bit error rates of serially concatenated convolutional codes

  18. Conflicting demands of abstract and specific visual object processing resolved by frontoparietal networks.

    PubMed

    McMenamin, Brenton W; Marsolek, Chad J; Morseth, Brianna K; Speer, MacKenzie F; Burton, Philip C; Burgund, E Darcy

    2016-06-01

    Object categorization and exemplar identification place conflicting demands on the visual system, yet humans easily perform these fundamentally contradictory tasks. Previous studies suggest the existence of dissociable visual processing subsystems to accomplish the two abilities: an abstract category (AC) subsystem that operates effectively in the left hemisphere and a specific exemplar (SE) subsystem that operates effectively in the right hemisphere. This multiple subsystems theory explains a range of visual abilities, but previous studies have not explored what mechanisms exist for coordinating the function of multiple subsystems and/or resolving the conflicts that would arise between them. We collected functional MRI data while participants performed two variants of a cue-probe working memory task that required AC or SE processing. During the maintenance phase of the task, the bilateral intraparietal sulcus (IPS) exhibited hemispheric asymmetries in functional connectivity consistent with exerting proactive control over the two visual subsystems: greater connectivity to the left hemisphere during the AC task, and greater connectivity to the right hemisphere during the SE task. Moreover, probe-evoked activation revealed activity in a broad frontoparietal network (containing IPS) associated with reactive control when the two visual subsystems were in conflict, and variations in this conflict signal across trials were related to the visual similarity of the cue-probe stimulus pairs. Although many studies have confirmed the existence of multiple visual processing subsystems, this study is the first to identify the mechanisms responsible for coordinating their operations.

  19. NASA Patent Abstracts bibliography: A continuing bibliography. Section 1: Abstracts (supplement 21) Abstracts

    NASA Technical Reports Server (NTRS)

    1982-01-01

    Abstracts are cited for 87 patents and applications introduced into the NASA scientific and technical information system during the period of January 1982 through June 1982. Each entry consists of a citation, an abstract, and in most cases, a key illustration selected from the patent or patent application.

  20. Grounding Abstractness: Abstract Concepts and the Activation of the Mouth

    PubMed Central

    Borghi, Anna M.; Zarcone, Edoardo

    2016-01-01

    One key issue for theories of cognition is how abstract concepts, such as freedom, are represented. According to the WAT (Words As social Tools) proposal, abstract concepts activate both sensorimotor and linguistic/social information, and their acquisition modality involves the linguistic experience more than the acquisition of concrete concepts. We report an experiment in which participants were presented with abstract and concrete definitions followed by concrete and abstract target-words. When the definition and the word matched, participants were required to press a key, either with the hand or with the mouth. Response times and accuracy were recorded. As predicted, we found that abstract definitions and abstract words yielded slower responses and more errors compared to concrete definitions and concrete words. More crucially, there was an interaction between the target-words and the effector used to respond (hand, mouth). While responses with the mouth were overall slower, the advantage of the hand over the mouth responses was more marked with concrete than with abstract concepts. The results are in keeping with grounded and embodied theories of cognition and support the WAT proposal, according to which abstract concepts evoke linguistic-social information, hence activate the mouth. The mechanisms underlying the mouth activation with abstract concepts (re-enactment of acquisition experience, or re-explanation of the word meaning, possibly through inner talk) are discussed. To our knowledge this is the first behavioral study demonstrating with real words that the advantage of the hand over the mouth is more marked with concrete than with abstract concepts, likely because of the activation of linguistic information with abstract concepts. PMID:27777563

  1. Convolutional coding combined with continuous phase modulation

    NASA Technical Reports Server (NTRS)

    Pizzi, S. V.; Wilson, S. G.

    1985-01-01

    Background theory and specific coding designs for combined coding/modulation schemes utilizing convolutional codes and continuous-phase modulation (CPM) are presented. In this paper the case of r = 1/2 coding onto a 4-ary CPM is emphasized, with short-constraint length codes presented for continuous-phase FSK, double-raised-cosine, and triple-raised-cosine modulation. Coding buys several decibels of coding gain over the Gaussian channel, with an attendant increase of bandwidth. Performance comparisons in the power-bandwidth tradeoff with other approaches are made.
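
    As a concrete reference point for the rate-1/2 coding discussed above, the sketch below implements the textbook constraint-length-3 feedforward encoder with octal generators (7, 5). It is a generic example, not one of the short-constraint-length codes designed in the paper, and it says nothing about the CPM stage:

```python
def conv_encode(bits, g1=0b111, g2=0b101):
    """Rate-1/2 feedforward convolutional encoder, constraint length 3.
    Two output bits per input bit, taken as parities over the generator taps."""
    state, out = 0, []
    for b in bits:
        state = ((state << 1) | b) & 0b111          # shift the new bit into the register
        out.append(bin(state & g1).count("1") % 2)  # parity over the taps of g1
        out.append(bin(state & g2).count("1") % 2)  # parity over the taps of g2
    return out

conv_encode([1, 0, 1, 1])   # -> [1, 1, 1, 0, 0, 0, 0, 1]
```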

  2. Stellar Presentations (Abstract)

    NASA Astrophysics Data System (ADS)

    Young, D.

    2015-12-01

    (Abstract only) The AAVSO is in the process of expanding its education, outreach, and speakers bureau program. PowerPoint presentations prepared for specific target audiences, such as AAVSO members, educators, students, the general public, and Science Olympiad teams, coaches, event supervisors, and state directors, will be available online for members to use. The presentations range from general content on stellar evolution and variable stars to specific activities for a workshop environment. A presentation that works for high school students, even one with a general topic, will not work for educators, Science Olympiad teams, or the general public. Each audience is unique and requires a different approach. The current environment necessitates presentations that are captivating for a younger generation embedded in a highly visual, sound-bite world of social media, Twitter, YouTube, and mobile devices. For educators, presentations and workshops for themselves and their students must support the Next Generation Science Standards (NGSS), the Common Core Content Standards, and the Science, Technology, Engineering, and Mathematics (STEM) initiative. Current best practices for developing relevant and engaging PowerPoint presentations for a variety of targeted audiences will be presented, along with several examples.

  3. System Design Description for the TMAD Code

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Finfrock, S.H.

    This document serves as the System Design Description (SDD) for the TMAD Code System, which includes the TMAD code and the LIBMAKR code. The SDD provides a detailed description of the theory behind the code, and the implementation of that theory. It is essential for anyone who is attempting to review or modify the code or who otherwise needs to understand the internal workings of the code. In addition, this document includes, in Appendix A, the System Requirements Specification for the TMAD System.

  4. Expanding the genetic code for site-specific labelling of tobacco mosaic virus coat protein and building biotin-functionalized virus-like particles.

    PubMed

    Wu, F C; Zhang, H; Zhou, Q; Wu, M; Ballard, Z; Tian, Y; Wang, J Y; Niu, Z W; Huang, Y

    2014-04-18

    A method for site-specific and high yield modification of tobacco mosaic virus coat protein (TMVCP) utilizing a genetic code expanding technology and copper free cycloaddition reaction has been established, and biotin-functionalized virus-like particles were built by the self-assembly of the protein monomers.

  5. NASA Patent Abstracts: A Continuing Bibliography. Supplement 54

    NASA Technical Reports Server (NTRS)

    1999-01-01

    The NASA Patent Abstracts Bibliography is a semiannual NASA publication containing comprehensive abstracts of NASA-owned inventions covered by U.S. patents and applications for patent. The citations included in the bibliography were originally published in NASA's Scientific and Technical Aerospace Reports (STAR) and cover STAR announcements made since May 1969. The citations published in this issue cover the period June 1998 through December 1998. This issue includes 10 major subject divisions separated into 76 specific categories and one general category/division. Each entry consists of a STAR citation accompanied by an abstract and, when appropriate, a key illustration taken from the patent or application for patent. Entries are arranged by subject category in ascending order.

  6. A Comparison of Abstract Writing Style between English and Chinese

    ERIC Educational Resources Information Center

    Zhou, Xiaoying; Liao, Hangjie

    2018-01-01

    In this paper the authors conducted a comprehensive study on English abstract writing style. Abstraction is the process of forming a theoretical concept based on the observation and classification of objects. This concept has no definite denotation. However, in a specific situation it can be clearly understood. In English, writing an abstract…

  7. Predicting Semantic Changes in Abstraction in Tutor Responses to Students

    ERIC Educational Resources Information Center

    Lipschultz, Michael; Litman, Diane; Katz, Sandra; Albacete, Patricia; Jordan, Pamela

    2014-01-01

    Post-problem reflective tutorial dialogues between human tutors and students are examined to predict when the tutor changed the level of abstraction from the student's preceding turn (i.e., used more general terms or more specific terms); such changes correlate with learning. Prior work examined lexical changes in abstraction. In this work, we…

  8. Divergent evolutionary rates in vertebrate and mammalian specific conserved non-coding elements (CNEs) in echolocating mammals.

    PubMed

    Davies, Kalina T J; Tsagkogeorga, Georgia; Rossiter, Stephen J

    2014-12-19

    The majority of DNA contained within vertebrate genomes is non-coding, with a certain proportion of this thought to play regulatory roles during development. Conserved Non-coding Elements (CNEs) are an abundant group of putative regulatory sequences that are highly conserved across divergent groups and thus assumed to be under strong selective constraint. Many CNEs may contain regulatory factor binding sites, and their frequent spatial association with key developmental genes - such as those regulating sensory system development - suggests crucial roles in regulating gene expression and cellular patterning. Yet surprisingly little is known about the molecular evolution of CNEs across diverse mammalian taxa or their role in specific phenotypic adaptations. We examined 3,110 vertebrate-specific and ~82,000 mammalian-specific CNEs across 19 and 9 mammalian orders respectively, and tested for changes in the rate of evolution of CNEs located in the proximity of genes underlying the development or functioning of auditory systems. As we focused on CNEs putatively associated with genes underlying the development/functioning of auditory systems, we incorporated echolocating taxa in our dataset because of their highly specialised and derived auditory systems. Phylogenetic reconstructions of concatenated CNEs broadly recovered accepted mammal relationships despite high levels of sequence conservation. We found that CNE substitution rates were highest in rodents and lowest in primates, consistent with previous findings. Comparisons of CNE substitution rates from several genomic regions containing genes linked to auditory system development and hearing revealed differences between echolocating and non-echolocating taxa. Wider taxonomic sampling of four CNEs associated with the homeobox genes Hmx2 and Hmx3 - which are required for inner ear development - revealed family-wise variation across diverse bat species. Specifically within one family of echolocating bats that utilise

  9. Suicide reporting content analysis: abstract development and reliability.

    PubMed

    Gould, Madelyn S; Midle, Jennifer Bassett; Insel, Beverly; Kleinman, Marjorie

    2007-01-01

    Despite substantial research on media influences and the development of media guidelines on suicide reporting, research on the specifics of media stories that facilitate suicide contagion has been limited. The goal of the present study was to develop a content analytic strategy to code features in media suicide reports presumed to be influential in suicide contagion and determine the interrater reliability of the qualitative characteristics abstracted from newspaper stories. A random subset of 151 articles from a database of 1851 newspaper suicide stories published during 1988 through 1996, which were collected as part of a national study in the United States to identify factors associated with the initiation of youth suicide clusters, were evaluated. Using a well-defined content-analysis procedure, the agreement between raters in scoring key concepts of suicide reports from the headline, the pictorial presentation, and the text were evaluated. The results show that while the majority of variables in the content analysis were very reliable, assessed using the kappa statistic, and obtained excellent percentages of agreement, the reliability of complicated constructs, such as sensationalizing, glorifying, or romanticizing the suicide, was comparatively low. The data emphasize that before effective guidelines and responsible suicide reporting can ensue, further explication of suicide story constructs is necessary to ensure the implementation and compliance of responsible reporting on behalf of the media.
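
    The reliability measure reported above, the kappa statistic, corrects raw interrater agreement for agreement expected by chance. A minimal sketch of unweighted Cohen's kappa for two raters (the study's exact computation may differ, and the category labels below are invented):

```python
def cohens_kappa(rater1, rater2):
    """Unweighted Cohen's kappa for two raters coding the same items.
    1.0 = perfect agreement; 0.0 = agreement no better than chance."""
    n = len(rater1)
    p_observed = sum(a == b for a, b in zip(rater1, rater2)) / n
    p_chance = sum((rater1.count(c) / n) * (rater2.count(c) / n)
                   for c in set(rater1) | set(rater2))
    return (p_observed - p_chance) / (1 - p_chance)

cohens_kappa(['glorifying', 'neutral', 'neutral'],
             ['glorifying', 'neutral', 'glorifying'])
```

(The formula is undefined when chance agreement is exactly 1, i.e. when both raters always use a single identical category.)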

  10. Beyond Molecular Codes: Simple Rules to Wire Complex Brains

    PubMed Central

    Hassan, Bassem A.; Hiesinger, P. Robin

    2015-01-01

    Summary Molecular codes, like postal zip codes, are generally considered a robust way to ensure the specificity of neuronal target selection. However, a code capable of unambiguously generating complex neural circuits is difficult to conceive. Here, we re-examine the notion of molecular codes in the light of developmental algorithms. We explore how molecules and mechanisms that have been considered part of a code may alternatively implement simple pattern formation rules sufficient to ensure wiring specificity in neural circuits. This analysis delineates a pattern-based framework for circuit construction that may contribute to our understanding of brain wiring. PMID:26451480

  11. Abstract semantics in the motor system? - An event-related fMRI study on passive reading of semantic word categories carrying abstract emotional and mental meaning.

    PubMed

    Dreyer, Felix R; Pulvermüller, Friedemann

    2018-03-01

    Previous research showed that modality-preferential sensorimotor areas are relevant for processing concrete words used to speak about actions. However, whether modality-preferential areas also play a role for abstract words is still under debate. Whereas recent functional magnetic resonance imaging (fMRI) studies suggest an involvement of motor cortex in processing the meaning of abstract emotion words as, for example, 'love', other non-emotional abstract words, in particular 'mental words', such as 'thought' or 'logic', are believed to engage 'amodal' semantic systems only. In the present event-related fMRI experiment, subjects passively read abstract emotional and mental nouns along with concrete action related words. Contrary to expectation, the results indicate a specific involvement of face motor areas in the processing of mental nouns, resembling that seen for face related action words. This result was confirmed when subject-specific regions of interest (ROIs) defined by motor localizers were used. We conclude that a role of motor systems in semantic processing is not restricted to concrete words but extends to at least some abstract mental symbols previously thought to be entirely 'disembodied' and divorced from semantically related sensorimotor processing. Implications for neurocognitive theories of semantics and clinical applications will be highlighted, paying specific attention to the role of brain activations as indexes of cognitive processes and their relationships to 'causal' studies addressing lesion and transcranial magnetic stimulation (TMS) effects. Possible implications for clinical practice, in particular speech language therapy, are discussed in closing. Copyright © 2017. Published by Elsevier Ltd.

  12. Visually defining and querying consistent multi-granular clinical temporal abstractions.

    PubMed

    Combi, Carlo; Oliboni, Barbara

    2012-02-01

    The main goal of this work is to propose a framework for the visual specification and query of consistent multi-granular clinical temporal abstractions. We focus on the issue of querying patient clinical information by visually defining and composing temporal abstractions, i.e., high-level patterns derived from several time-stamped raw data. In particular, we focus on the visual specification of consistent temporal abstractions with different granularities and on the visual composition of different temporal abstractions for querying clinical databases. Temporal abstractions on clinical data provide a concise and high-level description of temporal raw data, and a suitable way to support decision making. Granularities define partitions on the time line and allow one to represent time and, thus, temporal clinical information at different levels of detail, according to the requirements coming from the represented clinical domain. The visual representation of temporal information has been considered for several years in clinical domains. Proposed visualization techniques must be easy and quick to understand, and could benefit from visual metaphors that do not lead to ambiguous interpretations. Recently, physical metaphors such as strips, springs, weights, and wires have been proposed and evaluated on clinical users for the specification of temporal clinical abstractions. Visual approaches to boolean queries have been considered in recent years and confirmed that visual support for the specification of complex boolean queries is both an important and difficult research topic. We propose and describe a visual language for the definition of temporal abstractions based on a set of intuitive metaphors (striped wall, plastered wall, brick wall), allowing the clinician to use different granularities. 
A new algorithm, underlying the visual language, allows the physician to specify only consistent abstractions, i.e., abstractions not containing contradictory conditions on

  13. Legacy Code Modernization

    NASA Technical Reports Server (NTRS)

    Hribar, Michelle R.; Frumkin, Michael; Jin, Haoqiang; Waheed, Abdul; Yan, Jerry; Saini, Subhash (Technical Monitor)

    1998-01-01

    Over the past decade, high performance computing has evolved rapidly; systems based on commodity microprocessors have been introduced in quick succession from at least seven vendors/families. Porting codes to every new architecture is a difficult problem; in particular, here at NASA, there are many large CFD applications that are very costly to port to new machines by hand. The LCM ("Legacy Code Modernization") Project is the development of an integrated parallelization environment (IPE) which performs the automated mapping of legacy CFD (Fortran) applications to state-of-the-art high performance computers. While most projects to port codes focus on the parallelization of the code, we consider porting to be an iterative process consisting of several steps: 1) code cleanup, 2) serial optimization,3) parallelization, 4) performance monitoring and visualization, 5) intelligent tools for automated tuning using performance prediction and 6) machine specific optimization. The approach for building this parallelization environment is to build the components for each of the steps simultaneously and then integrate them together. The demonstration will exhibit our latest research in building this environment: 1. Parallelizing tools and compiler evaluation. 2. Code cleanup and serial optimization using automated scripts 3. Development of a code generator for performance prediction 4. Automated partitioning 5. Automated insertion of directives. These demonstrations will exhibit the effectiveness of an automated approach for all the steps involved with porting and tuning a legacy code application for a new architecture.

  14. Grounded understanding of abstract concepts: The case of STEM learning.

    PubMed

    Hayes, Justin C; Kraemer, David J M

    2017-01-01

    Characterizing the neural implementation of abstract conceptual representations has long been a contentious topic in cognitive science. At the heart of the debate is whether the "sensorimotor" machinery of the brain plays a central role in representing concepts, or whether the involvement of these perceptual and motor regions is merely peripheral or epiphenomenal. The domain of science, technology, engineering, and mathematics (STEM) learning provides an important proving ground for sensorimotor (or grounded) theories of cognition, as concepts in science and engineering courses are often taught through laboratory-based and other hands-on methodologies. In this review of the literature, we examine evidence suggesting that sensorimotor processes strengthen learning associated with the abstract concepts central to STEM pedagogy. After considering how contemporary theories have defined abstraction in the context of semantic knowledge, we propose our own explanation for how body-centered information, as computed in sensorimotor brain regions and visuomotor association cortex, can form a useful foundation upon which to build an understanding of abstract scientific concepts, such as mechanical force. Drawing from theories in cognitive neuroscience, we then explore models elucidating the neural mechanisms involved in grounding intangible concepts, including Hebbian learning, predictive coding, and neuronal recycling. Empirical data on STEM learning through hands-on instruction are considered in light of these neural models. We conclude the review by proposing three distinct ways in which the field of cognitive neuroscience can contribute to STEM learning by bolstering our understanding of how the brain instantiates abstract concepts in an embodied fashion.

  15. The application of coded excitation technology in medical ultrasonic Doppler imaging

    NASA Astrophysics Data System (ADS)

    Li, Weifeng; Chen, Xiaodong; Bao, Jing; Yu, Daoyin

    2008-03-01

    Medical ultrasonic Doppler imaging is one of the most important domains of modern medical imaging technology. Applying coded excitation technology to a medical ultrasonic Doppler imaging system offers the potential for higher SNR and deeper penetration than a conventional pulse-echo imaging system; it also improves image quality and enhances sensitivity to weak signals, and a properly chosen coded excitation benefits the received spectrum of the Doppler signal. This paper first reviews the application of coded excitation technology in medical ultrasonic Doppler imaging systems, showing its advantages and promise, and then introduces the principle and theory of coded excitation. Next, we compare several coded sequences (including Chirp and pseudo-Chirp signals, Barker codes, Golay complementary sequences, M-sequences, etc.). Considering mainlobe width, range sidelobe level, signal-to-noise ratio, and sensitivity of the Doppler signal, we chose Barker codes as the coded sequence. Finally, we designed the coded excitation circuit. The results in B-mode imaging and Doppler flow measurement matched our expectations, demonstrating the advantages of applying coded excitation technology in the Digital Medical Ultrasonic Doppler Endoscope Imaging System.
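    The range-sidelobe property that makes Barker codes attractive here can be checked numerically: the aperiodic autocorrelation of a Barker sequence has a peak equal to its length and sidelobes of magnitude at most 1. A minimal sketch in Python, using the length-7 Barker code for illustration (the abstract does not state which code length was chosen):

    ```python
    def autocorrelation(seq):
        """Aperiodic autocorrelation of a bipolar (+1/-1) sequence, lags 0..n-1."""
        n = len(seq)
        return [sum(seq[i] * seq[i + lag] for i in range(n - lag)) for lag in range(n)]

    barker7 = [1, 1, 1, -1, -1, 1, -1]  # length-7 Barker code
    acf = autocorrelation(barker7)
    print(acf)  # -> [7, 0, -1, 0, -1, 0, -1]: peak 7 at lag 0, all sidelobes <= 1
    ```

    The 7:1 peak-to-sidelobe ratio (improving with code length) is what keeps range sidelobes low after pulse compression.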

  16. 37 CFR 1.72 - Title and abstract.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 37 Patents, Trademarks, and Copyrights 1 2014-07-01 2014-07-01 false Title and abstract. 1.72 Section 1.72 Patents, Trademarks, and Copyrights UNITED STATES PATENT AND TRADEMARK OFFICE, DEPARTMENT OF COMMERCE GENERAL RULES OF PRACTICE IN PATENT CASES National Processing Provisions Specification § 1.72...

  17. Improving the accuracy of operation coding in surgical discharge summaries

    PubMed Central

    Martinou, Eirini; Shouls, Genevieve; Betambeau, Nadine

    2014-01-01

    Procedural coding in surgical discharge summaries is extremely important; as well as communicating to healthcare staff which procedures have been performed, it also provides information that is used by the hospital's coding department. The OPCS code (Office of Population, Censuses and Surveys Classification of Surgical Operations and Procedures) is used to generate the tariff that allows the hospital to be reimbursed for the procedure. We felt that the OPCS coding on discharge summaries was often incorrect within our breast and endocrine surgery department. A baseline measurement over two months demonstrated that 32% of operations had been incorrectly coded, resulting in an incorrect tariff being applied and an estimated loss to the Trust of £17,000. We developed a simple but specific OPCS coding table in collaboration with the clinical coding team and breast surgeons that summarised all operations performed within our department. This table was disseminated across the team, specifically to the junior doctors who most frequently complete the discharge summaries. Re-audit showed 100% of operations were accurately coded, demonstrating the effectiveness of the coding table. We suggest that specifically designed coding tables be introduced across each surgical department to ensure accurate OPCS codes are used to produce better quality surgical discharge summaries and to ensure correct reimbursement to the Trust. PMID:26734286
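    The department-level coding table described above is, in effect, a fixed lookup from agreed procedure names to OPCS codes. A minimal sketch of that idea in Python; the procedure names and OPCS-style codes below are placeholders for illustration, not the codes used in this audit:

    ```python
    # Hypothetical department coding table: procedure name -> OPCS-style code.
    OPCS_TABLE = {
        "wide local excision of breast lesion": "B28.2",  # hypothetical code
        "sentinel lymph node biopsy": "T87.4",            # hypothetical code
        "total thyroidectomy": "B08.1",                   # hypothetical code
    }

    def code_for(procedure: str) -> str:
        """Return the agreed code, or flag the entry for the coding team."""
        return OPCS_TABLE.get(procedure.lower().strip(), "REFER TO CODING TEAM")

    print(code_for("Sentinel lymph node biopsy"))  # -> T87.4
    print(code_for("appendicectomy"))              # -> REFER TO CODING TEAM
    ```

    The point of the design is that junior doctors pick from a closed, pre-agreed list rather than free-texting codes, with anything outside the list escalated rather than guessed.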

  18. The representation of abstract words: what matters? Reply to Paivio's (2013) comment on Kousta et al. (2011).

    PubMed

    Vigliocco, Gabriella; Kousta, Stavroula; Vinson, David; Andrews, Mark; Del Campo, Elena

    2013-02-01

    In Kousta, Vigliocco, Vinson, Andrews, and Del Campo (2011), we presented an embodied theory of semantic representation, which crucially included abstract concepts as internally embodied via affective states. Paivio (2013) took issue with our treatment of dual coding theory, our reliance on data from lexical decision, and our theoretical proposal. Here, we address these different issues and clarify how our findings offer a way to move forward in the investigation of how abstract concepts are represented. 2013 APA, all rights reserved

  19. Benchmarking of neutron production of heavy-ion transport codes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Remec, I.; Ronningen, R. M.; Heilbronn, L.

    Document available in abstract form only, full text of document follows: Accurate prediction of radiation fields generated by heavy ion interactions is important in medical applications, space missions, and in design and operation of rare isotope research facilities. In recent years, several well-established computer codes in widespread use for particle and radiation transport calculations have been equipped with the capability to simulate heavy ion transport and interactions. To assess and validate these capabilities, we performed simulations of a series of benchmark-quality heavy ion experiments with the computer codes FLUKA, MARS15, MCNPX, and PHITS. We focus on the comparisons of secondary neutron production. Results are encouraging; however, further improvements in models and codes and additional benchmarking are required. (authors)

  20. Integrative and distinctive coding of visual and conceptual object features in the ventral visual stream

    PubMed Central

    Douglas, Danielle; Newsome, Rachel N; Man, Louisa LY

    2018-01-01

    A significant body of research in cognitive neuroscience is aimed at understanding how object concepts are represented in the human brain. However, it remains unknown whether and where the visual and abstract conceptual features that define an object concept are integrated. We addressed this issue by comparing the neural pattern similarities among object-evoked fMRI responses with behavior-based models that independently captured the visual and conceptual similarities among these stimuli. Our results revealed evidence for distinctive coding of visual features in lateral occipital cortex, and conceptual features in the temporal pole and parahippocampal cortex. By contrast, we found evidence for integrative coding of visual and conceptual object features in perirhinal cortex. The neuroanatomical specificity of this effect was highlighted by results from a searchlight analysis. Taken together, our findings suggest that perirhinal cortex uniquely supports the representation of fully specified object concepts through the integration of their visual and conceptual features. PMID:29393853
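    The comparison described above — correlating neural pattern similarities with behavior-based visual and conceptual similarity models — is the core step of representational similarity analysis. A minimal sketch in Python, with small hypothetical dissimilarity matrices standing in for the fMRI-derived and model-derived matrices:

    ```python
    import math

    def upper_triangle(m):
        """Flatten the strict upper triangle of a square (dis)similarity matrix."""
        n = len(m)
        return [m[i][j] for i in range(n) for j in range(i + 1, n)]

    def pearson(x, y):
        """Pearson correlation between two equal-length vectors."""
        mx, my = sum(x) / len(x), sum(y) / len(y)
        cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
        sx = math.sqrt(sum((a - mx) ** 2 for a in x))
        sy = math.sqrt(sum((b - my) ** 2 for b in y))
        return cov / (sx * sy)

    # Hypothetical 4x4 dissimilarity matrices over four object stimuli:
    # one from fMRI response patterns, one from a behavior-based visual model.
    neural = [[0, 2, 4, 4], [2, 0, 3, 5], [4, 3, 0, 1], [4, 5, 1, 0]]
    visual = [[0, 1, 4, 5], [1, 0, 3, 4], [4, 3, 0, 2], [5, 4, 2, 0]]
    print(round(pearson(upper_triangle(neural), upper_triangle(visual)), 3))  # ~0.815
    ```

    Running this per region (or per searchlight sphere) for both the visual and the conceptual model, then comparing the two correlations, is what distinguishes distinctive from integrative coding in analyses of this kind.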

  1. Analysis of a Distributed Pulse Power System Using a Circuit Analysis Code

    DTIC Science & Technology

    1979-06-01

    A sophisticated computer code (SCEPTRE), used to analyze electronic circuits, was used to evaluate the performance of a large flash X-ray machine. The computed dose rate was then integrated to give a number that could be compared with measurements made using thermoluminescent dosimeters (TLDs). (Performed in part with The BDM Corporation, Albuquerque, NM 87106.)

  2. Conflicting Demands of Abstract and Specific Visual Object Processing Resolved by Fronto-Parietal Networks

    PubMed Central

    McMenamin, Brenton W.; Marsolek, Chad J.; Morseth, Brianna K.; Speer, MacKenzie F.; Burton, Philip C.; Burgund, E. Darcy

    2016-01-01

    Object categorization and exemplar identification place conflicting demands on the visual system, yet humans easily perform these fundamentally contradictory tasks. Previous studies suggest the existence of dissociable visual processing subsystems to accomplish the two abilities – an abstract category (AC) subsystem that operates effectively in the left hemisphere, and a specific exemplar (SE) subsystem that operates effectively in the right hemisphere. This multiple subsystems theory explains a range of visual abilities, but previous studies have not explored what mechanisms exist for coordinating the function of multiple subsystems and/or resolving the conflicts that would arise between them. We collected functional MRI data while participants performed two variants of a cue-probe working memory task that required AC or SE processing. During the maintenance phase of the task, the bilateral intraparietal sulcus (IPS) exhibited hemispheric asymmetries in functional connectivity consistent with exerting proactive control over the two visual subsystems: greater connectivity to the left hemisphere during the AC task, and greater connectivity to the right hemisphere during the SE task. Moreover, probe-evoked activation revealed activity in a broad fronto-parietal network (containing IPS) associated with reactive control when the two visual subsystems were in conflict, and variations in this conflict signal across trials was related to the visual similarity of the cue/probe stimulus pairs. Although many studies have confirmed the existence of multiple visual processing subsystems, this study is the first to identify the mechanisms responsible for coordinating their operations. PMID:26883940

  3. TOUGH+ v1.5 Core Code

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Moridis, George J.

    TOUGH+ v1.5 is a numerical code for the simulation of multi-phase, multi-component flow and transport of mass and heat through porous and fractured media, and represents the third update of the code since its first release [Moridis et al., 2008]. TOUGH+ is a successor to the TOUGH2 [Pruess et al., 1991; 2012] family of codes for multi-component, multiphase fluid and heat flow developed at the Lawrence Berkeley National Laboratory. It is written in standard FORTRAN 95/2003, and can be run on any computational platform (workstations, PC, Macintosh). TOUGH+ v1.5 employs dynamic memory allocation, thus minimizing storage requirements. It has a completely modular structure, follows the tenets of Object-Oriented Programming (OOP), and involves the advanced features of FORTRAN 95/2003, i.e., modules, derived data types, the use of pointers, lists and trees, data encapsulation, defined operators and assignments, operator extension and overloading, use of generic procedures, and maximum use of the powerful intrinsic vector and matrix processing operations. TOUGH+ v1.5 is the core code for its family of applications, i.e., the part of the code that is common to all its applications. It provides a description of the underlying physics and thermodynamics of non-isothermal flow, of the mathematical and numerical approaches, as well as a detailed explanation of the general (common to all applications) input requirements, options, capabilities and output specifications. The core code cannot run by itself: it needs to be coupled with the code for the specific TOUGH+ application option that describes a particular type of problem. The additional input requirements specific to a particular TOUGH+ application option and related illustrative examples can be found in the corresponding User's Manual.

  4. Fixed-point Design of the Lattice-reduction-aided Iterative Detection and Decoding Receiver for Coded MIMO Systems

    DTIC Science & Technology

    2011-01-01

    ...reliability, e.g., Turbo Codes [2] and Low Density Parity Check (LDPC) codes [3]. The challenge to apply both MIMO and ECC into wireless systems is on... The report illustrates the performance of coded LR-aided detectors.

  5. Development, dissemination, and applications of a new terminological resource, the Q-Code taxonomy for professional aspects of general practice/family medicine.

    PubMed

    Jamoulle, Marc; Resnick, Melissa; Grosjean, Julien; Ittoo, Ashwin; Cardillo, Elena; Vander Stichele, Robert; Darmoni, Stefan; Vanmeerbeek, Marc

    2018-12-01

    While documentation of clinical aspects of General Practice/Family Medicine (GP/FM) is assured by the International Classification of Primary Care (ICPC), there is no taxonomy for the professional aspects (context and management) of GP/FM. To present the development, dissemination, applications, and resulting face validity of the Q-Codes taxonomy specifically designed to describe contextual features of GP/FM, proposed as an extension to the ICPC. The Q-Codes taxonomy was developed from Lamberts' seminal idea for indexing contextual content (1987) by a multi-disciplinary team of knowledge engineers, linguists and general practitioners, through a qualitative and iterative analysis of 1702 abstracts from six GP/FM conferences using Atlas.ti software. A total of 182 concepts, called Q-Codes, representing professional aspects of GP/FM were identified and organized in a taxonomy. Dissemination: The taxonomy is published as an online terminological resource, using semantic web techniques and web ontology language (OWL) ( http://www.hetop.eu/Q ). Each Q-Code is identified with a unique resource identifier (URI), and provided with preferred terms and scope notes in ten languages (Portuguese, Spanish, English, French, Dutch, Korean, Vietnamese, Turkish, Georgian, German) and search filters for MEDLINE and web searches. This taxonomy has already been used to support queries in bibliographic databases (e.g., MEDLINE), to facilitate indexing of grey literature in GP/FM such as congress abstracts, master's theses, and websites, and as an educational tool in vocational teaching. Conclusions: The rapidly growing list of practical applications provides face validity for the usefulness of this freely available new terminological resource.

  6. Certifying Auto-Generated Flight Code

    NASA Technical Reports Server (NTRS)

    Denney, Ewen

    2008-01-01

    Model-based design and automated code generation are being used increasingly at NASA. Many NASA projects now use MathWorks Simulink and Real-Time Workshop for at least some of their modeling and code development. However, there are substantial obstacles to more widespread adoption of code generators in safety-critical domains. Since code generators are typically not qualified, there is no guarantee that their output is correct, and consequently the generated code still needs to be fully tested and certified. Moreover, the regeneration of code can require complete recertification, which offsets many of the advantages of using a generator. Indeed, manual review of autocode can be more challenging than for hand-written code. Since the direct V&V of code generators is too laborious and complicated due to their complex (and often proprietary) nature, we have developed a generator plug-in to support the certification of the auto-generated code. Specifically, the AutoCert tool supports certification by formally verifying that the generated code is free of different safety violations, by constructing an independently verifiable certificate, and by explaining its analysis in a textual form suitable for code reviews. The generated documentation also contains substantial tracing information, allowing users to trace between model, code, documentation, and V&V artifacts. This enables missions to obtain assurance about the safety and reliability of the code without excessive manual V&V effort and, as a consequence, eases the acceptance of code generators in safety-critical contexts. The generation of explicit certificates and textual reports is particularly well-suited to supporting independent V&V. The primary contribution of this approach is the combination of human-friendly documentation with formal analysis. The key technical idea is to exploit the idiomatic nature of auto-generated code in order to automatically infer logical annotations. 
The annotation inference algorithm

  7. Fine-grained semantic categorization across the abstract and concrete domains.

    PubMed

    Ghio, Marta; Vaghi, Matilde Maria Serena; Tettamanti, Marco

    2013-01-01

    A consolidated approach to the study of the mental representation of word meanings has consisted in contrasting different domains of knowledge, broadly reflecting the abstract-concrete dichotomy. More fine-grained semantic distinctions have emerged in neuropsychological and cognitive neuroscience work, reflecting semantic category specificity, but almost exclusively within the concrete domain. Theoretical advances, particularly within the area of embodied cognition, have more recently put forward the idea that distributed neural representations tied to the kinds of experience maintained with the concepts' referents might distinguish conceptual meanings with a high degree of specificity, including those within the abstract domain. Here we report the results of two psycholinguistic rating studies incorporating such theoretical advances with two main objectives: first, to provide empirical evidence of fine-grained distinctions within both the abstract and the concrete semantic domains with respect to relevant psycholinguistic dimensions; second, to develop a carefully controlled linguistic stimulus set that may be used for auditory as well as visual neuroimaging studies focusing on the parametrization of the semantic space beyond the abstract-concrete dichotomy. Ninety-six participants rated a set of 210 sentences across pre-selected concrete (mouth, hand, or leg action-related) and abstract (mental state-, emotion-, mathematics-related) categories, with respect either to different semantic domain-related scales (rating study 1), or to concreteness, familiarity, and context availability (rating study 2). Inferential statistics and correspondence analyses highlighted distinguishing semantic and psycholinguistic traits for each of the pre-selected categories, indicating that a simple abstract-concrete dichotomy is not sufficient to account for the entire semantic variability within either domain.

  8. Fine-Grained Semantic Categorization across the Abstract and Concrete Domains

    PubMed Central

    Tettamanti, Marco

    2013-01-01

    A consolidated approach to the study of the mental representation of word meanings has consisted in contrasting different domains of knowledge, broadly reflecting the abstract-concrete dichotomy. More fine-grained semantic distinctions have emerged in neuropsychological and cognitive neuroscience work, reflecting semantic category specificity, but almost exclusively within the concrete domain. Theoretical advances, particularly within the area of embodied cognition, have more recently put forward the idea that distributed neural representations tied to the kinds of experience maintained with the concepts' referents might distinguish conceptual meanings with a high degree of specificity, including those within the abstract domain. Here we report the results of two psycholinguistic rating studies incorporating such theoretical advances with two main objectives: first, to provide empirical evidence of fine-grained distinctions within both the abstract and the concrete semantic domains with respect to relevant psycholinguistic dimensions; second, to develop a carefully controlled linguistic stimulus set that may be used for auditory as well as visual neuroimaging studies focusing on the parametrization of the semantic space beyond the abstract-concrete dichotomy. Ninety-six participants rated a set of 210 sentences across pre-selected concrete (mouth, hand, or leg action-related) and abstract (mental state-, emotion-, mathematics-related) categories, with respect either to different semantic domain-related scales (rating study 1), or to concreteness, familiarity, and context availability (rating study 2). Inferential statistics and correspondence analyses highlighted distinguishing semantic and psycholinguistic traits for each of the pre-selected categories, indicating that a simple abstract-concrete dichotomy is not sufficient to account for the entire semantic variability within either domain. PMID:23825625

  9. Identification codes for organizations listed in computerized data systems of the U.S. Geological Survey

    USGS Publications Warehouse

    Blackwell, C.D.

    1988-01-01

    Codes for the unique identification of public and private organizations listed in computerized data systems are presented. These codes are used by the U.S. Geological Survey's National Water Data Exchange (NAWDEX), National Water Data Storage and Retrieval System (WATSTORE), National Cartographic Information Center (NCIC), and Office of Water Data Coordination (OWDC). The format structure of the codes is discussed and instructions are given for requesting new codes. (Author's abstract)

  10. Performance enhancement of optical code-division multiple-access systems using transposed modified Walsh code

    NASA Astrophysics Data System (ADS)

    Sikder, Somali; Ghosh, Shila

    2018-02-01

    This paper presents the construction of unipolar transposed modified Walsh code (TMWC) and analysis of its performance in optical code-division multiple-access (OCDMA) systems. Specifically, the signal-to-noise ratio, bit error rate (BER), cardinality, and spectral efficiency were investigated. The theoretical analysis demonstrated that the wavelength-hopping time-spreading system using TMWC was robust against multiple-access interference and more spectrally efficient than systems using other existing OCDMA codes. In particular, the spectral efficiency was calculated to be 1.0370 when TMWC of weight 3 was employed. The BER and eye pattern for the designed TMWC were also successfully obtained using OptiSystem simulation software. The results indicate that the proposed code design is promising for enhancing network capacity.

  11. Internationalizing professional codes in engineering.

    PubMed

    Harris, Charles E

    2004-07-01

    Professional engineering societies which are based in the United States, such as the American Society of Mechanical Engineers (ASME, now ASME International), are recognizing that their codes of ethics must apply to engineers working throughout the world. An examination of the ethical code of ASME International shows that its provisions pose many problems of application, especially in societies outside the United States. In applying the codes effectively in the international environment, two principal issues must be addressed. First, some Culture Transcending Guidelines must be identified and justified; nine such guidelines are identified. Second, some methods for applying the codes to particular situations must be identified; three such methods are specification, balancing, and finding a creative middle way.

  12. Abstraction and Consolidation

    ERIC Educational Resources Information Center

    Monaghan, John; Ozmantar, Mehmet Fatih

    2006-01-01

    The framework for this paper is a recently developed theory of abstraction in context. The paper reports on data collected from one student working on tasks concerned with absolute value functions. It examines the relationship between mathematical constructions and abstractions. It argues that an abstraction is a consolidated construction that can…

  13. Researcher Perceptions of Ethical Guidelines and Codes of Conduct

    PubMed Central

    Giorgini, Vincent; Mecca, Jensen T.; Gibson, Carter; Medeiros, Kelsey; Mumford, Michael D.; Connelly, Shane; Devenport, Lynn D.

    2014-01-01

    Ethical codes of conduct exist in almost every profession. Field-specific codes of conduct have been around for decades, each articulating specific ethical and professional guidelines. However, there has been little empirical research on researchers’ perceptions of these codes of conduct. In the present study, we interviewed faculty members in six research disciplines and identified five themes bearing on the circumstances under which they use ethical guidelines and the underlying reasons for not adhering to such guidelines. We then identify problems with the manner in which codes of conduct in academia are constructed and offer solutions for overcoming these problems. PMID:25635845

  14. Energy information data base: report number codes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    None

    1979-09-01

    Each report processed by the US DOE Technical Information Center is identified by a unique report number consisting of a code plus a sequential number. In most cases, the code identifies the originating installation. In some cases, it identifies a specific program or a type of publication. Listed in this publication are all codes that have been used by DOE in cataloging reports. This compilation consists of two parts. Part I is an alphabetical listing of report codes identified with the issuing installations that have used the codes. Part II is an alphabetical listing of installations identified with codes each has used. (RWR)

  15. Systematic review of validated case definitions for diabetes in ICD-9-coded and ICD-10-coded data in adult populations.

    PubMed

    Khokhar, Bushra; Jette, Nathalie; Metcalfe, Amy; Cunningham, Ceara Tess; Quan, Hude; Kaplan, Gilaad G; Butalia, Sonia; Rabi, Doreen

    2016-08-05

    With steady increases in 'big data' and data analytics over the past two decades, administrative health databases have become more accessible and are now used regularly for diabetes surveillance. The objective of this study is to systematically review validated International Classification of Diseases (ICD)-based case definitions for diabetes in the adult population. Electronic databases, MEDLINE and Embase, were searched for validation studies where an administrative case definition (using ICD codes) for diabetes in adults was validated against a reference and statistical measures of the performance reported. The search yielded 2895 abstracts, and of the 193 potentially relevant studies, 16 met criteria. Diabetes definition for adults varied by data source, including physician claims (sensitivity ranged from 26.9% to 97%, specificity ranged from 94.3% to 99.4%, positive predictive value (PPV) ranged from 71.4% to 96.2%, negative predictive value (NPV) ranged from 95% to 99.6% and κ ranged from 0.8 to 0.9), hospital discharge data (sensitivity ranged from 59.1% to 92.6%, specificity ranged from 95.5% to 99%, PPV ranged from 62.5% to 96%, NPV ranged from 90.8% to 99% and κ ranged from 0.6 to 0.9) and a combination of both (sensitivity ranged from 57% to 95.6%, specificity ranged from 88% to 98.5%, PPV ranged from 54% to 80%, NPV ranged from 98% to 99.6% and κ ranged from 0.7 to 0.8). Overall, administrative health databases are useful for undertaking diabetes surveillance, but an awareness of the variation in performance being affected by case definition is essential. The performance characteristics of these case definitions depend on the variations in the definition of primary diagnosis in ICD-coded discharge data and/or the methodology adopted by the healthcare facility to extract information from patient records. Published by the BMJ Publishing Group Limited. 
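The validation measures reported throughout this record (sensitivity, specificity, PPV, NPV) all derive from a two-by-two confusion matrix. A minimal sketch, with illustrative counts rather than data from the review:

```python
def validation_metrics(tp, fp, fn, tn):
    """Standard diagnostic-accuracy measures from a 2x2 confusion matrix."""
    return {
        "sensitivity": tp / (tp + fn),  # true positives among all true cases
        "specificity": tn / (tn + fp),  # true negatives among all non-cases
        "ppv": tp / (tp + fp),          # probability a positive code is a true case
        "npv": tn / (tn + fn),          # probability a negative code is a true non-case
    }

# Illustrative counts only (not figures from the review):
m = validation_metrics(tp=270, fp=30, fn=30, tn=670)
```

Note that, as the review's ranges show, sensitivity and PPV move with prevalence and case definition even when the underlying coding practice is unchanged.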

  16. Converging modalities ground abstract categories: the case of politics.

    PubMed

    Farias, Ana Rita; Garrido, Margarida V; Semin, Gün R

    2013-01-01

    Three studies are reported examining the grounding of abstract concepts across two modalities (visual and auditory) and their symbolic representation. A comparison of the outcomes across these studies reveals that the symbolic representation of political concepts and their visual and auditory modalities is convergent. In other words, the spatial relationships between specific instances of the political categories are highly overlapping across the symbolic, visual and auditory modalities. These findings suggest that abstract categories display redundancy across modal and amodal representations, and are multimodal.

  17. [French norms of imagery for pictures, for concrete and abstract words].

    PubMed

    Robin, Frédérique

    2006-09-01

    This paper deals with French norms for mental image versus picture agreement for 138 pictures and the imagery value for 138 concrete words and 69 abstract words. The pictures were selected from Snodgrass and Vanderwart's norms (1980). The concrete words correspond to the dominant naming response to the pictorial stimuli. The abstract words were taken from verbal associative norms published by Ferrand (2001). The norms were established according to two variables: 1) mental image vs. picture agreement, and 2) imagery value of words. Three other variables were controlled: 1) picture naming agreement; 2) familiarity of objects referred to in the pictures and the concrete words, and 3) subjective verbal frequency of words. The originality of this work is to provide French imagery norms for the three kinds of stimuli usually compared in research on dual coding. Moreover, these studies focus on figurative and verbal stimuli variations in visual imagery processes.

  18. 29 CFR 510.21 - SIC codes.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 29 Labor 3 2010-07-01 2010-07-01 false SIC codes. 510.21 Section 510.21 Labor Regulations Relating to Labor (Continued) WAGE AND HOUR DIVISION, DEPARTMENT OF LABOR REGULATIONS IMPLEMENTATION OF THE... Classification of Industries § 510.21 SIC codes. (a) The Conference Report specifically cites Puerto Rico's...

  19. Defining pediatric traumatic brain injury using International Classification of Diseases Version 10 Codes: a systematic review.

    PubMed

    Chan, Vincy; Thurairajah, Pravheen; Colantonio, Angela

    2015-02-04

    Although healthcare administrative data are commonly used for traumatic brain injury (TBI) research, there is currently no consensus or consistency on the International Classification of Diseases Version 10 (ICD-10) codes used to define TBI among children and youth internationally. This study systematically reviewed the literature to explore the range of ICD-10 codes that are used to define TBI in this population. The identification of the range of ICD-10 codes to define this population in administrative data is crucial, as it has implications for policy, resource allocation, planning of healthcare services, and prevention strategies. The databases MEDLINE, MEDLINE In-Process, Embase, PsychINFO, CINAHL, SPORTDiscus, and Cochrane Database of Systematic Reviews were systematically searched. Grey literature was searched using Grey Matters and Google. Reference lists of included articles were also searched for relevant studies. Two reviewers independently screened all titles and abstracts using pre-defined inclusion and exclusion criteria. A full text screen was conducted on articles that met the first screen inclusion criteria. All full text articles that met the pre-defined inclusion criteria were included for analysis in this systematic review. A total of 1,326 publications were identified through the predetermined search strategy and 32 articles/reports met all eligibility criteria for inclusion in this review. Five articles specifically examined children and youth aged 19 years or under with TBI. ICD-10 case definitions ranged from the broad injuries to the head codes (ICD-10 S00 to S09) to concussion only (S06.0). There was overwhelming consensus on the inclusion of ICD-10 code S06, intracranial injury, while codes S00 (superficial injury of the head), S03 (dislocation, sprain, and strain of joints and ligaments of head), and S05 (injury of eye and orbit) were only used by articles that examined head injury, none of which specifically examined children and
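The case definitions surveyed in this review are, in practice, range checks over ICD-10 category codes. A minimal sketch of such a check (the function name and sample codes are illustrative, not drawn from any particular study):

```python
def in_icd10_range(code, start="S00", end="S09"):
    """Check whether an ICD-10 code's 3-character category falls in a range.

    Illustrative sketch of a broad head-injury case definition (S00-S09);
    real definitions in the review vary, e.g. intracranial injury (S06) only.
    """
    category = code.strip().upper()[:3]
    return start <= category <= end

# Broad definition vs. a narrow S06-only definition (sample codes):
broad = [c for c in ["S06.0", "S02.1", "T90.5", "S09.9"] if in_icd10_range(c)]
narrow = [c for c in ["S06.0", "S02.1"] if in_icd10_range(c, "S06", "S06")]
```

As the review notes, which ranges are included (S00, S03, S05, ...) materially changes who counts as a TBI case.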

  20. Static Verification for Code Contracts

    NASA Astrophysics Data System (ADS)

    Fähndrich, Manuel

    The Code Contracts project [3] at Microsoft Research enables programmers on the .NET platform to author specifications in existing languages such as C# and VisualBasic. To take advantage of these specifications, we provide tools for documentation generation, runtime contract checking, and static contract verification.
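Runtime contract checking of the kind these tools provide can be sketched, by loose analogy, as a precondition/postcondition decorator. This is an illustrative Python analogy only, not the Code Contracts API, which is a .NET library used from C# or VisualBasic:

```python
import functools

def contract(pre=None, post=None):
    """Attach an optional precondition and postcondition to a function.

    Illustrative analogy to runtime contract checking; names are invented.
    """
    def wrap(fn):
        @functools.wraps(fn)
        def checked(*args, **kwargs):
            if pre is not None:
                assert pre(*args, **kwargs), f"precondition of {fn.__name__} violated"
            result = fn(*args, **kwargs)
            if post is not None:
                assert post(result), f"postcondition of {fn.__name__} violated"
            return result
        return checked
    return wrap

@contract(pre=lambda x: x >= 0, post=lambda r: r >= 0)
def int_sqrt(x):
    return int(x ** 0.5)
```

A static verifier, by contrast, attempts to discharge these same conditions at compile time rather than executing them.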

  1. Identification of coding and non-coding mutational hotspots in cancer genomes.

    PubMed

    Piraino, Scott W; Furney, Simon J

    2017-01-05

    The identification of mutations that play a causal role in tumour development, so-called "driver" mutations, is of critical importance for understanding how cancers form and how they might be treated. Several large cancer sequencing projects have identified genes that are recurrently mutated in cancer patients, suggesting a role in tumourigenesis. While the landscape of coding drivers has been extensively studied and many of the most prominent driver genes are well characterised, comparatively less is known about the role of mutations in the non-coding regions of the genome in cancer development. The continuing fall in genome sequencing costs has resulted in a concomitant increase in the number of cancer whole genome sequences being produced, facilitating systematic interrogation of both the coding and non-coding regions of cancer genomes. To examine the mutational landscapes of tumour genomes we have developed a novel method to identify mutational hotspots in tumour genomes using both mutational data and information on evolutionary conservation. We have applied our methodology to over 1300 whole cancer genomes and show that it identifies prominent coding and non-coding regions that are known or highly suspected to play a role in cancer. Importantly, we applied our method to the entire genome, rather than relying on predefined annotations (e.g. promoter regions) and we highlight recurrently mutated regions that may have resulted from increased exposure to mutational processes rather than selection, some of which have been identified previously as targets of selection. Finally, we implicate several pan-cancer and cancer-specific candidate non-coding regions, which could be involved in tumourigenesis. We have developed a framework to identify mutational hotspots in cancer genomes, which is applicable to the entire genome. This framework identifies known and novel coding and non-coding mutational hotspots and can be used to differentiate candidate driver regions from

  2. The left inferior frontal gyrus: A neural crossroads between abstract and concrete knowledge.

    PubMed

    Della Rosa, Pasquale Anthony; Catricalà, Eleonora; Canini, Matteo; Vigliocco, Gabriella; Cappa, Stefano F

    2018-07-15

    Evidence from both neuropsychology and neuroimaging suggests that different types of information are necessary for representing and processing concrete and abstract word meanings. Both abstract and concrete concepts, however, conjointly rely on perceptual, verbal and contextual knowledge, with abstract concepts characterized by low values of imageability (IMG) (low sensory-motor grounding) and low context availability (CA) (more difficult to contextualize). Imaging studies supporting differences between abstract and concrete concepts show a greater recruitment of the left inferior frontal gyrus (LIFG) for abstract concepts, which has been attributed either to the representation of abstract-specific semantic knowledge or to the request for more executive control than in the case of concrete concepts. We conducted an fMRI study on 27 participants, using a lexical decision task involving both abstract and concrete words, whose IMG and CA values were explicitly modelled in separate parametric analyses. The LIFG was significantly more activated for abstract than for concrete words, and a conjunction analysis showed a common activation for words with low IMG or low CA only in the LIFG, in the same area reported for abstract words. A regional template map of brain activations was then traced for words with low IMG or low CA, and BOLD regional time-series were extracted and correlated with the specific LIFG neural activity elicited for abstract words. The regions associated to low IMG, which were functionally correlated with LIFG, were mainly in the left hemisphere, while those associated with low CA were in the right hemisphere. Finally, in order to reveal which LIFG-related network increased its connectivity with decreases of IMG or CA, we conducted generalized psychophysiological interaction analyses. 
The connectivity strength values extracted from each region connected with the LIFG were correlated with specific LIFG neural activity for abstract words, and a regression

  3. Bandwidth efficient CCSDS coding standard proposals

    NASA Technical Reports Server (NTRS)

    Costello, Daniel J., Jr.; Perez, Lance C.; Wang, Fu-Quan

    1992-01-01

    The basic concatenated coding system for the space telemetry channel consists of a Reed-Solomon (RS) outer code, a symbol interleaver/deinterleaver, and a bandwidth efficient trellis inner code. A block diagram of this configuration is shown. The system may operate with or without the outer code and interleaver. In this recommendation, the outer code remains the (255,223) RS code over GF(2 exp 8) with an error correcting capability of t = 16 eight bit symbols. This code's excellent performance and the existence of fast, cost effective, decoders justify its continued use. The purpose of the interleaver/deinterleaver is to distribute burst errors out of the inner decoder over multiple codewords of the outer code. This utilizes the error correcting capability of the outer code more efficiently and reduces the probability of an RS decoder failure. Since the space telemetry channel is not considered bursty, the required interleaving depth is primarily a function of the inner decoding method. A diagram of an interleaver with depth 4 that is compatible with the (255,223) RS code is shown. Specific interleaver requirements are discussed after the inner code recommendations.
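The interleaver's role described above, spreading an inner-decoder error burst across several RS codewords, can be sketched with a generic row/column block interleaver of depth 4. This is a conceptual sketch only; the CCSDS recommendation specifies its exact interleaver separately:

```python
def interleave(symbols, depth=4):
    """Block interleaver: write row-by-row, read column-by-column.

    A burst of `depth` consecutive channel errors then lands in `depth`
    different codewords, one symbol each. Generic sketch, not the exact
    CCSDS interleaver.
    """
    assert len(symbols) % depth == 0
    width = len(symbols) // depth
    rows = [symbols[i * width:(i + 1) * width] for i in range(depth)]
    return [rows[r][c] for c in range(width) for r in range(depth)]

def deinterleave(symbols, depth=4):
    """Inverse permutation: recover the original row-by-row order."""
    width = len(symbols) // depth
    cols = [symbols[i * depth:(i + 1) * depth] for i in range(width)]
    return [cols[c][r] for r in range(depth) for c in range(width)]
```

With depth 4 and the (255,223) RS code, a burst of up to 4t = 64 symbol errors out of the inner decoder contributes at most t = 16 errors to any one RS codeword, which is exactly its correction capability.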

  4. Editors' Introduction: Abstract Concepts: Structure, Processing, and Modeling.

    PubMed

    Bolognesi, Marianna; Steen, Gerard

    2018-06-22

    Our ability to deal with abstract concepts is one of the most intriguing faculties of human cognition. Still, we know little about how such concepts are formed, processed, and represented in mind. For example, because abstract concepts do not designate referents that can be experienced through our body, the role of perceptual experiences in shaping their content remains controversial. Current theories suggest a variety of alternative explanations to the question of "how abstract concepts are represented in the human mind." These views pinpoint specific streams of semantic information that would play a prominent role in shaping the content of abstract concepts, such as situation-based information (e.g., Barsalou & Wiemer-Hastings, ), affective information (Kousta, Vigliocco, Vinson, Andrews, & Del Campo, ), and linguistic information (Louwerse, ). Rarely, these theoretical views are directly compared. In this special issue, current views are presented in their most recent and advanced form, and directly compared and discussed in a debate, which is reported at the end of each article. As a result, new exciting questions and challenges arise. These questions and challenges, reported in this introductory article, can arguably pave the way to new empirical studies and theoretical developments on the nature of abstract concepts. © 2018 Cognitive Science Society, Inc.

  5. Abstracts, Third Space Processing Symposium, Skylab results

    NASA Technical Reports Server (NTRS)

    1974-01-01

    Skylab experiments results are reported in abstracts of papers presented at the Third Space Processing Symposium. Specific areas of interest include: exothermic brazing, metals melting, crystals, reinforced composites, glasses, eutectics; physics of the low-g processes; electrophoresis, heat flow, and convection demonstrations flown on Apollo missions; and apparatus for containerless processing, heating, cooling, and containing materials.

  6. Some design constraints required for the assembly of software components: The incorporation of atomic abstract types into generically structured abstract types

    NASA Technical Reports Server (NTRS)

    Johnson, Charles S.

    1986-01-01

    It is nearly axiomatic, that to take the greatest advantage of the useful features available in a development system, and to avoid the negative interactions of those features, requires the exercise of a design methodology which constrains their use. A major design support feature of the Ada language is abstraction: for data, functions processes, resources, and system elements in general. Atomic abstract types can be created in packages defining those private types and all of the overloaded operators, functions, and hidden data required for their use in an application. Generically structured abstract types can be created in generic packages defining those structured private types, as buildups from the user-defined data types which are input as parameters. A study is made of the design constraints required for software incorporating either atomic or generically structured abstract types, if the integration of software components based on them is to be subsequently performed. The impact of these techniques on the reusability of software and the creation of project-specific software support environments is also discussed.
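The paper's distinction, an atomic private type versus a generic package parameterized by a user-supplied type, maps loosely onto parametric generics in other languages. A rough Python sketch of the "generically structured" case (names are illustrative, and Python can only mark the representation private by convention, not enforce Ada-style hiding):

```python
from typing import Generic, List, TypeVar

T = TypeVar("T")

class BoundedStack(Generic[T]):
    """A generically structured abstract type: the element type T is a
    parameter, and the representation is hidden behind the operations."""

    def __init__(self, capacity: int) -> None:
        self._items: List[T] = []   # leading underscore: private by convention
        self._capacity = capacity

    def push(self, item: T) -> None:
        if len(self._items) >= self._capacity:
            raise OverflowError("stack is full")
        self._items.append(item)

    def pop(self) -> T:
        return self._items.pop()
```

The design constraint the paper studies then becomes: can an atomic abstract type (with its own overloaded operators) be passed as the parameter T without the two packages' hidden assumptions interacting badly?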

  8. Abstraction networks for terminologies: Supporting management of "big knowledge".

    PubMed

    Halper, Michael; Gu, Huanying; Perl, Yehoshua; Ochs, Christopher

    2015-05-01

    Terminologies and terminological systems have assumed important roles in many medical information processing environments, giving rise to the "big knowledge" challenge when terminological content comprises tens of thousands to millions of concepts arranged in a tangled web of relationships. Use and maintenance of knowledge structures on that scale can be daunting. The notion of abstraction network is presented as a means of facilitating the usability, comprehensibility, visualization, and quality assurance of terminologies. An abstraction network overlays a terminology's underlying network structure at a higher level of abstraction. In particular, it provides a more compact view of the terminology's content, avoiding the display of minutiae. General abstraction network characteristics are discussed. Moreover, the notion of meta-abstraction network, existing at an even higher level of abstraction than a typical abstraction network, is described for cases where even the abstraction network itself represents a case of "big knowledge." Various features in the design of abstraction networks are demonstrated in a methodological survey of some existing abstraction networks previously developed and deployed for a variety of terminologies. The applicability of the general abstraction-network framework is shown through use-cases of various terminologies, including the Systematized Nomenclature of Medicine - Clinical Terms (SNOMED CT), the Medical Entities Dictionary (MED), and the Unified Medical Language System (UMLS). Important characteristics of the surveyed abstraction networks are provided, e.g., the magnitude of the respective size reduction referred to as the abstraction ratio. Specific benefits of these alternative terminology-network views, particularly their use in terminology quality assurance, are discussed. Examples of meta-abstraction networks are presented. The "big knowledge" challenge constitutes the use and maintenance of terminological structures that

  9. A Counterexample Guided Abstraction Refinement Framework for Verifying Concurrent C Programs

    DTIC Science & Technology

    2005-05-24

    source code are routinely executed. The source code is written in languages ranging from C/C++/Java to ML/OCaml. These languages differ not only in ... from the difficulty to model computer programs, due to the complexity of programming languages as compared to hardware description languages, to ... intermediate specification language lying between high-level Statechart-like formalisms and transition systems. Actions are encoded as changes in

  10. Transversal Clifford gates on folded surface codes

    DOE PAGES

    Moussa, Jonathan E.

    2016-10-12

    Surface and color codes are two forms of topological quantum error correction in two spatial dimensions with complementary properties. Surface codes have lower-depth error detection circuits and well-developed decoders to interpret and correct errors, while color codes have transversal Clifford gates and better code efficiency in the number of physical qubits needed to achieve a given code distance. A formal equivalence exists between color codes and folded surface codes, but it does not guarantee the transferability of any of these favorable properties. However, the equivalence does imply the existence of constant-depth circuit implementations of logical Clifford gates on folded surface codes. We achieve and improve this result by constructing two families of folded surface codes with transversal Clifford gates. This construction is presented generally for qudits of any dimension. Lastly, the specific application of these codes to universal quantum computation based on qubit fusion is also discussed.

  11. ICD-10 codes used to identify adverse drug events in administrative data: a systematic review.

    PubMed

    Hohl, Corinne M; Karpov, Andrei; Reddekopp, Lisa; Doyle-Waters, Mimi; Stausberg, Jürgen

    2014-01-01

    Adverse drug events, the unintended and harmful effects of medications, are important outcome measures in health services research. Yet no universally accepted set of International Classification of Diseases (ICD) revision 10 codes or coding algorithms exists to ensure their consistent identification in administrative data. Our objective was to synthesize a comprehensive set of ICD-10 codes used to identify adverse drug events. We developed a systematic search strategy and applied it to five electronic reference databases. We searched relevant medical journals, conference proceedings, electronic grey literature and bibliographies of relevant studies, and contacted content experts for unpublished studies. One author reviewed the titles and abstracts for inclusion and exclusion criteria. Two authors reviewed eligible full-text articles and abstracted data in duplicate. Data were synthesized in a qualitative manner. Of 4241 titles identified, 41 were included. We found a total of 827 ICD-10 codes that have been used in the medical literature to identify adverse drug events. The median number of codes used to search for adverse drug events was 190 (IQR 156-289) with a large degree of variability between studies in the numbers and types of codes used. Authors commonly used external injury (Y40.0-59.9) and disease manifestation codes. Only two papers reported on the sensitivity of their code set. Substantial variability exists in the methods used to identify adverse drug events in administrative data. Our work may serve as a point of reference for future research and consensus building in this area.
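The external-cause block the authors mention (Y40.0-Y59.9) is a numeric range within one ICD-10 chapter letter, so membership can be tested directly. An illustrative sketch of one component of such a code set (the function name is invented, and the review found 827 distinct codes in use, far beyond this block alone):

```python
def is_external_drug_cause(code):
    """True if an ICD-10 code falls in the external-cause block Y40.0-Y59.9
    (adverse effects of drugs in therapeutic use). Illustrative sketch of
    one component of an adverse-drug-event code set, not a full definition.
    """
    code = code.strip().upper()
    if not code.startswith("Y"):
        return False
    try:
        value = float(code[1:])   # e.g. "Y47.1" -> 47.1
    except ValueError:
        return False
    return 40.0 <= value <= 59.9
```

A full case definition would union several such range checks with the disease-manifestation codes the studies also used, which is precisely where the between-study variability arises.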

  13. A proto-code of ethics and conduct for European nurse directors.

    PubMed

    Stievano, Alessandro; De Marinis, Maria Grazia; Kelly, Denise; Filkins, Jacqueline; Meyenburg-Altwarg, Iris; Petrangeli, Mauro; Tschudin, Verena

    2012-03-01

    The proto-code of ethics and conduct for European nurse directors was developed as a strategic and dynamic document for nurse managers in Europe. It invites critical dialogue, reflective thinking about different situations, and the development of specific codes of ethics and conduct by nursing associations in different countries. The term proto-code is used for this document so that specifically country-orientated or organization-based and practical codes can be developed from it to guide professionals in more particular or situation-explicit reflection and values. The proto-code of ethics and conduct for European nurse directors was designed and developed by the European Nurse Directors Association's (ENDA) advisory team. This article gives short explanations of the code's preamble and two main parts: Nurse directors' ethical basis, and Principles of professional practice, which is divided into six specific points: competence, care, safety, staff, life-long learning and multi-sectorial working.

  14. Generating code adapted for interlinking legacy scalar code and extended vector code

    DOEpatents

    Gschwind, Michael K

    2013-06-04

    Mechanisms for intermixing code are provided. Source code is received for compilation using an extended Application Binary Interface (ABI) that extends a legacy ABI and uses a different register configuration than the legacy ABI. First compiled code is generated based on the source code, the first compiled code comprising code for accommodating the difference in register configurations used by the extended ABI and the legacy ABI. The first compiled code and second compiled code are intermixed to generate intermixed code, the second compiled code being compiled code that uses the legacy ABI. The intermixed code comprises at least one call instruction that is one of a call from the first compiled code to the second compiled code or a call from the second compiled code to the first compiled code. The code for accommodating the difference in register configurations is associated with the at least one call instruction.

  15. The Representation of Abstract Words: What Matters? Reply to Paivio's (2013) Comment on Kousta et al. (2011)

    ERIC Educational Resources Information Center

    Vigliocco, Gabriella; Kousta, Stavroula; Vinson, David; Andrews, Mark; Del Campo, Elena

    2013-01-01

    In Kousta, Vigliocco, Vinson, Andrews, and Del Campo (2011), we presented an embodied theory of semantic representation, which crucially included abstract concepts as internally embodied via affective states. Paivio (2013) took issue with our treatment of dual coding theory, our reliance on data from lexical decision, and our theoretical proposal.…

  16. Rooted tRNAomes and evolution of the genetic code

    PubMed Central

    Pak, Daewoo; Du, Nan; Kim, Yunsoo; Sun, Yanni

    2018-01-01

    ABSTRACT We advocate for a tRNA- rather than an mRNA-centric model for evolution of the genetic code. The mechanism for evolution of cloverleaf tRNA provides a root sequence for radiation of tRNAs and suggests a simplified understanding of code evolution. To analyze code sectoring, rooted tRNAomes were compared for several archaeal and one bacterial species. Rooting of tRNAome trees reveals conserved structures, indicating how the code was shaped during evolution and suggesting a model for evolution of a LUCA tRNAome tree. We propose the polyglycine hypothesis that the initial product of the genetic code may have been short chain polyglycine to stabilize protocells. In order to describe how anticodons were allotted in evolution, the sectoring-degeneracy hypothesis is proposed. Based on sectoring, a simple stepwise model is developed, in which the code sectors from a 1→4→8→∼16 letter code. At initial stages of code evolution, we posit strong positive selection for wobble base ambiguity, supporting convergence to 4-codon sectors and ∼16 letters. In a later stage, ∼5–6 letters, including stops, were added through innovating at the anticodon wobble position. In archaea and bacteria, tRNA wobble adenine is negatively selected, shrinking the maximum size of the primordial genetic code to 48 anticodons. Because 64 codons are recognized in mRNA, tRNA-mRNA coevolution requires tRNA wobble position ambiguity leading to degeneracy of the code. PMID:29372672

  17. Tutorial on Reed-Solomon error correction coding

    NASA Technical Reports Server (NTRS)

    Geisel, William A.

    1990-01-01

    This tutorial attempts to provide a frank, step-by-step approach to Reed-Solomon (RS) error correction coding. RS encoding and RS decoding both with and without erasing code symbols are emphasized. There is no need to present rigorous proofs and extreme mathematical detail. Rather, the simple concepts of groups and fields, specifically Galois fields, are presented with a minimum of complexity. Before RS codes are presented, other block codes are presented as a technical introduction into coding. A primitive (15, 9) RS coding example is then completely developed from start to finish, demonstrating the encoding and decoding calculations and a derivation of the famous error-locator polynomial. The objective is to present practical information about Reed-Solomon coding in a manner such that it can be easily understood.
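The Galois-field arithmetic such a tutorial builds on can be illustrated with multiplication in GF(2^4): a carry-less (XOR-based) multiply reduced modulo a primitive polynomial. Here we use x^4 + x + 1, a standard choice for the GF(16) field underlying a (15, 9) RS code; whether the tutorial uses this exact polynomial is an assumption:

```python
def gf16_mul(a, b, prim=0b10011):
    """Multiply two elements of GF(2^4), represented as 4-bit integers.

    Bits are polynomial coefficients over GF(2); addition is XOR, and
    products are reduced modulo the primitive polynomial x^4 + x + 1
    (0b10011). Shift-and-add with reduction on overflow.
    """
    result = 0
    for _ in range(4):
        if b & 1:
            result ^= a          # add (XOR) the current shift of a
        b >>= 1
        a <<= 1
        if a & 0b10000:          # degree reached 4: reduce by x^4 + x + 1
            a ^= prim
    return result
```

For example, gf16_mul(0b0010, 0b1001) returns 1: x and x^3 + 1 are multiplicative inverses in this field, the kind of fact RS decoding relies on when solving for the error-locator polynomial.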

  18. Memory for pictures and words as a function of level of processing: Depth or dual coding?

    PubMed

    D'Agostino, P R; O'Neill, B J; Paivio, A

    1977-03-01

The experiment was designed to test differential predictions derived from dual-coding and depth-of-processing hypotheses. Subjects under incidental memory instructions free recalled a list of 36 test events, each presented twice. Within the list, an equal number of events were assigned to structural, phonemic, and semantic processing conditions. Separate groups of subjects were tested with a list of pictures, concrete words, or abstract words. Results indicated that retention of concrete words increased as a direct function of the processing-task variable (structural < phonemic < semantic), whereas for abstract words and pictures, phonemic and semantic processing produced equivalent memory performance. These data provided strong support for the dual-coding model.

  19. Teaching Abstract Concepts: Keys to the World of Ideas.

    ERIC Educational Resources Information Center

    Flatley, Joannis K.; Gittinger, Dennis J.

    1990-01-01

    Specific teaching strategies to help hearing-impaired secondary students comprehend abstract concepts include (1) pinpointing facts and fallacies, (2) organizing information visually, (3) categorizing ideas, and (4) reinforcing new vocabulary and concepts. Figures provide examples of strategy applications. (DB)

  20. A parallel and modular deformable cell Car-Parrinello code

    NASA Astrophysics Data System (ADS)

    Cavazzoni, Carlo; Chiarotti, Guido L.

    1999-12-01

We have developed a modular parallel code implementing the Car-Parrinello [Phys. Rev. Lett. 55 (1985) 2471] algorithm including the variable cell dynamics [Europhys. Lett. 36 (1994) 345; J. Phys. Chem. Solids 56 (1995) 510]. Our code is written in Fortran 90 and makes use of programming concepts such as encapsulation, data abstraction, and data hiding. The code has a multi-layer hierarchical structure with tree-like dependencies among modules. The modules include not only the variables but also the methods acting on them, in an object-oriented fashion. The modular structure allows easier code maintenance, development, and debugging, and is suitable for a developer team. The layer structure permits high portability. The code displays an almost linear speed-up over a wide range of processor counts, independently of the architecture. Super-linear speed-up is obtained with a "smart" Fast Fourier Transform (FFT) that uses the available memory on the single node (increasing, for a fixed problem, with the number of processing elements) as a temporary buffer to store wave function transforms. This code has been used to simulate water and ammonia at giant planet conditions for systems as large as 64 molecules for ˜50 ps.

  1. Efficient abstract data type components for distributed and parallel systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bastani, F.; Hilal, W.; Iyengar, S.S.

    1987-10-01

One way of improving a software system's comprehensibility and maintainability is to decompose it into several components, each of which encapsulates some information concerning the system. These components can be classified into four categories, namely, abstract data type, functional, interface, and control components. Such a classification underscores the need for different specification, implementation, and performance-improvement methods for different types of components. This article focuses on the development of high-performance abstract data type components for distributed and parallel environments.

  2. Coding conventions and principles for a National Land-Change Modeling Framework

    USGS Publications Warehouse

    Donato, David I.

    2017-07-14

    This report establishes specific rules for writing computer source code for use with the National Land-Change Modeling Framework (NLCMF). These specific rules consist of conventions and principles for writing code primarily in the C and C++ programming languages. Collectively, these coding conventions and coding principles create an NLCMF programming style. In addition to detailed naming conventions, this report provides general coding conventions and principles intended to facilitate the development of high-performance software implemented with code that is extensible, flexible, and interoperable. Conventions for developing modular code are explained in general terms and also enabled and demonstrated through the appended templates for C++ base source-code and header files. The NLCMF limited-extern approach to module structure, code inclusion, and cross-module access to data is both explained in the text and then illustrated through the module templates. Advice on the use of global variables is provided.

  3. Alternate Learning Center. Abstracts of Inservice Training Programs.

    ERIC Educational Resources Information Center

    Rhode Island State Dept. of Education, Providence. Div. of Development and Operations.

This booklet is a collection of abstracts describing the 18 programs offered at the Alternate Learning Center of the Rhode Island Teacher Center, which has as its primary function school-based inservice training for local teachers and administrators. Each project is described in detail, including course goals, specific objectives, training…

  4. 2018 Congress Poster Abstracts

    PubMed

    2018-02-21

    Each abstract has been indexed according to the first author. Abstracts appear as they were submitted and have not undergone editing or the Oncology Nursing Forum’s review process. Only abstracts that will be presented appear here. Poster numbers are subject to change. For updated poster numbers, visit congress.ons.org or check the Congress guide. Data published in abstracts presented at the ONS 43rd Annual Congress are embargoed until the conclusion of the presentation. Coverage and/or distribution of an abstract, poster, or any of its supplemental material to or by the news media, any commercial entity, or individuals, including the authors of said abstract, is strictly prohibited until the embargo is lifted. Promotion of general topics and speakers is encouraged within these guidelines.

  5. Abstracts of Review Articles and Educational Materials in Physiology

    ERIC Educational Resources Information Center

    Physiology Teacher, 1977

    1977-01-01

    Contained are 99 abstracts of review articles, texts, books, manuals, learning programs, and audiovisual material used in teaching physiology. Specific fields include cell physiology, circulation, comparative physiology, development and aging, endocrinology and metabolism, environmental and exercise physiology, gastrointestinal physiology, muscle…

  6. Identifying Pediatric Severe Sepsis and Septic Shock: Accuracy of Diagnosis Codes.

    PubMed

    Balamuth, Fran; Weiss, Scott L; Hall, Matt; Neuman, Mark I; Scott, Halden; Brady, Patrick W; Paul, Raina; Farris, Reid W D; McClead, Richard; Centkowski, Sierra; Baumer-Mouradian, Shannon; Weiser, Jason; Hayes, Katie; Shah, Samir S; Alpern, Elizabeth R

    2015-12-01

To evaluate the accuracy of 2 established administrative methods of identifying children with sepsis using a medical record review reference standard. Multicenter retrospective study at 6 US children's hospitals. Subjects were children >60 days to <19 years of age and were identified in 4 groups based on International Classification of Diseases, Ninth Revision, Clinical Modification codes: (1) severe sepsis/septic shock (sepsis codes); (2) infection plus organ dysfunction (combination codes); (3) subjects without codes for infection, organ dysfunction, or severe sepsis; and (4) infection but not severe sepsis or organ dysfunction. Combination codes were allowed, but not required, within the sepsis codes group. We determined the presence of reference standard severe sepsis according to consensus criteria. Logistic regression was performed to determine whether the addition of codes for sepsis therapies improved case identification. A total of 130 of 432 subjects met the reference standard for severe sepsis. Sepsis codes had sensitivity 73% (95% CI 70-86), specificity 92% (95% CI 87-95), and positive predictive value 79% (95% CI 70-86). Combination codes had sensitivity 15% (95% CI 9-22), specificity 71% (95% CI 65-76), and positive predictive value 18% (95% CI 11-27). Slight improvements in model characteristics were observed when codes for vasoactive medications and endotracheal intubation were added to sepsis codes (c-statistic 0.83 vs 0.87, P = .008). Sepsis-specific International Classification of Diseases, Ninth Revision, Clinical Modification codes identify pediatric patients with severe sepsis in administrative data more accurately than a combination of codes for infection plus organ dysfunction. Copyright © 2015 Elsevier Inc. All rights reserved.
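The screening metrics reported above all follow from a 2x2 confusion matrix. A minimal sketch; the counts below are illustrative, not taken from the study:

```python
def diagnostic_metrics(tp: int, fp: int, fn: int, tn: int) -> dict:
    """Standard screening metrics from a 2x2 confusion matrix."""
    return {
        "sensitivity": tp / (tp + fn),   # true cases correctly flagged
        "specificity": tn / (tn + fp),   # non-cases correctly cleared
        "ppv": tp / (tp + fp),           # flagged subjects who are cases
        "npv": tn / (tn + fn),           # cleared subjects who are non-cases
    }

# Hypothetical counts (not the study's data) for 432 subjects,
# 130 of whom meet the reference standard:
m = diagnostic_metrics(tp=95, fp=25, fn=35, tn=277)
```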

  7. Practical guide to bar coding for patient medication safety.

    PubMed

    Neuenschwander, Mark; Cohen, Michael R; Vaida, Allen J; Patchett, Jeffrey A; Kelly, Jamie; Trohimovich, Barbara

    2003-04-15

Bar coding for the medication administration step of the drug-use process is discussed. FDA will propose a rule in 2003 that would require bar-code labels on all human drugs and biologicals. Even with an FDA mandate, manufacturer procrastination and possible shifts in product availability are likely to slow progress. Such delays should not preclude health systems from adopting bar-code-enabled point-of-care (BPOC) systems to achieve gains in patient safety. Bar-code technology is a replacement for traditional keyboard data entry. The elements of bar coding are content, which determines the meaning; data format, which refers to the embedded data; and symbology, which describes the "font" in which the machine-readable code is written. For a BPOC system to deliver an acceptable level of patient protection, the hospital must first establish reliable processes for patient identification bands, caregiver badges, and medication bar coding. Medications can have either drug-specific or patient-specific bar codes. Both varieties result in the desired code that supports the patient's five rights of drug administration. When medications are not available from the manufacturer in immediate-container bar-coded packaging, other means of applying the bar code must be devised, including the use of repackaging equipment, overwrapping, manual bar coding, and outsourcing. Virtually all medications should be bar coded, the bar code on the label should be easily readable, and appropriate policies, procedures, and checks should be in place. Bar coding has the potential not only to be cost-effective but also to produce a return on investment. By bar coding patient identification tags, caregiver badges, and immediate-container medications, health systems can substantially increase patient safety during medication administration.
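As a concrete example of the "content" element of a bar code, the standard GS1 check-digit rule for UPC-A can be sketched as follows. This is a general illustration of a well-known algorithm, not code drawn from the article:

```python
def upc_a_check_digit(first11: str) -> int:
    """GS1 check digit for a 12-digit UPC-A code, computed from the
    first 11 digits: odd positions (1st, 3rd, ...) carry weight 3,
    even positions carry weight 1."""
    if len(first11) != 11 or not first11.isdigit():
        raise ValueError("expected 11 digits")
    total = sum((3 if i % 2 == 0 else 1) * int(d)
                for i, d in enumerate(first11))
    return (10 - total % 10) % 10
```

A scanner recomputes this digit on every read, so a single misread digit is caught before the code reaches the medication-administration software.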

  8. An initial-abstraction, constant-loss model for unit hydrograph modeling for applicable watersheds in Texas

    USGS Publications Warehouse

    Asquith, William H.; Roussel, Meghan C.

    2007-01-01

Estimation of representative hydrographs from design storms, which are known as design hydrographs, provides for cost-effective, risk-mitigated design of drainage structures such as bridges, culverts, roadways, and other infrastructure. During 2001-07, the U.S. Geological Survey (USGS), in cooperation with the Texas Department of Transportation, investigated runoff hydrographs, design storms, unit hydrographs, and watershed-loss models to enhance design hydrograph estimation in Texas. Design hydrographs ideally should mimic the general volume, peak, and shape of observed runoff hydrographs. Design hydrographs commonly are estimated in part by unit hydrographs. A unit hydrograph is defined as the runoff hydrograph that results from a unit pulse of excess rainfall uniformly distributed over the watershed at a constant rate for a specific duration. A time-distributed watershed-loss model is required for modeling by unit hydrographs. This report develops a specific time-distributed watershed-loss model known as an initial-abstraction, constant-loss model. For this watershed-loss model, a watershed is conceptualized to have the capacity to store or abstract an absolute depth of rainfall at and near the beginning of a storm. Depths of total rainfall less than this initial abstraction do not produce runoff. The watershed also is conceptualized to have the capacity to remove rainfall at a constant rate (loss) after the initial abstraction is satisfied. Additional rainfall inputs after the initial abstraction is satisfied contribute to runoff if the rainfall rate (intensity) is larger than the constant loss. The initial-abstraction, constant-loss model thus is a two-parameter model. The initial-abstraction, constant-loss model is investigated through detailed computational and statistical analysis of observed rainfall and runoff data for 92 USGS streamflow-gaging stations (watersheds) in Texas with contributing drainage areas from 0.26 to 166 square miles.
The analysis is
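The two-parameter loss model described above can be sketched directly. The function below is an illustrative implementation, not the USGS code; rainfall depths and the constant loss are assumed to be expressed per time interval:

```python
def excess_rainfall(rain_depths, initial_abstraction, constant_loss):
    """Initial-abstraction, constant-loss (two-parameter) watershed-loss model.

    rain_depths: rainfall depth per time interval (e.g., inches).
    initial_abstraction: depth stored by the watershed before any runoff.
    constant_loss: depth removed per interval once the abstraction is filled.
    Returns the excess (runoff-producing) depth for each interval.
    """
    remaining_ia = initial_abstraction
    excess = []
    for depth in rain_depths:
        absorbed = min(depth, remaining_ia)  # satisfy the initial abstraction first
        remaining_ia -= absorbed
        after_ia = depth - absorbed
        # rainfall contributes to runoff only when it exceeds the constant loss
        excess.append(max(0.0, after_ia - constant_loss))
    return excess
```

For example, with an initial abstraction of 1.0 inch and a constant loss of 0.2 inch per interval, the first 1.0 inch of a storm produces no excess, and later intervals shed only the depth above 0.2 inch.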

  9. A new way to generate cytolytic tumor-specific T cells: electroporation of RNA coding for a T cell receptor into T lymphocytes.

    PubMed

    Schaft, Niels; Dörrie, Jan; Müller, Ina; Beck, Verena; Baumann, Stefanie; Schunder, Tanja; Kämpgen, Eckhart; Schuler, Gerold

    2006-09-01

    Effective T cell receptor (TCR) transfer until now required stable retroviral transduction. However, retroviral transduction poses the threat of irreversible genetic manipulation of autologous cells. We, therefore, used optimized RNA transfection for transient manipulation. The transfection efficiency, using EGFP RNA, was >90%. The electroporation of primary T cells, isolated from blood, with TCR-coding RNA resulted in functional cytotoxic T lymphocytes (CTLs) (>60% killing at an effector to target ratio of 20:1) with the same HLA-A2/gp100-specificity as the parental CTL clone. The TCR-transfected T cells specifically recognized peptide-pulsed T2 cells, or dendritic cells electroporated with gp100-coding RNA, in an IFNgamma-secretion assay and retained this ability, even after cryopreservation, over 3 days. Most importantly, we show here for the first time that the electroporated T cells also displayed cytotoxicity, and specifically lysed peptide-loaded T2 cells and HLA-A2+/gp100+ melanoma cells over a period of at least 72 h. Peptide-titration studies showed that the lytic efficiency of the RNA-transfected T cells was similar to that of retrovirally transduced T cells, and approximated that of the parental CTL clone. Functional TCR transfer by RNA electroporation is now possible without the disadvantages of retroviral transduction, and forms a new strategy for the immunotherapy of cancer.

  10. An information theoretic approach to use high-fidelity codes to calibrate low-fidelity codes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lewis, Allison, E-mail: lewis.allison10@gmail.com; Smith, Ralph; Williams, Brian

For many simulation models, it can be prohibitively expensive or physically infeasible to obtain a complete set of experimental data to calibrate model parameters. In such cases, one can alternatively employ validated higher-fidelity codes to generate simulated data, which can be used to calibrate the lower-fidelity code. In this paper, we employ an information-theoretic framework to determine the reduction in parameter uncertainty that is obtained by evaluating the high-fidelity code at a specific set of design conditions. These conditions are chosen sequentially, based on the amount of information that they contribute to the low-fidelity model parameters. The goal is to employ Bayesian experimental design techniques to minimize the number of high-fidelity code evaluations required to accurately calibrate the low-fidelity model. We illustrate the performance of this framework using heat and diffusion examples, a 1-D kinetic neutron diffusion equation, and a particle transport model, and include initial results from the integration of the high-fidelity thermal-hydraulics code Hydra-TH with a low-fidelity exponential model for the friction correlation factor.
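Under linear-Gaussian assumptions, the expected information gain from one observation of the high-fidelity code has a closed form, which gives a minimal sketch of the sequential selection idea. The toy model and every number below are hypothetical and far simpler than the models in the paper:

```python
import math
import random

def expected_info_gain(prior_samples, x, noise_var, model):
    """Gaussian approximation to the information gained by observing
    y = model(theta, x) + noise at design point x:
    0.5 * ln(1 + Var_prior[model(theta, x)] / noise_var)."""
    preds = [model(theta, x) for theta in prior_samples]
    mean = sum(preds) / len(preds)
    var = sum((p - mean) ** 2 for p in preds) / (len(preds) - 1)
    return 0.5 * math.log(1.0 + var / noise_var)

# Toy low-fidelity model y = theta * x with an uncertain slope theta.
random.seed(0)
prior = [random.gauss(2.0, 0.5) for _ in range(2000)]
candidates = [0.1, 0.5, 1.0, 2.0]
best = max(candidates,
           key=lambda x: expected_info_gain(prior, x, noise_var=0.01,
                                            model=lambda t, x: t * x))
```

For this linear toy model the prediction variance grows with x^2, so the design criterion picks the largest candidate; in general one would re-fit the posterior after each high-fidelity run and repeat.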

  11. Circular codes revisited: a statistical approach.

    PubMed

    Gonzalez, D L; Giannerini, S; Rosa, R

    2011-04-21

In 1996 Arquès and Michel [1996. A complementary circular code in the protein coding genes. J. Theor. Biol. 182, 45-58] discovered the existence of a common circular code in eukaryote and prokaryote genomes. Since then, circular code theory has provoked great interest and undergone rapid development. In this paper we discuss some theoretical issues related to the synchronization properties of coding sequences and circular codes with particular emphasis on the problem of retrieval and maintenance of the reading frame. Motivated by the theoretical discussion, we adopt a rigorous statistical approach in order to try to answer different questions. First, we investigate the covering capability of the whole class of 216 self-complementary, C(3) maximal codes with respect to a large set of coding sequences. The results indicate that, on average, the code proposed by Arquès and Michel has the best covering capability but, still, there exists a great variability among sequences. Second, we focus on such code and explore the role played by the proportion of the bases by means of a hierarchy of permutation tests. The results show the existence of a sort of optimization mechanism such that coding sequences are tailored as to maximize or minimize the coverage of circular codes on specific reading frames. Such optimization clearly relates the function of circular codes with reading frame synchronization. Copyright © 2011 Elsevier Ltd. All rights reserved.
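The circularity property at the heart of this work can be tested mechanically. The sketch below uses a graph-theoretic characterization published later by Fimmel, Michel, and Strüngmann (2016), not the statistical methodology of this paper: a trinucleotide code is circular exactly when its associated directed graph is acyclic.

```python
def is_circular(code):
    """Graph-based circularity test for a set of trinucleotides, after
    Fimmel, Michel, and Struengmann (2016): a trinucleotide code is
    circular iff the directed graph with arcs b1 -> b2b3 and b1b2 -> b3
    (for each codon b1b2b3) contains no cycle."""
    edges = {}
    for codon in code:
        edges.setdefault(codon[0], set()).add(codon[1:])
        edges.setdefault(codon[:2], set()).add(codon[2])

    WHITE, GRAY, BLACK = 0, 1, 2
    color = {}

    def has_cycle(node):
        # Depth-first search; a GRAY successor means a back edge, i.e. a cycle.
        color[node] = GRAY
        for nxt in edges.get(node, ()):
            c = color.get(nxt, WHITE)
            if c == GRAY or (c == WHITE and has_cycle(nxt)):
                return True
        color[node] = BLACK
        return False

    return not any(has_cycle(n) for n in list(edges)
                   if color.get(n, WHITE) == WHITE)
```

A cycle corresponds to an ambiguous reading: for instance, a code containing both ACG and its circular permutation CGA cannot retrieve the reading frame, and the test rejects it.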

  12. DROP: Detecting Return-Oriented Programming Malicious Code

    NASA Astrophysics Data System (ADS)

    Chen, Ping; Xiao, Hai; Shen, Xiaobin; Yin, Xinchun; Mao, Bing; Xie, Li

Return-Oriented Programming (ROP) is a new technique that helps an attacker construct malicious code mounted on x86/SPARC executables without any function call at all. The technique makes the ROP malicious code contain no instructions of its own, which differs from existing attacks. Moreover, it hides the malicious code in benign code. Thus, it circumvents approaches that prevent control-flow diversion outside legitimate regions (such as W ⊕ X) and most malicious-code scanning techniques (such as anti-virus scanners). However, ROP has intrinsic features that differ from normal program design: (1) it uses short instruction sequences ending in "ret", called gadgets, and (2) it executes the gadgets contiguously in specific memory regions, such as the standard GNU libc. Based on these features of ROP malicious code, this paper presents DROP, a tool focused on dynamically detecting ROP malicious code. Preliminary experimental results show that DROP can efficiently detect ROP malicious code with no false positives or negatives.
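The two intrinsic features listed above suggest a simple trace-based heuristic. The sketch below is a toy illustration of that idea, not DROP's actual implementation; the trace format and both thresholds are hypothetical:

```python
def looks_like_rop(trace, lib_range, max_gadget_len=5, min_gadgets=3):
    """Toy ROP heuristic over an executed-instruction trace.

    trace: list of (address, mnemonic) tuples in execution order.
    lib_range: (lo, hi) address range of a library such as libc.
    Flags the trace if it contains >= min_gadgets consecutive short
    sequences (each <= max_gadget_len instructions, ending in 'ret')
    whose instructions all fall inside lib_range.
    """
    lo, hi = lib_range
    gadgets = 0      # consecutive gadget-like sequences seen so far
    length = 0       # instructions in the current ret-terminated sequence
    in_lib = True    # does the whole current sequence lie inside lib_range?
    for addr, mnem in trace:
        length += 1
        in_lib = in_lib and lo <= addr < hi
        if mnem == "ret":
            if in_lib and length <= max_gadget_len:
                gadgets += 1
                if gadgets >= min_gadgets:
                    return True
            else:
                gadgets = 0   # a long or out-of-library sequence breaks the run
            length, in_lib = 0, True
    return False
```

Normal code rarely chains many short, library-resident ret-terminated sequences back to back, which is why counting consecutive gadget-like sequences separates the two behaviors in this toy setting.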

  13. A concatenated coding scheme for error control

    NASA Technical Reports Server (NTRS)

    Kasami, T.; Fujiwara, T.; Lin, S.

    1986-01-01

    In this paper, a concatenated coding scheme for error control in data communications is presented and analyzed. In this scheme, the inner code is used for both error correction and detection; however, the outer code is used only for error detection. A retransmission is requested if either the inner code decoder fails to make a successful decoding or the outer code decoder detects the presence of errors after the inner code decoding. Probability of undetected error (or decoding error) of the proposed scheme is derived. An efficient method for computing this probability is presented. Throughput efficiency of the proposed error control scheme incorporated with a selective-repeat ARQ retransmission strategy is also analyzed. Three specific examples are presented. One of the examples is proposed for error control in the NASA Telecommand System.
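The inner-code-for-correction, outer-code-for-detection split can be simulated with stand-in codes. The sketch below uses a bit-tripling inner code and a CRC-8 outer code purely for illustration; these are not the codes analyzed in the paper:

```python
def crc8(data: bytes, poly: int = 0x07) -> int:
    """Outer code (detection only): CRC-8 checksum."""
    crc = 0
    for byte in data:
        crc ^= byte
        for _ in range(8):
            crc = ((crc << 1) ^ poly) & 0xFF if crc & 0x80 else (crc << 1) & 0xFF
    return crc

def inner_encode(data: bytes) -> list:
    """Toy inner code: repeat each bit three times (corrects one flip per triple)."""
    bits = []
    for byte in data:
        for i in range(8):
            bits.extend([(byte >> (7 - i)) & 1] * 3)
    return bits

def inner_decode(bits: list) -> bytes:
    """Inner decoder: majority vote over each bit triple."""
    out = bytearray()
    for i in range(0, len(bits), 24):
        byte = 0
        for j in range(0, 24, 3):
            byte = (byte << 1) | (1 if sum(bits[i + j:i + j + 3]) >= 2 else 0)
        out.append(byte)
    return bytes(out)

def send(message: bytes, flips=()):
    """One ARQ round: returns (decoded payload, accepted?).
    A rejection would trigger a retransmission request."""
    frame = message + bytes([crc8(message)])   # outer check appended
    channel = inner_encode(frame)
    for f in flips:                            # inject channel bit errors
        channel[f] ^= 1
    decoded = inner_decode(channel)
    payload, check = decoded[:-1], decoded[-1]
    return payload, crc8(payload) == check
```

A single channel flip is absorbed by the inner code; only when the inner decoder itself fails (e.g., a whole triple corrupted) does the outer check reject the frame and request retransmission, mirroring the division of labor described in the abstract.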

  14. Transforming Aggregate Object-Oriented Formal Specifications to Code

    DTIC Science & Technology

    1999-03-01

integration issues associated with a formal-based software transformation system, such as the source specification, the problem space architecture, design architecture, ... design transforms, and target software transforms. Software is critical in today's Air Force, yet its specification, design, and development

  15. 32 CFR 636.11 - Installation traffic codes

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 32 National Defense 4 2010-07-01 2010-07-01 true Installation traffic codes 636.11 Section 636.11 National Defense Department of Defense (Continued) DEPARTMENT OF THE ARMY (CONTINUED) LAW ENFORCEMENT AND CRIMINAL INVESTIGATIONS MOTOR VEHICLE TRAFFIC SUPERVISION (SPECIFIC INSTALLATIONS) Fort Stewart, Georgia § 636.11 Installation traffic codes In...

  16. Operating System Abstraction Layer (OSAL)

    NASA Technical Reports Server (NTRS)

    Yanchik, Nicholas J.

    2007-01-01

This viewgraph presentation reviews the concept of the Operating System Abstraction Layer (OSAL) and its benefits. The OSAL is a small layer of software that allows programs to run on many different operating systems and hardware platforms; it runs independently of the underlying OS and hardware and is self-contained. The benefits of the OSAL are that it removes dependencies on any one operating system and promotes portable, reusable flight software. It allows core Flight Software (FSW) to be built for multiple processors and operating systems. The presentation discusses the functionality, describes the various OSAL releases, and presents the specifications.

  17. Intra Frame Coding In Advanced Video Coding Standard (H.264) to Obtain Consistent PSNR and Reduce Bit Rate for Diagonal Down Left Mode Using Gaussian Pulse

    NASA Astrophysics Data System (ADS)

    Manjanaik, N.; Parameshachari, B. D.; Hanumanthappa, S. N.; Banu, Reshma

    2017-08-01

The intra prediction process of the H.264 video coding standard codes the first (intra) frame of a video and obtains better coding efficiency than earlier video coding standards. A further benefit of intra frame coding is that it reduces spatial pixel redundancy within the current frame, reduces computational complexity, and provides better rate-distortion performance. Intra frame coding conventionally uses the Rate Distortion Optimization (RDO) method, which increases computational complexity, increases bit rate, and reduces picture quality, making it difficult to implement in real-time applications; many researchers have therefore developed fast mode decision algorithms for intra frame coding. Previous work on intra frame coding in the H.264 standard using fast mode decision intra prediction algorithms based on various techniques suffered increased bit rate and degraded picture quality (PSNR) at different quantization parameters. Many earlier fast mode decision approaches achieved only a reduction in computational complexity or a saving in encoding time, at the cost of increased bit rate and loss of picture quality. To avoid the increase in bit rate and the loss of picture quality, a better approach was developed. This paper develops that approach, applying a Gaussian pulse to intra frame coding with the diagonal down-left intra prediction mode to achieve higher coding efficiency in terms of PSNR and bit rate. In the proposed method, a Gaussian pulse is multiplied with each 4x4 block of frequency-domain coefficients of the 4x4 sub-macroblocks of each macroblock of the current frame before quantization. Multiplying the Gaussian pulse with each 4x4 block of integer-transformed coefficients at the macroblock level scales the information in the coefficients in a reversible manner. The resulting signal turns abstract: frequency samples are abstracted in a known and controllable manner without intermixing of coefficients, it avoids
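The reversible, element-wise scaling described above can be sketched with a sampled 2-D Gaussian window. This illustrates only the scaling operation, not the paper's pipeline; the window width is a hypothetical choice, and in the actual method the scaling would precede quantization:

```python
import math

def gaussian_window(n=4, sigma=1.5):
    """A 2-D Gaussian sampled on an n x n grid, centred on the block.
    sigma is an illustrative choice, not a value from the paper."""
    c = (n - 1) / 2.0
    return [[math.exp(-((i - c) ** 2 + (j - c) ** 2) / (2 * sigma ** 2))
             for j in range(n)] for i in range(n)]

def scale(block, window):
    # Element-wise multiply: each coefficient is scaled independently,
    # so there is no intermixing between coefficients.
    return [[b * w for b, w in zip(brow, wrow)]
            for brow, wrow in zip(block, window)]

def unscale(block, window):
    # Reversible because every sample of the Gaussian window is nonzero.
    return [[b / w for b, w in zip(brow, wrow)]
            for brow, wrow in zip(block, window)]
```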

  18. Experimental evaluation of certification trails using abstract data type validation

    NASA Technical Reports Server (NTRS)

    Wilson, Dwight S.; Sullivan, Gregory F.; Masson, Gerald M.

    1993-01-01

Certification trails are a recently introduced and promising approach to fault detection and fault tolerance. Recent experimental work reveals many cases in which a certification-trail approach allows for significantly faster program execution time than a basic time-redundancy approach. Algorithms for answer validation of abstract data types allow a certification-trail approach to be used for a wide variety of problems. An attempt to assess the performance of algorithms utilizing certification trails on abstract data types is reported. Specifically, this method was applied to the following problems: heapsort, Huffman tree, shortest path, and skyline. Previous results used certification trails specific to a particular problem and implementation. The approach allows certification trails to be localized to 'data structure modules,' making the use of this technique transparent to the user of such modules.
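The answer-validation idea can be illustrated with sorting, one of the problems listed above: the first execution emits a trail that a much simpler second execution checks in linear time. A minimal sketch, not the paper's implementation:

```python
import heapq

def sort_with_trail(items):
    """First (complex) execution: heapsort that also emits a trail.
    Here the sorted output itself serves as the certification trail."""
    heap = list(items)
    heapq.heapify(heap)
    return [heapq.heappop(heap) for _ in range(len(items))]

def validate(items, trail):
    """Second (simple) execution: linear-time check that the trail is a
    sorted permutation of the input -- no re-sorting required."""
    if len(trail) != len(items):
        return False
    if any(a > b for a, b in zip(trail, trail[1:])):
        return False          # not non-decreasing
    counts = {}
    for x in items:
        counts[x] = counts.get(x, 0) + 1
    for x in trail:
        counts[x] = counts.get(x, 0) - 1
    return all(v == 0 for v in counts.values())   # same multiset
```

A fault in either execution that changes the answer is caught because the O(n) checker disagrees with the O(n log n) producer, which is the essence of the certification-trail speedup over running the full computation twice.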

  19. The proposed coding standard at GSFC

    NASA Technical Reports Server (NTRS)

    Morakis, J. C.; Helgert, H. J.

    1977-01-01

    As part of the continuing effort to introduce standardization of spacecraft and ground equipment in satellite systems, NASA's Goddard Space Flight Center and other NASA facilities have supported the development of a set of standards for the use of error control coding in telemetry subsystems. These standards are intended to ensure compatibility between spacecraft and ground encoding equipment, while allowing sufficient flexibility to meet all anticipated mission requirements. The standards which have been developed to date cover the application of block codes in error detection and error correction modes, as well as short and long constraint length convolutional codes decoded via the Viterbi and sequential decoding algorithms, respectively. Included are detailed specifications of the codes, and their implementation. Current effort is directed toward the development of standards covering channels with burst noise characteristics, channels with feedback, and code concatenation.

  20. Regulation of mammalian cell differentiation by long non-coding RNAs

    PubMed Central

    Hu, Wenqian; Alvarez-Dominguez, Juan R; Lodish, Harvey F

    2012-01-01

    Differentiation of specialized cell types from stem and progenitor cells is tightly regulated at several levels, both during development and during somatic tissue homeostasis. Many long non-coding RNAs have been recognized as an additional layer of regulation in the specification of cellular identities; these non-coding species can modulate gene-expression programmes in various biological contexts through diverse mechanisms at the transcriptional, translational or messenger RNA stability levels. Here, we summarize findings that implicate long non-coding RNAs in the control of mammalian cell differentiation. We focus on several representative differentiation systems and discuss how specific long non-coding RNAs contribute to the regulation of mammalian development. PMID:23070366

  1. NASA patent abstracts bibliography: A continuing bibliography. Section 1: Abstracts (supplement 08)

    NASA Technical Reports Server (NTRS)

    1976-01-01

This bibliography is issued in two sections: abstracts and indexes. The Abstract Section cites 180 patents and applications for patents introduced into the NASA scientific and technical information system during the period of July 1975 through December 1975. Each entry in the Abstract Section consists of a citation, an abstract, and in most cases, a key illustration selected from the patent or application for patent. The Index Section contains entries for 2,905 patents and applications for patent citations covering the period May 1969 through December 1975. The Index Section contains five indexes: subject, inventor, source, number, and accession number.

  2. The accuracy of burn diagnosis codes in health administrative data: A validation study.

    PubMed

    Mason, Stephanie A; Nathens, Avery B; Byrne, James P; Fowler, Rob; Gonzalez, Alejandro; Karanicolas, Paul J; Moineddin, Rahim; Jeschke, Marc G

    2017-03-01

Health administrative databases may provide rich sources of data for the study of outcomes following burn. We aimed to determine the accuracy of International Classification of Diseases diagnoses codes for burn in a population-based administrative database. Data from a regional burn center's clinical registry of patients admitted between 2006 and 2013 were linked to administrative databases. Burn total body surface area (TBSA), depth, mechanism, and inhalation injury were compared between the registry and administrative records. The sensitivity, specificity, and positive and negative predictive values were determined, and coding agreement was assessed with the kappa statistic. 1215 burn center patients were linked to administrative records. TBSA codes were highly sensitive and specific for ≥10 and ≥20% TBSA (89/93% sensitive and 95/97% specific), with excellent agreement (κ = 0.85/κ = 0.88). Codes were weakly sensitive (68%) in identifying ≥10% TBSA full-thickness burn, though highly specific (86%) with moderate agreement (κ = 0.46). Codes for inhalation injury had limited sensitivity (43%) but high specificity (99%) with moderate agreement (κ = 0.54). Burn mechanism had excellent coding agreement (κ = 0.84). Administrative data diagnosis codes accurately identify burn by burn size and mechanism, while identification of inhalation injury or full-thickness burns is less sensitive but highly specific. Copyright © 2016 Elsevier Ltd and ISBI. All rights reserved.
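The kappa statistics quoted above are computed from the same 2x2 agreement tables as the sensitivity and specificity figures. A minimal sketch with illustrative counts (not the study's data):

```python
def cohens_kappa(a: int, b: int, c: int, d: int) -> float:
    """Cohen's kappa for a 2x2 agreement table:

                      registry yes   registry no
        codes yes          a              b
        codes no           c              d
    """
    n = a + b + c + d
    p_observed = (a + d) / n                                   # raw agreement
    p_chance = ((a + b) * (a + c) + (c + d) * (b + d)) / n ** 2  # expected by chance
    return (p_observed - p_chance) / (1 - p_chance)
```

Unlike raw agreement, kappa discounts the agreement expected by chance, which is why it is the usual yardstick for coding-validation studies like this one.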

  3. Compliance with the International Code of Marketing of breast-milk substitutes: an observational study of pediatricians' waiting rooms.

    PubMed

    Dodgson, Joan E; Watkins, Amanda L; Bond, Angela B; Kintaro-Tagaloa, Cheryl; Arellano, Alondra; Allred, Patrick A

    2014-04-01

The importance of breastmilk as a primary preventative intervention is widely known and understood by most healthcare providers. The actions or non-actions that healthcare providers take toward promoting and supporting breastfeeding families make a difference in the success and duration of breastfeeding. Recognizing this relationship, the World Health Organization developed the International Code of Marketing of Breast-milk Substitutes (the Code), which defines best practices in breastfeeding promotion, including physicians' offices. The pediatric practices' waiting rooms are often a family's first experience with pediatric care. The specific aims of this study were to describe (1) Code compliance, (2) the demographic factors affecting Code compliance, and (3) the amount and type of breastfeeding-supportive materials available in the pediatricians' waiting rooms. An observational cross-sectional design was used to collect data from 163 (82%) of the pediatric practices in Maricopa County, Arizona. None of the 100 waiting rooms that had any materials displayed (61%) was found to be completely Code compliant, with 81 of the offices having formula-promotional materials readily available. Waiting rooms in higher income areas offered more non-Code-compliant materials and gifts. Breastfeeding support information and materials were lacking in all but 18 (18%) offices. A positive relationship (t(97) = -2.31, p = 0.02) occurred between the presence of breastfeeding educational materials and higher income areas. We were able to uncover some practice-related patterns that impact families and potentially undermine breastfeeding success. To move current practices toward breastfeeding-friendly physicians' offices, change is needed.

  4. Product information representation for feature conversion and implementation of group technology automated coding

    NASA Astrophysics Data System (ADS)

    Medland, A. J.; Zhu, Guowang; Gao, Jian; Sun, Jian

    1996-03-01

    Feature conversion, also called feature transformation and feature mapping, is defined as the process of converting features from one view of an object to another view of the object. In a relatively simple implementation, for each application the design features are automatically converted into features specific for that application. All modifications have to be made via the design features. This is the approach that has attracted most attention until now. In the ideal situation, however, conversions directly from application views to the design view, and to other applications views, are also possible. In this paper, some difficulties faced in feature conversion are discussed. A new representation scheme of feature-based parts models has been proposed for the purpose of one-way feature conversion. The parts models consist of five different levels of abstraction, extending from an assembly level and its attributes, single parts and their attributes, single features and their attributes, one containing the geometric reference element and finally one for detailed geometry. One implementation of feature conversion for rotational components within GT (Group Technology) has already been undertaken using an automated coding procedure operating on a design-feature database. This database has been generated by a feature-based design system, and the GT coding scheme used in this paper is a specific scheme created for a textile machine manufacturing plant. Such feature conversion techniques presented here are only in their early stages of development and further research is underway.

  5. Abstracting Concepts and Methods.

    ERIC Educational Resources Information Center

    Borko, Harold; Bernier, Charles L.

    This text provides a complete discussion of abstracts--their history, production, organization, publication--and of indexing. Instructions for abstracting are outlined, and standards and criteria for abstracting are stated. Management, automation, and personnel are discussed in terms of possible economies that can be derived from the introduction…

  6. NASA patent abstracts bibliography: A continuing bibliography. Section 1: Abstracts (supplement 07)

    NASA Technical Reports Server (NTRS)

    1975-01-01

    This bibliography is issued in two sections: Section 1 - Abstracts, and Section 2 - Indexes. This issue of the Abstract Section cites 158 patents and applications for patent introduced into the NASA scientific and technical information system during the period of January 1975 through June 1975. Each entry in the Abstract Section consists of a citation, an abstract, and, in most cases, a key illustration selected from the patent or application for patent. This issue of the Index Section contains entries for 2830 patent and application for patent citations covering the period May 1969 through June 1975. The Index Section contains five indexes -- subject, inventor, source, number, and accession number.

  7. NASA patent abstracts bibliography: A continuing bibliography. Section 1: Abstracts (supplement 09)

    NASA Technical Reports Server (NTRS)

    1976-01-01

    This bibliography is issued in two sections: Section 1 - Abstracts, and Section 2 - Indexes. This issue of the Abstract Section cites 200 patents and applications for patent introduced into the NASA scientific and technical information system during the period of January 1976 through June 1976. Each entry in the Abstract Section consists of a citation, an abstract, and in most cases, a key illustration selected from the patent or application for patent. This issue of the Index Section contains entries for 2994 patent and application for patent citations covering the period May 1969 through June 1976. The Index Section contains five indexes -- subject, inventor, source, number and accession number.

  8. Promoter analysis reveals globally differential regulation of human long non-coding RNA and protein-coding genes

    DOE PAGES

    Alam, Tanvir; Medvedeva, Yulia A.; Jia, Hui; ...

    2014-10-02

    Transcriptional regulation of protein-coding genes is increasingly well understood on a global scale, yet no comparable information exists for long non-coding RNA (lncRNA) genes, which were recently recognized to be as numerous as protein-coding genes in mammalian genomes. We performed a genome-wide comparative analysis of the promoters of human lncRNA and protein-coding genes, finding global differences in specific genetic and epigenetic features relevant to transcriptional regulation. These two groups of genes are hence subject to separate transcriptional regulatory programs, including distinct transcription factor (TF) proteins that significantly favor lncRNA, rather than coding-gene, promoters. We report a specific signature of promoter-proximal transcriptional regulation of lncRNA genes, including several distinct transcription factor binding sites (TFBS). Experimental DNase I hypersensitive site profiles are consistent with active configurations of these lncRNA TFBS sets in diverse human cell types. TFBS ChIP-seq datasets confirm the binding events that we predicted using computational approaches for a subset of factors. For several TFs known to be directly regulated by lncRNAs, we find that their putative TFBSs are enriched at lncRNA promoters, suggesting that the TFs and the lncRNAs may participate in a bidirectional feedback loop regulatory network. Accordingly, cells may be able to modulate lncRNA expression levels independently of mRNA levels via distinct regulatory pathways. Our results also raise the possibility that, given the historical reliance on protein-coding gene catalogs to define the chromatin states of active promoters, a revision of these chromatin signature profiles to incorporate expressed lncRNA genes is warranted in the future.

  10. DRG benchmarking study establishes national coding norms.

    PubMed

    Vaul, J H

    1998-05-01

    With the increase in fraud and abuse investigations, healthcare financial managers should examine their organization's medical record coding procedures. The Federal government and third-party payers are looking specifically for improper billing of outpatient services, unbundling of procedures to increase payment, assigning higher-paying DRG codes for inpatient claims, and other abuses. A recent benchmarking study of Medicare Provider Analysis and Review (MEDPAR) data has established national norms for hospital coding and case mix based on DRGs and has revealed the majority of atypical coding cases fall into six DRG pairs. Organizations with a greater percentage of atypical cases--those more likely to be scrutinized by Federal investigators--will want to conduct suitable review and be sure appropriate documentation exists to justify the coding.

  11. NASA patent abstracts bibliography: A continuing bibliography. Section 1: Abstracts (supplement 13)

    NASA Technical Reports Server (NTRS)

    1978-01-01

    This bibliography is issued in two sections: Section 1 - Abstracts, and Section 2 - Indexes. This issue of the Abstract Section cites 161 patents and applications for patent introduced into the NASA scientific and technical information system during the period January 1978 through June 1978. Each entry consists of a citation, an abstract, and in most cases, a key illustration selected from the patent or application for patent.

  12. A concatenated coding scheme for error control

    NASA Technical Reports Server (NTRS)

    Lin, S.

    1985-01-01

    A concatenated coding scheme for error control in data communications is analyzed. The inner code is used for both error correction and detection; the outer code is used only for error detection. A retransmission is requested if the outer code detects the presence of errors after the inner code decoding. The probability of undetected error of the above error control scheme is derived and upper bounded. Two specific examples are analyzed. In the first example, the inner code is a distance-4 shortened Hamming code with generator polynomial (X+1)(X^6+X+1) = X^7+X^6+X^2+1 and the outer code is a distance-4 shortened Hamming code with generator polynomial (X+1)(X^15+X^14+X^13+X^12+X^4+X^3+X^2+X+1) = X^16+X^12+X^5+1, which is the X.25 standard for packet-switched data networks. This example is proposed for error control on NASA telecommand links. In the second example, the inner code is the same as that in the first example but the outer code is a shortened Reed-Solomon code with symbols from GF(2^8) and generator polynomial (X+1)(X+alpha), where alpha is a primitive element in GF(2^8).
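    The two generator-polynomial identities above can be checked mechanically with carry-less multiplication over GF(2). A minimal sketch in Python (illustrative, not from the report), encoding each polynomial as an integer bitmask where bit i holds the coefficient of X^i:

```python
def gf2_mul(a, b):
    """Carry-less (GF(2)) polynomial multiplication on bitmask-encoded polynomials."""
    result = 0
    while b:
        if b & 1:          # current coefficient of b is 1: add (XOR) shifted a
            result ^= a
        a <<= 1            # multiply a by X
        b >>= 1
    return result

# (X+1)(X^6+X+1) for the inner code's generator polynomial
inner = gf2_mul(0b11, 0b1000011)
assert inner == 0b11000101            # X^7+X^6+X^2+1

# (X+1)(X^15+X^14+X^13+X^12+X^4+X^3+X^2+X+1) for the X.25 outer code
outer = gf2_mul(0b11, 0b1111000000011111)
assert outer == 0b10001000000100001   # X^16+X^12+X^5+1
```

    The second product is the familiar CRC-16-CCITT generator polynomial used by X.25 frame checking.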

  13. Formal specification and verification of Ada software

    NASA Technical Reports Server (NTRS)

    Hird, Geoffrey R.

    1991-01-01

    The use of formal methods in software development achieves levels of quality assurance unobtainable by other means. The Larch approach to specification is described, and the specification of avionics software designed to implement the logic of a flight control system is given as an example. Penelope, an Ada verification environment, is described. The Penelope user inputs mathematical definitions, Larch-style specifications, and Ada code, and performs machine-assisted proofs that the code obeys its specifications. As an example, the verification of a binary search function is considered. Emphasis is given to techniques assisting the reuse of a verification effort on modified code.

  14. Code-Mixing as a Bilingual Instructional Strategy

    ERIC Educational Resources Information Center

    Jiang, Yih-Lin Belinda; García, Georgia Earnest; Willis, Arlette Ingram

    2014-01-01

    This study investigated code-mixing practices, specifically the use of L2 (English) in an L1 (Chinese) class in a U.S. bilingual program. Our findings indicate that the code-mixing practices made and prompted by the teacher served five pedagogical functions: (a) to enhance students' bilingualism and bilingual learning, (b) to review and…

  15. Learning Discriminative Binary Codes for Large-scale Cross-modal Retrieval.

    PubMed

    Xu, Xing; Shen, Fumin; Yang, Yang; Shen, Heng Tao; Li, Xuelong

    2017-05-01

    Hashing based methods have attracted considerable attention for efficient cross-modal retrieval on large-scale multimedia data. The core problem of cross-modal hashing is how to learn compact binary codes that construct the underlying correlations between heterogeneous features from different modalities. A majority of recent approaches aim at learning hash functions to preserve the pairwise similarities defined by given class labels. However, these methods fail to explicitly explore the discriminative property of class labels during hash function learning. In addition, they usually discard the discrete constraints imposed on the to-be-learned binary codes, and compromise to solve a relaxed problem with quantization to obtain the approximate binary solution. Therefore, the binary codes generated by these methods are suboptimal and less discriminative to different classes. To overcome these drawbacks, we propose a novel cross-modal hashing method, termed discrete cross-modal hashing (DCH), which directly learns discriminative binary codes while retaining the discrete constraints. Specifically, DCH learns modality-specific hash functions for generating unified binary codes, and these binary codes are viewed as representative features for discriminative classification with class labels. An effective discrete optimization algorithm is developed for DCH to jointly learn the modality-specific hash function and the unified binary codes. Extensive experiments on three benchmark data sets highlight the superiority of DCH under various cross-modal scenarios and show its state-of-the-art performance.
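    As background for how binary codes support cross-modal retrieval, here is a deliberately simple sign-of-random-projection baseline in Python. All names are illustrative, and the random projections stand in for the label-driven, discretely optimized hash functions that DCH actually learns:

```python
import random

rng = random.Random(0)

def hash_codes(vec, projection):
    """Sign-of-projection hashing: map a real-valued feature vector to bits.
    (Baseline sketch only; DCH learns its modality-specific projections from
    class labels with discrete optimization rather than drawing them at random.)"""
    return [1 if sum(v * w for v, w in zip(vec, row)) > 0 else 0
            for row in projection]

def hamming(a, b):
    return sum(x != y for x, y in zip(a, b))

n_bits, d_img, d_txt = 16, 32, 24
# One projection per modality, mapping into a shared n_bits-bit Hamming space
P_img = [[rng.gauss(0, 1) for _ in range(d_img)] for _ in range(n_bits)]
P_txt = [[rng.gauss(0, 1) for _ in range(d_txt)] for _ in range(n_bits)]

imgs = [[rng.gauss(0, 1) for _ in range(d_img)] for _ in range(5)]
txts = [[rng.gauss(0, 1) for _ in range(d_txt)] for _ in range(5)]
img_codes = [hash_codes(v, P_img) for v in imgs]
txt_codes = [hash_codes(v, P_txt) for v in txts]

# Cross-modal retrieval: rank images by Hamming distance to a text query
query = txt_codes[0]
ranking = sorted(range(5), key=lambda i: hamming(img_codes[i], query))
```

    The point of the unified Hamming space is that a query from one modality can be compared against codes from the other with cheap bit operations; DCH's contribution is making those bits discriminative with respect to class labels.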

  16. Synthesizing Safety Conditions for Code Certification Using Meta-Level Programming

    NASA Technical Reports Server (NTRS)

    Eusterbrock, Jutta

    2004-01-01

    In code certification the code consumer publishes a safety policy and the code producer generates a proof that the produced code is in compliance with the published safety policy. In this paper, a novel viewpoint is taken towards an implementation- and reuse-oriented framework for code certification. It adopts ingredients from Necula's approach for proof-carrying code, but in this work safety properties can be analyzed on a higher code level than assembly language instructions. It consists of three parts: (1) The specification language is extended to include generic pre-conditions that shall ensure safety at all states that can be reached during program execution. Actual safety requirements can be expressed by providing domain-specific definitions for the generic predicates, which act as an interface to the environment. (2) The Floyd-Hoare inductive assertion method is refined to obtain proof rules that allow the derivation of the proof obligations in terms of the generic safety predicates. (3) A meta-interpreter is designed and experimentally implemented that enables automatic synthesis of proof obligations for submitted programs by applying the modified Floyd-Hoare rules. The proof obligations have two separate conjuncts, one for functional correctness and another for the generic safety obligations. Proof of the generic obligations, having provided the actual safety definitions as context, ensures domain-specific safety of program execution in a particular environment and is simpler than full program verification.

  17. A subset of conserved mammalian long non-coding RNAs are fossils of ancestral protein-coding genes.

    PubMed

    Hezroni, Hadas; Ben-Tov Perry, Rotem; Meir, Zohar; Housman, Gali; Lubelsky, Yoav; Ulitsky, Igor

    2017-08-30

    Only a small portion of human long non-coding RNAs (lncRNAs) appear to be conserved outside of mammals, but the events underlying the birth of new lncRNAs in mammals remain largely unknown. One potential source is remnants of protein-coding genes that transitioned into lncRNAs. We systematically compare lncRNA and protein-coding loci across vertebrates, and estimate that up to 5% of conserved mammalian lncRNAs are derived from lost protein-coding genes. These lncRNAs have specific characteristics, such as broader expression domains, that set them apart from other lncRNAs. Fourteen lncRNAs have sequence similarity with the loci of the contemporary homologs of the lost protein-coding genes. We propose that selection acting on enhancer sequences is mostly responsible for retention of these regions. As an example of an RNA element from a protein-coding ancestor that was retained in the lncRNA, we describe in detail a short translated ORF in the JPX lncRNA that was derived from an upstream ORF in a protein-coding gene and retains some of its functionality. We estimate that ~ 55 annotated conserved human lncRNAs are derived from parts of ancestral protein-coding genes, and loss of coding potential is thus a non-negligible source of new lncRNAs. Some lncRNAs inherited regulatory elements influencing transcription and translation from their protein-coding ancestors and those elements can influence the expression breadth and functionality of these lncRNAs.

  18. Beacon- and Schema-Based Method for Recognizing Algorithms from Students' Source Code

    ERIC Educational Resources Information Center

    Taherkhani, Ahmad; Malmi, Lauri

    2013-01-01

    In this paper, we present a method for recognizing algorithms from students programming submissions coded in Java. The method is based on the concept of "programming schemas" and "beacons". Schemas are high-level programming knowledge with detailed knowledge abstracted out, and beacons are statements that imply specific…

  19. Binary weight distributions of some Reed-Solomon codes

    NASA Technical Reports Server (NTRS)

    Pollara, F.; Arnold, S.

    1992-01-01

    The binary weight distributions of the (7,5) and (15,9) Reed-Solomon (RS) codes and their duals are computed using the MacWilliams identities. Several mappings of symbols to bits are considered and those offering the largest binary minimum distance are found. These results are then used to compute bounds on the soft-decoding performance of these codes in the presence of additive Gaussian noise. These bounds are useful for finding large binary block codes with good performance and for verifying the performance obtained by specific soft-coding algorithms presently under development.
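    The MacWilliams identities mentioned above can be demonstrated on a small binary code. The sketch below is illustrative only (the paper's (7,5) and (15,9) RS computations additionally require GF(2^m) symbol-to-bit mappings): it enumerates the [7,4] Hamming code and recovers its dual's weight distribution via the Krawtchouk-polynomial form of the transform:

```python
from itertools import product
from math import comb

# Generator matrix of the [7,4] Hamming code (stand-in for the RS codes)
G = [[1, 0, 0, 0, 1, 1, 0],
     [0, 1, 0, 0, 1, 0, 1],
     [0, 0, 1, 0, 0, 1, 1],
     [0, 0, 0, 1, 1, 1, 1]]

def weight_distribution(G, n):
    """Enumerate all codewords spanned by G and tally Hamming weights."""
    dist = {}
    for msg in product([0, 1], repeat=len(G)):
        w = sum(sum(m * g[j] for m, g in zip(msg, G)) % 2 for j in range(n))
        dist[w] = dist.get(w, 0) + 1
    return dist

def krawtchouk(n, j, i):
    return sum((-1) ** l * comb(i, l) * comb(n - i, j - l) for l in range(j + 1))

def macwilliams_dual(dist, n, size):
    """Weight distribution of the dual code via the MacWilliams transform."""
    dual = {}
    for j in range(n + 1):
        b = sum(a * krawtchouk(n, j, i) for i, a in dist.items()) // size
        if b:
            dual[j] = b
    return dual

A = weight_distribution(G, 7)   # {0: 1, 3: 7, 4: 7, 7: 1}
B = macwilliams_dual(A, 7, 16)  # {0: 1, 4: 7}: the [7,3] simplex code
```

    The same transform, applied to the binary image of an RS code, yields the dual binary weight distributions the abstract refers to.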

  20. Methodology, status and plans for development and assessment of TUF and CATHENA codes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Luxat, J.C.; Liu, W.S.; Leung, R.K.

    1997-07-01

    An overview is presented of the Canadian two-fluid computer codes TUF and CATHENA, with specific focus on the constraints imposed during development of these codes and the areas of application for which they are intended. Additionally, a process for systematic assessment of these codes is described, which is part of a broader, industry-based initiative for validation of computer codes used in all major disciplines of safety analysis. This is intended to provide both the licensee and the regulator in Canada with an objective basis for assessing the adequacy of codes for use in specific applications. Although focused specifically on CANDU reactors, Canadian experience in developing advanced two-fluid codes to meet wide-ranging application needs while maintaining past investment in plant modelling provides a useful contribution to international efforts in this area.

  1. Specification and Prediction of the Radiation Environment Using Data Assimilative VERB code

    NASA Astrophysics Data System (ADS)

    Shprits, Yuri; Kellerman, Adam

    2016-07-01

    We discuss how data assimilation can be used for the reconstruction of long-term evolution, benchmarking of physics-based codes, and improved nowcasting and forecasting of the radiation belts and ring current. We also discuss advanced data assimilation methods such as parameter estimation and smoothing. We present a number of data assimilation applications using the VERB 3D code. The 3D data assimilative VERB allows us to blend together data from GOES, RBSP A, and RBSP B. 1) The model with data assimilation allows us to propagate data to different pitch angles, energies, and L-shells and blend them together with the physics-based VERB code in an optimal way. We illustrate how to use this capability for the analysis of previous events and for obtaining a global and statistical view of the system. 2) Model predictions strongly depend on the initial conditions supplied to the model; the model is only as good as its initial conditions. To produce the best possible initial conditions, data from different sources (GOES, RBSP A and B, and our empirical model predictions based on ACE) are blended together in an optimal way by means of data assimilation, as described above. The resulting initial conditions have no gaps, which allows us to make more accurate predictions. A real-time prediction framework operating on our website, based on GOES, RBSP A and B, and ACE data and the 3D VERB code, is presented and discussed.
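    The blending of model forecasts with multi-satellite observations described above is, at its core, a Kalman-type analysis step. A one-dimensional sketch (illustrative only; the VERB framework assimilates full 3D phase-space densities, and the function below is not from that code):

```python
def kalman_update(forecast, f_var, obs, o_var):
    """One scalar Kalman analysis step: blend a model forecast with an
    observation, weighting each by the inverse of its error variance."""
    gain = f_var / (f_var + o_var)            # trust the obs more when f_var is large
    analysis = forecast + gain * (obs - forecast)
    a_var = (1 - gain) * f_var                # analysis is more certain than either input
    return analysis, a_var

# A trusted observation (small variance) pulls the analysis toward it
state, var = kalman_update(forecast=10.0, f_var=4.0, obs=12.0, o_var=1.0)
# state = 11.6, var = 0.8
```

    Repeating this step as satellites report in is what lets the assimilative model fill gaps between observation points with physically propagated, optimally weighted values.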

  2. PubFinder: a tool for improving retrieval rate of relevant PubMed abstracts.

    PubMed

    Goetz, Thomas; von der Lieth, Claus-Wilhelm

    2005-07-01

    Since it is becoming increasingly laborious to manually extract useful information embedded in the ever-growing volumes of literature, automated intelligent text analysis tools are becoming more and more essential to assist in this task. PubFinder (www.glycosciences.de/tools/PubFinder) is a publicly available web tool designed to improve the retrieval rate of scientific abstracts relevant for a specific scientific topic. Only the selection of a representative set of abstracts is required, which are central for a scientific topic. No special knowledge concerning the query-syntax is necessary. Based on the selected abstracts, a list of discriminating words is automatically calculated, which is subsequently used for scoring all defined PubMed abstracts for their probability of belonging to the defined scientific topic. This results in a hit-list of references in the descending order of their likelihood score. The algorithms and procedures implemented in PubFinder facilitate the perpetual task for every scientist of staying up-to-date with current publications dealing with a specific subject in biomedicine.
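    The PubFinder pipeline (derive discriminating words from a seed set of abstracts, then score candidate abstracts against that word list) can be caricatured in a few lines of Python. The function names and the log-odds weighting are illustrative assumptions, not PubFinder's published algorithm:

```python
import math
from collections import Counter

def discriminating_words(seed_abstracts, background, top_n=5):
    """Rank words by smoothed log-odds of appearing in the seed set versus a
    background corpus (a crude stand-in for PubFinder's word-list step)."""
    seed = Counter(w for a in seed_abstracts for w in a.lower().split())
    back = Counter(w for a in background for w in a.lower().split())
    vocab = set(seed) | set(back)
    s_total, b_total = sum(seed.values()), sum(back.values())
    score = {w: math.log((seed[w] + 1) / (s_total + len(vocab)))
                - math.log((back[w] + 1) / (b_total + len(vocab)))
             for w in vocab}
    return sorted(score, key=score.get, reverse=True)[:top_n]

def rank_abstracts(candidates, words):
    """Score candidates by occurrences of discriminating words, best first."""
    return sorted(candidates,
                  key=lambda a: sum(a.lower().split().count(w) for w in words),
                  reverse=True)

top = discriminating_words(
    ["glycan binding protein", "glycan structure analysis"],
    ["stock market report", "weather forecast today"])
```

    A real system would add stemming, stop-word removal, and a proper probabilistic model, but the shape of the computation is the same: the seed set defines the topic, and every other abstract is scored against it.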

  3. Prioritized LT Codes

    NASA Technical Reports Server (NTRS)

    Woo, Simon S.; Cheng, Michael K.

    2011-01-01

    The original Luby Transform (LT) coding scheme is extended to account for data transmissions where some information symbols in a message block are more important than others. Prioritized LT codes provide unequal error protection (UEP) of data on an erasure channel by modifying the original LT encoder. The prioritized algorithm improves high-priority data protection without penalizing low-priority data recovery. Moreover, low-latency decoding is obtained for high-priority data due to fast encoding. Prioritized LT codes only require a slight change in the original encoding algorithm, and no changes at all at the decoder. Hence, with a small complexity increase in the LT encoder, improved UEP and low decoding latency for high-priority data can be achieved. LT encoding partitions a data stream into fixed-sized message blocks, each with a constant number of information symbols. To generate a code symbol from the information symbols in a message, the Robust Soliton probability distribution is first applied in order to determine the number of information symbols to be used to compute the code symbol. Then, the specific information symbols are chosen uniformly at random from the message block. Finally, the selected information symbols are XORed to form the code symbol. The Prioritized LT code construction includes an additional restriction that code symbols formed by a relatively small number of XORed information symbols select some of these information symbols from the pool of high-priority data. Once high-priority data are fully covered, encoding continues with the conventional LT approach, where code symbols are generated by selecting information symbols from the entire message block, including all different priorities. Therefore, if code symbols derived from high-priority data experience an unusually high number of erasures, Prioritized LT codes can still reliably recover both high- and low-priority data.
This hybrid approach decides not only "how to encode
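    The encoding procedure described in this record can be sketched as follows. This is an illustrative reconstruction under stated assumptions: integer-valued information symbols, and the Ideal Soliton distribution standing in for the Robust Soliton distribution used by real LT codes:

```python
import random

def soliton_degrees(k):
    """Ideal Soliton distribution (a simplified stand-in for Robust Soliton)."""
    weights = [1.0 / k] + [1.0 / (d * (d - 1)) for d in range(2, k + 1)]
    return list(range(1, k + 1)), weights

def prioritized_lt_encode(message, n_high, n_code, degree_cutoff=2, seed=1):
    """Sketch of a Prioritized LT encoder: code symbols of small degree draw
    their neighbours from the high-priority prefix of the message block."""
    rng = random.Random(seed)
    k = len(message)
    degrees, weights = soliton_degrees(k)
    out = []
    for _ in range(n_code):
        d = rng.choices(degrees, weights=weights)[0]
        # Prioritization: low-degree symbols favour high-priority data,
        # so that data is covered early and protected more strongly.
        pool = list(range(n_high)) if d <= degree_cutoff else list(range(k))
        idx = rng.sample(pool, min(d, len(pool)))
        sym = 0
        for i in idx:
            sym ^= message[i]  # XOR the selected information symbols
        out.append((idx, sym))
    return out

symbols = prioritized_lt_encode([3, 1, 4, 1, 5, 9, 2, 6], n_high=3, n_code=20)
```

    The decoder is untouched, as the abstract notes: it peels degree-1 symbols and substitutes back exactly as in conventional LT decoding.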

  4. The commerce of professional psychology and the new ethics code.

    PubMed

    Koocher, G P

    1994-11-01

    The 1992 version of the American Psychological Association's Ethical Principles of Psychologists and Code of Conduct brings some changes in requirements and new specificity to the practice of psychology. The impact of the new code on therapeutic contracts, informed consent to psychological services, advertising, financial aspects of psychological practice, and other topics related to the commerce of professional psychology are discussed. The genesis of many new thrusts in the code is reviewed from the perspective of psychological service provider. Specific recommendations for improved attention to ethical matters in professional practice are made.

  5. Orthographic Coding: Brain Activation for Letters, Symbols, and Digits.

    PubMed

    Carreiras, Manuel; Quiñones, Ileana; Hernández-Cabrera, Juan Andrés; Duñabeitia, Jon Andoni

    2015-12-01

    The present experiment investigates the input coding mechanisms of 3 common printed characters: letters, numbers, and symbols. Despite research in this area, it is yet unclear whether the identity of these 3 elements is processed through the same or different brain pathways. In addition, some computational models propose that the position-in-string coding of these elements responds to general flexible mechanisms of the visual system that are not character-specific, whereas others suggest that the position coding of letters responds to specific processes that are different from those that guide the position-in-string assignment of other types of visual objects. Here, in an fMRI study, we manipulated character position and character identity through the transposition or substitution of 2 internal elements within strings of 4 elements. Participants were presented with 2 consecutive visual strings and asked to decide whether they were the same or different. The results showed: 1) that some brain areas responded more to letters than to numbers and vice versa, suggesting that processing may follow different brain pathways; 2) that the left parietal cortex is involved in letter identity, and critically in letter position coding, specifically contributing to the early stages of the reading process; and that 3) a stimulus-specific mechanism for letter position coding is operating during orthographic processing. © The Author 2014. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.

  6. Extending Automatic Parallelization to Optimize High-Level Abstractions for Multicore

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Liao, C; Quinlan, D J; Willcock, J J

    2008-12-12

    Automatic introduction of OpenMP for sequential applications has attracted significant attention recently because of the proliferation of multicore processors and the simplicity of using OpenMP to express parallelism for shared-memory systems. However, most previous research has only focused on C and Fortran applications operating on primitive data types. C++ applications using high-level abstractions, such as STL containers and complex user-defined types, are largely ignored due to the lack of research compilers that are readily able to recognize high-level object-oriented abstractions and leverage their associated semantics. In this paper, we automatically parallelize C++ applications using ROSE, a multiple-language source-to-source compiler infrastructure which preserves the high-level abstractions and gives us access to their semantics. Several representative parallelization candidate kernels are used to explore semantic-aware parallelization strategies for high-level abstractions, combined with extended compiler analyses. Those kernels include an array-based computation loop, a loop with task-level parallelism, and a domain-specific tree traversal. Our work extends the applicability of automatic parallelization to modern applications using high-level abstractions and exposes more opportunities to take advantage of multicore processors.

  7. Disclosure of terminal illness to patients and families: diversity of governing codes in 14 Islamic countries.

    PubMed

    Abdulhameed, Hunida E; Hammami, Muhammad M; Mohamed, Elbushra A Hameed

    2011-08-01

    The consistency of codes governing disclosure of terminal illness to patients and families in Islamic countries has not been studied until now. To review available codes on disclosure of terminal illness in Islamic countries. DATA SOURCE AND EXTRACTION: Data were extracted through searches on Google and PubMed. Codes related to disclosure of terminal illness to patients or families were abstracted, and then classified independently by the three authors. Codes for 14 Islamic countries were located. Five codes were silent regarding informing the patient, seven allowed concealment, one mandated disclosure and one prohibited disclosure. Five codes were silent regarding informing the family, four allowed disclosure and five mandated/recommended disclosure. The Islamic Organization for Medical Sciences code was silent on both issues. Codes regarding disclosure of terminal illness to patients and families differed markedly among Islamic countries. They were silent in one-third of the codes, and tended to favour a paternalistic/utilitarian, family-centred approach over an autonomous, patient-centred approach.

  8. Paper Abstract Animals

    ERIC Educational Resources Information Center

    Sutley, Jane

    2010-01-01

    Abstraction is, in effect, a simplification and reduction of shapes with an absence of detail designed to comprise the essence of the more naturalistic images being depicted. Without even intending to, young children consistently create interesting, and sometimes beautiful, abstract compositions. A child's creations, moreover, will always seem to…

  9. A Bayesian network coding scheme for annotating biomedical information presented to genetic counseling clients.

    PubMed

    Green, Nancy

    2005-04-01

    We developed a Bayesian network coding scheme for annotating biomedical content in layperson-oriented clinical genetics documents. The coding scheme supports the representation of probabilistic and causal relationships among concepts in this domain, at a high enough level of abstraction to capture commonalities among genetic processes and their relationship to health. We are using the coding scheme to annotate a corpus of genetic counseling patient letters as part of the requirements analysis and knowledge acquisition phase of a natural language generation project. This paper describes the coding scheme and presents an evaluation of intercoder reliability for its tag set. In addition to giving examples of use of the coding scheme for analysis of discourse and linguistic features in this genre, we suggest other uses for it in analysis of layperson-oriented text and dialogue in medical communication.

  10. One Speaker, Two Languages. Cross-Disciplinary Perspectives on Code-Switching.

    ERIC Educational Resources Information Center

    Milroy, Lesley, Ed.; Muysken, Pieter, Ed.

    Fifteen articles review code-switching in the four major areas: policy implications in specific institutional and community settings; perspectives of social theory of code-switching as a form of speech behavior in particular social contexts; the grammatical analysis of code-switching, including factors that constrain switching even within a…

  11. Are nursing codes of practice ethical?

    PubMed

    Pattison, S

    2001-01-01

    This article provides a theoretical critique from a particular 'ideal type' ethical perspective of professional codes in general and the United Kingdom Central Council for Nursing, Midwifery and Health Visiting (UKCC) Code of professional conduct (reprinted on pp. 77-78) in particular. Having outlined a specific 'ideal type' of what ethically informed and aware practice may be, the article examines the extent to which professional codes may be likely to elicit and engender such practice. Because of their terminological inexactitudes and confusions, their arbitrary values and principles, their lack of helpful ethical guidance, and their exclusion of ordinary moral experience, a number of contemporary professional codes in health and social care can be arraigned as ethically inadequate. The UKCC Code of professional conduct embodies many of these flaws, and others besides. Some of its weaknesses in this respect are anatomized before some tentative suggestions are offered for the reform of codes and the engendering of greater ethical awareness among professionals in the light of greater public ethical concerns and values.

  12. The "Wow! signal" of the terrestrial genetic code

    NASA Astrophysics Data System (ADS)

    shCherbak, Vladimir I.; Makukov, Maxim A.

    2013-05-01

    It has been repeatedly proposed to expand the scope for SETI, and one of the suggested alternatives to radio is the biological media. Genomic DNA is already used on Earth to store non-biological information. Though smaller in capacity, but stronger in noise immunity is the genetic code. The code is a flexible mapping between codons and amino acids, and this flexibility allows modifying the code artificially. But once fixed, the code might stay unchanged over cosmological timescales; in fact, it is the most durable construct known. Therefore it represents an exceptionally reliable storage for an intelligent signature, if that conforms to biological and thermodynamic requirements. As the actual scenario for the origin of terrestrial life is far from being settled, the proposal that it might have been seeded intentionally cannot be ruled out. A statistically strong intelligent-like "signal" in the genetic code is then a testable consequence of such scenario. Here we show that the terrestrial code displays a thorough precision-type orderliness matching the criteria to be considered an informational signal. Simple arrangements of the code reveal an ensemble of arithmetical and ideographical patterns of the same symbolic language. Accurate and systematic, these underlying patterns appear as a product of precision logic and nontrivial computing rather than of stochastic processes (the null hypothesis that they are due to chance coupled with presumable evolutionary pathways is rejected with P-value < 10^-13). The patterns are profound to the extent that the code mapping itself is uniquely deduced from their algebraic representation. The signal displays readily recognizable hallmarks of artificiality, among which are the symbol of zero, the privileged decimal syntax and semantical symmetries. Besides, extraction of the signal involves logically straightforward but abstract operations, making the patterns essentially irreducible to any natural origin. Plausible ways of

  13. The Advantages of Abstract Control Knowledge in Expert System Design. Technical Report #7.

    ERIC Educational Resources Information Center

    Clancey, William J.

    This paper argues that an important design principle for building expert systems is to represent all control knowledge abstractly and separately from the domain knowledge upon which it operates. Abstract control knowledge is defined as the specifications of when and how a program is to carry out its operations, such as pursuing a goal, focusing,…

  14. Intent Specifications

    NASA Technical Reports Server (NTRS)

    Leveson, Nancy G.

    1995-01-01

    We have been investigating the implications of using abstractions based on intent rather than the aggregation and information-hiding abstractions commonly used in software engineering. Cognitive psychologists have shown that intent abstraction is consistent with human problem-solving processes. We believe that new types of specifications and designs based on this concept can assist in understanding and specifying requirements, capturing the most important design rationale information in an efficient and economical way, and supporting the process of identifying and analyzing required changes to minimize the introduction of errors. The goal of hierarchical abstraction is to allow both top-down and bottom-up reasoning about a complex system. In computer science, we have made much use of (1) part-whole abstractions where each level of a hierarchy represents an aggregation of the components at a lower level and of (2) information-hiding abstractions where each level contains the same conceptual information but hides some details about the concepts, that is, each level is a refinement of the information at a higher level.

  15. Moral concepts set decision strategies to abstract values.

    PubMed

    Caspers, Svenja; Heim, Stefan; Lucas, Marc G; Stephan, Egon; Fischer, Lorenz; Amunts, Katrin; Zilles, Karl

    2011-04-01

    Persons have different value preferences. Neuroimaging studies where value-based decisions in actual conflict situations were investigated suggest an important role of prefrontal and cingulate brain regions. General preferences, however, reflect a superordinate moral concept independent of actual situations as proposed in psychological and socioeconomic research. Here, the specific brain response would be influenced by abstract value systems and moral concepts. The neurobiological mechanisms underlying such responses are largely unknown. Using functional magnetic resonance imaging (fMRI) with a forced-choice paradigm on word pairs representing abstract values, we show that the brain handles such decisions depending on the person's superordinate moral concept. Persons with a predominant collectivistic (altruistic) value system applied a "balancing and weighing" strategy, recruiting brain regions of rostral inferior and intraparietal, and midcingulate and frontal cortex. Conversely, subjects with mainly individualistic (egocentric) value preferences applied a "fight-and-flight" strategy by recruiting the left amygdala. Finally, if subjects experience a value conflict when rejecting an alternative congruent to their own predominant value preference, comparable brain regions are activated as found in actual moral dilemma situations, i.e., midcingulate and dorsolateral prefrontal cortex. Our results demonstrate that superordinate moral concepts influence the strategy and the neural mechanisms in decision processes, independent of actual situations, showing that decisions are based on general neural principles. These findings provide a novel perspective to future sociological and economic research as well as to the analysis of social relations by focusing on abstract value systems as triggers of specific brain responses.

  16. Moral Concepts Set Decision Strategies to Abstract Values

    PubMed Central

    Caspers, Svenja; Heim, Stefan; Lucas, Marc G.; Stephan, Egon; Fischer, Lorenz; Amunts, Katrin; Zilles, Karl

    2011-01-01

    Persons have different value preferences. Neuroimaging studies where value-based decisions in actual conflict situations were investigated suggest an important role of prefrontal and cingulate brain regions. General preferences, however, reflect a superordinate moral concept independent of actual situations as proposed in psychological and socioeconomic research. Here, the specific brain response would be influenced by abstract value systems and moral concepts. The neurobiological mechanisms underlying such responses are largely unknown. Using functional magnetic resonance imaging (fMRI) with a forced-choice paradigm on word pairs representing abstract values, we show that the brain handles such decisions depending on the person's superordinate moral concept. Persons with a predominant collectivistic (altruistic) value system applied a “balancing and weighing” strategy, recruiting brain regions of rostral inferior and intraparietal, and midcingulate and frontal cortex. Conversely, subjects with mainly individualistic (egocentric) value preferences applied a “fight-and-flight” strategy by recruiting the left amygdala. Finally, if subjects experience a value conflict when rejecting an alternative congruent to their own predominant value preference, comparable brain regions are activated as found in actual moral dilemma situations, i.e., midcingulate and dorsolateral prefrontal cortex. Our results demonstrate that superordinate moral concepts influence the strategy and the neural mechanisms in decision processes, independent of actual situations, showing that decisions are based on general neural principles. These findings provide a novel perspective to future sociological and economic research as well as to the analysis of social relations by focusing on abstract value systems as triggers of specific brain responses. PMID:21483767

  17. ASTRONAUTICS INFORMATION. Abstracts Vol. III, No. 1. Abstracts 3,082- 3,184

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    None

    1961-01-01

    Abstracts are presented on astronautics. The abstracts are generally restricted to spaceflight and to applicable techniques and data. The publication covers the period of January 1961. 102 references. (J.R.D.)

  18. MEMOPS: data modelling and automatic code generation.

    PubMed

    Fogh, Rasmus H; Boucher, Wayne; Ionides, John M C; Vranken, Wim F; Stevens, Tim J; Laue, Ernest D

    2010-03-25

    In recent years the amount of biological data has exploded to the point where much useful information can only be extracted by complex computational analyses. Such analyses are greatly facilitated by metadata standards, both in terms of the ability to compare data originating from different sources, and in terms of exchanging data in standard forms, e.g. when running processes on a distributed computing infrastructure. However, standards thrive on stability whereas science tends to constantly move, with new methods being developed and old ones modified. Therefore maintaining both metadata standards, and all the code that is required to make them useful, is a non-trivial problem. Memops is a framework that uses an abstract definition of the metadata (described in UML) to generate internal data structures and subroutine libraries for data access (application programming interfaces--APIs--currently in Python, C and Java) and data storage (in XML files or databases). For the individual project these libraries obviate the need for writing code for input parsing, validity checking or output. Memops also ensures that the code is always internally consistent, massively reducing the need for code reorganisation. Across a scientific domain a Memops-supported data model makes it easier to support complex standards that can capture all the data produced in a scientific area, share them among all programs in a complex software pipeline, and carry them forward to deposition in an archive. The principles behind the Memops generation code will be presented, along with example applications in Nuclear Magnetic Resonance (NMR) spectroscopy and structural biology.
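The generation principle described above can be illustrated with a deliberately small sketch: a declarative attribute model drives creation of a data-access class whose constructor performs the validity checking, so no per-project parsing or validation code has to be written by hand. The `AttrSpec` model and `generate_class` helper below are hypothetical stand-ins for illustration, not the real UML-driven Memops pipeline.

```python
# Minimal sketch of metadata-driven code generation: a small attribute
# model is the single source of truth, and the generated class enforces it.
from dataclasses import dataclass

@dataclass
class AttrSpec:
    name: str
    type_: type
    required: bool = True

def generate_class(class_name, attrs):
    """Build a class whose __init__ validates each attribute against the model."""
    def __init__(self, **kwargs):
        for spec in attrs:
            if spec.name not in kwargs:
                if spec.required:
                    raise ValueError(f"missing required attribute: {spec.name}")
                continue
            value = kwargs[spec.name]
            if not isinstance(value, spec.type_):
                raise TypeError(f"{spec.name} must be {spec.type_.__name__}")
            setattr(self, spec.name, value)
    return type(class_name, (), {"__init__": __init__, "_model": attrs})

# One metadata definition yields a consistent, validated API class:
Spectrum = generate_class("Spectrum", [
    AttrSpec("name", str),
    AttrSpec("num_dims", int),
])

s = Spectrum(name="hsqc", num_dims=2)
print(s.name, s.num_dims)  # hsqc 2
```

Changing the model automatically changes every generated accessor in step, which is the internal-consistency property the abstract emphasizes.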

  19. Turbine Internal and Film Cooling Modeling For 3D Navier-Stokes Codes

    NASA Technical Reports Server (NTRS)

    DeWitt, Kenneth; Garg, Vijay; Ameri, Ali

    2005-01-01

    The aim of this research project is to make use of NASA Glenn on-site computational facilities in order to develop, validate and apply aerodynamic, heat transfer, and turbine cooling models for use in advanced 3D Navier-Stokes Computational Fluid Dynamics (CFD) codes such as the Glenn-HT code. Specific areas of effort include: Application of the Glenn-HT code to specific configurations made available under Turbine Based Combined Cycle (TBCC), and Ultra Efficient Engine Technology (UEET) projects. Validating the use of a multi-block code for the time accurate computation of the detailed flow and heat transfer of cooled turbine airfoils. The goal of the current research is to improve the predictive ability of the Glenn-HT code. This will enable one to design more efficient turbine components for both aviation and power generation. The models will be tested against specific configurations provided by NASA Glenn.

  20. Software-defined network abstractions and configuration interfaces for building programmable quantum networks

    NASA Astrophysics Data System (ADS)

    Dasari, Venkat R.; Sadlier, Ronald J.; Geerhart, Billy E.; Snow, Nikolai A.; Williams, Brian P.; Humble, Travis S.

    2017-05-01

    Well-defined and stable quantum networks are essential to realize functional quantum communication applications. Quantum networks are complex and must use both quantum and classical channels to support quantum applications like QKD, teleportation, and superdense coding. In particular, the no-cloning theorem prevents the reliable copying of quantum signals such that the quantum and classical channels must be highly coordinated using robust and extensible methods. In this paper, we describe new network abstractions and interfaces for building programmable quantum networks. Our approach leverages new OpenFlow data structures and table type patterns to build programmable quantum networks and to support quantum applications.
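The match/action flow-table idea that such OpenFlow-style abstractions build on can be sketched in a few lines, here applied to coordinating paired classical and quantum channels. The field names (`channel`, `app`) and the action strings are illustrative assumptions, not the authors' actual OpenFlow data structures or table type patterns.

```python
# Toy priority-ordered match/action flow table: the first rule whose match
# fields all equal the packet's fields decides the action.
class FlowTable:
    def __init__(self):
        self.rules = []  # (match_dict, action) in priority order

    def add_rule(self, match, action):
        self.rules.append((match, action))

    def apply(self, packet):
        for match, action in self.rules:
            if all(packet.get(k) == v for k, v in match.items()):
                return action
        return "drop"  # table-miss default

table = FlowTable()
# Keep QKD's classical control traffic and the quantum channel coordinated
# but separately routed, as the no-cloning constraint requires.
table.add_rule({"channel": "classical", "app": "qkd"}, "forward:port1")
table.add_rule({"channel": "quantum"}, "forward:port2")

print(table.apply({"channel": "quantum", "app": "teleportation"}))  # forward:port2
```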

  1. Software-defined network abstractions and configuration interfaces for building programmable quantum networks

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dasari, Venkat; Sadlier, Ronald J.; Geerhart, Billy

    Well-defined and stable quantum networks are essential to realize functional quantum applications. Quantum networks are complex and must use both quantum and classical channels to support quantum applications like QKD, teleportation, and superdense coding. In particular, the no-cloning theorem prevents the reliable copying of quantum signals such that the quantum and classical channels must be highly coordinated using robust and extensible methods. We develop new network abstractions and interfaces for building programmable quantum networks. Our approach leverages new OpenFlow data structures and table type patterns to build programmable quantum networks and to support quantum applications.

  2. Minnowbrook V: 2006 Workshop on Unsteady Flows in Turbomachinery. (Conference Abstracts)

    NASA Technical Reports Server (NTRS)

    LaGraff, John E. (Editor); Ashpis, David E. (Editor); Oldfield, Martin L. G. (Editor); Gostelow, J. Paul (Editor)

    2006-01-01

    This volume contains materials presented at the Minnowbrook V 2006 Workshop on Unsteady Flows in Turbomachinery, held at the Syracuse University Minnowbrook Conference Center, New York, on August 20-23, 2006. The workshop organizers were John E. LaGraff (Syracuse University), Martin L.G. Oldfield (Oxford University), and J. Paul Gostelow (University of Leicester). The workshop followed the theme, venue, and informal format of four earlier workshops: Minnowbrook I (1993), Minnowbrook II (1997), Minnowbrook III (2000), and Minnowbrook IV (2003). The workshop was focused on physical understanding of unsteady flows in turbomachinery, with the specific goal of contributing to engineering application of improving design codes for turbomachinery. The workshop participants included academic researchers from the United States and abroad and representatives from the gas-turbine industry and U.S. Government laboratories. The physical mechanisms discussed were related to unsteady wakes, active flow control, turbulence, bypass and natural transition, separation bubbles and turbulent spots, modeling of turbulence and transition, heat transfer and cooling, surface roughness, unsteady CFD, and DNS. The workshop summary and the plenary discussion transcripts clearly highlight the need for continued vigorous research in the technologically important area of unsteady flows in turbomachines. This volume contains abstracts and copies of select viewgraphs organized according to the workshop sessions. Full-color viewgraphs and animations are included in the CD-ROM version only (Doc.ID 20070024781).

  3. ASTRONAUTICS INFORMATION. ABSTRACTS, VOL. V, NO. 3. Abstracts 5,201- 5,330

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hardgrove, B.J.; Warren, F.L. comps.

    1962-03-01

    Abstracts of astronautics information covering the period March 1962 are presented. The 129 abstracts cover the subject of spaceflight and applicable data and techniques. Author, subject, and source indexes are included. (M.C.G.)

  4. A Deterministic Transport Code for Space Environment Electrons

    NASA Technical Reports Server (NTRS)

    Nealy, John E.; Chang, C. K.; Norman, Ryan B.; Blattnig, Steve R.; Badavi, Francis F.; Adamczyk, Anne M.

    2010-01-01

    A deterministic computational procedure has been developed to describe transport of space environment electrons in various shield media. This code is an upgrade and extension of an earlier electron code. Whereas the former code was formulated on the basis of parametric functions derived from limited laboratory data, the present code utilizes well established theoretical representations to describe the relevant interactions and transport processes. The shield material specification has been made more general, as have the pertinent cross sections. A combined mean free path and average trajectory approach has been used in the transport formalism. Comparisons with Monte Carlo calculations are presented.

  5. Pacifier Overuse and Conceptual Relations of Abstract and Emotional Concepts.

    PubMed

    Barca, Laura; Mazzuca, Claudia; Borghi, Anna M

    2017-01-01

    This study explores the impact of the extensive use of an oral device since infancy (pacifier) on the acquisition of concrete, abstract, and emotional concepts. While recent evidence showed a negative relation between pacifier use and children's emotional competence (Niedenthal et al., 2012), the possible interaction between use of pacifier and processing of emotional and abstract language has not been investigated. According to recent theories, while all concepts are grounded in sensorimotor experience, abstract concepts activate linguistic and social information more than concrete ones. Specifically, the Words As Social Tools (WAT) proposal predicts that the simulation of their meaning leads to an activation of the mouth (Borghi and Binkofski, 2014; Borghi and Zarcone, 2016). Since the pacifier affects facial mimicry forcing mouth muscles into a static position, we hypothesize its possible interference on acquisition/consolidation of abstract emotional and abstract not-emotional concepts, which are mainly conveyed during social and linguistic interactions, than of concrete concepts. Fifty-nine first grade children, with a history of different frequency of pacifier use, provided oral definitions of the meaning of abstract not-emotional, abstract emotional, and concrete words. Main effect of concept type emerged, with higher accuracy in defining concrete and abstract emotional concepts with respect to abstract not-emotional concepts, independently from pacifier use. Accuracy in definitions was not influenced by the use of pacifier, but correspondence and hierarchical clustering analyses suggest that the use of pacifier differently modulates the conceptual relations elicited by abstract emotional and abstract not-emotional. While the majority of the children produced a similar pattern of conceptual relations, analyses on the few (6) children who overused the pacifier (for more than 3 years) showed that they tend to distinguish less clearly between concrete and

  6. Pacifier Overuse and Conceptual Relations of Abstract and Emotional Concepts

    PubMed Central

    Barca, Laura; Mazzuca, Claudia; Borghi, Anna M.

    2017-01-01

    This study explores the impact of the extensive use of an oral device since infancy (pacifier) on the acquisition of concrete, abstract, and emotional concepts. While recent evidence showed a negative relation between pacifier use and children's emotional competence (Niedenthal et al., 2012), the possible interaction between use of pacifier and processing of emotional and abstract language has not been investigated. According to recent theories, while all concepts are grounded in sensorimotor experience, abstract concepts activate linguistic and social information more than concrete ones. Specifically, the Words As Social Tools (WAT) proposal predicts that the simulation of their meaning leads to an activation of the mouth (Borghi and Binkofski, 2014; Borghi and Zarcone, 2016). Since the pacifier affects facial mimicry forcing mouth muscles into a static position, we hypothesize its possible interference on acquisition/consolidation of abstract emotional and abstract not-emotional concepts, which are mainly conveyed during social and linguistic interactions, than of concrete concepts. Fifty-nine first grade children, with a history of different frequency of pacifier use, provided oral definitions of the meaning of abstract not-emotional, abstract emotional, and concrete words. Main effect of concept type emerged, with higher accuracy in defining concrete and abstract emotional concepts with respect to abstract not-emotional concepts, independently from pacifier use. Accuracy in definitions was not influenced by the use of pacifier, but correspondence and hierarchical clustering analyses suggest that the use of pacifier differently modulates the conceptual relations elicited by abstract emotional and abstract not-emotional. While the majority of the children produced a similar pattern of conceptual relations, analyses on the few (6) children who overused the pacifier (for more than 3 years) showed that they tend to distinguish less clearly between concrete and

  7. Check Sample Abstracts.

    PubMed

    Alter, David; Grenache, David G; Bosler, David S; Karcher, Raymond E; Nichols, James; Rajadhyaksha, Aparna; Camelo-Piragua, Sandra; Rauch, Carol; Huddleston, Brent J; Frank, Elizabeth L; Sluss, Patrick M; Lewandrowski, Kent; Eichhorn, John H; Hall, Janet E; Rahman, Saud S; McPherson, Richard A; Kiechle, Frederick L; Hammett-Stabler, Catherine; Pierce, Kristin A; Kloehn, Erica A; Thomas, Patricia A; Walts, Ann E; Madan, Rashna; Schlesinger, Kathie; Nawgiri, Ranjana; Bhutani, Manoop; Kanber, Yonca; Abati, Andrea; Atkins, Kristen A; Farrar, Robert; Gopez, Evelyn Valencerina; Jhala, Darshana; Griffin, Sonya; Jhala, Khushboo; Jhala, Nirag; Bentz, Joel S; Emerson, Lyska; Chadwick, Barbara E; Barroeta, Julieta E; Baloch, Zubair W; Collins, Brian T; Middleton, Owen L; Davis, Gregory G; Haden-Pinneri, Kathryn; Chu, Albert Y; Keylock, Joren B; Ramoso, Robert; Thoene, Cynthia A; Stewart, Donna; Pierce, Arand; Barry, Michelle; Aljinovic, Nika; Gardner, David L; Barry, Michelle; Shields, Lisa B E; Arnold, Jack; Stewart, Donna; Martin, Erica L; Rakow, Rex J; Paddock, Christopher; Zaki, Sherif R; Prahlow, Joseph A; Stewart, Donna; Shields, Lisa B E; Rolf, Cristin M; Falzon, Andrew L; Hudacki, Rachel; Mazzella, Fermina M; Bethel, Melissa; Zarrin-Khameh, Neda; Gresik, M Vicky; Gill, Ryan; Karlon, William; Etzell, Joan; Deftos, Michael; Karlon, William J; Etzell, Joan E; Wang, Endi; Lu, Chuanyi M; Manion, Elizabeth; Rosenthal, Nancy; Wang, Endi; Lu, Chuanyi M; Tang, Patrick; Petric, Martin; Schade, Andrew E; Hall, Geraldine S; Oethinger, Margret; Hall, Geraldine; Picton, Avis R; Hoang, Linda; Imperial, Miguel Ranoa; Kibsey, Pamela; Waites, Ken; Duffy, Lynn; Hall, Geraldine S; Salangsang, Jo-Anne M; Bravo, Lulette Tricia C; Oethinger, Margaret D; Veras, Emanuela; Silva, Elvia; Vicens, Jimena; Silva, Elvio; Keylock, Joren; Hempel, James; Rushing, Elizabeth; Posligua, Lorena E; Deavers, Michael T; Nash, Jason W; Basturk, Olca; Perle, Mary Ann; Greco, Alba; Lee, Peng; Maru, Dipen; 
Weydert, Jamie Allen; Stevens, Todd M; Brownlee, Noel A; Kemper, April E; Williams, H James; Oliverio, Brock J; Al-Agha, Osama M; Eskue, Kyle L; Newlands, Shawn D; Eltorky, Mahmoud A; Puri, Puja K; Royer, Michael C; Rush, Walter L; Tavora, Fabio; Galvin, Jeffrey R; Franks, Teri J; Carter, James Elliot; Kahn, Andrea Graciela; Lozada Muñoz, Luis R; Houghton, Dan; Land, Kevin J; Nester, Theresa; Gildea, Jacob; Lefkowitz, Jerry; Lacount, Rachel A; Thompson, Hannis W; Refaai, Majed A; Quillen, Karen; Lopez, Ana Ortega; Goldfinger, Dennis; Muram, Talia; Thompson, Hannis

    2009-02-01

    The following abstracts are compiled from Check Sample exercises published in 2008. These peer-reviewed case studies assist laboratory professionals with continuing medical education and are developed in the areas of clinical chemistry, cytopathology, forensic pathology, hematology, microbiology, surgical pathology, and transfusion medicine. Abstracts for all exercises published in the program will appear annually in AJCP.

  8. Phase II Evaluation of Clinical Coding Schemes

    PubMed Central

    Campbell, James R.; Carpenter, Paul; Sneiderman, Charles; Cohn, Simon; Chute, Christopher G.; Warren, Judith

    1997-01-01

    Objective: To compare three potential sources of controlled clinical terminology (READ codes version 3.1, SNOMED International, and Unified Medical Language System (UMLS) version 1.6) relative to attributes of completeness, clinical taxonomy, administrative mapping, term definitions and clarity (duplicate coding rate). Methods: The authors assembled 1929 source concept records from a variety of clinical information taken from four medical centers across the United States. The source data included medical as well as ample nursing terminology. The source records were coded in each scheme by an investigator and checked by the coding scheme owner. The codings were then scored by an independent panel of clinicians for acceptability. Codes were checked for definitions provided with the scheme. Codes for a random sample of source records were analyzed by an investigator for “parent” and “child” codes within the scheme. Parent and child pairs were scored by an independent panel of medical informatics specialists for clinical acceptability. Administrative and billing code mapping from the published scheme were reviewed for all coded records and analyzed by independent reviewers for accuracy. The investigator for each scheme exhaustively searched a sample of coded records for duplications. Results: SNOMED was judged to be significantly more complete in coding the source material than the other schemes (SNOMED* 70%; READ 57%; UMLS 50%; *p < .00001). SNOMED also had a richer clinical taxonomy judged by the number of acceptable first-degree relatives per coded concept (SNOMED* 4.56; UMLS 3.17; READ 2.14, *p < .005). Only the UMLS provided any definitions; these were found for 49% of records which had a coding assignment. READ and UMLS had better administrative mappings (composite score: READ* 40.6%; UMLS* 36.1%; SNOMED 20.7%, *p < .00001), and SNOMED had substantially more duplications of coding assignments (duplication rate: READ 0%; UMLS 4.2%; SNOMED* 13.9%, *p

  9. The Role of Code-Switching in Bilingual Creativity

    ERIC Educational Resources Information Center

    Kharkhurin, Anatoliy V.; Wei, Li

    2015-01-01

    This study further explores the theme of bilingual creativity with the present focus on code-switching. Specifically, it investigates whether code-switching practice has an impact on creativity. In line with the previous research, selective attention was proposed as a potential cognitive mechanism, which on the one hand would benefit from…

  10. Connection anonymity analysis in coded-WDM PONs

    NASA Astrophysics Data System (ADS)

    Sue, Chuan-Ching

    2008-04-01

    A coded wavelength division multiplexing passive optical network (WDM PON) is presented for fiber to the home (FTTH) systems to protect against eavesdropping. The proposed scheme applies spectral amplitude coding (SAC) with a unipolar maximal-length sequence (M-sequence) code matrix to generate a specific signature address (coding) and to retrieve its matching address codeword (decoding) by exploiting the cyclic properties inherent in array waveguide grating (AWG) routers. In addition to ensuring the confidentiality of user data, the proposed coded-WDM scheme is also a suitable candidate for the physical layer with connection anonymity. Under the assumption that the eavesdropper applies a photo-detection strategy, it is shown that the coded WDM PON outperforms the conventional TDM PON and WDM PON schemes in terms of a higher degree of connection anonymity. Additionally, the proposed scheme allows the system operator to partition the optical network units (ONUs) into appropriate groups so as to achieve a better degree of anonymity.
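The coding/decoding property the scheme relies on can be reproduced with a toy example: the code matrix is the set of cyclic shifts of a unipolar M-sequence, and balanced detection (correlating the received spectrum with a codeword and with its complement) yields a nonzero output only for the matching shift. The length-7 sequence below is a textbook M-sequence used for illustration; the AWG router hardware and optical noise are not modeled.

```python
# Spectral-amplitude-coding sketch with a unipolar M-sequence code matrix.
def cyclic_shifts(seq):
    """Rows of the code matrix: all cyclic shifts of the base M-sequence."""
    n = len(seq)
    return [seq[-k:] + seq[:-k] for k in range(n)]

def correlate(a, b):
    return sum(x * y for x, y in zip(a, b))

def balanced_decode(received, code):
    """Correlate with the code and its complement; nonzero only on a match."""
    comp = [1 - c for c in code]
    return correlate(received, code) - correlate(received, comp)

m_seq = [1, 1, 1, 0, 1, 0, 0]   # length-7 M-sequence, weight 4
codes = cyclic_shifts(m_seq)

# The intended receiver's decoder produces a strong output ...
print(balanced_decode(codes[0], codes[0]))  # 4
# ... while every other user's codeword cancels to zero.
print(balanced_decode(codes[3], codes[0]))  # 0
```

The cancellation follows from the M-sequence correlation property: any two distinct cyclic shifts overlap in exactly (N+1)/4 = 2 positions, which the complementary branch subtracts away.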

  11. [Long non-coding RNAs in the pathophysiology of atherosclerosis].

    PubMed

    Novak, Jan; Vašků, Julie Bienertová; Souček, Miroslav

    2018-01-01

    The human genome contains about 22 000 protein-coding genes that are transcribed to an even larger number of messenger RNAs (mRNA). Interestingly, the results of the ENCODE project from 2012 show that, despite up to 90 % of our genome being actively transcribed, protein-coding mRNAs make up only 2-3 % of the total amount of transcribed RNA. The remaining RNA transcripts are not translated to proteins, which is why they are referred to as "non-coding RNAs". Earlier, non-coding RNA was considered "the dark matter of the genome", or "junk" that has accumulated in our DNA over the course of evolution. Today we know that non-coding RNAs fulfil a variety of regulatory functions in our body - they intervene in epigenetic processes from chromatin remodelling to histone methylation, in transcription itself, and even in post-transcriptional processes. Long non-coding RNAs (lncRNA) are the class of non-coding RNAs more than 200 nucleotides in length (non-coding RNAs shorter than 200 nucleotides are called small non-coding RNAs). lncRNAs represent a large and widely varied group of molecules with diverse regulatory functions. They can be identified in virtually all cell types and tissues, and even in the extracellular space, including blood and specifically plasma. Their levels change during the course of organogenesis, they are specific to different tissues, and their changes also accompany the development of various illnesses, including atherosclerosis. This review article presents lncRNA biology in general and then focuses on specific representatives relevant to the process of atherosclerosis (i.e. we describe lncRNA involvement in the biology of endothelial cells, vascular smooth muscle cells and immune cells), and we further describe the possible clinical potential of lncRNAs, whether in the diagnostics or therapy of atherosclerosis and its clinical manifestations.

  12. Refactoring the Genetic Code for Increased Evolvability

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pines, Gur; Winkler, James D.; Pines, Assaf

    ABSTRACT The standard genetic code is robust to mutations during transcription and translation. Point mutations are likely to be synonymous or to preserve the chemical properties of the original amino acid. Saturation mutagenesis experiments suggest that in some cases the best-performing mutant requires replacement of more than a single nucleotide within a codon. These replacements are essentially inaccessible to common error-based laboratory engineering techniques that alter a single nucleotide per mutation event, due to the extreme rarity of adjacent mutations. In this theoretical study, we suggest a radical reordering of the genetic code that maximizes the mutagenic potential of single nucleotide replacements. We explore several possible genetic codes that allow a greater degree of accessibility to the mutational landscape and may result in a hyperevolvable organism that could serve as an ideal platform for directed evolution experiments. We then conclude by evaluating the challenges of constructing such recoded organisms and their potential applications within the field of synthetic biology. IMPORTANCE The conservative nature of the genetic code prevents bioengineers from efficiently accessing the full mutational landscape of a gene via common error-prone methods. Here, we present two computational approaches to generate alternative genetic codes with increased accessibility. These new codes allow mutational transitions to a larger pool of amino acids and with a greater extent of chemical differences, based on a single nucleotide replacement within the codon, thus increasing evolvability both at the single-gene and at the genome levels. Given the widespread use of these techniques for strain and protein improvement, along with more fundamental evolutionary biology questions, the use of recoded organisms that maximize evolvability should significantly improve the efficiency of directed evolution, library generation, and fitness maximization.
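
    To see how small the single-replacement neighborhood is under the standard code, the following sketch (ours, not the authors' software) enumerates the amino acids reachable from a codon by single-nucleotide substitutions; the codon table is the standard one, and the function names are illustrative.

```python
# Sketch: enumerate amino acids reachable from a codon by single-nucleotide
# replacements under the standard genetic code.
BASES = "TCAG"
# Standard codon table in TCAG order (first base varies slowest).
AA = "FFLLSSSSYY**CC*WLLLLPPPPHHQQRRRRIIIMTTTTNNKKSSRRVVVVAAAADDEEGGGG"
CODON_TABLE = {
    b1 + b2 + b3: AA[16 * i + 4 * j + k]
    for i, b1 in enumerate(BASES)
    for j, b2 in enumerate(BASES)
    for k, b3 in enumerate(BASES)
}

def reachable(codon):
    """Distinct products ('*' = stop) encoded by the 9 single-substitution
    neighbors of `codon`."""
    out = set()
    for pos in range(3):
        for b in BASES:
            if b != codon[pos]:
                out.add(CODON_TABLE[codon[:pos] + b + codon[pos + 1:]])
    return out

# Lysine (AAA): only a handful of the 20 amino acids are one substitution away.
print(sorted(reachable("AAA")))
```

    The standard code's conservatism shows up directly: most of the 20 amino acids are simply not in any codon's single-substitution neighborhood, which is the accessibility limit the recoding approaches above try to relax.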

  13. Refactoring the Genetic Code for Increased Evolvability

    DOE PAGES

    Pines, Gur; Winkler, James D.; Pines, Assaf; ...

    2017-11-14

    ABSTRACT The standard genetic code is robust to mutations during transcription and translation. Point mutations are likely to be synonymous or to preserve the chemical properties of the original amino acid. Saturation mutagenesis experiments suggest that in some cases the best-performing mutant requires replacement of more than a single nucleotide within a codon. These replacements are essentially inaccessible to common error-based laboratory engineering techniques that alter a single nucleotide per mutation event, due to the extreme rarity of adjacent mutations. In this theoretical study, we suggest a radical reordering of the genetic code that maximizes the mutagenic potential of single nucleotide replacements. We explore several possible genetic codes that allow a greater degree of accessibility to the mutational landscape and may result in a hyperevolvable organism that could serve as an ideal platform for directed evolution experiments. We then conclude by evaluating the challenges of constructing such recoded organisms and their potential applications within the field of synthetic biology. IMPORTANCE The conservative nature of the genetic code prevents bioengineers from efficiently accessing the full mutational landscape of a gene via common error-prone methods. Here, we present two computational approaches to generate alternative genetic codes with increased accessibility. These new codes allow mutational transitions to a larger pool of amino acids and with a greater extent of chemical differences, based on a single nucleotide replacement within the codon, thus increasing evolvability both at the single-gene and at the genome levels. Given the widespread use of these techniques for strain and protein improvement, along with more fundamental evolutionary biology questions, the use of recoded organisms that maximize evolvability should significantly improve the efficiency of directed evolution, library generation, and fitness maximization.

  14. Quantized phase coding and connected region labeling for absolute phase retrieval.

    PubMed

    Chen, Xiangcheng; Wang, Yuwei; Wang, Yajun; Ma, Mengchao; Zeng, Chunnian

    2016-12-12

    This paper proposes an absolute phase retrieval method for complex object measurement based on quantized phase coding and connected region labeling. A specific code sequence is embedded into the quantized phase of three coded fringes. Connected regions of different codes are labeled and assigned 3-digit codes combining the current period and its neighbors. Wrapped phase spanning more than 36 periods can be restored with reference to the code sequence. Experimental results verify the capability of the proposed method to measure multiple isolated objects.
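
    The connected-region-labeling step can be sketched with a standard flood-fill pass; the grid of integers below stands in for a map of quantized phase codes (this is an illustrative toy, not the authors' implementation).

```python
from collections import deque

def label_regions(grid):
    """Label 4-connected regions of equal code value in a 2-D grid.
    Returns (labels, count); labels[r][c] is the region id, starting at 0."""
    rows, cols = len(grid), len(grid[0])
    labels = [[-1] * cols for _ in range(rows)]
    count = 0
    for r in range(rows):
        for c in range(cols):
            if labels[r][c] != -1:
                continue
            # Breadth-first flood fill over cells sharing this code value.
            q = deque([(r, c)])
            labels[r][c] = count
            while q:
                y, x = q.popleft()
                for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                    ny, nx = y + dy, x + dx
                    if (0 <= ny < rows and 0 <= nx < cols
                            and labels[ny][nx] == -1
                            and grid[ny][nx] == grid[y][x]):
                        labels[ny][nx] = count
                        q.append((ny, nx))
            count += 1
    return labels, count

labels, n = label_regions([[1, 1, 2],
                           [1, 2, 2],
                           [3, 3, 3]])
print(n)  # 3 distinct code regions
```

    Once regions are labeled, each one can be assigned the 3-digit code the paper describes by looking up the codes of its neighboring regions.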

  15. Modelling abstraction licensing strategies ahead of the UK's water abstraction licensing reform

    NASA Astrophysics Data System (ADS)

    Klaar, M. J.

    2012-12-01

    Within England and Wales, river water abstractions are licensed and regulated by the Environment Agency (EA), which uses compliance with the Environmental Flow Indicator (EFI) to ascertain where abstraction may cause undesirable effects on river habitats and species. The EFI is a percentage deviation from natural flow represented using a flow duration curve. The allowable percentage deviation changes with different flows, and also changes depending on an assessment of the sensitivity of the river to changes in flow (Table 1). Within UK abstraction licensing, resource availability is expressed as a surplus or deficit of water resources in relation to the EFI, and utilises the concept of 'hands-off-flows' (HOFs) at the specified flow statistics detailed in Table 1. Use of a HOF system enables abstraction to cease at set flows, but also enables abstraction to occur at periods of time when more water is available. Compliance at low flows (Q95) is used by the EA to determine the hydrological classification and compliance with the Water Framework Directive (WFD) for identifying waterbodies where flow may be causing or contributing to a failure in good ecological status (GES; Table 2). This compliance assessment shows where the scenario flows are below the EFI and by how much, to help target measures for further investigation and assessment. Currently, the EA is reviewing the EFI methodology in order to assess whether or not it can be used within the reformed water abstraction licensing system which is being planned by the Department for Environment, Food and Rural Affairs (DEFRA) to ensure the licensing system is resilient to the challenges of climate change and population growth, while allowing abstractors to meet their water needs efficiently, and better protect the environment. In order to assess the robustness of the EFI, a simple model has been created which allows a number of abstraction, flow and licensing scenarios to be run to determine WFD compliance using the
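
    The flow-duration-curve machinery behind the EFI can be sketched briefly. The snippet below is illustrative only: the allowable deviation percentages actually come from the EA's sensitivity bandings in Table 1 (not reproduced here), so the 10% figure used is a placeholder assumption, and the function names are ours.

```python
def exceedance_flow(flows, p):
    """Flow equalled or exceeded p% of the time, read off the flow duration
    curve (e.g. p=95 gives Q95, the low-flow statistic used for WFD checks)."""
    s = sorted(flows)  # ascending
    idx = max(0, int(len(s) * (100 - p) / 100) - 1)
    return s[idx]

def efi_compliant(natural_flows, scenario_flows, p=95, allowed_deviation=0.10):
    """True if the abstraction scenario keeps Qp within the allowed
    percentage deviation from the natural Qp (placeholder deviation)."""
    q_nat = exceedance_flow(natural_flows, p)
    q_scen = exceedance_flow(scenario_flows, p)
    return q_scen >= q_nat * (1 - allowed_deviation)

natural = list(range(1, 101))            # synthetic daily flows, m^3/s
abstracted = [f - 0.3 for f in natural]  # constant 0.3 m^3/s abstraction
print(exceedance_flow(natural, 95), efi_compliant(natural, abstracted))
```

    A HOF condition would simply set the abstraction to zero whenever the natural flow drops below the hands-off threshold before the compliance check is applied.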

  16. A Code of Ethics for All Adult Educators?

    ERIC Educational Resources Information Center

    Wood, George S., Jr.

    1996-01-01

    Offers a code of ethics for adult educators, outlining ethical responsibilities to society, to learners, to the sponsoring organization and other stakeholders, and to the profession. Stresses that this is more of a framework than a code, and adult educators can use it to reflect systematically upon the specifics of practice. (SK)

  17. Trellis phase codes for power-bandwidth efficient satellite communications

    NASA Technical Reports Server (NTRS)

    Wilson, S. G.; Highfill, J. H.; Hsu, C. D.; Harkness, R.

    1981-01-01

    Support work on improved power and spectrum utilization on digital satellite channels was performed. Specific attention is given to the class of signalling schemes known as continuous phase modulation (CPM). The specific work described in this report addresses: analytical bounds on error probability for multi-h phase codes, power and bandwidth characterization of 4-ary multi-h codes, and initial results of channel simulation to assess the impact of band limiting filters and nonlinear amplifiers on CPM performance.
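
    The phase-accumulation idea behind CPM can be made concrete with a toy sketch (ours, not code from the report): in a full-response multi-h scheme the carrier phase advances by pi times the current modulation index times the data symbol, with the index cycling through a fixed list. The symbol and index values below are illustrative.

```python
import math

def cpm_phase_states(symbols, h_list):
    """Accumulated carrier phase (radians, mod 2*pi) after each symbol of a
    full-response multi-h CPM signal: phase += pi * h_k * a_n, with the
    modulation index h_k cycling through h_list. Toy sketch; the report's
    4-ary multi-h codes use symbols in {+-1, +-3}."""
    phase, states = 0.0, []
    for n, a in enumerate(symbols):
        phase += math.pi * h_list[n % len(h_list)] * a
        states.append(phase % (2 * math.pi))
    return states

# Binary CPFSK (single h = 1/2) as the simplest special case:
print(cpm_phase_states([1, 1, -1], [0.5]))
```

    Cycling through several indices (e.g. h_list = [4/16, 5/16]) enlarges the phase trellis, which is what gives multi-h codes their coding gain over single-h CPFSK.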

  18. Gene and genon concept: coding versus regulation

    PubMed Central

    2007-01-01

    We analyse here the definition of the gene in order to distinguish, on the basis of modern insight in molecular biology, what the gene is coding for, namely a specific polypeptide, and how its expression is realized and controlled. Before the coding role of the DNA was discovered, a gene was identified with a specific phenotypic trait, from Mendel through Morgan up to Benzer. Subsequently, however, molecular biologists ventured to define a gene at the level of the DNA sequence in terms of coding. As is becoming ever more evident, the relations between information stored at DNA level and functional products are very intricate, and the regulatory aspects are as important and essential as the information coding for products. This approach led, thus, to a conceptual hybrid that confused coding, regulation and functional aspects. In this essay, we develop a definition of the gene that once again starts from the functional aspect. A cellular function can be represented by a polypeptide or an RNA. In the case of the polypeptide, its biochemical identity is determined by the mRNA prior to translation, and that is where we locate the gene. The steps from specific, but possibly separated sequence fragments at DNA level to that final mRNA then can be analysed in terms of regulation. For that purpose, we coin the new term “genon”. In that manner, we can clearly separate product and regulative information while keeping the fundamental relation between coding and function without the need to introduce a conceptual hybrid. In mRNA, the program regulating the expression of a gene is superimposed onto and added to the coding sequence in cis - we call it the genon. The complementary external control of a given mRNA by trans-acting factors is incorporated in its transgenon. A consequence of this definition is that, in eukaryotes, the gene is, in most cases, not yet present at DNA level. Rather, it is assembled by RNA processing, including differential splicing, from various

  19. Abstracts

    ERIC Educational Resources Information Center

    American Biology Teacher, 1976

    1976-01-01

    Presents abstracts of 63 papers to be presented at the 1976 Convention of the National Association of Biology Teachers, October 14-17, 1976, Denver, Colorado. Papers cover a wide range of biology and science education topics with the majority concentrating upon the convention's main program, "Ecosystems: 1776-1976-?". (SL)

  20. Review of codes, standards, and regulations for natural gas locomotives.

    DOT National Transportation Integrated Search

    2014-06-01

    This report identified, collected, and summarized relevant international codes, standards, and regulations with potential : applicability to the use of natural gas as a locomotive fuel. Few international or country-specific codes, standards, and regu...

  1. Concatenated Coding Using Trellis-Coded Modulation

    NASA Technical Reports Server (NTRS)

    Thompson, Michael W.

    1997-01-01

    In the late seventies and early eighties a technique known as Trellis Coded Modulation (TCM) was developed for providing spectrally efficient error correction coding. Instead of adding redundant information in the form of parity bits, redundancy is added at the modulation stage, thereby increasing bandwidth efficiency. A digital communications system can be designed to use bandwidth-efficient multilevel/phase modulation such as Amplitude Shift Keying (ASK), Phase Shift Keying (PSK), Differential Phase Shift Keying (DPSK) or Quadrature Amplitude Modulation (QAM). Performance gain can be achieved by increasing the number of signals over the corresponding uncoded system to compensate for the redundancy introduced by the code. A considerable amount of research and development has been devoted to developing good TCM codes for severely bandlimited applications. More recently, the use of TCM for satellite and deep space communications applications has received increased attention. This report describes the general approach of using a concatenated coding scheme that features TCM and RS coding. Results have indicated that substantial (6-10 dB) performance gains can be achieved with this approach with comparatively little bandwidth expansion. Since all of the bandwidth expansion is due to the RS code, we see that TCM-based concatenated coding results in roughly 10-50% bandwidth expansion, compared to 70-150% expansion for similar concatenated schemes which use convolutional codes. We stress that combined coding and modulation optimization is important for achieving performance gains while maintaining spectral efficiency.
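
    The expansion figures quoted above follow directly from code rate, since a rate-R code sends 1/R coded symbols per information symbol. A back-of-envelope check (the specific RS(255,223) parameters are an illustrative assumption, not taken from the report):

```python
def bandwidth_expansion(code_rate):
    """Fractional bandwidth expansion from adding redundancy at rate R:
    1/R symbols transmitted per information symbol."""
    return 1.0 / code_rate - 1.0

rs_rate = 223 / 255  # a common Reed-Solomon outer code, ~87% efficient
conv_rate = 1 / 2    # typical convolutional inner code

# RS alone lands in the quoted 10-50% range; a rate-1/2 convolutional
# inner code alone already costs 100%, consistent with the 70-150% figure.
print(f"RS(255,223): {bandwidth_expansion(rs_rate):.0%}")
print(f"Rate-1/2 CC: {bandwidth_expansion(conv_rate):.0%}")
```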

  2. Java Source Code Analysis for API Migration to Embedded Systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Winter, Victor; McCoy, James A.; Guerrero, Jonathan

    Embedded systems form an integral part of our technological infrastructure and oftentimes play a complex and critical role within larger systems. From the perspective of reliability, security, and safety, strong arguments can be made favoring the use of Java over C in such systems. In part, this argument is based on the assumption that suitable subsets of Java’s APIs and extension libraries are available to embedded software developers. In practice, a number of Java-based embedded processors do not support the full features of the JVM. For such processors, source code migration is a mechanism by which key abstractions offered by APIs and extension libraries can be made available to embedded software developers. The analysis required for Java source code-level library migration is based on the ability to correctly resolve element references to their corresponding element declarations. A key challenge in this setting is how to perform analysis for incomplete source-code bases (e.g., subsets of libraries) from which types and packages have been omitted. This article formalizes an approach that can be used to extend code bases targeted for migration in such a manner that the threats associated with the analysis of incomplete code bases are eliminated.

  3. Reported estimates of diagnostic accuracy in ophthalmology conference abstracts were not associated with full-text publication

    PubMed Central

    Korevaar, Daniël A.; Cohen, Jérémie F.; Spijker, René; Saldanha, Ian J.; Dickersin, Kay; Virgili, Gianni; Hooft, Lotty; Bossuyt, Patrick M.M.

    2016-01-01

    Objective To assess whether conference abstracts that report higher estimates of diagnostic accuracy are more likely to reach full-text publication in a peer-reviewed journal. Study Design and Setting We identified abstracts describing diagnostic accuracy studies, presented between 2007 and 2010 at the Association for Research in Vision and Ophthalmology (ARVO) Annual Meeting. We extracted reported estimates of sensitivity, specificity, area under the receiver operating characteristic curve (AUC), and diagnostic odds ratio (DOR). Between May and July 2015, we searched MEDLINE and EMBASE to identify corresponding full-text publications; if needed, we contacted abstract authors. Cox regression was performed to estimate associations with full-text publication, where sensitivity, specificity, and AUC were logit transformed, and DOR was log transformed. Results A full-text publication was found for 226/399 (57%) included abstracts. There was no association between reported estimates of sensitivity and full-text publication (hazard ratio [HR] 1.09 [95% confidence interval {CI} 0.98, 1.22]). The same applied to specificity (HR 1.00 [95% CI 0.88, 1.14]), AUC (HR 0.91 [95% CI 0.75, 1.09]), and DOR (HR 1.01 [95% CI 0.94, 1.09]). Conclusion Almost half of the ARVO conference abstracts describing diagnostic accuracy studies did not reach full-text publication. Studies in abstracts that mentioned higher accuracy estimates were not more likely to be reported in a full-text publication. PMID:27312228
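
    The transforms applied before the Cox regressions can be stated compactly: proportions (sensitivity, specificity, AUC) were logit transformed and the DOR log transformed, so each hazard ratio is per unit increase on the transformed scale. A minimal sketch (the example values are invented):

```python
import math

def logit(p):
    """log-odds transform for a proportion in (0, 1), as applied to the
    reported sensitivity, specificity, and AUC estimates."""
    return math.log(p / (1.0 - p))

# Invented sensitivities from three hypothetical abstracts:
sens_reported = [0.70, 0.85, 0.95]
print([round(logit(p), 3) for p in sens_reported])
# The reported HR of 1.09 would apply per one-unit increase in logit(sensitivity).
```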

  4. ICC-CLASS: isotopically-coded cleavable crosslinking analysis software suite

    PubMed Central

    2010-01-01

    Background Successful application of crosslinking combined with mass spectrometry for studying proteins and protein complexes requires specifically-designed crosslinking reagents, experimental techniques, and data analysis software. Using isotopically-coded ("heavy and light") versions of the crosslinker and cleavable crosslinking reagents is analytically advantageous for mass spectrometric applications and provides a "handle" that can be used to distinguish crosslinked peptides of different types, and to increase the confidence of the identification of the crosslinks. Results Here, we describe a program suite designed for the analysis of mass spectrometric data obtained with isotopically-coded cleavable crosslinkers. The suite contains three programs called: DX, DXDX, and DXMSMS. DX searches the mass spectra for the presence of ion signal doublets resulting from the light and heavy isotopic forms of the isotopically-coded crosslinking reagent used. DXDX searches for possible mass matches between cleaved and uncleaved isotopically-coded crosslinks based on the established chemistry of the cleavage reaction for a given crosslinking reagent. DXMSMS assigns the crosslinks to the known protein sequences, based on the isotopically-coded and un-coded MS/MS fragmentation data of uncleaved and cleaved peptide crosslinks. Conclusion The combination of these three programs, which are tailored to the analytical features of the specific isotopically-coded cleavable crosslinking reagents used, represents a powerful software tool for automated high-accuracy peptide crosslink identification. See: http://www.creativemolecules.com/CM_Software.htm PMID:20109223
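
    The doublet-search idea behind DX can be illustrated with a toy peak-list scan: find pairs of signals separated by the light/heavy isotopic mass shift of the crosslinker. The mass shift and tolerance below are placeholders, not parameters of any real reagent, and the function name is ours.

```python
def find_doublets(mzs, delta, tol=0.01):
    """Return (light, heavy) m/z pairs whose spacing matches the isotopic
    mass shift `delta` within `tol` (singly charged ions assumed)."""
    peaks = sorted(mzs)
    return [(a, b)
            for i, a in enumerate(peaks)
            for b in peaks[i + 1:]
            if abs((b - a) - delta) <= tol]

# Synthetic spectrum with two light/heavy candidate doublets:
spectrum = [500.00, 512.00, 600.00, 611.99, 700.00]
print(find_doublets(spectrum, delta=12.0))
```

    A real implementation would also match the relative intensities of the two isotopic forms and handle multiple charge states, which is where the confidence gain from isotopic coding comes from.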

  5. Content Coding of Psychotherapy Transcripts Using Labeled Topic Models.

    PubMed

    Gaut, Garren; Steyvers, Mark; Imel, Zac E; Atkins, David C; Smyth, Padhraic

    2017-03-01

    Psychotherapy represents a broad class of medical interventions received by millions of patients each year. Unlike most medical treatments, its primary mechanisms are linguistic; i.e., the treatment relies directly on a conversation between a patient and provider. However, the evaluation of patient-provider conversation suffers from critical shortcomings, including intensive labor requirements, coder error, nonstandardized coding systems, and inability to scale up to larger data sets. To overcome these shortcomings, psychotherapy analysis needs a reliable and scalable method for summarizing the content of treatment encounters. We used a publicly available psychotherapy corpus from Alexander Street press comprising a large collection of transcripts of patient-provider conversations to compare coding performance for two machine learning methods. We used the labeled latent Dirichlet allocation (L-LDA) model to learn associations between text and codes, to predict codes in psychotherapy sessions, and to localize specific passages of within-session text representative of a session code. We compared the L-LDA model to a baseline lasso regression model using predictive accuracy and model generalizability (measured by calculating the area under the curve (AUC) from the receiver operating characteristic curve). The L-LDA model outperforms the lasso logistic regression model at predicting session-level codes with average AUC scores of 0.79, and 0.70, respectively. For fine-grained level coding, L-LDA and logistic regression are able to identify specific talk-turns representative of symptom codes. However, model performance for talk-turn identification is not yet as reliable as human coders. We conclude that the L-LDA model has the potential to be an objective, scalable method for accurate automated coding of psychotherapy sessions that performs better than comparable discriminative methods at session-level coding and can also predict fine-grained codes.
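
    AUC, the comparison metric used above, has a direct rank interpretation: the probability that a randomly chosen positive instance is scored above a randomly chosen negative one (ties counting one half). A minimal sketch, not the study's code:

```python
def auc(pos_scores, neg_scores):
    """Rank-based AUC: fraction of (positive, negative) pairs where the
    positive instance is scored higher, counting ties as 0.5."""
    wins = sum((p > n) + 0.5 * (p == n)
               for p in pos_scores for n in neg_scores)
    return wins / (len(pos_scores) * len(neg_scores))

# e.g. predicted session-code scores vs. ground-truth labels (invented values):
print(auc([0.9, 0.8, 0.6], [0.7, 0.3, 0.2]))
```

    An AUC of 0.5 corresponds to chance-level ranking, so the reported 0.79 vs. 0.70 gap measures how much better L-LDA separates sessions carrying a code from those that do not.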

  6. Content Coding of Psychotherapy Transcripts Using Labeled Topic Models

    PubMed Central

    Gaut, Garren; Steyvers, Mark; Imel, Zac E; Atkins, David C; Smyth, Padhraic

    2016-01-01

    Psychotherapy represents a broad class of medical interventions received by millions of patients each year. Unlike most medical treatments, its primary mechanisms are linguistic; i.e., the treatment relies directly on a conversation between a patient and provider. However, the evaluation of patient-provider conversation suffers from critical shortcomings, including intensive labor requirements, coder error, non-standardized coding systems, and inability to scale up to larger data sets. To overcome these shortcomings, psychotherapy analysis needs a reliable and scalable method for summarizing the content of treatment encounters. We used a publicly-available psychotherapy corpus from Alexander Street press comprising a large collection of transcripts of patient-provider conversations to compare coding performance for two machine learning methods. We used the Labeled Latent Dirichlet Allocation (L-LDA) model to learn associations between text and codes, to predict codes in psychotherapy sessions, and to localize specific passages of within-session text representative of a session code. We compared the L-LDA model to a baseline lasso regression model using predictive accuracy and model generalizability (measured by calculating the area under the curve (AUC) from the receiver operating characteristic (ROC) curve). The L-LDA model outperforms the lasso logistic regression model at predicting session-level codes with average AUC scores of .79, and .70, respectively. For fine-grained level coding, L-LDA and logistic regression are able to identify specific talk-turns representative of symptom codes. However, model performance for talk-turn identification is not yet as reliable as human coders. We conclude that the L-LDA model has the potential to be an objective, scalable method for accurate automated coding of psychotherapy sessions that performs better than comparable discriminative methods at session-level coding and can also predict fine-grained codes.
PMID:26625437

  7. Research Abstracts of 1982.

    DTIC Science & Technology

    1982-12-01

    ...Third Molars in Naval Personnel" (Abstract #1430). 7. A. SEROWSKI* and F. AKER, "The Effect of Marine and Fresh-Water Atmospheric Environments on Packaged Dental Instruments" (Abstract #1133). 8. I. L. SHKLAIR*, R. W. GAUGLER, R. G. WALTER, "The Effect of Three Surfactants on Controlling Caries ... Insoluble Streptococcal Glucan" (Abstract #102). 10. R. G. WALTER* and I. L. SHKLAIR, "The Effect of T-10 Dextran on Caries and Plaque in

  8. Identification of ICD Codes Suggestive of Child Maltreatment

    ERIC Educational Resources Information Center

    Schnitzer, Patricia G.; Slusher, Paula L.; Kruse, Robin L.; Tarleton, Molly M.

    2011-01-01

    Objective: In order to be reimbursed for the care they provide, hospitals in the United States are required to use a standard system to code all discharge diagnoses: the International Classification of Disease, 9th Revision, Clinical Modification (ICD-9). Although ICD-9 codes specific for child maltreatment exist, they do not identify all…

  9. Computer-assisted coding and clinical documentation: first things first.

    PubMed

    Tully, Melinda; Carmichael, Angela

    2012-10-01

    Computer-assisted coding tools have the potential to drive improvements in seven areas: Transparency of coding. Productivity (generally by 20 to 25 percent for inpatient claims). Accuracy (by improving specificity of documentation). Cost containment (by reducing overtime expenses, audit fees, and denials). Compliance. Efficiency. Consistency.

  10. Mental Ability and Mismatch Negativity: Pre-Attentive Discrimination of Abstract Feature Conjunctions in Auditory Sequences

    ERIC Educational Resources Information Center

    Houlihan, Michael; Stelmack, Robert M.

    2012-01-01

    The relation between mental ability and the ability to detect violations of an abstract, third-order conjunction rule was examined using event-related potential measures, specifically mismatch negativity (MMN). The primary objective was to determine whether the extraction of invariant relations based on abstract conjunctions between two…

  11. Lineage-Specific Genome Architecture Links Enhancers and Non-coding Disease Variants to Target Gene Promoters.

    PubMed

    Javierre, Biola M; Burren, Oliver S; Wilder, Steven P; Kreuzhuber, Roman; Hill, Steven M; Sewitz, Sven; Cairns, Jonathan; Wingett, Steven W; Várnai, Csilla; Thiecke, Michiel J; Burden, Frances; Farrow, Samantha; Cutler, Antony J; Rehnström, Karola; Downes, Kate; Grassi, Luigi; Kostadima, Myrto; Freire-Pritchett, Paula; Wang, Fan; Stunnenberg, Hendrik G; Todd, John A; Zerbino, Daniel R; Stegle, Oliver; Ouwehand, Willem H; Frontini, Mattia; Wallace, Chris; Spivakov, Mikhail; Fraser, Peter

    2016-11-17

    Long-range interactions between regulatory elements and gene promoters play key roles in transcriptional regulation. The vast majority of interactions are uncharted, constituting a major missing link in understanding genome control. Here, we use promoter capture Hi-C to identify interacting regions of 31,253 promoters in 17 human primary hematopoietic cell types. We show that promoter interactions are highly cell type specific and enriched for links between active promoters and epigenetically marked enhancers. Promoter interactomes reflect lineage relationships of the hematopoietic tree, consistent with dynamic remodeling of nuclear architecture during differentiation. Interacting regions are enriched in genetic variants linked with altered expression of genes they contact, highlighting their functional role. We exploit this rich resource to connect non-coding disease variants to putative target promoters, prioritizing thousands of disease-candidate genes and implicating disease pathways. Our results demonstrate the power of primary cell promoter interactomes to reveal insights into genomic regulatory mechanisms underlying common diseases. Copyright © 2016 The Authors. Published by Elsevier Inc. All rights reserved.

  12. Parietal Activation During Retrieval of Abstract and Concrete Auditory Information

    PubMed Central

    Klostermann, Ellen C.; Kane, Ari J.M.; Shimamura, Arthur P.

    2008-01-01

    Successful memory retrieval has been associated with a neural circuit that involves prefrontal, precuneus, and posterior parietal regions. Specifically, these regions are active during recognition memory tests when items correctly identified as “old” are compared with items correctly identified as “new.” Yet, as nearly all previous fMRI studies have used visual stimuli, it is unclear whether activations in posterior regions are specifically associated with memory retrieval or if they reflect visuospatial processing. We focus on the status of parietal activations during recognition performance by testing memory for abstract and concrete nouns presented in the auditory modality with eyes closed. Successful retrieval of both concrete and abstract words was associated with increased activation in left inferior parietal regions (BA 40), similar to those observed with visual stimuli. These results demonstrate that activations in the posterior parietal cortex during retrieval cannot be attributed to bottom-up visuospatial processes but instead have a more direct relationship to memory retrieval processes. PMID:18243736

  13. Abstraction in perceptual symbol systems.

    PubMed Central

    Barsalou, Lawrence W

    2003-01-01

    After reviewing six senses of abstraction, this article focuses on abstractions that take the form of summary representations. Three central properties of these abstractions are established: ( i ) type-token interpretation; (ii) structured representation; and (iii) dynamic realization. Traditional theories of representation handle interpretation and structure well but are not sufficiently dynamical. Conversely, connectionist theories are exquisitely dynamic but have problems with structure. Perceptual symbol systems offer an approach that implements all three properties naturally. Within this framework, a loose collection of property and relation simulators develops to represent abstractions. Type-token interpretation results from binding a property simulator to a region of a perceived or simulated category member. Structured representation results from binding a configuration of property and relation simulators to multiple regions in an integrated manner. Dynamic realization results from applying different subsets of property and relation simulators to category members on different occasions. From this standpoint, there are no permanent or complete abstractions of a category in memory. Instead, abstraction is the skill to construct temporary online interpretations of a category's members. Although an infinite number of abstractions are possible, attractors develop for habitual approaches to interpretation. This approach provides new ways of thinking about abstraction phenomena in categorization, inference, background knowledge and learning. PMID:12903648

  14. NASA patent abstracts bibliography: A continuing bibliography. Section 1: Abstracts (supplement 05)

    NASA Technical Reports Server (NTRS)

    1974-01-01

    This bibliography is issued in two sections: Section 1 - Abstracts, and section 2 - Indexes. The abstract section cites 217 patents and applications for patent introduced into the NASA scientific and technical information system during the period of January 1974 through June 1974. Each entry consists of a citation, an abstract, and, in most cases, a key illustration selected from the patent or application for patent. The index section contains entries for 2653 patent and application for patent citations covering the period May 1969 through June 1974. The index section contains five indexes -- subject, inventor, source, number and accession number.

  15. Deterministic quantum dense coding networks

    NASA Astrophysics Data System (ADS)

    Roy, Saptarshi; Chanda, Titas; Das, Tamoghna; Sen(De), Aditi; Sen, Ujjwal

    2018-07-01

    We consider the scenario of deterministic classical information transmission between multiple senders and a single receiver, when they a priori share a multipartite quantum state - an attempt towards building a deterministic dense coding network. Specifically, we prove that in the case of two or three senders and a single receiver, generalized Greenberger-Horne-Zeilinger (gGHZ) states are not beneficial for sending classical information deterministically beyond the classical limit, except when the shared state is the GHZ state itself. On the other hand, three- and four-qubit generalized W (gW) states with specific parameters as well as the four-qubit Dicke states can provide a quantum advantage of sending the information in deterministic dense coding. Interestingly however, numerical simulations in the three-qubit scenario reveal that the percentage of states from the GHZ-class that are deterministic dense codeable is higher than that of states from the W-class.
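
    As background for the deterministic network setting, the two-party superdense coding protocol that these multi-sender schemes generalize can be simulated in a few lines. This is the standard textbook construction, not the authors' code: Alice encodes two classical bits on her half of a shared Bell pair with one of the Pauli operations I, X, Z, XZ, and Bob recovers them with a Bell-basis measurement.

```python
import numpy as np

I = np.eye(2)
X = np.array([[0, 1], [1, 0]], dtype=float)
Z = np.diag([1.0, -1.0])

phi_plus = np.array([1, 0, 0, 1]) / np.sqrt(2)  # (|00> + |11>)/sqrt(2)
BELL = {
    "00": phi_plus,                              # Phi+
    "01": np.array([0, 1, 1, 0]) / np.sqrt(2),   # Psi+
    "10": np.array([1, 0, 0, -1]) / np.sqrt(2),  # Phi-
    "11": np.array([0, 1, -1, 0]) / np.sqrt(2),  # Psi-
}
ENCODE = {"00": I, "01": X, "10": Z, "11": X @ Z}

def send(bits):
    """Alice applies her local gate to the first qubit only."""
    return np.kron(ENCODE[bits], I) @ phi_plus

def decode(state):
    """Bob's Bell measurement: pick the Bell state with unit overlap."""
    return max(BELL, key=lambda b: abs(BELL[b] @ state) ** 2)

print([decode(send(b)) for b in ("00", "01", "10", "11")])  # all four recovered
```

    One shared entangled qubit pair thus carries two classical bits deterministically; the abstract's question is when multipartite states such as gGHZ, gW, or Dicke states retain this advantage with several senders.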

  16. Professional codes in a changing nursing context: literature review.

    PubMed

    Meulenbergs, Tom; Verpeet, Ellen; Schotsmans, Paul; Gastmans, Chris

    2004-05-01

    Professional codes played a definitive role during a specific period of time, when the professional context of nursing was characterized by an increasing professionalization. Today, however, this professional context has changed. This paper reports on a study which aimed to explore the meaning of professional codes in the current context of the nursing profession. A literature review on professional codes and the nursing profession was carried out. The literature was systematically investigated using the electronic databases PubMed and The Philosopher's Index, and the keywords nursing codes, professional codes in nursing, ethics codes/ethical codes, professional ethics. Due to the nursing profession's growing multidisciplinary nature, the increasing dominance of economic discourse, and the intensified legal framework in which health care professionals need to operate, the context of nursing is changing. In this changed professional context, nursing professional codes have to accommodate to the increasing ethical demands placed upon the profession. Therefore, an ethicization of these codes is desirable, and their moral objectives need to be revalued.

  17. Reliability of cause of death coding: an international comparison.

    PubMed

    Antini, Carmen; Rajs, Danuta; Muñoz-Quezada, María Teresa; Mondaca, Boris Andrés Lucero; Heiss, Gerardo

    2015-07-01

    This study evaluates the agreement of nosologic coding of cardiovascular causes of death between a Chilean coder and one in the United States, in a stratified random sample of death certificates of persons aged ≥ 60, issued in 2008 in the Valparaíso and Metropolitan regions, Chile. All causes of death were converted to ICD-10 codes in parallel by both coders. Concordance was analyzed with inter-coder agreement and Cohen's kappa coefficient, by level of specification of the ICD-10 code, for both the underlying cause and the coding of all causes of death. Inter-coder agreement was 76.4% for all causes of death and 80.6% for the underlying cause (agreement at the four-digit level), with differences by the level of specification of the ICD-10 code, by line of the death certificate, and by number of causes of death per certificate. Cohen's kappa coefficient was 0.76 (95%CI: 0.68-0.84) for the underlying cause and 0.75 (95%CI: 0.74-0.77) for the total causes of death. In conclusion, causes of death coding and inter-coder agreement for cardiovascular diseases in two regions of Chile are comparable to an external benchmark and with reports from other countries.
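
    The agreement statistics reported above follow the standard definitions. A minimal sketch with illustrative certificate codings (the code lists below are hypothetical, not the study's data):

```python
# Illustrative (hypothetical) underlying-cause ICD-10 assignments by two coders;
# the real study compared a Chilean and a US coder on sampled certificates.
from collections import Counter

coder_a = ["I21.9", "I64", "I50.0", "J44.9", "I21.9", "I25.1"]
coder_b = ["I21.9", "I63.9", "I50.0", "J44.9", "I21.4", "I25.1"]

def agreement_and_kappa(a, b):
    """Raw inter-coder agreement and Cohen's kappa for paired code lists."""
    n = len(a)
    observed = sum(x == y for x, y in zip(a, b)) / n
    # Chance agreement expected from each coder's marginal code frequencies
    ca, cb = Counter(a), Counter(b)
    expected = sum(ca[c] * cb.get(c, 0) for c in ca) / n ** 2
    return observed, (observed - expected) / (1 - expected)

obs, kappa = agreement_and_kappa(coder_a, coder_b)
print(f"agreement = {obs:.1%}, kappa = {kappa:.2f}")  # agreement = 66.7%, kappa = 0.61
```

    Agreement at the four-digit versus three-digit level, as in the study, would simply truncate the codes before comparison.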

  18. ETF system code: composition and applications

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Reid, R.L.; Wu, K.F.

    1980-01-01

    A computer code has been developed for application to ETF tokamak system and conceptual design studies. The code determines cost, performance, configuration, and technology requirements as a function of tokamak parameters. The ETF code is structured in a modular fashion in order to allow independent modeling of each major tokamak component. The primary benefit of modularization is that it allows updating of a component module, such as the TF coil module, without disturbing the remainder of the system code as long as the input/output to the modules remains unchanged. The modules may be run independently to perform specific design studies, such as determining the effect of allowable strain on TF coil structural requirements, or the modules may be executed together as a system to determine global effects, such as defining the impact of aspect ratio on the entire tokamak system.

  19. Contrasting Five Different Theories of Letter Position Coding: Evidence from Orthographic Similarity Effects

    ERIC Educational Resources Information Center

    Davis, Colin J.; Bowers, Jeffrey S.

    2006-01-01

    Five theories of how letter position is coded are contrasted: position-specific slot-coding, Wickelcoding, open-bigram coding (discrete and continuous), and spatial coding. These theories make different predictions regarding the relative similarity of three different types of pairs of letter strings: substitution neighbors,…

  20. Code Properties from Holographic Geometries

    NASA Astrophysics Data System (ADS)

    Pastawski, Fernando; Preskill, John

    2017-04-01

    Almheiri, Dong, and Harlow [J. High Energy Phys. 04 (2015) 163, 10.1007/JHEP04(2015)163] proposed a highly illuminating connection between the AdS/CFT holographic correspondence and operator algebra quantum error correction (OAQEC). Here, we explore this connection further. We derive some general results about OAQEC, as well as results that apply specifically to quantum codes that admit a holographic interpretation. We introduce a new quantity called price, which characterizes the support of a protected logical system, and find constraints on the price and the distance for logical subalgebras of quantum codes. We show that holographic codes defined on bulk manifolds with asymptotically negative curvature exhibit uberholography, meaning that a bulk logical algebra can be supported on a boundary region with a fractal structure. We argue that, for holographic codes defined on bulk manifolds with asymptotically flat or positive curvature, the boundary physics must be highly nonlocal, an observation with potential implications for black holes and for quantum gravity in AdS space at distance scales that are small compared to the AdS curvature radius.

  1. Automatic Abstraction in Planning

    NASA Technical Reports Server (NTRS)

    Christensen, J.

    1991-01-01

    Traditionally, abstraction in planning has been accomplished by either state abstraction or operator abstraction, neither of which has been fully automatic. We present a new method, predicate relaxation, for automatically performing state abstraction. PABLO, a nonlinear hierarchical planner, implements predicate relaxation. Theoretical, as well as empirical results are presented which demonstrate the potential advantages of using predicate relaxation in planning. We also present a new definition of hierarchical operators that allows us to guarantee a limited form of completeness. This new definition is shown to be, in some ways, more flexible than previous definitions of hierarchical operators. Finally, a Classical Truth Criterion is presented that is proven to be sound and complete for a planning formalism that is general enough to include most classical planning formalisms that are based on the STRIPS assumption.

  2. NASA patent abstracts bibliography: A continuing bibliography. Section 1: Abstracts (supplement 32)

    NASA Technical Reports Server (NTRS)

    1988-01-01

    Abstracts are provided for 136 patents and patent applications entered into the NASA scientific and technical information system during the period July through December 1987. Each entry consists of a citation, an abstract, and in most cases, a key illustration selected from the patent or patent application.

  3. NASA patent abstracts bibliography: A continuing bibliography. Section 1: Abstracts (supplement 29)

    NASA Technical Reports Server (NTRS)

    1986-01-01

    Abstracts are provided for 115 patents and patent applications entered into the NASA scientific and technical information system during the period January 1986 through June 1986. Each entry consists of a citation, an abstract, and in most cases, a key illustration selected from the patent application.

  4. Semantic Neighborhood Effects for Abstract versus Concrete Words.

    PubMed

    Danguecan, Ashley N; Buchanan, Lori

    2016-01-01

    Studies show that semantic effects may be task-specific, and thus, that semantic representations are flexible and dynamic. Such findings are critical to the development of a comprehensive theory of semantic processing in visual word recognition, which should arguably account for how semantic effects may vary by task. It has been suggested that semantic effects are more directly examined using tasks that explicitly require meaning processing relative to those for which meaning processing is not necessary (e.g., lexical decision task). The purpose of the present study was to chart the processing of concrete versus abstract words in the context of a global co-occurrence variable, semantic neighborhood density (SND), by comparing word recognition response times (RTs) across four tasks varying in explicit semantic demands: standard lexical decision task (with non-pronounceable non-words), go/no-go lexical decision task (with pronounceable non-words), progressive demasking task, and sentence relatedness task. The same experimental stimulus set was used across experiments and consisted of 44 concrete and 44 abstract words, with half of these being low SND, and half being high SND. In this way, concreteness and SND were manipulated in a factorial design using a number of visual word recognition tasks. A consistent RT pattern emerged across tasks, in which SND effects were found for abstract (but not necessarily concrete) words. Ultimately, these findings highlight the importance of studying interactive effects in word recognition, and suggest that linguistic associative information is particularly important for abstract words.

  5. PS2-06: Best Practices for Advancing Multi-site Chart Abstraction Research

    PubMed Central

    Blick, Noelle; Cole, Deanna; King, Colleen; Riordan, Rick; Von Worley, Ann; Yarbro, Patty

    2012-01-01

    Background/Aims Multi-site chart abstraction studies are becoming increasingly common within the HMORN. Differences in systems among HMORN sites can pose significant obstacles to the success of these studies. It is therefore crucial to standardize abstraction activities by following best practices for multi-site chart abstraction, as consistency of processes across sites will increase efficiencies and enhance data quality. Methods Over the past few months the authors have been meeting to identify obstacles to multi-site chart abstraction and to address ways in which multi-site chart abstraction processes can be systemized and standardized. The aim of this workgroup is to create a best practice guide for multi-site chart abstraction studies. Focus areas include: abstractor training, format for chart abstraction (database, paper, etc), data quality, redaction, mechanism for transferring data, site specific access to medical records, IRB/HIPAA concerns, and budgetary issues. Results The results of the workgroup’s efforts (the best practice guide) will be presented by a panel of experts at the 2012 HMORN conference. The presentation format will also focus on discussion among attendees to elicit further input and to identify areas that need to be further addressed. Subsequently, the best practice guide will be posted on the HMORN website. Discussion The best practice guide for multi-site chart abstraction studies will establish sound guidelines and serve as an aid to researchers embarking on multi-site chart abstraction studies. Efficiencies and data quality will be further enhanced with standardized multi-site chart abstraction practices.

  6. Implementing the LIM code: the structural basis for cell type-specific assembly of LIM-homeodomain complexes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bhati, Mugdha; Lee, Christopher; Nancarrow, Amy L.

    2008-09-03

    LIM-homeodomain (LIM-HD) transcription factors form a combinatorial 'LIM code' that contributes to the specification of cell types. In the ventral spinal cord, the binary LIM homeobox protein 3 (Lhx3)/LIM domain-binding protein 1 (Ldb1) complex specifies the formation of V2 interneurons. The additional expression of islet-1 (Isl1) in adjacent cells instead specifies the formation of motor neurons through assembly of a ternary complex in which Isl1 contacts both Lhx3 and Ldb1, displacing Lhx3 as the binding partner of Ldb1. However, little is known about how this molecular switch occurs. Here, we have identified the 30-residue Lhx3-binding domain on Isl1 (Isl1-LBD). Although the LIM interaction domain of Ldb1 (Ldb1-LID) and Isl1-LBD share low levels of sequence homology, X-ray and NMR structures reveal that they bind Lhx3 in an identical manner, that is, Isl1-LBD mimics Ldb1-LID. These data provide a structural basis for the formation of cell type-specific protein-protein interactions in which unstructured linear motifs with diverse sequences compete to bind protein partners. The resulting alternate protein complexes can target different genes to regulate key biological events.

  7. Clinical coding of prospectively identified paediatric adverse drug reactions--a retrospective review of patient records.

    PubMed

    Bellis, Jennifer R; Kirkham, Jamie J; Nunn, Anthony J; Pirmohamed, Munir

    2014-12-17

    National Health Service (NHS) hospitals in the UK use a system of coding for patient episodes. The coding system used is the International Classification of Disease (ICD-10). There are ICD-10 codes which may be associated with adverse drug reactions (ADRs) and there is a possibility of using these codes for ADR surveillance. This study aimed to determine whether ADRs prospectively identified in children admitted to a paediatric hospital were coded appropriately using ICD-10. The electronic admission abstract for each patient with at least one ADR was reviewed. A record was made of whether the ADR(s) had been coded using ICD-10. Of 241 ADRs, 76 (31.5%) were coded using at least one ICD-10 ADR code. Of the oncology ADRs, 70/115 (61%) were coded using an ICD-10 ADR code compared with 6/126 (4.8%) non-oncology ADRs (difference in proportions 56%, 95% CI 46.2% to 65.8%; p < 0.001). The majority of ADRs detected in a prospective study at a paediatric centre would not have been identified if the study had relied on ICD-10 codes as a single means of detection. Data derived from administrative healthcare databases are not reliable for identifying ADRs by themselves, but may complement other methods of detection.
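
    The oncology versus non-oncology comparison above can be reproduced from the reported counts. A hedged sketch using a standard Wald interval (the paper's exact interval method is not stated and may differ slightly):

```python
# Reported counts from the abstract: ADRs captured by an ICD-10 ADR code.
from math import sqrt

coded_onc, n_onc = 70, 115   # oncology ADRs coded / total oncology ADRs
coded_non, n_non = 6, 126    # non-oncology ADRs coded / total non-oncology ADRs

p1, p2 = coded_onc / n_onc, coded_non / n_non
diff = p1 - p2
# Wald standard error for a difference of two independent proportions
se = sqrt(p1 * (1 - p1) / n_onc + p2 * (1 - p2) / n_non)
lo, hi = diff - 1.96 * se, diff + 1.96 * se
print(f"difference = {diff:.1%} (95% CI {lo:.1%} to {hi:.1%})")
```

    This reproduces the reported 56% difference and upper bound; the published 46.2% lower bound suggests the authors used a slightly different interval construction.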

  8. NASA patent abstracts bibliography: A continuing bibliography. Section 1: Abstracts (supplement 31)

    NASA Technical Reports Server (NTRS)

    1987-01-01

    Abstracts are provided for 85 patents and patent applications entered into the NASA scientific and technical information system during the period January 1987 through June 1987. Each entry consists of a citation, an abstract, and in most cases, a key illustration selected from the patent or patent application.

  9. NASA patent abstracts bibliography: A continuing bibliography. Section 1: Abstracts (supplement 24)

    NASA Technical Reports Server (NTRS)

    1984-01-01

    Abstracts are provided for 167 patents and patent applications entered into the NASA scientific and technical information system during the period July 1983 through December 1983. Each entry consists of a citation, an abstract, and in most cases, a key illustration selected from the patent or patent application.

  10. NASA patent abstracts bibliography: A continuing bibliography. Section 1: Abstracts (supplement 27)

    NASA Technical Reports Server (NTRS)

    1985-01-01

    Abstracts are provided for 92 patents and patent applications entered into the NASA scientific and technical information system during the period January 1985 through June 1985. Each entry consists of a citation, an abstract, and in most cases, a key illustration selected from the patent or patent application.

  11. NASA patent abstracts bibliography: A continuing bibliography. Section 1: Abstracts (supplement 45)

    NASA Technical Reports Server (NTRS)

    1994-01-01

    Abstracts are provided for 137 patents and patent applications entered into the NASA scientific and technical information system during the period Jan. 1994 through Jun. 1994. Each entry consists of a citation, an abstract, and in most cases, a key illustration selected from the patent or patent application.

  12. NASA patent abstracts bibliography. A continuing bibliography (supplement 22). Section 1: Abstracts

    NASA Technical Reports Server (NTRS)

    1983-01-01

    Abstracts are cited for 234 patents and patent applications introduced into the NASA scientific and technical information system during the period July 1982 through December 1982. Each entry consists of a citation, an abstract, and in most cases, a key illustration selected from the patent or patent application.

  13. NASA patent abstracts bibliography: A continuing bibliography. Section 1: Abstracts (supplement 35)

    NASA Technical Reports Server (NTRS)

    1989-01-01

    Abstracts are provided for 58 patents and patent applications entered into the NASA scientific and technical information systems during the period January 1989 through June 1989. Each entry consists of a citation, an abstract, and in most cases, a key illustration selected from the patent or patent application.

  14. NASA patent abstracts bibliography: A continuing bibliography. Section 1: Abstracts (supplement 37)

    NASA Technical Reports Server (NTRS)

    1989-01-01

    Abstracts are provided for 76 patents and patent applications entered into the NASA scientific and technical information systems during the period January 1990 through June 1990. Each entry consists of a citation, an abstract, and in most cases, a key illustration selected from the patent or patent application.

  15. NASA patent abstracts bibliography: A continuing bibliography. Section 1: Abstracts (supplement 30)

    NASA Technical Reports Server (NTRS)

    1987-01-01

    Abstracts are provided for 105 patents and patent applications entered into the NASA scientific and technical information system during the period July 1986 through December 1986. Each entry consists of a citation, an abstract, and in most cases, a key illustration selected from the patent or patent application.

  16. NASA patent abstracts bibliography: A continuing bibliography. Section 1: Abstracts (supplement 38)

    NASA Technical Reports Server (NTRS)

    1991-01-01

    Abstracts are provided for 132 patents and patent applications entered into the NASA scientific and technical information system during the period July 1990 through December 1990. Each entry consists of a citation, an abstract, and in most cases, a key illustration selected from the patent or patent application.

  17. NASA patent abstracts bibliography: A continuing bibliography. Section 1: Abstracts (supplement 39)

    NASA Technical Reports Server (NTRS)

    1991-01-01

    Abstracts are provided for 154 patents and patent applications entered into the NASA scientific and technical information systems during the period Jan. 1991 through Jun. 1991. Each entry consists of a citation, an abstract, and in most cases, a key illustration selected from the patent or patent application.

  18. NASA patent abstracts bibliography: A continuing bibliography. Section 1: Abstracts (supplement 43)

    NASA Technical Reports Server (NTRS)

    1993-01-01

    Abstracts are provided for 128 patents and patent applications entered into the NASA scientific and technical information system during the period Jan. 1993 through Jun. 1993. Each entry consists of a citation, an abstract, and in most cases, a key illustration selected from the patent or patent application.

  19. NASA patent abstracts bibliography: A continuing bibliography. Section 1: Abstracts (supplement 42)

    NASA Technical Reports Server (NTRS)

    1993-01-01

    Abstracts are provided for 174 patents and patent applications entered into the NASA scientific and technical information system during the period July 1992 through December 1992. Each entry consists of a citation, an abstract, and in most cases, a key illustration selected from the patent or patent application.

  20. NASA patent abstracts bibliography: A continuing bibliography. Section 1: Abstracts (supplement 36)

    NASA Technical Reports Server (NTRS)

    1990-01-01

    Abstracts are provided for 63 patents and patent applications entered into the NASA scientific and technical information systems during the period July 1989 through December 1989. Each entry consists of a citation, an abstract, and in most cases, a key illustration selected from the patent or patent application.

  1. NASA patent abstracts bibliography: A continuing bibliography. Section 1: Abstracts (supplement 40)

    NASA Technical Reports Server (NTRS)

    1992-01-01

    Abstracts are provided for 181 patents and patent applications entered into the NASA scientific and technical information system during the period July 1991 through December 1991. Each entry consists of a citation, an abstract, and in most cases, a key illustration selected from the patent or patent application.

  2. NASA patent abstracts bibliography: A continuing bibliography. Section 1: Abstracts (supplement 28)

    NASA Technical Reports Server (NTRS)

    1986-01-01

    Abstracts are provided for 109 patents and patent applications entered into the NASA Scientific and Technical Information System during the period July 1985 through December 1985. Each entry consists of a citation, an abstract, and in most cases, a key illustration selected from the patent or patent application.

  3. NASA Patent Abstracts Bibliography: A Continuing Bibliography. Section 1: Abstracts (Supplement 48)

    NASA Technical Reports Server (NTRS)

    1996-01-01

    Abstracts are provided for 85 patents and patent applications entered into the NASA scientific and technical information system during the period July 1995 through December 1995. Each entry consists of a citation, an abstract, and in most cases, a key illustration selected from the patent or patent application.

  4. NASA patent abstracts bibliography: A continuing bibliography. Section 1: Abstracts (supplement 25)

    NASA Technical Reports Server (NTRS)

    1984-01-01

    Abstracts are provided for 102 patents and patent applications entered into the NASA scientific and technical information system during the period January 1984 through June 1984. Each entry consists of a citation, an abstract, and in most cases, a key illustration selected from the patent or patent application.

  5. NASA patent abstracts bibliography: A continuing bibliography. Section 1: Abstracts (supplement 33)

    NASA Technical Reports Server (NTRS)

    1988-01-01

    Abstracts are provided for 16 patents and patent applications entered into the NASA scientific and technical information systems during the period January 1988 through June 1988. Each entry consists of a citation, an abstract, and in most cases, a key illustration selected from the patent or patent application.

  6. NASA patent abstracts bibliography: A continuing bibliography. Section 1: Abstracts (supplement 15)

    NASA Technical Reports Server (NTRS)

    1979-01-01

    Abstracts are cited for 240 patents and applications for patents introduced into the NASA scientific system during the period of January 1979 through June 1979. Each entry consists of a citation, an abstract, and in most cases, a key illustration selected from the patent or application for patent.

  7. NASA patent abstracts bibliography: A continuing bibliography. Section 1: Abstracts (supplement 26)

    NASA Technical Reports Server (NTRS)

    1985-01-01

    Abstracts are provided for 172 patents and patent applications entered into the NASA scientific and technical information system during the period July 1984 through December 1984. Each entry consists of a citation, an abstract, and in most cases, a key illustration selected from the patent or patent application.

  8. NASA patent abstracts bibliography: A continuing bibliography. Section 1: Abstracts (supplement 16)

    NASA Technical Reports Server (NTRS)

    1980-01-01

    Abstracts are cited for 138 patents and patent applications introduced into the NASA scientific and technical information system during the period July 1979 through December 1979. Each entry consists of a citation, an abstract, and in most cases, a key illustration selected from the patent or patent application.

  9. NASA patent abstracts bibliography: A continuing bibliography. Section 1: Abstracts (supplement 23)

    NASA Technical Reports Server (NTRS)

    1983-01-01

    Abstracts are cited for 129 patents and patent applications introduced into the NASA scientific and technical information system during the period January 1983 through June 1983. Each entry consists of a citation, an abstract, and in most cases, a key illustration selected from the patent or patent application.

  10. NASA patent abstracts bibliography: A continuing bibliography. Section 1: Abstracts (supplement 18)

    NASA Technical Reports Server (NTRS)

    1981-01-01

    Abstracts are cited for 120 patents and patent applications introduced into the NASA scientific system during the period of July 1980 through December 1980. Each entry consists of a citation, an abstract, and in most cases, a key illustration selected from the patent or application for patent.

  11. NASA patent abstracts bibliography: A continuing bibliography. Section 1: Abstracts (supplement 34)

    NASA Technical Reports Server (NTRS)

    1989-01-01

    Abstracts are provided for 124 patents and patent applications entered into the NASA scientific and technical information systems during the period July 1988 through December 1988. Each entry consists of a citation, an abstract, and in most cases, a key illustration selected from the patent or patent application.

  12. NASA patent abstracts bibliography: A continuing bibliography. Section 1: Abstracts (supplement 41)

    NASA Technical Reports Server (NTRS)

    1992-01-01

    Abstracts are provided for 131 patents and patent applications entered into the NASA scientific and technical information system during the period Jan. 1992 through Jun. 1992. Each entry consists of a citation, an abstract, and in most cases, a key illustration selected from the patent or patent application.

  13. NASA patent abstracts bibliography: A continuing bibliography. Section 1: Abstracts (supplement 44)

    NASA Technical Reports Server (NTRS)

    1994-01-01

    Abstracts are provided for 131 patents and patent applications entered into the NASA scientific and technical information system during the period Jun. 1993 through Dec. 1993. Each entry consists of a citation, an abstract, and in most cases, a key illustration selected from the patent or patent application.

  14. NASA patent abstracts bibliography: A continuing bibliography. Section 1: Abstracts (supplement 20)

    NASA Technical Reports Server (NTRS)

    1982-01-01

    Abstracts are cited for 165 patents and patent applications introduced into the NASA scientific and technical information system during the period July 1981 through December 1981. Each entry consists of a citation, an abstract, and in most cases, a key illustration selected from the patent or patent application.

  15. Abstraction and art.

    PubMed

    Gortais, Bernard

    2003-07-29

    In a given social context, artistic creation comprises a set of processes, which relate to the activity of the artist and the activity of the spectator. Through these processes we see and understand that the world is vaster than it is said to be. Artistic processes are mediated experiences that open up the world. A successful work of art expresses a reality beyond actual reality: it suggests an unknown world using the means and the signs of the known world. Artistic practices incorporate the means of creation developed by science and technology and change forms as they change. Artists and the public follow different processes of abstraction at different levels, in the definition of the means of creation, of representation and of perception of a work of art. This paper examines how the processes of abstraction are used within the framework of the visual arts and abstract painting, which appeared during a period of growing importance for the processes of abstraction in science and technology, at the beginning of the twentieth century. The development of digital platforms and new man-machine interfaces allow multimedia creations. This is performed under the constraint of phases of multidisciplinary conceptualization using generic representation languages, which tend to abolish traditional frontiers between the arts: visual arts, drama, dance and music.

  16. Administrative database code accuracy did not vary notably with changes in disease prevalence.

    PubMed

    van Walraven, Carl; English, Shane; Austin, Peter C

    2016-11-01

    Previous mathematical analyses of diagnostic tests based on the categorization of a continuous measure have found that test sensitivity and specificity vary significantly by disease prevalence. This study determined whether the accuracy of diagnostic codes varied by disease prevalence. We used data from two previous studies in which the true status of renal disease and primary subarachnoid hemorrhage, respectively, had been determined. In multiple stratified random samples from the two previous studies having varying disease prevalence, we measured the accuracy of diagnostic codes for each disease using sensitivity, specificity, and positive and negative predictive value. Diagnostic code sensitivity and specificity did not change notably within clinically sensible disease prevalence. In contrast, positive and negative predictive values changed significantly with disease prevalence. Disease prevalence had no important influence on the sensitivity and specificity of diagnostic codes in administrative databases.
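
    The prevalence dependence of predictive values, with sensitivity and specificity held fixed, follows directly from Bayes' rule. A sketch with hypothetical code-accuracy values (not figures from the study):

```python
def predictive_values(sens, spec, prev):
    """PPV and NPV from sensitivity, specificity, and prevalence via Bayes' rule."""
    ppv = sens * prev / (sens * prev + (1 - spec) * (1 - prev))
    npv = spec * (1 - prev) / (spec * (1 - prev) + (1 - sens) * prev)
    return ppv, npv

sens, spec = 0.80, 0.95  # hypothetical, fixed diagnostic-code accuracy
for prev in (0.01, 0.10, 0.30):
    ppv, npv = predictive_values(sens, spec, prev)
    print(f"prevalence {prev:.0%}: PPV = {ppv:.2f}, NPV = {npv:.2f}")
```

    With accuracy held fixed, PPV climbs from roughly 0.14 at 1% prevalence to roughly 0.87 at 30%, mirroring the study's finding that predictive values, not sensitivity or specificity, track prevalence.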

  17. NASA patent abstracts bibliography: A continuing bibliography. Section 1: Abstracts (supplement 19)

    NASA Technical Reports Server (NTRS)

    1981-01-01

    Abstracts are cited for 130 patents and patent applications introduced into the NASA scientific and technical information system during the period of January 1981 through July 1981. Each entry consists of a citation, an abstract, and in most cases, a key illustration selected from the patent or application for patent.

  18. Cooperative Educational Abstracting Service (CEAS). (Abstract Series No. 103-122, March 1972).

    ERIC Educational Resources Information Center

    International Bureau of Education, Geneva (Switzerland).

    This document is a compilation of 20 English-language abstracts concerning various aspects of education in Switzerland, New Zealand, Chile, Poland, Argentina, Pakistan, Malaysia, Thailand, and France. The abstracts are informative in nature, each being approximately 1,500 words in length. They are based on documents submitted by each of the…

  19. NASA patent abstracts bibliography: A continuing bibliography. Section 1: Abstracts (supplement 17)

    NASA Technical Reports Server (NTRS)

    1980-01-01

    Abstracts are cited for 150 patents and applications for patents introduced into the NASA scientific and technical information system during the period January 1980 through June 1980. Each entry consists of a citation, an abstract, and in most cases, a key illustration selected from the patent or application for patent.

  20. NASA patent abstracts bibliography: A continuing bibliography. Section 1: Abstracts (supplement 14)

    NASA Technical Reports Server (NTRS)

    1979-01-01

    Abstracts are cited for 213 patents and applications for patent introduced into the NASA scientific and technical information system during the period of July 1978 through December 1978. Each entry consists of a citation, an abstract, and in most cases, a key illustration selected from the patent or application for patent.

  1. Micromagnetic Code Development of Advanced Magnetic Structures Final Report CRADA No. TC-1561-98

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cerjan, Charles J.; Shi, Xizeng

    The specific goals of this project were to: Further develop the previously written micromagnetic code DADIMAG (DOE code release number 980017); Validate the code. The resulting code was expected to be more realistic and useful for simulations of magnetic structures of specific interest to Read-Rite programs. We also planned to further the code for use in internal LLNL programs. This project complemented LLNL CRADA TC-840-94 between LLNL and Read-Rite, which allowed for simulations of the advanced magnetic head development completed under the CRADA. TC-1561-98 was effective concurrently with LLNL non-exclusive copyright license (TL-1552-98) to Read-Rite for DADIMAG Version 2 executable code.

  2. Operational rate-distortion performance for joint source and channel coding of images.

    PubMed

    Ruf, M J; Modestino, J W

    1999-01-01

    This paper describes a methodology for evaluating the operational rate-distortion behavior of combined source and channel coding schemes with particular application to images. In particular, we demonstrate use of the operational rate-distortion function to obtain the optimum tradeoff between source coding accuracy and channel error protection under the constraint of a fixed transmission bandwidth for the investigated transmission schemes. Furthermore, we develop information-theoretic bounds on performance for specific source and channel coding systems and demonstrate that our combined source-channel coding methodology applied to different schemes results in operational rate-distortion performance which closely approaches these theoretical limits. We concentrate specifically on a wavelet-based subband source coding scheme and the use of binary rate-compatible punctured convolutional (RCPC) codes for transmission over the additive white Gaussian noise (AWGN) channel. Explicit results for real-world images demonstrate the efficacy of this approach.
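
    The kind of information-theoretic limit such joint designs are measured against can be illustrated with the classical bound for a Gaussian source over an AWGN channel (often called OPTA); the paper's own bounds are scheme-specific, so this is only a generic sketch:

```python
from math import log10

# OPTA for a unit-variance Gaussian source over an AWGN channel with
# b channel uses per source sample: D_min = sigma^2 * (1 + SNR) ** (-b),
# obtained by equating the source rate-distortion function
# R(D) = 0.5 * log2(sigma^2 / D) with b times the channel capacity
# C = 0.5 * log2(1 + SNR).
sigma2 = 1.0
b = 0.5  # bandwidth constraint: half a channel use per source sample
for snr_db in (0, 10, 20):
    snr = 10 ** (snr_db / 10)
    d_min = sigma2 * (1 + snr) ** (-b)
    print(f"channel SNR {snr_db:2d} dB -> D_min = {d_min:.3f} "
          f"({10 * log10(sigma2 / d_min):.1f} dB output SNR)")
```

    Practical schemes such as the subband-coder/RCPC combination studied here operate above this distortion floor; the gap to the bound is the natural figure of merit.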

  3. Evaluation in industry of a draft code of practice for manual handling.

    PubMed

    Ashby, Liz; Tappin, David; Bentley, Tim

    2004-05-01

    This paper reports findings from a study which evaluated the draft New Zealand Code of Practice for Manual Handling. The evaluation assessed the ease of use, applicability and validity of the Code and in particular the associated manual handling hazard assessment tools, within New Zealand industry. The Code was studied in a sample of eight companies from four sectors of industry. Subjective feedback and objective findings indicated that the Code was useful, applicable and informative. The manual handling hazard assessment tools incorporated in the Code could be adequately applied by most users, with risk assessment outcomes largely consistent with the findings of researchers using more specific ergonomics methodologies. However, some changes were recommended to the risk assessment tools to improve usability and validity. The evaluation concluded that both the Code and the tools within it would benefit from simplification, improved typography and layout, and industry-specific information on manual handling hazards.

  4. Verification of Gyrokinetic codes: Theoretical background and applications

    NASA Astrophysics Data System (ADS)

    Tronko, Natalia; Bottino, Alberto; Görler, Tobias; Sonnendrücker, Eric; Told, Daniel; Villard, Laurent

    2017-05-01

    In fusion plasmas, the strong magnetic field allows the fast gyro-motion to be systematically removed from the description of the dynamics, resulting in a considerable model simplification and gain of computational time. Nowadays, the gyrokinetic (GK) codes play a major role in the understanding of the development and the saturation of turbulence and in the prediction of the subsequent transport. Naturally, these codes require thorough verification and validation. Here, we present a new and generic theoretical framework and specific numerical applications to test the faithfulness of the implemented models to theory and to verify the domain of applicability of existing GK codes. For a sound verification process, the underlying theoretical GK model and the numerical scheme must be considered at the same time, which has rarely been done and therefore makes this approach pioneering. At the analytical level, the main novelty consists in using advanced mathematical tools, such as the variational formulation of dynamics, to systematize the basic equations of GK codes and assess the limits of their applicability. The verification of the numerical scheme is proposed via the benchmark effort. In this work, specific examples of code verification are presented for two GK codes: the multi-species electromagnetic ORB5 (PIC) and the radially global version of GENE (Eulerian). The proposed methodology can be applied to any existing GK code. We establish a hierarchy of reduced GK Vlasov-Maxwell equations implemented in the ORB5 and GENE codes using the Lagrangian variational formulation. At the computational level, detailed verifications of global electromagnetic test cases developed from the CYCLONE Base Case are considered, including a parametric β-scan covering the transition from ITG to KBM and the spectral properties at the nominal β value.

  5. Self-Taught Low-Rank Coding for Visual Learning.

    PubMed

    Li, Sheng; Li, Kang; Fu, Yun

    2018-03-01

    The lack of labeled data presents a common challenge in many computer vision and machine learning tasks. Semisupervised learning and transfer learning methods have been developed to tackle this challenge by utilizing auxiliary samples from the same domain or from a different domain, respectively. Self-taught learning, which is a special type of transfer learning, has fewer restrictions on the choice of auxiliary data. It has shown promising performance in visual learning. However, existing self-taught learning methods usually ignore the structure information in data. In this paper, we focus on building a self-taught coding framework, which can effectively utilize the rich low-level pattern information abstracted from the auxiliary domain, in order to characterize the high-level structural information in the target domain. By leveraging a high quality dictionary learned across auxiliary and target domains, the proposed approach learns expressive codings for the samples in the target domain. Since many types of visual data have been proven to contain subspace structures, a low-rank constraint is introduced into the coding objective to better characterize the structure of the given target set. The proposed representation learning framework is called self-taught low-rank (S-Low) coding, which can be formulated as a nonconvex rank-minimization and dictionary learning problem. We devise an efficient majorization-minimization augmented Lagrange multiplier algorithm to solve it. Based on the proposed S-Low coding mechanism, both unsupervised and supervised visual learning algorithms are derived. Extensive experiments on five benchmark data sets demonstrate the effectiveness of our approach.

  6. Automatic Certification of Kalman Filters for Reliable Code Generation

    NASA Technical Reports Server (NTRS)

    Denney, Ewen; Fischer, Bernd; Schumann, Johann; Richardson, Julian

    2005-01-01

    AUTOFILTER is a tool for automatically deriving Kalman filter code from high-level declarative specifications of state estimation problems. It can generate code with a range of algorithmic characteristics and for several target platforms. The tool has been designed with reliability of the generated code in mind and is able to automatically certify that the code it generates is free from various error classes. Since documentation is an important part of software assurance, AUTOFILTER can also automatically generate various human-readable documents, containing both design and safety related information. We discuss how these features address software assurance standards such as DO-178B.

  7. SPIN: An Inversion Code for the Photospheric Spectral Line

    NASA Astrophysics Data System (ADS)

    Yadav, Rahul; Mathew, Shibu K.; Tiwary, Alok Ranjan

    2017-08-01

    Inversion codes are the most useful tools to infer the physical properties of the solar atmosphere from the interpretation of Stokes profiles. In this paper, we present the details of a new Stokes Profile INversion code (SPIN) developed specifically to invert the spectro-polarimetric data of the Multi-Application Solar Telescope (MAST) at Udaipur Solar Observatory. The SPIN code adopts Milne-Eddington approximations to solve the polarized radiative transfer equation (RTE), and a modified Levenberg-Marquardt algorithm has been employed for fitting. We describe the details and utilization of the SPIN code to invert the spectro-polarimetric data. We also present the details of tests performed to validate the inversion code by comparing the results from other widely used inversion codes (VFISV and SIR). The inverted results of the SPIN code after its application to Hinode/SP data have been compared with the inverted results from other inversion codes.

  8. Code development for ships -- A demonstration

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ayyub, B.; Mansour, A.E.; White, G.

    1996-12-31

    A demonstration summary of a reliability-based structural design code for ships is presented for two ship types, a cruiser and a tanker. For both ship types, code requirements cover four failure modes: hull girder buckling, unstiffened plate yielding and buckling, stiffened plate buckling, and fatigue of critical details. Both serviceability and ultimate limit states are considered. Because of length limitations, only hull girder modes are presented in this paper; code requirements for the other modes will be presented in a future publication. A specific provision of the code is a safety check expression. The design variables are to be taken at their nominal values, typically values on the safe side of the respective distributions. Other safety check expressions for hull girder failure that include load combination factors, as well as consequence-of-failure factors, are considered. This paper provides a summary of safety check expressions for the hull girder modes.

  9. Resistor-logic demultiplexers for nanoelectronics based on constant-weight codes.

    PubMed

    Kuekes, Philip J; Robinett, Warren; Roth, Ron M; Seroussi, Gadiel; Snider, Gregory S; Stanley Williams, R

    2006-02-28

    The voltage margin of a resistor-logic demultiplexer can be improved significantly by basing its connection pattern on a constant-weight code. Each distinct code determines a unique demultiplexer, and therefore a large family of circuits is defined. We consider using these demultiplexers for building nanoscale crossbar memories, and determine the voltage margin of the memory system based on a particular code. We determine a purely code-theoretic criterion for selecting codes that will yield memories with large voltage margins, which is to minimize the ratio of the maximum to the minimum Hamming distance between distinct codewords. For the specific example of a 64 × 64 crossbar, we discuss what codes provide optimal performance for a memory.
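
The selection criterion described in the abstract, minimizing the ratio of the maximum to the minimum Hamming distance between distinct codewords, can be sketched in a few lines. The constant-weight code below is a toy example for illustration, not one of the codes analyzed in the paper:

```python
from itertools import combinations

def hamming(a, b):
    """Number of positions in which two equal-length codewords differ."""
    return sum(x != y for x, y in zip(a, b))

def distance_ratio(code):
    """Ratio of the maximum to the minimum Hamming distance over all
    pairs of distinct codewords; smaller is better per the criterion."""
    dists = [hamming(a, b) for a, b in combinations(code, 2)]
    return max(dists) / min(dists)

# Toy constant-weight code: every codeword has exactly two 1-bits.
code = ["1100", "1010", "0110", "0011"]
assert all(w.count("1") == 2 for w in code)
print(distance_ratio(code))  # 2.0 (max distance 4, min distance 2)
```

Among candidate constant-weight codes, this criterion favors those whose pairwise distances are as uniform as possible.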

  10. Abstract knowledge versus direct experience in processing of binomial expressions

    PubMed Central

    Morgan, Emily; Levy, Roger

    2016-01-01

    We ask whether word order preferences for binomial expressions of the form A and B (e.g. bread and butter) are driven by abstract linguistic knowledge of ordering constraints referencing the semantic, phonological, and lexical properties of the constituent words, or by prior direct experience with the specific items in question. Using forced-choice and self-paced reading tasks, we demonstrate that online processing of never-before-seen binomials is influenced by abstract knowledge of ordering constraints, which we estimate with a probabilistic model. In contrast, online processing of highly frequent binomials is primarily driven by direct experience, which we estimate from corpus frequency counts. We propose a trade-off wherein processing of novel expressions relies upon abstract knowledge, while reliance upon direct experience increases with increased exposure to an expression. Our findings support theories of language processing in which both compositional generation and direct, holistic reuse of multi-word expressions play crucial roles. PMID:27776281
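
The abstract's notion of estimating direct experience from corpus frequency counts can be illustrated with a minimal sketch; the counts below are hypothetical, not figures from the study:

```python
def preference_from_counts(count_ab, count_ba):
    """Estimated preference for the order 'A and B' over 'B and A',
    based on how often each order appears in a corpus."""
    return count_ab / (count_ab + count_ba)

# Hypothetical corpus counts for "bread and butter" vs. "butter and bread":
print(preference_from_counts(980, 20))  # 0.98
```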

  11. A Bayesian Theory of Sequential Causal Learning and Abstract Transfer

    ERIC Educational Resources Information Center

    Lu, Hongjing; Rojas, Randall R.; Beckers, Tom; Yuille, Alan L.

    2016-01-01

    Two key research issues in the field of causal learning are how people acquire causal knowledge when observing data that are presented sequentially, and the level of abstraction at which learning takes place. Does sequential causal learning solely involve the acquisition of specific cause-effect links, or do learners also acquire knowledge about…

  12. Abstraction and art.

    PubMed Central

    Gortais, Bernard

    2003-01-01

    In a given social context, artistic creation comprises a set of processes, which relate to the activity of the artist and the activity of the spectator. Through these processes we see and understand that the world is vaster than it is said to be. Artistic processes are mediated experiences that open up the world. A successful work of art expresses a reality beyond actual reality: it suggests an unknown world using the means and the signs of the known world. Artistic practices incorporate the means of creation developed by science and technology and change forms as they change. Artists and the public follow different processes of abstraction at different levels, in the definition of the means of creation, of representation and of perception of a work of art. This paper examines how the processes of abstraction are used within the framework of the visual arts and abstract painting, which appeared during a period of growing importance for the processes of abstraction in science and technology, at the beginning of the twentieth century. The development of digital platforms and new man-machine interfaces allow multimedia creations. This is performed under the constraint of phases of multidisciplinary conceptualization using generic representation languages, which tend to abolish traditional frontiers between the arts: visual arts, drama, dance and music. PMID:12903659

  13. Self-complementary circular codes in coding theory.

    PubMed

    Fimmel, Elena; Michel, Christian J; Starman, Martin; Strüngmann, Lutz

    2018-04-01

    Self-complementary circular codes are involved in pairing genetic processes. A maximal [Formula: see text] self-complementary circular code X of trinucleotides was identified in genes of bacteria, archaea, eukaryotes, plasmids and viruses (Michel in Life 7(20):1-16 2017, J Theor Biol 380:156-177, 2015; Arquès and Michel in J Theor Biol 182:45-58 1996). In this paper, self-complementary circular codes are investigated using the graph theory approach recently formulated in Fimmel et al. (Philos Trans R Soc A 374:20150058, 2016). A directed graph [Formula: see text] associated with any code X mirrors the properties of the code. In the present paper, we demonstrate a necessary condition for the self-complementarity of an arbitrary code X in terms of the graph theory. The same condition has been proven to be sufficient for codes which are circular and of large size [Formula: see text] trinucleotides, in particular for maximal circular codes ([Formula: see text] trinucleotides). For codes of small-size [Formula: see text] trinucleotides, some very rare counterexamples have been constructed. Furthermore, the length and the structure of the longest paths in the graphs associated with the self-complementary circular codes are investigated. It has been proven that the longest paths in such graphs determine the reading frame for the self-complementary circular codes. By applying this result, the reading frame in any arbitrary sequence of trinucleotides is retrieved after at most 15 nucleotides, i.e., 5 consecutive trinucleotides, from the circular code X identified in genes. Thus, an X motif of a length of at least 15 nucleotides in an arbitrary sequence of trinucleotides (not necessarily all of them belonging to X) uniquely defines the reading (correct) frame, an important criterion for analyzing the X motifs in genes in the future.
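
The self-complementarity property underlying this analysis, namely that the reverse complement of every trinucleotide in X is again in X, can be checked directly. The four-trinucleotide code below is illustrative only, not the maximal code X identified in genes:

```python
COMPLEMENT = {"A": "T", "T": "A", "C": "G", "G": "C"}

def reverse_complement(trinucleotide):
    """Reverse the trinucleotide and complement each base."""
    return "".join(COMPLEMENT[b] for b in reversed(trinucleotide))

def is_self_complementary(code):
    """True if the reverse complement of every trinucleotide in the code
    is itself a member of the code."""
    return all(reverse_complement(t) in code for t in code)

toy = {"AAT", "ATT", "GCC", "GGC"}
print(is_self_complementary(toy))  # True: AAT <-> ATT and GCC <-> GGC pair up
```

Circularity itself is a stronger property concerning reading frames and would require a separate check, e.g. via the directed-graph construction cited in the abstract.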

  14. From Physics Model to Results: An Optimizing Framework for Cross-Architecture Code Generation

    DOE PAGES

    Blazewicz, Marek; Hinder, Ian; Koppelman, David M.; ...

    2013-01-01

    Starting from a high-level problem description in terms of partial differential equations using abstract tensor notation, the Chemora framework discretizes, optimizes, and generates complete high performance codes for a wide range of compute architectures. Chemora extends the capabilities of Cactus, facilitating the usage of large-scale CPU/GPU systems in an efficient manner for complex applications, without low-level code tuning. Chemora achieves parallelism through MPI and multi-threading, combining OpenMP and CUDA. Optimizations include high-level code transformations, efficient loop traversal strategies, dynamically selected data and instruction cache usage strategies, and JIT compilation of GPU code tailored to the problem characteristics. The discretization is based on higher-order finite differences on multi-block domains. Chemora's capabilities are demonstrated by simulations of black hole collisions. This problem provides an acid test of the framework, as the Einstein equations contain hundreds of variables and thousands of terms.

  15. DSP code optimization based on cache

    NASA Astrophysics Data System (ADS)

    Xu, Chengfa; Li, Chengcheng; Tang, Bin

    2013-03-01

    A DSP program often runs less efficiently on the target board than in software simulation during development, mainly because of improper use and incomplete understanding of the cache-based memory. This paper takes the TI TMS320C6455 DSP as an example, analyzes its two-level internal cache, and summarizes methods of code optimization. The processor achieves its best performance when these code optimization methods are used. Finally, a specific application to an algorithm in radar signal processing is presented. Experimental results show that these optimizations are effective.

  16. Code manual for CONTAIN 2.0: A computer code for nuclear reactor containment analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Murata, K.K.; Williams, D.C.; Griffith, R.O.

    1997-12-01

    The CONTAIN 2.0 computer code is an integrated analysis tool used for predicting the physical conditions, chemical compositions, and distributions of radiological materials inside a containment building following the release of material from the primary system in a light-water reactor accident. It can also predict the source term to the environment. CONTAIN 2.0 is intended to replace the earlier CONTAIN 1.12, which was released in 1991. The purpose of this Code Manual is to provide full documentation of the features and models in CONTAIN 2.0. Besides complete descriptions of the models, this Code Manual provides a complete description of the input and output from the code. CONTAIN 2.0 is a highly flexible and modular code that can run problems that are either quite simple or highly complex. An important aspect of CONTAIN is that the interactions among thermal-hydraulic phenomena, aerosol behavior, and fission product behavior are taken into account. The code includes atmospheric models for steam/air thermodynamics, intercell flows, condensation/evaporation on structures and aerosols, aerosol behavior, and gas combustion. It also includes models for reactor cavity phenomena such as core-concrete interactions and coolant pool boiling. Heat conduction in structures, fission product decay and transport, radioactive decay heating, and the thermal-hydraulic and fission product decontamination effects of engineered safety features are also modeled. To the extent possible, the best available models for severe accident phenomena have been incorporated into CONTAIN, but it is intrinsic to the nature of accident analysis that significant uncertainty exists regarding numerous phenomena. In those cases, sensitivity studies can be performed with CONTAIN by means of user-specified input parameters. Thus, the code can be viewed as a tool designed to assist the knowledgeable reactor safety analyst in evaluating the consequences of specific modeling assumptions.

  17. Concept Formation and Abstraction.

    ERIC Educational Resources Information Center

    Lunzer, Eric A.

    1979-01-01

    This paper examines the nature of concepts and conceptual processes and the manner of their formation. It argues that a process of successive abstraction and systematization is central to the evolution of conceptual structures. Classificatory processes are discussed and three levels of abstraction outlined. (Author/SJL)

  18. Coset Codes Viewed as Terminated Convolutional Codes

    NASA Technical Reports Server (NTRS)

    Fossorier, Marc P. C.; Lin, Shu

    1996-01-01

    In this paper, coset codes are considered as terminated convolutional codes. Based on this approach, three new general results are presented. First, it is shown that the iterative squaring construction can equivalently be defined from a convolutional code whose trellis terminates. This convolutional code determines a simple encoder for the coset code considered, and the state and branch labelings of the associated trellis diagram become straightforward. Also, from the generator matrix of the code in its convolutional code form, much information about the trade-off between the state connectivity and complexity at each section, and the parallel structure of the trellis, is directly available. Based on this generator matrix, it is shown that the parallel branches in the trellis diagram of the convolutional code represent the same coset code C(sub 1), of smaller dimension and shorter length. Utilizing this fact, a two-stage optimum trellis decoding method is devised. The first stage decodes C(sub 1), while the second stage decodes the associated convolutional code, using the branch metrics delivered by stage 1. Finally, a bidirectional decoding of each received block starting at both ends is presented. If about the same number of computations is required, this approach remains very attractive from a practical point of view as it roughly doubles the decoding speed. This fact is particularly interesting whenever the second half of the trellis is the mirror image of the first half, since the same decoder can be implemented for both parts.

  19. Semantic Neighborhood Effects for Abstract versus Concrete Words

    PubMed Central

    Danguecan, Ashley N.; Buchanan, Lori

    2016-01-01

    Studies show that semantic effects may be task-specific, and thus, that semantic representations are flexible and dynamic. Such findings are critical to the development of a comprehensive theory of semantic processing in visual word recognition, which should arguably account for how semantic effects may vary by task. It has been suggested that semantic effects are more directly examined using tasks that explicitly require meaning processing relative to those for which meaning processing is not necessary (e.g., lexical decision task). The purpose of the present study was to chart the processing of concrete versus abstract words in the context of a global co-occurrence variable, semantic neighborhood density (SND), by comparing word recognition response times (RTs) across four tasks varying in explicit semantic demands: standard lexical decision task (with non-pronounceable non-words), go/no-go lexical decision task (with pronounceable non-words), progressive demasking task, and sentence relatedness task. The same experimental stimulus set was used across experiments and consisted of 44 concrete and 44 abstract words, with half of these being low SND, and half being high SND. In this way, concreteness and SND were manipulated in a factorial design using a number of visual word recognition tasks. A consistent RT pattern emerged across tasks, in which SND effects were found for abstract (but not necessarily concrete) words. Ultimately, these findings highlight the importance of studying interactive effects in word recognition, and suggest that linguistic associative information is particularly important for abstract words. PMID:27458422

  20. Specification Reformulation During Specification Validation

    NASA Technical Reports Server (NTRS)

    Benner, Kevin M.

    1992-01-01

    The goal of the ARIES Simulation Component (ASC) is to uncover behavioral errors by 'running' a specification at the earliest possible points during the specification development process. The problems to be overcome are the obvious ones: the specification may be large, incomplete, underconstrained, and/or uncompilable. This paper describes how specification reformulation is used to mitigate these problems. ASC begins by decomposing validation into specific validation questions. Next, the specification is reformulated to abstract out all those features unrelated to the identified validation question, thus creating a new specialized specification. ASC relies on a precise statement of the validation question and a careful application of transformations so as to preserve the essential specification semantics in the resulting specialized specification. This technique is a win if the resulting specialized specification is small enough that the user may easily handle any remaining obstacles to execution. This paper will: (1) describe what a validation question is; (2) outline analysis techniques for identifying what concepts are and are not relevant to a validation question; and (3) identify and apply transformations which remove these less relevant concepts while preserving those which are relevant.

  1. Hemispheric asymmetry of liking for representational and abstract paintings.

    PubMed

    Nadal, Marcos; Schiavi, Susanna; Cattaneo, Zaira

    2017-10-13

    Although the neural correlates of the appreciation of aesthetic qualities have been the target of much research in the past decade, few experiments have explored the hemispheric asymmetries in underlying processes. In this study, we used a divided visual field paradigm to test for hemispheric asymmetries in men's and women's preference for abstract and representational artworks. Both male and female participants liked representational paintings more when presented in the right visual field, whereas preference for abstract paintings was unaffected by presentation hemifield. We hypothesize that this result reflects a facilitation of the sort of visual processes relevant to laypeople's liking for art (specifically, local processing of highly informative object features) when artworks are presented in the right visual field, given the left hemisphere's advantage in processing such features.

  2. Combinatorial neural codes from a mathematical coding theory perspective.

    PubMed

    Curto, Carina; Itskov, Vladimir; Morrison, Katherine; Roth, Zachary; Walker, Judy L

    2013-07-01

    Shannon's seminal 1948 work gave rise to two distinct areas of research: information theory and mathematical coding theory. While information theory has had a strong influence on theoretical neuroscience, ideas from mathematical coding theory have received considerably less attention. Here we take a new look at combinatorial neural codes from a mathematical coding theory perspective, examining the error correction capabilities of familiar receptive field codes (RF codes). We find, perhaps surprisingly, that the high levels of redundancy present in these codes do not support accurate error correction, although the error-correcting performance of receptive field codes catches up to that of random comparison codes when a small tolerance to error is introduced. However, receptive field codes are good at reflecting distances between represented stimuli, while the random comparison codes are not. We suggest that a compromise in error-correcting capability may be a necessary price to pay for a neural code whose structure serves not only error correction, but must also reflect relationships between stimuli.
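
The error-correction capability discussed in the abstract is governed by a code's minimum Hamming distance d: up to (d - 1) // 2 errors can be corrected. A minimal sketch for a set of binary codewords (illustrative strings, not actual receptive field codes):

```python
from itertools import combinations

def min_distance(code):
    """Minimum Hamming distance over all pairs of distinct codewords."""
    return min(sum(x != y for x, y in zip(a, b))
               for a, b in combinations(code, 2))

def correctable_errors(code):
    """A code with minimum distance d corrects up to (d - 1) // 2 errors."""
    return (min_distance(code) - 1) // 2

# Illustrative binary codewords:
code = ["00000", "11100", "00111", "11011"]
print(min_distance(code), correctable_errors(code))  # 3 1
```

A code with high redundancy can still have a small minimum distance, which is the sense in which redundancy alone does not guarantee accurate error correction.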

  3. A Statistical Analysis of IrisCode and Its Security Implications.

    PubMed

    Kong, Adams Wai-Kin

    2015-03-01

    IrisCode has been used to gather iris data for 430 million people. Because of the huge impact of IrisCode, it is vital that it is completely understood. This paper first studies the relationship between bit probabilities and a mean of iris images (The mean of iris images is defined as the average of independent iris images.) and then uses the Chi-square statistic, the correlation coefficient and a resampling algorithm to detect statistical dependence between bits. The results show that the statistical dependence forms a graph with a sparse and structural adjacency matrix. A comparison of this graph with a graph whose edges are defined by the inner product of the Gabor filters that produce IrisCodes shows that partial statistical dependence is induced by the filters and propagates through the graph. Using this statistical information, the security risk associated with two patented template protection schemes that have been deployed in commercial systems for producing application-specific IrisCodes is analyzed. To retain high identification speed, they use the same key to lock all IrisCodes in a database. The belief has been that if the key is not compromised, the IrisCodes are secure. This study shows that even without the key, application-specific IrisCodes can be unlocked and that the key can be obtained through the statistical dependence detected.
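
As an illustration of detecting statistical dependence between bits, the Pearson chi-square statistic for a 2x2 contingency table of two binary features can be computed directly. The counts below are hypothetical, not IrisCode data:

```python
def chi_square_2x2(n00, n01, n10, n11):
    """Pearson chi-square statistic for a 2x2 contingency table of two
    binary features (rows: bit A = 0/1, columns: bit B = 0/1)."""
    n = n00 + n01 + n10 + n11
    row0, row1 = n00 + n01, n10 + n11
    col0, col1 = n00 + n10, n01 + n11
    chi2 = 0.0
    for observed, r, c in ((n00, row0, col0), (n01, row0, col1),
                           (n10, row1, col0), (n11, row1, col1)):
        expected = r * c / n
        chi2 += (observed - expected) ** 2 / expected
    return chi2

# Independent-looking bits give a small statistic; correlated bits a large one.
print(chi_square_2x2(25, 25, 25, 25))  # 0.0
print(chi_square_2x2(45, 5, 5, 45))    # 64.0
```

Applied over all pairs of bits, such a statistic yields the adjacency matrix of the dependence graph described in the abstract.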

  4. Geophysical abstracts 167, October-December 1956

    USGS Publications Warehouse

    Rabbitt, Mary C.; Vitaliano, Dorothy B.; Vesselowsky, S.T.; ,

    1956-01-01

    Geophysical Abstracts includes abstracts of technical papers and books on the physics of the solid earth, the application of physical methods and techniques to geologic problems, and geophysical exploration. The table of contents, which is alphabetically arranged, shows the material covered.Abstracts are prepared only of material that is believed to be generally available. Ordinarily abstracts are not published of material with limited circulation (such as dissertations, open-file reports, or memoranda) or of other papers presented orally at meetings unless summaries of substantial length are published. Abstracts of papers in Japanese and Chinese are based on abstracts or summaries in a western language accompanying the paper.

  5. Geophysical abstracts 164, January-March 1956

    USGS Publications Warehouse

    Rabbitt, Mary C.; Vitaliano, Dorothy B.; Vesselowsky, S.T.; ,

    1956-01-01

    Geophysical Abstracts includes abstracts of technical papers and books on the physics of the solid earth, the application of physical methods and techniques to geologic problems, and geophysical exploration. A new table of contents, alphabetically arranged, has been adapted to show more clearly the material covered.Abstracts are prepared only of material that is believed to be generally available. Ordinarily abstracts are not published of material with limited circulation (such as dissertations, open-file reports, or memoranda) or of papers presented orally at meetings unless summaries of substantial length are published. Abstracts of papers in Japanese and Chinese are based on abstracts or summaries in a western language accompanying the paper.

  6. Geophysical abstracts 166, July-September 1956

    USGS Publications Warehouse

    Rabbitt, Mary C.; Vitaliano, Dorothy B.; Vesselowsky, S.T.; ,

    1956-01-01

    Geophysical Abstracts includes abstracts of technical papers and books on the physics of the solid earth, the application of physical methods and techniques to geologic problems, and geophysical exploration. The table of contents, which is alphabetically arranged, shows the material covered.Abstracts are prepared only of material that is believed to be generally available. Ordinarily abstracts are not published of material with limited circulation (such as dissertations, open-file reports, or memoranda) or of other papers presented orally at meetings unless summaries of substantial length are published. Abstracts of papers in Japanese and Chinese are based on abstracts or summaries in a western language accompanying the paper.

  7. Geophysical abstracts 165, April-June 1956

    USGS Publications Warehouse

    Rabbitt, Mary C.; Vitaliano, Dorothy B.; Vesselowsky, S.T.; ,

    1956-01-01

    Geophysical Abstracts includes abstracts of technical papers and books on the physics of the solid earth, the application of physical methods and techniques to geologic problems, and geophysical exploration. The table of contents, which is alphabetically arranged, shows the material covered.Abstracts are prepared only of material that is believed to be generally available. Ordinarily abstracts are not published of material with limited circulation (such as dissertations, open-file reports, or memoranda) or of other papers presented orally at meetings unless summaries of substantial length are published. Abstracts of papers in Japanese and Chinese are based on abstracts or summaries in a western language accompanying the paper.

  8. Second International Workshop on Software Engineering and Code Design in Parallel Meteorological and Oceanographic Applications

    NASA Technical Reports Server (NTRS)

    O'Keefe, Matthew (Editor); Kerr, Christopher L. (Editor)

    1998-01-01

    This report contains the abstracts and technical papers from the Second International Workshop on Software Engineering and Code Design in Parallel Meteorological and Oceanographic Applications, held June 15-18, 1998, in Scottsdale, Arizona. The purpose of the workshop was to bring together software developers in meteorology and oceanography to discuss software engineering and code design issues for parallel architectures, including Massively Parallel Processors (MPPs), Parallel Vector Processors (PVPs), Symmetric Multi-Processors (SMPs), Distributed Shared Memory (DSM) multi-processors, and clusters. Issues discussed included: (1) code architectures for current parallel models, including basic data structures, storage allocation, variable naming conventions, coding rules and styles, I/O, and pre/post-processing of data; (2) designing modular code; (3) load balancing and domain decomposition; (4) techniques that exploit parallelism efficiently yet hide the machine-related details from the programmer; (5) tools for making the programmer more productive; and (6) the proliferation of programming models (F--, OpenMP, MPI, and HPF).

  9. Highway Safety Program Manual: Volume 6: Codes and Laws.

    ERIC Educational Resources Information Center

    National Highway Traffic Safety Administration (DOT), Washington, DC.

    Volume 6 of the 19-volume Highway Safety Program Manual (which provides guidance to State and local governments on preferred safety practices) concentrates on codes and laws. The purpose and specific objectives of the Codes and Laws Program, Federal authority in the area of highway safety, and policies regarding traffic regulation are described.…

  10. Separability of Abstract-Category and Specific-Exemplar Visual Object Subsystems: Evidence from fMRI Pattern Analysis

    PubMed Central

    McMenamin, Brenton W.; Deason, Rebecca G.; Steele, Vaughn R.; Koutstaal, Wilma; Marsolek, Chad J.

    2014-01-01

    Previous research indicates that dissociable neural subsystems underlie abstract-category (AC) recognition and priming of objects (e.g., cat, piano) and specific-exemplar (SE) recognition and priming of objects (e.g., a calico cat, a different calico cat, a grand piano, etc.). However, the degree of separability between these subsystems is not known, despite the importance of this issue for assessing relevant theories. Visual object representations are widely distributed in visual cortex; thus, a multivariate pattern analysis (MVPA) approach to analyzing functional magnetic resonance imaging (fMRI) data may be critical for assessing the separability of different kinds of visual object processing. Here we examined the neural representations of visual object categories and visual object exemplars using multi-voxel pattern analyses of brain activity elicited in visual object processing areas during a repetition-priming task. In the encoding phase, participants viewed visual objects and the printed names of other objects. In the subsequent test phase, participants identified objects that were either same-exemplar primed, different-exemplar primed, word-primed, or unprimed. In visual object processing areas, classifiers were trained to distinguish same-exemplar primed objects from word-primed objects. Then, the abilities of these classifiers to discriminate different-exemplar primed objects from word-primed objects (reflecting AC priming) and to discriminate same-exemplar primed objects from different-exemplar primed objects (reflecting SE priming) were assessed. Results indicated that (a) repetition priming in occipital-temporal regions is organized asymmetrically, such that AC priming is more prevalent in the left hemisphere and SE priming is more prevalent in the right hemisphere, and (b) AC and SE subsystems are weakly modular, not strongly modular or unified. PMID:25528436
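    The cross-decoding logic at the heart of this MVPA analysis (train a classifier on one contrast, test it on another) can be sketched with synthetic data. The nearest-centroid classifier, the condition means, and the fabricated "voxel patterns" below are illustrative assumptions, not the study's actual pipeline:

```python
import random

def centroid(patterns):
    """Mean voxel pattern across trials."""
    n = len(patterns)
    return [sum(p[i] for p in patterns) / n for i in range(len(patterns[0]))]

def nearest_centroid_accuracy(test_a, test_b, cent_a, cent_b):
    """Fraction of test patterns assigned to the correct class by
    squared Euclidean distance to the training centroids."""
    def dist(p, c):
        return sum((x - y) ** 2 for x, y in zip(p, c))
    correct = 0
    for p in test_a:
        correct += dist(p, cent_a) < dist(p, cent_b)
    for p in test_b:
        correct += dist(p, cent_b) < dist(p, cent_a)
    return correct / (len(test_a) + len(test_b))

# Synthetic "voxel patterns": conditions differ in mean activation plus noise.
random.seed(0)
def make_trials(mean, n=40, voxels=20):
    return [[mean + random.gauss(0, 1) for _ in range(voxels)] for _ in range(n)]

same_exemplar = make_trials(1.0)   # stands in for same-exemplar-primed trials
word_primed   = make_trials(0.0)   # stands in for word-primed trials
diff_exemplar = make_trials(0.8)   # an intermediate, untrained condition

# Train on same-exemplar vs. word-primed, then test cross-decoding on
# different-exemplar vs. held-out word-primed trials.
cent_se, cent_wp = centroid(same_exemplar[:20]), centroid(word_primed[:20])
acc = nearest_centroid_accuracy(diff_exemplar, word_primed[20:], cent_se, cent_wp)
```

    Above-chance transfer to the untrained condition is what, in the paper's terms, reflects AC priming: the learned pattern generalizes across exemplars.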

  11. Energy Research Abstracts. [DOE abstract journal

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    1981-01-01

    Energy Research Abstracts (ERA) provides abstracting and indexing coverage of all scientific and technical reports, journal articles, conference papers and proceedings, books, patents, theses, and monographs originated by the US Department of Energy, its laboratories, energy centers, and contractors. ERA also covers other energy information prepared in report form by federal and state government organizations, foreign governments, and domestic and foreign universities and research organizations. ERA coverage of non-report literature is limited to that generated by Department of Energy activity. ERA is comprehensive in its subject scope, encompassing the DOE's research, development, demonstration, and technological programs resulting from its broad charter for energy sources, conservation, safety, environmental impacts, and regulation. Corporate, author, subject, report number, and contract number indexes are included. ERA is available on an exchange basis to universities, research institutions, industrial firms, and publishers of scientific information. Federal, state, and municipal agencies concerned with energy development, conservation, and usage may obtain ERA free of charge. Inquiries should be directed to the Technical Information Center, P.O. Box 62, Oak Ridge, Tennessee 37830. ERA is available to the public on a subscription basis for 24 semimonthly issues including a semiannual index and an annual index. All citations announced in ERA exist as separate records in the DOE Energy Data Base.

  12. What to do with a Dead Research Code

    NASA Astrophysics Data System (ADS)

    Nemiroff, Robert J.

    2016-01-01

    The project has ended -- should all of the computer codes that enabled the project be deleted? No. Like research papers, research codes typically carry valuable information past project end dates. Several possible end states to the life of research codes are reviewed. Historically, codes are typically left dormant on an increasingly obscure local disk directory until forgotten. These codes will likely become any or all of: lost, impossible to compile and run, difficult to decipher, and likely deleted when the code's proprietor moves on or dies. It is argued here, though, that it would be better for both code authors and astronomy generally if project codes were archived after use in some way. Archiving is advantageous for code authors because archived codes might increase the author's ADS citable publications, while astronomy as a science gains transparency and reproducibility. Paper-specific codes should be included in the publication of the journal papers they support, just like figures and tables. General codes that support multiple papers, possibly written by multiple authors, including their supporting websites, should be registered with a code registry such as the Astrophysics Source Code Library (ASCL). Codes developed on GitHub can be archived with a third party service such as, currently, BackHub. An important code version might be uploaded to a web archiving service like, currently, Zenodo or Figshare, so that this version receives a Digital Object Identifier (DOI), enabling it to be found at a stable address into the future. Similar archiving services that are not DOI-dependent include perma.cc and the Internet Archive Wayback Machine at archive.org. Perhaps most simply, copies of important codes with lasting value might be kept on a cloud service like, for example, Google Drive, while activating Google's Inactive Account Manager.

  13. The Concreteness Effect and the Bilingual Lexicon: The Impact of Visual Stimuli Attachment on Meaning Recall of Abstract L2 Words

    ERIC Educational Resources Information Center

    Farley, Andrew P.; Ramonda, Kris; Liu, Xun

    2012-01-01

    According to the Dual-Coding Theory (Paivio & Desrochers, 1980), words that are associated with rich visual imagery are more easily learned than abstract words due to what is termed the concreteness effect (Altarriba & Bauer, 2004; de Groot, 1992, de Groot et al., 1994; ter Doest & Semin, 2005). The present study examined the effects of attaching…

  14. Interface requirements to couple thermal-hydraulic codes to severe accident codes: ATHLET-CD

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Trambauer, K.

    1997-07-01

    The system code ATHLET-CD is being developed by GRS in cooperation with IKE and IPSN. Its field of application comprises the whole spectrum of leaks and large breaks, as well as operational and abnormal transients for LWRs and VVERs. At present the analyses cover the in-vessel thermal-hydraulics, the early phases of core degradation, as well as fission products and aerosol release from the core and their transport in the Reactor Coolant System. The aim of the code development is to extend the simulation of core degradation up to failure of the reactor pressure vessel and to cover all physically reasonable accident sequences for western and eastern LWRs including RBMKs. The ATHLET-CD structure is highly modular in order to include a manifold spectrum of models and to offer an optimum basis for further development. The code consists of four general modules to describe the reactor coolant system thermal-hydraulics, the core degradation, the fission product core release, and fission product and aerosol transport. Each general module consists of some basic modules which correspond to the process to be simulated or to its specific purpose. Besides the code structure based on the physical modelling, the code follows four strictly separated steps during the course of a calculation: (1) input of structure, geometrical data, initial and boundary condition, (2) initialization of derived quantities, (3) steady state calculation or input of restart data, and (4) transient calculation. In this paper, the transient solution method is briefly presented and the coupling methods are discussed. Three aspects have to be considered for the coupling of different modules in one code system. First is the conservation of masses and energy in the different subsystems as there are fluid, structures, and fission products and aerosols. Second is the convergence of the numerical solution and stability of the calculation. The third aspect is related to the code performance, and running

  15. Simulation of Code Spectrum and Code Flow of Cultured Neuronal Networks.

    PubMed

    Tamura, Shinichi; Nishitani, Yoshi; Hosokawa, Chie; Miyoshi, Tomomitsu; Sawai, Hajime

    2016-01-01

    It has been shown that, in cultured neuronal networks on a multielectrode, pseudorandom-like sequences (codes) are detected, and they flow with some spatial decay constant. Each cultured neuronal network is characterized by a specific spectrum curve. That is, we may consider the spectrum curve as a "signature" of its associated neuronal network that is dependent on the characteristics of neurons and network configuration, including the weight distribution. In the present study, we used an integrate-and-fire model of neurons with intrinsic and instantaneous fluctuations of characteristics for performing a simulation of a code spectrum from multielectrodes on a 2D mesh neural network. We showed that it is possible to estimate the characteristics of neurons such as the distribution of number of neurons around each electrode and their refractory periods. Although this is an inverse problem and the solutions are not theoretically guaranteed, the estimated parameters seem to be consistent with those of the neurons. That is, the proposed neural network model may adequately reflect the behavior of a cultured neuronal network. Furthermore, we discuss the prospect that code analysis will provide a basis for communication within a neural network, which may in turn form a basis of natural intelligence.
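    The integrate-and-fire framework mentioned above can be illustrated with a minimal sketch. This is a generic leaky integrate-and-fire neuron with hypothetical parameters, not the authors' model (which adds intrinsic and instantaneous fluctuations of neuron characteristics):

```python
def simulate_lif(input_current, dt=1e-3, tau=0.02, v_rest=0.0,
                 v_thresh=1.0, v_reset=0.0, t_ref=0.002):
    """Leaky integrate-and-fire neuron (Euler integration).

    Returns the spike times (in seconds) produced by the input current,
    one input sample per time step dt.
    """
    v = v_rest
    refractory_until = -1.0
    spikes = []
    for step, i_in in enumerate(input_current):
        t = step * dt
        if t < refractory_until:
            continue  # membrane held at reset during the refractory period
        # Euler step of tau * dv/dt = -(v - v_rest) + i_in
        v += dt * (-(v - v_rest) + i_in) / tau
        if v >= v_thresh:
            spikes.append(t)
            v = v_reset
            refractory_until = t + t_ref
    return spikes

# Constant suprathreshold drive yields regular spiking.
spike_times = simulate_lif([2.0] * 1000)  # 1 s of input at 1 ms resolution
```

    Feeding such model neurons noisy, fluctuating inputs (rather than a constant drive) is what produces the pseudorandom-like spike sequences the abstract refers to as codes.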

  16. Discussion on LDPC Codes and Uplink Coding

    NASA Technical Reports Server (NTRS)

    Andrews, Ken; Divsalar, Dariush; Dolinar, Sam; Moision, Bruce; Hamkins, Jon; Pollara, Fabrizio

    2007-01-01

    This slide presentation reviews the progress of the workgroup on Low-Density Parity-Check (LDPC) codes for space link coding. The workgroup is tasked with developing and recommending new error correcting codes for near-Earth, Lunar, and deep space applications. Included in the presentation is a summary of the technical progress of the workgroup. Charts that show the LDPC decoder sensitivity to symbol scaling errors are reviewed, as well as a chart showing the performance of several frame synchronizer algorithms compared to that of some good codes and LDPC decoder tests at ESTL. Also reviewed is a study on Coding, Modulation, and Link Protocol (CMLP), and the recommended codes. A design for the Pseudo-Randomizer with LDPC Decoder and CRC is also reviewed. A chart that summarizes the three proposed coding systems is also presented.
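    The error-correction principle behind LDPC codes can be shown at toy scale with the (7,4) Hamming code. Real LDPC codes use much larger, sparse parity-check matrices and iterative belief-propagation decoding, but the parity-check/syndrome machinery sketched here is the same underlying idea:

```python
# Parity-check matrix H for the (7,4) Hamming code. A received word r is a
# valid codeword iff its syndrome H @ r (mod 2) is the zero vector; for a
# single bit flip, the syndrome directly encodes the error position because
# column j of H is the binary expansion of j+1.
H = [
    [1, 0, 1, 0, 1, 0, 1],
    [0, 1, 1, 0, 0, 1, 1],
    [0, 0, 0, 1, 1, 1, 1],
]

def syndrome(r):
    return tuple(sum(h * b for h, b in zip(row, r)) % 2 for row in H)

def correct_single_error(r):
    """Correct at most one flipped bit, reading the syndrome as a position."""
    s = syndrome(r)
    pos = s[0] + 2 * s[1] + 4 * s[2]
    r = list(r)
    if pos:
        r[pos - 1] ^= 1
    return r

codeword = [0, 1, 1, 0, 0, 1, 1]   # a valid codeword: syndrome is (0, 0, 0)
received = codeword[:]
received[4] ^= 1                   # the channel flips one bit
decoded = correct_single_error(received)
```

    LDPC decoders generalize this: many overlapping low-density parity checks are satisfied jointly by passing probabilistic messages between check and variable nodes until the syndrome is zero.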

  17. Complexity control algorithm based on adaptive mode selection for interframe coding in high efficiency video coding

    NASA Astrophysics Data System (ADS)

    Chen, Gang; Yang, Bing; Zhang, Xiaoyun; Gao, Zhiyong

    2017-07-01

    The latest high efficiency video coding (HEVC) standard significantly increases the encoding complexity for improving its coding efficiency. Due to the limited computational capability of handheld devices, complexity constrained video coding has drawn great attention in recent years. A complexity control algorithm based on adaptive mode selection is proposed for interframe coding in HEVC. Considering the direct proportionality between encoding time and computational complexity, the computational complexity is measured in terms of encoding time. First, complexity is mapped to a target in terms of prediction modes. Then, an adaptive mode selection algorithm is proposed for the mode decision process. Specifically, the optimal mode combination scheme that is chosen through offline statistics is developed at low complexity. If the complexity budget has not been used up, an adaptive mode sorting method is employed to further improve coding efficiency. The experimental results show that the proposed algorithm achieves a very large complexity control range (as low as 10%) for the HEVC encoder while maintaining good rate-distortion performance. For the low-delay P condition, compared with the direct resource allocation method and the state-of-the-art method, an average gain of 0.63 and 0.17 dB in BDPSNR is observed for 18 sequences when the target complexity is around 40%.
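    The budget-driven mode selection described above can be caricatured as a greedy knapsack over candidate prediction modes: enable the modes with the best coding gain per unit of encoding time until the time budget is spent. The mode names, timings, and gains below are invented for illustration; the paper's actual algorithm adapts its mode set online per coding unit:

```python
def select_modes(mode_stats, time_budget):
    """Greedy sketch of budget-constrained mode selection.

    Ranks modes by (rate-distortion gain) / (encoding time) and enables
    them until the encoding-time budget would be exceeded.
    """
    ranked = sorted(mode_stats, key=lambda m: m["gain"] / m["time"], reverse=True)
    chosen, spent = [], 0.0
    for mode in ranked:
        if spent + mode["time"] <= time_budget:
            chosen.append(mode["name"])
            spent += mode["time"]
    return chosen, spent

# Hypothetical per-mode statistics (normalized time, dB-like gain).
modes = [
    {"name": "merge",       "time": 0.10, "gain": 0.50},
    {"name": "inter_2Nx2N", "time": 0.30, "gain": 0.90},
    {"name": "inter_NxN",   "time": 0.45, "gain": 0.40},
    {"name": "intra",       "time": 0.15, "gain": 0.30},
]
chosen, spent = select_modes(modes, time_budget=0.6)
```

    With a 60% budget the cheap, high-value modes are kept and the expensive, low-value partition is dropped, which is the intuition behind controlling complexity down to a fraction of the full encoder.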

  18. On transform coding tools under development for VP10

    NASA Astrophysics Data System (ADS)

    Parker, Sarah; Chen, Yue; Han, Jingning; Liu, Zoe; Mukherjee, Debargha; Su, Hui; Wang, Yongzhe; Bankoski, Jim; Li, Shunyao

    2016-09-01

    Google started the WebM Project in 2010 to develop open source, royalty-free video codecs designed specifically for media on the Web. The second generation codec released by the WebM project, VP9, is currently served by YouTube, and enjoys billions of views per day. Realizing the need for even greater compression efficiency to cope with the growing demand for video on the web, the WebM team embarked on an ambitious project to develop a next edition codec, VP10, that achieves at least a generational improvement in coding efficiency over VP9. Starting from VP9, a set of new experimental coding tools have already been added to VP10 to achieve decent coding gains. Subsequently, Google joined a consortium of major tech companies called the Alliance for Open Media to jointly develop a new codec AV1. As a result, the VP10 effort is largely expected to merge with AV1. In this paper, we focus primarily on new tools in VP10 that improve coding of the prediction residue using transform coding techniques. Specifically, we describe tools that increase the flexibility of available transforms, allowing the codec to handle a more diverse range of residue structures. Results are presented on a standard test set.

  19. Co-operative Educational Abstracting Service (CEAS). [Abstract Series No. 1-4, 1969-1971].

    ERIC Educational Resources Information Center

    International Bureau of Education, Geneva (Switzerland).

    This document is a compilation of 163 English-language abstracts concerning various aspects of education in Australia, Brazil, Bulgaria, Denmark, Finland, France, Hungary, Iceland, India, Israel, Japan, Mexico, Nigeria, Philippines, Thailand, UAR, U.S., USSR, and Yugoslavia. The abstracts are informative in nature and are approximately 1,500 words…

  20. Mechanisms for Aiding Worker Adjustment to Technological Change: Volume 2: Concepts, Review of the Literature, Abstracts.

    ERIC Educational Resources Information Center

    Utah Univ., Salt Lake City. Human Resources Inst.

    Volume 2, which accompanies "Mechanisms for Aiding Worker Adjustment to Technological Change, Volume 1," consists of a key word index for locating specific topics and the abstracts of literature reviewed in Volume 1. Key words, referring to aspects of worker adjustment to technological change appearing in the abstracted literature, are grouped…

  1. A Comprehensive Approach to Convert a Radiology Department From Coding Based on International Classification of Diseases, Ninth Revision, to Coding Based on International Classification of Diseases, Tenth Revision.

    PubMed

    McBee, Morgan P; Laor, Tal; Pryor, Rebecca M; Smith, Rachel; Hardin, Judy; Ulland, Lisa; May, Sally; Zhang, Bin; Towbin, Alexander J

    2018-02-01

    The purpose of this study was to adapt our radiology reports to provide the documentation required for specific International Classification of Diseases, Tenth Revision (ICD-10) diagnosis coding. Baseline data were analyzed to identify the reports with the greatest number of unspecified ICD-10 codes assigned by computer-assisted coding software. A two-part quality improvement initiative was subsequently implemented. The first component involved improving clinical histories by utilizing technologists to obtain information directly from the patients or caregivers, which was then imported into the radiologist's report within the speech recognition software. The second component involved standardization of report terminology and creation of four different structured report templates to determine which yielded the fewest reports with an unspecified ICD-10 code assigned by an automated coding engine. In all, 12,077 reports were included in the baseline analysis. Of these, 5,151 (43%) had an unspecified ICD-10 code. The majority of deficient reports were for radiographs (n = 3,197; 62%). Inadequacies included insufficient clinical history provided and lack of detailed fracture descriptions. Therefore, the focus was standardizing terminology and testing different structured reports for radiographs obtained for fractures. At baseline, 58% of radiography reports contained a complete clinical history with improvement to >95% 8 months later. The total number of reports that contained an unspecified ICD-10 code improved from 43% at baseline to 27% at completion of this study (P < .0001). The number of radiology studies with a specific ICD-10 code can be improved through quality improvement methodology, specifically through the use of technologist-acquired clinical histories and structured reporting. Copyright © 2017 American College of Radiology. Published by Elsevier Inc. All rights reserved.

  2. Letter position coding across modalities: the case of Braille readers.

    PubMed

    Perea, Manuel; García-Chamorro, Cristina; Martín-Suesta, Miguel; Gómez, Pablo

    2012-01-01

    The question of how the brain encodes letter position in written words has attracted increasing attention in recent years. A number of models have recently been proposed to accommodate the fact that transposed-letter stimuli like jugde or caniso are perceptually very close to their base words. Here we examined how letter position coding is attained in the tactile modality via Braille reading. The idea is that Braille word recognition may provide more serial processing than the visual modality, and this may produce differences in the input coding schemes employed to encode letters in written words. To that end, we conducted a lexical decision experiment with adult Braille readers in which the pseudowords were created by transposing/replacing two letters. We found a word-frequency effect for words. In addition, unlike parallel experiments in the visual modality, we failed to find any clear signs of transposed-letter confusability effects. This dissociation highlights the differences between modalities. The present data argue against models of letter position coding that assume that transposed-letter effects (in the visual modality) occur at a relatively late, abstract locus.

  3. Designing for Mathematical Abstraction

    ERIC Educational Resources Information Center

    Pratt, Dave; Noss, Richard

    2010-01-01

    Our focus is on the design of systems (pedagogical, technical, social) that encourage mathematical abstraction, a process we refer to as "designing for abstraction." In this paper, we draw on detailed design experiments from our research on children's understanding about chance and distribution to re-present this work as a case study in designing…

  4. Practices in Code Discoverability: Astrophysics Source Code Library

    NASA Astrophysics Data System (ADS)

    Allen, A.; Teuben, P.; Nemiroff, R. J.; Shamir, L.

    2012-09-01

    Here we describe the Astrophysics Source Code Library (ASCL), which takes an active approach to sharing astrophysics source code. ASCL's editor seeks out both new and old peer-reviewed papers that describe methods or experiments that involve the development or use of source code, and adds entries for the found codes to the library. This approach ensures that source codes are added without requiring authors to actively submit them, resulting in a comprehensive listing that covers a significant number of the astrophysics source codes used in peer-reviewed studies. The ASCL now contains over 340 codes and continues to grow. In 2011, the ASCL added an average of 19 codes per month. An advisory committee has been established to provide input and guide the development and expansion of the new site, and a marketing plan has been developed and is being executed. All ASCL source codes have been used to generate results published in or submitted to a refereed journal and are freely available either via a download site or from an identified source. This paper provides the history and description of the ASCL. It lists the requirements for including codes, examines the advantages of the ASCL, and outlines some of its future plans.

  5. Coding for effective denial management.

    PubMed

    Miller, Jackie; Lineberry, Joe

    2004-01-01

    Nearly everyone will agree that accurate and consistent coding of diagnoses and procedures is the cornerstone for operating a compliant practice. The CPT or HCPCS procedure code tells the payor what service was performed and also (in most cases) determines the amount of payment. The ICD-9-CM diagnosis code, on the other hand, tells the payor why the service was performed. If the diagnosis code does not meet the payor's criteria for medical necessity, all payment for the service will be denied. Implementation of an effective denial management program can help "stop the bleeding." Denial management is a comprehensive process that works in two ways. First, it evaluates the cause of denials and takes steps to prevent them. Second, denial management creates specific procedures for refiling or appealing claims that are initially denied. Accurate, consistent and compliant coding is key to both of these functions. The process of proactively managing claim denials also reveals a practice's administrative strengths and weaknesses, enabling radiology business managers to streamline processes, eliminate duplicated efforts and shift a larger proportion of the staff's focus from paperwork to servicing patients--all of which are sure to enhance operations and improve practice management and office morale. Accurate coding requires a program of ongoing training and education in both CPT and ICD-9-CM coding. Radiology business managers must make education a top priority for their coding staff. Front office staff, technologists and radiologists should also be familiar with the types of information needed for accurate coding. A good staff training program will also cover the proper use of Advance Beneficiary Notices (ABNs). Registration and coding staff should understand how to determine whether the patient's clinical history meets criteria for Medicare coverage, and how to administer an ABN if the exam is likely to be denied. Staff should also understand the restrictions on use of

  6. Using ClinicalTrials.gov to supplement information in ophthalmology conference abstracts about trial outcomes: a comparison study.

    PubMed

    Scherer, Roberta W; Huynh, Lynn; Ervin, Ann-Margret; Dickersin, Kay

    2015-01-01

    Including results from unpublished randomized controlled trials (RCTs) in a systematic review may ameliorate the effect of publication bias in systematic review results. Unpublished RCTs are sometimes described in abstracts presented at conferences, included in trials registers, or both. Trial results may not be available in a trials register and abstracts describing RCT results often lack study design information. Complementary information from a trials register record may be sufficient to allow reliable inclusion of an unpublished RCT only available as an abstract in a systematic review. We identified 496 abstracts describing RCTs presented at the 2007 to 2009 Association for Research in Vision and Ophthalmology (ARVO) meetings; 154 RCTs were registered in ClinicalTrials.gov. Two persons extracted verbatim primary and non-primary outcomes reported in the abstract and ClinicalTrials.gov record. We compared each abstract outcome with all ClinicalTrials.gov outcomes and coded matches as complete, partial, or no match. We identified 800 outcomes in 152 abstracts (95 primary [51 abstracts] and 705 [141 abstracts] non-primary outcomes). No outcomes were reported in 2 abstracts. Of 95 primary outcomes, 17 (18%) agreed completely, 53 (56%) partially, and 25 (26%) had no match with a ClinicalTrials.gov primary or non-primary outcome. Among 705 non-primary outcomes, 56 (8%) agreed completely, 205 (29%) agreed partially, and 444 (63%) had no match with a ClinicalTrials.gov primary or non-primary outcome. Among the 258 outcomes partially agreeing, we found additional information on the time when the outcome was measured more often in ClinicalTrials.gov than in the abstract (141/258 (55%) versus 55/258 (21%)). We found no association between the presence of non-matching "new" outcomes and year of registration, time to registry update, industry sponsorship, or multi-center status. Conference abstracts may be a valuable source of information about results for outcomes of
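    The three-way outcome comparison this study used (complete / partial / no match) could be approximated programmatically, for instance by token overlap. The study's coding was done manually by two extractors, and the outcome strings below are hypothetical; this is only a rough sketch of the classification scheme:

```python
def classify_match(abstract_outcome, registry_outcome):
    """Token-overlap heuristic for the complete / partial / no-match coding.

    Two outcome descriptions match completely if they contain the same
    words, partially if they share any word, and not at all otherwise.
    """
    a = set(abstract_outcome.lower().split())
    b = set(registry_outcome.lower().split())
    if a == b:
        return "complete"
    if a & b:
        return "partial"
    return "no match"

# Hypothetical abstract-vs-registry outcome pairs, for illustration only.
examples = [
    classify_match("visual acuity", "Visual acuity"),
    classify_match("visual acuity at 12 months", "visual acuity"),
    classify_match("intraocular pressure", "retinal thickness"),
]
```

    The second example shows why partial matches dominated in the study: registry and abstract often name the same measure but differ on qualifiers such as the measurement time point.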

  7. Using ClinicalTrials.gov to Supplement Information in Ophthalmology Conference Abstracts about Trial Outcomes: A Comparison Study

    PubMed Central

    Scherer, Roberta W.; Huynh, Lynn; Ervin, Ann-Margret; Dickersin, Kay

    2015-01-01

    Background Including results from unpublished randomized controlled trials (RCTs) in a systematic review may ameliorate the effect of publication bias in systematic review results. Unpublished RCTs are sometimes described in abstracts presented at conferences, included in trials registers, or both. Trial results may not be available in a trials register and abstracts describing RCT results often lack study design information. Complementary information from a trials register record may be sufficient to allow reliable inclusion of an unpublished RCT only available as an abstract in a systematic review. Methods We identified 496 abstracts describing RCTs presented at the 2007 to 2009 Association for Research in Vision and Ophthalmology (ARVO) meetings; 154 RCTs were registered in ClinicalTrials.gov. Two persons extracted verbatim primary and non-primary outcomes reported in the abstract and ClinicalTrials.gov record. We compared each abstract outcome with all ClinicalTrials.gov outcomes and coded matches as complete, partial, or no match. Results We identified 800 outcomes in 152 abstracts (95 primary [51 abstracts] and 705 [141 abstracts] non-primary outcomes). No outcomes were reported in 2 abstracts. Of 95 primary outcomes, 17 (18%) agreed completely, 53 (56%) partially, and 25 (26%) had no match with a ClinicalTrials.gov primary or non-primary outcome. Among 705 non-primary outcomes, 56 (8%) agreed completely, 205 (29%) agreed partially, and 444 (63%) had no match with a ClinicalTrials.gov primary or non-primary outcome. Among the 258 outcomes partially agreeing, we found additional information on the time when the outcome was measured more often in ClinicalTrials.gov than in the abstract (141/258 (55%) versus 55/258 (21%)). We found no association between the presence of non-matching “new” outcomes and year of registration, time to registry update, industry sponsorship, or multi-center status. Conclusion Conference abstracts may be a valuable source of

  8. Scientific meeting abstracts: significance, access, and trends.

    PubMed Central

    Kelly, J A

    1998-01-01

    Abstracts of scientific papers and posters that are presented at annual scientific meetings of professional societies are part of the broader category of conference literature. They are an important avenue for the dissemination of current data. While timely and succinct, these abstracts present problems such as an abbreviated peer review and incomplete bibliographic access. METHODS: Seventy societies of health sciences professionals were surveyed about the publication of abstracts from their annual meetings. Nineteen frequently cited journals also were contacted about their policies on the citation of meeting abstracts. Ten databases were searched for the presence of meetings abstracts. RESULTS: Ninety percent of the seventy societies publish their abstracts, with nearly half appearing in the society's journal. Seventy-seven percent of the societies supply meeting attendees with a copy of each abstract, and 43% make their abstracts available in an electronic format. Most of the journals surveyed allow meeting abstracts to be cited. Bibliographic access to these abstracts does not appear to be widespread. CONCLUSIONS: Meeting abstracts play an important role in the dissemination of scientific knowledge. Bibliographic access to meeting abstracts is very limited. The trend toward making meeting abstracts available via the Internet has the potential to give a broader audience access to the information they contain. PMID:9549015

  9. Auto Code Generation for Simulink-Based Attitude Determination Control System

    NASA Technical Reports Server (NTRS)

    MolinaFraticelli, Jose Carlos

    2012-01-01

    This paper details the work done to auto generate C code from a Simulink-Based Attitude Determination Control System (ADCS) to be used in target platforms. NASA Marshall Engineers have developed an ADCS Simulink simulation to be used as a component for the flight software of a satellite. This generated code can be used for carrying out Hardware in the loop testing of components for a satellite in a convenient manner with easily tunable parameters. Due to the nature of the embedded hardware components such as microcontrollers, this simulation code cannot be used directly, as it is, on the target platform and must first be converted into C code; this process is known as auto code generation. To generate C code from this simulation, it must be modified to follow specific standards set in place by the auto code generation process. Some of these modifications include changing certain simulation models into their atomic representations, which can bring new complications into the simulation. The execution order of these models can change based on these modifications. Great care must be taken in order to maintain a working simulation that can also be used for auto code generation. After modifying the ADCS simulation for the auto code generation process, it is shown that the difference between the output data of the former and that of the latter is within acceptable bounds. Thus, it can be said that the process is a success since all the output requirements are met. Based on these results, it can be argued that this generated C code can be effectively used by any desired platform as long as it follows the specific memory requirements established in the Simulink Model.

  10. Foundations of the Bandera Abstraction Tools

    NASA Technical Reports Server (NTRS)

    Hatcliff, John; Dwyer, Matthew B.; Pasareanu, Corina S.; Robby

    2003-01-01

    Current research is demonstrating that model-checking and other forms of automated finite-state verification can be effective for checking properties of software systems. Due to the exponential costs associated with model-checking, multiple forms of abstraction are often necessary to obtain system models that are tractable for automated checking. The Bandera Tool Set provides multiple forms of automated support for compiling concurrent Java software systems to models that can be supplied to several different model-checking tools. In this paper, we describe the foundations of Bandera's data abstraction mechanism which is used to reduce the cardinality (and the program's state-space) of data domains in software to be model-checked. From a technical standpoint, the form of data abstraction used in Bandera is simple, and it is based on classical presentations of abstract interpretation. We describe the mechanisms that Bandera provides for declaring abstractions, for attaching abstractions to programs, and for generating abstracted programs and properties. The contributions of this work are the design and implementation of various forms of tool support required for effective application of data abstraction to software components written in a programming language like Java which has a rich set of linguistic features.
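    The kind of data abstraction described, reducing the cardinality of data domains along the lines of classical abstract interpretation, can be illustrated with a toy "signs" domain (an illustrative sketch, not Bandera's actual mechanism or API):

```python
# Classic "signs" data abstraction in the style of abstract interpretation.
# Each concrete integer maps to one of three abstract tokens, shrinking
# the state space a model checker must explore.
def alpha(n):
    """Abstraction function: map a concrete int to its sign token."""
    if n < 0:
        return "NEG"
    if n == 0:
        return "ZERO"
    return "POS"

def abstract_add(a, b):
    """Abstract addition over sign tokens; 'TOP' means 'any sign'."""
    if a == "ZERO":
        return b
    if b == "ZERO":
        return a
    if a == b:        # NEG+NEG stays NEG, POS+POS stays POS
        return a
    return "TOP"      # NEG+POS could have any sign

print(abstract_add(alpha(-7), alpha(-2)))  # NEG
print(abstract_add(alpha(3), alpha(-3)))   # TOP
```

    An abstracted program replaces concrete operations with their abstract counterparts, at the cost of the imprecision represented by the TOP element.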

  11. Control of transcription of the Bacillus subtilis spoIIIG gene, which codes for the forespore-specific transcription factor sigma G.

    PubMed

    Sun, D X; Cabrera-Martinez, R M; Setlow, P

    1991-05-01

    The Bacillus subtilis spoIIIG gene codes for a sigma factor termed sigma G which directs transcription of genes expressed only in the forespore compartment of the sporulating cell. Use of spoIIIG-lacZ transcriptional fusions showed that spoIIIG is cotranscribed with the spoIIG operon beginning at t0.5-1 of sporulation. However, this large mRNA produced little if any sigma G, and transferring the spoIIIG gene without the spoIIG promoter into the amyE locus resulted in a Spo+ phenotype. Significant translation of spoIIIG began at t2.5-3 with use of an mRNA whose 5' end is just upstream of the spoIIIG coding sequence. Synthesis of this spoIIIG-specific mRNA was not abolished by a deletion in spoIIIG itself. Similar results were obtained when a spoIIIG-lacZ translational fusion lacking the spoIIG promoter was integrated at the amyE locus. These data suggest that synthesis of sigma G is dependent neither on transcription from the spoIIG promoter nor on sigma G itself but can be due to another transcription factor. This transcription factor may be sigma F, the product of the spoIIAC locus, since a spoIIAC mutation blocked spoIIIG expression, and sequences upstream of the 5' end of the spoIIIG-specific mRNA agree well with the recognition sequence for sigma F. RNA polymerase containing sigma F (E sigma F) initiated transcription in vitro on a spoIIIG template at the 5' end found in vivo, as did E sigma G. However, E sigma F showed a greater than 20-fold preference for spoIIIG over a known sigma G-dependent gene compared with the activity of E sigma G.

  12. Is It Really Abstract?

    ERIC Educational Resources Information Center

    Kernan, Christine

    2011-01-01

    For this author, one of the most enjoyable aspects of teaching elementary art is the willingness of students to embrace the different styles of art introduced to them. In this article, she describes a project that allows upper-elementary students to learn about abstract art and the lives of some of the master abstract artists, implement the idea…

  13. New quantum codes constructed from quaternary BCH codes

    NASA Astrophysics Data System (ADS)

    Xu, Gen; Li, Ruihu; Guo, Luobin; Ma, Yuena

    2016-10-01

    In this paper, we first study the construction of new quantum error-correcting codes (QECCs) from three classes of quaternary imprimitive BCH codes. As a result, the improved maximal designed distances of these narrow-sense imprimitive Hermitian dual-containing quaternary BCH codes are determined to be much larger than the results given by Aly et al. (IEEE Trans Inf Theory 53:1183-1188, 2007) for each code length. Thus, families of new QECCs are obtained, and the constructed QECCs have larger distance than those in the previous literature. Secondly, we apply a combinatorial construction to the imprimitive BCH codes and their corresponding primitive counterparts and construct many new linear quantum codes with good parameters, some of which have parameters exceeding the finite Gilbert-Varshamov bound for linear quantum codes.
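    The step from Hermitian dual-containing BCH codes to QECCs rests on the standard Hermitian construction, which can be stated as follows (a well-known general result, not a contribution specific to this paper):

```latex
% Hermitian construction: a Hermitian dual-containing linear code over
% GF(4) yields a binary quantum stabilizer code.
C \subseteq \mathrm{GF}(4)^n,\quad
C^{\perp_H} \subseteq C,\quad
C \text{ an } [n,k,d]_4 \text{ code}
\;\Longrightarrow\;
\exists\;\text{QECC with parameters } [[\,n,\; 2k-n,\; \geq d\,]].
```

    Hence enlarging the designed distance $d$ for which the dual-containing condition still holds directly enlarges the distance of the resulting quantum codes.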

  14. Water Pollution Abstracts. Volume 43, Number 4, Abstracts 645-849.

    DTIC Science & Technology

    WATER POLLUTION, *ABSTRACTS, PURIFICATION, WASTES(INDUSTRIAL), CONTROL, SEWAGE, WATER SUPPLIES, PUBLIC HEALTH, PETROLEUM PRODUCTS, DEGRADATION, DAMS...ESTUARIES, PLANKTON, PHOTOSYNTHESIS, VIRUSES, SEA WATER, MICROBIOLOGY, UNITED KINGDOM.

  15. Intrasystem Analysis Program (IAP) code summaries

    NASA Astrophysics Data System (ADS)

    Dobmeier, J. J.; Drozd, A. L. S.; Surace, J. A.

    1983-05-01

    This report contains detailed descriptions and capabilities of the codes that comprise the Intrasystem Analysis Program. The four codes are: Intrasystem Electromagnetic Compatibility Analysis Program (IEMCAP), General Electromagnetic Model for the Analysis of Complex Systems (GEMACS), Nonlinear Circuit Analysis Program (NCAP), and Wire Coupling Prediction Models (WIRE). IEMCAP is used for computer-aided evaluation of electromagnetic compatibility (EMC) at all stages of an Air Force system's life cycle, applicable to aircraft, space/missile, and ground-based systems. GEMACS utilizes a Method of Moments (MOM) formalism with the Electric Field Integral Equation (EFIE) for the solution of electromagnetic radiation and scattering problems. The code employs both full matrix decomposition and Banded Matrix Iteration solution techniques and is expressly designed for large problems. NCAP is a circuit analysis code which uses the Volterra approach to solve for the transfer functions and node voltages of weakly nonlinear circuits. The WIRE programs apply multiconductor transmission line theory to the prediction of cable coupling for specific classes of problems.

  16. Is the Motor System Necessary for Processing Action and Abstract Emotion Words? Evidence from Focal Brain Lesions

    PubMed Central

    Dreyer, Felix R.; Frey, Dietmar; Arana, Sophie; von Saldern, Sarah; Picht, Thomas; Vajkoczy, Peter; Pulvermüller, Friedemann

    2015-01-01

    Neuroimaging and neuropsychological experiments suggest that modality-preferential cortices, including motor- and somatosensory areas, contribute to the semantic processing of action related concrete words. Still, a possible role of sensorimotor areas in processing abstract meaning remains under debate. Recent fMRI studies indicate an involvement of the left sensorimotor cortex in the processing of abstract-emotional words (e.g., “love”) which resembles activation patterns seen for action words. But are the activated areas indeed necessary for processing action-related and abstract words? The current study now investigates word processing in two patients suffering from focal brain lesion in the left frontocentral motor system. A speeded Lexical Decision Task on meticulously matched word groups showed that the recognition of nouns from different semantic categories – related to food, animals, tools, and abstract-emotional concepts – was differentially affected. Whereas patient HS with a lesion in dorsolateral central sensorimotor systems next to the hand area showed a category-specific deficit in recognizing tool words, patient CA suffering from lesion centered in the left supplementary motor area was primarily impaired in abstract-emotional word processing. These results point to a causal role of the motor cortex in the semantic processing of both action-related object concepts and abstract-emotional concepts and therefore suggest that the motor areas previously found active in action-related and abstract word processing can serve a meaning-specific necessary role in word recognition. The category-specific nature of the observed dissociations is difficult to reconcile with the idea that sensorimotor systems are somehow peripheral or ‘epiphenomenal’ to meaning and concept processing. Rather, our results are consistent with the claim that cognition is grounded in action and perception and based on distributed action perception circuits reaching into

  17. The study on dynamic cadastral coding rules based on kinship relationship

    NASA Astrophysics Data System (ADS)

    Xu, Huan; Liu, Nan; Liu, Renyi; Lu, Jingfeng

    2007-06-01

    Cadastral coding rules are an important supplement to the existing national and local standard specifications for building a cadastral database. After analyzing the course of cadastral change, especially parcel change, with the method of object-oriented analysis, a set of dynamic cadastral coding rules based on kinship relationships corresponding to cadastral change is put forward, and a coding format composed of street code, block code, father parcel code, child parcel code and grandchild parcel code is worked out within the county administrative area. The coding rules have been applied in the development of an urban cadastral information system called "ReGIS", which is not only able to figure out the cadastral code automatically according to both the type of parcel change and the coding rules, but is also capable of checking whether the code is spatiotemporally unique before the parcel is stored in the database. The system has been used in several cities of Zhejiang Province and has received a favorable response, which verifies the feasibility and effectiveness of the coding rules to some extent.
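    The five-part code format described above can be sketched in a few lines (the field widths, the flat key-value database, and all example values are assumptions for illustration, not taken from ReGIS or the paper):

```python
# Hypothetical sketch of the five-part cadastral code: street + block +
# father parcel + child parcel + grandchild parcel, as fixed-width fields.
def make_cadastral_code(street, block, father, child, grandchild):
    """Concatenate the five components into one fixed-width code string."""
    return f"{street:03d}{block:03d}{father:04d}{child:02d}{grandchild:02d}"

def is_unique(code, database):
    """Check the code is not already present before storing the parcel."""
    return code not in database

db = {"00100200130100": None}  # toy database keyed by cadastral code
code = make_cadastral_code(street=1, block=2, father=13, child=1, grandchild=0)
print(code, is_unique(code, db))  # the code collides with the stored parcel
```

    A real system would also encode the type of parcel change when deriving child and grandchild numbers, as the abstract describes.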

  18. Top 10 Tips for Using Advance Care Planning Codes in Palliative Medicine and Beyond

    PubMed Central

    Acevedo, Jean; Bull, Janet; Kamal, Arif H.

    2016-01-01

    Abstract Although recommended for all persons with serious illness, advance care planning (ACP) has historically been a charitable clinical service. Inadequate or unreliable provisions for reimbursement, among other barriers, have spurred a gap between the evidence demonstrating the importance of timely ACP and recognition by payers for its delivery.1 For the first time, healthcare is experiencing a dramatic shift in billing codes that support increased care management and care coordination. ACP, chronic care management, and transitional care management codes are examples of this newer recognition of the value of these types of services. ACP discussions are an integral component of comprehensive, high-quality palliative care delivery. The advent of reimbursement mechanisms to recognize these services has an enormous potential to impact palliative care program sustainability and growth. In this article, we highlight 10 tips to effectively using the new ACP codes reimbursable under Medicare. The importance of documentation, proper billing, and nuances regarding coding is addressed. PMID:27682147

  19. Verification of Gyrokinetic codes: theoretical background and applications

    NASA Astrophysics Data System (ADS)

    Tronko, Natalia

    2016-10-01

    In fusion plasmas, the strong magnetic field allows the fast gyromotion to be systematically removed from the description of the dynamics, resulting in a considerable model simplification and gain in computational time. Nowadays, gyrokinetic (GK) codes play a major role in the understanding of the development and saturation of turbulence and in the prediction of the consequent transport. We present a new and generic theoretical framework and specific numerical applications to test the validity and domain of applicability of existing GK codes. For a sound verification process, the underlying theoretical GK model and the numerical scheme must be considered at the same time, which makes this approach pioneering. At the analytical level, the main novelty consists in using advanced mathematical tools, such as the variational formulation of dynamics, to systematize the basic equations of GK codes and access the limits of their applicability. Indirect verification of the numerical scheme is proposed via benchmarking. In this work, specific examples of code verification are presented for two GK codes: the multi-species electromagnetic ORB5 (PIC) and the radially global version of GENE (Eulerian). The proposed methodology can be applied to any existing GK code. We establish a hierarchy of reduced GK Vlasov-Maxwell equations using the generic variational formulation, and then derive and place the models implemented in ORB5 and GENE within this hierarchy. At the computational level, detailed verification of global electromagnetic test cases based on the CYCLONE benchmark is considered, including a parametric β-scan covering the transition from ITG to KBM and the spectral properties at the nominal β value.

  20. Applying model abstraction techniques to optimize monitoring networks for detecting subsurface contaminant transport

    USDA-ARS?s Scientific Manuscript database

    Improving strategies for monitoring subsurface contaminant transport includes performance comparison of competing models, developed independently or obtained via model abstraction. Model comparison and parameter discrimination involve specific performance indicators selected to better understand s...

  1. A Role for the Motor System in Binding Abstract Emotional Meaning

    PubMed Central

    Carota, Francesca; Hauk, Olaf; Mohr, Bettina; Pulvermüller, Friedemann

    2012-01-01

    Sensorimotor areas activate to action- and object-related words, but their role in abstract meaning processing is still debated. Abstract emotion words denoting body internal states are a critical test case because they lack referential links to objects. If actions expressing emotion are crucial for learning correspondences between word forms and emotions, emotion word–evoked activity should emerge in motor brain systems controlling the face and arms, which typically express emotions. To test this hypothesis, we recruited 18 native speakers and used event-related functional magnetic resonance imaging to compare brain activation evoked by abstract emotion words to that by face- and arm-related action words. In addition to limbic regions, emotion words indeed sparked precentral cortex, including body-part–specific areas activated somatotopically by face words or arm words. Control items, including hash mark strings and animal words, failed to activate precentral areas. We conclude that, similar to their role in action word processing, activation of frontocentral motor systems in the dorsal stream reflects the semantic binding of sign and meaning of abstract words denoting emotions and possibly other body internal states. PMID:21914634

  2. Experimental analysis of coding processes.

    PubMed

    Postman, L; Burns, S

    1973-12-01

    The first part of the paper reports an investigation of the effects of the concreteness-imagery (C-I) value of stimuli and responses on the long-term retention of paired-associate lists. With degree of learning equated, the measures of retention after a 1-week interval showed a significant interaction of Stimulus by Response C-I: When the responses had a high value, recall was substantially better with low than with high stimuli; when the responses were low, there was no reliable difference as a function of stimulus value. Recall was best when abstract stimuli were paired with concrete responses. The second part of the paper is addressed to some current issues in the analysis of coding processes. Major emphasis is placed on the experimental and theoretical differentiation of encoding and decoding processes.

  3. Multilevel Concatenated Block Modulation Codes for the Frequency Non-selective Rayleigh Fading Channel

    NASA Technical Reports Server (NTRS)

    Lin, Shu; Rhee, Dojun

    1996-01-01

    This paper is concerned with construction of multilevel concatenated block modulation codes using a multi-level concatenation scheme for the frequency non-selective Rayleigh fading channel. In the construction of multilevel concatenated modulation code, block modulation codes are used as the inner codes. Various types of codes (block or convolutional, binary or nonbinary) are being considered as the outer codes. In particular, we focus on the special case for which Reed-Solomon (RS) codes are used as the outer codes. For this special case, a systematic algebraic technique for constructing q-level concatenated block modulation codes is proposed. Codes have been constructed for certain specific values of q and compared with the single-level concatenated block modulation codes using the same inner codes. A multilevel closest coset decoding scheme for these codes is proposed.
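    The serial-concatenation idea itself, an outer code wrapped around an inner code, can be illustrated with deliberately simple toy component codes (a 3x repetition outer code and a single even-parity inner code; these stand-ins are assumptions for illustration only, and are far weaker than the paper's Reed-Solomon outer codes and block modulation inner codes):

```python
# Toy serial concatenation: outer encode, then inner encode; decode in
# reverse order. Illustrative only.
def outer_encode(bits):
    """Outer code: repeat each bit three times."""
    return [b for bit in bits for b in (bit, bit, bit)]

def inner_encode(bits):
    """Inner code: append one even-parity bit to the whole block."""
    return bits + [sum(bits) % 2]

def inner_decode(word):
    """Strip and check the parity bit; return (payload, parity_ok)."""
    payload, parity = word[:-1], word[-1]
    return payload, (sum(payload) % 2) == parity

def outer_decode(bits):
    """Majority vote over each group of three repeated bits."""
    return [int(sum(bits[i:i + 3]) >= 2) for i in range(0, len(bits), 3)]

msg = [1, 0, 1]
codeword = inner_encode(outer_encode(msg))
codeword[2] ^= 1                   # flip one bit on the channel
payload, ok = inner_decode(codeword)
print(outer_decode(payload), ok)   # message recovered; parity flags the error
```

    The layered structure, with an inner code cleaning up channel errors and an outer code correcting what remains, is the design principle shared with the paper's multilevel scheme.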

  4. Capabilities needed for the next generation of thermo-hydraulic codes for use in real time applications

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Arndt, S.A.

    1997-07-01

    The real-time reactor simulation field is currently at a crossroads in terms of the capability to perform real-time analysis using the most sophisticated computer codes. Current generation safety analysis codes are being modified to replace simplified codes that were specifically designed to meet the competing requirement for real-time applications. The next generation of thermo-hydraulic codes will need to have included in their specifications the specific requirement for use in a real-time environment. Use of the codes in real-time applications imposes much stricter requirements on robustness, reliability and repeatability than do design and analysis applications. In addition, the need for code use by a variety of users is a critical issue for real-time users, trainers and emergency planners who currently use real-time simulation, and PRA practitioners who will increasingly use real-time simulation for evaluating PRA success criteria in near real-time to validate PRA results for specific configurations and plant system unavailabilities.

  5. The Crash Outcome Data Evaluation System (CODES)

    DOT National Transportation Integrated Search

    1996-01-01

    The CODES Technical Report presents state-specific results from the Crash : Outcome Data Evaluation System project. These results confirm previous NHTSA : studies and show that safety belts and motorcycle helmets are effective in : reducing fatalitie...

  6. Interoception: the forgotten modality in perceptual grounding of abstract and concrete concepts.

    PubMed

    Connell, Louise; Lynott, Dermot; Banks, Briony

    2018-08-05

    Conceptual representations are perceptually grounded, but when investigating which perceptual modalities are involved, researchers have typically restricted their consideration to vision, touch, hearing, taste and smell. However, there is another major modality of perceptual information that is distinct from these traditional five senses; that is, interoception, or sensations inside the body. In this paper, we use megastudy data (modality-specific ratings of perceptual strength for over 32,000 words) to explore how interoceptive information contributes to the perceptual grounding of abstract and concrete concepts. We report how interoceptive strength captures a distinct form of perceptual experience across the abstract-concrete spectrum, but is markedly more important to abstract concepts (e.g., hungry, serenity) than to concrete concepts (e.g., capacity, rainy). In particular, interoception dominates emotion concepts, especially negative emotions relating to fear and sadness, more so than other concepts of equivalent abstractness and valence. Finally, we examine whether interoceptive strength represents valuable information in conceptual content by investigating its role in concreteness effects in word recognition, and find that it enhances semantic facilitation over and above the traditional five sensory modalities. Overall, these findings suggest that interoception has comparable status to other modalities in contributing to the perceptual grounding of abstract and concrete concepts. This article is part of the theme issue 'Varieties of abstract concepts: development, use and representation in the brain'. © 2018 The Author(s).
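    The modality-strength rating format the study relies on can be illustrated in miniature (the numbers below are invented for the sketch; the study's actual norms cover over 32,000 words):

```python
# Toy modality-specific perceptual-strength ratings (invented values).
ratings = {
    "hungry": {"vision": 1.2, "touch": 1.0, "hearing": 0.4,
               "taste": 2.1, "smell": 1.5, "interoception": 4.6},
    "rainy":  {"vision": 4.5, "touch": 3.8, "hearing": 4.0,
               "taste": 0.3, "smell": 1.9, "interoception": 0.8},
}

def dominant_modality(word):
    """Return the modality with the highest strength rating for a word."""
    return max(ratings[word], key=ratings[word].get)

print(dominant_modality("hungry"))  # interoception
print(dominant_modality("rainy"))   # vision
```

    Aggregating such per-word dominance judgments across the abstract-concrete spectrum is what reveals interoception's outsized role for abstract and emotion concepts.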

  7. Scientist-Centered Workflow Abstractions via Generic Actors, Workflow Templates, and Context-Awareness for Groundwater Modeling and Analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chin, George; Sivaramakrishnan, Chandrika; Critchlow, Terence J.

    2011-07-04

    A drawback of existing scientific workflow systems is the lack of support to domain scientists in designing and executing their own scientific workflows. Many domain scientists avoid developing and using workflows because the basic objects of workflows are too low-level and high-level tools and mechanisms to aid in workflow construction and use are largely unavailable. In our research, we are prototyping higher-level abstractions and tools to better support scientists in their workflow activities. Specifically, we are developing generic actors that provide abstract interfaces to specific functionality, workflow templates that encapsulate workflow and data patterns that can be reused and adapted by scientists, and context-awareness mechanisms to gather contextual information from the workflow environment on behalf of the scientist. To evaluate these scientist-centered abstractions on real problems, we apply them to construct and execute scientific workflows in the specific domain area of groundwater modeling and analysis.

  8. Youth Studies Abstracts. Vol. 4 No. 3.

    ERIC Educational Resources Information Center

    Youth Studies Abstracts, 1985

    1985-01-01

    This volume contains 169 abstracts of documents dealing with youth and educational programs for youth. Included in the volume are 97 abstracts of documents dealing with social and educational developments; 56 abstracts of program reports, reviews, and evaluations; and 16 abstracts of program materials. Abstracts are grouped according to the…

  9. Abstraction and perceptual individuation in primed word identification are modulated by distortion and repetition: a dissociation.

    PubMed

    Sciama, Sonia C; Dowker, Ann

    2007-11-01

    One experiment investigated the effects of distortion and multiple prime repetition (super-repetition) on repetition priming using divided-visual-field word identification at test and mixed-case words (e.g., goAT). The experiment measured form-specificity (the effect of matching lettercase at study and test) for two non-conceptual study tasks. For an ideal typeface, super-repetition increased form-independent priming leaving form-specificity constant. The opposite pattern was found for a distorted typeface; super-repetition increased form-specificity, leaving form-independent priming constant. These priming effects did not depend on the study task or test hemifield for either typeface. An additional finding was that only the ideal typeface showed the usual advantage of right hemifield presentation. These results demonstrate that super-repetition produced abstraction for the ideal typeface and perceptual individuation for the distorted typeface; abstraction and perceptual individuation dissociated. We suggest that there is a fundamental duality between perceptual individuation and abstraction consistent with Tulving's (1984) distinction between episodic and semantic memory. This could reflect a duality of system or process.

  10. Los Alamos radiation transport code system on desktop computing platforms

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Briesmeister, J.F.; Brinkley, F.W.; Clark, B.A.

    The Los Alamos Radiation Transport Code System (LARTCS) consists of state-of-the-art Monte Carlo and discrete ordinates transport codes and data libraries. These codes were originally developed many years ago and have undergone continual improvement. With a large initial effort and continued vigilance, the codes are easily portable from one type of hardware to another. The performance of scientific workstations (SWS) has evolved to the point that such platforms can be used routinely to perform sophisticated radiation transport calculations. As personal computer (PC) performance approaches that of the SWS, the hardware options for desktop radiation transport calculations expand considerably. The current status of the radiation transport codes within the LARTCS is described: MCNP, SABRINA, LAHET, ONEDANT, TWODANT, TWOHEX, and ONELD. Specifically, the authors discuss hardware systems on which the codes run and present code performance comparisons for various machines.

  11. Overcoming the Challenges of Unstructured Data in Multisite, Electronic Medical Record-based Abstraction.

    PubMed

    Polnaszek, Brock; Gilmore-Bykovskyi, Andrea; Hovanes, Melissa; Roiland, Rachel; Ferguson, Patrick; Brown, Roger; Kind, Amy J H

    2016-10-01

    Unstructured data encountered during retrospective electronic medical record (EMR) abstraction has routinely been identified as challenging to abstract reliably, as these data are often recorded as free text, without limitations to format or structure. There is increased interest in reliably abstracting this type of data given its prominent role in care coordination and communication, yet limited methodological guidance exists. As standard abstraction approaches resulted in substandard data reliability for unstructured data elements collected as part of a multisite, retrospective EMR study of hospital discharge communication quality, our goal was to develop, apply, and examine the utility of a phase-based approach to reliably abstract unstructured data. This approach is examined using the specific example of discharge communication for warfarin management. We adopted a "fit-for-use" framework to guide the development and evaluation of abstraction methods using a 4-step, phase-based approach including (1) team building; (2) identification of challenges; (3) adaptation of abstraction methods; and (4) systematic data quality monitoring. Unstructured data elements were the focus of this study, including elements communicating steps in warfarin management (eg, warfarin initiation) and medical follow-up (eg, timeframe for follow-up). After implementation of the phase-based approach, interrater reliability for all unstructured data elements demonstrated κ values of at least 0.89, an average increase of 0.25 for each unstructured data element. As compared with standard abstraction methodologies, this phase-based approach was more time intensive, but did markedly increase abstraction reliability for unstructured data elements within multisite EMR documentation.
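    The interrater reliability reported above is Cohen's kappa; a minimal sketch of the statistic for two raters making categorical abstraction decisions (the labels below are invented for illustration, not study data):

```python
# Cohen's kappa: observed agreement corrected for chance agreement.
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Chance-corrected agreement between two raters on the same items."""
    n = len(rater_a)
    observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    freq_a, freq_b = Counter(rater_a), Counter(rater_b)
    expected = sum(freq_a[c] * freq_b[c] for c in freq_a) / (n * n)
    return (observed - expected) / (1 - expected)

# Toy abstraction labels for ten charts (invented).
a = ["yes", "yes", "no", "yes", "no", "no", "yes", "yes", "no", "yes"]
b = ["yes", "yes", "no", "yes", "no", "yes", "yes", "yes", "no", "yes"]
print(round(cohens_kappa(a, b), 2))  # 0.78
```

    The study's κ ≥ 0.89 for every unstructured element indicates near-perfect agreement after the phase-based approach was adopted.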

  12. Separability of abstract-category and specific-exemplar visual object subsystems: evidence from fMRI pattern analysis.

    PubMed

    McMenamin, Brenton W; Deason, Rebecca G; Steele, Vaughn R; Koutstaal, Wilma; Marsolek, Chad J

    2015-02-01

    Previous research indicates that dissociable neural subsystems underlie abstract-category (AC) recognition and priming of objects (e.g., cat, piano) and specific-exemplar (SE) recognition and priming of objects (e.g., a calico cat, a different calico cat, a grand piano, etc.). However, the degree of separability between these subsystems is not known, despite the importance of this issue for assessing relevant theories. Visual object representations are widely distributed in visual cortex, thus a multivariate pattern analysis (MVPA) approach to analyzing functional magnetic resonance imaging (fMRI) data may be critical for assessing the separability of different kinds of visual object processing. Here we examined the neural representations of visual object categories and visual object exemplars using multi-voxel pattern analyses of brain activity elicited in visual object processing areas during a repetition-priming task. In the encoding phase, participants viewed visual objects and the printed names of other objects. In the subsequent test phase, participants identified objects that were either same-exemplar primed, different-exemplar primed, word-primed, or unprimed. In visual object processing areas, classifiers were trained to distinguish same-exemplar primed objects from word-primed objects. Then, the abilities of these classifiers to discriminate different-exemplar primed objects and word-primed objects (reflecting AC priming) and to discriminate same-exemplar primed objects and different-exemplar primed objects (reflecting SE priming) was assessed. Results indicated that (a) repetition priming in occipital-temporal regions is organized asymmetrically, such that AC priming is more prevalent in the left hemisphere and SE priming is more prevalent in the right hemisphere, and (b) AC and SE subsystems are weakly modular, not strongly modular or unified. Copyright © 2014 Elsevier Inc. All rights reserved.
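    The transfer logic of the analysis, train a classifier on one condition contrast and test it on patterns from other conditions, can be sketched with a toy nearest-centroid classifier (the "voxel" patterns and the classifier choice are illustrative assumptions; the study used fMRI voxel patterns with standard MVPA classifiers):

```python
# Toy stand-in for the MVPA transfer test: fit centroids on one contrast,
# then classify held-out patterns from another condition.
def centroid(patterns):
    """Mean pattern across training examples."""
    return [sum(vals) / len(vals) for vals in zip(*patterns)]

def classify(pattern, centroid_a, centroid_b):
    """Assign the pattern to the nearer class centroid ('A' or 'B')."""
    def dist(p, c):
        return sum((x - y) ** 2 for x, y in zip(p, c))
    return "A" if dist(pattern, centroid_a) <= dist(pattern, centroid_b) else "B"

# Invented 3-"voxel" patterns: class A (e.g., same-exemplar primed)
# vs class B (e.g., word-primed).
train_a = [[1.0, 0.9, 0.1], [0.9, 1.1, 0.2]]
train_b = [[0.1, 0.2, 1.0], [0.2, 0.1, 0.9]]
ca, cb = centroid(train_a), centroid(train_b)

# A held-out pattern from a third condition (different-exemplar primed):
# which class it is assigned to indicates which training class it resembles.
print(classify([0.8, 0.8, 0.3], ca, cb))  # A
```

    Above-chance classification of the held-out condition as class A would correspond to the paper's evidence for abstract-category priming; failure to transfer would indicate exemplar-specific representations.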

  13. A comprehensive catalogue of the coding and non-coding transcripts of the human inner ear

    PubMed Central

    Corneveaux, Jason J.; Ohmen, Jeffrey; White, Cory; Allen, April N.; Lusis, Aldons J.; Van Camp, Guy; Huentelman, Matthew J.; Friedman, Rick A.

    2015-01-01

    The mammalian inner ear consists of the cochlea and the vestibular labyrinth (utricle, saccule, and semicircular canals), which participate in both hearing and balance. Proper development and life-long function of these structures involves a highly complex coordinated system of spatial and temporal gene expression. The characterization of the inner ear transcriptome is likely important for the functional study of auditory and vestibular components, yet, primarily due to tissue unavailability, detailed expression catalogues of the human inner ear remain largely incomplete. We report here, for the first time, comprehensive transcriptome characterization of the adult human cochlea, ampulla, saccule and utricle of the vestibule obtained from patients without hearing abnormalities. Using RNA-Seq, we measured the expression of >50,000 predicted genes corresponding to approximately 200,000 transcripts, in the adult inner ear and compared it to 32 other human tissues. First, we identified genes preferentially expressed in the inner ear, and unique either to the vestibule or cochlea. Next, we examined expression levels of specific groups of potentially interesting RNAs, such as genes implicated in hearing loss, long non-coding RNAs, pseudogenes and transcripts subject to nonsense mediated decay (NMD). We uncover the spatial specificity of expression of these RNAs in the hearing/balance system, and reveal evidence of tissue specific NMD. Lastly, we investigated the non-syndromic deafness loci to which no gene has been mapped, and narrow the list of potential candidates for each locus. These data represent the first high-resolution transcriptome catalogue of the adult human inner ear. A comprehensive identification of coding and non-coding RNAs in the inner ear will enable pathways of auditory and vestibular function to be further defined in the study of hearing and balance. Expression data are freely accessible at https

  14. Accuracy of injury coding under ICD‐9 for New Zealand public hospital discharges

    PubMed Central

    Langley, J; Stephenson, S; Thorpe, C; Davie, G

    2006-01-01

    Objective To determine the level of accuracy in coding for injury principal diagnosis and the first external cause code for public hospital discharges in New Zealand and determine how these levels vary by hospital size. Method A simple random sample of 1800 discharges was selected from the period 1996–98 inclusive. Records were obtained from hospitals and an accredited coder coded the discharge independently of the codes already recorded in the national database. Results Five percent of the principal diagnoses, 18% of the first four digits of the E‐codes, and 8% of the location codes (5th digit of the E‐code), were incorrect. There were no substantive differences in the level of incorrect coding between large and small hospitals. Conclusions Users of New Zealand public hospital discharge data can have a high degree of confidence in the injury diagnoses coded under ICD‐9‐CM‐A. A similar degree of confidence is warranted for E‐coding at the group level (for example, fall), but not, in general, at higher levels of specificity (for example, type of fall). For those countries continuing to use ICD‐9 the study provides insight into potential problems of coding and thus guidance on where the focus of coder training should be placed. For those countries that have historical data coded according to ICD‐9 it suggests that some specific injury and external cause incidence estimates may need to be treated with more caution. PMID:16461421
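    The audit design above, independently re-coding a random sample and comparing it against the national database, reduces to a simple disagreement rate (the records and codes below are invented for illustration, not New Zealand data):

```python
# Minimal sketch of the coding-accuracy audit: compare database codes
# against an accredited coder's independent gold-standard recode.
def percent_incorrect(database_codes, recoded_codes):
    """Share of records whose database code disagrees with the gold recode."""
    mismatches = sum(d != g for d, g in zip(database_codes, recoded_codes))
    return 100.0 * mismatches / len(database_codes)

db_ecodes   = ["E880", "E885", "E812", "E880", "E919"]
gold_ecodes = ["E880", "E884", "E812", "E880", "E919"]
print(percent_incorrect(db_ecodes, gold_ecodes))  # 20.0
```

    Running such a comparison separately at the group level (first digits) and the full-code level is what lets the study report higher accuracy for broad E-code groups than for their more specific subdivisions.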

  15. Abstract Constructions.

    ERIC Educational Resources Information Center

    Pietropola, Anne

    1998-01-01

    Describes a lesson designed to culminate a year of eighth-grade art classes in which students explore elements of design and space by creating 3-D abstract constructions. Outlines the process of using foam board and markers to create various shapes and optical effects. (DSK)

  16. Imagining the truth and the moon: an electrophysiological study of abstract and concrete word processing.

    PubMed

    Gullick, Margaret M; Mitra, Priya; Coch, Donna

    2013-05-01

    Previous event-related potential studies have indicated that both a widespread N400 and an anterior N700 index differential processing of concrete and abstract words, but the nature of these components in relation to concreteness and imagery has been unclear. Here, we separated the effects of word concreteness and task demands on the N400 and N700 in a single word processing paradigm with a within-subjects, between-tasks design and carefully controlled word stimuli. The N400 was larger to concrete words than to abstract words, and larger in the visualization task condition than in the surface task condition, with no interaction. A marked anterior N700 was elicited only by concrete words in the visualization task condition, suggesting that this component indexes imagery. These findings are consistent with a revised or extended dual coding theory according to which concrete words benefit from greater activation in both verbal and imagistic systems. Copyright © 2013 Society for Psychophysiological Research.

  17. NASA Patent Abstracts Bibliography: A Continuing Bibliography. Supplement 60

    NASA Technical Reports Server (NTRS)

    2002-01-01

    Several thousand inventions result each year from the aeronautical and space research supported by the National Aeronautics and Space Administration. The inventions having important use in government programs or significant commercial potential are usually patented by NASA. These inventions cover practically all fields of technology and include many that have useful and valuable commercial application. NASA inventions best serve the interests of the United States when their benefits are available to the public. In many instances, the granting of nonexclusive or exclusive licenses for the practice of these inventions may assist in the accomplishment of this objective. This bibliography is published as a service to companies, firms, and individuals seeking new, licensable products for the commercial market. The NASA Patent Abstracts Bibliography is a semiannual NASA publication containing comprehensive abstracts of NASA-owned inventions covered by U.S. patents. The citations included in the bibliography were originally published in NASA's Scientific and Technical Aerospace Reports (STAR) and cover STAR announcements made since May 1969. The citations published in this issue cover the period July 2001 through December 2001. This issue includes 10 major subject divisions separated into 76 specific categories and one general category/division. (See Table of Contents for the scope note of each category, under which are grouped appropriate NASA inventions.) This scheme was devised in 1975 and revised in 1987 in lieu of the 34 category divisions which were utilized in supplements (01) through (06) covering STAR abstracts from May 1969 through January 1974. Each entry consists of a STAR citation accompanied by an abstract and, when appropriate, a key illustration taken from the patent or application for patent. Entries are arranged by subject category in ascending order. A typical citation and abstract presents the various data elements included in

  18. NASA Patent Abstracts Bibliography: A Continuing Bibliography. Supplement 58

    NASA Technical Reports Server (NTRS)

    2001-01-01

    This report lists reports, articles and other documents recently announced in the NASA STI Database. Several thousand inventions result each year from the aeronautical and space research supported by the National Aeronautics and Space Administration. The inventions having important use in government programs or significant commercial potential are usually patented by NASA. These inventions cover practically all fields of technology and include many that have useful and valuable commercial application. NASA inventions best serve the interests of the United States when their benefits are available to the public. In many instances, the granting of nonexclusive or exclusive licenses for the practice of these inventions may assist in the accomplishment of this objective. This bibliography is published as a service to companies, firms, and individuals seeking new, licensable products for the commercial market. The NASA Patent Abstracts Bibliography is a semiannual NASA publication containing comprehensive abstracts of NASA-owned inventions covered by U.S. patents. The citations included in the bibliography were originally published in NASA's Scientific and Technical Aerospace Reports (STAR) and cover STAR announcements made since May 1969. The citations published in this issue cover the period July 2000 through December 2000. This issue includes 10 major subject divisions separated into 76 specific categories and one general category/division. This scheme was devised in 1975 and revised in 1987 in lieu of the 34 category divisions which were utilized in supplements (01) through (06) covering STAR abstracts from May 1969 through January 1974. Each entry consists of a STAR citation accompanied by an abstract and, when appropriate, a key illustration taken from the patent or application for patent. Entries are arranged by subject category in ascending order. A typical citation and abstract presents the various data elements included in most records

  19. NASA Patent Abstracts Bibliography: A Continuing Bibliography. Supplement 62

    NASA Technical Reports Server (NTRS)

    2003-01-01

    Several thousand inventions result each year from research supported by the National Aeronautics and Space Administration. NASA seeks patent protection on inventions to which it has title if the invention has important use in government programs or significant commercial potential. These inventions cover a broad range of technologies and include many that have useful and valuable commercial application. NASA inventions best serve the interests of the United States when their benefits are available to the public. In many instances, the granting of nonexclusive or exclusive licenses for the practice of these inventions may assist in the accomplishment of this objective. This bibliography is published as a service to companies, firms, and individuals seeking new, licensable products for the commercial market. The NASA Patent Abstracts Bibliography is a semiannual NASA publication containing comprehensive abstracts of NASA-owned inventions covered by U.S. patents. The citations included in the bibliography were originally published in NASA's Scientific and Technical Aerospace Reports (STAR) and cover STAR announcements made since May 1969. The citations published in this issue cover the period July 2002 through December 2002. This issue includes 10 major subject divisions separated into 76 specific categories and one general category/division. (See Table of Contents for the scope note of each category, under which are grouped appropriate NASA inventions.) This scheme was devised in 1975 and revised in 1987 in lieu of the 34 category divisions which were utilized in supplements (01) through (06) covering STAR abstracts from May 1969 through January 1974. Each entry consists of a STAR citation accompanied by an abstract and, when appropriate, a key illustration taken from the patent or application for patent. Entries are arranged by subject category in ascending order. A typical citation and abstract presents the various data elements included

  20. Coded Cooperation for Multiway Relaying in Wireless Sensor Networks.

    PubMed

    Si, Zhongwei; Ma, Junyang; Thobaben, Ragnar

    2015-06-29

    Wireless sensor networks have been considered as an enabling technology for constructing smart cities. One important feature of wireless sensor networks is that the sensor nodes collaborate in some manner for communications. In this manuscript, we focus on the model of multiway relaying with full data exchange where each user wants to transmit and receive data to and from all other users in the network. We derive the capacity region for this specific model and propose a coding strategy through coset encoding. To obtain good performance with practical codes, we choose spatially-coupled LDPC (SC-LDPC) codes for the coded cooperation. In particular, for the message broadcasting from the relay, we construct multi-edge-type (MET) SC-LDPC codes by repeatedly applying coset encoding. Due to the capacity-achieving property of the SC-LDPC codes, we prove that the capacity region can theoretically be achieved by the proposed MET SC-LDPC codes. Numerical results with finite node degrees are provided, which show that the achievable rates approach the boundary of the capacity region in both binary erasure channels and additive white Gaussian channels.
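    The full-data-exchange idea above can be sketched with plain XOR network coding: the relay broadcasts bitwise combinations of the users' messages, and each user recovers everyone else's data using its own message as side information. This is only the binary intuition behind coset encoding, not the SC-LDPC construction from the paper; all message values below are made up for illustration.

```python
# Toy multiway relaying with full data exchange: N users upload their
# messages to a relay, which broadcasts only N-1 XOR combinations.
# Each user then recovers all other messages from its own.

def relay_broadcast(messages):
    """Relay XORs adjacent pairs: b_i = m_i ^ m_{i+1}."""
    return [messages[i] ^ messages[i + 1] for i in range(len(messages) - 1)]

def user_decode(own_index, own_message, broadcasts):
    """Recover every message by chaining XORs outward from our own."""
    n = len(broadcasts) + 1
    recovered = [None] * n
    recovered[own_index] = own_message
    for i in range(own_index - 1, -1, -1):      # walk left
        recovered[i] = recovered[i + 1] ^ broadcasts[i]
    for i in range(own_index + 1, n):           # walk right
        recovered[i] = recovered[i - 1] ^ broadcasts[i - 1]
    return recovered

msgs = [0b1010, 0b0111, 0b1100]                 # three users' 4-bit messages
bcast = relay_broadcast(msgs)                   # relay sends only 2 packets
assert all(user_decode(k, msgs[k], bcast) == msgs for k in range(3))
```

    The relay thus transmits N-1 packets instead of N, which is the source of the rate gain that the paper's coset encoding generalizes to noisy channels.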

  1. A cascaded coding scheme for error control and its performance analysis

    NASA Technical Reports Server (NTRS)

    Lin, Shu; Kasami, Tadao; Fujiwara, Tohru; Takata, Toyoo

    1986-01-01

    A coding scheme is investigated for error control in data communication systems. The scheme is obtained by cascading two error correcting codes, called the inner and outer codes. The error performance of the scheme is analyzed for a binary symmetric channel with bit error rate epsilon <1/2. It is shown that if the inner and outer codes are chosen properly, extremely high reliability can be attained even for a high channel bit error rate. Various specific example schemes with inner codes ranging from high rates to very low rates and Reed-Solomon codes as outer codes are considered, and their error probabilities are evaluated. They all provide extremely high reliability even for very high bit error rates. Several example schemes are being considered by NASA for satellite and spacecraft downlink error control.
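    The cascading principle can be demonstrated with the simplest possible codes. The sketch below (a toy, not the paper's scheme) cascades an outer 3-repetition code with an inner 3-repetition code over a simulated binary symmetric channel; each stage reduces the residual bit-error rate, and cascading compounds the gains.

```python
import random

# Toy cascade: outer 3-repetition code, then inner 3-repetition code,
# transmitted over a binary symmetric channel (BSC) with crossover eps.
# Inner decoding cleans up most channel errors; outer decoding mops up
# the inner decoder's residual errors.

def rep3_encode(bits):
    return [b for b in bits for _ in range(3)]

def rep3_decode(bits):
    # Majority vote over each block of three received bits.
    return [int(sum(bits[i:i + 3]) >= 2) for i in range(0, len(bits), 3)]

def bsc(bits, eps, rng):
    # Flip each bit independently with probability eps.
    return [b ^ (rng.random() < eps) for b in bits]

rng = random.Random(0)
msg = [rng.randint(0, 1) for _ in range(2000)]
tx = rep3_encode(rep3_encode(msg))                 # outer, then inner encoding
rx = rep3_decode(rep3_decode(bsc(tx, 0.1, rng)))   # inner, then outer decoding
errors = sum(a != b for a, b in zip(msg, rx))
print(f"residual bit errors: {errors} / {len(msg)}")
```

    With eps = 0.1 the inner stage alone leaves roughly a 2.8% bit-error rate, while the cascade drives it to roughly 0.2%, illustrating how properly chosen inner/outer pairs attain high reliability even at high channel error rates.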

  2. 29 CFR 510.21 - SIC codes.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... annual Census of Manufacturing Industries as a source of average hourly wage data by industry. Industries in that census are organized by Standard Industrial Classification (SIC), the statistical... stated that data “should be at a level of specificity comparable to the four digit Standard Industry Code...

  3. 29 CFR 510.21 - SIC codes.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... annual Census of Manufacturing Industries as a source of average hourly wage data by industry. Industries in that census are organized by Standard Industrial Classification (SIC), the statistical... stated that data “should be at a level of specificity comparable to the four digit Standard Industry Code...

  4. 29 CFR 510.21 - SIC codes.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... annual Census of Manufacturing Industries as a source of average hourly wage data by industry. Industries in that census are organized by Standard Industrial Classification (SIC), the statistical... stated that data “should be at a level of specificity comparable to the four digit Standard Industry Code...

  5. 29 CFR 510.21 - SIC codes.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... annual Census of Manufacturing Industries as a source of average hourly wage data by industry. Industries in that census are organized by Standard Industrial Classification (SIC), the statistical... stated that data “should be at a level of specificity comparable to the four digit Standard Industry Code...

  6. General 3D Airborne Antenna Radiation Pattern Code Users Manual.

    DTIC Science & Technology

    1983-02-01

    AD-A 30 359. General 3D Airborne Antenna Radiation Pattern Code User's Manual, The Ohio State University ElectroScience Laboratory, February 1983 (contract F30602-79-C-0068). This report describes a computer program and how it may

  7. A Statistical Analysis of Reviewer Agreement and Bias in Evaluating Medical Abstracts 1

    PubMed Central

    Cicchetti, Domenic V.; Conn, Harold O.

    1976-01-01

    Observer variability affects virtually all aspects of clinical medicine and investigation. One important aspect, not previously examined, is the selection of abstracts for presentation at national medical meetings. In the present study, 109 abstracts, submitted to the American Association for the Study of Liver Disease, were evaluated by three “blind” reviewers for originality, design-execution, importance, and overall scientific merit. Of the 77 abstracts rated for all parameters by all observers, interobserver agreement ranged between 81 and 88%. However, corresponding intraclass correlations varied between 0.16 (approaching statistical significance) and 0.37 (p < 0.01). Specific tests of systematic differences in scoring revealed statistically significant levels of observer bias on most of the abstract components. Moreover, the mean differences in interobserver ratings were quite small compared to the standard deviations of these differences. These results emphasize the importance of evaluating the simple percentage of rater agreement within the broader context of observer variability and systematic bias. PMID:997596
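    The abstract's central point, that 81-88% raw agreement can coexist with chance-corrected correlations as low as 0.16, is easy to reproduce. The sketch below uses hypothetical ratings (not the study's data) and Cohen's kappa rather than the intraclass correlation, but shows the same effect: when nearly all abstracts receive the same score, raw agreement is high by chance alone.

```python
from collections import Counter

# Raw percent agreement versus chance-corrected agreement (Cohen's kappa)
# for two raters scoring 100 items as acceptable (1) or not (0).

def percent_agreement(a, b):
    return sum(x == y for x, y in zip(a, b)) / len(a)

def cohens_kappa(a, b):
    n = len(a)
    p_o = percent_agreement(a, b)                       # observed agreement
    ca, cb = Counter(a), Counter(b)
    p_e = sum(ca[k] * cb[k] for k in set(a) | set(b)) / (n * n)  # chance
    return (p_o - p_e) / (1 - p_e)

# Both raters score 95 of 100 items "acceptable"; they disagree on 8 items.
r1 = [1] * 95 + [0] * 5
r2 = [1] * 91 + [0] * 4 + [1] * 4 + [0] * 1

print(percent_agreement(r1, r2))   # 0.92 -- looks excellent
print(cohens_kappa(r1, r2))        # ~0.16 -- barely above chance
```

    This is why the authors stress evaluating percent agreement "within the broader context of observer variability and systematic bias."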

  8. Applications of Derandomization Theory in Coding

    NASA Astrophysics Data System (ADS)

    Cheraghchi, Mahdi

    2011-07-01

    Randomized techniques play a fundamental role in theoretical computer science and discrete mathematics, in particular for the design of efficient algorithms and construction of combinatorial objects. The basic goal in derandomization theory is to eliminate or reduce the need for randomness in such randomized constructions. In this thesis, we explore some applications of the fundamental notions in derandomization theory to problems outside the core of theoretical computer science, and in particular, certain problems related to coding theory. First, we consider the wiretap channel problem, which involves a communication system in which an intruder can eavesdrop on a limited portion of the transmissions, and construct efficient and information-theoretically optimal communication protocols for this model. Then we consider the combinatorial group testing problem. In this classical problem, one aims to determine a set of defective items within a large population by asking a number of queries, where each query reveals whether a defective item is present within a specified group of items. We use randomness condensers to explicitly construct optimal, or nearly optimal, group testing schemes for a setting where the query outcomes can be highly unreliable, as well as the threshold model where a query returns positive if the number of defectives passes a certain threshold. Finally, we design ensembles of error-correcting codes that achieve the information-theoretic capacity of a large class of communication channels, and then use the obtained ensembles for construction of explicit capacity achieving codes. [This is a shortened version of the actual abstract in the thesis.]
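    The group testing problem mentioned above has a classical noiseless baseline that makes the query model concrete: with a single defective among n items, ceil(log2 n) pooled tests suffice, because test t asks whether any defective lies among the items whose index has bit t set. The sketch below shows only this textbook baseline, not the noise-tolerant constructions from the thesis.

```python
import math

# Non-adaptive group testing for one defective item: pools are chosen in
# advance from the binary representation of item indices, so the defective's
# index can be read off directly from the test outcomes.

def design_pools(n_items):
    t = max(1, math.ceil(math.log2(n_items)))
    return [[i for i in range(n_items) if (i >> bit) & 1]
            for bit in range(t)]

def run_tests(pools, defective):
    # A pooled test is positive iff the defective item is in the pool.
    return [defective in pool for pool in pools]

def decode(outcomes):
    # Reassemble the defective's index from the positive tests' bit positions.
    return sum(1 << bit for bit, positive in enumerate(outcomes) if positive)

pools = design_pools(100)          # 7 pooled tests cover 100 items
outcomes = run_tests(pools, defective=42)
assert decode(outcomes) == 42
```

    The thesis's contribution lies in making such designs work when outcomes are unreliable or threshold-based, where this naive bit-readout decoding fails.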

  9. Governing sexual behaviour through humanitarian codes of conduct.

    PubMed

    Matti, Stephanie

    2015-10-01

    Since 2001, there has been a growing consensus that sexual exploitation and abuse of intended beneficiaries by humanitarian workers is a real and widespread problem that requires governance. Codes of conduct have been promoted as a key mechanism for governing the sexual behaviour of humanitarian workers and, ultimately, preventing sexual exploitation and abuse (PSEA). This article presents a systematic study of PSEA codes of conduct adopted by humanitarian non-governmental organisations (NGOs) and how they govern the sexual behaviour of humanitarian workers. It draws on Foucault's analytics of governance and speech act theory to examine the findings of a survey of references to codes of conduct made on the websites of 100 humanitarian NGOs, and to analyse some features of the organisation-specific PSEA codes identified. © 2015 The Author(s). Disasters © Overseas Development Institute, 2015.

  10. The development of non-coding RNA ontology.

    PubMed

    Huang, Jingshan; Eilbeck, Karen; Smith, Barry; Blake, Judith A; Dou, Dejing; Huang, Weili; Natale, Darren A; Ruttenberg, Alan; Huan, Jun; Zimmermann, Michael T; Jiang, Guoqian; Lin, Yu; Wu, Bin; Strachan, Harrison J; de Silva, Nisansa; Kasukurthi, Mohan Vamsi; Jha, Vikash Kumar; He, Yongqun; Zhang, Shaojie; Wang, Xiaowei; Liu, Zixing; Borchert, Glen M; Tan, Ming

    2016-01-01

    Identification of non-coding RNAs (ncRNAs) has been significantly improved over the past decade. On the other hand, semantic annotation of ncRNA data is facing critical challenges due to the lack of a comprehensive ontology to serve as common data elements and data exchange standards in the field. We developed the Non-Coding RNA Ontology (NCRO) to handle this situation. By providing a formally defined ncRNA controlled vocabulary, the NCRO aims to fill a specific and highly needed niche in semantic annotation of large amounts of ncRNA biological and clinical data.

  11. Automatic Review of Abstract State Machines by Meta Property Verification

    NASA Technical Reports Server (NTRS)

    Arcaini, Paolo; Gargantini, Angelo; Riccobene, Elvinia

    2010-01-01

    A model review is a validation technique aimed at determining if a model is of sufficient quality; it allows defects to be identified early in the system development, reducing the cost of fixing them. In this paper we propose a technique to perform automatic review of Abstract State Machine (ASM) formal specifications. We first identify a family of typical vulnerabilities and defects a developer can introduce during the modeling activity using ASMs, and we express such faults as violations of meta-properties that guarantee certain quality attributes of the specification. These meta-properties are then mapped to temporal logic formulas and model checked for their violation. As a proof of concept, we also report the results of applying this ASM review process to several specifications.
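    One meta-property in the family the paper describes is that every rule of the machine can eventually fire. For a tiny finite machine this can be checked by brute force, as in the sketch below; the rule and guard names are invented for illustration, and a real ASM review would delegate this to a model checker rather than enumerate states.

```python
from itertools import product

# Toy "no dead rules" meta-property check: enumerate every state of a small
# guarded-rule machine and flag rules whose guard is never satisfiable.

state_space = list(product([0, 1], repeat=2))  # (door_open, motor_on)

rules = {
    "start_motor": lambda s: s[0] == 0 and s[1] == 0,  # door shut, motor off
    "stop_motor":  lambda s: s[1] == 1,                # motor currently on
    "dead_rule":   lambda s: s[0] == 1 and s[0] == 0,  # contradictory guard
}

dead = [name for name, guard in rules.items()
        if not any(guard(s) for s in state_space)]
print("rules that can never fire:", dead)   # -> ['dead_rule']
```

    A rule with an unsatisfiable guard is exactly the kind of modeling defect that is cheap to flag automatically and expensive to discover during later verification.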

  12. Regional and temporal variations in coding of hospital diagnoses referring to upper gastrointestinal and oesophageal bleeding in Germany.

    PubMed

    Langner, Ingo; Mikolajczyk, Rafael; Garbe, Edeltraut

    2011-08-17

    Health insurance claims data are increasingly used for health services research in Germany. Hospital diagnoses in these data are coded according to the International Classification of Diseases, German modification (ICD-10-GM). Due to the historical division into West and East Germany, different coding practices might persist in both former parts. Additionally, the introduction of Diagnosis Related Groups (DRGs) in Germany in 2003/2004 might have changed the coding. The aim of this study was to investigate regional and temporal variations in coding of hospitalisation diagnoses in Germany. We analysed hospitalisation diagnoses for oesophageal bleeding (OB) and upper gastrointestinal bleeding (UGIB) from the official German Hospital Statistics provided by the Federal Statistical Office. Bleeding diagnoses were classified as "specific" (origin of bleeding provided) or "unspecific" (origin of bleeding not provided) coding. We studied regional (former East versus West Germany) differences in incidence of hospitalisations with specific or unspecific coding for OB and UGIB and temporal variations between 2000 and 2005. For each year, incidence ratios of hospitalisations for former East versus West Germany were estimated with log-linear regression models adjusting for age, gender and population density. Significant differences in specific and unspecific coding between East and West Germany and over time were found for both, OB and UGIB hospitalisation diagnoses, respectively. For example in 2002, incidence ratios of hospitalisations for East versus West Germany were 1.24 (95% CI 1.16-1.32) for specific and 0.67 (95% CI 0.60-0.74) for unspecific OB diagnoses and 1.43 (95% CI 1.36-1.51) for specific and 0.83 (95% CI 0.80-0.87) for unspecific UGIB. Regional differences nearly disappeared and time trends were less marked when using combined specific and unspecific diagnoses of OB or UGIB, respectively. During the study period, there were substantial regional and temporal

  13. Abstracts of contributed papers

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    1994-08-01

    This volume contains 571 abstracts of contributed papers to be presented during the Twelfth US National Congress of Applied Mechanics. Abstracts are arranged in the order in which they fall in the program -- the main sessions are listed chronologically in the Table of Contents. The Author Index is in alphabetical order and lists each paper number (matching the schedule in the Final Program) with its corresponding page number in the book.

  14. New French Regulation for NPPs and Code Consequences

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Faidy, Claude

    2006-07-01

    In December 2005, the French regulator issued a new regulation for French nuclear power plants, in particular for pressure equipment (PE). This regulation must first conform to the non-nuclear PE regulation, to which it adds some specific requirements, in particular radiation protection requirements. The proposal has several advantages: it is more qualitative and risk-oriented, and it provides an important link with non-nuclear industry. Only a few components are nuclear-specific. However, the general philosophy of the existing codes (RCC-M [15], KTA [16] or ASME [17]) has to be improved. For foreign codes, the plan is to define the differences in the user specifications. In parallel, a new safety classification has been developed by the French utility. As a consequence, all these specifications must be cross-referenced to define a minimum quality level for each component or system. At the same time, a new concept has been developed to replace the well-known 'Leak Before Break' methodology: the 'Break Exclusion' methodology. This paper summarizes the key aspects of these different topics. (authors)

  15. A Code Division Multiple Access Communication System for the Low Frequency Band.

    DTIC Science & Technology

    1983-04-01

    Keywords: frequency channels, spread-spectrum communication, complex sequences, orthogonal codes, impulsive noise. ...their transmissions with signature sequences. Our LF/CDMA scheme is different in that each user's signature sequence set consists of M orthogonal sequences and thus log2 M

  16. Changes in the Coding and Non-coding Transcriptome and DNA Methylome that Define the Schwann Cell Repair Phenotype after Nerve Injury.

    PubMed

    Arthur-Farraj, Peter J; Morgan, Claire C; Adamowicz, Martyna; Gomez-Sanchez, Jose A; Fazal, Shaline V; Beucher, Anthony; Razzaghi, Bonnie; Mirsky, Rhona; Jessen, Kristjan R; Aitman, Timothy J

    2017-09-12

    Repair Schwann cells play a critical role in orchestrating nerve repair after injury, but the cellular and molecular processes that generate them are poorly understood. Here, we perform a combined whole-genome, coding and non-coding RNA and CpG methylation study following nerve injury. We show that genes involved in the epithelial-mesenchymal transition are enriched in repair cells, and we identify several long non-coding RNAs in Schwann cells. We demonstrate that the AP-1 transcription factor C-JUN regulates the expression of certain microRNAs in repair Schwann cells, in particular miR-21 and miR-34. Surprisingly, unlike during development, changes in CpG methylation are limited in injury, restricted to specific locations, such as enhancer regions of Schwann cell-specific genes (e.g., Nedd4l), and close to local enrichment of AP-1 motifs. These genetic and epigenomic changes broaden our mechanistic understanding of the formation of repair Schwann cells during peripheral nervous system tissue repair. Copyright © 2017 The Authors. Published by Elsevier Inc. All rights reserved.

  17. BOCA BASIC BUILDING CODE. 4TH ED., 1965 AND 1967. BOCA BASIC BUILDING CODE ACCUMULATIVE SUPPLEMENT.

    ERIC Educational Resources Information Center

    Building Officials Conference of America, Inc., Chicago, IL.

    NATIONALLY RECOGNIZED STANDARDS FOR THE EVALUATION OF MINIMUM SAFE PRACTICE OR FOR DETERMINING THE PERFORMANCE OF MATERIALS OR SYSTEMS OF CONSTRUCTION HAVE BEEN COMPILED AS AN AID TO DESIGNERS AND LOCAL OFFICIALS. THE CODE PRESENTS REGULATIONS IN TERMS OF MEASURED PERFORMANCE RATHER THAN IN RIGID SPECIFICATION OF MATERIALS OR METHODS. THE AREAS…

  18. Low Density Parity Check Codes: Bandwidth Efficient Channel Coding

    NASA Technical Reports Server (NTRS)

    Fong, Wai; Lin, Shu; Maki, Gary; Yeh, Pen-Shu

    2003-01-01

    Low Density Parity Check (LDPC) codes provide near-Shannon-capacity performance for NASA missions. These codes have high coding rates, R = 0.82 and 0.875, with moderate code lengths, n = 4096 and 8176. Their decoders have inherently parallel structures, which allows for high-speed implementation. Two codes based on Euclidean Geometry (EG) were selected for flight ASIC implementation. These codes are cyclic and quasi-cyclic in nature and therefore have a simple encoder structure, which results in power and size benefits. These codes also have a large minimum distance, as much as d_min = 65, giving them powerful error-correcting capabilities and error floors less than 10- BER. This paper will present development of the LDPC flight encoder and decoder, its applications and status.
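    The common thread in all parity-check decoding is that the syndrome of the received word against the parity-check matrix H reveals the error. The sketch below illustrates this on the tiny Hamming(7,4) code rather than an LDPC code; real LDPC decoders instead run iterative message passing over a large sparse H, but the syndrome idea is the same.

```python
# Syndrome decoding demo on Hamming(7,4). Row r of H checks the 1-based
# positions whose binary index has bit r set, so the syndrome, read as a
# binary number, directly names the position of a single flipped bit.

H = [[1, 0, 1, 0, 1, 0, 1],
     [0, 1, 1, 0, 0, 1, 1],
     [0, 0, 0, 1, 1, 1, 1]]

def syndrome(word):
    return [sum(h * x for h, x in zip(row, word)) % 2 for row in H]

def correct(word):
    s = syndrome(word)
    pos = s[0] + 2 * s[1] + 4 * s[2]   # 0 means "no error detected"
    if pos:
        word = word[:]
        word[pos - 1] ^= 1             # flip the located bit back
    return word

codeword = [0, 1, 1, 0, 0, 1, 1]       # a valid Hamming(7,4) codeword
received = codeword[:]
received[4] ^= 1                        # channel flips bit 5
assert syndrome(codeword) == [0, 0, 0]
assert correct(received) == codeword
```

    An LDPC decoder generalizes this: instead of reading the error position directly, each unsatisfied check iteratively votes on which of its (few) participating bits to distrust, which is what makes the decoder structure inherently parallel.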

  19. Advance Organizers: Concrete Versus Abstract.

    ERIC Educational Resources Information Center

    Corkill, Alice J.; And Others

    1988-01-01

    Two experiments examined the relative effects of concrete and abstract advance organizers on students' memory for subsequent prose. Results of the experiments are discussed in terms of the memorability, familiarity, and visualizability of concrete and abstract verbal materials. (JD)

  20. Modes of Visual Recognition and Perceptually Relevant Sketch-based Coding for Images

    NASA Technical Reports Server (NTRS)

    Jobson, Daniel J.

    1991-01-01

    A review of visual recognition studies is used to define two levels of information requirements. These two levels are related to two primary subdivisions of the spatial frequency domain of images and reflect two distinct different physical properties of arbitrary scenes. In particular, pathologies in recognition due to cerebral dysfunction point to a more complete split into two major types of processing: high spatial frequency edge based recognition vs. low spatial frequency lightness (and color) based recognition. The former is more central and general while the latter is more specific and is necessary for certain special tasks. The two modes of recognition can also be distinguished on the basis of physical scene properties: the highly localized edges associated with reflectance and sharp topographic transitions vs. smooth topographic undulation. The extreme case of heavily abstracted images is pursued to gain an understanding of the minimal information required to support both modes of recognition. Here the intention is to define the semantic core of transmission. This central core of processing can then be fleshed out with additional image information and coding and rendering techniques.
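    The two-channel split described above can be made concrete with a toy filter pair: a moving-average low-pass keeps the smooth lightness component, and the residual keeps the high-spatial-frequency edge component. The 1-D signal and filter radius below are illustrative choices, not values from the review.

```python
# Split a signal into low-frequency (smooth lightness) and high-frequency
# (edge) channels using a simple moving average and its residual.

def low_pass(signal, radius=2):
    n = len(signal)
    out = []
    for i in range(n):
        lo, hi = max(0, i - radius), min(n, i + radius + 1)
        out.append(sum(signal[lo:hi]) / (hi - lo))   # local mean
    return out

def high_pass(signal, radius=2):
    # The residual after smoothing is the edge/detail channel.
    return [x - l for x, l in zip(signal, low_pass(signal, radius))]

step_edge = [0.0] * 8 + [1.0] * 8           # a sharp edge in "lightness"
smooth = low_pass(step_edge)
edges = high_pass(step_edge)
# The low-pass channel keeps the overall light/dark structure; the
# high-pass channel is zero away from the edge and peaks around index 8.
assert max(abs(e) for e in edges[:4]) == 0.0
assert abs(edges[7]) > 0.2 and abs(edges[8]) > 0.2
```

    In the review's terms, the `smooth` channel carries the lightness/color information used by the more specific recognition mode, while `edges` carries the localized reflectance transitions used by the more general edge-based mode.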

  1. Abstracting and indexing guide

    USGS Publications Warehouse


    1974-01-01

    These instructions have been prepared for those who abstract and index scientific and technical documents for the Water Resources Scientific Information Center (WRSIC). With the recent publication growth in all fields, information centers have undertaken the task of keeping the various scientific communities aware of current and past developments. An abstract with carefully selected index terms offers the user of WRSIC services a more rapid means for deciding whether a document is pertinent to his needs and professional interests, thus saving him the time necessary to scan the complete work. These means also provide WRSIC with a document representation or surrogate which is more easily stored and manipulated to produce various services. Authors are asked to accept the responsibility for preparing abstracts of their own papers to facilitate quick evaluation, announcement, and dissemination to the scientific community.

  2. Publication Abstracts.

    ERIC Educational Resources Information Center

    Johns Hopkins Univ., Baltimore, MD. Center for the Study of Social Organization of Schools.

    This booklet contains abstracts of 62 documents published by the Johns Hopkins University Center for the Study of Social Organization of Schools from September 1967 to May 1970. The majority of the documents are research studies in the areas of desegregation, language development, educational opportunity, and educational games--most of them…

  3. Discrete Sparse Coding.

    PubMed

    Exarchakis, Georgios; Lücke, Jörg

    2017-11-01

    Sparse coding algorithms with continuous latent variables have been the subject of a large number of studies. However, discrete latent spaces for sparse coding have been largely ignored. In this work, we study sparse coding with latents described by discrete instead of continuous prior distributions. We consider the general case in which the latents (while being sparse) can take on any value of a finite set of possible values and in which we learn the prior probability of any value from data. This approach can be applied to any data generated by discrete causes, and it can be applied as an approximation of continuous causes. As the prior probabilities are learned, the approach then allows for estimating the prior shape without assuming specific functional forms. To efficiently train the parameters of our probabilistic generative model, we apply a truncated expectation-maximization approach (expectation truncation) that we modify to work with a general discrete prior. We evaluate the performance of the algorithm by applying it to a variety of tasks: (1) we use artificial data to verify that the algorithm can recover the generating parameters from a random initialization, (2) use image patches of natural images and discuss the role of the prior for the extraction of image components, (3) use extracellular recordings of neurons to present a novel method of analysis for spiking neurons that includes an intuitive discretization strategy, and (4) apply the algorithm on the task of encoding audio waveforms of human speech. The diverse set of numerical experiments presented in this letter suggests that discrete sparse coding algorithms can scale efficiently to work with realistic data sets and provide novel statistical quantities to describe the structure of the data.
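    The generative model described above can be sketched in miniature: latents take values in a small finite set, a penalty on non-zero entries plays the role of the sparse prior, and inference picks the discrete latent vector that best reconstructs the observation. Exhaustive search below stands in for the paper's truncated EM, and the dictionary and data are invented for illustration.

```python
from itertools import product

# Tiny discrete sparse coding: 2-D observations, 3 latent causes, each
# latent restricted to the finite set {0, 1, 2}. Inference minimizes
# squared reconstruction error plus a sparsity penalty per active latent.

W = [[1.0, 0.0, 0.5],        # dictionary (generative weights)
     [0.0, 1.0, 0.6]]
values = (0, 1, 2)           # the finite set each latent may take
sparsity_penalty = 0.1       # acts like -log prior odds of a non-zero latent

def reconstruct(s):
    return [sum(w * si for w, si in zip(row, s)) for row in W]

def energy(x, s):
    err = sum((xi - ri) ** 2 for xi, ri in zip(x, reconstruct(s)))
    return err + sparsity_penalty * sum(si != 0 for si in s)

x = [2.0, 1.0]               # observation generated as 2*column1 + 1*column2
best = min(product(values, repeat=3), key=lambda s: energy(x, s))
print("inferred discrete latents:", best)   # -> (2, 1, 0)
```

    The exhaustive search over 3^3 latent configurations is exactly what becomes intractable at scale, which is why the paper modifies expectation truncation to restrict the search to promising configurations.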

  4. Child Injury Deaths: Comparing Prevention Information from Two Coding Systems

    PubMed Central

    Schnitzer, Patricia G.; Ewigman, Bernard G.

    2006-01-01

    Objectives The International Classification of Diseases (ICD) external cause of injury E-codes do not sufficiently identify injury circumstances amenable to prevention. The researchers developed an alternative classification system (B-codes) that incorporates behavioral and environmental factors, for use in childhood injury research, and compare the two coding systems in this paper. Methods All fatal injuries among children less than age five that occurred between January 1, 1992, and December 31, 1994, were classified using both B-codes and E-codes. Results E-codes identified the most common causes of injury death: homicide (24%), fires (21%), motor vehicle incidents (21%), drowning (10%), and suffocation (9%). The B-codes further revealed that homicides (51%) resulted from the child being shaken or struck by another person; many fire deaths (42%) resulted from children playing with matches or lighters; drownings (46%) usually occurred in natural bodies of water; and most suffocation deaths (68%) occurred in unsafe sleeping arrangements. Conclusions B-codes identify additional information with specific relevance for prevention of childhood injuries. PMID:15944169

  5. The detection and extraction of interleaved code segments

    NASA Technical Reports Server (NTRS)

    Rugaber, Spencer; Stirewalt, Kurt; Wills, Linda M.

    1995-01-01

    This project is concerned with a specific difficulty that arises when trying to understand and modify computer programs. In particular, it is concerned with the phenomenon of 'interleaving' in which one section of a program accomplishes several purposes, and disentangling the code responsible for each purpose is difficult. Unraveling interleaved code involves discovering the purpose of each strand of computation, as well as understanding why the programmer decided to interleave the strands. Increased understanding improves the productivity and quality of software maintenance, enhancement, and documentation activities. It is the goal of the project to characterize the phenomenon of interleaving as a prerequisite for building tools to detect and extract interleaved code fragments.

  6. Contextual Processing of Abstract Concepts Reveals Neural Representations of Non-Linguistic Semantic Content

    PubMed Central

    Wilson-Mendenhall, Christine D.; Simmons, W. Kyle; Martin, Alex; Barsalou, Lawrence W.

    2014-01-01

    Concepts develop for many aspects of experience, including abstract internal states and abstract social activities that do not refer to concrete entities in the world. The current study assessed the hypothesis that, like concrete concepts, distributed neural patterns of relevant, non-linguistic semantic content represent the meanings of abstract concepts. In a novel neuroimaging paradigm, participants processed two abstract concepts (convince, arithmetic) and two concrete concepts (rolling, red) deeply and repeatedly during a concept-scene matching task that grounded each concept in typical contexts. Using a catch trial design, neural activity associated with each concept word was separated from neural activity associated with subsequent visual scenes to assess activations underlying the detailed semantics of each concept. We predicted that brain regions underlying mentalizing and social cognition (e.g., medial prefrontal cortex, superior temporal sulcus) would become active to represent semantic content central to convince, whereas brain regions underlying numerical cognition (e.g., bilateral intraparietal sulcus) would become active to represent semantic content central to arithmetic. The results supported these predictions, suggesting that the meanings of abstract concepts arise from distributed neural systems that represent concept-specific content. PMID:23363408

  7. WEC3: Wave Energy Converter Code Comparison Project: Preprint

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Combourieu, Adrien; Lawson, Michael; Babarit, Aurelien

    This paper describes the recently launched Wave Energy Converter Code Comparison (WEC3) project and presents preliminary results from this effort. The objectives of WEC3 are to verify and validate numerical modelling tools that have been developed specifically to simulate wave energy conversion devices and to inform the upcoming IEA OES Annex VI Ocean Energy Modelling Verification and Validation project. WEC3 is divided into two phases. Phase I consists of code-to-code verification and Phase II entails code-to-experiment validation. WEC3 focuses on mid-fidelity codes that simulate WECs using time-domain multibody dynamics methods to model device motions and hydrodynamic coefficients to model hydrodynamic forces. Consequently, high-fidelity numerical modelling tools, such as Navier-Stokes computational fluid dynamics simulation, and simple frequency domain modelling tools were not included in the WEC3 project.

  8. Tracking Holland Interest Codes: The Case of South African Field Guides

    ERIC Educational Resources Information Center

    Watson, Mark B.; Foxcroft, Cheryl D.; Allen, Lynda J.

    2007-01-01

    Holland believes that specific personality types seek out matching occupational environments and his theory codes personality and environment according to a six letter interest typology. Since 1985 there have been numerous American studies that have queried the validity of Holland's coding system. Research in South Africa is scarcer, despite…

  9. Phonological Coding Abilities: Identification of Impairments Related to Phonologically Based Reading Problems.

    ERIC Educational Resources Information Center

    Swank, Linda K.

    1994-01-01

    Relationships between phonological coding abilities and reading outcomes have implications for differential diagnosis of language-based reading problems. The theoretical construct of specific phonological coding ability is explained, including phonological encoding, phonological awareness and metaphonology, lexical access, working memory, and…

  10. Evaluation of Two PCR-based Swine-specific Fecal Source Tracking Assays (Abstract)

    EPA Science Inventory

    Several PCR-based methods have been proposed to identify swine fecal pollution in environmental waters. However, the utility of these assays in identifying swine fecal contamination on a broad geographic scale is largely unknown. In this study, we evaluated the specificity, distr...

  11. Dual coding: a cognitive model for psychoanalytic research.

    PubMed

    Bucci, W

    1985-01-01

    Four theories of mental representation derived from current experimental work in cognitive psychology have been discussed in relation to psychoanalytic theory. These are: verbal mediation theory, in which language determines or mediates thought; perceptual dominance theory, in which imagistic structures are dominant; common code or propositional models, in which all information, perceptual or linguistic, is represented in an abstract, amodal code; and dual coding, in which nonverbal and verbal information are each encoded, in symbolic form, in separate systems specialized for such representation, and connected by a complex system of referential relations. The weight of current empirical evidence supports the dual code theory. However, psychoanalysis has implicitly accepted a mixed model: perceptual dominance theory applying to unconscious representation, and verbal mediation theory characterizing mature conscious waking thought. The characterization of psychoanalysis, by Schafer, Spence, and others, as a domain in which reality is constructed rather than discovered, reflects the application of this incomplete mixed model. The representations of experience in the patient's mind are seen as without structure of their own, needing to be organized by words, thus vulnerable to distortion or dissolution by the language of the analyst or the patient himself. In these terms, hypothesis testing becomes a meaningless pursuit; the propositions of the theory are no longer falsifiable; the analyst is always more or less "right." This paper suggests that the integrated dual code formulation provides a more coherent theoretical framework for psychoanalysis than the mixed model, with important implications for theory and technique. In terms of dual coding, the problem is not that the nonverbal representations are vulnerable to distortion by words, but that the words that pass back and forth between analyst and patient will not affect the nonverbal schemata at all. Using the dual code

  12. Clinical application of ICF key codes to evaluate patients with dysphagia following stroke

    PubMed Central

    Dong, Yi; Zhang, Chang-Jie; Shi, Jie; Deng, Jinggui; Lan, Chun-Na

    2016-01-01

    Abstract This study aimed to identify and evaluate the International Classification of Functioning (ICF) key codes for dysphagia in stroke patients. Thirty patients with dysphagia after stroke were enrolled in our study. To evaluate the ICF dysphagia scale, 6 scales were used as comparisons, namely the Barthel Index (BI), Repetitive Saliva Swallowing Test (RSST), Kubota Water Swallowing Test (KWST), Frenchay Dysarthria Assessment, Mini-Mental State Examination (MMSE), and the Montreal Cognitive Assessment (MoCA). Multiple regression analysis was performed to quantitate the relationship between the ICF scale and the other scales. In addition, 60 ICF scales were analyzed by the least absolute shrinkage and selection operator (LASSO) method. A total of 21 ICF codes were identified, which were closely related to the other scales. These included 13 codes from Body Function, 1 from Body Structure, 3 from Activities and Participation, and 4 from Environmental Factors. A topographic network map with 30 ICF key codes was also generated to visualize their relationships. The number of ICF codes identified is in line with other well-established evaluation methods. The network topographic map generated here could be used as an instruction tool in future evaluations. We also found that attention functions and biting were critical codes of these scales, and could be used as treatment targets. PMID:27661012

  13. Student-Reported School Drinking Fountain Availability by Youth Characteristics and State Plumbing Codes

    PubMed Central

    Park, Sohyun; Wilking, Cara

    2014-01-01

    Introduction Caloric intake among children could be reduced if sugar-sweetened beverages were replaced by plain water. School drinking water infrastructure is dictated in part by state plumbing codes, which generally require a minimum ratio of drinking fountains to students. Actual availability of drinking fountains in schools and how availability differs according to plumbing codes is unknown. Methods We abstracted state plumbing code data and used the 2010 YouthStyles survey data from 1,196 youth aged 9 through 18 years from 47 states. We assessed youth-reported school drinking fountain or dispenser availability and differences in availability according to state plumbing codes, sociodemographic characteristics, and area-level characteristics. Results Overall, 57.3% of youth reported that drinking fountains or dispensers in their schools were widely available, 40.1% reported there were only a few, and 2.6% reported that there were no working fountains. Reported fountain availability differed significantly (P < .01) by race/ethnicity, census region, the fountain to student ratio specified in plumbing codes, and whether plumbing codes allowed substitution of nonplumbed water sources for plumbed fountains. “Widely available” fountain access ranged from 45.7% in the West to 65.4% in the Midwest and was less common where state plumbing codes required 1 fountain per more than 100 students (45.4%) compared with 1 fountain per 100 students (60.1%) or 1 fountain per fewer than 100 students (57.6%). Conclusion Interventions designed to increase consumption of water may want to consider the role of plumbing codes in availability of school drinking fountains. PMID:24742393

  14. Student-reported school drinking fountain availability by youth characteristics and state plumbing codes.

    PubMed

    Onufrak, Stephen J; Park, Sohyun; Wilking, Cara

    2014-04-17

    Caloric intake among children could be reduced if sugar-sweetened beverages were replaced by plain water. School drinking water infrastructure is dictated in part by state plumbing codes, which generally require a minimum ratio of drinking fountains to students. Actual availability of drinking fountains in schools and how availability differs according to plumbing codes is unknown. We abstracted state plumbing code data and used the 2010 YouthStyles survey data from 1,196 youth aged 9 through 18 years from 47 states. We assessed youth-reported school drinking fountain or dispenser availability and differences in availability according to state plumbing codes, sociodemographic characteristics, and area-level characteristics. Overall, 57.3% of youth reported that drinking fountains or dispensers in their schools were widely available, 40.1% reported there were only a few, and 2.6% reported that there were no working fountains. Reported fountain availability differed significantly (P < .01) by race/ethnicity, census region, the fountain to student ratio specified in plumbing codes, and whether plumbing codes allowed substitution of nonplumbed water sources for plumbed fountains. "Widely available" fountain access ranged from 45.7% in the West to 65.4% in the Midwest and was less common where state plumbing codes required 1 fountain per more than 100 students (45.4%) compared with 1 fountain per 100 students (60.1%) or 1 fountain per fewer than 100 students (57.6%). Interventions designed to increase consumption of water may want to consider the role of plumbing codes in availability of school drinking fountains.

  15. SEER Abstracting Tool (SEER*Abs)

    Cancer.gov

    With this customizable tool, registrars can collect and store data abstracted from medical records. Download the software and find technical support and reference manuals. SEER*Abs has features for creating records, managing abstracting work and data, accessing reference data, and integrating edits.

  16. Testing the abstractness of children’s linguistic representations

    PubMed Central

    Savage, Ceri; Lieven, Elena; Theakston, Anna; Tomasello, Michael

    2007-01-01

    The current studies used a priming methodology to assess the abstractness of children's early syntactic constructions. In the main study, 3-, 4- and 6-year-old children were asked to describe a prime picture by repeating either an active or a passive sentence, and then they were left to their own devices to describe a target picture. For half the children at each age, the prime sentences they repeated had high lexical overlap with the sentence they were likely to produce for the target, whereas for the other half there was very low lexical overlap between prime and target. The main result was that 6-year-old children showed both lexical and structural priming for both the active transitive and passive constructions, whereas 3- and 4-year-old children showed lexical priming only. This pattern of results would seem to indicate that 6-year-old children have relatively abstract representations of these constructions, whereas 3- and 4-year-old children have as an integral part of their representations certain specific lexical items, especially pronouns and some grammatical morphemes. In a second study it was found that children did not need to repeat the prime out loud in order to be primed, suggesting that the priming effect observed concerns not just peripheral production mechanisms but underlying linguistic representations common to comprehension and production. These results support the view that young children develop abstract linguistic representations gradually during the preschool years. PMID:18259588

  17. The Abstraction Process of Limit Knowledge

    ERIC Educational Resources Information Center

    Sezgin Memnun, Dilek; Aydin, Bünyamin; Özbilen, Ömer; Erdogan, Günes

    2017-01-01

    The RBC+C abstraction model is an effective model in mathematics education because it gives the opportunity to analyze research data through cognitive actions. For this reason, we aim to examine the abstraction process of the limit knowledge of two volunteer participant students using the RBC+C abstraction model. With this aim, the students'…

  18. [Differentiation of coding quality in orthopaedics by special, illustration-oriented case group analysis in the G-DRG System 2005].

    PubMed

    Schütz, U; Reichel, H; Dreinhöfer, K

    2007-01-01

    We introduce a grouping system for clinical practice which allows the separation of DRG coding in specific orthopaedic groups based on anatomic regions, operative procedures, therapeutic interventions and morbidity equivalent diagnosis groups. This permits a differentiated, goal-oriented analysis of internally documented DRG data. The group-specific difference of the coding quality between the DRG groups following primary coding by the orthopaedic surgeon and final coding by the medical controlling is analysed. In a consecutive series of 1600 patients parallel documentation and group-specific comparison of the relevant DRG parameters were carried out in every case after primary and final coding. Analysing the group-specific share in the additional CaseMix coding, the group "spine surgery" dominated, closely followed by the groups "arthroplasty" and "surgery due to infection, tumours, diabetes". Altogether, additional cost-weight-relevant coding was necessary most frequently in the latter group (84%), followed by the group "spine surgery" (65%). In DRGs representing conservative orthopaedic treatment, documented procedures had nearly no influence on the cost weight. The introduced system of case group analysis in internal DRG documentation can lead to the detection of specific problems in primary coding and cost-weight-relevant changes of the case mix. As an instrument for internal process control in the orthopaedic field, it can serve as a communicative interface between an economically oriented classification of the hospital performance and a specific problem solution of the medical staff involved in the department management.

  19. A Proposed Multimedia Cone of Abstraction: Updating a Classic Instructional Design Theory

    ERIC Educational Resources Information Center

    Baukal, Charles E.; Ausburn, Floyd B.; Ausburn, Lynna J.

    2013-01-01

    Advanced multimedia techniques offer significant learning potential for students. Dale (1946, 1954, 1969) developed a Cone of Experience (CoE) which is a hierarchy of learning experiences ranging from direct participation to abstract symbolic expression. This paper updates the CoE for today's technology and learning context, specifically focused…

  20. Efficient convolutional sparse coding

    DOEpatents

    Wohlberg, Brendt

    2017-06-20

    Computationally efficient algorithms may be applied for fast dictionary learning solving the convolutional sparse coding problem in the Fourier domain. More specifically, efficient convolutional sparse coding may be derived within an alternating direction method of multipliers (ADMM) framework that utilizes fast Fourier transforms (FFT) to solve the main linear system in the frequency domain. Such algorithms may enable a significant reduction in computational cost over conventional approaches by implementing a linear solver for the most critical and computationally expensive component of the conventional iterative algorithm. The theoretical computational cost of the algorithm may be reduced from O(M.sup.3N) to O(MN log N), where N is the dimensionality of the data and M is the number of elements in the dictionary. This significant improvement in efficiency may greatly increase the range of problems that can practically be addressed via convolutional sparse representations.
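    The efficiency claim rests on the fact that circular convolution diagonalizes under the Fourier transform, so the main linear system (D^H D + rho I) x = b splits into one rank-one-plus-identity system per frequency, each solvable in O(M) via the Sherman-Morrison identity. Below is a rough sketch of that per-frequency solve for a 1-D signal and a single ADMM subproblem; the filter contents, rho, and the right-hand side are placeholder assumptions, not the patented implementation:

```python
import numpy as np

rng = np.random.default_rng(1)
N, M = 64, 4                    # signal length, number of dictionary filters
D = rng.normal(size=(M, N))     # stand-in filters (real ones are short, zero-padded to N)
s = rng.normal(size=N)          # signal to be represented
rho = 1.0
z_u = rng.normal(size=(M, N))   # stand-in for the ADMM term (z - u)

Df = np.fft.fft(D, axis=1)
sf = np.fft.fft(s)
bf = Df.conj() * sf + rho * np.fft.fft(z_u, axis=1)   # RHS of (D^H D + rho I) x = b

# At each frequency k the system matrix is rho*I + a a^H with a = Df[:, k].conj(),
# so Sherman-Morrison gives the solve in O(M) per frequency instead of O(M^3):
a = Df.conj()
aHb = np.sum(a.conj() * bf, axis=0)
aHa = np.sum(np.abs(a) ** 2, axis=0)
xf = (bf - a * (aHb / (rho + aHa))) / rho
x = np.real(np.fft.ifft(xf, axis=1))   # coefficient maps back in the signal domain
```

    Multiplying `xf` back by rho*I + a a^H at each frequency recovers `bf`, which confirms the identity; the FFTs account for the O(MN log N) total cost quoted in the abstract.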

  1. Letter Position Coding Across Modalities: The Case of Braille Readers

    PubMed Central

    Perea, Manuel; García-Chamorro, Cristina; Martín-Suesta, Miguel; Gómez, Pablo

    2012-01-01

    Background The question of how the brain encodes letter position in written words has attracted increasing attention in recent years. A number of models have recently been proposed to accommodate the fact that transposed-letter stimuli like jugde or caniso are perceptually very close to their base words. Methodology Here we examined how letter position coding is attained in the tactile modality via Braille reading. The idea is that Braille word recognition may provide more serial processing than the visual modality, and this may produce differences in the input coding schemes employed to encode letters in written words. To that end, we conducted a lexical decision experiment with adult Braille readers in which the pseudowords were created by transposing/replacing two letters. Principal Findings We found a word-frequency effect for words. In addition, unlike parallel experiments in the visual modality, we failed to find any clear signs of transposed-letter confusability effects. This dissociation highlights the differences between modalities. Conclusions The present data argue against models of letter position coding that assume that transposed-letter effects (in the visual modality) occur at a relatively late, abstract locus. PMID:23071522

  2. Validation and optimisation of an ICD-10-coded case definition for sepsis using administrative health data

    PubMed Central

    Jolley, Rachel J; Jetté, Nathalie; Sawka, Keri Jo; Diep, Lucy; Goliath, Jade; Roberts, Derek J; Yipp, Bryan G; Doig, Christopher J

    2015-01-01

    Objective Administrative health data are important for health services and outcomes research. We optimised and validated in intensive care unit (ICU) patients an International Classification of Disease (ICD)-coded case definition for sepsis, and compared this with an existing definition. We also assessed the definition's performance in non-ICU (ward) patients. Setting and participants All adults (aged ≥18 years) admitted to a multisystem ICU with general medicosurgical ICU care from one of three tertiary care centres in the Calgary region in Alberta, Canada, between 1 January 2009 and 31 December 2012 were included. Research design Patient medical records were randomly selected and linked to the discharge abstract database. In ICU patients, we validated the Canadian Institute for Health Information (CIHI) ICD-10-CA (Canadian Revision)-coded definition for sepsis and severe sepsis against a reference standard medical chart review, and optimised this algorithm through examination of other conditions apparent in sepsis. Measures Sensitivity (Sn), specificity (Sp), positive predictive value (PPV) and negative predictive value (NPV) were calculated. Results Sepsis was present in 604 of 1001 ICU patients (60.4%). The CIHI ICD-10-CA-coded definition for sepsis had Sn (46.4%), Sp (98.7%), PPV (98.2%) and NPV (54.7%); and for severe sepsis had Sn (47.2%), Sp (97.5%), PPV (95.3%) and NPV (63.2%). The optimised ICD-coded algorithm for sepsis increased Sn by 25.5% and NPV by 11.9% with slightly lowered Sp (85.4%) and PPV (88.2%). For severe sepsis both Sn (65.1%) and NPV (70.1%) increased, while Sp (88.2%) and PPV (85.6%) decreased slightly. Conclusions This study demonstrates that sepsis is highly undercoded in administrative data, thus under-ascertaining the true incidence of sepsis. The optimised ICD-coded definition has a higher validity with higher Sn and should be preferentially considered if used for surveillance purposes. PMID:26700284
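    The validation metrics reported throughout this record follow from a 2x2 table of the coded algorithm's result against the chart-review reference standard. A minimal sketch; the counts below are back-calculated assumptions that roughly reproduce the reported values (Sn 46.4%, Sp 98.7%, PPV 98.2%, NPV 54.7%), not figures taken from the paper's tables:

```python
def diagnostic_metrics(tp, fp, fn, tn):
    """Sensitivity, specificity, PPV and NPV from a 2x2 validation table
    (ICD-code positive/negative vs. chart-review positive/negative)."""
    return {
        "Sn": tp / (tp + fn),    # sensitivity: coded cases / truly septic
        "Sp": tn / (tn + fp),    # specificity: uncoded / truly non-septic
        "PPV": tp / (tp + fp),   # how trustworthy a positive code is
        "NPV": tn / (tn + fn),   # how trustworthy a negative code is
    }

# Illustrative counts consistent with 604 septic of 1001 ICU charts.
m = diagnostic_metrics(tp=280, fp=5, fn=324, tn=392)
```

    The pattern here (high Sp and PPV, low Sn and NPV) is exactly the undercoding signature the authors describe: when a sepsis code is present it is almost always right, but many true cases carry no code.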

  3. Validation and optimisation of an ICD-10-coded case definition for sepsis using administrative health data.

    PubMed

    Jolley, Rachel J; Quan, Hude; Jetté, Nathalie; Sawka, Keri Jo; Diep, Lucy; Goliath, Jade; Roberts, Derek J; Yipp, Bryan G; Doig, Christopher J

    2015-12-23

    Administrative health data are important for health services and outcomes research. We optimised and validated in intensive care unit (ICU) patients an International Classification of Disease (ICD)-coded case definition for sepsis, and compared this with an existing definition. We also assessed the definition's performance in non-ICU (ward) patients. All adults (aged ≥ 18 years) admitted to a multisystem ICU with general medicosurgical ICU care from one of three tertiary care centres in the Calgary region in Alberta, Canada, between 1 January 2009 and 31 December 2012 were included. Patient medical records were randomly selected and linked to the discharge abstract database. In ICU patients, we validated the Canadian Institute for Health Information (CIHI) ICD-10-CA (Canadian Revision)-coded definition for sepsis and severe sepsis against a reference standard medical chart review, and optimised this algorithm through examination of other conditions apparent in sepsis. Sensitivity (Sn), specificity (Sp), positive predictive value (PPV) and negative predictive value (NPV) were calculated. Sepsis was present in 604 of 1001 ICU patients (60.4%). The CIHI ICD-10-CA-coded definition for sepsis had Sn (46.4%), Sp (98.7%), PPV (98.2%) and NPV (54.7%); and for severe sepsis had Sn (47.2%), Sp (97.5%), PPV (95.3%) and NPV (63.2%). The optimised ICD-coded algorithm for sepsis increased Sn by 25.5% and NPV by 11.9% with slightly lowered Sp (85.4%) and PPV (88.2%). For severe sepsis both Sn (65.1%) and NPV (70.1%) increased, while Sp (88.2%) and PPV (85.6%) decreased slightly. This study demonstrates that sepsis is highly undercoded in administrative data, thus under-ascertaining the true incidence of sepsis. The optimised ICD-coded definition has a higher validity with higher Sn and should be preferentially considered if used for surveillance purposes.

  4. Multiphysics Code Demonstrated for Propulsion Applications

    NASA Technical Reports Server (NTRS)

    Lawrence, Charles; Melis, Matthew E.

    1998-01-01

    The utility of multidisciplinary analysis tools for aeropropulsion applications is being investigated at the NASA Lewis Research Center. The goal of this project is to apply Spectrum, a multiphysics code developed by Centric Engineering Systems, Inc., to simulate multidisciplinary effects in turbomachinery components. Many engineering problems today involve detailed computer analyses to predict the thermal, aerodynamic, and structural response of a mechanical system as it undergoes service loading. Analysis of aerospace structures generally requires attention in all three disciplinary areas to adequately predict component service behavior, and in many cases, the results from one discipline substantially affect the outcome of the other two. There are numerous computer codes currently available in the engineering community to perform such analyses in each of these disciplines. Many of these codes are developed and used in-house by a given organization, and many are commercially available. However, few, if any, of these codes are designed specifically for multidisciplinary analyses. The Spectrum code has been developed for performing fully coupled fluid, thermal, and structural analyses on a mechanical system with a single simulation that accounts for all simultaneous interactions, thus eliminating the requirement for running a large number of sequential, separate, disciplinary analyses. The Spectrum code has a true multiphysics analysis capability, which improves analysis efficiency as well as accuracy. Centric Engineering, Inc., working with a team of Lewis and AlliedSignal Engines engineers, has been evaluating Spectrum for a variety of propulsion applications including disk quenching, drum cavity flow, aeromechanical simulations, and a centrifugal compressor flow simulation.

  5. Rate-compatible punctured convolutional codes (RCPC codes) and their applications

    NASA Astrophysics Data System (ADS)

    Hagenauer, Joachim

    1988-04-01

    The concept of punctured convolutional codes is extended by punctuating a low-rate 1/N code periodically with period P to obtain a family of codes with rate P/(P + l), where l can be varied between 1 and (N - 1)P. A rate-compatibility restriction on the puncturing tables ensures that all code bits of high rate codes are used by the lower-rate codes. This allows transmission of incremental redundancy in ARQ/FEC (automatic repeat request/forward error correction) schemes and continuous rate variation to change from low to high error protection within a data frame. Families of RCPC codes with rates between 8/9 and 1/4 are given for memories M from 3 to 6 (8 to 64 trellis states) together with the relevant distance spectra. These codes are almost as good as the best known general convolutional codes of the respective rates. It is shown that the same Viterbi decoder can be used for all RCPC codes of the same M. The application of RCPC codes to hybrid ARQ/FEC schemes is discussed for Gaussian and Rayleigh fading channels using channel-state information to optimize throughput.
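    The puncturing-table mechanism can be sketched directly: a rate-1/N mother code emits N*P bits per period of P information bits, and each table keeps P + l of them. The tables below are made-up examples for P = 4 and a rate-1/2 mother code, chosen only to show the nesting that the rate-compatibility restriction requires; they are not Hagenauer's published tables:

```python
import numpy as np

P, N = 4, 2   # puncturing period P, mother code rate 1/N

# Illustrative rate-compatible puncturing tables. Each N x P mask covers one
# period of mother-code output; rate-compatibility means the masks are nested,
# so every bit kept at a high rate is also kept at every lower rate and
# incremental redundancy can be sent on retransmission.
tables = {
    "4/5": np.array([[1, 1, 1, 1],
                     [1, 0, 0, 0]]),   # keep P + 1 bits per period -> rate 4/5
    "4/6": np.array([[1, 1, 1, 1],
                     [1, 0, 1, 0]]),   # keep P + 2 bits per period -> rate 4/6 = 2/3
    "4/8": np.array([[1, 1, 1, 1],
                     [1, 1, 1, 1]]),   # keep all N*P bits -> mother rate 1/2
}

def puncture(coded, mask):
    """Drop mother-code bits where mask == 0 (bit ordering simplified for the sketch)."""
    tiled = np.tile(mask, (1, coded.shape[1] // mask.shape[1]))
    return coded[tiled.astype(bool)]

info_bits = 8
coded = (np.arange(N * info_bits) % 2).reshape(N, info_bits)  # stand-in rate-1/2 output
sent = {r: puncture(coded, t) for r, t in tables.items()}
rates = {r: info_bits / len(bits) for r, bits in sent.items()}
```

    Because the kept positions of each higher-rate table are a subset of every lower-rate table's, a decoder for the mother code plus one depuncturing step handles the entire family, which is why a single Viterbi decoder suffices for all RCPC codes of the same memory M.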

  6. Abstracts of Research Papers 1970.

    ERIC Educational Resources Information Center

    Drowatzky, John N., Ed.

    This publication includes the abstracts of 199 research papers presented at the 1970 American Association for Health, Physical Education, and Recreation convention in Seattle, Washington. Abstracts from symposia on environmental quality education, obesity, motor development, research methods, and laboratory equipment are also included. Each…

  7. Engineering Codes of Ethics and the Duty to Set a Moral Precedent.

    PubMed

    Schlossberger, Eugene

    2016-10-01

    Each of the major engineering societies has its own code of ethics. Seven "common core" clauses and several code-specific clauses can be identified. The paper articulates objections to and rationales for two clauses that raise controversy: do engineers have a duty (a) to provide pro bono services and/or speak out on major issues, and (b) to associate only with reputable individuals and organizations? This latter "association clause" can be justified by the "proclamative principle," an alternative to Kant's universalizability requirement. At the heart of engineering codes of ethics, and implicit in what it is to be a moral agent, the "proclamative principle" asserts that one's life should proclaim one's moral stances (one's values, principles, perceptions, etc.). More specifically, it directs engineers to strive to insure that their actions, thoughts, and relationships be fit to offer to their communities as part of the body of moral precedents for how to be an engineer. Understanding codes of ethics as reflections of this principle casts light both on how to apply the codes and on the distinction between private and professional morality.

  8. 21 CFR 20.115 - Product codes for manufacturing or sales dates.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 21 Food and Drugs 1 2010-04-01 2010-04-01 false Product codes for manufacturing or sales dates. 20.115 Section 20.115 Food and Drugs FOOD AND DRUG ADMINISTRATION, DEPARTMENT OF HEALTH AND HUMAN SERVICES GENERAL PUBLIC INFORMATION Availability of Specific Categories of Records § 20.115 Product codes...

  9. Amino acid codes in mitochondria as possible clues to primitive codes

    NASA Technical Reports Server (NTRS)

    Jukes, T. H.

    1981-01-01

    Differences between mitochondrial codes and the universal code indicate that an evolutionary simplification has taken place, rather than a return to a more primitive code. However, these differences make it evident that the universal code is not the only code possible, and therefore earlier codes may have differed markedly from the present code. The present universal code is probably a 'frozen accident.' The change in CUN codons from leucine to threonine (Neurospora vs. yeast mitochondria) indicates that neutral or near-neutral changes occurred in the corresponding proteins when this code change took place, caused presumably by a mutation in a tRNA gene.

  10. Clarifying the abstracts of systematic literature reviews*

    PubMed Central

    Hartley, James

    2000-01-01

    Background: There is a small body of research on improving the clarity of abstracts in general that is relevant to improving the clarity of abstracts of systematic reviews. Objectives: To summarize this earlier research and indicate its implications for writing the abstracts of systematic reviews. Method: Literature review with commentary on three main features affecting the clarity of abstracts: their language, structure, and typographical presentation. Conclusions: The abstracts of systematic reviews should be easier to read than the abstracts of medical research articles, as they are targeted at a wider audience. The aims, methods, results, and conclusions of systematic reviews need to be presented in a consistent way to help search and retrieval. The typographic detailing of the abstracts (type-sizes, spacing, and weights) should be planned to help, rather than confuse, the reader. PMID:11055300

  11. Hierarchical specification of the SIFT fault tolerant flight control system

    NASA Technical Reports Server (NTRS)

    Melliar-Smith, P. M.; Schwartz, R. L.

    1981-01-01

The specification and mechanical verification of the Software Implemented Fault Tolerance (SIFT) flight control system is described. The methodology employed in the verification effort is discussed, and a description of the hierarchical models of the SIFT system is given. To meet NASA's objective for the reliability of safety-critical flight control systems, the SIFT computer must achieve a reliability well beyond the levels at which reliability can actually be measured. The methodology employed to demonstrate rigorously that the SIFT computer meets its reliability requirements is described. The hierarchy of design specifications, from very abstract descriptions of system function down to the actual implementation, is explained. The most abstract design specifications can be used to verify that the system functions correctly and with the desired reliability, since almost all details of the realization were abstracted out. A succession of lower-level models refines these specifications to the level of the actual implementation, and can be used to demonstrate that the implementation has the properties claimed of the abstract design specifications.

  12. Medial orbitofrontal cortex codes relative rather than absolute value of financial rewards in humans.

    PubMed

    Elliott, R; Agnew, Z; Deakin, J F W

    2008-05-01

    Functional imaging studies in recent years have confirmed the involvement of orbitofrontal cortex (OFC) in human reward processing and have suggested that OFC responses are context-dependent. A seminal electrophysiological experiment in primates taught animals to associate abstract visual stimuli with differently valuable food rewards. Subsequently, pairs of these learned abstract stimuli were presented and firing of OFC neurons to the medium-value stimulus was measured. OFC firing was shown to depend on the relative value context. In this study, we developed a human analogue of this paradigm and scanned subjects using functional magnetic resonance imaging. The analysis compared neuronal responses to two superficially identical events, which differed only in terms of the preceding context. Medial OFC response to the same perceptual stimulus was greater when the stimulus predicted the more valuable of two rewards than when it predicted the less valuable. Additional responses were observed in other components of reward circuitry, the amygdala and ventral striatum. The central finding is consistent with the primate results and suggests that OFC neurons code relative rather than absolute reward value. Amygdala and striatal involvement in coding reward value is also consistent with recent functional imaging data. By using a simpler and less confounded paradigm than many functional imaging studies, we are able to demonstrate that relative financial reward value per se is coded in distinct subregions of an extended reward and decision-making network.

  13. Analysis of quantum error-correcting codes: Symplectic lattice codes and toric codes

    NASA Astrophysics Data System (ADS)

    Harrington, James William

    Quantum information theory is concerned with identifying how quantum mechanical resources (such as entangled quantum states) can be utilized for a number of information processing tasks, including data storage, computation, communication, and cryptography. Efficient quantum algorithms and protocols have been developed for performing some tasks (e.g. , factoring large numbers, securely communicating over a public channel, and simulating quantum mechanical systems) that appear to be very difficult with just classical resources. In addition to identifying the separation between classical and quantum computational power, much of the theoretical focus in this field over the last decade has been concerned with finding novel ways of encoding quantum information that are robust against errors, which is an important step toward building practical quantum information processing devices. In this thesis I present some results on the quantum error-correcting properties of oscillator codes (also described as symplectic lattice codes) and toric codes. Any harmonic oscillator system (such as a mode of light) can be encoded with quantum information via symplectic lattice codes that are robust against shifts in the system's continuous quantum variables. I show the existence of lattice codes whose achievable rates match the one-shot coherent information over the Gaussian quantum channel. Also, I construct a family of symplectic self-dual lattices and search for optimal encodings of quantum information distributed between several oscillators. Toric codes provide encodings of quantum information into two-dimensional spin lattices that are robust against local clusters of errors and which require only local quantum operations for error correction. Numerical simulations of this system under various error models provide a calculation of the accuracy threshold for quantum memory using toric codes, which can be related to phase transitions in certain condensed matter models. 
I also present

  14. Abstracts Produced Using Computer Assistance.

    ERIC Educational Resources Information Center

    Craven, Timothy C.

    2000-01-01

    Describes an experiment that evaluated features of TEXNET abstracting software, compared the use of keywords and phrases that were automatically extracted, tested hypotheses about relations between abstractors' backgrounds and their reactions to abstracting assistance software, and obtained ideas for further features to be developed in TEXNET.…

  15. Validating the Use of ICD-9 Code Mapping to Generate Injury Severity Scores

    PubMed Central

    Fleischman, Ross J.; Mann, N. Clay; Dai, Mengtao; Holmes, James F.; Wang, N. Ewen; Haukoos, Jason; Hsia, Renee Y.; Rea, Thomas; Newgard, Craig D.

    2017-01-01

    The Injury Severity Score (ISS) is a measure of injury severity widely used for research and quality assurance in trauma. Calculation of ISS requires chart abstraction, so it is often unavailable for patients cared for in nontrauma centers. Whether ISS can be accurately calculated from International Classification of Diseases, Ninth Revision (ICD-9) codes remains unclear. Our objective was to compare ISS derived from ICD-9 codes with those coded by trauma registrars. This was a retrospective study of patients entered into 9 U.S. trauma registries from January 2006 through December 2008. Two computer programs, ICDPIC and ICDMAP, were used to derive ISS from the ICD-9 codes in the registries. We compared derived ISS with ISS hand-coded by trained coders. There were 24,804 cases with a mortality rate of 3.9%. The median ISS derived by both ICDPIC (ISS-ICDPIC) and ICDMAP (ISS-ICDMAP) was 8 (interquartile range [IQR] = 4–13). The median ISS in the registry (ISS-registry) was 9 (IQR = 4–14). The median difference between either of the derived scores and ISS-registry was zero. However, the mean ISS derived by ICD-9 code mapping was lower than the hand-coded ISS in the registries (1.7 lower for ICDPIC, 95% CI [1.7, 1.8], Bland–Altman limits of agreement = −10.5 to 13.9; 1.8 lower for ICDMAP, 95% CI [1.7, 1.9], limits of agreement = −9.6 to 13.3). ICD-9-derived ISS slightly underestimated ISS compared with hand-coded scores. The 2 methods showed moderate to substantial agreement. Although hand-coded scores should be used when possible, ICD-9-derived scores may be useful in quality assurance and research when hand-coded scores are unavailable. PMID:28033134
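ICDPIC and ICDMAP both map ICD-9 codes to Abbreviated Injury Scale (AIS) severities and then apply the standard ISS definition; a minimal sketch of that final scoring step (the ICD-to-AIS lookup tables are the tools' own and are omitted here):

```python
def injury_severity_score(region_ais):
    """ISS from a mapping of body region -> highest AIS severity (1-6).

    ISS is the sum of squares of the three most severe body-region scores;
    any unsurvivable injury (AIS 6) sets ISS to the maximum value of 75.
    """
    scores = sorted(region_ais.values(), reverse=True)
    if any(s == 6 for s in scores):
        return 75
    return sum(s * s for s in scores[:3])

# Example: head AIS 4, chest AIS 3, lower extremity AIS 2 -> 16 + 9 + 4 = 29
print(injury_severity_score({"head": 4, "chest": 3, "lower extremity": 2}))  # 29
```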

  16. Two-Way Satellite Time and Frequency Transfer Using 1 MChips/s Codes

    DTIC Science & Technology

    2009-11-01

The Ku-band transatlantic and Europe-to-Europe two-way satellite time and frequency transfer (TWSTFT) operations used 2.5 MChip/s... pseudo-random codes with 3.5 MHz bandwidth until the end of July 2009. The cost of TWSTFT operation is associated with the bandwidth used on a... geostationary satellite. The transatlantic and Europe-to-Europe TWSTFT operations faced a significant increase in cost for using 3.5 MHz bandwidth on a new

  17. Electrophysiological responses to feedback during the application of abstract rules.

    PubMed

    Walsh, Matthew M; Anderson, John R

    2013-11-01

    Much research focuses on how people acquire concrete stimulus-response associations from experience; however, few neuroscientific studies have examined how people learn about and select among abstract rules. To address this issue, we recorded ERPs as participants performed an abstract rule-learning task. In each trial, they viewed a sample number and two test numbers. Participants then chose a test number using one of three abstract mathematical rules they freely selected from: greater than the sample number, less than the sample number, or equal to the sample number. No one rule was always rewarded, but some rules were rewarded more frequently than others. To maximize their earnings, participants needed to learn which rules were rewarded most frequently. All participants learned to select the best rules for repeating and novel stimulus sets that obeyed the overall reward probabilities. Participants differed, however, in the extent to which they overgeneralized those rules to repeating stimulus sets that deviated from the overall reward probabilities. The feedback-related negativity (FRN), an ERP component thought to reflect reward prediction error, paralleled behavior. The FRN was sensitive to item-specific reward probabilities in participants who detected the deviant stimulus set, and the FRN was sensitive to overall reward probabilities in participants who did not. These results show that the FRN is sensitive to the utility of abstract rules and that the individual's representation of a task's states and actions shapes behavior as well as the FRN.

  18. Electrophysiological Responses to Feedback during the Application of Abstract Rules

    PubMed Central

    Walsh, Matthew M.; Anderson, John R.

    2017-01-01

    Much research focuses on how people acquire concrete stimulus–response associations from experience; however, few neuroscientific studies have examined how people learn about and select among abstract rules. To address this issue, we recorded ERPs as participants performed an abstract rule-learning task. In each trial, they viewed a sample number and two test numbers. Participants then chose a test number using one of three abstract mathematical rules they freely selected from: greater than the sample number, less than the sample number, or equal to the sample number. No one rule was always rewarded, but some rules were rewarded more frequently than others. To maximize their earnings, participants needed to learn which rules were rewarded most frequently. All participants learned to select the best rules for repeating and novel stimulus sets that obeyed the overall reward probabilities. Participants differed, however, in the extent to which they overgeneralized those rules to repeating stimulus sets that deviated from the overall reward probabilities. The feedback-related negativity (FRN), an ERP component thought to reflect reward prediction error, paralleled behavior. The FRN was sensitive to item-specific reward probabilities in participants who detected the deviant stimulus set, and the FRN was sensitive to overall reward probabilities in participants who did not. These results show that the FRN is sensitive to the utility of abstract rules and that the individualʼs representation of a taskʼs states and actions shapes behavior as well as the FRN. PMID:23915052

  19. Transient Ejector Analysis (TEA) code user's guide

    NASA Technical Reports Server (NTRS)

    Drummond, Colin K.

    1993-01-01

A FORTRAN computer program for the semi-analytic prediction of unsteady thrust-augmenting ejector performance has been developed, based on a theoretical analysis for ejectors. That analysis blends classic self-similar turbulent jet descriptions with control-volume mixing region elements. Division of the ejector into an inlet, diffuser, and mixing region allowed flexibility in the modeling of the physics for each region. In particular, the inlet and diffuser analyses are simplified by a quasi-steady analysis, justified by the assumption that pressure is the forcing function in those regions. Only the mixing region is assumed to be dominated by viscous effects. The present work provides an overview of the code structure, a description of the required input and output data file formats, and the results for a test case. Since there are limitations to the code for applications outside the bounds of the test case, the user should consider TEA as a research code (not as a production code), designed specifically as an implementation of the proposed ejector theory. Program error flags are discussed, and some diagnostic routines are presented.

  20. Food Science and Technology Abstracts.

    ERIC Educational Resources Information Center

    Cohen, Elinor; Federman, Joan

    1979-01-01

    Introduces the reader to the Food Science and Technology Abstracts, a data file that covers worldwide literature on human food commodities and aspects of food processing. Topics include scope, subject index, thesaurus, searching online, and abstracts; tables provide a comparison of ORBIT and DIALOG versions of the file. (JD)

  1. Canonical microcircuits for predictive coding

    PubMed Central

    Bastos, Andre M.; Usrey, W. Martin; Adams, Rick A.; Mangun, George R.; Fries, Pascal; Friston, Karl J.

    2013-01-01

    Summary This review considers the influential notion of a canonical (cortical) microcircuit in light of recent theories about neuronal processing. Specifically, we conciliate quantitative studies of microcircuitry and the functional logic of neuronal computations. We revisit the established idea that message passing among hierarchical cortical areas implements a form of Bayesian inference – paying careful attention to the implications for intrinsic connections among neuronal populations. By deriving canonical forms for these computations, one can associate specific neuronal populations with specific computational roles. This analysis discloses a remarkable correspondence between the microcircuitry of the cortical column and the connectivity implied by predictive coding. Furthermore, it provides some intuitive insights into the functional asymmetries between feedforward and feedback connections and the characteristic frequencies over which they operate. PMID:23177956

  2. Number of minimum-weight code words in a product code

    NASA Technical Reports Server (NTRS)

    Miller, R. L.

    1978-01-01

    Consideration is given to the number of minimum-weight code words in a product code. The code is considered as a tensor product of linear codes over a finite field. Complete theorems and proofs are presented.
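In the binary case, the result in question says that a product code's minimum distance is d1*d2 and that its minimum-weight words are exactly the tensor products of component minimum-weight words, so their count is the product of the component counts. A brute-force check on two toy codes (the code choices here are illustrative; the paper treats general finite fields):

```python
import itertools
import numpy as np

def codewords(G):
    """Enumerate all codewords of the binary linear code generated by G (over GF(2))."""
    k = G.shape[0]
    return [tuple(np.mod(np.array(m) @ G, 2)) for m in itertools.product([0, 1], repeat=k)]

def min_weight_and_count(words):
    """Minimum nonzero weight d and the number of weight-d codewords."""
    weights = [sum(w) for w in words if any(w)]
    d = min(weights)
    return d, weights.count(d)

# [3,2] single-parity-check code: d1 = 2, with A_2 = 3 words of weight 2
G1 = np.array([[1, 0, 1], [0, 1, 1]])
# [3,1] repetition code: d2 = 3, with A_3 = 1 word of weight 3
G2 = np.array([[1, 1, 1]])

# The product (tensor) code is generated by the Kronecker product of the generators
Gp = np.kron(G1, G2)
d, count = min_weight_and_count(codewords(Gp))
print(d, count)  # d = d1*d2 = 6, count = A_2 * A_3 = 3
```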

  3. The effect of abstract versus concrete framing on judgments of biological and psychological bases of behavior.

    PubMed

    Kim, Nancy S; Johnson, Samuel G B; Ahn, Woo-Kyoung; Knobe, Joshua

    2017-01-01

    Human behavior is frequently described both in abstract, general terms and in concrete, specific terms. We asked whether these two ways of framing equivalent behaviors shift the inferences people make about the biological and psychological bases of those behaviors. In five experiments, we manipulated whether behaviors are presented concretely (i.e. with reference to a specific person, instantiated in the particular context of that person's life) or abstractly (i.e. with reference to a category of people or behaviors across generalized contexts). People judged concretely framed behaviors to be less biologically based and, on some dimensions, more psychologically based than the same behaviors framed in the abstract. These findings held true for both mental disorders (Experiments 1 and 2) and everyday behaviors (Experiments 4 and 5), and yielded downstream consequences for the perceived efficacy of disorder treatments (Experiment 3). Implications for science educators, students of science, and members of the lay public are discussed.

  4. Concentration of acrylamide in a polyacrylamide gel affects VP4 gene coding assignment of group A equine rotavirus strains with P[12] specificity

    PubMed Central

    2010-01-01

    Background It is universally acknowledged that genome segment 4 of group A rotavirus, the major etiologic agent of severe diarrhea in infants and neonatal farm animals, encodes outer capsid neutralization and protective antigen VP4. Results To determine which genome segment of three group A equine rotavirus strains (H-2, FI-14 and FI-23) with P[12] specificity encodes the VP4, we analyzed dsRNAs of strains H-2, FI-14 and FI-23 as well as their reassortants by polyacrylamide gel electrophoresis (PAGE) at varying concentrations of acrylamide. The relative position of the VP4 gene of the three equine P[12] strains varied (either genome segment 3 or 4) depending upon the concentration of acrylamide. The VP4 gene bearing P[3], P[4], P[6], P[7], P[8] or P[18] specificity did not exhibit this phenomenon when the PAGE running conditions were varied. Conclusions The concentration of acrylamide in a PAGE gel affected VP4 gene coding assignment of equine rotavirus strains bearing P[12] specificity. PMID:20573245

  5. Impact of Concreteness on Comprehensibility, Interest, and Memory for Text: Implications for Dual Coding Theory and Text Design.

    ERIC Educational Resources Information Center

    Sadoski, Mark; And Others

    1993-01-01

    The comprehensibility, interestingness, familiarity, and memorability of concrete and abstract instructional texts were studied in 4 experiments involving 221 college students. Results indicate that concreteness (ease of imagery) is the variable overwhelmingly most related to comprehensibility and recall. Dual coding theory and schema theory are…

  6. Newborn infants perceive abstract numbers

    PubMed Central

    Izard, Véronique; Sann, Coralie; Spelke, Elizabeth S.; Streri, Arlette

    2009-01-01

    Although infants and animals respond to the approximate number of elements in visual, auditory, and tactile arrays, only human children and adults have been shown to possess abstract numerical representations that apply to entities of all kinds (e.g., 7 samurai, seas, or sins). Do abstract numerical concepts depend on language or culture, or do they form a part of humans' innate, core knowledge? Here we show that newborn infants spontaneously associate stationary, visual-spatial arrays of 4–18 objects with auditory sequences of events on the basis of number. Their performance provides evidence for abstract numerical representations at the start of postnatal experience. PMID:19520833

  7. Automotive Gas Turbine Power System-Performance Analysis Code

    NASA Technical Reports Server (NTRS)

    Juhasz, Albert J.

    1997-01-01

An open-cycle gas turbine numerical modelling code suitable for thermodynamic performance analysis (i.e., thermal efficiency, specific fuel consumption, cycle state points, working fluid flowrates, etc.) of automotive and aircraft powerplant applications has been generated at the NASA Lewis Research Center's Power Technology Division. The code can be made available to automotive gas turbine preliminary design efforts, either in its present version or, assuming that resources can be obtained to incorporate empirical models for component weight and packaging volume, in a later version that includes the weight-volume estimator feature. The paper contains a brief discussion of the capabilities of the presently operational version of the code, including a listing of input and output parameters and actual sample output listings.

  8. Innovation Abstracts; Volume XIV, 1992.

    ERIC Educational Resources Information Center

    Roueche, Suanne D., Ed.

    1992-01-01

    This series of 30 one- to two-page abstracts covering 1992 highlights a variety of innovative approaches to teaching and learning in the community college. Topics covered in the abstracts include: (1) faculty recognition and orientation; (2) the Amado M. Pena, Jr., Scholarship Program; (3) innovative teaching techniques, with individual abstracts…

  9. Helium: lifting high-performance stencil kernels from stripped x86 binaries to halide DSL code

    DOE PAGES

    Mendis, Charith; Bosboom, Jeffrey; Wu, Kevin; ...

    2015-06-03

Highly optimized programs are prone to bit rot, where performance quickly becomes suboptimal in the face of new hardware and compiler techniques. In this paper we show how to automatically lift performance-critical stencil kernels from a stripped x86 binary and generate the corresponding code in the high-level domain-specific language Halide. Using Halide's state-of-the-art optimizations targeting current hardware, we show that new optimized versions of these kernels can replace the originals to rejuvenate the application for newer hardware. The original optimized code for kernels in stripped binaries is nearly impossible to analyze statically. Instead, we rely on dynamic traces to regenerate the kernels. We perform buffer structure reconstruction to identify input, intermediate and output buffer shapes. Here, we abstract from a forest of concrete dependency trees which contain absolute memory addresses to symbolic trees suitable for high-level code generation. This is done by canonicalizing trees, clustering them based on structure, inferring higher-dimensional buffer accesses and finally by solving a set of linear equations based on buffer accesses to lift them up to simple, high-level expressions. Helium can handle highly optimized, complex stencil kernels with input-dependent conditionals. We lift seven kernels from Adobe Photoshop giving a 75% performance improvement, four kernels from IrfanView, leading to 4.97x performance, and one stencil from the miniGMG multigrid benchmark netting a 4.25x improvement in performance. We manually rejuvenated Photoshop by replacing eleven of Photoshop's filters with our lifted implementations, giving a 1.12x speedup without affecting the user experience.
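The "solving a set of linear equations based on buffer accesses" step can be illustrated with a toy affine-access recovery; the coordinates and addresses below are made up for the sketch, while Helium's actual pipeline works from dynamic traces of real binaries:

```python
import numpy as np

# Toy analogue of the lifting step: recover an affine buffer access
# addr = base + stride_y*y + stride_x*x from observed (y, x, address) samples.
samples = [(0, 0, 4096), (0, 1, 4100), (1, 0, 4352), (2, 3, 4620)]
A = np.array([[1.0, y, x] for y, x, _ in samples])
b = np.array([float(addr) for _, _, addr in samples])

# Least squares gives the exact solution here, since the samples are consistent
coef, *_ = np.linalg.lstsq(A, b, rcond=None)
base, stride_y, stride_x = (round(c) for c in coef)
print(base, stride_y, stride_x)  # 4096 256 4
```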

  10. Neural Elements for Predictive Coding.

    PubMed

    Shipp, Stewart

    2016-01-01

    Predictive coding theories of sensory brain function interpret the hierarchical construction of the cerebral cortex as a Bayesian, generative model capable of predicting the sensory data consistent with any given percept. Predictions are fed backward in the hierarchy and reciprocated by prediction error in the forward direction, acting to modify the representation of the outside world at increasing levels of abstraction, and so to optimize the nature of perception over a series of iterations. This accounts for many 'illusory' instances of perception where what is seen (heard, etc.) is unduly influenced by what is expected, based on past experience. This simple conception, the hierarchical exchange of prediction and prediction error, confronts a rich cortical microcircuitry that is yet to be fully documented. This article presents the view that, in the current state of theory and practice, it is profitable to begin a two-way exchange: that predictive coding theory can support an understanding of cortical microcircuit function, and prompt particular aspects of future investigation, whilst existing knowledge of microcircuitry can, in return, influence theoretical development. As an example, a neural inference arising from the earliest formulations of predictive coding is that the source populations of forward and backward pathways should be completely separate, given their functional distinction; this aspect of circuitry - that neurons with extrinsically bifurcating axons do not project in both directions - has only recently been confirmed. 
Here, the computational architecture prescribed by a generalized (free-energy) formulation of predictive coding is combined with the classic 'canonical microcircuit' and the laminar architecture of hierarchical extrinsic connectivity to produce a template schematic, that is further examined in the light of (a) updates in the microcircuitry of primate visual cortex, and (b) rapid technical advances made possible by transgenic neural

  11. Neural Elements for Predictive Coding

    PubMed Central

    Shipp, Stewart

    2016-01-01

    Predictive coding theories of sensory brain function interpret the hierarchical construction of the cerebral cortex as a Bayesian, generative model capable of predicting the sensory data consistent with any given percept. Predictions are fed backward in the hierarchy and reciprocated by prediction error in the forward direction, acting to modify the representation of the outside world at increasing levels of abstraction, and so to optimize the nature of perception over a series of iterations. This accounts for many ‘illusory’ instances of perception where what is seen (heard, etc.) is unduly influenced by what is expected, based on past experience. This simple conception, the hierarchical exchange of prediction and prediction error, confronts a rich cortical microcircuitry that is yet to be fully documented. This article presents the view that, in the current state of theory and practice, it is profitable to begin a two-way exchange: that predictive coding theory can support an understanding of cortical microcircuit function, and prompt particular aspects of future investigation, whilst existing knowledge of microcircuitry can, in return, influence theoretical development. As an example, a neural inference arising from the earliest formulations of predictive coding is that the source populations of forward and backward pathways should be completely separate, given their functional distinction; this aspect of circuitry – that neurons with extrinsically bifurcating axons do not project in both directions – has only recently been confirmed. Here, the computational architecture prescribed by a generalized (free-energy) formulation of predictive coding is combined with the classic ‘canonical microcircuit’ and the laminar architecture of hierarchical extrinsic connectivity to produce a template schematic, that is further examined in the light of (a) updates in the microcircuitry of primate visual cortex, and (b) rapid technical advances made possible by

  12. Design and implementation of coded aperture coherent scatter spectral imaging of cancerous and healthy breast tissue samples

    PubMed Central

    Lakshmanan, Manu N.; Greenberg, Joel A.; Samei, Ehsan; Kapadia, Anuj J.

    2016-01-01

A scatter imaging technique for the differentiation of cancerous and healthy breast tissue in a heterogeneous sample is introduced in this work. Such a technique has potential utility in intraoperative margin assessment during lumpectomy procedures. In this work, we investigate the feasibility of the imaging method for tumor classification using Monte Carlo simulations and physical experiments. The coded aperture coherent scatter spectral imaging technique was used to reconstruct three-dimensional (3-D) images of breast tissue samples acquired through a single-position snapshot acquisition, without rotation as is required in coherent scatter computed tomography. We perform a quantitative assessment of the accuracy of the cancerous voxel classification using Monte Carlo simulations of the imaging system; describe our experimental implementation of coded aperture scatter imaging; show the reconstructed images of the breast tissue samples; and present segmentations of the 3-D images in order to identify the cancerous and healthy tissue in the samples. From the Monte Carlo simulations, we find that coded aperture scatter imaging is able to reconstruct images of the samples and identify the distribution of cancerous and healthy tissues (i.e., fibroglandular, adipose, or a mix of the two) inside them with a cancerous voxel identification sensitivity, specificity, and accuracy of 92.4%, 91.9%, and 92.0%, respectively. From the experimental results, we find that the technique is able to identify cancerous and healthy tissue samples and reconstruct differential coherent scatter cross sections that are highly correlated with those measured by other groups using x-ray diffraction. Coded aperture scatter imaging has the potential to provide scatter images that automatically differentiate cancerous and healthy tissue inside samples within a time on the order of a minute per slice. PMID:26962543
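The reported sensitivity/specificity/accuracy figures follow the standard confusion-matrix definitions; a minimal sketch, with hypothetical voxel counts chosen only to approximately reproduce the reported proportions:

```python
def classification_metrics(tp, fp, tn, fn):
    """Sensitivity, specificity and accuracy from binary confusion-matrix counts."""
    sensitivity = tp / (tp + fn)              # cancerous voxels correctly flagged
    specificity = tn / (tn + fp)              # healthy voxels correctly passed
    accuracy = (tp + tn) / (tp + fp + tn + fn)
    return sensitivity, specificity, accuracy

# Hypothetical counts (not from the paper) per 1000 cancerous / 1000 healthy voxels
print(classification_metrics(tp=924, fp=81, tn=919, fn=76))  # (0.924, 0.919, 0.9215)
```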

  13. A cascaded coding scheme for error control and its performance analysis

    NASA Technical Reports Server (NTRS)

    Lin, S.

    1986-01-01

A coding scheme for error control in data communication systems is investigated. The scheme is obtained by cascading two error-correcting codes, called the inner and the outer codes. The error performance of the scheme is analyzed for a binary symmetric channel with bit error rate epsilon < 1/2. It is shown that, if the inner and outer codes are chosen properly, extremely high reliability can be attained even for a high channel bit error rate. Various specific example schemes with inner codes ranging from high rates to very low rates and Reed-Solomon outer codes are considered, and their error probabilities are evaluated. They all provide extremely high reliability even for very high bit error rates, say 0.1 to 0.01. Several example schemes are being considered by NASA for satellite and spacecraft downlink error control.
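The cascading idea can be illustrated with the simplest possible inner code. The paper's schemes pair various inner codes with Reed-Solomon outer codes, but a repetition-code sketch already shows how each decoding stage multiplies down the residual bit error rate:

```python
def repetition3_error(eps):
    """Probability that majority decoding of a 3-bit repetition code fails on a
    binary symmetric channel with crossover probability eps (eps < 1/2):
    two or three of the three transmitted copies must be flipped."""
    return 3 * eps**2 * (1 - eps) + eps**3

raw = 0.1                          # very noisy channel
inner = repetition3_error(raw)     # after the inner decoder: 0.028
outer = repetition3_error(inner)   # a second (outer) stage: ~0.0023
print(raw, inner, outer)
```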

  14. JTSA: an open source framework for time series abstractions.

    PubMed

    Sacchi, Lucia; Capozzi, Davide; Bellazzi, Riccardo; Larizza, Cristiana

    2015-10-01

    The evaluation of the clinical status of a patient is frequently based on the temporal evolution of some parameters, making the detection of temporal patterns a priority in data analysis. Temporal abstraction (TA) is a methodology widely used in medical reasoning for summarizing and abstracting longitudinal data. This paper describes JTSA (Java Time Series Abstractor), a framework including a library of algorithms for time series preprocessing and abstraction and an engine to execute a workflow for temporal data processing. The JTSA framework is grounded on a comprehensive ontology that models temporal data processing both from the data storage and the abstraction computation perspective. The JTSA framework is designed to allow users to build their own analysis workflows by combining different algorithms. Thanks to the modular structure of a workflow, simple to highly complex patterns can be detected. The JTSA framework has been developed in Java 1.7 and is distributed under GPL as a jar file. JTSA provides: a collection of algorithms to perform temporal abstraction and preprocessing of time series, a framework for defining and executing data analysis workflows based on these algorithms, and a GUI for workflow prototyping and testing. The whole JTSA project relies on a formal model of the data types and of the algorithms included in the library. This model is the basis for the design and implementation of the software application. Taking into account this formalized structure, the user can easily extend the JTSA framework by adding new algorithms. Results are shown in the context of the EU project MOSAIC to extract relevant patterns from data coming related to the long term monitoring of diabetic patients. 
The proof that JTSA is a versatile tool to be adapted to different needs is given by its possible uses, both as a standalone tool for data summarization and as a module to be embedded into other architectures to select specific phenotypes based on TAs in a large
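The kind of state temporal abstraction that JTSA's algorithm library performs can be sketched in a few lines; the thresholds, labels, and interval representation below are illustrative assumptions, not the actual JTSA API:

```python
def state_abstraction(series, low, high):
    """Map a numeric time series to qualitative states and merge consecutive
    equal states into (state, start, end) intervals -- a basic state TA."""
    def label(v):
        return "LOW" if v < low else ("HIGH" if v > high else "NORMAL")

    intervals = []
    for t, v in enumerate(series):
        s = label(v)
        if intervals and intervals[-1][0] == s:
            # Extend the current interval instead of opening a new one
            intervals[-1] = (s, intervals[-1][1], t)
        else:
            intervals.append((s, t, t))
    return intervals

# Illustrative glucose readings (mg/dL) with LOW < 80 and HIGH > 180
glucose = [65, 70, 110, 150, 185, 190, 120]
print(state_abstraction(glucose, low=80, high=180))
# [('LOW', 0, 1), ('NORMAL', 2, 3), ('HIGH', 4, 5), ('NORMAL', 6, 6)]
```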

  15. The non-coding RNA landscape of human hematopoiesis and leukemia.

    PubMed

    Schwarzer, Adrian; Emmrich, Stephan; Schmidt, Franziska; Beck, Dominik; Ng, Michelle; Reimer, Christina; Adams, Felix Ferdinand; Grasedieck, Sarah; Witte, Damian; Käbler, Sebastian; Wong, Jason W H; Shah, Anushi; Huang, Yizhou; Jammal, Razan; Maroz, Aliaksandra; Jongen-Lavrencic, Mojca; Schambach, Axel; Kuchenbauer, Florian; Pimanda, John E; Reinhardt, Dirk; Heckl, Dirk; Klusmann, Jan-Henning

    2017-08-09

    Non-coding RNAs have emerged as crucial regulators of gene expression and cell fate decisions. However, their expression patterns and regulatory functions during normal and malignant human hematopoiesis are incompletely understood. Here we present a comprehensive resource defining the non-coding RNA landscape of the human hematopoietic system. Based on highly specific non-coding RNA expression portraits per blood cell population, we identify unique fingerprint non-coding RNAs, such as LINC00173 in granulocytes, and assign these to critical regulatory circuits involved in blood homeostasis. Following the incorporation of acute myeloid leukemia samples into the landscape, we further uncover prognostically relevant non-coding RNA stem cell signatures shared between acute myeloid leukemia blasts and healthy hematopoietic stem cells. Our findings highlight the importance of the non-coding transcriptome in the formation and maintenance of the human blood hierarchy. While micro-RNAs are known regulators of hematopoiesis and leukemogenesis, the role of long non-coding RNAs is less clear. Here the authors provide a non-coding RNA expression landscape of the human hematopoietic system, highlighting their role in the formation and maintenance of the human blood hierarchy.

  16. Qualitative assessment of cause-of-injury coding in U.S. military hospitals: NATO standardization agreement (STANAG) 2050.

    PubMed

    Amoroso, P J; Smith, G S; Bell, N S

    2000-04-01

    Accurate injury cause data are essential for injury prevention research. U.S. military hospitals, unlike civilian hospitals, use the NATO STANAG system for cause-of-injury coding. Reported deficiencies in civilian injury cause data suggested a need to specifically evaluate the STANAG. The Total Army Injury and Health Outcomes Database (TAIHOD) was used to evaluate worldwide Army injury hospitalizations, especially STANAG Trauma, Injury, and Place of Occurrence coding. We conducted a review of hospital procedures at Tripler Army Medical Center (TAMC) including injury cause and intent coding, potential crossover between acute injuries and musculoskeletal conditions, and data for certain hospital patients who are not true admissions. We also evaluated the use of free-text injury comment fields in three hospitals. Army-wide review of injury records coding revealed full compliance with cause coding, although nonspecific codes appeared to be overused. A small but intensive single hospital records review revealed relatively poor intent coding but good activity and cause coding. Data on specific injury history were present on most acute injury records and 75% of musculoskeletal conditions. Place of Occurrence coding, although inherently nonspecific, was over 80% accurate. Review of text fields produced additional details of the injuries in over 80% of cases. STANAG intent coding specificity was poor, while coding of cause of injury was at least comparable to civilian systems. The strengths of military hospital data systems are an exceptionally high compliance with injury cause coding, the availability of free text, and capture of all population hospital records without regard to work-relatedness. Simple changes in procedures could greatly improve data quality.
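Coding-quality evaluations like the one above are conventionally summarized with validity measures computed from a 2x2 comparison against a gold standard such as chart review. A minimal sketch follows; the function name and all counts are invented for illustration and do not come from the study.

```python
def validity_metrics(tp, fp, fn, tn):
    """Standard validity measures for comparing coded data against a
    gold standard (e.g. manual record review)."""
    return {
        "sensitivity": tp / (tp + fn),  # true cases correctly coded
        "specificity": tn / (tn + fp),  # non-cases correctly left uncoded
        "ppv": tp / (tp + fp),          # positive predictive value
        "npv": tn / (tn + fn),          # negative predictive value
    }

# Hypothetical counts: 80 of 100 true cases coded correctly,
# 10 false positives among 400 non-cases
m = validity_metrics(tp=80, fp=10, fn=20, tn=390)
print({k: round(v, 2) for k, v in m.items()})
# → {'sensitivity': 0.8, 'specificity': 0.97, 'ppv': 0.89, 'npv': 0.95}
```

The same four measures appear throughout this record set (e.g. the bone-metastases ICD-9 validation above), which is why they are a natural common yardstick for cause-of-injury coding as well.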

  17. Going beyond the Evidence: Abstract Laws and Preschoolers' Responses to Anomalous Data

    ERIC Educational Resources Information Center

    Schulz, Laura E.; Goodman, Noah D.; Tenenbaum, Joshua B.; Jenkins, Adrianna C.

    2008-01-01

    Given minimal evidence about novel objects, children might learn only relationships among the specific entities, or they might make a more abstract inference, positing classes of entities and the relations that hold among those classes. Here we show that preschoolers (mean: 57 months) can use sparse data about perceptually unique objects to infer…

  18. A Coding System for Qualitative Studies of the Information-Seeking Process in Computer Science Research

    ERIC Educational Resources Information Center

    Moral, Cristian; de Antonio, Angelica; Ferre, Xavier; Lara, Graciela

    2015-01-01

    Introduction: In this article we propose a qualitative analysis tool--a coding system--that can support the formalisation of the information-seeking process in a specific field: research in computer science. Method: In order to elaborate the coding system, we have conducted a set of qualitative studies, more specifically a focus group and some…

  19. Uplink Coding

    NASA Technical Reports Server (NTRS)

    Pollara, Fabrizio; Hamkins, Jon; Dolinar, Sam; Andrews, Ken; Divsalar, Dariush

    2006-01-01

    This viewgraph presentation reviews uplink coding. The purpose and goals of the briefing are to: (1) show a plan for using uplink coding and describe its benefits; (2) define possible solutions and their applicability to different types of uplink, including emergency uplink; (3) concur on the conclusions so that a plan to use the proposed uplink system can be carried out; (4) identify the need for the development of appropriate technology and its infusion into the DSN; and (5) gain advocacy to implement uplink coding in flight projects. Action Item EMB04-1-14 -- Show a plan for using uplink coding, including showing where it is useful or not (include discussion of emergency uplink coding).

  20. Innovation Abstracts, Volume XV, 1993.

    ERIC Educational Resources Information Center

    Roueche, Suanne D., Ed.

    1993-01-01

    This volume of 30 one- to two-page abstracts from 1993 highlights a variety of innovative approaches to teaching and learning in the community college. Topics covered in the abstracts include: (1) role-playing to encourage critical thinking; (2) team learning techniques to cultivate business skills; (3) librarian-instructor partnerships to create…