Sample records for regular expression compiler

  1. Specialized Silicon Compilers for Language Recognition.

    DTIC Science & Technology

    1984-07-01

    realizations of non-deterministic automata have been reported that solve these problems in different ways. Floyd and Ullman [28] have presented a...in Applied Mathematics, pages 19-31. American Mathematical Society, 1967. [28] Floyd, R. W. and J. D. Ullman. The Compilation of Regular Expressions...Shannon (editor). Automata Studies, chapter 1, pages 3-41. Princeton University Press, Princeton, N.J., 1956. [44] Kohavi, Zvi. Switching and Finite
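
    The end product of such regular expression compilation is typically a deterministic finite automaton. As a minimal, software-only illustration (not the silicon realizations discussed in this report), the sketch below hand-encodes the textbook DFA for the regular expression (a|b)*abb as a transition table and runs it over input strings.

    ```python
    # Hand-compiled DFA for the regular expression (a|b)*abb (classic textbook example).
    DFA = {
        # state: {symbol: next_state}; state 3 is accepting
        0: {"a": 1, "b": 0},
        1: {"a": 1, "b": 2},
        2: {"a": 1, "b": 3},
        3: {"a": 1, "b": 0},
    }
    ACCEPTING = {3}

    def matches(text: str) -> bool:
        state = 0
        for ch in text:
            if ch not in DFA[state]:
                return False          # symbol outside the alphabet {a, b}
            state = DFA[state][ch]
        return state in ACCEPTING

    assert matches("abb") and matches("babaabb")
    assert not matches("abba")
    ```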

  2. 42 CFR 61.3 - Purpose of regular fellowships.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 42 Public Health 1 2010-10-01 2010-10-01 false Purpose of regular fellowships. 61.3 Section 61.3..., TRAINING FELLOWSHIPS Regular Fellowships § 61.3 Purpose of regular fellowships. Regular fellowships are... sciences and communication of information. (b) Special scientific projects for the compilation of existing...

  3. Classic of the Month. [Columns Compiled from Three Issues of "Notes Plus," September 1983 to January 1984.]

    ERIC Educational Resources Information Center

    Notes Plus, 1984

    1984-01-01

    Three installments of "Classic of the Month," a regular feature of the National Council of Teachers of English publication, "Notes Plus," are presented in this compilation. Each installment of this feature is intended to provide teaching ideas related to a "classic" novel. The first article offers a variety of…

  4. Descriptions and Abstracts of Regular Education Inservice Projects (REGI).

    ERIC Educational Resources Information Center

    Erwin, Barbara, Comp.; And Others

    This description of the Regular Education Inservice (REGI) effort in fiscal year 1981 includes a summary analysis of data from the REGI projects and a state by state compilation of project abstracts. Following the summary analysis of the REGI effort, project abstracts are organized by state or territory. Within each state or territory section,…

  5. 36 CFR 705.1 - Scope and purpose of this part.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ..., COMPILATION, AND DISTRIBUTION OF NEWS TRANSMISSIONS UNDER THE PROVISIONS OF THE AMERICAN TELEVISION AND RADIO... consisting of regularly scheduled newscasts or on-the-spot coverage of news events. ...

  6. 36 CFR 705.1 - Scope and purpose of this part.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ..., COMPILATION, AND DISTRIBUTION OF NEWS TRANSMISSIONS UNDER THE PROVISIONS OF THE AMERICAN TELEVISION AND RADIO... consisting of regularly scheduled newscasts or on-the-spot coverage of news events. ...

  7. 36 CFR 705.1 - Scope and purpose of this part.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ..., COMPILATION, AND DISTRIBUTION OF NEWS TRANSMISSIONS UNDER THE PROVISIONS OF THE AMERICAN TELEVISION AND RADIO... consisting of regularly scheduled newscasts or on-the-spot coverage of news events. ...

  8. 36 CFR 705.1 - Scope and purpose of this part.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ..., COMPILATION, AND DISTRIBUTION OF NEWS TRANSMISSIONS UNDER THE PROVISIONS OF THE AMERICAN TELEVISION AND RADIO... consisting of regularly scheduled newscasts or on-the-spot coverage of news events. ...

  9. 42 CFR 61.3 - Purpose of regular fellowships.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... sciences and communication of information. (b) Special scientific projects for the compilation of existing, or writing of original, contributions relating to scientific, social, or cultural advancements in sciences related to health. ...

  10. 42 CFR 61.3 - Purpose of regular fellowships.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... sciences and communication of information. (b) Special scientific projects for the compilation of existing, or writing of original, contributions relating to scientific, social, or cultural advancements in sciences related to health. ...

  11. 42 CFR 61.3 - Purpose of regular fellowships.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... sciences and communication of information. (b) Special scientific projects for the compilation of existing, or writing of original, contributions relating to scientific, social, or cultural advancements in sciences related to health. ...

  12. 42 CFR 61.3 - Purpose of regular fellowships.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... sciences and communication of information. (b) Special scientific projects for the compilation of existing, or writing of original, contributions relating to scientific, social, or cultural advancements in sciences related to health. ...

  13. 36 CFR § 705.1 - Scope and purpose of this part.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ..., COMPILATION, AND DISTRIBUTION OF NEWS TRANSMISSIONS UNDER THE PROVISIONS OF THE AMERICAN TELEVISION AND RADIO... consisting of regularly scheduled newscasts or on-the-spot coverage of news events. ...

  14. 36 CFR § 705.8 - Agreements modifying the terms of this part.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... REPRODUCTION, COMPILATION, AND DISTRIBUTION OF NEWS TRANSMISSIONS UNDER THE PROVISIONS OF THE AMERICAN... phonorecords of transmission programs of regularly scheduled newscasts or on-the-spot coverage of news events...

  15. 36 CFR 705.8 - Agreements modifying the terms of this part.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... REPRODUCTION, COMPILATION, AND DISTRIBUTION OF NEWS TRANSMISSIONS UNDER THE PROVISIONS OF THE AMERICAN... phonorecords of transmission programs of regularly scheduled newscasts or on-the-spot coverage of news events...

  16. 36 CFR 705.8 - Agreements modifying the terms of this part.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... REPRODUCTION, COMPILATION, AND DISTRIBUTION OF NEWS TRANSMISSIONS UNDER THE PROVISIONS OF THE AMERICAN... phonorecords of transmission programs of regularly scheduled newscasts or on-the-spot coverage of news events...

  17. 36 CFR 705.8 - Agreements modifying the terms of this part.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... REPRODUCTION, COMPILATION, AND DISTRIBUTION OF NEWS TRANSMISSIONS UNDER THE PROVISIONS OF THE AMERICAN... phonorecords of transmission programs of regularly scheduled newscasts or on-the-spot coverage of news events...

  18. 36 CFR 705.8 - Agreements modifying the terms of this part.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... REPRODUCTION, COMPILATION, AND DISTRIBUTION OF NEWS TRANSMISSIONS UNDER THE PROVISIONS OF THE AMERICAN... phonorecords of transmission programs of regularly scheduled newscasts or on-the-spot coverage of news events...

  19. The Basic Regularities of Education and Their Application in Higher Education Research and Practice: Brief Description of the Basic Regularities ("Guilu") of Education

    ERIC Educational Resources Information Center

    Maoyuan, Pan

    2007-01-01

    Research on the issues of higher education has been going on for a long time. However, higher education pedagogy as an independent discipline has been present in China for only about ten years. The structure of a discipline cannot consist merely of a compilation of the issues under research but must also include its basic theories and a system of…

  20. An integrated runtime and compile-time approach for parallelizing structured and block structured applications

    NASA Technical Reports Server (NTRS)

    Agrawal, Gagan; Sussman, Alan; Saltz, Joel

    1993-01-01

    Scientific and engineering applications often involve structured meshes. These meshes may be nested (for multigrid codes) and/or irregularly coupled (called multiblock or irregularly coupled regular mesh problems). A combined runtime and compile-time approach for parallelizing these applications on distributed memory parallel machines in an efficient and machine-independent fashion was described. A runtime library which can be used to port these applications to distributed memory machines was designed and implemented. The library is currently implemented on several different systems. To further ease the task of application programmers, methods were developed for integrating this runtime library with compilers for HPF-like parallel programming languages. How this runtime library was integrated with the Fortran 90D compiler being developed at Syracuse University is discussed. Experimental results to demonstrate the efficacy of our approach are presented. A multiblock Navier-Stokes solver template and a multigrid code were experimented with. Our experimental results show that our primitives have low runtime communication overheads. Further, the compiler-parallelized codes perform within 20 percent of the code parallelized by manually inserting calls to the runtime library.

  1. Library Laws of Texas.

    ERIC Educational Resources Information Center

    Seidenberg, Ed, Ed.

    Compiled to provide a central reference point for all legislative information pertaining to libraries in the state of Texas, this publication includes all pertinent legislation as amended through the 66th Legislature, Regular Session, 1979. It contains articles dealing specifically with archives, buildings and property, city libraries, non-profit…

  2. Compile-time estimation of communication costs in multicomputers

    NASA Technical Reports Server (NTRS)

    Gupta, Manish; Banerjee, Prithviraj

    1991-01-01

    An important problem facing numerous research projects on parallelizing compilers for distributed memory machines is that of automatically determining a suitable data partitioning scheme for a program. Any strategy for automatic data partitioning needs a mechanism for estimating the performance of a program under a given partitioning scheme, the most crucial part of which involves determining the communication costs incurred by the program. A methodology is described for estimating the communication costs at compile-time as functions of the numbers of processors over which various arrays are distributed. A strategy is described along with its theoretical basis, for making program transformations that expose opportunities for combining of messages, leading to considerable savings in the communication costs. For certain loops with regular dependences, the compiler can detect the possibility of pipelining, and thus estimate communication costs more accurately than it could otherwise. These results are of great significance to any parallelization system supporting numeric applications on multicomputers. In particular, they lay down a framework for effective synthesis of communication on multicomputers from sequential program references.

  3. System, apparatus and methods to implement high-speed network analyzers

    DOEpatents

    Ezick, James; Lethin, Richard; Ros-Giralt, Jordi; Szilagyi, Peter; Wohlford, David E

    2015-11-10

    Systems, apparatus and methods for the implementation of high-speed network analyzers are provided. A set of high-level specifications is used to define the behavior of the network analyzer emitted by a compiler. An optimized inline workflow to process regular expressions is presented without sacrificing the semantic capabilities of the processing engine. An optimized packet dispatcher implements a subset of the functions implemented by the network analyzer, providing a fast and slow path workflow used to accelerate specific processing units. Such a dispatcher facility can also be used as a cache of policies, wherein if a policy is found, the packet manipulations associated with it can be performed quickly. An optimized method of generating DFA specifications for network signatures is also presented. The method accepts several optimization criteria, such as min-max allocations or optimal allocations based on the probability of occurrence of each signature input bit.

  4. 1 CFR 10.3 - Format.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 1 General Provisions 1 2010-01-01 2010-01-01 false Format. 10.3 Section 10.3 General Provisions ADMINISTRATIVE COMMITTEE OF THE FEDERAL REGISTER SPECIAL EDITIONS OF THE FEDERAL REGISTER PRESIDENTIAL PAPERS Regular Publication § 10.3 Format. The Daily Compilation of Presidential Documents is published online on...

  5. 1 CFR 10.3 - Format.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 1 General Provisions 1 2011-01-01 2011-01-01 false Format. 10.3 Section 10.3 General Provisions ADMINISTRATIVE COMMITTEE OF THE FEDERAL REGISTER SPECIAL EDITIONS OF THE FEDERAL REGISTER PRESIDENTIAL PAPERS Regular Publication § 10.3 Format. The Daily Compilation of Presidential Documents is published online on...

  6. 1 CFR 10.3 - Format.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 1 General Provisions 1 2014-01-01 2012-01-01 true Format. 10.3 Section 10.3 General Provisions ADMINISTRATIVE COMMITTEE OF THE FEDERAL REGISTER SPECIAL EDITIONS OF THE FEDERAL REGISTER PRESIDENTIAL PAPERS Regular Publication § 10.3 Format. The Daily Compilation of Presidential Documents is published online on...

  7. 1 CFR 10.3 - Format.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 1 General Provisions 1 2013-01-01 2012-01-01 true Format. 10.3 Section 10.3 General Provisions ADMINISTRATIVE COMMITTEE OF THE FEDERAL REGISTER SPECIAL EDITIONS OF THE FEDERAL REGISTER PRESIDENTIAL PAPERS Regular Publication § 10.3 Format. The Daily Compilation of Presidential Documents is published online on...

  8. 1 CFR 10.3 - Format.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 1 General Provisions 1 2012-01-01 2012-01-01 false Format. 10.3 Section 10.3 General Provisions ADMINISTRATIVE COMMITTEE OF THE FEDERAL REGISTER SPECIAL EDITIONS OF THE FEDERAL REGISTER PRESIDENTIAL PAPERS Regular Publication § 10.3 Format. The Daily Compilation of Presidential Documents is published online on...

  9. Guide to Special Information in Scientific and Engineering Journals.

    ERIC Educational Resources Information Center

    Harris, Mary Elizabeth

    This annotated bibliography lists 203 special features or special issues of science and technology periodicals with emphasis on compilations of information that appear in periodicals on a regular basis. Subjects covered in the guide include aeronautics, air-conditioning and refrigeration engineering, astronomy, automobiles, biology, botany,…

  10. Library Laws of Texas.

    ERIC Educational Resources Information Center

    Getz, Richard E., Comp.

    Compiled to provide a central reference point for all legislative information pertaining to libraries in the State of Texas, this publication includes all pertinent legislation as amended through the 71st Legislature, 1989, Regular Session. This update of the 1980 edition has been expanded to include statutes pertaining to the school and academic…

  11. Clues about Reading Enrichment (CARE).

    ERIC Educational Resources Information Center

    Daly, Nancy Jo; And Others

    Compiled by members of the Reading Committee of the North Middlesex Regional School District (Massachusetts), this illustrated guide provides tips, suggestions, and activities that parents can follow at home to help their children read. The Clues about Reading Enrichment (CARE) guide notes that regularly reading aloud to and with children is an…

  12. HOPE: Just-in-time Python compiler for astrophysical computations

    NASA Astrophysics Data System (ADS)

    Akeret, Joel; Gamper, Lukas; Amara, Adam; Refregier, Alexandre

    2014-11-01

    HOPE is a specialized Python just-in-time (JIT) compiler designed for numerical astrophysical applications. HOPE focuses on a subset of the language and is able to translate Python code into C++ while performing numerical optimization on mathematical expressions at runtime. To enable the JIT compilation, the user only needs to add a decorator to the function definition. By using HOPE, the user benefits from being able to write common numerical code in Python while getting the performance of a compiled implementation.
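
    The decorator-based workflow described above can be illustrated with a minimal sketch. It assumes the package is importable as `hope` and exposes a `jit` decorator as described in its documentation; the kernel below is a hypothetical example, not code from HOPE itself.

    ```python
    import numpy as np
    import hope  # assumed import name for the HOPE JIT compiler

    @hope.jit
    def add_scaled(x, y, a):
        # plain numerical Python; the JIT translates it to C++ on the first call
        return x + a * y

    x = np.arange(1e6)
    y = np.ones(int(1e6))
    z = add_scaled(x, y, 0.5)  # first call triggers compilation, later calls reuse the compiled version
    ```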

  13. Extension of Alvis compiler front-end

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wypych, Michał; Szpyrka, Marcin; Matyasik, Piotr, E-mail: mwypych@agh.edu.pl, E-mail: mszpyrka@agh.edu.pl, E-mail: ptm@agh.edu.pl

    2015-12-31

    Alvis is a formal modelling language that enables verification of distributed concurrent systems. The semantics of an Alvis model is expressed as an LTS graph (labelled transition system). Execution of any language statement is expressed as a transition between formally defined states of such a model. An LTS graph is generated using a middle-stage Haskell representation of an Alvis model. Moreover, Haskell is used as a part of the Alvis language to define parameters’ types and operations on them. Thanks to the compiler’s modular construction, many aspects of compilation of an Alvis model may be modified. Providing new plugins for the Alvis compiler that support languages like Java or C makes it possible to use these languages as a part of Alvis instead of Haskell. The paper presents the compiler internal model and describes how the default specification language can be altered by new plugins.

  14. Russian Universities: Towards Ambitious Goals

    ERIC Educational Resources Information Center

    Rodionov, Dmitriy Grigorievich; Fersman, Natalia Gennadievna; Kushneva, Olga Alexandrovna

    2016-01-01

    An increased competition in the world market of educational services has brought about new tools to raise the prestige of higher education institutions in the opinion of students and employers. The most important of these tools are the rankings of the best universities in the world, regularly compiled by well-known foreign agencies. The Russian…

  15. Research in Structures, Structural Dynamics and Materials, 1990

    NASA Technical Reports Server (NTRS)

    Barthelemy, Jean-Francois M. (Compiler); Noor, Ahmed K. (Compiler)

    1990-01-01

    The Structural Dynamics and Materials (SDM) Conference was held on April 2 to 4, 1990 in Long Beach, California. This publication is a compilation of presentations of the work-in-progress sessions and does not contain papers from the regular sessions since those papers are published by AIAA in the conference proceedings.

  16. Interventions. Organizing Systems To Support Competent Social Behavior in Children and Youth.

    ERIC Educational Resources Information Center

    Carter, Susanne

    This guide describes classroom and school interventions intended to meet the needs of students with emotional/behavioral disabilities and those at risk for developing these disabilities. The first section presents "Classroom Interventions," a compilation of 77 interventions which may be used in regular or self-contained classrooms. A brief…

  17. ONRASIA Scientific Information Bulletin, Volume 16, Number 1

    DTIC Science & Technology

    1991-03-01

    be expressed naturally in an algebraic language such as Fortran, and hence the programs produced by these efforts...years developing vectorizing compilers for Hitachi...an iterative scheme to solve the problem. This is quite natural to do in...the function satisfies...on the plate, with θ=1 at the outside...differential equations to be expressed in a natural mathematical syntax to compile into efficient vectorizable

  18. Systematic genomic identification of colorectal cancer genes delineating advanced from early clinical stage and metastasis

    PubMed Central

    2013-01-01

    Background Colorectal cancer is the third leading cause of cancer deaths in the United States. The initial assessment of colorectal cancer involves clinical staging that takes into account the extent of primary tumor invasion, determining the number of lymph nodes with metastatic cancer and the identification of metastatic sites in other organs. Advanced clinical stage indicates metastatic cancer, either in regional lymph nodes or in distant organs. While the genomic and genetic basis of colorectal cancer has been elucidated to some degree, less is known about the identity of specific cancer genes that are associated with advanced clinical stage and metastasis. Methods We compiled multiple genomic data types (mutations, copy number alterations, gene expression and methylation status) as well as clinical meta-data from The Cancer Genome Atlas (TCGA). We used an elastic-net regularized regression method on the combined genomic data to identify genetic aberrations and their associated cancer genes that are indicators of clinical stage. We ranked candidate genes by their regression coefficient and level of support from multiple assay modalities. Results A fit of the elastic-net regularized regression to 197 samples and integrated analysis of four genomic platforms identified the set of top gene predictors of advanced clinical stage, including: WRN, SYK, DDX5 and ADRA2C. These genetic features were identified robustly in bootstrap resampling analysis. Conclusions We conducted an analysis integrating multiple genomic features including mutations, copy number alterations, gene expression and methylation. This integrated approach in which one considers all of these genomic features performs better than any individual genomic assay. We identified multiple genes that robustly delineate advanced clinical stage, suggesting their possible role in colorectal cancer metastatic progression. PMID:24308539
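
    A hedged sketch of the modeling step described above is given below: a generic elastic-net regularized logistic regression that separates advanced from early clinical stage and ranks features by coefficient magnitude, using scikit-learn on synthetic stand-in data rather than the authors' TCGA pipeline.

    ```python
    # Minimal sketch of elastic-net regularized stage classification with gene ranking.
    import numpy as np
    from sklearn.linear_model import LogisticRegression
    from sklearn.preprocessing import StandardScaler

    rng = np.random.default_rng(0)
    X = rng.normal(size=(197, 500))       # 197 samples x 500 genomic features (toy stand-in)
    y = rng.integers(0, 2, size=197)      # 1 = advanced stage, 0 = early stage

    X = StandardScaler().fit_transform(X)
    model = LogisticRegression(penalty="elasticnet", solver="saga",
                               l1_ratio=0.5, C=1.0, max_iter=5000)
    model.fit(X, y)

    # rank candidate genes by absolute regression coefficient
    ranking = np.argsort(-np.abs(model.coef_[0]))
    print("top 10 feature indices:", ranking[:10])
    ```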

  19. JPL Mission Bibliometrics

    NASA Technical Reports Server (NTRS)

    Coppin, Ann

    2013-01-01

    For a number of years ongoing bibliographies of various JPL missions (AIRS, ASTER, Cassini, GRACE, Earth Science, Mars Exploration Rovers (Spirit & Opportunity)) have been compiled by the JPL Library. Mission specific bibliographies are compiled by the Library and sent to mission scientists and managers in the form of regular (usually quarterly) updates. Charts showing publications by years are periodically provided to the ASTER, Cassini, and GRACE missions for supporting Senior Review/ongoing funding requests, and upon other occasions as a measure of the impact of the missions. Basically the Web of Science, Compendex, sometimes Inspec, GeoRef and Aerospace databases are searched for the mission name in the title, abstract, and assigned keywords. All get coded for journal publications that are refereed publications.

  20. Space Mathematics, A Resource for Teachers Outlining Supplementary Space-Related Problems in Mathematics.

    ERIC Educational Resources Information Center

    Reynolds, Thomas D.; And Others

    This compilation of 138 problems illustrating applications of high school mathematics to various aspects of space science is intended as a resource from which the teacher may select questions to supplement his regular course. None of the problems require a knowledge of calculus or physics, and solutions are presented along with the problem…

  1. East Coast Logo Conference '87 Proceedings (Arlington, Virginia, April 2-4, 1987).

    ERIC Educational Resources Information Center

    International Council for Computers in Education, Eugene, OR.

    A total of 59 papers are compiled into these proceedings. The papers are organized alphabetically by each author's last name. A directory of speakers' names and addresses is included. In the index of this publication, papers are listed by session. General sessions are listed first, followed by 21 regular sessions: (1) "Logo and Music";…

  2. Guide to Special Information in Scientific and Engineering Journals.

    ERIC Educational Resources Information Center

    Harris, Mary Elizabeth

    This update of a 1983 annotated bibliography lists 298 special features or special issues of science and technology periodicals with emphasis on compilations of information that appear in periodicals on a regular basis. In addition to the 203 entries listed in the original edition, 95 new entries are included. Subjects covered in the guide include…

  3. A Language for Specifying Compiler Optimizations for Generic Software

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Willcock, Jeremiah J.

    2007-01-01

    Compiler optimization is important to software performance, and modern processor architectures make optimization even more critical. However, many modern software applications use libraries providing high levels of abstraction. Such libraries often hinder effective optimization — the libraries are difficult to analyze using current compiler technology. For example, high-level libraries often use dynamic memory allocation and indirectly expressed control structures, such as iterator-based loops. Programs using these libraries often cannot achieve an optimal level of performance. On the other hand, software libraries have also been recognized as potentially aiding in program optimization. One proposed implementation of library-based optimization is to allow the library author, or a library user, to define custom analyses and optimizations. Only limited systems have been created to take advantage of this potential, however. One problem in creating a framework for defining new optimizations and analyses is how users are to specify them: implementing them by hand inside a compiler is difficult and prone to errors. Thus, a domain-specific language for library-based compiler optimizations would be beneficial. Many optimization specification languages have appeared in the literature, but they tend to be either limited in power or unnecessarily difficult to use. Therefore, I have designed, implemented, and evaluated the Pavilion language for specifying program analyses and optimizations, designed for library authors and users. These analyses and optimizations can be based on the implementation of a particular library, its use in a specific program, or on the properties of a broad range of types, expressed through concepts. The new system is intended to provide a high level of expressiveness, even though the intended users are unlikely to be compiler experts.

  4. Interprocedural Analysis and the Verification of Concurrent Programs

    DTIC Science & Technology

    2009-01-01

    SSPE) problem is to compute a regular expression that represents paths(s, v) for all vertices v in the graph. The syntax of regular expressions is as...follows: r ::= ∅ | ε | e | r1 ∪ r2 | r1.r2 | r*, where e stands for an edge in G. We can use any algorithm for SSPE to compute regular expressions for...a closed representation of loops provides an exponential speedup. Tarjan's path-expression algorithm solves the SSPE problem efficiently. It uses
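
    As a small illustration of computing such path expressions, the sketch below builds a regular expression for all labelled paths between two vertices of a toy graph. It uses the classic Kleene/McNaughton-Yamada state-elimination recurrence, not Tarjan's more efficient path-expression algorithm mentioned in the report.

    ```python
    # Kleene-style R_ij^(k) construction: regex for labelled paths in a small graph.
    from itertools import product

    EMPTY = None  # represents the empty set (no path); "" represents epsilon

    def union(a, b):
        if a is EMPTY: return b
        if b is EMPTY: return a
        if a == b: return a
        return f"({a}|{b})"

    def concat(a, b):
        if a is EMPTY or b is EMPTY: return EMPTY
        return a + b  # "" acts as epsilon

    def star(a):
        if a is EMPTY or a == "": return ""
        return f"({a})*"

    def kleene_path_expression(n, edges, s, t):
        """Regex for all labelled paths s -> t; edges maps (u, v) -> single-symbol label."""
        # R[i][j] with no intermediate vertices allowed
        R = [[EMPTY] * n for _ in range(n)]
        for (u, v), label in edges.items():
            R[u][v] = union(R[u][v], label)
        for i in range(n):
            R[i][i] = union(R[i][i], "")  # epsilon for the empty path
        # progressively allow vertex k as an intermediate vertex
        for k in range(n):
            new = [[EMPTY] * n for _ in range(n)]
            for i, j in product(range(n), repeat=2):
                via_k = concat(concat(R[i][k], star(R[k][k])), R[k][j])
                new[i][j] = union(R[i][j], via_k)
            R = new
        return R[s][t]

    # toy graph: 0 -a-> 1, 1 -b-> 1 (self loop), 1 -c-> 2
    print(kleene_path_expression(3, {(0, 1): "a", (1, 1): "b", (1, 2): "c"}, 0, 2))
    # prints a((b|))*c, i.e. a b* c (the construction does not simplify its output)
    ```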

  5. [Application of regular expression in extracting key information from Chinese medicine literatures about re-evaluation of post-marketing surveillance].

    PubMed

    Wang, Zhifei; Xie, Yanming; Wang, Yongyan

    2011-10-01

    Computerized extraction of information from the Chinese medicine literature is more convenient than hand searching: it simplifies the search process and improves accuracy. Among the many automated extraction methods now in use, regular expressions are particularly efficient for extracting useful information in research. This article focuses on applying regular expressions to extract information from the Chinese medicine literature. Two practical examples are reported, using regular expressions to extract "case number" (non-terminology) and "efficacy rate" (subgroups for related information identification), and exploring how to extract information from the Chinese medicine literature by this method.
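
    A minimal sketch of the kind of extraction described above is given below; the pattern and sample sentences are hypothetical illustrations, not the expressions used in the study.

    ```python
    # Extracting a "case number" from literature text with a regular expression.
    import re

    # matches e.g. "共纳入患者120例" or "a total of 120 cases"
    case_number = re.compile(
        r"(?:共纳入(?:患者)?|a total of\s+)(\d+)\s*(?:例|cases)",
        re.IGNORECASE,
    )

    for text in ["共纳入患者120例，随机分为两组。",
                 "A total of 120 cases were included in the trial."]:
        m = case_number.search(text)
        if m:
            print("case number:", m.group(1))
    ```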

  6. Southern California Daily Energy Report

    EIA Publications

    2016-01-01

    EIA has updated its Southern California Daily Energy Report to provide additional information on key energy market indicators for the winter season. The dashboard includes information that EIA regularly compiles about energy operations and the management of natural gas and electricity systems in Southern California in the aftermath of a leak at the Aliso Canyon natural gas storage facility outside of Los Angeles

  7. Idea Place: Early Grades. [Columns Compiled from Seven Issues of Learning Magazine, September to November 1982 and January to April-May 1983.]

    ERIC Educational Resources Information Center

    Learning, 1983

    1983-01-01

    The "Idea Place," a regular feature carried by the magazine "Learning," provides an assortment of practical teaching techniques selected from commercially available materials and from ideas submitted by readers. Games, puzzles, and other activities are given for the areas of language arts, reading, mathematics, science, social…

  8. Transportation Expressions

    DOT National Transportation Integrated Search

    1994-11-01

    This report compiles definitions of transportation terms used throughout the Department of Transportation and other US government agencies. This is the first edition of Transportation Expressions; future editions will be expanded in scope to include ...

  9. Processing SPARQL queries with regular expressions in RDF databases

    PubMed Central

    2011-01-01

    Background As the Resource Description Framework (RDF) data model is widely used for modeling and sharing a lot of online bioinformatics resources such as Uniprot (dev.isb-sib.ch/projects/uniprot-rdf) or Bio2RDF (bio2rdf.org), SPARQL - a W3C recommendation query for RDF databases - has become an important query language for querying the bioinformatics knowledge bases. Moreover, due to the diversity of users’ requests for extracting information from the RDF data as well as the lack of users’ knowledge about the exact value of each fact in the RDF databases, it is desirable to use the SPARQL query with regular expression patterns for querying the RDF data. To the best of our knowledge, there is currently no work that efficiently supports regular expression processing in SPARQL over RDF databases. Most of the existing techniques for processing regular expressions are designed for querying a text corpus, or only for supporting the matching over the paths in an RDF graph. Results In this paper, we propose a novel framework for supporting regular expression processing in SPARQL query. Our contributions can be summarized as follows. 1) We propose an efficient framework for processing SPARQL queries with regular expression patterns in RDF databases. 2) We propose a cost model in order to adapt the proposed framework in the existing query optimizers. 3) We build a prototype for the proposed framework in C++ and conduct extensive experiments demonstrating the efficiency and effectiveness of our technique. Conclusions Experiments with a full-blown RDF engine show that our framework outperforms the existing ones by up to two orders of magnitude in processing SPARQL queries with regular expression patterns. PMID:21489225

  10. Processing SPARQL queries with regular expressions in RDF databases.

    PubMed

    Lee, Jinsoo; Pham, Minh-Duc; Lee, Jihwan; Han, Wook-Shin; Cho, Hune; Yu, Hwanjo; Lee, Jeong-Hoon

    2011-03-29

    As the Resource Description Framework (RDF) data model is widely used for modeling and sharing a lot of online bioinformatics resources such as Uniprot (dev.isb-sib.ch/projects/uniprot-rdf) or Bio2RDF (bio2rdf.org), SPARQL - a W3C recommendation query for RDF databases - has become an important query language for querying the bioinformatics knowledge bases. Moreover, due to the diversity of users' requests for extracting information from the RDF data as well as the lack of users' knowledge about the exact value of each fact in the RDF databases, it is desirable to use the SPARQL query with regular expression patterns for querying the RDF data. To the best of our knowledge, there is currently no work that efficiently supports regular expression processing in SPARQL over RDF databases. Most of the existing techniques for processing regular expressions are designed for querying a text corpus, or only for supporting the matching over the paths in an RDF graph. In this paper, we propose a novel framework for supporting regular expression processing in SPARQL query. Our contributions can be summarized as follows. 1) We propose an efficient framework for processing SPARQL queries with regular expression patterns in RDF databases. 2) We propose a cost model in order to adapt the proposed framework in the existing query optimizers. 3) We build a prototype for the proposed framework in C++ and conduct extensive experiments demonstrating the efficiency and effectiveness of our technique. Experiments with a full-blown RDF engine show that our framework outperforms the existing ones by up to two orders of magnitude in processing SPARQL queries with regular expression patterns.
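
    A brief sketch of the kind of SPARQL regex-pattern query the proposed framework targets is shown below, run here with rdflib's stock query engine rather than the optimized framework from the paper; the data file and triple pattern are hypothetical.

    ```python
    # SPARQL query with a regular expression filter, executed via rdflib.
    from rdflib import Graph

    g = Graph()
    g.parse("proteins.ttl", format="turtle")  # hypothetical RDF data set

    query = """
    PREFIX rdfs: <http://www.w3.org/2000/01/rdf-schema#>
    SELECT ?s ?label WHERE {
        ?s rdfs:label ?label .
        FILTER regex(str(?label), "kinase", "i")   # case-insensitive pattern match
    }
    """
    for s, label in g.query(query):
        print(s, label)
    ```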

  11. Low-rank regularization for learning gene expression programs.

    PubMed

    Ye, Guibo; Tang, Mengfan; Cai, Jian-Feng; Nie, Qing; Xie, Xiaohui

    2013-01-01

    Learning gene expression programs directly from a set of observations is challenging due to the complexity of gene regulation, high noise of experimental measurements, and insufficient number of experimental measurements. Imposing additional constraints with strong and biologically motivated regularizations is critical in developing reliable and effective algorithms for inferring gene expression programs. Here we propose a new form of regularization that constrains the number of independent connectivity patterns between regulators and targets, motivated by the modular design of gene regulatory programs and the belief that the total number of independent regulatory modules should be small. We formulate a multi-target linear regression framework to incorporate this type of regularization, in which the number of independent connectivity patterns is expressed as the rank of the connectivity matrix between regulators and targets. We then generalize the linear framework to nonlinear cases, and prove that the generalized low-rank regularization model is still convex. Efficient algorithms are derived to solve both the linear and nonlinear low-rank regularized problems. Finally, we test the algorithms on three gene expression datasets, and show that the low-rank regularization improves the accuracy of gene expression prediction in these three datasets.
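
    The rank penalty on the regulator-target connectivity matrix can be made concrete with a hedged sketch: a generic nuclear-norm (convex low-rank) regularized multi-target regression solved by proximal gradient with singular value thresholding, on synthetic data. This is a standard formulation for illustration, not the authors' specific algorithm.

    ```python
    # Minimize 0.5*||X W - Y||_F^2 + lam*||W||_* by proximal gradient descent.
    import numpy as np

    def svt(M, tau):
        # singular value thresholding: proximal operator of tau * nuclear norm
        U, s, Vt = np.linalg.svd(M, full_matrices=False)
        return U @ np.diag(np.maximum(s - tau, 0.0)) @ Vt

    def low_rank_regression(X, Y, lam=1.0, n_iter=500):
        n, p = X.shape
        _, k = Y.shape
        W = np.zeros((p, k))
        L = np.linalg.norm(X, 2) ** 2  # Lipschitz constant of the smooth part's gradient
        step = 1.0 / L
        for _ in range(n_iter):
            grad = X.T @ (X @ W - Y)
            W = svt(W - step * grad, step * lam)
        return W

    # toy data: 100 samples, 20 regulators, 15 target genes, true rank-2 connectivity
    rng = np.random.default_rng(0)
    X = rng.normal(size=(100, 20))
    W_true = rng.normal(size=(20, 2)) @ rng.normal(size=(2, 15))
    Y = X @ W_true + 0.1 * rng.normal(size=(100, 15))
    W_hat = low_rank_regression(X, Y, lam=5.0)
    print("estimated rank:", np.linalg.matrix_rank(W_hat, tol=1e-3))
    ```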

  12. Immersion francaise precoce: Education physique 1-7 (Early French Immersion: Physical Education for Grades 1-7).

    ERIC Educational Resources Information Center

    Burt, Andy; And Others

    This curriculum guide for physical education is intended for use in grades 1-7 in the early French immersion program. It is a translation of the regular physical education program and a compilation of references and supplementary teaching material. It is noted that because of the comparative lack of references in French, much of the reference…

  13. Final report: Compiled MPI. Cost-Effective Exascale Application Development

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gropp, William Douglas

    2015-12-21

    This is the final report on Compiled MPI: Cost-Effective Exascale Application Development, and it summarizes the results of this project. The project investigated runtime environments that improve the performance of MPI (Message-Passing Interface) programs; work at Illinois in the last period of this project looked at optimizing data access optimizations expressed with MPI datatypes.

  14. Toward Abstracting the Communication Intent in Applications to Improve Portability and Productivity

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mintz, Tiffany M; Hernandez, Oscar R; Kartsaklis, Christos

    Programming with communication libraries such as the Message Passing Interface (MPI) obscures the high-level intent of the communication in an application and makes static communication analysis difficult to do. Compilers are unaware of communication library specifics, leading to the exclusion of communication patterns from any automated analysis and optimizations. To overcome this, communication patterns can be expressed at higher levels of abstraction and incrementally added to existing MPI applications. In this paper, we propose the use of directives to clearly express the communication intent of an application in a way that is not specific to a given communication library. Our communication directives allow programmers to express communication among processes in a portable way, giving hints to the compiler on regions of computations that can be overlapped with communication and relaxing communication constraints on the ordering, completion and synchronization of the communication imposed by specific libraries such as MPI. The directives can then be translated by the compiler into message passing calls that efficiently implement the intended pattern and be targeted to multiple communication libraries. Thus far, we have used the directives to express point-to-point communication patterns in C, C++ and Fortran applications, and have translated them to MPI and SHMEM.

  15. Automatic data partitioning on distributed memory multicomputers. Ph.D. Thesis

    NASA Technical Reports Server (NTRS)

    Gupta, Manish

    1992-01-01

    Distributed-memory parallel computers are increasingly being used to provide high levels of performance for scientific applications. Unfortunately, such machines are not very easy to program. A number of research efforts seek to alleviate this problem by developing compilers that take over the task of generating communication. The communication overheads and the extent of parallelism exploited in the resulting target program are determined largely by the manner in which data is partitioned across different processors of the machine. Most of the compilers provide no assistance to the programmer in the crucial task of determining a good data partitioning scheme. A novel approach, the constraints-based approach, is presented for the problem of automatic data partitioning for numeric programs. In this approach, the compiler identifies some desirable requirements on the distribution of various arrays being referenced in each statement, based on performance considerations. These desirable requirements are referred to as constraints. For each constraint, the compiler determines a quality measure that captures its importance with respect to the performance of the program. The quality measure is obtained through static performance estimation, without actually generating the target data-parallel program with explicit communication. Each data distribution decision is taken by combining all the relevant constraints. The compiler attempts to resolve any conflicts between constraints such that the overall execution time of the parallel program is minimized. This approach has been implemented as part of a compiler called Paradigm, which accepts Fortran 77 programs and specifies the partitioning scheme to be used for each array in the program. We have obtained results on some programs taken from the Linpack and Eispack libraries, and the Perfect Benchmarks. These results are quite promising, and demonstrate the feasibility of automatic data partitioning for a significant class of scientific application programs with regular computations.

  16. Current and Future Applications of Machine Learning for the US Army

    DTIC Science & Technology

    2018-04-13

    designing from the unwieldy application of the first principles of flight controls, aerodynamics, blade propulsion, and so on, the designers turned...when the number of features runs into millions can become challenging. To overcome these issues, regularization techniques have been developed which...and compiled to run efficiently on either CPU or GPU architectures. 5) Keras is a library that contains numerous implementations of commonly used

  17. Mining Clinicians' Electronic Documentation to Identify Heart Failure Patients with Ineffective Self-Management: A Pilot Text-Mining Study.

    PubMed

    Topaz, Maxim; Radhakrishnan, Kavita; Lei, Victor; Zhou, Li

    2016-01-01

    Effective self-management can decrease up to 50% of heart failure hospitalizations. Unfortunately, self-management by patients with heart failure remains poor. This pilot study aimed to explore the use of text-mining to identify heart failure patients with ineffective self-management. We first built a comprehensive self-management vocabulary based on the literature and clinical notes review. We then randomly selected 545 heart failure patients treated within Partners Healthcare hospitals (Boston, MA, USA) and conducted a regular expression search with the compiled vocabulary within 43,107 interdisciplinary clinical notes of these patients. We found that 38.2% (n = 208) of patients had documentation of ineffective heart failure self-management in the domains of poor diet adherence (28.4%), missed medical encounters (26.4%), poor medication adherence (20.2%) and non-specified self-management issues (e.g., "compliance issues", 34.6%). We showed the feasibility of using text-mining to identify patients with ineffective self-management. More natural language processing algorithms are needed to help busy clinicians identify these patients.
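
    The vocabulary-driven regular expression search described above can be sketched as follows; the terms and the sample note are invented illustrations, not the study's compiled vocabulary or data.

    ```python
    # Compile a small vocabulary into one alternation pattern and scan a clinical note.
    import re

    vocabulary = [
        r"non[- ]?complian\w*",                      # "noncompliant", "non-compliance", ...
        r"missed\s+(appointment|visit)s?",
        r"not\s+taking\s+(his|her|their)\s+medications?",
        r"poor\s+diet(ary)?\s+adherence",
    ]
    pattern = re.compile("|".join(f"(?:{p})" for p in vocabulary), re.IGNORECASE)

    note = "Pt admitted with fluid overload; reports missed appointments and is noncompliant with lasix."
    flags = [m.group(0) for m in pattern.finditer(note)]
    print(flags)   # documentation cues suggesting ineffective self-management
    ```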

  18. Reformulating Constraints for Compilability and Efficiency

    NASA Technical Reports Server (NTRS)

    Tong, Chris; Braudaway, Wesley; Mohan, Sunil; Voigt, Kerstin

    1992-01-01

    KBSDE is a knowledge compiler that uses a classification-based approach to map solution constraints in a task specification onto particular search algorithm components that will be responsible for satisfying those constraints (e.g., local constraints are incorporated in generators; global constraints are incorporated in either testers or hillclimbing patchers). Associated with each type of search algorithm component is a subcompiler that specializes in mapping constraints into components of that type. Each of these subcompilers in turn uses a classification-based approach, matching a constraint passed to it against one of several schemas, and applying a compilation technique associated with that schema. While much progress has occurred in our research since we first laid out our classification-based approach [Ton91], we focus in this paper on our reformulation research. Two important reformulation issues that arise out of the choice of a schema-based approach are: (1) compilability-- Can a constraint that does not directly match any of a particular subcompiler's schemas be reformulated into one that does? and (2) Efficiency-- If the efficiency of the compiled search algorithm depends on the compiler's performance, and the compiler's performance depends on the form in which the constraint was expressed, can we find forms for constraints which compile better, or reformulate constraints whose forms can be recognized as ones that compile poorly? In this paper, we describe a set of techniques we are developing for partially addressing these issues.

  19. Manned Space Flight Experiments Symposium: Gemini Missions III and IV

    NASA Technical Reports Server (NTRS)

    1965-01-01

    This is a compilation of papers on in-flight experiments presented at the first symposium of a series, Manned Space Flight Experiments Symposium, sponsored by the National Aeronautics and Space Administration. The results of experiments conducted during the Gemini Missions III and IV are covered. These symposiums are to be conducted for the scientific community at regular intervals on the results of experiments carried out in conjunction with manned space flights.

  20. Compiling global name-space programs for distributed execution

    NASA Technical Reports Server (NTRS)

    Koelbel, Charles; Mehrotra, Piyush

    1990-01-01

    Distributed memory machines do not provide hardware support for a global address space. Thus programmers are forced to partition the data across the memories of the architecture and use explicit message passing to communicate data between processors. The compiler support required to allow programmers to express their algorithms using a global name-space is examined. A general method is presented for analysis of a high level source program and along with its translation to a set of independently executing tasks communicating via messages. If the compiler has enough information, this translation can be carried out at compile-time. Otherwise run-time code is generated to implement the required data movement. The analysis required in both situations is described and the performance of the generated code on the Intel iPSC/2 is presented.

  1. C to VHDL compiler

    NASA Astrophysics Data System (ADS)

    Berdychowski, Piotr P.; Zabolotny, Wojciech M.

    2010-09-01

    The main goal of the C to VHDL compiler project is to make the FPGA platform more accessible for scientists and software developers. The FPGA platform offers the unique ability to configure the hardware to implement virtually any dedicated architecture, and modern devices provide a sufficient number of hardware resources to implement parallel execution platforms with complex processing units. All this makes the FPGA platform very attractive for those looking for an efficient heterogeneous computing environment. The current industry standard in development of digital systems on the FPGA platform is based on HDLs. Although very effective and expressive in the hands of hardware development specialists, these languages require specific knowledge and experience, unreachable for most scientists and software programmers. The C to VHDL compiler project attempts to remedy that by creating an application that derives an initial VHDL description of a digital system (for further compilation and synthesis) from a purely algorithmic description in the C programming language. This idea itself is not new, and the C to VHDL compiler combines the best approaches from existing solutions developed over many previous years, with the introduction of some new unique improvements.

  2. Novel harmonic regularization approach for variable selection in Cox's proportional hazards model.

    PubMed

    Chu, Ge-Jin; Liang, Yong; Wang, Jia-Xuan

    2014-01-01

    Variable selection is an important issue in regression and a number of variable selection methods have been proposed involving nonconvex penalty functions. In this paper, we investigate a novel harmonic regularization method, which can approximate nonconvex Lq (1/2 < q < 1) regularizations, to select key risk factors in the Cox's proportional hazards model using microarray gene expression data. The harmonic regularization method can be efficiently solved using our proposed direct path seeking approach, which can produce solutions that closely approximate those for the convex loss function and the nonconvex regularization. Simulation results based on the artificial datasets and four real microarray gene expression datasets, such as the diffuse large B-cell lymphoma (DLBCL), lung cancer, and AML datasets, show that the harmonic regularization method can be more accurate for variable selection than existing Lasso series methods.

  3. Status Report on Speech Research: A Report on the Status and Progress of Studies on the Nature of Speech, Instrumentation for its Investigation, and Practical Applications, April 1-September 30, 1983.

    ERIC Educational Resources Information Center

    Studdert-Kennedy, Michael, Ed.; O'Brien, Nancy, Ed.

    Prepared as part of a regular series on the status and progress of studies on the nature of speech, instrumentation for its evaluation, and practical applications for speech research, this compilation contains 14 reports. Topics covered in the reports include the following: (1) phonetic coding and order memory in relation to reading proficiency,…

  4. Effect of MUC8 on Airway Inflammation: A Friend or a Foe?

    PubMed

    Cha, Hee-Jae; Song, Kyoung Seob

    2018-02-06

    In this review, we compile studies identifying the molecular mechanisms of MUC8 gene expression and characterizing the physiological functions of MUC8 in the airway, and analyze how altered MUC8 gene expression in the lung is affected by negative regulators.

  5. Evolutionary modification of T-brain (tbr) expression patterns in sand dollar.

    PubMed

    Minemura, Keiko; Yamaguchi, Masaaki; Minokawa, Takuya

    2009-10-01

    The sand dollars are a group of irregular echinoids that diverged from regular sea urchins approximately 200 million years ago. We isolated two orthologs of T-brain (tbr), Smtbr and Pjtbr, from the indirect-developing sand dollar Scaphechinus mirabilis and the direct-developing sand dollar Peronella japonica, respectively. The expression patterns of Smtbr and Pjtbr during early development were examined by whole mount in situ hybridization. The expression of Smtbr was first detected in micromere descendants at the early blastula stage, similar to tbr expression in regular sea urchins. However, unlike in regular sea urchins, Smtbr expression at the middle blastula stage was detected in micromere-descendant cells and a subset of macromere-descendant cells. At the gastrula stage, expression of Smtbr was detected in part of the archenteron as well as in primary mesenchyme cells. A similar pattern of tbr expression was observed in early Peronella embryos. A comparison of tbr expression patterns between sand dollars and other echinoderm species suggested that broader expression in the endomesoderm is an ancestral character of echinoderms. In addition to the endomesoderm, Pjtbr expression was detected in the apical organ, the animal-most part of the ectoderm.

  6. Novel Harmonic Regularization Approach for Variable Selection in Cox's Proportional Hazards Model

    PubMed Central

    Chu, Ge-Jin; Liang, Yong; Wang, Jia-Xuan

    2014-01-01

    Variable selection is an important issue in regression and a number of variable selection methods have been proposed involving nonconvex penalty functions. In this paper, we investigate a novel harmonic regularization method, which can approximate nonconvex Lq (1/2 < q < 1) regularizations, to select key risk factors in the Cox's proportional hazards model using microarray gene expression data. The harmonic regularization method can be efficiently solved using our proposed direct path seeking approach, which can produce solutions that closely approximate those for the convex loss function and the nonconvex regularization. Simulation results based on the artificial datasets and four real microarray gene expression datasets, such as the diffuse large B-cell lymphoma (DLBCL), lung cancer, and AML datasets, show that the harmonic regularization method can be more accurate for variable selection than existing Lasso series methods. PMID:25506389

  7. RAJA Performance Suite

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hornung, Richard D.; Jones, Holger E.

    The RAJA Performance Suite is designed to evaluate performance of the RAJA performance portability library on a wide variety of important high performance computing (HPC) algorithmic kernels. These kernels assess compiler optimizations and various parallel programming model backends accessible through RAJA, such as OpenMP, CUDA, etc. The initial version of the suite contains 25 computational kernels, each of which appears in 6 variants: Baseline Sequential, RAJA Sequential, Baseline OpenMP, RAJA OpenMP, Baseline CUDA, RAJA CUDA. All variants of each kernel perform essentially the same mathematical operations and the loop body code for each kernel is identical across all variants. There are a few kernels, such as those that contain reduction operations, that require CUDA-specific coding for their CUDA variants. The actual computer instructions executed and how they run in parallel differ depending on the parallel programming model backend used and which optimizations are performed by the compiler used to build the Performance Suite executable. The Suite will be used primarily by RAJA developers to perform regular assessments of RAJA performance across a range of hardware platforms and compilers as RAJA features are being developed. It will also be used by LLNL hardware and software vendor partners for defining requirements for future computing platform procurements and acceptance testing. In particular, the RAJA Performance Suite will be used for compiler acceptance testing of the upcoming CORAL Sierra machine (initial LLNL delivery expected in late 2017/early 2018) and the CORAL-2 procurement. The Suite will also be used to generate concise source code reproducers of compiler and runtime issues we uncover so that we may provide them to relevant vendors to be fixed.

  8. Charting the Inland Seas: A History of the U.S. Lake Survey

    DTIC Science & Technology

    1991-01-01

    thermometer sporting in the nineties, we were roasted; had the pains of purgatory within and without. Return to camp after sundown-supper same as...Islands, and to the east of Bass, and Hen and Chicken Islands on Lake Erie. Early in the season, the party on the steamer Col. J.L. Lusk conducted...and outflow rivers from established relationships, distributed the data to a number of regular recipients, and compiled diversion tabulations

  9. Band Gap of InAs(1-x)Sbx with Native Lattice Constant

    DTIC Science & Technology

    2012-12-17

    photodetector [9] (QWIP) and the type II strained layer superlattice [10,11] (SLS) were, and continue to be, studied extensively. The graphical compilation of band...noise floor. To demonstrate that TEM can observe ordering when it is present at the levels needed to affect the band gap, we examined a sample with...and fundamental spots to be about 1.6 orders of magnitude. The same ratio from a regular peak to the noise floor is about 2 orders of magnitude. These

  10. Handbook for Spoken Mathematics: (Larry's Speakeasy).

    ERIC Educational Resources Information Center

    Chang, Lawrence A.; And Others

    This handbook is directed toward those who have to deal with spoken mathematics, yet have insufficient background to know the correct verbal expression for the written symbolic one. It compiles consistent and well-defined ways of uttering mathematical expressions so listeners will receive clear, unambiguous, and well-pronounced representations.…

  11. Empirical Performance Model-Driven Data Layout Optimization and Library Call Selection for Tensor Contraction Expressions

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lu, Qingda; Gao, Xiaoyang; Krishnamoorthy, Sriram

    Empirical optimizers like ATLAS have been very effective in optimizing computational kernels in libraries. The best choice of parameters such as tile size and degree of loop unrolling is determined by executing different versions of the computation. In contrast, optimizing compilers use a model-driven approach to program transformation. While the model-driven approach of optimizing compilers is generally orders of magnitude faster than ATLAS-like library generators, its effectiveness can be limited by the accuracy of the performance models used. In this paper, we describe an approach where a class of computations is modeled in terms of constituent operations that are empirically measured, thereby allowing modeling of the overall execution time. The performance model with empirically determined cost components is used to perform data layout optimization together with the selection of library calls and layout transformations in the context of the Tensor Contraction Engine, a compiler for a high-level domain-specific language for expressing computational models in quantum chemistry. The effectiveness of the approach is demonstrated through experimental measurements on representative computations from quantum chemistry.

  12. Comparison of human cell signaling pathway databases—evolution, drawbacks and challenges

    PubMed Central

    Chowdhury, Saikat; Sarkar, Ram Rup

    2015-01-01

    Elucidating the complexities of cell signaling pathways is of immense importance to gain understanding about various biological phenomenon, such as dynamics of gene/protein expression regulation, cell fate determination, embryogenesis and disease progression. The successful completion of human genome project has also helped experimental and theoretical biologists to analyze various important pathways. To advance this study, during the past two decades, systematic collections of pathway data from experimental studies have been compiled and distributed freely by several databases, which also integrate various computational tools for further analysis. Despite significant advancements, there exist several drawbacks and challenges, such as pathway data heterogeneity, annotation, regular update and automated image reconstructions, which motivated us to perform a thorough review on popular and actively functioning 24 cell signaling databases. Based on two major characteristics, pathway information and technical details, freely accessible data from commercial and academic databases are examined to understand their evolution and enrichment. This review not only helps to identify some novel and useful features, which are not yet included in any of the databases but also highlights their current limitations and subsequently propose the reasonable solutions for future database development, which could be useful to the whole scientific community. PMID:25632107

  13. Note on use of slope diffraction coefficients for aperture antennas on finite ground planes

    NASA Technical Reports Server (NTRS)

    Cockrell, C. R.; Beck, F. B.

    1995-01-01

    The use of slope diffraction coefficients along with regular diffraction coefficients for calculating the radiation patterns of aperture antennas in a finite ground plane is investigated. Explicit expressions for regular diffraction coefficients and slope diffraction coefficients are presented. The expressions for the incident magnetic field in terms of the magnetic current in the aperture are given. The slope of the incident magnetic field is calculated and closed form expressions are presented.

  14. Vacuum polarization in the field of a multidimensional global monopole

    NASA Astrophysics Data System (ADS)

    Grats, Yu. V.; Spirin, P. A.

    2016-11-01

An approximate expression for the Euclidean Green function of a massless scalar field in the spacetime of a multidimensional global monopole has been derived. Expressions for the vacuum expectation values ⟨ϕ²⟩_ren and ⟨T_00⟩_ren have been derived by the dimensional regularization method. Comparison with the results obtained by alternative regularization methods is made.

  15. Designing a Syntax-Based Retrieval System for Supporting Language Learning

    ERIC Educational Resources Information Center

    Tsao, Nai-Lung; Kuo, Chin-Hwa; Wible, David; Hung, Tsung-Fu

    2009-01-01

    In this paper, we propose a syntax-based text retrieval system for on-line language learning and use a fast regular expression search engine as its main component. Regular expression searches provide more scalable querying and search results than keyword-based searches. However, without a well-designed index scheme, the execution time of regular…
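
    As a toy illustration of why regular-expression queries can retrieve syntactic constructions that keyword search cannot (the mini-corpus and pattern below are invented for this example and are not the paper's index scheme), consider querying for "look forward to" followed by a gerund:

```python
import re

# A hypothetical mini-corpus; the pattern below is illustrative only.
corpus = [
    "She is looking forward to visiting Taipei.",
    "He looked forward and saw the station.",
    "They look forward to hearing from you.",
]

# Regular-expression query: any form of "look" + "forward to" + a gerund.
pattern = re.compile(r"\blook(?:s|ed|ing)?\s+forward\s+to\s+\w+ing\b", re.IGNORECASE)

hits = [s for s in corpus if pattern.search(s)]
print(hits)  # matches the 1st and 3rd sentences; a keyword search on "forward" returns all three
```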

  16. ScreenRecorder: A Utility for Creating Screenshot Video Using Only Original Equipment Manufacturer (OEM) Software on Microsoft Windows Systems

    DTIC Science & Technology

    2015-01-01

class within Microsoft Visual Studio. It has been tested on and is compatible with Microsoft Vista, 7, and 8 and Visual Studio Express 2008...the ScreenRecorder utility assumes a basic understanding of compiling and running C++ code within Microsoft Visual Studio. This report does not...of Microsoft Visual Studio, the ScreenRecorder utility was developed as a C++ class that can be compiled as a library (static or dynamic) to be

  17. Restriction enzyme cutting site distribution regularity for DNA looping technology.

    PubMed

    Shang, Ying; Zhang, Nan; Zhu, Pengyu; Luo, Yunbo; Huang, Kunlun; Tian, Wenying; Xu, Wentao

    2014-01-25

The restriction enzyme cutting site distribution regularity and looping conditions were studied systematically. We obtained the restriction enzyme cutting site distributions of 13 commonly used restriction enzymes in 5 model organism genomes through two novel self-compiled software programs. All of the average distances between two adjacent restriction sites fell sharply with increasing statistical intervals, and most fragments were 0-499 bp. A shorter DNA fragment resulted in a lower looping rate, which was also directly proportional to the DNA concentration. When the length was more than 500 bp, the concentration did not affect the looping rate. Therefore, the preferred fragment length was longer than 500 bp and did not contain the restriction enzyme cutting sites that would be used for digestion. In order to make the looping efficiencies reach nearly 100%, 4-5 single cohesive end systems were recommended to digest the genome separately. Copyright © 2013 Elsevier B.V. All rights reserved.

  18. Resolving the paradigm crisis in intravenous iron and erythropoietin management.

    PubMed

    Besarab, A

    2006-05-01

    Despite the proven benefits of intravenous (i.v.) iron therapy in anemia management, it remains underutilized in the hemodialysis population. Although overall i.v. iron usage continues to increase slowly, monthly usage statistics compiled by the US Renal Data System suggest that clinicians are not implementing continued dosing regimens following repletion of iron stores. Continued therapy with i.v. iron represents a key opportunity to improve patient outcomes and increase the efficiency of anemia treatment. Regular administration of low doses of i.v. iron prevents the recurrence of iron deficiency, enhances response to recombinant human erythropoietin therapy, minimizes fluctuation of hemoglobin levels, hematocrit levels, and iron stores, and may reduce overall costs of care. This article reviews the importance of i.v. iron dosing on a regular basis in the hemodialysis patient with iron-deficiency anemia and explores reasons why some clinicians may still be reluctant to employ these protocols in the hemodialysis setting.

  19. Complexity in the Chinese stock market and its relationships with monetary policy intensity

    NASA Astrophysics Data System (ADS)

    Ying, Shangjun; Fan, Ying

    2014-01-01

This paper introduces how to formulate the CSI300 evolving stock index using the Paasche compiling technique of weighted indexes after presenting the GCA model. It studies the dynamic characteristics of the Chinese stock market and its relationships with monetary policy intensity, based on the evolving stock index. It concludes that it is possible to construct a dynamics equation of the Chinese stock market using three variables, and that, from a chaos point of view, it is not effective to regulate market complexity according to the changing intensity of external factors.

  20. Processing of task-irrelevant emotional faces impacted by implicit sequence learning.

    PubMed

    Peng, Ming; Cai, Mengfei; Zhou, Renlai

    2015-12-02

Attentional load may be increased by task-relevant attention, such as task difficulty, or by task-irrelevant attention, such as an unexpected light-spot on the screen. Several studies have focused on the influence of task-relevant attentional load on task-irrelevant emotion processing. In this study, we used event-related potentials to examine the impact of task-irrelevant attentional load on task-irrelevant expression processing. Eighteen participants identified the color of a word (i.e., the color Stroop task) while a picture of a fearful or a neutral face was shown in the background. The task-irrelevant attentional load was increased by regularly presented congruence trials (congruence between the color and the meaning of the word) in the regular condition, because implicit sequence learning was induced. We compared task-irrelevant expression processing between the regular condition and the random condition (in which congruence and incongruence trials were presented randomly). Behaviorally, reaction times for fearful faces were faster than for neutral faces in the random condition, whereas no significant difference was found in the regular condition. The event-related potential results indicated enhanced positive amplitudes in the P2, N2, and P3 components for fearful faces relative to neutral faces in the random condition. In comparison, only P2 differed significantly between the two types of expressions in the regular condition. The study showed that attentional load increased by implicit sequence learning influenced the late processing of task-irrelevant expressions.

  1. On the regularized fermionic projector of the vacuum

    NASA Astrophysics Data System (ADS)

    Finster, Felix

    2008-03-01

    We construct families of fermionic projectors with spherically symmetric regularization, which satisfy the condition of a distributional MP-product. The method is to analyze regularization tails with a power law or logarithmic scaling in composite expressions in the fermionic projector. The resulting regularizations break the Lorentz symmetry and give rise to a multilayer structure of the fermionic projector near the light cone. Furthermore, we construct regularizations which go beyond the distributional MP-product in that they yield additional distributional contributions supported at the origin. The remaining freedom for the regularization parameters and the consequences for the normalization of the fermionic states are discussed.

  2. Facial Affect Recognition Using Regularized Discriminant Analysis-Based Algorithms

    NASA Astrophysics Data System (ADS)

    Lee, Chien-Cheng; Huang, Shin-Sheng; Shih, Cheng-Yuan

    2010-12-01

    This paper presents a novel and effective method for facial expression recognition including happiness, disgust, fear, anger, sadness, surprise, and neutral state. The proposed method utilizes a regularized discriminant analysis-based boosting algorithm (RDAB) with effective Gabor features to recognize the facial expressions. Entropy criterion is applied to select the effective Gabor feature which is a subset of informative and nonredundant Gabor features. The proposed RDAB algorithm uses RDA as a learner in the boosting algorithm. The RDA combines strengths of linear discriminant analysis (LDA) and quadratic discriminant analysis (QDA). It solves the small sample size and ill-posed problems suffered from QDA and LDA through a regularization technique. Additionally, this study uses the particle swarm optimization (PSO) algorithm to estimate optimal parameters in RDA. Experiment results demonstrate that our approach can accurately and robustly recognize facial expressions.
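
    The regularization at the core of RDA can be sketched in a few lines: each class covariance is shrunk toward the pooled covariance and then toward a scaled identity, interpolating between QDA and LDA so that discriminant scores remain well-posed with few samples. The Python sketch below is generic (synthetic data, illustrative shrinkage parameters) and is not the full RDAB boosting algorithm with Gabor features and PSO described above.

```python
import numpy as np

def regularized_covariances(class_samples, lam=0.5, gamma=0.1):
    """Shrink each class covariance toward the pooled covariance (lam) and
    then toward a scaled identity (gamma), an RDA-style compromise between
    QDA and LDA that avoids singular covariance estimates."""
    pooled = np.cov(np.vstack(class_samples), rowvar=False)
    out = []
    for Xk in class_samples:
        Sk = np.cov(Xk, rowvar=False)
        S = (1 - lam) * Sk + lam * pooled
        S = (1 - gamma) * S + gamma * (np.trace(S) / S.shape[0]) * np.eye(S.shape[0])
        out.append(S)
    return out

# Two tiny classes in 10-D with only a handful of samples each
# (the small-sample setting where plain QDA breaks down).
rng = np.random.default_rng(0)
classes = [rng.standard_normal((6, 10)), rng.standard_normal((7, 10)) + 1.0]
covs = regularized_covariances(classes, lam=0.5, gamma=0.1)
print([np.linalg.cond(S) for S in covs])  # finite, usable for discriminant scores
```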

  3. A high performance data parallel tensor contraction framework: Application to coupled electro-mechanics

    NASA Astrophysics Data System (ADS)

    Poya, Roman; Gil, Antonio J.; Ortigosa, Rogelio

    2017-07-01

    The paper presents aspects of implementation of a new high performance tensor contraction framework for the numerical analysis of coupled and multi-physics problems on streaming architectures. In addition to explicit SIMD instructions and smart expression templates, the framework introduces domain specific constructs for the tensor cross product and its associated algebra recently rediscovered by Bonet et al. (2015, 2016) in the context of solid mechanics. The two key ingredients of the presented expression template engine are as follows. First, the capability to mathematically transform complex chains of operations to simpler equivalent expressions, while potentially avoiding routes with higher levels of computational complexity and, second, to perform a compile time depth-first or breadth-first search to find the optimal contraction indices of a large tensor network in order to minimise the number of floating point operations. For optimisations of tensor contraction such as loop transformation, loop fusion and data locality optimisations, the framework relies heavily on compile time technologies rather than source-to-source translation or JIT techniques. Every aspect of the framework is examined through relevant performance benchmarks, including the impact of data parallelism on the performance of isomorphic and nonisomorphic tensor products, the FLOP and memory I/O optimality in the evaluation of tensor networks, the compilation cost and memory footprint of the framework and the performance of tensor cross product kernels. The framework is then applied to finite element analysis of coupled electro-mechanical problems to assess the speed-ups achieved in kernel-based numerical integration of complex electroelastic energy functionals. In this context, domain-aware expression templates combined with SIMD instructions are shown to provide a significant speed-up over the classical low-level style programming techniques.
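
    The compile-time search for optimal contraction indices has a runtime analogue in NumPy's contraction-path optimizer. The short sketch below (Python/NumPy with invented tensor shapes, not the C++ expression-template framework described above) shows how a pairwise contraction order for a small tensor network is chosen to reduce floating point operations before the network is evaluated.

```python
import numpy as np

# Three tensors forming a small network; shapes chosen so the order of
# pairwise contractions changes the FLOP count noticeably.
A = np.random.rand(8, 32)
B = np.random.rand(32, 64)
C = np.random.rand(64, 8)

# Search for a cheap pairwise contraction order, analogous to the
# compile-time depth-first/breadth-first search described above.
path, report = np.einsum_path('ij,jk,kl->il', A, B, C, optimize='optimal')
print(path)    # e.g. ['einsum_path', (0, 1), (0, 1)]
print(report)  # FLOP estimates for naive vs. optimized ordering

result = np.einsum('ij,jk,kl->il', A, B, C, optimize=path)
```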

  4. Gene Expression Data to Mouse Atlas Registration Using a Nonlinear Elasticity Smoother and Landmark Points Constraints

    PubMed Central

    Lin, Tungyou; Guyader, Carole Le; Dinov, Ivo; Thompson, Paul; Toga, Arthur; Vese, Luminita

    2013-01-01

    This paper proposes a numerical algorithm for image registration using energy minimization and nonlinear elasticity regularization. Application to the registration of gene expression data to a neuroanatomical mouse atlas in two dimensions is shown. We apply a nonlinear elasticity regularization to allow larger and smoother deformations, and further enforce optimality constraints on the landmark points distance for better feature matching. To overcome the difficulty of minimizing the nonlinear elasticity functional due to the nonlinearity in the derivatives of the displacement vector field, we introduce a matrix variable to approximate the Jacobian matrix and solve for the simplified Euler-Lagrange equations. By comparison with image registration using linear regularization, experimental results show that the proposed nonlinear elasticity model also needs fewer numerical corrections such as regridding steps for binary image registration, it renders better ground truth, and produces larger mutual information; most importantly, the landmark points distance and L2 dissimilarity measure between the gene expression data and corresponding mouse atlas are smaller compared with the registration model with biharmonic regularization. PMID:24273381

  5. Vacuum polarization in the field of a multidimensional global monopole

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Grats, Yu. V., E-mail: grats@phys.msu.ru; Spirin, P. A.

    2016-11-15

An approximate expression for the Euclidean Green function of a massless scalar field in the spacetime of a multidimensional global monopole has been derived. Expressions for the vacuum expectation values ⟨ϕ²⟩_ren and ⟨T_00⟩_ren have been derived by the dimensional regularization method. Comparison with the results obtained by alternative regularization methods is made.

  6. StreamQRE: Modular Specification and Efficient Evaluation of Quantitative Queries over Streaming Data.

    PubMed

    Mamouras, Konstantinos; Raghothaman, Mukund; Alur, Rajeev; Ives, Zachary G; Khanna, Sanjeev

    2017-06-01

    Real-time decision making in emerging IoT applications typically relies on computing quantitative summaries of large data streams in an efficient and incremental manner. To simplify the task of programming the desired logic, we propose StreamQRE, which provides natural and high-level constructs for processing streaming data. Our language has a novel integration of linguistic constructs from two distinct programming paradigms: streaming extensions of relational query languages and quantitative extensions of regular expressions. The former allows the programmer to employ relational constructs to partition the input data by keys and to integrate data streams from different sources, while the latter can be used to exploit the logical hierarchy in the input stream for modular specifications. We first present the core language with a small set of combinators, formal semantics, and a decidable type system. We then show how to express a number of common patterns with illustrative examples. Our compilation algorithm translates the high-level query into a streaming algorithm with precise complexity bounds on per-item processing time and total memory footprint. We also show how to integrate approximation algorithms into our framework. We report on an implementation in Java, and evaluate it with respect to existing high-performance engines for processing streaming data. Our experimental evaluation shows that (1) StreamQRE allows more natural and succinct specification of queries compared to existing frameworks, (2) the throughput of our implementation is higher than comparable systems (for example, two-to-four times greater than RxJava), and (3) the approximation algorithms supported by our implementation can lead to substantial memory savings.
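
    To make the flavour of such quantitative streaming queries concrete, here is a minimal Python sketch (not the StreamQRE combinators or its Java API) of a per-key incremental aggregation with constant work per item and constant memory per key, the kind of per-item processing bound mentioned in the abstract.

```python
from collections import defaultdict

def running_average_by_key(stream):
    """Incrementally maintain a per-key average over (key, value) items.

    Constant memory per key and constant work per item; illustrates the
    kind of cost bound a streaming query compiler aims to guarantee.
    """
    count = defaultdict(int)
    total = defaultdict(float)
    for key, value in stream:
        count[key] += 1
        total[key] += value
        yield key, total[key] / count[key]

events = [("sensorA", 2.0), ("sensorB", 10.0), ("sensorA", 4.0)]
print(list(running_average_by_key(events)))
# [('sensorA', 2.0), ('sensorB', 10.0), ('sensorA', 3.0)]
```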

  7. StreamQRE: Modular Specification and Efficient Evaluation of Quantitative Queries over Streaming Data*

    PubMed Central

    Mamouras, Konstantinos; Raghothaman, Mukund; Alur, Rajeev; Ives, Zachary G.; Khanna, Sanjeev

    2017-01-01

    Real-time decision making in emerging IoT applications typically relies on computing quantitative summaries of large data streams in an efficient and incremental manner. To simplify the task of programming the desired logic, we propose StreamQRE, which provides natural and high-level constructs for processing streaming data. Our language has a novel integration of linguistic constructs from two distinct programming paradigms: streaming extensions of relational query languages and quantitative extensions of regular expressions. The former allows the programmer to employ relational constructs to partition the input data by keys and to integrate data streams from different sources, while the latter can be used to exploit the logical hierarchy in the input stream for modular specifications. We first present the core language with a small set of combinators, formal semantics, and a decidable type system. We then show how to express a number of common patterns with illustrative examples. Our compilation algorithm translates the high-level query into a streaming algorithm with precise complexity bounds on per-item processing time and total memory footprint. We also show how to integrate approximation algorithms into our framework. We report on an implementation in Java, and evaluate it with respect to existing high-performance engines for processing streaming data. Our experimental evaluation shows that (1) StreamQRE allows more natural and succinct specification of queries compared to existing frameworks, (2) the throughput of our implementation is higher than comparable systems (for example, two-to-four times greater than RxJava), and (3) the approximation algorithms supported by our implementation can lead to substantial memory savings. PMID:29151821

  8. Morpho-syntactic processing of Arabic plurals after aphasia: dissecting lexical meaning from morpho-syntax within word boundaries.

    PubMed

    Khwaileh, Tariq; Body, Richard; Herbert, Ruth

    2015-01-01

Within the domain of inflectional morpho-syntax, differential processing of regular and irregular forms has been found in healthy speakers and in aphasia. One view assumes that irregular forms are retrieved as full entities, while regular forms are compiled on-line. An alternative view holds that a single mechanism oversees regular and irregular forms. Arabic offers an opportunity to study this phenomenon, as Arabic nouns contain a consonantal root, delivering lexical meaning, and a vocalic pattern, delivering syntactic information, such as gender and number. The aim of this study is to investigate morpho-syntactic processing of regular (sound) and irregular (broken) Arabic plurals in patients with morpho-syntactic impairment. Three participants with acquired agrammatic aphasia produced plural forms in a picture-naming task. We measured overall response accuracy, then analysed lexical errors and morpho-syntactic errors, separately. Error analysis revealed different patterns of morpho-syntactic errors depending on the type of pluralization (sound vs broken). Omissions formed the vast majority of errors in sound plurals, while substitution was the only error mechanism that occurred in broken plurals. The dissociation was statistically significant for retrieval of morpho-syntactic information (vocalic pattern) but not for lexical meaning (consonantal root), suggesting that the participants' selective impairment was an effect of the morpho-syntax of plurals. These results suggest that irregular plural forms are stored, while regular forms are derived. The current findings support the findings from other languages and provide a new analysis technique for data from languages with non-concatenative morpho-syntax.

  9. Polarimetric image reconstruction algorithms

    NASA Astrophysics Data System (ADS)

    Valenzuela, John R.

    In the field of imaging polarimetry Stokes parameters are sought and must be inferred from noisy and blurred intensity measurements. Using a penalized-likelihood estimation framework we investigate reconstruction quality when estimating intensity images and then transforming to Stokes parameters (traditional estimator), and when estimating Stokes parameters directly (Stokes estimator). We define our cost function for reconstruction by a weighted least squares data fit term and a regularization penalty. It is shown that under quadratic regularization, the traditional and Stokes estimators can be made equal by appropriate choice of regularization parameters. It is empirically shown that, when using edge preserving regularization, estimating the Stokes parameters directly leads to lower RMS error in reconstruction. Also, the addition of a cross channel regularization term further lowers the RMS error for both methods especially in the case of low SNR. The technique of phase diversity has been used in traditional incoherent imaging systems to jointly estimate an object and optical system aberrations. We extend the technique of phase diversity to polarimetric imaging systems. Specifically, we describe penalized-likelihood methods for jointly estimating Stokes images and optical system aberrations from measurements that contain phase diversity. Jointly estimating Stokes images and optical system aberrations involves a large parameter space. A closed-form expression for the estimate of the Stokes images in terms of the aberration parameters is derived and used in a formulation that reduces the dimensionality of the search space to the number of aberration parameters only. We compare the performance of the joint estimator under both quadratic and edge-preserving regularization. The joint estimator with edge-preserving regularization yields higher fidelity polarization estimates than with quadratic regularization. Under quadratic regularization, using the reduced-parameter search strategy, accurate aberration estimates can be obtained without recourse to regularization "tuning". Phase-diverse wavefront sensing is emerging as a viable candidate wavefront sensor for adaptive-optics systems. In a quadratically penalized weighted least squares estimation framework a closed form expression for the object being imaged in terms of the aberrations in the system is available. This expression offers a dramatic reduction of the dimensionality of the estimation problem and thus is of great interest for practical applications. We have derived an expression for an approximate joint covariance matrix for object and aberrations in the phase diversity context. Our expression for the approximate joint covariance is compared with the "known-object" Cramer-Rao lower bound that is typically used for system parameter optimization. Estimates of the optimal amount of defocus in a phase-diverse wavefront sensor derived from the joint-covariance matrix, the known-object Cramer-Rao bound, and Monte Carlo simulations are compared for an extended scene and a point object. It is found that our variance approximation, that incorporates the uncertainty of the object, leads to an improvement in predicting the optimal amount of defocus to use in a phase-diverse wavefront sensor.

  10. Dimensional regularization in position space and a Forest Formula for Epstein-Glaser renormalization

    NASA Astrophysics Data System (ADS)

    Dütsch, Michael; Fredenhagen, Klaus; Keller, Kai Johannes; Rejzner, Katarzyna

    2014-12-01

    We reformulate dimensional regularization as a regularization method in position space and show that it can be used to give a closed expression for the renormalized time-ordered products as solutions to the induction scheme of Epstein-Glaser. This closed expression, which we call the Epstein-Glaser Forest Formula, is analogous to Zimmermann's Forest Formula for BPH renormalization. For scalar fields, the resulting renormalization method is always applicable, we compute several examples. We also analyze the Hopf algebraic aspects of the combinatorics. Our starting point is the Main Theorem of Renormalization of Stora and Popineau and the arising renormalization group as originally defined by Stückelberg and Petermann.

  11. Evaluation of uncertainty for regularized deconvolution: A case study in hydrophone measurements.

    PubMed

    Eichstädt, S; Wilkens, V

    2017-06-01

    An estimation of the measurand in dynamic metrology usually requires a deconvolution based on a dynamic calibration of the measuring system. Since deconvolution is, mathematically speaking, an ill-posed inverse problem, some kind of regularization is required to render the problem stable and obtain usable results. Many approaches to regularized deconvolution exist in the literature, but the corresponding evaluation of measurement uncertainties is, in general, an unsolved issue. In particular, the uncertainty contribution of the regularization itself is a topic of great importance, because it has a significant impact on the estimation result. Here, a versatile approach is proposed to express prior knowledge about the measurand based on a flexible, low-dimensional modeling of an upper bound on the magnitude spectrum of the measurand. This upper bound allows the derivation of an uncertainty associated with the regularization method in line with the guidelines in metrology. As a case study for the proposed method, hydrophone measurements in medical ultrasound with an acoustic working frequency of up to 7.5 MHz are considered, but the approach is applicable for all kinds of estimation methods in dynamic metrology, where regularization is required and which can be expressed as a multiplication in the frequency domain.
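
    As a point of reference for what regularized deconvolution means operationally, the following Python sketch implements a basic Tikhonov-damped frequency-domain deconvolution. This is a generic textbook scheme with an invented synthetic signal, not the specific regularization or uncertainty-evaluation method of the paper; the regularization parameter lam plays the role of the choice whose uncertainty contribution the paper discusses.

```python
import numpy as np

def tikhonov_deconvolve(measured, impulse_response, lam=1e-2):
    """Frequency-domain deconvolution with Tikhonov damping:
    X_hat(f) = conj(H(f)) * Y(f) / (|H(f)|^2 + lam).
    lam trades noise amplification against bias."""
    n = len(measured)
    H = np.fft.rfft(impulse_response, n)
    Y = np.fft.rfft(measured, n)
    X = np.conj(H) * Y / (np.abs(H) ** 2 + lam)
    return np.fft.irfft(X, n)

# Synthetic example: a pulse blurred by an exponential system response plus noise.
rng = np.random.default_rng(0)
t = np.arange(256)
x_true = np.exp(-0.5 * ((t - 60) / 5.0) ** 2)
h = np.exp(-t / 10.0)
h /= h.sum()
y = np.convolve(x_true, h)[:256] + 0.01 * rng.standard_normal(256)
x_est = tikhonov_deconvolve(y, h, lam=1e-2)
```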

  12. Object-oriented biomedical system modelling--the language.

    PubMed

    Hakman, M; Groth, T

    1999-11-01

The paper describes a new object-oriented biomedical continuous system modelling language (OOBSML). It is fully object-oriented and supports model inheritance, encapsulation, and model component instantiation and behaviour polymorphism. Besides the traditional differential and algebraic equation expressions, the language also includes formal expressions for documenting models and defining model quantity types and quantity units. It supports explicit definition of model input-, output- and state quantities, model components and component connections. The OOBSML model compiler produces self-contained, independent, executable model components that can be instantiated and used within other OOBSML models and/or stored within model and model component libraries. In this way complex models can be structured as multilevel, multi-component model hierarchies. Technically the model components produced by the OOBSML compiler are executable computer code objects based on distributed object and object request broker technology. This paper includes both the language tutorial and the formal language syntax and semantic description.

  13. Trawler Icing: A Compilation of Work Done at N.R.C. (Givrage des Chalutiers: Compilation des Recherches Effectuees au C.N.R.).

    DTIC Science & Technology

    1980-12-01

stability of fishing vessels of the Inter-Governmental Maritime Consultative Organization (IMCO), it has been found convenient to express the degree of...Stability of Fishing Vessels. Intergovernmental Maritime Consultative Organization, London, 1969. Stallabrass, J.R. Icing of Fishing Vessels: An...

  14. Discovering mutated driver genes through a robust and sparse co-regularized matrix factorization framework with prior information from mRNA expression patterns and interaction network.

    PubMed

    Xi, Jianing; Wang, Minghui; Li, Ao

    2018-06-05

Discovery of mutated driver genes is one of the primary objectives of studying tumorigenesis. To discover relatively infrequently mutated driver genes from somatic mutation data, many existing methods incorporate an interaction network as prior information. However, the prior information in mRNA expression patterns is not exploited by these existing network-based methods, even though it is also highly informative of cancer progression. To incorporate prior information from both the interaction network and mRNA expression, we propose a robust and sparse co-regularized nonnegative matrix factorization to discover driver genes from mutation data. Furthermore, our framework also applies Frobenius norm regularization to overcome the overfitting issue. A sparsity-inducing penalty is employed to obtain sparse scores in the gene representations, of which the top-scored genes are selected as driver candidates. Evaluation experiments with known benchmark genes indicate that the performance of our method benefits from the two types of prior information. Our method also outperforms the existing network-based methods and detects some driver genes that are not predicted by the competing methods. In summary, our proposed method can improve driver gene discovery by effectively incorporating prior information from the interaction network and mRNA expression patterns into a robust and sparse co-regularized matrix factorization framework.
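
    For orientation, a bare-bones regularized NMF looks like the following Python sketch: multiplicative updates with a Frobenius (ridge) penalty on both factors. It deliberately omits the robustness, sparsity, and network co-regularization terms of the method described above, and the toy matrix and crude gene-scoring rule are invented for illustration.

```python
import numpy as np

def nmf_frobenius_regularized(X, rank, alpha=0.1, iters=200, seed=0):
    """Multiplicative-update NMF minimizing
    ||X - WH||_F^2 + alpha * (||W||_F^2 + ||H||_F^2).
    Only the regularization backbone, not the full co-regularized framework."""
    rng = np.random.default_rng(seed)
    n, m = X.shape
    W = rng.random((n, rank))
    H = rng.random((rank, m))
    eps = 1e-12
    for _ in range(iters):
        H *= (W.T @ X) / (W.T @ W @ H + alpha * H + eps)
        W *= (X @ H.T) / (W @ H @ H.T + alpha * W + eps)
    return W, H

# Toy mutation-like matrix (samples x genes); high-scoring columns of H
# would play the role of candidate driver-gene scores.
X = np.random.default_rng(1).random((30, 50))
W, H = nmf_frobenius_regularized(X, rank=5)
scores = H.sum(axis=0)                 # a crude per-gene score, for illustration only
top_genes = np.argsort(scores)[::-1][:10]
```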

  15. Transcriptome interrogation of human myometrium identifies differentially expressed sense-antisense pairs of protein-coding and long non-coding RNA genes in spontaneous labor at term

    PubMed Central

    Romero, Roberto; Tarca, Adi; Chaemsaithong, Piya; Miranda, Jezid; Chaiworapongsa, Tinnakorn; Jia, Hui; Hassan, Sonia S.; Kalita, Cynthia A.; Cai, Juan; Yeo, Lami; Lipovich, Leonard

    2014-01-01

    Objective The mechanisms responsible for normal and abnormal parturition are poorly understood. Myometrial activation leading to regular uterine contractions is a key component of labor. Dysfunctional labor (arrest of dilatation and/or descent) is a leading indication for cesarean delivery. Compelling evidence suggests that most of these disorders are functional in nature, and not the result of cephalopelvic disproportion. The methodology and the datasets afforded by the post-genomic era provide novel opportunities to understand and target gene functions in these disorders. In 2012, the ENCODE Consortium elucidated the extraordinary abundance and functional complexity of long non-coding RNA genes in the human genome. The purpose of the study was to identify differentially expressed long non-coding RNA genes in human myometrium in women in spontaneous labor at term. Materials and Methods Myometrium was obtained from women undergoing cesarean deliveries who were not in labor (n=19) and women in spontaneous labor at term (n=20). RNA was extracted and profiled using an Illumina® microarray platform. The analysis of the protein coding genes from this study has been previously reported. Here, we have used computational approaches to bound the extent of long non-coding RNA representation on this platform, and to identify co-differentially expressed and correlated pairs of long non-coding RNA genes and protein-coding genes sharing the same genomic loci. Results Upon considering more than 18,498 distinct lncRNA genes compiled nonredundantly from public experimental data sources, and interrogating 2,634 that matched Illumina microarray probes, we identified co-differential expression and correlation at two genomic loci that contain coding-lncRNA gene pairs: SOCS2-AK054607 and LMCD1-NR_024065 in women in spontaneous labor at term. This co-differential expression and correlation was validated by qRT-PCR, an independent experimental method. Intriguingly, one of the two lncRNA genes differentially expressed in term labor had a key genomic structure element, a splice site that lacked evolutionary conservation beyond primates. Conclusions We provide for the first time evidence for coordinated differential expression and correlation of cis-encoded antisense lncRNAs and protein-coding genes with known, as well as novel roles in pregnancy in the myometrium of women in spontaneous labor at term. PMID:24168098

  16. Blooming reduces the antioxidant capacity of dark chocolate in rats without lowering its capacity to improve lipid profiles.

    PubMed

    Shadwell, Naomi; Villalobos, Fatima; Kern, Mark; Hong, Mee Young

    2013-05-01

    Dark chocolate contains high levels of antioxidants which are linked to a reduced risk of cardiovascular disease. Chocolate blooming occurs after exposure to high temperatures. Although bloomed chocolate is safe for human consumption, it is not known whether or not the biological function of bloomed chocolate is affected. We hypothesized that bloomed chocolate would reduce the antioxidant potential and lipid-lowering properties of chocolate through altered expression of related genes. Thirty Sprague-Dawley rats were divided into 3 groups and fed either the control (CON), regular dark chocolate (RDC), or bloomed dark chocolate (BDC) diet. After 3 weeks, serum lipid levels and antioxidant capacity were measured. Hepatic expression of key genes was determined by real time polymerase chain reaction (PCR). Sensory characteristics of bloomed versus regular chocolate were assessed in 28 semi-trained panelists. Rats fed RDC exhibited greater serum antioxidant capacities compared to the CON (P < .05). Antioxidant levels of BDC were not different from RDC or CON. Both RDC and BDC lowered TG compared to CON (P < .05). The rats fed RDC had higher high-density lipoprotein levels compared to the CON (P < .05). In rats given RDC, fatty acid synthase gene expression was down-regulated and low-density lipoprotein receptor transcription was up-regulated (P < .05). Sensory panelists preferred the appearance and surface smoothness of the regular chocolate compared to bloomed chocolate (P < .001). Although blooming blunted the robust antioxidant response produced by regular dark chocolate, these results suggest that bloomed dark chocolate yields similarly beneficial effects on most blood lipid parameters or biomarkers. However, regular dark chocolate may be more beneficial for the improvement of antioxidant status and modulation of gene expression involved in lipid metabolism and promoted greater sensory ratings. Copyright © 2013 Elsevier Inc. All rights reserved.

  17. The Large Sky Area Multi-object Fiber Spectroscopic Telescope Quasar Survey: Quasar Properties from the First Data Release

    NASA Astrophysics Data System (ADS)

    Ai, Y. L.; Wu, Xue-Bing; Yang, Jinyi; Yang, Qian; Wang, Feige; Guo, Rui; Zuo, Wenwen; Dong, Xiaoyi; Zhang, Y.-X.; Yuan, H.-L.; Song, Y.-H.; Wang, Jianguo; Dong, Xiaobo; Yang, M.; -Wu, H.; Shen, S.-Y.; Shi, J.-R.; He, B.-L.; Lei, Y.-J.; Li, Y.-B.; Luo, A.-L.; Zhao, Y.-H.; Zhang, H.-T.

    2016-02-01

    We present preliminary results of the quasar survey in the Large Sky Area Multi-Object Fiber Spectroscopic Telescope (LAMOST) first data release (DR1), which includes the pilot survey and the first year of the regular survey. There are 3921 quasars reliably identified, among which 1180 are new quasars discovered in the survey. These quasars are at low to median redshifts, with a highest z of 4.83. We compile emission line measurements around the Hα, Hβ, Mg II, and C IV regions for the new quasars. The continuum luminosities are inferred from SDSS photometric data with model fitting, as the spectra in DR1 are non-flux-calibrated. We also compile the virial black hole mass estimates, with flags indicating the selection methods, and broad absorption line quasars. The catalog and spectra for these quasars are also available. Of the 3921 quasars, 28% are independently selected with optical-infrared colors, indicating that the method is quite promising for the completeness of the quasar survey. LAMOST DR1 and the ongoing quasar survey will provide valuable data for studies of quasars.

  18. Learning SAS’s Perl Regular Expression Matching the Easy Way: By Doing

    DTIC Science & Technology

    2015-01-12

Paul Genovesi...The regex_learning_tool allows both beginner and expert to efficiently practice PRX matching by selecting and processing only the match records that the user is interested in...perl regular expression and/or source string.

  19. Florida Journal of Communication Disorders, 1997.

    ERIC Educational Resources Information Center

    Langhans, Joseph J., Ed.

    1997-01-01

    This annual volume is a compilation of traditional articles, poster publications and clinical reports addressing speech and language impairments and intervention. Featured articles include: (1) "Pantomime Recognition and Pantomime Expression in Persons with Aphasia" (Joseph J. Langhans); (2) "Bilingual Classroom Discourse Skills: An…

  20. HypoxiaDB: a database of hypoxia-regulated proteins

    PubMed Central

    Khurana, Pankaj; Sugadev, Ragumani; Jain, Jaspreet; Singh, Shashi Bala

    2013-01-01

    There has been intense interest in the cellular response to hypoxia, and a large number of differentially expressed proteins have been identified through various high-throughput experiments. These valuable data are scattered, and there have been no systematic attempts to document the various proteins regulated by hypoxia. Compilation, curation and annotation of these data are important in deciphering their role in hypoxia and hypoxia-related disorders. Therefore, we have compiled HypoxiaDB, a database of hypoxia-regulated proteins. It is a comprehensive, manually-curated, non-redundant catalog of proteins whose expressions are shown experimentally to be altered at different levels and durations of hypoxia. The database currently contains 72 000 manually curated entries taken on 3500 proteins extracted from 73 peer-reviewed publications selected from PubMed. HypoxiaDB is distinctive from other generalized databases: (i) it compiles tissue-specific protein expression changes under different levels and duration of hypoxia. Also, it provides manually curated literature references to support the inclusion of the protein in the database and establish its association with hypoxia. (ii) For each protein, HypoxiaDB integrates data on gene ontology, KEGG (Kyoto Encyclopedia of Genes and Genomes) pathway, protein–protein interactions, protein family (Pfam), OMIM (Online Mendelian Inheritance in Man), PDB (Protein Data Bank) structures and homology to other sequenced genomes. (iii) It also provides pre-compiled information on hypoxia-proteins, which otherwise requires tedious computational analysis. This includes information like chromosomal location, identifiers like Entrez, HGNC, Unigene, Uniprot, Ensembl, Vega, GI numbers and Genbank accession numbers associated with the protein. These are further cross-linked to respective public databases augmenting HypoxiaDB to the external repositories. (iv) In addition, HypoxiaDB provides an online sequence-similarity search tool for users to compare their protein sequences with HypoxiaDB protein database. We hope that HypoxiaDB will enrich our knowledge about hypoxia-related biology and eventually will lead to the development of novel hypothesis and advancements in diagnostic and therapeutic activities. HypoxiaDB is freely accessible for academic and non-profit users via http://www.hypoxiadb.com. Database URL: http://www.hypoxiadb.com PMID:24178989

  1. Modular Expression Language for Ordinary Differential Equation Editing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Blake, Robert C.

MELODEE is a system for describing systems of initial value problem ordinary differential equations, and a compiler for the language that produces optimized code to integrate the differential equations. Features include rational polynomial approximation for expensive functions and automatic differentiation for symbolic Jacobians.
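
    For readers unfamiliar with the target problem class, the snippet below shows the kind of initial value ODE system such a description would ultimately compile into, written directly against SciPy. This is not MELODEE syntax, and the two-state model and parameter values are made up for illustration.

```python
import numpy as np
from scipy.integrate import solve_ivp

# A simple two-state initial value problem (hypothetical rate constants),
# representative of the right-hand sides an ODE-DSL compiler generates.
def rhs(t, y, k1=0.5, k2=0.1):
    a, b = y
    return [-k1 * a, k1 * a - k2 * b]

sol = solve_ivp(rhs, t_span=(0.0, 20.0), y0=[1.0, 0.0], max_step=0.1)
print(sol.y[:, -1])  # state at t = 20
```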

  2. Notions of Physical Laws in Childhood.

    ERIC Educational Resources Information Center

    Von Pfuhl Rodrigues, Dulce Madalena Autran

    1980-01-01

    Presented is an experiment investigating children's awareness of regularities in physical phenomena and their capacity for expressing these regularities. Hypothesized and confirmed is that children can use statements with the form and purpose of a physical law. Cartoons related to Archimedes' principle (and connected gravitation and fluid…

  3. Violence against women migrant workers: issues, data and partial solutions.

    PubMed

    Shah, N M; Menon, I

    1997-01-01

    "Despite the creation of specific norms, procedures, and institutions to protect women migrant workers, serious gaps remain. Statistics for measuring violence are not compiled comprehensively or regularly. Two occupations that increase the risk of violence are domestic service and entertainment-related services. Migration through illegal channels and trafficking also increase the risk. This article suggests a list of indicators to measure violence of three major types: (1) economic, (2) social/psychological, and (3) physical/sexual. Evidence from several countries to document instances of violence is reviewed. Major policy issues for the sending and receiving countries are outlined, and some recommendations for addressing such violations are made." excerpt

  4. Quantitative characterization of the small-scale fracture patterns on the plains of Venus

    NASA Technical Reports Server (NTRS)

    Sammis, Charles G.; Bowman, David D.

    1995-01-01

    The objectives of this research project were to (1) compile a comprehensive database of the occurrence of regularly spaced kilometer scale lineations on the volcanic plains of Venus in an effort to verify the effectiveness of the shear-lag model developed by Banerdt and Sammis (1992), and (2) develop a model for the formation of irregular kilometer scale lineations such as typified in the gridded plains region of Guinevere Planitia. Attached to this report is the paper 'A Tectonic Model for the Formation of the Gridded Plains on Guinevere Planitia, Venus, and Implications for the Elastic Thickness of the Lithosphere'.

  5. Skylab sleep monitoring experiment (experiment M133)

    NASA Technical Reports Server (NTRS)

    Frost, J. D., Jr.

    1975-01-01

A summary of the conceptual design of the Skylab sleep monitoring experiment and a comprehensive compilation of the data-analysis results from the three Skylab missions are presented. One astronaut was studied per flight; electroencephalographic, electro-oculographic, and head-motion signals were acquired during sleep by use of an elastic recording cap containing sponge electrodes and an attached miniature preamplifier/accelerometer unit. A control-panel assembly, mounted in the sleep compartment, tested electrodes, preserved analog signals, and automatically analyzed data in real time (providing a telemetered indication of sleep stage). Results indicate that men are able to obtain adequate sleep in regularly scheduled eight-hour rest periods during extended space missions.

  6. Religious Education Forum: Legitimizing Your Value List.

    ERIC Educational Resources Information Center

    McBride, Alfred

    1979-01-01

    Addressing the problem that educators have in compiling a legitimate list of values to teach, the author examines the Bible as a source of value legitimacy and details the values expressed in the Hebrew and Christian covenants, the Ten Commandments, and the six beatitudes. (SJL)

  7. Effective Vectorization with OpenMP 4.5

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Huber, Joseph N.; Hernandez, Oscar R.; Lopez, Matthew Graham

This paper describes how the Single Instruction Multiple Data (SIMD) model and its extensions in OpenMP work, and how these are implemented in different compilers. Modern processors are highly parallel computational machines which often include multiple processors capable of executing several instructions in parallel. Understanding SIMD and executing instructions in parallel allows the processor to achieve higher performance without increasing the power required to run it. SIMD instructions can significantly reduce the runtime of code by executing a single operation on large groups of data. The SIMD model is so integral to the processor's potential performance that, if SIMD is not utilized, less than half of the processor is ever actually used. Unfortunately, using SIMD instructions is a challenge in higher level languages because most programming languages do not have a way to describe them. Most compilers are capable of vectorizing code by using the SIMD instructions, but there are many code features important for SIMD vectorization that the compiler cannot determine at compile time. OpenMP attempts to solve this by extending the C++/C and Fortran programming languages with compiler directives that express SIMD parallelism. OpenMP is used to pass hints to the compiler about the code to be executed in SIMD. This is a key resource for making optimized code, but it does not change whether or not the code can use SIMD operations. However, in many cases critical functions are limited by a poor understanding of how SIMD instructions are actually implemented, as SIMD can be implemented through vector instructions or simultaneous multi-threading (SMT). We have found that it is often the case that code cannot be vectorized, or is vectorized poorly, because the programmer does not have sufficient knowledge of how SIMD instructions work.

  8. Regularization strategies for hyperplane classifiers: application to cancer classification with gene expression data.

    PubMed

    Andries, Erik; Hagstrom, Thomas; Atlas, Susan R; Willman, Cheryl

    2007-02-01

    Linear discrimination, from the point of view of numerical linear algebra, can be treated as solving an ill-posed system of linear equations. In order to generate a solution that is robust in the presence of noise, these problems require regularization. Here, we examine the ill-posedness involved in the linear discrimination of cancer gene expression data with respect to outcome and tumor subclasses. We show that a filter factor representation, based upon Singular Value Decomposition, yields insight into the numerical ill-posedness of the hyperplane-based separation when applied to gene expression data. We also show that this representation yields useful diagnostic tools for guiding the selection of classifier parameters, thus leading to improved performance.
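
    The filter-factor view mentioned in the abstract can be sketched in a few lines: ridge (Tikhonov) regularization expressed through the SVD, where directions with small singular values, the source of the ill-posedness, are damped. The Python example below is a generic illustration with synthetic data, not the paper's diagnostic framework or its parameter-selection tools.

```python
import numpy as np

def ridge_via_svd(X, y, lam):
    """Solve min ||Xw - y||^2 + lam * ||w||^2 using SVD filter factors.

    With X = U S V^T, the solution is w = V diag(f_i / s_i) U^T y, where
    f_i = s_i^2 / (s_i^2 + lam) are the filter factors damping small singular values.
    """
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    f = s**2 / (s**2 + lam)            # filter factors
    return Vt.T @ ((f / s) * (U.T @ y))

# Toy "gene expression" setting: far more features than samples.
rng = np.random.default_rng(0)
X = rng.standard_normal((40, 200))
y = np.sign(rng.standard_normal(40))
w = ridge_via_svd(X, y, lam=10.0)
pred = np.sign(X @ w)                  # hyperplane classifier on the training set
```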

  9. FPGA-accelerated algorithm for the regular expression matching system

    NASA Astrophysics Data System (ADS)

    Russek, P.; Wiatr, K.

    2015-01-01

    This article describes an algorithm to support a regular expressions matching system. The goal was to achieve an attractive performance system with low energy consumption. The basic idea of the algorithm comes from a concept of the Bloom filter. It starts from the extraction of static sub-strings for strings of regular expressions. The algorithm is devised to gain from its decomposition into parts which are intended to be executed by custom hardware and the central processing unit (CPU). The pipelined custom processor architecture is proposed and a software algorithm explained accordingly. The software part of the algorithm was coded in C and runs on a processor from the ARM family. The hardware architecture was described in VHDL and implemented in field programmable gate array (FPGA). The performance results and required resources of the above experiments are given. An example of target application for the presented solution is computer and network security systems. The idea was tested on nearly 100,000 body-based viruses from the ClamAV virus database. The solution is intended for the emerging technology of clusters of low-energy computing nodes.
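
    The software half of the decomposition can be sketched as follows: static substrings extracted from the regular expressions act as a cheap prefilter (the role played by the Bloom filter in hardware; a plain Python set stands in for it here), and the full regular-expression engine only runs on inputs that pass it. The signature patterns and fragments below are invented for illustration and are not ClamAV rules.

```python
import re

# Signatures written as regular expressions (illustrative only).
signatures = [r"evil\d{2}payload", r"drop\s+table", r"badmacro[0-9a-f]+"]

# "Static substrings": literal fragments extracted from each pattern.
# A hardware Bloom filter would hold these; a set stands in for it here.
static_fragments = {"evil", "payload", "drop", "table", "badmacro"}
compiled = [re.compile(p) for p in signatures]

def scan(line):
    # Cheap prefilter: skip the expensive regex pass unless some fragment occurs.
    if not any(frag in line for frag in static_fragments):
        return None
    for rx in compiled:
        if rx.search(line):
            return rx.pattern
    return None

print(scan("nothing suspicious here"))       # None, the regexes never run
print(scan("payload: evil42payload found"))  # evil\d{2}payload
```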

  10. Quadratic Expressions by Means of "Summing All the Matchsticks"

    ERIC Educational Resources Information Center

    Gierdien, M. Faaiz

    2012-01-01

    This note presents demonstrations of quadratic expressions that come about when particular problems are posed with respect to matchsticks that form regular triangles, squares, pentagons and so on. Usually when such "matchstick" problems are used as ways to foster algebraic thinking, the expressions for the number of matchstick quantities are…
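
    As a concrete instance of the kind of count the note describes (this particular configuration is our own illustration, not necessarily one of the note's problems), an n-by-n grid of unit squares needs 2n(n+1) matchsticks, a quadratic expression the short script below verifies by enumerating every segment.

```python
def matchsticks_square_grid(n):
    """Count matchsticks in an n-by-n grid of unit squares by listing
    every unit segment explicitly."""
    segments = set()
    for x in range(n + 1):
        for y in range(n + 1):
            if x < n:                         # horizontal stick to the right
                segments.add(((x, y), (x + 1, y)))
            if y < n:                         # vertical stick upward
                segments.add(((x, y), (x, y + 1)))
    return len(segments)

# The count grows quadratically: 2n(n + 1) matchsticks for n = 1, 2, 3, ...
for n in range(1, 6):
    assert matchsticks_square_grid(n) == 2 * n * (n + 1)
print([matchsticks_square_grid(n) for n in range(1, 6)])  # [4, 12, 24, 40, 60]
```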

  11. Domain Specific Language Support for Exascale

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sadayappan, Ponnuswamy

Domain-Specific Languages (DSLs) offer an attractive path to Exascale software since they provide expressive power through appropriate abstractions and enable domain-specific optimizations. But the advantages of a DSL compete with the difficulties of implementing a DSL, even for a narrowly defined domain. The DTEC project addresses how a variety of DSLs can be easily implemented to leverage existing compiler analysis and transformation capabilities within the ROSE open source compiler as part of a research program focusing on Exascale challenges. The OSU contributions to the DTEC project are in the area of code generation from high-level DSL descriptions, as well as verification of the automatically-generated code.

  12. Hypercortisolism as a Potential Concern for Submariners

    DTIC Science & Technology

    2010-12-01

adverse physiological and psychological states such as glucose intolerance, dyslipidemia, obesity, hypertension (all components of metabolic syndrome...induce hyperglycemia, dyslipidemia, increased amino acid turnover, acidosis, loss of lean body mass, alterations in expression of metabolic genes...glucose intolerance, dyslipidemia, and stress-mediated hypertension are positively affected by regular exercise (93). The obvious benefits that regular

  13. Generalized Second-Order Partial Derivatives of 1/r

    ERIC Educational Resources Information Center

    Hnizdo, V.

    2011-01-01

    The generalized second-order partial derivatives of 1/r, where r is the radial distance in three dimensions (3D), are obtained using a result of the potential theory of classical analysis. Some non-spherical-regularization alternatives to the standard spherical-regularization expression for the derivatives are derived. The utility of a…

  14. Reflections of Life Through Books.

    ERIC Educational Resources Information Center

    Porter, Jane

    The anthology by Jesse Perry, "Reading Ladders for Human Relations," constructed of a blend of best literary works, was compiled based on the conviction that reading selected books would increase the social sensitivity of young people and play a unique role in fostering better human relationships. Its main purposes are expressed in the section…

  15. High-Performance Design Patterns for Modern Fortran

    DOE PAGES

    Haveraaen, Magne; Morris, Karla; Rouson, Damian; ...

    2015-01-01

This paper presents ideas for using coordinate-free numerics in modern Fortran to achieve code flexibility in the partial differential equation (PDE) domain. We also show how Fortran, over the last few decades, has changed to become a language well-suited for state-of-the-art software development. Fortran’s new coarray distributed data structure, the language’s class mechanism, and its side-effect-free, pure procedure capability provide the scaffolding on which we implement HPC software. These features empower compilers to organize parallel computations with efficient communication. We present some programming patterns that support asynchronous evaluation of expressions comprised of parallel operations on distributed data. We implemented these patterns using coarrays and the message passing interface (MPI). We compared the codes’ complexity and performance. The MPI code is much more complex and depends on external libraries. The MPI code on Cray hardware using the Cray compiler is 1.5–2 times faster than the coarray code on the same hardware. The Intel compiler implements coarrays atop Intel’s MPI library with the result apparently being 2–2.5 times slower than manually coded MPI despite exhibiting nearly linear scaling efficiency. As compilers mature and further improvements to coarrays come in Fortran 2015, we expect this performance gap to narrow.

  16. [Present situation of hepatitis B in Chile].

    PubMed

    Pereira S, Ana; Valenzuela B, María Teresa; Mora, Judith; Vera, Lilian

    2008-06-01

Hepatitis B virus infection generates carriers, and 8% will evolve to a chronic phase. The aim was to compile studies on hepatitis B in Chile and other sources of information to estimate the impact of this disease in our country. Published and unpublished evidence about the infection, in the general population and risk groups in our country, was compiled and reviewed critically. Informal interviews with experts, revision of the mandatory notification book of the Ministry of Health and collection of data from laboratories that study hepatitis B virus, were also carried out. The seroprevalence of chronic carriers in blood donors is nearly 0.3%. Among risk groups such as health care personnel, the figure is 0.7%, among homosexuals 29%, among HIV positive patients 30%, among sexual workers 2% and among children with chronic hemodialysis, 9%. The prevalence rate according to notified cases in 2004 was 1.8 per 100,000 inhabitants. Detection of viral hepatitis B surface antigen in laboratories occurs in 0.2% of donors and 1.3% of non-donors. The seroprevalence of hepatitis B virus, the lack of notification, and the introduction of hepatitis B vaccine to our Regular Program of Immunizations, are arguments to develop in Chile a hepatitis B and C surveillance system.

  17. Continuous integration and quality control for scientific software

    NASA Astrophysics Data System (ADS)

    Neidhardt, A.; Ettl, M.; Brisken, W.; Dassing, R.

    2013-08-01

Modern software has to be stable, portable, fast and reliable. This is becoming more and more important for scientific software as well. But this requires a sophisticated way to inspect, check and evaluate the quality of source code with a suitable, automated infrastructure. A centralized server with a software repository and a version control system is one essential part, to manage the code base and to control the different development versions. While each project can be compiled separately, the whole code base can also be compiled with one central “Makefile”. This is used to create automated, nightly builds. Additionally all sources are inspected automatically with static code analysis and inspection tools, which check for well-known error situations, memory and resource leaks, performance issues, or style issues. In combination with an automatic documentation generator it is possible to create the developer documentation directly from the code and the inline comments. All reports and generated information are presented as HTML pages on a Web server. Because this environment increased the stability and quality of the software of the Geodetic Observatory Wettzell tremendously, it is now also available for scientific communities. One regular customer is already the developer group of the DiFX software correlator project.

  18. THE LARGE SKY AREA MULTI-OBJECT FIBER SPECTROSCOPIC TELESCOPE QUASAR SURVEY: QUASAR PROPERTIES FROM THE FIRST DATA RELEASE

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ai, Y. L.; Wu, Xue-Bing; Yang, Jinyi

    2016-02-15

We present preliminary results of the quasar survey in the Large Sky Area Multi-Object Fiber Spectroscopic Telescope (LAMOST) first data release (DR1), which includes the pilot survey and the first year of the regular survey. There are 3921 quasars reliably identified, among which 1180 are new quasars discovered in the survey. These quasars are at low to median redshifts, with a highest z of 4.83. We compile emission line measurements around the Hα, Hβ, Mg ii, and C iv regions for the new quasars. The continuum luminosities are inferred from SDSS photometric data with model fitting, as the spectra in DR1 are non-flux-calibrated. We also compile the virial black hole mass estimates, with flags indicating the selection methods, and broad absorption line quasars. The catalog and spectra for these quasars are also available. Of the 3921 quasars, 28% are independently selected with optical–infrared colors, indicating that the method is quite promising for the completeness of the quasar survey. LAMOST DR1 and the ongoing quasar survey will provide valuable data for studies of quasars.

  19. The Colorado River and its deposits downstream from Grand Canyon in Arizona, California, and Nevada

    USGS Publications Warehouse

    Crow, Ryan S.; Block, Debra L.; Felger, Tracey J.; House, P. Kyle; Pearthree, Philip A.; Gootee, Brian F.; Youberg, Ann M.; Howard, Keith A.; Beard, L. Sue

    2018-02-05

    Understanding the evolution of the Colorado River system has direct implications for (1) the processes and timing of continental-scale river system integration, (2) the formation of iconic landscapes like those in and around Grand Canyon, and (3) the availability of groundwater resources. Spatial patterns in the position and type of Colorado River deposits, only discernible through geologic mapping, can be used to test models related to Colorado River evolution. This is particularly true downstream from Grand Canyon where ancestral Colorado River deposits are well-exposed. We are principally interested in (1) regional patterns in the minimum and maximum elevation of each depositional unit, which are affected by depositional mechanism and postdepositional deformation; and (2) the volume of each unit, which reflects regional changes in erosion, transport efficiency, and accommodation space. The volume of Colorado River deposits below Grand Canyon has implications for groundwater resources, as the primary regional aquifer there is composed of those deposits. To this end, we are presently mapping Colorado River deposits and compiling and updating older mapping. This preliminary data release shows the current status of our mapping and compilation efforts. We plan to update it at regular intervals in conjunction with ongoing mapping.

  20. Formal language theory: refining the Chomsky hierarchy

    PubMed Central

    Jäger, Gerhard; Rogers, James

    2012-01-01

    The first part of this article gives a brief overview of the four levels of the Chomsky hierarchy, with a special emphasis on context-free and regular languages. It then recapitulates the arguments why neither regular nor context-free grammar is sufficiently expressive to capture all phenomena in the natural language syntax. In the second part, two refinements of the Chomsky hierarchy are reviewed, which are both relevant to the extant research in cognitive science: the mildly context-sensitive languages (which are located between context-free and context-sensitive languages), and the sub-regular hierarchy (which distinguishes several levels of complexity within the class of regular languages). PMID:22688632

  1. Formal language theory: refining the Chomsky hierarchy.

    PubMed

    Jäger, Gerhard; Rogers, James

    2012-07-19

    The first part of this article gives a brief overview of the four levels of the Chomsky hierarchy, with a special emphasis on context-free and regular languages. It then recapitulates the arguments why neither regular nor context-free grammar is sufficiently expressive to capture all phenomena in the natural language syntax. In the second part, two refinements of the Chomsky hierarchy are reviewed, which are both relevant to the extant research in cognitive science: the mildly context-sensitive languages (which are located between context-free and context-sensitive languages), and the sub-regular hierarchy (which distinguishes several levels of complexity within the class of regular languages).
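
    As a toy illustration of the boundary the article discusses between regular and context-free languages, the sketch below contrasts a pattern a finite-state device (here, a compiled regular expression) can recognize with one that requires unbounded counting; the languages chosen are the textbook examples, not ones taken from the article.

```python
# Illustrative sketch: (ab)* is regular, a^n b^n is context-free but not regular.
import re

def in_regular_language(s):
    # (ab)* is regular: a compiled regular expression suffices.
    return re.fullmatch(r"(ab)*", s) is not None

def in_anbn(s):
    # a^n b^n needs counting, i.e. more memory than any finite-state device has.
    n = len(s) // 2
    return len(s) == 2 * n and s == "a" * n + "b" * n

print(in_regular_language("ababab"))  # True
print(in_anbn("aaabbb"))              # True
print(in_anbn("aabbb"))               # False
```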

  2. Regularization Methods for High-Dimensional Instrumental Variables Regression With an Application to Genetical Genomics

    PubMed Central

    Lin, Wei; Feng, Rui; Li, Hongzhe

    2014-01-01

    In genetical genomics studies, it is important to jointly analyze gene expression data and genetic variants in exploring their associations with complex traits, where the dimensionality of gene expressions and genetic variants can both be much larger than the sample size. Motivated by such modern applications, we consider the problem of variable selection and estimation in high-dimensional sparse instrumental variables models. To overcome the difficulty of high dimensionality and unknown optimal instruments, we propose a two-stage regularization framework for identifying and estimating important covariate effects while selecting and estimating optimal instruments. The methodology extends the classical two-stage least squares estimator to high dimensions by exploiting sparsity using sparsity-inducing penalty functions in both stages. The resulting procedure is efficiently implemented by coordinate descent optimization. For the representative L1 regularization and a class of concave regularization methods, we establish estimation, prediction, and model selection properties of the two-stage regularized estimators in the high-dimensional setting where the dimensionalities of covariates and instruments are both allowed to grow exponentially with the sample size. The practical performance of the proposed method is evaluated by simulation studies and its usefulness is illustrated by an analysis of mouse obesity data. Supplementary materials for this article are available online. PMID:26392642
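
    The two-stage idea described in the abstract can be sketched as follows, assuming plain L1 (Lasso) penalties in both stages; the simulated data, penalty levels, and variable names are illustrative placeholders rather than the authors' implementation.

```python
# Sketch of two-stage regularized instrumental variables estimation with
# Lasso penalties; data and tuning parameters are arbitrary placeholders.
import numpy as np
from sklearn.linear_model import Lasso

rng = np.random.default_rng(0)
n, p_z, p_x = 200, 50, 30
Z = rng.normal(size=(n, p_z))                                          # instruments
X = Z[:, :5] @ rng.normal(size=(5, p_x)) + rng.normal(size=(n, p_x))   # covariates
y = X[:, 0] - 2 * X[:, 1] + rng.normal(size=n)                         # outcome

# Stage 1: regress each covariate on the instruments with an L1 penalty.
X_hat = np.column_stack([
    Lasso(alpha=0.1).fit(Z, X[:, j]).predict(Z) for j in range(p_x)
])

# Stage 2: regress the outcome on the fitted covariates, again with an L1 penalty.
stage2 = Lasso(alpha=0.1).fit(X_hat, y)
print("selected covariates:", np.flatnonzero(stage2.coef_))
```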

  3. A Systems Toxicology Approach Reveals Biological Pathways Dysregulated by Prenatal Arsenic Exposure

    PubMed Central

    Laine, Jessica E.; Fry, Rebecca C.

    2016-01-01

    BACKGROUND Prenatal exposure to inorganic arsenic (iAs) is associated with dysregulated gene and protein expression in the fetus, both evident at birth. Potential epigenetic mechanisms that underlie these changes include but are not limited to the methylation of cytosines (CpG). OBJECTIVE The aim of the present study was to compile datasets from studies on prenatal arsenic exposure to identify whether key genes, proteins, or both and their associated biological pathways are perturbed. METHODS We compiled datasets from 12 studies that analyzed the relationship between prenatal iAs exposure and fetal changes to the epigenome (5-methyl cytosine), transcriptome (mRNA expression), and/or proteome (protein expression changes). FINDINGS Across the 12 studies, a set of 845 unique genes was identified and found to enrich for their role in biological pathways, including those signaled by peroxisome proliferator-activated receptor, nuclear factor of kappa light polypeptide gene enhancer in B-cells inhibitor, and the glucocorticoid receptor. Tumor necrosis factor was identified as a putative cellular regulator underlying most (n = 277) of the identified iAs-associated genes or proteins. CONCLUSIONS Given their common identification across numerous human cohorts and their known toxicologic role in disease, the identified genes and pathways may underlie altered disease susceptibility associated with prenatal exposure to iAs. PMID:27325076

  4. The representation of the back in idiomatic expressions--do idioms value the body?

    PubMed

    Cedraschi, C; Bove, D; Perrin, E; Vischer, T L

    2000-01-01

    Whilst investigating the influence of patients' representations on the impact of teaching in the back school, we took an interest in 1) the place of the back in the French idioms referring to the body; and 2) the meaning these idioms convey about the back. The idioms including body part terms were sought on the basis of a compilation of French idioms; it has to be noted that such a compilation, however excellent it may be, can only offer a partial view of lay conversation. Occurrence of body parts and of their connotations were assessed. Idioms were classified as positive, negative or neutral, keeping in mind the difficulties of a strict classification in such a field. Drawings were then performed on the basis of the results of the descriptive analysis. Globally, idiomatic expressions offer a rather negative picture of the body or at least suggest that the body is prominently used to express negative ideas and emotions. This is particularly striking for the idioms associated with the back. The analysis of idioms referring to the body allows us to 'see with our own eyes' another aspect of the representations of the body and the back, as they are conveyed in the French language.

  5. Naturalistic Observations of Elicited Expressive Communication of Children with Autism: An Analysis of Teacher Instructions

    ERIC Educational Resources Information Center

    Chiang, Hsu-Min

    2009-01-01

    This study observed expressive communication of 17 Australian and 15 Taiwanese children with autism who were mute or had limited spoken language during 2 hour regular school routines and analyzed teacher instructions associated with elicited expressive communication. Results indicated: (a) the frequency of occurrence of elicited expressive…

  6. Reactor shutdown experience

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cletcher, J.W.

    1995-10-01

    This is a regular report of summary statistics relating to recent reactor shutdown experience. The information includes both the number of events and rates of occurrence. It was compiled from data about operating events that were entered into the SCSS data system by the Nuclear Operations Analysis Center at the Oak Ridge National Laboratory and covers the six-month period of July 1 to December 31, 1994. Cumulative information, starting from May 1, 1994, is also reported. Updates on shutdown events included in earlier reports are excluded. Information on shutdowns as a function of reactor power at the time of the shutdown is given for both BWR and PWR reactors. Data are also broken down by shutdown type and reactor age.

  7. Galactic and stellar dynamics in the era of high resolution surveys

    NASA Astrophysics Data System (ADS)

    Boily, C. M.; Combes, F.; Hensler, G.; Spurzem, R.

    2008-12-01

    The conference Galactic and Stellar Dynamics in the Era of High Resolution Surveys took place at the European Doctoral College (EDC) in Strasbourg from 2008 March 16 to 20. The event was co-sponsored by the Astronomische Gesellschaft (AG) and the Société Française d'Astronomie et d'Astrophysique (SF2A), a joint venture aiming to set a new trend of regular thematic meetings in specific areas of research. This special issue of the Astronomische Nachrichten is a compilation of the papers presented at the meeting. We give an outline of the meeting together with a short history of the relations of the two societies.

  8. Dynamics of coherent states in regular and chaotic regimes of the non-integrable Dicke model

    NASA Astrophysics Data System (ADS)

    Lerma-Hernández, S.; Chávez-Carlos, J.; Bastarrachea-Magnani, M. A.; López-del-Carpio, B.; Hirsch, J. G.

    2018-04-01

    The quantum dynamics of initial coherent states is studied in the Dicke model and correlated with the dynamics, regular or chaotic, of their classical limit. Analytical expressions for the survival probability, i.e. the probability of finding the system in its initial state at time t, are provided in the regular regions of the model. The results for regular regimes are compared with those of the chaotic ones. It is found that initial coherent states in regular regions have a much longer equilibration time than those located in chaotic regions. The properties of the distributions of the initial coherent states in the Hamiltonian eigenbasis are also studied. It is found that for regular states the components with non-negligible contributions are organized in sequences of energy levels distributed according to Gaussian functions. In the case of chaotic coherent states, the energy components do not have a simple structure and the number of participating energy levels is larger than in the regular cases.
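
    For reference, the survival probability analyzed in such studies is conventionally defined as follows, with the initial state expanded in the Hamiltonian eigenbasis; this is the standard definition, not an expression quoted from the article.

```latex
% Standard definition of the survival probability of an initial state
% |\Psi(0)\rangle = \sum_k c_k |E_k\rangle (not taken verbatim from the article).
\mathrm{SP}(t) = \bigl|\langle \Psi(0) \mid \Psi(t)\rangle\bigr|^{2}
             = \Bigl|\sum_{k} |c_{k}|^{2}\, e^{-iE_{k}t/\hbar}\Bigr|^{2},
\qquad c_{k} = \langle E_{k} \mid \Psi(0)\rangle .
```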

  9. Novel Minicircle Vector for Gene Therapy in Murine Myocardial Infarction

    PubMed Central

    Huang, Mei; Chen, ZhiYing; Hu, Shijun; Jia, Fangjun; Li, Zongjin; Hoyt, Grant; Robbins, Robert C.; Kay, Mark A.; Wu, Joseph C.

    2011-01-01

    Background Conventional plasmids for gene therapy produce low-level and short-term gene expression. In this study, we develop a novel non-viral vector which robustly and persistently expresses the hypoxia inducible factor-1 alpha (HIF-1α) therapeutic gene in the heart, leading to functional benefits following myocardial infarction (MI). Methods and Results We first created minicircles carrying double fusion (MC-DF) reporter gene consisting of firefly luciferase and enhanced green fluorescent protein (Fluc-eGFP) for noninvasive measurement of transfection efficiency. Mouse C2C12 myoblasts and normal FVB mice were used for in vitro and in vivo confirmation, respectively. Bioluminescence imaging (BLI) showed stable minicircle gene expression in the heart for >12 weeks and the activity level was 5.6±1.2 fold stronger than regular plasmid at day 4 (P<0.01). Next, we created minicircles carrying hypoxia inducible factor-1 alpha (MC-HIF-1α) therapeutic gene for treatment of MI. Adult FVB mice underwent LAD ligation and were injected intramyocardially with (1) MC-HIF-1α, (2) regular plasmid carrying HIF-1α (PL-HIF-1α) as positive control, and (3) PBS as negative control (n=10/group). Echocardiographic study showed a significantly greater improvement of left ventricular ejection fraction (LVEF) in the minicircle group (51.3%±3.6%) compared to regular plasmid group (42.3%±4.1%) and saline group (30.5%±2.8%) at week 4 (P<0.05 for both). Histology demonstrated increased neoangiogenesis in both treatment groups. Finally, Western blot showed minicircles express >50% higher HIF-1α level than regular plasmid. Conclusion Taken together, this is the first study to demonstrate that minicircles can significantly improve transfection efficiency, duration of transgene expression, and cardiac contractility. Given the serious drawbacks associated with most viral vectors, we believe this novel non-viral vector can be of great value for cardiac gene therapy protocols. PMID:19752373

  10. The production and perception of emotionally expressive walking sounds: similarities between musical performance and everyday motor activity.

    PubMed

    Giordano, Bruno L; Egermann, Hauke; Bresin, Roberto

    2014-01-01

    Several studies have investigated the encoding and perception of emotional expressivity in music performance. A relevant question concerns how the ability to communicate emotions in music performance is acquired. In accordance with recent theories on the embodiment of emotion, we suggest here that both the expression and recognition of emotion in music might at least in part rely on knowledge about the sounds of expressive body movements. We test this hypothesis by drawing parallels between musical expression of emotions and expression of emotions in sounds associated with a non-musical motor activity: walking. In a combined production-perception design, two experiments were conducted, and expressive acoustical features were compared across modalities. An initial performance experiment tested for similar feature use in walking sounds and music performance, and revealed that strong similarities exist. Features related to sound intensity, tempo and tempo regularity were identified as being used similarly in both domains. Participants in a subsequent perception experiment were able to recognize both non-emotional and emotional properties of the sound-generating walkers. An analysis of the acoustical correlates of behavioral data revealed that variations in sound intensity, tempo, and tempo regularity were likely used to recognize expressed emotions. Taken together, these results lend support to the motor origin hypothesis for the musical expression of emotions.

  11. Library Law Handbook: State Laws Relating to Michigan Libraries. 1993 Edition.

    ERIC Educational Resources Information Center

    Michigan Library, Lansing.

    This document is a compilation of state laws relating to Michigan libraries, intended as a tool for library managers and as an expression of continued commitment to strengthening library services throughout the state. It reprints legislation directly related to libraries of all levels, including: library networks; regional libraries: district…

  12. Linking Landscape Characteristics and High Stream Nitrogen in the Oregon Coast Range: Red Alder Complicates Use of Nutrient Criteria

    EPA Science Inventory

    Red alder (a nitrogen-fixing tree) and sea salt inputs can strongly influence stream nitrogen concentrations in western Oregon and Washington. We compiled a database of stream nitrogen and landscape characteristics in the Oregon Coast Range. Basal area of alder, expressed as a ...

  13. ACTFL Workshop Proceedings: Black Literature of French Expression.

    ERIC Educational Resources Information Center

    Geno, Thomas H., Ed.; Bostick, Herman F., Ed.

    This compilation of working papers is part 1 of the proceedings of the ACTFL 1972 preconference workshop on black francophone literature. Part 2, to be completed in the future, will be a bibliography containing primary and secondary sources, textbooks, articles, a glossary of African terms, and resource materials useful in language classrooms. The…

  14. American Indian Prose and Poetry. An Anthology.

    ERIC Educational Resources Information Center

    Astrov, Margot, Ed.

    In this anthology of translations of American Indian prose and poetry, it is pointed out that differences in styles and mental attitudes of various tribes are reflected through self-expression. In keeping with this, the compilation is organized according to geographical regions in North and South America, including Mexico and Central America.…

  15. Pittsburgh Area Preschool Association Publication: Selected Articles (Volume 8, No. 1-4).

    ERIC Educational Resources Information Center

    Frank, Mary, Ed.

    This compilation of short reports distributed to preschool teachers in the Pittsburgh area covers four main topics: (1) Adoption (2) Expressive Art Therapy, (3) The Infant, and (4) Learning Disorders in Young Children. The adoption section includes reports pertaining to the adoption process in Pennsylvania, adoptive parents' legal rights, medical…

  16. Functional Programming with C++ Template Metaprograms

    NASA Astrophysics Data System (ADS)

    Porkoláb, Zoltán

    Template metaprogramming is an emerging new direction of generative programming. With clever definitions of templates we can force the C++ compiler to execute algorithms at compilation time. Among the application areas of template metaprograms are expression templates, static interface checking, code optimization with adaptation, language embedding and active libraries. However, as template metaprogramming was not an original design goal, the C++ language is not capable of elegant expression of metaprograms. The complicated syntax leads to the creation of code that is hard to write, understand and maintain. Although template metaprogramming has a strong relationship with functional programming, this is not reflected in the language syntax and existing libraries. In this paper we give a short and incomplete introduction to C++ templates and the basics of template metaprogramming. We highlight the role of template metaprograms and some important and widely used idioms. We give an overview of the possible application areas as well as debugging and profiling techniques. We suggest a pure functional style programming interface for C++ template metaprograms in the form of embedded Haskell code which is transformed to standard compliant C++ source.

  17. Global Seismicity: Three New Maps Compiled with Geographic Information Systems

    NASA Technical Reports Server (NTRS)

    Lowman, Paul D., Jr.; Montgomery, Brian C.

    1996-01-01

    This paper presents three new maps of global seismicity compiled from NOAA digital data, covering the interval 1963-1998, with three different magnitude ranges (mb): greater than 3.5, less than 3.5, and all detectable magnitudes. A commercially available geographic information system (GIS) was used as the database manager. Epicenter locations were acquired from a CD-ROM supplied by the National Geophysical Data Center. A methodology is presented that can be followed by general users. The implications of the maps are discussed, including the limitations of conventional plate models, and the different tectonic behavior of continental vs. oceanic lithosphere. Several little-known areas of intraplate or passive margin seismicity are also discussed, possibly expressing horizontal compression generated by ridge push.

  18. STAR- A SIMPLE TOOL FOR AUTOMATED REASONING SUPPORTING HYBRID APPLICATIONS OF ARTIFICIAL INTELLIGENCE (DEC VAX VERSION)

    NASA Technical Reports Server (NTRS)

    Borchardt, G. C.

    1994-01-01

    The Simple Tool for Automated Reasoning program (STAR) is an interactive, interpreted programming language for the development and operation of artificial intelligence (AI) application systems. STAR provides an environment for integrating traditional AI symbolic processing with functions and data structures defined in compiled languages such as C, FORTRAN and PASCAL. This type of integration occurs in a number of AI applications including interpretation of numerical sensor data, construction of intelligent user interfaces to existing compiled software packages, and coupling AI techniques with numerical simulation techniques and control systems software. The STAR language was created as part of an AI project for the evaluation of imaging spectrometer data at NASA's Jet Propulsion Laboratory. Programming in STAR is similar to other symbolic processing languages such as LISP and CLIP. STAR includes seven primitive data types and associated operations for the manipulation of these structures. A semantic network is used to organize data in STAR, with capabilities for inheritance of values and generation of side effects. The AI knowledge base of STAR can be a simple repository of records or it can be a highly interdependent association of implicit and explicit components. The symbolic processing environment of STAR may be extended by linking the interpreter with functions defined in conventional compiled languages. These external routines interact with STAR through function calls in either direction, and through the exchange of references to data structures. The hybrid knowledge base may thus be accessed and processed in general by either side of the application. STAR is initially used to link externally compiled routines and data structures. It is then invoked to interpret the STAR rules and symbolic structures. In a typical interactive session, the user enters an expression to be evaluated, STAR parses the input, evaluates the expression, performs any file input/output required, and displays the results. The STAR interpreter is written in the C language for interactive execution. It has been implemented on a VAX 11/780 computer operating under VMS, and the UNIX version has been implemented on a Sun Microsystems 2/170 workstation. STAR has a memory requirement of approximately 200K of 8 bit bytes, excluding externally compiled functions and application-dependent symbolic definitions. This program was developed in 1985.

  19. STAR- A SIMPLE TOOL FOR AUTOMATED REASONING SUPPORTING HYBRID APPLICATIONS OF ARTIFICIAL INTELLIGENCE (UNIX VERSION)

    NASA Technical Reports Server (NTRS)

    Borchardt, G. C.

    1994-01-01

    The Simple Tool for Automated Reasoning program (STAR) is an interactive, interpreted programming language for the development and operation of artificial intelligence (AI) application systems. STAR provides an environment for integrating traditional AI symbolic processing with functions and data structures defined in compiled languages such as C, FORTRAN and PASCAL. This type of integration occurs in a number of AI applications including interpretation of numerical sensor data, construction of intelligent user interfaces to existing compiled software packages, and coupling AI techniques with numerical simulation techniques and control systems software. The STAR language was created as part of an AI project for the evaluation of imaging spectrometer data at NASA's Jet Propulsion Laboratory. Programming in STAR is similar to other symbolic processing languages such as LISP and CLIP. STAR includes seven primitive data types and associated operations for the manipulation of these structures. A semantic network is used to organize data in STAR, with capabilities for inheritance of values and generation of side effects. The AI knowledge base of STAR can be a simple repository of records or it can be a highly interdependent association of implicit and explicit components. The symbolic processing environment of STAR may be extended by linking the interpreter with functions defined in conventional compiled languages. These external routines interact with STAR through function calls in either direction, and through the exchange of references to data structures. The hybrid knowledge base may thus be accessed and processed in general by either side of the application. STAR is initially used to link externally compiled routines and data structures. It is then invoked to interpret the STAR rules and symbolic structures. In a typical interactive session, the user enters an expression to be evaluated, STAR parses the input, evaluates the expression, performs any file input/output required, and displays the results. The STAR interpreter is written in the C language for interactive execution. It has been implemented on a VAX 11/780 computer operating under VMS, and the UNIX version has been implemented on a Sun Microsystems 2/170 workstation. STAR has a memory requirement of approximately 200K of 8 bit bytes, excluding externally compiled functions and application-dependent symbolic definitions. This program was developed in 1985.
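
    The semantic network with value inheritance that the STAR abstracts describe can be illustrated, very loosely, with the following toy sketch; the node names and slots are invented for the example and are not part of STAR itself.

```python
# Toy semantic network with value inheritance, loosely mirroring the kind of
# data organization described for STAR; node and slot names are invented.
class Node:
    def __init__(self, name, parent=None, **slots):
        self.name, self.parent, self.slots = name, parent, slots

    def get(self, slot):
        """Look up a slot value, inheriting from ancestors when absent locally."""
        node = self
        while node is not None:
            if slot in node.slots:
                return node.slots[slot]
            node = node.parent
        raise KeyError(slot)

instrument = Node("instrument", sample_rate_hz=10.0)
spectrometer = Node("imaging_spectrometer", parent=instrument, bands=224)
print(spectrometer.get("bands"))           # 224 (local value)
print(spectrometer.get("sample_rate_hz"))  # 10.0 (inherited from parent)
```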

  20. A closed expression for the UV-divergent parts of one-loop tensor integrals in dimensional regularization

    NASA Astrophysics Data System (ADS)

    Sulyok, G.

    2017-07-01

    Starting from the general definition of a one-loop tensor N-point function, we use its Feynman parametrization to calculate the ultraviolet (UV-)divergent part of an arbitrary tensor coefficient in the framework of dimensional regularization. In contrast to existing recursion schemes, we are able to present a general analytic result in closed form that enables direct determination of the UV-divergent part of any one-loop tensor N-point coefficient independent from UV-divergent parts of other one-loop tensor N-point coefficients. Simplified formulas and explicit expressions are presented for A-, B-, C-, D-, E-, and F-functions.
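
    For orientation, two familiar special cases of such UV-divergent parts are quoted below; these are standard Passarino-Veltman results given for context, not expressions reproduced from the paper.

```latex
% Standard lowest-rank examples, with d = 4 - 2\epsilon and
% \Delta_{\epsilon} = 1/\epsilon - \gamma_{E} + \ln 4\pi (quoted for orientation,
% not from the paper):
\left. A_{0}(m^{2})\right|_{\mathrm{UV}} = m^{2}\,\Delta_{\epsilon},
\qquad
\left. B_{0}(p^{2}, m_{0}^{2}, m_{1}^{2})\right|_{\mathrm{UV}} = \Delta_{\epsilon}.
```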

  1. A High-Level Language for Modeling Algorithms and Their Properties

    NASA Astrophysics Data System (ADS)

    Akhtar, Sabina; Merz, Stephan; Quinson, Martin

    Designers of concurrent and distributed algorithms usually express them using pseudo-code. In contrast, most verification techniques are based on more mathematically-oriented formalisms such as state transition systems. This conceptual gap contributes to hindering the use of formal verification techniques. Leslie Lamport introduced PlusCal, a high-level algorithmic language that has the "look and feel" of pseudo-code, but is equipped with a precise semantics and includes a high-level expression language based on set theory. PlusCal models can be compiled to TLA+ and verified using the TLC model checker.
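
    The state-transition-system semantics that model checkers such as TLC operate over can be illustrated with a toy explicit-state reachability search; the sketch below is not PlusCal or TLA+, only an analogy written in ordinary code.

```python
# Toy explicit-state exploration of a transition system (illustrative only).
from collections import deque

def reachable(initial, next_states):
    """Breadth-first exploration of all states reachable from `initial`."""
    seen, frontier = {initial}, deque([initial])
    while frontier:
        state = frontier.popleft()
        for nxt in next_states(state):
            if nxt not in seen:
                seen.add(nxt)
                frontier.append(nxt)
    return seen

# Example: a counter that may increment modulo 4 or reset to 0.
print(sorted(reachable(0, lambda s: {(s + 1) % 4, 0})))  # [0, 1, 2, 3]
```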

  2. Fish Karyome version 2.1: a chromosome database of fishes and other aquatic organisms

    PubMed Central

    Nagpure, Naresh Sahebrao; Pathak, Ajey Kumar; Pati, Rameshwar; Rashid, Iliyas; Sharma, Jyoti; Singh, Shri Prakash; Singh, Mahender; Sarkar, Uttam Kumar; Kushwaha, Basdeo; Kumar, Ravindra; Murali, S.

    2016-01-01

    Voluminous information is available on karyological studies of fishes; however, limited efforts have been made to compile and curate the available karyological data in a digital form. The ‘Fish Karyome’ database was a preliminary attempt to compile and digitize the available karyological information on finfishes belonging to the Indian subcontinent. But the database had limitations, since it covered data only on Indian finfishes with limited search options. Based on feedback from users and the database's utility in fish cytogenetic studies, Fish Karyome was upgraded by applying Linux, Apache, MySQL and PHP (Hypertext Preprocessor) (LAMP) technologies. In the present version, the scope of the system was increased by compiling and curating the available chromosomal information from across the globe on fishes and other aquatic organisms, such as echinoderms, molluscs and arthropods, especially those of aquaculture importance. Thus, Fish Karyome version 2.1 presently covers 866 chromosomal records for 726 species, supported by 253 published articles, and the information is being updated regularly. The database provides information on chromosome number and morphology, sex chromosomes, chromosome banding, molecular cytogenetic markers, etc., supported by fish and karyotype images through interactive tools. It also enables users to browse and view chromosomal information based on habitat, family, conservation status and chromosome number. The system also displays chromosome numbers in model organisms, protocols for chromosome preparation and allied techniques, and a glossary of cytogenetic terms. A data submission facility has also been provided through a data submission panel. The database can serve as a unique and useful resource for cytogenetic characterization, sex determination, chromosomal mapping, cytotaxonomy, karyo-evolution and systematics of fishes. Database URL: http://mail.nbfgr.res.in/Fish_Karyome PMID:26980518

  3. Fish Karyome version 2.1: a chromosome database of fishes and other aquatic organisms.

    PubMed

    Nagpure, Naresh Sahebrao; Pathak, Ajey Kumar; Pati, Rameshwar; Rashid, Iliyas; Sharma, Jyoti; Singh, Shri Prakash; Singh, Mahender; Sarkar, Uttam Kumar; Kushwaha, Basdeo; Kumar, Ravindra; Murali, S

    2016-01-01

    Voluminous information is available on karyological studies of fishes; however, limited efforts have been made to compile and curate the available karyological data in a digital form. The 'Fish Karyome' database was a preliminary attempt to compile and digitize the available karyological information on finfishes belonging to the Indian subcontinent. But the database had limitations, since it covered data only on Indian finfishes with limited search options. Based on feedback from users and the database's utility in fish cytogenetic studies, Fish Karyome was upgraded by applying Linux, Apache, MySQL and PHP (Hypertext Preprocessor) (LAMP) technologies. In the present version, the scope of the system was increased by compiling and curating the available chromosomal information from across the globe on fishes and other aquatic organisms, such as echinoderms, molluscs and arthropods, especially those of aquaculture importance. Thus, Fish Karyome version 2.1 presently covers 866 chromosomal records for 726 species, supported by 253 published articles, and the information is being updated regularly. The database provides information on chromosome number and morphology, sex chromosomes, chromosome banding, molecular cytogenetic markers, etc., supported by fish and karyotype images through interactive tools. It also enables users to browse and view chromosomal information based on habitat, family, conservation status and chromosome number. The system also displays chromosome numbers in model organisms, protocols for chromosome preparation and allied techniques, and a glossary of cytogenetic terms. A data submission facility has also been provided through a data submission panel. The database can serve as a unique and useful resource for cytogenetic characterization, sex determination, chromosomal mapping, cytotaxonomy, karyo-evolution and systematics of fishes. Database URL: http://mail.nbfgr.res.in/Fish_Karyome. © The Author(s) 2016. Published by Oxford University Press.

  4. DOE Office of Scientific and Technical Information (OSTI.GOV)

    DiZio, S.M.

    Various state regulatory agencies have expressed a need for networking with information gatherers/researchers to produce a concise compilation of primary information so that the basis for regulatory standards can be scientifically referenced. California has instituted several programs to retrieve primary information, generate primary information through research, and generate unique regulatory standards by integrating the primary literature and the products of research. This paper describes these programs.

  5. Statewide Coordinating Agencies and State Commissions for Federal Programs in Higher Education. A Directory of Professional Personnel. 1969.

    ERIC Educational Resources Information Center

    Winandy, Donald H.; Marsh, Robert

    This is a comprehensive, up-dated directory of professional personnel of state higher education governing or coordinating agencies and state commissions for the administration of certain federal programs relating to higher education. The directory, compiled from questionnaires, originated from a need expressed by many persons in the field for…

  6. Developing Toxicogenomics as a Research Tool by Applying Benchmark Dose-Response Modeling to inform Chemical Mode of Action and Tumorigenic Potency

    EPA Science Inventory

    ABSTRACT Results of global gene expression profiling after short-term exposures can be used to inform tumorigenic potency and chemical mode of action (MOA) and thus serve as a strategy to prioritize future or data-poor chemicals for further evaluation. This compilation of cas...

  7. Cardiopulmonary resuscitation standards for clinical practice and training in the UK.

    PubMed

    Gabbott, David; Smith, Gary; Mitchell, Sarah; Colquhoun, Michael; Nolan, Jerry; Soar, Jasmeet; Pitcher, David; Perkins, Gavin; Phillips, Barbara; King, Ben; Spearpoint, Ken

    2005-07-01

    The Royal College of Anaesthetists, the Royal College of Physicians, the Intensive Care Society and the Resuscitation Council (UK) have published new resuscitation standards. The document provides advice to UK healthcare organisations, resuscitation committees and resuscitation officers on all aspects of the resuscitation service. It includes sections on resuscitation training, resuscitation equipment, the cardiac arrest team, cardiac arrest prevention, patient transfer, post-resuscitation care, audit and research. The document makes several recommendations. Healthcare institutions should have, or be represented on, a resuscitation committee that is responsible for all resuscitation issues. Every institution should have at least one resuscitation officer responsible for teaching and conducting training in resuscitation techniques. Staff with patient contact should be given regular resuscitation training appropriate to their expected abilities and roles. Clinical staff should receive regular training in the recognition of patients at risk of cardiopulmonary arrest and the measures required for the prevention of cardiopulmonary arrest. Healthcare institutions admitting acutely ill patients should have a resuscitation team, or its equivalent, available at all times. Clear guidelines should be available indicating how and when to call for the resuscitation team. Cardiopulmonary arrest should be managed according to current national guidelines. Resuscitation equipment should be available throughout the institution for clinical use and for training. The practice of resuscitation should be audited to maintain and improve standards of care. A do not attempt resuscitation (DNAR) policy should be compiled, communicated to relevant members of staff, used and audited regularly. Funding must be provided to support an effective resuscitation service.

  8. Cardiopulmonary resuscitation standards for clinical practice and training in the UK.

    PubMed

    Gabbott, David; Smith, Gary; Mitchell, Sarah; Colquhoun, Michael; Nolan, Jerry; Soar, Jasmeet; Pitcher, David; Perkins, Gavin; Phillips, Barbara; King, Ben; Spearpoint, Ken

    2005-01-01

    The Royal College of Anaesthetists, the Royal College of Physicians, the Intensive Care Society and the Resuscitation Council (UK) have published new resuscitation standards. The document provides advice to UK healthcare organisations, resuscitation committees and resuscitation officers on all aspects of the resuscitation service. It includes sections on resuscitation training, resuscitation equipment, the cardiac arrest team, cardiac arrest prevention, patient transfer, post resuscitation care, audit and research. The document makes several recommendations. Healthcare institutions should have, or be represented on, a resuscitation committee that is responsible for all resuscitation issues. Every institution should have at least one resuscitation officer responsible for teaching and conducting training in resuscitation techniques. Staff with patient contact should be given regular resuscitation training appropriate to their expected abilities and roles. Clinical staff should receive regular training in the recognition of patients at risk of cardiopulmonary arrest and the measures required for the prevention of cardiopulmonary arrest. Healthcare institutions admitting acutely ill patients should have a resuscitation team, or its equivalent, available at all times. Clear guidelines should be available indicating how and when to call for the resuscitation team. Cardiopulmonary arrest should be managed according to current national guidelines. Resuscitation equipment should be available throughout the institution for clinical use and for training. The practice of resuscitation should be audited to maintain and improve standards of care. A do not attempt resuscitation (DNAR) policy should be compiled, communicated to relevant members of staff, used and audited regularly. Funding must be provided to support an effective resuscitation service.

  9. Place of the reposition flap in the treatment of distal amputations of the fingers.

    PubMed

    Sbai, Mohamed Ali; M'chirgui, Mayssa El; Maalla, Riadh; Khorbi, Adel

    2017-08-01

    Distal finger amputations pose a therapeutic problem related to the quality of the distal fragment. Reimplantation remains the reference treatment for functional and aesthetic recovery of the hand. The interest of this study is to propose the reposition flap as an alternative to the various coverage techniques of the proximal stump in the many situations where revascularization is impossible. It consists of osteosynthesis of the bone fragment and its coverage by a pedicled local flap. The reposition flap technique was evaluated retrospectively between 2003 and 2016 through a study of 13 patients compiled in the Nabeul orthopedic department. For each patient, sensitivity, pulp trophicity, interphalangeal mobility, digital length, nail appearance and radiological consolidation were evaluated. The reposition flap preserves more than 80% of the length of P3. This procedure improves nail aesthetics in comparison with regularization. There is no significant difference in pulp sensitivity or in mobility of the distal interphalangeal (DIP) joint as a function of the technique studied. However, there is a significant difference in the average QuickDASH score (350 versus 500 for regularizations). The reposition flap seems to be a good alternative to regularization in the context of trans-P3 finger amputations in which the distal fragment cannot be revascularized. It allows better aesthetic and functional results. Copyright © 2017 Daping Hospital and the Research Institute of Surgery of the Third Military Medical University. Production and hosting by Elsevier B.V. All rights reserved.

  10. The Production and Perception of Emotionally Expressive Walking Sounds: Similarities between Musical Performance and Everyday Motor Activity

    PubMed Central

    Giordano, Bruno L.; Egermann, Hauke; Bresin, Roberto

    2014-01-01

    Several studies have investigated the encoding and perception of emotional expressivity in music performance. A relevant question concerns how the ability to communicate emotions in music performance is acquired. In accordance with recent theories on the embodiment of emotion, we suggest here that both the expression and recognition of emotion in music might at least in part rely on knowledge about the sounds of expressive body movements. We test this hypothesis by drawing parallels between musical expression of emotions and expression of emotions in sounds associated with a non-musical motor activity: walking. In a combined production-perception design, two experiments were conducted, and expressive acoustical features were compared across modalities. An initial performance experiment tested for similar feature use in walking sounds and music performance, and revealed that strong similarities exist. Features related to sound intensity, tempo and tempo regularity were identified as being used similarly in both domains. Participants in a subsequent perception experiment were able to recognize both non-emotional and emotional properties of the sound-generating walkers. An analysis of the acoustical correlates of behavioral data revealed that variations in sound intensity, tempo, and tempo regularity were likely used to recognize expressed emotions. Taken together, these results lend support to the motor origin hypothesis for the musical expression of emotions. PMID:25551392

  11. 78 FR 66962 - Advisory Committee on Construction Safety and Health (ACCSH)

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-11-07

    ... Assistant Secretary of Labor request for nominations for membership on ACCSH. DATES: ACCSH meeting: ACCSH... Office at (202) 693- 1648. Regular mail, express mail, hand delivery, or messenger (courier) service...). OSHA's Docket Office accepts deliveries (hand deliveries, express mail, and messenger service) during...

  12. Sulfur dioxide emission rates from Kīlauea Volcano, Hawai‘i, 2007–2010

    USGS Publications Warehouse

    Elias, T.; Sutton, A.J.

    2012-01-01

    Kīlauea Volcano has one of the longest running volcanic sulfur dioxide (SO2) emission rate databases on record. Sulfur dioxide emission rates from Kīlauea Volcano were first measured by Stoiber and Malone (1975) and have been measured on a regular basis since 1979 (Elias and Sutton, 2007, and references within). Compilations of SO2 emission-rate and wind-vector data from 1979 through 2006 are available on the USGS Web site (Elias and others, 1998; Elias and Sutton, 2002; Elias and Sutton, 2007). This report updates the database, documents the changes in data collection and processing methods, and highlights how SO2 emissions have varied with eruptive activity at Kīlauea Volcano for the interval 2007–2010.

  13. A Standard of Knowledge for the Professional Practice of Toxicology.

    PubMed

    Hulla, Janis E; Kinter, Lewis B; Kelman, Bruce

    2015-08-01

    Employers, courts, and the general public judge the credibility of professionals based on credentials such as academic degrees, publications, memberships in professional organizations, board certifications, and professional registrations. However, the relevance and merit of these credentials can be difficult to determine objectively. Board certification can be a reliable indicator of proficiency if the certifying organization demonstrates, through regularly scheduled independent review, that its processes meet established standards and when a certificate holder is required to periodically demonstrate command of a body of knowledge that is essential to current professional practice. We report herein a current Standard of Knowledge in general toxicology compiled from the experience and opinions of 889 certified practicing professional toxicologists. An examination is the most commonly used instrument for testing a certification candidate's command of the body of knowledge. However, an examination-based certification is only creditable when the body of knowledge, to which a certification examination tests, is representative of the current knowledge, skills, and capabilities needed to effectively practice at the professional level. Thus, that body of knowledge must be the current "Standard of Knowledge" for the profession, compiled in a transparent fashion from current practitioners of the profession. This work was conducted toward ensuring the scientific integrity of the products produced by professional toxicologists.

  14. [Progress of genome engineering technology via clustered regularly interspaced short palindromic repeats--a review].

    PubMed

    Li, Hao; Qiu, Shaofu; Song, Hongbin

    2013-10-04

    In the survival competition with phages, bacteria and archaea gradually evolved an acquired immune system--clustered regularly interspaced short palindromic repeats (CRISPR)--characterized by transcription of crRNA and expression of CRISPR-associated (Cas) proteins to specifically silence or cleave foreign double-stranded DNA. In recent years, strong interest has arisen in this primitive prokaryotic immune system and many in-depth studies are under way. Recently, researchers successfully repurposed CRISPR as an RNA-guided platform for sequence-specific control of gene expression, which provides a simple approach for selectively perturbing gene expression on a genome-wide scale. It will undoubtedly bring genome engineering into a new, more convenient and accurate era.

  15. Turning Bone Morphogenetic Protein 2 (BMP2) On and Off in Mesenchymal Cells†

    PubMed Central

    Rogers, Melissa B.; Shah, Tapan A.; Shaikh, Nadia N.

    2016-01-01

    The concentration, location, and timing of bone morphogenetic protein 2 (BMP2, HGNC:1069, GeneID: 650) gene expression must be precisely regulated. Abnormal BMP2 levels cause congenital anomalies and diseases involving the mesenchymal cells that differentiate into muscle, fat, cartilage, and bone. The molecules and conditions that influence BMP2 synthesis are diverse. Understandably, complex mechanisms control Bmp2 gene expression. This review includes a compilation of agents and conditions that can induce Bmp2. The currently known trans-regulatory factors and cis-regulatory elements that modulate Bmp2 expression are summarized and discussed. This article is protected by copyright. All rights reserved PMID:25776852

  16. Fine Structure and Dynamics of Sunspot Penumbra

    NASA Astrophysics Data System (ADS)

    Ryutova, M.; Berger, T.; Title, A.

    2007-08-01

    A mature sunspot is usually surrounded by a penumbra: the strong vertical magnetic field in the umbra, the dark central region of the sunspot, becomes more and more horizontal toward the periphery, forming an ensemble of thin magnetic filaments of varying inclinations. Recent high resolution observations with the 1-meter Swedish Solar Telescope (SST) on La Palma revealed a fine substructure of penumbral filaments and new regularities in their dynamics. These findings provide both the basis and constraints for an adequate model of the penumbra, whose origin still remains enigmatic. We present results of recent observations obtained with the SST. Our data, taken simultaneously in 4305 Å G-band and 4396 Å continuum bandpasses and compiled into high cadence movies, confirm previous results and reveal new features of the penumbra. We find, e.g., that individual filaments are cylindrical helices with a pitch/radius ratio providing their dynamic stability. We propose a mechanism that may explain the fine structure of penumbral filaments, the observed regularities, and their connection with sunspot formation. The mechanism is based on the anatomy of sunspots in which not only the penumbra has a filamentary structure but the umbra itself is a dense conglomerate of twisted interlaced flux tubes.

  17. Distributional and regularized radiation fields of non-uniformly moving straight dislocations, and elastodynamic Tamm problem

    NASA Astrophysics Data System (ADS)

    Lazar, Markus; Pellegrini, Yves-Patrick

    2016-11-01

    This work introduces original explicit solutions for the elastic fields radiated by non-uniformly moving, straight, screw or edge dislocations in an isotropic medium, in the form of time-integral representations in which acceleration-dependent contributions are explicitly separated out. These solutions are obtained by applying an isotropic regularization procedure to distributional expressions of the elastodynamic fields built on the Green tensor of the Navier equation. The obtained regularized field expressions are singularity-free, and depend on the dislocation density rather than on the plastic eigenstrain. They cover non-uniform motion at arbitrary speeds, including faster-than-wave ones. A numerical method of computation is discussed, that rests on discretizing motion along an arbitrary path in the plane transverse to the dislocation, into a succession of time intervals of constant velocity vector over which time-integrated contributions can be obtained in closed form. As a simple illustration, it is applied to the elastodynamic equivalent of the Tamm problem, where fields induced by a dislocation accelerated from rest beyond the longitudinal wave speed, and thereafter put to rest again, are computed. As expected, the proposed expressions produce Mach cones, the dynamic build-up and decay of which is illustrated by means of full-field calculations.

  18. A Novel Hypercomplex Solution to Kepler's Problem

    NASA Astrophysics Data System (ADS)

    Condurache, C.; Martinuşi, V.

    2007-05-01

    By using a Sundman-like regularization, we offer a unified solution to Kepler's problem using hypercomplex numbers. The fundamental role in this paper is played by the Laplace-Runge-Lenz prime integral and by the algebra of hypercomplex numbers. The procedure unifies and generalizes the regularizations offered by Levi-Civita and Kustaanheimo-Stiefel. Closed form hypercomplex expressions for the law of motion and velocity are deduced, together with novel hypercomplex prime integrals.
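
    For reference, the Laplace-Runge-Lenz prime integral mentioned above has the standard textbook form below for the Kepler potential V(r) = -k/r; it is quoted for context, not reproduced from the paper.

```latex
% Conserved Laplace--Runge--Lenz vector for V(r) = -k/r, with momentum p,
% angular momentum L = r x p, and mass m (standard form, quoted for context):
\mathbf{A} = \mathbf{p}\times\mathbf{L} - mk\,\frac{\mathbf{r}}{r},
\qquad \mathbf{L} = \mathbf{r}\times\mathbf{p}.
```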

  19. T3SEdb: data warehousing of virulence effectors secreted by the bacterial Type III Secretion System.

    PubMed

    Tay, Daniel Ming Ming; Govindarajan, Kunde Ramamoorthy; Khan, Asif M; Ong, Terenze Yao Rui; Samad, Hanif M; Soh, Wei Wei; Tong, Minyan; Zhang, Fan; Tan, Tin Wee

    2010-10-15

    Effectors of the Type III Secretion System (T3SS) play a pivotal role in establishing and maintaining pathogenicity in the host, and therefore the identification of these effectors is important in understanding virulence. However, the effectors display a high level of sequence diversity, which makes their identification difficult. There is a need to collate and annotate existing effector sequences in public databases to enable systematic analyses of these sequences and the development of models for screening and selecting putative novel effectors from bacterial genomes that can be validated by a smaller number of key experiments. Herein, we present T3SEdb http://effectors.bic.nus.edu.sg/T3SEdb, a specialized database of annotated T3SS effector (T3SE) sequences containing 1089 records from 46 bacterial species compiled from the literature and public protein databases. Procedures have been defined for i) comprehensive annotation of the experimental status of effectors, ii) submission and curation review of records by users of the database, and iii) regular updates of existing and new T3SEdb records. Fielded keyword and sequence searches (BLAST, regular expression) are supported for both experimentally verified and hypothetical T3SEs. More than 171 clusters of T3SEs were detected based on sequence identity comparisons (intra-cluster difference up to ~60%). Owing to this high level of sequence diversity of T3SEs, T3SEdb provides a large number of experimentally known effector sequences with wide species representation for the creation of effector predictors. We created a reliable effector prediction tool, integrated into the database, to demonstrate the application of the database for such endeavours. T3SEdb is the first specialised database reported for T3SS effectors, enriched with manual annotations that facilitated the systematic construction of a reliable prediction model for identification of novel effectors. T3SEdb represents a platform for the inclusion of additional metadata annotations for future development of sophisticated effector prediction models for screening and selection of putative novel effectors from bacterial genomes/proteomes that can be validated by a small number of key experiments.
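
    The regular-expression sequence search mentioned in the abstract can be illustrated with a small motif scan; the motif and sequences below are invented for the example and carry no biological significance.

```python
# Illustrative regular-expression motif scan over protein sequences (not the
# T3SEdb implementation); sequences and motif are made up for the example.
import re

sequences = {
    "effector_candidate_1": "MSKITSLLPRATGGQQWERTYHAAALKKN",
    "effector_candidate_2": "MARNDDEQWKRTYHPPGGLLS",
}

# Toy motif: tryptophan, any residue, arginine-or-lysine, threonine, tyrosine.
motif = re.compile(r"W.[RK]TY")

for name, seq in sequences.items():
    for match in motif.finditer(seq):
        print(f"{name}: {match.group()} at position {match.start()}")
```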

  20. Administrative and Legislative Uses of the Terms "Poverty," "Low Income," and other Related Items. The Measure of Poverty, Technical Paper II.

    ERIC Educational Resources Information Center

    Grob, George; And Others

    This paper is a compilation of the major federal, legislative, administrative and statistical uses of the terms poverty, low income, and related expressions. The first section summarizes the most commonly used definitions of poverty. These are: (1) the official statistical poverty definition, (2) program eligibility guidelines of the Community…

  1. "The Purpose of This Study Is to": Connecting Lexical Bundles and Moves in Research Article Introductions

    ERIC Educational Resources Information Center

    Cortes, Viviana

    2013-01-01

    This article presents a group of lexical bundles identified in a corpus of research article introductions as the first step in the analysis of these expressions in the different sections of the research article. A one-million word corpus of research article introductions from various disciplines was compiled and the lexical bundles identified in…
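
    Lexical bundles are, operationally, high-frequency word n-grams; the sketch below shows one simple way such bundles can be extracted, using a made-up two-sentence corpus and an arbitrary frequency threshold rather than the study's corpus or criteria.

```python
# Simple extraction of recurrent n-grams ("lexical bundles"); corpus and
# thresholds are invented placeholders.
from collections import Counter

def lexical_bundles(texts, n=4, min_freq=2):
    counts = Counter()
    for text in texts:
        tokens = text.lower().split()
        counts.update(tuple(tokens[i:i + n]) for i in range(len(tokens) - n + 1))
    return {" ".join(gram): c for gram, c in counts.items() if c >= min_freq}

corpus = [
    "the purpose of this study is to examine move structure",
    "the purpose of this study is to compare introductions",
]
print(lexical_bundles(corpus))
```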

  2. The 2007 Anatomy Ceremony: A Service of Gratitude

    PubMed Central

    2008-01-01

    Yale University medical and PA students, classes of 2010 and 2008 respectively, express their gratitude in a compilation of reflections on learning human anatomy. In coordination with the Section of Anatomy and Experimental Surgery at the School of Medicine, the Yale Journal of Biology and Medicine encourages you to hear the stories of the body as narrated by the student.

  3. Sensitivity regularization of the Cramér-Rao lower bound to minimize B1 nonuniformity effects in quantitative magnetization transfer imaging.

    PubMed

    Boudreau, Mathieu; Pike, G Bruce

    2018-05-07

    To develop and validate a regularization approach for optimizing the B1 insensitivity of the quantitative magnetization transfer (qMT) pool-size ratio (F). An expression describing the impact of B1 inaccuracies on qMT fitting parameters was derived using a sensitivity analysis. To simultaneously optimize for robustness against noise and B1 inaccuracies, the optimization condition was defined as the Cramér-Rao lower bound (CRLB) regularized by the B1-sensitivity expression for the parameter of interest (F). The qMT protocols were iteratively optimized from an initial search space, with and without B1 regularization. Three 10-point qMT protocols (Uniform, CRLB, CRLB+B1 regularization) were compared using Monte Carlo simulations for a wide range of conditions (e.g., SNR, B1 inaccuracies, tissues). The B1-regularized CRLB optimization protocol resulted in the best robustness of F against B1 errors, for a wide range of SNRs and for both white matter and gray matter tissues. For SNR = 100, this protocol resulted in errors of less than 1% in mean F values for B1 errors ranging between -10 and 20%, the range of B1 values typically observed in vivo in the human head at field strengths of 3 T and less. Both CRLB-optimized protocols resulted in the lowest σF values for all SNRs, and these did not increase in the presence of B1 inaccuracies. This work demonstrates a regularized optimization approach for improving the robustness of qMT parameters, particularly the pool-size ratio (F), to errors in auxiliary measurements (e.g., B1). Given the substantially reduced B1 sensitivity predicted for protocols optimized with this method, B1 mapping could even be omitted for qMT studies primarily interested in F. © 2018 International Society for Magnetic Resonance in Medicine.
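
    Schematically, the regularized design criterion can be written as below, where λ weights the B1-sensitivity penalty against the CRLB-predicted variance of F; the notation and exact functional form are our shorthand for the idea, not necessarily the paper's formulation.

```latex
% Schematic form of a B1-regularized CRLB design criterion over candidate qMT
% protocols P (our shorthand; lambda and the exact form are assumptions):
\mathcal{P}^{*} = \arg\min_{\mathcal{P}}\;
  \sigma^{2}_{\mathrm{CRLB}}\!\left(F;\mathcal{P}\right)
  + \lambda\left(\frac{\partial F}{\partial B_{1}}\right)^{2}.
```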

  4. Map and Data for Quaternary Faults and Fault Systems on the Island of Hawai`i

    USGS Publications Warehouse

    Cannon, Eric C.; Burgmann, Roland; Crone, Anthony J.; Machette, Michael N.; Dart, Richard L.

    2007-01-01

    This report and digitally prepared, GIS-based map is one of a series of similar products covering individual states or regions of the United States that show the locations, ages, and activity rates of major earthquake-related features such as faults and fault-related folds. It is part of a continuing effort to compile a comprehensive Quaternary fault and fold map and database for the United States, which is supported by the U.S. Geological Survey's (USGS) Earthquake Hazards Program. Guidelines for the compilation of the Quaternary fault and fold maps for the United States were published by Haller and others (1993) at the onset of this project. This compilation of Quaternary surface faulting and folding in Hawai`i is one of several similar state and regional compilations that were planned for the United States. Reports published to date include West Texas (Collins and others, 1996), New Mexico (Machette and others, 1998), Arizona (Pearthree, 1998), Colorado (Widmann and others, 1998), Montana (Stickney and others, 2000), Idaho (Haller and others, 2005), and Washington (Lidke and others, 2003). Reports for other states such as California and Alaska are still in preparation. The primary intention of this compilation is to aid in seismic-hazard evaluations. The report contains detailed information on the location and style of faulting and the time of most recent movement, and assigns each feature to a slip-rate category (as a proxy for fault activity). It also contains the name and affiliation of the compiler, the date of compilation, geographic and other paleoseismologic parameters, as well as an extensive set of references for each feature. The map (plate 1) shows faults, volcanic rift zones, and lineaments that show evidence of Quaternary surface movement related to faulting, including data on the time of most recent movement, sense of movement, slip rate, and continuity of surface expression. This compilation is presented as a digitally prepared map product and catalog of data, both in Adobe Acrobat PDF format. The senior authors (Eric C. Cannon and Roland Burgmann) compiled the fault data as part of ongoing studies of active faulting on the Island of Hawai`i. The USGS is responsible for organizing and integrating the state or regional products under their National Seismic Hazard Mapping project, including the coordination and oversight of contributions from individuals and groups (Michael N. Machette and Anthony J. Crone), database design and management (Kathleen M. Haller), and digitization and analysis of map data (Richard L. Dart). After being released as an Open-File Report, the data in this report will be available online at http://earthquake.usgs.gov/regional/qfaults/, the USGS Quaternary Fault and Fold Database of the United States.

  5. Extending Automatic Parallelization to Optimize High-Level Abstractions for Multicore

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Liao, C; Quinlan, D J; Willcock, J J

    2008-12-12

    Automatic introduction of OpenMP for sequential applications has attracted significant attention recently because of the proliferation of multicore processors and the simplicity of using OpenMP to express parallelism for shared-memory systems. However, most previous research has focused only on C and Fortran applications operating on primitive data types. C++ applications using high-level abstractions, such as STL containers and complex user-defined types, are largely ignored due to the lack of research compilers that are readily able to recognize high-level object-oriented abstractions and leverage their associated semantics. In this paper, we automatically parallelize C++ applications using ROSE, a multiple-language source-to-source compiler infrastructure which preserves the high-level abstractions and gives us access to their semantics. Several representative parallelization candidate kernels are used to explore semantic-aware parallelization strategies for high-level abstractions, combined with extended compiler analyses. Those kernels include an array-based computation loop, a loop with task-level parallelism, and a domain-specific tree traversal. Our work extends the applicability of automatic parallelization to modern applications using high-level abstractions and exposes more opportunities to take advantage of multicore processors.

  6. Online handwritten mathematical expression recognition

    NASA Astrophysics Data System (ADS)

    Büyükbayrak, Hakan; Yanikoglu, Berrin; Erçil, Aytül

    2007-01-01

    We describe a system for recognizing online, handwritten mathematical expressions. The system is designed with a user interface for writing scientific articles, supporting the recognition of basic mathematical expressions as well as integrals, summations, matrices, etc. A feed-forward neural network recognizes symbols, which are assumed to be single-stroke, and a recursive algorithm parses the expression by combining the neural network output with the structure of the expression. Preliminary results show that writer-dependent recognition rates are very high (99.8%) while writer-independent symbol recognition rates are lower (75%). The interface associated with the proposed system integrates the built-in recognition capabilities of Microsoft's Tablet PC API for recognizing textual input and supports conversion of hand-drawn figures into PNG format. This enables the user to enter text and mathematics and to draw figures in a single interface. After recognition, all output is combined into one LaTeX file and compiled into a PDF file.

  7. Analytical expressions for stability regions in the Ince-Strutt diagram of Mathieu equation

    NASA Astrophysics Data System (ADS)

    Butikov, Eugene I.

    2018-04-01

    Simple analytical expressions are suggested for transition curves that separate, in the Ince-Strutt diagram, different types of solutions to the famous Mathieu equation. The derivations of these expressions in this paper rely on physically meaningful periodic solutions describing various regular motions of a familiar nonlinear mechanical system—a rigid planar pendulum with a vertically oscillating pivot. The paper is accompanied by a relevant simulation program.
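
    For reference, the canonical Mathieu equation whose Ince-Strutt stability chart is discussed above is, in standard textbook notation (not copied from the paper),

        \frac{d^2 y}{dz^2} + \left( a - 2q\cos 2z \right) y = 0,

    where the transition curves a(q) in the (q, a) plane separate bounded from unbounded solutions; for a pendulum with a vertically oscillating pivot, a and q are fixed by the drive frequency and amplitude relative to the natural frequency.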

  8. Robust Principal Component Analysis Regularized by Truncated Nuclear Norm for Identifying Differentially Expressed Genes.

    PubMed

    Wang, Ya-Xuan; Gao, Ying-Lian; Liu, Jin-Xing; Kong, Xiang-Zhen; Li, Hai-Jun

    2017-09-01

    Identifying differentially expressed genes from among thousands of genes is a challenging task. Robust principal component analysis (RPCA) is an efficient method for identifying differentially expressed genes. The RPCA method uses the nuclear norm to approximate the rank function. However, theoretical studies have shown that the nuclear norm minimizes all singular values, so it may not be the best approximation of the rank function. The truncated nuclear norm is defined as the sum of the smaller singular values, which may achieve a better approximation of the rank function than the nuclear norm. In this paper, a novel method is proposed by replacing the nuclear norm of RPCA with the truncated nuclear norm; it is named robust principal component analysis regularized by truncated nuclear norm (TRPCA). The method decomposes the observation matrix of genomic data into a low-rank matrix and a sparse matrix. Because the significant genes can be considered sparse signals, the differentially expressed genes are viewed as sparse perturbation signals. Thus, the differentially expressed genes can be identified according to the sparse matrix. The experimental results on The Cancer Genome Atlas data illustrate that the TRPCA method outperforms other state-of-the-art methods in the identification of differentially expressed genes.
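
    A minimal sketch of the decomposition described above, using the standard definition of the truncated nuclear norm (the penalty parameter and solver are not specified here): for an observation matrix D of size m x n,

        \min_{L,S} \; \|L\|_r + \lambda \|S\|_1 \quad \text{subject to} \quad D = L + S, \qquad \|L\|_r = \sum_{i=r+1}^{\min(m,n)} \sigma_i(L),

    so that only the smallest singular values of the low-rank component L are penalized, and the sparse component S is then inspected to flag differentially expressed genes.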

  9. Oracle Database 10g: a platform for BLAST search and Regular Expression pattern matching in life sciences.

    PubMed

    Stephens, Susie M; Chen, Jake Y; Davidson, Marcel G; Thomas, Shiby; Trute, Barry M

    2005-01-01

    As database management systems expand their array of analytical functionality, they become powerful research engines for biomedical data analysis and drug discovery. Databases can hold most of the data types commonly required in life sciences and consequently can be used as flexible platforms for the implementation of knowledgebases. Performing data analysis in the database simplifies data management by minimizing the movement of data from disks to memory, allowing pre-filtering and post-processing of datasets, and enabling data to remain in a secure, highly available environment. This article describes the Oracle Database 10g implementation of BLAST and Regular Expression Searches and provides case studies of their usage in bioinformatics. http://www.oracle.com/technology/software/index.html.

  10. Volcano plots in analyzing differential expressions with mRNA microarrays.

    PubMed

    Li, Wentian

    2012-12-01

    A volcano plot displays unstandardized signal (e.g. log-fold-change) against noise-adjusted/standardized signal (e.g. t-statistic or -log10(p-value) from the t-test). We review the basic and interactive use of the volcano plot and its crucial role in understanding the regularized t-statistic. The joint filtering gene selection criterion based on regularized statistics has a curved discriminant line in the volcano plot, as compared to the two perpendicular lines for the "double filtering" criterion. This review attempts to provide a unifying framework for discussions on alternative measures of differential expression, improved methods for estimating variance, and visual display of a microarray analysis result. We also discuss the possibility of applying volcano plots to other fields beyond microarray.
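
    A minimal, self-contained Python sketch of the plot described above, using simulated data and gene-wise two-sample t-tests (all numbers are illustrative assumptions):

        import numpy as np
        import matplotlib.pyplot as plt
        from scipy import stats

        rng = np.random.default_rng(0)
        n_genes, n_per_group = 1000, 10
        # Simulated (already log-scale) expression values for two groups.
        group1 = rng.normal(0.0, 1.0, size=(n_genes, n_per_group))
        group2 = rng.normal(0.0, 1.0, size=(n_genes, n_per_group))
        group2[:50] += 1.5  # make the first 50 genes truly differentially expressed

        log_fold_change = group2.mean(axis=1) - group1.mean(axis=1)  # unstandardized signal
        t_stat, p_value = stats.ttest_ind(group2, group1, axis=1)    # standardized signal

        plt.scatter(log_fold_change, -np.log10(p_value), s=5)
        plt.xlabel("log fold-change")
        plt.ylabel("-log10(p-value)")
        plt.title("Volcano plot (simulated data)")
        plt.show()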

  11. Chimeric mitochondrial peptides from contiguous regular and swinger RNA.

    PubMed

    Seligmann, Hervé

    2016-01-01

    Previous mass spectrometry analyses described human mitochondrial peptides entirely translated from swinger RNAs, RNAs where polymerization systematically exchanged nucleotides. Exchanges follow one among 23 bijective transformation rules, nine symmetric exchanges (X ↔ Y, e.g. A ↔ C) and fourteen asymmetric exchanges (X → Y → Z → X, e.g. A → C → G → A), multiplying DNA's protein coding potential by 24. Abrupt switches from regular to swinger polymerization produce chimeric RNAs. Here, human mitochondrial proteomic analyses assuming abrupt switches between regular and swinger transcriptions detect chimeric peptides, encoded by part regular, part swinger RNA. Contiguous regular- and swinger-encoded residues within single peptides are stronger evidence for translation of swinger RNA than previously detected, entirely swinger-encoded peptides: regular parts are positive controls matched with contiguous swinger parts, increasing confidence in results. Chimeric peptides are 200 × rarer than swinger peptides (3/100,000 versus 6/1000). Among 186 peptides with > 8 residues in each of the regular and swinger parts, the regular parts of eleven chimeric peptides correspond to six of the thirteen recognized mitochondrial protein-coding genes. Chimeric peptides matching partly regular proteins are rarer and less expressed than chimeric peptides matching non-coding sequences, suggesting targeted degradation of misfolded proteins. Present results strengthen hypotheses that the short mitogenome encodes far more proteins than hitherto assumed. Entirely swinger-encoded proteins could exist.

  12. The BLAZE language: A parallel language for scientific programming

    NASA Technical Reports Server (NTRS)

    Mehrotra, P.; Vanrosendale, J.

    1985-01-01

    A Pascal-like scientific programming language, Blaze, is described. Blaze contains array arithmetic, forall loops, and APL-style accumulation operators, which allow natural expression of fine grained parallelism. It also employs an applicative or functional procedure invocation mechanism, which makes it easy for compilers to extract coarse grained parallelism using machine specific program restructuring. Thus Blaze should allow one to achieve highly parallel execution on multiprocessor architectures, while still providing the user with conceptually sequential control flow. A central goal in the design of Blaze is portability across a broad range of parallel architectures. The multiple levels of parallelism present in Blaze code, in principle, allow a compiler to extract the types of parallelism appropriate for the given architecture while neglecting the remainder. The features of Blaze are described, and it is shown how this language would be used in typical scientific programming.

  13. A powerful graphical pulse sequence programming tool for magnetic resonance imaging.

    PubMed

    Jie, Shen; Ying, Liu; Jianqi, Li; Gengying, Li

    2005-12-01

    A powerful graphical pulse sequence programming tool has been designed for creating magnetic resonance imaging (MRI) applications. It allows rapid development of pulse sequences in graphical mode (allowing for the visualization of sequences), and consists of three modules: a graphical sequence editor, a parameter management module, and a sequence compiler. Its key features are ease of use, flexibility, and hardware independence. When graphic elements are combined with certain text expressions, graphical pulse sequence programming is as flexible as a text-based programming tool. In addition, a hardware-independent design is implemented by using a two-step compilation strategy. To demonstrate the flexibility and the capability of this graphical sequence programming tool, a multi-slice fast spin echo experiment is performed on our home-made 0.3 T permanent magnet MRI system.

  14. Pituitary tumor-transforming gene 1 regulates the patterning of retinal mosaics

    PubMed Central

    Keeley, Patrick W.; Zhou, Cuiqi; Lu, Lu; Williams, Robert W.; Melmed, Shlomo; Reese, Benjamin E.

    2014-01-01

    Neurons are commonly organized as regular arrays within a structure, and their patterning is achieved by minimizing the proximity between like-type cells, but molecular mechanisms regulating this process have, until recently, been unexplored. We performed a forward genetic screen using recombinant inbred (RI) strains derived from two parental A/J and C57BL/6J mouse strains to identify genomic loci controlling spacing of cholinergic amacrine cells, which is a subclass of retinal interneuron. We found conspicuous variation in mosaic regularity across these strains and mapped a sizeable proportion of that variation to a locus on chromosome 11 that was subsequently validated with a chromosome substitution strain. Using a bioinformatics approach to narrow the list of potential candidate genes, we identified pituitary tumor-transforming gene 1 (Pttg1) as the most promising. Expression of Pttg1 was significantly different between the two parental strains and correlated with mosaic regularity across the RI strains. We identified a seven-nucleotide deletion in the Pttg1 promoter in the C57BL/6J mouse strain and confirmed a direct role for this motif in modulating Pttg1 expression. Analysis of Pttg1 KO mice revealed a reduction in the mosaic regularity of cholinergic amacrine cells, as well as horizontal cells, but not in two other retinal cell types. Together, these results implicate Pttg1 in the regulation of homotypic spacing between specific types of retinal neurons. The genetic variant identified creates a binding motif for the transcriptional activator protein 1 complex, which may be instrumental in driving differential expression of downstream processes that participate in neuronal spacing. PMID:24927528

  15. Efficiently and easily integrating differential equations with JiTCODE, JiTCDDE, and JiTCSDE

    NASA Astrophysics Data System (ADS)

    Ansmann, Gerrit

    2018-04-01

    We present a family of Python modules for the numerical integration of ordinary, delay, or stochastic differential equations. The key features are that the user enters the derivative symbolically and it is just-in-time-compiled, allowing the user to efficiently integrate differential equations from a higher-level interpreted language. The presented modules are particularly suited for large systems of differential equations such as those used to describe dynamics on complex networks. Through the selected method of input, the presented modules also allow almost complete automatization of the process of estimating regular as well as transversal Lyapunov exponents for ordinary and delay differential equations. We conceptually discuss the modules' design, analyze their performance, and demonstrate their capabilities by application to timely problems.
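
    A minimal usage sketch, assuming the interface documented for the JiTCODE package (the jitcode class, the symbolic state vector y, and SciPy-style integrator selection); the Rössler system is only an illustrative choice:

        import numpy as np
        from jitcode import jitcode, y

        # Rössler system entered symbolically; the derivative is just-in-time compiled.
        a, b, c = 0.2, 0.2, 5.7
        f = [
            -y(1) - y(2),
            y(0) + a * y(1),
            b + y(2) * (y(0) - c),
        ]

        ODE = jitcode(f)
        ODE.set_integrator("dopri5")
        ODE.set_initial_value([1.0, 0.0, 0.0], 0.0)

        times = np.arange(0.0, 100.0, 0.1)
        trajectory = np.vstack([ODE.integrate(t) for t in times])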

  16. Substance P modulation of TRPC3/7 channels improves respiratory rhythm regularity and ICAN-dependent pacemaker activity

    PubMed Central

    Ben-Mabrouk, Faiza; Tryba, Andrew Kieran

    2011-01-01

    Neuromodulators, such as Substance P (SubP), play an important role in modulating many rhythmic activities driven by central pattern generators (e.g., locomotion, respiration). However, the mechanism by which SubP enhances breathing regularity has not been determined. Here, we used mouse brainstem slices containing the pre-Bötzinger Complex (Pre-BötC) to demonstrate, for the first time, that SubP activates transient receptor protein canonical (TRPC) channels to enhance respiratory rhythm regularity. Moreover, SubP enhancement of network regularity is accomplished via selective enhancement of ICAN-dependent intrinsic bursting properties. In contrast to INaP-dependent pacemakers, ICAN-dependent pacemaker bursting activity is TRPC-dependent. Western Blots reveal that TRPC3 and TRPC7 channels are expressed in rhythmically active ventral respiratory group (VRG) island preparations. Taken together, these data suggest that SubP-mediated activation of TRPC3/7 channels underlies rhythmic ICAN-dependent pacemaker activity and enhances the regularity of respiratory rhythm activity. PMID:20345918

  17. Substance P modulation of TRPC3/7 channels improves respiratory rhythm regularity and ICAN-dependent pacemaker activity.

    PubMed

    Ben-Mabrouk, Faiza; Tryba, Andrew K

    2010-04-01

    Neuromodulators, such as substance P (SubP), play an important role in modulating many rhythmic activities driven by central pattern generators (e.g. locomotion, respiration). However, the mechanism by which SubP enhances breathing regularity has not been determined. Here, we used mouse brainstem slices containing the pre-Bötzinger complex to demonstrate, for the first time, that SubP activates transient receptor protein canonical (TRPC) channels to enhance respiratory rhythm regularity. Moreover, SubP enhancement of network regularity is accomplished via selective enhancement of ICAN (inward non-specific cation current)-dependent intrinsic bursting properties. In contrast to INaP (persistent sodium current)-dependent pacemakers, ICAN-dependent pacemaker bursting activity is TRPC-dependent. Western Blots reveal TRPC3 and TRPC7 channels are expressed in rhythmically active ventral respiratory group island preparations. Taken together, these data suggest that SubP-mediated activation of TRPC3/7 channels underlies rhythmic ICAN-dependent pacemaker activity and enhances the regularity of respiratory rhythm activity.

  18. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gydesen, S.P.

    The purpose of this letter report is to reconstruct from available information the data that can be used to develop a daily reactor operating history for 1960-1964. The information needed for source term calculations (as determined by the Source Terms Task Leader) was extracted and included in this report. The data on the amount of uranium dissolved by the separations plants (expressed both as tons and as MW) are also included in this compilation.

  19. Approximate matching of regular expressions.

    PubMed

    Myers, E W; Miller, W

    1989-01-01

    Given a sequence A and regular expression R, the approximate regular expression matching problem is to find a sequence matching R whose optimal alignment with A is the highest scoring of all such sequences. This paper develops an algorithm to solve the problem in time O(MN), where M and N are the lengths of A and R. Thus, the time requirement is asymptotically no worse than for the simpler problem of aligning two fixed sequences. Our method is superior to an earlier algorithm by Wagner and Seiferas in several ways. First, it treats real-valued costs, in addition to integer costs, with no loss of asymptotic efficiency. Second, it requires only O(N) space to deliver just the score of the best alignment. Finally, its structure permits implementation techniques that make it extremely fast in practice. We extend the method to accommodate gap penalties, as required for typical applications in molecular biology, and further refine it to search for sub-strings of A that strongly align with a sequence in R, as required for typical data base searches. We also show how to deliver an optimal alignment between A and R in only O(N + log M) space using O(MN log M) time. Finally, an O(MN(M + N) + N^2 log N) time algorithm is presented for alignment scoring schemes where the cost of a gap is an arbitrary increasing function of its length.
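
    This is not the O(MN) alignment algorithm of the paper, but for quick experiments with approximate regular-expression matching the third-party Python regex package supports fuzzy matching with an error budget (the pattern, sequence, and error bound below are illustrative assumptions):

        import regex  # third-party package, installed with: pip install regex

        sequence = "ACGTTGCACGTAGGCTTACG"
        # Find the best match of the pattern allowing up to 2 errors
        # (substitutions, insertions, or deletions).
        match = regex.search(r"(?:ACGTAGGATT){e<=2}", sequence, flags=regex.BESTMATCH)
        if match:
            print(match.span(), match.fuzzy_counts)  # counts: (substitutions, insertions, deletions)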

  20. Sulfur Dioxide Emission Rates of Kilauea Volcano, Hawaii, 1979-1997

    USGS Publications Warehouse

    Elias, Tamar; Sutton, A.J.; Stokes, J.B.; Casadevall, T.J.

    1998-01-01

    INTRODUCTION Sulfur dioxide (SO2) emission rates from Kilauea Volcano were first measured by Stoiber and Malone (1975) and have been measured on a regular basis since 1979 (Casadevall and others, 1987; Greenland and others, 1985; Elias and others, 1993; Elias and Sutton, 1996). The purpose of this report is to present a compilation of Kilauea SO2 emission rate data from 1979 through 1997 with ancillary meteorological data (wind speed and wind direction). We have included measurements previously reported by Casadevall and others (1987) for completeness and to improve the usefulness of this current database compilation. Kilauea releases SO2 gas predominantly from its summit caldera and rift zones (fig. 1). From 1979 through 1982, vehicle-based COSPEC measurements made within the summit caldera were adequate to quantify most of the SO2 emitted from the volcano. Beginning in 1983, the focus of SO2 release shifted from the summit to the east rift zone (ERZ) eruption site at Pu'u 'O'o and, later, Kupaianaha. Since 1984, the Kilauea gas measurement effort has been augmented with intermittent airborne and tripod-based surveys made near the ERZ eruption site. In addition, beginning in 1992 vehicle-based measurements have been made along a section of Chain of Craters Road approximately 9 km downwind of the eruption site. These several types of COSPEC measurements continue to the present.

  1. Tiled architecture of a CNN-mostly IP system

    NASA Astrophysics Data System (ADS)

    Spaanenburg, Lambert; Malki, Suleyman

    2009-05-01

    Multi-core architectures have been popularized with the advent of the IBM CELL. On a finer grain, the problems in scheduling multi-cores have already existed in tiled architectures, such as the EPIC and Da Vinci. It is not easy to evaluate the performance of a schedule on such an architecture as historical data are not available. One solution is to compile algorithms for which an optimal schedule is known by analysis. A typical example is an algorithm that is already defined in terms of many collaborating simple nodes, such as a Cellular Neural Network (CNN). A simple node with a local register stack together with a 'rotating wheel' internal communication mechanism has been proposed. Though the basic CNN allows for a tiled implementation of a tiled algorithm on a tiled structure, a practical CNN system will have to disturb this regularity by the additional need for arithmetical and logical operations. Arithmetic operations are needed for instance to accommodate low-level image processing, while logical operations are needed to fork and merge different data streams without use of the external memory. It is found that the 'rotating wheel' internal communication mechanism still handles such mechanisms without the need for global control. Overall the CNN system provides for a practical network size as implemented on an FPGA, can be easily used as embedded IP and provides a clear benchmark for a multi-core compiler.

  2. Oracle Database 10g: a platform for BLAST search and Regular Expression pattern matching in life sciences

    PubMed Central

    Stephens, Susie M.; Chen, Jake Y.; Davidson, Marcel G.; Thomas, Shiby; Trute, Barry M.

    2005-01-01

    As database management systems expand their array of analytical functionality, they become powerful research engines for biomedical data analysis and drug discovery. Databases can hold most of the data types commonly required in life sciences and consequently can be used as flexible platforms for the implementation of knowledgebases. Performing data analysis in the database simplifies data management by minimizing the movement of data from disks to memory, allowing pre-filtering and post-processing of datasets, and enabling data to remain in a secure, highly available environment. This article describes the Oracle Database 10g implementation of BLAST and Regular Expression Searches and provides case studies of their usage in bioinformatics. http://www.oracle.com/technology/software/index.html PMID:15608287

  3. Automatic Compilation from High-Level Biologically-Oriented Programming Language to Genetic Regulatory Networks

    PubMed Central

    Beal, Jacob; Lu, Ting; Weiss, Ron

    2011-01-01

    Background The field of synthetic biology promises to revolutionize our ability to engineer biological systems, providing important benefits for a variety of applications. Recent advances in DNA synthesis and automated DNA assembly technologies suggest that it is now possible to construct synthetic systems of significant complexity. However, while a variety of novel genetic devices and small engineered gene networks have been successfully demonstrated, the regulatory complexity of synthetic systems that have been reported recently has somewhat plateaued due to a variety of factors, including the complexity of biology itself and the lag in our ability to design and optimize sophisticated biological circuitry. Methodology/Principal Findings To address the gap between DNA synthesis and circuit design capabilities, we present a platform that enables synthetic biologists to express desired behavior using a convenient high-level biologically-oriented programming language, Proto. The high level specification is compiled, using a regulatory motif based mechanism, to a gene network, optimized, and then converted to a computational simulation for numerical verification. Through several example programs we illustrate the automated process of biological system design with our platform, and show that our compiler optimizations can yield significant reductions in the number of genes (~50%) and latency of the optimized engineered gene networks. Conclusions/Significance Our platform provides a convenient and accessible tool for the automated design of sophisticated synthetic biological systems, bridging an important gap between DNA synthesis and circuit design capabilities. Our platform is user-friendly and features biologically relevant compiler optimizations, providing an important foundation for the development of sophisticated biological systems. PMID:21850228

  4. Domain Specific Language Support for Exascale

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mellor-Crummey, John

    A multi-institutional project known as D-TEC (short for “Domain-specific Technology for Exascale Computing”) set out to explore technologies to support the construction of Domain Specific Languages (DSLs) to map application programs to exascale architectures. DSLs employ automated code transformation to shift the burden of delivering portable performance from application programmers to compilers. Two chief properties contribute: DSLs permit expression at a high level of abstraction so that a programmer’s intent is clear to a compiler, and DSL implementations encapsulate human domain-specific optimization knowledge so that a compiler can be smart enough to achieve good results on specific hardware. Domain specificity is what makes these properties possible in a programming language. If leveraging domain specificity is the key to keeping exascale software tractable, a corollary is that many different DSLs will be needed to encompass the full range of exascale computing applications; moreover, a single application may well need to use several different DSLs in conjunction. As a result, developing a general toolkit for building domain-specific languages was a key goal for the D-TEC project. Different aspects of the D-TEC research portfolio were the focus of work at each of the partner institutions in the multi-institutional project. D-TEC research and development work at Rice University focused on three principal topics: understanding how to automate the tuning of code for complex architectures, research and development of the Rosebud DSL engine, and compiler technology to support complex execution platforms. This report provides a summary of the research and development work on the D-TEC project at Rice University.

  5. Automatic compilation from high-level biologically-oriented programming language to genetic regulatory networks.

    PubMed

    Beal, Jacob; Lu, Ting; Weiss, Ron

    2011-01-01

    The field of synthetic biology promises to revolutionize our ability to engineer biological systems, providing important benefits for a variety of applications. Recent advances in DNA synthesis and automated DNA assembly technologies suggest that it is now possible to construct synthetic systems of significant complexity. However, while a variety of novel genetic devices and small engineered gene networks have been successfully demonstrated, the regulatory complexity of synthetic systems that have been reported recently has somewhat plateaued due to a variety of factors, including the complexity of biology itself and the lag in our ability to design and optimize sophisticated biological circuitry. To address the gap between DNA synthesis and circuit design capabilities, we present a platform that enables synthetic biologists to express desired behavior using a convenient high-level biologically-oriented programming language, Proto. The high level specification is compiled, using a regulatory motif based mechanism, to a gene network, optimized, and then converted to a computational simulation for numerical verification. Through several example programs we illustrate the automated process of biological system design with our platform, and show that our compiler optimizations can yield significant reductions in the number of genes (~ 50%) and latency of the optimized engineered gene networks. Our platform provides a convenient and accessible tool for the automated design of sophisticated synthetic biological systems, bridging an important gap between DNA synthesis and circuit design capabilities. Our platform is user-friendly and features biologically relevant compiler optimizations, providing an important foundation for the development of sophisticated biological systems.

  6. Twistor interpretation of slice regular functions

    NASA Astrophysics Data System (ADS)

    Altavilla, Amedeo

    2018-01-01

    Given a slice regular function f : Ω ⊂ H → H, with Ω ∩ R ≠ ∅, it is possible to lift it to surfaces in the twistor space CP3 of S4 ≃ H ∪ { ∞ } (see Gentili et al., 2014). In this paper we show that the same result is true if one removes the hypothesis Ω ∩ R ≠ ∅ on the domain of the function f. Moreover we find that if a surface S ⊂ CP3 contains the image of the twistor lift of a slice regular function, then S has to be ruled by lines. Starting from these results we find all the projective classes of algebraic surfaces up to degree 3 in CP3 that contain the lift of a slice regular function. In addition we extend and further explore the so-called twistor transform, that is, a curve in Gr2(C4) which, given a slice regular function, returns the arrangement of lines on which its twistor lift lies. With the explicit expression of the twistor lift and of the twistor transform of a slice regular function we exhibit the set of slice regular functions whose twistor transform describes a rational line inside Gr2(C4), showing the role of slice regular functions not defined on R. At the end we study the twistor lift of a particular slice regular function not defined over the reals. This example shows the effectiveness of our approach and opens some questions.

  7. Pragmatic diabetes management in nursing homes: individual care plan.

    PubMed

    Benetos, Athanase; Novella, Jean-Luc; Guerci, Bruno; Blickle, Jean-Frederic; Boivin, Jean-Marc; Cuny, Pierre; Delemer, Brigitte; Gabreau, Thierry; Jan, Philippe; Louis, Jacques; Passadori, Yves; Petit, Jean-Michel; Weryha, Georges

    2013-11-01

    Although the management of diabetes as a simple entity has been extensively developed, there is a dearth of evidence in elderly, frail patients with multiple comorbidities and polymedication. This population represents a large proportion of the residents of nursing homes (NHs). As a multidisciplinary group of French experts (geriatricians, endocrinologists, diabetologists, and general practitioners) with practical experience in this area, which is growing in magnitude throughout the world, we convened to compile pragmatic, simple advice on the management of elderly, frail diabetic patients. Given the demands on NH personnel (manager, medical coordinator, nurses, and, at the front line of care provision, the undertrained and overworked carers), coupled with the near-constant problem of high staff turnover, the foundation stone of a patient's diabetes management is an Individual Care Plan (ICP) expressed in layman's language. This document, opened on the patient's admission, aims to make sure that the prescriptions established at admission are followed, notably to ensure correct treatment and adapted, regular monitoring with dates and times when examinations and tests are due. This includes monitoring of diabetes control (HbA1c and, if necessary, blood and urine glucose) and its complications (cardiovascular disease, hypoglycemia, ocular problems, foot disorders, malnutrition, peripheral neuropathy, kidney failure). A necessary corollary is the training of staff to understand the specificities of caring for a frail patient with diabetes, what to do in a potential emergency, and how to keep the ICP up to date for consultation by doctors and nurses. Copyright © 2013 American Medical Directors Association, Inc. Published by Elsevier Inc. All rights reserved.

  8. Duty Rosters and Workloads of Obstetricians in Germany: Results of a Germany-wide Survey.

    PubMed

    Neimann, Johannes; Knabl, Julia; Puppe, Julian; Bayer, Christian Michael; Gass, Paul; Gabriel, Lena; Seelbach-Goebel, Birgit; Lermann, Johannes; Schott, Sarah

    2017-08-01

    Compiling a daily hospital roster which complies with existing laws and tariff regulations and meets the requirements for ongoing professional training, while also taking the legal regulations on the health of employees into account, makes planning the duty roster a challenge. The aim of this study was to obtain a realistic picture of existing duty roster systems and of the current workloads of obstetricians in Germany. This online survey was sent to 2770 physicians training to become obstetricians or specializing in specific areas of obstetric care. The survey consisted of an anonymized 95-item questionnaire which collected data on different types of duty roster systems and the workload of obstetricians in Germany for the period from 17.02.2015 to 16.05.2015. Out of a total of 2770 physicians who were contacted, 437 (16%) completed the questionnaire. Across all forms of care, the care provided outside normal working hours usually (75%) consisted of a combination of regular working times and on-call duty or even consisted entirely of standby duty. Level I perinatal centers were the most likely (20%, n = 88) to have a shift system in place. Working a shift system was significantly more common in care facilities which had previously carried out a job analysis. The number of physicians present in hospitals during the night shift was higher in facilities with higher numbers of births and in facilities which offered higher levels of care. In addition to regularly working overtime and the fact that often not all the hours worked were recorded, it was notable that the systems used to compile duty rosters often did not comply with legal regulations or with collectively agreed working hours, nor were they compatible with the staff planning requirements. The results of this study show that the conditions of work, the working times, and the organization of working times in obstetric departments are in need of improvement. Recording the actual times worked, together with an analysis of the activities performed during working times and while on standby, would increase the level of transparency for employers and employees.

  9. PubMed Central

    Neimann, Johannes; Knabl, Julia; Puppe, Julian; Bayer, Christian Michael; Gass, Paul; Gabriel, Lena; Seelbach-Goebel, Birgit; Lermann, Johannes; Schott, Sarah

    2017-01-01

    Background Compiling a daily hospital roster which complies with existing laws and tariff regulations and meets the requirements for ongoing professional training, while also taking the legal regulations on the health of employees into account, makes planning the duty roster a challenge. The aim of this study was to obtain a realistic picture of existing duty roster systems and of the current workloads of obstetricians in Germany. Method This online survey was sent to 2770 physicians training to become obstetricians or specializing in specific areas of obstetric care. The survey consisted of an anonymized 95-item questionnaire which collected data on different types of duty roster systems and the workload of obstetricians in Germany for the period from 17.02.2015 to 16.05.2015. Results Out of a total of 2770 physicians who were contacted, 437 (16%) completed the questionnaire. Across all forms of care, the care provided outside normal working hours usually (75%) consisted of a combination of regular working times and on-call duty or even consisted entirely of standby duty. Level I perinatal centers were the most likely (20%, n = 88) to have a shift system in place. Working a shift system was significantly more common in care facilities which had previously carried out a job analysis. The number of physicians present in hospitals during the night shift was higher in facilities with higher numbers of births and in facilities which offered higher levels of care. In addition to regularly working overtime and the fact that often not all the hours worked were recorded, it was notable that the systems used to compile duty rosters often did not comply with legal regulations or with collectively agreed working hours, nor were they compatible with the staff planning requirements. Outlook The results of this study show that the conditions of work, the working times, and the organization of working times in obstetric departments are in need of improvement. Recording the actual times worked, together with an analysis of the activities performed during working times and while on standby, would increase the level of transparency for employers and employees. PMID:28845054

  10. Cigarette Smoke and Nicotine Effects on Brain Proinflammatory Responses and Behavioral and Motor Function in HIV-1 Transgenic Rats

    PubMed Central

    Royal, Walter; Can, Adem; Gould, Todd D.; Guo, Ming; Huse, Jared; Jackson, Myles; Davis, Harry; Bryant, Joseph

    2018-01-01

    Cognitive impairment in HIV-1 infection is associated with the induction of chronic proinflammatory responses in the brains of infected individuals. The risk of HIV-related cognitive impairment is increased by cigarette smoking, which induces brain inflammation in rodent models. To better understand the role of smoking and the associated immune response on behavioral and motor function in HIV infection, wild-type F344 and HIV-1 transgenic (HIV1Tg) rats were exposed to either smoke from nicotine-containing (regular) cigarettes, smoke from nicotine-free cigarettes, or to nicotine alone. The animals were then tested using the rotarod test (RRT), the novel object recognition test (NORT), and the open field test (OFT). Subsequently, brain frontal cortex from the rats was analyzed for levels of TNF-α, IL-1, and IL-6. On the RRT, impairment was noted for F344 rats exposed to either nicotine-free cigarette smoke or nicotine alone and for F344 and HIV1Tg rats exposed to regular cigarette smoke. Effects from the exposures on the OFT were seen only for HIV1Tg rats, for which function was worse following exposure to regular cigarette smoke as compared to exposure to nicotine alone. Expression levels for all three cytokines were overall higher for HIV1Tg than for F344 rats. For HIV1Tg rats, TNF-α, IL-1, and IL-6 gene expression levels for all exposure groups were higher than for control rats. All F344 rat exposure groups also showed significantly increased TNF-α expression levels. However, for F344 rats, IL-1 expression levels were higher only for rats exposed to nicotine-free and nicotine-containing CS, and no increase in IL-6 gene expression was noted with any of the exposures as compared to controls. These studies, therefore, demonstrate that F344 and HIV1Tg rats show differential behavioral and immune effects from these exposures. These effects may potentially reflect differences in the responsiveness of the various brain regions in the two animal species as well as the result of direct toxicity mediated by the proinflammatory cytokines that are produced by HIV proteins and by other factors that are present in regular cigarette smoke. PMID:29644536

  11. Compiling software for a hierarchical distributed processing system

    DOEpatents

    Archer, Charles J; Blocksome, Michael A; Ratterman, Joseph D; Smith, Brian E

    2013-12-31

    Compiling software for a hierarchical distributed processing system, including: providing to one or more compiling nodes software to be compiled, wherein at least a portion of the software to be compiled is to be executed by one or more nodes; compiling, by the compiling node, the software; maintaining, by the compiling node, any compiled software to be executed on the compiling node; selecting, by the compiling node, one or more nodes in a next tier of the hierarchy of the distributed processing system in dependence upon whether any compiled software is for the selected node or the selected node's descendants; and sending to the selected node only the compiled software to be executed by the selected node or the selected node's descendants.
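
    A schematic Python sketch of the selection and forwarding step described in this record; the node names, tree layout, and helper functions are hypothetical, and the recursive call stands in for sending software over the network:

        # Hypothetical hierarchy: each node maps to its list of next-tier nodes.
        tree = {"root": ["a", "b"], "a": ["a1", "a2"], "b": [], "a1": [], "a2": []}

        def descendants(node):
            """All nodes below `node` in the hierarchy."""
            below = set()
            for child in tree[node]:
                below.add(child)
                below |= descendants(child)
            return below

        def distribute(node, compiled):
            """Keep software compiled for `node`; forward the rest only toward its targets."""
            kept = {target: obj for target, obj in compiled.items() if target == node}
            for child in tree[node]:
                reachable = {child} | descendants(child)
                forwarded = {t: obj for t, obj in compiled.items() if t in reachable}
                if forwarded:
                    distribute(child, forwarded)  # stands in for a network send
            return kept

        # Compiled artifacts keyed by the node that must execute them.
        distribute("root", {"root": "obj0", "a2": "obj1", "b": "obj2"})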

  12. The BLAZE language - A parallel language for scientific programming

    NASA Technical Reports Server (NTRS)

    Mehrotra, Piyush; Van Rosendale, John

    1987-01-01

    A Pascal-like scientific programming language, BLAZE, is described. BLAZE contains array arithmetic, forall loops, and APL-style accumulation operators, which allow natural expression of fine grained parallelism. It also employs an applicative or functional procedure invocation mechanism, which makes it easy for compilers to extract coarse grained parallelism using machine specific program restructuring. Thus BLAZE should allow one to achieve highly parallel execution on multiprocessor architectures, while still providing the user with conceptually sequential control flow. A central goal in the design of BLAZE is portability across a broad range of parallel architectures. The multiple levels of parallelism present in BLAZE code, in principle, allow a compiler to extract the types of parallelism appropriate for the given architecture while neglecting the remainder. The features of BLAZE are described and it is shown how this language would be used in typical scientific programming.

  13. Integrative analysis of gene expression and copy number alterations using canonical correlation analysis.

    PubMed

    Soneson, Charlotte; Lilljebjörn, Henrik; Fioretos, Thoas; Fontes, Magnus

    2010-04-15

    With the rapid development of new genetic measurement methods, several types of genetic alterations can be quantified in a high-throughput manner. While the initial focus has been on investigating each data set separately, there is an increasing interest in studying the correlation structure between two or more data sets. Multivariate methods based on Canonical Correlation Analysis (CCA) have been proposed for integrating paired genetic data sets. The high dimensionality of microarray data imposes computational difficulties, which have been addressed for instance by studying the covariance structure of the data, or by reducing the number of variables prior to applying the CCA. In this work, we propose a new method for analyzing high-dimensional paired genetic data sets, which mainly emphasizes the correlation structure and still permits efficient application to very large data sets. The method is implemented by translating a regularized CCA to its dual form, where the computational complexity depends mainly on the number of samples instead of the number of variables. The optimal regularization parameters are chosen by cross-validation. We apply the regularized dual CCA, as well as a classical CCA preceded by a dimension-reducing Principal Components Analysis (PCA), to a paired data set of gene expression changes and copy number alterations in leukemia. Using the correlation-maximizing methods, regularized dual CCA and PCA+CCA, we show that without pre-selection of known disease-relevant genes, and without using information about clinical class membership, an exploratory analysis singles out two patient groups, corresponding to well-known leukemia subtypes. Furthermore, the variables showing the highest relevance to the extracted features agree with previous biological knowledge concerning copy number alterations and gene expression changes in these subtypes. Finally, the correlation-maximizing methods are shown to yield results which are more biologically interpretable than those resulting from a covariance-maximizing method, and provide different insight compared to when each variable set is studied separately using PCA. We conclude that regularized dual CCA as well as PCA+CCA are useful methods for exploratory analysis of paired genetic data sets, and can be efficiently implemented also when the number of variables is very large.
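
    A minimal sketch of the PCA+CCA baseline mentioned above, using scikit-learn (regularized dual CCA itself is not part of scikit-learn, so only the baseline is shown; data shapes and component counts are arbitrary):

        import numpy as np
        from sklearn.decomposition import PCA
        from sklearn.cross_decomposition import CCA

        rng = np.random.default_rng(0)
        n_samples = 40
        expression = rng.normal(size=(n_samples, 5000))   # gene expression matrix
        copy_number = rng.normal(size=(n_samples, 3000))  # copy number alteration matrix

        # Reduce each data set before the CCA so the problem stays well posed.
        x = PCA(n_components=20).fit_transform(expression)
        y = PCA(n_components=20).fit_transform(copy_number)

        cca = CCA(n_components=2).fit(x, y)
        x_scores, y_scores = cca.transform(x, y)
        print(np.corrcoef(x_scores[:, 0], y_scores[:, 0])[0, 1])  # first canonical correlation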

  14. Performance Portability Strategies for Grid C++ Expression Templates

    NASA Astrophysics Data System (ADS)

    Boyle, Peter A.; Clark, M. A.; DeTar, Carleton; Lin, Meifeng; Rana, Verinder; Vaquero Avilés-Casco, Alejandro

    2018-03-01

    One of the key requirements for the Lattice QCD Application Development as part of the US Exascale Computing Project is performance portability across multiple architectures. Using the Grid C++ expression template as a starting point, we report on the progress made with regards to the Grid GPU offloading strategies. We present both the successes and issues encountered in using CUDA, OpenACC and Just-In-Time compilation. Experimentation and performance on GPUs with a SU(3)×SU(3) streaming test will be reported. We will also report on the challenges of using current OpenMP 4.x for GPU offloading in the same code.

  15. Climate: A factor in the origin of the pole blight disease of Pinus monticola Dougl

    Treesearch

    Charles D. Leaphart; Albert R. Stage

    1971-01-01

    Measurements of cores or disc samples representing slightly more than 76,000 annual rings from 336 western white pine trees were compiled to obtain a set of deviations from normal growth of healthy trees that would express the response of these trees to variation in the environment during the last 280 years. Their growth was demonstrated to be a function of temperature...

  16. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bonachea, Dan; Hargrove, P.

    GASNet is a language-independent, low-level networking layer that provides network-independent, high-performance communication primitives tailored for implementing parallel global address space SPMD languages and libraries such as UPC, UPC++, Co-Array Fortran, Legion, Chapel, and many others. The interface is primarily intended as a compilation target and for use by runtime library writers (as opposed to end users), and the primary goals are high performance, interface portability, and expressiveness. GASNet stands for "Global-Address Space Networking".

  17. Emory University: High-Throughput Protein-Protein Interaction Dataset for Lung Cancer-Associated Genes | Office of Cancer Genomics

    Cancer.gov

    To discover novel PPI signaling hubs for lung cancer, the CTD2 Center at Emory utilized large-scale genomics datasets and the literature to compile a set of lung cancer-associated genes. A library of expression vectors was generated for these genes and utilized for detecting pairwise PPIs with cell lysate-based TR-FRET assays in a high-throughput screening format. Read the abstract.

  18. Stress regularity in quasi-static perfect plasticity with a pressure dependent yield criterion

    NASA Astrophysics Data System (ADS)

    Babadjian, Jean-François; Mora, Maria Giovanna

    2018-04-01

    This work is devoted to establishing a regularity result for the stress tensor in quasi-static planar isotropic linearly elastic - perfectly plastic materials obeying a Drucker-Prager or Mohr-Coulomb yield criterion. Under suitable assumptions on the data, it is proved that the stress tensor has a spatial gradient that is locally squared integrable. As a corollary, the usual measure theoretical flow rule is expressed in a strong form using the quasi-continuous representative of the stress.
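
    For reference, a standard form of the Drucker-Prager yield criterion referred to above (textbook notation, not the paper's precise normalization):

        F(\sigma) = \sqrt{J_2(\sigma)} + \alpha\, I_1(\sigma) - k \le 0,

    where I_1 = tr(\sigma) is the first stress invariant, J_2 is the second invariant of the deviatoric stress, and \alpha, k are material parameters; the pressure dependence of the yield surface enters through the I_1 term.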

  19. User guide for MODPATH version 6 - A particle-tracking model for MODFLOW

    USGS Publications Warehouse

    Pollock, David W.

    2012-01-01

    MODPATH is a particle-tracking post-processing model that computes three-dimensional flow paths using output from groundwater flow simulations based on MODFLOW, the U.S. Geological Survey (USGS) finite-difference groundwater flow model. This report documents MODPATH version 6. Previous versions were documented in USGS Open-File Reports 89-381 and 94-464. The program uses a semianalytical particle-tracking scheme that allows an analytical expression of a particle's flow path to be obtained within each finite-difference grid cell. A particle's path is computed by tracking the particle from one cell to the next until it reaches a boundary, an internal sink/source, or satisfies another termination criterion. Data input to MODPATH consists of a combination of MODFLOW input data files, MODFLOW head and flow output files, and other input files specific to MODPATH. Output from MODPATH consists of several output files, including a number of particle coordinate output files intended to serve as input data for other programs that process, analyze, and display the results in various ways. MODPATH is written in FORTRAN and can be compiled by any FORTRAN compiler that fully supports FORTRAN-2003 or by most commercially available FORTRAN-95 compilers that support the major FORTRAN-2003 language extensions.
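
    A sketch of the kind of semianalytical expression involved, assuming the usual linear interpolation of the velocity component within a cell (Pollock's scheme); this is textbook form, not copied from the MODPATH documentation. Along each coordinate direction within a cell,

        v_x(x) = A_x (x - x_1) + v_{x1}, \qquad A_x = \frac{v_{x2} - v_{x1}}{\Delta x},

    and integrating dx/dt = v_x(x) from the particle's entry time gives

        x_p(t_1 + \Delta t) = x_1 + \frac{1}{A_x}\left[ v_x\bigl(x_p(t_1)\bigr)\, e^{A_x \Delta t} - v_{x1} \right],

    so the time at which the particle reaches a cell face can be solved for directly rather than by numerical time stepping.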

  20. Bridging scales of crustal stress patterns using the new World Stress Map

    NASA Astrophysics Data System (ADS)

    Heidbach, O.; Rajabi, M.; Cui, X.; Fuchs, K. W.; Mueller, B.; Reinecker, J.; Reiter, K.; Tingay, M. R. P.; Wenzel, F.; Xie, F.; Ziegler, M.; Zoback, M. D.; Zoback, M. L.

    2017-12-01

    Knowledge of the contemporary crustal stress field is a key parameter for the understanding of geodynamic processes such as global plate tectonics and the earthquake cycle. It is also an essential parameter for our sustainable and safe usage of Earth's resources, which is a major challenge for energy security in the 21st century. Since 1986, the World Stress Map (WSM) project has systematically compiled present-day stress information and provides a unique public domain global database. It is a long-term project based on an international network of partners from academia and industry. All data are public and available on the project website at world-stress-map.org. For the 30th anniversary of the project a new database has been compiled, containing double the amount of data records (n=42,870) including new data records from almost 4,000 deep boreholes. The new compilation focused on areas with previously sparse data coverage in order to resolve the stress pattern on different spatial scales. The significantly higher data density can now be used to resolve stress pattern heterogeneities on regional and local scales, as well as with depth in some regions. We present three results derived from the new WSM compilation: 1.) The global comparison between absolute plate motion and the mean of the orientation of maximum horizontal stress SHmax on a regular grid shows that there is still a correlation for the North and South America plate, but deviations from this general trend are now also clearly resolved. 2.) The variability of the crustal stress pattern changes when zooming in from plate-wide scale down to basin scale at 100 km. We show examples for Eastern Australia, Oklahoma and Central Europe. This regional and local variability of the stress pattern can be used as a proxy to identify and quantify regional and local stress sources by means of geomechanical-numerical models of the 3D stress tensor. 3.) Finally we present briefly the general concept of a multi-stage 3D geomechanical-numerical model workflow based on the WSM data to describe the in situ stress tensor. 3D Geomechanical-numerical modelling of the in situ stress state is essential to derive a continuous description of the stress tensor e.g. in order to estimate the distance to a critical stress state.

  1. Historical records of the geomagnetic field

    NASA Astrophysics Data System (ADS)

    Arneitz, Patrick; Heilig, Balázs; Vadasz, Gergely; Valach, Fridrich; Dolinský, Peter; Hejda, Pavel; Fabian, Karl; Hammerl, Christa; Leonhardt, Roman

    2014-05-01

    Records of historical direct measurements of the geomagnetic field are invaluable sources to reconstruct temporal variations of the Earth's magnetic field. They provide information about the field evolution back to the late Middle Age. We have investigated such records with focus on Austria and some neighbouring countries. A variety of new sources and source types are examined. These include 19th century land survey and observatory records of the Imperial and Royal "Centralanstalt f. Meteorologie und Erdmagnetismus", which are not included in the existing compilations. Daily measurements at the Imperial and Royal Observatory in Prague have been digitized. The Imperial and Royal Navy carried out observations in the Adriatic Sea during several surveys. Declination values have been collected from famous mining areas in the former Austro-Hungarian Empire. In this connection, a time series for Banska Stiavnica has been compiled. In the meteorological yearbooks of the monastery Kremsmünster regular declination measurements for the first half of the 19th century were registered. Marsigli's observations during military mapping works in 1696 are also included in our collection. Moreover, compass roses on historical maps or declination values marked on compasses, sundials or globes also provide information about ancient field declination. An evaluation of church orientations in Lower Austria and Northern Germany did not support the hypothesis that church naves had been aligned along the East-West direction by means of magnetic compasses. Therefore, this potential source of information must be excluded from our collection. The gathered records are integrated into a database together with corresponding metadata, such as the used measurement instruments and methods. This information allows an assessment of quality and reliability of the historical observations. The combination of compilations of historical measurements with high quality archeo- and paleomagnetic data in a single database enables a reliable joint evaluation of all types of magnetic field records from different origins. This collection forms the basis for a combined inverse modelling of the geomagnetic field evolution.

  2. Compilation and analysis of global surface water concentrations for individual insecticide compounds.

    PubMed

    Stehle, Sebastian; Bub, Sascha; Schulz, Ralf

    2018-10-15

    The decades-long agricultural use of insecticides has resulted in frequent contamination of surface waters globally, regularly posing high risks for aquatic biodiversity. However, the concentration levels of individual insecticide compounds have so far not been compiled and reported using global-scale data, hampering our knowledge of the insecticide exposure of aquatic ecosystems. Here, we specify measured insecticide concentrations (MICs, comprising in total 11,300 water and sediment concentrations taken from a previous publication) for 28 important insecticide compounds covering four major insecticide classes. Results show that organochlorine and organophosphate insecticides, which dominated the global insecticide market for decades, have been detected most often and at the highest concentration levels in surface waters globally. In comparison, MICs of the more recent pyrethroids and neonicotinoids were less often reported and generally at lower concentrations as a result of their later market introduction and lower application rates. An online insecticide classification calculator (ICC; available at: https://static.magic.eco/icc/v1) is provided in order to enable the comparison and classification of prospective MICs with available global insecticide concentrations. Spatial analyses of existing data show that most MICs were reported for surface waters in North America, Asia and Europe, whereas the highest concentration levels were detected in Africa, Asia and South America. An evaluation of water and sediment MICs showed that theoretical organic carbon-water partition coefficients (KOC) determined in the laboratory overestimated KOC values based on actual field concentrations by up to a factor of more than 20, with the highest deviations found for highly sorptive pyrethroids. Overall, the comprehensive compilation of insecticide field concentrations presented here is a valuable tool for the classification of future surface water monitoring results and serves as important input data for more field-relevant toxicity testing approaches and pesticide exposure and risk assessment schemes. Copyright © 2018 Elsevier B.V. All rights reserved.

  3. Young People’s Use of E-Cigarettes across the United Kingdom: Findings from Five Surveys 2015–2017

    PubMed Central

    Bauld, Linda; MacKintosh, Anne Marie; Eastwood, Brian; Ford, Allison; Moore, Graham; Dockrell, Martin; Arnott, Deborah; Cheeseman, Hazel

    2017-01-01

    Concern has been expressed about the use of e-cigarettes among young people. Our study reported ever and regular use of e-cigarettes and tobacco cigarettes among 11–16 year olds across the UK. Data came from five large-scale surveys with different designs and sampling strategies conducted between 2015 and 2017: the Youth Tobacco Policy Survey; the Schools Health Research Network Wales survey; two Action on Smoking and Health (ASH) Smokefree Great Britain-Youth Surveys; and the Scottish Schools Adolescent Lifestyle and Substance Use Survey. Cumulatively these surveys collected data from over 60,000 young people. For 2015/16 data for 11–16 year olds: ever smoking ranged from 11% to 20%; regular (at least weekly) smoking between 1% and 4%; ever use of e-cigarettes 7% to 18%; regular (at least weekly) use 1% to 3%; among never smokers, ever e-cigarette use ranged from 4% to 10% with regular use between 0.1% and 0.5%; among regular smokers, ever e-cigarette use ranged from 67% to 92% and regular use 7% to 38%. The ASH surveys showed a rise in the prevalence of ever use of e-cigarettes from 7% (2016) to 11% (2017), but the prevalence of regular use did not change, remaining at 1%. In summary, surveys across the UK show a consistent pattern: most e-cigarette experimentation does not turn into regular use, and levels of regular use in young people who have never smoked remain very low. PMID:28850065

  4. Indirect effects of immunological tolerance to a regular dietary protein reduce cutaneous scar formation.

    PubMed

    Cantaruti, Thiago Anselmo; Costa, Raquel Alves; de Souza, Kênia Soares; Vaz, Nelson Monteiro; Carvalho, Cláudia Rocha

    2017-07-01

    Oral tolerance refers to the specific inhibition of immune responsiveness to T-cell-dependent antigens contacted through the oral route before parenteral immunization. Oral tolerance to one protein does not inhibit immune responses to other unrelated proteins, but parenteral injection of tolerated antigens plus adjuvant into tolerant, but not normal, mice inhibits immune responses to antigens injected concomitantly or soon thereafter. The inhibitory effect triggered by parenteral injection of tolerated proteins is known as bystander suppression or indirect effects of oral tolerance. Intraperitoneal injection of ovalbumin (OVA) plus alum adjuvant in OVA-tolerant mice soon before skin injury inhibits inflammation and improves cutaneous wound healing. However, as OVA is not a regular component of mouse chow, we tested whether indirect effects could be triggered by zein, the main protein of corn, which is regularly present in mouse chow. We show that intraperitoneal injection of a single dose (10 μg) of zein plus alum adjuvant soon before skin injury in mice reduces leucocyte infiltration but increases the number of T cells and the expression of resistin-like molecule-α (a marker of alternatively activated macrophages) in the wound bed, increases the expression of transforming growth factor-β3 in the newly formed epidermis, and reduces cutaneous scar formation. These results suggest that indirect effects of oral tolerance triggered by parenteral injection of regular dietary components may be further explored as one alternative way to promote scarless wound healing. © 2017 John Wiley & Sons Ltd.

  5. Selenium modulates MMP2 expression through the TGFβ1/Smad signalling pathway in human umbilical vein endothelial cells and rabbits following lipid disturbance.

    PubMed

    Xu, Chenggui; Lu, Guihua; Li, Qinglang; Zhang, Juhong; Huang, Zhibin; Gao, Xiuren

    2017-07-01

    A high-fat diet is a major risk factor for coronary heart diseases. Matrix metalloprotease (MMP) expression is changed in many cardiovascular diseases. Selenium, which is an important trace element in animals, has a close relationship with cardiovascular diseases. The TGFβ1/Smad signalling pathway is ubiquitous in diverse tissues and cells, and it is also associated with the occurrence and development of cardiovascular diseases. Therefore, in this study, we aimed to determine selenium's effect on lipid metabolism, atherosclerotic plaque formation, and MMP2 expression, as well as the underlying functional mechanism. In vivo tests: 24 male New Zealand white rabbits were randomly divided into 4 groups: regular diet, high-fat diet, high-fat diet+selenium and regular diet+selenium groups. The high-fat diet induced the lipid disturbances of rabbits at week 12. Selenium supplementation lowered total cholesterol (TC), low-density lipoprotein cholesterol (LDL-C) and triglyceride (TG) levels (p<0.01). Selenium supplementation also suppressed MMP2 over-expression in thoracic aortas. In vitro tests: Human umbilical vein endothelial cells (HUVECs) were treated with different concentrations of selenium or ox-LDL. Ox-LDL promoted MMP2 expression by increasing TGFβ1, pSmad2, pSmad3 and Smad3 expression (p<0.01). Selenium attenuated MMP2 over-expression by regulating the TGFβ1/Smad signalling pathway. Selenium suppressed high-fat diet-induced MMP2 over-expression in vivo by improving lipid metabolism. In vitro, selenium attenuated MMP2 over-expression through the TGFβ1/Smad signalling pathway. Copyright © 2017 Elsevier GmbH. All rights reserved.

  6. Aspirin and the Risk of Colorectal Cancer in Relation to the Expression of 15-Hydroxyprostaglandin Dehydrogenase (15-PGDH, HPGD)

    PubMed Central

    Fink, Stephen P.; Yamauchi, Mai; Nishihara, Reiko; Jung, Seungyoun; Kuchiba, Aya; Wu, Kana; Cho, Eunyoung; Giovannucci, Edward; Fuchs, Charles S.; Ogino, Shuji; Markowitz, Sanford D.; Chan, Andrew T.

    2014-01-01

    Aspirin use reduces the risk of colorectal neoplasia, at least in part, through inhibition of prostaglandin-endoperoxide synthase 2 (PTGS2, cyclooxygenase 2)-related pathways. Hydroxyprostaglandin dehydrogenase 15-(NAD) (15-PGDH, HPGD) is downregulated in colorectal cancers and functions as a metabolic antagonist of PTGS2. We hypothesized that the effect of aspirin may be antagonized by low 15-PGDH expression in the normal colon. In the Nurses’ Health Study and the Health Professionals Follow-up Study, we collected data on aspirin use and other risk factors every two years and followed up participants for diagnoses of colorectal cancer. Duplication-method Cox proportional, multivariable-adjusted, cause-specific hazards regression for competing risks data was used to compute hazard ratios (HRs) for incident colorectal cancer according to 15-PGDH mRNA expression level measured in normal mucosa from colorectal cancer resections. Among 127,865 participants, we documented 270 colorectal cancer cases that developed during 3,166,880 person-years of follow-up and from which we could assess 15-PGDH expression. Compared with nonuse, regular aspirin use was associated with lower risk of colorectal cancer that developed within a background of colonic mucosa with high 15-PGDH expression (multivariable HR=0.49; 95% CI, 0.34–0.71), but not with low 15-PGDH expression (multivariable HR=0.90; 95% CI, 0.63–1.27) (P for heterogeneity=0.018). Regular aspirin use was associated with lower incidence of colorectal cancers arising in association with high 15-PGDH expression, but not with low 15-PGDH expression in normal colon mucosa. This suggests that 15-PGDH expression level in normal colon mucosa may serve as a biomarker which may predict stronger benefit from aspirin chemoprevention. PMID:24760190

  7. Emerge - A Python environment for the modeling of subsurface transfers

    NASA Astrophysics Data System (ADS)

    Lopez, S.; Smai, F.; Sochala, P.

    2014-12-01

    The simulation of subsurface mass and energy transfers often relies on specific codes that were mainly developed using compiled languages, which usually ensure computational efficiency at the expense of relatively long development times and relatively rigid software. Even if a very detailed, possibly graphical, user interface is developed, the core numerical aspects are rarely accessible and the smallest modification will always need a compilation step. Thus, user-defined physical laws or alternative numerical schemes may be relatively difficult to use. Over the last decade, Python has emerged as a popular and widely used language in the scientific community. There already exist several libraries for the pre- and post-treatment of input and output files for reservoir simulators (e.g. pytough). Development times in Python are considerably reduced compared to compiled languages, and programs can be easily interfaced with libraries written in compiled languages; several comprehensive numerical libraries provide sequential and parallel solvers (e.g. PETSc, Trilinos…). The core objective of the Emerge project is to explore the possibility of developing a modeling environment in full Python. Consequently, we are developing an open Python package with the classes/objects necessary to express, discretize and solve the physical problems encountered in the modeling of subsurface transfers. We heavily relied on Python to have a convenient and concise way of manipulating potentially complex concepts with a few lines of code and a high level of abstraction. Our result aims to be a friendly numerical environment targeting both numerical engineers and physicists or geoscientists, with the possibility to quickly specify and handle geometries, arbitrary meshes, spatially or temporally varying properties, PDE formulations, boundary conditions…
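    To give a flavour of the kind of compact, high-level workflow such a Python environment targets, the sketch below discretizes and solves a one-dimensional diffusion problem with NumPy only. It is purely illustrative and is not the Emerge API; the grid size, time step and diffusivity are arbitrary assumptions.

      # Illustrative sketch only -- NOT the Emerge package API.
      # Backward-Euler finite differences for 1-D diffusion: du/dt = D * d2u/dx2.
      import numpy as np

      nx, dx, dt, D = 100, 1.0, 0.5, 2.0   # grid size, spacing, time step, diffusivity (assumed)
      u = np.zeros(nx)
      u[0] = 1.0                           # fixed (Dirichlet) value on the left boundary

      r = D * dt / dx**2
      A = np.eye(nx) * (1.0 + 2.0 * r)     # system matrix I - dt*D*Laplacian
      A += np.diag(np.full(nx - 1, -r), 1) + np.diag(np.full(nx - 1, -r), -1)
      A[0, :] = 0.0; A[0, 0] = 1.0         # boundary rows kept as identity
      A[-1, :] = 0.0; A[-1, -1] = 1.0

      for _ in range(200):                 # implicit time stepping
          u = np.linalg.solve(A, u)
      print(u[:5])                         # diffusion front entering the domain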

  8. HOPE: A Python just-in-time compiler for astrophysical computations

    NASA Astrophysics Data System (ADS)

    Akeret, J.; Gamper, L.; Amara, A.; Refregier, A.

    2015-04-01

    The Python programming language is becoming increasingly popular for scientific applications due to its simplicity, versatility, and the broad range of its libraries. A drawback of this dynamic language, however, is its low runtime performance which limits its applicability for large simulations and for the analysis of large data sets, as is common in astrophysics and cosmology. While various frameworks have been developed to address this limitation, most focus on covering the complete language set, and either force the user to alter the code or are not able to reach the full speed of an optimised native compiled language. In order to combine the ease of Python and the speed of C++, we developed HOPE, a specialised Python just-in-time (JIT) compiler designed for numerical astrophysical applications. HOPE focuses on a subset of the language and is able to translate Python code into C++ while performing numerical optimisation on mathematical expressions at runtime. To enable the JIT compilation, the user only needs to add a decorator to the function definition. We assess the performance of HOPE by performing a series of benchmarks and compare its execution speed with that of plain Python, C++ and the other existing frameworks. We find that HOPE improves the performance compared to plain Python by a factor of 2 to 120, achieves speeds comparable to that of C++, and often exceeds the speed of the existing solutions. We discuss the differences between HOPE and the other frameworks, as well as future extensions of its capabilities. The fully documented HOPE package is available at http://hope.phys.ethz.ch and is published under the GPLv3 license on PyPI and GitHub.
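    As the abstract notes, enabling the JIT compilation only requires adding a decorator to the function definition. The minimal sketch below follows that usage pattern; the decorator name hope.jit and the PyPI package name are taken from the HOPE paper and should be checked against the current documentation.

      # Minimal decorator-based JIT usage sketch (decorator name assumed to be hope.jit).
      import numpy as np
      import hope   # assumes the package has been installed, e.g. via pip

      @hope.jit     # the first call triggers translation of the function body to C++
      def add_scaled(a, b, alpha):
          return a + alpha * b   # numerical expression optimised at runtime

      x = np.arange(1_000_000.0)
      y = np.ones(1_000_000)
      print(add_scaled(x, y, 2.5)[:3])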

  9. Gene expression based mouse brain parcellation using Markov random field regularized non-negative matrix factorization

    NASA Astrophysics Data System (ADS)

    Pathak, Sayan D.; Haynor, David R.; Thompson, Carol L.; Lein, Ed; Hawrylycz, Michael

    2009-02-01

    Understanding the geography of genetic expression in the mouse brain has opened previously unexplored avenues in neuroinformatics. The Allen Brain Atlas (www.brain-map.org) (ABA) provides genome-wide colorimetric in situ hybridization (ISH) gene expression images at high spatial resolution, all mapped to a common three-dimensional 200 μm³ spatial framework defined by the Allen Reference Atlas (ARA), and is a unique data set for studying expression-based structural and functional organization of the brain. The goal of this study was to facilitate an unbiased data-driven structural partitioning of the major structures in the mouse brain. We have developed an algorithm that uses non-negative matrix factorization (NMF) to perform parts-based analysis of ISH gene expression images. The standard NMF approach and its variants are limited in their ability to flexibly integrate prior knowledge in the context of spatial data. In this paper, we introduce spatial connectivity as an additional regularization in the NMF decomposition via the use of Markov Random Fields (mNMF). The mNMF algorithm alternates neighborhood updates with iterations of the standard NMF algorithm to exploit spatial correlations in the data. We present the algorithm and show the subdivisions of the hippocampus and somatosensory cortex obtained via this approach. The results are compared with established neuroanatomic knowledge. We also highlight novel gene-expression-based subdivisions of the hippocampus identified using the mNMF algorithm.
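    The core idea, alternating standard NMF updates with a spatial neighbourhood step, can be sketched in a few lines. The toy example below is a schematic illustration only, not the authors' mNMF implementation: the data matrix is random, the neighbourhood is a simple 1-D chain, and the smoothing weight is an arbitrary assumption.

      # Schematic NMF with a crude spatial-smoothness step standing in for the MRF update.
      import numpy as np

      rng = np.random.default_rng(0)
      n_voxels, n_genes, rank = 200, 50, 4
      V = rng.random((n_voxels, n_genes))      # toy "voxel x gene" expression matrix
      W = rng.random((n_voxels, rank))         # spatial components
      H = rng.random((rank, n_genes))          # gene loadings
      eps, lam = 1e-9, 0.1                     # numerical floor and smoothing weight (assumed)

      for _ in range(100):
          H *= (W.T @ V) / (W.T @ W @ H + eps)         # standard multiplicative updates
          W *= (V @ H.T) / (W @ H @ H.T + eps)
          # neighbourhood step: pull each voxel's weights toward its chain neighbours
          W_smooth = (np.roll(W, 1, axis=0) + np.roll(W, -1, axis=0)) / 2.0
          W = (1.0 - lam) * W + lam * W_smooth         # convex combination stays non-negative

      print(np.linalg.norm(V - W @ H))                 # reconstruction error after fitting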

  10. CAFE: an R package for the detection of gross chromosomal abnormalities from gene expression microarray data.

    PubMed

    Bollen, Sander; Leddin, Mathias; Andrade-Navarro, Miguel A; Mah, Nancy

    2014-05-15

    The current methods available to detect chromosomal abnormalities from DNA microarray expression data are cumbersome and inflexible. CAFE has been developed to alleviate these issues. It is implemented as an R package that analyzes Affymetrix *.CEL files and comes with flexible plotting functions, easing visualization of chromosomal abnormalities. CAFE is available from https://bitbucket.org/cob87icW6z/cafe/ as both source and compiled packages for Linux and Windows. It is released under the GPL version 3 license. CAFE will also be freely available from Bioconductor. sander.h.bollen@gmail.com or nancy.mah@mdc-berlin.de Supplementary data are available at Bioinformatics online.

  11. Stromal and epithelial cells react differentially to c-kit in fibroepithelial tumors of the breast.

    PubMed

    Logullo, Angela F; Nonogaki, Suely; Do Socorro Maciel, Maria; Mourão-Neto, Mário; Soares, Fernando Augusto

    2008-01-01

    The CD117 protein is a tyrosine-kinase receptor encoded by the c-kit gene that frequently bears activating mutations in gastrointestinal tumors. Conflicting findings regarding CD117 expression in other stromal tumors, including phyllodes tumors (PTs), have been reported in the literature. The purpose of this study was to evaluate c-kit expression in the stroma and epithelia of fibroepithelial breast tumors and its correlation with clinical pathological variables. Ninety-six fibroepithelial tumors of the breast, including 14 fibroadenomas (FAs), 12 juvenile FAs and 70 PTs, were classified according to stromal cellularity, atypia, epithelial hyperplasia, mitosis and borders into 45 benign (PTB), 17 borderline (PTBL) and 8 malignant (PTM) tumors. CD117 expression was identified in the stromal component in only two cases of PTBL. Overall, 38 cases (39.6%) showed positive CD117 in the epithelial component, including 20 FAs (10 regular, 10 juvenile) and 18 PTs (11 PTBs and 8 PTBLs). Other cases, including all PTMs, 6 FAs (4 regular, 2 juvenile), 34 PTBs and 10 PTBLs, showed no positivity in the epithelial component. Expression of c-kit did not correlate with diagnosis or malignancy (p>0.05). In conclusion, c-kit is expressed more often in the epithelial than in the stromal component in fibroepithelial tumors of the breast, and is associated with benign lesions.

  12. Ada (Trade Name) Compiler Validation Summary Report. Harris Corporation, HARRIS Ada Compiler, Version 1.0, Harris H1200 and H800.

    DTIC Science & Technology

    1987-04-30

    ADA (TRADENAME) COMPILER VALIDATION SUMMARY REPORT / HARRIS CORPORATION (U) INFORMATION SYSTEMS AND TECHNOLOGY CENTER, W-P AFB OH ... Ada Compiler Validation Summary Report: 30 APR 1986 to 30 APR 1987, Harris Corporation, HARRIS Ada Compiler, Version 1.0, Harris H1200 and H800 ... the United States Government (Ada Joint Program Office). Ada Compiler Validation Summary Report: Compiler Name: HARRIS Ada Compiler, Version 1.0. Host ...

  13. Ada (Tradename) Compiler Validation Summary Report. Harris Corporation. Harris Ada Compiler, Version 1.0. Harris H700 and H60.

    DTIC Science & Technology

    1986-06-28

    Report: 28 JUN 1986 to 28 JUN 1987, Harris Corporation, HARRIS Ada Compiler, Version 1.0, Harris H700 and H60. PERFORMING ORG. REPORT ... CLASSIFICATION OF THIS PAGE (When Data Entered) ... SUPPLEMENTARY NOTES: Ada® Compiler Validation Summary Report: Compiler Name: HARRIS Ada Compiler ... AVF-VSR-43.1086, Ada® COMPILER VALIDATION SUMMARY REPORT: Harris Corporation, HARRIS Ada Compiler, Version 1.0, Harris H700 and H60. Completion of

  14. CIL: Compiler Implementation Language

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gries, David

    1969-03-01

    This report is a manual for the proposed Compiler Implementation Language, CIL. It is not an expository paper on the subject of compiler writing or compiler-compilers. The language definition may change as work progresses on the project. It is designed for writing compilers for the IBM 360 computers.

  15. Maximal volume behind horizons without curvature singularity

    NASA Astrophysics Data System (ADS)

    Wang, Shao-Jun; Guo, Xin-Xuan; Wang, Towe

    2018-01-01

    The black hole information paradox is related to the area of event horizon, and potentially to the volume and singularity behind it. One example is the complexity/volume duality conjectured by Stanford and Susskind. Accepting the proposal of Christodoulou and Rovelli, we calculate the maximal volume inside regular black holes, which are free of curvature singularity, in asymptotically flat and anti-de Sitter spacetimes respectively. The complexity/volume duality is then applied to anti-de Sitter regular black holes. We also present an analytical expression for the maximal volume outside the de Sitter horizon.

  16. Generating code adapted for interlinking legacy scalar code and extended vector code

    DOEpatents

    Gschwind, Michael K

    2013-06-04

    Mechanisms for intermixing code are provided. Source code is received for compilation using an extended Application Binary Interface (ABI) that extends a legacy ABI and uses a different register configuration than the legacy ABI. First compiled code is generated based on the source code, the first compiled code comprising code for accommodating the difference in register configurations used by the extended ABI and the legacy ABI. The first compiled code and second compiled code are intermixed to generate intermixed code, the second compiled code being compiled code that uses the legacy ABI. The intermixed code comprises at least one call instruction that is one of a call from the first compiled code to the second compiled code or a call from the second compiled code to the first compiled code. The code for accommodating the difference in register configurations is associated with the at least one call instruction.

  17. Nonclassical states of light with a smooth P function

    NASA Astrophysics Data System (ADS)

    Damanet, François; Kübler, Jonas; Martin, John; Braun, Daniel

    2018-02-01

    There is a common understanding in quantum optics that nonclassical states of light are states that do not have a positive semidefinite and sufficiently regular Glauber-Sudarshan P function. Almost all known nonclassical states have P functions that are highly irregular, which makes working with them difficult and direct experimental reconstruction impossible. Here we introduce classes of nonclassical states with regular, non-positive-definite P functions. They are constructed by "puncturing" regular smooth positive P functions with negative Dirac-δ peaks or other sufficiently narrow smooth negative functions. We determine the parameter ranges for which such punctures are possible without losing the positivity of the state, the regimes yielding antibunching of light, and the expressions of the Wigner functions for all investigated punctured states. Finally, we propose some possible experimental realizations of such states.

  18. Modeling Regular Replacement for String Constraint Solving

    NASA Technical Reports Server (NTRS)

    Fu, Xiang; Li, Chung-Chih

    2010-01-01

    Bugs in user input sanitization of software systems often lead to vulnerabilities. Many of them are caused by improper use of regular replacement. This paper presents a precise modeling of various semantics of regular substitution, such as the declarative, finite, greedy, and reluctant, using finite state transducers (FST). By projecting an FST to its input/output tapes, we are able to solve atomic string constraints, which can be applied to both the forward and backward image computation in model checking and symbolic execution of text processing programs. We report several interesting discoveries, e.g., certain fragments of the general problem can be handled using less expressive deterministic FST. A compact representation of FST is implemented in SUSHI, a string constraint solver. It is applied to detecting vulnerabilities in web applications.
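    The greedy versus reluctant distinction that the paper models can be seen directly in everyday regex replacement. The short Python example below is illustrative only (it uses the built-in re module, not the SUSHI solver) and contrasts the two quantifier semantics.

      # Greedy vs. reluctant (lazy) replacement semantics with Python's re module.
      import re

      text = "<b>bold</b> and <i>italic</i>"

      greedy = re.sub(r"<.*>", "", text)       # ".*" matches as much as possible:
      reluctant = re.sub(r"<.*?>", "", text)   # the single greedy match spans all tags

      print(repr(greedy))      # '' -- everything between the first '<' and the last '>' is removed
      print(repr(reluctant))   # 'bold and italic' -- each tag is removed separately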

  19. High-fat diet amplifies renal renin angiotensin system expression, blood pressure elevation, and renal dysfunction caused by Ceacam1 null deletion.

    PubMed

    Li, Caixia; Culver, Silas A; Quadri, Syed; Ledford, Kelly L; Al-Share, Qusai Y; Ghadieh, Hilda E; Najjar, Sonia M; Siragy, Helmy M

    2015-11-01

    Carcinoembryonic antigen-related cell adhesion molecule 1 (CEACAMl), a substrate of the insulin receptor tyrosine kinase, regulates insulin action by promoting insulin clearance. Global null mutation of Ceacam1 gene (Cc1(-/-)) results in features of the metabolic syndrome, including insulin resistance, hyperinsulinemia, visceral adiposity, elevated blood pressure, and albuminuria. It also causes activation of the renal renin-angiotensin system (RAS). In the current study, we tested the hypothesis that high-fat diet enhances the expression of RAS components. Three-month-old wild-type (Cc1(+/+)) and Cc1(-/-) mice were fed either a regular or a high-fat diet for 8 wk. At baseline under regular feeding conditions, Cc1(-/-) mice exhibited higher blood pressure, urine albumin-to-creatinine ratio (UACR), and renal expression of angiotensinogen, renin/prorenin, angiotensin-converting enzyme, (pro)renin receptor, angiotensin subtype AT1 receptor, angiotensin II, and elevated PI3K phosphorylation, as detected by p85α (Tyr(508)) immunostaining, inflammatory response, and the expression of collagen I and collagen III. In Cc1(+/+) mice, high-fat diet increased blood pressure, UACR, the expression of angiotensin-converting enzyme and angiotensin II, PI3K phosphorylation, inflammatory response, and the expression of collagen I and collagen III. In Cc1(-/-) mice, high-fat intake further amplified these parameters. Immunohistochemical staining showed increased p-PI3K p85α (Tyr(508)) expression in renal glomeruli, proximal, distal, and collecting tubules of Cc1(-/-) mice fed a high-fat diet. Together, this demonstrates that high-fat diet amplifies the permissive effect of Ceacam1 deletion on renal expression of all RAS components, PI3K phosphorylation, inflammation, and fibrosis. Copyright © 2015 the American Physiological Society.

  20. Regular paths in SparQL: querying the NCI Thesaurus.

    PubMed

    Detwiler, Landon T; Suciu, Dan; Brinkley, James F

    2008-11-06

    OWL, the Web Ontology Language, provides syntax and semantics for representing knowledge for the semantic web. Many of the constructs of OWL have a basis in the field of description logics. While the formal underpinnings of description logics have led to a highly computable language, this has come at a cognitive cost: OWL ontologies are often unintuitive to readers lacking a strong logic background. In this work we describe GLEEN, a regular path expression library, which extends the RDF query language SparQL to support complex path expressions over OWL and other RDF-based ontologies. We illustrate the utility of GLEEN by showing how it can be used in a query-based approach to defining simpler, more intuitive views of OWL ontologies. In particular we show how relatively simple GLEEN-enhanced SparQL queries can create views of the OWL version of the NCI Thesaurus that match the views generated by the web-based NCI browser.

  1. Word-of-Mouth Innovation: Hypothesis Generation for Supplement Repurposing based on Consumer Reviews.

    PubMed

    Fan, Jung-Wei; Lussier, Yves A

    2017-01-01

    Dietary supplements remain a relatively underexplored source for drug repurposing. A systematic approach to soliciting responses from a large consumer population is desirable to speed up innovation. We tested a workflow that mines unexpected benefits of dietary supplements from massive consumer reviews. A (non-exhaustive) list of regular expressions was used to screen over 2 million reviews on health and personal care products. The matched reviews were manually analyzed, and one supplement-disease pair was linked to biological databases for enriching the hypothesized association. The regular expressions found 169 candidate reviews, of which 45.6% described unexpected benefits of certain dietary supplements. The manual analysis showed some of the supplement-disease associations to be novel or in agreement with evidence published later in the literature. The hypothesis enrichment was able to identify meaningful function similarity between the supplement and the disease. The results demonstrated value of the workflow in identifying candidates for supplement repurposing.
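    A toy version of such a screening step is sketched below. The patterns are invented for illustration and are not the authors' (non-exhaustive) list; any realistic screen would use a much larger pattern set and corpus.

      # Hypothetical screening sketch: flag reviews hinting at an unexpected benefit.
      import re

      patterns = [
          re.compile(r"\b(unexpected(ly)?|surpris(ed|ingly))\b.*\b(help(ed|s)?|improv(ed|es))\b", re.I),
          re.compile(r"\bside benefit\b", re.I),
      ]
      reviews = [
          "Took this for joint pain, and surprisingly it helped my sleep.",
          "Tastes awful, would not buy again.",
          "Nice side benefit: my skin cleared up.",
      ]
      candidates = [r for r in reviews if any(p.search(r) for p in patterns)]
      print(candidates)   # the first and third reviews are flagged for manual analysis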

  2. Grouped gene selection and multi-classification of acute leukemia via new regularized multinomial regression.

    PubMed

    Li, Juntao; Wang, Yanyan; Jiang, Tao; Xiao, Huimin; Song, Xuekun

    2018-05-09

    Diagnosing acute leukemia is the necessary prerequisite to treating it. Multi-classification of acute leukemia gene expression data, which covers B-cell acute lymphoblastic leukemia (BALL), T-cell acute lymphoblastic leukemia (TALL) and acute myeloid leukemia (AML), is helpful for diagnosis. However, selecting cancer-causing genes is a challenging problem in performing multi-classification. In this paper, weighted gene co-expression networks are employed to divide the genes into groups. Based on these groups, a new regularized multinomial regression with an overlapping group lasso penalty (MROGL) is presented to simultaneously perform multi-classification and select gene groups. By implementing this method on three-class acute leukemia data, the grouped genes which work synergistically are identified, and the overlapping genes shared by different groups are also highlighted. Moreover, MROGL outperforms five other methods on multi-classification accuracy. Copyright © 2017. Published by Elsevier B.V.
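    For readers unfamiliar with the penalty, the overlapping group lasso term is just a weighted sum of l2 norms over (possibly overlapping) gene groups. The sketch below evaluates that penalty for made-up coefficients and groups; it is a generic illustration, not the MROGL estimation code.

      # Generic overlapping group lasso penalty (illustrative only).
      import numpy as np

      beta = np.array([0.5, -1.2, 0.0, 0.3, 2.0, 0.0])   # coefficients for 6 genes (toy values)
      groups = [[0, 1, 2], [2, 3], [3, 4, 5]]            # genes 2 and 3 each belong to two groups

      def overlapping_group_lasso(beta, groups, lam=1.0):
          # sum over groups of sqrt(group size) * ||beta restricted to the group||_2
          return lam * sum(np.sqrt(len(g)) * np.linalg.norm(beta[g]) for g in groups)

      print(overlapping_group_lasso(beta, groups))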

  3. A Standard of Knowledge for the Professional Practice of Toxicology

    PubMed Central

    Kinter, Lewis B.; Kelman, Bruce

    2015-01-01

    Background Employers, courts, and the general public judge the credibility of professionals based on credentials such as academic degrees, publications, memberships in professional organizations, board certifications, and professional registrations. However, the relevance and merit of these credentials can be difficult to determine objectively. Board certification can be a reliable indicator of proficiency if the certifying organization demonstrates, through regularly scheduled independent review, that its processes meet established standards and when a certificate holder is required to periodically demonstrate command of a body of knowledge that is essential to current professional practice. Objective We report herein a current Standard of Knowledge in general toxicology compiled from the experience and opinions of 889 certified practicing professional toxicologists. Discussion An examination is the most commonly used instrument for testing a certification candidate’s command of the body of knowledge. However, an examination-based certification is only creditable when the body of knowledge, to which a certification examination tests, is representative of the current knowledge, skills, and capabilities needed to effectively practice at the professional level. Thus, that body of knowledge must be the current “Standard of Knowledge” for the profession, compiled in a transparent fashion from current practitioners of the profession. Conclusion This work was conducted toward ensuring the scientific integrity of the products produced by professional toxicologists. Citation Hulla JE, Kinter LB, Kelman B. 2015. A Standard of Knowledge for the professional practice of toxicology. Environ Health Perspect 123:743–748; http://dx.doi.org/10.1289/ehp.1408643 PMID:25782181

  4. Integrating EarthScope Data to Constrain the Long-Term Effects of Tectonism on Continental Lithosphere

    NASA Astrophysics Data System (ADS)

    Porter, R. C.; van der Lee, S.

    2017-12-01

    One of the most significant products of the EarthScope experiment has been the development of new seismic tomography models that take advantage of the consistent station design, regular 70-km station spacing, and wide aperture of the EarthScope Transportable Array (TA) network. These models have led to the discovery and interpretation of additional compositional, thermal, and density anomalies throughout the continental US, especially within tectonically stable regions. The goal of this work is to use data from the EarthScope experiment to better elucidate the temporal relationship between tectonic activity and seismic velocities. To accomplish this, we compile several upper-mantle seismic velocity models from the Incorporated Research Institutions for Seismology (IRIS) Earth Model Collaboration (EMC) and compare these to a tectonic age model we compiled using geochemical ages from the Interdisciplinary Earth Data Alliance: EarthChem Database. Results from this work confirm quantitatively that the time elapsed since the most recent tectonic event is a dominant influence on seismic velocities within the upper mantle across North America. To further understand this relationship, we apply mineral-physics models for peridotite to estimate upper-mantle temperatures for the continental US from tomographically imaged shear velocities. This work shows that the relationship between the estimated temperatures and the time elapsed since the most recent tectonic event is broadly consistent with plate cooling models, yet shows intriguing scatter. Ultimately, this work constrains the long-term thermal evolution of continental mantle lithosphere.

  5. Differential roles of nonsynaptic and synaptic plasticity in operant reward learning-induced compulsive behavior.

    PubMed

    Sieling, Fred; Bédécarrats, Alexis; Simmers, John; Prinz, Astrid A; Nargeot, Romuald

    2014-05-05

    Rewarding stimuli in associative learning can transform the irregularly and infrequently generated motor patterns underlying motivated behaviors into output for accelerated and stereotyped repetitive action. This transition to compulsive behavioral expression is associated with modified synaptic and membrane properties of central neurons, but establishing the causal relationships between cellular plasticity and motor adaptation has remained a challenge. We found previously that changes in the intrinsic excitability and electrical synapses of identified neurons in Aplysia's central pattern-generating network for feeding are correlated with a switch to compulsive-like motor output expression induced by in vivo operant conditioning. Here, we used specific computer-simulated ionic currents in vitro to selectively replicate or suppress the membrane and synaptic plasticity resulting from this learning. In naive in vitro preparations, such experimental manipulation of neuronal membrane properties alone increased the frequency but not the regularity of feeding motor output found in preparations from operantly trained animals. On the other hand, changes in synaptic strength alone switched the regularity but not the frequency of feeding output from naive to trained states. However, simultaneously imposed changes in both membrane and synaptic properties reproduced both major aspects of the motor plasticity. Conversely, in preparations from trained animals, experimental suppression of the membrane and synaptic plasticity abolished the increase in frequency and regularity of the learned motor output expression. These data establish direct causality for the contributions of distinct synaptic and nonsynaptic adaptive processes to complementary facets of a compulsive behavior resulting from operant reward learning. Copyright © 2014 Elsevier Ltd. All rights reserved.

  6. Human action recognition with group lasso regularized-support vector machine

    NASA Astrophysics Data System (ADS)

    Luo, Huiwu; Lu, Huanzhang; Wu, Yabei; Zhao, Fei

    2016-05-01

    The bag-of-visual-words (BOVW) and Fisher kernel are two popular models in human action recognition, and the support vector machine (SVM) is the most commonly used classifier for the two models. We identify two kinds of group structure in the feature representations constructed by BOVW and the Fisher kernel, respectively; such structural information can be seen as a prior for the classifier and can improve its performance, as has been verified in several areas. However, the standard SVM employs L2-norm regularization in its learning procedure, which penalizes each variable individually and cannot express the structural information of the feature representation. We replace the L2-norm regularization in the standard SVM with group lasso regularization, and a group lasso regularized-support vector machine (GLRSVM) is proposed. Then, we embed the group structural information of the feature representation into GLRSVM. Finally, we introduce an algorithm to solve the optimization problem of GLRSVM by the alternating direction method of multipliers. The experiments evaluated on the KTH, YouTube, and Hollywood2 datasets show that our method achieves promising results and improves on the state-of-the-art methods on the KTH and YouTube datasets.
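    When such a group-lasso-regularized objective is solved with ADMM, the key subproblem is the proximal operator of the group lasso penalty, i.e. blockwise soft-thresholding. The sketch below shows that operator for non-overlapping groups; the groups and threshold are assumed values and this is not the authors' GLRSVM solver.

      # Blockwise soft-thresholding: proximal operator of a (non-overlapping) group lasso penalty.
      import numpy as np

      def prox_group_lasso(v, groups, tau):
          out = np.zeros_like(v)
          for g in groups:
              norm = np.linalg.norm(v[g])
              if norm > tau:
                  out[g] = (1.0 - tau / norm) * v[g]   # shrink the whole block toward zero
              # otherwise the block is set exactly to zero (group-level sparsity)
          return out

      v = np.array([3.0, -4.0, 0.1, 0.2, 5.0])
      groups = [[0, 1], [2, 3], [4]]                    # assumed non-overlapping feature groups
      print(prox_group_lasso(v, groups, tau=1.0))       # -> [ 2.4 -3.2  0.   0.   4. ]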

  7. 1/n Expansion for the Number of Matchings on Regular Graphs and Monomer-Dimer Entropy

    NASA Astrophysics Data System (ADS)

    Pernici, Mario

    2017-08-01

    Using a 1/n expansion, that is, an expansion in descending powers of n, for the number of matchings in regular graphs with 2n vertices, we study the monomer-dimer entropy for two classes of graphs. We study the difference between the extensive monomer-dimer entropy of a random r-regular graph G (bipartite or not) with 2n vertices and the average extensive entropy of r-regular graphs with 2n vertices, in the limit n → ∞. We find a series expansion for it in the numbers of cycles; with probability 1 it converges for dimer density p < 1 and, for G bipartite, it diverges as |ln(1-p)| for p → 1. In the case of regular lattices, we similarly expand the difference between the specific monomer-dimer entropy on a lattice and the one on the Bethe lattice; we write down its Taylor expansion in powers of p through order 10, expressed in terms of the number of totally reducible walks which are not tree-like. We prove through order 6 that its expansion coefficients in powers of p are non-negative.

  8. Ada (Trade Name) Compiler Validation Summary Report: Harris Corporation Harris Ada Compiler, Version 1.3 Harris HCX-7.

    DTIC Science & Technology

    1987-06-03

    Harris Corp. Harris Ada Compiler, Ver. 1.3, Harris HCX-7. PERFORMING ORG. REPORT NUMBER ... AUTHOR(s) ... CONTRACT OR GRANT ... VALIDATION SUMMARY REPORT: Harris Corporation, Harris Ada Compiler, Version 1.3, Harris HCX-7. Completion of On-Site Testing: 3 June 1987. Prepared ... Place NTIS form here ... Ada® Compiler Validation Summary Report: Compiler Name: Harris Ada Compiler, Version 1.3. Host: Target: Harris

  9. Low power adder based auditory filter architecture.

    PubMed

    Rahiman, P F Khaleelur; Jayanthi, V S

    2014-01-01

    Cochlear devices are battery powered and should have a long working life so that the devices do not need to be replaced at regular intervals of years. Hence, devices with low power consumption are required. Cochlear devices contain numerous filters, each responsible for signals in a particular frequency band, which helps in identifying speech signals of different audible ranges. In this paper, a multiplierless lookup table (LUT) based auditory filter is implemented. Power-aware adder architectures are utilized to add the output samples of the LUT, available at every clock cycle. The design is developed and modeled using Verilog HDL, simulated using the Mentor Graphics ModelSim simulator, and synthesized using the Synopsys Design Compiler tool. The design was mapped to the TSMC 65 nm technology node. The standard ASIC design methodology has been adopted to carry out the power analysis. The proposed FIR filter architecture reduced the leakage power by 15% and increased performance by 2.76%.

  10. Regional Land Use Mapping: the Phoenix Pilot Project

    NASA Technical Reports Server (NTRS)

    Anderson, J. R.; Place, J. L.

    1971-01-01

    The Phoenix Pilot Program has been designed to make effective use of past experience in making land use maps and collecting land use information. Conclusions reached from the project are: (1) Land use maps and accompanying statistical information of reasonable accuracy and quality can be compiled at a scale of 1:250,000 from orbital imagery. (2) Orbital imagery used in conjunction with other sources of information when available can significantly enhance the collection and analysis of land use information. (3) Orbital imagery combined with modern computer technology will help resolve the problem of obtaining land use data quickly and on a regular basis, which will greatly enhance the usefulness of such data in regional planning, land management, and other applied programs. (4) Agreement on a framework or scheme of land use classification for use with orbital imagery will be necessary for effective use of land use data.

  11. Impacts of curricular change: Implications from 8 years of data in introductory physics

    NASA Astrophysics Data System (ADS)

    Pollock, Steven J.; Finkelstein, Noah

    2013-01-01

    Introductory calculus-based physics classes at the University of Colorado Boulder were significantly transformed beginning in 2004. They now regularly include: interactive engagement using clickers in large lecture settings, Tutorials in Introductory Physics with use of undergraduate Learning Assistants in recitation sections, and a staffed help-room setting where students work on personalized CAPA homework. We compile and summarize conceptual (FMCE and BEMA) pre- and post-data from over 9,000 unique students after 16 semesters of both Physics 1 and 2. Within a single institution with stable pre-test scores, we reproduce results of Hake's 1998 study that demonstrate the positive impacts of interactive engagement on student performance. We link the degree of faculty's use of interactive engagement techniques and their experience levels on student outcomes, and argue for the role of such systematic data collection in sustained course and institutional transformations.

  12. On the accuracy of ERS-1 orbit predictions

    NASA Technical Reports Server (NTRS)

    Koenig, Rolf; Li, H.; Massmann, Franz-Heinrich; Raimondo, J. C.; Rajasenan, C.; Reigber, C.

    1993-01-01

    Since the launch of ERS-1, the D-PAF (German Processing and Archiving Facility) has regularly provided orbit predictions for the worldwide SLR (Satellite Laser Ranging) tracking network. The weekly distributed orbital elements are so-called tuned IRVs and tuned SAO elements. The tuning procedure, designed to improve the accuracy of the recovery of the orbit at the stations, is discussed based on numerical results. This shows that tuning of elements is essential for ERS-1 with the currently applied tracking procedures. The orbital elements are updated by daily distributed time bias functions. The generation of the time bias function is explained. Problems and numerical results are presented. The time bias function increases the prediction accuracy considerably. Finally, the quality assessment of ERS-1 orbit predictions is described. The accuracy is compiled for about 250 days since launch. The average accuracy lies in the range of 50-100 ms and has improved considerably.

  13. The Funding of Long-Term Care in Canada: What Do We Know, What Should We Know?

    PubMed

    Grignon, Michel; Spencer, Byron G

    2018-06-01

    Long-term care is a growing component of health care spending, but how much is spent or who bears the cost is uncertain, and the measures vary depending on the source used. We drew on regularly published series and ad hoc publications to compile preferred estimates of the share of long-term care spending in total health care spending, the private share of long-term care spending, and the share of residential care within long-term care. For each series, we compared estimates obtainable from published sources (CIHI [Canadian Institute for Health Information] and OECD [Organization for Economic Cooperation and Development]) with our preferred estimates. We conclude that using published series without adjustment would lead to spurious conclusions on the level and evolution of spending on long-term care in Canada as well as on the distribution of costs between private and public funders and between residential and home care.

  14. Meeting the challenges of clinical information provision.

    PubMed

    Spring, Hannah

    2017-12-01

    This virtual issue of the Health Information and Libraries Journal (HILJ) has been compiled to mark the 5th International Clinical Librarian Conference 2011. In considering the challenges of clinical information provision, the content selected for the virtual issue offers an international flavour of clinical information provision and covers a variety of different facets of clinical librarianship. The issue broadly covers the areas of information needs and preferences, clinical librarian roles and services, and education and training, and reflects the way in which a normal issue of the HILJ would be presented. This includes a review article, a collection of original articles, and the three regular features which comprise International Perspectives and Initiatives, Learning and Teaching in Action, and Using Evidence in Practice. All papers included in this virtual issue are available free online. © 2011 The authors. Health Information and Libraries Journal © 2011 Health Libraries Group.

  15. Computing and Using Metrics in the ADS

    NASA Astrophysics Data System (ADS)

    Henneken, E. A.; Accomazzi, A.; Kurtz, M. J.; Grant, C. S.; Thompson, D.; Luker, J.; Chyla, R.; Holachek, A.; Murray, S. S.

    2015-04-01

    Finding measures for research impact, be it for individuals, institutions, instruments, or projects, has gained a lot of popularity. There are more papers written than ever on new impact measures, and problems with existing measures are being pointed out on a regular basis. Funding agencies require impact statistics in their reports, job candidates incorporate them in their resumes, and publication metrics have even been used in at least one recent court case. To support this need for research impact indicators, the SAO/NASA Astrophysics Data System (ADS) has developed a service that provides a broad overview of various impact measures. In this paper we discuss how the ADS can be used to quench the thirst for impact measures. We will also discuss a couple of the lesser-known indicators in the metrics overview and the main issues to be aware of when compiling publication-based metrics in the ADS, namely author name ambiguity and citation incompleteness.

  16. Analysis tools for the interplay between genome layout and regulation.

    PubMed

    Bouyioukos, Costas; Elati, Mohamed; Képès, François

    2016-06-06

    Genome layout and gene regulation appear to be interdependent. Understanding this interdependence is key to exploring the dynamic nature of chromosome conformation and to engineering functional genomes. Evidence for non-random genome layout, defined as the relative positioning of either co-functional or co-regulated genes, stems from two main approaches. Firstly, the analysis of contiguous genome segments across species has highlighted the conservation of gene arrangement (synteny) along chromosomal regions. Secondly, the study of long-range interactions along a chromosome has emphasised regularities in the positioning of microbial genes that are co-regulated, co-expressed or evolutionarily correlated. While one-dimensional pattern analysis is a mature field, it is often powerless on biological datasets, which tend to be incomplete and partly incorrect. Moreover, there is a lack of comprehensive, user-friendly tools to systematically analyse, visualise, integrate and exploit regularities along genomes. Here we present the Genome REgulatory and Architecture Tools SCAN (GREAT:SCAN) software for the systematic study of the interplay between genome layout and gene expression regulation. SCAN is a collection of related and interconnected applications currently able to perform systematic analyses of genome regularities as well as to improve transcription factor binding site (TFBS) and gene regulatory network predictions based on gene positional information. We demonstrate the capabilities of these tools by studying, on the one hand, the regular patterns of genome layout in the major regulons of the bacterium Escherichia coli. On the other hand, we demonstrate the capabilities to improve TFBS prediction in microbes. Finally, we highlight, by visualisation of multivariate techniques, the interplay between position and sequence information for effective transcription regulation.

  17. Pythran: enabling static optimization of scientific Python programs

    NASA Astrophysics Data System (ADS)

    Guelton, Serge; Brunet, Pierrick; Amini, Mehdi; Merlini, Adrien; Corbillon, Xavier; Raynaud, Alan

    2015-01-01

    Pythran is an open source static compiler that turns modules written in a subset of Python language into native ones. Assuming that scientific modules do not rely much on the dynamic features of the language, it trades them for powerful, possibly inter-procedural, optimizations. These optimizations include detection of pure functions, temporary allocation removal, constant folding, Numpy ufunc fusion and parallelization, explicit thread-level parallelism through OpenMP annotations, false variable polymorphism pruning, and automatic vector instruction generation such as AVX or SSE. In addition to these compilation steps, Pythran provides a C++ runtime library that leverages the C++ STL to provide generic containers, and the Numeric Template Toolbox for Numpy support. It takes advantage of modern C++11 features such as variadic templates, type inference, move semantics and perfect forwarding, as well as classical idioms such as expression templates. Unlike the Cython approach, Pythran input code remains compatible with the Python interpreter. Output code is generally as efficient as the annotated Cython equivalent, if not more, but without the backward compatibility loss.
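    A minimal usage sketch is shown below: the module stays plain Python and a comment annotation exports the function for compilation. The #pythran export comment form follows the Pythran documentation, but the exact type-specification syntax and command-line invocation should be verified against the release in use.

      # dprod.py -- a Pythran-style module sketch (export syntax assumed from the docs).
      #pythran export dprod(float list, float list)
      def dprod(a, b):
          # ordinary Python: the same file still runs under the CPython interpreter
          return sum(x * y for x, y in zip(a, b))

    Compiling the file (for example with the pythran command-line tool) would produce a native extension module, while the unmodified source keeps working with the plain interpreter, which is the backward-compatibility point made in the abstract.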

  18. River-quality assessment of the Truckee and Carson River system, California and Nevada; hydrologic characteristics

    USGS Publications Warehouse

    Brown, W. M.; Nowlin, J.O.; Smith, L.H.; Flint, M.R.

    1986-01-01

    A study of the Truckee and Carson Rivers was begun in October 1978 to assess the cause and effect relations between human and natural actions, and the quality of water at different times and places along the rivers. This report deals with the compilation of basic hydrologic data and the presentation of some of the new data collected during the study. Topographic, flow, and chemical data, data from recent time-of-travel studies, and new data on river mileages and drainage areas that were determined using new, high-resolution maps, are included. The report is a guide to locating maps, aerial photographs, computer files, and reports that relate to the rivers and their basins. It describes methods for compiling and expressing hydrologic information for ease of reading and understanding by the many users of water-related data. Text, tabular data, and colored plates with detailed maps and hydrographs are extensively cross-referenced. (USGS)

  19. Diversity of human lip prints: a collaborative study of ethnically distinct world populations.

    PubMed

    Sharma, Namita Alok; Eldomiaty, Magda Ahmed; Gutiérrez-Redomero, Esperanza; George, Adekunle Olufemi; Garud, Rajendra Somnath; Sánchez-Andrés, Angeles; Almasry, Shaima Mohamed; Rivaldería, Noemí; Al-Gaidi, Sami Awda; Ilesanmi, Toyosi

    2014-01-01

    Cheiloscopy is a comparatively recent counterpart to the long established dactyloscopic studies. Ethnic variability of these lip groove patterns has not yet been explored. This study was a collaborative effort aimed at establishing cheiloscopic variations amongst modern human populations from four geographically and culturally far removed nations: India, Saudi Arabia, Spain and Nigeria. Lip prints from a total of 754 subjects were collected and each was divided into four equal quadrants. The patterns were classified into six regular types (A-F), while some patterns which could not be fitted into the regular ones were segregated into G groups (G-0, G-1, G-2). Furthermore, co-dominance of more than one pattern type in a single quadrant forced us to identify the combination (COM, G-COM) patterns. The remarkable feature noted after compilation of the data included pattern C (a bifurcate/branched prototype extending the entire height of the lip) being a frequent feature of the lips of all the populations studied, save for the Nigerian population in which it was completely absent and which showed a tendency for pattern A (a vertical linear groove) and a significantly higher susceptibility for combination (COM) patterns. Chi-square test and correspondence analysis applied to the frequency of patterns appearing in the defined topographical areas indicated a significant variation for the populations studied.

  20. Expression, function and regulation of mouse cytochrome P450 enzymes: comparison with human P450 enzymes.

    PubMed

    Hrycay, E G; Bandiera, S M

    2009-12-01

    The present review focuses on the expression, function and regulation of mouse cytochrome P450 (Cyp) enzymes. Information compiled for mouse Cyp enzymes is compared with data collected for human CYP enzymes. To date, approximately 40 pairs of orthologous mouse-human CYP genes have been identified that encode enzymes performing similar metabolic functions. Recent knowledge concerning the tissue expression of mouse Cyp enzymes from families 1 to 51 is summarized. The catalytic activities of microsomal, mitochondrial and recombinant mouse Cyp enzymes are discussed and their involvement in the metabolism of exogenous and endogenous compounds is highlighted. The role of nuclear receptors, such as the aryl hydrocarbon receptor, constitutive androstane receptor and pregnane X receptor, in regulating the expression of mouse Cyp enzymes is examined. Targeted disruption of selected Cyp genes has generated numerous Cyp null mouse lines used to decipher the role of Cyp enzymes in metabolic, toxicological and biological processes. In conclusion, the laboratory mouse is an indispensable model for exploring human CYP-mediated activities.

  1. Healthy late preterm infants and supplementary artificial milk feeds: effects on breast feeding and associated clinical parameters.

    PubMed

    Mattsson, Elisabet; Funkquist, Eva-Lotta; Wickström, Maria; Nyqvist, Kerstin H; Volgsten, Helena

    2015-04-01

    To compare the influence of supplementary artificial milk feeds on breast feeding and certain clinical parameters among healthy late preterm infants given regular supplementary artificial milk feeds versus being exclusively breast fed from birth. A comparative study using quantitative methods. Data were collected via a parental diary and medical records. Parents of 77 late preterm infants (34 5/7-36 6/7 weeks), whose mothers intended to breast feed, completed a diary during the infants' hospital stay. Infants who received regular supplementary artificial milk feeds experienced a longer delay before initiation of breast feeding, were breast fed less frequently and had longer hospital stays than infants exclusively breast fed from birth. Exclusively breast-fed infants had a greater weight loss than infants with regular artificial milk supplementation. A majority of the mothers (65%) with an infant prescribed artificial milk never expressed their milk, and among the mothers who used a breast-pump, milk expression commenced late (10-84 hours after birth). At discharge, all infants were breast fed to some extent, and 43% were exclusively breast fed. Clinical practice and routines influence the initiation of breast feeding among late preterm infants and may act as barriers to the mothers' establishment of exclusive breast feeding. Copyright © 2015 Elsevier Ltd. All rights reserved.

  2. Functional analysis of alternative transcripts of the soybean Rj2 gene that restricts nodulation with specific rhizobial strains.

    PubMed

    Tang, F; Yang, S; Zhu, H

    2016-05-01

    The Rj2 gene is a TIR-NBS-LRR-type resistance gene in soybean (Glycine max) that restricts root nodule symbiosis with a group of Bradyrhizobium japonicum strains including USDA122. Rj2 generates two distinct transcript variants in its expression profile through alternative splicing. Alternative splicing of Rj2 is caused by the retention of the 86-bp intron 4. Inclusion of intron 4 in mature mRNA introduces an in-frame stop codon; as such, the alternative transcript is predicted to encode a truncated protein consisting of the entire portion of the TIR, NBS and LRR domains but missing the C-terminal domain of the full-length Rj2 protein encoded by the regular transcript. Since alternative splicing has been shown to be essential for full activity of several plant R genes, we attempted to test whether the alternative splicing is required for Rj2-mediated nodulation restriction. Here we demonstrated that the Rj2-mediated nodulation restriction does not require the combined presence of the regular and alternative transcripts, and the expression of the regular transcript alone is sufficient to confer nodulation restriction. © 2016 German Botanical Society and The Royal Botanical Society of the Netherlands.

  3. Test battery for measuring the perception and recognition of facial expressions of emotion

    PubMed Central

    Wilhelm, Oliver; Hildebrandt, Andrea; Manske, Karsten; Schacht, Annekathrin; Sommer, Werner

    2014-01-01

    Despite the importance of perceiving and recognizing facial expressions in everyday life, there is no comprehensive test battery for the multivariate assessment of these abilities. As a first step toward such a compilation, we present 16 tasks that measure the perception and recognition of facial emotion expressions, and data illustrating each task's difficulty and reliability. The scoring of these tasks focuses on either the speed or accuracy of performance. A sample of 269 healthy young adults completed all tasks. In general, accuracy and reaction time measures for emotion-general scores showed acceptable and high estimates of internal consistency and factor reliability. Emotion-specific scores yielded lower reliabilities, yet high enough to encourage further studies with such measures. Analyses of task difficulty revealed that all tasks are suitable for measuring emotion perception and emotion recognition related abilities in normal populations. PMID:24860528

  4. Fos Promotes Early Stage Teno-Lineage Differentiation of Tendon Stem/Progenitor Cells in Tendon.

    PubMed

    Chen, Jialin; Zhang, Erchen; Zhang, Wei; Liu, Zeyu; Lu, Ping; Zhu, Ting; Yin, Zi; Backman, Ludvig J; Liu, Huanhuan; Chen, Xiao; Ouyang, Hongwei

    2017-11-01

    Stem cells have been widely used in tendon tissue engineering. The lack of a refined and controlled differentiation strategy hampers tendon repair and regeneration. This study aimed to find new effective differentiation factors for stepwise tenogenic differentiation. By microarray screening, the transcription factor Fos was found to be expressed in significantly higher amounts in postnatal Achilles tendon tissue derived from 1-day-old as compared with 7-day-old rats. It was further confirmed that expression of Fos decreased with time in postnatal rat Achilles tendon, which was accompanied by the decreased expression of multiple tendon markers. The expression of Fos also declined during regular in vitro cell culture, which corresponded to the loss of tendon phenotype. In a cell-sheet and a three-dimensional cell culture model, the expression of Fos was upregulated as compared with regular cell culture, together with the recovery of tendon phenotype. In addition, significantly higher expression of tendon markers was found in Fos-overexpressing tendon stem/progenitor cells (TSPCs), and Fos knock-down gave opposite results. In situ rat tendon repair experiments found more normal tendon-like tissue formed and higher tendon marker expression at 4 weeks post implantation of the Fos-overexpressing TSPC-derived non-scaffold engineered tendon (cell sheet), as compared with the control group. This study identifies Fos as a new marker and functional driver in the early-stage teno-lineage differentiation of tendon, which paves the way for effective stepwise tendon differentiation and future tendon regeneration. Stem Cells Translational Medicine 2017;6:2009-2019. © 2017 The Authors Stem Cells Translational Medicine published by Wiley Periodicals, Inc. on behalf of AlphaMed Press.

  5. Carrie Eckert | NREL

    Science.gov Websites

    tools, gene knockout/expression, synthetic biology, Clustered Regularly Interspaced Short Palindromic Photosynthesis CO metabolism Education Ph.D., Molecular Biology, University of Colorado, Anschutz Campus, 2001 -2006 B.S., Biology, University of South Dakota, 1995-1999 Professional Experience Director, Center for

  6. Formal Compiler Implementation in a Logical Framework

    DTIC Science & Technology

    2003-04-29

    variable set [], we omit the brackets and use the simpler notation v. MetaPRL is a tactic-based prover that uses OCaml [20] as its meta-language. When a...rewrite is defined in MetaPRL, the framework creates an OCaml expression that can be used to apply the rewrite. Code to guide the application of...rewrites is written in OCaml , using a rich set of primitives provided by MetaPRL. MetaPRL automates the construction of most guidance code; we describe

  7. Analysis-Based Verification: A Programmer-Oriented Approach to the Assurance of Mechanical Program Properties

    DTIC Science & Technology

    2010-05-27

    programming language, threads can only communicate through fields and this assertion prohibits an alias to the object under construction from being written...1.9. We call this type of reporting “compiler-like” in the sense that the descriptive message output by the tool has to communicate the semantics of...way to communicate a “need” for further annotation to the tool user because a precise expression of both the location and content of the needed

  8. SCMOS (Scalable Complementary Metal Oxide Silicon) Silicon Compiler Organelle Design and Insertion.

    DTIC Science & Technology

    1987-12-01

    polysilicon running horizontally), with the p-type toward Vdd and the n-type toward GND. * Substrate contacts are connected by metal to supply rails...IN’) + (CIN’) Note: The single quote (’) represents the ’not’ of the variable. Figure 2.3 Logic Expressions. * First metal and polysilicon are... polysilicon. * All external connections to I/O, CLOCK, Vdd and GND end at least 2 units past first metal that is not an I/O point. * All external

  9. Easing The Calculation Of Bolt-Circle Coordinates

    NASA Technical Reports Server (NTRS)

    Burley, Richard K.

    1995-01-01

    Bolt Circle Calculation (BOLT-CALC) computer program used to reduce the significant time consumed in manually computing the trigonometry of the rectangular Cartesian coordinates of holes in a bolt circle as shown on a blueprint or drawing. Eliminates the risk of computational errors, particularly in cases involving many holes or in cases in which coordinates are expressed to many significant digits. Program assists in many practical situations arising in machine shops. Written in BASIC. Also successfully compiled and implemented by use of Microsoft's QuickBasic v4.0.

  10. Robust and sparse correlation matrix estimation for the analysis of high-dimensional genomics data.

    PubMed

    Serra, Angela; Coretto, Pietro; Fratello, Michele; Tagliaferri, Roberto; Stegle, Oliver

    2018-02-15

    Microarray technology can be used to study the expression of thousands of genes across a number of different experimental conditions, usually hundreds. The underlying principle is that genes sharing similar expression patterns across different samples can be part of the same co-expression system, or they may share the same biological functions. Groups of genes are usually identified based on cluster analysis. Clustering methods rely on the similarity matrix between genes. A common choice to measure similarity is to compute the sample correlation matrix. Dimensionality reduction is another popular data analysis task that is also based on covariance/correlation matrix estimates. Unfortunately, covariance/correlation matrix estimation suffers from the intrinsic noise present in high-dimensional data. Sources of noise are sampling variations, the presence of outlying sample units, and the fact that in most cases the number of genes is much larger than the number of sample units. In this paper, we propose a robust correlation matrix estimator that is regularized based on adaptive thresholding. The resulting method jointly tames the effects of high dimensionality and data contamination. Computations are easy to implement and do not require hand tuning. Both simulated and real data are analyzed. A Monte Carlo experiment shows that the proposed method achieves remarkable performance. Our correlation metric is more robust to outliers than the existing alternatives on two gene expression datasets. It is also shown how the regularization allows spurious correlations to be detected and filtered automatically. The same regularization is also extended to other, less robust correlation measures. Finally, we apply the ARACNE algorithm to the SyNTreN gene expression data. Sensitivity and specificity of the reconstructed network are compared with the gold standard. We show that ARACNE performs better when it takes the proposed correlation matrix estimator as input. The R software is available at https://github.com/angy89/RobustSparseCorrelation. aserra@unisa.it or robtag@unisa.it. Supplementary data are available at Bioinformatics online. © The Author (2017). Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com
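
    The general idea described above — a more outlier-resistant correlation estimate followed by entry-wise thresholding to suppress spurious entries — can be sketched in a few lines. The sketch below is not the authors' estimator; it uses a rank-based (Spearman) correlation and a simple quantile cutoff as an illustrative stand-in for their adaptive threshold.

    ```python
    import numpy as np
    from scipy.stats import spearmanr

    def thresholded_spearman(X, q=0.90):
        """Rank-based correlation matrix with entry-wise hard thresholding.

        X : (n_samples, n_genes) expression matrix.
        q : quantile of the absolute off-diagonal correlations used as the
            cutoff (an illustrative stand-in for a data-adaptive threshold).
        """
        R, _ = spearmanr(X)                 # rank correlation, less outlier-sensitive
        R = np.atleast_2d(R)
        off = np.abs(R[~np.eye(R.shape[0], dtype=bool)])
        cutoff = np.quantile(off, q)        # simple adaptive cutoff
        R_sparse = np.where(np.abs(R) >= cutoff, R, 0.0)
        np.fill_diagonal(R_sparse, 1.0)     # keep unit diagonal
        return R_sparse

    # Example on synthetic data: 100 samples, 50 genes
    X = np.random.default_rng(0).normal(size=(100, 50))
    R_hat = thresholded_spearman(X)
    ```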

  11. Testing-Based Compiler Validation for Synchronous Languages

    NASA Technical Reports Server (NTRS)

    Garoche, Pierre-Loic; Howar, Falk; Kahsai, Temesghen; Thirioux, Xavier

    2014-01-01

    In this paper we present a novel lightweight approach to validate compilers for synchronous languages. Instead of verifying a compiler for all input programs or providing a fixed suite of regression tests, we extend the compiler to generate a test-suite with high behavioral coverage and geared towards discovery of faults for every compiled artifact. We have implemented and evaluated our approach using a compiler from Lustre to C.
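
    The validation idea summarized above — exercising each compiled artifact and checking its behaviour against the source-level semantics — can be illustrated with a generic differential-testing harness. This sketch is not the paper's tool; `reference_step` and `compiled_step` are hypothetical stand-ins for the Lustre-level semantics and the generated C code wrapped for calling from Python, and a real harness would also reset the internal state of both implementations between trials.

    ```python
    import random

    def validate(reference_step, compiled_step, n_inputs, steps=100, trials=50):
        """Differential testing of a reactive (synchronous) program.

        reference_step, compiled_step : callables mapping a tuple of Boolean
            inputs to a tuple of outputs, called once per reaction.
        Returns (True, None) if no divergence is found, otherwise
        (False, counterexample).
        """
        rng = random.Random(0)
        for t in range(trials):
            # (a real harness would reset the state of both implementations here)
            for k in range(steps):
                ins = tuple(rng.choice([True, False]) for _ in range(n_inputs))
                if reference_step(ins) != compiled_step(ins):
                    return False, (t, k, ins)   # diverging reaction found
        return True, None
    ```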

  12. Particle motion and Penrose processes around rotating regular black hole

    NASA Astrophysics Data System (ADS)

    Abdujabbarov, Ahmadjon

    2016-07-01

    The motion of neutral particles around a rotating regular black hole, derived from the Ayón-Beato-García (ABG) black hole solution by the Newman-Janis algorithm in the preceding paper (Toshmatov et al., Phys. Rev. D, 89:104017, 2014), has been studied. The dependence of the ISCO (innermost stable circular orbit along geodesics) and of unstable orbits on the value of the electric charge of the rotating regular black hole has been shown. Energy extraction from the rotating regular black hole through various processes has been examined. We have found an expression for the center-of-mass energy of colliding neutral particles coming from infinity, based on the BSW (Bañados-Silk-West) mechanism. The electric charge Q of the rotating regular black hole decreases the potential of the gravitational field as compared to the Kerr black hole, and the particles are less bound at the circular geodesics. This increases the efficiency of energy extraction through the BSW process in the presence of the electric charge Q of the rotating regular black hole. Furthermore, we have studied the particle emission due to the BSW effect, assuming that two neutral particles collide near the horizon of the rotating regular extremal black hole and produce another two particles. We have shown that the efficiency of energy extraction is less than the value of 146.6% valid for the Kerr black hole. It has also been demonstrated that the efficiency of energy extraction from the rotating regular black hole via the Penrose process decreases as the electric charge Q increases and is smaller than 20.7%, the value for the extreme Kerr black hole with specific angular momentum a = M.
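
    For context, the BSW analysis referred to above starts from the standard expression for the centre-of-mass energy of two colliding particles of equal rest mass m with four-velocities u1 and u2; the charge- and spin-dependent form evaluated in the paper is not reproduced here.

    ```latex
    E_{\mathrm{cm}}^{2} \;=\; 2m^{2}\left(1 - g_{\mu\nu}\,u_{1}^{\mu}u_{2}^{\nu}\right)
    ```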

  13. HAL/S-FC compiler system functional specification

    NASA Technical Reports Server (NTRS)

    1974-01-01

    Compiler organization is discussed, including overall compiler structure, internal data transfer, compiler development, and code optimization. The user, system, and SDL interfaces are described, along with compiler system requirements. The run-time software support package and the restrictions and dependencies of the HAL/S-FC system are also considered.

  14. Transcriptomic correlates of neuron electrophysiological diversity

    PubMed Central

    Li, Brenna; Crichlow, Cindy-Lee; Mancarci, B. Ogan; Pavlidis, Paul

    2017-01-01

    How neuronal diversity emerges from complex patterns of gene expression remains poorly understood. Here we present an approach to understand electrophysiological diversity through gene expression by integrating pooled- and single-cell transcriptomics with intracellular electrophysiology. Using neuroinformatics methods, we compiled a brain-wide dataset of 34 neuron types with paired gene expression and intrinsic electrophysiological features from publicly accessible sources, the largest such collection to date. We identified 420 genes whose expression levels significantly correlated with variability in one or more of 11 physiological parameters. We next trained statistical models to infer cellular features from multivariate gene expression patterns. Such models were predictive of gene-electrophysiological relationships in an independent collection of 12 visual cortex cell types from the Allen Institute, suggesting that these correlations might reflect general principles relating expression patterns to phenotypic diversity across very different cell types. Many associations reported here have the potential to provide new insights into how neurons generate functional diversity, and correlations of ion channel genes like Gabrd and Scn1a (Nav1.1) with resting potential and spiking frequency are consistent with known causal mechanisms. Our work highlights the promise and inherent challenges in using cell type-specific transcriptomics to understand the mechanistic origins of neuronal diversity. PMID:29069078

  15. HAL/S-FC compiler system specifications

    NASA Technical Reports Server (NTRS)

    1976-01-01

    This document specifies the informational interfaces within the HAL/S-FC compiler, and between the compiler and the external environment. This Compiler System Specification is for the HAL/S-FC compiler and its associated run time facilities which implement the full HAL/S language. The HAL/S-FC compiler is designed to operate stand-alone on any compatible IBM 360/370 computer and within the Software Development Laboratory (SDL) at NASA/JSC, Houston, Texas.

  16. Compiling quantum circuits to realistic hardware architectures using temporal planners

    NASA Astrophysics Data System (ADS)

    Venturelli, Davide; Do, Minh; Rieffel, Eleanor; Frank, Jeremy

    2018-04-01

    To run quantum algorithms on emerging gate-model quantum hardware, quantum circuits must be compiled to take into account constraints on the hardware. For near-term hardware, with only limited means to mitigate decoherence, it is critical to minimize the duration of the circuit. We investigate the application of temporal planners to the problem of compiling quantum circuits to newly emerging quantum hardware. While our approach is general, we focus on compiling to superconducting hardware architectures with nearest neighbor constraints. Our initial experiments focus on compiling Quantum Alternating Operator Ansatz (QAOA) circuits, whose high number of commuting gates allows great flexibility in the order in which the gates can be applied. That freedom makes it more challenging to find optimal compilations, but it also means there is a greater potential win from more optimized compilation than for less flexible circuits. We map this quantum circuit compilation problem to a temporal planning problem and generate a test suite of compilation problems for QAOA circuits of various sizes targeting a realistic hardware architecture. We report compilation results from several state-of-the-art temporal planners on this test set. This early empirical evaluation demonstrates that temporal planning is a viable approach to quantum circuit compilation.

  17. A P-Norm Robust Feature Extraction Method for Identifying Differentially Expressed Genes

    PubMed Central

    Liu, Jian; Liu, Jin-Xing; Gao, Ying-Lian; Kong, Xiang-Zhen; Wang, Xue-Song; Wang, Dong

    2015-01-01

    In current molecular biology, it is becoming increasingly important to identify, from gene expression data, differentially expressed genes that are closely correlated with a key biological process. In this paper, based on the Schatten p-norm and the Lp-norm, a novel p-norm robust feature extraction method is proposed to identify differentially expressed genes. In our method, the Schatten p-norm is used as the regularization function to obtain a low-rank matrix, and the Lp-norm is taken as the error function to improve robustness to outliers in the gene expression data. Results on simulated data show that our method obtains higher identification accuracy than the competing methods. Numerous experiments on real gene expression data sets demonstrate that our method identifies more differentially expressed genes than the others. Moreover, we confirmed that the identified genes are closely correlated with the corresponding gene expression data. PMID:26201006
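
    As a point of reference for the two norms named above, the sketch below shows how a Schatten p-norm (via singular values) and an entry-wise Lp error can be computed; it illustrates only the penalty terms in a generic low-rank-plus-robust-error objective, not the authors' full optimization algorithm.

    ```python
    import numpy as np

    def schatten_p_norm(L, p):
        """Schatten p-norm: the Lp norm of the singular values of L."""
        s = np.linalg.svd(L, compute_uv=False)
        return np.sum(s ** p) ** (1.0 / p)

    def lp_error(X, L, p):
        """Entry-wise Lp norm of the residual X - L (more outlier-tolerant for small p)."""
        return np.sum(np.abs(X - L) ** p) ** (1.0 / p)

    def objective(X, L, p=0.8, lam=1.0):
        # Illustrative objective combining the two terms; not the paper's exact form.
        return lp_error(X, L, p) ** p + lam * schatten_p_norm(L, p) ** p
    ```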

  18. A P-Norm Robust Feature Extraction Method for Identifying Differentially Expressed Genes.

    PubMed

    Liu, Jian; Liu, Jin-Xing; Gao, Ying-Lian; Kong, Xiang-Zhen; Wang, Xue-Song; Wang, Dong

    2015-01-01

    In current molecular biology, it is becoming increasingly important to identify, from gene expression data, differentially expressed genes that are closely correlated with a key biological process. In this paper, based on the Schatten p-norm and the Lp-norm, a novel p-norm robust feature extraction method is proposed to identify differentially expressed genes. In our method, the Schatten p-norm is used as the regularization function to obtain a low-rank matrix, and the Lp-norm is taken as the error function to improve robustness to outliers in the gene expression data. Results on simulated data show that our method obtains higher identification accuracy than the competing methods. Numerous experiments on real gene expression data sets demonstrate that our method identifies more differentially expressed genes than the others. Moreover, we confirmed that the identified genes are closely correlated with the corresponding gene expression data.

  19. The effect of 1/f fluctuation in inter-stimulus intervals on auditory evoked mismatch field.

    PubMed

    Harada, Nobuyoshi; Masuda, Tadashi; Endo, Hiroshi; Nakamura, Yukihiro; Takeda, Tsunehiro; Tonoike, Mitsuo

    2005-05-13

    This study focused on the effect of the regularity of environmental stimuli on the human brain's ability to extract informational order. The regularity of environmental stimuli can be described by the exponent n of the fluctuation 1/f^n. We studied the effect of the exponent of the fluctuation in the inter-stimulus interval (ISI) on the elicitation of auditory evoked mismatch fields (MMF) with two sounds of alternating frequency. ISI times were given by three types of fluctuation, 1/f^0, 1/f^1 and 1/f^2, and by a fixed interval (1/f^infinity). The root mean square (RMS) value of the MMF increased significantly (F(3,9)=4.95, p=0.027) with increases in the exponent of the fluctuation. Increments in the regularity of the fluctuation enhanced the MMF, reflecting the formation of a memory trace based on anticipation of the stimulus timing. The gradient of the curve, indicating the ratio of increments between the MMF and the exponent of fluctuation, can express a subject's capability to extract regularity from fluctuating stimuli.
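
    A common way to realize inter-stimulus intervals whose fluctuation follows a prescribed 1/f^n spectrum is spectral shaping of white noise. The sketch below follows that generic recipe; it is an assumption that the study used an equivalent procedure, and the mean ISI and jitter range shown are illustrative values only.

    ```python
    import numpy as np

    def one_over_f_isis(n_stimuli, exponent, mean_isi=0.5, spread=0.2, seed=0):
        """Generate inter-stimulus intervals with 1/f^n fluctuation.

        exponent : n in 1/f^n (0 = white, 1 = pink, 2 = brown noise).
        mean_isi : mean interval in seconds (illustrative value).
        spread   : peak deviation from the mean, in seconds (illustrative value).
        """
        rng = np.random.default_rng(seed)
        white = rng.normal(size=n_stimuli)
        spec = np.fft.rfft(white)
        freqs = np.fft.rfftfreq(n_stimuli)
        freqs[0] = freqs[1]                    # avoid division by zero at DC
        spec *= freqs ** (-exponent / 2.0)     # power ~ 1/f^n  =>  amplitude ~ f^(-n/2)
        noise = np.fft.irfft(spec, n=n_stimuli)
        noise /= np.max(np.abs(noise))         # normalize to [-1, 1]
        return mean_isi + spread * noise

    isis = one_over_f_isis(256, exponent=1.0)  # pink-noise-modulated ISIs
    ```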

  20. Schnek: A C++ library for the development of parallel simulation codes on regular grids

    NASA Astrophysics Data System (ADS)

    Schmitz, Holger

    2018-05-01

    A large number of algorithms across the field of computational physics are formulated on grids with a regular topology. We present Schnek, a library that enables fast development of parallel simulations on regular grids. Schnek contains a number of easy-to-use modules that greatly reduce the amount of administrative code for large-scale simulation codes. The library provides an interface for reading simulation setup files with a hierarchical structure. The structure of the setup file is translated into a hierarchy of simulation modules that the developer can specify. The reader parses and evaluates mathematical expressions and initialises variables or grid data. This enables developers to write modular and flexible simulation codes with minimal effort. Regular grids of arbitrary dimension are defined as well as mechanisms for defining physical domain sizes, grid staggering, and ghost cells on these grids. Ghost cells can be exchanged between neighbouring processes using MPI with a simple interface. The grid data can easily be written into HDF5 files using serial or parallel I/O.
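
    To make the ghost-cell notion concrete, here is a minimal serial sketch on a 1-D regular grid in plain NumPy. It illustrates the concept only and does not use Schnek's actual API; the MPI exchange the library performs between neighbouring processes is mimicked here by copying between neighbouring subdomains held in one process.

    ```python
    import numpy as np

    NGHOST = 1  # one layer of ghost cells on each side

    def split_with_ghosts(field, parts=2):
        """Split a 1-D field into subdomains, each padded with ghost cells."""
        chunks = np.array_split(field, parts)
        return [np.pad(c, NGHOST) for c in chunks]

    def exchange_ghosts(subdomains):
        """Fill each subdomain's ghost cells from its neighbours' interior cells.
        (In a parallel code these copies would be MPI sends and receives.)"""
        for i, sub in enumerate(subdomains):
            if i > 0:                        # left neighbour exists
                sub[0] = subdomains[i - 1][-1 - NGHOST]
            if i < len(subdomains) - 1:      # right neighbour exists
                sub[-1] = subdomains[i + 1][NGHOST]
        return subdomains

    field = np.arange(10.0)
    subs = exchange_ghosts(split_with_ghosts(field))
    ```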

  1. An interior-point method for total variation regularized positron emission tomography image reconstruction

    NASA Astrophysics Data System (ADS)

    Bai, Bing

    2012-03-01

    There has been a lot of work on total variation (TV) regularized tomographic image reconstruction recently. Many of these methods use gradient-based optimization algorithms with a differentiable approximation of the TV functional. In this paper we apply TV regularization to Positron Emission Tomography (PET) image reconstruction. We reconstruct the PET image in a Bayesian framework, using a Poisson noise model and a TV prior functional. The original optimization problem is transformed to an equivalent problem with inequality constraints by adding auxiliary variables. Then we use an interior point method with logarithmic barrier functions to solve the constrained optimization problem. In this method, a series of points approaching the solution from inside the feasible region are found by solving a sequence of subproblems characterized by an increasing positive parameter. We use the preconditioned conjugate gradient (PCG) algorithm to solve the subproblems directly. The nonnegativity constraint is enforced by bend line search. The exact expression of the TV functional is used in our calculations. Simulation results show that the algorithm converges fast and that the convergence is insensitive to the values of the regularization and reconstruction parameters.
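
    In generic notation (which may differ in detail from the paper's), the reconstruction described above is a penalized Poisson maximum a posteriori problem, with the nonnegativity constraint handled through logarithmic barrier subproblems indexed by an increasing positive parameter t:

    ```latex
    \hat{x} \;=\; \arg\min_{x \ge 0} F(x), \qquad
    F(x) \;=\; \sum_i \big[ (Ax + r)_i - y_i \log (Ax + r)_i \big] + \beta\,\mathrm{TV}(x),
    \qquad
    x_t \;=\; \arg\min_{x}\; t\,F(x) - \sum_j \log x_j ,\quad t \uparrow \infty
    ```

    Here y denotes the measured counts, A the system matrix, r the expected background, and β the regularization weight; as the abstract notes, each barrier subproblem is solved with PCG.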

  2. p27Kip1 is expressed in proliferating cells in its form phosphorylated on threonine 187

    PubMed Central

    Troncone, Giancarlo; Martinez, Juan C; Iaccarino, Antonino; Zeppa, Pio; Caleo, Alessia; Russo, Maria; Migliaccio, Ilenia; Motti, Maria L; Califano, Daniela; Palmieri, Emiliano A; Palombini, Lucio

    2005-01-01

    Background G1/S cell cycle progression requires p27Kip1 (p27) proteolysis, which is triggered by its phosphorylation on threonine (Thr) 187. Since its levels are abundant in quiescent cells and scarce in cycling cells, p27 is an approved marker for quiescent cells, extensively used in histopathology and cancer research. Methods Here, however, we showed that by using an antibody specific for the phosphorylation site (pThr187), p27 is also detectable in proliferative compartments of normal, dysplastic and neoplastic tissues. Results In fact, whereas un-phosphorylated p27 and MIB-1 showed a significant inverse correlation (Spearman R = -0.55; p < 0.001), pThr187-p27 was positively and significantly correlated with MIB-1 expression (Spearman R = 0.88; p < 0.001). Thus proliferating cells stain only for pThr187-p27, whereas they are unreactive with the regular p27 antibodies. However, increasing the sensitivity of the immunocytochemistry (ICC) by use of an ultrasensitive detection system based on tyramide signal amplification revealed simultaneous expression and colocalisation of both forms of p27 in the nuclei of proliferating compartments, as shown by double immunofluorescence and laser scanning confocal microscopy. Conclusion Overall, our data suggest that p27 expression also occurs in proliferating cell compartments, and the combined use of both regular and phospho-p27 antibodies is suggested. PMID:15725363

  3. Gene Repression in Haloarchaea Using the CRISPR (Clustered Regularly Interspaced Short Palindromic Repeats)-Cas I-B System.

    PubMed

    Stachler, Aris-Edda; Marchfelder, Anita

    2016-07-15

    The clustered regularly interspaced short palindromic repeats (CRISPR)-Cas system is used by bacteria and archaea to fend off foreign genetic elements. Since its discovery it has been developed into numerous applications like genome editing and regulation of transcription in eukaryotes and bacteria. For archaea, currently no tools for transcriptional repression exist. Because molecular biology analyses in archaea are becoming more and more widespread, such a tool is vital for investigating the biological function of essential genes in archaea. Here we use the model archaeon Haloferax volcanii to demonstrate that its endogenous CRISPR-Cas system I-B can be harnessed to repress gene expression in archaea. Deletion of cas3 and cas6b genes results in efficient repression of transcription. crRNAs targeting the promoter region reduced transcript levels down to 8%. crRNAs targeting the reading frame have only a slight impact on transcription. crRNAs that target the coding strand repress expression only down to 88%, whereas crRNAs targeting the template strand repress expression down to 8%. Repression of an essential gene results in reduction of transcription levels down to 22%. Targeting efficiencies can be enhanced by expressing a catalytically inactive Cas3 mutant. Genes can be targeted on plasmids or on the chromosome, and they can be monocistronic or part of a polycistronic operon. © 2016 by The American Society for Biochemistry and Molecular Biology, Inc.

  4. Gene Repression in Haloarchaea Using the CRISPR (Clustered Regularly Interspaced Short Palindromic Repeats)-Cas I-B System*

    PubMed Central

    Stachler, Aris-Edda; Marchfelder, Anita

    2016-01-01

    The clustered regularly interspaced short palindromic repeats (CRISPR)-Cas system is used by bacteria and archaea to fend off foreign genetic elements. Since its discovery it has been developed into numerous applications like genome editing and regulation of transcription in eukaryotes and bacteria. For archaea, currently no tools for transcriptional repression exist. Because molecular biology analyses in archaea are becoming more and more widespread, such a tool is vital for investigating the biological function of essential genes in archaea. Here we use the model archaeon Haloferax volcanii to demonstrate that its endogenous CRISPR-Cas system I-B can be harnessed to repress gene expression in archaea. Deletion of cas3 and cas6b genes results in efficient repression of transcription. crRNAs targeting the promoter region reduced transcript levels down to 8%. crRNAs targeting the reading frame have only a slight impact on transcription. crRNAs that target the coding strand repress expression only down to 88%, whereas crRNAs targeting the template strand repress expression down to 8%. Repression of an essential gene results in reduction of transcription levels down to 22%. Targeting efficiencies can be enhanced by expressing a catalytically inactive Cas3 mutant. Genes can be targeted on plasmids or on the chromosome, and they can be monocistronic or part of a polycistronic operon. PMID:27226589

  5. 22q11.2 Deletion Syndrome Is Associated With Impaired Auditory Steady-State Gamma Response

    PubMed Central

    Pellegrino, Giovanni; Birknow, Michelle Rosgaard; Kjær, Trine Nørgaard; Baaré, William Frans Christiaan; Didriksen, Michael; Olsen, Line; Werge, Thomas; Mørup, Morten; Siebner, Hartwig Roman

    2018-01-01

    Abstract Background The 22q11.2 deletion syndrome confers a markedly increased risk for schizophrenia. 22q11.2 deletion carriers without manifest psychotic disorder offer the possibility to identify functional abnormalities that precede clinical onset. Since schizophrenia is associated with a reduced cortical gamma response to auditory stimulation at 40 Hz, we hypothesized that the 40 Hz auditory steady-state response (ASSR) may be attenuated in nonpsychotic individuals with a 22q11.2 deletion. Methods Eighteen young nonpsychotic 22q11.2 deletion carriers and a control group of 27 noncarriers with comparable age range (12–25 years) and sex ratio underwent 128-channel EEG. We recorded the cortical ASSR to a 40 Hz train of clicks, given either at a regular inter-stimulus interval of 25 ms or at irregular intervals jittered between 11 and 37 ms. Results Healthy noncarriers expressed a stable ASSR to regular but not to irregular 40 Hz click stimulation. Both gamma power and inter-trial phase coherence of the ASSR were markedly reduced in the 22q11.2 deletion group. The ability to phase lock cortical gamma activity to regular auditory 40 Hz stimulation correlated with the individual expression of negative symptoms in deletion carriers (ρ = −0.487, P = .041). Conclusions Nonpsychotic 22q11.2 deletion carriers lack efficient phase locking of evoked gamma activity to regular 40 Hz auditory stimulation. This abnormality indicates a dysfunction of fast intracortical oscillatory processing in the gamma-band. Since ASSR was attenuated in nonpsychotic deletion carriers, ASSR deficiency may constitute a premorbid risk marker of schizophrenia. PMID:28521049
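
    Inter-trial phase coherence (ITC) of the kind reported above is commonly computed as the length of the mean unit phase vector across trials. The following is a minimal generic sketch of that measure, not the study's analysis pipeline.

    ```python
    import numpy as np
    from scipy.signal import hilbert

    def inter_trial_coherence(trials):
        """ITC at each time point from a (n_trials, n_samples) array of
        narrow-band (e.g. 40 Hz band-passed) signals.

        Returns values in [0, 1]; 1 means perfect phase locking across trials.
        """
        phases = np.angle(hilbert(trials, axis=1))           # instantaneous phase per trial
        return np.abs(np.mean(np.exp(1j * phases), axis=0))  # length of the mean phase vector
    ```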

  6. A biological network-based regularized artificial neural network model for robust phenotype prediction from gene expression data.

    PubMed

    Kang, Tianyu; Ding, Wei; Zhang, Luoyan; Ziemek, Daniel; Zarringhalam, Kourosh

    2017-12-19

    Stratification of patient subpopulations that respond favorably to treatment or experience an adverse reaction is an essential step toward the development of new personalized therapies and diagnostics. It is currently feasible to generate omic-scale biological measurements for all patients in a study, providing an opportunity for machine learning models to identify molecular markers for disease diagnosis and progression. However, the high variability of genetic background in human populations hampers the reproducibility of omic-scale markers. In this paper, we develop a biological network-based regularized artificial neural network model for prediction of phenotype from transcriptomic measurements in clinical trials. To improve model sparsity and the overall reproducibility of the model, we incorporate regularization for simultaneous shrinkage of gene sets based on active upstream regulatory mechanisms into the model. We benchmark our method against various regression, support vector machine and artificial neural network models and demonstrate the ability of our method to predict clinical outcomes using clinical trial data on acute rejection in kidney transplantation and response to Infliximab in ulcerative colitis. We show that integration of prior biological knowledge into the classification, as developed in this paper, significantly improves the robustness and generalizability of predictions to independent datasets. We provide Java code for our algorithm along with a parsed version of the STRING DB database. In summary, we present a method for prediction of clinical phenotypes using baseline genome-wide expression data that makes use of prior biological knowledge on gene-regulatory interactions in order to increase the robustness and reproducibility of omic-scale markers. The integrated group-wise regularization method increases the interpretability of biological signatures and gives stable performance estimates across independent test sets.
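
    The group-wise shrinkage mentioned above can be illustrated with a group-lasso-style penalty in which the genes targeted by the same upstream regulator form a group. The sketch below shows only such a penalty on a first-layer weight matrix, not the authors' full network architecture or training code; group definitions and the weighting by group size are illustrative choices.

    ```python
    import numpy as np

    def group_penalty(W, groups, lam=1.0):
        """Group-lasso-style penalty on first-layer weights.

        W      : (n_genes, n_hidden) weight matrix.
        groups : list of index arrays; each array holds the rows (genes)
                 regulated by one upstream mechanism (groups may overlap).
        """
        return lam * sum(np.sqrt(len(g)) * np.linalg.norm(W[g, :]) for g in groups)

    # Toy usage: 6 genes, 3 hidden units, two regulator-defined groups
    rng = np.random.default_rng(1)
    W = rng.normal(size=(6, 3))
    penalty = group_penalty(W, groups=[np.array([0, 1, 2]), np.array([2, 3, 4, 5])])
    ```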

  7. The value of multidisciplinary team meetings within an early pregnancy assessment unit.

    PubMed

    Bharathan, Rasiah; Farag, Mena; Hayes, Kevin

    2016-08-01

    This is the first study to ascertain the value of multidisciplinary team (MDT) meetings within an early pregnancy assessment unit (EPAU). Our national telephone survey identified that in the United Kingdom, overall 37% of EPAUs utilise regular MDT meetings. Secondary and tertiary hospitals are just as likely to hold regular MDT meetings. The participants in our interview study expressed the principal benefits of regular MDT meetings as communication, education and effective stress management. The perceived additional benefits included improved care quality, better patient experience and enhanced team cohesion. During the meetings, at least one representative from every tier of staffing was present. The caseload of the MDT meeting comprised ectopic pregnancies and pregnancies of unknown location. We propose a number of research studies that would build on this study. Such efforts will help enhance the effectiveness of the MDT-based EPAU service.

  8. ChloroMitoCU: Codon patterns across organelle genomes for functional genomics and evolutionary applications.

    PubMed

    Sablok, Gaurav; Chen, Ting-Wen; Lee, Chi-Ching; Yang, Chi; Gan, Ruei-Chi; Wegrzyn, Jill L; Porta, Nicola L; Nayak, Kinshuk C; Huang, Po-Jung; Varotto, Claudio; Tang, Petrus

    2017-06-01

    Organelle genomes are widely thought to have arisen from reduction events involving cyanobacterial genomes, in the case of chloroplasts, or α-proteobacterial genomes, in the case of mitochondria. Heterogeneity in base composition and codon preference has long been the subject of investigation of topics ranging from phylogenetic distortion to the design of overexpression cassettes for transgenic expression. From the overexpression point of view, it is critical to systematically analyze the codon usage patterns of the organelle genomes. In light of the importance of codon usage patterns in the development of hyper-expression organelle transgenics, we present ChloroMitoCU, the first-ever curated, web-based reference catalog of the codon usage patterns in organelle genomes. ChloroMitoCU contains the pre-compiled codon usage patterns of 328 chloroplast genomes (29,960 CDS) and 3,502 mitochondrial genomes (49,066 CDS), enabling genome-wide exploration and comparative analysis of codon usage patterns across species. ChloroMitoCU allows the phylogenetic comparison of codon usage patterns across organelle genomes, the prediction of codon usage patterns based on user-submitted transcripts or assembled organelle genes, and comparative analysis with the pre-compiled patterns across species of interest. ChloroMitoCU can increase our understanding of the biased patterns of codon usage in organelle genomes across multiple clades. ChloroMitoCU can be accessed at: http://chloromitocu.cgu.edu.tw/. © The Author 2017. Published by Oxford University Press on behalf of Kazusa DNA Research Institute.
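
    Codon usage patterns of the kind catalogued by ChloroMitoCU can be computed directly from an in-frame coding sequence. The minimal sketch below is independent of the database itself and uses a toy CDS.

    ```python
    from collections import Counter

    def codon_usage(cds):
        """Relative codon frequencies for an in-frame coding sequence."""
        cds = cds.upper().replace("U", "T")
        codons = [cds[i:i + 3] for i in range(0, len(cds) - len(cds) % 3, 3)]
        counts = Counter(codons)
        total = sum(counts.values())
        return {codon: n / total for codon, n in counts.items()}

    print(codon_usage("ATGGCTGCAGCTTAA"))  # toy CDS
    ```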

  9. ImmunemiR - A Database of Prioritized Immune miRNA Disease Associations and its Interactome.

    PubMed

    Prabahar, Archana; Natarajan, Jeyakumar

    2017-01-01

    MicroRNAs are key regulators of gene expression, and their abnormal expression in the immune system may be associated with several human diseases such as inflammation, cancer and autoimmune diseases. Elucidation of miRNA-disease associations through the interactome will deepen the understanding of disease mechanisms. A specialized database for immune miRNAs is highly desirable to demonstrate immune miRNA-disease associations in the interactome. miRNAs specific to immune-related diseases were retrieved from curated databases such as HMDD, miR2disease and PubMed literature, based on the MeSH classification of immune system diseases. Additional data, such as miRNA target genes and protein-protein interaction information for the encoded proteins, were compiled from related resources. Further, miRNAs were prioritized for specific immune diseases using a random walk ranking algorithm. In total, 245 immune miRNAs associated with 92 OMIM disease categories were identified from external databases. The resulting data were compiled as ImmunemiR, a database of prioritized immune miRNA disease associations. This database provides both text-based annotation information and network visualization of its interactome. To our knowledge, ImmunemiR is the first available database to provide a comprehensive repository of human immune disease-associated miRNAs with network visualization options for its target genes, protein-protein interactions (PPI) and disease associations. It is freely available at http://www.biominingbu.org/immunemir/. Copyright© Bentham Science Publishers; For any queries, please email at epub@benthamscience.org.
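
    Prioritization by a random-walk ranking, as mentioned above, is typically implemented as a random walk with restart on the interaction network. The sketch below shows that generic iteration (column-normalized adjacency, restart to seed nodes); the restart probability and convergence settings are illustrative, not necessarily ImmunemiR's parameters.

    ```python
    import numpy as np

    def random_walk_with_restart(A, seeds, restart=0.5, tol=1e-8, max_iter=1000):
        """Steady-state visiting probabilities of a restart random walk.

        A     : (n, n) symmetric adjacency matrix of the interactome.
        seeds : indices of the disease-associated seed nodes.
        """
        W = A / np.maximum(A.sum(axis=0, keepdims=True), 1e-12)  # column-normalize
        p0 = np.zeros(A.shape[0])
        p0[list(seeds)] = 1.0 / len(seeds)
        p = p0.copy()
        for _ in range(max_iter):
            p_new = (1 - restart) * W @ p + restart * p0
            if np.linalg.norm(p_new - p, 1) < tol:
                break
            p = p_new
        return p  # higher value = higher priority
    ```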

  10. Oppositional Decoding as an Act of Resistance.

    ERIC Educational Resources Information Center

    Steiner, Linda

    Specific social groups express themselves through their own particularized media. For example, "MS" magazine directs its communication to feminist readers, and as a part of this, regularly reprints advertisements and news clippings taken from mainstream media in its "No Comment" section. This section provides opportunities for…

  11. Kato perturbative expansion in classical mechanics and an explicit expression for the Deprit generator

    NASA Astrophysics Data System (ADS)

    Nikolaev, A. S.

    2015-03-01

    We study the structure of the canonical Poincaré-Lindstedt perturbation series in the Deprit operator formalism and establish its connection to the Kato resolvent expansion. A discussion of invariant definitions for the averaging and integrating perturbation operators and their canonical identities reveals a regular pattern in the series for the Deprit generator. This regularity is explained using Kato series and the relation of the perturbation operators to the Laurent coefficients of the resolvent of the Liouville operator. This purely canonical approach systematizes the series and leads to an explicit expression for the Deprit generator in any order of perturbation theory, in terms of the partial pseudoinverse of the perturbed Liouville operator. The corresponding Kato series provides a reasonably effective computational algorithm. The canonical connection between the perturbed and unperturbed averaging operators allows ambiguities in the generator and the transformed Hamiltonian to be described, while Gustavson integrals turn out to be insensitive to the normalization style. We use nonperturbative examples for illustration.

  12. Anxiety, Depression and Emotion Regulation Among Regular Online Poker Players.

    PubMed

    Barrault, Servane; Bonnaire, Céline; Herrmann, Florian

    2017-12-01

    Poker is a type of gambling that has specific features, including the need to regulate one's emotion to be successful. The aim of the present study is to assess emotion regulation, anxiety and depression in a sample of regular poker players, and to compare the results of problem and non-problem gamblers. 416 regular online poker players completed online questionnaires including sociodemographic data, measures of problem gambling (CPGI), anxiety and depression (HAD scale), and emotion regulation (ERQ). The CPGI was used to divide participants into four groups according to the intensity of their gambling practice (non-problem, low risk, moderate risk and problem gamblers). Anxiety and depression were significantly higher among severe-problem gamblers than among the other groups. Both significantly predicted problem gambling. On the other hand, there was no difference between groups in emotion regulation (cognitive reappraisal and expressive suppression), which was linked neither to problem gambling nor to anxiety and depression (except for cognitive reappraisal, which was significantly correlated to anxiety). Our results underline the links between anxiety, depression and problem gambling among poker players. If emotion regulation is involved in problem gambling among poker players, as strongly suggested by data from the literature, the emotion regulation strategies we assessed (cognitive reappraisal and expressive suppression) may not be those involved. Further studies are thus needed to investigate the involvement of other emotion regulation strategies.

  13. The Impact of Endurance Training on Human Skeletal Muscle Memory, Global Isoform Expression and Novel Transcripts

    PubMed Central

    Lindholm, Maléne E; Giacomello, Stefania; Werne Solnestam, Beata; Kjellqvist, Sanela

    2016-01-01

    Regularly performed endurance training has many beneficial effects on health and skeletal muscle function, and can be used to prevent and treat common diseases e.g. cardiovascular disease, type II diabetes and obesity. The molecular adaptation mechanisms regulating these effects are incompletely understood. To date, global transcriptome changes in skeletal muscles have been studied at the gene level only. Therefore, global isoform expression changes following exercise training in humans are unknown. Also, the effects of repeated interventions on transcriptional memory or training response have not been studied before. In this study, 23 individuals trained one leg for three months. Nine months later, 12 of the same subjects trained both legs in a second training period. Skeletal muscle biopsies were obtained from both legs before and after both training periods. RNA sequencing analysis of all 119 skeletal muscle biopsies showed that training altered the expression of 3,404 gene isoforms, mainly associated with oxidative ATP production. Fifty-four genes had isoforms that changed in opposite directions. Training altered expression of 34 novel transcripts, all with protein-coding potential. After nine months of detraining, no training-induced transcriptome differences were detected between the previously trained and untrained legs. Although there were several differences in the physiological and transcriptional responses to repeated training, no coherent evidence of an endurance training induced transcriptional skeletal muscle memory was found. This human lifestyle intervention induced differential expression of thousands of isoforms and several transcripts from unannotated regions of the genome. It is likely that the observed isoform expression changes reflect adaptational mechanisms and processes that provide the functional and health benefits of regular physical activity. PMID:27657503

  14. Nimble Compiler Environment for Agile Hardware. Volume 1

    DTIC Science & Technology

    2001-10-01

    APPENDIX G . XIMA - THE NIMBLE DATAPATH COMPILER .......................................................................... 172 ABSTRACT...Approach of the Nimble Compiler Task 3 G Xima - The Nimble Datapath Compiler Task 4 H Domain Generator Tutorial for the Nimble Compiler Project Task 5 I...a loop example. Nodes A- G are basic blocks inside the loop. It is obvious that there are four distinct paths inside the loop (without counting the

  15. Compiler-Assisted Multiple Instruction Rollback Recovery Using a Read Buffer. Ph.D. Thesis

    NASA Technical Reports Server (NTRS)

    Alewine, Neal Jon

    1993-01-01

    Multiple instruction rollback (MIR) is a technique for providing rapid recovery from transient processor failures, and it has been implemented in hardware by researchers and in mainframe computers. Hardware-based MIR designs eliminate rollback data hazards by providing data redundancy implemented in hardware. Compiler-based MIR designs were also developed which remove rollback data hazards directly with data flow manipulations, thus eliminating the need for most data redundancy hardware. Compiler-assisted techniques to achieve multiple instruction rollback recovery are addressed. It is observed that some data hazards resulting from instruction rollback can be resolved more efficiently by providing hardware redundancy, while others are resolved more efficiently with compiler transformations. A compiler-assisted multiple instruction rollback scheme is developed which combines hardware-implemented data redundancy with compiler-driven hazard removal transformations. Experimental performance evaluations were conducted which indicate improved efficiency over previous hardware-based and compiler-based schemes. Various enhancements to the compiler transformations and to the data redundancy hardware developed for the compiler-assisted MIR scheme are described and evaluated. The final topic deals with the application of compiler-assisted MIR techniques to aid in exception repair and branch repair in a speculative execution architecture.

  16. Ada Compiler Validation Summary Report: Harris Corporation, Harris Ada Compiler, Version 4.0, Harris HCX-9 (Host) and (Target), 880603W1.09059

    DTIC Science & Technology

    1988-06-06

    TYPE OF REPORT & PERIOD COVERED Ada Compiler Validation Summary Report: Harris 6 June 1988 to 6 June 1988 Corporation, Harris Ada Compiler, Version...4.0, Harris 1 PERFORMING ORG. REPORT NUMBER HCX-9 (Host) and (Target), 880603W1.09059 7. AUTHOR(s) 8. CONTRACT OR GRANT NUMBER(s) Wright-Patterson AFB...88-03-02-HAR Ada COMPILER VALIDATION SUMMARY REPORT: Certificate Number: 880603W1.09059 A Harris Corporation Accession For Harris Ada Compiler, Version

  17. Predicting selective drug targets in cancer through metabolic networks

    PubMed Central

    Folger, Ori; Jerby, Livnat; Frezza, Christian; Gottlieb, Eyal; Ruppin, Eytan; Shlomi, Tomer

    2011-01-01

    The interest in studying metabolic alterations in cancer and their potential role as novel targets for therapy has been rejuvenated in recent years. Here, we report the development of the first genome-scale network model of cancer metabolism, validated by correctly identifying genes essential for cellular proliferation in cancer cell lines. The model predicts 52 cytostatic drug targets, of which 40% are targeted by known, approved or experimental anticancer drugs, and the rest are new. It further predicts combinations of synthetic lethal drug targets, whose synergy is validated using available drug efficacy and gene expression measurements across the NCI-60 cancer cell line collection. Finally, potential selective treatments for specific cancers that depend on cancer type-specific downregulation of gene expression and somatic mutations are compiled. PMID:21694718

  18. Human genetics: international projects and personalized medicine.

    PubMed

    Apellaniz-Ruiz, Maria; Gallego, Cristina; Ruiz-Pinto, Sara; Carracedo, Angel; Rodríguez-Antona, Cristina

    2016-03-01

    In this article, we present the progress driven by the recent technological advances and new revolutionary massive sequencing technologies in the field of human genetics. We discuss this knowledge in relation with drug response prediction, from the germline genetic variation compiled in the 1000 Genomes Project or in the Genotype-Tissue Expression project, to the phenome-genome archives, the international cancer projects, such as The Cancer Genome Atlas or the International Cancer Genome Consortium, and the epigenetic variation and its influence in gene expression, including the regulation of drug metabolism. This review is based on the lectures presented by the speakers of the Symposium "Human Genetics: International Projects & New Technologies" from the VII Conference of the Spanish Pharmacogenetics and Pharmacogenomics Society, held on the 20th and 21st of April 2015.

  19. Thermodynamics and Phase Transition from Regular Bardeen Black Hole Surrounded by Quintessence

    NASA Astrophysics Data System (ADS)

    Saleh, Mahamat; Thomas, Bouetou Bouetou; Kofane, Timoleon Crepin

    2018-05-01

    In this paper, thermodynamics and phase transition are investigated for the regular Bardeen black hole surrounded by quintessence. Considering the metric of the Bardeen spacetime surrounded by quintessence, we derived the Unruh-Verlinde temperature. Using the first law of thermodynamics, we derived expressions for the Hawking temperature as well as the specific heat of the black hole, and plotted their behavior explicitly. The results show that the magnetic monopole charge β, as well as the presence of quintessence, decreases the temperature and induces a thermodynamic phase transition in the spacetime. Moreover, as the density of quintessence increases, the transition point moves to lower entropies.
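
    The quantities discussed above follow from the standard black-hole thermodynamic relations, written here in generic form (the Bardeen-plus-quintessence expressions themselves are in the paper):

    ```latex
    T_H \;=\; \frac{\partial M}{\partial S}\,, \qquad
    C \;=\; T_H\,\frac{\partial S}{\partial T_H}\,,
    ```

    with a divergence of the specific heat C marking the phase transition point.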

  20. Power corrections to the HTL effective Lagrangian of QED

    NASA Astrophysics Data System (ADS)

    Carignano, Stefano; Manuel, Cristina; Soto, Joan

    2018-05-01

    We present compact expressions for the power corrections to the hard thermal loop (HTL) Lagrangian of QED in d space dimensions. These are corrections of order (L/T)^2, valid for momenta L ≪ T, where T is the temperature. In the limit d → 3 we achieve a consistent regularization of both infrared and ultraviolet divergences, which respects the gauge symmetry of the theory. Dimensional regularization also allows us to witness subtle cancellations of infrared divergences. We also discuss how to generalize our results in the presence of a chemical potential, so as to obtain the power corrections to the hard dense loop (HDL) Lagrangian.

  1. Money, Money, Money.

    ERIC Educational Resources Information Center

    Shugert, Diane P., Ed.

    1982-01-01

    Intended to show that English teachers' activities outside the classroom enhance their regular classroom work and permit them to teach new, larger audiences, the articles in this journal issue emphasize how English teachers can reevaluate their skills and express those skills in a language that businesses understand. Some of the articles explain…

  2. 14 CFR 1203.302 - Combination, interrelation or compilation.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... INFORMATION SECURITY PROGRAM Classification Principles and Considerations § 1203.302 Combination.... Compilations of unclassified information are considered unclassified unless some additional significant factor is added in the process of compilation. For example: (a) The way unclassified information is compiled...

  3. 14 CFR 1203.302 - Combination, interrelation or compilation.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... INFORMATION SECURITY PROGRAM Classification Principles and Considerations § 1203.302 Combination.... Compilations of unclassified information are considered unclassified unless some additional significant factor is added in the process of compilation. For example: (a) The way unclassified information is compiled...

  4. Regular Soda Policies, School Availability, and High School Student Consumption

    PubMed Central

    Terry-McElrath, Yvonne M.; Chriqui, Jamie F.; O’Malley, Patrick M.; Chaloupka, Frank J.; Johnston, Lloyd D.

    2014-01-01

    Background Beginning in the 2014–2015 school year, all U.S. schools participating in federally reimbursable meal programs are required to implement new nutrition standards for items sold in competitive venues. Multilevel mediation modeling examining direct, mediated, and indirect pathways between policy, availability, and student consumption might provide insight into possible outcomes of implementing aspects of the new standards. Purpose To employ multilevel mediation modeling using state- and school district–level policies mandating school soda bans, school soda availability, and student soda consumption. Methods The 2010–2012 Monitoring the Future surveys obtained nationally representative data on high school student soda consumption; school administrators provided school soda availability data. State laws and district policies were compiled and coded. Analyses conducted in 2014 controlled for state-, school-, and student-level characteristics. Results State–district–school models found that state bans were associated with significantly lower school soda availability (c, p<0.05) but district bans showed no significant associations. No significant direct, mediated, or indirect associations between state policy and student consumption were observed for the overall sample. Among African American high school students, state policy was associated directly with significantly lower school soda availability (a, p<0.01), and—indirectly through lower school availability—with significantly lower soda consumption (a*b, p<0.05). Conclusions These analyses indicate state policy focused on regular soda strongly affected school soda availability, and worked through changes in school availability to decrease soda consumption among African American students, but not the overall population. PMID:25576493

  5. Effects of Exercise on Cognition: The Finnish Alzheimer Disease Exercise Trial: A Randomized, Controlled Trial.

    PubMed

    Öhman, Hannareeta; Savikko, Niina; Strandberg, Timo E; Kautiainen, Hannu; Raivio, Minna M; Laakkonen, Marja-Liisa; Tilvis, Reijo; Pitkälä, Kaisu H

    2016-04-01

    To examine whether a regular, long-term exercise program performed by individuals with Alzheimer's disease (AD) at home or as group-based exercise at an adult daycare center has beneficial effects on cognition; to examine secondary outcomes of a trial that has been published earlier. Randomized, controlled trial. Community. Community-dwelling dyads (N = 210) of individuals with AD and their spousal caregivers randomized into three groups. Two types of intervention comprising customized home-based exercise (HE) and group-based exercise (GE), each twice a week for 1 year, were compared with a control group (CG) receiving usual community care. Cognitive function was measured using the Clock Drawing Test (CDT), Verbal Fluency (VF), Clinical Dementia Rating (CDR), and Mini-Mental State Examination (MMSE) at baseline and 3, 6, and 12 months of follow-up. Executive function, measured using CDT, improved in the HE group, and changes in the score were significantly better than those of the CG at 12 months (adjusted for age, sex, and CDR, P = .03). All groups deteriorated in VF and MMSE score during the intervention, and no significant differences between the groups were detected at 12-month follow-up when analyses were adjusted for age, sex, and CDR. Regular, long-term, customized HE improved the executive function of community-dwelling older people with memory disorders, but the effects were mild and were not observed in other domains of cognition. © 2016, Copyright the Authors Journal compilation © 2016, The American Geriatrics Society.

  6. Influence of the Yukon River on the Bering Sea

    NASA Technical Reports Server (NTRS)

    Dean, K.; Mcroy, C. P.

    1986-01-01

    The relationships between the discharge of the Yukon River to the currents and biological productivity in the northern Bering Sea were studied. Specific objectives were: to develop thermal, sediment, and chlorophyll surface maps using Thematic Mapper (TM) data of the discharge of the Yukon River and the Alaskan Coastal Current during the ice free season; to develop a historical model of the distribution of the Yukon River discharge and the Alaskan Coastal Current using LANDSAT Multispectral band scanner (MSS) and NOAA satellite imagery; and to use high resolution TM data to define the surface dynamics of the front between the Alaskan Coastal Current and the Bering Shelf/Anadyr Current. LANDSAT MSS, TM, and Advanced Very High Resolution Radiometer (AVHRR) data were recorded during the 1985 ice free period. The data coincided with shipboard measurements acquired by Inner Shelf Transfer and Recycling (ISTAR) project scientists. An integrated model of the distribution of turbid water discharged from the Yukon River was compiled. A similar model is also being compiled for the Alaskan Coastal and Bering Shelf/Anadyr water masses based on their thermal expressions seen on AVHRR imagery.

  7. Distributed memory compiler design for sparse problems

    NASA Technical Reports Server (NTRS)

    Wu, Janet; Saltz, Joel; Berryman, Harry; Hiranandani, Seema

    1991-01-01

    A compiler and runtime support mechanism is described and demonstrated. The methods presented are capable of solving a wide range of sparse and unstructured problems in scientific computing. The compiler takes as input a FORTRAN 77 program enhanced with specifications for distributing data, and the compiler outputs a message passing program that runs on a distributed memory computer. The runtime support for this compiler is a library of primitives designed to efficiently support irregular patterns of distributed array accesses and irregular distributed array partitions. A variety of Intel iPSC/860 performance results obtained through the use of this compiler are presented.
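
    The runtime primitives described above follow the inspector/executor pattern: an inspector pass analyses the irregular index set and builds a communication schedule, and an executor pass reuses that schedule to gather off-process data. The sketch below illustrates the idea serially, with process ownership simulated by a block distribution; the names are illustrative and are not the library's API.

    ```python
    import numpy as np

    def owner_of(global_index, block):
        return global_index // block          # block distribution of the global array

    def inspector(my_rank, needed, block):
        """Group the off-process global indices this rank needs, by owning rank."""
        schedule = {}
        for g in needed:
            r = owner_of(g, block)
            if r != my_rank:
                schedule.setdefault(r, []).append(g)
        return schedule

    def executor(schedule, local_parts, block):
        """Gather remote values according to a previously built schedule.
        (In a real runtime each entry would become a message exchange.)"""
        gathered = {}
        for r, idxs in schedule.items():
            for g in idxs:
                gathered[g] = local_parts[r][g - r * block]
        return gathered

    # Toy setup: a global array of 8 elements split across 2 "processes"
    block = 4
    local_parts = {0: np.arange(0, 4) * 10.0, 1: np.arange(4, 8) * 10.0}
    sched = inspector(my_rank=0, needed=[1, 5, 6], block=block)
    print(executor(sched, local_parts, block))   # {5: 50.0, 6: 60.0}
    ```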

  8. Spectroscopic line parameters of NH3 and PH3 in the far infrared

    NASA Technical Reports Server (NTRS)

    Husson, N.; Goldman, A.; Orton, G.

    1982-01-01

    NH3 and PH3 rotation and rotation-inversion line parameters in the far to medium IR are calculated for remote sounding of planetary atmospheres; 1607 lines of (N-14)H3, 362 lines of (N-15)H3 and 325 lines of PH3 are compiled. The absolute intensity formulation has been reviewed for the rotation and rotation-inversion lines of molecules with C(3v) symmetry. The justification for the formulation, the general agreement between the authors, and comparisons with other published expressions are given.

  9. MSU-DOE Plant Research Laboratory

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    1991-01-01

    This document is the compiled progress reports of research funded through the Michigan State University/Department of Energy Plant Research Laboratory. Fourteen reports are included, covering the molecular basis of plant/microbe symbiosis, cell wall biosynthesis and proteins, gene expression, stress responses, plant hormone biosynthesis, interactions between the nuclear and organelle genomes, sensory transduction and tropisms, intracellular sorting and trafficking, regulation of lipid metabolism, molecular basis of disease resistance and plant pathogenesis, developmental biology of Cyanobacteria, and hormonal involvement in environmental control of plant growth. 320 refs., 26 figs., 3 tabs. (MHB)

  10. A learning apprentice for software parts composition

    NASA Technical Reports Server (NTRS)

    Allen, Bradley P.; Holtzman, Peter L.

    1987-01-01

    An overview of the knowledge acquisition component of the Bauhaus, a prototype computer aided software engineering (CASE) workstation for the development of domain-specific automatic programming systems (D-SAPS) is given. D-SAPS use domain knowledge in the refinement of a description of an application program into a compilable implementation. The approach to the construction of D-SAPS was to automate the process of refining a description of a program, expressed in an object-oriented domain language, into a configuration of software parts that implement the behavior of the domain objects.

  11. 76 FR 81462 - Preliminary Plan for Retrospective Analysis of Existing Rules

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-12-28

    ... order to identify unnecessary or unduly burdensome regulations that may be hindering job creation and... inspection and copying during regular business hours in the Commission's Reference Information Center... Express Mail and Priority Mail) must be sent to 9300 East Hampton Drive, Capitol Heights, MD 20743...

  12. MONTO: A Machine-Readable Ontology for Teaching Word Problems in Mathematics

    ERIC Educational Resources Information Center

    Lalingkar, Aparna; Ramnathan, Chandrashekar; Ramani, Srinivasan

    2015-01-01

    The Indian National Curriculum Framework has as one of its objectives the development of mathematical thinking and problem solving ability. However, recent studies conducted in Indian metros have expressed concern about students' mathematics learning. Except in some private coaching academies, regular classroom teaching does not include problem…

  13. 77 FR 29579 - Removing Unnecessary Office on Violence Against Women Regulations

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-05-18

    ... Office on Violence Against Women Regulations AGENCY: Office on Violence Against Women, Justice. ACTION: Proposed rule. SUMMARY: This rule proposes to remove the regulations for the STOP Violence Against Indian... through regular or express mail, they should be sent to Kathi Grasso, Office on Violence Against Women...

  14. Research Methods Teaching in Vocational Environments: Developing Critical Engagement with Knowledge?

    ERIC Educational Resources Information Center

    Gray, C.; Turner, R.; Sutton, C.; Petersen, C.; Stevens, S.; Swain, J.; Esmond, B.; Schofield, C.; Thackeray, D.

    2015-01-01

    Knowledge of research methods is regarded as crucial for the UK economy and workforce. However, research methods teaching is viewed as a challenging area for lecturers and students. The pedagogy of research methods teaching within universities has been noted as underdeveloped, with undergraduate students regularly expressing negative dispositions…

  15. The Community Schools Advisory Panel: A Texas Approach.

    ERIC Educational Resources Information Center

    Defoe, Bettye Haller

    In 1977 the Texas Education Agency (TEA) organized the Community Schools Advisory Panel (CSAP) because administrators of smaller school districts wanted regular opportunities to express their schools' views to TEA decision makers, especially the Commissioner of Education. CSAP consists of 14 representatives of Texas' 1,009 community schools…

  16. Peers for Promotion: Achieving Academic Advancement through Facilitated Peer Mentoring

    ERIC Educational Resources Information Center

    Ockene, Judith K.; Milner, Robert J.; Thorndyke, Luanne E.; Congdon, John; Cain, Joanna M.

    2017-01-01

    The promotion process is challenging, particularly for non-tenure track faculty in academic medicine. To address this challenge, we implemented a facilitated peer mentoring program that included a structured curriculum with regular meetings, guided by two senior faculty mentors. Participants expressed satisfaction with the program, showed…

  17. Coalition Network Defence Common Operational Picture

    DTIC Science & Technology

    2010-11-01

    ...27000.org/iso-27005.htm; [26] ISO 8601:2004, Data elements and interchange formats - Information interchange - Representation of dates and times, http://ww.iso.org, http://en.wikipedia.org/wiki/ISO_8601; ...Regular_expression; [25] ISO/IEC 27005:2008, Information technology -- Security techniques -- Information security risk management, http://ww.iso.org, http://www

  18. Estimating the Volumes of Solid Figures with Curved Surfaces.

    ERIC Educational Resources Information Center

    Cohen, Donald

    1991-01-01

    Several examples of solid figures that calculus students can use to exercise their skills at estimating volume are presented. Although these figures are bounded by surfaces that are portions of regular cylinders, it is interesting to note that their volumes can be expressed as rational numbers. (JJK)

  19. Using Songs to Strengthen Reading Fluency

    ERIC Educational Resources Information Center

    Patel, Pooja; Laud, Leslie E.

    2007-01-01

    This study evaluated the use of songs with lyrics to increase the reading fluency rates of three middle school students. In the first condition, students heard fluent reading modeled, read regular passages repeatedly and then received feedback on accuracy, phrasing and expression. After that, students received the same intervention, except that…

  20. EVALUATION OF EXPERIMENTAL PRESCHOOL PROGRAM FOR EDUCATIONALLY DEPRIVED CHILDREN (1964).

    ERIC Educational Resources Information Center

    STEWART, LUCILLE M.

    THE AIM OF AN EXPERIMENTAL PRESCHOOL PROGRAM FOR EDUCATIONALLY DEPRIVED CHILDREN WAS TO PREPARE THEM FOR REGULAR KINDERGARTEN CLASSES. ACTIVITIES AND EXPERIENCES WERE PROVIDED WHICH HELPED THE CHILDREN EXPRESS THEMSELVES VERBALLY AND BECOME AWARE OF THEIR ENVIRONMENT. THE BUDGET FOR A 6-WEEK PROGRAM, INCLUDING STAFF, PROGRAM SUPPLIES, AND…

  1. Minimizing Wide-Area Performance Disruptions in Inter-Domain Routing

    DTIC Science & Technology

    2011-09-01

    ...Servers. As another example, we saw the average round-trip time double for an ISP in Malaysia. The RTT increase was caused by a traffic shift to different... censorship, conduct wiretapping, or offer poor performance. This is achieved by applying regular expressions to the AS-PATH to assign lower preference...
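
    The routing-policy mechanism summarized above, matching a regular expression against a route's AS-PATH and assigning it a lower preference, can be sketched in a few lines. The sketch below is an illustration in Python rather than router configuration from the report; the AS numbers and preference values are invented.

      import re

      # Hypothetical policy: de-prefer any route whose AS-PATH traverses AS 64500.
      AVOID_AS = re.compile(r"(^|\s)64500(\s|$)")

      def local_preference(as_path, default=100, demoted=50):
          """Return a lower preference when the AS-PATH matches the avoidance regex."""
          return demoted if AVOID_AS.search(as_path) else default

      routes = {
          "10.0.0.0/8": "64496 64500 64510",    # traverses the avoided AS
          "192.0.2.0/24": "64496 64499 64511",  # clean path
      }
      for prefix, path in routes.items():
          print(prefix, local_preference(path))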

  2. 16 CFR 233.2 - Retail price comparisons; comparable value comparisons.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 16 Commercial Practices 1 2012-01-01 2012-01-01 false Retail price comparisons; comparable value... GUIDES AGAINST DECEPTIVE PRICING § 233.2 Retail price comparisons; comparable value comparisons. (a.... Expressed another way, if a number of the principal retail outlets in the area are regularly selling Brand X...

  3. 16 CFR 233.2 - Retail price comparisons; comparable value comparisons.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 16 Commercial Practices 1 2011-01-01 2011-01-01 false Retail price comparisons; comparable value... GUIDES AGAINST DECEPTIVE PRICING § 233.2 Retail price comparisons; comparable value comparisons. (a.... Expressed another way, if a number of the principal retail outlets in the area are regularly selling Brand X...

  4. 16 CFR 233.2 - Retail price comparisons; comparable value comparisons.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 16 Commercial Practices 1 2014-01-01 2014-01-01 false Retail price comparisons; comparable value... GUIDES AGAINST DECEPTIVE PRICING § 233.2 Retail price comparisons; comparable value comparisons. (a.... Expressed another way, if a number of the principal retail outlets in the area are regularly selling Brand X...

  5. 16 CFR 233.2 - Retail price comparisons; comparable value comparisons.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 16 Commercial Practices 1 2013-01-01 2013-01-01 false Retail price comparisons; comparable value... GUIDES AGAINST DECEPTIVE PRICING § 233.2 Retail price comparisons; comparable value comparisons. (a.... Expressed another way, if a number of the principal retail outlets in the area are regularly selling Brand X...

  6. The Role of Semantic Features in Verb Processing

    ERIC Educational Resources Information Center

    Bonnotte, Isabelle

    2008-01-01

    The present study examined the general hypothesis that, as for nouns, stable representations of semantic knowledge relative to situations expressed by verbs are available and accessible in long-term memory in normal people. Regular associations between verbs and past tenses in French adults allowed the abstraction of two superordinate semantic features…

  7. The International Conference on Amorphous and Liquid Semiconductors (9th).

    DTIC Science & Technology

    1979-12-11

    loop effective action of a constant gluon field can be expressed in terms of the experimentally determinable Λ. In the following chapter, the... regularization and Schwinger's proper time method. The renormalization mass parameters appearing in the two treatments can then be related and the exact one...

  8. Disabilities in Written Expression

    ERIC Educational Resources Information Center

    Gardner, Teresa J.

    2011-01-01

    Regular education teachers may have received inadequate preparation to work with the variety of student disabilities encountered in the classroom, or they may have received limited training regarding the full range of learning disabilities and their effects on classroom performance. Along with problems in the area of math, students may also have…

  9. Compiler-assisted multiple instruction rollback recovery using a read buffer

    NASA Technical Reports Server (NTRS)

    Alewine, N. J.; Chen, S.-K.; Fuchs, W. K.; Hwu, W.-M.

    1993-01-01

    Multiple instruction rollback (MIR) is a technique that has been implemented in mainframe computers to provide rapid recovery from transient processor failures. Hardware-based MIR designs eliminate rollback data hazards by providing data redundancy implemented in hardware. Compiler-based MIR designs have also been developed which remove rollback data hazards directly with data-flow transformations. This paper focuses on compiler-assisted techniques to achieve multiple instruction rollback recovery. We observe that some data hazards resulting from instruction rollback can be resolved efficiently by providing an operand read buffer while others are resolved more efficiently with compiler transformations. A compiler-assisted multiple instruction rollback scheme is developed which combines hardware-implemented data redundancy with compiler-driven hazard removal transformations. Experimental performance evaluations indicate improved efficiency over previous hardware-based and compiler-based schemes.

  10. Context-sensitive trace inlining for Java.

    PubMed

    Häubl, Christian; Wimmer, Christian; Mössenböck, Hanspeter

    2013-12-01

    Method inlining is one of the most important optimizations in method-based just-in-time (JIT) compilers. It widens the compilation scope and therefore allows optimizing multiple methods as a whole, which increases performance. However, if method inlining is used too frequently, the compilation time increases and too much machine code is generated. This has negative effects on performance. Trace-based JIT compilers only compile frequently executed paths, so-called traces, instead of whole methods. This may result in faster compilation, less generated machine code, and better optimized machine code. In previous work, we implemented a trace recording infrastructure and a trace-based compiler for Java by modifying the Java HotSpot VM. Based on this work, we evaluate the effect of trace inlining on the performance and the amount of generated machine code. Trace inlining has several major advantages when compared to method inlining. First, trace inlining is more selective than method inlining, because only frequently executed paths are inlined. Second, the recorded traces may capture information about virtual calls, which simplifies inlining. A third advantage is that trace information is context sensitive, so that different method parts can be inlined depending on the specific call site. These advantages allow more aggressive inlining while the amount of generated machine code is still reasonable. We evaluate several inlining heuristics on the benchmark suites DaCapo 9.12 Bach, SPECjbb2005, and SPECjvm2008 and show that our trace-based compiler achieves an up to 51% higher peak performance than the method-based Java HotSpot client compiler. Furthermore, we show that the large compilation scope of our trace-based compiler has a positive effect on other compiler optimizations such as constant folding or null check elimination.
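
    To make the inlining policy concrete, the sketch below records traces per call site and inlines only those that are both hot and short. It is a toy illustration of the idea of context-sensitive trace inlining, not the Java HotSpot implementation; the thresholds and names are invented.

      from collections import Counter, defaultdict

      # Toy sketch of context-sensitive trace inlining: traces are recorded per
      # call site, and only call sites whose traces are hot enough are inlined.
      HOT_THRESHOLD = 1000          # invented execution-count threshold
      MAX_INLINED_TRACE_LEN = 20    # invented code-size cap (abstract "operations")

      trace_counts = defaultdict(Counter)   # call site -> {trace (tuple of blocks): executions}

      def record_trace(call_site, trace):
          trace_counts[call_site][tuple(trace)] += 1

      def traces_to_inline(call_site):
          """Pick only the frequently executed, reasonably short traces for this site."""
          return [t for t, n in trace_counts[call_site].items()
                  if n >= HOT_THRESHOLD and len(t) <= MAX_INLINED_TRACE_LEN]

      # Simulated profile: the same callee behaves differently at two call sites.
      for _ in range(1500):
          record_trace("siteA", ["entry", "fast_path", "return"])
      for _ in range(10):
          record_trace("siteB", ["entry", "slow_path", "return"])

      print(traces_to_inline("siteA"))   # hot trace -> candidate for inlining
      print(traces_to_inline("siteB"))   # cold trace -> left as a call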

  11. Research and Practice of the News Map Compilation Service

    NASA Astrophysics Data System (ADS)

    Zhao, T.; Liu, W.; Ma, W.

    2018-04-01

    Based on the needs of the news media for maps, this paper researches the news map compilation service: it conducts demand research on compiling news maps, designs and compiles a public-authority base map suitable for media publication, and constructs a news base map material library. It studies the compilation of timely, highly pertinent, cross-regional domestic and international news maps, constructs a hot-news thematic gallery and news map customization services, conducts research on types of news maps, establishes closer liaison and cooperation methods with news media, and guides news media to use correct maps. Through the practice of the news map compilation service, this paper presents two cases of news map preparation services used by different media, compares and analyses the cases, summarizes the state of research on the news map compilation service, and puts forward outstanding problems and development suggestions for the service.

  12. Circadian expression of steroidogenic cytochromes P450 in the mouse adrenal gland--involvement of cAMP-responsive element modulator in epigenetic regulation of Cyp17a1.

    PubMed

    Košir, Rok; Zmrzljak, Ursula Prosenc; Bele, Tanja; Acimovic, Jure; Perse, Martina; Majdic, Gregor; Prehn, Cornelia; Adamski, Jerzy; Rozman, Damjana

    2012-05-01

    The cytochrome P450 (CYP) genes Cyp51, Cyp11a1, Cyp17a1, Cyp11b1, Cyp11b2 and Cyp21a1 are involved in the adrenal production of corticosteroids, whose circulating levels are circadian. cAMP signaling plays an important role in adrenal steroidogenesis. By using cAMP responsive element modulator (Crem) knockout mice, we show that CREM isoforms contribute to circadian expression of steroidogenic CYPs in the mouse adrenal gland. Most striking was the CREM-dependent hypomethylation of the Cyp17a1 promoter at zeitgeber time 12, which resulted in higher Cyp17a1 mRNA and protein expression in the knockout adrenal glands. The data indicate that products of the Crem gene control the epigenetic repression of Cyp17 in mouse adrenal glands. © 2011 The Authors. Journal compilation © 2011 FEBS.

  13. Differential expression of liver and kidney proteins in a mouse model for primary hyperoxaluria type I.

    PubMed

    Hernández-Fernaud, Juan R; Salido, Eduardo

    2010-11-01

    Mutations in the alanine-glyoxylate aminotransferase gene (AGXT) are responsible for primary hyperoxaluria type I, a rare disease characterized by excessive hepatic oxalate production that leads to renal failure. A deeper understanding of the changes in the metabolic pathways secondary to the lack of AGXT expression is needed in order to explore substrate depletion as a therapeutic strategy to limit oxalate production in primary hyperoxaluria type I. We have developed an Agxt knockout (AgxtKO) mouse that reproduces some key features of primary hyperoxaluria type I. To improve our understanding of the metabolic adjustments subsequent to AGXT deficiency, we performed a proteomic analysis of the changes in expression levels of various subcellular fractions of liver and kidney metabolism linked to the lack of AGXT. In this article, we report specific changes in the liver and kidney proteome of AgxtKO mice that point to significant variations in gluconeogenesis, glycolysis and fatty acid pathways. Journal compilation © 2010 FEBS. No claim to original German government works.

  14. A Comparison of Automatic Parallelization Tools/Compilers on the SGI Origin 2000 Using the NAS Benchmarks

    NASA Technical Reports Server (NTRS)

    Saini, Subhash; Frumkin, Michael; Hribar, Michelle; Jin, Hao-Qiang; Waheed, Abdul; Yan, Jerry

    1998-01-01

    Porting applications to new high performance parallel and distributed computing platforms is a challenging task. Since writing parallel code by hand is extremely time consuming and costly, porting codes would ideally be automated by using some parallelization tools and compilers. In this paper, we compare the performance of the hand-written NAS Parallel Benchmarks against three parallel versions generated with the help of tools and compilers: 1) CAPTools, an interactive computer-aided parallelization tool that generates message passing code; 2) the Portland Group's HPF compiler; and 3) compiler directives with the native FORTRAN77 compiler on the SGI Origin 2000.

  15. Attachment in the doctor-patient relationship in general practice: a qualitative study.

    PubMed

    Frederiksen, Heidi Bøgelund; Kragstrup, Jakob; Dehlholm-Lambertsen, Birgitte

    2010-09-01

    To explore why interpersonal continuity with a regular doctor is valuable to patients. A qualitative study based on 22 interviews with patients, 12 who saw their regular general practitioner (GP) and 10 who saw an unfamiliar GP. The patients were selected after an observed consultation and sampled purposively according to reason for encounter, age, and sex. The research question was answered by means of psychological theory. A need for attachment was a central issue for the understanding of the value of interpersonal continuity for patients. The patients explained that they preferred to create a personal relationship with their GP and the majority expressed a degree of vulnerability in the doctor-patient relationship. The more sick or worried they were the more vulnerable and the more in need of a regular GP. Furthermore, patients stated that it was difficult for them to change GP even if they had a poor relationship. Attachment theory may provide an explanation for patients' need to see a regular GP. The vulnerability of being a patient creates a need for attachment to a caregiver. This need is fundamental and is activated in adults when they are sick or scared.

  16. A function space framework for structural total variation regularization with applications in inverse problems

    NASA Astrophysics Data System (ADS)

    Hintermüller, Michael; Holler, Martin; Papafitsoros, Kostas

    2018-06-01

    In this work, we introduce a function space setting for a wide class of structural/weighted total variation (TV) regularization methods motivated by their applications in inverse problems. In particular, we consider a regularizer that is the appropriate lower semi-continuous envelope (relaxation) of a suitable TV type functional initially defined for sufficiently smooth functions. We study examples where this relaxation can be expressed explicitly, and we also provide refinements for weighted TV for a wide range of weights. Since an integral characterization of the relaxation in function space is, in general, not always available, we show that, for a rather general linear inverse problems setting, instead of the classical Tikhonov regularization problem, one can equivalently solve a saddle-point problem where no a priori knowledge of an explicit formulation of the structural TV functional is needed. In particular, motivated by concrete applications, we deduce corresponding results for linear inverse problems with norm and Poisson log-likelihood data discrepancy terms. Finally, we provide proof-of-concept numerical examples where we solve the saddle-point problem for weighted TV denoising as well as for MR guided PET image reconstruction.
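
    For orientation, the weighted TV functional discussed above is commonly written in the following dual form, and the classical Tikhonov problem it regularizes reads as below; this is standard notation from the TV literature, not an excerpt from the paper.

      \mathrm{TV}_w(u) = \sup\Big\{ \int_\Omega u \,\operatorname{div}\varphi \,\mathrm{d}x :
          \varphi \in C_c^1(\Omega;\mathbb{R}^d),\ |\varphi(x)| \le w(x) \ \forall x \in \Omega \Big\},
      \qquad
      \min_{u} \ \tfrac{1}{2}\|Au - f\|^2 + \alpha\,\mathrm{TV}_w(u).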

  17. Rhythm sensitivity in macaque monkeys

    PubMed Central

    Selezneva, Elena; Deike, Susann; Knyazeva, Stanislava; Scheich, Henning; Brechmann, André; Brosch, Michael

    2013-01-01

    This study provides evidence that monkeys are rhythm sensitive. We composed isochronous tone sequences consisting of repeating triplets of two short tones and one long tone which humans perceive as repeating triplets of two weak and one strong beat. This regular sequence was compared to an irregular sequence with the same number of randomly arranged short and long tones with no such beat structure. To search for indication of rhythm sensitivity we employed an oddball paradigm in which occasional duration deviants were introduced in the sequences. In a pilot study on humans we showed that subjects more easily detected these deviants when they occurred in a regular sequence. In the monkeys we searched for spontaneous behaviors the animals executed concomitant with the deviants. We found that monkeys more frequently exhibited changes of gaze and facial expressions to the deviants when they occurred in the regular sequence compared to the irregular sequence. In addition we recorded neuronal firing and local field potentials from 175 sites of the primary auditory cortex during sequence presentation. We found that both types of neuronal signals differentiated regular from irregular sequences. Both signals were stronger in regular sequences and occurred after the onset of the long tones, i.e., at the position of the strong beat. Local field potential responses were also significantly larger for the durational deviants in regular sequences, yet in a later time window. We speculate that these temporal pattern-selective mechanisms with a focus on strong beats and their deviants underlie the perception of rhythm in the chosen sequences. PMID:24046732

  18. High priority needs for range-wide monitoring of North American landbirds

    USGS Publications Warehouse

    Dunn, Erica H.; Altman, B.L.; Bart, J.; Beardmore, C.J.; Berlanga, H.; Blancher, P.J.; Butcher, G.S.; Demarest, D.W.; Dettmers, R.; Hunter, W.C.; Iñigo-Elias, Eduardo E.; Panjabi, A.O.; Pashley, D.N.; Ralph, C.J.; Rich, T.D.; Rosenberg, K.V.; Rustay, C.M.; Ruth, J.M.; Will, T.C.

    2005-01-01

    This document is an extension of work done for the Partners in Flight North American Landbird Conservation Plan (Rich et al. 2004). The Continental Plan reviewed the conservation status of the 448 native landbird species that regularly breed in the United States and Canada. Two groups of species were identified as having high conservation importance: the PIF Watch List, made up of species for which there is conservation concern, and Stewardship Species that are particularly characteristic of regional avifaunas. In addition, continental-scale monitoring needs were identified for all species. Here we extend the monitoring needs aspect of the Plan, providing additional detail and suggesting the best means of filling the gaps in broad-scale, long-term trend monitoring. This analysis and report were compiled by the Partners in Flight (PIF) Science Committee as a contribution to current work by the North American Bird Conservation Initiative to assess the status of bird population monitoring in North America and to make recommendations for improvements.

  19. Application of LANDSAT imagery to geologic mapping in the ice-free valleys of Antarctica

    NASA Technical Reports Server (NTRS)

    Houston, R. S. (Principal Investigator); Marrs, R. W.; Smithson, S. B.

    1976-01-01

    The author has identified the following significant results. Studies in the Ice-Free Valleys have resulted in the compilation of a sizeable library of maps and publications. Rock reflectance measurements were taken during the Antarctic summer of 1973. Spectral reflectances of rocks (mostly mafic lava flows) in the McMurdo and Ice-Free Valleys areas were measured using a filter wheel photometer equipped to measure reflectances in the four Landsat bands. A series of samples was collected at regular intervals across a large differentiated mafic sill near Lake Vida. Chemical analyses of the samples suggest that the tonal variations in this sill are controlled by changes in the iron content of the rock. False color images were prepared for a number of areas by the diazo method and with an optical multispectral biviewer. These images were useful in defining boundaries of sea ice and snow cover and in the study of ablating glaciers, but were not very useful for rock discrimination.

  20. Sea lice infestations on farmed Atlantic salmon in Scotland and the use of ectoparasitic treatments.

    PubMed

    Revie, C W; Gettinby, G; Treasurer, J W; Grant, A N; Reid, S W J

    A recently compiled national database on sea lice infestations on farmed Atlantic salmon contains detailed records for the period 1996 to 2000 from over 30 commercial sites on the west coast of Scotland. The data indicate that the two prevalent species of lice, Lepeophtheirus salmonis and Caligus elongatus, have different trends in abundance and distinctive seasonal patterns of infestation on farmed salmon. For the economically important species L. salmonis, abundance on fish varies with the time of the production cycle, the time of year and the particular year. Weekly fluctuations in sea lice counts indicate that treatment can be very effective in controlling infestations, but that the counts recover rapidly and regular treatments are necessary to ensure control. A comparison of sites using medium or large numbers of treatments suggests that they do not reduce sea lice infestations to the same levels. There is also evidence that sites using treatments based on different chemical constituents had significantly different levels of infestation.

  1. Processing large remote sensing image data sets on Beowulf clusters

    USGS Publications Warehouse

    Steinwand, Daniel R.; Maddox, Brian; Beckmann, Tim; Schmidt, Gail

    2003-01-01

    High-performance computing is often concerned with the speed at which floating-point calculations can be performed. The architectures of many parallel computers and/or their network topologies are based on these investigations. Often, benchmarks resulting from these investigations are compiled with little regard to how a large dataset would move about in these systems. This part of the Beowulf study addresses that concern by looking at specific applications software and system-level modifications. Applications include an implementation of a smoothing filter for time-series data, a parallel implementation of the decision tree algorithm used in the Landcover Characterization project, a parallel Kriging algorithm used to fit point data collected in the field on invasive species to a regular grid, and modifications to the Beowulf project's resampling algorithm to handle larger, higher resolution datasets at a national scale. Systems-level investigations include a feasibility study on Flat Neighborhood Networks and modifications of that concept with Parallel File Systems.
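
    To make the gridding step concrete, the sketch below bins irregular field samples onto a regular grid by cell averaging. It is a deliberately simplified stand-in for the parallel Kriging algorithm mentioned above, written in Python with invented coordinates and grid size.

      import numpy as np

      def grid_points(x, y, values, nx=100, ny=100):
          """Average scattered samples into the cells of a regular nx-by-ny grid.

          A simplified stand-in for Kriging: each cell holds the mean of the samples
          falling inside it, and empty cells stay NaN.
          """
          xi = np.clip(((x - x.min()) / (x.ptp() or 1.0) * (nx - 1)).astype(int), 0, nx - 1)
          yi = np.clip(((y - y.min()) / (y.ptp() or 1.0) * (ny - 1)).astype(int), 0, ny - 1)
          total = np.zeros((ny, nx))
          count = np.zeros((ny, nx))
          np.add.at(total, (yi, xi), values)
          np.add.at(count, (yi, xi), 1)
          return np.where(count > 0, total / np.maximum(count, 1), np.nan)

      rng = np.random.default_rng(0)
      x, y = rng.uniform(0, 10, 5000), rng.uniform(0, 10, 5000)
      z = np.sin(x) + 0.1 * rng.standard_normal(5000)
      grid = grid_points(x, y, z)
      print(np.nanmean(grid))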

  2. The Seungjeongwon Ilgi as a Major Source of Korean Astronomical Records

    NASA Astrophysics Data System (ADS)

    Stephenson, F. Richard

    The importance of early Korean records of supernovae, comets, meteors and aurorae in modern astronomy is well-known. However, the most extensive Korean source of such data, the Seungjeongwon Ilgi (Daily records of the Office of Royal Secretariat), has received relatively little attention among historians of astronomy. Written in Chinese (Hanmun), the Seungjeongwon Ilgi is a day-to-day chronicle of important events. The main emphasis is on matters of court and state, but observations of a wide variety of astronomical phenomena are regularly included. Although maintenance of the chronicle began early in the Joseon Dynasty (AD 1392-1910), due to wars and rebellions only the records from AD 1623 to 1894 now survive. Nevertheless, the remaining text is substantial, containing more than 3,000 chapters. In this paper, the general format of the astronomical records in the Seungjeongwon Ilgi is discussed, together with examples of the various types of celestial observations which this huge compilation contains.

  3. [Strengthening the medical aspect of addiction care].

    PubMed

    van Brussel, G H

    2003-08-23

    The Dutch Association for Addiction Medicine and the umbrella organisation GGZ Nederland (sector organisation for mental health and addiction care) have compiled a report entitled 'Strengthening medical care in the addiction care sector'. The report argues why medical care needs to be strengthened and provides guidance as to how the present shortcomings in quality and quantity can be dealt with. Addiction is now considered to be a medical condition with patients instead of clients. This means that the care, including the financial aspects, needs to be organised in the same way as all other forms of regular health care. Furthermore, the training in addiction medicine needs to be given a clearer status in the form of departments, professorships, training institutes and certification. Within the context of this report the responsibility of addiction centres needs to be emphasised. Vacancies in the many forms of social work could be exchanged for well-trained nurses and physicians, without the need for extra financial assistance.

  4. Transparent process migration: Design alternatives and the Sprite implementation

    NASA Technical Reports Server (NTRS)

    Douglis, Fred; Ousterhout, John

    1991-01-01

    The Sprite operating system allows executing processes to be moved between hosts at any time. We use this process migration mechanism to offload work onto idle machines, and also to evict migrated processes when idle workstations are reclaimed by their owners. Sprite's migration mechanism provides a high degree of transparency both for migrated processes and for users. Idle machines are identified, and eviction is invoked, automatically by daemon processes. On Sprite it takes up to a few hundred milliseconds on SPARCstation 1 workstations to perform a remote exec, while evictions typically occur in a few seconds. The pmake program uses remote invocation to invoke tasks concurrently. Compilations commonly obtain speedup factors in the range of three to six; they are limited primarily by contention for centralized resources such as file servers. CPU-bound tasks such as simulations can make more effective use of idle hosts, obtaining as much as eight-fold speedup over a period of hours. Process migration has been in regular service for over two years.

  5. Void statistics, scaling, and the origins of large-scale structure

    NASA Technical Reports Server (NTRS)

    Fry, J. N.; Giovanelli, Riccardo; Haynes, Martha P.; Melott, Adrian L.; Scherrer, Robert J.

    1989-01-01

    The probability that a volume of the universe of given size and shape spaced at random will be void of galaxies is used here to study various models of the origin of cosmological structures. Numerical simulations are conducted on hot-particle and cold-particle-modulated inflationary models with and without biasing, on isothermal or initially Poisson models, and on models where structure is seeded by loops of cosmic string. For the Pisces-Perseus redshift compilation of Giovanelli and Haynes (1985), it is found that hierarchical scaling is obeyed for subsamples constructed with different limiting magnitudes and subsamples taken at random. This result confirms that the hierarchical ansatz holds valid to high order and supports the idea that structure in the observed universe evolves by a regular process from an almost Gaussian primordial state. Neutrino models without biasing show the effect of a strong feature in the initial power spectrum. Cosmic string models do not agree well with the galaxy data.

  6. "Observation Obscurer" - Time Series Viewer, Editor and Processor

    NASA Astrophysics Data System (ADS)

    Andronov, I. L.

    The program is described, which contains a set of subroutines suitable for fast viewing and interactive filtering and processing of regularly and irregularly spaced time series. Being a 32-bit DOS application, it may be used as a default fast viewer/editor of time series in any computer shell ("commander") or in Windows. It allows viewing the data in the "time" or "phase" mode, removing ("obscuring") or filtering outstanding bad points, and making scale transformations and smoothing using a few methods (e.g. mean with phase binning, determination of the statistically optimal number of phase bins, and a "running parabola" fit (Andronov, 1997, As. Ap. Suppl., 125, 207)), as well as time series analysis using some methods, e.g. correlation, autocorrelation and histogram analysis, determination of extrema, etc. Some features have been developed specially for variable star observers, e.g. the barycentric correction and the creation and fast analysis of "O-C" diagrams. The manual for "hot keys" is presented. The computer code was compiled with a 32-bit Free Pascal (www.freepascal.org).
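
    The "mean with phase binning" smoothing mentioned above can be expressed compactly; the sketch below is a Python illustration of phase folding and binning (the program itself is Free Pascal), with an invented period, epoch, and bin count.

      import numpy as np

      def phase_bin(times, mags, period, epoch=0.0, nbins=20):
          """Fold a light curve on a trial period and return per-bin mean magnitudes."""
          phase = ((times - epoch) / period) % 1.0
          bins = np.minimum((phase * nbins).astype(int), nbins - 1)
          means = np.array([mags[bins == b].mean() if np.any(bins == b) else np.nan
                            for b in range(nbins)])
          centers = (np.arange(nbins) + 0.5) / nbins
          return centers, means

      # Synthetic light curve with an invented 2.5-day period.
      t = np.linspace(0, 30, 500)
      m = 12.0 + 0.3 * np.sin(2 * np.pi * t / 2.5) + 0.02 * np.random.randn(500)
      centers, means = phase_bin(t, m, period=2.5)
      print(means.round(2))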

  7. A compilation of sulfur dioxide and carbon dioxide emission-rate data from Cook Inlet volcanoes (Redoubt, Spurr, Iliamna, and Augustine), Alaska during the period from 1990 to 1994

    USGS Publications Warehouse

    Doukas, Michael P.

    1995-01-01

    Airborne sulfur dioxide (SO2) gas sampling of the Cook Inlet volcanoes (Mt. Spurr, Redoubt, Iliamna, and Augustine) began in 1986, when several measurements were carried out at Augustine volcano during the eruption of 1986 (Rose and others, 1988). More systematic monitoring for SO2 began in March 1990 and for carbon dioxide (CO2) in June 1990 at Redoubt Volcano (Brantley, 1990 and Casadevall and others, 1994) and continues to the present. This report contains all of the available daily SO2 and CO2 emission rates determined by the U.S. Geological Survey (USGS) from March 1990 through July 1994. Intermittent measurements (four- to six-month intervals) at Augustine and Iliamna began in 1990 and continue to the present. Intermittent measurements began at Mt. Spurr volcano in 1991 and were continued at more regular intervals from June 1992, through the 1992 eruption at the Crater Peak vent, to the present.

  8. Crystal surface analysis using matrix textural features classified by a probabilistic neural network

    NASA Astrophysics Data System (ADS)

    Sawyer, Curry R.; Quach, Viet; Nason, Donald; van den Berg, Lodewijk

    1991-12-01

    A system is under development in which surface quality of a growing bulk mercuric iodide crystal is monitored by video camera at regular intervals for early detection of growth irregularities. Mercuric iodide single crystals are employed in radiation detectors. A microcomputer system is used for image capture and processing. The digitized image is divided into multiple overlapping sub-images and features are extracted from each sub-image based on statistical measures of the gray tone distribution, according to the method of Haralick. Twenty parameters are derived from each sub-image and presented to a probabilistic neural network (PNN) for classification. This number of parameters was found to be optimal for the system. The PNN is a hierarchical, feed-forward network that can be rapidly reconfigured as additional training data become available. Training data is gathered by reviewing digital images of many crystals during their growth cycle and compiling two sets of images, those with and without irregularities.
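
    For readers unfamiliar with such gray-tone (Haralick-style) texture parameters, the snippet below computes a small gray-level co-occurrence matrix and three common statistics from an image patch. It is an illustration only, not the monitoring system's code; the number of gray levels and the pixel offset are arbitrary choices.

      import numpy as np

      def cooccurrence_features(patch, levels=8, dx=1, dy=0):
          """Return (contrast, energy, homogeneity) from a gray-level co-occurrence matrix."""
          q = (patch.astype(float) / (patch.max() or 1) * (levels - 1)).astype(int)
          glcm = np.zeros((levels, levels))
          h, w = q.shape
          for r in range(h - dy):
              for c in range(w - dx):
                  glcm[q[r, c], q[r + dy, c + dx]] += 1
          glcm /= glcm.sum()
          i, j = np.indices(glcm.shape)
          contrast = np.sum((i - j) ** 2 * glcm)
          energy = np.sum(glcm ** 2)
          homogeneity = np.sum(glcm / (1.0 + np.abs(i - j)))
          return contrast, energy, homogeneity

      patch = np.random.default_rng(1).integers(0, 256, size=(64, 64))
      print(cooccurrence_features(patch))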

  9. Half-life determination for {sup 108}Ag and {sup 110}Ag

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zahn, Guilherme S.; Genezini, Frederico A.

    2014-11-11

    In this work, the half-lives of the short-lived silver radionuclides {sup 108}Ag and {sup 110}Ag were measured by following the activity of samples after they were irradiated in the IEA-R1 reactor. The results were then fitted using a non-paralyzable dead-time correction to the regular exponential decay, and the individual half-life values obtained were then analyzed using both the Normalized Residuals and the Rajeval techniques, in order to reach the most exact and precise final values. To check the validity of the dead-time correction, a second correction method was also employed by counting a long-lived {sup 60}Co radioactive source together with the samples as a livetime chronometer. The final half-life values obtained using both dead-time correction methods were in good agreement, showing that the correction was properly assessed. The results obtained are partially compatible with the literature values, but with a lower uncertainty, and allow a discussion of the values in the latest ENSDF compilations.
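
    A minimal illustration of the two ingredients named above, a non-paralyzable dead-time correction followed by an exponential decay fit, is sketched below in Python; the dead time, count rates, and half-life are invented numbers, not the measured values.

      import numpy as np

      def deadtime_correct(measured_rate, tau):
          """Non-paralyzable dead-time correction: true rate n = m / (1 - m*tau)."""
          return measured_rate / (1.0 - measured_rate * tau)

      # Synthetic decay curve: invented 'true' half-life of 140 s and dead time of 5 microseconds.
      rng = np.random.default_rng(2)
      t = np.arange(0.0, 1200.0, 30.0)                     # seconds
      true_rate = 5000.0 * np.exp(-np.log(2) / 140.0 * t)  # counts per second
      measured = true_rate / (1.0 + true_rate * 5e-6)      # what a non-paralyzable counter records
      measured *= 1.0 + 0.01 * rng.standard_normal(t.size)

      corrected = deadtime_correct(measured, 5e-6)
      # Straight-line fit of log(rate) = log(A) - lambda * t, then half-life = ln(2) / lambda.
      slope, intercept = np.polyfit(t, np.log(corrected), 1)
      print("fitted half-life [s]:", np.log(2) / -slope)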

  10. Summarising the National Inventory of South Africa for the Public and its Application in Heritage Management

    NASA Astrophysics Data System (ADS)

    Mlungwana, N.; Jackson, C.

    2017-08-01

    This paper focuses on the national inventory of South Africa and its application in heritage management. The South African Heritage Resources Agency (SAHRA) is mandated to compile and maintain an inventory of the national estate, defined as heritage resources of cultural and other significance, as per Sections 3 and 39 of the National Heritage Resources Act No. 25 of 1999 (NHRA). This inventory is presented in the form of a database facilitated through the South African Heritage Resources Information System (SAHRIS). SAHRA is also mandated to produce a summary and analysis of this inventory of the national estate at regular intervals, as per Section 39(7) of the NHRA. This inventory and its subsequent publication facilitate accountability for the institution, access to the data by the public, and public awareness. The national inventory is populated through numerous digitisation projects by various heritage institutions, namely museums, galleries, Provincial Heritage Resources Authorities (PHRAs) and the public at large.

  11. Age of the Scan Basin (Scotia Sea)

    NASA Astrophysics Data System (ADS)

    Schreider, Al. A.; Schreider, A. A.; Galindo-Zaldivar, J.; Maldonado, A.; Sazhneva, A. E.; Evsenko, E. I.

    2017-03-01

    Integrated geological and geophysical analysis of the anomalous magnetic field, along with previously unpublished profiles of Spanish expeditions onboard the R/V Hesperides and international databases of geomagnetic data processed in the context of global tectonics concepts, made it possible to identify paleomagnetic anomalies C11-C15 and compile the first map of the bottom geochronology of the Scan Basin. Unlike in earlier publications, the spreading paleoaxis does not extend northeast but at an angle of approximately 345°. According to calculations, spreading began 35.294‒35.706 Ma ago during chron C15r, and the spreading paleoaxis was abandoned 29.527‒29.970 Ma ago during chron C11n.2n. Thus, the destruction of the American-Antarctic bridge in the region joining the Bruce and Discovery banks, with formation of oceanic crust in the Scan Basin, started about 36 Ma ago. Regular spreading of the bottom continued for about 6 Ma at an average rate close to 1.8 cm/year.

  12. 36 CFR 902.57 - Investigatory files compiled for law enforcement purposes.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 36 Parks, Forests, and Public Property 3 2011-07-01 2011-07-01 false Investigatory files compiled... Records § 902.57 Investigatory files compiled for law enforcement purposes. (a) Files compiled by the...) Constitute an unwarranted invasion of personal privacy; (4) Disclose the identity of a confidential source...

  13. ASSIST - THE ABSTRACT SEMI-MARKOV SPECIFICATION INTERFACE TO THE SURE TOOL PROGRAM (SUN VERSION)

    NASA Technical Reports Server (NTRS)

    Johnson, S. C.

    1994-01-01

    ASSIST, the Abstract Semi-Markov Specification Interface to the SURE Tool program, is an interface that will enable reliability engineers to accurately design large semi-Markov models. The user describes the failure behavior of a fault-tolerant computer system in an abstract, high-level language. The ASSIST program then automatically generates a corresponding semi-Markov model. The abstract language allows efficient description of large, complex systems; a one-page ASSIST-language description may result in a semi-Markov model with thousands of states and transitions. The ASSIST program also includes model-reduction techniques to facilitate efficient modeling of large systems. Instead of listing the individual states of the Markov model, reliability engineers can specify the rules governing the behavior of a system, and these are used to automatically generate the model. ASSIST reads an input file describing the failure behavior of a system in an abstract language and generates a Markov model in the format needed for input to SURE, the semi-Markov Unreliability Range Evaluator program, and PAWS/STEM, the Pade Approximation with Scaling program and Scaled Taylor Exponential Matrix. A Markov model consists of a number of system states and transitions between them. Each state in the model represents a possible state of the system in terms of which components have failed, which ones have been removed, etc. Within ASSIST, each state is defined by a state vector, where each element of the vector takes on an integer value within a defined range. An element can represent any meaningful characteristic, such as the number of working components of one type in the system, or the number of faulty components of another type in use. Statements representing transitions between states in the model have three parts: a condition expression, a destination expression, and a rate expression. The first expression is a Boolean expression describing the state space variable values of states for which the transition is valid. The second expression defines the destination state for the transition in terms of state space variable values. The third expression defines the distribution of elapsed time for the transition. The mathematical approach chosen to solve a reliability problem may vary with the size and nature of the problem. Although different solution techniques are utilized on different programs, it is possible to have a common input language. The Systems Validation Methods group at NASA Langley Research Center has created a set of programs that form the basis for a reliability analysis workstation. The set of programs are: SURE reliability analysis program (COSMIC program LAR-13789, LAR-14921); the ASSIST specification interface program (LAR-14193, LAR-14923), PAWS/STEM reliability analysis programs (LAR-14165, LAR-14920); and the FTC fault tree tool (LAR-14586, LAR-14922). FTC is used to calculate the top-event probability for a fault tree. PAWS/STEM and SURE are programs which interpret the same SURE language, but utilize different solution methods. ASSIST is a preprocessor that generates SURE language from a more abstract definition. SURE, ASSIST, and PAWS/STEM are also offered as a bundle. Please see the abstract for COS-10039/COS-10041, SARA - SURE/ASSIST Reliability Analysis Workstation, for pricing details. ASSIST was originally developed for DEC VAX series computers running VMS and was later ported for use on Sun computers running SunOS. 
The VMS version (LAR14193) is written in C-language and can be compiled with the VAX C compiler. The standard distribution medium for the VMS version of ASSIST is a 9-track 1600 BPI magnetic tape in VMSINSTAL format. It is also available on a TK50 tape cartridge in VMSINSTAL format. Executables are included. The Sun version (LAR14923) is written in ANSI C-language. An ANSI compliant C compiler is required in order to compile this package. The standard distribution medium for the Sun version of ASSIST is a .25 inch streaming magnetic tape cartridge in UNIX tar format. Both Sun3 and Sun4 executables are included. Electronic copies of the documentation in PostScript, TeX, and DVI formats are provided on the distribution medium. (The VMS distribution lacks the .DVI format files, however.) ASSIST was developed in 1986 and last updated in 1992. DEC, VAX, VMS, and TK50 are trademarks of Digital Equipment Corporation. SunOS, Sun3, and Sun4 are trademarks of Sun Microsystems, Inc. UNIX is a registered trademark of AT&T Bell Laboratories.
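
    The rule-driven model generation sketched in the description above (a state vector plus transitions given by condition, destination, and rate expressions) can be imitated in a few lines. The snippet below is a toy Python re-implementation of that idea, not the ASSIST language or its syntax; the component counts and failure rates are invented.

      import itertools

      # Toy rule-driven Markov-model generator in the spirit of ASSIST:
      # a state is a tuple (working_A, working_B); each rule is a triple of
      # callables (condition, destination, rate expression).
      N_A, N_B = 3, 2                      # invented component counts
      LAMBDA_A, LAMBDA_B = 1e-4, 5e-5      # invented failure rates

      rules = [
          (lambda s: s[0] > 0, lambda s: (s[0] - 1, s[1]), lambda s: s[0] * LAMBDA_A),
          (lambda s: s[1] > 0, lambda s: (s[0], s[1] - 1), lambda s: s[1] * LAMBDA_B),
      ]

      states = list(itertools.product(range(N_A + 1), range(N_B + 1)))
      transitions = [(s, dest(s), rate(s))
                     for s in states
                     for cond, dest, rate in rules if cond(s)]

      print(len(states), "states,", len(transitions), "transitions")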

  14. ASSIST - THE ABSTRACT SEMI-MARKOV SPECIFICATION INTERFACE TO THE SURE TOOL PROGRAM (VAX VMS VERSION)

    NASA Technical Reports Server (NTRS)

    Johnson, S. C.

    1994-01-01

    ASSIST, the Abstract Semi-Markov Specification Interface to the SURE Tool program, is an interface that will enable reliability engineers to accurately design large semi-Markov models. The user describes the failure behavior of a fault-tolerant computer system in an abstract, high-level language. The ASSIST program then automatically generates a corresponding semi-Markov model. The abstract language allows efficient description of large, complex systems; a one-page ASSIST-language description may result in a semi-Markov model with thousands of states and transitions. The ASSIST program also includes model-reduction techniques to facilitate efficient modeling of large systems. Instead of listing the individual states of the Markov model, reliability engineers can specify the rules governing the behavior of a system, and these are used to automatically generate the model. ASSIST reads an input file describing the failure behavior of a system in an abstract language and generates a Markov model in the format needed for input to SURE, the semi-Markov Unreliability Range Evaluator program, and PAWS/STEM, the Pade Approximation with Scaling program and Scaled Taylor Exponential Matrix. A Markov model consists of a number of system states and transitions between them. Each state in the model represents a possible state of the system in terms of which components have failed, which ones have been removed, etc. Within ASSIST, each state is defined by a state vector, where each element of the vector takes on an integer value within a defined range. An element can represent any meaningful characteristic, such as the number of working components of one type in the system, or the number of faulty components of another type in use. Statements representing transitions between states in the model have three parts: a condition expression, a destination expression, and a rate expression. The first expression is a Boolean expression describing the state space variable values of states for which the transition is valid. The second expression defines the destination state for the transition in terms of state space variable values. The third expression defines the distribution of elapsed time for the transition. The mathematical approach chosen to solve a reliability problem may vary with the size and nature of the problem. Although different solution techniques are utilized on different programs, it is possible to have a common input language. The Systems Validation Methods group at NASA Langley Research Center has created a set of programs that form the basis for a reliability analysis workstation. The set of programs are: SURE reliability analysis program (COSMIC program LAR-13789, LAR-14921); the ASSIST specification interface program (LAR-14193, LAR-14923), PAWS/STEM reliability analysis programs (LAR-14165, LAR-14920); and the FTC fault tree tool (LAR-14586, LAR-14922). FTC is used to calculate the top-event probability for a fault tree. PAWS/STEM and SURE are programs which interpret the same SURE language, but utilize different solution methods. ASSIST is a preprocessor that generates SURE language from a more abstract definition. SURE, ASSIST, and PAWS/STEM are also offered as a bundle. Please see the abstract for COS-10039/COS-10041, SARA - SURE/ASSIST Reliability Analysis Workstation, for pricing details. ASSIST was originally developed for DEC VAX series computers running VMS and was later ported for use on Sun computers running SunOS. 
The VMS version (LAR14193) is written in C-language and can be compiled with the VAX C compiler. The standard distribution medium for the VMS version of ASSIST is a 9-track 1600 BPI magnetic tape in VMSINSTAL format. It is also available on a TK50 tape cartridge in VMSINSTAL format. Executables are included. The Sun version (LAR14923) is written in ANSI C-language. An ANSI compliant C compiler is required in order to compile this package. The standard distribution medium for the Sun version of ASSIST is a .25 inch streaming magnetic tape cartridge in UNIX tar format. Both Sun3 and Sun4 executables are included. Electronic copies of the documentation in PostScript, TeX, and DVI formats are provided on the distribution medium. (The VMS distribution lacks the .DVI format files, however.) ASSIST was developed in 1986 and last updated in 1992. DEC, VAX, VMS, and TK50 are trademarks of Digital Equipment Corporation. SunOS, Sun3, and Sun4 are trademarks of Sun Microsystems, Inc. UNIX is a registered trademark of AT&T Bell Laboratories.

  15. On the Estimation of Errors in Sparse Bathymetric Geophysical Data Sets

    NASA Astrophysics Data System (ADS)

    Jakobsson, M.; Calder, B.; Mayer, L.; Armstrong, A.

    2001-05-01

    There is a growing demand in the geophysical community for better regional representations of the world ocean's bathymetry. However, given the vastness of the oceans and the relative limited coverage of even the most modern mapping systems, it is likely that many of the older data sets will remain part of our cumulative database for several more decades. Therefore, regional bathymetrical compilations that are based on a mixture of historic and contemporary data sets will have to remain the standard. This raises the problem of assembling bathymetric compilations and utilizing data sets not only with a heterogeneous cover but also with a wide range of accuracies. In combining these data to regularly spaced grids of bathymetric values, which the majority of numerical procedures in earth sciences require, we are often forced to use a complex interpolation scheme due to the sparseness and irregularity of the input data points. Consequently, we are faced with the difficult task of assessing the confidence that we can assign to the final grid product, a task that is not usually addressed in most bathymetric compilations. We approach the problem of assessing the confidence via a direct-simulation Monte Carlo method. We start with a small subset of data from the International Bathymetric Chart of the Arctic Ocean (IBCAO) grid model [Jakobsson et al., 2000]. This grid is compiled from a mixture of data sources ranging from single beam soundings with available metadata to spot soundings with no available metadata, to digitized contours; the test dataset shows examples of all of these types. From this database, we assign a priori error variances based on available meta-data, and when this is not available, based on a worst-case scenario in an essentially heuristic manner. We then generate a number of synthetic datasets by randomly perturbing the base data using normally distributed random variates, scaled according to the predicted error model. These datasets are then re-gridded using the same methodology as the original product, generating a set of plausible grid models of the regional bathymetry that we can use for standard error estimates. Finally, we repeat the entire random estimation process and analyze each run's standard error grids in order to examine sampling bias and variance in the predictions. The final products of the estimation are a collection of standard error grids, which we combine with the source data density in order to create a grid that contains information about the bathymetry model's reliability. Jakobsson, M., Cherkis, N., Woodward, J., Coakley, B., and Macnab, R., 2000, A new grid of Arctic bathymetry: A significant resource for scientists and mapmakers, EOS Transactions, American Geophysical Union, v. 81, no. 9, p. 89, 93, 96.
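
    The direct-simulation Monte Carlo procedure outlined above reduces to: perturb each sounding with noise scaled by its a priori error, re-grid each perturbed dataset with the same interpolator, and take per-cell statistics across the realizations. The Python sketch below illustrates that loop with a trivial cell-averaging gridder rather than the actual IBCAO interpolation scheme; all sizes and error values are invented.

      import numpy as np

      def grid_cells(x, y, z, nx, ny):
          """Trivial gridder: mean of soundings per cell (stand-in for the real interpolator)."""
          xi = np.clip((x * (nx - 1)).astype(int), 0, nx - 1)
          yi = np.clip((y * (ny - 1)).astype(int), 0, ny - 1)
          tot, cnt = np.zeros((ny, nx)), np.zeros((ny, nx))
          np.add.at(tot, (yi, xi), z)
          np.add.at(cnt, (yi, xi), 1)
          return np.where(cnt > 0, tot / np.maximum(cnt, 1), np.nan)

      rng = np.random.default_rng(3)
      x, y = rng.random(2000), rng.random(2000)            # normalized positions
      depth = 1000 + 200 * np.sin(4 * x) * np.cos(4 * y)   # synthetic bathymetry
      sigma = rng.choice([5.0, 50.0], size=depth.size)     # a priori error per sounding

      realizations = np.array([
          grid_cells(x, y, depth + sigma * rng.standard_normal(depth.size), 16, 16)
          for _ in range(100)
      ])
      std_error_grid = np.nanstd(realizations, axis=0)     # per-cell uncertainty estimate
      print(np.nanmean(std_error_grid))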

  16. Compiler-assisted static checkpoint insertion

    NASA Technical Reports Server (NTRS)

    Long, Junsheng; Fuchs, W. K.; Abraham, Jacob A.

    1992-01-01

    This paper describes a compiler-assisted approach for static checkpoint insertion. Instead of fixing the checkpoint location before program execution, a compiler-enhanced polling mechanism is utilized to maintain both the desired checkpoint intervals and reproducible checkpoint locations. The technique has been implemented in a GNU CC compiler for Sun 3 and Sun 4 (Sparc) processors. Experiments demonstrate that the approach provides stable checkpoint intervals and reproducible checkpoint placements with performance overhead comparable to a previously presented compiler-assisted dynamic scheme (CATCH) utilizing the system clock.
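
    The compiler-enhanced polling idea, insert checks at compiler-chosen points and take a checkpoint only when the desired interval has elapsed, can be illustrated as follows. This is a hand-written Python sketch, not the GNU CC instrumentation described in the paper; the interval, file name, and saved state are invented.

      import pickle
      import time

      CHECKPOINT_INTERVAL = 2.0          # seconds between checkpoints (invented)
      _last_checkpoint = time.monotonic()

      def poll_checkpoint(state, path="checkpoint.pkl"):
          """Polling point (the kind a compiler would insert at loop back-edges):
          save `state` if the checkpoint interval has elapsed, otherwise do nothing."""
          global _last_checkpoint
          now = time.monotonic()
          if now - _last_checkpoint >= CHECKPOINT_INTERVAL:
              with open(path, "wb") as f:
                  pickle.dump(state, f)
              _last_checkpoint = now

      # Long-running loop with a polling point on every iteration (the 'back edge').
      total = 0
      for i in range(1_000_000):
          total += i
          poll_checkpoint({"i": i, "total": total})
      print(total)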

  17. Ada (Tradename) Compiler Validation Summary Report. Harris Corporation. Harris Ada Compiler, Version 1.0. Harris HCX-7.

    DTIC Science & Technology

    1986-06-12

    Ada Compiler Validation Summary Report, 12 JUN 1986 to 12 JUN 1987: Harris Corporation, Harris Ada Compiler, Version 1.0, Harris HCX-7. Wright-Patterson.

  18. Ada Compiler Validation Summary Report: Harris Corporation, Harris Ada Compiler, Version 1.0, Harris H1200 Host. Tektronix 8540A-1750A Target.

    DTIC Science & Technology

    1987-06-03

    Ada Compiler Validation Summary Report, 3 June 1987 to 3 June 1988: Harris Corporation, Harris Ada Compiler, Version 1.0, Harris H1200 Host; Tektronix 8540A-1750A Target. Ada Joint Program Office, Arlington, VA.

  19. Genome-wide prediction and analysis of human tissue-selective genes using microarray expression data

    PubMed Central

    2013-01-01

    Background Understanding how genes are expressed specifically in particular tissues is a fundamental question in developmental biology. Many tissue-specific genes are involved in the pathogenesis of complex human diseases. However, experimental identification of tissue-specific genes is time consuming and difficult. Accurate predictions of tissue-specific gene targets could provide useful information for biomarker development and drug target identification. Results In this study, we have developed a machine learning approach for predicting human tissue-specific genes using microarray expression data. The lists of known tissue-specific genes for different tissues were collected from the UniProt database, and the expression data retrieved from a previously compiled dataset according to these lists were used for input vector encoding. Random Forests (RFs) and Support Vector Machines (SVMs) were used to construct accurate classifiers. The RF classifiers were found to outperform the SVM models for tissue-specific gene prediction. The results suggest that the candidate genes for brain- or liver-specific expression can provide valuable information for further experimental studies. Our approach was also applied to identifying tissue-selective gene targets for different types of tissues. Conclusions A machine learning approach has been developed for accurately identifying candidate genes for tissue-specific/selective expression. The approach provides an efficient way to select interesting genes for developing new biomedical markers and improve our knowledge of tissue-specific expression. PMID:23369200
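
    As an illustration of the classification setup described above (expression-profile feature vectors fed to a Random Forest), here is a minimal scikit-learn sketch on synthetic data; the data, labels, and hyperparameters are placeholders rather than the study's own.

      import numpy as np
      from sklearn.ensemble import RandomForestClassifier
      from sklearn.model_selection import cross_val_score

      rng = np.random.default_rng(4)
      n_genes, n_tissues = 500, 30                    # invented sizes
      X = rng.standard_normal((n_genes, n_tissues))   # one expression profile per gene
      y = rng.integers(0, 2, n_genes)                 # 1 = tissue-specific, 0 = not (toy labels)
      X[y == 1, :5] += 1.0                            # make the toy positives weakly separable

      clf = RandomForestClassifier(n_estimators=200, random_state=0)
      print("5-fold CV accuracy:", cross_val_score(clf, X, y, cv=5).mean())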

  20. Endothelium-dependent control of cerebrovascular functions through age: exercise for healthy cerebrovascular aging.

    PubMed

    Bolduc, Virginie; Thorin-Trescases, Nathalie; Thorin, Eric

    2013-09-01

    Cognitive performances are tightly associated with the maximal aerobic exercise capacity, both of which decline with age. The benefits on mental health of regular exercise, which slows the age-dependent decline in maximal aerobic exercise capacity, have been established for centuries. In addition, the maintenance of an optimal cerebrovascular endothelial function through regular exercise, part of a healthy lifestyle, emerges as one of the key and primary elements of successful brain aging. Physical exercise requires the activation of specific brain areas that trigger a local increase in cerebral blood flow to match neuronal metabolic needs. In this review, we propose three ways by which exercise could maintain the cerebrovascular endothelial function, a premise to a healthy cerebrovascular function and an optimal regulation of cerebral blood flow. First, exercise increases blood flow locally and increases shear stress temporarily, a known stimulus for endothelial cell maintenance of Akt-dependent expression of endothelial nitric oxide synthase, nitric oxide generation, and the expression of antioxidant defenses. Second, the rise in circulating catecholamines during exercise not only facilitates adequate blood and nutrient delivery by stimulating heart function and mobilizing energy supplies but also enhances endothelial repair mechanisms and angiogenesis. Third, in the long term, regular exercise sustains a low resting heart rate that reduces the mechanical stress imposed to the endothelium of cerebral arteries by the cardiac cycle. Any chronic variation from a healthy environment will perturb metabolism and thus hasten endothelial damage, favoring hypoperfusion and neuronal stress.

  1. Analytic continuation of quantum Monte Carlo data by stochastic analytical inference.

    PubMed

    Fuchs, Sebastian; Pruschke, Thomas; Jarrell, Mark

    2010-05-01

    We present an algorithm for the analytic continuation of imaginary-time quantum Monte Carlo data which is strictly based on principles of Bayesian statistical inference. Within this framework we are able to obtain an explicit expression for the calculation of a weighted average over possible energy spectra, which can be evaluated by standard Monte Carlo simulations, yielding as a by-product the distribution function as a function of the regularization parameter. Our algorithm thus avoids the usual ad hoc assumptions introduced in similar algorithms to fix the regularization parameter. We apply the algorithm to imaginary-time quantum Monte Carlo data and compare the resulting energy spectra with those from a standard maximum-entropy calculation.

  2. On the regularization of impact without collision: the Painlevé paradox and compliance

    NASA Astrophysics Data System (ADS)

    Hogan, S. J.; Kristiansen, K. Uldall

    2017-06-01

    We consider the problem of a rigid body, subject to a unilateral constraint, in the presence of Coulomb friction. We regularize the problem by assuming compliance (with both stiffness and damping) at the point of contact, for a general class of normal reaction forces. Using a rigorous mathematical approach, we recover impact without collision (IWC) in both the inconsistent and the indeterminate Painlevé paradoxes, in the latter case giving an exact formula for conditions that separate IWC and lift-off. We solve the problem for arbitrary values of the compliance damping and give explicit asymptotic expressions in the limiting cases of small and large damping, all for a large class of rigid bodies.

  3. The not face: A grammaticalization of facial expressions of emotion.

    PubMed

    Benitez-Quiroz, C Fabian; Wilbur, Ronnie B; Martinez, Aleix M

    2016-05-01

    Facial expressions of emotion are thought to have evolved from the development of facial muscles used in sensory regulation and later adapted to express moral judgment. Negative moral judgment includes the expressions of anger, disgust and contempt. Here, we study the hypothesis that these facial expressions of negative moral judgment have further evolved into a facial expression of negation regularly used as a grammatical marker in human language. Specifically, we show that people from different cultures expressing negation use the same facial muscles as those employed to express negative moral judgment. We then show that this nonverbal signal is used as a co-articulator in speech and that, in American Sign Language, it has been grammaticalized as a non-manual marker. Furthermore, this facial expression of negation exhibits the theta oscillation (3-8 Hz) universally seen in syllable and mouthing production in speech and signing. These results provide evidence for the hypothesis that some components of human language have evolved from facial expressions of emotion, and suggest an evolutionary route for the emergence of grammatical markers. Copyright © 2016 Elsevier B.V. All rights reserved.

  4. The Not Face: A grammaticalization of facial expressions of emotion

    PubMed Central

    Benitez-Quiroz, C. Fabian; Wilbur, Ronnie B.; Martinez, Aleix M.

    2016-01-01

    Facial expressions of emotion are thought to have evolved from the development of facial muscles used in sensory regulation and later adapted to express moral judgment. Negative moral judgment includes the expressions of anger, disgust and contempt. Here, we study the hypothesis that these facial expressions of negative moral judgment have further evolved into a facial expression of negation regularly used as a grammatical marker in human language. Specifically, we show that people from different cultures expressing negation use the same facial muscles as those employed to express negative moral judgment. We then show that this nonverbal signal is used as a co-articulator in speech and that, in American Sign Language, it has been grammaticalized as a non-manual marker. Furthermore, this facial expression of negation exhibits the theta oscillation (3–8 Hz) universally seen in syllable and mouthing production in speech and signing. These results provide evidence for the hypothesis that some components of human language have evolved from facial expressions of emotion, and suggest an evolutionary route for the emergence of grammatical markers. PMID:26872248

  5. [A-bomb experience and Hibakushas' lives].

    PubMed

    Akiba, Tadatoshi

    2012-01-01

    The A-bomb experience of Hiroshima may shed light on the reconstruction plan of the Eastern Japan Earthquake and Tsunami and on implementing middle to long range care plans for the victims of the catastrophe. An important element in the success of Hiroshima's reconstruction was the understanding of the realities of everyday life of citizens and hibakusha by local and national government, and incorporation of those points of view into the reconstruction plan. Sharing of accurate and fair information about the disaster, restoration, and reconstruction with citizens was and still is a prerequisite for success. To convey learned lessons from the Hiroshima experience, three books are helpful: "A-bomb Mayor" by Shinzo Hamai, "The Meaning of Survival" compiled by the Chugoku Shimbun and "The Children of the A-bomb" compiled by Arata Osada. They help understand the history of hibakusha psychology from the point of view of their everyday lives and may help those affected by the Earthquake and Tsunami. To summarize the history of psychological changes among the hibakusha, three key transitional pairs of statements used widely by them over the span of 66 years help show the change in their attitude and emotional outlook. Each pair consists of an expression from the period immediately following the bombing and a second more recent expression: (1) Transition from "I would rather die." to "I am glad I am alive." (2) Transition from "I would rather forget." to "We should not forget." (3) Transition from "You will understand if you are a victim." to "No one else should ever suffer as we did".

  6. Space Science in the Kindergarten Classroom and Beyond

    NASA Astrophysics Data System (ADS)

    Bonett, D.

    2000-12-01

    With the advent of probes to our closest planet Mars and the multi-national construction of Earth's first International Space Station, it is not presumptive to introduce 5-year-old school children to the space sciences. K. E. Little Elementary School is located in the community of Bacliff, Texas. It is the largest elementary school (950 students) in the Dickinson Independent School District. K. E. Little is a Title 1 school with a multi-ethnic student population. Its close proximity to the Johnson Space Center and the Lunar and Planetary Institute provides ample instructional support and material. Last fall, two kindergarten classes received space science instruction. Both classes had 19 students, with one class predominantly children of Vietnamese immigrants. Our goal was to create curiosity and awareness through a year-long integrated space science program of instruction. Accurate information about the space sciences was conveyed through sources such as books and videos, as well as conventional song, movement, and artistic expression. Videotaping and photographs replaced traditional anecdotal records. Samples of student work were compiled for classroom and school display. This year, two fifth grade classes will receive space science instruction using the Jason Project XII curriculum. Students will engage in a year-long exploration of the Hawaiian Islands. Information will be conveyed via the internet and live video presentations as well as traditional sources such as books and videos, along with song, movement, and artistic expression. Comparison of volcanic activity in Hawaii to volcanoes on other planets will be one of several interplanetary correlations. Samples of student work will be compiled for classroom, school, and community display.

  7. The development of a multi-target compiler-writing system for flight software development

    NASA Technical Reports Server (NTRS)

    Feyock, S.; Donegan, M. K.

    1977-01-01

    A wide variety of systems designed to assist the user in the task of writing compilers has been developed. A survey of these systems reveals that none is entirely appropriate to the purposes of the MUST project, which involves the compilation of one or at most a small set of higher-order languages to a wide variety of target machines offering little or no software support. This requirement dictates that any compiler writing system employed must provide maximal support in the areas of semantics specification and code generation, the areas in which existing compiler writing systems as well as theoretical underpinnings are weakest. This paper describes an ongoing research and development effort to create a compiler writing system which will overcome these difficulties, thus providing a software system which makes possible the fast, trouble-free creation of reliable compilers for a wide variety of target computers.

  8. Anti-diabetic activity of chromium picolinate and biotin in rats with type 2 diabetes induced by high-fat diet and streptozotocin.

    PubMed

    Sahin, Kazim; Tuzcu, Mehmet; Orhan, Cemal; Sahin, Nurhan; Kucuk, Osman; Ozercan, Ibrahim H; Juturu, Vijaya; Komorowski, James R

    2013-07-28

    The objective of the present study was to evaluate anti-diabetic effects of chromium picolinate (CrPic) and biotin supplementations in type 2 diabetic rats. The type 2 diabetic rat model was induced by high-fat diet (HFD) and low-dose streptozotocin. The rats were divided into five groups as follows: (1) non-diabetic rats fed a regular diet; (2) diabetic rats fed a HFD; (3) diabetic rats fed a HFD and supplemented with CrPic (80 μg/kg body weight (BW) per d); (4) diabetic rats fed a HFD and supplemented with biotin (300 μg/kg BW per d); (5) diabetic rats fed a HFD and supplemented with both CrPic and biotin. Circulating glucose, cortisol, total cholesterol, TAG, NEFA and malondialdehyde concentrations decreased (P< 0·05), but serum insulin concentrations increased (P< 0·05) in diabetic rats treated with biotin and CrPic, particularly with a combination of the supplements. Feeding a HFD to diabetic rats decreased PPAR-γ expression in adipose tissue and phosphorylated insulin receptor substrate 1 (p-IRS-1) expression of liver, kidney and muscle tissues, while the supplements increased (P< 0·001) PPAR-γ and p-IRS-1 expressions in relevant tissues. Expression of NF-κB in the liver and kidney was greater in diabetic rats fed a HFD, as compared with rats fed a regular diet (P< 0·01). The supplements decreased the expression of NF-κB in diabetic rats (P< 0·05). Results of the present study revealed that supplementing CrPic and biotin alone or in a combination exerts anti-diabetic activities, probably through modulation of PPAR-γ, IRS-1 and NF-κB proteins.

  9. Genetic toxicology and toxicogenomic analysis of three cigarette smoke condensates in vitro reveals few differences among full-flavor, blonde, and light products

    PubMed Central

    Yauk, Carole L; Williams, Andrew; Buick, Julie K; Chen, Guosheng; Maertens, Rebecca M; Halappanavar, Sabina; White, Paul A

    2012-01-01

    Cigarette smoking leads to various detrimental health outcomes. Tobacco companies produce different brands of cigarettes that are marketed as reduced harm tobacco products. Early examples included “light” cigarettes, which differ from regular cigarettes due to filter ventilation and/or differences in chemical constituents. In order to establish baseline similarities and differences among different tobacco brands available in Canada, the present study examined the cytotoxicity, mutagenicity, clastogenicity, and gene expression profiles of cigarette smoke condensate (CSC) from three tobacco products, encompassing a full-flavor, blonde, and “light” variety. Using the Salmonella mutagenicity assay, we confirmed that the three CSCs are mutagenic, and that the potency is related to the presence of aromatic amines. Using the Muta™Mouse FE1 cell line we determined that the CSCs were clastogenic and cytotoxic, but nonmutagenic, and the results showed few differences in potencies among the three brands. There were no clear brand-specific changes in gene expression; each brand yielded highly similar expression profiles within a time point and concentration. The molecular pathways and biological functions affected by exposure included xenobiotic metabolism, oxidative stress, DNA damage response, cell cycle arrest and apoptosis, as well as inflammation. Thus, there was no appreciable difference in toxicity or gene expression profiles between regular brands and products marketed as “light,” and hence no evidence of reduced harm. The work establishes baseline CSC cytotoxicity, mutagenicity, and expression profiles that can be used as a point of reference for comparison with data generated for products marketed as reduced harm and/or modified risk tobacco products. Mol. Mutagen. 2012. © 2012 Wiley Periodicals, Inc.† PMID:22431010

  10. 32 CFR 806b.19 - Information compiled in anticipation of civil action.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 32 National Defense 6 2010-07-01 2010-07-01 false Information compiled in anticipation of civil action. 806b.19 Section 806b.19 National Defense Department of Defense (Continued) DEPARTMENT OF THE AIR... compiled in anticipation of civil action. Withhold records compiled in connection with a civil action or...

  11. 12 CFR 411.600 - Semi-annual compilation.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 12 Banks and Banking 4 2010-01-01 2010-01-01 false Semi-annual compilation. 411.600 Section 411.600 Banks and Banking EXPORT-IMPORT BANK OF THE UNITED STATES NEW RESTRICTIONS ON LOBBYING Agency Reports § 411.600 Semi-annual compilation. (a) The head of each agency shall collect and compile the...

  12. 49 CFR 801.57 - Records compiled for law enforcement purposes.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 49 Transportation 7 2010-10-01 2010-10-01 false Records compiled for law enforcement purposes. 801... compiled for law enforcement purposes. Pursuant to 5 U.S.C. 552(b)(7), any records compiled for law or..., would disclose investigative procedures and practices, or would endanger the life or security of law...

  13. 26 CFR 301.7515-1 - Special statistical studies and compilations on request.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... 26 Internal Revenue 18 2011-04-01 2011-04-01 false Special statistical studies and compilations on... Actions by the United States § 301.7515-1 Special statistical studies and compilations on request. The... of the cost of the work to be performed, to make special statistical studies and compilations...

  14. 26 CFR 301.7515-1 - Special statistical studies and compilations on request.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 26 Internal Revenue 18 2010-04-01 2010-04-01 false Special statistical studies and compilations on... Actions by the United States § 301.7515-1 Special statistical studies and compilations on request. The... of the cost of the work to be performed, to make special statistical studies and compilations...

  15. Kokkos GPU Compiler

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Moss, Nicholas

    The Kokkos Clang compiler is a version of the Clang C++ compiler that has been modified to perform targeted code generation for Kokkos constructs, such as parallel for and parallel reduce, with the goal of generating highly optimized code and of providing semantic (domain) awareness of these constructs throughout the compilation toolchain. This approach is taken to explore the possibilities of exposing the developer's intentions to the underlying compiler infrastructure (e.g., optimization and analysis passes within the middle stages of the compiler) instead of relying solely on the restricted capabilities of C++ template metaprogramming. To date, activities have focused on correct GPU code generation rather than on improving overall performance. The compiler is implemented by recognizing specific (syntactic) Kokkos constructs in order to bypass normal template expansion mechanisms and instead use the semantic knowledge of Kokkos to directly generate code in the compiler's intermediate representation (IR), which is then translated into an NVIDIA-centric GPU program and supporting runtime calls. In addition, capturing and maintaining the higher-level semantics of Kokkos directly within the lower levels of the compiler has the potential to significantly improve the ability of the compiler to communicate with the developer in terms of their original programming model and semantics.

  16. Regularized Filters for L1-Norm-Based Common Spatial Patterns.

    PubMed

    Wang, Haixian; Li, Xiaomeng

    2016-02-01

    The l1-norm-based common spatial patterns (CSP-L1) approach is a recently developed technique for optimizing spatial filters in the field of electroencephalogram (EEG)-based brain-computer interfaces. The l1-norm-based expression of dispersion in CSP-L1 alleviates the negative impact of outliers. In this paper, we further improve the robustness of CSP-L1 by taking into account noise which does not necessarily deviate as strongly as outliers do. The noise modelling is formulated using the waveform length of the EEG time course. With this noise model, we then regularize the objective function of CSP-L1, in which the l1 norm is used in two roles: one for the dispersion and the other for the waveform length. An iterative algorithm is designed to solve the optimization problem posed by the regularized objective function. A toy illustration and classification experiments on real EEG data sets show the effectiveness of the proposed method.
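
    In schematic form (our notation; the paper gives the exact weighting), the two l1-norm ingredients mentioned above are a dispersion ratio between the two classes and a waveform-length term on the projected EEG:

      \text{dispersion:}\quad
      J(w) = \frac{\sum_{i} \lvert w^{\top} x_{i}^{(+)} \rvert}{\sum_{j} \lvert w^{\top} x_{j}^{(-)} \rvert},
      \qquad
      \text{waveform length:}\quad
      \mathrm{WL}(w) = \sum_{t} \bigl\lvert w^{\top} x_{t+1} - w^{\top} x_{t} \bigr\rvert .

    The regularized CSP-L1 objective combines these two terms and is optimized iteratively over the spatial filter w.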

  17. One-loop calculations in Supersymmetric Lattice QCD

    NASA Astrophysics Data System (ADS)

    Costa, M.; Panagopoulos, H.

    2017-03-01

    We study the self-energies of all particles which appear in a lattice regularization of supersymmetric QCD (N = 1). We compute, perturbatively to one loop, the relevant two-point Green's functions using both the dimensional and the lattice regularizations. Our lattice formulation employs the Wilson fermion action for the gluino and quark fields. The gauge group that we consider is SU(Nc), while the number of colors, Nc, and the number of flavors, Nf, are kept as generic parameters. We have also searched for relations among the propagators which are computed from our one-loop results. We have obtained analytic expressions for the renormalization functions of the quark field (Zψ), gluon field (Zu), gluino field (Zλ) and squark field (ZA±). We present here results from dimensional regularization, relegating our remaining results, along with a more complete list of references, to a forthcoming publication [1]. Part of the lattice study also concerns the renormalization of quark bilinear operators which, unlike in the nonsupersymmetric case, exhibit a rich pattern of operator mixing at the quantum level.

  18. The Adler D-function for N = 1 SQCD regularized by higher covariant derivatives in the three-loop approximation

    NASA Astrophysics Data System (ADS)

    Kataev, A. L.; Kazantsev, A. E.; Stepanyantz, K. V.

    2018-01-01

    We calculate the Adler D-function for N = 1 SQCD in the three-loop approximation using the higher covariant derivative regularization and the NSVZ-like subtraction scheme. The recently formulated all-order relation between the Adler function and the anomalous dimension of the matter superfields defined in terms of the bare coupling constant is first considered and generalized to the case of an arbitrary representation for the chiral matter superfields. The correctness of this all-order relation is explicitly verified at the three-loop level. The special renormalization scheme in which this all-order relation remains valid for the D-function and the anomalous dimension defined in terms of the renormalized coupling constant is constructed in the case of using the higher derivative regularization. The analytic expression for the Adler function for N = 1 SQCD is found in this scheme to the order O(αs²). The problem of scheme-dependence of the D-function and the NSVZ-like equation is briefly discussed.

  19. Who Learns More? Cultural Differences in Implicit Sequence Learning

    PubMed Central

    Fu, Qiufang; Dienes, Zoltan; Shang, Junchen; Fu, Xiaolan

    2013-01-01

    Background It is well documented that East Asians differ from Westerners in conscious perception and attention. However, few studies have explored cultural differences in unconscious processes such as implicit learning. Methodology/Principal Findings The global-local Navon letters were adopted in the serial reaction time (SRT) task, during which Chinese and British participants were instructed to respond to global or local letters, to investigate whether culture influences what people acquire in implicit sequence learning. Our results showed that from the beginning British expressed a greater local bias in perception than Chinese, confirming a cultural difference in perception. Further, over extended exposure, the Chinese learned the target regularity better than the British when the targets were global, indicating a global advantage for Chinese in implicit learning. Moreover, Chinese participants acquired greater unconscious knowledge of an irrelevant regularity than British participants, indicating that the Chinese were more sensitive to contextual regularities than the British. Conclusions/Significance The results suggest that cultural biases can profoundly influence both what people consciously perceive and unconsciously learn. PMID:23940773

  20. Effects of aspirin on small-cell lung cancer mortality and metastatic presentation.

    PubMed

    Maddison, Paul

    2017-04-01

    Although meta-analysis data have shown that taking regular aspirin may reduce lung cancer mortality, individual trial results are conflicting, and data on the effects of aspirin on different histological subtypes of lung tumours, in particular small-cell lung cancer, are sparse. We conducted a prospective observational study of 313 patients with a new diagnosis of small-cell lung cancer and recorded use of aspirin before and after tumour diagnosis. Seventy-one (23%) patients had been taking regular daily aspirin for more than 2 years at the time of tumour diagnosis. We found that regular use of aspirin had no effect on either survival or metastatic presentation compared with small-cell lung cancer patients not taking aspirin. The lack of survival benefit in patients with small-cell lung cancer taking long-term aspirin may be due to the low expression of cyclooxygenase-2 in small-cell lung cancer tissue. Copyright © 2017 Elsevier B.V. All rights reserved.

  1. MEPD: a Medaka gene expression pattern database

    PubMed Central

    Henrich, Thorsten; Ramialison, Mirana; Quiring, Rebecca; Wittbrodt, Beate; Furutani-Seiki, Makoto; Wittbrodt, Joachim; Kondoh, Hisato

    2003-01-01

    The Medaka Expression Pattern Database (MEPD) stores and integrates information on gene expression during embryonic development of the small freshwater fish Medaka (Oryzias latipes). Expression patterns of genes identified by ESTs are documented by images and by descriptions through parameters such as staining intensity, category and comments, and through a comprehensive, hierarchically organized dictionary of anatomical terms. Sequences of the ESTs are available and searchable through BLAST. ESTs in the database are clustered upon entry and have been blasted against public databases. The BLAST results are updated regularly, stored within the database and searchable. The MEPD is a project within the Medaka Genome Initiative (MGI), and entries will be interconnected to integrated genomic map databases. MEPD is accessible through the WWW at http://medaka.dsp.jst.go.jp/MEPD. PMID:12519950

  2. Widespread Enhancer Activity from Core Promoters.

    PubMed

    Medina-Rivera, Alejandra; Santiago-Algarra, David; Puthier, Denis; Spicuglia, Salvatore

    2018-06-01

    Gene expression in higher eukaryotes is precisely regulated in time and space through the interplay between promoters and gene-distal regulatory regions, known as enhancers. The original definition of enhancers implies the ability to activate gene expression remotely, while promoters entail the capability to locally induce gene expression. Despite the conventional distinction between them, promoters and enhancers share many genomic and epigenomic features. One intriguing finding in the gene regulation field comes from the observation that many core promoter regions display enhancer activity. Recent high-throughput reporter assays along with clustered regularly interspaced short palindromic repeats (CRISPR)/Cas9-related approaches have indicated that this phenomenon is common and might have a strong impact on our global understanding of genome organisation and gene expression regulation. Copyright © 2018 Elsevier Ltd. All rights reserved.

  3. OSCAR API for Real-Time Low-Power Multicores and Its Performance on Multicores and SMP Servers

    NASA Astrophysics Data System (ADS)

    Kimura, Keiji; Mase, Masayoshi; Mikami, Hiroki; Miyamoto, Takamichi; Shirako, Jun; Kasahara, Hironori

    OSCAR (Optimally Scheduled Advanced Multiprocessor) API has been designed for real-time embedded low-power multicores to generate parallel programs for various multicores from different vendors by using the OSCAR parallelizing compiler. The OSCAR API has been developed by Waseda University in collaboration with Fujitsu Laboratory, Hitachi, NEC, Panasonic, Renesas Technology, and Toshiba in an METI/NEDO project entitled "Multicore Technology for Realtime Consumer Electronics." By using the OSCAR API as an interface between the OSCAR compiler and backend compilers, the OSCAR compiler enables hierarchical multigrain parallel processing with memory optimization under capacity restriction for cache memory, local memory, distributed shared memory, and on-chip/off-chip shared memory; data transfer using a DMA controller; and power reduction control using DVFS (Dynamic Voltage and Frequency Scaling), clock gating, and power gating for various embedded multicores. In addition, a parallelized program automatically generated by the OSCAR compiler with OSCAR API can be compiled by the ordinary OpenMP compilers since the OSCAR API is designed on a subset of the OpenMP. This paper describes the OSCAR API and its compatibility with the OSCAR compiler by showing code examples. Performance evaluations of the OSCAR compiler and the OSCAR API are carried out using an IBM Power5+ workstation, an IBM Power6 high-end SMP server, and a newly developed consumer electronics multicore chip RP2 by Renesas, Hitachi and Waseda. From the results of scalability evaluation, it is found that on an average, the OSCAR compiler with the OSCAR API can exploit 5.8 times speedup over the sequential execution on the Power5+ workstation with eight cores and 2.9 times speedup on RP2 with four cores, respectively. In addition, the OSCAR compiler can accelerate an IBM XL Fortran compiler up to 3.3 times on the Power6 SMP server. Due to low-power optimization on RP2, the OSCAR compiler with the OSCAR API achieves a maximum power reduction of 84% in the real-time execution mode.

  4. How Spherical Are the Archimedean Solids and Their Duals?

    ERIC Educational Resources Information Center

    Aravind, P. K.

    2011-01-01

    The Isoperimetric Quotient, or IQ, introduced by G. Polya, characterizes the degree of sphericity of a convex solid. This paper obtains closed form expressions for the surface area and volume of any Archimedean polyhedron in terms of the integers specifying the type and number of regular polygons occurring around each vertex. Similar results are…

  5. Changing Preschoolers' Attitudes toward Animals: A Zoo Program and an Evaluation.

    ERIC Educational Resources Information Center

    Reames, Judi; Rajecki, D. W.

    A zoo outreach program for preschoolers was evaluated by assessing the reactions of the children themselves. Children's attitudes toward certain animals were measured before and after live exposure to those animals in regular preschool settings. The attitudinal measure was a nonverbal expression of affect as elicited by pictures. Additionally,…

  6. Creating Success! A Program for Behaviorally and Academically At-Risk Children.

    ERIC Educational Resources Information Center

    Akin, Terri; And Others

    This document presents a program designed especially for behaviorally and academically at-risk children in kindergarten through sixth grade. It includes a collection of experiential activities that provides ways to infuse the elements of success into the regular classroom curriculum. Eight developmental areas are targeted: (1) expressing feelings;…

  7. A Study on the Affordances and Constraints of the Instructional Use of Project Read

    ERIC Educational Resources Information Center

    Cosgrove, Margaret

    2017-01-01

    This research study was designed and conducted to evaluate the regular education teacher's perspective on the effective use of Project Read to adequately instruct spelling mastery at the first grade level. The Project Read curriculum is divided into three major strands including phonics, reading comprehension, and written expression. Project…

  8. Sustainable Participation in Regular Exercise amongst Older People: Developing an Action Research Approach

    ERIC Educational Resources Information Center

    Davies, Jeanne; Lester, Carolyn; O'Neill, Martin; Williams, Gareth

    2008-01-01

    Objective: This article describes the Triangle Project's work with a post industrial community, where healthy living activities were developed in response to community members' expressed needs. Method: An action research partnership approach was taken to reduce health inequalities, with local people developing their own activities to address…

  9. Multiple Aspects of the Southern California Wildfires as Seen by NASA's AVIRIS

    NASA Image and Video Library

    2017-12-15

    NASA's Airborne Visible Infrared Imaging Spectrometer instrument (AVIRIS), flying aboard a NASA Armstrong Flight Research Center high-altitude ER-2 aircraft, observed wildfires burning in Southern California on Dec. 5-7, 2017. AVIRIS is an imaging spectrometer that observes light in visible and infrared wavelengths, measuring the full spectrum of radiated energy. Unlike regular cameras with three colors, AVIRIS has 224 spectral channels, measuring contiguously from the visible through the shortwave infrared. Data from these flights, compared against measurements acquired earlier in the year, show many ways this one instrument can improve both our understanding of fire risk and the response to fires in progress. The top row in this image compilation shows pre-fire data acquired from June 2017. At top left is a visible-wavelength image similar to what our own eyes would see. The top middle image is a map of surface composition based on analyzing the full electromagnetic spectrum, revealing green vegetated areas and non-photosynthetic vegetation that is potential fuel as well as non-vegetated surfaces that may slow an advancing fire. The image at top right is a remote measurement of the water in tree canopies, a proxy for how much moisture is in the vegetation. The bottom row in the compilation shows data acquired from the Thomas fire in progress in December 2017. At bottom left is a visible wavelength image. The bottom middle image is an infrared image, with red at 2,250 nanometers showing fire energy, green at 1,650 nanometers showing the surface through the smoke, and blue at 1,000 nanometers showing the smoke itself. The image at bottom right is a fire temperature map using spectroscopic analysis to measure fire thermal emission recorded in the AVIRIS spectra. https://photojournal.jpl.nasa.gov/catalog/PIA22194

  10. Composite regional catalogs of earthquakes in the former Soviet Union

    USGS Publications Warehouse

    Rautian, Tatyana; Leith, William

    2002-01-01

    Seismological study of the territory of the former Soviet Union developed in the 20th century with the approach of maintaining constant observations with standard instrumentation and methods of data processing, determining standardized parameters describing the seismic sources, and producing regular summary publications. For most of the century, event data were published only in Russian and were generally unavailable to the Western scientific community. Yet for many regions of this vast territory, earthquakes with magnitudes less than 2 were routinely located and characterized, especially since the early 1960s. A great volume of data on the seismicity of the Eurasian land mass is therefore available, although to date only in scattered publications and for incomplete periods of time. To address this problem, we have undertaken a comprehensive compilation, documentation and evaluation of catalogs of seismicity of the former Soviet Union. These include four principal, Soviet-published catalog sources, supplemented by other publications. We view this as the first step in compiling a complete catalog of all known seismic events in this large and important region. Completion of this work will require digitizing the remaining catalogs of the various regional seismological institutes. To make these data more useful for regional seismic investigations, as well as to be consistent with their provenance, we have prepared composite regional catalogs, dividing the territory of the former Soviet Union into 24 regions. For each of these regions, all the data available from the basic catalog sources (see below) have been combined and evaluated. Note that, for regions with low seismicity, the historical (non-instrumental, macro-seismic) data are of increased importance. Such information, if not included in any summary, was taken from various publications and marked as "historical".

  11. Geosocial process and its regularities

    NASA Astrophysics Data System (ADS)

    Vikulina, Marina; Vikulin, Alexander; Dolgaya, Anna

    2015-04-01

    Natural disasters and social events (wars, revolutions, genocides, epidemics, fires, etc.) have accompanied each other throughout human civilization, reflecting a close relationship between phenomena that are seemingly of different natures. In order to study this relationship, the authors compiled and analyzed a list of 2,400 natural disasters and social phenomena, weighted by magnitude, that occurred during the last thirty-six centuries of our history. Statistical analysis was performed separately for each aggregate (natural disasters and social phenomena) and for particular statistically representative types of events; there were 5 + 5 = 10 such types. It is shown that the numbers of events in the list follow a logarithmic law: the bigger the event, the less likely it is to happen. For each type of event and each aggregate, periodicities with periods of 280 ± 60 years were established. Statistical analysis of the time intervals between adjacent events for both aggregates showed good agreement with the Weibull-Gnedenko distribution with a shape parameter less than 1, which is equivalent to concluding that events group together over small time intervals. Modeling the statistics of time intervals with a Pareto distribution made it possible to identify an emergent property of all events in the aggregate. This result allowed the authors to conclude that natural disasters and social phenomena interact. The compiled list of events, and the properties of cyclicity, grouping and interaction it reveals, form the basis for modeling an essentially unified geosocial process at a reasonably high statistical level. Evidence of interaction between 'lifeless' Nature and Society is fundamental and provides a new approach to forecasting demographic crises that takes both natural disasters and social phenomena into account.
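
    For readers who want to reproduce the interval analysis in outline, a sketch with SciPy follows; the synthetic intervals stand in for the authors' 2,400-event list, and a fitted shape parameter below 1 is the signature of grouping described above.

      # Sketch: fit a Weibull distribution to inter-event time intervals.
      # The intervals are synthetic placeholders, not the authors' catalogue.
      import numpy as np
      from scipy import stats

      rng = np.random.default_rng(1)
      intervals = rng.weibull(0.7, size=2000) * 5.0   # clustered-looking synthetic data

      shape, loc, scale = stats.weibull_min.fit(intervals, floc=0)
      print(f"fitted Weibull shape = {shape:.2f} (values < 1 indicate grouping in time)")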

  12. Application of the CCT system and its effects on the works of compilations and publications.

    NASA Astrophysics Data System (ADS)

    Shu, Sizhu

    The present state of compilation and composition work with microcomputers at Shanghai Observatory is introduced, and the applications of the CCT system to compilation and composition are presented. The effects of microcomputer composition on compilation and publication work in recent years are discussed.

  13. Applying knowledge compilation techniques to model-based reasoning

    NASA Technical Reports Server (NTRS)

    Keller, Richard M.

    1991-01-01

    Researchers in the area of knowledge compilation are developing general purpose techniques for improving the efficiency of knowledge-based systems. In this article, an attempt is made to define knowledge compilation, to characterize several classes of knowledge compilation techniques, and to illustrate how some of these techniques can be applied to improve the performance of model-based reasoning systems.

  14. Ada Compiler Validation Summary Report: Certificate Number: 890420W1. 10066 International Business Machines Corporation, IBM Development System for the Ada Language, AIX/RT Ada Compiler, Version 1.1.1, IBM RT PC 6150-125

    DTIC Science & Technology

    1989-04-20

    International Business Machines Corporation, IBM Development System for the Ada Language, AIX/RT Ada Compiler, Version 1.1.1, Wright-Patterson AFB...Certificate Number: 890420V1.10066 International Business Machines Corporation IBM Development System for the Ada Language AIX/RT Ada Compiler, Version 1.1.1...TEST INFORMATION The compiler was tested using command scripts provided by International Business Machines Corporation and reviewed by the validation

  15. Metabolic markers in Ossabaw pigs fed high fat diets enriched in regular or low α-linolenic acid soy oil

    PubMed Central

    2013-01-01

    Background Soy oil is a major vegetable oil consumed in the US. A recently developed soybean variety produces oil with a lower concentration of α-linolenic acid, hence a higher (n-6)/(n-3) ratio, than regular soy oil. The study was conducted to determine the metabolic impact of the low α-linolenic acid containing soy oil. Methods Ossabaw pigs were fed diets supplemented with either 13% regular soybean oil (SBO), or 13% of the low α-linolenic soybean oil (LLO), or a control diet (CON) without extra oil supplementation, for 8 weeks. Results Serum and adipose tissue α-linolenic acid concentration was higher in pigs fed the SBO diet than in those on the CON and LLO diets. In the serum, the concentration of saturated fatty acids (SFA) was lower in the LLO group than in the CON and SBO groups, while the polyunsaturated fatty acid (PUFA) concentration was higher in the LLO group than in the CON and SBO groups. Glucose, insulin, triglycerides and LDL-cholesterol were higher in pigs fed the SBO diet than in those fed the CON and LLO diets. HDL-cholesterol was lower in pigs on the SBO diet than in those on the CON and LLO diets. Pigs fed the SBO and LLO diets had lower CRP concentrations than those on the CON diet. Adipose tissue expression of interleukin 6 (IL-6) was higher in the SBO and LLO diets than in the CON diet. Expression of the ECM genes COLVIA and fibronectin was significantly reduced in the SBO diet relative to the CON and LLO diets, whereas expression of the inflammation-related genes cluster of differentiation 68 (CD68) and monocyte chemoattractant protein 1 (MCP-1) was not different across treatments. Conclusions Results suggest that lowering the content of α-linolenic acid in the context of a high fat diet could mitigate the development of hyperinsulinemia and dyslipidemia without significant effects on adipose tissue inflammation. PMID:23497195

  16. The 3of5 web application for complex and comprehensive pattern matching in protein sequences.

    PubMed

    Seiler, Markus; Mehrle, Alexander; Poustka, Annemarie; Wiemann, Stefan

    2006-03-16

    The identification of patterns in biological sequences is a key challenge in genome analysis and in proteomics. Frequently such patterns are complex and highly variable, especially in protein sequences. They are frequently described using terms of regular expressions (RegEx) because of the user-friendly terminology. Limitations arise for queries with the increasing complexity of patterns and are accompanied by requirements for enhanced capabilities. This is especially true for patterns containing ambiguous characters and positions and/or length ambiguities. We have implemented the 3of5 web application in order to enable complex pattern matching in protein sequences. 3of5 is named after a special use of its main feature, the novel n-of-m pattern type. This feature allows for an extensive specification of variable patterns where the individual elements may vary in their position, order, and content within a defined stretch of sequence. The number of distinct elements can be constrained by operators, and individual characters may be excluded. The n-of-m pattern type can be combined with common regular expression terms and thus also allows for a comprehensive description of complex patterns. 3of5 increases the fidelity of pattern matching and finds ALL possible solutions in protein sequences in cases of length-ambiguous patterns instead of simply reporting the longest or shortest hits. Grouping and combined search for patterns provides a hierarchical arrangement of larger patterns sets. The algorithm is implemented as internet application and freely accessible. The application is available at http://dkfz.de/mga2/3of5/3of5.html. The 3of5 application offers an extended vocabulary for the definition of search patterns and thus allows the user to comprehensively specify and identify peptide patterns with variable elements. The n-of-m pattern type offers an improved accuracy for pattern matching in combination with the ability to find all solutions, without compromising the user friendliness of regular expression terms.
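
    The n-of-m idea can be approximated with ordinary regular expressions by searching for each element separately and counting how many distinct elements fall inside a sliding window. The toy Python function below is only an illustration of that concept (the motifs are made up); it is not the 3of5 implementation.

      # Toy "n-of-m" matcher: report windows of a protein sequence in which at least
      # n of the m element patterns occur. Illustrative only; not the 3of5 tool.
      import re

      def n_of_m(sequence, element_patterns, n, window):
          hits = []
          for start in range(len(sequence) - window + 1):
              segment = sequence[start:start + window]
              found = {p for p in element_patterns if re.search(p, segment)}
              if len(found) >= n:
                  hits.append((start, segment, sorted(found)))
          return hits

      seq = "MKRHALSPTGCCPLIRWKSHG"                           # arbitrary toy sequence
      elements = [r"K[RK]", r"C{2}", r"H..S", r"W", r"P[LI]"]  # hypothetical motifs
      for pos, segment, found in n_of_m(seq, elements, n=3, window=10):
          print(pos, segment, found)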

  17. Antagonism of Sigma-1 Receptors Blocks Compulsive-Like Eating

    PubMed Central

    Cottone, Pietro; Wang, Xiaofan; Park, Jin Won; Valenza, Marta; Blasio, Angelo; Kwak, Jina; Iyer, Malliga R; Steardo, Luca; Rice, Kenner C; Hayashi, Teruo; Sabino, Valentina

    2012-01-01

    Binge eating disorder is an addiction-like disorder characterized by episodes of rapid and excessive food consumption within discrete periods of time which occur compulsively despite negative consequences. This study was aimed at determining whether antagonism of Sigma-1 receptors (Sig-1Rs) blocked compulsive-like binge eating. We trained male wistar rats to obtain a sugary, highly palatable diet (Palatable group) or a regular chow diet (Chow control group), for 1 h a day under fixed ratio 1 operant conditioning. Following intake stabilization, we evaluated the effects of the selective Sig-1R antagonist BD-1063 on food responding. Using a light/dark conflict test, we also tested whether BD-1063 could block the time spent and the food eaten in an aversive, open compartment, where the palatable diet was offered. Furthermore, we measured Sig-1R mRNA and protein expression in several brain areas of the two groups, 24 h after the last binge session. Palatable rats rapidly developed binge-like eating, escalating the 1 h intake by four times, and doubling the eating rate and the regularity of food responding, compared to Chow rats. BD-1063 dose-dependently reduced binge-like eating and the regularity of food responding, and blocked the increased eating rate in Palatable rats. In the light/dark conflict test, BD-1063 antagonized the increased time spent in the aversive compartment and the increased intake of the palatable diet, without affecting motor activity. Finally, Palatable rats showed reduced Sig-1R mRNA expression in prefrontal and anterior cingulate cortices, and a two-fold increase in Sig-1R protein expression in anterior cingulate cortex compared to control Chow rats. These findings suggest that the Sig-1R system may contribute to the neurobiological adaptations driving compulsive-like eating, opening new avenues of investigation towards pharmacologically treating binge eating disorder. PMID:22713906

  18. High Dietary Fructose Intake on Cardiovascular Disease Related Parameters in Growing Rats.

    PubMed

    Yoo, SooYeon; Ahn, Hyejin; Park, Yoo Kyoung

    2016-12-26

    The objective of this study was to determine the effects of a high-fructose diet on cardiovascular disease (CVD)-related parameters in growing rats. Three-week-old female Sprague Dawley rats were randomly assigned to four experimental groups; a regular diet group (RD: fed regular diet based on AIN-93G, n = 8), a high-fructose diet group (30Frc: fed regular diet with 30% fructose, n = 8), a high-fat diet group (45Fat: fed regular diet with 45 kcal% fat, n = 8) or a high fructose with high-fat diet group (30Frc + 45Fat, fed diet 30% fructose with 45 kcal% fat, n = 8). After an eight-week treatment period, the body weight, total-fat weight, serum glucose, insulin, lipid profiles and pro-inflammatory cytokines, abdominal aortic wall thickness, and expressions of eNOS and ET-1 mRNA were analyzed. The result showed that total-fat weight was higher in the 30Frc, 45Fat, and 30Frc + 45Fat groups compared to the RD group ( p < 0.05). Serum triglyceride (TG) levels were highest in the 30Frc group than the other groups ( p < 0.05). The abdominal aorta of 30Frc, 45Fat, and 30Frc + 45Fat groups had higher wall thickness than the RD group ( p < 0.05). Abdominal aortic eNOS mRNA level was decreased in 30Frc, 45Fat, and 30Frc + 45Fat groups compared to the RD group ( p < 0.05), and also 45Fat and 30Frc + 45Fat groups had decreased mRNA expression of eNOS compared to the 30Frc group ( p < 0.05). ET-1 mRNA level was higher in 30Frc, 45Fat, and 30Frc + 45Fat groups than the RD group ( p < 0.05). Both high fructose consumption and high fat consumption in growing rats had similar negative effects on CVD-related parameters.

  19. Ada (Tradename) Compiler Validation Summary Report. Harris Corporation. HARRIS Ada Compiler, Version 1.0. Harris H1200 and H800.

    DTIC Science & Technology

    This Validations Summary Report (VSR) summarizes the results and conclusions of validation testing performed on the HARRIS Ada Compiler, Version 1.0...at compile time, at link time, or during execution. On-site testing was performed 28 APR 1986 through 30 APR 1986 at Harris Corporation, Ft. Lauderdale

  20. Memory management and compiler support for rapid recovery from failures in computer systems

    NASA Technical Reports Server (NTRS)

    Fuchs, W. K.

    1991-01-01

    This paper describes recent developments in the use of memory management and compiler technology to support rapid recovery from failures in computer systems. The techniques described include cache coherence protocols for user transparent checkpointing in multiprocessor systems, compiler-based checkpoint placement, compiler-based code modification for multiple instruction retry, and forward recovery in distributed systems utilizing optimistic execution.

  1. Bypassing the Limits of L1 Regularization: Convex Sparse Signal Processing Using Non-Convex Regularization

    NASA Astrophysics Data System (ADS)

    Parekh, Ankit

    Sparsity has become the basis of some important signal processing methods over the last ten years. Many signal processing problems (e.g., denoising, deconvolution, non-linear component analysis) can be expressed as inverse problems. Sparsity is invoked through the formulation of an inverse problem with suitably designed regularization terms. The regularization terms alone encode sparsity into the problem formulation. Often, the ℓ1 norm is used to induce sparsity, so much so that ℓ1 regularization is considered to be `modern least-squares'. The use of ℓ1 norm, as a sparsity-inducing regularizer, leads to a convex optimization problem, which has several benefits: the absence of extraneous local minima, well developed theory of globally convergent algorithms, even for large-scale problems. Convex regularization via the ℓ1 norm, however, tends to under-estimate the non-zero values of sparse signals. In order to estimate the non-zero values more accurately, non-convex regularization is often favored over convex regularization. However, non-convex regularization generally leads to non-convex optimization, which suffers from numerous issues: convergence may be guaranteed to only a stationary point, problem specific parameters may be difficult to set, and the solution is sensitive to the initialization of the algorithm. The first part of this thesis is aimed toward combining the benefits of non-convex regularization and convex optimization to estimate sparse signals more effectively. To this end, we propose to use parameterized non-convex regularizers with designated non-convexity and provide a range for the non-convex parameter so as to ensure that the objective function is strictly convex. By ensuring convexity of the objective function (sum of data-fidelity and non-convex regularizer), we can make use of a wide variety of convex optimization algorithms to obtain the unique global minimum reliably. The second part of this thesis proposes a non-linear signal decomposition technique for an important biomedical signal processing problem: the detection of sleep spindles and K-complexes in human sleep electroencephalography (EEG). We propose a non-linear model for the EEG consisting of three components: (1) a transient (sparse piecewise constant) component, (2) a low-frequency component, and (3) an oscillatory component. The oscillatory component admits a sparse time-frequency representation. Using a convex objective function, we propose a fast non-linear optimization algorithm to estimate the three components in the proposed signal model. The low-frequency and oscillatory components are then used to estimate the K-complexes and sleep spindles respectively. The proposed detection method is shown to outperform several state-of-the-art automated sleep spindles detection methods.
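
    A minimal way to see the convexity-preservation idea, in the scalar denoising case and in our notation rather than the thesis' exact construction: let \phi_{a} be a sparsity penalty whose curvature is bounded below by -a. Then

      F(x) = \tfrac{1}{2}(y - x)^{2} + \lambda \, \phi_{a}(x),
      \qquad \inf_{t} \phi_{a}''(t) = -a
      \;\Longrightarrow\;
      F''(x) = 1 + \lambda \, \phi_{a}''(x) \ \ge\ 1 - \lambda a ,

    so the overall objective F remains convex whenever \lambda a \le 1, even though \phi_{a} itself (for example a minimax-concave type penalty) is non-convex and penalizes large coefficients less severely than the \ell_{1} norm.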

  2. Alcohol dehydrogenase and hydrogenase transcript fluctuations during a day-night cycle in Chlamydomonas reinhardtii: the role of anoxia.

    PubMed

    Whitney, Larisa Angela Swirsky; Loreti, Elena; Alpi, Amedeo; Perata, Pierdomenico

    2011-04-01

    • The unicellular green alga Chlamydomonas reinhardtii contains two iron (Fe)-hydrogenases which are responsible for hydrogen production under anoxia. In the present work the patterns of expression of alcohol dehydrogenase, a typical anaerobic gene in plants, of the hydrogenases genes (HYD1, HYD2) and of the genes responsible for their maturation (HYDEF, HYDG), were analysed. • The expression patterns were analysed by real-time reverse-transcription polymerase chain reaction in Chlamydomonas cultures during the day-night cycle, as well as in response to oxygen availability. • The results indicated that ADH1, HYD1, HYD2, HYDEF and HYDG were expressed following precise day-night fluctuations. ADH1 and HYD2 were modulated by the day-night cycle. Low oxygen plays an important role for the induction of HYD1, HYDEF and HYDG, while ADH1 and HYD2 expression was relatively insensitive to oxygen availability. • The regulation of the anaerobic gene expression in Chlamydomonas is only partly explained by responses to anoxia. The cell cycle and light-dark cycles are equally important elements in the regulatory network modulating the anaerobic response in Chlamydomonas. © The Authors (2010). Journal compilation © New Phytologist Trust (2010).

  3. On the regularity of the covariance matrix of a discretized scalar field on the sphere

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bilbao-Ahedo, J.D.; Barreiro, R.B.; Herranz, D.

    2017-02-01

    We present a comprehensive study of the regularity of the covariance matrix of a discretized field on the sphere. In particular, the rank of the matrix depends on the number of pixels, the number of spherical harmonics, the symmetries of the pixelization scheme and the presence of a mask. Taking into account the above mentioned components, we provide analytical expressions that constrain the rank of the matrix. They are obtained by expanding the determinant of the covariance matrix as a sum of determinants of matrices made up of spherical harmonics. We investigate these constraints for five different pixelizations that have been used in the context of Cosmic Microwave Background (CMB) data analysis: Cube, Icosahedron, Igloo, GLESP and HEALPix, finding that, at least in the considered cases, the HEALPix pixelization tends to provide a covariance matrix with a rank closer to the maximum expected theoretical value than the other pixelizations. The effect of the propagation of numerical errors on the regularity of the covariance matrix is also studied for different computational precisions, as well as the effect of adding a certain level of noise in order to regularize the matrix. In addition, we investigate the application of the previous results to a particular example that requires the inversion of the covariance matrix: the estimation of the CMB temperature power spectrum through the Quadratic Maximum Likelihood algorithm. Finally, some general considerations in order to achieve a regular covariance matrix are also presented.
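
    As a schematic of why such rank constraints arise (our notation; the paper derives sharper, pixelization-specific results): for a band-limited field the pixel-pixel covariance is a Gram matrix built from finitely many spherical harmonics,

      C_{ij} = \sum_{\ell=0}^{\ell_{\max}} \frac{2\ell+1}{4\pi} \, C_{\ell} \, P_{\ell}(\cos\theta_{ij})
             = \sum_{\ell=0}^{\ell_{\max}} \sum_{m=-\ell}^{\ell} C_{\ell} \, Y_{\ell m}(\hat{n}_{i}) \, Y_{\ell m}^{*}(\hat{n}_{j}),
      \qquad
      \operatorname{rank}(C) \le \min\bigl(N_{\mathrm{pix}}, \, (\ell_{\max}+1)^{2}\bigr),

    so the matrix can be regular (full rank) only if the number of pixels does not exceed the number of contributing harmonic modes; masks and pixelization symmetries can only lower the bound further.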

  4. Oceanography from satellites

    NASA Technical Reports Server (NTRS)

    Wilson, W. S.

    1981-01-01

    It is pointed out that oceanographers have benefited from the space program mainly through the increased efficiency it has brought to ship operations. For example, the Transit navigation system has enabled oceanographers to compile detailed maps of sea-floor properties and to more accurately locate moored subsurface instrumentation. General descriptions are given of instruments used in satellite observations (altimeter, color scanner, infrared radiometer, microwave radiometer, scatterometer, synthetic aperture radar). It is pointed out that because of the large volume of data that satellite instruments generate, the development of algorithms for converting the data into a form expressed in geophysical units has become especially important.

  5. Moloka'i Fieldtrip Guidebook: Selected Aspects of the Geology, Geography, and Coral Reefs of Moloka'i

    USGS Publications Warehouse

    Cochran, Susan A.; Roberts, Lucile M.; Evans, Kevin R.

    2002-01-01

    This guidebook was compiled with the express purpose of describing the general geology of Moloka'i and those locations with significance to the U.S. Geological Survey's study of Moloka'i's coral reef, a part of the U.S. Department of Interior's 'Protecting the Nation's Reefs' program. The first portion of the guidebook describes the island and gives the historical background. Fieldtrip stop locations are listed in a logical driving order, essentially from west to east. This order may be changed, or stops deleted, depending on time and scheduling of an individual fieldtrip.

  6. Aircraft measurement of radio frequency noise at 121.5 MHz, 243 MHz and 406 MHz

    NASA Technical Reports Server (NTRS)

    Taylor, R. E.; Hill, J. S.

    1977-01-01

    An airborne survey measurement of terrestrial radio-frequency noise over U.S. metropolitan areas was carried out at 121.5, 243 and 406 MHz with horizontal-polarization monopole antennas. Flights were at 25,000 feet altitude. Radio-noise measurements, expressed in equivalent antenna-noise temperature, indicate a steady-background noise temperature of 572,000 K, at 121.5 MHz, during daylight over New York City. This data is helpful in compiling radio-noise temperature maps; in turn useful for designing satellite-aided, emergency-distress search and rescue communication systems.

  7. Automatic code generation in SPARK: Applications of computer algebra and compiler-compilers

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nataf, J.M.; Winkelmann, F.

    We show how computer algebra and compiler-compilers are used for automatic code generation in the Simulation Problem Analysis and Research Kernel (SPARK), an object oriented environment for modeling complex physical systems that can be described by differential-algebraic equations. After a brief overview of SPARK, we describe the use of computer algebra in SPARK's symbolic interface, which generates solution code for equations that are entered in symbolic form. We also describe how the Lex/Yacc compiler-compiler is used to achieve important extensions to the SPARK simulation language, including parametrized macro objects and steady-state resetting of a dynamic simulation. The application of these methods to solving the partial differential equations for two-dimensional heat flow is illustrated.
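
    A minimal sketch of the general idea, i.e., solving an equation entered symbolically and emitting executable code, using SymPy as a stand-in for SPARK's symbolic interface and Lex/Yacc tooling; the equation is an arbitrary placeholder.

      # Sketch of computer-algebra-driven code generation: solve a symbolically entered
      # equation, then emit printable source and a callable solution routine.
      # SymPy stands in for SPARK's symbolic interface; the equation is a toy example.
      import sympy as sp

      T1, T2, R, Q = sp.symbols("T1 T2 R Q")
      balance = sp.Eq(Q, (T1 - T2) / R)            # toy steady-state heat-flow relation

      solution = sp.solve(balance, T2)[0]          # -> T1 - Q*R
      t2_func = sp.lambdify((T1, R, Q), solution)  # generated numeric Python function

      print("T2 =", solution)
      print("generated code:", sp.pycode(solution))
      print("T2(300, 0.5, 40) =", t2_func(300.0, 0.5, 40.0))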

  9. Portable Just-in-Time Specialization of Dynamically Typed Scripting Languages

    NASA Astrophysics Data System (ADS)

    Williams, Kevin; McCandless, Jason; Gregg, David

    In this paper, we present a portable approach to JIT compilation for dynamically typed scripting languages. At runtime we generate ANSI C code and use the system's native C compiler to compile this code. The C compiler runs on a separate thread to the interpreter allowing program execution to continue during JIT compilation. Dynamic languages have variables which may change type at any point in execution. Our interpreter profiles variable types at both whole method and partial method granularity. When a frequently executed region of code is discovered, the compilation thread generates a specialized version of the region based on the profiled types. In this paper, we evaluate the level of instruction specialization achieved by our profiling scheme as well as the overall performance of our JIT.
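
    The paper's JIT emits ANSI C and is not shown in this record; the sketch below only illustrates the general idea of type profiling followed by specialization, in Python rather than in the authors' system, with all names invented.

    ```python
    # Sketch of type-profiled specialization (not the authors' JIT): profile operand
    # types during interpretation, then dispatch a version specialized for the
    # dominant type pair once a region becomes hot.
    from collections import Counter

    profile = Counter()

    def add_generic(a, b):
        profile[(type(a), type(b))] += 1          # interpreter path records observed types
        return a + b

    def add_int_specialized(a, b):
        return a + b                              # hypothetical fast path assuming int operands

    def choose_specialization(threshold=100):
        if not profile:
            return add_generic
        types, count = profile.most_common(1)[0]
        if count >= threshold and types == (int, int):
            return add_int_specialized            # "compile" the hot, type-specialized version
        return add_generic

    for i in range(200):                          # warm-up: gather the profile
        add_generic(i, i + 1)

    add = choose_specialization()
    print(add is add_int_specialized)             # True once int/int dominates
    ```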

  10. Compiler-assisted multiple instruction rollback recovery using a read buffer

    NASA Technical Reports Server (NTRS)

    Alewine, Neal J.; Chen, Shyh-Kwei; Fuchs, W. Kent; Hwu, Wen-Mei W.

    1995-01-01

    Multiple instruction rollback (MIR) is a technique that has been implemented in mainframe computers to provide rapid recovery from transient processor failures. Hardware-based MIR designs eliminate rollback data hazards by providing data redundancy implemented in hardware. Compiler-based MIR designs have also been developed which remove rollback data hazards directly with data-flow transformations. This paper describes compiler-assisted techniques to achieve multiple instruction rollback recovery. We observe that some data hazards resulting from instruction rollback can be resolved efficiently by providing an operand read buffer while others are resolved more efficiently with compiler transformations. The compiler-assisted scheme presented consists of hardware that is less complex than shadow files, history files, history buffers, or delayed write buffers, while experimental evaluation indicates performance improvement over compiler-based schemes.

  11. A survey of compiler development aids. [concerning lexical, syntax, and semantic analysis]

    NASA Technical Reports Server (NTRS)

    Buckles, B. P.; Hodges, B. C.; Hsia, P.

    1977-01-01

    A theoretical background was established for the compilation process by dividing it into five phases and explaining the concepts and algorithms that underpin each. The five selected phases were lexical analysis, syntax analysis, semantic analysis, optimization, and code generation. Graph theoretical optimization techniques were presented, and approaches to code generation were described for both one-pass and multipass compilation environments. Following the initial tutorial sections, more than 20 tools that were developed to aid in the process of writing compilers were surveyed. Eight of the more recent compiler development aids were selected for special attention - SIMCMP/STAGE2, LANG-PAK, COGENT, XPL, AED, CWIC, LIS, and JOCIT. The impact of compiler development aids was assessed, some of their shortcomings were noted, and some of the areas of research currently in progress were examined.
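
    As a toy companion to the phases listed above, the sketch below strings together a regular-expression lexer (lexical analysis) and a recursive-descent parser (syntax analysis) for simple arithmetic; it is an invented illustration, not one of the surveyed tools.

    ```python
    # Toy lexer + recursive-descent parser illustrating the first two compiler phases.
    import re

    TOKEN_RE = re.compile(r"\s*(?:(\d+)|(.))")      # integers, or any single operator character

    def lex(text):
        """Lexical analysis: turn the input string into (kind, value) tokens."""
        return [("NUM", int(num)) if num else ("OP", op)
                for num, op in TOKEN_RE.findall(text)]

    def parse_term(tokens, pos):
        """term := NUM ('*' NUM)*"""
        _, value = tokens[pos]
        pos += 1
        while pos < len(tokens) and tokens[pos] == ("OP", "*"):
            value *= tokens[pos + 1][1]
            pos += 2
        return value, pos

    def parse_expr(tokens, pos=0):
        """expr := term (('+' | '-') term)* -- evaluates while it parses."""
        value, pos = parse_term(tokens, pos)
        while pos < len(tokens) and tokens[pos] in (("OP", "+"), ("OP", "-")):
            op = tokens[pos][1]
            rhs, pos = parse_term(tokens, pos + 1)
            value = value + rhs if op == "+" else value - rhs
        return value, pos

    print(parse_expr(lex("1 + 2 * 3"))[0])          # 7
    ```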

  12. Learning of pitch and time structures in an artificial grammar setting.

    PubMed

    Prince, Jon B; Stevens, Catherine J; Jones, Mari Riess; Tillmann, Barbara

    2018-04-12

    Despite the empirical evidence for the power of the cognitive capacity of implicit learning of structures and regularities in several modalities and materials, it remains controversial whether implicit learning extends to the learning of temporal structures and regularities. We investigated whether (a) an artificial grammar can be learned equally well when expressed in duration sequences as when expressed in pitch sequences, (b) learning of the artificial grammar in either duration or pitch (as the primary dimension) sequences can be influenced by the properties of the secondary dimension (invariant vs. randomized), and (c) learning can be boosted when the artificial grammar is expressed in both pitch and duration. After an exposure phase with grammatical sequences, learning in a subsequent test phase was assessed in a grammaticality judgment task. Participants in both the pitch and duration conditions showed incidental (not fully implicit) learning of the artificial grammar when the secondary dimension was invariant, but randomizing the pitch sequence prevented learning of the artificial grammar in duration sequences. Expressing the artificial grammar in both pitch and duration resulted in disproportionately better performance, suggesting an interaction between the learning of pitch and temporal structure. The findings are relevant to research investigating the learning of temporal structures and the learning of structures presented simultaneously in 2 dimensions (e.g., space and time, space and objects). By investigating learning, the findings provide further insight into the potential specificity of pitch and time processing, and their integrated versus independent processing, as previously debated in music cognition research. (PsycINFO Database Record (c) 2018 APA, all rights reserved).
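
    The actual grammar and stimuli are not given in this record; the sketch below merely illustrates how a finite-state artificial grammar can generate sequences that are then expressed either as pitches or as durations. The grammar and both alphabets are invented for the example.

    ```python
    # Toy finite-state artificial grammar: generate a grammatical symbol sequence,
    # then express it in a pitch alphabet or a duration alphabet.
    import random

    GRAMMAR = {                      # state -> list of (symbol, next_state); None ends the sequence
        "S0": [("A", "S1"), ("B", "S2")],
        "S1": [("C", "S2"), ("A", "S1")],
        "S2": [("B", "S3"), ("C", "S1")],
        "S3": [("A", None), ("C", None)],
    }
    PITCH = {"A": "C4", "B": "E4", "C": "G4"}        # primary dimension: pitch (invented mapping)
    DURATION = {"A": 0.25, "B": 0.5, "C": 1.0}       # primary dimension: duration in seconds

    def generate(max_len=10):
        state, symbols = "S0", []
        while state is not None and len(symbols) < max_len:
            symbol, state = random.choice(GRAMMAR[state])
            symbols.append(symbol)
        return symbols

    seq = generate()
    print(seq, [PITCH[s] for s in seq], [DURATION[s] for s in seq])
    ```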

  13. Hierarchical Bayesian modelling of gene expression time series across irregularly sampled replicates and clusters.

    PubMed

    Hensman, James; Lawrence, Neil D; Rattray, Magnus

    2013-08-20

    Time course data from microarrays and high-throughput sequencing experiments require simple, computationally efficient and powerful statistical models to extract meaningful biological signal, and for tasks such as data fusion and clustering. Existing methodologies fail to capture either the temporal or replicated nature of the experiments, and often impose constraints on the data collection process, such as regularly spaced samples, or similar sampling schema across replications. We propose hierarchical Gaussian processes as a general model of gene expression time-series, with application to a variety of problems. In particular, we illustrate the method's capacity for missing data imputation, data fusion and clustering.The method can impute data which is missing both systematically and at random: in a hold-out test on real data, performance is significantly better than commonly used imputation methods. The method's ability to model inter- and intra-cluster variance leads to more biologically meaningful clusters. The approach removes the necessity for evenly spaced samples, an advantage illustrated on a developmental Drosophila dataset with irregular replications. The hierarchical Gaussian process model provides an excellent statistical basis for several gene-expression time-series tasks. It has only a few additional parameters over a regular GP, has negligible additional complexity, is easily implemented and can be integrated into several existing algorithms. Our experiments were implemented in python, and are available from the authors' website: http://staffwww.dcs.shef.ac.uk/people/J.Hensman/.
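
    The authors' own Python implementation is linked in the record; the sketch below is an independent, minimal numpy illustration of the hierarchical idea (a shared gene-level GP plus replicate-specific GPs), with all kernel parameters and the synthetic, irregularly sampled data invented for the example.

    ```python
    # Minimal hierarchical-GP sketch: covariance between two observations is
    # k_gene + (same replicate ? k_rep : 0) + noise; predict the shared signal.
    import numpy as np

    def rbf(t1, t2, variance, lengthscale):
        d = t1[:, None] - t2[None, :]
        return variance * np.exp(-0.5 * (d / lengthscale) ** 2)

    rng = np.random.default_rng(0)
    # Irregularly sampled replicates (different time points per replicate)
    t = np.concatenate([np.sort(rng.uniform(0, 10, 6)), np.sort(rng.uniform(0, 10, 9))])
    rep = np.array([0] * 6 + [1] * 9)
    y = np.sin(t) + 0.3 * rng.standard_normal(t.size)

    same_rep = (rep[:, None] == rep[None, :]).astype(float)
    K = rbf(t, t, 1.0, 2.0) + same_rep * rbf(t, t, 0.2, 1.0) + 0.1 * np.eye(t.size)

    t_star = np.linspace(0, 10, 50)                 # predict the shared (gene-level) function
    K_star = rbf(t_star, t, 1.0, 2.0)               # cross-covariance with the shared component only
    mean_shared = K_star @ np.linalg.solve(K, y)    # GP posterior mean of the shared signal
    print(mean_shared[:5].round(2))
    ```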

  14. Kangaroo – A pattern-matching program for biological sequences

    PubMed Central

    2002-01-01

    Background Biologists are often interested in performing a simple database search to identify proteins or genes that contain a well-defined sequence pattern. Many databases do not provide straightforward or readily available query tools to perform simple searches, such as identifying transcription binding sites, protein motifs, or repetitive DNA sequences. However, in many cases simple pattern-matching searches can reveal a wealth of information. We present in this paper a regular expression pattern-matching tool that was used to identify short repetitive DNA sequences in human coding regions for the purpose of identifying potential mutation sites in mismatch repair deficient cells. Results Kangaroo is a web-based regular expression pattern-matching program that can search for patterns in DNA, protein, or coding region sequences in ten different organisms. The program is implemented to facilitate a wide range of queries with no restriction on the length or complexity of the query expression. The program is accessible on the web at http://bioinfo.mshri.on.ca/kangaroo/ and the source code is freely distributed at http://sourceforge.net/projects/slritools/. Conclusion A low-level simple pattern-matching application can prove to be a useful tool in many research settings. For example, Kangaroo was used to identify potential genetic targets in a human colorectal cancer variant that is characterized by a high frequency of mutations in coding regions containing mononucleotide repeats. PMID:12150718
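
    Kangaroo itself is a web application; the snippet below only illustrates, with Python's re module and a made-up sequence, the kind of mononucleotide-repeat query described above.

    ```python
    # Find mononucleotide repeats (one base repeated 8 or more times) in a
    # made-up coding sequence; this mirrors the kind of query Kangaroo supports.
    import re

    coding_sequence = "ATGGCGAAAAAAAAAGCTTCCCCCCCCCCGATTACA"   # invented example data

    repeat_pattern = re.compile(r"([ACGT])\1{7,}")   # a base, then the same base 7+ more times

    for match in repeat_pattern.finditer(coding_sequence):
        print(match.group(0), "at 0-based position", match.start())
    ```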

  15. HER2 isoforms co-expression differently tunes mammary tumor phenotypes affecting onset, vasculature and therapeutic response

    PubMed Central

    Balboni, Tania; Ianzano, Marianna L.; Laranga, Roberta; Landuzzi, Lorena; Giusti, Veronica; Ceccarelli, Claudio; Santini, Donatella; Taffurelli, Mario; Di Oto, Enrico; Asioli, Sofia; Amici, Augusto; Pupa, Serenella M.; De Giovanni, Carla; Tagliabue, Elda; Iezzi, Manuela; Nanni, Patrizia; Lollini, Pier-Luigi

    2017-01-01

    Full-length HER2 oncoprotein and splice variant Delta16 are co-expressed in human breast cancer. We studied their interaction in hybrid transgenic mice bearing human full-length HER2 and Delta16 (F1 HER2/Delta16) in comparison to parental HER2 and Delta16 transgenic mice. Mammary carcinoma onset was faster in F1 HER2/Delta16 and Delta16 mice than in HER2 mice; however, tumor growth was slower, and metastatic spread was comparable in all transgenic mice. Full-length HER2 tumors contained few large vessels or vascular lacunae, whereas Delta16 tumors presented a more regular vascularization with numerous endothelium-lined small vessels. Delta16-expressing tumors showed a higher accumulation of i.v. injected doxorubicin than tumors expressing full-length HER2. F1 HER2/Delta16 tumors with high full-length HER2 expression made few large vessels, whereas tumors with low full-length HER2 and high Delta16 contained numerous small vessels and expressed higher levels of VEGF and VEGFR2. Trastuzumab strongly inhibited tumor onset in F1 HER2/Delta16 and Delta16 mice, but not in full-length HER2 mice. Addiction of F1 tumors to Delta16 was also shown by the long-term stability of Delta16 levels during serial transplants; in contrast, full-length HER2 levels underwent wide fluctuations. In conclusion, full-length HER2 leads to faster tumor growth and irregular vascularization, whereas Delta16 leads to faster tumor onset, with more regular vessels, which in turn could better transport cytotoxic drugs within the tumor, and to a higher sensitivity to targeted therapeutic agents. F1 HER2/Delta16 mice are a new immunocompetent mouse model, complementary to patient-derived xenografts, for studies of mammary carcinoma onset, prevention and therapy. PMID:28903354

  16. Short-term regular aerobic exercise reduces oxidative stress produced by acute high pressure in the adipose microvasculature.

    PubMed

    Robinson, Austin T; Fancher, Ibra S; Sudhahar, Varadarajan; Bian, Jing Tan; Cook, Marc D; Mahmoud, Abeer M; Ali, Mohamed M; Ushio-Fukai, Masuko; Brown, Michael D; Fukai, Tohru; Phillips, Shane A

    2017-05-01

    High blood pressure has been shown to elicit impaired dilation in the vasculature. The purpose of this investigation was to elucidate the mechanisms through which high pressure may elicit vascular dysfunction and determine the mechanisms through which regular aerobic exercise protects arteries against high pressure. Male C57BL/6J mice were subjected to 2 wk of voluntary running (~6 km/day) for comparison with sedentary controls. Hindlimb adipose resistance arteries were dissected from mice for measurements of flow-induced dilation (FID; with or without high intraluminal pressure exposure) or protein expression of NADPH oxidase II (NOX II) and superoxide dismutase (SOD). Microvascular endothelial cells were subjected to high physiological laminar shear stress (20 dyn/cm2) or static condition and treated with ANG II + pharmacological inhibitors. Cells were analyzed for the detection of ROS or collected for Western blot determination of NOX II and SOD. Resistance arteries from exercised mice demonstrated preserved FID after high pressure exposure, whereas FID was impaired in control mouse arteries. Inhibition of ANG II or NOX II restored impaired FID in control mouse arteries. High pressure increased superoxide levels in control mouse arteries but not in exercised mouse arteries, which exhibited greater ability to convert superoxide to H2O2. Arteries from exercised mice exhibited less NOX II protein expression, more SOD isoform expression, and less sensitivity to ANG II. Endothelial cells subjected to laminar shear stress exhibited less NOX II subunit expression. In conclusion, aerobic exercise prevents high pressure-induced vascular dysfunction through an improved redox environment in the adipose microvasculature. NEW & NOTEWORTHY We describe potential mechanisms contributing to aerobic exercise-conferred protection against high intravascular pressure. Subcutaneous adipose microvessels from exercised mice express less NADPH oxidase (NOX) II and more superoxide dismutase (SOD) and demonstrate less sensitivity to ANG II. In microvascular endothelial cells, shear stress reduced NOX II but did not influence SOD expression.

  17. The Cancer Cell Line Encyclopedia enables predictive modeling of anticancer drug sensitivity

    PubMed Central

    Barretina, Jordi; Caponigro, Giordano; Stransky, Nicolas; Venkatesan, Kavitha; Margolin, Adam A.; Kim, Sungjoon; Wilson, Christopher J.; Lehár, Joseph; Kryukov, Gregory V.; Sonkin, Dmitriy; Reddy, Anupama; Liu, Manway; Murray, Lauren; Berger, Michael F.; Monahan, John E.; Morais, Paula; Meltzer, Jodi; Korejwa, Adam; Jané-Valbuena, Judit; Mapa, Felipa A.; Thibault, Joseph; Bric-Furlong, Eva; Raman, Pichai; Shipway, Aaron; Engels, Ingo H.; Cheng, Jill; Yu, Guoying K.; Yu, Jianjun; Aspesi, Peter; de Silva, Melanie; Jagtap, Kalpana; Jones, Michael D.; Wang, Li; Hatton, Charles; Palescandolo, Emanuele; Gupta, Supriya; Mahan, Scott; Sougnez, Carrie; Onofrio, Robert C.; Liefeld, Ted; MacConaill, Laura; Winckler, Wendy; Reich, Michael; Li, Nanxin; Mesirov, Jill P.; Gabriel, Stacey B.; Getz, Gad; Ardlie, Kristin; Chan, Vivien; Myer, Vic E.; Weber, Barbara L.; Porter, Jeff; Warmuth, Markus; Finan, Peter; Harris, Jennifer L.; Meyerson, Matthew; Golub, Todd R.; Morrissey, Michael P.; Sellers, William R.; Schlegel, Robert; Garraway, Levi A.

    2012-01-01

    The systematic translation of cancer genomic data into knowledge of tumor biology and therapeutic avenues remains challenging. Such efforts should be greatly aided by robust preclinical model systems that reflect the genomic diversity of human cancers and for which detailed genetic and pharmacologic annotation is available1. Here we describe the Cancer Cell Line Encyclopedia (CCLE): a compilation of gene expression, chromosomal copy number, and massively parallel sequencing data from 947 human cancer cell lines. When coupled with pharmacologic profiles for 24 anticancer drugs across 479 of the lines, this collection allowed identification of genetic, lineage, and gene expression-based predictors of drug sensitivity. In addition to known predictors, we found that plasma cell lineage correlated with sensitivity to IGF1 receptor inhibitors; AHR expression was associated with MEK inhibitor efficacy in NRAS-mutant lines; and SLFN11 expression predicted sensitivity to topoisomerase inhibitors. Altogether, our results suggest that large, annotated cell line collections may help to enable preclinical stratification schemata for anticancer agents. The generation of genetic predictions of drug response in the preclinical setting and their incorporation into cancer clinical trial design could speed the emergence of “personalized” therapeutic regimens2. PMID:22460905
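
    As a purely illustrative sketch of the kind of modelling the record describes (relating expression features to drug sensitivity with a regularized linear model), the code below fits scikit-learn's elastic net to random stand-in data; nothing here is CCLE data, and the feature counts are arbitrary.

    ```python
    # Illustrative elastic-net predictor of drug sensitivity from expression
    # features; the data are random stand-ins, not CCLE measurements.
    import numpy as np
    from sklearn.linear_model import ElasticNetCV

    rng = np.random.default_rng(1)
    n_lines, n_genes = 479, 1000                      # cell lines x expression features
    expression = rng.standard_normal((n_lines, n_genes))
    true_weights = np.zeros(n_genes)
    true_weights[:5] = [2.0, -1.5, 1.0, 0.8, -0.5]    # a few "predictive genes" planted for the demo
    sensitivity = expression @ true_weights + 0.5 * rng.standard_normal(n_lines)

    model = ElasticNetCV(l1_ratio=0.5, cv=5).fit(expression, sensitivity)
    top = np.argsort(np.abs(model.coef_))[::-1][:5]
    print("top predictive features:", top)            # should recover most of the planted genes
    ```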

  18. [One-hundred fifty English words and expressions in dermatology that present difficulties or pitfalls for translation into Spanish].

    PubMed

    Navarro, F A

    2008-06-01

    Every year, thousands of technical neologisms are coined in English and must be rapidly imported into the Spanish language with the utmost precision, clarity, rigor, and linguistic correctness for Spanish to remain useful as a language of culture and learning that allows us to express the medicine of today. In 1999, the author published an extensive Glossary of Doubts in English-Spanish Translation of Dermatology that contained more than 500 words and expressions in dermatology that present difficulties in translation. Nine years later, the original glossary has been extended to include new English words and expressions that were not covered at that time. The author compiles and discusses 150 dermatological neologisms and technical terms in English that present problems for translation into Spanish or generate doubts regarding their use in that language, and offers reasoned proposals for their translation. The proposed translations are well founded and reflect the necessity for accuracy and clarity that should characterize all scientific language. In most cases, they are accompanied by detailed comments on normal usage among physicians, orthographic rules in Spanish, and official guidelines based on standardized nomenclature and the recommendations of the main international organizations.

  19. The Cancer Cell Line Encyclopedia enables predictive modelling of anticancer drug sensitivity.

    PubMed

    Barretina, Jordi; Caponigro, Giordano; Stransky, Nicolas; Venkatesan, Kavitha; Margolin, Adam A; Kim, Sungjoon; Wilson, Christopher J; Lehár, Joseph; Kryukov, Gregory V; Sonkin, Dmitriy; Reddy, Anupama; Liu, Manway; Murray, Lauren; Berger, Michael F; Monahan, John E; Morais, Paula; Meltzer, Jodi; Korejwa, Adam; Jané-Valbuena, Judit; Mapa, Felipa A; Thibault, Joseph; Bric-Furlong, Eva; Raman, Pichai; Shipway, Aaron; Engels, Ingo H; Cheng, Jill; Yu, Guoying K; Yu, Jianjun; Aspesi, Peter; de Silva, Melanie; Jagtap, Kalpana; Jones, Michael D; Wang, Li; Hatton, Charles; Palescandolo, Emanuele; Gupta, Supriya; Mahan, Scott; Sougnez, Carrie; Onofrio, Robert C; Liefeld, Ted; MacConaill, Laura; Winckler, Wendy; Reich, Michael; Li, Nanxin; Mesirov, Jill P; Gabriel, Stacey B; Getz, Gad; Ardlie, Kristin; Chan, Vivien; Myer, Vic E; Weber, Barbara L; Porter, Jeff; Warmuth, Markus; Finan, Peter; Harris, Jennifer L; Meyerson, Matthew; Golub, Todd R; Morrissey, Michael P; Sellers, William R; Schlegel, Robert; Garraway, Levi A

    2012-03-28

    The systematic translation of cancer genomic data into knowledge of tumour biology and therapeutic possibilities remains challenging. Such efforts should be greatly aided by robust preclinical model systems that reflect the genomic diversity of human cancers and for which detailed genetic and pharmacological annotation is available. Here we describe the Cancer Cell Line Encyclopedia (CCLE): a compilation of gene expression, chromosomal copy number and massively parallel sequencing data from 947 human cancer cell lines. When coupled with pharmacological profiles for 24 anticancer drugs across 479 of the cell lines, this collection allowed identification of genetic, lineage, and gene-expression-based predictors of drug sensitivity. In addition to known predictors, we found that plasma cell lineage correlated with sensitivity to IGF1 receptor inhibitors; AHR expression was associated with MEK inhibitor efficacy in NRAS-mutant lines; and SLFN11 expression predicted sensitivity to topoisomerase inhibitors. Together, our results indicate that large, annotated cell-line collections may help to enable preclinical stratification schemata for anticancer agents. The generation of genetic predictions of drug response in the preclinical setting and their incorporation into cancer clinical trial design could speed the emergence of 'personalized' therapeutic regimens.

  20. Efficient Type Representation in TAL

    NASA Technical Reports Server (NTRS)

    Chen, Juan

    2009-01-01

    Certifying compilers generate proofs for low-level code that guarantee safety properties of the code. Type information is an essential part of safety proofs. But the size of type information remains a concern for certifying compilers in practice. This paper demonstrates type representation techniques in a large-scale compiler that achieves both concise type information and efficient type checking. In our 200,000-line certifying compiler, the size of type information is about 36% of the size of pure code and data for our benchmarks, which is, to our knowledge, the best reported result. The type checking time is about 2% of the compilation time.
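
    The paper's actual representation techniques are not detailed in this record. As one generic way of keeping type information small, the sketch below shares structurally identical type terms (hash-consing); it is a stand-in illustration, not the certifying compiler's scheme.

    ```python
    # Hash-consing sketch: structurally equal type terms are stored once and shared.
    class TypeTable:
        def __init__(self):
            self._interned = {}

        def intern(self, ty):
            """ty is a nested tuple such as ('arrow', ('int',), ('int',))."""
            if isinstance(ty, tuple):
                ty = tuple(self.intern(t) for t in ty)   # canonicalize children first
            return self._interned.setdefault(ty, ty)     # reuse an existing identical term

    table = TypeTable()
    t1 = table.intern(("arrow", ("int",), ("pair", ("int",), ("int",))))
    t2 = table.intern(("arrow", ("int",), ("pair", ("int",), ("int",))))
    print(t1 is t2)    # True: one shared representation instead of two copies
    ```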

  1. Ada Compiler Validation Summary Report: Certificate Number 890420W1.10073: International Business Machines Corporation, IBM Development System for the Ada Language, VM/CMS Ada Compiler, Version 2.1.1, IBM 3083 (Host and Target)

    DTIC Science & Technology

    1989-04-20

    International Business Machines Corporation, IBM Development System for the Ada Language, VM/CMS Ada Compiler, Version 2.1.1, Wright-Patterson AFB, IBM 3083... 890420W1.10073, International Business Machines Corporation, IBM Development System for the Ada Language, VM/CMS Ada Compiler, Version 2.1.1, IBM 3083... International Business Machines Corporation and reviewed by the validation team. The compiler was tested using all default option settings except for the

  2. Surgical loupe usage among oculoplastic surgeons in North America.

    PubMed

    Wei, Chen; Wu, Albert Y

    2018-04-01

    To study the patterns of usage and perception among U.S. oculoplastic surgeons regarding surgical loupes. An anonymous 20-question survey was emailed out to the American Society of Ophthalmic Plastic and Reconstructive Surgery listserv. Data were compiled in Google Forms. SPSS was used for statistical analyses. This study was approved by the institutional review board. Of the 609 members contacted, 239 (39%) completed the survey; 95% of respondents owned loupes and 78% regularly used them. No association was observed between frequency of loupe usage and sex or years in practice. The most common magnification and brand were 2.5× and Designs for Vision, respectively. The most common problems associated with loupes were limited vision (33%) and lack of comfort (24%), with 11% citing neck and cervical spinal disorders. The most common benefits were magnification (93%) and increased visual accuracy (68%). Of the respondents, 56% believed improved patient care to be a benefit and 76% believed that loupes enhance surgical outcome. With regard to training, 67% supported incorporating loupes into residency, 35% believed in mandating loupe purchase, and 25% wanted residencies to provide loupes at no cost. Respondent support for the use of loupes in practice and training was directly correlated with how frequently they used loupes. The vast majority of respondents owned loupes. Although most loupe owners used loupes regularly, a sizable proportion operated with limited vision and lack of comfort. Overall, just over half of respondents believed that loupes improve patient care and should be integrated into residency. Copyright © 2018 Canadian Ophthalmological Society. Published by Elsevier Inc. All rights reserved.

  3. School-based mass distributions of mebendazole to control soil-transmitted helminthiasis in the Munshiganj and Lakshmipur districts of Bangladesh: an evaluation of the treatment monitoring process and knowledge, attitudes, and practices of the population.

    PubMed

    Hafiz, Israt; Berhan, Meklit; Keller, Angela; Haq, Rouseli; Chesnaye, Nicholas; Koporc, Kim; Rahman, Mujibur; Rahman, Shamsur; Mathieu, Els

    2015-01-01

    Bangladesh's national deworming program targets school-age children (SAC) through bi-annual school-based distributions of mebendazole. Qualitative and quantitative methods were applied to identify challenges related to treatment monitoring within the Munshiganj and Lakshmipur Districts of Bangladesh. Key stakeholder interviews identified several obstacles for successful treatment monitoring within these districts; ambiguity in defining the target population, variances in the methods used for compiling and reporting treatment data, and a general lack of financial and human resources. A treatment coverage cluster survey revealed that bi-annual primary school-based distributions proved to be an effective strategy in reaching school-attending SAC, with rates between 63.0% and 73.3%. However, the WHO target of regular treatment of at least 75% of SAC has yet to be reached. Particularly low coverage was seen amongst non-school attending children (11.4-14.3%), most likely due to the lack of national policy to effectively target this vulnerable group. Survey findings on water and sanitation coverage were impressive with the majority of households and schools having access to latrines (98.6-99.3%) and safe drinking water (98.2-100%). The challenge now for the Bangladesh control program is to achieve the WHO target of regular treatment of at least 75% of SAC at risk, irrespective of school-enrollment status. Copyright © 2013 The Authors. Published by Elsevier B.V. All rights reserved.

  4. Surveillance and management of urologic complications after spinal cord injury.

    PubMed

    Kreydin, Evgeniy; Welk, Blayne; Chung, Doreen; Clemens, Quentin; Yang, Claire; Danforth, Teresa; Gousse, Angelo; Kielb, Stephanie; Kraus, Stephen; Mangera, Altaf; Reid, Sheilagh; Szell, Nicole; Cruz, Francisco; Chartier-Kastler, Emmanuel; Ginsberg, David A

    2018-05-29

    Neurogenic bladder due to spinal cord injury has significant consequences for patients' health and quality of life. Regular surveillance is required to assess the status of the upper and lower urinary tracts and prevent their deterioration. In this review, we examine surveillance techniques in neurogenic bladder, describe common complications of this disease, and address strategies for their management. This work represents the efforts of the SIU-ICUD joint consultation on Urologic Management of Spinal Cord Injury. For this specific topic, a workgroup was formed and a comprehensive literature search of English-language manuscripts regarding neurogenic bladder management was performed using the key words 'neurogenic bladder'. Articles were compiled, and recommendations in the chapter are based on group discussion and follow the Oxford Centre for Evidence-based Medicine system for Levels of Evidence (LOEs) and Grades of Recommendation (GORs). At a minimum, patients should undergo an annual history and physical examination, renal functional testing (e.g., serum creatinine), and upper tract imaging (e.g., renal ultrasonography). The existing evidence does not support the use of other modalities, such as cystoscopy or urodynamics, for routine surveillance. Urologic complications in neurogenic bladder patients are common and often more complex than in the general population. There is a shortage of high-quality evidence to support any particular neurogenic bladder surveillance protocol. However, there is consensus regarding certain aspects of regular genitourinary system evaluation in these patients. Proper surveillance allows the clinician to avoid or address common urological complications, and to guide, alter, or maintain appropriate therapeutic regimens for individual patients.

  5. Tooth cleaning frequency in relation to socio-demographic variables and personal hygiene measures among school children of Udaipur district, India.

    PubMed

    Kumar, S; Panwar, J; Vyas, A; Sharma, J; Goutham, B; Duraiswamy, P; Kulkarni, S

    2011-02-01

    The aim of the study was to determine if frequency of tooth cleaning varies with social group, family size, bedtime and other personal hygiene habits among school children. The target population comprised schoolchildren aged 8-16 years of Udaipur district attending public schools. A two-stage cluster random sampling procedure was executed to collect a representative sample; consequently, the final sample size amounted to 852 children. Data were collected by means of structured questionnaires which consisted of questions related to oral hygiene habits including a few general hygiene habits, bedtime, family size, family income and dental visiting habits. The results show that 30.5% of the total sample cleaned their teeth twice or more daily and there was no significant difference between the genders for tooth cleaning frequency. Logistic regression analysis revealed that older children and those having less than two siblings were more likely to clean their teeth twice a day than the younger ones and children with more than two siblings. Furthermore, frequency of tooth cleaning was significantly lower among children of parents with low level of education and less annual income as compared with those of high education and more annual income. In addition, tooth cleaning habits were more regular in children using toothpaste and regularly visiting the dentist. This study observed that tooth cleaning is not an isolated behaviour, but is a part of a multifarious pattern of various social and behavioural factors. © 2009 The Authors. Journal compilation © 2009 Blackwell Munksgaard.

  6. Revised Thomas-Fermi approximation for singular potentials

    NASA Astrophysics Data System (ADS)

    Dufty, James W.; Trickey, S. B.

    2016-08-01

    Approximations for the many-fermion free-energy density functional that include the Thomas-Fermi (TF) form for the noninteracting part lead to singular densities for singular external potentials (e.g., attractive Coulomb). This limitation of the TF approximation is addressed here by a formal map of the exact Euler equation for the density onto an equivalent TF form characterized by a modified Kohn-Sham potential. It is shown to be a "regularized" version of the Kohn-Sham potential, tempered by convolution with a finite-temperature response function. The resulting density is nonsingular, with the equilibrium properties obtained from the total free-energy functional evaluated at this density. This new representation is formally exact. Approximate expressions for the regularized potential are given to leading order in a nonlocality parameter, and the limiting behavior at high and low temperatures is described. The noninteracting part of the free energy in this approximation is the usual Thomas-Fermi functional. These results generalize and extend to finite temperatures the ground-state regularization by R. G. Parr and S. Ghosh [Proc. Natl. Acad. Sci. U.S.A. 83, 3577 (1986), 10.1073/pnas.83.11.3577] and by L. R. Pratt, G. G. Hoffman, and R. A. Harris [J. Chem. Phys. 88, 1818 (1988), 10.1063/1.454105] and formally systematize the finite-temperature regularization given by the latter authors.
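
    In schematic form (notation assumed here, not quoted from the paper), the construction amounts to tempering the Kohn-Sham potential by convolution with a finite-temperature response kernel and then using the result inside the usual finite-temperature Thomas-Fermi relation for the density:

    ```latex
    % Schematic only: symbols and prefactors are assumptions, not the paper's notation.
    \begin{align}
      v^{\mathrm{reg}}(\mathbf{r}) &= \int \mathrm{d}\mathbf{r}'\,
          \chi_\beta\!\left(|\mathbf{r}-\mathbf{r}'|\right)\, v_{\mathrm{KS}}(\mathbf{r}'), \\
      n(\mathbf{r}) &\propto \int \mathrm{d}\mathbf{p}\,
          \left[ 1 + e^{\beta\left(p^{2}/2m + v^{\mathrm{reg}}(\mathbf{r}) - \mu\right)} \right]^{-1}.
    \end{align}
    ```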

  7. Neurocognitive consequences of cigarette smoking in young adults--a comparison with pre-drug performance.

    PubMed

    Fried, P A; Watkinson, B; Gray, R

    2006-01-01

    The present study examined effects of current and past regular cigarette smoking in young adult subjects. One hundred and twelve 17-21-year-old subjects, assessed since infancy, were evaluated using a battery of neurocognitive tests for which commensurate measures were obtained at 9-12 years of age, prior to the initiation of regular smoking. Smokers, determined by urinalysis and self-report, were categorized as heavy (>9 cigarettes per day) and light (<9 cigarettes per day) current smokers and former smokers, the latter having smoked cigarettes regularly in the past but not for at least 6 months. A third of the subjects were currently smoking cigarettes regularly with half of these being heavy smokers. Among former smokers, the average duration of smoking was slightly less than 2 years. Overall IQ, memory, processing speed, vocabulary, attention and abstract reasoning were the primary outcomes with comparisons being made between each of the three user groups and a control group who never smoked regularly. After accounting for potentially confounding factors including clinical assessment, marihuana use and pre-drug performance in the relevant cognitive domain, current regular smokers did significantly worse than non-smokers in a variety of cognitive areas predicated upon verbal/auditory competence including receptive and expressive vocabulary, oral arithmetic, and auditory memory. This impact of current smoking appears to behave in a dose-response and duration-related fashion. In contrast, former smokers differed from the non-smokers only in the arithmetic task. These results suggest that regular smoking during early adulthood is associated with cognitive impairments in selected domains and that these deficits may be reversed upon cessation. Together, the findings add to the body of evidence to be used in persuading adolescents and young adults against the initiation of smoking and, if currently smoking, the advantages of stopping.

  8. The effect of dose reduction on the detection of anatomical structures on panoramic radiographs.

    PubMed

    Kaeppler, G; Dietz, K; Reinert, S

    2006-07-01

    The aim was to evaluate the effect of dose reduction on diagnostic accuracy using different screen-film combinations and digital techniques for panoramic radiography. Five observers assessed 201 pairs of panoramic radiographs (a total of 402 panoramic radiographs) taken with the Orthophos Plus (Sirona, Bensheim, Germany), for visualization of 11 anatomical structures on each side, using a 3-point scale -1, 0 and 1. Two radiographs of each patient were taken at two different times (conventional setting and setting with decreased dose, done by increasing tube potential settings or halving tube current). To compare the dose at different tube potential settings dose-length product was measured at the secondary collimator. Films with medium and regular intensifying screens (high and low tube potential settings) and storage phosphor plates (low tube potential setting, tube current setting equivalent to regular intensifying screen and halved) were compared. The five observers made 27 610 assessments. Intrarater agreement was expressed by Cohen's kappa coefficient. The results demonstrated an equivalence of regular screens (low tube potential setting) and medium screens (high and low tube potential settings). A significant difference existed between medium screens (low tube potential setting, mean score 0.92) and the group of regular film-screen combinations at high tube potential settings (mean score 0.89) and between all film-screen combinations and the digital system irrespective of exposure (mean score below 0.82). There were no significant differences between medium and regular screens (mean score 0.88 to 0.92) for assessment of the periodontal ligament space, but there was a significant difference compared with the digital system (mean score below 0.76). The kappa coefficient for intrarater agreement was moderate (0.55). New regular intensifying screens can replace medium screens at low tube potential settings. Digital panoramic radiographs should be taken at low tube potential levels with an exposure equivalent at least to a regular intensifying screen.

  9. Attributes for MRB_E2RF1 Catchments by Major River Basins in the Conterminous United States: Base-Flow Index, 2002

    USGS Publications Warehouse

    Wieczorek, Michael; LaMotte, Andrew E.

    2010-01-01

    This tabular data set represents the mean base-flow index expressed as a percent, compiled for every catchment of MRB_E2RF1 catchments of Major River Basins (MRBs, Crawford and others, 2006). Base flow is the component of streamflow that can be attributed to ground-water discharge into streams. The source data set is Base-Flow Index for the Conterminous United States (Wolock, 2003). The MRB_E2RF1 catchments are based on a modified version of the U.S. Environmental Protection Agency's (USEPA) ERF1_2 and include enhancements to support national and regional-scale surface-water quality modeling (Nolan and others, 2002; Brakebill and others, 2011). Data were compiled for every catchment of MRB_E2RF1 catchments for the conterminous United States covering New England and Mid-Atlantic (MRB1), South Atlantic-Gulf and Tennessee (MRB2), the Great Lakes, Ohio, Upper Mississippi, and Souris-Red-Rainy (MRB3), the Missouri (MRB4), the Lower Mississippi, Arkansas-White-Red, and Texas-Gulf (MRB5), the Rio Grande, Colorado, and the Great basin (MRB6), the Pacific Northwest (MRB7) river basins, and California (MRB8).

  10. Vector-matrix-quaternion, array and arithmetic packages: All HAL/S functions implemented in Ada

    NASA Technical Reports Server (NTRS)

    Klumpp, Allan R.; Kwong, David D.

    1986-01-01

    The HAL/S avionics programmers have enjoyed a variety of tools built into a language tailored to their special requirements. Ada is designed for a broader group of applications. Rather than providing built-in tools, Ada provides the elements with which users can build their own. Standard avionic packages remain to be developed. These must enable programmers to code in Ada as they have coded in HAL/S. The packages under development at JPL will provide all of the vector-matrix, array, and arithmetic functions described in the HAL/S manuals. In addition, the linear algebra package will provide all of the quaternion functions used in Shuttle steering and Galileo attitude control. Furthermore, using Ada's extensibility, many quaternion functions are being implemented as infix operations; equivalent capabilities were never implemented in HAL/S because doing so would entail modifying the compiler and expanding the language. With these packages, many HAL/S expressions will compile and execute in Ada, unchanged. Others can be converted simply by replacing the implicit HAL/S multiply operator with the Ada *. Errors will be trapped and identified. Input/output will be convenient and readable.
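
    The packages described above are written in Ada; purely as an illustration of exposing quaternion composition as an infix operator, the sketch below overloads multiplication in Python (the class layout and rotation example are invented, not taken from the JPL packages).

    ```python
    # Quaternion multiplication exposed as an infix operator via operator overloading.
    from dataclasses import dataclass

    @dataclass
    class Quaternion:
        w: float
        x: float
        y: float
        z: float

        def __mul__(self, q):
            # Hamilton product: composition of the two rotations
            return Quaternion(
                self.w*q.w - self.x*q.x - self.y*q.y - self.z*q.z,
                self.w*q.x + self.x*q.w + self.y*q.z - self.z*q.y,
                self.w*q.y - self.x*q.z + self.y*q.w + self.z*q.x,
                self.w*q.z + self.x*q.y - self.y*q.x + self.z*q.w,
            )

    identity = Quaternion(1.0, 0.0, 0.0, 0.0)
    half_turn_z = Quaternion(0.0, 0.0, 0.0, 1.0)     # 180 degree rotation about z
    print(half_turn_z * half_turn_z)                 # w=-1: same rotation as identity (double cover)
    print(identity * half_turn_z)                    # unchanged
    ```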

  11. Combining constraint satisfaction and local improvement algorithms to construct anaesthetists' rotas

    NASA Technical Reports Server (NTRS)

    Smith, Barbara M.; Bennett, Sean

    1992-01-01

    A system is described which was built to compile weekly rotas for the anaesthetists in a large hospital. The rota compilation problem is an optimization problem (the number of tasks which cannot be assigned to an anaesthetist must be minimized) and was formulated as a constraint satisfaction problem (CSP). The forward checking algorithm is used to find a feasible rota, but because of the size of the problem, it cannot find an optimal (or even a good enough) solution in an acceptable time. Instead, an algorithm was devised which makes local improvements to a feasible solution. The algorithm makes use of the constraints as expressed in the CSP to ensure that feasibility is maintained, and produces very good rotas which are being used by the hospital involved in the project. It is argued that formulation as a constraint satisfaction problem may be a good approach to solving discrete optimization problems, even if the resulting CSP is too large to be solved exactly in an acceptable time. A CSP algorithm may be able to produce a feasible solution which can then be improved, giving a good, if not provably optimal, solution.
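
    The hospital system itself is not reproduced here; the toy sketch below only illustrates the two-stage idea (build a feasible assignment first, then accept local swaps that never break a constraint), with tasks, staff and their qualifications invented for the example.

    ```python
    # Toy two-stage rota builder: greedy feasible assignment, then local-improvement swaps
    # that keep every constraint satisfied and never increase the number of unassigned tasks.
    import random

    TASKS = ["theatre-am", "theatre-pm", "icu", "on-call"]          # hypothetical tasks
    STAFF = {"A": {"theatre-am", "icu"}, "B": {"theatre-pm"}, "C": {"on-call", "theatre-am"}}

    def feasible(assignment):
        used = [p for p in assignment.values() if p is not None]
        return len(used) == len(set(used))            # at most one task per anaesthetist

    def cost(assignment):
        return sum(1 for p in assignment.values() if p is None)   # unassigned tasks

    def initial_assignment():
        assignment = {}
        for task in TASKS:                             # greedy, forward-checking flavour
            candidates = [p for p, skills in STAFF.items()
                          if task in skills and p not in assignment.values()]
            assignment[task] = candidates[0] if candidates else None
        return assignment

    def local_improvement(assignment, iterations=200):
        best = dict(assignment)
        for _ in range(iterations):
            t1, t2 = random.sample(TASKS, 2)
            trial = dict(best)
            trial[t1], trial[t2] = trial[t2], trial[t1]            # swap two slots
            ok = all(trial[t] is None or t in STAFF[trial[t]] for t in (t1, t2))
            if ok and feasible(trial) and cost(trial) <= cost(best):
                best = trial
        return best

    rota = local_improvement(initial_assignment())
    print(rota, "unassigned:", cost(rota))
    ```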

  12. 36 CFR 705.6 - Compilation.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ....6 Compilation. (a) Library of Congress staff acting under the general authority of the Librarian of... segment. (c) No compilation by the Librarian shall be deemed for any purpose or proceeding to be an...

  13. Methodological Quality of Meta-Analyses: Matched-Pairs Comparison over Time and between Industry-Sponsored and Academic-Sponsored Reports

    ERIC Educational Resources Information Center

    Lane, Peter W.; Higgins, Julian P. T.; Anagnostelis, Betsy; Anzures-Cabrera, Judith; Baker, Nigel F.; Cappelleri, Joseph C.; Haughie, Scott; Hollis, Sally; Lewis, Steff C.; Moneuse, Patrick; Whitehead, Anne

    2013-01-01

    Context: Meta-analyses are regularly used to inform healthcare decisions. Concerns have been expressed about the quality of meta-analyses and, in particular, about those supported by the pharmaceutical industry. Objective: The objective of this study is to compare the quality of pharmaceutical-industry-supported meta-analyses with academic…

  14. Investigating Relationship between Discourse Behavioral Patterns and Academic Achievements of Students in SPOC Discussion Forum

    ERIC Educational Resources Information Center

    Liu, Zhi; Zhang, Wenjing; Cheng, Hercy N. H.; Sun, Jianwen; Liu, Sannyuya

    2018-01-01

    As an overt expression of internal mental processes, discourses have become one main data source for the research of interactive learning. To deeply explore behavioral regularities among interactions, this article firstly adopts the content analysis method to summarize students' engagement patterns within a course forum in a small private online…

  15. Using Regular, Lowstakes Tests to Secure Pupils' Contextual Knowledge in Year 10

    ERIC Educational Resources Information Center

    Donaghy, Lee

    2014-01-01

    Lee Donaghy was concerned that his GCSE students' weak contextual knowledge was letting them down. Inspired by a mixture of cognitive science and the arguments of other teachers expressed in various blogs, he decided to tackle the problem by teaching and testing knowledge more intensively. The result was a rapid improvement in secure factual…

  16. Ausdruckskraft und Regelmaessigkeit: Was Esperanto fuer automatische Uebersetzung geeignet macht (Expressiveness and Formal Regularity: What Makes Esperanto Suitable for Machine Translation).

    ERIC Educational Resources Information Center

    Schubert, Klaus

    1988-01-01

    Describes DLT, the multilingual machine translation system that uses Esperanto as an intermediate language in which substantial portions of the translation subprocesses are carried out. The criteria for choosing an intermediate language and the reasons for preferring Esperanto over other languages are explained. (Author/DJD)

  17. Adding Statistical Machine Translation Adaptation to Computer-Assisted Translation

    DTIC Science & Technology

    2013-09-01

    are automatically searched and used to suggest possible translations; (2) spell-checkers; (3) glossaries; (4) dictionaries; (5) alignment and...matching against TMs to propose translations; spell-checking, glossary, and dictionary look-up; support for multiple file formats; regular expressions...on Telecommunications. Tehran, 2012, 822-826. Bertoldi, N.; Federico, M. Domain Adaptation for Statistical Machine Translation with Monolingual

  18. Expression of the core antigen gene of hepatitis B virus (HBV) in Acetobacter methanolicus using broad-host-range vectors.

    PubMed

    Schröder, R; Maassen, A; Lippoldt, A; Börner, T; von Baehr, R; Dobrowolski, P

    1991-08-01

    Using the broad-host-range promoter probe vector pRS201 for cloning of phage Acm1 promoters, we established a convenient vector system for expression of heterologous genes in different Gram-negative bacteria. The usefulness of this system was demonstrated by expression of the HBV core gene in Acetobacter methanolicus. Plasmids carrying the HBV core gene downstream of different Acm1-phage promoters were transferred to A. methanolicus, a new potential host for recombinant DNA expression. Using enzyme immunoassay and immunoblot techniques, the amount and composition of core antigen produced in A. methanolicus were compared with that derived from Escherichia coli. The expression of immunoreactive core antigen in A. methanolicus exceeds by sevenfold that in E. coli using an expression system with tandemly arranged promoters. Morphological observations by electron microscopy show that the HBV core gene products isolated from both hosts are assembled into regular spherical particles with a diameter of about 28 nm that are comparable to original viral nucleocapsids.

  19. Assembling the Streptococcus thermophilus clustered regularly interspaced short palindromic repeats (CRISPR) array for multiplex DNA targeting.

    PubMed

    Guo, Lijun; Xu, Kun; Liu, Zhiyuan; Zhang, Cunfang; Xin, Ying; Zhang, Zhiying

    2015-06-01

    In addition to the advantages of being scalable, affordable, and easy to engineer, the clustered regularly interspaced short palindromic repeats (CRISPR)/CRISPR-associated protein (Cas) technology is superior for multiplex targeting, which is laborious and inconvenient when achieved by cloning multiple gRNA-expressing cassettes. Here, we report a simple CRISPR array assembling method which will facilitate multiplex targeting usage. First, the Streptococcus thermophilus CRISPR3/Cas locus was cloned. Second, different CRISPR arrays were assembled with different crRNA spacers. Transformation assays using different Escherichia coli strains demonstrated efficient plasmid DNA targeting, and we achieved targeting efficiency up to 95% with an assembled CRISPR array with three crRNA spacers. Copyright © 2015 Elsevier Inc. All rights reserved.
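
    The record does not give the assembly protocol in detail; the sketch below is only a schematic of the string layout of a repeat-spacer array, with placeholder sequences that are not the S. thermophilus CRISPR3 repeat or the study's spacers.

    ```python
    # Schematic assembly of a repeat-spacer CRISPR array as a string
    # (R-S1-R-S2-...-R); all sequences below are placeholders.
    REPEAT = "GATTTAGAAACCCTTGTACGGTTCAAAC"            # placeholder direct repeat
    spacers = ["ATCGGATCCTAAGGCGTTAAGGCTTACGATCA",      # placeholder crRNA spacers
               "TTGACCTTAGGCATCGATCCGGAATTCCTAGA",
               "CCATGGTACGATCGTTAACGGATCCAATTGCA"]

    def assemble_array(repeat, spacer_list):
        """Interleave the direct repeat around every spacer."""
        parts = [repeat]
        for spacer in spacer_list:
            parts.extend([spacer, repeat])
        return "".join(parts)

    array = assemble_array(REPEAT, spacers)
    print(len(array), array[:60] + "...")
    ```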

  20. Representation of viruses in the remediated PDB archive

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lawson, Catherine L., E-mail: cathy.lawson@rutgers.edu; Dutta, Shuchismita; Westbrook, John D.

    2008-08-01

    A new data model for PDB entries of viruses and other biological assemblies with regular noncrystallographic symmetry is described. A new scheme has been devised to represent viruses and other biological assemblies with regular noncrystallographic symmetry in the Protein Data Bank (PDB). The scheme describes existing and anticipated PDB entries of this type using generalized descriptions of deposited and experimental coordinate frames, symmetry and frame transformations. A simplified notation has been adopted to express the symmetry generation of assemblies from deposited coordinates and matrix operations describing the required point, helical or crystallographic symmetry. Complete correct information for building full assemblies, subassemblies and crystal asymmetric units of all virus entries is now available in the remediated PDB archive.
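
    As a hedged numpy sketch of the symmetry-generation step the scheme encodes, the code below applies each stored operator to deposited coordinates to build an assembly; the operators here are a toy three-fold rotation about z, not the icosahedral or helical operators of a real PDB entry.

    ```python
    # Expand deposited coordinates into an assembly by applying a set of symmetry operators.
    import numpy as np

    def rotation_z(angle_rad):
        c, s = np.cos(angle_rad), np.sin(angle_rad)
        return np.array([[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]])

    operators = [rotation_z(2.0 * np.pi * k / 3.0) for k in range(3)]   # toy C3 point group

    deposited = np.array([[10.0, 0.0, 5.0],    # toy coordinates of one subunit
                          [12.0, 1.5, 5.0]])

    assembly = np.vstack([deposited @ R.T for R in operators])  # one copy per operator
    print(assembly.round(2))
    ```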

  1. A Note on Compiling Fortran

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Busby, L. E.

    Fortran modules tend to serialize compilation of large Fortran projects, by introducing dependencies among the source files. If file A depends on file B (A uses a module defined by B), you must finish compiling B before you can begin compiling A. Some Fortran compilers (Intel ifort, GNU gfortran and IBM xlf, at least) offer an option to "verify syntax", with the side effect of also producing any associated Fortran module files. As it happens, this option usually runs much faster than the object code generation and optimization phases. For some projects on some machines, it can be advantageous to compile in two passes: the first pass generates the module files, quickly; the second pass produces the object files, in parallel. We achieve a 3.8x speedup in the case study below.
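
    A minimal driver for the two-pass scheme, assuming GNU gfortran and a flat list of hypothetical source files; pass one relies on the module-file side effect of the syntax-only option described in the note, and pass two compiles the object files in parallel.

    ```python
    # Two-pass build sketch: serial syntax-only pass to emit .mod files (per the note
    # above), then parallel object-code compilation. File names are hypothetical.
    import subprocess
    from concurrent.futures import ThreadPoolExecutor

    sources = ["modules_a.f90", "modules_b.f90", "main.f90"]   # hypothetical project files

    # Pass 1: fast, serial; produces the Fortran module files as a side effect.
    for src in sources:
        subprocess.run(["gfortran", "-fsyntax-only", src], check=True)

    # Pass 2: object-code generation no longer depends on compile order, so parallelize.
    def compile_object(src):
        subprocess.run(["gfortran", "-c", "-O2", src], check=True)

    with ThreadPoolExecutor() as pool:
        list(pool.map(compile_object, sources))
    ```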

  2. Does the terrestrial phenology concept apply in water?

    NASA Astrophysics Data System (ADS)

    Winder, M.; Cloern, J. E.

    2009-12-01

    Terrestrial plants have a life history that has evolved to a circannual rhythm in concert with the seasonal climate system, and overall biomass follows a regular cycle of growth and senescence with a period of one year. Consistency in phase and amplitude render terrestrial plant activity an effective tool to observe shifts in the seasonal life cycle in response to climate change. The other half of Earth's primary production occurs in aquatic systems, dominated by unicellular algae that have the capacity to divide daily under optimal conditions, so population changes can, in principle, occur at any time within a year. Given that the periods of life cycles differ on land compared to aquatic systems, it can be expected that patterns of seasonal variability might differ between terrestrial and pelagic plants. We compiled 121 phytoplankton biomass time series with a median length of 16 years from estuarine-coastal and lake ecosystems in the temperate and subtropical zones and address three questions: Do aquatic pelagic plants follow the canonical seasonal pattern of terrestrial plants? What are the dominant periodicities of aquatic primary producers? How recurrent are cyclical patterns from year to year? We applied wavelet analysis to extract the phase and amplitude of these long-term phytoplankton time series. The data revealed that in about 45% of the aquatic sites an annual cycle of 12-month periodicity was most strongly expressed, corresponding to one peak per year. In about 20% the 6-month periodicity dominated, characteristic of two peaks within a year; about 35% showed a pattern best attributed to periodicity in the 2-5 month band; and for 2% no consistent periodicity emerged. The recurrence of the seasonal fluctuations varied greatly from year to year, however, ranging from predictable patterns at some sites to irregular patterns at others. These findings suggest that the seasonal activity of chlorophyll a can be unpredictable and variable. We propose drivers that give rise to the broad pattern of seasonal phytoplankton fluctuations and discuss advantages and limitations of using phytoplankton phenology as an indicator of climate change.
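
    The study used wavelet analysis; as a much simpler stand-in, the sketch below reads a dominant periodicity off a Fourier periodogram of a synthetic monthly chlorophyll-a series (all numbers are invented for the example).

    ```python
    # Identify the dominant periodicity (e.g. 12- versus 6-month) in a synthetic
    # monthly chlorophyll series via a simple Fourier periodogram.
    import numpy as np

    rng = np.random.default_rng(2)
    months = np.arange(16 * 12)                                   # 16-year monthly record
    chl = (1.0 * np.sin(2 * np.pi * months / 12)                  # annual bloom cycle
           + 0.4 * np.sin(2 * np.pi * months / 6)                 # weaker semi-annual peak
           + 0.3 * rng.standard_normal(months.size))

    power = np.abs(np.fft.rfft(chl - chl.mean())) ** 2
    freqs = np.fft.rfftfreq(months.size, d=1.0)                   # cycles per month
    dominant_period = 1.0 / freqs[np.argmax(power[1:]) + 1]       # skip the zero frequency
    print(f"dominant period = {dominant_period:.1f} months")      # about 12 for this series
    ```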

  3. Education and Outreach at the Earthscope National Office: 2012 Update on Activities and Broader Impacts

    NASA Astrophysics Data System (ADS)

    Semken, S. C.; Arrowsmith, R.; Fouch, M. J.; Garnero, E. J.; Taylor, W. L.; Bohon, W.; Pacheco, H. A.; Schwab, P.; Baumback, D.; Pettis, L.; Colunga, J.; Robinson, S.; Dick, C.

    2012-12-01

    The EarthScope Program (www.earthscope.org) funded by the National Science Foundation fosters interdisciplinary exploration of the geologic structure and evolution of the North American continent by means of seismology, geodesy, magnetotellurics, in-situ fault-zone sampling, geochronology, and high-resolution topographic measurements. EarthScope scientific data and findings are transforming the study of Earth structure and processes throughout the planet. These data enhance the understanding and mitigation of hazards and inform environmental and economic applications of geoscience. The EarthScope Program also offers significant resources and opportunities for education and outreach (E&O) in the Earth system sciences. The EarthScope National Office (ESNO) at Arizona State University serves all EarthScope stakeholders, including researchers, educators, students, and the general public. ESNO continues to actively support and promote E&O with programmatic activities such as a regularly updated presence on the web and social media, newsletters, biannual national conferences, workshops for E&O providers and informal educators (interpreters), collaborative interaction with other Earth science organizations, continuing education for researchers, promotion of place-based education, and support for regional K-12 teacher professional-development programs led by EarthScope stakeholders. EarthScope E&O, coordinated by ESNO, leads the compilation and dissemination of the data, findings, and legacy of the epic EarthScope Program. In this presentation we offer updated reports and outcomes from ESNO E&O activities, including web and social-media upgrades, the Earth Science E&O Provider Summit for partnering organizations, the Central Appalachian Interpretive Workshop for informal Earth science educators, the U.S. Science and Engineering Fair, and collaborative efforts with partner organizations. The EarthScope National Office is supported by the National Science Foundation under grants EAR-1101100 and EAR-1216301. Any opinions, findings, and conclusions or recommendations expressed in this material are those of the authors and do not necessarily reflect the views of the National Science Foundation.

  4. Construction of a cDNA microarray derived from the ascidian Ciona intestinalis.

    PubMed

    Azumi, Kaoru; Takahashi, Hiroki; Miki, Yasufumi; Fujie, Manabu; Usami, Takeshi; Ishikawa, Hisayoshi; Kitayama, Atsusi; Satou, Yutaka; Ueno, Naoto; Satoh, Nori

    2003-10-01

    A cDNA microarray was constructed from a basal chordate, the ascidian Ciona intestinalis. The draft genome of Ciona has been read and inferred to contain approximately 16,000 protein-coding genes, and cDNAs for transcripts of 13,464 genes have been characterized and compiled as the "Ciona intestinalis Gene Collection Release I". In the present study, we constructed a cDNA microarray of these 13,464 Ciona genes. A preliminary experiment with Cy3- and Cy5-labeled probes showed extensive differential gene expression between fertilized eggs and larvae. In addition, there was a good correlation between results obtained by the present microarray analysis and those from previous EST analyses. This first microarray of a large collection of Ciona intestinalis cDNA clones should facilitate the analysis of global gene expression and gene networks during the embryogenesis of basal chordates.

  5. Tolerant (parallel) Programming

    NASA Technical Reports Server (NTRS)

    DiNucci, David C.; Bailey, David H. (Technical Monitor)

    1997-01-01

    In order to be truly portable, a program must be tolerant of a wide range of development and execution environments, and a parallel program is just one which must be tolerant of a very wide range. This paper first defines the term "tolerant programming", then describes many layers of tools to accomplish it. The primary focus is on F-Nets, a formal model for expressing computation as a folded partial-ordering of operations, thereby providing an architecture-independent expression of tolerant parallel algorithms. For implementing F-Nets, Cooperative Data Sharing (CDS) is a subroutine package for implementing communication efficiently in a large number of environments (e.g. shared memory and message passing). Software Cabling (SC), a very-high-level graphical programming language for building large F-Nets, possesses many of the features normally expected from today's computer languages (e.g. data abstraction, array operations). Finally, L2³ is a CASE tool which facilitates the construction, compilation, execution, and debugging of SC programs.

  6. A special purpose silicon compiler for designing supercomputing VLSI systems

    NASA Technical Reports Server (NTRS)

    Venkateswaran, N.; Murugavel, P.; Kamakoti, V.; Shankarraman, M. J.; Rangarajan, S.; Mallikarjun, M.; Karthikeyan, B.; Prabhakar, T. S.; Satish, V.; Venkatasubramaniam, P. R.

    1991-01-01

    Design of general/special purpose supercomputing VLSI systems for numeric algorithm execution involves tackling two important aspects, namely their computational and communication complexities. Development of software tools for designing such systems itself becomes complex. Hence a novel design methodology has to be developed. For designing such complex systems a special purpose silicon compiler is needed in which: the computational and communicational structures of different numeric algorithms should be taken into account to simplify the silicon compiler design, the approach is macrocell based, and the software tools at different levels (algorithm down to the VLSI circuit layout) should get integrated. In this paper a special purpose silicon (SPS) compiler based on PACUBE macrocell VLSI arrays for designing supercomputing VLSI systems is presented. It is shown that turn-around time and silicon real estate get reduced over the silicon compilers based on PLA's, SLA's, and gate arrays. The first two silicon compiler characteristics mentioned above enable the SPS compiler to perform systolic mapping (at the macrocell level) of algorithms whose computational structures are of GIPOP (generalized inner product outer product) form. Direct systolic mapping on PLA's, SLA's, and gate arrays is very difficult as they are micro-cell based. A novel GIPOP processor is under development using this special purpose silicon compiler.

  7. Fundamental structures of dynamic social networks.

    PubMed

    Sekara, Vedran; Stopczynski, Arkadiusz; Lehmann, Sune

    2016-09-06

    Social systems are in a constant state of flux, with dynamics spanning from minute-by-minute changes to patterns present on the timescale of years. Accurate models of social dynamics are important for understanding the spreading of influence or diseases, formation of friendships, and the productivity of teams. Although there has been much progress on understanding complex networks over the past decade, little is known about the regularities governing the microdynamics of social networks. Here, we explore the dynamic social network of a densely-connected population of ∼1,000 individuals and their interactions in the network of real-world person-to-person proximity measured via Bluetooth, as well as their telecommunication networks, online social media contacts, geolocation, and demographic data. These high-resolution data allow us to observe social groups directly, rendering community detection unnecessary. Starting from 5-min time slices, we uncover dynamic social structures expressed on multiple timescales. On the hourly timescale, we find that gatherings are fluid, with members coming and going, but organized via a stable core of individuals. Each core represents a social context. Cores exhibit a pattern of recurring meetings across weeks and months, each with varying degrees of regularity. Taken together, these findings provide a powerful simplification of the social network, where cores represent fundamental structures expressed with strong temporal and spatial regularity. Using this framework, we explore the complex interplay between social and geospatial behavior, documenting how the formation of cores is preceded by coordination behavior in the communication networks and demonstrating that social behavior can be predicted with high precision.

  8. Fundamental structures of dynamic social networks

    PubMed Central

    Sekara, Vedran; Stopczynski, Arkadiusz; Lehmann, Sune

    2016-01-01

    Social systems are in a constant state of flux, with dynamics spanning from minute-by-minute changes to patterns present on the timescale of years. Accurate models of social dynamics are important for understanding the spreading of influence or diseases, formation of friendships, and the productivity of teams. Although there has been much progress on understanding complex networks over the past decade, little is known about the regularities governing the microdynamics of social networks. Here, we explore the dynamic social network of a densely-connected population of ∼1,000 individuals and their interactions in the network of real-world person-to-person proximity measured via Bluetooth, as well as their telecommunication networks, online social media contacts, geolocation, and demographic data. These high-resolution data allow us to observe social groups directly, rendering community detection unnecessary. Starting from 5-min time slices, we uncover dynamic social structures expressed on multiple timescales. On the hourly timescale, we find that gatherings are fluid, with members coming and going, but organized via a stable core of individuals. Each core represents a social context. Cores exhibit a pattern of recurring meetings across weeks and months, each with varying degrees of regularity. Taken together, these findings provide a powerful simplification of the social network, where cores represent fundamental structures expressed with strong temporal and spatial regularity. Using this framework, we explore the complex interplay between social and geospatial behavior, documenting how the formation of cores is preceded by coordination behavior in the communication networks and demonstrating that social behavior can be predicted with high precision. PMID:27555584

  9. Impact of Western and Mediterranean Diets and Vitamin D on Muscle Fibers of Sedentary Rats

    PubMed Central

    Purrello, Francesco

    2018-01-01

    Background: The metabolic syndrome is associated with sarcopenia. Decreased serum levels of Vitamin D (VitD) and insulin-like growth factor (IGF)-1 and their mutual relationship were also reported. We aimed to evaluate whether different dietary profiles, with or without VitD, may exert different effects on muscle molecular morphology. Methods: Twenty-eight male rats were fed for 10 weeks in order to detect early defects induced by different dietary regimens: regular diet (R); regular diet with vitamin D supplementation (R-DS) and regular diet with vitamin D restriction (R-DR); high-fat butter-based diets (HFB-DS and HFB-DR) with 41% energy from fat; high-fat extra-virgin olive oil-based diets (HFEVO-DS and HFEVO-DR) with 41% energy from fat. IL-1β, insulin-like growth factor (IGF)1, Dickkopf-1 (DKK-1), and VitD-receptor (VDR) expressions were evaluated by immunohistochemistry. Muscle fiber perimeter was measured by histology and morphometric analysis. Results: The muscle fibers of the HFEVO-DS rats were hypertrophic, comparable to those of the R-DS rats. An inverse correlation existed between the dietary fat content and the perimeter of the muscle fibers (p < 0.01). In the HFB-DR rats, the muscle fibers appeared hypotrophic with an increase of IL-1β and a dramatic decrease of IGF-1 expression. Conclusions: A high-fat Western diet could impair muscle metabolism and lay the groundwork for subsequent muscle damage. VitD associated with a Mediterranean diet showed trophic action on the muscle fibers. PMID:29462978

  10. Ada technology support for NASA-GSFC

    NASA Technical Reports Server (NTRS)

    1986-01-01

    Utilization of the Ada programming language and environments to perform directorate functions was reviewed. The Mission and Data Operations Directorate Network (MNET) conversion effort was chosen as the first task for evaluation and assistance. The MNET project required the rewriting of the existing Network Control Program (NCP) in the Ada programming language. The DEC Ada compiler running on the VAX under VMS was used for the initial development efforts. Stress tests on the newly delivered version of the DEC Ada compiler were performed. The new Alsys Ada compiler was purchased for the IBM PC AT. A prevalidated version of the compiler was obtained. The compiler was then validated.

  11. Ada Compiler Validation Summary Report. Certificate Number: 890118W1. 10017 Harris Corporation, Computer Systems Division Harris Ada, Version 5.0 Harris HCX-9 Host and Harris NH-3800 Target

    DTIC Science & Technology

    1989-01-17

    UNCLASSIFIED. Ada Compiler Validation Summary Report. Compiler Name: Harris Ada, Version 5.0. Certificate Number: 890118W1.10017. ... United States Department of Defense, Washington DC 20301-3081. ... Validation period: 17 Jan 1989 to 17 Jan 1990. Harris Corporation, Computer Systems Division, Harris Ada.

  12. Zebrafish Dmrta2 regulates neurogenesis in the telencephalon.

    PubMed

    Yoshizawa, Akio; Nakahara, Yoshinari; Izawa, Toshiaki; Ishitani, Tohru; Tsutsumi, Makiko; Kuroiwa, Atsushi; Itoh, Motoyuki; Kikuchi, Yutaka

    2011-11-01

    Although recent findings showed that some Drosophila doublesex and Caenorhabditis elegans mab-3 related genes are expressed in neural tissues during development, their functions have not been fully elucidated. Here, we isolated a zebrafish mutant, ha2, that shows defects in telencephalic neurogenesis and found that ha2 encodes Doublesex and MAB-3 related transcription factor like family A2 (Dmrta2). dmrta2 expression is restricted to the telencephalon, diencephalon and olfactory placode during somitogenesis. We found that the expression of the proneural gene, neurogenin1, in the posterior and dorsal region of telencephalon (posterior-dorsal telencephalon) is markedly reduced in this mutant at the 14-somite stage without any defects in cell proliferation or cell death. In contrast, the telencephalic expression of her6, a Hes-related gene that is known to encode a negative regulator of neurogenin1, expands dramatically in the ha2 mutant. Based on over-expression experiments and epistatic analyses, we propose that zebrafish Dmrta2 controls neurogenin1 expression by repressing her6 in the posterior-dorsal telencephalon. Furthermore, the expression domains of the telencephalic marker genes, foxg1 and emx3, and the neuronal differentiation gene, neurod, are downregulated in the ha2 posterior-dorsal telencephalon during somitogenesis. These results suggest that Dmrta2 plays important roles in the specification of the posterior-dorsal telencephalic cell fate during somitogenesis. © 2011 The Authors. Journal compilation © 2011 by the Molecular Biology Society of Japan/Blackwell Publishing Ltd.

  13. CSAX: Characterizing Systematic Anomalies in eXpression Data.

    PubMed

    Noto, Keith; Majidi, Saeed; Edlow, Andrea G; Wick, Heather C; Bianchi, Diana W; Slonim, Donna K

    2015-05-01

    Methods for translating gene expression signatures into clinically relevant information have typically relied upon having many samples from patients with similar molecular phenotypes. Here, we address the question of what can be done when it is relatively easy to obtain healthy patient samples, but when abnormalities corresponding to disease states may be rare and one-of-a-kind. The associated computational challenge, anomaly detection, is a well-studied machine-learning problem. However, due to the dimensionality and variability of expression data, existing methods based on feature space analysis or individual anomalously expressed genes are insufficient. We present a novel approach, CSAX, that identifies pathways in an individual sample in which the normal expression relationships are disrupted. To evaluate our approach, we have compiled and released a compendium of public expression data sets, reformulated to create a test bed for anomaly detection. We demonstrate the accuracy of CSAX on the data sets in our compendium, compare it to other leading methods, and show that CSAX aids in both identifying anomalies and explaining their underlying biology. We describe an approach to characterizing the difficulty of specific expression anomaly detection tasks. We then illustrate CSAX's value in two developmental case studies. Confirming prior hypotheses, CSAX highlights disruption of platelet activation pathways in a neonate with retinopathy of prematurity and identifies, for the first time, dysregulated oxidative stress response in second trimester amniotic fluid of fetuses with obese mothers. Our approach provides an important step toward identification of individual disease patterns in the era of precision medicine.
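
    A conceptual sketch only, not CSAX's actual scoring: genes in a test sample are ranked by how far they fall outside a healthy reference distribution, and each pathway is scored by how concentrated its member genes are near the top of that ranking. The data and pathway memberships below are simulated.

    ```python
    import numpy as np

    def gene_anomaly_scores(healthy, sample):
        """healthy: (genes x samples) reference matrix; sample: (genes,) test vector."""
        mu = healthy.mean(axis=1)
        sd = healthy.std(axis=1) + 1e-9
        return np.abs(sample - mu) / sd              # simple z-score-style deviation

    def pathway_score(scores, members):
        """Mean anomaly-rank percentile of the pathway's member genes (0 = least anomalous)."""
        ranks = scores.argsort().argsort() / (len(scores) - 1)
        return float(ranks[list(members)].mean())

    rng = np.random.default_rng(0)
    healthy = rng.normal(size=(100, 30))             # 100 genes, 30 healthy samples
    sample = rng.normal(size=100)
    sample[[5, 6, 7, 8]] += 4.0                      # a disrupted 4-gene pathway
    scores = gene_anomaly_scores(healthy, sample)
    print(pathway_score(scores, [5, 6, 7, 8]),       # near 1.0: flagged as disrupted
          pathway_score(scores, [40, 41, 42, 43]))   # near 0.5: unremarkable pathway
    ```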

  14. Establishing Malware Attribution and Binary Provenance Using Multicompilation Techniques

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ramshaw, M. J.

    2017-07-28

    Malware is a serious problem for computer systems and costs businesses and customers billions of dollars a year in addition to compromising their private information. Detecting malware is particularly difficult because malware source code can be compiled in many different ways and generate many different digital signatures, which causes problems for most anti-malware programs that rely on static signature detection. Our project uses a convolutional neural network to identify malware programs but these require large amounts of data to be effective. Towards that end, we gather thousands of source code files from publicly available programming contest sites and compile them with several different compilers and flags. Building upon current research, we then transform these binary files into image representations and use them to train a long-term recurrent convolutional neural network that will eventually be used to identify how a malware binary was compiled. This information will include the compiler, version of the compiler and the options used in compilation, information which can be critical in determining where a malware program came from and even who authored it.
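
    A simple sketch of the binary-as-image preprocessing step described above (the row width and zero padding are arbitrary assumptions): the bytes of a compiled binary are reshaped into a 2-D grayscale array that a convolutional network could consume.

    ```python
    import numpy as np

    def binary_to_image(path, width=256):
        """Read a compiled binary and return its bytes as a (rows x width) uint8 array."""
        data = np.frombuffer(open(path, "rb").read(), dtype=np.uint8)
        pad = (-len(data)) % width                       # zero-pad to a full last row
        data = np.concatenate([data, np.zeros(pad, dtype=np.uint8)])
        return data.reshape(-1, width)

    img = binary_to_image("/bin/ls")                     # any compiled binary works here
    print(img.shape, img.dtype)
    ```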

  15. Circular RNA: New Regulatory Molecules.

    PubMed

    Belousova, E A; Filipenko, M L; Kushlinskii, N E

    2018-04-01

    Circular RNA are a family of covalently closed circular RNA molecules, formed from pre-mRNA of coding genes by means of splicing (canonical and alternative noncanonical splicing). Maturation of circular RNA is regulated by cis- and trans-elements. Complete list of biological functions of these RNA is not yet compiled; however, their capacity to interact with specific microRNA and play a role of a depot attracts the greatest interest. This property makes circular RNA active regulatory transcription factors. Circular RNA have many advantages over their linear analogs: synthesis of these molecules is conservative, they are universal, characterized by clearly determined specificity, and are resistant to exonucleases. In addition, the level of their expression is often higher than that of their linear forms. It should be noted that expression of circular RNA is tissue-specific. Moreover, some correlations between changes in the repertoire and intensity of expression of circular RNA and the development of some pathologies have been detected. Circular RNA have certain advantages and can serve as new biomarkers for the diagnosis, prognosis, and evaluation of response to therapy.

  16. Ground Operations Aerospace Language (GOAL). Volume 2: Compiler

    NASA Technical Reports Server (NTRS)

    1973-01-01

    The principal elements and functions of the Ground Operations Aerospace Language (GOAL) compiler are presented. The technique used to transcribe the syntax diagrams into machine processable format for use by the parsing routines is described. An explanation of the parsing technique used to process GOAL source statements is included. The compiler diagnostics and the output reports generated during a GOAL compilation are explained. A description of the GOAL program package is provided.

  17. HAL/S-360 compiler test activity report

    NASA Technical Reports Server (NTRS)

    Helmers, C. T.

    1974-01-01

    The levels of testing employed in verifying the HAL/S-360 compiler were as follows: (1) typical applications program case testing; (2) functional testing of the compiler system and its generated code; and (3) machine oriented testing of compiler implementation on operational computers. Details of the initial test plan and subsequent adaptation are reported, along with complete test results for each phase which examined the production of object codes for every possible source statement.

  18. Federal COBOL Compiler Testing Service Compiler Validation Request Information.

    DTIC Science & Technology

    1977-05-09

    background of the Federal COBOL Compiler Testing Service which was set up by a memorandum of agreement between the National Bureau of Standards and the...Federal Standard, and the requirement of COBOL compiler validation in the procurement process. It also contains a list of all software products...produced by the software Development Division in support of the FCCTS as well as the Validation Summary Reports produced as a result of discharging the

  19. Obtaining correct compile results by absorbing mismatches between data types representations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Horie, Michihiro; Horii, Hiroshi H.; Kawachiya, Kiyokuni

    Methods and a system are provided. A method includes implementing a function, which a compiler for a first language does not have, using a compiler for a second language. The implementing step includes generating, by the compiler for the first language, a first abstract syntax tree. The implementing step further includes converting, by a converter, the first abstract syntax tree to a second abstract syntax tree of the compiler for the second language using a conversion table from data representation types in the first language to data representation types in the second language. When a compilation error occurs, the implementing step also includes generating a special node for error processing in the second abstract syntax tree and storing an error token in the special node. When unparsing, the implementing step additionally includes outputting the error token, in the form of source code written in the first language.

  20. Obtaining correct compile results by absorbing mismatches between data types representations

    DOEpatents

    Horie, Michihiro; Horii, Hiroshi H.; Kawachiya, Kiyokuni; Takeuchi, Mikio

    2017-03-21

    Methods and a system are provided. A method includes implementing a function, which a compiler for a first language does not have, using a compiler for a second language. The implementing step includes generating, by the compiler for the first language, a first abstract syntax tree. The implementing step further includes converting, by a converter, the first abstract syntax tree to a second abstract syntax tree of the compiler for the second language using a conversion table from data representation types in the first language to data representation types in the second language. When a compilation error occurs, the implementing step also includes generating a special node for error processing in the second abstract syntax tree and storing an error token in the special node. When unparsing, the implementing step additionally includes outputting the error token, in the form of source code written in the first language.

  1. Obtaining correct compile results by absorbing mismatches between data types representations

    DOEpatents

    Horie, Michihiro; Horii, Hiroshi H.; Kawachiya, Kiyokuni; Takeuchi, Mikio

    2017-11-21

    Methods and a system are provided. A method includes implementing a function, which a compiler for a first language does not have, using a compiler for a second language. The implementing step includes generating, by the compiler for the first language, a first abstract syntax tree. The implementing step further includes converting, by a converter, the first abstract syntax tree to a second abstract syntax tree of the compiler for the second language using a conversion table from data representation types in the first language to data representation types in the second language. When a compilation error occurs, the implementing step also includes generating a special node for error processing in the second abstract syntax tree and storing an error token in the special node. When unparsing, the implementing step additionally includes outputting the error token, in the form of source code written in the first language.
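
    As a rough illustration of the mechanism described in this record, the sketch below converts a toy abstract syntax tree between two hypothetical languages via a data-type conversion table and stores an error token in a special node when a type cannot be converted. All names, node kinds, and the type table are invented for the example; this is not the patented implementation.

    ```python
    from dataclasses import dataclass, field
    from typing import List, Optional

    # Assumed mapping from source-language types to target-language types.
    TYPE_TABLE = {"int32": "Integer", "float64": "Double", "string": "String"}

    @dataclass
    class Node:
        kind: str                      # e.g. "block", "var_decl", "error"
        dtype: Optional[str] = None
        token: Optional[str] = None
        children: List["Node"] = field(default_factory=list)

    def convert(node: Node) -> Node:
        """Map a first-language AST node to a second-language AST node."""
        if node.kind == "var_decl":
            target_type = TYPE_TABLE.get(node.dtype)
            if target_type is None:
                # "compilation error": keep the offending token in a special error node
                return Node(kind="error", token=node.token)
            return Node(kind="var_decl", dtype=target_type, token=node.token)
        return Node(kind=node.kind, token=node.token,
                    children=[convert(c) for c in node.children])

    def unparse(node: Node) -> str:
        """Emit target-language source; error nodes echo the stored token back."""
        if node.kind == "error":
            return node.token or ""
        if node.kind == "var_decl":
            return f"{node.dtype} {node.token};"
        return "\n".join(unparse(c) for c in node.children)

    tree = Node("block", children=[Node("var_decl", "int32", "x"),
                                   Node("var_decl", "decimal128", "y")])
    print(unparse(convert(tree)))      # "Integer x;" plus the unconverted token "y"
    ```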

  2. Performing aggressive code optimization with an ability to rollback changes made by the aggressive optimizations

    DOEpatents

    Gschwind, Michael K

    2013-07-23

    Mechanisms for aggressively optimizing computer code are provided. With these mechanisms, a compiler determines an optimization to apply to a portion of source code and determines if the optimization as applied to the portion of source code will result in unsafe optimized code that introduces a new source of exceptions being generated by the optimized code. In response to a determination that the optimization is an unsafe optimization, the compiler generates an aggressively compiled code version, in which the unsafe optimization is applied, and a conservatively compiled code version in which the unsafe optimization is not applied. The compiler stores both versions and provides them for execution. Mechanisms are provided for switching between these versions during execution in the event of a failure of the aggressively compiled code version. Moreover, predictive mechanisms are provided for predicting whether such a failure is likely.
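
    A minimal sketch of the dual-version idea, not IBM's actual compiler mechanism: an "aggressive" variant of a routine (here, with a safety check removed) runs first, and execution rolls back to a conservative variant if the aggressive one raises an exception it would not otherwise have raised. Function names and the failure condition are illustrative only.

    ```python
    def conservative_sum_of_inverses(xs):
        # safe version: checks for zero before dividing
        return sum(1.0 / x for x in xs if x != 0)

    def aggressive_sum_of_inverses(xs):
        # "unsafe" version: the zero check was optimized away, so it may raise
        return sum(1.0 / x for x in xs)

    def run_with_rollback(aggressive, conservative, *args):
        try:
            return aggressive(*args)            # fast path
        except ZeroDivisionError:               # new exception source introduced by the optimization
            return conservative(*args)          # roll back to the safe version

    print(run_with_rollback(aggressive_sum_of_inverses,
                            conservative_sum_of_inverses, [1.0, 2.0, 0.0]))
    ```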

  3. A structured sparse regression method for estimating isoform expression level from multi-sample RNA-seq data.

    PubMed

    Zhang, L; Liu, X J

    2016-06-03

    With the rapid development of next-generation high-throughput sequencing technology, RNA-seq has become a standard and important technique for transcriptome analysis. For multi-sample RNA-seq data, the existing expression estimation methods usually process each RNA-seq sample individually and ignore the fact that the read distributions are consistent across multiple samples. In the current study, we propose a structured sparse regression method, SSRSeq, to estimate isoform expression using multi-sample RNA-seq data. SSRSeq uses a non-parametric model to capture the general tendency of non-uniform read distribution for all genes across multiple samples. Additionally, our method adds a structured sparse regularization, which not only incorporates the sparse specificity between a gene and its corresponding isoform expression levels, but also reduces the effects of noisy reads, especially for lowly expressed genes and isoforms. Four real datasets were used to evaluate our method on isoform expression estimation. Compared with other popular methods, SSRSeq reduced the variance between multiple samples and produced more accurate isoform expression estimations, and thus more meaningful biological interpretations.
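
    The following is a generic, illustrative sketch (a plain l1-penalized non-negative least-squares fit, not the SSRSeq model): isoform abundances are estimated from region-level read counts shared across samples via proximal gradient steps. The matrices, penalty, and step size are assumptions made for the example.

    ```python
    import numpy as np

    def estimate_isoforms(A, Y, lam=0.1, steps=500):
        """A: (regions x isoforms) compatibility matrix; Y: (regions x samples) read counts."""
        k, s = A.shape[1], Y.shape[1]
        theta = np.zeros((k, s))                        # isoform abundances, one column per sample
        lr = 1.0 / (np.linalg.norm(A, 2) ** 2)          # step size from a Lipschitz bound
        for _ in range(steps):
            grad = A.T @ (A @ theta - Y)                # least-squares gradient, all samples at once
            theta = theta - lr * grad                   # gradient step
            theta = np.maximum(theta - lr * lam, 0.0)   # soft-threshold + non-negativity (prox step)
        return theta

    A = np.array([[1.0, 1.0], [1.0, 0.0], [0.0, 1.0]])      # 3 regions, 2 isoforms
    Y = np.array([[10.0, 12.0], [7.0, 9.0], [3.0, 3.0]])    # counts in 2 samples
    print(estimate_isoforms(A, Y).round(2))
    ```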

  4. mESAdb: microRNA Expression and Sequence Analysis Database

    PubMed Central

    Kaya, Koray D.; Karakülah, Gökhan; Yakıcıer, Cengiz M.; Acar, Aybar C.; Konu, Özlen

    2011-01-01

    microRNA expression and sequence analysis database (http://konulab.fen.bilkent.edu.tr/mirna/) (mESAdb) is a regularly updated database for the multivariate analysis of sequences and expression of microRNAs from multiple taxa. mESAdb is modular and has a user interface implemented in PHP and JavaScript and coupled with statistical analysis and visualization packages written for the R language. The database primarily comprises mature microRNA sequences and their target data, along with selected human, mouse and zebrafish expression data sets. mESAdb analysis modules allow (i) mining of microRNA expression data sets for subsets of microRNAs selected manually or by motif; (ii) pair-wise multivariate analysis of expression data sets within and between taxa; and (iii) association of microRNA subsets with annotation databases, HUGE Navigator, KEGG and GO. The use of existing and customized R packages facilitates future addition of data sets and analysis tools. Furthermore, the ability to upload and analyze user-specified data sets makes mESAdb an interactive and expandable analysis tool for microRNA sequence and expression data. PMID:21177657

  5. mESAdb: microRNA expression and sequence analysis database.

    PubMed

    Kaya, Koray D; Karakülah, Gökhan; Yakicier, Cengiz M; Acar, Aybar C; Konu, Ozlen

    2011-01-01

    microRNA expression and sequence analysis database (http://konulab.fen.bilkent.edu.tr/mirna/) (mESAdb) is a regularly updated database for the multivariate analysis of sequences and expression of microRNAs from multiple taxa. mESAdb is modular and has a user interface implemented in PHP and JavaScript and coupled with statistical analysis and visualization packages written for the R language. The database primarily comprises mature microRNA sequences and their target data, along with selected human, mouse and zebrafish expression data sets. mESAdb analysis modules allow (i) mining of microRNA expression data sets for subsets of microRNAs selected manually or by motif; (ii) pair-wise multivariate analysis of expression data sets within and between taxa; and (iii) association of microRNA subsets with annotation databases, HUGE Navigator, KEGG and GO. The use of existing and customized R packages facilitates future addition of data sets and analysis tools. Furthermore, the ability to upload and analyze user-specified data sets makes mESAdb an interactive and expandable analysis tool for microRNA sequence and expression data.

  6. Runtime support and compilation methods for user-specified data distributions

    NASA Technical Reports Server (NTRS)

    Ponnusamy, Ravi; Saltz, Joel; Choudhury, Alok; Hwang, Yuan-Shin; Fox, Geoffrey

    1993-01-01

    This paper describes two new ideas by which an HPF compiler can deal with irregular computations effectively. The first mechanism invokes a user specified mapping procedure via a set of compiler directives. The directives allow use of program arrays to describe graph connectivity, spatial location of array elements, and computational load. The second mechanism is a simple conservative method that in many cases enables a compiler to recognize that it is possible to reuse previously computed information from inspectors (e.g. communication schedules, loop iteration partitions, information that associates off-processor data copies with on-processor buffer locations). We present performance results for these mechanisms from a Fortran 90D compiler implementation.
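
    A minimal sketch of the second mechanism under assumed data structures (not the Fortran 90D implementation): an inspector groups off-processor indices by owning processor to form a communication schedule, and the cached schedule is reused when the indirection and ownership arrays are unchanged.

    ```python
    import numpy as np

    _schedule_cache = {}

    def inspector(indices, owner_of):
        """Group accessed indices by owning processor (the 'communication schedule')."""
        key = (indices.tobytes(), owner_of.tobytes())
        if key in _schedule_cache:                     # reuse: the inspector already ran for this pattern
            return _schedule_cache[key]
        schedule = {}
        for i in np.unique(indices):
            schedule.setdefault(int(owner_of[i]), []).append(int(i))
        _schedule_cache[key] = schedule
        return schedule

    def executor(x, indices, schedule):
        """Gather according to the schedule (locally here), then run the loop body."""
        gathered = {i for plist in schedule.values() for i in plist}
        assert set(indices.tolist()) <= gathered
        return x[indices].sum()

    x = np.arange(10.0)
    idx = np.array([1, 3, 3, 7])
    owners = np.array([i // 5 for i in range(10)])     # elements 0-4 on proc 0, 5-9 on proc 1
    sched = inspector(idx, owners)                     # computed once
    print(executor(x, idx, sched), inspector(idx, owners) is sched)   # schedule reused
    ```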

  7. AICPA allows low-cost options for compiled financial statements.

    PubMed

    Reinstein, Alan; Luecke, Randall W

    2002-02-01

    The AICPA Accounting and Review Services Committee's (ARSC) SSARS No. 8, Amendment to Statement on Standards for Accounting and Review Services No. 1, Compilation and Review of Financial Statements, issued in October 2000, allows financial managers to provide plain-paper, compiled financial statements for the exclusive use of management. Such financial statements were disallowed in 1979 when the AICPA issued SSARS No. 1, Compilation and Review of Financial Statements. With the issuance of SSARS No. 8, financial managers can prepare plain-paper, compiled financial statements when third parties are not expected to rely on the financial statements, management acknowledges such restrictions in writing, and management acknowledges its primary responsibility for the adequacy of the financial statements.

  8. A software methodology for compiling quantum programs

    NASA Astrophysics Data System (ADS)

    Häner, Thomas; Steiger, Damian S.; Svore, Krysta; Troyer, Matthias

    2018-04-01

    Quantum computers promise to transform our notions of computation by offering a completely new paradigm. To achieve scalable quantum computation, optimizing compilers and a corresponding software design flow will be essential. We present a software architecture for compiling quantum programs from a high-level language program to hardware-specific instructions. We describe the necessary layers of abstraction and their differences and similarities to classical layers of a computer-aided design flow. For each layer of the stack, we discuss the underlying methods for compilation and optimization. Our software methodology facilitates more rapid innovation among quantum algorithm designers, quantum hardware engineers, and experimentalists. It enables scalable compilation of complex quantum algorithms and can be targeted to any specific quantum hardware implementation.
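
    The toy pass pipeline below is only meant to illustrate the layered lowering the paper describes and is not the authors' software stack: a high-level gate is first decomposed into generic rotations and then mapped to an assumed hardware-native gate set {rx, rz, cz}.

    ```python
    def decompose_high_level(circuit):
        """Lower algorithm-level ops (e.g. 'h') into generic one-qubit rotations."""
        out = []
        for gate, *qubits in circuit:
            if gate == "h":                    # H = Rz(pi/2) Rx(pi/2) Rz(pi/2), up to global phase
                q = qubits[0]
                out += [("rz", q, 1.5708), ("rx", q, 1.5708), ("rz", q, 1.5708)]
            else:
                out.append((gate, *qubits))
        return out

    def map_to_native(circuit):
        """Rewrite two-qubit gates into the assumed native entangler (cz plus rotations)."""
        out = []
        for op in circuit:
            if op[0] == "cnot":                # CNOT = (I x H) CZ (I x H) on the target qubit
                _, c, t = op
                out += (decompose_high_level([("h", t)])
                        + [("cz", c, t)]
                        + decompose_high_level([("h", t)]))
            else:
                out.append(op)
        return out

    passes = [decompose_high_level, map_to_native]
    circuit = [("h", 0), ("cnot", 0, 1)]
    for p in passes:                           # run the pass pipeline layer by layer
        circuit = p(circuit)
    print(circuit)
    ```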

  9. Development of Nautical Almanac at Korea Astronomy Observatory

    NASA Astrophysics Data System (ADS)

    Han, In-Woo; Shin, Junho

    1994-12-01

    At the Korea Astronomy Observatory, we developed a S/W package to compile the Korean Nautical Almanac. We describe the motivation for developing the S/W and explain the S/W package in general terms. In the appendix, we describe the procedure for calculating the Polaris table in more detail. When we developed the S/W, we paid much attention to producing accurate data. We also made a great effort to automate the compilation of the Nautical Almanac as far as possible, since the compilation is time-consuming and labour-intensive. As a result, the S/W we developed turns out to be very accurate and efficient for compiling the Nautical Almanac. In fact, we could compile a Korean Nautical Almanac in a few days.

  10. Building a Community of Writers for the 21st Century: A Compilation of the Teaching Demonstrations, Personal and Professional Writings, and Daily Activities of the Samford University Writing Project (July 6-August 6, 1992).

    ERIC Educational Resources Information Center

    Roberts, David H., Ed.; And Others

    This compilation presents materials associated with the 5-week summer session of the Samford University Writing Project, 1992. The compilation begins with curriculum vitae of staff, teacher consultants, and guest speakers. The compilation also presents lists of group and committee members and daily logs written in by participants in a wide variety…

  11. Toward ADA: The Continuing Development of an ADA Compiler.

    DTIC Science & Technology

    1981-12-01

    the compiler. 1.2 Background Augusta Ada Byron, Countess Lovelace, the daughter of the poet Lord Byron, was a colleague of Charles Babbage and author of...continuing development of the AFIT-Ada compiler. The encouragement I received from Dr. Charles W. Roark, who taught the compiler sequence, and Roie R...thank my advisor, Roie R. Black, for his continuing counsel and advice. Many thanks to my readers, Dr James P. Rutledge and Charles W. Richard, for

  12. A ROSE-based OpenMP 3.0 Research Compiler Supporting Multiple Runtime Libraries

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Liao, C; Quinlan, D; Panas, T

    2010-01-25

    OpenMP is a popular and evolving programming model for shared-memory platforms. It relies on compilers for optimal performance and to target modern hardware architectures. A variety of extensible and robust research compilers are key to OpenMP's sustainable success in the future. In this paper, we present our efforts to build an OpenMP 3.0 research compiler for C, C++, and Fortran; using the ROSE source-to-source compiler framework. Our goal is to support OpenMP research for ourselves and others. We have extended ROSE's internal representation to handle all of the OpenMP 3.0 constructs and facilitate their manipulation. Since OpenMP research is often complicated by the tight coupling of the compiler translations and the runtime system, we present a set of rules to define a common OpenMP runtime library (XOMP) on top of multiple runtime libraries. These rules additionally define how to build a set of translations targeting XOMP. Our work demonstrates how to reuse OpenMP translations across different runtime libraries. This work simplifies OpenMP research by decoupling the problematic dependence between the compiler translations and the runtime libraries. We present an evaluation of our work by demonstrating an analysis tool for OpenMP correctness. We also show how XOMP can be defined using both GOMP and Omni and present comparative performance results against other OpenMP compilers.
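
    A schematic sketch of the common-runtime-layer idea in Python (the real XOMP layer is C code generated for ROSE, and the backend behaviour shown here is only indicative): compiler-generated calls target a single xomp_* interface, and a thin adapter forwards them to whichever runtime backend is configured.

    ```python
    class GompBackend:
        """Stand-in for a GOMP-style runtime (entry-point behaviour indicative only)."""
        def parallel_start(self, outlined_fn, nthreads):
            print(f"GOMP backend: launching {nthreads} threads")
            outlined_fn()
        def barrier(self):
            print("GOMP backend: barrier")

    class OmniBackend:
        """Stand-in for an Omni-style runtime."""
        def parallel_start(self, outlined_fn, nthreads):
            print(f"Omni backend: launching {nthreads} threads")
            outlined_fn()
        def barrier(self):
            print("Omni backend: barrier")

    class XOMP:
        """The single runtime interface that compiler-generated code targets."""
        def __init__(self, backend):
            self.backend = backend
        def xomp_parallel_start(self, outlined_fn, nthreads=4):
            self.backend.parallel_start(outlined_fn, nthreads)
        def xomp_barrier(self):
            self.backend.barrier()

    def outlined_region():
        # what a compiler would outline from '#pragma omp parallel { ... }'
        print("  parallel region body")

    for backend in (GompBackend(), OmniBackend()):
        runtime = XOMP(backend)
        runtime.xomp_parallel_start(outlined_region)
        runtime.xomp_barrier()
    ```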

  13. Proceedings of the workshop on Compilation of (Symbolic) Languages for Parallel Computers

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Foster, I.; Tick, E.

    1991-11-01

    This report comprises the abstracts and papers for the talks presented at the Workshop on Compilation of (Symbolic) Languages for Parallel Computers, held October 31--November 1, 1991, in San Diego. These unrefereed contributions were provided by the participants for the purpose of this workshop; many of them will be published elsewhere in peer-reviewed conferences and publications. Our goal in planning this workshop was to bring together researchers from different disciplines with common problems in compilation. In particular, we wished to encourage interaction between researchers working in compilation of symbolic languages and those working on compilation of conventional, imperative languages. The fundamental problems facing researchers interested in compilation of logic, functional, and procedural programming languages for parallel computers are essentially the same. However, differences in the basic programming paradigms have led to different communities emphasizing different species of the parallel compilation problem. For example, parallel logic and functional languages provide dataflow-like formalisms in which control dependencies are unimportant. Hence, a major focus of research in compilation has been on techniques that try to infer when sequential control flow can safely be imposed. Granularity analysis for scheduling is a related problem. The single-assignment property leads to a need for analysis of memory use in order to detect opportunities for reuse. Much of the work in each of these areas relies on the use of abstract interpretation techniques.

  14. High anger expression exacerbates the relationship between age and metabolic syndrome.

    PubMed

    Boylan, Jennifer Morozink; Ryff, Carol D

    2015-01-01

    Building on prior work linking high anger expression to poor health, this cross-sectional study addressed whether anger expression exacerbated age-related risk for metabolic syndrome in a national sample of adults, known as MIDUS (Midlife in the United States). Respondents reported anger expression via survey assessments and completed an overnight clinic visit. Unadjusted metabolic syndrome prevalence was 40.6%. Men, less educated individuals, and those who reported not getting regular physical activity were at significantly higher risk for metabolic syndrome. Anger expression did not predict higher risk for metabolic syndrome in main effects models, but it moderated the relationship between age and metabolic syndrome. Age-associated risk for metabolic syndrome was significant only for adults with high anger expression. Among older adults, anger expression predicted higher prevalence of metabolic syndrome. Older adults reporting low anger expression had metabolic syndrome rates comparable to younger adults. Results highlight that failing to show the frequently observed decline in anger expression with age may have pernicious health concomitants. © The Author 2013. Published by Oxford University Press on behalf of The Gerontological Society of America. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.

  15. Collective relaxation dynamics of small-world networks

    NASA Astrophysics Data System (ADS)

    Grabow, Carsten; Grosskinsky, Stefan; Kurths, Jürgen; Timme, Marc

    2015-05-01

    Complex networks exhibit a wide range of collective dynamic phenomena, including synchronization, diffusion, relaxation, and coordination processes. Their asymptotic dynamics is generically characterized by the local Jacobian, graph Laplacian, or a similar linear operator. The structure of networks with regular, small-world, and random connectivities are reasonably well understood, but their collective dynamical properties remain largely unknown. Here we present a two-stage mean-field theory to derive analytic expressions for network spectra. A single formula covers the spectrum from regular via small-world to strongly randomized topologies in Watts-Strogatz networks, explaining the simultaneous dependencies on network size N , average degree k , and topological randomness q . We present simplified analytic predictions for the second-largest and smallest eigenvalue, and numerical checks confirm our theoretical predictions for zero, small, and moderate topological randomness q , including the entire small-world regime. For large q of the order of one, we apply standard random matrix theory, thereby overarching the full range from regular to randomized network topologies. These results may contribute to our analytic and mechanistic understanding of collective relaxation phenomena of network dynamical systems.

  16. Collective relaxation dynamics of small-world networks.

    PubMed

    Grabow, Carsten; Grosskinsky, Stefan; Kurths, Jürgen; Timme, Marc

    2015-05-01

    Complex networks exhibit a wide range of collective dynamic phenomena, including synchronization, diffusion, relaxation, and coordination processes. Their asymptotic dynamics is generically characterized by the local Jacobian, graph Laplacian, or a similar linear operator. The structure of networks with regular, small-world, and random connectivities are reasonably well understood, but their collective dynamical properties remain largely unknown. Here we present a two-stage mean-field theory to derive analytic expressions for network spectra. A single formula covers the spectrum from regular via small-world to strongly randomized topologies in Watts-Strogatz networks, explaining the simultaneous dependencies on network size N, average degree k, and topological randomness q. We present simplified analytic predictions for the second-largest and smallest eigenvalue, and numerical checks confirm our theoretical predictions for zero, small, and moderate topological randomness q, including the entire small-world regime. For large q of the order of one, we apply standard random matrix theory, thereby overarching the full range from regular to randomized network topologies. These results may contribute to our analytic and mechanistic understanding of collective relaxation phenomena of network dynamical systems.
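
    As a quick numerical companion to this record (parameters arbitrary, and using the graph Laplacian's second-smallest eigenvalue as a simple relaxation proxy rather than the paper's full operator analysis), one can watch the spectrum change from regular to randomized Watts-Strogatz topologies:

    ```python
    import networkx as nx
    import numpy as np

    N, k = 200, 10
    for q in [0.0, 0.01, 0.1, 1.0]:
        G = nx.watts_strogatz_graph(N, k, q, seed=1)
        L = nx.laplacian_matrix(G).toarray().astype(float)
        lam = np.sort(np.linalg.eigvalsh(L))          # full Laplacian spectrum
        print(f"q={q:4}: lambda_2 = {lam[1]:.4f}, lambda_max = {lam[-1]:.2f}")
    ```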

  17. Research on the correlation between corona current spectrum and audible noise spectrum of HVDC transmission line

    NASA Astrophysics Data System (ADS)

    Liu, Yingyi; Zhou, Lijuan; Liu, Yuanqing; Yuan, Haiwen; Ji, Liang

    2017-11-01

    Audible noise is closely related to corona current on a high voltage direct current (HVDC) transmission line. In this paper, we measured a large number of audible noise and corona current waveforms simultaneously on the largest outdoor HVDC corona cage in the world. By analyzing the experimental data, statistical regularities relating the corona current spectrum to the audible noise spectrum were obtained. Furthermore, the generation mechanism of audible noise was analyzed theoretically, and a mathematical expression relating the audible noise spectrum to the corona current spectrum, valid for all of the measuring points in space, was established based on electro-acoustic conversion theory. Finally, combined with the obtained mathematical relation, the underlying reasons for the statistical regularities appearing in the measured corona current and audible noise data were explained. The results of this paper not only present the statistical association between the corona current spectrum and the audible noise spectrum on an HVDC transmission line, but also reveal the inherent reasons for these associations.

  18. Moving force identification based on redundant concatenated dictionary and weighted l1-norm regularization

    NASA Astrophysics Data System (ADS)

    Pan, Chu-Dong; Yu, Ling; Liu, Huan-Lin; Chen, Ze-Peng; Luo, Wen-Feng

    2018-01-01

    Moving force identification (MFI) is an important inverse problem in the field of bridge structural health monitoring (SHM). Reasonable signal structures of moving forces are rarely considered in the existing MFI methods. Interaction forces are complex because they contain both slowly-varying harmonic and impact signals due to bridge vibration and bumps on a bridge deck, respectively. Therefore, the interaction forces are usually hard to be expressed completely and sparsely by using a single basis function set. Based on the redundant concatenated dictionary and weighted l1-norm regularization method, a hybrid method is proposed for MFI in this study. The redundant dictionary consists of both trigonometric functions and rectangular functions used for matching the harmonic and impact signal features of unknown moving forces. The weighted l1-norm regularization method is introduced for formulation of MFI equation, so that the signal features of moving forces can be accurately extracted. The fast iterative shrinkage-thresholding algorithm (FISTA) is used for solving the MFI problem. The optimal regularization parameter is appropriately chosen by the Bayesian information criterion (BIC) method. In order to assess the accuracy and the feasibility of the proposed method, a simply-supported beam bridge subjected to a moving force is taken as an example for numerical simulations. Finally, a series of experimental studies on MFI of a steel beam are performed in laboratory. Both numerical and experimental results show that the proposed method can accurately identify the moving forces with a strong robustness, and it has a better performance than the Tikhonov regularization method. Some related issues are discussed as well.
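
    A simplified numerical sketch of the main ingredients (the dictionary atoms, weights, and regularization parameter below are illustrative choices, not the paper's settings): a signal that is sparse in a concatenated trigonometric-plus-rectangular dictionary is recovered by FISTA applied to a weighted l1-regularized least-squares problem.

    ```python
    import numpy as np

    def fista_weighted_l1(D, y, w, lam=0.05, iters=400):
        """Minimize 0.5*||D a - y||^2 + lam * sum(w * |a|) with FISTA."""
        L = np.linalg.norm(D, 2) ** 2           # Lipschitz constant of the gradient
        a = np.zeros(D.shape[1])
        z = a.copy()
        t = 1.0
        for _ in range(iters):
            g = z - (D.T @ (D @ z - y)) / L     # gradient step on the smooth part
            a_new = np.sign(g) * np.maximum(np.abs(g) - lam * w / L, 0.0)   # weighted soft-threshold
            t_new = (1 + np.sqrt(1 + 4 * t * t)) / 2
            z = a_new + ((t - 1) / t_new) * (a_new - a)                     # Nesterov momentum
            a, t = a_new, t_new
        return a

    # redundant concatenated dictionary: smooth trigonometric atoms + impact-like boxes
    n = 200
    tgrid = np.linspace(0.0, 1.0, n)
    harmonics = np.stack([np.cos(2 * np.pi * k * tgrid) for k in range(1, 11)], axis=1)
    boxes = np.stack([(np.abs(tgrid - c) < 0.02).astype(float)
                      for c in np.linspace(0.05, 0.95, 19)], axis=1)
    D = np.hstack([harmonics, boxes])
    D = D / np.linalg.norm(D, axis=0)            # normalize atoms
    y = 2.0 * D[:, 2] + 3.0 * D[:, 19]           # force-like signal: harmonic part + one impact
    a_hat = fista_weighted_l1(D, y, w=np.ones(D.shape[1]))
    print(np.argsort(-np.abs(a_hat))[:3])        # indices 2 and 19 should dominate
    ```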

  19. Effects of dark chocolate on azoxymethane-induced colonic aberrant crypt foci.

    PubMed

    Hong, Mee Young; Nulton, Emily; Shelechi, Mahshid; Hernández, Lisa M; Nemoseck, Tricia

    2013-01-01

    Epidemiologic evidence supports that diets rich in polyphenols promote health and may delay the onset of colon cancer. Cocoa and chocolate products have some of the highest polyphenolic concentrations compared to other polyphenolic food sources. This study tested the hypothesis that a diet including dark chocolate can protect against colon cancer by inhibiting aberrant crypt foci (ACF) formation, downregulating gene expression of inflammatory mediators, and favorably altering cell kinetics. We also investigated whether bloomed dark chocolate retains the antioxidant capacity and protects against colon cancer. Forty-eight rats received either a diet containing control (no chocolate), regular dark chocolate, or bloomed dark chocolate and were injected subcutaneously with saline or azoxymethane. Relative to control, both regular and bloomed dark chocolate diets lowered the total number of ACF (P = 0.022). Chocolate diet-fed animals downregulated transcription levels of COX-2 (P = 0.035) and RelA (P = 0.045). Both chocolate diets lowered the proliferation index (P = 0.001). These results suggest that a diet including dark chocolate can reduce cell proliferation and some gene expression involving inflammation, which may explain the lower number of early preneoplastic lesions. These results provide new insight on polyphenol-rich chocolate foods and colon cancer prevention.

  20. Deletion and Gene Expression Analyses Define the Paxilline Biosynthetic Gene Cluster in Penicillium paxilli

    PubMed Central

    Scott, Barry; Young, Carolyn A.; Saikia, Sanjay; McMillan, Lisa K.; Monahan, Brendon J.; Koulman, Albert; Astin, Jonathan; Eaton, Carla J.; Bryant, Andrea; Wrenn, Ruth E.; Finch, Sarah C.; Tapper, Brian A.; Parker, Emily J.; Jameson, Geoffrey B.

    2013-01-01

    The indole-diterpene paxilline is an abundant secondary metabolite synthesized by Penicillium paxilli. In total, 21 genes have been identified at the PAX locus of which six have been previously confirmed to have a functional role in paxilline biosynthesis. A combination of bioinformatics, gene expression and targeted gene replacement analyses were used to define the boundaries of the PAX gene cluster. Targeted gene replacement identified seven genes, paxG, paxA, paxM, paxB, paxC, paxP and paxQ that were all required for paxilline production, with one additional gene, paxD, required for regular prenylation of the indole ring post paxilline synthesis. The two putative transcription factors, PP104 and PP105, were not co-regulated with the pax genes and based on targeted gene replacement, including the double knockout, did not have a role in paxilline production. The relationship of indole dimethylallyl transferases involved in prenylation of indole-diterpenes such as paxilline or lolitrem B, can be found as two disparate clades, not supported by prenylation type (e.g., regular or reverse). This paper provides insight into the P. paxilli indole-diterpene locus and reviews the recent advances identified in paxilline biosynthesis. PMID:23949005

  1. SOL - SIZING AND OPTIMIZATION LANGUAGE COMPILER

    NASA Technical Reports Server (NTRS)

    Scotti, S. J.

    1994-01-01

    SOL is a computer language which is geared to solving design problems. SOL includes the mathematical modeling and logical capabilities of a computer language like FORTRAN but also includes the additional power of non-linear mathematical programming methods (i.e. numerical optimization) at the language level (as opposed to the subroutine level). The language-level use of optimization has several advantages over the traditional, subroutine-calling method of using an optimizer: first, the optimization problem is described in a concise and clear manner which closely parallels the mathematical description of optimization; second, a seamless interface is automatically established between the optimizer subroutines and the mathematical model of the system being optimized; third, the results of an optimization (objective, design variables, constraints, termination criteria, and some or all of the optimization history) are output in a form directly related to the optimization description; and finally, automatic error checking and recovery from an ill-defined system model or optimization description is facilitated by the language-level specification of the optimization problem. Thus, SOL enables rapid generation of models and solutions for optimum design problems with greater confidence that the problem is posed correctly. The SOL compiler takes SOL-language statements and generates the equivalent FORTRAN code and system calls. Because of this approach, the modeling capabilities of SOL are extended by the ability to incorporate existing FORTRAN code into a SOL program. In addition, SOL has a powerful MACRO capability. The MACRO capability of the SOL compiler effectively gives the user the ability to extend the SOL language and can be used to develop easy-to-use shorthand methods of generating complex models and solution strategies. The SOL compiler provides syntactic and semantic error-checking, error recovery, and detailed reports containing cross-references to show where each variable was used. The listings summarize all optimizations, listing the objective functions, design variables, and constraints. The compiler offers error-checking specific to optimization problems, so that simple mistakes will not cost hours of debugging time. The optimization engine used by and included with the SOL compiler is a version of Vanderplatt's ADS system (Version 1.1) modified specifically to work with the SOL compiler. SOL allows the use of the over 100 ADS optimization choices such as Sequential Quadratic Programming, Modified Feasible Directions, interior and exterior penalty function and variable metric methods. Default choices of the many control parameters of ADS are made for the user, however, the user can override any of the ADS control parameters desired for each individual optimization. The SOL language and compiler were developed with an advanced compiler-generation system to ensure correctness and simplify program maintenance. Thus, SOL's syntax was defined precisely by a LALR(1) grammar and the SOL compiler's parser was generated automatically from the LALR(1) grammar with a parser-generator. Hence unlike ad hoc, manually coded interfaces, the SOL compiler's lexical analysis insures that the SOL compiler recognizes all legal SOL programs, can recover from and correct for many errors and report the location of errors to the user. This version of the SOL compiler has been implemented on VAX/VMS computer systems and requires 204 KB of virtual memory to execute. 
    Since the SOL compiler produces FORTRAN code, it requires the VAX FORTRAN compiler to produce an executable program. The SOL compiler consists of 13,000 lines of Pascal code. It was developed in 1986 and last updated in 1988. The ADS and other utility subroutines amount to 14,000 lines of FORTRAN code and were also updated in 1988.
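
    Not SOL itself, but a small Python analogue of its central idea of language-level optimization: the problem is stated declaratively and a thin "compiler" step wires the pieces to an optimizer (here SciPy's SLSQP), instead of the user hand-coding optimizer calls. The problem data are invented for the example.

    ```python
    from scipy.optimize import minimize

    problem = {
        "objective":   lambda x: (x[0] - 3) ** 2 + (x[1] + 1) ** 2,   # minimize this
        "constraints": [lambda x: x[0] + x[1] - 1],                   # each expression must be >= 0
        "start":       [0.0, 0.0],
    }

    def compile_and_solve(p):
        """Translate the declarative description into an optimizer call (the 'codegen' step)."""
        cons = [{"type": "ineq", "fun": c} for c in p["constraints"]]
        result = minimize(p["objective"], p["start"], method="SLSQP", constraints=cons)
        return result.x, result.fun

    x_opt, f_opt = compile_and_solve(problem)
    print(x_opt.round(3), round(f_opt, 3))     # expects roughly [3, -1] and 0.0
    ```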

  2. On search guide phrase compilation for recommending home medical products.

    PubMed

    Luo, Gang

    2010-01-01

    To help people find desired home medical products (HMPs), we developed an intelligent personal health record (iPHR) system that can automatically recommend HMPs based on users' health issues. Using nursing knowledge, we pre-compile a set of "search guide" phrases that provides semantic translation from words describing health issues to their underlying medical meanings. Then iPHR automatically generates queries from those phrases and uses them and a search engine to retrieve HMPs. To avoid missing relevant HMPs during retrieval, the compiled search guide phrases need to be comprehensive. Such compilation is a challenging task because nursing knowledge updates frequently and contains numerous details scattered in many sources. This paper presents a semi-automatic tool facilitating such compilation. Our idea is to formulate the phrase compilation task as a multi-label classification problem. For each newly obtained search guide phrase, we first use nursing knowledge and information retrieval techniques to identify a small set of potentially relevant classes with corresponding hints. Then a nurse makes the final decision on assigning this phrase to proper classes based on those hints. We demonstrate the effectiveness of our techniques by compiling search guide phrases from an occupational therapy textbook.
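
    A rough sketch of the candidate-class step under assumed data (the class names and descriptions are made up; the paper's nursing knowledge base is not reproduced here): existing classes are ranked by text similarity to a new search guide phrase so that a nurse only has to confirm a short list of hints.

    ```python
    from sklearn.feature_extraction.text import TfidfVectorizer
    from sklearn.metrics.pairwise import cosine_similarity

    class_descriptions = {
        "mobility aids":       "difficulty walking wheelchair cane walker transfer",
        "bathing safety":      "bathing shower grab bar slip fall bathroom",
        "respiratory support": "shortness of breath oxygen nebulizer breathing",
    }

    def candidate_classes(new_phrase, k=2):
        """Return the k most similar classes as hints for manual review."""
        names = list(class_descriptions)
        docs = list(class_descriptions.values())
        vec = TfidfVectorizer().fit(docs + [new_phrase])
        sims = cosine_similarity(vec.transform([new_phrase]), vec.transform(docs))[0]
        return sorted(zip(names, sims), key=lambda t: -t[1])[:k]

    print(candidate_classes("trouble getting out of the shower without slipping"))
    ```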

  3. On the efficient bio-incorporation of 5-hydroxy-tryptophan in recombinant proteins expressed in Escherichia coli with T7 RNA polymerase-based vectors.

    PubMed

    Oliveira-Souza, Wellington P; Bronze, Fellipe; Broos, Jaap; Marcondes, Marcelo F M; Oliveira, Vitor

    2017-10-21

    Biosynthetic incorporation of non-canonical amino acids is an attractive strategy to introduce new properties into recombinant proteins. Trp analogs can be incorporated into recombinant proteins, replacing regular Trp during protein translation in a Trp-auxotrophic cell host. This straightforward method, however, is limited to the few analogs recognized and accepted by the cellular protein production machinery. 5-hydroxy-tryptophan (5OH-Trp) can be bio-incorporated using E. coli as expression host; however, we have experienced very low incorporation yields (amount of protein containing regular Trp versus amount of protein containing the Trp analog) during expression of 5OH-Trp-labeled proteins. Furthermore, these low incorporation yields were observed especially when the widely used vectors based on the T7 RNA polymerase were employed. Testing different 5OH-Trp incorporation protocols, we verified that in these T7-based systems the production of the T7 RNA polymerase is driven by the same elements (lac promoter/IPTG) as the target protein. Consequently, bio-incorporation of 5OH-Trp residues also occurs in this crucial enzyme, but the resulting T7 RNA polymerase labeled with 5OH-Trp is inactive or much less active. In the present work, we describe an efficient method to overcome this problem and bio-incorporate 5OH-Trp into proteins expressed in E. coli using vectors based on the T7 RNA polymerase/T7 promoter. The two-step induction protocol described here achieved 5OH-Trp incorporation efficiencies higher than 90%. Copyright © 2017 Elsevier Inc. All rights reserved.

  4. Clustered Regularly Interspaced Short Palindromic Repeat-Dependent, Biofilm-Specific Death of Pseudomonas aeruginosa Mediated by Increased Expression of Phage-Related Genes

    PubMed Central

    Heussler, Gary E.; Cady, Kyle C.; Koeppen, Katja; Bhuju, Sabin; Stanton, Bruce A.

    2015-01-01

    The clustered regularly interspaced short palindromic repeat (CRISPR)/CRISPR-associated (CRISPR/Cas) system is an adaptive immune system present in many archaea and bacteria. CRISPR/Cas systems are incredibly diverse, and there is increasing evidence of CRISPR/Cas systems playing a role in cellular functions distinct from phage immunity. Previously, our laboratory reported one such alternate function in which the type I-F CRISPR/Cas system of the opportunistic pathogen Pseudomonas aeruginosa strain UCBPP-PA14 (abbreviated as P. aeruginosa PA14) inhibits both biofilm formation and swarming motility when the bacterium is lysogenized by the bacteriophage DMS3. In this study, we demonstrated that the presence of just the DMS3 protospacer and the protospacer-adjacent motif (PAM) on the P. aeruginosa genome is necessary and sufficient for this CRISPR-dependent loss of these group behaviors, with no requirement of additional DMS3 sequences. We also demonstrated that the interaction of the CRISPR system with the DMS3 protospacer induces expression of SOS-regulated phage-related genes, including the well-characterized pyocin operon, through the activity of the nuclease Cas3 and subsequent RecA activation. Furthermore, our data suggest that expression of the phage-related genes results in bacterial cell death on a surface due to the inability of the CRISPR-engaged strain to downregulate phage-related gene expression, while these phage-related genes have minimal impact on growth and viability under planktonic conditions. Deletion of the phage-related genes restores biofilm formation and swarming motility while still maintaining a functional CRISPR/Cas system, demonstrating that the loss of these group behaviors is an indirect effect of CRISPR self-targeting. PMID:25968642

  5. Improved photobio-H2 production regulated by artificial miRNA targeting psbA in green microalga Chlamydomonas reinhardtii.

    PubMed

    Li, Hui; Liu, Yanmei; Wang, Yuting; Chen, Meirong; Zhuang, Xiaoshan; Wang, Chaogang; Wang, Jiangxin; Hu, Zhangli

    2018-01-01

    Sulfur-deprived cultivation of Chlamydomonas reinhardtii, referred to as "two-stage culture" because cells are transferred from a regular algal medium to a sulfur-depleted one, has been extensively studied to improve photobio-H2 production in this green microalga. During sulfur-deprivation treatment, the synthesis of a key component of the photosystem II complex, the D1 protein, is inhibited, and improved photobio-H2 production can be established in C. reinhardtii. However, moving algal cells from a regular liquid culture medium to a sulfur-deprived one is not only a discontinuous process but also a cost- and time-consuming operation. More applicable and economical alternatives for sustained H2 production by C. reinhardtii are still highly desirable. In the present study, a significant improvement in photobio-H2 production was observed in a transgenic C. reinhardtii strain that employed a newly designed strategy based on a heat-inducible artificial miRNA (amiRNA) expression system targeting the D1-encoding gene, psbA. A transgenic algal strain referred to as "amiRNA-D1" was successfully obtained by transforming an expression vector containing a heat-inducible promoter. After heat shock of the algal cultures, the expression of amiRNA-D1 increased 15-fold, accompanied by a 73% decrease of the target gene psbA. More interestingly, this transgenic alga accumulated about 60% more H2 than the wild-type strain CC-849 at the end of a 7-day cultivation, a significant improvement in photobio-H2 production. Without imposing any nutrient-deprivation stress, this novel strategy provides a convenient and efficient way to regulate photobio-H2 production in a green microalga by simply "turning on" the expression of a designed amiRNA.

  6. PCAL: Language Support for Proof-Carrying Authorization Systems

    DTIC Science & Technology

    2009-10-16

    behavior of a compiled program is the same as that of the source program (Theorem 4.1) and that successfully compiled programs cannot fail due to access...semantics, formalize our compilation procedure and show that it preserves the behavior of programs. For simplicity of presentation, we abstract various...H;L ⊢ s (6) if γ :: H;L ⊢ s then H;L ⊢ s ↘ γ′ for some γ′. We can now show that compilation preserves the behavior of programs. More precisely, if

  7. Temporal Planning for Compilation of Quantum Approximate Optimization Algorithm Circuits

    NASA Technical Reports Server (NTRS)

    Venturelli, Davide; Do, Minh Binh; Rieffel, Eleanor Gilbert; Frank, Jeremy David

    2017-01-01

    We investigate the application of temporal planners to the problem of compiling quantum circuits to newly emerging quantum hardware. While our approach is general, we focus our initial experiments on Quantum Approximate Optimization Algorithm (QAOA) circuits that have few ordering constraints and allow highly parallel plans. We report on experiments using several temporal planners to compile circuits of various sizes to a realistic hardware. This early empirical evaluation suggests that temporal planning is a viable approach to quantum circuit compilation.
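
    The toy greedy scheduler below is not a temporal planner of the kind evaluated in the paper; it only illustrates the compilation objective, with invented gate names and durations: place gates as early as possible while each qubit is used by at most one operation at a time.

    ```python
    def schedule(gates, durations):
        """gates: list of (name, qubits). Returns (start_time, name, qubits) tuples and the makespan."""
        qubit_free_at = {}
        plan = []
        for name, qubits in gates:
            start = max(qubit_free_at.get(q, 0.0) for q in qubits)   # earliest time all qubits are free
            for q in qubits:
                qubit_free_at[q] = start + durations[name]
            plan.append((start, name, qubits))
        return plan, max(qubit_free_at.values())

    gates = [("zz", (0, 1)), ("zz", (2, 3)), ("zz", (1, 2)), ("mix", (0,)), ("mix", (3,))]
    durations = {"zz": 3.0, "mix": 1.0}
    plan, makespan = schedule(gates, durations)
    print(plan, makespan)          # the two independent zz gates run in parallel at t=0
    ```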

  8. Parallel machine architecture and compiler design facilities

    NASA Technical Reports Server (NTRS)

    Kuck, David J.; Yew, Pen-Chung; Padua, David; Sameh, Ahmed; Veidenbaum, Alex

    1990-01-01

    The objective is to provide an integrated simulation environment for studying and evaluating various issues in designing parallel systems, including machine architectures, parallelizing compiler techniques, and parallel algorithms. The status of the Delta project (whose objective is to provide a facility for rapid prototyping of parallelizing compilers that can target different machine architectures) is summarized. Included are surveys of the program manipulation tools developed, the environmental software supporting Delta, and the compiler research projects in which Delta has played a role.

  9. Numerical performance and throughput benchmark for electronic structure calculations in PC-Linux systems with new architectures, updated compilers, and libraries.

    PubMed

    Yu, Jen-Shiang K; Hwang, Jenn-Kang; Tang, Chuan Yi; Yu, Chin-Hui

    2004-01-01

    A number of recently released numerical libraries, including the Automatically Tuned Linear Algebra Subroutines (ATLAS) library, the Intel Math Kernel Library (MKL), the GOTO numerical library, and the AMD Core Math Library (ACML) for AMD Opteron processors, are linked against the executables of the Gaussian 98 electronic structure calculation package, which is compiled by updated versions of Fortran compilers such as the Intel Fortran compiler (ifc/efc) 7.1 and the PGI Fortran compiler (pgf77/pgf90) 5.0. The ifc 7.1 delivers about a 3% improvement on 32-bit machines compared with the former version 6.0. The performance improvement from pgf77 3.3 to 5.0 is also around 3% when the original, unmodified optimization options enclosed in the software are used. Nevertheless, if extensive compiler tuning options are used, the speed can be further increased by about 25%. The performances of these fully optimized numerical libraries are similar. The double-precision floating-point (FP) instruction set (SSE2) is also functional on AMD Opteron processors operated in 32-bit compilation, and the Intel Fortran compiler performs better optimization. Hardware-level tuning is able to improve memory bandwidth by adjusting the DRAM timing, and efficiency in the CL2 mode is a further 2.6% higher than in the CL2.5 mode. The FP throughput is measured by simultaneous execution of two identical copies of each of the test jobs. The resulting performance impact suggests that the IA64 and AMD64 architectures can deliver significantly higher throughput than IA32, which is consistent with the SpecFPrate2000 benchmarks.

  10. In defense of compilation: A response to Davis' form and content in model-based reasoning

    NASA Technical Reports Server (NTRS)

    Keller, Richard

    1990-01-01

    In a recent paper entitled 'Form and Content in Model Based Reasoning', Randy Davis argues that model-based reasoning research aimed at compiling task-specific rules from underlying device models is mislabeled, misguided, and diversionary. Some of Davis' claims are examined, and his basic conclusions about the value of compilation research to the model-based reasoning community are challenged. In particular, Davis' claim that model-based reasoning is exempt from the efficiency benefits provided by knowledge compilation techniques is refuted. In addition, several misconceptions about the role of representational form in compilation are clarified. It is concluded that compilation techniques have the potential to make a substantial contribution to solving tractability problems in model-based reasoning.

  11. Solidify, An LLVM pass to compile LLVM IR into Solidity

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kothapalli, Abhiram

    The software currently compiles LLVM IR into Solidity (Ethereum’s dominant programming language) using LLVM’s pass library. Specifically, this compiler allows us to convert an arbitrary DSL into Solidity. We focus on converting Domain Specific Languages into Solidity due to their ease of use and provable properties. By creating a toolchain to compile lightweight domain-specific languages into Ethereum's dominant language, Solidity, we allow non-specialists to effectively develop safe and useful smart contracts. For example, lawyers from a certain firm can have a proprietary DSL that codifies basic laws safely converted to Solidity to be securely executed on the blockchain. In another example, a simple provenance tracking language can be compiled and securely executed on the blockchain.
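
    The pass itself operates on LLVM's in-memory IR through the C++ pass library; as a purely illustrative, text-level sketch of the lowering idea, the Python toy below rewrites a hypothetical fragment of arithmetic IR into Solidity-like statements. The instruction subset, register names, and emitted function are all invented for illustration.

      # Toy lowering of a made-up, text-level LLVM-IR-like subset into Solidity
      # statements; the real Solidify pass works on actual LLVM IR in C++.
      import re

      OPS = {"add": "+", "sub": "-", "mul": "*", "udiv": "/"}

      def sol_name(x):
          # IR temporaries (%1, %2, ...) become v1, v2, ...; named values keep their name.
          return f"v{x}" if x.isdigit() else x

      def lower_line(ir_line):
          # Matches e.g.  %1 = add i256 %a, %b
          m = re.match(r"%(\w+) = (\w+) i256 %(\w+), %(\w+)", ir_line.strip())
          if not m:
              raise ValueError(f"unsupported instruction: {ir_line!r}")
          dest, op, lhs, rhs = m.groups()
          return f"uint256 {sol_name(dest)} = {sol_name(lhs)} {OPS[op]} {sol_name(rhs)};"

      ir = ["%1 = add i256 %a, %b",
            "%2 = mul i256 %1, %a"]

      print("function f(uint256 a, uint256 b) public pure returns (uint256) {")
      for line in ir:
          print("    " + lower_line(line))
      print("    return v2;")
      print("}")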

  12. The nuisance due to the noise of automobile traffic: An investigation in the neighborhoods of freeways

    NASA Technical Reports Server (NTRS)

    Lamure, C.; Bacelon, M.

    1980-01-01

    An inquiry was held among 400 people living near freeways in an attempt to determine the characteristics of traffic noise nuisance. A nuisance index was compiled, based on the answers to a questionnaire. Nuisance expressed in these terms was then compared with the noise level measured on the most exposed side of each building. Correlation between the nuisance indexes and the average noise levels is quite good for dwellings with facades parallel to the freeway. At equal noise levels on the most exposed side, the nuisance given for these latter dwellings is lower than for others.

  13. Expression of the Mir-133 and Bcl-2 could be affected by swimming training in the heart of ovariectomized rats.

    PubMed

    Habibi, Parisa; Alihemmati, Alireza; NourAzar, Alireza; Yousefi, Hadi; Mortazavi, Safieh; Ahmadiasl, Nasser

    2016-04-01

    The beneficial and potent role of exercise in preventing heart apoptosis in ovariectomized rats is well known. The aim of this study was to examine the effects of swimming training on cardiac expression of Bcl-2 and Mir-133 levels and on glycogen changes in the myocyte. Forty animals were separated into four groups: control, sham, ovariectomy (OVX) and an ovariectomized group with 8 weeks of swimming training (OVX.E). Training effects were evaluated by measuring lipid profiles and Bcl-2 and Mir-133 expression levels in the cardiac tissue. Grafts were analyzed by reverse transcription-polymerase chain reaction for Bcl-2 mRNA and Mir-133 and by Western blot for Bcl-2 protein. Ovariectomy down-regulated Bcl-2 and Mir-133 expression levels in the cardiac tissue, and swimming training up-regulated their expression significantly (P<0.05). Our results showed that regular exercise as a physical replacement therapy could prevent and ameliorate the effects of estrogen deficiency in the heart.

  14. The food additive vanillic acid controls transgene expression in mammalian cells and mice.

    PubMed

    Gitzinger, Marc; Kemmer, Christian; Fluri, David A; El-Baba, Marie Daoud; Weber, Wilfried; Fussenegger, Martin

    2012-03-01

    Trigger-inducible transcription-control devices that reversibly fine-tune transgene expression in response to molecular cues have significantly advanced the rational reprogramming of mammalian cells. When designed for use in future gene- and cell-based therapies, the trigger molecules have to be carefully chosen in order to provide maximum specificity, minimal side-effects and optimal pharmacokinetics in a mammalian organism. Capitalizing on control components that enable Caulobacter crescentus to metabolize vanillic acid originating from lignin degradation that occurs in its oligotrophic freshwater habitat, we have designed synthetic devices that specifically adjust transgene expression in mammalian cells when exposed to vanillic acid. Even in mice, transgene expression was robust, precise and tunable in response to vanillic acid. As a licensed food additive that is regularly consumed by humans via flavoured convenience food and specific fresh vegetables and fruits, vanillic acid can be considered a safe trigger molecule that could be used for diet-controlled transgene expression in future gene- and cell-based therapies.

  15. The paradigm compiler: Mapping a functional language for the connection machine

    NASA Technical Reports Server (NTRS)

    Dennis, Jack B.

    1989-01-01

    The Paradigm Compiler implements a new approach to compiling programs written in high level languages for execution on highly parallel computers. The general approach is to identify the principal data structures constructed by the program and to map these structures onto the processing elements of the target machine. The mapping is chosen to maximize performance as determined through compile time global analysis of the source program. The source language is Sisal, a functional language designed for scientific computations, and the target language is Paris, the published low level interface to the Connection Machine. The data structures considered are multidimensional arrays whose dimensions are known at compile time. Computations that build such arrays usually offer opportunities for highly parallel execution; they are data parallel. The Connection Machine is an attractive target for these computations, and the parallel for construct of the Sisal language is a convenient high level notation for data parallel algorithms. The principles and organization of the Paradigm Compiler are discussed.

  16. Advanced compilation techniques in the PARADIGM compiler for distributed-memory multicomputers

    NASA Technical Reports Server (NTRS)

    Su, Ernesto; Lain, Antonio; Ramaswamy, Shankar; Palermo, Daniel J.; Hodges, Eugene W., IV; Banerjee, Prithviraj

    1995-01-01

    The PARADIGM compiler project provides an automated means to parallelize programs, written in a serial programming model, for efficient execution on distributed-memory multicomputers. A previous implementation of the compiler based on the PTD representation allowed symbolic array sizes, affine loop bounds and array subscripts, and a variable number of processors, provided that arrays were single- or multi-dimensionally block distributed. The techniques presented here extend the compiler to also accept multidimensional cyclic and block-cyclic distributions within a uniform symbolic framework. These extensions demand more sophisticated symbolic manipulation capabilities. A novel aspect of our approach is to meet this demand by interfacing PARADIGM with a powerful off-the-shelf symbolic package, Mathematica. This paper describes some of the Mathematica routines that perform various transformations, shows how they are invoked and used by the compiler to overcome the new challenges, and presents experimental results for code involving cyclic and block-cyclic arrays as evidence of the feasibility of the approach.
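
    For orientation, the standard 1-D block-cyclic mapping (the CYCLIC(b) family the extended compiler accepts) assigns global index g to processor (g div b) mod P, at local position (g div (b·P))·b + (g mod b). A small Python sketch with illustrative parameters:

      # Standard 1-D block-cyclic mapping as in HPF/ScaLAPACK; the block size and
      # processor count below are illustrative, not taken from the paper's experiments.
      def owner(g, b, p):
          """Processor that owns global index g under CYCLIC(b) over p processors."""
          return (g // b) % p

      def local_index(g, b, p):
          """Position of global index g within its owner's local array."""
          return (g // (b * p)) * b + g % b

      b, p, n = 2, 3, 12
      for g in range(n):
          print(f"global {g:2d} -> proc {owner(g, b, p)}, local {local_index(g, b, p)}")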

  17. RPython high-level synthesis

    NASA Astrophysics Data System (ADS)

    Cieszewski, Radoslaw; Linczuk, Maciej

    2016-09-01

    The development of FPGA technology and the increasing complexity of applications in recent decades have forced compilers to move to higher abstraction levels. Compilers interpret an algorithmic description of a desired behavior written in a High-Level Language (HLL) and translate it to a Hardware Description Language (HDL). This paper presents an RPython-based High-Level Synthesis (HLS) compiler. The compiler takes the configuration parameters and maps an RPython program to VHDL. Then, the VHDL code can be used to program FPGA chips. Compared with other technologies, FPGAs have the potential to achieve far greater performance than software as a result of omitting the fetch-decode-execute operations of General Purpose Processors (GPPs), and they introduce more parallel computation. This can be exploited by utilizing many resources at the same time. Creating parallel algorithms computed with FPGAs in pure HDL is difficult and time consuming. Implementation time can be greatly reduced with a High-Level Synthesis compiler. This article describes design methodologies and tools, the implementation, and first results of the VHDL backend created for the RPython compiler.
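
    As a purely illustrative sketch of the HLS idea (a high-level description in, HDL text out), the Python snippet below emits a VHDL entity/architecture skeleton for a combinational adder. The entity name, ports, and width are invented and are not taken from the RPython compiler's actual backend.

      # Toy HLS-style emitter: produce synthesizable VHDL text for an adder from a
      # short Python description (illustration only, not the RPython compiler flow).
      def emit_adder(name="adder", width=8):
          lines = [
              "library ieee;",
              "use ieee.std_logic_1164.all;",
              "use ieee.numeric_std.all;",
              "",
              f"entity {name} is",
              f"  port (a, b : in  unsigned({width - 1} downto 0);",
              f"        s    : out unsigned({width} downto 0));",
              "end entity;",
              "",
              f"architecture rtl of {name} is",
              "begin",
              f"  s <= resize(a, {width + 1}) + resize(b, {width + 1});",
              "end architecture;",
          ]
          return "\n".join(lines)

      print(emit_adder())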

  18. Kefir and health: a contemporary perspective.

    PubMed

    Ahmed, Zaheer; Wang, Yanping; Ahmad, Asif; Khan, Salman Tariq; Nisa, Mehrun; Ahmad, Hajra; Afreen, Asma

    2013-01-01

    Kefir and its related products are renowned nutraceutical dairy products produced through fermentation by the yeasts and bacteria naturally present in kefir grains. The nutritional attributes of this self-carbonated beverage are due to the presence of vital nutrients such as carbohydrates, proteins, minerals, vitamins, and some nutraceutical components. Antimicrobial activity, better gut health, anticarcinogenic activity, control of serum glucose and cholesterol, control of lactose intolerance and a better immune system can be achieved through its regular consumption. Moreover, on the one hand kefir is a good dietetic beverage of particular interest to athletes, and on the other hand whole kefir is good for feeding small babies and pre-schoolers for good tolerance against disease and quick weight gain. A large body of work has examined kefir from a health point of view. This study summarizes the data that have been compiled to date. The purpose of this review is to gather information about microbiological, chemical, nutritional, and therapeutic aspects of kefir and kefir-like products to provide justification for its consumption. This review leads us to conclude that kefir marks a new dawn of food for mankind.

  19. Production Maintenance Infrastructure

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jason Gabler, David Skinner

    2005-11-01

    PMI is an XML framework for formulating tests of software and software environments which operate in a relatively push-button manner, i.e., can be automated, and that provide results that are readily consumable/publishable via RSS. Insofar as possible, the tests are carried out in a manner congruent with real usage. PMI drives shell scripts via a perl program which is in charge of timing, validating each test, and controlling the flow through sets of tests. Testing in PMI is built up hierarchically. A suite of tests may start by testing basic functionalities (file system is writable, compiler is found and functions, shell environment behaves as expected, etc.) and work up to larger, more complicated activities (execution of parallel code, file transfers, etc.). At each step in this hierarchy a failure leads to generation of a text message or RSS item that can be tagged as to who should be notified of the failure. There are two functionalities that PMI has been directed at: 1) regular and automated testing of multi-user environments and 2) version-wise testing of new software releases prior to their deployment in a production mode.
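
    PMI itself is driven by XML and a perl program over shell scripts; the Python sketch below only mirrors the hierarchical behaviour described above: cheap environment checks run first, and a failure at any level stops the suite and produces a notification line. The individual checks are illustrative stand-ins.

      # Hierarchical test-suite sketch: basic checks first, stop on the first failure
      # (where PMI would emit a text/RSS notification tagged with who to notify).
      import shutil
      import subprocess
      import tempfile

      def fs_writable():
          with tempfile.NamedTemporaryFile(dir="."):
              return True

      def compiler_found():
          return shutil.which("cc") is not None

      def shell_runs():
          return subprocess.run(["echo", "hello"], capture_output=True).returncode == 0

      SUITE = [("filesystem writable", fs_writable),
               ("compiler found", compiler_found),
               ("shell command runs", shell_runs)]

      for name, test in SUITE:
          try:
              ok = bool(test())
          except Exception:
              ok = False
          print(f"{'PASS' if ok else 'FAIL'}: {name}")
          if not ok:
              break        # more expensive tests (parallel runs, transfers) never start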

  20. Sulfur Dioxide Emission Rates from Kilauea Volcano, Hawai`i, an Update: 1998-2001

    USGS Publications Warehouse

    Elias, Tamar; Sutton, A. Jefferson

    2002-01-01

    Introduction Sulfur dioxide (SO2) emission rates from Kilauea Volcano were first measured by Stoiber and Malone (1975) and have been measured on a regular basis since 1979 (Greenland and others, 1985; Casadevall and others, 1987; Elias and others, 1998; Sutton and others, 2001). A compilation of SO2 emission-rate and wind-vector data from 1979 through 1997 is available as Open-File Report 98-462 (Elias and others, 1998) and on the web at http://hvo.wr.usgs.gov/products/OF98462/. The purpose of this report is to update the existing database through 2001. Kilauea releases SO2 gas predominantly from its summit caldera and east rift zone (ERZ) (fig. 1), as described in previous reports (Elias and others, 1998; Sutton and others, 2001). These two distinct sources are quantified independently. The summit and east rift zone emission rates reported here were derived using vehicle-based Correlation Spectrometry (COSPEC) measurements as described in Elias and others (1998). In 1998 and 1999, these measurements were augmented with airborne and tripod-based surveys.

  1. Creating effective scholarly posters: a guide for DNP students.

    PubMed

    Christenbery, Thomas L; Latham, Tiffany G

    2013-01-01

    Dissemination of scholarly project outcomes is an essential component of Doctor of Nursing Practice (DNP) education. This article provides guidelines for professional poster development and presentation as well as suggestions for integrating poster development as part of the DNP curriculum. This article was prepared by reviewing both theoretical and research-based literature regarding professional poster development. Evidence indicates that poster presentations at professional conferences are an excellent venue for DNP students to successfully share the results of their scholarly projects. For posters to be both well perceived and received at conferences, certain guidelines must be followed regarding poster development. Guidelines include emphasizing a consistent message, clear focus, logical format, and esthetically pleasing design. Poster development guidelines and strategies need to be taught early and regularly throughout the DNP student's education. DNP scholarly projects provide forward-looking solutions to some of society's most formidable healthcare challenges. The dissemination of knowledge gleaned from the DNP scholarly projects is vital to 21st century global health. Effective poster presentations are critical to the dissemination of scholarly knowledge. ©2012 The Author(s) Journal compilation ©2012 American Association of Nurse Practitioners.

  2. Regularized lattice Boltzmann model for immiscible two-phase flows with power-law rheology

    NASA Astrophysics Data System (ADS)

    Ba, Yan; Wang, Ningning; Liu, Haihu; Li, Qiang; He, Guoqiang

    2018-03-01

    In this work, a regularized lattice Boltzmann color-gradient model is developed for the simulation of immiscible two-phase flows with power-law rheology. This model is as simple as the Bhatnagar-Gross-Krook (BGK) color-gradient model except that an additional regularization step is introduced prior to the collision step. In the regularization step, the pseudo-inverse method is adopted as an alternative solution for the nonequilibrium part of the total distribution function, and it can be easily extended to other discrete velocity models no matter whether a forcing term is considered or not. The obtained expressions for the nonequilibrium part are merely related to macroscopic variables and velocity gradients that can be evaluated locally. Several numerical examples, including the single-phase and two-phase layered power-law fluid flows between two parallel plates, and the droplet deformation and breakup in a simple shear flow, are conducted to test the capability and accuracy of the proposed color-gradient model. Results show that the present model is more stable and accurate than the BGK color-gradient model for power-law fluids with a wide range of power-law indices. Compared to its multiple-relaxation-time counterpart, the present model can increase the computing efficiency by around 15%, while keeping the same accuracy and stability. Also, the present model is found to be capable of reasonably predicting the critical capillary number of droplet breakup.
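
    For orientation, regularized lattice Boltzmann schemes replace the raw non-equilibrium part of the distribution by its projection onto the second-order moment before collision. In the standard Hermite-projection form (shown as a sketch; the pseudo-inverse construction used in this paper may differ in detail):

      \[
      f_i^{\mathrm{neq}} = f_i - f_i^{\mathrm{eq}}, \qquad
      \Pi^{\mathrm{neq}}_{\alpha\beta} = \sum_i c_{i\alpha} c_{i\beta}\, f_i^{\mathrm{neq}}, \qquad
      \hat{f}_i^{\mathrm{neq}} \approx \frac{w_i}{2 c_s^{4}}
      \left( c_{i\alpha} c_{i\beta} - c_s^{2} \delta_{\alpha\beta} \right)
      \Pi^{\mathrm{neq}}_{\alpha\beta},
      \]

    after which the collision step is applied to \(f_i^{\mathrm{eq}} + \hat{f}_i^{\mathrm{neq}}\), with \(w_i\) the lattice weights and \(c_s\) the lattice sound speed.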

  3. [The impact of a 14-day regular physical exercise regime on the concentration of the classes and subclasses of lipoprotein particles in young subjects with a sedentary lifestyle].

    PubMed

    Sabaka, P; Dukát, A; Oravec, S; Mistríková, L; Baláž, D; Bendžala, M; Gašpar, L

    2013-10-01

    Recommendations from professional cardiology societies working in the area of primary prevention of cardiovascular diseases put an emphasis on regular aerobic physical activity. Its positive effect on both cardiovascular and overall mortality has repeatedly been proven by the observations of prospective and cross-sectional epidemiological studies. One of the possible explanations of this positive effect is a change in the concentration of lipoprotein classes and their subclasses, which is expressed as a change in their average size. In a group of young healthy men and women with a sedentary lifestyle we observed the effect of medium-intensity physical exercise in the form of a 30-minute slow run per day, lasting for 14 days. The concentrations of lipoprotein classes and subclasses were determined by linear electrophoresis in polyacrylamide gel. In the observed group we found a statistically significant decrease of VLDL, large IDL particles, medium-sized LDL, small dense LDL, and medium-sized HDL particles. In the light of current knowledge all these lipoprotein particles are deemed atherogenic. Thus, as little as 14 days of regular exercise has a positive effect on the concentration of plasma lipoproteins, and this emphasises the role of regular physical activity in the primary prevention of cardiovascular diseases.

  4. Heat adaptation from regular hot water immersion decreases proinflammatory responses, HSP70 expression, and physical heat stress.

    PubMed

    Yang, Fwu-Lin; Lee, Chia-Chi; Subeq, Yi-Maun; Lee, Chung-Jen; Ke, Chun-Yen; Lee, Ru-Ping

    2017-10-01

    Hot-water immersion (HWI) is a type of thermal therapy for treating various diseases. In our study, the physiological responses to occasional and regular HWI have been explored. The rats were divided into a control group, an occasional group (1D), and a regular group (7D). The 1D and 7D groups received 15-min HWI at 42°C for 1 and 7 days, respectively. Blood samples were collected for proinflammatory cytokine examination, and the heart, liver and kidney were excised for subsequent IHC analysis to measure the level of heat shock protein 70 (HSP70). The results revealed that the body temperature increased significantly during HWI on Day 3 and significantly declined on Days 6 and 7. For the 7D group, body weight, heart rate, hematocrit, platelet count, osmolarity, and lactate level were lower than those in the 1D group. Furthermore, granulocyte counts and the levels of tumor necrosis factor-α and interleukin-6 were lower in the 7D group than in the 1D group. The induction of HSP70 in the 1D group was higher than in the other groups. Physiological responses to occasional HWI are disadvantageous because of heat stress. However, adaptation to heat from regular HWI resulted in decreased proinflammatory responses and physical heat stress. Copyright © 2017 Elsevier Ltd. All rights reserved.

  5. Prenatal metformin exposure in mice programs the metabolic phenotype of the offspring during a high fat diet at adulthood.

    PubMed

    Salomäki, Henriikka; Vähätalo, Laura H; Laurila, Kirsti; Jäppinen, Norma T; Penttinen, Anna-Maija; Ailanen, Liisa; Ilyasizadeh, Juan; Pesonen, Ullamari; Koulu, Markku

    2013-01-01

    The antidiabetic drug metformin is currently used prior and during pregnancy for polycystic ovary syndrome, as well as during gestational diabetes mellitus. We investigated the effects of prenatal metformin exposure on the metabolic phenotype of the offspring during adulthood in mice. Metformin (300 mg/kg) or vehicle was administered orally to dams on regular diet from the embryonic day E0.5 to E17.5. Gene expression profiles in liver and brain were analysed from 4-day old offspring by microarray. Body weight development and several metabolic parameters of offspring were monitored both during regular diet (RD-phase) and high fat diet (HFD-phase). At the end of the study, two doses of metformin or vehicle were given acutely to mice at the age of 20 weeks, and Insig-1 and GLUT4 mRNA expressions in liver and fat tissue were analysed using qRT-PCR. Metformin exposed fetuses were lighter at E18.5. There was no effect of metformin on the maternal body weight development or food intake. Metformin exposed offspring gained more body weight and mesenteric fat during the HFD-phase. The male offspring also had impaired glucose tolerance and elevated fasting glucose during the HFD-phase. Moreover, the expression of GLUT4 mRNA was down-regulated in epididymal fat in male offspring prenatally exposed to metformin. Based on the microarray and subsequent qRT-PCR analyses, the expression of Insig-1 was changed in the liver of neonatal mice exposed to metformin prenatally. Furthermore, metformin up-regulated the expression of Insig-1 later in development. Gene set enrichment analysis based on preliminary microarray data identified several differentially enriched pathways both in control and metformin exposed mice. The present study shows that prenatal metformin exposure causes long-term programming effects on the metabolic phenotype during high fat diet in mice. This should be taken into consideration when using metformin as a therapeutic agent during pregnancy.

  6. Deployment of the MARSIS Radar Antennas On-Board Mars Express

    NASA Technical Reports Server (NTRS)

    Denis, Michel; Moorhouse, A.; Smith, A.; McKay, Mike; Fischer, J.; Jayaraman, P.; Mounzer, Z.; Schmidt, R.; Reddy, J.; Ecale, E.

    2006-01-01

    On the first European planetary mission, the deployment of the two 20-meter-long MARSIS antennas onboard the ESA Mars Express spacecraft represented an unprecedented technological challenge in the middle of a successful science mission. While Mars Express was already performing regular observations at Mars, a complex process was carried out on Earth, involving the ESA Project, coordination between ESA, NASA and ASI, the Mars science community, the spacecraft manufacturer EADS Astrium and the Mission Control Centre at ESOC. This paper describes the steps that led from an initial no-go in 2004 to deployment one year later, as well as the conditions and difficulties encountered during the actual deployment. It provides insights into the technical and managerial processes that made it a success, and analyses the rationale behind the decisions.

  7. Characterization and improvement of RNA-Seq precision in quantitative transcript expression profiling.

    PubMed

    Łabaj, Paweł P; Leparc, Germán G; Linggi, Bryan E; Markillie, Lye Meng; Wiley, H Steven; Kreil, David P

    2011-07-01

    Measurement precision determines the power of any analysis to reliably identify significant signals, such as in screens for differential expression, independent of whether the experimental design incorporates replicates or not. With the compilation of large-scale RNA-Seq datasets with technical replicate samples, however, we can now, for the first time, perform a systematic analysis of the precision of expression level estimates from massively parallel sequencing technology. This then allows considerations for its improvement by computational or experimental means. We report on a comprehensive study of target identification and measurement precision, including their dependence on transcript expression levels, read depth and other parameters. In particular, an impressive recall of 84% of the estimated true transcript population could be achieved with 331 million 50 bp reads, with diminishing returns from longer read lengths and even smaller gains from increased sequencing depths. Most of the measurement power (75%) is spent on only 7% of the known transcriptome, however, making less strongly expressed transcripts harder to measure. Consequently, <30% of all transcripts could be quantified reliably with a relative error < 20%. Based on established tools, we then introduce a new approach for mapping and analysing sequencing reads that yields substantially improved performance in gene expression profiling, increasing the number of transcripts that can reliably be quantified to over 40%. Extrapolations to higher sequencing depths highlight the need for efficient complementary steps. In the discussion, we outline possible experimental and computational strategies for further improvements in quantification precision. rnaseq10@boku.ac.at

  8. AUTO_DERIV: Tool for automatic differentiation of a Fortran code

    NASA Astrophysics Data System (ADS)

    Stamatiadis, S.; Farantos, S. C.

    2010-10-01

    AUTO_DERIV is a module comprising a set of FORTRAN 95 procedures which can be used to calculate the first and second partial derivatives (mixed or not) of any continuous function with many independent variables. The mathematical function should be expressed as one or more FORTRAN 77/90/95 procedures. A new type of variables is defined and the overloading mechanism of functions and operators provided by the FORTRAN 95 language is extensively used to define the differentiation rules. Proper (standard-complying) handling of floating-point exceptions is provided by using the IEEE_EXCEPTIONS intrinsic module (Technical Report 15580, incorporated in FORTRAN 2003). New version program summary. Program title: AUTO_DERIV Catalogue identifier: ADLS_v2_0 Program summary URL: http://cpc.cs.qub.ac.uk/summaries/ADLS_v2_0.html Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html No. of lines in distributed program, including test data, etc.: 2963 No. of bytes in distributed program, including test data, etc.: 10 314 Distribution format: tar.gz Programming language: Fortran 95 + (optionally) TR-15580 (Floating-point exception handling) Computer: all platforms with a Fortran 95 compiler Operating system: Linux, Windows, MacOS Classification: 4.12, 6.2 Catalogue identifier of previous version: ADLS_v1_0 Journal reference of previous version: Comput. Phys. Comm. 127 (2000) 343 Does the new version supersede the previous version?: Yes Nature of problem: The need to calculate accurate derivatives of a multivariate function frequently arises in computational physics and chemistry. The most versatile approach to evaluate them by a computer, automatically and to machine precision, is via user-defined types and operator overloading. AUTO_DERIV is a Fortran 95 implementation of this approach, designed to evaluate the first and second derivatives of a function of many variables. Solution method: The mathematical rules for differentiation of sums, products, quotients and elementary functions, in conjunction with the chain rule for compound functions, are applied. The function should be expressed as one or more Fortran 77/90/95 procedures. A new type of variables is defined and the overloading mechanism of functions and operators provided by the Fortran 95 language is extensively used to implement the differentiation rules. Reasons for new version: The new version supports Fortran 95, properly handles floating-point exceptions, and is faster due to internal reorganization. All discovered bugs are fixed. Summary of revisions: The code was rewritten extensively to benefit from features introduced in Fortran 95. Additionally, there was a major internal reorganization of the code, resulting in faster execution. The user interface described in the original paper was not changed. The values that the user must or should specify before compilation (essentially, the number of independent variables) were moved into the ad_types module. There were many minor bug fixes. One important bug was found and fixed; the code did not correctly handle the overloading of ** in a**λ when a = 0. The case of division by zero and the discontinuity of the function at the requested point are indicated by the standard IEEE exceptions (IEEE_DIVIDE_BY_ZERO and IEEE_INVALID, respectively).
If the compiler does not support IEEE exceptions, a module with the appropriate name is provided, imitating the behavior of the 'standard' module in the sense that it raises the corresponding exceptions. It is up to the compiler (through certain flags probably) to detect them. Restrictions: None imposed by the program. There are certain limitations that may appear mostly due to the specific implementation chosen in the user code. They can always be overcome by recoding parts of the routines developed by the user or by modifying AUTO_DERIV according to specific instructions given in [1]. The common restrictions of available memory and the capabilities of the compiler are the same as the original version. Additional comments: The program has been tested using the following compilers: Intel ifort, GNU gfortran, NAGWare f95, g95. Running time: The typical running time for the program depends on the compiler and the complexity of the differentiated function. A rough estimate is that AUTO_DERIV is ten times slower than the evaluation of the analytical ('by hand') function value and derivatives (if they are available). References:S. Stamatiadis, R. Prosmiti, S.C. Farantos, AUTO_DERIV: tool for automatic differentiation of a Fortran code, Comput. Phys. Comm. 127 (2000) 343.
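
    The operator-overloading idea behind AUTO_DERIV can be conveyed with a minimal forward-mode example. The Python sketch below carries a value together with its first derivative in a dual-number type (one independent variable and only +, * and sin; the Fortran 95 module itself handles many variables, second derivatives, and the full set of intrinsics):

      # Minimal forward-mode automatic differentiation via operator overloading.
      import math

      class Dual:
          def __init__(self, val, der=0.0):
              self.val, self.der = val, der          # value and d(value)/dx

          def _wrap(self, o):
              return o if isinstance(o, Dual) else Dual(o)

          def __add__(self, o):
              o = self._wrap(o)
              return Dual(self.val + o.val, self.der + o.der)
          __radd__ = __add__

          def __mul__(self, o):
              o = self._wrap(o)
              return Dual(self.val * o.val, self.der * o.val + self.val * o.der)
          __rmul__ = __mul__

      def sin(x):
          # Chain rule: d sin(u)/dx = cos(u) * du/dx
          return Dual(math.sin(x.val), math.cos(x.val) * x.der)

      x = Dual(1.3, 1.0)                             # seed dx/dx = 1
      f = x * sin(x) + 2 * x                         # f(x) = x sin(x) + 2x
      print(f.val, f.der)                            # f(1.3) and f'(1.3) = sin(x) + x cos(x) + 2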

  9. Simple Shared Motifs (SSM) in conserved region of promoters: a new approach to identify co-regulation patterns.

    PubMed

    Gruel, Jérémy; LeBorgne, Michel; LeMeur, Nolwenn; Théret, Nathalie

    2011-09-12

    Regulation of gene expression plays a pivotal role in cellular functions. However, understanding the dynamics of transcription remains a challenging task. A host of computational approaches have been developed to identify regulatory motifs, mainly based on the recognition of DNA sequences for transcription factor binding sites. Recent integration of additional data from genomic analyses or phylogenetic footprinting has significantly improved these methods. Here, we propose a different approach based on the compilation of Simple Shared Motifs (SSM), groups of sequences defined by their length and similarity and present in conserved sequences of gene promoters. We developed an original algorithm to search and count SSM in pairs of genes. An exceptional number of SSM is considered as a common regulatory pattern. The SSM approach is applied to a sample set of genes and validated using functional gene-set enrichment analyses. We demonstrate that the SSM approach selects genes that are over-represented in specific biological categories (Ontology and Pathways) and are enriched in co-expressed genes. Finally we show that genes co-expressed in the same tissue or involved in the same biological pathway have increased SSM values. Using unbiased clustering of genes, Simple Shared Motifs analysis constitutes an original contribution to provide a clearer definition of expression networks.
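
    As a toy illustration of the shared-motif idea only (not the authors' algorithm or its statistical scoring of exceptional counts), the Python sketch below lists exact substrings of a fixed length present in two hypothetical promoter fragments:

      # Count simple shared motifs: exact k-length substrings present in both sequences.
      def kmers(seq, k):
          return {seq[i:i + k] for i in range(len(seq) - k + 1)}

      def shared_motifs(seq_a, seq_b, k=8):
          return kmers(seq_a, k) & kmers(seq_b, k)

      # Hypothetical promoter fragments, not real gene sequences.
      prom1 = "TATAAGGCCGGATCCGTTACGGATCCTTGACA"
      prom2 = "CCGGATCCGTTATATAAGGGGATCCTTGACAA"
      common = shared_motifs(prom1, prom2, k=8)
      print(len(common), sorted(common))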

  10. Simple Shared Motifs (SSM) in conserved region of promoters: a new approach to identify co-regulation patterns

    PubMed Central

    2011-01-01

    Background Regulation of gene expression plays a pivotal role in cellular functions. However, understanding the dynamics of transcription remains a challenging task. A host of computational approaches have been developed to identify regulatory motifs, mainly based on the recognition of DNA sequences for transcription factor binding sites. Recent integration of additional data from genomic analyses or phylogenetic footprinting has significantly improved these methods. Results Here, we propose a different approach based on the compilation of Simple Shared Motifs (SSM), groups of sequences defined by their length and similarity and present in conserved sequences of gene promoters. We developed an original algorithm to search and count SSM in pairs of genes. An exceptional number of SSM is considered as a common regulatory pattern. The SSM approach is applied to a sample set of genes and validated using functional gene-set enrichment analyses. We demonstrate that the SSM approach selects genes that are over-represented in specific biological categories (Ontology and Pathways) and are enriched in co-expressed genes. Finally we show that genes co-expressed in the same tissue or involved in the same biological pathway have increased SSM values. Conclusions Using unbiased clustering of genes, Simple Shared Motifs analysis constitutes an original contribution to provide a clearer definition of expression networks. PMID:21910886

  11. Cold ischemia contributes to the development of chronic rejection and mitochondrial injury after cardiac transplantation.

    PubMed

    Schneeberger, Stefan; Amberger, Albert; Mandl, Julia; Hautz, Theresa; Renz, Oliver; Obrist, Peter; Meusburger, Hugo; Brandacher, Gerald; Mark, Walter; Strobl, Daniela; Troppmair, Jakob; Pratschke, Johann; Margreiter, Raimund; Kuznetsov, Andrey V

    2010-12-01

    Chronic rejection (CR) remains an unsolved hurdle for long-term heart transplant survival. The effect of cold ischemia (CI) on progression of CR and the mechanisms resulting in functional deficit were investigated by studying gene expression, mitochondrial function, and enzymatic activity. Allogeneic (Lew→F344) and syngeneic (Lew→Lew) heart transplantations were performed with or without 10 h of CI. After evaluation of myocardial contraction, hearts were excised at 2, 10, 40, and 60 days for investigation of vasculopathy, gene expression, enzymatic activities, and mitochondrial respiration. Gene expression studies identified a gene cluster coding for subunits of the mitochondrial electron transport chain regulated in response to CI and CR. Myocardial performance, mitochondrial function, and mitochondrial marker enzyme activities declined in all allografts with time after transplantation. These declines were more rapid and severe in CI allografts (CR-CI) and correlated well with progression of vasculopathy and fibrosis. Mitochondria related gene expression and mitochondrial function are substantially compromised with the progression of CR and show that CI impacts on progression, gene profile, and mitochondrial function of CR. Monitoring mitochondrial function and enzyme activity might allow for earlier detection of CR and cardiac allograft dysfunction. © 2010 The Authors. Journal compilation © 2010 European Society for Organ Transplantation.

  12. Proteomic analysis of hair shafts from monozygotic twins: Expression profiles and genetically variant peptides.

    PubMed

    Wu, Pei-Wen; Mason, Katelyn E; Durbin-Johnson, Blythe P; Salemi, Michelle; Phinney, Brett S; Rocke, David M; Parker, Glendon J; Rice, Robert H

    2017-07-01

    Forensic association of hair shaft evidence with individuals is currently assessed by comparing mitochondrial DNA haplotypes of reference and casework samples, primarily for exclusionary purposes. Present work tests and validates more recent proteomic approaches to extract quantitative transcriptional and genetic information from hair samples of monozygotic twin pairs, which would be predicted to partition away from unrelated individuals if the datasets contain identifying information. Protein expression profiles and polymorphic, genetically variant hair peptides were generated from ten pairs of monozygotic twins. Profiling using the protein tryptic digests revealed that samples from identical twins had typically an order of magnitude fewer protein expression differences than unrelated individuals. The data did not indicate that the degree of difference within twin pairs increased with age. In parallel, data from the digests were used to detect genetically variant peptides that result from common nonsynonymous single nucleotide polymorphisms in genes expressed in the hair follicle. Compilation of the variants permitted sorting of the samples by hierarchical clustering, permitting accurate matching of twin pairs. The results demonstrate that genetic differences are detectable by proteomic methods and provide a framework for developing quantitative statistical estimates of personal identification that increase the value of hair shaft evidence. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  13. Sport Psychology Service Provision: Preferences for Consultant Characteristics and Mode of Delivery among Elite Malaysian Athletes

    PubMed Central

    Ponnusamy, Vellapandian; Grove, J. Robert

    2014-01-01

    Factors relevant to the working alliance between athletes and sport psychology consultants were investigated in a sample of elite Malaysian athletes (n = 217). The athletes represented a variety of team and individual sports, and they provided information about the perceived importance of seven consultant characteristics/behaviors as well as seven program delivery options. At a full-sample level, general preferences were expressed for consultants to lead a physically active lifestyle, regularly attend training sessions and competitions, and have prior experience as an athlete or coach. General preferences were also expressed for program content to be determined by the coach or consultant, and for regular, small doses of mental skills training to be delivered in a face-to-face context throughout the year. At a sub-group level, team sport athletes had stronger preferences than individual sport athletes for program delivery on a group/team basis, while individual sport athletes had stronger preferences than team sport athletes for having a role in determining program content. Findings are discussed in relation to dominant value themes within Malaysian society and the reinforcement of these themes within specific sport subcultures. Key points Consultant characteristics and program delivery methods have an impact on the effectiveness of sport psychology services. Preferred consultant characteristics and preferred methods of delivery may be affected by cultural and subcultural values. Elite Malaysian athletes prefer consultants to lead a physically active lifestyle; to regularly attend training/competition; and to have prior experience as an athlete or coach. Elite Malaysian athletes also prefer that the coach or consultant determine program content, and that mental skills training take place in a face-to-face context throughout the year. PMID:25177193

  14. Sport Psychology Service Provision: Preferences for Consultant Characteristics and Mode of Delivery among Elite Malaysian Athletes.

    PubMed

    Ponnusamy, Vellapandian; Grove, J Robert

    2014-09-01

    Factors relevant to the working alliance between athletes and sport psychology consultants were investigated in a sample of elite Malaysian athletes (n = 217). The athletes represented a variety of team and individual sports, and they provided information about the perceived importance of seven consultant characteristics/behaviors as well as seven program delivery options. At a full-sample level, general preferences were expressed for consultants to lead a physically active lifestyle, regularly attend training sessions and competitions, and have prior experience as an athlete or coach. General preferences were also expressed for program content to be determined by the coach or consultant, and for regular, small doses of mental skills training to be delivered in a face-to-face context throughout the year. At a sub-group level, team sport athletes had stronger preferences than individual sport athletes for program delivery on a group/team basis, while individual sport athletes had stronger preferences than team sport athletes for having a role in determining program content. Findings are discussed in relation to dominant value themes within Malaysian society and the reinforcement of these themes within specific sport subcultures. Key points: Consultant characteristics and program delivery methods have an impact on the effectiveness of sport psychology services. Preferred consultant characteristics and preferred methods of delivery may be affected by cultural and subcultural values. Elite Malaysian athletes prefer consultants to lead a physically active lifestyle; to regularly attend training/competition; and to have prior experience as an athlete or coach. Elite Malaysian athletes also prefer that the coach or consultant determine program content, and that mental skills training take place in a face-to-face context throughout the year.

  15. Sparse Poisson noisy image deblurring.

    PubMed

    Carlavan, Mikael; Blanc-Féraud, Laure

    2012-04-01

    Deblurring noisy Poisson images has recently been a subject of an increasing amount of works in many areas such as astronomy and biological imaging. In this paper, we focus on confocal microscopy, which is a very popular technique for 3-D imaging of biological living specimens that gives images with a very good resolution (several hundreds of nanometers), although degraded by both blur and Poisson noise. Deconvolution methods have been proposed to reduce these degradations, and in this paper, we focus on techniques that promote the introduction of an explicit prior on the solution. One difficulty of these techniques is to set the value of the parameter, which weights the tradeoff between the data term and the regularizing term. Only few works have been devoted to the research of an automatic selection of this regularizing parameter when considering Poisson noise; therefore, it is often set manually such that it gives the best visual results. We present here two recent methods to estimate this regularizing parameter, and we first propose an improvement of these estimators, which takes advantage of confocal images. Following these estimators, we secondly propose to express the problem of the deconvolution of Poisson noisy images as the minimization of a new constrained problem. The proposed constrained formulation is well suited to this application domain since it is directly expressed using the antilog likelihood of the Poisson distribution and therefore does not require any approximation. We show how to solve the unconstrained and constrained problems using the recent alternating-direction technique, and we present results on synthetic and real data using well-known priors, such as total variation and wavelet transforms. Among these wavelet transforms, we specially focus on the dual-tree complex wavelet transform and on the dictionary composed of curvelets and an undecimated wavelet transform.
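
    For context, the Poisson data-fidelity term referred to above is the negative log-likelihood of the Poisson distribution. In a generic penalized form (a sketch of the problem class, not necessarily the exact constrained formulation proposed in the paper) the deconvolution problem reads

      \[
      \min_{u \ge 0}\; \sum_i \Bigl[ (Hu + b)_i - y_i \log\bigl((Hu + b)_i\bigr) \Bigr] \;+\; \lambda\, \mathrm{TV}(u),
      \]

    where \(H\) is the blur operator, \(b\) a known background, \(y\) the observed counts, and \(\lambda\) the regularizing parameter whose automatic selection is discussed above; the constrained variant instead bounds the data term and minimizes the regularizer subject to that bound, removing the need to weight the two terms directly.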

  16. Physicians' Views on Advance Care Planning and End-of-Life Care Conversations.

    PubMed

    Fulmer, Terry; Escobedo, Marcus; Berman, Amy; Koren, Mary Jane; Hernández, Sandra; Hult, Angela

    2018-05-23

    To evaluate physicians' views on advance care planning, goals of care, and end-of-life conversations. Random sample telephone survey. United States. Physicians (primary care specialists; pulmonology, cardiology, oncology subspecialists) actively practicing medicine and regularly seeing patients aged 65 and older (N=736; 81% male, 75% white, 66% aged ≥50). A 37-item telephone survey constructed by a professional polling group with national expert oversight measured attitudes and perceptions of barriers and facilitators to advance care planning. Summative data are presented here. Ninety-nine percent of participants agreed that it is important to have end-of-life conversations, yet only 29% reported that they have formal training for such conversations. Those most likely to have training included younger physicians and those caring for a racially and ethnically diverse population. Patient values and preferences were the strongest motivating factors in having advance care planning conversations, with 92% of participants rating them extremely important. Ninety-five percent of participants reported that they supported a new Medicare fee-for-service benefit reimbursing advance care planning. The biggest barrier mentioned was time availability. Other barriers included not wanting a patient to give up hope and feeling uncomfortable. With more than half of physicians reporting that they feel educationally unprepared, medical school curricula need to be strengthened to ensure readiness for end-of-life conversations. Clinician barriers need to be addressed to meet the needs of older adults and families. Policies that focus on payment for quality should be evaluated at regular intervals to monitor their effect on advance care planning. © 2018, Copyright the Authors Journal compilation © 2018, The American Geriatrics Society.

  17. A cross-sport comparison of performance-based outcomes of professional athletes following primary microfracture of the knee.

    PubMed

    Schallmo, Michael S; Singh, Sameer K; Barth, Kathryn A; Freshman, Ryan D; Mai, Harry T; Hsu, Wellington K

    2018-05-08

    The purpose of this study was to compare performance-based outcomes among professional athletes in four major North American sports following microfracture to treat symptomatic chondral defects of the knee. Major League Baseball (MLB), National Basketball Association (NBA), National Football League (NFL), and National Hockey League (NHL) athletes who underwent primary unilateral microfracture of the knee were identified through a previously reported protocol based on public sources. Successful return-to-play was defined as returning for at least one professional regular season game after surgery. Regular season player statistics and sport-specific performance scores were compiled for each player. Each player served as his own control, with the season prior to surgery defined as baseline. Comparisons across sports were enabled by adjusting for expected season and career length differences between sports and by calculating percent changes in performance. One hundred thirty one professional athletes who underwent microfracture were included. One hundred three athletes (78.6%) successfully returned to play. The ratio of games started-to-games played before surgery was found to be a significant positive independent predictor of returning (p = 0.002). Compared with their preoperative season, basketball and baseball players demonstrated significantly decreased performance one season after surgery (-14.8%, p = 0.029 and -12.9%, p = 0.002, respectively) that was recoverable to baseline by postoperative seasons 2-3 for baseball players but not for basketball players (-9.7%, p = 0.024). Knee microfracture surgery is associated with a high rate of return to the professional level. However, the impact of this procedure on postoperative performance varied significantly depending on sport. Copyright © 2018. Published by Elsevier B.V.

  18. Control of gene expression by CRISPR-Cas systems

    PubMed Central

    2013-01-01

    Clustered regularly interspaced short palindromic repeats (CRISPR) loci and their associated cas (CRISPR-associated) genes provide adaptive immunity against viruses (phages) and other mobile genetic elements in bacteria and archaea. While most of the early work has largely been dominated by examples of CRISPR-Cas systems directing the cleavage of phage or plasmid DNA, recent studies have revealed a more complex landscape where CRISPR-Cas loci might be involved in gene regulation. In this review, we summarize the role of these loci in the regulation of gene expression as well as the recent development of synthetic gene regulation using engineered CRISPR-Cas systems. PMID:24273648

  19. 24 CFR 87.600 - Semi-annual compilation.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... 24 Housing and Urban Development 1 2012-04-01 2012-04-01 false Semi-annual compilation. 87.600 Section 87.600 Housing and Urban Development Office of the Secretary, Department of Housing and Urban Development NEW RESTRICTIONS ON LOBBYING Agency Reports § 87.600 Semi-annual compilation. (a) The head of each...

  20. 12 CFR 1003.4 - Compilation of loan data.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 12 Banks and Banking 8 2012-01-01 2012-01-01 false Compilation of loan data. 1003.4 Section 1003.4....4 Compilation of loan data. (a) Data format and itemization. A financial institution shall collect data regarding applications for, and originations and purchases of, home purchase loans, home...
